EP3146412A2 - Causing gesture responses on connected devices - Google Patents
Causing gesture responses on connected devices
- Publication number
- EP3146412A2
- Authority
- EP
- European Patent Office
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
Definitions
- FIG. 1 is an example block diagram depicting a system for causing gesture responses for connected devices
- FIG. 2 is an example flow chart showing a method for causing gesture responses for connected devices
- FIG. 3 is an example flow chart illustrating a method for associating devices and selecting response gestures for associated devices
- FIG. 4 is an example of a connected device to receive user interactions and perform response gestures
- FIG. 5 is an example block diagram depicting a computer system upon which examples described may be implemented.
- a system and method are provided relating to causing gesture responses on connected devices.
- the method can be performed on, for example, a server computing system implemented in accordance with an application running on any number of computing devices (e.g., mobile computing devices).
- the system can maintain a database storing information, such as user profile data, unique identifiers for connected devices to associate those devices with their respective owners, and data
- the method implemented by the system can include receiving a gesture signal indicating a user interaction with the user's connected device.
- the connected device can be a robotic figurine, or other mechanical toy, including sensors, mechanical systems, a controller, audio output, a lighting system, a transceiver, etc. Accordingly, the connected device can perform a variety of gestures or actions, which include physical, audible, visual, and/or haptic gestures.
- the connected device can receive and transmit signals indicating user interactions (e.g., physical interactions) with the connected device, and can perform response gestures according to a received response signal.
- the disclosed system can perform a lookup, in the database, to identify related connected devices associated with the user's connected device.
- the system can generate and transmit a response signal to the associated connected devices.
- the response signal can cause the associated connected devices to perform one or more gestures signifying the user interaction with the user's connected device.
- a user can perform a squeeze action on the user's connected device, which can be interpreted as a hug input on the connected device.
- the connected device can relay a gesture signal through the user's mobile computing device to the disclosed system over a network (e.g., the Internet).
- the gesture signal can indicate that the connected device received a hug input.
- the system can look up associated devices in the database. Such associated devices may correspond to devices associated with the user's children, relatives, friends, and the like. The system can identify those associated devices and generate a response signal to signify that the user's connected device received the hug input.
- the response signal can be transmitted to the associated devices, which, in response, can perform gesture actions (e.g., initiate a physical action, trigger visual indicators such as lights, perform audible or haptic actions, etc.).
- One or more examples described herein provide that methods, techniques, and actions performed by a computing device are performed programmatically, or as a computer-implemented method. Programmatically, as used herein, means through the use of code or computer-executable instructions. These instructions can be stored in one or more memory resources of the computing device.
- a programmatically performed step may or may not be automatic.
- a programmatic module or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing one or more stated tasks or functions.
- a module or component can exist on a hardware component independently of other modules or
- a module or component can be a shared element or process of other modules, programs or machines.
- Some examples described herein can generally require the use of computing devices, including processing and memory resources.
- one or more examples described herein can be implemented, in whole or in part, on computing devices such as digital cameras, digital camcorders, desktop computers, cellular or smart phones, personal digital assistants (PDAs), laptop computers, printers, digital picture frames, and tablet devices.
- Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein
- one or more examples described herein may be implemented through the use of instructions that are executable by one or more processors. These instructions may be carried on a computer-readable medium.
- Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples can be carried and/or executed.
- the numerous machines shown with examples include processor(s) and various forms of memory for holding data and instructions.
- Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers.
- Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smart phones, multifunctional devices or tablets), and magnetic memory.
- Computers, terminals, network enabled devices are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums. Additionally, examples may be implemented in the form of computer-programs, or a non-transitory computer usable carrier medium capable of carrying such a program.
- FIG. 1 is an example block diagram depicting a system for causing gesture responses for connected devices.
- the system 100 can include an application module 160 to provide a gesture application 162 to any number of computing devices.
- the gesture application 162 can be provided to the computing device via a storage medium, such as a portable storage device. Additionally or alternatively, the gesture application 162 can be downloaded via an application store over a network 180.
- the gesture application 162 can further be associated with a mechanical device, such as a robotic figurine or other robotic device.
- the gesture application 162 can be launched and connected to the system 100 over the network 180.
- Communication links 186, 188 can be established between the computing devices 178, 198 and the network to communicate signals to the system 100.
- the communication links 186, 188 can enable a Wi-Fi system on each of the computing devices 178, 198 to connect to the network 180.
- the computing devices 178, 198 can communicate with the system 100 over such communication protocols as standardized by the Institute of Electrical and Electronics Engineers (IEEE), such as any of the IEEE 802.11 protocols.
- upon launch of the gesture application 162 on a respective computing device (e.g., computing device 178), a Bluetooth link can automatically be established between the computing device 178 and the connected device 170.
- various feedback mechanisms can be enabled between the computing device 178 and the connected device 170.
- the gesture application 162 can provide a user interface on a display of the computing device 178 to allow the user 174 to provide inputs to mechanically, visually, and/or audibly control the connected device 170. Additionally or alternatively, the user 174 can perform user interactions 176 with the connected device 170, which, in response, can perform any number of predetermined responses based on the user interaction 176.
- a number of sensors on the connected device 170 can provide an input regarding the type of user interaction 176.
- the user interaction 176 may correspond to a hug of the connected device 170, which may be sensed and communicated to the computing device 178.
- Other user interactions 176 with the connected device 170 can include, for example, squeezing, throwing, tapping, rotating, dropping, shaking, twisting, or breaking a whole or a portion of the connected device 170.
- Such user interactions 176 can be sensed by the connected device 170 and data indicative of such user interactions 176 can be communicated to the system 100 either directly from the connected device 170, or relayed through the computing device 178.
- a gesture signal 182 is communicated to the system 100
- the user 174 can produce a gesture signal 182 via user input on the computing device 178.
- the gesture application 162 can provide a graphic user interface allowing the user 174 to select any number of gestures to be performed by an associated connected device 190.
- the graphic user interface can provide a selectable list of predetermined gestures 137 from a gesture database 135, that the user 174 can select from in order to cause a specified gesture to be performed by the associated connected device 190
- the connected devices 170, 190 can be directly connected to the system 100 over the network 180, in which case no relay through the respective computing devices 178, 198 is necessary.
- such connected devices 170, 190 may be in communication with the system over a Wi-Fi network according to IEEE protocols (e.g., any IEEE 802.11 protocol).
- the connected device 170 can be preprogrammed to communicate data indicating user interactions 176 on the connected device 170.
- the connected device 190 can be preprogrammed to perform the same, and/or to receive response signals 152 that trigger an associated gesture 192.
- the gesture signal 182 can be detected by a gesture detector 120 included in the system 100.
- the gesture detector 120 can monitor connected devices over the network 180 for such gesture signals 182, or can passively receive such gesture signals 182.
- the gesture signal 182 can include unique identifiers corresponding to the computing device 178 and/or the connected device 170.
- the gesture signal 182 can further indicate the type of user interaction 176 performed on the connected device 170 by the user 174.
- the gesture signal 182 can indicate that the user interaction 176 corresponds to a squeeze input on the connected device 170.
- the gesture detector 120 can parse the gesture signal 182 to determine the device identifiers 122 for the computing device 178 and/or the connected device 170.
- the gesture detector 120 can output a signal indicating the device identifiers 122 to an association module 110.
- the gesture detector 120 can parse the gesture signal 182 to determine the user interaction 176 on the connected device 170.
- the gesture detector 120 can output an interaction signal 124 indicating the user interaction 176.
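As a sketch of this parsing step, one possible encoding of a gesture signal is JSON carrying the two device identifiers and the interaction type. The patent does not specify a wire format, so the encoding and field names here are assumptions:

```python
import json

def parse_gesture_signal(raw):
    """Split a gesture signal into device identifiers and the interaction
    type, as the gesture detector 120 does in the description above."""
    data = json.loads(raw)
    identifiers = {
        "computing_device": data.get("computing_device_id"),
        "connected_device": data.get("connected_device_id"),
    }
    return identifiers, data.get("interaction")


signal = ('{"computing_device_id": "phone-178", '
          '"connected_device_id": "toy-170", "interaction": "squeeze"}')
ids, interaction = parse_gesture_signal(signal)
# interaction == "squeeze"
```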
- the association module 110 can receive the device identifiers 122 from the gesture detector 120 and perform a lookup in an identifier database 130 included in the system 100.
- the identifier database 130 can include user accounts 132 and/or user profiles associated with computing devices (e.g., computing devices 178, 198) and/or connected devices (e.g., connected devices 170, 190) as disclosed.
- the system 100 can set up a user account 132, which can include one or more connected device identifiers 134 and one or more computing device identifiers associated with the user 174.
- the user account 132 may include a connected device identifier corresponding to the connected device 170, and a computing device identifier corresponding to the computing device 178.
- the user account 132 may be set up to include any number of identifiers for connected devices and/or computing devices associated with the user 174 or any other connected device or computing device.
- the identifier database 130 can include association information indicating devices associated with the computing device 178, the connected device 170, and/or the user 174.
- association information can include associated device identifiers 138 for devices associated as described.
- any combination of connected devices and computing devices can be paired (e.g., via established connection or inductive pairing), which can be detected by the system 100 to form associations between paired devices.
- the connected device 170 and the associated connected device 190 can be preconfigured as a pair, and therefore the identifier database 130 can include association information indicating that the connected device 170 and the associated connected device 190 are indeed associated.
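A minimal in-memory stand-in for the identifier database's association information might record each pairing symmetrically. The real schema is not specified in the description, so this shape is an assumption:

```python
from collections import defaultdict

class IdentifierDatabase:
    """Toy stand-in for the identifier database 130's association data."""

    def __init__(self):
        self._associations = defaultdict(set)

    def pair(self, device_a, device_b):
        # Pairing is symmetric: each device becomes associated with the other.
        self._associations[device_a].add(device_b)
        self._associations[device_b].add(device_a)

    def associated_devices(self, device_id):
        return sorted(self._associations[device_id])


db = IdentifierDatabase()
db.pair("toy-170", "toy-190")
```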
- the association module 110 can look up, in the identifier database 130, the associated device identifiers 138 corresponding to any number of connected devices associated with the computing device 178, the connected device 170, and/or the user 174.
- the associated device identifiers 138 can be sent to the response selector 140, which determines which response gesture is to be performed by associated devices corresponding to the associated device identifiers 138.
- the response selector 140 can also receive the user interaction signal 124 from the gesture detector 120, which indicates the user interaction 176 performed on the connected device 170.
- the response selector 140 can process the user interaction signal 124 to determine the type of user interaction 176 performed on the connected device 170.
- the response selector 140 can make a determination regarding an associated gesture 192 to be performed by the associated connected device 190. For example, based on the user interaction signal 124, the response selector 140 can determine any number of response gestures, each of which can include one or more haptic, visual, audible, or physical gestures to be performed by the connected device 190. Furthermore, the response selector 140 can look up predetermined gestures 137 in a gesture database 135 to select an appropriate response gesture to be performed based on the user interaction 176 with the connected device 170.
- the user interaction 176 may correspond to a squeeze input on the connected device 170, which may cause the connected device 170 itself to perform a gesture including any number or combination of visual, audible, haptic, or physical responses.
- the response selector 140 can look up, in the gesture database 135, a predetermined gesture 137 in response to the user interaction 176.
- the user interaction 176 with the connected device 170 can cause the response selector 140 to choose a predetermined gesture 137 to be performed by the associated connected device 190.
- the response selector 140 can select a predetermined gesture 142 from the stored predetermined gestures 137 in the gesture database 135, such as one having a visual response which causes the associated connected device 190 to light up.
- the selected predetermined gesture 142 can also cause the associated connected device 190 to provide a haptic response.
- the selected predetermined gesture 142 can cause mechanical motion of the associated connected device 190, and/or can further cause an audible action, such as speaking predetermined words or phrases.
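The gesture-database lookup can be sketched as a table from interaction types to multi-channel response gestures. The table entries and channel names below are illustrative, not the patent's data:

```python
# interaction type -> response gesture spanning several output channels
PREDETERMINED_GESTURES = {
    "squeeze": {"visual": "light_up", "haptic": "vibrate",
                "audible": "say_phrase", "mechanical": "wave_arm"},
    "tap": {"visual": "blink"},
}

def select_response_gesture(interaction):
    # Unknown interactions fall back to an empty (no-op) gesture.
    return PREDETERMINED_GESTURES.get(interaction, {})
```

A customized response, as the description also allows, would amount to building such a dict on the fly instead of looking one up.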
- the response selector 140 can configure a customized response to the user interaction 176. According to such variations, the response selector 140 can configure any number or combination of visual, audible, haptic, or physical/mechanical gestures to be performed by the associated connected device 190.
- the response selector 140 communicates the gesture 142 to a response signal generator 150.
- the response signal generator 150 generates a response signal 152 incorporating the specific actions to be performed by the associated connected device 190. Accordingly, once the response signal 152 is generated, the response signal generator 150 can transmit the response signal 152 to the associated connected device 190, and other connected devices identified by the associated device identifiers 138. For example, the response signal 152 can be transmitted over the network 180 to the associated devices.
- the response signal 152 may be sent over the network 180 to the associated computing device 198 or associated connected device 190 anywhere in the world.
- the computing device 198 can be connected to the network via a communication link 188 to receive the response signal 152 and relay it to the associated connected device 190 to perform the associated gesture 192.
- the gesture detector 120 can receive simultaneous gesture signals from any number of associated devices. For example, while receiving the gesture signal 182 corresponding to the user interaction 176 with the connected device 170, the gesture detector 120 may receive a simultaneous signal indicating simultaneous user interaction (by another user) with the associated connected device 190.
- the association module 110 can recognize such simultaneous interaction, and the response selector 140 may select a predetermined response based on the simultaneous interaction. For example, based on the simultaneous interaction, the response selector 140 may cause the response signal generator 150 to generate simultaneous response signals to be transmitted to both the connected device 170 and the associated connected device 190.
- the simultaneous response signals can be generated to cause the connected device 170 and the associated connected device 190 to perform the same or similar gestures selected by the response selector 140. Alternatively, the simultaneous response signals may be generated to intensify the gesture performed by the connected device 170 and the associated connected device 190 in response to the simultaneous user interactions.
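One possible policy for the simultaneous case is to send both devices the same gesture at a higher intensity. The doubling factor is an assumed policy; the description only says the gesture may intensify:

```python
def build_response_signals(device_ids, simultaneous):
    """Build per-device response signals; simultaneous interactions on
    associated devices intensify the shared gesture."""
    intensity = 2.0 if simultaneous else 1.0
    return {device: {"gesture": "vibrate", "intensity": intensity}
            for device in device_ids}


signals = build_response_signals(["toy-170", "toy-190"], simultaneous=True)
```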
- the system 100 can receive indications or determine that one or more associations have expired or that connected devices have been unpaired.
- the system 100 can include a timer 133 that can initiate when a connected device 170 and an associated connected device 190 are paired. After a predetermined duration, the pairing can expire and the connected device 170 and the associated connected device 190 can be automatically unpaired. This unpairing may involve disassociating the unique identifiers corresponding to the connected device 170 and the associated connected device 190.
- Such a disassociation can be made by editing a user profile or user account 132 in the identifier database 130.
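The timer-based expiry could look like the sketch below, with an injectable clock so the policy is testable. The duration itself, and the clock source, are assumptions left open by the description:

```python
import time

class ExpiringPairing:
    """Sketch of timer-based unpairing: a pairing lapses after a
    predetermined duration."""

    def __init__(self, duration_seconds, clock=time.monotonic):
        self.duration = duration_seconds
        self.clock = clock
        self.paired_at = {}  # frozenset({a, b}) -> pairing start time

    def pair(self, a, b):
        self.paired_at[frozenset((a, b))] = self.clock()

    def unpair(self, a, b):
        # Disassociate the two identifiers, mirroring the database edit.
        self.paired_at.pop(frozenset((a, b)), None)

    def is_paired(self, a, b):
        started = self.paired_at.get(frozenset((a, b)))
        return started is not None and self.clock() - started < self.duration
```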
- the system 100 can receive an unpairing signal indicating that the connected device 170 and the associated connected device 190 have been unpaired.
- the connected device 170 and the associated connected device 190 can be unpaired, for example, by accessing the identifier database 130 to disassociate the unique identifiers corresponding to the connected device 170 and the associated connected device 190.
- a specified user interaction 176 on the connected device 170 may ultimately indicate that only one specific associated device, out of a plurality, is to receive a response signal 152.
- the user 174 may wish to communicate a gesture to a specified robotic teddy bear possessed by the user's son or daughter.
- a specified user interaction, such as a tapping gesture on the connected device 170 or a squeezing input on a specified portion of the connected device, can be determined by the response selector 140, which can direct the response signal generator 150 to transmit a corresponding response signal 152 only to the specified robotic teddy bear.
- the response signal 152 can be generated to cause the robotic teddy bear to perform a specified associated gesture 192 based on the specified user interaction.
- the system 100 can detect when two connected devices are within a predetermined distance from each other. Such detection can be performed via location-based resources on the computing devices 178, 198. In response to such detection, the response signal generator 150 can transmit respective response signals to the computing devices 178, 198 to cause them to each perform a predetermined gesture. Such a gesture may be specific to proximity detection by the system 100. Furthermore, such a gesture may be selected to intensify, via a series of response signals 152, as the connected devices 170, 190 get closer in proximity.
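The proximity-triggered intensification might be modeled as a simple function of distance. The 50-meter trigger distance and the linear ramp below are assumptions; the description only says the gesture triggers within a predetermined distance and intensifies as the devices get closer:

```python
def proximity_intensity(distance_m, trigger_distance_m=50.0):
    """Gesture intensity in [0, 1] that ramps up as two connected
    devices approach each other; 0 outside the trigger distance."""
    if distance_m >= trigger_distance_m:
        return 0.0
    return 1.0 - distance_m / trigger_distance_m
```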
- the system 100 can detect instances when computing devices have launched the gesture application 162. Accordingly, prior to receiving the gesture signal 182, the system 100 can receive a launch signal indicating that the computing device 178 has launched the gesture application. Furthermore, prior to transmitting the response signal 152, the system 100 can make a determination whether the associated computing device 198 is currently running the gesture application 162. In response to determining that the associated computing device 198 is not currently running the gesture application 162, the system 100 can associate or tag the user account in the identifier database 130 indicating that a specified response signal 152 selected by the response selector 140 needs to be transmitted to the associated computing device 198.
- the system 100 can queue the transmission of the response signal 152 until a subsequent launch signal is received indicating that the associated computing device 198 has launched the gesture application 162.
- the response signal 152 can be automatically transmitted to the associated computing device 198 to perform the associated gesture 192.
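The queue-until-launch behavior can be sketched as follows; the class and method names are illustrative, not from the patent:

```python
from collections import defaultdict, deque

class ResponseQueue:
    """Sketch of deferring response signals until the target device's
    gesture application reports a launch."""

    def __init__(self):
        self.running = set()               # devices with the app launched
        self.pending = defaultdict(deque)  # device id -> queued signals

    def send(self, device_id, signal):
        """Deliver immediately if the app is running; otherwise queue."""
        if device_id in self.running:
            return [signal]
        self.pending[device_id].append(signal)
        return []

    def on_launch(self, device_id):
        """On a launch signal, flush anything queued for the device."""
        self.running.add(device_id)
        return list(self.pending.pop(device_id, deque()))
```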
- the computing devices 178, 198 can be any device capable of running the gesture application 162, and/or Wi-Fi enabled devices.
- such computing devices 178, 198 may correspond to laptops, PCs, smartphones, tablet computing devices, and the like.
- FIG. 2 is an example flow chart showing a method for causing gesture responses for connected devices.
- the gesture detector 120 included in the system 100 receives a gesture signal 182 (210).
- an association module 110 performs a lookup in an identifier database 130 to identify connected devices or computing devices associated with the first connected device 170 (220).
- the gesture detector 120 can determine the user interaction performed on the connected device 170 (230). For example, sensors on the connected device 170 can be triggered during the user interaction 176, the data of which can be communicated to the system 100.
- the response selector 140 can determine or otherwise select an appropriate gesture 142 from a collection of predetermined gestures 137 (240). Alternatively, the response selector 140 can cause the response signal generator 150 to generate a custom response in accordance with the user interaction 176.
- the response signal generator 150 can then generate a specified response signal 152 according to the selected gesture 142 by the response selector 140 (250).
- the response signal 152 can be generated by the response signal generator 150 to cause the associated connected device 190 to perform the associated gesture 192. Accordingly, the response signal 152 can then be transmitted to the associated connected device 190 (250).
- FIG. 3 is an example flow chart illustrating a more detailed method for associating devices and selecting response gestures for associated devices.
- the system 100 may receive one or more pairing signals indicating one or more pairings between connected devices (310).
- connected devices may be paired (and unpaired) via inductive coupling.
- the system accesses the identifier database 130 to append or modify user accounts to make associations between the paired devices (320).
- connected devices (e.g., robotic toys), as well as the computing devices corresponding to the users of those connected devices, can be associated in the identifier database (324).
- the gesture detector 120 can receive any number of gesture signals 182 indicating launched gesture applications 162 and user interactions with connected devices (330).
- the gesture signals 182 can be received continuously and dynamically, and subsequent response signals 152 may be generated accordingly.
- the association module 110 performs lookups in the identifier database 130 to identify all associated devices (340). The association module 110 determines whether associated devices exist in the identifier database (342). If associated devices are not found in the identifier database 130 (344), the system 100 ends the process (390). However, if associated devices are found for a respective connected device (346), the gesture detector 120 proceeds to determine the gesture performed on the respective connected device based on the user interaction and submit the user interaction signal 124 to the response selector 140 (350).
- the response selector 140 can select an appropriate response gesture (360). For example, a squeeze input on the respective connected device can cause the response selector to choose a response gesture that incorporates any number of audio, visual, haptic, and/or physical/mechanical actions.
- the response selector 140 can select a predetermined gesture from a gesture database 135, where response gestures are pre-associated with input gestures corresponding to the user interaction with the respective connected device. Additionally or alternatively, the response selector 140 can select any number or combination of physical gestures (362), audible gestures (364), visual gestures (366), or even haptic responses to be performed by the associated devices.
- the response signal generator 150 can generate the response signal 152 corresponding to the selected or determined gesture 142.
- the response signal 152 is then transmitted to the associated devices to cause them to perform an associated gesture 192 corresponding to the determined or selected gesture by the response selector 140 (380).
- the generated response signal 152 can be transmitted anywhere in the world to the associated connected devices.
- the response signal is configured to cause the associated device to perform the selected actions corresponding to the determined or selected gesture. Thereafter, the process is ended (390).
- FIG. 4 is an example of a connected device to receive user interactions and perform response gestures.
- the connected device 400 can be linked to a computing device 490, which, in accordance with the above description, can run a gesture application specific to receiving user interactions and performing gesture responses.
- the connected device 400 can be linked 425 to the mobile computing device 490 via a communication link (e.g., Bluetooth, RF, infrared, optical, etc.).
- the connected device 400 includes internal electronics that allow it to create verbal and non-verbal gestures, utilizing vibrations, tones, lights, and/or mechanical gestures.
- the connected device 400 can be directly connected to a network for communication with other connected devices or computing devices.
- the connected device 400 can relay signals through the computing device 490.
- the computing device 490 can run the gesture application and the link 425 can be established automatically, or configured by a user.
- the connected device 400 can include a pairing port 435, which allows the connected device 400 to pair with other connected devices.
- the pairing port 435 may comprise one or more coils to communicate with the computing device 490 and/or other connected devices. Accordingly, the connected device 400 can inductively pair with such other devices to allow the system 100, as disclosed in FIG. 1, to form associations between the devices.
- connected devices may pair with each other through established links over a graphical user interface via the gesture application, or simply by inductive pairing, where devices are tapped together to form the pairing.
- Such an inductive pairing may be indicated by a gesture response on one or more of the connected devices (e.g., haptic response and/or lighting up).
- the connected device 400 can receive inputs from a user that can be detected by one or more sensors 480 on the connected device 400.
- the sensors 480 can include any number and combination of sensor types.
- the sensors 480 can include a number of accelerometers, touch sensors, pressure sensors, thermal sensors, analog buttons, and the like.
- Such sensors 480 can be arranged on and within the connected device 400 to detect any number of user interactions, such as squeezing, throwing, tapping, rotating, dropping, shaking, twisting, or breaking a whole or a portion of the connected device 400.
- Such user interactions may be communicated to other connected devices over long distances (e.g., anywhere in the world).
- the transceiver 420 can be any suitable wireless transceiver to establish the link 425 with the computing device 490 or a network.
- the transceiver 420 can be a Bluetooth or other RF transceiver module.
- Raw sensor data corresponding to user interactions with the connected device 400 can be directly communicated to the computing device 490 for external processing.
- the sensor data can be processed internally on the connected device 400 to provide information related to the type of user interaction.
- a memory 440 can be included to store lookup information related to types of user interactions in correlation with sensor inputs.
- the input from the sensors 480 can be processed by a controller 430, which can determine, based on the sensor inputs, the type of user interaction performed on the connected device 400. Accordingly, the controller 430 can communicate information relating to the type of user interaction to the computing device 490 via the transceiver 420.
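The lookup the controller 430 performs might resemble the following sketch: raw readings are matched against stored interaction profiles to yield an interaction type. The sensor names, thresholds, and table structure are invented for illustration.

```python
# Minimal sketch of a memory-resident lookup correlating sensor inputs
# with user-interaction types (names and thresholds are assumptions).
INTERACTION_LOOKUP = [
    # (sensor name, predicate on reading, interaction type)
    ("pressure",    lambda v: v > 0.8, "squeeze"),
    ("accel_peak",  lambda v: v > 3.0, "shake"),
    ("touch_count", lambda v: v >= 2,  "tap"),
]

def classify_interaction(readings):
    """Map a dict of sensor readings to the first matching interaction type."""
    for sensor, predicate, interaction in INTERACTION_LOOKUP:
        value = readings.get(sensor)
        if value is not None and predicate(value):
            return interaction
    return "unknown"
```

Either the controller 430 performs this classification internally, or the raw readings are forwarded to the computing device 490 for the equivalent external processing.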
- the memory 440 can further store instructions executable by the controller 430. Such instructions can cause the controller 430 to perform various operations based on sensor inputs from the sensors 480, and/or communications transmitted from the computing device 490 or over a network. For example, a user interaction with the connected device 400 can cause the controller 430 to operate any number of internal electronic components included with the connected device 400. Such electronics can include, for example, a light system 460 including one or more lights on the connected device 400, an audio system including one or more auditory devices (e.g., a speaker), a haptic system 470 to cause a whole or one or more portions of the connected device 400 to vibrate, or a mechanical system 450 to cause the connected device 400 to perform physical gestures.
- Thus, the controller 430 can ultimately control the connected device 400 to perform any number of gestures incorporating any of the foregoing systems. For example, a user performing a squeeze input on the connected device 400 can cause the connected device 400 to light up and vibrate.
- an input (e.g., a squeeze input) on an associated connected device located any distance from the connected device 400 can cause the connected device 400 to perform a gesture.
- a user interaction with a distant associated connected device can be communicated, via the computing device 490, to the connected device 400, which can perform an associated gesture (e.g., light up and raise its arms).
- gestures may be banked either in the memory 440 of the connected device 400, or within the system 100 as described with respect to FIG. 1.
- Banked gestures can correspond to received data that a user interaction has been performed on an associated connected device.
- the connected device 400 may be in a deep sleep mode, or dormant mode, when such a user interaction on a distant connected device takes place. Accordingly, a gesture may be saved for the connected device 400 to be performed when the connected device awakes.
- Awakening the connected device 400 can be achieved by any suitable means.
- the connected device 400 can be awakened by a user touching or moving the connected device 400 itself.
- the connected device 400 can be awakened when the computing device 490 establishes the link 425 or otherwise enters within a predetermined proximity from the connected device 400.
- the device can be awakened to perform a banked gesture by a user pushing a specific button or performing a specific action on the connected device 400.
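The banking-and-wake behavior described above can be sketched as a simple queue: gestures that arrive while the device is dormant are saved, then flushed when any wake trigger (touch, established link, proximity, or a specific button) occurs. The class and method names are illustrative.

```python
# Sketch of gesture banking for a dormant connected device.
class BankedGestures:
    def __init__(self):
        self.awake = False
        self._bank = []  # gestures saved while dormant

    def receive(self, gesture):
        """Perform immediately if awake; otherwise bank the gesture for later."""
        if self.awake:
            return [gesture]
        self._bank.append(gesture)
        return []

    def wake(self):
        """Wake the device (any trigger) and return the banked gestures to perform."""
        self.awake = True
        pending, self._bank = self._bank, []
        return pending
```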
- any of the electronics in the connected device 400 can be removable and can be inserted into another connected device.
- the controller 430 and/or memory 440 can behave as the "brain" of the connected device 400, and can be removable and inserted into another device.
- stored data included in the memory 440 can be transferred between devices.
- a radio frequency identification (RFID) chip 410 can be included in the connected device 400. Accordingly, upon insertion of the brain (i.e., memory 440 and/or controller 430), the system 100 can determine that the user is associated with the connected device 400.
- new or different gestures and/or behaviors stored on the memory 440 can be performed as the brain is transferred from device to device.
- the connected device 400 can further include a location-based system. Accordingly, the connected device 400 can be programmed or otherwise caused to perform any number of gestures upon entering a particular location.
- the connected device 400 can utilize a location-based function on the computing device 490 to be location aware. As an example, the connected device 400 can determine that it is within a certain distance (e.g., 1 mile) from, for example, a home location or a theme park, causing the connected device 400 to perform a preselected gesture.
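The proximity check in the example above can be sketched with the haversine great-circle formula. The 1-mile radius comes from the text; the function names and coordinate handling are assumptions of this sketch.

```python
# Illustrative location-aware trigger: fire a preselected gesture when the
# device comes within a radius (in miles) of a home or theme-park location.
import math

EARTH_RADIUS_MI = 3958.8

def distance_miles(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two lat/lon points, in miles."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MI * math.asin(math.sqrt(a))

def should_trigger_gesture(device_pos, target_pos, radius_mi=1.0):
    """True when the device is within radius_mi of the target location."""
    return distance_miles(*device_pos, *target_pos) <= radius_mi
```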
- FIG. 5 is a block diagram that illustrates a computer system upon which examples described herein may be implemented. For example, one or more components discussed with respect to the system 100 of FIG. 1 and the methods of FIGS. 2-3 may be implemented by the system 500 of FIG. 5. The system 100 can also be implemented using a combination of multiple computer systems as described by FIG. 5.
- the computer system 500 includes at least one processor 510 for processing information and a main memory 520, such as a random access memory (RAM) or other dynamic storage device, for storing information and instructions 522 to be executed by the processor 510.
- the main memory 520 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor 510.
- the computer system 500 may also include a read only memory (ROM) 530 or other static storage device for storing static information and instructions for the processor 510.
- a storage device 540, such as a magnetic disk or optical disk, is provided for storing information and instructions.
- the storage device 540 can correspond to a computer-readable medium that stores gesture logic 542 for performing operations discussed with respect to FIGS. 1-4.
- the communication interface 550 can enable the computer system 500 to communicate with one or more networks 580 (e.g., a cellular or Wi-Fi network) through use of the network link (wireless or wireline). Using the network link, the computer system 500 can communicate with a plurality of devices, such as the mobile computing devices of the clients and service providers.
- the computer system 500 can further receive gesture signals 582 from the mobile computing devices of the clients and service providers via the network link.
- the communication interface 550 can further be utilized to transmit response signals 584 to various mobile computing devices in response to the gesture signals 582.
- the ROM 530 (or other storage device) can store device identifiers 532 and user accounts 534, which include various user information concerning previous device connections and device associations.
- the processor 510 can access the user accounts 534 to look up device identifiers 532 to determine the particular associations 512 between connected devices and computing devices. Once the processor 510 determines the associations 512, the processor 510 can make response selections 514 and generate response signals 584 to be transmitted to those associated devices.
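The server-side flow just described (look up device identifiers in a user account, determine associations, then generate response signals) can be pictured as a simple fan-out. The account structure and field names are assumptions of this sketch.

```python
# Hypothetical association lookup and response-signal generation, modeled
# on the described flow through user accounts 534 and device identifiers 532.
USER_ACCOUNTS = {
    "alice": {"devices": ["bear-001", "bear-002"]},
}

def generate_response_signals(account, source_device, gesture, response):
    """Fan a gesture out to every associated device except the source device."""
    associated = USER_ACCOUNTS[account]["devices"]
    return [
        {"target": dev, "gesture": gesture, "response": response}
        for dev in associated
        if dev != source_device
    ]
```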
- Examples described herein are related to the use of computer system 500 for implementing the techniques described herein. According to one example, those techniques are performed by computer system 500 in response to processor 510 executing one or more sequences of one or more instructions contained in main memory 520, such as the gesture logic 542. Such instructions may be read into main memory 520 from another machine-readable medium, such as storage device 540. Execution of the sequences of instructions contained in main memory 520 causes processor 510 to perform the process steps described herein. In alternative implementations, hardwired circuitry may be used in place of or in combination with software instructions to implement examples described herein. Thus, the examples described are not limited to any specific combination of hardware circuitry and software.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462002706P | 2014-05-23 | 2014-05-23 | |
PCT/US2015/032299 WO2015179838A2 (en) | 2014-05-23 | 2015-05-22 | Causing gesture responses on connected devices |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3146412A2 true EP3146412A2 (en) | 2017-03-29 |
EP3146412A4 EP3146412A4 (en) | 2017-12-06 |
Family
ID=58056060
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15795423.1A Withdrawn EP3146412A4 (en) | 2014-05-23 | 2015-05-22 | Causing gesture responses on connected devices |
Country Status (1)
Country | Link |
---|---|
EP (1) | EP3146412A4 (en) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6160540A (en) * | 1998-01-12 | 2000-12-12 | Xerox Corporation | Zoomorphic computer user interface |
WO2001009863A1 (en) * | 1999-07-31 | 2001-02-08 | Linden Craig L | Method and apparatus for powered interactive physical displays |
WO2007072295A2 (en) * | 2005-12-22 | 2007-06-28 | Koninklijke Philips Electronics N.V. | Valentine pillow |
JP6036821B2 (en) * | 2012-06-05 | 2016-11-30 | ソニー株式会社 | Information processing apparatus, information processing method, program, and toy system |
-
2015
- 2015-05-22 EP EP15795423.1A patent/EP3146412A4/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
EP3146412A4 (en) | 2017-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9769686B2 (en) | Communication method and device | |
CN106416317B (en) | Method and apparatus for providing location information | |
JP6490890B2 (en) | Information providing method and portable terminal therefor | |
EP2738706B1 (en) | Method and mobile terminal for controlling screen lock | |
US10912131B2 (en) | Method and mobile terminal for controlling bluetooth low energy device | |
KR102092063B1 (en) | Method And Apparatus For Performing Communication Service | |
EP2663110A1 (en) | Near Field Communication Tag Data Management | |
US20150245164A1 (en) | Interaction between wearable devices via broadcasted sensor-related data | |
US10591589B2 (en) | Apparatus and method for measuring wireless range | |
KR20140127895A (en) | Sensor based configuration and control of network devices | |
EP3474517B1 (en) | Electronic device for controlling iot device to correspond to state of external electronic device and operation method thereof | |
US20150338925A1 (en) | Causing gesture responses on connected devices | |
KR102209068B1 (en) | Method for reconnecting master device and slave device | |
CN104641615A (en) | Portable token for pairing two devices | |
JP2018166341A (en) | Method for controlling ble devices and mobile terminal therefor | |
KR20200044505A (en) | Electronic device suporting link sharing and method therefor | |
EP3170330B1 (en) | Method and electronic device for providing data | |
KR20180114755A (en) | Device And Communication Connection Method Thereof | |
US9331745B2 (en) | Electronic device and communication system for mediating establishment of communication between plurality of communication devices | |
CN105094966B (en) | Control the method and device of PC | |
CN110063052B (en) | Method and system for confirming pairing | |
US11304076B2 (en) | Electronic apparatus and method for controlling the electronic apparatus | |
US11032376B2 (en) | Electronic device for controlling registration session, and operation method therefor; and server, and operation method therefor | |
KR101967320B1 (en) | Method and apparatus for associating online accounts | |
US11425081B2 (en) | Message reception notification method and electronic device supporting same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20161123 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20171107 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06F 3/01 20060101AFI20171030BHEP Ipc: G06F 17/30 20060101ALI20171030BHEP Ipc: G06F 15/16 20060101ALI20171030BHEP |
|
REG | Reference to a national code |
Ref country code: HK Ref legal event code: DE Ref document number: 1240338 Country of ref document: HK |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20191011 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20200222 |
|
REG | Reference to a national code |
Ref country code: HK Ref legal event code: WD Ref document number: 1240338 Country of ref document: HK |