WO2016113693A1 - Wearable data processing and control platform apparatuses, methods and systems - Google Patents

Wearable data processing and control platform apparatuses, methods and systems

Info

Publication number
WO2016113693A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
user
wearable
display
processor
Prior art date
Application number
PCT/IB2016/050161
Other languages
French (fr)
Inventor
Simon Tian
Original Assignee
Neptune Computer Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Neptune Computer Inc. filed Critical Neptune Computer Inc.
Publication of WO2016113693A1 publication Critical patent/WO2016113693A1/en


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/30Security of mobile devices; Security of mobile applications
    • H04W12/33Security of mobile devices; Security of mobile applications using wearable devices, e.g. using a smartwatch or smart-glasses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F13/00Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J7/00Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J7/00032Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries characterised by data exchange
    • H02J7/00045Authentication, i.e. circuits for checking compatibility between one component, e.g. a battery or a battery charger, and another component, e.g. a power source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0853Network architectures or network communication protocols for network security for authentication of entities using an additional device, e.g. smartcard, SIM or a different communication terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/03Protecting confidentiality, e.g. by encryption
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W76/00Connection management
    • H04W76/10Connection setup
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J50/00Circuit arrangements or systems for wireless supply or distribution of electric power
    • H02J50/10Circuit arrangements or systems for wireless supply or distribution of electric power using inductive coupling
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J50/00Circuit arrangements or systems for wireless supply or distribution of electric power
    • H02J50/10Circuit arrangements or systems for wireless supply or distribution of electric power using inductive coupling
    • H02J50/12Circuit arrangements or systems for wireless supply or distribution of electric power using inductive coupling of the resonant type
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L2463/00Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00
    • H04L2463/082Additional details relating to network architectures or network communication protocols for network security covered by H04L63/00 applying multi-factor authentication

Definitions

  • Body-borne technology or wearables are small devices that can be embedded in clothing and/or personal accessories.
  • the non-intrusive nature and usability of body-borne technologies provide users with a unique human-device interaction wherein users can utilize a technology as an extension of their mind and body.
  • Embodiments of the present invention include a wearable device, such as a wrist- worn computer, which may be enabled with wide area network (WAN) (e.g., 2G, 3G and 4G LTE), built-in global positioning system (GPS), Bluetooth and WiFi connectivity mechanisms.
  • the wearable device may effectively replace smartphones, tablets, laptops, desktops and smart TVs, by being paired with different screens and input/output devices.
  • the wearable device may be charged wirelessly and may utilize a high-bandwidth wireless protocol (e.g., WiGig, Bluetooth, and/or WiFi Direct) to stream video, audio, data, and various other content to a variety of screens.
  • the wearable device may also comprise multiple data sensors, such as an accelerometer, gyroscope, digital compass and/or the like, as well as sizable internal storage.
  • the wearable device may be controlled from other input devices; for example, the wearable device may receive commands from a paired device such as a mobile phone, a tablet and/or other computer-based devices.
  • the wearable device may have a screen suitable to display indications like time/date, notifications, connectivity toggles, and the like, as well as a touchscreen, and motion sensors.
  • the wearable device may use a secure and passive authentication method (e.g., heart signature) which can verify unique wave patterns of a user heart's electrical activity to allow user access to one or more applications installed in the wearable device.
  • Another exemplary embodiment is directed to a dongle, which resembles a typical webcam that mounts to the top part of a monitor or TV, and which may receive wireless video and audio signals from the wearable device and may transmit the signals via high definition multimedia interface (HDMI) to a TV, computer monitor, and/or the like display devices.
  • the device may comprise three parts: the plug (HDMI), the cord, and a satellite part comprising a camera, microphone and audio jack that mounts to the top part of a display.
  • the dongle may transmit video, audio, sensory input and the like signals back to the wearable device.
  • a further exemplary embodiment is directed to a display device which resembles a typical smartphone.
  • This device may comprise a plurality of components, including but not limited to a display, capacitive touch panel, loudspeaker, in-ear speaker, microphone, cameras, and sensors.
  • Such a display device can operate as an input and output interface for the wearable device.
  • the display device may comprise connectivity mechanisms (e.g., Bluetooth low energy (BTLE), wireless gigabit (WiGig), Wi-Fi direct (WFD) and the like) to pair and connect with a wearable device wirelessly.
  • the display device may comprise a magnetic resonance module to wirelessly charge the battery of a wearable device.
  • FIGURE 1A illustrates a front view of a wrist-shaped wearable device.
  • FIGURE IB illustrates a side view of a wrist-shaped wearable device.
  • FIGURE 1C illustrates an isometric view of a wrist-shaped wearable device.
  • FIGURE 2 illustrates an example block diagram of the components comprised by a wearable device.
  • FIGURE 3A illustrates a wireless multimedia interface apparatus.
  • FIGURE 3B illustrates a block diagram of a wireless multimedia interface apparatus.
  • FIGURES 4A-C illustrate back, side and front views of an example display device with enhanced communication mechanisms to interface with a wearable device.
  • FIGURE 5 illustrates a block diagram for an enhanced display device.
  • FIGURES 6A-B illustrate an example of proximity awareness and the initiation of communication between a wearable device and a display device.
  • FIGURE 7 illustrates an example gesture controlled device tethering method.
  • FIGURES 8A-B illustrate an example of a gesture control to define a virtual control surface for data manipulation.
  • FIGURE 9 illustrates an example of a gesture tracking module comprised by a wearable device.
  • FIGURES 10A and 10B illustrate an example of motion-controlled device tethering between a wrist-shaped band device and multiple displays.
  • FIGURE 11 illustrates a physical trackpad utilizing a display device with touch screen.
  • FIGURE 12 illustrates an example of motion-controlled device tethering between a wearable device and a domestic appliance.
  • FIGURE 13 illustrates an example of a hardware-based authentication process employing a wearable device.
  • FIGURE 14 illustrates an example of a customized advertising process employing a plurality of wearable devices.
  • FIGURE 15 illustrates an example of wirelessly charging a wearable device through magnetic resonance.
  • FIGURES 16A and 16B illustrate an example of wirelessly exchanging social profiles through wearable devices.
  • FIGURES 17A and 17B illustrate an example of configuration of motion controlled social connection.
  • FIGURES 18A and 18B illustrate an example of logged information and further alternative actions after attending an enhanced event with wearable devices.
  • FIGURES 19A and 19B illustrate an example of a configuration for programming and customizing motion patterns.
  • FIGURE 20 illustrates an enhanced conference employing wearable devices.
  • FIGURE 21 illustrates an example of parallel output processes performed by a wearable device.
  • FIGURE 22 illustrates an example of a gesture indicating a first time tethering to a device.
  • FIGURES 23A and 23B illustrate an example of user interfaces for a first-time tethering to a device and for a communication mechanism settings.
  • FIGURE 24 illustrates an example logic flow of a device recognition (DR) component.
  • FIGURES 25A and 25B illustrate an example of magnetic charging connectors and logic flow for charging mode recognition.
  • FIGURE 26 illustrates face apps available for the wrist-shaped device.
  • FIGURE 27 illustrates an example use case for switching between apps that are most frequently used by a user.
  • FIGURE 28 illustrates an example use case for switching between apps featuring an apps grid.
  • FIGURE 29 illustrates an example of screens to display a notification summary area.
  • FIGURE 30 illustrates a use case for turning on and using the wearable device contact dash feature.
  • FIGURE 31 illustrates an example of a flexible wearable device with a rubber inner cover and glass exterior.
  • FIGURE 32 illustrates an example of a pocket size capacitive touchscreen display device enhanced with camera and sensors.
  • FIGURE 33 illustrates an example of a tablet enhanced with a capacitive touchscreen, camera, and sensors.
  • FIGURE 34 illustrates an example of a keyboard with a hinge and metal latch to plug in a tablet.
  • FIGURE 35 illustrates an example of a Bluetooth-enabled headset as a wearable necklace.
  • FIGURE 36 illustrates an example of a high definition multimedia interface (HDMI) dongle.
  • FIGURE 37 illustrates an example Integrated Development Environment for the generation of SDKs.
  • FIGURE 38 illustrates an example motion pattern to indicate the storage of digital data in a hybrid storage cloud environment.
  • FIGURE 39 illustrates an example digital data replacement/retention policy Least-Recently Used (LRU).
  • FIGURE 40 illustrates an example of a wearable device working as a surrogate communication device.
  • FIGURES 1A-1C illustrate different views of a wearable device 105.
  • the wearable device may comprise a wrist-band shaped body 104 to be worn and removed from a wrist through a gap between two disjoint ends.
  • the wearable device may have a protective shell made of transparent and flexible nylon that encapsulates a black- painted printed circuit board made of flexible materials (e.g., Polyimide core, overlay, and/or the like) combined with multilayers of a rigid material (e.g., FR4 IPC- 101/21 or high-Tg FR4 filled and/or the like) to provide a built-in connection and/or to make a three- dimensional wrist-band shaped form comprising the circuit components.
  • the wearable device may comprise a digital display 101, visible via an outer surface of the wrist- band shaped body, to display a time/ date, connectivity toggles, and notifications such as incoming emails, incoming text messages, event alerts, and/or the like.
  • the digital display may comprise tactile capabilities such as control through simple or multi-touch gestures, compatibility with touch screen stylus pens, free hand writing recognition and/or the like.
  • the wearable device may appear in the form of a wrist band, a head band, a helmet, a neck tie, a pin, an arm band, a waist belt, foot wear, a badge, a key chain fob, and/or other wearable accessories.
  • the wearable device may comprise a plurality of motion sensors, e.g., a 3-axis accelerometer, a 3-axis gyroscope, and/or the like disposed within the wearable device body to detect movement of the wearable device.
  • the wearable device may comprise a vibration motor disposed within the wearable device body to enable the wearable device to vibrate.
  • the vibration motor may comprise piezoelectric vibration mechanisms to facilitate different vibration patterns, each pattern having a specific meaning to a user; for example, a vibration pattern can be associated with incoming communication received from a predetermined user.
  • the wearable device may comprise a wireless communication transceiver, disposed within the wearable device body, to receive and transmit communications such as telephonic, instant messaging, email and the like communications.
  • the wearable device may comprise a processor disposed within the wearable device body and operably coupled to the digital display, the motion sensors, the vibration motor, and the wireless communication transceiver.
  • the wearable device may determine a type of communication request and may generate a notification based on the type of communication request to be displayed on the digital display.
  • the wearable device may determine a vibration pattern based on the type of the communication request for the vibration motor to vibrate the wearable device body according to the vibration pattern.
  • a user may be able to customize these vibration patterns to, for example, distinguish incoming communications from different contacts.
  • the configuration of customized vibration patterns may be enabled by a tangible haptic interface wherein the user may tap a configuration sequence on the wearable device screen, wherein each tap on the configuration sequence is synchronized with a vibration pulse chosen from a pre-recorded set of unique vibration pulses available to configure customized vibration patterns.
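  • As a rough illustration of the notification-to-vibration mapping and the tap-to-configure flow described above, the following sketch encodes vibration patterns as alternating pulse/pause durations; the function names, durations, and dictionary layout are illustrative assumptions, not the device's actual firmware interface.

```python
# Minimal sketch: mapping communication types (and specific contacts) to
# vibration patterns, encoded as alternating pulse/pause durations in ms.
# All names and encodings here are illustrative assumptions.

DEFAULT_PATTERNS = {
    "incoming_call": [300, 100, 300],        # long-short-long
    "text_message":  [100, 50, 100],         # two short pulses
    "email":         [150],                  # single pulse
    "event_alert":   [80, 40, 80, 40, 80],   # three quick pulses
}

# A user-customized pattern for a specific contact overrides the default.
CONTACT_PATTERNS = {
    "alice@example.com": [500, 100, 100, 100, 500],
}

def pattern_for(comm_type: str, sender: str | None = None) -> list[int]:
    """Return the pulse/pause sequence to drive the vibration motor."""
    if sender and sender in CONTACT_PATTERNS:
        return CONTACT_PATTERNS[sender]
    return DEFAULT_PATTERNS.get(comm_type, [200])

def record_tapped_pattern(tap_timestamps_ms: list[int]) -> list[int]:
    """Derive a custom pattern from the intervals between user taps,
    mirroring the tap-to-configure flow described above."""
    fixed_pulse = 100  # each tap is rendered as a 100 ms pulse (assumption)
    pattern = []
    for i, t in enumerate(tap_timestamps_ms):
        pattern.append(fixed_pulse)
        if i + 1 < len(tap_timestamps_ms):
            pattern.append(max(50, tap_timestamps_ms[i + 1] - t - fixed_pulse))
    return pattern

if __name__ == "__main__":
    print(pattern_for("text_message", sender="alice@example.com"))
    print(record_tapped_pattern([0, 250, 500, 1000]))
```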
  • the wearable device may sense a user movement input, and interpret the movement as a control command.
  • the control command can be, for example, in response to a notification label indicating an incoming communication and/or a unique vibration notification characterized by a unique oscillation frequency indicative of an incoming communication.
  • the wearable device may execute the determined control command, for example, the user movement input can be associated with an answering communication control command or with an ignoring communication control command.
  • the wearable device may comprise a power supply, operably coupled to the processor, to provide electrical power to the processor and a conductive coil, operably coupled to the power supply, to charge the power supply through magnetic resonance or magnetic induction.
  • the coil in the wearable device can receive electrical current from an alternating magnetic field generated by a transmitter coil.
  • FIGURE IB illustrates a side view of the wearable device 105.
  • the wearable device may comprise one or more polycarbonate plastic layers 102 and/or solid type nylon layers 103 to protect the wearable device's internal components.
  • the solid type nylon layer 103 makes the wearable device waterproof or water resistant.
  • the wearable device may comprise a button to perform a hard reset on the system.
  • a hard reset may be useful when an error or failure is produced in the wearable device from which the wearable device cannot recover.
  • the reset button can normally restart the wearable device operating system and/or can run the operating system in safe mode with only basic drivers enabled.
  • FIGURE 1C illustrates a perspective view of the wearable device.
  • the wrist shaped body of the wearable device 104 may be implemented in different sizes to fit the wrist of numerous users.
  • the wearable device may include a power supply interface, e.g., a power supply plug-in socket on the outer surface of the wearable device for a power supply plug-in.
  • the wearable device may include a loop antenna (e.g., a coiled copper loop, etc.) inside the body 104 for magnetic resonance charging.
  • FIGURE 2 illustrates an example block diagram of the internal components comprised by a wearable device.
  • the wearable device as shown in FIGURES 1A-C may comprise a microprocessor (application processor 201) to enable a plurality of communications, e.g., telephone, text messages, email as well as online and offline operations including but not limited to browsing the Internet, watching video streams, uploading and downloading files, writing text and/or the like operations.
  • the wearable device also comprises a memory module 202 including a low-power double data rate (LPDDR) random access memory that can be connected over a 16-bit or 32-bit memory bus per channel, an embedded multimedia controller (eMMC) and/or the like components.
  • the wearable device may comprise a 3G, LTE telecommunications module 203 including security mechanisms for subscriber identity module (SIM) cards, a dedicated data network processor and/or the like components.
  • the wearable device may comprise a connectivity module 204, including mechanisms to enable wireless connectivity Wi-Fi, mechanisms to establish short-range wireless interconnections (e.g., Bluetooth), pairing mechanisms to perform near field communications (NFC), and/or multicast wireless sensor network technologies and the like.
  • the wearable device may comprise a Wireless Display module 206 including wireless high definition multimedia interface (e.g., WiGig), mechanisms to establish short- range wireless interconnections (e.g., Bluetooth), pairing mechanisms to perform near field communications (NFC), and/or multicast wireless sensor network technologies and the like.
  • the wearable device may comprise a plurality of sensors, actuators, and computing components 206 including but not limited to a multipoint control unit (MCU); a nine-axis motion tracking sensor with an embedded gyroscope, accelerometer, and compass; one or more buttons; wireless charging mechanisms; an authentication sensor chip; a vibration motor; an LCD touch screen; a global positioning system (GPS); a power block battery; and/or the like components.
  • FIGURE 3A illustrates an embodiment of a wireless multimedia interface apparatus.
  • the wearable device can be wirelessly connected to a wireless multimedia interface apparatus;
  • the multimedia interface apparatus may comprise a wireless transceiver, disposed within the body member 301 of the multimedia interface apparatus, to receive data content via wireless connection from a wearable device.
  • the multimedia interface apparatus may comprise a multimedia data format converter, disposed within the body member and communicatively coupled to the wireless transceiver, to convert a data format of the data content to a multimedia format compatible for display at a screen display device.
  • the multimedia interface apparatus may comprise a multimedia interface connector 302, communicatively coupled to the multimedia data format converter, to be plugged into a multimedia input receptacle of a screen display device and to transmit the data content in the multimedia format to the screen display for display.
  • the multimedia interface apparatus may comprise a power adapter 303 for select televisions.
  • FIGURE 3B illustrates an example of a block diagram of a wireless multimedia interface apparatus.
  • the multimedia interface apparatus may comprise a microcontroller (MICOM) 304, a wireless display module 305 including security and pairing mechanisms, a wireless HDMI mechanism, and additional components 306, including but not limited to a battery and/or charger, HDMI socket and the like components.
  • FIGURES 4A-C illustrate back, side, and front views of a display device with enhanced communication mechanisms to interface with a wearable device.
  • the display device may comprise a high pixel density screen (e.g., 320 ppi) enhanced with a capacitive touch panel 401.
  • the display device may comprise a dual LED flash 403 and a rear-facing camera 402.
  • Further embodiments of the display device may comprise a plurality of additional components including but not limited to transceivers for audio and video streaming, components for Bluetooth low energy connectivity, an embedded microphone, in-ear speaker, loudspeaker, 3.5 mm audio and microphone jack, proximity sensors, 3-axis accelerometer, gyroscope, high capacity battery, components to support wireless charging towards a wearable device, and the like components.
  • FIGURE 5 illustrates an example of a block diagram for an enhanced display device.
  • the display device may comprise a central processing unit (CPU) controlling and executing operations over a plurality of resources and components including but not limited to memory and storage including LPDDR mechanisms, front- and/or rear-facing cameras, illumination sensors, LED lighting, one or more buttons, n-axis sensors where n can be 3, 6 or another positive integer, a battery, a charger mechanism, a voice processor unit, audio codec mechanisms including a loudspeaker, receiver mechanisms and/or microphones, mechanisms to establish a plurality of communications (e.g., Bluetooth, WiFi, NFC, WiGig and/or ANT), an LCD display, one or more touch sensors, a touch controller unit and the like components.
  • FIGURE 6A illustrates an example of proximity awareness between a wearable device and a display device.
  • the wearable device 601 detects a gesture, such as a knocking or waving motion that indicates a request to connect to the display device 602 while the display device remains in standby mode. Thereafter, the wearable device may broadcast a message 603 to all the devices within a predefined proximity area.
  • the display device may receive a broadcast message emitted by a wearable device. Additionally, the display device may respond to the request with an acknowledgement message 604 comprising relevant information to establish a connection.
  • FIGURE 6B illustrates an example of an initiation of communication between a wearable device and a display device.
  • the wearable device 601 may receive a gesture for pairing, such as a knocking or waving motion, with the display device 602. Meanwhile the display device 602 may be waiting for new messages. Thereafter the wearable device may send a message comprising pairing information to the display device 605.
  • the display device 605 may respond to the wearable device 607 with an acknowledgment message 606 and then the wearable device 601 and the display device 605 may establish a connection for communication.
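  • A minimal sketch of the broadcast/acknowledge exchange of FIGURES 6A-B is shown below; the message fields, class names, and the in-memory stand-in for the radio link are illustrative assumptions, since the description does not prescribe a concrete wire format.

```python
# Schematic sketch of the broadcast / acknowledge exchange in FIGURES 6A-B.
# The message fields and the in-memory "radio" used here are assumptions made
# for illustration only.
from dataclasses import dataclass

@dataclass
class Message:
    kind: str          # "BROADCAST", "ACK", "PAIR", ...
    sender: str
    payload: dict

class Device:
    def __init__(self, device_id: str):
        self.device_id = device_id
        self.peers: dict[str, dict] = {}

    def handle(self, msg: Message) -> Message | None:
        raise NotImplementedError

class DisplayDevice(Device):
    def handle(self, msg: Message) -> Message | None:
        if msg.kind in ("BROADCAST", "PAIR"):
            # Respond with the information needed to establish a connection.
            return Message("ACK", self.device_id,
                           {"name": "display", "channel": 36})
        return None

class WearableDevice(Device):
    def on_gesture(self, gesture: str, nearby: list[Device]) -> None:
        if gesture not in ("knock", "wave"):
            return
        request = Message("BROADCAST", self.device_id, {"want": "display"})
        for device in nearby:                 # broadcast within proximity area
            ack = device.handle(request)
            if ack and ack.kind == "ACK":
                self.peers[ack.sender] = ack.payload   # connection established

if __name__ == "__main__":
    wearable, display = WearableDevice("wear-601"), DisplayDevice("disp-602")
    wearable.on_gesture("knock", nearby=[display])
    print(wearable.peers)  # {'disp-602': {'name': 'display', 'channel': 36}}
```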
  • FIGURE 7 illustrates an example gesture-controlled device tethering method.
  • a wearable device may receive, from a motion sensor disposed within the wearable device body 701, a motion indication including a movement pattern of the wearable device, e.g., a motion pattern 702 corresponding to knocking on a display device 703 (e.g., a motion similar to knocking on a door).
  • the display device 703 may include any user interface output device such as a display device, an audio speaker, and/or the like.
  • the wearable device may determine, based at least in part on the motion indication, that the movement pattern indicates a tethering request to tether the wearable device with the display device.
  • the wearable device may instantiate a device query on a communication stack (i.e., multiple layers of software that process a message from transmitter to receiver) within communication range of the wearable device in response to the tethering request.
  • a motion pattern, e.g., "knock-knock," when the user wearing the wearable device double-knocks on a surface, may indicate a tethering request within the communication stack to an output device.
  • the motion pattern "scratching" may indicate a tethering request when a communication request is received at the wearable device.
  • a user wearing the wearable device may be notified of a phone call received at the wearable device, e.g., by a beep, vibration, etc.; the user may raise his/her wrist to "scratch" with the device behind his/her ear, and such a "scratch" motion may trigger a command to pick up the phone call, e.g., for the wearable device to answer the phone call, etc.
  • a variety of motion patterns including, but not limited to waving, scratching, knocking, tapping (fingers), and/or the like, may be contemplated for motion control of the device.
  • a user may define a motion pattern for a designated command via a user interface component, e.g., defining "knock-knock” as a tethering request for nearby display device, “scratching” as answering an incoming call, “waving” as moving the mouse on a tethered display device, and/or the like.
  • the wearable device may send via a wireless transceiver, a connection request to the display device. Thereafter, the wearable device may receive, via the wireless transceiver, a connection approval from the display device in response to the connection request. Furthermore, the wearable device may send via the wireless transceiver data content for display to the display device.
  • a user can view data content on the display device and input data through the display device's touch screen or by activating the display device's sensors, for example, activating an accelerometer by moving the display device.
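  • The motion-pattern-to-command mapping described above (e.g., "knock-knock" to tether, "scratching" to answer a call, "waving" to move the pointer) can be sketched as a small user-configurable dispatcher; the handler functions and pattern names here are illustrative assumptions.

```python
# Minimal sketch of dispatching recognized motion patterns to commands, as in
# the "knock-knock" / "scratching" / "waving" examples above. The handler
# functions and pattern names are illustrative assumptions.
from typing import Callable

def request_tethering() -> str:
    return "query communication stack and send connection request"

def answer_incoming_call() -> str:
    return "answer the incoming call"

def move_pointer() -> str:
    return "move the mouse pointer on the tethered display"

# User-definable mapping from motion pattern to designated command.
COMMANDS: dict[str, Callable[[], str]] = {
    "knock-knock": request_tethering,
    "scratching":  answer_incoming_call,
    "waving":      move_pointer,
}

def on_motion_indication(pattern: str, incoming_call: bool = False) -> str:
    """Interpret a movement pattern reported by the motion sensor."""
    if pattern == "scratching" and not incoming_call:
        return "ignored: 'scratching' only applies to an incoming call"
    handler = COMMANDS.get(pattern)
    return handler() if handler else f"ignored: unknown pattern '{pattern}'"

if __name__ == "__main__":
    print(on_motion_indication("knock-knock"))
    print(on_motion_indication("scratching", incoming_call=True))
    print(on_motion_indication("tapping"))
```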
  • FIGURE 8A illustrates an example of a gesture control to define a control surface and data manipulation.
  • the wearable device may receive, from the motion sensor disposed within the wearable device body 802, a motion indication including a movement pattern 803 performed over an object's surface 801. Thereafter, the wearable device may analyze a direction and length of the movement pattern and can generate a virtual control surface by mapping the dimensions gathered through the movement pattern to the dimensions of a display device.
  • FIGURE 8B illustrates an example of data manipulation employing a gesture controlled surface.
  • the wearable device 806 may determine that the movement pattern (e.g., typing motions, finger tapping motions, finger swiping motions, finger or palm movements in particular relative or absolute directions, etc.) over the selected surface 805 indicates a control command (e.g., typing an address to be viewed on a map website) based on content displayed on the display device 807.
  • the wearable device may execute the determined control command in communication with a wireless multimedia interface apparatus and/or dongle 804 connected to the display device.
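  • The mapping from a traced physical surface to the display's coordinate space, as described above, can be sketched as a simple linear scaling with clamping; the millimeter/pixel units and the function signature are illustrative assumptions.

```python
# Sketch of mapping a traced physical surface onto a display's coordinate
# space, as in the virtual control surface above. The linear mapping and the
# tuple-based API are illustrative assumptions.

def make_surface_mapper(surface_w_mm: float, surface_h_mm: float,
                        display_w_px: int, display_h_px: int):
    """Return a function that converts a finger position on the traced
    surface (in mm) into display coordinates (in pixels)."""
    sx = display_w_px / surface_w_mm
    sy = display_h_px / surface_h_mm

    def to_display(x_mm: float, y_mm: float) -> tuple[int, int]:
        # Clamp to the traced surface before scaling.
        x_mm = min(max(x_mm, 0.0), surface_w_mm)
        y_mm = min(max(y_mm, 0.0), surface_h_mm)
        return round(x_mm * sx), round(y_mm * sy)

    return to_display

if __name__ == "__main__":
    # A 200 mm x 120 mm patch of a desk controls a 1920x1080 display.
    mapper = make_surface_mapper(200, 120, 1920, 1080)
    print(mapper(100, 60))   # (960, 540): center of surface -> center of screen
    print(mapper(250, -10))  # out-of-range input is clamped to the surface edge
```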
  • FIGURE 9 illustrates an example of a gesture tracking module comprised by a wearable device.
  • a wearable device may detect a gesture 902 employing a movement sensor 903, for example, an accelerometer and/or a gyroscope enhanced with a compass.
  • the raw gesture data may be sent to a pointing conversion module running on a processor configured to transform the data into a readable format for the target application 904.
  • an input manager may receive the pointing data 905 and may buffer the pointing data into a GUI Control Application 906 in charge of rendering the manipulation of graphical objects on a coupled display device.
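  • The gesture tracking pipeline of FIGURE 9 (raw sensor data, pointing conversion module, input manager, GUI control application) can be sketched as follows; the class names and the trivial integration step are illustrative assumptions.

```python
# Sketch of the gesture tracking pipeline in FIGURE 9: raw sensor samples are
# converted to pointing data, buffered by an input manager, and consumed by a
# GUI control application. Class names and the simple integration step are
# illustrative assumptions.
from collections import deque

class PointingConversionModule:
    """Turn raw accelerometer deltas into 2-D pointer displacements."""
    def convert(self, raw: dict) -> tuple[float, float]:
        gain = 5.0  # arbitrary sensitivity (assumption)
        return raw["ax"] * gain, raw["ay"] * gain

class InputManager:
    """Buffer pointing data for the GUI control application."""
    def __init__(self):
        self.buffer: deque[tuple[float, float]] = deque()

    def push(self, pointing: tuple[float, float]) -> None:
        self.buffer.append(pointing)

class GUIControlApplication:
    """Render pointer movement on the coupled display device."""
    def __init__(self):
        self.x, self.y = 0.0, 0.0

    def render(self, manager: InputManager) -> None:
        while manager.buffer:
            dx, dy = manager.buffer.popleft()
            self.x += dx
            self.y += dy

if __name__ == "__main__":
    converter, manager, gui = PointingConversionModule(), InputManager(), GUIControlApplication()
    for sample in [{"ax": 0.1, "ay": 0.0}, {"ax": 0.2, "ay": -0.1}]:  # raw gesture data
        manager.push(converter.convert(sample))
    gui.render(manager)
    print(round(gui.x, 2), round(gui.y, 2))  # 1.5 -0.5
```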
  • FIGURE 10A illustrates an example of motion controlled device tethering between a wearable device and multiple displays.
  • a wearable device 1001 may receive, from a motion sensor (not shown) disposed within the wearable device body, a first motion indication including a first movement pattern, e.g., knocking on the device 1002. Thereafter, the wearable device may query an internal database with programmed movement patterns and determine that the first movement pattern indicates a first tethering request.
  • the wearable device 1001 may instantiate a device query on a communication stack within communication range of the wearable device. Thereafter, the wearable device 1001 may receive an indication of a first display device 1003 and a second display device 1006 within the communication stack. Furthermore, the wearable device may send a first connection request to the first display device 1003 and thereafter it may receive a first connection approval from the first display device 1003 in response to the first connection request.
  • the wearable device 1001 may receive, from the motion sensor, a second motion indication including a second movement pattern. Thereafter the wearable device 1001 may determine that the second movement pattern indicates a second tethering request. In addition, the wearable device 1001 may send a second connection request to the second display device 1006 and thereafter it may receive a second connection approval from the second display device in response to the second connection request.
  • FIGURE 10B illustrates an example of data manipulation on a first tethered display 1003 employing a second tethered display 1006 and a wearable device 1001.
  • the wearable device 1001 may send data content for display to a first display device 1003.
  • the wearable device 1001 may send data content for display to a second display device 1006 in communication with a wireless multimedia interface apparatus 1007 connected to the display device.
  • the wearable device 1001 may receive a user input indication from the first display device 1003, and process the user input indication to execute a user command for example typing a text or clicking on a GUI object. Thereafter, the wearable device may generate output data based on the user command and may send the output data for display to the second display device 1006.
  • FIGURE 11 illustrates an example of a physical trackpad utilizing a display device with touch screen.
  • the wearable device 1101 may scan for devices within a predetermined proximity area.
  • the display device with touchscreen 1102 may also broadcast a message to be found by the wearable device.
  • the wearable device 1101 may send a message to connect as input device to the display device with touch screen 1103.
  • the display device with touchscreen 1102 may receive the message and respond to it with an acknowledgement message 1104. Thereafter, the display device with touchscreen 1102 may wait for a touch input performed by the user with the wearable device 1101.
  • the aforementioned process allows the display device 1102 to be used by the wearable device as a peripheral device, for example a trackpad, and/or as a peripheral to view and enter information from and to the wearable device.
  • FIGURE 12 illustrates an example motion controlled device tethering with an appliance (e.g., a thermostat) employing a wearable device.
  • the wearable device 1201 may instantiate a device query on a communication stack within a communication range of the wearable device 1202 comprising a wireless transceiver operably coupled to a processor.
  • the communication stack enabling one or more communication protocols may be established by the wearable device 1202, such as but not limited to Wi-Fi, Bluetooth, Bluetooth Low Energy (LE), Near Field Communications (NFC), iBeacon, and/or the like.
  • the wearable device may receive, via the wireless transceiver, an indication of a home electronics device 1203 (e.g., a security system, a microwave, a refrigerator, a laundry machine, a thermostat, and/or the like) in the communication stack 1207. Thereafter, the wearable device may extract a device identifier from the received indication, query a list of pre-stored device identifiers for a match to the extracted identifier to determine a type of the home electronic device, and configure a control interface based on the type of the home electronic device 1204.
  • the wearable device may send a control command based on the configured control interface, for example lowering the temperature, to the home electronic device 1205 and receive, from the home electronic device, a notification indicative of the operating status of the home electronic device in response to the control command 1206.
  • the home electronics device manufacturers may be provided with a hardware development kit (HDK) to equip the home electronics devices with hardware components to interface with the wearable device control commands.
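  • The appliance-tethering flow of FIGURE 12 (extract a device identifier, match it against pre-stored identifiers to determine the device type, configure a control interface, and send a command) can be sketched as follows; the identifiers, device types, and command sets are illustrative assumptions.

```python
# Sketch of the appliance-tethering flow in FIGURE 12: extract a device
# identifier from a discovered indication, look up its type, and build a
# control interface for it. Identifiers, types, and commands are illustrative
# assumptions.

# Pre-stored device identifiers and their types (assumed format).
KNOWN_DEVICES = {
    "AC:DE:48:00:11:22": "thermostat",
    "AC:DE:48:00:33:44": "refrigerator",
}

# Commands exposed by each control interface type (assumption).
CONTROL_INTERFACES = {
    "thermostat":   {"set_temperature", "get_temperature"},
    "refrigerator": {"get_temperature", "set_eco_mode"},
}

def configure_control_interface(indication: dict) -> set[str] | None:
    """Return the set of commands available for the discovered appliance,
    or None if its identifier is not recognized."""
    device_id = indication.get("device_id")
    device_type = KNOWN_DEVICES.get(device_id)
    if device_type is None:
        return None
    return CONTROL_INTERFACES[device_type]

def send_command(interface: set[str], command: str, value=None) -> str:
    if command not in interface:
        return f"rejected: '{command}' not supported"
    # In a real system the command would go out over the wireless transceiver;
    # here we just report the operating status notification.
    return f"ok: executed {command}({value})"

if __name__ == "__main__":
    iface = configure_control_interface({"device_id": "AC:DE:48:00:11:22"})
    print(send_command(iface, "set_temperature", 19))  # e.g., lowering the temperature
```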
  • FIGURE 13 illustrates an example of a hardware-based authentication process employing a wearable device.
  • a wearable device having a hardware identifier 1307 may send a system access request 1301 to a user service provider 1308 (e.g., an online banking system that requires user credentials to log in, etc.).
  • Such an access request can comprise a set of user credentials.
  • the service provider may extract the credentials (e.g., hardware identifier, IP address, physical address, biometrics data, etc.) from the request and send them to a wearable device management server 1309 to be verified 1302.
  • the wearable device management server 1309 may access a data repository containing pre-stored client credentials 1310 to verify the validity of the received user credentials 1303. Thereafter, the wearable device management server may receive a response to the verification request 1304 from the data repository. The wearable device management server 1309 may send a credentials verification response which may or may not confirm that the received credentials are associated with an existing client of the service provider 1305. In addition, the service provider may analyze the received credentials verification response and may send a corresponding response to the system access request sent by the wearable device 1306.
  • when a user wearing a wearable device 1307 uses the wearable device as a computing device to access a service provider 1308, e.g., an online banking site, the online banking site may detect that the access request originated from the wearable device 1307 and may provide an option of "Login with Your Wearable Device."
  • the service provider 1308 may collect the hardware ID of the wearable device 1307 and additional information (e.g., IP address, physical address, GPS location, etc.) and direct the authentication request 1302 to the server 1309, which may in turn authenticate the user based on the database of the hardware IDs.
  • in some embodiments, additional credentials, e.g., user name and password, etc., may also be requested from the user.
  • the server authentication at 1309 on behalf of the service provider 1308 may employ the user's biometric data which may be collected by the wearable device 1307, such as but not limited to fingerprint, iris/retina scanning, heart signature, blood pressure pattern, body temperature, and/or the like.
  • the hardware authentication may be used in social media targeted ads. For example, social media platforms (e.g., Facebook, etc.) may collect "likes" based on hardware authentication to obtain an accurate count of user interests, avoiding robot-generated spam "likes."
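  • The hardware-based authentication flow of FIGURE 13 can be sketched as follows; the hashing scheme, field names, and in-memory repository are illustrative assumptions, as the description only specifies the message flow between the wearable device, the service provider, and the management server.

```python
# Sketch of the hardware-based authentication flow in FIGURE 13. The hashing
# scheme, field names, and in-memory repository are illustrative assumptions.
import hashlib

# Data repository of pre-stored client credentials (1310), keyed by a hash of
# the wearable's hardware identifier so the raw ID is not stored in the clear.
CLIENT_REPOSITORY = {
    hashlib.sha256(b"HWID-0001").hexdigest(): {"user": "jdoe", "active": True},
}

def management_server_verify(credentials: dict) -> bool:
    """Wearable device management server (1309): verify received credentials."""
    key = hashlib.sha256(credentials["hardware_id"].encode()).hexdigest()
    record = CLIENT_REPOSITORY.get(key)
    return bool(record and record["active"])

def service_provider_handle_access(request: dict) -> str:
    """Service provider (1308): extract credentials and delegate verification."""
    credentials = {
        "hardware_id": request["hardware_id"],
        "ip_address": request.get("ip_address"),
        "biometrics": request.get("biometrics"),   # e.g., heart signature
    }
    if management_server_verify(credentials):
        return "access granted"
    return "access denied"

if __name__ == "__main__":
    # Access request (1301) sent from the wearable device (1307).
    print(service_provider_handle_access(
        {"hardware_id": "HWID-0001", "ip_address": "203.0.113.7"}))
    print(service_provider_handle_access({"hardware_id": "HWID-9999"}))
```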
  • FIGURE 14 illustrates an example of a customized advertising process employing a plurality of wearable devices.
  • ADServer 1414 may receive hardware identifiers from a plurality of wearable devices 1401, 1402 and 1403. Thereafter the ADServer 1414 may send a common interest request comprising a set of hardware identifiers e.g., 1404 to a ClientServer 1415.
  • the ClientServer 1415 may send a request for the user interest profiles of the users who are associated with the received hardware identifiers 1405 to a repository containing clients' data 1416.
  • the ClientServer may receive a request response containing the users' interest profiles of the indicated users 1406. Thereafter, the ClientServer 1415 may analyze the received profiles to determine a common interest between the indicated users.
  • the ClientServer 1415 may send a response 1408 to the received common interest request sent by the ADServer. Thereafter, the ADServer may process the common interest received in the response and determine an advertisement conceptually related to the received common interest 1409. Additionally, the ADServer may send the determined advertisement to be displayed on an automated billboard display 1417.
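  • The common-interest step of FIGURE 14 can be sketched as a simple set intersection over the retrieved interest profiles; the profile contents and the advertisement catalog are illustrative assumptions.

```python
# Sketch of the common-interest step in FIGURE 14: the ClientServer resolves
# hardware identifiers to interest profiles and intersects them; the ADServer
# then picks an advertisement for the shared interest. Profiles and the ad
# catalog are illustrative assumptions.

INTEREST_PROFILES = {           # repository of clients' data (1416)
    "HW-1401": {"cycling", "coffee", "travel"},
    "HW-1402": {"coffee", "photography"},
    "HW-1403": {"coffee", "travel", "running"},
}

AD_CATALOG = {                  # ADServer's mapping from interest to creative
    "coffee": "Espresso bar two blocks ahead - 2-for-1 until noon",
    "travel": "Weekend city-break deals from your local airport",
}

def common_interest(hardware_ids: list[str]) -> set[str]:
    """Intersect the interest profiles of the indicated users."""
    profiles = [INTEREST_PROFILES.get(hw, set()) for hw in hardware_ids]
    return set.intersection(*profiles) if profiles else set()

def choose_advertisement(hardware_ids: list[str]) -> str | None:
    shared = common_interest(hardware_ids)
    for interest in shared:
        if interest in AD_CATALOG:
            return AD_CATALOG[interest]     # sent to the billboard display (1417)
    return None

if __name__ == "__main__":
    print(common_interest(["HW-1401", "HW-1402", "HW-1403"]))  # {'coffee'}
    print(choose_advertisement(["HW-1401", "HW-1402", "HW-1403"]))
```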
  • FIGURE 15 illustrates an example of wirelessly charging a wearable device through magnetic resonance.
  • a user wearing a wearable device 1507 may perform a motion pattern 1508, which may trigger a motion indication 1501 detected by a motion sensor (not shown) embedded in the wearable device 1507.
  • the wearable device 1507 may analyze the motion pattern 1502 and determine that the motion pattern 1502 matches a pre-programmed motion pattern to command a connection request 1503 to a close-range device, e.g., display device 1506.
  • the display device 1506 may approve a connection request by sending a connection approval message 1504 to the wearable device 1507.
  • the display device 1506 may wirelessly transfer power 1509 to the wearable device 1507 while the wearable device transfers data content 1505 to be displayed on the display device 1506.
  • the wearable device 1507 may comprise a magnetic resonator which may receive a flow of power from a magnetic near field induced by a source resonator embedded in the display device 1506 charging the wearable device's battery by magnetic resonant power transfer.
  • the wearable device 1507 may be charged wirelessly from the display device by inductive power transfer and/or the like wireless power transfer technologies. In this way, the wearable device 1507 may take advantage of the display device 1506, which can be equipped with a larger battery, to charge its own battery.
  • FIGURES 16A and 16B illustrate an example of wirelessly exchanging social profiles through wearable devices.
  • a user wearing a wearable device 1606 bumps the wearable device 1606 into another wearable device 1607 worn by another user.
  • a motion sensor embedded on the wearable device 1606 may detect a motion indication 1601 triggered by the bump.
  • the wearable device 1606 may analyze the motion pattern, i.e., bump 1602, and determine that the motion pattern matches a pre-programmed motion pattern to command an exchange of social profiles with the close- range device 1607.
  • the wearable device 1606 may instantiate a device query within communication range 1603 to detect and establish a communication link with the wearable device 1607.
  • the wearable device 1606 may start a device recognition (DR) process 1604 to determine if the wearable device 1606 has been previously exposed to the wearable device 1607 and whether it has already recorded identification information regarding the wearable device 1607 for automatic recognition. Further details with respect to device recognition (DR) 1604 are described herein and particularly with respect to an example DR component shown in Fig. 24.
  • the wearable device 1606 may send a social profile (SP) exchange request 1605 to the wearable device 1607.
  • the SPexchange request 1605 may include a user name, user contact information, user social media information (e.g., see Fig. 18B), allowing the user of wearable device 1607 to send a social media connection request directly (e.g., a Friend request on Facebook, etc.).
  • the wearable device 1607 may receive a confirmation input accepting the request to exchange SPs 1608.
  • the wearable device 1607 may send SP information 1609 to the wearable device 1606.
  • the wearable device 1606 may add the SP to an internal contacts database or may add or accept the social profile to a list of contacts in a social network.
  • the social profile 1609 may only include a wearable device identifier, and further information regarding the wearer of the wearable device may be viewable only after the wearer has authorized the exchange of information.
  • the wearable device 1606 may receive an indication of the device identifier of device 1607, and then it may generate a message for display on a tethered screen (e.g., 1102 in Fig. 11), e.g., "do you want to connect with 1607?"
  • the wearable device 1606 may also send a SP 1611 to the wearable device 1607.
  • the SP 6 1 may be sent to the wearable device 1607 only after the wearable device 1606 has accepted or added the social profile (SP) 1609 to a local contact list or repository or alternatively has accepted the social profile (SP) 1609 into a social network contact or friend list.
  • at this point, the exchange of social profiles may have been successfully executed, wherein each wearable device may have stored the social information of the other wearable device.
  • FIGURES 17A and 17B illustrate an example user interface (UI) of privacy/security configuration of motion controlled social connection.
  • a user may configure privacy/security settings of the motion controlled social connection, and/or social profile exchange, via a touch screen UI tethered with the wearable device (e.g., see 1102 in Fig. 11).
  • a user may configure the wearable device to automatically share his or her social profile and send a connection request with another wearable device without having to manually confirm the connection request, e.g., by enabling the "auto- bump" option at 1701.
  • the user may elect to enable "notification” 1706, so that the user may receive a notification message when the wearable device detects a motion controlled social profile sharing request, and/or sends out a connection request.
  • Such notification may request a user action to continue, e.g., "You are about to connect with John Doe. Continue?" Or the notification is provided without requiring a user action when the "auto-bump" 1701 is enabled, e.g., "You have just sent out a Friend request to John Doe.”
  • a user may configure social media information through the "bump" page, e.g., Figure 17A.
  • the wearable device may access the social media platform via an API so that a user may directly send social media requests (e.g., a Facebook friend request, etc.) within the user's social profile page (e.g., see FIG. 18B).
  • a user may configure the information he/she wants to receive and/or transmit from a social network profile after bumping his/her wearable device with another user's device, e.g., 1702.
  • a user may want to provide only an invitation to join a Facebook group instead of a Facebook friend request; similarly, some users may prefer to send an invitation to follow their LinkedIn company profiles instead of requesting a connection to their LinkedIn personal profiles.
  • the available social networks may include but may not be limited to Facebook, Twitter, LinkedIn, Instagram, Tumblr, YouTube, and/or the like platforms.
  • a user may configure a motion controlled social connection to exchange profiles and/or other information specified in the profile section 1712, which may be transmitted via an email account, a Short Message Service (SMS) message, or through a personal call or an automated call that a user may receive notifying social profile information after an "auto-bump" event.
  • a user may configure the "auto-bump" option through a UI where he/she may specify in a list 1755 which of the multiple sources and mediums should be disabled for the exchange of profiles after an "auto-bump" event, through the configuration control 1703. Furthermore, a user may specify a list to enable specific mediums or social network platforms for the exchange of profile information after an "auto-bump" call through the configuration control 1704. Additionally, whenever a social network or a type of communication is not specified on the disable list 1703 or the enable list 1704, it may be requested to be added for the exchange of profile information by any of the users exchanging profile information, even after the "auto-bump" event has been executed.
  • a user may specify the number of bumps required to execute the automated exchange of social profile information through an "auto-bump" call through the configuration control 1705.
  • a user may specify a timespan indicating a date since when the wearable device will gather data for executing an "auto-bump" call through the configuration control 1707.
  • in the illustrated example, the user has configured the wearable device such that if the user has "bumped" (e.g., detected via the wearable device) another user more than three times 1708 this year 1709, the user's wearable device may automatically send a LinkedIn contact request (e.g., 1710) to the other user; however, the wearable device will not send or receive invitations to connect via Facebook, SMS, or phone calls, as these have been disabled for automatic connection 1711.
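  • The "auto-bump" decision described above (send a connection request only after a configured number of bumps within a configured timespan, and only on enabled channels) can be sketched as follows; the settings structure and field names are illustrative assumptions.

```python
# Sketch of the "auto-bump" decision described above: if another user has been
# bumped more than a configured number of times within the configured timespan,
# send connection requests on the enabled channels only. The settings structure
# is an illustrative assumption.
from datetime import date

SETTINGS = {
    "auto_bump": True,
    "min_bumps": 3,                                     # number of bumps required (1708)
    "since": date(date.today().year, 1, 1),             # "this year" (1709)
    "enabled_channels": {"linkedin"},                   # (1704 / 1710)
    "disabled_channels": {"facebook", "sms", "phone"},  # (1703 / 1711)
}

def channels_to_request(bump_log: list[date], settings: dict = SETTINGS) -> set[str]:
    """Return the channels on which a connection request should be auto-sent."""
    if not settings["auto_bump"]:
        return set()
    recent_bumps = [d for d in bump_log if d >= settings["since"]]
    if len(recent_bumps) <= settings["min_bumps"]:
        return set()   # "more than three times" has not been reached yet
    return settings["enabled_channels"] - settings["disabled_channels"]

if __name__ == "__main__":
    log = [date.today()] * 4             # bumped the same user four times this year
    print(channels_to_request(log))      # {'linkedin'}
    print(channels_to_request(log[:2]))  # set(): not enough bumps yet
```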
  • FIGURES 18A and 18B illustrate an example of logged information and further alternative actions after attending an enhanced event with wearable devices.
  • a user may be able to view through a graphical user interface information of an event he/she has attended e.g., 1802.
  • the user may be able to view information from the attended event including but not limited to the event's date 1803, event location 1804, a record of people with whom the user may have interacted during the event 1805, and the number of times the user interacted with another attendee 1806, e.g., the user has "bumped" into "John Smith" four times 1810. A user may view further information about an attendee 1801 with whom he/she has interacted, e.g., 1806, and such information may be used to branch out to specific attendees in a variety of ways including but not limited to social network relation requests 1807 and/or via other communication means, e.g., email, telephone or SMS 1808. Additionally, a user may opt to save the received profile in the wearable device's contact list by executing the "Confirm" command 1809.
  • FIGURES 19A and 19B illustrate an example of a configuration for programming and customizing motion patterns.
  • a user may configure a wearable device to program and customize a motion pattern, via a touch screen UI tethered with the wearable device (e.g., see 1102 in Fig. 11).
  • a user may program and customize a new motion pattern and/or override a preset motion pattern.
  • a user may program and customize a motion pattern for professional networking at conferences 1901 and similar events.
  • the user may program other motion patterns, for example, a motion pattern to count the number of strokes a swimmer performs, calculate the distance per stroke swum by a swimmer, or vibrate and/or emit a sound when a swimmer's stroke does not correspond to an appropriate form 1902.
  • Another illustrative example may be a user that has programmed a bowling motion pattern to provide haptic, textual, and/or sound feedback during and/or after delivering or rolling a bowling ball.
  • Feedback may be associated with metrics including but not limited to acceleration, speed, and movement economy, as well as visual feedback including but not limited to recorded movement patterns displayed in a three-dimensional space and/or the animation of recorded movements through a display device.
  • a user may provide a name to identify a motion pattern 1903 corresponding to a preset motion pattern e.g., a bump 1904 or a motion pattern previously recorded by the user.
  • a user may also configure the number of repetitions of the specified motion pattern that will have to be performed before an action is executed 1905.
  • the actions that may be executed after a motion pattern has been detected by the wearable device 1906 may be specified, for example, exchanging social profile information, starting an audio recording 1908, starting movement recordings, and the like actions.
  • a user may want to be notified after the action or actions have been completed 1907.
  • FIGURE 20 illustrates an example of an enhanced conference employing wearable devices.
  • a user wearing a wearable device 2003 may check in to a conference 2004 associated with a local area network 2001.
  • a check-in request 2004 may comprise credentials authenticating the user of the wearable device 2003.
  • a second user may also send a check-in request 2005 through a wearable device 2002.
  • the wearable device 2002 may instantiate a device query within communication range 2006 to identify other participants of the conference.
  • the wearable device may identify a close-range device 2003 and subsequently may send social profile information 2007 and may receive social profile information 2008 from the close-range device 2003.
  • FIGURE 21 illustrates an example of parallel output processes performed by a wearable device 2102.
  • a wearable device 2102 may receive an SMS message 2105 from a mobile phone tower 2101. Thereafter, the wearable device may start, or alternatively continue, a video streaming process 2106 with a first display device 2103 while sending the SMS message 2107 to be displayed on a second display device, e.g., 2104.
  • FIGURE 22 illustrates an example of a gesture indicating a first time tethering to a device.
  • the wearable device 2202 and the display device 2201 comprise low power consumption wireless communication mechanisms 2203, for example, Bluetooth Low Energy (Bluetooth LE).
  • Bluetooth LE provides a lightweight link layer capable of providing ultra-low power idle mode operation, simple device discovery and reliable point-to- multipoint data transfer with advanced power-save and secure encrypted connections.
  • the device 2201 may remain in sleep mode most of the time and may only wake up when it receives a connection request through the Bluetooth LE mechanism 2203, thereby reducing the power consumption. While various embodiments of the present invention are described using Bluetooth LE, the use of this communication mechanism is not intended to limit the present invention. On the contrary, various embodiments of the present invention may be implemented using other wireless mediums as well or instead.
  • a user with a wearable device 2202 performs a motion pattern over a display device 2201.
  • the motion pattern may be for example knocking on the display device more than one time, moving the display device horizontally from left to right, or vertically up and down, and/or the like motion patterns.
  • the wearable device 2202 may recognize that a motion indication has been performed, e.g., 2211, and subsequently may analyze the motion pattern 2212 and determine that the motion pattern matches a pre-programmed motion pattern to command a wireless tethering request, e.g., 2204.
  • the display device 2201 may send an acknowledgement 2205 via Bluetooth LE comprising a device name to the wearable device 2202.
  • the acknowledgement 2205 may only be sent after the display device has received a tethering request e.g., 2204.
  • the wearable device 2202 may send a command to display a confirmation screen 2206 to the display device 2201.
  • the display device 2201 may display the confirmation screen 2206 to be viewed by the user of the wearable device 2202.
  • the user with the wearable device 2202 may enter a personal identification number (PIN) 2207 into the device 2201, which may be displaying the confirmation screen 2206. Then, the PIN and a device identification number 2208 may be sent to the wearable device 2202. Additionally, the wearable device 2202 may store the device identification number 2209, for example a media access control (MAC) address, in a local repository for future automatic recognition and/or tethering.
  • every time the display device 2201 is woken up, for example, by pressing the button 2210, it may remain wirelessly tethered via Bluetooth LE 2203 to the wearable device 2202.
  • a user may press and hold the button 2210 for a few seconds to untether the wearable device 2202 from the display device 2201 such that the display device 2201 can be tethered with another device.
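A hedged sketch of the first-time tethering sequence described above follows; the pattern names, the FakeRadio helper, and its methods are hypothetical stand-ins for the Bluetooth LE exchange (2204-2209), used only to show the ordering of steps.

    # Minimal sketch of the first-time tethering flow of Figure 22. Only the
    # ordering of steps (2211, 2212, 2204-2209) follows the description above;
    # everything else is an illustrative assumption.
    KNOWN_PATTERNS = {"knock-knock": "TETHER_REQUEST"}

    class WearableTethering:
        def __init__(self):
            self.known_devices = {}              # local repository of device IDs (MAC addresses)

        def on_motion(self, pattern, radio):
            command = KNOWN_PATTERNS.get(pattern)        # analyze the motion pattern (2212)
            if command != "TETHER_REQUEST":
                return None
            ack = radio.send_tether_request()            # 2204 -> acknowledgement 2205 with device name
            radio.show_confirmation_screen(ack["name"])  # 2206
            pin, device_id = radio.wait_for_pin()        # 2207 / 2208
            self.known_devices[device_id] = {"name": ack["name"], "pin_ok": bool(pin)}
            return device_id                             # stored for future auto-tethering (2209)

    class FakeRadio:
        """Hypothetical Bluetooth LE helper used only to make the sketch runnable."""
        def send_tether_request(self):
            return {"name": "PocketScreen"}
        def show_confirmation_screen(self, name):
            print(f"Confirm tethering to {name}?")
        def wait_for_pin(self):
            return "1234", "AA:BB:CC:DD:EE:FF"

    if __name__ == "__main__":
        print(WearableTethering().on_motion("knock-knock", FakeRadio()))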
  • FIGURES 23A and 23B illustrate an example of user interfaces for first-time tethering to a device and for communication mechanism settings.
  • a confirmation screen to tether a device via Bluetooth LE may be displayed to a user, e.g., as shown in Figure 23A.
  • the confirmation screen may display the tethering mechanism that may be employed for the instant connection 2307.
  • the confirmation screen may display the name of the device to which the wearable device may be attempting to tether, e.g., 2301.
  • the confirmation screen may comprise a text field 2302 to be employed by the user to enter a personal identification number as a way to confirm a tethering action.
  • a user of a wearable device may access a screen to change the settings of a communication mechanism and to view the devices that are tethered through a communication mechanism as shown in Figure 23B.
  • a user may enable or disable a communication mechanism through a toggle control, for example, the toggle control 2303 enables or disables a Bluetooth LE communication mechanism.
  • a user may enable or disable an auto-connect mode employing another toggle control 2306. When the auto-connect setting is enabled, the wearable device may automatically connect to known devices.
  • a user of a wearable device may view the devices to which the wearable device is tethered; for example, a wearable device may be wirelessly tethered via Bluetooth LE to a display device 2304 and simultaneously tethered to a television 2305.
  • FIGURE 24 illustrates an example logic flow: device recognition (DR) component.
  • a device recognition component comprised by a receiver device 2402 may execute instructions to retrieve a device identification number 2404 from a sender device, e.g., 2401. Thereafter, the receiver device 2402 may run a query 2405 on a receiver device repository 2403 to verify whether the received device identification has been previously recorded by the receiver device 2402. In addition, the receiver device 2402 may analyze the query results 2406. If the device identification has been previously recorded 2406, then an interaction counter corresponding to the number of times the receiver device has received the device identification may be increased by one unit, e.g., 2407, and thereafter the device recognition component may stop.
  • otherwise, the receiver device may evaluate whether a subsequent exchange of social profiles has been programmed to be executed, e.g., 2408. If an exchange of social profiles has been programmed to be executed, e.g., 2408, then an interaction counter is created and initialized to 1, e.g., 2409; thereafter the receiver device 2402 may run a query on the receiver device repository 2403 to insert a record comprising the received device identification number, e.g., 2410, and subsequently the device recognition component may stop. Alternatively, when an exchange of social profiles has not been programmed, the device recognition component may stop and no further actions may be performed.
  • both the interaction counter 2409 may be created and initialized and the query 2410 may run regardless of whether or not a subsequent exchange of social profiles has been programmed to be executed.
  • the interaction counter may count the number of encounters with a sender device regardless of whether or not an exchange of profiles has been previously executed.
  • the device recognition component may count the interactions with a sender device and, after the number of interactions with a particular device has exceeded a predetermined threshold, may suggest to the user of the receiver device to exchange social profiles based on the number of interactions that have previously occurred with respect to the sender device 2401.
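The device recognition logic above can be summarized in a short sketch, assuming an in-memory dictionary as the receiver device repository and an illustrative suggestion threshold; the threshold value is not specified in this disclosure.

    # Minimal sketch of the device recognition (DR) logic of Figure 24, using a
    # dict as a stand-in repository. The suggestion threshold is an assumption.
    class DeviceRecognition:
        def __init__(self, suggest_threshold=3):
            self.repository = {}                    # device_id -> interaction counter
            self.suggest_threshold = suggest_threshold

        def on_device_identification(self, device_id, exchange_programmed=False):
            if device_id in self.repository:
                self.repository[device_id] += 1     # 2407: increment the interaction counter
            elif exchange_programmed:
                self.repository[device_id] = 1      # 2409 / 2410: create counter and record the ID
            else:
                return None                         # no exchange programmed: stop
            if self.repository[device_id] >= self.suggest_threshold:
                return f"Suggest exchanging social profiles with {device_id}"
            return None

    if __name__ == "__main__":
        dr = DeviceRecognition()
        for _ in range(4):
            message = dr.on_device_identification("sender-2401", exchange_programmed=True)
        print(message)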
  • FIGURES 25A and 25B illustrate an example of magnetic charging connectors and logic flow for charging mode recognition.
  • a wearable device 2501 and a display device 2502 may comprise electrical connectors 2503 and 2504 respectively that can be attached together by magnetic force. Both devices 2501 and 2502 may comprise additional electrical connectors to connect to external DC and/or AC power supplies.
  • a charging mode recognition software component comprised by a wearable device may determine which device or devices may be powered or charged at a given time when a wearable device 2501 is electrically and magnetically attached to a display device 2502. When one of the attached devices emits a charging indication or request 2505, the charging mode recognition component may determine whether the wearable device is connected to a power outlet or any other external power source 2506. When the wearable device is not connected to a power outlet or any other external power source, then the wearable device 2501 may draw power from the power source comprised by the display device 2502, e.g., software module 2508.
  • alternatively, when the wearable device 2501 is connected to a power outlet or other external power source, the wearable device 2501 may charge the power source comprised by the display device 2502, e.g., software module 2507. Furthermore, the charging mode recognition component may notify the user of the wearable device 2501 of the current charging mode and of each device's charging status, e.g., 2509, as sketched below.
  • the user can also make use of a headset's 2510 electric wire, which can be configured to serve as a charging cord to connect the wearable device to a wall outlet.
  • Several charging configurations can be utilized, including but not limited to: outlet to headset; outlet to headset, headset to wearable device; outlet to headset, headset to display device, display device to wearable device; outlet to headset, headset to first display device, first display device to second display device, second display device to wearable device; display device to wearable device; and the like configurations.
  • the aforementioned charging modes provide several advantages to a user; for example, users do not need to carry charging cables, more than one device can be charged using only one power outlet, and devices can be charged from one another without the need for a power outlet.
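A minimal sketch of the charging mode decision referenced above: it assumes a boolean indicating whether the wearable device is on external power and returns the resulting charging direction; the function names and percentage reporting are illustrative.

    # Illustrative sketch of the charging mode recognition decision of Figure 25B.
    # If the wearable is on external power it charges the attached display;
    # otherwise it draws power from the display. Names are assumptions.
    def resolve_charging_mode(wearable_on_external_power: bool) -> str:
        if wearable_on_external_power:
            return "wearable -> display"    # 2507: wearable charges the display's power source
        return "display -> wearable"        # 2508: wearable draws charge from the display

    def notify_user(mode: str, wearable_level: int, display_level: int) -> str:
        # 2509: report the current charging mode and both charge statuses
        return f"Charging mode: {mode} (wearable {wearable_level}%, display {display_level}%)"

    if __name__ == "__main__":
        print(notify_user(resolve_charging_mode(False), wearable_level=20, display_level=80))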
  • Another exemplary embodiment includes an integrated portal platform instantiated on a user mobile device (e.g., a Smartphone, a tablet computer, a laptop computer, etc.).
  • the integrated portal platform may allow a user to access various client components on the user mobile device such as email, gaming applications, calendar applications, browser, social media portals, etc., through one portal platform.
  • the user could access all desired portal apps (e.g., email, calendar, social media, gaming, etc.) instantly through one portal platform, instead of having to search, download and install a number of separate mobile apps on his/her mobile device (e.g., an email app, a Facebook app, a Google calendar app, etc.).
  • a user may instantiate the integrated portal platform on his/her Smartphone to launch a client component, e.g., Facebook.
  • the user may enter "Facebook" to initiate a search within the portal platform, which may return a list of search results.
  • the list of search results may provide a portal app connection to the Facebook portal app, instead of a URL link or a link to a downloadable mobile app; upon choosing the portal app connection to "Facebook," the user may launch a Facebook portal app within the portal platform directly, without downloading and/or installing a separate mobile app on his/her Smartphone.
  • the user may view an icon "Facebook" within the integrated portal platform UI (or a desktop icon of "Facebook") and may directly launch the "Facebook" portal component.
  • the portal platform may facilitate shifting the bulk of data processing and computation to a server, a cloud or a remote system, e.g., the user mobile device may have reduced requirements for data processing capacity as the user mobile device may not need to download and install separate mobile applications.
  • the user may access all portal components via the portal platform.
  • a cached copy of the portal component may be saved on the user mobile device so that the user may access the cached copy of the portal app while offline, e.g., a user can still access his/her "Facebook" portal component to read his/her friend list while offline.
  • the portal platform allows a user to access the latest version of the portal components when there is an update (e.g., an updated "Facebook" component) without the user downloading or installing any updates, as the component update has been performed at the server/cloud level.
  • FIGURE 26 illustrates an example of a watch face app available for the wrist- shaped device.
  • the wearable device's display can be implemented as a 9:16 vertical aspect ratio color display, with a curved surface.
  • the screen can have for example an overall size of
  • the display can include a touch panel supporting multi-touch triggers capable of handling manipulation with up to 3 fingers at a time and wet use support; that is, the display functionality is not constrained when a user manipulates the screen with wet fingers.
  • the wearable device can use sensors to deduce when the display should be on, for example, on a "placed on wrist" event or a "wrist brought up to check time" event. Screen off can occur upon timeout (if the screen is not touched for x time, then turn off). In such a case, the wearable device can function without an ON/OFF button; this screen policy is sketched below.
  • a call to display a home screen can be achieved by gestures, avoiding the implementation of a physical home screen button.
  • a home screen function can be called by a three finger pinch gesture, for example, a simultaneous pinch with the thumb, index and middle fingers.
  • a watch face can be utilized as a home screen.
  • a watch face can be defined through a set of apps (a user can select which "Face app" to use) that are designed to show time and date (e.g., 2602 and 2604) or any combination of status content the user may want to see, e.g., a compass with watch 2605, chronograph, athlete statistics and time 2603, weather centric 2601, and similar "Face apps".
  • Face apps can combine the most interesting and most highly used watch configurations for a determined user.
  • the display on the wearable device works as a canvas to redefine a watch face and deliver a targeted solution to the consumer.
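The sensor-driven screen policy referenced earlier in this list can be sketched as follows, under assumed event names ("placed_on_wrist", "wrist_raised", "touch") and an assumed idle timeout; this is an illustration, not the device firmware.

    # A sketch of a display controller that turns the screen on for wrist-related
    # sensor events and off after an idle timeout, with no physical ON/OFF button.
    # Event names and the timeout value are assumptions.
    import time

    class DisplayController:
        def __init__(self, timeout_s=5.0):
            self.timeout_s = timeout_s
            self.screen_on = False
            self.last_touch = 0.0

        def on_sensor_event(self, event, now=None):
            now = time.monotonic() if now is None else now
            if event in ("placed_on_wrist", "wrist_raised", "touch"):
                self.screen_on = True
                self.last_touch = now

        def tick(self, now=None):
            now = time.monotonic() if now is None else now
            if self.screen_on and (now - self.last_touch) > self.timeout_s:
                self.screen_on = False      # timeout expired: turn the screen off

    if __name__ == "__main__":
        d = DisplayController(timeout_s=5.0)
        d.on_sensor_event("wrist_raised", now=0.0)
        d.tick(now=6.0)
        print(d.screen_on)   # False: the screen turned off after the idle timeout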
  • FIGURE 27 illustrates an example of a use case for switching between apps that are most frequently used by a user.
  • the screen can be configured to be a minimal display, showing only the home screen app, for example, a "watch face" 2701; but when a user touches and holds the central focus point, the "watch face" can vanish and a menu with four menu options can be displayed 2702. While holding, the user can slide a finger towards a desired option represented by an icon 2703, and when the finger is released from the screen, the option will be activated.
  • users can memorize the positions of, for example, four key apps and flick their finger in the direction of the desired app before any icons are displayed, creating a gesture (e.g., long touch, swipe to top right).
  • a possible configuration for the interactive menu can be to display the four most frequently used apps when a user touches and holds a finger on the screen; however, other types of configurations can be programmed, for example, displaying a fixed set of apps regardless of how frequently they are used, displaying apps with a notification, or displaying the latest used apps. The direction-based selection is sketched below.
  • the interactive menu can similarly be used on other devices related to the wearable device such as a pocket screen, a tablet, a virtual control surface and the like interfaces as long as they have an enabled touchscreen.
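A sketch of the direction-based quick-menu selection described above, assuming four illustrative app slots and a simple angle-to-quadrant rule; neither the app assignments nor the quadrant boundaries are prescribed by this disclosure.

    # Illustrative sketch of the touch-and-hold quick menu of Figure 27: a long
    # touch exposes four options and the release direction selects one.
    import math

    QUICK_APPS = {"up": "Watch face", "right": "Messages", "down": "Music", "left": "Maps"}

    def direction_from_drag(dx: float, dy: float) -> str:
        angle = math.degrees(math.atan2(-dy, dx)) % 360   # screen y grows downward
        if 45 <= angle < 135:
            return "up"
        if 135 <= angle < 225:
            return "left"
        if 225 <= angle < 315:
            return "down"
        return "right"

    def on_long_touch_release(dx: float, dy: float) -> str:
        return QUICK_APPS[direction_from_drag(dx, dy)]    # launch the selected app

    if __name__ == "__main__":
        print(on_long_touch_release(dx=30.0, dy=-40.0))   # mostly upward flick -> "Watch face"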
  • FIGURE 28 illustrates an example of a use case for switching between apps featuring an apps grid. From the home screen 2801, a user can swipe a finger to one side of the wearable device's display 2802, to access an app grid displaying multiple app grid screens or icons 2803.
  • FIGURE 29 illustrates an example of screens to display a notification summary area.
  • the applications installed in the wearable device can generate notifications triggered by events relevant to a user.
  • the notifications can be configured to cause a "vibe notification," which can be perceived by a user as a subtle vibration generated within the wearable device. If the user looks at the screen, the notification can be viewed, and the user can tap the screen to accept an action or to view more information regarding the notification.
  • a dismissal of a notification can be configured as a hand gesture indicating that the user is not looking at the screen anymore.
  • Face apps, for example, a "watch face" 2901, support notification summary features 2902, which can appear as a preview in a toggle panel 2903. A user can drag down the toggle panel to read further details of the shown notifications.
  • FIGURE 30 illustrates an example use case to turn on and use the wearable device contact dash feature.
  • the wearable device can have an area of pressure or pressure strip 3003 responsive to more than one touch sensor on one or more sides of the device; such an area can include a sensitive input device corresponding in size to the length of the screen 3004.
  • the pressure area or strip can be a dedicated contact dash button, allowing the user to scroll and select a contact to interact with.
  • a user can touch the side surface of the wearable device to bring up the contact dash 3001.
  • the wearable device can transition out or vanish any current app to bring up the contact dash app.
  • a user can scroll through contacts, magnify contacts, and select a contact to interact with by pressing harder on the pressure strip, as sketched below.
  • the contact dash feature may be extended to other items than contacts, such as notes, calendar events, etc., that can be selected and engaged with via the pressure area or strip.
  • a first plurality of selectable items can be displayed in response to a user touch sliding in one direction along the pressure area or strip, and a second plurality of selectable items in response to a user touch sliding in the opposite direction.
  • the communication options can be configured to fit multiple usage models, for example a walkie-talkie voice option, a draw/write option, conventional text messages, email, telephone conversation and other available communication methods.
  • a user can write a message with a finger on the display screen 3002; such a message will appear on the chosen contact's screen as it is displayed on the sender's screen, i.e., conveying the sender's handwriting style.
  • the contact dash feature whether implemented to select contacts or other items such as notes, calendar events, etc., can be applied to devices other than wearable devices, such as tablet computers, laptops, smartphones, etc.
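The contact dash interaction can be sketched as below, assuming a normalized strip position and pressure value from the touch sensors; the selection threshold is an illustrative assumption.

    # Minimal sketch of the contact dash of Figure 30: sliding along the pressure
    # strip scrolls the list, and pressing harder than a threshold selects the
    # focused item. The threshold value is an assumption.
    class ContactDash:
        SELECT_PRESSURE = 0.8     # assumed normalized force threshold for "press harder"

        def __init__(self, items):
            self.items = items
            self.index = 0

        def on_strip_input(self, position: float, pressure: float):
            # Map the touch position (0.0 .. 1.0 along the strip) to a list index.
            self.index = min(int(position * len(self.items)), len(self.items) - 1)
            if pressure >= self.SELECT_PRESSURE:
                return self.items[self.index]      # hard press selects the focused item
            return None                            # light touch only scrolls/magnifies

    if __name__ == "__main__":
        dash = ContactDash(["Ana", "Ben", "Carla", "Dmitri"])
        print(dash.on_strip_input(position=0.6, pressure=0.3))   # None: scrolling only
        print(dash.on_strip_input(position=0.6, pressure=0.9))   # "Carla": selected

The same sketch applies when the dash is used for other item types, such as notes or calendar events, as noted above.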
  • FIGURE 31 illustrates an example of a flexible wearable device with a rubber inner cover and glass exterior.
  • the exterior of the wearable device can be protected by a bendable glass enabling a high degree of flexibility of the device's body.
  • the interior of the wearable device can be protected with a hypoallergenic rubber providing traction to the wearable device such that the device can be held on the wrist in a fixed or semi-fixed position.
  • the wearable device is usable on both the right and the left hand. A user can configure which hand they will wear it on.
  • the user interface adjusts accordingly (flips) to keep the pressure strip on the correct side (the side facing towards the hand).
  • the device can feature a clean design free of markings on the top surface, except for any information displayed through the user interface, in which case the markings flip with the user interface to match the direction in which the user wears the device.
  • the wearable device can be equipped with a multicore processor, for example, a quad-core processor; an internal memory, for example, 64GB or 128GB storage options; a 2.4" flexible capacitive touchscreen to implement the user interface with a resolution of, for example, 854x480 pixels; Wi-Fi and Bluetooth communication devices; a global positioning system (GPS); several sensors including but not limited to proximity sensors, an accelerometer, a gyroscope, and a compass; and a power source, for example, a 1000 mAh battery.
  • Other features include support for multiple communication standards, for example, long term evolution (LTE), global system for mobile communications (GSM), and 3G.
  • FIGURE 32 illustrates an example of a pocket size capacitive touchscreen display device enhanced with camera and sensors.
  • the pocket size capacitive touchscreen 3201 can include a 4.97" capacitive touchscreen, a 1280x720 pixel display resolution, an 8 megapixel rear-facing camera, a rear LED flash as shown in 3202, a 1.3 megapixel front-facing camera, an accelerometer, a gyroscope, a microphone, a loudspeaker, an in-ear speaker, a proximity sensor, and a 2,800 mAh battery.
  • FIGURE 33 illustrates an example of a tablet enhanced with a capacitive touchscreen, camera and sensors.
  • the tablet 3300 can include a 10" capacitive touchscreen, a 1920x1080 pixel resolution display, a 5 megapixel rear-facing camera, a 1.3 megapixel front-facing camera, an accelerometer, a gyroscope, a microphone, a speaker, and a power source, for example, a 7,000 mAh battery.
  • FIGURE 34 illustrates an example of a keyboard with a hinged metal latch to plug in a tablet.
  • the keyboard can be coupled to a tablet, pocket size display, television and other computer-based devices via Bluetooth.
  • the keyboard, e.g., 3401 and 3402, can feature a hinged metal latch to, for example, plug in a tablet device 3403.
  • FIGURE 35 illustrates an example of Bluetooth enabled headset as a wearable necklace.
  • the headset 3501 can be coupled wirelessly via Bluetooth technology to a wearable device or other computer-based device as presented herein, for example, a pocket size touchscreen, a tablet, and the like devices.
  • the headset can include a microphone, a selection button, and a pair of earbuds.
  • the headset can be worn as a necklace by snapping together the magnetic sides of the earbuds, forming a necklace pendant.
  • FIGURE 36 illustrates an example of a high definition multimedia interface (HDMI) dongle.
  • the dongle can be utilized to connect a wearable device to other devices, for example, a television, via a Universal Serial Bus (USB). Such a connection allows a user to, for example, display information stored and processed in the wearable device on a coupled television.
  • the dongle can include a high definition multimedia interface output, a 1.3-megapixel camera, a microphone, and a 3.5mm audio output.
  • FIGURE 37 illustrates an example Integrated Development Environment for the generation of SDKs.
  • the Integrated Development Environment (IDE) 3704 can be utilized to develop customized Software Development Kits (SDKs) to make consumer electronics, appliances, and other computer-based products compatible with a wearable device.
  • the IDE can include a library with Hardware Development Kit (HDK) and Application Programming Interface (API) modules for users to utilize in the development of customized SDKs.
  • the HDK and API can provide access to a set of functionalities of the wearable device, tablet, pocket size touchscreen and other devices presented herein.
  • the IDE can support a variety of compilers and programming languages to facilitate the development of supporting classes to simplify the access of the functionalities in the users' preferred programming language 3708 and their corresponding compilers 3712.
  • the IDE can include a built-in code editor featuring functionalities to support programmers' coding projects, such as code folding, split window views, and multiline find and replace, among other functions.
  • a customized SDK variant is produced 3718
  • users can develop applications that can be run on the wearable device operating system 3718 via a wearable device API 3716 to interact with third-party consumer electronics, appliances, and other computer-based devices.
  • the developed third-party applications 3720 can be deployed and distributed to the general public via an App Market 3724.
  • the IDE can support a third party vendor to develop HDK/SDK that can be instantiated on the wearable device.
  • some existing Smartphone manufacturers may provide SDKs for third-party vendors to develop compatible Smartphone apps (e.g., iPhone apps, Android apps, etc.); in general, such SDKs provide APIs for a third-party vendor to develop new apps that will be operated on a single device, e.g., the Apple iPhone, the Android phone, etc.
  • the customized IDE shown in FIG. 37 provides an HDK and SDK combination that allows a third party vendor to develop extended hardware compatible with the wearable device, and software applications that can be operated on the wearable device to interface and control the extended hardware.
  • FIGURE 38 illustrates an example motion pattern to indicate the storage of digital data in a hybrid storage cloud environment.
  • a user can perform a gesture 3802 to indicate to the wearable device to save the information displayed on the tablet 3808.
  • once the wearable device determines the performed gesture, it can send the data to be saved to a remote storage server 3810 in a save digital data request packet 3814.
  • the remote storage server can then send an acknowledgment packet back to the wearable device 3816 confirming that the information was successfully saved or informing the user that a problem occurred while saving the information.
  • the wearable device can additionally save the digital data in the wearable device's local storage, e.g., 3806, through the local storage (LS) component 3900. Further details with respect to the local storage (LS) component 3900 are described herein, particularly with respect to an example LS component 3900 shown in Fig. 39.
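A minimal sketch of the hybrid save flow above, with the remote storage client stubbed out; the content-key derivation and the return messages are illustrative assumptions, not part of the disclosure.

    # Illustrative sketch of the Figure 38 save flow: on a recognized "save"
    # gesture the wearable sends the data to a remote storage server and also
    # writes a local copy. The network layer is a stub.
    import hashlib

    def save_digital_data(data: bytes, remote, local_store: dict) -> str:
        key = hashlib.sha256(data).hexdigest()[:12]      # illustrative content key
        ack = remote.save(key, data)                     # 3814 request -> 3816 acknowledgment
        local_store[key] = data                          # 3806: keep a local copy as well
        return "saved" if ack else "remote save failed (kept local copy)"

    class StubRemoteStorage:
        """Hypothetical remote storage client; always acknowledges success."""
        def save(self, key, data):
            return True

    if __name__ == "__main__":
        local = {}
        print(save_digital_data(b"tablet screen contents", StubRemoteStorage(), local))
        print(list(local.keys()))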
  • FIGURE 39 illustrates an example digital data replacement/retention policy Least-Recently Used (LRU).
  • the wearable device has a finite capacity for internal storage, for example, 128GB.
  • the wearable device can take advantage of hybrid cloud storage with a retention/replacement policy chosen by the user.
  • the wearable device can replace digital data according to a user-defined replacement/retention policy.
  • a retention/replacement policy can be for example, a Least-Recently Used (LRU) retention policy, where the least recently accessed digital data or file is chosen to be replaced when more internal storage capacity is desired.
  • a user can choose other available retention policies, for example, Most-Recently Used (MRU), Random Replacement (RR), or Least Frequently Used (LFU).
  • retention/replacement policies can be called every time an attempt is made to save new information locally, or can be executed periodically as a background maintenance procedure.
  • users can create their own retention/replacement policies. For instance, a user can rank digital data in an order that determines the likelihood to be replaced, e.g., digital data ranked as 5 can be configured to be more likely to be replaced in the local storage than digital data ranked as 4 and so on.
  • Another way to specify a policy is by associating file extensions to ranks. For example, a user can configure .docx files to be prioritized over .mp4 files for replacement/retention purposes. Additionally, the user can choose to save certain digital information locally only and mark it as "Not Replaceable" or to "Save in the Cloud Only."
  • the retention/replacement policies can be defined and selected from an installed app in the wearable device utilizing the device's display or a secondary display device.
  • Fig. 39 shows an exemplary logic flow of how a retention/replacement policy works (e.g., Least-Recently Used (LRU)).
  • a command call is performed by the wearable device 3904. Such a command call will start a routine to execute several processing steps. Among other steps, the wearable device verifies that there is enough space to store the information locally 3908. If there is enough space in the local storage, the information is saved locally 3928 and the user 3902 receives an acknowledgment indicating that the save operation was successful 3928. If, however, there is not enough local storage space to save the file then the wearable device determines what retention/replacement policy is enabled in the device 3910.
  • the LRU policy defines a local variable $Min with a time value that is one unit ahead of the current time 3914. Thereafter, a loop process is initiated 3916. The steps in the loop process include checking the last accessed time of each of the files stored in the local storage 3918 and comparing the last accessed time to the time value stored in the variable $Min. When the last accessed time for a given file is earlier than the time value in $Min, a pointer to the file is stored in the variable
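The LRU selection loop described above can be sketched as follows, assuming an in-memory catalog of file names and last-accessed times and honoring a "Not Replaceable" set as mentioned earlier; the data layout is an assumption made only for illustration.

    # Sketch of the Least-Recently Used selection of Figure 39: scan the catalog
    # and return the name of the least recently accessed, unprotected file.
    def lru_evict(files: dict, protected: set = frozenset()):
        """files maps name -> last_accessed time; returns the name chosen for replacement."""
        candidate, min_time = None, float("inf")         # $Min starts ahead of every file's time
        for name, last_accessed in files.items():
            if name in protected:
                continue                                 # files marked "Not Replaceable" are kept
            if last_accessed < min_time:                 # earlier access -> better candidate
                candidate, min_time = name, last_accessed
        return candidate

    if __name__ == "__main__":
        catalog = {"notes.docx": 100.0, "clip.mp4": 250.0, "photo.jpg": 40.0}
        print(lru_evict(catalog, protected={"photo.jpg"}))   # prints "notes.docx"

The same scan structure accommodates the other policies mentioned above (MRU, LFU, rank- or extension-based) by changing only the comparison used to pick the candidate.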
  • FIGURE 40 illustrates an example of a wearable device working as a surrogate communication device.
  • a user with a wearable device can perform a gesture to use the wearable device as a surrogate communication device 4002. Once the wearable device identifies the gesture, it establishes a connection with a communication device 4001 to send an identifier utilization request 4003 which, if accepted, will allow the wearable device to communicate with other devices on behalf of the communication device 4001, as sketched below.
  • the device 4001 can accept or decline the request and communicate an answer to the wearable device through an identifier utilization response 4004. If the response is positive, then the wearable device can communicate with, for example, a second communication device 4006 via text, email, and/or voice call, e.g., 4005.
  • the communication device 4006 can receive information from the wearable device 4001 on behalf of the communication device 4002.
  • the employment of the wearable device as a surrogate communication device can involve one or more exchanges of credentials and login information, including but not limited to user names, passwords, IP addresses, subscriber identity module numbers, and/or identifiers.
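A hedged sketch of the surrogate communication exchange above; the message fields, the StubCommDevice helper, and the returned identifier are hypothetical stand-ins used only to show the request/response ordering.

    # Illustrative sketch of the Figure 40 exchange: the wearable asks a nearby
    # communication device for permission to use its identifier and, if granted,
    # sends a message on its behalf.
    def request_identifier(comm_device) -> dict:
        # 4003: identifier utilization request -> 4004: identifier utilization response
        return comm_device.handle_request({"type": "identifier-utilization-request"})

    def send_on_behalf(comm_device, recipient: str, text: str) -> str:
        response = request_identifier(comm_device)
        if not response.get("accepted"):
            return "request declined"
        return f"to {recipient} from {response['identifier']}: {text}"   # 4005

    class StubCommDevice:
        """Hypothetical paired phone that accepts the request and shares its number."""
        def handle_request(self, request):
            return {"accepted": True, "identifier": "+1-555-0100"}

    if __name__ == "__main__":
        print(send_on_behalf(StubCommDevice(), "device-4006", "Running late"))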
  • inventive embodiments are presented by way of example only and, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed.
  • inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein.
  • embodiments of designing and making the coupling structures and diffractive optical elements disclosed herein may be implemented using hardware, software or a combination thereof.
  • the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
  • a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
  • a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets.
  • a computer may receive input information through speech recognition or in other audible format.
  • Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN) or the Internet.
  • networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
  • the various methods or processes may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
  • inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above.
  • the computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above.
  • The terms "program" and "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
  • Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • functionality of the program modules may be combined or distributed as desired in various embodiments.
  • data structures may be stored in computer-readable media in any suitable form.
  • data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationships between the fields.
  • any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish a relationship between data elements.
  • inventive concepts may be embodied as one or more methods, of which an example has been provided.
  • the acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
  • a reference to "A and/or B", when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified.
  • “at least one of A and B" can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

Abstract

The Intelligent Wearable Data Processing and Control Platform Apparatuses, Methods and Systems detect user movement patterns and transform them into computer commands to control electronic devices and process electronic data according to a variety of usability contexts. In some implementations, the disclosure provides a wrist-wearable apparatus, a wireless multimedia interface apparatus, a display device, and processor-implemented methods for motion-controlled device tethering, hardware authentication, and hardware identification-based targeted advertisement delivery.

Description

Wearable Data Processing and Control Platform Apparatuses, Methods and Systems
BACKGROUND
[0001] Body-borne technology or wearables are small devices that can be embedded in clothing and/or personal accessories. The non-intrusive nature and usability of body-borne technologies provide users with a unique human-device interaction wherein users can utilize a technology as an extension of their mind and body.
SUMMARY
[0002] Embodiments of the present invention include a wearable device, such as a wrist- worn computer, which may be enabled with wide area network (WAN) (e.g., 2G, 3G and 4G LTE), built-in global positioning system (GPS), Bluetooth and WiFi connectivity mechanisms. The wearable device may effectively replace smartphones, tablets, laptops, desktops and smart TVs, by being paired with different screens and input/output devices. The wearable device may be charged wirelessly and may utilize a high-bandwidth wireless protocol (e.g., WiGig, Bluetooth, and/or WiFi Direct) to stream video, audio, data, and various other content to a variety of screens. It may also comprise multiple data sensors, such as an accelerometer, gyroscope, digital compass and/or the like, as well as sizable internal storage. The wearable device may be controlled from other input devices; for example, the wearable device may receive commands from a paired device such as a mobile phone, a tablet and/or other computer-based devices. In one embodiment, the wearable device may have a screen suitable to display indications like time/date, notifications, connectivity toggles, and the like, as well as a touchscreen, and motion sensors. Moreover, the wearable device may use a secure and passive authentication method (e.g., heart signature) which can verify unique wave patterns of a user heart's electrical activity to allow user access to one or more applications installed in the wearable device.
[0003] Another exemplary embodiment is directed to a dongle which resembles a typical webcam that mounts to the top part of a monitor or TV that may receive wireless video and audio signals from the wearable device and may transmit the signals via high definition multimedia interface (HDMI) to a TV, computer monitor, and/or the like display devices.
The device may comprise three parts: the plug (HDMI), the cord, and a satellite part comprising a camera, microphone and audio jack that mounts to the top part of a display. The dongle may transmit video, audio, sensory input and the like signals back to the wearable device. [0004] A further exemplary embodiment is directed to a display device which resembles a typical smartphone. This device may comprise a plurality of components, including but not limited to a display, capacitive touch panel, loudspeaker, in-ear speaker, microphone, cameras, and sensors. Such a display device can operate as an input and output interface for the wearable device. The display device may comprise connectivity mechanisms (e.g., Bluetooth low energy (BTLE), wireless gigabit (WiGig), Wi-Fi direct (WFD) and the like) to pair and connect with a wearable device wirelessly. In still another embodiment, the display device may comprise a magnetic resonance module to wirelessly charge the battery of a wearable device.
[0005] It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] The skilled artisan will understand that the drawings primarily are for illustrative purposes and are not intended to limit the scope of the inventive subject matter described herein. The drawings are not necessarily to scale; in some instances, various aspects of the inventive subject matter disclosed herein may be shown exaggerated or enlarged in the drawings to facilitate an understanding of different features. In the drawings, like reference characters generally refer to like features (e.g., functionally similar and/or structurally similar
elements).
[0007] FIGURE 1A illustrates a front view of a wrist-shaped wearable device.
[0008] FIGURE IB illustrates a side view of a wrist-shaped wearable device.
[0009] FIGURE 1C illustrates an isometric view of a wrist-shaped wearable device.
[0010] FIGURE 2 illustrates an example block diagram of the components comprised by a wearable device.
[0011] FIGURE 3A illustrates a wireless multimedia interface apparatus. [0012] FIGURE 3B illustrates a block diagram of a wireless multimedia interface apparatus.
[0013] FIGURE 4A-C illustrates back, side and front views of an example display device with enhanced communication mechanisms to interface with a wearable device.
[0014] FIGURE 5 illustrates a block diagram for an enhanced display device.
[0015] FIGURES 6A-B illustrate an example of proximity awareness and the initiation of communication between a wearable device and a display device.
[0016] FIGURE 7 illustrates an example gesture controlled device tethering method.
[0017] FIGURES 8A-B illustrate an example of a gesture control to define a virtual control surface for data manipulation.
[0018] FIGURE 9 illustrates an example of a gesture tracking module comprised by a wearable device.
[0019] FIGURES 10A and 10B illustrate an example of motion-controlled device tethering between a wrist-shaped band device and multiple displays.
[0020] FIGURE 11 illustrates a physical trackpad utilizing a display device with touch screen.
[0021] FIGURE 12 illustrates an example of motion-controlled device tethering between a wearable device and a domestic appliance.
[0022] FIGURE 13 illustrates an example of a hardware-based authentication process employing a wearable device.
[0023] FIGURE 14 illustrates an example of a customized advertising process employing a plurality of wearable devices.
[0024] FIGURE 15 illustrates an example of wirelessly charging a wearable device through magnetic resonance.
[0025] FIGURES 16A and 16B illustrate an example of wirelessly exchanging social profiles through wearable devices.
[0026] FIGURES 17A and 17B illustrate an example of configuration of motion controlled social connection. [0027] FIGURES 18A and 18B illustrate an example of logged information and further alternative actions after attending an enhanced event with wearable devices.
[0028] FIGURES 19A and 19B illustrate an example of a configuration for programming and customizing motion patterns.
[0029] FIGURE 20 illustrates an enhanced conference employing wearable devices.
[0030] FIGURE 21 illustrates an example of parallel output processes performed by a wearable device.
[0031] FIGURE 22 illustrates an example of a gesture indicating a first time tethering to a device.
[0032] FIGURES 23A and 23B illustrate an example of user interfaces for first-time tethering to a device and for communication mechanism settings.
[0033] FIGURE 24 illustrates an example logic flow of a device recognition (DR) component.
[0034] FIGURES 25A and 25B illustrate an example of magnetic charging connectors and logic flow for charging mode recognition.
[0035] FIGURE 26 illustrates face apps available for the wrist-shaped device.
[0036] FIGURE 27 illustrates an example use case for switching between apps that are most frequently used by a user.
[0037] FIGURE 28 illustrates an example use case for switching between apps featuring an apps grid.
[0038] FIGURE 29 illustrates an example of screens to display a notification summary area.
[0039] FIGURE 30 illustrates a use case to turn on and use the wearable device contact dash feature.
[0040] FIGURE 31 illustrates an example of a flexible wearable device with a rubber inner cover and glass exterior.
[0041] FIGURE 32 illustrates an example of a pocket size capacitive touchscreen display device enhanced with camera and sensors. [0042] FIGURE 33 illustrates an example of a tablet enhanced with a capacitive touchscreen, camera, and sensors.
[0043] FIGURE 34 illustrates an example of a keyboard with a hinge in metal latch to plug a tablet
[0044] FIGURE 35 illustrates an example of a Bluetooth-enabled headset as a wearable necklace.
[0045] FIGURE 36 illustrates an example of a high definition multimedia interface (HDMI) dongle.
[0046] FIGURE 37 illustrates an example Integrated Development Environment for the generation of SDKs.
[0047] FIGURE 38 illustrates an example motion pattern to indicate the storage of digital data in a hybrid storage cloud environment.
[0048] FIGURE 39 illustrates an example digital data replacement/retention policy Least-Recently Used (LRU).
[0049] FIGURE 40 illustrates an example of a wearable device working as a surrogate communication device.
DETAILED DESCRIPTION
[0050] Following below are more detailed descriptions of various concepts related to, and embodiments of, inventive systems, methods and apparatus of intelligent wearable data processing and control platforms. It should be appreciated that various concepts introduced above and discussed in greater detail below may be implemented in any of numerous ways, as the disclosed concepts are not limited to any particular manner of implementation. Examples of specific implementations and applications are provided primarily for illustrative purposes.
[0051] FIGURES 1A-1C illustrate different views of a wearable device 105. In one embodiment, the wearable device may comprise a wrist-band shaped body 104 to be worn and removed from a wrist through a gap between two disjoint ends. The wearable device may have a protective shell made of transparent and flexible nylon that encapsulates a black-painted printed circuit board made of flexible materials (e.g., Polyimide core, overlay, and/or the like) combined with multilayers of a rigid material (e.g., FR4 IPC-101/21 or high-Tg FR4 filled and/or the like) to provide a built-in connection and/or to make a three-dimensional wrist-band shaped form comprising the circuit components. The transparency of the protective shell may create the illusion/perception of smaller size. Moreover, the wearable device may comprise a digital display 101, visible via an outer surface of the wrist-band shaped body, to display a time/date, connectivity toggles, and notifications such as incoming emails, incoming text messages, event alerts, and/or the like. In a further embodiment, the digital display may comprise tactile capabilities such as control through simple or multi-touch gestures, compatibility with touch screen stylus pens, freehand writing recognition and/or the like. Within various implementations, the wearable device may appear in the form of a wrist band, a head band, a helmet, a neck tie, a pin, an arm band, a waist belt, foot wear, a badge, a key chain fob, and/or other wearable accessories.
[0052] In one embodiment, the wearable device may comprise a plurality of motion sensors, e.g., a 3-axis accelerometer, a 3-axis gyroscope, and/or the like disposed within the wearable device body to detect movement of the wearable device. The wearable device may comprise a vibration motor disposed within the wearable device body to enable the wearable device to vibrate. The vibration motor may comprise piezoelectric vibration mechanisms to facilitate different vibration patterns, each pattern having a specific meaning to a user; for example, a vibration pattern can be associated with incoming communication received from a predetermined user. Furthermore, the wearable device may comprise a wireless communication transceiver, disposed within the wearable device body, to receive and transmit communications such as telephonic, instant messaging, email and the like communications.
[0053] The wearable device may comprise a processor disposed within the wearable device body and operably coupled to the digital display, the motion sensors, the vibration motor, and the wireless communication transceiver. The wearable device may determine a type of communication request and may generate a notification based on the type of communication request to be displayed on the digital display.
[0054] In one embodiment, the wearable device may determine a vibration pattern based on the type of the communication request for the vibration motor to vibrate the wearable device body according to the vibration pattern. A user may be able to customize these vibration patterns to, for example, distinguish incoming communications from different contacts. The configuration of customized vibration patterns may be enabled by a tangible haptic interface wherein the user may tap a configuration sequence on the wearable device screen, wherein each tap in the configuration sequence is synchronized with a vibration pulse chosen from a pre-recorded set of unique vibration pulses available to configure customized vibration patterns.
[0055] The wearable device may sense a user movement input and interpret the movement as a control command. The control command can be, for example, in response to a notification label indicating an incoming communication and/or a unique vibration notification characterized by a unique oscillation frequency indicative of an incoming communication. In addition, the wearable device may execute the determined control command; for example, the user movement input can be associated with an answering communication control command or with an ignoring communication control command.
[0056] The wearable device may comprise a power supply, operably coupled to the processor, to provide electrical power to the processor and a conductive coil, operably coupled to the power supply, to charge the power supply through magnetic resonance or magnetic induction. For example, the coil in the wearable device can receive electrical current from an alternating magnetic field generated by a transmitter coil.
[0057] FIGURE 1B illustrates a side view of the wearable device 105. The wearable device may comprise one or more polycarbonate plastic layers 102 and/or solid type nylon layers 103 to protect the wearable device's internal components. The solid type nylon layer 105 makes the wearable device waterproof or water resistant.
[0058] Furthermore, the wearable device may comprise a button to perform a hard reset on the system. A hard reset may be useful when an error or failure is produced in the wearable device from which the wearable device cannot recover. The reset button can normally restart the wearable device operating system and/or can run the operating system in safe mode with only basic drivers enabled. FIGURE 1C illustrates a perspective view of the wearable device. The wrist-shaped body of the wearable device 104 may be implemented in different sizes to fit the wrists of numerous users. Independently of the size, the wearable device may include a power supply interface, e.g., a power supply plug-in socket on the outer surface of the wearable device for a power supply plug-in. The wearable device may include a loop antenna (e.g., a coiled copper loop, etc.) inside the body 104 for magnetic resonance charging.
[0059] FIGURE 2 illustrates an example block diagram of the internal components comprised by a wearable device. In one embodiment, the wearable device as shown in FIGURES 1A-C may comprise a microprocessor (application processor 201) to enable a plurality of communications, e.g., telephone, text messages, email, as well as online and offline operations including but not limited to browsing the Internet, watching video streams, uploading and downloading files, writing text and/or the like operations. The wearable device also comprises a memory module 202 including a low-power double data rate (LPDDR) random access memory capable of being connected over a 16-bit or 32-bit memory bus per channel, an embedded multimedia controller (eMMC) and/or the like components. The wearable device may comprise a 3G, LTE telecommunications module 203 including security mechanisms for subscriber identity module (SIM) cards, a dedicated data network processor and/or the like components. In a further embodiment, the wearable device may comprise a connectivity module 204, including mechanisms to enable Wi-Fi wireless connectivity, mechanisms to establish short-range wireless interconnections (e.g., Bluetooth), pairing mechanisms to perform near field communications (NFC), and/or multicast wireless sensor network technologies and the like.
[0060] The wearable device may comprise a Wireless Display module 206 including wireless high definition multimedia interface (e.g., WiGig), mechanisms to establish short- range wireless interconnections (e.g., Bluetooth), pairing mechanisms to perform near field communications (NFC), and/or multicast wireless sensor network technologies and the like.
[0061] The wearable device may comprise a plurality of sensors, actuators, and computing components 206 including but not limited to a multipoint control unit (MCU); a nine- axis motion tracking sensor with an embedded gyroscope, accelerometer, and compass: one or more buttons; wireless charging mechanisms; an authentication sensor chip; a vibration motor; an LCD touch screen; a global positioning system (GPS); a power block battery; and/or the like components.
Wearable Device Interfaces and Peripherals.
[0062] FIGURE 3A illustrates an embodiment of a wireless multimedia interface apparatus. In one embodiment, the wearable device can be wirelessly connected to a wireless multimedia interface apparatus; the multimedia interface apparatus may comprise a wireless transceiver, disposed within the body member of the multimedia interface apparatus 301, to receive data content via wireless connection from a wearable device. The multimedia interface apparatus may comprise a multimedia data format converter, disposed within the body member and communicatively coupled to the wireless transceiver, to convert a data format of the data content to a multimedia format compatible for display at a screen display device. In addition, the multimedia interface apparatus may comprise a multimedia interface connector 302, communicatively coupled to the multimedia data format converter, to be plugged into a multimedia input receptacle of a screen display device and to transmit the data content in the multimedia format to the screen display for display. The multimedia interface apparatus may comprise a power adapter 303 for select televisions.
[0063] FIGURE 3B illustrates an example of a block diagram of a wireless multimedia interface apparatus. In one embodiment, the multimedia interface apparatus may comprise a microcontroller (MICOM) 304, a wireless display module 305 including security and pairing mechanisms, a wireless HDMI mechanism, and additional components 306, including but not limited to a battery and/or charger, HDMI socket and the like components.
[0064] FIGURES 4A-C illustrate back, side, and front views of a display device with enhanced communication mechanisms to interface with a wearable device. In one embodiment, the display device may comprise a high pixel density screen (e.g., 320 ppi) enhanced with a capacitive touch panel 401. Additionally, the display device may comprise a dual LED flash 403 and a rear-facing camera 402.
[0065] Further embodiments of the display device may comprise a plurality of additional components including but not limited to transceivers for audio and video streaming, components for Bluetooth low energy connectivity, an embedded microphone, an in-ear speaker, a loudspeaker, a 3.5 mm audio and microphone jack, proximity sensors, a 3-axis accelerometer, a gyroscope, a high capacity battery, components to support wireless charging towards a wearable device, and the like components.
[0066] FIGURE 5 illustrates an example of a block diagram for an enhanced display device. The display device may comprise a central processing unit (CPU) controlling and executing operation over a plurality of resources and components including but not limited to memory and storage including LPDDR mechanisms, front- and/or rear-facing cameras, illumination sensors, LED lighting, one or more buttons, n-axis sensors where n can be 3, 6 or another positive integer, a battery, a charger mechanism, a voice processor unit, audio codec mechanisms including a loudspeaker, receiver mechanisms and/or microphones, mechanisms to establish a plurality of communications (e.g., Bluetooth, WiFi, NFC, WiGig and/or ANT), an LCD display, one or more touch sensors, a touch controller unit, and the like components.
[0067] FIGURE 6A illustrates an example of proximity awareness between a wearable device and a display device. In one embodiment, the wearable device 601 detects a gesture, such as a knocking or waving motion that indicates a request to connect to the display device 602 while the display device remains in standby mode. Thereafter, the wearable device may broadcast a message 603 to all the devices within a predefined proximity area. Moreover, the display device may receive a broadcast message emitted by a wearable device. Additionally, the display device may respond to the request with an acknowledgement message 604 comprising relevant information to establish a connection.
[0068] FIGURE 6B illustrates an example of an initiation of communication between a wearable device and a display device. In one embodiment the wearable device 601 may receive a gesture for pairing, such as a knocking or waving motion, with the display device 602. Meanwhile the display device 602 may be waiting for new messages. Thereafter the wearable device may send a message comprising pairing information to the display device 605. The display device 605 may respond to the wearable device 607 with an acknowledgment message 606 and then the wearable device 601 and the display device 605 may establish a connection for communication.
[0069] FIGURE 7 illustrates an example gesture-controlled device tethering method. A wearable device may receive, from a motion sensor disposed within the wearable device body 701, a motion indication including a movement pattern 702 of the wearable device corresponding to knocking on a display device 703 (e.g., a motion similar to knocking on a door). The display device 703 may include any user interface output device such as a display device, an audio speaker, and/or the like. The wearable device may determine, based at least in part on the motion indication, that the movement pattern indicates a tethering request to tether the wearable device with the display device. In addition, the wearable device may instantiate a device query on a communication stack (i.e., multiple layers of software that process a message from transmitter to receiver) within communication range of the wearable device in response to the tethering request. For example, a motion pattern, e.g., "knock-knock," when the user wearing the wearable device double-knocks on a surface, may indicate a tethering request within the communication stack to an output device. In another implementation, the motion pattern "scratching" may indicate a command when a communication request is received at the wearable device. For example, a user wearing the wearable device may see a phone call received at the wearable device, e.g., a beep, vibration, etc.; the user may raise his/her wrist to "scratch" with the device behind his/her ear, and such a "scratch" motion may trigger a command to pick up the phone call, e.g., for the wearable device to answer the phone call, etc. It should be noted that a variety of motion patterns including, but not limited to, waving, scratching, knocking, tapping (fingers), and/or the like, may be contemplated for motion control of the device. In one implementation, a user may define a motion pattern for a designated command via a user interface component, e.g., defining "knock-knock" as a tethering request for a nearby display device, "scratching" as answering an incoming call, "waving" as moving the mouse on a tethered display device, and/or the like, as sketched below.
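The mapping of recognized motion patterns to commands, including user-defined overrides, could be sketched as follows. This is a minimal sketch; the pattern names, command names, and context flags are illustrative assumptions.

```python
# Minimal sketch of mapping recognized motion patterns to commands, including
# user-defined overrides. Pattern and command names are illustrative assumptions.

DEFAULT_PATTERN_COMMANDS = {
    "knock-knock": "TETHER_NEARBY_DISPLAY",
    "scratching": "ANSWER_INCOMING_CALL",
    "waving": "MOVE_POINTER",
}

class MotionCommandMapper:
    def __init__(self, overrides: dict[str, str] | None = None):
        # A user may redefine a pattern's command via a settings UI.
        self.mapping = {**DEFAULT_PATTERN_COMMANDS, **(overrides or {})}

    def command_for(self, pattern: str, context: dict) -> str | None:
        command = self.mapping.get(pattern)
        # "scratching" only answers a call when a call is actually ringing.
        if command == "ANSWER_INCOMING_CALL" and not context.get("incoming_call"):
            return None
        return command

mapper = MotionCommandMapper(overrides={"tapping": "TETHER_NEARBY_DISPLAY"})
print(mapper.command_for("knock-knock", {}))                      # TETHER_NEARBY_DISPLAY
print(mapper.command_for("scratching", {"incoming_call": True}))  # ANSWER_INCOMING_CALL
```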
[0070] In one embodiment, the wearable device may send, via a wireless transceiver, a connection request to the display device. Thereafter, the wearable device may receive, via the wireless transceiver, a connection approval from the display device in response to the connection request. Furthermore, the wearable device may send, via the wireless transceiver, data content for display to the display device. A user can view data content on the display device and input data through the display device's touch screen or by activating the display device's sensors, for example, by activating an accelerometer by moving the display device.
[0071] FIGURE 8A illustrates an example of a gesture control to define a control surface and data manipulation. In one embodiment, the wearable device may receive, from the motion sensor disposed within the wearable device body 802, a motion indication including a movement pattern 803 performed over an object's surface 801. Thereafter, the wearable device may analyze the direction and length of the movement pattern and can generate a virtual control surface by mapping the dimensions gathered through the movement pattern to the dimensions of a display device, for example as sketched below.
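A minimal sketch of generating such a virtual control surface follows: the span of the traced movement pattern is mapped linearly onto the display's pixel dimensions. The units and the 1280x720 display size are illustrative assumptions.

```python
# Minimal sketch of a virtual control surface: the traced rectangle (in cm)
# is mapped linearly onto the display's pixel dimensions.

def build_surface_mapping(trace_width_cm, trace_height_cm,
                          display_width_px=1280, display_height_px=720):
    """Return a function mapping a point on the traced surface (origin at the
    traced rectangle's top-left corner, in cm) to display pixel coordinates."""
    sx = display_width_px / trace_width_cm
    sy = display_height_px / trace_height_cm

    def to_display(x_cm, y_cm):
        # Clamp so finger drift outside the traced area stays on screen.
        px = min(max(x_cm * sx, 0), display_width_px - 1)
        py = min(max(y_cm * sy, 0), display_height_px - 1)
        return int(px), int(py)

    return to_display

to_display = build_surface_mapping(trace_width_cm=20.0, trace_height_cm=11.25)
print(to_display(10.0, 5.625))   # roughly the center of the display: (640, 360)
```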
[0072] FIGURE 8B illustrates an example of data manipulation employing a gesture controlled surface. For instance, the wearable device 806 may determine that the movement pattern (e.g., typing motions, finger tapping motions, finger swiping motions, finger or palm movements in particular relative or absolute directions, etc.) over the selected surface 805 indicates a control command (e.g., typing an address to be viewed on a map website) based on content displayed on the display device 807. In addition, the wearable device may execute the determined control command in communication with a wireless multimedia interface apparatus and/or dongle 804 connected to the display device.
[0073] FIGURE 9 illustrates an example of a gesture tracking module comprised by a wearable device. A wearable device may detect a gesture 902 employing a movement sensor 903, for example, an accelerometer and/or a gyroscope enhanced with a compass. The raw gesture data may be sent to a pointing conversion module running on a processor configured to transform the data into a readable format for the target application 904. Furthermore, an input manager may receive the pointing data 905 and may buffer the pointing data into a GUI Control Application 906 in charge of rendering the manipulation of graphical objects on a coupled display device.

Device Tethering
[0074] FIGURE 10A illustrates an example of motion controlled device tethering between a wearable device and multiple displays. In one embodiment, a wearable device 1001 may receive, from a motion sensor (not shown) disposed within the wearable device body, a first motion indication including a first movement pattern, e.g., knocking on the device 1002. Thereafter, the wearable device may query an internal database with programmed movement patterns and determine that the first movement pattern indicates a first tethering request.
[0075] The wearable device 1001 may instantiate a device query on a communication stack within communication range of the wearable device. Thereafter, the wearable device 1001 may receive an indication of a first display device 1003 and a second display device 1006 within the communication stack. Furthermore, the wearable device may send a first connection request to the first display device 1003 and thereafter it may receive a first connection approval from the first display device 1003 in response to the first connection request.
[0076] The wearable device 1001 may receive, from the motion sensor, a second motion indication including a second movement pattern. Thereafter the wearable device 1001 may determine that the second movement pattern indicates a second tethering request. In addition, the wearable device 1001 may send a second connection request to the second display device 1006 and thereafter it may receive a second connection approval from the second display device in response to the second connection request.
[0077] FIGURE 10B illustrates an example of data manipulation on a first tethered display 1003 employing a second tethered display 1006 and a wearable device 1001. In one embodiment, the wearable device 1001 may send data content for display to a first display device 1003. In addition, the wearable device 1001 may send data content for display to a second display device 1006 in communication with a wireless multimedia interface apparatus 1007 connected to the display device. Moreover, the wearable device 1001 may receive a user input indication from the first display device 1003, and process the user input indication to execute a user command, for example, typing text or clicking on a GUI object. Thereafter, the wearable device may generate output data based on the user command and may send the output data for display to the second display device 1006, as in the following sketch.
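The routing just described, in which input arrives from the first tethered display and the resulting output is pushed to the second, could be sketched as follows. This is a minimal sketch; the device model, event shapes, and generated output strings are illustrative assumptions.

```python
# Minimal sketch of routing: input from the first tethered display becomes a
# user command on the wearable, whose output is sent to the second display.

class TetheredDisplay:
    def __init__(self, name):
        self.name = name
        self.frames = []          # content "sent for display"

    def show(self, content):
        self.frames.append(content)

class Wearable:
    def __init__(self, input_display, output_display):
        self.input_display = input_display
        self.output_display = output_display

    def on_user_input(self, event):
        # Process the input indication into a user command (e.g., typed text),
        # then generate output data and send it to the second display.
        if event["type"] == "text":
            output = f"Search results for '{event['value']}'"
        elif event["type"] == "click":
            output = f"Opened {event['target']}"
        else:
            return
        self.output_display.show(output)

first = TetheredDisplay("display-1003")
second = TetheredDisplay("display-1006")
wearable = Wearable(first, second)
wearable.on_user_input({"type": "text", "value": "1600 Amphitheatre Pkwy"})
print(second.frames)   # ["Search results for '1600 Amphitheatre Pkwy'"]
```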
[0078] FIGURE 11 illustrates an example of a physical trackpad utilizing a display device with a touch screen. In one embodiment, the wearable device 1101 may scan for devices within a predetermined proximity area. The display device with touchscreen 1102 may also broadcast a message to be found by the wearable device. Furthermore, the wearable device 1101 may send a message to connect as an input device to the display device with touch screen 1103. The display device with touchscreen 1102 may receive the message and respond to it with an acknowledgement message 1104. Thereafter, the display device with touchscreen 1102 may wait for a touch input performed by the user with the wearable device 1101. The aforementioned process allows the display device 1102 to be used by the wearable device as a peripheral device, for example a trackpad, and/or as a peripheral to view and enter information from and to the wearable device.
[0079] FIGURE 12 illustrates an example of motion controlled device tethering with an appliance (e.g., a thermostat) employing a wearable device. The wearable device 1201 may instantiate a device query on a communication stack within a communication range of the wearable device 1202 comprising a wireless transceiver operably coupled to a processor. For example, the communication stack enabling one or more communication protocols may be established by the wearable device 1202, such as but not limited to Wi-Fi, Bluetooth, Bluetooth Low Energy (LE), Near Field Communications (NFC), iBeacon, and/or the like. The wearable device may receive, via the wireless transceiver, an indication of a home electronic device 1203 (e.g., a security system, a microwave, a refrigerator, a laundry machine, a thermostat, and/or the like) in the communication stack 1207. Thereafter, the wearable device may extract a device identifier from the received indication, query a list of pre-stored device identifiers for a match to the extracted identifier to determine a type of the home electronic device, and configure a control interface based on the type of the home electronic device 1204. Additionally, the wearable device may send a control command based on the configured control interface, for example a command to lower the temperature, to the home electronic device 1205, and receive, from the home electronic device, a notification indicative of the operating status of the home electronic device in response to the control command 1206. In one implementation, the home electronics device manufacturers may be provided with a hardware development kit (HDK) to equip the home electronics devices with hardware components to interface with the wearable device control commands.
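The identifier lookup and control interface configuration steps could be sketched as follows. This is a minimal sketch; the identifier strings, device types, command names, and the fabricated status notification are illustrative assumptions.

```python
# Minimal sketch of configuring an appliance control interface from a
# discovered device identifier and issuing a control command.

PRESTORED_DEVICE_TYPES = {
    "NEST-001": "thermostat",
    "LG-WM-42": "laundry machine",
}

CONTROL_INTERFACES = {
    "thermostat": ["set_temperature", "get_temperature"],
    "laundry machine": ["start_cycle", "get_remaining_time"],
}

def configure_control_interface(indication: dict) -> list[str] | None:
    device_id = indication.get("device_id")          # extract the identifier
    device_type = PRESTORED_DEVICE_TYPES.get(device_id)
    if device_type is None:
        return None                                  # unknown device: no interface
    return CONTROL_INTERFACES[device_type]

def send_command(interface_commands, command, value):
    if command not in interface_commands:
        raise ValueError(f"unsupported command: {command}")
    # In a real system this would go over the wireless transceiver; here we
    # simply fabricate the status notification the appliance would return.
    return {"status": "ok", "command": command, "value": value}

interface = configure_control_interface({"device_id": "NEST-001"})
print(interface)                                     # ['set_temperature', 'get_temperature']
print(send_command(interface, "set_temperature", 19))
```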
[0080] FIGURE 13 illustrates an example of a hardware-based authentication process employing a wearable device. A wearable device having a hardware identifier 1307 may send a system access request 1301 to a service provider 1308 (e.g., an online banking system that requires user credentials to log in, etc.). Such an access request can comprise a set of user credentials. Thereafter, the service provider may extract the credentials (e.g., hardware identifier, IP address, physical address, biometrics data, etc.) from the request and send them to a wearable device management server 1309 to be verified 1302.
[0081] In one embodiment, the wearable device management server 1309 may access a data repository containing pre-stored client credentials 1310 to verify the validity of the received user credentials 1303. Thereafter, the wearable device management server may receive a response to the verification request 1304 from the data repository. The wearable device management server 1309 may send a credentials verification response, which may confirm whether or not the received credentials are associated with an existing client of the service provider 1305. In addition, the service provider may analyze the received credentials verification response and may send a corresponding response to the system access request sent by the wearable device 1306.
[0082] For example, when a user wearing a wearable device 1307 uses the wearable device as a computing device to access a service provider 1308, e.g., an online banking site, the online banking site may detect that the access request originated from the wearable device 1307 and may provide an option of "Login with Your Wearable Device." Upon the user selecting this login mode, the service provider 1308 may collect the hardware ID of the wearable device 1307 and additional information (e.g., IP address, physical address, GPS location, etc.) and direct the authentication request 1302 to the server 1309, which may in turn authenticate the user based on the database of hardware IDs. In this way, the user wearing the wearable device 1307 may not need to enter additional credentials (e.g., user name and password, etc.) to securely log into the personal account at the service provider 1308 (e.g., the user's online banking account, etc.).
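The hardware-identifier login flow could be sketched as follows. This is a minimal sketch; the credential fields, the pre-stored client repository, and the decision logic are illustrative assumptions rather than an actual server implementation.

```python
# Minimal sketch of the hardware-identifier login flow: the service provider
# extracts credentials and delegates verification to the management server.

PRESTORED_CLIENTS = {
    # hardware_id -> account record
    "HWID-1307": {"account": "jane.doe", "registered_ip": "203.0.113.7"},
}

def management_server_verify(credentials: dict) -> bool:
    record = PRESTORED_CLIENTS.get(credentials.get("hardware_id"))
    if record is None:
        return False
    # Optional secondary check against the stored IP address.
    if credentials.get("ip") and credentials["ip"] != record["registered_ip"]:
        return False
    return True

def service_provider_login(access_request: dict) -> str:
    # Steps 1302-1305: extract credentials and request verification.
    credentials = {"hardware_id": access_request["hardware_id"],
                   "ip": access_request.get("ip")}
    if management_server_verify(credentials):
        account = PRESTORED_CLIENTS[credentials["hardware_id"]]["account"]
        return f"session-for-{account}"
    return "ACCESS_DENIED"

print(service_provider_login({"hardware_id": "HWID-1307", "ip": "203.0.113.7"}))
print(service_provider_login({"hardware_id": "HWID-9999"}))   # ACCESS_DENIED
```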
[0083] In a further implementation, the server authentication at 1309 on behalf of the service provider 1308 may employ the user's biometric data, which may be collected by the wearable device 1307, such as but not limited to fingerprint, iris/retina scanning, heart signature, blood pressure pattern, body temperature, and/or the like. In a still further implementation, the hardware authentication may be used in social media targeted ads. For example, social media platforms (e.g., Facebook, etc.) may collect "likes" based on hardware authentication to obtain an accurate count of user interests, avoiding robot generated spam "likes."

[0084] FIGURE 14 illustrates an example of a customized advertising process employing a plurality of wearable devices. In one embodiment, an ADServer 1414 may receive hardware identifiers from a plurality of wearable devices 1401, 1402 and 1403. Thereafter, the ADServer 1414 may send a common interest request comprising a set of hardware identifiers, e.g., 1404, to a ClientServer 1415. In another embodiment, the ClientServer 1415 may send a request for the interest profiles of the users who are associated with the received hardware identifiers 1405 to a repository containing clients' data 1416. Additionally, the ClientServer may receive a request response containing the interest profiles of the indicated users 1406. Thereafter, the ClientServer 1415 may analyze the received profiles to determine a common interest between the indicated users.
[0085] In one embodiment, the ClientServer 1415 may send a response 1408 to the common interest request sent by the ADServer 1414. Thereafter, the ADServer may process the common interest received in the response and determine an advertisement conceptually related to the received common interest 1409. Additionally, the ADServer may send the determined advertisement to be displayed on an automated billboard display 1417.
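The common interest determination and advertisement selection could be sketched as follows. This is a minimal sketch; the interest profile repository, interest tags, and ad catalog are illustrative assumptions.

```python
# Minimal sketch of common-interest advertisement selection: resolve hardware
# identifiers to interest profiles, intersect them, and pick a matching ad.

INTEREST_PROFILES = {                 # ClientServer-side repository (1416)
    "HWID-1401": {"cycling", "travel", "coffee"},
    "HWID-1402": {"travel", "photography"},
    "HWID-1403": {"travel", "cycling"},
}

AD_CATALOG = {                        # ADServer-side mapping of interest -> ad
    "travel": "Weekend flight deals",
    "cycling": "City bike share promotion",
}

def common_interests(hardware_ids):
    profiles = [INTEREST_PROFILES.get(h, set()) for h in hardware_ids]
    return set.intersection(*profiles) if profiles else set()

def select_advertisement(hardware_ids):
    shared = common_interests(hardware_ids)
    for interest in shared:
        if interest in AD_CATALOG:
            return AD_CATALOG[interest]   # sent to the billboard display (1417)
    return None

print(select_advertisement(["HWID-1401", "HWID-1402", "HWID-1403"]))  # Weekend flight deals
```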
Wearable Device Power Charging Modes
[0086] FIGURE 15 illustrates an example of wirelessly charging a wearable device through magnetic resonance. A user wearing a wearable device 1507 may perform a motion pattern 1508, which may trigger a motion indication 1501 detected by a motion sensor (not shown) embedded in the wearable device 1507. Thereafter, the wearable device 1507 may analyze the motion pattern 1502 and determine that the motion pattern 1502 matches a pre-programmed motion pattern to command a connection request 1503 to a close-range device, e.g., display device 1506. Furthermore, the display device 1506 may approve the connection request by sending a connection approval message 1504 to the wearable device 1507. Additionally, the display device 1506 may wirelessly transfer power 1509 to the wearable device 1507 while the wearable device transfers data content 1505 to be displayed on the display device 1506.
[0087] The wearable device 1507 may comprise a magnetic resonator which may receive a flow of power from a magnetic near field induced by a source resonator embedded in the display device 1506, charging the wearable device's battery by magnetic resonant power transfer. Alternatively, the wearable device 1507 may be charged wirelessly from the display device by inductive power transfer and/or similar wireless power transfer technologies. In this way, the wearable device 1507 may take advantage of the display device 1506, which can be equipped with a larger battery, to charge its own battery.

Wearable Device Social Networking
[0088] FIGURES 16A and 16B illustrate an example of wirelessly exchanging social profiles through wearable devices. In one embodiment, a user wearing a wearable device 1606 bumps the wearable device 1606 into another wearable device 1607 worn by another user. Thereafter, a motion sensor embedded in the wearable device 1606 may detect a motion indication 1601 triggered by the bump. Moreover, the wearable device 1606 may analyze the motion pattern, i.e., bump 1602, and determine that the motion pattern matches a pre-programmed motion pattern to command an exchange of social profiles with the close-range device 1607.
[0089] In one embodiment, the wearable device 1606 may instantiate a device query within communication range 1603 to detect and establish a communication link with the wearable device 1607. In addition, the wearable device 1606 may start a device recognition (DR) process 1604 to determine if the wearable device 1606 has been previously exposed to the wearable device 1607 and whether it has already recorded identification information regarding the wearable device 1607 for automatic recognition. Further details with respect to device recognition (DR) 1604 are described herein, particularly with respect to the example DR Component shown in Fig. 24.
[0090] Subsequent to the detection of a bump motion pattern, the wearable device 1606 may send a social profile (SP) exchange request 1605 to the wearable device 1607. In one implementation, the SP exchange request 1605 may include a user name, user contact information, and user social media information (e.g., see Fig. 18B), allowing the user of wearable device 1607 to send a social media connection request directly (e.g., a Friend request on Facebook, etc.). Thereafter, the wearable device 1607 may receive a confirmation input accepting the request to exchange SPs 1608. Then, the wearable device 1607 may send SP information 1609 to the wearable device 1606.
[0091] Once the wearable device 1606 receives a new social profile 1609, it may add the SP to an internal contacts database or may add or accept the social profile to a list of contacts in a social network. In another implementation, the social profile 1609 may only include a wearable device identifier, and further information regarding the wearer of the wearable device may be viewable only after the wearer has authorized the exchange of information. For example, the wearable device 1606 may receive an indication of the device identifier of device 1607, and then it may generate a message for display on a tethered screen (e.g., 1102 in Fig. 11), e.g., "do you want to connect with 1607?"
[0092] The wearable device 1606 may also send an SP 1611 to the wearable device 1607. In one embodiment, the SP 1611 may be sent to the wearable device 1607 only after the wearable device 1606 has accepted or added the social profile (SP) 1609 to a local contact list or repository, or alternatively has accepted the social profile (SP) 1609 into a social network contact or friend list. At the end of this step, the exchange of social profiles may have been successfully executed, with each wearable device having stored the social information of the other. A minimal sketch of this exchange follows.
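This is a minimal sketch of the bump-triggered social profile exchange in Python; the profile fields, the auto-accept flag standing in for the wearer's confirmation, and the direct in-memory link between the two devices are illustrative assumptions.

```python
# Minimal sketch of the bump-triggered social profile (SP) exchange.

class SocialWearable:
    def __init__(self, device_id, profile):
        self.device_id = device_id
        self.profile = profile         # e.g., name, contact, social media handles
        self.contacts = {}             # device_id -> received profile
        self.auto_accept = False       # stands in for the wearer's confirmation (1608)

    def request_exchange(self, other: "SocialWearable"):
        # 1605: send an SP exchange request containing our own profile.
        their_profile = other.receive_exchange_request(self.device_id, self.profile)
        if their_profile is not None:
            self.contacts[other.device_id] = their_profile   # store received SP (1609)

    def receive_exchange_request(self, sender_id, sender_profile):
        if not self.auto_accept:
            return None                                      # exchange declined
        self.contacts[sender_id] = sender_profile            # store sender's SP (1611)
        return self.profile

a = SocialWearable("1606", {"name": "Alice", "linkedin": "alice-example"})
b = SocialWearable("1607", {"name": "Bob", "twitter": "@bob_example"})
b.auto_accept = True
a.request_exchange(b)          # triggered after a "bump" motion is recognized
print(a.contacts)              # {'1607': {'name': 'Bob', 'twitter': '@bob_example'}}
print(b.contacts)              # {'1606': {'name': 'Alice', 'linkedin': 'alice-example'}}
```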
[0093] FIGURES 17A and 17B illustrate an example user interface (UI) of privacy/security configuration of motion controlled social connection. In one embodiment, a user may configure privacy/security settings of the motion controlled social connection, and/or social profile exchange, via a touch screen UI tethered with the wearable device (e.g., see 1102 in Fig. 11). For example, a user may configure the wearable device to automatically share his or her social profile and send a connection request to another wearable device without having to manually confirm the connection request, e.g., by enabling the "auto-bump" option at 1701. In another example, the user may elect to enable "notification" 1706, so that the user may receive a notification message when the wearable device detects a motion controlled social profile sharing request and/or sends out a connection request. Such a notification may request a user action to continue, e.g., "You are about to connect with John Doe. Continue?" Alternatively, the notification may be provided without requiring a user action when "auto-bump" 1701 is enabled, e.g., "You have just sent out a Friend request to John Doe."
[0094] In one embodiment, a user may configure social media information through the "bump" page, e.g., Figure 17A. For example, once a user enters credentials for one or more social media accounts, the wearable device may access the social media platform via an API so that the user may directly send social media requests (e.g., a Facebook friend request, etc.) within the user's social profile page (e.g., see FIG. 18B).
[0095] In one embodiment, a user may configure the information he/she wants to receive and/or transmit from a social network profile after bumping his/her wearable device with another user's device, e.g., 1702. For example, a user may want to provide only an invitation to join a Facebook group instead of a Facebook friend request; similarly, some users may prefer to send an invitation to follow their LinkedIn company profiles instead of requesting a connection to their LinkedIn personal profiles. The available social networks may include but may not be limited to Facebook, Twitter, LinkedIn, Instagram, Tumblr, YouTube, and/or the like platforms. Moreover, a user may configure a motion controlled social connection to exchange profiles and/or other information specified in the profile section 1712, which may be transmitted via an email account, a Short Message Service (SMS) message, or through a personal call or an automated call that a user may receive notifying social profile information after an "auto-bump" event.
[0096] A user may configure the "auto-bump" option through a UI where he/she may specify in a list 1755 which of the multiple sources and mediums should be disabled for the exchange of profiles after an "auto-bump" event call, through the configuration control 1703. Furthermore, a user may specify a list enabling specific mediums or social network platforms for the exchange of profile information after an "auto-bump" call, through the configuration control 1704. Additionally, whenever a social network or a type of communication is not specified on the disable list 1703 or the enable list 1704, it may be requested to be added for the exchange of profile information by any of the users exchanging profile information, even after the "auto-bump" event has been executed.
[0097] A user may specify the number of bumps required to execute the automated exchange of social profile information through an "auto-bump" call, through the configuration control 1705. In addition, a user may specify a timespan indicating the date since which the wearable device will gather data for executing an "auto-bump" call, through the configuration control 1707. In the example shown in FIG. 17B, the user has configured the wearable device such that if the user has "bumped" (e.g., as detected via the wearable device) another user more than three times 1708 this year 1709, the user's wearable device may automatically send a LinkedIn contact request (e.g., 1710) to the other user; however, the wearable device will not send or receive invitations to connect via Facebook, SMS, or phone calls, as they have been disabled for automatic connection 1711. A minimal evaluation of this rule is sketched below.
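This is a minimal sketch of evaluating the "auto-bump" rule from FIG. 17B: more than a threshold number of bumps within the configured timespan triggers connection requests on the enabled platforms only. The configuration field names and bump log structure are illustrative assumptions.

```python
# Minimal sketch of the "auto-bump" rule evaluation configured in FIG. 17B.

from datetime import datetime

AUTO_BUMP_CONFIG = {
    "min_bumps": 3,                             # control 1708
    "since": datetime(2016, 1, 1),              # control 1709 ("this year")
    "enabled": {"linkedin"},                    # controls 1710 / 1704
    "disabled": {"facebook", "sms", "phone"},   # controls 1711 / 1703
}

def auto_bump_actions(bump_log, config=AUTO_BUMP_CONFIG):
    """bump_log: list of (other_device_id, datetime) entries."""
    counts = {}
    for device_id, when in bump_log:
        if when >= config["since"]:
            counts[device_id] = counts.get(device_id, 0) + 1
    actions = []
    for device_id, count in counts.items():
        if count > config["min_bumps"]:
            for platform in config["enabled"] - config["disabled"]:
                actions.append((device_id, platform, "send_connection_request"))
    return actions

log = [("HWID-JOHN", datetime(2016, m, 1)) for m in (2, 3, 5, 7)]
print(auto_bump_actions(log))   # [('HWID-JOHN', 'linkedin', 'send_connection_request')]
```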
[0098] FIGURES 18A and 18B illustrate an example of logged information and further alternative actions after attending an enhanced event with wearable devices. In one embodiment, a user may be able to view, through a graphical user interface, information of an event he/she has attended, e.g., 1802. Moreover, the user may be able to view information from the attended event including but not limited to the event's date 1803, the event location 1804, a record of people with whom the user may have interacted during the event 1805, and the number of times the user interacted with another attendee 1806, e.g., the user has "bumped" into "John Smith" four times 1810. A user may view further information about an attendee 1801 with whom he/she had interacted, e.g., 1806, and such information may be used to branch out to specific attendees in a variety of ways including but not limited to social network relation requests 1807 and/or via other communication means, e.g., email, telephone or SMS 1808. Additionally, a user may opt to save the received profile in the wearable device's contact list by executing the "Confirm" command 1809.
Programming Motion Patterns
[0099] FIGURES 19A and 19B illustrate an example of the configuration and programming of customized motion patterns. A user may configure a wearable device to program and customize a motion pattern via a touch screen UI tethered with the wearable device (e.g., see 1102 in Fig. 11). For example, a user may program and customize a new motion pattern and/or override a preset motion pattern. In one embodiment, a user may program and customize a motion pattern for professional networking at conferences 1901 and similar events. Similarly, the user may program other motion patterns, for example, a programmed motion pattern to count the number of strokes a swimmer performs, calculate the distance per stroke swum by a swimmer, or vibrate and/or emit a sound when a swimmer's stroke does not correspond to an appropriate form 1902. Another illustrative example may be a user that has programmed a bowling motion pattern to provide haptic, textual, and/or sound feedback during and/or after delivering or rolling a bowling ball. Feedback may be associated with metrics including but not limited to acceleration, speed, and movement economy, and visual feedback including but not limited to recorded movement patterns displayed in a three dimensional space and/or the animation of recorded movements through a display device. It will be obvious to a person of ordinary skill in the art that the customization of motion patterns can be applied in many contexts wherein motion precision, training, and/or constant improvement of physical gestures may be desired, including but not limited to sports, manufacturing processes, performing arts, and the like fields and disciplines.
[00100] For instance, a user may provide a name to identify a motion pattern 1903 corresponding to a preset motion pattern, e.g., a bump 1904, or a motion pattern previously recorded by the user. A user may also configure the number of repetitions of the specified motion pattern that will have to be performed before an action is executed 1905. Similarly, the actions that may be executed after a motion pattern has been detected by the wearable device 1906 may be specified, for example, the exchange of social profile information, starting an audio recording 1908, starting movement recordings, and the like actions. Additionally, a user may want to be notified after the action or actions have been completed 1907.
Wearable Device Event and Conference Support
[00101] FIGURE 20 illustrates an example of an enhanced conference employing wearable devices. In one embodiment, a user wearing a wearable device 2003 may check in to a conference 2004 associated with a local area network 2001. A check-in request 2004 may comprise credentials authenticating the user of the wearable device 2003. Additionally, a second user may also send a check-in request 2005 through a wearable device 2002. Thereafter, the wearable device 2002 may instantiate a device query within communication range 2006 to identify other participants of the conference. Moreover, the wearable device may identify a close-range device 2003 and subsequently may send social profile information 2007 and may receive social profile information 2008 from the close-range device 2003.
[00102] FIGURE 21 illustrates an example of parallel output processes performed by a wearable device 2102. In one embodiment, a wearable device 2102 may receive an SMS message 2105 from a mobile phone tower 2101. Thereafter, the wearable device may start, or alternatively continue, a video streaming process 2106 with a first display device 2103 while sending the SMS message 2107 to be displayed on a second display device, e.g., 2104.
[00103] FIGURE 22 illustrates an example of a gesture indicating a first time tethering to a device. In one embodiment, the wearable device 2202 and the display device 2201 comprise low power consumption wireless communication mechanisms 2203, for example, Bluetooth Low Energy (Bluetooth LE). Bluetooth LE provides a lightweight link layer capable of providing ultra-low power idle mode operation, simple device discovery, and reliable point-to-multipoint data transfer with advanced power-save and secure encrypted connections. The device 2201 may remain in sleep mode most of the time and may only wake up when it receives a connection request through the Bluetooth LE mechanism 2203, thereby reducing power consumption. While various embodiments of the present invention are described using Bluetooth LE, the use of this communication mechanism is not intended to limit the present invention. On the contrary, various embodiments of the present invention may be implemented using other wireless mediums as well or instead.
[00104] In one embodiment, a user with a wearable device 2202 performs a motion pattern over a display device 2201. The motion pattern may be, for example, knocking on the display device more than one time, moving the display device horizontally from left to right, or vertically up and down, and/or the like motion patterns. Thereafter, the wearable device 2202 may recognize that a motion indication has been performed, e.g., 2211, and subsequently may analyze the motion pattern 2212 and determine that the motion pattern matches a pre-programmed motion pattern to command a wireless tethering request, e.g., 2204. Furthermore, the display device 2201 may send an acknowledgement 2205 via Bluetooth LE comprising a device name to the wearable device 2202. The acknowledgement 2205 may only be sent after the display device has received a tethering request, e.g., 2204. In a further embodiment, the wearable device 2202 may send a command to display a confirmation screen 2206 to the display device 2201. Additionally, the display device 2201 may display the confirmation screen 2206 to be viewed by the user of the wearable device 2202.
[00105] In one embodiment, the user with the wearable device 2202 may enter a personal identification number (PIN) 2207 into the device 2201, which may be displaying the confirmation screen 2206. Then, the PIN number and a device identification number 2208 may be sent to the wearable device 2202. Additionally, the wearable device 2202 may store the device identification number 2209, for example a media access control (MAC) address, in a local repository for future automatic recognition and/or tethering. In a further embodiment, every time the display device 2201 is woken up, for example, by pressing the button 2210, it may remain wirelessly tethered via Bluetooth LE 2203 to the wearable device 2202. Furthermore, a user may press and hold the button 2210 for a few seconds to untether the wearable device 2202 from the display device 2201 such that the display device 2201 can be tethered with another device.
[00106] FIGURES 23A and 23B illustrate an example of user interfaces for a first time tethering to a device and for communication mechanism settings. In one embodiment, a confirmation screen to tether a device via Bluetooth LE may be displayed to a user, e.g., as shown in Figure 23A. The confirmation screen may display the tethering mechanism that may be employed for the instant connection 2307. Furthermore, the confirmation screen may display the name of the device to which the wearable device may be attempting to tether, e.g., 2301. Additionally, the confirmation screen may comprise a text field 2302 to be employed by the user to enter a personal identification number as a way to confirm the tethering action.
[00107] In one embodiment, a user of a wearable device may access a screen to change the settings of a communication mechanism and to view the devices that are tethered through a communication mechanism, as shown in Figure 23B. In addition, a user may enable or disable a communication mechanism through a toggle control; for example, the toggle control 2303 enables or disables a Bluetooth LE communication mechanism. Similarly, a user may enable or disable an auto-connect mode employing another toggle control 2306. When the auto-connect setting is enabled, the wearable device may automatically connect to known devices. Moreover, a user of a wearable device may view the devices to which the wearable device is tethered; for example, a wearable device may be wirelessly tethered via Bluetooth LE to a display device 2304 and simultaneously be tethered to a television 2305.
[00108] FIGURE 24 illustrates an example logic flow of a device recognition (DR) component. A device recognition component comprised by a receiver device 2402 may execute instructions to retrieve a device identification number 2404 from a sender device, e.g., 2401. Thereafter, the receiver device 2402 may run a query 2405 on a receiver device repository 2403 to verify whether the received device identification has been previously recorded by the receiver device 2402. In addition, the receiver device 2402 may analyze the query results 2406. If the device identification has been previously recorded 2406, then an interaction counter corresponding to the number of times the receiver device has received the device identification may be increased by one unit, e.g., 2407, and thereafter the device recognition component may stop. Furthermore, if the device identification has not been previously recorded, e.g., 2406, the receiver device may evaluate whether a subsequent exchange of social profiles has been programmed to be executed, e.g., 2408. If an exchange of social profiles has been programmed to be executed, e.g., 2408, then an interaction counter is created and initialized to 1 unit, e.g., 2409; thereafter the receiver device 2402 may run a query on the receiver device repository 2403 to insert a record comprising the received device identification number, e.g., 2410, and subsequently the device recognition component may stop. Alternatively, when an exchange of social profiles has not been programmed, the device recognition component may stop and no further actions may be performed.
[00109] In an alternative embodiment, the interaction counter 2409 may be created and initialized and the query 2410 may be run regardless of whether or not a subsequent exchange of social profiles has been programmed to be executed. In such an embodiment, the interaction counter may count the number of encounters with a sender device regardless of whether or not an exchange of profiles has been previously executed. In addition, the device recognition component may count the interactions with a sender device, and after the number of interactions with a particular device has exceeded a predetermined threshold, the device recognition component may suggest to the user of the receiver device to exchange social profiles based on the number of interactions that have previously occurred with respect to the sender device 2401. This alternative is sketched below.
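This is a minimal sketch of the DR component's alternative embodiment, in which the interaction counter is kept for every encounter; the in-memory repository and the suggestion threshold are illustrative assumptions.

```python
# Minimal sketch of the device recognition (DR) component, following the
# alternative embodiment that always maintains the interaction counter.

class DeviceRecognition:
    def __init__(self, suggestion_threshold=3):
        self.repository = {}                 # device_id -> interaction count (2403)
        self.suggestion_threshold = suggestion_threshold

    def on_identifier_received(self, device_id: str) -> str | None:
        if device_id in self.repository:
            self.repository[device_id] += 1      # 2407: increment the counter
        else:
            self.repository[device_id] = 1       # 2409/2410: create and record
        if self.repository[device_id] > self.suggestion_threshold:
            # Suggest exchanging social profiles after repeated encounters.
            return f"Suggest exchanging social profiles with {device_id}"
        return None

dr = DeviceRecognition(suggestion_threshold=3)
for _ in range(4):
    suggestion = dr.on_identifier_received("sender-2401")
print(suggestion)   # Suggest exchanging social profiles with sender-2401
```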
Wearable Device Compatible Charging Sources
[00110] FIGURES 25A and 25B illustrate an example of magnetic charging connectors and logic flow for charging mode recognition. In one embodiment (shown in Figure 25A), a wearable device 2501 and a display device 2502 may comprise electrical connectors 2503 and 2504 respectively that can be attached together by magnetic force. Both devices 2501 and 2502 may comprise additional electrical connectors to connect to external DC and/or AC power supplies.
[00111] As shown in Figure 25B, a charging mode recognition software component comprised by a wearable device may determine which device or devices may be powered or charged at a given time when a wearable device 2501 is electrically and magnetically attached to a display device 2502. When one of the attached devices emits a charging indication or request 2505, the charging mode recognition component may determine whether the wearable device is connected to a power outlet or any other external power source 2506. When the wearable device is not connected to a power outlet or any other external power source, the wearable device 2501 may draw power from the power source comprised by the display device 2502, for example via software module 2508. Alternatively, if the wearable device is connected to a power outlet or any other external power source, the wearable device 2501 may charge the power source comprised by the display device 2502, for example via software module 2507. Furthermore, the charging mode recognition component may notify the user of the wearable device 2501 of the current charging mode and the charging status of each device, e.g., 2509.
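The charging mode decision could be sketched as follows. This is a minimal sketch; the returned fields and notification strings are illustrative assumptions.

```python
# Minimal sketch of charging mode recognition: the direction of power flow
# depends on whether the wearable is connected to an external power source.

def select_charging_mode(wearable_on_external_power: bool) -> dict:
    if wearable_on_external_power:
        # 2507: the wearable charges the display device's power source.
        return {"source": "wearable", "sink": "display",
                "notification": "Charging display device from wearable"}
    # 2508: the wearable draws power from the display device.
    return {"source": "display", "sink": "wearable",
            "notification": "Charging wearable from display device"}

print(select_charging_mode(True))
print(select_charging_mode(False))
```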
[00112] The user can also make use of a headset's 2510 electrical wire, which can be configured to serve as a charging cord to connect the wearable device to a wall outlet. Several charging configurations can be utilized, including but not limited to: outlet to headset; outlet to headset and headset to wearable device; outlet to headset, headset to display device, display device to wearable device; outlet to headset, headset to first display device, first display device to second display device, second display device to wearable device; display device to wearable device; and the like configurations. The aforementioned charging modes provide several advantages to a user: for example, users do not need to carry charging cables, more than one device can be charged using only one power outlet, and devices can be charged from one another without the need of a power outlet.
Wearable Device Compatible Apps
[00113] Another exemplary embodiment includes an integrated portal platform instantiated on a user mobile device (e.g., a Smartphone, a tablet computer, a laptop computer, etc.). The integrated portal platform may allow a user to access various client components on the user mobile device such as email, gaming applications, calendar applications, browser, social media portals, etc., through one portal platform. In this way, the user could access all desired portal apps (e.g., email, calendar, social media, gaming, etc.) instantly through one portal platform, instead of having to search, download and install a number of separate mobile apps on his/her mobile device (e.g., an email app, a Facebook app, a Google calendar app, etc.).
[00114] For example, a user may instantiate the integrated portal platform on his/her Smartphone to launch a client component, e.g., Facebook. If the user has not previously instantiated a Facebook component via the portal platform, the user may enter "Facebook" to initiate a search within the portal platform, which may return a list of search results. In one implementation, the list of search results provides a portal app connection to the Facebook portal app, instead of a URL link or a link to a downloadable mobile app; upon choosing the portal app connection to "Facebook," the user may launch a Facebook portal app within the portal platform directly, without downloading and/or installing a separate mobile app on his/her Smartphone. In another example, if the user has previously instantiated a Facebook component via the portal platform, the user may view an icon "Facebook" within the integrated portal platform UI (or a desktop icon of "Facebook") and may directly launch the "Facebook" portal component.
[00115] The portal platform may facilitate shifting the bulk of data processing and computation to a server, a cloud, or a remote system; e.g., the user mobile device may have reduced requirements for data processing capacity as the user mobile device may not need to download and install separate mobile applications. In one implementation, once the user mobile device is connected to the Internet, the user may access all portal components via the portal platform. In another implementation, a cached copy of the portal component may be saved on the user mobile device so that the user may access the cached copy of the portal app while offline, e.g., a user can still access his/her "Facebook" portal component to read his/her friend list while offline.
[00116] The portal platform allows a user to access the latest version of the portal components when there is an update (e.g., an updated "Facebook" component) without the user downloading or installing any updates, as the component update has been performed at the server/cloud level.
[00117] FIGURE 26 illustrates an example of a watch face app available for the wrist-shaped device. The wearable device's display can be implemented as a 9:16 vertical aspect ratio color display with a curved surface. The screen can have, for example, an overall size of 30 x 53.3 mm and a 320 ppi or higher resolution. Moreover, the display can include a touch panel supporting multi-touch triggers capable of handling manipulation with up to 3 fingers at a time and wet use support, that is, the display functionality is not constrained when a user manipulates the screen with wet fingers.
[00118] The wearable device can use sensors to deduce when the display should be on, for example on a "placed on wrist" event or a "wrist brought up to check the time" event. Screen off can occur upon timeout (if the screen is not touched for x time, then it turns off). In such a case, the wearable device can function without an ON/OFF button. A minimal sketch of this policy follows.
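This is a minimal sketch of the button-free screen policy: sensor events turn the display on, and it turns itself off after an idle timeout. The event names and the timeout value are illustrative assumptions.

```python
# Minimal sketch of the button-free screen policy: wake on sensor events,
# sleep after an idle timeout.

WAKE_EVENTS = {"placed_on_wrist", "wrist_raised", "screen_touched"}
IDLE_TIMEOUT_S = 10.0

class ScreenController:
    def __init__(self):
        self.on = False
        self.last_activity = 0.0

    def handle_event(self, event: str, now: float):
        if event in WAKE_EVENTS:
            self.on = True
            self.last_activity = now

    def tick(self, now: float):
        # Turn the screen off if nothing has happened for IDLE_TIMEOUT_S.
        if self.on and now - self.last_activity > IDLE_TIMEOUT_S:
            self.on = False

screen = ScreenController()
screen.handle_event("wrist_raised", now=0.0)
screen.tick(now=5.0)
print(screen.on)    # True, still within the timeout
screen.tick(now=15.0)
print(screen.on)    # False, timed out
```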
[00119] Similarly, a call to display a home screen can be achieved by gestures, avoiding the implementation of a physical home screen button. For example, a home screen function can be called by a three finger pinch gesture, for example, a simultaneous pinch with the thumb, index and middle fingers.
[00120] A watch face can be utilized as a home screen. A watch face can be defined through a set of apps (a user can select which "Face app" to use) that are designed to show time and date (e.g., 2602 and 2604) or any combination of status content the user may want to see, e.g., a compass with watch 2605, chronograph, athlete statistics and time 2603, weather centric 2601, and similar "Face apps". Face apps can combine the most interesting and most highly used watch configurations for a determined user. The display on the wearable device works as a canvas to redefine a watch face and deliver a targeted solution to the consumer.
[00121] FIGURE 27 illustrates an example of a use case for switching between apps that are most frequently used by a user. When a user is not interacting with the touchscreen interface of the wearable device, the screen can be configured to be a minimal display, showing only the home screen app, for example a "watch face" 2701; but when a user touches and holds the central focus point, the "watch face" can vanish and a menu with four menu options can be displayed 2702. While holding, the user can slide a finger towards a desired option represented by an icon 2703, and when the finger is released from the screen, the option will be activated.
[00122] After some interaction time with the wearable device, users can memorize the positions of, for example, four key apps and flick their finger in the direction of the desired app before any icons are displayed, creating a gesture (e.g., long touch, swipe to top right). It is important to note that although the figure shows only four alternative apps, any other number of apps or functions can be configured to be displayed. A possible configuration for the interactive menu can be to display the four most frequently used apps when a user touches and holds a finger on the screen; however, other types of configurations can be programmed, for example, displaying a fixed set of apps regardless of how frequently they are used, or displaying apps with a notification or the latest used apps. The interactive menu can similarly be used on other devices related to the wearable device such as a pocket screen, a tablet, a virtual control surface, and the like interfaces, as long as they have an enabled touchscreen.
[00123] FIGURE 28 illustrates an example of a use case for switching between apps featuring an apps grid. From the home screen 2801, a user can swipe a finger to one side of the wearable device's display 2802, to access an app grid displaying multiple app grid screens or icons 2803.
[00124] FIGURE 29 illustrates an example of screens to display a notification summary area. The applications installed in the wearable device can generate notifications triggered by events relevant to a user. The notifications can be configured to cause a "vibe notification," which can be perceived by a user as a subtle vibration generated within the wearable device. If the user looks at the screen, the notification can be viewed, and the user can tap the screen to accept an action or to view more information regarding the notification. A dismissal of a notification can be configured as a hand gesture indicating that the user is not looking at the screen any more. Face apps, for example a "watch face" 2901, support notification summary features 2902, which can appear as a preview in a toggle panel 2903. A user can drag down the toggle panel to read further details of the shown notifications.
[00125] FIGURE 30 illustrates an example of a use case to turn on and use the wearable device contact dash feature. The wearable device can have an area of pressure or pressure strip 3003 responsive to more than one touch sensor on one or more sides of the device; such an area can include a sensitive input device corresponding to the length of the screen 3004. The pressure area or strip can be a dedicated contact dash button, allowing the user to scroll and select a contact to interact with. A user can touch the side surface of the wearable device to bring up the contact dash 3001. The wearable device can transition out or vanish any current app to bring up the contact dash app. A user can scroll through contacts, magnify contacts, and select a contact to interact with by pressing harder on the pressure strip. A person of ordinary skill in the art will understand that the contact dash feature may be extended to items other than contacts, such as notes, calendar events, etc., that can be selected and engaged with via the pressure area or strip. In one embodiment, a first plurality of selectable items are displayed in response to user touch sliding in one direction along the pressure area or strip and a second plurality of selectable items in response to user touch sliding in the opposite direction. The communication options can be configured to fit multiple usage models, for example a walkie-talkie voice option, a draw/write option, conventional text messages, email, telephone conversation, and other available communication methods. In, for example, the draw/write option, a user can write a message with a finger on the display screen 3002; such a message will appear on the chosen contact's screen as it is displayed on the sender's screen, i.e., conveying the sender's handwriting style. A person of ordinary skill in the art will also understand that the contact dash feature, whether implemented to select contacts or other items such as notes, calendar events, etc., can be applied to devices other than wearable devices, such as tablet computers, laptops, smartphones, etc.
[00126] FIGURE 31 illustrates an example of a flexible wearable device with a rubber inner cover and glass exterior. The exterior of the wearable device can be protected by a bendable glass enabling a high degree of flexibility of the device's body. The interior of the wearable device can be protected with a hypoallergenic rubber providing traction to the wearable device such that the device can be held on the wrist in a fixed or semi-fixed position. The wearable device is usable on both the right and left hand. A user can configure the hand they will wear it on, and the user interface adjusts accordingly (flips) to keep the pressure strip on the correct side (the side facing towards the hand). The device can feature a clean design free of markings on the top surface except for any information displayed through the user interface, in which case the markings flip with the user interface to match the direction in which the user wears the device.
[00127] Additionally, the wearable device can be equipped with a multicore processor, for example, a quad-core processor; an internal memory, for example, 64GB or 128GB storage options; a 2.4" flexible capacitive touchscreen to implement the user interface with a resolution of, for example, 854x480 pixels; Wi-Fi and Bluetooth communication devices; a global positioning system (GPS); several sensors including but not limited to proximity sensors, an accelerometer, a gyroscope, and a compass; and a power source, for example a 1000 mAh battery. Other features include support for multiple communication standards, for example, long term evolution (LTE), global system for mobile communications (GSM), and 3G.
Wearable Device Peripherals
[00128] FIGURE 32 illustrates an example of a pocket size capacitive touchscreen display device enhanced with a camera and sensors. The pocket size capacitive touchscreen 3201 can include a 4.97" capacitive touchscreen, a 1280x720 pixel display resolution, an 8 megapixel rear-facing camera, a rear LED flash as shown in 3202, a 1.3 megapixel front-facing camera, an accelerometer, a gyroscope, a microphone, a loudspeaker, an in-ear speaker, a proximity sensor, and a 2,800 mAh battery.
[00129] FIGURE 33 illustrates an example of a tablet enhanced with a capacitive touchscreen, camera, and sensors. The tablet 3300 can include a 10" capacitive touchscreen, a 1920x1080 pixel resolution display, a 5 megapixel rear-facing camera, a 1.3 megapixel front-facing camera, an accelerometer, a gyroscope, a microphone, a speaker, and a power source, for example, a 7,000 mAh battery.
[00130] FIGURE 34 illustrates an example of a keyboard with a hinged metal latch to plug in a tablet. The keyboard can be coupled to a tablet, pocket size display, television, and other computer-based devices via Bluetooth. The keyboard (e.g., 3401 and 3402) can include a touchpad, with keys attached to the keyboard's body via plastic pieces that interlock in a "scissor"-like fashion, i.e., a scissor-switch keyboard. Additionally, the keyboard can feature a hinged metal latch to, for example, plug in a tablet device 3403.
[00131] FIGURE 35 illustrates an example of a Bluetooth enabled headset as a wearable necklace. The headset 3501 can be coupled wirelessly via Bluetooth technology to a wearable device or other computer-based device as presented herein, for example, a pocket size touchscreen, a tablet, and the like devices. The headset can include a microphone, a selection button, and a pair of earbuds. The headset can be worn as a necklace by snapping together the magnetic sides of the earbuds, forming a necklace pendant.
[00132] FIGURE 36 illustrates an example of a high definition multimedia interface (HDMI) dongle. The dongle can be utilized to connect a wearable device to other devices, for example, a television, via a Universal Serial Bus (USB). Such a connection allows a user to, for example, display information stored and processed in the wearable device on a coupled television. Additionally, the dongle can include a high definition multimedia interface output, a 1.3-megapixel camera, a microphone, and a 3.5mm audio output.
Application Program Interfaces and Integrated Development Environment
[00133] FIGURE 37 illustrates an example Integrated Development Environment for the generation of SDKs. The Integrated Development Environment (IDE) 3704 can be utilized to develop customized Software Development Kits (SDKs) to make consumer electronics, appliances, and other computer-based products compatible with a wearable device. The IDE can include a library with Hardware Development Kit (HDK) and Application Programming Interface (API) modules for users to utilize in the development of customized SDKs. Among other things, the HDK and API can provide access to a set of functionalities of the wearable device, tablet, pocket size touchscreen, and other devices presented herein. Additionally, the IDE can support a variety of compilers and programming languages to facilitate the development of supporting classes to simplify the access of the functionalities in the users' preferred programming language 3708 and their corresponding compilers 3712. Moreover, the IDE can include a built-in code editor featuring functionalities to support programmers' coding projects, such as code folding, window split views, and multiline find and replace, among other functions.
[00134] Once a customized SDK variant is produced 3718, users can develop applications that can be run in the wearable device operating system 3718 via a wearable device API 3716 to interact with third-party consumer electronics, appliances, and other computer-based devices. The developed third-party applications 3720 can be deployed and distributed to the general public via an App Market 3724. In this way, the IDE can support a third party vendor developing an HDK/SDK that can be instantiated on the wearable device. For example, some existing Smartphone manufacturers (e.g., Apple, Google, etc.) may provide SDKs for third party vendors to develop compatible Smartphone apps (e.g., iPhone apps, Android apps, etc.); in general, such SDKs provide APIs for a third party vendor to develop new apps that will be operated on a single device, e.g., the Apple iPhone, the Android phone, etc. The customized IDE shown in FIG. 37 provides an HDK and SDK combination that allows a third party vendor to develop extended hardware compatible with the wearable device, and software applications that can be operated on the wearable device to interface with and control the extended hardware.
[00135] FIGURE 38 illustrates an example motion pattern to indicate the storage of digital data in a hybrid storage cloud environment. In an exemplary use case, a user can perform a gesture 3802 to indicate to the wearable device to save the information displayed on the tablet 3808. Once the wearable device determines the performed gesture, it can send the data to be saved to a remote storage server 3810 in a save digital data request packet 3814. The remote storage server can then send an acknowledgment packet back to the wearable device 3816 confirming that the information was successfully saved or informing the user that a problem occurred while saving the information. In parallel to the remote saving action, the wearable device can additionally save the digital data in the wearable device's local storage, e.g., 3806, through the local storage (LS) component 3900. Further details with respect to the local storage (LS) component 3900 are described herein, particularly with respect to the example LS Component 3900 shown in Fig. 39.
[00136] FIGURE 39 illustrates an example digital data replacement/retention policy, Least-Recently Used (LRU). The wearable device has a finite capacity for internal storage, for example, 128GB. In order to mitigate this storage capacity limitation, the wearable device can take advantage of hybrid cloud storage with a retention/replacement policy chosen by the user. When a user saves digital data, the information is saved by default in the internal storage of the wearable device and in a remote repository. Eventually, when the internal storage has reached a user specified capacity, the wearable device can replace digital data according to a user defined replacement/retention policy.
[00137] A retention/replacement policy can be, for example, a Least-Recently Used (LRU) retention policy, where the least recently accessed digital data or file is chosen to be replaced when more internal storage capacity is desired. Alternatively, a user can choose other available retention policies. For example, in a Most-Recently Used (MRU) retention policy, the most recently used item is replaced first when more internal storage capacity is required; in a Random Replacement (RR) retention policy, data or files are randomly selected and removed to make memory space when necessary; in a Least Frequently Used (LFU) retention policy, the wearable device counts how often the digital data is accessed, and the digital data that is accessed least often is replaced first when more storage capacity is desired; and in an Adaptive Replacement retention policy, the hub balances between LRU and LFU. The retention/replacement policies can be called every time new information is attempted to be saved locally or can be executed periodically as a background maintenance procedure.
[00138] Additionally, users can create their own retention/replacement policies. For instance, a user can rank digital data in an order that determines the likelihood of being replaced, e.g., digital data ranked as 5 can be configured to be more likely to be replaced in the local storage than digital data ranked as 4, and so on. Another way to specify a policy is by associating file extensions with ranks. For example, a user can configure .docx files to be prioritized over .mp4 files for replacement/retention purposes. Additionally, the user can choose to save certain digital information locally only and mark it as "Not Replaceable" or to "Save in the Cloud Only." The retention/replacement policies can be defined and selected from an installed app in the wearable device utilizing the device's display or a secondary display device.
[00139] Fig. 39 shows an exemplary logic flow of how a retention/replacement policy works (e.g., Least-Recently Used (LRU)). When a user 3902 performs a Save File gesture 3906, a command call is performed by the wearable device 3904. Such a command call starts a routine that executes several processing steps. Among other steps, the wearable device verifies that there is enough space to store the information locally 3908. If there is enough space in the local storage, the information is saved locally 3928 and the user 3902 receives an acknowledgment indicating that the save operation was successful 3928. If, however, there is not enough local storage space to save the file, then the wearable device determines which retention/replacement policy is enabled in the device 3910.
[00140] As aforementioned, Least-Recently Used (LRU) is one of many retention/replacement policies that can be supported. The LRU policy defines a local variable $Min with a time value that is one unit ahead of the current time 3914. Thereafter, a loop process is initiated 3916. The steps in the loop process include checking the last accessed time of each of the files stored in the local storage 3918 and comparing that last accessed time to the time value stored in the variable $Min. When the last accessed time for a given file is earlier than the time value in $Min, a pointer to the file is stored in the variable $candidateFile, and the value of the $Min variable is changed to the last accessed time of that file 3920. The loop continues until all the files in the local storage are exhausted 3922. Once the loop has ended, the file referenced by the $candidateFile variable is removed from the local storage, and then the process starts again from step 3908. Other retention policies can be enabled instead of LRU and can similarly be considered at the time of saving information locally 3912.
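The loop described above can be transcribed almost directly. In the sketch below, $Min and $candidateFile become min_time and candidate_file, and the local storage is assumed (for illustration only) to be a simple mapping of file name to last-accessed timestamp:

```python
import time


def free_space_lru(local_files: dict[str, float]) -> str | None:
    """Remove and return the least recently accessed file name, or None if storage is empty."""
    min_time = time.time() + 1            # $Min: one unit ahead of the current time (3914)
    candidate_file = None                 # $candidateFile
    for name, last_accessed in local_files.items():   # loop over local storage (3916-3922)
        if last_accessed < min_time:      # earlier than the value held in $Min (3918)
            min_time = last_accessed      # update $Min and the candidate pointer (3920)
            candidate_file = name
    if candidate_file is not None:
        del local_files[candidate_file]   # remove the candidate once the loop ends
    return candidate_file
```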
[00141] FIGURE 40 illustrates an example of a wearable device working as a surrogate communication device. A user with a wearable device can perform a gesture to use the wearable device as a surrogate communication device 4002. Once the wearable device identifies the gesture, it establishes a connection with a communication device 4001 and sends an identifier utilization request 4003 which, if accepted, will allow the wearable device to communicate with other devices on behalf of the communication device 4001. The device 4001 can accept or decline the request and communicate an answer to the wearable device through an identifier utilization response 4004. If the response is positive, then the wearable device can communicate with, for example, a second communication device 4006 via text, email and/or voice call, e.g., 4005. Thereafter, the second communication device 4006 can receive information from the wearable device on behalf of the communication device 4001. The employment of the wearable device as a surrogate communication device can involve one or more exchanges of credentials and login information, including but not limited to user names, passwords, IP addresses, subscriber identity module numbers and/or identifiers.
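A hedged sketch of the identifier utilization handshake (4003/4004) and a subsequent surrogate message (4005) follows; the message format, the channel abstraction (anything with send/recv, e.g., a connected socket), and all field names are assumptions rather than part of the disclosure:

```python
import json


def request_identifier_use(channel, wearable_id: str) -> bool:
    """Send an identifier utilization request (4003) and parse the response (4004)."""
    channel.send(json.dumps({"type": "IDENTIFIER_UTILIZATION_REQUEST",
                             "requester": wearable_id}).encode())
    response = json.loads(channel.recv(1024).decode())
    return response.get("status") == "ACCEPTED"


def send_as_surrogate(channel, borrowed_identifier: str, recipient: str, body: str) -> None:
    """Send a message (4005) that carries the borrowed identifier as its sender."""
    channel.send(json.dumps({"type": "SURROGATE_MESSAGE",
                             "sender": borrowed_identifier,   # e.g., the lending device's SIM number
                             "to": recipient,
                             "body": body}).encode())
```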
CONCLUSION
[00142] While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
[00143] The above-described embodiments can be implemented in any of numerous ways. For example, embodiments of the wearable data processing and control platform apparatuses, methods and systems disclosed herein may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.
[00144] Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
[00145] Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.
[00146] Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN) or the Internet. Such networks may be based on any suitable technology, may operate according to any suitable protocol, and may include wireless networks, wired networks or fiber optic networks.
[00147] The various methods or processes (e.g., of designing and making the wearable apparatuses and systems disclosed above) outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
[00148] In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above.
[00149] The terms "program" or "software" are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
[00150] Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
[00151] Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationships between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationships between data elements.
[00152] Also, various inventive concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
[00153] All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
[00154] The indefinite articles "a" and "an," as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean "at least one."
[00155] The phrase "and/or," as used herein in the specification and in the claims, should be understood to mean "either or both" of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with "and/or" should be construed in the same fashion, i.e., "one or more" of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the "and/or" clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to "A and/or B", when used in conjunction with open-ended language such as "comprising" can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
[00156] As used herein in the specification and in the claims, "or" should be understood to have the same meaning as "and/or" as defined above. For example, when separating items in a list, "or" or "and/or" shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as "only one of" or "exactly one of," or, when used in the claims, "consisting of," will refer to the inclusion of exactly one element of a number or list of elements. In general, the term "or" as used herein shall only be interpreted as indicating exclusive alternatives (i.e., "one or the other but not both") when preceded by terms of exclusivity, such as "either," "one of," "only one of," or "exactly one of." "Consisting essentially of," when used in the claims, shall have its ordinary meaning as used in the field of patent law.
[00157] As used herein in the specification and in the claims, the phrase "at least one," in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase "at least one" refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, "at least one of A and B" (or, equivalently, "at least one of A or B," or, equivalently, "at least one of A and/or B") can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
[00158] In the claims, as well as in the specification above, all transitional phrases such as "comprising," "including," "carrying," "having," "containing," "involving," "holding," "composed of," and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases "consisting of" and "consisting essentially of" shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

Claims

1. A processor-implemented method for motion controlled device tethering, the method comprising:
establishing, via a wireless transceiver in a wearable personal mobile device, a first communication link with a first user interface output device and a second communication link with a second user interface output device;
determining a first type of the first user interface output device and a second type of the second user interface output device;
obtaining pre-stored privacy configuration parameters associated with the first type and the second type; and
sending first data content to the first user interface output device and second data content to the second user interface output device based on the pre-stored privacy configuration parameters.
2. A device power leveraging system, comprising:
a wearable device, including:
a wearable housing;
a first charging interface strip on a surface of the wearable housing; and
a first power supply unit, disposed within the wearable housing, to be recharged via the first charging interface strip;
a headset, including:
a pair of earphones, and
a conductive wire connecting the pair of earphones, the conductive wire being configured to serve as a charging cord; and
a user interface device, including:
a second charging interface strip on a surface of the user interface device, and
a second power supply unit, disposed within the user interface device, to be recharged via the second charging interface strip.
3. An electronic apparatus, comprising:
a housing;
a display, disposed on one side of the housing;
a plurality of touch sensors, disposed in a row along an edge of the display, to be touched by a user;
a processor, disposed within the housing, to execute processor-executable instructions to:
display a plurality of selectable items on the display;
receive a touch indication from one of the plurality of touch sensors;
determine that the touch indication is related to an item from the plurality of selectable items; and
engage the item as a user selected option from the plurality of selectable items.
4. The apparatus of claim 3, wherein the plurality of selectable items are displayed on the display along with the plurality of touch sensors, with each of the plurality of selectable items corresponding to one of the plurality of touch sensors.
5. A wireless communication method, comprising:
instantiating, at a first wireless communication device having a first mobile communication identifier, a remote communication component;
establishing, via the remote communication component, a connection between the first wireless communication device and a second wireless communication device having a second mobile communication identifier;
obtaining authorization from the second wireless communication device to use the second mobile communication identifier for communication;
generating, at the first wireless communication device via the remote communication component, a communication request to a mobile contact;
wherein the communication request is sent from the first wireless communication device, and has the second mobile communication identifier as a sender.
6. A memory storing a plurality of processor-executable instructions to provide an interaction interface having a plurality of interaction interface mechanisms, comprising:
a lock screen display view on a touch screen;
a first display view having a first plurality of icons of engageable items, the first plurality of icons being arranged in a first graphical representation,
wherein the first display view is triggered by a first motion gesture on the lock screen display view on the touch screen; and
a second display view having a second plurality of icons of engageable items, the second plurality of icons being arranged in a second graphical representation,
wherein the second display view is triggered by a second motion gesture on the lock screen display view on the touch screen.
7. A processor-readable non-transitory medium storing processor-executable instructions for providing interface components, the processor-executable instructions executable by a processor to:
instantiate an integrated development environment for interface component generation associated with an operating system running on a wearable device;
identify extended hardware to be engaged with the wearable device;
generate a hardware interface module to be instantiated with the operating system and interface with the extended hardware;
incorporate the hardware interface module with an application component such that the application component operates with the operating system to engage the extended hardware; and
provide the application component with the incorporated hardware interface module to a virtual marketplace.
8. The medium of claim 7, wherein the processor-executable instructions are obtained from a development kit downloaded from a server.
9. The medium of claim 7, wherein the processor-executable instructions are obtained by a third party developer.
10. The medium of claim 7, wherein the interface component generation includes hardware development kits.
11. The medium of claim 7, wherein the interface component generation includes software development kits.
12. A motion controlled storage management method, comprising:
receiving, via a motion sensor at a wearable device, a motion pattern;
determining that the motion pattern indicates a request to save a data file at a remote storage server;
sending a data storage request including the data file to the remote storage server; and
receiving a data storage acknowledgement message from the remote storage server.
13. The method of claim 12, further comprising: determining there is insufficient local memory space to store the data file at the wearable device.
14. The apparatus of claim 3, wherein said housing is a wearable housing.
15. The apparatus of claim 3, wherein the plurality of selectable items are selectable contacts for the user to interact with.
16. The apparatus of claim 3, wherein the plurality of selectable items are selectable notes.
17. The apparatus of claim 3, wherein the plurality of selectable items are selectable calendar items.
18. The apparatus of claim 3, wherein the processor displays a first plurality of selectable items in response to a user touch indication in one direction along said edge and a second plurality of selectable items in response to a user touch indication in an opposite direction along said edge.
PCT/IB2016/050161 2015-01-14 2016-01-14 Wearable data processing and control platform apparatuses, methods and systems WO2016113693A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562103548P 2015-01-14 2015-01-14
US62/103,548 2015-01-14

Publications (1)

Publication Number Publication Date
WO2016113693A1 true WO2016113693A1 (en) 2016-07-21

Family

ID=56405315

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/050161 WO2016113693A1 (en) 2015-01-14 2016-01-14 Wearable data processing and control platform apparatuses, methods and systems

Country Status (1)

Country Link
WO (1) WO2016113693A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090146947A1 (en) * 2007-12-07 2009-06-11 James Ng Universal wearable input and authentication device
US20100030695A1 (en) * 2008-02-08 2010-02-04 Microsoft Corporation Mobile device security using wearable security tokens
US20100245585A1 (en) * 2009-02-27 2010-09-30 Fisher Ronald Eugene Headset-Based Telecommunications Platform

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10187512B2 (en) 2016-09-27 2019-01-22 Apple Inc. Voice-to text mode based on ambient noise measurement
US10567569B2 (en) 2016-09-27 2020-02-18 Apple Inc. Dynamic prominence of voice-to-text mode selection
CN110178159A (en) * 2016-10-17 2019-08-27 沐择歌公司 Audio/video wearable computer system with integrated form projector
WO2019204188A1 (en) * 2018-04-19 2019-10-24 Carrier Corporation Biometric feedback for intrusion system control
US10629041B2 (en) 2018-04-19 2020-04-21 Carrier Corporation Biometric feedback for intrusion system control
CN109947501A (en) * 2019-03-14 2019-06-28 努比亚技术有限公司 Starting processing method, wearable device and the storage medium of application program
CN110134256A (en) * 2019-04-28 2019-08-16 努比亚技术有限公司 A kind of map operation method, wearable device and computer readable storage medium
CN110782889A (en) * 2019-08-22 2020-02-11 腾讯科技(深圳)有限公司 Voice operation method and related equipment
CN111104175A (en) * 2020-01-14 2020-05-05 杭州鸿雁智能科技有限公司 Configurable sharing device and related methods
CN113542483A (en) * 2020-03-30 2021-10-22 Oppo广东移动通信有限公司 Data transmission method and device, wearable device and storage medium
CN113542483B (en) * 2020-03-30 2022-03-18 Oppo广东移动通信有限公司 Data transmission method and device, wearable device and storage medium
CN111885463A (en) * 2020-07-09 2020-11-03 上海闻泰信息技术有限公司 Bluetooth sound box
CN111885463B (en) * 2020-07-09 2022-06-17 上海闻泰信息技术有限公司 Bluetooth sound box
CN114726842A (en) * 2020-12-22 2022-07-08 成都鼎桥通信技术有限公司 Data processing method, medium, and communication device
CN114726842B (en) * 2020-12-22 2024-02-13 成都鼎桥通信技术有限公司 Data processing method, medium, and communication device
CN114268980A (en) * 2021-12-24 2022-04-01 广东乐心医疗电子股份有限公司 Equipment binding method and device and electronic equipment
EP4276578A1 (en) * 2022-05-13 2023-11-15 Meta Platforms Technologies, LLC Biometric monitoring wrist device
CN115209514A (en) * 2022-09-16 2022-10-18 荣耀终端有限公司 Method for closing cellular communication function and related electronic equipment

Similar Documents

Publication Publication Date Title
WO2016113693A1 (en) Wearable data processing and control platform apparatuses, methods and systems
US20170055110A1 (en) Systems, apparatus, and methods relating to a wearable electronic hub for personal computing
US10627902B2 (en) Devices, methods, and graphical user interfaces for a wearable electronic ring computing device
US10719119B2 (en) Mobile terminal and method for controlling the same
US9973685B2 (en) Mobile terminal and method of controlling the same
CN105404412B (en) Portable terminal and control method thereof
US20160260086A1 (en) Mobile terminal and method for controlling the same
US10241613B2 (en) Mobile terminal and method for controlling the same
US10395233B2 (en) Mobile terminal and method for controlling the same
KR102414356B1 (en) Electronic device and Method for providing a haptic feedback of the same
US11870922B2 (en) Mobile terminal and electronic device having mobile terminal
US20190028579A1 (en) Mobile terminal and control method therefor
US20150362999A1 (en) Mobile terminal and controlling method thereof
US9635163B2 (en) Mobile terminal and controlling method thereof
EP3306902A1 (en) Mobile terminal
JP2016201090A (en) Mobile terminal and method for controlling the same
US10579260B2 (en) Mobile terminal having display screen and communication system thereof for unlocking connected devices using an operation pattern
KR20170079549A (en) Mobile terminal and method for controlling the same
EP2731369A1 (en) Mobile terminal and control method thereof
US9757651B2 (en) Electronic device and method of processing user input by electronic device
KR102135378B1 (en) Mobile terminal and method for controlling the same
WO2015193736A2 (en) Systems, apparatus, and methods relating to a wearable electronic hub for personal computing
EP2961141B1 (en) Mobile terminal and method for controlling the same
KR20170025270A (en) Mobile terminal and method for controlling the same
KR101688167B1 (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16737149

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16737149

Country of ref document: EP

Kind code of ref document: A1