WO2017218363A1 - Method and system for selecting IoT devices using sequential point and nudge gestures - Google Patents


Info

Publication number
WO2017218363A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
iot device
gesture
nudge
iot
Prior art date
Application number
PCT/US2017/036891
Other languages
English (en)
Inventor
James ROBARTS
Erik V. Chmelar
Original Assignee
Pcms Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pcms Holdings, Inc.
Publication of WO2017218363A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • a person may hold or wear a finger-manipulated remote-control device (e.g., smartphone or smartwatch), which would require the person's eyes to guide their selection from a list or graphic of a home's lights or other Internet of Things (IoT) devices.
  • Pointing is an innate human gesture, primarily done by use of the forearm to indicate a desired pointing direction. Recently developed forearm sensor packages may be employed to determine where a person points.
  • An IoT device user interface (UI), one which provides the ability to select devices residing in the real world (or virtual objects and tasks displayed on a monitor) by pointing at them while in motion (e.g., walking), is technically feasible.
  • Another challenge is how a user can select a particular device when it is in a crowd of devices. Crowding may be caused by devices being close to each other, or appearing to be close to each other from a particular point of view (POV).
  • Although device location and orientation technology continues to improve, resolution between multiple distant devices remains problematic. This is a fundamental issue based on a user's perspective with respect to remote devices; even if devices are spatially separated, they may appear to overlap from certain vantage points. Increasing the accuracy and precision of the location and orientation technology may not solve the issue.
  • Described herein are systems and methods related to enabling a user to select an IoT device by a "point" and "nudge" process.
  • a system and method of selecting an IoT device among a plurality of IoT devices beyond arm's reach from a user, comprising: detecting the pointing of a user's voluntarily controllable body part in the general direction of a plurality of IoT devices; determining a first IoT device among the plurality of IoT devices most closely associated with a direction of pointing; sending to the first IoT device information for causing the first IoT device to indicate selection of the first IoT device to the user; and responsive to detecting a predefined nudge gesture indicated by the user's voluntarily controllable body part: determining a direction of the predefined nudge gesture; determining a second IoT device located in the determined direction of the predefined nudge gesture; sending, to the first IoT device, information for causing the first IoT device to cease indicating selection; and sending, to the second IoT device, information for causing the second IoT device to indicate selection of the second IoT device to the user.
  • FIG. 1A is a schematic front view depicting an exemplary layout of a plurality of selectable devices in a home environment.
  • FIGS. 1B and 1C depict exemplary layouts of overlapping selection regions for the plurality of selectable devices of FIG. 1A.
  • FIGS. 1D and 1E depict exemplary layouts of small and/or irregularly shaped selection regions.
  • FIG. 1F depicts an exemplary layout of selection regions, with a desired selection region and a system selected region noted.
  • FIG. 2 is a plan view illustrating a room having various smart devices and/or sensors therein, and an exemplary user interaction according to the systems and methods herein.
  • FIG. 3A illustrates an exemplary user's POV 300 of an IoT device cluster.
  • FIG. 3B illustrates an exemplary embodiment of the axial reference frame (e.g., X, Y) that may be utilized by the disclosed systems and methods.
  • FIG. 3C illustrates an exemplary embodiment of an axial reference frame (e.g., X, Y, Z) that may be utilized by the disclosed systems and methods, in some embodiments, as applied to the device cluster of FIG. 3A.
  • FIGS. 3D and 3E illustrate exemplary top-down views of the device cluster of FIG. 3A, depicting the relative terminology of the interaction between the user and the device cluster.
  • FIG. 4A illustrates exemplary selection regions around the location (e.g., XY or XYZ) of individual IoT devices.
  • FIG. 4B illustrates an exemplary scenario wherein a user point indicates an interaction area.
  • FIG. 4C illustrates an exemplary top-down view of the interaction area of FIG. 4B, wherein IoT devices are points that may fall within the interaction region.
  • FIG. 5 is a flow diagram of an embodiment of a method for selection of a device.
  • FIG. 6 is a graph showing data ranges for a body motion over a particular time period associated with a specific nudge gesture.
  • FIG. 7 is a graph depicting an exemplary stage in an adjacency scenario for a cluster of IoT devices in an origin-based axial locational system, wherein a first device has been selected by a point.
  • FIG. 8 is a graph depicting an exemplary stage in an adjacency scenario for a cluster of IoT devices in an origin-based axial locational system, wherein a second device has been selected by a nudge.
  • FIGS. 9A and 9B are a first half and a second half, respectively, of a sequence chart for exemplary systems and methods.
  • FIG. 10 is a sequence chart for other exemplary disclosed systems and methods.
  • FIG. 11 depicts an exemplary embodiment of a visual signal indicating a user's pointing direction.
  • FIG. 12 illustrates an exemplary depiction comparing capturable motion of a hand and a wrist.
  • FIG. 13 depicts an exemplary magnitude analysis as in FIG. 6, adapted for forearm capturable data.
  • FIG. 14 depicts an exemplary magnitude analysis as in FIG. 6, adapted for additional nudge directionality.
  • FIG. 15 illustrates an alternative embodiment of methods for selecting a device, wherein a nudge gesture is characterized as a "reach" gesture for selecting devices relative to the Z axis rather than the X and Y axes.
  • FIGS. 16A and 16B illustrate stages in a "reach" gesture selection process.
  • FIGS. 17A and 17B illustrate an exemplary embodiment wherein a user may set an interaction range or radius.
  • FIGS. 18A and 18B illustrate a reach selection process in relation to a point expansion embodiment.
  • FIGS. 19A and 19B are a first half and a second half, respectively, of an exemplary sequence chart for a reach gesture process.
  • FIGS. 20A-20D illustrate exemplary hand motion based gestures.
  • FIGS. 21A-21F illustrate an exemplary "hop over" hand motion based gesture.
  • FIGS. 22A-22D illustrate an exemplary "reach around" hand motion based gesture.
  • FIG. 23 illustrates an exemplary wireless transmit/receive unit (WTRU) that may be used as an IoT device or hub in some embodiments.
  • FIG. 24 illustrates an exemplary network entity that may be used as an IoT device or hub in some embodiments.
  • Modules may include hardware (e.g., one or more processors, one or more microprocessors, one or more microcontrollers, one or more microchips, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), one or more memory devices) deemed suitable by those of skill in the relevant art for a given implementation.
  • Each described module may also include instructions executable for carrying out the one or more functions described as being carried out by the respective module, and those instructions may take the form of or include hardware (hardwired) instructions, firmware instructions, software instructions, and/or the like, and may be stored in any suitable non-transitory computer-readable medium or media, such as media commonly referred to as RAM and ROM.
  • One challenge is how a user may select a particular device by pointing when it is in a crowd of devices. Crowding may be caused by devices being close to each other, or appearing to be close to each other from a particular POV.
  • a device A (a thermostat) 141 may be located behind one or more devices. As shown, thermostat A may be located behind wall monitor B 142, and also behind and in line with desk monitor C 143. Other devices in an environment may include a computer 144, speakers 145, and a lamp 146. Each device may have an X, Y, Z device location which may be expanded to "selection regions" to facilitate pointing selection, but may produce overlapping regions, as shown in FIGS. IB and 1C.
  • FIG. 1B shows an environment 150 with selection regions for a thermostat 151, wall monitor 152, desk monitor 153, computer 154, speakers 155, and a lamp 156.
  • FIG. 1C shows environment 160 with overlapping selection regions 161.
  • FIG. 1D shows one configuration of an environment 170 of irregular-shaped selection regions with a desired selection region 171.
  • FIG. 1E shows another configuration of an environment 175 of irregular-shaped selection regions with a desired selection region 176.
  • FIG. 1F shows an environment 180 in which the left selection region 181 is selected, but the desired selection region 182 is the region on the right.
  • the user may be provided with information sufficient to determine that a system has not selected the intended device, and the user may be provided a way to correct any incorrect selection by a system.
  • systems and methods may provide a "point and nudge" interface for a user to interact with an IoT device.
  • a user's body position and/or orientation may be analyzed to select an IoT device.
  • the user may "point” at a desired IoT device.
  • at least one IoT device may be configured to provide a visual signal indicating selection of the at least one device by a system in response to a user's "point.”
  • the user may select an alternative device (e.g., nearby or adjacent device, or the like) using a separate gesture. In effect, the user may "nudge" an incorrect device selection to the correct device.
  • Such systems may be advantageous because users are able to select precisely one device from among many devices using gestures with a variety of common characteristics. Such a paradigm may work even if a user is in motion and a user is no longer pointing at a device. Such a method does not rely on sound (hence others may not be disturbed). Also, such a method does not impede pointing but allows pointing and nudging (or reaching) without a user interface modality change and permits pointing and range adjustment to be performed simultaneously. Additionally, users may find such a methodology easy to use and understand, and such a system may use common location and motion sensors to determine gestures.
  • nudge examples in this disclosure use body motion, though any system-monitored body part may be employed as long as the motion may be conducted in at least two different (e.g. two approximately orthogonal) directions, or using two (or more) body parts with independent movement.
  • a nudge gesture may comprise a "reach" gesture which relates to selection of devices closer to or further away from a user.
  • One embodiment disclosed herein may comprise a point response, a device visual signal, a nudge detection, and a nudge response.
  • a point response may comprise detecting that a user is pointing to (and thereby selecting) a device and sending a signal to the selected device.
  • a device visual signal may comprise modulating a light (or other visual indicator) on the device.
  • a nudge detection may comprise a system interpreting a user's body motion as an instruction to select an adjacent (or nearby) device (e.g., left, right, up, down, backward, or forward).
  • a nudge response may comprise a system selecting an adjacent (or nearby) device in response to a detected nudge, which may cause a different device to be selected and to display a visual indication.
  • In some embodiments, a user may use a body part to indicate direction, such as a forearm, the eyes, the nose, head movements, and/or the like.
  • the pointing direction is maintained for a minimum time duration to distinguish from a non-pointing motion.
  • One embodiment comprises using DecaWave DW1000 to determine device spatial locations and MUV Bird to determine user spatial orientation and user gestures.
  • Other embodiments may use additional methods as known to one of ordinary skill in the art.
  • FIG. 2 illustrates an exemplary room 200 having various smart devices and/or sensors therein, and an exemplary user interaction according to systems and methods disclosed herein.
  • a plurality of sensors in a room may include low precision distance detectors 202, high precision distance detectors 204, and direction and motion detectors (or pointing detectors) 206.
  • Low and high precision distance type sensors 202, 204 (which may also include body worn sensors) may be incorporated into electrified networked devices, such as light fixtures, wireless communication hubs, wall switches and outlets, and IoT devices.
  • Low precision distance detectors 202 may include environmental networked sensors (transducer and transceiver) capable of detecting user distances from a device within a few meters. Examples of low precision distance sensors may include, but are not limited to: infrared sensors, powerline fingerprinting, WiFi fingerprinting, acoustic trilateration, cameras, and others.
  • High precision distance detectors 204 may include body-worn or environmental networked remote sensors capable of detecting distances within 10 cm. Examples of high precision distance sensors may include, but are not limited to, Google Soli near-field radar and the DecaWave DW1000 ultra-wideband transceiver.
  • Direction and motion detectors (or pointing detectors) 206 may include environmental and body-worn networked sensors capable of determining a user's location, pointing direction, and nudge gesture.
  • pointing direction sensors may include, but are not limited to, Microsoft Wrist Band, MUV Bird, "Magic Ring,” PointSwitch, and AirTouch.
  • user location sensors may include the DecaWave DW1000 ultra-wideband transceiver.
  • nudge gesture sensors may include, but are not limited to, Myo Gesture Control Armband and singlecue.
  • Exemplary techniques of detecting user gestures include techniques disclosed in US Patent Application No. 2015/0312398A1, Chinese Patent Application No. CN104635505A, US Patent Application No. 2015/0346834A1, US Patent Application No. 2015/0145653A1, and Chinese Patent Application No. CN104660420A.
  • FIGS. 3A-3C illustrate an exemplary IoT device cluster.
  • FIG. 3A illustrates an exemplary user's POV 300 of an IoT device cluster.
  • An IoT cluster may include a thermostat 301, a desk monitor 302, a wall monitor 303, speakers 304, a computer 305, and a lamp 306.
  • FIG. 3B illustrates an exemplary embodiment 310 of the axial reference frame (e.g., X, Y) that may be utilized by the disclosed system and method, as applied to the device cluster of FIG. 3A.
  • The x axis may represent a horizontal component and the y axis a vertical component (for example, in degrees relative to a frame of reference shared by the user and devices, from the user's current POV), and the origin may be the user's current pointing direction 312.
  • FIG. 3C illustrates an exemplary embodiment 320 of an axial reference frame (e.g., X, Y, Z) that may be utilized by the disclosed systems and methods, in some embodiments, as applied to the device cluster of FIG. 3A.
  • the X and Y axes of FIG. 3B are supplemented with a Z axis to permit focus on the proximity of a target to the user (e.g., closer or further away).
  • FIG. 3C also shows a user's current pointing direction 322.
  • the term Z axis is used to signify the current distance between the user (e.g., the hand or proximal end of the forearm) and their intended pointing target (not the same as the Z value of the device's absolute location).
  • the term Interaction Range indicates limitations on the Z axis.
  • FIG. 3D shows a plan-view schematic of a room 330 for a user 338 with a pointing direction 332 for a desired device 334 and an interaction area 336.
  • FIG. 3E shows distance parameters related to a user 352 pointing in a direction 342 at an interaction area 344 that is a radius 350 away from the user 352.
  • the interaction area 344 for this example has a depth 346 and a width 348.
  • FIG. 4A shows an embodiment of a room 400 where individual IoT devices may have selection regions 406 around XY or XYZ points (or other locational data element), which may overlap with or impede selection of another device or that device's selection region. Gestures of a user 408 may be construed as indicating a straight line of selection along a pointing direction 402, which may be towards a desired device 404.
  • FIGS. 4B and 4C show other embodiments 410, 420 where devices may be treated as single XY or XYZ points (or other locational data element), and a user 424 gestures in a pointing direction 412 that may be interpreted as covering an expanded area to form an interaction area 414, 422 within which a device is located (shown as a user's perspective view in FIG. 4B and as a plan view in FIG. 4C).
  • systems and methods may work well with either device or pointing expansion. In some embodiments, systems and methods may also work when no expansion of either device or pointing is used. However, in some cases, without expansion the system may be difficult for some users to use unless devices are close and/or large and uncrowded.
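As a rough illustration of the pointing-expansion approach described above, the following Python sketch checks which devices fall within an interaction area defined by the user's pointing direction and a spread angle. The function and variable names, the 2D plan-view representation, and the example spread value are illustrative assumptions, not details taken from this description.

```python
import math

def devices_in_interaction_area(user_xy, pointing_deg, devices, spread_deg=20.0):
    """Return devices whose bearing from the user falls within +/- spread_deg
    of the pointing direction (a 'pointing expansion' cone), nearest-aligned first."""
    ux, uy = user_xy
    candidates = []
    for dev_id, (dx, dy) in devices.items():
        bearing = math.degrees(math.atan2(dy - uy, dx - ux))
        # Smallest signed angular difference between the bearing and the pointing direction.
        offset = (bearing - pointing_deg + 180.0) % 360.0 - 180.0
        if abs(offset) <= spread_deg:
            candidates.append((dev_id, offset))
    return sorted(candidates, key=lambda item: abs(item[1]))

# Example: user at the origin pointing at 90 degrees (straight ahead in plan view).
devices = {"thermostat": (0.2, 3.0), "lamp": (2.5, 2.5), "speaker": (-3.0, 0.5)}
print(devices_in_interaction_area((0.0, 0.0), 90.0, devices))  # only the thermostat qualifies
```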
  • FIG. 5 is a flowchart 500 of one embodiment for selecting a device.
  • a system may perform user body monitoring 502.
  • User body monitoring 502 may include tracking X, Y, Z, and orientation coordinates of a user's body, such as by environment sensors in a room.
  • a system may also obtain IoT device locations in X, Y, Z coordinates 504 (e.g., locations of devices A, B, and C).
  • a system may detect a "point" gesture, or attempt to detect such a gesture 506.
  • a system may determine whether a captured gesture matches a predetermined "point" pattern. In some embodiments, this may include analyzing the body orientation and the duration of the gesture. If there is no match with a pattern, the system may return to a monitoring state 502. If there is a match, the system may analyze the location of the gesture and the orientation to determine if they align with a selection region 508. For example, the system may determine whether any devices are within a region indicated by the gesture. If the gesture does not align with a selection area, the system may return to a monitoring state 502.
  • the system may determine a first device in the selection region that most closely aligns to the detected gesture.
  • the system may send to the first device a selection command 510.
  • the first device may receive the selection command and produce a visual signal to indicate the selection by the system 512.
  • the selected device may not produce a visual signal.
  • the system may detect any deviations in the user's body orientation and/or position, and determine whether such deviations match one or more predetermined patterns indicating a next action 514. For example, has the user's relevant body part moved over time according to a predetermined gesture pattern? In some instances, a deviation may not match a predetermined pattern. In such cases, the system may either return to a monitoring state, default to a set command, and/or the like. In some embodiments, if the system returns to a monitoring state, it may send a command to the first device to deactivate the visual signal.
  • a detected body orientation deviation may match a predetermined pattern for a particular command.
  • the command may be generic (e.g., on/off), while in other embodiments the command may be device specific (e.g., dim light, volume adjust).
  • the matched command may be sent in a message to the first device, which may carry out the command.
  • the first device may or may not produce a visual signal to confirm the command.
  • exemplary commands may include, but not be limited to, <Enter>, <Toggle On/Off>, <Down Arrow>, <Send Identity Credentials>, and/or the like.
  • the detected body orientation deviation may match a predetermined pattern for a "nudge" action. From the matched pattern, the system may determine a user POV and relative X, Y directionality of the detected body orientation deviation. From the user POV and relative directionality, the system may determine an adjacent (or nearby) second device 516, and repeat the visual signal and monitor for orientation deviation with the second device. In some embodiments, the determination may attempt to characterize which device appears adjacent (or near) from the user's POV in the direction of the "nudge" gesture. Generally, this sequence may be characterized as a "point” detection 506, 508, a "point” response 510, a "nudge” detection 514, and a "nudge” response 516.
  • a system may send a deselection command to the previously selected device to deactivate the visual signal being produced at that device.
  • a control gesture made by a user may be detected. If such a control gesture is detected, a system may send to a selected IoT device information for causing a selected IoT device to initiate an interface with a user.
  • a control gesture may comprise a predefined gesture of a user's hands or fingers.
  • Nudge detection may be performed various ways in different embodiments.
  • nudge detection may include a magnitude test, such that the detection may ignore movements that are too small or too large.
  • nudge detection may include a direction detection, to determine whether the motion of the gesture has a directionality (e.g., left, right, up, down, and combinations thereof).
  • FIG. 6 depicts an exemplary graph 600 showing data ranges for a body motion over a particular time period associated with a specific nudge gesture. In various embodiments, a variety of algorithms may implement these or similar criteria.
  • Magnitude Tests are illustrated as the dashed circles 602, 604.
  • nudge gestures may only be detectable/matchable if they fall within a specific range of deviation.
  • a deviation may be evaluated relative to the vertical and/or horizontal degrees of deviation in 0.5 seconds, with a minimum motion deviation of 15 degrees in 0.5 seconds and a maximum motion deviation of 30 degrees in 0.5 seconds.
  • detectable gestures may be restricted to a narrow range of optimal values (e.g., between circles 602 and 604).
  • the maximum value may not be increased, and the minimum may not be decreased, by more than 20% before increases in false positives (e.g., system incorrectly interprets user motions as nudges) significantly degrade system performance.
  • Direction detection may use thresholds, such as the hashed regions 606, 608, 610, 612.
  • the hashed regions 606, 608, 610, 612 are regions of ±20° from the respective axis. In one embodiment, this falls within a range of optimal values for direction detection. In some instances, a greater than 50% increase in the detection regions may generate more false positives and a decrease in false negatives. In some embodiments, the user may be provided with the ability to adjust the default thresholds.
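The magnitude and direction tests above could be combined in a detector along the following lines. This is a minimal Python sketch: the 15 to 30 degree magnitude window and the ±20° axis tolerance come from the example values given above, while the function structure and names are assumptions made for illustration.

```python
import math

MIN_DEVIATION_DEG = 15.0   # minimum motion deviation over 0.5 s (example value above)
MAX_DEVIATION_DEG = 30.0   # maximum motion deviation over 0.5 s (example value above)
AXIS_TOLERANCE_DEG = 20.0  # +/- window around each axis (example value above)

def classify_nudge(dx_deg, dy_deg):
    """Classify a body-motion deviation (horizontal, vertical degrees over 0.5 s)
    as 'left', 'right', 'up', 'down', or None if it fails the tests."""
    magnitude = math.hypot(dx_deg, dy_deg)
    # Magnitude test: ignore motions that are too small or too large.
    if not (MIN_DEVIATION_DEG <= magnitude <= MAX_DEVIATION_DEG):
        return None
    # Direction test: the motion must lie within the tolerance of one axis.
    angle = math.degrees(math.atan2(dy_deg, dx_deg))  # 0 = right, 90 = up
    for direction, axis_angle in (("right", 0.0), ("up", 90.0),
                                  ("left", 180.0), ("down", -90.0)):
        diff = (angle - axis_angle + 180.0) % 360.0 - 180.0
        if abs(diff) <= AXIS_TOLERANCE_DEG:
            return direction
    return None

print(classify_nudge(22.0, 3.0))   # 'right': in range and near the X axis
print(classify_nudge(5.0, 4.0))    # None: below the minimum deviation
print(classify_nudge(18.0, 18.0))  # None: diagonal, outside the +/-20 degree windows
```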
  • In some embodiments, two relative positional aspects between the selected device and the potentially adjacent (or near) devices are combined in direction-specific adjacency rating formulas:
  • the first aspect is satisfied by determining which device, in the indicated direction, has the shortest Euclidean distance (e.g., apparently close in the XY plane from the user's POV), computed by taking the root of the sum of the squares of the axis values (e.g., √(x² + y²)).
  • The second aspect is incorporated by direction-specific logic that gives weight to the direction (e.g., the axis value used in the calculation) in which the user nudged. In one embodiment, this may comprise increasing the square to a cube of the location value of the axis that is orthogonal to the indicated direction, as shown in Table 2.
  • The lowest value generated by the weighted calculation (e.g., √(x² + y³) for a horizontal nudge) identifies the adjacent device; that is, the device with the lowest weighted distance may be chosen. There are numerous techniques to perform this weighting, with a primary feature of the adjacency analysis being that it includes both aspects.
  • With Device E selected, Device E becomes the origin of the nudge response analysis in selection graph 800, as shown in FIG. 8. Device F is still the goal, so the user may nudge again; this time, the user may nudge in the up direction.
  • Device F is selected by an Up nudge.
  • In this case, the disclosed method selects a device other than the closest (by Euclidean distance) in the indicated direction. If, however, a user instead wished to select device D at the second stage, still as shown in FIG. 8, the user may perform a Left nudge instead of an Up nudge. As shown in Table 5, the disclosed method would result in the selection of Device D, which is the closest by both the Euclidean distance and the weighted distance.
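A sketch of the direction-specific adjacency rating described above is shown below, assuming device offsets are expressed relative to the currently selected device in the user's POV frame. The use of absolute values before cubing (so the radicand stays non-negative), the restriction of candidates to devices lying in the nudged direction, and all names are assumptions for illustration.

```python
import math

def adjacency_rating(offset_x, offset_y, nudge_direction):
    """Euclidean distance with the axis orthogonal to the nudge cubed.
    Absolute values are used here (an assumption) so a negative offset
    cannot produce a negative radicand."""
    ax, ay = abs(offset_x), abs(offset_y)
    if nudge_direction in ("left", "right"):
        return math.sqrt(ax ** 2 + ay ** 3)   # penalize vertical offset
    return math.sqrt(ax ** 3 + ay ** 2)       # "up"/"down": penalize horizontal offset

def pick_adjacent_device(candidates, nudge_direction):
    """candidates: {device_id: (x, y)} offsets from the currently selected device,
    in the user's POV frame. Only devices lying in the nudged direction compete."""
    sign = {"right": (1, 0), "left": (-1, 0), "up": (0, 1), "down": (0, -1)}[nudge_direction]
    in_direction = {
        dev: (x, y) for dev, (x, y) in candidates.items()
        if (sign[0] == 0 or x * sign[0] > 0) and (sign[1] == 0 or y * sign[1] > 0)
    }
    if not in_direction:
        return None
    return min(in_direction, key=lambda d: adjacency_rating(*in_direction[d], nudge_direction))

# Example: device D is 2 units left and slightly up, device F is 4 units left but
# well above; a Left nudge favors D because F's vertical offset is cubed.
print(pick_adjacent_device({"D": (-2.0, 0.5), "F": (-4.0, 3.0)}, "left"))  # 'D'
```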
  • FIGS. 9A and 9B depict an exemplary embodiment of a message sequence diagram 900 for the disclosed systems and methods using nudges.
  • the locations may alternatively be (X, Y, Z) or the like (see also the discussion related to FIGS. 19A and 19B below).
  • device locations 904, point detection 906, point response 908, nudge detection 910, and nudge response 912 may be implemented with software modules.
  • Device Location Indications 1, 2, and 3 are sent from a plurality of IoT devices (e.g., IoT devices 1, 2, n) 914, 916, 918 to a device locations module 904.
  • Device Location Indications 1, 2, and 3 may have message content indicating the particular device's location (e.g., Device 1 location, Device 2 location, ... , Device n location).
  • a single device location data package may be defined as (X, Y) (or similar) in a frame of reference relative to the user.
  • the location may be determined by the device (such as by using RF triangulation, or other known methods), or by environmental sensors separate from the selectable device (e.g., cameras or viewing devices).
  • devices 2 through n may be adjacent to (or nearby) device 1.
  • One or more user body sensors 902 may capture information related to a current user location and/or a current user pointing location, which may be communicated from the body sensor(s) to the system (e.g., a point detection module) via a User Location & User Pointing Location message 926.
  • a User Location & User Pointing Location message 926 may comprise a stream of user sensor data.
  • the data may include a current user location of at least one part of the human body. In some embodiments, this may be defined as a single absolute (X, Y) point relative to devices. In other embodiments, alternative relational location systems may be utilized.
  • the data may also include a current user pointing direction of the part of the body being used for pointing. In some embodiments, this data may be a second (X, Y) point, or in others it may be a directional vector with the user's location (or some offset therefrom) as origin.
  • a Local Device Locations and IDs message 928 may communicate message content of local device locations and IDs from the locations module 904 to the point detection module 906.
  • the user may configure which devices are selectable.
  • the selection of which devices to relay information regarding may involve filtering based on proximity, and/or the like.
  • the point detection module 906 may communicate to the point response 908 the IDs of the device(s) at which the module has determined the user is "pointing" 930.
  • This communication may include a specific "pointed at” device, as well as adjacent and/or nearby devices.
  • this data may include identifiers and offsets from the current pointing direction of devices appearing in a similar direction from the user's POV.
  • the point response module 908 may receive the ID of the "pointed at” device and optionally other nearby/adjacent devices.
  • the point response module 908 may communicate a "Select" command 932 to the "pointed at” device (e.g., IoT device 1 (914)).
  • the "Select” term is used to describe the system-wide function of the message 932.
  • the message operates to cause the selected device to provide a visual signal to the user (such as by modulating a light or otherwise identify itself) so the user may verify the device at which the system has determined the user has pointed.
  • This message 932 alternatively may be labeled a "Signal Request.”
  • the selected device here IoT Device 1 (914), may provide a visual signal to the user that the device currently is selected by the system.
  • the system may transition (e.g., by sending an Indication of Switch to Nudge Detection 934) to a nudge detection modality, i.e., switch the gesture recognition modality from a selecting-any-device-by-pointing mode to a select-adjacent-device-by-nudging mode, or the like.
  • these modes may be mutually exclusive. In some embodiments, they may not be mutually exclusive.
  • the transition may include communicating the relevant device locations and/or IDs to the nudge detection module 910.
  • the nudge detection module 910 may request, such as via a User Sensor Data Request 936, user sensor data to detect a "nudge.”
  • various types of position and/or motion sensors may be used, depending on what body part the user moves to indicate a "nudge.”
  • the absolute location of the user is not required, but rather only the relative motion of the particular body part (e.g., movement of hand relative to a reference hand position, without needing to determine where user is standing in room).
  • the user body sensors may communicate data via a User Sensor Data Response 938 to the nudge detection module 910.
  • the communicated data may relate to current user gestures, and in some embodiments, the communicated data may comprise a stream of user sensor data, such as from the part of the body presently generating nudges.
  • the direction of the nudge may be communicated to a nudge response module 912 via a Direction of Nudge Indication 940.
  • the direction may be one of: up, down, left, right.
  • directions may further include intermediate directions, such as up/left, up/right, down/left, down/right.
  • the nudge response module 912 may communicate a "Deselect" command 942 to device 1 (914).
  • this response may be a "Turn off Signal Request” or similar.
  • the de-select command 942 may prompt a selected device (here, IoT device 1 (914)) to deactivate whatever visual signal indicates selection by the system (e.g., stop modulating light).
  • a message from the Nudge Response module to the Device Locations module may be sent requesting provision of relevant device locations for the adjacency calculation(s).
  • the nudge response module 912 may determine the appropriate device in light of the nudge direction, and send a "Select" Command 944 to said device (here, device 2 (916)). As before, this may cause device 2 (916) to provide a visual indication of current selection by the system.
  • a nudge detection process 956 may be repeated. In some embodiments, this repeat may comprise, e.g., a 5 second loop (or other appropriate time for said loop) or may be interruptible by recognition of some other gesture, or the like. In some instances, the nudge response process 956 may repeat, with device 2 (916) having its visual signal deactivated and a device n (918) being selected.
  • a nudge response process 956 may include a User Sensor Data Request 946, a User Sensor Data Response 948, a Direction of Nudge Indication 950, a De-Select Command 952, and a Select Command 954.
  • In some instances, there may be no nudge process (e.g., the correct device is selected by the pointing gesture). In some embodiments, a single nudge process may be required to select the correct device. In some embodiments, two or more nudge processes may be required to select the correct device. In some embodiments, additional or fewer data elements may be communicated with messages. In some embodiments, modules may pass pointers to information rather than discrete data elements, and/or the like.
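For illustration, the select/deselect messaging of the point and nudge responses could be organized roughly as in the following sketch. The class name and the callable used as a transport are assumptions; only the "Select"/"Deselect" (or "Signal Request"/"Turn off Signal Request") semantics come from the sequence described above.

```python
class SelectionController:
    """Minimal sketch of the point/nudge response messaging from FIGS. 9A-9B.
    send_command stands in for whatever transport the system uses (assumption)."""

    def __init__(self, send_command):
        self.send_command = send_command   # callable(device_id, command)
        self.selected = None

    def on_point(self, device_id):
        """'Point' response: ask the pointed-at device to signal selection."""
        self.selected = device_id
        self.send_command(device_id, "Select")          # a.k.a. "Signal Request"

    def on_nudge(self, next_device_id):
        """'Nudge' response: move the visual signal to the adjacent device."""
        if self.selected is not None:
            self.send_command(self.selected, "Deselect")  # stop modulating light
        self.selected = next_device_id
        self.send_command(next_device_id, "Select")

# Usage: print commands instead of sending them over a network.
controller = SelectionController(lambda dev, cmd: print(f"{cmd} -> {dev}"))
controller.on_point("IoT-1")   # Select -> IoT-1
controller.on_nudge("IoT-2")   # Deselect -> IoT-1, then Select -> IoT-2
```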
  • Alternative embodiments are shown in the message sequence diagram 1000 of FIG. 10.
  • the method shown in FIG. 10 is described in relation to a home automation or IoT hub, rather than distinct software modules (e.g., a hub may include modules as described in relation to FIGS. 9A and 9B).
  • IoT devices 1 and 2 (1006, 1008) may communicate 1010, 1012 their device IDs and/or location data to the hub 1004.
  • devices 1006, 1008 may communicate their locations, while in some other embodiments a hub 1004 may determine their locations from their IDs, or the locations may have previously been stored at a hub.
  • one or more body location and/or orientation sensors 1002 may communicate 1014 to the hub 1004, with data such as user location and/or orientation data.
  • the hub 1004 may use the received data to determine the user's POV and the relative positions of IoT devices 1016.
  • User body sensors 1002 may capture user body motion and communicate 1018 such data to the hub 1004.
  • the hub 1004 may perform one or more actions with a processing of body motion measurements 1020, including but not limited to: determining whether captured body motion was a point gesture; determining a relative point vector for a point gesture; identifying selectable devices within the user's POV relative to a point gesture; identifying a first intended device; and/or the like.
  • the hub 1004 may notify a selected device 1022, such as IoT Device 1 (1006), which may activate a visual indicator or signal (or other indication of selection) 1024.
  • a user may make a body motion (e.g., a nudge) or other gesture 1026.
  • the hub 1004 may perform one or more actions with a processing of body motion measurements 1028, including but not limited to: determining whether a nudge gesture was made; determining a nudge direction; identifying a second intended device; and/or the like.
  • the hub 1004 may notify the first device, IoT device 1 (1006), of its de-selection 1030, and IoT device 1 (1006) may de-activate its visual indicator or other indication of selection 1032.
  • the hub 1004 also may notify a second selected device, IoT device 2 (1008), of its selection 1034, and IoT device 2 (1008) may activate its visual indicator or other indication of selection 1036.
  • Other embodiments may perform these steps in another order, may skip steps, and/or may perform additional steps.
  • a user may know which direction they are pointing by having a selected device provide a visual signal. However, this may be problematic if the user may not easily point directly toward a device (e.g., the device is far away), or if they do not know (precisely or generally) where the selectable objects are located. In some embodiments, it may therefore be advantageous to have all devices within a range of the user's pointing direction provide a visual signal.
  • FIG. 11 illustrates one exemplary embodiment 1100 of such functionality.
  • In some embodiments, each device within a predetermined range (e.g., about 10 degrees of deflection, about 15 degrees, about 20 degrees, or other ranges) may provide a constant or brief (or similar) low illumination (or other visual signal) if a pointing direction 1102 of a pointing gesture is within a "detection cone" 1104.
  • all devices within an interaction area may provide a "cone" signal while the selected device provides a selection signal.
  • Some embodiments may use nudge gestures based on lower arm data, which may be collected with wearable wrist/arm bands and sleeves.
  • Advantages of such nudge gestures may include: sufficiency and reliability to detect an innate pointing gesture; sufficiency and reliability to detect nudging in opposite directions for X and Y; sufficiency to detect many hand gestures; ability to suppress small hand and finger motions; and ability to incorporate a small, wearable wrist band.
  • Nudge detection using forearm data may use the same logic discussed above, except that a motion may be reversed and/or attenuated.
  • a motion may be reversed for forearm data because during a nudge gesture, for unconscious mechanical reasons, a "wrist motion" (such as motion of the lowest part of the forearm) is the attenuated reverse of hand motion.
  • FIG. 12 shows an exemplary depiction 1200 comparing capturable motion of a hand and a wrist. As shown, there is captured motion of the hand and the wrist, both related to an attempt to "nudge" towards a device at region 1202. A nudge-to-right wrist motion 1204 points in the opposite direction of a nudge-to-right hand motion 1208 about a nudge-to-right gesture origin.
  • FIG. 13 shows an exemplary embodiment 1300 of a graph of directional nudge zones, such as a nudge right zone 1302 or a nudge up zone 1304.
  • the magnitude of ranges may be significantly decreased for vertical movements compared to horizontal movements because an arm may have greater rigidity in the vertical axis than in the horizontal axis.
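The following is a minimal sketch of forearm-based nudge interpretation, following the description above that wrist motion is the attenuated reverse of hand motion and that vertical ranges may be smaller than horizontal ones. The specific gain factor, the vertical threshold values, and all names are assumptions for illustration only.

```python
# Illustrative mapping from wrist/forearm motion to an intended hand nudge.
WRIST_TO_HAND_GAIN = -3.0          # reverse the direction and undo attenuation (assumed gain)
H_RANGE = (15.0, 30.0)             # horizontal min/max deviation, degrees per 0.5 s
V_RANGE = (8.0, 16.0)              # vertical min/max deviation (assumed smaller range)

def nudge_from_wrist(wrist_dx_deg, wrist_dy_deg):
    """Map a wrist deviation to an equivalent hand nudge direction, or None."""
    hand_dx = wrist_dx_deg * WRIST_TO_HAND_GAIN
    hand_dy = wrist_dy_deg * WRIST_TO_HAND_GAIN
    if abs(hand_dx) >= abs(hand_dy):
        lo, hi = H_RANGE
        if lo <= abs(hand_dx) <= hi:
            return "right" if hand_dx > 0 else "left"
    else:
        lo, hi = V_RANGE
        if lo <= abs(hand_dy) <= hi:
            return "up" if hand_dy > 0 else "down"
    return None

print(nudge_from_wrist(-7.0, 1.0))  # wrist moved left -> interpreted as a nudge to the right
```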
  • a system may work with any other visual signal, such as (for appropriate devices) having it perform a physical action (e.g., move a part).
  • methods and systems set forth above may be used with haptic feedback instead of or in addition to visual feedback. For example, a user's wristband may vibrate when devices are selected.
  • FIG. 14 is an embodiment 1400 of nudge zones with multiple associated directions. For this embodiment, both vertical degrees of deviation 1402 and horizontal degrees of deviation 1408 are in 0.5 second increments. FIG. 14 also shows minimum motion boundaries 1412 and maximum motion boundaries 1414. In some embodiments, providing an increased number of directions in which a user may indicate a nudge may be advantageous for some systems and methods. In some instances, the number of available nudge directions may be selectable. For example, it may be desirable to limit users to up/down and left/right nudges 1406 until a user has gained familiarity with the system and method in practice. Additional options may be available for a user familiar with the system.
  • Some embodiments may be configured to enable nudge directions such as Nudge Up & Right 1404, Up & Left, Down & Right 1410, and Down & Left. In some embodiments, some or all of these additional directions may be enabled, or further additional nudge directions may be implemented, in line with this disclosure.
  • Some embodiments may use additional directional tests that are conceptually similar to those already discussed above.
  • a system may determine, for a body motion, a range of deviation from a reference position over a predetermined time period.
  • nudge gestures may be characterized as a "reach" gesture for selecting devices relative to the Z axis rather than the X and Y axes (discussed above).
  • a particular user gesture may be a "reach” gesture (e.g., indication of selection relative to Z axis), rather than an XY nudge gesture associated with FIG. 5.
  • a "reach” gesture may be a subset of "nudge" gestures.
  • User Body Monitoring 1502 may be performed by a sensor, while sending device location messages 1506 and production of a visual signal 1512 may be done by IoT devices. If a system detects a "point" gesture as matching a point pattern 1504, a system may attempt to determine a location and orientation for a selection region 1508 based on device locations 1506. A system may detect deviations in a user's body orientation and/or position, and determine whether such deviations match 1504 one or more predetermined patterns indicating a next action (e.g., a reach gesture). For example, a system may determine if a user's relevant body part moved over time according to a predetermined gesture pattern. In some instances, a deviation may not match a predetermined pattern.
  • a next action e.g., a reach gesture
  • the system may either return to a monitoring state 1502, default to a set command, and/or the like. In some embodiments, if a system returns to a monitoring state 1502, the system may send a command to the first device to deactivate the visual signal.
  • a detected body orientation deviation may match a predetermined pattern for a particular command.
  • the command may be generic (e.g., on/off), while in other embodiments the command may be device specific (e.g., dim light or adjust volume).
  • a matched command may be sent 1510 in a message to the first device, which may process and perform the command.
  • the first device may produce 1512 or may not produce a visual signal to confirm the command.
  • exemplary commands may include, but not be limited to, <Enter>, <Toggle On/Off>, <Down Arrow>, <Send Identity Credentials>, and/or the like.
  • determining if body orientation deviations match a pattern 1514 may be performed by a "Reach" Detection module.
  • the detected body orientation deviation may match a predetermined pattern for a "reach" action. From the matched pattern, the system may determine a user POV and relative Z directionality of the detected body orientation deviation. From the user POV and relative directionality, the system may determine an adjacent (or nearby) second device, and repeat the visual signal and monitor for orientation deviation with the second device. In some embodiments, the determination may attempt to characterize which device appears adjacent (or near) from the user's POV in the direction of the "reach" gesture. For one embodiment, reception of z-direction user point of view (POV) and determining an adjacent device 1516 may be performed by a predetermined pattern for a "reach" action. From the matched pattern, the system may determine a user POV and relative Z directionality of the detected body orientation deviation. From the user POV and relative directionality, the system may determine an adjacent (or nearby) second device, and repeat the visual signal and monitor for orientation deviation with the second device. In some embodiments, the determination may attempt to characterize which device appears adjacent (or near)
  • this sequence may be characterized as a "point" response, a "reach” detection, and a “reach” response.
  • the system may send a deselection command to the previously selected device, so as to deactivate the visual signal being produced at that device.
  • The disclosed reach gesture examples typically focus on selecting devices further away than the one currently selected.
  • the disclosed reach gesture is not intended to be limited as such, and may work in reverse to allow selection of nearer devices as well.
  • specific gestures that may be used as the reach gesture may be reversed to provide the nearer reach gesture.
  • the general reach gesture process from the user's perspective may comprise: monitoring the user's body position and orientation for indications of a desire to select an IoT device that is further away (or closer, in some cases); comparing the pointing direction to available IoT device locations; and selecting the nearest device in the indicated direction. In some instances, the user may repeat the process until the desired device is selected.
  • a reach gesture process may be a method of resolving multiple devices in the same pointing direction with selection of the closest device.
  • a "reach” may be a "nudge" along the Z axis, rather than in the XY plane.
  • the user may make a point gesture in an attempt to select the thermostat (device A). The system may or may not properly select the device.
  • a system has selected computer E 1602, which is located between the user 1606 and the desired thermostat (device A) 1604. If the selected device E provides indication to the user of its selection, the user may respond by making another gesture, in this case a "reach past" gesture, to cause the system to select an alternative device (in this case, lamp G, which is behind the computer E), as shown in the plan view of a room 1650 in FIG. 16B. As this selected device 1652 is still not the desired device 1654, the user 1656 may perform a further "reach" gesture to cause the system to again select another device, in this instance likely the correct device in thermostat A.
  • a further "reach" gesture to cause the system to again select another device, in this instance likely the correct device in thermostat A.
  • a user may utilize a "set interaction range" gesture 1706. For example, if a user is satisfied (e.g., via signal from the selected device 1704) that he or she has selected the desired device 1702, the user may lock the relative range from the user to the device as a minimum range 1758 for selection with the system. Accordingly, the system may not select devices 1756 inside the minimum range if a user attempts to select a device 1754 (e.g., system selects only devices 1752, 1754 outside the range, even if other devices are closer and in line with the user's point). While discussed in the context of reach gestures, interaction ranges may be used in XY plane embodiments as well.
  • The reach process illustrated in FIGS. 18A and 18B is discussed in relation to a point expansion embodiment below.
  • the desk of devices scenario 1800 is shown for an interaction area 1802 with a pointing expansion of ⁇ 20° from a pointing direction of a user 1804 (see FIG. 4C).
  • a user 1854 is pointing toward an interaction area where only one device 1852 is selectable.
  • a default spread may be ⁇ 5°.
  • the x dimension (relative to user POV) of the spread is referred to below as the Interaction Width.
  • device E would provide a visual signal that device E has been selected. This calculation, similar to the disclosed point and nudge embodiments discussed previously, may combine two user-based real-world aspects in its device selection calculations: user proximity to the selected device and device proximity to the Z axis in the direction the user has pointed.
  • this may be done by modifying the Euclidean distance formula to emphasize the Z axis (the system selects the device with the lowest calculated value):
  • a formula may generally track the following pseudocode: FOR each device in the pointing direction, compute Adjacency Rating = √(x² + y² + z³); then select the device with the lowest rating.
  • a formula may be implemented such that distance values may be calculated for devices beyond a Z distance that is some fraction of the minimum interaction radius.
  • the minimum Z distance may be 95% (or some other value) of the minimum interaction radius.
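The reach selection calculation described above might be sketched as follows. The z-emphasized rating follows the x² + y² + z³ form given above, and the 0.95 fraction follows the 95% example; the use of absolute values, the function names, and the example coordinates are assumptions.

```python
import math

def reach_rating(x, y, z):
    """Reach adjacency rating emphasizing the Z axis (lowest value wins).
    Absolute value is taken before cubing (an assumption) so a negative
    offset cannot make the radicand negative."""
    return math.sqrt(x ** 2 + y ** 2 + abs(z) ** 3)

def select_by_reach(devices, min_interaction_radius, z_fraction=0.95):
    """devices: {device_id: (x, y, z)} in the user's POV frame, with z the
    distance along the pointing direction. Only devices beyond z_fraction of
    the minimum interaction radius are rated (0.95 per the example above)."""
    z_floor = z_fraction * min_interaction_radius
    eligible = {dev: pos for dev, pos in devices.items() if pos[2] >= z_floor}
    if not eligible:
        return None
    return min(eligible, key=lambda dev: reach_rating(*eligible[dev]))

# Example: with a 2 m minimum interaction radius, the device at 1 m depth is
# skipped and the better-aligned device beyond the radius is selected.
devices = {"E": (0.1, 0.0, 1.0), "G": (0.3, 0.2, 2.4), "A": (0.8, 0.1, 3.0)}
print(select_by_reach(devices, min_interaction_radius=2.0))  # 'G'
```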
  • If a user points towards device C, as in FIG. 18B, device F is not selected. In some cases, device F may be selected if a "closer" (e.g., reach closer) gesture is performed by the user.
  • FIGS. 19A and 19B depict an exemplary message sequence diagram for a reach gesture process 1900, comparable to FIGS. 9A and 9B (e.g., replace nudge modules with reach modules).
  • Device Locations 1904, "Point” Detection 1906, “Point” Response 1908, "Reach” Detection 1910, and “Reach” Response 1912 may be implemented with software modules.
  • Device Location Indications 1920, 1922, 1924 may be sent from a plurality of IoT devices (e.g., IoT devices 1, 2, ... , n) 1914, 1916, 1918 to a device locations module 1904.
  • Device Location Indications 1920, 1922, 1924 may have message content indicating the particular device's location (e.g., Device 1 location, Device 2 location, ... , Device n location).
  • a single device location data package may be defined as (X, Y, Z) (or the like), in a frame of reference relative to the user.
  • the location may be determined by the device, or by environmental sensors separate from the selectable device.
  • devices 2 (1916) through n (1918) may be adjacent to (or nearby) device 1 (1914).
  • One or more user body sensors may capture information related to a current user location and/or a current user pointing location, which may be communicated from the body sensor(s) 1902 to the system (e.g., a point detection module 1906), as a User Location & User Pointing Location message 1926.
  • a User Location & User Pointing Location message 1926 may comprise a stream of user sensor data.
  • the data may include a current user location of at least one part of the human body.
  • this user location may be defined as a single absolute (X, Y, Z) point relative to devices 1914, 1916, 1918.
  • alternative relational location systems may be utilized.
  • the data may also include a current user pointing direction of the part of the body being used for pointing. In some embodiments, this may be a second (X, Y, Z) point, or in others it may be a directional vector with the user's location (or some offset therefrom) as origin.
  • a Local Device Locations and IDs message 1928 may communicate message content of local device locations and IDs from the locations module 1904 to the point detection module 1906.
  • the user may configure which devices are selectable.
  • the selection of which devices to relay information regarding may involve filtering based on proximity, and/or the like (e.g., in point expansion embodiments).
  • the point detection module 1906 may communicate the IDs of the device(s) at which the module has determined the user is "pointing" 1930. This communication may include a specific "pointed at” device, as well as adjacent and/or nearby devices. In some embodiments, this data may include identifiers and offsets from the current pointing direction of devices appearing in a similar direction from the user's POV.
  • the point response module 1908 may receive the ID of the "pointed at" device and possibly other nearby/adjacent devices.
  • the point response module may communicate a "Select" command to the "pointed at” device (e.g., device 1 in FIGS. 19A and 19B).
  • the "Select” term is used to describe the system-wide function of the message.
  • the Select command is used to cause the selected device to provide a visual signal to the user (such as by modulating a light or otherwise identify itself) so the user may verify the device at which the system has determined the user has pointed. This message may therefore alternatively be labeled "Signal Request," or the like.
  • the selected device here Device 1 (1914), may provide a visual signal to the user that said device 1914 is currently selected by the system.
  • the system may transition (e.g., via an Indication of Switch to Reach Detection Indication 1934) to a Reach detection modality (e.g., switch gesture recognition modality) from a selecting-any-device-by-pointing mode, to a select-adjacent-device-by-reaching mode, or the like.
  • these modes may be mutually exclusive. In some embodiments, they may not be mutually exclusive.
  • the transition may include communicating the relevant device locations and/or IDs to the Reach detection module 1910.
  • the reach detection module 1910 may request user sensor data 1936 to detect a "reach.”
  • various types of position and/or motion sensors (or the like) may be used, depending on what body part the user moves to indicate a "reach.”
  • a User Body Sensor 1902 may respond with a user sensor data response 1938.
  • the absolute location of the user is not required, but rather only relative motion of a particular body part (e.g., movement of hand relative to a reference hand position, without needing to determine where a user is standing in a room).
  • the communicated data may relate to current user gestures, and in some embodiments, the data may comprise a stream of user sensor data, such as from the part of the body presently generating reaches. If a reach is detected, the direction of the reach may be communicated to a reach response module 1912 via a Direction of Reach Indication 1940. In some embodiments, the direction may be further away from the user or closer to the user.
  • the reach response module may communicate a "De-select" command 1942 to device 1 (1914). In some embodiments, this communication may alternatively be a "Turn off Signal Request" or the like. Generally, the de-select command 1942 may prompt a selected device (here, device 1 (1914)) to deactivate whatever visual signal is being provided to indicate selection by the system (e.g., stop modulating light).
  • a message from the Reach Response module 1912 to the Device Locations module 1904 may be sent requesting provision of relevant device locations for the adjacency calculation(s).
  • the reach response module 1912 may determine the appropriate device in light of the reach direction, and send a "Select" Command 1944 to said device (here, device 2 (1916)). As before, this message may cause device 2 (1916) to provide a visual indication of current selection by the system.
  • the reach detection process 1956 may be repeated.
  • this repeat may comprise, e.g., a 5 second loop expiring (or other appropriate time for said loop), may be interruptible by recognition of some other gesture, or the like.
  • the reach detection process may repeat by a reach detection module sending/receiving a user sensor data request 1946 and response 1948 and sending a direction of reach indication 1950.
  • Device 2 (1916) may have its visual signal deactivated (via a De-Select Command 1952) and a next device n (1918) may be selected (via a Select Command 1954).
  • there may be no reach process (e.g., when the correct device was selected at the pointing stage).
  • a single reach process may be required to select the correct device.
  • more than two reach processes may be required to select the correct device.
  • modules may pass pointers to information rather than discrete data elements, and/or the like.
  • In an exemplary use case, a person lives in a house full of smart devices and may wear an IoT wearable device.
  • Each smart device communicates its location to a system, which calculates a distance and direction of each object relative to the person's current location. If the person is wearing an IoT wearable device, their location is that of that device, e.g., a smart watch. If the user is not wearing an IoT wearable device, the user's location may be that of their right hand, for example.
  • the smart device may be selected and may indicate that it has system focus by turning on a light or making another visual change. If the user is satisfied with the selection, the user may send the system a command to interact with the selected device, such as by a predefined gesture.
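  • As one illustrative sketch of the distance-and-direction calculation mentioned in this scenario (assuming a shared two-dimensional location frame; all names and values are hypothetical):

```python
# Illustrative sketch only: distance and direction of each smart device relative
# to the user's current location (e.g., a worn smart watch). The 2-D coordinate
# convention and all names and values are assumptions for this example.
import math


def distance_and_bearing(user_xy, device_xy):
    """Return (distance in meters, bearing in degrees) from the user to the device.

    Bearing is measured counter-clockwise from the +x axis of the shared
    location frame of reference.
    """
    dx = device_xy[0] - user_xy[0]
    dy = device_xy[1] - user_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx)) % 360.0


user_location = (1.0, 1.0)  # e.g., reported location of the wearable (or right hand)
devices = {"lamp": (4.0, 1.0), "thermostat": (1.0, 5.0)}
for name, location in devices.items():
    dist, bearing = distance_and_bearing(user_location, location)
    print(f"{name}: {dist:.1f} m at {bearing:.0f} degrees")
```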
  • nudge, reach, and other gestures may be implemented in a variety of ways.
  • Exemplary depictions of hand-motion-based gestures are provided in FIGS. 20A-20D.
  • FIG. 20A depicts an exemplary "point" gesture 2000 performed with a user's hand.
  • FIG. 20B depicts exemplary "reach past" gestures 2010, 2012 by a user (e.g., for selecting devices behind (2010) or in front of (2012) the currently selected device).
  • FIG. 20C depicts exemplary left or right "nudge" gestures 2020 by a user (e.g., for selecting a device to the left or right of the currently selected device).
  • FIG. 20D depicts exemplary up or down "nudge" gestures 2030 by a user (e.g., for selecting a device up from or lower than the currently selected device).
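  • As a minimal illustrative sketch of how a left/right/up/down nudge (FIGS. 20C-20D) might map to an adjacent device from the user's point of view (the angle convention, names, and nearest-neighbor rule are hypothetical assumptions, not taken from the disclosure):

```python
# Illustrative sketch only: mapping a left/right/up/down nudge to an adjacent
# device, given each device's direction from the user's point of view. The
# angle convention (azimuth increases to the user's right, elevation upward),
# names, and nearest-neighbor rule are assumptions for this example.
def next_device_for_nudge(devices, current, nudge):
    """devices maps device id -> (azimuth_deg, elevation_deg) from the user's POV.

    nudge is "left", "right", "up", or "down". Returns the id of the nearest
    device lying in that direction from the currently selected device, or None.
    """
    az0, el0 = devices[current]

    def offset(device_id):
        az, el = devices[device_id]
        return az - az0, el - el0

    tests = {
        "left": lambda d_az, d_el: d_az < 0,
        "right": lambda d_az, d_el: d_az > 0,
        "up": lambda d_az, d_el: d_el > 0,
        "down": lambda d_az, d_el: d_el < 0,
    }
    candidates = [d for d in devices
                  if d != current and tests[nudge](*offset(d))]
    if not candidates:
        return None
    # Choose the candidate angularly closest to the current selection.
    return min(candidates, key=lambda d: sum(x * x for x in offset(d)))


pov = {"lamp": (-10.0, 0.0), "tv": (0.0, 0.0), "thermostat": (12.0, 5.0)}
print(next_device_for_nudge(pov, current="tv", nudge="left"))   # lamp
print(next_device_for_nudge(pov, current="tv", nudge="right"))  # thermostat
```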
  • FIG. 21A shows a perspective view of a "hop over" gesture 2100 (one example of a "reach past” gesture), where a finger is raised up while the hand moves forward, followed quickly by the finger pointing down.
  • FIG. 21B shows one embodiment 2110 of a related motion graph depicting a maximum motion area 2112 for a "hop over" gesture 2100.
  • FIG. 21C shows one embodiment of a "next, next" gesture 2120 for moving between devices, which may be performed by a user miming flipping hanging folders towards themselves (as if pulling things in back toward the front).
  • FIG. 21D shows one embodiment 2130 of a related motion graph depicting a maximum motion area 2132 for a "next, next" gesture 2120.
  • FIG. 21E shows one embodiment of a "flip, flip" gesture 2150 for moving between devices, which may be performed by a user miming flipping hanging folders away from themselves (as if pushing things in front toward the back).
  • FIG. 21F shows one embodiment 2160 of a related motion graph depicting a maximum motion area 2162 for the "flip, flip" gesture 2150. A minimal containment check against such a maximum motion area is sketched below.
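  • The following is an illustrative sketch (in Python) of how a maximum motion area, such as areas 2112, 2132, and 2162, might be used as a coarse plausibility filter on a sampled hand path; the rectangular bounds, sample values, and function name are hypothetical and are not taken from the figures.

```python
# Illustrative sketch only: rejecting a candidate gesture whose sampled hand
# path leaves a rectangular "maximum motion area". The bounds and sample
# values below are hypothetical, not values from FIGS. 21B, 21D, or 21F.
def path_within_area(path, x_range, z_range):
    """Return True if every (x, z) sample lies inside the given motion area."""
    (x_min, x_max), (z_min, z_max) = x_range, z_range
    return all(x_min <= x <= x_max and z_min <= z <= z_max for x, z in path)


# Hand displacement samples in meters, relative to the gesture start position.
hop_over_path = [(0.00, 0.00), (0.05, 0.06), (0.12, 0.08), (0.18, 0.02)]
print(path_within_area(hop_over_path, x_range=(0.0, 0.25), z_range=(-0.05, 0.10)))  # True
```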
  • a "reach around" gesture may be performed by the user by moving their hand away, beginning mostly sideways, then arcing, and finishing mostly straight- ahead.
  • a "set interaction radius" gesture may be indicated with a hand and/or arm motion.
  • Such a gesture may comprise a start phase 2200, a motion phase 2210, and an end phase 2220.
  • For a start phase 2200, a hand is horizontal with the thumb and fingers joined side-by-side, and fingers flat or slightly curved.
  • For a motion phase 2210, the fingers curve to point down while the hand moves down and the forearm moves up slightly, with the elbow remaining stationary.
  • For an end phase 2220, there is rapid deceleration to stationary, with the fingers pointing mostly down.
  • FIG. 22D is a graph 2230 of x- and z-axis motion for a gesture path 2232 lasting 2 seconds.
  • alternative motions or gestures are used to indicate a set interaction radius command.
  • Other gestures may be used to indicate adjustments to an interaction radius.
  • One embodiment of such a gesture may be a squeezing motion of the thumb and forefinger, drawing the finger tips together, to signal a decrease in an interaction radius (or the like).
  • the opposite of this gesture, by a spreading of the finger tips, may signal an increase in an interaction radius.
  • Another embodiment of a gesture to modify the interaction radius may be a "stretch" motion, such as quickly stretching an arm or hand away from the user's body toward the device, to signal an increase in the interaction radius.
  • the opposite of this gesture by quickly drawing an arm or hand towards the user's body, may signal a decrease in the interaction radius.
  • Another gesture may be a "turn it up" motion to indicate an increase or decrease in the radius (or in other cases, an increase or decrease command to a device) by rotating a wrist clockwise or counter-clockwise.
  • Another gesture may use the fingers to add or subtract distance, such as by pointing an additional digit forward, or by reducing the number of digits pointing.
  • Other gestures may be used in line with the exemplary gestures disclosed herein. One possible handling of radius-adjustment commands is sketched below.
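  • The sketch below (in Python) illustrates one possible handling of such radius-adjustment gestures; the gesture labels, step size, and limits are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only: one possible handling of the radius-adjustment
# gestures described above. Gesture labels, step size, and limits are
# hypothetical assumptions, not values from the disclosure.
class InteractionRadius:
    def __init__(self, radius_m=3.0, step_m=0.5, min_m=0.5, max_m=15.0):
        self.radius_m = radius_m
        self.step_m = step_m
        self.min_m = min_m
        self.max_m = max_m

    def apply_gesture(self, gesture):
        """Grow or shrink the radius in response to a recognized gesture label."""
        if gesture in ("spread_fingers", "stretch_away", "wrist_clockwise"):
            self.radius_m = min(self.max_m, self.radius_m + self.step_m)
        elif gesture in ("pinch_fingers", "draw_toward", "wrist_counterclockwise"):
            self.radius_m = max(self.min_m, self.radius_m - self.step_m)
        return self.radius_m

    def devices_in_range(self, device_distances):
        """Filter candidate devices to those within the current radius."""
        return [d for d, dist in device_distances.items() if dist <= self.radius_m]


radius = InteractionRadius()
radius.apply_gesture("stretch_away")                      # radius grows to 3.5 m
print(radius.devices_in_range({"lamp": 2.0, "tv": 4.0}))  # ['lamp']
```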
  • user and device locations and orientations are determined.
  • various sensor configurations may provide a shared location frame of reference.
  • location frames of reference may include, but are not limited to, absolute location; location relative to a fixed sensor; location relative to a body sensor; location relative to absolute location and body sensors; location relative to a mobile sensor; and/or the like.
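  • As an illustrative sketch of working with the frames of reference listed above, the following converts a device location expressed in an absolute (room) frame into a frame relative to a body sensor, given the user's position and heading; the coordinate conventions and values are assumptions for this example.

```python
# Illustrative sketch only: converting a device location expressed in an
# absolute (room) frame into a frame relative to a body sensor, given the
# user's position and heading. The conventions used here are assumptions.
import math


def to_user_frame(device_xy, user_xy, user_heading_deg):
    """Express device_xy as (right_m, forward_m) relative to the user.

    user_heading_deg is measured counter-clockwise from the +x axis of the
    absolute frame; "forward" is the user's heading direction.
    """
    dx = device_xy[0] - user_xy[0]
    dy = device_xy[1] - user_xy[1]
    h = math.radians(user_heading_deg)
    forward = dx * math.cos(h) + dy * math.sin(h)
    right = dx * math.sin(h) - dy * math.cos(h)
    return right, forward


# A device 2 m along +y, seen by a user at the origin facing +y (heading 90
# degrees), is approximately 0 m to the right and 2 m straight ahead.
print(to_user_frame(device_xy=(0.0, 2.0), user_xy=(0.0, 0.0), user_heading_deg=90.0))
```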
  • Exemplary embodiments disclosed herein are implemented using one or more wired and/or wireless network nodes, such as a wireless transmit/receive unit (WTRU) or other network entity that may be used as an IoT device or hub in some embodiments.
  • FIG. 23 is a system diagram of an exemplary WTRU 102, which may be employed as a device or hub in embodiments described herein.
  • the WTRU 102 may include a processor 118, a communication interface 119 including a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, a non-removable memory 130, a removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and sensors 138.
  • the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like.
  • the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
  • the processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 23 depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
  • the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station over the air interface 116.
  • the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
  • the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, as examples.
  • the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
  • the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 116.
  • the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122.
  • the WTRU 102 may have multi-mode capabilities.
  • the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, as examples.
  • the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit).
  • the processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128.
  • the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132.
  • the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
  • the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
  • the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).
  • the processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102.
  • the power source 134 may be any suitable device for powering the WTRU 102.
  • the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), and the like), solar cells, fuel cells, and the like.
  • the processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102.
  • the WTRU 102 may receive location information over the air interface 116 from a base station and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
  • the processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity.
  • the peripherals 138 may include sensors such as an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
  • FIG. 24 depicts an exemplary network entity 190 that may be used in embodiments of systems and methods disclosed herein, for example as a hub.
  • network entity 190 includes a communication interface 192, a processor 194, and non-transitory data storage 196, all of which are communicatively linked by a bus, network, or other communication path 198.
  • Communication interface 192 may include one or more wired communication interfaces and/or one or more wireless-communication interfaces. With respect to wired communication, communication interface 192 may include one or more interfaces such as Ethernet interfaces, as an example. With respect to wireless communication, communication interface 192 may include components such as one or more antennae, one or more transceivers/chipsets designed and configured for one or more types of wireless (e.g., LTE) communication, and/or any other components deemed suitable by those of skill in the relevant art. And further with respect to wireless communication, communication interface 192 may be equipped at a scale and with a configuration appropriate for acting on the network side— as opposed to the client side— of wireless communications (e.g., LTE communications, Wi-Fi communications, and the like). Thus, communication interface 192 may include the appropriate equipment and circuitry (including multiple transceivers) for serving multiple mobile stations, UEs, or other access terminals in a coverage area.
  • Processor 194 may include one or more processors of any type deemed suitable by those of skill in the relevant art, some examples including a general-purpose microprocessor and a dedicated DSP.
  • Data storage 196 may take the form of any non-transitory computer-readable medium or combination of such media, some examples including flash memory, read-only memory (ROM), and random-access memory (RAM) to name but a few, as any one or more types of non- transitory data storage deemed suitable by those of skill in the relevant art may be used.
  • data storage 196 contains program instructions 197 executable by processor 194 for carrying out various combinations of the various network-entity functions described herein.
  • Examples of computer-readable storage media include a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
  • a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods are described for selecting an Internet of Things (IoT) device from among a plurality of IoT devices by: detecting a user pointing in the general direction of a plurality of IoT devices; determining a first IoT device of the plurality of IoT devices most closely associated with a pointing direction; sending, to the first IoT device, information to cause the first IoT device to indicate selection of the first IoT device; and, in response to detecting a predefined nudge gesture by the user: determining a direction of the predefined nudge gesture; determining a second IoT device located in the determined direction of the predefined nudge gesture; sending, to the first IoT device, information to cause the first IoT device to no longer indicate its selection; and sending, to the second IoT device, information to cause the second IoT device to indicate selection of the second IoT device.
PCT/US2017/036891 2016-06-17 2017-06-09 Method and system for selecting IoT devices using sequential point and nudge gestures WO2017218363A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662351534P 2016-06-17 2016-06-17
US62/351,534 2016-06-17

Publications (1)

Publication Number Publication Date
WO2017218363A1 (fr) 2017-12-21

Family

ID=59093626

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/036891 WO2017218363A1 (fr) Method and system for selecting IoT devices using sequential point and nudge gestures

Country Status (1)

Country Link
WO (1) WO2017218363A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110312311A1 (en) * 2010-06-16 2011-12-22 Qualcomm Incorporated Methods and apparatuses for gesture based remote control
US20150145653A1 (en) 2013-11-25 2015-05-28 Invensense, Inc. Device control using a wearable device
US20150312398A1 (en) 2014-04-24 2015-10-29 Samsung Electronics Co., Ltd. Apparatus and method for automatic discovery and suggesting personalized gesture control based on user's habit and context
US20150346834A1 (en) 2014-06-02 2015-12-03 Samsung Electronics Co., Ltd. Wearable device and control method using gestures
CN104635505A (zh) 2015-01-19 2015-05-20 赵树乔 Method for interacting with indoor smart devices
CN104660420A (zh) 2015-03-06 2015-05-27 世纪天云科技(天津)有限公司 Remote control switch for the Internet of Things

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022027435A1 (fr) 2020-08-06 2022-02-10 Huawei Technologies Co., Ltd. Activation of cross-device interaction with pointing gesture recognition
EP4185939A4 (fr) * 2020-08-06 2023-08-30 Huawei Technologies Co., Ltd. Activation of cross-device interaction with pointing gesture recognition

Similar Documents

Publication Publication Date Title
EP2945136B1 (fr) Mobile terminal and control method thereof
JP6400863B1 (ja) Intuitive method for pointing, accessing, and controlling appliances and other objects inside a building
US9235241B2 (en) Anatomical gestures detection system using radio signals
CN103430191B (zh) Learning situations via pattern matching
US9888090B2 (en) Magic wand methods, apparatuses and systems
KR102464384B1 (ko) Wireless power transmission apparatus and control method thereof
US20220137204A1 (en) Interactive control with ranging and gesturing between devices
US11113895B2 (en) Systems and methods for selecting spheres of relevance for presenting augmented reality information
CN110557741B (zh) Method for terminal interaction, and terminal
US11095765B2 (en) Electronic device and method for connection to external device
CN102355623A (zh) System and method for changing the desktop application theme of a mobile terminal according to its location
US20170124365A1 (en) Real-time locating system-based bidirectional performance imaging system
Lin et al. Human activity recognition using smartphones with WiFi signals
KR101618783B1 (ko) Mobile terminal, method for controlling the mobile terminal, and control system including the mobile terminal
US11841447B2 (en) 3D angle of arrival capability in electronic devices with adaptability via memory augmentation
CN103974190A (zh) Method for transmitting files via a mobile device, and mobile device
CN113534946A (zh) Contactless gesture recognition method
GB2606447A (en) Ultra-wideband to identify and control other device
KR20210000974A (ko) Method for providing a service related to an electronic device by forming a zone, and device therefor
WO2017218363A1 (fr) Method and system for selecting IoT devices using sequential point and nudge gestures
Hua et al. Arciot: Enabling intuitive device control in the Internet of things through Augmented Reality
Xiong et al. A smart home control system based on indoor location and attitude estimation
JP2022522912A (ja) System and method for pairing devices using visual recognition
KR101611898B1 (ko) Matching system
EP4373139A1 (fr) Method and electronic device for searching for an external device through positioning angle adjustment

Legal Events

Date Code Title Description
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17731984

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17731984

Country of ref document: EP

Kind code of ref document: A1