WO2016155577A1 - Time related interaction with handheld device - Google Patents

Time related interaction with handheld device

Info

Publication number
WO2016155577A1
Authority
WO
WIPO (PCT)
Prior art keywords
handheld device
interaction
trigger event
events
time
Prior art date
Application number
PCT/CN2016/077402
Other languages
French (fr)
Inventor
Valluri KUMAR
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to CN201680019617.5A (published as CN107529341A)
Publication of WO2016155577A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Telephone Function (AREA)

Abstract

A method for time related interaction with a handheld device and an apparatus thereof. The method comprises the steps of registering at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device (101), registering time interval between consecutive said plurality of sub-events in the handheld device (102), associating an output function with each said at least one interaction trigger event in the handheld device (103) and verifying registered at least one interaction trigger event in the handheld device (104).

Description

TIME RELATED INTERACTION WITH A HANDHELD DEVICE
Field of the Disclosure
The field of the present disclosure pertains to a method for time related interaction with a handheld device and a system thereof, and particularly, to an audio output oriented time related interaction of a user with a handheld device.
Background
Users of mobile devices, as part of their daily routines, use alarm and calendar event features for their intended purposes as separate applications. Conventionally, a time related interaction of a user with a handheld device involves the display screen being utilized. For instance, if a user would like to know the current time, the display screen of the handheld device would have to be powered on. In the event a user would like to know both the current time and the time remaining until the alarm event, the user would have to power on the display screen and determine the remaining time himself.
The constant powering on and off drains the battery of the handheld device unnecessarily, and many features of the device remain unused in the process, even though they could be integrated in a manner that avoids the need to power the device on repeatedly.
Summary
In order to avoid repetitively powering the handheld device on and off during time related interaction by a user, embodiments of the present invention provide a method for time related interaction with a mobile device and an apparatus implementing the same.
An embodiment of the present disclosure refers to a method for time related interaction with a handheld device where the method comprises the steps of registering at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device, registering time interval between consecutive said plurality of sub-events in the handheld device, associating an output function with each said at least one interaction trigger event in the handheld device and verifying registered at least one interaction trigger event in the handheld device.
According to an embodiment of the present disclosure, the step of associating an output function for each said at least one interaction trigger event comprises storing an alarm time within the handheld device, configuring the handheld device to determine the time differential between the instance of said at least one interaction trigger event and the stored alarm time, and providing an audio output of the determined time differential.
According to yet another embodiment of the present disclosure the step of associating an output function for each said at least one interaction trigger event comprises providing an audio output of time at the instance of said at least one interaction trigger event on the handheld device.
A further embodiment discloses a method for time related interaction with a handheld device where the method comprises identifying an interaction trigger event comprising a plurality of trigger sub-events, determining an output function associated with said interaction trigger event and executing determined output function.
According to an embodiment of the disclosure, an apparatus for time related interaction with a handheld device comprises a sensor module configured to identify at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device, a processing module coupled to the sensor module configured to register at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device, determine an output function associated with said interaction trigger event, register time interval between consecutive said plurality of sub-events in the handheld device, associate an output function for each said at least one interaction trigger event in the handheld device and execute the associated output function upon identification of at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device. The apparatus further comprises a storage module coupled to said sensor module and processing module to store the settings configured by a user in the handheld device.
In an embodiment of the present disclosure, the alarm time set by a user is stored in the storage module, which is determined by the sensor module and the time differential between instance of said interaction trigger event and said alarm time is determined by the processing module. Subsequently, the processing module is configured to provide an audio output of determined time differential.
According to another embodiment of the present disclosure, the processing module is configured to determine the output function and thereafter, provide an audio output of current time on the handheld device.
According to several embodiments of the present disclosure, the interaction trigger event and said plurality of sub-events are motion based trigger events of the handheld device. The time interval between consecutive sub-events is configured by the user of the handheld device. For instance, the interaction trigger event may be a shake gesture event, which comprises three consecutive shakes with a time interval of 20 milliseconds between shakes.
In further embodiments of the present disclosure, the handheld device is configured to operate the time related interaction with an appropriate platform, i.e. it may be executed on an Android™ based handheld device.
Embodiments of the present disclosure improve the user experience by reducing the complexity of the user operations required to discern the current time or the time gap to the next set alarm, which is a very basic operation performed by any user on a smart phone. The user interaction therefore does not require the handheld device to be powered on, thereby reducing the resultant power drainage of the device.
Brief Description of Figures
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
Figure 1 illustrates a flow diagram representation of a method for time related interaction with a handheld device according to an embodiment of the present disclosure.
Figure 2 illustrates a detailed flow diagram representation of a method for time related interaction with a handheld device according to an embodiment of the present disclosure.
Figure 3 illustrates a flow diagram representation of a method for time related interaction with a handheld device according to an embodiment of the present disclosure.
Figure 4 illustrates an exemplary embodiment of a method for time related interaction with a handheld device according to an embodiment of the present disclosure.
Figure 5 illustrates a block diagrammatic representation of an apparatus for time related interaction with a handheld device according to an embodiment of the present disclosure.
Detailed Description
The following discussion provides a brief, general description of a suitable computing environment in which various embodiments of the present disclosure can be implemented. The aspects and embodiments are described in the general context of computer executable mechanisms such as routines executed by a handheld device, e.g. a mobile phone, a personal digital assistant, a cellular device, a tablet, etc. The embodiments described herein can be practiced with other system configurations, including Internet appliances, handheld devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, mini computers, mainframe computers and the like. The embodiments can be embodied in a special purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer executable mechanisms explained in detail below.
Exemplary embodiments now will be described with reference to the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art. The terminology used in the detailed description of the particular exemplary embodiments illustrated in the accompanying drawings is not intended to be limiting. In the drawings, like numbers refer to like elements.
The specification may refer to “an”, “one” or “some” embodiment(s) in several locations. This does not necessarily imply that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes”, “comprises”, “including” and/or “comprising”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include wirelessly connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown. The connections shown are logical connections; the actual physical connections may be different. It is apparent to a person skilled in the art that the structure may also comprise other functions and structures. It should be appreciated that the functions, structures, elements and the protocols used in communication are irrelevant to the present disclosure. Therefore, they need not be discussed in more detail here.
In addition, all logical units described and depicted in the figures include the software and/or hardware components required for the unit to function. Further, each unit may comprise within itself one or more components, which are implicitly understood. These components may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
Figure 1 illustrates a flow diagram representation of a method of time related interaction with a handheld device according to an embodiment of the present disclosure. According to the embodiment, the method requires registration of at least one interaction trigger event 101 with the handheld device. The interaction trigger event may comprise a plurality of sub-events. The method further comprises registration of time interval between consecutive sub-events within the handheld device 102. Further, an output function is associated with each interaction trigger event in the handheld device 103. The registered interaction trigger event is then verified 104 to confirm its registration.
According to an example of the embodiment, the interaction trigger event and the plurality of sub-events are motion based trigger events of the handheld device. For instance, the motion based trigger may be related to a shake gesture. The interaction trigger event is then a shake gesture event, while the plurality of sub-events is the number of shakes which quantify a single shake gesture event. In an exemplary embodiment, the shake gesture event comprises three shakes as sub-events with a time interval of 30 milliseconds between consecutive shakes. This time interval is registered and is based on the user of the handheld device, who may configure the device based on convenience and use. The user may elect to keep a wider gap between consecutive shakes or elect to keep a narrower gap between consecutive shakes.
Subsequently, the interaction trigger event may be associated with an output function which is then executed by the handheld device. In an embodiment of the disclosure, the interaction trigger event of shake gesture is associated with an audio output of the current time of the handheld device. Such association is configured by the user of the handheld device.
There may be multiple output functions associated to a single interaction trigger event or a plurality of interaction trigger events. For instance, a user of the handheld device may associate an output function of audio output of current time with an interaction trigger event and associate an output function of audio output of the alarm time set with another interaction trigger event. The user may also associate both the output functions with a single interaction trigger event.
Once the interaction trigger event is registered within the handheld device, it is verified by the user to ensure that it is functioning properly.
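The registration and association steps of Figure 1 can be pictured with a small data model. The following is a minimal sketch in Java; it is not taken from this disclosure, and the names TriggerRegistry, ShakeGestureEvent and OutputFunction are illustrative assumptions rather than elements of the patent.

    import java.util.HashMap;
    import java.util.Map;

    // Illustrative sketch only: names and structure are assumptions, not the patent's implementation.
    public class TriggerRegistry {

        // An interaction trigger event defined by its sub-events (number of shakes)
        // and the registered time interval between consecutive shakes.
        public static class ShakeGestureEvent {
            final int shakeCount;          // plurality of trigger sub-events
            final long intervalMillis;     // registered interval between consecutive shakes

            public ShakeGestureEvent(int shakeCount, long intervalMillis) {
                this.shakeCount = shakeCount;
                this.intervalMillis = intervalMillis;
            }

            @Override
            public boolean equals(Object o) {
                if (!(o instanceof ShakeGestureEvent)) return false;
                ShakeGestureEvent e = (ShakeGestureEvent) o;
                return e.shakeCount == shakeCount && e.intervalMillis == intervalMillis;
            }

            @Override
            public int hashCode() {
                return 31 * shakeCount + Long.hashCode(intervalMillis);
            }
        }

        // Output functions the disclosure associates with a trigger event.
        public enum OutputFunction { SPEAK_CURRENT_TIME, SPEAK_TIME_TO_ALARM }

        private final Map<ShakeGestureEvent, OutputFunction> associations = new HashMap<>();

        // Steps 101-103: register the event, its interval, and the associated output function.
        public void register(ShakeGestureEvent event, OutputFunction function) {
            associations.put(event, function);
        }

        // Step 104: verify that the event was registered.
        public boolean isRegistered(ShakeGestureEvent event) {
            return associations.containsKey(event);
        }

        public OutputFunction lookup(ShakeGestureEvent event) {
            return associations.get(event);
        }
    }

With such a registry, configuring three shakes 30 milliseconds apart for a current-time readout would amount to register(new ShakeGestureEvent(3, 30), OutputFunction.SPEAK_CURRENT_TIME).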
Figure 2 of the present disclosure illustrates a detailed flow diagram in respect of the embodiment illustrated in Figure 1. According to an embodiment of the disclosure, a number of shake events are configured to generate a shake gesture event 201. Subsequently, the time interval between two consecutive shake events is configured 202. It should be noted that here, shake events refer to the plurality of sub-events while the shake gesture event refers to the interaction trigger event.
The configured time interval is then updated within the handheld device 204 and the interaction trigger event is then tested. A test shake gesture event is triggered, which is received by the handheld device 204. It is determined whether the event is from the user 205, as it is quite possible that an event may be triggered inadvertently during a journey or a walk by the user. According to an embodiment of the present disclosure, this is determined by ascertaining the time interval between the shake events. If the determined time interval meets the time interval configured by the user, the embodiment moves to the next step; otherwise, no update is provided to the user. In the event it is determined that the shake gesture event has been initiated by the user, the embodiment verifies whether any user is registered for that particular event 206. If there is an affirmative determination, the registered users are notified for the shake gesture event 207 based on the associated output function of the shake gesture event; otherwise, no update is provided to the user.
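One reading of the user-origin check at step 205 is a tolerance comparison between the measured inter-shake intervals and the interval configured by the user. The sketch below is an assumption about how such a filter might look; the tolerance parameter is illustrative and does not appear in the disclosure.

    // Illustrative sketch of the step-205 filter: accept the gesture only if every
    // measured interval between consecutive shakes is close to the configured interval.
    public final class GestureFilter {

        private GestureFilter() {}

        public static boolean matchesConfiguredInterval(long[] shakeTimestampsMillis,
                                                        long configuredIntervalMillis,
                                                        long toleranceMillis) {
            for (int i = 1; i < shakeTimestampsMillis.length; i++) {
                long measured = shakeTimestampsMillis[i] - shakeTimestampsMillis[i - 1];
                if (Math.abs(measured - configuredIntervalMillis) > toleranceMillis) {
                    return false; // likely an accidental shake (journey, walk), so no update is given
                }
            }
            return true;
        }
    }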
Figure 3 illustrates a method for time related interaction with a handheld device according to an embodiment of the present disclosure. The embodiment is pertinent to execution of the output function once a shake gesture event is received. According to the embodiment, an interaction trigger event is identified by the handheld device 301 upon receipt. Thereafter, the output function associated with the interaction trigger event is determined 302 and executed by the handheld device 303. According to the embodiment, more than one interaction trigger event may be identified based on the trigger sub-events. For instance, three shake events with a time interval of 30 milliseconds may imply a shake gesture event associated with an output function of audio output of the current time, while four shake events with a time interval of 50 milliseconds may imply a shake gesture event associated with an output function of audio output of the time pending for the alarm to go off.
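Building on the illustrative TriggerRegistry sketched earlier, the two example gestures in the preceding paragraph could be wired to distinct output functions roughly as follows; again, this is an assumed sketch, not the patent's implementation.

    // Illustrative wiring of the two example gestures onto the earlier TriggerRegistry sketch.
    public class ExampleWiring {
        public static void main(String[] args) {
            TriggerRegistry registry = new TriggerRegistry();
            // Three shakes, 30 ms apart -> speak the current time.
            registry.register(new TriggerRegistry.ShakeGestureEvent(3, 30),
                    TriggerRegistry.OutputFunction.SPEAK_CURRENT_TIME);
            // Four shakes, 50 ms apart -> speak the time pending for the alarm.
            registry.register(new TriggerRegistry.ShakeGestureEvent(4, 50),
                    TriggerRegistry.OutputFunction.SPEAK_TIME_TO_ALARM);

            // Steps 301-303: identify the observed gesture and determine its output function.
            TriggerRegistry.OutputFunction fn =
                    registry.lookup(new TriggerRegistry.ShakeGestureEvent(3, 30));
            System.out.println(fn);  // SPEAK_CURRENT_TIME
        }
    }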
According to an exemplary embodiment illustrated in Figure 4 of the present disclosure, the interaction trigger event is identified 401 and the associated output function is determined 402. Subsequently, the handheld device may be configured to determine whether there are one or two output functions associated with the identified interaction trigger event. According to the embodiment, it is determined whether the associated output function pertains to the output of the time differential, i.e. the time pending for the alarm to go off from that particular instance, or whether it pertains to the output of the current time on the handheld device 403.
In the event it is determined that the associated output function is the audio output of the time differential, the embodiment determines the alarm time stored in the handheld device 404 and subsequently determines the time differential between the instance of the interaction trigger event and the determined alarm time 405. Then, an audio output of the determined time differential is provided 406, i.e. the handheld device verbally notifies the user of the determined time differential. For instance, if the user triggers an interaction trigger event at 3:30 am while the alarm time is set for 6:30 am, the handheld device shall verbally notify the user that there are 3 hours left for the alarm to go off.
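The time differential of steps 404 to 406 is a simple clock subtraction, wrapping to the next day when the alarm time has already passed. A minimal sketch, assuming java.time is available and that the resulting phrase is handed to a text-to-speech engine elsewhere:

    import java.time.Duration;
    import java.time.LocalDateTime;
    import java.time.LocalTime;

    // Illustrative sketch of steps 404-406: compute how long remains until the stored alarm.
    public final class AlarmDifferential {

        private AlarmDifferential() {}

        public static String timeToAlarmPhrase(LocalDateTime triggerInstant, LocalTime alarmTime) {
            LocalDateTime alarm = triggerInstant.toLocalDate().atTime(alarmTime);
            if (!alarm.isAfter(triggerInstant)) {
                alarm = alarm.plusDays(1); // alarm already passed today, so it fires tomorrow
            }
            Duration diff = Duration.between(triggerInstant, alarm);
            long hours = diff.toHours();
            long minutes = diff.minusHours(hours).toMinutes();
            return String.format("%d hours and %d minutes for the alarm to go off", hours, minutes);
        }
    }

For the example above, timeToAlarmPhrase(LocalDateTime.of(2016, 3, 25, 3, 30), LocalTime.of(6, 30)) yields "3 hours and 0 minutes for the alarm to go off".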
In the event it is determined that the associated output function is the audio output of the current time, the embodiment ensures that an audio output of the current time on the handheld device is notified to the user 407. For instance, if the user triggers an interaction trigger event at 4:00 am, the handheld device shall verbally notify the user that it is 4:00 in the morning.
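For the current-time branch at 407, the spoken phrase is just a formatted clock reading. A minimal, assumed sketch:

    import java.time.LocalTime;
    import java.time.format.DateTimeFormatter;

    // Illustrative sketch of step 407: format the current time for audio output.
    public final class CurrentTimePhrase {

        private CurrentTimePhrase() {}

        public static String phrase(LocalTime now) {
            // e.g. LocalTime.of(4, 0) -> "It is 4:00 AM" (in an English locale)
            return "It is " + now.format(DateTimeFormatter.ofPattern("h:mm a"));
        }
    }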
Figure 5 illustrates a block diagrammatic representation of an apparatus 500 for time related interaction with a handheld device according to an embodiment of the present disclosure. The apparatus may be a  part of the handheld device itself or certain elements of the handheld device may be configured to function as per the elements described within said illustration. This diagrammatic representation should not be considered a limitation to the possible arrangements of the elements in the handheld device. According to the embodiment, the apparatus comprises a sensor module 501, a processing module 502 and a storage module 503 where all modules are coupled to each other i.e. there is an exchange of information. According to an embodiment of the present disclosure, the sensor module 501 is configured to identify at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device. The processing module 502 is configured to register at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device and register time interval between consecutive said plurality of sub-events in the handheld device. The processing module 502 is configured to associate an output function for each said at least one interaction trigger event in the handheld device and the storage module 503 is configured to store the settings configured by a user in the handheld device.
Upon identification of the interaction trigger event comprising a plurality of trigger sub-events, the processing module determines an output function associated with said interaction trigger event and executes the determined output function. The storage module 503 is coupled to said sensor module and processing module to store the settings configured by a user in the handheld device. Figure 5 illustrates an arrangement of hardware elements configured to function as per the disclosure of Figures 1 to 4.
According to an exemplary embodiment, an apparatus for time related interaction with a handheld device comprises a sensor, a processor and a memory where all elements are coupled to each other i.e. there is an exchange of information. According to an embodiment of the present disclosure, the sensor is configured to identify at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device. The processor coupled to the sensor registers at least one interaction trigger event comprising a plurality of trigger sub-events in the handheld device and registers time interval between consecutive said plurality of sub-events in the handheld device. The processor is configured to associate an output function for each said at least one interaction trigger event in the handheld device and the memory is configured to store the settings configured by a user in the handheld device.
Embodiments of the present disclosure are exemplified based on use cases illustrated further.
Use Case 1:
Ritu is a mother of two kids and her daily routine involves getting up early in the morning to make all arrangements for her kids to go to school. For this, she has to set an alarm daily to wake up early. But as per human tendency, her biological mind always insists on knowing the time gap to the set alarm. This makes her open her eyes repeatedly to check the time, which disturbs her sleep.
By using an apparatus configured as per the present disclosure, Ritu only needs to shake her phone to hear the time gap between the current time and the alarm time, and is not required to open her eyes.
Use case 2:
Smith is going to the office on his bike or on crowded public transport and wants to know the current time so he can tell whether he is getting late for the office. He cannot unlock his device and check the time, as he will either be busy riding or be surrounded by a crowd. By using an apparatus configured as per the present disclosure, he can simply shake his phone, which reads out the current system time for him.
Use case 3:
An apparatus configured as per the present disclosure is extremely helpful for physically challenged people who cannot actually look at the mobile to check the current time; such people can get to know the current time by simply shaking the phone multiple times.
The present disclosure is therefore beneficial, as it may be used without requiring the display screen to be powered on. It is extremely useful for the differently abled, such as blind people, who may shake their respective devices to know the current time or their alarm time. Such an interaction with the handheld device may be configured by a user as per requirement.
On a specific platform such as Android™, which may be used by the handheld device, when the display screen goes OFF, Android fires an ACTION_SCREEN_OFF broadcast event which the application can register for. When the BroadcastReceiver receives the ACTION_SCREEN_OFF Intent, a thread is queued to run 500 ms in the future instead of running immediately. This ensures that most other activities surrounding the screen-off will have completed. At that time, 500 ms later, the service un-registers and re-registers itself as a sensor event listener. The foreground service has to register a "SensorEventListener" for the ACCELEROMETER sensor type. During the "onSensorChanged" method implementation of the "SensorEventListener", the business logic can be incorporated to read out the current time in hours and minutes, along with the set alarm differential, based on the settings configured by the user.
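A condensed sketch of the Android flow just described, using only framework APIs named in the paragraph above (a BroadcastReceiver for ACTION_SCREEN_OFF, a 500 ms delayed re-registration, and a SensorEventListener on the accelerometer). The shake detection and speech output are omitted, and the class name and the detectShakeAndSpeak helper are assumptions, not part of this disclosure.

    import android.app.Service;
    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.content.IntentFilter;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Handler;
    import android.os.IBinder;

    // Illustrative sketch only: a service that re-registers its accelerometer listener
    // shortly after the screen goes off, as described in the paragraph above.
    // A production implementation would also promote itself with startForeground().
    public class TimeReadoutService extends Service implements SensorEventListener {

        private SensorManager sensorManager;
        private final Handler handler = new Handler();

        private final BroadcastReceiver screenOffReceiver = new BroadcastReceiver() {
            @Override
            public void onReceive(Context context, Intent intent) {
                // Delay 500 ms so that other screen-off activity has completed,
                // then un-register and re-register this service as a sensor listener.
                handler.postDelayed(() -> {
                    sensorManager.unregisterListener(TimeReadoutService.this);
                    Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
                    sensorManager.registerListener(TimeReadoutService.this, accel,
                            SensorManager.SENSOR_DELAY_NORMAL);
                }, 500);
            }
        };

        @Override
        public void onCreate() {
            super.onCreate();
            sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
            Sensor accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_NORMAL);
            registerReceiver(screenOffReceiver, new IntentFilter(Intent.ACTION_SCREEN_OFF));
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Business logic goes here: detect the configured shake gesture from the
            // accelerometer values and read out the current time or the alarm differential.
            // detectShakeAndSpeak(event.values);  // hypothetical helper, not an Android API
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Not needed for this sketch.
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;
        }

        @Override
        public void onDestroy() {
            unregisterReceiver(screenOffReceiver);
            sensorManager.unregisterListener(this);
            super.onDestroy();
        }
    }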
As will be appreciated by one of skill in the art, the present invention may be embodied as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, a software embodiment or an embodiment combining software and hardware aspects, all generally referred to herein as a "circuit" or "module". Furthermore, the present invention may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Furthermore, the present invention was described in part above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) , and computer program products according to embodiments of the invention.
It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
Instructions may also be loaded onto a computer or other programmable data processing apparatus like a scanner/check scanner to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions  which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and schematic diagrams of Figures 1 to 4 illustrate the architecture, functionality, and operations of some embodiments of methods, systems, and computer program products for time related interaction of a user with a handheld device. In this regard, each block may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in other implementations, the function(s) noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.
In the drawings and specification, there have been disclosed exemplary embodiments of the invention. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation, the scope of the invention being defined by the following claims.

Claims (15)

  1. A method for time related interaction with a handheld device, the method comprising the steps of: ‐
    registering at least one interaction trigger event comprising a plurality of trigger sub‐events in the handheld device;
    registering time interval between consecutive said plurality of sub‐events in the handheld device;
    associating an output function with each said at least one interaction trigger event in the handheld device;
    verifying registered at least one interaction trigger event in the handheld device;
    identifying an interaction trigger event comprising a plurality of trigger sub‐events;
    determining an output function associated with said interaction trigger event; and
    executing determined output function.
  2. The method as claimed in claim 1 wherein the step of associating an output function for each said at least one interaction trigger event comprises: ‐
    storing an alarm time within the handheld device;
    configuring the handheld device to determine time differential between instance of said at least one interaction trigger event and stored alarm time; and
    providing an audio output of the determined time differential upon occurrence of said at least one interaction trigger event.
  3. The method as claimed in claim 1 wherein said at least one interaction trigger event and said plurality of sub‐events are motion based trigger events of the handheld device.
  4. The method as claimed in claim 1 wherein said time interval between consecutive said plurality of sub‐events is configured by user of the handheld device.
  5. The method as claimed in claim 1 wherein the handheld device is configured to operate the time related interaction with an appropriate platform.
  6. The method as claimed in claim 1 wherein the step of associating an output function for each said at least one interaction trigger event comprises providing an audio output of time at the instance of said at least one interaction trigger event on the handheld device.
  7. An apparatus for time related interaction with a handheld device, the apparatus comprising: ‐
    a sensor module configured to identify at least one interaction trigger event comprising a plurality of trigger sub‐events in the handheld device;
    a processing module coupled to the sensor module configured to
    register at least one interaction trigger event comprising a plurality of trigger sub‐events in the handheld device;
    determine an output function associated with said interaction trigger event; and
    register time interval between consecutive said plurality of sub‐events in the handheld device;
    associate an output function for each said at least one interaction trigger event in the handheld device;
    execute the associated output function upon identification of at least one interaction trigger event comprising a plurality of trigger sub‐events in the handheld device; and
    a storage module coupled to said sensor module and processing module to store the settings configured by a user in the handheld device.
  8. The apparatus as claimed in claim 7 wherein the storage module stores an alarm time set by the user of the handheld device.
  9. The apparatus as claimed in claim 7 wherein the processing module is configured to determine time differential between instance of said at least one interaction trigger event and stored alarm time.
  10. The apparatus as claimed in claim 7 wherein the processing module is configured to provide an audio output of determined time differential through the handheld device.
  11. The apparatus as claimed in claim 7 wherein the sensor module is configured to determine time at the instance of said at least one interaction trigger event.
  12. The apparatus as claimed in claim 7 wherein the processing module is configured to provide an audio output of determined current time on the handheld device.
  13. The apparatus as claimed in claim 7 wherein said at least one interaction trigger event and said plurality of sub‐events are motion based trigger events of the handheld device.
  14. The apparatus as claimed in claim 7 wherein said time interval between consecutive said plurality of sub‐events is configured by user of the handheld device.
  15. The apparatus as claimed in claim 7 wherein the handheld device is configured to operate the time related interaction with an appropriate platform.
PCT/CN2016/077402 2015-03-30 2016-03-25 Time related interaction with handheld device WO2016155577A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680019617.5A CN107529341A (en) 2015-03-30 2016-03-25 The time correlation carried out with handheld device interacts

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN1646/CHE/2015 2015-03-30
IN1646CH2015 2015-03-30

Publications (1)

Publication Number Publication Date
WO2016155577A1 true WO2016155577A1 (en) 2016-10-06

Family

ID=57003864

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/077402 WO2016155577A1 (en) 2015-03-30 2016-03-25 Time related interaction with handheld device

Country Status (2)

Country Link
CN (1) CN107529341A (en)
WO (1) WO2016155577A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609165A (en) * 2011-01-24 2012-07-25 广州三星通信技术研究有限公司 Mobile terminal having touch screen and mobile terminal mode control method
CN103605465A (en) * 2013-12-06 2014-02-26 上海艾为电子技术有限公司 Method for controlling handheld equipment and handheld equipment
CN103713735A (en) * 2012-09-29 2014-04-09 华为技术有限公司 Method and device of controlling terminal equipment by non-contact gestures
US20140298672A1 (en) * 2012-09-27 2014-10-09 Analog Devices Technology Locking and unlocking of contacless gesture-based user interface of device having contactless gesture detection system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101754454A (en) * 2008-11-28 2010-06-23 英业达股份有限公司 Mobile phone control method and mobile phone applying mobile phone control method
KR101892233B1 (en) * 2012-08-03 2018-08-27 삼성전자주식회사 Method and apparatus for alarm service using context aware in portable terminal
CN102984376A (en) * 2012-11-23 2013-03-20 广东欧珀移动通信有限公司 Method and device for controlling mobile phone alarm clock and based on acceleration speed induction
CN103092342B (en) * 2012-12-31 2016-03-16 北京金山安全软件有限公司 Processing method and device of mobile terminal
CN103248762A (en) * 2013-04-16 2013-08-14 广东欧珀移动通信有限公司 Method, device and mobile phone for quick dialing
CN103617032A (en) * 2013-11-21 2014-03-05 苏州佳世达电通有限公司 Mobile terminal and prompting method thereof

Also Published As

Publication number Publication date
CN107529341A (en) 2017-12-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16771332

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16771332

Country of ref document: EP

Kind code of ref document: A1