WO2017032292A1 - Method for reducing power consumption in touch screen device and a device thereof - Google Patents

Method for reducing power consumption in touch screen device and a device thereof

Info

Publication number
WO2017032292A1
WO2017032292A1 (PCT/CN2016/096267)
Authority
WO
WIPO (PCT)
Prior art keywords
touch
reducing
power consumption
power
display
Prior art date
Application number
PCT/CN2016/096267
Other languages
French (fr)
Inventor
Vaibhav BHALLA
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. filed Critical Huawei Technologies Co., Ltd.
Priority to CN201680048381.8A priority Critical patent/CN107924251B/en
Publication of WO2017032292A1 publication Critical patent/WO2017032292A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • G06F1/3262Power saving in digitizer or tablet

Definitions

  • the present disclosure generally relates to reducing power consumption in devices with a touch screen. Specifically it relates to dynamic power control of the touch subsystem based on the foreground user interface controls.
  • the present disclosure is applicable for devices with display and touch panels such as electronic portable devices or communication devices.
  • in such a scenario, extensive touch sensing is not required.
  • the controller in the firmware decides the time interval for suspended state (idle timeout) .
  • running applications which are using the touch screen and are on display cannot decide these aspects. Therefore, for example, applications such as a book reader cannot use low power states supported by the controller. On each touch of the user, the touch power is turned to the maximum.
  • US 20100265209 relates to determining areas within the touch screen, where a user input is possible. To reduce power consumption of a device with a touch screen, the touch detection for sensing a user input is only activated within the determined areas, where a user input is possible.
  • US 20060284857 describes a power-saving function for a touch screen device that provides efficient use of power, such as from a battery of a portable device.
  • the touch screen device may include a sampling unit for digitizing an input to the touch screen.
  • the power-saving function is provided with a battery level-monitoring unit for monitoring a battery level, and a sampling rate controller for controlling a sampling rate of the sampling unit based on the battery level monitored by the battery level-monitoring unit.
  • WO 2009071123 relates to determining areas within the touch screen where user input is possible. To reduce power consumption of a device with a touch screen, the touch detection for sensing a user input is only activated within the determined areas, where a user input is possible.
  • EP 2556424 describes embodiments of a device having a touchscreen panel and a method for reducing its power consumption.
  • the touch-sensing capability of the touchscreen panel is activated when the user interaction with the touchscreen panel is detected.
  • the touch-sensing capability of the touchscreen panel is deactivated when no user interaction is detected.
  • the device includes a touch-event detector to sense motion or vibration, and the processing circuitry either activates or deactivates the touch-sensing capability of the touchscreen panel based at least in part on input from the touch-event detector.
  • the present disclosure enables a user to be able to use the portable device with lower power consumption since the touch subsystem consumes less power when it is not being used.
  • the device shall also be able to continuously play back video much longer than before. It also allows for optimal battery usage and reduced touch subsystem consumption.
  • the disclosure reduces the power consumption when the application has taken complete control over the display area. It also provides additional video playback time.
  • the present disclosure bases its operation on the fact that if there is no control on the display the touch power can be put in low power mode to save power consumption.
  • the running application can request and set power states and the touch power will increase only when required and not as soon as any touch is detected. For example, if an application and the controller are in a low power state or mode and a user gesture is detected, then the reaction is not that the normal power is restored. Instead, the application is notified of the gesture and the application and the controller continue to work in the low power mode.
  • Figure 1 denotes an overview of a preferred embodiment of the present disclosure on a portable device.
  • Figure 2 shows the controls using which the user interacts with the running application in a preferred embodiment of the present disclosure.
  • Figure 3 depicts an example in which the application takes control over the complete display screen.
  • Figure 4 depicts a flowchart which gives the steps which are followed for a typical initialization of the system.
  • Figure 5 describes the steps of the method followed in a preferred embodiment of the present disclosure.
  • Figure 6 shows the steps involved in a preferred embodiment of a typical request for putting the touch subsystem in the normal power mode.
  • Figure 7 describes an embodiment of the process followed by the present disclosure when the system triggers the touch subsystem normal power mode based on a system event or any user input other than touch.
  • Figure 8 shows the flow in a preferred embodiment when the touch subsystem detects a touch and changes its mode to normal mode.
  • Figure 9 shows a basic flowchart depicting the core functioning of the present disclosure.
  • Figure 10 shows a basic system diagram depicting the core structure of the present disclosure.
  • the following discussion provides a brief, general description of a suitable computing environment in which various embodiments of the present disclosure can be implemented.
  • the aspects and embodiments are described in the general context of computer executable mechanisms such as routines executed by a handheld device, e.g. a mobile phone, a personal digital assistant, a cellular device, a tablet, etc.
  • the embodiments described herein can be practiced with other system configurations, including Internet appliances, hand held devices, multi-processor systems, microprocessor based or programmable consumer electronics, network PCs, mini computers, mainframe computers and the like.
  • the embodiments can be embodied in a special purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer executable mechanisms explained in detail below.
  • each unit may comprise within itself one or more components, which are implicitly understood. These components may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
  • the running application may take complete control of the display in full screen mode. In this mode, only the application running in the foreground responds to the touch screen. For example, when a video is played on a portable touch based video player, there is generally a time-out period after which the controls on the interface disappear and only the video appears on the full screen. Since there are no controls on the user interface, the first touch causes the controls to appear on the interface. When there are no controls on the interface, it does not matter where on the touch screen the touch event occurs, as a touch at any place leads to the same consequence irrespective of the location, namely to display the hidden controls.
  • the power of the touch screen can be reduced, to conserve power, to a level at which it can still detect a touch irrespective of its position and without multi-touch resolution.
  • power can be reduced without reducing the functionality.
  • This functionality is applicable to any application which hides the controls from the interface.
  • FIGURE 1 denotes an overview of a preferred embodiment of the present disclosure on a portable device. This figure applies mainly to the display (1) and the touch screen (4) of the portable device. In this embodiment the touch screen (4) is placed on top of the display panel (1). A running application shall be displayed on the display (1) and shall show the controls for the user to interact with.
  • FIGURE 2 shows the controls using which the user interacts with the running application in a preferred embodiment of the present disclosure.
  • the controls shown on the display (1) are handled by the foreground application.
  • FIGURE 3 depicts an example in which the application takes control over the complete display (1) screen.
  • the controls are automatically hidden after a certain pre-decided or set time period or even event.
  • the next touch gesture will be considered just as a touch event to bring the controls into the foreground from the background. Therefore, as and when the device is in the state depicted in FIGURE 3, the next touch gesture is only required to bring the controls to the foreground and there is no requirement to keep the touch sub-system running on normal power mode.
  • FIGURE 4 depicts a flowchart which gives the steps which are followed for a typical initialization of the system.
  • the process starts with the system boot-up when the system reads the capabilities of the underlying hardware at step 101.
  • the system calls the operating system specific interface.
  • the driver will read the required information as per communication channel established by the touch panel and the host processor or board.
  • the I2C protocol could be used for the communication with the touch controller to get the capability to use power reduction and low power gesture support.
  • this configuration could also be configured in the device driver.
  • in that case, the driver will not communicate with the hardware or the firmware to get the capability and shall, in step 103, directly process the touch capability.
  • This information could be read once during system start-up at the firmware or hardware level, or each time the system needs it, as can be seen in step 104. This depends on the implementation and optimization.
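The boot-time capability read described above (Figure 4, steps 101-104) can be sketched as follows. This is an illustrative Python simulation, not the patent's implementation: the register address, bit masks and controller address are hypothetical stand-ins for vendor-specific values a real driver would take from the touch IC's datasheet.

```python
# Sketch of the boot-time capability read (Figure 4, steps 101-104).
# All register addresses, bit masks and names are hypothetical.

CAP_REGISTER = 0x10      # hypothetical capability register
CAP_LOW_POWER = 0x01     # bit: controller supports a low-power mode
CAP_LP_GESTURES = 0x02   # bit: gestures can still be reported in low power

class I2CBus:
    """Stand-in for an I2C transaction with the touch controller."""
    def __init__(self, registers):
        self.registers = registers

    def read_byte(self, controller_addr, reg):
        # A real driver would perform an I2C read here.
        return self.registers.get(reg, 0)

def read_touch_capabilities(bus, controller_addr=0x38):
    """Steps 101-103: query the controller once at boot and cache the result."""
    raw = bus.read_byte(controller_addr, CAP_REGISTER)
    return {
        "low_power_mode": bool(raw & CAP_LOW_POWER),
        "low_power_gestures": bool(raw & CAP_LP_GESTURES),
    }

# A controller that advertises both capabilities:
bus = I2CBus({CAP_REGISTER: CAP_LOW_POWER | CAP_LP_GESTURES})
caps = read_touch_capabilities(bus)
```

Whether the result is cached at boot or re-read on demand (step 104) is, as the text notes, an implementation and optimization choice.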
  • FIGURE 5 describes the steps of the method followed in a preferred embodiment of the present disclosure.
  • the process starts when the application is running in the normal mode, in a preferred embodiment, in full screen mode, with the controls in step 201. In another embodiment, it also may not own the complete display screen i.e. may not be in full screen.
  • step 202 it then changes its state from the normal working mode, to the full screen mode without any user interface control.
  • the application can call the system level API, in step 203, to configure the touch panel to go to a low power state and support one or more gestures supported in low power mode. This also provides the ability to select which touch gestures remain supported, including but not limited to single touch, multi-touch, swipe and tap.
  • the system will validate the application request and it will pass it on to the lower layer which is the exposed OS interface.
  • the touch driver will create a request packet and transmit it to the hardware. After processing the touch driver will provide the result to the upper layer, and application will accordingly get a response to its request.
  • the low power mode of the touch subsystem, in a preferred embodiment, is based on the touch IC used. Different IC vendors use different mechanisms to work in the low power mode. These mechanisms are then used in step 204 to lower the power mode of the touch subsystem.
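The request path of Figure 5 (application → system-level API → exposed OS interface → driver → hardware) can be condensed into the following sketch. The layer names, packet fields and validation rule are assumptions for illustration, not the patent's actual API.

```python
# Sketch of the low-power request path (Figure 5, steps 201-204).
# Layer names and packet fields are illustrative only.

LOW_POWER, NORMAL = "low_power", "normal"

class TouchDriver:
    """Lowest software layer: builds a request packet for the hardware."""
    def __init__(self):
        self.mode = NORMAL

    def transmit(self, packet):
        # A real driver would send this over I2C/SPI; here we just apply it
        # and report the result back to the upper layer.
        self.mode = packet["mode"]
        return {"status": "ok", "mode": self.mode}

class System:
    """OS layer: validates the application request and forwards it."""
    def __init__(self, driver, foreground_app):
        self.driver = driver
        self.foreground_app = foreground_app

    def set_touch_power(self, app, mode, gestures=("tap",)):
        # Validate the request: only the foreground application may
        # reconfigure the touch panel (an assumed policy).
        if app != self.foreground_app:
            return {"status": "denied"}
        packet = {"mode": mode, "gestures": list(gestures)}
        return self.driver.transmit(packet)   # step 204: lower the mode

driver = TouchDriver()
system = System(driver, foreground_app="video_player")
resp = system.set_touch_power("video_player", LOW_POWER, gestures=("tap",))
```

The restore path of Figure 6 is the mirror image: the same call with the normal mode requested.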
  • Figure 6 shows the steps involved in a preferred embodiment of a typical request for putting the touch subsystem in the normal power mode.
  • the application is changing its state from the full screen mode without any user interface control to the normal working mode where it does have user interface control in a preferred embodiment, and in another embodiment, the application does not own the complete display screen.
  • the process starts when the application is running in a preferred embodiment, in full screen mode, without the controls in step 301. In step 302, it then changes its state to normal mode in a preferred embodiment and in another embodiment in full screen mode with controls.
  • the application can call the system level API to configure the touch panel to work in normal mode in step 303.
  • the system will validate the application request and will pass it to the lower layer, i.e. the exposed OS interface.
  • the touch driver will then create a request packet and transmit it to the hardware in step 304. After processing, the touch driver will provide the result to the upper layer and the application shall get a response to its request.
  • FIGURE 7 describes an embodiment of the process followed by the present disclosure when the system triggers the touch subsystem normal power mode based on a system event or any user input other than touch.
  • an alarm gets triggered and the device therefore has to show the alarm application’s interface and controls.
  • Another example can be when a call gets received on the cellular device.
  • the system calls the exposed operating system interface to disable the low power mode of the touch sub-system in step 402.
  • the touch driver then creates a request packet and transmits it to the hardware in step 403. After processing, the touch driver provides the result to the upper layer and the application gets a response to the request.
  • FIGURE 8 shows the flow in a preferred embodiment when the touch subsystem detects a touch and changes its mode to normal mode.
  • the user touches the touch screen to operate the running application. For example, if the user wants to pause a running video, then the user will need to touch the touch screen to operate the video playback controls which will initially be hidden. After the first touch, the user will have access to the interface controls and can operate them as normal.
  • the hardware shall notify the application.
  • the touch controller notifies the operating system or driver using interrupt or polling mechanism which is used for notification in step 502.
  • the driver responds to the event and changes the mode of the touch subsystem to the normal mode, if required.
  • the driver notifies the system with the touch event in step 503 and the system sends this event to the foreground application in step 504.
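The distinguishing behaviour of Figure 8 — a touch detected in low power mode is forwarded to the foreground application without automatically restoring normal power — can be condensed into a small illustrative function. The names and return convention are hypothetical:

```python
# Sketch of the touch-event flow (Figure 8, steps 501-504). The key point
# of the disclosure: a touch in low power mode is delivered to the
# foreground application without necessarily raising the power mode.

def on_touch_interrupt(app_requested_low_power):
    """Steps 502-504: the controller notifies the driver (interrupt or
    polling); the driver raises the power mode only if required, then the
    event is delivered to the foreground application.

    Returns (driver_power_mode_after, event_delivered).
    """
    if app_requested_low_power:
        # e.g. a reader or video player that only needs "a touch happened"
        # in order to redisplay its hidden controls: stay in low power.
        new_mode = "low_power"
    else:
        # Conventional behaviour: restore normal power on any touch.
        new_mode = "normal"
    return new_mode, "touch_event"

mode, event = on_touch_interrupt(app_requested_low_power=True)
```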
  • FIGURE 9 shows a basic flowchart depicting the core functioning of the present disclosure.
  • the process gets initiated at step 601 at detecting the absence of controls for user interaction on the display for the application presently running on the device.
  • this application when opened does not cover the complete display and may also display the controls for user interaction. Even if it does cover the complete screen, the controls for user interaction are visible.
  • the application thereafter covers the complete display and/or hides the user controls.
  • the first touch event is for operating the running application by bringing the controls to the foreground. Therefore, at step 602, the procedure proceeds to displaying the user interaction controls on the touch panel when the first touch event happens.
  • this changing of the power level of the touch panel comprises reducing the power level to a level that supports only basic gestures.
  • the power level of the touch panel may initially be at a normal mode in which the touch panel is able to detect the position of the touch, the frequency or path of the touch and whether the touch is a tap or a swipe. Thereafter, the power level of the touch panel is reduced such that the touch panel can detect only the presence of a touch and not the type, frequency, path or position of the touch.
  • the step of changing the power level of the touch panel depends on the touch IC used.
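The two power levels and their detection capabilities described above might be modelled as a simple capability table. This is illustrative only; the exact split is IC-dependent, as the text notes:

```python
# Illustrative mapping of what the touch panel can report in each power
# level. In normal mode the panel resolves position, path, frequency and
# gesture type; in low power mode only the presence of a touch.

POWER_LEVEL_CAPABILITIES = {
    "normal": {"presence", "position", "path", "frequency", "gesture_type"},
    "low_power": {"presence"},   # only "a touch happened", nothing more
}

def can_detect(level, attribute):
    """True if the given power level can report the given touch attribute."""
    return attribute in POWER_LEVEL_CAPABILITIES[level]
```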
  • the operating system may receive a request or alert or trigger from another application in the background to come into the foreground.
  • this application is first brought to the foreground and the user controls are displayed.
  • the power level of the touch at that point shall be such that the controls can be accessed using the touch screen.
  • FIGURE 10 shows a basic system diagram depicting the core structure of the present disclosure.
  • the structure comprises one or more display screens 701 which are configured to display one or more applications and one or more controls pertaining to these applications. These one or more display screens 701 are linked to one or more touch panels 702 which are configured to receive and process one or more touch events.
  • detection means 703 which are connected to the display screen 701 and are configured to detect the presence or absence of controls of an application on the display screen 701 for user interaction.
  • power control means 704 which are connected to the detection means 703 and touch panels 702 and are configured to change the power levels of the touch panels 702.
  • the display screen 701 is configured to initially display the application with its user controls and may occupy just a portion of the display screen 701. In another embodiment, the display screen 701 is configured to display the application using the complete display screen with the user controls hidden.
  • the power level of the touch panel 702 is initially at normal mode, in which the touch panel 702 is able to detect whether the touch is a tap or a swipe as well as the position of the touch and also the frequency or path of the touch.
  • the power control means 704 are configured to reduce the power level of a touch panel to a level that supports only basic gestures and in an embodiment may do so using the touch IC. In such a case the touch panel 702 can detect only the presence of a touch and not the type, frequency, path or position of the touch.
  • the system comprises receiving means 705 for receiving a request from another application to come into the foreground of the display.
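The Figure 10 structure (701-705) can be sketched as cooperating objects. The class names mirror the "means" in the text, while the behaviour shown is purely illustrative:

```python
# Sketch of the Figure 10 structure: display screen 701, touch panel 702,
# detection means 703, power control means 704. Purely illustrative.

class TouchPanel:                       # 702: receives touch events
    def __init__(self):
        self.power_level = "normal"

class DetectionMeans:                   # 703: watches the display for controls
    def __init__(self):
        self.controls_visible = True

    def controls_absent(self):
        return not self.controls_visible

class PowerControlMeans:                # 704: changes the panel's power level
    def __init__(self, panel, detector):
        self.panel = panel
        self.detector = detector

    def update(self):
        # Reduce power only while no user controls are on the display.
        if self.detector.controls_absent():
            self.panel.power_level = "low_power"
        else:
            self.panel.power_level = "normal"

panel, detector = TouchPanel(), DetectionMeans()
power_ctl = PowerControlMeans(panel, detector)
detector.controls_visible = False       # the application hides its controls
power_ctl.update()
```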
  • the present disclosure may be embodied as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects, all generally referred to herein as a "circuit" or "module". Furthermore, the present disclosure may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
  • Instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
  • Instructions may also be loaded onto a computer or other programmable data processing apparatus like a scanner/check scanner to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the function(s) noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.

Abstract

The device enables a user to use it with lower power consumption, since the touch subsystem consumes less power when it is not being used. The device shall also be able to continuously play back video much longer than before. It also allows for optimal battery usage and reduced touch subsystem consumption. The device reduces the power consumption when the application has taken complete control over the display. It also provides additional video playback time.

Description

METHOD FOR REDUCING POWER CONSUMPTION IN TOUCH SCREEN DEVICE AND A DEVICE THEREOF
TECHNICAL FIELD
The present disclosure generally relates to reducing power consumption in devices with a touch screen. Specifically it relates to dynamic power control of the touch subsystem based on the foreground user interface controls.
BACKGROUND
The present disclosure is applicable for devices with display and touch panels such as electronic portable devices or communication devices.
The problem in conventional touch screen devices is that there is wastage and unnecessary consumption of power by the touch screen when there are no controls on the display or the touch screen to guide the user on where and how to touch a specific area to use or activate a particular function. In such a scenario touch sensing is not required to an extensive extent.
There exists certain technology to reduce power consumption in touch screen devices, for example, advanced touch ICs that can change the power mode to a low power state. However, these are controlled directly in the touch panel's controller IC, i.e. the controller decides when the touch panel goes into a suspended state, normally based on an idle timeout set in firmware. Therefore the application which is being shown on the touch screen has no role to play.
The problem in these existing technologies is that the controller in the firmware decides the time interval for the suspended state (idle timeout). However, running applications which are using the touch screen and are on display cannot decide these aspects. Therefore, for example, applications such as a book reader cannot use the low power states supported by the controller. On each touch of the user, the touch power is turned to the maximum.
US 20100265209 relates to determining areas within the touch screen, where a user input is possible. To reduce power consumption of a device with a  touch screen, the touch detection for sensing a user input is only activated within the determined areas, where a user input is possible.
US 20060284857 describes a power-saving function for a touch screen device that provides efficient use of power, such as from a battery of a portable device. The touch screen device may include a sampling unit for digitizing an input to the touch screen. The power-saving function is provided with a battery level-monitoring unit for monitoring a battery level, and a sampling rate controller for controlling a sampling rate of the sampling unit based on the battery level monitored by the battery level-monitoring unit.
WO 2009071123 relates to determining areas within the touch screen where user input is possible. To reduce power consumption of a device with a touch screen, the touch detection for sensing a user input is only activated within the determined areas, where a user input is possible.
EP 2556424 describes embodiments of a device having a touchscreen panel and a method for reducing its power consumption. The touch-sensing capability of the touchscreen panel is activated when the user interaction with the touchscreen panel is detected. The touch-sensing capability of the touchscreen panel is deactivated when no user interaction is detected. In some embodiments, the device includes a touch-event detector to sense motion or vibration, and the processing circuitry either activates or deactivates the touch-sensing capability of the touchscreen panel based at least in part on input from the touch-event detector.
SUMMARY
The present disclosure enables a user to use the portable device with lower power consumption, since the touch subsystem consumes less power when it is not being used. The device shall also be able to continuously play back video much longer than before. It also allows for optimal battery usage and reduced touch subsystem consumption. The disclosure reduces the power consumption when the application has taken complete control over the display area. It also provides additional video playback time.
The present disclosure bases its operation on the fact that if there is no control on the display the touch power can be put in low power mode to save power consumption. The running application can request and set power states and the touch power will increase only when required and not as soon as any touch is detected. For example, if an application and the controller are in a low power state or mode and a user gesture is detected, then the reaction is not that the normal power is restored. Instead, the application is notified of the gesture and the application and the controller continue to work in the low power mode.
It is a primary objective of the present disclosure to obviate the aforementioned drawbacks and provide a method for reducing power consumption in devices with a touch panel, comprising the steps of detecting the absence of controls in an application on the display for user interaction, displaying the user interaction controls on the touch panel on the first touch event; and changing the power level of the touch panel such that the power level is optimum to only detect a touch event.
It is another primary objective of the present disclosure to provide a device for reducing power consumption by touch events, comprising one or more display screens for displaying one or more applications and one or more user controls pertaining to the applications, one or more touch panels for receiving and processing one or more touch events, detection means for detecting the absence of controls in an application on the display for user interaction; and power control means for changing the power levels of the touch panel.
BRIEF DESCRIPTION OF DRAWINGS
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit (s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the drawings to reference like features and components.
Figure 1 denotes an overview of a preferred embodiment of the present disclosure on a portable device.
Figure 2 shows the controls using which the user interacts with the running  application in a preferred embodiment of the present disclosure.
Figure 3 depicts an example in which the application takes control over the complete display screen.
Figure 4 depicts a flowchart which gives the steps which are followed for a typical initialization of the system.
Figure 5 describes the steps of the method followed in a preferred embodiment of the present disclosure.
Figure 6 shows the steps involved in a preferred embodiment of a typical request for putting the touch subsystem in the normal power mode.
Figure 7 describes an embodiment of the process followed by the present disclosure when the system triggers the touch subsystem normal power mode based on a system event or any user input other than touch.
Figure 8 shows the flow in a preferred embodiment when the touch subsystem detects a touch and changes its mode to normal mode.
Figure 9 shows a basic flowchart depicting the core functioning of the present disclosure.
Figure 10 shows a basic system diagram depicting the core structure of the present disclosure.
Detailed Description of the Drawings
The following discussion provides a brief, general description of a suitable computing environment in which various embodiments of the present disclosure can be implemented. The aspects and embodiments are described in the general context of computer executable mechanisms such as routines executed by a handheld device, e.g. a mobile phone, a personal digital assistant, a cellular device, a tablet, etc. The embodiments described herein can be practiced with other system configurations, including Internet appliances, hand held devices, multi-processor systems, microprocessor based or programmable consumer electronics, network PCs, mini computers, mainframe computers and the like. The embodiments can be embodied in a special purpose computer or data processor that is specifically programmed, configured or constructed to perform one or more of the computer executable mechanisms explained in detail below.
Exemplary embodiments now will be described with reference to the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey its scope to those skilled in the art. The terminology used in the detailed description of the particular exemplary embodiments illustrated in the accompanying drawings is not intended to be limiting. In the drawings, like numbers refer to like elements.
The specification may refer to “an” , “one” or “some” embodiment (s) in several locations. This does not necessarily imply that each such reference is to the same embodiment (s) , or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments.
As used herein, the singular forms “a” , “an” and “the” are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms “includes” , “comprises” , “including” and/or “comprising” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. Furthermore, “connected” or “coupled” as used herein may include wirelessly connected or coupled. As used herein, the term “and/or” includes any and all combinations and arrangements of one or more of the associated listed items.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted  as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The figures depict a simplified structure only showing some elements and functional entities, all being logical units whose implementation may differ from what is shown. The connections shown are logical connections; the actual physical connections may be different. It is apparent to a person skilled in the art that the structure may also comprise other functions and structures. It should be appreciated that the functions, structures, elements and the protocols used in communication are irrelevant to the present disclosure. Therefore, they need not be discussed in more detail here.
In addition, all logical units described and depicted in the figures include the software and/or hardware components required for the unit to function. Further, each unit may comprise within itself one or more components, which are implicitly understood. These components may be operatively coupled to each other and be configured to communicate with each other to perform the function of the said unit.
In a preferred embodiment, the running application may take complete control of the display in full screen mode. In this mode, only the application running in the foreground responds to the touch screen. For example, when a video is played on a portable touch-based video player, there is generally a time-out period after which the controls on the interface disappear and only the video appears on the full screen. Since there are no controls on the user interface, the first touch causes the controls to reappear on the interface. When there are no controls on the interface, the location of the touch event on the touch screen is of no significance, as a touch at any position leads to the same consequence, namely displaying the hidden controls. Therefore, at this time, the power of the touch screen can be reduced to a level at which it can merely detect the presence of a touch, irrespective of its position and without multi-touch support. Thus, power can be reduced without reducing functionality. This applies to any application which hides the controls from the interface.
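The behaviour described above can be illustrated with a minimal sketch. All names here are hypothetical and are not part of the disclosure: a player hides its controls after a time-out and, while they are hidden, requests a reduced touch power level that only reports that a touch occurred.

```python
class TouchPanel:
    """Hypothetical touch subsystem with two power levels."""
    NORMAL = "normal"   # position, multi-touch and gestures reported
    LOW = "low"         # only the presence of a touch is reported

    def __init__(self):
        self.mode = self.NORMAL

    def set_mode(self, mode):
        self.mode = mode


class VideoPlayer:
    """Sketch of a full-screen player that hides its controls."""

    def __init__(self, panel):
        self.panel = panel
        self.controls_visible = True

    def on_controls_timeout(self):
        # Controls disappear; any touch now has the same meaning,
        # so the panel can run at reduced power.
        self.controls_visible = False
        self.panel.set_mode(TouchPanel.LOW)

    def on_touch(self):
        # The first touch only needs to bring the controls back.
        if not self.controls_visible:
            self.controls_visible = True
            self.panel.set_mode(TouchPanel.NORMAL)


panel = TouchPanel()
player = VideoPlayer(panel)
player.on_controls_timeout()
assert panel.mode == TouchPanel.LOW
player.on_touch()
assert panel.mode == TouchPanel.NORMAL and player.controls_visible
```

The point of the sketch is only the invariant: the panel runs at reduced power exactly while the controls are hidden, so no functionality is lost.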
FIGURE 1 denotes an overview of a preferred embodiment of the present disclosure on a portable device. This figure mainly concerns the display (1) and the touch screen (4) of the portable device. In this embodiment, the touch screen (4) is placed on top of the display panel (1). A running application is displayed on the display (1) and shows the controls for the user to interact with.
FIGURE 2 shows the controls through which the user interacts with the running application in a preferred embodiment of the present disclosure. The controls shown on the display (1) are handled by the foreground application.
FIGURE 3 depicts an example in which the application takes control over the complete display (1). In these circumstances, only the foreground application reacts to any user input through the touch screen (4). For instance, on most video players on portable devices, the controls are automatically hidden after a pre-decided or set time period, or upon a certain event. When the controls are hidden in this manner, the next touch gesture is treated simply as a touch event that brings the controls from the background into the foreground. Therefore, while the device is in the state depicted in FIGURE 3, the next touch gesture is only required to bring the controls to the foreground, and there is no need to keep the touch subsystem running in the normal power mode. In the normal power mode, power is consumed to support multiple gestures and to keep the touch screen sensitive enough to detect multiple touches, which is not required in this case. In this situation, the only requirement is that the application should be able to decide whether to use all touch-based events, a specific touch event, or just any touch event.
FIGURE 4 depicts a flowchart of the steps followed for a typical initialization of the system. The process starts at system boot-up, when the system reads the capabilities of the underlying hardware at step 101. For this purpose, the system calls the operating-system-specific interface. To obtain the touch subsystem capability, the driver reads the required information over the communication channel established between the touch panel and the host processor or board. For example, in a preferred embodiment, the I2C protocol could be used to communicate with the touch controller to determine whether power reduction and low-power gesture support are available.
In another preferred embodiment, as depicted in step 102, this capability could instead be configured statically in the device driver. In this case, the driver does not communicate with the hardware or the firmware to obtain the capability and directly processes the touch capability in step 103. This information could be read once during system start-up at the firmware or hardware level, or each time the system needs it, as shown in step 104, depending on the implementation and optimization.
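The two capability paths of steps 101-104 might be modelled as follows. This is a sketch under stated assumptions: the register address and capability bits are invented for illustration, and the bus callback stands in for whatever channel (e.g. I2C) the platform establishes.

```python
# Hypothetical capability bits, as a touch controller might report them.
CAP_POWER_REDUCTION = 0x01
CAP_LOW_POWER_GESTURES = 0x02

def read_touch_capability(bus_read=None, static_caps=None):
    """Steps 101-103: obtain the touch subsystem capability.

    bus_read    -- callable reading the capability register over the
                   established channel (e.g. I2C); None if unavailable.
    static_caps -- capability configured directly in the device driver
                   (step 102), used without querying the hardware.
    """
    if static_caps is not None:
        return static_caps       # step 102: driver-configured
    if bus_read is not None:
        return bus_read(0x00)    # step 101: register 0x00 is invented
    return 0                     # no capability known

# Step 104: the result may be read once at start-up from the hardware ...
caps = read_touch_capability(
    bus_read=lambda reg: CAP_POWER_REDUCTION | CAP_LOW_POWER_GESTURES)
assert caps & CAP_POWER_REDUCTION
# ... or configured statically, bypassing the hardware entirely.
assert read_touch_capability(static_caps=CAP_POWER_REDUCTION) == CAP_POWER_REDUCTION
```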
FIGURE 5 describes the steps of the method followed in a preferred embodiment of the present disclosure. The process starts when the application is running in the normal mode, in a preferred embodiment in full screen mode, with the controls shown, in step 201. In another embodiment, the application may not own the complete display screen, i.e. may not be in full screen. In step 202, it then changes its state from the normal working mode to the full screen mode without any user interface controls. In this state, the application can call the system-level API, in step 203, to configure the touch panel to enter the low power state and to support one or more gestures available in low power mode. The application may also select which touch events to receive, including but not limited to single touch, multi-touch, swipe and tap. The system validates the application's request and passes it on to the lower layer, which is the exposed OS interface.
The touch driver creates a request packet and transmits it to the hardware. After processing, the touch driver provides the result to the upper layer, and the application accordingly receives a response to its request. The low power mode of the touch subsystem, in a preferred embodiment, depends on the touch IC used; different IC vendors use different mechanisms to operate in low power mode. These mechanisms are then used, in step 204, to lower the power mode of the touch subsystem.
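The application-side request of steps 203-204 can be sketched as a call into a hypothetical system-level API. The class and flag names below are illustrative only; the disclosure does not define a concrete API surface.

```python
# Gesture flags the application may request in low power mode
# (names are illustrative, not defined by the disclosure).
TAP, SWIPE, SINGLE, MULTI = 1, 2, 4, 8

class TouchSystemAPI:
    """Hypothetical system-level API between application and touch driver."""

    def __init__(self, caps_low_power=True):
        self.caps_low_power = caps_low_power
        self.mode = "normal"
        self.gesture_mask = TAP | SWIPE | SINGLE | MULTI

    def request_low_power(self, gesture_mask):
        # Step 203: validate the application request before passing it
        # to the lower layer (the exposed OS interface).
        if not self.caps_low_power:
            return False
        # Step 204: the driver applies the IC-specific low power mechanism.
        self.mode = "low"
        self.gesture_mask = gesture_mask  # e.g. tap only
        return True

api = TouchSystemAPI()
assert api.request_low_power(TAP)   # application entered low power mode
assert api.mode == "low" and api.gesture_mask == TAP
```

The validation step matters: if the capability read at initialization reported no low-power support, the request is rejected and the panel stays in normal mode.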
FIGURE 6 shows the steps involved in a preferred embodiment of a typical request for putting the touch subsystem back into the normal power mode. In this case, the application changes its state from the full screen mode without any user interface controls to the normal working mode, in which it does have user interface controls in a preferred embodiment; in another embodiment, the application does not own the complete display screen. The process starts when the application is running, in a preferred embodiment, in full screen mode without the controls, in step 301. In step 302, it then changes its state to the normal mode in a preferred embodiment or, in another embodiment, to full screen mode with controls.
In this state, the application can call the system-level API to configure the touch panel to work in normal mode in step 303. The system validates the application's request and passes it to the lower layer, i.e. the exposed OS interface. The touch driver then creates a request packet and transmits it to the hardware in step 304. After processing, the touch driver provides the result to the upper layer, and the application receives a response to its request.
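The request packet the driver builds in step 304 is not specified by the disclosure; a minimal sketch with an invented byte layout (1-byte opcode, 1-byte mode, 1-byte gesture mask) could look like this:

```python
import struct

# Invented packet layout: opcode, power mode, gesture mask (one byte each).
OP_SET_POWER = 0x10
MODE_NORMAL, MODE_LOW = 0x00, 0x01

def build_power_request(mode, gesture_mask=0xFF):
    """Create the request packet the touch driver transmits to the
    hardware (steps 303-304); the byte layout here is illustrative."""
    return struct.pack("BBB", OP_SET_POWER, mode, gesture_mask)

pkt = build_power_request(MODE_NORMAL)
assert pkt == b"\x10\x00\xff"
assert len(pkt) == 3
```

A real driver would use whatever command format the touch IC vendor defines over the established channel (e.g. I2C register writes).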
FIGURE 7 describes an embodiment of the process followed when the system triggers the normal power mode of the touch subsystem based on a system event or any user input other than touch. This occurs at step 401, when the system detects a request from another application to come into the foreground, or when the presently running application terminates. As an example, an alarm is triggered and the device therefore has to show the alarm application's interface and controls. Another example is when a call is received on the cellular device. In this case, the system calls the exposed operating system interface to disable the low power mode of the touch subsystem in step 402. The touch driver then creates a request packet and transmits it to the hardware in step 403. After processing, the touch driver provides the result to the upper layer, and the application receives a response to its request.
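Steps 401-403 can be sketched as an event handler that forces the touch subsystem back to normal power on specific non-touch events. The event names and driver class are hypothetical:

```python
class TouchDriver:
    """Hypothetical driver that forwards power requests to hardware."""

    def __init__(self):
        self.mode = "low"
        self.sent_packets = []

    def set_normal(self):
        # Step 403: a request packet is transmitted to the hardware.
        self.sent_packets.append(("set_power", "normal"))
        self.mode = "normal"

def on_system_event(driver, event):
    """Steps 401-402: a non-touch event (alarm, incoming call,
    foreground request, app exit) disables the low power mode."""
    if event in ("alarm", "incoming_call", "foreground_request", "app_exit"):
        driver.set_normal()

drv = TouchDriver()
on_system_event(drv, "incoming_call")
assert drv.mode == "normal"
assert drv.sent_packets == [("set_power", "normal")]
```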
FIGURE 8 shows the flow in a preferred embodiment when the touch subsystem detects a touch and changes its mode to the normal mode. At step 501, the user touches the touch screen to operate the running application. For example, if the user wants to pause a running video, the user needs to touch the touch screen to operate the video playback controls, which will initially be hidden. After the first touch, the user has access to the interface controls and can operate them as normal.
In such a case, the hardware notifies the application. When the user touches the screen, the touch controller notifies the operating system or driver using an interrupt or polling mechanism in step 502. The driver responds to the event and changes the mode of the touch subsystem to the normal mode, if required. The driver notifies the system of the touch event in step 503, and the system sends this event to the foreground application in step 504.
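The interrupt-driven flow of steps 501-504 reduces to a small handler: restore normal power on the first touch, then forward the event upward. The sketch below uses invented names to model that flow:

```python
class TouchSubsystem:
    """Sketch of the flow of FIGURE 8; names are illustrative."""

    def __init__(self):
        self.mode = "low"
        self.delivered = []

    def on_touch_interrupt(self):
        # Step 502: the controller notified the driver (interrupt/polling).
        if self.mode == "low":
            self.mode = "normal"      # restore full sensitivity if required
        self.notify_system("touch")   # step 503

    def notify_system(self, event):
        # Step 504: the system forwards the event to the foreground app,
        # which reacts by showing its hidden controls.
        self.delivered.append(event)

ts = TouchSubsystem()
ts.on_touch_interrupt()
assert ts.mode == "normal" and ts.delivered == ["touch"]
```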
FIGURE 9 shows a basic flowchart depicting the core functioning of the present disclosure. The process is initiated at step 601 upon detecting the absence of controls for user interaction on the display for the application presently running on the device. In a preferred embodiment, this application, when opened, does not cover the complete display and may also display the controls for user interaction; even if it does cover the complete screen, the controls for user interaction are visible. Thereafter, in another preferred embodiment, upon a specified event or the activation of a button, the application covers the complete display and/or hides the user controls. In a preferred embodiment, the first touch event serves to operate the running application by bringing the controls to the foreground. Therefore, at step 602, the procedure proceeds to display the user interaction controls on the touch panel when the first touch event happens.
Lastly, the procedure concludes at step 603 by changing the power level of the touch panel such that the power level is optimum only to detect a touch event. In a preferred embodiment, this changing of the power level of the touch panel comprises reducing the power level to a level that supports only basic gestures. The power level of the touch panel may initially be at a normal mode in which the touch panel is able to detect the position of the touch, the frequency or path of the touch, and whether the touch is a tap or a swipe. Thereafter, the power level of the touch panel is reduced such that the touch panel can detect only the presence of a touch and not the type, frequency, path or position of the touch. In a preferred embodiment, the step of changing the power level of the touch panel depends on the touch IC used.
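The core decision of steps 601-603 can be condensed into a single pure function. This is a sketch of the decision logic only, with invented return labels, not an implementation of any particular touch IC mechanism:

```python
def core_method(controls_visible, first_touch):
    """Sketch of steps 601-603: returns (show_controls, panel_power).

    controls_visible -- whether user-interaction controls are on screen
    first_touch      -- whether a touch event has just occurred
    """
    if controls_visible:
        return (True, "normal")        # nothing to optimize (step 601 fails)
    if first_touch:
        return (True, "normal")        # step 602: show the controls again
    return (False, "presence_only")    # step 603: detect only a touch event

# Controls hidden, no touch yet: panel drops to presence-only detection.
assert core_method(controls_visible=False, first_touch=False) == (False, "presence_only")
# First touch arrives: controls come back and full power is restored.
assert core_method(controls_visible=False, first_touch=True) == (True, "normal")
```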
In a preferred embodiment, the operating system may receive a request or alert or trigger from another application in the background to come into the foreground. In such a case, this application is first brought to the foreground and the user controls are displayed. The power level of the touch at that point shall be such that the controls can be accessed using the touch screen.
FIGURE 10 shows a basic system diagram depicting the core structure of the present disclosure. The structure comprises one or more display screens 701 configured to display one or more applications and one or more controls pertaining to these applications. These one or more display screens 701 are linked to one or more touch panels 702 configured to receive and process one or more touch events. Detection means 703 are connected to the display screen 701 and are configured to detect the presence or absence of controls of an application on the display screen 701 for user interaction. Finally, power control means 704 are connected to the detection means 703 and the touch panels 702 and are configured to change the power levels of the touch panels 702.
In a preferred embodiment, the display screen 701 is configured to initially display the application with its user controls and may occupy just a portion of the display screen 701. In another embodiment, the display screen 701 is configured to display the application using the complete display screen with the user controls hidden.
In a preferred embodiment, the power level of the touch panel 702 is initially at the normal mode, in which the touch panel 702 is able to detect whether the touch is a tap or a swipe, as well as the position of the touch and the frequency or path of the touch. In a preferred embodiment, the power control means 704 are configured to reduce the power level of a touch panel to a level that supports only basic gestures, and in an embodiment may do so via the touch IC. In such a case, the touch panel 702 can detect only the presence of a touch and not the type, frequency, path or position of the touch.
In another embodiment, the system comprises receiving means 705 for receiving a request from another application to come into the foreground of the display.
As will be appreciated by one of skill in the art, the present disclosure may be embodied as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects, all generally referred to herein as a “circuit” or “module”. Furthermore, the present disclosure may take the form of a computer program product on a computer-usable storage medium having computer-usable program code embodied in the medium.
Furthermore, the present disclosure was described in part above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure.
It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which  implement the function/act specified in the flowchart and/or block diagram block or blocks.
Instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and schematic diagrams illustrate the architecture, functionality, and operations of some embodiments of methods, systems, and computer program products for reducing power consumption in a touch screen device. In this regard, each block may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in other implementations, the function(s) noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.
In the drawings and specification, there have been disclosed exemplary embodiments of the disclosure. Although specific terms are employed, they are used in a generic and descriptive sense only and not for purposes of limitation, the scope of the disclosure being defined by the following claims.

Claims (24)

  1. A method for reducing power consumption in devices with a touch panel, comprising the steps of:
    - Detecting the absence of controls in an application on the display for user interaction;
    - Displaying the user interaction controls on the touch panel on the first touch event; and
    - Changing the power level of the touch panel such that the power level is optimum to only detect a touch event.
  2. A method for reducing power consumption as claimed in claim 1, wherein the step of changing the power level of the touch panel comprises reducing the power level to a level which supports only basic gestures.
  3. A method for reducing the power consumption as claimed in claim 1 or 2, wherein the step of changing the power level of the touch panel depends on the touch IC used.
  4. A method for reducing the power consumption as claimed in claim 1, wherein the application on the display initially has displayed user controls and/or does not cover the complete display.
  5. A method for reducing the power consumption as claimed in claim 4, wherein the application thereafter covers the complete display and/or does not display any user control.
  6. A method for reducing the power consumption as claimed in any of claims 4 or 5, wherein the power level of the touch panel is initially at normal mode.
  7. A method for reducing the power consumption as claimed in claim 6, wherein the power level at normal mode is able to detect whether a touch is a tap or a swipe.
  8. A method for reducing the power consumption as claimed in claim 7, wherein the power level at normal mode is able to detect the position of the touch.
  9. A method for reducing the power consumption as claimed in any of claims 7 or 8, wherein the power level at normal mode is able to detect the frequency or path of the touch.
  10. A method for reducing the power consumption as claimed in any of claims 1 or 2, wherein the power level of the touch panel is reduced such that the touch panel can detect only the presence of a touch and not the type, frequency, path or position of the touch.
  11. A method for reducing the power consumption as claimed in claim 1, comprising the step of receiving a request from another application to come in the foreground of the display before the step of changing the power level of the touch panel such that the power level is optimum to only detect a touch event.
  12. A method for reducing the power consumption as claimed in claim 1, wherein the first touch event is for operating the running application.
  13. A device for reducing power consumption by touch events, comprising:
    - One or more display screens for displaying one or more applications and one or more user controls pertaining to the applications;
    - One or more touch panels for receiving and processing one or more touch events;
    - Detection means for detecting the absence of controls in an application on the display for user interaction; and
    - Power control means for changing the power levels of the touch panel.
  14. A device for reducing power consumption as claimed in claim 13, wherein the power control means are configured to reduce the power level of the touch panel to a level which supports only basic gestures.
  15. A device for reducing the power consumption as claimed in claim 13 or 14, wherein the power control means are configured to reduce the power level of the touch panel by accessing the touch IC.
  16. A device for reducing the power consumption as claimed in claim 13, wherein the display screen is configured to initially display an application with user controls and configured to display the application using only a portion of the display screen.
  17. A device for reducing the power consumption as claimed in claim 16, wherein the display screen is configured to display the application using the complete display screen and configured to hide the user controls.
  18. A device for reducing the power consumption as claimed in claim 13, wherein the power level of the touch panel is initially at normal mode.
  19. A device for reducing the power consumption as claimed in claim 18, wherein the power level at normal mode is able to detect whether a touch is a tap or a swipe.
  20. A device for reducing the power consumption as claimed in claim 18, wherein the power level at normal mode is able to detect the position of the touch.
  21. A device for reducing the power consumption as claimed in any of claims 19 or 20, wherein the power level at normal mode is able to detect the frequency or path of the touch.
  22. A device for reducing the power consumption as claimed in any of claims 13 or 14, wherein the power control means are configured to change the power level of the touch panel such that the touch panel can detect only the presence of a touch and not the type, frequency, path or position of the touch.
  23. A device for reducing the power consumption as claimed in claim 13, comprising receiving means for receiving a request from another application to come into the foreground of the display.
  24. A device for reducing the power consumption as claimed in claim 13, wherein the first touch event is for operating the running application.
PCT/CN2016/096267 2015-08-24 2016-08-22 Method for reducing power consumption in touch screen device and a device thereof WO2017032292A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680048381.8A CN107924251B (en) 2015-08-24 2016-08-22 Method and device for reducing power consumption of touch screen device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN4436/CHE/2015 2015-08-24
IN4436CH2015 2015-08-24

Publications (1)

Publication Number Publication Date
WO2017032292A1 true WO2017032292A1 (en) 2017-03-02

Family

ID=58099598

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/096267 WO2017032292A1 (en) 2015-08-24 2016-08-22 Method for reducing power consumption in touch screen device and a device thereof

Country Status (2)

Country Link
CN (1) CN107924251B (en)
WO (1) WO2017032292A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009071123A1 (en) * 2007-12-06 2009-06-11 Nokia Corporation Power reduction for touch screens
CN102214050A (en) * 2010-04-06 2011-10-12 英特尔公司 Device with capacitive touchscreen panel and method for power management
CN102819357A (en) * 2012-08-10 2012-12-12 江苏物联网研究发展中心 Low-power consumption light-thin type medium/small-sized sound wave touch screen
CN103558941A (en) * 2013-11-05 2014-02-05 广东欧珀移动通信有限公司 Method, device and mobile terminal for adjusting power consumption of touch screen

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101387527B1 (en) * 2007-12-06 2014-04-23 엘지전자 주식회사 Terminal and method for displaying menu icon therefor
US8384677B2 (en) * 2008-04-25 2013-02-26 Research In Motion Limited Electronic device including touch-sensitive input surface and method of determining user-selected input
US8751194B2 (en) * 2010-09-30 2014-06-10 Fitbit, Inc. Power consumption management of display in portable device based on prediction of user input
JP5914824B2 (en) * 2011-07-05 2016-05-11 パナソニックIpマネジメント株式会社 Imaging device
CN103294323A (en) * 2013-06-13 2013-09-11 敦泰科技有限公司 Method and device for touch detection and touch screen system
CN105630393B (en) * 2015-12-31 2018-11-27 歌尔科技有限公司 A kind of control method and control device of touch screen operating mode

Also Published As

Publication number Publication date
CN107924251B (en) 2021-01-29
CN107924251A (en) 2018-04-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16838551

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16838551

Country of ref document: EP

Kind code of ref document: A1