US11360528B2 - Apparatus and methods for thermal management of electronic user devices based on user activity - Google Patents


Info

Publication number
US11360528B2
US11360528B2 (application US16/728,774; US201916728774A)
Authority
US
United States
Prior art keywords
user
electronic device
fan
constraint
sensor
Prior art date
Legal status
Active, expires
Application number
US16/728,774
Other versions
US20200133358A1 (en)
Inventor
Columbia Mishra
Carin Ruiz
Helin Cao
Soethiha Soe
James Hermerding II
Bijendra Singh
Navneet Singh
Current Assignee
Intel Corp
Original Assignee
Intel Corp
Priority date
Filing date
Publication date
Application filed by Intel Corp
Priority to US16/728,774 (US11360528B2)
Assigned to Intel Corporation. Assignors: Hermerding, James, II; Soe, Soethiha; Singh, Bijendra; Singh, Navneet; Cao, Helin; Ruiz, Carin; Mishra, Columbia
Publication of US20200133358A1
Priority to CN202010964468.8A
Priority to EP20197335.1A
Priority to US17/732,173 (US11966268B2)
Application granted
Publication of US11360528B2
Active legal status
Adjusted expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/20 Cooling means
    • G06F1/206 Cooling means comprising thermal management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/3231 Monitoring the presence, absence or movement of users
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26 Power supply means, e.g. regulation thereof
    • G06F1/32 Means for saving power
    • G06F1/3203 Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234 Power saving characterised by the action undertaken
    • G06F1/324 Power saving characterised by the action undertaken by lowering clock frequency
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/30 Monitoring
    • G06F11/3058 Monitoring arrangements for monitoring environmental properties or parameters of the computing system or of the computing system component, e.g. monitoring of power, currents, temperature, humidity, position, vibrations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/165 Detection; Localisation; Normalisation using facial parts and geometric relationships
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/3827 Portable transceivers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/20 Indexing scheme relating to G06F1/20
    • G06F2200/201 Cooling arrangements using cooling fluid
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
    • G10L25/51 Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination

Definitions

  • This disclosure relates generally to electronic user devices and, more particularly, to apparatus and methods for thermal management of electronic user devices.
  • When an electronic user device (e.g., a laptop, a tablet) is in use, hardware components of the device, such as a processor, a graphics card, and/or a battery, generate heat.
  • Electronic user devices include one or more fans to promote airflow to cool the device during use and prevent overheating of the hardware components.
  • FIG. 1 illustrates an example system constructed in accordance with teachings of this disclosure and including an example user device and an example thermal constraint manager for controlling a thermal constraint of the user device.
  • FIG. 2 is a block diagram of an example implementation of the thermal constraint manager of FIG. 1 .
  • FIG. 3 illustrates example thermal constraints that may be implemented with the example user device of FIG. 1 .
  • FIG. 4 illustrates an example user device constructed in accordance with teachings of this disclosure and, in particular, illustrates the user device in a first configuration associated with a first thermal constraint of the user device.
  • FIG. 5 illustrates the example user device of FIG. 4 and, in particular, illustrates the user device in a second configuration associated with a second thermal constraint of the user device.
  • FIG. 6 is a flowchart representative of example machine readable instructions which may be executed to implement the example training manager of FIG. 2 .
  • FIGS. 7A and 7B are flowcharts representative of example machine readable instructions which may be executed to implement the example thermal constraint manager of FIGS. 1 and/or 2 .
  • FIG. 8 is a block diagram of an example processing platform structured to execute the instructions of FIG. 6 to implement the example training manager of FIG. 2 .
  • FIG. 9 is a block diagram of an example processing platform structured to execute the instructions of FIGS. 7A and 7B to implement the example thermal constraint manager of FIGS. 1 and/or 2 .
  • Descriptors “first,” “second,” “third,” etc. are used herein when identifying multiple elements or components which may be referred to separately. Unless otherwise specified or understood based on their context of use, such descriptors are not intended to impute any meaning of priority, physical order or arrangement in a list, or ordering in time but are merely used as labels for referring to multiple elements or components separately for ease of understanding the disclosed examples.
  • the descriptor “first” may be used to refer to an element in the detailed description, while the same element may be referred to in a claim with a different descriptor such as “second” or “third.” In such instances, it should be understood that such descriptors are used merely for ease of referencing multiple elements or components.
  • When an electronic user device (e.g., a laptop, a tablet) is in use, hardware components disposed in a body or housing of the device, such as a processor, graphics card, and/or battery, generate heat.
  • Heat generated by the hardware components of the user device can cause a temperature of one or more portions of an exterior surface, or skin, of the device housing to increase and become warm or hot to a user's touch.
  • the user device includes one or more fans to exhaust hot air generated within the body of the device and cool the device.
  • Some known electronic user devices are configured with one or more thermal constraints to control the temperature of the hardware components of the user device and/or of the skin of the device.
  • the thermal constraint(s) can define, for instance, a maximum temperature of a hardware component such as a processor to prevent overheating of the processor.
  • the thermal constraint(s) can define a maximum temperature of the skin of the device to prevent discomfort to a user touching and/or holding the device.
  • operation of the fan(s) of the user device and/or management of power consumed by the device are controlled based on the thermal constraint(s).
  • the rotational speed(s) (e.g., revolutions per minute (RPMs)) of the fan(s) can be adjusted based on the thermal constraint(s).
  • power consumption by one or more components of the device may be reduced to reduce the amount of heat generated by the component and, thus, the device.
  • the thermal constraint(s) define that a temperature of the skin of the device should not exceed, for instance, 45° C., to prevent user discomfort when the user is physically touching the device (e.g., typing on a keyboard of a laptop, scrolling on a touchscreen, etc.). Temperature of the skin of the device can be controlled by controlling power consumption of the hardware component(s) disposed within the device body to manage the amount of heat generated by the component(s) transferred to the skin of the device.
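The skin-temperature control described above can be sketched as a simple power governor. This is a hypothetical illustration rather than the patented implementation: the 45° C ceiling comes from the passage, but the power limits, step size, and 2° C headroom margin are assumed values.

```python
# Hypothetical skin-temperature governor. Only the 45 C ceiling is from the
# text; power limits, step size, and headroom margin are illustrative guesses.

MAX_SKIN_TEMP_C = 45.0  # default constraint while the user touches the device


def select_power_limit_w(skin_temp_c, current_limit_w,
                         min_limit_w=5.0, max_limit_w=28.0, step_w=1.0):
    """Lower the component power limit when the skin temperature reaches the
    constraint; raise it again once there is at least 2 C of headroom."""
    if skin_temp_c >= MAX_SKIN_TEMP_C:
        return max(min_limit_w, current_limit_w - step_w)
    if skin_temp_c < MAX_SKIN_TEMP_C - 2.0:
        return min(max_limit_w, current_limit_w + step_w)
    return current_limit_w
```

Running this each control tick against the latest skin temperature reading, and holding the limit steady inside the 2° C band, avoids oscillating between throttle and boost.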
  • thermal constraint(s) can affect performance of the user device.
  • some known user devices can operate in a high performance mode, or a mode that favors increased processing speeds over energy conservation (e.g., a mode in which processing speeds remain high for the duration that the device is in use, the screen remains brightly lit, and other hardware components do not enter power-saving mode when those components are not in use).
  • the processor consumes increased power to accommodate the increased processing speeds associated with the high performance mode and, thus, the amount of heat generated by the processor is increased.
  • a temperature of the skin of the user device can increase due to the increased amount of heat generated within the device housing.
  • the processor may operate at lower performance speeds to consume less power and, thus, prevent the skin of the device from exceeding the maximum skin temperature defined by the thermal constraint.
  • processing performance is sacrificed in view of thermal constraint(s).
  • Higher fan speeds can be used to facilitate cooling of hardware component(s) of a device to enable the component(s) to operate in, for instance, a high performance mode without exceeding the thermal constraint(s) for the hardware component(s) and/or the device skin.
  • operation of the fan(s) at higher speeds increases audible acoustic noise generated by the fan(s).
  • the fan speed(s) and, thus, the amount of cooling provided by the fan(s) are restricted to avoid generating fan noise above certain decibel levels.
  • Some known devices define fan noise constraints that set, for instance, a maximum noise level of 35 dBA during operation of the fan(s). As a result of the restricted fan speed(s), performance of the device may be limited to enable the fan(s) to cool the user device within the fan speed constraints.
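A fan acoustic constraint of this kind can be pictured as a cap on fan speed derived from a speed-to-noise calibration. The 35 dBA limit appears in the text; the RPM-to-noise table below is an invented calibration for illustration only, not measured data.

```python
# Invented RPM-to-noise calibration (not measured data), used to illustrate
# how an acoustic constraint caps fan speed.
FAN_NOISE_DBA = {2000: 25.0, 3000: 30.0, 4000: 35.0, 5000: 41.0}


def max_rpm_under_constraint(noise_limit_dba, calibration=FAN_NOISE_DBA):
    """Return the highest calibrated fan speed whose noise level stays at or
    below the acoustic constraint, or 0 if none qualifies."""
    allowed = [rpm for rpm, dba in calibration.items() if dba <= noise_limit_dba]
    return max(allowed) if allowed else 0
```

Under a 35 dBA constraint this calibration would cap the fan at 4000 RPM; relaxing the constraint (e.g., in a loud environment) raises the cap.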
  • cooling capabilities of the fan(s) of the device degrade over time due to dust accumulating in the fan(s) and/or heat sink.
  • Some known user devices direct the fan(s) to reverse airflow direction (e.g., as compared to the default airflow direction to exhaust hot air from the device) to facilitate heatsink and fan shroud cleaning, which helps to de-clog dust from the airflow path and maintain device performance over time.
  • operation of the fan(s) in the reverse direction increases audible acoustics generated by the fan(s), which can disrupt the user's experience with the device.
  • Although thermal constraint(s) are implemented in a user device to prevent discomfort to the user when the user is directly touching the device (e.g., physically touching one or more components of the device accessible via the exterior housing of the device, such as a keyboard and/or touchpad of a laptop, a touchscreen of a tablet, etc.), there are instances in which a temperature of the skin of the device can be increased without affecting the user's experience with the device. For instance, a user may view a video on the user device but not physically touch the user device; rather, the device may be resting on a table. In some instances, the user may interact with the user device via external accessories communicatively coupled to the device, such as an external keyboard and/or an external mouse.
  • In some instances, the user device is located in a noisy environment (e.g., a coffee shop, a train station). Additionally or alternatively, the user may be interacting with the user device while wearing headphones. In such instances, the amount of fan noise heard by the user is reduced because of the loud environment and/or the use of headphones. However, in known user devices, the rotational speed of the fan(s) of the device is maintained at a level that minimizes noise from the fan(s) regardless of the surrounding ambient noise levels and/or whether the user is wearing headphones.
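The masking effect just described, where a loud environment or headphones reduce how much fan noise the user actually hears, can be expressed as a small heuristic. The 5 dBA masking margin is an assumption for illustration, not a value from the patent.

```python
# Illustrative masking heuristic; the 5 dBA margin is an assumed value.
def fan_noise_masked(fan_dba, ambient_dba, headphones_on, margin_dba=5.0):
    """Treat fan noise as masked when the user wears headphones or the
    ambient level exceeds the fan level by at least the margin."""
    return headphones_on or ambient_dba >= fan_dba + margin_dba
```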
  • Examples disclosed herein use a multi-tier determination to control operation of fan(s) of the device and/or to adjust a performance level of the device and, thus, control heat generated by hardware component(s) of the device based on factors such as a presence of a user proximate to the device, user interaction(s) with the device (e.g., whether the user is using an on-board keyboard of the device or an external keyboard), and/or ambient noise levels in an environment in which the device is located.
  • Example user devices disclosed herein include sensors to detect user presence (e.g., proximity sensor(s), image sensor(s)), device configuration (e.g., sensor(s) to detect user input(s) received via an external keyboard, sensor(s) to detect device orientation), and/or conditions in the ambient environment in which the device is located (e.g., ambient noise sensor(s)). Based on the sensor data, examples disclosed herein determine whether a temperature of the skin of the device housing can be increased relative to a default thermal constraint, where the default thermal constraint corresponds to a skin temperature for the device when the user is directly touching the device (e.g., touching one or more components of the device accessible via the exterior housing of the device, such as a keyboard or touchpad of a laptop).
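The multi-tier determination described above can be sketched as a selection function over the sensed context. Only the 45° C default appears in the text; the relaxed limits and the decision order below are hypothetical values chosen for illustration.

```python
# Hypothetical constraint selection; only the 45 C default is from the text.
def select_skin_temp_limit_c(user_present, touching_device,
                             using_external_input):
    """Pick a skin-temperature constraint (deg C) based on user presence and
    how the user is interacting with the device."""
    if not user_present:
        return 55.0  # assumed relaxed limit: no user near the device
    if using_external_input or not touching_device:
        return 48.0  # assumed limit: present but not touching the housing
    return 45.0      # default constraint from the text
```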
  • Examples disclosed herein selectively control an amount of power provided to hardware component(s) of the user device and/or fan speed level(s) (e.g., RPMs) based on the selected thermal constraint (e.g., the default thermal constraint or a thermal constraint permitting a higher skin temperature for the device relative to the default thermal constraint).
  • power consumption by one or more component(s) of the user device is increased when the user is determined to be providing inputs to the user device via, for instance, an external keyboard. Because the user is not physically touching the exterior surface of the device housing when the user is providing inputs via the external keyboard, the temperature of the skin of the device can be increased without adversely affecting the user (e.g., without causing discomfort to the user).
  • In some examples, the rotational speed(s) (e.g., RPM(s)) of the fan(s) of the user device are increased when sensor data from the ambient noise sensor(s) indicates that the user is in a loud environment.
  • the resulting increase in fan acoustics from the increased rotational speed(s) of the fan(s) is offset by the ambient noise.
  • the rotational direction of the fan(s) of the user device is reversed (e.g., as compared to the default airflow direction, to facilitate heatsink and fan shroud cleaning) when sensor data from the ambient noise sensor(s) indicates that the user device is in a loud environment and/or that the user is not present or is outside a threshold distance of the device.
  • the user is not interrupted by the increased fan noise and the device can be cooled and/or cleaned with increased efficiency.
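The reverse-airflow cleaning decision can be summarized as a gate on two conditions: ambient masking and user absence. The 60 dBA and 2 m thresholds below are assumed values for illustration, not figures from the patent.

```python
# Hypothetical gating for the dust-clearing reverse-fan cycle; both
# thresholds are assumed values.
def should_reverse_fan(ambient_dba, user_distance_m,
                       noise_threshold_dba=60.0, distance_threshold_m=2.0):
    """Allow the noisier reverse cycle only when the environment masks it or
    no user is detected within the threshold distance (None = not detected)."""
    user_away = user_distance_m is None or user_distance_m > distance_threshold_m
    return ambient_dba >= noise_threshold_dba or user_away
```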
  • examples disclosed herein dynamically adjust the constraints and, thus, the performance of the device, based on user and/or environmental factors. As a result, performance of the device can be selectively increased in view of the opportunities for increased device skin temperature and/or audible fan noise levels in response to user interactions with the device.
  • FIG. 1 illustrates an example system 100 constructed in accordance with teachings of this disclosure for controlling thermal constraint(s) and/or fan noise constraint(s) for a user device 102 .
  • the user device 102 can be, for example, a personal computing (PC) device such as a laptop, a desktop, an electronic tablet, a hybrid or convertible PC, etc.
  • the user device 102 includes a keyboard 104 .
  • a keyboard is presented via a display screen 103 of the user device 102 and the user provides inputs on the keyboard by touching the screen.
  • the user device 102 includes one or more pointing device(s) 106 such as a touchpad.
  • the keyboard 104 and the pointing device(s) 106 are carried by a housing of the user device 102 and accessible via an exterior surface of the housing and, thus, can be considered on-board user input devices for the device 102 .
  • the user device 102 additionally or alternatively includes one or more external devices communicatively coupled to the device 102 , such as an external keyboard 108 , external pointing device(s) 110 (e.g., wired or wireless mouse(s)), and/or headphones 112 .
  • the external keyboard 108 , the external pointing device(s) 110 , and/or the headphones 112 can be communicatively coupled to the user device 102 via one or more wired or wireless connections.
  • the user device 102 includes one or more device configuration sensor(s) 120 that provide means for detecting whether user input(s) are being received via the external keyboard 108 and/or the external pointing device(s) 110 and/or whether output(s) (e.g., audio output(s)) are being delivered via the headphones 112 coupled to the user device 102 .
  • the device configuration sensor(s) 120 detect a wired connection of one or more of the external devices 108 , 110 , 112 via a hardware interface (e.g., USB port, etc.).
  • the device configuration sensor(s) 120 detect the presence of the external device(s) 108 , 110 , 112 via wireless connection(s) (e.g., Bluetooth).
  • the device configuration sensor(s) 120 include accelerometers to detect an orientation of the device 102 (e.g., tablet mode) and/or sensor(s) to detect an angle of, for instance, a screen of a laptop (e.g., facing the laptop base, angled away from the base, etc.).
  • the example user device 102 includes a processor 130 that executes software to interpret and output response(s) based on the user input event(s) (e.g., touch event(s), keyboard input(s), etc.).
  • the user device 102 of FIG. 1 includes one or more power sources 116 such as a battery to provide power to the processor 130 and/or other components of the user device 102 communicatively coupled via a bus 117 .
  • the hardware components of the device 102 (e.g., the processor 130 , a video graphics card, etc.) generate heat during operation of the user device 102 .
  • the example user device 102 includes temperature sensor(s) 126 to measure temperature(s) associated with the hardware component(s) of the user device 102 .
  • the temperature sensor(s) 126 measure a temperature of a skin of the housing of the user device 102 , or an exterior surface of the user device that can be touched by a user (e.g., a base of a laptop) (the terms “user” and “subject” are used interchangeably herein and both refer to a biological creature such as a human being).
  • the temperature sensor(s) 126 can be disposed in the housing of the device 102 proximate to the skin (e.g., coupled to a side of the housing opposite the side of the housing that is visible to the user).
  • the temperature sensor(s) 126 can include one or more thermometers.
  • the example user device 102 of FIG. 1 includes one or more fan(s) 114 .
  • the fan(s) 114 provide means for cooling and/or regulating the temperature of the hardware component(s) (e.g., the processor 130 ) of the user device 102 in response to temperature data generated by the temperature sensor(s) 126 .
  • operation of the fan(s) 114 is controlled in view of one or more thermal constraints for the user device 102 that define temperature settings for the hardware component(s) of the device 102 and/or a skin temperature of the device 102 .
  • the thermal constraint(s) and/or fan acoustic constraint(s) for the device 102 are dynamically selected based on the user interaction(s) with the device 102 and/or ambient conditions in an environment in which the device 102 is located.
  • the example user device 102 of FIG. 1 includes one or more user presence detection sensor(s) 118 .
  • the user presence detection sensor(s) 118 provide a means for detecting a presence of a user relative to the user device 102 in an environment in which the user device 102 is located.
  • the user presence detection sensor(s) 118 may detect a user approaching the user device 102 .
  • the user presence detection sensor(s) 118 include proximity sensor(s) that emit electromagnetic radiation (e.g., light pulses) and detect changes in the signal due to the presence of a person or object (e.g., based on reflection of the emitted electromagnetic radiation).
  • the user presence detection sensor(s) 118 include time-of-flight (TOF) sensors that measure a length of time for light to return to the sensor after being reflected off a person or object, which can be used to determine depth.
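The time-of-flight principle mentioned here reduces to one calculation: the measured interval covers the round trip to the subject and back, so the distance is half the total path traveled at the speed of light. A minimal sketch:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0


def tof_distance_m(round_trip_s):
    """Distance to the reflecting person or object: the light covers the
    path twice (out and back), so halve the round-trip path length."""
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0
```

A subject 1 m away returns the pulse after roughly 6.7 ns, which is why practical TOF sensors rely on phase-shift measurement or picosecond-scale timing rather than naive timestamps.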
  • the example user presence detection sensor(s) 118 can include other types of depth sensors, such as sensors that detect changes based on radar or sonar data.
  • the user presence detection sensor(s) 118 collect distance measurements for one or more (e.g., four) spatial regions (e.g., non-overlapping quadrants) relative to the user device 102 .
  • the user presence detection sensor(s) 118 associated with each region provide distance range data for region(s) of the user's face and/or body corresponding to the regions.
  • the user presence detection sensor(s) 118 are carried by the example user device 102 such that the user presence detection sensor(s) 118 can detect changes in an environment in which the user device 102 is located that occur within a range (e.g., a distance range) of the user presence detection sensor(s) 118 (e.g., within 10 feet of the user presence detection sensor(s) 118 , within 5 feet, etc.).
  • the user presence detection sensor(s) 118 can be mounted on a bezel of the display screen 103 and oriented such that the user presence detection sensor(s) 118 can detect a user approaching the user device 102 .
  • the user presence detection sensor(s) 118 can additionally or alternatively be at any other locations on the user device 102 where the sensor(s) 118 face an environment in which the user device 102 is located, such as on a base of the laptop (e.g., on an edge of the base in front of a keyboard carried by the base), on a lid of the laptop, on a base supporting the display screen 103 in examples where the display screen 103 is a monitor of a desktop or all-in-one PC, etc.
  • the user presence detection sensor(s) 118 are additionally or alternatively mounted at locations on the user device 102 where the user's arm, hand, and/or finger(s) are likely to move or pass over as the user brings his or her arm, hand, and/or finger(s) toward the display screen 103 , the keyboard 104 , and/or other user input device (e.g., the pointing device(s) 106 ).
  • the user presence detection sensor(s) 118 can be disposed proximate to the touchpad of the device 102 to detect when a user's arm is hovering over the touchpad (e.g., as the user reaches for the screen 103 or the keyboard 104 ).
  • the user device 102 includes image sensor(s) 122 .
  • the image sensor(s) 122 generate image data that is analyzed to detect, for example, a presence of the user proximate to the device, gestures performed by the user, whether the user is looking toward or away from the display screen 103 of the device 102 (e.g., eye-tracking), etc.
  • the image sensor(s) 122 of the user device 102 include one or more cameras to capture image data of the surrounding environment in which the device 102 is located.
  • In some examples, the image sensor(s) 122 include depth-sensing camera(s).
  • In the example of FIG. 1 , the image sensor(s) 122 are carried by the example user device 102 such that when a user faces the display screen 103 , the user is within a field of view of the image sensor(s) 122 .
  • the image sensor(s) 122 can be carried by a bezel of the display screen 103 .
  • the example user device 102 of FIG. 1 includes one or more motion sensor(s) 123 .
  • the motion sensor(s) 123 can include, for example, infrared sensor(s) to detect user movements. As disclosed herein, data generated by the motion sensor(s) 123 can be analyzed to identify gestures performed by the user of the user device 102 .
  • the motion sensor(s) 123 can be carried by the device 102 proximate to, for example, a touchpad of the device 102 , a bezel of the display screen 103 , etc. so as to detect user motion(s) occurring proximate to the device 102 .
  • the user device 102 includes one or more microphone(s) 124 to detect sounds in an environment in which the user device 102 is located.
  • the microphone(s) 124 can be carried by the user device 102 at one or more locations, such as on a lid of the device 102 , on a base of the device 102 proximate to the keyboard 104 , etc.
  • the example user device 102 of FIG. 1 can include other types of sensor(s) to detect user interactions relative to the device 102 and/or environmental conditions (e.g., ambient light sensor(s)).
  • the example user device 102 includes one or more semiconductor-based processors to process sensor data generated by the user presence detection sensor(s) 118 , the device configuration sensor(s) 120 , the image sensor(s) 122 , the motion sensor(s) 123 , the microphone(s) 124 , and/or the temperature sensor(s) 126 .
  • the sensor(s) 118 , 120 , 122 , 123 , 124 , 126 can transmit data to the on-board processor 130 of the user device 102 .
  • the sensor(s) 118 , 120 , 122 , 123 , 124 , 126 can transmit data to a processor 127 of another user device 128 , such as a smartphone or a wearable device such as a smartwatch.
  • the sensor(s) 118 , 120 , 122 , 123 , 124 , 126 can transmit data to a cloud-based device 129 (e.g., one or more server(s), processor(s), and/or virtual machine(s)).
  • the processor 130 of the user device 102 is communicatively coupled to one or more other processors.
  • the sensor(s) 118 , 120 , 122 , 123 , 124 , 126 can transmit the sensor data to the on-board processor 130 of the user device 102 .
  • the on-board processor 130 of the user device 102 can then transmit the sensor data to the processor 127 of the user device 128 and/or the cloud-based device(s) 129 .
  • the user device 102 (e.g., the sensor(s) 118 , 120 , 122 , 123 , 124 , 126 and/or the on-board processor 130 ) and the processor(s) 127 , 130 are communicatively coupled via one or more wired connections (e.g., a cable) or wireless connections (e.g., cellular, Wi-Fi, or Bluetooth connections).
  • the sensor data may only be processed by the on-board processor 130 (i.e., not sent off the device).
  • the sensor data generated by the user presence detection sensor(s) 118 , the device configuration sensor(s) 120 , the image sensor(s) 122 , the motion sensor(s) 123 , the microphone(s) 124 , and/or the temperature sensor(s) 126 is processed by a thermal constraint manager 132 to select a thermal constraint for the user device 102 to affect a temperature of the skin of the housing of the device 102 and/or a fan acoustic constraint to affect rotational speed(s) of the fan(s) 114 of the user device 102 and, thus, noise generated by the fan(s) 114 .
  • the example thermal constraint manager 132 can affect performance of the device 102 . For instance, if the thermal constraint manager 132 determines that the temperature of the skin of the device 102 can be increased and/or that rotational speed(s) of the fan(s) 114 can be increased, additional power can be provided to hardware component(s) of the device 102 (e.g., the processor 130 ) to provide for increased performance of the component(s) (e.g., higher processing speeds). In such examples, the increased heat generated by the hardware component(s) and transferred to the skin of the device is permitted by the selected thermal constraint and/or is managed via increased rotation of the fan(s) 114 .
  • the thermal constraint manager 132 is implemented by executable instructions executed on the processor 130 of the user device 102 .
  • the thermal constraint manager 132 is implemented by instructions executed on the processor 127 of the wearable or non-wearable user device 128 and/or on the cloud-based device(s) 129 .
  • the thermal constraint manager 132 is implemented by dedicated circuitry located on the user device 102 and/or the user device 128 . These components may be implemented in software, firmware, hardware, or in combination of two or more of software, firmware, and hardware.
  • the thermal constraint manager 132 serves to process the sensor data generated by the respective sensor(s) 118 , 120 , 122 , 123 , 124 , 126 to identify user interaction(s) with the user device 102 and/or ambient conditions in the environment in which the device 102 is located and to select a thermal constraint and/or fan acoustic constraint for the user device 102 based on the user interaction(s) and/or the ambient environment conditions.
  • the thermal constraint manager 132 receives the sensor data in substantially real-time (e.g., near the time the data is collected).
  • the thermal constraint manager 132 receives the sensor data at a later time (e.g., periodically and/or aperiodically based on one or more settings but sometime after the activity that caused the sensor data to be generated, such as a hand motion, has occurred (e.g., seconds, minutes, etc. later)).
  • the thermal constraint manager 132 can perform one or more operations on the sensor data such as filtering the raw signal data, removing noise from the signal data, converting the signal data from analog data to digital data, and/or analyzing the data.
  • the thermal constraint manager 132 can convert the sensor data from analog to digital data at the on-board processor 130 and the digital data can be analyzed by on-board processor 130 and/or by one or more off-board processors, such as the processor 127 of the user device 128 and/or the cloud-based device 129 .
  • the thermal constraint manager 132 determines whether or not a subject is present within the range of the user presence detection sensor(s) 118 . In some examples, if the thermal constraint manager 132 determines that the user is not within the range of the user presence detection sensor(s) 118 , the thermal constraint manager 132 determines that the rotational speed of the fan(s) 114 can be increased, as the user is not present to hear the increased acoustic noise generated by the fan(s) 114 operating at an increased speed. The thermal constraint manager 132 generates instructions for the fan(s) 114 to increase the rotational speed at which the fan(s) 114 operate.
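The presence-based fan policy above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the speed values and names are assumptions.

```python
# Hypothetical fan speed targets; the patent specifies no RPM figures.
FAN_SPEED_QUIET_RPM = 2000  # used while a user is in sensor range
FAN_SPEED_MAX_RPM = 4500    # permitted once no user can hear the fan

def select_fan_speed(user_in_range: bool) -> int:
    """Faster, louder cooling is allowed when no user is within the
    range of the user presence detection sensor(s)."""
    return FAN_SPEED_QUIET_RPM if user_in_range else FAN_SPEED_MAX_RPM
```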
  • the fan(s) 114 can continue to operate at the increased rotational speed to provide efficient cooling until, for instance, the processor 130 of the device 102 determines that no user input(s) have been received at the device 102 for a period of time and the device 102 should enter a low power state (e.g., a standby or sleep state).
  • the thermal constraint manager 132 determines if the user is interacting with the device 102 .
  • the thermal constraint manager 132 can detect whether user input(s) are being received via (a) the on-board keyboard 104 and/or the on-board pointing device(s) 106 or (b) the external keyboard 108 and/or the external pointing device(s) 110 based on data generated by the device configuration sensor(s) 120 .
  • the thermal constraint manager 132 maintains the skin temperature of the device 102 at a first (e.g., default) thermal constraint that defines a maximum temperature for the device skin to prevent the skin of the device housing from becoming too hot and injuring the user. If the thermal constraint manager 132 determines that the user is interacting with the device 102 via the external keyboard 108 and/or the external pointing device(s) 110 , the thermal constraint manager 132 selects a thermal constraint for the device that defines an increased temperature for the skin of the device 102 relative to the first thermal constraint.
  • one or more hardware component(s) of the device 102 move to an increased performance mode in which the component(s) of the device consume more power and, thus, generate more heat.
  • the thermal constraint manager 132 selects a thermal constraint for the skin temperature of the device housing that is increased relative to the thermal constraint selected when the user is interacting with the device 102 via the on-board keyboard 104 and/or the on-board pointing device(s) 106 because the user is not directly touching the device 102 when providing input(s) via the external device(s) 108 , 110 .
  • if the thermal constraint manager 132 determines that the user is within the range of the user presence detection sensor(s) 118 but is not providing input(s) at the device 102 and/or has not provided an input within a threshold period of time, the thermal constraint manager 132 infers a user intent to interact with the device.
  • the thermal constraint manager 132 can use data from multiple types of sensors to predict whether the user is likely to interact with the device.
  • the thermal constraint manager 132 can determine a distance of the user from the device 102 based on data generated by the user presence detection sensor(s) 118 . If the user is determined to be outside of a predefined threshold range of the device 102 (e.g., farther than 1 meter from the device 102 ), the thermal constraint manager 132 determines that the rotational speed of the fan(s) 114 of the device 102 and, thus, the fan acoustics, can be increased because the increased fan noise will not disrupt the user in view of the user's distance from the device 102 .
  • the thermal constraint manager 132 determines that the power level of the power source(s) 116 of the device 102 and, thus, the device skin temperature, can be increased because the increased skin temperature will not cause discomfort to the user based on the user's distance from the device 102 .
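The distance-gated policy in the two bullets above can be sketched as a single check: beyond a threshold (1 meter in the example), both louder fan operation and a warmer skin limit are permitted. The function name and return shape are illustrative assumptions.

```python
def distance_policy(distance_m: float, threshold_m: float = 1.0):
    """Return (allow_louder_fan, allow_warmer_skin) for a user at the
    given distance from the device: both relax once the user is beyond
    the predefined threshold range."""
    beyond = distance_m > threshold_m
    return beyond, beyond
```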
  • the thermal constraint manager 132 analyzes image data generated by the image sensor(s) 122 to determine a position of the user's eyes relative to the display screen 103 of the device 102 . In such examples, if the thermal constraint manager 132 identifies both of the user's eyes in the image data, the thermal constraint manager 132 determines that the user is looking at the display screen 103 . If the thermal constraint manager 132 identifies one of the user's eyes or none of the user's eyes in the image data, the thermal constraint manager 132 determines that the user is not engaged with the device 102 . In such examples, the thermal constraint manager 132 can instruct the fan(s) 114 to increase rotational speed(s) to cool the device 102 .
  • the thermal constraint manager 132 permits increased fan noise to be generated by the fan(s) 114 to efficiently cool the device 102 while the user is distracted relative to the device 102 . Additionally or alternatively, the thermal constraint manager 132 can instruct the power source(s) 116 to increase the power provided to the hardware component(s) of the user device 102 (thus increasing the skin temperature of the user device 102 ).
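The eye-count rule above reduces to a small predicate: two visible eyes means the user is looking at the screen; one or none means the user is not engaged, so more aggressive (louder) cooling is permitted. Function names are assumptions for illustration.

```python
def user_engaged(eyes_visible: int) -> bool:
    """Per the rule above: only two visible eyes count as engagement."""
    return eyes_visible == 2

def fan_boost_permitted(eyes_visible: int) -> bool:
    """Fan speed (and noise) may rise while the user is not engaged."""
    return not user_engaged(eyes_visible)
```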
  • the thermal constraint manager 132 analyzes the image data generated by the image sensor(s) 122 and/or the motion sensor(s) 123 to identify gesture(s) being performed by the user. If the thermal constraint manager 132 determines that the user is, for instance, looking away from the device 102 and talking on the phone based on the image data and/or the motion sensor data (e.g., image data and/or motion sensor data indicating that the user has moved his or her hand proximate to his or her ear), the thermal constraint manager 132 determines that the fan acoustics can be increased because the user is not likely to interact with the device 102 while the user is looking away and talking on the phone.
  • the example thermal constraint manager 132 of FIG. 1 evaluates ambient noise conditions to determine if fan noise levels can be increased.
  • the thermal constraint manager 132 of FIG. 1 analyzes data generated by the microphone(s) 124 to determine if ambient noise in the surrounding environment exceeds an environment noise level threshold. If the thermal constraint manager 132 determines that the ambient noise exceeds the environment noise level threshold, the thermal constraint manager 132 instructs the fan(s) to rotate at increased speed(s) and, thus, generate increased fan noise. In such examples, the increased fan noise is unlikely to be detected in the noisy environment in which the user device 102 is located and, thus, operation of the fan(s) 114 can be optimized to increase cooling and, thus, performance of the device 102 .
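The masking rule above can be sketched as a threshold comparison. The decibel figure is an assumed value; the patent does not specify an environment noise level threshold.

```python
AMBIENT_NOISE_THRESHOLD_DB = 60.0  # assumed threshold, not from the patent

def fan_noise_masked(ambient_db: float) -> bool:
    """Louder fan operation is allowed when ambient noise already
    exceeds the environment noise level threshold, since the added
    fan noise is unlikely to be detected."""
    return ambient_db > AMBIENT_NOISE_THRESHOLD_DB
```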
  • the thermal constraint manager 132 can determine whether the user is wearing headphones based on, for example, image data generated by the image sensor(s) 122 and/or data from the device configuration sensor(s) 120 indicating that headphones are connected to the device 102 (e.g., via wired or wireless connection(s)). In such examples, the thermal constraint manager 132 instructs the fan(s) 114 to rotate at increased speed(s) to increase cooling of the device 102 because the resulting increased fan noise is unlikely to be detected by the user who is wearing headphones.
  • the thermal constraint manager 132 dynamically adjusts the thermal constraint(s) and/or fan noise levels for the device 102 based on the inferred user intent to interact with the device and/or conditions in the environment. In some examples, the thermal constraint manager 132 determines that the user is likely to interact with the device after previously instructing the fan(s) to increase rotational speed(s) based on, for example, data from the user presence detection sensor(s) 118 indicating that the user is moving toward the device 102 and/or reaching for the on-board keyboard. In such examples, the thermal constraint manager 132 instructs the fan(s) 114 to reduce the rotational speed and, thus, the fan noise in view of the expectation that the user is going to interact with the device 102 .
  • the thermal constraint manager 132 determines that the user is providing input(s) via the external device(s) 108 , 110 and, thus, selects a thermal constraint for the device 102 that increases the temperature of the skin of the device. If, at a later time, the thermal constraint manager 132 determines that the user is reaching for the display screen 103 (e.g., based on data from the user presence detection sensor(s) 118 , the image sensor(s) 122 , and/or the motion sensor(s) 123 ), the thermal constraint manager 132 selects a thermal constraint that results in decreased temperature of the device skin. In such examples, power consumption by the hardware component(s) of the device 102 and/or fan speed(s) can be adjusted to cool the device 102 .
  • the thermal constraint manager 132 determines at a later time that the user is no longer wearing the headphones 112 (e.g., based on the image data) after previously determining that the user was wearing the headphones 112 , the thermal constraint manager 132 instructs the fan(s) 114 to reduce rotational speed to generate less noise.
  • the thermal constraint manager 132 dynamically adjusts the thermal constraint(s) and/or fan acoustic constraint(s) based on temperature data generated by the temperature sensor(s) 126 . For example, if data from the temperature sensor(s) 126 indicates that skin temperature is approaching the threshold defined by a selected thermal constraint, the thermal constraint manager 132 generates instructions to maintain or reduce the skin temperature by adjusting power consumption of the hardware component(s) and/or by operation of the fan(s) 114 .
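The dynamic adjustment above amounts to watching the measured skin temperature against the selected constraint with a guard margin: once within the margin, the manager should begin reducing power or raising fan speed. The margin value is an assumption for illustration.

```python
def approaching_constraint(skin_temp_c: float, limit_c: float,
                           margin_c: float = 2.0) -> bool:
    """True when the skin temperature is within `margin_c` of the
    selected thermal constraint, signaling that power and/or fan
    adjustments should begin."""
    return skin_temp_c >= limit_c - margin_c
```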
  • FIG. 2 is a block diagram of an example implementation of the thermal constraint manager 132 of FIG. 1 .
  • the thermal constraint manager 132 is constructed to detect user interaction(s) and/or ambient condition(s) relative to the user device 102 and to generate instructions that cause the user device 102 to transition between one or more thermal constraints with respect to skin temperature of the device 102 and/or one or more fan acoustic constraints with respect to audible noise generated by the fan(s) 114 of the device 102 .
  • the thermal constraint manager 132 is implemented by one or more of the processor 130 of the user device 102 , the processor 127 of the second user device 128 , and/or cloud-based device(s) 129 (e.g., server(s), processor(s), and/or virtual machine(s) in the cloud 129 of FIG. 1 ).
  • some of the user interaction analysis and/or ambient condition analysis is implemented by the thermal constraint manager 132 via a cloud-computing environment, and one or more other parts of the analysis are implemented by the processor 130 of the user device 102 being controlled and/or the processor 127 of a second user device 128 such as a wearable device.
  • the example thermal constraint manager 132 receives user presence sensor data 200 from the user presence detection sensor(s) 118 of the example user device 102 of FIG. 1 , device configuration sensor data 202 from the device configuration sensor(s) 120 , image sensor data 204 from the image sensor(s) 122 , gesture data 205 from the motion sensor(s) 123 , ambient noise sensor data 206 from the microphone(s) 124 , and temperature sensor data 208 from the temperature sensor(s) 126 .
  • the sensor data 200 , 202 , 204 , 205 , 206 , 208 is stored in a database 212 .
  • the thermal constraint manager 132 includes the database 212 .
  • the database 212 is located external to the thermal constraint manager 132 in a location accessible to the thermal constraint manager 132 as shown in FIG. 2 .
  • the thermal constraint manager 132 includes a user presence detection analyzer 214 .
  • the user presence detection analyzer 214 provides means for analyzing the sensor data 200 generated by the user presence detection sensor(s) 118 .
  • the user presence detection analyzer 214 analyzes the sensor data 200 to determine if a user is within the range of the user presence detection sensor(s) 118 and, thus, is near enough to the user device 102 to suggest that the user is about to use the user device 102 .
  • the user presence detection analyzer 214 determines if the user is within a particular distance from the user device 102 (e.g., within 0.5 meters of the device 102 , within 0.75 meters of the device 102 ).
  • the user presence detection analyzer 214 analyzes the sensor data 200 based on one or more user presence detection rule(s) 216 .
  • the user presence detection rule(s) 216 can be defined based on user input(s) and stored in the database 212 .
  • the user presence detection rule(s) 216 can define, for instance, threshold time-of-flight (TOF) measurements by the user presence detection sensor(s) 118 that indicate presence of the user within a range from the user presence detection sensor(s) 118 (e.g., measurements of the amount of time between emission of a wave pulse, reflection off a subject, and return to the sensor).
  • the user presence detection rule(s) 216 define threshold distance(s) for determining that a subject is within proximity of the user device 102 .
  • the user presence detection analyzer 214 determines the distance(s) based on the TOF measurement(s) in the sensor data 200 and the known speed of the light emitted by the sensor(s) 118 .
  • the user presence detection analyzer 214 identifies changes in the depth or distance values over time and detects whether the user is approaching the device 102 or moving away from the user device 102 based on the changes.
  • the threshold TOF measurement(s) and/or distance(s) for the sensor data 200 can be based on the range of the sensor(s) 118 in emitting pulses. In some examples, the threshold TOF measurement(s) and/or distances are based on user-defined reference distances for determining that a user is near or approaching the user device 102 as compared to simply being in the environment in which the user device 102 and the user are both present.
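The TOF-to-distance computation described above can be sketched directly: the light pulse travels to the subject and back, so the distance is half the round-trip time multiplied by the speed of light. The function name is an assumption.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_to_distance_m(round_trip_s: float) -> float:
    """Convert a round-trip time-of-flight measurement to the subject's
    distance; divide by two because the pulse covers the path twice."""
    return round_trip_s * SPEED_OF_LIGHT_M_S / 2.0
```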
  • the example thermal constraint manager 132 of FIG. 2 includes a device configuration analyzer 218 .
  • the device configuration analyzer 218 provides means for analyzing the sensor data 202 generated by the device configuration sensor(s) 120 .
  • the device configuration analyzer 218 analyzes the sensor data 202 to detect, for example, whether user input(s) are being received via the on-board keyboard 104 and/or the on-board pointing device(s) 106 of the user device 102 or via one or more external devices (e.g., the external keyboard 108 , the external pointing device(s) 110 ) communicatively coupled to the user device 102 .
  • the device configuration analyzer 218 detects that audio output(s) from the device 102 are being delivered via an external output device such as the headphones 112 . In some examples, the device configuration analyzer 218 analyzes the orientation of the device 102 to infer, for example, whether a user is sitting while interacting with device 102 , standing while interacting with the device 102 (e.g., based on an angle of a display screen of the device 102 ), whether the device 102 is in tablet mode, etc.
  • the device configuration analyzer 218 analyzes the sensor data 202 based on one or more device configuration rule(s) 219 .
  • the device configuration rule(s) 219 can be defined based on user input(s) and stored in the database 212 .
  • the device configuration rule(s) 219 can define, for example, identifiers for recognizing when external device(s) such as the headphones 112 of FIG. 1 are communicatively coupled to the user device 102 via one or more wired or wireless connections.
  • the device configuration rule(s) 219 define rule(s) for detecting user input(s) being received at the user device via the external device(s) 108 , 110 based on data received from the external device(s).
  • the device configuration rule(s) 219 define rule(s) for detecting audio output(s) delivered via an external device such as the headphones 112 .
  • the device configuration rule(s) 219 can define rule(s) indicating that if the display screen 103 is angled within a particular angle range (e.g., over 90° relative to a base of the laptop), the user is sitting while interacting with the device 102 .
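The hinge-angle rule above can be sketched as a simple classifier: a display opened past 90° relative to the base suggests a seated user. The alternative "standing" label is an illustrative assumption.

```python
def infer_posture(screen_angle_deg: float) -> str:
    """Infer user posture from the display hinge angle, per the example
    device configuration rule (over 90 degrees -> sitting)."""
    return "sitting" if screen_angle_deg > 90.0 else "standing"
```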
  • the example thermal constraint manager 132 of FIGS. 1 and 2 is trained to recognize user interaction(s) relative to the user device 102 to predict whether the user is likely to interact with the device 102 .
  • the thermal constraint manager 132 analyzes one or more of the sensor data 204 from the image sensor(s) 122 and/or the sensor data 205 from the motion sensor(s) 123 to detect user activity relative to the device 102 .
  • the thermal constraint manager 132 is trained to recognize user interactions by a training manager 224 using machine learning and training sensor data for one or more subjects, which may or may not include sensor data generated by the sensor(s) 122 , 123 of the user device 102 of FIG. 1 .
  • the training sensor data is generated from subject(s) who are interacting with the user device 102 and/or a different user device.
  • the training sensor data is stored in a database 232 .
  • the training manager 224 includes the database 232 .
  • the database 232 is located external to the training manager 224 in a location accessible to the training manager 224 as shown in FIG. 2 .
  • the databases 212 , 232 of FIG. 2 may be the same storage device or different storage devices.
  • the training sensor data includes training gesture data 230 , or data including a plurality of gestures performed by user(s) and associated user interactions represented by the gestures in the context of interacting with the user device 102 .
  • the training gesture data 230 can include a first rule indicating that if a user raises his or her hand proximate to his or her ear, the user is talking on a telephone.
  • the training gesture data 230 can include a second rule indicating that if a user is reaching his or her hand away from his or her body as detected by a motion sensor disposed proximate to a keyboard of the device and/or as captured in image data, the user is reaching for the display screen of the user device.
  • the training gesture data 230 can include a third rule indicating that if only a portion of the user's body from the waist upward is visible in image data, the user is in a sitting position.
  • the training sensor data includes training facial feature data 231 , or data including a plurality of images of subject(s) and associated eye position data, mouth position data, head accessory data (e.g., headphone usage) represented by the image(s) in the context of viewing the display screen 103 of the device 102 , looking away from the display screen 103 of the device 102 , interacting with the device 102 while wearing headphones, etc.
  • the training facial feature data 231 can include a first rule that if both of the user's eyes are visible in image data generated by the image sensor(s) 122 of the user device 102 , then the user is looking at the display screen 103 of the device 102 .
  • the training facial feature data 231 can include a second rule that if one of the user's eyes is visible in the image data, the user is likely to interact with the device 102 .
  • the training facial feature data 231 can include a third rule that if neither of the user's eyes is visible in the image data, the user is looking away from the device 102 .
  • the training facial feature data 231 can include a fourth rule that if the user's mouth is open in the image data, the user is talking.
  • the training facial feature data 231 can include a fifth rule that identifies when a user is wearing headphones based on feature(s) detected in the image data.
  • the example training manager 224 of FIG. 2 includes a trainer 226 and a machine learning engine 228 .
  • the trainer 226 trains the machine learning engine 228 using the training gesture data 230 and the training facial feature data 231 (e.g., via supervised learning) to generate one or more model(s) that are used by the thermal constraint manager 132 to control thermal constraints of the user device 102 based on user interaction(s) and/or inferred intent regarding user interaction(s) with the device 102 .
  • the trainer 226 uses the training gesture data 230 to generate one or more gesture data models 223 via the machine learning engine 228 that define user interaction(s) relative to the device 102 in response to particular gestures performed by the user.
  • the trainer 226 uses the training facial feature data 231 to generate one or more facial feature data models 225 via the machine learning engine 228 that define user interaction(s) relative to the device 102 in response to particular eye tracking positions, facial expressions of the user, and/or head accessories (e.g., headphones) worn by the user.
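As a toy stand-in for the supervised training described above, labeled gesture feature vectors can be reduced to per-label centroids, with prediction picking the nearest centroid. A real trainer 226 / machine learning engine 228 would use a proper ML framework; everything here is illustrative.

```python
import math
from collections import defaultdict

def train(samples):
    """samples: iterable of (feature_vector, label) pairs.
    Returns a 'model' mapping each label to its feature centroid."""
    sums, counts = {}, defaultdict(int)
    for x, label in samples:
        sums.setdefault(label, [0.0] * len(x))
        for i, v in enumerate(x):
            sums[label][i] += v
        counts[label] += 1
    return {lbl: [s / counts[lbl] for s in vec] for lbl, vec in sums.items()}

def predict(model, x):
    """Classify a feature vector by its nearest centroid."""
    return min(model, key=lambda lbl: math.dist(model[lbl], x))
```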
  • the gesture data model(s) 223 and the facial feature data model(s) 225 are stored in the database 212 .
  • the example database 212 can store additional or fewer models than shown in FIG. 2 .
  • the database 212 can store a model generated during training based on the training gesture data 230 and data indicative of a distance of the user relative to the device (e.g., based on proximity sensor data) and/or device configuration (e.g., based on sensor data indicating screen orientation).
  • the example thermal constraint manager 132 of FIG. 2 uses the model(s) 223 , 225 to interpret the respective sensor data generated by the motion sensor(s) 123 and/or the image sensor(s) 122 .
  • the example thermal constraint manager 132 of FIG. 2 includes a motion data analyzer 222 .
  • the motion data analyzer 222 provides means for analyzing the sensor data 205 generated by the motion sensor(s) 123
  • the example motion data analyzer 222 uses the gesture data model(s) 223 to identify gesture(s) performed by the user relative to the device 102 .
  • the motion data analyzer 222 can determine that the user is reaching for the display screen 103 of the user device 102 .
  • the example thermal constraint manager 132 of FIG. 2 includes an image data analyzer 220 .
  • the image data analyzer 220 provides means for analyzing the sensor data 204 generated by the image sensor(s) 122 .
  • the image data analyzer 220 uses the gesture data model(s) 223 and/or the facial feature data model(s) 225 to analyze the sensor data 204 to identify, for instance, gesture(s) being performed by the user and/or the user's posture relative to the device 102 , and/or to track a position of the user's eyes relative to the device 102 .
  • the image data analyzer 220 can determine that the user is typing.
  • the image data analyzer 220 determines that the user is turned away from the device 102 because the user's eyes are not visible in the image data.
  • the thermal constraint manager 132 includes a timer 244 .
  • the timer 244 provides means for monitoring a duration of time within which a user input is received at the user device 102 after the user presence detection analyzer 214 determines that the user is within the range of the user presence detection sensor(s) 118 .
  • the timer 244 additionally or alternatively provides means for monitoring a duration of time in which the motion data analyzer 222 and/or the image data analyzer 220 determine that there is a likelihood of user interaction within the device after the user presence detection analyzer 214 determines that the user is within the range of the user presence detection sensor(s) 118 .
  • the timer 244 monitors the amount of time that has passed based on time interval threshold(s) 246 stored in the database 212 and defined by user input(s). As disclosed herein, if a user input is not received within the time interval threshold(s) 246 and/or if the motion data analyzer 222 and/or the image data analyzer 220 have not determined that a user interaction with the device 102 is likely to occur within the time interval threshold(s) 246 , the thermal constraint manager 132 can adjust the thermal constraint(s) and/or the fan acoustic constraint(s) in response to the lack of user interaction with the device 102 .
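The timer behavior above can be sketched as follows: detected presence starts a countdown, and if no user input (or predicted interaction) arrives within the threshold interval, constraints may be relaxed. Timestamps are passed in explicitly to keep the example deterministic; the class and method names are assumptions.

```python
class InteractionTimer:
    """Tracks whether a user input arrived within a threshold interval
    after the user was detected within presence-sensor range."""

    def __init__(self, threshold_s: float):
        self.threshold_s = threshold_s
        self._start = None

    def presence_detected(self, now_s: float) -> None:
        self._start = now_s  # start (or restart) the countdown

    def input_received(self) -> None:
        self._start = None  # stop watching once the user interacts

    def interaction_overdue(self, now_s: float) -> bool:
        return (self._start is not None
                and (now_s - self._start) > self.threshold_s)
```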
  • the thermal constraint manager 132 of FIG. 2 includes an ambient noise analyzer 234 .
  • the ambient noise analyzer 234 provides means for analyzing the sensor data 206 generated by the ambient noise sensor(s) 124 .
  • the ambient noise analyzer 234 analyzes the sensor data 206 based on one or more ambient noise rule(s) 235 .
  • the ambient noise rule(s) 235 define threshold ambient noise level(s) that, if exceeded, indicate that a user is unlikely to detect an increase in audible fan noise.
  • the ambient noise rule(s) 235 can be defined based on user input(s) and stored in the database 212 .
  • the thermal constraint manager 132 of FIG. 2 includes a temperature analyzer 236 .
  • the temperature analyzer 236 provides means for analyzing the sensor data 208 generated by the temperature sensor(s) 126 .
  • the temperature analyzer 236 analyzes the sensor data 208 to determine the temperature of one or more hardware component(s) of the user device 102 and/or the skin of the housing of the user device 102 .
  • the temperature analyzer 236 can detect an amount of heat generated by the processor 130 and/or a temperature of the exterior skin of the housing of the device 102 during operation of the device 102 .
  • the example thermal constraint manager 132 of FIG. 2 includes a sensor manager 248 to manage operation of one or more of the user presence detection sensor(s) 118 , the device configuration sensor(s) 120 , the image sensor(s) 122 , the motion sensor(s) 123 , the ambient noise sensor(s) 124 , and/or the temperature sensor(s) 126 .
  • the sensor manager 248 controls operation of the sensor(s) 118 , 120 , 122 , 124 , 126 based on one or more sensor activation rule(s) 250 .
  • the sensor activation rule(s) 250 can be defined by user input(s) and stored in the database 212 .
  • the sensor activation rule(s) 250 define rule(s) for activating the sensor(s) to conserve power consumption by the device 102 .
  • the sensor activation rule(s) 250 can define that the user presence detection sensor(s) 118 should remain active while the device 102 is operative (e.g., in a working power state) and that the image sensor(s) 122 should be activated when the user presence detection analyzer 214 determines that a user is within the range of the user presence detection sensor(s) 118 .
  • Such a rule can prevent unnecessary power consumption by the device 102 when, for instance, the user is not proximate to the device 102 .
  • the sensor manager 248 selectively activates the image sensor(s) 122 to supplement data generated by the motion sensor(s) 123 to increase an accuracy with which the gesture(s) of the user are detected. In some examples, the sensor manager 248 deactivates the image sensor(s) 122 if the image data analyzer 220 does not predict a likelihood of a user interaction with the device and/or the device 102 does not receive a user input within a time threshold defined by the timer 244 to conserve power.
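The activation rule above can be sketched as two gates: the presence sensor stays on whenever the device is in a working power state, while the camera is powered only once a user is within presence-sensor range. Names are illustrative.

```python
def presence_sensor_active(device_working: bool) -> bool:
    """The presence sensor remains on while the device is awake."""
    return device_working

def image_sensor_active(device_working: bool, user_in_range: bool) -> bool:
    """The camera is activated only once presence is detected, to avoid
    unnecessary power consumption when no user is nearby."""
    return device_working and user_in_range
```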
  • the example thermal constraint manager 132 of FIG. 2 includes a thermal constraint selector 252 .
  • the thermal constraint selector 252 selects a thermal constraint to be assigned to the user device 102 based on one or more of data from the user presence detection analyzer 214 , the device configuration analyzer 218 , the motion data analyzer 222 , the image data analyzer 220 , the ambient noise analyzer 234 , and/or the temperature analyzer 236 .
  • the example thermal constraint selector 252 selects the thermal constraint to be assigned to the user device based on one or more thermal constraint selection rule(s) 254 .
  • the thermal constraint selection rule(s) 254 are defined based on user input(s) and stored in the database 212 .
  • the thermal constraint selection rule(s) 254 can include a first rule that if the device configuration analyzer 218 determines that the user is providing input(s) via a keyboard or touch screen of the device 102 , a first, or default, thermal constraint for the temperature of the skin of the housing of the device 102 should be assigned to the user device 102 to prevent discomfort to the user when touching the device 102 .
  • the default thermal constraint for the skin temperature can be, for example, 45° C.
  • the thermal constraint selection rule(s) 254 can include a second rule that if the device configuration analyzer 218 determines that the user is providing input(s) via the external keyboard 108 , a second thermal constraint should be assigned to the device 102 , where the second thermal constraint provides for an increased skin temperature of the device as compared to the first (e.g., default) thermal constraint.
  • the second thermal constraint can define a skin temperature limit of 48° C.
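  • as an illustration only, the input-source rules above might be expressed as follows, using the 45° C. and 48° C. example limits from the text; the function name and input-source labels are hypothetical:

```python
# Hedged sketch of thermal constraint selection: a default 45 C skin limit
# when the user touches the device's own keyboard or touch screen, and a
# relaxed 48 C limit when input arrives via an external keyboard.

DEFAULT_SKIN_LIMIT_C = 45.0   # first (default) thermal constraint
RELAXED_SKIN_LIMIT_C = 48.0   # second thermal constraint (external input)

def select_thermal_constraint(input_source):
    """Pick a skin-temperature limit from the detected input source."""
    if input_source in ("onboard_keyboard", "touch_screen"):
        return DEFAULT_SKIN_LIMIT_C
    if input_source == "external_keyboard":
        return RELAXED_SKIN_LIMIT_C
    return DEFAULT_SKIN_LIMIT_C  # fall back to the safe default
```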
  • the example thermal constraint manager 132 of FIG. 2 includes a fan acoustic constraint selector 258 .
  • the fan acoustic constraint selector 258 selects a fan acoustic constraint to be assigned to the user device 102 based on one or more of data from the user presence detection analyzer 214 , the device configuration analyzer 218 , the motion data analyzer 222 , the image data analyzer 220 , the ambient noise analyzer 234 , and/or the temperature analyzer 236 .
  • the example fan acoustic constraint selector 258 selects the fan acoustic constraint to be assigned to the user device 102 based on one or more fan acoustic constraint selection rule(s) 260 .
  • the fan acoustic constraint selection rule(s) 260 are defined based on user input(s) and stored in the database 212 .
  • the fan acoustic constraint selection rule(s) 260 can include a first or default rule for the fan noise level based on data from the user presence detection analyzer 214 indicating that the user is within a first range of the user presence detection sensor(s) 118 (e.g., 0.5 meters from the device 102 ).
  • the first rule can define a sound pressure level corresponding to 35 dBA for noise generated by the fan(s).
  • the fan acoustic constraint selection rule(s) 260 can include a second rule for the fan noise level based on data from the user presence detection analyzer 214 indicating that the user is within a second range of the user presence detection sensor(s) 118 (e.g., 1 meter from the device 102 ), where the second rule defines a sound pressure level (e.g., 41 dBA) for noise generated by the fan(s) 114 that is greater than the sound pressure level defined by the first rule.
  • the fan acoustic constraint selection rule(s) 260 can include a third rule for the fan noise level based on data from the image data analyzer 220 indicating that the user is turned away from the user device 102 .
  • the third rule can define a fan speed and, thus, acoustic noise level, that is increased relative to the fan speed and associated acoustic noise defined by the first or default fan acoustic rule in view of the determination that the user is not interacting or not likely interacting with the device 102 .
  • the fan acoustic constraint selection rule(s) 260 can include a fourth rule indicating that if the device configuration analyzer 218 determines that an angle of a display screen of the device 102 is within a particular angle range relative to, for instance, a base of a laptop, the user is sitting when interacting with the device 102 and, thus, located closer to the device than if the user is standing.
  • the fourth rule can define a reduced fan acoustic noise level as compared to if the user is standing or located farther from the device 102 .
  • the fan acoustic constraint selection rule(s) 260 can include a fifth rule indicating that if the device configuration analyzer 218 determines that headphones are coupled to the device 102 and/or the image data analyzer 220 determines that the user is wearing headphones, the fan acoustic noise can be increased relative to the default fan noise level.
  • the fan acoustic constraint selection rule(s) 260 can include a sixth rule indicating that if the ambient noise analyzer 234 determines that the ambient noise exceeds an ambient noise threshold, the fan acoustic noise can be increased relative to the default fan noise level.
  • the fan acoustic constraint selection rule(s) 260 can include a seventh rule indicating that if the device configuration analyzer 218 , the image data analyzer 220 , and/or the motion data analyzer 222 do not detect a user input and/or predict a likelihood of a user interaction with the device 102 within the time interval threshold(s) 246 as monitored by the timer 244 , the fan acoustic noise should be increased because the user is not likely interacting with the device 102 .
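  • the distance-, headphone-, and attention-based rules above can be sketched as a small lookup. This is a hedged illustration: the 35 dBA and 41 dBA figures come from the text, while the +6 dBA allowance, the function name, and the parameters are assumptions:

```python
# Illustrative rule table for the fan acoustic constraints: 35 dBA when the
# user is within ~0.5 m, 41 dBA when the user is only within ~1 m, and an
# increased allowance when headphones are detected, the user is turned away,
# or no user is detected at all.

DEFAULT_FAN_DBA = 35.0

def select_fan_acoustic_constraint(distance_m=None, headphones=False,
                                   facing_screen=True):
    if headphones or not facing_screen:
        return DEFAULT_FAN_DBA + 6.0   # user unlikely to notice fan noise
    if distance_m is not None and distance_m <= 0.5:
        return DEFAULT_FAN_DBA         # first (default) rule: user close by
    if distance_m is not None and distance_m <= 1.0:
        return 41.0                    # second rule: user farther away
    return DEFAULT_FAN_DBA + 6.0       # no user detected nearby
```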
  • the thermal constraint selector 252 and the fan acoustic constraint selector 258 can communicate to optimize performance of the device 102 , thermal constraints for the skin of the device 102 , and fan acoustic noise levels in view of user interaction(s) and/or ambient conditions. For example, if the device configuration analyzer 218 determines that the user is providing user inputs via an external device, the thermal constraint selector 252 can select a first thermal constraint that results in increased skin temperature of the device (e.g., 46° C.) relative to a default temperature (e.g., 45° C.).
  • the fan acoustic constraint selector 258 can select a first fan acoustic constraint for the device 102 that permits for a modest increase in fan noise level(s) (e.g., 38 dBA) over a default level (e.g., 35 dBA) to accommodate the increased heat permitted by the first thermal constraint and prevent overheating of the device 102 .
  • the thermal constraint selector 252 can select a second thermal constraint for the device 102 that provides for an increased skin temperature (e.g., 48° C.) over the first thermal constraint (e.g., 46° C.) and the default thermal constraint (e.g., 45° C.) and, thus, permits increased device performance as a result of increased power consumption by the device component(s).
  • the fan acoustic constraint selector 258 can select a second fan acoustic constraint for the device 102 that permits an increase in fan noise level(s) (e.g., 41 dBA) over the first fan constraint (e.g., 38 dBA) and the default fan acoustic constraint (e.g., 35 dBA). Because the device 102 is in a loud environment, the performance of the device 102 can be increased by permitting increased heat to be generated by the component(s) of the device 102 as compared to if the device 102 were in a quiet environment and the fan acoustic constraints were limited in view of low ambient noise.
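  • the worked example above (external input permitting 46° C./38 dBA, and a loud environment permitting 48° C./41 dBA) can be condensed into a small coordination sketch. The function itself is illustrative; only the numeric pairs come from the text:

```python
# Sketch of how the thermal and fan acoustic selectors might coordinate:
# each relaxation of the skin-temperature limit is paired with a fan noise
# allowance large enough to exhaust the extra heat.

def coordinate_constraints(external_input, loud_environment):
    skin_c, fan_dba = 45.0, 35.0        # default constraint pair
    if external_input:
        skin_c, fan_dba = 46.0, 38.0    # external keyboard: modest relaxation
    if loud_environment:
        skin_c, fan_dba = 48.0, 41.0    # loud environment: further relaxation
    return skin_c, fan_dba
```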
  • the thermal constraint manager 132 of FIG. 2 includes a power source manager 238 .
  • the power source manager 238 generates instruction(s) that are transmitted to the power source(s) 116 of the user device 102 of FIG. 1 to control the power provided to the processor 130 and/or other hardware components of the user device 102 (e.g., a video graphics card).
  • increasing the power provided to the hardware component(s) of the device 102 increases the performance level of those component(s) (e.g., the responsiveness, availability, reliability, recoverability, and/or throughput of the processor 130 ).
  • the thermal constraint selector 252 communicates with the power source manager 238 to increase or decrease the power provided to the hardware component(s) of the device 102 in view of the selected thermal constraint(s). For example, if the thermal constraint selector 252 selects a thermal constraint for the device skin temperature that allows the skin temperature to increase relative to a default skin temperature limit, the power source manager 238 generates instructions for increased power to be provided to the hardware component(s) of the device 102 .
  • if the thermal constraint selector 252 determines that the skin temperature of the device 102 should be reduced (e.g., in response to a change in user interaction with the device 102 ), the power source manager 238 generates instructions for power provided to the hardware component(s) of the device 102 to be reduced to decrease the amount of heat generated by the component(s). The example power source manager 238 transmits the instruction(s) to the power source 116 via one or more wired or wireless connections.
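  • one way to picture the power adjustment described above is a component power budget that scales with the selected skin-temperature limit. The linear mapping and every constant below are assumptions for illustration, not values from the patent:

```python
# Minimal sketch: when the selected skin-temperature limit rises, the power
# budget for the hardware components is raised; when the limit drops, the
# budget is cut so less heat is generated.

def power_budget_w(skin_limit_c, base_limit_c=45.0, base_budget_w=15.0,
                   watts_per_deg=2.0):
    """Scale the component power budget with the allowed skin temperature."""
    return base_budget_w + watts_per_deg * (skin_limit_c - base_limit_c)
```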
  • the example thermal constraint manager 132 of FIG. 2 includes a fan speed manager 240 .
  • the fan speed manager 240 generates instruction(s) to control the fan speed (e.g., revolutions per minute) of the fan(s) 114 of the user device 102 of FIG. 1 in response to selection of a fan acoustic constraint by the fan acoustic constraint selector 258 .
  • the fan speed manager 240 generates instruction(s) to control speed of the fan(s) 114 in response to selection of a thermal constraint by the thermal constraint selector 252 to prevent, for instance, overheating of the hardware component(s) of the device when the selected thermal constraint permits an increase in skin temperature of the device 102 .
  • the fan speed manager 240 transmits the instruction(s) to the fan(s) 114 via one or more wired or wireless connections.
  • the fan acoustic constraint selector 258 selects a fan acoustic constraint associated with increased fan acoustic noise when the user presence detection analyzer 214 does not detect the presence of a user within the range of the user presence detection sensor(s) 118 or when the user presence detection analyzer 214 determines that the user is at least a predefined distance from the device 102 to facilitate cleaning of heatsink(s) and fan shroud(s) of the device 102 (e.g., to remove accumulated dust).
  • the fan acoustic constraint selector 258 can select a fan acoustic constraint for the device 102 and communicate with the fan speed manager 240 to perform the cleaning when user(s) are not proximate to the device 102 .
  • the acoustic noise of the fan(s) 114 can be increased without disrupting a user interacting with the device 102 and longevity of the device performance can be increased through periodic cleanings.
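  • the cleaning behavior above can be sketched as a simple gating rule; the RPM figures and distance threshold below are hypothetical values chosen only for illustration:

```python
# Sketch of the dust-purge gating: when no user is detected, or the user is
# beyond a predefined distance, the fan may spin up to dislodge dust from
# the heatsink and fan shroud without disturbing anyone.

def cleaning_fan_rpm(user_present, distance_m, min_distance_m=2.0,
                     normal_rpm=2500, cleaning_rpm=6000):
    if not user_present or distance_m >= min_distance_m:
        return cleaning_rpm   # no one nearby: high-speed dust purge allowed
    return normal_rpm         # user close: stay at the normal, quieter speed
```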
  • the example thermal constraint selector 252 of FIGS. 1 and/or 2 dynamically selects the thermal constraint to be assigned to the device 102 based on analysis of the sensor data. For example, at a first time, the thermal constraint selector 252 can select a first thermal constraint for the device 102 that corresponds to increased temperature of the skin of the housing of the device 102 based on data indicating the user is providing input(s) via the external keyboard 108 . If, at a later time, the gesture data analyzer detects that the user is reaching for the display screen 103 of the device 102 , the thermal constraint selector 252 selects a second thermal constraint for the device 102 that reduces the skin temperature of the device.
  • the power source manager 238 generates instructions to adjust the power provided to the hardware component(s) of the device to reduce heat generated and/or the fan speed manager 240 generates instructions to adjust the fan speed(s) (e.g., increase the fan speed(s) to exhaust hot air) in view of the change in the thermal constraint selected for the device 102 .
  • the thermal constraint selector 252 and/or the fan acoustic constraint selector 258 selectively adjust the constraint(s) applied to the device 102 based on temperature data generated by the temperature sensor(s) 126 during operation of the device. For example, if increased power is provided to the hardware component(s) of the device 102 in response to selection of a thermal constraint that permits increased skin temperature of the housing of the device 102 , the fan speed manager 240 can instruct the fan(s) 114 to increase rotational speed to prevent the skin temperature from exceeding the selected thermal constraint based on data from the temperature sensor(s) 126 .
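  • the feedback described above (raising fan speed as the measured skin temperature approaches the selected limit) resembles proportional control. The gain, ramp margin, and RPM bounds below are illustrative assumptions, not values from the patent:

```python
# Proportional-control sketch: fan speed ramps up as the measured skin
# temperature nears the selected limit, so the limit is not exceeded.

def fan_rpm(skin_temp_c, skin_limit_c, min_rpm=1500, max_rpm=6000,
            gain_rpm_per_deg=900):
    error = skin_temp_c - (skin_limit_c - 3.0)  # start ramping 3 C early
    rpm = min_rpm + gain_rpm_per_deg * max(0.0, error)
    return min(max_rpm, rpm)                    # clamp to the fan's max speed
```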
  • While the example thermal constraint manager 132 of FIGS. 1 and/or 2 is discussed in connection with analysis of sensor data from the user presence detection sensor(s) 118 , the user input sensor(s) 120 , the image sensor(s) 122 , and/or the ambient noise sensor(s) 124 , the example thermal constraint manager 132 can analyze data based on other sensors of the user device 102 of FIG. 1 (e.g., ambient light sensor(s)) to evaluate user interaction(s) and/or the environment in which the device 102 is located and assign thermal and/or fan acoustic constraints to the device 102 .
  • While an example manner of implementing the thermal constraint manager 132 of FIG. 1 is illustrated in FIG. 2 , one or more of the elements, processes and/or devices illustrated in FIG. 2 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any other way.
  • the example user presence detection analyzer 214 , the example device configuration analyzer 218 , the example image data analyzer 220 , the example motion data analyzer 222 , the example ambient noise analyzer 234 , the example temperature analyzer 236 , the example power source manager 238 , the example fan speed manager 240 , the example timer 244 , the example sensor manager 248 , the example thermal constraint selector 252 , the example fan acoustic constraint selector 258 , the example database 212 and/or, more generally, the example thermal constraint manager 132 of FIG. 2 may be implemented by hardware, software, firmware and/or any combination of hardware, software and/or firmware.
  • at least one of the foregoing example elements and/or the example database 212 is/are hereby expressly defined to include a non-transitory computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. including the software and/or firmware.
  • the example thermal constraint manager 132 of FIG. 2 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in FIG. 2 , and/or may include more than one of any or all of the illustrated elements, processes and devices.
  • the phrase “in communication,” including variations thereof, encompasses direct communication and/or indirect communication through one or more intermediary components, and does not require direct physical (e.g., wired) communication and/or constant communication, but rather additionally includes selective communication at periodic intervals, scheduled intervals, aperiodic intervals, and/or one-time events.
  • While an example manner of implementing the training manager 224 is illustrated in FIG. 2 , one or more of the elements, processes and/or devices illustrated in FIG. 2 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any other way. Further, the example trainer 226 , the example machine learning engine 228 , the example database 232 and/or, more generally, the example training manager 224 of FIG. 2 may be implemented by hardware, software, firmware and/or any combination of hardware, software and/or firmware.
  • any of the example trainer 226 , the example machine learning engine 228 , the example database 232 and/or, more generally, the example training manager 224 could be implemented by one or more analog or digital circuit(s), logic circuits, programmable processor(s), programmable controller(s), graphics processing unit(s) (GPU(s)), digital signal processor(s) (DSP(s)), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)) and/or field programmable logic device(s) (FPLD(s)).
  • At least one of the example trainer 226 , the example machine learning engine 228 , and/or the example database 232 is/are hereby expressly defined to include a non-transitory computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. including the software and/or firmware.
  • the example training manager 224 of FIG. 2 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in FIG. 2 , and/or may include more than one of any or all of the illustrated elements, processes and devices.
  • FIG. 3 illustrates a graph 300 of example thermal constraints that may be implemented in connection with an electronic user device such as the example user device 102 of FIG. 1 to control a temperature of an exterior surface, or skin, of the device (e.g., a housing or body of the device).
  • the example graph 300 of FIG. 3 illustrates temperature of the skin of the user device 102 over time for different thermal constraints.
  • a default temperature for the skin of the device 102 can be set at 45° C., as represented by line 302 in FIG. 3 .
  • a first thermal constraint 304 corresponds to a default thermal constraint in that, when implemented by the device 102 , the skin temperature of the user device 102 does not exceed the default skin temperature represented by line 302 .
  • the thermal constraint manager 132 of FIGS. 1 and/or 2 determines that a thermal constraint that permits the skin temperature of the device 102 to increase can be selected in view of, for instance, user interaction(s) with the device 102 .
  • a second thermal constraint 306 provides for an increase in skin temperature relative to the first thermal constraint 304 (e.g., a skin temperature limit of 46° C.).
  • a third thermal constraint 308 and a fourth thermal constraint 310 permit additional increases in skin temperature relative to the first and second thermal constraints 304 , 306 .
  • the power source manager 238 of the example thermal constraint manager 132 generates instructions to increase the power provided to the hardware component(s) of the user device 102 , which allows the component(s) to generate more heat without violating the thermal constraint and improves performance of the device 102 .
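  • the escalating limits of FIG. 3 can be pictured as a lookup from constraint reference number to skin-temperature limit. The 45° C. and 46° C. entries come from the text; the 47° C. and 48° C. entries for the third and fourth constraints 308 , 310 are hypothetical placeholders that merely show the monotone escalation:

```python
# Sketch of the four thermal constraint levels of FIG. 3 as a lookup table.
# 45 C and 46 C are from the text; 47 C and 48 C are placeholder values.

THERMAL_CONSTRAINTS_C = {
    304: 45.0,  # first (default) thermal constraint
    306: 46.0,  # second constraint: modest relaxation
    308: 47.0,  # third constraint (placeholder value)
    310: 48.0,  # fourth constraint (placeholder value)
}

def is_escalating(levels=THERMAL_CONSTRAINTS_C):
    """Check that each successive constraint permits a higher skin limit."""
    limits = [levels[k] for k in sorted(levels)]
    return all(a < b for a, b in zip(limits, limits[1:]))
```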
  • FIG. 4 illustrates an example user device 400 (e.g., the user device 102 of FIG. 1 ) in which examples disclosed herein may be implemented.
  • the example user device 400 is a laptop.
  • other types of user devices such as desktops or electronic tablets, can be used to implement the examples disclosed herein.
  • FIG. 4 illustrates the user device 400 in a first configuration in which a user 402 interacts with the user device 400 by providing input(s) via an on-board keyboard 404 (e.g., the keyboard 104 ) of the device 400 .
  • the keyboard 404 is supported by a housing 406 of the device 400 , where the housing 406 includes an exterior surface or skin 408 that defines the housing 406 .
  • the example thermal constraint manager 132 of FIGS. 1 and/or 2 can select a thermal constraint for the device 400 that maintains the skin temperature at or substantially at a default level (e.g., the first thermal constraint 304 of FIG. 3 ).
  • the power source manager 238 of the example thermal constraint manager 132 manages power level(s) for the hardware component(s) of the device 400 so that the resulting temperature of the skin 408 does not exceed the thermal constraint. Additionally or alternatively, the thermal constraint manager 132 can determine the user 402 is not wearing headphones based on data generated by, for instance, the device configuration sensor(s) 120 and/or the image data sensor(s) 122 of FIG. 1 .
  • the fan acoustic constraint selector 258 can select a fan acoustic constraint for the device 400 so that the noise generated by the fan(s) of the device 400 (e.g., the fan(s) 114 ) does not exceed, for instance, a default fan noise level of 35 dBA.
  • FIG. 5 illustrates the example user device 400 of FIG. 4 in a second configuration in which the user 402 is interacting with the user device 400 via an external keyboard 500 .
  • the thermal constraint selector 252 can select a thermal constraint (e.g., the second, third, or fourth thermal constraints 306 , 308 , 310 of FIG. 3 ) that permits an increase in a temperature of the skin 408 of the device 400 above the default temperature (e.g., above the temperature associated with the first thermal constraint 304 of FIG. 3 ).
  • power to one or more hardware components of the device 400 and, thus, the performance of those component(s), can be increased.
  • a flowchart representative of example hardware logic, machine readable instructions, hardware implemented state machines, and/or any combination thereof for implementing the example training manager 224 of FIG. 2 is shown in FIG. 6 .
  • the machine readable instructions may be one or more executable programs or portion(s) of an executable program for execution by a computer processor such as the processor 224 shown in the example processor platform 800 discussed below in connection with FIG. 8 .
  • the program may be embodied in software stored on a non-transitory computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a DVD, a Blu-ray disk, or a memory associated with the processor 224 , but the entire program and/or parts thereof could alternatively be executed by a device other than the processor 224 and/or embodied in firmware or dedicated hardware.
  • although the example program is described with reference to the flowchart illustrated in FIG. 6 , many other methods of implementing the example training manager 224 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined.
  • any or all of the blocks may be implemented by one or more hardware circuits (e.g., discrete and/or integrated analog and/or digital circuitry, an FPGA, an ASIC, a comparator, an operational-amplifier (op-amp), a logic circuit, etc.) structured to perform the corresponding operation without executing software or firmware.
  • FIG. 6 is a flowchart of example machine readable instructions that, when executed, implement the example training manager 224 of FIG. 2 .
  • the training manager 224 trains the example thermal constraint manager 132 of FIGS. 1 and/or 2 using training gesture data and/or training facial feature data, which is generated for one or more users who may or may not be using the example user device 102 of FIG. 1 .
  • the training manager 224 generates machine learning models that are used by the thermal constraint manager 132 of FIGS. 1 and/or 2 to select thermal constraint(s) for a temperature of a skin of a user device (e.g., the skin 408 of the housing 406 of the user device 102 , 400 ) and/or fan acoustic constraint(s) for noise generated by fan(s) of the user device (e.g., the fan(s) 114 of the user device 102 ) based on user interaction(s) relative to the user device 102 , 400 .
  • the example instructions of FIG. 6 can be executed by one or more processors of, for instance, the user device 102 , another user device (e.g., the user device 128 ), and/or a cloud-based device (e.g., the cloud-based device(s) 129 ).
  • the instructions of FIG. 6 can be executed in substantially real-time as the training gesture data and/or the training facial feature data is received by the training manager 224 or at some time after the training data is received by the training manager 224 .
  • the training manager 224 can communicate with the thermal constraint manager 132 via one or more wired or wireless communication protocols.
  • the example trainer 226 of FIG. 2 accesses training gesture data 230 and/or training facial feature data 231 (block 600 ).
  • the training gesture data 230 and/or training facial feature data 231 can be stored in the database 232 .
  • the training gesture data 230 and/or training facial feature data 231 is generated for one or more users of the user device 102 .
  • the training gesture data 230 and/or the training facial feature data 231 can be received from the thermal constraint manager 132 and/or directly from the image sensor(s) 122 and/or the motion sensor(s) 123 of the example user device 102 , 400 .
  • the training gesture data 230 and/or the training facial feature data 231 is generated for users who are not the user(s) of the user device 102 .
  • the example trainer 226 of FIG. 2 identifies user interactions (e.g., user interactions with the user device 102 , 400 and/or other user interactions such as talking on a phone) represented by the training gesture data 230 and/or the training facial feature data 231 (block 602 ).
  • based on the training gesture data 230 , the trainer 226 identifies an arm motion in which a user reaches his or her arm forward as indicating that the user intends to touch a touch screen of a user device.
  • based on the training facial feature data 231 , the trainer 226 identifies eye positions indicating that a user is looking toward or away from a display screen of the device.
  • the example trainer 226 of FIG. 2 generates one or more gesture data model(s) 223 via the machine learning engine 228 and based on the training gesture data 230 and one or more facial feature data model(s) 225 via the machine learning engine 228 and based on the training facial feature data 231 (block 604 ).
  • the trainer 226 uses the training gesture data 230 to generate the gesture data model(s) 223 that are used by the thermal constraint manager 132 to determine whether a user is typing on the keyboard 104 , 404 of the user device 102 , 400 .
  • the example trainer 226 can continue to train the thermal constraint manager 132 using different datasets and/or datasets having different levels of specificity (block 606 ). For example, the trainer 226 can generate a first gesture data model 223 to determine if the user is interacting with the keyboard 104 of the user device 102 , 400 and a second gesture data model 223 to determine if the user is interacting with the pointing device(s) 106 of the user device 102 , 400 .
  • the example instructions end when there is no additional training to be performed (e.g., based on user input(s)) (block 608 ).
  • the example instructions of FIG. 6 can be used to perform training based on other types of sensor data.
  • the example instructions of FIG. 6 can be used to train the thermal constraint manager 132 to associate different orientations of the device 102 , 400 , screen angle, etc., with different user positions (e.g., sitting, standing) relative to the device 102 , 400 and/or different locations of the device (e.g., resting on a user's lap, held in a user's hand, resting on a table).
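  • the patent does not specify the learning algorithm used by the machine learning engine 228 , but the training step can be illustrated with a toy nearest-centroid classifier over hypothetical gesture features (wrist height, forward reach); every name and value below is an assumption for illustration:

```python
# Toy nearest-centroid classifier: "training" averages labeled gesture
# feature vectors into per-label centroids, and classification assigns a
# new vector to the label with the closest centroid.

def train_centroids(samples):
    """samples: list of (label, feature_vector) training pairs."""
    sums, counts = {}, {}
    for label, vec in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, vec):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl], vec))
```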
  • a flowchart representative of example hardware logic, machine readable instructions, hardware implemented state machines, and/or any combination thereof for implementing the thermal constraint manager 132 of FIG. 2 is shown in FIGS. 7A-7B .
  • the machine readable instructions may be one or more executable programs or portion(s) of an executable program for execution by a computer processor such as the processor 132 shown in the example processor platform 900 discussed below in connection with FIG. 9 .
  • the program may be embodied in software stored on a non-transitory computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a DVD, a Blu-ray disk, or a memory associated with the processor 132 , but the entire program and/or parts thereof could alternatively be executed by a device other than the processor 132 and/or embodied in firmware or dedicated hardware.
  • although the example program is described with reference to the flowcharts illustrated in FIGS. 7A-7B , many other methods of implementing the example thermal constraint manager 132 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined.
  • any or all of the blocks may be implemented by one or more hardware circuits (e.g., discrete and/or integrated analog and/or digital circuitry, an FPGA, an ASIC, a comparator, an operational-amplifier (op-amp), a logic circuit, etc.) structured to perform the corresponding operation without executing software or firmware.
  • FIGS. 7A and 7B are flowcharts of example machine readable instructions that, when executed, implement the example thermal constraint manager 132 of FIGS. 1 and/or 2 .
  • the thermal constraint manager 132 generates instruction(s) to control the thermal constraint(s) and/or fan acoustic constraint(s) of a user device (e.g., the user device 102 , 400 ) based on a user interaction(s) and/or ambient condition(s) for an environment in which the device is located.
  • the example instructions of FIGS. 7A and 7B can be executed by one or more processors of, for instance, the user device 102 , 400 , another user device (e.g., the user device 128 ), and/or a cloud-based device (e.g., the cloud-based device(s) 129 ).
  • the instructions of FIGS. 7A and 7B can be executed in substantially real-time as sensor data is received by the thermal constraint manager 132 or at some time after the sensor data is received by the thermal constraint manager 132 .
  • the device 102 , 400 can be in a working power state (e.g., a power state in which the device is fully operational in that the display screen is turned on, applications are being executed by processor(s) of the device) or a connected standby state (e.g., a low power standby state in which the device remains connected to the Internet such that processor(s) of the device can respond quickly to hardware and/or network events).
  • the example user presence detection analyzer 214 determines whether the user is within a threshold distance of the user device 102 (block 700 ). For example, the user presence detection analyzer 214 detects a user is approaching the user device 102 , 400 based on data generated by the user presence detection sensor(s) 118 (e.g., TOF data, etc.) indicating that the user is within the range of the user presence detection sensor(s) 118 . In some examples, the user presence detection analyzer 214 determines if the user is within a predefined distance of the device 102 (e.g., within 1 meter, within 0.5 meters, etc.).
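The presence check at block 700 reduces to a threshold comparison on the distance reported by the user presence detection sensor(s). A minimal hypothetical sketch (the function name, the `None` handling for a missing reading, and the 1-meter default are illustrative assumptions, not part of the disclosure):

```python
PRESENCE_THRESHOLD_M = 1.0  # example threshold from the description (e.g., 1 meter)

def user_within_threshold(tof_distance_m, threshold_m=PRESENCE_THRESHOLD_M):
    """Return True when a TOF distance reading places the user within range
    of the device (block 700); a missing reading counts as not present."""
    return tof_distance_m is not None and tof_distance_m <= threshold_m
```

A reading of 0.5 m would satisfy the check, while no reading or a 1.5 m reading would not.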
  • the example device configuration analyzer 218 of the example thermal constraint manager 132 of FIG. 2 determines whether user input(s) are detected within a threshold time (block 702). For example, the timer 244 communicates with the device configuration analyzer 218 to determine the amount of time between when a user presence is detected within a threshold distance of the device 102, 400 (block 700) and when user input(s) are received by the device 102, 400.
  • the device configuration analyzer 218 detects user input(s) at the user device 102 such as keyboard input(s), touch screen input(s), mouse click(s), etc. If the device configuration analyzer 218 determines the user input(s) are detected within the threshold time, control proceeds to block 704 .
  • the device configuration analyzer 218 determines whether the user input(s) are received via external user input device(s) or on-board input device(s). For example, the device configuration analyzer 218 detects user input(s) via the external keyboard 108 and/or the external pointing device(s) 110 or via the on-board keyboard 104 and/or the on-board pointing device(s) 106 .
  • the thermal constraint selector 252 of the example thermal constraint manager 132 of FIG. 2 selects a thermal constraint for a temperature of the skin 408 of the device 102 (e.g., based on the thermal constraint selection rule(s) 254 stored in the database 212 ) that permits an increase in a temperature of a skin 408 of a housing 406 of the device 102 , 400 relative to a default temperature.
  • the power source manager 238 of the example thermal constraint manager 132 of FIG. 2 instructs the hardware component(s) of the device 102 , 400 (e.g., the processor 130 ) to consume increased amounts of power (block 706 ).
  • the thermal constraint selector 252 can select a thermal constraint that permits the skin temperature to increase to, for instance, 47° C. from a default temperature of 45° C.
  • the power source manager 238 communicates with the power source(s) 116 of the device 102 , 400 to increase the power provided to the hardware component(s) of the user device 102 , 400 based on the thermal constraint selected by the thermal constraint selector 252 .
  • the thermal constraint selector 252 of the example thermal constraint manager 132 of FIG. 2 selects a thermal constraint for a temperature of the skin 408 of the device 102 that maintains the temperature of the skin 408 of the housing 406 of the device 102 , 400 at a default temperature and the power source manager 238 of the example thermal constraint manager 132 of FIG. 2 instructs the hardware component(s) of the device 102 , 400 (e.g., the processor 130 ) to consume power so as not to cause the temperature of the skin to exceed the default temperature (block 708 ).
  • the temperature analyzer 236 monitors the temperature of the hardware component(s) of the user device 102 based on the data generated by the temperature sensor(s) 126 and the fan speed manager 240 controls operation of the fan(s) 114 (e.g., increase fan level to exhaust hot air to cool the user device 102 ) to prevent the skin temperature from exceeding the selected thermal constraint at blocks 706 and/or 708 .
  • Control proceeds to block 718 from blocks 706 , 708 .
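The branch at blocks 704-708 amounts to selecting a skin-temperature limit based on where the user's inputs originate. A hypothetical sketch using the 45° C./47° C. example values from the description (the function and its `input_source` strings are illustrative assumptions):

```python
DEFAULT_SKIN_LIMIT_C = 45.0   # default skin-temperature constraint (example value)
RELAXED_SKIN_LIMIT_C = 47.0   # relaxed constraint for external input devices (example value)

def select_skin_limit(input_source):
    """Relax the skin-temperature constraint when inputs arrive via an
    external keyboard or pointing device (block 706); otherwise hold the
    default so the on-board surfaces stay cooler to the touch (block 708)."""
    if input_source == "external":
        return RELAXED_SKIN_LIMIT_C
    return DEFAULT_SKIN_LIMIT_C
```

The power source manager 238 would then budget hardware power against whichever limit is returned.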
  • the device configuration analyzer 218 determines whether the user who is interacting with the device 102 , 400 is wearing headphones 112 .
  • the device configuration analyzer 218 detects whether headphones 112 are coupled with the user device 102 (e.g., via wired or wireless connection(s)) and audio output(s) are being provided via the device 102 , 400 .
  • the image data analyzer 220 determines whether the user is wearing headphones 112 based on image data generated by the image sensor(s) 122 .
  • the fan acoustic constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise (e.g., 36 dBA) in view of the use of headphones 112 by the user and the fan speed manager 240 instructs the fan(s) to increase rotational speed(s) (block 720). If the device configuration analyzer 218 and/or the image data analyzer 220 determine the user is not wearing headphones, control proceeds to block 724.
  • the ambient noise analyzer 234 analyzes microphone data generated by the microphone(s) 124 to determine an ambient noise level for an environment in which the user device 102 , 400 is located.
  • the ambient noise analyzer 234 determines whether the ambient noise level exceeds a threshold (e.g., based on the ambient noise rule(s) 235) (block 726). If the ambient noise level exceeds the threshold, the fan acoustic constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise in view of the noisy surrounding environment and the fan speed manager 240 instructs the fan(s) to increase rotational speed(s) (block 728).
  • the fan acoustic constraint selector 258 selects a default fan acoustic constraint (e.g., based on the fan acoustic constraint selection rule(s) 260) for the fan(s) 114 and the fan speed manager 240 of the example thermal constraint manager 132 of FIG. 1 instructs the fan(s) to rotate at speed(s) that generate noise at or under, for instance, 35 dBA (block 730). Control returns to block 722.
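The headphone and ambient-noise branches (blocks 718-730) can likewise be sketched as a choice between a default and a raised fan-noise ceiling. The 35 dBA and 36 dBA figures come from the description; the ambient-noise threshold value and function below are illustrative assumptions:

```python
DEFAULT_FAN_LIMIT_DBA = 35.0        # default fan acoustic constraint (example value)
RAISED_FAN_LIMIT_DBA = 36.0         # raised constraint, e.g., headphones in use (example value)
AMBIENT_NOISE_THRESHOLD_DBA = 50.0  # assumed threshold; not specified in the disclosure

def select_fan_noise_limit(headphones_on, ambient_dba):
    """Permit louder fan operation when the user wears headphones (block 720)
    or the surrounding environment is already noisy (block 728); otherwise
    keep the default acoustic constraint (block 730)."""
    if headphones_on or ambient_dba > AMBIENT_NOISE_THRESHOLD_DBA:
        return RAISED_FAN_LIMIT_DBA
    return DEFAULT_FAN_LIMIT_DBA
```

The fan speed manager 240 would then cap rotational speed so that generated noise stays under the returned limit.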
  • the image data analyzer 220 and/or the motion data analyzer 222 analyze user gesture(s) (e.g., movements, posture) and/or facial feature(s) (e.g., eye gaze) based on data generated by the image sensor(s) 122 and/or the motion sensor(s) 123 (block 710 ).
  • the sensor manager 248 activates the image sensor(s) 122 to generate image data when the user is detected as being proximate to the device (block 700 ).
  • the image data analyzer 220 analyzes image data generated by the image sensor(s) 122 to detect, for instance, a user's posture and/or eye gaze direction. Additionally or alternatively, the motion data analyzer 222 can analyze gesture data generated by the motion sensor(s) 123 to determine user gesture(s) (e.g., raising an arm, reaching a hand away from the user's body). In the example of FIGS. 7A and 7B , the image data analyzer 220 and/or the motion data analyzer 222 use machine-learning based model(s) 223 , 225 to determine if a user is likely to interact with the user device 102 .
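The decision fed by the model(s) 223, 225 reduces to thresholding an interaction score. A hypothetical sketch (the score scale and the 0.5 cutoff are assumptions; the disclosure does not specify the models' output format):

```python
INTERACTION_SCORE_THRESHOLD = 0.5  # assumed decision cutoff

def likely_to_interact(model_score):
    """Treat a gesture/gaze model score at or above the cutoff as a
    prediction that the user will interact with the device."""
    return model_score >= INTERACTION_SCORE_THRESHOLD
```

A high score (e.g., a user reaching toward the keyboard) would steer control toward the default constraints in anticipation of contact; a low score would allow the relaxed constraints.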
  • the fan acoustic constraint selector 258 selects a default fan acoustic constraint (e.g., based on the fan acoustic constraint selection rule(s) 260) for the fan(s) 114 of the device 102, 400 (block 714). Based on the default fan acoustic constraint, the fan speed manager 240 of the example thermal constraint manager 132 of FIG. 1 instructs the fan(s) to rotate at speed(s) that generate noise at or under, for instance, 35 dBA.
  • the thermal constraint selector 252 selects a default thermal constraint for the skin temperature of the device 102 , 400 so the skin of the device 102 , 400 does not exceed a temperature of, for instance, 45° C. in anticipation of the user interacting with the device. Thereafter, control returns to block 702 to detect if user input(s) have been received at the device 102 , 400 .
  • the fan constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise to more efficiently cool the device 102 , 400 (e.g., while the device 102 , 400 is in a working power state) and/or to clean the fan(s) 114 (block 716 ).
  • Control proceeds to block 722 .
  • one or more of the user presence detection analyzer 214 , the device configuration analyzer 218 , the image data analyzer 220 , and/or the motion data analyzer 222 determines whether there is a change in user interaction with the user device 102 and/or a change in a likelihood that the user will interact with the user device 102 (block 722 ).
  • the user presence detection analyzer 214 can detect whether a user is no longer present based on the data generated by the user presence detection sensor(s) 118 .
  • the motion data analyzer 222 detects a user is reaching for the pointing device(s) 106 based on the data generated by the motion sensor(s) 123 and the gesture data model(s) 223 after a period of time in which the user was not interacting with the device 102 , 400 . If the one or more of the user presence detection analyzer 214 , the device configuration analyzer 218 , the image data analyzer 220 , and/or the motion data analyzer 222 detect a change in user interaction with the user device 102 and/or a change in a likelihood of a user interaction with the user device 102 , control returns to block 710 to analyze user behavior relative to the device 102 . If no change in user interaction with the device 102 and/or likelihood of user interaction is detected, control proceeds to block 734 .
  • the example instructions of FIGS. 7A and 7B continue until the user device 102 enters a sleep mode (block 734), at which time the fan speed manager 240 disables the fan(s) 114 (block 736). If the device 102, 400 returns to a working power state (or, in some examples, a connected standby state) (block 738), the example instructions of FIGS. 7A and 7B resume with detecting presence of the user proximate to the device 102, 400 (and moving component(s) such as the processor 130 and fan(s) 114 to a higher power state) (block 700). The example instructions end when the device 102, 400 is powered off (blocks 740, 742).
  • the machine readable instructions described herein in connection with FIGS. 6 and/or 7A-7B may be stored in one or more of a compressed format, an encrypted format, a fragmented format, a compiled format, an executable format, a packaged format, etc.
  • Machine readable instructions as described herein may be stored as data (e.g., portions of instructions, code, representations of code, etc.) that may be utilized to create, manufacture, and/or produce machine executable instructions.
  • the machine readable instructions may be fragmented and stored on one or more storage devices and/or computing devices (e.g., servers).
  • the machine readable instructions may require one or more of installation, modification, adaptation, updating, combining, supplementing, configuring, decryption, decompression, unpacking, distribution, reassignment, compilation, etc. in order to make them directly readable, interpretable, and/or executable by a computing device and/or other machine.
  • the machine readable instructions may be stored in multiple parts, which are individually compressed, encrypted, and stored on separate computing devices, wherein the parts when decrypted, decompressed, and combined form a set of executable instructions that implement a program such as that described herein.
  • the machine readable instructions may be stored in a state in which they may be read by a computer, but require addition of a library (e.g., a dynamic link library (DLL)), a software development kit (SDK), an application programming interface (API), etc. in order to execute the instructions on a particular computing device or other device.
  • the machine readable instructions may need to be configured (e.g., settings stored, data input, network addresses recorded, etc.) before the machine readable instructions and/or the corresponding program(s) can be executed in whole or in part.
  • the disclosed machine readable instructions and/or corresponding program(s) are intended to encompass such machine readable instructions and/or program(s) regardless of the particular format or state of the machine readable instructions and/or program(s) when stored or otherwise at rest or in transit.
  • the machine readable instructions described herein can be represented by any past, present, or future instruction language, scripting language, programming language, etc.
  • the machine readable instructions may be represented using any of the following languages: C, C++, Java, C #, Perl, Python, JavaScript, HyperText Markup Language (HTML), Structured Query Language (SQL), Swift, etc.
  • FIGS. 6 and/or 7A-7B may be implemented using executable instructions (e.g., computer and/or machine readable instructions) stored on a non-transitory computer and/or machine readable medium such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information).
  • a non-transitory computer readable medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and to exclude transmission media.
  • the phrase "A, B, and/or C" refers to any combination or subset of A, B, C such as (1) A alone, (2) B alone, (3) C alone, (4) A with B, (5) A with C, (6) B with C, and (7) A with B and with C.
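The seven combinations enumerated above are exactly the non-empty subsets of {A, B, C}, which can be checked mechanically:

```python
from itertools import combinations

# enumerate every non-empty subset of {A, B, C}
subsets = [set(c) for r in (1, 2, 3) for c in combinations("ABC", r)]
print(len(subsets))  # 7 subsets, matching items (1)-(7) above
```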
  • the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B.
  • the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B.
  • FIG. 8 is a block diagram of an example processor platform 800 structured to execute the instructions of FIG. 6 to implement the training manager 224 of FIG. 2 .
  • the processor platform 800 can be, for example, a server, a personal computer, a workstation, a self-learning machine (e.g., a neural network), a mobile device (e.g., a cell phone, a smart phone, a tablet such as an iPad™), a personal digital assistant (PDA), an Internet appliance, a headset or other wearable device, or any other type of computing device.
  • the processor platform 800 of the illustrated example includes a processor 224 .
  • the processor 224 of the illustrated example is hardware.
  • the processor 224 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer.
  • the hardware processor may be a semiconductor based (e.g., silicon based) device.
  • the processor implements the example trainer 226 and the example machine learning engine 228 .
  • the processor 224 of the illustrated example includes a local memory 813 (e.g., a cache).
  • the processor 224 of the illustrated example is in communication with a main memory including a volatile memory 814 and a non-volatile memory 816 via a bus 818 .
  • the volatile memory 814 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device.
  • the non-volatile memory 816 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 814 , 816 is controlled by a memory controller.
  • the processor platform 800 of the illustrated example also includes an interface circuit 820 .
  • the interface circuit 820 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
  • one or more input devices 822 are connected to the interface circuit 820 .
  • the input device(s) 822 permit(s) a user to enter data and/or commands into the processor 224 .
  • the input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
  • One or more output devices 824 are also connected to the interface circuit 820 of the illustrated example.
  • the output devices 824 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker.
  • the interface circuit 820 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
  • the interface circuit 820 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 826 .
  • the communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
  • the processor platform 800 of the illustrated example also includes one or more mass storage devices 828 for storing software and/or data.
  • mass storage devices 828 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
  • the machine executable instructions 832 of FIG. 6 may be stored in the mass storage device 828 , in the volatile memory 814 , in the non-volatile memory 816 , and/or on a removable non-transitory computer readable storage medium such as a CD or DVD.
  • FIG. 9 is a block diagram of an example processor platform 900 structured to execute the instructions of FIGS. 7A and 7B to implement the thermal constraint manager 132 of FIGS. 1 and/or 2 .
  • the processor platform 900 can be, for example, a server, a personal computer, a workstation, a self-learning machine (e.g., a neural network), a mobile device (e.g., a cell phone, a smart phone, a tablet such as an iPad™), a personal digital assistant (PDA), an Internet appliance, a headset or other wearable device, or any other type of computing device.
  • the processor platform 900 of the illustrated example includes a processor 132 .
  • the processor 132 of the illustrated example is hardware.
  • the processor 132 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer.
  • the hardware processor may be a semiconductor based (e.g., silicon based) device.
  • the processor implements the example user presence detection analyzer 214 , the example device configuration analyzer 218 , the example image data analyzer 220 , the example motion data analyzer 222 , the example ambient noise analyzer 234 , the example temperature analyzer 236 , the example power source manager 238 , the example fan speed manager 240 , the example timer 244 , the example sensor manager 248 , the example thermal constraint selector 252 , and the example fan acoustic constraint selector 258 .
  • the processor 132 of the illustrated example includes a local memory 913 (e.g., a cache).
  • the processor 132 of the illustrated example is in communication with a main memory including a volatile memory 914 and a non-volatile memory 916 via a bus 918 .
  • the volatile memory 914 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device.
  • the non-volatile memory 916 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 914 , 916 is controlled by a memory controller.
  • the processor platform 900 of the illustrated example also includes an interface circuit 920 .
  • the interface circuit 920 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
  • one or more input devices 922 are connected to the interface circuit 920 .
  • the input device(s) 922 permit(s) a user to enter data and/or commands into the processor 132 .
  • the input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
  • One or more output devices 924 are also connected to the interface circuit 920 of the illustrated example.
  • the output devices 924 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker.
  • the interface circuit 920 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
  • the interface circuit 920 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 926 .
  • the communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-sight wireless system, a cellular telephone system, etc.
  • the processor platform 900 of the illustrated example also includes one or more mass storage devices 928 for storing software and/or data.
  • mass storage devices 928 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
  • the machine executable instructions 932 of FIGS. 7A and 7B may be stored in the mass storage device 928 , in the volatile memory 914 , in the non-volatile memory 916 , and/or on a removable non-transitory computer readable storage medium such as a CD or DVD.
  • example methods, apparatus and articles of manufacture have been disclosed that provide for dynamic control of thermal constraints and/or fan acoustic constraints of an electronic user device (e.g., a laptop, a tablet).
  • Examples disclosed herein analyze sensor data indicative of, for instance, user interaction(s) with the device, other user activities (e.g., talking on a phone), and ambient noise to determine if a temperature of a skin of the device can be increased and/or if audible noises associated with rotation of the fan(s) of the device can be increased.
  • Examples disclosed herein detect opportunities for increased skin temperature (e.g., when a user is interacting with the device via an external keyboard) and/or increased fan noise (e.g., when a user is located a threshold distance from the device or in a noisy environment).
  • examples disclosed herein enable increased power to be provided to the hardware component(s) of the device and, thus, can improve performance (e.g., processing performance) of the device.
  • examples disclosed herein provide for efficient cooling of the device.
  • the disclosed methods, apparatus and articles of manufacture improve the efficiency of using a computing device by selectively managing thermal constraint(s) for the device to optimize device performance and cooling in view of user interactions with the device and/or ambient conditions.
  • the disclosed methods, apparatus and articles of manufacture are accordingly directed to one or more improvement(s) in the functioning of a computer.
  • Example methods, apparatus, systems, and articles of manufacture to implement thermal management of electronic user devices are disclosed herein. Further examples and combinations thereof include the following:
  • Example 1 includes an electronic device including a housing, a fan, a first sensor, a second sensor, and a processor to at least one of analyze first sensor data generated by the first sensor to detect a presence of a subject proximate to the electronic device or analyze second sensor data generated by the second sensor to detect a gesture of the subject, and adjust one or more of an acoustic noise level generated by the fan or a temperature of an exterior surface of the housing based on one or more of the presence of the subject or the gesture.
  • Example 2 includes the electronic device of example 1, wherein the second sensor includes a camera.
  • Example 3 includes the electronic device of examples 1 or 2, wherein the processor is to adjust the acoustic noise level by generating an instruction to increase a rotational speed of the fan.
  • Example 4 includes the electronic device of any of examples 1-3, wherein the processor is to adjust the temperature of the exterior surface of the device by controlling a power source of the device.
  • Example 5 includes the electronic device of any of examples 1-4, further including a microphone, the processor to analyze third sensor data generated by the microphone to detect ambient noise in an environment including the device, and adjust the acoustic noise level of the fan based on the ambient noise.
  • Example 6 includes the electronic device of example 1, further including a keyboard carried by the housing, wherein the processor is to detect an input via the keyboard and adjust the temperature of the exterior surface of the housing based on the detection of the input.
  • Example 7 includes the electronic device of example 1, further including a keyboard external to the housing, wherein the processor is to detect an input via the keyboard and adjust the temperature of the exterior surface of the housing based on the detection of the input.
  • Example 8 includes the electronic device of example 1, wherein the processor is to adjust the acoustic noise level during cleaning of the fan and based on a distance of the user being within a threshold distance from the electronic device.
  • Example 9 includes an apparatus including a user presence detection analyzer, an image data analyzer, a motion data analyzer, at least one of (a) the user presence detection analyzer to identify a presence of a user relative to an electronic device based on first sensor data generated by a first sensor of the electronic device or (b) at least one of the image data analyzer or the motion data analyzer to determine a gesture of the user relative to the device based on second sensor data generated by a second sensor of the electronic device, a thermal constraint selector to select a thermal constraint for a temperature of an exterior surface of the electronic device based on one or more of the presence of the user or the gesture, and a power source manager to adjust a power level for a processor of the electronic device based on the thermal constraint.
  • Example 10 includes the apparatus of example 9, further including a device configuration analyzer to detect a presence of an external user input device communicatively coupled to the electronic device.
  • Example 11 includes the apparatus of example 10, wherein the external device is at least one of a keyboard, a pointing device, or headphones.
  • Example 12 includes the apparatus of example 9, wherein the second sensor data is image data and the image data analyzer is to determine the gesture based on a machine learning model.
  • Example 13 includes the apparatus of examples 9 or 12, wherein the second sensor data is image data and wherein the image data analyzer is to detect a position of an eye of the user relative to a display screen of the electronic device.
  • Example 14 includes the apparatus of example 9, further including a fan acoustic constraint selector to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
  • Example 15 includes the apparatus of example 14, further including an ambient noise analyzer to determine an ambient noise level based on ambient noise data generated by a microphone of the electronic device, the fan acoustic constraint selector to select the fan acoustic constraint based on the ambient noise level.
  • Example 16 includes the apparatus of example 14, wherein the user presence detection analyzer is further to determine a distance of the user from the electronic device, the fan acoustic constraint selector to select the fan acoustic constraint based on the distance.
  • Example 17 includes the apparatus of example 14, wherein the fan acoustic constraint selector is to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
  • Example 18 includes the apparatus of example 14, wherein the image data analyzer is to detect that the user is wearing headphones based on image data generated by the second sensor, the fan acoustic constraint selector to select the fan acoustic constraint based on the detection of the headphones.
  • Example 19 includes at least one non-transitory computer readable storage medium including instructions that, when executed, cause a machine to at least identify one or more of (a) a presence of a user relative to an electronic device based on first sensor data generated by a first sensor of the electronic device, (b) a facial feature of the user based on second sensor data generated by a second sensor of the electronic device, or (c) a gesture of the user based on the second sensor data, select a thermal constraint for a temperature of an exterior surface of the electronic device based on one or more of the presence of the user, the facial feature, or the gesture, and adjust a power level for a processor of the electronic device based on the thermal constraint.
  • Example 20 includes the at least one non-transitory computer readable storage medium of example 19, wherein the instructions, when executed, further cause the machine to detect a presence of an external user input device communicatively coupled to the electronic device.
  • Example 21 includes the at least one non-transitory computer readable storage medium of example 19, wherein the instructions, when executed, further cause the machine to identify the gesture based on a machine learning model.
  • Example 22 includes the at least one non-transitory computer readable storage medium of examples 19 or 21, wherein the facial feature includes an eye position and wherein the instructions, when executed, further cause the machine to detect a position of an eye of the user relative to a display screen of the electronic device.
  • Example 23 includes the at least one non-transitory computer readable storage medium of examples 19 or 20, wherein the instructions, when executed, further cause the machine to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
  • Example 24 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to determine an ambient noise level based on ambient noise data generated by a microphone of the electronic device and select the fan acoustic constraint based on the ambient noise level.
  • Example 25 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to detect that the user is wearing headphones based on image data generated by the second sensor and select the fan acoustic constraint based on the detection of the headphones.
  • Example 26 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to determine a distance of the user from the electronic device and select the fan acoustic constraint based on the distance.
  • Example 27 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
  • Example 28 includes a method including at least one of (a) identifying a presence of a user relative to an electronic device based on first sensor data generated by a first sensor of the electronic device, (b) identifying a facial feature of the user based on second sensor data generated by a second sensor of the electronic device, or (c) identifying a gesture of the user based on the second sensor data, selecting a thermal constraint for a temperature of an exterior surface of the electronic device based on one or more of the presence of the user, the facial feature, or the gesture, and adjusting a power level for a processor of the electronic device based on the thermal constraint.
  • Example 29 includes the method of example 28, further including detecting a presence of an external user input device communicatively coupled to the electronic device.
  • Example 30 includes the method of example 28, further including determining the one or more of the facial feature or the gesture based on a machine learning model.
  • Example 31 includes the method of examples 28 or 30, wherein the facial feature includes eye position and further including detecting a position of an eye of the user relative to a display screen of the electronic device.
  • Example 32 includes the method of examples 28 or 29, further including selecting a fan acoustic constraint for a noise level to be generated by a fan of the electronic device.
  • Example 33 includes the method of example 32, further including determining an ambient noise level based on ambient noise data generated by a microphone of the electronic device and selecting the fan acoustic constraint based on the ambient noise level.
  • Example 34 includes the method of example 32, further including detecting that the user is wearing headphones based on image data generated by the second sensor and selecting the fan acoustic constraint based on the detection of the headphones.
  • Example 35 includes the method of example 32, further including determining a distance of the user from the electronic device and selecting the fan acoustic constraint based on the distance.
  • Example 36 includes the method of example 32, further including selecting the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.

Abstract

Apparatus and methods for thermal management of electronic user devices are disclosed herein. An example electronic device disclosed herein includes a housing, a fan, a first sensor, a second sensor, and a processor to at least one of analyze first sensor data generated by the first sensor to detect a presence of a subject proximate to the electronic device or analyze second sensor data generated by the second sensor to detect a gesture of the subject, and adjust one or more of an acoustic noise level generated by the fan or a temperature of an exterior surface of the housing based on one or more of the presence of the subject or the gesture.

Description

FIELD OF THE DISCLOSURE
This disclosure relates generally to electronic user devices and, more particularly, to apparatus and methods for thermal management of electronic user devices.
BACKGROUND
During operation of an electronic user device (e.g., a laptop, a tablet), hardware components of the device, such as a processor, a graphics card, and/or battery, generate heat. Electronic user devices include one or more fans to promote airflow to cool the device during use and prevent overheating of the hardware components.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example system constructed in accordance with teachings of this disclosure and including an example user device and an example thermal constraint manager for controlling a thermal constraint of the user device.
FIG. 2 is a block diagram of an example implementation of the thermal constraint manager of FIG. 1.
FIG. 3 illustrates example thermal constraints that may be implemented with the example user device of FIG. 1.
FIG. 4 illustrates an example user device constructed in accordance with teachings of this disclosure and, in particular, illustrates the user device in a first configuration associated with a first thermal constraint of the user device.
FIG. 5 illustrates the example user device of FIG. 4 and, in particular, illustrates the user device in a second configuration associated with a second thermal constraint of the user device.
FIG. 6 is a flowchart representative of example machine readable instructions which may be executed to implement the example training manager of FIG. 2.
FIGS. 7A and 7B are flowcharts representative of example machine readable instructions which may be executed to implement the example thermal constraint manager of FIGS. 1 and/or 2.
FIG. 8 is a block diagram of an example processing platform structured to execute the instructions of FIG. 6 to implement the example training manager of FIG. 2.
FIG. 9 is a block diagram of an example processing platform structured to execute the instructions of FIGS. 7A and 7B to implement the example thermal constraint manager of FIGS. 1 and/or 2.
The figures are not to scale. In general, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts.
Descriptors “first,” “second,” “third,” etc. are used herein when identifying multiple elements or components which may be referred to separately. Unless otherwise specified or understood based on their context of use, such descriptors are not intended to impute any meaning of priority, physical order or arrangement in a list, or ordering in time but are merely used as labels for referring to multiple elements or components separately for ease of understanding the disclosed examples. In some examples, the descriptor “first” may be used to refer to an element in the detailed description, while the same element may be referred to in a claim with a different descriptor such as “second” or “third.” In such instances, it should be understood that such descriptors are used merely for ease of referencing multiple elements or components.
DETAILED DESCRIPTION
During operation of an electronic user device (e.g., a laptop, a tablet), hardware components disposed in a body or housing of the device, such as a processor, graphics card, and/or battery, generate heat. Heat generated by the hardware components of the user device can cause a temperature of one or more portions of an exterior surface, or skin, of the device housing to increase and become warm or hot to a user's touch. To prevent overheating of the hardware components, damage to the device, and/or discomfort to the user of the device when the user touches or places one or more portions of the user's body proximate to the skin of the device and/or components of the device accessible via the exterior surface of the housing such as a touchpad, the user device includes one or more fans to exhaust hot air generated within the body of the device and cool the device.
Some known electronic user devices are configured with one or more thermal constraints to control the temperature of the hardware components of the user device and/or of the skin of the device. The thermal constraint(s) can define, for instance, a maximum temperature of a hardware component such as a processor to prevent overheating of the processor. The thermal constraint(s) can define a maximum temperature of the skin of the device to prevent discomfort to a user touching and/or holding the device. In known user devices, operation of the fan(s) of the user device and/or management of power consumed by the device are controlled based on the thermal constraint(s). For instance, if a temperature of a hardware component of the device is approaching a maximum temperature as defined by the thermal constraint for the component, rotational speed(s) (e.g., revolutions per minute (RPMs)) of the fan(s) can be increased to exhaust hot air and reduce a temperature of the component. Additionally or alternatively, power consumption by one or more components of the device (e.g., the graphics card) may be reduced to reduce the amount of heat generated by the component and, thus, the device.
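The control policy described above can be summarized in a short sketch. This is an illustrative simplification, not the patented implementation; the function name, thresholds, and step sizes are hypothetical.

```python
# Illustrative sketch of the control loop described above: when a component
# temperature approaches its thermal constraint, first increase fan speed;
# if the fan is already at its ceiling, reduce the component's power limit
# so less heat is generated. All values here are hypothetical.
def thermal_control_step(component_temp_c, max_temp_c, fan_rpm, power_limit_w,
                         fan_rpm_step=500, power_step_w=2.0, margin_c=5.0,
                         max_fan_rpm=6000, min_power_w=5.0):
    """Return updated (fan_rpm, power_limit_w) for one control iteration."""
    if component_temp_c >= max_temp_c - margin_c:
        if fan_rpm < max_fan_rpm:
            fan_rpm = min(fan_rpm + fan_rpm_step, max_fan_rpm)
        else:
            power_limit_w = max(power_limit_w - power_step_w, min_power_w)
    return fan_rpm, power_limit_w
```

In this sketch, fan speed is the first lever and power throttling the second, mirroring the order in which known devices trade acoustic noise against processing performance.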
In some known user devices, the thermal constraint(s) define that a temperature of the skin of the device should not exceed, for instance, 45° C., to prevent user discomfort when the user is physically touching the device (e.g., typing on a keyboard of a laptop, scrolling on a touchscreen, etc.). Temperature of the skin of the device can be controlled by controlling power consumption of the hardware component(s) disposed within the device body to manage the amount of heat generated by the component(s) transferred to the skin of the device. However, such thermal constraint(s) can affect performance of the user device. For instance, some known user devices can operate in a high performance mode, or a mode that favors increased processing speeds over energy conservation (e.g., a mode in which processing speeds remain high for the duration that the device is in use, the screen remains brightly lit, and other hardware components do not enter power-saving mode when those components are not in use). The processor consumes increased power to accommodate the increased processing speeds associated with the high performance mode and, thus, the amount of heat generated by the processor is increased. As a result, a temperature of the skin of the user device can increase due to the increased amount of heat generated within the device housing. In some known devices, the processor may operate at lower performance speeds to consume less power and, thus, prevent the skin of the device from exceeding the maximum skin temperature defined by the thermal constraint. Thus, in some known devices, processing performance is sacrificed in view of thermal constraint(s).
Higher fan speeds can be used to facilitate cooling of hardware component(s) of a device to enable the component(s) to operate in, for instance, a high performance mode without exceeding the thermal constraint(s) for the hardware component(s) and/or the device skin. However, operation of the fan(s) at higher speeds increases audible acoustic noise generated by the fan(s). Thus, in some known user devices, the fan speed(s) and, thus, the amount of cooling that is provided by the fan(s), are restricted to avoid generating fan noise above certain decibel levels. Some known devices define fan noise constraints that set, for instance, a maximum noise level of 35 dBA during operation of the fan(s). As a result of the restricted fan speed(s), performance of the device may be limited to enable the fan(s) to cool the user device within the constraints of the fan speed(s).
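The relationship between a fan acoustic constraint and the resulting fan speed ceiling can be sketched with a calibrated lookup table. The table values below are hypothetical and only illustrate that a dBA cap (such as the 35 dBA figure mentioned above) translates into a maximum permitted RPM:

```python
# Hypothetical calibration table of (fan RPM, measured noise in dBA),
# ascending; the values are illustrative, not measured data.
ACOUSTIC_TABLE = [
    (2000, 25.0),
    (3000, 30.0),
    (4000, 35.0),
    (5000, 40.0),
    (6000, 45.0),
]

def max_rpm_for_constraint(noise_limit_dba):
    """Largest tabulated fan speed whose noise stays at or below the limit."""
    allowed = [rpm for rpm, dba in ACOUSTIC_TABLE if dba <= noise_limit_dba]
    return max(allowed) if allowed else 0
```

Under this sketch, a 35 dBA constraint caps the fan at 4000 RPM, while relaxing the constraint to 45 dBA frees the full 6000 RPM of cooling capacity.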
In some instances, cooling capabilities of the fan(s) of the device degrade over time due to dust accumulating in the fan(s) and/or heat sink. Some known user devices direct the fan(s) to reverse airflow direction (e.g., as compared to the default airflow direction to exhaust hot air from the device) to facilitate heatsink and fan shroud cleaning, which helps to de-clog dust from the airflow path and maintain device performance over time. However, operation of the fan(s) in the reverse direction increases audible acoustics generated by the fan(s), which can disrupt the user's experience with the device.
Although thermal constraint(s) are implemented in a user device to prevent discomfort to the user when the user is directly touching the device (e.g., physically touching one or more components of the device accessible via the exterior housing of the device, such as a keyboard and/or touchpad of a laptop, a touchscreen of a tablet, etc.), there are instances in which a temperature of the skin of the device can be increased without affecting the user's experience with the device. For instance, a user may view a video on the user device but not physically touch the user device; rather, the device may be resting on a table. In some instances, the user may interact with the user device via external accessories communicatively coupled to the device, such as an external keyboard and/or an external mouse. In such instances, because the user is not directly touching the device (i.e., not directly touching the skin of the device housing and/or component(s) accessible via the exterior surface of the housing), an increase in a temperature of the skin of the device would not be detected by the user. However, known user devices maintain the skin temperature of the device at the same temperature as if the user were directly touching the user device regardless of whether the user is interacting with the device via external accessories.
In some instances, the user device is located in a noisy environment (e.g., a coffee shop, a train station). Additionally, or alternatively, in some instances, the user may be interacting with the user device while wearing headphones. In such instances, the amount of fan noise heard by the user is reduced because of the loud environment and/or the use of headphones. However, in known user devices, the rotational speed of the fan(s) of the device are maintained at a level that minimizes noise from the fan(s) regardless of the surrounding ambient noise levels and/or whether or not the user is wearing headphones.
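The ambient noise level that masks fan noise in these scenarios can be estimated from microphone samples. A minimal sketch is shown below; real devices would use calibrated, A-weighted measurements, and the function name and reference level here are assumptions:

```python
import math

# Hypothetical estimate of ambient sound level from raw microphone samples:
# compute the RMS amplitude of a sample window and convert it to decibels
# relative to a reference amplitude. Illustrative only; production firmware
# would apply A-weighting and sensor calibration to report dBA.
def ambient_noise_db(samples, reference_rms=1.0):
    """Sound level in dB relative to reference_rms for one sample window."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return float("-inf")
    return 20.0 * math.log10(rms / reference_rms)
```

A window with twice the RMS amplitude of another reads about 6 dB louder, which is the kind of difference that distinguishes a quiet office from a coffee shop.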
Disclosed herein are example user devices that provide for dynamic adjustment of thermal constraints and/or fan acoustic noise levels of the user device. Examples disclosed herein use a multi-tier determination to control operation of fan(s) of the device and/or to adjust a performance level of the device and, thus, control heat generated by hardware component(s) of the device based on factors such as a presence of a user proximate to the device, user interaction(s) with the device (e.g., whether the user is using an on-board keyboard of the device or an external keyboard), and/or ambient noise levels in an environment in which the device is located. Example user devices disclosed herein include sensors to detect user presence (e.g., proximity sensor(s), image sensor(s)), device configuration (e.g., sensor(s) to detect user input(s) received via an external keyboard, sensor(s) to detect device orientation), and/or conditions in the ambient environment in which the device is located (e.g., ambient noise sensor(s)). Based on the sensor data, examples disclosed herein determine whether a temperature of the skin of the device housing can be increased relative to a default thermal constraint, where the default thermal constraint corresponds to a skin temperature for the device when the user is directly touching the device (e.g., touching one or more components of the device accessible via the exterior housing of the device such as a keyboard or touchpad of a laptop). Examples disclosed herein selectively control an amount of power provided to hardware component(s) of the user device and/or fan speed level(s) (e.g., RPMs) based on the selected thermal constraint (e.g., the default thermal constraint or a thermal constraint permitting a higher skin temperature for the device relative to the default thermal constraint).
In some examples disclosed herein, power consumption by one or more component(s) of the user device (e.g., the processor) is increased when the user is determined to be providing inputs to the user device via, for instance, an external keyboard. Because the user is not physically touching the exterior surface of the device housing when the user is providing inputs via the external keyboard, the temperature of the skin of the device can be increased without adversely affecting the user (e.g., without causing discomfort to the user). In some examples disclosed herein, rotational speed(s) (e.g., RPM(s)) of the fan(s) of the user device are increased when sensor data from the ambient noise sensor(s) indicates that the user is in a loud environment. In such examples, because the user device is located in a noisy environment, the resulting increase in fan acoustics from the increased rotational speed(s) of the fan(s) is offset by the ambient noise. In some other examples, the rotational direction of the fan(s) of the user device is reversed (e.g., to facilitate heatsink and fan shroud cleaning) when sensor data from the ambient noise sensor(s) indicates that the user device is in a loud environment and/or that the user is not present or not within a threshold distance of the device. Thus, the user is not interrupted by the increased fan noise and the device can be cooled and/or cleaned with increased efficiency. Rather than maintaining the thermal constraint(s) of the device and/or the fan noise constraint(s) at respective default levels during operation of the device, examples disclosed herein dynamically adjust the constraints and, thus, the performance of the device, based on user and/or environmental factors. As a result, performance of the device can be selectively increased in view of the opportunities for increased device skin temperature and/or audible fan noise levels in response to user interactions with the device.
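The multi-tier determination described above can be sketched as a simple selection function. The 45 °C skin limit and 35 dBA fan noise limit echo the default figures mentioned in the text; the relaxed values and the decision rules themselves are hypothetical illustrations, not the claimed logic:

```python
# Sketch of multi-tier constraint selection: relax the skin-temperature
# constraint when the user is not directly touching the device, and relax
# the fan acoustic constraint when fan noise would be masked or unheard.
# The relaxed values (48 C, 45 dBA) and the 60 dBA threshold are assumptions.
def select_constraints(user_touching_device, using_external_input,
                       user_present, ambient_noise_dba, wearing_headphones):
    """Return (skin_temp_limit_c, fan_noise_limit_dba)."""
    if user_touching_device and not using_external_input:
        skin_limit = 45.0   # default constraint: user may touch the skin
    else:
        skin_limit = 48.0   # relaxed constraint (hypothetical value)

    if not user_present or wearing_headphones or ambient_noise_dba > 60.0:
        noise_limit = 45.0  # relaxed constraint (hypothetical value)
    else:
        noise_limit = 35.0  # default constraint
    return skin_limit, noise_limit
```

A downstream power manager could then raise the processor's power limit whenever the relaxed skin constraint is in force, and a fan controller could raise its RPM ceiling under the relaxed acoustic constraint.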
FIG. 1 illustrates an example system 100 constructed in accordance with teachings of this disclosure for controlling thermal constraint(s) and/or fan noise constraint(s) for a user device 102. The user device 102 can be, for example, a personal computing (PC) device such as a laptop, a desktop, an electronic tablet, a hybrid or convertible PC, etc. In some examples, the user device 102 includes a keyboard 104. In other examples, such as when the user device 102 is an electronic tablet, a keyboard is presented via a display screen 103 of the user device 102 and the user provides inputs on the keyboard by touching the screen. In some examples, the user device 102 includes one or more pointing device(s) 106 such as a touchpad. In examples disclosed herein, the keyboard 104 and the pointing device(s) 106 are carried by a housing of the user device 102 and accessible via an exterior surface of the housing and, thus, can be considered on-board user input devices for the device 102.
In some examples, the user device 102 additionally or alternatively includes one or more external devices communicatively coupled to the device 102, such as an external keyboard 108, external pointing device(s) 110 (e.g., wired or wireless mouse(s)), and/or headphones 112. The external keyboard 108, the external pointing device(s) 110, and/or the headphones 112 can be communicatively coupled to the user device 102 via one or more wired or wireless connections. In the example of FIG. 1, the user device 102 includes one or more device configuration sensor(s) 120 that provide means for detecting whether user input(s) are being received via the external keyboard 108 and/or the external pointing device(s) 110 and/or whether output(s) (e.g., audio output(s)) are being delivered via the headphones 112 coupled to the user device 102. In some examples, the device configuration sensor(s) 120 detect a wired connection of one or more of the external devices 108, 110, 112 via a hardware interface (e.g., USB port, etc.). In other examples, the device configuration sensor(s) 120 detect the presence of the external device(s) 108, 110, 112 via wireless connection(s) (e.g., Bluetooth). In some examples, the device configuration sensor(s) 120 include accelerometers to detect an orientation of the device 102 (e.g., tablet mode) and/or sensor(s) to detect an angle of, for instance, a screen of a laptop (e.g., facing the laptop base, angled away from the base, etc.).
The example user device 102 includes a processor 130 that executes software to interpret and output response(s) based on the user input event(s) (e.g., touch event(s), keyboard input(s), etc.). The user device 102 of FIG. 1 includes one or more power sources 116 such as a battery to provide power to the processor 130 and/or other components of the user device 102 communicatively coupled via a bus 117.
In the example of FIG. 1, the hardware components of the device 102 (e.g., the processor 130, a video graphics card, etc.) generate heat during operation of the user device 102. The example user device 102 includes temperature sensor(s) 126 to measure temperature(s) associated with the hardware component(s) of the user device 102. In the example of FIG. 1, the temperature sensor(s) 126 measure a temperature of a skin of the housing of the user device 102, or an exterior surface of the user device that can be touched by a user (e.g., a base of a laptop) (the terms “user” and “subject” are used interchangeably herein and both refer to a biological creature such as a human being). The temperature sensor(s) 126 can be disposed in the housing of the device 102 proximate to the skin (e.g., coupled to a side of the housing opposite the side of the housing that is visible to the user). The temperature sensor(s) 126 can include one or more thermometers.
The example user device 102 of FIG. 1 includes one or more fan(s) 114. The fan(s) 114 provide means for cooling and/or regulating the temperature of the hardware component(s) (e.g., the processor 130) of the user device 102 in response to temperature data generated by the temperature sensor(s) 126. In the example of FIG. 1, operation of the fan(s) 114 is controlled in view of one or more thermal constraints for the user device 102 that define temperature settings for the hardware component(s) of the device 102 and/or a skin temperature of the device 102. In some examples, operation of the fan(s) 114 of the example user device 102 of FIG. 1 is controlled based on one or more fan acoustic constraints that define noise level(s) (e.g., decibels) to be generated during operation of the fan(s) 114. In the example of FIG. 1, the thermal constraint(s) and/or fan acoustic constraint(s) for the device 102 are dynamically selected based on the user interaction(s) with the device 102 and/or ambient conditions in an environment in which the device 102 is located.
The example user device 102 of FIG. 1 includes one or more user presence detection sensor(s) 118. The user presence detection sensor(s) 118 provide a means for detecting a presence of a user relative to the user device 102 in an environment in which the user device 102 is located. For example, the user presence detection sensor(s) 118 may detect a user approaching the user device 102. In the example of FIG. 1, the user presence detection sensor(s) 118 include proximity sensor(s) that emit electromagnetic radiation (e.g., light pulses) and detect changes in the signal due to the presence of a person or object (e.g., based on reflection of the emitted electromagnetic radiation). In some examples, the user presence detection sensor(s) 118 include time-of-flight (TOF) sensors that measure a length of time for light to return to the sensor after being reflected off a person or object, which can be used to determine depth. The example user presence detection sensor(s) 118 can include other types of depth sensors, such as sensors that detect changes based on radar or sonar data. In some instances, the user presence detection sensor(s) 118 collect distance measurements for one or more (e.g., four) spatial regions (e.g., non-overlapping quadrants) relative to the user device 102. The user presence detection sensor(s) 118 associated with each region provide distance range data for region(s) of the user's face and/or body corresponding to the regions.
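The time-of-flight principle mentioned above reduces to a short calculation: light travels to the subject and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch (real TOF sensors report calibrated readings; the function name is illustrative):

```python
# Time-of-flight distance estimate: distance = (speed of light * round-trip
# time) / 2, since the emitted light pulse travels out and back.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_time_s):
    """Distance in meters to the reflecting subject from a round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

A round trip of roughly 6.7 nanoseconds therefore corresponds to a subject about one meter from the sensor, which is the scale at which user presence near a laptop would be detected.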
The user presence detection sensor(s) 118 are carried by the example user device 102 such that the user presence detection sensor(s) 118 can detect changes in an environment in which the user device 102 is located that occur within a range (e.g., a distance range) of the user presence detection sensor(s) 118 (e.g., within 10 feet of the user presence detection sensor(s) 118, within 5 feet, etc.). For example, the user presence detection sensor(s) 118 can be mounted on a bezel of the display screen 103 and oriented such that the user presence detection sensor(s) 118 can detect a user approaching the user device 102. The user presence detection sensor(s) 118 can additionally or alternatively be at any other locations on the user device 102 where the sensor(s) 118 face an environment in which the user device 102 is located, such as on a base of the laptop (e.g., on an edge of the base in front of a keyboard carried by the base), a lid of the laptop, on a base of the laptop supporting the display screen 103 in examples where the display screen 103 is a monitor of a desktop or all-in-one PC, etc.
In some examples, the user presence detection sensor(s) 118 are additionally or alternatively mounted at locations on the user device 102 where the user's arm, hand, and/or finger(s) are likely to move or pass over as the user brings his or her arm, hand, and/or finger(s) toward the display screen 103, the keyboard 104, and/or other user input device (e.g., the pointing device(s) 106). For instance, in examples in which the user device 102 is a laptop or other device including a touchpad, the user presence detection sensor(s) 118 can be disposed proximate to the touchpad of the device 102 to detect when a user's arm is hovering over the touchpad (e.g., as the user reaches for the screen 103 or the keyboard 104).
In the example of FIG. 1, the user device 102 includes image sensor(s) 122. In this example, the image sensor(s) 122 generate image data that is analyzed to detect, for example, a presence of the user proximate to the device, gestures performed by the user, whether the user is looking toward or away from the display screen 103 of the device 102 (e.g., eye-tracking), etc. The image sensor(s) 122 of the user device 102 include one or more cameras to capture image data of the surrounding environment in which the device 102 is located. In some examples, the image sensor(s) 122 include depth-sensing camera(s). In the example of FIG. 1, the image sensor(s) 122 are carried by the example user device 102 such that when a user faces the display screen 103, the user is within a field of view of the image sensor(s) 122. For example, the image sensor(s) 122 can be carried by a bezel of the display screen 103.
The example user device 102 of FIG. 1 includes one or more motion sensor(s) 123. The motion sensor(s) 123 can include, for example, infrared sensor(s) to detect user movements. As disclosed herein, data generated by the motion sensor(s) 123 can be analyzed to identify gestures performed by the user of the user device 102. The motion sensor(s) 123 can be carried by the device 102 proximate to, for example, a touchpad of the device 102, a bezel of the display screen 103, etc. so as to detect user motion(s) occurring proximate to the device 102.
In the example of FIG. 1, the user device 102 includes one or more microphone(s) 124 to detect sounds in an environment in which the user device 102 is located. The microphone(s) 124 can be carried by the user device 102 at one or more locations, such as on a lid of the device 102, on a base of the device 102 proximate to the keyboard 104, etc.
The example user device 102 of FIG. 1 can include other types of sensor(s) to detect user interactions relative to the device 102 and/or environmental conditions (e.g., ambient light sensor(s)).
The example user device 102 includes one or more semiconductor-based processors to process sensor data generated by the user presence detection sensor(s) 118, the device configuration sensor(s) 120, the image sensor(s) 122, the motion sensor(s) 123, the microphone(s) 124, and/or the temperature sensor(s) 126. For example, the sensor(s) 118, 120, 122, 123, 124, 126 can transmit data to the on-board processor 130 of the user device 102. In other examples, the sensor(s) 118, 120, 122, 123, 124, 126 can transmit data to a processor 127 of another user device 128, such as a smartphone or a wearable device such as a smartwatch. In other examples, the sensor(s) 118, 120, 122, 123, 124, 126 can transmit data to a cloud-based device 129 (e.g., one or more server(s), processor(s), and/or virtual machine(s)).
In some examples, the processor 130 of the user device 102 is communicatively coupled to one or more other processors. In such an example, the sensor(s) 118, 120, 122, 123, 124, 126 can transmit the sensor data to the on-board processor 130 of the user device 102. The on-board processor 130 of the user device 102 can then transmit the sensor data to the processor 127 of the user device 128 and/or the cloud-based device(s) 129. In some such examples, the user device 102 (e.g., the sensor(s) 118, 120, 122, 123, 124, 126 and/or the on-board processor 130) and the processor(s) 127, 130 are communicatively coupled via one or more wired connections (e.g., a cable) or wireless connections (e.g., cellular, Wi-Fi, or Bluetooth connections). In other examples, the sensor data may only be processed by the on-board processor 130 (i.e., not sent off the device).
In the example system 100 of FIG. 1, the sensor data generated by the user presence detection sensor(s) 118, the device configuration sensor(s) 120, the image sensor(s) 122, the motion sensor(s) 123, the microphone(s) 124, and/or the temperature sensor(s) 126 is processed by a thermal constraint manager 132 to select a thermal constraint for the user device 102 to affect a temperature of the skin of the housing of the device 102 and/or a fan acoustic constraint to affect rotational speed(s) of the fan(s) 114 of the user device 102 and, thus, noise generated by the fan(s) 114. As a result of the selected thermal constraint and/or fan acoustic constraint, the example thermal constraint manager 132 can affect performance of the device 102. For instance, if the thermal constraint manager 132 determines that the temperature of the skin of the device 102 can be increased and/or that rotational speed(s) of the fan(s) 114 can be increased, additional power can be provided to hardware component(s) of the device 102 (e.g., the processor 130) to provide for increased performance of the component(s) (e.g., higher processing speeds). In such examples, the increased heat generated by the hardware component(s) and transferred to the skin of the device is permitted by the selected thermal constraint and/or is managed via increased rotation of the fan(s) 114. In the example of FIG. 1, the thermal constraint manager 132 is implemented by executable instructions executed on the processor 130 of the user device 102. However, in other examples, the thermal constraint manager 132 is implemented by instructions executed on the processor 127 of the wearable or non-wearable user device 128 and/or on the cloud-based device(s) 129. In other examples, the thermal constraint manager 132 is implemented by dedicated circuitry located on the user device 102 and/or the user device 128. 
These components may be implemented in software, firmware, hardware, or in combination of two or more of software, firmware, and hardware.
In the example of FIG. 1, the thermal constraint manager 132 serves to process the sensor data generated by the respective sensor(s) 118, 120, 122, 123, 124, 126 to identify user interaction(s) with the user device 102 and/or ambient conditions in the environment in which the device 102 is located and to select a thermal constraint and/or fan acoustic constraint for the user device 102 based on the user interaction(s) and/or the ambient environment conditions. In some examples, the thermal constraint manager 132 receives the sensor data in substantially real-time (e.g., near the time the data is collected). In other examples, the thermal constraint manager 132 receives the sensor data at a later time (e.g., periodically and/or aperiodically based on one or more settings but sometime after the activity that caused the sensor data to be generated, such as a hand motion, has occurred (e.g., seconds, minutes, etc. later)). The thermal constraint manager 132 can perform one or more operations on the sensor data such as filtering the raw signal data, removing noise from the signal data, converting the signal data from analog data to digital data, and/or analyzing the data. For example, the thermal constraint manager 132 can convert the sensor data from analog to digital data at the on-board processor 130 and the digital data can be analyzed by on-board processor 130 and/or by one or more off-board processors, such as the processor 127 of the user device 128 and/or the cloud-based device 129.
Based on the sensor data generated by the user presence detection sensor(s) 118, the thermal constraint manager 132 determines whether or not a subject is present within the range of the user presence detection sensor(s) 118. In some examples, if the thermal constraint manager 132 determines that the user is not within the range of the user presence detection sensor(s) 118, the thermal constraint manager 132 determines that the rotational speed of the fan(s) 114 can be increased, as the user is not present to hear the increased acoustic noise generated by the fan(s) 114 operating at an increased speed. The thermal constraint manager 132 generates instructions for the fan(s) 114 to increase the rotational speed at which the fan(s) 114 operate. The fan(s) 114 can continue to operate at the increased rotational speed to provide efficient cooling until, for instance, the processor 130 of the device 102 determines that no user input(s) have been received at the device 102 for a period of time and the device 102 should enter a low power state (e.g., a standby or sleep state).
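The presence-gated fan policy described above can be sketched as follows. This is an illustrative Python sketch only; the function name, fan speeds, and idle limit are hypothetical example values, not values taken from the patent.

```python
def select_fan_speed(user_in_range: bool, idle_seconds: float,
                     idle_limit_s: float = 300.0,
                     normal_rpm: int = 3000, boosted_rpm: int = 5000) -> int:
    """Choose a fan speed based on detected user presence (illustrative values)."""
    if idle_seconds >= idle_limit_s:
        # No input for an extended period: the device enters a low power
        # state, so the fan can stop (or idle).
        return 0
    if not user_in_range:
        # No user nearby to hear the fan, so run it faster for more cooling.
        return boosted_rpm
    return normal_rpm
```

In this sketch, `user_in_range` stands in for the output of the user presence detection analysis, and `idle_seconds` for the time since the last user input.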
In the example of FIG. 1, if the thermal constraint manager 132 determines that a user is within the range of the user presence detection sensor(s) 118, the thermal constraint manager 132 determines if the user is interacting with the device 102. The thermal constraint manager 132 can detect whether user input(s) are being received via (a) the on-board keyboard 104 and/or the on-board pointing device(s) 106 or (b) the external keyboard 108 and/or the external pointing device(s) 110 based on data generated by the device configuration sensor(s) 120. If the user is interacting with the device 102 via the on-board keyboard 104 and/or the on-board pointing device(s) 106, the thermal constraint manager 132 maintains the skin temperature of the device 102 at a first (e.g., default) thermal constraint that defines a maximum temperature for the device skin to prevent the skin of the device housing from becoming too hot and injuring the user. If the thermal constraint manager 132 determines that the user is interacting with the device 102 via the external keyboard 108 and/or the external pointing device(s) 110, the thermal constraint manager 132 selects a thermal constraint for the device that defines an increased temperature for the skin of the device 102 relative to the first thermal constraint. As a result of the relaxation of the thermal constraint for the device 102 (i.e., the permitted increase in the skin temperature of the device), one or more hardware component(s) of the device 102 (e.g., the processor 130) move to an increased performance mode in which the component(s) of the device consume more power and, thus, generate more heat. 
In such examples, the thermal constraint manager 132 selects a thermal constraint for the skin temperature of the device housing that is increased relative to the thermal constraint selected when the user is interacting with the device 102 via the on-board keyboard 104 and/or the on-board pointing device(s) 106 because the user is not directly touching the device 102 when providing input(s) via the external device(s) 108, 110.
If the thermal constraint manager 132 determines that the user is within the range of the user presence detection sensor(s) 118 but is not providing input(s) at the device 102 and/or has not provided an input within a threshold period of time, the thermal constraint manager 132 infers whether the user intends to interact with the device. The thermal constraint manager 132 can use data from multiple types of sensors to predict whether the user is likely to interact with the device.
For example, the thermal constraint manager 132 can determine a distance of the user from the device 102 based on data generated by the user presence detection sensor(s) 118. If the user is determined to be outside of a predefined threshold range of the device 102 (e.g., farther than 1 meter from the device 102), the thermal constraint manager 132 determines that the rotational speed of the fan(s) 114 of the device 102 and, thus, the fan acoustics, can be increased because the increased fan noise will not disrupt the user in view of the user's distance from the device 102. Additionally or alternatively, the thermal constraint manager 132 determines that the power level of the power source(s) 116 of the device 102 and, thus, the device skin temperature, can be increased because the increased skin temperature will not cause discomfort to the user based on the user's distance from the device 102.
In some examples, the thermal constraint manager 132 analyzes image data generated by the image sensor(s) 122 to determine a position of the user's eyes relative to the display screen 103 of the device 102. In such examples, if the thermal constraint manager 132 identifies both of the user's eyes in the image data, the thermal constraint manager 132 determines that the user is looking at the display screen 103. If the thermal constraint manager 132 identifies one of the user's eyes or none of the user's eyes in the image data, the thermal constraint manager 132 determines that the user is not engaged with the device 102. In such examples, the thermal constraint manager 132 can instruct the fan(s) 114 to increase rotational speed(s) to cool the device 102. Because the user is not engaged or not likely engaged with the device 102 as determined based on eye tracking, the thermal constraint manager 132 permits increased fan noise to be generated by the fan(s) 114 to efficiently cool the device 102 while the user is distracted relative to the device 102. Additionally or alternatively, the thermal constraint manager 132 can instruct the power source(s) 116 to increase the power provided to the hardware component(s) of the user device 102 (and, thus, increase the skin temperature of the user device 102).
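The eye-count heuristic above can be expressed as a short sketch. The function names are illustrative, and the eye count stands in for the output of whatever image analysis detects eyes in the sensor data.

```python
def user_engaged(eyes_visible: int) -> bool:
    """Both eyes detected in the image data => user is looking at the screen."""
    return eyes_visible == 2

def allow_fan_boost(eyes_visible: int) -> bool:
    """Louder cooling is acceptable when the user is not engaged with the display."""
    return not user_engaged(eyes_visible)
```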
In some examples, the thermal constraint manager 132 analyzes the image data generated by the image data sensor(s) 122 and/or the motion sensor(s) 123 to identify gesture(s) being performed by the user. If the thermal constraint manager 132 determines that the user is, for instance, looking away from the device 102 and talking on the phone based on the image data and/or the motion sensor data (e.g., image data and/or motion sensor data indicating that the user has moved his or her hand proximate to his or her ear), the thermal constraint manager 132 determines that the fan acoustics can be increased because the user is not likely to interact with the device 102 while the user is looking away and talking on the phone.
The example thermal constraint manager 132 of FIG. 1 evaluates ambient noise conditions to determine if fan noise levels can be increased. The thermal constraint manager 132 of FIG. 1 analyzes data generated by the microphone(s) 124 to determine if ambient noise in the surrounding environment exceeds an environment noise level threshold. If the thermal constraint manager 132 determines that the ambient noise exceeds the environment noise level threshold, the thermal constraint manager 132 instructs the fan(s) to rotate at increased speed(s) and, thus, generate increased fan noise. In such examples, the increased fan noise is unlikely to be detected in the noisy environment in which the user device 102 is located and, thus, operation of the fan(s) 114 can be optimized to increase cooling and, thus, performance of the device 102.
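The ambient-noise masking check reduces to a threshold comparison. The sketch below assumes a 60 dB threshold purely for illustration; the patent does not specify a numeric value for the environment noise level threshold.

```python
def fan_noise_masked(ambient_db: float, threshold_db: float = 60.0) -> bool:
    """True when ambient noise is loud enough to mask increased fan noise.

    The 60 dB default is an assumed example, not a value from the patent.
    """
    return ambient_db > threshold_db
```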
Additionally or alternatively, the thermal constraint manager 132 can determine whether the user is wearing headphones based on, for example, image data generated by the image sensor(s) 122 and/or data from the device configuration sensor(s) 120 indicating that headphones are connected to the device 102 (e.g., via wired or wireless connection(s)). In such examples, the thermal constraint manager 132 instructs the fan(s) 114 to rotate at increased speed(s) to increase cooling of the device 102 because the resulting increased fan noise is unlikely to be detected by the user who is wearing headphones.
The thermal constraint manager 132 dynamically adjusts the thermal constraint(s) and/or fan noise levels for the device 102 based on the inferred user intent to interact with the device and/or conditions in the environment. In some examples, the thermal constraint manager 132 determines that the user is likely to interact with the device after previously instructing the fan(s) to increase rotational speed(s) based on, for example, data from the user presence detection sensor(s) 118 indicating that the user is moving toward the device 102 and/or reaching for the on-board keyboard. In such examples, the thermal constraint manager 132 instructs the fan(s) 114 to reduce the rotational speed and, thus, the fan noise in view of the expectation that the user is going to interact with the device 102.
As another example, the thermal constraint manager 132 may determine that the user is providing input(s) via the external device(s) 108, 110 and, thus, select a thermal constraint for the device 102 that increases the temperature of the skin of the device. If, at a later time, the thermal constraint manager 132 determines that the user is reaching for the display screen 103 (e.g., based on data from the user presence detection sensor(s) 118, the image sensor(s) 122, and/or the motion sensor(s) 123), the thermal constraint manager 132 selects a thermal constraint that results in decreased temperature of the device skin. In such examples, power consumption by the hardware component(s) of the device 102 and/or fan speed(s) can be adjusted to cool the device 102.
As another example, if the thermal constraint manager 132 determines at a later time that the user is no longer wearing the headphones 112 (e.g., based on the image data) after previously determining that the user was wearing the headphones 112, the thermal constraint manager 132 instructs the fan(s) 114 to reduce rotational speed to generate less noise.
In some examples, the thermal constraint manager 132 dynamically adjusts the thermal constraint(s) and/or fan acoustic constraint(s) based on temperature data generated by the temperature sensor(s) 126. For example, if data from the temperature sensor(s) 126 indicates that skin temperature is approaching the threshold defined by a selected thermal constraint, the thermal constraint manager 132 generates instructions to maintain or reduce the skin temperature by adjusting power consumption of the hardware component(s) and/or by operation of the fan(s) 114.
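The dynamic guard on skin temperature can be sketched as a margin check against the currently selected constraint. The 2 °C margin below is an assumption for illustration; the patent does not specify how closely the skin temperature may approach the limit before action is taken.

```python
def throttle_needed(skin_temp_c: float, limit_c: float,
                    margin_c: float = 2.0) -> bool:
    """Flag when the measured skin temperature approaches the selected constraint.

    When True, the manager would reduce hardware power consumption and/or
    adjust fan operation to hold or lower the skin temperature.
    """
    return skin_temp_c >= limit_c - margin_c
```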
FIG. 2 is a block diagram of an example implementation of the thermal constraint manager 132 of FIG. 1. As mentioned above, the thermal constraint manager 132 is constructed to detect user interaction(s) and/or ambient condition(s) relative to the user device 102 and to generate instructions that cause the user device 102 to transition between one or more thermal constraints with respect to skin temperature of the device 102 and/or one or more fan acoustic constraints with respect to audible noise generated by the fan(s) 114 of the device 102. In the example of FIG. 2, the thermal constraint manager 132 is implemented by one or more of the processor 130 of the user device 102, the processor 127 of the second user device 128, and/or cloud-based device(s) 129 (e.g., server(s), processor(s), and/or virtual machine(s) in the cloud 129 of FIG. 1). In some examples, some of the user interaction analysis and/or ambient condition analysis is implemented by the thermal constraint manager 132 via a cloud-computing environment and one or more other parts of the analysis are implemented by the processor 130 of the user device 102 being controlled and/or the processor 127 of a second user device 128 such as a wearable device.
As illustrated in FIG. 2, the example thermal constraint manager 132 receives user presence sensor data 200 from the user presence detection sensor(s) 118 of the example user device 102 of FIG. 1, device configuration sensor data 202 from the device configuration sensor(s) 120, image sensor data 204 from the image sensor(s) 122, gesture data 205 from the motion sensor(s) 123, ambient noise sensor data 206 from the microphone(s) 124, and temperature sensor data 208 from the temperature sensor(s) 126. The sensor data 200, 202, 204, 205, 206, 208 is stored in a database 212. In some examples, the thermal constraint manager 132 includes the database 212. In other examples, the database 212 is located external to the thermal constraint manager 132 in a location accessible to the thermal constraint manager 132 as shown in FIG. 2.
The thermal constraint manager 132 includes a user presence detection analyzer 214. In this example, the user presence detection analyzer 214 provides means for analyzing the sensor data 200 generated by the user presence detection sensor(s) 118. In particular, the user presence detection analyzer 214 analyzes the sensor data 200 to determine if a user is within the range of the user presence detection sensor(s) 118 and, thus, is near enough to the user device 102 to suggest that the user is about to use the user device 102. In some examples, the user presence detection analyzer 214 determines if the user is within a particular distance from the user device 102 (e.g., within 0.5 meters of the device 102, within 0.75 meters of the device 102). The user presence detection analyzer 214 analyzes the sensor data 200 based on one or more user presence detection rule(s) 216. The user presence detection rule(s) 216 can be defined based on user input(s) and stored in the database 212.
The user presence detection rule(s) 216 can define, for instance, threshold time-of-flight (TOF) measurements by the user presence detection sensor(s) 118 that indicate that a user is present within a range of the user presence detection sensor(s) 118 (e.g., measurements of the amount of time between emission of a wave pulse, reflection off a subject, and return to the sensor). In some examples, the user presence detection rule(s) 216 define threshold distance(s) for determining that a subject is within proximity of the user device 102. In such examples, the user presence detection analyzer 214 determines the distance(s) based on the TOF measurement(s) in the sensor data 200 and the known speed of the light emitted by the sensor(s) 118. In some examples, the user presence detection analyzer 214 identifies changes in the depth or distance values over time and detects whether the user is approaching the device 102 or moving away from the user device 102 based on the changes. The threshold TOF measurement(s) and/or distance(s) for the sensor data 200 can be based on the range of the sensor(s) 118 in emitting pulses. In some examples, the threshold TOF measurement(s) and/or distances are based on user-defined reference distances for determining that a user is near or approaching the user device 102 as compared to simply being in the environment in which the user device 102 and the user are both present.
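The TOF-to-distance conversion described above follows directly from the round-trip time and the speed of light. In this illustrative sketch the function names are assumptions; the 0.75 m default echoes one of the example proximity thresholds mentioned earlier in the text.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_to_distance_m(round_trip_s: float) -> float:
    """Convert a round-trip time-of-flight measurement to a one-way distance.

    The pulse travels to the subject and back, so the one-way distance is
    half the round-trip path.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

def user_within_range(round_trip_s: float, threshold_m: float = 0.75) -> bool:
    """Compare the derived distance against a proximity threshold."""
    return tof_to_distance_m(round_trip_s) <= threshold_m
```

For example, a round-trip time of about 5 nanoseconds corresponds to a subject roughly 0.75 m from the sensor.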
The example thermal constraint manager 132 of FIG. 2 includes a device configuration analyzer 218. In this example, the device configuration analyzer 218 provides means for analyzing the sensor data 202 generated by the device configuration sensor(s) 120. The device configuration analyzer 218 analyzes the sensor data 202 to detect, for example, whether user input(s) are being received via the on-board keyboard 104 and/or the on-board pointing device(s) 106 of the user device 102 or via one or more external devices (e.g., the external keyboard 108, the external pointing device(s) 110) communicatively coupled to the user device 102. In some examples, the device configuration analyzer 218 detects that audio output(s) from the device 102 are being delivered via an external output device such as the headphones 112. In some examples, the device configuration analyzer 218 analyzes the orientation of the device 102 to infer, for example, whether a user is sitting while interacting with device 102, standing while interacting with the device 102 (e.g., based on an angle of a display screen of the device 102), whether the device 102 is in tablet mode, etc.
The device configuration analyzer 218 analyzes the sensor data 202 based on one or more device configuration rule(s) 219. The device configuration rule(s) 219 can be defined based on user input(s) and stored in the database 212. The device configuration rule(s) 219 can define, for example, identifiers for recognizing when external device(s) such as the headphones 112 of FIG. 1 are communicatively coupled to the user device 102 via one or more wired or wireless connections. The device configuration rule(s) 219 define rule(s) for detecting user input(s) being received at the user device via the external device(s) 108, 110 based on data received from the external device(s). The device configuration rule(s) 219 define rule(s) for detecting audio output(s) delivered via an external device such as the headphones 112. The device configuration rule(s) 219 can define rule(s) indicating that if the display screen 103 is angled within a particular angle range (e.g., over 90° relative to a base of the laptop), the user is sitting while interacting with the device 102.
The example thermal constraint manager 132 of FIGS. 1 and 2 is trained to recognize user interaction(s) relative to the user device 102 to predict whether the user is likely to interact with the device 102. In the example of FIG. 2, the thermal constraint manager 132 analyzes one or more of the sensor data 204 from the image sensor(s) 122 and/or the sensor data 205 from the motion sensor(s) 123 to detect user activity relative to the device 102. In the example of FIG. 2, the thermal constraint manager 132 is trained to recognize user interactions by a training manager 224 using machine learning and training sensor data for one or more subjects, which may or may not include sensor data generated by the sensor(s) 122, 123 of the user device 102 of FIG. 1. In some examples, the training sensor data is generated from subject(s) who are interacting with the user device 102 and/or a different user device. The training sensor data is stored in a database 232. In some examples, the training manager 224 includes the database 232. In other examples, the database 232 is located external to the training manager 224 in a location accessible to the training manager 224 as shown in FIG. 2. The databases 212, 232 of FIG. 2 may be the same storage device or different storage devices.
In the example of FIG. 2, the training sensor data includes training gesture data 230, or data including a plurality of gestures performed by user(s) and associated user interactions represented by the gestures in the context of interacting with the user device 102. For instance, the training gesture data 230 can include a first rule indicating that if a user raises his or her hand proximate to his or her ear, the user is talking on a telephone. The training gesture data 230 can include a second rule indicating that if a user is reaching his or her hand away from his or her body as detected by a motion sensor disposed proximate to a keyboard of the device and/or as captured in image data, the user is reaching for the display screen of the user device. The training gesture data 230 can include a third rule indicating that if only a portion of the user's body from the waist upward is visible in image data, the user is in a sitting position.
In the example of FIG. 2, the training sensor data includes training facial feature data 231, or data including a plurality of images of subject(s) and associated eye position data, mouth position data, head accessory data (e.g., headphone usage) represented by the image(s) in the context of viewing the display screen 103 of the device 102, looking away from the display screen 103 of the device 102, interacting with the device 102 while wearing headphones, etc. The training facial feature data 231 can include a first rule that if both of the user's eyes are visible in image data generated by the image sensor(s) 122 of the user device 102, then the user is looking at the display screen 103 of the device 102. The training facial feature data 231 can include a second rule that if one of the user's eyes is visible in the image data, the user is likely to interact with the device 102. The training facial feature data 231 can include a third rule that if neither of the user's eyes is visible in the image data, the user is looking away from the device 102. The training facial feature data 231 can include a fourth rule that if the user's mouth is open in the image data, the user is talking. The training facial feature data 231 can include a fifth rule that identifies when a user is wearing headphones based on feature(s) detected in the image data.
The example training manager 224 of FIG. 2 includes a trainer 226 and a machine learning engine 228. The trainer 226 trains the machine learning engine 228 using the training gesture data 230 and the training facial feature data 231 (e.g., via supervised learning) to generate one or more model(s) that are used by the thermal constraint manager 132 to control thermal constraints of the user device 102 based on user interaction(s) and/or inferred intent regarding user interaction(s) with the device 102. For example, the trainer 226 uses the training gesture data 230 to generate one or more gesture data models 223 via the machine learning engine 228 that define user interaction(s) relative to the device 102 in response to particular gestures performed by the user. As another example, the trainer 226 uses the training facial feature data 231 to generate one or more facial feature data models 225 via the machine learning engine 228 that define user interaction(s) relative to the device 102 in response to particular eye tracking positions, facial expressions of the user, and/or head accessories (e.g., headphones) worn by the user. In the example of FIG. 2, the gesture data model(s) 223 and the facial feature data model(s) 225 are stored in the database 212. The example database 212 can store additional or fewer models than shown in FIG. 2. For example, the database 212 can store a model generated during training based on the training gesture data 230 and data indicative of a distance of the user relative to the device (e.g., based on proximity sensor data) and/or device configuration (e.g., based on sensor data indicating screen orientation).
The example thermal constraint manager 132 of FIG. 2 uses the model(s) 223, 225 to interpret the respective sensor data generated by the motion sensor(s) 123 and/or the image sensor(s) 122. The example thermal constraint manager 132 of FIG. 2 includes a motion data analyzer 222. In this example, the motion data analyzer 222 provides means for analyzing the sensor data 205 generated by the motion sensor(s) 123. The example motion data analyzer 222 uses the gesture data model(s) 223 to identify gesture(s) performed by the user relative to the device 102. For example, based on the gesture data model(s) 223 and the sensor data 205 generated by the motion sensor(s) 123 disposed proximate to, for instance, the display screen 103 of the device 102 and/or a touchpad of the device 102, the motion data analyzer 222 can determine that the user is reaching for the display screen 103 of the user device 102.
The example thermal constraint manager 132 of FIG. 2 includes an image data analyzer 220. In this example, the image data analyzer 220 provides means for analyzing the sensor data 204 generated by the image sensor(s) 122. The image data analyzer 220 uses the gesture data model(s) 223 and/or the facial feature data model(s) 225 to analyze the sensor data 204 to identify, for instance, gesture(s) being performed by the user and/or the user's posture relative to the device 102, and/or to track a position of the user's eyes relative to the device 102. For example, based on the gesture data model(s) 223 and the image sensor data 204, the image data analyzer 220 can determine that the user is typing. In other examples, based on the facial feature data model(s) 225 and the image sensor data 204 including a head of the user, the image data analyzer 220 determines that the user is turned away from the device 102 because the user's eyes are not visible in the image data.
In the example of FIG. 2, the thermal constraint manager 132 includes a timer 244. In this example, the timer 244 provides means for monitoring a duration of time within which a user input is received at the user device 102 after the user presence detection analyzer 214 determines that the user is within the range of the user presence detection sensor(s) 118. The timer 244 additionally or alternatively provides means for monitoring a duration of time in which the motion data analyzer 222 and/or the image data analyzer 220 determine that there is a likelihood of user interaction with the device after the user presence detection analyzer 214 determines that the user is within the range of the user presence detection sensor(s) 118. The timer 244 monitors the amount of time that has passed based on time interval threshold(s) 246 stored in the database 212 and defined by user input(s). As disclosed herein, if a user input is not received within the time interval threshold(s) 246 and/or if the motion data analyzer 222 and/or the image data analyzer 220 have not determined that a user interaction with the device 102 is likely to occur within the time interval threshold(s) 246, the thermal constraint manager 132 can adjust the thermal constraint(s) and/or the fan acoustic constraint(s) in response to the lack of user interaction with the device 102.
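The timer's role can be sketched with a monotonic clock. The class name and the 30-second default are assumptions for illustration; the patent leaves the time interval threshold(s) 246 to user-defined settings.

```python
import time

class InteractionTimer:
    """Track elapsed time since presence was detected without any user input."""

    def __init__(self, threshold_s: float = 30.0):
        self.threshold_s = threshold_s
        self.start = time.monotonic()

    def record_input(self) -> None:
        # Reset on each user input or each detected likely interaction.
        self.start = time.monotonic()

    def expired(self) -> bool:
        # True once the interval passes without an input: the manager may
        # then relax the thermal and/or fan acoustic constraints.
        return time.monotonic() - self.start >= self.threshold_s
```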
The thermal constraint manager 132 of FIG. 2 includes an ambient noise analyzer 234. In this example, the ambient noise analyzer 234 provides means for analyzing the sensor data 206 generated by the ambient noise sensor(s) 124. The ambient noise analyzer 234 analyzes the sensor data 206 based on one or more ambient noise rule(s) 235. In the example of FIG. 2, the ambient noise rule(s) 235 define threshold ambient noise level(s) that, if exceeded, indicate that a user is unlikely to detect an increase in audible fan noise. The ambient noise rule(s) 235 can be defined based on user input(s) and stored in the database 212.
The thermal constraint manager 132 of FIG. 2 includes a temperature analyzer 236. In this example, the temperature analyzer 236 provides means for analyzing the sensor data 208 generated by the temperature sensor(s) 126. The temperature analyzer 236 analyzes the sensor data 208 to determine the temperature of one or more hardware component(s) of the user device 102 and/or the skin of the housing of the user device 102. For example, the temperature analyzer 236 can detect an amount of heat generated by the processor 130 and/or a temperature of the exterior skin of the housing of the device 102 during operation of the device 102.
The example thermal constraint manager 132 of FIG. 2 includes a sensor manager 248 to manage operation of one or more of the user presence detection sensor(s) 118, the device configuration sensor(s) 120, the image sensor(s) 122, the motion sensor(s) 123, the ambient noise sensor(s) 124, and/or the temperature sensor(s) 126. The sensor manager 248 controls operation of the sensor(s) 118, 120, 122, 123, 124, 126 based on one or more sensor activation rule(s) 250. The sensor activation rule(s) 250 can be defined by user input(s) and stored in the database 212.
In some examples, the sensor activation rule(s) 250 define rule(s) for activating the sensor(s) to conserve power consumption by the device 102. For example, the sensor activation rule(s) 250 can define that the user presence detection sensor(s) 118 should remain active while the device 102 is operative (e.g., in a working power state) and that the image sensor(s) 122 should be activated when the user presence detection analyzer 214 determines that a user is within the range of the user presence detection sensor(s) 118. Such a rule can prevent unnecessary power consumption by the device 102 when, for instance, the user is not proximate to the device 102. In other examples, the sensor manager 248 selectively activates the image sensor(s) 122 to supplement data generated by the motion sensor(s) 123 to increase an accuracy with which the gesture(s) of the user are detected. In some examples, the sensor manager 248 deactivates the image sensor(s) 122 if the image data analyzer 220 does not predict a likelihood of a user interaction with the device and/or the device 102 does not receive a user input within a time threshold defined by the timer 244 to conserve power.
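The power-saving gating of sensors described above can be expressed as a small decision function. The following is a hypothetical Python sketch (the function name, sensor labels, and two-input signature are illustrative assumptions):

```python
# Hypothetical sketch of sensor activation rule(s) 250: keep the presence
# sensor active while the device is in a working power state, and gate the
# higher-power image sensor on detected user presence to conserve power.
def active_sensors(device_operative, user_present):
    sensors = set()
    if device_operative:
        sensors.add("presence")      # always on while the device is operative
        if user_present:
            sensors.add("image")     # activated only when a user is near
    return sensors
```

With this gating, the image sensor draws power only when the presence sensor has already indicated that a user is within range.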
The example thermal constraint manager 132 of FIG. 2 includes a thermal constraint selector 252. In the example of FIG. 2, the thermal constraint selector 252 selects a thermal constraint to be assigned to the user device 102 based on one or more of data from the user presence detection analyzer 214, the device configuration analyzer 218, the motion data analyzer 222, the image data analyzer 220, the ambient noise analyzer 234, and/or the temperature analyzer 236. The example thermal constraint selector 252 selects the thermal constraint to be assigned to the user device based on one or more thermal constraint selection rule(s) 254. The thermal constraint selection rule(s) 254 are defined based on user input(s) and stored in the database 212.
For example, the thermal constraint selection rule(s) 254 can include a first rule that if the device configuration analyzer 218 determines that the user is providing input(s) via a keyboard or touch screen of the device 102, a first, or default, thermal constraint for the temperature of the skin of the housing of the device 102 should be assigned to the user device 102 to prevent discomfort to the user when touching the device 102. The default thermal constraint for the skin temperature can be, for example, 45° C. The thermal constraint selection rule(s) 254 can include a second rule that if the device configuration analyzer 218 determines that the user is providing input(s) via the external keyboard 108, a second thermal constraint should be assigned to the device 102, where the second thermal constraint provides for an increased skin temperature of the device as compared to the first (e.g., default) thermal constraint. For example, the second thermal constraint can define a skin temperature limit of 48° C.
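The first and second thermal constraint selection rules above can be sketched as a lookup keyed on the detected input source. This is a minimal, hypothetical Python sketch; the function name and input-source labels are illustrative assumptions, while the 45° C. and 48° C. limits come from the example rules:

```python
# Hypothetical sketch of thermal constraint selection rule(s) 254:
# a default 45 C skin-temperature limit when the user touches the device
# directly, relaxed to 48 C when input arrives via an external keyboard.
DEFAULT_SKIN_LIMIT_C = 45.0
EXTERNAL_INPUT_SKIN_LIMIT_C = 48.0

def select_thermal_constraint(input_source):
    if input_source in ("onboard_keyboard", "touch_screen"):
        return DEFAULT_SKIN_LIMIT_C          # first (default) thermal constraint
    if input_source == "external_keyboard":
        return EXTERNAL_INPUT_SKIN_LIMIT_C   # second thermal constraint
    return DEFAULT_SKIN_LIMIT_C              # conservative fallback
```

Falling back to the default limit for unrecognized input sources keeps the selection conservative with respect to user comfort.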
The example thermal constraint manager 132 of FIG. 2 includes a fan acoustic constraint selector 258. In the example of FIG. 2, the fan acoustic constraint selector 258 selects a fan acoustic constraint to be assigned to the user device 102 based on one or more of data from the user presence detection analyzer 214, the device configuration analyzer 218, the motion data analyzer 222, the image data analyzer 220, the ambient noise analyzer 234, and/or the temperature analyzer 236. The example fan acoustic constraint selector 258 selects the fan acoustic constraint to be assigned to the user device 102 based on one or more fan acoustic constraint selection rule(s) 260. The fan acoustic constraint selection rule(s) 260 are defined based on user input(s) and stored in the database 212.
For example, the fan acoustic constraint selection rule(s) 260 can include a first or default rule for the fan noise level based on data from the user presence detection analyzer 214 indicating that the user is within a first range of the user presence detection sensor(s) 118 (e.g., 0.5 meters from the device 102). The first rule can define a sound pressure level corresponding to 35 dBA for noise generated by the fan(s). The fan acoustic constraint selection rule(s) 260 can include a second rule for the fan noise level based on data from the user presence detection analyzer 214 indicating that the user is within a second range of the user presence detection sensor(s) 118 (e.g., 1 meter from the device 102), where the second rule defines a sound pressure level (e.g., 41 dBA) for noise generated by the fan(s) 114 that is greater than the sound pressure level defined by the first rule. The fan acoustic constraint selection rule(s) 260 can include a third rule for the fan noise level based on data from the image data analyzer 220 indicating that the user is turned away from the user device 102. The third rule can define a fan speed and, thus, acoustic noise level, that is increased relative to the fan speed and associated acoustic noise defined by the first or default fan acoustic rule in view of the determination that the user is not interacting or not likely interacting with the device 102. The fan acoustic constraint selection rule(s) 260 can include a fourth rule indicating that if the device configuration analyzer 218 determines that an angle of a display screen of the device 102 is within a particular angle range relative to, for instance, a base of a laptop, the user is likely sitting when interacting with the device 102 and, thus, located closer to the device than if the user were standing.
In such examples, the fourth rule can define a reduced fan acoustic noise level as compared to if the user is standing or located farther from the device 102.
The fan acoustic constraint selection rule(s) 260 can include a fifth rule indicating that if the device configuration analyzer 218 determines that headphones are coupled to the device 102 and/or the image data analyzer 220 determines that the user is wearing headphones, the fan acoustic noise can be increased relative to the default fan noise level. The fan acoustic constraint selection rule(s) 260 can include a sixth rule indicating that if the ambient noise analyzer 234 determines that the ambient noise exceeds an ambient noise threshold, the fan acoustic noise can be increased relative to the default fan noise level. The fan acoustic constraint selection rule(s) 260 can include a seventh rule indicating that if the device configuration analyzer 218, the image data analyzer 220, and/or the motion data analyzer 222 do not detect a user input and/or do not predict a likelihood of a user interaction with the device 102 within the time interval threshold(s) 246 as monitored by the timer 244, the fan acoustic noise should be increased because the user is not likely interacting with the device 102.
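Several of the distance-, attention-, headphone-, and ambient-noise-based rules above can be collapsed into a single selection function. The following is a minimal, hypothetical Python sketch; the function name, inputs, and the two-level limit structure are illustrative assumptions, while the 35 dBA and 41 dBA values come from the example rules:

```python
# Hypothetical sketch collapsing several fan acoustic constraint selection
# rule(s) 260: the quietest limit applies only when the user is near,
# facing the device, without headphones, and in a quiet environment.
DEFAULT_FAN_LIMIT_DBA = 35.0   # user within first range (e.g., 0.5 m)
RELAXED_FAN_LIMIT_DBA = 41.0   # user farther away, turned away, etc.

def select_fan_acoustic_constraint(distance_m, facing_device,
                                   headphones_on, ambient_noise_high):
    # Headphones, high ambient noise, or inattention mask fan noise.
    if headphones_on or ambient_noise_high or not facing_device:
        return RELAXED_FAN_LIMIT_DBA
    if distance_m <= 0.5:
        return DEFAULT_FAN_LIMIT_DBA
    return RELAXED_FAN_LIMIT_DBA
```

A production implementation would likely use more than two noise levels and weigh the signals individually; the sketch only shows the rule-combination pattern.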
In the example of FIG. 2, the thermal constraint selector 252 and the fan acoustic constraint selector 258 can communicate to optimize performance of the device 102, thermal constraints for the skin of the device 102, and fan acoustic noise levels in view of user interaction(s) and/or ambient conditions. For example, if the device configuration analyzer 218 determines that the user is providing user inputs via an external device, the thermal constraint selector 252 can select a first thermal constraint that results in increased skin temperature of the device (e.g., 46° C.) relative to a default temperature (e.g., 45° C.). If the ambient noise analyzer 234 determines that the user device is in a quiet environment, the fan acoustic constraint selector 258 can select a first fan acoustic constraint for the device 102 that permits a modest increase in fan noise level(s) (e.g., 38 dBA) over a default level (e.g., 35 dBA) to accommodate the increased heat permitted by the first thermal constraint and prevent overheating of the device 102. However, if the ambient noise analyzer 234 determines that the user device 102 is in a loud environment, the thermal constraint selector 252 can select a second thermal constraint for the device 102 that provides for an increased skin temperature (e.g., 48° C.) over the first thermal constraint (e.g., 46° C.) and the default thermal constraint (e.g., 45° C.) and, thus, permits increased device performance as a result of increased power consumption by the device component(s). Also, the fan acoustic constraint selector can select a second fan acoustic constraint for the device 102 that permits an increase in fan noise level(s) (e.g., 41 dBA) over the first fan constraint (e.g., 38 dBA) and the default fan acoustic constraint (e.g., 35 dBA).
Because the device 102 is in a loud environment, the performance of the device 102 can be increased by permitting increased heat to be generated by the component(s) of the device 102 as compared to if the device 102 were in a quiet environment and the fan acoustic constraints were limited in view of low ambient noise.
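The coordinated selection described in the preceding example can be sketched as a single function returning the paired skin-temperature and fan-noise limits. This is a hypothetical Python sketch (the function name and boolean inputs are illustrative assumptions; the numeric pairs come from the example above):

```python
# Hypothetical sketch of coordinated thermal/acoustic selection: with
# external input, a quiet room permits a modest bump (46 C / 38 dBA),
# while a loud room permits a larger bump (48 C / 41 dBA) over the
# 45 C / 35 dBA defaults.
def select_constraints(external_input, loud_environment):
    if not external_input:
        return 45.0, 35.0      # default thermal / default fan acoustic limit
    if loud_environment:
        return 48.0, 41.0      # second thermal / second fan acoustic limit
    return 46.0, 38.0          # first thermal / first fan acoustic limit
```

Returning both limits together reflects the communication between the two selectors: a higher skin-temperature limit is only safe when the fan is also permitted to run (and sound) faster.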
The thermal constraint manager 132 of FIG. 2 includes a power source manager 238. In this example, the power source manager 238 generates instruction(s) that are transmitted to the power source(s) 116 of the user device 102 of FIG. 1 to control the power provided to the processor 130 and/or other hardware components of the user device 102 (e.g., a video graphics card). As disclosed herein, increasing the power provided to the hardware component(s) of the device 102 increases the performance level of those component(s) (e.g., the responsiveness, availability, reliability, recoverability, and/or throughput of the processor 130). In the example of FIG. 2, the thermal constraint selector 252 communicates with the power source manager 238 to increase or decrease the power provided to the hardware component(s) of the device 102 in view of the selected thermal constraint(s). For example, if the thermal constraint selector 252 selects a thermal constraint for the device skin temperature that allows the skin temperature to increase relative to a default skin temperature limit, the power source manager 238 generates instructions for increased power to be provided to the hardware component(s) of the device 102. If the thermal constraint selector 252 determines that the skin temperature of the device 102 should be reduced (e.g., in response to a change in user interaction with the device 102), the power source manager 238 generates instructions for power provided to the hardware component(s) of the device 102 to be reduced to decrease the amount of heat generated by the component(s). The example power source manager 238 transmits the instruction(s) to the power source 116 via one or more wired or wireless connections.
The example thermal constraint manager 132 of FIG. 2 includes a fan speed manager 240. The fan speed manager 240 generates instruction(s) to control the fan speed (e.g., revolutions per minute) of the fan(s) 114 of the user device 102 of FIG. 1 in response to selection of a fan acoustic constraint by the fan acoustic constraint selector 258. In some examples, the fan speed manager 240 generates instruction(s) to control speed of the fan(s) 114 in response to selection of a thermal constraint by the thermal constraint selector 252 to prevent, for instance, overheating of the hardware component(s) of the device when the selected thermal constraint permits an increase in skin temperature of the device 102. The fan speed manager 240 transmits the instruction(s) to the fan(s) 114 via one or more wired or wireless connections.
In some examples, the fan acoustic constraint selector 258 selects a fan acoustic constraint associated with increased fan acoustic noise when the user presence detection analyzer 214 does not detect the presence of a user within the range of the user presence detection sensor(s) 118 or when the user presence detection analyzer 214 determines that the user is a predefined distance from the device 102 to facilitate cleaning of heatsink(s) and fan shroud(s) of the device 102 (e.g., to remove accumulated dust). Because such cleaning can increase acoustic noise generated by the fan(s) 114 when rotation of the fan(s) 114 is reversed to perform the cleaning, the fan acoustic constraint selector 258 can select a fan acoustic constraint for the device 102 and communicate with the fan speed manager 240 to perform the cleaning when user(s) are not proximate to the device 102. In such examples, the acoustic noise of the fan(s) 114 can be increased without disrupting a user interacting with the device 102 and longevity of the device performance can be increased through periodic cleanings.
The example thermal constraint selector 252 of FIGS. 1 and/or 2 dynamically selects the thermal constraint to be assigned to the device 102 based on analysis of the sensor data. For example, at a first time, the thermal constraint selector 252 can select a first thermal constraint for the device 102 that corresponds to increased temperature of the skin of the housing of the device 102 based on data indicating the user is providing input(s) via the external keyboard 108. If, at a later time, the motion data analyzer 222 detects that the user is reaching for the display screen 103 of the device 102, the thermal constraint selector 252 selects a second thermal constraint for the device 102 that reduces the skin temperature of the device. In response, the power source manager 238 generates instructions to adjust the power provided to the hardware component(s) of the device to reduce heat generated and/or the fan speed manager 240 generates instructions to adjust the fan speed(s) (e.g., increase the fan speed(s) to exhaust hot air) in view of the change in the thermal constraint selected for the device 102.
In some examples, the thermal constraint selector 252 and/or the fan acoustic constraint selector 258 selectively adjust the constraint(s) applied to the device 102 based on temperature data generated by the temperature sensor(s) 126 during operation of the device. For example, if increased power is provided to the hardware component(s) of the device 102 in response to selection of a thermal constraint that permits increased skin temperature of the housing of the device 102, the fan speed manager 240 can instruct the fan(s) 114 to increase rotational speed to prevent the skin temperature from exceeding the selected thermal constraint based on data from the temperature sensor(s) 126.
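The temperature-feedback behavior above can be sketched as a simple step controller. This is a hypothetical Python sketch; the function name, margins, step size, and RPM bounds are illustrative assumptions rather than values from the patent:

```python
# Hypothetical sketch of temperature feedback: when the measured skin
# temperature approaches the selected limit, step the fan speed up; when
# well below the limit, step it back down. All numeric values are
# illustrative.
def adjust_fan_rpm(current_rpm, skin_temp_c, limit_c,
                   margin_c=1.0, step=200, max_rpm=5000, min_rpm=1000):
    if skin_temp_c >= limit_c - margin_c:
        return min(current_rpm + step, max_rpm)   # spin up to shed heat
    if skin_temp_c < limit_c - 3 * margin_c:
        return max(current_rpm - step, min_rpm)   # quiet down when cool
    return current_rpm                            # hold within the deadband
```

The deadband between the two thresholds avoids oscillating the fan speed when the skin temperature hovers near the limit; a production controller would more likely use a tuned PID loop or vendor fan curve.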
Although the example thermal constraint manager 132 of FIGS. 1 and/or 2 is discussed in connection with analysis of sensor data from the user presence detection sensor(s) 118, the user input sensor(s) 120, the image sensor(s) 122, and/or the ambient noise sensor(s) 124, the example thermal constraint manager 132 can analyze data based on other sensors of the user device 102 of FIG. 1 (e.g., ambient light sensor(s)) to evaluate user interaction(s) and/or the environment in which the device 102 is located and assign thermal and/or fan acoustic constraints to the device 102.
While an example manner of implementing the thermal constraint manager 132 of FIG. 1 is illustrated in FIG. 2, one or more of the elements, processes and/or devices illustrated in FIG. 2 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any other way. Further, the example user presence detection analyzer 214, the example device configuration analyzer 218, the example image data analyzer 220, the example motion data analyzer 222, the example ambient noise analyzer 234, the example temperature analyzer 236, the example power source manager 238, the example fan speed manager 240, the example timer 244, the example sensor manager 248, the example thermal constraint selector 252, the example fan acoustic constraint selector 258, the example database 212 and/or, more generally, the example thermal constraint manager 132 of FIG. 2 may be implemented by hardware, software, firmware and/or any combination of hardware, software and/or firmware. Thus, for example, any of the example user presence detection analyzer 214, the example device configuration analyzer 218, the example image data analyzer 220, the example motion data analyzer 222, the example ambient noise analyzer 234, the example temperature analyzer 236, the example power source manager 238, the example fan speed manager 240, the example timer 244, the example sensor manager 248, the example thermal constraint selector 252, the example fan acoustic constraint selector 258, the example database 212 and/or, more generally, the example thermal constraint manager 132 could be implemented by one or more analog or digital circuit(s), logic circuits, programmable processor(s), programmable controller(s), graphics processing unit(s) (GPU(s)), digital signal processor(s) (DSP(s)), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)) and/or field programmable logic device(s) (FPLD(s)). 
When reading any of the apparatus or system claims of this patent to cover a purely software and/or firmware implementation, at least one of the example user presence detection analyzer 214, the example device configuration analyzer 218, the example image data analyzer 220, the example motion data analyzer 222, the example ambient noise analyzer 234, the example temperature analyzer 236, the example power source manager 238, the example fan speed manager 240, the example timer 244, the example sensor manager 248, the example thermal constraint selector 252, the example fan acoustic constraint selector 258, and/or the example database 212 is/are hereby expressly defined to include a non-transitory computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. including the software and/or firmware. Further still, the example thermal constraint manager 132 of FIG. 2 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in FIG. 2, and/or may include more than one of any or all of the illustrated elements, processes and devices. As used herein, the phrase “in communication,” including variations thereof, encompasses direct communication and/or indirect communication through one or more intermediary components, and does not require direct physical (e.g., wired) communication and/or constant communication, but rather additionally includes selective communication at periodic intervals, scheduled intervals, aperiodic intervals, and/or one-time events.
While an example manner of implementing the training manager 224 is illustrated in FIG. 2, one or more of the elements, processes and/or devices illustrated in FIG. 2 may be combined, divided, re-arranged, omitted, eliminated and/or implemented in any other way. Further, the example trainer 226, the example machine learning engine 228, the example database 232 and/or, more generally, the example training manager 224 of FIG. 2 may be implemented by hardware, software, firmware and/or any combination of hardware, software and/or firmware. Thus, for example, any of the example trainer 226, the example machine learning engine 228, the example database 232 and/or, more generally, the example training manager 224 could be implemented by one or more analog or digital circuit(s), logic circuits, programmable processor(s), programmable controller(s), graphics processing unit(s) (GPU(s)), digital signal processor(s) (DSP(s)), application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)) and/or field programmable logic device(s) (FPLD(s)). When reading any of the apparatus or system claims of this patent to cover a purely software and/or firmware implementation, at least one of the example trainer 226, the example machine learning engine 228, and/or the example database 232 is/are hereby expressly defined to include a non-transitory computer readable storage device or storage disk such as a memory, a digital versatile disk (DVD), a compact disk (CD), a Blu-ray disk, etc. including the software and/or firmware. Further still, the example training manager 224 of FIG. 2 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in FIG. 2, and/or may include more than one of any or all of the illustrated elements, processes and devices.
FIG. 3 illustrates a graph 300 of example thermal constraints that may be implemented in connection with an electronic user device such as the example user device 102 of FIG. 1 to control a temperature of an exterior surface, or skin, of the device (e.g., a housing or body of the device). In particular, the example graph 300 of FIG. 3 illustrates temperature of the skin of the user device 102 over time for different thermal constraints. A default temperature for the skin of the device 102 can be set at 45° C., as represented by line 302 in FIG. 3. A first thermal constraint 304 corresponds to a default thermal constraint in that, when implemented by the device 102, the skin temperature of the user device 102 does not exceed the default skin temperature represented by line 302. As disclosed herein, in some examples, the thermal constraint manager 132 of FIGS. 1 and/or 2 determines that a thermal constraint that permits the skin temperature of the device 102 to increase can be selected in view of, for instance, user interaction(s) with the device 102. As shown in FIG. 3, a second thermal constraint 306 provides for an increase in skin temperature relative to the first thermal constraint 304 (e.g., a skin temperature limit of 46° C.). A third thermal constraint 308 and a fourth thermal constraint 310 permit additional increases in skin temperature relative to the first and second thermal constraints 304, 306. If one or more of the second, third, or fourth thermal constraints 306, 308, 310 is selected, the power source manager 238 of the example thermal constraint manager 132 generates instructions to increase the power provided to the hardware component(s) of the user device 102, which allows the component(s) to generate more heat without violating the thermal constraint and improve performance of the device 102.
FIG. 4 illustrates an example user device 400 (e.g., the user device 102 of FIG. 1) in which examples disclosed herein may be implemented. In FIG. 4, the example user device 400 is a laptop. However, as disclosed herein, other types of user devices, such as desktops or electronic tablets, can be used to implement the examples disclosed herein.
FIG. 4 illustrates the user device 400 in a first configuration in which a user 402 interacts with the user device 400 by providing input(s) via an on-board keyboard 404 (e.g., the keyboard 104) of the device 400. The keyboard 404 is supported by a housing 406 of the device 400, where the housing 406 includes an exterior surface or skin 408 that defines the housing 406. To prevent the temperature of one or more portions of the skin 408 from becoming too hot while the user is directly touching the device 400, the example thermal constraint manager 132 of FIGS. 1 and/or 2 can select a thermal constraint for the device 400 that maintains the skin temperature at or substantially at a default level (e.g., the first thermal constraint 304 of FIG. 3 corresponding to a skin temperature of 45° C. for the skin 408). In such examples, the power source manager 238 of the example thermal constraint manager 132 manages power level(s) for the hardware component(s) of the device 400 so that the resulting temperature of the skin 408 does not exceed the thermal constraint. Additionally or alternatively, the thermal constraint manager 132 can determine the user 402 is not wearing headphones based on data generated by, for instance, the device configuration sensor(s) 120 and/or the image data sensor(s) 122 of FIG. 1. Thus, the fan acoustic constraint selector 258 can select a fan acoustic constraint for the device 400 so that the noise generated by the fan(s) of the device 400 (e.g., the fan(s) 114) does not exceed, for instance, a default fan noise level of 35 dBA.
FIG. 5 illustrates the example user device 400 of FIG. 4 in a second configuration in which the user 402 is interacting with the user device 400 via an external keyboard 500. Thus, because the user 402 is interacting with the user device 400 via the external keyboard 500, the user 402 is not directly touching the device 400 (e.g., the skin 408 of the device 400). In this example, the thermal constraint selector 252 can select a thermal constraint (e.g., the second, third, or fourth thermal constraints 306, 308, 310 of FIG. 3) that permits an increase in a temperature of the skin 408 of the device 400 above the default temperature (e.g., above the temperature associated with the first thermal constraint 304 of FIG. 3). In view of the permitted increase in the temperature of the skin 408, power to one or more hardware components of the device 400 and, thus, performance of those component(s), can be increased.
A flowchart representative of example hardware logic, machine readable instructions, hardware implemented state machines, and/or any combination thereof for implementing the example training manager 224 of FIG. 2 is shown in FIG. 6. The machine readable instructions may be one or more executable programs or portion(s) of an executable program for execution by a computer processor such as the processor 224 shown in the example processor platform 800 discussed below in connection with FIG. 8. The program may be embodied in software stored on a non-transitory computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a DVD, a Blu-ray disk, or a memory associated with the processor 224, but the entire program and/or parts thereof could alternatively be executed by a device other than the processor 224 and/or embodied in firmware or dedicated hardware. Further, although the example program is described with reference to the flowchart illustrated in FIG. 6, many other methods of implementing the example training manager 224 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined. Additionally or alternatively, any or all of the blocks may be implemented by one or more hardware circuits (e.g., discrete and/or integrated analog and/or digital circuitry, an FPGA, an ASIC, a comparator, an operational-amplifier (op-amp), a logic circuit, etc.) structured to perform the corresponding operation without executing software or firmware.
FIG. 6 is a flowchart of example machine readable instructions that, when executed, implement the example training manager 224 of FIG. 2. In the example of FIG. 6, the training manager 224 trains the example thermal constraint manager 132 of FIGS. 1 and/or 2 using training gesture data and/or training facial feature data, which is generated for one or more users who may or may not be using the example user device 102 of FIG. 1. As discussed herein, the training manager 224 generates machine learning models that are used by the thermal constraint manager 132 of FIGS. 1 and/or 2 to select thermal constraint(s) for a temperature of a skin of a user device (e.g., the skin 408 of the housing 406 of the user device 102, 400) and/or fan acoustic constraint(s) for noise generated by fan(s) of the user device (e.g., the fan(s) 114 of the user device 102) based on user interaction(s) relative to the user device 102.
The example instructions of FIG. 6 can be executed by one or more processors of, for instance, the user device 102, another user device (e.g., the user device 128), and/or a cloud-based device (e.g., the cloud-based device(s) 129). The instructions of FIG. 6 can be executed in substantially real-time as the training gesture data and/or the training facial feature data is received by the training manager 224 or at some time after the training data is received by the training manager 224. The training manager 224 can communicate with the thermal constraint manager 132 via one or more wired or wireless communication protocols.
The example trainer 226 of FIG. 2 accesses training gesture data 230 and/or training facial feature data 231 (block 600). The training gesture data 230 and/or training facial feature data 231 can be stored in the database 232. In some examples, the training gesture data 230 and/or training facial feature data 231 is generated for one or more users of the user device 102. In some examples, the training gesture data 230 and/or the training facial feature data 231 can be received from the thermal constraint manager 132 and/or directly from the image sensor(s) 122 and/or the motion sensor(s) 123 of the example user device 102, 400. In some other examples, the training gesture data 230 and/or the training facial feature data 231 is generated for users who are not the user(s) of the user device 102.
The example trainer 226 of FIG. 2 identifies user interactions (e.g., user interactions with the user device 102, 400 and/or other user interactions such as talking on a phone) represented by the training gesture data 230 and/or the training facial feature data 231 (block 602). As an example, based on the training gesture data 230, the trainer 226 identifies an arm motion in which a user reaches his or her arm forward as indicating that the user intends to touch a touch screen of a user device. As another example, based on the training facial feature data 231, the trainer 226 identifies eye positions indicating that a user is looking toward or away from a display screen of the device.
The example trainer 226 of FIG. 2 generates one or more gesture data model(s) 223 via the machine learning engine 228 and based on the training gesture data 230 and one or more facial feature data model(s) 225 via the machine learning engine 228 and based on the training facial feature data 231 (block 604). For example, the trainer 226 uses the training gesture data 230 to generate the gesture data model(s) 223 that are used by the thermal constraint manager 132 to determine whether a user is typing on the keyboard 104, 404 of the user device 102, 400.
The example trainer 226 can continue to train the thermal constraint manager 132 using different datasets and/or datasets having different levels of specificity (block 606). For example, the trainer 226 can generate a first gesture data model 223 to determine if the user is interacting with the keyboard 104 of the user device 102, 400 and a second gesture data model 223 to determine if the user is interacting with the pointing device(s) 106 of the user device 102, 400. The example instructions end when there is no additional training to be performed (e.g., based on user input(s)) (block 608).
The example instructions of FIG. 6 can be used to perform training based on other types of sensor data. For example, the example instructions of FIG. 6 can be used to train the thermal constraint manager 132 to associate different orientations of the device 102, 400, screen angle, etc., with different user positions (e.g., sitting, standing) relative to the device 102, 400 and/or different locations of the device (e.g., resting on a user's lap, held in a user's hand, resting on a table).
A flowchart representative of example hardware logic, machine readable instructions, hardware implemented state machines, and/or any combination thereof for implementing the thermal constraint manager 132 of FIG. 2 is shown in FIGS. 7A-7B. The machine readable instructions may be one or more executable programs or portion(s) of an executable program for execution by a computer processor such as the processor 132 shown in the example processor platform 900 discussed below in connection with FIG. 9. The program may be embodied in software stored on a non-transitory computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a DVD, a Blu-ray disk, or a memory associated with the processor 132, but the entire program and/or parts thereof could alternatively be executed by a device other than the processor 132 and/or embodied in firmware or dedicated hardware. Further, although the example program is described with reference to the flowchart illustrated in FIGS. 7A-7B, many other methods of implementing the example thermal constraint manager 132 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks described may be changed, eliminated, or combined. Additionally or alternatively, any or all of the blocks may be implemented by one or more hardware circuits (e.g., discrete and/or integrated analog and/or digital circuitry, an FPGA, an ASIC, a comparator, an operational-amplifier (op-amp), a logic circuit, etc.) structured to perform the corresponding operation without executing software or firmware.
FIGS. 7A and 7B are flowcharts of example machine readable instructions that, when executed, implement the example thermal constraint manager 132 of FIGS. 1 and/or 2. In the example of FIGS. 7A and 7B, the thermal constraint manager 132 generates instruction(s) to control the thermal constraint(s) and/or fan acoustic constraint(s) of a user device (e.g., the user device 102, 400) based on a user interaction(s) and/or ambient condition(s) for an environment in which the device is located. The example instructions of FIGS. 7A and 7B can be executed by one or more processors of, for instance, the user device 102, 400, another user device (e.g., the user device 128), and/or a cloud-based device (e.g., the cloud-based device(s) 129). The instructions of FIGS. 7A and 7B can be executed in substantially real-time as sensor data is received by the thermal constraint manager 132 or at some time after the sensor data is received by the thermal constraint manager 132.
In the example instructions of FIGS. 7A and 7B, the device 102, 400 can be in a working power state (e.g., a power state in which the device is fully operational in that the display screen is turned on, applications are being executed by processor(s) of the device) or a connected standby state (e.g., a low power standby state in which the device remains connected to the Internet such that processor(s) of the device can respond quickly to hardware and/or network events).
The example user presence detection analyzer 214 determines whether the user is within a threshold distance of the user device 102 (block 700). For example, the user presence detection analyzer 214 detects a user is approaching the user device 102, 400 based on data generated by the user presence detection sensor(s) 118 (e.g., TOF data, etc.) indicating that the user is within the range of the user presence detection sensor(s) 118. In some examples, the user presence detection analyzer 214 determines if the user is within a predefined distance of the device 102 (e.g., within 1 meter, within 0.5 meters, etc.).
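As an illustrative sketch (not part of the disclosure), the distance check of block 700 can be expressed as converting a time-of-flight (TOF) round-trip measurement into a distance and comparing it to a threshold. The function names and the 0.5 meter default are assumptions for illustration only:

```python
# Hypothetical sketch of the presence check at block 700: a TOF sensor's
# round-trip time is converted to a one-way distance and compared against a
# predefined threshold. All names and values are illustrative assumptions.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_to_distance_m(round_trip_time_s: float) -> float:
    """Convert a TOF round-trip time (seconds) to a one-way distance (meters)."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

def user_within_threshold(round_trip_time_s: float,
                          threshold_m: float = 0.5) -> bool:
    """Return True if the detected subject is within the threshold distance."""
    return tof_to_distance_m(round_trip_time_s) <= threshold_m
```

For example, a round trip of 2 nanoseconds corresponds to roughly 0.3 meters and would register as "user present" under the assumed 0.5 meter threshold.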
In the example of FIGS. 7A and 7B, if the user presence detection analyzer 214 of the example thermal constraint manager 132 of FIG. 2 determines a user is detected within a threshold distance of the user device 102, the example device configuration analyzer 218 of the example thermal constraint manager 132 of FIG. 2 determines whether user input(s) are detected within a threshold time (block 702). For example, the timer 244 communicates with the device configuration analyzer 218 to determine the amount of time between when a user presence is detected within a threshold distance of the device 102, 400 (e.g., block 700) and when user input(s) are received by the device 102, 400. In some examples, the device configuration analyzer 218 detects user input(s) at the user device 102 such as keyboard input(s), touch screen input(s), mouse click(s), etc. If the device configuration analyzer 218 determines the user input(s) are detected within the threshold time, control proceeds to block 704.
At block 704, the device configuration analyzer 218 determines whether the user input(s) are received via external user input device(s) or on-board input device(s). For example, the device configuration analyzer 218 detects user input(s) via the external keyboard 108 and/or the external pointing device(s) 110 or via the on-board keyboard 104 and/or the on-board pointing device(s) 106.
If the device configuration analyzer 218 determines that the user input(s) are received via an external user input device, the thermal constraint selector 252 of the example thermal constraint manager 132 of FIG. 2 selects a thermal constraint for a temperature of the skin 408 of the device 102 (e.g., based on the thermal constraint selection rule(s) 254 stored in the database 212) that permits an increase in a temperature of a skin 408 of a housing 406 of the device 102, 400 relative to a default temperature. In response, the power source manager 238 of the example thermal constraint manager 132 of FIG. 2 instructs the hardware component(s) of the device 102, 400 (e.g., the processor 130) to consume increased amounts of power (block 706). For example, if the device configuration analyzer 218 determines that the user is interacting with the device 102, 400 via an external keyboard 108, 500, the thermal constraint selector 252 can select a thermal constraint that permits the skin temperature to increase to, for instance, 47° C. from a default temperature of 45° C. The power source manager 238 communicates with the power source(s) 116 of the device 102, 400 to increase the power provided to the hardware component(s) of the user device 102, 400 based on the thermal constraint selected by the thermal constraint selector 252.
If the device configuration analyzer 218 determines that the user input(s) are being received by the device 102, 400 via on-board user input device(s) such as the on-board keyboard 104, the thermal constraint selector 252 of the example thermal constraint manager 132 of FIG. 2 selects a thermal constraint for a temperature of the skin 408 of the device 102 that maintains the temperature of the skin 408 of the housing 406 of the device 102, 400 at a default temperature and the power source manager 238 of the example thermal constraint manager 132 of FIG. 2 instructs the hardware component(s) of the device 102, 400 (e.g., the processor 130) to consume power so as not to cause the temperature of the skin to exceed the default temperature (block 708).
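The selection at blocks 704-708 can be sketched as a simple mapping from input source to skin-temperature constraint. The 45° C. default and 47° C. relaxed limits follow the examples above; the function names and the string labels for input sources are illustrative assumptions:

```python
# Illustrative sketch of blocks 704-708: the skin-temperature constraint is
# relaxed when input arrives via an external device (the user's hands are off
# the housing) and held at the default otherwise.

DEFAULT_SKIN_LIMIT_C = 45.0   # default skin temperature constraint
RELAXED_SKIN_LIMIT_C = 47.0   # permitted when external input devices are used

def select_thermal_constraint(input_source: str) -> float:
    """Return the skin-temperature limit (deg C) for the given input source."""
    if input_source == "external":   # e.g., external keyboard or pointing device
        return RELAXED_SKIN_LIMIT_C
    return DEFAULT_SKIN_LIMIT_C      # on-board keyboard, touch screen, etc.

def power_boost_permitted(input_source: str) -> bool:
    """A higher skin limit permits the hardware to draw increased power."""
    return select_thermal_constraint(input_source) > DEFAULT_SKIN_LIMIT_C
```

Under this sketch, external input both raises the skin-temperature cap and, as a consequence, authorizes increased power consumption by the hardware component(s).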
In some examples, in view of the thermal constraint(s) assigned to the device 102, 400 at blocks 706, 708, the temperature analyzer 236 monitors the temperature of the hardware component(s) of the user device 102 based on the data generated by the temperature sensor(s) 126 and the fan speed manager 240 controls operation of the fan(s) 114 (e.g., increase fan level to exhaust hot air to cool the user device 102) to prevent the skin temperature from exceeding the selected thermal constraint at blocks 706 and/or 708.
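The monitoring described above can be sketched as a simple proportional rule that raises the fan level as the measured skin temperature approaches the selected constraint. The margin, fan-level scale, and function names are assumptions for illustration, not the disclosed control scheme:

```python
# Hypothetical sketch of the temperature monitoring/fan control interplay:
# the fan level scales from 0 to MAX_FAN_LEVEL as the skin temperature
# closes on the selected thermal constraint.

MAX_FAN_LEVEL = 10  # assumed discrete fan levels

def fan_level_for_temperature(skin_temp_c: float,
                              limit_c: float,
                              margin_c: float = 5.0) -> int:
    """Return a fan level that grows as skin_temp_c approaches limit_c."""
    if skin_temp_c <= limit_c - margin_c:
        return 0                      # comfortably below the constraint
    if skin_temp_c >= limit_c:
        return MAX_FAN_LEVEL          # at or above the limit: maximum cooling
    fraction = (skin_temp_c - (limit_c - margin_c)) / margin_c
    return round(fraction * MAX_FAN_LEVEL)
```

With a 45° C. constraint and the assumed 5° C. margin, a 42.5° C. reading would map to a mid-range fan level, while anything at or above the constraint drives the fan to its maximum level.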
Control proceeds to block 718 from blocks 706, 708. At block 718, the device configuration analyzer 218 determines whether the user who is interacting with the device 102, 400 is wearing headphones 112. For example, the device configuration analyzer 218 detects whether headphones 112 are coupled with the user device 102 (e.g., via wired or wireless connection(s)) and audio output(s) are being provided via the device 102, 400. In some examples, the image data analyzer 220 determines whether the user is wearing headphones 112 based on image data generated by the image sensor(s) 122. If the device configuration analyzer 218 and/or the image data analyzer 220 determine the user is wearing headphones 112, the fan constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise (e.g., 36 dBA) in view of the use of headphones 112 by the user and the fan speed manager 240 instructs the fan(s) to increase rotational speed(s) (block 720). If the device configuration analyzer 218 and/or the image data analyzer 220 determine the user is not wearing headphones, control proceeds to block 724.
At block 724, the ambient noise analyzer 234 analyzes microphone data generated by the microphone(s) 124 to determine an ambient noise level for an environment in which the user device 102, 400 is located. The ambient noise analyzer 234 determines whether the ambient noise level exceeds a threshold (e.g., based on the ambient noise rule(s) 235) (block 726). If the ambient noise level exceeds the threshold, the fan constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise in view of the noisy surrounding environment and the fan speed manager 240 instructs the fan(s) to increase rotational speed(s) (block 728). If the ambient noise level does not exceed the threshold, the fan acoustic constraint selector 258 selects a default fan acoustic constraint (e.g., based on the fan acoustic constraint selection rule(s) 260) for the fan(s) 114 and the fan speed manager 240 of the example thermal constraint manager 132 of FIG. 1 instructs the fan(s) to rotate at speed(s) that generate noise at or under, for instance, 35 dBA (block 730). Control proceeds to block 722.
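Blocks 718-730 amount to selecting a fan acoustic constraint from the headphone state and the ambient noise level. In the following sketch, the 35 dBA default and 36 dBA relaxed limits are taken from the examples above, while the ambient-noise threshold and all names are illustrative assumptions:

```python
# Illustrative sketch of blocks 718-730: the fan acoustic constraint is
# relaxed when the user wears headphones or when ambient noise would mask
# the fan; otherwise the default limit applies.

DEFAULT_FAN_LIMIT_DBA = 35.0        # default fan acoustic constraint
RELAXED_FAN_LIMIT_DBA = 36.0        # permitted with headphones / noisy rooms
AMBIENT_NOISE_THRESHOLD_DBA = 50.0  # hypothetical ambient noise rule

def select_fan_acoustic_constraint(headphones_on: bool,
                                   ambient_noise_dba: float) -> float:
    """Return the maximum fan noise (dBA) permitted under current conditions."""
    if headphones_on:
        return RELAXED_FAN_LIMIT_DBA       # block 720: user cannot hear the fan
    if ambient_noise_dba > AMBIENT_NOISE_THRESHOLD_DBA:
        return RELAXED_FAN_LIMIT_DBA       # block 728: environment masks the fan
    return DEFAULT_FAN_LIMIT_DBA           # block 730: default constraint
```

Either condition alone suffices to relax the constraint; the headphone check takes priority, so a quiet room does not override it.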
In the examples of FIGS. 7A and 7B, if the device configuration analyzer 218 does not detect the user input(s) within a threshold time (block 702), the image data analyzer 220 and/or the motion data analyzer 222 analyze user gesture(s) (e.g., movements, posture) and/or facial feature(s) (e.g., eye gaze) based on data generated by the image sensor(s) 122 and/or the motion sensor(s) 123 (block 710). In some instances, the sensor manager 248 activates the image sensor(s) 122 to generate image data when the user is detected as being proximate to the device (block 700).
For example, the image data analyzer 220 analyzes image data generated by the image sensor(s) 122 to detect, for instance, a user's posture and/or eye gaze direction. Additionally or alternatively, the motion data analyzer 222 can analyze gesture data generated by the motion sensor(s) 123 to determine user gesture(s) (e.g., raising an arm, reaching a hand away from the user's body). In the example of FIGS. 7A and 7B, the image data analyzer 220 and/or the motion data analyzer 222 use machine-learning based model(s) 223, 225 to determine if a user is likely to interact with the user device 102. If the image data analyzer 220 and/or the motion data analyzer 222 determines that the user is likely to interact with the device 102, 400 within a threshold time as measured by the timer 244 (block 712), the fan acoustic constraint selector 258 selects a default fan acoustic constraint (e.g., based on the fan acoustic constraint selection rule(s) 260) for the fan(s) 114 of the device 102, 400 (block 714). Based on the default fan acoustic constraint, the fan speed manager 240 of the example thermal constraint manager 132 of FIG. 1 instructs the fan(s) to rotate at speed(s) that generate noise at or under, for instance 35 dBA. In some examples, at block 712, the thermal constraint selector 252 selects a default thermal constraint for the skin temperature of the device 102, 400 so the skin of the device 102, 400 does not exceed a temperature of, for instance, 45° C. in anticipation of the user interacting with the device. Thereafter, control returns to block 702 to detect if user input(s) have been received at the device 102, 400.
If the image data analyzer 220 and/or the motion data analyzer 222 determines the user is not likely to interact with the user device 102 within the threshold time, the fan constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise to more efficiently cool the device 102, 400 (e.g., while the device 102, 400 is in a working power state) and/or to clean the fan(s) 114 (block 716). Also, if the user presence detection analyzer 214 does not detect the presence of a user within the range of sensor(s) 118 (block 700), the fan constraint selector 258 selects a fan acoustic constraint that permits the fan(s) 114 to rotate at increased speeds and, thus, generate more noise to more efficiently cool the device 102, 400 (e.g., while the device 102, 400 is in a working power state) and/or to clean the fan(s) 114. Control proceeds to block 722.
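The branching at blocks 710-716 can be sketched as a mapping from user presence and a predicted likelihood of interaction to a fan noise cap. Here the likelihood score stands in for the machine-learning models 223, 225, and the 38 dBA "user away" limit and the 0.5 decision threshold are assumptions:

```python
# Hypothetical sketch of blocks 710-716: when the user is absent or unlikely
# to interact soon, the fan may run louder to cool the device aggressively
# and/or to clean the fan; otherwise the default acoustic constraint applies
# in anticipation of user interaction.

DEFAULT_FAN_LIMIT_DBA = 35.0  # block 714: default fan acoustic constraint
AWAY_FAN_LIMIT_DBA = 38.0     # block 716: assumed "user away" limit

def fan_limit_for_activity(user_present: bool,
                           interaction_likelihood: float,
                           likely_threshold: float = 0.5) -> float:
    """Map presence and predicted interaction likelihood to a fan noise cap."""
    if not user_present:
        return AWAY_FAN_LIMIT_DBA          # no user detected in sensor range
    if interaction_likelihood >= likely_threshold:
        return DEFAULT_FAN_LIMIT_DBA       # user likely to interact soon
    return AWAY_FAN_LIMIT_DBA              # present but not engaging the device
```

In this sketch the interaction_likelihood would come from the gesture and image models; any presence without a sufficiently high score is treated the same as absence for cooling purposes.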
At block 722, one or more of the user presence detection analyzer 214, the device configuration analyzer 218, the image data analyzer 220, and/or the motion data analyzer 222 determines whether there is a change in user interaction with the user device 102 and/or a change in a likelihood that the user will interact with the user device 102 (block 722). For example, the user presence detection analyzer 214 can detect whether a user is no longer present based on the data generated by the user presence detection sensor(s) 118. In some other examples, the motion data analyzer 222 detects a user is reaching for the pointing device(s) 106 based on the data generated by the motion sensor(s) 123 and the gesture data model(s) 223 after a period of time in which the user was not interacting with the device 102, 400. If the one or more of the user presence detection analyzer 214, the device configuration analyzer 218, the image data analyzer 220, and/or the motion data analyzer 222 detect a change in user interaction with the user device 102 and/or a change in a likelihood of a user interaction with the user device 102, control returns to block 710 to analyze user behavior relative to the device 102. If no change in user interaction with the device 102 and/or likelihood of user interaction is detected, control proceeds to block 734.
The example instructions of FIGS. 7A and 7B continue until the user device 102 enters a sleep mode (block 734), at which time the fan speed manager 240 disables the fan(s) 114 (block 736). If the device 102, 400 returns to a working power state (or, in some examples, a connected standby state) (block 738), the example instructions of FIGS. 7A and 7B resume with detecting presence of the user proximate to the device 102, 400 (and moving component(s) such as the processor 130 and fan(s) 114 to higher power state) (block 700). The example instructions end when the device 102, 400 is powered off (blocks 740, 742).
The machine readable instructions described herein in connection with FIGS. 6 and/or 7A-7B may be stored in one or more of a compressed format, an encrypted format, a fragmented format, a compiled format, an executable format, a packaged format, etc. Machine readable instructions as described herein may be stored as data (e.g., portions of instructions, code, representations of code, etc.) that may be utilized to create, manufacture, and/or produce machine executable instructions. For example, the machine readable instructions may be fragmented and stored on one or more storage devices and/or computing devices (e.g., servers). The machine readable instructions may require one or more of installation, modification, adaptation, updating, combining, supplementing, configuring, decryption, decompression, unpacking, distribution, reassignment, compilation, etc. in order to make them directly readable, interpretable, and/or executable by a computing device and/or other machine. For example, the machine readable instructions may be stored in multiple parts, which are individually compressed, encrypted, and stored on separate computing devices, wherein the parts when decrypted, decompressed, and combined form a set of executable instructions that implement a program such as that described herein.
In another example, the machine readable instructions may be stored in a state in which they may be read by a computer, but require addition of a library (e.g., a dynamic link library (DLL)), a software development kit (SDK), an application programming interface (API), etc. in order to execute the instructions on a particular computing device or other device. In another example, the machine readable instructions may need to be configured (e.g., settings stored, data input, network addresses recorded, etc.) before the machine readable instructions and/or the corresponding program(s) can be executed in whole or in part. Thus, the disclosed machine readable instructions and/or corresponding program(s) are intended to encompass such machine readable instructions and/or program(s) regardless of the particular format or state of the machine readable instructions and/or program(s) when stored or otherwise at rest or in transit.
The machine readable instructions described herein can be represented by any past, present, or future instruction language, scripting language, programming language, etc. For example, the machine readable instructions may be represented using any of the following languages: C, C++, Java, C #, Perl, Python, JavaScript, HyperText Markup Language (HTML), Structured Query Language (SQL), Swift, etc.
As mentioned above, the example processes of FIGS. 6 and/or 7A-7B may be implemented using executable instructions (e.g., computer and/or machine readable instructions) stored on a non-transitory computer and/or machine readable medium such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended time periods, permanently, for brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and to exclude transmission media.
“Including” and “comprising” (and all forms and tenses thereof) are used herein to be open ended terms. Thus, whenever a claim employs any form of “include” or “comprise” (e.g., comprises, includes, comprising, including, having, etc.) as a preamble or within a claim recitation of any kind, it is to be understood that additional elements, terms, etc. may be present without falling outside the scope of the corresponding claim or recitation. As used herein, when the phrase “at least” is used as the transition term in, for example, a preamble of a claim, it is open-ended in the same manner as the term “comprising” and “including” are open ended. The term “and/or” when used, for example, in a form such as A, B, and/or C refers to any combination or subset of A, B, C such as (1) A alone, (2) B alone, (3) C alone, (4) A with B, (5) A with C, (6) B with C, and (7) A with B and with C. As used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. Similarly, as used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. As used herein in the context of describing the performance or execution of processes, instructions, actions, activities and/or steps, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B. 
Similarly, as used herein in the context of describing the performance or execution of processes, instructions, actions, activities and/or steps, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, and (3) at least one A and at least one B.
As used herein, singular references (e.g., “a”, “an”, “first”, “second”, etc.) do not exclude a plurality. The term “a” or “an” entity, as used herein, refers to one or more of that entity. The terms “a” (or “an”), “one or more”, and “at least one” can be used interchangeably herein. Furthermore, although individually listed, a plurality of means, elements or method actions may be implemented by, e.g., a single unit or processor. Additionally, although individual features may be included in different examples or claims, these may possibly be combined, and the inclusion in different examples or claims does not imply that a combination of features is not feasible and/or advantageous.
FIG. 8 is a block diagram of an example processor platform 800 structured to execute the instructions of FIG. 6 to implement the training manager 224 of FIG. 2. The processor platform 800 can be, for example, a server, a personal computer, a workstation, a self-learning machine (e.g., a neural network), a mobile device (e.g., a cell phone, a smart phone, a tablet such as an iPad™), a personal digital assistant (PDA), an Internet appliance, a headset or other wearable device, or any other type of computing device.
The processor platform 800 of the illustrated example includes a processor 224. The processor 224 of the illustrated example is hardware. For example, the processor 224 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor implements the example trainer 226 and the example machine learning engine 228.
The processor 224 of the illustrated example includes a local memory 813 (e.g., a cache). The processor 224 of the illustrated example is in communication with a main memory including a volatile memory 814 and a non-volatile memory 816 via a bus 818. The volatile memory 814 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 816 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 814, 816 is controlled by a memory controller.
The processor platform 800 of the illustrated example also includes an interface circuit 820. The interface circuit 820 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 822 are connected to the interface circuit 820. The input device(s) 822 permit(s) a user to enter data and/or commands into the processor 224. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 824 are also connected to the interface circuit 820 of the illustrated example. The output devices 824 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 820 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 820 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 826. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-site wireless system, a cellular telephone system, etc.
The processor platform 800 of the illustrated example also includes one or more mass storage devices 828 for storing software and/or data. Examples of such mass storage devices 828 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
The machine executable instructions 832 of FIG. 6 may be stored in the mass storage device 828, in the volatile memory 814, in the non-volatile memory 816, and/or on a removable non-transitory computer readable storage medium such as a CD or DVD.
FIG. 9 is a block diagram of an example processor platform 900 structured to execute the instructions of FIGS. 7A and 7B to implement the thermal constraint manager 132 of FIGS. 1 and/or 2. The processor platform 900 can be, for example, a server, a personal computer, a workstation, a self-learning machine (e.g., a neural network), a mobile device (e.g., a cell phone, a smart phone, a tablet such as an iPad™), a personal digital assistant (PDA), an Internet appliance, a headset or other wearable device, or any other type of computing device.
The processor platform 900 of the illustrated example includes a processor 132. The processor 132 of the illustrated example is hardware. For example, the processor 132 can be implemented by one or more integrated circuits, logic circuits, microprocessors, GPUs, DSPs, or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor implements the example user presence detection analyzer 214, the example device configuration analyzer 218, the example image data analyzer 220, the example motion data analyzer 222, the example ambient noise analyzer 234, the example temperature analyzer 236, the example power source manager 238, the example fan speed manager 240, the example timer 244, the example sensor manager 248, the example thermal constraint selector 252, and the example fan acoustic constraint selector 258.
The processor 132 of the illustrated example includes a local memory 913 (e.g., a cache). The processor 132 of the illustrated example is in communication with a main memory including a volatile memory 914 and a non-volatile memory 916 via a bus 918. The volatile memory 914 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®) and/or any other type of random access memory device. The non-volatile memory 916 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 914, 916 is controlled by a memory controller.
The processor platform 900 of the illustrated example also includes an interface circuit 920. The interface circuit 920 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), a Bluetooth® interface, a near field communication (NFC) interface, and/or a PCI express interface.
In the illustrated example, one or more input devices 922 are connected to the interface circuit 920. The input device(s) 922 permit(s) a user to enter data and/or commands into the processor 132. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 924 are also connected to the interface circuit 920 of the illustrated example. The output devices 924 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube display (CRT), an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer and/or speaker. The interface circuit 920 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 920 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 926. The communication can be via, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a line-of-site wireless system, a cellular telephone system, etc.
The processor platform 900 of the illustrated example also includes one or more mass storage devices 928 for storing software and/or data. Examples of such mass storage devices 928 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, redundant array of independent disks (RAID) systems, and digital versatile disk (DVD) drives.
The machine executable instructions 932 of FIGS. 7A and 7B may be stored in the mass storage device 928, in the volatile memory 914, in the non-volatile memory 916, and/or on a removable non-transitory computer readable storage medium such as a CD or DVD.
From the foregoing, it will be appreciated that example methods, apparatus and articles of manufacture have been disclosed that provide for dynamic control of thermal constraints and/or fan acoustic constraints of an electronic user device (e.g., a laptop, a tablet). Examples disclosed herein analyze sensor data indicative of, for instance, user interaction(s) with the device, other user activities (e.g., talking on a phone), and ambient noise to determine if a temperature of a skin of the device can be increased and/or if audible noises associated with rotation of the fan(s) of the device can be increased. Examples disclosed herein detect opportunities for increased skin temperature (e.g., when a user is interacting with the device via an external keyboard) and/or increased fan noise (e.g., when a user is located a threshold distance from the device or in a noisy environment). By permitting the skin temperature of the device to increase, examples disclosed herein enable increased power to be provided to the hardware component(s) of the device and, thus, can improve performance (e.g., processing performance) of the device. By allowing the fan(s) to rotate at increased speed(s) and, thus, generate more noise, examples disclosed herein provide for efficient cooling of the device. The disclosed methods, apparatus and articles of manufacture improve the efficiency of using a computing device by selectively managing thermal constraint(s) for the device to optimize device performance and cooling in view of user interactions with the device and/or ambient conditions. The disclosed methods, apparatus and articles of manufacture are accordingly directed to one or more improvement(s) in the functioning of a computer.
Example methods, apparatus, systems, and articles of manufacture to implement thermal management of electronic user devices are disclosed herein. Further examples and combinations thereof include the following:
Example 1 includes an electronic device including a housing, a fan, a first sensor, a second sensor, and a processor to at least one of analyze first sensor data generated by the first sensor to detect a presence of a subject proximate to the electronic device or analyze second sensor data generated by the second sensor to detect a gesture of the subject, and adjust one or more of an acoustic noise level generated by the fan or a temperature of an exterior surface of the housing based on one or more of the presence of the subject or the gesture.
Example 2 includes the electronic device of example 1, wherein the second sensor includes a camera.
Example 3 includes the electronic device of examples 1 or 2, wherein the processor is to adjust the acoustic noise level by generating an instruction to increase a rotational speed of the fan.
Example 4 includes the electronic device of any of examples 1-3, wherein the processor is to adjust the temperature of the exterior surface of the device by controlling a power source of the device.
Example 5 includes the electronic device of any of examples 1-4, further including a microphone, the processor to analyze third sensor data generated by the microphone to detect ambient noise in an environment including the device, and adjust the acoustic noise level of the fan based on the ambient noise.
Example 6 includes the electronic device of example 1, further including a keyboard carried by the housing, wherein the processor is to detect an input via the keyboard and adjust the temperature of the exterior surface of the housing based on the detection of the input.
Example 7 includes the electronic device of example 1, further including a keyboard external to the housing, wherein the processor is to detect an input via the keyboard and adjust the temperature of the exterior surface of the housing based on the detection of the input.
Example 8 includes the electronic device of example 1, wherein the processor is to adjust the acoustic noise level during cleaning of the fan based on a distance of the user being within a threshold distance from the electronic device.
Example 9 includes an apparatus including a user presence detection analyzer, an image data analyzer, a motion data analyzer, at least one of (a) the user presence detection analyzer to identify a presence of a user relative to an electronic device based on first sensor data generated by a first sensor of the electronic device or (b) at least one of the image data analyzer or the motion data analyzer to determine a gesture of the user relative to the device based on second sensor data generated by a second sensor of the electronic device, a thermal constraint selector to select a thermal constraint for a temperature of an exterior surface of the electronic device based on one or more of the presence of the user or the gesture, and a power source manager to adjust a power level for a processor of the electronic device based on the thermal constraint.
Example 10 includes the apparatus of example 9, further including a device configuration analyzer to detect a presence of an external user input device communicatively coupled to the electronic device.
Example 11 includes the apparatus of example 10, wherein the external device is at least one of a keyboard, a pointing device, or headphones.
Example 12 includes the apparatus of example 9, wherein the second sensor data is image data and the image data analyzer is to determine the gesture based on a machine learning model.
Example 13 includes the apparatus of examples 9 or 12, wherein the second sensor data is image data and wherein the image data analyzer is to detect a position of an eye of the user relative to a display screen of the electronic device.
Example 14 includes the apparatus of example 9, further including a fan acoustic constraint selector to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
Example 15 includes the apparatus of example 14, further including an ambient noise analyzer to determine an ambient noise level based on ambient noise data generated by a microphone of the electronic device, the fan acoustic constraint selector to select the fan acoustic constraint based on the ambient noise level.
Example 16 includes the apparatus of example 14, wherein the user presence detection analyzer is further to determine a distance of the user from the electronic device, the fan acoustic constraint selector to select the fan acoustic constraint based on the distance.
Example 17 includes the apparatus of example 14, wherein the fan acoustic constraint selector is to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
Example 18 includes the apparatus of example 14, wherein the image data analyzer is to detect that the user is wearing headphones based on image data generated by the second sensor, the fan acoustic constraint selector to select the fan acoustic constraint based on the detection of the headphones.
Example 19 includes at least one non-transitory computer readable storage medium including instructions that, when executed, cause a machine to at least identify one or more of (a) a presence of a user relative to an electronic device based on first sensor data generated by a first sensor of the electronic device, (b) a facial feature of the user based on second sensor data generated by a second sensor of the electronic device, or (c) a gesture of the user based on the second sensor data, select a thermal constraint for a temperature of an exterior surface of the electronic device based on one or more of the presence of the user, the facial feature, or the gesture, and adjust a power level for a processor of the electronic device based on the thermal constraint.
Example 20 includes the at least one non-transitory computer readable storage medium of example 19, wherein the instructions, when executed, further cause the machine to detect a presence of an external user input device communicatively coupled to the electronic device.
Example 21 includes the at least one non-transitory computer readable storage medium of example 19, wherein the instructions, when executed, further cause the machine to identify the gesture based on a machine learning model.
Example 22 includes the at least one non-transitory computer readable storage medium of examples 19 or 21, wherein the facial feature includes an eye position and wherein the instructions, when executed, further cause the machine to detect a position of an eye of the user relative to a display screen of the electronic device.
Example 23 includes the at least one non-transitory computer readable storage medium of examples 19 or 20, wherein the instructions, when executed, further cause the machine to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
Example 24 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to determine an ambient noise level based on ambient noise data generated by a microphone of the electronic device, and select the fan acoustic constraint based on the ambient noise level.
Example 25 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to detect that the user is wearing headphones based on image data generated by the second sensor, and select the fan acoustic constraint based on the detection of the headphones.
Example 26 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to determine a distance of the user from the electronic device and select the fan acoustic constraint based on the distance.
Example 27 includes the at least one non-transitory computer readable storage medium of example 23, wherein the instructions, when executed, further cause the machine to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
Example 28 includes a method including at least one of (a) identifying a presence of a user relative to an electronic device based on first sensor data generated by a first sensor of the electronic device, (b) identifying a facial feature of the user based on second sensor data generated by a second sensor of the electronic device, or (c) identifying a gesture of the user based on the second sensor data, selecting a thermal constraint for a temperature of an exterior surface of the electronic device based on one or more of the presence of the user, the facial feature, or the gesture, and adjusting a power level for a processor of the electronic device based on the thermal constraint.
Example 29 includes the method of example 28, further including detecting a presence of an external user input device communicatively coupled to the electronic device.
Example 30 includes the method of example 28, further including determining the one or more of the facial feature or the gesture based on a machine learning model.
Example 31 includes the method of examples 28 or 30, wherein the facial feature includes eye position and further including detecting a position of an eye of the user relative to a display screen of the electronic device.
Example 32 includes the method of examples 28 or 29, further including selecting a fan acoustic constraint for a noise level to be generated by a fan of the electronic device.
Example 33 includes the method of example 32, further including determining an ambient noise level based on ambient noise data generated by a microphone of the electronic device, and selecting the fan acoustic constraint based on the ambient noise level.
Example 34 includes the method of example 32, further including detecting that the user is wearing headphones based on image data generated by the second sensor, and selecting the fan acoustic constraint based on the detection of the headphones.
Example 35 includes the method of example 32, further including determining a distance of the user from the electronic device and selecting the fan acoustic constraint based on the distance.
Example 36 includes the method of example 32, further including selecting the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
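The presence, recent-input, and gesture sequencing that runs through the examples above (and through the claims below) can be sketched as a simple decision flow. This is a hypothetical sketch: the function name, the string labels, and the 10-second input threshold are assumptions for illustration only.

```python
# Assumed threshold for "recent" user input; the disclosure does not
# prescribe a specific value.
INPUT_THRESHOLD_S = 10.0


def pick_thermal_constraint(subject_present: bool,
                            last_input_age_s: float,
                            gesture_suggests_interaction: bool) -> str:
    if not subject_present:
        # No one nearby: skin temperature may rise in favor of performance.
        return "relaxed"
    if last_input_age_s <= INPUT_THRESHOLD_S:
        # Active hands-on use: keep the exterior surface cool.
        return "conservative"
    # No recent input: fall back to gesture analysis to predict whether
    # the user is about to interact with the device.
    return "conservative" if gesture_suggests_interaction else "relaxed"
```

For example, a present user with no input for a minute and no interaction-predicting gesture yields the relaxed constraint, while the same user reaching toward the keyboard yields the conservative one.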
Although certain example methods, apparatus and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus and articles of manufacture fairly falling within the scope of the claims of this patent.
The following claims are hereby incorporated into this Detailed Description by this reference, with each claim standing on its own as a separate embodiment of the present disclosure.

Claims (38)

What is claimed is:
1. An electronic device comprising:
a housing;
a fan;
a first sensor;
a second sensor; and
a processor to:
analyze first sensor data associated with signals output by the first sensor to detect a presence of a subject proximate to the electronic device;
in response to the detection of the presence of the subject, detect if a user input has been received at the electronic device within a threshold period of time;
when the user input is detected within the threshold period of time, adjust a temperature of an exterior surface of the housing;
when the user input is not detected within the threshold period of time, analyze second sensor data associated with signals output by the second sensor to detect a gesture of the subject, the gesture indicative of a likelihood of a user interaction with the electronic device within the threshold period of time; and
adjust one or more of an acoustic noise level generated by the fan or the temperature of the exterior surface of the housing based on the detection of the gesture.
2. The electronic device of claim 1, wherein the processor is to adjust the acoustic noise level by generating an instruction to increase a rotational speed of the fan.
3. The electronic device of claim 1, wherein the processor is to adjust the temperature of the exterior surface of the device by controlling a power source of the device.
4. The electronic device of claim 1, further including a microphone, the processor to:
analyze third sensor data associated with signals output by the microphone to detect ambient noise in an environment including the electronic device; and
adjust the acoustic noise level of the fan based on the ambient noise.
5. The electronic device of claim 1, further including a keyboard carried by the housing, wherein the processor is to detect the user input via the keyboard carried by the housing and adjust the temperature of the exterior surface of the housing based on the detection of the user input via the keyboard carried by the housing.
6. The electronic device of claim 1, further including a keyboard external to the housing, wherein the processor is to detect the user input via the external keyboard and adjust the temperature of the exterior surface of the housing based on the detection of the user input via the external keyboard.
7. The electronic device of claim 1, wherein the processor is to:
determine a distance of the user from the electronic device based on the first sensor data; and
adjust the acoustic noise level during cleaning of the fan based on the distance of the user being within a threshold distance from the electronic device.
8. An electronic device comprising:
a housing having an exterior surface;
an image data analyzer;
a motion data analyzer, at least one of the image data analyzer or the motion data analyzer to identify a gesture of a user relative to the electronic device based on sensor data associated with signals output by a sensor of the electronic device, the gesture indicative of a likelihood of a user interaction with the electronic device within a threshold period of time;
a thermal constraint selector to select a thermal constraint for a temperature of the exterior surface of the housing of the electronic device based on the identification of the gesture; and
a power source manager to adjust a power level for a processor of the electronic device based on the thermal constraint.
9. The electronic device of claim 8, further including a device configuration analyzer to detect a presence of an external user input device communicatively coupled to the electronic device.
10. The electronic device of claim 8, wherein the sensor data is image data and wherein the image data analyzer is to identify a position of an eye of the user relative to a display screen of the electronic device.
11. The electronic device of claim 8, further including a fan acoustic constraint selector to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
12. The electronic device of claim 11, further including a user presence detection sensor to determine a distance of the user from the electronic device, the fan acoustic constraint selector to select the fan acoustic constraint based on the distance.
13. The electronic device of claim 11, wherein the fan acoustic constraint selector is to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
14. The electronic device of claim 11, wherein the image data analyzer is to detect that the user is wearing headphones based on image data generated by the sensor, the fan acoustic constraint selector to select the fan acoustic constraint based on the detection of the headphones.
15. At least one non-transitory computer readable storage medium comprising instructions that, when executed, cause an electronic device to at least:
identify a presence of a user relative to the electronic device based on first sensor data associated with signals output by a first sensor of the electronic device;
in response to the identification of the presence of the user, determine whether a user input has been received at the electronic device within a threshold period of time;
when the user input is received within the threshold period of time, select a first thermal constraint for the electronic device;
when the user input is not received within the threshold period of time, identify one or more of a facial feature of the user based on second sensor data associated with signals output by a second sensor of the electronic device, or a gesture performed by the user based on the second sensor data, the one or more of the facial feature or the gesture indicative of a likelihood of a user interaction with the electronic device within the threshold period of time;
select a second thermal constraint for the electronic device based on the identification of the one or more of the facial feature or the gesture; and
adjust a power level for a processor of the electronic device based on the selected one of the first thermal constraint or the second thermal constraint.
16. The at least one non-transitory computer readable storage medium of claim 15, wherein the instructions, when executed, further cause the electronic device to detect a presence of an external user input device communicatively coupled to the electronic device, the user input received via the external user input device.
17. The at least one non-transitory computer readable storage medium of claim 15, wherein the instructions, when executed, further cause the electronic device to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
18. The at least one non-transitory computer readable storage medium of claim 17, wherein the instructions, when executed, further cause the electronic device to determine an ambient noise level based on ambient noise data generated by a microphone of the electronic device, and select the fan acoustic constraint based on the ambient noise level.
19. The at least one non-transitory computer readable storage medium of claim 17, wherein the instructions, when executed, further cause the electronic device to determine a distance of the user from the electronic device and select the fan acoustic constraint based on the distance.
20. The at least one non-transitory computer readable storage medium of claim 17, wherein the instructions, when executed, further cause the electronic device to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
21. An electronic device comprising:
at least one memory;
instructions in the electronic device; and
processor circuitry to execute the instructions to:
identify one or more of a facial feature of a user of the electronic device based on sensor data associated with signals output by a sensor of the electronic device or a gesture of the user based on the sensor data, the one or more of the facial feature or the gesture indicative of a likelihood of an interaction of the user with the electronic device within a threshold period of time;
select a thermal constraint for a temperature of an exterior surface of the electronic device based on the identification of the one or more of the facial feature or the gesture; and
adjust a power level for the processor circuitry of the electronic device based on the thermal constraint.
22. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to detect a presence of an external user input device communicatively coupled to the electronic device.
23. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to select a fan acoustic constraint for a noise level to be generated by a fan of the electronic device during operation of the fan.
24. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to determine an ambient noise level based on ambient noise data generated by a microphone of the electronic device, the fan acoustic constraint selector to select the fan acoustic constraint based on the ambient noise level.
25. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to determine a distance of the user from the electronic device and select the fan acoustic constraint based on the distance.
26. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to select the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
27. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to determine a distance of the user relative to the electronic device and select the thermal constraint based on the distance.
28. A method comprising:
at least one of (a) identifying, by executing an instruction with at least one processor, a facial feature of a user of an electronic device based on sensor data associated with signals output by a sensor of the electronic device or (b) identifying, by executing an instruction with the at least one processor, a gesture performed by the user based on the sensor data, the facial feature or the gesture indicative of a likelihood of a user interaction with the electronic device;
selecting a thermal constraint for a temperature of an exterior surface of the electronic device based on the identification of the one or more of the facial feature or the gesture; and
adjusting, by executing an instruction with the at least one processor, a power level for the at least one processor of the electronic device based on the thermal constraint.
29. The method of claim 28, further including detecting a presence of an external user input device communicatively coupled to the electronic device.
30. The method of claim 28, further including determining the one or more of the facial feature or the gesture based on a machine learning model.
31. The method of claim 28, wherein the facial feature includes eye position and further including detecting a position of an eye of the user relative to a display screen of the electronic device.
32. The method of claim 28, further including selecting a fan acoustic constraint for a noise level to be generated by a fan of the electronic device.
33. The method of claim 32, further including determining an ambient noise level based on ambient noise data generated by a microphone of the electronic device, and selecting the fan acoustic constraint based on the ambient noise level.
34. The method of claim 32, further including detecting that the user is wearing headphones based on image data generated by the sensor and selecting the fan acoustic constraint based on the detection of the headphones.
35. The method of claim 32, further including determining a distance of the user from the electronic device and selecting the fan acoustic constraint based on the distance.
36. The method of claim 32, further including selecting the fan acoustic constraint for the noise level to be generated by the fan during cleaning of the fan.
37. The electronic device of claim 9, wherein the thermal constraint selector is to select the thermal constraint based on the detection of the external user input device.
38. The electronic device of claim 21, wherein the processor circuitry is to execute the instructions to:
detect a presence of one of (a) a first user input device communicatively coupled to the electronic device, the first user input device external to the electronic device, or (b) a second user input device, the second user input device carried by the electronic device;
in response to detecting the presence of the first user input device, select the thermal constraint as a first thermal constraint; and
in response to detecting the presence of the second user input device, select the thermal constraint as a second thermal constraint, the first thermal constraint associated with a higher temperature for the exterior surface of the housing than the second thermal constraint.
US16/728,774 2019-12-27 2019-12-27 Apparatus and methods for thermal management of electronic user devices based on user activity Active 2040-01-26 US11360528B2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US16/728,774 US11360528B2 (en) 2019-12-27 2019-12-27 Apparatus and methods for thermal management of electronic user devices based on user activity
CN202010964468.8A CN113050774A (en) 2019-12-27 2020-09-15 Apparatus and method for thermal management of electronic user equipment
EP20197335.1A EP3865977A1 (en) 2019-12-27 2020-09-22 Apparatus and methods for thermal management of electronic user devices
US17/732,173 US11966268B2 (en) 2019-12-27 2022-04-28 Apparatus and methods for thermal management of electronic user devices based on user activity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/728,774 US11360528B2 (en) 2019-12-27 2019-12-27 Apparatus and methods for thermal management of electronic user devices based on user activity

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/732,173 Continuation US11966268B2 (en) 2019-12-27 2022-04-28 Apparatus and methods for thermal management of electronic user devices based on user activity

Publications (2)

Publication Number Publication Date
US20200133358A1 US20200133358A1 (en) 2020-04-30
US11360528B2 true US11360528B2 (en) 2022-06-14

Family

ID=70326974

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/728,774 Active 2040-01-26 US11360528B2 (en) 2019-12-27 2019-12-27 Apparatus and methods for thermal management of electronic user devices based on user activity
US17/732,173 Active US11966268B2 (en) 2019-12-27 2022-04-28 Apparatus and methods for thermal management of electronic user devices based on user activity

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/732,173 Active US11966268B2 (en) 2019-12-27 2022-04-28 Apparatus and methods for thermal management of electronic user devices based on user activity

Country Status (3)

Country Link
US (2) US11360528B2 (en)
EP (1) EP3865977A1 (en)
CN (1) CN113050774A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200133374A1 (en) * 2019-11-11 2020-04-30 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US20200366556A1 (en) * 2014-05-19 2020-11-19 Ebay Inc. Phone thermal context
US20210298206A1 (en) * 2020-03-17 2021-09-23 International Business Machines Corporation Intelligently deployed cooling fins
US20220334620A1 (en) 2019-05-23 2022-10-20 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US11543873B2 (en) 2019-09-27 2023-01-03 Intel Corporation Wake-on-touch display screen devices and related methods
US11809535B2 (en) 2019-12-23 2023-11-07 Intel Corporation Systems and methods for multi-modal user device authentication
US11966268B2 (en) 2019-12-27 2024-04-23 Intel Corporation Apparatus and methods for thermal management of electronic user devices based on user activity

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11194398B2 (en) 2015-09-26 2021-12-07 Intel Corporation Technologies for adaptive rendering using 3D sensors
US11301009B2 (en) 2019-06-04 2022-04-12 Softiron Limited Fan control for computing devices
US11314311B2 (en) * 2019-09-20 2022-04-26 Dell Products, L.P. Battery runtime and performance management based upon presence detection
US11350543B2 (en) * 2020-04-17 2022-05-31 Dell Products L.P. Systems and methods for acoustic limits of thermal control system in an information handling system
JP2022081090A (en) * 2020-11-19 2022-05-31 レノボ・シンガポール・プライベート・リミテッド Information processor and control method
US11836062B2 (en) * 2021-07-21 2023-12-05 Dell Products L.P. System and method of managing acoustics of information handling systems
US20230070036A1 (en) * 2021-09-03 2023-03-09 Dell Products L.P. System and method of configuring an information handling system based at least on an ambient temperature
US20230400900A1 (en) * 2022-06-14 2023-12-14 Dell Products, L.P. Managing thermal and acoustic characteristics of an information handling system (ihs) based on the use of external peripheral devices

Citations (165)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5173940A (en) 1991-06-21 1992-12-22 Compaq Computer Corporation Keyboard activated screen blanking
US20020091738A1 (en) 2000-06-12 2002-07-11 Rohrabaugh Gary B. Resolution independent vector display of internet content
US20030043174A1 (en) 2001-08-29 2003-03-06 Hinckley Kenneth P. Automatic scrolling
US20030174149A1 (en) 2002-02-06 2003-09-18 Hitomi Fujisaki Apparatus and method for data-processing
US6657647B1 (en) 2000-09-25 2003-12-02 Xoucin, Inc. Controlling the order in which content is displayed in a browser
US6760649B2 (en) * 2002-05-22 2004-07-06 International Business Machines Corporation Thermal management of a laptop computer
US20040158739A1 (en) 1997-03-24 2004-08-12 Canon Kabushiki Kaisha Information processing apparatus for performing processing dependent on presence/absence of user, and method therefor
US20040175020A1 (en) 2003-03-05 2004-09-09 Bradski Gary R. Method and apparatus for monitoring human attention in dynamic power management
US20040252101A1 (en) 2003-06-12 2004-12-16 International Business Machines Corporation Input device that detects user's proximity
US20050071698A1 (en) 2003-09-30 2005-03-31 Kangas Paul Daniel Apparatus, system, and method for autonomic power adjustment in an electronic device
US20060192775A1 (en) 2005-02-25 2006-08-31 Microsoft Corporation Using detected visual cues to change computer system operating states
US20080046425A1 (en) 2006-08-15 2008-02-21 N-Trig Ltd. Gesture detection for a digitizer
US20080112571A1 (en) 2006-11-09 2008-05-15 Thomas Michael Bradicich Noise control in proximity to a computer system
US7386799B1 (en) 2002-11-21 2008-06-10 Forterra Systems, Inc. Cinematic techniques in avatar-centric communication during a multi-user online simulation
US20080158144A1 (en) 2004-03-18 2008-07-03 Koninklijke Philips Electronics, N.V. Scanning Display Apparatus
US20080301300A1 (en) 2007-06-01 2008-12-04 Microsoft Corporation Predictive asynchronous web pre-fetch
US20090092293A1 (en) 2007-10-03 2009-04-09 Micro-Star Int'l Co., Ltd. Method for determining power-save mode of multimedia application
US20090165125A1 (en) 2007-12-19 2009-06-25 Research In Motion Limited System and method for controlling user access to a computing device
US7559034B1 (en) 2000-10-19 2009-07-07 DG FastChannel, Inc. Method and system for using a hyperlink, banner, or graphical icon to initiate the overlaying of an object on a window
US7725547B2 (en) 2006-09-06 2010-05-25 International Business Machines Corporation Informing a user of gestures made by others out of the user's line of sight
WO2010071631A1 (en) 2008-12-15 2010-06-24 Hewlett-Packard Development Company, L.P. Temperature threshold adjustment based on human detection
US20100281432A1 (en) 2009-05-01 2010-11-04 Kevin Geisner Show body position
US20110055752A1 (en) 2009-06-04 2011-03-03 Rubinstein Jonathan J Method and Apparatus for Displaying and Auto-Correcting an Over-Scroll State on a Computing Device
US20110154266A1 (en) 2009-12-17 2011-06-23 Microsoft Corporation Camera navigation for presentations
US7971156B2 (en) 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user
US20110175932A1 (en) 2010-01-21 2011-07-21 Tobii Technology Ab Eye tracker based contextual action
CN102197349A (en) 2008-10-22 2011-09-21 微软公司 Conserving power using predictive modelling and signaling
US20110252339A1 (en) 2010-04-12 2011-10-13 Google Inc. Collaborative Cursors in a Hosted Word Processor
US20110248918A1 (en) 2010-04-07 2011-10-13 Samsung Electronics Co., Ltd. Method for suspension sensing in interactive display, method for processing suspension sensing image, and proximity sensing apparatus
US20110296163A1 (en) 2009-02-20 2011-12-01 Koninklijke Philips Electronics N.V. System, method and apparatus for causing a device to enter an active mode
US20110298967A1 (en) 2010-06-04 2011-12-08 Microsoft Corporation Controlling Power Levels Of Electronic Devices Through User Interaction
US20110298702A1 (en) 2009-12-14 2011-12-08 Kotaro Sakata User interface device and input method
US20120032894A1 (en) 2010-08-06 2012-02-09 Nima Parivar Intelligent management for an electronic device
US20120054670A1 (en) 2010-08-27 2012-03-01 Nokia Corporation Apparatus and method for scrolling displayed information
US20120062470A1 (en) 2010-09-10 2012-03-15 Chang Ray L Power Management
US8139032B2 (en) 2008-12-22 2012-03-20 Kuo-Hsin Su Power-saving computer mouse
EP2518586A1 (en) 2011-04-25 2012-10-31 Sunon Electronics (Kunshan) Co., Ltd. Cooling system for a portable electronic device
US20120300061A1 (en) 2011-05-25 2012-11-29 Sony Computer Entertainment Inc. Eye Gaze to Alter Device Behavior
US20120319997A1 (en) 2011-06-20 2012-12-20 The Regents Of The University Of California Scalable distributed/cooperative/collaborative paradigm for multi-user interaction with projection-based display walls
US20130007590A1 (en) 2011-06-30 2013-01-03 Apple Inc. List view optimization
US20130021750A1 (en) * 2010-07-08 2013-01-24 Hewlett-Packard Development Company, L.P. Electronic device thermal management
US20130120460A1 (en) 2011-11-14 2013-05-16 Microsoft Corporation Animations for Scroll and Zoom
US20130173946A1 (en) 2011-12-29 2013-07-04 Efraim Rotem Controlling power consumption through multiple power limits over multiple time intervals
US20130174016A1 (en) 2011-12-29 2013-07-04 Chegg, Inc. Cache Management in HTML eReading Application
US20130185633A1 (en) 2012-01-16 2013-07-18 Microsoft Corporation Low resolution placeholder content for document navigation
US20130207895A1 (en) 2012-02-15 2013-08-15 Samsung Electronics Co., Ltd. Eye tracking method and display apparatus using the same
US20130212462A1 (en) 2011-09-14 2013-08-15 Nokia Corporation Method and apparatus for distributed script processing
US20130222329A1 (en) 2012-02-29 2013-08-29 Lars-Johan Olof LARSBY Graphical user interface interaction on a touch-sensitive device
US8566696B1 (en) 2011-07-14 2013-10-22 Google Inc. Predicting user navigation events
US20130283213A1 (en) 2012-03-26 2013-10-24 Primesense Ltd. Enhanced virtual touchpad
US20130289792A1 (en) * 2012-04-27 2013-10-31 Chao-Wen Cheng Thermal Management
US20130321265A1 (en) 2011-02-09 2013-12-05 Primesense Ltd. Gaze-Based Display Control
US20130332760A1 (en) 2012-06-08 2013-12-12 Russell Dean Reece Thermal-based acoustic management
US20140006830A1 (en) 2012-06-29 2014-01-02 Intel Corporation User behavior adaptive sensing scheme for efficient power consumption management
US20140085451A1 (en) 2012-09-24 2014-03-27 Fujitsu Limited Gaze detection apparatus, gaze detection computer program, and display apparatus
US20140089865A1 (en) 2012-09-24 2014-03-27 Co-Operwrite Limited Handwriting recognition server
US8717318B2 (en) 2011-03-29 2014-05-06 Intel Corporation Continued virtual links between gestures and user interface elements
US20140129937A1 (en) 2012-11-08 2014-05-08 Nokia Corporation Methods, apparatuses and computer program products for manipulating characteristics of audio objects by using directional gestures
US20140139456A1 (en) 2012-10-05 2014-05-22 Tactual Labs Co. Hybrid systems and methods for low-latency user input processing and feedback
US20140149935A1 (en) 2012-11-28 2014-05-29 Michael Dudley Johnson User-Intent-Based Chrome
US20140189579A1 (en) 2013-01-02 2014-07-03 Zrro Technologies (2009) Ltd. System and method for controlling zooming and/or scrolling
US20140191995A1 (en) 2009-04-24 2014-07-10 Cypress Semiconductor Corporation Touch Identification for Multi-Touch Technology
US20140201690A1 (en) 2013-01-15 2014-07-17 Leap Motion, Inc. Dynamic user interactions for display control and scaling responsiveness of display objects
US20140208260A1 (en) 2013-01-18 2014-07-24 Panasonic Corporation Scrolling apparatus, scrolling method, and computer-readable medium
US8812831B2 (en) 2010-09-30 2014-08-19 International Business Machines Corporation Fan control method and apparatus for adjusting initial fan speed based on a discreteness level of installed devices and calibrating fan speed according to threshold power and adjusted initial speed
WO2014131188A1 (en) 2013-02-28 2014-09-04 Hewlett-Packard Development Company, L.P. Input for portable computing device based on predicted input
US20140258942A1 (en) 2013-03-05 2014-09-11 Intel Corporation Interaction of multiple perceptual sensing inputs
US20140267021A1 (en) 2013-03-14 2014-09-18 Samsung Electronics Co., Ltd. Display control method and apparatus
US20140281918A1 (en) 2013-03-15 2014-09-18 Yottaa Inc. Systems and methods for configuration-based optimization by an intermediary
WO2014186294A1 (en) 2013-05-15 2014-11-20 Advanced Micro Devices, Inc. Method and system for power management
US20140361977A1 (en) 2013-06-07 2014-12-11 Sony Computer Entertainment Inc. Image rendering responsive to user actions in head mounted display
US20140372511A1 (en) 2013-06-14 2014-12-18 Microsoft Corporation Content Pre-Render and Pre-Fetch Techniques
WO2014205227A2 (en) 2013-06-20 2014-12-24 Bank Of America Corporation Utilizing voice biometrics
US20140380075A1 (en) 2013-06-19 2014-12-25 Microsoft Corporation Selective Blocking of Background Activity
US20150009238A1 (en) 2013-07-03 2015-01-08 Nvidia Corporation Method for zooming into and out of an image shown on a display
US20150015688A1 (en) 2013-07-09 2015-01-15 HTC Corporation Facial unlock mechanism using light level determining module
US8954884B1 (en) 2013-11-18 2015-02-10 Maestro Devices, LLC Navigation system for viewing an image data-stack in less time with less effort and less repetitive motions
US8994847B2 (en) 2009-04-07 2015-03-31 Mediatek Inc. Digital camera and image capturing method
US20150100884A1 (en) 2013-10-08 2015-04-09 Nvidia Corporation Hardware overlay assignment
US20150121193A1 (en) 2013-10-24 2015-04-30 Vmware, Inc. User interface virtualization for web applications
US20150121287A1 (en) 2006-07-03 2015-04-30 Yoram Ben-Meir System for generating and controlling a variably displayable mobile device keypad/virtual keyboard
US20150177843A1 (en) 2013-12-23 2015-06-25 Samsung Electronics Co., Ltd. Device and method for displaying user interface of virtual input device based on motion recognition
US20150185909A1 (en) 2012-07-06 2015-07-02 Freescale Semiconductor, Inc. Method of sensing a user input to a capacitive touch sensor, a capacitive touch sensor controller, an input device and an apparatus
US20150193395A1 (en) 2012-07-30 2015-07-09 Google Inc. Predictive link pre-loading
US20150220149A1 (en) 2012-02-14 2015-08-06 Google Inc. Systems and methods for a virtual grasping user interface
US20150220150A1 (en) 2012-02-14 2015-08-06 Google Inc. Virtual touch user interface system and methods
US20150248167A1 (en) 2014-02-28 2015-09-03 Microsoft Corporation Controlling a computing-based device using gestures
US20150264572A1 (en) 2010-11-29 2015-09-17 Biocatch Ltd. System, method, and device of detecting identity of a user of an electronic device
US20150360567A1 (en) 2013-01-21 2015-12-17 Toyota Jidosha Kabushiki Kaisha User interface apparatus and input acquiring method
US20150363070A1 (en) 2011-08-04 2015-12-17 Itay Katz System and method for interfacing with a device via a 3d display
US20160034019A1 (en) 2014-07-30 2016-02-04 Samsung Electronics Co., Ltd. Display apparatus and control method for controlling power consumption thereof
US9268434B2 (en) 2013-02-14 2016-02-23 Dell Products L.P. Systems and methods for reducing power consumption in a touch sensor display
US20160062584A1 (en) 2014-08-27 2016-03-03 Apple Inc. Anchoring viewport
US20160087981A1 (en) 2013-04-29 2016-03-24 Baseline Automatisering B.V. Method for Authentication, Server, Device and Data Carrier
US20160091938A1 (en) * 2014-09-25 2016-03-31 Intel Corporation System and method for adaptive thermal and performance management in electronic devices
US9311909B2 (en) * 2012-09-28 2016-04-12 Microsoft Technology Licensing, Llc Sensed sound level based fan speed adjustment
US20160109961A1 (en) 2013-06-20 2016-04-21 Uday Parshionikar Systems, methods, apparatuses, computer readable medium for controlling electronic devices
US20160116960A1 (en) 2014-10-24 2016-04-28 Ati Technologies Ulc Power management using external sensors and data
US20160132099A1 (en) 2014-11-10 2016-05-12 Novi Security, Inc. Security Sensor Power Management
US20160170617A1 (en) 2014-12-11 2016-06-16 Cisco Technology, Inc. Automatic active region zooming
US20160180762A1 (en) 2014-12-22 2016-06-23 Elwha Llc Systems, methods, and devices for controlling screen refresh rates
US20160179767A1 (en) 2014-12-22 2016-06-23 Prasanna Bhat Mavinakuli Architecture for an application with integrated dynamic content
US20160187994A1 (en) 2014-12-29 2016-06-30 Samsung Electronics Co., Ltd. Foldable device and method of controlling the same
US20160212317A1 (en) 2015-01-15 2016-07-21 Motorola Mobility Llc 3d ir illumination for iris authentication
US20160232701A1 (en) 2015-02-05 2016-08-11 Blackberry Limited Devices and methods for rendering graphics data
US9436241B2 (en) 2013-06-26 2016-09-06 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Electronic device and method for adjusting fan of electronic device
US20160259467A1 (en) 2015-03-02 2016-09-08 Apple Inc. Snr-aware active mode touch scans
US20160297362A1 (en) 2015-04-09 2016-10-13 Ford Global Technologies, Llc Vehicle exterior side-camera systems and methods
US20170034146A1 (en) 2015-07-30 2017-02-02 Optim Corporation User terminal and method for screen sharing
US20170039170A1 (en) 2015-08-04 2017-02-09 Google Inc. Systems and methods for interactively presenting a visible portion of a rendering surface on a user device
US20170085790A1 (en) 2015-09-23 2017-03-23 Microsoft Technology Licensing, Llc High-resolution imaging of regions of interest
US20170090585A1 (en) 2015-09-26 2017-03-30 Bryan G. Bernhart Technologies for adaptive rendering using 3d sensors
US20170201254A1 (en) 2013-05-29 2017-07-13 Ingar Hanssen Multi-State Capacitive Button
US9721383B1 (en) 2013-08-29 2017-08-01 Leap Motion, Inc. Predictive information for free space gesture control and communication
US20170219240A1 (en) 2016-02-03 2017-08-03 Avaya Inc. Method and apparatus for a fan auto adaptive noise
CN107077184A (en) 2014-06-27 2017-08-18 英特尔公司 System standby emulation with fast quick-recovery
US9740290B2 (en) 1999-12-17 2017-08-22 Immersion Corporation Haptic feedback for touchpads and other touch controls
US20170269725A1 (en) 2016-03-21 2017-09-21 Samsung Electronics Co., Ltd. Electronic device for touch and finger scan sensor input and control method thereof
US9785234B2 (en) 2015-12-26 2017-10-10 Intel Corporation Analysis of ambient light for gaze tracking
US20170321856A1 (en) 2016-05-04 2017-11-09 Intel Corporation Display backlighting using ambient light
US9846471B1 (en) 2015-02-12 2017-12-19 Marvell International Ltd. Systems and methods for power management in devices
US20180029370A1 (en) 2016-08-01 2018-02-01 Canon Kabushiki Kaisha Printing apparatus and performance maintaining method
US20180039990A1 (en) 2016-08-05 2018-02-08 Nok Nok Labs, Inc. Authentication techniques including speech and/or lip movement analysis
US20180039410A1 (en) 2014-07-25 2018-02-08 Lg Electronics Inc. Mobile terminal and control method thereof
EP3285133A1 (en) 2016-08-19 2018-02-21 Samsung Electronics Co., Ltd. Foldable device and method of controlling the same
KR20180029370A (en) 2016-09-12 2018-03-21 삼성전자주식회사 foldable electronic device with flexible display
US20180136719A1 (en) 2016-11-16 2018-05-17 Wuhan China Star Optoelectronics Technology Co., Ltd. Image brightness adjusting method and image brightness adjusting device
US9996638B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Predictive information for free space gesture control and communication
US20180164942A1 (en) 2016-12-12 2018-06-14 Microsoft Technology Licensing, Llc Apparatus and method of adjusting power mode of a display of a device
US20180188774A1 (en) 2016-12-31 2018-07-05 Lenovo (Singapore) Pte. Ltd. Multiple display device
US20180189547A1 (en) 2016-12-30 2018-07-05 Intel Corporation Biometric identification system
US10027662B1 (en) 2016-12-06 2018-07-17 Amazon Technologies, Inc. Dynamic user authentication
US20180224871A1 (en) 2017-02-03 2018-08-09 Qualcomm Incorporated System and method for thermal management of a wearable computing device based on proximity to a user
US10101817B2 (en) 2015-03-03 2018-10-16 Intel Corporation Display interaction detection
US20180321731A1 (en) 2017-05-04 2018-11-08 Dell Products, L.P. System and method for heuristics based user presence detection for power management
US20190034609A1 (en) 2017-07-31 2019-01-31 Stmicroelectronics, Inc. Human presence detection
US20190079572A1 (en) 2011-06-17 2019-03-14 Sony Corporation Electronic device, control method of electronic device, and program
US10254178B2 (en) 2014-03-28 2019-04-09 Intel Corporation Ambient temperature estimation
US10262599B2 (en) 2017-03-24 2019-04-16 Intel Corporation Display backlight brightness adjustment
US20190174419A1 (en) 2012-07-20 2019-06-06 Facebook, Inc. Adjusting mobile device state based on user intentions and/or identity
US20190239384A1 (en) * 2018-01-31 2019-08-01 Dell Products L.P. Systems and methods for detecting impeded cooling air flow for information handling system chassis enclosures
US20190250691A1 (en) 2018-02-09 2019-08-15 Samsung Electronics Co., Ltd. Mobile device including context hub and operation method thereof
US20190258785A1 (en) 2018-02-17 2019-08-22 Motorola Mobility Llc Methods and Systems for Electronic Device Concealed Monitoring
US20190265831A1 (en) 2018-02-23 2019-08-29 Cirrus Logic International Semiconductor Ltd. Method and system for an electronic device
US20190278339A1 (en) 2019-05-23 2019-09-12 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US20190371326A1 (en) 2015-11-24 2019-12-05 Intel IP Corporation Low resource key phrase detection for wake on voice
US20190371342A1 (en) 2018-06-05 2019-12-05 Samsung Electronics Co., Ltd. Methods and systems for passive wakeup of a user interaction device
US20200012331A1 (en) 2017-06-02 2020-01-09 Apple Inc. Techniques for adjusting computing device sleep states using onboard sensors and learned user behaviors
US20200026342A1 (en) 2019-09-27 2020-01-23 Intel Corporation Wake-on-touch display screen devices and related methods
US20200033920A1 (en) 2018-07-28 2020-01-30 Microsoft Technology Licensing, Llc Optimized touch temperature thermal management
US10551888B1 (en) * 2018-08-13 2020-02-04 Dell Products L.P. Skin transition thermal control for convertible information handling systems
US10620786B2 (en) 2016-03-07 2020-04-14 Intel Corporation Technologies for event notification interface management
US20200125158A1 (en) 2018-10-22 2020-04-23 Google Llc Smartphone-Based Radar System for Determining User Intention in a Lower-Power Mode
US20200133374A1 (en) 2019-11-11 2020-04-30 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US20200134151A1 (en) 2019-12-23 2020-04-30 Intel Corporation Systems and methods for multi-modal user device authentication
US10725510B2 (en) * 2018-03-16 2020-07-28 Microsoft Technology Licensing, Llc Device configuration-based thermal management control
US10740912B2 (en) 2016-05-19 2020-08-11 Intel Corporation Detection of humans in images using depth information
US20200259638A1 (en) 2019-02-08 2020-08-13 Keyless Technologies Ltd Authentication processing service
WO2020191643A1 (en) 2019-03-27 2020-10-01 Intel Corporation Smart display panel apparatus and related methods
US10819920B1 (en) 2019-05-22 2020-10-27 Dell Products L.P. Augmented information handling system user presence detection
US20200348745A1 (en) 2019-05-02 2020-11-05 Dell Products L.P. Information handling system power control sensor
US20210025976A1 (en) 2019-07-26 2021-01-28 Google Llc Reducing a State Based on IMU and Radar
US20210109585A1 (en) 2020-12-21 2021-04-15 Intel Corporation Methods and apparatus to improve user experience on computing devices
US20210240254A1 (en) 2020-01-31 2021-08-05 Dell Products L.P. Information handling system peripheral enhanced user presence detection
US20210318743A1 (en) 2018-12-03 2021-10-14 Hewlett-Packard Development Company, L.P. Sensing audio information and footsteps to control power

Family Cites Families (163)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD324036S (en) 1989-12-06 1992-02-18 Kabushiki Kaisha Toshiba Combined electronic computer and telephone
JPH0651901A (en) 1992-06-29 1994-02-25 Nri & Ncc Co Ltd Communication equipment for glance recognition
USD376791S (en) 1992-08-04 1996-12-24 Hunt Holdings, Inc. Mouse pad
USD359275S (en) 1992-09-18 1995-06-13 International Business Machines Corporation Portable personal computer
USD389129S (en) 1996-04-24 1998-01-13 Stratos Product Development Group, Inc. Touch pad
US5835083A (en) 1996-05-30 1998-11-10 Sun Microsystems, Inc. Eyetrack-driven illumination and information display
USD388774S (en) 1996-07-01 1998-01-06 Stratos Product Development Group Touch pad with scroll bars
USD434773S (en) 1999-05-21 2000-12-05 Kabushiki Kaisha Toshiba Portion of an electronic computer
USD444462S1 (en) 1999-12-28 2001-07-03 Sharp Kabushiki Kaisha Electronic computer
USD433024S (en) 2000-01-21 2000-10-31 Hewlett-Packard Company Input device for a portable computing device
JP3499798B2 (en) 2000-03-13 2004-02-23 シャープ株式会社 Liquid crystal information display
USD462967S1 (en) 2000-03-16 2002-09-17 Kabushiki Kaisha Toshiba Portion of an electronic computer
USD449307S1 (en) 2000-03-24 2001-10-16 Hitachi, Ltd. Combined touch pad and buttons for a portable computer
USD453508S1 (en) 2000-07-19 2002-02-12 Kabushiki Kaisha Toshiba Electronic computer
JP2002071833A (en) 2000-08-31 2002-03-12 Ricoh Co Ltd Human body detecting sensor device, image forming device, human body sensor driving method and storage medium
US6591198B1 (en) 2000-11-22 2003-07-08 Dell Products L.P. System and method for controlling noise outputs of devices in response to ambient noise levels
US6659516B2 (en) 2001-01-05 2003-12-09 Apple Computer, Inc. Locking system for a portable computer
USD454126S1 (en) 2001-04-10 2002-03-05 Hewlett-Packard Company Portable computing device
USD478089S1 (en) 2002-08-06 2003-08-05 Sony Corporation Computer
USD480089S1 (en) 2002-08-09 2003-09-30 Hewlett-Packard Development Company, L.P. Input device for portable computer
US20040120113A1 (en) 2002-12-18 2004-06-24 Gateway, Inc. Acoustically adaptive thermal management
USD494161S1 (en) 2003-03-11 2004-08-10 Fujitsu Limited Personal computer
JP4273926B2 (en) 2003-10-29 2009-06-03 株式会社日立製作所 Silencer and projector using the same
JP2005221907A (en) 2004-02-09 2005-08-18 Sanyo Electric Co Ltd Display device
USD517542S1 (en) 2004-05-14 2006-03-21 Lg Electronics Inc. Notebook computer
USD504129S1 (en) 2004-06-07 2005-04-19 Hewlett-Packard Development Company, L.P. Laptop computer
USD518042S1 (en) 2004-08-18 2006-03-28 Fujitsu Limited Personal computer
KR100677420B1 (en) 2004-12-30 2007-02-02 엘지전자 주식회사 Body rotation type portable terminal
USD534531S1 (en) 2005-06-24 2007-01-02 Sony Corporation Computer
US7861099B2 (en) 2006-06-30 2010-12-28 Intel Corporation Method and apparatus for user-activity-based dynamic power management and policy creation for mobile platforms
US20070027580A1 (en) * 2005-07-14 2007-02-01 Ligtenberg Chris A Thermal control of an electronic device for adapting to ambient conditions
US7697827B2 (en) 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
WO2007109924A1 (en) 2006-03-29 2007-10-04 Intel Corporation Apparatus and method for a mobile personal computer system (mpc) with a built-in scanner
US8195383B2 (en) 2006-11-29 2012-06-05 The Boeing Company System and method for electronic moving map and aeronautical context display
USD577013S1 (en) 2006-11-30 2008-09-16 Hewlett-Packard Development Company, L.P. Laptop computer
USD591737S1 (en) 2007-06-13 2009-05-05 Fujitsu Limited Personal computer
US8462959B2 (en) 2007-10-04 2013-06-11 Apple Inc. Managing acoustic noise produced by a device
US8515095B2 (en) 2007-10-04 2013-08-20 Apple Inc. Reducing annoyance by managing the acoustic noise produced by a device
USD604294S1 (en) 2008-01-14 2009-11-17 Apple Inc. Electronic device
JP5029428B2 (en) 2008-02-29 2012-09-19 富士通株式会社 Temperature control device, temperature control program, and information processing device
CN101656060A (en) 2008-08-18 2010-02-24 鸿富锦精密工业(深圳)有限公司 Energy saving system and method for screen display
USD608380S1 (en) 2008-08-29 2010-01-19 Kabushiki Kaisha Toshiba Data projector with screen
JP2010060746A (en) 2008-09-02 2010-03-18 Sharp Corp Liquid crystal display device
USD607449S1 (en) 2008-09-04 2010-01-05 Sony Corporation Computer
CN102231255B (en) 2008-09-16 2015-09-23 联想(北京)有限公司 Energy-efficient display and electronic equipment
USD616433S1 (en) 2008-09-24 2010-05-25 Fujitsu Limited Personal computer
US20100079508A1 (en) 2008-09-30 2010-04-01 Andrew Hodge Electronic devices with gaze detection capabilities
USD604290S1 (en) 2008-10-10 2009-11-17 Apple Inc. Portable computer
USD612830S1 (en) 2009-02-18 2010-03-30 Samsung Electronics Co., Ltd. Mobile phone
US9413831B2 (en) 2009-04-15 2016-08-09 Wyse Technology L.L.C. Method and apparatus for authentication of a remote session
JP5299866B2 (en) 2009-05-19 2013-09-25 日立コンシューマエレクトロニクス株式会社 Video display device
USD614180S1 (en) 2009-06-09 2010-04-20 Cheng Uei Precision Industry Co., Ltd. Netbook computer
US8171314B2 (en) * 2009-08-07 2012-05-01 Dell Products L.P. System and method for information handling system hybrid system level and power supply cooling
USD658171S1 (en) 2009-08-26 2012-04-24 Sony Corporation Computer
USD631039S1 (en) 2009-09-01 2011-01-18 Fujitsu Limited Personal computer
JP5263092B2 (en) 2009-09-07 2013-08-14 ソニー株式会社 Display device and control method
USD644641S1 (en) 2009-10-13 2011-09-06 Apple Inc. Electronic device
USD616882S1 (en) 2009-11-23 2010-06-01 Dell Products L.P. Information handling system housing
JP2011137874A (en) 2009-12-25 2011-07-14 Toshiba Corp Video playback device and video playback method
US8581974B2 (en) 2010-05-06 2013-11-12 Aptina Imaging Corporation Systems and methods for presence detection
USD645857S1 (en) 2010-08-11 2011-09-27 Samsung Electronics Co., Ltd. Notebook computer
JP5409931B2 (en) 2010-11-30 2014-02-05 三菱電機株式会社 Voice recognition device and navigation device
USD659134S1 (en) 2010-12-30 2012-05-08 Motorola Mobility, Inc. Computer terminal
US8682388B2 (en) 2010-12-31 2014-03-25 Motorola Mobility Llc Mobile device and method for proximity detection verification
US9830831B2 (en) 2011-01-05 2017-11-28 Pathway Innovations And Technologies, Inc. Mobile handwriting recording instrument and group lecture delivery and response system using the same
USD687831S1 (en) 2011-02-12 2013-08-13 Samsung Electronics Co., Ltd. Notebook computer
US9606723B2 (en) 2011-07-21 2017-03-28 Z124 Second view
USD669068S1 (en) 2011-09-19 2012-10-16 J. P. Sá Couto Laptop computer
WO2013081632A1 (en) 2011-12-02 2013-06-06 Intel Corporation Techniques for notebook hinge sensors
US9766700B2 (en) 2011-12-14 2017-09-19 Intel Corporation Gaze activated content transfer system
KR20130093962A (en) 2012-02-15 2013-08-23 Mobilab Co., Ltd. Apparatus and method for reducing power consumption and improving user convenience in hand-held device display using face recognition
CN104169838B (en) 2012-04-12 2017-07-21 英特尔公司 Display backlight is optionally made based on people's ocular pursuit
USD691995S1 (en) 2012-05-31 2013-10-22 Intel Corporation Electronic computer with partially transparent input device
USD694232S1 (en) 2012-05-31 2013-11-26 Intel Corporation Electronic computer with partially transparent input device
USD701501S1 (en) 2012-05-31 2014-03-25 Intel Corporation Electronic computer with an at least partially transparent input device
USD698350S1 (en) 2012-05-31 2014-01-28 Intel Corporation Electronic computer with an at least partially transparent input device
US9183845B1 (en) 2012-06-12 2015-11-10 Amazon Technologies, Inc. Adjusting audio signals based on a specific frequency range associated with environmental noise characteristics
USD698348S1 (en) 2012-08-06 2014-01-28 Hewlett-Packard Development Company, L.P. Computing device
USD706768S1 (en) 2012-08-10 2014-06-10 Kabushiki Kaisha Toshiba Electronic computer
USD706767S1 (en) 2012-08-10 2014-06-10 Kabushiki Kaisha Toshiba Electronic computer
USD706769S1 (en) 2012-08-10 2014-06-10 Kabushiki Kaisha Toshiba Electronic computer
USD706772S1 (en) 2012-08-29 2014-06-10 Kabushiki Kaisha Toshiba Electronic computer
USD716795S1 (en) 2012-09-06 2014-11-04 Asustek Computer Inc. Portable electronic device
USD708178S1 (en) 2012-09-24 2014-07-01 Panasonic Corporation Portable computer
USD715793S1 (en) 2012-10-04 2014-10-21 Acer Incorporated Notebook computer
USD720712S1 (en) 2012-10-05 2015-01-06 Lg Electronics Inc. Mobile phone
USD692875S1 (en) 2012-10-05 2013-11-05 Stephen Lawrence Notebook computer with wooden housing and silicon keyboard
US9158372B2 (en) 2012-10-30 2015-10-13 Google Technology Holdings LLC Method and apparatus for user interaction data storage
CA150941S (en) 2012-11-06 2014-06-02 Sony Computer Entertainment Inc Controller for electronic device
USD704185S1 (en) 2012-11-06 2014-05-06 Google Inc. Notebook computer housing
CA151691S (en) 2013-02-19 2014-06-02 Sony Computer Entertainment Inc Controller for electronic device
US10216266B2 (en) 2013-03-14 2019-02-26 Qualcomm Incorporated Systems and methods for device interaction based on a detected gaze
JP1488334S (en) 2013-06-14 2017-01-10
USD724576S1 (en) 2013-07-26 2015-03-17 Hewlett-Packard Development Company, L.P. Computer touch pad
TWD162093S (en) 2013-08-12 2014-08-01 Toshiba Corporation Electronic computer
TWD162094S (en) 2013-08-12 2014-08-01 Toshiba Corporation Electronic computer
US9652024B2 (en) 2013-08-23 2017-05-16 Samsung Electronics Co., Ltd. Mode switching method and apparatus of terminal
KR102305578B1 (en) 2013-08-23 2021-09-27 삼성전자 주식회사 Method and apparatus for switching mode of terminal
JP1505102S (en) 2013-09-26 2017-08-07
USD741318S1 (en) 2013-10-25 2015-10-20 Intel Corporation Electronic device with a window
JP6165979B2 (en) 2013-11-01 2017-07-19 インテル コーポレイション Gaze-assisted touch screen input
USD731475S1 (en) 2013-11-01 2015-06-09 Hewlett-Packard Development Company, L.P. Computer
USD746808S1 (en) 2013-12-09 2016-01-05 Google Inc. Notebook computer housing
US10620457B2 (en) 2013-12-17 2020-04-14 Intel Corporation Controlling vision correction using eye tracking and depth detection
US20140132514A1 (en) 2013-12-17 2014-05-15 Byron S. Kuzara Portable Electronic Device With Dual Opposing Displays
EP3779642B1 (en) 2014-02-05 2022-09-28 Fujitsu Client Computing Limited Display device, computer system and method for managing the operating states of a computer system
TWD167526S (en) 2014-03-19 2015-05-01 Toshiba Corporation Electronic computer
WO2015156762A1 (en) 2014-04-07 2015-10-15 Hewlett-Packard Development Company, L.P. Adjusting display brightness based on user distance
DE112014006610B4 (en) 2014-04-24 2019-09-19 Mitsubishi Electric Corporation Robot control device and robot control method
USD813235S1 (en) 2014-06-25 2018-03-20 Sensel, Inc. Touch sensor typewriter keyboard
USD727314S1 (en) 2014-07-07 2015-04-21 Kabushiki Kaisha Toshiba Electronic computer
US10061332B2 (en) 2014-07-14 2018-08-28 Dell Products, L.P. Active acoustic control of cooling fan and method therefor
USD771684S1 (en) 2014-08-28 2016-11-15 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
JP1531213S (en) 2014-09-04 2015-08-17
US9904775B2 (en) 2014-10-31 2018-02-27 The Toronto-Dominion Bank Systems and methods for authenticating user identity based on user-defined image data
US9936195B2 (en) 2014-11-06 2018-04-03 Intel Corporation Calibration for eye tracking systems
USD751062S1 (en) 2014-12-05 2016-03-08 Shuttle Inc. Portable computer
USD788767S1 (en) 2014-12-26 2017-06-06 Intel Corporation Portable computing device
USD776653S1 (en) 2015-01-06 2017-01-17 Apple Inc. Electronic device
USD801945S1 (en) 2015-03-06 2017-11-07 Samsung Display Co., Ltd. Mobile phone
KR102353218B1 (en) 2015-07-15 2022-01-20 삼성디스플레이 주식회사 Display apparatus and method for driving thereof
KR101789668B1 (en) 2015-07-16 2017-10-25 Samsung Electronics Co., Ltd. Mobile image forming apparatus, image compensation method thereof and non-transitory computer readable recording medium
JP2017054471A (en) 2015-09-12 2017-03-16 レノボ・シンガポール・プライベート・リミテッド Portable electronic apparatus, control method, and computer program
USD769251S1 (en) 2015-09-25 2016-10-18 Getac Technology Corporation Detachable keyboard
USD814469S1 (en) 2016-01-05 2018-04-03 Pt Phototechnics Ag Touchpad
USD780760S1 (en) 2016-03-02 2017-03-07 Ironburg Inventions Ltd. Games controller
USD794027S1 (en) 2016-03-02 2017-08-08 Ironburg Inventions Ltd. Games controller
US11232316B2 (en) 2016-06-28 2022-01-25 Intel Corporation Iris or other body part identification on a computing device
KR20230133940A (en) 2016-07-25 2023-09-19 매직 립, 인코포레이티드 Imaging modification, display and visualization using augmented and virtual reality eyewear
JP1575920S (en) 2016-07-29 2017-05-08
US10415286B1 (en) 2016-09-20 2019-09-17 Apple Inc. Hinge with feedback
USD803946S1 (en) 2016-10-06 2017-11-28 Go Matsuda Grip
USD825435S1 (en) 2016-11-14 2018-08-14 Honda Motor Co., Ltd. Touch pad
JP6795387B2 (en) 2016-12-14 2020-12-02 パナソニック株式会社 Voice dialogue device, voice dialogue method, voice dialogue program and robot
JP1577368S (en) 2016-12-28 2017-05-29
US10324525B2 (en) 2016-12-31 2019-06-18 Intel Corporation Context aware selective backlighting techniques
USD816083S1 (en) 2017-02-27 2018-04-24 Lu Xue Wu Game controller
USD823850S1 (en) 2017-04-12 2018-07-24 Samsung Electronics Co., Ltd. Laptop computer
US10304209B2 (en) 2017-04-19 2019-05-28 The Nielsen Company (Us), Llc Methods and systems to increase accuracy of eye tracking
CN107272872B (en) 2017-05-31 2020-01-21 Oppo广东移动通信有限公司 Power saving control method and related product
US11249516B2 (en) 2017-06-27 2022-02-15 Lenovo (Singapore) Pte. Ltd. Multiple display device with rotating display
KR102417002B1 (en) 2017-06-28 2022-07-05 삼성전자 주식회사 An electronic apparatus using two display device and method for operating a screen in the same
US11209890B2 (en) 2017-07-25 2021-12-28 Hewlett-Packard Development Company, L.P. Determining user presence based on sensed distance
USD886112S1 (en) 2017-10-23 2020-06-02 Compal Electronics, Inc. Docking station with keyboard
US10901462B2 (en) 2017-10-26 2021-01-26 Samsung Electronics Co., Ltd. System and method for touch input
US11467650B2 (en) * 2017-11-21 2022-10-11 Advanced Micro Devices, Inc. Selecting a low power state in an electronic device
USD879777S1 (en) 2017-12-11 2020-03-31 Innopresso, Inc. Keyboard
USD867460S1 (en) 2018-01-09 2019-11-19 Intel Corporation Game controller having a game controller touchpad
CN108520728B (en) 2018-04-20 2020-08-04 京东方科技集团股份有限公司 Backlight adjusting method and device, computing device, display device and storage medium
USD982575S1 (en) 2018-07-12 2023-04-04 Google Llc Closed portable computer
USD878475S1 (en) 2018-08-08 2020-03-17 Logitech Europe, S.A. Gaming controller
USD914010S1 (en) 2018-09-04 2021-03-23 Compal Electronics, Inc. Notebook computer
USD934856S1 (en) 2018-09-04 2021-11-02 Compal Electronics, Inc. Notebook computer
USD916076S1 (en) 2018-11-07 2021-04-13 Samsung Electronics Co., Ltd. Notebook
US11031005B2 (en) 2018-12-17 2021-06-08 Intel Corporation Continuous topic detection and adaption in audio environments
USD914021S1 (en) 2018-12-18 2021-03-23 Intel Corporation Touchpad display screen for computing device
USD873835S1 (en) 2018-12-21 2020-01-28 Kung CHAN Keyboard
US10721408B1 (en) 2018-12-26 2020-07-21 Himax Imaging Limited Automatic exposure imaging system and method
US10768724B1 (en) 2019-06-03 2020-09-08 Evga Corporation Mouse device and computer control system thereof
US11448747B2 (en) 2019-09-26 2022-09-20 Apple Inc. Time-of-flight determination of user intent
US11360528B2 (en) 2019-12-27 2022-06-14 Intel Corporation Apparatus and methods for thermal management of electronic user devices based on user activity
WO2021258395A1 (en) 2020-06-26 2021-12-30 Intel Corporation Methods, systems, articles of manufacture, and apparatus to dynamically schedule a wake pattern in a computing system
US20210327394A1 (en) 2021-06-25 2021-10-21 Intel Corporation User-presence based adjustment of display characteristics

Patent Citations (171)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5173940A (en) 1991-06-21 1992-12-22 Compaq Computer Corporation Keyboard activated screen blanking
US20040158739A1 (en) 1997-03-24 2004-08-12 Canon Kabushiki Kaisha Information processing apparatus for performing processing dependent on presence/absence of user, and method therefor
US9740290B2 (en) 1999-12-17 2017-08-22 Immersion Corporation Haptic feedback for touchpads and other touch controls
US20020091738A1 (en) 2000-06-12 2002-07-11 Rohrabaugh Gary B. Resolution independent vector display of internet content
US6657647B1 (en) 2000-09-25 2003-12-02 Xoucin, Inc. Controlling the order in which content is displayed in a browser
US7559034B1 (en) 2000-10-19 2009-07-07 DG FastChannel, Inc. Method and system for using a hyperlink, banner, or graphical icon to initiate the overlaying of an object on a window
US20030043174A1 (en) 2001-08-29 2003-03-06 Hinckley Kenneth P. Automatic scrolling
US20030174149A1 (en) 2002-02-06 2003-09-18 Hitomi Fujisaki Apparatus and method for data-processing
US6760649B2 (en) * 2002-05-22 2004-07-06 International Business Machines Corporation Thermal management of a laptop computer
US7386799B1 (en) 2002-11-21 2008-06-10 Forterra Systems, Inc. Cinematic techniques in avatar-centric communication during a multi-user online simulation
US20040175020A1 (en) 2003-03-05 2004-09-09 Bradski Gary R. Method and apparatus for monitoring human attention in dynamic power management
US20040252101A1 (en) 2003-06-12 2004-12-16 International Business Machines Corporation Input device that detects user's proximity
US20050071698A1 (en) 2003-09-30 2005-03-31 Kangas Paul Daniel Apparatus, system, and method for autonomic power adjustment in an electronic device
US20080158144A1 (en) 2004-03-18 2008-07-03 Koninklijke Philips Electronics, N.V. Scanning Display Apparatus
US20060192775A1 (en) 2005-02-25 2006-08-31 Microsoft Corporation Using detected visual cues to change computer system operating states
US20150121287A1 (en) 2006-07-03 2015-04-30 Yoram Ben-Meir System for generating and controlling a variably displayable mobile device keypad/virtual keyboard
US20080046425A1 (en) 2006-08-15 2008-02-21 N-Trig Ltd. Gesture detection for a digitizer
US7725547B2 (en) 2006-09-06 2010-05-25 International Business Machines Corporation Informing a user of gestures made by others out of the user's line of sight
US20080112571A1 (en) 2006-11-09 2008-05-15 Thomas Michael Bradicich Noise control in proximity to a computer system
US7971156B2 (en) 2007-01-12 2011-06-28 International Business Machines Corporation Controlling resource access based on user gesturing in a 3D captured image stream of the user
US20080301300A1 (en) 2007-06-01 2008-12-04 Microsoft Corporation Predictive asynchronous web pre-fetch
US20090092293A1 (en) 2007-10-03 2009-04-09 Micro-Star Int'l Co., Ltd. Method for determining power-save mode of multimedia application
US20090165125A1 (en) 2007-12-19 2009-06-25 Research In Motion Limited System and method for controlling user access to a computing device
CN102197349A (en) 2008-10-22 2011-09-21 微软公司 Conserving power using predictive modelling and signaling
WO2010071631A1 (en) 2008-12-15 2010-06-24 Hewlett-Packard Development Company, L.P. Temperature threshold adjustment based on human detection
US8139032B2 (en) 2008-12-22 2012-03-20 Kuo-Hsin Su Power-saving computer mouse
US20110296163A1 (en) 2009-02-20 2011-12-01 Koninklijke Philips Electronics N.V. System, method and apparatus for causing a device to enter an active mode
US8994847B2 (en) 2009-04-07 2015-03-31 Mediatek Inc. Digital camera and image capturing method
US20140191995A1 (en) 2009-04-24 2014-07-10 Cypress Semiconductor Corporation Touch Identification for Multi-Touch Technology
US20100281432A1 (en) 2009-05-01 2010-11-04 Kevin Geisner Show body position
US20110055752A1 (en) 2009-06-04 2011-03-03 Rubinstein Jonathan J Method and Apparatus for Displaying and Auto-Correcting an Over-Scroll State on a Computing Device
US20110298702A1 (en) 2009-12-14 2011-12-08 Kotaro Sakata User interface device and input method
US20110154266A1 (en) 2009-12-17 2011-06-23 Microsoft Corporation Camera navigation for presentations
US20110175932A1 (en) 2010-01-21 2011-07-21 Tobii Technology Ab Eye tracker based contextual action
US20110248918A1 (en) 2010-04-07 2011-10-13 Samsung Electronics Co., Ltd. Method for suspension sensing in interactive display, method for processing suspension sensing image, and proximity sensing apparatus
US20110252339A1 (en) 2010-04-12 2011-10-13 Google Inc. Collaborative Cursors in a Hosted Word Processor
US20110298967A1 (en) 2010-06-04 2011-12-08 Microsoft Corporation Controlling Power Levels Of Electronic Devices Through User Interaction
US20130021750A1 (en) * 2010-07-08 2013-01-24 Hewlett-Packard Development Company, Lp. Electronic device thermal management
US20120032894A1 (en) 2010-08-06 2012-02-09 Nima Parivar Intelligent management for an electronic device
US20120054670A1 (en) 2010-08-27 2012-03-01 Nokia Corporation Apparatus and method for scrolling displayed information
US20120062470A1 (en) 2010-09-10 2012-03-15 Chang Ray L Power Management
US8812831B2 (en) 2010-09-30 2014-08-19 International Business Machines Corporation Fan control method and apparatus for adjusting initial fan speed based on a discreteness level of installed devices and calibrating fan speed according to threshold power and adjusted initial speed
US20150264572A1 (en) 2010-11-29 2015-09-17 Biocatch Ltd. System, method, and device of detecting identity of a user of an electronic device
US20160370860A1 (en) 2011-02-09 2016-12-22 Apple Inc. Gaze detection in a 3d mapping environment
US20140028548A1 (en) 2011-02-09 2014-01-30 Primesense Ltd Gaze detection in a 3d mapping environment
US20130321271A1 (en) 2011-02-09 2013-12-05 Primesense Ltd Pointing-based display interaction
US20130321265A1 (en) 2011-02-09 2013-12-05 Primesense Ltd. Gaze-Based Display Control
US8717318B2 (en) 2011-03-29 2014-05-06 Intel Corporation Continued virtual links between gestures and user interface elements
EP2518586A1 (en) 2011-04-25 2012-10-31 Sunon Electronics (Kunshan) Co., Ltd. Cooling system for a portable electronic device
US20120300061A1 (en) 2011-05-25 2012-11-29 Sony Computer Entertainment Inc. Eye Gaze to Alter Device Behavior
US20190079572A1 (en) 2011-06-17 2019-03-14 Sony Corporation Electronic device, control method of electronic device, and program
US20120319997A1 (en) 2011-06-20 2012-12-20 The Regents Of The University Of California Scalable distributed/cooperative/collaborative paradigm for multi-user interaction with projection-based display walls
US20130007590A1 (en) 2011-06-30 2013-01-03 Apple Inc. List view optimization
US8566696B1 (en) 2011-07-14 2013-10-22 Google Inc. Predicting user navigation events
US20150363070A1 (en) 2011-08-04 2015-12-17 Itay Katz System and method for interfacing with a device via a 3d display
US20130212462A1 (en) 2011-09-14 2013-08-15 Nokia Corporation Method and apparatus for distributed script processing
US20130120460A1 (en) 2011-11-14 2013-05-16 Microsoft Corporation Animations for Scroll and Zoom
US20130174016A1 (en) 2011-12-29 2013-07-04 Chegg, Inc. Cache Management in HTML eReading Application
US20130173946A1 (en) 2011-12-29 2013-07-04 Efraim Rotem Controlling power consumption through multiple power limits over multiple time intervals
US20130185633A1 (en) 2012-01-16 2013-07-18 Microsoft Corporation Low resolution placeholder content for document navigation
US20150220150A1 (en) 2012-02-14 2015-08-06 Google Inc. Virtual touch user interface system and methods
US20150220149A1 (en) 2012-02-14 2015-08-06 Google Inc. Systems and methods for a virtual grasping user interface
US20130207895A1 (en) 2012-02-15 2013-08-15 Samsung Electronics Co., Ltd. Eye tracking method and display apparatus using the same
US20130222329A1 (en) 2012-02-29 2013-08-29 Lars-Johan Olof LARSBY Graphical user interface interaction on a touch-sensitive device
US20130283213A1 (en) 2012-03-26 2013-10-24 Primesense Ltd. Enhanced virtual touchpad
US20130289792A1 (en) * 2012-04-27 2013-10-31 Chao-Wen Cheng Thermal Management
US20130332760A1 (en) 2012-06-08 2013-12-12 Russell Dean Reece Thermal-based acoustic management
US20140006830A1 (en) 2012-06-29 2014-01-02 Intel Corporation User behavior adaptive sensing scheme for efficient power consumption management
US20150185909A1 (en) 2012-07-06 2015-07-02 Freescale Semiconductor, Inc. Method of sensing a user input to a capacitive touch sensor, a capacitive touch sensor controller, an input device and an apparatus
US20190174419A1 (en) 2012-07-20 2019-06-06 Facebook, Inc. Adjusting mobile device state based on user intentions and/or identity
US20150193395A1 (en) 2012-07-30 2015-07-09 Google Inc. Predictive link pre-loading
US20140089865A1 (en) 2012-09-24 2014-03-27 Co-Operwrite Limited Handwriting recognition server
US20140085451A1 (en) 2012-09-24 2014-03-27 Fujitsu Limited Gaze detection apparatus, gaze detection computer program, and display apparatus
US9311909B2 (en) * 2012-09-28 2016-04-12 Microsoft Technology Licensing, Llc Sensed sound level based fan speed adjustment
US20140139456A1 (en) 2012-10-05 2014-05-22 Tactual Labs Co. Hybrid systems and methods for low-latency user input processing and feedback
US20140129937A1 (en) 2012-11-08 2014-05-08 Nokia Corporation Methods, apparatuses and computer program products for manipulating characteristics of audio objects by using directional gestures
US20140149935A1 (en) 2012-11-28 2014-05-29 Michael Dudley Johnson User-Intent-Based Chrome
US20140189579A1 (en) 2013-01-02 2014-07-03 Zrro Technologies (2009) Ltd. System and method for controlling zooming and/or scrolling
US20140201690A1 (en) 2013-01-15 2014-07-17 Leap Motion, Inc. Dynamic user interactions for display control and scaling responsiveness of display objects
US20140208260A1 (en) 2013-01-18 2014-07-24 Panasonic Corporation Scrolling apparatus, scrolling method, and computer-readable medium
US20150360567A1 (en) 2013-01-21 2015-12-17 Toyota Jidosha Kabushiki Kaisha User interface apparatus and input acquiring method
US9268434B2 (en) 2013-02-14 2016-02-23 Dell Products L.P. Systems and methods for reducing power consumption in a touch sensor display
WO2014131188A1 (en) 2013-02-28 2014-09-04 Hewlett-Packard Development Company, L.P. Input for portable computing device based on predicted input
US20140258942A1 (en) 2013-03-05 2014-09-11 Intel Corporation Interaction of multiple perceptual sensing inputs
US20140267021A1 (en) 2013-03-14 2014-09-18 Samsung Electronics Co., Ltd. Display control method and apparatus
US20140281918A1 (en) 2013-03-15 2014-09-18 Yottaa Inc. Systems and methods for configuration-based optimization by an intermediary
US20160087981A1 (en) 2013-04-29 2016-03-24 Baseline Automatisering B.V. Method for Authentication, Server, Device and Data Carrier
WO2014186294A1 (en) 2013-05-15 2014-11-20 Advanced Micro Devices, Inc. Method and system for power management
US20170201254A1 (en) 2013-05-29 2017-07-13 Ingar Hanssen Multi-State Capacitive Button
US20140361977A1 (en) 2013-06-07 2014-12-11 Sony Computer Entertainment Inc. Image rendering responsive to user actions in head mounted display
US20140372511A1 (en) 2013-06-14 2014-12-18 Microsoft Corporation Content Pre-Render and Pre-Fetch Techniques
US20140380075A1 (en) 2013-06-19 2014-12-25 Microsoft Corporation Selective Blocking of Background Activity
US20160202750A1 (en) 2013-06-19 2016-07-14 Microsoft Technology Licensing, Llc Selective blocking of background activity
WO2014205227A2 (en) 2013-06-20 2014-12-24 Bank Of America Corporation Utilizing voice biometrics
US20160109961A1 (en) 2013-06-20 2016-04-21 Uday Parshionikar Systems, methods, apparatuses, computer readable medium for controlling electronic devices
US9436241B2 (en) 2013-06-26 2016-09-06 Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. Electronic device and method for adjusting fan of electronic device
US20150009238A1 (en) 2013-07-03 2015-01-08 Nvidia Corporation Method for zooming into and out of an image shown on a display
US20150015688A1 (en) 2013-07-09 2015-01-15 HTC Corportion Facial unlock mechanism using light level determining module
US9721383B1 (en) 2013-08-29 2017-08-01 Leap Motion, Inc. Predictive information for free space gesture control and communication
US20150100884A1 (en) 2013-10-08 2015-04-09 Nvidia Corporation Hardware overlay assignment
US20150121193A1 (en) 2013-10-24 2015-04-30 Vmware, Inc. User interface virtualization for web applications
US9996638B1 (en) 2013-10-31 2018-06-12 Leap Motion, Inc. Predictive information for free space gesture control and communication
US8954884B1 (en) 2013-11-18 2015-02-10 Maestro Devices, LLC Navigation system for viewing an image data-stack in less time with less effort and less repetitive motions
US20150177843A1 (en) 2013-12-23 2015-06-25 Samsung Electronics Co., Ltd. Device and method for displaying user interface of virtual input device based on motion recognition
US20150248167A1 (en) 2014-02-28 2015-09-03 Microsoft Corporation Controlling a computing-based device using gestures
US10254178B2 (en) 2014-03-28 2019-04-09 Intel Corporation Ambient temperature estimation
CN107077184A (en) 2014-06-27 2017-08-18 英特尔公司 System standby emulation with fast quick-recovery
US20180039410A1 (en) 2014-07-25 2018-02-08 Lg Electronics Inc. Mobile terminal and control method thereof
US20160034019A1 (en) 2014-07-30 2016-02-04 Samsung Electronics Co., Ltd. Display apparatus and control method for controlling power consumption thereof
US20160062584A1 (en) 2014-08-27 2016-03-03 Apple Inc. Anchoring viewport
US20160091938A1 (en) * 2014-09-25 2016-03-31 Intel Corporation System and method for adaptive thermal and performance management in electronic devices
US20160116960A1 (en) 2014-10-24 2016-04-28 Ati Technologies Ulc Power management using external sensors and data
US20160132099A1 (en) 2014-11-10 2016-05-12 Novi Security, Inc. Security Sensor Power Management
US20160170617A1 (en) 2014-12-11 2016-06-16 Cisco Technology, Inc. Automatic active region zooming
US20160179767A1 (en) 2014-12-22 2016-06-23 Prasanna Bhat Mavinakuli Architecture for an application with integrated dynamic content
US20160180762A1 (en) 2014-12-22 2016-06-23 Elwha Llc Systems, methods, and devices for controlling screen refresh rates
US20160187994A1 (en) 2014-12-29 2016-06-30 Samsung Electronics Co., Ltd. Foldable device and method of controlling the same
US20160212317A1 (en) 2015-01-15 2016-07-21 Motorola Mobility Llc 3d ir illumination for iris authentication
US20170147879A1 (en) 2015-01-15 2017-05-25 Motorola Mobility Llc 3d ir illumination for iris authentication
US20160232701A1 (en) 2015-02-05 2016-08-11 Blackberry Limited Devices and methods for rendering graphics data
US9846471B1 (en) 2015-02-12 2017-12-19 Marvell International Ltd. Systems and methods for power management in devices
US20160259467A1 (en) 2015-03-02 2016-09-08 Apple Inc. Snr-aware active mode touch scans
US10101817B2 (en) 2015-03-03 2018-10-16 Intel Corporation Display interaction detection
US20160297362A1 (en) 2015-04-09 2016-10-13 Ford Global Technologies, Llc Vehicle exterior side-camera systems and methods
US20170034146A1 (en) 2015-07-30 2017-02-02 Optim Corporation User terminal and method for screen sharing
US20170039170A1 (en) 2015-08-04 2017-02-09 Google Inc. Systems and methods for interactively presenting a visible portion of a rendering surface on a user device
US20170085790A1 (en) 2015-09-23 2017-03-23 Microsoft Technology Licensing, Llc High-resolution imaging of regions of interest
US20170090585A1 (en) 2015-09-26 2017-03-30 Bryan G. Bernhart Technologies for adaptive rendering using 3d sensors
US20190371326A1 (en) 2015-11-24 2019-12-05 Intel IP Corporation Low resource key phrase detection for wake on voice
US9785234B2 (en) 2015-12-26 2017-10-10 Intel Corporation Analysis of ambient light for gaze tracking
US20170219240A1 (en) 2016-02-03 2017-08-03 Avaya Inc. Method and apparatus for a fan auto adaptive noise
US10620786B2 (en) 2016-03-07 2020-04-14 Intel Corporation Technologies for event notification interface management
US20170269725A1 (en) 2016-03-21 2017-09-21 Samsung Electronics Co., Ltd. Electronic device for touch and finger scan sensor input and control method thereof
US20170321856A1 (en) 2016-05-04 2017-11-09 Intel Corporation Display backlighting using ambient light
US10740912B2 (en) 2016-05-19 2020-08-11 Intel Corporation Detection of humans in images using depth information
US20180029370A1 (en) 2016-08-01 2018-02-01 Canon Kabushiki Kaisha Printing apparatus and performance maintaining method
US20180039990A1 (en) 2016-08-05 2018-02-08 Nok Nok Labs, Inc. Authentication techniques including speech and/or lip movement analysis
EP3285133A1 (en) 2016-08-19 2018-02-21 Samsung Electronics Co., Ltd. Foldable device and method of controlling the same
KR20180029370A (en) 2016-09-12 2018-03-21 Samsung Electronics Co., Ltd. Foldable electronic device with flexible display
US20190361501A1 (en) 2016-09-12 2019-11-28 Samsung Electronics Co., Ltd Foldable electronic device comprising flexible display
US20180136719A1 (en) 2016-11-16 2018-05-17 Wuhan China Star Optoelectronics Technology Co., Ltd. Image brightness adjusting method and image brightness adjusting device
US10027662B1 (en) 2016-12-06 2018-07-17 Amazon Technologies, Inc. Dynamic user authentication
US20180164942A1 (en) 2016-12-12 2018-06-14 Microsoft Technology Licensing, Llc Apparatus and method of adjusting power mode of a display of a device
US20180189547A1 (en) 2016-12-30 2018-07-05 Intel Corporation Biometric identification system
US20180188774A1 (en) 2016-12-31 2018-07-05 Lenovo (Singapore) Pte. Ltd. Multiple display device
US20180224871A1 (en) 2017-02-03 2018-08-09 Qualcomm Incorporated System and method for thermal management of a wearable computing device based on proximity to a user
US10262599B2 (en) 2017-03-24 2019-04-16 Intel Corporation Display backlight brightness adjustment
US20180321731A1 (en) 2017-05-04 2018-11-08 Dell Products, Lp System and method for heuristics based user presence detection for power management
US20200012331A1 (en) 2017-06-02 2020-01-09 Apple Inc. Techniques for adjusting computing device sleep states using onboard sensors and learned user behaviors
US20190034609A1 (en) 2017-07-31 2019-01-31 Stmicroelectronics, Inc. Human presence detection
US20190239384A1 (en) * 2018-01-31 2019-08-01 Dell Products L.P. Systems and methods for detecting impeded cooling air flow for information handling system chassis enclosures
US20190250691A1 (en) 2018-02-09 2019-08-15 Samsung Electronics Co., Ltd. Mobile device including context hub and operation method thereof
US20190258785A1 (en) 2018-02-17 2019-08-22 Motorola Mobility Llc Methods and Systems for Electronic Device Concealed Monitoring
US20190265831A1 (en) 2018-02-23 2019-08-29 Cirrus Logic International Semiconductor Ltd. Method and system for an electronic device
US10725510B2 (en) * 2018-03-16 2020-07-28 Microsoft Technology Licensing, Llc Device configuration-based thermal management control
US20190371342A1 (en) 2018-06-05 2019-12-05 Samsung Electronics Co., Ltd. Methods and systems for passive wakeup of a user interaction device
US20200033920A1 (en) 2018-07-28 2020-01-30 Microsoft Technology Licensing, Llc Optimized touch temperature thermal management
US10551888B1 (en) * 2018-08-13 2020-02-04 Dell Products L.P. Skin transition thermal control for convertible information handling systems
US20200125158A1 (en) 2018-10-22 2020-04-23 Google Llc Smartphone-Based Radar System for Determining User Intention in a Lower-Power Mode
US20210318743A1 (en) 2018-12-03 2021-10-14 Hewlett-Packard Development Company, L.P. Sensing audio information and footsteps to control power
US20200259638A1 (en) 2019-02-08 2020-08-13 Keyless Technologies Ltd Authentication processing service
WO2020191643A1 (en) 2019-03-27 2020-10-01 Intel Corporation Smart display panel apparatus and related methods
US20200348745A1 (en) 2019-05-02 2020-11-05 Dell Products L.P. Information handling system power control sensor
US10819920B1 (en) 2019-05-22 2020-10-27 Dell Products L.P. Augmented information handling system user presence detection
US20190278339A1 (en) 2019-05-23 2019-09-12 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US20210025976A1 (en) 2019-07-26 2021-01-28 Google Llc Reducing a State Based on IMU and Radar
US20200026342A1 (en) 2019-09-27 2020-01-23 Intel Corporation Wake-on-touch display screen devices and related methods
US20200133374A1 (en) 2019-11-11 2020-04-30 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US20200134151A1 (en) 2019-12-23 2020-04-30 Intel Corporation Systems and methods for multi-modal user device authentication
US20210240254A1 (en) 2020-01-31 2021-08-05 Dell Products L.P. Information handling system peripheral enhanced user presence detection
US20210109585A1 (en) 2020-12-21 2021-04-15 Intel Corporation Methods and apparatus to improve user experience on computing devices

Non-Patent Citations (42)

Brian Reads, "Microsoft Windows Vista SideShow—In-Depth (pics)", Notebook Review, available at www.notebookreview.com/news/microsoft-windows-vista-sideshow-in-depth-pics/ (retrieved May 6, 2019), Jan. 11, 2006, 7 pages.
Chethan, "Proximity Sensing with CapSense," Cypress AN92239, 2016, 62 pages.
Cravotta, Nicholas, "Optimizing Proximity Sensing for Consumer Electronics Applications," Digi-Key Electronics, Apr. 26, 2012, 9 pages.
Cutress, "Asus ZenBook Pro 15 (UX580): A 5.5-inch Screen in the Touchpad", retrieved from https://www.anandtech.com/show/12880/asus-zenbook-pro-15-ux580-a-55inch-screen-in-the-touchpad, Jun. 5, 2018, 5 pages.
European Patent Office, "Extended European Search Report," issued in connection with European Patent Application No. 20164273.3, dated Oct. 9, 2020, 14 pages.
European Patent Office, "Extended European Search Report," issued in connection with European Patent Application No. 20194494.9, dated Feb. 17, 2021, 8 pages.
European Patent Office, "Extended European Search Report," issued in connection with European Patent Application No. 20197337.7, dated Mar. 9, 2021, 11 pages.
European Patent Office, "Extended Search Report," issued in connection with European Patent Application No. 20181123.9, dated Dec. 4, 2020, 11 pages.
European Patent Office, "Rule 62a(1) Communication," issued in connection with European Patent Application No. 20197335.1, dated Mar. 17, 2021, 2 pages.
European Patent Office, "Extended European Search Report," issued in connection with European Patent Application No. 20197335.1, dated Jul. 16, 2021, 11 pages.
Gajitz, "Open Sesame! Gesture-Controlled Motorized Laptop Lid", available at https://gajitz.com/open-sesame-gesture-controlled-motorized-laptop-lid/ (retrieved May 6, 2019), Sep. 2012, 3 pages.
Indiegogo, "Cosmo Communicator", available at https://www.indiegogo.com/projects/cosmo-communicator#/ (retrieved May 6, 2019), 2018, 18 pages.
International Searching Authority, "International Preliminary Report on Patentability," issued in connection with PCT Application No. PCT/CN2016/048953, dated Mar. 27, 2018, 10 pages.
International Searching Authority, "International Search Report," issued in connection with PCT Application No. PCT/US2016/048953, dated Nov. 23, 2016, 3 pages.
International Searching Authority, "Search Report and Written Opinion," issued in connection with PCT Application No. PCT/US2020/098326, dated Mar. 29, 2021, 9 pages.
International Searching Authority, "International Search Report," issued in connection with PCT Application No. PCT/CN2019/079790, dated Jan. 3, 2020, 4 pages.
International Searching Authority, "Written Opinion of the International Searching Authority," issued in connection with PCT Application No. PCT/US2016/048953, dated Nov. 23, 2016, 9 pages.
International Searching Authority, "Written Opinion," issued in connection with PCT Application No. PCT/CN2019/079790, dated Jan. 3, 2020, 4 pages.
Jack Purcher, "Google Patents a Motorized Pixelbook Lid that Opens and Closes with a Simple Touch & Auto-Aligns the Display to the user's Face", Patently Mobile, available at https://www.patentlymobile.com/2017/11/google-patents-a-motorized-pixelbook-lid-that-opens-and-closes-with-a-simple-touch-auto-aligns-the-display-to-the-users-fa.html (retrieved May 6, 2019), Nov. 25, 2017, 6 pages.
Kul Bushan, "CES 2019: Dell's new laptop can sense your presence and wake itself," Hindustan Times, available at https://www.hindustantimes.com/tech/ces-2019-dell-latitude-7400-2-in-1-laptop-launched-price-specifications-features/story-CiRoU1GoHHsHq3K3qtPZWJ.html (retrieved May 6, 2019), Jan. 5, 2019, 8 pages.
Monica Chin, "Alexa on Windows 10 Hands-On: Useful, with 1 Big Catch", Laptop Magazine, available at https://www.laptopmag.com/articles/alexa-windows-10-hands-on (retrieved May 6, 2019), Nov. 14, 2018, 6 pages.
Notebook Review, "CES 2007: Vista SideShow in HP, Fujitsu, LG and Asus Notebooks," Notebook Review, available at www.notebookreview.com/news/ces-2007-vista-sideshow-in-hp-fujitsu-lg-and-asus-notebooks/ (retrieved May 6, 2019), Jan. 8, 2007, 8 pages.
NVIDIA, "PDK User's Guide: Preface Personal Media Device," Manual, published Sep. 4, 2007, 39 pages.
NVIDIA, "NVIDIA and ASUS Deliver World's First Notebook with Windows Sideshow Secondary Display," Press Release, available at https://www.nvidia.com/object/IO_38772.html (retrieved May 6, 2019), Jan. 8, 2007, 5 pages.
NVIDIA, "NVIDIA® Preface™ Platform Enables Windows Vista On The Go," Press Release, available at https://www.nvidia.com/object/IO_38775.html (retrieved May 6, 2019), Jan. 8, 2007, 5 pages.
Pradeep, "Dell's New Latitude 7400 2-in-1 Can Detect Your Presence and Automatically Wake the System," MSPowerUser, Jan. 4, 2019, available at https://mspoweruser.com/dells-new-latitude-7400-2-in-1-can-detect-your-presence-and-automatically-wake-the-system/, 20 pages.
United States Patent and Trademark Office, "Advisory Action," issued in connection with U.S. Appl. No. 14/866,894, dated Aug. 17, 2020, 9 pages.
United States Patent and Trademark Office, "Advisory Action," issued in connection with U.S. Appl. No. 14/866,894, dated Nov. 5, 2019, 6 pages.
United States Patent and Trademark Office, "Corrected Notice of Allowability," issued in connection with U.S. Appl. No. 16/586,225, dated Dec. 16, 2021, 3 pages.
United States Patent and Trademark Office, "Corrected Notice of Allowability," issued in connection with U.S. Appl. No. 16/586,225, dated Mar. 16, 2022, 3 pages.
United States Patent and Trademark Office, "Final Office Action," issued in connection with U.S. Appl. No. 14/866,894, dated Jul. 29, 2019, 18 pages.
United States Patent and Trademark Office, "Final Office Action," issued in connection with U.S. Appl. No. 14/866,894, dated Jun. 23, 2020, 17 pages.
United States Patent and Trademark Office, "Final Office Action," issued in connection with U.S. Appl. No. 14/866,894, dated May 11, 2021, 17 pages.
United States Patent and Trademark Office, "Non Final Office Action," issued in connection with U.S. Appl. No. 14/866,894, dated Feb. 21, 2020, 17 pages.
United States Patent and Trademark Office, "Non Final Office Action," issued in connection with U.S. Appl. No. 14/866,894, dated Oct. 8, 2020, 18 pages.
United States Patent and Trademark Office, "Non-Final Office Action," issued in connection with U.S. Appl. No. 14/866,894, dated Dec. 14, 2018, 12 pages.
United States Patent and Trademark Office, "Non-Final Office Action," issued in connection with U.S. Appl. No. 16/586,225, dated Jun. 15, 2021, 14 pages.
United States Patent and Trademark Office, "Non-Final Office Action," issued in connection with U.S. Appl. No. 16/725,467, dated Apr. 7, 2022, 19 pages.
United States Patent and Trademark Office, "Non-Final Office Action," issued in connection with U.S. Appl. No. 16/728,899, dated Dec. 8, 2021, 9 pages.
United States Patent and Trademark Office, "Notice of Allowance and Fee(s) Due," issued in connection with U.S. Appl. No. 14/866,894, dated Jul. 30, 2021, 8 pages.
United States Patent and Trademark Office, "Notice of Allowance and Fee(s) Due," issued in connection with U.S. Appl. No. 16/421,217, dated Mar. 9, 2022, 6 pages.
United States Patent and Trademark Office, "Notice of Allowance," issued in connection with U.S. Appl. No. 16/586,225, dated Dec. 8, 2021, 6 pages.

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200366556A1 (en) * 2014-05-19 2020-11-19 Ebay Inc. Phone thermal context
US11949556B2 (en) * 2014-05-19 2024-04-02 Ebay Inc. Phone thermal context
US20220334620A1 (en) 2019-05-23 2022-10-20 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US11782488B2 (en) 2019-05-23 2023-10-10 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US11874710B2 (en) 2019-05-23 2024-01-16 Intel Corporation Methods and apparatus to operate closed-lid portable computers
US11543873B2 (en) 2019-09-27 2023-01-03 Intel Corporation Wake-on-touch display screen devices and related methods
US20200133374A1 (en) * 2019-11-11 2020-04-30 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US11733761B2 (en) * 2019-11-11 2023-08-22 Intel Corporation Methods and apparatus to manage power and performance of computing devices based on user presence
US11809535B2 (en) 2019-12-23 2023-11-07 Intel Corporation Systems and methods for multi-modal user device authentication
US11966268B2 (en) 2019-12-27 2024-04-23 Intel Corporation Apparatus and methods for thermal management of electronic user devices based on user activity
US20210298206A1 (en) * 2020-03-17 2021-09-23 International Business Machines Corporation Intelligently deployed cooling fins
US11751360B2 (en) * 2020-03-17 2023-09-05 International Business Machines Corporation Intelligently deployed cooling fins

Also Published As

Publication number Publication date
CN113050774A (en) 2021-06-29
US20200133358A1 (en) 2020-04-30
EP3865977A1 (en) 2021-08-18
US11966268B2 (en) 2024-04-23
US20220350385A1 (en) 2022-11-03

Similar Documents

Publication Publication Date Title
US11966268B2 (en) Apparatus and methods for thermal management of electronic user devices based on user activity
US11543873B2 (en) Wake-on-touch display screen devices and related methods
US10671231B2 (en) Electromagnetic interference signal detection
US11733761B2 (en) Methods and apparatus to manage power and performance of computing devices based on user presence
US10141929B2 (en) Processing electromagnetic interference signal using machine learning
CN104144377B9 (en) The low-power of voice activation equipment activates
CN107643921A (en) For activating the equipment, method and computer-readable recording medium of voice assistant
US20220147142A1 (en) Smart display panel apparatus and related methods
CN113038470A (en) System and method for multi-mode user equipment authentication
US10101869B2 (en) Identifying device associated with touch event
US20210149465A1 (en) Thermal management systems for electronic devices and related methods
EP3335099B1 (en) Electromagnetic interference signal detection
US20230401486A1 (en) Machine-learning based gesture recognition
US20220350481A1 (en) Touch control surfaces for electronic user devices and related methods
CN110602197A (en) Internet of things control device and method and electronic equipment
US20220164011A1 (en) Systems and methods for dynamic electronic device temperature threshold adjustment
EP3350681B1 (en) Electromagnetic interference signal detection
US20230413472A1 (en) Dynamic noise control for electronic devices
EP3335317B1 (en) Processing electromagnetic interference signal using machine learning
CN116601522A (en) Ultrasonic detection of user presence
CN117133260A (en) Dynamic noise control for electronic devices

Legal Events

Code  Title and Description
FEPP  Fee payment procedure. Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
STPP  Information on status: patent application and granting procedure in general. Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED
AS    Assignment. Owner name: INTEL CORPORATION, CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MISHRA, COLUMBIA;RUIZ, CARIN;CAO, HELIN;AND OTHERS;SIGNING DATES FROM 20191226 TO 20200210;REEL/FRAME:051884/0873
STPP  Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP  Information on status: patent application and granting procedure in general. Free format text: NON FINAL ACTION MAILED
STPP  Information on status: patent application and granting procedure in general. Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP  Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STPP  Information on status: patent application and granting procedure in general. Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER
STPP  Information on status: patent application and granting procedure in general. Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS
STPP  Information on status: patent application and granting procedure in general. Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID
STPP  Information on status: patent application and granting procedure in general. Free format text: AWAITING TC RESP, ISSUE FEE PAYMENT VERIFIED
STPP  Information on status: patent application and granting procedure in general. Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
STCF  Information on status: patent grant. Free format text: PATENTED CASE