CN111343331B - Embedded computing device management - Google Patents

Embedded computing device management

Info

Publication number
CN111343331B
CN111343331B (granted publication of application CN201911297659.7A)
Authority
CN
China
Prior art keywords
performance processing
processing means
user
performance
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911297659.7A
Other languages
Chinese (zh)
Other versions
CN111343331A
Inventor
埃里克·林德曼
于尔基·乌西塔洛
蒂莫·埃里克松
亚里·阿卡凯拉
迈克尔·米耶蒂宁
尼科·克纳佩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suunto Oy
Original Assignee
Amer Sports Digital Services Oy
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US application 16/223,143 (granted as US11874716B2)
Application filed by Amer Sports Digital Services Oy
Publication of CN111343331A
Application granted granted Critical
Publication of CN111343331B

Classifications

    • H04M1/72451 User interfaces with means for adapting the functionality of the device according to schedules, e.g. using calendar applications
    • G06F11/3442 Recording or statistical evaluation of computer activity for planning or managing the needed capacity
    • G06F1/3293 Power saving by switching to a less power-consuming processor, e.g. sub-CPU
    • G04G19/00 Electric power supply circuits specially adapted for use in electronic time-pieces
    • G04G19/12 Arrangements for reducing power consumption during storage
    • G04G21/04 Input or output devices integrated in time-pieces using radio waves
    • G04G21/08 Touch switches specially adapted for time-pieces
    • G04R20/02 Setting the time according to time information carried by a radio signal sent by a satellite, e.g. GPS
    • G06F1/163 Wearable computers, e.g. on a belt
    • G06F1/3206 Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F1/324 Power saving by lowering clock frequency
    • G06F1/3243 Power saving in microcontroller unit
    • G06F11/3013 Monitoring arrangements where the computing system is an embedded system
    • G06F11/3051 Monitoring the configuration of the computing system or of a computing system component
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F9/3877 Concurrent instruction execution using a slave processor, e.g. coprocessor
    • G06F9/4893 Scheduling strategies for dispatcher taking into account power or heat criteria
    • G06F9/5044 Allocation of resources to service a request, considering hardware capabilities
    • G06F9/5094 Allocation of resources taking into account power or heat criteria
    • G06F9/542 Event management; Broadcasting; Multicasting; Notifications
    • G06Q10/109 Time management, e.g. calendars, reminders, meetings or time accounting
    • G06Q10/1093 Calendar-based scheduling for persons or groups
    • H04B1/385 Transceivers carried on the body, e.g. in helmets
    • H04M1/72469 User interfaces for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04W52/0254 Power saving in terminal devices, detecting a user operation, a tactile contact or a motion of the device
    • H04W52/0258 Power saving in terminal devices, controlling an operation mode according to history or models of usage information, e.g. activity schedule or time of day
    • G06F1/3287 Power saving by switching off individual functional units in the computer system
    • G09G2330/021 Power management of displays, e.g. power saving
    • G09G2360/06 Use of more than one graphics processor to process data before displaying to one or more screens
    • G09G2360/08 Power processing, i.e. workload management for processors involved in display operations, such as CPUs or GPUs
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management
    • Y02D30/50 Reducing energy consumption in wire-line communication networks, e.g. low power modes or reduced link rate

Abstract

According to an exemplary aspect of the invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processing core, enable the apparatus to predict a need for a rich media interface based at least in part on a calendar application, and to trigger a start-up of a higher performance processing device from among lower and higher performance processing devices of the apparatus at a point in time selected according to the prediction.

Description

Embedded computing device management
Technical Field
The present invention relates to the field of user equipment, for example user equipment implementing multi-core or multi-chip embedded solutions.
Background
The user interface (UI) enables a user to interact with a device such as a car, a smartphone, an automated teller machine, or an aircraft control system. Different user interfaces are suitable for different purposes. For example, when a user employs the device to perform actions that may put people at risk, the quality and quantity of information presented through the user interface should be sufficient to make use of the device safe.
The user interface may be based on presenting information to the user and receiving input from the user. The information may be presented using an output device such as a display, e.g. an organic light-emitting diode (OLED) display. Input may be received from the user through various input devices, such as a touchscreen display, buttons, a microphone arranged to capture the user's voice, and/or a joystick that the user can manipulate.
The conventional user interface of a watch comprises a minute hand and an hour hand that rotate over the dial to indicate the time of day. A digital watch may comprise, for example, a liquid crystal display (LCD) that indicates the time of day digitally.
A smart watch may include a touchscreen, such that the display portion of the touchscreen acts as the output device of the user interface and the touch-sensitive portion acts as its input device. Using a smart watch presents challenges, because useful applications often require larger screens to display a useful amount of information, using a font large enough to read, without making the device larger.
Calendar applications can facilitate the planning of meetings, trips, and resources. Typically, a user uses a personal computer with a large screen to access a calendar application, for example, via a Linux or Windows operating system. The user can then, for example, see the entire work week at a glance.
An embedded device typically comprises an object that contains, and may enclose, the embedded computing system. An embedded computing system may be designed with a particular use in mind, or it may be at least partially general-purpose, enabling users to install software on it. The embedded computing system may be based on a microcontroller or a microprocessor CPU, for example.
The embedded device may include one or more processors, a user interface, and a display such that a user may interact with the device using the user interface. The user interface may comprise, for example, buttons. The embedded device may include a connectivity function configured to communicate with a communication network, such as a wireless communication network. The embedded device may be enabled to receive information from such a communication network, for example, relating to the current time and the current time zone.
More complex embedded devices, such as cellular telephones, may allow a user to install applications into memory comprised in the device, such as solid-state memory. Compared with desktop or portable computers, embedded devices are often resource-constrained: storage capacity may be more limited, the computing performance of the processor may be lower, and energy may have to be drawn from a battery. The battery may be small and may be rechargeable.
Saving battery power is a critical task when designing embedded devices. Lower current consumption may extend the interval between battery charges. For example, a smartphone that can be used for an entire day without requiring charging is very advantageous, since it enables the user to charge the phone at night while enjoying uninterrupted use during the day.
Battery resources may be conserved by adjusting the clock frequency of the processor between a maximum clock frequency and a lower clock frequency (e.g., half the maximum clock frequency). Another way to conserve battery power is to have the display of the embedded device turn itself off when the device is not in use, since displaying content on the display consumes energy to cause the display to emit light that is visible to humans.
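As an illustration of the power-saving measures described above, the following Python sketch models a policy that halves the processor clock while the device is idle and switches the display off after an idle timeout. The constants and function names are illustrative assumptions, not values from the patent.

```python
# Hypothetical power policy, not taken from the patent text.
MAX_CLOCK_HZ = 96_000_000      # assumed maximum processor clock frequency
DISPLAY_TIMEOUT_S = 10         # assumed idle timeout before the display turns off

def power_policy(idle_seconds: int) -> dict:
    """Return the clock frequency and display state for a given idle time."""
    active = idle_seconds == 0
    return {
        # run at full speed only while the user is active
        "clock_hz": MAX_CLOCK_HZ if active else MAX_CLOCK_HZ // 2,
        # switch the display off once the idle timeout has elapsed
        "display_on": idle_seconds < DISPLAY_TIMEOUT_S,
    }
```

A real device would derive the idle time from input events and would typically offer several intermediate clock steps rather than a single halved frequency.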
Disclosure of Invention
The present invention generally relates to a user interface for presenting sunrise and sunset times in a new manner.
The invention is defined by the features of the independent claims. Some embodiments are defined in the dependent claims.
According to a first aspect of the present invention, there is provided an apparatus comprising at least one processing core, at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processing core, enable the apparatus at least to predict a need for a rich media interface based at least in part on a calendar application, and to trigger a start of a higher performance processing device from among lower performance processing devices and higher performance processing devices of the apparatus at a point in time selected in accordance with the prediction.
According to a second aspect of the invention, there is provided a method comprising: causing the device to predict a need for a rich media interface based at least in part on the calendar application; and triggering the start-up of a higher-performance processing device from among the lower-performance processing devices and the higher-performance processing devices of the apparatus at a point in time selected in accordance with the prediction.
According to a third aspect of the invention, there is provided a non-transitory computer readable medium having stored thereon a set of computer readable instructions which, when executed by at least one processor, enable an apparatus to predict a need for a rich media interface based at least in part on a calendar application and trigger a start of a higher performance processing means from among lower performance processing means and higher performance processing means of the apparatus at a point in time selected according to the prediction.
According to a fourth aspect of the present invention, there is provided a computer program configured to perform the method according to the second aspect when run on a processing device.
Drawings
FIG. 1A illustrates a system in accordance with at least some embodiments of the invention;
FIG. 1B illustrates a system in accordance with at least some embodiments of the invention;
FIG. 2 illustrates a user interface in accordance with at least some embodiments of the invention;
FIG. 3 illustrates an exemplary apparatus capable of supporting at least some embodiments of the invention;
FIG. 4 illustrates signaling in accordance with at least some embodiments of the invention;
FIG. 5 is a flow diagram of a method in accordance with at least some embodiments of the invention;
FIG. 6 illustrates an exemplary system capable of supporting at least some embodiments of the invention;
FIG. 7 illustrates a first exemplary apparatus capable of supporting at least some embodiments of the invention;
FIG. 8 illustrates signaling in accordance with at least some embodiments of the invention;
FIG. 9 is a first flowchart of a first method in accordance with at least some embodiments of the present invention; and
FIG. 10 is a state transition diagram in accordance with at least some embodiments of the present invention.
Detailed Description
By presenting information along a time axis, the device may allow a user to gather information related to location and time from a screen of limited size. In particular, the user may scroll the content on the screen along the time axis to view past and/or future events, which may originate from the calendar application or from the natural world around the user. Combining an indication of sunset or sunrise with the timeline may enable the user to plan activities according to the natural light available. The time axis provides a conceptually effective way of organizing information, so that only information currently of interest to the user needs to be shown on a screen of limited size.
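The timeline idea above can be sketched as follows: calendar events and natural events such as sunrise and sunset are merged into one chronologically ordered list, and only the window the user has scrolled to is shown. The event representation here is a hypothetical simplification, not taken from the patent.

```python
# Illustrative sketch: events are (time, label) tuples; names are assumptions.

def build_timeline(calendar_events, sun_events):
    """Merge events from both sources into one list ordered by time."""
    return sorted(calendar_events + sun_events, key=lambda event: event[0])

def visible_window(timeline, start, end):
    """Return only the events inside the currently scrolled time window."""
    return [event for event in timeline if start <= event[0] <= end]
```

On a small screen, `visible_window` would be called with the bounds of the scrolled view, so the device renders only the events currently of interest.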
Providing an embedded device with two or more processor cores, wherein at least some of the processor cores are capable of controlling the display of the device, enables energy savings in the event that a lower performing processor core is configured to switch a higher performing processor core to and from a sleep state. The sleep state may include, for example, setting the clock frequency of the higher performance processing core to zero. In addition to or instead of setting the clock frequency of the higher performance processing core to zero in the sleep state, the memory refresh rate of the memory used by the higher performance core may be set to zero. Instead of zero, a lower non-zero frequency may be used as the clock frequency and/or the memory refresh frequency. In some embodiments, higher performance processing cores may employ higher density memory technologies, such as Double Data Rate (DDR) memory, while lower performance processing cores may employ lower density memory technologies, such as Static Random Access Memory (SRAM) memory. In the sleep state, the sleeping processing core (or more generally the processing unit) may be powered down. Instead of a processor core, in some embodiments, the entire processor may transition to a sleep state. An advantage of sleeping the entire processor is that circuitry outside the cores in the processor also sleeps, further reducing current consumption.
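A minimal sketch of the sleep-state control described above, in which the lower-performance unit puts the higher-performance unit to sleep by zeroing (or lowering) its clock and memory-refresh frequencies, or by powering it down entirely. The class and field names are illustrative assumptions, not from the patent.

```python
# Hypothetical model of a higher-performance processing unit's power states.

class HighPerformanceUnit:
    def __init__(self, clock_hz: int, refresh_hz: int):
        self.clock_hz = clock_hz        # processor clock frequency
        self.refresh_hz = refresh_hz    # e.g. DDR memory refresh frequency
        self.powered = True

    def enter_sleep(self, clock_hz: int = 0, refresh_hz: int = 0,
                    power_down: bool = False) -> None:
        # zero (or reduce) the clock and memory-refresh frequencies;
        # optionally cut power to the whole unit
        self.clock_hz = clock_hz
        self.refresh_hz = refresh_hz
        self.powered = not power_down

    def wake(self, clock_hz: int, refresh_hz: int) -> None:
        # restore power and operating frequencies
        self.powered = True
        self.clock_hz = clock_hz
        self.refresh_hz = refresh_hz
```

The `power_down` option corresponds to sleeping the entire processor, which also stops circuitry outside the cores and thus saves the most current, at the cost of a longer wake-up.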
The device may predict a need for a rich media interface based at least in part on a calendar application, and trigger activation of a higher-performance processing device from among lower-performance and higher-performance processing devices of the device at a point in time selected according to the prediction. The prediction may be based on a particular calendar event in the calendar application that includes an indication of the application to be used to handle the calendar event. The prediction may comprise comparing this indication to a list of applications and their associated media requirements, such that the higher-performance processing device is triggered when the lower-performance processing device cannot meet the media requirements of the application needed to process the particular calendar event. The trigger may occur at a point in time before the start time of the calendar event, selected such that, according to the list, the higher-performance processing device has time to start up and the application is launched in time for that start time. Thus, the trigger may precede the start time of the particular calendar event by a period equal to the sum of the start-up time of the higher-performance processing device and the launch delay of the application on the higher-performance processing device. The application is then available on the higher-performance processing device at the appropriate time, rather than prematurely, which would waste resources. The launch delay of each application on the higher-performance processing device may be recorded in the list of applications and their associated media requirements. The list may be updated when a new application is installed, and the device may determine an application's launch delay experimentally, without user intervention, while the user is not using the device.
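The scheduling arithmetic described above can be sketched as follows: the wake-up trigger precedes the calendar event by the start-up time of the higher-performance unit plus the application's launch delay, and fires only when the lower-performance unit cannot meet the application's media requirements. All names and figures below are hypothetical examples, not values from the patent.

```python
# Hypothetical application list with media requirements and launch delays.
APP_TABLE = {
    # app name: (media requirement level, launch delay on high-perf unit, seconds)
    "video_call": (3, 4),
    "clock_face": (1, 0),
}
LOW_PERF_MEDIA_LEVEL = 1   # assumed media capability of the low-performance unit
HIGH_PERF_BOOT_S = 6       # assumed start-up time of the high-performance unit

def wakeup_time(event_start_s: int, app: str):
    """Return when to trigger the high-performance unit, or None if not needed."""
    requirement, launch_delay = APP_TABLE[app]
    if requirement <= LOW_PERF_MEDIA_LEVEL:
        return None  # the low-performance unit suffices; no early wake-up
    # trigger early by boot time plus the application's launch delay
    return event_start_s - (HIGH_PERF_BOOT_S + launch_delay)
```

With these example figures, a video call scheduled for `t` would trigger the higher-performance unit at `t - 10` seconds, while a plain clock face never wakes it at all.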
FIG. 1A illustrates a system in accordance with at least some embodiments of the inventions. The system includes a device 110, which device 110 may include, for example, a smart watch, a digital watch, a smart phone, a tablet device, or another type of suitable device. Device 110 includes a display, which may include, for example, a touch screen display. The size of the display may be limited. The device 110 may be powered by a rechargeable battery, for example. An example of a display of limited size is a display worn on the wrist.
Device 110 may be communicatively coupled to a communication network. For example, in fig. 1A, device 110 is coupled with base station 120 via wireless link 112. Base stations 120 may include cellular or non-cellular base stations, where a non-cellular base station may be referred to as an access point. Examples of cellular technologies include Wideband Code Division Multiple Access (WCDMA) and Long Term Evolution (LTE), while examples of non-cellular technologies include Wireless Local Area Network (WLAN) and Worldwide Interoperability for Microwave Access (WiMAX). The base station 120 may be coupled with a network node 130 via a connection 123. For example, connection 123 may be a wired connection. The network node 130 may comprise, for example, a controller or a gateway device. Network node 130 may interface with a network 140 via connection 134, which network 140 may include, for example, the internet or a corporate network. Network 140 may be coupled with other networks via connection 141. In some embodiments, device 110 is not configured to couple with base station 120.
Device 110 may be configured to receive satellite positioning information from a constellation of satellites 150 via a satellite link 151. The satellite constellation may include, for example, the Global Positioning System (GPS) or galileo constellation. Satellite constellation 150 may include more than one satellite, however only one satellite is shown in FIG. 1A for clarity. Likewise, receiving positioning information over satellite link 151 may include receiving data from more than one satellite.
In embodiments where device 110 is unable to receive data from a satellite constellation, device 110 may obtain positioning information by interacting with a network that includes base station 120. For example, a cellular network may locate devices in various ways, such as trilateration, multilateration, or based on the identity of the base station to which the device is connected. Likewise, a non-cellular base station or access point may know its own location and provide it to device 110, thereby enabling device 110 to position itself to within the communication range of the access point.
For example, device 110 may be configured to obtain the current time from satellite constellation 150, base station 120, or from a user request. Once the device 110 has an estimate of the current time and its location, the device 110 may, for example, consult a look-up table to determine how much time remains until sunset and/or sunrise.
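A look-up of the kind described can be sketched as below. The table contents and the (month, latitude band) keying are illustrative assumptions; an actual device would store a finer-grained table.

```python
# Illustrative lookup table mapping (month, latitude band) to local
# sunrise and sunset times; values and granularity are assumptions.
SUN_TABLE = {
    (6, "60N"): ("03:54", "22:26"),
    (12, "60N"): ("09:24", "15:13"),
}


def _to_minutes(hhmm: str) -> int:
    """Convert an 'HH:MM' string to minutes since midnight."""
    h, m = hhmm.split(":")
    return int(h) * 60 + int(m)


def minutes_until_sunset(now: str, month: int, lat_band: str) -> int:
    """Minutes remaining until sunset at the current location, consulting
    the lookup table; negative once the sun has already set."""
    _, sunset = SUN_TABLE[(month, lat_band)]
    return _to_minutes(sunset) - _to_minutes(now)
```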
The device 110 may be configured to determine a sunset time and/or a sunrise time, and to obtain event information from a calendar application. The device 110 may be further configured to present to the user a representation of at least some of these events, arranged relative to a time axis, to enable the user to understand how the calendar events relate to each other and to sunset and/or sunrise. In this way the user may, for example, plan to perform tasks during daylight. Using a time axis, or timeline, the relevant information may be displayed to the user in chronological order on a screen of limited size.
FIG. 1B illustrates a system in accordance with at least some embodiments of the invention. The same reference numerals denote the same structures as in fig. 1A. The embodiment of FIG. 1B includes an auxiliary device 110x.
Device 110 may be communicatively coupled, e.g., communicatively paired, with auxiliary device 110x. The communicative coupling or pairing is represented in fig. 1B as connection 111, which may be wireless, as shown, or wired, depending on the embodiment. The auxiliary device 110x may comprise, for example, a smartphone, tablet computer, or other computing device. Auxiliary device 110x may be a device the owner of device 110 uses to consume media, communicate, or interact with applications. The auxiliary device 110x may be equipped with a larger display than device 110, which may make the auxiliary device 110x preferable to the user when complex interactions with applications are required, since a larger screen enables a more detailed presentation of interaction options. In some embodiments, such as that shown in FIG. 1A, no auxiliary device 110x is provided.
In some embodiments, in the presence of the auxiliary device 110x, the device 110 is configured to use the connection capabilities of the auxiliary device 110x. For example, device 110 may access a network via auxiliary device 110x. In these embodiments, connectivity to base station 120 need not be provided to device 110, since device 110 may access network resources via connection 111 while auxiliary device 110x has a connection with base station 120. Such a connection is shown in FIG. 1B as connection 112x. For example, device 110 may comprise a smart watch and auxiliary device 110x may comprise a smartphone, which may have connectivity to a cellular and/or non-cellular data network. Also, in some embodiments, when device 110 lacks its own satellite positioning receiver, device 110 may receive satellite positioning information, or positioning information derived from it, via auxiliary device 110x. The satellite connection of the auxiliary device is shown in fig. 1B as connection 151x.
In some embodiments, the device 110 may have some connectivity of its own and be configured to use that connectivity as well as the connectivity provided by the auxiliary device 110x. For example, device 110 may include a satellite receiver such that device 110 can obtain satellite positioning information directly from satellite constellation 150, while obtaining a network connection with base station 120 via auxiliary device 110x.
FIG. 2 illustrates a user interface in accordance with at least some embodiments of the invention. Display 200 may comprise, for example, the display included in device 110 shown in fig. 1A. A time axis 210, which may be referred to as a timeline, is shown on the display 200. In the middle of the timeline is a current time indicator 220, which is optional. The shape of the display 200 need not be the same as that shown in fig. 2.
Events are represented along the timeline by symbols 240, 250, and 260. Each of the symbols 240, 250, and 260 corresponds to a calendar event or a dynamic event, such that the time at which the event occurs determines its position on or relative to the timeline, where the respective symbol is displayed. For example, in FIG. 2, the events represented by symbols 240 and 250 have occurred, while the event corresponding to symbol 260 will occur in the future. The user interface may communicate with the calendar application to obtain information therefrom characterizing calendar events so that they can be represented as symbols along the time axis. Naturally, the number of events need not be three as shown in fig. 2, but rather depends on the dynamic and calendar inputs.
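The placement rule, in which the time of an event determines where its symbol sits relative to the timeline, can be sketched as a simple linear mapping. Function and parameter names here are illustrative, not from the patent.

```python
from datetime import datetime
from typing import Optional


def symbol_x(event_time: datetime, view_start: datetime,
             view_end: datetime, width_px: int) -> Optional[int]:
    """Map an event's time to a horizontal pixel position on the timeline.

    Events outside the visible span return None and are simply not drawn;
    the mapping is linear between the start and end of the visible view.
    """
    if not (view_start <= event_time <= view_end):
        return None
    span_s = (view_end - view_start).total_seconds()
    offset_s = (event_time - view_start).total_seconds()
    return round(offset_s / span_s * width_px)
```

On a 240-pixel-wide view from 08:00 to 20:00, a 14:00 event lands at the midpoint, pixel 120.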
Sunrise time 232 and sunset time 234 are represented along time axis 210. In various embodiments, sunrise or sunset times may be omitted from the user interface. Optionally, an arc 230 may be shown, which represents the course of the sun in the sky. The position of the sun 236 may be represented along an arc. Alternatively or additionally, the position or phase of the sun may be represented proximate to the time indicator 220 or in another suitable manner. Device 110 may determine the sunrise time and the sunset time, for example, based on the location information and a table stored, for example, in a memory accessible to device 110.
The device 110 running the user interface shown in fig. 2 may be configured to enable a user to scroll content on the screen along the timeline, for example by providing a sliding interaction on a touch screen that displays the user interface. Likewise, the user may be enabled to zoom in and/or out, for example by providing a pinch interaction on the touch screen. Another possibility for scrolling and/or zooming the user interface is a rotatable hardware element provided in the device 110. For example, a rotatable hardware element may be partially retractable, such that rotating it when it is not retracted provides a scrolling interaction with the user interface, and rotating it when it is retracted provides a zooming interaction. Such interaction may be useful for smaller screen sizes, where the user's finger may be comparable in size to the screen itself. An exemplary rotatable hardware element is described in U.S. patent application No. 12/650,303, publication No. US 2010/0187074.
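The two interactions of the rotatable element can be sketched as below. The step sizes (15 minutes of scroll per tick, 10 % zoom per tick) and the minutes-based view representation are assumptions for illustration only.

```python
def apply_crown(view, retracted, ticks):
    """Apply rotation of a partially retractable hardware element to the
    timeline view, given as (start, end) in minutes on the time axis.

    Rotation scrolls the view when the element is not retracted, and zooms
    it when the element is retracted.
    """
    start, end = view
    if not retracted:                                # scroll along the timeline
        return (start + 15 * ticks, end + 15 * ticks)
    centre = (start + end) / 2                       # zoom about the view centre
    half = (end - start) / 2 * (0.9 ** ticks)        # positive ticks zoom in
    return (centre - half, centre + half)
```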
The device 110 may be configured to determine at least one dynamic event. Dynamic events include events that occur at points in time that depend on the location of the device 110. A dynamic event may occur at a point in time that depends on both the location of the device 110 and a predetermined location. The predetermined location may comprise, for example, a point of interest, and may be defined by a user. The predetermined location may include, for example, the user's home, a base camp, a hotel, a hospital, or another type of location. For example, the device 110 may determine when the user needs to start walking, cycling, or driving to the predetermined location so that the user will arrive there before sunset. To determine this time, the device 110 may learn the sunset time from the location information and, for example, a look-up table stored in the device 110. The device 110 may then determine a route from the current location of the device 110 to the predetermined location and determine the length of the route, for example based at least in part on interaction with a mapping application. The time required to traverse the route, i.e. the travel time, may then be determined based on the user's movement speed; device 110 may be preconfigured with the user's movement speed, or device 110 may determine the movement speed from the user's past behaviour. The time of the dynamic event may then be determined as a time that precedes the sunset time by the travel time.
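The dynamic-event computation reduces to subtracting the travel time, route length divided by movement speed, from the sunset time. A minimal sketch, with illustrative names:

```python
from datetime import datetime, timedelta


def dynamic_event_time(sunset: datetime, route_length_m: float,
                       speed_m_per_s: float) -> datetime:
    """Time at which the user must set off so as to reach the predetermined
    location by sunset: the sunset time minus the travel time, where the
    travel time is the route length divided by the movement speed."""
    travel = timedelta(seconds=route_length_m / speed_m_per_s)
    return sunset - travel
```

For a 6 km route walked at 1.5 m/s before an 18:00 sunset, the dynamic event falls at 16:53:20.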
As an alternative to sunset, dynamic events may be determined based on meteorological events, e.g. rainfall. To do this, the device 110 may obtain a locally relevant weather forecast and use it, instead of the sunset time, to infer when the user needs to set off for the predetermined location. As another example, a dynamic event may be based on the departure time of a public transportation means, such as a train or an airplane. The user may thus be provided with a visual indication of how long he has before he needs to set off for the train station or airport.
The device 110 may be configured to trigger an alarm at the time of a dynamic event, for example by providing a vibration or other kind of prompt to the user. Thus, the safety of people roaming in nature may be improved, since they can be warned in time to head for the predetermined location so as to arrive there before darkness, rain, or another event.
Although FIG. 2 shows a view of the user interface in which both sunrise and sunset are visible, the zoom view and/or the scroll view may display only one of the sunrise and sunset, or indeed none of them when the view is zoomed into a portion of the timeline that is between the sunrise and sunset. Typically, an indication of the position or phase of the sun may be provided to enable the user to know how long it is to sunset or sunrise. Such an indication may take the form of an arc 230, an angle or a tangent, or another suitable indication.
The user may select a symbol, such as symbol 250, and interact with it to cause device 110 to perform an action related to the calendar event corresponding to symbol 250. For example, in response to the user touching symbol 250, or indeed another symbol, device 110 may display details such as the location, attendees, or duration of the calendar event on the screen. These details may be displayed below the timeline in the timeline view, or alternatively the timeline view may be replaced by the details, for example for five or ten seconds. In some embodiments, the user is able to interact with an application related to a calendar event. For example, the user may join a teleconference by interacting with symbol 250 and then interacting with another user interface element displayed, e.g., together with the details.
When the user interacts with the symbol corresponding to the dynamic event, the user may be presented with information related to the dynamic event, such as a map showing the determined route, or instructions on how to reach the predefined location.
The device 110 may be configured to detect a device context. For example, device 110 may detect that a user is working, or interacting with a work-related application, in response to which device 110 may cause calendar events related to work to be presented in the timeline user interface. While the user is working, calendar events unrelated to work may be suppressed, meaning that the symbols corresponding to them are not displayed in the user interface. As another example, dynamic events related to public transportation may be presented in the user interface as the user moves about a city centre. As yet another example, dynamic events related to sunset or rain may be presented, and work-related events suppressed, as the user roams in nature. In general, events within the context may be presented in the user interface, while events outside the context may be suppressed and not graphically presented.
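The suppression logic can be sketched as a simple filter. The per-event `contexts` tag and the override flag are an assumed data model, not specified in the text.

```python
def visible_events(events, device_context, override=False):
    """Suppression sketch: keep only events tagged with the detected device
    context, unless the user overrides the suppression to see everything.

    Each event is assumed to carry a 'contexts' set naming the device
    contexts in which it is relevant.
    """
    if override:  # user chose to view all events on the timeline
        return list(events)
    return [e for e in events if device_context in e["contexts"]]
```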
The device 110 may be configured to autonomously detect the device context and suppress dynamic and/or calendar events outside of that context without user input. Using a user interface interaction element, the user may override the suppression to view all calendar and/or dynamic events on the timeline, or reconfigure the device context in case device 110 detected it erroneously. An advantage of this suppression is that the limited-size screen of the device is used to display the more relevant information, rather than less relevant information that would clutter the view.
Device 110 may be configured to provide a display in at least two modes: a lean media mode and a rich media mode. The lean media mode may be provided by a lower-performance processing device in device 110, while the rich media mode may require device 110 to activate a higher-performance processing device. The higher-performance processing device may consume more battery resources than the lower-performance processing device. The device 110 may be configured to predictively activate the higher-performance processing device in response to determining that a calendar event whose processing will require the rich media mode will soon occur.
FIG. 3 illustrates an exemplary device capable of supporting at least some embodiments of the present invention. There is shown a device 300, which may comprise, for example, the embedded device 110 shown in fig. 1A. Processor 310 is included in device 300 and may comprise, for example, a single-core or multi-core processor, where a single-core processor includes one processing core and a multi-core processor includes more than one processing core. Processor 310 may include more than one processor or processing unit. The processor 310 may include at least one Application Specific Integrated Circuit (ASIC) and/or at least one Field Programmable Gate Array (FPGA). Processor 310 may be a means in device 300 for performing method steps, and may be configured, at least in part, by computer instructions, to perform actions.
Device 300 may include memory 320. Memory 320 may include random access memory and/or persistent memory. The memory 320 may include at least one RAM chip, and may include, for example, solid state, magnetic, optical, and/or holographic memory. The memory 320 may be at least partially accessible to the processor 310, and may be at least partially included in the processor 310. The memory 320 may be a means for storing information. Memory 320 may store computer instructions that processor 310 is configured to execute. When computer instructions configured to cause processor 310 to perform certain actions are stored in memory 320, and device 300 as a whole is configured to run under the direction of processor 310 using the computer instructions from memory 320, processor 310 and/or at least one processing core thereof may be considered to be configured to perform said certain actions. Memory 320 may be at least partially external to device 300 but accessible to device 300.
The device 300 may include a transmitter 330. Device 300 may include a receiver 340. Transmitter 330 and receiver 340 may be configured to transmit and receive information, respectively, according to at least one cellular or non-cellular standard. The transmitter 330 may include more than one transmitter. Receiver 340 may include more than one receiver. Transmitter 330 and/or receiver 340 may be configured to operate in accordance with, for example, global system for mobile communications (GSM), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), IS-95, Wireless Local Area Network (WLAN), ethernet, and/or Worldwide Interoperability for Microwave Access (WiMAX) standards.
The device 300 may include a Near Field Communication (NFC) transceiver 350. NFC transceiver 350 may support at least one NFC technology, such as NFC, Bluetooth, Wibree, or similar technologies.
Device 300 may include a User Interface (UI) 360. UI 360 may include at least one of a display, a keyboard, a touch screen, a vibrator configured to signal a user by vibrating device 300, a speaker, and a microphone. The user can operate the device 300, for example, via the UI 360, to interact with the timeline-based view.
The device 300 may include a user identity module 370, or be arranged to accept one. The user identity module 370 may comprise, for example, a Subscriber Identity Module (SIM) card installable in the device 300. The user identity module 370 may include information identifying a subscription of the user of device 300, and may include cryptographic information usable to verify the identity of the user of device 300 and/or to facilitate encryption of communicated information and billing of the user for communications performed via device 300.
The processor 310 may be equipped with a transmitter arranged to output information from the processor 310 to other devices comprised in the device 300 via electrical leads internal to the device 300. Such a transmitter may comprise a serial bus transmitter, for example, configured to output information to memory 320 via at least one electrical lead for storage therein. As an alternative to a serial bus, the transmitter may comprise a parallel bus transmitter. Likewise, processor 310 may include a receiver configured to receive information from other devices included in device 300 into processor 310 via electrical leads internal to device 300. Such a receiver may comprise a serial bus receiver, for example, arranged to receive information from receiver 340 via at least one electrical lead for processing in processor 310. As an alternative to a serial bus, the receiver may comprise a parallel bus receiver.
Device 300 may include other components not shown in fig. 3. For example, where device 300 comprises a smartphone, it may comprise at least one digital camera. Some devices 300 may include a back camera that may be used for digital photography and a front camera that may be used for video telephony. The device 300 may comprise a fingerprint sensor arranged to at least partially authenticate a user of the device 300. In some embodiments, the apparatus 300 lacks at least one of the devices described above. For example, some devices 300 may lack NFC transceiver 350 and/or subscriber identity module 370.
Processor 310, memory 320, transmitter 330, receiver 340, NFC transceiver 350, UI 360, and/or subscriber identity module 370 may be interconnected in a number of different ways by electrical leads internal to device 300. For example, each of the above-described devices may be individually connected to a main bus within device 300 to allow the devices to exchange information. However, as will be understood by a person skilled in the art, this is only one example and various ways of interconnecting at least two of the above-described devices may be chosen depending on the embodiment without departing from the scope of the invention.
Fig. 4 illustrates signaling in accordance with at least some embodiments of the invention. Arranged from left to right on the vertical axis are a satellite constellation 150, a base station 120, a device 110 and an auxiliary device 110 x. Satellite constellation 150, base stations 120, and device 110 correspond to similar components described in connection with fig. 1. Auxiliary device 110x may comprise, for example, a user device equipped with a larger screen than device 110. For example, the auxiliary device 110x may comprise a smartphone or tablet computer. The auxiliary device 110x may be paired with the device 110 using, for example, the bluetooth protocol.
In step 410, device 110 obtains positioning information from satellite constellation 150. The device 110 may use the positioning information to determine its location and determine a sunrise time and a sunset time for the determined location.
In step 420, the device 110 obtains weather information from the base station 120. For example, the device 110 may request and responsively receive weather information related to the determined location of the device 110, such as from a server that the device 110 may reach through the base station 120.
In step 430, the device 110 may determine the time of the dynamic event, for example, based on the sunset time and/or weather information, as described above. The dynamic event may, for example, correspond to a time at which the user needs to start going to a predetermined location to avoid darkness and/or adverse weather.
In step 440, device 110 may provide an alert to the user, for example, via a user interface, a vibrator, and/or a speaker. In optional step 450, an alert may be provided to the user via auxiliary device 110 x.
Fig. 5 is a flow diagram of a method in accordance with at least some embodiments of the invention. The various steps of the illustrated method may be performed, for example, in the device 110, or in a control means configured, when installed in the device, to control the functioning of device 110.
Step 510 includes obtaining at least one of a sunrise time and a sunset time for a current location of the device. Step 520 includes obtaining a plurality of calendar events from a calendar application. Step 530 includes displaying a time axis on the screen and displaying a plurality of symbols corresponding to at least a portion of the plurality of calendar events relative to the time axis. Finally, step 540 includes displaying at least one of the following relative to the time axis: an indication of sunrise associated with a portion of the time axis corresponding to sunrise time, and an indication of sunset associated with a portion of the time axis corresponding to sunset time.
FIG. 6 illustrates an exemplary system capable of supporting at least some embodiments of the invention. Included in the exemplary system of fig. 6 is a device 610, which may include an embedded device such as a smart watch, personal health monitor, cellular telephone, smart phone, or other suitable device.
In the example of fig. 6, the device 610 is configured with multiple communication interfaces. The first communication interface enables the device 610 to receive satellite positioning information from a satellite constellation 640 via a satellite link 614. Examples of suitable satellite positioning constellations include the Global Positioning System (GPS), GLONASS, beidou, and galileo satellite positioning constellations.
The second communication interface enables the device 610 to communicate with a cellular communication system, such as a Wideband Code Division Multiple Access (WCDMA) or Long Term Evolution (LTE) network. Cellular link 612 may be configured to communicate information between device 610 and base station 620. Cellular link 612 may be configured according to the same cellular communication standard that both device 610 and base station 620 support. The base station 620 may be comprised in a cellular radio access network comprising a plurality of base stations. The base station 620 may be arranged to communicate with a core network node 650 via a connection 625. The core network node 650 may comprise, for example, a switch, a mobility management entity, or a gateway. The core network node 650 may be arranged to communicate with another network 670, such as the internet, via a connection 657.
The third communication interface enables the device 610 to communicate with non-cellular communication systems, such as Wireless Local Area Network (WLAN), bluetooth, or Worldwide Interoperability for Microwave Access (WiMAX) systems. Another example is an inductive subsea communication interface. The non-cellular link 613 may be configured to communicate information between the device 610 and the access point 630. The non-cellular link 613 may be configured according to the same non-cellular technology supported by both the device 610 and the access point 630. The access point 630 may be arranged to communicate with the gateway 660 via connection 636. The gateway 660 may be arranged to communicate with another network 670 via a connection 667. Each of the connections 625, 657, 636, and 667 may be wired, or at least partially wireless. Not all of these connections need be of the same type. In certain embodiments, at least one of the first communication interface, the second communication interface, and the third communication interface is absent.
The fourth communication link may enable the device 610 to communicate with mobile devices. For example, the low power wireless interface may be capable of communicating with a mobile device other than the device 610 when the device 610 lacks cellular capabilities and the mobile device has cellular capabilities. One example of a low power wireless interface is Bluetooth Low Energy (BLE) or Bluetooth Smart.
In use, the device 610 may determine a geographic location of the device 610 using satellite positioning information from the satellite constellation 640. For example, the geographic location may be determined from the coordinates. The device 610 may be configured to present a map on a display included in the device 610 on which the determined geographic location of the device 610 is presented. For example, the device 610 may display a map of surrounding streets or features with symbols representing the current location of the device 610 on the map. Providing a map on which the current location of the device 610 is displayed and/or providing navigation instructions may be referred to as a map service.
In some embodiments, the device 610 may provide connectivity services to the user, such as web browsing, instant messaging, and/or email. In some embodiments, the device 610 may be configured to provide connectivity services for its functions and/or applications, including enabling remote access to such functions and/or services over a network such as the Internet. Thus, the device 610 may be trackable on the internet, for example. Such connectivity services may operate over bidirectional communication links such as cellular link 612 and/or non-cellular link 613. In general, the device 610 may provide services such as a map service or a connectivity service to a user through a display.
The device 610 may include two or more processing units, each of which may include one or more identical or heterogeneous processing cores, and/or different volatile and non-volatile memories. For example, the device 610 may include a microprocessor having at least one processing core, and a microcontroller having at least one processing core. The processing cores need not be of the same type; for example, a processing core in the microcontroller may have more limited processing performance and/or lower-performance memory technology than a processing core included in the microprocessor. In some embodiments, a single integrated circuit includes two processing cores: a first processing core having lower processing performance and consuming less power, and a second processing core having higher processing performance and consuming more power. In general, a first of the two processing units may have lower processing performance and consume less power, while a second of the two processing units may have higher processing performance and consume more power. Each processing unit may be enabled to control a display of the device 610. The higher-performance processing unit may be configured to provide a richer visual experience via the display, and the lower-performance processing unit an impaired visual experience. One example of an impaired visual experience is an impaired colour display mode, as opposed to a rich colour display mode; another example is a black-and-white visual experience. An example of a richer visual experience is one that uses colour, represented for example by 16 or 24 bits.
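The split between the two units can be sketched as below. The unit names, the 24-bit versus 1-bit colour depths, and the configuration dictionary are illustrative assumptions.

```python
LOW_PERF = "microcontroller"   # lean mode, microamp-range idle current
HIGH_PERF = "microprocessor"   # rich mode, milliamp-range idle current


def select_processing_unit(needs_rich_media: bool) -> str:
    """Choose which unit drives the display: the lower-performance unit
    whenever its impaired experience suffices, to save battery power."""
    return HIGH_PERF if needs_rich_media else LOW_PERF


def display_mode(unit: str) -> dict:
    """Illustrative display configuration per unit: 24-bit colour for the
    rich visual experience, 1 bit (black and white) for the impaired one."""
    if unit == HIGH_PERF:
        return {"color_bits": 24, "mode": "rich"}
    return {"color_bits": 1, "mode": "lean"}
```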
Each of the two processing units may include a display interface configured to communicate with the display. For example, where the processing units include a microprocessor and a microcontroller, the microprocessor may include transceiver circuitry coupled to at least one metal pin beneath the microprocessor, the at least one metal pin being electrically coupled with the input interface of a display control device. The display control device, which may be included in the display, is configured to enable the display to display information in accordance with electrical signals received in the display control device. Likewise, the microcontroller may include transceiver circuitry coupled to at least one metal pin beneath the microcontroller, the at least one metal pin being electrically coupled with the input interface of the display control device. The display control device may comprise two input interfaces, one to each of the two processing units, or alternatively the display control device may comprise a single input interface to which both processing units are capable of providing input via their respective display interfaces. Thus, the display interface in a processing unit may comprise transceiver circuitry that enables the processing unit to transmit electrical signals to the display.
One of the processing units, e.g. the one with the higher performance or the one with the lower performance, may be configured to at least partly control the other processing unit. For example, a lower performance processing unit (e.g., a lower performance processing core) may be enabled to transition a higher performance processing unit (e.g., a higher performance processing core) into and out of a sleep state. These transitions may be caused by signaling via an inter-processing unit interface, such as an inter-core interface.
When transitioning from the active state to the sleep state, the transitioning processing unit may store its context at least partially in a memory, such as a pseudo-static random access memory (PSRAM), SRAM, FLASH, or ferroelectric RAM (FRAM). A context may include, for example, the contents of registers and/or addressing. When transitioning out of the sleep state using a context stored in memory, the processing unit may resume processing faster and/or from the point at which it entered the sleep state. In this way, the delay experienced by the user can be minimized. Alternative terms occasionally used for the context include state and image. In the sleep state, the clock frequency of the processing unit and/or associated memory may be set to zero, meaning that the processing unit is powered down and does not consume energy. The circuitry configured to provide the operating voltage to the at least one processing unit may comprise, for example, a Power Management Integrated Circuit (PMIC). Since the device 610 includes another processing unit, the sleeping processing unit may be completely powered down while maintaining the availability of the device 610.
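The save-then-power-down sequence can be modelled as below. This is a minimal sketch: a dict stands in for the PSRAM/SRAM/FLASH/FRAM, registers are modelled as a dict, and the clock frequency is a plain number; all names are illustrative.

```python
class ProcessingUnit:
    """Minimal sketch of sleep-state transitions with context save/restore."""

    def __init__(self, factory_context):
        self.clock_hz = 96_000_000             # assumed active clock frequency
        self.registers = dict(factory_context)
        self._factory = dict(factory_context)  # default context stored at factory

    def enter_sleep(self, memory):
        memory["context"] = dict(self.registers)  # store context before power-down
        self.clock_hz = 0                         # powered down; no energy use

    def wake(self, memory):
        self.clock_hz = 96_000_000                # clock back to a non-zero value
        # Resume from the stored context, or the factory default if none exists,
        # so processing continues from where the unit went to sleep.
        self.registers = dict(memory.get("context", self._factory))
```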
When transitioning from the sleep state to the active state, the transitioning processing unit may set its clock frequency to a non-zero value. The transitioning processing unit may read a context from memory, where the context may be a previously stored one, such as a context stored in connection with transitioning to the sleep state, or a default context of the processing unit stored in memory at the factory. The memory may include, for example, pseudo-static random access memory (PSRAM), FLASH and/or FRAM. The memory used by the processing unit when transitioning to and from the sleep state may also comprise, for example, DDR memory.
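The context (or "scene") save and restore around the sleep transitions can be sketched in a few lines. This is a minimal, hypothetical model, not the patented implementation: the class, field names and clock value are all illustrative, and the retained memory is modeled as a plain dictionary standing in for PSRAM/FRAM.

```python
# Hypothetical sketch of the context save/restore described above;
# all names and values are illustrative, not from the patent.

class ProcessingUnit:
    def __init__(self, name):
        self.name = name
        self.clock_hz = 96_000_000   # active clock frequency (made up)
        self.registers = {}          # register-file contents, i.e. the "context"
        self.asleep = False

    def sleep(self, memory):
        """Store the context in retained memory, then power down (clock to 0)."""
        memory[self.name] = dict(self.registers)
        self.clock_hz = 0            # powered down: consumes no energy
        self.asleep = True

    def wake(self, memory, default_context=None):
        """Restore a stored context, or fall back to a factory-default one."""
        self.registers = dict(memory.get(self.name, default_context or {}))
        self.clock_hz = 96_000_000
        self.asleep = False

retained = {}                        # stands in for PSRAM/FRAM retained memory
pu2 = ProcessingUnit("PU2")
pu2.registers = {"pc": 0x2000, "sp": 0x8FFF}
pu2.sleep(retained)
pu2.wake(retained)
assert pu2.registers["pc"] == 0x2000   # resumes where it left off
```

Restoring from the stored context rather than cold-booting is what lets the woken unit continue from where it stopped, which is the latency advantage the text describes.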
In the case where one processing unit is in the sleep state, the non-sleeping processing unit may control the device 610. For example, the non-sleeping processing unit may control a display via a display interface included in it. Where a lower-performance processing unit has caused a higher-performance processing unit to transition to the sleep state, the lower-performance processing unit may provide a degraded user experience, for example at least in part through the display. One example of a degraded user experience is a map experience with a reduced visual experience, such as black-and-white rendering of a map service. The degraded experience may nonetheless be sufficient for the user, with the advantage that battery power is saved by keeping the higher-performance processing unit asleep. In some embodiments, a higher-performance processing unit, such as a microprocessor, may consume current in the milliampere range when in a non-sleeping low-power state, while a lower-performance processing unit, such as a microcontroller, may consume current only in the microampere range in a corresponding state. In the non-sleep state, the current consumption of a processing unit may be modified by setting the operating clock frequency to a value between a maximum clock frequency and a minimum non-zero clock frequency. In at least some embodiments, a processing unit (e.g., a lower-performance processing unit) may be configured to be powered down for a short time, e.g. 10 or 15 microseconds, before being awakened. In the context of this document this is not referred to as a sleep state but as an active low-power configuration: the average clock frequency calculated over several such gaps and the active periods between them is a non-zero positive value. The higher-performance processing unit may, for example, run the Android operating system.
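The distinction between a sleep state (clock at zero) and the active low-power configuration (brief power-downs between active periods) comes down to the time-averaged clock frequency. The duty-cycle arithmetic can be sketched as follows; the durations and frequencies are made-up illustration values, not figures from the patent.

```python
# Illustrative duty-cycle arithmetic for the "active low power configuration":
# the unit is briefly powered down (clock = 0) between active periods, so the
# average clock frequency over the whole pattern stays a non-zero positive value.

def average_clock_hz(periods):
    """periods: list of (duration_us, clock_hz) pairs."""
    total_us = sum(d for d, _ in periods)
    cycles = sum(d * f for d, f in periods)   # microseconds x Hz
    return cycles / total_us

# 100 us active at 48 MHz, then a 15 us power-down gap, repeated:
pattern = [(100, 48_000_000), (15, 0)] * 3
avg = average_clock_hz(pattern)
assert 0 < avg < 48_000_000   # positive, but below the active frequency
```

A true sleep state, by contrast, would make every entry's frequency zero, and the average collapses to zero as well.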
Triggering events for transitioning a processing unit to the sleep state include: the user indicating that the non-degraded experience is no longer required; the communication interface of the processing unit no longer being needed; and the device 610 having been unused for a predetermined length of time. An example of an indication that the non-degraded experience is no longer needed is the user deactivating a full version of an application, e.g. a mapping application. Triggering events for transitioning a processing unit from the sleep state to the active state may include the user indicating that the non-degraded experience is desired, the communication interface of the processing unit being requested, and the user interacting with the device 610 after a period of inactivity. Alternatively or additionally, an external event may act as a triggering event, for example an event based on a sensor included in the device 610. One example of such an event is a clock-based event configured to occur at a predetermined time of day, such as an alarm clock function. In at least some embodiments, the non-degraded experience includes using a graphics mode that the non-sleeping processing unit cannot support but the sleeping processing unit can. A graphics mode may include, for example, a combination of resolution, color depth and/or refresh rate.
In some embodiments, user demand, or a user request, for the non-degraded experience may be predicted. Such a prediction may be based at least in part on a usage pattern of a user who tends to perform a particular action in the degraded experience before requesting the non-degraded one. In this case, the non-degraded mode may be triggered in response to determining that the user has performed that particular action while in the degraded experience.
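One way such a usage-pattern prediction might work is a simple frequency heuristic: if a given action has historically been followed by a request for the full experience often enough, pre-trigger the wake-up. This is only a sketch under that assumption; the class, the action name and the 0.8 threshold are all invented for illustration.

```python
# Minimal sketch (hypothetical heuristic) of predicting a request for the
# non-degraded experience from a usage pattern.
from collections import Counter

class UsagePredictor:
    def __init__(self, threshold=0.8):
        self.followed = Counter()   # action -> times followed by a full-mode request
        self.seen = Counter()       # action -> times observed at all
        self.threshold = threshold

    def record(self, action, requested_full_mode):
        self.seen[action] += 1
        if requested_full_mode:
            self.followed[action] += 1

    def should_pretrigger(self, action):
        """True if this action usually precedes a full-mode request."""
        if self.seen[action] == 0:
            return False
        return self.followed[action] / self.seen[action] >= self.threshold

p = UsagePredictor()
for _ in range(9):
    p.record("open_map_overview", requested_full_mode=True)
p.record("open_map_overview", requested_full_mode=False)
assert p.should_pretrigger("open_map_overview")   # 9/10 >= 0.8
```

The benefit of pre-triggering is that the higher-performance unit's start-up latency is hidden from the user; the cost of a wrong prediction is only some wasted battery charge.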
If the processing units are located in separate devices or housings, e.g. a wrist-top computer and a hand-held or fixedly mounted display device, the bus may be implemented wirelessly using a wireless communication protocol. Radio transceiver units functionally connected to their respective processing units may thus perform the function of the bus, forming a personal area network (PAN). The wireless communication protocol may be a protocol used for communication between computers and/or between any remote sensors, such as Bluetooth LE or the proprietary ANT+ protocol. These utilize direct-sequence spread spectrum (DSSS) modulation techniques and an adaptive isochronous network configuration, respectively. Enabling descriptions of the necessary hardware for various implementations of the wireless link, including IC circuits and associated hardware configurations, can be obtained, for example, from the Texas Instruments handbook "Wireless Connectivity", which covers protocols operating in the sub-1-GHz and 2.4 GHz frequency bands, such as ANT™, Bluetooth®, low-energy RFID/NFC, PurePath™ wireless audio, ZigBee®, IEEE 802.15.4, ZigBee RF4CE, 6LoWPAN and Wi-Fi®.
in the case of hibernation, the PAN may be kept in operation by processing units that are not hibernating, such that when hibernation ends, processing units that are leaving the hibernation mode may access the PAN without re-establishing access to the PAN.
In some embodiments, microphone data is used in the first processor to determine whether to trigger the second processor out of the sleep state. The first processor has lower performance and consumes less energy than the second processor. For example, the first processor may comprise a microcontroller and the second processor a microprocessor. The microphone data may be compared with reference data and/or preprocessed to recognize characteristics in it, so that it can be determined whether a voice command has been issued and recorded in the microphone data. Instead of, or in addition to, a voice command, the microphone data may be searched for an auditory control signal, such as a fire alarm or a beep.
The first processor may activate the second processor in response to detecting a voice command and/or an auditory control signal in the microphone data. In some embodiments, the first processor launches the second processor into a state that the first processor selects based on the voice command and/or auditory control signal in the microphone data. Thus, for example, where the voice command identifies a web search engine, the second processor may be launched into the user interface of that particular search engine. As another example, where the auditory control signal is a fire alarm, the second processor may be launched into the user interface of an application that provides emergency guidance to the user. Having the initial state of the second processor already selected in the first processor may save time compared to the case where the state is selected by the user or by the second processor itself.
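The trigger-to-initial-state selection above amounts to a small lookup performed by the first processor before it starts the second one. The sketch below assumes a simple dictionary mapping; the trigger labels, state names and `start_fn` callback are all hypothetical, chosen only to mirror the two examples in the text.

```python
# Hypothetical sketch of the first processor choosing the second processor's
# initial state from the detected audio trigger; names are invented.

INITIAL_STATE_FOR_TRIGGER = {
    "voice:web_search": "web_search_ui",        # voice command names a search engine
    "alarm:fire":       "emergency_guidance_ui",  # auditory control signal: fire alarm
}

def wake_second_processor(trigger, start_fn):
    """First processor picks the launch state, then starts the second processor."""
    state = INITIAL_STATE_FOR_TRIGGER.get(trigger, "default_home")
    start_fn(state)   # e.g. signalling over the inter-processing-unit interface
    return state

launched = []
assert wake_second_processor("alarm:fire", launched.append) == "emergency_guidance_ui"
assert launched == ["emergency_guidance_ui"]
```

Because the table lives on the always-on first processor, the slower second processor boots directly into the right user interface instead of booting first and choosing afterwards.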
Where the microphone is comprised in the device, the microphone may in particular be encapsulated within a waterproof housing. While such a housing may prevent high-quality microphone data from being produced, it may still allow microphone quality sufficient for the first processor to determine whether a voice command and/or auditory control signal is present.
In some embodiments, the first processor is configured to process a notification arriving in the device and to determine whether the second processor is needed to process it. The notification may relate to, for example, a multimedia message or an incoming video call. Where the notification relates to a software update offered to the device, the first processor may cause the second processor to leave the sleep state to process it. The first processor may select, based on the notification, the initial state into which the second processor is brought from the sleep state. The second processor may transition the first processor to the sleep state for the duration of the software update.
In general, an instruction from outside the device may be received in the device, and the first processor may cause the second processor to leave the sleep state in response. The instructions from outside the device may include, for example, notifications, voice instructions, or audible control signals.
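The first processor's decision about whether an external instruction or notification needs the second processor can be sketched as a small routing function. The categories and names below are invented for illustration; the patent does not prescribe any particular classification.

```python
# Sketch (with an invented classification) of the first processor deciding
# whether an incoming notification requires waking the second processor.

NEEDS_SECOND_PROCESSOR = {"video_call", "multimedia_message", "software_update"}

def route_notification(kind, second_asleep, wake):
    """Return which processor handles the notification; wake the second if needed."""
    if kind in NEEDS_SECOND_PROCESSOR:
        if second_asleep:
            wake()          # leave the sleep state to process the notification
        return "second"
    return "first"          # e.g. a plain text notification is handled locally

calls = []
assert route_notification("video_call", second_asleep=True,
                          wake=lambda: calls.append("woken")) == "second"
assert calls == ["woken"]
assert route_notification("step_count", second_asleep=False,
                          wake=lambda: None) == "first"
```

Keeping this decision on the low-power first processor means the expensive unit is only woken for notifications it is actually needed for.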
FIG. 7 illustrates a first exemplary apparatus capable of supporting at least some embodiments of the present invention. The illustrated apparatus includes a microcontroller 710 and a microprocessor 720. Microcontroller 710 may comprise, for example, a Silicon Labs EFM32 or Renesas RL78 microcontroller, or similar. Microprocessor 720 may comprise, for example, a Qualcomm Snapdragon processor or an ARM Cortex-based processor. In the example of FIG. 7, microcontroller 710 and microprocessor 720 are communicatively coupled via an inter-core interface, which may comprise, for example, a serial or parallel communication interface. More generally, the interface disposed between microcontroller 710 and microprocessor 720 may be considered an inter-processing-unit interface.
In the example shown, microcontroller 710 is communicatively coupled with a buzzer 770, a Universal Serial Bus (USB) interface 780, a pressure sensor 790, an acceleration sensor 7100, a gyroscope 7110, a magnetometer 7120, satellite positioning circuitry 7130, a Bluetooth interface 7140, user interface buttons 7150 and a touch interface 7160. Pressure sensor 790 may comprise, for example, an atmospheric pressure sensor.
Microprocessor 720 is communicatively coupled with optional cellular interface 740, non-cellular interface 750, and USB interface 760. The microprocessor 720 is also communicatively coupled to a display 730 via a microprocessor display interface 722. Microcontroller 710 is also communicatively coupled to display 730 via microcontroller display interface 712. Microprocessor display interface 722 may include communication circuitry included in microprocessor 720. The microcontroller display interface 712 may include communication circuitry included in the microcontroller 710.
Microcontroller 710 may be configured to determine whether a triggering event has occurred, and may be configured to transition microprocessor 720 to and from the sleep state described above in response to such an event. When microprocessor 720 is in the sleep state, microcontroller 710 may control display 730 via microcontroller display interface 712. Thus, when microprocessor 720 is in the sleep state, microcontroller 710 may provide a degraded experience to the user, for example through display 730.
In response to a triggering event, microcontroller 710 may transition microprocessor 720 from the sleep state to an active state. For example, where the user indicates, e.g. via buttons 7150, that he wishes to initiate a cellular communication connection, microcontroller 710 may transition microprocessor 720 to the active state, since cellular interface 740 may be controlled by microprocessor 720 but, in the example of FIG. 7, cannot be used directly by microcontroller 710. In some embodiments, when microprocessor 720 is in the sleep state, cellular interface 740 is also in a sleep state. Cellular interface 740 may comprise, for example, an electrical interface to a cellular transceiver, and/or control circuitry for a cellular transceiver.
In various embodiments, at least two of the elements shown in FIG. 7 may be integrated on the same integrated circuit. For example, microprocessor 720 and microcontroller 710 may be provided as processing cores of the same integrated circuit. In this case, cellular interface 740 may, for example, be a cellular interface included in the integrated circuit, controllable by microprocessor 720 but not by microcontroller 710. In other words, various hardware features of the integrated circuit may be controlled by one, but not both, of microcontroller 710 and microprocessor 720. On the other hand, certain hardware features may be controlled by either processing unit. For example, in such an integrated embodiment, USB interface 760 and USB interface 780 may be the same USB interface of the integrated circuit, controllable by either processing core.
Further illustrated in FIG. 7 are a memory 7170 and a memory 7180. Memory 7170 is used by microprocessor 720 and may be based on DDR memory technology, such as DDR2 or DDR3. Memory 7180 is used by microcontroller 710 and may be based, for example, on SRAM technology.
FIG. 8 illustrates signaling in accordance with at least some embodiments of the invention. Arranged from left to right, each on its own vertical axis, are a user interface UI, a first processing unit PU1, a second processing unit PU2 and finally a display DISP. Time advances from top to bottom. The second processing unit may have higher processing performance than the first processing unit, associated with higher current consumption.
In step 810, a second processing unit, which may include a processing core, controls the display. For example, the second processing unit may run an application and provide instructions to the display to display information reflecting the state of the application.
In step 820, the first processing unit determines that a triggering event has occurred that is associated with a transition of the second processing unit from the active state to the sleep state. For example, the first processing unit may determine the occurrence of the triggering event by receiving, from the second processing unit, an indication that a task performed by the second processing unit has been completed. As noted above, the sleep state may comprise setting the clock frequency of the second processing unit to zero. In response to the determination of step 820, the first processing unit assumes control of the display in step 830 and transitions the second processing unit to the sleep state in step 840. Subsequently, in step 850, the second processing unit is in the sleep state, during which the battery resources of the device are depleted at a lower rate. In some embodiments, step 830 may begin at the same time as step 840, or step 840 may occur before step 830 begins.
In step 860, the user interacts with the user interface UI to cause the first processing unit to determine a triggering event to transition the second processing unit from the sleep state to the active state. For example, the user may trigger a web browser application that requires the connection capability only available from the second processing unit. In response, in step 870, the first processing unit wakes up the second processing unit from the sleep state. In response, the second processing unit may read the state from the memory and wake up to that state and assume control of the display, which is shown as step 880.
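The FIG. 8 sequence — PU2 controlling the display, PU1 taking over and putting PU2 to sleep, then waking it and handing the display back — can be captured in a compact simulation. Everything here is illustrative: the class, the field names and the clock value are assumptions made only to mirror the steps above.

```python
# A compact, hypothetical simulation of the FIG. 8 handover sequence.

class Device:
    def __init__(self):
        self.display_owner = "PU2"     # step 810: PU2 controls the display
        self.pu2_clock_hz = 96_000_000

    def on_task_complete(self):        # steps 820-850: trigger detected by PU1
        self.display_owner = "PU1"     # step 830: PU1 assumes display control
        self.pu2_clock_hz = 0          # step 840: sleep state, clock set to zero

    def on_user_trigger(self):         # steps 860-880: user needs PU2 again
        self.pu2_clock_hz = 96_000_000 # step 870: PU1 wakes PU2
        self.display_owner = "PU2"     # step 880: PU2 resumes display control

d = Device()
d.on_task_complete()
assert (d.display_owner, d.pu2_clock_hz) == ("PU1", 0)
d.on_user_trigger()
assert (d.display_owner, d.pu2_clock_hz) == ("PU2", 96_000_000)
```

The invariant worth noting is that the display always has exactly one owner, so the user interface stays responsive throughout the handover.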
Fig. 9 is a first flowchart of a first method in accordance with at least some embodiments of the present invention. The steps of the illustrated method may be performed, for example, in the apparatus 110 of fig. 6 or in the apparatus of fig. 7.
Step 910 includes generating, by a first processing core, a first control signal. Step 920 includes controlling the display by providing a first control signal to the display via the first display interface. Step 930 includes generating, by the second processing core, a second control signal. Step 940 includes controlling the display by providing a second control signal to the display via the second display interface. Finally, step 950 includes causing the second processing core to enter and leave the sleep state based at least in part on the first processing core determining an instruction from outside the device.
Fig. 10 is a state transition diagram in accordance with at least some embodiments of the present invention.
PU1 corresponds to a first processing unit, e.g. a lower-performance processing unit, and PU2 to a second processing unit, e.g. a higher-performance processing unit. These may, for example, be similar to the units discussed in conjunction with FIG. 8. In the initial state, the device comprising PU1 and PU2 is switched off, a "0" denoting the powered-off state of each of PU1 and PU2, giving the composite initial state "00".
From the initial power-off state, PU1 is powered on, the state of PU1 is represented as a "1", and PU2 remains in the power-off state, represented by a "0". Thus, the composite state is "10", corresponding to the case where PU1 is active and PU2 is inactive. In this state, the device may provide a diminished experience for the user and consume less current from the battery charge.
Additionally or alternatively, PU1 and/or PU2 may have an intermediate low-power state from which the transition to the active state is faster than from the fully powered-down state. For example, a processing unit may be set to such an intermediate low-power state before it is set to the powered-down state. If the processing unit is needed shortly thereafter, it may be transitioned back to the powered state; if no need for it is identified within a predetermined time, it may be transitioned from the intermediate low-power state to the powered-down state.
Arrow 1010 represents a transition from state "10" to state "11", i.e., a transition of PU2 from a sleep state to an active state (e.g., its clock frequency is non-zero). PU1 may, for example, cause a transition represented by arrow 1010 in response to a triggering event. In state "11", the device can provide a richer experience at the expense of faster battery power consumption.
Arrow 1020 represents a transition from state "11" to state "10", i.e. a transition of PU2 from the active state to the sleep state. PU1 may, for example, cause the transition represented by arrow 1020 in response to a triggering event.
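The FIG. 10 composite states and arrows can be written down as a tiny transition table. The event names (`power_on_pu1`, `trigger_1010`, `trigger_1020`) are invented labels for the arrows in the figure; only the state encoding follows the text.

```python
# FIG. 10 as a transition table: "xy" means PU1 in state x and PU2 in state y
# (1 = active, 0 = powered off / asleep). Event names are illustrative.

TRANSITIONS = {
    ("00", "power_on_pu1"): "10",   # device switches on: PU1 active, PU2 off
    ("10", "trigger_1010"): "11",   # arrow 1010: PU1 wakes PU2 (richer experience)
    ("11", "trigger_1020"): "10",   # arrow 1020: PU2 back to sleep (saves battery)
}

def step(state, event):
    """Apply an event; unknown (state, event) pairs leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

s = "00"
s = step(s, "power_on_pu1")
s = step(s, "trigger_1010")
assert s == "11"                    # both units active: rich, power-hungry mode
s = step(s, "trigger_1020")
assert s == "10"                    # degraded, battery-saving mode
```

Encoding the diagram this way makes the key asymmetry visible: there is no arrow from "00" straight to "11"; the low-power PU1 always comes up first and decides when PU2 is worth its current draw.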
It is to be understood that the disclosed embodiments of the invention are not limited to the particular structures, process steps, or materials disclosed herein, but extend to equivalents thereof as will be recognized by those skilled in the relevant art. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting.
Reference throughout this specification to one embodiment or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrase "in one embodiment" or similar language in various places throughout this specification are not necessarily all referring to the same embodiment. Where a numerical value is defined using terms such as "about" or "substantially," the exact numerical value is also disclosed.
As used herein, a plurality of items, structural elements, compositional elements, and/or materials may be presented in a list for convenience. However, these lists should be construed as though each member of the list is individually identified as a separate and unique member. Thus, no single member of such list should be construed as a de facto equivalent of any other member of the same list solely based on their presentation in a common group without indications to the contrary. In addition, reference may be made herein to various embodiments and examples of the invention, and alternatives to various components thereof. It should be understood that such embodiments, examples, and alternatives are not to be construed as actual equivalents of each other, but are to be considered as independent and autonomous representations of the invention.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of lengths, widths, shapes, etc., to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
While the above examples illustrate the principles of the invention in one or more particular applications, it will be apparent to those of ordinary skill in the art that numerous modifications in form, usage and details can be made without departing from the principles and concepts of the invention. Accordingly, the invention is to be limited only by the claims as set forth below.
The verbs "comprise" and "comprise" are used herein as open-ended limitations that neither exclude nor require the presence of unrecited features. The features recited in the dependent claims may be freely combined with each other, unless explicitly stated otherwise. Furthermore, it should be understood that the use of "a" or "an" throughout this document, i.e., singular forms, does not exclude a plurality.
INDUSTRIAL APPLICABILITY
At least some embodiments of the invention have industrial application in providing an efficient human-machine interface and safe roaming in nature.
List of abbreviations
OLED organic light emitting diode
GPS global positioning system
LTE Long term evolution
UI user interface
WCDMA wideband code division multiple access
WiMAX global microwave interconnection access
WLAN wireless local area network

Claims (12)

1. An embedded device comprising a lower-performance processing means (710), a higher-performance processing means (720), at least one memory including computer program code, said at least one memory and said computer program code configured to enable, via said lower-performance processing means (710) and said higher-performance processing means (720), said device to predict a need for a rich media interface based at least in part on a particular calendar event in a calendar application, and to trigger a start of said higher-performance processing means from the lower-performance processing means (710) and the higher-performance processing means (720) in said device at a point in time selected in accordance with said prediction,
wherein the lower-performance processing means (710) is configured, by signalling between the different processors, to wake the higher-performance processing means (720) from a sleep state to present a rich media mode with the higher-performance processing means (720), and to switch the higher-performance processing means (720) from an active state to a sleep state to present a poorer media mode with the lower-performance processing means (710), the specific calendar event comprising an indication of an application for handling the specific calendar event, the predicting comprising comparing the indication with a list of applications and their associated media requirements such that the higher-performance processing means (720) is triggered when the lower-performance processing means (710) fails to meet the media requirements of the application needed to handle the specific calendar event, the apparatus being configured to trigger the start-up of said higher-performance processing means (720) a delay time before the start of the specific calendar event occurs, said delay time being a sum of a start-up time of said higher-performance processing means (720) and a start-up delay of an application in said higher-performance processing means (720).
2. The apparatus of claim 1, wherein the apparatus is further configured to cause the higher-performance processing device (720) to enter and leave a sleep state based at least in part on a determination by the lower-performance processing device (710) of an instruction from outside the apparatus.
3. The apparatus of claim 1, wherein said higher-performance processing device (720) and said lower-performance processing device (710) each comprise a processing core.
4. The apparatus of claim 1, wherein said higher-performance processing device (720) and said lower-performance processing device (710) are both electrically connected to a shared random access memory.
5. The device of claim 1, wherein the device is configured to obtain from the calendar application a plurality of calendar events occurring within a same day so as to display a timeline on a screen, and to display a plurality of symbols relative to the timeline on a portion of the timeline selected based on predetermined times in the day of the calendar events, the symbols corresponding to at least two of the plurality of calendar events.
6. The apparatus according to any of the claims 1 to 5, characterized in that the lower-performance processing means (710) is unable to render the rich media interface.
7. The apparatus of claim 1, wherein said lower-performance processing means (710) is configured to enable said higher-performance processing means (720) to hibernate in response to a determination that a user interface type not supported by said lower-performance processing means (710) is no longer requested.
8. The device of claim 1, wherein the device comprises a smart watch (110).
9. The device of claim 1, wherein the device comprises a handheld communication device (110 x).
10. The apparatus of claim 1, wherein the apparatus comprises a personal fitness tracker.
11. The apparatus of claim 1, wherein the apparatus comprises an at least partially retractable, rotatable hardware element, and wherein the apparatus is configured to be operated by a user by interacting with the rotatable hardware element.
12. A method for implementing an embedded device, comprising:
-having the device predict a need for a rich media interface based at least in part on a particular calendar event in the calendar application; and
-triggering the activation of a higher-performance processing means (720) from a lower-performance processing means (710) and a higher-performance processing means (720) of the device at a point in time selected in accordance with the prediction,
wherein the lower-performance processing means (710) is configured, by signalling between the different processors, to wake the higher-performance processing means (720) from a sleep state to present a rich media mode with the higher-performance processing means (720), and to switch the higher-performance processing means (720) from an active state to a sleep state to present a poorer media mode with the lower-performance processing means (710), the specific calendar event comprising an indication of an application for handling the specific calendar event, the predicting comprising comparing the indication with a list of applications and their associated media requirements such that the higher-performance processing means (720) is triggered when the lower-performance processing means (710) fails to meet the media requirements of the application needed to handle the specific calendar event, the device being configured to trigger the start-up of said higher-performance processing means (720) a delay time before the start of the specific calendar event occurs, said delay time being a sum of a start-up time of said higher-performance processing means (720) and a start-up delay of an application in said higher-performance processing means (720).
CN201911297659.7A 2018-12-18 2019-12-17 Embedded computing device management Active CN111343331B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16/223,143 US11874716B2 (en) 2015-08-05 2018-12-18 Embedded computing device management
US16/223,143 2018-12-18

Publications (2)

Publication Number Publication Date
CN111343331A CN111343331A (en) 2020-06-26
CN111343331B true CN111343331B (en) 2022-02-22

Family

ID=69147138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911297659.7A Active CN111343331B (en) 2018-12-18 2019-12-17 Embedded computing device management

Country Status (5)

Country Link
CN (1) CN111343331B (en)
DE (1) DE102019008590A1 (en)
FI (1) FI20196085A1 (en)
GB (1) GB2580218B (en)
TW (1) TWI736045B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112559440B (en) * 2020-12-30 2022-11-25 海光信息技术股份有限公司 Method and device for realizing serial service performance optimization in multi-small-chip system

Citations (6)

Publication number Priority date Publication date Assignee Title
CN102495756A (en) * 2011-11-07 2012-06-13 北京中星微电子有限公司 Method and system for switching operating system between different central processing units
CN103631359A (en) * 2013-11-15 2014-03-12 联想(北京)有限公司 Information processing method and electronic equipment
CN106062661A (en) * 2014-03-31 2016-10-26 英特尔公司 Location aware power management scheme for always-on-always-listen voice recognition system
CN106604369A (en) * 2016-10-26 2017-04-26 惠州Tcl移动通信有限公司 Terminal device with dual-mode switching function
CN108052272A (en) * 2012-10-30 2018-05-18 谷歌技术控股有限责任公司 The electronic equipment of Notification Method is shown with enhancing
CN108983873A (en) * 2012-08-27 2018-12-11 三星电子株式会社 Device and method for wake-up processor

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US9785916B2 (en) * 2007-10-12 2017-10-10 Yahoo Holdings, Inc. Integrating rich media into a web-based calendar
FI124328B (en) * 2008-12-31 2014-06-30 Suunto Oy Two-function control means for a wrist computer or equivalent and a method for controlling a wrist computer or a corresponding terminal device
CN103309428B (en) * 2012-03-12 2016-12-14 联想(北京)有限公司 Information processing method and electronic equipment
US10990270B2 (en) * 2012-05-09 2021-04-27 Apple Inc. Context-specific user interfaces
US9081571B2 (en) * 2012-11-29 2015-07-14 Amazon Technologies, Inc. Gesture detection management for an electronic device
US10796397B2 (en) * 2015-06-12 2020-10-06 Intel Corporation Facilitating dynamic runtime transformation of graphics processing commands for improved graphics performance at computing devices
GB2555107B (en) * 2016-10-17 2020-10-21 Suunto Oy Embedded Computing Device
DE102016113417A1 (en) * 2015-08-05 2017-02-09 Suunto Oy TIME BLOCKS USER INTERFACE
GB2541234A (en) * 2015-08-14 2017-02-15 Suunto Oy Timeline user interface

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN102495756A (en) * 2011-11-07 2012-06-13 Beijing Vimicro Corporation Method and system for switching an operating system between different central processing units
CN108983873A (en) * 2012-08-27 2018-12-11 Samsung Electronics Co., Ltd. Device and method for waking up a processor
CN108052272A (en) * 2012-10-30 2018-05-18 Google Technology Holdings LLC Electronic device with enhanced notification display method
CN103631359A (en) * 2013-11-15 2014-03-12 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
CN106062661A (en) * 2014-03-31 2016-10-26 Intel Corporation Location-aware power management scheme for an always-on, always-listening voice recognition system
CN106604369A (en) * 2016-10-26 2017-04-26 Huizhou TCL Mobile Communication Co., Ltd. Terminal device with dual-mode switching function

Also Published As

Publication number Publication date
DE102019008590A1 (en) 2020-06-18
GB201917729D0 (en) 2020-01-15
GB2580218A (en) 2020-07-15
GB2580218B (en) 2021-10-20
GB2580218A8 (en) 2020-09-23
TWI736045B (en) 2021-08-11
CN111343331A (en) 2020-06-26
TW202036285A (en) 2020-10-01
FI20196085A1 (en) 2020-06-19

Similar Documents

Publication Publication Date Title
US11145272B2 (en) Embedded computing device
US10168669B2 (en) Timeline user interface
GB2541578B (en) Embedded dual-processing core computing device
US8452353B2 (en) Apparatus and methods for providing intelligent battery management
GB2555107A (en) Embedded Computing Device
FI126911B (en) Timeline User Interface
GB2541234A (en) Timeline user interface
US11874716B2 (en) Embedded computing device management
CN111343331B (en) Embedded computing device management
US9900842B2 (en) Embedded computing device
US11703938B2 (en) Embedded computing device
US11210299B2 (en) Apparatus and method for presenting thematic maps
GB2592729A (en) Apparatus and method for presenting thematic maps
FI130395B (en) Apparatus and method for presenting thematic maps
US11144107B2 (en) Apparatus and method for presenting thematic maps
WO2023174158A1 (en) Terminal device control method and terminal device
FI128803B (en) Embedded computing device
GB2594766A (en) Embedded computing device
FI130397B (en) Embedded computing device
CN113127589A (en) Apparatus and method for presenting thematic maps

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20221021

Address after: Vantaa, Finland

Patentee after: Songtuo Co.

Address before: Vantaa, Finland

Patentee before: Amer Sports Digital Services OY