WO2022183985A1 - Screen control method for electronic device, readable medium, and electronic device - Google Patents

Screen control method for electronic device, readable medium, and electronic device

Info

Publication number
WO2022183985A1
WO2022183985A1 (PCT/CN2022/077977, CN2022077977W)
Authority
WO
WIPO (PCT)
Prior art keywords
processor
touch
screen
electronic device
display screen
Prior art date
Application number
PCT/CN2022/077977
Other languages
English (en)
French (fr)
Inventor
李家文
靳百萍
宋明东
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP22762450.9A, published as EP4280032A1
Priority to US18/548,519, published as US20240168573A1
Priority to JP2023552060A, published as JP2024508830A
Publication of WO2022183985A1

Classifications

    • G06F 3/03547: Touch pads, in which fingers can move on a surface
    • G06F 1/3293: Power saving characterised by the action undertaken by switching to a less power-consuming processor, e.g. sub-CPU
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G06F 1/163: Wearable computers, e.g. on a belt
    • G06F 1/1643: Details related to the display arrangement, including the mounting of the display in the housing, the display being associated to a digitizer, e.g. laptops that can be used as penpads
    • G06F 1/3206: Monitoring of events, devices or parameters that trigger a change in power modality
    • G06F 1/3265: Power saving in display device
    • G06F 1/3287: Power saving characterised by the action undertaken by switching off individual functional units in the computer system
    • G06F 3/038: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F 3/0412: Digitisers structurally integrated in a display

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to a screen control method of an electronic device, a readable medium, and an electronic device.
  • Touch screens are widely used in various intelligent electronic devices.
  • A user only needs to touch the screen of the intelligent electronic device with a finger to operate the device, which makes human-computer interaction more intuitive and convenient.
  • A dual-processor solution is usually adopted in the hardware architecture of electronic equipment: a high-performance main processor is responsible for running the operating system and handles high-computation tasks, such as maps, navigation, telephone, and other functions.
  • A low-power coprocessor is responsible for some low-computation tasks, such as sensor data acquisition and processing. In this way, since the high-performance main processor and the low-power co-processor share one set of display screen and touch screen, it is easy to cause problems such as lost touch points and freezes in the user interface when switching between the two systems.
  • Embodiments of the present application provide a screen control method for an electronic device, a readable medium, and an electronic device.
  • the processing authority of the display screen is switched first, and then the processing authority of the touch screen is switched after the display screen switching is completed.
  • The first processor receives the touch data through the second processor, so that the first processor can receive the complete touch data, accurately parse out the user's current operation event corresponding to the complete touch data, and then accurately respond to the user's current operation event.
  • This avoids losing data that the electronic device generates in response to the user's operation during the screen switching process, which would otherwise give the user an unsmooth sliding experience. Imperceptible switching is realized and the user experience is improved.
  • an embodiment of the present application provides a screen control method for an electronic device, including:
  • The second processor of the electronic device processes the relevant information of the display screen and the touch screen; when the electronic device detects that the user starts a first touch operation, it switches the processing authority of the display screen from the second processor to the first processor and sends the detected first touch data of the first touch operation to the first processor via the second processor; and when the electronic device detects that the first touch operation ends, it switches the processing authority of the touch screen from the second processor to the first processor.
  • The relevant information of the display screen and the touch screen includes, but is not limited to: one or more types of data, for example, touch data corresponding to the user's touch operation; one or more types of signaling, for example, the wake-up command sent by the co-processor application layer to the main processor in the embodiment shown in FIG. 5; one or more types of messages, for example, the interrupt message sent by the touch chip to the co-touch driver in the embodiment shown in FIG. 5; and one or more types of notifications, requests, responses, signals, and the like.
  • the second processor processes the related information of the display screen and the touch screen.
  • When the electronic device detects that the user starts to slide on the screen, it switches the processing authority of the display screen from the second processor to the first processor, and sends the detected first touch data of the first touch operation to the first processor via the second processor.
  • When the electronic device detects that the user's finger leaves the screen, it determines that the user's current touch operation is over, and switches the processing authority of the touch screen from the second processor to the first processor.
  • In this way, the first processor receives the touch data through the second processor, so that the first processor can receive the complete touch data, accurately parse out the user's current operation event corresponding to the complete touch data, and then accurately respond to the user's current operation event. Imperceptible switching is realized and the user experience is improved.
  • The processing authority of the display screen herein refers to the processing authority over the relevant information of the display screen, and the processing authority of the touch screen refers to the processing authority over the relevant information of the touch screen.
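  • As an illustration of this ordering, the following is a minimal C sketch of the handover sequence under the stated rule (display authority first, touch authority only after the touch ends, intermediate touch data routed through the second processor). The helper names (switch_display_to, switch_touch_to, forward_touch_via_coprocessor) and the processor identifiers are assumptions introduced for the example, not part of the patented implementation.

```c
#include <stdio.h>

/* Hypothetical processor identifiers: the "second processor" (coprocessor)
 * and the "first processor" (main processor) of the method. */
typedef enum { COPROCESSOR, MAIN_PROCESSOR } proc_id_t;

static proc_id_t display_owner = COPROCESSOR; /* processing authority of the display screen */
static proc_id_t touch_owner   = COPROCESSOR; /* processing authority of the touch screen   */

/* Stub hardware operations standing in for switch S2 / S1 control. */
static void switch_display_to(proc_id_t p) { display_owner = p; }
static void switch_touch_to(proc_id_t p)   { touch_owner   = p; }

/* While the touch screen still belongs to the coprocessor, touch data is
 * forwarded to the main processor instead of being read by it directly. */
static void forward_touch_via_coprocessor(int x, int y) {
    printf("coprocessor -> main processor: touch point (%d, %d)\n", x, y);
}

/* Called when the user's finger first touches the screen. */
void on_touch_start(int x, int y) {
    if (display_owner == COPROCESSOR)
        switch_display_to(MAIN_PROCESSOR);   /* display authority switches first  */
    forward_touch_via_coprocessor(x, y);     /* touch screen has not switched yet */
}

/* Called for every point reported while the finger stays on the screen. */
void on_touch_move(int x, int y) {
    if (touch_owner == COPROCESSOR)
        forward_touch_via_coprocessor(x, y); /* the complete gesture still reaches the main processor */
}

/* Called when the finger leaves the screen: only now does the touch screen switch. */
void on_touch_end(void) {
    if (touch_owner == COPROCESSOR && display_owner == MAIN_PROCESSOR)
        switch_touch_to(MAIN_PROCESSOR);
}

int main(void) {
    on_touch_start(10, 10);
    on_touch_move(12, 40);
    on_touch_end();
    printf("display owner=%d, touch owner=%d\n", display_owner, touch_owner);
    return 0;
}
```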
  • The above method further includes: the electronic device includes a virtual touch driver; the electronic device sends the first touch data of the first touch operation to the virtual touch driver via the second processor, and the first processor receives the first touch data via the virtual touch driver.
  • the first processor is the main processor, which can run a high-performance operating system and process tasks with a relatively low frequency and high computational load.
  • For example, the system run by the main processor supports navigation, telephone, map, chat, music playback, and other functions.
  • the second processor is a co-processor, which can run a light-weight system with low power consumption and process tasks with relatively high frequency and low computational load.
  • the coprocessor runs a lightweight embedded system, which is responsible for the collection and processing of sensor data, and supports functions such as time display, calculator, timer, alarm clock, heart rate measurement, step counting, and height measurement.
  • The virtual touch driver is the main virtual touch driver in the embodiment shown in FIG. 3. When the processing authority of the display screen has been switched to the main processor while the processing authority of the touch screen is still in the coprocessor, the main processor reads the touch data sent by the coprocessor through the main virtual touch driver.
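  • The role of the main virtual touch driver can be pictured with the small, hypothetical C sketch below: depending on whether the touch screen has finished switching, the main processor reads touch data either from the virtual driver (fed by the coprocessor) or from the real touch driver connected to the touch chip. All type and function names here are illustrative assumptions.

```c
#include <stdbool.h>
#include <stdio.h>

/* One touch sample with the kinds of fields described for the touch chip. */
typedef struct {
    int x, y;      /* touch point coordinates */
    int pressure;  /* touch pressure          */
    int area;      /* contact area            */
} touch_sample_t;

/* Data pushed by the coprocessor while it still owns the touch screen. */
static bool read_from_virtual_touch_driver(touch_sample_t *s) {
    s->x = 5; s->y = 7; s->pressure = 30; s->area = 12;
    return true;
}

/* Data read directly from the touch chip once the touch screen has switched. */
static bool read_from_main_touch_driver(touch_sample_t *s) {
    s->x = 50; s->y = 70; s->pressure = 28; s->area = 11;
    return true;
}

/* The main processor picks the data path according to where the touch
 * screen's processing authority currently is. */
bool read_touch(bool touch_switched_to_main, touch_sample_t *s) {
    return touch_switched_to_main ? read_from_main_touch_driver(s)
                                  : read_from_virtual_touch_driver(s);
}

int main(void) {
    touch_sample_t s;
    read_touch(false, &s); /* display switched, touch screen still in the coprocessor */
    printf("via virtual driver: (%d, %d)\n", s.x, s.y);
    read_touch(true, &s);  /* both switched: read the touch chip directly */
    printf("via main touch driver: (%d, %d)\n", s.x, s.y);
    return 0;
}
```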
  • The method further includes: when the electronic device detects that the user starts the first touch operation, switching the processing authority of the display screen from the second processor to the first processor includes:
  • When the electronic device detects that the user starts the first touch operation, it sends a wake-up instruction to the first processor through the second processor; after receiving the wake-up instruction, the first processor responds to the wake-up instruction and sends an interrupt request for screen switching to the second processor; after receiving the interrupt request for screen switching, the second processor responds to the interrupt request and switches the processing authority of the display screen from the second processor to the first processor.
  • For example, the first processor is the main processor and the second processor is the co-processor, and the electronic device sends the wake-up instruction to the main processor through the co-processor application layer.
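  • As a rough, non-authoritative illustration of this handshake, the sketch below models the exchange as plain function calls with hypothetical message names; in the actual design the exchange happens over inter-processor signaling and GPIO, which these stubs only imitate.

```c
#include <stdio.h>

/* Message types exchanged between the two processors in this sketch. */
typedef enum { MSG_WAKE_UP, MSG_SCREEN_SWITCH_IRQ } msg_t;

/* Coprocessor side: respond to the interrupt request by switching the
 * display screen's processing authority first. */
static void coprocessor_on_message(msg_t m) {
    if (m == MSG_SCREEN_SWITCH_IRQ)
        printf("coprocessor: display processing authority -> main processor\n");
}

/* Main processor side: woken by the coprocessor when a touch begins. */
static void main_processor_on_message(msg_t m) {
    if (m == MSG_WAKE_UP) {
        printf("main processor: power up MIPI interface\n");
        /* Ask the coprocessor for control of the screen. */
        coprocessor_on_message(MSG_SCREEN_SWITCH_IRQ);
    }
}

int main(void) {
    /* The coprocessor detects the start of a touch and wakes the main processor. */
    main_processor_on_message(MSG_WAKE_UP);
    return 0;
}
```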
  • the above-mentioned method further includes:
  • The first processor of the electronic device processes the relevant information of the display screen and the touch screen; when the electronic device detects that the user performs an interactive operation for the setting application of the electronic device, the processing authority of the display screen and the touch screen is switched from the first processor to the second processor, and the second processor of the electronic device processes the relevant information of the display screen and the touch screen.
  • the setting application refers to an application that needs to be processed by the second processor, such as a calculator, a timer, an alarm clock, and a sports application with a relatively high frequency of use and a relatively low amount of computation.
  • the second processor replaces the first processor to process applications with a high frequency of use and a long time, which can reduce power consumption and improve the battery life of the electronic device.
  • The above-mentioned method further includes: when the electronic device detects that the user performs an interactive operation for the setting application of the electronic device, switching the processing authority of the display screen and the touch screen from the first processor to the second processor, so that the second processor of the electronic device processes the relevant information of the display screen and the touch screen, includes:
  • The electronic device switches the processing authority of the display screen from the first processor to the second processor when it detects that the user performs an interactive operation for the setting application of the electronic device; after the electronic device determines that the processing authority of the display screen has been switched from the first processor to the second processor, it switches the processing authority of the touch screen from the first processor to the second processor.
  • The above-mentioned method further includes: the first processor of the electronic device processes the related information of the display screen and the touch screen; when the electronic device detects that the user starts a second touch operation, it sends the detected second touch data of the second touch operation to the first processor.
  • When the first processor of the electronic device is currently running, the first processor controls the display screen and the touch screen, and when the user's finger slides on the screen of the electronic device, the first processor reads the touch data, generated by the touch chip, that corresponds to the user's current touch operation.
  • The above-mentioned method further includes: the electronic device includes a display screen switch, and the display screen switch is electrically connected to the display screen; when the electronic device detects that the user starts the first touch operation, it switches the processing authority of the display screen from the second processor to the first processor in the following way:
  • When the electronic device detects that the user starts the first touch operation, it controls the display screen switch to disconnect from the second processor and to connect to the first processor.
  • the MIPI interfaces of the first processor and the second processor are connected to the MIPI interface of the display screen through switch S2.
  • When the electronic device detects that the user starts the first touch operation, it controls the switch S2 to disconnect from the second processor and to connect to the first processor.
  • The above-mentioned method further includes: the electronic device includes a touch screen switch, and the touch screen switch is electrically connected to the touch screen; when the electronic device detects that the first touch operation ends, it switches the processing authority of the touch screen from the second processor to the first processor in the following way:
  • When the electronic device detects that the first touch operation ends, it controls the touch screen switch to disconnect from the second processor and to connect to the first processor.
  • For example, the I2C interfaces of the first processor and the second processor are connected to the I2C interface of the touch screen through the switch S1, and when the electronic device detects that the first touch operation ends, it controls the switch S1 to disconnect from the second processor and to connect to the first processor.
  • An embodiment of the present application provides a readable medium, where instructions are stored on the readable medium, and the instructions, when executed on an electronic device, cause the electronic device to perform any one of the screen control methods in the foregoing first aspect and the various possible implementations of the first aspect.
  • An embodiment of the present application provides an electronic device, including:
  • a screen, including a display screen and a touch screen;
  • a memory for storing instructions to be executed by one or more processors of the electronic device;
  • a first processor, which is one of the processors of the electronic device; and
  • a second processor, which is one of the processors of the electronic device and is configured to cooperate with the first processor to execute any one of the screen control methods in the first aspect and the various possible implementations of the first aspect.
  • FIG. 1(a) shows a sequence diagram of a screen switching of the electronic device in the related technical solution
  • Fig. 1(b) shows, according to some embodiments of the present application, a sequence diagram of a screen switching of the electronic device when the electronic device executes the screen switching control method provided by the present application;
  • Fig. 1(c) shows an application scenario of a screen switching control method provided by the present application according to some embodiments of the present application
  • Fig. 2 shows a block diagram of the hardware structure of the smart watch shown in Fig. 1(c) according to some embodiments of the present application;
  • Fig. 3 shows a system architecture diagram of the smart watch shown in Fig. 1(c) according to some embodiments of the present application;
  • Fig. 4(a) shows an interface diagram of a smart watch in an off-screen state according to some embodiments of the present application
  • Figure 4(b) shows an interface diagram of the smart watch brightening the screen after the user lifts the wrist/presses a button, according to some embodiments of the present application
  • Figure 4(c) shows a desktop image displayed by the smart watch after the user slides the screen, according to some embodiments of the present application
  • FIG. 5 shows an interaction diagram of a screen switching control method provided by the present application according to some embodiments of the present application
  • FIG. 6 shows a schematic diagram of a coprocessor switching display screens according to some embodiments of the present application.
  • Figure 7(a) shows a schematic diagram of a sports application in which a user clicks on a smart watch desktop, according to some embodiments of the present application
  • Figure 7(b) shows an interface diagram of entering the sports application after the user clicks the sports application on the desktop of the smart watch, according to some embodiments of the present application
  • FIG. 8 shows an interaction diagram of another screen switching control method provided by the present application according to some embodiments of the present application.
  • FIG. 9 shows a flowchart of a system interaction method provided by the present application according to some embodiments of the present application.
  • Embodiments of the present application include, but are not limited to, a screen control method for an electronic device, a readable medium, and an electronic device.
  • an embodiment of the present application provides a screen control method for an electronic device.
  • the processing authority of the display screen is switched first, and then the processing authority of the touch screen is switched after the switching of the processing authority of the display screen is completed.
  • The electronic device is a dual-processor device with a main processor and a co-processor; the user wears the electronic device on the wrist, and when the user lifts the wrist on which the electronic device is worn, the co-processor of the electronic device is awakened, and the display screen and touch screen are controlled by the co-processor.
  • When the user's finger starts to slide on the screen, the electronic device first switches the processing authority of the display screen to the main processor. After the user's sliding operation is completed, the electronic device then switches the processing authority of the touch screen to the main processor.
  • At time t1, the coprocessor is awakened, and the coprocessor controls the display screen and the touch screen; at time t2, when the user's finger starts to slide on the screen of the electronic device, the main processor is awakened, and the electronic device first switches the processing authority of the display screen to the main processor, so that the main processor controls the display screen while the processing authority of the touch screen remains in the coprocessor; at time t3, when the user's finger is lifted from the screen of the electronic device, that is, after the sliding operation is over, the electronic device switches the processing authority of the touch screen to the main processor.
  • When the processing authority of the display screen has been switched to the main processor while the processing authority of the touch screen is still maintained in the coprocessor, the electronic device creates, in the main processor, a virtual driver corresponding to the touch screen (hereinafter referred to as the main virtual touch driver for simplicity of description), which is used to receive the touch data sent by the coprocessor. Therefore, when the main processor has started to control the display screen to display images frame by frame but the touch screen has not completed the switching process, the main processor can still receive the complete touch data corresponding to the user's current touch operation through the main virtual touch driver.
  • the main processor can accurately analyze the user's current operation event corresponding to the complete touch data according to the received complete touch data, and then accurately respond to the user's current operation event.
  • In this way, the loss of data that the electronic device generates in response to the user's operation is avoided, so the user is not given an unsmooth sliding experience; imperceptible switching is realized and the user experience is improved.
  • When the processing authority of the touch screen is in the main processor, the main processor is responsible for processing the touch data, generated by the electronic device, corresponding to the user's touch operation; when the processing authority of the touch screen is in the co-processor, the co-processor is responsible for processing the touch data generated by the electronic device and corresponding to the user's touch operation.
  • FIG. 1( c ) shows an application scenario of the screen control method provided by the present application, according to some embodiments of the present application.
  • The user wears a dual-processor smart watch 100 on the left wrist.
  • the smart watch 100 includes a main processor and a co-processor, and the main processor and the co-processor share a screen.
  • the main processor can run a high-performance operating system to process tasks with low frequency and high computational load.
  • For example, the system run by the main processor supports navigation, telephone, map, chat, music playback, and other functions.
  • The coprocessor can run a low-power, lightweight system to handle low-computation tasks that are used more frequently.
  • the coprocessor runs a lightweight embedded operating system, which is responsible for the collection and processing of sensor data, and supports functions such as time display, calculator, timer, alarm clock, heart rate measurement, step counting, and altitude measurement.
  • a low-power co-processor is used to replace the main processor to process tasks with high frequency of use and low computational load, thereby reducing power consumption and improving the battery life of the smart watch 100 .
  • the main processor and the coprocessor share a screen, which includes a display screen and a touch screen.
  • the screen may consist of a touch screen and a display screen stacked together.
  • When the processing authority of the display screen and the touch screen of the smart watch 100 is switched between the main processor and the co-processor, the smart watch 100, by executing the screen control method provided by the embodiments of the present application, first switches the processing authority of the display screen and then switches the processing authority of the touch screen. For example, in the process of switching the processing authority of the display screen and the touch screen from the coprocessor to the main processor, the main processor receives the touch data sent by the coprocessor through the main virtual touch driver. This ensures that, during the switching process of the display screen and the touch screen, the smart watch 100 can completely acquire the data, generated by the touch screen, corresponding to the user's touch operation, so as to accurately respond to the user's touch operation and perform the corresponding task. While reducing the power consumption of the whole device, imperceptible switching is realized and the user experience is improved.
  • the smart watch 100 is taken as an example to introduce the technical solutions of the present application.
  • FIG. 2 shows a block diagram of the hardware structure of the smart watch 100 shown in FIG. 1( c ) according to some embodiments of the present application.
  • The smart watch 100 includes a touch screen 101, a display screen 102, a main processor 103, a co-processor 104, a memory 105, a communication module 106, a sensor module 107, a power supply 108, a switch S1 of the touch screen 101, a switch S2 of the display screen 102, a power management system 109, a touch chip 110, and so on.
  • the touch screen 101 which may also be called a touch panel, can collect user's touch operations such as clicking and sliding on the screen.
  • the touch screen 101 can communicate with the main processor 103 and the coprocessor 104 through an I2C (Inter-Integrated Circuit) bus.
  • The touch screen 101 may be a resistive, surface capacitive, projected capacitive, infrared, surface acoustic wave, bending wave, active digitizer, or optical imaging touch screen.
  • the touch chip 110 is electrically connected to the touch screen 101.
  • For example, the touch chip 110 scans the touch screen 101 at a certain scanning frequency to obtain the user's touch data, such as the coordinates, pressure, area, and edge value of the touch point. The acquired touch data of the user is then reported to the main processor 103 or the coprocessor 104 for processing, and after the main processor 103 or the coprocessor 104 processes the touch data, the images corresponding to the user's touch operation are displayed frame by frame.
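  • The touch data named here (coordinates, pressure, area, edge value) can be pictured as a simple record. The struct and polling stub below are an illustrative assumption about how such data might be laid out, not the touch chip's actual register map or reporting protocol.

```c
#include <stdio.h>

/* One report from the touch chip, mirroring the fields named in the text. */
typedef struct {
    int x;         /* touch point coordinate, X axis */
    int y;         /* touch point coordinate, Y axis */
    int pressure;  /* touch pressure                 */
    int area;      /* contact area                   */
    int edge;      /* edge value of the touch point  */
} touch_report_t;

/* Hypothetical stand-in for one scan of the panel at the scanning frequency. */
static touch_report_t scan_touch_panel(void) {
    touch_report_t r = { .x = 120, .y = 200, .pressure = 35, .area = 14, .edge = 0 };
    return r;
}

int main(void) {
    /* Each scan result is reported to whichever processor currently holds
     * the processing authority of the touch screen. */
    touch_report_t r = scan_touch_panel();
    printf("touch: (%d, %d) pressure=%d area=%d edge=%d\n",
           r.x, r.y, r.pressure, r.area, r.edge);
    return 0;
}
```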
  • the display screen 102 may be used to display information input by the user, prompt information provided to the user, various menus on the smart watch 100, operation interfaces of various application programs of the smart watch 100, and the like.
  • the display screen 102 may be used to display the current time, the user's heart rate measured by the health detection application, the number of steps of the user during exercise calculated by the exercise application, and the like.
  • the display screen 102 can communicate with the main processor 103 and the coprocessor 104 through a Mobile Industry Processor Interface (Mobile Industry Processor Interface, MIPI) bus.
  • The display screen 102 may include a display panel, and the display panel may adopt a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-emitting Diode, OLED), an active-matrix organic light-emitting diode (Active-matrix Organic Light-emitting Diode, AMOLED), a flexible light-emitting diode, a Mini LED, a Micro LED, a Micro OLED, a quantum dot light-emitting diode (Quantum Dot Light-emitting Diodes, QLED), or the like.
  • The main processor 103 includes a plurality of processing units and can run a high-performance operating system; it is used to process tasks related to applications such as navigation, phone calls, maps, chats, music playback, and the like, and, when the processing authority of the touch screen 101 and the display screen 102 of the smart watch 100 is switched to the main processor 103, to process the data generated by the user touching the touch screen 101, and the like.
  • The coprocessor 104 can run a lightweight embedded operating system; it is responsible for the collection and processing of sensor data, processes tasks related to applications such as time display, calculator, timer, alarm clock, heart rate measurement, step counting, height measurement, and the like, and, when the processing authority of the touch screen 101 and the display screen 102 of the smart watch 100 is switched to the coprocessor 104, processes the data generated by the user touching the touch screen 101, and the like.
  • the coprocessor 104 may include a Digital Signal Processor (DSP), a Microcontroller Unit (MCU), a Field Programmable Gate Array (FPGA), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC) and other processing modules or processing circuits.
  • the I2C interfaces of the main processor 103 and the coprocessor 104 are connected to the I2C interface of the touch screen 101 through the switch S1, and the MIPI interfaces of the main processor 103 and the coprocessor 104 are connected to the display screen 102 through the switch S2 on the MIPI interface.
  • The main processor 103 and the coprocessor 104 are connected to the switch S2 through a general-purpose input/output (GPIO) interface (not shown in the figure); the main processor 103 pulls the level of the GPIO interface high or low to send a switching request for the display screen 102 to the coprocessor 104, and after the processing authority of the display screen 102 is switched, the processing authority of the touch screen 101 is switched.
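  • A hypothetical sketch of how the two switches might be driven from GPIO is given below; the pin numbers and the gpio_write helper are assumptions made for illustration, since the text only states that S1 routes the touch screen's I2C lines and S2 routes the display screen's MIPI lines, selected by GPIO levels.

```c
#include <stdio.h>

/* Illustrative GPIO pin assignments for the two switches (assumed values). */
#define GPIO_S1_SELECT 17  /* routes the touch screen's I2C lines (switch S1)    */
#define GPIO_S2_SELECT 18  /* routes the display screen's MIPI lines (switch S2) */

#define SEL_COPROCESSOR 0  /* level selecting the coprocessor side    */
#define SEL_MAIN        1  /* level selecting the main-processor side */

/* Stub for a platform GPIO write; a real driver would program the pin register. */
static void gpio_write(int pin, int level) {
    printf("GPIO %d <- %d\n", pin, level);
}

/* Hand the display screen (switch S2) to the selected processor. */
void select_display_owner(int sel) { gpio_write(GPIO_S2_SELECT, sel); }

/* Hand the touch screen (switch S1) to the selected processor. */
void select_touch_owner(int sel)   { gpio_write(GPIO_S1_SELECT, sel); }

int main(void) {
    /* Ordering used by the method: the display screen first, the touch
     * screen only after the current touch operation has ended. */
    select_display_owner(SEL_MAIN);
    select_touch_owner(SEL_MAIN);
    return 0;
}
```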
  • The memory 105 is used to store software programs and data, and the processors execute various functional applications and data processing of the smart watch 100 by running the software programs and data stored in the memory 105.
  • the memory 105 may store data such as air pressure and temperature collected by sensors during exercise of the user, and store sleep data, heart rate data, and the like of the user. Meanwhile, the memory 105 may also store the user's registration information, login information, and the like.
  • the communication module 106 can be used to make the smart watch 100 communicate with other electronic devices and connect to the network through other electronic devices.
  • For example, the smart watch 100 establishes a connection with electronic devices such as a mobile phone or a server through the communication module 106 to perform data transmission.
  • the sensor module 107 may include a proximity light sensor, a pressure sensor, a gyro sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
  • the power supply 108 is used to power various components of the smart watch 100 .
  • the power source 108 may be a battery.
  • the power management system 109 is used to manage the charging of the power supply 108 and the power supply of the power supply 108 to other modules.
  • the smart watch 100 shown in FIG. 2 is only an exemplary structure for realizing the functions of the smart watch 100 in the technical solution of the present application, and does not constitute a specific limitation on the smart watch 100 .
  • the smart watch 100 may include more or less components than those shown in FIG. 2 , or combine some components, or separate some components, or different component arrangements.
  • the components shown in Figure 2 may be implemented in hardware, software or a combination of software and hardware.
  • FIG. 3 shows the system architecture of the two operating systems run by the smart watch 100 (for example, a high-performance operating system and a lightweight embedded operating system).
  • The two operating systems run by the main processor 103 and the coprocessor 104 of the smart watch 100 both include an application layer 302, a driver layer 301, and a device layer 300, so no distinction is made between the layers here, but the specific elements in each layer are distinguished.
  • the main processor 103 and the coprocessor 104 share the display screen 102 , the touch screen 101 and the touch chip 110 .
  • the device layer 300 includes a display screen 102 , a touch screen 101 , a switch S2 of the display screen 102 , a switch S1 of the touch screen 101 , a main processor 103 , a coprocessor 104 , a touch chip 110 , and the like.
  • For the functions of each device in the device layer 300, please refer to the above description of FIG. 2; details are not repeated here.
  • the driving layer 301 is used to drive each device in the above-mentioned device layer 300 to realize functions such as read and write access to each device, interrupt setting and the like.
  • the driver layer 301 includes, but is not limited to, a co-touch driver 301 a.
  • the co-touch driver 301 a may be a software program in a lightweight embedded operating system run by the co-processor 104 for driving the touch screen 101 and capable of reading touch data generated by the touch chip 110 .
  • After the coprocessor 104 is woken up, the coprocessor 104 processes the related information of the display screen 102 and the touch screen 101, wherein the related information of the display screen 102 and the touch screen 101 includes but is not limited to: one or more types of data, for example, touch data corresponding to the user's touch operation; one or more types of signaling, for example, the wake-up command sent by the coprocessor application layer to the main processor 103 in the embodiment shown in FIG. 5; one or more types of messages, for example, the interrupt message sent by the touch chip 110 to the co-touch driver 301a in the embodiment shown in FIG. 5; and one or more types of notifications, requests, responses, signals, and the like.
  • the co-processor 104 drives the touch screen 101 through the co-touch driver 301a, and reads the user's touch data.
  • The processing authority of the display screen 102 herein refers to the processing authority over the relevant information of the display screen 102, and the processing authority of the touch screen 101 refers to the processing authority over the relevant information of the touch screen 101.
  • the driving layer 301 includes, but is not limited to, a main virtual touch driver 301b, a main touch driver 301c, and the like for receiving and sending touch data of the user.
  • The main touch driver 301c is a software program, in the high-performance operating system run by the main processor 103, that is used to drive the touch screen 101; when the processing authority of the display screen 102 and the touch screen 101 has both been switched to the main processor 103, it can read the post-switch touch data generated by the touch chip 110 shown in FIG. 3 (that is, after the processing authority of the display screen 102 and the touch screen 101 has been switched to the main processor 103, the user touches the screen again, and the touch chip 110 generates data corresponding to the user's re-touch operation).
  • The main virtual touch driver 301b is a virtual driver, corresponding to the touch screen 101, created by the main processor 103 during initialization; when the processing authority of the display screen 102 has been switched to the main processor 103 while the processing authority of the touch screen 101 is still in the coprocessor 104, it receives the touch data sent by the coprocessor application layer 302a.
  • When the main processor 103 is woken up, the smart watch 100 first switches the processing authority of the display screen 102 to the main processor 103, and the main processor 103 processes the data of the display screen 102, while the processing authority of the touch screen 101 is still in the coprocessor 104.
  • In this case, the main processor 103 reads the touch data sent by the coprocessor 104 through the main virtual touch driver 301b.
  • After the user's current touch operation ends, the smart watch 100 switches the processing authority of the touch screen 101 to the main processor 103.
  • After that, the main processor 103 reads the touch data generated by the touch chip 110 through the main touch driver 301c.
  • application layer 302 includes coprocessor application layer 302a.
  • The co-processor application layer 302a is used to compute, when the processing authority of the display screen 102 and the touch screen 101 is in the co-processor 104, the touch data read from the touch chip 110 by the co-touch driver 301a of the co-processor 104, determine the type of the user's touch event corresponding to the touch data, and then have the application program respond according to the determined event type.
  • The coprocessor 104 is woken up, and the coprocessor 104 processes the related information of the display screen 102 and the touch screen 101.
  • the co-processor 104 drives the touch screen 101 through the co-touch driver 301a, and reads the user's touch data.
  • Upon receiving the user's touch data reported by the co-touch driver 301a, the coprocessor application layer 302a performs the computation, determines the type of the user's touch event corresponding to the touch data, and then responds through the application program according to the determined event type.
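  • The event-type determination performed by the coprocessor application layer 302a can be approximated by a small classifier over a gesture's duration and displacement, as in the sketch below; the thresholds and the classify_touch function are illustrative assumptions and are not taken from the patent.

```c
#include <stdio.h>
#include <stdlib.h>

typedef enum { EVENT_TAP, EVENT_LONG_PRESS, EVENT_SLIDE } touch_event_t;

/* Classify one completed gesture from its start/end points and duration.
 * The 20-pixel and 500 ms thresholds are assumed values, not from the patent. */
touch_event_t classify_touch(int x0, int y0, int x1, int y1, int duration_ms) {
    int dx = abs(x1 - x0), dy = abs(y1 - y0);
    if (dx > 20 || dy > 20) return EVENT_SLIDE;       /* finger moved across the screen */
    if (duration_ms > 500)  return EVENT_LONG_PRESS;  /* held in place for a long time  */
    return EVENT_TAP;                                 /* brief touch at one position    */
}

int main(void) {
    static const char *names[] = { "tap", "long press", "slide" };
    printf("%s\n", names[classify_touch(10, 10, 10, 12, 80)]);  /* tap        */
    printf("%s\n", names[classify_touch(10, 10, 11, 10, 900)]); /* long press */
    printf("%s\n", names[classify_touch(10, 10, 90, 12, 150)]); /* slide      */
    return 0;
}
```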
  • In the system run by the main processor 103, the application layer 302 includes the main processor application layer 302b.
  • The main processor application layer 302b is used to compute, when the processing authority of the display screen 102 and the touch screen 101 is in the main processor 103, the touch data read from the touch chip 110 by the main touch driver 301c of the main processor 103, determine the user's touch event corresponding to the touch data, and then have the application program respond according to the determined event.
  • When the main processor 103 is woken up, the smart watch 100 first switches the processing authority of the display screen 102 to the main processor 103.
  • After the user's current touch operation ends, the smart watch 100 switches the processing authority of the touch screen 101 to the main processor 103.
  • the main processor 103 processes the relevant information of the display screen 102 and the touch screen 101.
  • the touch chip 110 generates touch data corresponding to the user's click operation.
  • After reading the touch data from the touch chip 110, the main touch driver 301c reports the touch data to the main processor application layer 302b.
  • the main processor application layer 302b identifies the data that the user clicks on the screen, determines that the user's operation is to click the icon of the music playing application, and then opens the music playing application.
  • the smart watch 100 is in an off-screen state, and both the main processor 103 and the coprocessor 104 are in a sleep state.
  • When the user lifts the wrist or presses a button, the screen lights up, the dial displays the time interface, the coprocessor 104 is woken up, and the processing authority of the display screen 102 and the touch screen 101 is in the coprocessor 104; the coprocessor 104 processes the user's touch data generated by the touch chip 110, and after the coprocessor 104 processes the touch data, it controls the display screen 102 to display frame by frame.
  • After the screen is turned on, if the user's touch operation is not detected within a set time, the smart watch 100 turns off the screen again, and the main processor 103 and the coprocessor 104 re-enter the sleep state. If the user's touch operation is detected within the set time, for example, after the user slides on the time interface displayed by the smart watch 100 shown in FIG. 4(b), the smart watch enters the desktop shown in FIG. 4(c), which includes a plurality of application icons, such as a navigation icon, a sports application icon, a calculator icon, a weather icon, a settings icon, and so on; the smart watch 100 then implements the screen control method provided by the solution of the present application.
  • The processing authority of the display screen 102 and the touch screen 101 is switched to the main processor 103, that is, the main processor 103 processes the user's touch data generated by the touch chip 110, and after the main processor 103 processes the touch data, it controls the display screen 102 to display frame by frame.
  • When the processing authority of the display screen 102 and the touch screen 101 is in the coprocessor 104 and the coprocessor 104 detects the user's touch operation, the process of switching the processing authority of the display screen 102 and the touch screen 101 to the main processor 103 includes the following steps:
  • Step 501 The smart watch 100 generates touch data of the user.
  • the user touches the screen, and the touch chip 110 generates data such as coordinates, pressure, area, and edge value of the touch point of the user.
  • the operations of the user touching the smart watch 100 may be operations such as clicking, long pressing, double-clicking, sliding, and the like.
  • the user taps the touch screen 101 with a finger, or the user's finger slides from one position on the touch screen 101 to another position.
  • the user's touch operation on the smart watch 100 may be performed by the user through fingers, or may be performed by the user through a touch pen or other touch devices.
  • Step 502 the touch chip 110 of the smart watch 100 reports the interrupt message to the co-touch driver 301 a of the co-processor 104 .
  • The interrupt message refers to a trigger signal generated when the touch chip 110 receives a user's touch operation. If there is an interrupt message, it indicates that the user has started a touch operation on the touch screen 101, and the process goes to step 503; if there is no interrupt message, it indicates that the user has not performed a touch operation on the touch screen 101, the touch chip 110 has not been triggered to generate the user's touch data, and the process returns to step 501.
  • Step 503 After receiving the interrupt message, the co-touch driver 301 a of the co-processor 104 reads touch data from the touch chip 110 .
  • the co-touch driver 301a of the co-processor 104 reads the touch data from the touch chip 110 through a serial peripheral interface (Serial Peripheral Interface, SPI) after receiving the interrupt message.
  • Step 504 the co-touch driver 301 a of the co-processor 104 reports the read touch data to the co-processor application layer 302 a of the co-processor 104 .
  • the co-processor application layer 302a executes step 505 after receiving the touch data reported by the co-touch driver 301a.
  • Step 505 After receiving the touch data, the coprocessor application layer 302a of the coprocessor 104 sends a wake-up instruction to the main processor 103. After receiving the wake-up instruction, the main processor 103 executes step 506 and step 507 at the same time.
  • Step 506 After receiving the wake-up command, the main processor 103 completes the power-on of its internal MIPI interface, that is, the power-on of the hardware interface connecting the main processor 103 and the switch S2 of the display screen 102, so that when the switch S2 of the display screen 102 is switched to the main processor 103, the main processor 103 can communicate with the MIPI interface of the display screen 102 through its MIPI interface.
  • Step 507 The main processor 103 sends an interrupt request to the coprocessor 104 to request to control the screen. That is, the coprocessor 104 is requested to switch the processing authority of the display screen 102 and the touch screen 101 to the main processor 103 .
  • Step 508 after receiving the interrupt request sent by the main processor 103 , the coprocessor 104 first switches the processing authority of the display screen 102 to the main processor 103 in response to the interrupt request.
  • For example, the main processor 103 can pull up the level of the GPIO interface used for sending the switching request, and the coprocessor 104 can pull up the level of the GPIO interface that controls the switching of the switch S2 of the display screen 102, so that the switch S2 is switched to the main processor 103; that is, the display screen 102 communicates with the main processor 103 through the switch S2, and the main processor 103 processes the relevant information of the display screen 102. After the processing authority of the display screen 102 is switched from the coprocessor 104 to the main processor 103, step 509 is entered.
  • Step 509 The main virtual touch driver 301b of the main processor 103 reads the user's touch data from the coprocessor application layer 302a.
  • Here, the user's touch data is the data, corresponding to the user's touch operation, that the touch chip 110 generates while the processing authority of the display screen 102 has been successfully switched from the coprocessor 104 to the main processor 103 but the touch screen 101 has not yet completed the switching, that is, while the processing authority of the display screen 102 is already in the main processor 103 and the processing authority of the touch screen 101 is still maintained in the coprocessor 104.
  • the coprocessor 104 first switches the display screen 102 to the main processor 103, and then switches the touch screen 101 to the main processor 103 after the user's one touch operation is completed, that is, after the user's finger or the touch device held by the user is lifted from the screen.
  • The coprocessor 104 then proceeds to step 512.
  • Step 510 The main virtual touch driver 301b of the main processor 103 reports the read touch data of the user to the main processor application layer 302b.
  • the processing authority of the touch screen 101 remains in the coprocessor 104.
  • In this case, the main processor 103 reads the user's touch data from the coprocessor application layer 302a through the main virtual touch driver 301b, and reports the read touch data of the user to the main processor application layer 302b for processing.
  • Step 511 After receiving the user's touch data, the main processor application layer 302b responds to the user's touch operation.
  • The main processor application layer 302b computes the touch data, determines the user's touch event corresponding to the touch data, and then responds according to the determined event.
  • For example, if the main processor application layer 302b determines that the user's operation type is a long press, with the user's finger pressing the same position for a long time, the response to the operation is to forward, favorite, edit, delete, multi-select, or quote the information in the information records of the smart watch 100.
  • Step 512 After the coprocessor 104 switches the display screen 102 to the main processor 103, it determines whether the current touch operation has ended. If it has ended, it indicates that the touch screen 101 can be switched to the main processor 103, and the process goes to step 513; if it has not ended, the coprocessor 104 continues to determine whether the current touch operation has ended.
  • The coprocessor 104 may determine whether a touch operation has ended by determining whether the time difference between the time when the user's finger leaves the screen and the time when the user starts touching the screen is greater than a set time threshold. For example, the time threshold for a sliding operation is set to 10 milliseconds. If the time difference between the time when the user's finger leaves the screen and the time when the user starts to touch the screen is greater than 10 milliseconds, the coprocessor 104 can determine that a sliding operation has ended; otherwise, the coprocessor 104 determines that a sliding operation has not ended.
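  • That end-of-operation check can be expressed directly, as in the sketch below, which assumes millisecond timestamps and reuses the 10 ms figure from the example purely as a placeholder threshold.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* A sliding operation is treated as over when the finger has left the screen
 * and the elapsed time since the touch began exceeds the configured threshold. */
bool sliding_operation_ended(uint32_t touch_start_ms,
                             uint32_t finger_up_ms,
                             uint32_t threshold_ms) {
    return (finger_up_ms - touch_start_ms) > threshold_ms;
}

int main(void) {
    const uint32_t threshold_ms = 10;  /* example threshold from the text */
    printf("%d\n", sliding_operation_ended(1000, 1025, threshold_ms)); /* 1: ended     */
    printf("%d\n", sliding_operation_ended(1000, 1005, threshold_ms)); /* 0: not ended */
    return 0;
}
```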
  • Step 513 the coprocessor 104 switches the touch screen 101 to the main processor 103 after determining that the current touch is over.
  • the coprocessor 104 controls the switch S1 of the touch screen 101 to switch to the main processor 103, so that the I2C interface of the touch screen 101 and the I2C interface of the main processor 103 are connected, that is, the touch screen 101 is switched to the main processor 103.
  • Step 514 In the case that the main processor 103 successfully controls the screen (that is, the main processor 103 processes the relevant information of the display screen 102 and the touch screen 101), that is, when the processing authority of the display screen 102 and the touch screen 101 has been successfully switched to the main processor 103, if the user touches the screen again, the main processor 103 reads from the touch chip 110 the touch data, generated by the touch chip 110, that corresponds to the user's re-touch.
  • the touch chip 110 of the smart watch 100 reports an interrupt message to the main touch driver 301 c of the main processor 103 .
  • The interrupt message here refers to a trigger signal generated when the touch chip 110 receives the user's touch operation again. If there is an interrupt message, it indicates that the user has started a touch operation on the touch screen 101 again, and the process goes to step 515; if there is no interrupt message, it indicates that the user has not touched the touch screen 101 again, the touch chip 110 is not triggered to generate the user's touch data again, and the process ends.
  • Step 515 the main touch driver 301 c of the main processor 103 reads the re-touch data from the touch chip 110 after receiving the interrupt message that the user touches again.
  • the main touch driver 301c reads touch data from the touch chip 110 through a serial peripheral interface (Serial Peripheral Interface, SPI).
  • Step 516 The main touch driver 301c of the main processor 103 reports the read re-touch data to the main processor application layer 302b for processing.
  • Step 517 After receiving the user's re-touch data, the main processor application layer 302b responds to the user's re-touch operation.
  • The main processor application layer 302b computes the touch data, determines the user's touch event corresponding to the re-touch data, and then responds according to the determined event. For example, the main processor application layer 302b identifies the data generated by the user's touch on the screen and determines that the user's operation is to press and hold the finger continuously at different positions on the screen of the smart watch 100; the response to the operation is then that the smart watch 100 drags and scrolls the information records to display the information records of different time periods.
  • The main processor 103 is woken up, and the smart watch 100 enters the desktop shown in FIG. 4(c).
  • That is, the smart watch 100 executes the screen control method described above to switch the processing authority of the display screen 102 and the touch screen 101 to the main processor 103.
  • Afterwards, when the user touches the screen again, the main processor 103 processes the user's touch data generated by the touch chip 110 and controls the display screen 102 to display frame by frame.
  • The following describes in detail the process in which, while the main processor 103 controls the screen (that is, the processing authority of the display screen 102 and the touch screen 101 is held by the main processor 103), the user taps the icon of a setting application on the desktop of the smart watch 100 and thereby triggers the main processor 103 to switch the processing authority of the display screen 102 and the touch screen 101 to the coprocessor 104.
  • Here, a setting application is an application that needs to be processed by the coprocessor 104, such as the calculator, timer, alarm clock, or sports application, which is used frequently and requires little computation.
  • Having the coprocessor 104, instead of the main processor 103, process these frequently and lengthily used applications reduces power consumption and improves the battery life of the smart watch 100.
  • For example, in the embodiment shown in FIG. 7(a), the interface currently displayed by the smart watch 100 is the desktop, which includes icons of multiple applications, such as a navigation icon, a sports application icon, a calculator icon, a weather icon, and a settings icon.
  • When the user taps the icon 112 of the sports application, the smart watch 100 runs the sports application and enters the interface shown in FIG. 7(b), which displays the number of kilometers and the pace of the user's run.
  • The currently displayed interface with the user's running distance and pace is rendered by the coprocessor 104 controlling the display screen 102. Since the sports application is run by the coprocessor 104, when the user taps the icon 112 of the sports application, the display screen 102 and the touch screen 101 of the smart watch 100 need to be switched from the main processor 103 to the coprocessor 104.
  • the applications executed by the main processor 103 and the coprocessor 104 respectively are as shown in Table 1.
  • It should be noted that the applications listed in Table 1 as executed by the main processor, such as navigation, phone, chat, shopping, tickets, photography, and mobile payment, are usually third-party applications, whereas applications such as sports, calculator, clock, weather, scientific sleep, intelligent heart rate, and blood oxygen saturation are usually applications developed by the device manufacturer itself. It can be understood that the application names listed as examples in Table 1 do not impose specific restrictions on the solution of the present application.
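One way to realize the partition suggested by Table 1 is a simple lookup that decides, when an icon is tapped, whether the screen has to be handed to the coprocessor. The table contents below are a plausible subset and the function names are assumptions for illustration.

```c
#include <stdio.h>
#include <string.h>
#include <stdbool.h>

/* Illustrative subset of Table 1: apps served by the low-power coprocessor. */
static const char *coprocessor_apps[] = {
    "sports", "calculator", "clock", "weather",
    "sleep", "heart_rate", "blood_oxygen",
};

static bool runs_on_coprocessor(const char *app) {
    for (size_t i = 0; i < sizeof coprocessor_apps / sizeof coprocessor_apps[0]; i++)
        if (strcmp(app, coprocessor_apps[i]) == 0)
            return true;
    return false;   /* navigation, phone, chat, ... stay on the main CPU */
}

int main(void) {
    const char *tapped = "sports";
    if (runs_on_coprocessor(tapped))
        printf("hand display and touch screen over to the coprocessor\n");
    else
        printf("keep the screen on the main processor\n");
    return 0;
}
```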
  • The following describes in detail, with reference to FIG. 1, FIG. 2, and FIG. 6 to FIG. 8, the process in which the user taps the icon 112 of the sports application on the desktop of the smart watch 100 and thereby triggers the main processor 103 to switch the processing authority of the display screen 102 and the touch screen 101 to the coprocessor 104.
  • As shown in FIG. 1(b), after the coprocessor 104 switches the processing authority of the touch screen 101 to the main processor 103 at time t3, the main processor 103 processes the relevant information of the display screen 102 and the touch screen 101.
  • At time t4, after the user taps the icon 112 of the sports application on the desktop of the smart watch 100, the main processor 103 is triggered to send a screen-switching interrupt request to the coprocessor 104, requesting the coprocessor 104 to control the display screen 102 and the touch screen 101.
  • After receiving the interrupt request sent by the main processor 103, the coprocessor 104 responds to it and controls the switch S2 of the display screen 102 to connect to the coprocessor 104; once the display screen 102 has been switched, it then controls the switch S1 of the touch screen 101 to connect to the coprocessor 104. At this point, the main processor 103 has switched the processing authority of the display screen 102 and the touch screen 101 to the coprocessor 104, and the coprocessor 104 is responsible for reading and processing the user's touch data and for controlling the display screen 102 to display frame by frame.
  • Specifically, as shown in FIG. 8, after the user taps the icon 112 of the sports application on the desktop of the smart watch 100, the process in which the main processor 103 switches the processing authority of the display screen 102 and the touch screen 101 to the coprocessor 104 includes the following steps:
  • Step 801: The sports application of the smart watch 100 sends a screen-off instruction to the main processor 103. After receiving the screen-off instruction, the main processor 103 executes step 802 and step 803 at the same time.
  • Step 802: The main processor 103 powers off its internal MIPI interface, that is, powers off the hardware interface connecting the main processor 103 to the switch S2 of the display screen 102, so as to disconnect the MIPI interface of the main processor 103 from the MIPI interface of the display screen 102.
  • Step 803: The main processor 103 sends an interrupt request to the coprocessor 104 to request a screen switch, that is, the main processor 103 requests the coprocessor 104 to switch the processing authority of the display screen 102 and the touch screen 101 from the main processor 103 to the coprocessor 104.
  • Step 804: After receiving the interrupt request for switching the screen, the coprocessor 104 responds and switches the processing authority of the display screen 102 from the main processor 103 to the coprocessor 104.
  • For example, as shown in FIG. 6, the main processor 103 can pull low the level of the GPIO interface it uses to send the switching request, and the coprocessor 104 likewise pulls low the level of the GPIO interface that controls the switching of the switch S2 of the display screen 102; the switch S2 is then switched to the coprocessor 104, that is, the display screen 102 communicates with the coprocessor 104 through the switch S2, and the coprocessor 104 starts to process the relevant information of the display screen 102.
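Reading steps 802 to 804 together with FIG. 6, the S2 handshake amounts to a level convention on two GPIO lines: a high level routes the display to the main processor and a low level routes it to the coprocessor. The sketch below only illustrates that convention; the pin numbers and helpers are assumed.

```c
#include <stdio.h>
#include <stdbool.h>

#define LEVEL_LOW  0
#define LEVEL_HIGH 1

/* Assumed pins: one GPIO carries the request, another selects switch S2. */
#define PIN_SWITCH_REQUEST 5
#define PIN_S2_SELECT      6

static void gpio_write(int pin, int level) {          /* stub */
    printf("GPIO %d -> %d\n", pin, level);
}

/* Convention used in the description: a high level routes the display (S2)
 * to the main processor, a low level routes it to the coprocessor. */
static void main_cpu_request_display(bool want_main) {
    gpio_write(PIN_SWITCH_REQUEST, want_main ? LEVEL_HIGH : LEVEL_LOW);
}

static void coprocessor_follow_request(int requested_level) {
    gpio_write(PIN_S2_SELECT, requested_level);  /* S2 now follows the request */
}

int main(void) {
    main_cpu_request_display(false);        /* step 804: hand the display away   */
    coprocessor_follow_request(LEVEL_LOW);  /* S2 switches to the coprocessor     */
    return 0;
}
```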
  • Step 805: After the switching of the display screen 102 is completed, the coprocessor 104 switches the processing authority of the touch screen 101 from the main processor 103 to the coprocessor 104.
  • For example, as shown in FIG. 2, the coprocessor 104 controls the switch S1 of the touch screen 101 to switch to the coprocessor 104, so that the I2C interface of the touch screen 101 is connected to the I2C interface of the coprocessor 104.
  • The coprocessor 104 is thus connected to the touch screen 101, and when the user touches the screen of the smart watch 100, the coprocessor 104 reads and processes the touch data generated by the touch chip 110 for the user's touch operation.
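The ordering constraint of steps 801 to 805 — release the MIPI interface, request the switch, move the display (S2) first and the touch screen (S1) only afterwards — can be summarized as in the following sketch. All hardware helpers are stand-ins rather than real driver calls.

```c
#include <stdio.h>

/* Stand-in hardware helpers; real ones would touch MIPI, GPIO and mux logic. */
static void mipi_power_down_main(void)      { puts("step 802: main MIPI interface powered down"); }
static void send_switch_irq_to_coproc(void) { puts("step 803: interrupt request sent to coprocessor"); }
static void s2_select_coprocessor(void)     { puts("step 804: display (S2) -> coprocessor"); }
static void s1_select_coprocessor(void)     { puts("step 805: touch screen (S1) -> coprocessor"); }

/* Main-processor side, triggered by the sports app's screen-off instruction (step 801). */
static void main_cpu_release_screen(void) {
    mipi_power_down_main();
    send_switch_irq_to_coproc();
}

/* Coprocessor side: the display must be taken over before the touch screen. */
static void coproc_take_screen(void) {
    s2_select_coprocessor();
    s1_select_coprocessor();
}

int main(void) {
    main_cpu_release_screen();
    coproc_take_screen();
    return 0;
}
```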
  • The above has introduced the process, shown in FIG. 5, of switching the processing authority of the display screen 102 and the touch screen 101 from the coprocessor 104 to the main processor 103, and the process, shown in FIG. 8, of switching that processing authority from the main processor 103 to the coprocessor 104. The following continues to take the smart watch 100 as an example to introduce a system interaction method provided by the present application.
  • For example, the smart watch 100 implements the switching between the main processor 103 and the coprocessor 104 in different scenarios by executing the system interaction method shown in FIG. 9.
  • It can be understood that, when the smart watch 100 switches between the main processor 103 and the coprocessor 104, the display screen 102 and the touch screen 101 can be switched from the coprocessor 104 to the main processor 103 by executing, for example, the screen control method shown in FIG. 5. Likewise, the smart watch 100 can switch the display screen 102 and the touch screen 101 from the main processor 103 to the coprocessor 104 by executing, for example, the screen control method shown in FIG. 8.
  • Specifically, as shown in FIG. 9, the system interaction method provided by this application includes the following steps:
  • Step 901: After the smart watch 100, while in the sleep state, detects a setting operation by the user, it wakes up the coprocessor 104, and the coprocessor 104 processes the information related to the display screen 102 and the touch screen 101.
  • The user's setting operation may be, for example, the user raising the wrist wearing the smart watch 100, pressing a button of the smart watch 100, or instructing the smart watch 100 to wake up through a voice command, which is not limited in this application.
  • The relevant information of the display screen 102 and the touch screen 101 includes, but is not limited to: one or more types of data, for example, touch data corresponding to the user's touch operation; one or more types of signaling, for example, in the embodiment shown in FIG. 5, the wake-up instruction sent by the coprocessor application layer to the main processor; one or more types of messages, for example, in the embodiment shown in FIG. 5, the interrupt message sent by the touch chip 110 to the co-touch driver 301a; and one or more types of notifications, requests, responses, signals, and the like.
  • Step 902: The coprocessor 104 determines whether there is a touch operation by the user. If so, the coprocessor 104 has detected the user's touch operation, and the flow goes to step 903a; if not, the coprocessor 104 has not detected a touch operation, and the flow goes to step 903b.
  • For example, in some embodiments, after the touch chip 110 detects the user's touch operation, it sends an interrupt message to the coprocessor 104.
  • The interrupt message is a trigger signal generated when the touch chip 110 receives a touch operation from the user. If an interrupt message exists, the user has started a touch operation on the touch screen 101, and the flow goes to step 903a; if not, the user has not touched the touch screen 101, and the flow goes to step 903b.
  • Step 903a: After detecting the user's touch operation, the coprocessor 104 wakes up the main processor 103 and switches the processing authority of the display screen 102 and the touch screen 101 to the main processor 103.
  • For example, in some embodiments, the touch chip 110 generates touch data and sends an interrupt message to the co-touch driver 301a of the coprocessor 104.
  • After receiving the interrupt message, the co-touch driver 301a reads the touch data from the touch chip 110 and reports the read touch data to the coprocessor application layer 302a.
  • After receiving the touch data, the coprocessor application layer 302a sends a wake-up instruction to the main processor 103.
  • After receiving the wake-up instruction, the main processor 103 completes its internal power-on and sends an interrupt request to the coprocessor 104, that is, the main processor 103 requests the coprocessor 104 to switch the processing authority of the display screen 102 and the touch screen 101 to the main processor 103.
  • After receiving the interrupt request sent by the main processor 103, the coprocessor 104 first switches the display screen 102 to the main processor 103. While the display screen 102 has been switched but the touch screen 101 has not, in order to avoid screen freezing during the switching process, the coprocessor application layer 302a sends the touch data to the main virtual touch driver 301b of the main processor 103. After the user's current touch ends, the coprocessor 104 switches the touch screen 101 to the main processor 103.
  • For the detailed process, refer to the above description of the screen control method shown in FIG. 5, which is not repeated here.
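The essence of step 903a — keep relaying touch samples of the ongoing gesture through the main virtual touch driver, and move switch S1 only after the finger lifts — might be rendered as below. The data path and helper names are illustrative assumptions, not the patent's actual interfaces.

```c
#include <stdio.h>
#include <stdbool.h>

typedef struct { int x, y; bool finger_up; } touch_sample_t;

/* Stand-ins for the real mechanisms described in the patent. */
static void s2_select_main(void) { puts("display (S2) -> main processor"); }
static void s1_select_main(void) { puts("touch screen (S1) -> main processor"); }
static void forward_to_virtual_driver(touch_sample_t s) {
    printf("virtual touch driver receives (%d, %d)\n", s.x, s.y);
}

/* Coprocessor side after the main processor requested the screen:
 * hand over the display immediately, keep relaying samples of the ongoing
 * gesture, and hand over the touch screen only once that gesture ends. */
static void coproc_hand_screen_to_main(const touch_sample_t *gesture, int n) {
    s2_select_main();
    for (int i = 0; i < n; i++) {
        if (gesture[i].finger_up)
            break;                          /* the current touch is over */
        forward_to_virtual_driver(gesture[i]);
    }
    s1_select_main();
}

int main(void) {
    touch_sample_t swipe[] = { {10, 80, false}, {40, 80, false}, {70, 80, false}, {70, 80, true} };
    coproc_hand_screen_to_main(swipe, 4);
    return 0;
}
```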
  • Step 903b The smart watch 100 turns off the screen after a timeout.
  • After the coprocessor 104 is woken up, if no touch operation by the user is detected within the set time, the coprocessor 104 sends a screen-off instruction to the display screen 102 to turn off the display screen 102.
  • For example, after the coprocessor 104 is woken up, it controls the display screen 102 to light up; if no touch operation by the user is detected within 5 seconds, the coprocessor 104 enters a sleep state, the main processor 103 continues to remain in its sleep state, and the display screen 102 is turned off.
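Step 903b is a plain inactivity timeout. A schematic version, with the 5-second window of the example above hard-coded and the screen-off and sleep calls stubbed out, could look like this.

```c
#include <stdio.h>
#include <stdbool.h>

#define INACTIVITY_LIMIT_MS 5000   /* 5 seconds, as in the example above */

static void screen_off(void)   { puts("display screen turned off"); }
static void coproc_sleep(void) { puts("coprocessor enters sleep state"); }

/* Called periodically after wake-up; elapsed_ms counts since the screen lit. */
static bool check_inactivity(int elapsed_ms, bool touch_seen) {
    if (touch_seen)
        return false;                       /* user is interacting, keep going */
    if (elapsed_ms >= INACTIVITY_LIMIT_MS) {
        screen_off();                       /* step 903b: time out and go dark */
        coproc_sleep();
        return true;
    }
    return false;
}

int main(void) {
    check_inactivity(3000, false);   /* still waiting           */
    check_inactivity(5200, false);   /* screen off, then sleep  */
    return 0;
}
```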
  • Step 904: The main processor 103 determines whether there is an interactive operation directed at a setting application. If so, the main processor 103 has detected the user's interactive operation on the setting application, and the flow goes to step 905; otherwise, the main processor 103 has not detected such an operation. In some embodiments, if the main processor 103 does not detect any operation by the user within a set time, the screen is turned off after the timeout.
  • Here again, a setting application is an application that needs to be processed by the coprocessor 104, such as the calculator, timer, alarm clock, or sports application, which is used frequently and requires little computation.
  • Having the coprocessor 104, instead of the main processor 103, process these frequently and lengthily used applications reduces power consumption and improves the battery life of the smart watch 100.
  • Step 905: The main processor 103 wakes up the coprocessor 104, and the main processor 103 switches the display screen 102 and the touch screen 101 to the coprocessor 104.
  • For example, when the user taps the sports application of the smart watch 100, the sports application of the smart watch 100 sends a screen-off instruction to the main processor 103.
  • After receiving the screen-off instruction, the main processor 103 powers off its internal MIPI interface, that is, powers off the hardware interface connecting the main processor 103 to the switch S2 of the display screen 102, so as to disconnect the MIPI interface of the main processor 103 from the MIPI interface of the display screen 102.
  • In addition, the main processor 103 sends an interrupt request to the coprocessor 104 to request a screen switch, that is, to request switching of the display screen 102 and the touch screen 101.
  • In response to receiving the interrupt request for switching the screen, the coprocessor 104 switches the display screen 102; after the display screen 102 has been switched, the touch screen 101 is switched. For the detailed process, refer to the above description of the screen control method shown in FIG. 8, which is not repeated here.
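Viewed end to end, the interaction of steps 901 to 905 behaves like a small state machine over who owns the screen. The sketch below is one possible rendering of that flow and is not taken from the patent text.

```c
#include <stdio.h>

typedef enum { BOTH_ASLEEP, COPROC_CONTROLS_SCREEN, MAIN_CONTROLS_SCREEN } screen_state_t;
typedef enum { EV_WRIST_RAISE, EV_TOUCH, EV_SET_APP_TAPPED, EV_TIMEOUT } event_t;

/* One step of the interaction flow in steps 901-905 (sketch). */
static screen_state_t next_state(screen_state_t s, event_t e) {
    switch (s) {
    case BOTH_ASLEEP:
        return (e == EV_WRIST_RAISE) ? COPROC_CONTROLS_SCREEN : BOTH_ASLEEP;   /* 901  */
    case COPROC_CONTROLS_SCREEN:
        if (e == EV_TOUCH)   return MAIN_CONTROLS_SCREEN;                      /* 903a */
        if (e == EV_TIMEOUT) return BOTH_ASLEEP;                               /* 903b */
        return COPROC_CONTROLS_SCREEN;
    case MAIN_CONTROLS_SCREEN:
        if (e == EV_SET_APP_TAPPED) return COPROC_CONTROLS_SCREEN;             /* 905  */
        if (e == EV_TIMEOUT)        return BOTH_ASLEEP;
        return MAIN_CONTROLS_SCREEN;
    }
    return s;
}

int main(void) {
    screen_state_t s = BOTH_ASLEEP;
    s = next_state(s, EV_WRIST_RAISE);
    s = next_state(s, EV_TOUCH);
    s = next_state(s, EV_SET_APP_TAPPED);
    printf("final state: %d\n", s);   /* back with the coprocessor */
    return 0;
}
```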
  • Embodiments disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementation methods.
  • Embodiments of the present application may be implemented as a computer program or program code executing on a programmable system including at least one processor, a storage system (including volatile and nonvolatile memory and/or storage elements) , at least one input device, and at least one output device.
  • Program code may be applied to input instructions to perform the functions described herein and to generate output information.
  • the output information can be applied to one or more output devices in a known manner.
  • For the purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
  • the program code may be implemented in a high-level procedural language or an object-oriented programming language to communicate with the processing system.
  • the program code may also be implemented in assembly or machine language, if desired.
  • the mechanisms described in this application are not limited in scope to any particular programming language. In either case, the language may be a compiled language or an interpreted language.
  • the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof.
  • The disclosed embodiments can also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which can be read and executed by one or more processors.
  • the instructions may be distributed over a network or via other computer-readable media.
  • Accordingly, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy disks, optical disks, compact discs, read-only memories (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable storage used to transmit information over the Internet by means of electrical, optical, acoustic, or other forms of propagated signals (for example, carrier waves, infrared signals, digital signals, and so on).
  • A machine-readable medium therefore includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
  • It should be noted that each unit/module mentioned in the device embodiments of this application is a logical unit/module. Physically, a logical unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented as a combination of multiple physical units/modules. The physical implementation of these logical units/modules is not what matters most; rather, the combination of functions implemented by these logical units/modules is the key to solving the technical problem raised by this application.
  • In addition, in order to highlight the innovative part of this application, the above device embodiments of this application do not introduce units/modules that are not closely related to solving the technical problem raised by this application, which does not mean that the above device embodiments contain no other units/modules.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

This application discloses a screen control method for an electronic device, a readable medium, and an electronic device. The method includes: a second processor of the electronic device processes relevant information of a display screen and a touch screen; when the electronic device detects that a user starts a first touch operation, it switches the processing authority of the display screen from the second processor to a first processor and sends first touch data of the detected first touch operation to the first processor via the second processor; and when the electronic device detects that the first touch operation has ended, it switches the processing authority of the touch screen from the second processor to the first processor. During screen switching, the solution of this application switches the processing authority of the display screen first and that of the touch screen afterwards, and the first processor receives the touch data through the second processor, so that the first processor can receive complete touch data, respond accurately to the user's touch events, achieve imperceptible switching, and improve user experience.

Description

电子设备的屏幕控制方法、可读介质和电子设备
本申请要求于2021年03月03日提交中国专利局、申请号为202110236124.X、申请名称为“电子设备的屏幕控制方法、可读介质和电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,特别涉及一种电子设备的屏幕控制方法、可读介质和电子设备。
背景技术
电子设备的触摸屏作为人机交互的载体,广泛应用于各种智能电子设备中。用户只需用手指触摸智能电子设备的触摸屏,就能实现对电子设备的操作,从而实现更为直观、便捷的人机交互。
为了实现更高的能效比,通常会在电子设备的硬件架构上采用双处理器方案,即一个高性能的主处理器负责操作系统的运行,处理高运算量的任务,如地图,导航,电话等功能。一个低功耗协处理器负责一些低运算量的任务,诸如传感器数据的采集和处理等。这样一来,由于高性能的主处理器和低功耗的协处理器共享一套显示屏和触摸屏,在系统之间切换时,容易造成用户界面的丢点、卡顿等问题。
发明内容
本申请实施例提供了一种电子设备的屏幕控制方法、可读介质和电子设备。本申请方案在屏幕切换过程中,通过先切换显示屏的处理权限,待显示屏切换完成后再切换触摸屏的处理权限。并且在触摸屏切换的过程中,第一处理器通过第二处理器接收触摸数据,使得第一处理器可以接收到完整的触摸数据,从而准确地解析出与该次完整的触摸数据对应的用户的本次操作事件,进而准确地响应用户的本次操作事件。避免在屏幕切换过程中,导致电子设备生成的与用户操作对应的数据的丢失,从而给用户带来滑动不流畅的体验的问题的出现。实现无感切换,提升用户体验。
第一方面,本申请实施例提供了一种电子设备的屏幕控制方法,包括:
电子设备的第二处理器处理显示屏和触摸屏的相关信息;电子设备在检测到用户开始第一触摸操作的情况下,将显示屏的处理权限由第二处理器切换到第一处理器,并将检测到的第一触摸操作的第一触摸数据经由第二处理器发送给第一处理器,并且电子设备在检测到第一触摸操作结束的情况下,将触摸屏的处理权限由第二处理器切换到第一处理器。
其中,显示屏和触摸屏的相关信息包括但不限于:一种或多种类型的数据,例如,对应于用户触摸操作的触摸数据;一种或多种类型的信令,例如,在图5所示的实施例中,协处理器应用层发送给主处理器的唤醒指令;一种或多种类型的消息,例如,在图5所示的实施例中,触摸芯片向协触控驱动发送的中断消息;一种或多种类型的通知,一种或多种类型的请求,一种或多种类型的响应,一种或多种类型的信号等。
例如,在一些实施例中,电子设备在休眠状态时,由第二处理器处理显示屏和触摸屏的相关信息。当电子设备检测到用户开始在屏幕上滑动时,将显示屏的处理权限由第二处理器切换到第一处理器,并将检测到的第一触摸操作的第一触摸数据经由第二处理器发送给第一处理器。并且,当电子设备检测到 用户的手指离开屏幕时,确定出用户的本次触摸操作结束,将触摸屏的处理权限由第二处理器切换到第一处理器。在触摸屏切换的过程中,第一处理器通过第二处理器接收触摸数据,使得第一处理器可以接收到完整的触摸数据,从而准确地解析出与该次完整的触摸数据对应的用户的本次操作事件,进而准确地响应用户的本次操作事件。实现无感切换,提升用户体验。
应理解:本文中“显示屏的处理权限”是指显示屏相关信息的处理权限,“触摸屏的处理权限”是指触摸屏相关信息的处理权限。
在上述第一方面的一种可能的实现中,上述方法还包括:电子设备包括虚拟触控驱动,电子设备将第一触摸操作的第一触摸数据经由第二处理器发送给虚拟触控驱动,第一处理器经由虚拟触控驱动接收第一触摸数据。
在一些实施例中,第一处理器为主处理器,可以运行高性能的操作系统,处理使用频率较低的高运算量的任务。例如,主处理器运行
Figure PCTCN2022077977-appb-000001
系统,支持导航、电话、地图、聊天、音乐播放等功能。
在一些实施例中,第二处理器为协处理器,可以运行低功耗的轻量级的系统,处理使用频率较高的低运算量的任务。例如,协处理器运行轻量级的嵌入式系统,负责传感器数据的采集、处理,支持时间显示、计算器、定时器、闹钟、测心率、计步、测高度等功能。
在一些实施例中,虚拟触控驱动为图3所示的实施例中的主虚拟触控驱动,用于在显示屏的处理权限已经被切换给主处理器,而触摸屏的处理权限依然在协处理器的情况下,主处理器通过主虚拟触控驱动读取由协处理器发来的触摸数据。
在上述第一方面的一种可能的实现中,上述方法还包括:电子设备在检测到用户开始第一触摸操作的情况下,将显示屏的处理权限由第二处理器切换到第一处理器包括:
电子设备在检测到用户开始第一触摸操作的情况下,通过第二处理器向第一处理器发送唤醒指令;第一处理器接收到唤醒指令后,响应唤醒指令,并且向第二处理器发送屏幕切换的中断请求;第二处理器接收到屏幕切换的中断请求后,响应中断请求,将显示屏的处理权限由第二处理器切换到第一处理器。
在一些实施例中,第一处理器为主处理器,第二处理器为协处理器,电子设备通过协处理器应用层向主处理器发送唤醒指令。
在上述第一方面的一种可能的实现中,上述方法还包括:
电子设备的第一处理器处理显示屏和触摸屏的相关信息;电子设备在检测到用户有针对电子设备的设定应用的交互操作的情况下,将显示屏和触摸屏的处理权限由第一处理器切换到第二处理器,由电子设备的第二处理器处理显示屏和触摸屏的相关信息。
其中,设定应用是指需要由第二处理器处理的例如计算器、定时器、闹钟、运动等使用频率较高、运算量较低的应用。由第二处理器代替第一处理器来处理使用频率高、时间长的应用,可以实现降低功耗,提升电子设备的续航能力。
在上述第一方面的一种可能的实现中,上述方法还包括:电子设备在检测到用户有针对电子设备的设定应用的交互操作的情况下,将显示屏和触摸屏的处理权限由第一处理器切换到第二处理器,由电子设备的第二处理器处理显示屏和触摸屏的相关信息包括:
电子设备在检测到用户有针对电子设备的设定应用的交互操作的情况下,将显示屏的处理权限由第一处理器切换到第二处理器;在电子设备确定出显示屏的处理权限由第一处理器切换到第二处理器之后,将触摸屏的处理权限由第一处理器切换到第二处理器。
如此,可以避免在屏幕切换过程中,用户触摸数据的丢失,实现无感切换,提升用户体验。
在上述第一方面的一种可能的实现中,上述方法还包括:电子设备的第一处理器处理显示屏和触 摸屏的相关信息;电子设备在检测到用户开始第二触摸操作的情况下,将检测到的第二触摸操作的第二触摸数据发送给第一处理器。
例如,在一些实施例中,当前正在运行电子设备的第一处理器,由第一处理器控制显示屏和触摸屏,当用户的手指在电子设备的屏幕上滑动时,由第一处理器读取由触摸芯片生成的对应于用户本次触摸操作的触摸数据。
在上述第一方面的一种可能的实现中,上述方法还包括:电子设备包括显示屏切换开关,显示屏切换开关电连接显示屏,并且电子设备在检测到用户开始第一触摸操作的情况下,通过以下方式将显示屏的处理权限由第二处理器切换到第一处理器:
电子设备在检测到用户开始第一触摸操作的情况下,控制显示屏切换开关与第二处理器断开连接,并控制显示屏切换开关连通第一处理器。
例如,在一些实施例中,第一处理器和第二处理器的MIPI接口通过开关S2连接到显示屏的MIPI接口上。电子设备在检测到用户开始第一触摸操作的情况下,控制开关S2与第二处理器断开连接,并控制开关S2连通第一处理器。
在上述第一方面的一种可能的实现中,上述方法还包括:电子设备包括触摸屏切换开关,触摸屏切换开关电连接触摸屏,并且电子设备在检测到第一触摸操作结束的情况下,通过以下方式将触摸屏的处理权限由第二处理器切换到第一处理器:
电子设备在检测到第一触摸操作结束的情况下,控制触摸屏切换开关与第二处理器断开连接,并控制触摸屏切换开关连通第一处理器。
例如,在一些实施例中,第一处理器和第二处理器的I2C接口通过开关S1连接到触摸屏的I2C接口上,电子设备在检测到第一触摸操作结束的情况下,控制开关S1与第二处理器断开连接,并控制开关S1连通第一处理器。
第二方面,本申请实施例提供了一种可读介质,可读介质上存储有指令,该指令在电子设备上执行时使电子设备执行上述第一方面以及第一方面的各种可能实现中的任意一种屏幕控制方法。
第三方面,本申请实施例提供了一种电子设备,包括:
屏幕,屏幕包括显示屏和触摸屏;
存储器,用于存储由电子设备的一个或多个处理器执行的指令,以及
第一处理器,是电子设备的处理器之一;
第二处理器,是电子设备的处理器之一,用于与第一处理器配合执行上述第一方面以及第一方面的各种可能实现中的任意一种屏幕控制方法。
附图说明
图1(a)示出了相关技术方案中,电子设备的一次屏幕切换的时序图;
图1(b)根据本申请的一些实施例,示出了电子设备在执行本申请提供的屏幕的切换控制方法的情况下,电子设备的一次屏幕切换的时序图;
图1(c)根据本申请的一些实施例,示出了本申请提供的一种屏幕的切换控制方法的应用场景;
图2根据本申请的一些实施例,示出了图1(c)所示的智能手表的硬件结构框图;
图3根据本申请的一些实施例,示出了图1(c)所示的智能手表的系统架构图;
图4(a)根据本申请的一些实施例,示出了智能手表处于灭屏状态的界面图;
图4(b)根据本申请的一些实施例,示出了用户抬腕/按键后智能手表亮屏的界面图;
图4(c)根据本申请的一些实施例,示出了用户滑动屏幕后智能手表显示的桌面图;
图5根据本申请的一些实施例,示出了本申请提供的一种屏幕的切换控制方法的交互图;
图6根据本申请的一些实施例,示出了协处理器切换显示屏的原理图;
图7(a)根据本申请的一些实施例,示出了用户点击智能手表桌面的运动应用的示意图;
图7(b)根据本申请的一些实施例,示出了用户点击智能手表桌面的运动应用后,进入运动应用的界面图;
图8根据本申请的一些实施例,示出了本申请提供的另一种屏幕的切换控制方法的交互图;
图9根据本申请的一些实施例,示出了本申请提供的一种系统交互方法的流程图。
具体实施方式
本申请的实施例包括但不限于一种电子设备的屏幕控制方法、可读介质和电子设备。
为了解决在具有双处理器的电子设备中进行屏幕切换时出现数据丢失、屏幕卡顿的问题,本申请实施例提供了一种电子设备的屏幕控制方法。具体地,当电子设备在一些场景下进行系统切换时,先切换显示屏的处理权限,待显示屏的处理权限切换完成后再切换触摸屏的处理权限。例如,电子设备为双处理器设备,具有主处理器和协处理器,用户将该电子设备佩戴在手腕上,当用户抬起佩戴有该电子设备的手腕时,会唤醒电子设备的协处理器,由协处理器控制显示屏和触摸屏,当用户的手指在电子设备的屏幕上滑动时,电子设备先将显示屏的处理权限切换给主处理器,待用户的本次滑动操作完成之后,电子设备再将触摸屏的处理权限切换给主处理器。
在相关技术方案中,如图1(a)所示,在t1时刻,当处于灭屏状态的电子设备接收到用户的按键操作后,协处理器被唤醒,由协处理器控制显示屏和触摸屏;在t2时刻,当用户的手指开始在电子设备的屏幕上滑动时,电子设备同时将显示屏和触摸屏的处理权限均切换给主处理器,由主处理器控制显示屏和触摸屏。这样有可能会造成在用户一次完整的触摸操作还未完成的情况下,电子设备将触摸屏的相关信息的处理权限由协处理器切换到主处理器,从而导致在触摸屏的切换过程中,用户的触摸数据部分丢失,从而导致主处理器无法完全接收触摸数据,从而无法准确地解析出与用户触摸数据对应的用户的操作事件,进而无法准确地响应用户的操作事件,而出现屏幕切换过程中的卡顿现象,影响用户体验。
而在本申请的一些实施例中,如图1(b)所示,在t1时刻,当处于灭屏状态的电子设备接收到用户的按键操作后,协处理器被唤醒,由协处理器控制显示屏和触摸屏;在t2时刻,当用户的手指开始在电子设备的屏幕上滑动时,主处理器被唤醒,电子设备先将显示屏的处理权限切换给主处理器,由主处理器控制显示屏,而触摸屏的处理权限依然在协处理器;在t3时刻,当用户的手指从电子设备的屏幕上抬起,即本次滑动操作结束后,电子设备再将触摸屏的处理权限切换给主处理器。
需要说明的是,在本申请的一些实施例中,当显示屏的处理权限被切换给主处理器,而触摸屏的处理权限依然保持在协处理器的情况下,电子设备通过创建对应于主处理器中关于触摸屏的虚拟驱动(为了简化描述,下文简称主虚拟触控驱动),来接收由协处理器发来的触摸数据。从而使得当主处理器已经开始控制显示屏逐帧显示图像,而触摸屏还未完成切换的过程中,主处理器可以通过主虚拟触控驱动接收对应于用户本次触摸操作的完整的触摸数据。进而使得主处理器可以根据接收到的完整的触摸数据,准确地解析出与该次完整的触摸数据对应的用户的本次操作事件,进而准确地响应用户的本次操作事件。避免在系统切换过程中,导致电子设备生成的与用户操作对应的数据的丢失,从而给用户带来滑动不流畅的体验的问题的出现。实现无感切换,提升用户体验。
可以理解的是,当触摸屏的处理权限在主处理器时,由主处理器负责处理电子设备生成的与用户的 触摸操作对应的触摸数据;当触摸屏的处理权限在协处理器时,由协处理器负责处理电子设备生成的与用户的触摸操作对应的触摸数据。
下面将结合附图对本申请的实施例作进一步地详细描述。
图1(c)根据本申请的一些实施例,示出了一种本申请提供的屏幕控制方法的应用场景。其中,用户的左手手腕处佩戴着双处理器的智能手表100。智能手表100包括主处理器和协处理器,主处理器和协处理器共用一个屏幕。
其中,主处理器可以运行高性能的操作系统,处理使用频率较低的高运算量的任务。例如,主处理器运行
Figure PCTCN2022077977-appb-000002
系统,支持导航、电话、地图、聊天、音乐播放等功能。协处理器可以运行低功耗的轻量级的系统,处理使用频率较高的低运算量的任务。例如,协处理器运行轻量级的嵌入式操作系统,负责传感器数据的采集、处理,支持时间显示、计算器、定时器、闹钟、测心率、计步、高度测量等功能。通过低功耗的协处理器来代替主处理器来处理使用频率高、运算量低的任务,从而降低功耗,提升智能手表100的续航能力。
另外,主处理器和协处理器共用一个屏幕,屏幕包括显示屏和触摸屏。在一些实施例中,屏幕可以由触摸屏和显示屏叠在一起组成。基于双处理器的硬件架构,智能手表100处于灭屏状态时,主处理器和协处理器均处于休眠状态,由协处理器控制显示屏和触摸屏;当处于休眠状态的协处理器检测到用户对触摸屏的触摸操作或者用户的按键操作时,将显示屏和触摸屏的处理权限切换给主处理器;在主处理器控制显示屏和触摸屏的过程中,若主处理器检测到用户点击了一些需要协处理器处理的应用时,再将显示屏和触摸屏的处理权限切换给协处理器。
然而在相关技术中,显示屏和触摸屏的处理权限在主处理器和协处理器之间的切换过程中,会出现用户的触摸数据部分缺失,用户对触摸屏的滑动不流畅,出现显示给用户的界面卡顿的现象。
为了解决上述问题,当智能手表100的显示屏和触摸屏的处理权限在主处理器和协处理器之间切换时,智能手表100通过执行本申请实施例提供的屏幕的切换控制方法,先切换显示屏的处理权限,待显示屏的处理权限切换完成后再切换触摸屏的处理权限。例如,在将显示屏和触摸屏的处理权限由协处理器切换到主处理器的过程中,主处理器通过主虚拟触控驱动来接收由协处理器发来的触摸数据。从而保证在显示屏和触摸屏的切换过程中,智能手表100可以完整地获取到触摸屏生成的与用户的触摸操作对应的数据,从而准确地响应用户的触摸操作,执行相应的任务。在降低整机功耗的同时,实现无感切换,提升用户体验。
可以理解的是,本申请的技术方案可以应用于具有双处理器、显示屏和触摸屏的各种电子设备。包括但不限于,智能手表、膝上型计算机、台式计算机、平板计算机、手机、服务器、可穿戴设备、头戴式显示器、移动电子邮件设备、便携式游戏机、便携式音乐播放器、阅读器设备、其中嵌入或耦接有一个或多个处理器的电视机、或能够访问网络的其他电子设备。
在下文的描述中,为了简化说明,以智能手表100为例,介绍本申请的技术方案。
图2根据本申请的一些实施例,示出了图1(c)所示的智能手表100的硬件结构框图。如图2所示,智能手表100包括触摸屏101、显示屏102、主处理器103、协处理器104、存储器105、通信模块106、传感器模块107、电源108、触摸屏101的切换开关S1、显示屏102的切换开关S2、电源管理系统109以及触摸芯片110等。
触摸屏101,也可以称为触控面板,可以采集用户在屏幕上的点击、滑动等的触摸操作。在一些实施例中,触摸屏101可以通过I2C(Inter-Integrated Circuit)总线和主处理器103、协处理器104通信。触摸屏101可以为电阻式、表面电容式、投射电容式、红外线式、表面声波式、弯 曲波式、有源数字切换器式以及光学成像式触摸屏。
触摸芯片110与触摸屏101电连接,触摸屏101工作时,触摸芯片110会以一定的扫描频率对触摸屏101进行扫描,以获取用户的触摸数据,例如触摸点的坐标、压力、面积以及切边值等数据。再将获取的用户的触摸数据上报至主处理器103或协处理器104进行处理,主处理器103或协处理器104对触摸数据进行处理后,将对应用户触摸操作的图像逐帧显示出来。
显示屏102可以用于显示用户输入的信息、提供给用户的提示信息、智能手表100上的各种菜单、智能手表100的各种应用程序的操作界面等等。例如,显示屏102可以用于显示当前的时间、健康检测应用测得的用户的心率、运动应用计算的用户运动时的步数等等。在一些实施例中,显示屏102可以通过移动行业处理器接口(Mobile Industry Processor Interface,MIPI)总线和主处理器103、协处理器104通信。显示屏102可以包括显示面板,显示面板可以采用液晶显示屏(Liquid Crystal Display,LCD),有机发光二极管(Organic Light-emitting Diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(Active-matrix Organic Light-emitting Diode的,AMOLED),柔性发光二极管(Flex Light-emitting Diode,FLED),Mini LED,Micro LED,Micro OLED,量子点发光二极管(Quantum Dot Light-emitting Diodes,QLED)等。
主处理器103包括多个处理单元,可以运行
Figure PCTCN2022077977-appb-000003
等操作系统,用来处理与导航、电话、地图、聊天、音乐播放等应用相关的任务,以及在智能手表100的触摸屏101和显示屏102的处理权限切换给主处理器103的情况下,处理用户对触摸屏101触摸所产生的数据等。
协处理器104,可以运行轻量级的嵌入式操作系统,用于负责传感器数据的采集、处理,处理与时间显示、计算器、定时器、闹钟、测心率、计步、高度测量等应用相关的任务,以及在智能手表100的触摸屏101和显示屏102的处理权限切换给协处理器104的情况下,处理用户对触摸屏101触摸的数据等。在一些实施例中,协处理器104可以包括数字信号处理器(Digital Signal Processer,DSP)、微控制单元(Microcontroller Unit,MCU)、可编程逻辑器件(Field Programmable Gate Array,FPGA)、专用集成电路(Application Specific Integrated Circuit,ASIC)等的处理模块或处理电路。
在一些实施例中,主处理器103和协处理器104的I2C接口通过开关S1连接到触摸屏101的I2C接口上,主处理器103和协处理器104的MIPI接口通过开关S2连接到显示屏102的MIPI接口上。并且主处理器103和协处理器104通过通用输入输出(General-purpose input/output,GPIO)接口(图中未示出)连接到开关S2上,主处理器103通过GPIO接口电平的拉高或拉低,来向协处理器104发送显示屏102的切换请求,待显示屏102的处理权限完成切换后,再切换触摸屏101的处理权限。
存储器105用于存储软件程序以及数据,处理器203通过运行存储在存储器105中的软件程序以及数据,执行智能手表100的各种功能应用以及数据处理。例如,在本申请的一些实施例中,存储器105可以存储用户在运动过程中,传感器采集到的气压、温度等数据;存储用户的睡眠数据、心率数据等等。同时,存储器105也可以存储用户的注册信息、登录信息等。
通信模块106可以用来使智能手表100和其他电子设备进行通信,并通过其他电子设备连接网络。例如,智能手表100通过通信模块106和手机、服务器等电子建立连接,进行数据传输。
传感器模块107可以包括接近光传感器、压力传感器,陀螺仪传感器,气压传感器,磁传感器,加速度传感器,距离传感器,指纹传感器,温度传感器,触摸传感器,环境光传感器,骨传导传感器等。
电源108用于为智能手表100的各个部件供电。电源108可以为电池。
电源管理系统109用于管理电源108的充电以及电源108向其他模块的供电。
可以理解,图2所示的智能手表100仅仅是实现本申请技术方案中智能手表100的功能的一种示例性结构,并不构成对智能手表100的具体限定。在本申请的另一些实施例中,智能手表100可以包括比图2所示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图2所示的部件可以以硬件,软件或软件和硬件的组合实现。
图3根据本申请的一些实施例,示出了图2所示的智能手表100中由主处理器103和协处理器104控制运行的两个操作系统(如
Figure PCTCN2022077977-appb-000004
系统和轻量级的嵌入式操作系统)的架构图。如图3所示,智能手表100的主处理器103和协处理器104运行的两个操作系统均包括应用层302、驱动层301和器件层300,故在此不做区分,而对各层中具体的原件进行区分。其中,主处理器103和协处理器104共享显示屏102和触摸屏101以及触摸芯片110。
如图3所示,器件层300包括显示屏102、触摸屏101、显示屏102的切换开关S2、触摸屏101的切换开关S1、主处理器103、协处理器104以及触摸芯片110等。器件层300中各个器件的功能请参考上述关于图2部分的文字描述,在此不再赘述。
驱动层301用于驱动上述器件层300中的各个器件,实现对各个器件的读写访问,中断设置等功能。
如图3所示,在协处理器104运行的轻量级的嵌入式操作系统中,驱动层301包括但不限于协触控驱动301a。协触控驱动301a可以是协处理器104运行的轻量级的嵌入式操作系统中用于驱动触摸屏101,并能够读取触摸芯片110生成的触摸数据的软件程序。
例如,在图1(b)所示的实施例中,在t1时刻,当处于灭屏状态的智能手表100接收到用户的按键操作后,协处理器104被唤醒,由协处理器104处理显示屏102和触摸屏101的相关信息,其中,显示屏102和触摸屏101的相关信息包括但不限于:一种或多种类型的数据,例如,对应于用户触摸操作的触摸数据;一种或多种类型的信令,例如,在图5所示的实施例中,协处理器应用层发送给主处理器103的唤醒指令;一种或多种类型的消息,例如,在图5所示的实施例中,触摸芯片110向协触控驱动301a发送的中断消息;一种或多种类型的通知,一种或多种类型的请求,一种或多种类型的响应,一种或多种类型的信号等。协处理器104通过协触控驱动301a驱动触摸屏101,并读取用户的触摸数据。
应理解:本文中“显示屏102的处理权限”是指显示屏101相关信息的处理权限,“触摸屏101的处理权限”是指触摸屏101相关信息的处理权限。
在主处理器103运行的
Figure PCTCN2022077977-appb-000005
系统中,驱动层301包括但不限于用于接收及发送用户的触摸数据的主虚拟触控驱动301b、主触控驱动301c等。其中,主触控驱动301c是主处理器103中运行的高性能的操作系统中用于驱动触摸屏101的软件程序,并能够在当显示屏102和触摸屏101的处理权限均切换到主处理器103的情况下,读取图3中所示的由触摸芯片110生成的切换后的触摸数据(即显示屏102和触摸屏101的处理权限均已切换到主处理器103后,用户再次触摸屏幕,由触摸芯片110生成的对应用户的再次触摸操作的数据)。
主虚拟触控驱动301b是主处理器103在初始化时,创建的对应于触摸屏101的虚拟驱动,用于在显示屏102的处理权限已经切换到主处理器103,而触摸屏101的处理权限还处于协处理器104时,接收由协处理器应用层302a发来的触摸数据。
例如,在图1(b)所示的实施例中,在t2时刻,当用户的手指开始在智能手表100的屏幕上滑动时,主处理器103被唤醒,智能手表100先将显示屏102的处理权限切换给主处理器103,由主处理器103处理显示屏102的数据,而触摸屏101的处理权限依然在协处理器104。在显示屏102的处理权限 已经被切换给主处理器103,而触摸屏101的处理权限依然在协处理器104的情况下,主处理器103通过主虚拟触控驱动301b读取由协处理器104发来的触摸数据。
在t3时刻,当用户的手指从智能手表100的屏幕上抬起,即本次滑动操作结束后,智能手表100再将触摸屏101的处理权限切换给主处理器103。主处理器103通过主触控驱动301c读取由协处理器104发来的触摸数据。
继续参考图3,在一些实施例中,在协处理器104运行的轻量级的嵌入式操作系统中,应用层302包括协处理器应用层302a。协处理器应用层302a用于当显示屏102和触摸屏101的处理权限在协处理器104时,将协处理器104的协触控驱动301a从触摸芯片110读取的触摸数据进行计算,确定出对应该触摸数据的用户的触摸事件类型,再根据确定出的事件类型,通过应用程序进行响应。
例如,在图1(b)所示的实施例中,在t1时刻,当处于灭屏状态的智能手表100接收到用户的按键操作后,协处理器104被唤醒,由协处理器104处理显示屏102和触摸屏101的相关信息。协处理器104通过协触控驱动301a驱动触摸屏101,并读取用户的触摸数据。协处理器应用层302a在接收到协触控驱动301a上报的用户的触摸数据进行计算,确定出对应该触摸数据的用户的触摸事件类型,再根据确定出的事件类型,通过应用程序进行响应。
相应地,在主处理器103运行的
Figure PCTCN2022077977-appb-000006
系统中,应用层302包括主处理器应用层302b。主处理器应用层302b用于当显示屏102和触摸屏101的处理权限在主处理器103时,将主处理器103的主触控驱动301c从触摸芯片110读取的触摸数据进行计算,确定出对应该触摸数据的用户的触摸事件,再根据确定出的事件,通过应用程序进行响应。
例如,在图1(b)所示的实施例中,在t2时刻,当用户的手指开始在智能手表100的屏幕上滑动时,主处理器103被唤醒,智能手表100先将显示屏102的处理权限切换给主处理器103。在t3时刻,当用户的手指从智能手表100的屏幕上抬起,即本次滑动操作结束后,智能手表100再将触摸屏101的处理权限切换给主处理器103。由主处理器103处理显示屏102和触摸屏101的相关信息,当用户再次点击屏幕时,触摸芯片110生成对应用户该次点击操作的触摸数据。主触控驱动301c从触摸芯片110读取该触摸数据后,将触摸数据上报给主处理器应用层302b。主处理器应用层302b对用户点击屏幕的数据进行识别,确定出用户的操作为点击音乐播放应用的图标,进而打开音乐播放应用。
以下将参考图2至图6,以智能手表100在灭屏状态下,用户抬腕或按键唤醒协处理器104,用户通过手指触摸屏幕的场景下,对本申请方案提供的一种屏幕控制方法进行详细介绍。
例如,在图4(a)所示的实施例中,智能手表100处于灭屏状态,主处理器103和协处理器104都处于休眠状态。在图4(b)所示的实施例中,当用户抬腕或按下智能手表100的按键111后,屏幕亮起,表盘显示时间界面,协处理器104被唤醒,并且显示屏102和触摸屏101的处理权限在协处理器104,由协处理器104处理触摸芯片110生成的用户的触摸数据,协处理器104对触摸数据处理后,控制显示屏102逐帧显示。屏幕亮起后,若超过设定时间未检查到用户的触摸操作,则智能手表100再次灭屏,主处理器103和协处理器104重新进入休眠状态。若在设定时间内检测到用户的触摸操作,例如,用户在图4(b)所示的智能手表100的显示的时间界面上滑动后,进入如图4(c)所示的智能手表100的桌面,其中包括多个应用图标,例如导航图标、运动应用图标、计算器图标、天气图标、设置图标等,则智能手表100通过执行本申请方案提供的屏幕控制方法将显示屏102和触摸屏101的处理权限切换给主处理器103,即由主处理器103 处理触摸芯片110生成的用户的触摸数据,主处理器103对触摸数据处理后,控制显示屏102逐帧显示。
具体地,在一些实施例中,如图5所示,当显示屏102和触摸屏101的处理权限在协处理器104,协处理器104在检测到用户的触摸操作的情况下,将显示屏102和触摸屏101的处理权限切换到主处理器103的过程包括以下步骤:
步骤501:智能手表100生成用户的触摸数据。例如,在一些实施例中,用户触摸屏幕,触摸芯片110生成用户触摸点的坐标、压力、面积以及切边值等数据。
此外,可以理解的是用户触摸智能手表100的操作可以为点击、长按、双击、滑动等操作。例如,用户通过手指点击触摸屏101,或者用户的手指从触摸屏101上的一个位置滑动到另一个位置。用户对智能手表100的触摸操作可以是用户通过手指进行的,也可以是用户通过触摸笔或其他触摸设备进行的。
步骤502:智能手表100的触摸芯片110将中断消息上报给协处理器104的协触控驱动301a。
其中,中断消息是指触摸芯片110接收到用户的触摸操作时产生的触发信号。如果存在中断消息,则表明用户开始了对触摸屏101的触摸操作,则进入步骤503;如果没有中断消息,则表明用户未对触摸屏101进行触摸操作,未触发触摸芯片110生成用户的触摸数据,返回步骤501。
步骤503:协处理器104的协触控驱动301a接收中断消息后,从触摸芯片110中读取触摸数据。
例如,在一些实施例中,协处理器104的协触控驱动301a接收中断消息后,通过串行外设接口(Serial Peripheral Interface,SPI)从触摸芯片110中读取触摸数据。
步骤504:协处理器104的协触控驱动301a将读取的触摸数据上报至协处理器104的协处理器应用层302a。协处理器应用层302a在接收到协触控驱动301a上报的触摸数据后,执行步骤505。
步骤505:协处理器104的协处理器应用层302a在接收到触摸数据之后,向主处理器103发送唤醒指令。主处理器103在接收到唤醒指令后,同时执行步骤506和步骤507。
步骤506:主处理器103在接收到唤醒指令后,完成内部MIPI接口的上电。即完成主处理器103与显示屏102的切换开关S2连接的硬件接口的上电,以使当显示屏102的切换开关S2切换到主处理器103时,主处理器103能够通过MIPI接口连通显示屏102的MIPI接口。
步骤507:主处理器103向协处理器104发送中断请求,请求控制屏幕。即请求协处理器104将显示屏102和触摸屏101的处理权限切换给主处理器103。
步骤508:协处理器104在接收到主处理器103发来的中断请求之后,响应该中断请求,先将显示屏102的处理权限切换给主处理器103。
例如,如图6所示,主处理器103可以将用于发送切换请求的GPIO接口电平拉高,协处理器104把控制显示屏102的切换开关S2切换的GPIO接口电平也拉高,则开关S2切换到主处理器103,即显示屏102通过开关S2与主处理器103连通,由主处理器103处理显示屏102的相关信息。当显示屏102的处理权限由协处理器104切换到主处理器103之后,进入步骤509。
步骤509:主处理器103的主虚拟触控驱动301b从协处理器应用层302a读取用户的触摸数据。其中,用户的触摸数据为显示屏102的处理权限从协处理器104成功切换到主处理器103,而触摸屏101暂未完成切换,即显示屏102的处理权限已经切换到主处理器103,而触摸屏101的处理权限保持在协处理器104的过程中,触摸芯片110产生的对应用户触摸操作的数据。
需要说明的是,若在用户一次完整的触摸操作还未完成,例如用户的手指或用户手持的触摸 设备还未从屏幕上抬起的情况下,就将触摸屏101的处理权限从协处理器104切换到主处理器103,则会导致用户的触摸数据部分丢失,从而导致主处理器103无法完全接收触摸数据,从而无法准确地解析出与用户触摸数据对应的用户的操作事件,进而无法准确地响应用户的操作事件,出现屏幕切换过程中的卡顿现象,影响用户体验。因此,协处理器104先把显示屏102切换给主处理器103,待用户的一次触摸操作完成后,即用户的手指或用户手持的触摸设备从屏幕上抬起后,再将触摸屏101切换给主处理器103,进入步骤512。
步骤510:主处理器103的主虚拟触控驱动301b将读取的用户的触摸数据上报给主处理器应用层302b。
即在触摸屏101未完成切换之前,触摸屏101的处理权限保持在协处理器104,主处理器103通过主虚拟触控驱动301b从协处理器应用层302a读取到用户的触摸数据之后,再将读取的用户的触摸数据上报给主处理器应用层302b进行处理。
步骤511:主处理器应用层302b在收到用户的触摸数据后,响应用户的触摸操作。
例如,在一些实施例中,主处理器应用层302b在收到用户的触摸数据后,对触摸数据进行计算,确定出对应该触摸数据的用户的触摸事件,再根据确定出的事件进行响应。
例如,在一些实施例中,主处理器应用层302b确定出用户的操作类型为用户的手指用力在同一位置长按时,对应该操作的响应为:对智能手表100的信息记录中的信息进行转发、收藏、编辑、删除、多选或引用等。
步骤512:协处理器104将显示屏102切换给主处理器103之后,判断本次触摸操作是否结束。若结束,表明可以将触摸屏101切换给主处理器103,进入步骤513;否则,表明暂时还不能将触摸屏101切换给主处理器103,返回步骤509。
在一些实施例中,协处理器104可以通过判断用户的手指离开屏幕的时间和用户开始触摸屏幕的时间之间的时间差是否大于设定时间阈值,来判断一次触摸操作是否结束。例如,设定一次滑动操作的时间阈值为10毫秒,若用户的手指离开屏幕的时间和用户开始触摸屏幕的时间之间的时间差大于10毫秒,则协处理器104可以确定一次滑动操作结束;反之,则协处理器104确定一次滑动操作未结束。
步骤513:协处理器104确定本次触摸结束后,将触摸屏101切换给主处理器103。
例如,如图2所示,协处理器104控制触摸屏101的切换开关S1切换到主处理器103,使得触摸屏101的I2C接口和主处理器103的I2C接口连通,即实现将触摸屏101切换给主处理器103。
步骤514:在主处理器103成功控屏(即由主处理器103处理显示屏102和触摸屏101的相关信息)的情况下,即在显示屏102和触摸屏101的处理权限都已被成功切换给主处理器103的情况下,若用户再次触摸屏幕,将由主处理器103从触摸芯片110读取触摸芯片110生成的对应用户再次触摸的触摸数据。
若用户再次触摸屏幕,则智能手表100的触摸芯片110将中断消息上报给主处理器103的主触控驱动301c。其中,中断消息是指用触摸芯片110接收到用户再次触摸操作时产生的触发信号。如果存在中断消息,则表明用户开始了对触摸屏101的触摸操作,则进行步骤515;如果没有中断消息,则表明用户未对触摸屏101再次进行触摸操作,未触发触摸芯片110生成用户的再次触摸数据,结束流程。
步骤515:主处理器103的主触控驱动301c在接收到用户再次触摸的中断消息之后,从触摸芯片110中读取再次触摸数据。
例如,主触控驱动301c接收中断消息后,通过串行外设接口(Serial Peripheral Interface,SPI)从触摸芯片110中读取触摸数据。
步骤516:主处理器103的主触控驱动301c将读取的再次触摸数据上报给主处理器应用层302b进行处理。
步骤517:主处理器应用层302b在收到用户的再次触摸数据后,响应用户的再次触摸操作。
例如,主处理器应用层302b在接收到用户的再次触摸数据后,对该触摸数据进行计算,确定出对应该次触摸数据的用户的触摸事件,再根据确定出的事件进行响应。例如,主处理器应用层302b对用户点击屏幕的数据进行识别,确定出用户的操作为在用户的手指在智能手表100的屏幕上的不同位置连续长按,则,对应该操作的响应为:智能手表100拖动信息记录翻滚,以展示不同时间段的信息记录。
例如,当用户的手指在图4(b)所示的智能手表100的时间界面上滑动后,唤醒主处理器103,智能手表100进入如图4(c)所示的智能手表100的桌面,其中包括多个应用图标,例如导航图标、运动应用图标、计算器图标、天气图标、设置图标等。用户的手指在图4(b)所示的智能手表100的时间界面上滑动的过程中,智能手表100通过执行如图5所示的屏幕控制方法,将显示屏102和触摸屏101的处理权限均切换给主处理器103。之后,当用户再触摸屏幕时,由主处理器103处理触摸芯片110生成的用户的触摸数据,控制显示屏102逐帧显示。
以下将对在主处理器103控屏(即显示屏102和触摸屏101的处理权限在主处理器103)的情况下,用户通过点击智能手表100桌面上的设定应用的图标后,触发主处理器103将显示屏102和触摸屏101的处理权限切换给协处理器104的过程进行详细介绍。
其中,设定应用是指需要由协处理器104处理的例如计算器、定时器、闹钟、运动等使用频率较高、运算量较低的应用。由协处理器104代替主处理器103来处理前述使用频率高、时间长的应用,可以降低功耗,提升智能手表100的续航能力。
例如,在图7(a)所示的实施例中,智能手表100当前显示的界面为桌面,其中包括多个应用的图标,例如导航图标、运动应用的图标、计算器图标、天气图标、设置图标等。当用户点击运动应用的图标112后,智能手表100运行运动应用,进入如图7(b)所示的显示用户跑步的公里数以及配速的界面。而当前显示的用户跑步的公里数以及配速的界面是由协处理器104控制显示屏102显示出来的。由于运动应用是由协处理器104运行的,因此,当用户点击运动应用的图标112后,智能手表100的显示屏102和触摸屏101需要由主处理器103切换到协处理器104。
在一些实施例中,主处理器103和协处理器104分别执行的应用如表1所示。
Figure PCTCN2022077977-appb-000007
表1
需要说明的是,表1中列出的导航、电话、聊天、购物、车票、拍照、移动支付等由主处理器执行的应用,通常情况下为第三方应用;而诸如表1中列出的运动、计算器、时钟、天气、科学睡眠、智能心率以及血氧饱和度等应用通常为设备厂商自主研发的应用。可以理解的是,上述表1中示例性地列出的应用名称并不造成对本申请方案的具体限制。
以下将参考图1、图2、图6至8,以用户在智能手表100的桌面上点击运动应用的图标112后,触发主处理器103将显示屏102和触摸屏101的处理权限切换给协处理器104的过程进行详细介绍。
如图1(b)所示,在t3时刻,协处理器104将触摸屏101的处理权限切换给主处理器103之后,由主处理器103处理显示屏102和触摸屏101的相关信息。在t4时刻,当用户点击智能手表100桌面的运动应用的图标112之后,触发主处理器103向协处理器104发送屏幕切换的中断请求,请求协处理器104控制显示屏102和触摸屏101。协处理器104接收到主处理器103发来的中断请求之后,对该中断请求进行响应,控制显示屏102的切换开关S2连通协处理器104,当显示屏102完成切换后,再控制触摸屏101的切换开关S1连通协处理器104。至此,主处理器103将显示屏102和触摸屏101的处理权限切换给协处理器104,由协处理器104负责读取用户的触摸数据并进行处理,控制显示屏102逐帧显示。具体地,如图8所示,用户在智能手表100的桌面上点击运动应用的图标112后,主处理器103将显示屏102和触摸屏101的处理权限切换给协处理器104的过程包括以下步骤:
步骤801:智能手表100的运动应用发送灭屏指令给主处理器103。主处理器103在接收到灭屏指令后,同时执行步骤802和步骤803。
步骤802:主处理器103完成内部MIPI接口下电。即完成主处理器103与显示屏102的切换开关S2连接的硬件接口的下电,以断开主处理器103的MIPI接口和显示屏102的MIPI接口之间的连接。
步骤803:主处理器103向协处理器104发送中断请求,请求切换屏幕,即主处理器103向协处理器104请求将显示屏102和触摸屏101的处理权限由主处理器103切换到协处理器104。
步骤804:协处理器104在接收到切换屏幕的中断请求之后进行响应,将显示屏102的处理权限由主处理器103切换到协处理器104。
例如,如图6所示,主处理器103可以将主处理器103用于发送切换请求的GPIO接口电平拉低,协处理器104把控制显示屏102的切换开关S2切换的GPIO接口电平也拉低,则开关S2切换到协处理器104,即显示屏102通过开关S2连通协处理器104,由协处理器104开始处理显示屏102的相关信息。
步骤805:协处理器104待显示屏102完成切换后,将触摸屏101的处理权限由主处理器103切换到协处理器104。
例如,如图2所示,协处理器104控制触摸屏101的切换开关S1切换到协处理器104,使得触摸屏101的I2C接口和协处理器104的I2C接口连通。从而使得协处理器104连通触摸屏101,当用户触摸智能手表100的屏幕时,由协处理器104读取并处理由触摸芯片110生成的对应用户触摸操作的触摸数据。
如上所述,在介绍了图5所示的显示屏102和触摸屏101的处理权限由协处理器104切换到主处理器103的过程,以及图8所示的显示屏102和触摸屏101的处理权限由主处理器103切换到协处理器104的过程之后。以下将继续以智能手表100为例,介绍本申请提供的一种系统交互 的方法。例如,智能手表100通过执行如图9所示的系统交互的方法,实现不同场景下,智能手表100在主处理器103和协处理器104之间的切换。
可以理解的是,当智能手表100在主处理器103和协处理器104之间进行切换时,可以通过执行例如上述图5所示的屏幕控制方法,实现将显示屏102和触摸屏101从协处理器104切换到主处理器103。并且,智能手表100可以通过执行例如上述图8所示的屏幕控制方法,实现将显示屏102和触摸屏101从主处理器103切换到协处理器104。具体地,如图9所示,本申请提供的系统交互的方法包括以下步骤:
步骤901:当处于休眠状态的智能手表100在检测到用户的设定操作之后,唤醒协处理器104,由协处理器104处理与显示屏102和触摸屏101相关信息。
其中,用户的设定操作可以是用户抬起佩戴智能手表100的手腕、用户按下智能手表100的按键以及用户通过语音指令来指示智能手表100唤醒等的操作,本申请对此不作限定。
其中,显示屏102和触摸屏101的相关信息包括但不限于:一种或多种类型的数据,例如,对应于用户触摸操作的触摸数据;一种或多种类型的信令,例如,在图5所示的实施例中,协处理器应用层发送给主处理器的唤醒指令;一种或多种类型的消息,例如,在图5所示的实施例中,触摸芯片110向协触控驱动301a发送的中断消息;一种或多种类型的通知,一种或多种类型的请求,一种或多种类型的响应,一种或多种类型的信号等。
步骤902:协处理器104判断是否有用户的触摸操作。若是,则表明协处理器104检测到用户的触摸操作,则进入步骤903a;若否,则表明协处理器104未检测到用户的触摸操作,进入步骤903b。
例如,在一些实施例中,当触摸芯片110检测到用户的触摸操作之后,触摸芯片110会向协处理器104发送中断消息。其中,中断消息是指触摸芯片110接收到用户的触摸操作时产生的触发信号。如果存在中断消息,则表明用户开始了对触摸屏101的触摸操作,则进行步骤903a;如果没有中断消息,则表明用户未对触摸屏101进行触摸操作,进入步骤903b。
步骤903a:协处理器104在检测到用户的触摸操作之后,唤醒主处理器103,协处理器104将显示屏102和触摸屏101的处理权限切换到主处理器103。
例如,在一些实施例中,触摸芯片110生成触摸数据,并向协处理器104的协触控驱动301a发送中断消息。协触控驱动301a在接收到中断消息后,从触摸芯片110读取触摸数据,并将读取的触摸数据上报给协处理器应用层302a。协处理器应用层302a在接收到触摸数据之后,向主处理器103发送唤醒指令。主处理器103在接收到唤醒指令后完成内部上电,并向协处理器104发送中断请求,即主处理器103请求协处理器104将显示屏102和触摸屏101的处理权限切换给主处理器103。协处理器104在接收到主处理器103发来的中断请求之后,先将显示屏102切换给主处理器103。在显示屏102完成切换,而触摸屏101暂未切换的情况下,为了避免切换过程中出现屏幕卡顿的问题,协处理器应用层302a将触摸数据发送给主处理器103的主虚拟触控驱动301b。待用户的本次触摸结束后,协处理器104再将触摸屏101切换给主处理器103。详细过程请参阅上述关于图5所示的屏幕控制方法部分的文字描述,在此不再赘述。
步骤903b:智能手表100超时灭屏。当协处理器104被唤醒后,若在设定时间内未检测到用户的触摸操作,则协处理器104给显示屏102发送灭屏指令,控制显示屏102灭屏。例如,协处理器104被唤醒后,控制显示屏102亮起,在5秒以内未检测到用户的触摸操作,则协处理器104进行休眠状态,同时主处理器103继续保持休眠状态,显示屏102灭屏。
步骤904:主处理器103判断是否有针对设定应用的交互操作。若是,则表明主处理器103检测到用户针对设定应用的交互操作,进入步骤905;否则,表明主处理器103未检测到用户针对设定应用的交互操作。在一些实施例中,若主处理器103在设定时间内未检测到用户的任何操作,则超时灭屏。
其中,设定应用是指需要由协处理器104处理的例如计算器、定时器、闹钟、运动等使用频率较高、运算量较低的应用。由协处理器104代替主处理器103来处理前述使用频率高、时间长的应用,从而降低功耗,提升智能手表100的续航能力。
步骤905:主处理器103唤醒协处理器104,主处理器103将显示屏102和触摸屏101切换给协处理器104。
例如,用户点击智能手表100的运动应用,智能手表100的运动应用发送灭屏指令给主处理器103。主处理器103在接收到灭屏指令后,完成内部MIPI接口下电,即完成主处理器103与显示屏102的切换开关S2连接的硬件接口的下电,以断开主处理器103的MIPI接口和显示屏102的MIPI接口之间的连接。并且,主处理器103向协处理器104发送中断请求,请求切换屏幕,即请求切换显示屏102和触摸屏101。协处理器104在接收到切换屏幕的中断请求之后进行响应,切换显示屏102。待显示屏102完成切换后,再切换触摸屏101。详细过程请参阅上述关于图8所示的屏幕控制方法部分的文字描述,在此不再赘述。
本申请公开的各实施例可以被实现在硬件、软件、固件或这些实现方法的组合中。本申请的实施例可实现为在可编程系统上执行的计算机程序或程序代码,该可编程系统包括至少一个处理器、存储系统(包括易失性和非易失性存储器和/或存储元件)、至少一个输入设备以及至少一个输出设备。
可将程序代码应用于输入指令,以执行本申请描述的各功能并生成输出信息。可以按已知方式将输出信息应用于一个或多个输出设备。为了本申请的目的,处理系统包括具有诸如例如数字信号处理器(Digital Signal Processor,DSP)、微控制器、专用集成电路(Application Specific Integrated Circuit,ASIC)或微处理器之类的处理器的任何系统。
程序代码可以用高级程序化语言或面向对象的编程语言来实现,以便与处理系统通信。在需要时,也可用汇编语言或机器语言来实现程序代码。事实上,本申请中描述的机制不限于任何特定编程语言的范围。在任一情形下,该语言可以是编译语言或解释语言。
在一些情况下,所公开的实施例可以以硬件、固件、软件或其任何组合来实现。所公开的实施例还可以被实现为由一个或多个暂时或非暂时性机器可读(例如,计算机可读)存储介质承载或存储在其上的指令,其可以由一个或多个处理器读取和执行。例如,指令可以通过网络或通过其他计算机可读介质分发。因此,机器可读介质可以包括用于以机器(例如,计算机)可读的形式存储或传输信息的任何机制,包括但不限于,软盘、光盘、光碟、只读存储器(CD-ROMs)、磁光盘、只读存储器(Read Only Memory,ROM)、随机存取存储器(Random access memory,RAM)、可擦除可编程只读存储器(Erasable Programmable Read Only Memory,EPROM)、电可擦除可编程只读存储器(Electrically Erasable Programmable Read-Only Memory,EEPROM)、磁卡或光卡、闪存、或用于利用因特网以电、光、声或其他形式的传播信号来传输信息(例如,载波、红外信号数字信号等)的有形的机器可读存储器。因此,机器可读介质包括适合于以机器(例如计算机)可读的形式存储或传输电子指令或信息的任何类型的机器可读介质。
在附图中,可以以特定布置和/或顺序示出一些结构或方法特征。然而,应该理解,可能不需要这样的特定布置和/或排序。而是,在一些实施例中,这些特征可以以不同于说明性附图中所示的方式和/ 或顺序来布置。另外,在特定图中包括结构或方法特征并不意味着暗示在所有实施例中都需要这样的特征,并且在一些实施例中,可以不包括这些特征或者可以与其他特征组合。
需要说明的是,本申请各设备实施例中提到的各单元/模块都是逻辑单元/模块,在物理上,一个逻辑单元/模块可以是一个物理单元/模块,也可以是一个物理单元/模块的一部分,还可以以多个物理单元/模块的组合实现,这些逻辑单元/模块本身的物理实现方式并不是最重要的,这些逻辑单元/模块所实现的功能的组合才是解决本申请所提出的技术问题的关键。此外,为了突出本申请的创新部分,本申请上述各设备实施例并没有将与解决本申请所提出的技术问题关系不太密切的单元/模块引入,这并不表明上述设备实施例并不存在其它的单元/模块。
需要说明的是,在本专利的示例和说明书中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。而且,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
虽然通过参照本申请的某些优选实施例,已经对本申请进行了图示和描述,但本领域的普通技术人员应该明白,可以在形式上和细节上对其作各种改变,而不偏离本申请的精神和范围。

Claims (10)

  1. A screen control method for an electronic device, the electronic device comprising a screen, a first processor, and a second processor, wherein the screen comprises a display screen and a touch screen, the method comprising:
    processing, by the second processor of the electronic device, relevant information of the display screen and the touch screen;
    when the electronic device detects that a user starts a first touch operation, switching the processing authority of the display screen from the second processor to the first processor, and sending first touch data of the detected first touch operation to the first processor via the second processor; and
    when the electronic device detects that the first touch operation has ended, switching the processing authority of the touch screen from the second processor to the first processor.
  2. The method according to claim 1, wherein the electronic device comprises a virtual touch driver, the electronic device sends the first touch data of the first touch operation to the virtual touch driver via the second processor, and the first processor receives the first touch data via the virtual touch driver.
  3. The method according to claim 1 or 2, wherein switching the processing authority of the display screen from the second processor to the first processor when the electronic device detects that the user starts the first touch operation comprises:
    when the electronic device detects that the user starts the first touch operation, sending, by the electronic device, a wake-up instruction to the first processor through the second processor;
    after receiving the wake-up instruction, responding to the wake-up instruction by the first processor, and sending a screen-switching interrupt request to the second processor; and
    after receiving the screen-switching interrupt request, responding to the interrupt request by the second processor, and switching the processing authority of the display screen from the second processor to the first processor.
  4. The method according to any one of claims 1 to 3, further comprising:
    processing, by the first processor of the electronic device, relevant information of the display screen and the touch screen; and
    when the electronic device detects that the user performs an interactive operation directed at a setting application of the electronic device, switching the processing authority of the display screen and the touch screen from the first processor to the second processor, so that the second processor of the electronic device processes the relevant information of the display screen and the touch screen.
  5. The method according to claim 4, wherein switching the processing authority of the display screen and the touch screen from the first processor to the second processor when the electronic device detects that the user performs an interactive operation directed at the setting application of the electronic device, so that the second processor of the electronic device processes the relevant information of the display screen and the touch screen, comprises:
    when the electronic device detects that the user performs an interactive operation directed at the setting application of the electronic device, switching the processing authority of the display screen from the first processor to the second processor; and
    after the electronic device determines that the processing authority of the display screen has been switched from the first processor to the second processor, switching the processing authority of the touch screen from the first processor to the second processor.
  6. The method according to any one of claims 1 to 5, further comprising:
    processing, by the first processor of the electronic device, relevant information of the display screen and the touch screen; and
    when the electronic device detects that the user starts a second touch operation, sending second touch data of the detected second touch operation to the first processor.
  7. The method according to any one of claims 1 to 6, wherein the electronic device comprises a display screen switch electrically connected to the display screen, and
    when the electronic device detects that the user starts the first touch operation, the processing authority of the display screen is switched from the second processor to the first processor in the following manner:
    when the electronic device detects that the user starts the first touch operation, controlling the display screen switch to disconnect from the second processor, and controlling the display screen switch to connect to the first processor.
  8. The method according to any one of claims 1 to 6, wherein the electronic device comprises a touch screen switch electrically connected to the touch screen, and
    when the electronic device detects that the first touch operation has ended, the processing authority of the touch screen is switched from the second processor to the first processor in the following manner:
    when the electronic device detects that the first touch operation has ended, controlling the touch screen switch to disconnect from the second processor, and controlling the touch screen switch to connect to the first processor.
  9. A readable medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the screen control method according to any one of claims 1 to 8.
  10. An electronic device, comprising:
    a screen, the screen comprising a display screen and a touch screen;
    a memory configured to store instructions to be executed by one or more processors of the electronic device;
    a first processor, which is one of the processors of the electronic device; and
    a second processor, which is one of the processors of the electronic device and is configured to cooperate with the first processor to perform the screen control method according to any one of claims 1 to 8.
PCT/CN2022/077977 2021-03-03 2022-02-25 电子设备的屏幕控制方法、可读介质和电子设备 WO2022183985A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP22762450.9A EP4280032A1 (en) 2021-03-03 2022-02-25 Screen control method for electronic device, and readable medium and electronic device
US18/548,519 US20240168573A1 (en) 2021-03-03 2022-02-25 Electronic Device Screen Control Method, Readable Medium, and Electronic Device
JP2023552060A JP2024508830A (ja) 2021-03-03 2022-02-25 電子デバイス画面制御方法、読取可能媒体、及び電子デバイス

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110236124.XA CN115033122A (zh) 2021-03-03 2021-03-03 电子设备的屏幕控制方法、可读介质和电子设备
CN202110236124.X 2021-03-03

Publications (1)

Publication Number Publication Date
WO2022183985A1 true WO2022183985A1 (zh) 2022-09-09

Family

ID=83118378

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/077977 WO2022183985A1 (zh) 2021-03-03 2022-02-25 电子设备的屏幕控制方法、可读介质和电子设备

Country Status (5)

Country Link
US (1) US20240168573A1 (zh)
EP (1) EP4280032A1 (zh)
JP (1) JP2024508830A (zh)
CN (1) CN115033122A (zh)
WO (1) WO2022183985A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116431418B (zh) * 2023-06-13 2023-10-31 荣耀终端有限公司 屏幕静电检测方法、可读存储介质、电子设备

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140351617A1 (en) * 2013-05-27 2014-11-27 Motorola Mobility Llc Method and Electronic Device for Bringing a Primary Processor Out of Sleep Mode
CN108369445A (zh) * 2015-12-18 2018-08-03 高通股份有限公司 唤醒分割架构的级联触摸
CN110908496A (zh) * 2019-11-28 2020-03-24 出门问问信息科技有限公司 一种系统交互方法及可穿戴设备

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140351617A1 (en) * 2013-05-27 2014-11-27 Motorola Mobility Llc Method and Electronic Device for Bringing a Primary Processor Out of Sleep Mode
CN108369445A (zh) * 2015-12-18 2018-08-03 高通股份有限公司 唤醒分割架构的级联触摸
CN110908496A (zh) * 2019-11-28 2020-03-24 出门问问信息科技有限公司 一种系统交互方法及可穿戴设备

Also Published As

Publication number Publication date
US20240168573A1 (en) 2024-05-23
EP4280032A1 (en) 2023-11-22
CN115033122A (zh) 2022-09-09
JP2024508830A (ja) 2024-02-28

Similar Documents

Publication Publication Date Title
US10372326B2 (en) Mobile terminal device, operation method, program, and storage medium
US10466830B2 (en) Electronic device and method of controlling electronic device
CN110235086B (zh) 电子设备及电子设备的指纹识别方法
EP2926213B1 (en) Gesture detection management for an electronic device
US11449123B2 (en) Prompt information display method and electronic device
KR102429740B1 (ko) 터치 이벤트 처리 방법 및 그 장치
CN109313519A (zh) 包括力传感器的电子设备
CN109375890A (zh) 一种屏幕显示方法和多屏电子设备
WO2020155874A1 (zh) 触压信号处理方法及电子设备
WO2017202076A1 (zh) 一种终端管理方法以及装置
CN106547204A (zh) 一种智能手表及其亮显屏幕的方法
KR20180032970A (ko) 전자 장치 및 그 제어 방법
CN107949041A (zh) 一种终端显示方法、终端及计算机可读存储介质
US11354931B2 (en) Fingerprint sensing device and operation method thereof
US10289887B2 (en) Electronic apparatus, operating method of electronic apparatus, and recording medium
KR20170114515A (ko) 화면을 표시하는 전자 장치 및 그 제어 방법
WO2022183985A1 (zh) 电子设备的屏幕控制方法、可读介质和电子设备
CN109521940B (zh) 移动终端装置、操作方法和存储介质
TWI597659B (zh) 開啟顯示器前更新欲顯示內容的處理方法、模塊及其電子裝置
CN108845700A (zh) 显示屏触控方法、双面屏终端及计算机可读存储介质
CN110908732B (zh) 应用的任务删除方法及电子设备
CN117707320A (zh) 控制熄屏显示的方法、电子设备及存储介质
KR20180021581A (ko) 터치 패널 및 이를 포함하는 전자 장치
CN117707319A (zh) 控制熄屏显示的方法、电子设备及存储介质
KR20180097384A (ko) 전자 장치 및 전자 장치 제어 방법

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22762450

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022762450

Country of ref document: EP

Effective date: 20230814

WWE Wipo information: entry into national phase

Ref document number: 2023552060

Country of ref document: JP

WWE Wipo information: entry into national phase

Ref document number: 18548519

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE