WO2023030168A1 - Interface display method and electronic device


Publication number: WO2023030168A1
Authority: WIPO (PCT)
Prior art keywords: area, brightness, window, interface, display
Application number: PCT/CN2022/114916
Other languages: English (en), Chinese (zh)
Inventors: 杜奕全 (Du Yiquan), 周雨沛 (Zhou Yupei), 孙奎全 (Sun Kuiquan), 李凯 (Li Kai)
Original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023030168A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26: Power supply means, e.g. regulation thereof
    • G06F 1/32: Means for saving power
    • G06F 1/3203: Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3234: Power saving characterised by the action undertaken
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units

Definitions

  • the present application relates to the technical field of smart terminals, and in particular to an interface display method and an electronic device.
  • the present application provides an interface display method and an electronic device that guarantee the user's viewing experience while also saving power.
  • the embodiment of the present application provides an interface display device, including: a detection unit and an adjustment unit, wherein,
  • the detection unit is used to detect the gaze area of the user's eyeballs in the first interface;
  • the first interface is an interface displayed on the screen of the electronic device;
  • the adjustment unit is used to adjust the display brightness of the first interface to obtain the second interface; the brightness of the gaze area in the second interface is greater than the brightness of some or all of the non-gaze area, where the non-gaze area is the area of the interface outside the gaze area.
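The detect-then-adjust procedure above can be sketched as follows. This is a minimal illustration, not the application's implementation; the `Rect` layout, the fixed expansion margin, and the brightness values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """Axis-aligned screen rectangle: (x, y) is the top-left corner."""
    x: int
    y: int
    w: int
    h: int

def expand_focus_to_gaze(focus: Rect, screen: Rect, margin: int = 50) -> Rect:
    """Grow the detected focus area by a margin and clamp it to the screen,
    giving a gaze area that contains the focus area (single-window case)."""
    x = max(screen.x, focus.x - margin)
    y = max(screen.y, focus.y - margin)
    x2 = min(screen.x + screen.w, focus.x + focus.w + margin)
    y2 = min(screen.y + screen.h, focus.y + focus.h + margin)
    return Rect(x, y, x2 - x, y2 - y)

def brightness_plan(focus: Rect, screen: Rect,
                    normal: float = 1.0, dim: float = 0.3):
    """First interface -> second interface: keep the gaze area at normal
    brightness and dim everything outside it to save power."""
    gaze = expand_focus_to_gaze(focus, screen)
    return gaze, normal, dim
```

A display pipeline would re-run `brightness_plan` each detection cycle, so the bright region follows the user's eyes.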
  • the detection unit is configured to detect the gaze area of the user's eyeballs in the interface, including:
  • the detection unit is used to: determine the focus area corresponding to the gaze focus of the user's eyeballs in the interface; and determine the gaze area of the user's eyeballs in the interface according to the focus area.
  • the detection unit is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area, including:
  • the detection unit is used for: determining the gaze area of the user's eyes in the interface according to the focus area and the current window display type of the first interface, the window display type being single-window display or multi-window display.
  • the window display type is single-window display
  • the detection unit is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the detection unit is used for: determining the gaze area according to the focus area, and the gaze area includes the focus area.
  • the window display type is multi-window display, and the interface is divided into at least two window areas by windows; the detection unit is used to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the detection unit is used for: obtaining a first window area from the at least two window areas, where the first window area is the window area with the largest intersection area with the focus area; and determining the first window area as the gaze area.
  • the window display type is multi-window display, and the interface is divided into at least two window areas by windows; the detection unit is used to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the detection unit is used for: determining that the intersection areas between the focus area and each of the at least two window areas are equal; and maintaining the gaze area and non-gaze area determined in the previous cycle, or selecting a window area corresponding to a non-full-screen window from the at least two window areas as the gaze area.
  • the multi-window display includes: single floating window display, and/or multiple floating window display, and/or split-screen display, and/or parallel view display.
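The multi-window selection rule (largest intersection with the focus area, with a tie-break that keeps the previous cycle's choice) can be sketched like this; rectangles as `(x, y, w, h)` tuples and the fallback to the first window are illustrative assumptions.

```python
def intersection_area(a, b):
    """Overlap area of two rectangles given as (x, y, w, h) tuples."""
    dx = min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0])
    dy = min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1])
    return max(0, dx) * max(0, dy)

def pick_gaze_window(focus, windows, previous=None):
    """Return the index of the window area to treat as the gaze area:
    the one whose intersection with the focus area is largest; if several
    windows tie, keep the previous cycle's choice when one is known,
    otherwise fall back to the first candidate."""
    areas = [intersection_area(focus, w) for w in windows]
    best = max(areas)
    if areas.count(best) > 1 and previous is not None:
        return previous
    return areas.index(best)
```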
  • the adjustment unit is used to adjust the display brightness of the interface, including:
  • the adjusting unit is used for: obtaining the first target brightness, and adjusting the brightness of the gaze area to the first target brightness.
  • the adjustment unit is configured to adjust the brightness of the gaze area to the first target brightness, including:
  • the adjustment unit is used for: obtaining the power of the power supply; and,
  • if the power is not less than the first threshold, gradually changing the brightness of the gaze area to the first target brightness according to a first step; and/or,
  • if the power is less than the first threshold and not less than the second threshold, gradually changing the brightness of the gaze area to the first target brightness according to a second step, where the first threshold is greater than the second threshold and the first step is smaller than the second step; and/or,
  • if the power is less than the second threshold, directly adjusting the brightness of the gaze area to the first target brightness.
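The power-dependent step selection above can be sketched as follows. The threshold values (50% and 20% charge) and the step sizes are illustrative assumptions, not values from the application:

```python
def choose_step(battery, high=0.5, low=0.2,
                small_step=0.05, large_step=0.15):
    """Map remaining battery charge (0.0-1.0) to a brightness step:
    plenty of charge -> small step (smooth fade); less charge -> larger
    step (faster fade); below the lower threshold -> None, meaning
    jump straight to the target brightness."""
    if battery >= high:
        return small_step
    if battery >= low:
        return large_step
    return None

def fade(current, target, step):
    """Yield successive brightness values until the target is reached."""
    if step is None:            # low battery: adjust directly
        yield target
        return
    while abs(target - current) > step:
        current += step if target > current else -step
        yield round(current, 4)
    yield target
```

Because larger steps reach the target in fewer frames, the device spends less time at intermediate brightness levels when the battery is low.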
  • the adjustment unit is used to adjust the display brightness of the interface, including:
  • the adjustment unit is used to: obtain a brightness setting policy of the non-gaze area, and adjust the brightness of the non-gaze area according to the brightness setting policy.
  • the adjustment unit is configured to adjust the brightness of the non-gaze area according to the brightness setting policy, including:
  • the adjustment unit is used for: determining the second target brightness according to the brightness setting policy, and adjusting the brightness of the non-gaze area to the second target brightness.
  • the adjustment unit is configured to adjust the brightness of the non-gaze area to the target brightness, including:
  • the adjustment unit is used for: obtaining the power of the power supply; and,
  • if the power is not less than the third threshold, gradually changing the brightness of the non-gaze area to the target brightness according to a third step; and/or,
  • if the power is less than the third threshold and not less than the fourth threshold, gradually changing the brightness of the non-gaze area to the target brightness according to a fourth step, where the third threshold is greater than the fourth threshold and the third step is smaller than the fourth step; and/or,
  • if the power is less than the fourth threshold, directly adjusting the brightness of the non-gaze area to the target brightness.
  • the brightness setting policy includes:
  • dividing the non-gaze area into several sub-areas, where the brightness of the sub-areas decreases successively in order of increasing distance between each sub-area and the gaze area, and the maximum brightness of the sub-areas is less than or equal to the first target brightness; or,
  • in order of increasing minimum distance between each pixel of the non-gaze area and the boundary of the gaze area, the brightness of the pixels decreases successively, and the maximum brightness of the pixels in the non-gaze area is less than or equal to the first target brightness; or,
  • setting the brightness of the non-gaze area to the second target brightness, where the second target brightness is smaller than the first target brightness.
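The first policy variant (sub-area brightness falling off with distance from the gaze area, capped by the first target brightness) can be sketched like this; the linear falloff step and the brightness floor are assumptions for illustration.

```python
def subarea_brightness(distances, first_target,
                       floor=0.1, falloff=0.15):
    """Assign each non-gaze sub-area a brightness that decreases with its
    distance to the gaze area: sub-areas are ranked from nearest to
    farthest and each rank loses one `falloff` step, clamped at `floor`.
    Every value stays below the gaze area's first target brightness."""
    order = sorted(range(len(distances)), key=lambda i: distances[i])
    levels = [0.0] * len(distances)
    for rank, i in enumerate(order):
        levels[i] = max(floor, first_target - (rank + 1) * falloff)
    return levels
```

The per-pixel variant is the same idea with one "sub-area" per pixel, using each pixel's minimum distance to the gaze-area boundary.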
  • the adjustment unit is used to adjust the display brightness of the interface, including: adding a transparent mask layer to the interface, and setting the color and/or transparency of the transparent mask layer so that the brightness of the gaze area is greater than the brightness of some or all of the non-gaze area.
  • an embodiment of the present application provides an electronic device, including: a display and a processor; wherein,
  • the processor is used to: detect the gaze area of the user's eyeballs in the first interface, where the first interface is an interface displayed on the screen of the electronic device; and adjust the display brightness of the first interface to obtain the second interface, where the brightness of the gaze area in the second interface is greater than the brightness of some or all of the non-gaze area; the non-gaze area is the area of the interface outside the gaze area.
  • the processor is configured to detect the gaze area of the user's eyeballs in the interface, including:
  • the processor is configured to: determine the focus area corresponding to the gaze focus of the user's eyeballs in the interface; and determine the gaze area of the user's eyeballs in the interface according to the focus area.
  • the processor is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area, including:
  • the processor is configured to: determine the gaze area of the user's eyes in the interface according to the focus area and the current window display type of the first interface, and the window display type is single-window display or multi-window display.
  • the window display type is single-window display
  • the processor is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the processor is configured to: determine a gaze area according to the focus area, where the gaze area includes the focus area.
  • the window display type is multi-window display, and the interface is divided into at least two window areas by windows; the processor is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the processor is used for: obtaining a first window area from the at least two window areas, where the first window area is the window area with the largest intersection area with the focus area; and determining the first window area as the gaze area.
  • the window display type is multi-window display, and the interface is divided into at least two window areas by windows; the processor is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the processor is used for: determining that the intersection areas between the focus area and each of the at least two window areas are equal; and maintaining the gaze area and non-gaze area determined in the previous cycle, or selecting a window area corresponding to a non-full-screen window from the at least two window areas as the gaze area.
  • the multi-window display includes: single floating window display, and/or multiple floating window display, and/or split-screen display, and/or parallel view display.
  • the processor is configured to adjust the display brightness of the interface, including:
  • the processor is configured to: acquire the first target brightness, and adjust the brightness of the gaze area to the first target brightness.
  • the processor is configured to adjust the brightness of the gaze area to the first target brightness, including:
  • the processor is used for: obtaining the power of the power supply; and,
  • if the power is not less than the first threshold, gradually changing the brightness of the gaze area to the first target brightness according to a first step; or,
  • if the power is less than the first threshold and not less than the second threshold, gradually changing the brightness of the gaze area to the first target brightness according to a second step, where the first threshold is greater than the second threshold and the first step is smaller than the second step; or,
  • if the power is less than the second threshold, directly adjusting the brightness of the gaze area to the first target brightness.
  • the processor is configured to adjust the display brightness of the interface, including:
  • the processor is configured to: obtain a brightness setting policy of the non-gaze area, and adjust the brightness of the non-gaze area according to the brightness setting policy.
  • the processor is configured to adjust the brightness of the non-gaze area according to the brightness setting policy, including:
  • the processor is configured to: determine the second target brightness according to the brightness setting policy, and adjust the brightness of the non-gaze area to the second target brightness.
  • the processor is configured to adjust the brightness of the non-gaze area to the target brightness, including:
  • the processor is used for: obtaining the power of the power supply; and,
  • if the power is not less than the third threshold, gradually changing the brightness of the non-gaze area to the target brightness according to a third step; or,
  • if the power is less than the third threshold and not less than the fourth threshold, gradually changing the brightness of the non-gaze area to the target brightness according to a fourth step, where the third threshold is greater than the fourth threshold and the third step is smaller than the fourth step; or,
  • if the power is less than the fourth threshold, directly adjusting the brightness of the non-gaze area to the target brightness.
  • the brightness setting policy includes:
  • dividing the non-gaze area into several sub-areas, where the brightness of the sub-areas decreases successively in order of increasing distance between each sub-area and the gaze area, and the maximum brightness of the sub-areas is less than or equal to the first target brightness; or,
  • in order of increasing minimum distance between each pixel of the non-gaze area and the boundary of the gaze area, the brightness of the pixels decreases successively, and the maximum brightness of the pixels in the non-gaze area is less than or equal to the first target brightness; or,
  • setting the brightness of the non-gaze area to the second target brightness, where the second target brightness is smaller than the first target brightness.
  • the processor is configured to adjust the display brightness of the interface, including:
  • the processor is used for: adding a transparent mask layer to the interface, and setting the color and/or transparency of the transparent mask layer so that the brightness of the gaze area is greater than the brightness of some or all of the non-gaze area.
  • the embodiment of the present application provides an interface display method applied to an electronic device, including:
  • detecting the gaze area of the user's eyeballs in the first interface, where the first interface is an interface displayed on the screen of the electronic device; and,
  • adjusting the display brightness of the first interface to obtain the second interface, where the brightness of the gaze area in the second interface is greater than the brightness of some or all of the non-gaze area; the non-gaze area is the area outside the gaze area in the interface.
  • the detection of the gaze area of the user's eyeballs in the interface includes: determining the focus area corresponding to the gaze focus of the user's eyeballs in the interface; and determining the gaze area of the user's eyeballs in the interface according to the focus area.
  • determining the gaze area of the user's eyeballs in the interface according to the focus area includes: determining the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, where the window display type is single-window display or multi-window display.
  • the window display type is single-window display, and determining the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface includes: determining the gaze area according to the focus area, where the gaze area includes the focus area.
  • the window display type is multi-window display, and the interface is divided into at least two window areas by windows; determining the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface includes: obtaining a first window area from the at least two window areas, where the first window area is the window area with the largest intersection area with the focus area among the at least two window areas; and determining the first window area as the gaze area.
  • the window display type is multi-window display, and the interface is divided into at least two window areas by windows; determining the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface includes: determining that the intersection areas between the focus area and each of the at least two window areas are equal; and maintaining the gaze area and non-gaze area determined in the previous cycle, or selecting a window area corresponding to a non-full-screen window from the at least two window areas as the gaze area.
  • the multi-window display includes: single floating window display, and/or multiple floating window display, and/or split-screen display, and/or parallel view display.
  • adjusting the display brightness of the interface includes: acquiring a first target brightness, and adjusting the brightness of the gaze area to the first target brightness.
  • adjusting the brightness of the gaze area to the first target brightness includes: directly adjusting the brightness of the gaze area to the first target brightness; or, gradually changing the brightness of the gaze area to the first target brightness.
  • adjusting the brightness of the gaze area to the first target brightness includes: obtaining the power of the power supply; if the power is not less than the first threshold, gradually changing the brightness of the gaze area to the first target brightness according to a first step; or, if the power is less than the first threshold and not less than the second threshold, gradually changing the brightness of the gaze area to the first target brightness according to a second step, where the first threshold is greater than the second threshold and the first step is smaller than the second step; or, if the power is less than the second threshold, directly adjusting the brightness of the gaze area to the first target brightness.
  • adjusting the display brightness of the interface includes: obtaining a brightness setting policy of the non-gaze area, and adjusting the brightness of the non-gaze area according to the brightness setting policy.
  • adjusting the brightness of the non-gaze area according to the brightness setting policy includes: determining the second target brightness according to the brightness setting policy, and adjusting the brightness of the non-gaze area to the second target brightness.
  • adjusting the brightness of the non-gaze area to the target brightness includes: directly adjusting the brightness of the non-gaze area to the target brightness; or, gradually changing the brightness of the non-gaze area to the target brightness.
  • adjusting the brightness of the non-gaze area to the target brightness includes: obtaining the power of the power supply; if the power is not less than the third threshold, gradually changing the brightness of the non-gaze area to the target brightness according to a third step; or, if the power is less than the third threshold and not less than the fourth threshold, gradually changing the brightness of the non-gaze area to the target brightness according to a fourth step, where the third threshold is greater than the fourth threshold and the third step is smaller than the fourth step; or, if the power is less than the fourth threshold, directly adjusting the brightness of the non-gaze area to the target brightness.
  • the brightness setting policy includes: dividing the non-gaze area into several sub-areas, where the brightness of the sub-areas decreases successively in order of increasing distance between each sub-area and the gaze area, and the maximum brightness of the sub-areas is less than or equal to the first target brightness; or, in order of increasing minimum distance between each pixel of the non-gaze area and the boundary of the gaze area, the brightness of the pixels decreases successively, and the maximum brightness of the pixels in the non-gaze area is less than or equal to the first target brightness; or, setting the brightness of the non-gaze area to the second target brightness, where the second target brightness is smaller than the first target brightness.
  • adjusting the display brightness of the interface includes: adding a transparent mask layer to the interface; and setting the color and/or transparency of the transparent mask layer so that the brightness of the gaze area is greater than the brightness of some or all of the non-gaze area.
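The transparent-mask variant can be sketched as follows. Compositing a black layer at alpha a over content of brightness b leaves roughly b * (1 - a), so the alpha directly encodes the dimming ratio. The function names and the dict describing the mask are illustrative assumptions, not an actual windowing API:

```python
def mask_alpha(dim_ratio):
    """Alpha for a black mask over the non-gaze area: to dim content to
    dim_ratio of its original brightness, use alpha = 1 - dim_ratio.
    Returned as an 8-bit alpha value (0-255)."""
    a = max(0.0, min(1.0, 1.0 - dim_ratio))
    return round(a * 255)

def build_mask(screen_wh, gaze_rect, dim_ratio=0.3):
    """Describe a full-screen transparent mask layer: fully transparent
    over the gaze area ("hole"), semi-transparent black elsewhere."""
    return {"size": screen_wh,
            "color": (0, 0, 0),              # black mask color
            "alpha_outside": mask_alpha(dim_ratio),
            "hole": gaze_rect}               # region left at full brightness
```

Because only one overlay layer changes, this approach avoids re-rendering the underlying interface at a different brightness.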
  • an embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; when the computer program runs on a computer, the computer executes the method described in any one of the implementations of the third aspect.
  • the present application provides a computer program for executing the method described in the first aspect when the computer program is executed by a computer.
  • all or part of the program in the fifth aspect may be stored on a storage medium packaged with the processor, or part or all may be stored on a memory not packaged with the processor.
  • FIG. 1A is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 1B is a schematic diagram of the software structure of the electronic device of the embodiment of the present application.
  • FIG. 2A is a schematic diagram of a method for establishing a screen coordinate system in an embodiment of the present application
  • FIG. 2B is a schematic diagram of the interface in the single-window display scene of the embodiment of the present application.
  • Fig. 2C is a schematic diagram of the interface in the scenario of a single floating window in the embodiment of the present application.
  • FIG. 2D is a schematic diagram of the interface in the multi-floating window scenario of the embodiment of the present application.
  • FIG. 2E is a schematic diagram of the interface in the split-screen display scene of the embodiment of the present application.
  • FIG. 2F is a schematic diagram of the interface in the parallel view display scene of the embodiment of the present application.
  • FIG. 3 is a schematic diagram of a method for determining a gaze area in a single-window display scene according to an embodiment of the present application
  • FIG. 4A is a schematic diagram of a method for dividing a non-focused region into subregions according to an embodiment of the present application
  • FIG. 4B is a schematic diagram of the scene when the gaze area changes under the single-window display scene of the embodiment of the present application.
  • Fig. 5 is a flowchart of an embodiment of the interface display method of the present application.
  • FIG. 6A is a schematic diagram of overlapping gaze areas according to an embodiment of the present application.
  • FIG. 6B is a schematic diagram of the interface display effect in the single-window display scene of the embodiment of the present application.
  • Fig. 7 is a schematic diagram of the interface in the single floating window scenario of the embodiment of the present application.
  • Fig. 8A is a flowchart of another embodiment of the interface display method of the present application.
  • Fig. 8B is a schematic diagram of the interface display effect in the single floating window scene of the embodiment of the present application.
  • Fig. 9 is a schematic diagram of the interface in the multi-floating window scenario of the embodiment of the present application.
  • Fig. 10 is a schematic diagram of the interface display effect in the multi-floating window scene of the embodiment of the present application.
  • FIG. 11A is a schematic diagram of an interface in a split-screen display scenario according to an embodiment of the present application.
  • FIG. 11B is a schematic diagram of the interface display effect in the split-screen display scene of the embodiment of the present application.
  • Fig. 12A is a schematic diagram of the interface in the parallel view scene of the embodiment of the present application.
  • Fig. 12B is a schematic diagram of the interface display effect in the parallel view scene of the embodiment of the present application.
  • FIG. 13 is a schematic diagram of a software structure of an electronic device provided in an embodiment of the present application.
  • Fig. 14 is a flowchart of another embodiment of the interface display method of the present application.
  • Fig. 15 is a flowchart of another embodiment of the interface display method of the present application.
  • Fig. 16 is a flowchart of another embodiment of the interface display method of the present application.
  • Fig. 17 is a flowchart of another embodiment of the interface display method of the present application.
  • Fig. 18 is a flow chart of another embodiment of the interface display method of the present application.
  • FIG. 19 is a schematic structural diagram of an embodiment of an interface display device of the present application.
  • the present application proposes an interface display method and an electronic device that guarantee the user's viewing experience while achieving a better power-saving effect.
  • the gaze area of the user's eyeballs on the screen is obtained and displayed at normal brightness, thereby ensuring the user's viewing experience, while the brightness of the non-gaze area outside the gaze area is set lower than that normal brightness, thereby saving power.
  • the brightness of the non-gaze area can be adjusted to a relatively low level, even down to the lowest brightness of the screen, to achieve a better power-saving effect.
  • the method provided by the embodiment of the present application can be applied to electronic devices, such as mobile phones, PADs, PCs, TVs, large screens, vehicle-mounted devices, and so on.
  • FIG. 1A shows a schematic structural diagram of an electronic device 100 .
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural network processor (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of instruction fetching and instruction execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
  • processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
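  • The sampling, quantizing, and encoding steps that the PCM interface performs on an analog signal can be illustrated with a minimal sketch (the sample rate, duration, and 8-bit depth below are illustrative values, not taken from the embodiment):

```python
import math

def pcm_encode(signal, sample_rate, duration, bits=8):
    """Sample a continuous signal, quantize each sample to `bits` bits,
    and return the resulting PCM code words."""
    levels = 2 ** bits
    codes = []
    n_samples = int(sample_rate * duration)
    for n in range(n_samples):
        t = n / sample_rate
        x = signal(t)                       # sampling: read the analog value at time t
        x = max(-1.0, min(1.0, x))          # clip to the nominal [-1, 1] range
        q = int((x + 1.0) / 2.0 * (levels - 1) + 0.5)  # quantizing: map to 0..levels-1
        codes.append(q)                     # encoding: store the integer code word
    return codes

# Encode one period of a 1 kHz sine wave at 8 kHz, 8-bit depth.
codes = pcm_encode(lambda t: math.sin(2 * math.pi * 1000 * t), 8000, 0.001)
```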
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus, which converts the data to be transmitted between serial and parallel forms.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the shooting function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also provide power for electronic devices through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1 , the antenna 2 , the mobile communication module 150 , the wireless communication module 160 , a modem processor, a baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 realizes the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the shooting function through the ISP, the camera 193 , the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used for processing the data fed back by the camera 193 .
  • light is transmitted through the lens to the photosensitive element of the camera, which converts the light signal into an electrical signal and transmits it to the ISP for processing, where it is converted into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin color.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
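  • The conversion between the YUV and RGB image formats mentioned above is a fixed linear transform; a minimal single-pixel sketch follows (the full-range BT.601 coefficients are an assumption, since the embodiment does not name a colorimetry standard):

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV pixel to RGB (all channels 0..255)."""
    c, d, e = y, u - 128, v - 128
    r = c + 1.402 * e
    g = c - 0.344136 * d - 0.714136 * e
    b = c + 1.772 * d
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

# A neutral pixel (U = V = 128) keeps its luma value in every channel.
grey = yuv_to_rgb(128, 128, 128)
```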
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
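  • The per-frequency-point energy computation described above is, in effect, the evaluation of a single discrete Fourier transform bin. The Goertzel algorithm (a standard single-bin method, used here as an illustrative stand-in since the embodiment does not specify the transform) computes this without a full FFT:

```python
import math

def goertzel_power(samples, sample_rate, target_hz):
    """Return the squared magnitude of the DFT bin nearest target_hz."""
    n = len(samples)
    k = int(0.5 + n * target_hz / sample_rate)   # nearest integer bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:                            # second-order recurrence
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# A pure 440 Hz tone concentrates its energy at the 440 Hz frequency point.
fs = 8000
tone = [math.sin(2 * math.pi * 440 * i / fs) for i in range(800)]
```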
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be realized through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function, for example, saving files such as music and videos in the external memory card.
  • the internal memory 121 may be used to store computer-executable program codes including instructions.
  • the internal memory 121 may include an area for storing programs and an area for storing data.
  • the stored program area can store the operating system, application programs required by at least one function (such as a sound playing function, an image playing function), and the like.
  • the storage data area can store data created during the use of the electronic device 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • Speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • the receiver 170B can be placed close to the human ear to receive the voice.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can speak close to the microphone 170C to input a sound signal into it.
  • the electronic device 100 may be provided with at least one microphone 170C. In some other embodiments, the electronic device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be composed of at least two parallel plates made of conductive material.
  • the electronic device 100 determines the intensity of pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
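  • The two-threshold behavior described for the short-message icon can be sketched as a simple dispatch on measured touch intensity (the threshold value below is an illustrative placeholder; the embodiment names a first pressure threshold but does not fix its value):

```python
FIRST_PRESSURE_THRESHOLD = 0.5   # illustrative value in normalized force units

def on_message_icon_touch(intensity):
    """Map touch intensity on the short-message icon to an instruction."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view short messages"
    return "create new short message"   # intensity >= first threshold
```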
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
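  • The compensation distance derived from the shaking angle can be approximated with simple geometry: for a lens of focal length f, a shake of angle θ displaces the image by roughly f·tan θ, so the lens is driven by the opposite amount. A sketch (the focal length is illustrative; the embodiment gives no numbers):

```python
import math

def ois_compensation_mm(focal_length_mm, shake_deg):
    """Lens shift (mm) that counteracts an angular shake; image shift ≈ f·tanθ."""
    shift = focal_length_mm * math.tan(math.radians(shake_deg))
    return -shift   # move the lens opposite to the detected shake
```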
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
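  • A common way to turn the measured air pressure into an altitude is the international barometric formula; a sketch (the standard sea-level constants are an assumption, since the embodiment does not specify the model):

```python
def altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude (meters) from air pressure via the barometric formula."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```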
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
  • when the electronic device 100 is a flip phone (clamshell device), the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and features such as automatic unlocking upon opening the flip cover can be set accordingly.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
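  • Landscape/portrait detection from the three-axis gravity reading reduces to comparing the magnitude of gravity along the device's x and y axes; a minimal sketch (axis convention assumed: x across the short edge of the screen, y along the long edge):

```python
def screen_orientation(ax, ay):
    """Classify orientation from gravity components (m/s^2) on the x/y axes."""
    return "landscape" if abs(ax) > abs(ay) else "portrait"

# Device held upright: gravity mostly along y  -> portrait.
# Device on its side:  gravity mostly along x  -> landscape.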
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear to make a call, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
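  • The adaptive adjustment of display brightness to the perceived ambient light can be sketched as a clamped mapping from measured lux to a backlight level (the lux range and logarithmic curve are illustrative placeholders; the embodiment does not define them):

```python
import math

def auto_brightness(lux, min_level=10, max_level=255):
    """Map ambient lux to a backlight level on a logarithmic curve."""
    lux = max(1.0, min(10000.0, lux))   # clamp to the assumed sensor range
    frac = math.log10(lux) / 4.0        # 1 lux -> 0.0, 10000 lux -> 1.0
    return int(min_level + frac * (max_level - min_level))
```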
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the electronic device 100 may reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from being shut down abnormally due to the low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
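  • The temperature treatment strategy in the paragraphs above amounts to a three-way policy on the reported temperature; a sketch with illustrative thresholds (the embodiment names thresholds but gives no values):

```python
def thermal_policy(temp_c, high=45.0, low=0.0):
    """Pick an action from the reported temperature (thresholds illustrative)."""
    if temp_c > high:
        return "reduce processor performance"            # thermal protection
    if temp_c < low:
        return "heat battery / boost battery output voltage"
    return "normal operation"
```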
  • the touch sensor 180K is also called “touch device”.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Touch operations in different application scenarios (for example: time reminder, receiving information, alarm clock, games) may also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. Specifically, it can be Android system, Hongmeng system and so on.
  • the embodiment of the present application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 1B is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on.
  • the view system can be used to build applications.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
  • For example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, flashing the indicator light, etc.
  • the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the Android core library.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • the workflow of the software and hardware of the electronic device 100 will be exemplarily described below in conjunction with a photographing scenario.
  • when a touch operation is received, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes the touch operation into an original input event (including touch coordinates, a time stamp of the touch operation, and other information); original input events are stored at the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event; take as an example a touch operation that is a tap, where the control corresponding to the tap is the camera application icon.
  • the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer.
  • Camera 193 captures still images or video.
  • the interface referred to in the embodiment of the present application refers to a visual interface displayed on a screen to interact with a user. Multiple windows can be included in the interface.
  • the window referred to in the embodiment of the present application is a user interface area corresponding to an application program on the screen, and is a visual interface for interaction between the user and the application generating the window.
  • the application creates and displays a window; when the user operates controls in the window, the application reacts accordingly.
  • Each application can display one or more windows on the screen at the same time.
  • the complete display area of the screen of the electronic device is referred to as the screen display area.
  • a window whose display area is the entire screen display area is called a full-screen window; a window whose display area is smaller than the screen display area is called a non-full-screen window.
  • the above-mentioned window may have a boundary parameter, which is used to record the position of the boundary line of the window.
  • the vertex in the upper left corner of the screen can be used as the origin O, the horizontal screen edge (pointing right) is the x-axis, and the vertical screen edge (pointing down) is the y-axis.
  • in this way, each pixel on the screen has (x, y) coordinates, where x indicates the column in which the pixel is located and y indicates the row in which the pixel is located.
  • the boundary lines of a window can also be identified by coordinates in this coordinate system.
  • for example, the boundary parameter of window 1 shown in Fig. 2A can be (x1, y1, x2, y2), where (x1, y1) is the coordinate of the upper-left vertex A of window 1 and (x2, y2) is the coordinate of the lower-right vertex C of window 1.
  • the above boundary parameters can also be used to determine whether a window is a full-screen window or a non-full-screen window: for example, if the screen resolution is 1920*1080, a window whose boundary parameters are (0, 0, 1920, 1080) is a full-screen window; otherwise it is a non-full-screen window.
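The full-screen check described above can be sketched as follows (the function name and the 1920*1080 defaults are illustrative, following the example in this paragraph):

```python
# Determine whether a window is full-screen from its boundary parameters.
# The tuple (x1, y1, x2, y2) holds the upper-left vertex A and the
# lower-right vertex C in the screen coordinate system described above.
def is_full_screen(boundary, screen_w=1920, screen_h=1080):
    x1, y1, x2, y2 = boundary
    return (x1, y1, x2, y2) == (0, 0, screen_w, screen_h)

print(is_full_screen((0, 0, 1920, 1080)))    # full-screen window -> True
print(is_full_screen((100, 200, 900, 700)))  # non-full-screen window -> False
```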
  • the following description takes the case in which the electronic device is a PAD as an example.
  • the interface display method in the embodiment of the present application may be applicable to single-window display scenarios and multi-window display scenarios.
  • the single-window display scene refers to a scene in which only one window is displayed on the screen and this window is a full-screen window; as shown in Figure 2B, the interface is the desktop of the PAD, which includes only a full-screen window 201, and the desktop of the PAD is the picture displayed in the full-screen window.
  • the multi-window display scene means that the interface includes at least two windows, and each window can be a full-screen window or a non-full-screen window.
  • the multi-window display scenarios may specifically include: single floating window display, multiple floating window display, split-screen display, parallel horizon display, and the like.
  • a floating window is a movable window floating above another window, and the floating window is generally a non-full-screen window.
  • if there is one floating window above a full-screen window, it can be called a single-floating-window display scene; as shown in Figure 2C, the interface includes a full-screen window 201 and a floating window 202, where the desktop of the PAD is displayed in the full-screen window 201 and the interface of application 1 is displayed in the floating window 202. If there are 2 or more floating windows above a full-screen window, it can be called a multi-floating-window display scene; as shown in Figure 2D, for example, the interface includes a full-screen window 201, a first floating window 203 and a second floating window 204, where the desktop of the PAD is displayed in the full-screen window 201, the interface of application 1 is displayed in the first floating window 203, and the interface of application 2 is displayed in the second floating window 204.
  • the size of the floating window can generally be adjusted manually; when multiple floating windows are displayed, the sizes of different floating windows can be the same or different.
  • split-screen display refers to displaying windows of multiple applications on the screen, where the windows do not overlap each other and together occupy the entire display area of the screen.
  • for example, taking a window displaying the interface of application 1 and a window displaying the interface of application 2 as an example, the interface includes a first split-screen window 205 and a second split-screen window 206; the first split-screen window 205 displays the interface of application 1, and the second split-screen window 206 displays the interface of application 2.
  • parallel horizon display means that multiple windows of the same application are displayed on the screen; the windows do not overlap each other, and together they occupy the entire display area of the screen. For example, as shown in Figure 2F, two parallel horizon windows displaying two interfaces of application 1 are taken as an example, where the interface includes a first parallel horizon window 207 and a second parallel horizon window 208; the first parallel horizon window 207 displays interface 1 of application 1, and the second parallel horizon window 208 displays interface 2 of application 1.
  • there is a common boundary line 22 between two adjacent windows in the parallel horizon display scene, and the position of the boundary line can be adjusted manually, thereby changing the display areas of the two adjacent windows.
  • the embodiment of the present application provides an interface display method in a single-window display scenario. As shown in FIG. 2B , in this scenario, the interface displayed on the screen includes only one full-screen window 201 .
  • conventionally, the display brightness of the interface is determined according to the brightness parameter of the screen; therefore, the entire interface has a uniform display brightness.
  • in the embodiment of the present application, the electronic device can capture a video image of the user through its front camera and detect the gaze focus of the user's eyes according to the video image. If the gaze focus is on the screen, the electronic device determines the area on the screen corresponding to the gaze focus (hereinafter referred to as the focus area), determines the user's gaze area and non-gaze area according to the focus area, and sets different display brightness for the gaze area and the non-gaze area.
  • when determining the gaze area of the user according to the focus area, the gaze area includes the focus area; optionally, the gaze area is larger than the focus area.
  • the size of the gaze area can be preset, and the gaze area is determined according to the focus area and the preset size.
  • the center point of the focus area can be used as the center point of the gaze area, and a rectangular area with a preset length and width can be determined as the gaze area.
  • for example, the focus area 300 is a rectangle, the gaze area 301 is also a rectangle, and the center points of both rectangles are point O1.
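The construction of the gaze area from the focus area described above can be sketched as follows (the function name and the clamping to the screen edges are illustrative assumptions; the embodiment only specifies a preset-size rectangle sharing the focus area's center point):

```python
# Build the gaze rectangle from the focus rectangle: keep the focus
# area's center point O1, expand to the preset width/height, and clamp
# to the screen so the gaze area stays within the display.
def gaze_area_from_focus(focus, gaze_w, gaze_h, screen_w, screen_h):
    fx1, fy1, fx2, fy2 = focus
    cx, cy = (fx1 + fx2) / 2, (fy1 + fy2) / 2  # center point O1
    x1 = max(0, cx - gaze_w / 2)
    y1 = max(0, cy - gaze_h / 2)
    x2 = min(screen_w, cx + gaze_w / 2)
    y2 = min(screen_h, cy + gaze_h / 2)
    return (x1, y1, x2, y2)

# A 120x80 focus area centered at (960, 540) grows into a 600x400 gaze area:
print(gaze_area_from_focus((900, 500, 1020, 580), 600, 400, 1920, 1080))
```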
  • the interface is divided into two parts: a gaze area and a non-gazing area.
  • the interface 30 includes: a gaze area 301 and a non-gazing area 302 .
  • the display brightness of the gaze area 301 may be higher than the display brightness of some or all of the non-gaze area 302.
  • the display brightness of the gaze area 301 may be determined according to the brightness parameters of the screen, that is, displayed at normal brightness.
  • the same display brightness can be set for the whole non-gaze area 302, or different display brightness can be set for each sub-area or even each pixel, as long as the set display brightness is lower than that of the gaze area 301.
  • the following examples illustrate possible ways of setting the display brightness of the non-gaze area 302:
  • in a first possible implementation, the whole non-gaze area 302 can be displayed with the same brightness, which is lower than the brightness of the gaze area 301 and can be as low as the lowest display brightness of a pixel.
  • in a second possible implementation, the non-gaze area 302 may be divided into sub-areas, where the brightness of each sub-area is lower than that of the gaze area 301 and at least two sub-areas have different brightness.
  • for example, if the brightness of the gaze area 301 is a1, the brightness of sub-area 1 may be a2, the brightness of sub-area 2 may be a3, and the brightness of sub-area 3 may be a4, where a1 > a2 > a3 > a4; a4 may be the minimum display brightness of a pixel.
  • in a third possible implementation, the display brightness of the pixels in the non-gaze area 302 is reduced gradually, from near to far, according to the distance between each pixel and the closest boundary line of the gaze area 301.
  • this implementation can be regarded as reducing the granularity of the sub-area division in the second possible implementation from a preset value (multiple pixels) to 1 pixel.
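The third possible implementation can be sketched as follows (a linear falloff over an assumed `falloff_px` distance is used for illustration; the embodiment only requires brightness to decrease with distance from the gaze-area boundary):

```python
# Per-pixel dimming: a pixel's brightness falls off linearly with its
# distance to the nearest edge of the gaze rectangle, from the gaze
# brightness down to a minimum floor (e.g. the lowest pixel brightness).
def pixel_brightness(px, py, gaze, gaze_lum, min_lum, falloff_px):
    x1, y1, x2, y2 = gaze
    # Distance from the pixel to the gaze rectangle (0 if inside it).
    dx = max(x1 - px, 0, px - x2)
    dy = max(y1 - py, 0, py - y2)
    dist = (dx * dx + dy * dy) ** 0.5
    if dist == 0:
        return gaze_lum  # inside the gaze area: normal brightness
    frac = min(dist / falloff_px, 1.0)
    return gaze_lum - (gaze_lum - min_lum) * frac

print(pixel_brightness(200, 200, (100, 100, 300, 300), 100, 0, 200))  # 100
print(pixel_brightness(400, 200, (100, 100, 300, 300), 100, 0, 200))  # 50.0
print(pixel_brightness(700, 200, (100, 100, 300, 300), 100, 0, 200))  # 0.0
```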
  • optionally, different brightness setting methods may be used for the non-gaze area based on the remaining battery power of the electronic device. For example, when the battery power is higher than a first value (for example, 20%), the brightness of the non-gaze area 302 is set according to the second possible implementation, that is, darkened in sequence according to the distance from the gaze area, from near to far; for example, the non-gaze area 302 is divided into 3 sub-areas whose brightness is respectively 75%, 50%, and 0% of the gaze-area brightness. When the power is not higher than the first value and higher than a second value (for example, 10%), the brightness of the non-gaze area 302 is set according to the third possible implementation, with the brightness of the pixels in the non-gaze area gradually darkening according to the distance from the boundary line of the gaze area. When the power is not higher than the second value (for example, 10%), the brightness of the non-gaze area 302 is set directly to 0.
  • when the user's gaze focus moves, the gaze area in the interface changes accordingly.
  • the display brightness of the areas covered during the gaze area change changes accordingly; this change may be an increase or a decrease in brightness.
  • for example, the gaze area 301 in the interface 30 moves from position 1 shown by the dotted line to position 2 shown by the solid line.
  • if the preset display brightness of the gaze area is 100 and the display brightness of the non-gaze area is 0, then the display brightness of area 401 changes from 0 to 100, and the display brightness of area 402 changes from 100 to 0.
  • when realizing the above change in display brightness, the original brightness (before the change) may be changed directly to the target brightness (after the change), or may be changed gradually to the target brightness. for example:
  • when the display brightness of area 402 changes from 100 to 0, it can change directly from 100 to 0, or change gradually from 100 to 0 according to a preset gradient step size.
  • the gradient step size is not limited in the embodiment of this application; for example, with a gradient step of 25, the display brightness of area 402 changes gradually to 0 as 100, 75, 50, 25, 0, and with a gradient step of 50, it changes as 100, 50, 0.
  • the brightness adjustment that changes directly from 100 to 0 can also be regarded as a gradual-change method with a gradient step size of 100.
  • optionally, different gradient step sizes may be used according to the battery power: when the power is higher than 20%, the display brightness of area 402 changes gradually to 0 with a gradient step of 25 (100, 75, 50, 25, 0), dimming slowly; when the power is not higher than 20% but higher than 10%, it changes with a gradient step of 50 (100, 50, 0), dimming quickly; when the power is not higher than 10%, the display brightness of area 402 changes directly from 100 to 0, dimming immediately.
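The gradual-change sequences above (step 25 giving 100, 75, 50, 25, 0; step 50 giving 100, 50, 0) can be generated as follows (the function name is illustrative; a direct change is simply the largest step):

```python
# Generate the brightness sequence for a gradual change with a given
# step size; the last step is shortened so the sequence ends exactly
# at the target. step must be positive.
def fade_sequence(current, target, step):
    seq = [current]
    direction = 1 if target >= current else -1
    level = current
    while level != target:
        level += direction * min(step, abs(target - level))
        seq.append(level)
    return seq

print(fade_sequence(100, 0, 25))   # [100, 75, 50, 25, 0]
print(fade_sequence(100, 0, 50))   # [100, 50, 0]
print(fade_sequence(100, 0, 100))  # direct change: [100, 0]
```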
  • this embodiment of the present application provides an interface display method, as shown in FIG. 5 , the method may include:
  • Step 501 Obtain a video image of the user, detect that the gaze focus of the user's eyes is on the screen according to the video image, and calculate the corresponding focal area of the gaze focus on the screen.
  • a camera may be provided directly above the screen of the electronic device, and the camera may have the function of detecting events of the user's eyeballs gazing at the screen. If the camera detects such an event, the event may be reported to the camera driver; the camera driver acquires the eyeball image of the user, calculates the focus area corresponding to the gaze focus on the screen according to the eyeball image, and sends the focus area to the processor.
  • the focus area may be an area with a preset size, and the area may be a rectangle or a circle.
  • Step 502 Determine that the current window display type is single-window display.
  • Step 503 Determine the gaze area of the user according to the focus area, and determine whether the gaze area of the user changes, if yes, execute step 505, and if not, execute step 504.
  • the gaze area includes a focus area, and for a specific determination method, reference may be made to the foregoing related descriptions, which will not be repeated here.
  • acquiring the focus area of the user's gaze on the screen and then determining the gaze area is generally performed periodically.
  • the gaze area determined this time can be compared with the previously determined gaze area to determine whether the gaze area has changed.
  • Step 504 Keep the display brightness of the gaze area and the non-gaze area unchanged; the flow of this branch ends.
  • Step 505 Obtain the current brightness and target brightness of the changed gaze area, and adjust the brightness of the changed gaze area from the current brightness to the target brightness; obtain the current brightness and target brightness of the changed non-gaze area, and adjust the brightness of the changed non-gaze area from the current brightness to the target brightness.
  • when adjusting the brightness of the changed gaze area from the current brightness to the target brightness, the brightness can be adjusted directly from the current brightness to the target brightness, or changed gradually to the target brightness according to a certain step size.
  • the larger the step size, the faster the gradual change.
  • the implementation of directly adjusting the brightness of the gaze area from the current brightness to the target brightness can also be considered as the largest step size, that is, the step size equals the difference between the target brightness and the current brightness; in this case, the gradual change is fastest.
  • different gradient speeds may be used to adjust the brightness of the changed gaze area from the current brightness to the target brightness according to the power of the power supply.
  • for example, three brightness adjustment methods with different gradual-change speeds are preset; adjusting the brightness of the gaze area from the current brightness to the target brightness may then include:
  • when the battery power is not less than a first threshold, the brightness of the changed gaze area is changed gradually from the current brightness to the target brightness according to a first step size;
  • when the power is not less than a second threshold and less than the first threshold, the brightness of the changed gaze area is changed gradually from the current brightness to the target brightness according to a second step size;
  • when the power is less than the second threshold, the brightness of the changed gaze area is adjusted directly to the target brightness.
  • the first threshold is greater than the second threshold, and the first step size is smaller than the second step size.
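The threshold-based selection of the step size can be sketched as follows (the 20%/10% thresholds and the 25/50 step sizes are the illustrative values used earlier in this description; the embodiment leaves the concrete values open):

```python
# Pick the gradual-change step size from the battery level: a small step
# (slow fade) at high power, a larger step at medium power, and a direct
# change (step = full brightness difference) at low power.
def pick_step(power, current, target, first_thr=20, second_thr=10,
              first_step=25, second_step=50):
    if power >= first_thr:
        return first_step      # slow fade
    if power >= second_thr:
        return second_step     # fast fade
    return abs(target - current)  # direct change, the largest step

print(pick_step(80, 100, 0))  # 25
print(pick_step(15, 100, 0))  # 50
print(pick_step(5, 100, 0))   # 100
```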
  • there may be an overlapping area 603 between the gaze area 601 before the change and the gaze area 602 after the change.
  • the display brightness of the overlapping area 603 is already the target brightness, while the display brightness of the areas in the gaze area 602 outside the overlapping area 603 is less than the target brightness; therefore, only the areas outside the overlapping area 603 need to be adjusted to the target brightness according to the brightness adjustment method described above.
  • the brightness adjustment of the attention area and the non-attention area can be regarded as two independent processing processes, the adjustment methods of the two can be the same or different, and the gradual change speed of the brightness can be the same or different, which is not limited in the embodiment of the present application.
  • if the non-gaze area is divided into sub-areas and different sub-areas have different brightness, then after the gaze area changes, the non-gaze area can be re-divided into sub-areas and the brightness adjusted per sub-area; for the specific implementation, refer to the adjustment method described above for a non-gaze area with a single brightness.
  • at present, many electronic devices use OLED screens, in which each pixel emits light independently, and the lower a pixel's RGB value, the lower its display brightness.
  • this feature can be used to overlay a transparent mask layer on the interface displayed on the screen, and adjust the brightness of the area in the screen display interface by adjusting the transparency and/or color value of the transparent mask layer.
  • in one implementation, the color of the pixels in the transparent mask layer can be black, expressed as (0, 0, 0) in RGB, and each pixel in the transparent mask layer can be set to a different transparency.
  • then, for any pixel a, the actual display color of pixel a is the color obtained by fusing the color of pixel a in the interface with the color of pixel a in the transparent mask layer according to the transparency of the black mask layer.
  • in this way, the displayed RGB value of a pixel can be reduced by the transparent mask layer, thereby reducing the brightness of the pixel, and the degree of brightness adjustment can be controlled by setting the transparency of the pixel in the transparent mask layer.
  • in another implementation, the transparency of the transparent mask layer can be set to a fixed value, and the brightness of the interface can be adjusted by adjusting the RGB values of the pixels in the transparent mask layer.
  • since the transparency is a fixed value and the color P of pixel a in the interface remains unchanged, when the color Q of pixel a in the mask layer changes, the actual display color X of pixel a changes as well. Based on this principle, to adjust the brightness of a certain pixel or area in the interface, it is only necessary to set the color value (RGB) of the corresponding pixel or area in the transparent mask layer.
  • in yet another implementation, the transparency and the color values (RGB) of the pixels in the transparent mask layer can be adjusted at the same time to adjust the brightness of the corresponding pixels in the interface; to adjust the brightness of a certain pixel or area in the interface, it is only necessary to set both the transparency and the color value (RGB) of the corresponding pixel or area in the transparent mask layer.
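The color fusion described above can be sketched as follows; standard alpha compositing is assumed here, since the embodiment does not spell out the exact fusion formula:

```python
# Fuse an interface pixel color P with a mask pixel color Q, given the
# mask's opacity (alpha in [0, 1]). With a black mask, the result is
# simply the interface color scaled by (1 - alpha), i.e. dimmed.
def fuse(p_rgb, q_rgb, alpha):
    return tuple(round((1 - alpha) * p + alpha * q)
                 for p, q in zip(p_rgb, q_rgb))

# A black mask (0, 0, 0) at 40% opacity dims a pixel to 60% of its RGB:
print(fuse((200, 150, 100), (0, 0, 0), 0.4))  # (120, 90, 60)
# Alpha 0 leaves the interface pixel unchanged:
print(fuse((200, 150, 100), (0, 0, 0), 0.0))  # (200, 150, 100)
```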
  • the embodiment of the present application provides an interface display method in a single floating window scenario.
  • the single floating window scenario is shown in FIG. 2C , including a full-screen window 201 and a floating window 202 .
  • the interface 700 is divided into a main window area 701 and a floating window area 702 .
  • the floating window area 702 refers to the area corresponding to the floating window 202 in the interface
  • the main window area 701 refers to the area in the interface except the floating window area 702 .
  • conventionally, the display brightness of the screen is determined according to the brightness parameter of the screen; therefore, the display brightness of the main window area 701 and the floating window area 702 is the same.
  • the display brightness of the main window area 701 and the floating window area 702 is adjusted according to whether the focus area is located in the main window area 701 or in the floating window area 702 .
  • if the focus area is located in the main window area 701, it means that the user is paying attention to the content displayed in the main window area 701.
  • in this case, the main window area 701 is the user's gaze area and the floating window area 702 is the user's non-gaze area.
  • this embodiment of the application provides the following possible display brightness settings for the gaze area and the non-gaze area:
  • in a first setting, the display brightness of the main window area 701 and the floating window area 702 can be the same, and the specific display brightness can be determined according to the brightness parameters of the screen; or,
  • in a second setting, the display brightness of the main window area 701 can be higher than that of the floating window area 702; optionally, the display brightness of the main window area 701 can be determined according to the brightness parameter of the screen, and the display brightness of the floating window area 702 may be partially or entirely lower than the display brightness of the main window area 701; or,
  • in a third setting, considering that the user opened the floating window above the full-screen window to display an application interface, indicating that the user is relatively more concerned about the content displayed in the floating window, the display brightness of the main window area 701 can be lower than that of the floating window area 702; optionally, the display brightness of the floating window area 702 can be determined according to the brightness parameter of the screen, and the display brightness of the main window area 701 may be partially or entirely lower than the display brightness of the floating window area 702.
  • otherwise, if the focus area is located in the floating window area 702, the floating window area 702 is the user's gaze area and the main window area 701 is the user's non-gaze area. This embodiment of the application provides the following possible display brightness settings for the gaze area and the non-gaze area:
  • the display brightness of the main window area 701 can be lower than that of the floating window area 702; optionally, the display brightness of the floating window area 702 can be determined according to the brightness parameter of the screen, and the display brightness of the main window area 701 can be lower than that of the floating window Display brightness of area 702 .
  • when the display brightness of the main window area 701 is lower than that of the floating window area 702, the display brightness of the pixels in the main window area 701 may be the same or different.
  • the following examples illustrate possible ways of setting the display brightness of the main window area 701:
  • in a first possible implementation, the whole main window area 701 can be displayed with the same display brightness, which is lower than that of the floating window area 702 and can be as low as the lowest display brightness of a pixel on the screen.
  • in a second possible implementation, the main window area 701 may be divided into sub-areas, where the display brightness of each sub-area is lower than that of the floating window area 702 and at least two sub-areas have different display brightness.
  • in a third possible implementation, the display brightness of the pixels in the main window area 701 is reduced gradually, from near to far, according to the distance between each pixel and the closest boundary line of the floating window area 702.
  • different brightness setting methods may be used to set the display brightness for the non-gazing area based on the difference in the remaining power of the electronic device.
  • when the gaze area changes, the display brightness of the main window area 701 and/or the floating window area 702 may change accordingly, for example increase or decrease.
  • when the display brightness of the main window area 701 and/or the floating window area 702 changes, the original brightness (before the change) can be changed directly to the target brightness (after the change), or changed gradually from the original brightness to the target brightness.
  • for example, the focus area moves from the main window area 701 to the floating window area 702.
  • before the move, the display brightness of both the main window area 701 and the floating window area 702 is 100; after the focus area moves to the floating window area 702, the brightness of the main window area 701 should be 0 and the brightness of the floating window area 702 should remain 100.
  • therefore, the display brightness of the main window area 701 needs to change from 100 to 0; it can change directly from 100 to 0, or change gradually to 0.
  • the gradient step size is not limited in the embodiment of this application.
  • for example, with a gradient step of 25, the brightness of the main window area 701 changes gradually to 0 as 100, 75, 50, 25, 0; with a gradient step of 50, it changes as 100, 50, 0.
  • in this scenario, the interface display method provided by the embodiment of the present application is shown in Figure 8A; for example, the method may include:
  • Step 801 Obtain a video image of the user, detect that the gaze focus of the user's eyes is located on the screen according to the video image, and calculate the corresponding focal area of the gaze focus on the screen.
  • for the implementation of this step, reference may be made to the corresponding description in step 501; details are not repeated here.
  • Step 802 Determine that the current window display type is single floating window display.
  • Step 803 Determine whether the gaze area of the user changes according to the focus area, if yes, perform step 805, if not, perform step 804.
  • in this scenario, the gaze area is whichever of the main window area and the floating window area contains the focus area.
  • determining whether the user's gaze area has changed amounts to determining whether the focus area acquired this time is in the same area (the main window area or the floating window area) as the focus area acquired in the previous cycle.
  • for example, a pixel point can be selected from the focus area, such as the center point of the focus area, and the user's gaze area can be determined according to the area in which that pixel is located: if the pixel is located in the main window area, the gaze area is the main window area; if the pixel is located in the floating window area, the gaze area is the floating window area.
  • for another example, the proportion of the focus area that intersects the main window area can be calculated: if the proportion exceeds 50%, the gaze area is the main window area; otherwise the gaze area is the floating window area. Similarly, the proportion of the focus area that intersects the floating window area can be calculated: if the proportion exceeds 50%, the gaze area is the floating window area; otherwise the gaze area is the main window area.
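The proportion-based determination can be sketched as follows (rectangles are given as (x1, y1, x2, y2) boundary parameters as defined earlier; the function names are illustrative, and the overlap is computed against the rectangular floating window since the main window area is simply the rest of the interface):

```python
# Area of the intersection of two axis-aligned rectangles (0 if disjoint).
def rect_overlap(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    w = max(0, min(ax2, bx2) - max(ax1, bx1))
    h = max(0, min(ay2, by2) - max(ay1, by1))
    return w * h

# If more than 50% of the focus area falls inside the floating window,
# the gaze area is the floating window area; otherwise it is the main
# window area.
def gaze_region(focus, floating):
    focus_area = (focus[2] - focus[0]) * (focus[3] - focus[1])
    ratio = rect_overlap(focus, floating) / focus_area
    return "floating" if ratio > 0.5 else "main"

print(gaze_region((850, 450, 950, 550), (800, 400, 1200, 700)))  # floating
print(gaze_region((100, 100, 200, 200), (800, 400, 1200, 700)))  # main
```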
  • if the user's gaze area cannot be determined in this way, a processing method can be preset in the electronic device. For example: because the area the user is paying attention to cannot be determined, the gaze area and the non-gaze area remain unchanged, that is, the display brightness of the main window area and the floating window area remains unchanged; or, since users generally pay more attention to the floating window, it can be determined that the gaze area is the floating window area; and so on.
  • Step 804 Keep the display brightness of the main window area and the floating window area unchanged, and this branch process ends.
  • Step 805 Obtain the current brightness and target brightness of the floating window area, and adjust the display brightness of the floating window area from the current brightness to the target brightness; obtain the current brightness and target brightness of the main window area, and adjust the display brightness of the main window area from the current brightness to the target brightness. Adjust to target brightness.
  • when the gaze area changes, the floating window area and the main window area both switch roles between gaze area and non-gaze area.
  • the change of the gaze area may bring about changes in the display brightness of the floating window area and/or the main window area; the electronic device can determine the target brightness of the two areas according to whether the gaze area is the floating window area or the main window area.
  • if the target brightness of an area equals its current brightness, the brightness adjustment step for that area may be skipped.
  • when adjusting the display brightness of the floating window area from the current brightness to the target brightness, the display brightness can be adjusted directly from the current brightness to the target brightness, or changed gradually to the target brightness according to a certain step size.
  • the larger the step size, the faster the gradual change.
  • the implementation of directly adjusting the display brightness of the floating window area from the current brightness to the target brightness can also be considered as the largest step size, that is, the step size equals the difference between the target brightness and the current brightness; in this case, the gradual change is fastest.
  • optionally, the display brightness of the floating window area can be adjusted from the current brightness to the target brightness at different fade speeds according to the remaining battery level.
  • for example, if three brightness adjustment methods with different fade speeds are preset, adjusting the display brightness of the floating window area from the current brightness to the target brightness may include:
  • if the battery level is not less than the first threshold, fading the display brightness of the floating window area from the current brightness to the target brightness according to the first step size;
  • if the battery level is not less than the second threshold and less than the first threshold, fading the display brightness of the floating window area from the current brightness to the target brightness according to the second step size;
  • if the battery level is less than the second threshold, adjusting the display brightness of the floating window area directly to the target brightness.
  • the first threshold is greater than the second threshold, and the first step size is smaller than the second step size.
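The battery-dependent step-size selection and the gradual fade described above can be sketched as follows. This is an illustrative Python sketch rather than part of the patent; the threshold values and step sizes are assumptions chosen for the example.

```python
def pick_step(battery_pct, current, target,
              first_threshold=50, second_threshold=20,
              first_step=5, second_step=15):
    """Choose a brightness step size from the remaining battery level.

    A higher battery level gives a smaller step, i.e. a slower, smoother
    fade; below the second threshold the step is the full difference, so
    the brightness jumps directly to the target (the largest step).
    """
    if battery_pct >= first_threshold:
        return first_step
    if battery_pct >= second_threshold:
        return second_step
    return abs(target - current)


def fade(current, target, step):
    """Yield successive brightness values of a fade from current to target."""
    direction = 1 if target > current else -1
    while current != target:
        # never overshoot the target on the final step
        current += direction * min(step, abs(target - current))
        yield current
```

For example, `list(fade(100, 40, pick_step(10, 100, 40)))` collapses to a single direct jump, while at a full battery `pick_step` returns the small step and the same fade takes twelve frames of 5 each.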
  • for the method of adjusting the display brightness of the main window area from the current brightness to the target brightness, refer to the above-mentioned method for the floating window area, which will not be repeated here.
  • since the brightness adjustments of the main window area and the floating window area can be regarded as two independent processes, their adjustment methods can be the same or different, and their fade speeds can be the same or different; this is not limited in the embodiment of the present application.
  • when the main window area is divided into sub-areas with different brightness, the adjustment method can refer to the case where the main window area has a single brightness; the only difference is that the brightness adjustment of the main window area is subdivided into brightness adjustments of multiple sub-areas, which will not be repeated here.
  • the embodiment of the present application provides an interface display method in a scene with multiple floating windows.
  • the interface includes a full-screen window 201, a first floating window 203, and a second floating window 204.
  • as shown in FIG. 9, the interface 900 is divided into: a main window area 901, a first floating window area 902 and a second floating window area 903.
  • the first floating window area 902 refers to the area where the first floating window is located in the interface, and the second floating window area 903 refers to the area where the second floating window is located in the interface.
  • the main window area 901 refers to the area in the interface outside the first floating window area 902 and the second floating window area 903.
  • the display brightness of the interface is determined according to the brightness parameter of the screen, therefore, the display brightness of the main window area 901 , the first floating window area 902 and the second floating window area 903 are the same.
  • the display brightness of each area in the interface is adjusted according to which of the main window area 901, the first floating window area 902 and the second floating window area 903 the gaze area is located in.
  • if the main window area 901 is determined as the user's gaze area, the first floating window area 902 and the second floating window area 903 are determined as non-gaze areas; at this time, the user is focusing on the main window area 901.
  • the embodiment of this application provides the following possible display brightness setting methods:
  • the user uses the floating windows to display application interfaces above the main window, indicating that the user pays more attention to the content displayed in the floating windows, but the user is currently focusing on the content displayed in the main window area 901, so it is impossible to know which specific window area the user is currently concerned about; therefore, in a possible implementation manner, the display brightness of the main window area 901, the first floating window area 902 and the second floating window area 903 may be the same, and the specific display brightness may be determined according to the brightness parameter of the screen; or,
  • the display brightness of the main window area 901 can be higher than the display brightness of the first floating window area 902 and the second floating window area 903; optionally, the display brightness of the main window area 901 can be determined according to the brightness parameter of the screen, the display brightness of the first floating window area 902 and the second floating window area 903 can be partially or completely lower than the display brightness of the main window area 901, and the display brightness of the first floating window area 902 and the second floating window area 903 may be the same or different; or,
  • the display brightness of the main window area 901 can be lower than the display brightness of the first floating window area 902 and the second floating window area 903; optionally, the display brightness of the first floating window area 902 and the second floating window area 903 can be determined according to the brightness parameter of the screen, the display brightness of the main window area 901 may be partially or completely lower than the display brightness of the floating window areas, and the display brightness of the first floating window area 902 and the second floating window area 903 may be the same or different.
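One of the setting methods above — keeping the gaze area at the brightness given by the screen's brightness parameter and dimming every non-gaze area — might be sketched as follows. The area names and the dimming factor are assumptions for illustration; the equal-brightness and per-area variants described above could be substituted.

```python
def area_brightness(gaze_area, areas, screen_brightness, dim_factor=0.4):
    """Map each window area to a display brightness.

    The gaze area keeps the brightness given by the screen's brightness
    parameter; every non-gaze area is dimmed by dim_factor.
    """
    return {area: screen_brightness if area == gaze_area
            else int(screen_brightness * dim_factor)
            for area in areas}
```

For example, `area_brightness("area_902", ["area_901", "area_902", "area_903"], 100)` keeps the first floating window area at 100 and dims the main window area and the second floating window area to 40.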
  • if the first floating window area 902 is determined as the user's gaze area, the main window area 901 and the second floating window area 903 are determined as non-gaze areas; at this time, the user is focusing on the first floating window area 902.
  • this embodiment of the application provides the following possible display brightness setting methods:
  • the display brightness of the first floating window area 902 is at least higher than the display brightness of the main window area 901; the display brightness of the first floating window area 902 can be determined according to the brightness parameter of the screen, and the display brightness of the main window area 901 is lower than the display brightness of the first floating window area 902.
  • the display brightness of the second floating window area 903 may be the same as or lower than the display brightness of the first floating window area 902.
  • the display brightness of the second floating window area 903 may be higher than the display brightness of the main window area 901 and lower than the display brightness of the first floating window area 902.
  • if the second floating window area 903 is determined as the user's gaze area, the display brightness setting method can refer to the method used when the gaze area is located in the first floating window area 902; the only difference is that the first floating window area 902 and the second floating window area 903 are interchanged, and details are not described here.
  • the display brightness of each area may change accordingly.
  • the change of the display brightness of each area can be a direct change or a gradual change; please refer to the corresponding description in FIG. 3, which is not repeated here.
  • as shown in the first picture in FIG. 10, the initial display brightness of the interface is the same everywhere; as shown in the second picture in FIG. 10, the electronic device detects that the gaze area is located in the left floating window, the display brightness of the interface area corresponding to that floating window is normal brightness (that is, the initial display brightness in the first picture), the display brightness of the interface area corresponding to the right floating window becomes lower, and the display brightness of the other interface areas is 0; when the electronic device detects that the gaze area is located in the right floating window, the interface area corresponding to that floating window returns to normal brightness, the display brightness of the interface area corresponding to the left floating window becomes lower, and the display brightness of the other interface areas is 0.
  • this embodiment of the present application provides an interface display method; for the specific process, refer to the interface display method shown in FIG. 8A. The main difference from the method shown in FIG. 8A is that the gaze area is further expanded from the main window area or the floating window area into: the main window area, the first floating window area or the second floating window area, and the display brightness setting methods of the three areas are slightly different.
  • the embodiment of the present application provides an interface display method in a split-screen display scene; as shown in FIG. 2E, the scene includes a first split-screen window 205 and a second split-screen window 206, and the two windows divide the interface 110 into the first split-screen area 111 and the second split-screen area 112.
  • the first split-screen area 111 is the area where the first split-screen window is located in the interface, and the second split-screen area 112 is the area where the second split-screen window is located in the interface.
  • the display brightness of the interface is determined according to the brightness parameter of the screen, therefore, the display brightness of the first split screen area 111 and the second split screen area 112 are the same.
  • after the gaze area is detected, the display brightness of the above two areas is adjusted according to which area the gaze area is located in.
  • the embodiment of this application provides the following possible display brightness setting methods:
  • the display brightness of the first split-screen area 111 is higher than the display brightness of part or all of the second split-screen area 112 .
  • the display brightness of the first split-screen area 111 may be determined according to a brightness parameter of the screen, and the display brightness of the second split-screen area 112 is lower than the display brightness of the first split-screen area 111 .
  • the second split-screen area 112 can be divided into sub-areas; the display brightness of different sub-areas can be the same or different, the display brightness of each sub-area is not higher than the display brightness of the first split-screen area 111, and the display brightness of at least one sub-area is lower than the display brightness of the first split-screen area 111.
  • if the gaze area is located in the second split-screen area 112, the display brightness setting method of the two areas can refer to the description for the case where the gaze area is located in the first split-screen area 111; the only difference is that the first split-screen area 111 and the second split-screen area 112 are interchanged.
  • the display brightness of the first split-screen area 111 and the second split-screen area 112 may change, and the change of the display brightness of each area can be a direct change or a gradual change.
  • as shown in the first picture, the initial display brightness of the interface is the same everywhere; when it is detected that the gaze area is located in window A, the display brightness of the interface area corresponding to window A is normal brightness (that is, the initial display brightness in the first picture), and the display brightness of the interface area corresponding to window B becomes lower, the lowest possible being 0; when the electronic device detects that the gaze area is located in window B, the display brightness of the interface area corresponding to window B becomes normal, and the display brightness of the interface area corresponding to window A becomes lower, the lowest possible being 0.
  • this embodiment of the present application provides an interface display method; the specific process can refer to the method shown in FIG. 8A. The main difference from the method shown in FIG. 8A is that the gaze area is located in the first split-screen area 111 or the second split-screen area 112, and the display brightness settings of the two areas are slightly different; for details, please refer to the corresponding description in FIG. 11A.
  • the embodiment of the present application provides an interface display method in the parallel horizon scene; as shown in FIG. 2F, the scene includes the first parallel horizon window and the second parallel horizon window, and as shown in FIG. 12A, the two windows divide the interface 120 into the first parallel horizon area 121 and the second parallel horizon area 122; the first parallel horizon area 121 is the area where the first parallel horizon window is located in the interface, and the second parallel horizon area 122 is the area where the second parallel horizon window is located in the interface.
  • the display brightness setting method of the first parallel horizon area 121 and the second parallel horizon area 122 in the scene shown in FIG. 12A can refer to the corresponding description of the split-screen display scene above, with the first split-screen area replaced by the first parallel horizon area and the second split-screen area replaced by the second parallel horizon area, which will not be described here.
  • as shown in the first picture, the initial display brightness of the interface is the same everywhere; when the electronic device detects that the gaze area is located in window A, the display brightness of the interface area corresponding to window A is normal brightness (that is, the initial display brightness in the first picture), and the display brightness of the interface area corresponding to window B becomes lower, the lowest possible being 0; when the electronic device detects that the gaze area is located in window B, the display brightness of the interface area corresponding to window B becomes normal, and the display brightness of the interface area corresponding to window A becomes lower, the lowest possible being 0.
  • this embodiment of the present application provides an interface display method; the specific process can refer to the method shown in FIG. 8A. The main difference from the method shown in FIG. 8A is that the gaze area is located in the first parallel horizon area 121 or the second parallel horizon area 122, and the display brightness settings of the two areas are slightly different; for details, please refer to the corresponding description in FIG. 12A.
  • FIG. 13 is a block diagram of a software structure of an electronic device provided by an embodiment of the present application, and the software structure is applicable to the embodiments shown in FIGS. 3 to 5 .
  • the software structure takes the four-layer division of the Android system in FIG. 1B as an example; from top to bottom, the layers are the application layer, the framework layer, the Android runtime and system libraries, and the kernel layer.
  • Application layer (Application, App) can include:
  • the screen display module is used to determine and execute the power saving strategy according to the power saving instruction, and receive information such as the focus area and the boundary parameters of the window sent by the power control module;
  • the interface display module is used to determine the gaze area, display the interface, and adjust the display brightness of the gaze area and/or the non-gaze area in the interface.
  • the interface display module is used for displaying the interface.
  • the framework layer (Framework, FWK) can include:
  • the window display power-saving framework, which can include:
  • the monitoring module is used to monitor eye gaze events reported by the system library layer, including eye image data;
  • the eye gaze service module is used to receive the eye gaze event and request the window type management module to determine the display type of the current window;
  • the window type management module is used to record the current window display type, including: single-window type and multi-window type, where the multi-window type can be further subdivided into: single floating window, multiple floating windows, split-screen display, parallel horizon and other types;
  • the window management module corresponding to each window display type is used to manage information such as boundary parameters of the window in the screen, calculate the focus area according to the image data, and send information such as window boundary parameters and focus area to the power control module.
  • the window management modules may include: a single window management module, a single floating window management module, a multiple floating window management module, a parallel horizon management module, a split-screen management module, etc., respectively corresponding to the window display types.
  • the power control module is used to record whether the power saving strategy is implemented, and if the power saving strategy is implemented, information such as power saving instructions, window boundary parameters, and focus areas are sent to the screen display module.
  • the display framework is used for processing such as drawing and rendering of the interface.
  • System libraries can include: camera module and display module.
  • the kernel layer can include: camera driver and display driver.
  • the camera driver is used to drive the camera of the hardware layer
  • the display driver is used to drive the display screen of the hardware layer, that is, the screen of the electronic device in the embodiment of the present application.
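The window-type bookkeeping and per-type dispatch performed by the window type management module and the window management modules above might be sketched as follows; the type names and the string placeholders standing in for the management modules are illustrative assumptions, not part of the patent.

```python
from enum import Enum, auto


class WindowType(Enum):
    """Window display types recorded by the window type management module."""
    SINGLE = auto()
    SINGLE_FLOATING = auto()
    MULTI_FLOATING = auto()
    SPLIT_SCREEN = auto()
    PARALLEL_HORIZON = auto()


# one management handler per window display type, mirroring the modules above;
# real handlers would compute the focus area and the window boundary parameters
MANAGERS = {
    WindowType.SINGLE: "single window management module",
    WindowType.SINGLE_FLOATING: "single floating window management module",
    WindowType.MULTI_FLOATING: "multiple floating window management module",
    WindowType.SPLIT_SCREEN: "split-screen management module",
    WindowType.PARALLEL_HORIZON: "parallel horizon management module",
}


def route_gaze_event(window_type, event):
    """Forward an eye-gaze event to the management module matching the
    current window display type."""
    return MANAGERS[window_type], event
```

Keeping one handler per display type means a new layout mode only adds an entry to the table rather than a new branch in the event path.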
  • the embodiment of the present application provides an interface display method; as shown in FIG. 14, the method combines the interface display method shown in FIG. 5 with the software structure shown in FIG. 13. Compared with the software structure shown in FIG. 13, a gaze area determination module is added to record the preset size of the gaze area, such as its length and width.
  • the camera at the hardware layer receives the eye-gaze event and triggers an interrupt to notify the camera driver at the kernel layer to obtain the eye image data; the camera driver receives the interrupt, obtains the eye image data, converts it into an eye-gaze event, and transmits it through the camera module in the system library to the monitoring module of the framework layer; the monitoring module monitors the eye-gaze event and sends it to the eye-gaze service module, and the eye-gaze service module requests the window type management module to determine the current window type; the window type management module determines that the current window display type is the single-window type and sends the eye-gaze event to the single window management module; the single window management module determines the focus area and sends the boundary parameters of the window and the focus area information to the power control module; information such as the power-saving instruction, the window boundary parameters and the focus area is sent to the screen display module of the application layer; the screen display module determines that the power-saving instruction has been received and sends information such as the focus area and the window boundary parameters to the gaze area determination module.
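The reporting chain above, from camera interrupt up to the message the power control module hands to the application layer, can be sketched as a chain of small functions. The data fields and the stubbed focus computation are assumptions for illustration; real gaze estimation would replace the placeholder.

```python
def camera_driver(interrupt):
    """Kernel layer: turn a camera interrupt into an eye-gaze event
    carrying the eye image data."""
    return {"event": "eye_gaze", "image": interrupt["frame"]}


def window_manager(gaze_event, window_bounds):
    """Framework layer: compute the focus area from the image data
    (stubbed here) and attach the window boundary parameters."""
    focus = (10, 20)  # placeholder for a real gaze-estimation result
    return {"focus": focus, "bounds": window_bounds}


def power_control(info, power_saving=True):
    """Framework layer: attach the power-saving instruction when the
    power-saving strategy is in force, for the screen display module."""
    return {**info, "power_saving": power_saving}


# reporting chain from the hardware layer up to the application layer
message = power_control(
    window_manager(camera_driver({"frame": b"raw-eye-image"}),
                   window_bounds=(0, 0, 1080, 2340)))
```

Each stage only adds the information its layer owns, so the application layer receives one message carrying the focus area, window bounds and power-saving instruction together.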
  • the embodiment of the present application provides an interface display method; FIG. 15 is a schematic flowchart of the method shown in FIG. 8A under the software structure shown in FIG. 13, wherein,
  • the camera at the hardware layer receives the eye-gaze event and triggers an interrupt to notify the camera driver at the kernel layer to obtain the eye image data; the camera driver receives the interrupt, obtains the eye image data, converts it into an eye-gaze event, and transmits it through the camera module in the system library to the monitoring module of the framework layer; the monitoring module monitors the eye-gaze event and sends it to the eye-gaze service module, and the eye-gaze service module requests the window type management module to determine the current window type; the window type management module determines that the current window display type is the single floating window type and sends the eye-gaze event to the single floating window management module; the single floating window management module determines the focus area and sends the window boundary parameters and the focus area information to the power control module; the power control module determines that the power-saving strategy needs to be implemented and sends information such as the power-saving instruction, the window boundary parameters and the focus area to the screen display module of the application layer; the screen display module determines that the power-saving instruction has been received.
  • the interface display module determines the gaze area and the non-gaze area, and adjusts the display brightness of the gaze area and/or the non-gaze area in the displayed interface.
  • the interface display module can display the interface to be displayed on the display screen through the display framework, the display module and the display driver; in the interface displayed on the display screen, the display brightness of the gaze area is higher than the display brightness of all or part of the non-gaze area.
  • the embodiment of the present application provides an interface display method; as shown in FIG. 16, this method is applicable to the scene shown in FIG. 9. The differences from the method shown in FIG. 15 are that:
  • the window type management module determines that the current window display type is the multiple floating window type, and sends the eye-gaze event to the multiple floating window management module; the multiple floating window management module determines the focus area, and sends the window boundary parameters and the focus area information to the power control module; for the implementation of the other parts, reference may be made to the corresponding descriptions in the foregoing embodiments, and details are not repeated here.
  • the embodiment of the present application provides an interface display method applicable to the split-screen display scene; the main differences from the method shown in FIG. 15 are that: the window type management module determines that the current window display type is the split-screen display type, and sends the eye-gaze event to the split-screen window management module; the split-screen window management module determines the focus area, and sends the window boundary parameters and the focus area information to the power control module; for the implementation of the other parts, reference may be made to the corresponding descriptions in the foregoing embodiments, and details are not repeated here.
  • the embodiment of the present application provides an interface display method; as shown in FIG. 18, this method is applicable to the scene shown in FIG. 12A. The differences from the method shown in FIG. 15 are that: the window type management module determines that the current window display type is the parallel horizon display type, and sends the eye-gaze event to the parallel horizon window management module; the parallel horizon window management module determines the focus area, and sends the window boundary parameters and the focus area information to the power control module; for the implementation of the other parts, reference may be made to the corresponding descriptions in the foregoing embodiments, and details are not repeated here.
  • an embodiment of the present application provides an interface display device.
  • the device 1900 includes: a detection unit 1910 and an adjustment unit 1920 , wherein,
  • the detection unit 1910 is configured to detect the gaze area of the user's eyeballs in the first interface; the first interface is an interface displayed on the screen of the electronic device;
  • the adjustment unit 1920 is used to adjust the display brightness of the first interface to obtain a second interface, the brightness of the gaze area in the second interface being greater than the brightness of some or all of the non-gaze area; the non-gaze area is the area other than the gaze area in the first interface.
  • the detection unit 1910 is configured to detect the gaze area of the user's eyeballs in the interface, including:
  • the detection unit 1910 is configured to: determine the focus area corresponding to the gaze focus of the user's eyeball in the interface; and determine the gaze area of the user's eyeball in the interface according to the focus area.
  • the detection unit 1910 is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area, including:
  • the detection unit 1910 is configured to: determine the gaze area of the user's eyes in the interface according to the focus area and the current window display type of the first interface, where the window display type is single-window display or multi-window display.
  • the window display type is single-window display
  • the detection unit 1910 is configured to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the detecting unit 1910 is configured to: determine a gaze area according to the focus area, and the gaze area includes the focus area.
  • optionally, the window display type is multi-window display, and the interface is divided into at least two window areas by the windows; the detection unit 1910 is used to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the detection unit 1910 is used for: when the focus area is located in a first window area of the at least two window areas, determining the first window area as the gaze area.
  • optionally, the window display type is multi-window display, and the interface is divided into at least two window areas by the windows; the detection unit 1910 is used to determine the gaze area of the user's eyeballs in the interface according to the focus area and the current window display type of the first interface, including:
  • the detection unit 1910 is used for:
  • the multi-window display includes: single floating window display, and/or, multiple floating window display, and/or, split-screen display, and/or, parallel view display.
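Determining which window area contains the gaze focus — the detection unit's multi-window behavior described above — reduces to a point-in-rectangle test. The coordinates and area names below are assumed for illustration; overlapping areas (e.g. a floating window above the main window) would be checked in front-to-back order.

```python
def gaze_window(focus, window_areas):
    """Return the name of the window area whose rectangle contains the
    gaze focus point; that area becomes the gaze area.

    focus        -- (x, y) gaze focus in screen coordinates
    window_areas -- {name: (left, top, right, bottom)}
    """
    x, y = focus
    for name, (left, top, right, bottom) in window_areas.items():
        if left <= x < right and top <= y < bottom:
            return name
    return None  # focus outside every window: gaze area undetermined
```

For a split-screen layout, `gaze_window((500, 600), {"first": (0, 0, 1080, 1170), "second": (0, 1170, 1080, 2340)})` returns `"first"`.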
  • the adjustment unit 1920 is configured to adjust the display brightness of the interface, including:
  • the adjustment unit 1920 is configured to: obtain the first target brightness, and adjust the brightness of the gaze area to the first target brightness.
  • the adjustment unit 1920 is configured to adjust the brightness of the gaze area to the first target brightness, including:
  • the adjustment unit 1920 is used for:
  • the adjustment unit 1920 is configured to adjust the brightness of the gaze area to the first target brightness, including:
  • the adjustment unit 1920 is used for:
  • if the power is not less than the first threshold, the brightness of the gaze area is faded to the first target brightness according to the first step size; and/or,
  • if the power is less than the first threshold and not less than the second threshold, the brightness of the gaze area is faded to the first target brightness according to the second step size; the first threshold is greater than the second threshold, and the first step size is smaller than the second step size; and/or,
  • if the power is less than the second threshold, the brightness of the gaze area is adjusted directly to the first target brightness.
  • the adjustment unit 1920 is configured to adjust the display brightness of the interface, including:
  • the adjustment unit 1920 is configured to: acquire a brightness setting policy of the non-focus area, and adjust the brightness of the non-focus area according to the brightness setting policy.
  • the adjustment unit 1920 is configured to adjust the brightness of the non-gazing area according to a brightness setting strategy, including:
  • the adjusting unit 1920 is configured to: determine the second target brightness according to the brightness setting strategy, and adjust the brightness of the non-focus area to the second target brightness.
  • the adjustment unit 1920 is configured to adjust the brightness of the non-gazing area to the target brightness, including:
  • the adjustment unit 1920 is used for:
  • the adjustment unit 1920 is configured to adjust the brightness of the non-gazing area to the target brightness, including:
  • the adjustment unit 1920 is used for:
  • if the power is not less than the third threshold, the brightness of the non-gaze area is faded to the target brightness according to the third step size; and/or,
  • if the power is less than the third threshold and not less than the fourth threshold, the brightness of the non-gaze area is faded to the target brightness according to the fourth step size; the third threshold is greater than the fourth threshold, and the third step size is smaller than the fourth step size; and/or,
  • if the power is less than the fourth threshold, the brightness of the non-gaze area is adjusted directly to the target brightness.
  • the brightness setting strategy includes:
  • the non-gaze area is divided into several sub-areas, and the brightness of the sub-areas decreases successively in order of increasing distance between each sub-area and the gaze area, the maximum brightness of the sub-areas being less than or equal to the first target brightness; or,
  • the brightness of the pixels in the non-gaze area decreases successively in order of increasing distance between each pixel and the gaze area, the maximum brightness of the pixels in the non-gaze area being less than or equal to the first target brightness; or,
  • the brightness of the entire non-gaze area is set to the second target brightness, and the second target brightness is smaller than the first target brightness.
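The first strategy — sub-area brightness decreasing with distance from the gaze area, capped at the first target brightness — might look like the sketch below. The linear falloff and the use of sub-area centre points are assumptions for illustration; any monotonically decreasing assignment satisfies the strategy.

```python
def falloff_brightness(subareas, gaze_center, first_target, min_brightness=0):
    """Assign each non-gaze sub-area a brightness that decreases with its
    distance from the gaze area and never exceeds the first target brightness.

    subareas    -- {name: (x, y)} centre point of each non-gaze sub-area
    gaze_center -- (x, y) centre of the gaze area
    """
    gx, gy = gaze_center
    # order sub-areas from nearest to farthest from the gaze area
    ordered = sorted(subareas,
                     key=lambda n: ((subareas[n][0] - gx) ** 2 +
                                    (subareas[n][1] - gy) ** 2))
    n = len(ordered)
    step = (first_target - min_brightness) // (n + 1) if n else 0
    # nearest sub-area is brightest, each farther one is one step dimmer
    return {name: first_target - (i + 1) * step
            for i, name in enumerate(ordered)}
```

For three sub-areas at increasing distances and a first target brightness of 100, the assignment steps down 75, 50, 25 — each strictly below the gaze area's brightness.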
  • the adjustment unit 1920 is configured to adjust the display brightness of the interface, including:
  • the adjustment unit 1920 is used for:
  • An embodiment of the present application provides an electronic device, including a display and a processor; wherein, the processor is configured to execute the method provided in any one of the above embodiments in FIG. 3 to FIG. 18 .
  • the present application also provides an electronic device, the device including a storage medium and a central processing unit; the storage medium may be a non-volatile storage medium in which a computer-executable program is stored, and the central processing unit is connected to the non-volatile storage medium and executes the computer-executable program to implement the method provided in any one of the embodiments in FIG. 3 to FIG. 18 of the present application.
  • the embodiment of the present application also provides a computer-readable storage medium storing a computer program; when the computer program is run on a computer, the computer executes the method provided in any one of the embodiments in FIG. 3 to FIG. 18 of the present application.
  • An embodiment of the present application further provides a computer program product, the computer program product includes a computer program, and when it is run on a computer, the computer executes the method provided in any one of the embodiments in FIG. 3 to FIG. 18 of the present application.
  • "at least one” means one or more, and “multiple” means two or more.
  • “And/or” describes the association relationship of associated objects, indicating that there may be three kinds of relationships, for example, A and/or B may indicate that A exists alone, A and B exist simultaneously, or B exists alone. Among them, A and B can be singular or plural.
  • the character “/” generally indicates that the contextual objects are an “or” relationship.
  • “At least one of the following” and similar expressions refer to any combination of these items, including any combination of single items or plural items.
  • At least one of a, b, and c can represent: a, b, c, a and b, a and c, b and c or a and b and c, where a, b, c can be single, or Can be multiple.
  • If any function is implemented in the form of a software function unit and sold or used as an independent product, it may be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • The aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and other media that can store program code.
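As a concrete illustration of the brightness rule above (the gaze area at a first target brightness, the non-gaze area at a lower second target brightness), the following sketch dims a grayscale frame outside a rectangular gaze area. This is only a reconstruction for illustration: the patent publishes no code, and the function name, parameters, and the 1.0/0.4 brightness values are assumptions.

```python
def adjust_interface_brightness(frame, gaze_box, first_target=1.0, second_target=0.4):
    """Return a dimmed copy of `frame` (rows of grayscale values in [0, 1]).

    Pixels inside `gaze_box` (top, left, bottom, right) are scaled by the
    first target brightness; all other pixels by the lower second target.
    """
    if second_target >= first_target:
        # The method requires the second target brightness to be lower.
        raise ValueError("second_target must be lower than first_target")
    top, left, bottom, right = gaze_box
    out = []
    for y, row in enumerate(frame):
        out.append([
            value * (first_target if (top <= y < bottom and left <= x < right)
                     else second_target)
            for x, value in enumerate(row)
        ])
    return out

# A uniformly bright 4x6 interface; only the gaze area keeps full brightness.
frame = [[1.0] * 6 for _ in range(4)]
second_interface = adjust_interface_brightness(frame, gaze_box=(1, 2, 3, 5))
```

On a real device this scaling would be applied per window or per display region by the compositor or panel driver rather than per pixel in Python, but the relationship between the two target brightnesses is the same.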

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Interface display method and electronic device. In the method, a gaze area of a user's eyeballs in a first interface is detected, the first interface being an interface displayed on a screen of an electronic device; the display brightness of the first interface is adjusted to obtain a second interface, in which the brightness of the gaze area is higher than that of part or all of a non-gaze area, the non-gaze area being the area of the first interface other than the gaze area. The present invention not only preserves the user's viewing experience but also achieves the goal of saving power.
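The first step the abstract describes, detecting the gaze area of the user's eyeballs, ultimately has to yield a screen region. A minimal sketch, assuming the eye tracker reports a single fixation point and that the gaze area is a fixed-size rectangle clamped to the screen (the half-extents and all names here are illustrative assumptions, not details from the patent):

```python
def gaze_area_from_point(gaze_x, gaze_y, screen_w, screen_h,
                         half_w=200, half_h=150):
    """Map an estimated fixation point (pixels) to a rectangular gaze area.

    Returns (top, left, bottom, right), clamped to the screen bounds so the
    area never extends past an edge when the user looks near a corner.
    """
    left = max(0, gaze_x - half_w)
    right = min(screen_w, gaze_x + half_w)
    top = max(0, gaze_y - half_h)
    bottom = min(screen_h, gaze_y + half_h)
    return top, left, bottom, right

# Looking near the top-left corner of a 1080x2340 screen: the box is clipped.
area = gaze_area_from_point(100, 100, 1080, 2340)
```

Everything in the first interface outside this rectangle is the non-gaze area whose brightness the method lowers.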
PCT/CN2022/114916 2021-08-31 2022-08-25 Interface display method and electronic device WO2023030168A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202122085565.2 2021-08-31
CN202122085565 2021-08-31
CN202210563228.6A CN115729346A (zh) Interface display method and electronic device
CN202210563228.6 2022-05-20

Publications (1)

Publication Number Publication Date
WO2023030168A1 true WO2023030168A1 (fr) 2023-03-09

Family

ID=85292405

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/114916 WO2023030168A1 (fr) Interface display method and electronic device

Country Status (2)

Country Link
CN (1) CN115729346A (fr)
WO (1) WO2023030168A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117372656A (zh) * 2023-09-25 2024-01-09 Guangdong University of Technology User interface display method, device and medium for mixed reality

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103853424A (zh) * 2012-11-28 2014-06-11 Samsung Electronics Co., Ltd. Display device and method of controlling the display device
CN107728770A (zh) * 2017-09-26 2018-02-23 Nubia Technology Co., Ltd. Terminal screen brightness adjustment method, mobile terminal, and computer-readable storage medium
CN110032271A (zh) * 2018-01-12 2019-07-19 BOE Technology Group Co., Ltd. Contrast adjustment apparatus and method, virtual reality device, and storage medium
CN111601373A (zh) * 2020-05-09 2020-08-28 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Backlight brightness control method and apparatus, mobile terminal, and storage medium


Also Published As

Publication number Publication date
CN115729346A (zh) 2023-03-03

Similar Documents

Publication Publication Date Title
WO2020259452A1 (fr) Full-screen display method for mobile terminal, and apparatus
EP3872609B1 (fr) Application display method and electronic device
EP4131911A1 (fr) Application interface interaction method, electronic device, and computer-readable storage medium
WO2021169337A1 (fr) On-screen fingerprint display method and electronic device
AU2020229917B2 (en) Recording frame rate control method and related apparatus
US20230419570A1 (en) Image Processing Method and Electronic Device
EP4012544A1 (fr) Split-screen processing method and terminal device
WO2021036585A1 (fr) Flexible screen display method and electronic device
US20230117194A1 (en) Communication Service Status Control Method, Terminal Device, and Readable Storage Medium
US20230386382A1 (en) Display method, electronic device, and computer storage medium
WO2023000772A1 (fr) Mode switching method and apparatus, electronic device, and chip system
WO2022001258A1 (fr) Multi-screen display method and apparatus, terminal device, and storage medium
EP4280586A1 (fr) Point light source image detection method and electronic device
US11990075B2 (en) Drive control method and related device
CN111522425A (zh) Power consumption control method for electronic device, and electronic device
WO2022037726A1 (fr) Split-screen display method and electronic device
EP4020965A1 (fr) Photographing method and electronic device
WO2020233593A1 (fr) Foreground element display method and electronic device
WO2022143180A1 (fr) Collaborative display method, terminal device, and computer-readable storage medium
US11995317B2 (en) Method and apparatus for adjusting memory configuration parameter
WO2023030168A1 (fr) Interface display method and electronic device
WO2022252972A1 (fr) Window display method and electronic device
CN113781959B (zh) Interface processing method and apparatus
WO2024066834A1 (fr) Vsync signal control method, electronic device, storage medium, and chip
WO2024066834A9 (fr) Vsync signal control method, electronic device, storage medium, and chip

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22863306

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE