WO2022111593A1 - Apparatus and method for displaying a graphical user interface - Google Patents

Apparatus and method for displaying a graphical user interface

Info

Publication number
WO2022111593A1
Authority
WO
WIPO (PCT)
Prior art keywords
light source
shadow
view control
electronic device
specified view
Prior art date
Application number
PCT/CN2021/133215
Other languages
English (en)
Chinese (zh)
Inventor
范振华
杨婉艺
曹原
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Publication of WO2022111593A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Definitions

  • The present application relates to the technical field of electronic devices, and in particular, to a method and apparatus for displaying a graphical user interface.
  • A graphical user interface (GUI) refers to a user interface, related to computer operations, that is displayed in a graphical manner. It can include icons, windows, controls, and other interface elements displayed on the display screen of the electronic device, wherein the controls can include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, and navigation bars.
  • The present application provides a graphical user interface display method and apparatus, which can dynamically display the shadow and light effects of a specified view control in the GUI, thereby improving the realism of the GUI.
  • In a first aspect, the present application provides a method for displaying a graphical user interface.
  • The method is applied to an electronic device; the electronic device includes a display screen for displaying a graphical user interface, and the graphical user interface includes a specified view control with a depth attribute.
  • The method includes: the electronic device determines the position of a first light source according to a collected first image of a biological feature; generates and outputs a first shadow light effect for the specified view control according to the position of the first light source; determines the position of a second light source according to a collected second image of the biological feature; and generates and outputs a second shadow light effect for the specified view control according to the position of the second light source.
  • The position of the first light source and the position of the second light source are positions of the biological feature relative to the display screen; the position of the first light source is different from the position of the second light source, and the first shadow light effect is different from the second shadow light effect. That is to say, different light source positions can generate different shadow light effects for the same specified view control.
  • The shadow light effect of the same specified view control can change as the light source position changes, realizing dynamic display of the shadow effect of the specified view control, which helps build a more realistic graphical user interface.
  • The electronic device determines the first image position of the biological feature in the first image according to the collected first image of the biological feature, and estimates the distance of the biological feature relative to the display screen; it then determines the position of the first light source according to the first image position and the distance. The position of the second light source can be determined in the same way.
  • The light source position can thus be determined from the position of the biological feature in the image and its distance relative to the display screen, which helps ensure the accuracy of the light source position (an illustrative sketch follows).
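  • As an illustration only (this is not the patent's actual algorithm), the following sketch maps a detected face position in a front-camera image, together with an estimated distance, to a light source position in screen coordinates; the pinhole-camera model, the field-of-view parameters, and all names are assumptions.

```java
// Hypothetical sketch: estimate the light source position (the biological
// feature's position relative to the display screen) from its normalized
// image coordinates and an estimated distance. Assumes a pinhole camera.
public final class LightSourceEstimator {

    /** Light source position in screen coordinates, in millimeters. */
    public static final class LightPos {
        public final double x, y, z;
        LightPos(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
    }

    // u, v: face center in the image, normalized to [0, 1]
    // distanceMm: estimated face-to-screen distance (e.g., from apparent eye
    //             spacing or a depth sensor)
    // fovXDeg, fovYDeg: assumed horizontal/vertical camera field of view
    public static LightPos estimate(double u, double v, double distanceMm,
                                    double fovXDeg, double fovYDeg) {
        double dx = u - 0.5;  // offset from the image center, in [-0.5, 0.5]
        double dy = v - 0.5;
        // Under a pinhole model, the lateral offset grows with the tangent of
        // half the field of view and with the distance to the subject.
        double x = distanceMm * dx * 2.0 * Math.tan(Math.toRadians(fovXDeg / 2.0));
        double y = distanceMm * dy * 2.0 * Math.tan(Math.toRadians(fovYDeg / 2.0));
        // Z is simply the estimated distance of the feature from the screen.
        return new LightPos(x, y, distanceMm);
    }
}
```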
  • If the angle between the display screen and the horizontal direction is within a preset range, the electronic device determines a first viewing angle according to the first image position and the distance, and determines the position of the first light source according to the first viewing angle. In this case the display screen is vertical or nearly vertical to the horizontal direction, so the influence of the posture of the electronic device on the light source position can be ignored, which reduces the amount of calculation.
  • If the angle between the display screen and the horizontal direction is not within the preset range, the tilt of the display screen is relatively large. The electronic device then determines a second viewing angle according to that angle together with the first image position and the distance, and determines the position of the first light source according to the second viewing angle. Considering the influence of the posture of the electronic device in this way yields a more accurate light source position (see the sketch below).
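  • A hedged sketch of the two cases above, with assumed names and angle conventions: when the screen is close to vertical, the camera-based position is used as-is; otherwise it is rotated by the measured screen tilt.

```java
// Illustrative posture correction (assumed convention: tiltDeg is the angle
// between the display screen and the horizontal direction, so 90 means the
// screen is perfectly vertical). Not the patent's formulas.
public final class ViewingAngleCorrector {

    // pos: light source position {x, y, z} in screen coordinates
    public static double[] correctForTilt(double[] pos, double tiltDeg,
                                          double nearVerticalToleranceDeg) {
        // Case 1: screen within the preset range of vertical -> skip the
        // correction entirely, reducing the amount of calculation.
        if (Math.abs(90.0 - tiltDeg) <= nearVerticalToleranceDeg) {
            return pos;
        }
        // Case 2: rotate about the screen's X axis by the deviation from
        // vertical to account for the device's posture.
        double a = Math.toRadians(90.0 - tiltDeg);
        double y = pos[1] * Math.cos(a) - pos[2] * Math.sin(a);
        double z = pos[1] * Math.sin(a) + pos[2] * Math.cos(a);
        return new double[] { pos[0], y, z };
    }
}
```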
  • the first shadow light effect includes a first shadow.
  • the electronic device generates and outputs a first shadow for the specified view control according to the position of the first light source, the intensity of the first light source and the material attribute information of the specified view control.
  • When generating shadows, the electronic device considers not only the position and intensity of the light source but also the material attribute information, which enriches the shadow effect and makes it better match the material.
  • The electronic device determines the subject blur radius according to the position of the first light source and the intensity of the first light source; determines the projection blur information according to the position of the first light source and the material attribute information of the specified view control; and generates and outputs the first shadow for the specified view control according to the subject blur radius and the projection blur information.
  • The material attribute information of the specified view control may include one or more of refractive index, reflectance, diffuse reflectance, or transparency.
  • The material may include one or more of a background material, a border material, or a backplane material. That is, different material attribute information for the specified view control can produce shadows of different weights and colors, and light effects of different appearances (a sketch follows).
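  • A minimal sketch, assuming illustrative mapping functions (the patent does not disclose concrete formulas here): the blur radius shrinks with light intensity, the shadow offset is the view's depth projected along the light direction, and a more transparent material casts a lighter shadow.

```java
import android.graphics.Color;
import android.graphics.Paint;

// Hypothetical shadow parameters derived from light position/intensity and a
// material opacity value. Note: on hardware-accelerated canvases,
// setShadowLayer is only guaranteed for text, so a software layer may be
// needed when drawing arbitrary shapes.
public final class ShadowParams {

    public static Paint shadowPaint(double lightX, double lightY, double lightZ,
                                    double intensity, float materialOpacity) {
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        // Stronger light -> sharper shadow (smaller blur radius); assumed scale.
        float blurRadius = (float) Math.max(1.0, 24.0 / Math.max(intensity, 0.1));
        // The shadow falls opposite the light: project the view's assumed
        // depth (in pixels) along the light direction.
        float depthPx = 18f;
        float dx = (float) (-lightX / lightZ * depthPx);
        float dy = (float) (-lightY / lightZ * depthPx);
        // More opaque materials cast darker shadows.
        int alpha = (int) (96 * materialOpacity);
        paint.setShadowLayer(blurRadius, dx, dy, Color.argb(alpha, 0, 0, 0));
        return paint;
    }
}
```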
  • the first shadow light effect includes a first light effect.
  • The electronic device generates and outputs a first light effect for the specified view control according to the position of the first light source, the intensity of the first light source, and the original color information of the specified view control. That is to say, if the light source position, the light source intensity, and/or the original color information of the specified view control determined by the electronic device differ, the generated light effect will differ, achieving the goal of dynamically displaying the specified view control.
  • the first shadow light effect includes a first light effect.
  • The electronic device determines the radial gradient radius according to the position of the first light source, determines the Gaussian blur radius according to the intensity of the first light source, and generates and outputs the first light effect for the specified view control according to the radial gradient radius and the Gaussian blur radius. That is, a different light source position or intensity yields a different radial gradient radius and Gaussian blur radius, and thus a different light effect (an illustrative sketch follows).
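  • The following sketch instantiates the two mappings named above with assumed formulas: a radial gradient whose radius depends on the light position, and a Gaussian-style blur whose radius depends on the light intensity; the highlight color is blended with the control's original color.

```java
import android.graphics.BlurMaskFilter;
import android.graphics.Paint;
import android.graphics.RadialGradient;
import android.graphics.Shader;

// Hypothetical light-effect paint. The concrete scaling factors are
// assumptions; BlurMaskFilter may require a software layer on
// hardware-accelerated canvases.
public final class LightEffect {

    public static Paint highlightPaint(float centerX, float centerY,
                                       double lightZ, double intensity,
                                       int originalColor) {
        // Radial gradient radius from the light source position: a closer
        // light (smaller Z) produces a tighter highlight.
        float radius = (float) Math.max(8.0, lightZ * 0.5);
        // Gaussian blur radius from the light source intensity.
        float blur = (float) Math.max(1.0, 4.0 * intensity);
        int highlight = blendTowardWhite(originalColor,
                Math.min(1.0, Math.max(0.0, intensity)));
        Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
        paint.setShader(new RadialGradient(centerX, centerY, radius,
                highlight, originalColor, Shader.TileMode.CLAMP));
        paint.setMaskFilter(new BlurMaskFilter(blur, BlurMaskFilter.Blur.NORMAL));
        return paint;
    }

    // Blend the control's original color toward white by factor t in [0, 1].
    private static int blendTowardWhite(int c, double t) {
        int r = (c >> 16) & 0xFF, g = (c >> 8) & 0xFF, b = c & 0xFF;
        r += (int) ((255 - r) * t);
        g += (int) ((255 - g) * t);
        b += (int) ((255 - b) * t);
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }
}
```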
  • In a second aspect, the present application provides a method for displaying a graphical user interface. The method is applied to an electronic device; the electronic device includes a display screen for displaying a graphical user interface, and the graphical user interface includes a specified view control with a depth attribute.
  • The method includes: the electronic device obtains a first position of the electronic device relative to a preset light source; generates and outputs a first shadow light effect for the specified view control according to the first position and the position of the preset light source; obtains a second position of the electronic device relative to the preset light source; and generates and outputs a second shadow light effect for the specified view control according to the second position and the position of the preset light source.
  • The first position is different from the second position, and the first shadow light effect is different from the second shadow light effect. That is to say, the position of the preset light source is fixed; when the position of the electronic device relative to it differs, different shadow light effects can be generated for the same specified view control.
  • the difference between the first position and the second position can be understood as the different postures of the electronic device.
  • The shadow light effect of the same specified view control can change with the posture of the electronic device, realizing dynamic display of the shadow effect of the specified view control, which helps build a more realistic graphical user interface.
  • Moreover, the method generates the shadow light effect according to the posture of the electronic device, and has low power consumption.
  • The first shadow light effect includes a first shadow.
  • The electronic device generates and outputs the first shadow for the specified view control according to the first position, the position of the preset light source, the intensity of the preset light source, and the material attribute information of the specified view control.
  • When generating shadows, the electronic device considers not only the position and intensity of the light source but also the material attribute information, which enriches the shadow effect and makes it better match the material.
  • The material attribute information of the specified view control includes one or more of refractive index, reflectance, diffuse reflectance, or transparency.
  • The material may include one or more of a background material, a border material, or a backplane material. That is, different material attribute information for the specified view control can produce shadows of different weights and colors, and light effects of different appearances.
  • the first shadow light effect includes the first light effect.
  • The electronic device generates and outputs the first light effect for the specified view control according to the first position, the position of the preset light source, the intensity of the preset light source, and the original color information of the specified view control. That is to say, if the first position, the position of the preset light source, the intensity of the light source, and/or the original color information of the specified view control differ, the generated light effects will differ, achieving the goal of dynamically displaying the specified view control.
  • The electronic device determines the radial gradient radius according to the first position and the position of the preset light source, determines the Gaussian blur radius according to the first position and the intensity of the preset light source, and generates and outputs the first light effect for the specified view control according to the radial gradient radius and the Gaussian blur radius. That is, a different position of the electronic device relative to the preset light source or a different preset light source intensity yields a different radial gradient radius and Gaussian blur radius, and thus a different light effect (a sketch follows).
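  • A hedged sketch of the preset-light-source variant: the light is fixed in world space, and the device's posture (pitch and roll, however obtained) rotates that fixed position into the screen's coordinate frame. The rotation order and all names are assumptions.

```java
// Illustrative mapping from device posture to the preset light's position in
// the device frame; standard rotation math, not the patent's formulas.
public final class PresetLightMapper {

    // worldLight: fixed light position {x, y, z} in world coordinates
    public static double[] lightInDeviceFrame(double[] worldLight,
                                              double pitchDeg, double rollDeg) {
        double p = Math.toRadians(pitchDeg);
        double r = Math.toRadians(rollDeg);
        double x = worldLight[0], y = worldLight[1], z = worldLight[2];
        // Rotate about the X axis (pitch) ...
        double y1 = y * Math.cos(p) - z * Math.sin(p);
        double z1 = y * Math.sin(p) + z * Math.cos(p);
        // ... then about the Y axis (roll).
        double x2 = x * Math.cos(r) + z1 * Math.sin(r);
        double z2 = -x * Math.sin(r) + z1 * Math.cos(r);
        return new double[] { x2, y1, z2 };
    }
}
```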
  • The present application further provides a graphical user interface display device, which has some or all of the functions of the electronic device described in the first aspect or the second aspect.
  • The functions of the device may cover some or all of the embodiments of the electronic device in the present application, or the device may independently implement any embodiment of the present application.
  • the functions can be implemented by hardware, or can be implemented by hardware executing corresponding software.
  • the hardware or software includes one or more units or modules corresponding to the above functions.
  • The structure of the graphical user interface display device may include a processing unit and a display unit, and the processing unit is configured to support the device in performing the corresponding functions in the above methods.
  • The display unit is configured to display the shadow light effects output in the above methods performed by the graphical user interface display device.
  • The graphical user interface display device may further include a storage unit, which is coupled with the processing unit and the display unit and stores the program instructions and data necessary for the device.
  • The above-mentioned graphical user interface display device includes:
  • a processing unit configured to determine the position of the first light source according to the collected first image of the biological feature; generate a first shadow light effect for the specified view control according to the position of the first light source;
  • a display unit for outputting the first shadow light effect
  • the processing unit is further configured to determine the position of the second light source according to the collected second image of the biological feature; according to the position of the second light source, generate a second shadow light effect for the specified view control;
  • a display unit further used for outputting a second shadow light effect
  • the position of the first light source and the position of the second light source are the positions of the biological feature relative to the display screen, and the position of the first light source is different from the position of the second light source; the first shadow light effect is different from the second shadow light effect.
  • The above-mentioned graphical user interface display device includes:
  • a processing unit for acquiring a first position of the electronic device relative to the preset light source; generating a first shadow light effect for a specified view control according to the first position and the position of the preset light source;
  • a display unit for outputting the first shadow light effect
  • the processing unit is further configured to obtain a second position of the electronic device relative to the preset light source; according to the second position and the position of the preset light source, generate a second shadow light effect for the specified view control;
  • a display unit further used for outputting a second shadow light effect
  • the first position is different from the second position, and the first shadow light effect is different from the second shadow light effect.
  • The present application provides a graphical user interface display device, including a display screen, a memory, one or more processors, multiple application programs, and one or more programs. The one or more programs are stored in the memory, and when the one or more processors execute the one or more programs, the graphical user interface display device implements the method described in the first aspect or the second aspect.
  • The present application provides a computer device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor. When the processor executes the computer program, the computer device implements the method described in the first aspect or the second aspect.
  • The present application provides a computer program product containing instructions. When the computer program product runs on an electronic device, the electronic device is caused to execute the method described in the first aspect and any possible implementation thereof, or the method described in the second aspect and any possible implementation thereof.
  • The present application provides a computer-readable storage medium comprising instructions. When the instructions run on an electronic device, the electronic device is caused to execute the method described in the first aspect and any possible implementation thereof, or the method described in the second aspect and any possible implementation thereof.
  • FIG. 1 is an example diagram of the three-dimensional coordinate axes of a mobile phone;
  • FIG. 2 is an example diagram of the shadow effect of a specified view control under the action of a light source;
  • FIG. 3 is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
  • FIG. 4 is a schematic flowchart of a method for displaying a graphical user interface provided by an embodiment of the present application
  • FIG. 5 is an example diagram of a first image provided by an embodiment of the present application.
  • FIG. 6 is an example diagram of a first image and a display screen provided by an embodiment of the present application.
  • FIG. 7-1 is an exemplary diagram of determining the position of the first light source according to an embodiment of the present application.
  • FIG. 7-2 is another exemplary diagram of determining the position of the first light source provided by the embodiment of the present application.
  • FIG. 8 is an example diagram of generating a first shadow provided by an embodiment of the present application.
  • FIG. 9 is an example diagram of generating a first light effect provided by an embodiment of the present application.
  • FIG. 10 is an example diagram provided by the embodiment of the present application.
  • FIG. 11 is another example diagram provided by an embodiment of the present application.
  • FIG. 12 is a schematic flowchart of another method for displaying a graphical user interface provided by an embodiment of the present application.
  • FIG. 13 is another example diagram provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a graphical user interface display device provided by an embodiment of the present application.
  • The electronic device may be a portable electronic device that also includes other functions such as personal digital assistant and/or music player functions, such as a cell phone, a tablet computer, or a wearable electronic device with wireless communication capabilities (e.g., a smart watch), etc.
  • Exemplary embodiments of portable electronic devices include, but are not limited to, portable electronic devices equipped with various operating systems.
  • the portable electronic device described above may also be other portable electronic devices, such as a laptop computer (Laptop) with a touch-sensitive surface or a touch panel, or the like. It should also be understood that, in some other embodiments, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer having a touch-sensitive surface or a touch panel.
  • The user interface (UI) of an application is source code written in a specific computer language, such as Java and extensible markup language (XML).
  • Controls, also known as widgets, are the basic elements of the user interface. Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
  • the attributes and content of controls in the interface are defined by tags or nodes.
  • XML specifies the controls contained in the interface through nodes such as <Textview>, <ImgView>, and <VideoView>.
  • a node corresponds to a control or property in the interface, and the node is rendered as user-visible content after parsing and rendering.
  • Applications, such as hybrid applications, often contain web pages in their interfaces.
  • A web page, also known as a page, can be understood as a special control embedded in an application interface.
  • A web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), JavaScript (JS), etc.
  • the specific content contained in a web page is also defined by tags or nodes in the source code of the web page.
  • HTML defines the elements and attributes of a web page through tags such as <p>, <img>, <video>, and <canvas>.
  • Graphical user interface refers to a user interface related to computer operation displayed in a graphical manner.
  • The graphical user interface can include icons, windows, controls, and other interface elements displayed on the display screen of the electronic device, wherein the controls can include icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, widgets, and other visual interface elements.
  • The embodiments of the present application provide a graphical user interface display method and apparatus, which can dynamically display the shadows and light effects of specified view controls in the graphical user interface, thereby improving its realism.
  • The graphical user interface includes a specified view control with a depth attribute.
  • View controls can be visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, and navigation bars in a user graphical interface.
  • the specified view control refers to a view control with a depth property.
  • the depth attribute can also be described as a Z-axis attribute, an elevation attribute, or an altitude attribute, among other names. It can be understood that the specified view control adds a Z-axis attribute on the basis of a conventional view control with X-axis and Y-axis attributes.
  • Taking a mobile phone as an example of the electronic device, the Android system's definition of the phone's three-dimensional coordinate axes can be seen in Figure 1.
  • In portrait orientation, the short axis of the mobile phone is the X axis, the long axis is the Y axis, and the direction perpendicular to and pointing out of the display is the Z axis.
  • In landscape orientation, the short axis of the mobile phone is the Y axis, the long axis is the X axis, and the Z axis still points perpendicularly out of the screen.
  • the specified view control can produce shadow effects under the action of the light source.
  • the specified view control takes a button as an example.
  • When the button's depth attribute changes from 0dp to 6dp, a shadow effect is produced; when it changes from 6dp back to 0dp, there is no shadow effect.
  • a button with a depth attribute of 6dp has two shadow effects, namely the ambient light shadow effect and the point light shadow effect.
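  • The 0dp-to-6dp behavior described above matches Android's public elevation API; the following is a minimal sketch (class and method names ours) of toggling a button's depth so the framework draws or removes its ambient and point-light shadows.

```java
import android.content.res.Resources;
import android.util.TypedValue;
import android.widget.Button;

// Raising a view's elevation makes the framework draw its two shadows
// (ambient and point light); returning it to 0dp removes them.
public final class DepthDemo {

    public static void raise(Button button) {
        float sixDp = TypedValue.applyDimension(TypedValue.COMPLEX_UNIT_DIP,
                6f, Resources.getSystem().getDisplayMetrics());
        button.setElevation(sixDp); // depth attribute 6dp: shadow appears
    }

    public static void flatten(Button button) {
        button.setElevation(0f);    // depth attribute 0dp: no shadow
    }
}
```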
  • The shadow effect shown in Figure 2 is static and fixed: it changes neither with the posture of the mobile phone nor with the position of the user's eyes.
  • the specified view control can not only produce shadow effects, but also light effects.
  • The light effect of parallel light is related to the direction of the light and the orientation of the illuminated plane. When the light direction is at a 90-degree angle to the illuminated plane, the light effect is strongest; as the angle between the light direction and the illuminated plane decreases, the light effect gradually weakens; when the light direction is parallel to the illuminated plane, no light falls on the plane, the light intensity is 0, and there is no light effect.
  • The light effect of a point light source is related not only to the direction of the light and the orientation of the illuminated plane, but also to the intensity and color of the light. In this application, the light effect is also related to the original color of the specified view control (an illustrative sketch follows).
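  • For concreteness, the standard Lambert diffuse model (textbook arithmetic, not quoted from the patent) captures the relationships just described: parallel light depends only on the angle between the light direction and the surface, while a point light also scales with intensity and falls off with distance.

```java
// Textbook diffuse-lighting arithmetic; vectors l (light direction) and
// n (surface normal) are assumed to be unit length.
public final class Illumination {

    // Cosine term: 1.0 at 90-degree incidence (strongest), 0.0 when the
    // light is parallel to the illuminated plane (no light effect).
    public static double lambert(double[] l, double[] n) {
        double dot = l[0] * n[0] + l[1] * n[1] + l[2] * n[2];
        return Math.max(0.0, dot);
    }

    // Point light: the Lambert term scaled by the light's intensity and an
    // inverse-square distance falloff.
    public static double pointLight(double[] l, double[] n,
                                    double intensity, double distance) {
        return intensity * lambert(l, n) / (distance * distance);
    }
}
```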
  • FIG. 3 shows a schematic structural diagram of the electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or use a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated into one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the processor 110 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • The electronic device 100 may also adopt an interface connection manner different from those in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the display screen 194 may be used to display shadows and light effects of specified view controls.
  • For the manner in which the electronic device displays the shadow light effect of the specified view control, reference may be made to the related descriptions in the subsequent embodiments; details are not repeated here.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted through the lens to the camera photosensitive element, which converts the light signal into an electrical signal and transmits it to the ISP for processing, converting it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • Light from an object is projected through the lens to generate an optical image on the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the electronic device 100 may include one or more rear cameras, and may also include one or more front cameras. The rear camera is usually located on the back of the display screen 194 , and the front camera is usually located on the side of the display screen 194 .
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and so on.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Intelligent cognition applications of the electronic device 100, such as image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • The electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • When making a call or sending a voice message, the user can input a sound signal into the microphone 170C by speaking close to it.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D can be the USB interface 130, or can be a 3.5mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • The capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
  • The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100.
  • In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
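  • Reading the gyro sensor described above on Android follows the standard SensorManager pattern shown below; this is framework boilerplate, not patent-specific code.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Registers for gyroscope updates; values are angular velocity about the
// x, y, and z axes in rad/s.
public final class GyroReader implements SensorEventListener {

    private final SensorManager sensorManager;

    public GyroReader(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    public void start() {
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        if (gyro != null) {
            sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_UI);
        }
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float wx = event.values[0]; // angular velocity about x (rad/s)
        float wy = event.values[1]; // about y
        float wz = event.values[2]; // about z
        // Feed into posture estimation (e.g., integrate over time) as needed.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* unused */ }
}
```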
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • The electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D, and can further set characteristics such as automatic unlocking of the flip cover according to the detected opening or closing state of the holster or flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • The distance sensor 180F is used to measure distance. The electronic device 100 can measure distance through infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by the low temperature.
  • In still other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice.
  • The bone conduction sensor 180M can also be placed in contact with the human pulse to receive blood pressure beat signals.
  • The bone conduction sensor 180M can also be disposed in an earphone to form a bone conduction headset.
  • The audio module 170 can parse out a voice signal based on the vibration signal of the vocal-part vibrating bone obtained by the bone conduction sensor 180M, so as to implement a voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • A SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or removing it from the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • The electronic device 100 may employ an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the electronic device 100 exemplarily shown in FIG. 3 may dynamically display designated view controls in the GUI through the display screen 194 .
  • the electronic device 100 may determine the positions of different light sources through the images captured by the camera 193, and then dynamically display the shadow light effects of the specified view controls.
  • the electronic device 100 may detect the posture of the user holding the electronic device 100 through the gyro sensor 180B, the acceleration sensor 180E, etc., and then dynamically display the shadow light effect of the view control according to different postures.
  • Alternatively, the electronic device 100 can detect the posture of the user holding the electronic device 100 through the images collected by the camera 193 together with the gyro sensor 180B, the acceleration sensor 180E, and the like, and dynamically display the shadow light effect of the specified view control.
  • FIG. 4 is a schematic flowchart of a method for displaying a graphical user interface provided by an embodiment of the present application. The process may include, but is not limited to, the following steps:
  • Step 401: Determine the position of the first light source according to the collected first image of the biological feature.
  • The organism refers to a living being capable of movement, such as a human, a cat, or a dog.
  • In the embodiments of the present application, the organism is described by using a human, that is, a user, as an example.
  • A biological feature is generally unique (differing from one organism to another) and can be used for measurement, identification, verification, and the like. Biological features may include, but are not limited to, eyes, a nose, a mouth, a face, an iris, fingerprints, and the like.
  • In the embodiments of the present application, the biological feature is described by using eyes as an example.
  • the electronic device 100 starts the front camera, and collects the first image of the biometric feature through the front camera.
  • The first image includes eyes. It may be the complete facial image shown in A in FIG. 5, or an image that includes only the eyes, such as that shown in B in FIG. 5. The position of the eyes in the first image is related to the focal length of the front camera and the distance of the eyes relative to the display screen 194.
  • the distance of the eyes relative to the display screen 194 can be understood as the distance of the eyes relative to the electronic device 100 .
  • Alternatively, the electronic device 100 uses a preset face identifier (face ID) or a collected face ID as the first image.
  • A face ID is used for face recognition. In the embodiments of the present application, the face ID is also used to determine the position of the light source, and the eyes in the face ID serve as the light source.
  • the electronic device 100 may collect the face ID through the camera 193 in advance, and store it as a preset face ID.
  • the electronic device 100 determines the position of the first light source according to the first image.
  • the position of the first light source is the position of the eye relative to the display screen.
  • the electronic device 100 determines the position of the first light source according to the first image position of the eye in the first image and the distance of the eye relative to the display screen.
  • the first image position may include the position of the left eye in the first image, and/or the position of the right eye in the first image.
  • the electronic device 100 can identify the eye in the first image through a human eye recognition algorithm, and then determine the position of the first image. Or, the electronic device 100 may identify the human face in the first image through a face recognition algorithm, and then determine the position of the first image through the ratio of the eyes to the human face.
  • the method for how the electronic device 100 determines the position of the first image is not limited in this embodiment of the present application.
  • the distance of the eye relative to the display screen can be understood as the vertical distance of the eye relative to the display screen, that is, the vertical distance of the first image relative to the display screen.
  • The electronic device 100 can estimate the distance of the eyes relative to the display screen according to the face area, the area of the first image, and the preset focal length of the front camera. Specifically, a formula relating these quantities can be used to estimate the distance of the eyes relative to the display screen.
  • the conversion coefficient is related to the preset focal length and the focal length when the first image is collected, and the specific relationship and specific value are not limited in the embodiments of the present application.
  • In addition, the distance between the two eyes can be used to assist in calculating the distance.
  • the electronic device 100 may also use other methods to calculate the distance between the eyes and the display screen, and the embodiment of the present application does not limit how to calculate the distance between the eyes and the display screen.
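  • The patent text references a formula without reproducing it, so the sketch below substitutes a common pinhole-camera approximation as a stand-in: the apparent face area shrinks with the square of the distance, so the distance grows with the square root of the area ratio, scaled by the conversion coefficient mentioned above. This model is an assumption, not the patent's exact formula.

      import math

      def estimate_eye_distance(face_area_px: float,
                                image_area_px: float,
                                conversion_coefficient: float) -> float:
          # Assumed pinhole model: apparent area ~ 1 / distance^2, so
          # distance ~ k * sqrt(image_area / face_area), where k absorbs
          # the preset focal length and the capture-time focal length.
          return conversion_coefficient * math.sqrt(image_area_px / face_area_px)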
  • When the first image position and the above-mentioned distance are obtained, the electronic device 100 can determine the position of the eyes, that is, the position of the first light source. As shown in FIG. 6, the electronic device 100 determines the position of the eyes relative to the display screen once the distance of the eyes relative to the display screen and the position of the eyes in the first image are determined. If the line on which the two eyes lie is parallel to the horizontal direction, angle 1 and angle 2 in FIG. 6 are the same; otherwise, there is a certain difference between angle 1 and angle 2.
  • the location of the first light source may include the location of the left eye and/or the location of the right eye.
  • the position of the first light source is a position comprehensively determined according to the position of the left eye and the position of the right eye.
  • the electronic device 100 determines the position of the first light source according to whether the angle between the display screen and the horizontal direction is within a preset range.
  • the preset range may be [90°-M°, 90°+M°], for example, M is 5, and within the range of [85°, 95°], the display screen is considered to be vertical relative to the horizontal direction.
  • M is not limited in the embodiments of the present application. It can be understood that when the included angle between the display screen and the horizontal direction is within the deviation range of 90 degrees, the display screen is considered to be vertical with respect to the horizontal direction, and the deviation value can be ignored.
  • Case 1: The angle between the display screen and the horizontal direction is within the preset range, that is, the display screen is vertical relative to the horizontal direction.
  • In this case, the electronic device 100 determines the first viewing angle according to the first image position and the above-mentioned distance, and determines the position of the first light source according to the first viewing angle. Taking the position of the first light source as the position of the right eye as an example, as shown in FIG. 7-1, the first viewing angle can be calculated from the first image position and the above-mentioned distance by the principle of trigonometric functions, and the position (x, y, z) of the right eye can then be calculated from the first viewing angle.
  • Case 2: The angle between the display screen and the horizontal direction is not within the preset range, that is, the display screen is at a certain angle relative to the horizontal direction.
  • the electronic device 100 determines the second viewing angle according to the first image position, the above-mentioned distance, and the angle between the display screen and the horizontal direction, and determines the position of the first light source according to the second viewing angle. It can be understood that, assuming that the display screen is vertical relative to the horizontal direction, the first viewing angle is determined, and the first viewing angle is corrected according to the angle between the display screen and the horizontal direction to obtain the second viewing angle.
  • That is, the first viewing angle is corrected to obtain the second viewing angle, and the position (x, y, z) of the right eye can then be calculated according to the second viewing angle.
  • the calculated position of the first light source in the second case is more accurate than that in the first case.
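  • A minimal sketch of the trigonometric reconstruction for both cases follows; the coordinate conventions and the tilt-correction model are assumptions, since the text only states that the first viewing angle is computed from the image position and distance (Case 1) and then corrected by the screen tilt (Case 2).

      import math

      def eye_position(offset_x: float, offset_y: float, distance: float,
                       screen_angle_deg: float = 90.0):
          # offset_x, offset_y: eye offsets from the camera axis, derived from
          # the first image position (assumed mapping)
          # distance: vertical distance of the eye from the display screen
          # screen_angle_deg: angle between the display screen and the horizontal
          # direction; 90 degrees corresponds to Case 1 (vertical screen)
          angle_x = math.atan2(offset_x, distance)  # first viewing angle (horizontal)
          angle_y = math.atan2(offset_y, distance)  # first viewing angle (vertical)
          # Case 2: correct the vertical viewing angle by the screen tilt to
          # obtain the second viewing angle (assumed correction model).
          angle_y += math.radians(90.0 - screen_angle_deg)
          x = distance * math.tan(angle_x)
          y = distance * math.tan(angle_y)
          z = distance
          return (x, y, z)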
  • the electronic device 100 may determine the intensity of the first light source. It can be understood that the closer the distance between the light source and the display screen, the stronger the intensity of the light source; otherwise, the weaker the intensity of the light source. In this embodiment of the present application, how the electronic device 100 determines the intensity of the first light source is not limited.
  • Step 402: Generate and output a first shadow light effect for a specified view control according to the position of the first light source.
  • the first shadow light effect may include a first shadow.
  • the electronic device 100 may generate the first shadow for the specified view control according to the position of the first light source in the following manner, and output the first shadow through the display screen.
  • Manner 1: The electronic device 100 generates the first shadow for the specified view control according to the position of the first light source and a shadow drawing algorithm, and outputs the first shadow through the display screen.
  • the shadow drawing algorithm may be a shadow drawing command.
  • When the shadow drawing command is executed, the shadow is drawn: an ambient light shadow and a point light shadow are respectively generated according to the position of the first light source.
  • Manner 1 is applicable to cases in which the outer frame of the specified view control is a rectangle, a rounded rectangle, or a circle, and can generate ambient light shadows and point light shadows.
  • Manner 2: The electronic device 100 generates a first shadow for the specified view control according to the position of the first light source, the intensity of the first light source, and the material attribute information of the specified view control, and outputs the first shadow through the display screen. Specifically, the electronic device 100 determines the subject blur radius according to the position of the first light source and the intensity of the first light source; determines the projection blur information according to the position of the first light source and the material attribute information of the specified view control; and generates the first shadow for the specified view control according to the subject blur radius and the projection blur information.
  • The electronic device 100 acquires the subject image, that is, the image of the specified view control; determines the subject blur radius, such as a Gaussian blur of 20, according to the position of the first light source and the intensity of the first light source; and uses the subject blur radius to perform subject blur on the specified view control to obtain a subject-blurred effect.
  • The electronic device 100 determines the projection blur information according to the position of the first light source and the material attribute information of the specified view control (for example, projection color #000000, 13% opacity, projection blur radius 40, and Y-axis offset 8), and uses the projection blur information to perform projection blur on the specified view control to obtain a projection-blurred effect.
  • The electronic device 100 superimposes the subject-blurred effect and the projection-blurred effect to obtain a superimposed effect, and superimposes the superimposed effect with the subject image, so that the subject image has a shadow effect.
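  • The patent does not tie Manner 2 to any particular graphics library; purely as an illustration, the following sketch reproduces the three steps (subject blur, projection blur with the example parameters #000000, 13% opacity, blur radius 40, Y-axis offset 8, and superposition) with Pillow, assuming the specified view control is available as an RGBA image.

      from PIL import Image, ImageFilter

      def manner2_shadow(subject: Image.Image,
                         subject_blur_radius: float = 20,
                         projection_blur_radius: float = 40,
                         projection_alpha: int = 33,   # ~13% of 255, per the example
                         y_offset: int = 8) -> Image.Image:
          w, h = subject.size
          size = (w, h + y_offset)

          # Subject blur: Gaussian-blur the control image itself.
          subject_blur = subject.filter(ImageFilter.GaussianBlur(subject_blur_radius))

          # Projection blur: a black (#000000) silhouette of the control at the
          # example opacity, offset along the Y axis, then Gaussian-blurred.
          alpha = subject.getchannel("A").point(lambda a: projection_alpha if a else 0)
          silhouette = Image.new("RGBA", (w, h), (0, 0, 0, 0))
          silhouette.putalpha(alpha)  # black pixels carrying the projection opacity
          projection = Image.new("RGBA", size, (0, 0, 0, 0))
          projection.paste(silhouette, (0, y_offset))
          projection = projection.filter(ImageFilter.GaussianBlur(projection_blur_radius))

          # Superimpose: projection blur, then subject blur, then the subject image.
          out = projection
          for layer_img in (subject_blur, subject):
              layer = Image.new("RGBA", size, (0, 0, 0, 0))
              layer.paste(layer_img, (0, 0))
              out = Image.alpha_composite(out, layer)
          return out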
  • the material property information of the specified view control includes one or more of refractive index, reflectance, diffuse reflectance or transparency.
  • Materials can include one or more of background materials, border materials, or backplane materials. Users can define information such as refractive index, reflectivity, diffuse reflectivity or transparency for the view controls in the GUI, or directly set the type of view controls, such as frosted glass, paper, specular, etc. Different material property information of the specified view control will produce shadows of different weights, colors, and light effects of different effects.
  • the position of the light source and material property information can affect the shadow transparency, shadow blur radius, and shadow offset.
  • Manner 2 can be applied to a specified view control whose outer frame has any shape, and therefore has a wider scope of application; combined with the material properties, it is also better suited to practical applications.
  • the first shadow light effect may include a first light effect.
  • the electronic device 100 can generate the first light effect for a specified view control according to the position of the first light source in the following manner, and output the first light effect through the display screen.
  • the electronic device 100 generates a first light effect for the specified view control according to the position of the first light source, the intensity of the first light source and the original color information of the specified view control, and outputs the first light effect through the display screen.
  • the electronic device 100 obtains the angle information of the light and the color information of the light.
  • the electronic device 100 may obtain the angle information of the light according to the position of the first light source and the depth attribute of the specified view control.
  • the electronic device 100 performs a dot product operation with the original color information of the specified view control according to the color information of the light, the angle information of the light and the intensity of the first light source, thereby changing the intensity information of the original color of the specified view control.
  • The electronic device 100 then redraws the graphical user interface according to the changed intensity information, thereby displaying another light effect.
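  • The dot product operation described above matches standard diffuse (Lambertian) shading; the sketch below states it under that assumption, with the light direction taken to be derived elsewhere from the light source position and the control's depth attribute.

      import numpy as np

      def shade_color(base_rgb, light_rgb, light_dir, surface_normal, intensity):
          # base_rgb: original color information of the specified view control, in [0, 1]
          # light_rgb: color information of the light, in [0, 1]
          # light_dir, surface_normal: unit vectors (the light angle information)
          # intensity: intensity of the first light source
          lambert = max(float(np.dot(light_dir, surface_normal)), 0.0)
          shaded = np.asarray(base_rgb) * np.asarray(light_rgb) * intensity * lambert
          return np.clip(shaded, 0.0, 1.0)  # changed intensity information of the color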
  • In another manner, the electronic device 100 determines the radial gradient radius according to the position of the first light source; determines the Gaussian blur radius according to the intensity of the first light source; generates the first light effect for the specified view control according to the radial gradient radius and the Gaussian blur radius; and outputs the first light effect through the display screen.
  • the electronic device 100 determines the radial gradient radius according to the position of the first light source, and then draws a radial gradient sphere according to the radial gradient radius.
  • the color of the sphere can be a predefined color, such as white light by default.
  • the electronic device 100 determines the Gaussian blur radius according to the intensity of the first light source, and then performs Gaussian blur on the sphere according to the Gaussian blur radius. On the specified view control, draw a Gaussian blurred sphere to generate the first light effect, as shown in Figure 9.
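  • A compact sketch of this light effect follows; it assumes NumPy/SciPy and treats the "sphere" as a 2D radial gradient in the default white light, with both radii supplied by the caller as described above.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def light_effect_alpha(size: int, radial_radius: float, blur_sigma: float):
          # Radial gradient: brightest at the center, fading to zero at the
          # radial gradient radius (determined by the light source position).
          ys, xs = np.mgrid[0:size, 0:size]
          r = np.hypot(xs - size / 2.0, ys - size / 2.0)
          gradient = np.clip(1.0 - r / radial_radius, 0.0, 1.0)
          # Gaussian blur radius determined by the light source intensity.
          return gaussian_filter(gradient, sigma=blur_sigma)

  • The returned map can be drawn in white over the specified view control as the first light effect.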
  • Step 403: Determine the position of the second light source according to the collected second image of the biological feature.
  • Step 404: Generate and output a second shadow light effect for the specified view control according to the position of the second light source.
  • The execution process of step 403 to step 404 is similar to that of step 401 to step 402; for details, refer to the description of step 401 to step 402.
  • the difference is that the position of the light source in step 403 is different from that in step 401 .
  • the position of the first light source and the position of the second light source are different positions of the eyes relative to the display screen, that is, the position of the first light source is different from the position of the second light source.
  • For example, the position of the first light source is the position of the eyes relative to the display screen at a first moment, and the position of the second light source is the position of the eyes relative to the display screen at a second moment, so that the position of the first light source is different from the position of the second light source.
  • the position of the first light source is different from the position of the second light source, and thus the first shadow light effect is different from the second shadow light effect. It can be seen that the shadow light effect of the same specified view control can change with the change of the light source position, so as to realize the dynamic display of the shadow effect of the specified view control, which is beneficial to build a more realistic user graphical interface.
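  • Taken together, steps 401 to 404 amount to a sampling loop; in the illustrative sketch below, camera, locate_light_source, and render_shadow_light_effect are hypothetical stand-ins for the mechanisms described above, not names from the source.

      import time

      def dynamic_display_loop(camera, locate_light_source,
                               render_shadow_light_effect, view_control,
                               interval_s: float = 0.1):
          while True:
              image = camera.capture()                      # first/second image
              light_pos = locate_light_source(image)        # steps 401 / 403
              render_shadow_light_effect(view_control, light_pos)  # steps 402 / 404
              time.sleep(interval_s)  # re-sample; the light source position
                                      # changes as the user moves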
  • FIG. 12 is a schematic flowchart of another method for displaying a graphical user interface provided by an embodiment of the present application.
  • the process may be but not limited to the following steps:
  • Step 501: Obtain a first position of the electronic device relative to a preset light source.
  • the method shown in FIG. 12 can be used to realize the dynamic display of the user graphical interface.
  • The preset light source can be understood as a hypothetical light source, which may not actually exist.
  • the position of the preset light source can be set by the user or defaulted by the system.
  • the specific position of the preset light source is not limited in the embodiments of the present application.
  • the electronic device 100 may acquire the position of the electronic device 100 relative to the preset light source according to the sensor in the electronic device 100 .
  • the position of the electronic device 100 relative to the preset light source may be understood as the posture of the electronic device 100 relative to the preset light source.
  • Step 502: Generate and output a first shadow light effect for a specified view control according to the first position and the position of the preset light source.
  • A difference from step 402 lies in the following: step 402 generates and outputs the first shadow light effect for the specified view control according to the position of the first light source, whereas step 502 generates and outputs the first shadow light effect for the specified view control according to the position of the preset light source and the first position of the electronic device relative to the preset light source.
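  • One way to realize this, assumed here purely for illustration, is to rotate the preset light source's fixed world-frame position into the device's frame using an attitude estimate from the gyro sensor 180B and acceleration sensor 180E; the source of the rotation matrix is an assumption.

      import numpy as np

      def light_in_device_frame(preset_light_world, world_to_device_rotation):
          # preset_light_world: 3-vector, the (hypothetical) preset light source
          # position in world coordinates, set by the user or by default
          # world_to_device_rotation: 3x3 rotation matrix describing the device
          # posture, e.g. from sensor fusion of gyroscope and accelerometer data
          return np.asarray(world_to_device_rotation) @ np.asarray(preset_light_world)

      # As the device posture changes (A, B, C in FIG. 13), the rotation matrix
      # changes, so the effective light position and the rendered shadow change.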
  • Step 503: Obtain a second position of the electronic device relative to the preset light source.
  • Step 504: Generate and output a second shadow light effect for the specified view control according to the second position and the position of the preset light source.
  • The execution process of step 503 to step 504 is similar to that of step 501 to step 502; for details, refer to the description of step 501 to step 502.
  • the difference is that the position of the electronic device relative to the preset light source in step 503 and step 501 is different.
  • the first position is different from the second position, and thus the first shadow light effect is different from the second shadow light effect. It can be seen that the shadow light effect of the same specified view control can change with the posture of the electronic device, so as to realize the dynamic display of the shadow effect of the specified view control, which is conducive to building a more realistic user graphical interface.
  • As shown in FIG. 13, the position of the electronic device 100 relative to the preset light source in A is different from those in B and C; that is, the postures of the electronic device 100 in A, B, and C are different. Because the postures are different, the angles of the light are different, and the shadow light effects generated for the same specified view control are different.
  • The embodiment shown in FIG. 12 uses only the sensors inside the electronic device, and therefore features lower power consumption and greater commercial value.
  • FIG. 14 is a schematic diagram of a user graphical interface display device provided by the present application.
  • the apparatus includes a processing unit 1401 and a display unit 1402 .
  • the processing unit 1401 is configured to determine the position of the first light source according to the collected first image of the biological feature; and generate the first shadow light effect for the specified view control according to the position of the first light source;
  • a display unit 1402 configured to output a first shadow light effect
  • the processing unit 1401 is further configured to determine the position of the second light source according to the collected second image of the biological feature; and generate a second shadow light effect for the specified view control according to the position of the second light source;
  • the display unit 1402 is further configured to output the second shadow light effect
  • the position of the first light source and the position of the second light source are the positions of the biological feature relative to the display screen, and the position of the first light source is different from the position of the second light source; the first shadow light effect is different from the second shadow light effect.
  • In an implementation, the processing unit 1401 is specifically configured to: determine the first image position of the biological feature in the first image according to the collected first image of the biological feature; estimate the distance of the biological feature relative to the display screen; and determine the position of the first light source according to the first image position and the distance.
  • In an implementation, the processing unit 1401 is specifically configured to: when the angle between the display screen and the horizontal direction is within a preset range, determine the first viewing angle according to the first image position and the distance; and determine the position of the first light source according to the first viewing angle.
  • In an implementation, the processing unit 1401 is specifically configured to: when the angle between the display screen and the horizontal direction is not within the preset range, determine the second viewing angle according to the angle between the display screen and the horizontal direction, the first image position, and the distance; and determine the position of the first light source according to the second viewing angle.
  • the first shadow light effect includes a first shadow;
  • the processing unit 1401 is specifically configured to generate the first shadow for the specified view control according to the position of the first light source, the intensity of the first light source, and the material attribute information of the specified view control, and output the first shadow.
  • In an implementation, the processing unit 1401 is specifically configured to: determine the subject blur radius according to the position of the first light source and the intensity of the first light source; determine the projection blur information according to the position of the first light source and the material attribute information of the specified view control; and generate and output the first shadow for the specified view control according to the subject blur radius and the projection blur information.
  • the material property information of the specified view control includes one or more of refractive index, reflectance, diffuse reflectance, or transparency.
  • the first shadow light effect includes a first light effect
  • the processing unit 1401 is specifically configured to generate and output a first light effect for the specified view control according to the position of the first light source, the intensity of the first light source, and the original color information of the specified view control.
  • the first shadow light effect includes a first light effect
  • the processing unit 1401 is specifically configured to: determine the radial gradient radius according to the position of the first light source; determine the Gaussian blur radius according to the intensity of the first light source; and generate and output the first light effect for the specified view control according to the radial gradient radius and the Gaussian blur radius.
  • the processing unit 1401 is used to obtain a first position of the electronic device relative to the preset light source; according to the first position and the position of the preset light source, generate a first shadow light effect for a specified view control;
  • a display unit 1402 configured to output a first shadow light effect
  • the processing unit 1401 is further configured to acquire a second position of the electronic device relative to the preset light source; generate a second shadow light effect for the specified view control according to the second position and the position of the preset light source;
  • the display unit 1402 is further configured to output the second shadow light effect
  • the first position is different from the second position, and the first shadow light effect is different from the second shadow light effect.
  • the first shadow light effect includes a first shadow;
  • the processing unit 1401 is specifically configured to generate and output the first shadow for the specified view control according to the first position, the position of the preset light source, the intensity of the first light source, and the material attribute information of the specified view control.
  • the material property information of the specified view control includes one or more of refractive index, reflectance, diffuse reflectance, or transparency.
  • the first shadow light effect includes a first light effect
  • the processing unit 1401 is specifically configured to generate and output the first light effect for the specified view control according to the first position, the position of the preset light source, the intensity of the first light source, and the original color information of the specified view control.
  • the first shadow light effect includes a first light effect
  • the processing unit 1401 is specifically configured to: determine the radial gradient radius according to the first position and the position of the preset light source; determine the Gaussian blur radius according to the first position and the position of the preset light source; and generate and output the first light effect for the specified view control according to the radial gradient radius and the Gaussian blur radius.
  • Depending on the context, the term "when" may be interpreted to mean "if", "after", "in response to determining...", or "in response to detecting...".
  • Similarly, the phrases "when determining..." or "if (the stated condition or event) is detected" may be interpreted to mean "if determining...", "in response to determining...", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)".
  • All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof.
  • When software is used for implementation, the embodiments may be implemented completely or partially in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, coaxial cable, optical fiber, or digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner.
  • the computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device such as a server, a data center, or the like that includes an integration of one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state drives), and the like.
  • All or some of the processes of the foregoing method embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when the program is executed, the processes of the foregoing method embodiments may be performed.
  • The aforementioned storage medium includes a ROM, a random access memory (RAM), a magnetic disk, an optical disc, or another medium that can store program code.

Abstract

A graphical user interface (GUI) display apparatus and method, applied to an electronic device, the electronic device comprising a display screen for displaying a GUI, and the GUI comprising a specified view control having a depth attribute. The method may comprise: determining, according to a captured first image of a biological feature, the position of the biological feature relative to the display screen as the position of a first light source; generating and outputting a first shadow light effect for the specified view control according to the position of the first light source; determining, according to a captured second image of the biological feature, the position of the biological feature relative to the display screen as the position of a second light source; and generating and outputting a second shadow light effect for the specified view control according to the position of the second light source. The position of the first light source is different from the position of the second light source, and the first shadow light effect is different from the second shadow light effect, so that the shadow and light effect of the specified view control can be displayed dynamically, thereby improving the realism of the GUI.
PCT/CN2021/133215 2020-11-28 2021-11-25 Appareil et procédé d'affichage d'interface graphique utilisateur WO2022111593A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011363058.4 2020-11-28
CN202011363058.4A CN114584652B (zh) 2020-11-28 2020-11-28 一种用户图形界面显示方法、装置、计算机设备及存储介质

Publications (1)

Publication Number Publication Date
WO2022111593A1 true WO2022111593A1 (fr) 2022-06-02

Family

ID=81753720

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/133215 WO2022111593A1 (fr) 2020-11-28 2021-11-25 Appareil et procédé d'affichage d'interface graphique utilisateur

Country Status (2)

Country Link
CN (1) CN114584652B (fr)
WO (1) WO2022111593A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090179914A1 (en) * 2008-01-10 2009-07-16 Mikael Dahlke System and method for navigating a 3d graphical user interface
EP2090974A1 (fr) * 2006-08-02 2009-08-19 Research In Motion Limited Système et procédé d'ajustement de la présentation de texte et d'images sur un dispositif électronique selon l'orientation du dispositif
CN105808218A (zh) * 2014-12-30 2016-07-27 乐视致新电子科技(天津)有限公司 一种针对用户界面ui控件效果的绘制方法和装置
CN105827820A (zh) * 2015-12-25 2016-08-03 维沃移动通信有限公司 一种移动终端的防偷窥方法及移动终端
CN107436765A (zh) * 2017-07-27 2017-12-05 青岛海信电器股份有限公司 视图控件的处理方法和装置
CN108600733A (zh) * 2018-05-04 2018-09-28 成都泰和万钟科技有限公司 一种基于人眼跟踪的裸眼3d显示方法
CN111930291A (zh) * 2020-10-09 2020-11-13 广州宸祺出行科技有限公司 一种在Android平台实现个性化阴影的方法及系统

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8345046B2 (en) * 2009-04-17 2013-01-01 Trapcode Ab Method for adding shadows to objects in computer graphics
US8913056B2 (en) * 2010-08-04 2014-12-16 Apple Inc. Three dimensional user interface effects on a display by using properties of motion
CN104123743A (zh) * 2014-06-23 2014-10-29 联想(北京)有限公司 图像阴影添加方法及装置

Also Published As

Publication number Publication date
CN114584652A (zh) 2022-06-03
CN114584652B (zh) 2023-06-20

Similar Documents

Publication Publication Date Title
WO2021129326A1 (fr) Procédé d'affichage d'écran et dispositif électronique
WO2020259452A1 (fr) Procédé d'affichage plein écran pour terminal mobile et appareil
CN113645351B (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021036770A1 (fr) Procédé de traitement d'écran partagé et dispositif terminal
WO2020093988A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2022127787A1 (fr) Procédé d'affichage d'image et dispositif électronique
WO2022001258A1 (fr) Procédé et appareil d'affichage à écrans multiples, dispositif terminal et support de stockage
CN114887323B (zh) 一种电子设备操控方法及电子设备
WO2020015144A1 (fr) Procédé de photographie et dispositif électronique
WO2021082815A1 (fr) Procédé d'affichage d'élément d'affichage et dispositif électronique
WO2022100685A1 (fr) Procédé de traitement de commande de dessin et dispositif associé
US11889386B2 (en) Device searching method and electronic device
WO2022012418A1 (fr) Procédé de photographie et dispositif électronique
CN112150499A (zh) 图像处理方法及相关装置
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2022105702A1 (fr) Procédé et dispositif électronique d'enregistrement d'image
WO2023179123A1 (fr) Procédé de lecture audio bluetooth, dispositif électronique, et support de stockage
WO2022078116A1 (fr) Procédé de génération d'image à effet de pinceau, procédé et dispositif d'édition d'image et support de stockage
WO2022033344A1 (fr) Procédé de stabilisation vidéo, dispositif de terminal et support de stockage lisible par ordinateur
WO2020233581A1 (fr) Procédé de mesure de hauteur et dispositif électronique
WO2022111593A1 (fr) Appareil et procédé d'affichage d'interface graphique utilisateur
CN113970965A (zh) 消息显示方法和电子设备
CN114089902A (zh) 手势交互方法、装置及终端设备
CN113971823A (zh) 外表分析的方法和电子设备
CN113610943B (zh) 图标圆角化的处理方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21897093

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21897093

Country of ref document: EP

Kind code of ref document: A1