WO2022042163A1 - Display method applied to an electronic device, and electronic device - Google Patents

Display method applied to an electronic device, and electronic device

Info

Publication number
WO2022042163A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
user
makeup
electronic device
face
Prior art date
Application number
PCT/CN2021/108283
Other languages
English (en)
Chinese (zh)
Inventor
高凌云
罗红磊
刘海波
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022042163A1

Classifications

    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0486: Drag-and-drop
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures, for inputting data by handwriting, e.g. gesture or text
    • G06Q 30/06: Buying, selling or leasing transactions
    • G06T 11/80: Creating or modifying a manually drawn or painted image using a manual input device, e.g. mouse, light pen, direction keys on keyboard

Definitions

  • The present application relates to the field of terminal technologies, and in particular to a display method applied to an electronic device, and to an electronic device.
  • With the increasingly powerful functions of mobile phones, mobile phones are used in all aspects of people's lives. For example, a mobile phone can act as a mirror. For another example, a mobile phone can serve as a makeup kit, displaying the user's makeup and beautification pictures in real time and providing makeup guidance for the user.
  • The present application provides a display method applied to an electronic device, and an electronic device, which can assist the user with makeup in an all-round way and make it easier for the user to apply delicate makeup.
  • In a first aspect, the present application provides a display method applied to an electronic device. The method may include: displaying, in a first area of the display screen of the electronic device, an image of the user's face collected by a camera; displaying, in a second area, a simulated made-up image of the user's face generated from the user's face image; receiving a first input; and, in response to the first input, displaying a magnified image of at least a portion of the user's face image in the first area.
  • In this method, only the image of the user's face may be enlarged, so that the user can view the details of the face. The user's face image and the simulated made-up image may also be enlarged at the same time, so that the user can view the details of both the face and the simulated makeup, and can conveniently compare the user's face image with the simulated made-up image.
  • receiving the first input includes: receiving the first input acting on the first area; or receiving the first input acting on the second area.
  • In a possible design, the method further includes: in response to the first input, displaying an enlarged image of at least part of the simulated made-up image of the user's face in the second area.
  • That is, the user's face image and the simulated made-up image of the user's face are enlarged simultaneously in response to the first input.
  • In a possible design, the enlargement of the user's face image displayed in the first area and the enlargement of the simulated made-up image displayed in the second area are performed synchronously.
  • In a possible design, displaying an enlarged image of at least part of the user's face image in the first area includes: displaying, in the first area, an enlarged image of at least part of the user's face image centered on the center point of the user's face image; and displaying an enlarged image of at least part of the simulated made-up image in the second area includes: displaying, in the second area, an enlarged image of at least part of the simulated made-up image centered on the center point of the simulated made-up image.
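  • As an illustration of this center-anchored, synchronized enlargement, the following Kotlin sketch applies the same scale matrix to two Android ImageViews; the view names faceView and makeupView and the scale factor are assumptions for the sketch, not taken from the application.

```kotlin
import android.graphics.Matrix
import android.widget.ImageView

// Apply the same center-anchored scale to both previews so the user's face
// image and the simulated made-up image are magnified in sync.
fun zoomBothPreviews(faceView: ImageView, makeupView: ImageView, scale: Float) {
    for (view in listOf(faceView, makeupView)) {
        val matrix = Matrix()
        // Pivot on the center point of each displayed image.
        matrix.setScale(scale, scale, view.width / 2f, view.height / 2f)
        view.scaleType = ImageView.ScaleType.MATRIX
        view.imageMatrix = matrix
    }
}
```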
  • In a possible design, if the first input acts on a first position of the display screen, a first image is displayed in the first area and the simulated made-up image of the first image is displayed in the second area, where the first image is an enlarged image of part of the user's face image; if the first input acts on a second position, a second image is displayed in the first area and the simulated made-up image of the second image is displayed in the second area, where the second image is an enlarged image of another part of the user's face image. That is, the part of the user's face image and the part of the simulated made-up image to be enlarged are determined according to the position at which the first input acts on the display screen of the electronic device: the image area that the user needs to enlarge is determined from the first input, and the image is enlarged and displayed with that area as the center. In a possible design, as the user's face moves, the center of the enlarged part of the user's face image and the center of the enlarged part of the simulated made-up image do not move.
  • In a possible design, the part of the user's face image corresponds to the part of the simulated made-up image of the user's face. That is to say, the enlarged user's face image and the enlarged simulated made-up image show the same part of the user's face. In this way, it is more convenient for the user to compare the image of the user's face with the simulated made-up image.
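  • The following Kotlin sketch illustrates how the position of the first input could select the zoom center, reused for both previews so that the same part of the face is enlarged in each; the names and structure are assumptions for the sketch.

```kotlin
import android.graphics.Matrix
import android.view.MotionEvent
import android.widget.ImageView

// The first input's position selects the zoom center; the same center is
// reused for both previews, and it stays fixed even as the face moves.
var zoomPivot: Pair<Float, Float>? = null

fun onFirstInput(event: MotionEvent) {
    zoomPivot = event.x to event.y          // where the input acts on the screen
}

fun applyZoom(faceView: ImageView, makeupView: ImageView, scale: Float) {
    val (px, py) = zoomPivot ?: return      // no zoom until an input arrives
    for (view in listOf(faceView, makeupView)) {
        view.scaleType = ImageView.ScaleType.MATRIX
        view.imageMatrix = Matrix().apply { setScale(scale, scale, px, py) }
    }
}
```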
  • In a possible design, the simulated made-up image of the user's face includes first indication information, and the first indication information is used to indicate the part and shape of the makeup; the first indication information is enlarged along with the simulated made-up image.
  • In a possible design, the first indication information is a dotted frame. In this way, more comprehensive makeup guidance can be provided to the user, so that the user can apply makeup according to the indication.
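  • As a sketch of how the first indication information could be drawn so that it is enlarged together with the image, the following Kotlin code renders a dotted frame under the same canvas transform as the zoomed preview; the function and parameter names are illustrative.

```kotlin
import android.graphics.Canvas
import android.graphics.DashPathEffect
import android.graphics.Paint
import android.graphics.RectF

// The guide (first indication information) is drawn in image coordinates
// under the same canvas scale as the preview, so the dotted frame marking
// the makeup part and shape is enlarged with the simulated made-up image.
fun drawGuide(canvas: Canvas, guide: RectF, scale: Float, px: Float, py: Float) {
    val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        style = Paint.Style.STROKE
        strokeWidth = 3f
        pathEffect = DashPathEffect(floatArrayOf(12f, 8f), 0f)   // dotted frame
    }
    canvas.save()
    canvas.scale(scale, scale, px, py)   // same transform as the zoomed image
    canvas.drawOval(guide, paint)        // e.g., an eyebrow or blush region
    canvas.restore()
}
```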
  • In a possible design, displaying the user's face image collected by the camera in the first area of the display screen includes: displaying, in the first area, a first static image obtained from the user's face image collected by the camera; and displaying the simulated made-up image generated from the user's face image in the second area includes: displaying, in the second area, a second static image formed by simulating makeup on the user's face image. In this method, the electronic device can enlarge and display the frozen user's face image and the frozen simulated made-up image, which is convenient for the user to examine them carefully.
  • In a possible design, a third area of the display screen of the electronic device displays first information; the electronic device receives the user's input operation on the first information, and changes the simulated made-up image of the user's face according to the first information. That is, the electronic device can provide multiple sets of makeup parameters, and the user can select different makeup parameters to form different simulated made-up images of the user's face.
  • In a possible design, the first information is generated based on features of the user's facial image. In a possible design, the first information is generated from a picture. In a possible design, the first information is generated from the makeup of the user's facial image. In a possible design, a modification operation on the simulated made-up image of the user's face is received, and the first information is generated according to the modified simulated made-up image.
  • That is, the makeup parameters may be preset by the electronic device, may be set according to the user's characteristics, or may be modified by the user.
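  • One possible shape for such a selectable, modifiable set of makeup parameters is sketched below in Kotlin; all field names and preset values are hypothetical, not taken from the application.

```kotlin
// Illustrative shape for a set of makeup parameters shown in the palette.
data class MakeupParams(
    val name: String,           // label shown in the palette (third area)
    val lipColor: Int,          // ARGB color
    val blushColor: Int,        // ARGB color
    val eyebrowStrength: Float  // intensity in 0.0..1.0
)

val presets = mutableListOf(
    MakeupParams("Preset A", 0xFFC98A7D.toInt(), 0x59E8A0A0, 0.3f),
    MakeupParams("Preset B", 0xFFB03050.toInt(), 0x59D98080, 0.7f),
)

// Selecting different parameters changes the simulated made-up image; a
// user-modified result can be saved back as a new selectable preset.
fun onPresetSelected(p: MakeupParams) { /* re-render the second area with p */ }
fun saveUserEdit(edited: MakeupParams) { presets += edited }
```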
  • In a second aspect, the present application provides a display method applied to an electronic device. The method may include: starting a first application; displaying, in a first area of the display screen of the electronic device, a user's face image collected by the electronic device; displaying, in a second area, a simulated made-up image of the user's face generated from the user's face image; receiving a first operation; and, in response to the first operation, stopping displaying the user's face image and displaying a first object in the first area, where the first object is a still image or a short video obtained from the user's face image.
  • In this method, the dynamically displayed face image of the user can be frozen and displayed as a static picture or a short video, which is convenient for the user to check the facial makeup.
  • In a possible design, the display of the simulated made-up image of the user's face is stopped, and a second object is displayed in the second area, where the second object is a still image or a short video obtained from the simulated made-up image. In this method, the dynamically displayed simulated made-up image can be frozen and displayed as a static picture or a short video, which is convenient for the user to check the makeup effect.
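  • A minimal Kotlin sketch of this freezing step follows, assuming the live preview is rendered in an Android TextureView; a short video could be captured analogously (for example, by recording the preview for a few seconds), which is not shown here.

```kotlin
import android.graphics.Bitmap
import android.view.TextureView
import android.view.View
import android.widget.ImageView

// "Freeze" the live preview: grab the current frame as a still image and
// show it in place of the moving camera feed so the user can inspect it.
fun freezePreview(preview: TextureView, frozenView: ImageView) {
    val still: Bitmap? = preview.bitmap     // copy of the current frame
    if (still != null) {
        preview.visibility = View.GONE
        frozenView.setImageBitmap(still)
        frozenView.visibility = View.VISIBLE
    }
}
```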
  • In a third aspect, the present application provides an electronic device, which includes a display screen, a camera, an input device, and a processor.
  • The camera is used to collect the user's face image; the first area of the display screen is used to display the user's face image collected by the camera; the second area of the display screen is used to display the simulated made-up image of the user's face generated from the user's face image; the input device is used to receive a first input; and the processor is configured to, in response to the first input, control the first area of the display screen to display an enlarged image of at least part of the user's face image.
  • receiving the first input by the input device includes: the input device receiving the first input acting on the first area; or the input device receiving the first input acting on the second area.
  • In a possible design, the processor is further configured to, in response to the first input, control the second area of the display screen to display an enlarged image of at least part of the simulated made-up image of the user's face.
  • In a possible design, the processor is specifically configured to: control the first area of the display screen to display an enlarged image of at least part of the user's face image centered on the center point of the user's face image; and control the second area of the display screen to display an enlarged image of at least part of the simulated made-up image centered on the center point of the simulated made-up image.
  • In a possible design, the processor is further configured to: if it is determined that the first input acts on a first position of the display screen, control the second area of the display screen to display a first image; and if it is determined that the first input acts on a second position of the display screen, control the second area of the display screen to display a second image; where the first position is different from the second position, and the first image is different from the second image.
  • In a possible design, the part of the user's face image corresponds to the part of the simulated made-up image of the user's face.
  • In a possible design, the simulated made-up image of the user's face includes first indication information, where the first indication information is used to indicate the part and shape of the makeup; the first indication information is enlarged along with the simulated made-up image.
  • In a possible design, the processor is further configured to: acquire a first static image from the user's facial image collected by the camera, and simulate makeup on the user's facial image to form a second static image; the display screen is further configured to display the first static image in the first area and the second static image in the second area.
  • In a possible design, the third area of the display screen is used to display first information; the input device is further used to receive the user's input operation on the first information; and the processor is further used to change the simulated made-up image of the user's face according to the first information.
  • the processor is further configured to generate the first information according to the feature of the user's facial image.
  • In a possible design, the input device is further configured to receive the user's modification operation on the simulated made-up image of the user's face; the processor is further configured to generate the first information according to the modified simulated made-up image.
  • In a fourth aspect, the present application provides a method for displaying a graphical user interface.
  • The method includes: an electronic device displays a first graphical user interface (GUI), where a first area of the first GUI includes a user's face image collected by a camera and a second area includes a simulated made-up image of the user's face generated from the user's face image; in response to receiving a first input, the electronic device displays a second GUI, where the first area of the second GUI includes an enlarged image of at least part of the user's face image; the first area of the second GUI and the first area of the first GUI are the same display area on the display screen.
  • In a possible design, the second area of the second GUI includes an enlarged image of at least part of the simulated made-up image, where the second area of the second GUI and the second area of the first GUI are the same display area on the display screen.
  • In a possible design, the center point of the part of the user's face image coincides with the center point of the user's face image, and the center point of the part of the simulated made-up image coincides with the center point of the simulated made-up image.
  • In a possible design, if the first input acts on a first position of the display screen, the first area of the second GUI displays a first image and the second area of the second GUI displays the simulated made-up image of the first image, where the first image is an enlarged image of part of the user's face image; if the first input acts on a second position of the display screen of the electronic device, the first area of the second GUI displays a second image and the second area of the second GUI displays the simulated made-up image of the second image, where the second image is an enlarged image of another part of the user's face image. The first position is different from the second position, and the first image is different from the second image.
  • In a possible design, the part of the user's face image corresponds to the part of the simulated made-up image of the user's face.
  • In a possible design, the second area of the first GUI further includes first indication information, which is used to indicate the makeup part and shape; the second area of the second GUI includes the first indication information displayed enlarged along with the enlarged simulated made-up image.
  • In a possible design, in the second area of the first GUI, the image of an object covering the user's face is not overlaid with the simulated makeup.
  • In a possible design, the first area of the first GUI includes a first static image obtained from the user's facial image collected by the camera; the second area of the first GUI includes a second static image formed by simulating makeup on the user's facial image.
  • In a possible design, a third area of the first GUI includes first information, and the method further includes: receiving the user's input operation on the first information; in response to the user's input operation on the first information, the electronic device displays a third GUI, where the second area of the third GUI includes a simulated made-up image of the user's face formed according to the first information; the second area of the third GUI and the second area of the first GUI are the same display area on the display screen.
  • In a fifth aspect, the present application provides an electronic device that can implement the display method described in any one of the first and second aspects and their possible designs; this may be implemented by software, by hardware, or by hardware executing corresponding software.
  • the electronic device may include a processor and memory.
  • the processor is configured to support the electronic device to perform corresponding functions in any one of the above-mentioned first and second aspects and possible designs thereof.
  • The memory is configured to be coupled with the processor, and it holds the program instructions and data necessary for the electronic device.
  • In another aspect, an embodiment of the present application provides a computer-readable storage medium. The computer-readable storage medium includes computer instructions that, when run on an electronic device, cause the electronic device to perform the display method described in any one of the first and second aspects and their possible designs.
  • In another aspect, an embodiment of the present application provides a computer program product that, when run on a computer, enables the computer to execute the display method applied to an electronic device described in any one of the first and second aspects and their possible designs.
  • FIG. 1 is a schematic diagram of a scene example of a display method applied to an electronic device provided by an embodiment of the present application;
  • FIG. 2 is a schematic diagram of a scene example of a display method applied to an electronic device provided by an embodiment of the present application;
  • FIG. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a software architecture of an electronic device provided by an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 6 is a schematic diagram of a scene example of a display method applied to an electronic device provided by an embodiment of the present application
  • FIGS. 7A-7D are schematic diagrams of a scene example of a display method applied to an electronic device provided by an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a display method applied to an electronic device according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of a scene example of a display method applied to an electronic device provided by an embodiment of the present application.
  • FIG. 10A is a schematic flowchart of a display method applied to an electronic device provided by an embodiment of the present application;
  • FIGS. 10B-10D are schematic diagrams of a scenario example of a display method applied to an electronic device provided by an embodiment of the present application;
  • FIG. 11A is a schematic flowchart of a display method applied to an electronic device according to an embodiment of the present application;
  • FIGS. 11B-11D are schematic diagrams of a scenario example of a display method applied to an electronic device provided by an embodiment of the present application;
  • FIGS. 12-18 are schematic diagrams of scene examples of a display method applied to an electronic device provided by an embodiment of the application;
  • FIG. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • References in this specification to "one embodiment" or "some embodiments" and the like mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application.
  • The phrases "in one embodiment", "in some embodiments", "in other embodiments", and the like appearing in various places in this specification do not necessarily all refer to the same embodiment; rather, they mean "one or more but not all embodiments" unless specifically emphasized otherwise.
  • The terms "comprising", "including", "having", and their variants mean "including but not limited to" unless specifically emphasized otherwise.
  • the term “connected” includes both direct and indirect connections unless otherwise specified.
  • The embodiments of the present application provide a display method applied to an electronic device, which can display an image of the user wearing makeup, provide makeup guidance to the user, and the like, offering makeup auxiliary functions that help the user complete makeup.
  • The above-mentioned electronic device may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a personal digital assistant (PDA), a wearable device, a virtual reality device, or the like; the embodiments of the present application impose no limitation on this.
  • the display interface of the mobile phone 100 may include a first display content 11 and a second display content 12 .
  • the first display content 11 is used to display the user's face image collected by the mobile phone camera;
  • The second display content 12 is used to display a made-up image of the user's face, that is, the user's face image collected by the camera with simulated makeup effects superimposed on it.
  • The user can refer to the second display content 12 to apply makeup, and can also compare the first display content 11 (the user's face image) with the second display content 12 (the made-up image of the user's face) to correct the facial makeup in real time.
  • the embodiment of the present application does not limit the positions where the first display content 11 and the second display content 12 are displayed on the display screen (also referred to as a screen in this application) of the mobile phone 100 .
  • the display interface of the mobile phone 100 further includes a third display content 13 .
  • the third display content 13 is used to display makeup auxiliary information.
  • the makeup assistance information may include a makeup palette 131 that includes makeup parameters. The user can select different makeup parameters, so that the second display content 12 (image with makeup on the user's face) presents corresponding effects.
  • the makeup auxiliary information may further include makeup step guide information 132 for indicating makeup steps.
  • the display area of the first display content 11 in the display screen is referred to as the first display frame
  • the display area of the second display content 12 in the display screen is referred to as the second display frame.
  • the above electronic device may also be a mobile phone 200 with a folding screen.
  • the cell phone 200 can be folded along one or more folding axes for use.
  • The screen (folding screen) of the mobile phone 200 is divided into a first display area 21 and a second display area 22 along a folding axis, and an angle α is formed between the first display area 21 and the second display area 22, where α lies in the interval from 0° to 180°. It can be understood that the first display area 21 and the second display area 22 are different areas belonging to the same folding screen.
  • In some embodiments, the first display content 11 and the second display content 12 are displayed in the first display area 21 and the second display area 22, respectively; in other embodiments, the first display content 11 and the second display content 12 are both displayed in the first display area 21. In some embodiments, the first display area 21 displays the first display content 11 and the second display content 12, and the second display area 22 displays the third display content 13.
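  • As an illustration of laying out the display contents according to the fold state, the Kotlin sketch below chooses a layout from the angle α; the angle thresholds are assumptions for the sketch, since the application only states that α lies between 0° and 180°.

```kotlin
// Choose how to lay out the two display contents from the hinge angle α of
// the folding screen. Thresholds are illustrative assumptions.
enum class FoldLayout { SPLIT_ACROSS_AREAS, SINGLE_AREA }

fun chooseLayout(alphaDegrees: Float): FoldLayout =
    if (alphaDegrees in 30f..150f) FoldLayout.SPLIT_ACROSS_AREAS // half-opened
    else FoldLayout.SINGLE_AREA                                  // flat or closed
```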
  • the display screen of the electronic device can display the first display content 11 and/or the second display content 12 .
  • The user can view the current makeup progress in real time through the first display content 11, can see the complete made-up image through the second display content 12, and can also compare the first display content 11 (the user's face image) with the second display content 12 (the made-up image).
  • The user can also apply makeup according to the makeup step guide information of the third display content 13, and can select makeup parameters to modify the image effect of the second display content 12. In this way, all-round makeup assistance can be provided to help the user apply delicate makeup.
  • FIG. 3 shows a schematic structural diagram of an electronic device 300 .
  • The electronic device 300 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 300 .
  • The electronic device 300 may include more or fewer components than shown, or combine some components, or split some components, or have a different component arrangement.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated into one or more processors.
  • the controller may be the nerve center and command center of the electronic device 300 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
  • the processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transceiver (universal asynchronous transmitter) receiver/transmitter, UART) interface, mobile industry processor interface (MIPI), general-purpose input/output (GPIO) interface, subscriber identity module (SIM) interface, and / or universal serial bus (universal serial bus, USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180L, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180L through the I2C interface, so that the processor 110 and the touch sensor 180L communicate through the I2C bus interface, so as to realize the touch function of the electronic device 300 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • The PCM interface can also be used for audio communication, to sample, quantize, and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to implement the photographing function of the electronic device 300 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 300 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 300, and can also be used to transmit data between the electronic device 300 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 300 .
  • the electronic device 300 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 300 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 300 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 300 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the electronic device 300 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied to the electronic device 300, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 300 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 300 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code Division Multiple Access (WCDMA), Time Division Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or the satellite-based augmentation systems (SBAS).
  • the electronic device 300 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
  • In some embodiments, the electronic device 300 may include one or N display screens 194, where N is a positive integer greater than 1.
  • The display screen 194 may include a display and a touch panel (TP). The display is used to output display content to the user, and the touch panel is used to receive touch events input by the user on the display screen 194.
  • the display screen 194 may be used to display the first display content 11 , the second display content 12 , the third display content 13 , and the like.
  • the electronic device 300 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used to process the data fed back by the camera 193 .
  • When the shutter is opened, light is transmitted to the photosensitive element of the camera through the lens, and the optical signal is converted into an electrical signal; the photosensitive element of the camera transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • An object is projected through the lens to generate an optical image on the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 300 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • The camera 193 may include a wide-angle camera, a photographic camera, a 3D depth-sensing camera (e.g., a structured light camera or a time-of-flight (ToF) camera), a telephoto camera, and the like.
  • the camera 193 may include a front-facing camera and a rear-facing camera. In this embodiment of the present application, the camera 193 (for example, a front-facing camera) may be used to collect an image of the user's face.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 300 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 300 may support one or more video codecs.
  • the electronic device 300 can play or record videos in various encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 300 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 300 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the electronic device 300 by executing the instructions stored in the internal memory 121 .
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 300 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the electronic device 300 may implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 300 can listen to music through the speaker 170A, or listen to a hands-free call.
  • The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • The microphone 170C, also called a "mic" or "mike", is used to convert sound signals into electrical signals.
  • The user can make a sound with the mouth close to the microphone 170C, inputting the sound signal into the microphone 170C.
  • the electronic device 300 may be provided with at least one microphone 170C. In other embodiments, the electronic device 300 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 300 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 300 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 300 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 300 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
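  • The Kotlin sketch below illustrates this pressure-dependent dispatch using the touch pressure reported with a motion event; the threshold value and function names are illustrative.

```kotlin
import android.view.MotionEvent

// The same touch position triggers different instructions depending on the
// touch intensity. The threshold is an assumption for the sketch.
const val FIRST_PRESSURE_THRESHOLD = 0.6f

fun onMessageIconTouch(event: MotionEvent) {
    if (event.action == MotionEvent.ACTION_UP) {
        if (event.pressure < FIRST_PRESSURE_THRESHOLD) {
            viewShortMessage()       // lighter press: view the short message
        } else {
            createShortMessage()     // firmer press: create a new short message
        }
    }
}

fun viewShortMessage() { /* open the message */ }
fun createShortMessage() { /* start composing */ }
```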
  • the air pressure sensor 180B is used to measure air pressure.
  • the electronic device 300 calculates the altitude through the air pressure value measured by the air pressure sensor 180B to assist in positioning and navigation.
  • the gyro sensor 180C may be used to determine the motion attitude of the electronic device 300 .
  • The angular velocities of the electronic device 300 about three axes (i.e., the x, y, and z axes) may be determined by means of the gyro sensor 180C.
  • the gyro sensor 180C can be used for image stabilization.
  • the gyro sensor 180C detects the shaking angle of the electronic device 300, calculates the distance to be compensated by the lens module according to the angle, and allows the lens to offset the shaking of the electronic device 300 through reverse motion to achieve anti-shake.
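  • As an illustration of the compensation step, the sketch below uses the common small-angle relation that a shake of angle θ displaces the image by roughly the focal length times tan θ; this formula is an assumption for the sketch and is not stated in the application.

```kotlin
import kotlin.math.tan

// Assumed anti-shake relation: a shake of angle theta shifts the image by
// about focalLength * tan(theta), so the lens is driven by the opposite
// amount (the negative sign expresses the reverse motion).
fun lensCompensationMm(focalLengthMm: Double, shakeAngleRad: Double): Double =
    -focalLengthMm * tan(shakeAngleRad)
```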
  • the gyro sensor 180C can also be used for navigation and somatosensory game scenarios.
  • the acceleration sensor 180D can detect the magnitude of the acceleration of the electronic device 300 in various directions (generally three axes). When the electronic device 300 is stationary, the magnitude and direction of the gravitational acceleration can be detected.
  • the magnetic sensor 180E includes a Hall sensor, a magnetometer, and the like. Hall sensors can detect the direction of the magnetic field; magnetometers are used to measure the magnitude and direction of the magnetic field. The magnetometer can measure the strength of the environmental magnetic field. For example, the strength of the magnetic field can be measured by the magnetometer to obtain the azimuth angle information of the carrier of the magnetometer.
  • the touch device 180F can be used to detect the user's touch position.
  • the touch point of the user on the electronic device 300 may be detected by the touch device 180F, and then the user's grasping posture may be determined using a preset grasping algorithm according to the touch position.
  • The distance sensor 180G is used to measure distance. The electronic device 300 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 300 can use the distance sensor 180G to measure the distance to achieve fast focusing.
  • Proximity light sensor 180H may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 300 emits infrared light to the outside through the light emitting diode.
  • Electronic device 300 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 300 . When insufficient reflected light is detected, the electronic device 300 may determine that there is no object near the electronic device 300 .
  • the electronic device 300 can use the proximity light sensor 180H to detect that the user holds the electronic device 300 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180H can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180M is used to sense ambient light brightness.
  • the electronic device 300 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180M can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180M can also cooperate with the proximity light sensor 180H to detect whether the electronic device 300 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180J is used to collect fingerprints.
  • the electronic device 300 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking photos with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180K is used to detect the temperature.
  • the electronic device 300 uses the temperature detected by the temperature sensor 180K to execute a temperature processing policy. For example, when the temperature reported by the temperature sensor 180K exceeds a threshold, the electronic device 300 reduces the performance of a processor located near the temperature sensor 180K, so as to reduce power consumption and implement thermal protection. In some other embodiments, when the temperature is lower than another threshold, the electronic device 300 heats the battery 142 to avoid an abnormal shutdown caused by the low temperature. In still other embodiments, when the temperature is lower than yet another threshold, the electronic device 300 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
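  • A minimal sketch of such a temperature processing policy, assuming illustrative thresholds and a hypothetical hardware-control interface (the embodiment does not specify values):

        // Sketch: threshold-based thermal policy. HIGH_TEMP_C and
        // LOW_TEMP_C are assumed values; Device is a hypothetical
        // stand-in for the hardware-control layer.
        final class ThermalPolicy {
            static final float HIGH_TEMP_C = 45f; // assumption
            static final float LOW_TEMP_C  = 0f;  // assumption

            interface Device {
                void throttleProcessor();          // reduce performance
                void heatBattery();                // avoid cold shutdown
                void boostBatteryOutputVoltage();  // alternative measure
            }

            void apply(float tempC, Device d) {
                if (tempC > HIGH_TEMP_C) {
                    d.throttleProcessor();
                } else if (tempC < LOW_TEMP_C) {
                    d.heatBattery();
                    // other embodiments instead boost the battery voltage:
                    // d.boostBatteryOutputVoltage();
                }
            }
        }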
  • the touch sensor 180L is also called “touch panel”.
  • the touch sensor 180L may be disposed on the display screen 194, and the touch sensor 180L and the display screen 194 form a touch screen, also referred to as a "touch screen”.
  • the touch sensor 180L is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180L may also be disposed on the surface of the electronic device 300 , which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180Q can acquire vibration signals.
  • the bone conduction sensor 180Q can acquire the vibration signal of the vibrating bone mass of the human vocal part.
  • the bone conduction sensor 180Q can also contact the pulse of the human body and receive the blood pressure pulse signal.
  • the bone conduction sensor 180Q can also be disposed in an earphone to form a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180Q, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180Q, so as to realize the function of heart rate detection.
  • the keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 300 may receive key inputs and generate signal inputs related to user settings and function control of the electronic device 300 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 300 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 300 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 300 interacts with the network through the SIM card to realize functions such as calls and data communication.
  • the electronic device 300 employs an eSIM, i.e., an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 300 and cannot be separated from the electronic device 300 .
  • the software system of the electronic device 300 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiments of the present application take the layered architecture as an example to exemplarily describe the software structure of the electronic device 300 .
  • FIG. 4 is a block diagram of a software structure of an electronic device 300 provided by an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the electronic device 300 may include an application layer, an application framework layer, an Android runtime and system libraries, and a kernel layer.
  • the application layer can include a series of applications.
  • the application layer may include applications such as smart makeup box, makeup area layout display, makeup tray layout display, makeup mode switching control, curing and magnification control, makeup mirror fill light control, and makeup extraction control.
  • the smart makeup box is used to provide auxiliary makeup functions.
  • the makeup tray layout display controls the layout of the makeup auxiliary information of the smart makeup box on the display screen.
  • the makeup extraction control is used to analyze the collected user face image and obtain the makeup color parameters.
  • the makeup area layout display is used to simulate makeup processing on the user's face image to form a makeup image on the user's face.
  • the makeup mode switching control is used to control the electronic device to switch the display mode.
  • the curing and magnification control module is used to solidify the dynamic display content of the makeup application display interface into a still image for display; it is also used to control the electronic device to magnify the display content of the makeup application display interface.
  • the makeup mirror fill light control is used to control the display fill light effect in the peripheral area of the electronic device display screen.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include event distribution management service, power management service (power manager service, PMS), timing management service (alarm manager service, AMS), window management service (window manager service, WMS),
  • a content provider, a view system, a phone manager, a resource manager, a notification manager, etc., which are not limited in this embodiment of the present application.
  • the event distribution management service is used to distribute events to applications.
  • the power management service may be used to control turning on or off the screen of the electronic device 300 .
  • the timing management service is used to manage timers, alarm clocks, etc.
  • the window management service is used to manage window programs.
  • the window management service can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication functions of the electronic device 300, for example, management of the call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message reminders, and the like.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of charts or scroll-bar text, such as notifications of applications running in the background, or display notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: sensor service, surface manager (surface manager), media library (Media Libraries), gesture recognition engine, face recognition engine, graphics processing engine, graphics tracking engine, natural language recognition engine, etc.
  • Sensor services are used to store and process sensor-related data. For example, the output data of each sensor is provided, and the output data of multiple sensors is fused.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the gesture recognition engine handles gesture recognition related processes.
  • the face recognition engine is used to process data and processes related to face recognition.
  • a graphics processing engine is a drawing engine used for graphics, image processing, etc.
  • the graphics tracking engine is used to process graphics and image tracking related processes. Natural language recognition engines are used to support processes such as speech recognition, semantic recognition, etc.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • the audio driver is used to upload the voice received by the microphone of the electronic device to the natural language recognition engine.
  • the natural language recognition engine performs semantic recognition to determine the semantics of the acquired speech, generates a corresponding event according to the semantics, and reports the event to the event distribution management service. For example, if the natural language recognition engine determines that the semantics of the speech is "switch makeup mode", a mode switching event is generated; if the semantics is determined to be "fix", a solidification event is generated; if the semantics is determined to be "enlarge", a center zoom event is generated; if the semantics is determined to be "enlarge the mouth", a tracking zoom event is generated.
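  • A minimal sketch of this semantics-to-event mapping (the phrase strings are the English renderings used above; the engine's real output format and the event enum are assumptions):

        // Sketch: map recognized speech semantics to events reported to
        // the event distribution management service.
        enum MakeupEvent { MODE_SWITCH, SOLIDIFY, CENTER_ZOOM, TRACKING_ZOOM }

        final class SpeechEventMapper {
            MakeupEvent map(String semantics) {
                switch (semantics) {
                    case "switch makeup mode": return MakeupEvent.MODE_SWITCH;
                    case "fix":                return MakeupEvent.SOLIDIFY;
                    case "enlarge":            return MakeupEvent.CENTER_ZOOM;
                    case "enlarge the mouth":  return MakeupEvent.TRACKING_ZOOM;
                    default:                   return null; // unrecognized
                }
            }
        }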
  • the camera driver is used to upload the user's face image collected by the camera of the electronic device to the face recognition engine.
  • the face recognition engine recognizes the expression of the user's facial image; generates the corresponding event according to the expression of the user's facial image, and reports the event to the event distribution management service. For example, if the face recognition engine determines that the acquired expression is a "smiley face", a solidification event is generated.
  • the camera driver can also be used to upload the user's hand image captured by the camera of the electronic device to the gesture recognition engine.
  • the gesture recognition engine recognizes the gesture corresponding to the user's hand image; generates a corresponding event according to the gesture of the user's hand image, and reports the event to the event distribution management service. For example, if the gesture recognition engine determines that the acquired gesture is a "heart gesture", a solidification event is generated; if it determines that the acquired gesture is a zoom gesture, a center zoom event is generated.
  • the sensor of the electronic device detects the user's operation on the screen, and notifies the sensor service through the sensor driver.
  • the sensor service determines the user's operation gesture on the screen, generates a corresponding event according to the current process, and reports the event to the event distribution management service. For example, if the sensor service detects the user's right-swipe operation on the screen, a mode switching event is generated; if it detects the user's tap on the screen, a center zoom event is generated; if it detects the user's two-finger spreading operation, a tracking zoom event is generated.
  • the event distribution management service distributes each event to the corresponding modules in the application layer.
  • the mode switching event is distributed to the makeup mode switching control module; the curing event, the center zoom event, the tracking zoom event, etc. are distributed to the curing and zoom control module.
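  • A minimal sketch of this distribution step, reusing the MakeupEvent enum from the sketch above (the module wiring is an assumption):

        // Sketch: route each event to the application-layer module named
        // in the text. Module is a hypothetical callback interface.
        final class EventDistributionService {
            interface Module { void onEvent(MakeupEvent e); }

            private final Module makeupModeSwitchControl;
            private final Module curingAndZoomControl;

            EventDistributionService(Module modeSwitch, Module curingZoom) {
                this.makeupModeSwitchControl = modeSwitch;
                this.curingAndZoomControl = curingZoom;
            }

            void distribute(MakeupEvent e) {
                if (e == MakeupEvent.MODE_SWITCH) {
                    makeupModeSwitchControl.onEvent(e);
                } else { // SOLIDIFY, CENTER_ZOOM, TRACKING_ZOOM
                    curingAndZoomControl.onEvent(e);
                }
            }
        }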
  • electronic devices with folding screens can be folded along one or more folding axes. Please refer to (a) of FIG. 5: the electronic device is in an unfolded state. The user can fold the electronic device shown in (a) of FIG. 5 along the folding axis of the electronic device. As shown in (b) of FIG. 5, after the user folds the electronic device along the folding axis AB, the screen (folding screen) of the electronic device is divided along the folding axis AB into two display areas, namely the first display area 21 and the second display area 22. In the embodiment of the present application, the first display area 21 and the second display area 22 formed after folding can be displayed as two independent areas.
  • the display interface of the first display area 21 may be called the first display interface, and the display interface of the second display area 22 may be called the second display interface; that is, the display area of the folding screen where the first display interface is located is the first display area 21, and the display area of the folding screen where the second display interface is located is the second display area 22.
  • the first display interface and the second display interface are respectively located on both sides of the folding axis.
  • the areas of the first display area 21 and the second display area 22 may be the same or different.
  • an included angle α may be formed between the first display area 21 and the second display area 22; as shown in (b) of FIG. 5, the electronic device is in a bent state. The user can continue to fold the electronic device along the folding axis AB. As shown in (c) of FIG. 5, the electronic device is in a folded state.
  • the first display area 21 and the second display area 22 may have other names; for example, the first display area 21 may be called the A screen of the electronic device and the second display area 22 the B screen of the electronic device, or the first display area 21 may be referred to as the upper screen and the second display area 22 as the lower screen; this is not limited in this embodiment of the present application.
  • the first display area 21 of the folding screen is referred to as the first screen
  • the second display area 22 of the folding screen is referred to as the second screen; it can be understood that the first screen and the second screen belong to the same folding screen different areas of the screen.
  • the included angle α between the first screen 21 and the second screen 22 may be in the range of 0° to 180°.
  • the electronic device can stay propped on a flat surface by itself, without a stand or the user's hand. In this way, the user's hands are freed, and it is convenient for the user to operate the electronic device.
  • for example, the first threshold may be 150° and the second threshold may be 60°; or the first threshold may be 120° and the second threshold may be 60°; or the first threshold may be 120° and the second threshold may be 30°. This embodiment of the present application does not limit this.
  • the user unfolds the electronic device such that the value of α increases to a first angle (the first angle is less than the first threshold and greater than the second threshold), and the electronic device remains at the first angle for a first duration (such as 1s); or the user bends the electronic device so that the value of α decreases to the first angle, and the electronic device remains at the first angle for the first duration; then the display screen of the electronic device displays a shortcut menu.
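  • A minimal sketch of this hinge-angle trigger (the threshold values follow one example above; the sampling and timing source are assumptions):

        // Sketch: the shortcut menu is shown once the included angle α has
        // stayed strictly between the second and first thresholds for the
        // first duration.
        final class HingeMenuTrigger {
            static final float FIRST_THRESHOLD_DEG  = 150f;  // example value
            static final float SECOND_THRESHOLD_DEG = 60f;   // example value
            static final long  FIRST_DURATION_MS    = 1000L; // "such as 1s"

            private long inRangeSinceMs = -1;

            // Called for each hinge-angle sample; returns true when the
            // shortcut menu should be displayed.
            boolean onAngleSample(float alphaDeg, long nowMs) {
                boolean inRange = alphaDeg > SECOND_THRESHOLD_DEG
                               && alphaDeg < FIRST_THRESHOLD_DEG;
                if (!inRange) { inRangeSinceMs = -1; return false; }
                if (inRangeSinceMs < 0) inRangeSinceMs = nowMs;
                return nowMs - inRangeSinceMs >= FIRST_DURATION_MS;
            }
        }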
  • the shortcut menu includes shortcut icons of makeup applications, such as the smart makeup box. The user can open the makeup application by clicking the shortcut icon of the makeup application. Exemplarily, as shown in FIG. , a shortcut menu 602 is displayed on the desktop 601 of the electronic device, and the shortcut menu 602 includes an icon 603 of a "smart cosmetic box".
  • the user can open the makeup application by clicking on the "smart makeup box” icon 603 .
  • when the shortcut menu 602 pops up on the desktop 601, the other icons originally displayed on the desktop 601 are blurred.
  • the electronic device receives an operation of clicking the "smart makeup box" icon 603 by the user, and in response to the operation of clicking the "smart makeup box” icon 603, the electronic device displays a display interface of the makeup application.
  • the display interface of the makeup application may include at least one of the first display content 11 and the second display content 12 .
  • the display interface of the makeup application includes the first display content 11 , and the first display content 11 is displayed on the first screen 21 of the display screen.
  • the front camera of the electronic device collects an image of the user's face, and displays the collected image on the first screen 21 of the display screen.
  • the display mode is called the mirror mode of makeup application.
  • the display interface of the makeup application includes the second display content 12 , and the second display content 12 is displayed on the first screen 21 of the display screen.
  • the front camera of the electronic device collects the user's face image, and the user's face image is transmitted to the graphics processing engine through the camera driver.
  • the graphics processing engine uses an augmented reality algorithm to apply simulated makeup to the user's facial image collected by the camera, so as to form a simulated image of the user's face with makeup.
  • the smart makeup box module controls the simulated makeup image of the user's face to be displayed on the first screen 21 of the display screen.
  • the display mode is called the makeup mode of the makeup application.
  • the display interface of the makeup application includes the first display content 11 and the second display content 12, and the first display content 11 and the second display content 12 are displayed on the first screen 21 of the display screen.
  • the smart makeup box module controls the user's facial image collected by the camera and the user's face simulation image with makeup formed by the graphics processing engine to be displayed on the first screen 21 of the display screen in a synchronous split screen.
  • this display mode is called a mixed mode of makeup application.
  • mirror mode, makeup mode, and blend mode can be switched among each other.
  • when the electronic device receives a mode switching operation, it switches the display mode.
  • the mode switching operation includes a finger sliding operation on the display screen.
  • the mode switching operation includes the first voice.
  • for example, when the electronic device receives the voice "switch makeup mode", it switches to the makeup mode for display.
  • the mode switching operation may also include other forms, such as clicking a switching button, a mode switching gesture, etc., which are not limited in this embodiment of the present application.
  • the first display content 11 includes an image of the user's face and objects covering the user's face (eg, makeup tools, human hands, etc.), that is, the image captured by the camera includes the user's face image and objects covering the user's face.
  • the electronic device uses an image recognition algorithm to process the image collected by the camera, and obtains the user's facial image (such as eyebrows, eyes, nose, mouth, facial contour, etc.) and the occluding objects (e.g., human hands, makeup tools, etc.).
  • the electronic device uses an augmented reality technology algorithm to perform simulated makeup processing on the part of the image captured by the camera that is not blocked by the blocking object, so as to form the second display content 12 .
  • the part of the user's face that is not blocked by the occluding object displays the simulated image with makeup
  • the part blocked by the occluding object displays the occluding object (this part does not include the simulated makeup effect).
  • the simulated makeup effect will not appear on the occluded objects in the simulated makeup image displayed by the electronic device, thereby bringing an immersive makeup display experience to the user.
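  • A minimal sketch of this occlusion-aware compositing (the per-pixel occlusion mask would come from the image recognition step above; here it is an assumed input):

        // Sketch: blend simulated makeup only where the mask marks an
        // unoccluded face pixel; occluder pixels keep the camera image,
        // so no makeup effect appears on hands or makeup tools.
        final class OcclusionAwareMakeup {
            // camera and makeup are ARGB pixel arrays of equal length;
            // mask[i] == true means pixel i is unoccluded face.
            int[] compose(int[] camera, int[] makeup, boolean[] mask) {
                int[] out = new int[camera.length];
                for (int i = 0; i < camera.length; i++) {
                    out[i] = mask[i] ? makeup[i] : camera[i];
                }
                return out;
            }
        }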
  • the face image of the user in the first display frame is partially occluded by lipstick and a human hand.
  • the electronic device performs simulated makeup processing on the part of the user's face image that is not covered by lipstick and human hands, so as to form a simulated image of the user's face with makeup.
  • in the mouth area of the simulated image of the user's face with makeup in the second display frame, the simulated makeup effect is displayed only on the part that is not occluded.
  • a dashed frame may be superimposed on the simulated makeup image of the user's face, which is used to indicate the makeup position and shape, and guide the user to apply makeup.
  • the dashed-line frame may move with the movement of the simulated makeup image on the user's face, so that the dashed-line frame is always located at the indicated location.
  • a dotted frame is displayed at the corresponding position on the simulated makeup image of the user's face. Exemplarily, as shown in FIG. , the makeup part indicated by the makeup step guide information 132 is the mouth, and the dashed frame 701 is superimposed on the corresponding mouth area of the simulated image of the user's face with makeup displayed in the second display frame.
  • a dashed box 701 is used to indicate the location and shape of the makeup applied to the mouth.
  • the display interface of the makeup application includes a simulation button for turning on or off the function of displaying the dotted frame. For example, when the user clicks the simulation button, the simulation button enters the pressed state, and the function of displaying the dotted frame is turned on.
  • the electronic device displays the dotted frame at the corresponding position on the simulated makeup image of the user's face according to the position indicated by the makeup step guide information 132 .
  • the makeup part indicated by the makeup step guide information 132 is the mouth.
  • the display interface of the electronic device includes an "image guide" button 702. If the electronic device receives the operation of the user pressing the "image guide" button 702, a dashed frame 701 is superimposed on the mouth area of the simulated image of the user's face with makeup displayed in the second display frame.
  • the embodiment of the present application provides a display method applied to an electronic device.
  • the electronic device displays a display interface of a cosmetic application
  • the dynamic display content of the display interface can be image-solidified and displayed, which is convenient for a user to view.
  • the method may include:
  • the electronic device displays a first interface of a makeup application, where the first interface includes at least one of the first display content 11 and the second display content 12 .
  • the display mode of the makeup application is the mirror mode; the first interface includes the first display content 11 .
  • the first screen 21 of the electronic device displays an image of the user's face captured by the camera.
  • the display mode of the makeup application is the makeup mode; the first interface includes the second display content 12 .
  • the first screen 21 of the electronic device displays a simulated image of the user's face with makeup.
  • the display mode of the makeup application is a mixed mode; the first interface includes a first display content 11 and a second display content 12 .
  • the first screen 21 of the electronic device displays the user's face image and the user's face simulated image with makeup on the left and right screens.
  • the second screen 22 of the electronic device may not display any content (for example, the screen is off), or the second screen 22 may display the third display content 13 (makeup auxiliary information), or the second screen 22 may display other content, which is not limited in this embodiment.
  • the first screen 21 and the second screen 22 of the electronic device jointly display the first interface.
  • for example, the first screen 21 of the electronic device displays the first display content 11 and the second screen 22 displays the second display content 12; or the first screen 21 displays the second display content 12 and the second screen 22 displays the first display content 11; and so on.
  • alternatively, the second screen 22 of the electronic device displays the first interface of the makeup application. In this case, the first screen 21 of the electronic device may not display any content (for example, the screen is off), or the first screen 21 may display the third display content 13 (makeup auxiliary information), or the first screen 21 may display other content, which is not limited in this embodiment and will not be repeated here.
  • the electronic device receives the first operation.
  • the first operation may include: voice (eg, voice "fix”), gesture (eg, "OK gesture”, “heart gesture”), facial expression (eg, smiley face), button click operation, screen tap operation, and the like.
  • in response to receiving the first operation, the electronic device displays the first object in the first display frame and displays the second object in the second display frame.
  • the electronic device receives the first operation of the user, and obtains the first object and/or the second object after a delay of a first time period (for example, 3 seconds). In this way, it is possible to avoid including gestures, expressions, etc. corresponding to the first operation on the interface displayed by the image solidification, which affects the user's viewing of makeup.
  • the electronic device after receiving the user's first operation and delaying for a first period of time, acquires the first object according to the first display content 11 , and acquires the second object according to the second display content 12 .
  • the electronic device stops displaying the first display content in the first display frame and displays the first object in the first display frame; the electronic device stops displaying the second display content in the second display frame and displays the second object in the second display frame.
  • the first display frame is the display area of the first display content 11 on the display screen, and the second display frame is the display area of the second display content 12 on the display screen.
  • the first object and the second object may be still images or short videos.
  • the first object and the second object are static images.
  • at the first moment (after receiving the user's first operation and delaying for the first duration), the electronic device captures the current frame of the user's face image collected by the camera as the first object.
  • at the first moment, the electronic device captures the current frame of the simulated image of the user's face with makeup as the second object. Further, the user can save the first object and the second object as pictures.
  • the first object and the second object are short videos.
  • at the first moment (after receiving the user's first operation and delaying for the first duration), the electronic device captures the current frame and the subsequent t (t>0) frames of the user's face image collected by the camera as the first object.
  • at the first moment, the electronic device captures the current frame and the subsequent t (t>0) frames of the simulated image of the user's face with makeup as the second object. Further, the user can save the first object and the second object as a video.
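  • A minimal sketch of the still/short-video capture just described (the frame source is a hypothetical interface standing in for the camera or graphics pipeline):

        import java.util.ArrayList;
        import java.util.List;

        // Sketch: after the first operation and the first-duration delay,
        // capture either the current frame (still image) or the current
        // frame plus t subsequent frames (short video).
        final class SolidificationCapture {
            interface FrameSource { int[] currentFrame(); } // hypothetical

            int[] captureStill(FrameSource src) {
                return src.currentFrame(); // saved as a picture
            }

            List<int[]> captureClip(FrameSource src, int t) {
                List<int[]> frames = new ArrayList<>();
                for (int i = 0; i <= t; i++) { // current frame + t more
                    frames.add(src.currentFrame());
                }
                return frames; // saved as a short video
            }
        }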
  • the display mode of the makeup application is a mixed mode
  • the first interface includes a first display content 11 and a second display content 12 .
  • in response to receiving the first operation, the electronic device obtains the first object according to the first display content 11 and the second object according to the second display content 12, and displays the first object and the second object (the image-solidified interface includes the first object and the second object); that is, both the user's face image and the simulated image of the user's face with makeup are solidified and displayed, which makes it convenient for the user to compare them and adjust the makeup steps, etc.
  • the first object and the second object may be saved as a picture.
  • the electronic device displays an interface 901.
  • the interface 901 includes a first object 902 and a second object 903 .
  • the interface 901 further includes prompt information 904 for prompting the user to save the first object 902 and the second object 903 .
  • the interface 901 may further include an "OK" button 905 and a "Cancel" button 906; the "OK" button 905 is used to confirm saving the first object 902 and the second object 903 as pictures, and the "Cancel" button 906 is used to confirm not saving the first object 902 and the second object 903.
  • in response to receiving the first operation, the electronic device acquires the first object according to the first display content 11 and displays the first object and the second display content 12; that is, only the image of the user's face is solidified and displayed, which makes it convenient for the user to carefully check the current facial makeup effect. Further, the first object may be saved as a picture.
  • in response to receiving the first operation, the electronic device acquires the second object according to the second display content 12 and displays the first display content 11 and the second object; that is, only the simulated image of the user's face with makeup is solidified and displayed, which makes it convenient for the user to carefully check the simulated makeup effect. Further, the second object may be saved as a picture.
  • a corresponding image solidification display mode may also be implemented according to the user's selection. For example, after receiving the user's "heart gesture", the electronic device solidifies and displays both the user's face image and the simulated image of the user's face with makeup; after receiving the user's double-click operation in the first display frame, the electronic device solidifies and displays only the image of the user's face; after receiving the user's double-click operation in the second display frame, the electronic device solidifies and displays only the simulated image of the user's face with makeup.
  • the user can view and edit the saved pictures or videos. For example, the user can perform operations such as zooming in, rotating, cropping, and adjusting the color of the saved picture. For another example, the user can save one or more frames of images in the saved video as a picture. In some examples, the user may also share the saved picture or video to the social application.
  • the embodiment of the present application provides a display method applied to an electronic device.
  • the electronic device displays a display interface of a makeup application
  • the display content of the display interface can be enlarged and displayed, so as to facilitate viewing by a user.
  • the electronic device magnifies the display content in a center magnification manner. In this way, the center of the user's face can be kept at the center of the display.
  • the method includes:
  • the electronic device displays a first interface of a makeup application.
  • the electronic device displays a first interface of the makeup application, where the first interface includes at least one of a first display content 11 , a first object, a second display content 12 and a second object.
  • the first display content 11 or the first object is displayed in the first display frame, and the second display content 12 or the second object is displayed in the second display frame.
  • the electronic device can enlarge and display the dynamic user face image or the user face simulated image with makeup.
  • the display content (the first object or the second object) displayed by the solidified image can also be enlarged and displayed.
  • the electronic device receives the first input.
  • the first input may include: voice (eg, voice "zoom in”), tapping on the screen (eg, single-click, double-tap), clicking on a button, and the like.
  • in response to receiving the first input, the electronic device enlarges and displays the display content of the first interface in a center-magnified manner.
  • the first interface includes first display content 11 .
  • the electronic device displays the enlarged first display content 11 in the first display frame with the center point of the first display content 11 as the center.
  • the first interface includes the second display content 12 .
  • the electronic device displays the enlarged second display content 12 in the second display frame with the center point of the second display content 12 as the center.
  • the first interface includes a first object.
  • the electronic device displays the enlarged first object in the first display frame with the center point of the first object as the center.
  • the first interface includes a second object.
  • the electronic device displays the enlarged second object in the second display frame with the center point of the second object as the center.
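  • A minimal sketch of the center-magnification mapping (plain coordinate math; the convention is an assumption): scaling by n about the frame center (cx, cy) leaves that center fixed, which is why the center of the content stays at the center of the display frame.

        // Sketch: map a content point (x, y) under a scale of n about the
        // display-frame center (cx, cy). map(cx, cy, ...) returns (cx, cy),
        // so the frame center is the fixed point of the zoom.
        final class CenterZoom {
            float[] map(float x, float y, float n, float cx, float cy) {
                return new float[] { cx + n * (x - cx), cy + n * (y - cy) };
            }
        }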
  • if the display mode of the makeup application is the hybrid mode (that is, the first interface includes a first display frame and a second display frame), then in response to receiving a first input (the first input may act inside the first display frame or the second display frame, or on the display area outside the first display frame and the second display frame), the electronic device simultaneously enlarges the display content of the first display frame and the second display frame.
  • the first screen 21 of the display screen of the electronic device displays the first display content 11 (the user's face image) and the second display content 12 (the user's face simulation image with makeup).
  • the electronic device receives the user's operation of clicking and dragging the button 100a to the right, and in response to the operation of clicking and dragging the button 100a to the right, the electronic device enlarges and displays the user's face image and the user's face simulation image with makeup in a center magnification manner.
  • alternatively, in response to receiving the first input (the first input may act inside the first display frame or the second display frame, or on the display area outside the first display frame and the second display frame), the electronic device enlarges only the display content within the first display frame or the second display frame. Exemplarily, as shown in FIG.
  • the first screen 21 of the display screen of the electronic device displays the first display content 11 (the user's face image) and the second display content 12 (the user's face simulation image with makeup).
  • the electronic device receives the user's operation of tapping the screen (the tapped area is located in the first display frame), and in response to the tapping operation, the electronic device enlarges and displays the user's face image in a center-magnified manner, while the display of the simulated image of the user's face with makeup does not change.
  • the first screen 21 of the display screen of the electronic device displays the first display content 11 (the user's face image) and the second display content 12 (the user's face simulation image with makeup).
  • the electronic device receives the user's operation of tapping the screen (the tapped area is located in the second display frame), and in response to the tapping operation, the electronic device enlarges and displays the simulated image of the user's face with makeup in a center-magnified manner, while the display of the user's face image does not change.
  • displaying the magnified image in the first display frame and displaying the magnified image in the second display frame are performed simultaneously.
  • the dynamic effect of displaying the enlarged image in the first display frame and the dynamic effect of displaying the enlarged image in the second display frame are played synchronously.
  • the magnification n is a default value, such as 2, 3, 5, or 10.
  • the electronic device zooms in on the displayed content in a tracking zoom-in manner. This way, the tracking area can always be centered on the display.
  • the method includes:
  • the electronic device displays a first interface of a makeup application.
  • the electronic device receives the second input.
  • the second input may be the same as the first input or different from the first input.
  • the second input may include: voice (for example, the voice “enlarge the eyes”, the voice “enlarge the mouth”), the user's gesture operation on the display screen (for example, the two-finger spreading operation) and the like.
  • the electronic device receives the speech, and determines the tracking area according to the semantics of the speech. For example, if the voice "enlarge the eyes" is received, the tracking area is determined to be the display area where the user's eyes are located on the display screen.
  • the electronic device receives the user's gesture operation on the display screen, and determines the tracking area according to the position of the gesture on the display screen. For example, when a two-finger spreading operation is received, the midpoint of the line connecting the two fingers' contact points on the display screen at the moment the fingers leave the screen is determined as the tracking area. If the positions of the user's gestures on the display screen differ, the determined tracking areas differ.
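  • A minimal sketch of determining the tracking point from a two-finger spreading gesture (coordinates are the contact points at the moment the fingers leave the screen):

        // Sketch: the tracking point is the midpoint of the line connecting
        // the two fingers' last contact points on the display screen.
        final class TrackingAreaDetector {
            float[] trackingPoint(float x1, float y1, float x2, float y2) {
                return new float[] { (x1 + x2) / 2f, (y1 + y2) / 2f };
            }
        }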
  • the electronic device zooms in and displays the display content of the first interface in a tracking zoom manner.
  • the tracking area is located within the first display frame.
  • the electronic device displays the enlarged first display content 11 or the first object in the first display frame with the tracking area as the center. It can be understood that if the tracking area determined according to the second input is different, a different part of the image is enlarged and displayed in the first display frame. Exemplarily, as shown in FIG. 11B, the first screen 21 of the display screen of the electronic device displays the first display content 11 (the user's face image) and the second display content 12 (the simulated image of the user's face with makeup).
  • the electronic device receives the two-finger spreading operation in the first display frame (when the two fingers leave the display screen, the midpoint of the line connecting the contact points of the two fingers on the display screen is the user's mouth), and in response to the two-finger spreading operation, the electronic device enlarges and displays the user's face image in a tracking-zoom manner.
  • the user's mouth is located at the center point of the first display frame.
  • the electronic device displays the enlarged second display content 12 or the second object in the second display frame with the corresponding position of the tracking area in the second display frame as the center.
  • the enlarged image displayed in the second display frame corresponds to the enlarged image displayed in the first display frame.
  • the enlarged image displayed in the second display frame is a simulated makeup image of the enlarged image displayed in the first display frame.
  • the tracking area is the display area where the user's eyes are located
  • the corresponding position of the tracking area in the second display frame is the display area where the eyes are located in the simulated image with makeup.
  • the first screen 21 of the display screen of the electronic device displays the first display content 11 (the user's face image) and the second display content 12 (the user's face simulation image with makeup).
  • the electronic device receives the two-finger spreading operation in the first display frame (when the two fingers leave the display screen, the midpoint of the line connecting the contact points of the two fingers on the display screen is the user's mouth), and in response to the two-finger spreading operation, the electronic device enlarges and displays the user's face image and the simulated image of the user's face with makeup in a tracking-zoom manner.
  • the user's mouth is located at the center point of the first display frame; in the enlarged user face simulation image with makeup, the user's mouth is located at the center point of the second display frame.
  • the tracking area is located within the second display frame.
  • the electronic device displays the enlarged second display content 12 or the second object in the second display frame centered on the tracking area.
  • the electronic device displays the enlarged first display content 11 or the first object in the first display frame, centered on the corresponding position of the tracking area in the first display frame (for example, if the tracking area is the display area where the eyes are located in the simulated image with makeup, the corresponding position of the tracking area in the first display frame is the display area where the user's eyes are located).
  • displaying the magnified image in the first display frame and displaying the magnified image in the second display frame are performed simultaneously.
  • the dynamic effect of displaying the enlarged image in the first display frame and the dynamic effect of displaying the enlarged image in the second display frame are played synchronously.
  • the magnification n is a default value, such as 2, 3, 5, or 10.
  • in the enlarged dynamic user face image displayed in the first display frame, the tracking area remains displayed at the center point of the first display frame; in the enlarged dynamic simulated image of the user's face with makeup displayed in the second display frame, the tracking area remains displayed at the center point of the second display frame.
  • the tracking area is the user's mouth.
  • the graphics tracking engine locates the position of the mouth in each frame of the image through the image recognition algorithm.
  • the magnified face image of the user with the mouth position as the center point is displayed in the first display frame of the display screen.
  • an enlarged image of the user's face with makeup centered on the position of the mouth is displayed in the second display frame of the display screen.
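  • A minimal sketch of this per-frame tracking zoom (the locator is a hypothetical stand-in for the graphics tracking engine's image recognition):

        // Sketch: for each frame, locate the tracked part (here, the
        // mouth), then translate the frame scaled by n so that the tracked
        // point lands at the display-frame center (cx, cy).
        final class TrackingZoom {
            interface Locator { float[] locate(int[] frame); } // hypothetical

            float[] translationFor(int[] frame, Locator locator,
                                   float n, float cx, float cy) {
                float[] p = locator.locate(frame); // tracked point per frame
                return new float[] { cx - n * p[0], cy - n * p[1] };
            }
        }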
  • a dotted frame may be superimposed on the enlarged and displayed image, which is used to indicate the makeup part and shape, and guide the user to apply makeup.
  • a dashed frame 110a is superimposed on the mouth area of the enlarged simulated image of the user's face with makeup displayed in the second display frame.
  • the dashed box 110a is used to indicate the position and shape of the makeup applied to the mouth.
  • an enlarged image of the user's face simulating wearing makeup is displayed in the second display frame of the display screen of the electronic device.
  • the electronic device receives the user's operation of clicking the simulation button so that the simulation button is in the pressed state, and displays a dashed frame at the corresponding position of the enlarged simulated image of the user's face with makeup according to the makeup part indicated by the makeup step guide information 132.
  • if the simulated image of the user's face with makeup displayed by the electronic device includes a dashed frame, then when the first input or the second input is received, the dashed frame is enlarged along with the simulated image of the user's face with makeup.
  • the dashed box may move with the movement of the enlarged image of the user's face so that the dashed box is always located at the indicated location.
  • the embodiment of the present application provides a display method applied to an electronic device.
  • the user can select different makeup parameters on the display interface of the makeup application, so that the second display content 12 (the user's face with makeup image) presents corresponding effects.
  • the user can apply makeup according to the effect of the makeup image on the user's face.
  • the display interface of the makeup application includes a third display content 13, and the third display content 13 is used for displaying makeup auxiliary information.
  • the makeup assistance information may include a makeup palette 131 that includes various makeup parameters.
  • the makeup palette 131 includes a "recommended” option 1310, a "partial makeup” option 1320, a "whole makeup” option 1330, a "favorite” option 1340, and a "custom” option 1350 etc. The user can click on any of the options to open the corresponding page.
  • the "overall makeup” page includes one or more examples of overall makeup, and each example of overall makeup includes parameters of each partial makeup corresponding to the overall makeup.
  • the user may select one of the examples of overall makeup, so that the second display content 12 presents a corresponding overall makeup effect.
  • the electronic device displays the "whole makeup” page 1331 shown in (b) of FIG. 12 .
  • the "overall makeup” page 1331 includes “retro”, “fashion”, “fresh”, “nude”, “peach blossom”, “smoky” and other overall makeup examples.
  • the "fresh" style of the overall makeup has a foundation color of "#1", an eyeliner color of "black 01", an eyebrow color of "light gray", an eye shadow color of "peach", a lip gloss color of "RD06", a blush color of "light pink", and so on.
  • the simulated makeup of the second display content 12 (the image of the user's face with makeup) is presented in the "fresh" style: its foundation color is "#1", the eyeliner color is "Black 01", the eyebrow color is "Light Grey", the eye shadow color is "Peach", the lip gloss color is "RD06", the blush color is "Light Pink", etc.
  • the user clicks an icon of a partial makeup item in the overall makeup example, and the electronic device displays the corresponding makeup step guide information 132 .
  • the electronic device displays the interface shown in (c) of FIG. 12
  • the interface includes the makeup step guide information "Step 5/12: apply RD06 lip gloss all over the upper lip."
  • the "Partial Makeup” page includes multiple partial makeup options.
  • the "Partial Makeup” page includes multiple options such as “Foundation”, “Eyeliner”, “Brows”, “Eye Shadow”, “Lip Gloss”, and “Blush”.
  • each partial makeup option includes one or more makeup parameters.
  • the user can select one of the makeup parameters, so that the second display content 12 presents a corresponding partial makeup effect.
  • the user may click on the "Partial Makeup" option 1320 of the make-up palette.
  • the electronic device displays a "partial makeup” page 1321; wherein, the "partial makeup” page 1321 includes “foundation”, “eyeliner”, “eyebrow”, “eye shadow”, “lip gloss”, “blush”, etc.
  • the electronic device supports the user to add custom makeup parameters on the "partial makeup” page.
  • the "Partial Makeup" page 1321 includes a "Lip Gloss” option
  • the "Lip Gloss” page 1322 includes multiple lip gloss colors.
  • the user can long press (for example, press the display screen for more than 3 seconds) on the blank space of the "Lip Gloss” page 1322, and the electronic device receives the operation of long pressing the blank space on the "Lip Gloss” page 1322, and displays the add icon 1323.
  • in response to receiving the operation of the user clicking the add icon 1323, the electronic device displays a text box 1324 and an "Open" button 1325 on the "Lip Gloss" page 1322.
  • the user may enter a file location or path in text box 1324 and click on "Open” button 1325.
  • the electronic device receives the operation of clicking the "Open” button 1325, and in response to the operation of clicking the "Open” button 1325, a picture 1326 is displayed on the "Lip Gloss" page 1322.
  • picture 1326 includes the color and name of a lip gloss "Huawei Red”.
  • the electronic device can receive the user's double-click operation on the picture 1326, and in response to the double-click operation, a "Huawei Red" option 1327 is added to the "Lip Gloss" page 1322; the color displayed by the "Huawei Red" option 1327 is the color of the lip gloss "Huawei Red" in the picture 1326, and the name of the "Huawei Red" option 1327 is the name of the lip gloss "Huawei Red" in the picture 1326 (or a name specified by the user).
  • the "Recommendations" page includes one or more examples of overall makeup.
  • the one or more overall makeup examples are generated by the electronic device based on the user's facial features. For example, the matching eyebrow shape in the overall makeup example is generated according to the user's face shape, and the corresponding foundation color in the overall makeup example is generated according to the user's skin color.
  • the user may click on the "recommended" option 1310 of the makeup palette.
  • the electronic device displays a "recommendation” page 1311; the "recommendation” page 1311 includes various overall makeup examples such as "shopping", “dating", "occupation", and "sports".
  • Each of the multiple overall makeup examples is generated based on the user's facial features. The user may select one of the examples of overall makeup, so that the second display content 12 presents a corresponding overall makeup effect.
  • the Favorites page includes one or more examples of overall makeup.
  • the one or more overall makeup examples are saved in the "Favorites" page according to the user's selection.
  • the electronic device receives an operation of the user pressing the "fresh" option for a long time (for example, the pressing time is longer than 3 seconds); or, as shown in FIG. 15A, while the second display content 12 presents the overall makeup effect corresponding to the "fresh" option on the "overall makeup" page 1331, the electronic device receives the user's long-press operation on the display screen (the pressing area is located in the second display frame); the electronic device then adds the "fresh" style overall makeup to the "favorites" page.
  • the user may click on the "favorite” option 1340 of the makeup palette.
• the electronic device displays a "Favorites" page 1341; the "Favorites" page 1341 includes one or more saved overall makeup examples.
• the "Favorite 5" option is the "fresh"-style overall makeup saved by the method of FIG. 15A.
  • the user may select an example of overall makeup in the "Favorites" page 1341, so that the second display content 12 presents a corresponding overall makeup effect.
• the electronic device may generate a customized overall makeup example, or makeup parameters of a partial makeup, according to a picture uploaded by the user.
• the user can select a custom overall makeup example to make the second display content 12 present the corresponding overall makeup effect, and can also select a custom makeup parameter to make the second display content 12 present the corresponding partial makeup effect.
  • the electronic device receives a picture uploaded by the user, and the picture includes an image of a face with makeup.
• the electronic device (for example, the makeup extraction control module in FIG. 4) extracts the feature data of the face image in the picture (for example, shape features: face shape, eyebrow shape, lip shape, etc.; color features: foundation color, eye shadow color, eyebrow color, etc.).
  • the electronic device creates makeup parameters (such as eye shadow, eyeliner, lip gloss, eyebrow shape, blush, etc.) based on the feature data (shape feature, color feature, etc.).
• the makeup parameters of each part of the face may be packaged and saved as an overall makeup; optionally, the makeup parameters of each part of the face may be saved individually as the makeup parameters of a partial makeup; both paths are sketched below.
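The extract-then-package flow of the last few bullets might look like the sketch below. Here `detect_landmarks` stands in for any face-landmark detector, and the region names, parameter fields, and JSON layout are assumptions made for illustration.

```python
from dataclasses import dataclass, asdict
import json
import numpy as np

@dataclass
class MakeupParams:
    foundation: str
    eyebrow_color: str
    eye_shadow: str
    lip_gloss: str
    blush: str

def average_hex(image, region_mask) -> str:
    """Mean color over one facial region, as a hex string."""
    r, g, b = np.asarray(image)[region_mask].reshape(-1, 3).mean(axis=0).astype(int)
    return f"#{r:02X}{g:02X}{b:02X}"

def extract_makeup(image, detect_landmarks) -> MakeupParams:
    regions = detect_landmarks(image)  # e.g. {"cheek": mask, "lips": mask, ...}
    color = lambda name: average_hex(image, regions[name])
    return MakeupParams(
        foundation=color("cheek"),
        eyebrow_color=color("eyebrow"),
        eye_shadow=color("eyelid"),
        lip_gloss=color("lips"),
        blush=color("cheekbone"),
    )

def save_overall(params: MakeupParams, path: str) -> None:
    # Package all per-part parameters together as one overall makeup.
    with open(path, "w") as f:
        json.dump(asdict(params), f)

def save_partial(params: MakeupParams, part: str, path: str) -> None:
    # Save a single part (e.g. "blush") as a standalone partial makeup.
    with open(path, "w") as f:
        json.dump({part: asdict(params)[part]}, f)
```

Under these assumptions, `save_overall` corresponds to the "Save All" path and `save_partial` to the "Save Part" path described later in this section.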
  • the electronic device receives the user's click operation on the “custom” option 1350 .
• the electronic device displays a "Custom" page 1351.
• the user can open a picture from the "Custom" page 1351.
  • the "Customize” page 1351 includes a text box 1352.
  • the user can input the save location or path of the picture in the text box 1352, and click the "Open” button 1353.
• the electronic device receives the click on the "Open" button 1353 and, in response, displays a "Custom" page 1354.
  • the "Custom” page 1354 includes one or more pictures.
  • the electronic device receives the operation of the user clicking on a picture, extracts the feature data of the face image in the picture, and creates makeup parameters according to the feature data.
  • the electronic device displays a "customize” page 1355.
  • the "Custom” page 1355 includes makeup parameters generated by the electronic device from the picture.
• the "Custom" page 1355 includes a "Save All" button 1356, a "Save Part" button 1357, and a "Cancel" button 1358; the "Save All" button 1356 is used to package and save, as an overall makeup, the makeup parameters generated by the electronic device according to the picture.
• the "Save Part" button 1357 is used to save the makeup parameters generated by the electronic device according to the picture as the makeup parameters of partial makeups.
• the "Cancel" button 1358 is used to discard the makeup parameters generated by the electronic device according to the picture.
• the electronic device receives the user's click operation on the "Save All" button 1356 and, in response, packages and saves, as an overall makeup, the makeup parameters included on the "Custom" page 1355 (foundation color "#3", eyeliner color "Black 01", eyebrow color "Light Brown", eye shadow color "Golden Brown", lip gloss color "RD04", and blush color "Pink Orange").
• as shown in FIG. 16B, an example of "Custom 1" overall makeup is added to the "Overall Makeup" page 1331.
• the electronic device receives the user's click operation on the "Save Part" button 1357 and, in response, saves the makeup parameters selected by the user on the "Custom" page 1355 (eye shadow color "Golden Brown" and blush color "Pink Orange") individually as the makeup parameters of partial makeups.
• the "Pink Orange" option is added to the blush colors of the "Partial Makeup" page 1321.
• the "Golden Brown" option is added to the eye shadow colors of the "Partial Makeup" page 1321.
  • the above picture may include a complete face image with makeup, or only a partial face image, or only a part of the face (such as eyebrows, mouth, nose, etc.).
  • the electronic device may generate a customized overall makeup example or makeup parameters of a partial makeup according to the display content in the first display frame.
  • what is displayed in the first display frame is an image of the user's face with makeup on.
• the electronic device receives an operation of extracting the user's facial makeup parameters (for example, a knuckle click on the area in the first display frame of the display screen), and then generates a customized overall makeup example or partial makeup parameters based on the image of the user's face with makeup.
  • the camera of the electronic device collects an image of the user's face with makeup, and displays the image of the user's face with makeup in the first display frame.
• the electronic device receives the operation of extracting the user's facial makeup parameters, and the electronic device (for example, the makeup extraction control module in FIG. 4) extracts the feature data of the face image in the image of the user's face with makeup (for example, shape features: face shape, eyebrow shape, lip shape, etc.; color features: foundation color, eye shadow color, eyebrow color, etc.).
  • the electronic device creates makeup parameters (such as eye shadow, eyeliner, lip gloss, eyebrow shape, blush, etc.) based on the feature data (shape feature, color feature, etc.).
• the makeup parameters of each part of the face can be packaged and saved as an overall makeup; optionally, the makeup parameters of each part of the face can be saved individually as the makeup parameters of a partial makeup.
  • the first screen 21 of the display screen of the electronic device displays an image of the user's face with makeup and a simulated image of the user's face with makeup.
• the electronic device receives the user's knuckle click on the area in the first display frame of the display screen and, in response, extracts the feature data of the face image in the image of the user's face with makeup and creates makeup parameters based on the feature data.
  • the electronic device packs and saves the generated makeup parameters as an overall makeup.
• An example of "Custom 2" overall makeup is added on the "Overall Makeup" page 1331.
• the electronic device may receive the user's modification of the makeup parameters; for example, the user may modify the makeup effect of any part of the simulated image with makeup on the user's face in the second display frame (for example, eyebrow shape, lip shape, or eye shadow color).
  • the electronic device stores the modified makeup parameters.
  • the modified makeup parameters can be saved separately.
  • the overall makeup after modifying the makeup parameters may be saved as an example of the overall makeup.
• the electronic device can receive the user's modification of the makeup parameters on the second display content 12 that is dynamically displayed, on the second display content 12 that is solidified and displayed, or on the second display content 12 that is displayed in an enlarged manner; this is not limited in this embodiment of the present application.
• the electronic device receives the user's modification operation on the blush in the second display frame on the display screen (for example, a single-finger long-press-and-drag operation, or a click-and-drag operation), and in response, the blush shape in the simulated image with makeup on the user's face displayed in the second display frame changes (the blush shape follows the drag position of the finger).
• the display interface in the second display frame includes a "save" icon 1701 and a "cancel" icon 1702; the "save" icon 1701 is used to save the modified makeup parameters of the simulated image with makeup on the user's face, and the "cancel" icon 1702 is used to discard them.
  • the electronic device receives the user's click operation on the "save” icon 1701, and in response to the user's click operation on the "save” icon 1701, saves the modified makeup parameters of the user's face simulation image with makeup.
• the modified makeup parameters of the simulated image with makeup on the user's face are saved as an example of the overall makeup, which the user can view in the overall makeup section of the makeup palette.
  • the sensor of the electronic device detects the user's single-finger long-press and drag operation on the display screen, and notifies the sensor service through the sensor driver.
• the sensor service determines that the user's single-finger long-press-and-drag operation on the screen has been received, generates a makeup modification event (which includes the finger drag position), and reports it to the event distribution management service.
• the event distribution management service distributes the makeup modification event (which includes the finger drag position) to the makeup area layout display module.
• the makeup area layout display module generates the modified simulated image with makeup on the user's face according to the drag position of the finger, as sketched below.
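In the terms of FIG. 4, the drag-to-modify path could be wired together roughly as follows. The class names mirror the modules named above, but the bodies are illustrative stand-ins, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class MakeupModificationEvent:
    drag_position: tuple  # (x, y) finger position reported by the sensor service

class MakeupAreaLayoutDisplay:
    """Stand-in for the makeup area layout display module."""
    def handle(self, event: MakeupModificationEvent) -> None:
        # Regenerate the simulated made-up face with the blush shape
        # following the current drag position.
        print(f"redraw blush around {event.drag_position}")

class EventDistributionManager:
    """Stand-in for the event distribution management service."""
    def __init__(self):
        self._subscribers = []
    def subscribe(self, module) -> None:
        self._subscribers.append(module)
    def distribute(self, event) -> None:
        for module in self._subscribers:
            module.handle(event)

class SensorService:
    """Receives gestures from the sensor driver and reports events."""
    def __init__(self, dispatcher: EventDistributionManager):
        self._dispatcher = dispatcher
    def on_long_press_drag(self, x: int, y: int) -> None:
        self._dispatcher.distribute(MakeupModificationEvent((x, y)))

dispatcher = EventDistributionManager()
dispatcher.subscribe(MakeupAreaLayoutDisplay())
SensorService(dispatcher).on_long_press_drag(420, 310)  # prints a redraw notice
```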
  • the sensor of the electronic device detects the user's click operation on the "save” icon 1701 on the display screen, and notifies the sensor service through the sensor driver.
• the sensor service determines that the user's click operation on the "save" icon 1701 on the display screen has been received, generates a save-makeup-parameters event, and reports it to the event distribution management service.
• the event distribution management service distributes the save-makeup-parameters event to the makeup extraction control module; the makeup extraction control module analyzes the simulated image with makeup on the user's face in the second display frame to obtain the makeup parameters, and saves them.
  • the electronic device detects that the surrounding environment is dark (for example, a sensor of the electronic device detects that the surrounding ambient light brightness is less than a preset value).
• the electronic device displays fill-light effects (such as a highlighted colored ring or colored lights) in the peripheral area of its display screen. In this way, the user is helped to achieve a fill-light effect when looking in the mirror while applying makeup, further improving the user experience; a threshold sketch follows.
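A minimal sketch of the fill-light trigger, assuming a lux threshold and a ring-drawing interface that are both invented for illustration:

```python
LUX_THRESHOLD = 50.0  # hypothetical preset value for "dark surroundings"

class PeripheralRing:
    """Stand-in for whatever draws the colored ring at the display's edge."""
    def show(self, color: str) -> None:
        print(f"fill-light ring on: {color}")
    def hide(self) -> None:
        print("fill-light ring off")

def on_ambient_light(lux: float, ring: PeripheralRing) -> None:
    # Dark surroundings -> light the peripheral area to fill the face.
    if lux < LUX_THRESHOLD:
        ring.show("warm white")
    else:
        ring.hide()

on_ambient_light(12.0, PeripheralRing())  # prints "fill-light ring on: warm white"
```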
  • the above-mentioned electronic device includes corresponding hardware structures and/or software modules for executing each function.
• the embodiments of the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and the design constraints of the technical solution. Those skilled in the art may use different methods for each specific application to implement the described functions, but such implementations should not be considered beyond the scope of the embodiments of the present application.
  • the electronic device may be divided into functional modules according to the foregoing method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
• the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
  • FIG. 19 shows a possible schematic structural diagram of the electronic device involved in the above embodiment.
  • the electronic device 2000 includes: a processing unit 2001 , a display unit 2002 and a storage unit 2003 .
  • the processing unit 2001 is used to control and manage the actions of the electronic device 2000 .
• for example, it can be used to perform the processing steps of image solidification on the dynamically displayed content of the display interface of the makeup application; the processing steps of enlarging the display content of the display interface of the makeup application; the processing steps of extracting makeup parameters, controlling the makeup area layout, controlling the makeup palette layout, controlling makeup mode switching, and controlling the makeup mirror fill light; and/or other processes used in the techniques described herein.
  • the display unit 2002 is used to display the interface of the electronic device 2000 .
  • it can be used to display the user's face image in the first display frame, display the user's face simulation image with makeup in the second display frame, and display auxiliary makeup information.
  • the storage unit 2003 is used for storing program codes and data of the electronic device 2000 .
  • the unit modules in the above electronic device 2000 include, but are not limited to, the above processing unit 2001 , the display unit 2002 and the storage unit 2003 .
  • the electronic device 2000 may further include a detection unit and the like.
  • the detection unit may be used to detect user's actions, gestures, and the like.
• the processing unit 2001 may be a processor or a controller, for example, a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
• the processor may include an application processor and a baseband processor. It may implement or execute the various exemplary logical blocks, modules, and circuits described in connection with this disclosure.
  • the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
  • the display unit 2002 may be a display screen.
  • the storage unit 2003 may be a memory.
  • the detection unit may be a sensor, a touch device, a camera, and the like.
• the processing unit 2001 may be a processor (the processor 110 shown in FIG. 3), the display unit 2002 may be a display screen (the display screen 194 shown in FIG. 3; the display screen 194 may be a touch screen, which may integrate a display panel and a touch panel), the storage unit 2003 may be a memory (the internal memory 121 shown in FIG. 3), and the detection unit may include a sensor (the sensor module 180 shown in FIG. 3) and a camera (the camera 193 shown in FIG. 3).
  • the electronic device 2000 provided in this embodiment of the present application may be the electronic device 300 shown in FIG. 3 .
  • the above-mentioned processor, display screen, memory, etc. can be coupled together, for example, connected through a bus.
• Embodiments of the present application further provide a computer-readable storage medium, where computer program code is stored in the computer-readable storage medium; when the computer program code runs on an electronic device, the electronic device executes the relevant method steps in the above-mentioned embodiments.
  • Embodiments of the present application also provide a computer program product, which when the computer program product runs on a computer, causes the computer to execute the relevant method steps in the foregoing embodiments.
• the electronic device 2000, the computer-readable storage medium, and the computer program product provided in the embodiments of the present application are all used to execute the corresponding methods provided above; therefore, for the beneficial effects that they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
• multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
• the units described as separate components may or may not be physically separated, and the components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware, and can also be implemented in the form of software functional units.
• if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
• the technical solutions of the embodiments of the present application, in essence, or the parts contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the various embodiments of the present application.
• the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Image Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a display method applied to an electronic device, and to an electronic device, relating to the technical field of terminals. The method comprises: displaying, in a first area of a display screen of an electronic device, a facial image of a user collected by a camera; displaying, in a second area of the display screen, a facial simulation makeup image of the user generated according to the facial image of the user; and receiving a first input and, in response to the first input, displaying in the first area an enlarged image of at least part of the facial image of the user and displaying in the second area an enlarged image of at least part of the facial simulation makeup image of the user. In this way, a user can apply makeup with reference to the facial simulation makeup image of the user; and the facial image of the user and the facial simulation makeup image of the user can also be displayed in an enlarged manner, so that the images are more convenient to view and refined makeup application by the user is facilitated.
PCT/CN2021/108283 2020-08-27 2021-07-23 Procédé d'affichage appliqué à un dispositif électronique, et dispositif électronique WO2022042163A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010880184.0A CN114115617B (zh) 2020-08-27 2020-08-27 Display method applied to an electronic device, and electronic device
CN202010880184.0 2020-08-27

Publications (1)

Publication Number Publication Date
WO2022042163A1 true WO2022042163A1 (fr) 2022-03-03

Family

ID=80352610

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/108283 WO2022042163A1 (fr) 2020-08-27 2021-07-23 Procédé d'affichage appliqué à un dispositif électronique, et dispositif électronique

Country Status (2)

Country Link
CN (1) CN114115617B (fr)
WO (1) WO2022042163A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315165A (zh) * 2023-11-28 2023-12-29 成都白泽智汇科技有限公司 Intelligent makeup assistance display method based on a display interface

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000076398A1 (fr) * 1999-06-14 2000-12-21 The Procter & Gamble Company Skin imaging and analysis systems and methods
CN109658167A (zh) * 2017-10-10 2019-04-19 阿里巴巴集团控股有限公司 Makeup try-on mirror device and control method and apparatus therefor
CN110045872A (zh) * 2019-04-25 2019-07-23 廖其锋 Daily-use smart mirror and method of use
CN111047384A (zh) * 2018-10-15 2020-04-21 北京京东尚科信息技术有限公司 Information processing method for a smart device, and smart device
CN111553220A (zh) * 2020-04-21 2020-08-18 海信集团有限公司 Smart device and data processing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6971824B2 (ja) * 2017-12-13 2021-11-24 キヤノン株式会社 Display control device and control method therefor

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000076398A1 (fr) * 1999-06-14 2000-12-21 The Procter & Gamble Company Skin imaging and analysis systems and methods
CN109658167A (zh) * 2017-10-10 2019-04-19 阿里巴巴集团控股有限公司 Makeup try-on mirror device and control method and apparatus therefor
CN111047384A (zh) * 2018-10-15 2020-04-21 北京京东尚科信息技术有限公司 Information processing method for a smart device, and smart device
CN110045872A (zh) * 2019-04-25 2019-07-23 廖其锋 Daily-use smart mirror and method of use
CN111553220A (zh) * 2020-04-21 2020-08-18 海信集团有限公司 Smart device and data processing method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117315165A (zh) * 2023-11-28 2023-12-29 成都白泽智汇科技有限公司 Intelligent makeup assistance display method based on a display interface
CN117315165B (zh) * 2023-11-28 2024-03-12 成都白泽智汇科技有限公司 Intelligent makeup assistance display method based on a display interface

Also Published As

Publication number Publication date
CN114115617A (zh) 2022-03-01
CN114115617B (zh) 2024-04-12

Similar Documents

Publication Publication Date Title
EP4057135A1 (fr) Display method for an electronic device having a foldable screen, and electronic device
US20230046708A1 (en) Application Interface Interaction Method, Electronic Device, and Computer-Readable Storage Medium
WO2021000881A1 (fr) Screen splitting method and electronic device
EP3846427B1 (fr) Control method and electronic device
WO2021036585A1 (fr) Flexible screen display method, and electronic device
CN114397981A (zh) Application display method and electronic device
WO2021036770A1 (fr) Split-screen processing method and terminal device
US20230276014A1 (en) Photographing method and electronic device
WO2021063098A1 (fr) Touchscreen response method, and electronic device
WO2022037726A1 (fr) Split-screen display method and electronic device
CN113973189B (zh) Display content switching method and apparatus, terminal, and storage medium
CN113935898A (zh) Image processing method and system, electronic device, and computer-readable storage medium
WO2022143180A1 (fr) Collaborative display method, terminal device, and computer-readable storage medium
WO2022007707A1 (fr) Home device control method, terminal device, and computer-readable storage medium
WO2021042878A1 (fr) Photographing method and electronic device
CN113986070A (zh) Method for quickly viewing application cards, and electronic device
WO2022042163A1 (fr) Display method applied to an electronic device, and electronic device
EP4390643A1 (fr) Preview method, electronic device, and system
WO2022078116A1 (fr) Brush effect image generation method, image editing method and device, and storage medium
WO2022002213A1 (fr) Translation result display method and apparatus, and electronic device
WO2022135273A1 (fr) Method for invoking capabilities of other devices, electronic device, and system
CN115328592B (zh) Display method and related apparatus
WO2024109573A1 (fr) Floating window display method and electronic device
WO2024037542A1 (fr) Touch input method and system, electronic device, and storage medium
WO2023098417A1 (fr) Interface display method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21860003

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21860003

Country of ref document: EP

Kind code of ref document: A1