WO2020134891A1 - Photo preview method for electronic device, graphical user interface, and electronic device - Google Patents

Photo preview method for electronic device, graphical user interface, and electronic device

Info

Publication number
WO2020134891A1
WO2020134891A1 (PCT/CN2019/122516)
Authority
WO
WIPO (PCT)
Prior art keywords
image
electronic device
person
photographed
preview
Prior art date
Application number
PCT/CN2019/122516
Other languages
English (en)
Chinese (zh)
Inventor
刘梦莹
孙雨生
王波
钟顺才
肖喜中
朱聪超
吴磊
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2020134891A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces

Definitions

  • the invention relates to the field of electronic technology, in particular to a photo preview method applied to electronic devices, a graphical user interface, and electronic devices.
  • the object of the present invention is to provide a photo preview method, graphical user interface (GUI), and electronic device, which can enable the user to beautify the person being photographed during the photo preview process; the operation is intuitive, simple, and effective, and can improve the utilization rate of electronic equipment.
  • a photo preview method for an electronic device may have a 3D camera module.
  • the method may include: the electronic device may turn on the 3D camera module, and collect the first image of the photographed person through the 3D camera module.
  • the electronic device may display a first graphical user interface, and the first graphical user interface may include a preview frame, and the preview frame displays the first image of the photographed person.
  • the electronic device may detect the first user operation, and in response to the first user operation, may display the second image of the photographed person in the preview box of the first graphical user interface.
  • the contour of the first body part of the photographed person in the second image is adjusted according to the first body parameters
  • the contour of the second body part of the photographed person in the second image is adjusted according to the second body parameters
  • the first part is different from the second part
  • the first body parameters are different from the second body parameters. That is to say, the electronic device can adjust the contours of different body parts in the first image by using different body parameters.
  • the first graphical user interface may further include a first control, and the first user operation is a user operation, acting on the first control, for selecting a body beautification level; the higher the selected level, the greater the gap between the contour of the first part in the second image and the contour of the first part in the first image, and the greater the gap between the contour of the second part in the second image and the contour of the second part in the first image.
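The relation between the selected level and the contour gap can be sketched as a simple scaling of per-part body parameters (the 0–10 level range, part names, and parameter values below are illustrative assumptions, not taken from this application):

```python
def contour_adjustment(level, body_params):
    """Scale each body part's base adjustment by the selected level.

    level: assumed 0-10 slider value; 0 leaves contours unchanged.
    body_params: hypothetical per-part base adjustment ratios.
    """
    return {part: base * (level / 10.0) for part, base in body_params.items()}

# Different parts use different base parameters, matching the idea that
# the first body parameters differ from the second body parameters.
params = {"waist": -0.20, "legs": -0.10}   # negative = slimmer contour

low = contour_adjustment(7, params)
high = contour_adjustment(10, params)

# A higher selected level yields a larger gap from the original contour.
assert abs(high["waist"]) > abs(low["waist"])
assert abs(high["legs"]) > abs(low["legs"])
```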
  • the first user operation may specifically include a second user operation acting on the first control (such as a rightward sliding operation on the first control) and a third user operation acting on the first control (such as a leftward sliding operation on the first control).
  • the electronic device can detect a second user operation acting on the first control (such as a rightward sliding operation on the first control).
  • the second user operation can be used to increase the body beautification level corresponding to the person being photographed (such as increasing from body level 7 to body level 10).
  • the electronic device may refresh the display content in the preview box.
  • the body shape represented by the photographed person's body image in the preview frame after refreshing is more beautified than the body shape represented by the photographed person's body image in the preview frame before refreshing. That is to say, in response to the second user operation, the gap between the contour of the first part in the second image after refreshing and the contour of the first part in the first image is larger than the corresponding gap before refreshing.
  • likewise, the gap between the contour of the second part in the second image after refreshing and the contour of the second part in the first image is larger than the corresponding gap before refreshing.
  • the electronic device can detect a third user operation acting on the first control (such as a leftward sliding operation on the first control).
  • the third user operation can be used to reduce the body beautification level corresponding to the person being photographed (such as decreasing from body level 10 to body level 7); in response to the third user operation, the display content in the preview box can be refreshed.
  • the body shape of the photographed person's body image in the preview frame after refreshing is closer to the actual body shape of the photographed person than that in the preview frame before refreshing.
  • the gap between the contour of the first part in the second image after refreshing and the contour of the first part in the first image is smaller than the corresponding gap before refreshing.
  • likewise, the gap between the contour of the second part in the second image after refreshing and the contour of the second part in the first image is smaller than the corresponding gap before refreshing.
  • the electronic device may display a prompt message (such as the text "pregnant woman detected") in the preview box.
  • the photographed person is a pregnant woman.
  • the electronic device can use the beautification parameters corresponding to the selected beautification level to perform body treatment on the color images of other body parts of the photographed person (such as arms, legs, and shoulders), but does not perform body treatment on the belly image of the photographed person, so as to preserve the characteristics of the pregnant woman.
  • that is to say, the outline of the belly of the photographed person in the second image displayed in the preview frame matches the outline of the belly of the photographed person in the first image.
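The pregnant-woman exclusion amounts to applying body parameters to every part except the belly; a minimal sketch, in which the part names and parameter values are hypothetical:

```python
def apply_body_treatment(parts, body_params, excluded=("belly",)):
    """Return the adjustment ratio actually applied to each body part.

    Excluded parts (here the belly of a detected pregnant woman) get a
    zero adjustment, so their contour in the second image matches the
    first image. Part names and values are hypothetical.
    """
    return {p: (0.0 if p in excluded else body_params.get(p, 0.0))
            for p in parts}

result = apply_body_treatment(
    ["arms", "legs", "shoulders", "belly"],
    {"arms": -0.1, "legs": -0.15, "shoulders": 0.05, "belly": -0.2},
)
assert result["belly"] == 0.0     # belly contour preserved
assert result["legs"] == -0.15    # other parts still treated
```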
  • the electronic device may adjust the color value of the face skin of the person photographed in the second image in response to the detected fourth user operation.
  • the fourth user operation may be a user operation that turns on the “beauty skin” function and selects a specific skin beauty level (such as skin beauty level 7).
  • the electronic device may refresh the display content in the preview box; the image of the photographed person displayed in the refreshed preview box includes a face image and a body image, where the face image has undergone skin-beautification treatment, and the body image corresponds to the selected body beautification level.
  • the electronic device may also display the second control in the preview box.
  • the electronic device may display the first image in the preview box.
  • the electronic device may display the second image in the preview box. In this way, the user can compare the body shape represented by the first image with the body shape represented by the second image.
  • the electronic device may display the first prompt information in the preview box for prompting the gap between the contour of the first part in the second image and the contour of the first part in the first image.
  • the second image is the image of the photographed person after the body treatment displayed in the preview frame
  • the first image is the image of the photographed person collected by the 3D camera module.
  • the contours of the body parts of the photographed person in the first image are consistent with the actual contours. These gaps can be expressed as a difference in waist circumference, a difference in leg circumference, a difference in leg length, a difference in shoulder width, and so on.
  • the first prompt information may also be used to prompt one or more of the following manifestations: weight to be lost, required amount of exercise, and calories to be consumed.
  • the weight that needs to be lost indicates how much weight the person being photographed needs to lose to approach or reach the body shape represented by the body image displayed in the preview box.
  • the required amount of exercise indicates how much exercise the person being photographed needs to do to approach or reach the figure represented by the body image displayed in the preview box.
  • the calories that need to be consumed indicate how many calories the person being photographed needs to burn to approach or reach the body shape shown in the preview frame.
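These three prompt quantities can be tied together with rough conversions; the sketch below uses the common rule of thumb of about 7700 kcal per kilogram of body fat and an assumed burn rate of 8 kcal per minute, neither of which comes from this application:

```python
KCAL_PER_KG = 7700       # rough rule of thumb for 1 kg of body fat
BURN_KCAL_PER_MIN = 8    # assumed burn rate for moderate exercise

def fitness_prompts(weight_to_lose_kg):
    """Derive the three prompt quantities from the weight gap."""
    calories = weight_to_lose_kg * KCAL_PER_KG
    minutes = round(calories / BURN_KCAL_PER_MIN)
    return {"weight_kg": weight_to_lose_kg,
            "calories_kcal": calories,
            "exercise_min": minutes}

prompts = fitness_prompts(2.0)
assert prompts["calories_kcal"] == 15400
assert prompts["exercise_min"] == 1925
```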
  • the electronic device displays the third control and the second prompt message in the preview box.
  • the third control can be used to monitor user operations for sharing fitness information; the second prompt information can be used to prompt the user to click the third control to share fitness information with the person being photographed.
  • the electronic device may display a user interface for sharing fitness information.
  • the electronic device may share fitness information with the selected contact.
  • the electronic device may display the fourth control and the third prompt information in the preview box.
  • the third prompt information can be used to prompt the user to click the fourth control to view fitness information.
  • the electronic device may display fitness information.
  • the first graphical user interface may further include: a shooting control.
  • the electronic device may save the second image displayed in the preview frame in response to the detected user operation acting on the shooting control.
  • the first graphical user interface may further include: a shooting mode list, the shooting mode list includes multiple shooting mode options; the multiple shooting mode options include a first shooting mode option and a second shooting mode option. Among them:
  • the shooting control is specifically used to monitor the user operation that triggers photographing; in response to the detected user operation acting on the shooting control, the second image displayed in the preview box is saved, specifically including: in response to the detected user operation acting on the shooting control, saving the second image displayed in the preview box as a photo.
  • the shooting control is specifically used to monitor the user operation that triggers video recording; in response to the detected user operation acting on the shooting control, the second image displayed in the preview box is saved.
  • before displaying the second image of the photographed person in the preview box of the first graphical user interface in response to the first user operation, the electronic device may also identify human bone points based on the image of the photographed person.
  • the electronic device may further determine the angle between the photographed person and the plane where the electronic device is located, where the angle does not exceed a preset angle threshold.
  • the electronic device may also determine a perspective transformation matrix according to the angle between the photographed person and the plane where the electronic device is located, and perform perspective transformation on the first image of the person to be photographed according to the perspective transformation matrix.
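The application does not give the matrix itself; under a pinhole-camera assumption, the homography for a pure rotation by angle θ between the subject plane and the device can be sketched as H = K·R·K⁻¹ (the intrinsic parameters below are placeholders):

```python
import numpy as np

def perspective_matrix(theta_rad, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0):
    """Homography induced by tilting the image plane by theta about the
    horizontal axis: H = K @ R @ inv(K) under a pinhole model."""
    K = np.array([[fx, 0.0, cx],
                  [0.0, fy, cy],
                  [0.0, 0.0, 1.0]])
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,  -s],
                  [0.0,   s,   c]])
    return K @ R @ np.linalg.inv(K)

def warp_point(H, x, y):
    """Apply the perspective transformation to one pixel coordinate."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

# A zero angle gives the identity transform: pixels stay in place.
assert np.allclose(perspective_matrix(0.0), np.eye(3))
```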
  • a photo preview method for an electronic device may have a 3D camera module.
  • the method may include: the electronic device turns on the 3D camera module, and may collect the image of the person being photographed through the 3D camera module.
  • the electronic device can display a second graphical user interface.
  • the second graphical user interface includes: a preview box and a body template column; wherein the preview box displays a color image collected by the 3D camera module, and the body template column displays one or more body template options.
  • the electronic device may display the fifth control.
  • the electronic device may refresh the preview box.
  • the body shape represented by the photographed person's body image in the preview box after refreshing is closer to the body type represented by the selected body template than that represented by the body image in the preview box before refreshing.
  • the electronic device can refresh the preview frame; the body shape represented by the photographed person's body image in the preview frame after refreshing is closer to the actual body shape of the person being photographed than that represented before refreshing.
  • an electronic device may include a 3D camera module, a display screen, a touch sensor, a wireless communication module, a memory, and one or more processors.
  • the one or more processors are used to execute one or more computer programs stored in the aforementioned memory, wherein:
  • the 3D camera module can be used to collect the first image of the person being photographed
  • the display screen may be used to display a first graphical user interface; the first graphical user interface may include a preview frame, and the preview frame displays the first image of the photographed person collected by the 3D camera module.
  • the touch sensor may be used to detect the first user operation.
  • the display screen may be used to display the second image of the person being photographed in the preview box in response to the first user operation.
  • the contour of the first body part of the photographed person in the second image is adjusted according to the first body parameters
  • the contour of the second body part of the photographed person in the second image is adjusted according to the second body parameters
  • the first part is different from the second part
  • the first body parameters are different from the second body parameters. That is to say, the electronic device can adjust the contours of different body parts in the first image by using different body parameters.
  • an electronic device may include an apparatus, which may implement any possible implementation manner of the first aspect or any possible implementation manner of the second aspect.
  • a photo preview device which has a function to implement the behavior of an electronic device in the above method.
  • the above-mentioned functions can be realized by hardware, or by hardware executing corresponding software.
  • the above hardware or software includes one or more modules corresponding to the above functions.
  • a computer device, which includes a memory, a processor, and a computer program stored on the memory and operable on the processor, characterized in that, when the processor executes the computer program, the computer device implements any possible implementation manner of the first aspect or any possible implementation manner of the second aspect.
  • a computer program product containing instructions, characterized in that, when the computer program product runs on an electronic device, the electronic device is caused to execute any possible implementation manner of the first aspect or any possible implementation manner of the second aspect.
  • a computer-readable storage medium, which includes instructions, characterized in that, when the above instructions run on an electronic device, the above electronic device executes any possible implementation manner of the first aspect or any possible implementation manner of the second aspect.
  • FIG. 1A is a schematic structural diagram of an electronic device provided by an embodiment;
  • FIG. 1B is a schematic diagram of a software structure of an electronic device provided by an embodiment;
  • FIG. 2A is a schematic diagram of a user interface for an application menu on an electronic device provided by an embodiment;
  • FIG. 2B is a schematic diagram of a rear 3D camera module on an electronic device provided by an embodiment;
  • FIGS. 3A-3C are schematic diagrams of shooting scenes involved in this application;
  • FIGS. 3D-3E are schematic diagrams of a user interface of a character beautification function provided by an embodiment;
  • FIGS. 4A-4H are schematic diagrams of a UI for beautifying a person being photographed during a photo preview provided by an embodiment;
  • FIGS. 5A-5D are schematic diagrams of a UI for revoking body beautification during a photo preview provided by an embodiment;
  • FIGS. 6A-6G are schematic diagrams of a UI for pushing fitness information during a photo preview provided by an embodiment;
  • FIGS. 7A-7H are schematic diagrams of UIs for pushing fitness information during a photo preview provided by another embodiment;
  • FIGS. 8A-8B are schematic diagrams of a UI for comparing the body before and after beautification during a photo preview provided by an embodiment;
  • FIGS. 9A-9C are schematic diagrams of UIs for users to take photos provided by an embodiment;
  • FIGS. 10A-10G are schematic diagrams of a UI for beautifying a photographed person during video preview provided by an embodiment;
  • FIGS. 11A-11E are schematic diagrams of a UI for beautifying a photographed person using a body shape template provided by an embodiment;
  • FIGS. 12A-12C are schematic diagrams of a one-button UI for full-body beautification provided by an embodiment;
  • FIGS. 13A-13D are schematic diagrams of a UI for beautifying multiple people provided by an embodiment;
  • FIGS. 14A-14B are schematic diagrams of a UI for manual beautification provided by an embodiment;
  • FIGS. 15A-15D are schematic diagrams of a UI for beautifying the figures in captured pictures provided by an embodiment;
  • FIG. 16 is a schematic flowchart of a photo preview method provided by an embodiment;
  • FIGS. 17A-17C show pixel points in a color image, a depth image, and a 3D coordinate space, respectively;
  • FIG. 18 shows basic human bone points;
  • FIG. 19 is a schematic diagram of calculating the angle θ between the photographed person and the electronic device according to the depth values of the bone points and their 2D coordinates;
  • FIG. 20 is a schematic diagram of calculating the length of the bone between bone points according to the depth values of the bone points and their 2D coordinates;
  • FIG. 21 is a schematic diagram of dividing a color image collected by a 3D camera module into a color image of the person to be photographed (that is, a foreground image) and a background image;
  • FIG. 22 is a comparison diagram before and after image restoration;
  • FIG. 23 is an architectural diagram of functional modules included in an electronic device according to an embodiment.
  • the electronic device may be a portable electronic device that also contains other functions such as personal digital assistant and/or music player functions, such as a mobile phone, a tablet computer, and a wearable electronic device with wireless communication function (such as a smart watch) Wait.
  • portable electronic devices include, but are not limited to, portable electronic devices running various operating systems.
  • the above portable electronic device may also be other portable electronic devices, such as a laptop computer with a touch-sensitive surface or a touch panel. It should also be understood that in some other embodiments, the above electronic device may not be a portable electronic device, but a desktop computer with a touch-sensitive surface or a touch panel.
  • the term "user interface (UI)" in the specification, claims, and drawings of this application refers to a media interface for interaction and information exchange between an application or operating system and the user, which implements the conversion between the internal form of information and a form acceptable to the user.
  • the user interface of an application is source code written in a specific computer language such as Java or extensible markup language (XML).
  • the interface source code is parsed and rendered on the terminal device, and finally presented as content that can be recognized by the user.
  • Controls (also called widgets): typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
  • the attributes and contents of the controls in the interface are defined by tags or nodes.
  • XML uses nodes such as <Textview>, <ImgView>, and <VideoView> to specify the controls contained in the interface.
  • a node corresponds to a control or attribute in the interface, and the node is rendered as user-visible content after being parsed and rendered.
  • the interface of applications such as hybrid applications usually also contains web pages.
  • a web page, also known as a page, can be understood as a special control embedded in the application interface.
  • the web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), JavaScript (JS), etc.
  • web page source code can be loaded and displayed as user-recognizable content by the browser or a web page display component similar to the browser function.
  • the specific content contained in a web page is also defined by tags or nodes in the source code of the web page.
  • HTML uses tags such as <p>, <img>, <video>, and <canvas> to define the elements and attributes of a web page.
  • the following embodiments of the present application provide a photo preview method for an electronic device, a graphical user interface, and an electronic device, which enable the user to beautify the person being photographed during the camera preview process; the operation is intuitive, simple, and effective, and can improve the utilization rate of the electronic device.
  • the application “camera” of an electronic device such as a smartphone can provide two kinds of beautification functions: a "beauty skin" function and a "beauty body" function. Among them:
  • the "beauty skin" function can be used to adjust the face image of the person being photographed during the photo preview or video preview process, so that the face represented by the adjusted face image is beautified compared to the actual face of the person being photographed, for example by skin whitening, skin dermabrasion (such as removing acne, freckles, wrinkles, etc.), and so on.
  • the adjustment of the face image involved in the "beauty skin" function may refer to smoothing the face image using algorithms such as surface blur, mean filtering, and bilateral filtering. In the following embodiments of the present application, such processing performed on the face image may be referred to as skin-beautification treatment.
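Of the smoothing algorithms named above, mean filtering is the simplest; a numpy-only box-filter sketch (kernel size and edge padding are arbitrary choices, and unlike bilateral filtering it does not preserve edges):

```python
import numpy as np

def mean_filter(img, k=3):
    """k x k mean (box) filter over a 2-D grayscale face image, using
    edge padding so the output keeps the input's shape."""
    h, w = img.shape
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros((h, w))
    # Sum the k*k shifted windows, then divide by the window area.
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

# A uniform patch of skin is left unchanged; isolated noise is averaged out.
flat = np.full((5, 5), 128.0)
assert np.allclose(mean_filter(flat), flat)
```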
  • the "Beauty" function can be used to adjust the body image of the person being photographed during the photo preview or video preview process, so that the body shape of the adjusted body image is beautified compared to the actual body shape of the person being photographed.
  • Body beautification can include: beautifying the body proportions (such as lengthening the legs, widening the shoulders, etc.) and adjusting the body's fatness or thinness (such as slimming the waist, legs, belly, or buttocks, or treating the whole body, etc.).
  • the adjustment of the body image involved in the "beauty body" function may include: determining the target position to which a bone point needs to be adjusted according to the beautification parameters for the body proportions (such as the body parameters corresponding to lengthened legs, which will be introduced in the following content), and then using common image scaling algorithms such as bicubic, bilinear, and nearest-neighbor interpolation to scale the local body image between bone points, so that each bone point is located at its corresponding target position after the local body image is scaled, thereby achieving the purpose of beautifying the body proportions.
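A minimal version of one of the scaling algorithms named above (nearest-neighbor), applied to a grayscale local body image, as a numpy-only sketch:

```python
import numpy as np

def scale_nearest(img, out_h, out_w):
    """Nearest-neighbor scaling of a 2-D grayscale image array."""
    in_h, in_w = img.shape
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return img[rows[:, None], cols]

# Compressing a 4x4 patch (say, a local body image between two bone
# points) down to 2x2, and checking that identity scaling is lossless.
img = np.arange(16, dtype=np.uint8).reshape(4, 4)
assert scale_nearest(img, 2, 2).shape == (2, 2)
assert (scale_nearest(img, 4, 4) == img).all()
```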
  • the adjustment of the body image involved in the "beauty body" function may also include: using common image scaling algorithms such as bicubic, bilinear, and nearest-neighbor interpolation to scale the overall body image or a local body image of the person being photographed, so as to adjust the body's fatness or thinness.
  • the image processing related to leg slimming may include compressing the leg image using an image scaling algorithm, so that the leg represented by the compressed leg image is slimmer than the actual leg of the person being photographed.
  • the image processing related to waist shaping may include: using an image scaling algorithm to compress the middle portion of the waist image and stretch the upper and lower ends of the waist image.
  • the waist represented by the processed waist image is more curved than the actual waist of the person being photographed, and may be an S-shaped waist (thinner in the middle).
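A simplified numpy sketch of the waist remapping, which only narrows the middle rows and leaves the ends at full width (the scale profile is an assumption; the described method also stretches the upper and lower ends):

```python
import numpy as np

def reshape_waist(waist, mid_scale=0.8):
    """Resample each row of a waist image with a row-dependent horizontal
    scale: full width at the top and bottom, mid_scale at the middle row,
    yielding a waist that is thinner in the middle."""
    h, w = waist.shape
    out = np.zeros_like(waist)
    half = (h - 1) / 2
    for r in range(h):
        t = 1.0 - abs(r - half) / half        # 0 at the ends, 1 in the middle
        scale = 1.0 + (mid_scale - 1.0) * t
        new_w = max(1, int(round(w * scale)))
        cols = (np.arange(new_w) / scale).astype(int).clip(0, w - 1)
        start = (w - new_w) // 2              # keep the narrowed row centred
        out[r, start:start + new_w] = waist[r, cols]  # nearest-neighbor resample
    return out

waist = np.full((9, 10), 255, dtype=np.uint8)
out = reshape_waist(waist)
assert (out[0] > 0).sum() == 10   # end rows keep full width
assert (out[4] > 0).sum() < 10    # middle row is narrowed
```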
  • in the following embodiments, such processing performed on the body image may be referred to as body treatment.
  • body images (including images of various partial bodies, such as leg images) targeted for body treatment refer to color images of the body collected by the color camera module.
  • face image targeted by the beauty treatment also refers to the color image of the face collected by the color camera module.
  • the "Body” function analyzes the color images collected by the color camera module of the electronic device based on the bone point recognition technology to identify the bone points of the person being photographed, such as the head point, neck point, left shoulder point, and right Shoulder point.
  • the following content will introduce the skeleton point recognition technology and the basic skeleton points of the human body, which will not be repeated here.
  • the electronic device can analyze the body proportions and the fatness or thinness of the body shape of the person being photographed.
  • the electronic device can adjust the body image between the bone points of the photographed person during the camera preview, so that the figure represented by the adjusted body image is more beautified than the actual figure of the person.
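Given recognized bone points, body proportions can be estimated from distances between them; a crude sketch in which the bone-point names and the ratio are illustrative, not the application's actual analysis:

```python
import math

def bone_length(p, q):
    """Euclidean length between two 2-D bone points."""
    return math.dist(p, q)

def leg_to_height_ratio(points):
    """Crude proportion measure: leg length over head-to-ankle length."""
    leg = (bone_length(points["hip"], points["knee"])
           + bone_length(points["knee"], points["ankle"]))
    height = bone_length(points["head"], points["ankle"])
    return leg / height

# Hypothetical 2-D bone points in pixel coordinates.
pts = {"head": (0, 0), "hip": (0, 90), "knee": (0, 130), "ankle": (0, 170)}
assert abs(leg_to_height_ratio(pts) - 80 / 170) < 1e-9
```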
  • the photographer can beautify the photographed person during the camera preview (photo preview and video preview), the operation is intuitive, simple and effective, and the utilization rate of the electronic device can be improved.
  • the "beauty” function and/or the “beauty” function may be integrated into the "portrait” camera function and video recording function included in the "camera” application.
  • the "beauty” function and/or “beauty” function can also be used as a separate camera function in the “camera” application.
  • "Portrait” camera function is a camera function that is set when the subject is a person to highlight the person and enhance the beauty of the person in the captured picture.
  • the electronic device can use a larger aperture to keep the depth of field shallower to highlight the person, and can improve the color effect to optimize the person's skin color.
  • the electronic device can also turn on the flash to perform illumination compensation.
  • Camera is an application for capturing images on smart phones, tablets, and other electronic devices.
  • the application does not limit the name of the application.
  • the "portrait” camera function and video recording function may be the camera function included in the "camera” application.
  • the "camera” application can also include a variety of other camera functions.
  • the camera parameters such as aperture size, shutter speed, and sensitivity corresponding to different camera functions can be different, and different camera effects can be presented.
  • the camera function can also be called a camera mode, for example, the "portrait” camera function can also be called a "portrait” camera mode.
  • “beauty skin”, “beauty body”, and “portrait” are just some of the words used in this embodiment, and the meanings they represent have been recorded in this embodiment, and their names do not constitute any limitation to this embodiment.
  • “beauty skin” may also be referred to by other terms such as “face beauty” in other embodiments.
  • the "beauty body” mentioned in the embodiments of the present application may also be called other names such as "slimming shape” in other embodiments.
  • FIG. 1A shows a schematic structural diagram of the electronic device 100.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a 3D camera module 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a central processing unit (CPU), a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, a modem processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or may be integrated in one or more processors.
  • the electronic device 100 may also include one or more processors 110.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate the operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetch and execution.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the memory in the processor 110 is a cache memory.
•   the memory may store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instruction or data again, it can call it directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thereby improving the efficiency of the electronic device 100.
  • the processor 110 may include one or more interfaces.
•   Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
•   the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may respectively couple the touch sensor 180K, charger, flash, 3D camera module 193, etc. through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface, and realize the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, to realize the function of answering the phone call through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to implement the function of answering the phone call through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 to peripheral devices such as the display screen 194 and the 3D camera module 193.
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI) and so on.
  • the processor 110 and the 3D camera module 193 communicate through a CSI interface to implement the camera function of the electronic device 100.
  • the processor 110 and the display screen 194 communicate through the DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured via software.
  • the GPIO interface can be configured as a control signal or a data signal.
  • the GPIO interface may be used to connect the processor 110 to the 3D camera module 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specifications, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiments of the present invention is only a schematic description, and does not constitute a limitation on the structure of the electronic device 100.
  • the electronic device 100 may also use different interface connection methods in the foregoing embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive the charging input of the wired charger through the USB interface 130.
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100. While the charging management module 140 charges the battery 142, it can also supply power to the electronic device through the power management module 141.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, internal memory 121, external memory, display screen 194, 3D camera module 193, and wireless communication module 160.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the power management module 141 may also be disposed in the processor 110.
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • the wireless communication function of the electronic device 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive the electromagnetic wave from the antenna 1 and filter, amplify, etc. the received electromagnetic wave, and transmit it to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor and convert it to electromagnetic wave radiation through the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a high-frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110, and may be set in the same device as the mobile communication module 150 or other functional modules.
•   the wireless communication module 160 can provide wireless communication solutions applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and so on.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives the electromagnetic wave via the antenna 2, frequency-modulates and filters the electromagnetic wave signal, and sends the processed signal to the processor 110.
  • the wireless communication module 160 can also receive the signal to be transmitted from the processor 110, frequency-modulate it, amplify it, and convert it to electromagnetic waves through the antenna 2 to radiate it out.
  • the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
  • the antenna 1 of the electronic device 100 and the mobile communication module 150 are coupled, and the antenna 2 and the wireless communication module 160 are coupled so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
•   the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long-term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
•   the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
  • the electronic device 100 can realize a display function through a GPU, a display screen 194, and an application processor.
  • the GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations, and is used for graphics rendering.
  • the processor 110 may include one or more GPUs that execute instructions to generate or change display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
•   the display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the electronic device 100 can realize a camera function through a 3D camera module 193, an ISP, a video codec, a GPU, a display screen 194, an application processor AP, and a neural network processor NPU.
  • the 3D camera module 193 can be used to collect color image data and depth data of the subject.
  • the ISP can be used to process the color image data collected by the 3D camera module 193. For example, when taking a picture, the shutter is opened, and light is transmitted to the photosensitive element of the camera through the lens, and the optical signal is converted into an electrical signal. The photosensitive element of the camera transmits the electrical signal to the ISP for processing and converts it into an image visible to the naked eye.
  • ISP can also optimize the algorithm of image noise, brightness and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, the ISP may be set in the 3D camera module 193.
  • the 3D camera module 193 may be composed of a color camera module and a 3D sensing module.
•   the photosensitive element of the camera of the color camera module may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
•   the 3D sensing module may be a time-of-flight (TOF) 3D sensing module or a structured light 3D sensing module.
  • the structured light 3D sensing is an active depth sensing technology.
  • the basic components of the structured light 3D sensing module may include an infrared (Infrared) emitter, an IR camera module, and the like.
•   the working principle of the structured light 3D sensing module is to first project a specific pattern of light spots onto the object to be photographed, then capture the spot pattern reflected from the surface of the object, compare it with the originally projected pattern, and use the triangulation principle to calculate the three-dimensional coordinates of the object.
  • the three-dimensional coordinates include the distance between the electronic device 100 and the object to be photographed.
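Once a depth value has been triangulated, the pixel position together with that depth yields the three-dimensional coordinates mentioned above. The following is a minimal pinhole-camera sketch of that back-projection; the function name and the intrinsic parameters (fx, fy, cx, cy) are illustrative assumptions, not values from this disclosure:

```python
# Hypothetical pinhole-model back-projection: given a pixel (u, v) and a
# triangulated depth Z, recover camera-frame 3D coordinates (X, Y, Z).
# fx, fy are focal lengths in pixels; cx, cy is the principal point.
def back_project(u, v, depth, fx, fy, cx, cy):
    """Return (X, Y, Z) in metres for pixel (u, v) observed at `depth`."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# a pixel at the principal point maps to (0, 0, depth)
point = back_project(u=320, v=240, depth=2.0, fx=800.0, fy=800.0, cx=320.0, cy=240.0)
```

The Z component of the result is exactly the device-to-object distance that the structured light module measures.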
  • TOF 3D sensing is also an active depth sensing technology.
  • the basic components of TOF 3D sensing modules may include infrared (Infrared) transmitters, IR camera modules, and so on.
•   the working principle of the TOF 3D sensing module is to calculate the distance (that is, the depth) between the TOF 3D sensing module and the object to be captured from the round-trip time of the emitted infrared light, so as to obtain a 3D depth map.
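The round-trip relation described above reduces to distance = (speed of light × round-trip time) / 2. A hedged one-function sketch (the function name is an illustration, not from the patent):

```python
# Sketch of the TOF depth relation: the measured distance is half the
# infrared round-trip time multiplied by the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_seconds):
    """Depth in metres from the infrared round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

For example, a round trip of roughly 6.67 ns corresponds to a depth of about one metre.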
  • the structured light 3D sensing module can also be used in the fields of face recognition, somatosensory game consoles, industrial machine vision inspection, etc.
  • the TOF 3D sensing module can also be used in game consoles, augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) and other fields.
  • the 3D camera module 193 may also be composed of two or more cameras.
  • the two or more cameras may include a color camera, and the color camera may be used to collect color image data of the object to be photographed.
  • the two or more cameras may use stereo vision technology to collect depth data of the object being photographed.
•   Stereo vision technology is based on the parallax principle of human eyes. Under natural light, two or more cameras capture images of the same object from different angles, and triangulation and other operations are then performed to obtain the distance information between the electronic device 100 and the object being photographed, that is, the depth information.
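For two parallel cameras, the triangulation step above reduces to depth = focal length × baseline / disparity. A toy sketch under that assumption (the focal length and baseline defaults are made-up illustrative values):

```python
# Toy binocular-depth sketch: with two parallel cameras of focal length
# `focal_px` (pixels) separated by `baseline_m` (metres), a feature seen
# at horizontal pixel positions xl (left image) and xr (right image) has
# depth Z = focal_px * baseline_m / (xl - xr).
def stereo_depth(xl, xr, focal_px=700.0, baseline_m=0.05):
    """Depth in metres of a matched feature from its disparity."""
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_m / disparity
```

Note the inverse relationship: a smaller disparity means the feature is farther away, which is why depth precision degrades with distance.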
  • the electronic device 100 may include 1 or N 3D camera modules 193, where N is a positive integer greater than 1.
  • the electronic device 100 may include a front 3D camera module 193 and a rear 3D camera module 193.
•   the front 3D camera module 193 can usually be used to collect the color image data and depth data of the photographer facing the display screen 194, and the rear 3D camera module can be used to collect the color image data and depth data of the subject faced by the photographer (such as people, landscapes, etc.).
  • the CPU or GPU or NPU in the processor 110 can process the color image data and depth data collected by the 3D camera module 193.
•   the NPU can recognize the color image data collected by the 3D camera module 193 (specifically, the color camera module) through a neural network algorithm based on bone point recognition technology, such as a convolutional neural network (CNN) algorithm, to determine the bone points of the person being photographed.
  • the CPU or GPU can also run a neural network algorithm to determine the bone point of the person to be photographed based on the color image data.
•   the CPU, GPU, or NPU can also be used to determine the figure of the person being photographed (such as the body proportions and the fatness or thinness of the body parts between the bone points) based on the depth data collected by the 3D camera module 193 (specifically, the 3D sensing module) and the identified bone points, further determine body beautification parameters for the photographed person, and finally process the captured image of the photographed person according to the body beautification parameters so that the figure in the captured image is beautified. Subsequent embodiments will introduce in detail how to perform body shaping on the image of the photographed person based on the color image data and depth data collected by the 3D camera module 193, which will not be repeated here.
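As one hedged illustration of how bone points plus depth could yield a figure estimate and a beautification parameter, consider the toy sketch below. This is an assumption-laden illustration, not the algorithm detailed in the subsequent embodiments; all names and the ratio-based mapping are invented for exposition:

```python
import math

# Toy figure estimate (NOT the patent's algorithm): use two pairs of
# recognized bone points, each given as 3D coordinates derived from
# depth data, to compute a shoulder-to-hip width ratio, then map it to
# a hypothetical beautification strength in [0, 1].
def limb_length(p1, p2):
    """Euclidean distance between two 3D bone points (x, y, z in metres)."""
    return math.dist(p1, p2)

def beautify_strength(shoulder_l, shoulder_r, hip_l, hip_r):
    """Map the shoulder/hip width ratio into a [0, 1] strength value."""
    ratio = limb_length(shoulder_l, shoulder_r) / limb_length(hip_l, hip_r)
    return max(0.0, min(1.0, ratio - 1.0))
```

The point of the sketch is only that depth turns 2D bone points into metric 3D distances, from which proportion-based parameters can be derived.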
•   the digital signal processor is used to process digital signals. In addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the energy at the frequency point.
  • the video codec is used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
  • NPU is a neural-network (NN) computing processor.
  • the NPU can realize applications such as intelligent recognition of the electronic device 100, such as image recognition, face recognition, voice recognition, and text understanding.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, photos, videos and other data in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 may run the above instructions stored in the internal memory 121, so that the electronic device 100 executes the photo preview method of the electronic device provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store the operating system; the storage program area can also store one or more application programs (such as gallery, contacts, etc.) and so on.
  • the storage data area may store data (such as photos, contacts, etc.) created during use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and so on.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, and an application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and also used to convert analog audio input into digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
•   the speaker 170A, also called the "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
•   the receiver 170B, also known as the "handset", is used to convert audio electrical signals into sound signals.
  • the voice can be received by bringing the receiver 170B close to the ear.
•   the microphone 170C, also known as the "mic", is used to convert sound signals into electrical signals.
•   the user can input a sound signal into the microphone 170C by speaking with the mouth close to it.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C. In addition to collecting sound signals, it may also implement a noise reduction function. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the headset interface 170D is used to connect wired headsets.
•   the earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 194.
•   the capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the strength of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation intensities may correspond to different operation instructions. For example, when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
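The two-outcome behavior in the short-message example can be sketched as a simple threshold dispatch; the threshold value and action names below are assumptions for illustration, not values from the patent:

```python
# Sketch of the intensity-dependent dispatch: the same touch position on
# the short-message icon triggers different instructions depending on
# whether the pressure reaches the first pressure threshold.
FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized pressure

def on_message_icon_press(pressure):
    """Return the instruction executed for a press of the given intensity."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_new_short_message"
```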
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 around three axes (ie, x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the jitter angle of the electronic device 100, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to counteract the jitter of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
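One common way to compute altitude from a barometric reading is the international barometric formula; the sketch below assumes standard sea-level pressure and uses the usual textbook constants, which are not figures taken from this disclosure:

```python
# International barometric formula sketch: approximate altitude from a
# measured air pressure, assuming standard sea-level pressure p0.
SEA_LEVEL_HPA = 1013.25

def pressure_to_altitude_m(pressure_hpa, p0=SEA_LEVEL_HPA):
    """Approximate altitude in metres for a pressure reading in hPa."""
    return 44330.0 * (1.0 - (pressure_hpa / p0) ** (1.0 / 5.255))
```

At exactly sea-level pressure the formula returns 0 m; lower readings map to higher altitudes.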
  • the magnetic sensor 180D includes a Hall sensor.
•   the electronic device 100 can detect the opening and closing of a flip holster using the magnetic sensor 180D.
•   in some embodiments, when the electronic device 100 is a clamshell device, the electronic device 100 may detect the opening and closing of the clamshell according to the magnetic sensor 180D, and then set features such as automatic unlocking upon opening of the flip cover accordingly.
•   the acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in various directions (generally along three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to recognize the posture of the electronic device, and can be applied to horizontal/vertical screen switching, pedometers, and other applications.
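The horizontal/vertical screen switching mentioned above can be illustrated by a toy classifier over the three-axis gravity reading; the axis convention and the 0.5 g threshold are assumptions, not parameters from the patent:

```python
# Toy orientation classifier from three-axis acceleration in units of g.
# Whichever in-plane axis carries most of gravity decides the posture.
def orientation(ax, ay, az):
    if abs(ax) > abs(ay) and abs(ax) > 0.5:
        return "landscape"   # gravity mostly along the device's x axis
    if abs(ay) > abs(ax) and abs(ay) > 0.5:
        return "portrait"    # gravity mostly along the device's y axis
    return "flat"            # gravity mostly along z, or ambiguous
```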
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting scenes, the electronic device 100 may use the distance sensor 180F to measure distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outward through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100.
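The sufficient/insufficient reflected-light decision described above amounts to a threshold test; the counts unit and threshold below are placeholders, not values from the patent:

```python
# Minimal sketch of the proximity decision: enough reflected infrared
# light means an object is near the electronic device.
REFLECTANCE_THRESHOLD = 100  # hypothetical ADC counts

def object_nearby(reflected_ir_counts):
    """True if sufficient reflected infrared light is detected."""
    return reflected_ir_counts >= REFLECTANCE_THRESHOLD
```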
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
•   the proximity light sensor 180G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touch.
•   the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access to application lock, fingerprint photo taking, fingerprint answering call, and the like.
  • the temperature sensor 180J is used to detect the temperature.
•   the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, in order to reduce power consumption and implement thermal protection.
•   in other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature.
•   in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
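The layered temperature strategy in the preceding bullets can be sketched as a single policy function; all threshold values and action names are illustrative assumptions:

```python
# Sketch of the layered thermal policy: throttle when hot, heat the
# battery when cold, and additionally boost the battery output voltage
# when very cold. Thresholds are invented for illustration.
HIGH_TEMP_C = 45.0       # throttle the nearby processor above this
LOW_TEMP_C = 0.0         # heat the battery below this
VERY_LOW_TEMP_C = -10.0  # also boost battery output voltage below this

def thermal_policy(temp_c):
    """Return the list of actions the device would take at `temp_c`."""
    actions = []
    if temp_c > HIGH_TEMP_C:
        actions.append("throttle_nearby_processor")
    if temp_c < LOW_TEMP_C:
        actions.append("heat_battery")
    if temp_c < VERY_LOW_TEMP_C:
        actions.append("boost_battery_output_voltage")
    return actions
```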
  • the touch sensor 180K can also be called a touch panel or a touch-sensitive surface.
•   the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 constitute a touch screen, also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 194.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the location where the display screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
•   the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part.
  • the bone conduction sensor 180M can also contact the pulse of the human body and receive a blood pressure beating signal.
  • the bone conduction sensor 180M may also be provided in the earphone and combined into a bone conduction earphone.
  • the audio module 170 may parse out the voice signal based on the vibration signal of the vibrating bone block of the voice part acquired by the bone conduction sensor 180M to realize the voice function.
  • the application processor may analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M to implement the heart rate detection function.
  • the key 190 includes a power-on key, a volume key, and the like.
• the key 190 may be a mechanical key, or it may be a touch key.
  • the electronic device 100 can receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 may generate a vibration prompt.
  • the motor 191 can be used for vibration notification of incoming calls and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
• the motor 191 can also correspond to different vibration feedback effects in different application scenarios (for example: time reminder, receiving information, alarm clock, game, etc.).
• The touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the amount of power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be inserted into or removed from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 can also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through a SIM card to realize functions such as call and data communication.
  • the electronic device 100 uses eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the electronic device 100 exemplarily shown in FIG. 1A may display various user interfaces described in the following embodiments through the display screen 194.
• the electronic device 100 can detect a touch operation in each user interface through the touch sensor 180K, for example, a tap operation in each user interface (such as a touch operation or a double-tap operation on an icon), or an upward or downward swipe in each user interface, or a gesture of drawing a circle, and so on.
  • the electronic device 100 may detect a motion gesture performed by the user holding the electronic device 100, such as shaking the electronic device, through the gyro sensor 180B, the acceleration sensor 180E, or the like.
  • the electronic device 100 can detect non-touch gesture operations through the 3D camera module 193 (eg, 3D camera, depth camera).
  • the software system of the electronic device 100 may adopt a layered architecture, event-driven architecture, micro-core architecture, micro-service architecture, or cloud architecture.
  • the embodiment of the present invention takes the Android system with a layered architecture as an example to exemplarily explain the software structure of the electronic device 100.
  • FIG. 1B is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor.
  • the layers communicate with each other through a software interface.
  • the Android system is divided into four layers, from top to bottom are the application layer, the application framework layer, the Android runtime and the system library, and the kernel layer.
  • the application layer may include a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
• the application framework layer provides an application programming interface (API) and a programming framework for the applications at the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and so on.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, intercept the screen, etc.
  • Content providers are used to store and retrieve data, and make these data accessible to applications.
  • the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
  • the view system includes visual controls, such as controls for displaying text and controls for displaying pictures.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including an SMS notification icon may include a view to display text and a view to display pictures.
• the phone manager is used to provide the communication function of the electronic device 100, for example, the management of the call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify the completion of downloading, message reminders, etc.
  • the notification manager can also be a notification that appears in the status bar at the top of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window.
• For example, a text message prompt is displayed in the status bar, a prompt sound is emitted, the electronic device vibrates, or the indicator light flashes.
  • Android Runtime includes core library and virtual machine. Android runtime is responsible for the scheduling and management of the Android system.
• the core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in the virtual machine.
• the virtual machine compiles the Java files of the application layer and the application framework layer into binary files and executes them.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library may include multiple functional modules. For example: surface manager (surface manager), media library (Media library), 3D graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports a variety of commonly used audio, video format playback and recording, and still image files.
• the media library can support multiple audio and video encoding formats, such as: MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least the display driver, camera driver, audio driver, and sensor driver.
• the software system shown in FIG. 1B involves applications that use the sharing capability (such as the gallery and file manager), an instant sharing module that provides the sharing capability, a print service and a print spooler (print background service) that provide the print capability, a printing framework, WLAN service, and Bluetooth service provided by the application framework layer, and the WLAN and Bluetooth capabilities and basic communication protocols provided by the kernel and bottom layer.
  • the following describes the workflow of the software and hardware of the electronic device 100 in combination with capturing a photographing scene.
• When the touch sensor 180K receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into original input events (including touch coordinates, time stamps and other information of touch operations).
  • the original input event is stored in the kernel layer.
• the application framework layer obtains the original input event from the kernel layer, and identifies the control corresponding to the input event. Take the touch operation being a tap operation and the control corresponding to the tap operation being the camera application icon as an example.
• the camera application calls the interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and the 3D camera module 193 captures still images or video.
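The workflow above (touch → hardware interrupt → kernel-layer raw input event → framework-layer control lookup → application launch) can be sketched as follows. This is a minimal illustration, not the actual Android implementation; the event fields and the hit-region table are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class RawInputEvent:
    # The kernel layer wraps the touch operation into a raw input event
    # carrying the touch coordinates, a timestamp, and other information.
    x: int
    y: int
    timestamp_ms: int

def find_control(event, hit_regions):
    """Framework layer: identify the control under the event coordinates.
    `hit_regions` maps a control name to its (x0, y0, x1, y1) rectangle."""
    for name, (x0, y0, x1, y1) in hit_regions.items():
        if x0 <= event.x < x1 and y0 <= event.y < y1:
            return name
    return None

# Hypothetical layout: the camera application icon occupies one rectangle.
regions = {"camera_icon": (0, 0, 100, 100), "gallery_icon": (100, 0, 200, 100)}
tap = RawInputEvent(x=40, y=50, timestamp_ms=1000)
control = find_control(tap, regions)
if control == "camera_icon":
    # In the real flow, the camera application would call the framework
    # interface, which starts the camera driver via the kernel layer.
    action = "start_camera_driver"
```

Here the tap at (40, 50) falls inside the camera icon's rectangle, so the dispatch resolves to the camera application.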
  • the following describes an exemplary user interface for the application menu on the electronic device 100.
  • FIG. 2A exemplarily shows an exemplary user interface 21 on the electronic device 100 for an application menu.
  • the electronic device 100 may be configured with a 3D camera module 193.
• For example, 193-1 may be a color camera and 193-2 may be a structured light 3D camera module; or, 193-1 may be a color camera and 193-2 may be a TOF 3D camera module; or, 193-1 and 193-2 may be two color cameras.
  • the 3D camera module 193 may be disposed on the top of the electronic device 100, such as the "bangs" position of the electronic device 100 (ie, the area AA shown in FIG. 2A).
  • the area AA may also include an illuminator 197 (not shown in FIG. 1A), a speaker 170A, a proximity light sensor 180G, an ambient light sensor 180L, and the like.
  • the back of the electronic device 100 may also be equipped with a 3D camera module 193 and an illuminator 197.
• the user interface 21 may include a status bar 201, a tray 223 with frequently used application icons, a calendar indicator 203, a weather indicator 205, a navigation bar 225, and other application icons, wherein:
  • the status bar 201 may include: one or more signal strength indicators 201-1 of the mobile communication signal (also called a cellular signal), an indicator 201-2 of the operator of the mobile communication signal, a time indicator 201-3, Battery status indicator 201-4, etc.
  • the calendar indicator 203 can be used to indicate the current time, such as date, day of the week, hour and minute information, and so on.
  • the weather indicator 205 can be used to indicate the weather type, such as cloudy to sunny, light rain, etc., and can also be used to indicate temperature and other information.
  • the tray 223 with frequently used application icons may display: a phone icon 223-1, a short message icon 223-2, a contact icon 221-4, and the like.
• the navigation bar 225 may include: a return button 225-1, a home screen (Home) button 225-3, a recent task history button 225-5, and other system navigation keys.
• When it is detected that the user has clicked the return button 225-1, the electronic device 100 may display the previous page of the current page.
• When it is detected that the user has clicked the home screen button 225-3, the electronic device 100 may display the home screen.
• When it is detected that the user has clicked the recent task history button 225-5, the electronic device 100 may display the tasks recently opened by the user.
• the naming of each navigation key may also be different, which is not limited in this application. Not limited to virtual keys, each navigation key in the navigation bar 225 may also be implemented as a physical key.
  • Other application icons may be, for example: Wechat icon 211, QQ icon 212, Twitter icon 213, Facebook icon 214, mailbox icon 215, cloud sharing icon 216, memo Icon 217, set icon 218, gallery icon 219, camera icon 220.
  • the user interface 21 may also include a page indicator 221.
  • Other application icons may be distributed on multiple pages, and the page indicator 221 may be used to indicate which page of the application the user is currently browsing. The user can slide the area of other application icons left and right to browse the application icons in other pages.
• the user interface 21 exemplarily shown in FIG. 2A may be a home screen (Home screen).
  • the electronic device 100 may further include a home screen key.
  • the home screen key may be a physical key or a virtual key (such as key 225-3).
  • the home screen key can be used to receive user instructions and return the currently displayed UI to the home interface, which can facilitate the user to view the home screen at any time.
• the above instruction may specifically be an operation instruction in which the user presses the home screen key once, or an operation instruction in which the user presses the home screen key twice in a short period of time, or an operation instruction in which the user presses and holds the home screen key for a predetermined time.
  • the home screen key may also be integrated with a fingerprint reader, so that when the home screen key is pressed, fingerprint collection and identification are performed accordingly.
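The three home-key instructions described above (single press, two presses in a short period, press-and-hold for a predetermined time) can be distinguished from press/release timestamps. The sketch below is illustrative only; the threshold values are assumptions, since the patent only speaks of "a short period" and "a predetermined time".

```python
def classify_home_key(press_times_ms, release_times_ms,
                      double_press_window_ms=300, long_press_ms=800):
    """Classify a home-key gesture from press/release timestamps (ms).
    Thresholds are hypothetical example values, not values from the patent."""
    # Two presses within the double-press window: the "press twice in a
    # short period of time" instruction.
    if (len(press_times_ms) >= 2 and
            press_times_ms[1] - press_times_ms[0] <= double_press_window_ms):
        return "double_press"
    # Key held for at least the predetermined time: long-press instruction.
    if release_times_ms[0] - press_times_ms[0] >= long_press_ms:
        return "long_press"
    # Otherwise: single press, e.g. return to the home screen.
    return "single_press"
```

For example, two presses 200 ms apart classify as a double press, while a single 900 ms hold classifies as a long press.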
  • FIG. 2A only exemplarily shows the user interface on the electronic device 100, and should not constitute a limitation on the embodiments of the present application.
  • the electronic device may detect a touch operation (such as a click operation on the icon 220) acting on the icon 220 of the camera, and in response to the operation, the user interface 31 exemplarily shown in FIG. 3B may be displayed.
  • the user interface 31 may be a user interface of a "camera” application, and may be used for a user to take a picture, for example, take a picture or record a video.
  • "Camera” is an application for capturing images on smart phones, tablets, and other electronic devices. The application does not limit the name of the application. That is to say, the user can click the icon 220 to open the user interface 31 of the "camera". Not limited to this, the user may also open the user interface 31 in other applications, for example, the user clicks the shooting control in “WeChat” to open the user interface 31.
  • “WeChat” is a social application that allows users to share photos taken with others.
  • FIG. 3B exemplarily shows a user interface 31 of a “camera” application on an electronic device such as a smartphone.
• the user interface 31 may include an area 301, a shooting mode list 302, a control 303, a control 304, and a control 305, wherein:
  • the area 301 may be referred to as a preview frame 301.
  • the preview frame 301 can be used to display the color image collected by the 3D camera module 193 in real time.
  • the electronic device can refresh the display content in real time, so that the user can preview the color image currently collected by the 3D camera module 193.
  • the 3D camera module 193 may be a rear camera or a front camera.
  • the one or more shooting mode options may be displayed in the shooting mode list 302.
  • the one or more camera options may include: night view mode option 302A, portrait mode option 302B, camera mode option 302C, video mode option 302D, and more shooting mode options 302E.
  • the one or more camera options can be represented as text information on the interface, for example, night view mode option 302A, portrait mode option 302B, camera mode option 302C, video mode option 302D, and more shooting mode options 302E can correspond to the text "night view” , "Portrait”, “Take Photo", “Video”, “More”.
  • the one or more camera options can also be represented as icons or other forms of interactive elements (IE) on the interface.
  • the electronic device 100 may select the photographing mode option 302C by default, and the display state of the photographing mode option 302C (eg, the photographing mode option 302C is highlighted) may indicate that the photographing mode option 302C has been selected.
  • the electronic device 100 may detect a user operation acting on a shooting mode option, and the user operation may be used to select a shooting mode, and in response to the operation, the electronic device 100 may turn on the shooting mode selected by the user.
  • the electronic device 100 can further display more other shooting mode options, such as large aperture shooting mode options, slow motion shooting mode options, etc., which can be shown to the user More abundant camera functions.
  • more shooting mode options 302E may not be displayed in the shooting mode list 302, and the user may browse other shooting mode options by swiping left/right in the shooting mode list 302.
  • the control 303 can be used to monitor user operations that trigger shooting (photographing or video recording).
  • the electronic device can detect a user operation on the control 303 (such as a click operation on the control 303), and in response to the operation, the electronic device 100 can save the image in the preview box 301.
  • the saved image can be a picture or a video.
  • the electronic device 100 can also display the thumbnail of the saved image in the control 304. That is to say, the user can click the control 303 to trigger shooting.
  • the control 303 may be a button or other forms of controls. In this application, the control 303 may be referred to as a shooting control.
  • the control 304 can be used to monitor user operations that trigger camera switching.
  • the electronic device 100 can detect a user operation on the control 304 (such as a click operation on the control 304), and in response to the operation, the electronic device 100 can switch the camera (such as switching the rear camera to the front camera, or the front Set the camera to switch to the rear camera).
  • the control 305 can be used to monitor user operations that trigger the opening of the "Gallery".
  • the electronic device 100 can detect a user operation (such as a click operation on the control 305) acting on the control 305, and in response to the operation, the electronic device 100 can display a user interface of "Gallery", in which the electronic device can be displayed 100 saved pictures.
  • "Gallery” is an application for managing pictures on smart phones, tablet computers, and other electronic devices, and may also be called an "album.” In this embodiment, the name of the application is not limited. "Gallery” can support users to perform various operations on the pictures stored on the electronic device, such as browsing, editing, deleting, selecting and other operations.
  • the user interface 31 can show the user various camera functions (modes) provided by the “camera”, and the user can choose to turn on the corresponding shooting mode by clicking the shooting mode option.
  • FIG. 3C exemplarily shows the user interface 32 provided by the “portrait” photographing function of the “camera” application.
  • the electronic device 100 can detect a user operation acting on the portrait mode option 302B (such as a click operation on the portrait mode option 302B), and in response to the user operation, the electronic device 100 can turn on the "portrait” photo Function and display the user interface shown as an example in FIG. 3C.
  • the definition of the electronic device 100 enabling the "portrait" camera function has been explained in the foregoing, and will not be repeated here.
  • the portrait mode option may be referred to as the first shooting mode option.
• the user interface 32 includes: a preview box 301, a shooting mode list 302, a control 303, a control 304, a control 305, a control 306, and a control 307.
• the control 306 can be used to monitor the user operation of opening the light effect template options.
  • the control 307 can be used to monitor the user operation of opening the character beautification option.
  • the electronic device 100 may display various light effect template options in the user interface 31.
  • Different light effect templates can represent (or correspond to) different light effect parameters, such as light source position, layer fusion parameter, texture pattern projection position, projection direction, etc.
  • the user can select different light effect templates to make the photos taken show different effects.
  • This application does not limit the interface representation forms of various light effect template options in the user interface 31.
  • the electronic device 100 may display the user interface 33 exemplarily shown in FIG. 3D.
  • FIG. 3D exemplarily shows the user interface provided by the character beautification function.
  • the user interface shown by way of example in FIG. 3D will be described in detail in the following content, which will not be repeated here.
  • the electronic device 100 may also update the display state of the portrait shooting mode option, and the updated display state may indicate that the portrait shooting mode has been selected.
• For example, the updated display state may be highlighting the text information "Portrait" corresponding to the portrait mode option 302B.
• The updated display state can also present other interface expressions, such as the font of the text information "Portrait" becoming larger, the text information "Portrait" being framed or underlined, the color of option 302B deepening, etc.
• If the electronic device 100 does not detect a person, a prompt message 308 may be output in the preview box 301.
• The prompt information 308 may be the text "No person detected", which may be used to prompt the user that the electronic device 100 has not detected a person.
  • the character beautification function can be integrated into the "portrait" camera function.
  • the person beautification function may also be a camera function in the “camera” application.
  • the person beautification mode option may be displayed in the shooting mode list 302 in the user interface 31.
  • the electronic device 100 may display the user interface provided by the character beautification function exemplarily shown in FIGS. 3D-3E.
  • 3D-3E exemplarily show the user interface 33 provided by the character beautification function of the "camera” application.
  • the user interface 33 includes a preview frame 301, a shooting mode list 302, a control 303, a control 304, a control 305, and a skin beauty option 309 and a body beauty option 310.
  • a shooting mode list 302 for the preview box 301, the shooting mode list 302, the control 303, the control 304, and the control 305, reference may be made to the relevant description in the user interface 31, which will not be repeated here.
• the skin beauty option 309 and the body beauty option 310 can be represented as icons on the interface, as shown in FIG. 3D. Not limited to icons, the skin beauty option 309 and the body beauty option 310 can also be expressed as text (such as the text "skin beauty" and "body beauty") or other forms of interactive elements (IE) on the interface.
• When the electronic device 100 detects a user operation acting on the skin beautification option 309 (such as a click operation on the skin beautification option 309), as shown in FIG. 3D, the electronic device 100 may display the control 311 in the user interface 33 and may turn on the "skin beauty" function.
  • the "beauty” function can be used to perform skin beautification on the face image of the person being photographed in the photo preview or video preview process, so that the face represented by the face image after skin treatment is compared with the actual face of the person being photographed Skin beautification has occurred on the face, such as skin whitening, skin dermabrasion (such as removing acne, freckles, wrinkles, etc.), and so on.
  • the control 311 can be used by the user to adjust the degree of skin beauty of the person being photographed.
  • the degree of beautification may refer to the degree of beautification of the skin of the face of the photographed person.
  • the degree of skin beautification can also be called the skin beautification level.
• For example, beautification level 0 may represent that no skin beautification is performed; at this time, the face image of the photographed person displayed in the preview frame 301 does not show skin beautification compared to the actual face of the photographed person. Beautification level 10 can represent a greater degree of skin beautification of the face of the photographed person; at this time, the face represented by the face image of the photographed person displayed in the preview frame 301 shows a greater degree of skin beautification than the actual face of the photographed person.
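The level-to-effect relationship described above (level 0 → no beautification of the previewed face, level 10 → the greatest degree) can be sketched as a mapping from the slider level to a processing strength. The linear mapping and the 0..10 range bound are assumptions for illustration; the patent does not specify the mapping function.

```python
def smoothing_strength(beauty_level, max_strength=1.0):
    """Map a skin-beautification level (0..10) to a processing strength.
    Level 0 yields 0.0 (preview matches the actual face); level 10 yields
    `max_strength`. A linear mapping is assumed here."""
    if not 0 <= beauty_level <= 10:
        raise ValueError("beauty level must be within 0..10")
    return max_strength * beauty_level / 10.0
```

For example, sliding the control 311 from level 7 to level 10 would raise the strength from 0.7 to the maximum 1.0 under this assumed mapping.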
• When the electronic device 100 detects a user operation acting on the body beauty option 310 (such as a click operation on the body beauty option 310), the user operation is used to select the body beauty option 310.
• In response, the electronic device 100 may display the control 312 in the user interface 33 and may turn on the "body beauty" function.
  • the "Beauty” function can be used to perform body treatment on the body image of the person being photographed during the photo preview or video preview process, so that the body image after the body treatment performs a body beautification compared to the actual body shape of the person being photographed.
  • the control 312 may be a slider.
  • the control 312 can be used by the user to adjust the beauty of the person being photographed.
  • the degree of beautification may refer to the degree of beautification of the body of the person being photographed.
• the degree of body beautification can also be called the body beautification level.
• For example, body level 0 may represent that the body image of the photographed person displayed in the preview box 301 is consistent with the actual body shape of the photographed person, that is, no body beautification occurs; body level 10 can represent a greater degree of body beautification of the body of the photographed person.
• At body level 10, the body image of the photographed person displayed in the preview frame 301 shows a greater degree of body beautification than the actual body shape of the photographed person.
  • the skin beauty option 309 may be the option selected by default. That is, when a user operation for turning on the character beautification function is detected, the electronic device 100 may display the user interface shown in FIG. 3D by default.
  • the body style option 310 may be the option selected by default. That is, when a user operation for turning on the character beautification function is detected, the electronic device 100 may display the user interface shown in FIG. 3E by default.
• the user operation for turning on the person beautification function may be the aforementioned user operation acting on the portrait mode option 302B, or the aforementioned user operation acting on the person beautification mode option.
• When the skin beauty option 309 is selected, the display state of the skin beauty option 309 may indicate that it has been selected.
• When the body beauty option 310 is selected, the display state of the body beauty option 310 may indicate that it has been selected.
  • the display state in which the skin beauty option 309 (or body beauty option 310) has been selected may be the highlight skin beauty option 309 (or body beauty option 310).
  • the display state may also present other interface expressions.
• If no person is detected, a prompt message 308 may be output in the preview box 301.
• The prompt information 308 may be the text "No person detected", which may be used to prompt the user that the electronic device 100 has not detected a person.
  • the electronic device 100 can detect whether the color image collected by the 3D camera module 193 includes a face image. If the face image is included, it is determined that a person is detected, otherwise it is determined that no person is detected.
• A prompt message 308 may be output in the preview box 301; the prompt information 308 may be the text "No person detected", which may be used to prompt the user that the electronic device 100 has not detected a person.
  • the electronic device 100 may analyze whether the color image collected by the 3D camera module 193 contains human bone points based on the bone point recognition technology. If the human bone points are included, it is determined that a person is detected, otherwise it is determined that no person is detected. The specific implementation of determining human bone points based on bone point recognition technology will be described in detail in the subsequent content, and will not be expanded here first.
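The two detection embodiments above (face-image detection, and bone-point recognition) both reduce to the same decision: show the preview prompt 308 only when no person is found. The sketch below combines them with a logical OR for illustration; treating them as combinable, and the minimum bone-point count, are assumptions, since the patent describes them as alternative embodiments.

```python
def person_detected(has_face_image, bone_point_count, min_bone_points=1):
    """Detected if the color image contains a face image, or if bone-point
    recognition finds at least `min_bone_points` human bone points."""
    return has_face_image or bone_point_count >= min_bone_points

def preview_prompt(has_face_image, bone_point_count):
    # When no person is detected, the prompt information 308
    # ("No person detected") is output in the preview box 301.
    if person_detected(has_face_image, bone_point_count):
        return None
    return "No person detected"
```

A frame with neither a face image nor any bone points yields the prompt text; a frame with either one suppresses it.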
  • FIGS. 4A-4G exemplarily show a UI embodiment for beautifying a photographed person during a photo preview.
  • the 3D camera module 193 can collect the image of the person 411.
  • the image of the person 411 may be an image of the front body of the person 411 or an image of the back body of the person 411.
  • the image of the person 411 may include a body image and a face image.
  • the body image may include images of multiple partial bodies (eg, legs, waist, shoulders, stomach, etc.).
• the electronic device 100 may determine that a person has been detected, and may detect a user operation for performing body shaping on the image of the person photographed in the preview frame 301. This user operation may be referred to as the first user operation.
  • the electronic device 100 may update the display content in the preview frame 301, and display the image of the photographed person after the body treatment in the preview frame 301.
  • the image of the photographed person after the body treatment displayed in the preview frame 301 may be referred to as a second image.
  • the image of the captured person (that is, the foreground image) collected by the 3D camera module displayed in the preview frame 301 may be referred to as a first image.
  • the contour of the first body part of the photographed person in the second image can be adjusted according to the first body parameters
  • the contour of the second body part of the photographed person in the second image can be adjusted according to the second body parameters
  • the first part and the second part may be different
  • the first body parameter and the second body parameter may be different. That is to say, the electronic device 100 can use different body parameters to adjust the contours of different body parts in the first image to obtain the contours of different body parts in the second image.
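The per-part adjustment just described (the first body part adjusted with the first body parameter, the second part with a different second parameter) can be sketched as applying a part-specific scale factor to each part's contour. The contour widths, part names, and factor values below are illustrative numbers, not values from the patent.

```python
def shape_body(part_widths, part_params):
    """Produce the second image's contours from the first image's by
    scaling each body part's contour width with its own body parameter.
    Parts without an explicit parameter are left unchanged (factor 1.0)."""
    return {part: round(width * part_params.get(part, 1.0), 2)
            for part, width in part_widths.items()}

# Hypothetical first-image contours (arbitrary width units) and parameters.
first_image = {"waist": 40.0, "legs": 55.0, "shoulders": 48.0}
params = {"waist": 0.9, "legs": 1.05}   # slim the waist, widen the legs
second_image = shape_body(first_image, params)
```

Here the waist and legs are adjusted with different parameters while the shoulders keep the first image's contour, matching the idea that the first and second parts, and their parameters, may differ.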
• the first user operation may specifically include a second user operation acting on the control 312 (such as a rightward sliding operation on the control 312), and a third user operation acting on the control 312 (such as a leftward sliding operation on the control 312).
  • the electronic device 100 does not perform the body treatment on the body image of the person 411.
  • the figure represented by the body image of the person 411 displayed in the preview frame 301 matches the actual figure of the person 411.
  • the body image of the person 411 displayed in the preview frame 301 is the first image of the captured person.
• the electronic device 100 can detect a second user operation (such as a rightward sliding operation on the control 312) acting on the control 312, and the second user operation can be used to increase the body level corresponding to the person 411 (for example, from body level 7 to body level 10).
  • the electronic device 100 may refresh the display content in the preview box 301.
  • the body image of the person 411 in the preview frame 301 after refreshing is more beautified than the body image of the person 411 in the preview frame 301 before refreshing.
• body beautification can include one or more of the following: beautifying body proportions (such as elongating the legs or widening the shoulders), and adjusting the fatness or thinness of the whole body or of partial bodies (such as the waist, legs, belly, and shoulders).
• Beautifying the body proportions can refer to adjusting the length and width of each part of the body according to human body aesthetics, so that the adjusted body proportions are closer to the standard proportions defined by human body aesthetics, such as the golden ratio of the upper and lower body (with the navel as the boundary, the ratio of the upper body to the lower body is about 5 to 8).
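The 5-to-8 upper/lower body proportion mentioned above can be turned into a simple leg-elongation target. This sketch assumes body-part lengths are available as scalar values; the numbers are illustrative:

```python
# Sketch: how much to elongate the legs so the upper/lower body ratio
# (navel as the boundary) approaches the 5:8 proportion in the text.

TARGET_RATIO = 5 / 8  # upper body : lower body

def leg_elongation(upper_len, lower_len):
    """Extra lower-body length needed to reach the 5:8 ratio (0 if already met)."""
    target_lower = upper_len / TARGET_RATIO
    return max(0.0, target_lower - lower_len)

# A person slightly short of the golden ratio: 50 / 0.625 = 80, so 5 units more.
print(leg_elongation(50.0, 75.0))
# A person already at or beyond the ratio is left unchanged:
print(leg_elongation(50.0, 82.0))  # → 0.0
```

As the following bullet notes, the feature need not stop at the golden ratio; a real implementation could keep elongating beyond `target_lower` if the user raises the level further.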
• the beautification of body proportions need not be limited by human body aesthetics. For example, a photographed person who already satisfies the golden ratio of the upper and lower body can still have their legs elongated to highlight the advantage of long legs.
• After refreshing, the difference between the outline of the first part in the second image and the outline of the first part in the first image is greater than it was before refreshing.
• Likewise, after refreshing, the difference between the outline of the second part in the second image and the outline of the second part in the first image is greater than it was before refreshing.
• the electronic device 100 can detect a third user operation acting on the control 312 (such as a leftward sliding operation on the control 312). The third user operation can be used to lower the body level corresponding to the person 411 (for example, from body level 10 to body level 7). In response to the third user operation, the display content in the preview box 301 can be refreshed.
  • the body image of the person 411 in the preview frame 301 after refreshing is closer to the actual body shape of the person 411 than the body image of the person 411 in the preview frame 301 before refreshing.
• After refreshing, the difference between the outline of the first part in the second image and the outline of the first part in the first image is smaller than it was before refreshing.
• Likewise, after refreshing, the difference between the outline of the second part in the second image and the outline of the second part in the first image is smaller than it was before refreshing.
• body level 5 or another body level may be the default body level of the "Body" function, so that when the user clicks the "Body" option 310 to open the "Body" function, the image of the person after body treatment is already shown in the preview box 301; the user can then directly click the control 303 to save the image of the person 411 that has undergone body treatment.
  • the operation is simple and the use efficiency is high.
• If the default body level is body level 5, which lies at the middle of the control 312, the sliding distance required for the user to manually adjust the body level leftward or rightward is greatly reduced, which can save user operations and improve usage efficiency.
• the electronic device 100 can successively detect multiple user operations that raise the body level, and can refresh the display content in the preview box 301 multiple times, so that the body image after each refresh shows greater beautification than the body image before that refresh, providing a visual effect of gradually enhanced body beautification in the preview box 301.
• Similarly, the electronic device 100 can successively detect multiple user operations that lower the body level, and can refresh the display content in the preview box 301 multiple times, providing a visual effect of gradually diminishing beautification.
• Both the second user operation and the third user operation are used to select a body level for the photographed person 411, and the electronic device 100 may use the body parameters corresponding to the selected body level (for example, body level 5) to perform body treatment on the body image of the person 411 collected by the 3D camera module 193, so as to obtain the body image of the person 411 displayed in the refreshed preview frame 301.
• How the electronic device uses the body parameters corresponding to the selected body level to perform body treatment on the body image of the person 411 will be described in detail in the subsequent method embodiments, and details are not described here.
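One plausible way to derive per-part body parameters from a selected body level (so that level 0 leaves the first image unchanged and higher levels adjust more strongly) is linear interpolation toward a per-part maximum. The maximum adjustment factors below are invented for illustration; the patent's actual mapping is deferred to the method embodiments:

```python
# Sketch: map a body level (0..10) to per-part body parameters.
# Level 0 means no adjustment; higher levels interpolate linearly toward
# a maximum per-part adjustment. Factors <1 slim a part, >1 enlarge it.

MAX_LEVEL = 10
MAX_ADJUSTMENT = {"waist": 0.80, "belly": 0.75, "legs": 0.90, "shoulders": 1.10}

def parameters_for_level(level):
    t = level / MAX_LEVEL
    return {part: round(1.0 + (factor - 1.0) * t, 3)
            for part, factor in MAX_ADJUSTMENT.items()}

print(parameters_for_level(0))   # all 1.0 → first image shown unchanged
print(parameters_for_level(5))   # halfway adjustments (the assumed default level)
print(parameters_for_level(10))  # full adjustments
```

This also illustrates why sliding the control 312 right (raising the level) visibly strengthens the beautification at each refresh: every part's parameter moves monotonically away from 1.0.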
• the electronic device 100 may display a prompt message (such as the text "pregnant woman detected") in the preview box 301. The prompt message may be used to prompt that the person 411 is a pregnant woman.
• the electronic device 100 can use the body parameters corresponding to the selected body level to perform beautification processing on the color images of other partial bodies of the person 411 (such as the arms, legs, and shoulders) without performing body treatment on the belly image of the person 411, so as to retain the characteristics of the pregnant woman.
  • the electronic device 100 may also display a switch control in the preview box 301.
• When the switch control is in the "on" state, the characteristics of the pregnant woman are retained, that is, the electronic device 100 does not perform body treatment on the belly image; when the switch control is in the "off" state, the characteristics of the pregnant woman are not retained, that is, the electronic device 100 can perform body treatment on the belly image as shown in FIG. 4A to show the effect of a slimmer belly.
  • the electronic device 100 may still use the body parameters corresponding to the selected body level to perform body treatment on the current person being photographed.
• the electronic device 100 can refresh the display content in the preview box 301; the image of the person 411 displayed in the refreshed preview box 301 includes the face image and the body image, wherein the face image is a face image after skin treatment, and the body image is a body image that has undergone body treatment using the body parameters corresponding to the selected body level.
  • the user can continue to beautify the skin of the photographed person on the basis of the beautification of the photographed person to achieve a more comprehensive beautification effect. Conversely, the user can continue to beautify the photographed person on the basis of the skin beautification of the photographed person.
  • the electronic device 100 may also perform a beautification process on the body image of the photographed person in response to the user operation for selecting the body-beauty level, so as to realize the beautification of the photographed person in other poses.
  • FIGS. 5A-5D exemplarily show a UI embodiment in which the beautification is cancelled during the photo preview.
  • FIGS. 5A-5B exemplarily show a UI embodiment for revoking a partial body
  • FIG. 5C-5D exemplarily show a UI embodiment for revoking a whole-body body with one click.
• the partial bodies of the person 411 that have been beautified include: the left arm, the right arm, the belly, the left leg, and the right leg. These partial bodies were respectively beautified as follows: slimmed left arm, slimmed right arm, slimmed belly, elongated left leg, elongated right leg.
  • the electronic device 100 may display the undo icon 501 on the images of these partial beautified bodies. In this way, the user can click the undo icon 501 on the partial body image to undo the beautification of the partial body.
• the electronic device 100 can detect a user operation on the undo icon 501 displayed on the belly image of the person 411 (such as a click operation on the undo icon 501). In response to this operation, the electronic device 100 can refresh the display content in the preview box 301. After refreshing, the shape of the belly represented by the belly image of the person 411 in the preview box 301 matches the actual shape of the belly of the person 411, whereas before refreshing, the belly image of the person 411 in the preview box 301 was thinner than the actual belly of the person 411.
  • the electronic device 100 can detect a user operation (such as a click operation on the undo icon 501) of the undo icon 501 acting on the partial body image of the person being photographed.
  • the partial body image is the selected partial body image.
  • the electronic device 100 may refresh the preview box 301.
• After refreshing, the shape of the partial body represented by the selected partial body image in the preview box 301 is consistent with the actual shape of that partial body of the person being photographed, whereas before refreshing, the partial body represented by the selected partial body image was beautified compared to the actual partial body of the person being photographed.
• The embodiments of FIGS. 5A-5B allow the user to cancel the body treatment performed on a partial body of the person being photographed in the preview box 301, so that the user can retain the original features of certain body parts (such as a sturdy, thick arm), meeting the user's body shaping needs more flexibly.
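Undoing the body treatment of a single partial body, as in FIGS. 5A-5B, amounts to resetting that part's body parameter to the identity value and re-rendering from the first image. The part names and values below are illustrative assumptions:

```python
# Sketch: per-part undo. Clicking the undo icon on a part restores that
# part's actual shape while other parts stay beautified.

first_image = {"left_arm": 30.0, "right_arm": 30.0, "belly": 70.0}  # actual widths
params = {"left_arm": 0.9, "right_arm": 0.9, "belly": 0.8}          # current treatment

def render(contours, params):
    return {part: round(w * params[part], 2) for part, w in contours.items()}

def undo_part(params, part):
    """Click on the undo icon shown over `part`: reset it to no adjustment."""
    return {**params, part: 1.0}

params = undo_part(params, "belly")      # user taps the undo icon on the belly
second_image = render(first_image, params)
print(second_image["belly"])             # → 70.0, matches the actual belly
print(second_image["left_arm"])          # → 27.0, arms stay slimmed
```

The one-key undo of FIGS. 5C-5D would simply reset every part's parameter to 1.0 in the same way.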
  • the electronic device 100 may display the one-key undo icon 503 in the preview box 301. In this way, the user can click the one-key undo icon 503 to undo all body treatments performed on the entire body.
  • the electronic device 100 can detect a user operation (such as a click operation on the icon 503) acting on the one-key undo icon 503. In response to this operation, the electronic device 100 can refresh the display content in the preview box 301.
• the figure represented by the body image of the person 411 in the preview box 301 after refreshing matches the actual figure of the person 411, while in the preview box 301 before refreshing, the figure represented by the body image of the person 411 was beautified compared to the actual figure of the person 411.
• The embodiments of FIGS. 5C-5D allow the user to cancel, with one key, all the whole-body treatment performed on the photographed person in the preview box 301; the operation is simple and effective and can improve usage efficiency.
  • FIGS. 6A-6G exemplarily show some UI embodiments for pushing fitness information during a photo preview.
  • Fitness information can include fitness classes, fitness methods, fitness diet, etc.
  • fitness information may be referred to as the first content.
  • FIGS. 6A-6D exemplarily show UI embodiments for pushing fitness information when the rear camera module shoots
  • FIGS. 6E-6G exemplarily show UI implementations for pushing fitness information when the front camera module shoots example.
  • the image of the person 411 displayed in the preview frame 301 is collected by the rear 3D camera module 193 of the electronic device 100.
  • the person 411 is usually not a photographer holding the electronic device 100, but a friend, relative, etc. of the photographer. The photographer can share the fitness information with the photographed person 411 to help or encourage the photographed person 411 to exercise.
  • the electronic device 100 can display the control 601 and the prompt information 603 in the preview box 301.
  • the control 601 can be used to monitor user operations that share fitness information.
• the prompt information 603 can be used to prompt the user to click the control 601 to share fitness information such as fitness classes, fitness methods, and fitness diets with the person 411.
  • the prompt information 603 may gradually disappear in the preview box 301 after being displayed for a period of time (eg, 1 second).
  • the user can also cancel the display of the prompt information 603 by swiping up or left on the prompt information 603.
  • the electronic device 100 can detect a user operation (such as a click operation on the control 601) acting on the control 601. In response to this operation, the electronic device 100 may display a user interface 605 for sharing fitness information. The user can select a contact in the user interface 605 to share fitness information with the contact. The electronic device 100 may detect a user operation of selecting a contact to share fitness information in the user interface 605. In response to this operation, the electronic device 100 can share fitness information with the selected contact.
  • the user interface 605 may be exemplarily shown in FIG. 6C.
• the user interface 605 may be a recent chats interface of the application "WeChat", and the conversation list 605-1 in the interface may be used to display entries for one or more recent conversations. These conversations are associated with different contacts, such as "MAC", "Kate", "William", etc.
• the electronic device 100 can detect a user operation of selecting a contact, such as a click operation on the conversation entry of the contact "Kate"; in response to the operation, the electronic device 100 can share fitness information with the selected contact "Kate" and display the dialogue interface 607 between the user and the contact "Kate".
  • the dialogue interface 607 may be exemplarily shown in FIG.
  • a message 607-1 sent by the user to the contact "Kate” may be displayed.
  • Message 607-1 may include a brief description of fitness information.
  • the message 607-1 may also be associated with a link to fitness information, which may be provided by a third-party fitness application. In this way, the contact "Kate" can click the message 607-1 to open the link after receiving the message 607-1, so that he can jump to the user interface of the third-party fitness application to view fitness information.
• the user interface 605 for sharing fitness information can also be the user interface of another social application, or a user interface for sharing data provided by another data sharing function (such as Apple's AirDrop or Huawei's Huawei Share). In this way, fitness information is easy to distribute, and it is convenient for the photographed person to obtain it, which can encourage or help the photographed person to exercise.
  • the image of the person 411 displayed in the preview frame 301 is collected by the front-mounted 3D camera module 193 of the electronic device 100.
  • the person 411 may generally be (or include) a photographer holding the electronic device 100.
  • the electronic device 100 can display the control 608 and the prompt information 606 in the preview box 301.
  • the prompt information 606 can be used to prompt the user to click the control 608 to view fitness information.
  • the control 608 in the preview box 301 can be used to monitor user operations that open fitness information instead of sharing fitness information. In this way, it is convenient for the photographer to obtain fitness information such as fitness classes, fitness guidance, and fitness diet plans when taking a self-portrait, thereby improving the user experience.
  • the electronic device 100 can detect a user operation (such as a click operation on the control 608) acting on the control 608. In response to this operation, the electronic device 100 can display fitness information.
  • the fitness information may be specifically displayed in the user interface 609 exemplarily shown in FIG. 6G.
  • the user interface 609 may be a user interface of a third-party fitness application program (such as "ABC Fitness"), and the specific implementation of the user interface is not limited in this application.
• the fitness information may be related to the body level selected for the person being photographed. For example, the higher the body level, the higher the difficulty of the fitness class.
  • the fitness information can also be related to the gender, age and other characteristics of the person being photographed. For example, if the person being photographed is a female, the fitness information can be a fitness class suitable for women.
• Fitness information can also be related to the current shape of the person being photographed. For example, the fatter the person being photographed, the greater the difficulty of the recommended fitness class.
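A rule combining the three factors above (selected body level, gender and age, current shape) could be sketched as follows; the thresholds and class names are invented for illustration and are not part of the patent:

```python
# Sketch: select fitness information from the body level and the
# characteristics of the person being photographed.

def recommend_class(body_level, gender, body_fat_pct):
    # Higher body level or higher body fat → more difficult class.
    if body_level >= 8 or body_fat_pct >= 30:
        difficulty = "advanced"
    elif body_level >= 4:
        difficulty = "intermediate"
    else:
        difficulty = "beginner"
    # A female photographed person gets classes suitable for women.
    audience = "women" if gender == "female" else "general"
    return f"{difficulty} fitness class for {audience}"

print(recommend_class(body_level=9, gender="female", body_fat_pct=25))
print(recommend_class(body_level=3, gender="male", body_fat_pct=18))
```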
  • the electronic device 100 may specifically display the control 601 or the control 608 in the preview box 301 in the following situations:
• Case 1: after the user selects a body level, the selected body level remains unchanged for more than a specific duration (such as 1 second).
• Case 2: the user clicks the control 303 at the selected body level to trigger shooting, where the selected body level is a level other than body level 0.
• Both case 1 and case 2 can indicate that the user is satisfied with the body shape of the photographed person presented in the preview box 301 at the selected body level.
  • the electronic device 100 may also display the control 601 or the control 608 in the preview box 301 in other cases, so that the photographer can share fitness information with the person being photographed or the photographer can open the fitness information by himself.
  • FIGS. 7A-7G exemplarily show other UI embodiments for pushing fitness information during a photo preview.
  • FIGS. 7A-7D exemplarily show UI embodiments for pushing fitness information when shooting with the rear camera module
  • FIGS. 7E-7G exemplarily show UI implementations for pushing fitness information when shooting with the front camera module example.
• For the embodiments of FIGS. 7A-7G, reference may be made to the embodiments of FIGS. 6A-6G. Different from the embodiments of FIGS. 6A-6G, the electronic device 100 may display prompt information 703 in the preview box 301.
• the prompt information 703 can be used to prompt the difference between the contour of the first part in the second image and the contour of the first part in the first image, and the difference between the contour of the second part in the second image and the contour of the second part in the first image.
  • the second image is the image of the photographed person after the body treatment displayed in the preview frame 301
  • the first image is the image of the photographed person collected by the 3D camera module.
  • the contours of the body parts of the photographed person in the first image are consistent with the actual contours. These differences can be expressed as: differences in waist circumference, differences in leg circumference, differences in leg length, differences in shoulder width and so on.
• the prompt information 703 can also be used to present one or more of the following: the weight that needs to be lost, the amount of exercise required, the calories that need to be consumed, and so on.
  • the weight that needs to be lost is used to prompt how much weight the person to be photographed needs to lose to approach or reach the body shape represented by the body image displayed in the preview box 301.
  • the required amount of exercise is used to indicate how much exercise the person to be photographed needs to approach or reach the figure represented by the body image displayed in the preview box 301.
  • the calories that need to be consumed are used to indicate how much calories the person to be shot needs to approach or reach the body shape represented by the body image displayed in the preview box 301.
• For example, the gap can be presented as the text "You are still away from your ideal figure: lose 5 kg, aerobic run 100 km, consume 7000 calories". This makes it easier for users to understand their fitness goals and encourages them to exercise toward their ideal body shape.
  • the ideal body shape may be the body shape displayed in the preview box 301 at the currently selected beauty level.
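The gap prompt could be computed as in the sketch below. The conversion constants are chosen only so that the output reproduces the example text's figures (5 kg, 100 km, 7000 calories); they are not physiological facts, and a real implementation would also need to estimate the ideal weight from the body image at the selected body level:

```python
# Sketch: compute the gap between the actual figure and the ideal figure
# shown in the preview box (weight to lose, distance to run, calories).

KCAL_PER_KG = 1400   # assumed constant, chosen to match the example text
KM_PER_KG = 20       # assumed constant, chosen to match the example text

def fitness_gap(actual_weight_kg, ideal_weight_kg):
    kg = max(0.0, actual_weight_kg - ideal_weight_kg)
    return {"lose_kg": kg, "run_km": kg * KM_PER_KG, "calories": kg * KCAL_PER_KG}

gap = fitness_gap(actual_weight_kg=70.0, ideal_weight_kg=65.0)
print(f"You are still away from your ideal figure: lose {gap['lose_kg']:.0f} kg, "
      f"aerobic run {gap['run_km']:.0f} km, consume {gap['calories']:.0f} calories")
```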
  • the prompt information 703-1 can also be displayed in the prompt information 703.
  • the prompt information 703-1 can be used to prompt the user to click the prompt information 703-1 to share fitness information with the photographed person 411.
  • the electronic device can detect a user operation acting on the prompt information 703-1 (such as a click operation on the prompt information 703-1).
  • the electronic device 100 may display a user interface 705 for sharing fitness information.
  • the user can select a contact in the user interface 705 to share fitness information to the contact.
  • the electronic device 100 may detect a user operation of selecting a contact to share fitness information in the user interface 705.
  • the electronic device 100 can share fitness information with the selected contact.
• the prompt information 706-1 may also be displayed in the prompt information 706.
• the prompt information 706-1 can be used to prompt the user to click the prompt information 706-1 to view fitness information.
  • the electronic device can detect a user operation (such as a click operation on the prompt information 706-1) acting on the prompt information 706-1.
  • the electronic device 100 can display fitness information.
• FIGS. 8A-8B exemplarily show UI embodiments for comparing the figure before and after beautification during a photo preview.
• the electronic device 100 may display the control 801 in the preview box 301.
• the electronic device 100 can detect a user operation acting on the control 801, such as pressing and holding the control 801 without releasing it. In response to this operation, the electronic device 100 can refresh the display content in the preview frame 301, and the refreshed preview frame 301 displays the body image of the person 411 actually collected by the 3D camera module 193.
• the electronic device 100 can detect another user operation acting on the control 801, such as releasing the control 801 (that is, no longer pressing and holding it). In response to this operation, the electronic device 100 may refresh the display content in the preview frame 301, and the body image of the person 411 displayed in the refreshed preview frame 301 is again the body image after body treatment.
  • the user can compare the body image of the person 411 actually collected by the 3D camera module 193 with the body image of the person 411 displayed in the refreshed preview frame 301.
• FIGS. 9A-9C exemplarily show a UI embodiment in which a user takes a photo.
• the body shape of the photographed person 411 in the preview frame 301 is beautified compared to the actual shape of the photographed person 411.
  • the electronic device 100 can detect a user operation (such as a click operation on the control 303) acting on the control 303, and in response to the operation, the electronic device 100 can save the image in the preview box 301, and can also display the saved in the control 304 Thumbnail of the image. Because the currently selected camera option is a portrait mode option, the image in the preview box 301 can be saved as a photo.
  • the electronic device 100 can also detect a user operation (such as a click operation on the control 304) acting on the control 304, and in response to the operation, the photo 801 recently saved by the electronic device 100 can be displayed. In this way, the user can view the image in the preview frame 301 that has just been saved, and the body image of the photographed person 411 included in the image is more beautified than the actual shape of the photographed person 411.
  • FIGS. 10A-10C exemplarily show a UI embodiment for beautifying a photographed person during video preview.
• the electronic device 100 can detect a user operation (such as a click operation on the recording mode option 302D) acting on the recording mode option 302D; in response to the user operation, the electronic device 100 can turn on the recording function and display the user interface exemplarily shown in FIGS. 10A-10C.
  • the user interface shown in FIGS. 10A to 10C can be specifically referred to the aforementioned UI embodiment for beautifying the photographed person during the photo preview, which will not be repeated here. The difference is that at this time the control 303 can be used to monitor user operations that trigger recording.
  • the recording mode option 302D may be referred to as the second shooting mode option.
• in response to a detected user operation on the control 303, the electronic device 100 can start recording the image in the preview box 301 and display the user interface during recording.
  • the user interface during recording may be exemplarily shown in FIG. 10D.
• in response to a detected user operation on the control 10D-1, the electronic device 100 can end recording the image in the preview box 301, save the recorded images as a video, and display a thumbnail of the saved video in the control 304.
  • the display content in the preview box 301 is updated regularly, for example, every 10 ms.
• the body image of the photographed person displayed in the preview frame 301 may be the body image after body treatment using the body parameters corresponding to the selected body level.
• the electronic device 100 can also detect a user operation acting on the control 304 (such as a click operation on the control 304), and in response to the operation, can display the video 10F-1 recently saved by the electronic device 100. In this way, the user can view the video that has just been saved; the body image of the photographed person 411 included in the video has undergone body treatment using the body parameters corresponding to the selected body level.
  • the user can beautify the person being photographed before starting to record.
  • the body image of the photographed person displayed in the preview box 301 has been subjected to body treatment using the body parameters corresponding to the body level selected before starting the recording. In this way, the body shape of the photographed person's body image will be consistent throughout the video recording process.
• the user interface during recording may also include a control 312 for adjusting the body level and/or a control 3311 for adjusting the skin level, so that during recording the user can also adjust the photographed person's body level and/or skin level.
• the electronic device 100 may use the body parameters corresponding to the default body level (such as body level 5) or the previously selected body level to perform body treatment on the body image of the photographed person.
• Some UI embodiments during the photo preview described in the foregoing content (such as the embodiments of undoing beautification shown in FIGS. 5A-5D, the embodiments of pushing fitness information shown in FIGS. 6A-6G and 7A-7G, and the embodiments of comparing the figure before and after beautification shown in FIGS. 8A-8B), as well as subsequent UI embodiments, are also applicable to figure beautification in the recording scene, and will not be repeated here.
  • FIGS. 11A to 11E exemplarily show an embodiment of a UI for beautifying a photographed person using a figure template.
  • the user interface 33 may further display a control 11B-1.
• the control 11B-1 can be used to monitor user operations for opening and viewing figure templates.
  • the electronic device 100 can detect a user operation (such as a click operation on the control 11B-1) acting on the control 11B-1, and in response to the operation, the user interface exemplarily shown in FIG. 11B may be displayed.
  • the user interface may include a preview box 301, a control 303, a control 304, a control 305, a shooting mode list 302, and a figure template list 11B-2.
  • the body template list 11B-2 may display one or more body template options 11B-3. Different figure templates can represent different figures.
  • the preview box 301, the control 303, the control 304, the control 305, and the shooting mode list 302 reference may be made to the description in the foregoing embodiment, and details are not repeated here.
• the electronic device 100 can detect a user operation acting on a body template option 11B-3 (such as a click operation on the body template option 11B-3), which can be used to select a body template, such as the figure template "Kate".
  • the user interface exemplarily shown in FIG. 11C may be displayed.
  • the user interface may include icons 11B-4, controls 314, a shooting mode list 302, controls 303, controls 304, and controls 305.
  • the icon 11B-4 can be used to indicate the selected figure template.
• the control 314 can be used to select a figure level. The higher the figure level, the closer the body image of the photographed person displayed in the preview box 301 is to the figure represented by the selected figure template.
  • for the control 303, the control 304, the control 305, and the shooting mode list 302, reference may be made to the description in the foregoing embodiment, and details are not repeated here.
  • the electronic device 100 may detect a first user operation (such as a rightward sliding operation on the control 314) acting on the control 314; the first user operation can be used to increase the figure level corresponding to the person 411 (e.g., from figure level 7 to figure level 10).
  • the electronic device 100 can refresh the display content in the preview box 301; after refreshing, the figure represented by the body image of the person 411 in the preview box 301 is closer to the figure represented by the selected figure template than the figure represented by the body image of the person 411 before refreshing.
  • the higher the figure level, the closer the body shape of the person 411 in the preview box 301 is to the body shape represented by the selected figure template.
  • when the figure level is 10 (the highest figure level), the body shape represented by the body image of the person 411 in the preview box 301 is consistent with the body shape represented by the selected figure template.
  • the electronic device 100 may detect a second user operation (such as a leftward sliding operation on the control 314) acting on the control 314; the second user operation can be used to lower the figure level corresponding to the person 411 (for example, from figure level 10 to figure level 7).
  • the electronic device 100 can refresh the display content in the preview box 301; after refreshing, the figure represented by the body image of the person 411 in the preview box 301 is closer to the actual figure of the person 411 than the figure represented by the body image of the person 411 before refreshing.
  • the body template options displayed in the body template list 11B-2 may be related to the gender, age, and other characteristics of the person being photographed. For example, if the person to be photographed is a female, a body template suitable for women is displayed in the body template list 11B-2, and a body template suitable for men is not displayed.
  • the electronic device 100 may refresh the display content in the preview box 301, and the figure represented by the body image of the photographed person in the refreshed preview box 301 is the same as the figure represented by the selected figure template. That is to say, the user can directly beautify the figure of the photographed person to be the same as the figure represented by the figure template by clicking the figure template option.
  • FIGS. 12A-12C exemplarily show a UI embodiment of one-click full-body beautification.
  • the user interface 33 may also display a control 12B-1.
  • the electronic device 100 can detect a user operation (such as a click operation on the control 12B-1) acting on the control 12B-1, and in response to the operation, the user interface exemplarily shown in FIG. 12B may be displayed.
  • the user interface may include a preview box 301, a control 303, a control 304, a control 305, a shooting mode list 302, a control 313, and an icon 12B-2.
  • the icon 12B-2 can be used to indicate that the currently selected character beautification method is full-body beautification.
  • Full-body beautification may refer to facial beautification and body beautification of the person being photographed.
  • the control 313 can be used for the user to select the level of full-body beautification.
  • the control 303, the control 304, the control 305, and the shooting mode list 302 reference may be made to the description in the foregoing embodiment, and details are not repeated here.
  • the electronic device 100 can detect a user operation for selecting a full-body beautification level; in response to the operation, the electronic device 100 can perform body treatment on the body image of the photographed person and skin beautification processing on the face image of the photographed person according to the selected full-body beautification level, and refresh the preview box 301.
  • the preview box 301 displays the image of the photographed person who has undergone the body treatment and skin treatment.
  • the higher the full-body beautification level, the greater the difference between the body shape represented by the body image of the photographed person displayed in the preview box 301 and the actual body shape of the photographed person, and the greater the difference between the facial skin represented by the face image and the actual facial skin of the photographed person.
  • the one-click UI embodiment for full-body beautification shown in FIGS. 12A to 12C can quickly realize whole-body beautification of the photographed person; the operation is simple and effective, and the use efficiency of the electronic device can be improved.
  • FIGS. 13A to 13D exemplarily show a UI embodiment for beautifying multiple people.
  • in response to the detected user operation for selecting a figure level, the electronic device 100 can use the body parameters corresponding to the selected figure level to perform body treatment on the body images of the plurality of photographed persons, and refresh the display content in the preview box 301.
  • the refreshed preview frame 301 displays the body images of the plurality of photographed persons after the body treatment. That is to say, the user can beautify the plurality of photographed people with the same beauty level.
  • the electronic device 100 can detect a user operation for selecting a photographed person (such as a click operation on the image of that person) and a user operation for selecting a figure level. In response to these operations, the electronic device 100 can perform body treatment on the body image of the selected photographed person using the body parameters corresponding to the selected figure level, and refresh the display content in the preview box 301.
  • the refreshed preview frame 301 displays the body image of the selected photographed person and the body image of the unselected photographed person.
  • the body image of the selected photographed person has undergone body treatment, while the body image of the unselected photographed person has not. That is to say, the user can select one or more persons from the plurality of photographed persons for beautification.
  • the electronic device 100 may recognize the image of a specific person from the images of the plurality of captured persons collected by the 3D camera module 193, such as the owner of the electronic device 100.
  • the electronic device 100 may perform body treatment on the image of the specific person by default.
  • the electronic device 100 can use the image data stored on the electronic device 100 (such as photos in a gallery) to identify the image of a specific person from the images of multiple photographed persons; for example, the face image in the image of the specific person and the face images in the pictures marked, favorited, or named in the gallery show the same facial features.
  • FIGS. 14A-14B exemplarily show a UI embodiment for manually performing beautification.
  • the electronic device 100 can detect a user operation acting on the body image of the photographed person, as shown in FIG. 14A, a pinch gesture on the belly image. In response to the operation, the electronic device 100 may determine the partial body image selected by the operation, and then perform body treatment on the partial body image.
  • the magnitude of the user's operation can determine the degree of body treatment. For example, the greater the magnitude, the greater the degree of body treatment.
  • the user can slim down a body part through a pinch operation on the body part.
  • a stretching gesture on the leg image can be used to lengthen the legs, and a pressing operation on the belly image can also be used to slim the belly.
  • FIGS. 15A to 15D exemplarily show a UI embodiment for beautifying a person in a captured picture.
  • the electronic device 100 may mark such a picture in the gallery, for example, displaying an indicator 15A-1 on the thumbnail of the picture .
  • the indicator 15A-1 can be used to indicate that the image of the person in the picture has undergone body treatment and/or skin treatment.
  • the user interface displaying the picture 15B-1 may also display the control 15B-2.
  • the electronic device 100 may display the control 312, the control 309, and the control 310 in the user interface displaying the picture 15B-1.
  • the control 312, the control 309, and the control 310 reference may be made to the foregoing content, and details are not described here.
  • the electronic device 100 can use the body parameters corresponding to the selected body level to perform the body image of the person in the picture 15B-1 Perform body treatment and refresh picture 15B-1.
  • the figure represented by the person's body image in the refreshed picture 15B-1 is more beautified than the figure represented by the person's body image in the picture 15B-1 before refreshing.
  • the picture 15B-1 may also be a picture whose person images have not been subjected to body treatment or skin treatment.
  • the user can beautify the figures in the photo after the photo is taken.
  • the user can also beautify the figures in the captured video.
  • for the specific implementation, refer to the embodiments in FIGS. 15A to 15D, and details are not described here.
  • the image of the photographed person collected by the 3D camera module 193 displayed in the preview frame 301 may be referred to as the first image of the photographed person, and the image of the photographed person displayed in the preview frame 301 after the body treatment It can be called the second image of the person being photographed.
  • control 312 may be referred to as a first control
  • control 801 may be referred to as a second control
  • the prompt information 703 in the embodiments of FIGS. 7A-7G may be referred to as first prompt information.
  • the method may include:
  • the electronic device can turn on the 3D camera module and collect color images and depth images through the 3D camera module.
  • the color image may include an image of a person being photographed (ie, a foreground image) and a background image.
  • the depth image may include depth information of the person being photographed.
  • the color image may include a plurality of pixels, and each pixel has two-dimensional coordinates and color values.
  • the color value can be an RGB value or a YUV value.
  • the depth image may include a plurality of pixels, each pixel having two-dimensional coordinates and a depth value.
  • for any position on the photographed person, the color value of the corresponding pixel in the color image represents the color of that position (for example, the color of clothing, the color of bare skin, etc.), and the depth value of the corresponding pixel in the depth image indicates the vertical distance between that position and the electronic device (specifically, the 3D camera module).
  • for example, the two-dimensional coordinates of the pixel corresponding to position A in the color image are (x1, y1), and the RGB value of the pixel is (255, 255, 255);
  • the two-dimensional coordinates of the corresponding pixel in position A in the depth image are (x1,y1), and the depth value of the pixel is 350 cm.
  • the color at position A is white, and the vertical distance between position A and the electronic device is 350 cm.
  • the RGB value (0,0,0) of the pixel corresponding to the location B in the color image and the pixel corresponding to the location B in the depth image A depth value of 345 cm can indicate that the color at position B is black, and the vertical distance between position B and the electronic device is 345 cm.
  • the two-dimensional coordinates of each photographed part of the photographed person, the depth value relative to the 3D camera module, and the color value can be determined.
  • the two-dimensional coordinates and depth values may represent 3D coordinates.
  • the color image and the depth image shown in FIGS. 17A and 17B can be combined into a distribution of color values in a 3D coordinate space, as shown in FIG. 17C.
  • the z axis represents the depth value.
  • for example, the 3D coordinates of position A are (x1, y1, z1), where z1 is 350 cm, and the RGB value at this 3D coordinate is (255, 255, 255);
  • the 3D coordinates of position B are (x2, y2, z2), where z2 is 345 cm, and the RGB value at this 3D coordinate is (0, 0, 0).
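To make the combination concrete, the mapping described above can be sketched in code (a minimal illustration only; the dictionary-based image representation and the sample coordinates are assumptions, not part of the embodiment):

```python
# Hypothetical sketch: merge a color image and a depth image that index the
# same 2D pixel grid into colored 3D points, as described for FIG. 17C.
def merge_color_and_depth(color_image, depth_image):
    """color_image: {(x, y): (r, g, b)}; depth_image: {(x, y): depth_cm}.
    Returns {(x, y, depth_cm): (r, g, b)}: the color value at each 3D point."""
    points_3d = {}
    for (x, y), rgb in color_image.items():
        z = depth_image[(x, y)]  # the same 2D coordinates index both images
        points_3d[(x, y, z)] = rgb
    return points_3d

# Position A: a white pixel 350 cm from the device; position B: black, 345 cm.
color = {(10, 20): (255, 255, 255), (11, 20): (0, 0, 0)}
depth = {(10, 20): 350, (11, 20): 345}
cloud = merge_color_and_depth(color, depth)
print(cloud[(10, 20, 350)])  # -> (255, 255, 255)
```

With real camera data the same idea applies per pixel over the full image grid.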
  • the photographed part refers to the part captured by the 3D camera module.
  • the photographed part of the photographed person may include a face, stomach, etc.
  • the buttocks and back are not the part to be photographed.
  • the electronic device may display a first user interface.
  • the first user interface may be a user interface provided by the character beautification function of the "camera" application.
  • the first user interface may be the user interface shown in FIGS. 4A-4G, the user interface shown in FIGS. 5A-5D, the user interface shown in FIGS. 6A-6G, the user interface shown in FIGS. 7A-7H, or the user interface shown in FIGS. 13A-13D.
  • when the first user interface is the user interface shown in FIGS. 4A-4G, the user interface shown in FIGS. 5A-5D, the user interface shown in FIGS. 6A-6G, the user interface shown in FIGS. 7A-7H, the user interface shown in FIGS. 8A-8B, the user interface shown in FIGS. 9A-9C, the user interface shown in FIGS. 10A-10G, the user interface shown in FIGS. 12A-12C, or the user interface shown in FIGS. 13A-13D,
  • the body treatment method adopted by the electronic device in the subsequent S108 is the body treatment method 1.
  • the body style option 310 is in a selected state, and the control 312 can be used for the user to select the body style for the person being photographed.
  • the body treatment method adopted by the electronic device in subsequent S108 is the body treatment method 2.
  • the first user interface can be used for the user to select a beauty template for the photographed person, and can also be used for the user to adjust the beauty level under the selected beauty template.
  • the user operation of the user selecting the beauty template and adjusting the beauty level reference may be made to the embodiments of FIGS. 11A to 11E in detail, and details are not described here.
  • the electronic device can use the color image of the photographed person and the human bone point positioning algorithm to identify the skeleton point of the person.
  • recognizing the bone point means identifying the 2D coordinates of the bone point.
  • the 2D coordinates of the identified bone points and the depth values of the bone points determined in S104 can be used by the electronic device to determine the body shape of the photographed person in subsequent S106, and can also be used by the electronic device in subsequent S105 to determine the angle between the photographed person and the plane where the electronic device is located.
  • the input of the human bone point positioning algorithm may be a color image of the human body
  • the output of the human bone point positioning algorithm may be the 2D coordinates of the human bone point.
  • the electronic device can specifically take the color image of the photographed person as input and obtain the 2D coordinates of each bone point in the color image of the photographed person through the human bone point positioning algorithm.
  • the electronic device can recognize the basic human bone points shown in FIG. 18, such as head point, neck point, left shoulder point, right shoulder point, left elbow point, right elbow point, left hand point, right hand point, left hip point, Right hip point, left knee point, right knee point, left foot point, right foot point.
  • the electronic device can also recognize more or fewer human bone points.
  • the human bone point positioning algorithm is obtained by training a neural network algorithm by using a large number of human color images and human bone points in the color image as training data.
  • the reliability of the finally trained human bone point positioning algorithm (i.e., the neural network algorithm) is relatively high; for example, the confidence is above 90%.
  • the electronic device may use the two-dimensional coordinates of the skeleton point of the photographed person and the depth information of the photographed person identified in S102 to determine the depth value of the skeleton point of the photographed person.
  • the determined depth values of the bone points and the 2D coordinates of the bone points determined in the foregoing S103 can be used by the electronic device to determine the body shape of the photographed person in subsequent S106, and can also be used by the electronic device in subsequent S105 to determine the angle between the photographed person and the plane where the electronic device is located.
  • the two-dimensional coordinates of the left hip point (position A) of the photographed person are (x1, y1). Since the pixel at the two-dimensional coordinates (x1, y1) has a depth value of 350 cm in the color image of the person being photographed, the depth value of the left hip point is 350 cm.
  • the electronic device can also determine the depth value of other body parts of the person being photographed, such as the depth value of the belly, because the depth value of the belly can be used to determine the fatness and thinness of the belly of the person being photographed.
  • the depth value of the belly may include the depth value of one or more feature points on the belly.
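The depth lookup described in S104 can be sketched as follows (a minimal illustration; the dictionary image representation and the coordinate values are assumed for demonstration only):

```python
# Hypothetical sketch: read each bone point's depth value from the depth image
# at the bone point's 2D coordinates, as described for S104.
def bone_point_depths(bone_points_2d, depth_image):
    """bone_points_2d: {name: (x, y)}; depth_image: {(x, y): depth_cm}.
    Returns {name: depth_cm} for each bone point."""
    return {name: depth_image[xy] for name, xy in bone_points_2d.items()}

# Assumed values: the left hip point at (120, 260) has a depth of 350 cm.
depth_image = {(120, 260): 350, (125, 400): 352}
bones = {"left_hip": (120, 260), "left_knee": (125, 400)}
print(bone_point_depths(bones, depth_image))
# -> {'left_hip': 350, 'left_knee': 352}
```

The same lookup can be applied to other feature points, such as points on the belly.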
  • the vertical distance between each bone point of the human body and the plane where the electronic device is located is usually not equal.
  • the more common situation is that the vertical distance between the body of the upper half of the human body (such as the head) and the plane where the electronic device is located is smaller, and the vertical distance between the body of the lower half of the human body (such as the legs) and the plane where the electronic device is located The distance is larger.
  • the electronic device can correct the visual effect of "a large head and short legs" by performing a perspective transformation on the color image before performing body treatment on the color image of the photographed person.
  • the electronic device may first determine the angle between the person being photographed and the plane where the electronic device is located. Then, the electronic device can calculate the perspective transformation matrix using the included angle, and finally use the perspective transformation matrix to perform perspective transformation on the color image of the photographed person.
  • the two-dimensional coordinates (x, y) of any pixel in the color image are first extended to homogeneous coordinates by setting the z value of the pixel to 1, and are then transformed into the 3D coordinates (X, Y, Z) by the following matrix transformation: (X, Y, Z)ᵀ = M · (x, y, 1)ᵀ, where M is the 3×3 perspective transformation matrix.
  • the two-dimensional coordinates (x, y) can be transformed into three-dimensional coordinates (X, Y, Z) through matrix transformation. Then X, Y can be divided by the value of Z to get x’, y’.
  • the original two-dimensional coordinates (x, y) are converted into new two-dimensional coordinates (x', y') to obtain a perspective-transformed color image.
  • the 2D coordinates of the human skeleton point after the perspective transformation can be obtained by multiplying the 2D coordinates of the human skeleton point before the perspective transformation by the transformation matrix.
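The homogeneous-coordinate mechanics described above can be sketched as follows (the 3×3 matrices used here are placeholders; computing the actual perspective matrix from the included angle is not shown):

```python
def perspective_transform_point(x, y, m):
    """Map 2D coordinates (x, y) through a 3x3 perspective matrix m:
    extend to (x, y, 1), multiply by m to get (X, Y, Z), then divide X and Y
    by Z to obtain the transformed 2D coordinates (x', y')."""
    X = m[0][0] * x + m[0][1] * y + m[0][2]
    Y = m[1][0] * x + m[1][1] * y + m[1][2]
    Z = m[2][0] * x + m[2][1] * y + m[2][2]
    return X / Z, Y / Z

# With the identity matrix, every point maps to itself (Z stays 1).
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(perspective_transform_point(4.0, 2.0, identity))  # -> (4.0, 2.0)
```

The same function applies to the 2D coordinates of the bone points, matching the statement that the transformed bone points are obtained by multiplying their coordinates by the transformation matrix.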
  • the electronic device can determine the angle θ between the photographed person and the plane where the electronic device is located in the following two ways:
  • Method 1: the angle θ is determined from the depth values and 2D coordinates of the bone points.
  • for example, the vertical distances between the left hip point P1 and the left knee point P2 of the photographed person and the plane where the electronic device is located are D1 and D2, respectively.
  • the electronic device can calculate multiple angle values from multiple sets of bone points, and then normalize the multiple angle values.
  • the normalized angle value can be determined as the angle θ between the photographed person and the plane where the electronic device is located.
  • Method 2: collect the pitch angle α of the electronic device through a sensor of the electronic device (such as a gyro sensor; the value range is -180° to 180°), and determine the included angle θ from the pitch angle α.
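Method 1 can be sketched as follows (a hedged illustration: it assumes the angle follows from the depth difference of a pair of bone points and their real-world separation, and that multiple pair-wise angles are combined by averaging; the embodiment's exact computation may differ):

```python
import math

def tilt_angle_deg(d1_cm, d2_cm, separation_cm):
    """Angle between the photographed person and the plane of the device,
    estimated from one pair of bone points: d1_cm and d2_cm are their
    vertical distances to the device; separation_cm is their real-world
    separation along the body."""
    return math.degrees(math.atan2(abs(d2_cm - d1_cm), separation_cm))

def averaged_tilt_angle_deg(pairs):
    """Combine the angle values computed from multiple pairs of bone points."""
    angles = [tilt_angle_deg(d1, d2, s) for d1, d2, s in pairs]
    return sum(angles) / len(angles)

print(tilt_angle_deg(350, 350, 40))         # -> 0.0 (person parallel to device)
print(round(tilt_angle_deg(350, 390, 40)))  # -> 45
```
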
  • the electronic device can determine the shape of the person to be photographed based on the color image of the person to be photographed and the depth information of the person to be photographed. If the electronic device performs perspective transformation on the color image collected by the 3D camera module, the color image of the photographed person may specifically be a color image after the perspective transformation of the photographed person.
  • the determined body shape of the photographed person can be used by the electronic device to perform a body treatment on the color image of the photographed person in S108.
  • the electronic device determines the figure of the person being photographed, which may include one or more of the following:
  • the electronic device can determine the length of the bone between the bone points (such as the length of the leg, the width of the shoulder, etc.), and can determine the proportion of the body of the person to be photographed, such as the head-to-body ratio, according to the length of each bone.
  • the electronic device may determine the length of the bone between bone points according to the depth values of the bone points and the 2D coordinates of the bone points. For example, as shown in FIG. 20, the vertical distances between the left hip point P1 and the left knee point P2 of the photographed person and the plane where the electronic device is located are D1 and D2, respectively. In the color image of the photographed person, the distance L between the left hip point P1 and the left knee point P2 can be calculated from the 2D coordinates of P1 and the 2D coordinates of P2.
  • the bone length X of the thigh between the left hip point P1 and the left knee point P2 can then be calculated as X = √(L² + (D1 − D2)²). If the electronic device performs perspective transformation on the color image collected by the 3D camera module, the 2D coordinates of the bone points (such as the left hip point P1 and the left knee point P2) may be the 2D coordinates of the bone points after perspective transformation.
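A sketch of this length computation, assuming L has already been converted to real-world units so that it and the depth difference D1 − D2 form the two legs of a right triangle (the Pythagorean relation is an interpretation of the geometry described above):

```python
import math

def bone_length_cm(l_cm, d1_cm, d2_cm):
    """Thigh length X between P1 and P2: l_cm is their separation in the
    image plane (real-world units); d1_cm and d2_cm are their depths."""
    return math.sqrt(l_cm ** 2 + (d1_cm - d2_cm) ** 2)

print(bone_length_cm(30.0, 350.0, 350.0))  # -> 30.0 (equal depths: X == L)
print(bone_length_cm(3.0, 354.0, 350.0))   # -> 5.0 (a 3-4-5 triangle)
```
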
  • the electronic device can determine the entire human body contour of the photographed person through the color image of the photographed person and the human body contour detection algorithm, and then can further determine the contour of the local body (such as the stomach, waist, and legs) between the bone points, that is, the local contour.
  • the local contour of the waist can be used to determine the waist circumference
  • the local contour of the leg can be used to determine the leg circumference, and so on. That is to say, the contour of the partial body can be used to determine the fatness and thinness of the partial body.
  • the input of the human body contour detection algorithm may be a color image of the human body
  • the output of the human body contour detection algorithm may be a set of two-dimensional coordinates representing the human body contour.
  • the electronic device can specifically take the perspective-transformed color image of the photographed person as input, and obtain the two-dimensional coordinate set of the human body contour in the perspective-transformed color image through the human body contour detection algorithm.
  • the human body contour detection algorithm is obtained by training a neural network algorithm by using a large number of human body color images and a set of two-dimensional coordinates of the human body contours in the color image as training data.
  • the reliability of the finally trained human body contour detection algorithm (that is, the neural network algorithm) is relatively high; for example, the confidence is above 90%.
  • the smaller the depth value of a partial body, the closer it is to the electronic device, and thus the higher the degree of protrusion of that partial body, that is, the fatter the partial body.
  • the protruding part is closer to the electronic device, that is, the depth value of the protruding part is smaller.
  • before performing body treatment on the color image of the photographed person (S108), as shown in FIG. 21, the electronic device may segment the collected color image into the color image of the photographed person (i.e., the foreground image) and the background image. Among them, (a) shows the color image of the photographed person, and (b) shows the background image.
  • the segmented color image of the photographed person can be used for subsequent body treatment. If the electronic device performs perspective transformation on the color image collected by the 3D camera module, the divided color image may be a color image after perspective transformation.
  • body treatment refers to processing the body image of the photographed person, such as image stretching, image compression, shading, etc., so that the body shape represented by the processed body image is beautified compared with the actual body shape of the photographed person. If the electronic device performs perspective transformation on the color image collected by the 3D camera module, the body image of the photographed person may specifically be the body image of the photographed person after perspective transformation.
  • the electronic device may perform the body treatment process when it detects a user operation for selecting a body level (that is, a user operation acting on the control 312).
  • for the method of body treatment at this time, refer to body treatment method 1 described in the following content.
  • the electronic device can perform the body treatment when it detects a user operation for selecting a body shape template, or a user operation for selecting a beauty level under the selected body shape template (that is, a user operation acting on the control 312).
  • for the method of body treatment at this time, refer to body treatment method 2 described in the following content.
  • the electronic device may fuse the color image and the background image of the photographed person after the body treatment.
  • the electronic device can also use the background image data at the edge of the hollow to interpolate the hollow part, so as to fill the hollow and obtain the restored image.
  • (a) of FIG. 22 shows a case where there is a hollow before the interpolation process is performed, and (b) of FIG. 22 shows a case where the hollow is repaired after the interpolation process is performed.
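The edge-based interpolation idea can be sketched in one dimension (a simplified illustration; the actual repair operates on 2D image data and may use more elaborate interpolation):

```python
def fill_hollow_row(row, start, end):
    """Fill row[start..end] (the hollow, marked None) by linearly
    interpolating between the background pixels at the hollow's edges."""
    left, right = row[start - 1], row[end + 1]
    span = end - start + 2  # steps from the left edge to the right edge
    for i in range(start, end + 1):
        t = (i - start + 1) / span
        row[i] = left + (right - left) * t
    return row

# A hollow of three pixels between background intensities 0 and 4.
print(fill_hollow_row([0, None, None, None, 4], 1, 3))
# -> [0, 1.0, 2.0, 3.0, 4]
```
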
  • the electronic device can update and display the restored image in the preview box when shooting the preview.
  • the body image of the photographed person included in the restored image has been subjected to body treatment, and the body shape represented by the body image is beautified compared with the actual body shape of the photographed person.
  • the image of the photographed person (that is, the foreground image) collected by the 3D camera module may be referred to as a first image, and the image of the photographed person subjected to body treatment may be referred to as a second image.
  • the electronic device may adjust the contour of the first body part of the photographed person in the first image according to the first body parameters, to obtain the contour of the first body part of the photographed person in the second image. That is to say, the contour of the first body part of the photographed person in the second image can be adjusted according to the first body parameters.
  • the electronic device may adjust the contour of the second body part of the person in the first image according to the second body parameters to obtain the contour of the second body part of the person in the second image. That is to say, the outline of the second body part of the photographed person in the second image is adjusted according to the second body parameters.
  • the first part and the second part may be different, and the first body parameter and the second body parameter may be different. That is to say, the electronic device can adjust the contours of different body parts in the first image by using different body parameters.
  • the following describes two implementation manners in which the electronic device performs body treatment on the body image of the photographed person.
  • the electronic device may perform body treatment on the body image of the photographed person according to the body treatment parameters corresponding to the selected body treatment level (the body treatment level selected by the user or the default body treatment level).
  • Table 1: Body parameters
  • Body part | Body parameter | Beautification effect
  • Legs | 0.1 | Elongate legs
  • Belly | -0.05 | Thin belly
  • Shoulders | 0.3 | Widen shoulders
  • Waist | -0.1 | Thin waist
  • Whole body | -0.09 | Thin whole body
  • the value 0.1 of the body parameter corresponding to the legs is a positive value, which can indicate that the legs are elongated; the value 0.1 can also indicate the degree of leg elongation, and the larger the value, the greater the elongation of the legs.
  • 0.1 can mean stretching the leg by 10%. The example is only an implementation manner of the embodiment of the present application, and should not constitute a limitation.
  • the value -0.05 of the body parameter corresponding to the belly is a negative value, which can indicate belly-slimming treatment; the absolute value of -0.05 can also indicate the degree of belly slimming, and the larger the absolute value, the greater the degree of belly slimming.
  • -0.05 can mean that the belly is thinned by 5%. The example is only an implementation manner of the embodiment of the present application, and should not constitute a limitation.
  • the value 0.3 of the body parameter corresponding to the shoulders is a positive value, which can indicate that the shoulders are widened; the value 0.3 can also indicate the degree of shoulder widening, and the larger the value, the greater the widening of the shoulders.
  • 0.3 may indicate a 30% widening of the shoulder.
  • the example is only an implementation manner of the embodiment of the present application, and should not constitute a limitation.
  • the value -0.1 of the body parameter corresponding to the waist is a negative value, which can indicate waist-slimming treatment; the absolute value of -0.1 can also indicate the degree of waist slimming, and the larger the absolute value, the greater the degree of waist slimming.
  • -0.1 can mean thinning the waist by 10%. The example is only an implementation manner of the embodiment of the present application, and should not constitute a limitation.
  • the value -0.09 of the body parameter corresponding to the whole body is a negative value, which can indicate that the whole body is slimmed; the absolute value of -0.09 can also indicate the degree of slimming.
  • for example, -0.09 can mean thinning the whole body by 9%.
  • the example is only an implementation manner of the embodiment of the present application, and should not constitute a limitation.
  • the body parameters corresponding to a specific body level may be a set of body parameters, which may include a parameter for each body part.
  • some body parameters in the set may be disabled.
  • For example, when the photographed person is a pregnant woman, the electronic device 100 may disable the belly-slimming parameter (for example, set its value to 0), i.e., not adjust the contour of the pregnant woman's belly in the first image, so that the belly contour of the photographed person in the second image matches that in the first image, preserving the characteristics of the pregnant woman.
  • Table 1 only exemplarily shows one implementation of the body parameters; the present application is not limited thereto.
  • the body parameters may also include more or fewer parameters, or other different parameters.
  • the body treatment refers to processing the color image of the body of the person being photographed.
  • the electronic device 100 may perform one or more of the following body treatments.
  • the electronic device 100 can perform the following body treatment on the leg image of the person 411 to achieve leg-lengthening beautification: first, in the image collected by the 3D camera module 193, determine the respective target positions of skeletal points such as the knee point and the foot point, according to the parameter value 0.1 corresponding to the legs and the actual leg length of the photographed person determined in S106; then stretch the leg image of the person 411 so that, after stretching, skeletal points such as the knee point and the foot point are at their respective target positions. In this way, the legs of the person 411 represented by the stretched leg image are longer than the actual legs of the person 411.
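The two-step stretch described above (compute target skeletal-point positions from the parameter value, then stretch the leg image so the points land there) can be sketched as follows. This is a minimal numpy-only illustration, not the application's actual warping algorithm; the function name, the row-remap strategy, and the row-based point layout are assumptions.

```python
import numpy as np

def lengthen_legs(image, hip_row, foot_row, strength=0.1):
    """Stretch the rows below hip_row so the foot skeletal point moves
    `strength` (e.g. 10%, per the leg parameter value 0.1) farther
    from the hip. Returns the stretched image and the new foot row."""
    h = image.shape[0]
    old_len = h - hip_row                          # rows occupied by the legs
    new_len = int(round(old_len * (1 + strength)))
    rows = np.concatenate([
        np.arange(hip_row),                                         # torso kept
        hip_row + np.linspace(0, old_len - 1, new_len).astype(int)  # legs stretched
    ])
    stretched = image[rows]
    new_foot_row = hip_row + int(round((foot_row - hip_row) * (1 + strength)))
    return stretched, new_foot_row
```

A nearest-neighbour row remap is used only for brevity; a real implementation would interpolate smoothly and confine the warp to the leg region of the portrait mask.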
  • the electronic device 100 can perform the following body treatment on the belly image of the person 411 to achieve belly slimming: compress the belly image of the person 411 so that the belly represented by the processed image is thinner (or narrower) than the actual belly of the person 411; and add shading to the belly image so that the belly represented by the processed image is flatter than the actual belly of the person 411, i.e., the degree of protrusion of the belly is reduced.
  • the body parameter value -0.05 corresponding to the belly, together with the actual contour and depth values of the belly of the photographed person determined in S106, can be used to determine the range of the belly image to be compressed and the target positions after compression.
  • the parameter value -0.05 can also be used to determine the image range of the shading added to the belly image and the brightness of the shadow. In this way, the belly represented by the processed belly image is thinner (narrower and flatter) than the actual belly of the person 411.
  • the electronic device 100 can perform the following body treatment on the shoulder image of the person 411 to achieve shoulder-widening beautification: first, in the image collected by the 3D camera module 193, determine the respective target positions of the left shoulder point and the right shoulder point, according to the body parameter value 0.3 corresponding to shoulder widening and the shoulder width determined in S106; then stretch the shoulder image of the person 411 so that, after stretching, the left shoulder point and the right shoulder point are at their respective target positions. In this way, the shoulders of the person 411 represented by the stretched shoulder image are wider than the actual shoulders of the person 411.
  • the electronic device 100 can perform the following body treatment on the waist image of the person 411 to achieve waist slimming: compress the middle part of the waist image of the person 411 so that the waist represented by the processed waist image is thinner than the actual waist of the person 411.
  • the body parameter value -0.1 corresponding to the waist, together with the waist contour determined in S106, can be used to determine the range of the waist image to be compressed and the target positions after compression.
  • the electronic device 100 can perform the following body treatment to achieve whole-body slimming: compress the color image of the entire body of the person 411, excluding the face image of the person 411, so that the processed body image is thinner than the actual figure of the person 411.
  • compressing the color image of the entire body of the person 411 may specifically include separately compressing the color images of each partial body between the skeletal points, so that each partial body represented by its color image (such as the arms, legs, belly, and waist) is thinner than the corresponding actual body part of the person 411.
  • the color images of some partial bodies (such as the belly image) can also be shaded to present a more natural slimming effect.
  • the body parameter value -0.09 corresponding to the whole body, together with the contour of the entire body determined in S106, can be used to determine the range of the whole-body color image, or of each partial-body color image, to be compressed, and the target positions after compression.
  • the electronic device 100 may also perform other body treatments to achieve other body beautification.
  • the body parameters corresponding to each body level may be equal to the base body parameters multiplied by the weight.
  • the specific implementation may include:
  • the body parameters corresponding to the body level 10 may be used as the basic body parameters.
  • the body parameters corresponding to other body levels can be obtained by multiplying the basic body parameters by the weight, and the weight is determined by the body level.
  • the weight corresponding to the body level 10 is 1, and the weight corresponding to other body levels may be greater than 0 and less than 1.
  • the smaller the body level, the smaller the weight.
  • For example, Table 1 shows the body parameters corresponding to body level 10, and the weight corresponding to body level 9 is 0.9.
  • the body parameters corresponding to body level 9 are then: 0.1*0.9 (parameter for leg lengthening), -0.05*0.9 (parameter for belly slimming), 0.3*0.9 (parameter for shoulder widening), -0.1*0.9 (parameter for waist slimming), -0.09*0.9 (parameter for whole-body slimming).
  • the body parameters corresponding to the body level 1 may be used as the basic body parameters.
  • the body parameters corresponding to other body levels can be obtained by multiplying the basic body parameters by the weight, and the weight is determined by the body level.
  • the weight corresponding to body level 1 is 1, and the weight corresponding to other body levels may be greater than 1.
  • the greater the body level, the greater the weight.
  • For example, Table 1 shows the body parameters corresponding to body level 1, and the weight corresponding to body level 5 is 1.2.
  • the body parameters corresponding to body level 5 are then: 0.1*1.2 (parameter for leg lengthening), -0.05*1.2 (parameter for belly slimming), 0.3*1.2 (parameter for shoulder widening), -0.1*1.2 (parameter for waist slimming), -0.09*1.2 (parameter for whole-body slimming).
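A minimal sketch of the first weighting scheme (body level 10 as the base), assuming a linear weight of level/10, which reproduces the level-9 example above (weight 0.9); the application does not fix any particular formula, and the part names are illustrative.

```python
# Base body parameters from Table 1 (body level 10 in the first scheme).
BASE_PARAMS = {"legs": 0.1, "belly": -0.05, "shoulders": 0.3,
               "waist": -0.1, "whole_body": -0.09}

def params_for_level(level, base_level=10):
    # Assumed linear rule: weight = level / base_level, so level 10
    # gets weight 1 and lower levels get proportionally smaller weights.
    weight = level / base_level
    return {part: round(value * weight, 4) for part, value in BASE_PARAMS.items()}
```

The second scheme (base level 1, weights above 1) would use a different, growing weight table, e.g. mapping level 5 to 1.2 as in the example above.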
  • the body parameters corresponding to each body level may also be pre-configured separately, without being related by a weight multiple. The present application does not limit how the body parameters corresponding to each body level are specified.
  • the body parameters used to perform body treatment on the color image of the photographed person's body may also be related to the gender, age, and other characteristics of the photographed person. There may be some differences between the body parameters corresponding to men and those corresponding to women, because men's and women's body beautification needs are usually different. For example, men usually prefer wider shoulders while women prefer narrower shoulders, so the shoulder-widening parameter value for men may be larger than that for women. For another example, women prefer a slim waist more than men do, so the absolute value of the waist-slimming parameter for women may be greater than that for men.
  • There may also be some differences between the body parameters corresponding to people of different ages, because the beautification needs of people of different ages can also differ. In this way, the body treatment applied in the preview box 301 to photographed persons of different genders, ages, and so on can be differentiated, achieving a better body-shaping imaging effect. The present application does not limit how the body parameters are differentiated according to characteristics of the photographed person such as gender and age.
  • the body parameters used to perform body treatment on the color image of the photographed person's body may also be related to the actual figure of the photographed person.
  • For example, the body parameters corresponding to a fat person can be obtained by multiplying the body parameters exemplarily shown in Table 1 by a weight of 1.2, to strengthen the beautification.
  • the body parameters corresponding to a thin person can be obtained by multiplying the body parameters exemplarily shown in Table 1 by a weight of 0.8, to weaken the beautification.
  • In this way, the body treatment applied to photographed persons of different body types can be differentiated, achieving a better body-shaping imaging effect.
  • the body parameters corresponding to people of different body types may also be pre-configured separately, without being related by a weight multiple. The present application does not limit how the body parameters corresponding to people of different body types are defined.
  • the electronic device may use the selected body shape template (the body shape template selected by the user or the default body shape template) to perform body treatment on the color image of the photographed person.
  • For UI embodiments, reference may be made to FIGS. 11A-11E.
  • the posture of the body shape template and the posture of the photographed person may be different.
  • the posture of the photographed person can be determined according to the color image of the photographed person and the depth information of the photographed person.
  • the electronic device can transform the posture of the body shape template into the posture of the photographed person through a similarity transformation. Specifically, the electronic device can compare the displacements of the skeletal points of the two postures in two-dimensional space, as well as the relative angles between the limbs connected by the skeletal points of the photographed person. Then, the electronic device can rotate or translate the skeletal points of the body shape template and the limbs connected to those skeletal points, so that the posture of the transformed template is consistent with the posture of the photographed person.
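The rotate-and-translate step for the template's skeletal points can be sketched as a plain 2-D rigid transform. This is a hypothetical helper, not the application's actual similarity-transformation code; the angle and translation would come from comparing the two skeletons as described above.

```python
import numpy as np

def align_template_points(template_pts, angle_deg, translation):
    # Rotate the template's skeletal points by the relative limb angle,
    # then translate them onto the photographed person's skeleton.
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    return np.asarray(template_pts, float) @ rot.T + np.asarray(translation, float)
```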
  • In this way, the electronic device can align the skeletal points of the photographed person with the skeletal points of the body shape template, and then, according to the difference between the body contour of the template and that of the photographed person, apply stretching treatment (such as leg lengthening) or compression treatment (such as leg slimming) to the body image of the photographed person.
  • the degree of stretching or compression can be determined by the body level selected under the chosen body shape template: the higher the body level, the greater the degree of stretching or compression.
  • For example, as shown in FIGS. 11D-11E, the image of the photographed person's limb can be stretched (e.g., leg lengthening) or compressed (e.g., leg slimming) so that the contour of the limb represented by the processed image is consistent with the contour of the corresponding limb of the body shape template.
  • the electronic device may perform image preprocessing.
  • the image pre-processing may include pre-processing operations such as image enhancement, depth complementation, and denoising of color images and depth information.
  • the image enhancement may be to enhance the brightness of the image for a dark or low-light environment.
  • Depth completion may refer to smoothing the holes in the depth map.
  • Denoising can refer to removing noise from the depth map, to prevent noisy depth values from affecting the calculation of the skeletal-point depths.
  • the electronic device may detect in S105 that the angle between the photographed person and the plane on which the electronic device is located is too large, exceeding a preset angle threshold (such as 65°). In response to this situation, the electronic device may neither perform perspective transformation on the color image collected by the 3D camera module nor perform body treatment on the color image of the photographed person. In addition, the electronic device can output prompt information in the preview box 301 to remind the user that body treatment is not supported in this case.
  • the photo preview method described in the embodiment of FIG. 16 can adjust the body image of the photographed person displayed in the preview box during shooting preview, so that the body shape represented by the adjusted image is beautified compared with the actual body shape of the photographed person. The user's operation is simple and intuitive, which can effectively improve the utilization rate of the electronic device.
  • FIG. 23 shows a structural diagram of the functional modules included in an electronic device, in conjunction with the 3D camera module 193. These modules are described below:
  • the 3D camera module 193 can be used to collect color images and depth information.
  • the color image may include the color image of the person to be photographed and the color image of the background.
  • the depth information may include depth information of the person being photographed and depth information of the background.
  • the color image of the photographed person may include a body image and a face image.
  • the 3D camera module 193 may be composed of a color camera module and a depth camera module.
  • the depth camera module may be a TOF depth camera module or a structured light camera module.
  • the human skeletal point positioning algorithm module can be used to identify the skeletal points of the photographed person by applying a human skeletal point positioning algorithm to the color image of the photographed person.
  • For the specific implementation of the human skeletal point positioning algorithm module, reference may be made to S103 in the method embodiment of FIG. 16; details are not described here again.
  • the skeletal point depth acquisition module can be used to determine the depth values of the skeletal points of the photographed person, based on the 2D coordinates of the skeletal points identified by the human skeletal point positioning algorithm module and the depth information of the photographed person collected by the 3D camera module 193.
  • For the specific implementation of the skeletal point depth acquisition module, reference may be made to S104 in the method embodiment of FIG. 16; details are not described here again.
  • the bone length calculation module can be used to determine the length of the bone between skeletal points, according to the depth value of each skeletal point determined by the skeletal point depth acquisition module and the 2D coordinates of the skeletal points identified by the human skeletal point positioning algorithm module.
  • For the specific implementation of the bone length calculation module, reference may be made to S106 in the method embodiment of FIG. 16; details are not described here.
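The module's computation (back-project each skeletal point from its 2D coordinates and depth value, then measure the 3-D distance) can be sketched with a pinhole camera model. The intrinsics below are placeholder values, not parameters disclosed by the application.

```python
import math

def bone_length(pt1, pt2, depth1, depth2,
                fx=1000.0, fy=1000.0, cx=320.0, cy=240.0):
    # Back-project (u, v, depth) into camera space, then take the
    # Euclidean distance between the two skeletal points.
    def to_3d(pt, depth):
        u, v = pt
        return ((u - cx) * depth / fx, (v - cy) * depth / fy, depth)
    return math.dist(to_3d(pt1, depth1), to_3d(pt2, depth2))
```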
  • the perspective transformation module can be used to determine the angle between the photographed person and the plane where the electronic device is located, based on the depth value of each skeletal point determined by the skeletal point depth acquisition module and the bone lengths determined by the bone length calculation module; to calculate a perspective transformation matrix from this angle; and finally to use the perspective transformation matrix to perform perspective transformation on the color image of the photographed person.
  • the perspective transformation module may also determine this angle according to the pitch angle of the electronic device.
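For a pure camera rotation, a perspective transformation matrix can be built as H = K·R·K⁻¹ in homogeneous pixel coordinates. The sketch below illustrates that construction under this assumption (placeholder intrinsics; the application does not disclose how its matrix is actually computed from the angle).

```python
import numpy as np

def perspective_matrix(angle_deg, f=1000.0, cx=320.0, cy=240.0):
    # Homography compensating a tilt of angle_deg about the camera's
    # horizontal axis: H = K @ R @ inv(K) for a pure rotation.
    t = np.deg2rad(angle_deg)
    K = np.array([[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]])
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0, np.cos(t), -np.sin(t)],
                  [0.0, np.sin(t),  np.cos(t)]])
    return K @ R @ np.linalg.inv(K)
```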
  • the portrait segmentation module can be used to segment the color image obtained by the perspective transformation module into a color image of the photographed person (i.e., the foreground image) and a background image.
  • the body treatment module can be used to process the body image of the photographed person after the perspective transformation, for example by image stretching, image compression, and shading, so that the body shape represented by the processed body image is beautified compared with the actual body shape of the photographed person.
  • the body treatment module may be used to perform body treatment on the body image of the photographed person according to the body parameters corresponding to the selected body level (the body level selected by the user or the default body level).
  • the body parameters can be determined by the body parameter determination module.
  • Alternatively, the body treatment module may be used to perform body treatment on the color image of the photographed person using the selected body shape template (the template selected by the user or the default template).
  • For the specific implementation of the body treatment module, reference may be made to S108 in the method embodiment of FIG. 16; details are not described here again.
  • the image restoration module can be used to fuse the body-treated color image of the photographed person with the background image. During image fusion, in order to avoid hollows between the processed limb image and the background image, the image restoration module can also interpolate the hollow regions using the background image data at the hollow edges, completing the hollow regions to obtain the repaired image.
  • the image finally output by the image restoration module can be updated and displayed in the preview box of the first user interface.
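The hollow-completion step (interpolating a hole from the background pixels at its edge) can be sketched with a simple iterative neighbour-averaging fill. Everything here is an assumed minimal stand-in; a production system would use proper image inpainting.

```python
import numpy as np

def fill_hollow(image, hole_mask, iterations=10):
    """Fill masked (hollow) pixels by repeatedly averaging the known
    4-neighbour pixels at each hole edge. `image` is a 2-D float array,
    `hole_mask` a boolean array marking the hollow region."""
    img = image.astype(float).copy()
    known = ~hole_mask
    for _ in range(iterations):
        acc = np.zeros_like(img)
        cnt = np.zeros_like(img)
        # Shift the known pixels in 4 directions and sum contributions.
        # (np.roll wraps at image borders; acceptable for this sketch.)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            acc += np.roll(img * known, (dy, dx), axis=(0, 1))
            cnt += np.roll(known.astype(float), (dy, dx), axis=(0, 1))
        fillable = hole_mask & (cnt > 0)
        img[fillable] = (acc / np.maximum(cnt, 1))[fillable]
        known = known | fillable
        hole_mask = hole_mask & ~fillable
    return img
```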

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)

Abstract

The invention concerns a photo preview method for an electronic device, a graphical user interface, and an electronic device, the method comprising: adjusting a body image of a photographed person during photo preview or video preview, so that the body represented by the adjusted body image is beautified compared with the actual body of the photographed person. The body beautification may include: beautifying body proportions (for example, lengthening the legs and widening the shoulders), and adjusting body dimensions (for example, slimming the waist, legs, belly, and hips, or enlarging body parts). The described solution allows a user to beautify the body of a photographed person during image-capture preview, and the operation is intuitive, simple, and effective.
PCT/CN2019/122516 2018-12-26 2019-12-03 Procédé de prévisualisation de photo pour dispositif électronique, interface graphique d'utilisateur et dispositif électronique WO2020134891A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811608420.2 2018-12-26
CN201811608420.2A CN109495688B (zh) 2018-12-26 2018-12-26 电子设备的拍照预览方法、图形用户界面及电子设备

Publications (1)

Publication Number Publication Date
WO2020134891A1 true WO2020134891A1 (fr) 2020-07-02

Family

ID=65712517

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/122516 WO2020134891A1 (fr) 2018-12-26 2019-12-03 Procédé de prévisualisation de photo pour dispositif électronique, interface graphique d'utilisateur et dispositif électronique

Country Status (2)

Country Link
CN (1) CN109495688B (fr)
WO (1) WO2020134891A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111861868A (zh) * 2020-07-15 2020-10-30 广州光锥元信息科技有限公司 用于视频中人像美化的图像处理方法及装置
CN112380990A (zh) * 2020-11-13 2021-02-19 咪咕文化科技有限公司 图片调节方法、电子设备和可读存储介质
CN112770058A (zh) * 2021-01-22 2021-05-07 维沃移动通信(杭州)有限公司 拍摄方法、装置、电子设备和可读存储介质
CN115225753A (zh) * 2021-04-19 2022-10-21 华为技术有限公司 拍摄方法、相关装置及系统
CN115278060A (zh) * 2022-07-01 2022-11-01 北京五八信息技术有限公司 一种数据处理方法、装置、电子设备及存储介质

Families Citing this family (23)

Publication number Priority date Publication date Assignee Title
CN109495688B (zh) * 2018-12-26 2021-10-01 华为技术有限公司 电子设备的拍照预览方法、图形用户界面及电子设备
CN114666435B (zh) * 2019-04-19 2023-03-28 华为技术有限公司 使用电子设备的增强功能的方法、电子设备、芯片及存储介质
CN111866404B (zh) * 2019-04-25 2022-04-29 华为技术有限公司 一种视频编辑方法及电子设备
CN110289072A (zh) * 2019-05-10 2019-09-27 咪咕互动娱乐有限公司 健身方案的生成方法、装置、电子设备及可读存储介质
CN110264431A (zh) * 2019-06-29 2019-09-20 北京字节跳动网络技术有限公司 视频美化方法、装置及电子设备
CN110727488B (zh) * 2019-09-09 2022-03-25 联想(北京)有限公司 一种信息处理方法、电子设备及计算机可读存储介质
CN110719455A (zh) * 2019-09-29 2020-01-21 深圳市火乐科技发展有限公司 视频投影方法及相关装置
CN110852958B (zh) * 2019-10-11 2022-12-16 北京迈格威科技有限公司 基于物体倾斜角度的自适应校正方法和装置
CN111064887A (zh) * 2019-12-19 2020-04-24 上海传英信息技术有限公司 终端设备的拍照方法、终端设备和计算机可读存储介质
CN111107281B (zh) * 2019-12-30 2022-04-12 维沃移动通信有限公司 图像处理方法、装置、电子设备及介质
CN113382154A (zh) * 2020-02-25 2021-09-10 荣耀终端有限公司 基于深度的人体图像美化方法及电子设备
CN111339971B (zh) * 2020-03-02 2022-06-28 北京字节跳动网络技术有限公司 视频中人体肩颈处理方法、装置及电子设备
CN111402116A (zh) * 2020-03-11 2020-07-10 北京字节跳动网络技术有限公司 图片中人体腰部美体处理方法、装置及电子设备
CN111311519A (zh) * 2020-03-12 2020-06-19 北京字节跳动网络技术有限公司 视频中人体腰部美体处理方法、装置及电子设备
CN111405198A (zh) * 2020-03-23 2020-07-10 北京字节跳动网络技术有限公司 视频中人体胸部美体处理方法、装置及电子设备
CN111310749A (zh) * 2020-03-23 2020-06-19 北京字节跳动网络技术有限公司 视频中人体臀部美体处理方法、装置及电子设备
CN111445514A (zh) * 2020-03-24 2020-07-24 北京字节跳动网络技术有限公司 图片中人体臀部处理方法、装置及电子设备
CN111445405B (zh) * 2020-03-24 2022-06-17 北京字节跳动网络技术有限公司 图片中人体肩颈处理方法、装置及电子设备
CN111885333A (zh) * 2020-06-15 2020-11-03 东方通信股份有限公司 一种采集三维音视频及运动姿态的装置及方法
CN112035195A (zh) * 2020-07-30 2020-12-04 北京达佳互联信息技术有限公司 应用界面的展示方法、装置、电子设备及存储介质
CN112383713B (zh) * 2020-11-13 2022-02-22 艾体威尔电子技术(北京)有限公司 自动加载人脸mipi或者人脸usb摄像头的驱动方法
CN112641441B (zh) * 2020-12-18 2024-01-02 河南翔宇医疗设备股份有限公司 一种体态评估方法、系统、装置及计算机可读存储介质
CN113489895B (zh) * 2021-06-23 2022-05-31 荣耀终端有限公司 确定推荐场景的方法及电子设备

Citations (9)

Publication number Priority date Publication date Assignee Title
US20130141605A1 (en) * 2011-12-06 2013-06-06 Youngkoen Kim Mobile terminal and control method for the same
CN104159032A (zh) * 2014-08-20 2014-11-19 广东欧珀移动通信有限公司 一种实时调整相机拍照美颜效果的方法及装置
CN105227832A (zh) * 2015-09-09 2016-01-06 厦门美图之家科技有限公司 一种基于关键点检测的自拍方法、自拍系统及拍摄终端
CN106991654A (zh) * 2017-03-09 2017-07-28 广东欧珀移动通信有限公司 基于深度的人体美化方法和装置及电子装置
CN107077719A (zh) * 2014-09-05 2017-08-18 波莱特股份有限公司 数码照片中基于深度图的透视校正
CN107124548A (zh) * 2017-04-25 2017-09-01 深圳市金立通信设备有限公司 一种拍照方法及终端
JP2017199008A (ja) * 2017-06-12 2017-11-02 フリュー株式会社 写真撮影遊戯機、制御方法、並びにプログラム
CN107808137A (zh) * 2017-10-31 2018-03-16 广东欧珀移动通信有限公司 图像处理方法、装置、电子设备及计算机可读存储介质
CN109495688A (zh) * 2018-12-26 2019-03-19 华为技术有限公司 电子设备的拍照预览方法、图形用户界面及电子设备

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN105303523A (zh) * 2014-12-01 2016-02-03 维沃移动通信有限公司 一种图像处理方法及移动终端
US20160357578A1 (en) * 2015-06-03 2016-12-08 Samsung Electronics Co., Ltd. Method and device for providing makeup mirror
CN105657249A (zh) * 2015-12-16 2016-06-08 东莞酷派软件技术有限公司 一种图像处理方法及用户终端
CN106445168A (zh) * 2016-11-01 2017-02-22 中南大学 一种智能手套及其使用方法
CN107123081A (zh) * 2017-04-01 2017-09-01 北京小米移动软件有限公司 图像处理方法、装置及终端
CN107590481A (zh) * 2017-09-28 2018-01-16 北京小米移动软件有限公司 穿衣镜、数据处理方法及装置

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
US20130141605A1 (en) * 2011-12-06 2013-06-06 Youngkoen Kim Mobile terminal and control method for the same
CN104159032A (zh) * 2014-08-20 2014-11-19 广东欧珀移动通信有限公司 一种实时调整相机拍照美颜效果的方法及装置
CN107077719A (zh) * 2014-09-05 2017-08-18 波莱特股份有限公司 数码照片中基于深度图的透视校正
CN105227832A (zh) * 2015-09-09 2016-01-06 厦门美图之家科技有限公司 一种基于关键点检测的自拍方法、自拍系统及拍摄终端
CN106991654A (zh) * 2017-03-09 2017-07-28 广东欧珀移动通信有限公司 基于深度的人体美化方法和装置及电子装置
CN107124548A (zh) * 2017-04-25 2017-09-01 深圳市金立通信设备有限公司 一种拍照方法及终端
JP2017199008A (ja) * 2017-06-12 2017-11-02 フリュー株式会社 写真撮影遊戯機、制御方法、並びにプログラム
CN107808137A (zh) * 2017-10-31 2018-03-16 广东欧珀移动通信有限公司 图像处理方法、装置、电子设备及计算机可读存储介质
CN109495688A (zh) * 2018-12-26 2019-03-19 华为技术有限公司 电子设备的拍照预览方法、图形用户界面及电子设备

Cited By (7)

Publication number Priority date Publication date Assignee Title
CN111861868A (zh) * 2020-07-15 2020-10-30 广州光锥元信息科技有限公司 用于视频中人像美化的图像处理方法及装置
CN111861868B (zh) * 2020-07-15 2023-10-27 广州光锥元信息科技有限公司 用于视频中人像美化的图像处理方法及装置
CN112380990A (zh) * 2020-11-13 2021-02-19 咪咕文化科技有限公司 图片调节方法、电子设备和可读存储介质
CN112770058A (zh) * 2021-01-22 2021-05-07 维沃移动通信(杭州)有限公司 拍摄方法、装置、电子设备和可读存储介质
CN115225753A (zh) * 2021-04-19 2022-10-21 华为技术有限公司 拍摄方法、相关装置及系统
CN115278060A (zh) * 2022-07-01 2022-11-01 北京五八信息技术有限公司 一种数据处理方法、装置、电子设备及存储介质
CN115278060B (zh) * 2022-07-01 2024-04-09 北京五八信息技术有限公司 一种数据处理方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
CN109495688B (zh) 2021-10-01
CN109495688A (zh) 2019-03-19

Similar Documents

Publication Publication Date Title
WO2020134891A1 (fr) Procédé de prévisualisation de photo pour dispositif électronique, interface graphique d'utilisateur et dispositif électronique
WO2021169394A1 (fr) Procédé d'embellissement d'une image du corps humain sur la base de la profondeur et dispositif électronique
WO2021103981A1 (fr) Procédé et appareil de traitement d'affichage à écran divisé, et dispositif électronique
WO2021000881A1 (fr) Procédé de division d'écran et dispositif électronique
CN113645351B (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2020125410A1 (fr) Procédé de traitement d'image et dispositif électronique
WO2020077511A1 (fr) Procédé permettant d'afficher une image dans une scène photographique et dispositif électronique
CN112262563B (zh) 图像处理方法及电子设备
WO2021036585A1 (fr) Procédé d'affichage sur écran souple, et dispositif électronique
WO2020029306A1 (fr) Procédé de capture d'image et dispositif électronique
WO2021013132A1 (fr) Procédé d'entrée et dispositif électronique
WO2021169399A1 (fr) Procédé de mise en cache d'une interface d'application, et appareil électronique
WO2023065873A1 (fr) Procédé de réglage de fréquence d'images, dispositif terminal et système de réglage de fréquence d'images
CN114115769A (zh) 一种显示方法及电子设备
WO2020118490A1 (fr) Procédé de division d'écran automatique, interface utilisateur graphique et dispositif électronique
WO2020113534A1 (fr) Procédé de photographie d'image à longue exposition et dispositif électronique
WO2020173152A1 (fr) Procédé de prédiction d'apparence faciale et dispositif électronique
WO2023241209A1 (fr) Procédé et appareil de configuration de papier peint de bureau, dispositif électronique et support de stockage lisible
WO2021042878A1 (fr) Procédé photographique et dispositif électronique
CN112150499A (zh) 图像处理方法及相关装置
CN115967851A (zh) 快速拍照方法、电子设备及计算机可读存储介质
WO2022206494A1 (fr) Procédé et dispositif de suivi de cible
CN115115679A (zh) 一种图像配准方法及相关设备
WO2022012418A1 (fr) Procédé de photographie et dispositif électronique
CN113973189A (zh) 显示内容的切换方法、装置、终端及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19904164

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19904164

Country of ref document: EP

Kind code of ref document: A1