WO2022143118A1 - Image processing method and electronic device - Google Patents

Image processing method and electronic device

Info

Publication number
WO2022143118A1
Authority
WO
WIPO (PCT)
Prior art keywords
picture
screen
flexible screen
expanded state
electronic device
Prior art date
Application number
PCT/CN2021/137401
Other languages
English (en)
Chinese (zh)
Inventor
蒋东生
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN202011641789.0A external-priority patent/CN114690998B/zh
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022143118A1 publication Critical patent/WO2022143118A1/fr

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412 Digitisers structurally integrated in a display
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F2203/04102 Flexible digitiser, i.e. constructional details for allowing the whole digitising part of a device to be flexed or rolled like a sheet of paper
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general, involving image mosaicing
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/20 Special algorithmic details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20132 Image cropping
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to a picture processing method and electronic device.
  • Flexible screens are typically based on organic light-emitting diode (OLED) technology. Compared with a traditional display screen, a flexible screen offers a larger display area, can be bent, and has better flexibility.
  • FIG. 1 shows a smart phone with a flexible screen, and the flexible screen of the smart phone can be folded or bent.
  • Users can use the smartphone to perform various types of image processing, such as cropping and splitting, merging and stitching, or background replacement.
  • In existing solutions, however, the user needs to perform frequent manual operations when using the smartphone to process a picture, so the user experience is poor.
  • the present application provides a picture processing method and electronic device, which can solve the problems existing in the existing picture processing methods.
  • In a first aspect, the present application provides a picture processing method applied to an electronic device with a flexible screen, where the physical form of the flexible screen includes an expanded state and a non-expanded state. The method includes: when the flexible screen is in the expanded state, displaying a first picture on the flexible screen; receiving a user's cropping and splitting operation on the first picture; and, upon determining that the flexible screen is transformed from the expanded state to the non-expanded state, cropping and splitting the first picture into a second picture and a third picture.
  • In this solution, after receiving the user's cropping and splitting operation on the selected first picture, the electronic device determines that its flexible screen has changed from the expanded state to the non-expanded state and crops and splits the first picture into the second picture and the third picture. With this method, after the user selects the first picture and inputs the cropping and splitting operation, the user only needs to fold the flexible screen of the electronic device for the first picture to be cropped and split into two pictures; no manual cropping and splitting is required, the processing is simpler, and the user experience is better.
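The fold-triggered crop-and-split step can be sketched in Python. This is an illustrative sketch, not the patent's implementation: a picture is modeled as a list of pixel rows, and the fold is assumed to be a vertical line at a known column index.

```python
def crop_and_split(picture, fold_column):
    """Crop and split a picture into two along a vertical fold line.

    `picture` is a list of pixel rows; `fold_column` is the column where
    the flexible screen's fold crosses the picture (an assumed model).
    """
    second = [row[:fold_column] for row in picture]  # left half
    third = [row[fold_column:] for row in picture]   # right half
    return second, third

# A toy 2x4 "picture" folded down the middle yields two 2x2 pictures.
pic = [[1, 2, 3, 4],
       [5, 6, 7, 8]]
second, third = crop_and_split(pic, 2)
```

On a real device the two halves would then be handed to the gallery for storage; here they are just returned.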
  • When the flexible screen is in the non-expanded state, the flexible screen is divided into a first screen and a second screen, and the included angle between the first screen and the second screen is less than or equal to a first preset angle. Determining that the flexible screen is transformed from the expanded state to the non-expanded state, and cropping and splitting the first picture into a second picture and a third picture, includes: determining that the included angle is less than or equal to the first preset angle, and cropping and splitting the first picture into the second picture and the third picture.
  • In this solution, the physical form of the flexible screen can be determined by detecting the change of the included angle between the first screen and the second screen, and that detected change can then trigger the cropping and splitting of the first picture. The processing is relatively simple and has good applicability.
  • Determining that the included angle is less than or equal to the first preset angle, and cropping and splitting the first picture into a second picture and a third picture, includes: determining that the included angle is a second preset angle, and cropping and splitting the first picture into the second picture and the third picture, where the second preset angle is less than or equal to the first preset angle.
  • In this solution, folding the flexible screen so that the included angle reaches the second preset angle triggers the cropping and splitting of the first picture, which is convenient for the user to preview and provides a better user experience.
  • The method further includes: determining that the included angle is a third preset angle, and storing the second picture and the third picture; the third preset angle is smaller than the second preset angle.
  • In this solution, folding the flexible screen so that the included angle between the first screen and the second screen reaches a third preset angle different from the second preset angle triggers the storage of the second picture and the third picture, which reserves sufficient preview time for the user and provides a better user experience.
  • The method further includes: determining that the included angle is a fourth preset angle, and storing the second picture, where the fourth preset angle is smaller than the second preset angle; and determining that the included angle is a fifth preset angle, and storing the third picture, where the fifth preset angle is smaller than the fourth preset angle.
  • In this solution, folding the flexible screen to different included angles between the first screen and the second screen triggers the storage of the second picture and the third picture separately, which is convenient for the user to preview each picture and provides a better user experience.
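The staged thresholds described above can be sketched as a simple decision over the fold angle. All numeric values here are assumptions; the patent only requires the ordering fifth < fourth < second ≤ first.

```python
# Illustrative angles in degrees; only their ordering matters.
SECOND_PRESET = 150   # crop/split preview is triggered here
FOURTH_PRESET = 90    # the second picture is stored here
FIFTH_PRESET = 30     # the third picture is stored here

def fold_actions(included_angle):
    """Return the actions that have been triggered once the fold reaches
    `included_angle` (a sketch of the staged-threshold scheme)."""
    actions = []
    if included_angle <= SECOND_PRESET:
        actions.append("crop_and_split")
    if included_angle <= FOURTH_PRESET:
        actions.append("store_second_picture")
    if included_angle <= FIFTH_PRESET:
        actions.append("store_third_picture")
    return actions
```

As the user closes the screen further, more actions accumulate, which matches the staged preview-then-store behaviour the claims describe.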
  • Determining that the flexible screen is transformed from the expanded state to the non-expanded state, and cropping and splitting the first picture into a second picture and a third picture, includes: displaying a dividing line on the flexible screen, the dividing line corresponding to the fold line of the flexible screen; and, upon determining that the flexible screen is transformed from the expanded state to the non-expanded state, cropping and splitting the first picture into the second picture and the third picture along the dividing line.
  • In this solution, a dividing line can be displayed on the flexible screen, and the user can preview the exact cropping and splitting position through the dividing line, which facilitates subsequent adjustment of that position and has better applicability.
  • The method further includes: changing the display position of the first picture on the flexible screen according to a translation operation and/or a zoom operation on the first picture input by the user.
  • In this solution, the display position of the first picture on the flexible screen can be adjusted according to the translation operation and/or zoom operation input by the user, so that the dividing line corresponds to the user's ideal cropping and splitting position, improving the user experience.
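How a pan operation moves the cut relative to the picture can be sketched as follows. The sign convention (panning the picture rightward moves the dividing line leftward within the picture, since the line is fixed at the fold) and the clamping to the picture bounds are assumptions for illustration:

```python
def split_column(picture_width, fold_column, pan_offset):
    """Column at which the fixed dividing line cuts the picture after the
    user pans the picture by `pan_offset` pixels (positive = rightward).

    The dividing line stays put on screen (it tracks the fold), so the cut
    moves the opposite way within the picture; the result is clamped so
    the cut never leaves the picture.
    """
    return max(0, min(picture_width, fold_column - pan_offset))
```

The returned column could then be fed to a crop-and-split routine so the user's pan directly selects where the picture is divided.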
  • In a second aspect, the present application provides a picture processing method applied to an electronic device with a flexible screen, where the physical form of the flexible screen includes an expanded state and a non-expanded state, and where the flexible screen, when in the non-expanded state, is divided into a first screen and a second screen. The method includes: when the flexible screen is in the non-expanded state, displaying a first picture on the first screen and a second picture on the second screen; receiving a first operation, input by the user, of generating a third picture according to the first picture and the second picture; and, upon determining that the flexible screen is transformed from the non-expanded state to the expanded state, generating the third picture according to the first picture and the second picture.
  • In this solution, after receiving the user's first operation, the electronic device determines that its flexible screen has changed from the non-expanded state to the expanded state and generates the third picture according to the first picture and the second picture. With this method, after the user selects the first picture and the second picture and inputs the operation of generating the third picture from them, the user only needs to unfold the flexible screen of the electronic device for the third picture to be generated; no manual merging, background replacement, or style transfer is required, the processing is simpler, and the user experience is better.
  • When the flexible screen is in the non-expanded state, the included angle between the first screen and the second screen is less than or equal to a first preset angle. Determining that the flexible screen is transformed from the non-expanded state to the expanded state, and generating the third picture according to the first picture and the second picture, includes: determining that the included angle is greater than the first preset angle, and generating the third picture from the first picture and the second picture.
  • In this solution, the physical form of the flexible screen can be determined by detecting the change of the included angle between the first screen and the second screen, and that detected change can then trigger the generation of the third picture from the first picture and the second picture. The processing is relatively simple and has good applicability.
  • The first operation may be a background replacement operation of replacing the background of the first picture with the background of the second picture to generate the third picture.
  • In this solution, the change of the physical form of the flexible screen can trigger the background replacement of the first picture; the background replacement process is simpler and has better applicability.
  • Determining that the included angle is greater than the first preset angle, and generating the third picture according to the first picture and the second picture, includes: determining that the included angle is a second preset angle, and performing image segmentation on the first picture to generate a foreground picture of the first picture, where the second preset angle is smaller than the first preset angle and greater than an initial angle, the initial angle being the value of the included angle when the first operation is received; displaying the foreground picture on the second picture; determining a target display position of the foreground picture on the second picture according to the user's operation of unfolding the flexible screen and a preset correspondence between the included angle and the display position; and, upon determining that the included angle is greater than the first preset angle, replacing the background of the foreground picture with the background of the second picture according to the target display position, to generate the third picture.
  • In this solution, the display position of the foreground picture of the first picture on the second picture can be changed as the included angle between the first screen and the second screen changes, which is convenient for the user to preview. The foreground picture can thus be placed onto the background of the second picture at the ideal position, synthesizing a third picture that meets the user's needs, with a better user experience.
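One way to realize the angle-driven placement and background replacement, as a hedged sketch: in practice the segmentation mask would come from an image-segmentation model, but here it is given, and the preset angle-to-position correspondence is assumed to be linear.

```python
def target_position(angle, initial_angle, first_preset, max_col):
    """Map the hinge angle, as the screen unfolds from `initial_angle`
    toward `first_preset`, to a horizontal column in [0, max_col]
    (assumed linear correspondence)."""
    t = (angle - initial_angle) / (first_preset - initial_angle)
    return round(max(0.0, min(1.0, t)) * max_col)

def replace_background(foreground, mask, background, col):
    """Paste the masked foreground pixels of the first picture onto a copy
    of the second picture, starting at column `col`."""
    out = [row[:] for row in background]           # keep the background intact
    for r, row in enumerate(foreground):
        for c, px in enumerate(row):
            if mask[r][c]:                         # only foreground pixels move
                out[r][col + c] = px
    return out
```

As the user unfolds the screen, repeated calls to `target_position` would slide the previewed foreground across the second picture until the angle passes the first preset angle and the composite is finalized.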
  • The first operation may be a style transfer operation of updating the picture style of the first picture to the picture style of the second picture to generate the third picture.
  • In this solution, the style transfer of the first picture can be triggered by the change of the physical form of the flexible screen; the style transfer process is simpler and has better applicability.
  • Determining that the included angle is greater than the first preset angle, and generating the third picture according to the first picture and the second picture, includes: determining a target stylization degree for the style transfer of the first picture according to the user's operation of unfolding the flexible screen and a preset correspondence between the included angle and the stylization degree; and, upon determining that the included angle is greater than the first preset angle, updating the picture style of the first picture to the picture style of the second picture according to the target stylization degree, and generating the third picture.
  • In this solution, the degree of style transfer can be changed as the included angle between the first screen and the second screen changes, so as to obtain a style-transferred picture that meets the user's requirements, with a better user experience.
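A sketch of mapping the unfold angle to a stylization degree and applying a partial style transfer by blending. The linear angle mapping and the per-pixel blend are assumptions; a real style transfer would use a neural model to produce the fully stylized picture, with the degree then interpolating between it and the original.

```python
def stylization_degree(angle, initial_angle, first_preset):
    """Target stylization degree in [0, 1] from the hinge angle
    (assumed linear in the unfold progress)."""
    t = (angle - initial_angle) / (first_preset - initial_angle)
    return max(0.0, min(1.0, t))

def partial_style_transfer(original, stylized, degree):
    """Blend each pixel of the original with the fully stylized version
    according to the stylization degree."""
    return [[o * (1.0 - degree) + s * degree for o, s in zip(orow, srow)]
            for orow, srow in zip(original, stylized)]
```

Unfolding halfway would thus yield a picture halfway between the original and the fully stylized result, which matches the preview behaviour the claims describe.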
  • The first operation may be a merging and stitching operation of merging and stitching the first picture and the second picture into the third picture.
  • In this solution, the change of the physical form of the flexible screen can trigger the merging and stitching of the first picture and the second picture; the merging and stitching process is simpler and has better applicability.
  • The method further includes: when the flexible screen is in the non-expanded state, displaying a fourth picture on the first screen and a fifth picture on the second screen; adjusting the fourth picture to a preset size according to the user's adjustment operation on the fourth picture, to generate the first picture; and adjusting the fifth picture to the preset size according to the user's adjustment operation on the fifth picture, to generate the second picture.
  • In this solution, a first picture and a second picture of the same size can be obtained through size adjustment, so that the third picture obtained by merging and stitching them better meets the user's needs, with a better user experience.
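The size adjustment followed by side-by-side stitching can be sketched as below. Nearest-neighbour scaling stands in for whatever scaler the device actually uses; the patent does not specify one.

```python
def resize_nearest(picture, height, width):
    """Nearest-neighbour resize of a list-of-rows picture to a preset size."""
    src_h, src_w = len(picture), len(picture[0])
    return [[picture[r * src_h // height][c * src_w // width]
             for c in range(width)]
            for r in range(height)]

def merge_splice(first, second):
    """Stitch two same-height pictures side by side into a third picture."""
    return [a + b for a, b in zip(first, second)]

# Two differently sized pictures are brought to a common 2x2 size, then spliced.
first = resize_nearest([[1]], 2, 2)
second = resize_nearest([[2, 3], [4, 5]], 2, 2)
third = merge_splice(first, second)
```

Because both inputs are first forced to the same preset size, the rows line up and the spliced result has no ragged edge.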
  • In a third aspect, the present application provides an electronic device having the function of implementing the above methods.
  • the functions can be implemented by hardware, or can be implemented by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • The structure of the above electronic device includes a processor and a memory, and the processor is configured to enable the electronic device to perform the corresponding functions in the above methods.
  • the memory is used for storing necessary program instructions and data of the electronic device.
  • The present application further provides a computer storage medium storing a computer program or instructions which, when run on a computer, cause the computer to execute some or all of the steps of the picture processing method of the first aspect or the second aspect.
  • The present application further provides a computer program product which, when running on a computer, causes the computer to execute some or all of the steps of the picture processing method of the first aspect or the second aspect.
  • the present application provides a picture processing method and an electronic device.
  • After the electronic device receives the user's operation of cropping and splitting the selected first picture, it determines that the flexible screen of the electronic device has changed from the expanded state to the non-expanded state, and crops and splits the first picture into a second picture and a third picture. With this method, after the user selects the first picture and inputs the cropping and splitting operation, the user only needs to fold the flexible screen of the electronic device for the first picture to be cropped and split into two pictures; no manual cropping and splitting is required, the processing is simpler, and the user experience is better.
  • FIG. 1 is a schematic structural diagram of a smart phone with a flexible screen provided by the application
  • FIG. 2A is a structural block diagram of an embodiment of an electronic device provided by the present application;
  • FIG. 2B is a schematic structural diagram of an embodiment of a flexible screen of an electronic device provided by the present application;
  • FIG. 2C is a schematic structural diagram of another implementation manner of a flexible screen of an electronic device provided by the present application;
  • FIG. 2D is a schematic structural diagram of another implementation manner of a flexible screen of an electronic device provided by the present application;
  • FIG. 2E is a block diagram of a software structure of an electronic device provided by the present application;
  • FIGS. 3A-3E are schematic diagrams of an exemplary GUI interface of a human-computer interaction scenario provided by the present application;
  • FIGS. 4A-4C are schematic diagrams of another exemplary GUI interface of the human-computer interaction scenario provided by the present application;
  • FIGS. 5A-5C are schematic diagrams of another exemplary GUI interface of the human-computer interaction scenario provided by the present application;
  • FIGS. 6A-6B are schematic diagrams of another exemplary GUI interface of the human-computer interaction scenario provided by the present application;
  • FIG. 7 is a schematic flowchart of an embodiment of a picture processing method provided by the present application.
  • FIG. 8 is a schematic flowchart of another embodiment of the image processing method provided by the present application.
  • FIG. 9 is a structural block diagram of another implementation manner of an electronic device provided by the present application.
  • Embodiments of the present application provide a picture processing method and an electronic device.
  • the image processing method provided in this application can be applied to electronic devices with flexible screens.
  • The electronic devices referred to in this application may be, for example, cell phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPCs), handheld computers, netbooks, personal digital assistants (PDAs), wearable devices, virtual reality devices, monitoring devices, vehicle-mounted devices, and other electronic devices with flexible screens.
  • The electronic equipment according to the present application can run … or other operating systems, which are not limited in this application.
  • The "user interface (UI)" involved in the picture processing method provided in this application is a medium interface for interaction and information exchange between an application program or an operating system and a user; it realizes the conversion between an internal form of information and a form acceptable to the user.
  • The user interface of an application is source code written in a specific computer language, such as Java or Extensible Markup Language (XML). The interface source code is parsed and rendered on the terminal device and finally presented as content that the user can recognize.
  • Controls, also known as widgets, are the basic elements of the user interface. Typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, pictures, and text.
  • the attributes and content of controls in the interface are defined by tags or nodes.
  • For example, XML specifies the controls contained in the interface through nodes such as <TextView>, <ImgView>, and <VideoView>.
  • a node corresponds to a control or property in the interface, and the node is rendered as user-visible content after parsing and rendering.
  • The interfaces of some applications, such as hybrid applications, often contain web pages. A web page, also known as a page, can be understood as a special control embedded in an application interface.
  • A web page is source code written in a specific computer language, such as Hypertext Markup Language (HTML), Cascading Style Sheets (CSS), or JavaScript (JS).
  • the source code of the web page can be loaded and displayed as user-identifiable content by a browser or a web page display component similar in function to a browser.
  • The specific content contained in a web page is also defined by tags or nodes in its source code. For example, HTML defines the elements and attributes of a web page through <p>, <img>, <video>, and <canvas>.
  • A graphical user interface (GUI) refers to a user interface, related to computer operations, that is displayed graphically. It can be an icon, a window, a control, or another interface element displayed on the flexible screen of the electronic device, where controls can include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the electronic device 100 provided by the embodiments of the present application is exemplarily introduced below.
  • the image processing method provided in this application may be implemented in the electronic device 100 .
  • FIG. 2A shows a structural block diagram of the electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a flexible screen 194, a subscriber identification module (SIM) card interface 195, and so on.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structures illustrated in this application do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or separate some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to an instruction operation code and a timing signal, to complete the control of instruction fetching and execution.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instruction or data again, it can call it directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the electronic device 100.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, and the like.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate with each other through the I2C bus interface, so as to realize the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication, to sample, quantize, and encode an analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the flexible screen 194, the camera 193 and other peripheral devices. MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 communicates with the camera 193 through a CSI interface, so as to realize the photographing function of the electronic device 100 .
  • the processor 110 communicates with the flexible screen 194 through the DSI interface to implement the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the flexible screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in this application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the flexible screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the flexible screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, and the like.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the wireless communication solution provided by the mobile communication module 150 may enable the electronic device 100 to communicate with a device in a network (for example, a cloud server), and the WLAN wireless communication solution provided by the wireless communication module 160 may also enable the electronic device 100 to communicate with a device in a network (for example, a cloud server). In this way, the electronic device 100 can perform data transmission with the cloud server.
  • the electronic device 100 implements a display function through a GPU, a flexible screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the flexible screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the flexible screen 194 may include a display and a touch device. The display is used for outputting display content to the user, and the touch device is used for receiving touch events input by the user on the flexible screen 194 .
  • the flexible screen 194 can be displayed as a complete display area in the unfolded state.
  • the user can fold the screen along one or more fold lines in the flexible screen 194 .
  • the position of the folding line may be preset, or may be arbitrarily selected by the user in the flexible screen 194 .
  • the flexible screen 194 can be divided into two display areas along the AB fold line, namely the display area 1 and the display area 2.
  • the folded display area 1 and the display area 2 may be displayed as two independent display areas.
  • the display area 1 may be referred to as the first screen of the electronic device 100
  • the display area 2 may be referred to as the second screen of the electronic device 100 .
  • the display areas of the first screen and the second screen may be the same or different.
  • the first screen and the second screen may be disposed facing each other, or the first screen and the second screen may face away from each other.
  • the first screen 1941 and the second screen 1942 face away from each other.
  • both the first screen 1941 and the second screen 1942 are exposed to the external environment, and the user can view and operate either screen.
  • the first screen 1941 is used for display, and the second screen 1942 can also be used for display.
  • the screen of the folded part (also referred to as the third screen 1943 ) can also be used as an independent display area.
  • the flexible screen 194 is divided into three independent display areas: a first screen 1941 , a second screen 1942 and a third screen 1943 .
  • the size of the flexible screen 194 is 2200*2480 (in pixels).
  • the width of the folding line AB on the flexible screen 194 is 166 pixels.
  • the area on the right side of the flexible screen 194 with a size of 1144*2480 is divided into a first screen 1941
  • the area on the left side of the flexible screen 194 with a size of 890*2480 is divided into a second screen 1942 .
  • the folding line AB with a size of 166*2480 can be used as the third screen 1943 .
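The division above (890 + 166 + 1144 = 2200) can be sketched as a small helper. Only the pixel dimensions come from the text; the function name and return format are illustrative:

```python
def partition_flexible_screen(total_width=2200, height=2480,
                              fold_width=166, second_width=890):
    """Split the flexible screen width into the second screen (left),
    the folding line AB (third screen), and the first screen (right)."""
    first_width = total_width - second_width - fold_width  # 2200 - 890 - 166 = 1144
    return {
        "second_screen": (second_width, height),  # left area, 890 x 2480
        "third_screen": (fold_width, height),     # folding line AB, 166 x 2480
        "first_screen": (first_width, height),    # right area, 1144 x 2480
    }

areas = partition_flexible_screen()
```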
  • the folding lines involved in this application are only for the convenience of understanding, and the folding lines may also be called folding belts, dividing lines or dividing belts, etc., which are not limited in this application.
  • a certain angle is formed between the divided first screen 1941 and the second screen 1942.
  • the angle between the first screen 1941 and the second screen 1942 may be represented by θ.
  • the electronic device 100 may calculate the included angle θ between the first screen 1941 and the second screen 1942 according to data detected by one or more sensors.
  • a gyro sensor and an acceleration sensor may be provided on the first screen 1941 and the second screen 1942 of the electronic device 100, respectively.
  • the gyroscope sensor on the first screen 1941 can detect the rotational angular velocity when the first screen 1941 rotates
  • the acceleration sensor on the first screen 1941 can detect the acceleration generated when the first screen 1941 moves.
  • the electronic device 100 may determine the magnitude and direction of the gravity G according to the data detected by the gyro sensor and the acceleration sensor on the first screen 1941 . Similarly, the electronic device 100 can also determine the magnitude and direction of the gravity G according to the data detected by the gyro sensor and the acceleration sensor on the second screen 1942 . Alternatively, if the magnitude and direction of the gravity G can be detected by using the sensor A, the sensor A can be set on the first screen 1941 and the second screen 1942 respectively. For example, a gyro sensor and an acceleration sensor may be integrated into one sensor and disposed on the first screen 1941 and the second screen 1942, respectively.
  • corresponding coordinate systems may be set on the first screen 1941 and the second screen 1942 respectively.
  • a Cartesian coordinate system O1 may be set in the second screen 1942, in which the x-axis of the Cartesian coordinate system O1 is parallel to the shorter side of the second screen 1942, the y-axis is parallel to the longer side of the second screen 1942, and the z-axis is perpendicular to the plane formed by the x-axis and the y-axis and points out of the second screen 1942.
  • a Cartesian coordinate system O2 can be set in the first screen 1941.
  • the x-axis is parallel to the shorter side of the first screen 1941, and the y-axis is parallel to the longer side of the first screen 1941.
  • the z-axis is perpendicular to the plane formed by the x-axis and the y-axis and points into the first screen 1941 .
  • the gyro sensor and the acceleration sensor in the second screen 1942 can detect the magnitude and direction of the gravity G in the Cartesian coordinate system O1, and the gyro sensor and the acceleration sensor in the first screen 1941 can detect the magnitude and direction of the gravity G in the Cartesian coordinate system O2.
  • since the y-axes of the Cartesian coordinate system O1 and the Cartesian coordinate system O2 point in the same direction, the component G1 of the gravity G in the x-z plane of the Cartesian coordinate system O1 and the component G2 of the gravity G in the x-z plane of the Cartesian coordinate system O2 are equal in magnitude but different in direction.
  • the angle between the component G1 and the component G2 is the angle between the Cartesian coordinate system O1 and the Cartesian coordinate system O2, and is also the angle θ between the second screen 1942 and the first screen 1941.
  • the electronic device 100 can obtain the included angle θ between the second screen 1942 and the first screen 1941 by calculating the angle between the component G1 of the gravity G in the Cartesian coordinate system O1 and the component G2 of the gravity G in the Cartesian coordinate system O2. It can be understood that the angle θ between the first screen 1941 and the second screen 1942 lies within the closed interval [0°, 180°].
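With G1 and G2 given as (x, z) components as described above, the included angle θ follows from a standard dot-product formula. This is a minimal sketch under that assumed vector representation, not the patented implementation:

```python
import math

def screen_angle(g1, g2):
    """Angle in degrees between the gravity components G1 and G2,
    each given as an (x, z) pair in its screen's coordinate system.
    The result lies in the closed interval [0, 180]."""
    dot = g1[0] * g2[0] + g1[1] * g2[1]
    norm = math.hypot(*g1) * math.hypot(*g2)
    cos_theta = max(-1.0, min(1.0, dot / norm))  # clamp for float safety
    return math.degrees(math.acos(cos_theta))

# e.g. G1 pointing along -z and G2 along +x means the screens form 90 degrees
theta = screen_angle((0.0, -9.8), (9.8, 0.0))
```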
  • when the angle θ between the first screen 1941 and the second screen 1942 is greater than a first threshold (for example, 170°), the electronic device 100 may determine that the flexible screen 194 is in the unfolded state.
  • when the angle θ between the first screen 1941 and the second screen 1942 is smaller than a second threshold (for example, 20°), the electronic device 100 may determine that the flexible screen 194 is in the folded state.
  • when the included angle θ between the first screen 1941 and the second screen 1942 is within a preset interval (for example, between 40° and 60°), the electronic device 100 may determine that the flexible screen 194 is in the stand state.
  • the physical form of the flexible screen 194 can be divided into an unfolded state and a non-unfolded state, and any physical form of the flexible screen 194 other than the above-mentioned unfolded state can be referred to as the non-unfolded state.
  • both the above-mentioned stand state and the folded state belong to the non-unfolded state.
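Using the example thresholds above (first threshold 170°, second threshold 20°, stand interval 40°–60°), the state decision can be sketched as follows; the threshold values are only the examples from the text, and any angle that is not "unfolded" belongs to the non-unfolded state:

```python
def flexible_screen_state(theta, first_threshold=170.0,
                          second_threshold=20.0, stand_interval=(40.0, 60.0)):
    """Map the included angle theta (0-180 degrees) to a physical form."""
    if theta > first_threshold:
        return "unfolded"
    if theta < second_threshold:
        return "folded"          # a non-unfolded state
    if stand_interval[0] <= theta <= stand_interval[1]:
        return "stand"           # also a non-unfolded state
    return "non-unfolded"        # any other angle
```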
  • the electronic device 100 may determine the specific physical form of the flexible screen 194 according to the angle between the first screen 1941 and the second screen 1942. Further, a change in the physical form of the flexible screen 194 can trigger the electronic device 100 to process the picture on the flexible screen 194, or to process the picture on the first screen 1941 and the picture on the second screen 1942. For example, when the electronic device 100 detects that the flexible screen 194 changes from the unfolded state to the non-unfolded state, the electronic device 100 may be triggered to crop and segment the picture on the flexible screen 194.
  • when the electronic device 100 detects that the flexible screen 194 changes from the non-unfolded state to the unfolded state, the electronic device 100 may be triggered to merge and splice the picture on the first screen 1941 and the picture on the second screen 1942.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a flexible screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193 .
  • when the shutter is opened, light is transmitted through the lens to the camera photosensitive element, which converts the light signal into an electrical signal and transmits it to the ISP for processing; the ISP converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • an optical image of the object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 100 may include at least two camera entities.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy and so on.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4 and so on.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save data such as music, photos, videos, etc. in an external memory card.
  • Internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 may execute the above-mentioned instructions stored in the internal memory 121, thereby causing the electronic device 100 to execute the image processing methods provided in some embodiments of this application (such as image cropping or image merging), as well as various functional applications, data processing, and the like.
  • the internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, and may also store one or more application programs (such as Gallery or Contacts) and the like.
  • the storage data area may store data created during the use of the electronic device 100 .
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals.
  • the user can make a sound with the mouth close to the microphone 170C, thereby inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the flexible screen 194 .
  • the capacitive pressure sensor may include at least two parallel plates made of a conductive material.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 100 .
  • the angular velocities of the electronic device 100 about three axes (namely, the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the angle at which the electronic device 100 shakes, calculates the distance that the lens module needs to compensate for according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse motion to achieve anti-shake.
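For a small shake angle, the compensation distance the lens module needs can be approximated from the lens focal length (d ≈ f·tan θ). This geometric sketch is an assumption for illustration and is not stated in the text:

```python
import math

def ois_compensation_mm(shake_angle_deg, focal_length_mm=4.0):
    """Approximate lateral displacement (mm) the lens must move, in the
    direction opposite to the shake, to keep the image stationary."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))

# the lens module moves by -ois_compensation_mm(...) to counteract the shake
```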
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist in positioning and navigation.
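A common way to convert the measured air pressure to altitude is the international barometric formula; the sea-level reference pressure and constants below are the standard-atmosphere values, assumed here for illustration:

```python
def altitude_from_pressure(pressure_hpa, sea_level_hpa=1013.25):
    """Estimate altitude in meters from air pressure (hPa) using the
    international barometric formula for the standard atmosphere."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))
```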
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 can detect the opening and closing of the flip holster using the magnetic sensor 180D.
  • the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D, and further set features such as automatic unlocking upon flip opening according to the detected opening or closing state of the holster or of the flip cover.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to identify the posture of the electronic device, and can be applied in scenarios such as switching between landscape and portrait modes and pedometers.
  • the distance sensor 180F is used to measure distance. The electronic device 100 can measure the distance through infrared or laser. In some shooting scenarios, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the flexible screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect the temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection.
  • the electronic device 100 when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
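The temperature processing strategy above is threshold-driven. A minimal sketch follows; the specific temperature thresholds, action names, and the `thermal_policy` function are illustrative assumptions, not values taken from this application.

```python
def thermal_policy(temp_c, hot_threshold=45.0, cold_threshold=0.0):
    """Illustrative temperature processing strategy: throttle the nearby
    processor when hot; heat the battery and boost its output voltage
    when cold. The thresholds are assumed values."""
    actions = []
    if temp_c > hot_threshold:
        # Reduce performance of the processor near the temperature sensor
        # to lower power consumption and implement thermal protection.
        actions.append("reduce_processor_performance")
    if temp_c < cold_threshold:
        # Avoid abnormal shutdown of the device at low temperature.
        actions.append("heat_battery")
        actions.append("boost_battery_output_voltage")
    return actions
```

For example, `thermal_policy(50.0)` throttles the processor, while `thermal_policy(-5.0)` triggers the two low-temperature mitigations.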
  • the touch sensor 180K may also be referred to as a touch panel or a touch sensitive surface.
  • the touch sensor 180K may be disposed on the flexible screen 194, and the touch sensor 180K and the flexible screen 194 form a touch screen, also called a "touch screen".
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through the flexible screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the location where the flexible screen 194 is located.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part.
  • the bone conduction sensor 180M can also contact the human pulse and receive blood pressure pulse signals.
  • the bone conduction sensor 180M can also be disposed in an earphone to form a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vocal vibration bone block obtained by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M, and realize the function of heart rate detection.
  • the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the flexible screen 194 .
  • Touch operations in different application scenarios (for example, time reminders, receiving information, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state and changes in battery level, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • the SIM card can be contacted and separated from the electronic device 100 by inserting into the SIM card interface 195 or pulling out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the plurality of cards may be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, ie: an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the Android system with a layered architecture is used as an example to illustrate the software structure of the electronic device 100.
  • FIG. 2E is a block diagram of the software structure of the electronic device 100 of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages. As shown in Figure 2E, the application package can include applications such as camera, gallery, calling, navigation, Bluetooth, music, video, short message, etc.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Figure 2E, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, and a notification manager, among others.
  • a window manager is used to manage window programs.
  • the window manager can obtain the size of the flexible screen, parameters of each display area on the display interface, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views. For example, a display interface that includes a camera icon.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • For example, the telephony manager is used for the management of call status (including connecting, hanging up, and the like).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine.
  • Android runtime is responsible for scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (media library), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to realize 3D graphics drawing, image rendering, compositing and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer can contain display drivers, camera drivers, audio drivers, sensor drivers, etc.
  • the system library and kernel layer below the application framework layer can also be referred to as the underlying system.
  • the underlying system includes a status monitoring service for identifying changes in the physical shape of the flexible screen.
  • the status monitoring service can be set in the system library and/or the kernel layer.
  • the state monitoring service may call a sensor service (sensor service) to start sensors such as a gyroscope sensor and an acceleration sensor for detection.
  • the status monitoring service can calculate the angle between the current first screen and the second screen according to the detection data reported by each sensor. In this way, through the angle between the first screen and the second screen, the state monitoring service can determine whether the flexible screen is in an unfolded state, a folded state, or a physical state such as a stand state.
  • the gyroscope sensor and the acceleration sensor at the hardware layer can report the detected data to the sensor driver, and the sensor driver reports the data detected by the gyroscope sensor and the acceleration sensor to the status monitoring service through the sensor service.
  • the state monitoring service can determine the angle between the first screen and the second screen according to the data detected by the gyro sensor and the acceleration sensor, and then determine the physical form of the flexible screen.
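As a rough sketch of the angle computation described above, the included angle between the first screen and the second screen can be estimated from the gravity vectors reported by an accelerometer on each half of the device. The function names below are illustrative, and the 170-degree threshold simply mirrors the first preset angle used as an example later in this application.

```python
import math

def fold_angle_deg(g1, g2):
    """Estimate the included angle between the two screen halves from the
    gravity vectors (x, y, z) reported by an accelerometer on each half.
    When both halves report the same gravity vector the screens are
    coplanar, i.e. the included angle is 180 degrees."""
    dot = sum(a * b for a, b in zip(g1, g2))
    n1 = math.sqrt(sum(a * a for a in g1))
    n2 = math.sqrt(sum(b * b for b in g2))
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    # Angle between the two screen normals; the screens' included
    # angle is its supplement.
    return 180.0 - math.degrees(math.acos(cos_t))

def screen_state(angle_deg, expanded_threshold=170.0):
    """Classify the physical form of the flexible screen from the angle."""
    return "expanded" if angle_deg > expanded_threshold else "non-expanded"
```

For instance, two halves lying flat report identical gravity vectors, giving 180 degrees ("expanded"); halves at right angles give 90 degrees ("non-expanded").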
  • the following describes the image processing method provided by the embodiment of the present application in detail from the perspective of human-computer interaction with reference to the schematic diagram of the GUI of the electronic device 100 .
  • the GUI of the electronic device 100 may be displayed on the flexible screen 194 of the electronic device 100 illustrated in FIG. 2A .
  • the smart phone 100 has a flexible screen 194, and the physical form of the flexible screen 194 includes an expanded state and a non-expanded state.
  • In the non-expanded state, the flexible screen 194 is divided into two independent display screens, namely a first screen and a second screen.
  • a gallery application (application, APP) is installed in the smart phone 100. It should be understood that other APPs may also be installed in the smart phone 100, which is not limited in this application.
  • the first application scenario is the application scenario of cropping and segmentation.
  • Cropping and segmentation refers to cropping and dividing a picture into two pictures.
  • For details, refer to the contents of the following embodiments.
  • the flexible screen 194 of the smartphone 100 is in an unfolded state, and the flexible screen 194 can be displayed as a complete display area.
  • the main interface GUI of the flexible screen 194 may display icons such as gallery 31 , camera 32 , calculator 33 , clock 34 , contacts 35 , information 36 , settings 37 , and browser 38 . It should be understood that the main interface GUI of the flexible screen 194 may also display other interface elements, such as a navigation bar and date and time, which are not limited in this application.
  • the gallery 31 is an icon corresponding to the gallery APP.
  • the user may click the gallery 31 on the main interface GUI of the flexible screen 194 , and the smartphone 100 may receive a click operation of the user clicking the gallery 31 .
  • the smartphone 100 may run the gallery APP, and update the GUI of the flexible screen 194 to the photo interface shown in (b) of FIG. 3B .
  • the photo interface displays icons such as photo 311 , album 312 , time 313 , and discovery 314 .
  • the color of the photo 311 is different from the colors of other icons, indicating that the current interface is the interface corresponding to the photo 311 .
  • the photo interface also displays multiple pictures. It should be understood that other interface elements may also be displayed on the photo interface, which is not limited in this application.
  • the user can click to select a to-be-processed picture to be cut and segmented from the plurality of pictures, and the to-be-processed picture can be marked as the first picture.
  • the user can also browse pictures by swiping up or down on the photo interface, and then select the first picture by clicking.
  • the smartphone 100 may receive a click operation of the user clicking the first picture.
  • the smartphone 100 may update the GUI of the flexible screen 194 to the picture interface shown in (b) of FIG. 3C .
  • the user can also click icons such as album 312, time 313 or discovery 314 on the photo interface shown in (b) of FIG. 3B to enter the album interface, time interface or discovery interface accordingly, and then select the first picture on the corresponding interface.
  • the picture interface displays a first picture 3111 , and may also display icons such as crop 3112 , merge 3113 , background replacement 3114 , style transfer 3115 , and edit 3116 .
  • the user may click on crop 3112 to trigger cropping and segmentation processing on the selected picture (eg, the first picture 3111 ).
  • the user can trigger the merge processing of the selected pictures by clicking merge 3113 .
  • the user can trigger the background replacement process on the selected picture by clicking on the background replacement 3114 .
  • the user can click style transfer 3115 to trigger style transfer processing for the selected picture.
  • the user can trigger editing of the selected picture by clicking Edit 3116 .
  • the picture interface may also include other icons or other interface elements that can be used to trigger the processing of the selected picture, which is not limited in this application.
  • the user can click crop 3112 on the picture interface, and then fold the flexible screen 194, that is, fold the flexible screen 194 from the expanded state to the non-expanded state.
  • the smart phone 100 can calculate the included angle α between the first screen and the second screen through the built-in acceleration sensor and gyroscope sensor, and determine the physical form of the flexible screen 194 according to the calculated value of the included angle α.
  • When the included angle α is greater than a first preset angle (for example, 170 degrees), the smartphone 100 may determine that the flexible screen 194 is in the expanded state.
  • When the included angle α is less than or equal to the first preset angle, the smartphone 100 may determine that the flexible screen 194 is in a non-expanded state.
  • the smartphone 100 may receive a click operation of the user clicking the crop 3112 on the picture interface.
  • In response to the click operation, when the smartphone 100 determines that the user folds the flexible screen 194, that is, determines that the physical form of the flexible screen 194 changes from the expanded state to the non-expanded state, the smartphone 100 can crop and divide the first picture 3111 into two pictures according to the default cropping position (for example, the position corresponding to the folding line of the flexible screen 194), which are recorded as the second picture and the third picture.
  • The second picture 31111 can be displayed on the first screen 1941, and the third picture 31112 can be displayed on the second screen 1942. Alternatively, the second picture can be displayed on the second screen 1942 and the third picture on the first screen 1941, which is not limited in this application.
  • After the smartphone 100 cuts and divides the first picture 3111 into the second picture and the third picture, the second picture and the third picture can also be stored.
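Conceptually, cropping and dividing a picture at the fold line is a column split of the pixel grid. A minimal sketch, treating a picture as a list of pixel rows with an assumed fold column, is:

```python
def crop_and_split(picture, fold_col):
    """Split a picture (a list of pixel rows) into two pictures at the
    column corresponding to the folding line of the flexible screen."""
    second = [row[:fold_col] for row in picture]
    third = [row[fold_col:] for row in picture]
    return second, third

# A 2x4 toy "picture"; the default cropping position is the middle column.
pic = [[1, 2, 3, 4],
       [5, 6, 7, 8]]
second, third = crop_and_split(pic, fold_col=2)
# second -> [[1, 2], [5, 6]]; third -> [[3, 4], [7, 8]]
```

In a real device the fold column would come from the mapping between the folding line of the flexible screen 194 and the displayed picture, which this sketch leaves abstract.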
  • In some other embodiments, in response to the user's click operation on crop 3112 on the picture interface, when the smartphone 100 detects that the included angle α between the first screen 1941 and the second screen 1942 reaches a second preset angle (for example, 150 degrees), the smartphone 100 crops and divides the first picture 3111 into a second picture and a third picture.
  • the second preset angle may be less than or equal to the first preset angle.
  • When the smartphone 100 detects that the included angle α between the first screen 1941 and the second screen 1942 reaches a third preset angle, the smartphone 100 can store the second picture and the third picture.
  • the third preset angle is smaller than the second preset angle.
  • Alternatively, when the smartphone 100 detects that the included angle α between the first screen 1941 and the second screen 1942 reaches a fourth preset angle (for example, 100 degrees), the second picture is stored first; when the smartphone 100 detects that the included angle α reaches a fifth preset angle (for example, 60 degrees), the third picture is then stored.
  • the fourth preset angle is smaller than the second preset angle.
  • the fifth preset angle is smaller than the fourth preset angle.
  • the third picture may be stored first, and then the second picture may be stored, which is not limited in this application.
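The staged behaviour above (crop at one angle, then save each half at successively smaller angles) can be modelled as a small threshold state machine. The class name is illustrative; the 150-, 100- and 60-degree defaults follow the example angles given in the text.

```python
class FoldCropController:
    """Fire crop/save events as the included angle between the screens
    decreases past preset thresholds during a fold gesture."""

    def __init__(self, crop_at=150, save_first_at=100, save_second_at=60):
        self.crop_at = crop_at
        self.save_first_at = save_first_at
        self.save_second_at = save_second_at
        self.events = []  # events fired so far, each at most once

    def on_angle(self, angle):
        if angle <= self.crop_at and "crop" not in self.events:
            self.events.append("crop")
        if angle <= self.save_first_at and "save_second_picture" not in self.events:
            self.events.append("save_second_picture")
        if angle <= self.save_second_at and "save_third_picture" not in self.events:
            self.events.append("save_third_picture")

# Simulate angle readings reported while the user folds the screen.
ctl = FoldCropController()
for angle in (170, 150, 120, 100, 80, 60):
    ctl.on_angle(angle)
# events fire in order: crop, then save the second picture, then the third
```

Guarding each event with a membership check keeps the actions idempotent even though the sensor reports many intermediate angles.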
  • the default cropping position may not be the ideal cropping position desired by the user.
  • the user can move the ideal cropping position of the first picture 3111 to a position corresponding to the folding line of the flexible screen 194 by panning and/or zooming.
  • The smartphone 100 can receive the click operation of the user clicking crop 3112.
  • the smartphone 100 may update the GUI of the flexible screen 194 to the display interface shown in (b) of FIG. 3E .
  • the display interface displays a first picture 3111, and also displays a dividing line 3117, and the position of the dividing line 3117 may correspond to the position of the folding line of the flexible screen 194.
  • the user can perform translation and/or zoom processing on the first picture 3111 on the display interface, so that the dividing line 3117 is located at the position where the first picture 3111 is to be cropped and divided, such as the position shown in (c) in FIG. 3E.
  • the smartphone 100 may cut and divide the first picture 3111 into the second picture and the third picture along the dividing line 3117 according to any implementation manner of the above embodiments.
  • the second picture and the third picture may also be stored according to any implementation manner in the foregoing embodiments, which will not be described in detail here.
  • a dividing line 3117 is displayed on the flexible screen 194, and the first picture 3111 is cropped and divided along the dividing line 3117, which is convenient for the user to preview, thereby determining an ideal cropping position, and the user experience is better.
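Panning and zooming so that the dividing line falls at the desired position amounts to a coordinate mapping between screen space and picture space. A hypothetical sketch (the drawing model `screen_x = pan_x + picture_x * scale` is an assumption for illustration):

```python
def dividing_line_to_picture_column(fold_x, pan_x, scale):
    """Map the dividing line's screen x-coordinate back into picture
    coordinates, given the user's horizontal pan offset and zoom scale.
    Assumes the picture is drawn at screen_x = pan_x + picture_x * scale."""
    return (fold_x - pan_x) / scale

# Dividing line at screen x=600; picture panned 100 px right, zoomed 2x:
col = dividing_line_to_picture_column(fold_x=600, pan_x=100, scale=2.0)
# col -> 250.0: the first picture is cropped and divided at column 250
```

With no pan and no zoom the dividing line maps straight through, so the crop column equals the fold line's screen coordinate.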
  • the specific values of the first preset angle, the second preset angle, the third preset angle, the fourth preset angle and the fifth preset angle can all be based on the needs of actual application scenarios. Set to other values, which are not limited in this application.
  • With the image processing method provided by the above-mentioned embodiment, after the user selects the first picture and clicks the crop icon, the user only needs to fold the flexible screen of the smartphone to cut and segment the first picture, without manually cropping and segmenting the picture; the processing process is simpler, and the user experience is better.
  • the second application scenario is the application scenario of merging and splicing.
  • Merging and splicing refers to merging and splicing two independent pictures into one picture.
  • the flexible screen of the smartphone 100 is in a non-expanded state, and the flexible screen is divided into two independent display screens, namely a first screen 1941 and a second screen 1942 .
  • the included angle α between the first screen 1941 and the second screen 1942 is less than or equal to a first preset angle (eg, 170 degrees).
  • Both the first screen 1941 and the second screen 1942 can display the main interface GUI, and both the main interface GUI of the first screen 1941 and the main interface GUI of the second screen 1942 can display icons corresponding to the gallery APP.
  • the main interface GUI of the first screen 1941 and the main interface GUI of the second screen 1942 may be the same or different, which is not limited in this application.
  • the user can refer to the method of selecting a picture to be processed in the embodiment shown in Application Scenario 1, select a picture to be processed on the display interface of the first screen 1941, and denote it as the first picture, such as the first picture 410 shown in FIG. 4A.
  • the user can also refer to the method of selecting a picture to be processed in the embodiment shown in Application Scenario 1, select a picture to be processed on the display interface of the second screen 1942, and denote it as the second picture, such as the second picture 420 shown in FIG. 4A.
  • both the display interface of the first screen 1941 and the display interface of the second screen 1942 may include other interface elements, which are not limited in this application.
  • the user can click merge 412 on the display interface of the first screen 1941 shown in FIG. 4A, and then expand the flexible screen, that is, expand the flexible screen of the smartphone 100 from the non-expanded state to the expanded state.
  • the smart phone 100 may receive a click operation of the user clicking merge 412 on the display interface of the first screen 1941 shown in FIG. 4A .
  • In response to the click operation, when the smartphone 100 determines that the user expands the flexible screen, that is, determines that the physical form of the flexible screen changes from the non-expanded state to the expanded state, the smartphone 100 can combine the first picture 410 and the second picture 420 into one picture, which can be recorded as the third picture.
  • the third picture is, for example, the picture 430 shown in (b) of FIG. 4B .
  • the third picture 430 may be displayed in full screen on the flexible screen 194, or may be displayed in a partial display area of the flexible screen 194, which is not limited in this application.
  • the user can also click merge 422 on the display interface of the second screen 1942 shown in FIG. 4A, and then expand the flexible screen, that is, expand the flexible screen from the non-expanded state to the expanded state.
  • When the smart phone 100 receives the click operation of the user clicking merge 422 on the display interface of the second screen 1942 shown in FIG. 4A and detects that the physical form of the flexible screen changes from the non-expanded state to the expanded state, the smartphone 100 merges the first picture 410 and the second picture 420 into a third picture.
  • the user can also click merge 412 on the display interface of the first screen 1941 shown in FIG. 4A and click merge 422 on the display interface of the second screen 1942, and then expand the flexible screen, that is, expand the flexible screen from the non-expanded state to the expanded state.
  • the smartphone 100 may combine the first picture 410 and the second picture 420 into a third picture.
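Merging and splicing two pictures of equal height is, at its simplest, row-wise concatenation of the two pixel grids. A minimal sketch:

```python
def merge_splice(first, second):
    """Merge two pictures (lists of pixel rows of equal height) side by
    side into a single third picture."""
    if len(first) != len(second):
        raise ValueError("pictures must have the same height to splice")
    return [row_a + row_b for row_a, row_b in zip(first, second)]

# Two 2x2 halves spliced into one 2x4 picture:
first = [[1, 2], [5, 6]]
second = [[3, 4], [7, 8]]
third = merge_splice(first, second)
# third -> [[1, 2, 3, 4], [5, 6, 7, 8]]
```

The equal-height check motivates the size-adjustment step the application describes next: both source pictures must be brought to a common size before splicing.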
  • the user can also adjust the size of the picture to be processed to an ideal size to obtain a first picture of the ideal size.
  • the user can adjust the picture to be processed to an ideal size to obtain a second picture with an ideal size.
  • For example, after the user selects the picture to be processed on the display interface of the first screen 1941 shown in FIG. 4A, the picture to be processed can be recorded as the fourth picture. Then, as shown in (a) of FIG. 4C, the user may click edit 415 on the display interface where the fourth picture 440 is located, and the smartphone 100 may receive the click operation of the user clicking edit 415 on the display interface where the fourth picture 440 is located. In response to the click operation, the smartphone 100 may update the GUI of the first screen 1941 to the editing interface shown in (b) of FIG. 4C.
  • the editing interface may include interface elements such as an edit box 4151 , scale adjustment icons 4152 - 4157 , a rotation control 4158 , and a save icon 4159 .
  • the fourth picture 440 is located within the edit box 4151 . It should be noted that the editing interface may also include other interface elements, which are not limited in this application.
  • the editing box 4151 can be used to edit the fourth picture 440, so that the first picture required for subsequent merging and splicing processing is selected from the fourth picture 440.
  • the scale adjustment icons 4152-4157 can be used to adjust the size of the editing box 4151, and then the first picture required for the subsequent merging and splicing process can be selected from the fourth picture 440 by adjusting the size of the editing box 4151.
  • the user can adjust the size of the edit box 4151 to the default size of the system by clicking the scale adjustment icon 4152 .
  • the user can also click the scale adjustment icon 4153 to reduce the size of the editing box 4151 in the vertical direction based on the default size of the system.
  • the user can also click the scale adjustment icon 4154 to reduce the size of the editing box 4151 in the horizontal direction based on the default size of the system.
  • the user can also click the scale adjustment icon 4155 to adjust the ratio of the horizontal size and the vertical size of the edit box 4151 to 1:1 based on the default size of the system.
  • the user can also click the scale adjustment icon 4156 to adjust the ratio of the horizontal size and the vertical size of the edit box 4151 to 16:9 based on the default size of the system.
  • the user can also click the scale adjustment icon 4157 to adjust the ratio of the horizontal size and the vertical size of the edit box 4151 to 9:16 based on the default size of the system.
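The fixed-ratio adjustments above (1:1, 16:9, 9:16) can be sketched as fitting the largest box of the requested aspect ratio inside the picture. The function name and the centring behaviour are assumptions for illustration; the application does not specify how the box is placed.

```python
def edit_box_for_ratio(pic_w, pic_h, ratio_w, ratio_h):
    """Compute the largest centred edit box with aspect ratio
    ratio_w:ratio_h that fits inside a pic_w x pic_h picture.
    Returns (x, y, width, height)."""
    box_w = float(pic_w)
    box_h = box_w * ratio_h / ratio_w
    if box_h > pic_h:
        # Too tall at full width: constrain by height instead.
        box_h = float(pic_h)
        box_w = box_h * ratio_w / ratio_h
    x = (pic_w - box_w) / 2
    y = (pic_h - box_h) / 2
    return x, y, box_w, box_h

# 1:1 box inside a 1920x1080 picture:
x, y, w, h = edit_box_for_ratio(1920, 1080, 1, 1)
# -> x=420.0, y=0.0, w=1080.0, h=1080.0
```

A 16:9 request on the same picture fills it exactly, while 9:16 yields a narrow centred column, matching the behaviour one would expect from icons 4155-4157.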
  • the user can also move the editing frame 4151 to select the first image required for subsequent merging and splicing processing from the fourth image 440 .
  • the rotation control 4158 can be used to rotate and translate the fourth picture 440, so that the relative position between the fourth picture 440 and the edit box 4151 can be changed.
  • the pointer of the rotation control 4158 may be slid to the left along the ruler of the rotation control 4158 , so that the fourth picture 440 is rotated in the counterclockwise direction in the edit box 4151 .
  • the pointer of the rotation control 4158 may be slid to the right along the ruler of the rotation control 4158 , so that the fourth picture 440 is rotated in the clockwise direction in the edit box 4151 .
  • After the user adjusts the size of the edit box 4151 through the scale adjustment icons 4152-4157, and/or rotates and translates the fourth picture 440 through the rotation control 4158, the user can click the save icon 4159, and the smartphone 100 can receive the click operation of the user clicking the save icon 4159.
  • the smartphone 100 may determine the picture currently selected in the editing box 4151 as the first picture, and save the first picture. In this way, the fourth picture 440 can be adjusted to an ideal size to obtain a first picture of an ideal size.
  • the smartphone 100 can also update the GUI of the first screen 1941 to the display interface shown in (c) of FIG. 4C .
  • the display interface shown in (c) in FIG. 4C displays the first picture 410 , and also displays cropping 411 , merging 412 , background replacement 413 , style transfer 414 , and editing 415 , etc. icon. It should be understood that the display interface shown in (c) of FIG. 4C may also include other interface elements, which are not limited in this application.
  • the user can also select the picture to be processed on the display interface of the second screen 1942 shown in FIG. 4A according to the method of selecting the picture to be processed in the embodiment shown in Application Scenario 1, and record it as the fifth picture. Then, in the manner shown in the above embodiment, the fifth picture is adjusted to an ideal size to obtain a second picture of the ideal size.
  • In this way, the first picture and the second picture with the same size can be obtained.
  • In some other embodiments, both the fourth picture and the fifth picture can be adjusted to a preset size (the preset size can be set according to actual needs) to obtain the first picture and the second picture with the same size, so that the third picture obtained by merging and splicing the first picture and the second picture is more in line with the needs of the user, and the user experience is better.
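Adjusting both pictures to the same preset size before splicing can be sketched with a nearest-neighbour resize. This scaling algorithm is an assumption for illustration; the application does not specify how the pictures are resampled.

```python
def resize_nearest(picture, new_w, new_h):
    """Nearest-neighbour resize of a picture (list of pixel rows), so the
    fourth and fifth pictures can be brought to a common preset size."""
    old_h, old_w = len(picture), len(picture[0])
    return [
        # Each target pixel copies the source pixel at the scaled position.
        [picture[r * old_h // new_h][c * old_w // new_w] for c in range(new_w)]
        for r in range(new_h)
    ]

# Upscale a 2x2 picture to 4x4:
big = resize_nearest([[1, 2], [3, 4]], new_w=4, new_h=4)
# big -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Once both pictures share the same height (and, if desired, the same preset width), they satisfy the equal-height precondition for splicing.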
  • the third picture may also be stored.
  • the execution order of determining the first picture and determining the second picture is not limited. There is also no restriction on the execution order of determining the fourth picture and determining the fifth picture. There is also no restriction on the execution order of determining the first picture and the fifth picture, and the execution order of determining the second picture and the fourth picture.
  • first picture in the second application scenario and the first picture in the first application scenario may be the same or different, which is not limited in this application.
  • the second picture and the third picture in the application scenario 2 have different meanings from the second picture and the third picture in the application scenario 1.
  • after the user selects the images to be merged on two independent screens, the user does not need to perform a traditional merge operation on them; the user only needs to expand the flexible screen of the smartphone from the non-expanded state to the expanded state, and the images to be merged displayed on the two independent screens are merged into one image. The processing process is simpler, and the user experience is better.
  • the third application scenario is an application scenario for background replacement.
  • Background replacement refers to replacing the background of one picture with the background of another picture. For details, please refer to the following embodiments.
  • the flexible screen of the smartphone 100 is in a non-expanded state, and the flexible screen is divided into two independent display screens, namely a first screen 1941 and a second screen 1942 .
  • the included angle α between the first screen 1941 and the second screen 1942 is less than or equal to a first preset angle (e.g., 170 degrees).
  • Both the first screen 1941 and the second screen 1942 can display the main interface GUI, and both the main interface GUI of the first screen 1941 and the main interface GUI of the second screen 1942 can display icons corresponding to the gallery APP.
  • the main interface GUI of the first screen 1941 and the main interface GUI of the second screen 1942 may be the same or different, which is not limited in this application.
  • the user can refer to the method of selecting a picture to be processed in the embodiment shown in Application Scenario 1, select a picture to be processed on the display interface of the first screen 1941, and denote it as the first picture, such as the first picture 510 shown in FIG. 5A.
  • the user can also refer to the method of selecting a picture to be processed in the embodiment shown in Application Scenario 1, select a picture to be processed on the display interface of the second screen 1942, and denote it as the second picture, such as the second picture 520 shown in FIG. 5A.
  • icons such as cropping 511, merging 512, background replacement 513, style transfer 514, and editing 515 can also be displayed on the display interface of the first screen 1941.
  • icons such as cropping 521, merging 522, background replacement 523, style transfer 524, and editing 525 can also be displayed on the display interface of the second screen 1942.
  • both the display interface of the first screen 1941 and the display interface of the second screen 1942 may include other interface elements, which are not limited in this application.
  • the first picture 510 may be used as the picture to be replaced with the background, and the second picture 520 may be used as the background picture.
  • the first picture 510 may also be used as the background picture, and the second picture 520 may be used as the picture to be replaced with the background.
  • the first picture 510 is taken as the picture to be replaced with the background, and the second picture 520 is taken as the background picture for illustration.
  • the first picture 510 includes a foreground image one and a background image one.
  • the second picture 520 includes a background image two.
  • the user may click the background replacement 513 on the display interface of the first screen 1941 shown in FIG. 5A , and then expand the flexible screen.
  • the smartphone 100 may receive a click operation of the user clicking the background replacement 513 .
  • when the smartphone 100 determines that the user has expanded the flexible screen, that is, that the physical form of the flexible screen has changed from the non-expanded state to the expanded state, the smartphone 100 can call an image segmentation algorithm from the application framework layer to perform image segmentation on the first picture 510 and obtain the foreground image of the first picture 510, that is, foreground image one. Then, the smartphone 100 can synthesize the third picture through background replacement according to foreground image one and the second picture 520.
  • the foreground image of the third picture is foreground image one
  • the background image of the third picture is all or part of the background image two in the second picture 520 .
  • in this way, the background shown in the first picture 510 can be replaced with the background shown in the second picture 520, thereby realizing the background replacement of the first picture 510.
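  • The background-replacement synthesis described above can be sketched as a mask-based composite. On a real device the segmentation mask would come from the image segmentation algorithm in the application framework layer; here the mask, the function names, and the list-of-lists pixel representation are assumptions for illustration.

```python
def replace_background(fg_img, fg_mask, bg_img):
    """Composite the foreground pixels of one picture (selected by a
    segmentation mask) over another picture used as the new background."""
    h, w = len(bg_img), len(bg_img[0])
    out = [row[:] for row in bg_img]            # start from background two
    for r in range(min(h, len(fg_img))):
        for c in range(min(w, len(fg_img[0]))):
            if fg_mask[r][c]:                   # foreground image one
                out[r][c] = fg_img[r][c]
    return out

first  = [[5, 5], [5, 5]]   # picture whose background is to be replaced
mask   = [[1, 0], [0, 1]]   # 1 marks a foreground pixel
second = [[9, 9], [9, 9]]   # background picture
third = replace_background(first, mask, second)
```

  The resulting picture keeps foreground image one wherever the mask is set and shows all or part of background image two everywhere else, matching the composition of the third picture described in the text.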
  • the third picture may be the third picture 530 shown in (b) of FIG. 5B .
  • the foreground image of the third picture 530 is the first foreground image
  • the background image of the third picture 530 is the second background image.
  • the flexible screen can be gradually expanded.
  • the smartphone 100 may receive a click operation of the user clicking the background replacement 513 .
  • when the smartphone 100 detects that the angle α between the first screen 1941 and the second screen 1942 reaches a second preset angle, the smartphone 100 performs image segmentation on the first picture 510 to obtain the foreground image of the first picture 510, which includes foreground image one; and, as shown in (a) of FIG. 5C, the smartphone 100 can display the foreground picture 5101 in a floating form at a preset position on the second picture 520 (e.g., the upper left corner of the second picture 520).
  • the second preset angle is smaller than the first preset angle and larger than the initial angle.
  • the initial angle is the value of the included angle α between the first screen 1941 and the second screen 1942 when the click operation of the user clicking the background replacement 513 is received.
  • the second preset angle in application scenario 3 and the second preset angle in application scenario 1 may be the same or different, which is not limited in this application.
  • during the process of the user continuing to expand the flexible screen, the smartphone 100 may, according to the value of the included angle α and a preset correspondence between the included angle α and the display position, move the foreground picture 5101 on the second picture 520 along a preset movement trajectory (for example, the trajectory from the upper left corner to the lower right corner shown by the dotted line in (b) of FIG. 5C) as the included angle α changes, thereby changing the display position of the foreground picture 5101 on the second picture 520, until the foreground picture 5101 is moved to a position the user considers ideal, for example, the target display position shown in (c) of FIG. 5C.
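  • One simple form the preset correspondence between the angle α and the display position could take is linear interpolation along a straight trajectory. This is only a sketch: the function name, the trajectory endpoints, and the angle bounds are assumptions drawn from the examples in this embodiment.

```python
def foreground_position(alpha, alpha_init, alpha_full, start_xy, end_xy):
    """Map the current fold angle alpha to a display position along a
    preset straight movement trajectory: at alpha_init the floating
    foreground picture sits at start_xy; at alpha_full it reaches end_xy."""
    t = (alpha - alpha_init) / (alpha_full - alpha_init)
    t = max(0.0, min(1.0, t))   # clamp to the ends of the trajectory
    return (start_xy[0] + t * (end_xy[0] - start_xy[0]),
            start_xy[1] + t * (end_xy[1] - start_xy[1]))

# Upper-left corner (0, 0) to lower-right corner (100, 100), opening
# from an initial 90 degrees toward a 170-degree first preset angle.
pos = foreground_position(130, 90, 170, (0, 0), (100, 100))
```

  Stopping the unfolding gesture freezes α, and therefore freezes the foreground picture at the current point of the trajectory, which is how the user selects the target display position.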
  • at this point, the user can stop expanding the flexible screen, and after a preset time interval (for example, 3 seconds), unfold the flexible screen fully to the expanded state.
  • when the smartphone 100 detects that the flexible screen has changed to the expanded state, for example, when the smartphone 100 detects that the angle α between the first screen 1941 and the second screen 1942 is greater than the first preset angle, the smartphone 100 can determine that the flexible screen has transformed into the expanded state, and can then synthesize the foreground picture 5101 and the second picture 520 into one picture, that is, the third picture, through background replacement according to the target display position.
  • using this method, the user can perform portrait segmentation, that is, segment a human image from the original picture where it is located and then place it on a new background.
  • the user can also segment a face image from the original picture where it is located, and then use the segmented face image to replace the face in another person image, so as to realize face-changing operations and the like.
  • the smartphone 100 may also save the third picture.
  • the execution order of determining the first picture and determining the second picture is not limited.
  • first picture in application scenario 3 may be the same as or different from the first picture in application scenario 1 and the first picture in application scenario 2, which is not limited in this application.
  • second picture in application scenario 3 may be the same as or different from the second picture in application scenario 2, which is not limited in this application.
  • the meaning of the second picture in Application Scenario 3 is different from that of the second picture in Application Scenario 1.
  • the third picture in application scenario 3 has different meanings from the third picture in application scenario 1 and the third picture in application scenario 2.
  • with the image processing method provided by the above embodiment, after the user selects the image whose background is to be replaced and the background image on two independent screens, there is no need to manually perform image segmentation and synthesis operations; the user only needs to open the flexible screen of the smartphone from the non-expanded state to the expanded state, and the background of the image to be replaced is replaced from the original background with the new background. The processing process is simpler, and the user experience is better.
  • the fourth application scenario is the application scenario of style transfer.
  • the style transfer refers to updating the picture style in one picture to the picture style in another picture. For details, please refer to the following embodiments.
  • the flexible screen of the smartphone 100 is in a non-expanded state, and the flexible screen is divided into two independent display screens, namely a first screen 1941 and a second screen 1942 .
  • the included angle α between the first screen 1941 and the second screen 1942 is less than or equal to a first preset angle (e.g., 170 degrees).
  • Both the first screen 1941 and the second screen 1942 can display the main interface GUI, and both the main interface GUI of the first screen 1941 and the main interface GUI of the second screen 1942 can display icons corresponding to the gallery APP.
  • the main interface GUI of the first screen 1941 and the main interface GUI of the second screen 1942 may be the same or different. This application does not limit this.
  • the user can refer to the method of selecting a picture to be processed in the embodiment shown in Application Scenario 1, select a picture to be processed on the display interface of the first screen 1941, and denote it as the first picture, such as the first picture 610 shown in FIG. 6A.
  • the user can also refer to the method of selecting a picture to be processed in the embodiment shown in Application Scenario 1, select a picture to be processed on the display interface of the second screen 1942, and denote it as the second picture, such as the second picture 620 shown in FIG. 6A.
  • icons such as cropping 611, merging 612, background replacement 613, style transfer 614, and editing 615 can also be displayed on the display interface of the first screen 1941.
  • icons such as cropping 621, merging 622, background replacement 623, style transfer 624, and editing 625 can also be displayed on the display interface of the second screen 1942.
  • both the display interface of the first screen 1941 and the display interface of the second screen 1942 may include other interface elements, which are not limited in this application.
  • the first picture 610 may be used as a picture to be subjected to style transfer, and the second picture 620 may be used as a style reference picture.
  • the first picture 610 may also be used as a style reference picture, and the second picture 620 may be used as a picture to be subjected to style transfer.
  • the first picture 610 is used as the picture to be subjected to style transfer, and the second picture 620 is used as the style reference picture for illustration.
  • the first picture 610 includes a picture content one and a picture style one.
  • the second picture 620 includes picture style two.
  • the user may click the style transition 614 on the display interface of the first screen 1941 shown in FIG. 6A , and then expand the flexible screen.
  • the smartphone 100 may receive the click operation of the user click style transfer 614 .
  • when the smartphone 100 determines that the user has expanded the flexible screen, that is, that the physical form of the flexible screen has changed from the non-expanded state to the expanded state, the smartphone 100 can call a style transfer algorithm from the application framework layer to perform style transfer on the first picture 610: the picture content of the first picture 610, that is, picture content one, is extracted from the first picture 610, and the third picture is then synthesized according to picture content one and the second picture 620.
  • the third picture may be the third picture 630 shown in (b) of FIG. 6B .
  • the picture content of the third picture 630 is picture content one
  • the picture style of the third picture is picture style two.
  • the picture style of the first picture 610 can be updated to the picture style of the second picture 620 , thereby realizing the style transfer of the first picture 610 .
  • a correspondence between the stylization degree and the included angle α may also be preset.
  • when the smartphone 100 detects that the user is expanding the flexible screen, it can determine the value of the angle α between the first screen 1941 and the second screen 1942 and, according to the preset correspondence between the stylization degree and the angle α, select different stylization degrees as the angle α changes, until it detects that the user has stopped expanding the flexible screen, at which point the stylization degree corresponding to the current angle α is determined as the target stylization degree. After the user stops unfolding the flexible screen, the flexible screen can be fully unfolded after a preset time interval.
  • when the smartphone 100 detects that the flexible screen is in the expanded state, that is, when it detects that the angle α between the first screen 1941 and the second screen 1942 is greater than the first preset angle, the picture style of the first picture is updated to the picture style of the second picture through style transfer according to the target stylization degree, to obtain the third picture.
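  • The preset correspondence between the fold angle α and the stylization degree could, for example, scale linearly from the initial angle to the fully expanded angle. The sketch below illustrates that idea with a per-pixel blend standing in for the actual style transfer algorithm; the function names, the linear mapping, and the blend formula are all assumptions, not the patent's implementation.

```python
def stylization_degree(alpha, alpha_init, alpha_full):
    """Preset correspondence between the fold angle alpha and the
    stylization degree: 0.0 at the initial angle, 1.0 when the screen
    reaches the fully expanded (first preset) angle."""
    t = (alpha - alpha_init) / (alpha_full - alpha_init)
    return max(0.0, min(1.0, t))

def stylize(content_px, styled_px, degree):
    """Blend an unstylized pixel value toward its fully stylized value
    according to the target stylization degree."""
    return content_px + degree * (styled_px - content_px)

deg = stylization_degree(150, 100, 170)   # opened from 100 to 150 degrees
px = stylize(10.0, 80.0, deg)             # partially stylized pixel
```

  Freezing the unfolding gesture freezes α, so the degree corresponding to the current angle becomes the target stylization degree applied when the screen finally reaches the expanded state.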
  • the smartphone 100 may also save the third picture.
  • first picture in application scenario 4 may be the same as or different from the first picture in application scenario 1, the first picture in application scenario 2, and the first picture in application scenario 3, which is not limited in this application.
  • second picture in application scenario 4 may be the same as or different from the second picture in application scenario 2 and the second picture in application scenario 3, which is not limited in this application.
  • the meaning of the second picture in the application scenario 4 is different from that of the second picture in the application scenario 1.
  • the meaning of the third picture in application scenario 4 is different from the third picture in application scenario 1, the third picture in application scenario 2, and the third picture in application scenario 3.
  • with the image processing method provided by the above embodiment, after the user selects the image to be style-transferred and the style reference image on two independent screens, there is no need to manually perform style transfer and synthesis operations; the user only needs to change the flexible screen of the smartphone from the non-expanded state to the expanded state, and the picture style of the picture to be style-transferred is updated from the original picture style to the new picture style. The processing process is simpler, and the user experience is better.
  • the image processing method provided by the present application will be exemplarily described below from the perspective of the electronic device 100 .
  • the electronic device 100 has a flexible screen, and the physical form of the flexible screen includes an expanded state and a non-expanded state.
  • the flexible screen is divided into two independent display screens, namely a first screen and a second screen.
  • the image processing method provided in this application can be applied to the electronic device 100 .
  • FIG. 7 is a schematic flowchart of an implementation manner of a picture processing method provided by the present application. The method may include the following steps:
  • Step S11: when the flexible screen is in the expanded state, the flexible screen displays a first picture.
  • the first picture is a picture displayed on the flexible screen according to a selection operation input by the user when the flexible screen is in an unfolded state.
  • the first picture may be the first picture 3111 in the embodiment shown in the foregoing application scenario 1.
  • the selection operation input by the user may be, for example, the operation in the embodiment of Application Scenario 1 above in which, when the flexible screen of the electronic device is in the expanded state, the user first clicks the gallery 31 and then clicks the picture to be processed on the photo interface.
  • Step S12: receive a user's operation of cropping and dividing the first picture.
  • the user's operation of cropping and dividing the first picture may be a click operation in which the user clicks the cropping 3112 in the embodiment shown in the above application scenario 1.
  • the user's operation of cropping and dividing the first picture is used to instruct the electronic device to perform cropping and dividing processing on the first picture.
  • Step S13: determine that the flexible screen has transformed from the expanded state to the non-expanded state, and crop and divide the first picture into a second picture and a third picture.
  • the flexible screen of the electronic device when the flexible screen of the electronic device is in a non-expanded state, the flexible screen is divided into two independent display screens, namely a first screen and a second screen.
  • the electronic device can calculate the angle α between the first screen and the second screen through its built-in acceleration sensor and gyroscope sensor, and determine the physical form of the flexible screen according to the calculated value of the angle α.
  • when the flexible screen is in the expanded state, the included angle α between the first screen and the second screen is greater than the first preset angle.
  • when the flexible screen is in the non-expanded state, the angle α between the first screen and the second screen is less than or equal to the first preset angle.
  • the first preset angle may be set according to the requirements of the application scenario; for example, the first preset angle may be 170 degrees.
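  • The classification of the flexible screen's physical form from the sensor-derived angle can be sketched as follows. The 170-degree threshold is the example value given above; the function and constant names are illustrative assumptions.

```python
FIRST_PRESET_ANGLE = 170.0   # degrees; example value from the text

def physical_form(alpha):
    """Classify the flexible screen's physical form from the angle alpha
    between the first screen and the second screen, which the device
    computes from its accelerometer and gyroscope readings."""
    return "expanded" if alpha > FIRST_PRESET_ANGLE else "non-expanded"

state = physical_form(165.0)   # still folded past the threshold
```

  Note the strict comparison: an angle exactly equal to the first preset angle still counts as non-expanded, matching the "less than or equal to" condition above.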
  • when the electronic device determines, from the angle α, that the physical form of the flexible screen has changed from the expanded state to the non-expanded state, the electronic device can cut the first picture into two pictures along the default cropping position (for example, the position corresponding to the fold line of the flexible screen); the two pictures can be recorded as the second picture and the third picture, respectively.
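  • The crop-and-divide step along the fold line can be sketched as splitting the picture at a column. The default split at the middle column and the row-major list representation are assumptions for illustration.

```python
def crop_divide(img, fold_col=None):
    """Cut a row-major picture into two pictures along the default
    cropping position (the column corresponding to the fold line of
    the flexible screen; by default the middle of the picture)."""
    w = len(img[0])
    fold = w // 2 if fold_col is None else fold_col
    second = [row[:fold] for row in img]   # left half -> second picture
    third = [row[fold:] for row in img]    # right half -> third picture
    return second, third

pic = [[1, 2, 3, 4], [5, 6, 7, 8]]
left, right = crop_divide(pic)
```

  Passing an explicit `fold_col` corresponds to the dividing-line embodiment later in this section, where the user pans or zooms the picture so the split lands at the desired cropping position.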
  • the second picture and the third picture may be stored.
  • in other embodiments, when the electronic device determines that the angle α between the first screen and the second screen is the second preset angle, the electronic device may crop and divide the first picture into a second picture and a third picture.
  • the second preset angle is less than or equal to the first preset angle.
  • the second preset angle can be set according to the needs of the application scenario. For example, the second preset angle may be set to 150 degrees.
  • when the electronic device determines that the angle α between the first screen and the second screen is the third preset angle, the electronic device may store the second picture and the third picture.
  • the third preset angle is smaller than the second preset angle.
  • the third preset angle can be set according to the needs of the application scenario. For example, the third preset angle may be set to 130 degrees. or,
  • when the electronic device determines that the angle α between the first screen and the second screen is the fourth preset angle, the electronic device can store the second picture; when the electronic device determines that the angle α between the first screen and the second screen is the fifth preset angle, the electronic device can store the third picture.
  • the fourth preset angle is smaller than the second preset angle
  • the fifth preset angle is smaller than the fourth preset angle.
  • both the fourth preset angle and the fifth preset angle can be set according to the needs of the application scenario. For example, the fourth preset angle may be set to 100 degrees, and the fifth preset angle may be set to 60 degrees.
  • in this way, the user can trigger the electronic device to crop and divide the first picture and to store the second picture and the third picture by folding the flexible screen to different angles, which reserves sufficient preview time for the user, and the user experience is better.
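  • The staged angle thresholds above (crop at the second preset angle, then store at smaller angles) amount to a descending-threshold trigger table. The sketch below uses the example values from the text (150, 100, and 60 degrees); the table layout and function name are assumptions.

```python
# Folding past each threshold triggers one stage of the sequence.
STAGES = [                      # (threshold in degrees, action)
    (150.0, "crop_divide"),     # second preset angle
    (100.0, "store_second"),    # fourth preset angle
    (60.0,  "store_third"),     # fifth preset angle
]

def actions_when_folded_to(alpha):
    """Return the actions triggered once the fold angle alpha has
    dropped to (or below) each stage's threshold."""
    return [name for threshold, name in STAGES if alpha <= threshold]

triggered = actions_when_folded_to(120.0)   # folded past 150 only
```

  Because each threshold is strictly smaller than the previous one, folding the screen further through the angles fires the actions in order, giving the user a preview window between stages.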
  • before cutting and dividing the first picture, the electronic device may further display a dividing line on the flexible screen, and the position of the dividing line may correspond to the position of the fold line of the flexible screen. Then, the electronic device can change the display position of the first picture on the flexible screen according to the user's translation operation and/or zoom operation on the first picture, so that the dividing line corresponds to the ideal cropping position desired by the user. Afterwards, when the electronic device determines that the angle α between the first screen and the second screen is less than or equal to the first preset angle, the electronic device can cut and divide the first picture into the second picture and the third picture along the dividing line. This implementation makes it convenient for the user to preview and select an ideal cropping position, so that the first picture is cropped and divided into two pictures from the ideal cropping position desired by the user, and the user experience is better.
  • in the image processing method provided by this embodiment, the electronic device determines that the flexible screen of the electronic device has changed from the expanded state to the non-expanded state, and crops and divides the first picture into the second picture and the third picture. Therefore, with this image processing method, after the user selects the first picture and clicks the crop icon, the user only needs to fold the flexible screen of the electronic device, and the first picture is cropped and divided into two pictures without manual cropping and division processing. The processing process is simpler, and the user experience is better.
  • FIG. 8 is a schematic flowchart of another implementation manner of the image processing method provided by the present application.
  • the method may include the following steps:
  • Step S21: when the flexible screen is in the non-expanded state, the first screen displays the first picture, and the second screen displays the second picture.
  • the first picture is a picture displayed on the first screen of the electronic device according to a first selection operation input by the user when the flexible screen of the electronic device is in a non-expanded state.
  • the second picture is a picture displayed on the second screen of the electronic device according to the second selection operation input by the user.
  • the first selection operation may be, for example, an operation for the user to select a picture to be processed on the first screen of the electronic device when the flexible screen of the electronic device is in a non-expanded state.
  • the second selection operation may be, for example, an operation in which the user selects a picture to be processed on the second screen of the electronic device when the flexible screen of the electronic device is in the non-expanded state; details are not repeated here.
  • first picture in the embodiment shown in FIG. 8 and the first picture in the embodiment shown in FIG. 7 may be the same or different.
  • the meanings of the second picture and the third picture in the embodiment shown in FIG. 8 are different from those of the second picture and the third picture in the embodiment shown in FIG. 7 .
  • Step S22: receive a first operation, input by the user, of generating a third picture according to the first picture and the second picture.
  • the first operation can be one of the following: a background replacement operation that replaces the background of the first picture with the background of the second picture to generate the third picture; or a style transfer operation that updates the picture style of the first picture to the picture style of the second picture to generate the third picture; or a merging and splicing operation that merges and splices the first picture and the second picture into the third picture.
  • the first operation may be the click operation in which the user clicks merge 412 and/or merge 422 in the embodiment shown in Application Scenario 2 above; or, the first operation may be the click operation in which the user clicks background replacement 513 in the embodiment shown in Application Scenario 3 above; or, the first operation may be the click operation in which the user clicks style transfer 614 in the embodiment shown in Application Scenario 4 above.
  • Step S23: determine that the flexible screen has transformed from the non-expanded state to the expanded state, and generate a third picture according to the first picture and the second picture.
  • the flexible screen of the electronic device when the flexible screen of the electronic device is in a non-expanded state, the flexible screen is divided into two independent display screens, namely a first screen and a second screen.
  • the electronic device can calculate the angle α between the first screen and the second screen through its built-in acceleration sensor and gyroscope sensor, and determine the physical form of the flexible screen according to the calculated value of the angle α.
  • when the flexible screen is in the expanded state, the included angle α between the first screen and the second screen is greater than the first preset angle.
  • when the flexible screen is in the non-expanded state, the angle α between the first screen and the second screen is less than or equal to the first preset angle.
  • the first preset angle may be set according to the requirements of the application scenario; for example, the first preset angle may be 170 degrees.
  • the electronic device may determine that the flexible screen is transformed from a non-expanded state to an expanded state, and the electronic device may generate a third picture according to the first picture and the second picture.
  • when the first operation is a background replacement operation that replaces the background of the first picture with the background of the second picture to generate the third picture, and the electronic device determines that the flexible screen has transformed from the non-expanded state to the expanded state, the electronic device can call an image segmentation algorithm from the application framework layer to perform image segmentation on the first picture to obtain a foreground image of the first picture, and then generate the third picture through background replacement according to the foreground image of the first picture and the second picture.
  • the foreground image of the third picture is the foreground image of the first picture
  • the background image of the third picture is all or part of the background image of the second picture. Therefore, the background of the first picture is replaced with the background of the second picture, and the background replacement of the first picture is realized.
  • in other embodiments, when the first operation is a background replacement operation and the electronic device detects that the angle α between the first screen and the second screen reaches a second preset angle, the electronic device can call an image segmentation algorithm from the application framework layer to perform image segmentation on the first picture and generate a foreground picture of the first picture. The second preset angle is smaller than the first preset angle and larger than the initial angle, the initial angle being the value of the angle α when the first operation is received. The electronic device can display the foreground picture of the first picture on the second picture; after that, as the user continues to expand the flexible screen, the electronic device can determine the value of the included angle α according to the user's operation of expanding the flexible screen and, combined with the preset correspondence between the included angle α and the display position, change the display position of the foreground picture on the second picture as the included angle α changes, until it detects that the flexible screen has transformed into the expanded state; the foreground picture and the second picture are then synthesized through background replacement to generate the third picture.
  • the second preset angle in the embodiment shown in FIG. 8 and the second preset angle in the embodiment shown in FIG. 7 may be the same or different, and this application does not limit this.
  • when the first operation is a style transfer operation and the electronic device determines that the flexible screen has transformed from the non-expanded state to the expanded state, the electronic device can call a style transfer algorithm from the application framework layer to perform style transfer on the first picture, obtain the picture content of the first picture, and then fuse the picture content of the first picture with the second picture through style transfer to generate the third picture.
  • the picture content of the third picture is the picture content of the first picture
  • the picture style of the third picture is the picture style of the second picture.
  • in other embodiments, when the first operation is a style transfer operation that updates the picture style of the first picture to the picture style of the second picture to generate the third picture, the electronic device can determine the value of the included angle α according to the user's operation of unfolding the flexible screen and, combined with the preset correspondence between the included angle α and the stylization degree, select different stylization degrees as the included angle α changes, until it detects that the user has stopped expanding the flexible screen, that is, until it detects that the value of the included angle α has not changed for a time period greater than or equal to a preset duration; the picture style of the first picture is then updated to the picture style of the second picture according to the stylization degree corresponding to the current angle α, to generate the third picture.
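  • The "user stopped expanding" condition — the angle value not changing for at least a preset duration — can be sketched as a check over trailing angle samples. The sampling period, hold duration, and jitter tolerance below are illustrative assumptions, not values from the patent.

```python
def user_stopped_expanding(angle_samples, sample_period_s, hold_s=3.0,
                           tolerance=1.0):
    """Detect that the user has stopped expanding the flexible screen:
    the fold angle has stayed within `tolerance` degrees over the last
    `hold_s` seconds of samples."""
    need = int(hold_s / sample_period_s)
    if len(angle_samples) < need:
        return False
    tail = angle_samples[-need:]
    return max(tail) - min(tail) <= tolerance

# Sampled once per second: still opening, then held steady for 3 s.
samples = [100, 110, 120, 130, 130, 130]
stopped = user_stopped_expanding(samples, sample_period_s=1.0)
```

  A small nonzero tolerance is used rather than exact equality because the accelerometer- and gyroscope-derived angle will jitter slightly even while the user holds the screen still.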
  • when the first operation is a merging and splicing operation that merges and splices the first picture and the second picture into the third picture, after the electronic device receives the first operation and determines that the angle α between the first screen and the second screen is greater than the first preset angle, the electronic device can merge and splice the first picture and the second picture into the third picture.
  • the first operation is a merging and splicing operation of merging and splicing the first picture and the second picture into a third picture.
  • a fourth picture can also be displayed on the first screen; then, according to the user's adjustment operation on the fourth picture, the fourth picture can be adjusted to a preset size to generate the first picture.
  • similarly, a fifth picture may be displayed on the second screen; then, the fifth picture may be adjusted to the preset size according to the user's adjustment operation on the fifth picture to generate the second picture. In this way, a first picture and a second picture of the same size can be obtained, so that the third picture merged and spliced from the first picture and the second picture better meets the needs of the user, and the user experience is better.
  • the user's adjustment operation on the fourth picture may be the editing operation on the fourth picture in the embodiment shown in the foregoing application scenario 2, and the user's adjustment operation on the fifth picture may be the editing operation on the fifth picture in the embodiment shown in the foregoing application scenario 2; for details, reference may be made to the embodiment shown in application scenario 2, which will not be described in detail here.
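The adjust-then-splice flow above can be sketched as follows, with pictures modelled as lists of pixel rows. The nearest-neighbour resize and the left/right splicing layout are illustrative assumptions; the embodiment only requires that both pictures reach the same preset size before merging.

```python
# Sketch: adjust the fourth and fifth pictures to a common preset size, then
# splice the results side by side into the third picture. Pictures are
# modelled as row-lists of pixels; nearest-neighbour resizing is an
# illustrative assumption.

def resize(picture, preset_w, preset_h):
    """Nearest-neighbour resize of a picture (list of rows) to a preset size."""
    src_h, src_w = len(picture), len(picture[0])
    return [
        [picture[y * src_h // preset_h][x * src_w // preset_w]
         for x in range(preset_w)]
        for y in range(preset_h)
    ]

def splice(first, second):
    """Splice two equally sized pictures left/right into one picture."""
    assert len(first) == len(second)
    return [row_a + row_b for row_a, row_b in zip(first, second)]
```

A usage sketch: `splice(resize(fourth, w, h), resize(fifth, w, h))` yields a third picture of width `2 * w`.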
  • in the picture processing method provided by this embodiment, the electronic device determines that the flexible screen has changed from the non-expanded state to the expanded state, and generates the third picture according to the first picture and the second picture. Therefore, with this picture processing method, after the user selects the first picture and the second picture and inputs the operation of generating the third picture according to the first picture and the second picture, the user only needs to unfold the flexible screen of the electronic device, and the third picture is generated from the first picture and the second picture, without manually performing merging, background replacement, or style transfer.
  • the processing process is thus simpler, and the user experience is better.
  • the above embodiments are all schematic descriptions, and do not limit the technical solutions of the embodiments of the present application.
  • the electronic device may be implemented according to one of the foregoing embodiments, or may be implemented according to a combination of the foregoing embodiments, which is not limited herein.
  • the electronic device may be divided into functional modules according to the foregoing method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated module can be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a logical function division; there may be other division manners in actual implementation. The following description takes the case where each functional module is divided corresponding to each function as an example.
  • FIG. 9 is a structural block diagram of an implementation manner of an electronic device provided by the present application.
  • the electronic device 900 may include: a flexible screen (not shown in the figure), a transceiver 901, and a processor 902; the physical form of the flexible screen includes an expanded state and a non-expanded state; when the flexible screen is in the non-expanded state, the flexible screen is divided into a first screen and a second screen.
  • the electronic device 900 may be used to perform the actions performed by the smart phone or the electronic device in the above method embodiments.
  • the electronic device 900 can be used to execute one of the following solutions:
  • the processor 902 may be configured to display the first picture on the flexible screen when the flexible screen of the electronic device is in an unfolded state;
  • the transceiver 901 may be configured to receive a user's operation of cropping and dividing the first picture.
  • the processor 902 may also be configured to determine that the flexible screen is transformed from the expanded state to the non-expanded state, and to crop and divide the first picture into a second picture and a third picture.
  • the processor 902 being configured to determine that the flexible screen is transformed from the expanded state to the non-expanded state and to crop and divide the first picture into a second picture and a third picture includes: the processor 902 being configured to determine that the included angle is less than or equal to the first preset angle, and to crop and divide the first picture into the second picture and the third picture.
  • the processor 902 is configured to determine that the included angle is less than or equal to the first preset angle, and to crop and divide the first picture into a second picture and a third picture, including: the processor 902 is used to determine that the included angle is a second preset angle, and to crop and divide the first picture into the second picture and the third picture; the second preset angle is less than or equal to the first preset angle.
  • the processor 902 is further configured to: determine that the included angle is a third preset angle, and store the second picture and the third picture; the third preset angle is smaller than the second preset angle.
  • the processor 902 is further configured to: determine that the included angle is a fourth preset angle, and store the second picture; the fourth preset angle is smaller than the second preset angle; determine the The included angle is a fifth preset angle, and the third picture is stored; the fifth preset angle is smaller than the fourth preset angle.
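The threshold logic above can be sketched as follows. The concrete angle values, and the assumption that the fold line splits the picture at its vertical midline, are illustrative; the embodiment only fixes the ordering of the preset angles.

```python
# Sketch: as the included angle alpha shrinks while the screen folds, the
# first picture is cropped along the fold line at one threshold and the
# resulting halves are stored at a smaller threshold. Threshold values are
# illustrative assumptions.

def crop_along_fold(first_picture):
    """Split a picture (list of rows) into left/right halves at the fold."""
    mid = len(first_picture[0]) // 2
    left = [row[:mid] for row in first_picture]
    right = [row[mid:] for row in first_picture]
    return left, right

def actions_for_angle(alpha_deg, second_preset=120, third_preset=60):
    """Return which processing steps have fired for an included angle alpha."""
    steps = []
    if alpha_deg <= second_preset:
        steps.append("crop first picture into second and third pictures")
    if alpha_deg <= third_preset:
        steps.append("store second and third pictures")
    return steps
```

The variant with a fourth and fifth preset angle would simply replace the single "store" step with two separate store steps at two smaller thresholds.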
  • the processor 902 being configured to determine that the flexible screen is transformed from the expanded state to the non-expanded state and to crop and divide the first picture into a second picture and a third picture includes: the processor 902 being configured to display a dividing line on the flexible screen, where the dividing line corresponds to a folding line of the flexible screen; and to determine that the flexible screen is transformed from the expanded state to the non-expanded state, and crop and divide the first picture into the second picture and the third picture along the dividing line.
  • the processor 902 is further configured to: after displaying the dividing line and before cropping and dividing the first picture, change the display position of the first picture on the flexible screen.
  • the processor 902 may also be configured to display the first picture on the first screen and display the second picture on the second screen when the flexible screen of the electronic device is in a non-expanded state.
  • the transceiver 901 may also be configured to receive a first operation of generating a third picture according to the first picture and the second picture input by the user.
  • the processor 902 may also be configured to determine that the flexible screen is transformed from the non-expanded state to the expanded state, and generate the third picture according to the first picture and the second picture.
  • the processor 902 being configured to determine that the flexible screen is transformed from the non-expanded state to the expanded state and to generate the third picture according to the first picture and the second picture includes: the processor 902 being configured to determine that the included angle is greater than the first preset angle, and to generate the third picture according to the first picture and the second picture.
  • the first operation is a background replacement operation of replacing the background of the first picture with the background of the second picture to generate the third picture.
  • the processor 902 being configured to determine that the included angle is greater than the first preset angle and to generate the third picture according to the first picture and the second picture includes: the processor 902 being configured to determine that the included angle is a second preset angle, perform image segmentation on the first picture, and generate a foreground picture of the first picture, where the second preset angle is smaller than the first preset angle and greater than an initial angle, the initial angle being the value of the included angle when the first operation is received; display the foreground picture on the second picture; determine, according to the user's operation of unfolding the flexible screen and the preset correspondence between the included angle and the display position, a target display position of the foreground picture on the second picture; and determine that the included angle is greater than the first preset angle and, according to the target display position, replace the background of the foreground picture with the background of the second picture to generate the third picture.
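The angle-driven positioning and background replacement can be sketched as follows. The linear correspondence between the included angle α and the horizontal offset, and the binary segmentation mask, are illustrative assumptions; any preset correspondence and segmentation result would serve.

```python
# Sketch: map the included angle alpha to a target display position for the
# foreground picture, then paste the masked foreground onto the second
# picture's background. The linear alpha-to-offset mapping is an
# illustrative assumption.

def target_position(alpha_deg, initial_angle, first_preset_angle, track_px):
    """Map alpha in [initial_angle, first_preset_angle] to an x offset."""
    t = (alpha_deg - initial_angle) / (first_preset_angle - initial_angle)
    t = max(0.0, min(1.0, t))  # clamp outside the unfolding range
    return round(t * track_px)

def replace_background(background, foreground, mask, x_off):
    """Paste masked foreground pixels onto the background at column x_off."""
    out = [row[:] for row in background]
    for y, row in enumerate(foreground):
        for x, pix in enumerate(row):
            if mask[y][x]:
                out[y][x + x_off] = pix
    return out
```

Here `mask` stands in for the image-segmentation result: a 1 marks a foreground pixel of the first picture, a 0 a background pixel to be discarded.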
  • the first operation is a style transfer operation of updating the picture style of the first picture to the picture style of the second picture to generate a third picture.
  • the processor 902 being configured to determine that the included angle is greater than the first preset angle and to generate the third picture according to the first picture and the second picture includes: the processor 902 being configured to determine, according to the user's operation of unfolding the flexible screen and the preset correspondence between the included angle and the stylization degree, a target stylization degree for performing style transfer on the first picture; and to determine that the included angle is greater than the first preset angle and, according to the target stylization degree, update the picture style of the first picture to the picture style of the second picture to generate the third picture.
  • the first operation is a merging and splicing operation of merging and splicing the first picture and the second picture into a third picture.
  • the processor 902 is further configured to: when the flexible screen is in the non-expanded state, display a fourth picture on the first screen and display a fifth picture on the second screen; according to the user's adjustment operation on the fourth picture, adjust the fourth picture to a preset size to generate the first picture; and according to the user's adjustment operation on the fifth picture, adjust the fifth picture to the preset size to generate the second picture.
  • the electronic device 900 may also include a memory.
  • the memory may be used to store programs/codes preinstalled in the electronic device 900 when it leaves the factory, and may also store codes used for execution by the processor 902, and the like.
  • the electronic device 900 in this embodiment of the present application may correspond to the electronic device 100 described in the foregoing embodiments.
  • the transceiver 901 is configured to perform various operations of receiving user input
  • the processor 902 is configured to perform various picture processing processes in the foregoing method embodiments. It is not repeated here.
  • the present application also provides a computer storage medium. The computer storage medium provided in any device may store a program or instruction, and when the program or instruction is executed, some or all of the steps in each embodiment of the above picture processing method may be implemented.
  • the storage medium in any device may be a magnetic disk, an optical disk, a read-only memory (ROM) or a random access memory (RAM), and the like.
  • the transceiver may be a wired transceiver.
  • the wired transceiver may be, for example, an optical interface, an electrical interface, or a combination thereof.
  • the wired transceiver may also be, for example, various types of sensors.
  • the processor can be a central processing unit (CPU), a network processor (NP), or a combination of CPU and NP.
  • the processor may further include a hardware chip.
  • the above-mentioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the memory may include volatile memory, such as random-access memory (RAM); the memory may also include non-volatile memory, such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory may also include a combination of the above types of memory.
  • the electronic device of the present application may also include a bus interface.
  • the bus interface may include any number of interconnected buses and bridges. Specifically, the bus interface links together one or more processors, represented by the processor, and various circuits of a memory, represented by the memory.
  • the bus interface may also link together various other circuits, such as peripherals, voltage regulators, and power management circuits, which are well known in the art and, therefore, will not be described further herein.
  • the bus interface provides the interface.
  • a transceiver provides a unit for communicating with various other devices over a transmission medium.
  • the processor is responsible for managing the bus architecture and general processing, and the memory stores the messages that the processor uses when performing operations.
  • a general-purpose processor may be a microprocessor, but in the alternative, the general-purpose processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented by a combination of computing devices, for example, a digital signal processor and a microprocessor, multiple microprocessors, one or more microprocessors in combination with a digital signal processor core, or any other such configuration.
  • a software unit may be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, removable disk, CD-ROM, or any other form of storage medium known in the art.
  • a storage medium may be coupled to the processor such that the processor may read information from, and store information in, the storage medium.
  • the storage medium can also be integrated into the processor.
  • the processor and storage medium may be provided in an ASIC, and the ASIC may be provided in an electronic device. Alternatively, the processor and the storage medium may also be provided in different components in the electronic device.
  • the computer program product includes one or more computer programs or instructions.
  • the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transmitted from a website, computer, server, or message center to another website, computer, server, or message center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (for example, infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium accessible by a computer, or a message storage device, such as a server or a message center, integrating one or more available media.
  • the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media (eg, solid state disks (SSDs)), and the like.
  • the electronic devices, computer storage media, and computer program products provided by the above-mentioned embodiments of the present application are all used to execute the methods provided above. Therefore, the beneficial effects that can be achieved may refer to the beneficial effects corresponding to the methods provided above. It is not repeated here.
  • it should be understood that the sequence numbers of the foregoing processes do not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The method is applied to an electronic device provided with a flexible screen, and the physical form of the flexible screen includes an expanded state and a non-expanded state. The method includes: when the flexible screen is in the expanded state, displaying a first picture on the flexible screen (S11); receiving a user's cropping and dividing operation on the first picture (S12); and determining that the flexible screen is transformed from the expanded state to the non-expanded state, and cropping and dividing the first picture into a second picture and a third picture (S13). With this picture processing method, after a user selects a first picture and inputs a cropping and dividing operation, the first picture can be cropped and divided into two pictures simply by folding the flexible screen of the electronic device, without manually performing cropping and dividing processing, so that the processing process is simpler and the user experience is better.
PCT/CN2021/137401 2020-12-31 2021-12-13 Procédé de traitement d'image et dispositif électronique WO2022143118A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011641789.0 2020-12-31
CN202011641789.0A CN114690998B (zh) 2020-12-31 图片处理方法及电子设备

Publications (1)

Publication Number Publication Date
WO2022143118A1 true WO2022143118A1 (fr) 2022-07-07

Family

ID=82135522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137401 WO2022143118A1 (fr) 2020-12-31 2021-12-13 Procédé de traitement d'image et dispositif électronique

Country Status (1)

Country Link
WO (1) WO2022143118A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116820229A (zh) * 2023-05-17 2023-09-29 荣耀终端有限公司 Xr空间的显示方法、xr设备、电子设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160085319A1 (en) * 2014-09-18 2016-03-24 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN108053364A (zh) * 2017-12-28 2018-05-18 努比亚技术有限公司 图片裁剪方法、移动终端及计算机可读存储介质
CN110221738A (zh) * 2019-05-16 2019-09-10 珠海格力电器股份有限公司 一种图片处理的方法及设备
WO2020000448A1 (fr) * 2018-06-29 2020-01-02 华为技术有限公司 Procédé et terminal d'affichage d'écran flexible
CN111124326A (zh) * 2018-10-31 2020-05-08 中兴通讯股份有限公司 图片显示方法、终端和计算机可读存储介质


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116820229A (zh) * 2023-05-17 2023-09-29 荣耀终端有限公司 Xr空间的显示方法、xr设备、电子设备及存储介质
CN116820229B (zh) * 2023-05-17 2024-06-07 荣耀终端有限公司 Xr空间的显示方法、xr设备、电子设备及存储介质

Also Published As

Publication number Publication date
CN114690998A (zh) 2022-07-01

Similar Documents

Publication Publication Date Title
JP7142783B2 (ja) 音声制御方法及び電子装置
CN112217923B (zh) 一种柔性屏幕的显示方法及终端
CN114397979B (zh) 一种应用显示方法及电子设备
WO2021103981A1 (fr) Procédé et appareil de traitement d'affichage à écran divisé, et dispositif électronique
WO2021129326A1 (fr) Procédé d'affichage d'écran et dispositif électronique
WO2021036571A1 (fr) Procédé d'édition de bureau et dispositif électronique
WO2021139768A1 (fr) Procédé d'interaction pour traitement de tâches inter-appareils, et dispositif électronique et support de stockage
CN112714901B (zh) 系统导航栏的显示控制方法、图形用户界面及电子设备
WO2020253758A1 (fr) Procédé de disposition d'interface utilisateur et dispositif électronique
WO2021082835A1 (fr) Procédé d'activation de fonction et dispositif électronique
CN111669459B (zh) 键盘显示方法、电子设备和计算机可读存储介质
JP7400095B2 (ja) 表示要素の表示方法及び電子機器
WO2021190524A1 (fr) Procédé de traitement de capture d'écran, interface utilisateur graphique et terminal
CN113961157A (zh) 显示交互系统、显示方法及设备
WO2022143180A1 (fr) Procédé d'affichage collaboratif, dispositif terminal et support de stockage lisible par ordinateur
WO2022143118A1 (fr) Procédé de traitement d'image et dispositif électronique
CN114173005B (zh) 一种应用布局控制方法、装置、终端设备及计算机可读存储介质
CN114690998B (zh) 图片处理方法及电子设备
CN114356196B (zh) 一种显示方法及电子设备
WO2023169542A1 (fr) Procédé d'affichage et dispositif électronique
WO2024037542A1 (fr) Procédé d'entrée tactile, système, dispositif électronique et support de stockage
WO2023185886A1 (fr) Procédé de photographie et dispositif électronique
WO2024067551A1 (fr) Procédé d'affichage d'interface et dispositif électronique
WO2023160455A1 (fr) Procédé de suppression d'objet et dispositif électronique
WO2023169237A1 (fr) Procédé de capture d'écran, dispositif électronique, et système

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21913859

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21913859

Country of ref document: EP

Kind code of ref document: A1