CN115034948A - Image processing method and electronic equipment - Google Patents

Image processing method and electronic equipment

Info

Publication number
CN115034948A
Authority
CN
China
Prior art keywords
image
image sensor
processing
camera application
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111657813.4A
Other languages
Chinese (zh)
Other versions
CN115034948B (en)
Inventor
赵玉霞
黄立波
杨阳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202111657813.4A
Publication of CN115034948A
Application granted
Publication of CN115034948B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G06T1/20 - Processor architectures; Processor configuration, e.g. pipelining
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/60 - Rotation of whole images or parts thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 - Reducing energy consumption in communication networks
    • Y02D30/70 - Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of this application disclose an image processing method and an electronic device, and relate to the field of image processing, so that the electronic device can flip or rotate an original image with low power consumption. The specific scheme is as follows: in response to a first operation, a camera application is launched. The camera application issues a first parameter to the image sensor, where the first parameter instructs the image sensor to perform first processing on the currently acquired image and then return it. The first processing includes at least one of flipping and rotation. The image sensor performs the first processing on the currently acquired image according to the first parameter to generate a first image. The image sensor returns the first image to the camera application, so that the camera application displays the first image.

Description

Image processing method and electronic equipment
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image processing method and an electronic device.
Background
With the continuous development of electronic circuit technology, the shooting functions of electronic devices have become increasingly rich. For example, when taking a photo, many electronic devices can perform image processing on the original image and then output or store the processed image. The original image refers to the image acquired by the image sensor of the electronic device, and the image processing includes flipping or rotation.
Generally, the image processing of the original image is performed by a graphics processing unit (GPU) module in the electronic device. For example, after the image sensor in the electronic device acquires an original image, the original image is sent to the GPU module, and the GPU module flips or rotates it.
However, in the scheme in which the GPU module processes the original image, the electronic device needs to transmit the original image acquired by the image sensor back to the camera application, and the camera application then transmits the original image to the GPU module for flipping or rotation, which causes large power consumption.
Disclosure of Invention
Embodiments of this application provide an image processing method and an electronic device, which can reduce the power consumption of the electronic device when flipping or rotating an original image.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, an image processing method is provided, which is applied to an electronic device including a camera application and an image sensor, and includes: in response to the first operation, a camera application is launched. The camera application program issues a first parameter to the image sensor, and the first parameter is used for indicating the image sensor to return after first processing is carried out on a currently acquired image; the first process includes at least one of: turning over and rotating. And the image sensor performs first processing on the currently acquired image according to the first parameter to generate a first image. The image sensor returns the first image to the camera application to cause the camera application to display the first image.
Based on this scheme, the electronic device does not need to have the camera application transmit the original image to the GPU module for image processing; instead, the image sensor directly outputs the first image on which the image processing has been performed, so that the power consumption of image processing is reduced and the image processing efficiency is improved.
In one possible design, the camera application issues a first parameter to the image sensor, including: the camera application program judges whether a first configuration is started, and the first configuration is used for indicating whether the image sensor needs to perform first processing on a currently acquired image and then returns. If the first configuration is enabled, the camera application issues the first parameter to the image sensor. If the first configuration is closed, the camera application program issues the first parameter to the image sensor in response to a second operation, wherein the second operation comprises an operation of opening the first configuration. Based on the scheme, no matter whether the first configuration is opened by default or is opened in response to the second operation, the camera application program can directly display the image subjected to mirror image turning, so that the user can preview the finally shot image, and the use experience of the user is favorably improved.
In one possible design, the camera application issues a first parameter to the image sensor, including: the camera application program judges whether the first configuration is started or not, and the first configuration is used for indicating whether the image sensor needs to perform first processing on the currently acquired image or not and then returning. If the first configuration is started, the camera application program issues the first parameter to the image sensor in response to the shooting operation. And if the first configuration is closed, responding to a third operation, and issuing the first parameter to the image sensor by the camera application program, wherein the third operation comprises the operation of opening the first configuration and the shooting operation. Based on the scheme, before the camera application program receives the shooting operation, the image output by the image sensor is always the currently acquired image, and the power consumption and the operation pressure of the image sensor are reduced.
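To make the timing difference between the two designs above concrete, the following C++ sketch contrasts issuing the first parameter as soon as the preview starts with issuing it only when the shooting operation arrives. It is a minimal sketch under assumed names (CameraAppSketch, issueFirstParameter and the Design values are illustrative inventions of this edit, not interfaces defined by this application).

#include <iostream>

enum class Design { IssueForPreview, IssueOnCapture };

// Illustrative stand-in for the camera application's decision logic.
struct CameraAppSketch {
    Design design;
    bool firstConfigurationEnabled;  // e.g. whether the mirror-flip switch is on

    void issueFirstParameter() { std::cout << "first parameter sent to sensor\n"; }

    void onPreviewStart() {
        if (design == Design::IssueForPreview && firstConfigurationEnabled)
            issueFirstParameter();  // preview frames are already flipped/rotated
    }
    void onShutter() {
        if (design == Design::IssueOnCapture && firstConfigurationEnabled)
            issueFirstParameter();  // only the captured frame is flipped/rotated
    }
};

int main() {
    CameraAppSketch app{Design::IssueOnCapture, true};
    app.onPreviewStart();  // nothing sent: preview stays unprocessed, saving sensor work
    app.onShutter();       // prints "first parameter sent to sensor"
    return 0;
}

In the second design, the sensor performs the first processing only for the captured frame, which matches the reduced power consumption and operation pressure noted above.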
In one possible design, after the image sensor returns the first image to the camera application, the method further includes: and the camera application program issues a preset instruction to the image sensor, wherein the preset instruction is used for indicating the image sensor to return the currently acquired image. Based on the scheme, after the camera application program finishes shooting, the camera application program instructs the image sensor to output the currently acquired image, so that the power consumption and the operation pressure of the image sensor are reduced.
In one possible design, after the camera application is launched in response to the first operation, the method further includes: the camera application judges whether the image sensor supports performing first processing on the currently acquired image; if the first processing is supported, the camera application configures a first attribute for the image sensor, and the first attribute is used for enabling the image sensor, when receiving the first parameter, to perform first processing on the currently acquired image and then return it. Based on the scheme, the accuracy of interaction between the camera application program and the image sensor is improved, and therefore the image processing efficiency is improved.
In one possible design, when the first processing includes flipping processing, the first parameter includes a flipping parameter, and the flipping parameter is used to instruct the image sensor to perform the flipping processing on the currently acquired image and then return it. When the first processing includes rotation processing, the first parameter includes a rotation parameter, and the rotation parameter is used to instruct the image sensor to perform the rotation processing on the currently acquired image and then return it. Based on the scheme, the image sensor can conveniently determine how to process the currently acquired image according to the first parameter.
In one possible design, the electronic device further includes a camera hardware abstraction layer. That the camera application issues the first parameter to the image sensor includes the following steps: the camera application sends the first parameter to the camera hardware abstraction layer. The camera hardware abstraction layer generates a first instruction according to the first parameter, and the first instruction is used for instructing the image sensor to perform first processing on the currently acquired image and then return it. The camera hardware abstraction layer issues the first instruction to the image sensor. That the image sensor performs first processing on the currently acquired image according to the first parameter includes the following step: the image sensor performs first processing on the currently acquired image according to the first instruction. Based on the scheme, for an image sensor capable of recognizing and applying the first instruction, the camera hardware abstraction layer in the embodiment of the application can directly send the first instruction to the image sensor, so that the image sensor directly performs first processing on the currently acquired image according to the first instruction, and the image processing efficiency is improved.
In a possible design, the electronic device further comprises a driver for the image sensor. That the image sensor performs first processing on the currently acquired image according to the first instruction includes the following steps: the driver of the image sensor generates a second instruction according to the first instruction, where the second instruction is an instruction that can be directly applied by the image sensor, and the second instruction is used for instructing the image sensor to perform first processing on the currently acquired image and then return it. The image sensor performs first processing on the currently acquired image according to the second instruction. Based on the scheme, for an image sensor that cannot recognize or apply the first instruction, the camera hardware abstraction layer in the embodiment of the application can send the first instruction to the driver of the image sensor, and the driver of the image sensor converts the first instruction into the second instruction that can be directly applied by the image sensor, so that the applicability of the image processing scheme provided by the embodiment of the application is improved.
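As an illustration of the driver-side translation just described, the sketch below turns a generic HAL-level instruction into a sensor-specific register write (the "second instruction"). The instruction names, register addresses and values are placeholders assumed for this sketch; real sensors use vendor-specific registers.

#include <cstdint>
#include <iostream>

// Generic instruction coming from the camera hardware abstraction layer.
enum class HalInstruction { FlipThenReturn, RotateThenReturn };

// A command the sensor can apply directly: the "second instruction".
struct SensorCommand { uint16_t reg; uint8_t value; };

// Hypothetical per-sensor translation table inside the image sensor driver.
SensorCommand toSecondInstruction(HalInstruction in) {
    switch (in) {
        case HalInstruction::FlipThenReturn:   return {0x0101, 0x01};  // placeholder mirror register
        case HalInstruction::RotateThenReturn: return {0x0102, 0x01};  // placeholder rotation register
    }
    return {0x0000, 0x00};  // unreachable fallback
}

int main() {
    SensorCommand cmd = toSecondInstruction(HalInstruction::FlipThenReturn);
    std::cout << "write register 0x" << std::hex << cmd.reg << " = " << int(cmd.value) << "\n";
    return 0;
}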
In one possible design, the first instruction includes one of: a first processing instruction, a second processing instruction, and a third processing instruction. That the camera hardware abstraction layer generates the first instruction according to the first parameter includes: when the first parameter includes the flipping parameter, the camera hardware abstraction layer generates a first processing instruction, and the first processing instruction is used for instructing the image sensor to flip the currently acquired image and then return it. When the first parameter includes the rotation parameter, the camera hardware abstraction layer generates a second processing instruction, and the second processing instruction is used for instructing the image sensor to rotate the currently acquired image and then return it. When the first parameter includes both the flipping parameter and the rotation parameter, the camera hardware abstraction layer generates a third processing instruction, and the third processing instruction is used for instructing the image sensor to flip and rotate the currently acquired image and then return it. Based on the scheme, the image sensor can conveniently determine how to process the currently acquired image according to the first instruction.
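The three-way mapping above can be summarized, purely for illustration, as the following C++ sketch of how the camera hardware abstraction layer might choose an instruction; the type and function names are assumptions made for this sketch, not the actual HAL interface.

#include <iostream>

// Which sub-parameters the "first parameter" carries.
struct FirstParameter {
    bool hasFlippingParameter;
    bool hasRotationParameter;
};

enum class FirstInstruction {
    FlipThenReturn,        // first processing instruction
    RotateThenReturn,      // second processing instruction
    FlipRotateThenReturn,  // third processing instruction
    None
};

FirstInstruction generateFirstInstruction(const FirstParameter& p) {
    if (p.hasFlippingParameter && p.hasRotationParameter) return FirstInstruction::FlipRotateThenReturn;
    if (p.hasFlippingParameter)                           return FirstInstruction::FlipThenReturn;
    if (p.hasRotationParameter)                           return FirstInstruction::RotateThenReturn;
    return FirstInstruction::None;  // no first processing requested
}

int main() {
    FirstParameter p{true, true};
    // Prints 2, i.e. FlipRotateThenReturn (the third processing instruction).
    std::cout << static_cast<int>(generateFirstInstruction(p)) << "\n";
    return 0;
}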
In one possible design, the electronic device further includes a memory. After the image sensor performs the first processing on the currently acquired image according to the first parameter, the method further includes: the camera application stores the first image to memory. Based on the scheme, the user can conveniently check or call the first image when in follow-up needs, and the user experience is favorably improved.
In one possible design, the electronic device further includes a camera hardware abstraction layer, a driver for the image sensor, and an encoding module. The first parameter further includes a size parameter for instructing the camera hardware abstraction layer to perform second processing on the first image, the second processing including at least cropping processing. The image sensor returning the first image to the camera application for the camera application to display the first image includes: the image sensor sends the first image to the driver of the image sensor. The driver of the image sensor performs code conversion on the first image to generate a second image, and the second image is an image that can be directly identified by the camera hardware abstraction layer. The camera hardware abstraction layer performs second processing on the second image according to the size parameter to generate a third image. The encoding module performs format conversion on the third image to generate a fourth image. The encoding module sends the fourth image to the camera application to cause the camera application to display the fourth image. Based on the scheme, the personalized requirements of the user for the image size can be better met, so that the user experience is improved.
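The return path in this design can be pictured as the following staged sketch (sensor output, driver transcoding, HAL cropping per the size parameter, encoder format conversion). All stage functions are stand-ins assumed for illustration; the patent does not publish these interfaces.

#include <iostream>
#include <string>

struct Frame { std::string format; int width; int height; };

Frame sensorOutput()                  { return {"sensor-raw", 4000, 3000}; }            // first image (already flipped/rotated)
Frame driverTranscode(const Frame& f) { return {"hal-readable", f.width, f.height}; }   // second image
Frame halCrop(const Frame& f, int w, int h) { return {f.format, w, h}; }                // third image (second processing)
Frame encodeForDisplay(const Frame& f) { return {"jpeg", f.width, f.height}; }          // fourth image

int main() {
    Frame first  = sensorOutput();
    Frame second = driverTranscode(first);       // made directly identifiable by the camera HAL
    Frame third  = halCrop(second, 3000, 3000);  // cropped according to the size parameter
    Frame fourth = encodeForDisplay(third);      // format conversion by the encoding module
    std::cout << "camera application displays a " << fourth.width << "x" << fourth.height
              << " " << fourth.format << " image\n";
    return 0;
}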
In one possible design, the flipping process includes at least one of: mirror image turning, up-down turning. Based on the scheme, the personalized requirements of the user for the turning processing are favorably met, and therefore the user experience is improved.
In one possible design, the rotation processing includes one of: rotation by 90 degrees, rotation by 180 degrees, and rotation by 270 degrees. Based on the scheme, the personalized requirements of the user for the rotation processing are better met, and therefore the user experience is improved.
In one possible design, the first operation includes at least one of: click, long press, double click. Based on the scheme, the personalized requirements of the user for the first operation are favorably met, and therefore the user experience is improved.
In a second aspect, an electronic device is provided that includes one or more processors and one or more memories; one or more memories are coupled to the one or more processors, the one or more memories storing computer instructions. The computer instructions, when executed by the one or more processors, cause the electronic device to perform the image processing method of any one of the first aspect and its possible designs.
In a third aspect, a chip system is provided, the chip system comprising a processing circuit and an interface; the processing circuit is configured to call and run, from a storage medium, a computer program stored in the storage medium, to execute the image processing method according to any one of the first aspect and its possible designs.
In a fourth aspect, a computer-readable storage medium is provided, comprising computer instructions which, when executed, perform the image processing method of any one of the first aspect and its possible designs.
It should be understood that the technical features of the technical solutions provided in the second, third, and fourth aspects may all correspond to the image processing method provided in the first aspect and its possible designs, so that similar beneficial effects can be achieved; details are not described herein again.
Drawings
FIG. 1 is a schematic diagram of a mobile phone front camera shooting a letter F;
FIG. 2 is a schematic view of a process of mirror image flipping of an original image by a mobile phone through a GPU module;
fig. 3 is a schematic composition diagram of an electronic device according to an embodiment of the present disclosure;
FIG. 4 is a diagram of a software framework provided by an embodiment of the present application;
fig. 5 is a scene diagram of a photographing process provided in the embodiment of the present application;
fig. 6 is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a gesture provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of another gesture provided by an embodiment of the present application;
fig. 9 is a schematic diagram of a shooting button provided in an embodiment of the present application;
fig. 10 is a schematic flowchart of another image processing method according to an embodiment of the present application;
fig. 11 is an interaction diagram of an image processing method according to an embodiment of the present application;
fig. 12 is a flowchart of an image processing method according to an embodiment of the present application;
FIG. 13 is an interaction diagram of another image processing method provided in an embodiment of the present application;
fig. 14 is a schematic diagram of an electronic device according to an embodiment of the present application;
fig. 15 is a schematic diagram of a chip system according to an embodiment of the present disclosure.
Detailed Description
The terms "first", "second", and "third" and the like in the embodiments of the present application are used for distinguishing different objects, and are not used for defining a specific order. Furthermore, the words "exemplary" or "such as" are used herein to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "such as" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
To facilitate understanding of the embodiments of the present application, the following first presents an application context of the embodiments of the present application.
When an electronic device takes a picture, an original image acquired by an image sensor (sensor) needs to be output or stored after being subjected to image processing. For example, a mobile phone often needs to mirror a picture taken through a front camera on a display screen to make the displayed image more visually intuitive.
Please refer to fig. 1, which is a schematic diagram of a mobile phone with a front camera for shooting a letter F. In order to display the letter F in a mirror image manner on the display screen, when the mobile phone shoots the letter F through the front-facing camera, the original image collected by the image sensor is displayed after being subjected to mirror image turning. In other words, the image obtained by shooting the letter F by the mobile phone through the front camera is the image after the letter F is turned left and right.
Note that the image processing in the embodiments of the present application includes at least one of flipping and rotation. The flipping includes up-down flipping and mirror flipping, and the rotation includes rotation by 90 degrees, 180 degrees, 270 degrees, and the like.
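For reference only, the sketch below spells out on a plain CPU buffer what the flipping and rotation named above do to pixel coordinates. It is not how the present scheme performs the processing; in this application the image sensor itself applies the equivalent transform before returning the image, and the type and function names here are illustrative assumptions.

#include <cstdint>
#include <vector>

struct Gray8Image {
    int w, h;
    std::vector<uint8_t> px;  // row-major, w*h bytes
    uint8_t at(int x, int y) const { return px[y * w + x]; }
};

// Mirror flip (left-right).
Gray8Image mirrorFlip(const Gray8Image& in) {
    Gray8Image out{in.w, in.h, std::vector<uint8_t>(in.px.size())};
    for (int y = 0; y < in.h; ++y)
        for (int x = 0; x < in.w; ++x)
            out.px[y * out.w + (in.w - 1 - x)] = in.at(x, y);
    return out;
}

// Up-down flip.
Gray8Image verticalFlip(const Gray8Image& in) {
    Gray8Image out{in.w, in.h, std::vector<uint8_t>(in.px.size())};
    for (int y = 0; y < in.h; ++y)
        for (int x = 0; x < in.w; ++x)
            out.px[(in.h - 1 - y) * out.w + x] = in.at(x, y);
    return out;
}

// Clockwise rotation; degrees must be 90, 180 or 270.
Gray8Image rotate(const Gray8Image& in, int degrees) {
    if (degrees == 180) return verticalFlip(mirrorFlip(in));
    Gray8Image out{in.h, in.w, std::vector<uint8_t>(in.px.size())};
    for (int y = 0; y < in.h; ++y)
        for (int x = 0; x < in.w; ++x) {
            if (degrees == 90)
                out.px[x * out.w + (in.h - 1 - y)] = in.at(x, y);
            else  // 270 degrees
                out.px[(in.w - 1 - x) * out.w + y] = in.at(x, y);
        }
    return out;
}

int main() {
    Gray8Image img{2, 2, {1, 2, 3, 4}};  // 2x2 test image: [1 2; 3 4]
    Gray8Image r = rotate(img, 90);
    // A 90-degree clockwise rotation of [1 2; 3 4] gives [3 1; 4 2].
    return r.at(0, 0) == 3 ? 0 : 1;
}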
Typically, the image processing of the raw image is performed by a GPU module in the electronic device. In the process of processing an original image by using a GPU module in an electronic device, the electronic device needs to transmit the original image acquired by an image sensor back to a camera application program, and then the camera application program transmits the original image to the GPU module to execute image processing.
Taking a mobile phone as an example of the electronic device, the process in which the mobile phone mirror-flips an original image through the GPU module is specifically described with reference to fig. 2. Please refer to fig. 2, which is a schematic diagram of a mobile phone performing mirror flipping on an original image through a GPU module.
S201, the mobile phone transmits the original image acquired by the image sensor back to the camera application program.
S202, the mobile phone transmits the original image to the GPU module through the camera application program to execute mirror image overturning.
As shown in S201 to S202, in order to display the mirror-flipped image to the user, the camera application cannot directly display the original image (referred to, for example, as image 1) after acquiring it from the image sensor; it needs to transmit the original image to the GPU module for mirror flipping. Only after obtaining the image processed by the GPU module (referred to, for example, as image 2) can the camera application display it, for example by encoding image 2 and then displaying the encoded image on the interface. This complicates the image processing flow and results in a large power consumption overhead.
In order to solve the above problem, embodiments of the present application provide an image processing method that enables the image sensor to directly output an image on which image processing has been performed. This avoids the complicated image processing flow and reduces the power consumption of the electronic device when performing image processing on the original image.
The scheme provided by the embodiment of the application is described in detail below with reference to the accompanying drawings. The image processing method provided by the embodiment of the present application may be applied to an electronic device of a user. The electronic device may be a device having a photographing function. For example, the electronic device may be a portable mobile device having a shooting function, such as a mobile phone, a tablet computer, a personal digital assistant (PDA), an augmented reality (AR)/virtual reality (VR) device, or a media player, and may also be a wearable electronic device, such as a smart watch. The embodiment of the present application does not particularly limit the specific form of the apparatus.
As an example, please refer to fig. 3, which is a schematic composition diagram of an electronic device 300 according to an embodiment of the present disclosure. The image processing method provided by the embodiment of the application can be applied to the electronic device 300 shown in fig. 3.
As shown in fig. 3, the electronic device 300 may include a processor 301, a display 302, a communication module 303, and the like.
Among other things, processor 301 may include one or more processing units, such as: the processor 301 may include an Application Processor (AP), a modem processor, a graphics processor, an Image Signal Processor (ISP), a controller, a memory, a video stream codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated in one or more processors 301.
The controller may be a neural center and a command center of the electronic device 300. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 301 for storing instructions and data. In some embodiments, the memory in the processor 301 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 301. If the processor 301 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 301, thereby increasing the efficiency of the system.
In some embodiments, processor 301 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface 304, and the like.
The electronic device 300 implements display functions via the GPU, the display screen 302, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 302 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 301 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 302 is used to display images, video streams, etc. The display screen 302 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 300 may include 1 or N display screens 302, N being a positive integer greater than 1.
The communication module 303 may include an antenna 1, an antenna 2, a mobile communication module, and/or a wireless communication module.
As shown in fig. 3, in some implementations, the electronic device 300 may further include an external memory interface 305, an internal memory 306, a USB interface 304, a charging management module 307, a power management module 308, a battery 309, an audio module 310, a sensor module 311, a key (not shown), a motor (not shown), an indicator (not shown), a camera 312, a Subscriber Identity Module (SIM) card interface, and the like.
The NPU is a neural-network (NN) computing processor, which processes input information quickly by referring to a biological neural network structure, for example, by referring to a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the electronic device 300, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The charging management module 307 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 307 may receive charging input of a wired charger through the USB interface 304. In some wireless charging embodiments, the charging management module 307 may receive a wireless charging input through a wireless charging coil of the electronic device 300. The charging management module 307 may charge the battery 309 and supply power to the electronic device 300 through the power management module 308.
The power management module 308 is used to connect the battery 309, the charging management module 307 and the processor 301. The power management module 308 receives input from the battery 309 and/or the charge management module 307, and provides power to the processor 301, the internal memory 306, the external memory, the display 302, the camera 312, the communication module 303, and the like. The power management module 308 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In other embodiments, the power management module 308 may also be disposed in the processor 301. In other embodiments, the power management module 308 and the charging management module 307 may be disposed in the same device.
The external memory interface 305 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 300. The external memory card communicates with the processor 301 through the external memory interface 305 to implement a data storage function. For example, files such as music, video streams, etc. are saved in the external memory card.
The internal memory 306 may be used to store computer-executable program code, which includes instructions. The processor 301 executes various functional applications and data processing of the electronic device 300 by executing instructions stored in the internal memory 306. The internal memory 306 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, and the like) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 300, and the like. Further, the internal memory 306 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The sensor module 311 in the electronic device 300 may include an image sensor, a touch sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, an ambient light sensor, a fingerprint sensor, a temperature sensor, a bone conduction sensor, and the like to realize sensing and/or acquiring functions for different signals.
The keys include a power-on key, a volume key and the like. The keys may be mechanical keys. Or may be touch keys. The electronic device 300 may receive a key input, and generate a key signal input related to user setting and function control of the electronic device 300.
The motor may generate a vibration indication. The motor can be used for incoming call vibration prompt and can also be used for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 302. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator can be an indicator light, and can be used for indicating the charging state and battery level changes, and also for indicating messages, missed calls, notifications, and the like.
The SIM card interface is used for connecting the SIM card. The SIM card can be brought into and out of contact with the electronic device 300 by being inserted into and pulled out of the SIM card interface. The electronic device 300 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface can support a Nano SIM card, a Micro SIM card, a SIM card and the like. Multiple cards can be inserted into the same SIM card interface at the same time. The types of the plurality of cards may be the same or different. The SIM card interface may also be compatible with different types of SIM cards. The SIM card interface may also be compatible with external memory cards. The electronic device 300 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the electronic device 300 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 300 and cannot be separated from the electronic device 300.
In some embodiments, the electronic device 300 may implement the shooting function through the ISP, the camera 312, the video stream codec, the GPU, the display screen 302, the application processor, and the like.
The ISP is used to process the data fed back by the camera 312. For example, when taking a picture, the shutter is opened, light is transmitted to the camera 312 photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera 312 photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to the naked eye. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 312.
The camera 312 is used to capture still images or video streams. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 300 may include 1 or N cameras, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 300 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic device 300. In other embodiments, electronic device 300 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Please refer to fig. 4, which is a schematic diagram of a software framework according to an embodiment of the present disclosure. When the image processing method provided in the embodiments of this application is applied to the electronic device 300 shown in fig. 3, the software in the electronic device 300 may be divided into an application layer 401, a hardware abstraction layer (HAL) 402, and a driver layer 403 shown in fig. 4.
A plurality of applications may be installed in the application layer 401, which may include a camera application for implementing a photographing function. The camera application is a computer program that runs on an operating system and can execute photographing tasks. The operating system may be Android™, Windows™, or the like. When the camera application is in the Android application package (APK) format, the camera application can run on the Android operating system. In the embodiments of this application, the camera application may be any application having a shooting function. For example, an application for sharing short videos may serve as the camera application in the embodiments of this application if it has a shooting function.
The hardware abstraction layer 402 is software that sits between the operating system kernel and the hardware circuitry and generally serves to abstract the hardware to achieve operating system and hardware circuitry interaction at the logical level. In an embodiment of the present application, the Hardware Abstraction Layer 402 may include a camera Hardware Abstraction Layer (camera HAL) that enables the camera application program to interact with the image sensor at a logical Layer.
A plurality of drivers for driving hardware operations may be installed in the driver layer 403. In the embodiment of the present application, the driving layer 403 may include an image sensor driver (sensor driver) for driving the image sensor 404 to operate.
It should be noted that the application layer 401, the hardware abstraction layer 402, and the driver layer 403 may include other contents, and are not specifically limited herein.
The following describes in detail the image processing method according to the embodiment of the present application with reference to the electronic device 300 provided in fig. 3 and the software division of the electronic device 300 provided in fig. 4.
The image processing method provided by the embodiment of the application can be applied to a scene that a user takes pictures through a front camera of a mobile phone. For example, when a user takes a picture through a front camera of a mobile phone, the mobile phone can display an original image collected by the image sensor in the display screen after the original image is subjected to mirror image turning so as to be close to the visual feeling of the user.
Please refer to fig. 5, which is a scene diagram of a photographing process according to an embodiment of the present disclosure. As shown in fig. 5, when a user takes a picture through the front camera of the mobile phone, the user may first click an icon 501 of a camera application program in the mobile phone to start the camera application program; then, a mirror image turning control 502 in the camera application program can be clicked to open mirror image turning configuration; finally, the shooting control 503 in the camera application program can be clicked to obtain the image subjected to mirror image turning, so that the shooting is completed. The camera application is a computer program capable of performing a photographing task, and may be located in the application layer 401 in the electronic device shown in fig. 4.
During startup, the camera application can determine whether the image sensor supports mirror flipping. For example, the camera application may obtain the configuration code of the image sensor and determine whether it contains a mirror-flipping configuration code. If the configuration code of the image sensor includes a mirror-flipping configuration code, the image sensor supports mirror flipping; if it does not, the image sensor does not support mirror flipping.
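A minimal illustration of this capability check is given below; since the patent does not specify the format of the configuration code, a simple key lookup (with a key name modelled on the sensorSupportMirror snippet shown later in this section) stands in for it.

#include <iostream>
#include <set>
#include <string>

// Hypothetical check: does the sensor's configuration code contain a mirror-flip entry?
bool sensorSupportsMirrorFlip(const std::set<std::string>& configurationCodes) {
    return configurationCodes.count("sensorSupportMirror") > 0;  // assumed key name
}

int main() {
    std::set<std::string> codes = {"sensorSupportMirror", "sensorSupportHdr"};
    std::cout << (sensorSupportsMirrorFlip(codes)
                      ? "sensor flips in hardware"
                      : "fall back to GPU mirror flip")
              << "\n";
    return 0;
}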
If the image sensor does not support mirror image inversion, the camera application program can send the image output by the image sensor to the GPU module, and the mirror image inversion process of the image is completed through the GPU module. The present application generally describes scenarios in which the image sensor supports mirror flipping.
In the embodiment of the present application, in the case that it is determined that the image sensor supports mirror image flipping, the effect of mirror image flipping may be efficiently achieved according to any one of the following schemes of fig. 6 to 14.
If the image sensor supports mirror image flipping, the camera application program may configure an attribute of mirror image flipping for the image sensor, so that the image sensor can directly output an image after mirror image flipping according to an instruction. Illustratively, the camera application may add the following code in the configuration code of the image sensor:
m(si,ss,se,HAL_CUSTOM_SUPPORT_SENSOR_MIRROR,"sensorSupportMirror",uint8_t,1)。
By adding this code to the configuration code of the image sensor, the mirror flip attribute can be enabled for the image sensor. In other words, the image sensor can be enabled to directly output a mirror-flipped image according to an instruction. In this embodiment, the first attribute may include the mirror flip attribute.
After the camera application program is started, a preset instruction can be issued to the image sensor to instruct the image sensor to return the acquired image. In some embodiments, the mirror-flipped configuration is turned off by default, and the preset instruction instructs the image sensor to return the captured original image. In other embodiments, if the mirror image flip configuration is turned on by default, the preset instruction may instruct the image sensor to return to the acquired original image, or may also carry a configuration parameter, where the configuration parameter instructs the image sensor to turn on the mirror image register and output the image after mirror image flip. The image currently acquired by the image sensor may be an original image acquired by the image sensor.
As shown in fig. 5, after the camera application is started, a shooting interface may be provided, and the shooting interface may include a mirror flipping control 502, and a user may open or close the mirror flipping configuration by clicking the mirror flipping control 502. When the mirror image overturning configuration is started, the image sensor outputs the acquired original image after mirror image overturning; when the mirror image turning configuration is closed, the image sensor outputs the acquired original image. In this embodiment of the present application, the second operation may include clicking the mirror flipping control 502.
A preview image display area 504 may also be included in the photographic interface. The preview image display area 504 is used to display a preview image, and the user can adjust the shooting parameters such as the shooting angle and the shooting range by viewing the preview image.
In the embodiment of the application, the image sensor can acquire images in real time so that preview images can be displayed. When the camera application receives a shooting instruction, the camera application instructs the image sensor accordingly, so that the currently acquired image is stored, after image processing, into the storage module of the electronic device.
It should be noted that when the mirror-flipped configuration is closed, the preview image is the same as the image that is ultimately stored by the camera application. When the mirror flip configuration is enabled, the preview image may be the same as or different from the image stored by the camera application, as described below.
The preview image may be the same as the image ultimately stored by the camera application, both being mirror-flipped images. In other words, the preview image is an image output by the image sensor after the acquired original image is subjected to mirror image inversion, and the image finally stored by the camera application program is also an image output by the image sensor after the acquired original image is subjected to mirror image inversion. Therefore, the user can watch the finally stored image of the camera application program in advance according to the preview image, and the use experience of the user is improved.
In this case, taking the scenario shown in fig. 5 as an example, after the camera application is started, the mirror flip configuration is in the off state, and the preview image is the original image acquired by the image sensor. When the user opens the mirror flip configuration by clicking the mirror flip control 502, the camera application issues the configuration parameter to the image sensor, instructing the image sensor to open the mirror register and output the mirror-flipped image, so that the camera application displays the mirror-flipped image in the preview image display area 504. When the user clicks the shooting control 503 in the camera application, the camera application issues a shooting instruction to the image sensor, instructing the image sensor to return the mirror-flipped image, so that the camera application stores the mirror-flipped image to the storage module.
The preview image may be different from the image ultimately stored by the camera application, with a mirror-symmetric relationship. The preview image is an original image directly output by the image sensor, and the image finally stored by the camera application program is an image output by the image sensor after the original image is subjected to mirror image turning. Thus, it is advantageous to reduce power consumption of the image sensor.
In this case, taking the scenario shown in fig. 5 as an example, after the camera application is started, the mirror flip configuration is in the off state, and the preview image is the original image acquired by the image sensor. When the user clicks the mirror flip control 502 to open the mirror flip configuration and clicks the shooting control 503 in the camera application, the camera application issues a shooting instruction carrying configuration information to the image sensor, instructing the image sensor to open the mirror register and return the mirror-flipped image, so that the camera application stores the mirror-flipped image to the storage module. In this embodiment of the present application, the shooting operation may be the above-mentioned clicking of the shooting control 503 in the camera application. The third operation may be clicking the mirror flip control 502 described above to open the mirror flip configuration and clicking the shooting control 503 in the camera application.
Fig. 5 above explains, from the perspective of the usage scenario, the process of the user taking a picture through the front camera of the mobile phone. The following explains the same process again from the perspective of the operations executed by the device.
The flow of photographing when the preview image is the same as the image ultimately stored by the camera application is described herein. Please refer to fig. 6, which is a flowchart illustrating an image processing method. The flow may include S601-S613.
S601, starting a camera application program.
The camera application is a computer program capable of performing a photographing task, and may be located in the application layer 401 in the electronic device shown in fig. 4.
In some embodiments, a user interface of the electronic device may provide an icon for the camera application. When the electronic device detects a user click, double click, long press, etc., on the icon, the camera application may be launched. For example, as shown in fig. 5, the camera application may be launched by clicking on the icon 501 of the camera application in fig. 5.
In other embodiments, a voice assistant may be installed in the electronic device, which may be located in the application layer 401 of the electronic device shown in FIG. 4 and have the authority to launch the camera application. The camera application may be launched when a voice assistant in the electronic device detects a voice instruction from a user to launch the camera application.
In addition, the camera application program can also be started through gestures, and the gestures can be fist making, palm stretching and the like. Please refer to fig. 7, which is a schematic diagram of a gesture according to an embodiment of the present disclosure. As shown in fig. 7, the gesture to launch the camera application may be a fist. For example, when the electronic device detects a fist-making gesture through the camera, the camera application may be started. Please refer to fig. 8, which is a schematic diagram of another gesture provided in the embodiment of the present application. As shown in fig. 8, the gesture to launch the camera application may be stretching the palm. For example, when the electronic device detects a gesture of extending a palm of a hand through the camera, a camera application may be started. In this embodiment of the application, the first operation may be the operation such as clicking, double-clicking, long-pressing, etc. on the camera application icon, may also be the voice instruction, and may also be the gesture shown in fig. 7 and/or fig. 8, which is not limited herein.
And S602, in the process of starting the camera application program, judging whether the image sensor supports mirror image overturning.
The process of the camera application determining whether the image sensor supports mirror inversion may refer to the description of the scheme shown in fig. 5. That is, the camera application may obtain the code of the image sensor, and determine whether the image sensor supports mirror inversion by determining whether the configuration code includes the configuration code of mirror inversion, which is not described herein again.
And S603a, if mirror flipping is not supported, the camera application program sends the image output by the image sensor to the GPU module, and the image is flipped by the GPU module.
S603b, if mirror flipping is supported, the camera application configures the image sensor with the mirror flip attribute.
In the embodiment of the present application, reference may be made to the scheme shown in fig. 5 for a process of configuring the image sensor with the attribute of mirror flipping by the camera application. Namely, the camera application program can add codes in the configuration codes of the image sensor so that the image sensor can directly output the image after mirror image turning according to the instructions. The first attribute may include an attribute of the mirror flip.
S604, the camera application program judges whether mirror image overturning configuration is started.
For example, a shooting interface of the camera application may include a mirror image flipping control, and the camera application may determine whether the mirror image flipping configuration is turned on according to a state of the mirror image flipping control. Wherein the mirror flip control can be the mirror flip control 502 shown in fig. 5. The state of the mirror flip control can include an on state and an off state. The mirror image turning control is in an open state to indicate that the mirror image turning configuration is open, and the mirror image turning control is in a closed state to indicate that the mirror image turning configuration is closed.
In an embodiment of the present application, the first configuration may include the mirror-flipped configuration described above.
And S605a, if the mirror image overturning configuration is closed, the camera application program issues a preset instruction to the image sensor. The preset instruction is used for instructing the image sensor to return the acquired original image so that the camera application program can display the original image in the preview image display area.
Wherein, the original image is an image which is not subjected to mirror image inversion. And when the mirror image overturning configuration is closed, the preview image is the original image.
The preview image display area is used for displaying a preview image so that a user can adjust shooting parameters such as a shooting angle and a shooting range by observing the preview image. Illustratively, as shown in fig. 5, the preview image display area may be the preview image display area 504 shown in fig. 5.
And S605b, if the mirror image overturning configuration is started, the camera application program issues configuration parameters to the image sensor. Wherein the configuration parameter is used for instructing the image sensor to return the mirror-flipped image so that the camera application displays the mirror-flipped image in the preview image display area. Jump to S609.
In the embodiment of the present application, the configuration parameter may also be referred to as a first parameter.
The configuration parameter is specifically used for instructing the image sensor to open the mirror image register, so that the image sensor outputs an image after mirror image turning.
And S606, the image sensor sends the acquired original image to the camera application program according to a preset instruction.
S607, the camera application displays the original image in the preview image display area.
And S608, responding to the opening of the mirror image overturning configuration, and issuing configuration parameters to the image sensor by the camera application program.
In some embodiments, as shown in FIG. 5, the mirror flip configuration may be opened by clicking on the mirror flip control 502 in FIG. 5.
In other embodiments, the mirror flip configuration may be opened by a gesture as shown in fig. 7 and/or fig. 8.
In addition, the mirror flip configuration can also be opened by a voice command. The voice command can be "turn on mirror flip", "turn on mirror", etc. For example, when the electronic device detects the voice command "turn on mirror," the mirror flip configuration may be turned on.
In this embodiment, the second operation may include clicking the mirror flip control 502 in fig. 5, such as the gesture shown in fig. 7 and/or fig. 8, and the voice instruction.
S609, the image sensor opens the mirror register according to the configuration parameter, and sends the mirror-flipped image to the camera application.
In an embodiment of the present application, the image sensor includes a mirror register. When the mirror register is open, the image sensor outputs the mirror-flipped image; when the mirror register is closed, the image sensor outputs the original image.
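For illustration, the following C++ sketch models the mirror register described above as a single sensor register written over a mock register bus. The register address, bit values, and helper names are assumptions made for this example; a real sensor defines them in its own datasheet.

#include <cstdint>
#include <iostream>
#include <map>

class MockSensorBus {
    std::map<uint16_t, uint8_t> regs_;  // in-memory stand-in for the sensor's register file
public:
    void writeRegister(uint16_t addr, uint8_t value) { regs_[addr] = value; }
    uint8_t readRegister(uint16_t addr) { return regs_[addr]; }
};

constexpr uint16_t kMirrorRegAddr = 0x0101;  // hypothetical address of the mirror register
constexpr uint8_t  kMirrorOn      = 0x01;    // sensor outputs the mirror-flipped image
constexpr uint8_t  kMirrorOff     = 0x00;    // sensor outputs the original image

void setMirrorRegister(MockSensorBus& bus, bool enable) {
    bus.writeRegister(kMirrorRegAddr, enable ? kMirrorOn : kMirrorOff);
}

int main() {
    MockSensorBus bus;
    setMirrorRegister(bus, true);   // open the mirror register: mirror-flipped output
    std::cout << "mirror register = " << int(bus.readRegister(kMirrorRegAddr)) << "\n";
    setMirrorRegister(bus, false);  // close the mirror register: original output
    std::cout << "mirror register = " << int(bus.readRegister(kMirrorRegAddr)) << "\n";
    return 0;
}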
S610, the camera application displays the mirror-flipped image in the preview image display area.
S611, in response to the shooting control in the camera application being clicked, the camera application issues a shooting instruction to the image sensor. The shooting instruction instructs the image sensor to return the mirror-flipped image, so that the camera application can store the mirror-flipped image in the storage module.
The shooting control may be the shooting control 503 shown in fig. 5. When the electronic device detects a click operation on the shooting control 503, it issues a shooting instruction to the image sensor. As shown in fig. 5, the shooting control 503 in the embodiment of the present application may be circular and located at the edge of the display screen of the electronic device. In this way, the user interface of the camera application is more aesthetically pleasing, the user can easily reach the shooting control 503 while holding the electronic device, and the usability of the camera application is improved.
In the embodiment of the application, the camera application may also be triggered by a physical key to issue the shooting instruction to the image sensor. The physical key is, for example, a shooting button provided on the electronic device. For example, the electronic device may issue a shooting instruction to the image sensor upon detecting that the shooting button is pressed.
Please refer to fig. 9, which is a schematic diagram of a shooting button according to an embodiment of the present disclosure. Illustratively, the photographing button 901 may be provided at a bezel of the electronic device. In this way, the user can easily select the shooting button 901 when holding the electronic device, and usability of the camera application is improved.
In different implementations, the function of the shooting button 901 may also be implemented by other existing physical keys, for example, the function of the shooting button 901 is implemented by a volume adjustment button.
It should be noted that the camera application may also be triggered to issue a shooting instruction to the image sensor through a voice instruction, a gesture, or the like. The voice instruction may be 'shoot', 'eggplant', or the like. The gesture may be a gesture as shown in fig. 7 and/or fig. 8.
S612, the image sensor sends the mirror-flipped image to the camera application according to the shooting instruction.
S613, the camera application stores the mirror-flipped image in the storage module.
The storage module may be a memory, a memory card, or the like in the electronic device, which is not limited herein.
Fig. 6 describes the photographing process in which the preview image is identical to the image finally stored by the camera application. The following describes the photographing process in which the preview image differs from the image finally stored by the camera application. Please refer to fig. 10, which is a flowchart of another image processing method. The flow may include S1001-S1013.
S1001, the camera application is started.
S1002, during startup of the camera application, it is determined whether the image sensor supports mirror flipping.
S1003a, if mirror flipping is not supported, the camera application sends the image output by the image sensor to the GPU module, and the GPU module mirror-flips the image.
S1003b, if mirror flipping is supported, the camera application configures the mirror flip attribute for the image sensor.
S1004, the camera application determines whether the mirror flip configuration is turned on.
S1005a, if the mirror flip configuration is turned off, the camera application issues a preset instruction to the image sensor. The preset instruction instructs the image sensor to return the captured original image, so that the camera application can display the original image in the preview image display area.
The specific implementation of S1001-S1005a may refer to S601-S605a in the scheme shown in fig. 6, which is not described herein again.
S1005b, if the mirror flip configuration is turned on, then in response to the shooting control in the camera application being clicked, the camera application issues the configuration parameter to the image sensor. The configuration parameter instructs the image sensor to open the mirror register and output the mirror-flipped image. Then jump to S1009.
The shooting control may be, for example, the shooting control 503 shown in fig. 5. When the electronic device detects that the mirror flip configuration is turned on and detects a click operation on the shooting control 503, it issues the configuration parameter to the image sensor. The shooting control may also be the shooting button 901 shown in fig. 9, and details are not repeated here.
The image sensor may include a mirror register. When the mirror register is open, the image sensor outputs the mirror-flipped image; when the mirror register is closed, the image sensor outputs the original image. The configuration parameter may instruct the image sensor to open the mirror register, so that the image sensor outputs the mirror-flipped image.
It should be noted that the camera application may also be triggered to issue a shooting instruction to the image sensor through a voice instruction, a gesture, or the like. The gesture may be a fist making gesture as shown in fig. 7, an extending palm gesture as shown in fig. 8, or a flipping palm gesture, which is not limited herein.
In this embodiment of the application, the shooting operation may be the above-mentioned click operation on the shooting control 503, the fist-making gesture shown in fig. 7, the palm-extending gesture shown in fig. 8, a palm-flipping gesture, a voice instruction, or the like, which is not limited herein.
S1006, the image sensor sends the captured original image to the camera application according to the preset instruction.
S1007, the camera application displays the original image in the preview image display area.
S1008, in response to the mirror flip configuration being turned on and the shooting control in the camera application being clicked, the camera application issues the configuration parameter to the image sensor. The configuration parameter instructs the image sensor to open the mirror register and output the mirror-flipped image.
The specific implementation of turning on the mirror flip configuration may refer to the related description of S608, and is not repeated here.
In some embodiments, the camera application may be triggered by a voice instruction to issue the configuration parameter to the image sensor. The voice instruction may be 'shoot a mirror image', 'turn on mirroring and shoot', or the like. For example, when the electronic device detects the voice instruction "shoot a mirror image", the configuration parameter may be issued to the image sensor.
In other embodiments, the camera application may be triggered to issue the configuration parameter to the image sensor by a gesture as shown in fig. 7 and/or fig. 8. In this embodiment, the third operation may include the operation of turning on the mirror flip configuration together with the above voice instruction or the gesture shown in fig. 7 and/or fig. 8.
S1009, the image sensor sends the mirror-flipped image to the camera application according to the configuration parameter.
S1010, the camera application stores the mirror-flipped image in the storage module.
The storage module may be a memory, a memory card, or the like in the electronic device, and is not limited herein.
S1011, the camera application issues a preset instruction to the image sensor. The preset instruction instructs the image sensor to close the mirror register and return the captured original image, so that the camera application can display the original image in the preview image display area.
After the camera application stores the mirror-flipped image in the storage module, the preview image display area of the camera application continues to display the preview image so that the user can keep shooting. Because the preview image is an image that has not been mirror-flipped, the camera application also needs to issue the preset instruction to the image sensor to instruct it to return the original image.
S1012, the image sensor closes the mirror register according to the preset instruction, and sends the captured original image to the camera application.
S1013, the camera application displays the original image in the preview image display area.
In this way, the camera application can display the original image that has not undergone mirror inversion in the preview image display area, so that the user can continue shooting.
It should be noted that all relevant contents of the steps in fig. 6 and fig. 10 may refer to the relevant contents in fig. 5. In this embodiment of the application, the processes in which the electronic device flips the captured original image up and down, or rotates it, through the image sensor are similar to the mirror flipping process, and are not described again here.
In the scheme shown in fig. 6, when the mirror flip configuration is changed from the off state to the on state, the camera application may issue the mirror flip configuration parameter to the image sensor, so that the image sensor outputs the mirror-flipped image, and the preview image changes from the original image to the mirror-flipped image.
In the scheme shown in fig. 10, when the mirror flip configuration is changed from the off state to the on state, the preview image does not change, and the image sensor still outputs the original image that is not mirror-flipped. When the electronic device detects that the mirror flip configuration is in the on state and detects a shooting operation, or detects a third operation, the camera application issues the mirror flip configuration parameter to the image sensor so that the image sensor outputs the mirror-flipped image. For the shooting operation and the third operation, refer to the descriptions of S1005b and S1008, which are not repeated here.
That is, both the scheme shown in fig. 6 and the scheme shown in fig. 10 include a process in which the camera application issues the configuration parameter to the image sensor, and the configuration parameter instructs the image sensor to return the image after image processing. This process is explained below.
Please refer to fig. 11, which is an interaction diagram of an image processing method according to an embodiment of the present application. As shown in fig. 11, the image processing method provided by the embodiment of the present application may include S1101-S1105.
S1101, the camera application program sends configuration parameters to a camera hardware abstraction layer, and the configuration parameters are used for indicating the image sensor to return the currently acquired image after first processing. The camera application is a computer program capable of performing a photographing task, and may be located in the application layer 401 in the electronic device shown in fig. 4. The camera hardware abstraction layer may be located in the hardware abstraction layer 402 as shown in fig. 4.
In the embodiment of the present application, the configuration parameter may also be referred to as a first parameter.
In some embodiments, the configuration parameter may indicate whether the image sensor flips the currently captured image and, if so, the type of flipping. The type of flipping may include up-down flipping, left-right flipping, and the like. Left-right flipping is the mirror flipping shown in fig. 1, and is not described in detail later.
In other embodiments, the configuration parameter may also indicate whether the image sensor rotates the currently captured image by a preset angle, where the preset angle may be 90°, 180°, 270°, or the like, which is not limited herein.
Illustratively, the configuration parameters may include at least one of: an up-down flip parameter, a mirror flip parameter, a rotation parameter, and the like.
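A minimal sketch of how the configuration parameters could be modelled is shown below in C++. The struct and field names are hypothetical; the patent only states that an up-down flip parameter, a mirror flip parameter, and a rotation parameter may be present.

#include <cstdint>
#include <iostream>

struct ConfigParams {
    bool flipUpDown   = false;  // up-down flip parameter
    bool mirrorFlip   = false;  // mirror (left-right) flip parameter
    uint16_t rotation = 0;      // rotation parameter: 0, 90, 180, or 270 degrees
};

int main() {
    ConfigParams params;
    params.mirrorFlip = true;  // mirror flip configuration turned on
    params.rotation   = 90;    // rotate the captured image by 90 degrees
    std::cout << "mirror=" << params.mirrorFlip << " rotation=" << params.rotation << "\n";
    return 0;
}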
In some embodiments, the up-down flip parameter, the mirror flip parameter, and the rotation parameter may be represented numerically in the configuration parameters. As an example, a parameter value of 1 indicates that the corresponding configuration is on, and a parameter value of 0 indicates that it is off. For example, if the mirror flip parameter is 1, the mirror flip configuration is turned on, that is, the image sensor is instructed to mirror-flip the currently captured image.
In other embodiments, the up-down flip parameter, the mirror flip parameter, and the rotation parameter may be represented by character strings in the configuration parameters. As an example, if the configuration parameters include the corresponding character string, the corresponding configuration is on; if they do not include the character string, the configuration is off. For example, if the configuration parameters include the character string "sensorSupportMirror", the mirror flip configuration is turned on, that is, the image sensor is instructed to mirror-flip the currently captured image.
S1102, the camera hardware abstraction layer of the electronic device sends an image processing instruction to the image sensor driver according to the configuration parameters.
The image sensor driver may be located in the driver layer 403 shown in fig. 4.
In the embodiment of the present application, the image processing instruction may also be referred to as a first instruction, and it carries the configuration parameter of each configuration that is turned on. The configuration that is turned on may include at least one of: up-down flipping, mirror flipping, and rotation.
Illustratively, the camera hardware abstraction layer of the electronic device may determine the image processing instruction according to the configuration parameters. Take the case in S1101 in which the up-down flip parameter, the mirror flip parameter, and the rotation parameter are represented by character strings. In some embodiments, the electronic device may check, through the camera hardware abstraction layer, whether the character strings of the up-down flip parameter, the mirror flip parameter, and the rotation parameter exist, one by one or in parallel, and generate the image processing instruction according to the check results. As an example, the code for checking whether the character string of the mirror flip parameter exists may be as follows:
if (GetMetadataByTag(sensorSupportMirror) == true)
{
    setSensorMirrorSetting();
}
Here, if (GetMetadataByTag(sensorSupportMirror) == true) means that the camera hardware abstraction layer determines whether the configuration parameters include the mirror flip configuration sensorSupportMirror. If they do, that is, if GetMetadataByTag(sensorSupportMirror) == true holds, the function setSensorMirrorSetting() is executed to generate the image processing instruction that instructs the image sensor to mirror-flip the captured image.
In other embodiments, the electronic device may check, through the camera hardware abstraction layer, whether the up-down flip parameter, the mirror flip parameter, and the rotation parameter are 1, one by one or in parallel, and generate the image processing instruction according to the check results.
The image processing instruction in the embodiments of the present application may be a number, letters, or a combination of numbers and letters. Taking the image processing instruction as a number, different digits of the number may indicate different parameters, and the value of each digit may indicate whether the corresponding configuration is on. For example, the first digit indicates the up-down flip parameter, the second digit indicates the mirror flip parameter, the third digit indicates the rotation parameter, a digit of 1 means the corresponding configuration is on, and a digit of 0 means it is off. On this basis, an image processing instruction of 010 means that the up-down flip configuration is off, the mirror flip configuration is on, and the rotation configuration is off. It should be noted that the rotation parameter indicates whether the image sensor rotates the captured image by a preset angle. The preset angle may be 90°, 180°, 270°, or the like, which is not limited herein.
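The three-digit encoding described above (for example, 010 meaning up-down flip off, mirror flip on, rotation off) can be sketched as follows in C++. The digit meanings come from the text; the helper function itself is hypothetical and shown only for illustration.

#include <iostream>
#include <string>

// Builds the three-digit instruction: first digit up-down flip, second digit mirror flip,
// third digit rotation; '1' means the corresponding configuration is on, '0' means off.
std::string encodeInstruction(bool flipUpDown, bool mirrorFlip, bool rotate) {
    std::string instr(3, '0');
    if (flipUpDown) instr[0] = '1';
    if (mirrorFlip) instr[1] = '1';
    if (rotate)     instr[2] = '1';
    return instr;
}

int main() {
    std::cout << encodeInstruction(false, true, false) << "\n";  // prints "010"
    return 0;
}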
S1103, the image sensor driver of the electronic device sends a first driving instruction to the image sensor according to the image processing instruction.
S1104, an image sensor of the electronic device performs first processing on the acquired image according to the first driving instruction to obtain a first image.
In the embodiment of the present application, the first driving instruction may also be referred to as a second instruction, and the first driving instruction is an instruction that can be directly applied to the image sensor. The camera hardware abstraction layer is software, and the image sensor is hardware. The image sensor driver can convert image processing instructions output by the camera hardware abstraction layer into first driving instructions which can be recognized by the image sensor.
The first driving instruction may carry the configuration parameter of each configuration that is turned on, and the first processing is the processing corresponding to those configurations. For example, if the configuration that is turned on is the up-down flip configuration, the first processing is up-down flipping; if the configuration that is turned on is the mirror flip configuration, the first processing is mirror flipping.
Illustratively, if the configuration turned on in the first driving instruction is mirror flipping, the image sensor driver may drive the image sensor to mirror-flip the captured image and take the mirror-flipped image as the first image.
In the embodiment of the application, when multiple configurations are turned on in the image processing instruction, the captured image may be subjected to the first processing in a preset order, for example, first up-down flipping, then mirror flipping, and finally rotation. For example, if the configurations turned on in the first driving instruction are up-down flipping, mirror flipping, and rotation, the image sensor driver may drive the image sensor to first flip the captured image up and down, then mirror-flip the up-down flipped image, and finally rotate the mirror-flipped image by a preset angle to obtain the first image. It is to be understood that this is by way of example only and is not meant to limit the application.
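The preset order (up-down flip, then mirror flip, then rotation) can be illustrated with the small C++ sketch below. The tiny matrix type stands in for a real image buffer; a 90-degree clockwise rotation is assumed for the rotation step, which the patent leaves open.

#include <algorithm>
#include <iostream>
#include <vector>

using Image = std::vector<std::vector<int>>;

Image flipUpDown(const Image& img) {           // reverse the row order
    return Image(img.rbegin(), img.rend());
}

Image mirrorFlip(Image img) {                  // reverse each row (left-right flip)
    for (auto& row : img) std::reverse(row.begin(), row.end());
    return img;
}

Image rotate90(const Image& img) {             // clockwise 90-degree rotation
    size_t h = img.size(), w = img[0].size();
    Image out(w, std::vector<int>(h));
    for (size_t r = 0; r < h; ++r)
        for (size_t c = 0; c < w; ++c)
            out[c][h - 1 - r] = img[r][c];
    return out;
}

int main() {
    Image img = {{1, 2}, {3, 4}};                         // 2x2 test image
    Image first = rotate90(mirrorFlip(flipUpDown(img)));  // apply the preset order
    for (const auto& row : first) {
        for (int px : row) std::cout << px << ' ';
        std::cout << '\n';
    }
    return 0;
}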
In a possible design, for an image sensor capable of recognizing and applying a first instruction, the camera hardware abstraction layer in the embodiment of the present application may directly send the first instruction to the image sensor, so that the image sensor directly performs first processing on a currently acquired image according to the first instruction, thereby improving image processing efficiency.
S1105, the image sensor returns the first image to the camera application to cause the camera application to display the first image.
It should be noted that the first image is an image obtained by the image sensor performing the first processing on the currently captured image. For example, if the first processing is mirror flipping, the first image is an image obtained by the image sensor mirror-flipping the currently captured image.
In this way, the electronic device does not need to transmit the original image through the camera application to the GPU module for image processing; instead, the first image that has already undergone the image processing is output directly by the image sensor, which reduces the power consumption of the image processing and improves the image processing efficiency.
Fig. 11 illustrates the solution of the present application from the perspective of interaction of the modules, and the following describes the solution provided by the embodiment of the present application from the perspective of the electronic device with reference to fig. 12. Please refer to fig. 12, which is a flowchart illustrating an image processing method according to an embodiment of the present disclosure. As shown in fig. 12, the image processing method provided by the embodiment of the present application may include S1201-S1205.
S1201, the electronic device generates configuration parameters.
S1202, the electronic device generates an image processing instruction according to the configuration parameters.
S1203, the electronic device generates a first driving instruction according to the image processing instruction.
S1204, the electronic device performs first processing on the currently captured image according to the first driving instruction to obtain a first image.
S1205, the electronic device displays the first image.
It should be noted that all relevant contents of the steps in fig. 12 may refer to the corresponding steps in fig. 11, and are not described herein again.
In the schemes of fig. 11 and fig. 12 described above, the configuration parameters may further include a size parameter indicating the desired image size. The first image output by the image sensor may not match the size parameter, and the format of the first image may not be directly displayable. Therefore, please refer to fig. 13, which is an interaction diagram of another image processing method according to an embodiment of the present application.
In conjunction with the description of fig. 11, S1105 may include S1301-S1305 described below.
S1301, the image sensor sends the first image to a driver of the image sensor.
S1302, the image sensor driver transcodes the first image to generate a second image, where the second image can be directly recognized by the camera hardware abstraction layer.
The camera hardware abstraction layer is software, and the image sensor is hardware. By transcoding the first image output by the image sensor, the image sensor driver can generate the second image that the camera hardware abstraction layer can recognize.
S1303, the camera hardware abstraction layer of the electronic device performs second processing on the second image according to the configuration parameters to obtain a third image. The second processing may include image cropping, among others.
Illustratively, the configuration parameters may also include a size parameter. On this basis, the camera hardware abstraction layer of the electronic device may perform the second processing on the second image according to the size parameter, to crop the second image to the size indicated by the size parameter. For example, if the image size indicated by the size parameter is 640 × 480 pixels, the camera hardware abstraction layer of the electronic device crops the second image into a third image of 640 × 480 pixels.
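For illustration, the cropping performed as the second processing can be sketched as below in C++. A center crop is assumed here; the patent only states that the second image is cropped to the size indicated by the size parameter, not which region is kept.

#include <algorithm>
#include <iostream>
#include <vector>

using Image = std::vector<std::vector<int>>;

// Crops the image to targetW x targetH around its center (assumption for this sketch).
Image centerCrop(const Image& img, size_t targetW, size_t targetH) {
    size_t h = img.size(), w = img[0].size();
    size_t top  = (h > targetH) ? (h - targetH) / 2 : 0;
    size_t left = (w > targetW) ? (w - targetW) / 2 : 0;
    Image out;
    for (size_t r = top; r < top + std::min(targetH, h); ++r)
        out.emplace_back(img[r].begin() + left, img[r].begin() + left + std::min(targetW, w));
    return out;
}

int main() {
    Image secondImage(6, std::vector<int>(8, 1));      // e.g. an 8 x 6 second image
    Image thirdImage = centerCrop(secondImage, 4, 2);  // size parameter of 4 x 2
    std::cout << thirdImage[0].size() << " x " << thirdImage.size() << "\n";  // prints "4 x 2"
    return 0;
}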
S1304, the encoding module of the electronic device encodes the third image to obtain a fourth image.
The encoding module encodes the third image by converting its format into the Joint Photographic Experts Group (JPEG) format. This reduces the storage space occupied by the fourth image and makes it convenient for the electronic device to display the fourth image directly.
In some embodiments, the encoding module may be built-in software in the electronic device, and software-encode the third image. Thus, the power consumption of the encoding process is reduced.
In other embodiments, the encoding module may be a coprocessor in the electronic device, and the third image is hardware encoded. Thus, the coding efficiency is improved.
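The choice between the two encoding embodiments above can be sketched in C++ as follows. The encoder functions are placeholders only: a real implementation would call an actual JPEG library for the software path or the coprocessor driver for the hardware path, neither of which is specified here.

#include <cstdint>
#include <iostream>
#include <vector>

struct EncodedImage { std::vector<uint8_t> jpegBytes; };

// Placeholder software path: built-in software encoding, lower power consumption.
EncodedImage softwareEncodeToJpeg(const std::vector<uint8_t>& thirdImage) {
    std::cout << "software JPEG encode of " << thirdImage.size() << " bytes\n";
    return {};
}

// Placeholder hardware path: a coprocessor encodes the image, higher efficiency.
EncodedImage hardwareEncodeToJpeg(const std::vector<uint8_t>& thirdImage) {
    std::cout << "coprocessor JPEG encode of " << thirdImage.size() << " bytes\n";
    return {};
}

EncodedImage encodeThirdImage(const std::vector<uint8_t>& thirdImage, bool hasCoprocessor) {
    return hasCoprocessor ? hardwareEncodeToJpeg(thirdImage)
                          : softwareEncodeToJpeg(thirdImage);
}

int main() {
    std::vector<uint8_t> thirdImage(640 * 480 * 3, 0);  // dummy raw buffer for a 640 x 480 image
    encodeThirdImage(thirdImage, /*hasCoprocessor=*/true);
    return 0;
}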
S1305, the camera application of the electronic device displays the fourth image.
The camera application of the electronic device may store the fourth image in the storage module, so that the user can view it when needed, which improves the user experience. The storage module may be a memory in the electronic device, a memory card installed in the electronic device, or the like, which is not limited herein.
So far, those skilled in the art should clearly understand the image processing method provided by the embodiments of the present application. It should be noted that the above examples only take an electronic device with an integrated camera as an example; in other implementations of the present application, the electronic device may be a camera, and the camera application may be the operating system of the camera. For the specific implementation, reference may be made to the above examples, which are not described again.
The above description mainly introduces the solutions provided in the embodiments of the present application from the perspective of electronic devices (e.g., mobile phones). To implement the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the functional modules of the devices involved in the method may be divided according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
Please refer to fig. 14, which is a block diagram of an electronic device 1400 according to an embodiment of the present disclosure. The electronic device 1400 may be any one of the above-mentioned examples, for example, the electronic device 1400 may be a mobile phone, a computer, or the like. Illustratively, as shown in fig. 14, the electronic device 1400 may include: a processor 1401, and a memory 1402. The memory 1402 is used to store computer-executable instructions. For example, in some embodiments, when the processor 1401 executes the instructions stored in the memory 1402, the electronic device 1400 may be caused to perform any of the functions of the electronic device in the above embodiments to implement any of the image processing methods in the above examples.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
Fig. 15 shows a schematic diagram of a chip system 1500. The chip system 1500 may be disposed in an electronic device. For example, the chip system 1500 may be disposed in a mobile phone. Illustratively, the chip system 1500 may include: a processor 1501 and a communication interface 1502 for enabling the electronic device to carry out the functions referred to in the above embodiments. In one possible design, the system-on-chip 1500 also includes memory for storing necessary program instructions and data for the electronic device. The chip system may be constituted by a chip, or may include a chip and other discrete devices. It should be noted that, in some implementations of the present application, the communication interface 1502 may also be referred to as an interface circuit.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The functions or actions or operations or steps, etc., in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented using a software program, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions described in accordance with the embodiments of the present application are all or partially generated upon loading and execution of computer program instructions on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wire (e.g., coaxial cable, fiber optic, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). The computer-readable storage medium can be any available medium that can be accessed by a computer or can comprise one or more data storage devices, such as a server, a data center, etc., that can be integrated with the medium. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations may be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and drawings are merely illustrative of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to include such modifications and variations.

Claims (16)

1. An image processing method applied to an electronic device including a camera application and an image sensor, the method comprising:
in response to a first operation, launching the camera application;
the camera application program issues a first parameter to the image sensor, and the first parameter is used for indicating the image sensor to return the currently acquired image after first processing; the first processing includes at least one of: turning over and rotating;
the image sensor performs the first processing on the currently acquired image according to the first parameter to generate a first image;
the image sensor returns the first image to the camera application to cause the camera application to display the first image.
2. The image processing method of claim 1, wherein the camera application issues a first parameter to the image sensor, comprising:
the camera application program judges whether a first configuration is started, wherein the first configuration is used for indicating whether the image sensor needs to perform first processing on a currently acquired image and then returns the image;
if the first configuration is started, the camera application program issues the first parameter to the image sensor;
and if the first configuration is closed, responding to a second operation, and issuing the first parameter to the image sensor by the camera application program, wherein the second operation comprises an operation of opening the first configuration.
3. The image processing method of claim 1, wherein the camera application issues a first parameter to the image sensor, comprising:
the camera application program judges whether a first configuration is started or not, wherein the first configuration is used for indicating whether the image sensor needs to perform the first processing on the currently acquired image and then returns the image;
if the first configuration is started, responding to shooting operation, and issuing the first parameter to the image sensor by the camera application program;
and if the first configuration is closed, responding to a third operation, and issuing the first parameter to the image sensor by the camera application program, wherein the third operation comprises an operation of opening the first configuration and a shooting operation.
4. The method of claim 3, wherein after the image sensor returns the first image to the camera application, the method further comprises:
and the camera application program issues a preset instruction to the image sensor, wherein the preset instruction is used for indicating the image sensor to return the currently acquired image.
5. The image processing method according to claim 2 or 3, characterized in that the method further comprises:
and if the first configuration is closed, the camera application program issues a preset instruction to the image sensor, wherein the preset instruction is used for indicating the image sensor to return the currently acquired image.
6. The image processing method according to any one of claims 1 to 5, wherein after the starting of the camera application in response to the first operation, the method further comprises:
the camera application program judges whether the image sensor supports the first processing of the currently acquired image;
if the first processing is supported, the camera application program configures a first attribute for the image sensor, and the first attribute is used for indicating that the image sensor performs the first processing on the currently acquired image and then returns the image when receiving the first parameter.
7. The image processing method according to any one of claims 1 to 6,
when the first processing comprises the turning processing, the first parameters comprise turning parameters, and the turning parameters are used for indicating the image sensor to return the currently acquired image after turning processing;
when the first processing comprises the rotation processing, the first parameters comprise rotation parameters, and the rotation parameters are used for indicating the image sensor to return after the image sensor performs the rotation processing on the currently acquired image.
8. The image processing method of claim 7, wherein the electronic device further comprises a camera hardware abstraction layer;
the camera application issues a first parameter to the image sensor, including:
the camera application sending the first parameter to the camera hardware abstraction layer;
the camera hardware abstraction layer generates a first instruction according to the first parameter, and the first instruction is used for instructing the image sensor to return the currently acquired image after the first processing;
the camera hardware abstraction layer issues the first instruction to the image sensor;
the image sensor performs the first processing on the currently acquired image according to the first parameter, and the first processing comprises the following steps:
and the image sensor performs the first processing on the currently acquired image according to the first instruction.
9. The image processing method according to claim 8, wherein the electronic device further includes a drive of an image sensor;
the image sensor performs the first processing on the currently acquired image according to the first instruction, and the first processing includes:
the driving of the image sensor generates a second instruction according to the first instruction, wherein the second instruction is an instruction which can be directly applied by the image sensor, and the second instruction is used for instructing the image sensor to perform the first processing on the currently acquired image and then return;
and the image sensor performs the first processing on the currently acquired image according to the second instruction.
10. The image processing method according to claim 8 or 9, wherein the first instruction includes one of: a first processing instruction, a second processing instruction, a third processing instruction;
the camera hardware abstraction layer generates a first instruction according to the first parameter, including:
when the first parameter comprises the turning parameter, the camera hardware abstraction layer generates a first processing instruction, and the first processing instruction is used for instructing the image sensor to turn the currently acquired image and then return the image;
when the first parameter comprises the rotation parameter, the camera hardware abstraction layer generates a second processing instruction, and the second processing instruction is used for instructing the image sensor to rotate the currently acquired image and then return the image;
and when the first parameters comprise the turning parameters and the rotation parameters, the camera hardware abstraction layer generates a third processing instruction, and the third processing instruction is used for instructing the image sensor to return the currently acquired image after the turning processing and the rotation processing are performed on the currently acquired image.
11. The image processing method according to any one of claims 1 to 6, wherein the electronic device further comprises a camera hardware abstraction layer, a driver of an image sensor, and an encoding module; the first parameters further comprise a size parameter, wherein the size parameter is used for instructing the camera hardware abstraction layer to perform second processing on the first image, and the second processing at least comprises cropping processing;
the image sensor returning the first image to the camera application to cause the camera application to display the first image, including:
the image sensor sends the first image to a driver of the image sensor;
the driving of the image sensor carries out code conversion on the first image to generate a second image, and the second image is an image which can be directly identified by the camera hardware abstraction layer;
the camera hardware abstraction layer carries out second processing on the second image according to the size parameter to generate a third image;
the coding module performs format conversion on the third image to generate a fourth image;
the encoding module sends the fourth image to the camera application to cause the camera application to display the fourth image.
12. The image processing method according to any one of claims 1 to 11, wherein the flipping process includes at least one of: mirror image turning, up-down turning.
13. The image processing method according to any one of claims 1 to 11, wherein the rotation process includes one of: rotate 90 degrees, rotate 180 degrees, rotate 270 degrees.
14. An electronic device, characterized in that the electronic device comprises one or more processors and one or more memories; the one or more memories coupled with the one or more processors, the one or more memories storing computer instructions;
the computer instructions, when executed by the one or more processors, cause the electronic device to perform the image processing method of any of claims 1-13.
15. A chip system, wherein the chip system comprises a processing circuit and an interface; the processing circuit is used for calling and running a computer program stored in a storage medium from the storage medium to execute the image processing method according to any one of claims 1 to 13.
16. A computer-readable storage medium, comprising computer instructions which, when executed, perform the image processing method of any one of claims 1-13.
CN202111657813.4A 2021-12-30 2021-12-30 Image processing method and electronic equipment Active CN115034948B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111657813.4A CN115034948B (en) 2021-12-30 2021-12-30 Image processing method and electronic equipment


Publications (2)

Publication Number Publication Date
CN115034948A true CN115034948A (en) 2022-09-09
CN115034948B CN115034948B (en) 2023-06-16


Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111657813.4A Active CN115034948B (en) 2021-12-30 2021-12-30 Image processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN115034948B (en)

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS62154967A (en) * 1985-12-27 1987-07-09 Canon Inc Image processor
US20050152197A1 (en) * 2004-01-09 2005-07-14 Samsung Electronics Co., Ltd. Camera interface and method using DMA unit to flip or rotate a digital image
JP2006287715A (en) * 2005-04-01 2006-10-19 Seiko Epson Corp Image processing controller and electronic device
US20100104221A1 (en) * 2008-10-29 2010-04-29 Clifford Yeung Method and system for frame rotation within a jpeg compressed pipeline
US20120044364A1 (en) * 2010-08-19 2012-02-23 Samsung Electronics Co., Ltd. Image communication method and apparatus
CN104243831A (en) * 2014-09-30 2014-12-24 北京金山安全软件有限公司 Method and device for shooting through mobile terminal and mobile terminal
CN105163017A (en) * 2013-03-25 2015-12-16 锤子科技(北京)有限公司 Method and device for showing self-shooting image
CN105959544A (en) * 2016-05-23 2016-09-21 维沃移动通信有限公司 Mobile terminal and image processing method thereof
WO2018076156A1 (en) * 2016-10-25 2018-05-03 上海思恩电子信息科技有限公司 Selfie camera for displaying real dynamic image of photographer in eye of another person
CN108055463A (en) * 2017-12-26 2018-05-18 努比亚技术有限公司 Image processing method, terminal and storage medium
WO2019056242A1 (en) * 2017-09-21 2019-03-28 深圳传音通讯有限公司 Camera photographing parameter setting method for smart terminal, setting device, and smart terminal
CN112929561A (en) * 2021-01-19 2021-06-08 北京达佳互联信息技术有限公司 Multimedia data processing method and device, electronic equipment and storage medium
CN113014804A (en) * 2021-02-04 2021-06-22 维沃移动通信有限公司 Image processing method, image processing device, electronic equipment and readable storage medium
CN113079316A (en) * 2021-03-26 2021-07-06 维沃移动通信有限公司 Image processing method, image processing device and electronic equipment
CN113395413A (en) * 2020-03-11 2021-09-14 三星电子株式会社 Camera module, imaging apparatus, and image processing method
US11158028B1 (en) * 2019-10-28 2021-10-26 Snap Inc. Mirrored selfie

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HOU YANBIN: "Digital Video Visual Acquisition and Processing System and SoPC Design", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117221713A (en) * 2023-11-09 2023-12-12 荣耀终端有限公司 Parameter loading method and electronic equipment
CN117221713B (en) * 2023-11-09 2024-05-17 荣耀终端有限公司 Parameter loading method and electronic equipment

Also Published As

Publication number Publication date
CN115034948B (en) 2023-06-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant