CN113766120B - Shooting mode switching method and electronic equipment - Google Patents

Shooting mode switching method and electronic equipment

Info

Publication number: CN113766120B
Application number: CN202110910375.1A
Authority: CN (China)
Prior art keywords: preview, stream, camera, memory block, frame
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other versions: CN113766120A (Chinese, zh)
Inventor: 褚元强
Current Assignee: Honor Device Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202110910375.1A; published as CN113766120A, granted as CN113766120B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/665: Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

An embodiment of the application provides a shooting mode switching method and an electronic device. The switching method copies the image data of each line in the preview-stream memory block to the blurred-frame memory block line by line, so that during the shooting mode switch the copying of the blurred frame is not limited by the chip model. A single blurred-frame copying interface can therefore copy the image data to be blurred from preview-stream memory blocks whose preview streams use different memory alignment formats, which reduces development cost and maintenance difficulty, ensures that the blurred-frame copy proceeds smoothly, and guarantees that throughout the shooting mode switch one blurred frame can always be used as a transition animation.

Description

Shooting mode switching method and electronic equipment
Technical Field
Embodiments of this application relate to the field of terminals, and in particular to a shooting mode switching method and an electronic device.
Background
With the development of communication technology, the computing power and hardware capability of electronic devices have improved, and their camera functions have become increasingly powerful. For example, the camera application of an electronic device equipped with a camera currently provides two shooting modes, photo and video, and the photo mode is further refined into modes such as a beauty mode, a time-lapse mode, a flash mode, a real-time mode, and the like.
At present, when the shooting mode of the camera is switched, the electronic device uses one blurred frame as a transition animation. However, the blurred frame is currently obtained by copying it as a whole data block from the memory that buffers video frame data at the camera framework level. With the rapid rise of different communication platforms, electronic devices can now be built on different platforms according to actual requirements. Because the low-level implementations and parameters of these platforms differ, the same interface often cannot be used directly on multiple platforms. Therefore, to ensure that one blurred frame can be obtained as a transition animation whenever the shooting mode is switched, different code must be written and different interfaces provided for different platforms. If the same interface is used, copying the blurred frame on some platforms may produce abnormal video frame data, which in turn causes mode switching anomalies such as a black screen or freezing; if multiple sets of interfaces are used, development cost and maintenance difficulty increase.
Disclosure of Invention
To solve this technical problem, the application provides a shooting mode switching method and an electronic device. In the method, the image data of each line in the preview-stream memory block is copied line by line to the blurred-frame memory block, so that during the shooting mode switch the blurred-frame copy is not limited by the chip model. A single blurred-frame copying interface can copy the image data to be blurred from preview-stream memory blocks whose preview streams use different memory alignment formats, which reduces development cost and maintenance difficulty, ensures that the blurred-frame copy proceeds smoothly, and guarantees that throughout the shooting mode switch one blurred frame can always be used as a transition animation.
In a first aspect, a shooting mode switching method is provided. The method is applied to an electronic device on which a camera application is installed at the application layer, whose application framework layer includes a camera framework, and whose HAL layer includes a camera hardware abstraction layer, the camera framework exchanging data with the camera application and with the camera hardware abstraction layer. The method includes: when the application layer detects a selection operation on the camera application, acquiring from the camera framework a first preview-stream memory block in which a first preview stream is buffered, generating a first preview picture from the first preview stream buffered in the first preview-stream memory block, and displaying the first preview picture on a display interface, where the first preview stream is uploaded by the camera hardware abstraction layer and its memory alignment format is a first memory alignment format; when the application layer detects a switching operation for any shooting mode in the camera application, sending to the camera framework a blurred-frame request carrying preset canvas information, the preset canvas information including a second memory alignment format; the camera framework allocating a blurred-frame memory block in the second memory alignment format according to the preset canvas information, copying the image data in the first preview-stream memory block to the blurred-frame memory block line by line, and sending the blurred-frame memory block to the camera application after the copy is finished; and, during the shooting mode switch, the camera application blurring the image data in the blurred-frame memory block to obtain a second preview picture and replacing the first preview picture displayed in the display interface with the second preview picture. In this way, by copying the image data of each line in the preview-stream memory block to the blurred-frame memory block line by line, the blurred-frame copy is not limited by the chip model during the shooting mode switch; a single blurred-frame copying interface can copy the image data to be blurred from preview-stream memory blocks whose preview streams use different memory alignment formats, which reduces development cost and maintenance difficulty, ensures that the blurred-frame copy proceeds smoothly, and guarantees that throughout the shooting mode switch one blurred frame can always be used as a transition animation.
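The line-by-line copy described above amounts to reading each image row at the source stride implied by the first memory alignment format and writing it at the destination stride implied by the second. The sketch below is illustrative only and not from the patent; the byte-buffer model, the function names, and the assumption that alignment applies per row are all hypothetical.

```python
def align_up(value, alignment):
    """Round value up to the next multiple of alignment (bytes)."""
    return (value + alignment - 1) // alignment * alignment

def copy_rows(src, width, height, src_align, dst_align):
    """Copy `height` rows of `width` payload bytes from a buffer whose
    rows are padded to src_align into a buffer padded to dst_align.
    Only the payload is copied; per-row padding is dropped/recreated."""
    src_stride = align_up(width, src_align)
    dst_stride = align_up(width, dst_align)
    dst = bytearray(dst_stride * height)
    for row in range(height):
        s = row * src_stride          # start of payload in the source row
        d = row * dst_stride          # start of payload in the destination row
        dst[d:d + width] = src[s:s + width]
    return bytes(dst)
```

Because only the payload bytes of each row are touched, the same routine works regardless of how the source platform pads its rows, which is the property the patent relies on to keep a single copying interface across chip models.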
Illustratively, the first memory alignment format is different from the second memory alignment format.
Illustratively, the first memory alignment format is the same as the second memory alignment format.
Illustratively, the first memory alignment format is, without limitation, one of 64-bit, 128-bit, and 512-bit alignment.
Illustratively, the second memory alignment format is, without limitation, one of 64-bit, 128-bit, and 512-bit alignment.
According to the first aspect, when the application layer detects the selection operation on the camera application, acquiring from the camera framework the first preview-stream memory block in which the first preview stream is buffered includes: when the application layer detects the selection operation on the camera application, sending a camera-open request to the camera framework; the camera framework sending a configuration stream to the camera hardware abstraction layer according to the camera-open request; the camera hardware abstraction layer opening the camera according to the configuration stream and uploading the first preview stream in the first memory alignment format to the camera framework; and the camera framework allocating, for the first preview stream and according to the first memory alignment format, the first preview-stream memory block in the first memory alignment format, buffering the first preview stream into the first preview-stream memory block, and sending the first preview-stream memory block in which the first preview stream is buffered to the camera application. In this way, from the moment the camera is started until any function button on the display interface is triggered, the camera hardware abstraction layer can continuously upload the first preview stream in the first memory alignment format, so that the camera application can generate the corresponding first preview picture from the first preview stream acquired at each moment, realizing real-time updates of the first preview picture in the display interface.
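The open-camera handshake above can be sketched as a framework object that configures the HAL and allocates an aligned preview-stream memory block for each uploaded frame. Everything here is an illustrative assumption (class names, the callback shape, a byte buffer standing in for a memory block); the patent describes the flow, not an API.

```python
class CameraFramework:
    """Minimal sketch of the framework-side flow described above."""

    def __init__(self, hal):
        self.hal = hal
        self.app_queue = []           # blocks handed on to the camera application

    def open_camera(self, width, height, alignment):
        # Send a configuration stream to the HAL; the HAL answers by
        # uploading preview frames in the requested alignment format.
        self.hal.configure(width, height, alignment, on_frame=self._on_preview)
        self.width, self.height, self.alignment = width, height, alignment

    def _on_preview(self, frame_bytes):
        # Allocate a first preview-stream memory block in the first
        # memory alignment format and buffer the uploaded frame into it.
        stride = -(-self.width // self.alignment) * self.alignment
        block = bytearray(stride * self.height)
        block[:len(frame_bytes)] = frame_bytes
        self.app_queue.append(block)  # sent on to the camera application

class FakeHal:
    """Stand-in HAL that just records the frame callback."""
    def configure(self, width, height, alignment, on_frame):
        self.on_frame = on_frame      # real HAL would start streaming here
```

A real implementation would of course run this over binder/HIDL on Android rather than direct method calls; the sketch only shows who allocates what and in which format.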
According to the first aspect or any implementation of the first aspect, after the application layer detects the selection operation on the camera application, acquires from the camera framework the first preview-stream memory block in which the first preview stream is buffered, generates the first preview picture from the first preview stream buffered in the first preview-stream memory block, and displays the first preview picture on the display interface, the method further includes: the camera application returning the first preview-stream memory block to the camera framework after taking out the first preview stream, so that the camera framework buffers a second preview stream into the first preview-stream memory block after receiving the second preview stream uploaded by the camera hardware abstraction layer. In this way, after the camera application takes the first preview stream out of the first preview-stream memory block, it returns the empty block to the camera framework, so that when the camera framework receives the preview stream of the next moment uploaded by the camera hardware abstraction layer it does not need to allocate a new preview-stream memory block but directly reuses the first one, effectively avoiding memory redundancy.
According to the first aspect or any implementation of the first aspect, during the shooting mode switch, after the camera application blurs the image data in the blurred-frame memory block to obtain the second preview picture and replaces the first preview picture displayed in the display interface with the second preview picture, the method further includes: after switching to the selected shooting mode, the camera application acquiring from the camera framework a second preview-stream memory block in which a second preview stream is buffered, generating a third preview picture from the second preview stream buffered in the second preview-stream memory block, and replacing the second preview picture displayed in the display interface with the third preview picture, where the second preview stream is uploaded by the camera hardware abstraction layer after the switch to the selected shooting mode and its memory alignment format is the first memory alignment format. In this way, after the switch to the selected shooting mode, once the second preview stream uploaded by the camera hardware abstraction layer into the memory block in the camera framework is acquired, a third preview picture generated from that stream replaces the second preview picture on the display interface, realizing a smooth transition of the preview picture during the shooting mode switch and ensuring that the display interface switches to the latest preview picture promptly after the mode switch succeeds.
According to the first aspect or any implementation of the first aspect, after switching to the selected shooting mode, the method further includes: when the camera framework receives the second preview stream in the first memory alignment format uploaded by the camera hardware abstraction layer, detecting whether a reusable first preview-stream memory block exists; when a reusable first preview-stream memory block exists, the camera framework buffering the second preview stream into the first preview-stream memory block and sending the first preview-stream memory block, with the second preview stream buffered, to the camera application, so that the camera application generates the third preview picture from the second preview stream buffered in the first preview-stream memory block and replaces the second preview picture displayed in the display interface with the third preview picture; and when no reusable first preview-stream memory block exists, the camera framework allocating, for the second preview stream and according to the first memory alignment format, the second preview-stream memory block in the first memory alignment format, buffering the second preview stream into it, and sending it to the camera application, so that the camera application generates the third preview picture from the second preview stream buffered in the second preview-stream memory block and replaces the second preview picture displayed in the display interface with the third preview picture.
In this way, when buffering the second preview stream uploaded by the camera hardware abstraction layer, the camera framework first checks whether a reusable first preview-stream memory block currently exists, that is, whether a first preview-stream memory block returned by the camera application has been received. If one exists, the second preview stream is buffered directly into it, reusing the block; only when no reusable block exists at that moment is a second preview-stream memory block in the first memory alignment format allocated, avoiding memory redundancy as far as possible.
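The reuse-first allocation policy above is a classic buffer pool. The sketch below is an illustrative assumption (the class, its names, and the fixed block size are not from the patent); it only demonstrates that a fresh allocation happens when, and only when, no returned block is available.

```python
from collections import deque

class PreviewBufferPool:
    """Reuse-first pool for preview-stream memory blocks."""

    def __init__(self, block_size):
        self.block_size = block_size
        self.free = deque()           # blocks returned by the camera application
        self.allocated = 0            # count of fresh allocations performed

    def acquire(self):
        # Reuse a returned block if one exists; otherwise allocate a new
        # block (in the first memory alignment format, in the patent's terms).
        if self.free:
            return self.free.popleft()
        self.allocated += 1
        return bytearray(self.block_size)

    def release(self, block):
        # Called once the application has taken the preview stream out.
        self.free.append(block)
```

In steady state the application returns one block per frame it consumes, so the pool converges to a small fixed set of blocks and no per-frame allocation occurs, which is exactly the memory-redundancy avoidance the paragraph describes.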
According to the first aspect or any implementation of the first aspect, before the camera framework copies the image data in the first preview-stream memory block to the blurred-frame memory block line by line, the method further includes: the camera framework comparing the first memory alignment format with the second memory alignment format; when the first memory alignment format differs from the second memory alignment format, executing the step of copying the image data in the first preview-stream memory block to the blurred-frame memory block line by line; and when the two formats are the same, the camera framework copying the entire content of the first preview-stream memory block to the blurred-frame memory block as a single block. In this way, before performing the blurred-frame copy line by line, the camera framework judges whether the memory alignment formats of the first preview-stream memory block and the blurred-frame memory block are consistent. When they are consistent, the entire content of the first preview-stream memory block is copied directly as one block, improving copy efficiency; when they are inconsistent, line-by-line copying is used, so that blurred-frame processing can still be carried out even though the memory alignment formats differ. That is, in either case one blurred frame can always be used as a transition animation, avoiding a black screen.
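The compare-then-copy branch can be sketched as follows. The stride computation and the byte-buffer model are illustrative assumptions; the patent fixes the decision rule (whole-block copy when the alignment formats match, line-by-line otherwise), not a concrete layout.

```python
def stride(width, alignment):
    """Row stride in bytes for a given payload width and alignment."""
    return (width + alignment - 1) // alignment * alignment

def copy_blurred_frame(src, width, height, src_align, dst_align):
    """Whole-block copy when the two memory alignment formats match;
    otherwise fall back to copying each row's payload individually."""
    s_stride, d_stride = stride(width, src_align), stride(width, dst_align)
    if s_stride == d_stride:
        # Alignment formats agree: one bulk copy of the entire block,
        # padding included (the fast path).
        return bytes(src[:s_stride * height])
    dst = bytearray(d_stride * height)
    for row in range(height):         # alignment formats differ: line by line
        dst[row * d_stride:row * d_stride + width] = \
            src[row * s_stride:row * s_stride + width]
    return bytes(dst)
```

The fast path copies padding bytes too, which is harmless because padding never reaches the screen; the slow path is what keeps a single interface working across platforms with different alignments.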
According to the first aspect or any implementation of the first aspect, the content of the first preview-stream memory block includes the image data and extended content spliced after it; the image data consists of the pixel points used to generate the second preview picture, and the extended content is padding. The camera framework copying the image data in the first preview-stream memory block to the blurred-frame memory block line by line includes: the camera framework copying, based on a YUV color coding method, the pixel points of each line in the first preview-stream memory block that are used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred-frame memory block. In practice, at the same resolution the widths of the memory regions buffering image data in the first preview-stream memory block and the blurred-frame memory block are the same. Therefore, provided the display interface showing the preview picture has the same resolution, only the pixel points in the first preview-stream memory block that are used to generate the second preview picture need to be copied to the blurred-frame memory block to guarantee that the camera application can generate the second preview picture from the image data in the blurred-frame memory block.
According to the first aspect or any implementation of the first aspect, the camera framework copying, based on a YUV color coding method, the pixel points of each line in the first preview-stream memory block that are used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred-frame memory block includes: the camera framework determining, based on the YUV color coding method, the values of the Y, U, and V channels at each position of each line in the first preview-stream memory block; the camera framework distinguishing, from the changes in the Y, U, and V channel values, the pixel points of each line used to generate the second preview picture from the padding of the extended content; and the camera framework copying, one by one, the pixel points of each line in the first preview-stream memory block used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred-frame memory block. In practice, the extended content does not affect the displayed picture; it is only padding used to achieve memory alignment and thereby ease whole-block copying. Therefore, by distinguishing, based on the YUV color coding method, the pixel points carrying actual content from the extended content of each line, the pixel points of each line used to generate the second preview picture can be copied to the corresponding positions in the corresponding line of the blurred-frame memory block, so that throughout the shooting mode switch one blurred frame can always be used as a transition animation.
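For a semi-planar YUV layout such as NV12, a common Android preview layout (the patent does not name a specific format, so this layout, the plane arrangement, and the shared stride are assumptions), each frame carries a full-height Y plane followed by a half-height interleaved UV plane, with every row padded to the stride. Copying only the pixel payload of both planes then looks like this:

```python
def copy_yuv_semiplanar(src, width, height, src_stride, dst_stride):
    """Copy the Y plane (`height` rows) and the interleaved UV plane
    (`height // 2` rows, also `width` payload bytes per row) row by row,
    dropping the per-row alignment padding. Assumes both planes share
    one stride and sit contiguously, which is an illustrative simplification."""
    dst = bytearray(dst_stride * height * 3 // 2)
    rows = height + height // 2       # Y rows followed by UV rows
    for row in range(rows):
        s = row * src_stride
        d = row * dst_stride
        dst[d:d + width] = src[s:s + width]
    return bytes(dst)
```

Note that production code would read the row stride from the buffer's metadata rather than inspecting channel values; the per-row slice bounds are what keep padding out of the blurred frame.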
According to the first aspect or any implementation of the first aspect, the content of the first preview-stream memory block includes the image data and extended content spliced after it; the image data consists of the pixel points used to generate the second preview picture, and the extended content is padding. The camera framework copying the entire content of the first preview-stream memory block to the blurred-frame memory block as a single block includes: the camera framework copying the image data in the first preview-stream memory block together with the extended content spliced after it to the blurred-frame memory block, with the first preview-stream memory block as the copy unit. Thus, when the first preview-stream memory block and the blurred-frame memory block are memory-aligned, that is, when their memory alignment formats are the same, the whole data block is copied directly as a unit, greatly improving copy efficiency.
According to the first aspect or any implementation of the first aspect, before the camera framework compares the first memory alignment format with the second memory alignment format, the method further includes: the camera framework detecting whether the first preview stream buffered in the first preview-stream memory block is in a preset stream format; when it is in the preset stream format, executing the step of comparing the first memory alignment format with the second memory alignment format; and when it is not, calling back a blurred-frame copy failure to the camera application. In this way, before the blurred-frame copy is executed, it is determined whether a first preview stream in the preset stream format exists in the first preview-stream memory block, and the blurred-frame copy method is executed only when it does, avoiding the execution of an invalid flow. In addition, when no first preview stream in the preset stream format exists, a blurred-frame copy failure is called back directly to the camera application, so that the camera application can blur according to a preset fallback, for example using the currently displayed first preview picture, to avoid a black screen or shorten its duration, improving the user experience.
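The pre-copy format check and failure callback can be sketched as a guard in front of the copy routine. The preset format name, the callback signature, and the function names below are hypothetical; the patent only specifies that an unsupported stream format triggers a copy-failure callback instead of the copy.

```python
# Assumed preset stream format(s); the patent does not name one.
PRESET_FORMATS = {"YUV_420_888"}

def request_blur_frame(stream_format, do_copy, on_failure):
    """Run the blurred-frame copy only when the buffered preview stream
    is in a preset format; otherwise call back a copy failure so the
    camera application can fall back (e.g. blur the current picture)."""
    if stream_format not in PRESET_FORMATS:
        on_failure("blur frame copy failed: unsupported format " + stream_format)
        return None
    return do_copy()
```

The point of returning the failure to the application, rather than silently skipping the copy, is that the application still gets a chance to show some transition and avoid a black screen.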
In a second aspect, an electronic device is provided. A camera application is installed at the application layer of the electronic device, the application framework layer includes a camera framework, the HAL layer includes a camera hardware abstraction layer, and the camera framework exchanges data with the camera application and with the camera hardware abstraction layer. The electronic device further includes: one or more processors; a memory; and one or more computer programs, the one or more computer programs being stored in the memory and, when executed by the one or more processors, causing the electronic device to perform the following steps: when the application layer detects a selection operation on the camera application, acquiring from the camera framework a first preview-stream memory block in which a first preview stream is buffered, generating a first preview picture from the first preview stream buffered in the first preview-stream memory block, and displaying the first preview picture on a display interface, where the first preview stream is uploaded by the camera hardware abstraction layer and its memory alignment format is a first memory alignment format; when the application layer detects a switching operation for any shooting mode in the camera application, sending to the camera framework a blurred-frame request carrying preset canvas information, the preset canvas information including a second memory alignment format; the camera framework allocating a blurred-frame memory block in the second memory alignment format according to the preset canvas information, copying the image data in the first preview-stream memory block to the blurred-frame memory block line by line, and sending the blurred-frame memory block to the camera application after the copy is finished; and, during the shooting mode switch, the camera application blurring the image data in the blurred-frame memory block to obtain a second preview picture and replacing the first preview picture displayed in the display interface with the second preview picture.
According to the second aspect, the computer programs, when executed by the one or more processors, cause the electronic device to perform the following steps: when the application layer detects the selection operation on the camera application, sending a camera-open request to the camera framework; the camera framework sending a configuration stream to the camera hardware abstraction layer according to the camera-open request; the camera hardware abstraction layer opening the camera according to the configuration stream and uploading the first preview stream in the first memory alignment format to the camera framework; and the camera framework allocating, for the first preview stream and according to the first memory alignment format, the first preview-stream memory block in the first memory alignment format, buffering the first preview stream into the first preview-stream memory block, and sending the first preview-stream memory block in which the first preview stream is buffered to the camera application.
According to the second aspect or any implementation of the second aspect, the computer programs, when executed by the one or more processors, cause the electronic device to perform the following step: the camera application returning the first preview-stream memory block to the camera framework after taking out the first preview stream, so that the camera framework buffers the second preview stream into the first preview-stream memory block after receiving the second preview stream uploaded by the camera hardware abstraction layer.
According to the second aspect or any implementation of the second aspect, the computer programs, when executed by the one or more processors, cause the electronic device to perform the following steps: after switching to the selected shooting mode, the camera application acquiring from the camera framework a second preview-stream memory block in which a second preview stream is buffered, generating a third preview picture from the second preview stream buffered in the second preview-stream memory block, and replacing the second preview picture displayed in the display interface with the third preview picture, where the second preview stream is uploaded by the camera hardware abstraction layer after the switch to the selected shooting mode and its memory alignment format is the first memory alignment format.
According to the second aspect or any implementation of the second aspect, the computer programs, when executed by the one or more processors, cause the electronic device to perform the following steps: when the camera framework receives the second preview stream in the first memory alignment format uploaded by the camera hardware abstraction layer, detecting whether a reusable first preview-stream memory block exists; when a reusable first preview-stream memory block exists, the camera framework buffering the second preview stream into the first preview-stream memory block and sending the first preview-stream memory block, with the second preview stream buffered, to the camera application, so that the camera application generates a third preview picture from the second preview stream buffered in the first preview-stream memory block and replaces the second preview picture displayed in the display interface with the third preview picture; and when no reusable first preview-stream memory block exists, the camera framework allocating, for the second preview stream and according to the first memory alignment format, the second preview-stream memory block in the first memory alignment format, buffering the second preview stream into it, and sending it to the camera application, so that the camera application generates the third preview picture from the second preview stream buffered in the second preview-stream memory block and replaces the second preview picture displayed in the display interface with the third preview picture.
According to the second aspect or any implementation of the second aspect, the computer programs, when executed by the one or more processors, cause the electronic device to perform the following steps: the camera framework comparing the first memory alignment format with the second memory alignment format; when the first memory alignment format differs from the second memory alignment format, executing the step of copying the image data in the first preview-stream memory block to the blurred-frame memory block line by line; and when the two formats are the same, the camera framework copying the entire content of the first preview-stream memory block to the blurred-frame memory block as a single block.
According to the second aspect, or any implementation of the second aspect above, the content in the first preview stream memory block includes the image data and extended content spliced with the image data, where the image data is the pixel points for generating the second preview picture, and the extended content is used as a placeholder; the computer program, when executed by the one or more processors, causes the electronic device to perform the step of: the camera framework copies, based on a YUV color coding method, the pixel points of each line in the first preview stream memory block that are used for generating the second preview picture to the corresponding positions in the corresponding line of the blurred frame memory block.
According to the second aspect, or any implementation of the second aspect above, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of: the camera framework determines, based on a YUV color coding method, the values of the Y, U and V channels corresponding to each position of each line in the first preview stream memory block; the camera framework distinguishes, according to the changes in the values of the Y, U and V channels, the image data of each line, that is, the pixel points for generating the second preview picture, from the extended content used as a placeholder; and the camera framework copies the pixel points of each line in the first preview stream memory block that are used for generating the second preview picture, line by line, to the corresponding positions in the corresponding line of the blurred frame memory block.
According to the second aspect, or any implementation of the second aspect above, the content in the first preview stream memory block includes the image data and extended content spliced with the image data, where the image data is the pixel points for generating the second preview picture, and the extended content is used as a placeholder; the computer program, when executed by the one or more processors, causes the electronic device to perform the step of: the camera framework copies the image data in the first preview stream memory block, together with the extended content spliced with it, to the blurred frame memory block in units of the first preview stream memory block.
According to the second aspect, or any implementation of the second aspect above, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of: the camera framework detects whether the first preview stream buffered in the first preview stream memory block is in a preset stream format; when the first preview stream buffered in the first preview stream memory block is in the preset stream format, the camera framework executes the step of comparing the first memory alignment format with the second memory alignment format; and when the first preview stream buffered in the first preview stream memory block is not in the preset stream format, the camera framework calls back a blurred frame copy failure to the camera application.
Any one implementation manner of the second aspect and the second aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the second aspect and the second aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not repeated here.
In a third aspect, a computer-readable storage medium is provided. The medium includes a computer program that, when run on an electronic device, causes the electronic device to execute the shooting mode switching method in the first aspect or any implementation of the first aspect.
Illustratively, the electronic device may be an electronic device employing a first chip or an electronic device employing a second chip.
Illustratively, the memory alignment format of the preview stream provided by the first chip is different from the memory alignment format of the preview stream provided by the second chip.
Illustratively, the memory alignment format of the preview stream provided by the first chip is the same as the memory alignment format of the preview stream provided by the second chip.
Any one implementation manner of the third aspect and the third aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the third aspect and the third aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not described here again.
In a fourth aspect, the present application provides a computer program including instructions for executing the method of the first aspect or any possible implementation manner of the first aspect.
Any one implementation manner of the fourth aspect and the fourth aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the fourth aspect and the fourth aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not repeated here.
In a fifth aspect, an embodiment of the present application provides a chip, which includes a processing circuit and a transceiver pin. Wherein the transceiver pin and the processing circuit are in communication with each other via an internal connection path, and the processing circuit is configured to perform the method of the first aspect or any one of the possible implementations of the first aspect to control the receiving pin to receive signals and to control the sending pin to send signals. Illustratively, the chip may be a chip of an electronic device.
Any one implementation manner of the fifth aspect and the fifth aspect corresponds to any one implementation manner of the first aspect and the first aspect, respectively. For technical effects corresponding to any one of the implementation manners of the fifth aspect and the fifth aspect, reference may be made to the technical effects corresponding to any one of the implementation manners of the first aspect and the first aspect, and details are not repeated here.
In a sixth aspect, an embodiment of the present application provides an assembling method for a camera of an electronic device, where the method includes: when the chip adopted by the electronic device is a first chip providing a preview stream in a first memory alignment format, the memory alignment format of the preview stream memory block allocated by the camera framework for buffering the preview stream is the first memory alignment format, and when it is monitored that switching of the shooting mode of the camera is triggered, the electronic device executes the method in the first aspect or any possible implementation of the first aspect; when the chip adopted by the electronic device is a second chip providing a preview stream in a second memory alignment format, the memory alignment format of the preview stream memory block allocated by the camera framework for buffering the preview stream is the second memory alignment format, and when it is monitored that switching of the shooting mode of the camera is triggered, the electronic device executes the method in the first aspect or any possible implementation of the first aspect; wherein the first memory alignment format is different from the second memory alignment format.
Any one implementation form of the sixth aspect and the sixth aspect corresponds to any one implementation form of the first aspect and the first aspect, respectively. For technical effects corresponding to any one implementation manner of the sixth aspect and the sixth aspect, reference may be made to the technical effects corresponding to any one implementation manner of the first aspect and the first aspect, and details are not described here again.
Drawings
Fig. 1 is a schematic view illustrating an example of a scene in which an abnormal switching process of shooting modes occurs due to the adoption of the same interface in the prior art;
FIG. 2 is a diagram illustrating the architecture of a software system to which an existing way of obtaining blurred frames is applied;
FIG. 3 is a timing diagram exemplarily illustrating acquisition of a blurred frame in the prior art;
fig. 4 is a diagram exemplarily illustrating the copying of buffered data when the step size of a data block in the buffer is consistent with the step size of the canvas for copying the blurred frame that is carried when the shooting mode switching operation is triggered;
FIG. 5 is a diagram exemplarily illustrating the case in which the step size of a data block in the buffer is not consistent with the step size of the canvas for copying the blurred frame that is carried when the shooting mode switching operation is triggered;
FIG. 6 is a schematic diagram of an exemplary handset;
fig. 7 is a schematic diagram of the software architecture of an exemplary illustrated handset;
FIG. 8 is a schematic diagram illustrating an application scenario in which multiple platforms share one set of blurred frame copy interfaces according to an embodiment of the present application;
fig. 9 is an exemplary diagram illustrating a software system architecture to which a switching method of a shooting mode provided in an embodiment of the present application is applied;
fig. 10 is a diagram schematically illustrating still another software system architecture to which the switching method of the shooting mode provided by the embodiment of the present application is applied;
fig. 11 is a schematic flowchart illustrating a switching method of a shooting mode according to an embodiment of the present application;
fig. 12 is a timing diagram illustrating the data interaction among the camera application, the camera framework, and the camera hardware abstraction layer during a successful shooting mode switch, from the moment the camera application is opened, according to the shooting mode switching method provided in an embodiment of the present application;
fig. 13 is a diagram exemplarily illustrating that buffer data is copied when a step size of one data block in a buffer is not identical to a step size of a canvas for copying a blurred frame, which is carried when a switching operation of a photographing mode is triggered;
fig. 14 is a schematic diagram illustrating a position distribution of image data stored per line in the buffer and extended content added at the end of the line, as exemplary;
FIG. 15 is a schematic diagram illustrating a distribution of locations of each row of a canvas for copying blurred frames for storing image data and extended content added at the end of the row;
FIG. 16a is a diagram illustrating the distribution of data in the canvas after copying the image data of the first line from the inconsistent-step buffer to the first line of the canvas;
FIG. 16b is a diagram illustrating the distribution of the image data in the canvas after copying the first and second rows of image data from the inconsistent step size buffer to the first and second rows of the canvas;
FIG. 16c is a diagram illustrating the distribution of data in the canvas after copying all of the image data in one data block in the buffer having inconsistent step sizes to the canvas;
fig. 17 is a schematic view of an exemplary scene when the shooting mode is switched by using the same interface based on the method for switching the shooting mode provided by the embodiment of the present application;
fig. 18 is a schematic structural diagram of an apparatus provided in an exemplary embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "and/or" herein merely describes an association between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone.
The terms "first" and "second" and the like in the description and in the claims of the embodiments of the present application are used for distinguishing between different objects, not for describing a particular order of the objects. For example, the first target object, the second target object, and the like are used to distinguish different target objects, rather than to describe a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the description of the embodiments of the present application, the meaning of "a plurality" means two or more unless otherwise specified. For example, a plurality of processing units refers to two or more processing units; the plurality of systems refers to two or more systems.
Before explaining the technical solution of the embodiments of the present application, a prior-art scheme in which multiple platforms use the same set of blurred frame processing interfaces to perform blurred frame copying is first explained with reference to figs. 1 to 5.
Referring to fig. 1, it is assumed that the default state after the camera application 10-1 on the home page 10 of the mobile phone is opened is that the rear camera is enabled and the photographing mode is active. When user A needs to use the camera application 10-1 to photograph user B, if user A clicks the camera application 10-1 and points the rear camera at user B, then after the camera application 10-1 is started, an image of user B will be presented in area 20 of the display interface of the mobile phone.
For example, in an actual application scenario, in addition to displaying the image of user B in area 20, the display interface of the mobile phone typically displays a series of function buttons above and below area 20 for the user to operate: for example, a function button 20-1 for switching between the front and rear cameras, a function button 20-2 for turning on a live view mode, a function button 20-3 for turning on a beauty mode, and a function button 20-4 for turning on a flash mode, displayed at the top of area 20; and a function button 20-5 for turning on the video recording mode, a function button 20-6 for turning on the photographing mode, and a function button 20-7 for entering the gallery, displayed at the bottom of area 20.
For example, if user A wants to switch from the current photographing mode to the video recording mode, user A only needs to click the function button 20-5 in the display interface.
Accordingly, when user A clicks the function button 20-5 for starting the video recording mode, the existing system supporting blurred frame copying performs a blurred frame copy, so that area 20 of the display interface of the mobile phone can use one blurred frame as a transition animation for the transition.
The software system architecture to which the existing way of acquiring a blurred frame by calling the blurred frame copy interface is adapted is shown in fig. 2.
Referring to fig. 2, after the camera application located in the application layer of the mobile phone system is opened, that is, after user A clicks the camera application 10-1 in fig. 1, the mobile phone system monitors the click operation of user A and issues an operation instruction/request for opening the camera to the camera framework located in the application framework layer; the camera framework then calls the corresponding camera interface and issues a configuration stream to the camera hardware abstraction layer located in the Hardware Abstraction Layer (HAL layer), thereby starting the camera application.
Correspondingly, after the camera application is started, as long as user A does not click any function button of the display interface, the camera hardware abstraction layer continuously reports the preview stream.
It can be understood that, in actual application scenarios, the types of chips adopted in the camera hardware abstraction layer and the platforms providing those chips differ, so the memory occupied by the uploaded preview stream stored in the buffer located in the camera framework also differs. At present, in order to copy the preview stream from the buffer efficiently and quickly, a memory alignment operation is usually performed on the image data corresponding to the preview stream; that is, extended content (padding) is automatically added at the end of each line of image data to achieve memory alignment.
In addition, it should be noted that the image data is the real image that needs to be displayed in area 20 of the display interface corresponding to the camera application of the application layer, for example each pixel point of the image of user B displayed in area 20 of the display interface in fig. 1; the extended content automatically added at the end of each line contains no actual content, and only occupies memory space in the buffer, so it does not affect the display of the image.
For convenience of explanation, fig. 2 takes a mobile phone using a 512-bit chip as an example, and thus the preview stream uploaded to the camera framework by the camera hardware abstraction layer is a preview stream whose memory alignment format is 512 bits.
Accordingly, the memory alignment format of the memory blocks allocated by the camera framework for buffering the preview stream is also 512 bits. For convenience of description, the preview stream memory block that buffers the preview stream is hereinafter referred to as the preview stream Buffer.
It can be understood that, since the memory alignment format of the preview stream is 512 bits, the step size occupied by each row of pixels in the preview stream Buffer (the width of the real image data + padding) is 512 bits.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment. In practical applications, the memory alignment format of the uploaded preview stream may also be 64 or 128 bits due to the difference of chip types in the camera hardware abstraction layer and the difference of platforms providing the chips. Accordingly, the memory alignment format of the preview stream Buffer of the buffered preview stream may be 64 bits or 128 bits, i.e. consistent with the memory alignment format of the preview stream, which is not listed here.
Accordingly, when user A clicks the function button 20-6 in fig. 1 in the default photographing mode, the camera hardware abstraction layer reports the photographing stream.
Accordingly, when user A clicks the function button 20-5 in fig. 1 to enter the video recording mode, the camera hardware abstraction layer reports the video stream.
It should be noted that, although not shown in detail in fig. 2, in the actual application process, the reporting processes of the photographing stream and the video stream are similar to that of the preview stream: each is first uploaded from the camera hardware abstraction layer to the buffer in the camera framework, and then taken out of the buffer and uploaded to the camera application.
After the camera application is started, when the camera application located in the application layer of the mobile phone system, that is, the camera application 10-1 clicked by user A in fig. 1, monitors that user A triggers the mode switching function, it issues to the camera framework located in the application framework layer a blurred frame request carrying preset canvas (surface) information, where the preset canvas (surface) information is used for copying the blurred frame.
Understandably, because the existing blurred frame copy is performed as a whole, in units of data blocks, the carried preset surface information is the memory alignment format required by the blurred frame Buffer corresponding to the preset surface.
In practical applications, the memory alignment format can also be understood as the space occupied in memory by the real image data and extended content of each line, that is, the step size (stride).
In image processing, stride refers to the space occupied in memory by each row of pixels. Under normal conditions, the stride value is calculated as: stride = the number of bytes occupied by each pixel (i.e. the number of bits per pixel / 8) × the width of the picture (Formula 1). However, to satisfy byte alignment, the stride value must be an integer multiple of 4 (since memory is aligned on 4-byte boundaries); therefore, when the stride value obtained from Formula 1 is not an integer multiple of 4, it is adjusted as: stride = stride + (4 - stride mod 4) (Formula 2).
For better understanding, the following description is given in conjunction with examples.
For example, suppose the picture corresponding to the preview stream is in 24-bit (bits per pixel) RGB format and the picture width is 20. Substituting these values into Formula 1 gives 24/8 × 20 = 60; since 60 is an integer multiple of 4, the stride value is 60 bytes.
For example, suppose the picture corresponding to the preview stream is in 24-bit (bits per pixel) RGB format and the picture width is 21. Substituting these values into Formula 1 gives 24/8 × 21 = 63; since 63 is not an integer multiple of 4, substituting 63 into Formula 2 gives 63 + (4 - 63 mod 4) = 64, and since 64 is an integer multiple of 4, the stride value is 64 bytes.
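Formula 1 and Formula 2 above can be sketched as a small helper. This is an illustrative sketch only (the function name `stride` is an assumption); real platforms may align to much larger boundaries such as 64, 128, or 512 bits, as described elsewhere in this document.

```python
def stride(bits_per_pixel: int, width: int) -> int:
    """Row stride in bytes, padded up to a 4-byte boundary
    (Formula 1 and Formula 2 from the description)."""
    s = bits_per_pixel // 8 * width   # Formula 1: bytes per pixel x picture width
    if s % 4 != 0:                    # Formula 2: round up to a multiple of 4
        s += 4 - s % 4
    return s
```

With the two worked examples from the text, `stride(24, 20)` yields 60 and `stride(24, 21)` yields 64.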
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
Thus, by means of this memory alignment, when copying image data, the memory-aligned preview stream can be copied in its entirety from the preview stream Buffer to the blurred frame Buffer, in units of a data block, i.e. in units of the preview stream Buffer.
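The whole-block fast path just described can be sketched as follows. This is a purely illustrative sketch: the names `copy_whole_block`, `preview_buf`, and `blur_buf` are assumptions, and a real implementation would operate on native graphics buffers rather than Python byte objects.

```python
def copy_whole_block(preview_buf: bytes, blur_buf: bytearray) -> None:
    """Treat the preview stream Buffer as one data block and copy the real
    image data and the padding together in a single operation. This is only
    correct when source and destination share the same stride and size."""
    if len(preview_buf) != len(blur_buf):
        raise ValueError("memory alignment formats differ; "
                         "a whole-block copy would lose data")
    blur_buf[:] = preview_buf  # one bulk copy: pixels + padding, row layout preserved
```

The guard mirrors the check performed in step 109 below: when sizes (strides) differ, the whole-block path must not be taken.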
To better understand the software system structure shown in fig. 2, consider mobile phones of different platforms (i.e. mobile phones adopting different chips) that all acquire blurred frames through the same blurred frame interface. When the existing scheme shown in fig. 2 is run, problems may be encountered; these are described below taking the flowchart shown in fig. 3 as an example. Referring to fig. 3, the process from turning on the camera, to triggering the mode switch, to acquiring the blurred frame includes:
Step 101, sending a request for opening the camera to the camera framework.
Still taking the example in which user A in fig. 1 opens the camera application 10-1 on the home page 10 of the mobile phone and uses the rear camera to photograph user B: as can be seen from fig. 2, step 101 is specifically a request initiated to the camera framework located in the application framework layer after the camera application located in the application layer monitors the trigger/click operation of user A.
Step 102, sending a configuration stream to the camera hardware abstraction layer.
Specifically, after receiving the request for opening the camera from the camera application in the application layer, the camera framework in the application framework layer calls a preset functional interface to start the camera that is to be opened in the default state of the camera application, and issues the configuration stream required for calling the corresponding camera to the camera hardware abstraction layer in the HAL layer.
Step 103, uploading the preview stream S1 in the 512-bit memory alignment format.
Specifically, after the camera is configured according to the configuration stream issued by the camera framework in the application framework layer, the camera hardware abstraction layer located in the HAL layer continuously uploads the currently acquired preview stream to the buffer located in the camera framework, as long as user A does not trigger/click any function button in the display interface.
For example, at time T1 the preview stream S1 of user B is uploaded, and at time T2 the preview stream S2 of user B is uploaded.
Step 104, allocating a preview stream Buffer with 512-bit memory alignment for the preview stream S1, and buffering the preview stream S1.
For convenience of description, still taking the 512-bit memory alignment of the preview stream as an example, the memory alignment format of the preview stream Buffer that buffers the preview stream in the camera framework of the application framework layer needs to be 512 bits; that is, after the preview stream is buffered into the preview stream Buffer, the step size of the space occupied by each row of pixels in the preview stream Buffer is 512 bits.
Step 105, uploading the preview stream S1.
Specifically, if user A still has not triggered/clicked any function button in the display interface after the preview stream S1 uploaded by the camera hardware abstraction layer at time T1 reaches the buffer in the camera framework, the camera framework transmits the preview stream Buffer in which the preview stream S1 is currently buffered to the camera application as a whole, so that the camera application generates a corresponding preview picture from the preview stream in the preview stream Buffer and displays the generated preview picture in area 20 of the display interface shown in fig. 1.
Step 106, triggering the mode switch.
Specifically, if the camera application monitors that user A triggers/clicks a function button in the display interface, for example user A clicks the function button 20-5 in the display interface in fig. 1, that is, switches the camera application from the photographing mode to the video recording mode, and determines that mode switching is triggered, the camera application needs to have the blurred frame copy operation performed in the camera framework.
Step 107, sending a blurred frame request carrying surface information to the camera framework.
Specifically, after monitoring that user A triggers mode switching, the camera application issues a blurred frame request carrying the preset surface information to the camera framework located in the application framework layer.
Illustratively, in the sent blurred frame request carrying the preset surface information, the preset surface information may specify a memory alignment format of 64 bits; that is, the step size of the space occupied by each line of pixels in the blurred frame Buffer allocated by the camera framework according to this memory alignment format is 64 bits.
Step 108, allocating the blurred frame Buffer with 64-bit memory alignment according to the surface information.
Step 109, determining whether the memory of the preview stream Buffer and the memory of the blurred frame Buffer are aligned.
Specifically, if the memories are not aligned, that is, the memory alignment formats are different, step 110 is executed; if they are aligned, that is, the memory alignment formats are the same, step 111 is executed.
Referring to fig. 3, when the memory alignment format of the preview stream Buffer is 512 bits and the memory alignment format of the blurred frame Buffer is 64 bits, the memories of the preview stream Buffer and the blurred frame Buffer obviously cannot be aligned, so step 110 is executed and the callback fails, that is, the blurred frame copy operation cannot be performed.
Step 110, the callback fails.
Understandably, if the callback fails, the camera application does not acquire a blurred frame, so the transition animation in the mode switching process has no blurred frame to use for the transition; the picture switch is then hard and abrupt, and jumping or black screen phenomena may even occur.
Step 111, taking the preview stream Buffer as a unit, copying the preview stream S1 in the preview stream Buffer to the blurred frame Buffer as a whole.
Specifically, in the process of copying the entire preview stream S1 in the preview stream Buffer to the blurred frame Buffer in units of the preview stream Buffer, essentially all of the content (real image data + padding) in the preview stream Buffer is copied to the blurred frame Buffer.
For a better understanding of the operation of step 111, the following description is made in conjunction with fig. 4 and 5.
Referring to fig. 4, assume that, in the space occupied by each line of pixels in the preview stream Buffer, the width of the real pixel points, that is, the width of the image data A, is X; the extended content added at the end of each line is Padding_1; and the step size of the space occupied by each line of pixels in the preview stream Buffer, X + Padding_1, is 64 bits.
Similarly, assume that the width for displaying image data in the space occupied by each row of pixels in the blurred frame Buffer is also X, the extended content added at the end of each row is Padding_2, and the step size of the space occupied by each row of pixels in the blurred frame Buffer, X + Padding_2, is 64 bits.
For the case shown in fig. 4, the memory alignment format of the preview stream Buffer and that of the blurred frame Buffer are the same, i.e. their step sizes are the same, so the preview stream Buffer can be regarded as one data block and copied in its entirety, as a unit, into the blurred frame Buffer; after the data copy operation is completed, the content image data A + Padding_1 appears directly in the blurred frame Buffer.
For the case in which the step size of the space occupied by each line of pixels in the preview stream Buffer is inconsistent with that of the blurred frame Buffer, as shown in fig. 5, assume that X + Padding_1 is 512 bits and X + Padding_2 is 64 bits; the step sizes are obviously inconsistent. If the data currently buffered in the preview stream Buffer were regarded as one data block and copied into the blurred frame Buffer, partial data loss would inevitably occur and the final display result would be affected, so the copy cannot be performed; in this case only step 110 can be executed, and the callback to the camera application fails.
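The stride mismatch above can instead be handled by the line-by-line copy that the embodiments of the present application adopt, which moves only the real image data of each line and skips the padding of both buffers. The following is an illustrative Python sketch with toy strides; the name `copy_rows` and the byte-level buffers are assumptions, not the actual HAL implementation.

```python
def copy_rows(src: bytes, src_stride: int,
              dst: bytearray, dst_stride: int,
              row_bytes: int, height: int) -> None:
    """Line-by-line copy: for each row, copy only the real image data
    (row_bytes) from the source row start to the destination row start,
    so differing strides cannot shift or corrupt the pixel rows."""
    for y in range(height):
        s, d = y * src_stride, y * dst_stride
        dst[d:d + row_bytes] = src[s:s + row_bytes]
```

For example, copying two rows of 4 bytes of image data from a buffer with stride 8 into one with stride 6 places each row at the correct destination offset, which a whole-block copy of the same bytes would not.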
Accordingly, because the blurred frame copy operation fails, there is no way to acquire a blurred frame to serve as the transition animation, so the black screen phenomenon shown in fig. 1 occurs in area 20 of the display interface; only after the mode switch is completed and the video stream reported by the camera hardware abstraction layer is received (in fig. 1, the photographing mode is switched to the video recording mode) is the image of user B displayed again in area 20 of the display interface.
Step 112, return the blurred frame Buffer.
Specifically, after the copy succeeds, the blurred frame Buffer carrying the copied image data is returned to the camera application, so that the camera application can perform blurring on the image data in the blurred frame Buffer.
Step 113, blur the image frame in the blurred frame Buffer, and display it.
Specifically, after the camera application receives the blurred frame Buffer carrying the copied image data, it blurs the image data in the blurred frame Buffer using a preset blurring algorithm and displays the blurred frame in a preset surface. The preset surface serves as the transition animation, so the blurred frame provides the picture transition during mode switching and the black screen phenomenon in area 20 of the display interface is avoided.
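The patent does not specify which blurring algorithm the "preset blurring algorithm" is; purely for illustration, the following sketch applies a simple one-dimensional box blur to a row of grayscale values, standing in for whatever blur the camera application actually uses:

```python
def box_blur_row(row, radius=1):
    """Average each value with its neighbors within `radius` (edges clamp).
    An illustrative stand-in for the patent's unspecified blur algorithm."""
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) // (hi - lo))
    return out

print(box_blur_row([0, 90, 0]))  # → [45, 30, 45]
```

A real implementation would typically blur the full two-dimensional frame (for example with a Gaussian kernel), but the one-dimensional version above is enough to show why the copied frame looks softened when used as the transition picture.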
Step 114, display the picture after the switching succeeds.
Specifically, after the blurred frame picture is displayed, once it is detected that the camera application has switched from the shooting mode in use before user A triggered the mode switching operation to the target shooting mode, and a video stream uploaded by the camera hardware abstraction layer is received, the blurred frame picture displayed in area 20 of the display interface is replaced by the picture of the newly switched mode, thereby achieving a smooth transition.
It should be understood that the above description is only an example given for a better understanding of the technical solution of this embodiment, and is not the only possible implementation of this embodiment.
As can be seen from the above description, because the chip types adopted in the camera hardware abstraction layer differ, as do the platforms providing those chips, the step size of the space occupied by each row of pixels in the preview stream Buffer that buffers the preview stream uploaded by the camera hardware abstraction layer may vary: it may be 64 bits, 128 bits, or 512 bits. The preset surface information also differs, so the step size of the blurred frame Buffer allocated by the camera frame may or may not match the step size of the current preview stream Buffer. A single blurred frame copy interface that copies Buffer data in units of whole data blocks in the camera frame layer, as in the software system shown in fig. 2 installed in electronic devices with cameras (such as mobile phones) that adopt chips provided by different platforms, is therefore obviously not universally applicable. If, whenever the shooting mode of the electronic equipment of each platform is switched, a blurred frame is to be acquired as the transition animation, the blurred frame copy interface or its configuration has to be changed case by case, which greatly increases the workload of modification and reconfiguration and is not conducive to later maintenance.
To address the technical problem in the prior art that using the same interface makes the shooting mode switching process abnormal on different platforms, the embodiments of the present application provide a shooting mode switching method and an electronic device, so that a single set of blurred frame copy interfaces can be applied to different platforms.
In order to better describe the method for switching the shooting mode and the electronic device provided in the embodiment of the present application, taking the electronic device as a mobile phone as an example, a hardware structure and a software structure of the mobile phone are described with reference to fig. 6 and 7, respectively.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a mobile phone according to an embodiment of the present application. Fig. 6 illustrates the structure of an electronic device by using a mobile phone, but it is obvious to those skilled in the art that the structure of the mobile phone in fig. 6 is also applicable to other electronic devices which are provided with a camera and support mode switching. As shown in fig. 6, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like, and may support various USB specifications including USB1.0, USB2.0, USB3.0, and USB4.0 or higher standard USB specifications. Illustratively, the USB interface 130 may include one or more USB interfaces.
In addition, the processor 110 is further configured to obtain, from the memory, instructions for implementing the shooting mode switching method provided in the embodiments of the present application, so as to implement, according to the obtained instructions and the shooting mode switching method, copying of the blurred frame even when the step size of the space occupied by each line of pixels in the preview stream Buffer is inconsistent with that in the blurred frame Buffer.
In addition, it should be understood that the connection relationship between the modules illustrated in the embodiments of the present application is only an exemplary illustration, and does not limit the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the handset 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The mobile phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
Referring to fig. 7, fig. 7 is a block diagram of a software structure of the mobile phone 100 according to the embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, which from top to bottom are the application layer, the system framework layer, the system library, the Android runtime (in fig. 7, the runtime layer "Android runtime" is combined into the system library), and the kernel layer.
The application layer may include camera, gallery, calendar, sports, WLAN, music, video, etc. applications. It should be noted that the application included in the application layer shown in fig. 7 is only an exemplary description, and the application is not limited thereto. It is to be understood that the applications included in the application layer are not to be specifically limited to the handset 100. In other embodiments of the present application, the mobile phone 100 may include more or less applications than the applications included in the application layer shown in fig. 7, and the mobile phone 100 may also include completely different applications.
The system framework layer provides an Application Programming Interface (API) and a Programming framework for the Application program of the Application layer, including various components and services to support android development by the developer. The system framework layer includes a number of predefined functions. As shown in FIG. 7, the system framework layers may include a view system, a window manager, an explorer, a content provider, and the like. The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like. The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, and the like.
The system library and Runtime layer comprises a system library and an Android Runtime. The system library may include a plurality of functional modules, for example: a browser kernel, a 3D graphics library (e.g., OpenGL ES), a font library, and the like. The browser kernel is responsible for interpreting web page syntax (e.g., HTML and JavaScript) and rendering (displaying) web pages. The 3D graphics library is used for realizing three-dimensional graphics drawing, image rendering, composition, layer processing and the like. The font library is used for realizing the input of different fonts. The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library comprises two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
It is to be understood that the components included in the system framework layer, the system library and the runtime layer shown in fig. 7 do not constitute a specific limitation to the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
In addition, it is understood that, although the description of the embodiments of the present application takes a mobile phone as an example, in other embodiments the present application is also applicable to other electronic devices with a camera in scenes where the shooting mode is switched after the camera application is started, such as laptop computers, desktop computers, palmtop computers (e.g., tablet computers and smart phones), and smart wearable devices with cameras (e.g., smart watches).
In order to better understand the shooting mode switching method provided in the embodiments of the present application, an application scenario of the method is described below with reference to fig. 8.
Referring to fig. 8, the shooting mode switching method provided by the embodiments of the present application can be applied to electronic devices of different platforms; that is, the system enables electronic devices of different platforms to copy blurred frames.
As shown in fig. 8, a mobile phone installed with the system for blurred frame copying applicable to electronic devices of different platforms according to the embodiment of the present application adopts a chip provided by platform 1 (hereinafter, chip 1). Assume that the step size, after memory alignment, of the preview stream Buffer uploaded to the camera frame after processing by chip 1 in the camera hardware abstraction layer is 64 bits, and that in the actual buffering process the width of the pixel points of user B displayed on each line of the image data buffered in this 64-bit-aligned preview stream Buffer is X. A tablet computer installed with the same system adopts a chip provided by platform 2 (hereinafter, chip 2). Assume that the step size, after memory alignment, of the preview stream Buffer uploaded to the camera frame after processing by chip 2 in the camera hardware abstraction layer is 512 bits; in the actual buffering process, however, the width of the pixel points displaying user B in each row of the user image buffered in this 512-bit-aligned preview stream Buffer is also X.
For the electronic devices provided by the two platforms, once the camera application installed in the mobile phone or in the tablet computer is started, the memory alignment of the preset surface information carried in the blurred frame request issued when mode switching is triggered is 64 bits. That is, the step size of the blurred frame Buffer allocated in the camera frame of the mobile phone and that allocated in the camera frame of the tablet computer are both 64 bits, and the width within those 64 bits used to display image data is X (the width of the pixel points displaying the real image data). Based on the system provided by the embodiment of the present application, both the mobile phone, whose memory alignment matches, and the tablet computer, whose memory alignment does not match, can adopt the same blurred frame copy interface, so that the blurred transition picture can be displayed and no black screen appears during shooting mode switching.
For the architecture of the system provided by the embodiment of the present application that is applicable to different platforms, and the specific flow when implementing the blurred frame copy, refer to fig. 9.
As shown in fig. 9, after the user, for example user A, clicks the camera application located in the application program layer of the mobile phone system (the camera application 10-1 in fig. 1), the mobile phone system, upon monitoring the click operation, issues an operation instruction for opening the camera to the camera frame located in the application framework layer. The camera frame calls the corresponding camera interface and issues a configuration stream to the camera hardware abstraction layer of the hardware abstraction layer (HAL layer), thereby starting the camera application.
Correspondingly, after the camera application is started, as long as user A does not click any functional button of the display interface, the camera hardware abstraction layer continuously reports the preview stream.
It can be understood that, in an actual application scenario, the types of chips adopted in the camera hardware abstraction layer and the platforms providing those chips differ, so the memory space occupied by the uploaded preview stream stored in the preview stream Buffer in the camera frame differs as well. At present, in order to copy the preview stream from the Buffer efficiently and quickly, a memory alignment operation is usually performed on the image data corresponding to the preview stream; that is, extended content (padding) is automatically added at the end of each line of image data so as to achieve memory alignment.
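The stride produced by memory alignment can be computed by rounding each row's width up to the platform's alignment requirement; the arithmetic is the same whether the unit is bits or bytes. A minimal sketch, with the alignment values assumed in this description and an illustrative row width:

```python
def aligned_stride(width_bytes, align):
    """Round the row width up to the next multiple of `align`.
    The padding added at the end of each row is stride - width."""
    return (width_bytes + align - 1) // align * align

width = 1080  # illustrative bytes of real image data per row
print(aligned_stride(width, 64))   # 1088 → 8 bytes of padding
print(aligned_stride(width, 512))  # 1536 → 456 bytes of padding
```

This is why the same image width X can yield different step sizes (64-bit versus 512-bit alignment) on chips from different platforms.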
In addition, it should be noted that the image data is the real image that needs to be displayed in area 20 of the display interface corresponding to the camera application of the application layer, for example each pixel point of the image of user B displayed in area 20 of the display interface in fig. 1. The extended content automatically added at the end of each line carries no content of its own; it merely occupies memory space in the buffer and does not affect the display of the image.
As can be seen from fig. 9, after the camera application is started, the camera application located in the application program layer of the mobile phone system (that is, the camera application 10-1 in fig. 1 clicked by user A) monitors that user A triggers the mode switching function, and at this time issues a blurred frame request carrying the preset surface information to the camera frame located in the application framework layer.
Regarding the preset surface information carried in the blurred frame request, fig. 9 again takes a memory alignment of 64 bits as an example.
Illustratively, when the blurred frame request carrying this information reaches the camera frame, the camera frame allocates a blurred frame Buffer according to the preset surface information.
Correspondingly, after the camera frame has allocated the 64-bit memory-aligned blurred frame Buffer, the image data in the preview stream Buffer is copied line by line. After the copy is completed, the blurred frame Buffer is returned to the camera application, so that the camera application can perform blurring on the image data in the blurred frame Buffer and obtain the transition picture used for the blurred transition during shooting mode switching.
In addition, it should be noted that, because the shooting mode switching method provided in the embodiments of the present application can be applied to electronic devices using different chips, copying of preview streams in different memory alignment formats can be realized, as described below with reference to fig. 10.
Referring to fig. 10, two implementation scenarios are shown. The contents shown by the solid line boxes in the HAL layer indicate that the current electronic device (for example, a mobile phone) adopts chip 1. Assuming that the memory alignment format of the preview stream provided by chip 1 is 512 bits, the camera frame in the application framework layer needs to allocate a preview stream Buffer for buffering the preview stream according to that memory alignment format.
Understandably, since the memory alignment format of the preview stream provided by the chip 1 is 512 bits, the preview stream Buffer allocated in the camera frame needs to be the preview stream Buffer with memory alignment of 512 bits shown in a solid frame.
For example, if the electronic device (e.g., a mobile phone) employs the chip 2 shown by a dashed-line frame in the HAL layer in fig. 10, and it is assumed that the memory alignment format of the preview stream provided by the chip 2 is 64 bits, in the camera frame of the application framework layer, the camera frame needs to allocate a preview stream Buffer for buffering the preview stream according to the memory alignment format of the preview stream provided by the chip 2.
It can be understood that, since the memory alignment format of the preview stream provided by the chip 2 is 64 bits, the preview stream Buffer allocated in the camera frame needs to be the preview stream Buffer with 64 bits memory alignment shown by the dashed box.
However, no matter which platform provides the chip used, and no matter what the memory alignment format is, when determining that a blurred frame copy is needed, the electronic device based on the system architecture provided in the embodiment of the present application copies the data in the preview stream Buffer line by line, as in step 7 and step 7' in fig. 10. Therefore, as long as the width of the image data for display in the blurred frame Buffer is identical to the width of the image data to be displayed stored in the preview stream Buffer (identical resolution, identical width), the blurred frame can be copied by copying the image data of each line through the line copy provided in the present system.
Based on the electronic device with the hardware structure and software structure described above, the following describes the shooting mode switching method provided by the embodiment of the present application.
Referring to fig. 11, a specific implementation flow of the method for switching the shooting mode provided in this embodiment specifically includes:
Step 201, when the application program layer monitors a selection operation on the camera application, it acquires, from the camera frame, a first preview memory block in which a first preview stream is buffered, generates a first preview picture according to the first preview stream buffered in the first preview memory block, and displays the first preview picture on the display interface.
Specifically, the first preview stream is uploaded by the camera hardware abstraction layer, the memory alignment format of the first preview stream is a first memory alignment format, and the memory alignment format of the first preview stream memory block is also the first memory alignment format.
For example, in an actual application scenario, when the application program layer monitors a selected operation of the camera application, acquiring, from the camera frame, the first preview memory block in which the first preview stream is buffered specifically is:
the camera frame sends a configuration stream to the camera hardware abstraction layer according to the request for opening the camera;
the camera hardware abstraction layer opens the camera according to the configuration stream and uploads the first preview stream in the first memory alignment format to the camera frame;
the camera frame allocates the first preview stream memory block in the first memory alignment format for the first preview stream according to the first memory alignment format, buffers the first preview stream to the first preview memory block, and sends the first preview memory block in which the first preview stream is buffered to the camera application.
In addition, in an actual application scenario, in order to avoid repeatedly allocating a memory block for buffering a preview stream in a memory, the camera application returns the first preview stream memory block after the first preview stream is taken out to the camera frame, so that the camera frame buffers the second preview stream to the first preview stream memory block after receiving the second preview stream uploaded by the camera hardware abstraction layer, thereby avoiding memory redundancy.
Step 202, when monitoring a switching operation of any shooting mode in the camera application, the application program layer sends a blurred frame request carrying preset canvas information to the camera frame, where the preset canvas information includes a second memory alignment format.
It should be understood that the preset canvas information, that is, the preset surface information referred to herein, may in one example be a 64-bit memory alignment format.
Step 203, the camera frame allocates a blurred frame memory block in the second memory alignment format according to the preset canvas information, copies the image data in the first preview stream memory block to the blurred frame memory block line by line, and sends the blurred frame memory block to the camera application after the copying is completed.
For example, before the camera frame copies the image data in the first preview stream memory block to the blurred frame memory block line by line, the camera frame may compare the first memory alignment format with the second memory alignment format.
Correspondingly, when the first memory alignment format differs from the second memory alignment format, the camera frame performs the step of copying the image data in the first preview stream memory block to the blurred frame memory block line by line; when the first memory alignment format is the same as the second memory alignment format, the camera frame copies the entire content of the first preview memory block to the blurred frame memory block in units of the first preview memory block.
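The dispatch described above can be sketched as follows. This is an illustration of the decision logic, with hypothetical helper names and toy buffers, not the actual framework code:

```python
def copy_preview_to_blur(src, rows, stride_src, stride_dst, width):
    """Copy a preview buffer into a blurred-frame buffer.

    If the two strides (memory alignment formats) match, the whole buffer
    is copied as one block (fast path); otherwise only the `width` pixel
    bytes of each row are copied, line by line.
    """
    if stride_src == stride_dst:
        return bytearray(src[:rows * stride_src])  # block copy
    dst = bytearray(rows * stride_dst)
    for r in range(rows):
        dst[r * stride_dst:r * stride_dst + width] = \
            src[r * stride_src:r * stride_src + width]
    return dst

# Two rows, source stride 4, pixel width 2 (toy sizes).
src = bytearray(b'\x01\x01\x00\x00\x02\x02\x00\x00')
print(bytes(copy_preview_to_blur(src, 2, 4, 4, 2)))  # block-copied, unchanged
print(bytes(copy_preview_to_blur(src, 2, 4, 3, 2)))  # re-strided line by line
```

Either branch leaves the pixel data of every row at the start of the corresponding destination row, which is what the subsequent blurring step needs.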
Understandably, because the memory-aligned preview stream includes two parts, namely the image data and the extended content, the content in the first preview stream memory block includes the image data and the extended content spliced onto it, where the image data consists of the pixel points used to generate the second preview picture, and the extended content (padding) is used only as a placeholder, that is, it does not affect the display of the picture.
Therefore, the camera frame copying the image data in the first preview stream memory block to the fuzzy frame memory block line by line is specifically: the camera frame copies the pixel points of each line in the first preview stream memory block that are used for generating the second preview picture to the corresponding positions in the corresponding line of the fuzzy frame memory block, based on a YUV color coding method.
For example, the camera frame copying, based on a YUV color coding method, the pixel points of each line in the first preview stream memory block that are used for generating the second preview picture to the corresponding positions in the corresponding line of the fuzzy frame memory block specifically includes:
the camera frame determines the values of the Y, U and V channels at each position of each line in the first preview stream memory block based on the YUV color coding method;
the camera frame distinguishes, according to the changes in the values of the Y, U and V channels, the pixel points of each line used for generating the second preview picture from the placeholder extended content;
and the camera frame copies the pixel points of each line in the first preview stream memory block used for generating the second preview picture, one by one, to the corresponding positions in the corresponding line of the fuzzy frame memory block.
Thus, row-by-row copying based on the YUV color coding method is realized.
In addition, the camera frame copying the entire content of the first preview stream memory block to the fuzzy frame memory block in units of the whole block specifically includes: the camera frame copies the image data in the first preview stream memory block, together with the extended content spliced to it, to the fuzzy frame memory block as a single block. This enables a fast copy of the fuzzy frame.
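The two copy paths described above (a whole-block copy when the alignment formats match, a line-by-line copy of only the image data when they differ) can be sketched as follows. This is a minimal illustration, not the actual implementation: the function name is hypothetical, strides are expressed in bytes for simplicity, and buffers are modeled as flat byte arrays.

```python
def copy_to_fuzzy_buffer(src: bytes, src_stride: int,
                         dst_stride: int, row_bytes: int, rows: int) -> bytearray:
    """Copy preview image data into a fuzzy frame memory block (sketch).

    src_stride / dst_stride model the two memory alignment formats
    (step size per row); row_bytes is the width of the real image
    data, the rest of each row being placeholder padding.
    """
    dst = bytearray(dst_stride * rows)
    if src_stride == dst_stride:
        # Same alignment format: treat the source as one data block.
        dst[:] = src[:dst_stride * rows]
    else:
        # Different alignment formats: copy only the image data, row by row,
        # skipping each row's trailing padding.
        for r in range(rows):
            s, d = r * src_stride, r * dst_stride
            dst[d:d + row_bytes] = src[s:s + row_bytes]
    return dst
```

Either way, the pixel data that generates the second preview picture ends up at the start of each destination row, which is all the subsequent blurring step needs.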
In addition, in an actual application scenario, the image data copied from the preview stream Buffer to the fuzzy frame Buffer needs to match a frame format that can be displayed by the surface the camera application generates from the fuzzy frame Buffer for the transition animation. Therefore, before the fuzzy frame is copied, the camera frame detects whether the first preview stream buffered in the first preview stream memory block is in a preset stream format. For example, when the surface displays image frames in the 0x23 single-frame format, the preset stream formats matching that surface are the 0x11 format and the 0x22 format, that is, preview streams in these two stream formats can be parsed into image frames in the 0x23 single-frame format.
Correspondingly, when the first preview stream buffered in the first preview stream memory block is in the preset stream format, the camera frame executes the step of comparing the first memory alignment format with the second memory alignment format; when the first preview stream buffered in the first preview stream memory block is not in the preset stream format, the camera frame calls back a fuzzy frame copy failure to the camera application.
Illustratively, in an actual application scenario, when the camera application receives the callback failure information, it can directly blur the first preview picture currently displayed in the display interface, so that a black screen is avoided.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
And 204, in the switching process of the shooting mode, the camera application performs fuzzy processing on the image data in the fuzzy frame memory block to obtain a second preview picture, and the second preview picture is adopted to replace the first preview picture displayed in the display interface.
Further, after the shooting mode is switched to the selected shooting mode, the camera application is further configured to obtain, from the camera frame, a second preview stream memory block in which a second preview stream is buffered, generate a third preview picture according to the second preview stream buffered in the second preview stream memory block, and replace the second preview picture displayed in the display interface with the third preview picture, where the second preview stream is uploaded by the camera hardware abstraction layer after the shooting mode is switched to the selected shooting mode, and the memory alignment format of both the second preview stream and the second preview stream memory block is the first memory alignment format.
In addition, since the camera application returns the emptied first preview stream memory block to the camera frame after taking the first preview stream out of it, when the camera frame receives the second preview stream in the first memory alignment format uploaded by the camera hardware abstraction layer, it may first detect whether a reusable first preview stream memory block exists.
Correspondingly, when a reusable first preview stream memory block exists, the camera frame buffers the second preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates a third preview picture according to the second preview stream buffered in the first preview stream memory block, and replaces the second preview picture displayed in the display interface with the third preview picture; when no reusable first preview stream memory block exists, the camera frame allocates a second preview stream memory block in the first memory alignment format for the second preview stream, buffers the second preview stream into the second preview stream memory block, and sends the second preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates a third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed in the display interface with the third preview picture.
Therefore, the full sequence is realized: the camera application starts, a mode switch is triggered, the fuzzy frame is copied, the fuzzy frame provides the picture transition, the shooting mode is switched, and normal frames resume once the switch completes. In order to better understand the method for switching the shooting mode provided in this embodiment, a timing chart of the data interaction among the camera application, the camera frame, and the camera hardware abstraction layer during a successful mode switch, from the moment the camera application is opened, is described below with reference to fig. 12:
step 301, a request to open a camera is sent to a camera frame.
Step 302, sending a configuration stream to a camera hardware abstraction layer.
In step 303, the preview stream S2 with 512-bit memory alignment is uploaded.
In step 304, a preview stream Buffer with 512-bit memory alignment is allocated for the preview stream S2, and the preview stream S2 is buffered into it.
And step 305, uploading the preview stream Buffer.
Step 306, triggering mode switching.
And 307, sending a fuzzy frame request carrying the surface information to the camera frame.
Specifically, the method for switching the shooting mode provided in the embodiment of the present application mainly aims at improving the fuzzy frame copy process executed after the mode switch is triggered. When the camera application is opened and the user has not yet triggered or clicked any function button in the display interface, the camera hardware abstraction layer continuously uploads the preview stream to the buffer in the camera frame, and the preview stream is uploaded from the buffer to the camera application for display; this is similar to the existing scheme and is not described here again. In this embodiment, the starting point is directly that the camera application located in the application program layer monitors that the user triggers a mode switching operation, and issues a fuzzy frame request to the camera frame located in the application framework layer.
Understandably, because the fuzzy frame to be acquired is finally displayed in the preset surface, in an actual application scene, the fuzzy frame request issued to the camera frame by the camera application needs to carry preset surface information.
Further, in order to ensure that the preview stream in the stream format that can match the single frame format that the preset surface needs to display can be acquired from the preview stream Buffer, the blurred frame request may further include the single frame format corresponding to the preset surface.
For convenience of description, in this embodiment, the carried preset surface information specifies a memory alignment, i.e. the step size of the memory space (storing the real pixel points plus the placeholder extended content padding), of 64 bits, and a single-frame format of 0x23, as an example.
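The relationship between the real pixel width, the alignment, and the resulting step size can be expressed as a small helper. This is an illustrative sketch (hypothetical names, widths in bytes), not part of the patented method:

```python
def aligned_stride(row_bytes: int, alignment: int) -> int:
    """Round the width of a row's real pixel data up to the next
    alignment boundary; the result is the step size of the memory
    space occupied by each row."""
    return -(-row_bytes // alignment) * alignment  # ceiling division

def padding_per_row(row_bytes: int, alignment: int) -> int:
    """Width of the placeholder extended content (padding) appended
    to the end of each row to reach the aligned step size."""
    return aligned_stride(row_bytes, alignment) - row_bytes
```

For instance, a 500-byte row under 64-byte alignment occupies a 512-byte step, leaving 12 bytes of padding per row.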
And 308, allocating a fuzzy frame Buffer with 64-bit memory alignment according to the surface information.
Step 309, copying the image data in the preview stream Buffer to the blurred frame Buffer by line.
Specifically, after receiving the fuzzy frame request carrying the preset surface information and the single-frame format from the camera application, the camera frame may first extract a frame of the preview stream from the preview stream Buffer S2, and then detect whether the preview stream Buffer S2 contains a preview stream in the preset stream format. That is, the most recently buffered frame of the preview stream is fetched from the buffer to determine whether it is in a preset stream format matching the 0x23 single-frame format.
For example, in an actual application scenario, the stream formats of a preview stream from which image frames in the 0x23 format can currently be parsed are typically 0x11 and 0x22. Therefore, when the preset surface uses the 0x23 single-frame format, the preset stream format of a preview stream matching it is 0x11 or 0x22, and it is accordingly determined whether the preview stream Buffer buffers a preview stream in the 0x11 or 0x22 format.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment. In practical application, the single-frame format of the surface may be of other types, and accordingly, the format of the preview stream matched with the single-frame format is the preview stream capable of analyzing the set single-frame format.
Correspondingly, if not, it is determined that no preview stream suitable for fuzzy frame processing exists in the preview stream Buffer, and callback information of the fuzzy frame copy failure is returned to the camera application located in the application program layer; otherwise, if a preview stream suitable for fuzzy frame processing exists in the preview stream Buffer, the process proceeds to step 309, and the operation in step 309 is executed.
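The compatibility check above reduces to a table lookup. In this sketch the format codes 0x23, 0x11, and 0x22 are used only as the opaque identifiers the text gives; the mapping and function name are illustrative assumptions:

```python
# Single-frame format of the surface -> stream formats it can be parsed from.
COMPATIBLE_STREAM_FORMATS = {0x23: {0x11, 0x22}}

def can_copy_fuzzy_frame(surface_frame_format: int, stream_format: int) -> bool:
    """Return True when the buffered preview stream can be parsed into
    the single-frame format displayed by the preset surface; otherwise
    the camera frame calls back a fuzzy frame copy failure."""
    return stream_format in COMPATIBLE_STREAM_FORMATS.get(surface_frame_format, set())
```

Keeping the mapping in a table makes it easy to extend for other single-frame formats mentioned in the text without changing the check itself.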
Further, before performing step 309, it may be determined whether the memory alignment formats of the preview stream Buffer and the blurred frame Buffer are the same.
Correspondingly, if the memory alignment formats of the two are the same, that is, the step sizes are the same, the preview stream Buffer may be locked, the current preview stream Buffer is regarded as one data block, and the content (image data + padding) in the preview stream Buffer is copied to the blurred frame Buffer as a whole, in units of that data block; a specific copying process is shown in fig. 4.
Accordingly, if the memory alignment formats of the two are different, i.e. the step sizes are not the same, the operation of step 309 may be performed.
Specifically, when step 309 is executed, the preview stream Buffer may be locked first, so as to avoid that the camera hardware abstraction layer uploads a new preview stream to the preview stream Buffer, which affects the process of copying the blurred frame; then, based on a YUV color coding method, copying pixel points of each row in the preview stream Buffer for generating a second preview picture (a blurred frame picture obtained after subsequent blurring operation) to corresponding positions in a row corresponding to the blurred frame Buffer by using a camera frame.
Of course, in an actual application scenario, the logic for determining whether the preview stream Buffer and the blurred frame Buffer are memory-aligned need not be added. In this case, regardless of whether the memories are aligned, as long as the width of the image data to be displayed buffered in the preview stream Buffer is consistent with the width reserved for displaying image data in the blurred frame Buffer, the image data of each line in the preview stream Buffer is directly copied to the corresponding line in the blurred frame Buffer, line by line.
The YUV color coding method described above employs brightness and chroma to specify the color of a pixel, and chroma defines both hue and saturation of the color. Y in YUV color coding denotes in particular brightness (Luma), U and V denote Chroma (Chroma). In practical applications, U and V are a blue channel and a red channel, respectively, and the higher the Y channel value is, the brighter the picture is, the higher the U channel value is, the closer the color is to blue, and the higher the V channel value is, the closer the color is to red.
Therefore, based on this characteristic of YUV color coding, when it is determined that the step size of the space occupied by each line of pixels in the preview stream Buffer is inconsistent with that in the blurred frame Buffer, that is, the memories of the preview stream Buffer and the blurred frame Buffer are not aligned, the following is done in order to avoid the current behavior of directly calling back a fuzzy frame copy failure, which causes the black screen phenomenon in region 20 of the display interface shown in fig. 1. Specifically, based on the YUV color coding method, the values of the three channels Y, U and V at each position of each line in the preview stream Buffer are identified, the pixel points of the real image data of each line are distinguished from the contentless padding according to the changes in the values of the Y, U and V channels, and the pixel points of the real image data identified in each line are then copied, in order, to the corresponding positions in the corresponding line of the blurred frame Buffer.
It can be understood that in an actual application scenario, the padding added at the end of each line of real image data is usually blank, so the values of Y, U and V corresponding to the padding differ obviously from those corresponding to the pixel points of the real image data; based on this characteristic, the pixel points of each line of real image data in the buffer can be distinguished from the padding added at the end of the line.
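The padding detection described above can be sketched by scanning a row from its end until real channel values appear. This assumes, purely for illustration, that blank padding is zero-filled in all three channels (real padding bytes may hold other values) and that each channel of the row is available as a separate sequence:

```python
def real_pixel_count(y_row, u_row, v_row) -> int:
    """Return the number of leading positions in a row that hold real
    YUV pixel data, treating trailing zero-valued positions in all
    three channels as blank placeholder padding (an assumption)."""
    n = len(y_row)
    while n > 0 and y_row[n - 1] == u_row[n - 1] == v_row[n - 1] == 0:
        n -= 1
    return n
```

Scanning from the end means a real pixel that happens to be zero in one channel mid-row is never misclassified, since padding only ever trails the image data.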
For better understanding, how the method for switching the shooting mode provided in the embodiment of the present application performs the blurred frame copy operation when the two memories are not aligned is described below with reference to fig. 13.
Referring to fig. 13, assuming that the step size (X + Padding_1) of the space occupied by each row of pixels in the preview stream Buffer is 512 bits, and the step size (X + Padding_2) of the space occupied by each row of pixels in the blurred frame Buffer is 64 bits, the copying process of the image data a in the preview stream Buffer is specifically: the real image data a is located from the head of the preview stream Buffer according to the resolution, and the real image data of each line is then copied into the corresponding line in the blurred frame Buffer.
It can be understood that in practical application scenarios, for convenience of copying, the end of each line of the image data a contains partial extended content (Padding_1) to ensure memory alignment, but this extended content does not affect the display of the image data a. Therefore, regardless of whether the alignment is 64-bit, 128-bit, or 512-bit, as long as the resolution of the image data is the same, the width X of the real YUV display data stored in the image data is the same; thus, as long as the image data for display is copied successfully, the actually stored size (image data X + extended content Padding_1) does not matter.
Therefore, based on the characteristic of YUV color coding, the values of the three channels Y, U and V corresponding to each position of each line in the preview stream Buffer are identified one by one, the pixel points of the image data a (real image data to be displayed) of each line are distinguished from padding without content according to the change of the values of the three channels Y, U and V, and then the pixel points of the real image data identified by each line are sequentially copied to the corresponding positions in the corresponding lines in the blurred frame Buffer.
Regarding the copying process, as shown in fig. 13, before the data copying is started, an area available for displaying image data in the blurred frame Buffer is empty, that is, there is no content, and in the copying process of the image data a, the image data a in the preview stream Buffer will be copied into the blurred frame Buffer line by line until all the image data a in the preview stream Buffer are copied into the blurred frame Buffer, that is, the content of the blurred frame Buffer after the data copying is completed includes the image data a and Padding _ 2.
In order to better understand that the step size of the space occupied by each row of pixels in the preview stream Buffer provided by the embodiment of the present application is not consistent with the step size of the space occupied by each row of pixels in the blurred frame Buffer, that is, the two memories are not aligned, how to copy the image data of each row in the preview stream Buffer into the blurred frame Buffer by rows to ensure that the blurred frame copy is performed smoothly is described below with reference to fig. 14 to 16 c.
Referring to fig. 14, the step size of the space occupied by each row of pixels in the preview stream Buffer is 512 bits, i.e., the width X of the image data a to be displayed plus the placeholder extended content Padding_1 added at the end of each row. The image data a to be displayed is composed of the 5 pixels "0" to "4" of the first line, the 5 pixels "8" to "12" of the second line, and the 5 pixels "16" to "20" of the third line. At the end of each row, extended content with a width of Padding_1 is added: 3 placeholder entries "5" to "7" after pixel point "4" in the first row, 3 placeholder entries "13" to "15" after pixel point "12" in the second row, and 3 placeholder entries "21" to "23" after pixel point "20" in the third row.
Referring to fig. 15, the step size of the space occupied by each row of pixels in the blurred frame Buffer allocated according to the preset surface information carried in the blurred frame request is 64 bits, i.e., the width X for displaying the image data plus the placeholder extended content Padding_2 added at the end of each row. The positions for displaying the image data are the 5 positions "0'" to "4'" of the first line, the 5 positions "7'" to "11'" of the second line, and the 5 positions "14'" to "18'" of the third line. At the end of each row, extended content with a width of Padding_2 is added: 2 placeholder entries "5'", "6'" after "4'" in the first row, 2 placeholder entries "12'", "13'" after "11'" in the second row, and 2 placeholder entries "19'", "20'" after "18'" in the third row.
Accordingly, when it is determined that the step size (512 bits) of the space occupied by each line of pixels in the preview stream Buffer is not consistent with the step size (64 bits) of the space occupied by each line of pixels in the blurred frame Buffer, the preview stream Buffer is locked, and based on the YUV color coding method, the pixel points of the image data a in the first line, such as the 5 pixel points of the first line "0" to "4" in fig. 14, are identified first, and then the 5 pixel points are copied to the 5 empty positions, such as the 5 empty positions of the first line "0 '" to "4 '" in fig. 15, specifically, "0" to "0 '", "1" to "1 '", "2" to "2 '", "3" to "3 '", and "4" to "4 '", respectively, and after the copying of the 5 pixel points is completed, the data distribution of the blurred frame Buffer will be as shown in fig. 16 a.
Next, in a manner of copying the pixel points of the image data a to be displayed in the line sequence, the data of the second line in the preview stream Buffer shown in fig. 14 is copied into the blurred frame Buffer shown in fig. 16a, and after the copying of the 5 pixel points of the second line "8" to "12" in fig. 14 is completed, the data distribution of the blurred frame Buffer is as shown in fig. 16 b.
Next, in a manner of copying the pixel points of the image data a to be displayed in line sequence, the data in the third line in the preview stream Buffer shown in fig. 14 is copied into the blurred frame Buffer shown in fig. 16b, and after the copying of the 5 pixel points in the third line "16" to "20" in fig. 14 is completed, the data distribution of the blurred frame Buffer is as shown in fig. 16c, thereby completing the copying of all the image data a to be displayed in the preview stream Buffer.
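The walk-through of figs. 14 to 16c can be reproduced in miniature. For simplicity this sketch uses byte-sized entries rather than bits (stride 8 = 5 pixels + Padding_1 of 3; stride 7 = 5 pixels + Padding_2 of 2), and 0xFF marks the blurred frame Buffer's own untouched Padding_2; the layout is otherwise the one the figures describe:

```python
SRC_STRIDE, DST_STRIDE, WIDTH, ROWS = 8, 7, 5, 3  # illustrative, in bytes

# Preview stream Buffer: pixel points "0"-"4", "8"-"12", "16"-"20" (fig. 14).
src = bytearray(SRC_STRIDE * ROWS)
for r in range(ROWS):
    for c in range(WIDTH):
        src[r * SRC_STRIDE + c] = r * SRC_STRIDE + c

# Blurred frame Buffer: 0xFF marks its own placeholder Padding_2 (fig. 15).
dst = bytearray(b"\xff" * (DST_STRIDE * ROWS))

for r in range(ROWS):                    # copy line by line
    s, d = r * SRC_STRIDE, r * DST_STRIDE
    dst[d:d + WIDTH] = src[s:s + WIDTH]  # image data only; Padding_1 is skipped

# The rows land at positions 0'-4', 7'-11', 14'-18' (figs. 16a-16c), while
# Padding_2 at 5'-6', 12'-13', 19'-20' is left untouched.
```

Running the loop reproduces the three intermediate states of figs. 16a, 16b, and 16c in order, one row per iteration.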
In addition, it can be seen from the above description that in the method of copying each line of image data in the preview stream Buffer into the blurred frame Buffer according to the embodiment of the present application, the Padding_1 at the end of each line of image data a in the preview stream Buffer is not copied into the blurred frame Buffer during the copying process; the blurred frame Buffer stores the image data a copied from the preview stream Buffer, and its extended content remains the blurred frame Buffer's original Padding_2.
That is to say, the blurred frame copying method provided in the embodiment of the present application applies when the step size of the space occupied by each line of pixels in the preview stream Buffer is inconsistent with that in the blurred frame Buffer. For example, when user A clicks the camera application 10-1 in the main page 10 and photographs user B with the default rear camera, if it is monitored that user A clicks the function button 20-5 for starting the recording mode in the display interface, the method for switching the shooting mode provided in the embodiment of the present application is executed between the camera application and the camera frame. In this way, regardless of whether the memories of the preview stream Buffer and the blurred frame Buffer are aligned, the blurred frame can be obtained, so that the blurred image of user B is displayed in region 20 of the display interface (for example, the blurred outline of user B in the lower right corner of fig. 17). The blurred frame is displayed while the shooting mode is switched, and when the camera application receives the video stream uploaded by the camera hardware abstraction layer, the image after the successful mode switch is displayed.
It should be understood that the above description is only an example for better understanding of the technical solution of the present embodiment, and is not to be taken as the only limitation of the present embodiment.
Step 310 uploads the blurred frame Buffer.
And step 311, blurring the image frame in the fuzzy frame Buffer, and displaying it.
Step 312, displaying the image after the switching is successful.
Therefore, with the method for switching the shooting mode provided in the embodiment of the present application, when the preview stream Buffer is reused, i.e. multiplexed, across multiple platforms and scenarios, the fuzzy frame copy can be completed smoothly even when the step size of the space occupied by each line of pixels in the preview stream Buffer is inconsistent with that in the fuzzy frame Buffer. The method can therefore adapt to multiple chip platforms without changing the configuration, realizing compatibility with non-aligned formats.
Further, based on the method for switching the shooting mode provided in the embodiment of the present application, the image step size format does not need to be adapted separately for each chip platform, which improves adaptability to multiple platforms and scenarios while reducing subsequent development and maintenance work.
In addition, in practical applications, the method for switching the shooting mode provided in this embodiment is suitable not only for extracting the fuzzy frame from the preview stream in the buffer in multi-platform, multi-scenario settings, but also for extracting frames from data streams such as a photo stream or a video stream. For example, in a multi-platform scenario, the data frames buffered in the buffer are obtained multiple times during previewing, and the multiple frames are then made into an animated image in the Graphics Interchange Format (GIF) or a short video.
Furthermore, it is understood that, in order to implement the above-described functions, the electronic device comprises corresponding hardware and/or software modules for performing the respective functions. In conjunction with the exemplary algorithm steps described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of the modules in this embodiment is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Illustratively, fig. 18 shows a schematic block diagram of an apparatus 400 according to an embodiment of the present application. The apparatus 400 may include: a processor 401 and transceiver/transceiver pins 402, optionally also including a memory 403.
The various components of device 400 are coupled together by a bus 404, where bus 404 includes a power bus, a control bus, and a status signal bus in addition to a data bus. But for clarity of illustration the various busses are referred to in the drawings as busses 404.
Optionally, the memory 403 may be used for the instructions in the foregoing method embodiments. The processor 401 may be used to execute the instructions in the memory 403 and to control the receive pin to receive signals and the transmit pin to transmit signals.
The apparatus 400 may be an electronic device with a camera and supporting different shooting modes, such as a mobile phone as mentioned in the above method embodiments.
Illustratively, when the apparatus is an electronic device, a camera application is installed on an application program layer of the electronic device, an application program framework layer includes a camera framework, a HAL layer includes a camera hardware abstraction layer, the camera framework performs data interaction with the camera application and the camera hardware abstraction layer, respectively, and the electronic device further includes: one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored on the memory, and when executed by the one or more processors, cause the electronic device to perform the steps of:
when the application program layer monitors the selected operation of the camera application, acquiring a first preview stream memory block in which a first preview stream is buffered from the camera frame, generating a first preview picture according to the first preview stream buffered in the first preview stream memory block, and displaying the first preview picture on a display interface, wherein the first preview stream is uploaded by the camera hardware abstraction layer, and the memory alignment format of the first preview stream is a first memory alignment format;
when monitoring the switching operation of any shooting mode in the camera application, the application program layer sends a fuzzy frame request carrying preset canvas information to the camera frame, wherein the preset canvas information comprises a second memory alignment format;
the camera frame allocates a fuzzy frame internal memory block in a second memory alignment format according to the preset canvas information, copies the image data in the first preview stream internal memory block to the fuzzy frame internal memory block in a line-by-line copying mode, and sends the fuzzy frame internal memory block to the camera application after copying is finished;
in the switching process of the shooting mode, the camera application carries out fuzzy processing on the image data in the fuzzy frame memory block to obtain a second preview picture, and the second preview picture is adopted to replace the first preview picture displayed in the display interface.
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
when the application program layer monitors the selection operation of the camera application, a request for opening the camera is sent to the camera frame;
the camera frame sends a configuration stream to the camera hardware abstraction layer according to the request for opening the camera;
the camera hardware abstraction layer opens the camera according to the configuration stream and uploads the first preview stream in the first memory alignment format to the camera frame;
the camera frame allocates the first preview stream memory block in the first memory alignment format for the first preview stream according to the first memory alignment format, buffers the first preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the first preview stream is buffered to the camera application.
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera application returns the first preview stream memory block to the camera framework after the first preview stream is taken out, so that the camera framework buffers a second preview stream into the first preview stream memory block after receiving the second preview stream uploaded by the camera hardware abstraction layer.
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
after switching to the selected shooting mode, the camera application acquires, from the camera framework, a second preview stream memory block in which a second preview stream is buffered, generates a third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture, wherein the second preview stream is uploaded by the camera hardware abstraction layer after the shooting mode is switched to the selected shooting mode, and the memory alignment format of the second preview stream is the first memory alignment format.
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
when the camera framework receives the second preview stream in the first memory alignment format uploaded by the camera hardware abstraction layer, detecting whether a reusable first preview stream memory block exists;
when a reusable first preview stream memory block exists, the camera framework buffers the second preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates a third preview picture according to the second preview stream buffered in the first preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture;
when no reusable first preview stream memory block exists, the camera framework allocates, according to the first memory alignment format, a second preview stream memory block in the first memory alignment format for the second preview stream, buffers the second preview stream into the second preview stream memory block, and sends the second preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates the third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture.
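The reuse-or-allocate decision above amounts to a small buffer pool. The sketch below is a minimal model under assumed names (`PreviewBufferPool`, `acquire`, `release` are invented for the example); it keys reusable blocks by size, which stands in for "same memory alignment format and resolution".

```python
# Illustrative sketch: reuse a returned preview stream memory block when one
# of the right size exists, otherwise allocate a new one.

class PreviewBufferPool:
    def __init__(self):
        self._free = {}  # size -> list of blocks returned by the application

    def release(self, buf):
        """The camera application hands a drained preview block back."""
        self._free.setdefault(len(buf), []).append(buf)

    def acquire(self, size):
        """Buffer the next preview stream: reuse if possible, else allocate."""
        free_list = self._free.get(size)
        if free_list:
            return free_list.pop()   # reusable first preview stream block
        return bytearray(size)       # no reusable block: allocate a new one
```

Reusing the returned block avoids an allocation per frame, which matters on a preview path that delivers tens of frames per second.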
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera framework compares the first memory alignment format with the second memory alignment format;
when the first memory alignment format is different from the second memory alignment format, the camera framework performs the step of copying the image data in the first preview stream memory block to the blurred-frame memory block line by line;
when the first memory alignment format is the same as the second memory alignment format, the camera framework copies the entire content of the first preview stream memory block to the blurred-frame memory block in one operation, taking the first preview stream memory block as the copy unit.
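The branch above can be sketched as one function. This is an illustrative model with assumed names and a single byte-per-pixel plane, not the patented code: when the two alignment formats match, the whole block (padding included) is copied in one operation; when they differ, only the valid bytes of each line are copied at the destination's stride.

```python
# Illustrative sketch: choose whole-block copy vs. line-by-line copy based on
# whether the source and destination memory alignment formats match.

def align_up(value, alignment):
    return (value + alignment - 1) // alignment * alignment

def blur_frame_copy(src, width, height, src_align, dst_align):
    if src_align == dst_align:
        # Same alignment format: one copy of the entire block, padding and all.
        return bytes(src)
    # Different alignment formats: copy only each line's valid pixels.
    src_stride = align_up(width, src_align)
    dst_stride = align_up(width, dst_align)
    dst = bytearray(dst_stride * height)
    for row in range(height):
        dst[row * dst_stride:row * dst_stride + width] = \
            src[row * src_stride:row * src_stride + width]
    return bytes(dst)
```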
Illustratively, in another example, the content in the first preview stream memory block includes the image data and padding content spliced together with the image data, where the image data consists of the pixel points used to generate the second preview picture, and the padding content serves only as a placeholder;
the computer program, when executed by the one or more processors, causes the electronic device to perform the step of:
the camera framework copies, based on a YUV color coding method, the pixel points of each line in the first preview stream memory block that are used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred-frame memory block.
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera framework determines, based on a YUV color coding method, the values of the Y, U, and V channels corresponding to each position of each line in the first preview stream memory block;
the camera framework distinguishes, according to changes in the values of the Y, U, and V channels, the image data of each line, namely the pixel points used to generate the second preview picture, from the padding content used as a placeholder;
the camera framework copies the pixel points of each line in the first preview stream memory block that are used to generate the second preview picture, line by line, to the corresponding positions in the corresponding line of the blurred-frame memory block.
Illustratively, in another example, the content in the first preview stream memory block includes the image data and padding content spliced together with the image data, where the image data consists of the pixel points used to generate the second preview picture, and the padding content serves only as a placeholder;
the computer program, when executed by the one or more processors, causes the electronic device to perform the step of:
the camera framework copies the image data in the first preview stream memory block together with the padding content spliced with it to the blurred-frame memory block, taking the first preview stream memory block as the copy unit.
Illustratively, in another example, the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera framework detects whether the first preview stream buffered in the first preview stream memory block is in a preset stream format;
when the first preview stream buffered in the first preview stream memory block is in the preset stream format, the camera framework performs the step of comparing the first memory alignment format with the second memory alignment format;
when the first preview stream buffered in the first preview stream memory block is not in the preset stream format, the camera framework calls back a blurred-frame copy failure to the camera application.
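The format precheck with its failure callback can be sketched as below. The preset format (NV21 here), the function name, and the callback shape are all assumptions for the example; the sketch only shows the control flow: proceed to the copy when the format matches, otherwise report failure back to the camera application.

```python
# Illustrative sketch: gate the blurred-frame copy on a preset stream format
# and report failure to the application via a callback otherwise.

PRESET_STREAM_FORMAT = "NV21"  # assumed preset format for this example

def request_blur_frame(stream_format, on_failure):
    """Return True when the copy may proceed; otherwise invoke the failure
    callback with a reason and return False."""
    if stream_format != PRESET_STREAM_FORMAT:
        on_failure("blur frame copy failed: unsupported stream format")
        return False
    return True
```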
For all relevant details of the steps in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules; details are not repeated here.
An embodiment of the present application further provides an assembly method for a camera of an electronic device, which specifically includes the following steps: when the chip adopted by the electronic device is a first chip that provides a preview stream in a first memory alignment format, the memory alignment format of the preview stream memory block allocated by the camera framework for buffering the preview stream is the first memory alignment format, and when it is detected that switching of the shooting mode of the camera is triggered, the electronic device performs the relevant method steps to implement the shooting mode switching method in the above embodiments;
when the chip adopted by the electronic device is a second chip that provides a preview stream in a second memory alignment format, the memory alignment format of the preview stream memory block allocated by the camera framework for buffering the preview stream is the second memory alignment format, and when it is detected that switching of the shooting mode of the camera is triggered, the electronic device performs the relevant method steps to implement the shooting mode switching method in the above embodiments.
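The assembly-time choice above can be modeled as a simple lookup: the alignment the framework uses for preview stream memory blocks follows the chip that supplies the preview stream. The chip names and alignment values below are invented examples, not values from the patent.

```python
# Illustrative sketch: select the preview-block memory alignment from the chip
# fitted at assembly time (names and values are assumed for the example).

CHIP_PREVIEW_ALIGNMENT = {
    "first_chip": 64,   # first chip -> first memory alignment format
    "second_chip": 16,  # second chip -> second memory alignment format
}

def preview_block_alignment(chip_name):
    """Row alignment the camera framework should use for this chip's
    preview stream memory blocks."""
    try:
        return CHIP_PREVIEW_ALIGNMENT[chip_name]
    except KeyError:
        raise ValueError(f"unknown chip: {chip_name}")
```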
An embodiment of the present application further provides a computer storage medium storing computer instructions which, when run on an electronic device/network device (e.g., OTA server, caba server), cause the electronic device/network device to perform the above related method steps to implement the shooting mode switching method in the above embodiments.
The embodiment of the present application further provides a computer program product, which when running on a computer, causes the computer to execute the above related steps to implement the method for switching the shooting mode in the above embodiment.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the switching method of the shooting mode in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are all configured to execute the corresponding method provided above, so that the beneficial effects achieved by the electronic device, the computer storage medium, the computer program product, or the chip may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and brevity of description, the division into the above functional modules is merely an example; in practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative; for instance, the division into modules or units is merely a logical function division, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The various embodiments of the present application, as well as the features within any one embodiment, may be freely combined with one another. Any such combination falls within the scope of the present application.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product; the software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (23)

1. A shooting mode switching method, applied to an electronic device, wherein a camera application is installed at an application layer of the electronic device, an application framework layer comprises a camera framework, a HAL layer comprises a camera hardware abstraction layer, and the camera framework performs data interaction with the camera application and with the camera hardware abstraction layer respectively, the method comprising:
when the application layer detects a selection operation on the camera application, acquiring, from the camera framework, a first preview stream memory block in which a first preview stream is buffered, generating a first preview picture according to the first preview stream buffered in the first preview stream memory block, and displaying the first preview picture on a display interface, wherein the first preview stream is uploaded by the camera hardware abstraction layer, and the memory alignment format of the first preview stream is a first memory alignment format;
when the application layer detects a switching operation to any shooting mode in the camera application, sending a blurred-frame request carrying preset canvas information to the camera framework, wherein the preset canvas information comprises a second memory alignment format;
the camera framework allocates a blurred-frame memory block in the second memory alignment format according to the preset canvas information, copies the image data in the first preview stream memory block to the blurred-frame memory block line by line, and sends the blurred-frame memory block to the camera application after the copying is completed, wherein the width of the memory space used by the blurred-frame memory block to buffer the image data is the same as the width of the memory space used by the first preview stream memory block to buffer the image data;
during the switching of the shooting mode, the camera application blurs the image data in the blurred-frame memory block to obtain a second preview picture, and replaces the first preview picture displayed on the display interface with the second preview picture.
2. The method according to claim 1, wherein when the application layer detects the selection operation on the camera application, acquiring, from the camera framework, the first preview stream memory block in which the first preview stream is buffered comprises:
when the application layer detects the selection operation on the camera application, sending a request for opening the camera to the camera framework;
the camera framework sends a configuration stream to the camera hardware abstraction layer according to the request for opening the camera;
the camera hardware abstraction layer opens the camera according to the configuration stream and uploads the first preview stream in the first memory alignment format to the camera framework;
the camera framework allocates, according to the first memory alignment format, the first preview stream memory block in the first memory alignment format for the first preview stream, buffers the first preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the first preview stream is buffered to the camera application.
3. The method according to claim 1, wherein after the application layer detects the selection operation on the camera application, acquires, from the camera framework, the first preview stream memory block in which the first preview stream is buffered, generates the first preview picture according to the first preview stream buffered in the first preview stream memory block, and displays the first preview picture on the display interface, the method further comprises:
the camera application returns the first preview stream memory block to the camera framework after the first preview stream is taken out, so that the camera framework buffers a second preview stream into the first preview stream memory block after receiving the second preview stream uploaded by the camera hardware abstraction layer.
4. The method according to claim 1, wherein during the switching of the shooting mode, after the camera application blurs the image data in the blurred-frame memory block to obtain the second preview picture and replaces the first preview picture displayed on the display interface with the second preview picture, the method further comprises:
after switching to the selected shooting mode, the camera application acquires, from the camera framework, a second preview stream memory block in which a second preview stream is buffered, generates a third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture, wherein the second preview stream is uploaded by the camera hardware abstraction layer after the shooting mode is switched to the selected shooting mode, and the memory alignment format of the second preview stream is the first memory alignment format.
5. The method according to claim 4, wherein after switching to the selected shooting mode, the method further comprises:
when the camera framework receives the second preview stream in the first memory alignment format uploaded by the camera hardware abstraction layer, detecting whether a reusable first preview stream memory block exists;
when a reusable first preview stream memory block exists, the camera framework buffers the second preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates the third preview picture according to the second preview stream buffered in the first preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture;
when no reusable first preview stream memory block exists, the camera framework allocates, according to the first memory alignment format, a second preview stream memory block in the first memory alignment format for the second preview stream, buffers the second preview stream into the second preview stream memory block, and sends the second preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates the third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture.
6. The method according to any one of claims 1 to 5, wherein before the camera framework copies the image data in the first preview stream memory block to the blurred-frame memory block line by line, the method further comprises:
the camera framework compares the first memory alignment format with the second memory alignment format;
when the first memory alignment format is different from the second memory alignment format, the camera framework performs the step of copying the image data in the first preview stream memory block to the blurred-frame memory block line by line;
when the first memory alignment format is the same as the second memory alignment format, the camera framework copies the entire content of the first preview stream memory block to the blurred-frame memory block in one operation, taking the first preview stream memory block as the copy unit.
7. The method according to claim 6, wherein the content in the first preview stream memory block comprises the image data and padding content spliced together with the image data, the image data consists of the pixel points used to generate the second preview picture, and the padding content serves only as a placeholder;
the camera framework copying the image data in the first preview stream memory block to the blurred-frame memory block line by line comprises:
the camera framework copies, based on a YUV color coding method, the pixel points of each line in the first preview stream memory block that are used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred-frame memory block.
8. The method according to claim 7, wherein the camera framework copying, based on the YUV color coding method, the pixel points of each line in the first preview stream memory block that are used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred-frame memory block comprises:
the camera framework determines, based on the YUV color coding method, the values of the Y, U, and V channels corresponding to each position of each line in the first preview stream memory block;
the camera framework distinguishes, according to changes in the values of the Y, U, and V channels, the image data of each line, namely the pixel points used to generate the second preview picture, from the padding content used as a placeholder;
the camera framework copies the pixel points of each line in the first preview stream memory block that are used to generate the second preview picture, line by line, to the corresponding positions in the corresponding line of the blurred-frame memory block.
9. The method according to claim 8, wherein the content in the first preview stream memory block comprises the image data and padding content spliced together with the image data, the image data consists of the pixel points used to generate the second preview picture, and the padding content serves only as a placeholder;
the camera framework copying the entire content of the first preview stream memory block to the blurred-frame memory block, taking the first preview stream memory block as the copy unit, comprises:
the camera framework copies the image data in the first preview stream memory block together with the padding content spliced with it to the blurred-frame memory block, taking the first preview stream memory block as the copy unit.
10. The method according to claim 6, wherein before the camera framework compares the first memory alignment format with the second memory alignment format, the method further comprises:
the camera framework detects whether the first preview stream buffered in the first preview stream memory block is in a preset stream format;
when the first preview stream buffered in the first preview stream memory block is in the preset stream format, the camera framework performs the step of comparing the first memory alignment format with the second memory alignment format;
when the first preview stream buffered in the first preview stream memory block is not in the preset stream format, the camera framework calls back a blurred-frame copy failure to the camera application.
11. An electronic device, wherein a camera application is installed at an application layer of the electronic device, an application framework layer comprises a camera framework, a HAL layer comprises a camera hardware abstraction layer, and the camera framework performs data interaction with the camera application and with the camera hardware abstraction layer respectively, the electronic device further comprising:
one or more processors;
a memory;
and one or more computer programs, wherein the one or more computer programs are stored in the memory and, when executed by the one or more processors, cause the electronic device to perform the steps of:
when the application layer detects a selection operation on the camera application, acquiring, from the camera framework, a first preview stream memory block in which a first preview stream is buffered, generating a first preview picture according to the first preview stream buffered in the first preview stream memory block, and displaying the first preview picture on a display interface, wherein the first preview stream is uploaded by the camera hardware abstraction layer, and the memory alignment format of the first preview stream is a first memory alignment format;
when the application layer detects a switching operation to any shooting mode in the camera application, sending a blurred-frame request carrying preset canvas information to the camera framework, wherein the preset canvas information comprises a second memory alignment format;
the camera framework allocates a blurred-frame memory block in the second memory alignment format according to the preset canvas information, copies the image data in the first preview stream memory block to the blurred-frame memory block line by line, and sends the blurred-frame memory block to the camera application after the copying is completed, wherein the width of the memory space used by the blurred-frame memory block to buffer the image data is the same as the width of the memory space used by the first preview stream memory block to buffer the image data;
during the switching of the shooting mode, the camera application blurs the image data in the blurred-frame memory block to obtain a second preview picture, and replaces the first preview picture displayed on the display interface with the second preview picture.
12. The electronic device of claim 11, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
when the application layer detects the selection operation on the camera application, sending a request for opening the camera to the camera framework;
the camera framework sends a configuration stream to the camera hardware abstraction layer according to the request for opening the camera;
the camera hardware abstraction layer opens the camera according to the configuration stream and uploads the first preview stream in the first memory alignment format to the camera framework;
the camera framework allocates, according to the first memory alignment format, the first preview stream memory block in the first memory alignment format for the first preview stream, buffers the first preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the first preview stream is buffered to the camera application.
13. The electronic device of claim 11, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the step of:
the camera application returns the first preview stream memory block to the camera framework after the first preview stream is taken out, so that the camera framework buffers a second preview stream into the first preview stream memory block after receiving the second preview stream uploaded by the camera hardware abstraction layer.
14. The electronic device of claim 11, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the step of:
after switching to the selected shooting mode, the camera application acquires, from the camera framework, a second preview stream memory block in which a second preview stream is buffered, generates a third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture, wherein the second preview stream is uploaded by the camera hardware abstraction layer after the shooting mode is switched to the selected shooting mode, and the memory alignment format of the second preview stream is the first memory alignment format.
15. The electronic device of claim 14, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
when the camera framework receives the second preview stream in the first memory alignment format uploaded by the camera hardware abstraction layer, detecting whether a reusable first preview stream memory block exists;
when a reusable first preview stream memory block exists, the camera framework buffers the second preview stream into the first preview stream memory block, and sends the first preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates a third preview picture according to the second preview stream buffered in the first preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture;
when no reusable first preview stream memory block exists, the camera framework allocates, according to the first memory alignment format, a second preview stream memory block in the first memory alignment format for the second preview stream, buffers the second preview stream into the second preview stream memory block, and sends the second preview stream memory block in which the second preview stream is buffered to the camera application, so that the camera application generates the third preview picture according to the second preview stream buffered in the second preview stream memory block, and replaces the second preview picture displayed on the display interface with the third preview picture.
16. The electronic device of any of claims 11-15, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera frame compares the first memory alignment format with the second memory alignment format;
when the first memory alignment format is different from the second memory alignment format, executing a step of copying the image data in the first preview in-stream memory block to the blurred frame in-stream memory block in a line-by-line manner by the camera frame;
when the first memory alignment format is the same as the second memory alignment format, the camera frame copies the entire content in the first preview memory chunk to the blurred frame memory chunk with the first preview memory chunk as a unit.
17. The electronic device according to claim 16, wherein the content in the first preview in-stream memory chunk includes the image data and an extended content spliced together with the image data, the image data is a pixel point for generating the second preview screen, and the extended content is used for placeholder;
the computer programs, when executed by the one or more processors, cause the electronic device to perform the steps of:
and the camera frame copies the pixel points, which are used for generating the second preview picture, of each line in the first preview in-stream memory block to the corresponding position in the corresponding line of the fuzzy in-stream memory block on the basis of a YUV color coding method.
18. The electronic device of claim 17, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera framework determines, based on a YUV color coding method, the values of the Y, U, and V channels at each position of each line in the first preview stream memory block;
the camera framework distinguishes, in each line, the image data pixel points used to generate the second preview picture from the placeholder extended content according to changes in the values of the Y, U, and V channels;
and the camera framework copies, one by one, the pixel points of each line in the first preview stream memory block that are used to generate the second preview picture to the corresponding positions in the corresponding line of the blurred frame memory block.
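The distinction in claim 18 can be sketched on a single Y-plane row. This is a heavily simplified sketch under an assumption the claims do not state, namely that the placeholder padding holds a constant filler value, so the image/padding boundary appears where the channel values stop changing for the rest of the row; the function name and byte values are illustrative:

```python
# Simplified sketch: find where image pixels end and placeholder padding
# begins in one row, by walking back over the constant filler tail.
# Assumption (illustrative, not from the claims): padding bytes are a
# single repeated value, while image bytes vary.

def image_extent(row: bytes) -> int:
    """Return the number of leading bytes that are image data."""
    end = len(row)
    # Walk back from the end while the tail holds one constant filler value.
    while end > 1 and row[end - 1] == row[-1]:
        end -= 1
    # If the whole row is constant, treat it all as image data.
    return len(row) if end == 1 and row[0] == row[-1] else end

row = bytes([10, 12, 11, 13, 9, 0, 0, 0])  # 5 image bytes + 3 padding bytes
assert image_extent(row) == 5
```

A real implementation would apply this per plane of the YUV layout and copy only the detected image extent of each row; the sketch shows just the boundary detection.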
19. The electronic device according to claim 18, wherein the content in the first preview stream memory block includes the image data and extended content concatenated with the image data, the image data being the pixel points used to generate the second preview picture and the extended content serving as a placeholder;
the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera framework copies the image data in the first preview stream memory block, together with the extended content concatenated with it, to the blurred frame memory block in units of the whole first preview stream memory block.
20. The electronic device of claim 16, wherein the computer program, when executed by the one or more processors, causes the electronic device to perform the steps of:
the camera framework detects whether the first preview stream buffered in the first preview stream memory block is in a preset stream format;
when the first preview stream buffered in the first preview stream memory block is in the preset stream format, the camera framework executes the step of comparing the first memory alignment format with the second memory alignment format;
and when the first preview stream buffered in the first preview stream memory block is not in the preset stream format, the camera framework calls back a blurred frame copy failure to the camera application.
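The gating in claim 20 amounts to a format check before the copy path runs, with a failure callback to the application otherwise. A minimal sketch, in which the preset format value and the callback name are assumed placeholders:

```python
# Sketch of the claim-20 gate: proceed to the alignment comparison only
# when the buffered stream is in the preset format; otherwise call back
# a blurred frame copy failure to the camera application.

PRESET_FORMAT = "NV21"   # assumed preset stream format, for illustration only

def try_copy_blurred_frame(stream_format: str, on_failure) -> bool:
    if stream_format != PRESET_FORMAT:
        # Not the preset format: report the copy failure back to the app.
        on_failure("blurred frame copy failed: unexpected stream format")
        return False
    # Preset format: continue to compare the memory alignment formats and copy.
    return True

errors = []
assert try_copy_blurred_frame("NV21", errors.append) is True
assert try_copy_blurred_frame("RGBA", errors.append) is False
assert len(errors) == 1
```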
21. A camera adaptation method for an electronic device, comprising:
when the chip adopted by the electronic device is a first chip that provides a preview stream in a first memory alignment format, the memory alignment format of the preview stream memory blocks allocated by the camera framework for buffering the preview stream is the first memory alignment format, and when it is detected that switching of the shooting mode of the camera is triggered, the electronic device is caused to execute the method for switching the shooting mode according to any one of claims 1 to 10;
when the chip adopted by the electronic device is a second chip that provides a preview stream in a second memory alignment format, the memory alignment format of the preview stream memory blocks allocated by the camera framework for buffering the preview stream is the second memory alignment format, and when it is detected that switching of the shooting mode of the camera is triggered, the electronic device is caused to execute the method for switching the shooting mode according to any one of claims 1 to 10;
wherein the first memory alignment format is different from the second memory alignment format.
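Claim 21 reduces to selecting the preview block alignment format from the chip that supplies the preview stream. An illustrative sketch in which the chip names and alignment values are hypothetical:

```python
# Sketch of claim 21: the alignment format of the preview stream memory
# blocks follows the chip providing the preview stream. Chip identifiers
# and alignment values below are made-up placeholders.

CHIP_ALIGNMENT = {
    "chip_a": 64,    # first chip: first memory alignment format
    "chip_b": 128,   # second chip: second memory alignment format
}

def preview_block_alignment(chip: str) -> int:
    # The camera framework allocates preview stream memory blocks in the
    # alignment format native to the chip's preview stream.
    return CHIP_ALIGNMENT[chip]

assert preview_block_alignment("chip_a") != preview_block_alignment("chip_b")
```

The point of the design is that the same mode-switching method (claims 1 to 10) runs unchanged on either chip, because the copy path already handles both matching and differing alignment formats.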
22. A chip, comprising: one or more interface circuits and one or more processors; the interface circuit is configured to receive signals from a memory of an electronic device and to transmit the signals to the processor, the signals including computer instructions stored in the memory; the computer instructions, when executed by the processor, cause the electronic device to perform the method of switching shooting modes of any one of claims 1 to 10.
23. A computer-readable storage medium comprising a computer program, characterized in that when the computer program is run on an electronic device, the electronic device is caused to execute the method of switching the shooting mode according to any one of claims 1 to 10.
CN202110910375.1A 2021-08-09 2021-08-09 Shooting mode switching method and electronic equipment Active CN113766120B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110910375.1A CN113766120B (en) 2021-08-09 2021-08-09 Shooting mode switching method and electronic equipment


Publications (2)

Publication Number Publication Date
CN113766120A CN113766120A (en) 2021-12-07
CN113766120B true CN113766120B (en) 2022-08-09

Family

ID=78788837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110910375.1A Active CN113766120B (en) 2021-08-09 2021-08-09 Shooting mode switching method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113766120B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116723416B (en) * 2022-10-21 2024-04-02 荣耀终端有限公司 Image processing method and electronic equipment
CN116680001A (en) * 2022-12-20 2023-09-01 荣耀终端有限公司 Starting method of camera application, readable storage medium and electronic equipment
CN117135259A (en) * 2023-04-11 2023-11-28 荣耀终端有限公司 Camera switching method and electronic equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103716535A (en) * 2013-12-12 2014-04-09 乐视致新电子科技(天津)有限公司 Method for switching photographing mode, and electronic device
CN107465875A (en) * 2017-09-13 2017-12-12 北京元心科技有限公司 Camera preview data cache method and device
WO2019071548A1 (en) * 2017-10-13 2019-04-18 深圳传音通讯有限公司 Terminal photograph capturing control method, mobile terminal, and readable storage medium
CN110113526A (en) * 2019-04-22 2019-08-09 联想(北京)有限公司 Processing method, processing unit and electronic equipment

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US10063776B2 (en) * 2015-05-01 2018-08-28 Gopro, Inc. Camera mode control
CN112492193B (en) * 2019-09-12 2022-02-18 华为技术有限公司 Method and equipment for processing callback stream



Similar Documents

Publication Publication Date Title
CN113766120B (en) Shooting mode switching method and electronic equipment
CN114071197B (en) Screen projection data processing method and device
CN114461375B (en) Memory resource management method and electronic equipment
CN115526787B (en) Video processing method and device
CN115564659B (en) Video processing method and device
CN114466134A (en) Method and electronic device for generating HDR image
US20230335081A1 (en) Display Synchronization Method, Electronic Device, and Readable Storage Medium
WO2023125518A1 (en) Image encoding method and device
CN115665342B (en) Image processing method, image processing circuit, electronic device, and readable storage medium
WO2023016059A1 (en) Data transmission control method and related apparatus
CN110365962B (en) Color gamut conversion processing method and device and electronic equipment
CN110378973B (en) Image information processing method and device and electronic equipment
CN108881999B (en) Screen capture processing method and system
CN111131019A (en) Multiplexing method and terminal for multiple HTTP channels
CN117082295B (en) Image stream processing method, device and storage medium
CN116048323B (en) Image processing method and electronic equipment
CN115802144B (en) Video shooting method and related equipment
CN117278864B (en) Image capturing method, electronic device, and storage medium
CN117076284B (en) Page loading time length detection method, equipment and storage medium
CN116028383B (en) Cache management method and electronic equipment
CN114327317B (en) Mirror image screen projection method, device and system
CN117135468A (en) Image processing method and electronic equipment
CN117695626A (en) Game data identification method, equipment and storage medium
CN115460343A (en) Image processing method, apparatus and storage medium
CN116723264A (en) Method, apparatus and storage medium for determining target location information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant