CN116048379A - Data recharging method and device

Data recharging method and device

Publication number: CN116048379A
Authority: CN (China)
Legal status: Granted
Application number: CN202210758524.1A
Other languages: Chinese (zh)
Other versions: CN116048379B (English)
Inventor: 许集润
Assignee: Honor Device Co., Ltd.
Application filed by Honor Device Co., Ltd.
Priority to CN202210758524.1A
Publication of CN116048379A
Application granted
Publication of CN116048379B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 Interfaces specially adapted for storage systems
    • G06F3/0628 Interfaces specially adapted for storage systems making use of a particular technique
    • G06F3/0638 Organizing or formatting or addressing of data
    • G06F3/0643 Management of files
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 Interfaces specially adapted for storage systems
    • G06F3/0602 Interfaces specially adapted for storage systems specifically adapted to achieve a particular effect
    • G06F3/0614 Improving the reliability of storage systems
    • G06F3/0619 Improving the reliability of storage systems in relation to data integrity, e.g. data losses, bit errors


Abstract

The application provides a data recharging method and device, applied to an electronic device, where the method includes: acquiring image data in response to a shooting operation after a read instruction has been received; reading pre-stored historical image processing data from the electronic device in response to the read instruction; replacing target data with the historical image processing data, where the target data is at least one of the image data and the processing result data of the image data; and displaying an image obtained based on the historical image processing data. Because the historical image processing data replaces the target data after the image data is acquired, recharging of the data is realized, laying a foundation for functions such as fault localization and optimization based on recharged image data.

Description

Data recharging method and device
Technical Field
The present disclosure relates to the field of electronic information technologies, and in particular, to a data recharging method and apparatus.
Background
Electronic devices inevitably acquire, call, process, or store data while providing various functions. Data recharging can be understood as follows: data that has already been acquired or called by at least one data processing module is transmitted to that data processing module again for processing.
Image processing is a common function in electronic devices, such as beautifying a photograph taken with a camera or performing antireflection processing on an image in the gallery. Image data can be understood as the data involved in image processing functions such as photographing, beautifying, and image editing. Recharging image data is valuable for the modules involved in an image processing function, for example for testing and improving the camera.
Disclosure of Invention
The application provides a data recharging method and device, and aims to solve the problem of how to achieve recharging of image data.
In order to achieve the above object, the present application provides the following technical solutions:
A first aspect of the present application provides a data recharging method, applied to an electronic device, the method including: acquiring image data in response to a shooting operation after a read instruction has been received; reading pre-stored historical image processing data from the electronic device in response to the read instruction; replacing target data with the historical image processing data, where the target data is at least one of the image data and the processing result data of the image data; and displaying an image obtained based on the historical image processing data. Because the historical image processing data replaces the target data after the image data is acquired, recharging of the data is realized, laying a foundation for functions such as fault localization and optimization based on recharged image data.
In some implementations, the electronic device is connected to the control device, and the read instruction is obtained from the control device, so as to embed a data recharging function in a flow of implementing the image processing function by the electronic device.
In some implementations, the historical image processing data is stored in the electronic device according to a classification dimension that includes at least one of: the acquisition time of the image data, the frame identification of the image data, and the identification of the module that processes the image data. The aim is flexible recharging of data along different dimensions.
In some implementations, replacing the target data with the historical image processing data includes: replacing the data indicated by a preset classification dimension in the image data with historical image data of that preset classification dimension, where the preset classification dimension includes a preset time, a preset frame number, and the identification of a preset image processing module, and the data indicated by the preset classification dimension includes the data of the preset frame number of the preset image processing module. In this way, each image processing module is responsible for recharging the data it is to process, which offers good feasibility and flexibility.
In some implementations, a first image processing platform and a second image processing platform run in the electronic device, the second image processing platform includes at least one image processing module, and the historical image processing data includes first image processing platform data and second image processing platform data. Replacing the target data with the historical image processing data includes: at least one image processing module replacing the target data with the historical image processing data. Because the first image processing platform is the platform adapted to the image processing module in the chip, the chip's image processing function can be accommodated while image processing and data recharging are carried out based on the second image processing platform.
In some implementations, replacing the target data with the historical image processing data includes: assembling the first image processing platform data and the second image processing platform data to obtain assembled data, and replacing the target data with the assembled data. Assembling first and then replacing helps ensure the accuracy and feasibility of the data replacement.
In some implementations, the first image processing platform data and the second image processing platform data are both stored according to a classification dimension, the classification dimension including at least one of a time of acquisition of the image data, a frame identification of the image data, and an identification of a module that processes the image data, with the objective of achieving flexible recharging of the data in different dimensions.
In some implementations, the storing of the historical image processing data includes: acquiring the historical image processing data in response to a shooting operation after a storage instruction has been received; an image processing module of the second image processing platform storing, in response to the storage instruction, the second image processing platform data within the historical image processing data; the image processing module transmitting the first image processing platform data to the first image processing platform; and the first image processing platform storing the first image processing platform data. Because the two platforms each store their own data, the second image processing platform is guaranteed not to process the first image processing platform's data, which simplifies its processing logic, while the first image processing platform's data is still stored; this preserves the integrity of the image data and enables functions based on the complete image data.
In some implementations, the second image processing platform data includes the processing object data and the processing result data of the image processing module, which helps serve a variety of data recharging needs.
In some implementations, assembling the first image processing platform data and the second image processing platform data to obtain the assembled data includes: the target image processing module reads the second image processing platform data of a preset frame number, acquired at a first moment and stored by the target image processing module, where the target image processing module is any one of the image processing modules; the target image processing module receives the first image processing platform data, acquired at the first moment and stored via the target image processing module, transmitted by the first image processing platform; and the target image processing module assembles the two to obtain the assembled data of the target image processing module. The first image processing platform and the second image processing platform each read the data of their own platform, and the assembly is performed by the second image processing platform; image processing is thus carried out on the second image processing platform in a manner compatible with the chip-adapted framework based on the first image processing platform.
Replacing the target data with the assembled data includes: replacing the data of the preset frame number, acquired at a second moment and obtained by the target image processing module, with the assembled data, where the first moment is earlier than the second moment. It can be seen that previously stored historical image processing data replaces image data acquired later, along the dimensions of acquisition time, frame number, and processing module, giving high practicability and flexibility.
In some implementations, the second image processing platform further includes a perception decision module, and the method provided by the first aspect further includes: the perception decision module transmitting scene information and first time information to the image processing module based on the storage instruction, where the scene information indicates the image processing modules that receive the instruction and their processing order, and the first time information indicates the acquisition time of the stored data and is used for classified storage of that data. It can be seen that the perception decision module lays a foundation for the image processing modules to recharge data.
In some implementations, the method provided in the first aspect further includes: the perception decision module transmitting scene information and first time information to the image processing module based on the read instruction, where the first time information indicates the acquisition time of the data to be read. It can be seen that the perception decision module triggers the data recharging of the image processing module and provides the basis information for realizing the recharging.
In some implementations, the method provided in the first aspect further includes: the perception decision module transmits scene information and time information to the first image processing platform based on the reading instruction so as to lay a foundation for the first image processing platform to realize recharging of the first image processing platform data.
In some implementations, the method provided in the first aspect further includes: the perception decision module reading the first time information from a preset file of the electronic device, which offers high flexibility and practicability.
In some implementations, the second image processing platform further includes an event request processing module, and the method provided in the first aspect further includes: the event request processing module triggering, in response to shooting event information, the first image processing platform to acquire image data; and the event request processing module transmitting, in response to an instruction, shooting mode information, the instruction, and the image data, where the instruction includes a storage instruction or a read instruction. It can be seen that the event request processing module, as the functional module that receives instructions and obtains image data, lays a foundation for realizing data recharging.
A second aspect of the present application provides an electronic device comprising: the system comprises one or more processors, one or more memories, the memories storing one or more programs, which when executed by the processors, cause the electronic device to perform the data recharging method provided in the first aspect of the application.
A third aspect of the present application provides a computer readable storage medium having a computer program stored therein, which when executed by a processor causes the processor to perform the data recharging method provided in the first aspect of the present application.
A fourth aspect of the present application provides a computer program product comprising: computer program code which, when run on an electronic device, causes the electronic device to perform the data recharging method provided in the first aspect of the present application.
Drawings
FIG. 1 is an example of a scene taken with a camera of an electronic device;
FIG. 2 is an example of a framework for an electronic device to implement image processing functions;
fig. 3 is an example of an implementation scenario of the data recharging method disclosed in the embodiments of the present application;
FIG. 4 is a flow chart of a storage phase in a data recharging method disclosed in an embodiment of the present application;
FIG. 5 is a flow chart of a reading phase in the data recharging method disclosed in the embodiment of the present application;
FIG. 6 is an example of interface effects achieved by the data recharging method disclosed in the embodiments of the present application;
FIG. 7 is a flow chart of a further reading phase in the data recharging method disclosed in the embodiment of the present application;
Fig. 8 is a diagram illustrating a structure of an electronic device to which the data recharging method disclosed in the embodiment of the present application is applied.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the drawings in the embodiments of the present application. The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification and the appended claims, the singular forms "a," "an," and "the" are intended to include expressions such as "one or more," unless the context clearly indicates otherwise. It should also be understood that in the embodiments of the present application, "one or more" means one, two, or more than two; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
"A plurality of" in the embodiments of the present application means two or more. It should be noted that, in the description of the embodiments of the present application, terms such as "first" and "second" are used only to distinguish between descriptions and are not to be understood as indicating or implying relative importance or a sequential order.
Fig. 1 is an example of a scene of a building photographed using a camera of an electronic device:
After the camera in the electronic device is started, the interface A1 shown in fig. 1 is displayed. A preview image of the data currently collected by the camera is displayed in the preview window B of the camera in interface A1; in fig. 1, a preview image of a building with a glass facade is taken as an example. Because the surface of the building is glass, the preview image has a reflective (also called reflection) area B1.
After the shooting control 1 in interface A1 is pressed, the camera jumps to interface A2. It should be noted that after the image of the building is captured, the preview window B is still displayed in the camera window, and the camera is still aimed at the building in interface A2, so the preview window B shows a preview image of the data currently collected by the camera.
The thumbnail 2 of the captured image of the building is also displayed in interface A2. It will be understood that after thumbnail 2 is clicked, the electronic device jumps to the editing interface A3 of the captured image of the building. From the image of the building displayed in interface A3, it can be seen that the electronic device performed antireflection processing on the preview image that had the reflection area, and the resulting image has no reflection area.
As can be seen from fig. 1, the camera performs optimization processing on the image during the image capturing process, so as to obtain an image with better quality.
It is understood that at least one image processing module configured in the electronic device implements the image processing function. Fig. 2 is an example of a framework of an electronic device implementing image processing functions. Taking the Android operating system running in the electronic device as an example: a layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with one another through software interfaces.
In fig. 2, the application layer runs a series of application packages. Examples of applications relevant to embodiments of the present application include cameras and gallery.
Below the application layer is the application framework layer. The application framework layer provides an application programming interface (API) and a programming framework for the applications of the application layer; for example, the application framework layer includes the Camera Service.
Below the application framework layer is the hardware abstraction layer (HAL). The HAL abstracts the hardware: it hides the hardware interface details of a specific platform and provides a virtual hardware platform to the operating system, so that the operating system is hardware-independent and can be ported to various platforms.
The modules running on the HAL in connection with embodiments of the present application include a first image processing platform and a second image processing platform.
The first image processing platform is used for being matched with an image acquisition and processing module in the chip. The first image processing platform comprises a first data storage module, and the first data storage module is used for storing image data. The second image processing platform is used for realizing the image processing function.
As can also be seen from fig. 2, a common processing module is further arranged between the first image processing platform and the second image processing platform. The common processing module may be understood as an interface between the two platforms: data that the image processing modules in the second platform cannot parse and process (or do not process), such as metadata in the data collected by the camera, is processed, e.g. parsed, by the common processing module and then transmitted to the first data storage module. The first data storage module stores the received data into the storage space.
It will be appreciated that the image processing modules of different types of chips use different data types, so the first image processing platforms adapted to those modules also differ in data type. Therefore, in some implementations, docking modules (e.g., interfaces) for multiple data types are provided in the common processing module, so that the second image processing platform shown in fig. 2 can be adapted to multiple types of chips.
Below the HAL is a kernel layer, below which is a hardware layer. The kernel layer is a layer between hardware and software. Referring to fig. 2, in an application scenario of the present application, the kernel layer at least includes a display driver and a camera driver. The hardware layer of the electronic equipment comprises a memory, a hard disk, a camera and other hardware.
Based on the framework shown in fig. 2, the flow of capturing an image by the electronic device is shown in fig. 3, and the method comprises the following steps:
s101, the camera responds to shooting operation and transmits shooting event information to a camera service.
An example of the photographing operation is that the photographing control 1 shown in fig. 1 is pressed (clicked).
It can be understood that the shooting operation can be performed in a photo scene or in a video scene; in either case, the shooting control is clicked.
The shooting event message includes shooting mode information. The shooting mode information indicates the selected shooting mode, and it is understood that a shooting mode corresponds to a flow (type) by which the image data is to be processed. For example, the shooting mode information may include a beauty function flag, indicating that a beauty-processed image is to be obtained; or it may include an AI mode flag, indicating that, when a reflection area exists in the image, an image with the reflection area removed is to be obtained.
S102, the camera service transmits shooting event information to the event request processing module.
It will be appreciated that the camera service transmits the shooting event message to the event request processing module based on a protocol and interface between the application layer and the HAL layer, etc.
S103, the event request processing module responds to the shooting event message and transmits an image data acquisition message to the first image processing platform.
It will be appreciated that because the first image processing platform is adapted to the image data processing module of the chip, the second image processing platform needs to obtain image data from the first image processing platform.
In some implementations, the event request processing module parses the shooting event message to obtain shooting mode information.
S104, the first image processing platform responds to the image data acquisition message and transmits an image data acquisition instruction to the camera driver of the kernel layer.
S105, the camera driver responds to the image acquisition instruction and transmits the image data acquisition instruction to the camera.
S106, the camera transmits the collected image data to the first image processing platform.
It may be appreciated that the camera may transmit the collected image data to the first image processing platform directly or through other modules such as the camera driver; this is not detailed here.
S107, the first image processing platform transmits the image data to the event request processing module.
It will be appreciated that in some implementations, the event request processing module performs format conversion or the like on the received image data or the like.
It will be appreciated that the modules in the first image processing platform that execute S104 and S107 are not limited, but are summarized using the "first image processing platform".
S108, the event request processing module transmits the event processing message and the image data to the perception decision module.
The event processing message includes shooting mode information.
S109, the perception decision module decides scene information and generates image frames based on the event processing message, and transmits the scene information and the image frames.
The scene information includes path information and a frame number.
The path can be understood as: at least one image processing module that implements the image processing function (flow) represented by the shooting mode information, together with the processing order of that at least one image processing module. It will be appreciated that a path consists of at least one image processing module, and one path is used to implement one image processing function.
Taking the antireflection function as an example, and assuming the antireflection processing is implemented by the reflection area detection module (i.e., the image processing module 1) and the antireflection processing module (i.e., the image processing module N), the path for implementing the antireflection processing is the image processing module 1 followed by the image processing module N. It is understood that different paths may multiplex at least one image processing module.
In some implementations, the path information is a sequence of identifiers of at least one image processing module. Continuing the antireflection example, assuming the identifier of the image processing module 1 is 1 and the identifier of the image processing module N is N, the path information implementing the antireflection function is the sequence 1, N.
The frame number may be understood as the number of image frames required to implement the image processing function represented by the photographing mode information.
It will be appreciated that the perception decision module, after deciding on the frame number M (M being an integer greater than zero), generates M image frames (i.e., a first frame image through an M-th frame image) based on the received image data. For example, a matrix serving as the image data is taken as the first frame image, and the product of that matrix and a first numerical value is taken as the second frame image.
In some implementations, the perception decision module decides the frame number based on a pre-configured correspondence between shooting mode and frame number; in other implementations, it determines the frame number based on learned characteristics of the shooting mode.
It can be appreciated that the perception decision module, after deciding the scene information and generating the image frames, transmits the scene information and the first frame image to the first image processing module in the path.
In fig. 2, M is 1, and the path information is the sequence 1, N.
S110, the image processing module 1 processes the first frame image to obtain the processing result of the image processing module 1, and transmits that processing result and the scene information to the image processing module N.
The image processing module 1 can learn that the next image processing module is the image processing module N based on the scene information.
S111, the image processing module 1 transmits the first image processing platform data to the common processing module.
The first image processing platform data may be understood as data that is processed by the first image processing platform and that the second image processing platform does not participate in processing or cannot process; one example of first image processing platform data is metadata.
S112, the image processing module N processes the processing result of the image processing module 1 to obtain the processing result of the image processing module N, and the processing result of the image processing module N is transmitted to the gallery.
It will be appreciated that the image processing module N knows, based on the scene information, that it is the last image processing module, and thus transmits the processing result to the gallery.
It can be understood that, in the case where M is greater than 1, taking M equal to 2 as an example, the perception decision module first transmits the first frame image to the image processing module 1. As described in S110 and S112, the image processing module 1 transmits the processing result 1 of the first frame to the image processing module N, and the image processing module N processes it to obtain the processing result N of the first frame.
The perception decision module then transmits the second frame image to the image processing module 1. The image processing module 1 processes the second frame image to obtain the processing result 1 of the second frame and transmits it to the image processing module N; the image processing module N processes it to obtain the processing result N of the second frame, and fuses the processing result N of the first frame with the processing result N of the second frame to obtain the processing result of the image processing module N. It will be appreciated that the image processing module N knows from the frame number how many processing results are to be fused.
S113, the image processing module N transmits the first image processing platform data to the common processing module.
And S114, the gallery stores the processing result of the image processing module N in a hard disk.
It should be understood that the processing result of the image processing module N may be stored in a memory, which is not limited herein.
Referring to fig. 1, the processing result of the image processing module N is the image shown in interface A3 of fig. 1, with the reflection area removed. After S112 is executed, thumbnail 2 is displayed in interface A2, and the user can view the captured image, as shown in interface A3, by clicking thumbnail 2 to enter the gallery.
S115, the common processing module transmits the first image processing platform data to the first data storage module.
S116, the first data storage module stores the first image processing platform data into the memory.
It will be appreciated that it may also be stored to the hard disk.
Based on the framework and flow shown in fig. 2, it can be appreciated that a developer may need to recharge historical data to at least one image processing module, driven by the requirements of fault localization and optimization testing.
For example, for an image of a building obtained as shown in fig. 1, suppose the reflection area is removed but the image after antireflection processing has a green edge. In this case, the developer needs to locate the faulty module, i.e., the image processing module that causes the green edge.
For another example, when an image is captured using the camera's beauty function, the color of the face area is too dark. The developer locates the smoothing module as the cause of the darkened skin color and optimizes the parameters of the smoothing module. In this case, the optimization effect of the optimized module needs to be verified, so the same image must be processed by the module before and after optimization; and because the external environment is uncontrollable, the image data processed by the module before optimization needs to be recharged to the module after optimization.
In order to solve the above-mentioned problems, an embodiment of the present application discloses a data recharging method, which is aimed at recharging image data to at least meet the above-mentioned needs.
Fig. 3 is an example of an implementation scenario of the data recharging method disclosed in the embodiment of the present application:
Taking the electronic device as mobile phone A as an example: mobile phone A is connected to computer B in a wired (or wireless) manner, and a command line is entered on computer B to start the data recharging method provided by the embodiment of the present application.
Based on the framework and the flow shown in fig. 2, the data recharging method disclosed in the embodiment of the present application is divided into two phases, a first phase is a storage phase, and a second phase is a reading phase.
These two phases will be described in detail below.
Fig. 4 is a flow chart of the storage phase. It can be understood that, compared with fig. 2, since data recharging is implemented mainly by improving the second image processing platform, the application framework layer, the kernel layer, and some of the image data transmission steps are not drawn in fig. 4; all of them can be seen in fig. 2.
The flow of the storage phase comprises the following steps:
s201, the event request processing module receives a storage instruction.
In some implementations, as shown in fig. 3, the storage instruction is transmitted by computer B to mobile phone A in response to an entered command line. The specific content of the command line is not described here.
S202, the event request processing module receives shooting event information.
The generation condition and content of the shooting event message may be referred to S101, and will not be described here.
In some implementations, after entering, on computer B, the command line for transmitting the storage instruction to mobile phone A, the developer clicks the shooting control in the camera interface of mobile phone A to trigger the camera to transmit a shooting event message to the event request processing module.
S203, the event request processing module transmits event processing information and image data to the perception decision module after acquiring the image data.
In this embodiment, the event processing message includes shooting mode information, time information, and a storage instruction. The definition of the photographing mode information can be seen in S101.
The time information indicates the shooting time and can be understood as the current time. In some implementations, the event request processing module reads a clock (timer) of the electronic device to obtain the current time.
The process of the event request processing module acquiring the image data may refer to S103-S107, and will not be described herein.
S204, the perception decision module transmits scene information, time information, a storage instruction and an image frame to the first image processing module in the path.
The scene information includes path information and a frame number. The definitions and examples of the path, the frame number, and the image frame can be found in S109 and are not repeated here. In fig. 4, a path including the image processing module 1 and the image processing module N is again taken as an example.
It is understood that the image processing modules in the path perform S205-S210 based on the received information and the storage instruction.
S205, the image processing module 1 processes the image frame, and transmits the processing result, scene information, time information, and storage instruction of the image processing module 1 to the next image processing module in the path based on the path information.
It will be appreciated that the perception decision module transmits the image frames to the image processing module 1 sequentially; therefore, in this step, the image frame is whichever frame the image processing module 1 has received, for example the first frame image or the second frame image.
S206, the image processing module 1 stores the data of the image processing module 1 into the memory based on the storage instruction and the time information.
The data of the image processing module 1 includes the processing result of the image processing module 1, and may also include the processing object of the image processing module 1, such as the individual image frames. In some implementations, as shown in fig. 3, the function of storing the processing object of the image processing module 1 is turned on or off by transmitting an instruction to mobile phone A. For example, by default the storage instruction only instructs the image processing module 1 to store its processing result; by outputting a command line on computer B, the storage instruction carries flag bit 1 and additionally instructs the image processing module 1 to store its processing object. Alternatively, the storage instructions transmitted to mobile phone A include a first storage instruction instructing storage of the processing result and a second storage instruction instructing storage of the processing object.
The time information is used to mark the time of the data stored to the memory, as will be described in detail below.
S207, the image processing module 1 transmits the first image processing platform data in the image data to the common processing module. See in particular S111.
S208, the image processing module N processes the processing result of the image processing module 1 and transmits the processing result of the image processing module N to the gallery.
It can be understood that the image processing module N learns, based on the path information, that it is the last image processing module, and therefore transmits its processing result to the gallery.
Referring to S112, when the frame number is greater than 1, the processing result of the image processing module N is the result of the image processing module N fusing multiple processing results; the detailed multi-frame processing is not repeated here.
S209, the image processing module N stores the data of the image processing module N into the memory based on the storage instruction and the time information.
The data of the image processing module N includes the processing result of the image processing module N. It can be understood that, because the processing object of the image processing module N is the processing result of the image processing module 1 and has already been stored in the memory by the image processing module 1, the image processing module N need not be instructed to store its processing object, which saves resources. Accordingly, in some implementations, only the first image processing module in the path is instructed to store the processing object.
For the subsequent data recharging requirement, in this embodiment, the data is stored in the memory based on the time information, each image processing module, and the frame number.
As shown for the second image processing platform data in fig. 4, in some implementations the time information is the first dimension, with a first time, a second time, and a third time as examples. It can be understood that each triggered shooting event message causes the event request processing module to obtain time information; therefore, any one time represents one shot.
Under the same time information, the frame numbers, such as the 1st frame through the M-th frame, form the second dimension; under the same frame number, the image processing module is the third dimension.
S210, the image processing module N transmits the first image processing platform data in the image data to the common processing module.
S211, the common processing module transmits the first image processing platform data to the first data storage module.
S212, the first data storage module stores the first image processing platform data into the memory.
It will be appreciated that, in some implementations, as shown in fig. 4, the first image processing platform data stored in the memory follows the storage scheme of the second image processing platform data: the first dimension is the time information, the second dimension is the frame number under the same time information, and the third dimension is the image processing module under the same frame number. It will be understood that the data of any one image processing module within the first image processing platform data refers to the first image processing platform data transmitted to the first image processing platform via that image processing module. Taking the first image processing platform data transmitted to the common processing module in S207 as an example, its image processing module is the image processing module 1 responsible for the transmission.
S213, the gallery stores the processing result of the image processing module N to the hard disk. See S114.
The storage flow shown in fig. 4 lays a foundation for subsequently reading historical data. Combined with the image processing flow shown in fig. 2, it can be seen that the storage of the data related to image processing is fused into the image processing flow, which gives high compatibility and implementability.
The flow of the reading phase will be described in detail below.
A read instruction needs to be transmitted to mobile phone A to trigger the reading flow; as shown in fig. 3, the read instruction is transmitted to mobile phone A by entering the corresponding command line on computer B.
In addition to the read instruction, before the reading flow starts, the time information of the data to be read may also be written into mobile phone A. In some implementations, a TXT file is stored in mobile phone A, and the time information is stored in that TXT file. A technician can search the mobile phone for the required images as needed and query the timestamps of those images as the time information.
In this embodiment, the data reading flow within the data recharging flow is described taking the scenario of verifying an optimized image processing module as an example. Suppose the face area of an image captured with the camera's beauty function turned on is too dark; a technician optimizes the image processing module 1, and the optimization effect needs to be verified, so the optimized image processing module 1 and the image processing module 1 before optimization need to process the same image data.
Based on the above object, the flow of the reading phase is shown in fig. 5, comprising the following steps:
s301, the event request processing module receives a reading instruction.
S302, the event request processing module receives shooting event information.
The generation condition and content of the shooting event message may be referred to S101 and are not repeated here. In some implementations, after entering, on computer B, the command line for transmitting the read instruction to mobile phone A, the developer clicks the shooting control in the camera interface of mobile phone A to trigger the camera to transmit a shooting event message to the event request processing module.
S303, the event request processing module transmits event processing information, a reading instruction and image data to the perception decision module after acquiring the image data.
In this embodiment, the event processing message includes shooting mode information and a reading instruction.
The process of the event request processing module acquiring the image data may refer to S103-S107, and will not be described herein.
S304, the perception decision module obtains time information by reading the TXT file.
It is understood that the read instruction may carry a storage path of the TXT file, and the sensing decision module reads the TXT file based on the storage path.
S305, the perception decision module transmits the scene information, the time information, and the read instruction to the first image processing module in the path and to the common processing module, respectively, and transmits the image frames to the first image processing module.
In the above example, the first image processing module in the path is the image processing module 1.
The time information is the time information acquired in S304, that is, the time information of the data to be read.
The image frame may be acquired by referring to S109, and the specific content of the scene information may be referred to S204; neither is repeated here.
S306, the image processing module 1 reads the history processing object data of the image processing module 1 from the memory based on the read instruction and the time information.
In the above example, the optimized beauty function needs to be verified, so the historical processing object data of the image processing module 1 is the first frame image generated from the face image data collected by the camera. It can be understood that, as shown in fig. 4, the first frame image was obtained (e.g., generated) by the perception decision module in the storage flow.
It can be understood that the precondition for reading the first frame image generated from the face image data from the memory in S306 is that the function of the image processing module 1 for storing its processing object in the memory was turned on in the storage flow.
S307, the common processing module transmits the scene information, the time information, and the read instruction to the first data storage module.
S308, the first data storage module reads the first image processing platform data from the memory based on the scene information, the time information and the reading instruction.
S309, the first data storage module transmits the first image processing platform data to the image processing module 1 through the common processing module.
It will be appreciated that, in some implementations, in S308 the first data storage module reads from the memory all the data corresponding to the time information in the TXT file, that is, each image frame corresponding to the time information and the first image processing platform data in each image processing module dimension. In this case, in some implementations, the frames are transmitted to the image processing module 1 by the common processing module sequentially, according to the period of processing each frame.
S310, the image processing module 1 transmits the processing result, scene information, time information, and a reading instruction of the image processing module 1 to the image processing module N.
It can be understood that the processing result of the image processing module 1 is obtained as follows: the image processing module 1 assembles the first image processing platform data transmitted by the common processing module and the historical processing object data read from the memory into a historical first frame image, which replaces the first frame image transmitted in S305; the image processing module 1 then processes the historical first frame image to obtain the processing result of the image processing module 1.
It may be understood that, where first image processing platform data exists in the data processed by the image processing module 1, that data may also be stored into the memory through the common processing module and the first data storage module.
S311, the image processing module N processes the processing result of the image processing module 1 and transmits the processing result of the image processing module N to the gallery.
It will be appreciated that, in the above example, since the image processing module N has not been optimized, there is no need to recharge the image processing module N with data; therefore, the image processing module N does not read data from the memory to replace the received data, but processes the received data as-is.
To achieve this, in some implementations, as indicated by the cross in fig. 5, the function of the image processing module N for reading data from the memory is turned off (i.e., disabled). It can be understood that the read instruction may carry information for turning off that function of the image processing module N, or the turn-off instruction may be transmitted to the mobile phone by entering a command line on computer B.
It will be appreciated that, when its function of reading data from the memory is turned off, the image processing module N does not process the read instruction and the time information.
It may be understood that, where first image processing platform data exists in the data processed by the image processing module N, that data may also be stored into the memory through the common processing module and the first data storage module.
S312, the gallery stores the processing result of the image processing module N to the hard disk.
It can be understood that a technician can view, through the gallery, the shooting result obtained with the beauty function turned on, and compare it with the shooting result from before the image processing module 1 was optimized, so as to verify the optimization effect on the image processing module 1. It can be seen that the data recharging flow formed by the storage flow shown in fig. 4 and the reading flow shown in fig. 5 can verify the optimization effect on an image processing module.
Fig. 3 shows an example in which mobile phone A is placed on computer B; in this case, the camera preview interface in the mobile phone shown in fig. 3 is black, indicating a scene in which the camera is occluded.
To further highlight the effect of the flow shown in fig. 5, consider fig. 6 and assume the user is aiming at a building when clicking the camera's shooting control. The image data transmitted in S303 and the image frame transmitted in S305 are then both images of the building, but the image finally output to the gallery is a face image. As shown in fig. 6, after the user presses the shooting control and the thumbnail 61 of the face image is obtained, the camera's preview interface remains aimed at the building and displays the preview image of the building.
In addition to verifying the optimization effect on the image processing module 1 as shown in fig. 5, in the case where the image processing module N is optimized, the steps that change in fig. 5 are as follows:
The image processing module N reads the historical processing object data, that is, the historical processing result of the image processing module 1, from the memory and processes it as the current processing object. In the multi-frame case, the historical processing results of the image processing module 1 corresponding to the time information are read frame by frame in ascending order of frame number and processed frame by frame; the multi-frame processing results are then fused to obtain the final result, which is transmitted to the gallery.
The manner in which the image processing module N assembles the data read from the memory with the first image processing platform data, and the manner of obtaining the first image processing platform data, may refer to the implementation of the image processing module 1 and are not repeated here.
It will be understood that, since the image processing module N reads data from the memory as its processing object, whatever data the image processing module 1 transmits to the image processing module N is replaced by the processing object read by the image processing module N. Therefore, the image processing module 1 may process the first frame image transmitted in S305, or may execute S306 and process the historical processing object data obtained in S306 as its processing object, to obtain the processing result of the image processing module 1.
In addition to the requirement of verifying a module's optimization effect, there is also the requirement of locating a faulty module. In this embodiment, take as an example the case where the AI mode is turned on and, assuming the image has a green edge after the reflection area is removed, the module causing the fault needs to be located. The data reading flow in the executed data recharging flow is shown in fig. 7 and includes the following steps:
s401, the event request processing module receives a reading instruction. See S301, which is not described here again.
S402, the event request processing module receives shooting event information. See S302, which is not described here again.
S403, the event request processing module transmits event processing information, a reading instruction and image data to the perception decision module after acquiring the image data. See S303 for a specific implementation.
S404, the perception decision module obtains time information by reading the TXT file. See S304, which is not described here.
S405, the perception decision module transmits scene information, time information, a reading instruction and an image frame to the first image processing module in the path. See S305, which is not described here.
In this embodiment, it is first detected whether the image processing module N is faulty, and then whether the image processing module 1 is faulty; accordingly, the image processing modules in the path execute S406 to S407.
S406, the image processing module 1 processes the first image frame, and transmits the processing result of the image processing module 1, the scene information, the time information and the reading instruction to the next image processing module in the path based on the path information.
It is understood that the function of the image processing module 1 for reading data from the memory is turned off (i.e., disabled). The reading instruction may carry information for turning off this function of the image processing module 1, or a closing instruction may be transmitted to the mobile phone by entering a command line on the computer B.
It will be appreciated that in the case where the function of the image processing module 1 for reading data from the memory is turned off, the image processing module 1 does not process the read instruction and the time information.
S407, the image processing module N reads the historical processing result data and the historical processing object data of the image processing module N from the memory based on the reading instruction.
It will be appreciated that the reading instruction in fig. 5 is used to read the historical processing object data of an image processing module, whereas the reading instruction in fig. 7 is used to read both the historical processing result data and the historical processing object data of an image processing module. The reading instruction in fig. 7 therefore differs from the reading instruction in fig. 5; for convenience of distinction, the reading instruction in fig. 5 may be referred to as a "first reading instruction" and the reading instruction in fig. 7 as a "second reading instruction".
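As one possible representation, assumed here purely for illustration and not dictated by the embodiment, the two kinds of reading instruction, together with the information mentioned above for turning off a module's read-from-memory function, could be modelled as a small data structure:

# Hypothetical sketch of the two reading-instruction kinds and the
# per-module "read from memory" switch carried with the instruction.
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Set

class ReadKind(Enum):
    FIRST = auto()   # fig. 5: read historical processing object data
    SECOND = auto()  # fig. 7: read historical result and object data

@dataclass
class ReadInstruction:
    kind: ReadKind
    time_info: str
    # Modules whose read-from-memory function is turned off, e.g.
    # {"module_1"} so that only module N reads from the memory.
    disabled_modules: Set[str] = field(default_factory=set)

    def module_reads_memory(self, module_id: str) -> bool:
        return module_id not in self.disabled_modules

# Usage: in the fig. 7 flow, module 1's read function is off while
# module N reads both kinds of historical data.
instr = ReadInstruction(ReadKind.SECOND, "20220630_1200", {"module_1"})
assert not instr.module_reads_memory("module_1")
assert instr.module_reads_memory("module_N")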
S408, the image processing module N transmits the historical processing result data and the historical processing object data of the image processing module N to the gallery.
Accordingly, the historical processing result data and the historical processing object data of the image processing module N can be viewed in the gallery.
It can be understood that, in this step, the image processing module N uses the read historical processing result data and historical processing object data of the image processing module N to replace the result that would otherwise be obtained by the image processing module N processing the processing result of the image processing module 1.
It can be understood that the historical processing result data of the image processing module N is the result of the combined action of the image processing module 1 and the image processing module N, while the historical processing object data of the image processing module N is the historical processing result data of the image processing module 1, i.e., the result of the action of the image processing module 1 alone. Therefore, by comparing the two images, the fault can be located to the image processing module 1 or the image processing module N.
In addition to the scenario in which the path shown in fig. 7 includes two image processing modules, assume that the path includes the image processing module 1, the image processing module N-1 and the image processing module N. According to the flow shown in fig. 7, it is first detected whether the image processing module N has a problem. If the image processing module N has no problem, the image processing module N-1 is enabled to read the historical processing result data and historical processing object data of the image processing module N-1 from the memory and transmit the read data to the image processing module N (whose function of reading historical data from the memory is turned off); after processing by the image processing module N, the result is output to the gallery. A technician can then compare the two images to confirm whether the image processing module N-1 has a problem.
If the image processing module N-1 has no problem, whether the image processing module 1 has a problem is confirmed in the same manner.
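This walk-back over the path can be summarised as a simple loop. The following is an illustrative sketch under the assumption that a check recharge_and_compare(m), performed by a technician or an automated comparison, decides from the recharged images whether the defect is attributable to module m; all names are hypothetical:

# Hypothetical sketch of fault localization along the processing path:
# starting from the last module, recharge that module's historical data
# and compare the two resulting images; stop at the first module whose
# own processing introduces the defect (e.g. the green edge).
from typing import Callable, List, Optional

def locate_faulty_module(path: List[str],
                         recharge_and_compare: Callable[[str], bool]
                         ) -> Optional[str]:
    """path lists module ids in processing order, e.g.
    ["module_1", "module_N-1", "module_N"]."""
    for module in reversed(path):      # check N, then N-1, then 1
        if recharge_and_compare(module):
            return module
    return None                        # defect not reproduced on this path

# Usage with a stand-in check that attributes the defect to module N-1:
path = ["module_1", "module_N-1", "module_N"]
fault = locate_faulty_module(path, lambda m: m == "module_N-1")
assert fault == "module_N-1"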
It can be understood that, for image problems that can only be observed by the human eye, for example a green edge in the processing result, the processing result data of each module can alternatively be exported from the memory of the mobile phone after the storage flow shown in fig. 4, and the image processing module with the problem can be identified by visual inspection.
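Equivalently, exporting each module's stored processing result for visual inspection could look roughly as follows; this is an illustrative sketch, and the flat per-module storage layout and the export directory are assumptions rather than the on-device format:

# Hypothetical sketch: dump every module's stored processing result from
# the phone's memory to files so a technician can inspect them by eye.
import os
from typing import Dict

def export_module_results(stored: Dict[str, bytes], export_dir: str) -> None:
    """stored maps module id -> processing result data saved during the
    storage flow; each result is written to one file per module."""
    os.makedirs(export_dir, exist_ok=True)
    for module_id, data in stored.items():
        path = os.path.join(export_dir, module_id + "_result.bin")
        with open(path, "wb") as f:
            f.write(data)

# Usage: after the storage flow shown in fig. 4, inspect the exported
# files to see at which module the green edge first appears.
export_module_results({"module_1": b"...", "module_N": b"..."}, "recharge_dump")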
The data recharging method provided by the embodiment of the present application is executed by an electronic device. In some implementations, the electronic device may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (Ultra-mobile Personal Computer, UMPC), a handheld computer, a netbook, a personal digital assistant (Personal Digital Assistant, PDA), a wearable electronic device, a smart watch, or the like.
Taking a mobile phone as an example of the electronic device, fig. 8 shows a part of a structure of the mobile phone related to an embodiment of the present application, including: processor 110, antenna 1, antenna 2, mobile communication module 120, wireless communication module 130, audio module 140, camera 150, display 160, etc.
It is to be understood that the configuration illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus. In other embodiments, the electronic device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a general-purpose input/output (GPIO) interface, and the like.
In some embodiments, a GPIO interface may be used to connect the processor 110 with the wireless communication module 130, the audio module 140, and the like.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 140 via an I2S bus to enable communication between the processor 110 and the audio module 140. In some embodiments, the audio module 140 may transmit an audio signal to the wireless communication module 130 through the I2S interface, to implement a function of answering a call through a Bluetooth headset.
The PCM interface may also be used for audio communication, to sample, quantize and encode analog signals. In some embodiments, the audio module 140 and the wireless communication module 130 may be coupled by a PCM bus interface. In some embodiments, the audio module 140 may also transmit audio signals to the wireless communication module 130 through the PCM interface, so as to implement a function of answering a call through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also use an interfacing manner different from those in the foregoing embodiments, or a combination of multiple interfacing manners. The mobile communication module 120 may provide a solution for wireless communication applied on the electronic device, including 2G/3G/4G/5G and the like.
The wireless communication module 130 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. for application on an electronic device.
The audio module 140 is used to convert the analog audio input into a digital audio signal. The audio module 140 may also be used to encode and decode audio signals. In some embodiments, the audio module 140 may be disposed in the processor 110, or some functional modules of the audio module 140 may be disposed in the processor 110.
The speaker 140A, also referred to as a "horn", is used to output audio signals. The microphone 140B, also referred to as a "mic", is used to convert sound signals into electrical signals.
After the electronic device acquires a sound signal in the above manner, the sound signal is transmitted to the processor 110 through the above-described interface of the processor 110, and the processor 110 performs processing such as noise cancellation on the sound signal.
The camera 150 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device may include 1 or N cameras 150, N being a positive integer greater than 1.
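The signal chain just described, from optical image to electrical signal to digital image signal to a standard-format frame, can be pictured as a simple pipeline of stages. The stage functions below are purely illustrative placeholders, not the device's actual drivers:

# Hypothetical sketch of the capture signal chain described above.
from typing import Callable, List

def run_pipeline(stages: List[Callable[[object], object]],
                 raw: object) -> object:
    out = raw
    for stage in stages:
        out = stage(out)
    return out

# Stand-in stages mirroring the description, each merely tagging the data:
sensor = lambda light: ("electrical_signal", light)   # CCD/CMOS conversion
isp = lambda sig: ("digital_image_signal", sig)       # ISP conversion
dsp = lambda img: ("RGB_frame", img)                  # DSP -> standard format

frame = run_pipeline([sensor, isp, dsp], "optical_image")
assert frame[0] == "RGB_frame"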
The display screen 160 is used to display images, videos, and the like. The display screen 160 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 160, N being a positive integer greater than 1.

Claims (17)

1. A data recharging method, characterized in that it is applied to an electronic device, the method comprising:
after receiving a reading instruction, collecting image data in response to a shooting operation;
reading pre-stored historical image processing data from the electronic equipment in response to the reading instruction;
replacing target data with the historical image processing data, wherein the target data is at least one of the image data and processing result data of the image data;
displaying an image obtained based on the historical image processing data.
2. The method of claim 1, wherein the electronic device is coupled to a control device, and the read instruction is obtained from the control device.
3. The method of claim 1 or 2, wherein the historical image processing data is stored in the electronic device in a classification dimension, the classification dimension comprising: at least one of a time of acquisition of the image data, a frame identification of the image data, and an identification of a module processing the image data.
4. A method according to claim 3, wherein said replacing target data with said historical image processing data comprises:
and replacing the data indicated by a preset classification dimension in the image data with historical image data of the preset classification dimension, wherein the preset classification dimension comprises a preset time, a preset frame number and an identification of a preset image processing module, and the data indicated by the preset classification dimension comprises the data of the preset frame number of the preset image processing module.
5. The method of any of claims 1-4, wherein a first image processing platform and a second image processing platform run in the electronic device; the second image processing platform comprises at least one image processing module;
the historical image processing data comprises first image processing platform data and second image processing platform data;
the replacing the target data with the historical image processing data includes:
the at least one image processing module replaces the target data with the historical image processing data.
6. The method of claim 5, wherein said replacing said target data with said historical image processing data comprises:
assembling the first image processing platform data and the second image processing platform data to obtain assembled data;
replacing the target data with the assembled data.
7. The method of claim 6, wherein the first image processing platform data and the second image processing platform data are each stored in a classification dimension, the classification dimension comprising at least one of a time of acquisition of the image data, a frame identification of the image data, and an identification of a module that processes the image data.
8. The method according to any one of claims 5 to 7, wherein the storing procedure of the history image processing data includes:
after receiving a storage instruction, collecting the historical image processing data in response to a shooting operation;
in response to the storage instruction, the image processing module of the second image processing platform stores second image processing platform data in the historical image processing data;
the image processing module transmits the first image processing platform data to the first image processing platform;
the first image processing platform stores the first image processing platform data.
9. The method of claim 8, wherein the second image processing platform data comprises: processing object data and processing result data of the image processing module.
10. The method according to any of claims 6-9, wherein assembling the first image processing platform data and the second image processing platform data to obtain assembled data comprises:
the target image processing module reads second image processing platform data of a preset frame number acquired at a first moment and stored by the target image processing module, wherein the target image processing module is any one image processing module;
the target image processing module receives first image processing platform data which is transmitted by the first image processing platform, acquired at the first moment and stored by the target image processing module through the first image processing platform;
the target image processing module assembles the first image processing platform data and the second image processing platform data acquired at the first moment and stored by the target image processing module to obtain assembled data of the target image processing module;
the replacing the target data with the assembled data includes:
and replacing the data of the preset frame number acquired at the second moment by using the assembly data, wherein the first moment is earlier than the second moment.
11. The method according to any of claims 5-10, wherein the second image processing platform further comprises a perceptual decision module;
the method further comprises the steps of:
the perception decision module transmits scene information and first time information to the image processing module based on a storage instruction, wherein the scene information indicates the image processing modules receiving the instruction and their processing sequence, the first time information indicates the acquisition time of the stored data, and the first time information is used for classified storage of the stored data.
12. The method of claim 11, wherein the method further comprises:
the perception decision module transmits scene information and the first time information to the image processing module based on the reading instruction, wherein the first time information indicates the type of the read data.
13. The method according to claim 12, wherein the method further comprises:
and the perception decision module transmits the scene information and the moment information to the first image processing platform based on the reading instruction.
14. The method according to claim 12 or 13, characterized in that the method further comprises:
and the perception decision module reads the first time information from a preset file of the electronic device.
15. The method of any of claims 5-14, wherein the second image processing platform further comprises: an event request processing module;
the method further comprises the steps of:
the event request processing module is used for responding to shooting event information and triggering the first image processing platform to acquire image data;
the event request processing module transmits shooting mode information, the instruction, and the image data in response to an instruction, the instruction including a storage instruction or a reading instruction.
16. An electronic device, comprising:
one or more processors;
one or more memories;
the memory stores one or more programs that, when executed by the processor, cause the electronic device to perform the data recharging method of any of claims 1-15.
17. A computer readable storage medium, wherein a computer program is stored in the computer readable storage medium, which when executed by a processor causes the processor to perform the data recharging method of any of claims 1-15.
CN202210758524.1A 2022-06-30 2022-06-30 Data recharging method and device Active CN116048379B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210758524.1A CN116048379B (en) 2022-06-30 2022-06-30 Data recharging method and device

Publications (2)

Publication Number Publication Date
CN116048379A true CN116048379A (en) 2023-05-02
CN116048379B CN116048379B (en) 2023-10-24

Family

ID=86126065

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210758524.1A Active CN116048379B (en) 2022-06-30 2022-06-30 Data recharging method and device

Country Status (1)

Country Link
CN (1) CN116048379B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109510940A (en) * 2018-11-28 2019-03-22 维沃移动通信(杭州)有限公司 A kind of image display method and terminal device
CN111182217A (en) * 2020-01-07 2020-05-19 徐梦影 Image white balance processing method and device
WO2021238325A1 (en) * 2020-05-29 2021-12-02 华为技术有限公司 Image processing method and apparatus
CN113810587A (en) * 2020-05-29 2021-12-17 华为技术有限公司 Image processing method and device
CN111789675A (en) * 2020-06-29 2020-10-20 首都医科大学附属北京天坛医院 Intracranial hematoma operation positioning auxiliary method and device
CN114257755A (en) * 2020-09-21 2022-03-29 腾讯科技(深圳)有限公司 Image processing method, device, equipment and storage medium
WO2022127611A1 (en) * 2020-12-15 2022-06-23 华为技术有限公司 Photographing method and related device
WO2022127787A1 (en) * 2020-12-18 2022-06-23 华为技术有限公司 Image display method and electronic device
CN113037997A (en) * 2021-01-28 2021-06-25 维沃移动通信有限公司 Image processing method and device and electronic equipment
CN113747080A (en) * 2021-09-29 2021-12-03 维沃移动通信(杭州)有限公司 Shooting preview method, shooting preview device, electronic equipment and medium

Also Published As

Publication number Publication date
CN116048379B (en) 2023-10-24

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant