CN117952818A - Method and system for processing image data and electronic equipment - Google Patents

Method and system for processing image data and electronic equipment

Publication number
CN117952818A
CN117952818A
Authority
CN
China
Prior art keywords
image data
signal processor
image signal
memory address
physical memory
Prior art date
Legal status: Pending (assumed; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
CN202211329143.8A
Other languages
Chinese (zh)
Inventor
樊明兴
Current Assignee (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (assumed; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211329143.8A
Publication of CN117952818A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining
    • G06T1/60 Memory management
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/76 Television signal recording
    • H04N5/907 Television signal recording using static stores, e.g. storage tubes or semiconductor memories

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides a method, a system, and an electronic device for processing image data. The system comprises: an image signal processor, configured to receive raw image data from a sensor, preprocess the raw image data to obtain processed image data, and store the processed image data at a physical memory address of the image signal processor; a memory management unit, configured to establish a mapping relationship between the physical memory address of the image signal processor and a virtual memory address of an application processor; and the application processor, connected to the memory management unit and configured to acquire the processed image data from the physical memory address of the image signal processor according to the mapping relationship, and to generate target image data from the processed image data. Based on the mapping relationship, the application processor can directly access the image data that the image signal processor has stored in memory, avoiding the large transmission volume and high memory consumption that transferring image data over a MIPI interface would incur.

Description

Method and system for processing image data and electronic equipment
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a method and a system for processing image data, and an electronic device.
Background
An image signal processor may sit between the sensor and the application processor (AP). The application processor may communicate with the image signal processor through a mobile industry processor interface (MIPI). During MIPI-based data transmission, the data must be packed, stored, transmitted, received, and parsed. These operations increase both the volume of data transmitted and the memory consumed.
Disclosure of Invention
The embodiment of the application provides a method, a system and electronic equipment for processing image data. Various aspects of the application are described below.
In a first aspect, there is provided a system for processing image data, comprising: the image signal processor is used for receiving the original image data from the sensor, preprocessing the original image data to obtain processed image data, and storing the processed image data into a physical memory address of the image signal processor; the memory management unit is used for establishing a mapping relation between the physical memory address of the image signal processor and the virtual memory address of the application processor; the application processor is connected with the memory management unit and is used for acquiring the processed image data from the physical memory address of the image signal processor according to the mapping relation and generating target image data according to the processed image data.
In some embodiments, the system for processing image data further comprises: the neural network processor is used for acquiring the depth information of the original image data and storing the depth information into a physical memory address of the image signal processor; the application processor is further configured to: and acquiring the depth information from the physical memory address of the image signal processor according to the mapping relation, and generating the target image data according to the processed image data and the depth information.
In some embodiments, the depth information is used to perform one or more of the following processing on the raw image data: single-shot portrait blurring, neon portrait effects, and background blurring.
In some embodiments, the neural network processor and the image signal processor are both integrated in an image signal pre-processing chip.
In a second aspect, there is provided an electronic device comprising: a sensor; and the system of the first aspect.
In a third aspect, there is provided an image processing method, the method being applied to an application processor, the method comprising: obtaining a mapping relation between a physical memory address of an image signal processor and a virtual memory address of the application processor from a memory management unit; and accessing the physical memory address of the image signal processor according to the mapping relation so as to acquire the image data processed by the image signal processor from the physical memory address.
In some embodiments, the method further comprises: accessing a physical memory address of the image signal processor according to the mapping relation to acquire depth information of the processed image data from the physical memory address; and generating the target image data according to the processed image data and the depth information.
In some embodiments, the depth information is used to perform one or more of the following processing on the original image data corresponding to the processed image data: single-shot portrait blurring, neon portrait effects, and background blurring.
In a fourth aspect, there is provided an image processing method including: an image signal processor receives raw image data from the sensor; the image signal processor pre-processes the original image data to obtain processed image data; the image signal processor stores the processed image data into a physical memory address of the image signal processor; the memory management unit establishes a mapping relation between a physical memory address of the image signal processor and a virtual memory address of the application processor; the application processor acquires the processed image data from the physical memory address of the image signal processor according to the mapping relation; the application processor generates target image data from the processed image data.
In some embodiments, the method further comprises: the neural network processor acquires depth information of the original image data; the neural network processor stores the depth information into a physical memory address of the image signal processor; the application processor acquires the depth information from a physical memory address of the image signal processor according to the mapping relation; the application processor generates the target image data according to the processed image data and the depth information.
In some embodiments, the depth information is used to perform one or more of the following processing on the raw image data: single-shot portrait blurring, neon portrait effects, and background blurring.
In a fifth aspect, there is provided a computer program product comprising: computer program code which, when run on a computer, causes the computer to perform the method in the third or fourth aspect.
In a sixth aspect, there is provided a computer readable medium storing program code which, when run on a computer, causes the computer to perform the method of the third or fourth aspect.
The application establishes the mapping relation between the physical memory address of the image signal processor and the virtual memory address of the application processor through the memory management unit. Based on the mapping relation, the application processor can directly access the image data stored in the memory by the image signal processor, so that the problems of large transmission data volume and large memory consumption caused by MIPI-based transmission are avoided.
Drawings
FIG. 1 is an exemplary diagram of a data transmission scheme between a sensor and an application processor.
Fig. 2 is an exemplary diagram of a MIPI-based image data transmission scheme.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 4 is a schematic diagram of mapping relationship between physical memory addresses and virtual memory addresses according to an embodiment of the present application.
Fig. 5 is a schematic diagram of an image data transmission scheme according to an embodiment of the present application.
Fig. 6 is a schematic flowchart of an image processing method according to an embodiment of the present application.
Fig. 7 is a schematic flowchart of another image processing method according to an embodiment of the present application.
Fig. 8 is a schematic flow chart diagram of one possible implementation of the method shown in fig. 7.
Detailed Description
The embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, and not all, of the embodiments of the present application.
Some electronic devices may transmit data acquired by the sensor to an application processor for processing. An image signal processor may be included between the sensor and the application processor. The image signal processor may process the received sensor data (e.g., image data) to relieve processing pressure of the application processor. The sensor data may comprise, for example, image data. In some embodiments, the sensor data may also be referred to as raw data (raw data).
The image signal processor and the application processor may be in data communication therebetween to transfer data output from the image signal processor to the application processor. The data transfer between the image signal processor and the application processor may be realized by MIPI, for example.
FIG. 1 is an exemplary diagram of a data transmission scheme between a sensor and an application processor. In fig. 1, the image signal processor may be integrated in an image signal preprocessor (Pre-ISP). The Pre-ISP communicates data with the application processor via MIPI.
Fig. 2 is an exemplary diagram of a MIPI-based data transmission scheme. As can be seen from fig. 2, during transmission the transmitting end needs to pack the data, and the packed data needs to be stored and transmitted; the receiving end then receives the packed data and parses it. These operations in the MIPI-based data transmission process therefore increase both the volume of data transmitted and the memory consumed.
To solve the above problem, fig. 3 shows a schematic block diagram of an electronic device 300 according to an embodiment of the present application. Electronic device 300 may include a sensor 310 and a system 320 for processing image data.
The sensor 310 may output raw data. The raw data may include raw image data. The image data may comprise, for example, video frame data. In fig. 3, the image data flow corresponding to the image data is indicated by solid arrows.
The system 320 for processing image data may include an image signal processor 321, a memory management unit (memory management unit, MMU) 322, and an application processor 323.
The image signal processor 321 may be configured to receive raw image data from the sensor and perform preprocessing on the raw image data to obtain processed image data.
In some implementations, preprocessing of the raw image data may include processing by an image processing front end (FE) and/or an image processing back end (BE). The FE may include high dynamic range (HDR) image processing; through HDR processing, the image data gains a higher dynamic range and retains more image detail. The BE may include three-dimensional noise reduction (3DNR), local tone mapping (LTMP), and the like.
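The FE/BE split can be pictured with a toy C sketch. The two stages below (a bit-depth compression standing in for HDR processing, and a 3-tap average standing in for noise reduction) are illustrative placeholders only, not the Pre-ISP's actual algorithms:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Toy FE stage: compress a 12-bit raw sample into 8-bit range
 * (a stand-in for HDR processing; real pipelines are far more complex). */
static uint8_t fe_stage(uint16_t raw) {
    uint32_t v = raw >> 4;                 /* 12-bit raw -> 8-bit */
    return v > 255 ? 255 : (uint8_t)v;
}

/* Toy BE stage: 3-tap average (a stand-in for noise reduction). */
static void be_stage(const uint8_t *in, uint8_t *out, size_t n) {
    for (size_t i = 0; i < n; i++) {
        uint16_t l = in[i ? i - 1 : 0];
        uint16_t r = in[i + 1 < n ? i + 1 : n - 1];
        out[i] = (uint8_t)((l + in[i] + r) / 3);
    }
}

/* Run raw samples through FE then BE, producing "processed image data". */
static void preprocess(const uint16_t *raw, uint8_t *out, size_t n) {
    uint8_t tmp[64];                       /* assumes n <= 64 for this sketch */
    for (size_t i = 0; i < n; i++) tmp[i] = fe_stage(raw[i]);
    be_stage(tmp, out, n);
}
```

In this picture, the output of `fe_stage` corresponds to the FE-processed "process data" that the NPU later reads, while the output of the full `preprocess` pipeline corresponds to the "main data" stored in DDR.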
In some embodiments, the processed image data may include main data and/or process data. Image data that has passed through the entire preprocessing pipeline may be referred to as main data; image data that has passed through only part of it may be referred to as process data. For example, the process data may include image data obtained after FE processing.
In some embodiments, the image signal processor 321 may also generate data related to the image data based on the image data (including the original image data and/or the processed image data). The data related to the image data may be referred to as extended custom data.
In some embodiments, the image signal processor 321 may be a front-end chip between the sensor and the application processor. For example, the image signal processor 321 may be integrated in a Pre-ISP. The Pre-ISP and the application processor 323 may be connected in hardware via peripheral component interconnect express (PCI-E), universal serial bus (USB), or an advanced extensible interface (AXI) port.
The image signal processor 321 may store the processed image data at a physical memory address of the image signal processor 321. For example, the image signal processor 321 may include a double data rate synchronous dynamic random access memory (DDR SDRAM, hereinafter DDR) and may store the processed image data in the DDR.
The memory management unit 322 may be a hardware circuit unit. As shown in fig. 4, the memory management unit 322 may be configured to translate virtual memory addresses into physical memory addresses. With memory management unit 322 enabled, memory access may be translated through memory management unit 322.
In some embodiments, the memory management unit 322 may be configured to establish a mapping relationship between the physical memory address of the image signal processor 321 and the virtual memory address of the application processor 323. In some implementations, the mapping relationship may be stored in a translation table. The memory management unit 322 may cache translation-table entries in a translation lookaside buffer (TLB).
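The translation step can be illustrated with a minimal C sketch of a direct-mapped translation table. The page size, table layout, and function names here are assumptions made for illustration, not the MMU's actual design:

```c
#include <assert.h>
#include <stdint.h>

#define PAGE_SIZE   4096u
#define NUM_ENTRIES 16u

/* Hypothetical translation-table entry: maps one AP virtual page
 * to one physical page in the Pre-ISP's DDR. */
typedef struct {
    uint32_t phys_page;  /* physical page number in Pre-ISP DDR */
    int      valid;      /* populated at MMU initialization time */
} tt_entry;

static tt_entry table[NUM_ENTRIES];

/* Install one mapping, as the MMU would during initialization. */
static void map_page(uint32_t virt_page, uint32_t phys_page) {
    table[virt_page % NUM_ENTRIES].phys_page = phys_page;
    table[virt_page % NUM_ENTRIES].valid = 1;
}

/* Translate an AP virtual address into a Pre-ISP physical address:
 * look up the page, keep the offset within the page. */
static uint64_t translate(uint64_t vaddr) {
    uint32_t vpage = (uint32_t)(vaddr / PAGE_SIZE);
    uint32_t off   = (uint32_t)(vaddr % PAGE_SIZE);
    tt_entry *e = &table[vpage % NUM_ENTRIES];
    assert(e->valid);    /* a real MMU would raise a translation fault */
    return (uint64_t)e->phys_page * PAGE_SIZE + off;
}
```

A TLB, in this picture, is simply a small cache in front of `translate` so that repeated accesses to the same page skip the table walk.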
The application processor 323 may further process the processed image data to generate target image data. The application processor 323 may be used to implement image processing more complex than that of the image signal processor 321. For example, the application processor may implement one or more of the following: portrait blurring, neon portrait effects, and background blurring. It should be noted that the application processor 323 may adjust its processing according to the requirements of the platform or algorithm. For example, the application processor 323 may perform portrait beautification based on portrait information extracted by the image signal processor 321. In some embodiments, the application processor 323 may include a central processing unit (CPU).
The application processor 323 may acquire the processed image data through the memory management unit 322. As one implementation, the application processor 323 may be connected to the memory management unit 322 and may acquire the processed image data from the physical memory address of the image signal processor 321 according to the above-described mapping relationship.
Fig. 5 is a schematic flow chart of an application processor acquiring Pre-ISP data through a memory management unit according to an embodiment of the present application. While the application processor runs, the CPU can convert a virtual address of the application processor into the corresponding Pre-ISP physical memory address through the translation table in the memory management unit, according to the mapping relationship determined when the memory management unit was initialized. On this basis, the application processor can directly access the processed image data at the corresponding physical memory address in the Pre-ISP's DDR.
The application establishes, through the memory management unit, a mapping relationship between the physical memory address of the image signal processor and the virtual memory address of the application processor. Based on this mapping relationship, the application processor can directly access the image data that the image signal processor has stored in memory, avoiding the large transmission volume and high memory consumption caused by MIPI-based transmission. In addition, when image data is transmitted over MIPI or a similar interface, faulty operation during transfer may introduce transmission errors. In the application, the application processor directly acquires the data stored by the image signal processor through the memory management unit, with no transfer step in between, so transmission errors arising during data transfer are avoided.
It should be noted that, the application processor 323 may also obtain other data from the physical memory address of the image signal processor 321 according to the mapping relationship. Other data may include, for example, expanded custom data.
According to the MIPI protocol specification, image data and extended custom data must be transmitted separately; a MIPI-based solution therefore cannot transmit image data and extended custom data at the same time. Continuing with fig. 1 as an example: in fig. 1, the extended custom data includes depth information of the image data, and the corresponding depth information data stream is indicated by a dashed arrow. As shown in fig. 1, between the image signal preprocessor and the application processor, the depth information data stream and the image data stream each need to be transmitted through MIPI, and cannot be transmitted simultaneously. By comparison, based on the technical solution provided by the application, the application processor can acquire the processed image data and the extended custom data at the same time according to the mapping relationship established by the memory management unit, which simplifies the process by which the application processor acquires data.
In some embodiments, the system 320 for processing image data may also include a neural network processor 324.
It is noted above that the image signal processor 321 may be used to obtain extended custom data. The neural network processor 324 may also be used to obtain extended custom data. In some implementations, the neural network processor 324 may integrate third party algorithms to form extended custom data.
The extended custom data may include, for example, depth information of the original image data. For example, the neural network processor 324 may obtain FE-processed image data to generate depth information. In some implementations, depth information may be included in the information block data.
In some embodiments (as shown in fig. 3), the neural network processor 324 may generate depth information based on downscaled image data. The downscaling operation reduces the resolution of the image data input to the neural network processor 324 in order to increase its processing speed; for example, it may convert image data with a resolution of 4096×2304 into image data with a resolution of 640×480.
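A nearest-neighbor reduction is one simple way such a downscaling operation could work. The sketch below is illustrative only and says nothing about the scaler actually used in the Pre-ISP:

```c
#include <assert.h>
#include <stdint.h>

/* Nearest-neighbor downscale of a single-channel image,
 * e.g. 4096x2304 -> 640x480 before feeding the NPU.
 * All names and parameters are illustrative assumptions. */
static void downscale(const uint8_t *src, int sw, int sh,
                      uint8_t *dst, int dw, int dh) {
    for (int y = 0; y < dh; y++) {
        for (int x = 0; x < dw; x++) {
            /* Pick the nearest source pixel for each destination pixel. */
            int sx = x * sw / dw;
            int sy = y * sh / dh;
            dst[y * dw + x] = src[sy * sw + sx];
        }
    }
}
```

Real scalers typically use area averaging or filtering rather than nearest-neighbor sampling, but the cost structure is the same: the NPU then processes dw×dh pixels instead of sw×sh.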
The neural network processor 324 may store the expanded custom data into a physical memory address of the image signal processor 321. In some embodiments, the neural network processor 324 may store the depth information into a physical memory address of the image signal processor 321. The depth information data stream corresponding to the depth information is indicated by a dashed arrow in fig. 3.
The depth information may be stored into a physical memory address of the image signal processor 321 through the memory management unit 322. As can be seen from the above, the memory management unit 322 can establish a mapping relationship between the physical memory address of the image signal processor 321 and the virtual memory address of the application processor 323. Thus, according to the mapping relationship, the application processor 323 can acquire depth information generated by the neural network processor 324 from the physical memory address of the image signal processor 321.
It is noted above that the application processor 323 may generate the target image data from the processed image data. In some embodiments, the application processor 323 may also generate target image data in combination with the expanded custom data.
Based on the extended custom data, the application processor 323 can apply richer processing to the original image data to generate the target image data. Taking depth information as an example of extended custom data, the application processor 323 can fuse the processed image data with the depth information to implement image processing. For example, the depth information may be used to perform one or more of the following processing on the original image data: single-shot portrait blurring, neon portrait effects, and background blurring.
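The role of depth information in background blurring can be sketched as a per-pixel decision. The 1-D 3-tap blur and the depth threshold below are illustrative stand-ins for the real fusion algorithm:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Blur only the pixels whose depth exceeds a threshold (the background),
 * keeping near pixels (the subject) sharp. Uses a 1-D 3-tap box blur
 * for brevity; a real implementation would blur in 2-D with a larger
 * kernel and soften the foreground/background boundary. */
static void blur_background(const uint8_t *img, const uint8_t *depth,
                            uint8_t *out, size_t n, uint8_t near_limit) {
    for (size_t i = 0; i < n; i++) {
        if (depth[i] <= near_limit) {       /* foreground: copy as-is */
            out[i] = img[i];
        } else {                            /* background: 3-tap blur */
            uint16_t l = img[i ? i - 1 : 0];
            uint16_t r = img[i + 1 < n ? i + 1 : n - 1];
            out[i] = (uint8_t)((l + img[i] + r) / 3);
        }
    }
}
```

Under the scheme described above, `img` would be the processed image data and `depth` the NPU's depth map, both read directly from the Pre-ISP's DDR through the MMU mapping.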
In the related art, at least two cameras are required to implement functions such as portrait blurring, neon portrait, and background blurring. With the method provided by the application, the system for processing image data can acquire depth information of the image data through the neural network processor, and based on that depth information can achieve one or more of portrait blurring, neon portrait effects, and background blurring with a single camera (single shot for short). This can be verified as follows: block one camera (for example, the auxiliary camera) with a hand or another object while the main camera of the electronic device keeps working normally, then take a photo or record a video. If the photo or video still shows portrait blurring, neon portrait, or background blurring effects, the device is likely using a single-camera scheme. In addition, if the sensor in the hardware circuit does not transmit data through a MIPI connection, it can be preliminarily determined that the electronic device uses the technical solution provided by the embodiments of the application.
In some embodiments, the neural network processor may be integrated in the Pre-ISP. For example, the neural network processor may be integrated in a network processing unit (network processing unit, NPU). The Pre-ISP may comprise an NPU.
Having described the apparatus embodiments of the present application in detail, the method embodiments are described below. It is to be understood that the description of the method embodiments corresponds to that of the apparatus embodiments; for parts not described in detail, refer to the preceding apparatus embodiments.
Fig. 6 is a schematic flowchart of an image processing method according to an embodiment of the present application. The method shown in fig. 6 may be applied to an application processor. The method shown in fig. 6 may include step S610 and step S620.
In step S610, a mapping relationship between the physical memory address of the image signal processor and the virtual memory address of the application processor is obtained from the memory management unit.
Step S620, accessing the physical memory address of the image signal processor according to the mapping relationship, so as to obtain the image data processed by the image signal processor from the physical memory address.
Optionally, the method shown in fig. 6 may further include: accessing a physical memory address of the image signal processor according to the mapping relation to acquire depth information of the processed image data from the physical memory address; target image data is generated from the processed image data and depth information.
Optionally, the depth information is used to perform one or more of the following processing on the raw image data: single-shot portrait blurring, neon portrait effects, and background blurring.
Fig. 7 is a schematic flowchart of another image processing method according to an embodiment of the present application. The method shown in fig. 7 may be performed by a system that processes image data. A system for processing image data may include an image signal processor, a memory management unit, and an application processor. The method shown in fig. 7 may include steps S710 to S760.
Step S710, the image signal processor receives raw image data from the sensor;
Step S720, the image signal processor preprocesses the original image data to obtain processed image data;
Step S730, the image signal processor stores the processed image data into the physical memory address of the image signal processor;
step S740, the memory management unit establishes a mapping relation between the physical memory address of the image signal processor and the virtual memory address of the application processor;
Step S750, the application processor acquires the processed image data from the physical memory address of the image signal processor according to the mapping relation;
In step S760, the application processor generates target image data from the processed image data.
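The zero-copy idea behind steps S730 and S750 can be pictured with a toy sketch in which the ISP and the AP alias the same buffer. This is a conceptual illustration of the shared-memory scheme, not the hardware mechanism:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Shared "DDR": the ISP writes here; the AP reads the very same bytes
 * through the MMU-established mapping, with no packing or transfer. */
static uint8_t ddr[16];

/* S730: the ISP stores processed data at its physical address
 * (index 0 of the toy DDR). */
static void isp_store(const uint8_t *data, size_t n) {
    memcpy(ddr, data, n);
}

/* S750: the AP's "virtual" view is just an alias of the same memory,
 * standing in for the mapping established in S740. */
static const uint8_t *ap_view(void) {
    return ddr;   /* direct access: no copy, no MIPI packet */
}
```

Contrast this with the MIPI path of fig. 2, where the same bytes would be packed into packets, transmitted, received, and parsed before the AP could use them.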
Optionally, the method shown in fig. 7 may further include: the neural network processor acquires depth information of original image data; the neural network processor stores the depth information into a physical memory address of the image signal processor; the application processor acquires depth information from a physical memory address of the image signal processor according to the mapping relation; the application processor generates target image data from the processed image data and depth information.
Optionally, the depth information is used to perform one or more of the following processing on the raw image data: single-shot portrait blurring, neon portrait effects, and background blurring.
Fig. 8 is a schematic flowchart of still another image processing method according to an embodiment of the present application. The method shown in fig. 8 may be performed by an electronic device. The electronic device may include a system for processing image data and a camera, and the camera may include a sensor. The system for processing image data may include an image signal processor, a memory management unit (represented by MMU in the figure), and an application processor (represented by AP in the figure). The image signal processor is integrated in the Pre-ISP, and the Pre-ISP also includes an NPU module.
The method shown in fig. 8 may include steps 1 to 7 described below.
Step 1: when the camera or the electronic device is turned on, initialize (init) the Pre-ISP and the MMU hardware unit. The purpose of initializing the MMU hardware unit is to establish, through the MMU, the mapping relationship between the AP's virtual memory addresses and the Pre-ISP's DDR physical addresses.
Step 2: the Pre-ISP processes the received sensor raw data through the FE and BE to form main data, and stores the main data in the DDR.
Step 3: the NPU module in the Pre-ISP reads the FE-processed sensor data from the DDR, performs NPU processing on it, and extracts the depth information (depth) data it contains.
Step 4: the extracted depth information data is stored in the DDR.
Step 5: while the AP runs, the CPU converts the AP's virtual memory addresses into the corresponding physical memory addresses on the Pre-ISP side through the translation table in the MMU, according to the mapping relationship set up during MMU initialization. The AP can thus directly access the data at the corresponding physical memory addresses in the Pre-ISP's DDR.
Step 6: the AP directly acquires the data in the Pre-ISP through the MMU and performs data operations such as portrait blurring.
Step 7: through the MMU unit, the AP side accesses, via virtual addresses, the sensor data and the depth information data in the DDR on the Pre-ISP side, and then performs post-processing fusion on the acquired data. Data is thus exchanged without any MIPI hardware involvement, achieving single-shot portrait blurring.
It should be understood that, in embodiments of the present application, "B corresponding to A" means that B is associated with A, and that B may be determined from A. It should also be understood that determining B from A does not mean determining B from A alone; B may also be determined from A and/or other information.
It should be understood that the term "and/or" merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, both A and B exist, or B exists alone. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects.
It should be understood that, in the various embodiments of the present application, the sequence numbers of the foregoing processes do not imply an order of execution; the order of execution should be determined by the functions and internal logic of the processes, and should not constitute any limitation on the implementation of the embodiments of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division into units is merely a logical functional division, and other divisions are possible in an actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be implemented through interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be read by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., digital versatile disc (DVD)), or a semiconductor medium (e.g., solid-state drive (SSD)), etc.
The foregoing is merely a specific description of the present application, and the scope of the present application is not limited thereto; any person skilled in the art can readily conceive of variations or substitutions within the technical scope disclosed herein. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A system for processing image data, comprising:
an image signal processor, configured to receive raw image data from a sensor, preprocess the raw image data to obtain processed image data, and store the processed image data at a physical memory address of the image signal processor;
a memory management unit, configured to establish a mapping relationship between the physical memory address of the image signal processor and a virtual memory address of an application processor; and
the application processor, connected to the memory management unit and configured to acquire the processed image data from the physical memory address of the image signal processor according to the mapping relationship, and to generate target image data from the processed image data.
2. The system of claim 1, further comprising:
a neural network processor, configured to acquire depth information of the raw image data and store the depth information at a physical memory address of the image signal processor;
wherein the application processor is further configured to acquire the depth information from the physical memory address of the image signal processor according to the mapping relationship, and to generate the target image data from the processed image data and the depth information.
3. The system of claim 2, wherein the depth information is used to perform one or more of the following operations on the raw image data: single-shot portrait blurring, neon portrait blurring, and background blurring.
4. The system according to claim 2 or 3, wherein the neural network processor and the image signal processor are both integrated in an image signal preprocessing chip.
5. An electronic device, comprising:
a sensor; and
the system of any one of claims 1 to 4.
6. An image processing method, applied to an application processor, the method comprising:
acquiring, from a memory management unit, a mapping relationship between a physical memory address of an image signal processor and a virtual memory address of the application processor; and
accessing the physical memory address of the image signal processor according to the mapping relationship, so as to acquire image data processed by the image signal processor from the physical memory address.
7. The method of claim 6, further comprising:
accessing the physical memory address of the image signal processor according to the mapping relationship, so as to acquire depth information of the processed image data from the physical memory address; and
generating target image data from the processed image data and the depth information.
8. The method of claim 7, wherein the depth information is used to perform one or more of the following operations on raw image data corresponding to the processed image data: single-shot portrait blurring, neon portrait blurring, and background blurring.
9. An image processing method, comprising:
an image signal processor receives raw image data from a sensor;
the image signal processor preprocesses the raw image data to obtain processed image data;
the image signal processor stores the processed image data at a physical memory address of the image signal processor;
a memory management unit establishes a mapping relationship between the physical memory address of the image signal processor and a virtual memory address of an application processor;
the application processor acquires the processed image data from the physical memory address of the image signal processor according to the mapping relationship; and
the application processor generates target image data from the processed image data.
10. The method of claim 9, further comprising:
a neural network processor acquires depth information of the raw image data;
the neural network processor stores the depth information at a physical memory address of the image signal processor;
the application processor acquires the depth information from the physical memory address of the image signal processor according to the mapping relationship; and
the application processor generates the target image data from the processed image data and the depth information.
11. The method of claim 10, wherein the depth information is used to perform one or more of the following operations on the raw image data: single-shot portrait blurring, neon portrait blurring, and background blurring.
12. A computer-readable storage medium having stored thereon a computer program which, when executed, implements the method of any one of claims 6 to 8 or claims 9 to 11.
CN202211329143.8A 2022-10-27 2022-10-27 Method and system for processing image data and electronic equipment Pending CN117952818A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211329143.8A CN117952818A (en) 2022-10-27 2022-10-27 Method and system for processing image data and electronic equipment

Publications (1)

Publication Number Publication Date
CN117952818A true CN117952818A (en) 2024-04-30

Family

ID=90793329

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211329143.8A Pending CN117952818A (en) 2022-10-27 2022-10-27 Method and system for processing image data and electronic equipment

Country Status (1)

Country Link
CN (1) CN117952818A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination