CN113298690B - Image data processing method and device and electronic equipment - Google Patents


Info

Publication number
CN113298690B
Authority
CN
China
Prior art keywords
channel
data
pixel
image
pixel data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110853603.6A
Other languages
Chinese (zh)
Other versions
CN113298690A (en)
Inventor
吕焱飞
王宗苗
史为平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Huaray Technology Co Ltd
Original Assignee
Zhejiang Huaray Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Huaray Technology Co Ltd filed Critical Zhejiang Huaray Technology Co Ltd
Priority to CN202110853603.6A
Publication of CN113298690A
Application granted
Publication of CN113298690B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/0007 Image acquisition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • G06T1/20 Processor architectures; Processor configuration, e.g. pipelining

Abstract

The application provides an image data processing method and apparatus, an electronic device, and a computer-readable storage medium. The method includes: separately sampling the data output by different channels of an image sensor to obtain the image data of each channel; performing pixel word alignment on the image data of each channel to obtain the pixel data corresponding to each channel; performing channel alignment on the pixel data corresponding to the different channels of the image sensor; and reordering all of the channel-aligned pixel data by their positions in the image, according to the address mapping table corresponding to each channel. With this method, the image data output by the image sensor can be used directly by a back-end processor.

Description

Image data processing method and device and electronic equipment
Technical Field
The present disclosure relates to image processing technologies, and in particular, to an image data processing method and apparatus, and an electronic device.
Background
The image data output by an image sensor is typically provided to a back-end processor for direct use by an image processing module in the processor. However, the image data output by the output interfaces of some image sensors (such as large-area-array and high-speed image sensors) cannot be used directly by the back-end processor.
Disclosure of Invention
The embodiment of the application provides an image data processing method and device and electronic equipment, so that image data output by an image sensor can be directly used by a processor at the back end.
The technical solutions of the embodiments of the present application are realized as follows:
in a first aspect, an embodiment of the present application provides an image data processing method, including:
respectively sampling data output by different channels of an image sensor to obtain image data of each channel;
respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel;
carrying out channel alignment processing on pixel data corresponding to the different channels of the image sensor;
and reordering all of the channel-aligned pixel data by their positions in the image, according to the address mapping table corresponding to each channel.
In some embodiments, the method further comprises:
configuring a time interval corresponding to each channel;
determining the channel corresponding to the time interval to which the current moment belongs;
and caching the pixel data of that channel based on the address mapping table corresponding to it.
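As an illustration, this time-slot arbitration can be sketched in software. The interval bounds, mapping-table layout, and function names below are illustrative assumptions, not details from the patent; the actual design is hardware logic.

```python
# Hedged sketch: each channel owns a configured time interval; the channel
# whose interval contains the current moment gets its pixel data cached at
# the addresses prescribed by its address mapping table.

def channel_for_time(t, intervals):
    """Return the channel whose configured time interval contains instant t."""
    for channel, (start, end) in intervals.items():
        if start <= t < end:
            return channel
    raise ValueError(f"no channel configured for time {t}")

def cache_pixels(t, intervals, address_maps, channel_data, buffer):
    """Cache the pixel data of the channel owning the current time slot,
    at the buffer addresses its mapping table prescribes."""
    ch = channel_for_time(t, intervals)
    for idx, pixel in enumerate(channel_data[ch]):
        buffer[address_maps[ch][idx]] = pixel
    return ch

# Two channels sharing a 4-tick cycle: channel 0 owns ticks [0,2), channel 1 [2,4)
intervals = {0: (0, 2), 1: (2, 4)}
address_maps = {0: [0, 2], 1: [1, 3]}   # interleaved placement in the buffer
channel_data = {0: [10, 11], 1: [20, 21]}
buffer = [None] * 4
cache_pixels(0, intervals, address_maps, channel_data, buffer)  # channel 0's slot
cache_pixels(2, intervals, address_maps, channel_data, buffer)  # channel 1's slot
```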
In some embodiments, reordering all of the channel-aligned pixel data by their positions in the image according to the address mapping table corresponding to each channel includes:
determining, according to each address mapping table, the position of each pixel datum of the channel corresponding to that table;
reordering all of the pixel data according to the position of each pixel datum;
wherein the position of a pixel datum includes the row information and column information at which it is stored.
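A minimal software sketch of this position-based reordering, assuming the address mapping table stores a (row, column) pair per pixel — an illustrative table layout, not one specified by the patent:

```python
# Hedged sketch: place each channel's pixel data at the image position
# (row, column) given by that channel's address mapping table.

def reorder(channel_pixels, address_maps, height, width):
    """Build a height x width image with every pixel at its mapped position."""
    image = [[None] * width for _ in range(height)]
    for ch, pixels in channel_pixels.items():
        for i, px in enumerate(pixels):
            row, col = address_maps[ch][i]
            image[row][col] = px
    return image

# A 2x4 image whose even columns came from channel 0, odd columns from channel 1
maps = {0: [(0, 0), (0, 2), (1, 0), (1, 2)],
        1: [(0, 1), (0, 3), (1, 1), (1, 3)]}
pixels = {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
img = reorder(pixels, maps, 2, 4)
```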
In some embodiments, before reordering all the pixel data subjected to the channel alignment processing according to the positions of the pixel data in the image, the method further comprises:
and generating an address mapping table corresponding to each channel according to the arrangement sequence of the test data output by different channels of the image sensor.
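Assuming a test pattern in which each pixel carries its own position index (a hypothetical pattern; the patent does not specify one), generating the per-channel address mapping tables from the observed arrival order can be sketched as:

```python
# Hedged sketch: build per-channel address mapping tables from the order in
# which known test data arrives on each channel. Because the test pattern
# encodes each pixel's position index, the arrival order determines the map.

def build_address_maps(observed, width):
    """observed[ch] is the list of position indices the channel emitted,
    in arrival order; return per-channel lists of (row, col) targets."""
    maps = {}
    for ch, indices in observed.items():
        maps[ch] = [(idx // width, idx % width) for idx in indices]
    return maps

# Channel 0 was observed carrying positions 0 and 2, channel 1 positions 1 and 3
maps = build_address_maps({0: [0, 2], 1: [1, 3]}, width=2)
```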
In some embodiments, before reordering all the pixel data subjected to the channel alignment processing according to the positions of the pixel data in the image, the method further comprises:
decoding the line synchronization coded data and the frame coded data output by the image sensor to obtain a line synchronization signal and a frame synchronization signal;
and determining, based on the line synchronization signal and the frame synchronization signal, the frame to which the image data belongs and the line start position of the image data within the frame.
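A hedged sketch of this decoding step, using hypothetical start-of-frame and start-of-line code words (the real encoding is sensor-specific and not given in the patent):

```python
# Hedged sketch: recover frame membership and line-start positions from sync
# codes embedded in the data stream. SOF/SOL values are illustrative only.

SOF, SOL = 0xAB, 0x80  # hypothetical start-of-frame / start-of-line codes

def locate_lines(stream):
    """Return (frame_index, offset) for each line start found in the stream."""
    frame, starts = -1, []
    for offset, word in enumerate(stream):
        if word == SOF:
            frame += 1
        elif word == SOL:
            starts.append((frame, offset + 1))  # line data begins after the code
    return starts

starts = locate_lines([SOF, SOL, 1, 2, SOL, 3, 4, SOF, SOL, 5, 6])
```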
In some embodiments, the performing pixel word alignment processing on the sampled image data includes:
acquiring single test data output by the image sensor, sliding a window by taking a pixel as a unit, and determining a pixel word alignment boundary;
and performing pixel word alignment on the parallel image data based on the determined pixel word alignment boundary.
In some embodiments, the separately sampling data output by different channels of the image sensor includes:
and respectively sampling the image data output by each channel according to the sampling phase corresponding to each channel of the image sensor.
In some embodiments, before the sampling the data output by the different channels of the image sensor respectively, the method further comprises:
performing the following for each channel of the image sensor:
determining the deserialized data of single test data output by the channel under different delay times;
and determining the sampling phase corresponding to the channel based on the delay times corresponding to the transition values in the deserialized data.
In some embodiments, after reordering all the pixel data subjected to the channel alignment processing according to the positions of the pixel data in the image, the method further comprises:
and outputting the pixel data in the order of the pixels' positions in the corresponding image, according to the preconfigured number of parallel output pixels.
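This output step amounts to chunking the in-order pixel stream by the configured parallel width; a minimal sketch (the function name and list model are illustrative):

```python
# Hedged sketch: emit reordered pixel data in position order, grouped by a
# preconfigured number of parallel output pixels (e.g. the back-end bus width).

def emit(pixels, parallel):
    """Chunk an in-order pixel stream into groups of `parallel` pixels."""
    return [pixels[i:i + parallel] for i in range(0, len(pixels), parallel)]

words = emit([0, 1, 2, 3, 4, 5, 6, 7], parallel=4)
```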
In a second aspect, an embodiment of the present application provides an image data processing apparatus, including:
the sampling module is used for respectively sampling data output by different channels of the image sensor to obtain image data of each channel;
the pixel word alignment module is used for respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel;
the channel alignment module is used for carrying out channel alignment processing on pixel data corresponding to different channels of the image sensor;
and the ordering module is used for reordering all of the channel-aligned pixel data by their positions in the image according to the address mapping table corresponding to each channel.
In some embodiments, the image data processing apparatus further comprises: the storage module is used for configuring a time interval corresponding to each channel;
determining a channel corresponding to a time interval to which the current moment belongs;
and caching the pixel data corresponding to the channel corresponding to the time interval to which the current time belongs based on the address mapping table corresponding to the channel corresponding to the time interval to which the current time belongs.
In some embodiments, the sorting module is configured to determine, according to the address mapping table, a position of each pixel data corresponding to a channel corresponding to the address mapping table;
reordering all of the pixel data according to the position of each pixel datum;
wherein the position of a pixel datum includes the row information and column information at which it is stored.
In some embodiments, the sorting module is further configured to generate an address mapping table corresponding to each channel according to an arrangement order of test data output by different channels of the image sensor.
In some embodiments, the sorting module is further configured to decode the line synchronization encoded data and the frame encoded data output by the image sensor to obtain a line synchronization signal and a frame synchronization signal;
and determining, based on the line synchronization signal and the frame synchronization signal, the frame to which the image data belongs and the line start position of the image data within the frame.
In some embodiments, the pixel word alignment module is configured to obtain single test data output by the image sensor, slide a window in units of pixels, and determine a boundary of pixel word alignment;
and performing pixel word alignment processing on the parallel image data based on the pixel word aligned boundary.
In some embodiments, the sampling module is configured to sample the image data output by each channel of the image sensor according to a sampling phase corresponding to each channel.
In some embodiments, the sampling module is further configured to, for each channel of the image sensor:
determining the deserialized data of the single test data output by the channel under different delay times;
and determining the sampling phase corresponding to the channel based on the delay times corresponding to the transition values in the deserialized data.
In some embodiments, the apparatus further comprises an output module for outputting the pixel data in order of position of the pixels in the corresponding image according to a preconfigured number of output parallel pixels.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the image data processing method provided by the embodiment of the application when the executable instructions stored in the memory are executed.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, which stores executable instructions and is configured to, when executed by a processor, implement an image data processing method provided in an embodiment of the present application.
The image data processing method provided by the embodiments of the present application samples the data output by different channels of an image sensor to obtain the image data of each channel; performs pixel word alignment on the image data of each channel to obtain the pixel data corresponding to each channel; performs channel alignment on the pixel data corresponding to the different channels of the image sensor; and reorders all of the channel-aligned pixel data by their positions in the image according to the address mapping table corresponding to each channel. By performing pixel word alignment and channel alignment on the image data output by each channel of the image sensor, the method makes the image data suitable for different Printed Circuit Boards (PCBs) and different systems, improving the adaptability of the image data. Because the channel-aligned pixel data are reordered by image position according to the per-channel address mapping tables, the storage order of the pixel data can be configured flexibly, and the pixel data can be output in the configured storage order, so that the image data output by the output interface of the image sensor can be used directly by the back-end processor.
Drawings
Fig. 1 is a schematic diagram of an output interface of an image sensor in the related art;
fig. 2 is a schematic diagram of image data output from an image sensor in the related art;
FIG. 3 is a block diagram of an image data processing system according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of an architecture of a server provided in an embodiment of the present application;
FIG. 5 is a schematic view of an alternative flowchart of an image data processing method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of determining a sampling phase according to an embodiment of the present application;
FIG. 7 is a diagram illustrating phase scanning results provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of pixel word alignment provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of image data before channel alignment according to an embodiment of the present application;
FIG. 10 is a schematic diagram of image data after channel alignment according to an embodiment of the present application;
FIG. 11 is an alternative diagram of an address mapping table provided in an embodiment of the present application;
FIG. 12 is an alternative diagram of an address mapping table provided in an embodiment of the present application;
FIG. 13 is a schematic diagram of a time interval for storing pixel data for a channel configuration according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a line cache provided by an embodiment of the present application;
FIG. 15 is a schematic diagram of a data write FIFO memory according to an embodiment of the present application;
FIG. 16 is a schematic diagram of output pixel data provided by an embodiment of the present application;
fig. 17 is a schematic overall processing flow diagram of a data processing method according to an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present application.
In the following description, reference is made to "some embodiments" which describe a subset of all possible embodiments, but it is understood that "some embodiments" may be the same subset or different subsets of all possible embodiments, and may be combined with each other without conflict.
In the following description, the terms "first/second/third" are used only to distinguish similar objects and do not denote a particular order; it should be understood that "first/second/third" may be interchanged in a specific order or sequence, where permitted, so that the embodiments of the application described herein can be implemented in an order other than that shown or described. In the following description, the term "plurality" means at least two.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, description will be given of image data output from an image sensor in the related art.
The image sensor can output data through a plurality of channels, and the output pixel data are not arranged in the order of the pixels' positions in the image, which hinders subsequent image data processing. As shown in fig. 1, an image sensor includes a plurality of output interfaces, each output interface corresponds to a channel, and each channel outputs the corresponding image data. As shown in fig. 2, a row of image data is output through different channels: the image data output at the same moment are not contiguous, and the data output by the image sensor are not arranged in the order of the pixel data's positions in the image.
The embodiment of the application provides an image data processing method, an image data processing device, an electronic device and a computer-readable storage medium, wherein an address mapping table is configured for each sensor channel, so that the storage sequence of pixel data output by each channel can be flexibly configured, and further the pixel data can be output according to the configured storage sequence; the image data output by the image sensor can be suitable for different PCBs and different systems by carrying out pixel word alignment processing and channel alignment processing on the image data output by the image sensor, the adaptability of the image data is improved, and the data output by the image sensor can be directly used by a subsequent processor. An exemplary application of the electronic device provided in the embodiment of the present application is described below, and the electronic device provided in the embodiment of the present application may be implemented as various types of terminal devices, and may also be implemented as a server.
Referring to fig. 3, fig. 3 is a schematic diagram of an architecture of the image data processing system 100 according to an embodiment of the present application, where a terminal device 400 or a server 200 collects image data output by an image sensor.
In some embodiments, taking the electronic device implementing the image data processing method being a terminal device as an example, the image data processing method provided in the embodiments of the present application may be implemented by the terminal device and a server in cooperation. For example, the server 200 determines the sampling phase corresponding to each channel according to the test data in the database 500, and sends the per-channel sampling phases to the terminal device 400. The terminal device 400 runs a client 410, and the client 410 may be a client for image data processing.
The client 410 samples data output by different channels of the image sensor respectively to obtain image data of each channel; respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel; carrying out channel alignment processing on pixel data corresponding to different channels of the image sensor; and reordering all the pixel data subjected to the channel alignment processing according to the positions of the pixel data in the image according to the address mapping table corresponding to each channel.
In some embodiments, taking the example that the electronic device implementing the image data processing method is a server, the server 200 determines, according to the test data in the database, sampling phases respectively corresponding to each channel, and samples data output by different channels of the image sensor based on the sampling phases respectively corresponding to each channel to obtain image data of each channel; the sampling phase may also be referred to as a sampling time. Respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel; carrying out channel alignment processing on pixel data corresponding to different channels of the image sensor; and reordering all the pixel data subjected to the channel alignment processing according to the positions of the pixel data in the image according to the address mapping table corresponding to each channel.
In some embodiments, the terminal device 400 or the server 200 may implement the image data processing method provided by the embodiments of the present application by running a computer program. For example, the computer program may be a native program or a software module in an operating system; a native application (APP), i.e., a program that must be installed in the operating system to run; a web program that runs after being downloaded into a browser environment; or a mini program that can be embedded into any APP. In general, the computer program may be any form of application, module, or plug-in.
In some embodiments, the server 200 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, web services, cloud communication, middleware services, domain name services, security services, CDN, and big-data and artificial-intelligence platforms. Cloud Technology refers to a hosting technology that unifies a series of resources such as hardware, software, and networks in a wide-area or local-area network to implement the computing, storage, processing, and sharing of data. The terminal device 400 may be, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer, and the like. The terminal device and the server may be connected directly or indirectly through wired or wireless communication, which is not limited in the embodiments of the present application.
Taking the electronic device provided by the embodiments of the present application being a server as an example, it can be understood that some parts of the structure shown in fig. 4 (such as the user interface, the presentation module, and the input processing module) may be omitted. Referring to fig. 4, fig. 4 is a schematic structural diagram of a server 200 according to an embodiment of the present application. The server 200 shown in fig. 4 includes: at least one processor 460, a memory 450, at least one network interface 420, and a user interface 430. The various components in the server 200 are coupled together by a bus system 440. It can be understood that the bus system 440 is used to enable connection and communication among these components. In addition to a data bus, the bus system 440 includes a power bus, a control bus, and a status signal bus. For clarity of illustration, however, the various buses are all labeled as the bus system 440 in fig. 4.
The Processor 460 may be an integrated circuit chip having Signal processing capabilities, such as a general purpose Processor, a Digital Signal Processor (DSP), or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, etc., wherein the general purpose Processor may be a microprocessor or any conventional Processor, etc.
The user interface 430 includes one or more output devices, including one or more speakers and/or one or more visual display screens, that enable the presentation of media content. The user interface 430 also includes one or more input devices, including user interface components that facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 450 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 450 optionally includes one or more storage devices physically located remote from processor 460.
The memory 450 may be volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memories. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 450 described in the embodiments of the present application is intended to comprise any suitable type of memory.
In some embodiments, memory 450 is capable of storing data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 451, including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
a network communication module 452 for reaching other computing devices via one or more (wired or wireless) network interfaces 420, exemplary network interfaces 420 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), and the like;
a rendering module 453 for enabling the rendering of information (e.g., user interfaces for operating peripherals and displaying content and information) via one or more output devices (e.g., display screens, speakers, etc.) associated with user interface 430;
an input processing module 454 for detecting one or more user inputs or interactions from one of the one or more input devices and translating the detected inputs or interactions.
In some embodiments, the apparatus provided in the embodiments of the present application may be implemented in software, and fig. 4 illustrates an image data processing apparatus 455 stored in the memory 450, which may be software in the form of programs and plug-ins, and may include the following software modules: a sampling module 4551, a pixel word alignment module 4552, a channel alignment module 4553 and a sorting module 4554, which are logical and thus may be arbitrarily combined or further split depending on the functionality implemented. The functions of the respective modules will be explained below.
An embodiment of the present application provides an image data processing method, which can at least solve the above problem.
The image data processing method provided by the embodiment of the present application will be described below in conjunction with exemplary applications and implementations of the electronic device provided by the embodiment of the present application.
Referring to fig. 5, fig. 5 is an alternative flowchart of an image data processing method provided in an embodiment of the present application, which will be described with reference to the steps shown in fig. 5.
Step S101, respectively sampling data output by different channels of the image sensor to obtain image data of each channel.
In some embodiments, the image sensor includes two or more channels, each channel corresponds to a sampling phase, and the sampling phases corresponding to different channels may be the same or different. Based on this, the data output by each channel of the image sensor can be sampled at the sampling phase corresponding to each channel. As an example, the first channel corresponds to a first sampling phase, and the second channel corresponds to a second sampling phase, so that data output by the first channel is sampled according to the first sampling phase to obtain image data of the first channel, and data output by the second channel is sampled according to the second sampling phase to obtain image data of the second channel. The image data output by the image sensor through the output pin is serial image data.
In some embodiments, before performing step S101, the image data processing method may further include:
performing the following for each channel of the image sensor: determining the deserialized data of single test data output by the channel under different delay times, where single test data means that the test data is a string of identical values, such as 00000 or 11111; and determining the sampling phase corresponding to the channel based on the delay times at which transition values appear in the deserialized data. As an example, fig. 6 is a schematic diagram of determining a sampling phase: the test data output by the image sensor is phase-scanned through a delay chain, and the optimal sampling phase is determined from the delay times corresponding to the transition values in the deserialized data. The optimal sampling phase should be as far as possible from those delay times, so as to improve resistance to phase drift. As shown in fig. 7, the abscissa is the delay value of the delay chain; with a step of n, the delay values are 0, n, 2n, 3n, ..., m. Starting from a delay value of 0, the delay time is increased in steps of n, and the deserialized data is recorded at each step. The ordinate indicates the delay value chosen from the delay times of the transition values in the different cases. For case 1, no transition value is found, and the delay value of the delay chain is set to m/2. For case 2, a transition value p0 is found with p0 < m/2, and the delay value is set to (m-p0)/2. For case 3, a transition value p0 is found with p0 = m/2, and the delay value is set to p0/2. For case 4, a transition value p0 is found with p0 > m/2, and the delay value is set to p0/2. For case 5, two transition values p0 and p1 are found, and the delay value is set to (p1-p0)/2.
For case 6, more than two transition values p0, p1, ... are found, and the delay value is likewise set to (p1-p0)/2. In this way, using the test data output by the channels, the optimal sampling phase corresponding to each channel can be determined without knowing the serial output data rate of the image sensor, ensuring correct sampling of the data output by the channels.
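The case analysis above can be condensed into a small routine. This is a software sketch of logic that would live in hardware; the function name is illustrative, and the formulas follow the cases as listed in the text.

```python
# Hedged sketch: choose a delay-chain setting from the transition positions
# found while scanning delays 0..m. Cases mirror the text: no transition,
# one transition (below / at / above m/2), and two or more transitions.

def pick_delay(transitions, m):
    """transitions: sorted delay values where the deserialized test word
    flipped; m: maximum scanned delay. Return the delay value to program."""
    if not transitions:                 # case 1: no transition found
        return m // 2
    if len(transitions) == 1:
        p0 = transitions[0]
        if p0 < m / 2:                  # case 2
            return (m - p0) // 2
        return p0 // 2                  # cases 3 and 4 (p0 >= m/2)
    p0, p1 = transitions[0], transitions[1]
    return (p1 - p0) // 2               # cases 5 and 6: first two transitions

delay = pick_delay([30, 90], m=100)
```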
Step S102, respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel.
In some embodiments, performing the pixel word alignment process on the sampled image data may include: performing pixel word alignment processing on the parallel image data based on a predetermined pixel word alignment boundary.
In some embodiments, prior to performing the pixel word alignment process, the method may further comprise: determining the boundary of pixel word alignment. In a specific implementation, the image sensor may be configured to output single, fixed test data; a window is slid in units of pixels to determine the boundary of pixel word alignment, and the corresponding window position is stored. As shown in fig. 8, the bit width of the pixel data is n, and parallel data 1 is the data obtained by delaying parallel data 0 by one clock.
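The sliding-window boundary search can be sketched offline as follows. This assumes the sensor repeats a single known test word back to back; the function name and bit-level representation are illustrative:

```python
def find_word_boundary(bits, test_word, n):
    """Slide an n-bit window over the serialized stream in steps of
    one bit position; return the offset at which every n-bit word
    equals the known test word (the pixel word boundary)."""
    for offset in range(n):
        windows = [bits[i:i + n]
                   for i in range(offset, len(bits) - n + 1, n)]
        values = {int("".join(map(str, w)), 2)
                  for w in windows if len(w) == n}
        if values == {test_word}:
            return offset  # store this window position
    return None  # boundary not found at any offset
```

Once the offset is known, it is stored and applied to all subsequent parallel image data from that channel.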
Step S103, performing channel alignment processing on the pixel data corresponding to different channels of the image sensor.
In some embodiments, different channels of the image sensor may complete pixel word alignment with different delays, which can leave the pixel data of different channels misaligned with one another even after pixel word alignment is complete. Based on this, the image sensor can be configured to output a periodic, non-single repeating test data sequence; the deviation between channels is determined from the periodically output test data sequence, and the deviation is then eliminated through data pipeline operations to achieve channel alignment. The image data before channel alignment is shown in fig. 9, where it can be seen that the image data output by the different channels is not aligned. The image data after channel alignment is shown schematically in fig. 10, where it can be seen that the image data output by the different channels is completely aligned.
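The skew measurement and compensation can be sketched offline as below. Each channel's phase within the periodic test sequence is located from its first sample (assuming the words of the pattern are distinct), and earlier channels are then delayed — trimmed here, whereas hardware would use pipeline registers — until all channels line up:

```python
def align_channels(streams, pattern):
    """Locate each channel's phase within the repeating test pattern
    and trim the early channels so all start at the same pattern
    position (hardware would instead insert pipeline delays)."""
    period = len(pattern)
    phases = [pattern.index(s[0]) for s in streams]
    target = max(phases)
    return [s[(target - p) % period:] for p, s in zip(phases, streams)]
```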
In some embodiments, before performing steps S101 to S103, parameters of the image sensor, such as the number of channels, the pixel bit width, and the output interface parameters of the image sensor, may also be configured.
Step S104, reordering all the pixel data subjected to the channel alignment processing according to their positions in the image, based on the address mapping table corresponding to each channel.
In some embodiments, before reordering all the pixel data subjected to the channel alignment processing according to their positions in the image, the method may further include: decoding the line synchronization encoded data and the frame encoded data output by the image sensor to obtain a line synchronization signal and a frame synchronization signal. The frame corresponding to the pixel data can be determined from the frame synchronization signal, and the line corresponding to the pixel data, together with the line start position, can be determined from the line synchronization signal. Different image sensors can be configured with different line synchronization encoded data and frame encoded data.
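A minimal sketch of recovering line/frame structure from the decoded word stream; the sync code values used below are hypothetical, since each sensor defines its own encoding:

```python
def decode_sync(words, line_code, frame_code):
    """Split a decoded word stream into (frame, line, pixels) records
    using embedded frame/line synchronization codes."""
    records, frame, line, row = [], -1, -1, []
    for w in words:
        if w == frame_code:        # frame sync: new frame, reset line
            frame, line = frame + 1, -1
        elif w == line_code:       # line sync: flush row, start line
            if row:
                records.append((frame, line, row))
                row = []
            line += 1
        else:
            row.append(w)          # payload pixel data
    if row:
        records.append((frame, line, row))
    return records
```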
In some embodiments, an address mapping table may be configured for each channel of the image sensor; specifically, the address mapping table corresponding to each channel may be generated according to the arrangement order of the test data output by the different channels of the image sensor. The address mapping table indicates the storage area corresponding to the pixel data output by the channel or, equivalently, the position in the image of the pixel data output by the channel. As an example, the position of the pixel data output by the channel corresponding to the time interval to which the current time belongs may be determined according to the address mapping table, and all the pixel data may then be reordered according to the position of each pixel. The position of the pixel data comprises row information and column information for storing the pixel data: the row information indicates the row corresponding to the pixel data, and the column information indicates the specific position of the pixel data within that row.
Therefore, after all pixels are reordered according to the address mapping table corresponding to each channel, the pixel data can be cached according to the positions of the pixel data in the image, and the ordered storage of the pixel data is realized.
In some embodiments, the pixel data may be stored in a buffer. Different address mapping tables can be configured for different types of image sensors, and different address mapping tables can also be configured for different image sensors of the same type.
As an optional schematic diagram of the address mapping table, as shown in fig. 11, channel 0 outputs the pixel data with pixel values between 0 and 670, and the address mapping table corresponding to channel 0 is address mapping table #0; channel 1 outputs the pixel data with even values between pixel values 672 and 1342, the address mapping table corresponding to channel 1 is address mapping table #1, and so on. As shown in fig. 12, channel 0 outputs the pixel data with pixel values 0-127, and the address mapping table corresponding to channel 0 is address mapping table #0; channel 1 outputs the pixel data with pixel values from 128 onward, and so on.
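The reordering driven by an address mapping table can be sketched as a scatter into a row-major image buffer. The interleaved layout in the example echoes fig. 11's style (channel 0 carrying even image addresses and channel 1 odd); real tables would be generated from the sensor's test output:

```python
def reorder_pixels(samples, addr_map, width):
    """Scatter per-channel pixel data into a row-major image buffer;
    addr_map gives, per channel, the image address of each sample."""
    image = {}
    for ch, data in samples.items():
        for i, px in enumerate(data):
            addr = addr_map[ch][i]
            image[addr // width, addr % width] = px  # (row, column)
    return image
```

With `addr_map = {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}` and an image width of 8, channel 0's samples land in the even columns of row 0 and channel 1's in the odd columns.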
In some embodiments, before storing the pixel data subjected to the channel alignment processing in the corresponding storage area, the method may further include performing polling arbitration on the channels; polling arbitration refers to determining which channel's output image data is to be stored at the current time. As an example, polling arbitration on the channels may include: configuring a time interval for storing pixel data for each channel; determining the channel corresponding to the time interval to which the current time belongs; and caching the pixel data of that channel based on the address mapping table corresponding to that channel.
In some embodiments, each channel has a corresponding time interval for buffering pixel data. A schematic diagram of configuring the time intervals for storing pixel data is shown in fig. 13, where a writing time period is configured for each of channel 0 through channel n; the writing period of a channel refers to the time interval during which the pixel data output by that channel is stored in the cache.
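A round-robin arbiter over fixed write windows, as in fig. 13, can be sketched as a simple mapping from a time counter to the owning channel (the slot length and channel count below are illustrative):

```python
def polling_arbiter(num_channels, slot_len):
    """Return a function mapping a time counter to the channel whose
    write window contains it: each channel owns slot_len ticks in
    round-robin order."""
    def owner(t):
        return (t // slot_len) % num_channels
    return owner
```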
In the embodiment of the application, each channel of the image sensor corresponds to one line-cache write interface, while a line cache generally comprises only one write interface and one read interface; by performing polling arbitration on the channels, only one line-cache interface writes data at any given moment.
Since writing and reading of the pixel data may use different clocks, the embodiment of the application caches 2 lines. A schematic diagram of the line cache is shown in fig. 14: the line cache includes ram0 and ram1, which cache 2 lines of pixel data respectively.
In some embodiments, the line cache may be implemented based on a First-In-First-Out (FIFO) memory. As shown in fig. 15, after the line synchronization signal is obtained by decoding, the line cache write enable is triggered, and the pixel data corresponding to the channel is written into the FIFO memory in order of increasing pixel value according to the address mapping table, together with the line cache address.
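The two-line (ping-pong) cache of fig. 14 can be sketched as follows: while one RAM receives the incoming line, the previously completed line is available for reading from the other. Clock-domain crossing details are omitted in this sketch:

```python
class TwoLineBuffer:
    """Ping-pong line cache (ram0/ram1): the line being written and
    the line being read always live in different RAMs."""
    def __init__(self):
        self.rams = [[], []]
        self.wr = 0                    # RAM currently being written

    def write_line(self, pixels):
        self.rams[self.wr] = list(pixels)
        self.wr ^= 1                   # swap RAMs at the line boundary

    def read_line(self):
        return self.rams[self.wr ^ 1]  # the last completed line
```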
In some embodiments, after performing step S104, the method may further include:
and step S105, outputting the pixel data according to the position sequence of the pixels in the corresponding image according to the pre-configured output parallel pixel quantity.
In some embodiments, the number of pixels output in parallel at each clock can be flexibly configured according to the actual situation. As shown in fig. 16, if the configured number of parallel output pixel data is n, then in the first clock the n pixel data with pixel values 0 to n-1 are output in sequence according to their positions in the corresponding image, and in the second clock the n pixel data with pixel values n to 2n-1 are output in sequence according to their positions in the corresponding image.
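Grouping the ordered pixel stream into n pixels per clock, as in fig. 16, is a simple chunking operation; a sketch:

```python
def emit_parallel(pixels, n):
    """Group an ordered pixel stream into words of n pixels per clock:
    clock 0 carries pixels 0..n-1, clock 1 carries n..2n-1, ..."""
    return [pixels[i:i + n] for i in range(0, len(pixels), n)]
```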
To sum up, the overall processing flow of the data processing method provided in the embodiment of the present application may be as shown in fig. 17: adaptive phase adjustment sampling, pixel word alignment processing, and channel alignment are performed for each channel of the image sensor; then the image data output by the image sensor is decoded to obtain a line synchronization signal and a frame synchronization signal, and the frame corresponding to the pixel data and the line start position within the frame are determined based on these signals. The channel whose pixel data is stored in the cache at the current moment is determined by polling arbitration, and the storage area for the pixel data output by that channel is determined according to the channel's address mapping table; in this way, the pixel data is stored in the cache in the order of its position in the image. Finally, the pixel data is output in order of increasing pixel value.
According to the data processing method provided by the embodiment of the application, performing pixel word alignment processing and channel alignment processing on the image data output by each channel of the image sensor makes the image data suitable for different PCBs and different systems, improving its adaptability and flexibility, overcoming problems caused by individual differences between image sensors, and improving the stability of the image sensor. Reordering all the channel-aligned pixel data according to their positions in the image, based on the address mapping table corresponding to each channel, allows the storage order and storage position of the pixel data to be flexibly configured, so that the pixel data can be output in the reordered order. By outputting single test data through the channels, the optimal sampling phase of each channel can be determined without determining the data rate of the image sensor's serial output, thereby ensuring correct sampling of the image data.
Continuing with the exemplary structure of the image data processing apparatus 455 provided by the embodiments of the present application implemented as software modules, in some embodiments, as shown in fig. 4, the software modules of the image data processing apparatus 455 stored in the memory 450 may include: the sampling module 4551 is configured to sample data output by different channels of the image sensor, respectively, to obtain image data of each channel; the pixel word alignment module 4552 is configured to perform pixel word alignment processing on the image data of each channel respectively, to obtain pixel data corresponding to each channel; the channel alignment module 4553 is configured to perform channel alignment processing on pixel data corresponding to different channels of the image sensor; and the sorting module 4554 is configured to reorder, according to the address mapping table corresponding to each channel, all pixel data subjected to the channel alignment processing according to the positions of the pixel data in the image.
In some embodiments, the image data processing device 455 further comprises: the storage module is used for configuring a time interval corresponding to each channel;
determining a channel corresponding to a time interval to which the current moment belongs;
and caching the pixel data corresponding to the channel corresponding to the time interval to which the current time belongs based on the address mapping table corresponding to the channel corresponding to the time interval to which the current time belongs.
In some embodiments, the sorting module 4554 is configured to determine, according to the address mapping table, a position of each pixel data corresponding to a channel corresponding to the address mapping table;
reordering all the pixel data according to the position of each pixel;
wherein the location of the pixel data includes row information and column information to store the pixel data.
In some embodiments, the sorting module 4554 is further configured to generate the address mapping table corresponding to each channel according to an arrangement order of test data output by different channels of the image sensor.
In some embodiments, the sorting module 4554 is further configured to decode the line synchronization encoded data and the frame encoded data output by the image sensor to obtain a line synchronization signal and a frame synchronization signal;
and determining a frame corresponding to the image data and a line starting position of the pixel data in the frame based on the line synchronization signal and the frame synchronization signal.
In some embodiments, the pixel word alignment module 4552 is configured to obtain single test data output by the image sensor, slide a window in units of pixels, and determine a boundary of pixel word alignment;
and performing pixel word alignment processing on the parallel image data based on the pixel word aligned boundary.
In some embodiments, the sampling module 4551 is configured to sample the image data output by each channel of the image sensor according to a sampling phase corresponding to each channel.
In some embodiments, the sampling module 4551 is further configured to perform the following for each channel of the image sensor: determining deserializing data of the single test data output by the channel under different delay times;
and determining a sampling phase corresponding to the channel based on the delay time corresponding to the jump value in the deserializing data.
In some embodiments, the apparatus further comprises: and the output module is used for outputting the pixel data according to the position sequence of the pixels in the corresponding image according to the number of the output parallel pixels which are preset.
An embodiment of the present application provides an electronic device, including: a memory for storing executable instructions; and the processor is used for realizing the image data processing method provided by the embodiment of the application when the executable instructions stored in the memory are executed.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the image data processing method described in the embodiment of the present application.
The embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, cause the processor to execute the method provided by the embodiment of the present application, for example, the image data processing method shown in figs. 5 to 17.
In some embodiments, the computer-readable storage medium may be memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one or any combination of the above memories.
In some embodiments, the executable instructions may be in the form of a program, software module, script, or code written in any form of programming language, including compiled or interpreted languages, or declarative or procedural languages, and it may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, such as in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (10)

1. A method of image data processing, the method comprising:
respectively sampling data output by different channels of an image sensor to obtain image data of each channel;
respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel;
carrying out channel alignment processing on pixel data corresponding to the different channels of the image sensor;
generating an address mapping table corresponding to each channel according to the arrangement sequence of the test data output by different channels of the image sensor;
determining the position of each pixel data corresponding to the channel corresponding to the address mapping table according to the address mapping table;
reordering all the pixel data according to the position of each pixel data;
wherein the location of the pixel data includes row information and column information to store the pixel data.
2. The method of claim 1, further comprising:
configuring a corresponding time interval for each channel;
determining a channel corresponding to a time interval to which the current moment belongs;
and caching the pixel data corresponding to the channel corresponding to the time interval to which the current moment belongs based on the address mapping table corresponding to the channel corresponding to the time interval to which the current moment belongs.
3. The method of claim 1, wherein before determining the position of each pixel data corresponding to the channel corresponding to the address mapping table according to the address mapping table, the method further comprises:
decoding the line synchronization coded data and the frame coded data output by the image sensor to obtain a line synchronization signal and a frame synchronization signal;
and determining a frame corresponding to the pixel data and a line starting position of the pixel image in the frame based on the line synchronization signal and the frame synchronization signal.
4. The method of claim 1, wherein performing pixel word alignment on the sampled image data comprises:
acquiring single test data output by the image sensor, sliding a window by taking a pixel as a unit, and determining a pixel word alignment boundary;
and performing pixel word alignment processing on the parallel image data based on the pixel word aligned boundary.
5. The method of claim 1, wherein separately sampling data output by different channels of the image sensor comprises:
and respectively sampling the image data output by each channel according to the sampling phase corresponding to each channel of the image sensor.
6. The method of claim 5, wherein prior to separately sampling data output by different channels of the image sensor, the method further comprises:
performing the following for each channel of the image sensor:
determining deserialization data of single test data output by the channel under different delay times;
and determining a sampling phase corresponding to the channel based on the delay time corresponding to the jump value in the deserializing data.
7. The method according to any one of claims 1 to 6, wherein after the reordering of all the pixel data according to the position of each of the pixel data, the method further comprises:
and outputting the pixel data according to the position sequence of the pixels in the corresponding image according to the pre-configured output parallel pixel quantity.
8. An image data processing apparatus, characterized in that the apparatus comprises:
the sampling module is used for respectively sampling data output by different channels of the image sensor to obtain image data of each channel;
the pixel word alignment module is used for respectively carrying out pixel word alignment processing on the image data of each channel to obtain pixel data corresponding to each channel;
the channel alignment module is used for carrying out channel alignment processing on pixel data corresponding to different channels of the image sensor;
the sorting module is used for generating an address mapping table corresponding to each channel according to the arrangement sequence of the test data output by different channels of the image sensor; determining the position of each pixel data corresponding to the channel corresponding to the address mapping table according to the address mapping table; reordering all the pixel data according to the position of each pixel data; wherein the location of the pixel data includes row information and column information in which the pixel data is stored.
9. An electronic device, comprising:
a memory for storing executable instructions;
a processor for implementing the image data processing method of any one of claims 1 to 7 when executing the executable instructions stored in the memory.
10. A computer-readable storage medium storing executable instructions for implementing the image data processing method of any one of claims 1 to 7 when executed by a processor.
CN202110853603.6A 2021-07-28 2021-07-28 Image data processing method and device and electronic equipment Active CN113298690B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110853603.6A CN113298690B (en) 2021-07-28 2021-07-28 Image data processing method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110853603.6A CN113298690B (en) 2021-07-28 2021-07-28 Image data processing method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN113298690A CN113298690A (en) 2021-08-24
CN113298690B true CN113298690B (en) 2022-07-26

Family

ID=77331226

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110853603.6A Active CN113298690B (en) 2021-07-28 2021-07-28 Image data processing method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN113298690B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019009538A (en) * 2017-06-22 2019-01-17 マクセル株式会社 Receiving device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000125314A (en) * 1998-10-20 2000-04-28 Matsushita Electric Ind Co Ltd Device for aligning solid-state image pickup elements
US7128266B2 (en) * 2003-11-13 2006-10-31 Metrologic Instruments. Inc. Hand-supportable digital imaging-based bar code symbol reader supporting narrow-area and wide-area modes of illumination and image capture
CN102404578A (en) * 2011-12-21 2012-04-04 中国科学院自动化研究所 Multi-channel video transmitting system and method
US8754972B2 (en) * 2012-02-01 2014-06-17 Kla-Tencor Corporation Integrated multi-channel analog front end and digitizer for high speed imaging applications
US9430418B2 (en) * 2013-03-15 2016-08-30 International Business Machines Corporation Synchronization and order detection in a memory system
CN103347157A (en) * 2013-06-25 2013-10-09 杭州士兰微电子股份有限公司 Method and device for real-time input digital image mirroring storage
CN107249101B (en) * 2017-07-13 2020-01-10 浙江工业大学 High-resolution image acquisition and processing device
CN107835410B (en) * 2017-11-27 2019-12-24 浙江华睿科技有限公司 Method and device for calibrating image sensor

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019009538A (en) * 2017-06-22 2019-01-17 マクセル株式会社 Receiving device

Also Published As

Publication number Publication date
CN113298690A (en) 2021-08-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant