CN113497883B - Image processing method and system, camera module and image acquisition system - Google Patents

Image processing method and system, camera module and image acquisition system

Info

Publication number
CN113497883B
CN113497883B (Application CN202010248670.0A)
Authority
CN
China
Prior art keywords
camera module
image
image data
data stream
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010248670.0A
Other languages
Chinese (zh)
Other versions
CN113497883A (en)
Inventor
Name withheld at the inventor's request (请求不公布姓名)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ninebot Beijing Technology Co Ltd
Original Assignee
Ninebot Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ninebot Beijing Technology Co Ltd filed Critical Ninebot Beijing Technology Co Ltd
Priority to CN202010248670.0A priority Critical patent/CN113497883B/en
Publication of CN113497883A publication Critical patent/CN113497883A/en
Application granted granted Critical
Publication of CN113497883B publication Critical patent/CN113497883B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/296Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time

Abstract

The application discloses an image processing method, an image processing system, a camera module and an image acquisition system, which are characterized in that the image processing method comprises the following steps: sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire images; receiving a coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by converting image data output after image acquisition of the image sensor by a second processor in the camera module; the coaxial signal is obtained by converting the original image data stream output by the second processor through a serializer in the camera module.

Description

Image processing method and system, camera module and image acquisition system
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to an image processing method and system, a camera module and an image acquisition system.
Background
In existing multi-vision systems, images are typically generated on the camera side in the Joint Photographic Experts Group (JPEG) format, and the JPEG image data is then transmitted to a processing unit, which decodes and processes the images. Although JPEG image data transmits quickly, decoding it before running the related algorithms takes a long time, which increases the total time consumed by the algorithms and reduces the performance of online image processing.
Disclosure of Invention
In order to solve the technical problems, the embodiment of the application provides an image processing method and system, a camera module and an image acquisition system.
The embodiment of the application provides an image processing method, which comprises the following steps:
sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire images;
receiving a coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by converting image data output after image acquisition of the image sensor by a second processor in the camera module; the coaxial signal is obtained by converting the original image data stream output by the second processor through a serializer in the camera module.
In an optional embodiment of the present application, the camera module includes N image sensors, N is a positive integer, and correspondingly, the trigger signal is used to trigger each of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
Receiving the coaxial signal sent by the camera module, decoding an original image data stream from the coaxial signal, including:
and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
In an alternative embodiment of the present application, the method further comprises:
performing format conversion on the decoded original image data stream which at most comprises N images to obtain image data after format conversion;
and synthesizing the image data after format conversion to obtain a three-dimensional image.
In an alternative embodiment of the present application, the method further comprises:
after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference time stamp;
determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
synthesizing the image data after format conversion to obtain a three-dimensional image, wherein the synthesizing comprises the following steps:
m images with the same exposure center point are determined from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N;
And synthesizing the M images to obtain a three-dimensional image.
In an alternative embodiment of the present application, sending a trigger signal to a camera module includes:
and sending a trigger signal to the camera module according to the set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
The embodiment of the application also provides an image processing method, which comprises the following steps:
receiving a trigger signal sent by an image processing system, and acquiring an image based on the trigger signal;
converting the image data output after image acquisition into an original image data stream;
and converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
In an alternative embodiment of the present application, the format of the original image data stream is YUV format.
The embodiment of the application also provides an image processing system, which comprises: a first processor and a deserializer; wherein:
the first processor is used for sending a trigger signal to the camera module, and the trigger signal is used for triggering an image sensor in the camera module to acquire images;
The deserializer is used for receiving the coaxial signal sent by the camera module and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by converting image data output after image acquisition of the image sensor by a second processor in the camera module; the coaxial signal is obtained by converting the original image data stream output by the second processor through a serializer in the camera module.
In an alternative embodiment of the present application, the deserializer is connected to the serializer in the camera module through a coaxial cable;
the deserializer is used for receiving the coaxial signal sent by the serializer in the camera module through the coaxial cable.
In an optional embodiment of the present application, the camera module includes N image sensors, N is a positive integer, and correspondingly, the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
the deserializer is specifically for: and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
In an optional embodiment of the application, the first processor is specifically configured to: performing format conversion on the decoded original image data stream which at most comprises N images to obtain image data after format conversion; and synthesizing the image data after format conversion to obtain a three-dimensional image.
In an optional embodiment of the application, the first processor is further configured to: after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference time stamp; determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
m images with the same exposure center point are determined from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N; and synthesizing the M images to obtain a three-dimensional image.
In an alternative embodiment of the present application, the first processor is further specifically configured to: and sending a trigger signal to the image sensor in the camera module according to the set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
The embodiment of the application also provides a camera module, which comprises: an image sensor, a second processor, and a serializer; wherein:
the image sensor is used for receiving a trigger signal sent by the image processing system and collecting images based on the trigger signal;
the second processor is used for converting the image data output after the image sensor performs image acquisition into an original image data stream;
the serializer is used for converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
In an alternative embodiment of the present application, the serializer and the deserializer in the image processing system are connected by a coaxial cable;
the serializer is specifically used for: and converting the original image data stream into a coaxial signal, and transmitting the coaxial signal to a deserializer in the image processing system through the coaxial cable.
In an optional embodiment of the present application, the format of the original image data stream is YUV format.
The embodiment of the application also provides an image acquisition system, which comprises the image processing system and the camera module.
According to the technical scheme of the embodiment of the application, a trigger signal is sent to the camera module to trigger the image sensor in the camera module to acquire images; a coaxial signal sent by the camera module is received, and an original image data stream is decoded from the coaxial signal. The original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition; the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor. In this way, each of a plurality of image sensors can be triggered simultaneously by the same trigger signal to collect images. On the camera module side, the image data output by the image sensors is converted into an original (uncompressed) image data stream, which is converted into a coaxial signal and transmitted to the image processing system for subsequent processing. Because the image processing system recovers the original image data stream directly from the coaxial signal, it does not need to decode compressed image data; this reduces the time the image processing system spends decoding and algorithmically processing the images sent by the camera module, and improves the performance and real-time capability of online image processing.
Drawings
Fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of the structural components of an image acquisition system according to an embodiment of the present application;
fig. 3 is a second schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 4 is a schematic diagram II of the structural composition of an image acquisition system according to an embodiment of the present application;
fig. 5 is a software processing flow chart of an image processing method according to an embodiment of the present application.
Detailed Description
So that the features and objects of the present application can be understood in more detail, a more particular description of the application, briefly summarized above, is given below with reference to the appended drawings; the application is not limited to the embodiments described.
Multi-vision systems are widely applied in devices such as robots, unmanned vehicles, and unmanned aerial vehicles to realize functions such as positioning and navigation, precise industrial measurement, object recognition, virtual reality, scene reconstruction, and surveying. Existing multi-vision systems typically use a Universal Serial Bus (USB) interface; to transmit trigger signals and data between the processing unit and the multiple cameras, dedicated cabling is needed between the processing unit and each camera, which brings high cost, short transmission distance, and inconvenient installation. In addition, under complex working conditions, the reliability of dedicated per-camera USB cabling is poor, so it is difficult to achieve truly accurate synchronization of data acquisition and transmission; problems such as frame loss, and difficulty in flexibly changing the trigger-signal frame rate to suit the application scene, can also occur. Furthermore, after a camera of the multi-vision system collects an image, in order to increase the speed of transmitting the image data to the processing unit, the image data generated by the image sensor is usually converted into JPEG format on the camera side; the JPEG image is then transmitted to the processing unit over the USB connection, and the processing unit decodes the received JPEG image into RGB format before performing subsequent image synthesis. The whole decoding and processing pipeline is therefore time-consuming, which reduces the online processing performance of the multi-vision system.
In order to solve the above problems, an embodiment of the present application provides an image processing method, fig. 1 is a schematic flow chart of the image processing method provided in the embodiment of the present application, as shown in fig. 1, the method includes the following steps:
step 101: and sending a trigger signal to the camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire images.
Specifically, as shown in fig. 2, the image processing system 20 in the embodiment of the present application is connected to the camera module 21, where the camera module 21 includes a plurality of cameras, each of which includes an image sensor, and the image processing system 20 sends a trigger signal to one or more image sensors in the camera module 21 in a unified manner, so as to trigger the one or more image sensors in the camera module 21 to perform image acquisition.
In an alternative embodiment of the present application, sending a trigger signal to a camera module includes:
and sending a trigger signal to the camera module according to the set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
Specifically, in the embodiment of the present application, the trigger signal is also referred to as a frame synchronization signal of the image sensor. The sending frequency of the trigger signal is preset in the image processing system 20, so the trigger signal can be sent to one or more image sensors in the camera module 21 at the set frequency. After the image processing system 20 sends the trigger signal to the camera module 21 at the set frequency, each image sensor can, based on the received trigger signal, acquire images at a frame rate equal to the trigger-signal frequency.
As a preferred embodiment, as shown in fig. 2, the image processing system 20 and the camera module 21 are connected through a coaxial cable 22, and the image processing system 20 can output a trigger signal through its general-purpose input/output (GPIO) port and transmit the trigger signal to the camera module 21 through the coaxial cable 22, so as to trigger one or more image sensors in the camera module 21 to acquire an image.
In the embodiment of the application, the sending frequency of the trigger signal can be flexibly set in the software program of the image processing system according to the actual application scene, so that the frame rate of the image sensor in the camera module 21 for image acquisition is ensured to flexibly change along with the frequency of the trigger signal, and the method and the device are suitable for various scene requirements.
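The trigger-generation logic described above can be sketched as follows. This is only an illustrative sketch: `gpio_write` is a placeholder for whatever GPIO driver call the first processor's SDK actually provides, and the pulse width and timing loop are assumptions, not details from the patent.

```python
import time

def send_trigger_pulses(gpio_write, frame_rate_hz, pulse_width_s=0.001, n_frames=5):
    """Emit n_frames trigger pulses at frame_rate_hz.

    gpio_write(level) is a hypothetical stand-in for the platform's GPIO
    driver call. Each pulse also records a reference timestamp, matching
    the patent's scheme of timestamping the trigger-send time.
    """
    period = 1.0 / frame_rate_hz  # one pulse per acquired frame
    timestamps = []
    for _ in range(n_frames):
        t0 = time.monotonic()
        timestamps.append(t0)      # reference timestamp for this frame
        gpio_write(1)              # rising edge: sensors start exposure
        time.sleep(pulse_width_s)
        gpio_write(0)
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
    return timestamps
```

Changing `frame_rate_hz` in software is all that is needed to change the sensors' acquisition frame rate, since the sensors follow the trigger frequency.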
According to the technical scheme provided by the embodiment of the application, the triggering signal output by the image processing system 20 can be transmitted to the camera module 21 through the coaxial cable 22, the plurality of image sensors in the camera module 21 are triggered to expose and collect images, the synchronous triggering of the plurality of image sensors in the camera module 21 can be realized in the whole process, the time sequence synchronization can be realized when the plurality of image sensors in the camera module 21 collect images, and the reliability is higher.
Step 102: and receiving the coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal.
In the embodiment of the application, the original image data stream is obtained by converting the image data output after the image acquisition of the image sensor by the second processor in the camera module; the coaxial signal is obtained by converting the original image data stream output by the second processor through a serializer in the camera module.
Specifically, as shown in fig. 2, each camera in the camera module 21 includes a serializer and a second processor in addition to the image sensor. After image acquisition, the image sensor in the camera outputs Bayer-format image data and transmits it to the second processor, which is an image signal processor (ISP). The second processor processes the Bayer-format image data output by the image sensor, converts it into an original image data stream, and transmits the original image data stream to the serializer; the serializer converts the original image data stream into an interference-resistant coaxial signal and transmits it to the image processing system 20. In fig. 2, the image processing system 20 includes a deserializer 202; after the serializer transmits the coaxial signal to the image processing system 20, the deserializer 202 processes the serial coaxial signal to recover the original image data stream output by the second processor. Here, each serializer in the camera module 21 is connected to the deserializer 202 in the image processing system 20 through the coaxial cable 22, so that the serializer transmits the coaxial signal to the deserializer 202 through the coaxial cable 22, and the deserializer 202 may transmit the original image data stream to the first processor 201 of the image processing system 20 through the first processor's Mobile Industry Processor Interface (MIPI).
In the embodiment of the present application, the second processor does not need to compress the Bayer-format image data when converting it into the original image data stream. As a preferred embodiment, the format of the original image data stream may be YUV; the second processor does not need to compress the Bayer-format image when converting it into YUV-format image data.
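Because the stream stays uncompressed, every frame on the coaxial link has a fixed, predictable size. The following back-of-envelope calculation, assuming 8-bit Bayer raw and packed YUV 4:2:2 (common choices, not specified by the patent), illustrates this:

```python
def frame_size_bytes(width, height, fmt):
    """Per-frame payload for common uncompressed camera formats.

    Bayer raw: one 8-bit sample per pixel (the color mosaic is not yet
    interpolated). YUV 4:2:2: 2 bytes per pixel on average (a Y sample
    for every pixel; U and V shared by each horizontal pixel pair).
    """
    bytes_per_pixel = {"bayer8": 1, "yuv422": 2, "rgb888": 3}
    return width * height * bytes_per_pixel[fmt]
```

Unlike JPEG, whose frame size varies with scene content, these fixed sizes let the link bandwidth and the first processor's receive buffers be dimensioned exactly.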
In an optional embodiment of the present application, the camera module includes N image sensors, N is a positive integer, and correspondingly, the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
receiving the coaxial signal sent by the camera module, decoding an original image data stream from the coaxial signal, including:
and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
Specifically, as shown in fig. 2, in the case that the camera module 21 includes N cameras, the image processing system 20 sends trigger signals to the image sensors in the N cameras in a unified manner, and the N image sensors collect and output N images, so that the camera module 21 outputs N coaxial signals and sends the N coaxial signals to the image processing system 20, and the deserializer 202 in the image processing system 20 processes the N coaxial signals to obtain an original image data stream of the N images, and transmits the original image data stream of the N images to the first processor 201 for processing. Here, the time when the N coaxial signals reach the deserializer 202 may not be identical, but the deserializer 202 can process the N coaxial signals to obtain an original image data stream of N images after the N coaxial signals reach the deserializer 202 based on the clock information, and transmit the original image data stream of N images to the first processor 201.
In an alternative embodiment of the present application, format conversion is performed on the decoded original image data stream including at most N images, so as to obtain image data after format conversion;
and synthesizing the image data after format conversion to obtain a three-dimensional image.
Specifically, in fig. 2, after receiving the original image data streams of N images transmitted by the deserializer 202, the first processor 201 performs format conversion on the original image data streams of N images, and performs synthesis processing on the converted image data, so as to obtain a three-dimensional image of the target object acquired by the plurality of image sensors of the camera module 21.
Here, when the first processor 201 performs format conversion on the original image data stream of N images, since the original image data stream is uncompressed image data, the first processor 201 does not need to decode the original image data stream when performing format conversion on the original image data stream. As a preferred implementation manner, the format of the original image data stream in the embodiment of the present application may be a YUV format, and the first processor 201 may convert the original image data stream in the YUV format into an RGB format, and then the first processor 201 performs subsequent algorithm processing on the image data in the RGB format.
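The YUV-to-RGB step above is cheap precisely because it is a per-pixel linear transform with no entropy decoding, unlike JPEG. A minimal sketch, assuming full-range BT.601 coefficients (the patent does not specify which YUV variant the ISP outputs):

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV sample (0..255) to RGB.

    Shown only to illustrate why the conversion needs no decoder: it is
    three multiply-adds per pixel followed by clamping.
    """
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)
```

In practice the first processor would apply this (or a vectorized/hardware equivalent) over whole frames, but the per-sample arithmetic is the same.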
In an alternative embodiment of the present application, after sending a trigger signal to a camera module, the sending time of the trigger signal is recorded as a reference timestamp;
determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
synthesizing the image data after format conversion to obtain a three-dimensional image, wherein the synthesizing comprises the following steps:
m images with the same exposure center point are determined from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N;
and synthesizing the M images to obtain a three-dimensional image.
Specifically, in fig. 2, after sending the trigger signal to the N image sensors in the camera module 21, the image processing system 20 records the trigger-signal sending time as the reference timestamp of the image acquisition performed by the N image sensors, where the reference timestamp represents the exposure start time of each of the N image sensors. Further, for each of the plurality of image sensors, the exposure center point of that image sensor is calculated based on its exposure start time, exposure time, and readout time. In one embodiment, the exposure center point of an image sensor may be calculated by the following formula: exposure center point = exposure start time + (exposure time + readout time) / 2. In this manner, the image processing system 20 can determine the exposure center point of each image captured by each image sensor.
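The formula above translates directly into code. Time units are unspecified in the description; the sketch below assumes all inputs share one unit (e.g. microseconds):

```python
def exposure_center_point(exposure_start, exposure_time, readout_time):
    """Exposure center per the formula in the description:

        center = exposure start time + (exposure time + readout time) / 2

    exposure_start is the shared reference timestamp recorded when the
    trigger signal was sent; readout_time comes from the sensor datasheet.
    """
    return exposure_start + (exposure_time + readout_time) / 2
```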
Here, in the conventional USB-interface approach, the timestamp of a frame acquired by an image sensor is generally taken to be the time at which the frame arrives at the computing platform (i.e., the first processor of the present application). However, a timestamp determined this way includes the readout time after exposure of the frame is completed, the time spent by the second processor transcoding the image, and the transmission time of the image data; therefore, when such timestamps are used to process and synthesize the images acquired by the image sensors, the computing platform must perform error compensation on the timestamps in software.
In the embodiment of the application, the image processing system outputs a single trigger signal and records its sending time as the reference timestamp for the frame acquired by each image sensor; this reference timestamp is the exposure start time of each of the plurality of image sensors, so the trigger times of all the image sensors are identical. In addition, because scene brightness may differ between image sensors, their exposure values (and hence exposure times) may differ; the embodiment of the application therefore calculates, for each image sensor, its exposure center point from its exposure start time, exposure time, and readout time. The exposure center points of the different image sensors can thus be computed accurately, which reduces the software error-compensation workload in the image processing system 20. Here, the readout time of an image sensor may be determined from its datasheet.
In the embodiment of the application, after the exposure center points of the plurality of image sensors in the camera module are determined when the images are acquired, the images with the same exposure center points are regarded as images shot at the same moment, and the three-dimensional images of the target images shot by the plurality of image sensors in the camera module can be obtained by synthesizing the images shot at the same moment.
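The matching step described above, where images with the same exposure center point are treated as captured at the same instant and grouped for synthesis, can be sketched as follows. The `tolerance` parameter is a hypothetical extension for absorbing small timing jitter; the description itself matches centers exactly.

```python
from collections import defaultdict

def group_by_exposure_center(frames, tolerance=0.0):
    """Bucket frames whose exposure center points match.

    frames: list of (sensor_id, exposure_center, image) tuples. Frames in
    the same bucket are treated as captured at the same moment and can be
    handed to three-dimensional synthesis. Only groups holding at least
    two images are returned, matching the condition M >= 2.
    """
    buckets = defaultdict(list)
    for sensor_id, center, image in frames:
        key = round(center / tolerance) if tolerance else center
        buckets[key].append((sensor_id, image))
    return [group for group in buckets.values() if len(group) >= 2]
```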
In the technical scheme of the embodiment of the application, a deserializer and a serializer are arranged between the first processor in the image processing system and the image sensor, and the deserializer and the serializer are connected through a coaxial cable, forming an FPD-Link transmission scheme of the form "serializer - coaxial cable - deserializer". Data transmission over coaxial cable can reach distances of up to 15 m, which makes the system convenient to install and configure on unmanned vehicles, robots, and similar equipment, and makes it easy to expand the number of cameras. When the image acquisition system of the embodiment of the present application is applied to an unmanned vehicle, a robot, or the like, adopting an automotive-grade FPD-Link transmission scheme of the "serializer - coaxial cable - deserializer" form achieves the following effects:
on the one hand, the triggering signals output by the image processing system are transmitted to the camera module through the coaxial cable, so that synchronous triggering of a plurality of image sensors in the camera module can be realized, and time sequence synchronization can be realized when the image sensors in the camera module collect images;
On the other hand, when the second processor in the camera module converts the image data output by the image sensor into an original image data stream, the image data does not need to be compressed. After the original image data stream is sent to the image processing system in the form of a coaxial signal, the first processor in the image processing system can perform the subsequent algorithm processing without decoding the original image data stream during format conversion. The whole image-data transmission process has fast transmission speed, high stability, and low cost, so the first processor can synchronously receive the image data acquired by the plurality of image sensors.
Fig. 3 is a second flowchart of an image processing method according to an embodiment of the present application, as shown in fig. 3, where the method includes the following steps:
Step 301: receiving a trigger signal sent by the image processing system, and performing image acquisition based on the trigger signal.
Specifically, as shown in fig. 2, the camera module 21 in the embodiment of the present application is connected to the image processing system 20. The camera module 21 includes a plurality of cameras, each of which includes an image sensor. After the image processing system uniformly sends a trigger signal to the camera module 21, the image sensors in the camera module 21 can simultaneously receive the trigger signal sent by the image processing system 20 and perform image acquisition. As a preferred embodiment, the image processing system 20 and the camera module 21 are connected by a coaxial cable 22; the image processing system 20 can output a trigger signal through its general-purpose input/output (GPIO) port and transmit it to the camera module 21 via the coaxial cable 22, so that the plurality of image sensors in the camera module 21 synchronously receive the trigger signal sent by the image processing system 20.
Here, since the image processing system 20 can send the trigger signal at a set sending frequency, the image sensors can, based on the trigger signal at the set frequency, perform image acquisition at a frame rate equal to the frequency of the trigger signal, with the timings at which the plurality of image sensors acquire images synchronized. As the frequency of the trigger signal changes, the image acquisition frame rate of the plurality of image sensors changes with it, so the acquisition frame rate can be changed simply by adjusting the frequency of the trigger signal sent by the image processing system 20, meeting the requirements of various application scenarios.
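The frame-rate behaviour described above — each sensor captures one frame per trigger pulse, so the acquisition frame rate tracks the trigger frequency — can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function name and parameters are hypothetical:

```python
def trigger_timestamps(freq_hz, duration_s):
    """Timestamps (in seconds) of trigger pulses sent at a fixed frequency.

    Every image sensor fires one exposure per pulse, so all sensors
    capture at a frame rate equal to freq_hz; changing the trigger
    frequency changes the acquisition frame rate of every sensor at once.
    """
    period = 1.0 / freq_hz
    count = int(duration_s * freq_hz)
    return [i * period for i in range(count)]

pulses = trigger_timestamps(30, 1.0)  # a 30 Hz trigger for one second
frame_rate = len(pulses) / 1.0        # -> 30 frames per second
```

Raising the trigger frequency to, say, 60 Hz would double the frame rate of every sensor simultaneously while keeping their acquisition timings synchronized.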
Step 302: converting the image data output after image acquisition into an original image data stream.
Specifically, as shown in fig. 2, each camera in the camera module 21 includes a second processor in addition to the image sensor. After image acquisition, the image sensor in the camera outputs Bayer-format image data, which is transmitted to the second processor. The second processor is an image signal processor (ISP); it processes the Bayer-format image data output by the image sensor and converts it into an original image data stream.
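As a rough illustration of what this ISP step starts from, the sketch below converts an RGGB Bayer mosaic into RGB pixels by collapsing each 2x2 tile and averaging its two green samples. A real ISP performs far more elaborate demosaicing, denoising and tone processing; this half-resolution sketch, with a hypothetical function name, only shows the basic idea:

```python
def demosaic_rggb_halfres(bayer):
    """Half-resolution demosaic of an RGGB Bayer mosaic.

    Each 2x2 tile [[R, G], [G, B]] becomes one RGB pixel; the two green
    samples are averaged. The output has half the resolution of the
    mosaic in each dimension.
    """
    height, width = len(bayer), len(bayer[0])
    rgb = []
    for y in range(0, height, 2):
        row = []
        for x in range(0, width, 2):
            r = bayer[y][x]
            g = (bayer[y][x + 1] + bayer[y + 1][x]) / 2  # average both greens
            b = bayer[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# one 2x2 RGGB tile -> one RGB pixel
print(demosaic_rggb_halfres([[100, 50], [60, 20]]))  # [[(100, 55.0, 20)]]
```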
Step 303: converting the original image data stream into a coaxial signal, and transmitting the coaxial signal to the image processing system.
Specifically, as shown in fig. 2, the camera module 21 further includes a serializer. After receiving the original image data stream output by the second processor, the serializer can convert it into a coaxial signal and transmit the coaxial signal to the image processing system 20.
According to the above technical scheme, the camera module is connected with the image processing system through a coaxial cable, and the plurality of image sensors in the camera module synchronously receive the trigger signal that the image processing system sends out in a unified manner, so that synchronous triggering of the plurality of image sensors in the camera module is achieved. When the coaxial cable is used for data transmission, the transmission distance can reach up to 15 m, which makes the system easy to install and configure on unmanned vehicles, robots and similar equipment, and allows the number of cameras to be expanded conveniently. The second processor in the camera module does not need to compress the image data output by the image sensor when converting it into an original image data stream, and the original image data stream does not need to be decoded when it is transmitted to the image processing system in the form of a coaxial signal and format-converted on arrival. This reduces the time the image processing system spends on decoding and algorithm processing, improving the performance and real-time capability of online image processing.
The embodiment of the present application further provides an image processing system 20, where the image processing system 20 includes: a first processor 201 and a deserializer 202; wherein,
the first processor 201 is configured to send a trigger signal to the camera module 21, where the trigger signal is used to trigger the image sensor in the camera module 21 to perform image acquisition;
the deserializer 202 is configured to receive the coaxial signal sent by the camera module 21, and decode an original image data stream from the coaxial signal; the original image data stream is obtained by converting image data output after image acquisition of the image sensor by a second processor in the camera module 21; the coaxial signal is obtained by converting the original image data stream output by the second processor by a serializer in the camera module 21.
In an alternative embodiment of the present application, the deserializer 202 is connected to the serializer in the camera module 21 through a coaxial cable 22.
The deserializer 202 is configured to receive the coaxial signal sent by the serializer in the camera module 21 through the coaxial cable 22.
In an alternative embodiment of the present application, the camera module 21 includes N image sensors, N is a positive integer, and accordingly, the trigger signal is used to trigger each of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
The deserializer 202 is specifically configured to: the N coaxial signals sent by the camera module 21 are received, and an original image data stream including at most N images is decoded from the N coaxial signals.
In an alternative embodiment of the present application, the first processor 201 is specifically configured to: performing format conversion on the decoded original image data stream which at most comprises N images to obtain image data after format conversion; and synthesizing the image data after format conversion to obtain a three-dimensional image.
In an alternative embodiment of the present application, the first processor 201 is further configured to: after sending the trigger signal to the camera module 21, recording the sending time of the trigger signal as a reference time stamp; determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors; m images with the same exposure center point are determined from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N; and synthesizing the M images to obtain a three-dimensional image.
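The matching step above — compute each sensor's exposure center point from the reference timestamp and its exposure time, then select the M frames whose center points coincide — could be sketched as follows. This is an illustrative Python sketch with hypothetical names, not the patent's implementation; the readout time appears in the formula as in the fig. 5 embodiment described later:

```python
def exposure_center(start, exposure, readout):
    # exposure center point = exposure start time + (exposure time + readout time) / 2
    return start + (exposure + readout) / 2.0

def group_synchronous_frames(frames, tol=1e-6):
    """Group frames whose exposure center points coincide within `tol` seconds.

    `frames` holds (sensor_id, start, exposure, readout) tuples; frames in
    the same group are treated as shot at the same moment and are the
    candidates for synthesis into a three-dimensional image.
    """
    groups = {}
    for sensor_id, start, exposure, readout in frames:
        key = round(exposure_center(start, exposure, readout) / tol)
        groups.setdefault(key, []).append(sensor_id)
    return [g for g in groups.values() if len(g) >= 2]

frames = [
    ("cam0", 0.000, 0.010, 0.002),  # center point = 0.006 s
    ("cam1", 0.001, 0.008, 0.002),  # center point = 0.006 s -> same moment as cam0
    ("cam2", 0.000, 0.020, 0.002),  # center point = 0.011 s -> unmatched
]
```

Here `group_synchronous_frames(frames)` would pair cam0 with cam1 even though their exposure start times and exposure durations differ, which is exactly why matching on the exposure center point rather than the start time is used.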
In an alternative embodiment of the present application, the first processor 201 is further specifically configured to: and sending a trigger signal to the image sensor in the camera module 21 according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module 21.
The embodiment of the present application further provides a camera module 21, where the camera module 21 includes: an image sensor, a second processor, and a serializer; wherein,
the image sensor is configured to receive a trigger signal sent by the image processing system 20, and perform image acquisition based on the trigger signal;
the second processor is used for converting the image data output after the image sensor performs image acquisition into an original image data stream;
the serializer is configured to convert the original image data stream into a coaxial signal and transmit it to the image processing system 20.
In an alternative embodiment of the present application, the serializer is connected to the deserializer 202 in the image processing system 20 through a coaxial cable 22;
the serializer is specifically used for: the original image data stream is converted to a coaxial signal and the coaxial signal is transmitted over the coaxial cable 22 to a deserializer 202 in the image processing system 20.
In an optional embodiment of the present application, the format of the original image data stream is YUV format.
Those skilled in the art will appreciate that the implementation functions of the image processing system 20 and the units in the camera module 21 shown in fig. 2 can be understood with reference to the foregoing description of the image processing method. The functions of the image processing system 20 and the units in the camera module 21 shown in fig. 2 may be implemented by programs running on a processor, or by specific logic circuits.
The embodiment of the application also provides an image acquisition system, and fig. 2 is a schematic diagram of the structural composition of the image acquisition system provided by the embodiment of the application. As shown in fig. 2, the image acquisition system includes an image processing system 20 and a camera module 21, wherein the image processing system 20 includes a first processor 201 and a deserializer 202, the camera module 21 includes a plurality of cameras, and each of the plurality of cameras includes a serializer, a second processor and an image sensor.
The following describes the image acquisition, transmission and processing procedures of the image acquisition system according to the embodiment of the present application by taking a specific image acquisition system as an example, fig. 4 is a schematic diagram of the structural composition of the image acquisition system according to the embodiment of the present application, and fig. 5 is a software processing flow chart of the image acquisition system based on the embodiment of fig. 4.
As shown in fig. 4, the image acquisition system includes an image processing system 20 connected to a camera module 21 through a coaxial cable 22. The image processing system 20 includes a first processor 201 and a deserializer 202, where the first processor 201 includes an image coprocessor 2011 and a graphics processor 2012 (GPU, Graphics Processing Unit), and the deserializer 202 is connected to a Mobile Industry Processor Interface (MIPI) of the image coprocessor 2011 and to a GPIO interface of the graphics processor 2012. The camera module 21 includes a plurality of cameras; each camera, for example the camera 211, includes a serializer 2110, a second processor 2111, and an image sensor 2112. Specifically, the deserializer 202 in the image processing system 20 is connected to the serializers in the camera module 21 through the coaxial cable 22, forming an FPD-Link transmission scheme in the form of "serializer-coaxial cable-deserializer".
In fig. 4, the graphics processor 2012 in the image processing system 20 outputs, through its GPIO interface, trigger signals for triggering the plurality of image sensors in the camera module 21 to perform image acquisition. Each of the plurality of image sensors starts exposure after receiving the trigger signal output by the image processing system 20 and generates one frame of Bayer-format image data, which is transmitted to the second processor; the second processor tunes the Bayer-format image data and converts it into an original image data stream, and the serializer then converts the original image data stream into an anti-interference coaxial signal. The coaxial signals output by the plurality of cameras in the camera module 21 can thus be transmitted over a coaxial cable 22 of a certain length (such as 15 meters) to the deserializer 202, which processes the received multi-channel coaxial signals based on a clock signal to obtain the original image data stream of the N images output by the camera module 21. This stream is transmitted through the MIPI interface of the image coprocessor 2011 to the first processor 201, which performs format conversion and synthesis processing on it. After sending out a trigger signal, the first processor 201 records the sending time of the trigger signal as the reference timestamp of the frame acquired by the image sensors; this reference timestamp is the exposure start time of each of the plurality of image sensors. For each image sensor, the exposure center point is calculated based on its exposure start time, exposure time, and readout time, so the exposure center points of the different image sensors can all be obtained. Once the exposure center points of the plurality of image sensors in the camera module 21 are determined, images with the same exposure center point are regarded as images shot at the same moment, and synthesizing the images shot at the same moment yields a three-dimensional image of the target captured by the plurality of image sensors in the camera module 21. Here, as a preferred embodiment, after receiving the original image data streams of the N images, the image processing system 20 may further crop and scale them using the image coprocessor 2011 before the first processor 201 performs format conversion and the subsequent synthesis processing.
Fig. 5 is a software processing flowchart of an image processing method according to an embodiment of the present application, based on the image acquisition system shown in fig. 4. As shown in fig. 5, after the image acquisition system is powered on, the graphics processor 2012 first initializes the cameras in the camera module 21 and maps a direct memory access (DMA, Direct Memory Access) area as an image buffer. When the cameras in the camera module 21 are waiting to receive images with the data stream opened, the graphics processor sets the frequency of the trigger signal via duty-cycle control (i.e. the frame rate of the image sensors), sends the trigger signal to the plurality of image sensors in the camera module 21, and records a timestamp for each trigger as the reference timestamp, i.e. the exposure start time of the image sensors. It simultaneously reads the exposure time of each image sensor in the camera module 21 and calculates the exposure center point of the image sensor according to the following formula: exposure center point = exposure start time + (exposure time + readout time) / 2. The calculated exposure center point is then sent to the main thread. Meanwhile, after receiving the original image data stream in YUV422 format transmitted by the deserializer 202, the image coprocessor 2011 of the image processing system 20 converts the YUV422-format image into YUV420 format through the video image converter (VIC, Video Image Converter), copies the YUV420-format image data into the Compute Unified Device Architecture (CUDA) memory of the graphics processor 2012, and finally converts the image into RGB format for the subsequent algorithm and synthesis processing.
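The YUV422-to-YUV420 step in this flow only changes the chroma subsampling: 4:2:2 chroma planes are half-width and full-height, while 4:2:0 planes are half-width and half-height. A minimal sketch of that conversion on planar data follows (illustrative Python with a hypothetical function name; in the system above the VIC does this in hardware):

```python
def yuv422_to_yuv420(y, u, v):
    """Planar YUV422 -> YUV420 by vertically averaging chroma row pairs.

    The Y (luma) plane is unchanged; each pair of adjacent chroma rows is
    averaged into one row, halving the vertical chroma resolution.
    """
    def halve_rows(plane):
        return [[(a + b) / 2 for a, b in zip(r0, r1)]
                for r0, r1 in zip(plane[0::2], plane[1::2])]
    return y, halve_rows(u), halve_rows(v)

y_plane = [[16, 17], [18, 19]]                        # luma, untouched
u_plane, v_plane = [[10, 20], [30, 40]], [[0, 100], [50, 50]]
_, u420, v420 = yuv422_to_yuv420(y_plane, u_plane, v_plane)
# u420 == [[20.0, 30.0]], v420 == [[25.0, 75.0]]
```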
The technical schemes described in the embodiments of the present application may be arbitrarily combined without any collision.
In the several embodiments provided by the present application, it should be understood that the disclosed method and intelligent device may be implemented in other manners. The device embodiments described above are only illustrative; for example, the division of the units is only a logical function division, and there may be other divisions in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted or not performed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, each unit may exist separately, or two or more units may be integrated in one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The method disclosed in the embodiments of the present application may be applied to a processor or implemented by a processor. The processor may be an integrated-circuit chip with signal processing capability. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, a DSP, another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like, and may implement or execute the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium; the processor reads the information in the memory and completes the steps of the method in combination with its hardware.
The embodiment of the application also provides a storage medium, in particular a computer storage medium, more particularly a computer-readable storage medium, on which computer instructions (i.e. a computer program) are stored; when the instructions are executed by a processor, the image processing method provided by one or more of the above embodiments is implemented.
The foregoing is merely a specific embodiment of the present application, and the protection scope of the present application is not limited thereto; any variation or substitution readily conceivable by a person skilled in the art within the technical scope disclosed by the present application shall fall within the protection scope of the present application.

Claims (9)

1. An image processing method, the method comprising:
sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire images; the camera module comprises N image sensors, N is a positive integer, and correspondingly, the trigger signal is used for triggering each image sensor in the N image sensors to acquire images, wherein the N image sensors acquire and output N images;
receiving a coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by converting image data output after image acquisition of the image sensor by a second processor in the camera module; the coaxial signal is obtained by converting an original image data stream output by the second processor through a serializer in the camera module;
The method further comprises the steps of:
performing format conversion on the decoded original image data stream which at most comprises N images to obtain image data after format conversion;
after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference time stamp; determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
m images with the same exposure center point are determined from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N;
and synthesizing the M images.
2. The method of claim 1, wherein receiving the coaxial signal sent by the camera module and decoding an original image data stream from the coaxial signal comprises:
and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
3. The method of claim 1, wherein sending a trigger signal to the camera module comprises:
and sending a trigger signal to the camera module according to the set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
4. A method according to any one of claims 1 to 3, wherein the format of the raw image data stream is YUV format.
5. An image processing system, the image processing system comprising: a first processor and a deserializer; wherein,
the first processor is used for sending a trigger signal to the camera module, and the trigger signal is used for triggering an image sensor in the camera module to acquire images; the camera module comprises N image sensors, N is a positive integer, and correspondingly, the trigger signal is used for triggering each image sensor in the N image sensors to acquire images, wherein the N image sensors acquire and output N images;
the deserializer is used for receiving the coaxial signal sent by the camera module and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by converting image data output after image acquisition of the image sensor by a second processor in the camera module; the coaxial signal is obtained by converting an original image data stream output by the second processor through a serializer in the camera module;
The first processor is further configured to perform format conversion on the decoded original image data stream that includes at most N images, to obtain image data after format conversion; after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference time stamp; determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors; m images with the same exposure center point are determined from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N; and synthesizing the M images.
6. The image processing system of claim 5, wherein the deserializer is connected to a serializer in the camera module via a coaxial cable;
the deserializer is used for receiving the coaxial signal sent by the serializer in the camera module through the coaxial cable.
7. The image processing system according to claim 5 or 6, wherein the first processor is specifically configured to: and sending a trigger signal to the image sensor in the camera module according to the set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
8. An image acquisition system, characterized in that it comprises the image processing system of any one of claims 5 to 7, and at least one camera module; for each camera module, the camera module includes: an image sensor, a second processor, and a serializer; wherein,
the image sensor is used for receiving a trigger signal sent by the image processing system and collecting images based on the trigger signal;
the second processor is used for converting the image data output after the image sensor performs image acquisition into an original image data stream;
the serializer is used for converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
9. The image acquisition system of claim 8, wherein the serializer is connected to a deserializer in the image processing system by a coaxial cable;
the serializer is specifically used for: and converting the original image data stream into a coaxial signal, and transmitting the coaxial signal to a deserializer in the image processing system through the coaxial cable.
CN202010248670.0A 2020-04-01 2020-04-01 Image processing method and system, camera module and image acquisition system Active CN113497883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010248670.0A CN113497883B (en) 2020-04-01 2020-04-01 Image processing method and system, camera module and image acquisition system

Publications (2)

Publication Number Publication Date
CN113497883A CN113497883A (en) 2021-10-12
CN113497883B true CN113497883B (en) 2023-08-22

Family

ID=77993076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010248670.0A Active CN113497883B (en) 2020-04-01 2020-04-01 Image processing method and system, camera module and image acquisition system

Country Status (1)

Country Link
CN (1) CN113497883B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102804789A (en) * 2009-06-23 2012-11-28 Lg电子株式会社 Receiving system and method of providing 3D image
CN108924477A (en) * 2018-06-01 2018-11-30 北京图森未来科技有限公司 A kind of long-distance video processing method and system, video processing equipment




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant