CN113497883A - Image processing method and system, camera module and image acquisition system - Google Patents


Info

Publication number
CN113497883A
CN113497883A (application CN202010248670.0A)
Authority
CN
China
Prior art keywords
image
camera module
image data
data stream
coaxial
Prior art date
Legal status
Granted
Application number
CN202010248670.0A
Other languages
Chinese (zh)
Other versions
CN113497883B (en)
Inventor
Inventor not announced
Current Assignee
Ninebot Beijing Technology Co Ltd
Original Assignee
Ninebot Beijing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Ninebot Beijing Technology Co Ltd
Priority to CN202010248670.0A
Publication of CN113497883A
Application granted
Publication of CN113497883B
Active legal status (current)
Anticipated expiration legal status

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20 Image signal generators
    • H04N 13/204 Image signal generators using stereoscopic image cameras
    • H04N 13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N 13/296 Synchronisation thereof; Control thereof
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N 23/80 Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses an image processing method, an image processing system, a camera module and an image acquisition system. The image processing method includes: sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire an image; and receiving a coaxial signal sent by the camera module and decoding an original image data stream from the coaxial signal, wherein the original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition, and the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor.

Description

Image processing method and system, camera module and image acquisition system
Technical Field
The embodiment of the application relates to the technical field of image processing, in particular to an image processing method and system, a camera module and an image acquisition system.
Background
A conventional multi-view vision system usually generates an image in the Joint Photographic Experts Group (JPEG) format on the camera side and transmits the JPEG image data to a processing unit, and the processing unit then decodes and processes the image.
Disclosure of Invention
In order to solve the above technical problem, embodiments of the present application provide an image processing method and system, a camera module, and an image acquisition system.
The embodiment of the application provides an image processing method, which comprises the following steps:
sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire an image;
receiving a coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal; wherein the original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition, and the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor.
In an optional embodiment of the present application, the camera module includes N image sensors, where N is a positive integer, and correspondingly, the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
receiving the coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal, including:
and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
In an optional embodiment of the present application, the method further comprises:
performing format conversion on the decoded original image data stream comprising at most N images to obtain image data after format conversion;
and synthesizing the image data after format conversion to obtain a three-dimensional image.
In an optional embodiment of the present application, the method further comprises:
after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference timestamp;
determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
synthesizing the image data after format conversion to obtain a three-dimensional image, wherein the synthesizing process comprises the following steps:
determining M images with the same exposure central point from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N;
and synthesizing the M images to obtain a three-dimensional image.
In an optional embodiment of the present application, sending the trigger signal to the camera module includes:
and sending a trigger signal to the camera module according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of an image sensor in the camera module.
The embodiment of the application also provides an image processing method, which comprises the following steps:
receiving a trigger signal sent by an image processing system, and acquiring an image based on the trigger signal;
converting image data output after image acquisition into an original image data stream;
and converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
In an optional embodiment of the present application, a format of the raw image data stream is a YUV format.
An embodiment of the present application further provides an image processing system, including: a first processor and a deserializer; wherein:
the first processor is used for sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire an image;
the deserializer is used for receiving the coaxial signal sent by the camera module and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition; the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor.
In an optional embodiment of the present application, the deserializer is connected to the serializer in the camera module through a coaxial cable;
and the deserializer is used for receiving the coaxial signal sent by the serializer in the camera module through the coaxial cable.
In an optional embodiment of the present application, the camera module includes N image sensors, where N is a positive integer, and correspondingly, the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
the deserializer is specifically configured to: receive N coaxial signals sent by the camera module, and decode an original image data stream comprising at most N images from the N coaxial signals.
In an optional implementation manner of this application, the first processor is specifically configured to: perform format conversion on the decoded original image data stream comprising at most N images to obtain image data after format conversion, and synthesize the image data after format conversion to obtain a three-dimensional image.
In an optional embodiment of the present application, the first processor is further configured to: after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference timestamp; determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
determining M images with the same exposure central point from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N; and synthesizing the M images to obtain a three-dimensional image.
In an optional implementation manner of the present application, the first processor is further specifically configured to: send a trigger signal to an image sensor in the camera module according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
The embodiment of the present application further provides a camera module, where the camera module includes: an image sensor, a second processor, and a serializer; wherein:
the image sensor is used for receiving a trigger signal sent by an image processing system and acquiring an image based on the trigger signal;
the second processor is used for converting image data output after the image sensor performs image acquisition into an original image data stream;
and the serializer is used for converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
In an optional embodiment of the present application, the serializer and the deserializer in the image processing system are connected by a coaxial cable;
the serializer is specifically configured to: convert the original image data stream into a coaxial signal, and transmit the coaxial signal to the deserializer in the image processing system through the coaxial cable.
In an optional embodiment of the present application, a format of the raw image data stream is a YUV format.
The embodiment of the application also provides an image acquisition system, which comprises the image processing system and the camera module.
According to the technical solutions of the embodiments of the present application, a trigger signal is sent to a camera module, where the trigger signal is used for triggering an image sensor in the camera module to acquire an image; a coaxial signal sent by the camera module is received, and an original image data stream is decoded from the coaxial signal. The original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition, and the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor. In this way, the same trigger signal can trigger each of a plurality of image sensors to acquire images at the same time. The image data output by the image sensors after image acquisition is converted into an original image data stream on the camera module side, and the original image data stream is converted into a coaxial signal and transmitted to an image processing system for subsequent processing. After the image processing system decodes the original image data stream from the coaxial signal, it does not need to further decode the received original image data stream, which reduces the time consumed by the image processing system in decoding and algorithm processing of the images sent by the camera module, and improves the performance and real-time performance of online image processing.
Drawings
Fig. 1 is a first schematic flowchart of an image processing method according to an embodiment of the present disclosure;
fig. 2 is a first schematic structural diagram of an image acquisition system according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart illustrating an image processing method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of an image acquisition system according to an embodiment of the present application;
fig. 5 is a software processing flowchart of an image processing method according to an embodiment of the present application.
Detailed Description
So that the manner in which the features and aspects of the present application can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings.
The multi-view vision system is widely applied to equipment such as robots, unmanned vehicles and unmanned aerial vehicles, and is used for realizing functions such as positioning and navigation, precision industrial measurement, object recognition, virtual reality, scene reconstruction and surveying. Most existing multi-view vision systems adopt a Universal Serial Bus (USB) interface, and in order to send trigger signals and transmit data between a processing unit and a plurality of cameras in the multi-view vision system, separate cables need to be routed between the processing unit and the plurality of cameras, which results in high cost, short transmission distance, inconvenient wiring and the like. In addition, under complicated conditions, the reliability of this data transmission mode with separately routed USB cables is poor, so that it is difficult to achieve truly accurate synchronization of data acquisition and transmission, and problems such as frame loss and difficulty in flexibly changing the frame rate of the trigger signal according to the application scene also occur. Furthermore, after a camera of the multi-view vision system collects an image, in order to increase the speed of transmitting image data to the processing unit, the image data generated by the image sensor needs to be converted into the JPEG format on the camera side; the JPEG image is then transmitted to the processing unit through the USB connection between the camera and the processing unit, and the processing unit converts the received JPEG image into the RGB format before performing the subsequent image synthesis. The decoding and processing in this whole process take a long time, which reduces the online processing performance of the multi-view vision system.
In order to solve the above problem, an embodiment of the present application provides an image processing method, and fig. 1 is a first flowchart of the image processing method provided in the embodiment of the present application, as shown in fig. 1, the method includes the following steps:
step 101: and sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire an image.
Specifically, as shown in fig. 2, the image processing system 20 in the embodiment of the present application is connected to a camera module 21, where the camera module 21 includes a plurality of cameras, each of the cameras includes an image sensor, and the image processing system 20 uniformly sends a trigger signal to one or more image sensors in the camera module 21, so as to trigger the one or more image sensors in the camera module 21 to perform image acquisition.
In an optional embodiment of the present application, sending a trigger signal to a camera module includes:
and sending a trigger signal to the camera module according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of an image sensor in the camera module.
Specifically, in this embodiment, the trigger signal is also referred to as a frame synchronization signal of the image sensor. The sending frequency of the trigger signal is preset in the image processing system 20, so that the trigger signal can be sent to one or more image sensors in the camera module 21 according to the set sending frequency; after the image processing system 20 sends the trigger signal at the set sending frequency, the image sensors perform image acquisition, based on the received trigger signal, at a frame rate equal to the frequency of the trigger signal.
As a preferred embodiment, as shown in fig. 2, the image processing system 20 and the camera module 21 are connected by a coaxial cable 22, and the image processing system 20 can output a trigger signal through its General Purpose Input/Output (GPIO) port and transmit the trigger signal to the camera module 21 via the coaxial cable 22, so as to trigger one or more image sensors in the camera module 21 to capture an image.
In the embodiment of the present application, the sending frequency of the trigger signal can be flexibly set in the software program of the image processing system according to the actual application scenario, so as to ensure that the frame rate of the image sensor in the camera module 21 for image acquisition flexibly changes along with the frequency of the trigger signal, thereby adapting to the requirements of various scenarios.
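For illustration only, the relationship between the trigger sending frequency and the acquisition frame rate described above can be pictured as a periodic pulse generator; the sketch below is not part of the patent disclosure, and the gpio_write callback, pulse width and frame count are hypothetical placeholders rather than the API of any particular platform.

```python
import time

def send_trigger_pulses(gpio_write, frame_rate_hz=30.0, pulse_width_s=0.001, num_frames=10):
    """Emit one trigger pulse per frame period, so the pulse frequency equals the
    desired acquisition frame rate of the image sensors.
    gpio_write(level) stands in for the platform's real GPIO driver call."""
    period_s = 1.0 / frame_rate_hz
    reference_timestamps = []
    for _ in range(num_frames):
        t_start = time.monotonic()
        reference_timestamps.append(t_start)   # reference timestamp of this trigger
        gpio_write(1)                          # rising edge starts exposure on all sensors
        time.sleep(pulse_width_s)
        gpio_write(0)
        # sleep the remainder of the frame period so pulses repeat at frame_rate_hz
        time.sleep(max(0.0, period_s - (time.monotonic() - t_start)))
    return reference_timestamps

# Example usage with a dummy GPIO writer:
if __name__ == "__main__":
    send_trigger_pulses(lambda level: None, frame_rate_hz=30.0, num_frames=3)
```

Changing frame_rate_hz in such a sketch corresponds to the flexible adjustment of the trigger-signal frequency described above.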
According to the technical scheme of the embodiment of the application, the trigger signal output by the image processing system 20 can be transmitted to the camera module 21 through the coaxial cable 22 to trigger the exposure and image acquisition of the plurality of image sensors in the camera module 21, the synchronous triggering of the plurality of image sensors in the camera module 21 can be realized in the whole process, the time sequence synchronization can be realized when the plurality of image sensors in the camera module 21 acquire images, and the reliability is high.
Step 102: and receiving the coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal.
In the embodiment of the application, the original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition; the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor.
Specifically, as shown in fig. 2, each camera in the camera module 21 includes a serializer and a second processor in addition to the image sensor. The image sensor in the camera outputs image data in a Bayer format after image acquisition and transmits the Bayer-format image to the second processor, where the second processor is an Image Signal Processor (ISP). The second processor processes the Bayer-format image data output by the image sensor, converts it into a raw image data stream, and transmits the raw image data stream to the serializer; the serializer converts the raw image data stream into an anti-interference coaxial signal and transmits it to the image processing system 20. In fig. 2, the image processing system 20 includes a deserializer 202; after the serializer transmits the coaxial signal to the image processing system 20, the deserializer 202 in the image processing system 20 processes the coaxial signal in serial form to obtain the original image data stream output by the second processor. Here, each serializer in the camera module 21 is connected to the deserializer 202 in the image processing system 20 through the coaxial cable 22, so that the serializer transmits the coaxial signal to the deserializer 202 of the image processing system 20 through the coaxial cable 22, and the deserializer 202 may transmit the raw image data stream to the first processor 201 of the image processing system 20 through a Mobile Industry Processor Interface (MIPI) of the first processor 201.
It should be noted that, in the embodiment of the present application, the process in which the second processor converts the Bayer-format image output by the image sensor into the raw image data stream does not need to compress the Bayer-format image data. As a preferred embodiment, the format of the raw image data stream may be the YUV format, and the second processor does not need to compress the Bayer-format image when converting the Bayer-format image data into YUV-format image data.
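Purely as an illustrative sketch of the kind of uncompressed conversion described above (a real ISP pipeline is considerably more elaborate), the Bayer-to-YUV step can be approximated by a naive RGGB demosaic followed by a BT.601 RGB-to-YUV transform; the RGGB layout and the coefficients below are assumptions, not details taken from the patent.

```python
import numpy as np

def bayer_rggb_to_yuv(bayer):
    """Naive sketch: demosaic an RGGB Bayer frame by 2x2 block averaging,
    then convert to YUV with BT.601 coefficients. No compression is involved."""
    bayer = bayer.astype(np.float32)
    r = bayer[0::2, 0::2]
    g = (bayer[0::2, 1::2] + bayer[1::2, 0::2]) / 2.0
    b = bayer[1::2, 1::2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128.0
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128.0
    return np.clip(np.stack([y, u, v], axis=-1), 0, 255).astype(np.uint8)

frame = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)  # fake Bayer tile
yuv = bayer_rggb_to_yuv(frame)
print(yuv.shape)  # (4, 4, 3): one YUV pixel per 2x2 Bayer block in this sketch
```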
In an optional embodiment of the present application, the camera module includes N image sensors, where N is a positive integer, and correspondingly, the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
receiving the coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal, including:
and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
Specifically, as shown in fig. 2, when the camera module 21 includes N cameras, the image processing system 20 uniformly sends a trigger signal to the image sensors in the N cameras, and the N image sensors collect and output N images, so that the camera module 21 outputs N coaxial signals to the image processing system 20. The deserializer 202 in the image processing system 20 processes the N coaxial signals to obtain the original image data streams of the N images and transmits them to the first processor 201 for processing. Here, the times at which the N coaxial signals reach the deserializer 202 may not be exactly the same; based on the clock information, the deserializer 202 processes the N coaxial signals after all of them have arrived, obtains the original image data streams of the N images, and transmits them to the first processor 201.
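The behaviour of forwarding the data only after all N coaxial signals have arrived can be modelled, again only as a sketch, by a small collector keyed by trigger index; in the actual system this aggregation is performed by the deserializer hardware, and the frame objects and camera count below are invented for illustration.

```python
from collections import defaultdict

class FrameAggregator:
    """Collect per-camera frames for each trigger and release a complete set only
    once all N cameras have delivered, mimicking the deserializer's behaviour of
    forwarding the N image data streams after all N coaxial signals have arrived."""
    def __init__(self, num_cameras):
        self.num_cameras = num_cameras
        self.pending = defaultdict(dict)   # trigger_id -> {camera_id: frame}

    def push(self, trigger_id, camera_id, frame):
        self.pending[trigger_id][camera_id] = frame
        if len(self.pending[trigger_id]) == self.num_cameras:
            return self.pending.pop(trigger_id)   # complete set, ready for the first processor
        return None                               # still waiting for the other cameras

agg = FrameAggregator(num_cameras=3)
print(agg.push(0, 0, "frame-a"))   # None: only 1 of 3 arrived
print(agg.push(0, 1, "frame-b"))   # None
print(agg.push(0, 2, "frame-c"))   # full set for trigger 0
```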
In an optional embodiment of the present application, format conversion is performed on a decoded original image data stream including at most N images to obtain image data after format conversion;
and synthesizing the image data after format conversion to obtain a three-dimensional image.
Specifically, in fig. 2, after receiving the original image data streams of N images transmitted by the deserializer 202, the first processor 201 performs format conversion on the original image data streams of the N images, and performs synthesis processing on the converted image data to obtain a three-dimensional image of the target object acquired by the plurality of image sensors of the camera module 21.
Here, when the first processor 201 performs format conversion on the original image data stream of N images, since the original image data stream is uncompressed image data, the first processor 201 does not need to decode the original image data stream when performing format conversion on the original image data stream. As a preferred implementation manner, the format of the raw image data stream in the embodiment of the present application may be a YUV format, and the first processor 201 may convert the raw image data stream with the YUV format into an RGB format, and then the first processor 201 performs subsequent algorithm processing on the image data in the RGB format.
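As a minimal sketch of this format conversion step (the patent does not prescribe any particular library or color matrix), an uncompressed YUV frame can be mapped directly to RGB with the inverse BT.601 transform, with no decode stage in between; the packed YUV444 layout assumed below is chosen only for brevity.

```python
import numpy as np

def yuv_to_rgb(yuv):
    """Convert a packed YUV444 frame (uint8, HxWx3) to RGB using BT.601 coefficients.
    Because the stream is uncompressed, no JPEG decode is needed beforehand."""
    yuv = yuv.astype(np.float32)
    y, u, v = yuv[..., 0], yuv[..., 1] - 128.0, yuv[..., 2] - 128.0
    r = y + 1.402 * v
    g = y - 0.344 * u - 0.714 * v
    b = y + 1.772 * u
    return np.clip(np.stack([r, g, b], axis=-1), 0, 255).astype(np.uint8)

rgb = yuv_to_rgb(np.full((4, 4, 3), 128, dtype=np.uint8))
print(rgb[0, 0])  # a mid-grey pixel maps to [128 128 128]
```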
In an optional embodiment of the present application, after sending a trigger signal to a camera module, recording an emission time of the trigger signal as a reference timestamp;
determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
synthesizing the image data after format conversion to obtain a three-dimensional image, wherein the synthesizing process comprises the following steps:
determining M images with the same exposure central point from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N;
and synthesizing the M images to obtain a three-dimensional image.
Specifically, in fig. 2, after sending a trigger signal to the N image sensors in the camera module 21, the image processing system 20 records the trigger signal sending time as a reference timestamp for image acquisition of the N image sensors, where the reference timestamp represents the exposure start time of each of the N image sensors. Further, for each of the plurality of image sensors, the exposure center point of the image sensor is calculated based on the exposure start time, the exposure time, and the readout time of the image sensor. In one embodiment, the exposure center point of the image sensor may be calculated by the following formula: exposure center point = exposure start time + (exposure time + readout time)/2. In this manner, the image processing system 20 can determine the exposure center point of the image captured by each image sensor.
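The formula can be made concrete with a small numeric sketch; the timing values below are invented for illustration, and in practice the readout time would be taken from the sensor's data sheet.

```python
def exposure_center(exposure_start_s, exposure_time_s, readout_time_s):
    """Exposure center point = exposure start time + (exposure time + readout time) / 2."""
    return exposure_start_s + (exposure_time_s + readout_time_s) / 2.0

# Two sensors triggered by the same pulse share the exposure start time (the
# reference timestamp) but may have different exposure times, so each gets its
# own exposure center point:
t_ref = 100.000   # reference timestamp recorded when the trigger was sent, in seconds
print(exposure_center(t_ref, 0.010, 0.020))  # -> 100.015
print(exposure_center(t_ref, 0.004, 0.020))  # -> 100.012
```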
Here, when the conventional USB interface method is used to determine the timestamp of a frame captured by the image sensor, the time at which the frame is transmitted to the computing platform (i.e., the first processor in the present application) is generally taken as the timestamp of that frame. However, a timestamp determined in this way includes the readout time after the exposure of the frame is completed, the time taken by the second processor to transcode the image, and the transmission time of the image data. Therefore, when images captured by the image sensors are processed and synthesized based on such timestamps, the computing platform needs to perform error compensation on the timestamps through software.
In the embodiment of the present application, the image processing system uniformly outputs the trigger signal, and the sending time of the trigger signal is recorded as the reference timestamp for the image sensors to acquire one frame of image, where the reference timestamp is the exposure start time of each of the plurality of image sensors; therefore, the trigger times of the plurality of image sensors are the same. In addition, because different image sensors may have different exposure values due to different luminances, in the embodiment of the present application, for each of the plurality of image sensors, the exposure center point of the image sensor is calculated based on the exposure start time, the exposure time and the readout time of that image sensor, so that the exposure center points of the different image sensors can be calculated accurately, and the workload of performing error compensation on the exposure center points through software in the image processing system 20 is reduced. Here, the readout time of an image sensor may be obtained from the data sheet (manual) of the image sensor.
In the embodiment of the present application, after the exposure center points of the plurality of image sensors in the camera module during image acquisition are determined, images with the same exposure center point are regarded as images captured at the same moment, and a three-dimensional image of the target captured by the plurality of image sensors in the camera module can be obtained by synthesizing the images captured at the same moment.
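A minimal sketch of this grouping step, assuming each decoded frame already carries its computed exposure center point; the tolerance-based equality test and the synthesize placeholder are assumptions, since the patent does not specify how equality of exposure center points is judged or which multi-view synthesis algorithm is used.

```python
from collections import defaultdict

def group_by_exposure_center(frames, tolerance_s=0.0005):
    """frames: list of (exposure_center_s, image) pairs.
    Frames whose exposure centers fall into the same tolerance bucket are treated
    as having been captured at the same moment."""
    buckets = defaultdict(list)
    for center, image in frames:
        buckets[round(center / tolerance_s)].append(image)
    return [imgs for imgs in buckets.values() if len(imgs) >= 2]  # M >= 2 views needed

def synthesize(images):
    # Placeholder for the actual multi-view synthesis (e.g. stereo matching).
    return {"num_views": len(images)}

frames = [(100.0150, "cam0"), (100.0151, "cam1"), (100.0480, "cam2")]
for group in group_by_exposure_center(frames):
    print(synthesize(group))   # only cam0 and cam1 share an exposure center here
```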
In the technical solution of the embodiment of the present application, a deserializer and a serializer are arranged between the first processor in the image processing system and the image sensor, and the deserializer and the serializer are connected through a coaxial cable, forming an FPD-Link transmission scheme in the form of serializer-coaxial cable-deserializer. When the coaxial cable is used for data transmission, the transmission distance can reach up to 15 m, which facilitates installation and configuration of the device on unmanned vehicles, robots and the like, and makes it convenient to expand the number of cameras. Here, when the image acquisition system of the embodiment of the present application is applied to an unmanned vehicle, a robot, or the like, using the automotive-grade FPD-Link transmission scheme in the form of serializer-coaxial cable-deserializer can achieve the following effects:
on one hand, the trigger signal output by the image processing system is transmitted to the camera module through the coaxial cable, so that synchronous triggering of a plurality of image sensors in the camera module can be realized, and timing synchronization can be realized when the image sensors in the camera module acquire images;
on the other hand, when the second processor in the camera module processes the image data output by the image sensor and converts it into the original image data stream, the image data output by the image sensor does not need to be compressed. After the original image data stream is sent to the image processing system in the form of a coaxial signal, the first processor in the image processing system can perform format conversion on the original image data stream and proceed to the subsequent algorithm processing without decoding it. The whole image data transmission process therefore has a high transmission speed, high stability and low cost; the first processor can synchronously receive the image data acquired by the plurality of image sensors; and the time consumed by the image processing system in decoding and algorithm processing of the images is reduced, which improves the performance and real-time performance of online image processing.
Fig. 3 is a schematic flowchart of a second image processing method according to an embodiment of the present application, and as shown in fig. 3, the method includes the following steps:
step 301: and receiving a trigger signal sent by an image processing system, and acquiring an image based on the trigger signal.
Specifically, as shown in fig. 2, the camera module 21 in the embodiment of the present application is connected to the image processing system 20, and the camera module 21 includes a plurality of cameras, each of which includes an image sensor. After the image processing system 20 uniformly sends a trigger signal to the camera module 21, the image sensors in the camera module 21 can simultaneously receive the trigger signal sent by the image processing system 20 and perform image acquisition. In a preferred embodiment, the image processing system 20 and the camera module 21 are connected by a coaxial cable 22, and the image processing system 20 can output a trigger signal through its GPIO port and transmit the trigger signal to the camera module 21 via the coaxial cable 22, so that the plurality of image sensors in the camera module 21 can synchronously receive the trigger signal sent by the image processing system 20.
Here, since the image processing system 20 can transmit the trigger signal at the set transmission frequency, the image sensor can also perform image capturing at the same frame rate as the frequency of the transmitted trigger signal based on the trigger signal at the set frequency, and the timings when the plurality of image sensors perform image capturing are synchronized. With the change of the frequency of the trigger signal, the image capturing frame rates of the plurality of image sensors can also flexibly change along with the frequency of the trigger signal, so that the image capturing frame rates of the image sensors can be changed by adjusting the frequency of the trigger signal sent by the image processing system 20, and the requirements of various application scenes can be met.
Step 302: and converting the image data output after image acquisition into an original image data stream.
Specifically, as shown in fig. 2, each camera in the camera module 21 includes a second processor in addition to the image sensor. The image sensor in the camera outputs Bayer-format image data after image acquisition and transmits the Bayer-format image to the second processor, where the second processor is an ISP; the second processor processes the Bayer-format image data output by the image sensor and converts it into a raw image data stream.
Step 303: and converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
Specifically, as shown in fig. 2, the camera module 21 further includes a serializer, and the serializer, after receiving the raw image data stream output by the second processor, can convert the raw image data stream into a coaxial signal and transmit the coaxial signal to the image processing system 20.
According to the above technical solution, the camera module is connected with the image processing system through the coaxial cable, and the plurality of image sensors in the camera module synchronously receive the trigger signal output by the image processing system, so that synchronous triggering of the plurality of image sensors in the camera module is achieved. When the coaxial cable is used for data transmission, the transmission distance can reach up to 15 m, which facilitates installation and configuration of the device on unmanned vehicles, robots and the like, and makes it convenient to expand the number of cameras. When the second processor in the camera module processes the image data output by the image sensor and converts it into the original image data stream, the image data output by the image sensor does not need to be compressed; the original image data stream is sent to the image processing system in the form of a coaxial signal, and the image processing system does not need to decode the received original image data stream when converting its format, so that the time consumed by the image processing system in decoding and algorithm processing of the image can be reduced, and the performance and real-time performance of online image processing are improved.
An embodiment of the present application further provides an image processing system 20, where the image processing system 20 includes: a first processor 201 and a deserializer 202; wherein:
the first processor 201 is configured to send a trigger signal to the camera module 21, where the trigger signal is used to trigger an image sensor in the camera module 21 to perform image acquisition;
the deserializer 202 is configured to receive the coaxial signal sent by the camera module 21, and decode an original image data stream from the coaxial signal; the original image data stream is obtained by a second processor in the camera module 21 converting the image data output by the image sensor after image acquisition; the coaxial signal is obtained by a serializer in the camera module 21 converting the original image data stream output by the second processor.
In an optional embodiment of the present application, the deserializer 202 and the serializer in the camera module 21 are connected by a coaxial cable 22.
The deserializer 202 is configured to receive the coaxial signal sent by the serializer in the camera module 21 through the coaxial cable 22.
In an optional embodiment of the present application, the camera module 21 includes N image sensors, where N is a positive integer, and correspondingly, the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, where the N image sensors acquire and output N images;
the deserializer 202 is specifically configured to: receiving the N coaxial signals sent by the camera module 21, and decoding an original image data stream including at most N images from the N coaxial signals.
In an optional implementation manner of this application, the first processor 201 is specifically configured to: perform format conversion on the decoded original image data stream comprising at most N images to obtain image data after format conversion, and synthesize the image data after format conversion to obtain a three-dimensional image.
In an optional implementation manner of this application, the first processor 201 is further configured to: after sending a trigger signal to the camera module 21, recording the sending time of the trigger signal as a reference timestamp; determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors; determining M images with the same exposure central point from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N; and synthesizing the M images to obtain a three-dimensional image.
In an optional implementation manner of this application, the first processor 201 is further specifically configured to: and sending a trigger signal to an image sensor in the camera module 21 according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module 21.
The embodiment of the present application further provides a camera module 21, where the camera module 21 includes: an image sensor, a second processor, and a serializer; wherein:
the image sensor is configured to receive a trigger signal sent by the image processing system 20, and perform image acquisition based on the trigger signal;
the second processor is used for converting image data output after the image sensor performs image acquisition into an original image data stream;
the serializer is configured to convert the raw image data stream into a coaxial signal and transmit the coaxial signal to the image processing system 20.
In an alternative embodiment of the present application, the serializer and the deserializer 202 in the image processing system 20 are connected by a coaxial cable 22;
the serializer is specifically configured to: convert the raw image data stream into a coaxial signal and transmit the coaxial signal through the coaxial cable 22 to the deserializer 202 in the image processing system 20.
In an optional embodiment of the present application, a format of the raw image data stream is a YUV format.
Those skilled in the art will understand that the functions of the image processing system 20 and the camera module 21 shown in fig. 2 can be understood by referring to the related description of the image processing method. The functions of the units in the image processing system 20 and the camera module 21 shown in fig. 2 can be realized by a program running on a processor, and can also be realized by a specific logic circuit.
An embodiment of the present application further provides an image acquisition system, and fig. 2 is a first schematic structural composition diagram of the image acquisition system provided in the embodiment of the present application. As shown in fig. 2, the image acquisition system includes an image processing system 20 and a camera module 21, wherein the image processing system 20 includes a first processor 201 and a deserializer 202, the camera module 21 includes a plurality of cameras, and each of the plurality of cameras includes a serializer, a second processor, and an image sensor.
Next, a specific image acquisition system is taken as an example to describe the image acquisition, transmission and processing processes of the image acquisition system in the embodiment of the present application, fig. 4 is a schematic structural diagram of the image acquisition system provided in the embodiment of the present application, and fig. 5 is a software processing flow chart of the image acquisition system based on fig. 4.
As shown in fig. 4, the image acquisition system includes an image processing system 20, and the image processing system 20 is connected to the camera module 21 through a coaxial cable 22. The image processing system 20 includes a first processor 201 and a deserializer 202, wherein the first processor 201 includes an image coprocessor 2011 and a graphics processor (GPU) 2012, and the deserializer 202 is connected to the MIPI interface of the image coprocessor 2011 and to the GPIO interface of the graphics processor 2012. In addition, the image acquisition system further includes a camera module 21, wherein the camera module 21 includes a plurality of cameras, for example, the camera 211, and the camera 211 includes a serializer 2110, a second processor 2111, and an image sensor 2112. The image processing system 20 and the camera module 21 are connected by the coaxial cable 22; specifically, the deserializer 202 in the image processing system 20 and the serializers in the camera module 21 are connected through the coaxial cable 22, so as to form an FPD-Link transmission scheme in the form of serializer-coaxial cable-deserializer.
In fig. 4, the graphics processor 2012 in the image processing system 20 is configured to output, through the GPIO interface, a trigger signal for triggering the image sensors in the camera module 21 to perform image acquisition. Each of the plurality of image sensors starts exposure and generates one frame of Bayer-format image after receiving the trigger signal output by the image processing system 20; the Bayer-format image data is then transmitted to the second processor, the second processor processes the Bayer-format image data and converts it into a raw image data stream, and the serializer further converts the raw image data stream into an anti-interference coaxial signal. In this way, the coaxial signals output by the plurality of cameras in the camera module 21 can be transmitted to the deserializer 202 through a coaxial cable 22 of a certain length (e.g., 15 meters). The deserializer 202 processes the received coaxial signals output by the plurality of cameras based on a clock signal to obtain the raw image data streams of the N images output by the camera module 21, and sends them to the first processor 201 through the MIPI interface of the image coprocessor 2011, and the first processor 201 performs format conversion and synthesis processing on the raw image data streams of the N images. After the first processor 201 sends the trigger signal, it records the sending time of the trigger signal as the reference timestamp for the image sensors to acquire one frame of image, where the reference timestamp is the exposure start time of each of the plurality of image sensors. For each of the image sensors, the exposure center point of the image sensor is calculated based on the exposure start time, the exposure time and the readout time of that image sensor, so that the exposure center points of the different image sensors are finally obtained. After the exposure center points of the image sensors in the camera module 21 during image acquisition are determined, images with the same exposure center point are regarded as images captured at the same moment, and the images captured at the same moment are synthesized to obtain a three-dimensional image of the target captured by the plurality of image sensors in the camera module 21. Here, as a preferred embodiment, after receiving the raw image data streams of the N images, the image processing system 20 may first perform processing such as cropping and scaling on them by using the image coprocessor 2011, and the first processor 201 then performs format conversion and the subsequent synthesis processing on the cropped and scaled raw image data.
Fig. 5 is a software processing flowchart of an image processing method according to an embodiment of the present application. As shown in fig. 5, based on the image acquisition system shown in fig. 4, when the image acquisition system is powered on to capture and process images, the graphics processor 2012 initializes the cameras in the camera module 21, maps a Direct Memory Access (DMA) area as an image buffer, and, after the cameras in the camera module 21 have been set to the data-stream-on state to wait for receiving images, sets the duty ratio of the trigger signal to control its frequency (i.e., the frame rate of the image sensors). It then sends the trigger signal to the plurality of image sensors in the camera module 21 and obtains the reference timestamp: each time the trigger fires, a timestamp is recorded as the exposure start time of the image sensors (i.e., the obtained reference timestamp), the exposure time of each image sensor in the camera module 21 is read, and the exposure center point of the image sensor is calculated by the following formula: exposure center point = exposure start time + (exposure time + readout time)/2; the calculated exposure center point is finally sent to the main thread. Meanwhile, after receiving the raw image data stream in YUV422 format transmitted by the deserializer 202, the image coprocessor 2011 of the image processing system 20 converts the YUV422 image into the YUV420 format through the Video Image Converter (VIC), copies the YUV420 image data into the CUDA (Compute Unified Device Architecture) memory of the graphics processor 2012, and finally converts the image into the RGB format, so as to perform the subsequent algorithm and synthesis processing.
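The flow of fig. 5 can be summarized, again only as an illustrative sketch, by the following top-level loop; every name here (MockSystem, init_cameras, wait_for_frames and so on) is a hypothetical placeholder rather than an API of the patent or of any particular SDK.

```python
import time

class MockSystem:
    """Stand-in for the real platform so the flow below can actually run;
    every method is a trivial placeholder, not a real driver call."""
    def init_cameras(self): pass
    def map_dma_buffer(self): return bytearray(1024)           # pretend DMA image buffer
    def set_trigger_frequency(self, hz): self.period_s = 1.0 / hz
    def send_trigger(self): return time.monotonic()            # reference timestamp
    def read_exposure_and_readout_times(self): return [(0.010, 0.020), (0.012, 0.020)]
    def wait_for_frames(self, buf): return ["yuv_cam0", "yuv_cam1"]
    def yuv_to_rgb(self, frame): return frame.replace("yuv", "rgb")
    def synthesize(self, frames, centers): print(frames, [round(c, 3) for c in centers])

def run_capture_loop(system, num_frames=2, frame_rate_hz=30.0):
    """High-level mirror of the fig. 5 flow: initialize, trigger, timestamp,
    compute exposure centers, receive YUV frames, convert formats, synthesize."""
    system.init_cameras()
    buffer = system.map_dma_buffer()                 # DMA area mapped as image buffer
    system.set_trigger_frequency(frame_rate_hz)      # trigger frequency == sensor frame rate
    for _ in range(num_frames):
        t_ref = system.send_trigger()                # exposure start time of all sensors
        centers = [t_ref + (exp + rd) / 2.0          # exposure center point per sensor
                   for exp, rd in system.read_exposure_and_readout_times()]
        yuv_frames = system.wait_for_frames(buffer)  # uncompressed stream from the deserializer
        rgb_frames = [system.yuv_to_rgb(f) for f in yuv_frames]
        system.synthesize(rgb_frames, centers)       # group by exposure center, then fuse

run_capture_loop(MockSystem())
```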
The technical solutions described in the embodiments of the present application can be arbitrarily combined without conflict.
In the several embodiments provided in the present application, it should be understood that the disclosed method and intelligent device may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may separately serve as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in the form of hardware, or in the form of hardware plus a software functional unit.
The method disclosed by the embodiment of the invention can be applied to a processor or realized by the processor. The processor may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in a processor or instructions in the form of software. The processor described above may be a general purpose processor, a DSP, or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like. The processor may implement or perform the methods, steps, and logic blocks disclosed in embodiments of the present invention. A general purpose processor may be a microprocessor or any conventional processor or the like. The steps of the method disclosed by the embodiment of the invention can be directly implemented by a hardware decoding processor, or can be implemented by combining hardware and software modules in the decoding processor. The software modules may be located in a storage medium having a memory and a processor reading the information in the memory and combining the hardware to perform the steps of the method.
The embodiment of the invention also provides a storage medium, in particular a computer storage medium, and more particularly a computer readable storage medium. Stored thereon are computer instructions, i.e. computer programs, which when executed by a processor perform the methods provided by one or more of the above-mentioned aspects.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application.

Claims (13)

1. An image processing method, characterized in that the method comprises:
sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire an image;
receiving a coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal; wherein the original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition, and the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor.
2. The method according to claim 1, wherein the camera module comprises N image sensors, N being a positive integer, and accordingly the trigger signal is used to trigger each image sensor of the N image sensors to perform image acquisition, wherein the N image sensors acquire and output N images;
receiving the coaxial signal sent by the camera module, and decoding an original image data stream from the coaxial signal, including:
and receiving N coaxial signals sent by the camera module, and decoding an original image data stream comprising at most N images from the N coaxial signals.
3. The method of claim 2, further comprising:
performing format conversion on the decoded original image data stream comprising at most N images to obtain image data after format conversion;
and synthesizing the image data after format conversion to obtain a three-dimensional image.
4. The method of claim 3, further comprising:
after sending a trigger signal to a camera module, recording the sending time of the trigger signal as a reference timestamp;
determining exposure center points of the N image sensors based on the reference time stamps and the exposure times of the N image sensors;
synthesizing the image data after format conversion to obtain a three-dimensional image, wherein the synthesizing process comprises the following steps:
determining M images with the same exposure central point from the image data after format conversion, wherein M is an integer greater than or equal to 2 and less than or equal to N;
and synthesizing the M images to obtain a three-dimensional image.
5. The method of claim 1, wherein sending a trigger signal to a camera module comprises:
and sending a trigger signal to the camera module according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of an image sensor in the camera module.
6. An image processing method, characterized in that the method comprises:
receiving a trigger signal sent by an image processing system, and acquiring an image based on the trigger signal;
converting image data output after image acquisition into an original image data stream;
and converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
7. The method of claim 6, wherein the format of the raw image data stream is YUV format.
8. An image processing system, characterized in that the image processing system comprises: a first processor and a deserializer; wherein:
the first processor is used for sending a trigger signal to a camera module, wherein the trigger signal is used for triggering an image sensor in the camera module to acquire an image;
the deserializer is used for receiving the coaxial signal sent by the camera module and decoding an original image data stream from the coaxial signal; the original image data stream is obtained by a second processor in the camera module converting the image data output by the image sensor after image acquisition; the coaxial signal is obtained by a serializer in the camera module converting the original image data stream output by the second processor.
9. The image processing system of claim 8, wherein the deserializer and the serializer in the camera module are connected by a coaxial cable;
and the deserializer is used for receiving the coaxial signal sent by the serializer in the camera module through the coaxial cable.
10. The image processing system of claim 8 or 9, wherein the first processor is specifically configured to: send a trigger signal to an image sensor in the camera module according to a set sending frequency, wherein the sending frequency is the same as the acquisition frame rate of the image sensor in the camera module.
11. A camera module, comprising: an image sensor, a second processor, and a serializer; wherein:
the image sensor is used for receiving a trigger signal sent by an image processing system and acquiring an image based on the trigger signal;
the second processor is used for converting image data output after the image sensor performs image acquisition into an original image data stream;
and the serializer is used for converting the original image data stream into a coaxial signal and transmitting the coaxial signal to the image processing system.
12. The camera module of claim 11, wherein the serializer and the deserializer in the image processing system are connected by a coaxial cable;
the serializer is specifically configured to: convert the original image data stream into a coaxial signal, and transmit the coaxial signal to the deserializer in the image processing system through the coaxial cable.
13. An image acquisition system comprising the image processing system according to any one of claims 8 to 10 and at least one camera module according to claim 11 or 12.
CN202010248670.0A 2020-04-01 2020-04-01 Image processing method and system, camera module and image acquisition system Active CN113497883B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010248670.0A CN113497883B (en) 2020-04-01 2020-04-01 Image processing method and system, camera module and image acquisition system

Publications (2)

Publication Number Publication Date
CN113497883A true CN113497883A (en) 2021-10-12
CN113497883B CN113497883B (en) 2023-08-22

Family

ID=77993076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010248670.0A Active CN113497883B (en) 2020-04-01 2020-04-01 Image processing method and system, camera module and image acquisition system

Country Status (1)

Country Link
CN (1) CN113497883B (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102804789A (en) * 2009-06-23 2012-11-28 Lg电子株式会社 Receiving system and method of providing 3D image
CN108924477A (en) * 2018-06-01 2018-11-30 北京图森未来科技有限公司 A kind of long-distance video processing method and system, video processing equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117880623A (en) * 2024-03-11 2024-04-12 厦门瑞为信息技术有限公司 Synchronization method of binocular lens and method for receiving end to acquire synchronous image
CN117880623B (en) * 2024-03-11 2024-05-28 厦门瑞为信息技术有限公司 Synchronization method of binocular lens and method for receiving end to acquire synchronous image

Also Published As

Publication number Publication date
CN113497883B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN101690173B (en) Image processing apparatus and method for displaying captured image without time delay
CN101990095A (en) Method and apparatus for generating compressed file, camera module associated therewith, and terminal including the same
CN105611177A (en) Method for realizing multiple-camera simultaneous photographing of panorama camera and panorama camera
CN112887682B (en) Multi-path track image synchronous acquisition and storage system and method
KR100461339B1 (en) Device and Method for transmitting picture data
CN113194269A (en) Image output system and method
CN113497883B (en) Image processing method and system, camera module and image acquisition system
CN105959562A (en) Method and device for obtaining panoramic photographing data and portable panoramic photographing equipment
CN109495707B (en) High-speed video acquisition and transmission method
CN113572941A (en) Multifunctional image acquisition device applied to CPCI computer
CN109660746B (en) MIPI signal distance transmission device and method
JP2013539611A (en) Obtaining a stereo image of a single pipeline
JP2008131264A (en) Monitor camera, image recording/display apparatus and monitor camera system
CN105430297A (en) Automatic control system for conversion from multi-video format to IIDC protocol video format
US8923639B2 (en) Image processing system, image processing method, and program
CN114866733A (en) Low-delay video processing method, system and device
CN115278189A (en) Image tone mapping method and apparatus, computer readable medium and electronic device
CN114298889A (en) Image processing circuit and image processing method
CN112019808A (en) Vehicle-mounted real-time video information intelligent recognition device based on MPSoC
US7656433B2 (en) Web camera
KR100320151B1 (en) Control Unit for Multi Image Signal Storage
CN220653423U (en) Signal conversion device
CN111756963A (en) Image shooting module and electronic terminal
CN221042979U (en) Image pickup assembly and electronic device
KR200182088Y1 (en) Control unit for multi image signal storage

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant