CN210274328U - 3D image acquisition system, depth camera and image processing equipment - Google Patents
- Publication number
- CN210274328U CN210274328U CN201921433137.0U CN201921433137U CN210274328U CN 210274328 U CN210274328 U CN 210274328U CN 201921433137 U CN201921433137 U CN 201921433137U CN 210274328 U CN210274328 U CN 210274328U
- Authority
- CN
- China
- Prior art keywords
- image
- signal
- structured light
- serializer
- serial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Studio Devices (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The application discloses a 3D image acquisition system, a depth camera and an image processing device. The 3D image acquisition system includes: the depth camera, which comprises one or more image sensors for acquiring depth image information and a serializer connected with at least one of the image sensors, wherein the serializer is configured to receive an image signal transmitted by the at least one image sensor, convert the received image signal into a serial signal and transmit the converted serial signal to the image processing device through a serial transmission cable; and the image processing device, which comprises an image processor and a deserializer connected with the image processor, wherein the deserializer is connected with the serializer through the serial transmission cable and is configured to receive the serial signal transmitted by the serializer, deserialize the received serial signal, and transmit the deserialized image signal to the image processor.
Description
Technical Field
The present application relates to the field of computer vision, and in particular, to a 3D image acquisition system, a depth camera, and an image processing apparatus.
Background
A 3D image provides, for each pixel, a depth value in addition to the pixel's RGB color value. The depth value of a pixel may be obtained, for example, by emitting structured light towards a target object with a structured light generator, acquiring images of the projected structured light from different positions and angles with at least one image sensor, and then calculating, from these structured light images, a depth value for each pixel of the image acquired by a visible light image sensor (e.g., an RGB sensor).
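For illustration, once a pixel correspondence between two views of the projected pattern is known, the depth of the corresponding point is commonly recovered by triangulation as Z = f·B/d, where f is the focal length in pixels, B is the baseline between the sensors and d is the disparity in pixels. The minimal C sketch below shows this step with assumed example values; the embodiment itself does not prescribe a particular algorithm.

```c
/*
 * Illustrative only: conventional triangulation from disparity to depth,
 * Z = f * B / d. The focal length, baseline and disparity values are
 * assumed example numbers, not taken from the patent.
 */
#include <stdio.h>

/* Convert a disparity (pixels) to a depth (millimetres). */
static double depth_from_disparity(double focal_px, double baseline_mm,
                                   double disparity_px)
{
    if (disparity_px <= 0.0)
        return 0.0;                    /* no valid correspondence */
    return focal_px * baseline_mm / disparity_px;
}

int main(void)
{
    const double focal_px = 600.0;     /* assumed focal length in pixels   */
    const double baseline_mm = 50.0;   /* assumed baseline between sensors */
    const double disparities[] = { 12.0, 24.0, 48.0 };

    for (int i = 0; i < 3; i++)
        printf("disparity %5.1f px -> depth %7.1f mm\n",
               disparities[i],
               depth_from_disparity(focal_px, baseline_mm, disparities[i]));
    return 0;
}
```

The structured light pattern mainly serves to make the correspondence search reliable on low-texture surfaces; the triangulation step itself is the same as in passive stereo.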
To facilitate the calculation of depth information, depth cameras have been proposed that include an image sensor for acquiring a depth image signal and a processor that processes the image signal generated by the image sensor to produce 3D image data with depth information, which is then transmitted to a subsequent image processing apparatus.
However, the depth camera in the prior art has the following problems. Building the processor into the depth camera increases the camera's size and cost, hinders its miniaturization and weight reduction, and complicates maintenance as well as waterproofing and dustproofing. For systems requiring multiple depth cameras in particular, providing a processor for each depth camera clearly adds significant cost. If, on the other hand, the image processor is separated from the image sensor, transmitting the image signal over a long distance becomes a challenge: when the image signals generated by the image sensor are transmitted over parallel ribbon cables, the transmission distance is very limited, which hampers the deployment of the 3D image acquisition system.
No effective solution has yet been proposed for the above technical problem in the prior art, namely that the limited transmission distance obtained when the image sensor of the depth camera is deployed separately from the image processing device hampers the arrangement of the 3D image acquisition system.
SUMMARY OF THE UTILITY MODEL
The utility model provides a 3D image acquisition system, a depth camera and an image processing device to solve at least the technical problem in the prior art that, when the image sensor of the depth camera is deployed separately from the image processing device, the limited transmission distance hampers the arrangement of the 3D image acquisition system.
According to a first aspect of embodiments of the present application, there is provided a 3D image acquisition system, comprising: a depth camera for acquiring a depth image of a target object; and an image processing device connected with the depth camera, for processing the depth image acquired by the depth camera and generating 3D image data with depth information. The depth camera comprises one or more image sensors for acquiring depth image information and a serializer connected with at least one image sensor of the one or more image sensors, wherein the serializer is configured to receive an image signal transmitted by the at least one image sensor, convert the received image signal into a serial signal and transmit the converted serial signal to the image processing device through a serial transmission cable. The image processing device comprises an image processor and a deserializer connected with the image processor, wherein the deserializer is connected with the serializer through the serial transmission cable and is configured to receive the serial signal transmitted by the serializer, deserialize the received serial signal, and transmit the deserialized image signal to the image processor.
Optionally, the depth camera further comprises a structured light generator for emitting structured light towards the target object, and the plurality of image sensors comprises: a structured light image sensor for acquiring a structured light image projected on a target object; and a visible light image sensor for acquiring a visible light image of the target object.
Optionally, the serializer comprises: the first serializer is connected with the structured light image sensors and is configured to receive an image signal of a structured light image transmitted by at least one of the structured light image sensors, convert the image signal of the structured light image into a serial signal and transmit the converted serial signal to the image processing equipment through a first serial transmission cable; and a second serializer connected to the visible light image sensor, configured to receive the image signal of the visible light image transmitted by the visible light image sensor, convert the image signal of the visible light image into a serial signal, and transmit the converted serial signal to the image processing apparatus through a second serial transmission cable.
Optionally, the deserializer comprises: the first deserializer is connected with the image processor and the first serializer through a first serial transmission cable, and is configured to receive a serial signal transmitted by the first serializer, deserialize the received serial signal, and transmit the deserialized signal to the image processor; and a second deserializer connected with the image processor and connected with the second serializer through a second serial transmission cable, and configured to receive the serial signal transmitted by the second serializer, deserialize the received serial signal, and transmit the deserialized signal to the image processor.
Optionally, the image processing apparatus is further configured to: sending a structured light trigger synchronization signal (Lsync) to the structured light generator, the structured light trigger synchronization signal (Lsync) being used to trigger the structured light generator to emit structured light; and sending an image acquisition synchronization signal (Vsync) synchronized with the structured light trigger synchronization signal (Lsync) to the image sensor, the image acquisition synchronization signal (Vsync) for triggering the image sensor to acquire the image signal.
Optionally, the depth camera further includes a first protocol converter disposed between the at least one image sensor and the serializer, wherein the first protocol converter is configured to convert a transmission protocol used when the at least one image sensor transmits a signal into a receiving protocol adapted to the serializer, and simultaneously, the first protocol converter transmits a Vsync synchronization signal received by the serializer to the image sensor.
Optionally, the image processing apparatus further includes a second protocol converter disposed between the deserializer and the image processor, wherein the second protocol converter is configured to convert a transmission protocol of the image signal deserialized by the deserializer into a receiving protocol adapted to the image processor, and simultaneously, the second protocol converter sends the Vsync synchronization signal generated by the image processor to the deserializer.
According to a second aspect of embodiments of the present application, there is provided a depth camera for acquiring a depth image of a target object, comprising: a plurality of image sensors for acquiring depth image information; and a serializer connected to at least one of the plurality of image sensors, wherein the serializer is configured to receive an image signal transmitted from the at least one image sensor, convert the received image signal into a serial signal, and transmit the converted serial signal through a serial transmission cable.
According to a third aspect of the embodiments of the present application, there is provided an image processing apparatus for processing a depth image, the image processing apparatus being connected to a depth camera, and being configured to process a depth image acquired by the depth camera to generate 3D image data with depth information, the image processing apparatus including: an image processor and a deserializer connected with the image processor. The deserializer is connected with the image processor and is configured to receive the serial signal transmitted by the serializer, deserialize the received serial signal, and transmit the deserialized image signal to the image processor.
Optionally, the image processor is further configured to: sending a structured light trigger synchronization signal to a structured light generator of the depth camera, wherein the structured light trigger synchronization signal is used for triggering the structured light generator to emit structured light; sending an image acquisition synchronization signal (Vsync) synchronized with the structured light trigger synchronization signal (Lsync) to an image sensor of the depth camera, wherein the image acquisition synchronization signal (Vsync) is used for triggering the image sensor to acquire an image signal; or the image processing apparatus further comprises a synchronizer configured to: sending a structured light trigger synchronization signal (Lsync) to the structured light generator, the structured light trigger synchronization signal (Lsync) being used to trigger the structured light generator to emit structured light; and sending an image acquisition synchronization signal (Vsync) synchronized with the structured light trigger synchronization signal (Lsync) to the image sensor, the image acquisition synchronization signal (Vsync) for triggering the image sensor to acquire the image signal.
Thus, in the present embodiment, the image signal collected by the image sensor may be converted into a serial signal by the serializer and then transmitted to the image processing device through the serial transmission cable, where it is deserialized by the deserializer and the deserialized signal is processed by the image processor. In this way, the signals of the image sensor can be transmitted over long distances, and the image sensor and the image processor can be deployed far apart, which solves the technical problem in the prior art that the limited transmission distance, when the image sensor of the depth camera is deployed separately from the image processing device, hampers the arrangement of the 3D image acquisition system.
The above and other objects, advantages and features of the present application will become more apparent to those skilled in the art from the following detailed description of specific embodiments thereof, taken in conjunction with the accompanying drawings.
Drawings
Some specific embodiments of the present application will be described in detail hereinafter by way of illustration and not limitation with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
fig. 1 is a schematic diagram of a 3D image acquisition system according to a first aspect of an embodiment of the application;
fig. 2 is a schematic diagram of another 3D image acquisition system according to the first aspect of an embodiment of the application;
fig. 3 is a schematic diagram of another 3D image acquisition system according to the first aspect of an embodiment of the application;
fig. 4 is a schematic diagram of another depth camera employed by the 3D image acquisition system according to the first aspect of an embodiment of the present application; and
fig. 5 is a schematic diagram of another depth camera employed by the 3D image acquisition system according to the first aspect of an embodiment of the present application.
Detailed Description
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
In order to make the technical solutions of the present disclosure better understood by those skilled in the art, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only some embodiments of the present disclosure, not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances for describing the embodiments of the disclosure herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
Fig. 1 is a schematic diagram of a 3D image acquisition system according to a first aspect of an embodiment of the application. Referring to fig. 1, the 3D image acquisition system includes: a depth camera 100, the depth camera 100 being configured to acquire a depth image of a target object; and the image processing device 200, the image processing device 200 is connected with the depth camera 100 and is used for processing the depth image collected by the depth camera 100 and generating 3D image data with depth information.
The depth camera 100 includes one or more image sensors 101, 102, and 103 for acquiring depth image information and serializers 301, 302, and 303 connected to at least one of the one or more image sensors 101, 102, and 103. Wherein the serializers 301, 302, and 303 are configured to receive an image signal transmitted by at least one image sensor, convert the received image signal into a serial signal, and transmit the converted serial signal to the image processing apparatus 200 through a serial transmission cable. The image processing apparatus 200 includes an image processor 215 and deserializers 211, 212, and 213 connected to the image processor 215. Wherein the deserializers 211, 212 and 213 are connected to the image processor 215 and connected to the serializers 301, 302 and 303 through serial transmission cables, and are configured to receive the serial signals transmitted by the serializers 301, 302 and 303, deserialize the received serial signals, and transmit the deserialized image signals to the image processor 215.
As noted in the background, the depth cameras of the prior art suffer from the following problems. Building the processor into the depth camera increases the camera's size and cost, hinders its miniaturization and weight reduction, and complicates maintenance as well as waterproofing and dustproofing. For systems requiring multiple depth cameras in particular, providing a processor for each depth camera clearly adds significant cost. If, on the other hand, the image processor is separated from the image sensor, transmitting the image signal over a long distance becomes a challenge: when the image signals generated by the image sensor are transmitted over parallel ribbon cables, the transmission distance is very limited, which hampers the deployment of the 3D image acquisition system.
In order to solve these technical problems in the prior art, in the 3D image acquisition system of the present embodiment, the depth camera 100 is provided with serializers 301 to 303 connected to at least one of the image sensors 101 to 103, and deserializers 211 to 213 connected to an image processor 215 are provided in the image processing apparatus 200. The serializers 301 to 303 are connected to the deserializers 211 to 213 via serial transmission cables, respectively.
Thus, in the present embodiment, the image signals collected by the image sensors 101 to 103 can be converted into serial signals by the serializers 301 to 303 and then transmitted to the image processing apparatus 200 through the serial transmission cables, where they are deserialized by the deserializers 211 to 213 and the deserialized signals are processed by the image processor 215. In this way, the signals of the image sensors 101 to 103 can be transmitted over long distances, and the image sensors 101 to 103 and the image processor 215 can be deployed far apart, which solves the technical problem in the prior art that the limited transmission distance, when the image sensor of the depth camera is deployed separately from the image processing device, hampers the arrangement of the 3D image acquisition system.
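The following host-side C sketch imitates, in software, what the serializer/deserializer pair does for one image line: parallel pixel words are flattened into a byte stream for the serial cable and reassembled on the far side. It is an illustrative simulation only; actual serializer and deserializer chips perform this framing (plus clock recovery and error handling) in hardware, and the embodiment does not specify particular parts.

```c
/*
 * Host-side simulation of the serialize/deserialize round trip for one
 * image line. Real serializer/deserializer chips do this framing in
 * hardware; this sketch only shows the data-flow idea.
 */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* "Serializer": pack 16-bit pixel words into a big-endian byte stream. */
static size_t serialize_line(const uint16_t *pixels, size_t n, uint8_t *out)
{
    for (size_t i = 0; i < n; i++) {
        out[2 * i]     = (uint8_t)(pixels[i] >> 8);
        out[2 * i + 1] = (uint8_t)(pixels[i] & 0xFFu);
    }
    return 2 * n;                      /* bytes placed on the serial link */
}

/* "Deserializer": rebuild the parallel pixel words from the byte stream. */
static void deserialize_line(const uint8_t *in, size_t nbytes, uint16_t *pixels)
{
    for (size_t i = 0; i < nbytes / 2; i++)
        pixels[i] = (uint16_t)(((uint16_t)in[2 * i] << 8) | in[2 * i + 1]);
}

int main(void)
{
    uint16_t line[4] = { 0x0123, 0x0456, 0x0789, 0x0ABC };  /* one image line */
    uint8_t  wire[sizeof line];
    uint16_t recovered[4];

    size_t nbytes = serialize_line(line, 4, wire);   /* depth camera side     */
    deserialize_line(wire, nbytes, recovered);       /* image processing side */

    printf("round trip %s\n",
           memcmp(line, recovered, sizeof line) == 0 ? "ok" : "failed");
    return 0;
}
```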
Optionally, the depth camera 100 further comprises a structured light generator 104 for emitting structured light towards the target object, and the image sensors 101 to 103 include: structured light image sensors 101 and 103 for acquiring a structured light image projected on the target object; and a visible light image sensor 102 for acquiring a visible light image of the target object.
Thus, the depth camera 100 in the present embodiment may emit structured light to the target object through the structured light generator 104. Wherein the specific structure of the structured light generator 104 is not limited, for example, it may be a structured light generator consisting of a laser emitter and a deflectable micro-mirror.
Then, the structured light image sensors 101 and 103 may acquire, at different angles, the structured light image projected on the target object, i.e., the structured light pattern projected on the target object. The images acquired by the structured light image sensors 101 and 103 can thus be used to calculate a depth map. In addition, the visible light image sensor 102 is used to capture a visible light image (i.e., an RGB image) of the target object. Therefore, from the images provided by the depth camera of the embodiment, the depth image of the target object can be obtained through a subsequent processing algorithm (a well-known algorithm that is not described in detail in this application).
Further, referring to FIG. 1, the serializers 301 to 303 include: first serializers 301 and 303 connected to structured light image sensors 101 and 103, and second serializer 302 connected to visible light image sensor 102. The first serializers 301 and 303 are configured to receive an image signal of a structured light image transmitted by at least one of the structured light image sensors 101 and 103, convert the image signal of the structured light image into a serial signal, and transmit the converted serial signal to the image processing apparatus 200 through a serial transmission cable. The second serializer 302 is configured to receive the image signal of the visible light image transmitted by the visible light image sensor 102, convert the image signal of the visible light image into a serial signal, and transmit the converted serial signal to the image processing apparatus 200 through a serial transmission cable.
Thus, the present embodiment can convert the image signals of the structured light images collected by the structured light image sensors 101 and 103 into serial signals by the first serializers 301 and 303 and transmit the converted serial signals to the image processing apparatus 200 through the serial transmission cable. The image signal of the visible light image captured by the visible light image sensor 102 is converted into a serial signal by the second serializer 302, and the converted serial signal is transmitted to the image processing apparatus 200 through the serial transmission cable.
Further, for illustrative purposes, two structured light image sensors 101 and 103 are provided in the technical solution of the present embodiment; depth image information may also be acquired with more or fewer structured light image sensors. For example, only one structured light image sensor may be provided: because the size of the acquired structured light pattern varies with the distance from the structured light image sensor to the target object, depth image information can still be obtained with a different number of structured light image sensors, the difference lying mainly in accuracy.
For example, referring to fig. 2, a simple structured light camera is provided that includes only one structured light image sensor 101 for collecting the infrared structured light pattern information emitted by the structured light generator 104. The image data is serialized by the serializer 301 and the serial data is transmitted via a coaxial cable or twisted pair (STP/CON); the acquisition time of the image sensor 101 is synchronized with the emission time of the structured light generator 104, so that the image sensor 101 acquires the relevant structured light image while the structured light generator 104 emits the pattern.
Figs. 4 and 5 show schematic diagrams of further exemplary configurations of the depth camera 100. Fig. 4 shows a simple structured light camera: the structured light image sensor 101 may be, for example, an infrared image sensor, and is configured to collect the infrared structured light pattern information emitted by the structured light generator 104; the image data is serialized via the serializer 301, and the serial data is transmitted via a coaxial cable or twisted pair (STP/CON). The acquisition time of the structured light image sensor 101 and the emission time of the structured light generator 104 may, for example, be strictly synchronized so that the structured light image sensor 101 acquires the relevant structured light image while the structured light generator 104 emits the pattern. In addition, the serializer 301 is also used to transmit the image acquisition synchronization signal Vsync for the structured light image sensor 101.
Further, fig. 5 shows a binocular structured light camera: the structured light image sensors 101 and 103 may be, for example, infrared image sensors, which are used together to collect infrared structured light pattern information emitted by the structured light generator 104, serialize the image data via serializers 301 and 303, and transmit serial data via coaxial cable or twisted pair (STP/CON). The acquisition times of the structured light image sensors 101 and 103 and the emission time of the structured light generator 104 may be strictly synchronized such that the image sensors 101 and 103 acquire the relevant structured light images while the structured light generator 104 emits the pattern.
Optionally, referring to FIGS. 1 and 2, deserializers 211-213 include: first deserializers 211 and 213 connected to the image processor 215 and connected to the first serializers 301 and 303 through a first serial transmission cable, and configured to receive the serial signals transmitted by the first serializers 301 and 303, deserialize the received serial signals, and transmit the deserialized signals to the image processor 215; and a second deserializer 212 connected to the image processor 215 and connected to the second serializer 302 through a second serial transmission cable, and configured to receive the serial signal transmitted by the second serializer 302, deserialize the received serial signal, and transmit the deserialized signal to the image processor 215.
Thus, the present embodiment may receive the serial signals transmitted by the first serializers 301 and 303 through the first deserializers 211 and 213, deserialize the received serial signals, and transmit the deserialized image signals to the image processor 215; the serial signal transmitted by the second serializer 302 is likewise received by the second deserializer 212, deserialized, and transmitted to the image processor 215. In this way, the image sensors 101 to 103 and the image processor 215 can be deployed far apart, providing flexibility in sensor placement while allowing the signals collected by the image sensors 101 to 103 to be transmitted over long distances.
Optionally, referring to figs. 1 and 2, the image processing apparatus 200 is further configured to: send a structured light trigger synchronization signal (Lsync) to the structured light generator 104, the structured light trigger synchronization signal (Lsync) being used to trigger the structured light generator 104 to emit structured light; and send an image acquisition synchronization signal (Vsync), synchronized with the structured light trigger synchronization signal (Lsync), to the image sensors 101, 102, and 103, the image acquisition synchronization signal (Vsync) being used to trigger the image sensors 101, 102, and 103 to acquire image signals.
Thus, a structured light trigger synchronization signal Lsync may be sent by the image processing apparatus 200 to the structured light generator 104, thereby triggering the structured light generator 104 to emit structured light. Further, an image acquisition synchronization signal Vsync may also be sent by the image processing apparatus 200 to the image sensors 101, 102, and 103, which start and end image signal acquisition according to the received signal, so that the image signals are acquired synchronously. In addition, the acquisition time of the image sensors 101, 102, and 103 and the emission time of the structured light generator 104 may, for example, be strictly synchronized so that the image sensors 101, 102, and 103 acquire image signals while the structured light generator 104 emits the pattern.
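A rough C sketch of this trigger sequence is given below. The gpio_set() helper, the signal names and the 10 ms exposure are placeholders chosen for illustration; in the described system the Vsync signal may instead be carried to the sensors through the serializer/deserializer link, as discussed below.

```c
/*
 * Illustrative trigger sequence: Lsync starts structured light emission,
 * Vsync starts exposure on the image sensors, both released after the
 * assumed exposure time. gpio_set() and the timing are placeholders.
 */
#include <stdio.h>
#include <unistd.h>     /* usleep(), POSIX */

static void gpio_set(const char *line, int level)
{
    /* Placeholder: a real system would drive an FPGA pin or GPIO here. */
    printf("%s -> %d\n", line, level);
}

static void trigger_frame(unsigned exposure_us)
{
    gpio_set("LSYNC", 1);       /* structured light generator starts emitting */
    gpio_set("VSYNC", 1);       /* image sensors start acquisition            */
    usleep(exposure_us);        /* pattern stays on for the whole exposure    */
    gpio_set("VSYNC", 0);       /* image sensors end acquisition              */
    gpio_set("LSYNC", 0);       /* structured light generator stops emitting  */
}

int main(void)
{
    for (int frame = 0; frame < 3; frame++)
        trigger_frame(10000);   /* assumed 10 ms exposure per frame */
    return 0;
}
```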
Further, alternatively, as shown with reference to figs. 1 and 2, the image processor 215 may be configured to send the structured light trigger synchronization signal Lsync and the image acquisition synchronization signal Vsync. Alternatively, referring to fig. 3, a synchronizer 216 may be provided in the image processing apparatus 200 to send the structured light trigger synchronization signal Lsync and the image acquisition synchronization signal Vsync.
Further, optionally, referring to figs. 1 to 5, the 3D image acquisition system also transmits the image acquisition synchronization signal (Vsync) through the deserializers 211 to 213 of the image processing apparatus 200 and the serializers 301 to 303 of the depth camera 100.
Further, optionally, as shown with reference to fig. 1 to 3, the depth camera 100 further includes first protocol converters 401, 402, and 403 disposed between the at least one image sensor and the serializers 301, 302, and 303. Wherein the first protocol converters 401, 402, and 403 are used to convert the transmission protocol used when at least one image sensor transmits signals into a reception protocol adapted to the serializers 301, 302, and 303, and at the same time, the first protocol converters 401, 402, and 403 transmit Vsync synchronization signals received by the serializers 301, 302, and 303 to the image sensors 101, 102, and 103.
Different types and brands of image sensors transmit signals using a variety of protocols, for example LVDS, Sub-LVDS, MIPI CSI-2, SLVS-EC or parallel CMOS, whereas the transmission protocols that a serializer can accept are usually limited to LVCMOS and MIPI CSI-2. The serializer may therefore fail to match the transmission protocol used by the image sensor, so that the signal cannot be transmitted normally.
In response to this problem, for example, as shown with reference to fig. 1 to 3, the depth camera 100 further includes first protocol converters 401, 402, and 403 provided between the image sensors 101, 102, and 103 and the serializers 301, 302, and 303. Thus, the transmission protocol used when the image sensors 101, 102, and 103 transmit signals may be converted into a reception protocol adapted to the serializers 301, 302, and 303 by the first protocol converters 401, 402, and 403, and at the same time, the first protocol converters 401, 402, and 403 transmit Vsync synchronization signals received by the serializers 301, 302, and 303 to the image sensors 101, 102, and 103.
For example, the first protocol converter 401 may be configured to receive signals transmitted under protocols such as LVDS, Sub-LVDS, MIPI CSI-2, SLVS-EC and parallel CMOS and convert them into LVCMOS signals, so that the converted signal matches the transmission protocol of the serializer 301 and can be transmitted by the serializer 301.
In this way, the present embodiment converts the transmission protocol used when the image sensors 101, 102, and 103 transmit signals into the reception protocol adapted to the serializers 301, 302, and 303 by means of the first protocol converters 401, 402, and 403, so that the signal can be transmitted normally.
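As a rough software illustration of the converter's role (the real device performs this bridging in FPGA logic, not software), the C sketch below checks a sensor's output protocol against an assumed list of protocols the serializer accepts and reports whether a conversion to LVCMOS is needed. The protocol names come from the description above; the code itself is only an assumption for illustration.

```c
/*
 * Illustrative check performed conceptually by the first protocol
 * converter: is the sensor's output protocol one the serializer accepts,
 * or must it be converted (here, to LVCMOS) first?
 */
#include <stdio.h>
#include <string.h>

static const char *serializer_inputs[] = { "LVCMOS", "MIPI CSI-2" };

static int serializer_accepts(const char *proto)
{
    size_t n = sizeof serializer_inputs / sizeof serializer_inputs[0];
    for (size_t i = 0; i < n; i++)
        if (strcmp(proto, serializer_inputs[i]) == 0)
            return 1;
    return 0;
}

int main(void)
{
    const char *sensor_outputs[] = { "MIPI CSI-2", "Sub-LVDS", "SLVS-EC" };

    for (size_t i = 0; i < 3; i++)
        printf("%-11s : %s\n", sensor_outputs[i],
               serializer_accepts(sensor_outputs[i])
                   ? "pass through to the serializer"
                   : "convert to LVCMOS before the serializer");
    return 0;
}
```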
The first protocol converters 401, 402, and 403 used in this embodiment may be, for example, FPGA-based protocol converters; for example, but not limited to, they may be Lattice LIF-MD6000-80 chips, which can perform the conversion from the MIPI protocol to the LVCMOS protocol.
Optionally, the image processing apparatus 200 further includes second protocol converters 501, 502, and 503 disposed between the deserializers 211, 212, and 213 and the image processor 215, wherein the second protocol converters 501, 502, and 503 are used to convert a transmission protocol of the deserialized image signals of the deserializers 211, 212, and 213 into a reception protocol adapted to the image processor 215, and at the same time, the second protocol converters 501, 502, and 503 transmit Vsync synchronization signals generated by the image processor 215 to the deserializers 211, 212, and 213.
Since the transmission protocols supported by the deserializer are usually limited to LVCMOS, MIPI CSI-2 and the like, they may likewise be mismatched with the receiving protocol supported by the processor, with the result that the image processor cannot normally receive the signal deserialized by the deserializer.
Specifically, as shown with reference to fig. 1 to 3, the image processing apparatus 200 further includes second protocol converters 501, 502, and 503 disposed between the deserializers 211, 212, and 213 and the image processor 215. Thus, the transmission protocol of the image signals deserialized by the deserializers 211, 212, and 213 can be converted into the reception protocol adapted to the image processor 215 by the second protocol converters 501, 502, and 503.
Thus, in this way, the transmission protocols employed by the deserializers 211, 212, and 213 in transmitting information to the image processor 215 can be converted into the reception protocols adapted to the image processor 215 by the second protocol converters 501, 502, and 503. Thereby ensuring normal transmission of information between the deserializers 211, 212 and 213 and the image processor 215.
The second protocol converters 501, 502, and 503 used in this embodiment may also be, for example, FPGA-based protocol converters; for example, but not limited to, they may be Lattice LIF-MD6000-80 chips, which can also perform the protocol conversion from LVCMOS to MIPI.
Further, optionally, as shown with reference to fig. 1 to 3, the image processing apparatus 200 further includes a data interface 214 for data interaction with an external apparatus. Referring to FIG. 1, the data interface may be, for example, but not limited to, a USB3.0/GigE interface. Thus, the image processing apparatus 200 can perform data interaction with an external apparatus through the data interface 214, thereby transmitting 3D image data.
Further, referring to fig. 1 to 3, according to a second aspect of the present embodiment, there is provided a depth camera 100 for acquiring a depth image of a target object. The depth camera 100 includes: image sensors 101, 102, and 103 for acquiring depth image information; and serializers 301, 302, and 303 connected to at least one of the image sensors 101, 102, and 103. Wherein the serializers 301, 302 and 303 are configured to receive the image signal transmitted by the at least one image sensor, convert the received image signal into a serial signal, and transmit the converted serial signal through a serial transmission cable.
For further description of the depth camera 100, reference may be made to the description of the depth camera 100 in the first aspect of the present embodiment.
Further, referring to fig. 1 to 3, according to a third aspect of the present embodiment, there is provided an image processing apparatus 200 for processing a depth image. The image processing device 200 is configured to connect to the depth camera 100, process the depth image acquired by the depth camera 100, and generate 3D image data with depth information. Wherein the image processing device 200 comprises an image processor 215 and deserializers 211, 212, 213 connected to the image processor 215. Wherein the deserializers 211, 212, 213 are connected to the image processor 215, and are configured to receive the serial signal converted from the image signal of the depth image transmitted by the serializers 301, 302, 303, deserialize the received serial signal, and transmit the deserialized image signal to the image processor 215.
For further description of the image processing apparatus 200, reference may be made to the description of the image processing apparatus 200 in the first aspect of the present embodiment.
In summary, in the present embodiment, the image signal collected by the image sensor may be converted into a serial signal by the serializer and then transmitted to the image processing device through the serial transmission cable, where it is deserialized by the deserializer and the deserialized signal is processed by the image processor. In this way, the signals of the image sensor can be transmitted over long distances, and the image sensor and the image processor can be deployed far apart, which solves the technical problem in the prior art that the limited transmission distance, when the image sensor of the depth camera is deployed separately from the image processing device, hampers the arrangement of the 3D image acquisition system.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Spatially relative terms, such as "above," "over," "on," "upper," and the like, may be used herein for ease of description to describe one device's or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" can encompass both an orientation of above and an orientation of below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
In the description of the present disclosure, it is to be understood that the orientation or positional relationship indicated by the directional terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom", etc., are generally based on the orientation or positional relationship shown in the drawings, and are presented only for the convenience of describing and simplifying the disclosure, and in the absence of a contrary indication, these directional terms are not intended to indicate and imply that the device or element being referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore, should not be taken as limiting the scope of the disclosure; the terms "inner and outer" refer to the inner and outer relative to the profile of the respective component itself.
The above description is only for the preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A 3D image acquisition system comprising: a depth camera (100), the depth camera (100) for acquiring a depth image of a target object; and an image processing device (200), the image processing device (200) being connected to the depth camera (100) for processing a depth image captured by the depth camera (100) to generate 3D image data with depth information,
the depth camera (100) comprises one or more image sensors (101, 102, 103) for acquiring depth image information and a serializer (301, 302, 303) connected with at least one of the one or more image sensors (101, 102, 103), wherein the serializer (301, 302, 303) is configured to receive an image signal transmitted by the at least one image sensor, convert the received image signal into a serial signal, and transmit the converted serial signal to the image processing apparatus (200) through a serial transmission cable; and
the image processing device (200) comprises an image processor (215) and a deserializer (211, 212, 213) connected with the image processor (215), wherein the deserializer (211, 212, 213) is connected with the image processor (215) and is connected with the serializer (301, 302, 303) through the serial transmission cable, and is configured to receive a serial signal transmitted by the serializer (301, 302, 303), deserialize the received serial signal, and transmit the deserialized image signal to the image processor (215).
2. The 3D image acquisition system according to claim 1, characterized in that the depth camera (100) further comprises a structured light generator (104) for emitting structured light towards the target object, and the plurality of image sensors (101, 102, 103) comprises:
a structured light image sensor (101, 103) for acquiring a structured light image projected on the target object; and
a visible light image sensor (102) for acquiring a visible light image of the target object.
3. 3D image acquisition system according to claim 2, characterized in that the serializer (301, 302, 303) comprises:
a first serializer (301, 303) connected to the structured light image sensors (101, 103) and configured to receive an image signal of a structured light image transmitted by at least one of the structured light image sensors (101, 103), convert the image signal of the structured light image into a serial signal, and transmit the converted serial signal to the image processing apparatus (200) through a first serial transmission cable; and
a second serializer (302) connected with the visible light image sensor (102) and configured to receive the image signal of the visible light image transmitted by the visible light image sensor (102), convert the image signal of the visible light image into a serial signal and transmit the converted serial signal to the image processing device (200) through a second serial transmission cable.
4. 3D image acquisition system according to claim 3, characterized in that the deserializer (211, 212, 213) comprises:
a first deserializer (211, 213) connected to the image processor (215) and connected to the first serializer (301, 303) through the first serial transmission cable, and configured to receive the serial signal transmitted by the first serializer (301, 303), deserialize the received serial signal, and transmit the deserialized signal to the image processor (215); and
a second deserializer (212) connected to the image processor (215) and connected to the second serializer (302) through the second serial transmission cable, and configured to receive the serial signal transmitted by the second serializer (302), deserialize the received serial signal, and transmit the deserialized signal to the image processor (215).
5. The 3D image acquisition system according to any of claims 2 to 4, characterized in that the image processing device (200) is further configured for:
sending a structured light trigger synchronization signal (Lsync) to the structured light generator (104), the structured light trigger synchronization signal (Lsync) for triggering the structured light generator (104) to emit structured light; and
sending an image acquisition synchronization signal (Vsync) synchronized with the structured light trigger synchronization signal (Lsync) to the image sensors (101, 102, 103), the image acquisition synchronization signal (Vsync) being used to trigger the image sensors (101, 102, 103) to acquire image signals.
6. The 3D image acquisition system according to claim 1, characterized in that the depth camera (100) further comprises a first protocol converter (401, 402, 403) arranged between the at least one image sensor and the serializer (301, 302, 303), wherein the first protocol converter (401, 402, 403) is configured to convert a transmission protocol used when the at least one image sensor transmits a signal into a reception protocol adapted to the serializer (301, 302, 303), and wherein the first protocol converter (401, 402, 403) is configured to transmit a Vsync synchronization signal received by the serializer (301, 302, 303) to the image sensor (101, 102, 103).
7. The 3D image acquisition system according to claim 6, characterized in that the image processing device (200) further comprises a second protocol converter (501, 502, 503) arranged between the deserializer (211, 212, 213) and the image processor (215), wherein the second protocol converter (501, 502, 503) is configured to convert a transmission protocol of the deserialized image signal by the deserializer (211, 212, 213) into a reception protocol adapted to the image processor (215), and wherein the second protocol converter (501, 502, 503) transmits the Vsync synchronization signal generated by the image processor (215) to the deserializer (211, 212, 213).
8. A depth camera (100) for acquiring a depth image of a target object, comprising:
a plurality of image sensors (101, 102, 103) for acquiring depth image information; and
a serializer (301, 302, 303) connected to at least one of the plurality of image sensors (101, 102, 103), wherein the serializer (301, 302, 303) is configured to receive an image signal transmitted by the at least one image sensor, convert the received image signal into a serial signal, and transmit the converted serial signal through a serial transmission cable.
9. An image processing device (200) for processing a depth image, for connecting to a depth camera (100), for processing a depth image acquired by the depth camera (100) to generate 3D image data with depth information, comprising:
an image processor (215) and a deserializer (211, 212, 213) connected to the image processor (215), wherein
The deserializer (211, 212, 213) is connected with the image processor (215), and is configured to receive the serial signal transmitted by the serializer (301, 302, 303), deserialize the received serial signal, and transmit the deserialized image signal to the image processor (215).
10. The image processing apparatus (200) of claim 9,
the image processor (215) is further configured for: sending a structured light trigger synchronization signal (Lsync) to a structured light generator (104) of the depth camera (100), the structured light trigger synchronization signal (Lsync) for triggering emission of structured light by the structured light generator (104); and sending an image acquisition synchronization signal (Vsync) synchronized with the structured light trigger synchronization signal (Lsync) to an image sensor (101, 102, 103) of the depth camera (100), the image acquisition synchronization signal (Vsync) being used to trigger the image sensor (101, 102, 103) to acquire an image signal; or
The image processing apparatus (200) further comprises a synchronizer (216) configured for: sending a structured light trigger synchronization signal (Lsync) to the structured light generator (104), the structured light trigger synchronization signal (Lsync) for triggering the structured light generator (104) to emit structured light; and sending an image acquisition synchronization signal (Vsync) synchronized with the structured light trigger synchronization signal (Lsync) to the image sensors (101, 102, 103), the image acquisition synchronization signal (Vsync) being used to trigger the image sensors (101, 102, 103) to acquire image signals.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201921433137.0U CN210274328U (en) | 2019-08-30 | 2019-08-30 | 3D image acquisition system, depth camera and image processing equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN210274328U (en) | 2020-04-07 |
Family
ID=70018492
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201921433137.0U (Active, granted as CN210274328U) | 3D image acquisition system, depth camera and image processing equipment | 2019-08-30 | 2019-08-30 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN210274328U (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113552123A (en) * | 2020-04-17 | 2021-10-26 | 华为技术有限公司 | Visual inspection method and visual inspection device |
CN111866409A (en) * | 2020-06-22 | 2020-10-30 | 北京都是科技有限公司 | Image selection processing apparatus and image selection processing system |
CN112911265A (en) * | 2021-02-01 | 2021-06-04 | 北京都视科技有限公司 | Fusion processor and fusion processing system |
CN112911265B (en) * | 2021-02-01 | 2023-01-24 | 北京都视科技有限公司 | Fusion processor and fusion processing system |
CN116723406A (en) * | 2022-02-28 | 2023-09-08 | 比亚迪股份有限公司 | Image data processing method, device, computer equipment and storage medium |
CN116500010A (en) * | 2023-06-25 | 2023-07-28 | 之江实验室 | Fluorescence microscopic imaging system and method thereof and fluorescence microscopic detection device |
CN116500010B (en) * | 2023-06-25 | 2024-01-26 | 之江实验室 | Fluorescence microscopic imaging system and method thereof and fluorescence microscopic detection device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | |