CN214333663U - 2D and 3D combined high-precision vision device - Google Patents

2D and 3D combined high-precision vision device

Info

Publication number
CN214333663U
CN214333663U CN202120226547.9U
Authority
CN
China
Prior art keywords
unit
image
image acquisition
projection
acquisition unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202120226547.9U
Other languages
Chinese (zh)
Inventor
吴伟锋
王国安
王前程
虞大鹏
周飞
郑泽鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hypersen Technologies Co ltd
Original Assignee
Hypersen Technologies Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hypersen Technologies Co ltd filed Critical Hypersen Technologies Co ltd
Priority to CN202120226547.9U priority Critical patent/CN214333663U/en
Application granted granted Critical
Publication of CN214333663U publication Critical patent/CN214333663U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The utility model discloses a 2D/3D composite high-precision vision device, comprising a projection unit, a background illumination unit, an image acquisition unit, a control unit, and a data processing unit. The projection unit is arranged obliquely to the side of the measurement object and forms an included angle greater than 0° and smaller than 90° with the main optical axis of the image acquisition unit; it projects structured-light patterns and uniform light of different wavelengths onto the measurement object. The background illumination unit is arranged directly below the measurement object and provides a background light source for it. The image acquisition unit is located directly above the measurement object and acquires the projected image data. The control unit controls the projection unit and the background illumination unit to light up in a time-shared manner. The data processing unit acquires the image data and analyzes and processes it to obtain a full-view-angle color 3D image. The utility model is small in size, low in cost, expandable, and widely applicable.

Description

2D and 3D combined high-precision vision device
Technical Field
The utility model relates to the technical field of visual inspection, and in particular to a 2D/3D composite high-precision vision device.
Background
With the continuous improvement of the technological level, machine vision inspection is replacing manual inspection and measurement in more and more industrial settings.
Generally, to measure and inspect a product from all aspects, 2D vision is used to detect surface texture defects, color differences, and length and width dimensions. Because 2D vision provides no height data, indexes such as flatness and height difference require scanning with a 3D vision system or a profile measuring instrument. At least two different vision systems are therefore usually needed to achieve the expected effect, and each vision system must also inspect from different angles to meet the required inspection takt, which greatly increases the user's cost and the difficulty of equipment design. If a high-precision 3D image with color texture is needed, a color CCD/CMOS must be added and the poses of the multiple CCD/CMOS sensors calibrated, and the image fusion inevitably introduces errors; this also increases hardware cost and the complexity of system calibration. Although there are implementations that acquire color 3D images without an additional CCD/CMOS, such as an RGB CMOS with depth information, their accuracy is low, usually at the centimeter level, and high precision is difficult to achieve.
SUMMARY OF THE UTILITY MODEL
To solve the above technical problem, the utility model provides a 2D/3D composite high-precision vision device.
In order to achieve the above object, the technical embodiments of the present invention are as follows:
The utility model provides a 2D/3D composite high-precision vision device, comprising a projection unit, a background illumination unit, an image acquisition unit, a control unit, a data storage unit, and a data processing unit, wherein:
the projection unit is arranged obliquely to the side of the measurement object and forms an included angle greater than 0° and smaller than 90° with the main optical axis of the image acquisition unit, and is used for projecting structured-light patterns and uniform light of different wavelengths onto the measurement object;
the background illumination unit is arranged directly below the measurement object and is used for providing a background light source for the measurement object;
the image acquisition unit is connected with the projection unit, is located directly above the measurement object, and is used for acquiring the image data after the structured-light pattern and the uniform light of different wavelengths are projected onto the measurement object and modulated by it;
the control unit is connected with the projection unit and the background illumination unit and is used for controlling them respectively to light up in a time-shared manner to illuminate the measurement object;
the data storage unit is connected with the image acquisition unit and is used for storing the image data acquired by the image acquisition unit;
the data processing unit is connected with the data storage unit and is used for extracting the image data from the data storage unit, analyzing and processing it to obtain a shape image with depth information and a pseudo-color image with texture information, and synthesizing a full-view-angle color 3D image from the two.
Preferably, the image acquisition unit comprises a receiving lens and a photosensitive device, and the receiving lens and the photosensitive device are located on the same central line.
Preferably, the photosensitive device adopts a black and white CCD or CMOS sensor.
Preferably, the receiving lens is a telecentric lens.
Preferably, the structured light pattern is a coded structured light pattern or a modulated stripe pattern.
Preferably, the uniform light of different wavelengths is light of the three primary colors or mixed light of any combination thereof.
Preferably, the background light source is a plane-parallel light source.
Preferably, the data processing unit adopts a general-purpose graphics processor, an FPGA-based programmable controller, a central processing unit or a DSP chip.
Preferably, a plurality of projection units with the same mounting angle are arranged around the main optical axis of the image acquisition unit.
Preferably, the image generation surface and the object surface of the projection unit satisfy the Scheimpflug condition with respect to the principal plane of the projection lens.
Based on the above technical embodiments, the beneficial effects of the utility model are as follows:
1) the projection device adopts a color projector, so the color of the projected pattern (for example, red, green, and blue light or mixed light of any combination) can be conveniently controlled to adapt to objects of different colors, materials, and reflectivities;
2) in the image acquisition unit, the receiving lens is a telecentric lens, which has no aberration within the depth of field, so the size of an object can be measured accurately; errors caused by aberration and distortion are reduced, and the precision of the generated height data is improved;
3) the background light source is a plane-parallel light source whose rays are approximately parallel, which greatly improves the contrast between the measured object and the background; in particular, for a workpiece with a curved surface, the arc portion is not illuminated, so the acquired image has clear, sharp edges and the dimensional measurement precision is improved;
4) the utility model can generate high-precision 2D images and 3D depth maps, greatly reduces the integration difficulty of a vision inspection system, and improves the inspection takt.
Drawings
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
FIG. 1: schematic structural diagram of embodiment one of the 2D/3D composite high-precision vision device of the utility model;
FIG. 2: schematic structural diagram of embodiment two of the 2D/3D composite high-precision vision device of the utility model;
wherein: 101-left projection unit, 102-right projection unit, 201-image acquisition unit, 301-data storage unit, 401-data processing unit, 501-control unit, 601-background lighting unit.
Detailed Description
The technical embodiments of the present invention will be described in detail and fully with reference to the accompanying drawings.
Example one
As shown in FIG. 1, a 2D/3D composite high-precision vision device includes a left projection unit 101, an image acquisition unit 201, a data storage unit 301, a data processing unit 401, a control unit 501, and a background illumination unit 601, wherein:
the left projection unit 101 is arranged obliquely to the left of the measurement object and forms an included angle greater than 0° and smaller than 90° with the main optical axis of the image acquisition unit 201; it projects structured-light patterns and uniform light of different wavelengths onto the measurement object. The image generation surface and the object surface of the left projection unit 101 satisfy the Scheimpflug condition with respect to the principal plane of the projection lens, so that the focal plane of the projection device falls on the object surface and the whole pattern is imaged sharply; this also allows the aperture number F of the optical system of the left projection unit 101 to be designed smaller, making the projected pattern clearer and brighter. In this embodiment the left projection unit 101 is a color DLP projector;
the background illumination unit 601 is disposed directly below the measurement object and provides a plane-parallel background light source for the measurement object;
the image acquisition unit 201 is connected with the left projection unit 101 for synchronization and is located directly above the measurement object; it acquires the image data after the structured-light patterns and the uniform light of different wavelengths are projected onto the measurement object and modulated by it. The image acquisition unit 201 comprises a receiving lens and a photosensitive device; the receiving lens may be a telecentric lens (a non-telecentric lens may also be used if high-precision 2D dimensional measurement is not required), and the photosensitive device may be a black-and-white CCD, a CMOS sensor, or another type of photosensitive device;
the control unit 501 is connected to the left projection unit 101 and the background illumination unit 601 and controls them respectively to light up in a time-shared manner to illuminate the measurement object. In this embodiment the control unit 501 uses an ARM platform; any other programmable control platform, for example an FPGA-based programmable controller or a central processing unit (CPU), may also be selected as required;
the data storage unit 301 is connected to the image acquisition unit and stores the image data acquired by it. In this embodiment the data storage unit 301 is implemented with RAM; DDR memory is preferred because of its large bandwidth, low cost, and easy capacity expansion;
the data processing unit 401 is connected with the data storage unit 301; it extracts the image data from the data storage unit 301, analyzes and processes it to obtain a shape image with depth information and a pseudo-color image with texture information, and synthesizes a full-view-angle color 3D image from the two. In this embodiment the data processing unit 401 is implemented with a general-purpose graphics processing unit (GPGPU), which is a mature general-purpose data processing platform that is easy to program and strong in image processing; other computing platforms, such as an FPGA-based programmable controller, a CPU, or a DSP chip, may also be used.
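The tilted-plane focusing condition that lets the oblique projector keep the whole object surface sharp can be illustrated numerically. A minimal sketch, assuming the form of the Scheimpflug condition common in triangulation optics, tan(β) = m·tan(α); the function name and degree-based interface are illustrative assumptions, not taken from the patent:

```python
import math

def scheimpflug_sensor_tilt(object_tilt_deg: float, magnification: float) -> float:
    """Image-plane tilt (degrees) needed to keep an obliquely viewed plane in focus.

    Assumed relation: tan(beta) = m * tan(alpha), where alpha is the
    object-plane tilt, beta the image-plane tilt, and m the lateral
    magnification. This is a sketch, not the patent's specification.
    """
    alpha = math.radians(object_tilt_deg)
    return math.degrees(math.atan(magnification * math.tan(alpha)))
```

At unit magnification the image plane mirrors the object-plane tilt; smaller magnifications call for proportionally smaller image-plane tilts.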
generation of depth map
The control unit 501 controls the left projection unit 101 to project a structured-light pattern, which may be a coded structured-light pattern (such as a modulated stripe pattern) or a random structured-light pattern; here a coded stripe pattern is selected. After the left projection unit 101 projects a pattern, it generates a pulse signal; the image acquisition unit 201 receives the signal and acquires one frame of image, which is stored in the data storage unit 301. All stripe patterns are projected in sequence; the data processing unit 401 then reads the images from the data storage unit 301, computes over the acquired stripe patterns, and generates a depth map with height information.
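The patent leaves the stripe-decoding algorithm open. As one common possibility for modulated stripe patterns, a four-step phase-shifting sketch; the 90° shift scheme and function name are assumptions, and converting phase to height would additionally require the system's triangulation calibration:

```python
import numpy as np

def phase_from_four_step(i0, i1, i2, i3):
    """Wrapped phase from four fringe images shifted by 90 degrees each.

    For intensities I_k = A + B*cos(phi + k*pi/2), the standard 4-step
    formula gives phi = atan2(I3 - I1, I0 - I2). Phase-to-height
    conversion via triangulation calibration is omitted here.
    """
    i0, i1, i2, i3 = (np.asarray(x, dtype=float) for x in (i0, i1, i2, i3))
    return np.arctan2(i3 - i1, i0 - i2)
```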
Generation of 2D maps
After the control unit 501 turns on the background illumination unit 601, the left projection unit 101 is controlled to project uniform red, green, and blue light in sequence. After each projection the left projection unit 101 generates a pulse signal; the image acquisition unit 201 receives the signal and acquires one frame of image, which is stored in the data storage unit 301. Once the three color-light patterns have been projected, the data processing unit 401 reads the images from the data storage unit 301 and selects the pattern of the corresponding color light as required.
Color map generation
From the three images acquired during the 2D image generation above, the data processing unit 401 can combine a pseudo-color image with clear texture, which can be widely applied in various recognition scenarios.
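The channel-stacking step described above can be sketched as follows; the function name is hypothetical, and the three inputs are the monochrome frames captured under red, green, and blue illumination:

```python
import numpy as np

def fuse_pseudo_color(frame_r, frame_g, frame_b):
    """Stack three monochrome frames into an H x W x 3 pseudo-color image.

    All frames come from the same black-and-white sensor, so the
    channels are pixel-aligned and need no registration.
    """
    return np.dstack([frame_r, frame_g, frame_b])
```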
Generation of a color depth map
Using the depth map and the pseudo-color map generated as described above, and because the image acquisition unit 201 employs a telecentric lens with no aberration within the depth of field, the pseudo-color map can be fused into the depth map without special registration, synthesizing a fully matched full-view-angle color 3D image.
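A minimal sketch of this registration-free fusion, assuming a pixel-aligned depth map and pseudo-color image and a hypothetical uniform pixel pitch (a calibration constant implied by the telecentric lens, not specified in the patent):

```python
import numpy as np

def colored_point_cloud(depth, color, pixel_size_mm):
    """Fuse a depth map and a pixel-aligned color image into an N x 6
    array of (x, y, z, r, g, b) points.

    With a telecentric receiving lens, x and y scale uniformly with the
    pixel pitch, so no per-pixel registration is needed.
    """
    h, w = depth.shape
    ys, xs = np.mgrid[0:h, 0:w]                  # pixel grid
    xyz = np.column_stack([xs.ravel() * pixel_size_mm,
                           ys.ravel() * pixel_size_mm,
                           depth.ravel()])
    rgb = color.reshape(-1, 3).astype(float)     # aligned color samples
    return np.hstack([xyz, rgb])
```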
Example two:
as shown in FIG. 2, the second embodiment differs from the first in that a single left projection unit 101 cannot satisfy applications requiring blind-zone-free inspection, since it leaves an obvious illumination blind zone. The device can therefore be extended with additional projection units as required to eliminate the blind zone; for example, when inspecting metal parts with bright surfaces, even 4 projection units may not suffice, and more can be added. In this embodiment two projection units are taken as an example: a right projection unit 102 is added, the two projection units (101, 102) are symmetric about the main optical axis of the image acquisition unit 201, and each forms an included angle greater than 0° and smaller than 90° with the main optical axis of the image acquisition unit 201.
and (3) generating a depth map:
The control unit 501 controls the left projection unit 101 to project a coded structured-light pattern, which may be a binary-coded fringe pattern. After projecting the pattern, the left projection unit 101 generates a pulse signal; the image acquisition unit 201 receives the signal and acquires one frame of image, which is stored in the data storage unit 301. All fringe patterns are projected in sequence; the data processing unit 401 reads the images from the data storage unit 301, resolves the acquired fringe patterns, and synthesizes a new depth map.
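For the binary-coded fringe case, one common decoding is Gray-code stripe indexing. A sketch under stated assumptions: the fixed threshold and function name are illustrative, and real systems often project inverse pattern pairs instead of thresholding against a constant:

```python
import numpy as np

def decode_gray_code(bit_images, threshold=128):
    """Decode a stack of binary Gray-code fringe images into stripe indices.

    bit_images: grayscale frames, most significant bit first. Each frame
    is thresholded to a bit plane; the Gray code is then converted to
    binary, yielding an integer stripe index per pixel.
    """
    bits = [(np.asarray(img) >= threshold).astype(np.uint32) for img in bit_images]
    code = np.zeros_like(bits[0])
    prev = np.zeros_like(bits[0])
    for b in bits:                  # Gray -> binary: b_i = g_i XOR b_{i-1}
        prev = prev ^ b
        code = (code << 1) | prev
    return code
```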
Generation of 2D maps
After the control unit 501 turns on the background illumination unit 601, the left projection unit 101 is controlled to project uniform red, green, and blue light in sequence; after each projection the left projection unit 101 generates a pulse signal, and the image acquisition unit 201 receives the signal and acquires one frame of image, which is stored in the data storage unit 301. Similarly, the control unit 501 controls the right projection unit 102 to project uniform red, green, and blue light in sequence to obtain corresponding new image data. The data processing unit 401 reads the images from the data storage unit 301 and, for each pixel, selects the sample with the better brightness value from the two projections to synthesize a new image; the new image eliminates the blind zone that a single projection unit cannot illuminate. The user selects the pattern of the corresponding color light as required. Because the background illumination unit is on, the contrast between the measured object and the background is greatly improved, the edges are clear and sharp, and the dimensional measurement precision is improved.
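The per-pixel selection between the two projections can be sketched as follows; "brighter wins" is a simplification of the patent's "best brightness value", and a real system might also reject saturated pixels:

```python
import numpy as np

def merge_by_brightness(img_left, img_right):
    """Keep, per pixel, the better-lit sample from two oblique projections.

    Choosing the brighter sample fills the shadow/blind zone that a
    single oblique projector leaves on the measurement object.
    """
    return np.where(img_left >= img_right, img_left, img_right)
```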
Color map generation
From the three images acquired during the 2D image generation above, the data processing unit 401 can combine a pseudo-color image with clear texture, which can be widely applied in various recognition scenarios.
Generation of a color depth map
Using the depth map and the pseudo-color map generated as described above, and because the image acquisition unit employs a telecentric lens with no aberration within the depth of field, the pseudo-color map can be fused into the depth map without special registration, synthesizing a fully matched full-view-angle color 3D image.
The above description covers only preferred embodiments of the 2D/3D composite high-precision vision device disclosed herein and is not intended to limit the scope of the embodiments of the present disclosure. Any modification, equivalent replacement, improvement, and the like made within the spirit and principle of the embodiments of the present disclosure shall fall within their protection scope.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The embodiments in the present specification are all described in a progressive manner, and the same and similar parts among the embodiments can be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.

Claims (10)

1. A 2D/3D composite high-precision vision device, characterized in that it comprises a projection unit, a background illumination unit, an image acquisition unit, a control unit, a data storage unit, and a data processing unit, wherein: the projection unit is arranged obliquely to the side of the measurement object, forms an included angle greater than 0° and smaller than 90° with the main optical axis of the image acquisition unit, and is used for projecting a structured-light pattern and uniform light of different wavelengths onto the measurement object;
the background illumination unit is arranged directly below the measurement object and is used for providing a background light source for the measurement object;
the image acquisition unit is connected with the projection unit, is located directly above the measurement object, and is used for acquiring the image data after the structured-light pattern and the uniform light of different wavelengths are projected onto the measurement object and modulated by it;
the control unit is connected with the projection unit and the background illumination unit and is used for controlling them respectively to light up in a time-shared manner to illuminate the measurement object;
the data storage unit is connected with the image acquisition unit and is used for storing the image data acquired by the image acquisition unit;
the data processing unit is connected with the data storage unit and is used for extracting the image data from the data storage unit, analyzing and processing it to obtain a shape image with depth information and a pseudo-color image with texture information, and synthesizing a full-view-angle color 3D image from the two.
2. The 2D and 3D composite high-precision vision device according to claim 1, wherein the image acquisition unit comprises a receiving lens and a photosensitive device, and the receiving lens and the photosensitive device are located on the same central line.
3. A 2D, 3D composite high precision vision device according to claim 2, characterized in that said photo sensitive device is a black and white CCD or CMOS sensor.
4. A 2D, 3D composite high precision vision device according to claim 2, characterized in that said receiving lens is a telecentric lens.
5. A 2D, 3D composite high precision vision apparatus according to claim 1, characterized in that said structured light pattern is a coded structured light pattern or a modulated stripe pattern.
6. A 2D, 3D composite high precision vision apparatus according to claim 1, wherein said uniform light of different wavelengths is light of the three primary colors or mixed light of any combination thereof.
7. A 2D, 3D composite high precision vision apparatus according to claim 1, wherein said background light source is a plane parallel light source.
8. A 2D, 3D composite high precision vision device according to claim 1, characterized by that said data processing unit is a general purpose graphics processor, FPGA based programmable controller, central processing unit or DSP chip.
9. The 2D and 3D composite high-precision vision device according to claim 1, wherein a plurality of projection units with the same mounting angle are arranged around the main optical axis of the image acquisition unit.
10. A 2D, 3D composite high precision vision apparatus according to claim 1 or 9, characterized in that the image generation surface and the object surface of said projection unit satisfy the Scheimpflug condition with respect to the principal plane of the projection lens.
CN202120226547.9U 2021-01-27 2021-01-27 2D and 3D combined high-precision vision device Active CN214333663U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202120226547.9U CN214333663U (en) 2021-01-27 2021-01-27 2D and 3D combined high-precision vision device


Publications (1)

Publication Number Publication Date
CN214333663U true CN214333663U (en) 2021-10-01

Family

ID=77906676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202120226547.9U Active CN214333663U (en) 2021-01-27 2021-01-27 2D and 3D combined high-precision vision device

Country Status (1)

Country Link
CN (1) CN214333663U (en)

Similar Documents

Publication Publication Date Title
CN108445007B (en) Detection method and detection device based on image fusion
US7570370B2 (en) Method and an apparatus for the determination of the 3D coordinates of an object
US10796428B2 (en) Inspection system and inspection method
CN105372259B (en) Measurement apparatus, base board checking device and its control method, storage media
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
CN106556357B (en) A kind of device and method based on one-dimensional Beams measurement 3 d shape
CN112815846A (en) 2D and 3D composite high-precision vision device and measuring method
CN107271445B (en) Defect detection method and device
US20180180407A1 (en) Image processing apparatus and image processing method
JP2017122614A (en) Image generation method and inspection device
US10444162B2 (en) Method of testing an object and apparatus for performing the same
KR101766468B1 (en) Method for 3D shape measuring using of Triple Frequency Pattern
JP5545932B2 (en) 3D shape measuring device
CN214333663U (en) 2D and 3D combined high-precision vision device
KR100903346B1 (en) Method for optical visual examination of three-dimensional shape
US20080279458A1 (en) Imaging system for shape measurement of partially-specular object and method thereof
KR101465996B1 (en) Method for measurement of high speed 3d shape using selective long period
CN102829956B (en) Image detection method, image detection apparatus and image testing apparatus
KR101653649B1 (en) 3D shape measuring method using pattern-light with uniformity compensation
KR20170124509A (en) Inspection system and inspection method
JP2009079934A (en) Three-dimensional measuring method
JP7062798B1 (en) Inspection system and inspection method
WO2021153057A1 (en) Three-dimensional shape measurement device, three-dimensional shape measurement method, and program
JP7452121B2 (en) Inspection equipment and inspection method
WO2021153056A1 (en) Three-dimensional shape measuring device, three-dimensional shape measuring method, and program

Legal Events

Date Code Title Description
GR01 Patent grant