CN107077729B - Method and device for recognizing structural elements of a projected structural pattern in a camera image - Google Patents

Info

Publication number
CN107077729B
Authority
CN
China
Prior art keywords
camera
projector
image
structural
pattern
Prior art date
Legal status
Active
Application number
CN201580049754.9A
Other languages
Chinese (zh)
Other versions
CN107077729A (en)
Inventor
马丁·文德勒
Current Assignee
Pilz GmbH and Co KG
Original Assignee
Pilz GmbH and Co KG
Priority date
Filing date
Publication date
Application filed by Pilz GmbH and Co KG filed Critical Pilz GmbH and Co KG
Publication of CN107077729A publication Critical patent/CN107077729A/en
Application granted granted Critical
Publication of CN107077729B publication Critical patent/CN107077729B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/593 - Depth or shape recovery from multiple images from stereo images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/243 - Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • H04N13/204 - Image signal generators using stereoscopic image cameras
    • H04N13/254 - Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image
    • G06T2207/10012 - Stereo images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30232 - Surveillance

Abstract

In a method for identifying individual structural elements (12) of a structural pattern (16) projected onto a scene (14) in a camera image, the structural pattern (16) is projected onto the scene (14) using a projector (36), and the structural pattern (16) projected onto the scene (14) is picked up using a first camera (M) and at least one second camera (H, V). The first camera (M) and the at least one second camera (H, V) are positioned at a distance from each other, and the projector (36) is positioned at a distance from the first camera (M) and from the at least one second camera (H, V) and outside the straight connecting line (26,28) of the first camera and the at least one second camera (M, H, V). For a structural element (12) to be recognized in the camera image of the first camera (M), a structural element (12) in the camera image of the at least one second camera (V, H) that can be associated one-to-one with the structural element (12) to be recognized in the camera image of the first camera (M) is determined by the following calibration data: the calibration data have, for each structural element (12), in each case a combination of a first parameter which relates the respective structural element (12) to the position and orientation of the first camera (M) and the projector (36) and at least a second parameter which relates the respective structural element (12) to the position and orientation of the at least one second camera (H, V) and the projector (36), the calibration data being obtained by recording a calibration image (76,78) of the structural pattern (16) projected by the projector (36) using the first camera and the at least one second camera (M, H, V).

Description

Method and device for recognizing structural elements of a projected structural pattern in a camera image
The invention relates to a method for identifying structural elements of a structural pattern projected onto a scene in a camera image.
The invention also relates to a device for recognizing structural elements of a structural pattern projected onto a scene in a camera image, in particular for carrying out the above method.
The invention further relates to the use of the aforementioned method and/or the aforementioned device for monitoring, in particular for securing, a hazardous area, in particular of a machine such as a printing press or a robot.
In the context of the present invention, a scene is understood to mean a three-dimensional spatial region, in particular a hazard zone, that is monitored by cameras. A moving object such as a person may be present in the scene. In particular in the latter case, the cameras monitor the scene as to whether the person or a body part thereof is located in, or hazardously close to, the machine operating in an automated manner.
In order to monitor hazardous areas, in particular of machines and industrial plants, it is known to use cameras, as described in EP 2 133 619 A1. That document discloses a 3D security camera having a first image sensor and a second image sensor, each of which can generate image data of a spatial region. The 3D security camera operates in conjunction with a projector that generates a structured illumination pattern in the scene (i.e., in the spatial region). According to the principles of stereoscopy, an evaluation unit generates a depth map of the spatial region from the image data in order to monitor for impermissible intrusions into the spatial region.
During the stereo evaluation of an image pair of a stereo camera, measurement errors may occur due to low contrast or repetitive structures of objects within the monitored spatial region. For this reason, as described in the above-mentioned document, a structural pattern is projected into the field of view of the cameras in order to improve the contrast in the camera images. Here, it is advantageous to use a structural pattern in the form of a periodic dot pattern, as has likewise been disclosed in the above-mentioned document. Structural patterns in the form of dot patterns have the advantage over more complex structural patterns that they are imaged sharply and require less illumination energy to project.
A projected structure pattern in the context of the present invention is an optical structure pattern, in other words a pattern consisting of light spots.
However, when a dot pattern, in particular a uniform dot pattern, is used as the structural pattern, the following problem arises: points in the camera image are often wrongly assigned to points projected into the scene. This is because the position of a projected structural element in the camera image generally depends on the current distance of the projected structural element from the camera and also on the geometry of the object onto which the structural element is projected. In other words, the position of a structural element in the camera image may vary depending on distance and object geometry, and may in general differ from camera to camera.
The erroneous assignment of points in the camera image to points projected onto the scene results in an erroneous determination of distances to objects within the scene. Erroneous distance measurements in turn have the following effects: a hazardous situation may arise if an object located in the hazard zone is deemed by the cameras to be located outside it; conversely, the availability of the machine suffers if an object actually located outside the danger zone is deemed, owing to a wrong distance measurement, to be within it, and the machine is switched off unnecessarily.
According to EP 2 019 281 A1, a solution to the problem of wrongly assigning structural elements in the camera image to structural elements projected onto the scene is to project a more complex structural pattern onto the scene, one which is, at least in partial areas, non-uniform, non-periodic and non-self-similar. However, generating such more complex structural patterns is disadvantageous in view of the high expenditure on equipment, i.e. the complexity of the projector.
WO 2013/145665 A1 discloses an apparatus for three-dimensional measurement which projects light rays from a projector onto a workpiece, wherein a stereo camera picks up an image of the workpiece onto which the light rays are projected. The control device of the apparatus tentatively identifies a correspondence between a bright line in a first image of the picked-up stereoscopic images and a light-section plane, and projects the bright line onto the light-section plane. The bright line projected onto the light-section plane is then projected onto the second image. The control device calculates a similarity between the bright line projected onto the second image and the bright line in the second image, and decides on the identified correspondence accordingly.
US 2008/0201101 A1 discloses a hand-held three-dimensional scanner for scanning and digitizing the surface geometry of an object.
DE 20 2008 017 729 U1 discloses a 3D security camera for monitoring and securing a spatial region. The security camera comprises an illumination unit with at least one semiconductor light source. The semiconductor light source generates a high light output of at least 10 W, which permits the generation of dense depth maps, and thus a reliable evaluation, independently of fluctuations in the ambient light of the monitored spatial region.
DE 10 2004 020 419 B3 discloses a device for measuring even strongly curved reflecting surfaces. Here, a pattern reflected at the surface is observed and evaluated. The reflected pattern is viewed from multiple directions. The evaluation is achieved by determining those positions within the measurement space at which the surface normals determined for the respective viewing directions have the lowest deviation relative to one another.
The invention is based on the objective of specifying a method and a device with which structural elements of a structural pattern projected onto a scene can be reliably recognized in a camera image, without the need for complex, inhomogeneous, aperiodic, temporally or spatially coded structural patterns.
In order to achieve this object, a method for reliably identifying structural elements of a structural pattern projected onto a scene in a camera image is provided, wherein the structure pattern is projected onto the scene using a projector, and wherein the structure pattern projected onto the scene is picked up using a first camera and at least one second camera, wherein the first camera and the at least one second camera are positioned at a distance from each other, and wherein the projector is positioned outside a straight connecting line of the first camera and the at least one second camera at a distance from the first camera and from the at least one second camera, wherein, for a structural element to be recognized in the camera image of the first camera, a structural element in the camera image of the at least one second camera that can be associated one-to-one with the structural element to be recognized in the camera image of the first camera is determined by the following calibration data: the calibration data has, for each structural element, in each case a combination of a first parameter relating the respective structural element to the position and orientation of the first camera and the projector and at least a second parameter relating the respective structural element to the position and orientation of the at least one second camera and the projector, the calibration data being obtained by recording a calibration image of the structural pattern projected by the projector using the first camera and the at least one second camera.
Furthermore, according to the present invention, a device for reliably identifying structural elements of a structural pattern projected onto a scene in a camera image, in particular for performing the above method, is provided, which device comprises a projector for projecting the structural pattern onto the scene. The device further comprises a first camera and at least one second camera for picking up a structural pattern projected onto the scene, wherein the first camera and the at least one second camera are positioned at a distance from each other, and wherein the projector is positioned at a distance from the first camera and from the at least one second camera outside a straight connecting line of the first camera and the at least one second camera. The device further comprises a memory unit, in which calibration data are stored, which have, for each structural element, in each case a combination of a first parameter, which relates the respective structural element to the position and orientation of the first camera and projector, and at least a second parameter, which relates the respective structural element to the position and orientation of the at least one second camera and projector. And the device further comprises a computing unit adapted to: for structural elements to be recognized in the camera image of the first camera, structural elements in the camera image of the at least one second camera that can be associated one-to-one with structural elements to be recognized in the camera image of the first camera are determined using the calibration data.
The term "camera" in the context of the present invention is intended to have a general meaning. A camera may be understood to refer to an image sensor, for example, having only associated optics. The first camera and the at least second camera are also intended to mean that the two image sensors are accommodated in a common housing at a lateral distance from one another, wherein the image sensors can be assigned a common imaging optics or in each case individual imaging optics. However, the first camera and the at least one second camera may also be in the form of separate independent cameras having their own housing and their own imaging optics.
The method according to the invention and the device according to the invention are based on the following principle: the position and orientation of the projector (e.g. the exit pupil of the projector and the transmission direction for each individual structural element) and the position and orientation of the at least two cameras (e.g. the position and orientation of the entrance pupils of the at least two cameras and their viewing directions) are accurately known. In the method according to the invention, this accurate knowledge is obtained by recording calibration images of the structural pattern with the at least two cameras and the projector in their installed positions, the calibration data being obtained from the calibration images. The calibration data have, for each structural element, in each case a combination of a first parameter which relates the respective structural element to the position and orientation of the first camera and the projector and at least one second parameter which relates the respective structural element to the position and orientation of the at least one second camera and the projector. Since an individual parameter combination is preferably assigned to each structural element during calibration, the structural elements of the projected structural pattern are uniquely determined.
In the device according to the invention, the calibration data obtained are stored in a memory unit. In the method according to the invention, the calibration data is then used to identify structural elements of the structural pattern in the camera images of the at least two cameras projected onto the scene, so as to allow a one-to-one assignment between the structural elements in the camera image of the at least one second camera and the structural elements in the camera image of the first camera.
The method according to the invention and the device according to the invention allow the reliable assignment of structural elements in the camera images of the first camera and the at least one second camera to structural elements projected onto the scene. With the method according to the invention and the device according to the invention, a basis for a correct measurement of the distance to an object in the area of space to be monitored is established. Thus, the camera system can reliably assess whether the object is located, for example, inside or outside the hazard zone.
By recording the calibration images, the position and orientation of the at least two cameras and the projector relative to one another are known from the calibration, and the so-called camera epipolar lines of the at least two cameras can be determined from the known position and orientation of the two cameras relative to each other. The position and orientation of the projector relative to the at least two cameras are also known from the calibration, and, based on this knowledge, the so-called illumination epipolar lines for the pairing of the first camera with the projector and for the pairing of the at least one second camera with the projector can likewise be determined.
In the at least two cameras, the images of the structural elements are always located on the respective camera epipolar lines. Which camera epipolar line the image of a structural element appears on, and which other structural elements are also located on that camera epipolar line, depend on the geometry and distance of the object onto which the structural pattern is projected. For each of the at least two cameras, there is a plurality of camera epipolar lines.
In addition, each projected structural element and its image in the camera are located on the corresponding illumination epipolar line. For each pairing of the first camera with the projector and of the at least one second camera with the projector, there is a plurality of illumination epipolar lines. For two cameras and a projector positioned, as described above, outside their straight connecting line, there are two sets of illumination epipolar lines extending obliquely to each other, wherein the first set is assigned to the arrangement of the first camera and the projector and the second set is assigned to the arrangement of the second camera and the projector. The projected structural elements and their images in the camera are always located on the same corresponding illumination epipolar line, regardless of the geometry and distance of the object onto which the structural pattern is projected. However, the position on the illumination epipolar line, and which other structural elements are likewise located on it, again depend on the geometry and distance of the object.
Exploiting the fact that the structural elements and their images are always located on the same illumination epipolar line irrespective of the geometry and distance of the object onto which the structural pattern is projected, according to a preferred embodiment the first parameter of each structural element comprises the slope of the first illumination epipolar line associated with the first camera and the intersection of that line with a first reference axis, and the at least one second parameter of each structural element comprises the slope of the second illumination epipolar line associated with the at least one second camera and the intersection of that line with a second reference axis.
In the device according to the invention, the above-mentioned calibration data are stored in the storage unit, for example in the form of a calibration list.
As will be described in more detail below, using these parameters as calibration data has the following advantage: the assignment between projected structural elements and parameter combinations is one-to-one, and, in addition, the assignment of images to projected structural elements can be achieved in a simple manner with this type of parameter.
During the calibration of the at least two cameras and the projector, it is thus determined for each structural element on which illumination epipolar line it is located with respect to the first camera and on which illumination epipolar line it is located with respect to the at least one second camera. Mathematically, since a straight line is uniquely determined by its slope and one point, it is sufficient to store, for each structural element, only the slope of the illumination epipolar line and the intersection of the illumination epipolar line with the reference axis, which significantly reduces the storage overhead.
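Purely by way of illustration, such a calibration list could be held in memory as sketched below; the names, types and numeric values are assumptions for this sketch and are not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class ElementCalibration:
    """Hypothetical record for one projected structural element: the
    axis intersections of its illumination epipolar lines, one per
    camera/projector pairing (cameras M, H and V)."""
    spm: float  # intersection with the reference axis of camera M
    sph: float  # intersection with the reference axis of camera H
    spv: float  # intersection with the reference axis of camera V

# In the preferred arrangement all illumination epipolar lines of one
# pairing are parallel, so a single slope per pairing is stored.
StM, StH, StV = 2.0, -1.5, 0.8  # example slopes, assumed values

calibration_list: list[ElementCalibration] = [
    ElementCalibration(spm=8.0, sph=3.1, spv=12.4),  # one entry per element
]
```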
For structured patterns with a higher density of structured elements, usually a plurality of structured elements are located on the same illumination epipolar line in each case with respect to one of the at least two cameras. However, each structural element is uniquely described by a separate combination of at least two illumination epipolar lines (i.e., an illumination epipolar line associated with a first camera and an illumination epipolar line associated with at least a second camera).
The first reference axis mentioned in connection with the above embodiment, with which the first illumination epipolar lines associated with the first camera intersect, is preferably a scan line of the first camera, and the second reference axis, with which the second illumination epipolar lines associated with the at least one second camera intersect, is preferably a scan line of the at least one second camera.
The scan lines may for example be respective first scan lines of respective cameras.
In a further preferred embodiment of the method, the structure pattern is projected sequentially onto a first surface and a second surface for recording the calibration image, wherein the first surface and the second surface are arranged at a distance from each other in the transmission direction of the projector.
This type of recording of the calibration images, and thus the calibration of the at least two cameras and the projector, can be carried out in a particularly simple manner, and it yields in a simple manner, for each structural element of the structural pattern, an illumination epipolar line relative to the first camera and an illumination epipolar line relative to the at least one second camera.
The first and second surfaces are preferably planar surfaces. Calibration becomes particularly simple if the same surface is used as the first and the second surface, the surface initially being positioned at a first distance from the cameras and the projector for recording the first calibration image and subsequently at a second distance for recording the second calibration image. The two calibration images are stored and evaluated in order to obtain the calibration data.
According to a preferred embodiment of the device, the method according to the invention is particularly easy to implement if the entrance pupil of the first camera, the entrance pupil of the at least one second camera and the exit pupil of the projector are located in one plane.
In this embodiment, the geometric relationships during calibration and during recognition of the images of the structural elements are simplified. In this arrangement, the distances of the structural elements from one another in the camera image are independent of the distance of the projected structural elements from the cameras, i.e. always the same. The assignment of the structural elements in the calibration images is thus easily possible. In the at least two calibration images, the straight line through the two positions of a structural element corresponds exactly to the illumination epipolar line of that structural element. Furthermore, all first illumination epipolar lines are parallel to each other, that is to say they have the same slope and differ only in their intersection with the first reference axis, and all second illumination epipolar lines are likewise parallel to each other, that is to say they also have the same slope and differ only in their intersection with the second reference axis. The extent of the calibration data, and thus the necessary storage capacity and computing power, is thereby advantageously further reduced.
Another way to help simplify the method according to the invention is to arrange the optical axis of the first camera, the optical axis of the at least one second camera and the optical axis of the projector such that they are parallel with respect to each other.
In this embodiment, the camera epipolar lines of at least two cameras are parallel with respect to each other and parallel with respect to the straight connecting line of the cameras. They may even be aligned with each other from camera to camera. Furthermore, the scan lines of the at least two cameras may advantageously be arranged parallel or perpendicular to the camera epipolar line and distortions of the camera lens or the common camera lens may be easily corrected.
In connection with the above-described configuration of the method, according to which the structured pattern is projected sequentially onto a first surface and a second surface for recording the calibration image, which are at a distance from each other in the transmission direction of the projector, it is preferred if said surfaces are planar and oriented perpendicular to the optical axes of the camera and the projector.
Due to the perpendicular orientation of the surfaces, the structural pattern picked up by the at least two cameras is substantially undistorted.
The above-described embodiments simplify not only the calibration but also the reliable recognition of the structural elements of the structural pattern projected onto the scene in the camera images of the at least two cameras.
In the device, the distance between the first camera and the at least one second camera is preferably different from the distance between the projector and the first camera and different from the distance between the projector and the at least one second camera.
In this way, the recognition of structural elements in the camera image is further improved. Any break in symmetry in the arrangement of the at least two cameras and the projector increases the accuracy of assigning the structural elements in the camera image to the projected structural elements.
In a further preferred embodiment of the method, the pixel coordinates of the structural element to be recognized in the camera image of the first camera are determined in each case, the at least one first parameter is calculated from the pixel coordinates and the calibration data, all structural elements which likewise satisfy the at least one first parameter, possibly within a tolerance band, are read from the calibration data, the at least one second parameter is also read for each of the structural elements read, and from these the structural element in the camera image of the at least one second camera that can be associated one-to-one with the structural element to be recognized in the camera image of the first camera is determined.
In the device according to the invention, the calculation unit is accordingly arranged for carrying out the above-mentioned steps.
In this embodiment of the method, the computational overhead of the aforementioned steps is advantageously low, since essentially only the stored calibration data has to be accessed. For the structural element to be identified, only the first illumination epipolar line on which it is located is calculated. The stored calibration data is then accessed in order to perform a one-to-one assignment between the structural elements in the image of the at least one second camera and the structural element to be identified in the image of the first camera.
To this end, from the structural elements located on the second illumination epipolar line in the image of the at least one second camera, that structural element is selected which is also located on the corresponding camera epipolar line (i.e. at the intersection between the second illumination epipolar line and the camera epipolar line). This is particularly simple if the camera epipolar lines of the two cameras are parallel with respect to each other and located at the same height.
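As a rough sketch of the candidate-selection step just described (function and variable names are ours, not the patent's; it reuses the assumed calibration_list and slope StM from the sketch above):

```python
def find_candidates(mx: float, my: float, tol: float = 0.5) -> list[int]:
    """Indices of all structural elements whose illumination epipolar
    line relative to camera M could contain the image point (mx, my),
    within a tolerance band around the computed axis intersection."""
    spm = mx - my / StM  # axis intersection of the line through (mx, my)
    return [i for i, c in enumerate(calibration_list)
            if abs(c.spm - spm) <= tol]
```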
In a further preferred embodiment of the method, a third camera is additionally used for picking up the structure pattern projected onto the scene, said third camera being positioned outside the straight connecting line of the first camera and the second camera and outside the straight connecting line of the first camera and the projector and outside the straight connecting line of the second camera and the projector, wherein the calibration data additionally has third parameters which relate the respective structure elements to the position and orientation of the third camera and the projector.
The device according to the invention therefore has a third camera for picking up the pattern of the structure projected onto the scene.
Depending on the complexity of the topology of the scene onto which the structural pattern is projected, it may be the case that a particular structural element is not visible in one of the at least two cameras, for example because it is blocked by an edge or a shoulder. In this case, the structural element can still be reliably recognized with the third camera in conjunction with one of the two other cameras. The third camera is also useful for the reliable identification of structural elements if, for example, one of the three cameras views a surface in the scene at a grazing angle.
In a further preferred embodiment of the method, the structural elements are distributed uniformly along two main axes of the structural pattern, which are preferably perpendicular with respect to one another, wherein it is further preferred that the two main axes of the structural pattern run obliquely with respect to a straight connecting line of the first camera and the at least one second camera and/or with respect to the straight connecting lines of the projector with the first camera and the at least one second camera.
A uniform structural pattern has the advantage that the projector can have a simple structure. The oblique arrangement of the main axes of the structural pattern relative to the straight connecting line of the at least two cameras and/or the straight connecting lines of the projector with the first camera and the at least one second camera results in a break in symmetry, which facilitates the reliable recognition of the structural elements in the camera image because ambiguities in the recognition are reduced.
The method according to the invention is particularly suitable for embodiments in which the structural pattern is a dot pattern, in particular a uniform dot pattern, and the structural elements are dots.
In the device, the projector is accordingly arranged to generate the structural pattern in the form of a dot pattern, in particular a uniform dot pattern.
The advantage of using a dot pattern, in particular a uniform or periodic dot pattern or grid, is that such a structural pattern can be easily generated and projected with high intensity.
In a further preferred embodiment of the method, the structural pattern is transmitted in pulsed fashion with a pulse frequency which corresponds to half the image recording frequency of the first camera and the at least one second camera, wherein in each case two sequential images picked up by the first camera and the at least one second camera are subtracted from one another.
In the device, the projector is accordingly arranged to transmit the structural pattern in pulses at a pulse frequency which corresponds to half the image frequency of the first camera and the at least one second camera, and the calculation unit is arranged to subtract from one another, in each case, two images picked up in sequence by the first camera and the at least one second camera.
In this embodiment, since two sequential images recorded by the first camera and the at least one second camera are subtracted from one another, essentially only the structural elements are retained in the difference image. In this way, the structural elements in the camera image can be detected more easily.
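A minimal sketch of this difference-image step, assuming the projector is pulsed at half the frame rate so that consecutive frames alternate between pattern-on and pattern-off:

```python
import numpy as np

def pattern_only(frame_on: np.ndarray, frame_off: np.ndarray) -> np.ndarray:
    """Subtract two sequential frames: static scene content largely
    cancels, and essentially only the pulsed structural elements remain."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```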
To further increase the contrast, a band-pass filter (e.g. an IR band-pass filter) may be used to block any ambient light outside the wavelength of the projected light. The band-pass filter may be arranged in the imaging optics, or the common imaging optics, of the at least two cameras.
According to one or more of the above-described embodiments, the method according to the invention and/or the device according to the invention are preferably used for monitoring, in particular for securing, a hazard zone, in particular of a machine.
Other advantages and features may be derived from the following description and drawings.
It is to be understood that the features mentioned above and those yet to be explained can be used not only in the respectively stated combination, but also in different combinations or alone, without departing from the scope of the present invention.
Exemplary embodiments of the present invention are shown in the drawings and will be described in more detail with reference thereto. In the drawings:
fig. 1 shows an apparatus for identifying structural elements of a structural pattern projected onto a scene in a camera image;
FIG. 2 illustrates a portion of a scene having objects therein;
FIG. 3 shows a first calibration image recorded during calibration of the apparatus of FIG. 1;
FIG. 4 shows a second calibration image recorded during calibration of the apparatus of FIG. 1;
FIG. 5 shows a superposition of two calibration images from FIGS. 3 and 4;
FIG. 6 shows an enlarged portion of FIG. 5 for illustrating the manner in which calibration data may be obtained using the calibration images according to FIGS. 3 and 4; and
fig. 7 shows a scheme for explaining how the structural elements of the projected structural pattern in the camera image can be recognized.
Fig. 1 shows an apparatus, indicated with the general reference numeral 10, for recognizing structural elements 12 of a structural pattern 16 projected onto a scene 14 in a camera image.
The device 10 is used in particular for monitoring, in particular for securing, a hazard zone, in particular of a machine (not shown). The scene 14 is in this case, for example, a three-dimensional spatial region in which a machine operating in an automated manner, for example a printing press or a robot (not shown), is arranged. In general, a moving object, such as a person, may be present in the scene 14, with the device 10 monitoring the scene 14 as to whether the object is fully or partially located in or near the hazard zone of the machine operating in an automated manner.
The device 10 has a first camera M, a second camera H and, optionally and preferably, a third camera V. The camera M, the camera H and the camera V are, for example, digital cameras, each having a photoelectric image sensor (not shown). The image sensors may be configured in CCD or CMOS technology. Such image sensors have a large number of light-sensitive elements (also referred to as pixels), which are arranged in a large number of scan lines running parallel with respect to each other.
The camera M, the camera H and the camera V have an entrance pupil 18, an entrance pupil 20 and an entrance pupil 22, respectively, all of which are arranged in one plane. In fig. 1, a Cartesian coordinate system 24 having an x-axis, a y-axis and a z-axis is shown as a reference system for purposes of illustration. In this case, the above-mentioned plane, in which the entrance pupil 18 of the camera M, the entrance pupil 20 of the camera H and the entrance pupil 22 of the camera V are arranged, is the xy-plane at z = 0. The coordinate system 24 as reference system is chosen here such that the entrance pupil 18 of the camera M is located at the coordinates x = 0, y = 0 and z = 0.
A straight connecting line 26 between the camera M and the camera H extends in the x-axis direction, and a straight connecting line 28 between the camera M and the camera V extends in the y-axis direction. Thus, the third camera V is located outside the straight connecting line 26 between the camera M and the camera H.
The distance between camera M and camera H is different from the distance between camera V and camera M.
Further, camera M, camera H, and camera V are positioned and oriented such that their respective optical axes 30 (camera M), 32 (camera H), and 34 (camera V) extend parallel with respect to one another.
Device 10 also has a projector 36, projector 36 for generating and projecting structured pattern 16 into scene 14. The projector 36 may be, for example, in the form of a laser diode employing diffractive optical elements, in the form of a matrix of LEDs with imaging optics, in the form of a slide projector, in the form of a spotlight source with segmented mirrors, or in the form of a digital light processing projector (DLP or DMP projector).
The exit pupil 38 of the projector 36 is located on the same plane as the entrance pupil 18 of camera M, the entrance pupil 20 of camera H, and the entrance pupil 22 of camera V. The optical axis 40 of the projector 36 extends parallel to the optical axis 30 of the camera M, the optical axis 32 of the camera H and the optical axis 34 of the camera V.
The projector 36 is located outside the straight connecting line 26 between the camera M and the camera H, and also outside the straight connecting line 28 between the camera M and the camera V. In fig. 1, also the linear connection lines between the camera M, the camera H and the camera V and the projector 36 are shown, in particular the linear connection line 37 between the camera M and the projector 36, the linear connection line 39 between the camera H and the projector 36 and the linear connection line 41 between the camera V and the projector 36. As can be seen from fig. 1, the linear connecting lines 37,39 and 41 run obliquely with respect to one another and also with respect to the linear connecting lines 26 and 28.
The structural pattern 16 is a uniform dot pattern, that is to say the structural elements 12 are individual dots distributed periodically and uniformly along two main axes 42, 44. The main axes 42 and 44 are shown here for illustrative purposes only and do not actually appear in the projected structural pattern 16. The two main axes 42, 44, which here extend perpendicularly to one another, define the two directions of the smallest distance between the dots or structural elements 12 of the structural pattern 16.
Structural pattern 16 projected by projector 36 is preferably projected such that major axis 42 and major axis 44 of projected structural pattern 16 extend oblique to linear connecting lines 26 and 28 and also oblique to linear connecting lines 37,39 and 41.
Fig. 1 shows only a section of the structural pattern 16 with 25 dots or structural elements 12. In practice, the structural pattern 16 has a significantly greater number of structural elements 12, as indicated here by the interrupted continuation lines 46 and 48.
The distance between the individual structural elements or points 12 is preferably 4 to 10 pixels of the image sensor of camera M, camera H and camera V. The distance to be selected between the various structural elements 12 of the structural pattern 16 will depend on the accuracy with which the coordinates of the structural elements or points 12 in the images of the camera M, the camera H and the camera V can be determined.
Although the structural elements 12 may illuminate only a partial area of the scene 14 imaged in the camera M, the camera H and the camera V, the opening angle of the illumination from the projector 36 is typically at least as large as the full viewing angle or field of view of the camera M, the camera H and the camera V, so that the images in the camera M, the camera H and the camera V are completely filled with structural elements 12.
In fig. 1, individual transmission rays 50 of the illumination light emitted by the projector 36, which produce the projected structural pattern 16, are depicted by way of example. Likewise, a receiving ray 52 for the camera H, a receiving ray 54 for the camera M and a receiving ray 56 for the camera V are shown by way of example in fig. 1, starting from the respective structural element 12. The camera M, the camera H and the camera V thus pick up the structural pattern 16 projected onto the scene 14.
The device 10 also has a control device 58, the control device 58 having a memory unit 60 and a calculation unit 62. The storage unit 60 and the calculation unit 62 are connected to the camera M, the camera H, and the camera V, and to the projector 36 (not shown) for mutual data exchange purposes.
Since, as is preferred here, the exit pupil 38 of the projector 36 and the entrance pupils of the camera M, the camera H and the camera V are located in one common plane perpendicular to the optical axis 30 of the camera M, the optical axis 32 of the camera H and the optical axis 34 of the camera V, the structural pattern 16 imaged in the camera M, the camera H and the camera V always appears the same size, irrespective of the distance of the projected structural pattern 16 from the camera M, the camera H and the camera V. More specifically, if the distance of the projected structural pattern 16 from the camera M, the camera H and the camera V changes, the distances between the individual structural elements 12 in the images of the camera M, the camera H and the camera V do not change.
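This invariance can be checked with a small numeric sketch (a simplified one-dimensional pinhole model with assumed numbers; an illustration of the geometry, not part of the patent): a point projected from the projector pupil at x-position px along a ray of slope dx reaches depth z at X = px + dx * z, and a camera at the origin with focal length f images it at x = f * X / z.

```python
def image_x(px: float, dx: float, z: float, f: float = 1.0) -> float:
    """Image x-coordinate of a structural element projected from pupil
    position px along ray slope dx, seen at depth z by a pinhole camera
    at the origin (pupil in the same plane as the projector pupil)."""
    return f * (px / z + dx)

# Two structural elements projected from the same pupil (px = 0.2):
for z in (1.0, 3.0):
    a = image_x(0.2, 0.10, z)
    b = image_x(0.2, 0.15, z)
    # Each point moves along its epipolar line as z changes,
    # but their spacing b - a stays exactly f * (0.15 - 0.10).
    print(f"z={z}: a={a:.4f}, b={b:.4f}, spacing={b - a:.4f}")
```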
Fig. 2 shows a portion of the scene 14 in which an object 64 is located on a larger scale than fig. 1, the object 64 being shown here in simplified form as a cube. In fig. 2, four of the structural elements 12 are denoted by reference numerals 66, 68, 70 and 72, wherein these structural elements or points have a shorter distance from the camera M, the camera H and the camera V than the remaining structural elements 12. However, the mutual distances of the structural elements 66, 68, 70 and 72 in the camera images of camera M, camera H and camera V remain unchanged with respect to the case in which the object 64 is not present in the scene 14.
The basic principle of the method for identifying, in the camera images of the camera M, the camera H and the camera V, the individual structural elements 12 of the structural pattern 16 projected onto the scene 14 is that the position and orientation of the projector 36 (i.e., the transmission position of the projection light and the transmission direction for each individual structural element 12) and the positions and orientations of the camera M, the camera H and the camera V (i.e., the reception positions and reception directions of the light rays from the individual structural elements 12) are accurately known. To obtain this knowledge, calibration images are recorded, as will be described below with reference to figs. 3 to 6.
The calibration images are recorded without changing the arrangement of the camera M, the camera H and the camera V and the projector 36, that is to say the camera M, the camera H and the camera V and the projector 36 have the same position and orientation as is the case subsequently in the method for identifying the structural elements 12 projected onto the scene 14 in the images of the camera M, the camera H and the camera V.
First, a first calibration image is recorded using the camera M, the camera H and the camera V, wherein the structural pattern 16 is projected by the projector 36 onto a planar surface at a first distance from the camera M, the camera H and the camera V and the projector 36. Fig. 3 shows by way of example the image surface 74 of the image sensor of the camera M, wherein the structural pattern 16 projected onto the first planar surface is recorded with the camera M as a calibration image 76.
Subsequently, at least one second calibration image is recorded using the camera M, the camera H and the camera V, wherein the structural pattern 16 is projected by the projector 36 onto the planar surface at at least one second distance from the camera M, the camera H and the camera V and the projector 36, the at least one second distance being greater or smaller than the first distance. Fig. 4 shows, by way of example, the recording of a second calibration image 78 using the camera M.
When the calibration images are recorded, the planar surface is oriented perpendicular to the optical axis 30, the optical axis 32 and the optical axis 34.
It is also possible to record more than two calibration images at several different distances of the projected structure pattern 16.
The calibration images recorded with the camera M, the camera H, and the camera V are stored in, for example, the storage unit 60.
Fig. 5 shows a superposition of two calibration images 76,78 recorded with the camera M according to fig. 3 and 4. As already mentioned, in the calibration images of the camera M, the camera H and the camera V, the distances of the individual structural elements 12 from each other always remain the same, although the distances of the projected structural pattern 16 from the camera M, the camera H and the camera V are different.
Fig. 5 shows that the individual structuring elements 12 in the calibration image 78 have been moved relative to the calibration image 76. Calibration data are now obtained, for example by means of the calculation unit 62, from the calibration images according to fig. 3 and 4, which are stored, for example, in the storage unit 60, the calibration data having, for the individual structural elements 12, in each case a combination of a first parameter, which relates the respective structural element 12 to the position and orientation of the camera M and the projector 36, a second parameter, which relates the respective structural element 12 to the position and orientation of the camera H and the projector 36, and a third parameter, which relates the respective structural element 12 to the position and orientation of the camera V and the projector 36.
To further illustrate the calibration, four of the structural elements 12 are represented by way of example in the calibration image 76 of fig. 3 as A1, A2, B1 and B2, and the structural elements associated with the structural elements A1, A2, B1, B2 and offset relative to the calibration image 76 are represented in the calibration image 78 of fig. 4 as A1', A2', B1' and B2'.
If a straight line is now drawn in each case through the pairs of associated structural elements A1, A1'; A2, A2'; B1, B1'; and B2, B2', this yields the straight lines 76, 78, 80, 82 according to fig. 5.
For simplicity, fig. 6 shows, on a larger scale, only the structural elements A1, A1'; A2, A2'; B1, B1' and B2, B2' in the image surface 74 of the camera M. Fig. 6 likewise shows the straight lines 76, 78, 80 and 82. The lines 76, 78, 80 and 82 are so-called illumination epipolar lines. As can be seen from fig. 6, all illumination epipolar lines 76, 78, 80, 82 are parallel with respect to one another, which is caused by the position of the projector 36, since here the projector 36 is preferably situated at the same height in the z-axis direction as the camera M. Thus, the illumination epipolar lines 76, 78, 80 and 82 have the same slope StM but differ in their respective intersection points SPM_A1, SPM_B1, SPM_A2 and SPM_B2 with the reference axis 84 of the camera M. For example, the first scan line of the image sensor of the camera M is used as the reference axis 84.
As can be seen from fig. 6, each imaged structural element A1, A2, B1, B2 is always located on the same illumination epipolar line 76, 78, 80 or 82, but the position of each imaged structural element A1, A2, B1, B2 on that illumination epipolar line, as indicated by the offset structural elements A1', A2', B1' and B2', depends on the distance of the respective projected structural element A1, A2, B1, B2 from the camera M. Furthermore, on the illumination epipolar line associated with a respective structural element A1, A2, B1, B2 there lies not only its image in the camera M but also the structural element itself as projected into the scene (or onto the planar surface during calibration).
Furthermore, due to the position and orientation of the camera M and the projector 36 at the same height in the z-axis direction and their parallel optical axes 30 and 40 (fig. 1), the illumination epipolar lines 76,78, 80 and 82 are parallel to the straight connecting line 37 (fig. 1) of the camera M and the projector 36.
Although fig. 6 shows, by way of example, four illumination epipolar lines 76,78, 80, 82 for structural elements a1, a2, B1, B2, it should be understood that an illumination epipolar line for camera M is associated with each structural element 12 of structural pattern 16.
Corresponding conditions apply, for the respective structural elements 12, to the camera H and the camera V. One illumination epipolar line associated with the camera H and one illumination epipolar line associated with the camera V are assigned to each structural element 12 of the projected structural pattern 16. The illumination epipolar lines associated with the camera H are in turn parallel to each other and to the straight connecting line 39 (fig. 1), and the illumination epipolar lines associated with the camera V are likewise parallel to each other and to the straight connecting line 41 (fig. 1).
Furthermore, the illumination epipolar lines associated with the camera H each have an intersection SPH with a reference axis (for example the first scan line of the camera H), and the illumination epipolar lines associated with the camera V likewise each have an intersection SPV with a reference axis (for example the first scan line of the camera V).
By way of example, according to fig. 6, the following equation applies to the intersection SPM_A1 of the illumination epipolar line 76 of the structural element A1 with the reference axis 84 of the camera M, where (x1, y1) and (x2, y2) denote the positions of A1 and A1' in the two calibration images:

SPM_A1 = x1 - (y1 * (x2 - x1) / (y2 - y1))
For the slope StM of this illumination epipolar line of the structural element A1, the following applies:

StM = (y1 - y2) / (x1 - x2)
As already mentioned, StM is the same for all illumination epipolar lines associated with the camera M.
Similarly, the intersection SPH and slope StH of the illumination epipolar line associated with camera H and the intersection SPV and slope StV of the illumination epipolar line associated with camera V may be calculated from calibration images recorded with camera H and camera V.
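Translated directly into code, this evaluation could look as follows; a sketch under the assumption that the two calibration positions (x1, y1) and (x2, y2) of each structural element have already been matched between the two calibration images (the function name is ours):

```python
def illumination_epipolar_line(x1: float, y1: float,
                               x2: float, y2: float) -> tuple[float, float]:
    """Slope St and reference-axis intersection SP of the illumination
    epipolar line through the two calibration positions of one
    structural element, per the equations for StM and SPM_A1 above."""
    st = (y1 - y2) / (x1 - x2)             # StM = (y1 - y2) / (x1 - x2)
    sp = x1 - y1 * (x2 - x1) / (y2 - y1)   # SPM = x1 - (y1*(x2-x1)/(y2-y1))
    return st, sp

# Example: A1 at (10, 4) in the first and (14, 12) in the second
# calibration image gives st = 2.0 and sp = 8.0; the line x = sp + y/st
# passes through both positions.
st, sp = illumination_epipolar_line(10.0, 4.0, 14.0, 12.0)
```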
If the projector 36 were at a distance in the z-axis direction from the camera M, the camera H and the camera V in fig. 1, the illumination epipolar lines associated with the respective camera M, camera H or camera V would no longer be parallel to each other but would have different slopes; each would, however, still always start at the same structural element.
The calculated parameters SPM, SPH and SPV are stored in the storage unit 60 for each structural element 12 of the projected structural pattern 16. Likewise, the slopes StM, StH and StV are stored in the memory unit 60, wherein in the exemplary embodiment preferred here only one value is obtained for the slopes StM, StH and StV in each case.
Preferably, all structural elements 12 of the projected structural pattern 16, with the associated parameters SPM, SPH and SPV and the slopes StM, StH and StV, are stored in the storage unit 60 in the form of a calibration list. Here, each structural element 12 is uniquely described by an individual combination of SPM, SPH and SPV.
With reference to fig. 7, a description will now be given of how the individual structural elements 12 of the structural pattern 16 projected into the scene 14 according to fig. 1 can be identified in the camera images of the camera M, the camera H and the camera V using the previously described calibration data, or in other words how the structural elements 12 imaged in the camera M, the camera H and the camera V can be uniquely associated with the projected structural elements 12.
Fig. 7 shows the projected structure pattern 16 in plan projection view, along with the image surface 74 of camera M, the image surface 84 of camera H, and the image surface 86 of camera V.
The following description explains, by way of example, how a structural element M1 imaged in the image surface 74 of the camera M can be reliably recognized, in other words, how the structural element M1 imaged in the image surface 74 of the camera M can be uniquely associated with one of the projected structural elements 12.
First, the pixel coordinates Mx and My of the imaged structural element M1 in the image surface 74 are determined. In practice, this is possible only to a certain degree of accuracy, so that the determined position of the structural element M1 lies within a tolerance band.
Initially, the calculation unit 62 uses the pixel coordinates Mx and My to calculate the straight line on which the structural element M1 lies (which is an illumination epipolar line):

SPM_M1 = Mx - (My / StM)

As already explained above, the slope StM is the same for all illumination epipolar lines belonging to the camera M; the slope StM is therefore stored in the storage unit 60 and can easily be read from it. Consequently, SPM_M1 (that is, the axis intersection of the illumination epipolar line on which the structural element M1 is located) can be calculated in a simple manner.
Since the pixel coordinates Mx and My cannot be determined exactly, SPM_M1 likewise lies within a tolerance band.
Next, all structural elements 12 of the projected structural pattern 16 whose associated SPM values in the calibration list stored in the storage unit 60 lie within the tolerance band around SPM_M1 are read from the calibration list.
As can be seen from fig. 7, the illumination epipolar line on which the imaged structural element M1 is located lies in close proximity to two further illumination epipolar lines, each of which likewise has an SPM value within the tolerance band around the calculated SPM_M1.
The possible candidates of the projected structural pattern 16 for association with the imaged structural element M1 are therefore the structural elements C1, B2 and A3, all of which lie on illumination epipolar lines whose intersections fall within the tolerance band of SPMM1. The structural elements C1, B2 and A3 are read from the calibration list together with their associated, mutually distinct parameters SPHC1, SPHB2, SPHA3, SPVC1, SPVB2 and SPVA3.
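Reading the candidates then amounts to a simple filter over the calibration list; again a hypothetical sketch continuing the example above:

    def candidates_within_band(spm_measured: float, tolerance: float):
        """All calibrated elements whose SPM lies within the tolerance band."""
        return [e for e in CALIBRATION_LIST
                if abs(e.spm - spm_measured) <= tolerance]

    # With spm_m1 = 101.0 and a tolerance of 1.5, the candidates are
    # C1, B2 and A3, exactly as in the example of FIG. 7.
    candidates = candidates_within_band(spm_m1, tolerance=1.5)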
For each of the structural elements C1, B2 and A3, the calculation unit 62 can now calculate where it would be located in the image of camera H and in the image of camera V, and on which camera epipolar line.
Owing to the position and orientation of camera M and camera H, which here are preferably at the same height and with parallel optical axes, the camera epipolar lines of camera M and camera H (three are shown by way of example in FIG. 7 as broken lines 90, 91, 92) are aligned and parallel to each other. For a projected structural element of the structural elements 12 (in the case of the candidate structural elements C1, B2 or A3, whose image in camera M is the imaged structural element M1), the camera epipolar line in the image of camera H has the same y value as the camera epipolar line in the image of camera M on which the imaged structural element M1 lies, namely Hy = My.
If one of the candidate structural elements C1, B2, A3 can be uniquely associated with the imaged structural element M1, then for the relevant structural element there must exist, on the camera epipolar line of camera H with y value Hy, an x value Hx that satisfies the equation:

Hx = SPHC1;B2;A3 + My/StH

That is, this equation must have a solution Hx for the relevant structural element C1, B2 or A3. This calculation can likewise be performed by the calculation unit 62.
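This disambiguation step can be sketched as follows: for each candidate, the expected position Hx on the camera epipolar line Hy = My in the image of camera H is predicted, and only a candidate whose prediction coincides, within tolerance, with an element actually detected there survives. All names and numbers remain illustrative:

    def expected_hx(candidate: CalibratedElement, my: float, slope_h: float) -> float:
        """Predicted x position of the candidate's image in camera H on the
        camera epipolar line with y value Hy = My."""
        return candidate.sph + my / slope_h

    def resolve(candidates, my, detected_hx, slope_h, tolerance=1.5):
        """Return the unique candidate matching a detection in camera H, else None."""
        matches = [c for c in candidates
                   if any(abs(expected_hx(c, my, slope_h) - hx) <= tolerance
                          for hx in detected_hx)]
        return matches[0] if len(matches) == 1 else None

    # detected_hx: x coordinates of elements found on the line Hy = My in camera H.
    # Here only A3 (153.2 + 640.0/2.8 = 381.77) matches the detection at 381.8.
    element = resolve(candidates, my=640.0, detected_hx=[381.8], slope_h=SLOPES["H"])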
As can be seen from FIG. 7, a solution Hx exists for the structural element A3, whereas for the structural elements C1 and B2 the aforementioned equation is not satisfied. The image HC1 of the structural element C1 lies, in the image of camera H, on a camera epipolar line different from that of the structural element M1 to be recognized (specifically, on the camera epipolar line 92), and the image HB2 of the projected structural element B2 likewise lies, in the image of camera H, on a camera epipolar line different from that of the imaged structural element M1 to be recognized (specifically, on the camera epipolar line 91). Only M1 in the image of camera M and the image HA3 of A3 lie on the same camera epipolar line 90.
Thus, only the image HA3 of the structural element A3 in camera H can be associated one-to-one with the image M1 in the image of camera M, which can be verified using the following equation:

Mx = SPMA3 + My/StM

where SPMA3 is already known from the calibration list.
Thus, the projected structural element A3 can be uniquely associated with the image M1 in the image of camera M, and in turn the same applies to the image HA3 in the image of camera H.
Another exemplary structural element to be recognized in the image of camera M is denoted M2. Using the same procedure as described for the image M1, the result is that the image HB1 of the structural element B1 can be associated one-to-one with the structural element M2 in the image of camera M.
In FIG. 7, VA3 denotes the image of the structural element A3 in camera V, and VB1 denotes the image of the structural element B1 in camera V. The third camera V is not mandatory for the described method, but is optionally present. It can be used to further verify the one-to-one association of structural elements in the images of camera M and camera H. This is useful in particular if a structural element 12 is not visible in one of camera M and camera H, for example because it is blocked by a shoulder or an edge; in that case camera V can be used, in combination with whichever of camera M and camera H can see the relevant structural element, to uniquely associate the structural element. A third camera is also useful if one of the cameras views a surface in the scene 14 in a glancing (grazing) manner.
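The optional check against camera V follows the same logic: since every imaged structural element lies on its fixed illumination epipolar line in camera V as well, an element detected at (Vx, Vy) must yield an intersection Vx - Vy/StV close to the candidate's stored SPV. A hypothetical extension of the sketch above:

    def verify_with_v(candidate: CalibratedElement, vx: float, vy: float,
                      slope_v: float, tolerance: float = 1.5) -> bool:
        """Confirm the association: the element detected at (vx, vy) in camera V
        must lie on the candidate's illumination epipolar line, i.e. its
        intersection vx - vy/StV must match the stored SPV."""
        return abs((vx - vy / slope_v) - candidate.spv) <= tolerance

    # Example: a detection at (270.0, 640.0) in camera V confirms A3
    # (270.0 - 640.0/3.1 = 63.55, close to the stored SPV of 63.8).
    confirmed = verify_with_v(element, vx=270.0, vy=640.0, slope_v=SLOPES["V"])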
After the individual structural elements 12 of the projected structural pattern 16 have been reliably identified in the images of camera M, camera H and/or camera V, a distance determination of the projected structural elements 12 can be performed, which will yield the correct distances. By the aforementioned method, false shutdowns of the monitored machine and dangerous situations can be avoided.
In order to increase the contrast of the images of the structural elements 12 in camera M, camera H and camera V, provision may additionally be made for the projector 36 to transmit the structural pattern 16 in a pulsed manner with a pulse frequency corresponding to half the image recording frequency of camera M, camera H and camera V, wherein in each case two sequential images picked up by camera M, camera H and camera V are subtracted from one another. This effectively leaves only the images of the structural elements 12 in the camera images, while the scene 14 itself and the objects located therein are not visible in them.
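The described contrast enhancement amounts to differencing consecutive frames: with the projector pulsed at half the camera frame rate, every second frame contains the pattern, and subtracting a pattern-off frame from a pattern-on frame cancels the static scene. A minimal sketch, assuming 8-bit grayscale frames as NumPy arrays:

    import numpy as np

    def pattern_only(frame_lit: np.ndarray, frame_dark: np.ndarray) -> np.ndarray:
        """Subtract a pattern-off frame from a pattern-on frame so that
        essentially only the projected structural elements remain."""
        diff = frame_lit.astype(np.int16) - frame_dark.astype(np.int16)
        return np.clip(diff, 0, 255).astype(np.uint8)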
It should be understood that the present invention is not limited to the above-described exemplary embodiments. In particular, the method according to the invention and the device according to the invention do not require that camera M, camera H and camera V be arranged with their entrance pupils at the same height and/or that the projector 36 likewise be arranged with its exit pupil in this plane. The optical axes of camera M, camera H and camera V and of the projector 36 also need not be parallel to each other. In addition, instead of a periodic structure pattern, an aperiodic or randomly distributed structure pattern may be used.

Claims (25)

1. A method for identifying individual structural elements (12) of a structural pattern (16) in a camera image projected onto a scene (14), wherein the structural pattern (16) is projected onto the scene (14) using a projector (36), and wherein the structural pattern (16) projected onto the scene (14) is picked up using a first camera (M) and at least one second camera (H, V), wherein the first camera (M) and the at least one second camera (H, V) are positioned at a distance from each other, and wherein the projector (36) is positioned at a distance from the first camera (M) and from the at least one second camera (H, V) and outside a straight connecting line (26,28) of the first camera and the at least one second camera (M, H, V), wherein, for a structural element (12) to be identified in the camera image of the first camera (M), a structural element (12) in the camera image of the at least one second camera (V, H) that can be associated one-to-one with a structural element (12) to be identified in the camera image of the first camera (M) is determined by the following calibration data: the calibration data having, for the individual structural elements (12), in each case a combination of first parameters which relate the respective structural element (12) to the position and orientation of the first camera (M) and the projector (36) and at least second parameters which relate the respective structural element (12) to the position and orientation of the at least one second camera (H, V) and the projector (36), the calibration data being obtained by recording a calibration image (76,78) of a structural pattern (16) projected by the projector (36) using the first and the at least one second camera (M, H, V),
wherein the first parameters of the respective structural elements (12) comprise a slope (StM) of a first illumination epipolar line associated with the first camera (M) and an intersection (SPM) of the first illumination epipolar line with a first reference axis (84), and the at least second parameters of the respective structural elements comprise a slope (StH, StV) of a second illumination epipolar line associated with the at least one second camera (H, V) and an intersection (SPH, SPV) of the second illumination epipolar line with a second reference axis, wherein the illumination epipolar lines lie in an image plane of the respective camera, and wherein each imaged structural element always lies on the same illumination epipolar line and the position of each imaged structural element on the illumination epipolar line depends on the distance of the respective projected structural element from the camera.
2. Method according to claim 1, wherein the first reference axis (84) is a scan line of the first camera (M) and the second reference axis is a scan line of the at least one second camera (H, V).
3. Method according to claim 1 or 2, wherein for recording the calibration image (76,78) the structure pattern (16) is projected sequentially onto a first surface and onto a second surface, wherein the first surface and the second surface are spaced apart from each other in the transmission direction of the projector (36).
4. The method of claim 3, wherein the first surface is a planar surface and the second surface is a planar surface.
5. Method according to claim 1 or 2, wherein pixel coordinates of the respective structural elements (12) to be identified in the camera image of the first camera (M) are determined, at least a first parameter is calculated from the pixel coordinates and the calibration data, all structural elements (12) that can also satisfy the at least one first parameter within a tolerance band are read from the calibration data, at least one second parameter of the at least second parameter is also read for each of the read structural elements (12), and structural elements (12) in the camera image of the at least one second camera (H, V) that can be associated one-to-one with the structural elements (12) to be identified in the camera image of the first camera (M) are determined.
6. Method according to claim 1 or 2, wherein a structure pattern (16) projected onto the scene (14) is picked up additionally using a third camera (V) positioned outside a straight connecting line (26) of the first and second cameras (M, H) and outside a straight connecting line (37) of the first camera (M) and the projector (36) and outside a straight connecting line (39) of the second camera (H) and the projector (36), wherein the calibration data additionally has a third parameter relating the respective structural element (12) to the position and orientation of the third camera (V) and the projector (36).
7. Method according to claim 1 or 2, wherein the structural elements (12) are distributed evenly over the structural pattern (16) on two main axes (42, 44).
8. The method of claim 7, wherein the major axes are mutually perpendicular axes.
9. Method according to claim 7, wherein the structure pattern (16) is projected onto the scene (14) such that the two main axes (42,44) of the structure pattern (16) extend obliquely with respect to the straight connecting lines (26,28) of the first and the at least one second camera (M, H, V) and/or obliquely with respect to the straight connecting lines (37,39,41) of the projector (36) with the first and the at least one second camera (M, H, V).
10. The method according to claim 1 or 2, wherein the structure pattern (16) is a dot pattern and the structure elements (12) are dots.
11. The method according to claim 1 or 2, wherein the structure pattern (16) is transmitted in a pulsed manner with a pulse frequency which corresponds to half the image recording frequency of the first camera and the at least one second camera (M, H, V), and wherein in each case two sequential images picked up by the first camera and the at least one second camera (M, H, V) are subtracted from one another, and/or wherein ambient light in the camera images is blocked using an optical filter which is transmissive only in the wavelength range of the projection light utilized when the structure pattern (16) is projected.
12. A method according to claim 1 or 2, wherein the method is used to monitor a hazard zone.
13. A method according to claim 12, wherein the method is for securing the hazard zone.
14. The method of claim 12, wherein the hazard is a hazard of a machine.
15. An apparatus for identifying structural elements (12) of a structural pattern (16) in a camera image projected onto a scene (14), comprising a projector (36), the projector (36) for projecting the structural pattern (16) onto the scene (14); further comprising a first camera (M) and at least one second camera (H, V) for picking up a structure pattern (16) projected onto the scene (14), wherein the first camera (M) and the at least one second camera (H, V) are positioned at a distance from each other, and wherein the projector (36) is positioned at a distance from the first camera (M) and from the at least one second camera (H, V) and outside the straight connecting lines (26,28) of the first camera and the at least one second camera (M, H, V); further comprising a storage unit (60), the storage unit (60) storing calibration data, the calibration data having, for each structural element (12), in each case a combination of a first parameter relating the respective structural element (12) to the position and orientation of the first camera (M) and the projector (36), and at least a second parameter relating the respective structural element (12) to the position and orientation of the at least one second camera (H, V) and the projector (36); and further comprising a calculation unit (62), the calculation unit (62) being adapted to: for a structural element (12) to be identified in the camera image of the first camera (M), using the calibration data to determine a structural element (12) in the camera image of the at least one second camera (H, V) that can be one-to-one associated with the structural element (12) to be identified in the camera image of the first camera (M),
wherein the first parameters of the respective structural elements (12) stored in the storage unit (60) comprise a slope (StM) of a first illumination epipolar line associated with the first camera (M) and an intersection (SPM) of the first illumination epipolar line with a first reference axis (84), and the at least second parameters of the respective structural elements (12) comprise a slope (StH, StV) of a second illumination epipolar line associated with the at least one second camera (H, V) and an intersection (SPH, SPV) of the second illumination epipolar line with a second reference axis, wherein the illumination epipolar lines lie within an image plane of the respective camera, and wherein each imaged structural element always lies on the same illumination epipolar line and the position of each imaged structural element on the illumination epipolar line depends on the distance of the respective projected structural element from the camera.
16. Apparatus according to claim 15, wherein the calculation unit (62) is arranged to: determine the pixel coordinates of the structural element (12) to be identified in the camera image of the first camera (M); calculate at least a first parameter from the pixel coordinates and the calibration data; read from the calibration data all structural elements (12) that can also meet the at least one first parameter within a tolerance band; also read at least one of said at least second parameters for each of the read structural elements (12); and determine structural elements (12) in the camera image of the at least one second camera (H, V) that can be associated one-to-one with the structural elements (12) to be recognized in the camera image of the first camera (M).
17. Apparatus according to claim 15 or 16, wherein the entrance pupil (18) of the first camera (M), the entrance pupil (20,22) of the at least one second camera (H, V) and the exit pupil (38) of the projector (36) lie in one common plane.
18. Apparatus according to claim 15 or 16, wherein the optical axis (30) of the first camera (M), the optical axis (32,34) of the at least one second camera (H, V) and the optical axis (40) of the projector (36) are parallel to each other.
19. The apparatus of claim 15 or 16, wherein a distance between the first camera and the at least one second camera (M, H, V) is different from a distance between the projector (36) and the first camera (M) and different from a distance between the projector (36) and the at least one second camera (H, V).
20. Apparatus according to claim 15 or 16, further comprising a third camera (V) for picking up a structured pattern (16) projected onto the scene (14), wherein the third camera (V) is positioned outside a straight connecting line (26) of the first and second cameras (M, H) and outside a straight connecting line (37) of the first camera (M) and the projector (36) and outside a straight connecting line (39) of the second camera (H) and the projector (36), wherein the calibration data stored in the storage unit (60) additionally has third parameters which relate the respective structural elements (12) to the position and orientation of the third camera (V) and the projector (36).
21. Apparatus according to claim 15 or 16, wherein the projector (36) is arranged to generate the structure pattern (16) in the form of a dot pattern.
22. The apparatus of claim 15 or 16, wherein the projector (36) is adapted to transmit the structure pattern (16) in a pulsed manner with a pulse frequency which corresponds to half of the image recording frequency of the first and the at least one second camera (M, H, V), and wherein the calculation unit (62) is arranged to subtract in each case two images picked up by the first camera and the at least one second camera (M, H, V) from one another, and/or the apparatus comprises an optical filter for blocking ambient light in the camera images, which optical filter is transmissive only in the wavelength range of the projection light utilized when the structured pattern (16) is projected.
23. Apparatus according to claim 15 or 16, wherein the apparatus is for monitoring a hazard zone.
24. The apparatus of claim 23, wherein the apparatus is to secure the hazard zone.
25. The apparatus of claim 23, wherein the hazard is a hazard of a machine.
CN201580049754.9A 2014-09-17 2015-08-10 Method and device for recognizing structural elements of a projected structural pattern in a camera image Active CN107077729B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102014113389.7 2014-09-17
DE102014113389.7A DE102014113389A1 (en) 2014-09-17 2014-09-17 Method and device for identifying structural elements of a projected structural pattern in camera images
PCT/EP2015/068377 WO2016041695A1 (en) 2014-09-17 2015-08-10 Method and apparatus for identifying structural elements of a projected structural pattern in camera images

Publications (2)

Publication Number Publication Date
CN107077729A CN107077729A (en) 2017-08-18
CN107077729B true CN107077729B (en) 2021-02-09

Family

ID=53879492

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580049754.9A Active CN107077729B (en) 2014-09-17 2015-08-10 Method and device for recognizing structural elements of a projected structural pattern in a camera image

Country Status (6)

Country Link
US (1) US10068348B2 (en)
EP (1) EP3195256B1 (en)
JP (1) JP6625617B2 (en)
CN (1) CN107077729B (en)
DE (1) DE102014113389A1 (en)
WO (1) WO2016041695A1 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6922169B2 (en) 2016-08-24 2021-08-18 ソニーグループ株式会社 Information processing equipment and methods, vehicles, and information processing systems
US10414048B2 (en) 2016-09-14 2019-09-17 Faro Technologies, Inc. Noncontact safety sensor and method of operation
US10841561B2 (en) * 2017-03-24 2020-11-17 Test Research, Inc. Apparatus and method for three-dimensional inspection
CN107464265B (en) * 2017-06-14 2021-05-11 深圳市圆周率软件科技有限责任公司 Parameter calibration system and method for binocular panoramic camera
US11096765B2 (en) 2018-06-22 2021-08-24 Align Technology, Inc. Light field intraoral 3D scanner with structured light illumination
US10896516B1 (en) * 2018-10-02 2021-01-19 Facebook Technologies, Llc Low-power depth sensing using dynamic illumination
US10901092B1 (en) * 2018-10-02 2021-01-26 Facebook Technologies, Llc Depth sensing using dynamic illumination with range extension
GB2584439B (en) * 2019-06-03 2023-02-22 Inspecvision Ltd Projector assembly system and method
US11398085B2 (en) * 2020-07-31 2022-07-26 Wisconsin Alumni Research Foundation Systems, methods, and media for directly recovering planar surfaces in a scene using structured light
WO2023094530A1 (en) 2021-11-25 2023-06-01 Trinamix Gmbh One shot calibration

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19502459A1 (en) * 1995-01-28 1996-08-01 Wolf Henning Three dimensional optical measurement of surface of objects
CA2309008C (en) * 1999-05-24 2007-07-17 Richard Mcbain High speed laser triangulation measurements of shape and thickness
US6341016B1 (en) * 1999-08-06 2002-01-22 Michael Malione Method and apparatus for measuring three-dimensional shape of object
ATE404952T1 (en) * 2003-07-24 2008-08-15 Cognitens Ltd METHOD AND SYSTEM FOR THREE-DIMENSIONAL SURFACE RECONSTRUCTION OF AN OBJECT
DE102004020419B3 (en) * 2004-04-23 2005-10-20 3D Shape Gmbh Method and apparatus for determining the shape and local surface normal of specular surfaces
ATE518113T1 (en) * 2005-03-11 2011-08-15 Creaform Inc SELF-REFERENCED THREE-DIMENSIONAL SCANNING SYSTEM AND APPARATUS
US7454841B2 (en) * 2005-11-01 2008-11-25 Hunter Engineering Company Method and apparatus for wheel alignment system target projection and illumination
DE102006002602A1 (en) * 2006-01-13 2007-07-19 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Calibration method and calibration system
EP2019281B1 (en) 2007-07-20 2009-09-09 Sick Ag Method for operating a 3D sensor
DE102007036129B3 (en) * 2007-08-01 2008-09-25 Sick Ag Device and method for the three-dimensional monitoring of a spatial area with at least two image sensors
US8139110B2 (en) * 2007-11-01 2012-03-20 Northrop Grumman Systems Corporation Calibration of a gesture recognition interface system
EP2133619A1 (en) 2008-06-10 2009-12-16 Sick Ag Three-dimensional monitoring and securing of an area
JP5457865B2 (en) * 2010-02-02 2014-04-02 倉敷紡績株式会社 Non-contact three-dimensional measuring apparatus and non-contact three-dimensional measuring method
CN102884397B (en) * 2010-05-07 2015-07-15 深圳泰山在线科技有限公司 Structured-light measuring method and system
JP2011237296A (en) * 2010-05-11 2011-11-24 Nippon Telegr & Teleph Corp <Ntt> Three dimensional shape measuring method, three dimensional shape measuring device, and program
JP2013210254A (en) * 2012-03-30 2013-10-10 Canon Inc Three-dimensional measuring device, three-dimensional measuring method and three-dimensional measuring program
JP6021533B2 (en) * 2012-09-03 2016-11-09 キヤノン株式会社 Information processing system, apparatus, method, and program
DE102012023623B4 (en) * 2012-11-28 2014-07-03 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method for assembling partial recordings of a surface of an object to a total record of the object and system for creating a complete record of an object
DE102012112321B4 (en) * 2012-12-14 2015-03-05 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US20140307055A1 (en) * 2013-04-15 2014-10-16 Microsoft Corporation Intensity-modulated light pattern for active stereo
WO2014201303A2 (en) * 2013-06-13 2014-12-18 Edge Toy, Inc. Three dimensional scanning apparatuses and methods for adjusting three dimensional scanning apparatuses
US20150015701A1 (en) * 2013-07-10 2015-01-15 Faro Technologies, Inc. Triangulation scanner having motorized elements
US9562760B2 (en) * 2014-03-10 2017-02-07 Cognex Corporation Spatially self-similar patterned illumination for depth imaging
DE102014210672A1 (en) * 2014-06-05 2015-12-17 BSH Hausgeräte GmbH Cooking device with light pattern projector and camera
WO2015191605A1 (en) * 2014-06-09 2015-12-17 The Johns Hopkins University Virtual rigid body optical tracking system and method
US20150381972A1 (en) * 2014-06-30 2015-12-31 Microsoft Corporation Depth estimation using multi-view stereo and a calibrated projector

Also Published As

Publication number Publication date
US20170178354A1 (en) 2017-06-22
JP6625617B2 (en) 2019-12-25
US10068348B2 (en) 2018-09-04
JP2017531258A (en) 2017-10-19
DE102014113389A1 (en) 2016-03-17
EP3195256B1 (en) 2020-06-10
EP3195256A1 (en) 2017-07-26
CN107077729A (en) 2017-08-18
WO2016041695A1 (en) 2016-03-24

Similar Documents

Publication Publication Date Title
CN107077729B (en) Method and device for recognizing structural elements of a projected structural pattern in a camera image
US9858682B2 (en) Device for optically scanning and measuring an environment
US20170131085A1 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US8970823B2 (en) Device for optically scanning and measuring an environment
KR100753885B1 (en) Image obtaining apparatus
US20140168370A1 (en) Device for optically scanning and measuring an environment
CN109141373A (en) Sensor for protecting a machine
CN106997455A (en) Photoelectric sensor and method for safely detecting objects of a minimum size
JP2016186469A (en) Information processing apparatus, information processing method, and program
TWI672937B (en) Apparatus and method for processing three dimensional images
JP2015106287A (en) Calibration device and method
WO2016040229A1 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
JP3677444B2 (en) 3D shape measuring device
JP2017528714A (en) Method for optical measurement of three-dimensional coordinates and control of a three-dimensional measuring device
JP2007093412A (en) Three-dimensional shape measuring device
JP6180158B2 (en) Position / orientation measuring apparatus, control method and program for position / orientation measuring apparatus
US20230016639A1 (en) System and method for controlling automatic inspection of articles
Rodrigues et al. Structured light techniques for 3D surface reconstruction in robotic tasks
JP2019090753A (en) Three-dimensional measuring method
US10060733B2 (en) Measuring apparatus
JP2008164338A (en) Position sensor
Jovanović et al. Accuracy assessment of structured-light based industrial optical scanner
US20190113336A1 (en) Multi-Directional Triangulation Measuring System with Method
JP6611872B2 (en) Measuring device
KR101888363B1 (en) Sensor Test Bed, Object Sensing Device, Object Sensing Method for Sensing the 3D Shape of The Object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant