CN116718108A - Binocular camera - Google Patents

Info

Publication number
CN116718108A
Authority
CN
China
Prior art keywords
lens
binocular camera
component
target
spot data
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310665249.3A
Other languages
Chinese (zh)
Inventor
张和君
冯福荣
吴兴发
陈源
廖学文
Current Assignee
Chotest Technology Inc
Original Assignee
Chotest Technology Inc
Application filed by Chotest Technology Inc filed Critical Chotest Technology Inc
Priority to CN202310665249.3A
Publication of CN116718108A

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/005 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates coordinate measuring machines
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00 Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C15/002 Active optical surveying means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements

Abstract

The present disclosure describes a binocular camera comprising a first light source, a second light source, a first acquisition unit configured to acquire a first light beam reflected by a target and obtain first spot data comprising a first component and a second component, and a second acquisition unit configured to acquire a second light beam reflected by the target and obtain second spot data comprising a third component and a fourth component. The first component, the third component, and a preset condition that the first and third components satisfy when the target is located on the sighting optical axis are used to obtain a first rotation angle; the second component and the fourth component are used to obtain a second rotation angle. The binocular camera is configured to rotate by the first rotation angle in a first direction and by the second rotation angle in a second direction, in each case about the rotation center of the binocular camera as the center point. The binocular camera can thereby be aimed at the target.

Description

Binocular camera
The present application is a divisional application of the patent application entitled "Method for aiming at a target based on a binocular camera, processing device, and laser tracker", filed on February 10, 2023, with application number 2023100982589.
Technical Field
The present disclosure relates generally to the intelligent manufacturing equipment industry, and more particularly to a binocular camera.
Background
The basic operating principle of a laser tracker is as follows: a target (also called a reflector or target ball) is placed on a point to be measured; a laser beam emitted by the laser tracking head of the laser tracker travels along the measuring optical axis of the laser tracker to the target, is reflected by the target, and returns to the laser tracking head; and when the target moves, the laser tracking head adjusts the direction of the laser beam so as to remain aimed at the target. Meanwhile, the returned laser beam is received and identified by the detection system of the laser tracker and is used to measure and calculate the spatial position of the target.
Patent document CN202210181257.6 discloses a coordinate measuring device with an automatic target recognition function and a recognition method thereof, in which a monocular camera system acquires the laser beam reflected by a target to form a target spot, calculates the rotation angle through which the coordinate measuring device must be adjusted from the pixel difference between the target spot and an initial spot, and controls the monocular camera to rotate by that angle so that the centroid of the target spot moves to the centroid of the initial spot and the detection system of the laser tracker can receive and recognize the laser beam reflected by the target. Here, the initial spot is the spot formed at the monocular camera system by the laser beam reflected by the target when the target is located at a preset position.
However, the sighting optical axis of the monocular camera and the measuring optical axis of the laser tracker do not coincide, so there is parallax between them. Moreover, the zero-point position of the laser tracker when measuring the target (i.e., the position on the imaging element of the monocular camera at which the returned laser beam lands when the target is located on the measuring optical axis of the laser tracker) depends on the distance from the target to the monocular camera and is therefore difficult to determine accurately. As a result, the rotation angle of the monocular camera (i.e., the rotation angle of the laser beam emitted by the laser tracker) cannot be obtained accurately from the zero-point position, and the sighting optical axis cannot be aimed directly at the target.
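The distance dependence of the zero point can be made concrete with a simple thin-lens sketch: for a monocular camera mounted a baseline b to one side of the measuring axis, a target sitting on the measuring axis at distance D images roughly f·b/(D·u) pixels off-center, so the zero point shifts as the target moves closer or farther. All parameter names and values below are made up for illustration and are not taken from the patent.

```python
def monocular_zero_point_px(distance_m: float,
                            baseline_m: float = 0.05,   # camera offset from the measuring axis (assumed)
                            focal_m: float = 0.025,     # lens focal length (assumed)
                            pixel_m: float = 3.45e-6):  # pixel pitch (assumed)
    """Pixel offset of the returned spot for a target ON the measuring
    axis, i.e. the 'zero point' of a monocular camera mounted beside
    the laser axis (thin-lens approximation)."""
    return focal_m * baseline_m / (distance_m * pixel_m)

# The zero point is not fixed: it moves as the target distance changes.
near_px = monocular_zero_point_px(2.0)   # roughly 181 px off-center
far_px = monocular_zero_point_px(20.0)   # roughly 18 px off-center
```

Because the offset varies by an order of magnitude over this range, a single pre-stored zero point cannot serve all distances, which is the difficulty the binocular arrangement avoids.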
Disclosure of Invention
The present disclosure has been made in view of the above-mentioned prior art, and an object thereof is to provide a binocular camera-based aiming method for improving the aiming accuracy of a laser tracker, and a processing apparatus and a laser tracker for performing the method.
To this end, a first aspect of the present disclosure provides a method of aiming a target based on a binocular camera, the method of aiming a sighting optical axis of the binocular camera at the target, the method comprising: emitting a first light beam and a second light beam; acquiring a first light beam reflected by the target by using the first acquisition unit and obtaining first light spot data, wherein the first light spot data comprises a first component along a first direction and a second component along a second direction, and acquiring a second light beam reflected by the target by using the second acquisition unit and obtaining second light spot data, and the second light spot data comprises a third component along the first direction and a fourth component along the second direction; calculating a first rotation angle based on the first component, the third component, and a preset condition satisfied by the first component and the third component when the target is located on the sighting optical axis; calculating a second rotation angle based on the second component and the fourth component; rotating the binocular camera by the first rotation angle along the first direction by taking the rotation center of the binocular camera as a center point; and rotating the binocular camera by the second rotation angle along the second direction by taking the rotation center of the binocular camera as a center point.
According to the binocular camera-based method of aiming at a target of the present disclosure, a first light beam is emitted by a first light source and a second light beam by a second light source; the target receives and reflects both beams. A first acquisition unit of the binocular camera acquires the reflected first light beam and obtains its first spot data, and a second acquisition unit acquires the reflected second light beam and obtains its second spot data. Based on the first spot data, the second spot data, and the preset condition that the first and second spot data satisfy when the target lies on the sighting optical axis of the binocular camera, the first rotation angle through which the binocular camera must rotate in the first direction about its rotation center, and the second rotation angle through which it must rotate in the second direction about its rotation center, can be calculated. In other words, by rotating the binocular camera about its rotation center by the first rotation angle in the first direction and by the second rotation angle in the second direction, the binocular camera can be aimed at the target.
Further, in the method according to the first aspect of the present disclosure, optionally, the first acquisition unit has a first lens and a first imaging element for acquiring the first spot data, and the second acquisition unit has a second lens and a second imaging element for acquiring the second spot data. In this case, the first light beam can be reflected by the target to the first lens and focused by the first lens onto the first imaging element for imaging, and the second light beam can be reflected by the target to the second lens and focused by the second lens onto the second imaging element, whereby the first imaging element can acquire the first spot data based on the reflected first light beam and the second imaging element can acquire the second spot data based on the reflected second light beam.
In addition, in the method according to the first aspect of the present disclosure, optionally, a first coordinate system is established based on the first imaging element and a second coordinate system is established based on the second imaging element; the lateral axis direction of the first coordinate system is the first direction and its longitudinal axis direction is the second direction; the lateral axis direction of the second coordinate system is the first direction and its longitudinal axis direction is the second direction; the first direction is the horizontal direction, and the second direction is the vertical direction. In this case, the first spot data can be decomposed into a first component along the first direction and a second component along the second direction in the first coordinate system, and the second spot data can be decomposed into a third component along the first direction and a fourth component along the second direction in the second coordinate system, thereby facilitating calculation with the first spot data and the second spot data.
In addition, in the method according to the first aspect of the present disclosure, optionally, there are a plurality of first light sources arranged around the first acquisition unit, and the first spot data are obtained from the plurality of first light beams emitted by these first light sources, reflected by the target, and acquired by the first acquisition unit; likewise, there are a plurality of second light sources arranged around the second acquisition unit, and the second spot data are obtained from the plurality of second light beams emitted by these second light sources, reflected by the target, and acquired by the second acquisition unit. In this case, since the plurality of first light sources surround the first acquisition unit, the spots they form at the first acquisition unit are also arranged in a surrounding shape, and the first spot data can be calculated from the combined center of gravity of these spots, so that first spot data of higher precision can be obtained. Similarly, since the plurality of second light sources surround the second acquisition unit, the spots they form at the second acquisition unit are also arranged in a surrounding shape, and the second spot data can be calculated from the combined center of gravity of these spots, so that second spot data of higher precision can be obtained. This both simplifies the calculation and improves its precision.
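The combined center-of-gravity step can be sketched as below; the spot tuple layout and the intensity weighting are illustrative assumptions rather than the patent's specification.

```python
def combined_centroid(spots):
    """Combined center of gravity of several light spots.

    Each spot is (x_px, y_px, intensity); the intensity-weighted mean
    of the individual spot positions gives a single spot-data coordinate.
    """
    total = sum(w for _, _, w in spots)
    x = sum(x * w for x, _, w in spots) / total
    y = sum(y * w for _, y, w in spots) / total
    return x, y

# Four equal-intensity spots surrounding the acquisition unit average
# out to the geometric center of the ring:
ring = [(10.0, 0.0, 1.0), (-10.0, 0.0, 1.0), (0.0, 10.0, 1.0), (0.0, -10.0, 1.0)]
# combined_centroid(ring) → (0.0, 0.0)
```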
Additionally, in the method according to the first aspect of the present disclosure, optionally, the method further includes calculating a distance between the target and the binocular camera based on the first component and the third component. Therefore, the distance between the target and the binocular camera can be conveniently measured, the space position of the target relative to the binocular camera can be determined, and the first rotation angle and the second rotation angle can be conveniently calculated.
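As a sketch of how the distance could follow from the first and third components, standard stereo triangulation relates the horizontal disparity of the two spots to target depth via distance = f·b/(u·(x1 − x2)), where b is the spacing between the two optical axes. The baseline b and all default parameter values below are assumptions; the patent's exact expression may differ.

```python
def target_distance(x1_px: float, x2_px: float,
                    baseline_m: float = 0.10,   # spacing of the two optical axes (assumed)
                    focal_m: float = 0.025,     # focal length of both lenses (assumed equal)
                    pixel_m: float = 3.45e-6):  # pixel pitch (assumed)
    """Distance from the lens plane to the target, via the horizontal
    disparity between the first component (x1) and third component (x2)."""
    disparity_m = (x1_px - x2_px) * pixel_m
    return focal_m * baseline_m / disparity_m

# A 100 px disparity with these assumed parameters puts the target near 7.25 m:
d = target_distance(100.0, 0.0)
```

Note that the disparity shrinks as the target recedes, so a larger disparity always means a closer target for a fixed baseline.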
Further, in the method according to the first aspect of the present disclosure, the first rotation angle may be calculated based on the first component, the third component, a distance between the target and the binocular camera, and a preset condition satisfied by the first component and the third component when the target is located on the sighting optical axis. In this case, since the first component and the third component satisfy the preset condition based on the triangle relation when the target is located on the sighting optical axis, the first rotation angle can be accurately and conveniently calculated based on the preset condition.
In addition, in the method according to the first aspect of the present disclosure, optionally, when the binocular camera is aimed at the target, the first component and the third component satisfy the preset condition that their sum equals a preset value, the preset value being obtained by calibrating the binocular camera while it is aimed at the target. In this case, the first rotation angle can be calculated accurately and conveniently based on the preset condition; moreover, the binocular camera can be calibrated to obtain preset values at different distances before it is used to aim at the target, so that when the first rotation angle is later calculated at a given distance, the corresponding calibrated preset value can be looked up directly.
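A minimal sketch of such a distance-dependent calibration might store the preset value K measured at a few known distances and interpolate between them; every name and number below is made up for illustration.

```python
# Hypothetical calibration table: preset value K (pixels), measured with
# the binocular camera aimed at a target placed at known distances.
K_TABLE = [(2.0, 412.0), (5.0, 396.0), (10.0, 390.0)]  # (distance_m, K_px) - made-up values

def preset_value(distance_m: float) -> float:
    """Linearly interpolate the calibrated preset value K for a given
    distance; clamps to the nearest table entry outside the range."""
    pts = sorted(K_TABLE)
    if distance_m <= pts[0][0]:
        return pts[0][1]
    for (d0, k0), (d1, k1) in zip(pts, pts[1:]):
        if distance_m <= d1:
            t = (distance_m - d0) / (d1 - d0)
            return k0 + t * (k1 - k0)
    return pts[-1][1]
```

For example, a target measured at 3.5 m would use a K interpolated halfway between the 2 m and 5 m calibration points.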
In addition, in the method according to the first aspect of the present disclosure, optionally, the first lens and the second lens are located in the same plane, and the first rotation angle is:

α1 = arctan( u·(x1 + x2 − K)·D1 / (2f·(D1 + L1)) )

and the second rotation angle is:

β1 = arctan( u·(y1 + y2)·D1 / (2f·(D1 + L1)) )

wherein α1 is the parameter value of the first rotation angle, β1 is the parameter value of the second rotation angle, u is the preset width of each pixel of the first imaging element and the second imaging element in the first direction and the second direction, L1 is the first preset distance from the rotation center of the binocular camera to the plane in which the first lens and the second lens are located, D1 is the distance from the target to that plane, f is the focal length of the first lens and the second lens, K is the preset value, x1 and x2 are the first component of the first spot data in the first coordinate system and the third component of the second spot data in the second coordinate system, respectively, and y1 and y2 are the second component of the first spot data in the first coordinate system relative to the optical axis of the first lens and the fourth component of the second spot data in the second coordinate system relative to the optical axis of the second lens, respectively. In this case, when the target is located at an arbitrary position on the sighting optical axis of the binocular camera, the preset condition satisfied by the first component and the third component can be deduced based on the triangle relation, and the first rotation angle can therefore be calculated accurately and conveniently based on the preset condition.
Meanwhile, since the optical axes of the first lens and the second lens of the binocular camera have no parallax with respect to the sighting optical axis of the binocular camera in the second direction, that is, the second component of the first spot data and the fourth component of the second spot data are not affected by parallax, the calculation of the second rotation angle can be simplified; moreover, using the average of the second component and the fourth component to calculate the second rotation angle improves the accuracy of the calculation, making the method of aiming at the target more convenient to compute.
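Under thin-lens and triangle relations of this kind, the two angle computations can be sketched as follows. The exact expressions, like all default parameter values, are assumptions consistent with the variable definitions in the text rather than the patent's verbatim formulas.

```python
import math

def rotation_angles(x1, x2, y1, y2, K, D1,
                    L1=0.05,      # rotation center to lens plane, metres (assumed)
                    f=0.025,      # focal length, metres (assumed)
                    u=3.45e-6):   # pixel width, metres (assumed)
    """First and second rotation angles in radians.

    x1, x2: first and third components (pixels); y1, y2: second and
    fourth components (pixels, relative to the optical axes); K: the
    calibrated preset value that x1 + x2 reaches on the sighting axis;
    D1: distance from the target to the lens plane (metres)."""
    scale = u * D1 / (2.0 * f * (D1 + L1))
    alpha = math.atan((x1 + x2 - K) * scale)  # rotation in the first (horizontal) direction
    beta = math.atan((y1 + y2) * scale)       # rotation in the second (vertical) direction
    return alpha, beta

# On the sighting axis (x1 + x2 == K and y1 + y2 == 0) no rotation is needed:
# rotation_angles(200.0, 200.0, 0.0, 0.0, K=400.0, D1=5.0) → (0.0, 0.0)
```

The vertical angle effectively uses the average of y1 and y2 (hence the factor 2 in the denominator), matching the observation that the second components are parallax-free and can be averaged for accuracy.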
In addition, a second aspect of the present disclosure provides a processing apparatus, including a control module that performs the method according to the first aspect of the present disclosure, and an input-output module that communicates with the outside, where the input-output module is interconnected with the control module through a bus. In this case, the input-output module can communicate with the binocular camera according to the first aspect of the present disclosure, whereby the control module can receive parameters of the binocular camera and calculate the first rotation angle and the second rotation angle that the binocular camera needs to rotate when aiming at the target based on the method of aiming at the target.
Further, a third aspect of the present disclosure provides a laser tracker for aiming a target, comprising a laser source for generating a laser beam, a processing device as related to the second aspect of the present disclosure, a binocular camera for acquiring a first rotation angle and a second rotation angle based on the method related to the first aspect of the present disclosure, and a driving unit for adjusting the direction of the laser beam based on the first rotation angle and the second rotation angle. In this case, the processing device can acquire the first rotation angle and the second rotation angle that need to be rotated in the process of aiming the binocular camera at the target, so that the driving unit can adjust the direction of the laser beam based on the first rotation angle and the second rotation angle to aim the laser beam at the target.
According to the present disclosure, a binocular camera-based method of targeting a target for improving the accuracy of a laser tracker targeting the target can be provided, as well as a processing apparatus and a laser tracker performing the method.
Drawings
Fig. 1A is a schematic diagram showing a laser tracker according to an example of the present embodiment.
Fig. 1B is a schematic diagram showing a method of aiming a target based on a binocular camera according to an example of the present embodiment.
Fig. 2A is a schematic diagram showing the structure of the binocular camera according to the present embodiment example.
Fig. 2B is a schematic diagram showing the composition of the binocular camera according to the present embodiment example.
Fig. 3 is a flowchart showing a method of aiming a target based on a binocular camera according to an example of the present embodiment.
Fig. 4A is a schematic diagram showing an actual optical path of a binocular camera-based method of aiming a target according to an example of the present embodiment.
Fig. 4B is a schematic diagram showing a first equivalent optical path of a binocular camera-based method of aiming a target according to an example of the present embodiment.
Fig. 5 is a schematic diagram showing a first coordinate system and a second coordinate system according to an example of the present embodiment.
Fig. 6 is a schematic diagram showing a distance to a measurement target based on a binocular camera according to an example of the present embodiment.
Fig. 7 is a schematic diagram showing that the sighting target according to the present embodiment example is located on the sighting optical axis of the binocular camera.
Fig. 8 is a schematic diagram showing measurement of the first rotation angle based on the binocular camera according to the present embodiment example.
Fig. 9 is a schematic diagram showing measurement of a second rotation angle based on the binocular camera according to the present embodiment example.
Fig. 10 is a schematic diagram showing components of the laser tracker according to the present embodiment example.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that the terms "first," "second," "third," and "fourth," etc. in the description and claims of the present invention and in the above figures are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus. In the following description, the same members are denoted by the same reference numerals, and overlapping description thereof is omitted. In addition, the drawings are schematic, and the ratio of the sizes of the components to each other, the shapes of the components, and the like may be different from actual ones.
The present embodiment relates to a method of aiming a target based on a binocular camera, which may also be referred to as a "method of capturing a target based on a binocular camera" (hereinafter may be simply referred to as a "method of aiming a target" or a "method of capturing a target").
Fig. 1A is a schematic diagram showing a laser tracker 30 according to an example of the present embodiment. Fig. 1B is a schematic diagram showing a method of aiming at the target 20 based on the binocular camera 10 according to the present embodiment example.
In this embodiment, referring to fig. 1A and 1B, the method of aiming at the target 20 based on the binocular camera 10 may be applied to the laser tracker 30. The laser tracker 30 may operate as follows: a target 20 (which may also be referred to as a "reflector" or "target ball") is placed on a point to be measured; the laser tracker 30 emits a laser beam along its measurement optical axis onto the target 20; the laser beam is reflected by the target 20 and returns to the laser tracker 30; and the returned laser beam is received and recognized by the detection system of the laser tracker 30, so that the laser tracker 30 can measure and calculate the spatial position of the target 20. When the target 20 moves over a larger range, the detection system of the laser tracker 30 may no longer receive and identify the laser beam reflected back by the target 20. The laser tracker 30 can then adjust the position and posture of the binocular camera 10 based on the method of aiming at the target 20 with the binocular camera 10, so that the target 20 comes to lie on the aiming optical axis M of the binocular camera 10 and the reflected laser beam can again be received and identified by the detection system, allowing the laser tracker 30 to calculate the spatial position of the target 20. Here, the aiming optical axis M of the binocular camera 10 may be the measuring optical axis of the laser tracker 30, that is, the axis along which the laser beam emitted by the laser tracker 30 travels.
Referring to fig. 1B, in the method for aiming the target 20 based on the binocular camera 10 according to the present embodiment, the first direction 12, the second direction 13, and the aiming optical axis M of the binocular camera 10 may be perpendicular to each other, the first projection 20a may be a projection position of the target 20 in a plane formed by the aiming optical axis M and the first direction 12 (i.e., the X-axis direction shown in fig. 1B, which may also be referred to as a "horizontal direction"), and the second projection 20B may be a projection position of the target 20 in a plane formed by the aiming optical axis M and the second direction 13 (i.e., the Y-axis direction shown in fig. 1B, which may also be referred to as a "vertical direction").
By the method of aiming the target 20 based on the binocular camera 10 according to the present disclosure, the binocular camera 10 can irradiate the light beam onto the target 20 through the first and second light sources 15 and 16 of the binocular camera 10, the acquisition unit 17 of the binocular camera 10 can acquire the first and second spot data of the light beam reflected by the target 20, and the first and second rotation angles α and β in the first and second directions 12 and 13 about the rotation center 11 of the binocular camera 10 can be obtained based on the first and second spot data.
In some examples, during aiming of the binocular camera 10 at the target 20, the binocular camera 10 may be rotated about the rotation center 11 as the center point by the first rotation angle α and the second rotation angle β, so that the target 20 comes to lie on the aiming optical axis M. Compared with methods that aim at the target 20 with a monocular camera, the method of the present disclosure can improve the accuracy of aiming at the target 20.
It should be noted that the method for aiming the target 20 based on the binocular camera 10 of the present disclosure is not limited to being applicable to the laser tracker 30, but may be applicable to other devices that need to aim the target 20.
Fig. 2A is a schematic diagram showing the structure of the binocular camera 10 according to the present embodiment example. Fig. 2B is a schematic diagram showing the composition of the binocular camera 10 according to the present embodiment example.
Referring to fig. 2B, in some examples, the binocular camera 10 may include a first light source 15 for emitting a first light beam 151, a second light source 16 for emitting a second light beam 161, and an acquisition unit 17 that receives the first and second light beams 151, 161 reflected back by the target 20. In this case, the acquisition unit 17 can receive the first and second light beams 151 and 161 reflected back by the target 20 and form a first light spot 152 (described later) corresponding to the first light beam 151 and a second light spot 162 (described later) corresponding to the second light beam 161, and the acquisition unit 17 can form the first spot data of the first light spot 152 and the second spot data of the second light spot 162 based on the first and second light spots 152 and 162, respectively.
In some examples, acquisition unit 17 may include a first acquisition unit 171 and a second acquisition unit 172, wherein first acquisition unit 171 may be configured to receive first beam 151 reflected back by target 20 and form first spot 152 and first spot data of first spot 152, and second acquisition unit 172 may be configured to receive second beam 161 reflected back by target 20 and form second spot 162 and second spot data of second spot 162. In this case, the first and second acquisition units 171 and 172 can acquire the first and second light beams 151 and 161 reflected back from the target 20, respectively, and form the first and second spot data, respectively, whereby the structural setup of the binocular camera 10 is simpler and more flexible, and the field of view of the binocular camera 10 is wider (see fig. 2B).
In some examples, the first acquisition unit 171 may have a first lens 1711 and a first imaging element 1712, and the second acquisition unit 172 may have a second lens 1721 and a second imaging element 1722 (see fig. 2B). In this case, first light beam 151 can be reflected off of target 20 to first lens 1711, focused through first lens 1711 onto first imaging element 1712 as first spot 152, second light beam 161 can be reflected off of target 20 to second lens 1721, focused through second lens 1721 onto second imaging element 1722 as second spot 162, whereby first imaging element 1712 can acquire first spot data based on first spot 152, and second imaging element 1722 can acquire second spot data based on second spot 162.
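The focusing step can be sketched with a pinhole model: a target point at (x, y, z) in a lens's own frame (z along the optical axis) lands at roughly (f·x/z, f·y/z) on the imaging element. Image inversion is ignored, and the parameter values below are made up for illustration.

```python
def spot_on_sensor(target_x_m: float, target_y_m: float, target_z_m: float,
                   focal_m: float = 0.025,     # lens focal length (assumed)
                   pixel_m: float = 3.45e-6):  # pixel pitch (assumed)
    """Pixel coordinates of the focused spot on an imaging element
    under a pinhole model (image inversion ignored)."""
    x_px = focal_m * target_x_m / target_z_m / pixel_m
    y_px = focal_m * target_y_m / target_z_m / pixel_m
    return x_px, y_px

# A target 10 mm off-axis at 5 m lands about 14.5 px from the sensor center:
px = spot_on_sensor(0.01, 0.0, 5.0)
```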
In some examples, the first lens 1711 and the second lens 1721 may lie in the same plane, with their optical centers in that plane; the optical centers of the first lens 1711 and the second lens 1721 lie on the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721, respectively. This facilitates the structural design of the binocular camera 10 and makes the calculation steps of the method of aiming at the target 20 simpler and more accurate.
In some examples, the first lens 1711 and the second lens 1721 may be located in different planes. Thereby enabling an improvement in the adaptability of the binocular camera 10.
Referring to fig. 2B, in some examples, the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may be parallel. This facilitates the structural design of the binocular camera 10 and enables simpler and more accurate calculation steps of the method of aiming the target 20.
In some examples, the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may be parallel and distributed on both sides of the aiming optical axis M of the binocular camera 10. This can further simplify the calculation of the method of aiming at the target 20 and make it more accurate.
In some examples, the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may be parallel and symmetrically distributed on both sides of the aiming optical axis M of the binocular camera 10. In this case, when the binocular camera 10 is aimed at the target 20, that is, when the target 20 is located at the aiming optical axis M, a preset value can be obtained based on the triangular relationship, thereby contributing to further simplifying the calculation of the first rotation angle α that needs to be rotated when the binocular camera 10 is aimed at the target 20.
In some examples, the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may intersect. This can improve the adaptability of the binocular camera 10.
In some examples, the projection of the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 in the second direction 13 may overlap with the projection of the aiming optical axis M of the binocular camera 10 in the second direction 13. Thereby, the calculation of the second rotation angle β that needs to be rotated when the binocular camera 10 aims at the target 20 can be simplified, and the accuracy of calculating the second rotation angle β can be improved.
In some examples, the first lens 1711 and the second lens 1721 may have the same focal length. This makes it possible to simplify the method of calculating the aim at the target 20.
In some examples, the first lens 1711 and the second lens 1721 may have different focal lengths. This can improve the adaptability of the binocular camera 10.
In some examples, the first imaging element 1712 may be a CMOS photosensitive element, for example, a CMOS image sensor. In other examples, the first imaging element 1712 may be a CCD photosensitive element. However, the example of the present embodiment is not limited thereto, and the first imaging element 1712 may be other elements that can be used for photoimaging.
In some examples, the second imaging element 1722 may be a CMOS photosensitive element, such as a CMOS image sensor. In other examples, the second imaging element 1722 may be a CCD photosensitive element. However, the example of the present embodiment is not limited thereto, and the second imaging element 1722 may be other elements that can be used for photoimaging.
In some examples, the first imaging element 1712 may include a photosensitive array (or referred to as a pixel array). The photosensitive array can be composed of a plurality of pixel points, and can convert received optical signals into electric signals for output. Thereby, the first light spot 152 formed on the first imaging element 1712 by the first light beam 151 reflected by the target 20 and the first light spot data of the first light spot 152 can be obtained.
In some examples, second imaging element 1722 may also include a photosensitive array. Thereby, the second light spot 162 formed on the second imaging element 1722 by the second light beam 161 reflected by the target 20 and the second spot data of the second light spot 162 can be obtained.
In some examples, the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 may be of the same type. For example, the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 may each be a CMOS photosensitive element or a CCD photosensitive element.
In some examples, the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 can have the same parameters. For example, the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 can have the same effective pixel array and pixel size. In other words, the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 may each contain the same number of pixels in the horizontal direction and the vertical direction, that is, may have the same total width of pixels. The size of each pixel dot may be the same, i.e., the width of each pixel dot in the horizontal direction and the vertical direction may be the same. Thus, the method of calculating the aim target 20 can be made easier, while also facilitating the design of the binocular camera 10.
In some examples, the type or parameters of the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 may also be different. Thereby enabling an improvement in the adaptability of the binocular camera 10.
In some examples, the first spot data may be positional information of the first spot 152 on the first imaging element 1712 and the second spot data may be positional information of the second spot 162 on the second imaging element 1722. Thereby, the first rotation angle α and the second rotation angle β can be calculated based on the first spot data and the second spot data subsequently.
In some examples, the number of the first light sources 15 may be plural, and the plurality of first light sources 15 may be arranged around the first collection unit 171. Specifically, the plurality of first light sources 15 may be arranged around the first lens 1711, and the first spot data may be spot data obtained by a plurality of first light beams 151 emitted by the plurality of first light sources 15 being reflected by the target 20 and acquired by the first acquisition unit 171. In this case, since the plurality of first light sources 15 are arranged around the first acquisition unit 171, the spots formed by the plurality of first light sources 15 acquired by the first acquisition unit 171 are also in a surrounding shape, and the first spot data is calculated using the joint center of gravity of the plurality of spots, one first spot data with higher accuracy can be obtained, and thus, not only calculation can be simplified, but also calculation accuracy can be improved.
In some examples, the number of second light sources 16 may be plural, and the plurality of second light sources 16 may be arranged around the second acquisition unit 172. Specifically, the plurality of second light sources 16 may be arranged around the second lens 1721, and the second spot data may be spot data obtained from the plurality of second light beams 161 emitted by the plurality of second light sources 16, reflected by the target 20, and acquired by the second acquisition unit 172. In this case, since the plurality of second light sources 16 are arranged around the second acquisition unit 172, the spots formed by the plurality of second light sources 16 and acquired by the second acquisition unit 172 also form a surrounding shape, and the second spot data is calculated using the combined center of gravity of the plurality of spots; one second spot data with higher accuracy can thus be obtained, which not only simplifies the calculation but also improves the calculation accuracy.
In some examples, the combined center of gravity of the spots of the plurality of first spots 152 and/or the plurality of second spots 162 on the first imaging element 1712 and the second imaging element 1722, respectively, may be calculated using any one of a centroid tracking method, a grayscale centroid method, a circular fitting method, and a Hough transform method. This can improve the adaptability of the calculation method of the sighting target 20.
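As a concrete illustration, the grayscale centroid method named above can be sketched in a few lines of Python with NumPy; the function name and the toy image are ours, not part of the disclosure:

```python
import numpy as np

def gray_centroid(image):
    """Joint grayscale centroid (combined center of gravity) of all
    bright pixels in `image`, returned as (x, y) pixel coordinates.
    Each pixel is weighted by its intensity, so the spots formed by
    several light sources yield one combined center in a single pass."""
    img = np.asarray(image, dtype=float)
    ys, xs = np.indices(img.shape)          # row/column index grids
    total = img.sum()
    return float((xs * img).sum() / total), float((ys * img).sum() / total)
```

For example, two unit-intensity spots at columns 3 and 5 of row 2 plus a double-intensity spot at (4, 4) produce the combined center of gravity (4.0, 3.0).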
In some examples, the first light source 15 and the second light source 16 may be diffuse light sources. In particular, the first light source 15 and the second light source 16 may be infrared LED light sources, and the light emitting angle may be 10 degrees to 50 degrees. In this case, the first light beam 151 and the second light beam 161 can be emitted in a large range, so that the probability that the target 20 reflects the first light beam 151 and the second light beam 161 can be improved, and the field of view range of the binocular camera 10 can be enlarged.
In some examples, the field of view overlap range of the first light source 15 and the second light source 16 may be greater than the field of view range of the first acquisition unit 171 and the second acquisition unit 172. In this case, the field of view range of the first light source 15 and the second light source 16 can be larger than that of the first acquisition unit 171 and the second acquisition unit 172, so that the effective field of view of the acquisition unit 17 can be utilized as much as possible, whereby the field of view range of the binocular camera 10 can be further enlarged.
In some examples, the first light source 15 may be a plurality of infrared LED light sources symmetrically distributed on both sides of the first lens 1711. In some examples, the second light source 16 may be a plurality of infrared LED light sources symmetrically distributed on both sides of the second lens 1721. This can expand the field of view of the binocular camera 10.
Fig. 3 is a flowchart showing a method of aiming at the target 20 based on the binocular camera 10 according to the present embodiment example.
Referring to fig. 3, the method of aiming the target 20 based on the binocular camera 10 may include emitting a first light beam 151 and a second light beam 161 (step S100), obtaining first light spot data using the first light beam 151, obtaining second light spot data using the second light beam 161 (step S200), calculating a first rotation angle α based on the first light spot data, the second light spot data, and a preset condition satisfied by the first light spot data and the second light spot data when the target 20 is positioned on the aiming optical axis M of the binocular camera 10 (step S300), calculating a second rotation angle β based on the first light spot data and the second light spot data (step S400), rotating the binocular camera 10 by the first rotation angle α (step S500), and rotating the binocular camera 10 by the second rotation angle β (step S600).
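The six steps can be sketched as a small driver sequence; every method on `camera` below is a hypothetical placeholder for the hardware operation named in steps S100 through S600, not an API from the disclosure:

```python
def aim_at_target(camera):
    """Illustrative sequence of steps S100-S600; each call is a
    hypothetical stand-in for the corresponding hardware action."""
    camera.emit_beams()                                # S100: emit first/second beams
    spot1, spot2 = camera.acquire_spot_data()          # S200: first/second spot data
    alpha = camera.first_rotation_angle(spot1, spot2)  # S300: uses the preset condition
    beta = camera.second_rotation_angle(spot1, spot2)  # S400
    camera.rotate_first_direction(alpha)               # S500: rotate by alpha
    camera.rotate_second_direction(beta)               # S600: rotate by beta
    return alpha, beta
```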
Fig. 4A is a schematic diagram showing an actual optical path of a method of aiming at the target 20 based on the binocular camera 10 according to the present embodiment example. Fig. 4B is a schematic diagram showing a first equivalent optical path of the binocular camera 10-based method of aiming the target 20 according to the present embodiment example.
In this embodiment, to facilitate the explanation of the method of aiming at the target 20 based on the binocular camera 10, the present disclosure may propose a first equivalent optical path as shown in fig. 4B based on the actual optical path of the method of aiming at the target 20 in fig. 4A, wherein the first acquisition unit 171 of the binocular camera 10 may include the first lens 1711 and the first imaging element 1712, and the second acquisition unit 172 may include the second lens 1721 and the second imaging element 1722.
To further illustrate the method of the present disclosure, the first lens 1711 and the second lens 1721 may have the same preset focal length and may be located in the same plane, and the optical center of the first lens 1711 and the optical center of the second lens 1721 may be located in that plane. The optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may be parallel and symmetrically distributed on both sides of the aiming optical axis M of the binocular camera 10; the rotation center 11 of the binocular camera 10 may be located on the aiming optical axis M, and the distance from the rotation center 11 to the same plane in which the first lens 1711 and the second lens 1721 are located may be a first preset distance L. The space between the first lens 1711 and the second lens 1721 may have a second preset distance a, which may be represented by the distance between the optical center of the first lens 1711 and the optical center of the second lens 1721 in that plane. The parameters of the photosensitive array of the first imaging element 1712 and the photosensitive array of the second imaging element 1722 may be the same, e.g., they may have the same total pixel width, and the size of each pixel point may be the same, i.e., the width of each pixel point in the first direction 12 and the second direction 13 may be the same. The first light source 15 may be two infrared LED light sources symmetrically distributed on both sides of the first lens 1711, that is, the first light source 15a and the first light source 15b shown in fig. 4A, and the second light source 16 may be two infrared LED light sources symmetrically distributed on both sides of the second lens 1721, that is, the second light source 16a and the second light source 16b shown in fig. 4A. The first light source 15a and the first light source 15b may have corresponding virtual light sources 15C and 15D (see fig. 4A), respectively, and the second light source 16a and the second light source 16b may have corresponding virtual light sources 16C and 16D (see fig. 4A), respectively; the virtual light sources 15C and 15D may be equivalent to the first virtual light source 153 shown in fig. 4B, and the virtual light sources 16C and 16D may be equivalent to the second virtual light source 163 shown in fig. 4B. The target 20 may be located at the midpoint of the line connecting the optical center of the first lens 1711 and the first virtual light source 153, and likewise at the midpoint of the line connecting the optical center of the second lens 1721 and the second virtual light source 163. This facilitates the design of the binocular camera 10 and makes the method of calculating the aim at the target 20 easier.
However, it should be noted that the setting of the positions and the partial parameters of the lens and the photosensitive array in the present disclosure should not be construed as limiting the method, for example, the focal lengths of the first lens 1711 and the second lens 1721 may be different, the first lens 1711 and the second lens 1721 may not be symmetrically disposed about the sighting optical axis M, and the photosensitive array may have different total pixel widths, in which case the formulas described below may be adaptively modified.
The method of aiming the target 20 may be described below with reference to the first equivalent optical path shown in fig. 4B.
In step S100, the first light source 15 may emit a first light beam 151 and the second light source 16 may emit a second light beam 161. In this case, the target 20 can receive the first and second light beams 151 and 161 and reflect the first and second light beams 151 and 161 to the first and second acquisition units 171 and 172, respectively, whereby the first acquisition unit 171 can obtain the first light beam 151 reflected by the target 20 and the second acquisition unit 172 can obtain the second light beam 161 reflected by the target 20.
In step S200, the first light beam 151 reflected by the target 20 may be acquired by the first acquisition unit 171 and first spot data may be obtained, the first spot data may include a first component in the first direction 12 and a second component in the second direction 13, the second light beam 161 reflected by the target 20 may be acquired by the second acquisition unit 172 and second spot data may be obtained, and the second spot data may include a third component in the first direction 12 and a fourth component in the second direction 13.
Fig. 5 is a schematic diagram showing a first coordinate system C1 and a second coordinate system C2 according to the present embodiment example.
In some examples, with the C-C direction in fig. 4B as the view direction, a first coordinate system C1 may be established based on the first imaging element 1712, the upper left corner of the first imaging element 1712 may be the origin O1 of the first coordinate system C1, the lateral axis direction of the first coordinate system C1, i.e., the X1 axis direction may be the first direction 12 (may also be referred to as the "horizontal direction"), and the longitudinal axis direction of the first coordinate system C1, i.e., the Y1 axis direction may be the second direction 13 (may also be referred to as the "vertical direction") (see fig. 5). In this case, the first spot data can be decomposed into a first component in the first direction 12 and a second component in the second direction 13 in the first coordinate system C1, thereby facilitating calculation of the first spot data.
In some examples, with the C-C direction in fig. 4B as the view direction, a second coordinate system C2 may be established based on the second imaging element 1722, the upper left corner of the second imaging element 1722 may be the origin O2 of the second coordinate system C2, the lateral axis direction of the second coordinate system C2, i.e., the X2 axis direction may be the first direction 12 (may also be referred to as the "horizontal direction"), and the longitudinal axis direction of the second coordinate system C2, i.e., the Y2 axis direction may be the second direction 13 (may also be referred to as the "vertical direction") (see fig. 5). In this case, the second spot data can be decomposed into a third component in the first direction 12 and a fourth component in the second direction 13 in the second coordinate system C2, thereby facilitating calculation of the second spot data.
In some examples, the units of the horizontal and vertical axes of the first and second coordinate systems C1 and C2 may be the number of pixels, in other words, coordinates located in the first and second coordinate systems C1 and C2 may be expressed in terms of pixel offset.
In some examples, the lateral axis directions of the first coordinate system C1 and the second coordinate system C2 may coincide. Thus, the design of the binocular camera 10 and the calculation method of the aiming target 20 are facilitated to be simpler.
In some examples, the first and second imaging elements 1712, 1722 may have the same total pixel width, in other words, the first and second imaging elements 1712, 1722 may have the same total pixel width in the lateral and longitudinal directions, and the width of each pixel point may be the same in the lateral and longitudinal directions, i.e., the first and second imaging elements 1712, 1722 shown in fig. 5 may have the same third preset distance W.
In addition, the first image center P1 and the second image center P2 in fig. 5 may represent the image centers of the first imaging element 1712 and the second imaging element 1722, that is, the center point positions of their respective photosensitive arrays; in other words, the horizontal axis coordinate value of the first image center P1 and of the second image center P2 in their respective coordinate systems is half the value of the third preset distance W.
In some examples, the first spot data may include a first component along the first direction 12 and a second component along the second direction 13, where the first component may be a coordinate value of the first spot 152 in the X1 axis direction of the first coordinate system C1, in other words, the first component may be a pixel offset of the first spot 152 in the X1 axis direction, that is, the first direction 12 relative to the origin O1 of the first coordinate system C1, and the second component may be a pixel offset of the first spot 152 in the Y1 axis direction of the first coordinate system C1, that is, the second direction 13 relative to the optical axis T1 of the first lens 1711. Thus, the method of calculating the aim target 20 can be facilitated by clearly defining the format meaning of the first spot data.
In some examples, the second spot data may include a third component along the first direction 12 and a fourth component along the second direction 13, where the third component may be a coordinate value of the second spot 162 in the X2 axis direction in the second coordinate system C2, in other words, the third component may be a pixel offset of the second spot 162 in the X2 axis direction of the second coordinate system C2, that is, the first direction 12 relative to the origin O2 of the second coordinate system C2, and the fourth component may be a pixel offset of the second spot 162 in the Y2 axis direction, that is, the second direction 13 relative to the optical axis T2 of the second lens 1721. Thus, the method of calculating the aim target 20 can be facilitated by clearly defining the format meaning of the second spot data.
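Under the coordinate conventions just described, turning one spot's raw pixel position into its two components is a small bookkeeping step. The helper below is a hypothetical sketch; `axis_y_px` stands for the pixel row at which the lens optical axis meets the sensor, a value that would come from calibration:

```python
def spot_components(spot_xy_px, axis_y_px):
    """Decompose one spot's pixel coordinates into the component along
    the first direction (measured from the image origin, per the X axis
    convention above) and the component along the second direction
    (measured from the lens optical axis row, per the Y convention)."""
    x_px, y_px = spot_xy_px
    return x_px, y_px - axis_y_px
```

A spot at pixel (640, 500) on a sensor whose optical axis crosses row 512 thus has first component 640 and second component -12.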
Fig. 6 is a schematic diagram showing the measurement of the distance of the target 20 based on the binocular camera 10 according to the present embodiment example.
In some examples, referring to fig. 6, the method of aiming the target 20 based on the binocular camera 10 may further include calculating a distance between the target 20 and the binocular camera 10 based on the first component of the first spot data and the third component of the second spot data. Specifically, the distance between the target 20 and the binocular camera 10 may be represented by a distance D of the target 20 to the same plane in which the first lens 1711 and the second lens 1721 lie. Thus, the first rotation angle α and the second rotation angle β can be calculated easily, and the distance between the target 20 and the binocular camera 10 can be measured, thereby determining the spatial position of the target 20 relative to the binocular camera 10. Specifically, the calculation process of the distance D between the target 20 and the binocular camera 10 is as follows:
referring to fig. 6, formula (1) can be obtained according to the triangle relation:

d1/f = (a + h)/(2·D1), d2/f = (a - h)/(2·D1)    (1)

where d1 may be the parameter value of the distance of the first spot 152 from the optical axis T1 of the first lens 1711 in the first direction 12 (i.e., in the X1 axis direction of the first coordinate system C1), d2 may be the parameter value of the distance of the second spot 162 from the optical axis T2 of the second lens 1721 in the first direction 12 (i.e., in the X2 axis direction of the second coordinate system C2) (see fig. 5), f may be the parameter value of the focal length of the first lens 1711 and the second lens 1721, D1 may be the parameter value of the distance D between the target 20 and the binocular camera 10, and h may be the parameter value of the distance by which the first virtual light source 153 and the second virtual light source 163 move integrally in the first direction 12 (i.e., in the horizontal direction) when the position of the target 20 moves with respect to the sighting optical axis M.
Adding the two expressions of formula (1) cancels h and gives formula (2):

D1 = a·f/(d1 + d2)    (2)
accordingly, the distance D between the target 20 and the binocular camera 10 can be conveniently measured, and the spatial position of the target 20 relative to the binocular camera 10 can be also determined, so that the first rotation angle α and the second rotation angle β can be conveniently calculated.
When the focal lengths of the first lens 1711 and the second lens 1721 are different, formula (3) can be obtained according to the triangle relation:

D1 = a/(d1/f1 + d2/f2)    (3)

where f1 and f2 may be the parameter values of the focal lengths of the first lens 1711 and the second lens 1721, respectively.
Referring to fig. 5, x1 and x2 may be used to represent the parameter value of the first component of the first spot data in the first coordinate system C1 and the parameter value of the third component of the second spot data in the second coordinate system C2, respectively; in other words, x1 may be the coordinate value of the first spot data in the X1 axis direction of the first coordinate system C1, and x2 may be the coordinate value of the second spot data in the X2 axis direction of the second coordinate system C2. Meanwhile, to improve the calculation accuracy of the method of aiming the target 20, the optical axis T1 of the first lens 1711 and the first image center P1 may have a preset offset b1 along the X1 axis direction (i.e., the first direction 12) of the first coordinate system C1, and the optical axis T2 of the second lens 1721 and the second image center P2 may have a preset offset b2 along the X2 axis direction (i.e., the first direction 12) of the second coordinate system C2, where b1 may be positive when the optical axis T1 is located on the right side of the first image center P1 and negative otherwise; likewise, b2 may be positive when the optical axis T2 is located on the right side of the second image center P2 and negative otherwise. The total pixel width of the first imaging element 1712 and the second imaging element 1722 may be represented by the third preset distance W, i.e., the first imaging element 1712 and the second imaging element 1722 may each be W1 pixels wide, and the preset width of each pixel of the first imaging element 1712 and the second imaging element 1722 in the first direction and the second direction may be represented by u, which may be in millimeters (mm) or other length units. Thereby, formula (4) is obtained:

d1 = u·(W1/2 + b1 - x1), d2 = u·(x2 - W1/2 - b2)    (4)
Letting b = b2 - b1, formula (2) can be converted into formula (5):

D1 = a·f/(u·(x2 - x1 - b))    (5)
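A minimal numeric sketch of formula (5) follows; the baseline a, focal length f, and pixel width u used in the example are invented for illustration and do not come from the disclosure:

```python
def target_distance(x1, x2, a, f, u, b=0.0):
    """Formula (5): distance D1 from the target to the lens plane,
    where x1, x2 are the first/third components in pixels, a is the
    lens spacing and f the focal length in mm, u is the pixel width
    in mm, and b = b2 - b1 is the difference of the axis offsets."""
    return a * f / (u * (x2 - x1 - b))
```

For instance, with a = 100 mm, f = 50 mm, u = 0.005 mm, and spot columns x1 = 750 and x2 = 1250 (a 500-pixel disparity), the target lies 2000 mm from the lens plane.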
fig. 7 is a schematic view showing that the sighting target 20 according to the present embodiment example is located on the sighting optical axis M of the binocular camera 10, and fig. 8 is a schematic view showing that the first rotation angle α is measured based on the binocular camera 10 according to the present embodiment example.
In step S300, referring to fig. 7, the first rotation angle α may be calculated based on the first component of the first spot data, the third component of the second spot data, and a preset condition satisfied by the first component of the first spot data and the third component of the second spot data when the target 20 is located at the aiming optical axis M of the binocular camera 10. As a result, it can be obtained that the binocular camera 10 needs to rotate the first rotation angle α of the binocular camera 10 in the first direction 12, that is, in the direction of the transverse axis of the first coordinate system C1 and the second coordinate system C2, with the rotation center 11 as the center point in the process of aiming at the target 20.
Referring to fig. 7, in some examples, the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may be parallel and may be symmetrically distributed on both sides of the aiming optical axis M of the binocular camera 10.
In some examples, the preset condition satisfied by the first and third components when the binocular camera 10 is aimed at the target 20 may be that the first and third components sum to a preset value.
In some examples, the preset value may be obtained by calibrating the binocular camera 10 while aiming the binocular camera 10 at the target 20, i.e., with the target 20 on the aiming optical axis M of the binocular camera 10. Specifically, when aiming the binocular camera 10 at the target 20, the target 20 is located on the aiming optical axis M of the binocular camera 10, see fig. 7, equation (6) can be obtained:
d1 = d2    (6)
Substituting formula (4) into formula (6) gives formula (7):

W1/2 + b1 - x1 = x2 - W1/2 - b2    (7)
from this, formula (8) can be obtained:
x1 + x2 = W1 + b1 + b2    (8)
In other words, when the binocular camera 10 aims at the target 20 and the target 20 is located at an arbitrary position on the aiming optical axis M, the preset condition satisfied by the first component of the first spot data and the third component of the second spot data is formula (9):

K = x1 + x2 = W1 + b1 + b2    (9)
K in formula (9) is the parameter value of the preset value. W1, b1, and b2 may be preset values, or may be considered to remain unchanged during the process of the binocular camera 10 aiming at the target 20. In other words, when the binocular camera 10 is aimed at the target 20, the target 20 is located on the aiming optical axis M, and the binocular camera 10 is calibrated, the sum of the first component of the first spot data and the third component of the second spot data may be a preset value; that is, the parameter value K in formula (9) may be independent of the distance D from the target 20 to the binocular camera 10, and likewise independent of the second component of the first spot data and the fourth component of the second spot data. In this case, the binocular camera 10 can be calibrated to obtain the preset value before the target 20 is aimed, so that the first rotation angle α can be calculated accurately and conveniently based on the preset condition of formula (9) with the target 20 located at any distance D on the aiming optical axis M, i.e., at any distance from the same plane in which the first lens 1711 and the second lens 1721 lie. In addition, when the target 20 is located at an arbitrary position on the aiming optical axis M, the second component of the first spot data and the fourth component of the second spot data remain unchanged, so the calculation of the second rotation angle β is not affected by the position of the target 20 on the aiming optical axis M and can be simplified.
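The distance-independence of the preset value can be checked with a small pinhole-model simulation of the first equivalent optical path. All geometry numbers below (a = 100 mm, f = 50 mm, u = 0.005 mm, W1 = 2000 pixels, b1 = b2 = 0) are invented for illustration; `offset` denotes the horizontal offset of the target from the aiming optical axis:

```python
def simulate_spot_columns(D, offset, a=100.0, f=50.0, u=0.005,
                          W1=2000, b1=0.0, b2=0.0):
    """Pinhole sketch of the stereo geometry: returns the pixel columns
    (x1, x2) of the two spots for a target at distance D (mm) and
    horizontal offset `offset` (mm) from the aiming optical axis."""
    d1 = f * (a / 2 + offset) / D     # spot offset from axis T1, mm
    d2 = f * (a / 2 - offset) / D     # spot offset from axis T2, mm
    x1 = W1 / 2 + b1 - d1 / u         # formula (4) solved for x1
    x2 = W1 / 2 + b2 + d2 / u         # formula (4) solved for x2
    return x1, x2
```

With the target on the aiming axis (offset = 0), x1 + x2 stays at W1 + b1 + b2 = 2000 for every distance D, matching formula (9); off-axis targets break the condition.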
In some examples, referring to fig. 8, the first rotation angle α may be calculated based on the first component, the third component, the distance D between the target 20 and the binocular camera 10, and the preset condition satisfied by the first component and the third component when the target 20 is located on the sighting optical axis M. Thereby, the first rotation angle α can be calculated easily. As shown in fig. 8, when the target 20 deviates from the aiming optical axis M of the binocular camera 10, the binocular camera 10 can be rotated by a first rotation angle α about its rotation center 11 in the first direction 12, i.e., the lateral axis direction of the first coordinate system C1 and the second coordinate system C2, giving formula (10):

tan α1 = (h/2)/(D1 + L1)    (10)

where the distance from the rotation center 11 to the same plane in which the first lens 1711 and the second lens 1721 are located may be the first preset distance L, L1 may be the parameter value of the first preset distance L, and α1 may be the parameter value of the first rotation angle α. By combining formula (1), formula (5), and formula (9), the first rotation angle α in the first direction 12 can be obtained as formula (11):

α1 = arctan( u·(K - x1 - x2)·D1 / (2·f·(D1 + L1)) )    (11)
as can be seen from the equation (11), the method of aiming the target 20 based on the binocular camera 10 may further include calculating the distance D between the target 20 and the binocular camera 10 based on the first component of the first spot data and the third component of the second spot data, so that the first rotation angle α can be conveniently calculated. In addition, when the target 20 is located on the sighting optical axis M, the preset condition satisfied by the first component and the third component can be deduced based on the triangle relation, so that the first rotation angle α can be accurately and conveniently calculated based on the preset condition.
Fig. 9 is a schematic diagram showing measurement of the second rotation angle β based on the binocular camera 10 according to the present embodiment example.
For convenience of describing the calculation process, referring to fig. 9, a projection is taken in the B-B view direction shown in fig. 4B, whereby the second equivalent optical path shown in fig. 9 proposed in the present disclosure can be obtained. In other words, the optical path of the first acquisition unit 171 may be projected in the second direction 13, that is, in the vertical direction, to obtain the second equivalent optical path shown in fig. 9. The second equivalent optical path may include the first lens 1711 and the first imaging element 1712, and the projection of the optical axis T1 of the first lens 1711 in the second direction 13 may overlap the projection of the aiming optical axis M of the binocular camera 10 in the second direction 13. In fig. 9, the first spot 152 may have an offset relative to the optical axis T1 in the second direction 13, that is, in the vertical direction, the target 20 may have an offset e relative to the aiming optical axis M in the second direction 13, and the distance from the rotation center 11 to the same plane in which the first lens 1711 and the second lens 1721 are located may be the first preset distance L with parameter value L1.
It should be noted that the second equivalent optical path of the present disclosure is not limited to a projection in the B-B view direction shown in fig. 4B; a second equivalent optical path including the second acquisition unit 172 as a component may also be obtained by projecting in the direction opposite to the B-B view shown in fig. 4B.
In step S400, referring to fig. 9, the second rotation angle β may be calculated based on the second component of the first spot data and the fourth component of the second spot data. In some examples, when the target 20 deviates from the sighting optical axis M of the binocular camera 10, the target 20 may have a second rotation angle β in the second direction 13, that is, the Y-axis direction of the first coordinate system C1 and the second coordinate system C2, relative to the rotation center 11 of the binocular camera 10. Thus, in the process of aiming at the target 20, the binocular camera 10 needs to be rotated by the second rotation angle β in the second direction 13, that is, the Y-axis direction of the first coordinate system C1 and the second coordinate system C2, with the rotation center 11 as the center point.
Referring to fig. 9, according to the trigonometric relation, the second rotation angle β can be expressed as formula (12):
from the triangle relation, formula (13) can be obtained:
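The bodies of formulas (12) and (13) are not reproduced in this text (they appear as images in the original publication). One reconstruction consistent with the surrounding description, where h denotes the vertical offset of the target 20 from the sighting optical axis M and d the physical offset of the first light spot 152 on the first imaging element 1712 (both symbols are assumptions introduced here), is:

```latex
\tan\beta = \frac{h}{L_1 + D_1} \qquad (12)
\qquad
\frac{h}{D_1} = \frac{d}{f} \qquad (13)
```

Combined with formula (14), $d = u\,y_1$, this reconstruction would give $\beta = \arctan\!\left(\frac{D_1\,u\,y_1}{f\,(L_1 + D_1)}\right)$, which matches the way the surrounding text combines formulas (12) and (14).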
referring to FIG. 5, y can be used 1 And y 2 The parameter value of the second component of the first spot data in the first coordinate system C1 and the parameter value of the fourth component of the second spot data in the second coordinate system C2 are represented by β1, respectively. Since the projections of the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 in the second direction 13 may overlap the sighting optical axis M, i.e. there may be no parallax between the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 in the second direction 13 and the sighting optical axis M, the second rotation angle β may be calculated in the following manner.
In some examples, the second rotation angle β may be calculated based on the second component of the first spot data in the second equivalent optical path shown in fig. 9, obtained by projecting in the B-B view direction shown in fig. 4B. In this case, formula (14) can be obtained: d = u·y1. From this, the second rotation angle β can be calculated by combining formula (14) with formula (12).
In some examples, the second rotation angle β may be calculated based on the fourth component of the second spot data in a second equivalent optical path obtained by projecting in the view direction opposite to B-B shown in fig. 4B. In this case, formula (15) can be obtained: d = u·y2. From this, the second rotation angle β can be calculated by combining formula (15) with formula (12).
In some examples, the second rotation angle β may be calculated jointly from the second component of the first spot data and the fourth component of the second spot data, which yields formula (16):
In this case, since there is no parallax between the optical axis T1 of the first lens 1711, the optical axis T2 of the second lens 1721, and the sighting optical axis M in the second direction 13, for a target 20 located at an arbitrary position within the fields of view of the first lens 1711 and the second lens 1721, the second rotation angle β calculated from the second component of the first spot data and the second rotation angle β calculated from the fourth component of the second spot data are substantially identical. The parameter value of the offset distance of the first light spot 152 from the sighting optical axis M in the second direction 13, that is, the vertical direction, can therefore be taken as the average of the second component of the first spot data and the fourth component of the second spot data, so that the calculation accuracy of the method of aiming at the target 20 can be further improved.
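The averaged computation described above can be sketched as follows. Since formulas (12)-(16) are not reproduced in this text, the geometry (h = D1·d/f for the target's vertical offset, rotation about a center L1 behind the lens plane) is a reconstruction, and all parameter names follow the definitions in claim 10:

```python
import math

def second_rotation_angle(y1: float, y2: float, u: float, f: float,
                          L1: float, D1: float) -> float:
    """Sketch (assumed reconstruction) of the second rotation angle.

    y1, y2: vertical spot offsets in pixels (second/fourth components);
    u: pixel pitch; f: lens focal length; L1: distance from the rotation
    center to the lens plane; D1: distance from the target to the lens
    plane. Returns the angle in radians.
    """
    y_avg = (y1 + y2) / 2.0        # average the two components (formula (16))
    d = u * y_avg                  # physical spot offset on the imaging element
    h = D1 * d / f                 # vertical offset of the target (triangle relation)
    return math.atan2(h, L1 + D1)  # rotation needed about the rotation center
```

Averaging the two components is the accuracy improvement the text describes: any independent measurement noise in y1 and y2 is partially cancelled.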
From this, formula (17) can be obtained:
as can be seen from equation (17), the method of aiming the target 20 based on the binocular camera 10 may further include calculating a distance D between the target 20 and the binocular camera 10 based on the first component of the first spot data and the third component of the second spot data, thereby enabling convenient calculation of the second rotation angle β.
In the present embodiment, when the target 20 is located at an arbitrary position on the sighting optical axis M of the binocular camera 10, the preset condition satisfied by the first component and the third component can be deduced from the triangle relation, and the first rotation angle α can be calculated accurately and conveniently based on the preset condition. Meanwhile, since the optical axis T1 of the first lens 1711 and the optical axis T2 of the second lens 1721 may be limited to have parallax with respect to the sighting optical axis M only in the first direction 12, the second component of the first spot data and the fourth component of the second spot data in the second direction 13, that is, the vertical direction, are not affected by parallax. The calculation of the second rotation angle β can thereby be simplified; moreover, since the second rotation angle β can be calculated from the average of the second component and the fourth component, the calculation accuracy can be improved, making the method of aiming at the target 20 more convenient.
In step S500, the binocular camera 10 may be rotated by a first rotation angle α. In step S600, the binocular camera 10 may be rotated by a second rotation angle β. In this case, by step S500 and step S600, the binocular camera 10 can be aimed at the target 20, that is, the target 20 is located on the aiming optical axis M of the binocular camera 10.
In the present embodiment, the execution order of step S300 and step S400 is not limited, in other words, step S300 may be executed before step S400, may be executed after step S400, or may be executed simultaneously with step S400; the order of execution of step S500 and step S600 is not limited, in other words, step S500 may be executed before step S600, may be executed after step S600, or may be executed simultaneously with step S600.
According to the method for aiming at the target 20 based on the binocular camera 10 of the present disclosure, the first light source 15 can emit the first light beam 151 and the second light source 16 can emit the second light beam 161; the target 20 can receive and reflect the first light beam 151 and the second light beam 161; the first acquisition unit 171 of the binocular camera 10 can acquire the first light beam 151 reflected by the target 20 to obtain the first light spot 152 and the first spot data of the first light spot 152; and the second acquisition unit 172 of the binocular camera 10 can acquire the second light beam 161 reflected by the target 20 to obtain the second light spot 162 and the second spot data of the second light spot 162. The first rotation angle α by which the binocular camera 10 needs to rotate in the first direction 12 with the rotation center 11 as the center point in the process of aiming at the target 20 can be calculated based on the first spot data, the second spot data, and the preset condition satisfied by the first spot data and the second spot data when the target 20 is located on the sighting optical axis M of the binocular camera 10; the second rotation angle β by which the binocular camera 10 needs to rotate in the second direction 13 with the rotation center 11 as the center point can likewise be calculated based on the first spot data and the second spot data. The binocular camera 10 can thereby be rotated by the first rotation angle α and the second rotation angle β so as to aim at the target 20.
Additionally, embodiments of the present disclosure relate to a processing device that may include a control module that performs the method of aiming at the target 20 to which the present disclosure relates. Thus, the control module is able to calculate, based on the method of aiming at the target 20, the first rotation angle α and the second rotation angle β by which the binocular camera 10 needs to be rotated when aiming at the target 20.
In some examples, the processing device may further include an input-output module in communication with the outside, the input-output module being interconnected with the control module by a bus. In this case, the input-output module can communicate with the binocular camera 10 according to the present disclosure, so that the control module can receive parameters of the binocular camera 10 and calculate the first rotation angle α and the second rotation angle β that need to be rotated when the binocular camera 10 is aimed at the target 20.
Fig. 10 is a schematic diagram showing components of the laser tracker 30 according to the present embodiment example.
Based on the above method for aiming at the target 20 with the binocular camera 10, the present disclosure also provides a laser tracker 30 capable of aiming at the target 20.
In some examples, the laser tracker 30 may include a laser source 31 for generating a laser beam, a processing device 33 as described above, a binocular camera 10 that obtains a first rotation angle α and a second rotation angle β based on the method of aiming the target 20 described above, and a driving unit 32 that adjusts the direction of the laser beam based on the first rotation angle α and the second rotation angle β. In this case, the processing device 33 can acquire the first rotation angle α and the second rotation angle β that need to be rotated in the process of aiming the binocular camera 10 at the target 20, so that the driving unit 32 can adjust the direction of the laser beam based on the first rotation angle α and the second rotation angle β to aim the laser beam at the target 20 (see fig. 10).
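The composition of the laser tracker 30 can be sketched structurally as follows. The class and method names are illustrative assumptions, not the patent's API; the angle computation is injected as a callable standing in for the aiming method (formulas (11)-(17) are not reproduced here):

```python
class DrivingUnit:
    """Adjusts the laser beam direction by the two rotation angles."""
    def __init__(self):
        self.direction = (0.0, 0.0)  # (horizontal, vertical) in radians

    def rotate(self, alpha, beta):
        h, v = self.direction
        self.direction = (h + alpha, v + beta)


class LaserTracker:
    """Structural sketch: laser source + binocular camera spot data +
    processing device (compute_angles) + driving unit."""
    def __init__(self, compute_angles):
        # compute_angles(spot1, spot2) -> (alpha, beta): callable
        # implementing the aiming method of the present disclosure
        self.compute_angles = compute_angles
        self.drive = DrivingUnit()

    def aim(self, spot1, spot2):
        alpha, beta = self.compute_angles(spot1, spot2)
        self.drive.rotate(alpha, beta)  # steer the laser beam onto the target
        return alpha, beta
```

This mirrors the data flow of fig. 10: the processing device 33 supplies α and β, and the driving unit 32 applies them to the beam direction.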
While the disclosure has been described in detail in connection with the drawings and examples, it is to be understood that the foregoing description is not intended to limit the disclosure in any way. Those skilled in the art can make modifications and variations to the present disclosure as required without departing from the true spirit and scope of the disclosure, and such modifications and variations are within the scope of the disclosure.

Claims (10)

1. A binocular camera for aiming a sighting optical axis of the binocular camera at a target, characterized in that the binocular camera comprises a first light source for emitting a first light beam, a second light source for emitting a second light beam, a first acquisition unit configured to acquire the first light beam reflected by the target and to obtain first spot data comprising a first component in a first direction and a second component in a second direction, and a second acquisition unit configured to acquire the second light beam reflected by the target and to obtain second spot data comprising a third component in the first direction and a fourth component in the second direction;
the first component, the third component, and preset conditions satisfied by the first component and the third component when the target is located on the sighting optical axis are used for obtaining a first rotation angle; the second component and the fourth component are used to obtain a second rotation angle;
The binocular camera is configured to rotate the first rotation angle in the first direction with a rotation center of the binocular camera as a center point; the binocular camera is configured to rotate the second rotation angle in the second direction with a rotation center of the binocular camera as a center point.
2. The binocular camera according to claim 1, wherein
the first acquisition unit has a first lens and a first imaging element for acquiring the first spot data, and the second acquisition unit has a second lens and a second imaging element for acquiring the second spot data.
3. The binocular camera according to claim 1, wherein
a plurality of the first light sources are provided and arranged around the first acquisition unit, and the first spot data are spot data obtained based on a plurality of light spots formed by a plurality of first light beams that are emitted by the plurality of first light sources, reflected by the target, and acquired by the first acquisition unit;
a plurality of the second light sources are provided and arranged around the second acquisition unit, and the second spot data are spot data obtained based on a plurality of light spots formed by a plurality of second light beams that are emitted by the plurality of second light sources, reflected by the target, and acquired by the second acquisition unit.
4. The binocular camera according to claim 2, wherein
the optical centers of the first lens and the second lens are located in the same plane, and the optical center of the first lens and the optical center of the second lens are located on the optical axis of the first lens and the optical axis of the second lens, respectively.
5. The binocular camera according to claim 4, wherein
the optical axis of the first lens is parallel to the optical axis of the second lens, and the first lens and the second lens are symmetrically distributed on two sides of the sighting optical axis.
6. The binocular camera according to claim 5, wherein
the first lens and the second lens have the same focal length.
7. The binocular camera according to claim 6, wherein
a first coordinate system is established based on the first imaging element, the transverse axis direction of the first coordinate system is the first direction, the longitudinal axis direction of the first coordinate system is the second direction, a second coordinate system is established based on the second imaging element, the transverse axis direction of the second coordinate system is the first direction, the longitudinal axis direction of the second coordinate system is the second direction, the first direction is the horizontal direction, and the second direction is the vertical direction.
8. The binocular camera according to claim 1, wherein
the overlapping range of the fields of view of the first light source and the second light source is larger than the fields of view of the first acquisition unit and the second acquisition unit.
9. The binocular camera according to claim 7, wherein
the first component is a coordinate value of the first light spot in a transverse axis direction of the first coordinate system, and the second component is a pixel point offset of the first light spot relative to an optical axis of the first lens in the second direction;
the third component is a coordinate value of the second light spot in a transverse axis direction of the second coordinate system, and the fourth component is a pixel point offset of the second light spot in the second direction relative to an optical axis of the second lens.
10. The binocular camera according to claim 7, wherein
when the binocular camera aims at the target, the preset conditions met by the first component and the third component are that the first component and the third component are added to be a preset value, and the preset value is obtained by calibrating the binocular camera when the binocular camera aims at the target;
the first rotation angle is:
The second rotation angle is:
wherein α1 is a parameter value of the first rotation angle, β1 is a parameter value of the second rotation angle, u is a parameter value of a preset width of each pixel point of the first imaging element and the second imaging element in the first direction and the second direction, L1 is a parameter value of a first preset distance from a rotation center of the binocular camera to the same plane in which the first lens and the second lens are located, D1 is a parameter value of a distance from the target to the same plane in which the first lens and the second lens are located, f is a parameter value of a focal length of the first lens and the second lens, K is a parameter value of the preset value, x1 and x2 respectively represent a parameter value of the first component of the first spot data in the first coordinate system and a parameter value of the third component of the second spot data in the second coordinate system, and y1 and y2 respectively represent a parameter value of the second component of the first spot data in the first coordinate system relative to the optical axis of the first lens and a parameter value of the fourth component of the second spot data in the second coordinate system relative to the optical axis of the second lens.
CN202310665249.3A 2023-02-10 2023-02-10 Binocular camera Pending CN116718108A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310665249.3A CN116718108A (en) 2023-02-10 2023-02-10 Binocular camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310098258.9A CN116105600B (en) 2023-02-10 2023-02-10 Aiming target method based on binocular camera, processing device and laser tracker
CN202310665249.3A CN116718108A (en) 2023-02-10 2023-02-10 Binocular camera

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202310098258.9A Division CN116105600B (en) 2023-02-10 2023-02-10 Aiming target method based on binocular camera, processing device and laser tracker

Publications (1)

Publication Number Publication Date
CN116718108A true CN116718108A (en) 2023-09-08

Family

ID=86259467

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202310098258.9A Active CN116105600B (en) 2023-02-10 2023-02-10 Aiming target method based on binocular camera, processing device and laser tracker
CN202310667438.4A Pending CN116718109A (en) 2023-02-10 2023-02-10 Target capturing method based on binocular camera
CN202310665249.3A Pending CN116718108A (en) 2023-02-10 2023-02-10 Binocular camera

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202310098258.9A Active CN116105600B (en) 2023-02-10 2023-02-10 Aiming target method based on binocular camera, processing device and laser tracker
CN202310667438.4A Pending CN116718109A (en) 2023-02-10 2023-02-10 Target capturing method based on binocular camera

Country Status (1)

Country Link
CN (3) CN116105600B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117218211A (en) * 2023-11-09 2023-12-12 广东兆恒智能科技有限公司 Camera calibration device and calibration method

Citations (6)

Publication number Priority date Publication date Assignee Title
CN103959090A (en) * 2011-12-06 2014-07-30 莱卡地球系统公开股份有限公司 Laser tracker with position-sensitive detectors for searching for a target
CN104251696A (en) * 2011-03-14 2014-12-31 法罗技术股份有限公司 Automatic measurement of dimensional data with a laser tracker
US20180203120A1 (en) * 2017-01-13 2018-07-19 Faro Technologies, Inc. Remote control of a laser tracker using a mobile computing device
US20190146089A1 (en) * 2017-11-15 2019-05-16 Faro Technologies, Inc. Retroreflector acquisition in a coordinate measuring device
CN114509005A (en) * 2022-02-25 2022-05-17 深圳市中图仪器股份有限公司 Coordinate measuring device with automatic target identification function and identification method thereof
CN115546318A (en) * 2022-11-23 2022-12-30 中科星图测控技术(合肥)有限公司 Automatic high-speed trajectory calibration method

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
CN103796004B (en) * 2014-02-13 2015-09-30 西安交通大学 A kind of binocular depth cognitive method of initiating structure light
EP3339900B1 (en) * 2016-12-22 2020-08-12 Safran Vectronix AG Observation device having an eye-controlled laser rangefinder
CN106911923B (en) * 2017-02-28 2018-08-31 驭势科技(北京)有限公司 Binocular camera and distance measuring method based on binocular camera
CN106871787B (en) * 2017-04-13 2019-02-22 中国航空工业集团公司北京长城航空测控技术研究所 Large space line scanning imagery method for three-dimensional measurement
CN107560543B (en) * 2017-09-04 2023-08-22 华南理工大学 Binocular stereoscopic vision-based camera optical axis offset correction device and method
CN110376543A (en) * 2018-04-12 2019-10-25 北京凌宇智控科技有限公司 A kind of three dimension location method and system
CN113790689B (en) * 2021-11-18 2022-03-08 深圳市中图仪器股份有限公司 Calibration method of space coordinate measuring instrument

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
CN104251696A (en) * 2011-03-14 2014-12-31 法罗技术股份有限公司 Automatic measurement of dimensional data with a laser tracker
CN103959090A (en) * 2011-12-06 2014-07-30 莱卡地球系统公开股份有限公司 Laser tracker with position-sensitive detectors for searching for a target
US20180203120A1 (en) * 2017-01-13 2018-07-19 Faro Technologies, Inc. Remote control of a laser tracker using a mobile computing device
US20190146089A1 (en) * 2017-11-15 2019-05-16 Faro Technologies, Inc. Retroreflector acquisition in a coordinate measuring device
CN114509005A (en) * 2022-02-25 2022-05-17 深圳市中图仪器股份有限公司 Coordinate measuring device with automatic target identification function and identification method thereof
CN115546318A (en) * 2022-11-23 2022-12-30 中科星图测控技术(合肥)有限公司 Automatic high-speed trajectory calibration method

Also Published As

Publication number Publication date
CN116105600A (en) 2023-05-12
CN116105600B (en) 2023-06-13
CN116718109A (en) 2023-09-08

Similar Documents

Publication Publication Date Title
EP1413850B1 (en) Optical sensor for measuring position and orientation of an object in three dimensions
KR100753885B1 (en) Image obtaining apparatus
US9330324B2 (en) Error compensation in three-dimensional mapping
US8917942B2 (en) Information processing apparatus, information processing method, and program
US7124046B2 (en) Method and apparatus for calibration of camera system, and method of manufacturing camera system
US20090097039A1 (en) 3-Dimensional Shape Measuring Method and Device Thereof
WO2019012770A1 (en) Imaging device and monitoring device
US20150085108A1 (en) Lasergrammetry system and methods
CN111739104A (en) Calibration method and device of laser calibration system and laser calibration system
CN113034612B (en) Calibration device, method and depth camera
CN116105600B (en) Aiming target method based on binocular camera, processing device and laser tracker
KR20150069927A (en) Device, method for calibration of camera and laser range finder
JP3913901B2 (en) Camera internal parameter determination device
WO2014181581A1 (en) Calibration device, calibration system, and imaging device
CN116342710B (en) Calibration method of binocular camera for laser tracker
US20240087167A1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
WO2019058729A1 (en) Stereo camera
CN102401901B (en) Distance measurement system and distance measurement method
So et al. Calibration of a dual-laser triangulation system for assembly line completeness inspection
KR102185329B1 (en) Distortion correction method of 3-d coordinate data using distortion correction device and system therefor
EP3988895A1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
US20220364849A1 (en) Multi-sensor depth mapping
KR100698535B1 (en) Position recognition device and method of mobile robot with tilt correction function
WO2022118513A1 (en) Position/orientation calculation device, position/orientation calculation method, and surveying device
KR20200032442A (en) 3D information generating device and method capable of self-calibration

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination