CN111337013B - Four-linear array CCD-based multi-target point distinguishing and positioning system - Google Patents

Four-linear array CCD-based multi-target point distinguishing and positioning system

Info

Publication number
CN111337013B
CN111337013B
Authority
CN
China
Prior art keywords
linear array
receiving end
point light
linear
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911308933.6A
Other languages
Chinese (zh)
Other versions
CN111337013A (en)
Inventor
屠晓伟
王闯
杨庆华
杨建明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201911308933.6A priority Critical patent/CN111337013B/en
Publication of CN111337013A publication Critical patent/CN111337013A/en
Application granted granted Critical
Publication of CN111337013B publication Critical patent/CN111337013B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a multi-target point distinguishing and positioning system based on a four-linear array CCD, which consists of four fixed-position one-dimensional image acquisition units, a target composed of three LED point light sources, and an embedded system. Each one-dimensional image acquisition unit consists of a cylindrical optical lens and a linear array CCD device whose central axes are perpendicular to each other. A point light source in space is projected through each cylindrical optical lens at the receiving end onto the plane of each linear array CCD, forming a line of light perpendicular to the CCD photosensitive line; its intersection with the CCD gives the projected position, and the three-dimensional position of the point light source is determined from the projected positions on the four CCDs. Without requiring synchronous exposure between the transmitting end and the receiving end, the embedded system calculates the three-dimensional coordinate positions of the three point light sources in space relative to the receiving end sensor. The system offers good stability, good symmetry and consistency among the four axes of the coordinate system, and low cost.

Description

Four-linear array CCD-based multi-target point distinguishing and positioning system
Technical Field
The invention relates to an optical measurement system, in particular to a photoelectric multi-target identification system. It is applied in the technical field of real-time measurement instruments for multiple target points in indoor and outdoor 3D space and in the technical field of photoelectric three-dimensional real-time positioning, and can also be applied to image-guided navigation surgery and to determining the movement position of an intelligent agent in space.
Background
Multi-target point light source identification is common in computer vision: two cameras photograph the same object from different positions, and each point on the object can be distinguished using the epipolar geometry principle, but this approach relies on two-dimensional area array CCD or CMOS sensors. Linear array CCDs are also common in computer vision metrology; compared with area array CCDs, their high resolution and high frame rate give them outstanding advantages in accurate coordinate measurement and dynamic position tracking. However, for a three-dimensional optical positioning measurement system based on linear array CCDs, if two or more LED point light sources are lit simultaneously in the field of view, the system cannot determine which projection corresponds to which LED point light source.
When an optical measurement system based on three linear array CCDs measures several point light sources, the LED point light sources must be lit one by one in sequence, i.e. only one LED point light source is lit at a time within the field of view, so that the linear array CCD measurement system knows which LED point light source the current projection point belongs to. This, however, requires synchronous exposure control with the linear array CCD measurement system, which inevitably increases the complexity and power consumption of the system. Existing three-dimensional optical positioning measurement systems based on linear array CCDs still lack a feasible method for distinguishing and measuring several LED point light sources that are lit simultaneously, so they cannot work normally in that situation. It is therefore necessary to develop the related technology so that a linear array CCD optical measurement system can still work normally when several LED point light sources are lit at the same time. Overcoming this shortcoming of conventional linear array CCD-based three-dimensional optical positioning measurement systems has become a technical problem to be solved.
Disclosure of Invention
To solve the problems of the prior art, the invention aims to overcome the above defects and provides a multi-target point distinguishing and positioning system based on a four-linear array CCD. Without synchronous exposure between the point light sources of the transmitting end target and the receiving end sensor, and using only the projection relation that the three point light sources satisfy on the four linear array CCDs, the system can distinguish three simultaneously lit LED point light sources and measure their three-dimensional coordinates at the same time. It can be used in fields such as image-guided navigation surgery, determination of the movement position of an intelligent agent in space, and indoor position and attitude measurement.
In order to achieve the aim of the invention, the invention adopts the following technical scheme:
A multi-target point distinguishing and positioning system based on a four-linear array CCD comprises a transmitting end target, a receiving end sensor and an embedded system. The receiving end sensor serves as a linear photoelectric sensor receiving end and receives the point light source signals from the transmitting end target. The transmitting end target consists of three point light sources, each formed by an LED luminous point that emits infrared light as the optical signal; the transmitting end target can be mounted on any moving or stationary object, and the lines connecting its three point light sources form a triangle whose three side lengths are all unequal. The receiving end sensor consists of four groups of photoelectric sensors and an embedded system. The four groups of photoelectric sensors comprise four linear array CCD assemblies, four fixed-focus cylindrical optical lenses and a sensor base. Each linear array CCD assembly and its cylindrical optical lens form a one-dimensional imaging unit, with one cylindrical optical lens per linear array CCD assembly. Any point light source in space is projected through the cylindrical optical lenses onto the plane of each linear array CCD assembly, forming four lines of light perpendicular to the photosensitive lines of the linear array CCD assemblies. The linear array CCD1 assembly, linear array CCD2 assembly, linear array CCD3 assembly and linear array CCD4 assembly are arranged to form the linear array CCD assembly system of the receiving end sensor, and the intersections of the four lines of light with the photosensitive surfaces of the linear array CCD1, CCD2, CCD3 and CCD4 assemblies are defined as λ1, λ2, λ3, λ4 respectively. By virtue of the structure of the receiving end sensor, the four intersection points satisfy the relation λ1 + λ4 = λ2 + λ3. The three LED point light sources A1, A2, A3 of the transmitting end target are lit simultaneously; using the relation λ1 + λ4 = λ2 + λ3 that their projections on the four linear array CCD assemblies satisfy, the projection points λ1, λ2, λ3, λ4 corresponding to each of the three LED point light sources are determined. Three groups of four different projection points are thus obtained, and from each group of projection points the coordinates of the corresponding point light source in the receiving end sensor coordinate system are reconstructed. The distances between the three reconstructed points are then calculated and compared with the known distances A1A2, A1A3, A2A3 between the point light sources A1, A2, A3 to establish the matching relation of the three LED point light sources. The embedded system calculates in real time the three-dimensional space coordinate position data of each point light source in the transmitting end target relative to the receiving end sensor, stores the data, and completes the multi-target point distinguishing and positioning task.
As a preferred technical scheme of the invention, the three point light sources A1, A2, A3 are lit at the same time and satisfy the structural relation that A1A2, A1A3, A2A3 are all unequal.
As a preferred technical scheme of the invention, the four projection points λ1, λ2, λ3, λ4 of any point light source on the four linear array CCDs satisfy the relation λ1 + λ4 = λ2 + λ3. According to this relation, the four projection points corresponding to one LED point light source can be distinguished, so that the coordinate value of that LED point light source in the receiving end sensor coordinate system is reconstructed, and the projection points corresponding to each of the three LED point light sources are distinguished.
As a preferred technical scheme of the invention, the three point light sources A1, A2, A3 are lit at the same time and satisfy the structural relation that A1A2, A1A3, A2A3 are all unequal; the coordinate values obtained for the three LED point light sources are then compared with the structural relation of A1A2, A1A3, A2A3 to distinguish the matching relation of the three LED point light sources.
Preferably, the intersection point of the straight lines where the photosensitive lines of the four linear array CCD assemblies are located is the origin of the coordinate system of the receiving end sensor.
As the preferable technical scheme of the invention, four linear array CCD components are distributed on the same plane, and the plane layout is in a cross shape; when the intersection point of the straight lines of the four linear array CCD is made to be the origin of the coordinate system of the receiving end sensor, the linear array CCD1 component is arranged on the positive X-axis half axis of the coordinate system of the receiving end sensor, the linear array CCD2 component is arranged on the positive Y-axis half axis of the coordinate system of the receiving end sensor, the linear array CCD3 component is arranged on the negative X-axis half axis of the coordinate system of the receiving end sensor, and the linear array CCD4 component is arranged on the negative Y-axis half axis of the coordinate system of the receiving end sensor.
Preferably, the distances from the innermost ends of the four linear array CCD assemblies to the origin of the coordinate system of the receiving end sensor are all equal.
Preferably, the focal lengths of the four cylindrical optical lenses mated with the four linear array CCD assemblies are all equal.
Preferably, the lengths of the photosensitive lines of the four linear array CCD assemblies are all equal.
Preferably, no synchronous exposure is required between the point light sources of the transmitting end target and the receiving end sensor.
Compared with the prior art, the invention has the following prominent substantive features and notable advantages:
1. The system uses a photoelectric multi-target recognition scheme to perform photoelectric three-dimensional real-time positioning; without requiring synchronized operation between the transmitting end target and the receiving end sensor, the combination of four one-dimensional image acquisition units identifies several point light source targets that emit light simultaneously, realizing multi-target point distinguishing and positioning based on a four-linear array CCD device;
2. The system has high measurement accuracy, high data measurement efficiency and low delay;
3. The system calculates the three-dimensional coordinate positions of the three point light sources in space relative to the receiving end sensor through the embedded system; it has good working stability, good symmetry and consistency among the four axes of the coordinate system, and low device manufacturing and measurement cost.
Drawings
FIG. 1 is a schematic diagram of the one-dimensional imaging unit formed by a cylindrical optical lens and a linear array CCD according to a preferred embodiment of the present invention.
FIG. 2 is a diagram illustrating multi-target differentiation and positioning according to a preferred embodiment of the present invention.
FIG. 3 is a diagram illustrating multi-target differentiation and positioning according to a preferred embodiment of the present invention.
FIG. 4 is a flowchart of the multi-target point discrimination and localization system according to the preferred embodiment of the present invention.
Detailed Description
The foregoing aspects are further described in conjunction with specific embodiments, and the following detailed description of preferred embodiments of the present invention is provided:
In this embodiment, referring to FIGS. 1 to 4, a multi-target point distinguishing and positioning system based on a four-linear array CCD comprises a transmitting end target 1, a receiving end sensor 2 and an embedded system 3. The receiving end sensor 2 serves as a linear photoelectric sensor receiving end and receives the point light source signals from the transmitting end target 1. The transmitting end target 1 consists of three point light sources, each formed by an LED luminous point 4 that emits infrared light as the optical signal; the transmitting end target 1 can be mounted on any moving or stationary object, and the lines connecting its three point light sources form a triangle whose three side lengths are all unequal. The receiving end sensor 2 consists of four groups of photoelectric sensors and an embedded system 3. The four groups of photoelectric sensors comprise four linear array CCD assemblies, four fixed-focus cylindrical optical lenses 6 and a sensor base. Each linear array CCD assembly and its cylindrical optical lens 6 form a one-dimensional imaging unit, with one cylindrical optical lens 6 per linear array CCD assembly. Any point light source in space is projected through the cylindrical optical lenses 6 onto the plane of each linear array CCD assembly, forming four lines of light perpendicular to the photosensitive lines of the linear array CCD assemblies. The linear array CCD1 assembly 7, linear array CCD2 assembly 8, linear array CCD3 assembly 9 and linear array CCD4 assembly 10 are arranged to form the linear array CCD assembly system of the receiving end sensor 2, and the intersections of the four lines of light with the photosensitive surfaces of the linear array CCD1 assembly 7, linear array CCD2 assembly 8, linear array CCD3 assembly 9 and linear array CCD4 assembly 10 are defined as λ1, λ2, λ3, λ4 respectively. By virtue of the structure of the receiving end sensor 2, the four intersection points satisfy the relation λ1 + λ4 = λ2 + λ3. The three LED point light sources A1, A2, A3 of the transmitting end target 1 are lit simultaneously; using the relation λ1 + λ4 = λ2 + λ3 that their projections on the four linear array CCD assemblies satisfy, the projection points λ1, λ2, λ3, λ4 corresponding to each of the three LED point light sources are determined. Three groups of four different projection points are thus obtained, and from each group of projection points the coordinates of the corresponding point light source in the coordinate system of the receiving end sensor 2 are reconstructed. The distances between the three reconstructed points are calculated and compared with the known distances A1A2, A1A3, A2A3 between the point light sources A1, A2, A3 to distinguish the matching relation of the three LED point light sources. The embedded system 3 calculates in real time the three-dimensional space coordinate position data of each point light source in the transmitting end target 1 relative to the receiving end sensor 2, stores the data, and completes the multi-target point distinguishing and positioning task.
In the present embodiment, a cylindrical optical lens 6 is used to realize projection mapping from three-dimensional space to one-dimensional imaging. According to the basic principle of optics, light from an object point on one side of a cylindrical optical lens 6 will form a linear image on the other side, the linear array CCD and the cylindrical optical lens constituting a one-dimensional imaging unit (One Dimensional Imaging Unit, ODIU), as shown in fig. 1.
In this embodiment, the four linear array CCDs are distributed in the same plane, in a cross-shaped layout. Taking the intersection point of the straight lines on which the photosensitive lines of the four linear array CCDs lie as the origin of the coordinate system of the receiving end sensor 2, the linear array CCD1 assembly 7 is placed on the positive X half-axis, the linear array CCD2 assembly 8 on the positive Y half-axis, the linear array CCD3 assembly 9 on the negative X half-axis, and the linear array CCD4 assembly 10 on the negative Y half-axis of the coordinate system of the receiving end sensor 2. The distance from the innermost end of each of the four linear array CCDs to the origin is 60 mm, the focal length of each cylindrical optical lens 6 is 50 mm, and the length of the photosensitive line of each linear array CCD is 30 mm, as shown in FIG. 2.
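As a minimal sketch (not part of the patent text), the forward projection implied by this geometry can be written as follows; the formulas are obtained by solving the plane equations (1)-(4) listed further below for each projection position, and the axis/sign conventions, constant names and function name are assumptions made for illustration:

```python
# Embodiment constants (mm): 50 mm lens focal length; 75 mm from the sensor
# origin to the midpoint of each 30 mm photosensitive line (60 mm + 15 mm).
F, M = 50.0, 75.0

def project(x, y, z):
    """Projection positions (lambda1..lambda4) of a point light source at (x, y, z)
    in the receiving end sensor coordinate system, obtained by solving the plane
    equations (1)-(4) for each lambda."""
    l1 = (F * x - M * z) / (F - z)   # linear array CCD1 assembly, positive X half axis
    l2 = (F * y - M * z) / (F - z)   # linear array CCD2 assembly, positive Y half axis
    l3 = (F * x + M * z) / (F - z)   # linear array CCD3 assembly, negative X half axis
    l4 = (F * y + M * z) / (F - z)   # linear array CCD4 assembly, negative Y half axis
    return l1, l2, l3, l4
```

Note that l1 + l4 and l2 + l3 both evaluate to F*(x + y)/(F - z), which is the relation λ1 + λ4 = λ2 + λ3 used below to group projections belonging to the same point light source.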
In the present embodiment, when one LED point light source in the transmitting end target 1 emits light, it passes through the four cylindrical optical lenses 6 of the receiving end sensor 2 and forms four intersection projection points λ1, λ2, λ3, λ4 on the photosensitive lines of the linear array CCD1 assembly 7, the linear array CCD2 assembly 8, the linear array CCD3 assembly 9 and the linear array CCD4 assembly 10 respectively, as shown in FIG. 2. According to the structure of the receiving end sensor 2, the four plane equations whose common intersection is that point are listed as follows:
50*x + 0*y - (75 - λ1)*z = 50*λ1   (1)
0*x + 50*y - (75 - λ2)*z = 50*λ2   (2)
50*x + 0*y + (75 + λ3)*z = 50*λ3   (3)
0*x + 50*y + (75 + λ4)*z = 50*λ4   (4)
The coefficient matrix A (the first three columns below) and the augmented matrix B of these four equations are:

| 50   0   -(75 - λ1)   50*λ1 |
|  0  50   -(75 - λ2)   50*λ2 |
| 50   0     75 + λ3    50*λ3 |
|  0  50     75 + λ4    50*λ4 |

Performing elementary row operations on these matrices yields their row-echelon forms.
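The row-reduced matrices themselves are reproduced in the source only as figures; one reduction consistent with equations (1)-(4) (an assumed reconstruction, not necessarily the patent's exact steps) subtracts row 1 from row 3 and row 2 from row 4, eliminating x and y and leaving the two rows

(150 + λ3 - λ1)*z = 50*(λ3 - λ1)
(150 + λ4 - λ2)*z = 50*(λ4 - λ2)

so that the system has rank 3 and a unique solution exactly when these two rows are proportional, i.e. when (λ3 - λ1)*(150 + λ4 - λ2) = (λ4 - λ2)*(150 + λ3 - λ1), which simplifies to λ3 - λ1 = λ4 - λ2.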
From linear algebra, the four plane equations intersect at one point, and the condition for a unique solution is Rank(A) = Rank(B) = 3; therefore the four projection points λ1, λ2, λ3, λ4 of one point light source satisfy λ1 + λ4 = λ2 + λ3.
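For illustration, a minimal reconstruction sketch based on equations (1)-(4) is given below; it solves for (x, y, z) in the least-squares sense and checks the relation above. The use of numpy, the function names and the tolerance value are assumptions for this sketch, not the patent's embedded-system implementation.

```python
import numpy as np

# Embodiment constants (mm): focal length and origin-to-CCD-midline distance.
F, M = 50.0, 75.0

def same_source(l1, l2, l3, l4, tol=0.5):
    """True if the four projections satisfy lambda1 + lambda4 = lambda2 + lambda3
    within a tolerance (the tolerance value is an assumption)."""
    return abs((l1 + l4) - (l2 + l3)) <= tol

def reconstruct_point(l1, l2, l3, l4):
    """Least-squares intersection of the four planes given by equations (1)-(4)."""
    A = np.array([
        [F,   0.0, -(M - l1)],   # plane from linear array CCD1 assembly
        [0.0, F,   -(M - l2)],   # plane from linear array CCD2 assembly
        [F,   0.0,  (M + l3)],   # plane from linear array CCD3 assembly
        [0.0, F,    (M + l4)],   # plane from linear array CCD4 assembly
    ])
    b = F * np.array([l1, l2, l3, l4])
    xyz, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xyz  # (x, y, z) in the receiving end sensor coordinate system
```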
When the three LED point light sources of the transmitting end target 1 emit light simultaneously, they do not need to be lit one by one in a cycle and remain continuously lit, as shown in FIG. 3. The three LED point light sources A1, A2, A3 in the transmitting end target 1 each pass through the four cylindrical optical lenses 6 of the receiving end sensor 2 and form four intersection points on the photosensitive surfaces of the four linear array CCDs, defined as λ1_A1, λ2_A1, λ3_A1, λ4_A1; λ1_A2, λ2_A2, λ3_A2, λ4_A2; λ1_A3, λ2_A3, λ3_A3, λ4_A3. All projection points on the linear array CCD1 assembly 7 are λccd1 = {λ1_A1, λ1_A2, λ1_A3}, all projection points on the linear array CCD2 assembly 8 are λccd2 = {λ2_A1, λ2_A2, λ2_A3}, all projection points on the linear array CCD3 assembly 9 are λccd3 = {λ3_A1, λ3_A2, λ3_A3}, and all projection points on the linear array CCD4 assembly 10 are λccd4 = {λ4_A1, λ4_A2, λ4_A3}.
Each projection point on the linear array CCD1 assembly 7 is added to each projection point on the linear array CCD4 assembly 10 to obtain all sums (λccd1 + λccd4); each projection point on the linear array CCD2 assembly 8 is added to each projection point on the linear array CCD3 assembly 9 to obtain all sums (λccd2 + λccd3).
All possible values of |(λccd1 + λccd4) - (λccd2 + λccd3)| are then calculated, and the three smallest values min{|(λccd1 + λccd4) - (λccd2 + λccd3)|} are selected from these results. From the four projection points corresponding to each of the three smallest values, the coordinates of three markers are reconstructed and denoted C1, C2, C3. The distances C1C2, C1C3, C2C3 are compared with the known A1A2, A1A3, A2A3 to determine the correspondence between the estimated points C1, C2, C3 and the actual points A1, A2, A3, so that the three luminous marker points A1, A2, A3 are successfully distinguished at the receiving end. The system flow is shown in FIG. 4. In this embodiment the distances between the three LED point light sources are calculated and compared with the known distances A1A2, A1A3, A2A3 between the point light sources A1, A2, A3 to distinguish the matching relation of the three LED point light sources; the embedded system 3 calculates in real time the three-dimensional space coordinate position data of each point light source in the transmitting end target 1 relative to the receiving end sensor 2, stores the data, and completes the multi-target point distinguishing and positioning task.
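The grouping and matching procedure just described can be sketched as follows, reusing the reconstruct_point function from the earlier sketch; the exhaustive combination scoring and the permutation search over labels are an illustrative reading of this embodiment, and the names and structure are assumptions rather than the patent's implementation.

```python
import itertools
import numpy as np

def match_and_locate(ccd1, ccd2, ccd3, ccd4, known_dists):
    """ccd1..ccd4: unordered projection positions of the three LEDs on each linear array CCD.
    known_dists: known target distances (A1A2, A1A3, A2A3) in mm.
    Returns the reconstructed coordinates of A1, A2, A3 in the sensor frame."""
    # Score every combination of one projection per CCD by the residual of
    # lambda1 + lambda4 = lambda2 + lambda3 and keep the three best groups.
    scored = sorted(
        ((abs((l1 + l4) - (l2 + l3)), (l1, l2, l3, l4))
         for l1, l2, l3, l4 in itertools.product(ccd1, ccd2, ccd3, ccd4)),
        key=lambda item: item[0],
    )
    groups = [proj for _, proj in scored[:3]]
    C = [reconstruct_point(*proj) for proj in groups]  # estimated points C1, C2, C3

    # Assign labels A1, A2, A3 by comparing the pairwise distances of C1, C2, C3
    # with the known target distances A1A2, A1A3, A2A3.
    best_perm, best_err = None, float("inf")
    for p in itertools.permutations(range(3)):
        err = (abs(np.linalg.norm(C[p[0]] - C[p[1]]) - known_dists[0])
               + abs(np.linalg.norm(C[p[0]] - C[p[2]]) - known_dists[1])
               + abs(np.linalg.norm(C[p[1]] - C[p[2]]) - known_dists[2]))
        if err < best_err:
            best_perm, best_err = p, err
    return [C[i] for i in best_perm]
```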
The multi-target point distinguishing and positioning system based on the four-linear array CCD thus consists of four fixed-position one-dimensional image acquisition units, a target composed of three LED point light sources, and an embedded system. Each one-dimensional image acquisition unit consists of a cylindrical optical lens 6 and a linear array CCD device whose central axes are perpendicular to each other. A point light source in space is projected through each cylindrical optical lens at the receiving end onto the plane of each linear array CCD, forming a line of light perpendicular to the CCD photosensitive line; its intersection with the CCD gives the projected position, and the three-dimensional position of the point light source is determined from the projected positions on the four CCDs. Without requiring synchronous exposure between the transmitting end and the receiving end, the embedded system calculates the three-dimensional coordinate positions of the three point light sources in space relative to the receiving end sensor. The system offers good stability, good symmetry and consistency among the four axes of the coordinate system, and low cost.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the embodiments described above. Various changes, modifications, substitutions, combinations or simplifications made within the spirit and principles of the technical solution of the present invention, provided they conform to the purpose of the invention and do not depart from the technical principle and inventive concept of the four-linear array CCD-based multi-target point distinguishing and positioning system, all fall within the scope of protection of the present invention.

Claims (7)

1. A multi-target point distinguishing and positioning system based on a four-linear array CCD, characterized by comprising a transmitting end target (1), a receiving end sensor (2) and an embedded system (3); the receiving end sensor (2) serves as a linear photoelectric sensor receiving end and receives the point light source signals from the transmitting end target (1); the transmitting end target (1) consists of three point light sources, each formed by an LED luminous point (4) that emits infrared light as the optical signal; the transmitting end target (1) can be mounted on any moving or stationary object, and the lines connecting the three point light sources of the transmitting end target (1) form a triangle whose three side lengths are all unequal;

the receiving end sensor (2) comprises four groups of photoelectric sensors, which comprise four linear array CCD assemblies, four fixed-focus cylindrical optical lenses (6) and a sensor base; each linear array CCD assembly and its cylindrical optical lens (6) form a one-dimensional imaging unit, with one cylindrical optical lens (6) per linear array CCD assembly; any point light source in space is projected through the cylindrical optical lenses (6) onto the plane of each linear array CCD assembly, forming four lines of light perpendicular to the photosensitive lines of the linear array CCD assemblies; the linear array CCD1 assembly (7), linear array CCD2 assembly (8), linear array CCD3 assembly (9) and linear array CCD4 assembly (10) are arranged to form the linear array CCD assembly system of the receiving end sensor (2), and the intersections of the four lines of light with the photosensitive surfaces of the linear array CCD1 assembly (7), linear array CCD2 assembly (8), linear array CCD3 assembly (9) and linear array CCD4 assembly (10) are defined as λ1, λ2, λ3, λ4 respectively; the structure of the receiving end sensor (2) ensures that the four intersection points satisfy the relation λ1 + λ4 = λ2 + λ3;

the three LED point light sources A1, A2, A3 of the transmitting end target (1) are lit simultaneously; using the relation λ1 + λ4 = λ2 + λ3 that their projections on the four linear array CCD assemblies satisfy, the projection points λ1, λ2, λ3, λ4 corresponding to each of the three LED point light sources are determined; three groups of four different projection points are thus obtained, and from each group of projection points the coordinates of the corresponding point light source in the coordinate system of the receiving end sensor (2) are reconstructed; the distances between the three LED point light sources are calculated and compared with the known distances A1A2, A1A3, A2A3 between the point light sources A1, A2, A3 to distinguish the matching relation of the three LED point light sources; the embedded system (3) calculates in real time the three-dimensional space coordinate position data of each point light source in the transmitting end target (1) relative to the receiving end sensor (2), stores the data, and completes the multi-target point distinguishing and positioning task;

the three LED point light sources A1, A2, A3 in the transmitting end target (1) each pass through the four cylindrical optical lenses (6) of the receiving end sensor (2) and form four intersection points on the photosensitive surfaces of the four linear array CCDs, defined as λ1_A1, λ2_A1, λ3_A1, λ4_A1; λ1_A2, λ2_A2, λ3_A2, λ4_A2; λ1_A3, λ2_A3, λ3_A3, λ4_A3; all projection points on the linear array CCD1 assembly (7) are λccd1 = {λ1_A1, λ1_A2, λ1_A3}, all projection points on the linear array CCD2 assembly (8) are λccd2 = {λ2_A1, λ2_A2, λ2_A3}, all projection points on the linear array CCD3 assembly (9) are λccd3 = {λ3_A1, λ3_A2, λ3_A3}, and all projection points on the linear array CCD4 assembly (10) are λccd4 = {λ4_A1, λ4_A2, λ4_A3};

each projection point on the linear array CCD1 assembly (7) is added to each projection point on the linear array CCD4 assembly (10) to obtain all sums (λccd1 + λccd4); each projection point on the linear array CCD2 assembly (8) is added to each projection point on the linear array CCD3 assembly (9) to obtain all sums (λccd2 + λccd3);

all possible values of |(λccd1 + λccd4) - (λccd2 + λccd3)| are calculated, and the three smallest values min{|(λccd1 + λccd4) - (λccd2 + λccd3)|} are selected from these results; from the four projection points corresponding to each of the three smallest values, the coordinates of three markers are reconstructed and denoted C1, C2, C3.
2. The four-line CCD-based multi-target point distinguishing and locating system according to claim 1, wherein: the intersection point of the straight lines of the photosensitive lines of the four linear array CCD assemblies is the origin of the coordinate system of the receiving end sensor (2).
3. The four-line CCD-based multi-target point distinguishing and positioning system of claim 2, wherein: the four linear array CCD components are distributed on the same plane, and the plane layout is in a cross shape; when the intersection point of the straight lines of the four linear array CCDs is made to be the origin of the coordinate system of the receiving end sensor (2), the linear array CCD1 component (7) is arranged on the positive X-axis half axis of the coordinate system of the receiving end sensor (2), the linear array CCD2 component (8) is arranged on the positive Y-axis half axis of the coordinate system of the receiving end sensor (2), the linear array CCD3 component (9) is arranged on the negative X-axis half axis of the coordinate system of the receiving end sensor (2), and the linear array CCD4 component (10) is arranged on the negative Y-axis half axis of the coordinate system of the receiving end sensor (2).
4. The four-line CCD-based multi-target point distinguishing and positioning system of claim 2, wherein: the distances from the innermost ends of the four linear array CCD assemblies to the origin of the coordinate system of the receiving end sensor (2) are equal.
5. The four-line CCD-based multi-target point distinguishing and locating system according to claim 1, wherein: the focal lengths of the four cylindrical optical lenses (6) matched with the four linear array CCD assemblies are all equal.
6. The four-line CCD-based multi-target point distinguishing and locating system according to claim 1, wherein: the lengths of the photosensitive lines of the four linear array CCD assemblies are equal.
7. The four-line CCD-based multi-target point distinguishing and locating system according to claim 1, wherein: synchronous exposure is not needed between the point light source of the transmitting end target (1) and the receiving end sensor (2).
CN201911308933.6A 2019-12-18 2019-12-18 Four-linear array CCD-based multi-target point distinguishing and positioning system Active CN111337013B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911308933.6A CN111337013B (en) 2019-12-18 2019-12-18 Four-linear array CCD-based multi-target point distinguishing and positioning system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911308933.6A CN111337013B (en) 2019-12-18 2019-12-18 Four-linear array CCD-based multi-target point distinguishing and positioning system

Publications (2)

Publication Number Publication Date
CN111337013A CN111337013A (en) 2020-06-26
CN111337013B true CN111337013B (en) 2023-05-16

Family

ID=71181348

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911308933.6A Active CN111337013B (en) 2019-12-18 2019-12-18 Four-linear array CCD-based multi-target point distinguishing and positioning system

Country Status (1)

Country Link
CN (1) CN111337013B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112710234A (en) * 2020-12-17 2021-04-27 中国航空工业集团公司北京长城航空测控技术研究所 Three-dimensional dynamic measuring device and measuring method based on linear array and area array
CN114111601B (en) * 2021-12-07 2024-01-30 合肥工业大学智能制造技术研究院 Method for detecting position offset of assembly hole by utilizing linear array CCD technology

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1431628A (en) * 2003-02-14 2003-07-23 清华大学 3D real time positioning method based on linear CCD and its system
CN203719535U (en) * 2013-09-18 2014-07-16 赵伟东 Positioning system for multiple CCD (charge-coupled device) large-scene indicating target
CN103983189A (en) * 2014-05-16 2014-08-13 哈尔滨工业大学 Horizontal position measuring method based on secondary platform linear array CCDs
CN104819718A (en) * 2015-04-09 2015-08-05 上海大学 3D photoelectric sensing localization system
CN110109056A (en) * 2019-04-24 2019-08-09 广州市慧建科技有限公司 A kind of multiple target laser orientation system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002104771A (en) * 2000-07-25 2002-04-10 Inst Of Physical & Chemical Res Container position detector

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1431628A (en) * 2003-02-14 2003-07-23 清华大学 3D real time positioning method based on linear CCD and its system
CN203719535U (en) * 2013-09-18 2014-07-16 赵伟东 Positioning system for multiple CCD (charge-coupled device) large-scene indicating target
CN103983189A (en) * 2014-05-16 2014-08-13 哈尔滨工业大学 Horizontal position measuring method based on secondary platform linear array CCDs
CN104819718A (en) * 2015-04-09 2015-08-05 上海大学 3D photoelectric sensing localization system
CN110109056A (en) * 2019-04-24 2019-08-09 广州市慧建科技有限公司 A kind of multiple target laser orientation system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chuang Wang et al., "Multiple Targets Identification in the Linear CCD Measurement System," 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC 2020), 2020-05-04, pp. 679-683 *

Also Published As

Publication number Publication date
CN111337013A (en) 2020-06-26

Similar Documents

Publication Publication Date Title
US9967545B2 (en) System and method of acquiring three-dimensional coordinates using multiple coordinate measurment devices
CN104634276B (en) Three-dimension measuring system, capture apparatus and method, depth computing method and equipment
CN105004324B (en) A kind of monocular vision sensor with range of triangle function
US8339616B2 (en) Method and apparatus for high-speed unconstrained three-dimensional digitalization
CN102034238B (en) Multi-camera system calibrating method based on optical imaging probe and visual graph structure
US20180180408A1 (en) Multi-line array laser three-dimensional scanning system, and multi-line array laser three-dimensional scanning method
CN101901501B (en) Method for generating laser color cloud picture
WO2017041418A1 (en) Multi-line array laser three-dimensional scanning system, and multi-line array laser three-dimensional scanning method
CN111492265A (en) Multi-resolution, simultaneous localization and mapping based on 3D lidar measurements
CN111964694B (en) Laser range finder calibration method for three-dimensional measurement
CN106595519B (en) A kind of flexible 3 D contour measuring method and device based on laser MEMS projection
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
CN110044300A (en) Amphibious 3D vision detection device and detection method based on laser
CN105066909A (en) Hand-held multi-laser-stripe quick three-dimensional measuring method
CN102155923A (en) Splicing measuring method and system based on three-dimensional target
CN112541946A (en) Real-time pose detection method of mechanical arm based on perspective multi-point projection
CN111337013B (en) Four-linear array CCD-based multi-target point distinguishing and positioning system
CN102438111A (en) Three-dimensional measurement chip and system based on double-array image sensor
US20190339071A1 (en) Marker, and Posture Estimation Method and Position and Posture Estimation Method Using Marker
Shi et al. 3D reconstruction framework via combining one 3D scanner and multiple stereo trackers
CN202406199U (en) Three-dimensional measure chip and system based on double-array image sensor
Yamauchi et al. Calibration of a structured light system by observing planar object from unknown viewpoints
CN110146032A (en) Synthetic aperture camera calibration method based on optical field distribution
CN114170321A (en) Camera self-calibration method and system based on distance measurement
CN102401901B (en) Distance measurement system and distance measurement method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant