CN114862954B - Method, system, computing device, and medium for aligning objects - Google Patents


Info

Publication number
CN114862954B
CN114862954B (Application CN202210780975.5A)
Authority
CN
China
Prior art keywords
optical path
image
current
determining
photosensitive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210780975.5A
Other languages
Chinese (zh)
Other versions
CN114862954A (en)
Inventor
宋小飞
张丽
何杭
赵忠锐
王欣圆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian Youxinguang Technology Co ltd
Wuhan Qianxi Technology Co ltd
Original Assignee
Dalian Youxun Technology Co ltd
Wuhan Qianxi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian Youxun Technology Co ltd and Wuhan Qianxi Technology Co ltd
Priority claimed from application CN202210780975.5A
Publication of CN114862954A
Application granted
Publication of CN114862954B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 15/00: Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G01C 15/002: Active optical surveying means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

Abstract

Embodiments of the present disclosure relate to methods, systems, computing devices, and media for aligning objects. In the method, at a control device, a first target position where a first object is located is determined according to first photosensitive information from a photosensitive device; the pickup device is controlled to pick up the first object and rotate so that the second surface of the first object lies on a reflection light path of the 45-degree prism; the pickup device is moved according to second photosensitive information so that the first object reaches a second target position at which the second optical path equals the first optical path; and a third target position at which the second object is located is determined based on at least the first optical path and third photosensitive information about the second object, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position. In this way, the present disclosure can precisely align the first object and the second object.

Description

Method, system, computing device, and medium for aligning objects
Technical Field
Embodiments of the present disclosure relate generally to the field of high precision manufacturing, and more particularly, to a method, system, computing device, and medium for aligning objects.
Background
In the field of high-precision manufacturing, tiny objects (e.g., parts) must be precisely aligned before they can be mounted, so precise alignment between objects is a key element of the process. However, current manufacturing equipment has difficulty aligning two objects (for example, but not limited to, two optical elements or two elements to be coupled), at least in the height direction, so that one object can be interfaced or mounted with the other. This leaves the alignment insufficiently accurate and adversely affects the manufacturing process.
In summary, in current high-precision manufacturing processes, aligning two objects, for example, achieving position matching at least in the height direction, cannot yet be done with sufficient precision.
Disclosure of Invention
In view of the above, the present disclosure provides a method, system, computing device, and storage medium for aligning objects, which can significantly improve the accuracy of alignment of a first object and a second object.
According to a first aspect of the present disclosure, a method for aligning an object is provided. The method includes: determining, at the control device, a first target position at which the first object is located based on first photosensitive information from the photosensitive device, the first photosensitive information being photosensitive information about a first surface of the first object collected by the photosensitive device via a transmission optical path of the 45-degree prism; controlling the pickup device to pick up the first object and rotate so that a second surface of the first object, opposite to the first surface, is positioned on a reflection optical path of the 45-degree prism, so that the photosensitive device collects second photosensitive information about the second surface via the reflection optical path; moving the pickup device according to the second photosensitive information so that the first object reaches a second target position at which a second optical path is equal to a first optical path, the first optical path being the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position, the second optical path being the optical path from the photosensitive device to the second surface of the first object via the reflection optical path; and determining a third target position at which the second object is located based on at least the first optical path and third photosensitive information about the second object, such that the second surface of the first object at a fourth target position mates with the first surface of the second object at the third target position, the third photosensitive information being photosensitive information about the first surface of the second object collected by the photosensitive device via the transmission optical path.
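Read as a control procedure, the four steps of the first aspect can be sketched as follows. This is an illustrative skeleton only: the `ctrl` facade and all of its method names are invented here, standing in for the photosensitive device, the pickup device, and the stage drives described later.

```python
def align_objects(ctrl):
    """Skeleton of the four claimed steps. `ctrl` and every method on it are
    hypothetical stand-ins; this is an illustrative sketch, not the patent's
    implementation."""
    # Step 1: locate the first object via the transmission optical path of the
    # 45-degree prism, and record the first optical path as the reference.
    first_target = ctrl.find_position_by_transmission(obj="first", surface="first")
    first_path = ctrl.measure_optical_path(via="transmission")

    # Step 2: pick up the first object and rotate it so that its second
    # surface lies on the reflection optical path.
    ctrl.pick_up_and_rotate(obj="first")

    # Step 3: move the pickup until the reflection-path length equals the
    # reference (a real controller would compare within a tolerance).
    while ctrl.measure_optical_path(via="reflection") != first_path:
        ctrl.nudge_pickup()
    second_target = ctrl.current_position(obj="first")

    # Step 4: place the second object so its first surface sits at the same
    # optical path, then rotate the first object back to mate the surfaces.
    third_target = ctrl.find_position_by_transmission(obj="second", surface="first")
    ctrl.rotate_back_and_mate()
    return first_target, second_target, third_target
```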
In some embodiments, determining the third target location at which the second object is located comprises: determining a third target position where the second object is located according to the collected third photosensitive information so as to enable a third optical path to be equal to the first optical path, wherein the third optical path is an optical path from the photosensitive device to the first surface of the second object through the transmission optical path; and controlling the pickup device to rotate the first object such that the second surface of the first object rotated to the fourth target position is mated with the first surface of the second object at the third target position.
In some embodiments, the photosensitive device comprises a camera device, and determining the first target location at which the first object is located comprises: determining whether a current first image acquired by the camera device satisfies a predetermined condition, wherein the current first image is based on the first photosensitive information acquired by the camera device; in response to determining that the current first image does not satisfy the predetermined condition, moving the first object in accordance with the current first image; and in response to determining that the current first image satisfies the predetermined condition, determining the current location of the first object as the first target location.
In some embodiments, moving the pickup device according to the second photosensitive information so that the first object reaches the second target position comprises: in response to determining that a current second image acquired by the camera device does not satisfy the predetermined condition, moving the pickup device, wherein the current second image is based on the second photosensitive information acquired by the camera device; and in response to determining that the current second image acquired by the camera device satisfies the predetermined condition, stopping moving the pickup device and determining the position of the first object at which the pickup device stops moving as the second target position.
In some embodiments, in response to determining that the current first image does not satisfy the predetermined condition, moving the first object according to the current first image comprises: moving the first object closer to or farther from the camera device in response to determining that the current first image does not satisfy the predetermined condition regarding the sharpness state; and in response to determining that the current second image does not satisfy the predetermined condition, moving the pickup device comprises: in response to determining that the current second image does not satisfy the predetermined condition regarding the sharpness state, moving the pickup device so that the first object moves closer to or farther from the 45-degree prism.
In some embodiments, determining whether the current first image captured by the camera satisfies the predetermined condition comprises: identifying a background region and a contour of a first object from a current first image; obtaining the contrast between the outline and a background area adjacent to the outline; determining whether the contrast is greater than or equal to a predetermined contrast threshold; and responsive to determining that the contrast is greater than or equal to the predetermined contrast threshold, determining that the current first image satisfies the predetermined condition.
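A minimal sketch of such a contrast test, assuming the object's contour has already been segmented into a boolean mask. The threshold value, the function name, and the Michelson-style contrast measure are illustrative choices, not taken from the patent.

```python
import numpy as np

CONTRAST_THRESHOLD = 0.4  # hypothetical value; the patent leaves the threshold unspecified

def meets_predetermined_condition(image: np.ndarray, mask: np.ndarray) -> bool:
    """Check whether the object's contour stands out sharply from the background.

    `image` is a grayscale frame from the camera device; `mask` marks pixels
    belonging to the object (the contour interior), the rest being background.
    """
    object_pixels = image[mask].astype(float)
    background_pixels = image[~mask].astype(float)
    i_obj, i_bg = object_pixels.mean(), background_pixels.mean()
    # Michelson-style contrast between the outline region and adjacent background
    contrast = abs(i_obj - i_bg) / (i_obj + i_bg + 1e-9)
    return bool(contrast >= CONTRAST_THRESHOLD)
```

In practice the mask would come from a contour-detection step on the current first image; only the thresholding logic is shown here.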
In some embodiments, moving the pickup device comprises: controlling the pickup portion of the pickup device to extend or retract so that the first object moves closer to or farther from the 45-degree prism.
In some embodiments, the photosensitive device comprises a laser measuring instrument, and making the second optical path equal to the first optical path comprises: acquiring the first optical path via the transmission optical path using the laser measuring instrument; acquiring the second optical path via the reflection optical path using the laser measuring instrument; determining whether the second optical path is equal to the first optical path; in response to determining that the second optical path is not equal to the first optical path, moving the pickup device so that the first object moves closer to or farther from the 45-degree prism; and in response to determining that the second optical path is equal to the first optical path, stopping moving the pickup device.
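This measure-compare-move loop can be sketched as follows. The three callables, the step size, and the stopping tolerance are all hypothetical, since the patent does not specify them.

```python
TOLERANCE_UM = 1.0  # hypothetical stopping tolerance, in micrometres

def equalize_optical_paths(read_first_path, read_second_path, move_pickup,
                           step_um=5.0, max_iters=1000):
    """Move the pickup device until the second (reflection-path) optical path
    equals the first (transmission-path) optical path.

    The two readers return optical path lengths from the laser measuring
    instrument; move_pickup(delta) moves the first object toward the
    45-degree prism for positive delta (shortening the reflection path)
    and away from it for negative delta. All three are stand-ins.
    """
    first = read_first_path()  # reference optical path at the first target position
    for _ in range(max_iters):
        second = read_second_path()
        error = second - first
        if abs(error) <= TOLERANCE_UM:
            return second  # paths equal: the second target position is reached
        # if the reflection path is too long, step toward the prism, else away
        move_pickup(step_um if error > 0 else -step_um)
    raise RuntimeError("failed to equalize the optical paths")
```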
In some embodiments, after determining the third target position at which the second object is located based on the acquired third photosensitive information, the method for aligning objects further comprises: moving the second object in the normal plane of the transmission light path according to the third photosensitive information so that the third photosensitive information matches target photosensitive information, wherein the target photosensitive information comprises at least one of the first photosensitive information and the second photosensitive information.
In some embodiments, moving the second object within the normal plane of the transmission light path according to the third photosensitive information so that the third photosensitive information matches the target photosensitive information comprises at least one of: moving the second object in the normal plane of the transmission light path according to a third image so that the third image matches a target image, wherein the third image is based on the third photosensitive information acquired by the camera device, the target image comprises at least one of a first image and a second image, the first image is based on the first photosensitive information acquired by the camera device, and the second image is based on the second photosensitive information acquired by the camera device; and constructing a first three-dimensional model based on the first photosensitive information acquired by the laser measuring instrument, a second three-dimensional model based on the second photosensitive information acquired by the laser measuring instrument, and a third three-dimensional model based on the third photosensitive information acquired by the laser measuring instrument, and moving the second object in the normal plane of the transmission light path according to the third three-dimensional model so that the third three-dimensional model matches a target three-dimensional model, the target three-dimensional model comprising at least one of the first three-dimensional model and the second three-dimensional model.
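One way to realize the three-dimensional-model matching, under the simplifying assumption that each model is reduced to a height map sampled on a regular grid, is a brute-force search for the in-plane offset that best matches the target. The function and its parameters are illustrative, not the patent's algorithm.

```python
import numpy as np

def best_inplane_offset(third_model: np.ndarray, target_model: np.ndarray,
                        search_radius: int = 3) -> tuple:
    """Find the (dx, dy) grid shift of `third_model` that best matches
    `target_model` by mean-squared height error.

    Both models are height maps (2-D arrays of surface heights from the
    laser scans). Small integer shifts are searched exhaustively.
    """
    best, best_err = (0, 0), float("inf")
    for dx in range(-search_radius, search_radius + 1):
        for dy in range(-search_radius, search_radius + 1):
            shifted = np.roll(np.roll(third_model, dx, axis=0), dy, axis=1)
            err = float(np.mean((shifted - target_model) ** 2))
            if err < best_err:
                best, best_err = (dx, dy), err
    return best
```

The controller would then translate the second object by the negated offset within the normal plane of the transmission light path.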
According to a second aspect of the present disclosure, a computing device is provided. The computing device includes: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect of the disclosure.
According to a third aspect of the present disclosure, a computer-readable storage medium is provided. The computer readable storage medium has stored thereon a computer program which, when executed by a machine, implements the method of the first aspect of the disclosure.
According to a fourth aspect of the present disclosure, a system for aligning an object is provided. The system for aligning an object comprises: a first 45-degree prism having a transmission optical path and a reflection optical path; the photosensitive device is used for collecting first photosensitive information about a first surface of a first object through a transmission light path and collecting second photosensitive information about a second surface of the first object through a reflection light path, wherein the second surface is opposite to the first surface; a pickup device for picking up the first object and rotating so that the second surface of the first object is on the reflected light path, so that the photosensitive device collects second photosensitive information; and a control device configured to perform the steps of the method of the first aspect of the disclosure.
In some embodiments, the pick-up device comprises: and the pick-up part can be lengthened or shortened so that the first object on the reflection optical path is close to or far away from the first 45-degree prism.
In some embodiments, the pick-up device further comprises: a rotating shaft; and a rotating arm that rotates about the rotating shaft in a plane perpendicular to the 45-degree face of the first 45-degree prism, the pickup portion being arranged on the rotating arm.
In some embodiments, the system for aligning an object further comprises: and the 45-degree surface of the second 45-degree prism is attached to the 45-degree surface of the first 45-degree prism.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages and aspects of embodiments of the present disclosure will become more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings. In the drawings, like or similar reference numbers indicate like or similar elements.
Fig. 1 shows a schematic structural diagram of a system for aligning objects of an embodiment of the present disclosure.
Fig. 2 shows a side view of a prism group of an embodiment of the present disclosure.
Fig. 3 shows a flow diagram of a method for aligning objects of an embodiment of the present disclosure.
Fig. 4 shows a schematic diagram of a photosensitive device of an embodiment of the present disclosure collecting first photosensitive information.
FIG. 5 illustrates a flow chart of a method of determining a first target location at which a first object is located according to an embodiment of the disclosure.
Fig. 6 illustrates a flowchart of a method of determining whether a current first image captured by a camera device satisfies a predetermined condition according to an embodiment of the disclosure.
Fig. 7 illustrates a flowchart of a method of moving a pickup device according to second light-sensing information according to an embodiment of the present disclosure.
Fig. 8 shows a schematic diagram of moving a pickup according to second photosensitive information according to an embodiment of the present disclosure.
Fig. 9 illustrates a flow chart of a method of mating a first object with a second object of an embodiment of the present disclosure.
FIG. 10 illustrates a schematic diagram of determining a third target position at which a second object is located according to an embodiment of the disclosure.
Fig. 11 illustrates a schematic diagram of a first object being mated with a second object of an embodiment of the disclosure.
Fig. 12 illustrates a flow chart of a method of mating a first object with a second object of an embodiment of the present disclosure.
Fig. 13 illustrates a flow chart of a method for aligning objects of an embodiment of the present disclosure.
FIG. 14 shows a schematic block diagram of an example electronic device that may be used to implement the method for aligning objects of embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of embodiments of the present disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The term "include" and variations thereof as used herein is meant to be inclusive in an open-ended manner, i.e., "including but not limited to". Unless specifically stated otherwise, the term "or" means "and/or". The term "based on" means "based at least in part on". The terms "one example embodiment" and "one embodiment" mean "at least one example embodiment". The term "another embodiment" means "at least one additional embodiment". The terms "first," "second," and the like may refer to different or the same object. Other explicit and implicit definitions are also possible below.
As described above, in a high-precision manufacturing process, in order to align two objects, for example, to achieve position matching at least in the height direction, there are many difficulties, resulting in insufficient accuracy of alignment, which adversely affects the manufacturing process.
To address, at least in part, one or more of the above issues and other potential issues, an example embodiment of the present disclosure proposes a scheme for aligning objects. In this scheme, a first target position at which a first object is located is determined according to first photosensitive information from the photosensitive device, the first photosensitive information being photosensitive information about the first surface of the first object collected by the photosensitive device via a transmission optical path of the 45-degree prism. The pickup device is then controlled to pick up the first object and rotate it to a transition position so that the second surface of the first object, which is opposite to the first surface, lies on a reflection optical path of the 45-degree prism, allowing the photosensitive device to collect second photosensitive information about the second surface via the reflection optical path. Next, the pickup device is moved according to the second photosensitive information until the second optical path equals the first optical path, at which point the first object has reached the second target position; the second optical path is the optical path from the photosensitive device to the second surface of the first object via the reflection optical path. Finally, a third target position at which the second object is located is determined based on at least the first optical path and third photosensitive information about the second object, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position; the third photosensitive information is photosensitive information about the first surface of the second object collected by the photosensitive device via the transmission optical path.
Because the third optical path (the optical path from the photosensitive device to the first surface of the second object via the transmission optical path) and the second optical path are both equal to the first optical path, when the pickup device is controlled to rotate the first object in the reverse direction by the target angle (i.e., when the first object reaches the fourth target position), the plane of the second surface of the first object is coplanar with the plane of the first surface of the second object. The target angle is the angle through which the pickup device rotates to the transition position after picking up the first object at the first target position. Based on this scheme, the second surface of the first object can be accurately mated with the first surface of the second object, which facilitates high-precision manufacturing.
Fig. 1 shows a schematic structural diagram of a system 100 for aligning objects of an embodiment of the present disclosure. The system 100 includes a first 45-degree prism 102, a photosensitive device 104, a pickup device 106, and a control device (not shown).
In some embodiments, the system 100 further includes a base 114, a stage 116, a mounting bracket 118, and a display device (not shown). The stage 116 is disposed on the base 114, and the control device may drive the stage 116 to move in at least one of the following ways: translation along the x-axis, translation along the y-axis, lifting along the z-axis, and rotation in the x-y plane. For convenience of illustration, the base 114 is rectangular, with the x-axis direction parallel to the long side of the base 114 and the y-axis direction parallel to the short side of the base 114; when the system 100 is placed horizontally, the x-y plane is horizontal. The stage 116 carries the object and moves it accordingly under the drive of the control device. The object-bearing surface of the stage 116 is parallel to the x-y plane.
In some embodiments, the number of stages 116 is two, and the two stages 116 respectively carry the first object O1 and the second object O2. The first object O1 and the second object O2 are two objects that need to be aligned.
The mounting bracket 118 includes, for example, a vertical arm 120 and a cross arm 122, where the vertical arm 120 is disposed on the base 114 and supports the cross arm 122.
The first 45-degree prism 102 has a transmission optical path and a reflection optical path. For example, the first 45-degree prism 102 has a 45-degree face 112 on which an optical coating is disposed. The 45-degree face 112 transmits part of the incident light and reflects the rest, forming the two paths: light entering the first 45-degree prism 102 and transmitted through the 45-degree face 112 follows the transmission optical path, while light entering the prism and reflected by the 45-degree face 112 follows the reflection optical path. In some embodiments, the thickness of the optical coating is less than or equal to 1 micron, which effectively reduces optical path error and improves the positional alignment accuracy of the object.
In some embodiments, the first 45 degree prism 102 is disposed on the vertical arm 120 and extends in a direction parallel to the x-y plane. In this state, the transmitted light path is perpendicular to the x-y plane.
In some embodiments, the system 100 also includes a second 45-degree prism 136. The 45-degree face of the second 45-degree prism 136 is attached to the 45-degree face 112 of the first 45-degree prism 102, and the two prisms form a prism group 138. Fig. 2 shows a side view of the prism group 138 of an embodiment of the present disclosure. The arrow LI represents incident light, the arrow LT represents transmitted light formed via the transmission light path, and the arrow LR represents reflected light formed via the reflection light path. The prism group 138 is square in cross-section. Via the prism group 138, the optical path length LA (not shown) of light traveling in the prism medium via the transmission optical path equals the optical path length LB (not shown) of light traveling in the prism medium via the reflection optical path: LA = LC + LD and LB = LC + LE, and since LD = LE, LA = LB. This improves the accuracy of comparing the optical path via the transmission optical path with the optical path via the reflection optical path.
The photosensitive device 104 is used, for example, for collecting first photosensitive information about a first surface of the first object via a transmission optical path, and collecting second photosensitive information about a second surface of the first object via a reflection optical path, the second surface being an opposite surface to the first surface. For example, the photosensitive device 104 is disposed on the cross arm 122, on a common optical path of the transmission optical path and the reflection optical path, and faces the first 45-degree prism 102. For example, when the first object O1 is placed on the object carrying surface of the stage 116, the surface facing the photosensitive device 104 is a first surface, and the surface opposite to the first surface (i.e., the surface facing the object carrying surface) is a second surface.
In some embodiments, the photosensitive device 104 comprises an imaging device. The camera device may for example acquire a current first image of the first surface of the first object via a transmission optical path, or a current second image of the second surface of the first object via a reflection optical path, or a current third image of the first surface of the second object via a transmission optical path. The display device may, for example, display at least one of a current first image, a current second image, and a current third image.
In some embodiments, the photosensitive device 104 comprises a laser measuring instrument. The laser measuring instrument may, for example, acquire a first optical path of light to the first surface of the first object via the transmission optical path, a second optical path of light to the second surface of the first object via the reflection optical path, or a third optical path of light to the first surface of the second object via the transmission optical path. The laser measuring instrument may also scan the first surface of the first object via the transmission optical path to obtain first scan data, scan the second surface of the first object via the reflection optical path to obtain second scan data, or scan the first surface of the second object via the transmission optical path to obtain third scan data, so that the control device constructs a first three-dimensional model corresponding to the first surface of the first object based on the first scan data, a second three-dimensional model corresponding to the second surface of the first object based on the second scan data, and a third three-dimensional model corresponding to the first surface of the second object based on the third scan data. The display device may, for example, display at least one of the first three-dimensional model, the second three-dimensional model, and the third three-dimensional model.
As for the pickup device 106, it picks up the first object and rotates so that the second surface of the first object lies on the reflection light path of the 45-degree prism, allowing the photosensitive device to collect the second photosensitive information. In some embodiments, the pickup device 106 includes a base 124, a rotating shaft 126, a rotating arm 128, and a retractable pickup member 130. The base 124 is disposed on the base 114, the rotating arm 128 is connected to the base 124 through the rotating shaft 126, and the retractable pickup member 130 is disposed on the rotating arm 128. The axis of the rotating shaft 126 is parallel to the 45-degree face 112, and the rotating arm 128 rotates in a plane perpendicular to the 45-degree face 112.
In some embodiments, retractable pick-up member 130 includes, for example, a telescoping portion 134 and a pick-up portion 132. The telescopic portion 134 can be extended or shortened at least along a direction perpendicular to the rotating arm 128 to bring the pickup portion 132 away from or close to the rotating arm 128. The pickup 132 may be a suction nozzle or a mechanical gripper, for example.
Under the control of the control device, the pickup device 106 picks up the first object from the stage 116 by using the pickup portion 132, and drives the first object to move by using the rotation of the rotating arm 128, so that the second surface of the first object is located on the reflected light path of the 45-degree prism, so that the photosensitive device collects the second photosensitive information. For example, the length of the rotary arm 128 matches the height of the first 45-degree prism 102, and when the rotary arm 128 rotates to be perpendicular to the x-y plane, the first object is located at a height matching the first 45-degree prism 102, which is suitable for the photosensitive device to collect the second photosensitive information of the second surface of the first object via the reflected light path.
The control device is configured, for example, to perform the steps of the method for aligning objects of embodiments of the present disclosure. The control device may have one or more processing units, including special-purpose processing units such as a GPU (Graphics Processing Unit), an FPGA (Field Programmable Gate Array), or an ASIC (Application Specific Integrated Circuit), and general-purpose processing units such as a CPU (Central Processing Unit). In addition, one or more virtual machines may run on each control device. The control device drives the pickup device 106 and the stage 116, for example, by means of a transmission mechanism.
Fig. 3 shows a flow diagram of a method 300 for aligning objects of an embodiment of the present disclosure. The method 300 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 300 may also include additional steps not shown and/or may omit steps shown, as the scope of the disclosure is not limited in this respect.
At step 302, the control device determines a first target position at which the first object is located based on first photosensitive information from the photosensitive device. The first photosensitive information is photosensitive information about the first surface of the first object collected by the photosensitive device via a transmission optical path of the 45-degree prism.
Fig. 4 shows a schematic diagram of a photosensitive device of an embodiment of the present disclosure collecting first photosensitive information. Taking the case where the photosensitive device is a camera device as an example, after the first object O1 is placed on the object-bearing surface of the object stage 116, the control device controls the camera device to capture a current first image of the first surface S1 of the first object O1 via the transmission optical path of the first 45-degree prism 102.
In some embodiments, determining the first target position at which the first object is located includes, for example: the control device determines whether a current first image acquired by the camera device satisfies a predetermined condition; if the current first image does not satisfy the predetermined condition, the control device moves the first object according to the current first image; if the current first image satisfies the predetermined condition, the control device determines the current position of the first object as the first target position. The method 500 for determining the first target position of the first object will be described in detail with reference to Fig. 5 and is not repeated here.
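This determine-move-repeat procedure can be sketched as a simple feedback loop. The following is a hypothetical sketch, not the patent's implementation: `capture_image`, `image_is_sharp`, and `move_stage_z` stand in for the camera and stage drivers, which the patent does not specify.

```python
def find_first_target_position(capture_image, image_is_sharp, move_stage_z,
                               step=0.01, max_iters=200):
    """Move the stage until the captured image of the first surface is sharp.

    capture_image():     grabs the current first image via the transmission path.
    image_is_sharp(img): the predetermined condition (e.g. a contrast check).
    move_stage_z(dz):    raises/lowers the stage along the z axis.
    Returns the number of moves made before the condition was satisfied.
    """
    for i in range(max_iters):
        img = capture_image()
        if image_is_sharp(img):
            return i  # the current stage position is the first target position
        move_stage_z(step)  # a real system would search in both directions
    raise RuntimeError("no sharp image found within max_iters")
```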
At step 304, the control device controls the pickup device to pick up the first object and rotate so that the second surface of the first object lies on the reflected optical path of the 45-degree prism, allowing the photosensitive device to collect second photosensitive information about the second surface of the first object via the reflected optical path. The second surface is opposite to the first surface. For example, the control device drives the rotating arm 128 to rotate until it is perpendicular to the x-y plane, and then controls the camera device to acquire a current second image of the second surface of the first object. The present disclosure refers to the position of the first object at this time as the "transition position", and refers to the angle through which the pickup device rotates from the first target position to the transition position as the "target angle".
At step 306, the control device moves the pickup device in accordance with the second photosensitive information so that the first object reaches a second target position at which the second optical path is equal to the first optical path. The first optical path is the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position. The second optical path is the optical path from the photosensitive device to the second surface of the first object via the reflected optical path.
Moving the pickup device according to the second photosensitive information includes, for example: the control device determines whether the current second image acquired by the camera device satisfies a predetermined condition; if the current second image does not satisfy the predetermined condition, the control device moves the pickup device; if the current second image satisfies the predetermined condition, the control device stops moving the pickup device and determines the current position of the first object, at which the pickup device stopped moving, as the second target position. The method 700 for moving the pickup device according to the second photosensitive information will be described in detail below with reference to Fig. 7 and is not repeated here.
At step 308, the control device determines a third target position at which the second object is located based on at least the first optical path and third photosensitive information for the second object, such that the second surface of the first object at a fourth target position mates with the first surface of the second object at the third target position. The third photosensitive information is photosensitive information about the first surface of the second object collected by the photosensitive device via the transmission optical path.
The method for matching the first object with the second object includes, for example: the control device determines a third target position where the second object is located according to third photosensitive information from the photosensitive device; and controlling the pick-up device to rotate so as to drive the first object to move to the fourth target position, so that the second surface of the first object at the fourth target position is matched with the first surface of the second object at the third target position. The method 900 for matching the first object with the second object will be described in detail with reference to fig. 9, and will not be described herein again.
In the above solution, when the third optical path and the second optical path are both equal to the first optical path, controlling the pickup device to rotate the first object in reverse by the target angle (i.e., moving the first object to the fourth target position) places the plane of the second surface of the first object coplanar with the plane of the first surface of the second object. The target angle is the angle through which the pickup device rotates to the transition position after picking up the first object at the first target position. Based on this scheme, the second surface of the first object can be accurately mated with the first surface of the second object, which facilitates high-precision manufacturing.
FIG. 5 illustrates a flow chart of a method 500 of determining a first target location at which a first object is located in an embodiment of the disclosure. The method 500 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 500 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At step 502, the control device determines whether a current first image captured by the camera device satisfies a predetermined condition. The current first image is based, for example, on first photosensitive information acquired by the camera device.
In some embodiments, the predetermined condition comprises, for example, a predetermined condition regarding a clear state. Determining whether the current first image acquired by the camera device satisfies the predetermined condition includes, for example: the control device identifies a background region and a contour of the first object from the current first image; obtains the contrast between the contour and the background region adjacent to the contour; and determines whether the contrast is greater than or equal to a predetermined contrast threshold. If the contrast is greater than or equal to the predetermined contrast threshold, the control device determines that the current first image satisfies the predetermined condition; if the contrast is less than the predetermined contrast threshold, the control device determines that the current first image does not satisfy the predetermined condition. The method 600 for determining whether the current first image satisfies the predetermined condition will be described in detail below with reference to Fig. 6 and is not repeated here.
At step 504, if it is determined that the current first image does not satisfy the predetermined condition, the control device moves the first object according to the current first image, and then returns to step 502. In some embodiments, if the current first image does not satisfy the predetermined condition regarding the clear state, the control device moves the first object toward or away from the camera device. This is repeated until the first image satisfies the predetermined condition regarding the clear state (e.g., the contrast between the contour of the first object and the background region adjacent to the contour is greater than or equal to the predetermined contrast threshold). The control device moves the first object, for example, by driving the stage 116 up or down in the z-axis direction.
At step 506, if it is determined that the current first image satisfies the predetermined condition, the control device determines the current position of the first object as the first target position. For example, when the first image satisfies the predetermined condition regarding the clear state, the control device stops moving the first object and determines its current position as the first target position.
It should be understood that the greater the contrast between the contour of the first object and the background region adjacent to the contour, the sharper the first object appears in the current first image; conversely, the more blurred it appears. A contrast greater than or equal to the predetermined contrast threshold indicates that the camera device has acquired a clear image of the first surface of the first object, referred to as the "first reference image". Accordingly, the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position is referred to as the "first optical path", denoted R1 in Fig. 4.
For a camera device whose focal length is unchanged, the target optical path at which it acquires a clear image of an object is fixed; the target optical path here refers to the optical path from the camera device to the object. Therefore, determining the position of an object from the clear state of its image acquired by the camera device is extremely accurate.
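This fixed-optical-path property can be read as a consequence of the thin-lens imaging relation 1/f = 1/u + 1/v: with the focal length f and the image distance v fixed, the object distance u that yields a sharp image is uniquely determined. A small numerical illustration (the values are arbitrary and not from the patent):

```python
def object_distance(f_mm, v_mm):
    """Solve the thin-lens equation 1/f = 1/u + 1/v for the object distance u."""
    return 1.0 / (1.0 / f_mm - 1.0 / v_mm)

# With f and v fixed, the in-focus object distance is fixed as well:
u = object_distance(50.0, 75.0)  # f = 50 mm, image plane at 75 mm -> u = 150 mm
```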
By the above means, the present disclosure can adjust the position of the first object according to whether the first image satisfies a predetermined condition (e.g., a predetermined condition regarding a clear state) to determine the first target position at which the first object is located. The image acquisition process is rapid and accurate; determining the first target position according to whether the first image satisfies the predetermined condition has extremely high accuracy and ensures the stability of the moving process.
Fig. 6 illustrates a flow chart of a method 600 of determining whether a current first image captured by a camera device satisfies a predetermined condition according to an embodiment of the disclosure. The method 600 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 600 may also include additional steps not shown and/or may omit steps shown, as the scope of the disclosure is not limited in this respect.
At step 602, the control apparatus identifies a background region and an outline of the first object from the current first image. The control means identifies a contour of the first object from the current first image, for example based on an image recognition algorithm, and identifies an area outside the contour in the current first image as a background area.
At step 604, the control device obtains a contrast between the contour and a background area adjacent to the contour. The control device acquires the contrast, for example, based on an image analysis algorithm.
At step 606, the control device determines whether the contrast is greater than or equal to a predetermined contrast threshold.
At step 608, if it is determined that the contrast is greater than or equal to the predetermined contrast threshold, the control apparatus determines that the current first image satisfies the predetermined condition.
At step 610, if it is determined that the contrast is less than the predetermined contrast threshold, the control device determines that the current first image does not satisfy the predetermined condition.
By the above means, the present disclosure can determine whether the current first image satisfies the predetermined condition regarding the clear state according to the contrast between the contour of the first object in the current first image and the background region adjacent to the contour. This effectively ensures that the clear state of the first image is accurately determined and provides an important guarantee for accurate position adjustment.
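Steps 602–610 can be sketched as follows. This is a simplified sketch: the patent does not name its image-analysis algorithm, so here "contrast" is taken as the absolute difference between the mean gray level of the contour pixels and that of the adjacent background pixels.

```python
from statistics import mean

def contour_contrast(contour_pixels, background_pixels):
    """Contrast between contour pixels and the adjacent background (gray levels)."""
    return abs(mean(contour_pixels) - mean(background_pixels))

def satisfies_clear_state(contour_pixels, background_pixels, threshold):
    """The predetermined condition: contrast at or above the contrast threshold."""
    return contour_contrast(contour_pixels, background_pixels) >= threshold
```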
Fig. 7 shows a flowchart of a method 700 of moving a pickup according to second photosensitive information of an embodiment of the present disclosure. The method 700 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 700 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
Fig. 8 shows a schematic diagram of moving a pickup according to second photosensitive information according to an embodiment of the present disclosure. Method 700 is described below in conjunction with fig. 8. For convenience of explanation, the first object O1 at the first target position is illustrated in dashed lines in fig. 8.
At step 702, the control device determines whether a current second image captured by the camera device satisfies a predetermined condition. The current second image is based on the second photosensitive information acquired by the camera device, for example, the current second image of the second surface S2 of the first object O1 acquired by the camera device via the reflected optical path 122.
At step 704, if it is determined that the current second image captured by the camera device does not satisfy the predetermined condition, the control device moves the pickup device and then returns to step 702. For example, if the current second image does not satisfy the predetermined condition regarding the clear state, the control device moves the pickup device so that the first object moves toward or away from the first 45-degree prism 102. In some embodiments, the control device drives the telescopic portion 134 to extend or retract so that the first object O1 moves toward or away from the first 45-degree prism 102.
At step 706, if it is determined that the current second image captured by the camera device satisfies the predetermined condition, the control device stops moving the pickup device and determines the current position of the first object, at which the pickup device stopped moving, as the second target position. For example, if the current second image satisfies the predetermined condition regarding the clear state, the control device stops moving the pickup device. At this time, the camera device has captured a clear image of the second surface of the first object, referred to as the "second reference image". The optical path from the photosensitive device to the second surface of the first object via the reflected optical path is referred to as the "second optical path". With the focal length of the camera device unchanged, the optical path at which it acquires a clear image of an object is fixed. Therefore, when the first object is at the second target position, the second optical path is equal to the first optical path.
As shown in Fig. 8, the first optical path R1 = RA + RB, and the second optical path R2 (not shown in the figure) = RA + RC. When the second optical path R2 is equal to the first optical path R1, RB = RC. This means that if the pickup device 106 is then controlled to rotate in reverse by the target angle, the first object O1 moves to the fourth target position, and the plane of the second surface S2 of the first object O1 at the fourth target position is coplanar with, i.e., in the same plane as, the plane of the first surface S1 of the first object O1 at the first target position.
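The geometric argument reduces to one line of arithmetic: since R1 = RA + RB and R2 = RA + RC share the leg RA, the condition R2 = R1 forces RB = RC, which is exactly the coplanarity condition. As a sketch (the lengths are arbitrary, not from the patent):

```python
RA, RB = 80.0, 40.0  # shared leg RA and transmission-side leg RB (arbitrary units)
R1 = RA + RB         # first optical path, fixed by the sharp first reference image
RC = R1 - RA         # the reflection-side leg required so that R2 = RA + RC = R1
R2 = RA + RC
assert R2 == R1 and RC == RB  # equal optical paths <=> equal legs RB and RC
```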
By the above means, the present disclosure can move the pickup device so that the first object is at the second target position according to whether the second image satisfies a predetermined condition (e.g., a predetermined condition regarding the clear state). The image acquisition process is rapid and accurate; moving the pickup device according to the clear state of the image has extremely high accuracy and ensures the stability of the moving process.
The method of mating a first object with a second object may be implemented, for example, according to method 900. Fig. 9 illustrates a flow diagram of a method 900 of mating a first object with a second object of an embodiment of the disclosure. The method 900 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 900 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect. The method 900 is described in detail below with reference to fig. 10 and 11. FIG. 10 illustrates a schematic diagram of determining a third target position at which a second object is located according to an embodiment of the disclosure. Fig. 11 shows a schematic diagram of an embodiment of the present disclosure for enabling a first object to be mated with a second object.
At step 902, the control device determines a third target position at which the second object is located based on third photosensitive information from the photosensitive device. The third photosensitive information is photosensitive information about the first surface of the second object collected by the photosensitive device via a transmission optical path of the 45-degree prism.
Taking the case where the photosensitive device is a camera device as an example, after the second object O2 is placed on the object-bearing surface of the object stage 116, the control device controls the camera device to capture a current third image of the first surface S3 of the second object O2 via the transmission optical path of the 45-degree prism. Determining the third target position of the second object may be performed with reference to method 500 and is not repeated here.
In some embodiments, the number of stages 116 is two. In the initial state, the first stage carries the first object O1, and the second stage carries the second object O2. After confirming that the first object O1 is at the second target position, the control device drives the stage 116 to move so that the second stage is within the shooting range of the camera device, so as to capture the current third image.
If the control device determines that the current third image satisfies the predetermined condition regarding the clear state, the current position of the second object is determined as the third target position. At this time, the third optical path R3 is equal to the first optical path R1.
At step 904, the control device controls the pickup device to rotate so as to move the first object to the fourth target position, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position. For example, the control device controls the pickup device to rotate in reverse by the target angle so as to move the first object to the fourth target position. Referring to Fig. 11, the second surface S2 of the first object O1 at the fourth target position precisely mates with the first surface S3 of the second object O2 at the third target position.
Since the third optical path and the second optical path are both equal to the first optical path, controlling the pickup device to rotate the first object in reverse by the target angle means that when the first object reaches the fourth target position, the plane of the second surface of the first object is coplanar with the plane of the first surface of the second object; that is, the second surface of the first object and the first surface of the second object are precisely matched at least in the z-axis direction.
Based on this scheme, the second surface of the first object can be accurately mated with the first surface of the second object, which facilitates high-precision manufacturing.
Fig. 12 illustrates a flow diagram of a method 1200 of mating a first object with a second object of an embodiment of the disclosure. The method 1200 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 1200 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At step 1202, the control device determines a third target location at which the second object is located based on third photosensitive information from the photosensitive device.
At step 1204, the control device moves the second object within the normal plane of the transmission optical path according to the third photosensitive information such that the third photosensitive information matches target photosensitive information, the target photosensitive information including at least one of the first photosensitive information and the second photosensitive information.
In some embodiments, after determining that the second object is at the third target position, the control device obtains a matching degree of the current third image and the first reference image based on an image recognition algorithm. For example, the first surface of the second object may have a third feature serving as a positioning marker, and the first surface of the first object may have a first feature corresponding to the third feature (e.g., when the first object mates with the second object, the first feature matches the projection of the third feature onto the x-y plane). With the first reference image as the matching target, the control device drives the stage 116 to move within the normal plane (i.e., the x-y plane) of the transmission optical path (e.g., at least one of translating along the x-axis, translating along the y-axis, raising or lowering along the z-axis, and rotating in the x-y plane) so that the current third image matches the first reference image. In this way, the control device achieves accurate adjustment of the pose of the second object in the x-y plane.
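The matching-degree computation can be sketched with normalized cross-correlation between the current third image and the reference image. This is a pure-Python sketch over flat grayscale lists; the patent does not name its image recognition algorithm, and a production system would more likely use a library matcher such as OpenCV's `matchTemplate`.

```python
def match_score(image, reference):
    """Normalized cross-correlation of two equal-size grayscale images (flat lists).

    Returns a value in [-1, 1]; 1.0 means a perfect match.
    """
    n = len(image)
    mi = sum(image) / n
    mr = sum(reference) / n
    num = sum((a - mi) * (b - mr) for a, b in zip(image, reference))
    di = sum((a - mi) ** 2 for a in image) ** 0.5
    dr = sum((b - mr) ** 2 for b in reference) ** 0.5
    return num / (di * dr) if di and dr else 0.0
```

The control device would drive the stage until this score exceeds a chosen acceptance level.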
In some embodiments, for example, the second surface of the first object has a second feature corresponding to the third feature. The control device then uses the second reference image as the matching target and moves the second object so that the current third image matches the second reference image.
At step 1206, the control means controls the pick-up means to rotate in order to bring the first object to move to the fourth target position such that the second surface of the first object at the fourth target position cooperates with the first surface of the second object at the third target position.
For example, the control device controls the pickup device 106 to rotate in reverse by the target angle so as to move the first object O1 to the fourth target position. Since the third optical path and the second optical path are both equal to the first optical path, the control device achieves an exact match of the second surface of the first object with the first surface of the second object at least in the z-axis direction; and because the current third image matches the first reference image, the control device achieves matching of the poses of the second surface of the first object and the first surface of the second object in the x-y plane. Thus, when the first object is moved to the fourth target position, the first object and the second object fit exactly in all three dimensions x, y, and z.
By the above means, the present disclosure can adjust the position of the second object according to the matching degree of the current third image and the target reference image, so that the second object is precisely matched with the first object in the x-axis and y-axis directions; thus, when the first object is moved to the fourth target position, the first object and the second object are precisely matched in the three dimensions x, y, and z.
Fig. 13 shows a flow diagram of a method 1300 for aligning objects of an embodiment of the present disclosure. The method 1300 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 1300 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At step 1302, the control device obtains a first optical path via the transmission optical path based on the laser measuring instrument. For example, after the first object is placed on the object-bearing surface of the object stage 116, the control device controls the laser measuring instrument to acquire the first optical path to the first surface of the first object via the transmission optical path, and determines the current position of the first object as the first target position.
In some embodiments, for the first object at the first target position, the control device further controls the laser measuring instrument to scan the first surface of the first object via the transmission optical path to obtain first scan data, and then constructs, for example, a first three-dimensional model of the first surface of the first object based on the first scan data.
At step 1304, the control device obtains a second optical path via the reflected optical path based on the laser measuring instrument. For example, after the pickup device moves the first object to the transition position, the control device controls the laser measuring instrument to acquire the second optical path to the second surface of the first object via the reflected optical path.
At step 1306, the control device determines whether the second optical path is equal to the first optical path.
At step 1308, if it is determined that the second optical path is not equal to the first optical path, the control device moves the pickup device so that the first object moves toward or away from the 45-degree prism, and then returns to step 1306.
At step 1310, if it is determined that the second optical path is equal to the first optical path, the control device stops moving the pickup device. At this time, the control device determines that the first object is at the second target position.
In some embodiments, for the first object at the second target position, the control device further controls the laser measuring instrument to scan the second surface of the first object via the reflected light path to obtain second scan data, and then constructs a second three-dimensional model of the second surface of the first object based on the second scan data.
At step 1312, the control device acquires a third optical path via the transmission optical path based on the laser measuring instrument. For example, after the second object is placed on the object-bearing surface of the object stage 116, the control device controls the laser measuring instrument to acquire the third optical path to the first surface of the second object via the transmission optical path.
At step 1314, the control device determines whether the third optical path is equal to the first optical path.
At step 1316, if it is determined that the third optical path is not equal to the first optical path, the control device moves the second object toward or away from the laser measuring instrument, and then returns to step 1314.
At step 1318, if it is determined that the third optical path is equal to the first optical path, the control device stops moving the second object. At this time, the control device determines that the second object is at the third target position.
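Steps 1314–1318, like steps 1306–1310, form the same compare-move-repeat loop, with a direct optical-path measurement replacing the image-sharpness test. A hypothetical sketch follows: `read_optical_path` and `move_object` stand in for the laser measuring instrument and motion drivers, which the patent does not specify, and a tolerance replaces exact equality since real measurements are quantized.

```python
def align_to_optical_path(read_optical_path, move_object, target_path,
                          tolerance=1e-3, step=0.01, max_iters=1000):
    """Move the object until the measured optical path equals the target path.

    read_optical_path(): current optical path reported by the laser instrument.
    move_object(dz):     moves the object toward/away from the instrument.
    Moves opposite the sign of the error so the path converges on the target.
    """
    for _ in range(max_iters):
        error = read_optical_path() - target_path
        if abs(error) <= tolerance:
            return read_optical_path()  # the object is at the target position
        move_object(-step if error > 0 else step)
    raise RuntimeError("optical path did not converge")
```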
In some embodiments, for a second object at a third target position, the control device further controls the laser measuring instrument to scan the first surface of the second object via the transmission optical path to obtain third scan data, and then constructs a third three-dimensional model of the first surface of the second object based on the third scan data.
At step 1320, the control device moves the second object, according to the third three-dimensional model, within the normal plane of the transmission optical path such that the third three-dimensional model matches a target three-dimensional model, the target three-dimensional model including at least one of the first three-dimensional model and the second three-dimensional model.
At step 1322, the control device controls the pickup device to rotate so as to move the first object to the fourth target position, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position.
The laser measuring instrument improves the precision of optical path acquisition and thereby further improves the precision of object alignment. A three-dimensional model constructed from scan data acquired by the laser measuring instrument has extremely high accuracy, accurately reflects the shape characteristics of the object surface, and facilitates high-accuracy matching to improve the accuracy of object alignment.
Fig. 14 shows a schematic block diagram of an example electronic device 1400 for aligning objects that may be used to implement embodiments of the present disclosure. As shown, the electronic device 1400 includes a central processing unit (i.e., CPU 1401) that can perform various appropriate actions and processes in accordance with computer program instructions stored in a read-only memory (i.e., ROM 1402) or loaded from a storage unit 1408 into a random access memory (i.e., RAM 1403). In the RAM 1403, various programs and data required for the operation of the electronic device 1400 can also be stored. The CPU 1401, ROM 1402, and RAM 1403 are connected to each other via a bus 1404. An input/output interface (i.e., I/O interface 1405) is also connected to bus 1404.
A number of components in the electronic device 1400 are connected to the I/O interface 1405, including: an input unit 1406 such as a keyboard, a mouse, a microphone, and the like; an output unit 1407 such as various types of displays, speakers, and the like; a storage unit 1408 such as a magnetic disk, optical disk, or the like; and a communication unit 1409 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 1409 allows the electronic device 1400 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The various processes and methods described above, such as methods 300, 500, 600, 700, 900, 1200, and 1300, may be performed by the CPU 1401. For example, in some embodiments, methods 300, 500, 600, 700, 900, 1200, and 1300 may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 1408. In some embodiments, part or all of the computer program can be loaded and/or installed onto the electronic device 1400 via the ROM 1402 and/or the communication unit 1409. When the computer program is loaded into the RAM 1403 and executed by the CPU 1401, one or more actions of methods 300, 500, 600, 700, 900, 1200, and 1300 described above may be performed.
The present disclosure relates to methods, apparatuses, systems, electronic devices, computer-readable storage media and/or computer program products. The computer program product may include computer-readable program instructions for performing various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge computing devices. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the disclosure are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (16)

1. A method for aligning an object, comprising:
determining, at a control device, a first target position at which a first object is located based on first photosensitive information from a photosensitive device, the first photosensitive information being photosensitive information about a first surface of the first object collected by the photosensitive device via a transmission optical path of a 45-degree prism;
controlling a pickup device to pick up the first object and rotate so that a second surface of the first object is located on a reflection optical path of the 45-degree prism, so that the photosensitive device collects second photosensitive information about the second surface of the first object via the reflection optical path, wherein the second surface is opposite to the first surface;
moving the pickup device according to the second photosensitive information so that the first object reaches a second target position at which a second optical path is equal to a first optical path, the first optical path being the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position, the second optical path being the optical path from the photosensitive device to the second surface of the first object via the reflection optical path; and
determining a third target position at which a second object is located based on at least the first optical path and third photosensitive information about the second object, the third photosensitive information being photosensitive information about a first surface of the second object collected by the photosensitive device via the transmission optical path, so that the second surface of the first object at a fourth target position matches the first surface of the second object at the third target position.
2. The method of claim 1, wherein determining the third target position at which the second object is located comprises:
determining the third target position at which the second object is located according to the collected third photosensitive information so that a third optical path is equal to the first optical path, wherein the third optical path is the optical path from the photosensitive device to the first surface of the second object via the transmission optical path; and
controlling the pickup device to rotate the first object so that the second surface of the first object, rotated to the fourth target position, matches the first surface of the second object at the third target position.
3. The method of claim 1, wherein the photosensitive device comprises a camera device, and determining the first target position at which the first object is located comprises:
determining whether a current first image acquired by the camera device satisfies a predetermined condition, wherein the current first image is based on the first photosensitive information collected by the camera device;
in response to determining that the current first image does not satisfy the predetermined condition, moving the first object in accordance with the current first image; and
in response to determining that the current first image satisfies the predetermined condition, a current location of the first object is determined as the first target location.
4. The method of claim 3, wherein moving the pickup device according to the second photosensitive information so that the first object reaches the second target position comprises:
in response to determining that a current second image acquired by the camera device does not satisfy the predetermined condition, moving the pickup device, wherein the current second image is based on the second photosensitive information collected by the camera device; and
in response to determining that the current second image acquired by the camera device satisfies the predetermined condition, stopping moving the pickup device, so as to determine the current position of the first object when the pickup device stops moving as the second target position.
5. The method of claim 4, wherein in response to determining that the current first image does not satisfy the predetermined condition, moving the first object according to the current first image comprises:
in response to determining that the current first image does not satisfy the predetermined condition regarding sharpness, moving the first object closer to or farther from the camera device; and
in response to determining that the current second image does not satisfy the predetermined condition, moving the pickup device comprises:
in response to determining that the current second image does not satisfy the predetermined condition regarding sharpness, moving the pickup device so that the first object moves closer to or farther from the 45-degree prism.
6. The method of claim 3, wherein determining whether the current first image acquired by the camera device satisfies the predetermined condition comprises:
identifying a background region and a contour of the first object from the current first image;
acquiring the contrast between the contour and the background region adjacent to the contour;
determining whether the contrast is greater than or equal to a predetermined contrast threshold;
in response to determining that the contrast is greater than or equal to a predetermined contrast threshold, determining that the current first image satisfies a predetermined condition; and
in response to determining that the contrast is less than a predetermined contrast threshold, determining that the current first image does not satisfy a predetermined condition.
7. The method of claim 1, wherein moving the pickup device comprises controlling a pickup portion of the pickup device to lengthen or shorten so that the first object moves closer to or farther from the 45-degree prism.
8. The method of claim 1, wherein the photosensitive device comprises a laser measuring instrument, and moving the pickup device so that the second optical path is equal to the first optical path comprises:
acquiring the first optical path via the transmission optical path using the laser measuring instrument;
acquiring the second optical path via the reflection optical path using the laser measuring instrument;
determining whether the second optical path is equal to the first optical path;
in response to determining that the second optical path is not equal to the first optical path, moving the pickup device so that the first object moves closer to or farther from the 45-degree prism; and
in response to determining that the second optical path is equal to the first optical path, stopping moving the pickup device.
9. The method of claim 2, wherein after determining the third target position at which the second object is located based on the collected third photosensitive information, the method further comprises:
moving the second object in a normal plane of the transmission optical path according to the third photosensitive information so that the third photosensitive information matches target photosensitive information, wherein the target photosensitive information comprises at least one of the first photosensitive information and the second photosensitive information.
10. The method of claim 9, wherein moving the second object in the normal plane of the transmission optical path according to the third photosensitive information so that the third photosensitive information matches the target photosensitive information comprises at least one of:
moving the second object in the normal plane of the transmission optical path according to a third image so that the third image matches a target image, wherein the third image is based on the third photosensitive information collected by a camera device, the target image comprises at least one of a first image and a second image, the first image is based on the first photosensitive information collected by the camera device, and the second image is based on the second photosensitive information collected by the camera device; and
constructing a first three-dimensional model based on the first photosensitive information collected by a laser measuring instrument, constructing a second three-dimensional model based on the second photosensitive information collected by the laser measuring instrument, and constructing a third three-dimensional model based on the third photosensitive information collected by the laser measuring instrument; and moving the second object in the normal plane of the transmission optical path according to the third three-dimensional model so that the third three-dimensional model matches a target three-dimensional model, the target three-dimensional model comprising at least one of the first three-dimensional model and the second three-dimensional model.
11. A computing device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 10.
12. A computer-readable storage medium having stored thereon a computer program which, when executed by a machine, implements the method of any of claims 1 to 10.
13. A system for aligning an object, comprising:
a first 45-degree prism having a transmission optical path and a reflection optical path;
a photosensitive device configured to collect first photosensitive information about a first surface of a first object via the transmission optical path and to collect second photosensitive information about a second surface of the first object via the reflection optical path, wherein the second surface is opposite to the first surface;
a pickup device configured to pick up the first object and rotate so that the second surface of the first object is located on the reflection optical path, so that the photosensitive device can collect the second photosensitive information; and
a control device configured to perform the steps of the method of any one of claims 1 to 10.
14. The system of claim 13, wherein the pick-up device comprises:
a pickup portion that can be lengthened or shortened so that the first object on the reflection optical path moves closer to or farther from the first 45-degree prism.
15. The system of claim 14, wherein the pick-up device further comprises:
a rotating shaft; and
a rotating arm that rotates about the rotating shaft in a plane perpendicular to the 45-degree face of the first 45-degree prism;
wherein the pickup portion is arranged on the rotating arm.
16. The system of claim 13, further comprising:
a second 45-degree prism, a 45-degree surface of which is attached to the 45-degree surface of the first 45-degree prism.
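As an illustration of the contrast criterion in claim 6, one plausible reading is sketched below: compute the contrast between the mean intensity along the identified contour and the mean intensity of the adjacent background region, then compare it against the threshold. The Michelson-style contrast formula, the pixel-list representation, and the default threshold are all assumptions made for illustration; the claims do not prescribe a particular formula:

```python
def first_image_satisfies_condition(contour_pixels, background_pixels,
                                    contrast_threshold=0.5):
    """Return True when the contour-to-background contrast reaches the
    predetermined threshold.  The Michelson-style formula and the
    threshold value are illustrative assumptions."""
    if not contour_pixels or not background_pixels:
        return False
    mean_contour = sum(contour_pixels) / len(contour_pixels)
    mean_background = sum(background_pixels) / len(background_pixels)
    total = mean_contour + mean_background
    if total == 0:
        return False  # avoid division by zero on an all-black image
    contrast = abs(mean_contour - mean_background) / total
    return contrast >= contrast_threshold
```

A sharply focused image (bright contour on a dark background) passes the check, while a defocused image with nearly uniform intensity fails it, which is what drives the move/stop decisions in claims 3 to 5.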
CN202210780975.5A 2022-07-05 2022-07-05 Method, system, computing device, and medium for aligning objects Active CN114862954B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210780975.5A CN114862954B (en) 2022-07-05 2022-07-05 Method, system, computing device, and medium for aligning objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210780975.5A CN114862954B (en) 2022-07-05 2022-07-05 Method, system, computing device, and medium for aligning objects

Publications (2)

Publication Number Publication Date
CN114862954A (en) 2022-08-05
CN114862954B (en) 2022-09-09

Family

ID=82626234

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210780975.5A Active CN114862954B (en) 2022-07-05 2022-07-05 Method, system, computing device, and medium for aligning objects

Country Status (1)

Country Link
CN (1) CN114862954B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5698480B2 (en) * 2010-09-02 2015-04-08 株式会社トプコン Measuring method and measuring device
CN104486550B (en) * 2014-12-29 2017-09-12 中国科学院长春光学精密机械与物理研究所 Aerial camera image focusing test device and method
JP6654649B2 (en) * 2015-12-14 2020-02-26 株式会社ニコン・トリンブル Defect detection device and program
CN106767537B (en) * 2017-03-20 2019-03-01 重庆市光学机械研究所 A kind of monocular various dimensions profile scan device
US10660523B2 (en) * 2017-07-07 2020-05-26 Hideo Ando Light-source unit, measurement apparatus, near-infrared microscopic apparatus, optical detection method, imaging method, calculation method, functional bio-related substance, state management method, and manufacturing method
CN111122509B (en) * 2019-11-08 2023-11-24 桂林电子科技大学 F-P interferometer-based reflection transmission type phase microscopic imaging measurement system

Also Published As

Publication number Publication date
CN114862954A (en) 2022-08-05

Similar Documents

Publication Publication Date Title
WO2022052404A1 (en) Memory alignment and insertion method and system based on machine vision, device, and storage medium
US8934721B2 (en) Microscopic vision measurement method based on adaptive positioning of camera coordinate frame
JP6510502B2 (en) Digital microscope with swivel stand, calibration method for such digital microscope and automatic tracking method of focus and image center
JP2017515145A (en) Auto-focus in low-profile bending optical multi-camera system
CN110148454B (en) Positioning method, positioning device, server and storage medium
CN109612689B (en) Optical fiber end face detection method and system
CN104111059A (en) Distance measuring and locating device and method and terminal
CN107726999B (en) Object surface three-dimensional information reconstruction system and working method thereof
WO2018156224A1 (en) Three-dimensional imager
CN111665512A (en) Range finding and mapping based on fusion of 3D lidar and inertial measurement unit
CN112230345A (en) Optical fiber auto-coupling alignment apparatus and method
CN108805940B (en) Method for tracking and positioning zoom camera in zooming process
TW201915443A (en) Positioning and measuring system based on image scale
CN114862954B (en) Method, system, computing device, and medium for aligning objects
EP3602214B1 (en) Method and apparatus for estimating system error of commissioning tool of industrial robot
EP3772633A1 (en) Surveying instrument
CN109916391B (en) Mechanical equipment space position real-time acquisition device and measurement system and method thereof
JP7342238B2 (en) Imaging equipment and methods, and adjustment elements
CN105759390B (en) A kind of automatic positioning of optical fiber and apparatus for placing and method
CN112529856A (en) Method for determining the position of an operating object, robot and automation system
JP3927063B2 (en) Optical member array core position measuring method and core position measuring apparatus
CN114518217B (en) Method for determining center distance between lenses, microscope control device, and storage medium
CN114897851A (en) Coordinate compensation method, device, equipment and medium based on central projection
CN113028997B (en) Method, device and equipment for measuring travel allowance of lens group and storage medium
JP7245040B2 (en) Survey data processing device, survey data processing method, program for survey data processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 430074 room a613, 4 / F, building 1, phase III, international enterprise center, special 1, Guanggu Avenue, Donghu New Technology Development Zone, Wuhan City, Hubei Province (Wuhan area of free trade zone)

Patentee after: Wuhan Qianxi Technology Co.,Ltd.

Patentee after: Dalian Youxinguang Technology Co.,Ltd.

Address before: 430074 room a613, 4 / F, building 1, phase III, international enterprise center, special 1, Guanggu Avenue, Donghu New Technology Development Zone, Wuhan City, Hubei Province (Wuhan area of free trade zone)

Patentee before: Wuhan Qianxi Technology Co.,Ltd.

Patentee before: Dalian Youxun Technology Co.,Ltd.
