Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of embodiments of the present disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
The term "include" and variations thereof as used herein is meant to be inclusive in an open-ended manner, i.e., "including but not limited to". Unless specifically stated otherwise, the term "or" means "and/or". The term "based on" means "based at least in part on". The terms "one example embodiment" and "one embodiment" mean "at least one example embodiment". The term "another embodiment" means "at least one additional embodiment". The terms "first," "second," and the like may refer to different or the same object. Other explicit and implicit definitions are also possible below.
As described above, in a high-precision manufacturing process, aligning two objects, for example achieving position matching at least in the height direction, presents many difficulties; the resulting insufficient alignment accuracy adversely affects the manufacturing process.
To address, at least in part, one or more of the above issues and other potential issues, an example embodiment of the present disclosure proposes a scheme for aligning objects. In this scheme, a first target position where a first object is located is determined according to first photosensitive information from the photosensitive device, the first photosensitive information being photosensitive information about the first surface of the first object collected by the photosensitive device via a transmission optical path of the 45-degree prism. The pickup device is then controlled to pick up the first object and rotate it to a transition position so that the second surface of the first object, which is opposite to the first surface, is on a reflection optical path of the 45-degree prism, so that the photosensitive device collects second photosensitive information about the second surface of the first object via the reflection optical path. Next, the pickup device is moved according to the second photosensitive information so that the second optical path is equal to the first optical path (the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position), thereby determining that the first object has reached the second target position. The second optical path is the optical path from the photosensitive device to the second surface of the first object via the reflection optical path. Finally, a third target position at which the second object is located is determined based on at least the first optical path and third photosensitive information about the second object, such that the second surface of the first object at a fourth target position is mated with the first surface of the second object at the third target position. The third photosensitive information is photosensitive information about the first surface of the second object collected by the photosensitive device via the transmission optical path.
The third optical path (the optical path from the photosensitive device to the first surface of the second object via the transmission optical path) and the second optical path are both equal to the first optical path, which means that when the pickup device is controlled to drive the first object to rotate in the reverse direction by the target angle (i.e., when the first object reaches the fourth target position), the plane of the second surface of the first object is coplanar with the plane of the first surface of the second object. The target angle is the angle through which the pickup device rotates to the transition position after picking up the first object at the first target position. Based on this scheme, the second surface of the first object can be accurately mated with the first surface of the second object, facilitating high-precision manufacturing.
Fig. 1 shows a schematic structural diagram of a system 100 for aligning objects of an embodiment of the present disclosure. The system 100 includes a first 45-degree prism 102, a photosensitive device 104, a pickup device 106, and a control device (not shown).
In some embodiments, the system 100 further includes a base 114, a stage 116, a mounting bracket 118, and a display device (not shown). The stage 116 is disposed on the base 114. The control device may drive the stage 116 to move in at least one of the following ways: translation along the x-axis, translation along the y-axis, elevation along the z-axis, and rotation in the x-y plane. For convenience of illustration, the base 114 is rectangular; for example, the x-axis direction is parallel to the long side of the base 114 and the y-axis direction is parallel to the wide side of the base 114. When the system 100 is placed horizontally, the x-y plane is a horizontal plane. The stage 116 carries the object and, driven by the control device, moves the object correspondingly. The object-bearing surface of the stage 116 is parallel to the x-y plane.
In some embodiments, the number of stages 116 is two, and the two stages 116 respectively carry the first object O1 and the second object O2. The first object O1 and the second object O2 are two objects that need to be aligned.
With respect to the mounting bracket 118, it includes, for example, a vertical arm 120 and a cross arm 122. The vertical arm 120 is disposed on the base 114 and supports the cross arm 122.
With respect to the first 45-degree prism 102, it has a transmission optical path and a reflection optical path. For example, the first 45-degree prism 102 has a 45-degree surface 112 on which an optical coating is disposed. The 45-degree surface 112 may transmit a portion of the incident light and reflect another portion, forming a transmission optical path and a reflection optical path. The optical path formed by light entering the first 45-degree prism 102 and transmitted through the 45-degree surface 112 is the transmission optical path; the optical path formed by light entering the first 45-degree prism 102 and reflected by the 45-degree surface 112 is the reflection optical path. In some embodiments, the thickness of the optical coating is less than or equal to 1 micron, which effectively reduces optical path error and improves the position alignment accuracy of the object.
In some embodiments, the first 45-degree prism 102 is disposed on the vertical arm 120 and extends in a direction parallel to the x-y plane. In this state, the transmission optical path is perpendicular to the x-y plane.
In some embodiments, the system 100 also includes a second 45-degree prism 136. The 45-degree face of the second 45-degree prism 136 is attached to the 45-degree surface 112 of the first 45-degree prism 102, and the two prisms together form a prism assembly 138. Fig. 2 shows a side view of the prism assembly 138 of an embodiment of the present disclosure. The arrow LI represents incident light, the arrow LT represents transmitted light formed by the incident light via the transmission optical path, and the arrow LR represents reflected light formed by the incident light via the reflection optical path. The prism assembly 138 is square in cross-section. In the prism assembly 138, the optical path length LA (not shown) of the incident light in the prism medium via the transmission optical path is equal to the optical path length LB (not shown) in the prism medium via the reflection optical path. Denoting by LC the common segment up to the 45-degree surface 112, and by LD and LE the segments after transmission and after reflection respectively, LA = LC + LD and LB = LC + LE. Since LD = LE, it follows that LA = LB. The accuracy of comparing the optical path via the transmission optical path with the optical path via the reflection optical path is thereby improved.
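The path equality above can be sketched numerically; the segment names LC, LD, and LE follow the text, and the sample values below are arbitrary illustrations, not dimensions from the disclosure.

```python
# Numerical check of the in-medium optical path equality stated above.
# LC is the common segment up to the 45-degree surface; LD and LE are the
# segments after transmission and after reflection, respectively.
def in_medium_paths(lc: float, ld: float, le: float) -> tuple:
    """Return (LA, LB): total in-medium path via transmission and via reflection."""
    return lc + ld, lc + le

# With a square cross-section the post-surface segments are equal (LD = LE),
# so the transmitted and reflected in-medium paths coincide.
la, lb = in_medium_paths(lc=4.0, ld=4.0, le=4.0)
assert la == lb
```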
The photosensitive device 104 is used, for example, for collecting first photosensitive information about the first surface of the first object via the transmission optical path and collecting second photosensitive information about the second surface of the first object via the reflection optical path, the second surface being opposite to the first surface. For example, the photosensitive device 104 is disposed on the cross arm 122, on a common optical path of the transmission optical path and the reflection optical path, and faces the first 45-degree prism 102. When the first object O1 is placed on the object-bearing surface of the stage 116, the surface facing the photosensitive device 104 is the first surface, and the surface opposite to the first surface (i.e., the surface facing the object-bearing surface) is the second surface.
In some embodiments, the photosensitive device 104 comprises a camera device. The camera device may, for example, acquire a current first image of the first surface of the first object via the transmission optical path, a current second image of the second surface of the first object via the reflection optical path, or a current third image of the first surface of the second object via the transmission optical path. The display device may, for example, display at least one of the current first image, the current second image, and the current third image.
In some embodiments, the photosensitive device 104 comprises a laser measuring instrument. The laser measuring instrument may, for example, acquire a first optical path of light to the first surface of the first object via the transmission optical path, a second optical path of light to the second surface of the first object via the reflection optical path, or a third optical path of light to the first surface of the second object via the transmission optical path. The laser measuring instrument may further scan the first surface of the first object via the transmission optical path to obtain first scan data, scan the second surface of the first object via the reflection optical path to obtain second scan data, or scan the first surface of the second object via the transmission optical path to obtain third scan data, so that the control device constructs a first three-dimensional model corresponding to the first surface of the first object based on the first scan data, a second three-dimensional model corresponding to the second surface of the first object based on the second scan data, and a third three-dimensional model corresponding to the first surface of the second object based on the third scan data. The display device may, for example, display at least one of the first, second, and third three-dimensional models.
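As one illustration of how scan data could be turned into a three-dimensional model, the following sketch converts laser samples into a simple height map; the sample format `(x, y, measured optical path)`, the helper name, and the reference-path convention are assumptions for illustration, not part of the disclosure.

```python
# Hypothetical sketch: turn laser scan samples (x, y, measured optical path)
# into surface heights relative to a reference optical path. A shorter
# measured path means the surface point sits closer to the instrument.
from typing import Dict, Iterable, Tuple

def build_height_map(samples: Iterable[Tuple[float, float, float]],
                     reference_path: float) -> Dict[Tuple[float, float], float]:
    """Map (x, y) -> height, with height = reference_path - measured_path."""
    return {(x, y): reference_path - path for x, y, path in samples}

scan = [(0.0, 0.0, 100.0), (0.0, 1.0, 99.8), (1.0, 0.0, 100.2)]
heights = build_height_map(scan, reference_path=100.0)
```

A real three-dimensional model would interpolate a surface through such points; the height map is only the minimal data structure behind it.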
As for the pickup device 106, it is used to pick up the first object and rotate it so that the second surface of the first object is on the reflection optical path of the 45-degree prism, so that the photosensitive device collects the second photosensitive information. In some embodiments, the pickup device 106 includes a base 124, a rotating shaft 126, a rotating arm 128, and a retractable pickup member 130. The base 124 is disposed on the base 114, the rotating arm 128 is connected to the base 124 through the rotating shaft 126, and the retractable pickup member 130 is disposed on the rotating arm 128. The axis of the rotating shaft 126 is parallel to the 45-degree surface 112, and the rotating arm 128 rotates in a plane perpendicular to the 45-degree surface 112.
In some embodiments, the retractable pickup member 130 includes, for example, a telescopic portion 134 and a pickup portion 132. The telescopic portion 134 can extend or retract at least along a direction perpendicular to the rotating arm 128 to move the pickup portion 132 away from or toward the rotating arm 128. The pickup portion 132 may be, for example, a suction nozzle or a mechanical gripper.
Under the control of the control device, the pickup device 106 picks up the first object from the stage 116 with the pickup portion 132 and, through the rotation of the rotating arm 128, moves the first object so that its second surface is located on the reflection optical path of the 45-degree prism, allowing the photosensitive device to collect the second photosensitive information. For example, the length of the rotating arm 128 matches the height of the first 45-degree prism 102: when the rotating arm 128 rotates to be perpendicular to the x-y plane, the first object is at a height matching the first 45-degree prism 102, suitable for the photosensitive device to collect the second photosensitive information of the second surface of the first object via the reflection optical path.
With regard to the control device, it is configured, for example, to perform the steps of the method for aligning objects of embodiments of the present disclosure. The control device may have one or more processing units, including special-purpose processing units such as a GPU (graphics processing unit), an FPGA (field-programmable gate array), or an ASIC (application-specific integrated circuit), and general-purpose processing units such as a CPU (central processing unit). In addition, one or more virtual machines may run on each control device. The control device drives the pickup device 106 and the stage 116, for example, by means of a transmission mechanism.
Fig. 3 shows a flow diagram of a method 300 for aligning objects of an embodiment of the present disclosure. The method 300 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 300 may also include additional steps not shown and/or may omit steps shown, as the scope of the disclosure is not limited in this respect.
At step 302, the control device determines a first target position at which the first object is located based on the first photosensitive information from the photosensitive device. The first photosensitive information is photosensitive information about the first surface of the first object collected by the photosensitive device via the transmission optical path of the 45-degree prism.
Fig. 4 shows a schematic diagram of a photosensitive device of an embodiment of the present disclosure collecting first photosensitive information. Taking the photosensitive device comprising the camera device as an example, after the first object O1 is placed on the object-bearing surface of the stage 116, the control device controls the camera device to capture a current first image of the first surface S1 of the first object O1 via the transmission optical path of the first 45-degree prism 102.
In some embodiments, a method of determining the first target position at which the first object is located includes, for example: the control device determines whether the current first image acquired by the camera device satisfies a predetermined condition; if the current first image does not satisfy the predetermined condition, the control device moves the first object according to the current first image; if the current first image satisfies the predetermined condition, the control device determines the current position of the first object as the first target position. The method 500 for determining the first target position of the first object will be described in detail with reference to fig. 5 and is not repeated here.
At step 304, the control device controls the pickup device to pick up the first object and rotate it so that the second surface of the first object is on the reflection optical path of the 45-degree prism, so that the photosensitive device collects second photosensitive information about the second surface of the first object via the reflection optical path. The second surface is opposite to the first surface. For example, the control device drives the rotating arm 128 to rotate to be perpendicular to the x-y plane, and then controls the camera device to acquire a current second image of the second surface of the first object. The present disclosure refers to the position at which the first object is located at this time as the "transition position", and to the angle through which the pickup device rotates from the first target position to the transition position as the "target angle".
At step 306, the control device moves the pickup device in accordance with the second photosensitive information so that the first object reaches the second target position, at which the second optical path is equal to the first optical path. The first optical path is the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position. The second optical path is the optical path from the photosensitive device to the second surface of the first object via the reflection optical path.
The method of moving the pickup device according to the second photosensitive information includes, for example: the control device determines whether the current second image acquired by the camera device satisfies the predetermined condition; if the current second image does not satisfy the predetermined condition, the control device moves the pickup device; if the current second image satisfies the predetermined condition, the control device stops moving the pickup device and determines the current position of the first object when the pickup device stops moving as the second target position. The method 700 for moving the pickup device according to the second photosensitive information will be described in detail below with reference to fig. 7 and is not repeated here.
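The move-and-check cycle just described can be sketched as a simple control loop; `capture`, `is_sharp`, and `move_pickup` are hypothetical stand-ins for the camera device, the predetermined clear-state condition, and the drive of the telescopic portion 134, none of which the disclosure specifies as software interfaces.

```python
# Sketch of the move-and-check loop. All callables are hypothetical stand-ins;
# the real system drives mechanical hardware rather than Python callables.
from typing import Callable

def position_pickup(capture: Callable[[], object],
                    is_sharp: Callable[[object], bool],
                    move_pickup: Callable[[], None],
                    max_steps: int = 1000) -> bool:
    """Move the pickup until the second image is clear; True means the
    current position is taken as the second target position."""
    for _ in range(max_steps):
        if is_sharp(capture()):
            return True      # stop moving: current position = second target position
        move_pickup()        # bring the object nearer to or farther from the prism
    return False             # give up after max_steps (a safety bound, assumed here)
```

For example, with a simulated stage that sharpens after three moves, the loop stops exactly when the condition is first met.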
At step 308, the control device determines a third target position at which the second object is located based on at least the first optical path and third photosensitive information about the second object, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position. The third photosensitive information is photosensitive information about the first surface of the second object collected by the photosensitive device via the transmission optical path.
The method for mating the first object with the second object includes, for example: the control device determines the third target position where the second object is located according to the third photosensitive information from the photosensitive device, and controls the pickup device to rotate so as to move the first object to the fourth target position, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position. The method 900 for mating the first object with the second object will be described in detail with reference to fig. 9 and is not repeated here.
In the above solution, because the third optical path and the second optical path are both equal to the first optical path, when the pickup device is controlled to drive the first object to rotate in the reverse direction by the target angle (i.e., when the first object reaches the fourth target position), the plane of the second surface of the first object is coplanar with the plane of the first surface of the second object. The target angle is the angle through which the pickup device rotates to the transition position after picking up the first object at the first target position. Based on this scheme, the second surface of the first object can be accurately mated with the first surface of the second object, facilitating high-precision manufacturing.
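Steps 302-308 can be summarized as the following control sequence; every method on `ctrl` is a hypothetical placeholder for an operation named in the text, since the disclosure specifies what each step achieves rather than a software interface.

```python
# High-level sketch of method 300. Each ctrl method is a hypothetical
# placeholder; the disclosure defines the steps, not these interfaces.
def align_objects(ctrl) -> None:
    # Step 302: focus via the transmission path to find the first target position.
    first_target = ctrl.find_first_target_position()
    first_path = ctrl.measure_optical_path()        # first optical path (R1)
    # Step 304: pick up and rotate by the target angle to the transition position.
    target_angle = ctrl.pick_up_and_rotate_to_transition(first_target)
    # Step 306: move the pickup until the reflected-path image is sharp, which
    # makes the second optical path equal to first_path.
    ctrl.move_until_sharp_via_reflection()
    # Step 308: position the second object so its first surface is sharp via the
    # transmission path (third optical path == first_path), then rotate back by
    # the target angle so the two surfaces mate.
    ctrl.find_third_target_position()
    ctrl.rotate_back(target_angle)
```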
FIG. 5 illustrates a flow chart of a method 500 of determining a first target location at which a first object is located in an embodiment of the disclosure. The method 500 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 500 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At step 502, the control device determines whether the current first image captured by the camera device satisfies a predetermined condition. The current first image is based on, for example, the first photosensitive information acquired by the camera device.
In some embodiments, the predetermined condition comprises, for example, a predetermined condition regarding a clear state. The method for determining whether the current first image acquired by the camera device satisfies the predetermined condition includes, for example: the control device identifies a background area and a contour of the first object from the current first image; obtains the contrast between the contour and a background area adjacent to the contour; and determines whether the contrast is greater than or equal to a predetermined contrast threshold. If the contrast is greater than or equal to the predetermined contrast threshold, the control device determines that the current first image satisfies the predetermined condition; otherwise, the control device determines that the current first image does not satisfy the predetermined condition. The method 600 for determining whether the current first image acquired by the camera device satisfies the predetermined condition will be described in detail below with reference to fig. 6 and is not repeated here.
At step 504, if it is determined that the current first image does not satisfy the predetermined condition, the control device moves the first object according to the current first image and then returns to step 502. In some embodiments, if the current first image does not satisfy the predetermined condition regarding the clear state, the control device moves the first object closer to or farther from the camera device. This is repeated until the first image satisfies the predetermined condition regarding the clear state (e.g., the contrast between the contour of the first object and the background area adjacent to the contour is greater than or equal to the predetermined contrast threshold). The control device moves the first object, for example, by driving the stage 116 up or down along the z-axis.
At step 506, if it is determined that the current first image satisfies the predetermined condition, the control device determines the current position of the first object as the first target position. For example, when the first image satisfies the predetermined condition regarding the clear state, the movement of the first object is stopped and the current position of the first object is determined as the first target position.
It should be understood that the greater the contrast between the contour of the first object and the background area adjacent to the contour, the sharper the first object appears in the current first image; conversely, the more blurred it appears. When the contrast is greater than or equal to the predetermined contrast threshold, this indicates that the camera device has acquired a clear image of the first surface of the first object, referred to as the "first reference image". Accordingly, the optical path from the photosensitive device to the first surface of the first object via the transmission optical path when the first object is at the first target position is referred to as the "first optical path", denoted R1 in fig. 4.
For a camera device whose focal length is unchanged, the target optical path at which it acquires a clear image of an object is fixed, where the target optical path refers to the optical path from the camera device to the object. Therefore, determining the position of the object according to the clear state of the image acquired by the camera device has extremely high accuracy.
By adopting the above means, the present disclosure can reasonably adjust the position of the first object according to whether the first image satisfies a predetermined condition (e.g., a predetermined condition regarding the clear state) to determine the first target position where the first object is located. The image acquisition process is rapid and accurate; determining the first target position according to whether the first image satisfies the predetermined condition has extremely high accuracy and ensures the stability of the moving process.
Fig. 6 illustrates a flow chart of a method 600 of determining whether a current first image captured by a camera device satisfies a predetermined condition according to an embodiment of the disclosure. The method 600 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 600 may also include additional steps not shown and/or may omit steps shown, as the scope of the disclosure is not limited in this respect.
At step 602, the control device identifies a background area and the contour of the first object from the current first image. The control device identifies the contour of the first object from the current first image, for example, based on an image recognition algorithm, and identifies the area outside the contour as the background area.
At step 604, the control device obtains a contrast between the contour and a background area adjacent to the contour. The control device acquires the contrast, for example, based on an image analysis algorithm.
At step 606, the control device determines whether the contrast is greater than or equal to the predetermined contrast threshold.
At step 608, if it is determined that the contrast is greater than or equal to the predetermined contrast threshold, the control device determines that the current first image satisfies the predetermined condition.
At step 610, if it is determined that the contrast is less than the predetermined contrast threshold, the control device determines that the current first image does not satisfy the predetermined condition.
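One plausible realization of steps 602-610 is sketched below; the particular contrast measure (normalized mean intensity difference between contour pixels and adjacent background pixels) is an assumption, since the disclosure does not fix a specific metric or image-analysis algorithm.

```python
import numpy as np

def meets_clear_condition(contour_px: np.ndarray,
                          background_px: np.ndarray,
                          threshold: float) -> bool:
    """Steps 604-610: compare contour/background contrast to the threshold.
    The Michelson-style contrast used here is one assumed choice of metric."""
    c, b = float(contour_px.mean()), float(background_px.mean())
    contrast = abs(c - b) / (c + b + 1e-12)  # guard against division by zero
    return contrast >= threshold             # steps 608/610

# Sharp edge: bright contour against a dark neighboring background.
sharp = meets_clear_condition(np.array([200.0, 210.0]), np.array([20.0, 25.0]), 0.5)
# Blurred edge: contour and background intensities are close.
blurred = meets_clear_condition(np.array([120.0, 125.0]), np.array([110.0, 115.0]), 0.5)
```

In practice the contour pixels would come from the recognition step 602 (e.g., an edge or segmentation algorithm); here they are supplied directly.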
By adopting the above means, the present disclosure can determine whether the current first image satisfies the predetermined condition regarding the clear state according to the contrast between the contour of the first object in the current first image and the background area adjacent to the contour. This effectively ensures that the clear state of the first image is accurately determined, providing an important guarantee for accurate position adjustment.
Fig. 7 shows a flowchart of a method 700 of moving the pickup device according to the second photosensitive information of an embodiment of the present disclosure. The method 700 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 700 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
Fig. 8 shows a schematic diagram of moving the pickup device according to the second photosensitive information according to an embodiment of the present disclosure. Method 700 is described below in conjunction with fig. 8. For convenience of explanation, the first object O1 at the first target position is illustrated in dashed lines in fig. 8.
At step 702, the control device determines whether the current second image captured by the camera device satisfies the predetermined condition. The current second image is based on the second photosensitive information acquired by the camera device, for example, the current second image of the second surface S2 of the first object O1 acquired by the camera device via the reflection optical path.
At step 704, if it is determined that the current second image captured by the camera device does not satisfy the predetermined condition, the control device moves the pickup device and then returns to step 702. For example, if the current second image does not satisfy the predetermined condition regarding the clear state, the control device moves the pickup device so that the first object moves closer to or farther from the first 45-degree prism 102. In some embodiments, the control device drives the telescopic portion 134 to extend or retract so that the first object O1 moves closer to or farther from the first 45-degree prism 102.
At step 706, if it is determined that the current second image captured by the camera device satisfies the predetermined condition, the control device stops moving the pickup device and determines the current position of the first object when the pickup device stops moving as the second target position. For example, if the current second image satisfies the predetermined condition regarding the clear state, the control device stops moving the pickup device. At this time, the camera device captures a clear image of the second surface of the first object, referred to as the "second reference image". The optical path from the photosensitive device to the second surface of the first object via the reflection optical path is referred to as the "second optical path". With the focal length of the camera device unchanged, the optical path at which the camera device acquires a clear image of an object is fixed. Therefore, when the first object is at the second target position, the second optical path is equal to the first optical path.
As shown in fig. 8, the first optical path R1 = RA + RB, and the second optical path R2 (not shown in the figure) = RA + RC. When the second optical path R2 is equal to the first optical path R1, RB = RC. This means that if the pickup device 106 is then controlled to rotate in the reverse direction by the target angle, the first object O1 moves to the fourth target position, and the plane of the second surface S2 of the first object O1 at the fourth target position is coplanar with (i.e., in the same plane as) the plane of the first surface S1 of the first object O1 at the first target position.
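The bookkeeping in the paragraph above reduces to a one-line relation; the segment names follow fig. 8, and the numeric values below are arbitrary illustrations.

```python
# Check of the relation above: with R1 = RA + RB and R2 = RA + RC,
# R2 == R1 forces RC == RB, so after the reverse rotation by the target angle
# the second surface returns to the plane of the first target position.
def required_rc(r1: float, ra: float) -> float:
    """RC that makes the second optical path RA + RC equal the first path R1."""
    return r1 - ra          # equals RB, since R1 = RA + RB

ra, rb = 70.0, 30.0
r1 = ra + rb                # first optical path R1
assert required_rc(r1, ra) == rb
```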
By adopting the above means, the present disclosure can reasonably move the pickup device, according to whether the second image satisfies a predetermined condition (e.g., a predetermined condition regarding the clear state), so that the first object is at the second target position. The image acquisition process is rapid and accurate; moving the pickup device according to the clear state of the image has extremely high accuracy and ensures the stability of the moving process.
The method of mating a first object with a second object may be implemented, for example, according to method 900. Fig. 9 illustrates a flow diagram of a method 900 of mating a first object with a second object of an embodiment of the disclosure. The method 900 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 900 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect. The method 900 is described in detail below with reference to fig. 10 and 11. FIG. 10 illustrates a schematic diagram of determining a third target position at which a second object is located according to an embodiment of the disclosure. Fig. 11 shows a schematic diagram of an embodiment of the present disclosure for enabling a first object to be mated with a second object.
At step 902, the control device determines a third target position at which the second object is located based on third photosensitive information from the photosensitive device. The third photosensitive information is photosensitive information about the first surface of the second object collected by the photosensitive device via the transmission optical path of the 45-degree prism.
Taking the camera device as an example of the photosensitive device, after the second object O2 is placed on the object-carrying surface of the object stage 116, the control device controls the camera device to capture a current third image of the first surface S3 of the second object O2 via the transmission optical path of the 45-degree prism. Determining the third target position of the second object may be performed with reference to method 500 and will not be described again herein.
In some embodiments, the number of object stages 116 is two. In the initial state, the first object stage carries the first object O1, and the second object stage carries the second object O2. After confirming that the first object O1 is at the second target position, the control device drives the object stage 116 to move so that the second object stage is within the shooting range of the camera device, so as to capture the current third image.
If the control device determines that the current third image satisfies the predetermined condition regarding the clear state, the position where the second object is located is determined to be the third target position. At this time, the third optical path R3, i.e., the optical path from the photosensitive device to the first surface of the second object via the transmission optical path, is equal to the first optical path R1.
At step 904, the control device controls the pickup device to rotate so as to move the first object to the fourth target position, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position. For example, the control device controls the pickup device to rotate in reverse by the target angle so as to move the first object to the fourth target position. Referring to fig. 11, the second surface S2 of the first object O1 at the fourth target position precisely matches the first surface S3 of the second object O2 at the third target position.
Since the third optical path and the second optical path are both equal to the first optical path, when the pickup device is controlled to drive the first object to rotate in reverse by the target angle so that the first object reaches the fourth target position, the plane in which the second surface of the first object lies is coplanar with the plane in which the first surface of the second object lies; that is, the second surface of the first object and the first surface of the second object are precisely matched at least in the z-axis direction.
Based on this scheme, the second surface of the first object can be accurately mated with the first surface of the second object, which facilitates high-precision manufacturing.
Fig. 12 illustrates a flow diagram of a method 1200 of mating a first object with a second object according to an embodiment of the disclosure. The method 1200 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that the method 1200 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At step 1202, the control device determines a third target location at which the second object is located based on third photosensitive information from the photosensitive device.
At step 1204, the control device moves the second object within the normal plane of the transmission optical path according to the third photosensitive information such that the third photosensitive information matches target photosensitive information, the target photosensitive information including at least one of the first photosensitive information and the second photosensitive information.
In some embodiments, after determining that the second object is at the third target position, the control device obtains a degree of matching between the current third image and the first reference image based on an image recognition algorithm. For example, the first surface of the second object has a third feature that may serve as a positioning mark, and the first surface of the first object has a first feature corresponding to the third feature (e.g., when the first object mates with the second object, the first feature matches the projection of the third feature in the x-y plane). Taking the first reference image as the matching target, the control device drives the object stage 116 to move within the normal plane of the transmission optical path (i.e., the x-y plane), e.g., by at least one of translating along the x-axis direction, translating along the y-axis direction, raising and lowering along the z-axis direction, and rotating in the x-y plane, so that the current third image matches the first reference image. In this way, the control device achieves an accurate adjustment of the pose of the second object in the x-y plane.
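The disclosure leaves the image recognition algorithm unspecified. As one plausible sketch only, phase correlation (a standard image-registration technique) can estimate the in-plane translation between the current third image and the first reference image; the resulting offset could then drive the object stage in the x-y plane. The function below is an illustrative assumption, not the disclosed algorithm.

```python
import numpy as np

def xy_offset(ref: np.ndarray, cur: np.ndarray):
    """Estimate the (dy, dx) translation that aligns `cur` with `ref`
    using phase correlation of the two images."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(cur))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-12))  # normalized cross-power spectrum
    dy, dx = np.unravel_index(int(np.argmax(np.abs(corr))), corr.shape)
    h, w = ref.shape
    # wrap the peak coordinates into signed shifts
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```

A practical system would add subpixel peak interpolation and a rotation estimate; this sketch covers only the x/y translation component of the pose adjustment.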
In some embodiments, for example, the second surface of the first object has a second feature corresponding to the third feature. The control device moves the second object with the second reference image as the matching target so that the current third image matches the second reference image.
At step 1206, the control device controls the pickup device to rotate so as to move the first object to the fourth target position, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position.
For example, the control device controls the pickup device 106 to rotate in reverse by the target angle so as to move the first object O1 to the fourth target position. Because the third optical path and the second optical path are both equal to the first optical path, the control device achieves an exact match of the second surface of the first object with the first surface of the second object at least in the z-axis direction; in combination with the current third image matching the first reference image, the control device achieves matching of the poses of the second surface of the first object and the first surface of the second object in the x-y plane. Thus, when the first object is moved to the fourth target position, the first object and the second object fit exactly in all three dimensions x, y, and z.
In this way, the present disclosure can adjust the position of the second object according to the degree of matching between the current third image and the target reference image, so that the second object is precisely matched with the first object in the x-axis and y-axis directions, and thus, when the first object is moved to the fourth target position, the first object and the second object are precisely matched in the three dimensions x, y, and z.
Fig. 13 shows a flow diagram of a method 1300 for aligning objects of an embodiment of the present disclosure. The method 1300 may be performed by a control device or at the electronic device 1400 shown in fig. 14. It should be understood that method 1300 may also include additional steps not shown and/or may omit steps shown, as the scope of the present disclosure is not limited in this respect.
At step 1302, the control device acquires the first optical path via the transmission optical path based on the laser measuring instrument. For example, after the first object is placed on the object-carrying surface of the object stage 116, the control device controls the laser measuring instrument to acquire the first optical path reaching the first surface of the first object via the transmission optical path, and determines the current position of the first object as the first target position.
In some embodiments, for the first object at the first target position, the control device further controls the laser measuring instrument to scan the first surface of the first object via the transmission optical path to obtain first scan data, and then constructs a first three-dimensional model of the first surface of the first object based on the first scan data.
At step 1304, the control device acquires the second optical path via the reflected optical path based on the laser measuring instrument. For example, after the first object is moved to the transition position by the pickup device, the control device controls the laser measuring instrument to acquire the second optical path reaching the second surface of the first object via the reflected optical path.
At step 1306, the control device determines whether the second optical path is equal to the first optical path.
At step 1308, if it is determined that the second optical path is not equal to the first optical path, the control device moves the pickup device so that the first object moves closer to or farther from the 45-degree prism, and then returns to step 1306.
At step 1310, if it is determined that the second optical path is equal to the first optical path, the control device stops moving the pickup device. At this time, the control device determines that the first object is at the second target position.
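Steps 1306 to 1310 form a simple closed-loop search, which can be sketched as follows. The `measure_r2` and `move_pickup` callbacks are hypothetical stand-ins for the laser measuring instrument reading and the pickup-device drive, and the proportional gain and tolerance are assumptions; the disclosure specifies only the compare-and-move structure.

```python
def seek_equal_path(measure_r2, move_pickup, r1, tol=1e-6, gain=0.5, max_iter=1000):
    """Iteratively move the pickup device along the optical axis until the
    measured second optical path R2 equals the first optical path R1."""
    for _ in range(max_iter):
        err = measure_r2() - r1   # step 1306: compare R2 with R1
        if abs(err) <= tol:
            return True           # step 1310: stop; second target position reached
        move_pickup(-gain * err)  # step 1308: move closer to / farther from the prism
    return False
```

The same loop shape applies to steps 1314 to 1318, with the second object and the third optical path substituted.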
In some embodiments, for the first object at the second target position, the control device further controls the laser measuring instrument to scan the second surface of the first object via the reflected optical path to obtain second scan data, and then constructs a second three-dimensional model of the second surface of the first object based on the second scan data.
At step 1312, the control device acquires the third optical path via the transmission optical path based on the laser measuring instrument. For example, after the second object is placed on the object-carrying surface of the object stage 116, the control device controls the laser measuring instrument to acquire the third optical path reaching the first surface of the second object via the transmission optical path.
At step 1314, the control device determines whether the third optical path is equal to the first optical path.
At step 1316, if it is determined that the third optical path is not equal to the first optical path, the control device moves the second object closer to or farther from the laser measuring instrument, and then returns to step 1314.
At step 1318, if it is determined that the third optical path is equal to the first optical path, the control device stops moving the second object. At this time, the control device determines that the second object is at the third target position.
In some embodiments, for a second object at a third target position, the control device further controls the laser measuring instrument to scan the first surface of the second object via the transmission optical path to obtain third scan data, and then constructs a third three-dimensional model of the first surface of the second object based on the third scan data.
At step 1320, the control device moves the second object within the normal plane of the transmission optical path according to the third three-dimensional model such that the third three-dimensional model matches a target three-dimensional model, the target three-dimensional model including at least one of the first three-dimensional model and the second three-dimensional model.
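The disclosure does not state how the third three-dimensional model is matched to the target model. Assuming corresponding surface points can be extracted from the two models, one common technique for in-plane rigid registration is the Kabsch algorithm, sketched below; the rotation and translation it returns could then drive the object stage within the normal plane. This is an illustrative assumption, not the disclosed matching method.

```python
import numpy as np

def plane_fit_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rotation R and translation t in the x-y plane that map
    point set `src` onto `dst` (both N x 2 arrays), via the Kabsch algorithm."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)         # cross-covariance of the centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T    # guard against a reflection solution
    t = dc - R @ sc
    return R, t
```

Here `src` would hold x-y points sampled from the third three-dimensional model and `dst` the corresponding points of the target model; z agreement is already guaranteed by the equal optical paths.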
At step 1322, the control device controls the pickup device to rotate so as to move the first object to the fourth target position, such that the second surface of the first object at the fourth target position mates with the first surface of the second object at the third target position.
Based on the laser measuring instrument, the precision of optical path acquisition can be improved, thereby further improving the precision of object alignment. A three-dimensional model constructed from the scan data acquired by the laser measuring instrument has extremely high accuracy and can accurately reflect the shape characteristics of the object surface, which facilitates high-accuracy matching and thus improves the accuracy of object alignment.
Fig. 14 shows a schematic block diagram of an example electronic device 1400 for aligning objects that may be used to implement embodiments of the present disclosure. As shown, the electronic device 1400 includes a central processing unit (i.e., CPU 1401) that can perform various appropriate actions and processes in accordance with computer program instructions stored in a read-only memory (i.e., ROM 1402) or loaded from a storage unit 1408 into a random access memory (i.e., RAM 1403). In the RAM 1403, various programs and data required for the operation of the electronic device 1400 can also be stored. The CPU 1401, ROM 1402, and RAM 1403 are connected to each other via a bus 1404. An input/output interface (i.e., I/O interface 1405) is also connected to bus 1404.
A number of components in the electronic device 1400 are connected to the I/O interface 1405, including: an input unit 1406 such as a keyboard, a mouse, a microphone, and the like; an output unit 1407 such as various types of displays, speakers, and the like; a storage unit 1408 such as a magnetic disk, optical disk, or the like; and a communication unit 1409 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 1409 allows the electronic device 1400 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The various methods and processes described above, such as methods 300, 500, 600, 700, 900, 1200, and 1300, may be performed by the CPU 1401. For example, in some embodiments, methods 300, 500, 600, 700, 900, 1200, and 1300 may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 1408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 1400 via the ROM 1402 and/or the communication unit 1409. When the computer program is loaded into the RAM 1403 and executed by the CPU 1401, one or more actions of the methods 300, 500, 600, 700, 900, 1200, and 1300 described above may be performed.
The present disclosure relates to methods, apparatuses, systems, electronic devices, computer-readable storage media and/or computer program products. The computer program product may include computer-readable program instructions for performing various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge computing devices. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the disclosure are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.