CN112333378A - Imaging method, device and system - Google Patents

Imaging method, device and system

Info

Publication number
CN112333378A
CN112333378A
Authority
CN
China
Prior art keywords
lens
image
value
light
optionally
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810814359.0A
Other languages
Chinese (zh)
Inventor
李林森
孙瑞涛
徐家宏
周志良
姜泽飞
颜钦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Genemind Biosciences Co Ltd
Original Assignee
Genemind Biosciences Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Genemind Biosciences Co Ltd filed Critical Genemind Biosciences Co Ltd
Priority to CN201810814359.0A priority Critical patent/CN112333378A/en
Priority to EP19841635.6A priority patent/EP3829158A4/en
Priority to PCT/CN2019/097272 priority patent/WO2020020148A1/en
Priority to US17/262,663 priority patent/US11368614B2/en
Publication of CN112333378A publication Critical patent/CN112333378A/en
Priority to US17/746,838 priority patent/US11575823B2/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Abstract

The invention discloses an imaging method and an imaging system. The imaging method uses the imaging system to image objects; the imaging system comprises a lens, and the objects comprise a first object, a second object and a third object located at different positions of a first preset track. The imaging method comprises: relatively moving the lens and the first preset track according to a first predetermined relationship so as to obtain a clear image of the third object with the imaging system without focusing, the first predetermined relationship being determined by the focal plane position of the first object and the focal plane position of the second object. The imaging method is highly efficient, and even when focus tracking fails it can still refocus quickly according to the first predetermined relationship, avoiding blurred images caused by defocus.

Description

Imaging method, device and system
Technical Field
The present invention relates to the field of optical detection, and in particular, to an imaging method, device and system.
Background
In the related art, every time the camera takes a picture its focal length is adjusted rapidly to find the clearest focal plane, so that a sharp picture is obtained; this process is called focus tracking.
However, in practical applications, shooting with a camera is susceptible to external interference. For example, stray light, or dust or scratches on the surface of the object, may cause focus tracking to fail, and if the camera cannot refocus after such a failure, the captured image is blurred. For instance, when a camera is used for sequencing and the target is a nucleic acid molecule in a chip, bubbles in the liquid inside the chip, large fluorescent impurities, and dust or scratches on the chip surface are all likely to cause focus tracking to fail.
Disclosure of Invention
In a sequencing platform that obtains nucleic acid information based on imaging, such as the second- or third-generation sequencing platforms currently on the market that obtain nucleic acid information by photographing, the sequencing process includes photographing nucleic acids placed in a reactor with an imaging system.
The reactor is commonly referred to as a chip (flowcell), which may contain one or more parallel channels through which reagents flow in and out to form the environment required for the sequencing reaction. The chip can be formed by bonding two pieces of glass. During sequencing the camera photographs fixed areas of the chip in multiple rounds; each area photographed is called a FOV (field of view), each round of photographing is called a cycle, and between two cycles a chemical reaction is carried out by introducing reagents again.
During normal photographing, the camera can usually focus successfully and automatically, i.e., find the clearest focal plane position. When interference occurs, however, focus tracking may fail.
Fig. 1-3 illustrate data for successful focus tracking and abnormal or failed focus tracking as occurred in the inventors' experiments.
Taking continuous shooting of two rows of FOVs in the same cycle as an example, the height (Z value) coordinates of the objective lens during shooting are recorded. As shown in fig. 1, the abscissa is the FOV serial number; the first half of the FOVs are shot sequentially from the left side to the right side of the flowcell, and the second half from right to left after a line change. The ordinate is the height of the microscope objective relative to the camera, i.e., the Z value, in µm; a negative value indicates that the objective is below the camera, and the larger the absolute value of Z, the farther the objective is from the camera.
Fig. 1 shows the Z-value curve for 300 FOV images all captured with successful focus tracking. Fig. 2 shows the Z-value curve for 200 FOV images that include some focus-tracking errors (reflected as erroneous Z values); the abnormal portion of the curve, i.e., the convex segment, corresponds to unclear/blurred images.
Because the camera's focus tracking has certain limitations, it easily loses focus when it encounters interference, and after defocusing the objective ends up far from the focal plane; that is, in subsequent focus-tracking shots the distance between the objective and the focal plane of the current FOV is too large, so the objective cannot return to the focal plane even after the interference is gone, as shown in fig. 3. The first 200 FOVs in fig. 3 belong to one cycle and the remaining FOVs to another cycle. Fig. 3 shows that after the 268th FOV (in the first row of the second cycle) focus tracking failed and, even after the disturbance disappeared, refocusing did not succeed until the cycle ended.
A focus-tracking failure means blurred images, which leads to loss of information, so this is a problem that must be solved. In practice interference cannot be completely eliminated, but it is generally desirable that clear images can at least be acquired once the interference disappears.
By analyzing a large amount of data from successful and abnormal focus tracking, the inventors found that, with the objective lens fixed, the Z-value curves of the same set of normally focus-tracked FOVs in different cycles (i.e., different time periods) follow a certain pattern. Fig. 4 shows the Z-value curves of 300 FOVs for which clear pictures were obtained with normal focus tracking in 4 different cycles.
The inventors have found two laws:
1) The same position (FOV) may have different focal planes in different cycles, but its focal plane position relative to the other FOVs of the same cycle remains substantially unchanged. That is, in terms of physical location, the focal planes of different FOVs within the same cycle are correlated.
2) In each curve in the figure, half of the 300 FOVs are shot from the left side of a row of the flowcell to the right, and the other half from right to left after the line change. Because of deformation of the flowcell and/or a height difference between its left and right sides, the focal planes of consecutive FOVs shot in the same direction along the same row (e.g., from left to right, or from right to left) follow a clear pattern and can be well fitted by a straight line.
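Rule 2) can be checked numerically by fitting a least-squares line to the Z values of consecutive FOVs and inspecting the residual. The sketch below uses hypothetical Z values and a hand-rolled fit; it only illustrates the linearity check, not the patent's actual implementation.

```python
# Hedged sketch (values hypothetical): verify that focal-plane Z values of
# consecutive FOVs shot in one direction lie close to a straight line.
def fit_line(zs):
    """Least-squares fit z = k * i + c over indices i = 0..len(zs)-1."""
    n = len(zs)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_z = sum(zs) / n
    k = sum((x - mean_x) * (z - mean_z) for x, z in zip(xs, zs)) / \
        sum((x - mean_x) ** 2 for x in xs)
    c = mean_z - k * mean_x
    return k, c

# Z values (um) rising steadily across the flowcell due to tilt/deformation.
zs = [-1300.0 + 0.25 * i for i in range(8)]
k, c = fit_line(zs)
residual = max(abs(z - (k * i + c)) for i, z in enumerate(zs))
print(round(k, 3), round(residual, 9))
```

A small residual relative to the depth of field would indicate that a single linear relationship suffices for the whole row.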
Considering the above rules, the inventors conjecture the possible reasons to include: since the same FOV must be photographed repeatedly in different cycles, the pressure inside the chip changes after heating and reagent flushing, and the focal plane shifts as a whole. Meanwhile, each FOV is small relative to the entire chip, so the surface flatness within each FOV can be regarded as constant, which appears as the relative focal plane position between adjacent FOVs remaining unchanged.
Based on the rules discovered above, the inventors developed a set of algorithms that, with the assistance of software and without replacing hardware, give the camera a focal-plane prediction capability. Specifically, for example, in cycle 1, for a plurality of FOVs located on the same preset track (the first preset track, e.g., the same row), the focal planes of two of these FOVs can be acquired by focusing, the difference between the two focal planes is calculated, a relationship (such as the first predetermined relationship) is obtained by linear fitting, and this relationship is used to predict the focal plane positions of the other FOVs in that row. For cycle 2 and subsequent cycles, by memorizing the focal plane of any normally focused FOV of cycle 1 or of any previous cycle, and focusing to determine the focal plane of one FOV of the current cycle, the focal plane of any other FOV of the current cycle can be predicted through the relationship established by linear regression.
The relationship is established using linear regression and expressed as formula (a): y = kx + b, where the slope k (which may also be called the trend k) and the intercept b (which may also be called the base offset b) need to be determined. Based on rule 1) above, k = 1, so formula (a) reduces to formula (b): y = x + b, and b can be determined from the relative positions (Z values) of the focal planes of any two FOVs on the same track in the same cycle.
For example, for cycle 1, the base offset b may be calculated from the overall focal plane difference (e.g., from one end of the track to the other). Specifically, focusing obtains the focal plane Z values cyc1FovZ(r) and cyc1FovZ(l) of two objects located at one end and the other end of a track in cycle 1 (which may be called two positions or two FOVs), and the intercept is calculated as b = (cyc1FovZ(r) - cyc1FovZ(l)) / FovNum, where FovNum is the number of positions (FOVs) between the two positions. Formula (b) can then be used to predict cyc1FovZ(n+1) of cycle 1, where cyc1FovZ(n) and cyc1FovZ(n+1) denote the focal plane positions of two adjacent FOVs of cycle 1, i.e., cyc1FovZ(n+1) = cyc1FovZ(n) + b; thus, from one focused position, the focal plane positions of the remaining FOVs can be obtained without further focusing.
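The cycle-1 computation just described can be sketched as follows. The function names and Z values are hypothetical; b is taken, per formula (b), as the average focal-plane step per FOV between the two focused track ends.

```python
# Hedged sketch of the cycle-1 base-offset calculation (names hypothetical).
def base_offset(z_left, z_right, fov_num):
    """Base offset b = (cyc1FovZ(r) - cyc1FovZ(l)) / FovNum:
    average focal-plane change per FOV between the two focused ends."""
    return (z_right - z_left) / fov_num

def predict_track(z_left, z_right, fov_num):
    """Predict the focal-plane Z of every FOV on the track from the two
    focused end positions, applying z(n+1) = z(n) + b repeatedly."""
    b = base_offset(z_left, z_right, fov_num)
    return [z_left + i * b for i in range(fov_num + 1)]

# Example: left end focused at -1300.0 um, right end at -1298.0 um, 4 FOVs apart.
zs = predict_track(-1300.0, -1298.0, 4)
print(zs)  # [-1300.0, -1299.5, -1299.0, -1298.5, -1298.0]
```

Only the two end positions require actual focusing; every intermediate FOV's Z value follows from the fitted line.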
Note that b can be determined from the focal plane information of any two FOVs on the same track. Likewise, cyc1FovZ(n+1) may be determined using the determined formula (b) together with the focal plane coordinate information of any focused FOV, for example using relationship (b) and either of the determined values cyc1FovZ(r) and cyc1FovZ(l).
After the linear relationship of a certain cycle has been determined, for any subsequent cycle shooting the same track/same FOVs, the focal plane position of any FOV of the current cycle can be predicted from the determined linear relationship and the focal plane position of any focused FOV of the current cycle. For example, to predict the focal plane position of Fov(N+1) (the FOV at the (N+1)-th position) in the same cycle from the focal plane position of Fov(N) (the FOV at the N-th position), we substitute the Z value currFovZ(N) of the N-th FOV as the independent variable x into formula (b); the resulting y is currFovZ(N+1).
Alternatively, after the linear relationship of a certain cycle has been determined, for any subsequent cycle shooting the same track/same FOVs, the focal plane positions of two FOVs in that earlier cycle, together with the focal plane position of one of the same FOVs in the current cycle, may be used to predict the focal plane position of the other FOV in the current cycle. For example, with formula (b) determined in the previous cycle, to predict the focal plane position of Fov(N+1) in the current cycle from the focal plane position of Fov(N), we take the focal plane positions of Fov(N) and Fov(N+1) in the previous cycle, denoted preFovZ(N) and preFovZ(N+1), and apply formula (c): currFovZ(N+1) = currFovZ(N) + (preFovZ(N+1) - preFovZ(N)) to determine currFovZ(N+1).
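Formula (c) can be sketched directly; the values below are hypothetical, and the names follow the preFovZ/currFovZ notation above.

```python
# Hedged sketch of formula (c): carry the previous cycle's relative
# focal-plane profile over to the current cycle (values hypothetical).
def predict_next(curr_z_n, pre_z_n, pre_z_n1):
    """currFovZ(N+1) = currFovZ(N) + (preFovZ(N+1) - preFovZ(N))."""
    return curr_z_n + (pre_z_n1 - pre_z_n)

# Previous cycle: Fov(N) at -1299.0 um, Fov(N+1) at -1298.6 um (step +0.4 um).
# Current cycle: Fov(N) focused at -1305.0 um (whole focal plane has shifted).
z_next = predict_next(-1305.0, -1299.0, -1298.6)
print(round(z_next, 3))  # -1304.6
```

Note that only the per-FOV differences from the previous cycle are reused; the absolute Z level is anchored by the single focused FOV of the current cycle, which accommodates the whole-plane shift described in rule 1).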
It should be noted that the discovery and explanation of the above rules, and the relationship established for illustration, are described as linear only for convenience of description and understanding. Those skilled in the art will understand that the first preset track may be a straight line or a curve, and any curve can be regarded as a fit of a plurality of line segments. Accordingly, following the above illustration, when the first preset track is a curve, the skilled person can, within the concept of the present invention, treat the curved first preset track as a group of line segments and correspondingly establish a first predetermined relationship comprising a group of linear relationships, so as to predict, without focusing, the focal plane position of an object on the track.
With the imaging method of this embodiment, and without replacing hardware, the camera can be brought back to the vicinity of the focal plane and photographing can resume. Based on the above findings and explanatory illustrations, the present invention provides an imaging method, an imaging apparatus, an imaging system, and a sequencing system.
An imaging method according to an embodiment of the present invention is an imaging method for imaging an object by using an imaging system, where the imaging system includes a lens, and the object includes a first object, a second object, and a third object located at different positions of a first preset trajectory, and the imaging method includes: the lens and the first preset track are relatively moved according to a first preset relation so as to obtain a clear image of the third object without focusing by using the imaging system, and the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object.
An imaging system according to an embodiment of the present invention images an object, the imaging system includes a lens and a control device, the object includes a first object, a second object, and a third object located at different positions of a first preset trajectory, and the control device is configured to: the lens and the first preset track are relatively moved according to a first preset relation so as to obtain a clear image of the third object without focusing by using the imaging system, and the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object.
According to the imaging method and the imaging system, the first predetermined relationship is determined from the focusing positions of the first object and the second object; when other objects on the first preset track are imaged, the focal plane can be predicted directly from the first predetermined relationship and a clear image of the third object obtained without focusing. The method is particularly suitable when the number of objects is large and their images must be acquired quickly and continuously.
A sequencing device according to an embodiment of the present invention comprises the imaging system described above.
A computer-readable storage medium of an embodiment of the present invention stores a program for execution by a computer, where executing the program includes performing the steps of the method of the above embodiment. The computer-readable storage medium may include: read-only memory, random access memory, magnetic or optical disk, and the like.
An imaging system according to an embodiment of the present invention is configured to image an object, the imaging system including a lens and a control device, the object including a first object, a second object, and a third object located at different positions of a first preset trajectory, the control device including a computer-executable program, and executing the computer-executable program includes performing the steps of the method of the above embodiment.
A computer program product of the embodiment of the present invention contains instructions, which when executed by a computer, cause the computer to execute the steps of the method of the above embodiment.
Additional aspects and advantages of embodiments of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the invention.
Drawings
The above and/or additional aspects and advantages of embodiments of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a Z-value graph showing the success of focus tracking in sequence measurement.
Fig. 2 is a Z-value graph from sequencing in which the abnormal convex portions correspond to FOVs where focus tracking failed.
Fig. 3 is a Z-value graph showing that focusing failed in the sequence measurement and that focusing could not be resumed successfully even at the end of the cycle photographing after the interference disappeared.
Fig. 4 is a schematic diagram showing different focusing positions formed by the focusing data of the object in the sequence measurement.
Fig. 5 is a schematic structural diagram of a first preset track and a second preset track according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a focus position formed by focus data of an object without interference at the time of sequence determination.
Fig. 7 is a schematic view of a focusing position formed by focusing data of an object when re-focusing is successful in the presence of disturbance at the time of sequence determination.
Fig. 8 is a schematic view of a focused position formed by the focused data of the subject when the re-focusing is not possible with interference during the sequence measurement.
Fig. 9 is a flowchart illustrating a focusing method according to an embodiment of the invention.
Fig. 10 is a schematic diagram of a positional relationship between a lens and an object according to an embodiment of the present invention.
Fig. 11 is a partial structural schematic diagram of an imaging system of an embodiment of the present invention.
FIG. 12 is a schematic diagram of connected components of an image according to an embodiment of the invention.
Fig. 13 is another flowchart illustrating a focusing method according to an embodiment of the invention.
FIG. 14 is a flowchart illustrating a focusing method according to an embodiment of the present invention.
FIG. 15 is a further flowchart illustrating a focusing method according to an embodiment of the invention.
FIG. 16 is a further flowchart illustrating a focusing method according to an embodiment of the present invention.
FIG. 17 is a block schematic diagram of an imaging system of an embodiment of the invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that "first", "second", "third", "fourth" and "fifth" are merely for convenience of description and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
In the description of the present invention, unless otherwise explicitly specified or limited, "connected" is to be understood in a broad sense, e.g., fixedly, detachably or integrally connected; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
The terms "center," "thickness," "upper," "lower," "front," "rear," and the like, as used herein, refer to an orientation or positional relationship that is based on the orientation or positional relationship shown in the detailed description or the drawings, which is for convenience and simplicity of description, and does not indicate or imply that the referenced device or element must have a particular orientation, be constructed and operated in a particular orientation.
The term "constant", as it relates to distance, object distance and/or relative position, may mean absolutely constant or relatively constant: a value, range or amount is said to remain constant if it stays within a certain deviation or a predetermined acceptable range. Unless otherwise specified, "constant" with respect to distance, object distance and/or relative position means relatively constant.
The following disclosure provides many implementations or examples for implementing the inventive concepts. The present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
The "sequencing" referred to in the embodiments of the present invention means nucleic acid sequence determination, including DNA sequencing and/or RNA sequencing, and including long-fragment and/or short-fragment sequencing. The term "sequencing reaction" is used in the same sense as sequence determination reaction.
The embodiment of the invention provides an imaging method, which utilizes an imaging system to image an object. Referring to fig. 5, 11 and 12, the imaging system includes a lens 104, the objects include a first object 42, a second object 44 and a third object 46 at different positions of a first predetermined track 43, and the imaging method includes: the lens 104 and the first preset trajectory 43 are relatively moved according to a first predetermined relationship determined by the focal plane position of the first object 42 and the focal plane position of the second object 44 to obtain an image of the third object 46 with the imaging system without focusing.
In the imaging method, the first predetermined relationship is determined by the focusing positions of the first object 42 and the second object 44, when other objects on the first preset track are imaged, focal plane prediction can be directly performed according to the first predetermined relationship, a clear image of a third object is obtained without focusing, and the method is particularly suitable for a situation where the number of the objects is large and the images of the objects are desired to be obtained quickly and continuously.
Specifically, in the example of fig. 5, the first preset track 43 may be a linear track, and the first object 42 and the second object 44 are located at two positions of the linear track, for example, at both ends of the linear track, and it is understood that the number of the third objects 46 may be plural, the plural third objects 46 are sequentially arranged at the first preset track 43, and the third object 46 is located between the first object 42 and the second object 44. It is understood that in other examples, the third object 46 may be located in other locations than the locations of the first object 42 and the second object 44. In other examples, the first preset trajectory 43 may be a non-linear trajectory, such as a curved trajectory, which may be considered a fit of a plurality of line segments in which the first, second and third objects are located on the same line segment.
In some embodiments, the first predetermined relationship may be a linear relationship. In one embodiment, referring to fig. 5, when the first preset track 43 is one or more channels 52 of the chip 500 used in sequencing, the third object 46 to be imaged is located at one or more positions (FOVs) in the channels 52, and during photographing the lens and the first preset track 43 can move relative to each other along a first direction a; for example, the lens 104 is fixed, the lens 104 includes an optical axis OP, and the first preset track 43 moves along a direction perpendicular to the optical axis OP. It will be appreciated that in some embodiments the first preset track 43 can also move in a direction parallel to the optical axis OP; the first preset track 43 can be moved according to the actual adjustment requirement.
The imaging system includes a camera 108, the lens 104 is mounted on the camera 108, and the camera 108 captures light passing through the lens 104 for imaging.
In some embodiments, the moving the lens 104 relative to the first preset track 43 includes at least one of: fixing the lens 104 and moving the first preset track 43; fixing the first preset track 43 and moving the lens 104; the lens 104 and the first preset track 43 are moved simultaneously.
Therefore, the moving modes of the lens 104 and the first preset track 43 are various, the adaptability is strong, and the application range of the imaging method is widened.
Specifically, when the first preset track 43 is moved, the first preset track 43 may be placed on a stage, and the stage may drive the first preset track 43 and the object to translate back and forth along a direction perpendicular to the optical axis OP of the lens 104, so as to place one of the third objects 46 below the lens 104, so that the imaging system images the third object 46.
When the lens 104 is moved, the lens 104 may be mounted on a driving mechanism, and the driving mechanism may drive the lens 104 to translate back and forth in a direction perpendicular to the optical axis OP of the lens 104 in an electric or manual manner, so that the lens 104 moves above one of the third objects 46, and the imaging system images the object.
Moving the lens 104 and the first preset track 43 at the same time, it can be understood that the lens 104 may be moved first, and then the first preset track 43 is moved, so that one of the third objects 46 is located below the lens 104; alternatively, the first predetermined track 43 may be moved first, and then the lens 104 is moved to position the lens 104 over one of the third objects 46, or the first predetermined track 43 may be moved while the lens 104 is moved to position the lens over one of the third objects 46.
In some embodiments, the determining of the first predetermined relationship comprises: focusing the first object 42 with the imaging system, determining first coordinates; focusing the second object 44 with the imaging system to determine second coordinates; a first predetermined relationship is established in terms of first coordinates reflecting the focal plane position of the first object 42 and second coordinates reflecting the focal plane position of the second object 44. Thus, the first predetermined relationship can be predetermined, and when imaging of other objects is performed, a clear image of the other objects can be obtained without focusing by the imaging system according to the first predetermined relationship, simplifying the imaging method and improving the efficiency of the imaging method.
Specifically, according to one embodiment described above in conjunction with FIG. 5, when the third object 46 to be imaged is one or more locations of the chip 500 used for sequencing, the first object 42, the second object 44, and the third object 46 may be located in the same channel of the chip 500.
Preferably, the first object 42, the third object 46 and the second object 44 are sequentially arranged on the first preset track 43. According to the above-mentioned one embodiment, the first direction a is a direction from the left to the right of the chip 500, that is, along the direction a from the left to the right of the chip 500, the first object 42, the third object 46 and the second object 44 are sequentially arranged on the first preset track 43. In other embodiments, the first object 42, the third object 46 and the second object 44 may be arranged on the first preset track 43 in other sequences.
It will be appreciated that in determining the first predetermined relationship, two objects may be selected on the first preset track 43: the first object 42 and the second object 44 are brought into focus to acquire their focusing positions. Specifically, as noted above, in sequencing the relative focal plane position between two FOVs on the first preset track 43, particularly adjacent FOVs, remains unchanged. Thus, the first predetermined relationship may be determined by focusing the first object 42 and the second object 44 and obtaining their focal plane coordinate data; with this relationship, a clear image of any third object on the first preset track 43 can be obtained without focusing.
Thus, as an example, the first object 42 and the second object 44 may be the start-point and end-point FOVs of the first preset track in one cycle (i.e., the same time period), such as the two end FOVs of the same row of the same channel, as shown in fig. 5. The third object 46 may be any one or more FOVs between the first object 42 and the second object 44. It is understood that the first object 42 and the second object 44 may also be FOVs at other positions, and the third object 46 need not be located between them: based on the rule that two points determine a straight line (the first predetermined relationship), any two positions (objects) on the first preset track may be selected, the focal plane position corresponding to each position obtained, and the first predetermined relationship for the first preset track 43 derived from these focal plane positions; the image of the third object can then be obtained through the first predetermined relationship without focusing by the imaging system. In practical application scenarios, a coordinate system may be established to digitize/quantify the relative position relationships, including the so-called focal plane positions. For example, when a sequencing platform is used for image signal acquisition, a three-dimensional coordinate system may be established in which xy represents the plane of the first/second preset tracks and Z represents the optical axis direction of the objective lens; the focal plane position of each position then includes a focal plane Z value.
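As a minimal sketch of the "two points determine a straight line" idea above (function names and the linear model z = a·pos + b are illustrative assumptions, not the patent's notation):

```python
def fit_focus_line(pos1, z1, pos2, z2):
    # Two focused positions determine a straight line: z = a * pos + b.
    # pos1/pos2 are track coordinates of the first and second objects,
    # z1/z2 their measured focal plane Z values.
    a = (z2 - z1) / (pos2 - pos1)
    b = z1 - a * pos1
    return a, b

def predict_z(a, b, pos):
    # Predicted focal plane Z value for any intermediate FOV position,
    # i.e. imaging a "third object" without focusing.
    return a * pos + b
```

For example, focusing at the two track ends and then predicting the Z value of an FOV between them replaces per-FOV autofocus during continuous scanning.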
It should be noted that the mentioned cycle reflects the influence of the time factor/image acquisition cycle. Generally, in a high-precision imaging system, such as a microscope system with a 60x objective and a depth of field of 200 nm, the fluctuation caused by one or more back-and-forth mechanical movements of the first/second preset tracks, or of the platform carrying them, is likely to exceed the depth of field. Therefore, when performing repeated high-precision continuous imaging of multiple objects with the imaging method of any of the above or below embodiments, if the multiple objects on the same preset track are not imaged in the same image acquisition period (e.g., they are imaged in different mechanical movement directions), it is preferable to re-determine the first predetermined relationship by re-fitting the focusing data. It will be appreciated by those skilled in the art that in a multi-object continuous imaging scenario with relatively low accuracy requirements, the depth of field is large and the focus position deviation caused by the mechanical reciprocating motion may be ignored; that is, for a plurality of objects on the same preset track imaged in different image acquisition cycles, the first and/or second predetermined relationship determined in any one of the previous image acquisition cycles may be used for imaging.
The effect of Z value prediction using the above prediction strategy is shown in figs. 6-8.
The curve C5 in figs. 6 to 8 is the Z-value curve obtained from the actual shooting results of the camera (the focal line formed by the actual focus positions), with shooting performed by focusing the camera only. The curve C6 is the predicted Z-value curve (the focal line formed by the predicted in-focus positions).
Fig. 6 shows Z value predictions for multiple FOVs in one cycle without interference; figs. 7 and 8 show Z value predictions in the case of interference and defocus without intervention, with successful refocusing after the defocus in fig. 7 and no refocusing after the defocus in fig. 8.
In some embodiments, the objects include a fourth object 47 and a fifth object 48 at different positions of a second preset track 45, and the imaging method includes: relatively moving the lens 104 and the second preset track 45 according to a second predetermined relationship to obtain an image of the fifth object 48 without focusing by the imaging system, the second predetermined relationship being determined by the focal plane position of the fourth object 47 and the first predetermined relationship, the second preset track 45 being different from the first preset track 43. The second predetermined relationship corresponding to the second preset track 45 is determined based on the first predetermined relationship of the first preset track 43 and the focal plane position of any object on the second preset track 45. Using the second predetermined relationship, a clear image of any object on the second preset track 45 can be obtained without focusing, so that clear images of more objects can be obtained, meeting user requirements.
Specifically, the second preset track 45 may be a track adjacent to the first preset track 43; in the above-described embodiment, the second preset track 45 is a parallel channel adjacent to the first preset track 43. The second preset track 45 may be a linear track, with the fourth object 47 and the fifth object 48 located at two positions of the linear track; for example, the fourth object 47 is located at one end of the linear track and the fifth object 48 in the middle. It is understood that there may be a plurality of fifth objects 48, arranged in sequence on the second preset track 45, each at a position different from that of the fourth object 47. It will be appreciated that in other examples the second preset track 45 may be a non-linear track, for example a curved track, which may be treated as a fit of a plurality of line segments, with the fourth object 47 and the fifth object 48 located on the same line segment of the curve.
In some embodiments, the second predetermined relationship may be a linear relationship.
In one embodiment, referring to fig. 5, when the second preset track 45 is one or more channels 52 of the chip 500 used in the sequencing process, the fifth object 48 to be imaged is located at one or more positions (FOVs) in the channels 52, and the lens and the second preset track can move relatively along the second direction B during shooting; for example, the lens is fixed, the lens includes an optical axis, and the second preset track 45 moves along a direction perpendicular to the optical axis. It will be appreciated that in some embodiments the second preset track 45 is also capable of moving in a direction parallel to the optical axis OP. The second preset track 45 can be moved according to the actual adjustment requirement.
It is understood that other manners of relative movement of the lens 104 and the second preset track 45 can also be referred to the above explanation of the manner of relative movement of the lens 104 and the first preset track 43, and are not detailed herein to avoid redundancy. It should be noted that, in the example of fig. 5, the first preset track 43 and the second preset track 45 are two adjacent channels 52 on the chip 500, so that the first preset track 43 and the second preset track 45 move synchronously when the mobile chip 500 moves.
In some embodiments, the determining of the second predetermined relationship comprises: focusing the fourth object 47 with the imaging system to determine a fourth coordinate; a second predetermined relationship is established in accordance with the first predetermined relationship and a fourth coordinate reflecting the focal plane position of the fourth object 47. Thus, the second predetermined relationship can be predetermined, and when imaging of other objects is performed, a clear image of the other objects can be obtained without focusing by the imaging system according to the second predetermined relationship, simplifying the imaging method and improving the efficiency of the imaging method.
Specifically, according to one embodiment described above, referring to fig. 5, when the fifth objects 48 to be imaged are one or more positions of the chip 500 used for sequencing, the fourth object 47 and the fifth objects 48 may be located in the same channel of the chip 500.
Preferably, the fourth object 47 and the plurality of fifth objects 48 are arranged in sequence on the second preset track 45. According to the above-mentioned embodiment, the second direction B runs from the right to the left of the chip 500; that is, along the direction B, the fourth object 47 and the plurality of fifth objects 48 are arranged in sequence on the second preset track 45. In other embodiments, the fourth object 47 and the fifth objects 48 may be arranged on the second preset track 45 in other orders.
It is to be understood that the determination of the second predetermined relationship can be referred to the above explanation of the determination of the first predetermined relationship, and is not detailed herein to avoid redundancy.
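A sketch of how the second predetermined relationship can reuse the first (the assumption here, consistent with the description above, is that adjacent parallel tracks share the same slope, so only one focusing operation on the fourth object is needed; names are illustrative):

```python
def second_relation(slope_first, pos4, z4):
    # Reuse the slope of the first predetermined relationship for the
    # adjacent parallel track; anchor the intercept with a single focus
    # measurement (pos4, z4) taken on the fourth object.
    intercept = z4 - slope_first * pos4
    return slope_first, intercept
```

Any fifth object on the second track can then be imaged by evaluating slope·pos + intercept, without further focusing.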
In certain embodiments, a method of imaging comprises: after acquiring the image of the third object 46, the lens 104 is moved relative to the first preset track 43 and/or the second preset track 45 to acquire the image of the fifth object 48 without focusing using the imaging system. In this way, after the image of the third object 46 on the first preset track 43 is obtained, the image of the fifth object 48 on the second preset track 45 is obtained, so as to realize the imaging of the objects of different preset tracks.
Specifically, in the above embodiment, after the image acquisition of one or more third objects 46 on the first preset track 43 is completed, the lens 104 and the chip 500 are relatively moved along the third direction C, i.e. the direction perpendicular to the extending direction of the channel 52, so that the lens 104 is located above the fifth object 48, and then the image of the fifth object 48 is acquired without focusing by the imaging system according to the second predetermined relationship. In the illustrated embodiment, the third direction C is perpendicular to the first direction a and the second direction B.
Further, in the example shown in fig. 5, the first preset tracks 43 and the second preset tracks 45 are arranged alternately at intervals from top to bottom. After the image acquisition of the one or more third objects 46 of the first preset track 43 (counting from the top) is completed, the lens 104 and the chip 500 are relatively moved so that the lens 104 is positioned above the fifth object 48 of the second preset track 45, and then the images of the one or more fifth objects 48 of the second preset track 45 are acquired. Then the lens 104 and the chip 500 are relatively moved again so that the lens 104 is located above the third object 46 of the second first preset track 43, and the image of that third object 46 is acquired, and so on until clear image acquisition of the objects on all the first preset tracks 43 and second preset tracks 45 is completed.
In summary, since the focal plane position (e.g., Z value) of the object whose image is to be acquired is predicted according to the first or second predetermined relationship, other FOVs are imaged without focusing, which improves imaging efficiency and accuracy. Further, applying this determination method to the shooting of the same and similar areas enables rapid, continuous image acquisition of multiple objects; the focusing process can be omitted during continuous shooting, realizing rapid scanning. Furthermore, combining the camera's automatic focus-tracking system with the focal plane prediction technique yields better picture quality and solves the problem that the camera cannot re-track the focus after interference and defocus. In a broader sense, the focal plane prediction technique gives the camera a certain intelligence: the focusing process can be completed quickly with the aid of prior knowledge, or even omitted. This intelligence has especially important extended applications during shooting.
In some embodiments, the imaging system includes an imaging device 102 and a stage, the imaging device 102 includes a lens 104 and a focusing module 106, the lens 104 includes an optical axis OP, the lens 104 is capable of moving along the direction of the optical axis OP, and the first predetermined track 43 and/or the second predetermined track 45 are located on the stage.
Hereinafter, the focusing process used when determining the first or second predetermined relationship according to the present invention will be described with specific embodiments. It should be noted that, unless otherwise specified, identically named elements used in different embodiments are limited to the explanation given in their respective embodiments, and should not be understood in a crossed or confused manner.
The first embodiment is as follows:
referring to figs. 9-11, focusing includes the following steps: (a) emitting light onto the object with the focusing module; (b) moving the lens to a first set position; (c) moving the lens from the first set position toward the object by a first set step length and judging whether the focusing module receives the light reflected by the object; (d) when the focusing module receives light reflected by the object, moving the lens from the current position to a second set position located in a first range, the first range including the current position and allowing the lens to move along the direction of the optical axis; (e) moving the lens from the second set position by a second set step length, smaller than the first set step length, and acquiring an image of the object with the imaging device at each step position; (f) evaluating the images of the object and achieving focusing according to the image evaluation results.
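Steps (b)-(f) above can be sketched as follows. This is a hypothetical simplification: `reflect_detected` stands in for the focusing module's reflected-light check, `sharpness` for the image evaluation of step (f), and the sign convention (decreasing z moves toward the object) is an assumption, not the patent's.

```python
def focus(reflect_detected, sharpness, start, s1, r0, n_fine):
    # (b)-(c): coarse approach from the first set position toward the
    # object until the focusing module reports reflected light.
    z = start
    while not reflect_detected(z):
        z -= s1
    # (d): jump from the current position to the second set position
    # inside the first range (here: three fine steps past it).
    z2 = z - 3 * r0
    # (e)-(f): fine steps moving away from the object, imaging and
    # evaluating at each position; keep the sharpest one.
    best_z, best_score = z2, float('-inf')
    for i in range(n_fine):
        zi = z2 + i * r0
        score = sharpness(zi)
        if score > best_score:
            best_score, best_z = score, zi
    return best_z
```

The coarse step s1 finds the medium interface quickly; the fine step r0 then locates the clear plane within the first range.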
With this method, the plane on which the target object is imaged clearly, i.e., the clear plane (in-focus plane), can be found quickly and accurately. The method is particularly suitable for devices containing precise optical systems, such as optical detection devices with high-magnification lenses, in which the clear plane is not easily found. Cost can thus be reduced.
Specifically, in the above focusing step, the object is one whose focal plane position needs to be obtained. For example, if the first predetermined relationship needs to be determined, two objects may be selected on the first preset track, and the two objects located on the first preset track 43 may be focused sequentially or simultaneously to obtain two sets of focal plane position data, one used as the focal plane position data of the first object 42 and the other as that of the second object 44. If the second predetermined relationship needs to be determined, an object may be selected on the second preset track for focusing, and its focal plane position data obtained as that of the fourth object 47, so as to determine the second predetermined relationship in combination with the first predetermined relationship.
Referring to figs. 10 and 11, in the embodiment of the present invention the objects are a plurality of positions (FOVs) of a sample 300 used in sequencing; specifically, the object being focused may be the first object or the second object when the first predetermined relationship is determined, and the fourth object or the fifth object when the second predetermined relationship is determined. The sample 300 includes a carrier 200 and a sample 302 to be tested located on the carrier; the sample 302 to be tested is a biomolecule, such as a nucleic acid, and the lens 104 is located above the carrier 200. The carrier 200 has a front panel 202 and a back panel (lower panel), each having two surfaces, and the sample 302 to be tested is attached to the upper surface of the lower panel, i.e., the sample 302 is located below the lower surface 204 of the front panel 202. In the embodiment of the present invention, since the imaging device 102 is used to acquire images of the sample 302 to be tested, the sample 302 corresponds to the positions (FOVs) photographed, and because it lies below the lower surface 204 of the front panel 202 of the carrier 200, when the focusing process starts the lens 104 moves to find the medium interface 204 where the sample 302 is located, improving the success rate of acquiring a clear image with the imaging device 102. In the embodiment of the present invention, the sample 302 to be tested is a solution, the front panel 202 of the carrier 200 is glass, and the medium interface 204 between the carrier 200 and the sample 302 is the lower surface 204 of the front panel 202, i.e., the interface between the glass and the liquid.
The sample 302 to be tested whose image needs to be acquired by the imaging device 102 is located below the lower surface 204 of the front panel 202, and the clear surface for clearly imaging the sample 302 to be tested is determined and found according to the image acquired by the imaging device 102, which may be referred to as focusing. In one example, the front panel 202 has a thickness of 0.175 mm.
In other embodiments, the carrier 200 can be a slide, with the sample 302 to be tested placed on the slide, or the sample 302 to be tested can be clamped between two slides. In another embodiment, the carrier 200 may be a reaction device, such as a chip with a sandwich structure having a carrier panel on and under, and the sample 302 to be tested is disposed on the chip.
In the present embodiment, referring to fig. 11, the imaging device 102 includes a microscope 107 and a camera 108; the lens 104 includes an objective lens 110 of the microscope and a camera lens 112. The focusing module 106 can be fixed to the camera lens 112 by a dichroic beam splitter 114, located between the camera lens 112 and the objective lens 110. The dichroic beam splitter 114 includes a dual C-mount beam splitter. The dichroic beam splitter 114 reflects light emitted from the focusing module 106 to the objective lens 110 and allows visible light to pass through the camera lens 112 into the camera 108, as shown in fig. 11.
In the embodiment of the present invention, the movement of the lens 104 is along the optical axis OP. The movement of the lens 104 may refer to the movement of the objective lens 110, and the position of the lens 104 may refer to the position of the objective lens 110. In other embodiments, other lenses of the lens 104 may be selected to be moved to achieve focus. In addition, the microscope 107 further includes a tube lens 111(tube lens) between the objective lens 110 and the camera 108.
In this embodiment, the stage can drive the sample 300 to move in a plane (e.g., the XY plane) perpendicular to the optical axis OP (e.g., the Z axis) of the lens 104, and/or can drive the sample 300 to move along the optical axis OP (e.g., the Z axis) of the lens 104.
In other embodiments, the plane in which the stage drives the sample 300 to move is not perpendicular to the optical axis OP, i.e. the included angle between the motion plane of the sample and the XY plane is not 0, and the imaging method is still applicable.
In addition, the imaging device 102 can also drive the objective lens 110 to move along the optical axis OP of the lens 104 for focusing. In some examples, the imaging device 102 drives the objective lens 110 to move using an actuator such as a stepper motor or a voice coil motor.
In the present embodiment, when establishing the coordinate system, as shown in fig. 10, the positions of the objective lens 110, the stage, and the sample 300 may be set on the negative axis of the Z-axis, and the first set position may be a coordinate position on the negative axis of the Z-axis. It is understood that, in other embodiments, the relationship between the coordinate system and the camera and the objective lens 110 may be adjusted according to actual situations, and is not limited in particular.
In one example, the imaging device 102 comprises a total internal reflection fluorescence microscope, the objective lens 110 has a magnification of 60x, and the first set step S1 is 0.01 mm. This value of S1 is suitable: if S1 is too large, the lens may step across the acceptable focusing range; if S1 is too small, the time overhead increases.
When the focusing module 106 does not receive the light reflected by the object, the lens 104 continues to move toward the sample 300 (the object) by the first set step.
In this embodiment, the imaging system is applicable to, or comprises, a sequencing system.
In this embodiment, the first range includes a first section and a second section on opposite sides of the current position, the second section being defined as the one closer to the sample, and step (e) includes: (i) when the second set position is located in the second section, moving the lens from the second set position in a direction away from the object, acquiring images of the object with the imaging device at each step position; or (ii) when the second set position is located in the first section, moving the lens from the second set position in a direction toward the object, acquiring images of the object with the imaging device at each step position. The movement of the lens can thus be controlled according to the specific location of the second set position, and the required images acquired rapidly.
Specifically, in one example, the current position may be set as the origin oPos and a coordinate axis Z1 established along the optical axis direction of the lens, with the first section as the positive interval and the second section as the negative interval. The range spans ±rLen about the origin, i.e., the first range is [oPos − rLen, oPos + rLen]. The second set position is located in the negative interval, at (oPos − 3 × r0), where r0 denotes the second set step. The imaging device begins image acquisition at (oPos − 3 × r0) and moves away from the object.
It should be noted that the coordinate axis Z1 established in the above example coincides with the Z axis of fig. 10, and the first range is located in the negative range of the Z axis. This simplifies the control of the imaging method, and for example, the correspondence between the position of the lens on the coordinate axis Z1 and the position on the Z axis can be known only by knowing the positional relationship between the origin of the Z axis and the origin oPos.
In this embodiment, step (f) includes: comparing the image evaluation result with a preset condition; if the image evaluation result satisfies the preset condition, saving the position of the lens 104 corresponding to that image; if not, moving the lens 104 to a third set position located in the other interval of the first range, different from the interval where the second set position lies, i.e., starting reverse shooting and focusing. For example, if during part (i) of step (e) none of the image evaluation results meets the preset condition, moving the lens 104 to the third set position corresponds to moving the lens to the start position of part (ii) of step (e), after which reverse shooting and focusing, i.e., the part (ii) process, is performed. The in-focus position is thus searched within the first range, effectively improving the efficiency of the imaging method.
Specifically, referring to the example of the above embodiment, the second set position is located at (oPos − 3 × r0) in the negative interval, and the lens is moved upward from there with the imaging device 102 capturing an image at each step position. If no image evaluation result satisfies the preset condition, the lens 104 is moved to a third set position located in the positive interval, for example (oPos + 3 × r0); the imaging device 102 then captures images starting from (oPos + 3 × r0) while moving toward the object, and focusing is achieved according to the obtained image evaluation results. When an image evaluation result satisfies the preset condition, the current position of the lens 104 corresponding to that image is saved, so that the imaging device 102 can output a clear image when shooting during the sequencing reaction.
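The two-pass search just described (negative interval first, then the reverse pass from the positive interval) can be sketched as below. `evaluate` and `satisfies` stand in for the image evaluation and the preset condition; all names and the uniform step count are assumptions for illustration.

```python
def search_first_range(evaluate, satisfies, o_pos, r0, n_steps):
    # Pass 1: start at oPos - 3*r0 (negative interval) and step away
    # from the object. Pass 2: if nothing qualified, restart at
    # oPos + 3*r0 and step toward the object (reverse focusing).
    for direction, start in ((+1, o_pos - 3 * r0), (-1, o_pos + 3 * r0)):
        for i in range(n_steps):
            z = start + direction * i * r0
            if satisfies(evaluate(z)):
                return z   # save this lens position as the in-focus one
    return None            # no image in the first range met the condition
```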
In some embodiments, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) includes: moving the lens by the coarse step until the first evaluation value of the image at the corresponding position is not greater than a first threshold; then switching to the fine step and continuing to move until the second evaluation value of the image at the corresponding position reaches its maximum; and saving the position of the lens 104 corresponding to the image with the largest second evaluation value. The coarse step allows the lens 104 to quickly approach the in-focus position, and the fine step ensures that the lens 104 can reach it.
Specifically, the position of the lens 104 corresponding to the image of the largest second evaluation value may be saved as the in-focus position. At each step position of image acquisition by the imaging device 102, a first evaluation value and a second evaluation value are calculated for the acquired image.
In one example, during sequencing the object carries an optically detectable label, such as a fluorescent label; the fluorescent molecules can be excited to fluoresce under laser irradiation of a specific wavelength, and the image collected by the imaging device 102 includes bright spots that may correspond to the positions of the fluorescent molecules. It can be understood that when the lens 104 is at the in-focus position, the bright spot corresponding to a fluorescent molecule in the collected image is small and bright; when the lens 104 is at an out-of-focus position, it is large and dim.
In this embodiment, the size of the bright spot on the image and the intensity of the bright spot are used to evaluate the image.
For example, the size of a bright spot on an image is reflected by the first evaluation value. In one example, the first evaluation value is determined by counting the connected component size of bright spots on an image, where a set of connected pixels each larger than the average pixel value of the image is defined as one connected component. The first evaluation value may be determined, for example, by calculating the size of the connected component corresponding to each bright spot and taking the average of these sizes, as a characteristic of the image, as its first evaluation value; alternatively, the connected component sizes corresponding to the bright spots may be sorted from small to large, and the size at the 50th, 60th, 70th, 80th, or 90th percentile taken as the first evaluation value of the image.
In one example, the size of the connected component (Area) corresponding to a bright spot of an image is A × B, where A represents the size of the connected component in the row passing through the center of the matrix corresponding to the bright spot, and B represents the size of the connected component in the column passing through that center. The matrix corresponding to a bright spot is defined as a k1 × k2 matrix with odd numbers of rows and columns, comprising k1 × k2 pixels.
In one example, the image is first binarized, converting it into a digital matrix, and the size of the connected component is then calculated. For example, with the average pixel value of the image as reference, pixels not smaller than the average are marked 1 and pixels smaller than the average are marked 0, as shown in fig. 12. In fig. 12 the bold enlarged digit indicates the center of the matrix corresponding to the bright spot, and the bold frame indicates a 3 × 3 matrix. Connected pixels marked 1 form a connected component, and the size of the connected component corresponding to the bright spot is A × B with A = 3 and B = 6.
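A small illustration of the binarize-then-measure step (the helper names and the run-length reading of A and B through the spot center are this sketch's assumptions about the described procedure):

```python
def spot_area(img, center):
    # Binarize against the image mean: pixels >= mean become 1, else 0.
    n, m = len(img), len(img[0])
    mean = sum(sum(row) for row in img) / (n * m)
    binary = [[1 if v >= mean else 0 for v in row] for row in img]
    r, c = center

    def run_len(line, idx):
        # Length of the connected run of 1s through index idx.
        if not line[idx]:
            return 0
        left = idx
        while left > 0 and line[left - 1]:
            left -= 1
        right = idx
        while right < len(line) - 1 and line[right + 1]:
            right += 1
        return right - left + 1

    a = run_len(binary[r], c)                   # A: run along the center row
    b = run_len([row[c] for row in binary], r)  # B: run along the center column
    return a * b                                # Area = A * B
```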
The first threshold may be set empirically or from prior knowledge. In one example, the first evaluation value reflects the size of a bright spot on an image; the inventors observed that the connected component Area first becomes smaller and then larger as the lens approaches and then moves away from the clear plane, and determined the first threshold based on the Area values and their pattern of change over many clear-plane searches. In one example, the first threshold is set to 260. Note that the first threshold may be related to the coarse and fine step settings: it can be sized so that the lens does not step across the focal plane of the imaging device while imaging the object with the coarse step.
In some embodiments, the second evaluation value or the third evaluation value is determined by computing scores of the bright spots of the images: the Score of a bright spot of an image is Score = ((k1 × k2 − 1) × CV − EV) / ((CV + EV) / (k1 × k2)), where CV represents the central pixel value of the matrix corresponding to the bright spot and EV represents the sum of the non-central pixel values of that matrix. The second or third evaluation value can thus be determined.
Specifically, after the bright spots of the image are determined, the Score values of all bright spots of the image may be sorted in ascending order. When the number of bright spots is larger than a preset number, for example a preset number of 30 with 50 bright spots, the second evaluation value may take the Score value at the 50th, 60th, 70th, 80th, or 90th percentile, so that the interference of the 50%, 60%, 70%, 80%, or 90% of bright spots of relatively poor quality is excluded. Generally, a concentrated bright spot whose central intensity/pixel value differs greatly from its edge values is considered to correspond to a molecule to be detected; in nucleic acid detection, the molecule to be detected may be a nucleic acid molecule corresponding to a target.
When the number of bright spots is smaller than the preset number, for example 10 bright spots, fewer than the preset number, the count is too small to be statistically meaningful, so the bright spot with the largest Score value is taken to represent the image; that is, the Score value at the 100th percentile is taken as the third evaluation value.
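The Score formula above can be computed directly from the k1 × k2 matrix of a bright spot (the function name and the convention that the center is at the middle row/column of an odd-sized patch are assumptions for this sketch):

```python
def spot_score(patch):
    # patch: k1 x k2 matrix (odd rows/columns) around one bright spot.
    # Score = ((k1*k2 - 1)*CV - EV) / ((CV + EV) / (k1*k2)),
    # where CV is the central pixel value and EV the sum of the rest.
    k1, k2 = len(patch), len(patch[0])
    ci, cj = k1 // 2, k2 // 2
    cv = patch[ci][cj]
    ev = sum(sum(row) for row in patch) - cv
    return ((k1 * k2 - 1) * cv - ev) / ((cv + ev) / (k1 * k2))
```

A spot whose intensity is concentrated at the center (large CV, small EV) scores high, matching the description that in-focus spots are small and bright.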
In the present embodiment, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels. The preset condition is that the number of bright spots on the image is larger than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold, and the second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after it; or that the number of bright spots on the image is smaller than the preset value, the first evaluation value of the image at the corresponding position is not larger than the first threshold, and the third evaluation value of the image at the corresponding position is the largest among the third evaluation values of the N images before and after it. Evaluating with different evaluation values according to the number of bright spots makes the focusing of the imaging method more accurate.
Specifically, in one example, the first evaluation value may be the size of the connected component corresponding to a bright spot of the image in the above embodiment. Where the second and third evaluation values differ, the Score percentile used depends on whether the number of bright spots is statistically meaningful: for example, a Score value below the 100th percentile in one case and the Score value at the 100th percentile in the other.
In one example, where single molecule sequencing is performed, the bright spots on the collected image may be from one or more optically detectable labeled molecules carried by the sample to be tested, or from other interferences.
In this embodiment, bright spots are detected, and the bright spots corresponding to the labeled molecules are identified, for example, using a k1 × k2 matrix. Specifically, the bright spots on the image are detected as follows:
Bright-spot detection is performed on the image with a k1 × k2 matrix, and a matrix whose central pixel value is not less than any of its non-central pixel values is judged to correspond to a bright spot. Both k1 and k2 are odd numbers greater than 1, and the k1 × k2 matrix covers k1 × k2 pixels.
This method exploits the difference between the brightness/intensity of the signal generated by fluorescence and the background brightness/intensity, and can simply and quickly pick out information from the labeled-molecule signal. In some embodiments, a bright spot further requires that the central pixel value of the matrix be greater than a first preset value and that every non-central pixel value of the matrix be greater than a second preset value.
The first and second preset values can be set empirically or from pixel/intensity data of the normal bright spots of a certain number of normal images. Here, "normal images" and "normal bright spots" refer to images obtained by the imaging system at the clear-plane position that look normal to the naked eye: the image is sharp, the background is clean, and the bright spots are uniform in size and brightness, for example. In one embodiment, the first and second preset values are related to the average pixel value of the image. For example, setting the first preset value to 1.4 times the average pixel value of the image and the second preset value to 1.1 times the average pixel value makes it possible to exclude interference and obtain bright-spot detection results that come from the label.
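Putting the window test and the two preset values together, a sketch of the detection might look like the following; the reading of the conditions (centre not less than any neighbour, centre above 1.4× the mean, neighbours above 1.1× the mean) is an interpretation of the text, and the factors should be treated as tunable:

```python
import numpy as np

def detect_bright_spots(img, k1=3, k2=3, c1=1.4, c2=1.1):
    """Bright-spot detection with a k1 x k2 window (k1, k2 odd, > 1).

    A window counts as a bright spot when its centre pixel is not less
    than any non-centre pixel, the centre exceeds c1 * mean(img), and
    every non-centre pixel exceeds c2 * mean(img)."""
    img = np.asarray(img, dtype=float)
    t1, t2 = c1 * img.mean(), c2 * img.mean()   # first / second preset values
    h, w = img.shape
    r1, r2 = k1 // 2, k2 // 2
    spots = []
    for y in range(r1, h - r1):
        for x in range(r2, w - r2):
            patch = img[y - r1:y + r1 + 1, x - r2:x + r2 + 1]
            cv = patch[r1, r2]
            rest = np.delete(patch.ravel(), r1 * k2 + r2)  # non-centre pixels
            if cv >= rest.max() and cv > t1 and (rest > t2).all():
                spots.append((y, x))
    return spots
```

A production version would vectorize this scan; the nested loops are kept for clarity.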
Specifically, in one example, the image is a color image in which each pixel has three pixel values; the color image can be converted into a grayscale image before detection, reducing the computation and complexity of the detection process. The non-grayscale image may be converted to grayscale using, but not limited to, a floating-point method, an integer method, a shift method, or an average-value method. Color images can of course also be detected directly: the pixel-value comparisons above are then comparisons of three-dimensional values (arrays of three elements), and the relative order of multi-dimensional values can be defined empirically as needed. For example, three-dimensional value a may be regarded as larger than three-dimensional value b when a exceeds b in any two of the three dimensions.
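Two of the conversion methods mentioned can be sketched as below. The luminance weights are the conventional ITU-R BT.601 values commonly used by the floating-point method, not numbers taken from the patent:

```python
def to_gray(r, g, b, method="float"):
    """RGB -> grey by the floating-point (luminance-weighted) method or
    the average-value method; weights 0.299/0.587/0.114 are the usual
    BT.601 convention."""
    if method == "float":
        return 0.299 * r + 0.587 * g + 0.114 * b
    if method == "mean":
        return (r + g + b) / 3.0
    raise ValueError("unknown method: " + method)
```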
In another example, the image is a grayscale image, and the pixel values of the grayscale image are the same as the grayscale values. Therefore, the average pixel value of the image is the average gray value of the image.
In one example, the first threshold is 260, the preset number is 30, and N is 2. That is, when the first evaluation value of the image at the corresponding position is not more than 260 and the number of bright spots is more than 30, the second evaluation value of the image is computed, and the position whose image has the largest second evaluation value is taken as the clear-plane position, provided that at the 2 positions before and after it the second evaluation value of the corresponding image is greater than zero. When the first evaluation value of the image at the corresponding position is not more than 260 and the number of bright spots is less than 30, the third evaluation value is computed, and the position whose image has the largest third evaluation value is taken as the clear-plane position, provided that at the 2 positions before and after it the third evaluation value of the corresponding image is greater than zero.
And if the image meeting the condition is not found, judging that the image evaluation result does not meet the preset condition.
In one example, k1 = k2 = 3; the 3 × 3 matrix then contains 9 pixels and EV is the sum of the 8 non-central pixel values.
In the present embodiment, if focusing cannot be completed according to the image evaluation result, the lens is moved in the direction perpendicular to the optical axis to the next image capture area (FOV) of the subject for focusing. Refocusing can thus proceed from another object instead of dwelling on an object that cannot be focused, saving time.
In this embodiment, the imaging method further includes: prompting a focusing failure when the number of current objects that were not successfully focused is larger than a preset number. The cause of the failure can then be removed manually instead of focusing indefinitely, saving time. In this case the cause may be, for example, a misplaced object or a malfunction of the imaging apparatus, and it can be removed manually after the failure is prompted. In one example, the preset number is 3; that is, a focusing failure is prompted when the number of current objects for which focusing was unsuccessful exceeds 3. The failure can be prompted by displaying an image or text, playing a sound, and the like.
In this embodiment, the imaging method further includes: judging whether the position of the lens exceeds the first range, and exiting focusing when it does. Exiting focusing once the lens position exceeds the first range avoids over-long focusing time and increased power consumption.
Specifically, in the example of the above embodiment, the first range is [ oPos + rLen, oPos-rLen ].
In this embodiment, when the lens 104 moves, it is determined whether the current position of the lens 104 exceeds a fourth setting position; when the current position of the lens 104 exceeds the fourth set position, the movement of the lens 104 is stopped. Thus, the first setting position and the fourth setting position can limit the moving range (first range) of the lens 104, so that the lens 104 can stop moving when focusing is failed, waste of resources or damage of equipment is avoided, or refocusing can be performed on the lens 104 when focusing is failed, and automation of the imaging method is improved.
In a total internal reflection imaging system, for example, to find the medium interface quickly, the settings are adjusted so that the range of motion of the lens 104 is as small as suffices to implement the solution. For example, in a total internal reflection imaging device with a 60× objective, the movement range of the lens 104 can be set to 200 μm ± 10 μm or [190 μm, 250 μm] according to the optical-path characteristics and empirical summary.
In this embodiment, given the determined moving range and either one of the fourth setting position and the first setting position, the other setting position can be determined. In one example, the fourth setting position is set to the position one depth of field below the lowest position of the upper surface 205 of the front panel 202 of the reaction apparatus 200, and the moving range of the lens 104 is set to 250 μm, from which the first setting position is determined. In this example, the coordinate of that position one depth of field lower is smaller along the negative Z-axis direction.
Specifically, in the present embodiment, the movement range is a section on the negative Z axis. In one example, the first set position is nearlimit and the fourth set position is farlimit, with both coordinates on the negative Z axis: nearlimit = −6000 μm and farlimit = −6350 μm. The size of the range of motion defined between nearlimit and farlimit is 350 μm. When the coordinate of the current position of the lens 104 is smaller than the coordinate of the fourth setting position, the current position of the lens 104 is judged to exceed the fourth setting position. In fig. 10, farlimit lies one depth of field L below the lowest position of the upper surface 205 of the front panel 202 of the reaction apparatus 200, where L is the depth of field of the lens 104.
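The limit check described here is simple; a sketch using the example coordinates (nearlimit = −6000 μm, farlimit = −6350 μm, so the allowed travel is the 350 μm between them on the negative Z axis) follows. The function name and the convention that positions are in micrometres are illustrative:

```python
def within_limits(z, nearlimit=-6000.0, farlimit=-6350.0):
    """True while the lens position z (um, on the negative Z axis) lies
    inside the allowed travel; a coordinate smaller than farlimit means
    the limit position has been exceeded and the lens must stop."""
    return farlimit <= z <= nearlimit
```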
It should be noted that, in other embodiments, the coordinate position corresponding to the first setting position and/or the fourth setting position may be specifically set according to the actual situation, and is not specifically limited herein.
In the present embodiment, the focusing module 106 includes a light source 116 and a light sensor 118, the light source 116 is used for emitting light to the object, and the light sensor 118 is used for receiving light reflected by the object. Thus, the light emitting and receiving of the focusing module 106 can be realized.
Specifically, in the embodiment of the present invention, the light source 116 may be an infrared light source 116, and the light sensor 118 may be a photo diode (photo diode), so that the cost is low and the accuracy of the detection is high. Infrared light emitted by the light source 116 enters the objective lens 110 via reflection by the dichroic beamsplitter and is projected through the objective lens 110 onto the sample 300 and the object. The object may reflect the infrared light projected through the objective lens 110. In an embodiment of the present invention, when the sample 300 comprises the carrier 200 and the sample 302 to be measured, the received light reflected by the object is the light reflected by the lower surface 204 of the front panel of the carrier 200.
Whether infrared light reflected by the object can enter the objective lens 110 and be received by the light sensor 118 depends primarily on the distance of the objective lens 110 from the object. Therefore, when the focusing module 106 determines that the infrared light reflected by the object is received, it can be determined that the distance between the objective lens 110 and the object is within the suitable range for optical imaging, and the distance can be used for imaging of the imaging device 102. In one example, the distance is 20-40 um.
At this time, the lens 104 is moved by a second set step smaller than the first set step, so that the imaging system can find the optimal imaging position of the lens 104 in a smaller range.
In this embodiment, referring to fig. 13, when the focusing module 106 receives the light reflected by the object, the imaging method further includes the steps of: moving the lens 104 toward the subject by a third set step that is smaller than the first set step and larger than the second set step, calculating a first light intensity parameter from the intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is greater than a first set light intensity threshold; when it is, proceeding to the step of moving the lens 104 by the second set step. Comparing the first light intensity parameter with the first set light intensity threshold excludes, from focusing, the interference of light signals whose contrast with the light reflected from the medium interface is very weak.
When the first light intensity parameter is not greater than the first set light intensity threshold, the lens 104 is caused to continue moving toward the subject in a third set step.
In the present embodiment, the focusing module 106 includes two light sensors 118, the two light sensors 118 are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118. In this way, the first light intensity parameter is calculated by the average of the light intensities of the light received by the two light sensors 118, so that it is more accurate to exclude weak light signals.
Specifically, the first light intensity parameter may be written SUM = (PD1 + PD2)/2, where PD1 and PD2 are the light intensities received by the two light sensors 118, respectively. In one example, the first set light intensity threshold nSum is 40.
In one example, the third set step size S2 is 0.005 mm. It is understood that, in other examples, the third setting step may also take other values, and is not limited in particular.
Example two:
It should be noted that this embodiment can adopt the imaging-system structure of the first embodiment; the focusing method or focusing logic of the second embodiment differs from that of the first, but the imaging-system structure used is essentially the same.
Referring to fig. 10, 11 and 14, focusing includes the following steps: s11, emitting light onto the object by the focusing module 106; s12, moving the lens 104 to a first setting position; s13, moving the lens 104 from the first setting position to the object by a first setting step length and determining whether the focusing module 106 receives the light reflected by the object; when the focusing module 106 receives the light reflected by the object, S14 causes the lens 104 to move by a second set step smaller than the first set step and to capture an image of the object by the imaging device 102, and determines whether the sharpness value of the image captured by the imaging device 102 reaches a set threshold value; when the sharpness value of the image reaches the set threshold value, S15 saves the current position of the lens 104 as the save position.
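Steps S12 through S15 can be sketched as a coarse-to-fine search loop. All the callables below are hypothetical hardware/analysis hooks (the patent does not specify an API), and the step sizes and sharpness threshold are illustrative placeholders:

```python
def coarse_to_fine_focus(move_to, step_move, light_received, sharpness,
                         start, coarse=0.01, fine=0.001,
                         sharp_thresh=100.0, max_steps=1000):
    """Sketch of S12-S15: coarse-step toward the object until the
    focusing module sees reflected light, then fine-step until the image
    sharpness reaches the set threshold; return the saved position, or
    None if focusing fails within max_steps."""
    z = start
    move_to(z)                              # S12: go to the first set position
    for _ in range(max_steps):              # S13: coarse search for the interface
        if light_received():
            break
        z = step_move(z, coarse)
    for _ in range(max_steps):              # S14: fine search for the clear plane
        if sharpness() >= sharp_thresh:
            return z                        # S15: save the current position
        z = step_move(z, fine)
    return None
```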
With this focusing method, the plane in which the target object is imaged sharply, that is, the clear plane, can be found quickly and accurately. The method is particularly suitable for devices containing precise optical systems, such as optical detection devices with high-magnification lenses, in which the clear plane is not easily found.
Specifically, in the above focusing step, the object is an object that needs to obtain the focal plane position, for example, if the first predetermined relationship needs to be determined, two objects may be selected in the first preset track, and two objects located in the first preset track 43 may be focused sequentially or simultaneously, so as to obtain two sets of focal plane position data, one of which is used as the focal plane position data of the first object 42 and the other is used as the focal plane position data of the second object 44; if the second predetermined relationship needs to be determined, two objects may be selected in the second preset track, and two objects located in the second preset track 45 may be focused sequentially or simultaneously to obtain two sets of focal plane position data, where one is used as the focal plane position data of the fourth object 47 and the other is used as the focal plane position data of the fifth object 48.
Referring to fig. 10, in the embodiment of the present invention, the objects are a plurality of positions (FOVs) of the specimen 300 applied in the sequence determination, and specifically, the object in focus may be the first object or the second object when the first predetermined relationship is determined, and the object in focus may be the fourth object or the fifth object when the second predetermined relationship is determined. The sample 300 includes a carrier 200 and a sample 302 to be tested located on the carrier, the sample 302 to be tested is a biomolecule, such as a nucleic acid, etc., and the lens 104 is located above the carrier 200. The carrier 200 has a front panel 202 and a back panel (lower panel), each having two surfaces, and a sample 302 to be tested is attached to the upper surface of the lower panel, i.e., the sample 302 to be tested is located below the lower surface 204 of the front panel 202. In the embodiment of the present invention, since the imaging device 102 is used to acquire an image of the sample 302 to be detected, the sample 302 to be detected is the corresponding position (FOV) when taking a picture, and the sample 302 to be detected is located below the lower surface 204 of the front panel 202 of the carrying device 200, when the focusing process starts, the lens 104 moves to find the medium interface 204 where the sample 302 to be detected is located, so as to improve the success rate of acquiring a clear image by the imaging device 102. In the embodiment of the present invention, the sample 302 to be tested is a solution, the front panel 202 of the carrier 200 is glass, and the medium interface 204 between the carrier 200 and the sample 302 to be tested is the lower surface 204 of the front panel 202 of the carrier 200, i.e. the interface between the glass and the liquid. 
The sample 302 to be tested whose image needs to be acquired by the imaging device 102 is located below the lower surface 204 of the front panel 202, and the clear surface for clearly imaging the sample 302 to be tested is determined and found according to the image acquired by the imaging device 102, which may be referred to as focusing. In one example, the front panel 202 has a thickness of 0.175 mm.
In this embodiment, the carrier 200 can be a slide, and the sample 302 to be tested is placed on the slide, or the sample 302 to be tested is clamped between two slides. In some embodiments, the carrier 200 may be a reaction device, such as a chip with a sandwich structure having a carrier panel on and under, and the sample 302 to be tested is disposed on the chip.
In the present embodiment, referring to fig. 11, the imaging device 102 includes a microscope 107 and a camera 108, the lens 104 includes an objective lens 110 of the microscope and a camera lens 112, and the focusing module 106 can be fixed to the camera lens 112 by a dichroic beam splitter 114, which is located between the camera lens 112 and the objective lens 110. The dichroic beam splitter 114 includes a dual C-mount splitter. The dichroic beam splitter 114 reflects the light emitted from the focusing module 106 toward the objective lens 110 while allowing visible light to pass through and enter the camera 108 through the camera lens 112, as shown in fig. 11.
In the embodiment of the present invention, the movement of the lens 104 is along the optical axis OP. The movement of the lens 104 may refer to the movement of the objective lens 110, and the position of the lens 104 may refer to the position of the objective lens 110. In other embodiments, other lenses of the lens 104 may be selected to be moved to achieve focus. In addition, the microscope 107 further includes a tube lens 111(tube lens) between the objective lens 110 and the camera 108.
In this embodiment, the stage can drive the sample 300 to move in a plane (e.g., the XY plane) perpendicular to the optical axis OP (e.g., the Z axis) of the lens 104, and/or can drive the sample 300 to move along the optical axis OP of the lens 104.
In other embodiments, the plane in which the stage drives the sample 300 to move is not perpendicular to the optical axis OP, i.e. the included angle between the motion plane of the sample and the XY plane is not 0, and the imaging method is still applicable.
In addition, the imaging device 102 can also drive the objective lens 110 to move along the optical axis OP of the lens 104 for focusing. In some examples, the imaging device 102 drives the objective lens 110 to move using an actuator such as a stepper motor or a voice coil motor.
In the present embodiment, when establishing the coordinate system, as shown in fig. 10, the positions of the objective lens 110, the stage, and the sample 300 may be set on the negative axis of the Z-axis, and the first set position may be a coordinate position on the negative axis of the Z-axis. It is understood that, in other embodiments, the relationship between the coordinate system and the camera and the objective lens 110 may be adjusted according to actual situations, and is not limited in particular.
In one example, the imaging device 102 comprises a total internal reflection fluorescence microscope, the objective lens 110 has 60× magnification, and the first set step S1 is 0.01 mm. This value of S1 is suitable: too large an S1 may cross the acceptable focusing range, while too small an S1 increases the time overhead.
When the focusing module 106 does not receive the light reflected by the object, the lens 104 is moved further along the optical axis OP toward the sample 300 and the object by a first set step size.
In the present embodiment, when the sharpness value of the image does not reach the set threshold, the lens 104 is moved continuously along the optical axis OP by a second set step.
In this embodiment, the imaging system is applicable to, or comprises, a sequencing system.
In this embodiment, when the lens 104 moves, it is determined whether the current position of the lens 104 exceeds a second set position; when the current position of the lens 104 exceeds the second set position, the moving of the lens 104 is stopped or a focusing step is performed. Thus, the first setting position and the second setting position can limit the moving range of the lens 104, so that the lens 104 can stop moving when focusing is not successful, thereby avoiding waste of resources or damage of equipment, or refocusing the lens 104 when focusing is not successful, and improving automation of the imaging method.
In a total internal reflection imaging system, for example, to find the medium interface quickly, the settings are adjusted so that the range of motion of the lens 104 is as small as suffices to implement the solution. For example, in a total internal reflection imaging device with a 60× objective, the movement range of the lens 104 can be set to 200 μm ± 10 μm or [190 μm, 250 μm] according to the optical-path characteristics and empirical summary.
In this embodiment, given the determined moving range and either one of the second setting position and the first setting position, the other setting position can be determined. In one example, the second setting position is set to the position one depth of field below the lowest position of the upper surface 205 of the front panel 202 of the reaction apparatus 200, and the moving range of the lens 104 is set to 250 μm, from which the first setting position is determined. In this example, the coordinate of that position one depth of field lower is smaller along the negative Z-axis direction.
Specifically, in the embodiment of the present invention, the movement range is a section on the negative Z axis. In one example, the first set position is nearlimit and the second set position is farlimit, with both coordinates on the negative Z axis: nearlimit = −6000 μm and farlimit = −6350 μm. The size of the range of motion defined between nearlimit and farlimit is 350 μm. When the coordinate of the current position of the lens 104 is smaller than the coordinate of the second set position, the current position of the lens 104 is judged to exceed the second set position. In fig. 10, farlimit lies one depth of field L below the lowest position of the upper surface 205 of the front panel 202 of the reaction apparatus 200, where L is the depth of field of the lens 104.
It should be noted that, in other embodiments, the coordinate position corresponding to the first setting position and/or the second setting position may be specifically set according to the actual situation, and is not specifically limited herein.
In the present embodiment, the focusing module 106 includes a light source 116 and a light sensor 118, the light source 116 is used for emitting light onto the object, and the light sensor 118 is used for receiving the light reflected by the object. Thus, the light emitting and receiving of the focusing module 106 can be realized.
Specifically, in the embodiment of the present invention, the light source 116 may be an infrared light source 116, and the light sensor 118 may be a photo diode (photo diode), so that the cost is low and the accuracy of the detection is high. Infrared light emitted by the light source 116 enters the objective lens 110 via reflection by the dichroic beamsplitter and is projected through the objective lens 110 onto the sample 300 and the object. The object may reflect the infrared light projected through the objective lens 110. In an embodiment of the present invention, when the sample 300 comprises the carrier 200 and the sample 302 to be measured, the received light reflected by the object is the light reflected by the lower surface 204 of the front panel of the carrier 200.
Whether infrared light reflected by the object can enter the objective lens 110 and be received by the light sensor 118 depends primarily on the distance of the objective lens 110 from the object. Therefore, when the focusing module 106 determines that the infrared light reflected by the object is received, it can be determined that the distance between the objective lens 110 and the object is within the suitable range for optical imaging, and the distance can be used for imaging of the imaging device 102. In one example, the distance is 20-40 um.
At this time, the lens 104 is moved by a second set step smaller than the first set step, so that the imaging system can find the optimal imaging position of the lens 104 in a smaller range.
In the present embodiment, the sharpness value of the image may be used as the evaluation value for focusing. In one example, whether the sharpness value of the image captured by the imaging device 102 reaches the set threshold may be determined with a hill-climbing algorithm from image processing: by calculating the sharpness value of the image output by the imaging device 102 at each position of the objective lens 110, it is judged whether the sharpness value has reached its peak, that is, whether the lens 104 is at the clear-plane position for imaging. It will be appreciated that in other embodiments other image-processing algorithms may be used to determine whether the sharpness value has reached its peak.
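A minimal hill-climbing sketch over the sharpness curve: keep stepping while sharpness still increases and report the last position before it drops. The `sharpness_at` hook (move the lens to a position and measure sharpness there) is hypothetical, and real systems add noise handling that is omitted here:

```python
def hill_climb_peak(sharpness_at, start, step):
    """Step along the axis while sharpness increases; return the position
    just before the first decrease, i.e. the sampled sharpness peak."""
    z, best = start, sharpness_at(start)
    while True:
        nxt = sharpness_at(z + step)
        if nxt <= best:      # past the peak: previous position was clearest
            return z
        z, best = z + step, nxt
```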
When the sharpness value of the image reaches the set threshold value, the current position of the lens 104 is stored as the storage position, so that the imaging device 102 can output a sharp image when the sequence measurement reaction is taken.
In this embodiment, referring to fig. 15, when the focusing module 106 receives the light reflected by the object, the focusing further includes the steps of: s16, the lens 104 is moved to the subject by a third setting step which is smaller than the first setting step and larger than the second setting step, and the first light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the first light intensity parameter is larger than the first setting light intensity threshold is judged; when the first light intensity parameter is greater than the first set light intensity threshold, step S14 is performed. In this way, by comparing the first light intensity parameter with the first set light intensity threshold, the interference of the light signal with very weak contrast with the reflected light of the medium interface to focusing can be eliminated.
When the first light intensity parameter is not greater than the first set light intensity threshold, the lens 104 is moved further along the optical axis OP toward the subject in a third set step.
In the present embodiment, the focusing module 106 includes two light sensors 118, the two light sensors 118 are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118. In this way, the first light intensity parameter is calculated by the average of the light intensities of the light received by the two light sensors 118, so that it is more accurate to exclude weak light signals.
Specifically, the first light intensity parameter may be written SUM = (PD1 + PD2)/2, where PD1 and PD2 are the light intensities received by the two light sensors 118, respectively. In one example, the first set light intensity threshold nSum is 40.
In one example, the third set step size S2 is 0.005 mm. It is understood that, in other examples, the third setting step may also take other values, and is not limited in particular.
In another embodiment, referring to fig. 16, when the focusing module 106 receives the light reflected by the object, the method further includes the following steps: s16, the lens 104 is moved to the subject by a third setting step which is smaller than the first setting step and larger than the second setting step, and the first light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the first light intensity parameter is larger than the first setting light intensity threshold is judged; when the first light intensity parameter is greater than the first set light intensity threshold, S17, the lens 104 is moved to the subject by a fourth set step that is less than the third set step and greater than the second set step, and the second light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the second light intensity parameter is less than the second set light intensity threshold is determined; when the second light intensity parameter is smaller than the second set light intensity threshold, step S14 is performed. Therefore, by comparing the first light intensity parameter with the first set light intensity threshold value, the interference of light signals with very weak contrast with the reflected light of the medium interface on focusing/focusing can be eliminated; and the strong reflected light signals at the position of the non-medium interface, such as the interference of the light signals reflected by the oil surface/air of the objective lens 110 on focusing/focusing can be eliminated by comparing the second light intensity parameter with the second set light intensity threshold.
When the first light intensity parameter is not greater than the first set light intensity threshold, the lens 104 continues to move along the optical axis OP toward the subject by the third set step.
When the second light intensity parameter is not less than the second set light intensity threshold, the lens 104 continues to move along the optical axis OP toward the subject by the fourth set step.
In one example, the third setting step S2 is 0.005mm, and the fourth setting step S3 is 0.002 mm. It is understood that, in other examples, the third setting step and the fourth setting step may also adopt other values, and are not limited in particular.
In the present embodiment, the focusing module 106 includes two light sensors 118, the two light sensors 118 are used for receiving light reflected by the object, the first light intensity parameter is an average value of the light intensities of the light received by the two light sensors 118, the light intensities of the light received by the two light sensors 118 have a first difference, and the second light intensity parameter is the difference between the first difference and the set compensation value. In this manner, the second light intensity parameter is calculated from the light intensities received by both light sensors 118, so that strong-reflection signals can be excluded more accurately.
Specifically, the first light intensity parameter may be denoted SUM, i.e., SUM = (PD1 + PD2)/2, where PD1 and PD2 respectively denote the light intensities of the light received by the two light sensors 118. In one example, the first set light intensity threshold nSum is 40. The second light intensity parameter may be denoted err and the set compensation value denoted offset, i.e., err = (PD1 - PD2) - offset. In an ideal situation, the first difference may be zero. In one example, the second set light intensity threshold nErr is 10 and the offset is 30.
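As a minimal sketch (not the patented implementation), the two threshold tests above can be expressed as a small helper. Here `pd1` and `pd2` stand for the readings of the two light sensors 118, and the default values follow the example thresholds in the text (nSum = 40, nErr = 10, offset = 30); the function name is illustrative.

```python
def intensity_checks(pd1, pd2, n_sum=40, n_err=10, offset=30):
    """Return (passes_first, passes_second) for the two light-intensity tests.

    The first test rejects weak signals (SUM must exceed nSum); the second
    rejects strong non-interface reflections (err must stay below nErr).
    """
    sum_param = (pd1 + pd2) / 2          # first light intensity parameter SUM
    err_param = (pd1 - pd2) - offset     # second light intensity parameter err
    return sum_param > n_sum, err_param < n_err
```

For instance, sensor readings of 50 and 40 pass both tests, while a dim pair of readings fails the first test and keeps the lens stepping toward the subject.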
In this embodiment, when the lens 104 is moved by the second set step, it is determined whether the first sharpness value of the image corresponding to the current position of the lens 104 is greater than the second sharpness value of the image corresponding to the previous position of the lens 104; when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is greater than the set difference, the lens 104 continues to move toward the object by the second set step; when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, the lens 104 continues to move toward the object by a fifth set step smaller than the second set step, so that the sharpness value of the image acquired by the imaging device 102 reaches the set threshold; when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is greater than the set difference, the lens 104 is moved away from the subject by the second set step; when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is less than the set difference, the lens 104 is moved away from the subject by the fifth set step, so that the sharpness value of the image acquired by the imaging device 102 reaches the set threshold. In this way, the position of the lens 104 corresponding to the peak of the sharpness value can be found accurately, so that the image output by the imaging device is sharp.
Specifically, the second set step may be taken as a coarse step Z1, the fifth set step may be taken as a fine step Z2, and a coarse adjustment range Z3 may be set. The coarse adjustment range Z3 is set so that the movement of the lens 104 can be stopped when the sharpness value of the image does not reach the set threshold, thereby saving resources.
The coarse adjustment range Z3 is an adjustment range with the current position of the lens 104 as the starting point T, i.e., the adjustment range on the Z axis is (T, T + Z3). The lens 104 is first moved in a first direction (e.g., a direction toward the object along the optical axis OP) by a step Z1 within a range of (T, T + Z3), and a first sharpness value R1 of an image captured by the imaging device 102 at a current position of the lens 104 is compared with a second sharpness value R2 of an image captured by the imaging device 102 at a previous position of the lens 104.
When R1 > R2 and R1 - R2 > R0, indicating that the sharpness value of the image is approaching the set threshold but is still far from it, the lens 104 continues to move in the first direction by the step Z1 to approach the set threshold quickly.
When R1 > R2 and R1 - R2 < R0, indicating that the sharpness value of the image is approaching the set threshold and is already close to it, the lens 104 is moved in the first direction by the step Z2, approaching the set threshold with a smaller step.
When R2 > R1 and R2 - R1 > R0, indicating that the sharpness value of the image has crossed the set threshold and is moving away from it, the lens 104 is moved in a second direction opposite to the first direction (such as a direction away from the subject along the optical axis OP) by the step Z1 to approach the set threshold quickly.
When R2 > R1 and R2 - R1 < R0, indicating that the sharpness value of the image has crossed the set threshold but remains close to it, the lens 104 is moved in the second direction by the step Z2, approaching the set threshold with a smaller step.
In this embodiment, the fifth set step can be adjusted so that, as the lens 104 moves, the step used to approach the set threshold is neither too large nor too small.
In one example, T is 0, Z1 is 100, Z2 is 40, Z3 is 2100, and the adjustment range is (0, 2100). It should be noted that the above values are expressed in the unit of measure used for moving the lens 104 during image acquisition by the imaging device 102, and this measure is related to the light intensity. The set threshold may be understood as the peak of the focus curve, or as a range centered on or containing the peak.
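The Z1/Z2 search above can be sketched as a hill climb over the focus curve. Everything here is illustrative: `sharpness(z)` stands in for the sharpness value of the image the imaging device 102 would capture with the lens 104 at position z, the default values follow the example (T = 0, Z1 = 100, Z2 = 40, Z3 = 2100), and the stopping policy of returning the best position visited within a bounded number of steps is an assumption, since the text only specifies the step-selection rules.

```python
def focus_search(sharpness, t=0, z1=100, z2=40, z3=2100, r0=5.0, max_steps=60):
    """Coarse/fine hill climb over (t, t + z3); returns the best position visited."""
    pos = t
    prev = sharpness(pos)
    best_pos, best_val = pos, prev
    step, direction = z1, +1                      # start coarse, toward the subject
    for _ in range(max_steps):
        nxt = min(max(pos + direction * step, t), t + z3)
        if nxt == pos:                            # pinned at a range boundary
            direction = -direction
            continue
        cur = sharpness(nxt)
        pos = nxt
        if cur > best_val:
            best_pos, best_val = pos, cur
        if cur > prev:                            # R1 > R2: keep direction
            step = z1 if cur - prev > r0 else z2  # shrink step when close to peak
        else:                                     # R2 > R1: crossed the peak
            step = z1 if prev - cur > r0 else z2
            direction = -direction                # reverse toward the peak
        prev = cur
    return best_pos
```

With a synthetic focus curve peaking at z = 1000, the search settles on positions around the peak and reports the best one visited.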
Referring to fig. 17 and 5, an imaging system 100 according to an embodiment of the present invention is configured to image a subject. The imaging system 100 includes a lens 104 and a control device 101, the subject includes a first subject 42, a second subject 44 and a third subject 46 located at different positions of a first preset track 43, and the control device 101 includes a computer-executable program that implements the steps of the imaging method of any one of the above embodiments.
In the imaging system 100, the first predetermined relationship is determined by the in-focus positions of the first object 42 and the second object 44. When other objects on the first preset track are imaged, the focal plane can be predicted directly from the first predetermined relationship, and a sharp image of the third object is obtained without focusing. This is particularly suitable when there are many objects and their images are to be acquired quickly and continuously.
It should be noted that the explanation and description of the technical features and advantages of the imaging method in any of the above embodiments and examples are also applicable to the imaging system 100 of the present embodiment, and are not detailed here to avoid redundancy.
In some embodiments, the third object 46 is located between the first object 42 and the second object 44.
In some embodiments, the lens 104 is fixed, the lens 104 includes an optical axis OP, and the first predetermined track 43 is capable of moving in a direction perpendicular or parallel to the optical axis OP.
In some embodiments, the determining of the first predetermined relationship comprises:
focusing the first object 42 with the imaging system, determining first coordinates;
focusing the second object 44 with the imaging system to determine second coordinates;
a first predetermined relationship is established in terms of first coordinates reflecting the focal plane position of the first object 42 and second coordinates reflecting the focal plane position of the second object 44.
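Assuming the optional linear form of the first predetermined relationship, the two focus coordinates determine a line from which the focal plane of any other object on the track can be predicted. The sketch below uses hypothetical names not taken from the original: `x` is an object's position along the first preset track 43, and `z` is the lens coordinate at which that object is in focus.

```python
def first_relationship(x1, z1, x2, z2):
    """Return z(x): the line through the two focus coordinates (x1, z1), (x2, z2)."""
    slope = (z2 - z1) / (x2 - x1)
    return lambda x: z1 + slope * (x - x1)

# Focus coordinates measured for the first and second objects (example values).
predict = first_relationship(0.0, 10.0, 100.0, 12.0)
# predict(50.0) gives the predicted focal-plane coordinate (about 11.0) for an
# object halfway along the track, with no further focusing required.
```

This is the sense in which a sharp image of the third object can be acquired "without focusing": the relative movement of lens and track is driven by the predicted coordinate rather than a focus search.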
In some embodiments, the first predetermined trajectory 43 is a linear or non-linear trajectory; and/or the first predetermined relationship is a linear relationship.
In some embodiments, the objects include a fourth object 47 and a fifth object 48 at different positions of the second preset trajectory 45, and the control device 101 is configured to:
the lens 104 and the second preset trajectory 45 are relatively moved according to a second predetermined relationship to obtain an image of the fifth object 48 without focusing by the imaging system, the second predetermined relationship being determined by the focal plane position of the fourth object 47 and the first predetermined relationship, the second preset trajectory 45 being different from the first preset trajectory 43.
In some embodiments, the lens 104 is fixed, the lens 104 includes an optical axis OP, and the second predetermined track 45 is capable of moving in a direction perpendicular or parallel to the optical axis OP.
In some embodiments, the determining of the second predetermined relationship comprises:
focusing the fourth object 47 with the imaging system to determine a fourth coordinate;
a second predetermined relationship is established in accordance with the first predetermined relationship and a fourth coordinate reflecting the focal plane position of the fourth object 47.
In some embodiments, the control device 101 is configured to: after acquiring the image of the third object 46, the lens 104 is moved relative to the first preset track 43 and/or the second preset track 45 to acquire the image of the fifth object 48 without focusing using the imaging system.
In some embodiments, the imaging system includes an imaging device 102 and a stage 103, the imaging device 102 includes a lens 104 and a focusing module 106, the lens 104 includes an optical axis OP, the lens 104 is capable of moving along the optical axis OP, and the first predetermined track 43 and/or the second predetermined track 45 are located on the stage 103.
In some embodiments, the control device 101 is configured to perform the following steps:
(a) using the focusing module 106 to emit light onto the object;
(b) moving the lens 104 to a first set position;
(c) moving the lens 104 from the first setting position to the object by a first setting step length and determining whether the focusing module 106 receives the light reflected by the object;
(d) when the focusing module 106 receives light reflected by the object, the lens 104 is moved from the current position to a second set position, the second set position is located in a first range, and the first range includes the current position and allows the lens 104 to move in the direction of the optical axis OP;
(e) moving the lens 104 from a second set position at a second set step size, at each step position obtaining an image of the subject with the imaging device 102, the second set step size being smaller than the first set step size;
(f) evaluating the image of the object and achieving focusing according to the obtained image evaluation result.
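Steps (a)-(f) above can be sketched as a control loop. `hw` is a hypothetical object bundling the hardware operations that the text assigns to the focusing module 106 and the imaging device 102; none of its method names come from the original.

```python
def focus(hw, first_pos, coarse, fine, eval_ok):
    """Schematic focusing loop following steps (a)-(f)."""
    hw.emit_light()                         # (a) focusing module emits light
    hw.move_to(first_pos)                   # (b) lens to first set position
    while not hw.reflection_received():     # (c) approach by the first set step
        hw.move_by(coarse)
    hw.move_to(hw.second_set_position())    # (d) jump to the second set position
    while True:                             # (e)/(f) scan by the second set step
        image = hw.capture()
        if eval_ok(image):                  # evaluation meets the preset condition
            return hw.position()
        hw.move_by(fine)
```

The coarse loop only detects the reflected-light condition; the fine loop is where images are captured and evaluated at every step position.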
In some embodiments, the first range includes a first interval and a second interval which are opposite to each other with respect to the current position, and the second interval is defined to be closer to the object, and the step (e) includes:
(i) moving the lens 104 from the second set position to a direction away from the subject when the second set position is in the second section, obtaining an image of the subject with the imaging device 102 at each step position; or
(ii) When the second set position is in the first zone, the lens 104 is moved from the second set position in a direction to approach the subject, and an image of the subject is obtained with the imaging device 102 at each step position.
In certain embodiments, step (f) comprises: comparing the image evaluation result with a preset condition, and if the image evaluation result meets the preset condition, saving the position of the lens 104 corresponding to the image;
if the image evaluation result does not satisfy the preset condition, the lens 104 is moved to a third setting position, and the third setting position is located in another section of the first range, which is different from the section where the second setting position is located.
In some embodiments, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) includes: moving the lens 104 by the coarse step until the first evaluation value of the image at the corresponding position is not greater than the first threshold, then changing to the fine step and continuing to move until the second evaluation value of the image at the corresponding position reaches a maximum, and saving the position of the lens 104 corresponding to the image whose second evaluation value is the maximum.
In some embodiments, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels;
the preset condition is that the number of the bright spots on the image is larger than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold value, and the second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after the image at the corresponding position; or
The preset condition is that the number of the bright spots on the image is smaller than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold, and the third evaluation value of the image at the corresponding position is the largest among the third evaluation values of the N images before and after the current image.
In some embodiments, the imaging system includes a bright spot detection module to:
perform bright spot detection on the image using a k1 × k2 matrix, wherein a matrix whose central pixel value is not less than any of its non-central pixel values corresponds to a bright spot; both k1 and k2 are odd numbers greater than 1, and the k1 × k2 matrix includes k1 × k2 pixels.
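A minimal sketch of the k1 × k2 test just described, using plain nested lists for the image so the example stays self-contained; the function name and window geometry are illustrative.

```python
def is_bright_spot(img, r, c, k1=3, k2=3):
    """True if the k1 x k2 window centered at (r, c) marks a bright spot,
    i.e. the central pixel is not less than any other pixel in the window."""
    cv = img[r][c]
    for i in range(r - k1 // 2, r + k1 // 2 + 1):
        for j in range(c - k2 // 2, c + k2 // 2 + 1):
            if (i, j) != (r, c) and img[i][j] > cv:
                return False
    return True
```

A window whose brightest pixel sits off-center is rejected, which is what anchors each detected spot to the center of its matrix.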
In some embodiments, the central pixel value of the matrix corresponding to a bright spot is greater than a first preset value, any pixel value in the non-central area of the matrix is greater than a second preset value, and the first preset value and the second preset value are related to the average pixel value of the image.
In some embodiments, the first evaluation value is determined by counting the sizes of the connected domains corresponding to the bright spots of the image, where the connected domain size corresponding to one bright spot is Area = A × B, A represents the size of the connected domain in the row centered on the center of the matrix corresponding to the bright spot, B represents the size of the connected domain in the column centered on the center of that matrix, and connected pixels whose values are greater than the average pixel value of the image are defined as one connected domain.
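Under one reading of the Area = A × B definition above, A and B are the runs of connected above-average pixels along the row and column through the spot center. That reading, and all names below, are assumptions for illustration.

```python
def spot_area(img, r, c):
    """Area = A x B for the spot centered at (r, c): A and B are the runs of
    pixels above the image mean along the center row and center column."""
    mean = sum(map(sum, img)) / (len(img) * len(img[0]))

    def run(values, idx):
        # Length of the contiguous above-mean run containing index idx.
        if values[idx] <= mean:
            return 0
        lo = idx
        while lo > 0 and values[lo - 1] > mean:
            lo -= 1
        hi = idx
        while hi < len(values) - 1 and values[hi + 1] > mean:
            hi += 1
        return hi - lo + 1

    a = run(img[r], c)                      # A: run along the row
    b = run([row[c] for row in img], r)     # B: run along the column
    return a * b
```

For a plus-shaped spot three pixels wide and three tall, this yields Area = 3 × 3 = 9.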
In some embodiments, the second evaluation value and/or the third evaluation value is determined by counting the scores of the bright spots of the image, where the score of one bright spot is Score = ((k1 × k2 - 1) × CV - EV)/((CV + EV)/(k1 × k2)), CV represents the central pixel value of the matrix corresponding to the bright spot, and EV represents the sum of the non-central pixel values of that matrix.
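The Score formula above can be sketched directly from the k1 × k2 window, with CV the central pixel value and EV the sum of the non-central pixel values; the function name is illustrative.

```python
def spot_score(window):
    """Score = ((k1*k2 - 1) * CV - EV) / ((CV + EV) / (k1*k2)) for one window."""
    k1, k2 = len(window), len(window[0])
    cv = window[k1 // 2][k2 // 2]             # CV: central pixel value
    ev = sum(map(sum, window)) - cv           # EV: sum of non-central pixels
    return ((k1 * k2 - 1) * cv - ev) / ((cv + ev) / (k1 * k2))
```

The numerator rewards a center that dominates its neighborhood, and the denominator normalizes by the window's mean intensity, so the score is high exactly for sharp, isolated spots.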
In some embodiments, the focusing module 106 includes a light source 116 and a light sensor 118, the light source 116 is configured to emit light onto the object, and the light sensor 118 is configured to receive light reflected by the object.
In some embodiments, when the focusing module 106 receives light reflected by the object, the control device 101 is further configured to:
the lens 104 is moved to the subject by a third set step length which is smaller than the first set step length and larger than the second set step length, and a first light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the first light intensity parameter is larger than a first set light intensity threshold value is judged;
when the first light intensity parameter is greater than the first set light intensity threshold, the lens 104 is moved from the current position to the second set position.
In some embodiments, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118.
In some embodiments, the control device 101 is configured to, when the lens 104 is moved: judging whether the current position of the lens 104 exceeds a fourth set position;
when the current position of the lens 104 exceeds the fourth set position, the movement of the lens 104 is stopped.
In some embodiments, the control device 101 is configured to:
using the focusing module 106 to emit light onto the object;
moving the lens 104 to a first set position;
moving the lens 104 from the first setting position to the object by a first setting step length and determining whether the focusing module 106 receives the light reflected by the object;
when the focusing module 106 receives the light reflected by the object, the lens 104 is moved by a second set step smaller than the first set step, and the imaging device 102 is used to collect the image of the object, and whether the sharpness value of the image collected by the imaging device 102 reaches a set threshold value is judged;
when the sharpness value of the image reaches the set threshold value, the current position of the lens 104 is saved as the saving position.
In some embodiments, the focusing module 106 includes a light source 116 and a light sensor 118, the light source 116 is configured to emit light onto the object, and the light sensor 118 is configured to receive light reflected by the object.
In some embodiments, when the focusing module 106 receives light reflected by the object, the control device 101 is configured to:
the lens 104 is moved toward the object by a third set step which is smaller than the first set step and larger than the second set step, and a first light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the first light intensity parameter is larger than a first set light intensity threshold value is judged;
when the first light intensity parameter is larger than the first set light intensity threshold, the lens 104 is moved by a second set step length, the imaging device 102 is used for collecting the image of the object, and whether the sharpness value of the image collected by the imaging device 102 reaches the set threshold is judged.
In some embodiments, the focusing module 106 includes two light sensors 118, the two light sensors 118 are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118.
In some embodiments, when the focusing module 106 receives light reflected by the object, the control device 101 is configured to:
the lens 104 is moved toward the object by a third set step which is smaller than the first set step and larger than the second set step, and a first light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the first light intensity parameter is larger than a first set light intensity threshold value is judged;
when the first light intensity parameter is greater than the first set light intensity threshold, the lens 104 is moved to the object by a fourth set step length which is smaller than the third set step length and larger than the second set step length, and the second light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the second light intensity parameter is smaller than the second set light intensity threshold is judged;
when the second light intensity parameter is smaller than the second set light intensity threshold, the lens 104 is moved by a second set step length, the imaging device 102 is used for collecting the image of the object, and whether the sharpness value of the image collected by the imaging device 102 reaches the set threshold is judged.
In some embodiments, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the object, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118, the light intensities of the light received by the two light sensors 118 have a first difference, and the second light intensity parameter is a difference between the first difference and the set compensation value.
In some embodiments, when moving the lens 104 by the second set step, the control device 101 is configured to: determine whether a first sharpness value of the image corresponding to the current position of the lens 104 is greater than a second sharpness value of the image corresponding to the previous position of the lens 104;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is greater than the set difference, the lens 104 continues to move toward the object by the second set step;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, the lens 104 continues to move toward the object by a fifth set step smaller than the second set step, so that the sharpness value of the image acquired by the imaging device 102 reaches the set threshold;
when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is greater than the set difference, the lens 104 is moved away from the subject by the second set step;
when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is less than the set difference, the lens 104 is moved away from the subject by the fifth set step to bring the sharpness value of the image acquired by the imaging device 102 to the set threshold.
In some embodiments, the control device 101 is configured to, when the lens 104 is moved: judging whether the current position of the lens 104 exceeds a second set position;
when the current position of the lens 104 exceeds the second set position, the moving of the lens 104 is stopped or a focusing step is performed.
A computer-readable storage medium of an embodiment of the present invention stores a program for execution by a computer, the program including the steps of the imaging method of any of the above embodiments. The computer-readable storage medium may include: read-only memory, random access memory, magnetic or optical disk, and the like.
A computer program product according to an embodiment of the present invention includes instructions that, when executed by a computer, cause the computer to perform the steps of the imaging method according to any one of the above embodiments.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In addition, each functional unit in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and those skilled in the art can make changes, modifications, substitutions and alterations to the above embodiments within the scope of the present invention.

Claims (10)

1. An imaging method for imaging an object with an imaging system including a lens, the object including a first object, a second object, and a third object at different positions of a first preset trajectory, the imaging method comprising:
and relatively moving the lens and the first preset track according to a first preset relation so as to obtain a clear image of the third object without focusing by using the imaging system, wherein the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object.
2. The method of claim 1, wherein the third object is located between the first object and the second object;
optionally, the lens is fixed, the lens includes an optical axis, and the first preset track is capable of moving in a direction perpendicular to or parallel to the optical axis;
optionally, the determining of the first predetermined relationship comprises:
focusing the first object by using the imaging system to determine a first coordinate;
focusing the second object by using the imaging system, and determining a second coordinate;
establishing the first predetermined relationship according to the first coordinate and the second coordinate, wherein the first coordinate reflects the focal plane position of the first object, and the second coordinate reflects the focal plane position of the second object;
optionally, the first preset track is a linear or non-linear track; and/or the first predetermined relationship is a linear relationship;
optionally, the objects include a fourth object and a fifth object located at different positions of the second preset track, and the imaging method includes:
relatively moving the lens and the second preset track according to a second preset relationship to obtain a sharp image of the fifth object without focusing by using the imaging system, wherein the second preset relationship is determined by the focal plane position of the fourth object and the first preset relationship, and the second preset track is different from the first preset track;
optionally, the lens is fixed, the lens includes an optical axis, and the second preset track is capable of moving in a direction perpendicular to or parallel to the optical axis;
optionally, the determining of the second predetermined relationship comprises:
focusing the fourth object by using the imaging system to determine a fourth coordinate;
establishing the second predetermined relationship according to the first predetermined relationship and the fourth coordinate, wherein the fourth coordinate reflects the focal plane position of the fourth object;
optionally, the imaging method comprises: after the sharp image of the third object is acquired, the lens and the first preset track and/or the second preset track are/is moved relatively to acquire the sharp image of the fifth object without focusing by using the imaging system.
3. The method of claim 2, wherein the imaging system comprises an imaging device and a stage, the imaging device comprises the lens and a focusing module, the lens comprises an optical axis, the lens can move along the optical axis to perform the focusing, and the first preset track and/or the second preset track are/is located on the stage;
optionally, the focusing comprises the steps of:
(a) emitting light onto the object by using the focusing module;
(b) moving the lens to a first set position;
(c) enabling the lens to move towards the object from the first set position by a first set step length and judging whether the focusing module receives light reflected by the object;
(d) moving the lens from a current position to a second set position when the focusing module receives light reflected by the object, the second set position being within a first range, the first range being a range that includes the current position and allows the lens to move in the optical axis direction;
(e) moving the lens from the second setting position by a second setting step size, and obtaining an image of the object by the imaging device at each step position, wherein the second setting step size is smaller than the first setting step size;
(f) evaluating the image of the object, and realizing focusing according to the obtained image evaluation result;
optionally, with reference to the current position, the first range includes a first interval and a second interval which are opposite to each other, and the second interval is defined to be closer to the object, and the step (e) includes:
(i) when the second set position is in the second section, moving the lens from the second set position to a direction away from the object, and obtaining an image of the object by the imaging device at each step position; or
(ii) Moving the lens from the second setting position to a direction approaching the subject when the second setting position is in the first zone, and obtaining an image of the subject with the imaging device at each step position;
optionally, step (f) comprises: comparing the image evaluation result with a preset condition, and if the image evaluation result meets the preset condition, saving the position of a lens corresponding to the image;
if the image evaluation result does not meet the preset condition, moving the lens to a third set position, wherein the third set position is located in another interval, different from the interval where the second set position is located, in the first range;
optionally, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) includes: moving the lens by the coarse step until the first evaluation value of the image at the corresponding position is not greater than a first threshold, changing to the fine step and continuing to move until the second evaluation value of the image at the corresponding position reaches a maximum, and saving the position of the lens corresponding to the image whose second evaluation value is the maximum;
optionally, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels;
the preset condition is that the number of bright spots on the image is greater than a preset value, the first evaluation value of the image at the corresponding position is not greater than a first threshold value, and the second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after the image at the corresponding position; or
The preset condition is that the number of bright spots on the image is smaller than the preset value, the first evaluation value of the image at the corresponding position is not larger than the first threshold, and the third evaluation value of the image at the corresponding position is the largest among the third evaluation values of the N images before and after the current image;
optionally, bright spots on the image are detected as follows:
performing bright spot detection on the image using a k1 × k2 matrix, wherein a matrix whose central pixel value is not less than any non-central pixel value of the matrix corresponds to one bright spot, k1 and k2 are both odd numbers greater than 1, and the k1 × k2 matrix contains k1 × k2 pixel points;
optionally, the central pixel value of the matrix corresponding to one bright spot is greater than a first preset value, every non-central pixel value of the matrix is greater than a second preset value, and the first preset value and the second preset value are related to the average pixel value of the image;
optionally, the first evaluation value is determined by counting the sizes of the connected components corresponding to the bright spots of the image, where the Area corresponding to one bright spot of the image is A × B, A represents the size of the connected component of the row centered on the center of the matrix corresponding to the bright spot, B represents the size of the connected component of the column centered on that center, and a run of connected pixel points whose values are greater than the average pixel value of the image is defined as a connected component;
optionally, the second evaluation value and/or the third evaluation value is determined by counting the Scores of the bright spots of the image, where the Score of one bright spot of the image is ((k1 × k2 - 1) × CV - EV) / ((CV + EV) / (k1 × k2)), CV represents the central pixel value of the matrix corresponding to the bright spot, and EV represents the sum of the non-central pixel values of the matrix corresponding to the bright spot;
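The bright spot detection and the Area and Score metrics described above can be illustrated with a minimal NumPy sketch. This is our own illustrative code, not an implementation prescribed by the patent; the function names and the 5×5 test image are ours, and the optional first/second preset thresholds (which would filter out flat regions) are only noted in comments:

```python
import numpy as np

def detect_bright_spots(img, k1=3, k2=3):
    """Return (row, col) centers whose k1 x k2 window has a central pixel
    value not less than any non-central pixel value (k1, k2 odd, > 1).
    In practice the preset-value thresholds would also be applied here
    to reject flat background windows."""
    h, w = img.shape
    r1, r2 = k1 // 2, k2 // 2
    spots = []
    for y in range(r1, h - r1):
        for x in range(r2, w - r2):
            win = img[y - r1:y + r1 + 1, x - r2:x + r2 + 1]
            if img[y, x] >= win.max():  # center >= every non-central pixel
                spots.append((y, x))
    return spots

def area(img, y, x):
    """Area = A * B: sizes of the connected runs of pixels above the image
    mean along the row and the column through the spot center."""
    mean = img.mean()
    def run_len(line, i):
        if line[i] <= mean:
            return 0
        a = b = i
        while a > 0 and line[a - 1] > mean:
            a -= 1
        while b < len(line) - 1 and line[b + 1] > mean:
            b += 1
        return b - a + 1
    return run_len(img[y, :], x) * run_len(img[:, x], y)

def score(img, y, x, k1=3, k2=3):
    """Score = ((k1*k2 - 1)*CV - EV) / ((CV + EV) / (k1*k2)),
    with CV the central pixel value and EV the sum of the others."""
    r1, r2 = k1 // 2, k2 // 2
    win = img[y - r1:y + r1 + 1, x - r2:x + r2 + 1]
    cv = float(img[y, x])
    ev = float(win.sum()) - cv
    return ((k1 * k2 - 1) * cv - ev) / ((cv + ev) / (k1 * k2))
```

A sharply focused point source gives a large Score (energy concentrated in the central pixel) and a small Area, which is consistent with the claims using the first evaluation value as an upper bound and maximizing the second/third evaluation values.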
optionally, the focusing module comprises a light source and a light sensor, the light source is used for emitting the light to the object, and the light sensor is used for receiving the light reflected by the object;
optionally, when the focusing module receives the light reflected by the object, the focusing further includes:
moving the lens toward the object by a third set step size smaller than the first set step size and larger than the second set step size, calculating a first light intensity parameter from the intensity of the light received by the focusing module, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is larger than the first set light intensity threshold value, moving the lens from the current position to the second set position;
optionally, the focusing module comprises two light sensors, the two light sensors are used for receiving the light reflected by the object, and the first light intensity parameter is an average value of the light intensities of the light received by the two light sensors;
optionally, when the lens moves, judging whether the current position of the lens exceeds a fourth set position;
and when the current position of the lens exceeds the fourth set position, stopping moving the lens.
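The coarse approach in the optional steps above (advance by the third set step, average the two photosensor intensities against a threshold, and stop at a travel limit) can be sketched as follows. The `move_lens`/`read_sensors` interfaces, step sizes, and thresholds are hypothetical stand-ins, not part of the patent:

```python
def coarse_approach(move_lens, read_sensors, position,
                    third_step, limit_position, intensity_threshold):
    """Advance the lens toward the object in third_step increments until the
    first light intensity parameter (mean of the two sensor readings)
    exceeds the threshold; stop at the travel limit.
    Returns the position at which the threshold was met, or None."""
    while position < limit_position:
        position = move_lens(position + third_step)
        i1, i2 = read_sensors()
        if (i1 + i2) / 2.0 > intensity_threshold:
            return position  # close enough to move to the second set position
    return None  # fourth set position reached: stop moving the lens
```

Returning `None` corresponds to the claimed safeguard of stopping the lens once its current position exceeds the fourth set position.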
4. The method of claim 3, wherein the focusing comprises the steps of:
emitting light onto the object by using the focusing module;
moving the lens to a first set position;
moving the lens toward the object from the first set position by a first set step length, and judging whether the focusing module receives light reflected by the object;
when the focusing module receives light reflected by the object, the lens is moved by a second set step length which is smaller than the first set step length, the imaging device is used for collecting images of the object, and whether the sharpness value of the images collected by the imaging device reaches a set threshold value or not is judged;
when the sharpness value of the image reaches the set threshold value, saving the current position of the lens as a saving position;
optionally, the focusing module comprises a light source for emitting light onto the object and a light sensor for receiving light reflected by the object;
optionally, when the focusing module receives the light reflected by the object, the focusing further comprises:
moving the lens by a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter from the intensity of the light received by the focusing module, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is larger than the first set light intensity threshold, moving the lens by the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches a set threshold;
optionally, the focusing module comprises two light sensors for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors;
optionally, when the focusing module receives light reflected by the object, the focusing further comprises the following steps:
moving the lens by a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter from the intensity of the light received by the focusing module, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is greater than the first set light intensity threshold, moving the lens toward the object by a fourth set step length smaller than the third set step length and larger than the second set step length, calculating a second light intensity parameter from the intensity of the light received by the focusing module, and judging whether the second light intensity parameter is smaller than a second set light intensity threshold;
when the second light intensity parameter is smaller than the second set light intensity threshold, moving the lens by the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches the set threshold;
optionally, the focusing module includes two light sensors, the two light sensors are configured to receive light reflected by the object, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors, the light intensities of the light received by the two light sensors have a first difference, and the second light intensity parameter is a difference between the first difference and a set compensation value;
optionally, when the lens is moved by the second set step length, judging whether a first sharpness value of the image corresponding to the current position of the lens is greater than a second sharpness value of the image corresponding to the previous position of the lens;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is greater than a set difference, continuing to move the lens toward the object by the second set step length;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, continuing to move the lens toward the object by a fifth set step length, smaller than the second set step length, until the sharpness value of the image acquired by the imaging device reaches the set threshold;
when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is greater than the set difference, moving the lens away from the object by the second set step length;
when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is less than the set difference, moving the lens away from the object by the fifth set step length until the sharpness value of the image acquired by the imaging device reaches the set threshold;
optionally, when the lens moves, judging whether the current position of the lens exceeds a second set position;
and when the current position of the lens exceeds the second set position, stopping moving the lens or carrying out the focusing step.
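The optional sharpness comparison above amounts to a hill climb on the sharpness curve: keep the current step while the gain is large, drop to the finer fifth step near the peak, and reverse direction on a drop. A sketch under illustrative assumptions (the `sharpness_at` callback, the synthetic sharpness curve in the usage, and the iteration cap are ours, not the patent's):

```python
def refine_focus(sharpness_at, pos, step, fine_step, set_diff, target,
                 max_steps=1000):
    """Hill-climb the lens position on a sharpness function until the
    sharpness value reaches the set threshold (target)."""
    direction = +1  # +1: toward the object, -1: away from it
    prev = sharpness_at(pos)
    for _ in range(max_steps):
        nxt = pos + direction * step
        cur = sharpness_at(nxt)
        if cur >= target:
            return nxt
        diff = cur - prev
        if diff > set_diff:
            pass                      # climbing steeply: keep the current step
        elif diff >= 0:
            step = fine_step          # small gain: near the peak, go fine
        elif -diff > set_diff:
            direction = -direction    # large drop: wrong direction, reverse
        else:
            direction = -direction    # small drop past the peak: reverse, go fine
            step = fine_step
        pos, prev = nxt, cur
    return pos  # iteration cap reached; best position found so far
```

With a sharpness curve peaking at the focal plane, the same routine converges whether the search starts before or after the peak, matching the four claimed cases.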
5. An imaging system for imaging an object, the imaging system comprising a lens and a control device, the object comprising a first object, a second object and a third object located at different positions of a first preset track, the control device being configured to:
relatively moving the lens and the first preset track according to a first preset relation so as to obtain a clear image of the third object without focusing by using the imaging system, wherein the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object;
optionally, the third object is located between the first object and the second object;
optionally, the lens is fixed, the lens includes an optical axis, and the first preset track is capable of moving in a direction perpendicular to or parallel to the optical axis;
optionally, the determining of the first predetermined relationship comprises:
focusing the first object by using the imaging system to determine a first coordinate;
focusing the second object by using the imaging system, and determining a second coordinate;
establishing the first predetermined relationship according to the first coordinate and the second coordinate, wherein the first coordinate reflects the focal plane position of the first object, and the second coordinate reflects the focal plane position of the second object;
optionally, the first preset track is a linear or non-linear track; and/or the first predetermined relationship is a linear relationship;
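When the first predetermined relationship is linear, it can be established from the two calibrated coordinates and then queried for any intermediate object without refocusing. A minimal sketch (the function name and the numeric coordinates are illustrative, not taken from the patent):

```python
def build_relation(x1, z1, x2, z2):
    """Fit the linear relationship z = a*x + b through the two calibrated
    points (x1, z1) and (x2, z2), where x is the position along the preset
    track and z the focal-plane (lens) coordinate found by focusing."""
    a = (z2 - z1) / (x2 - x1)
    b = z1 - a * x1
    return lambda x: a * x + b

# first object at track position 0.0 focused at z = 10.0,
# second object at 100.0 focused at z = 12.0 (illustrative values):
focus = build_relation(0.0, 10.0, 100.0, 12.0)
z3 = focus(50.0)  # predicted focal coordinate for a third object in between
```

A second preset track would reuse the slope `a` and only re-derive the offset `b` from one focused object (the fourth object), which is what the second predetermined relationship describes.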
optionally, the objects include a fourth object and a fifth object located at different positions of the second preset track, and the control device is configured to:
relatively moving the lens and the second preset track according to a second preset relationship to obtain an image of the fifth object without focusing by using the imaging system, wherein the second preset relationship is determined by the focal plane position of the fourth object and the first preset relationship, and the second preset track is different from the first preset track;
optionally, the lens is fixed, the lens includes an optical axis, and the second preset track is capable of moving in a direction perpendicular to or parallel to the optical axis;
optionally, the determining of the second predetermined relationship comprises:
focusing the fourth object by using the imaging system to determine a fourth coordinate;
establishing the second predetermined relationship according to the first predetermined relationship and the fourth coordinate, wherein the fourth coordinate reflects the focal plane position of the fourth object;
optionally, the control device is configured to: after the sharp image of the third object is acquired, the lens and the first preset track and/or the second preset track are/is moved relatively to acquire the image of the fifth object without focusing by using the imaging system.
6. The system of claim 5, wherein the imaging system comprises an imaging device and a stage, the imaging device comprises the lens and a focusing module, the lens comprises an optical axis, the lens is capable of moving along the optical axis, and the first preset track and/or the second preset track is located on the stage;
optionally, the control device is configured to perform the following steps:
(a) emitting light onto the object by using the focusing module;
(b) moving the lens to a first set position;
(c) moving the lens toward the object from the first set position by a first set step size, and judging whether the focusing module receives light reflected by the object;
(d) moving the lens from a current position to a second set position when the focusing module receives light reflected by the object, the second set position being within a first range, the first range being a range that includes the current position and allows the lens to move in the optical axis direction;
(e) moving the lens from the second set position by a second set step size, and obtaining an image of the object with the imaging device at each step position, wherein the second set step size is smaller than the first set step size;
(f) evaluating the image of the object, and realizing focusing according to the obtained image evaluation result;
optionally, with reference to the current position, the first range includes a first interval and a second interval opposite to each other, the second interval being defined as the one closer to the object, and step (e) includes:
(i) when the second set position is in the second interval, moving the lens from the second set position in a direction away from the object, and obtaining an image of the object with the imaging device at each step position; or
(ii) when the second set position is in the first interval, moving the lens from the second set position in a direction approaching the object, and obtaining an image of the object with the imaging device at each step position;
optionally, step (f) comprises: comparing the image evaluation result with a preset condition, and if the image evaluation result meets the preset condition, saving the position of the lens corresponding to the image;
if the image evaluation result does not meet the preset condition, moving the lens to a third set position, wherein the third set position is located in another interval, different from the interval where the second set position is located, in the first range;
optionally, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step size includes a coarse step size and a fine step size, and step (f) includes: moving the lens by the coarse step size until the first evaluation value of the image at the corresponding position is not greater than a first threshold, then continuing to move the lens by the fine step size until the second evaluation value of the image at the corresponding position reaches a maximum, and saving the position of the lens corresponding to the image whose second evaluation value is the maximum;
optionally, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels;
the preset condition is that the number of bright spots on the image is greater than a preset value, the first evaluation value of the image at the corresponding position is not greater than a first threshold value, and the second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after the image at the corresponding position; or
The preset condition is that the number of bright spots on the image is smaller than the preset value, the first evaluation value of the image at the corresponding position is not larger than the first threshold, and the third evaluation value of the image at the corresponding position is the largest among the third evaluation values of the N images before and after the current image;
optionally, the imaging system comprises a bright spot detection module, the bright spot detection module being configured to:
perform bright spot detection on the image using a k1 × k2 matrix, wherein a matrix whose central pixel value is not less than any non-central pixel value of the matrix corresponds to one bright spot, k1 and k2 are both odd numbers greater than 1, and the k1 × k2 matrix contains k1 × k2 pixel points;
optionally, the central pixel value of the matrix corresponding to one bright spot is greater than a first preset value, every non-central pixel value of the matrix is greater than a second preset value, and the first preset value and the second preset value are related to the average pixel value of the image;
optionally, the first evaluation value is determined by counting the sizes of the connected components corresponding to the bright spots of the image, where the Area corresponding to one bright spot of the image is A × B, A represents the size of the connected component of the row centered on the center of the matrix corresponding to the bright spot, B represents the size of the connected component of the column centered on that center, and a run of connected pixel points whose values are greater than the average pixel value of the image is defined as a connected component;
optionally, the second evaluation value and/or the third evaluation value is determined by counting the Scores of the bright spots of the image, where the Score of one bright spot of the image is ((k1 × k2 - 1) × CV - EV) / ((CV + EV) / (k1 × k2)), CV represents the central pixel value of the matrix corresponding to the bright spot, and EV represents the sum of the non-central pixel values of the matrix corresponding to the bright spot;
optionally, the focusing module comprises a light source and a light sensor, the light source is used for emitting the light to the object, and the light sensor is used for receiving the light reflected by the object;
optionally, when the focusing module receives light reflected by the object, the control device is further configured to:
moving the lens toward the object by a third set step size smaller than the first set step size and larger than the second set step size, calculating a first light intensity parameter from the intensity of the light received by the focusing module, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is larger than the first set light intensity threshold value, moving the lens from the current position to the second set position;
optionally, the focusing module comprises two light sensors, the two light sensors are used for receiving the light reflected by the object, and the first light intensity parameter is an average value of the light intensities of the light received by the two light sensors;
optionally, when the lens moves, the control device is configured to judge whether the current position of the lens exceeds a fourth set position;
stopping moving the lens when the current position of the lens exceeds the fourth set position;
optionally, the control device is configured to:
emitting light onto the object by using the focusing module;
moving the lens to a first set position;
moving the lens toward the object from the first set position by a first set step length, and judging whether the focusing module receives light reflected by the object;
when the focusing module receives light reflected by the object, the lens is moved by a second set step length which is smaller than the first set step length, the imaging device is used for collecting images of the object, and whether the sharpness value of the images collected by the imaging device reaches a set threshold value or not is judged;
when the sharpness value of the image reaches the set threshold value, saving the current position of the lens as a saving position;
optionally, the focusing module comprises a light source for emitting light onto the object and a light sensor for receiving light reflected by the object;
optionally, when the focusing module receives light reflected by the object, the control device is configured to:
moving the lens by a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter from the intensity of the light received by the focusing module, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is larger than the first set light intensity threshold, moving the lens by the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches a set threshold;
optionally, the focusing module comprises two light sensors for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors;
optionally, when the focusing module receives light reflected by the object, the control device is configured to:
moving the lens by a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter from the intensity of the light received by the focusing module, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is greater than the first set light intensity threshold, moving the lens toward the object by a fourth set step length smaller than the third set step length and larger than the second set step length, calculating a second light intensity parameter from the intensity of the light received by the focusing module, and judging whether the second light intensity parameter is smaller than a second set light intensity threshold;
when the second light intensity parameter is smaller than the second set light intensity threshold, moving the lens by the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches the set threshold;
optionally, the focusing module includes two light sensors, the two light sensors are configured to receive light reflected by the object, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors, the light intensities of the light received by the two light sensors have a first difference, and the second light intensity parameter is a difference between the first difference and a set compensation value;
optionally, when moving the lens by the second set step length, the control device is configured to judge whether a first sharpness value of the image corresponding to the current position of the lens is greater than a second sharpness value of the image corresponding to the previous position of the lens;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is greater than a set difference, continuing to move the lens toward the object by the second set step length;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, continuing to move the lens toward the object by a fifth set step length, smaller than the second set step length, until the sharpness value of the image acquired by the imaging device reaches the set threshold;
when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is greater than the set difference, moving the lens away from the object by the second set step length;
when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is less than the set difference, moving the lens away from the object by the fifth set step length until the sharpness value of the image acquired by the imaging device reaches the set threshold;
optionally, when the lens moves, the control device is configured to judge whether the current position of the lens exceeds a second set position;
and when the current position of the lens exceeds the second set position, stopping moving the lens or carrying out the focusing step.
7. A sequencing apparatus comprising the imaging system of any one of claims 5 to 6.
8. A computer-readable storage medium storing a program for execution by a computer, wherein executing the program comprises performing the steps of the method of any one of claims 1-4.
9. An imaging system for imaging an object, the imaging system comprising a lens and a control device, the object comprising a first object, a second object and a third object located at different positions of a first predetermined trajectory, characterized in that the control device comprises a computer executable program, execution of which comprises performing the steps of the method of any of claims 1-4.
10. A computer program product comprising instructions, characterized in that, when said instructions are executed by a computer, said instructions cause said computer to carry out the steps of the method according to any one of claims 1 to 4.
CN201810814359.0A 2018-07-23 2018-07-23 Imaging method, device and system Pending CN112333378A (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201810814359.0A CN112333378A (en) 2018-07-23 2018-07-23 Imaging method, device and system
EP19841635.6A EP3829158A4 (en) 2018-07-23 2019-07-23 Imaging method, device and system
PCT/CN2019/097272 WO2020020148A1 (en) 2018-07-23 2019-07-23 Imaging method, device and system
US17/262,663 US11368614B2 (en) 2018-07-23 2019-07-23 Imaging method, device and system
US17/746,838 US11575823B2 (en) 2018-07-23 2022-05-17 Imaging method, device and system


Publications (1)

Publication Number Publication Date
CN112333378A true CN112333378A (en) 2021-02-05


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763401A (en) * 2021-09-10 2021-12-07 南京比邻智能软件有限公司 Rapid multi-point automatic focusing method, system and application equipment thereof

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6674574B1 (en) * 1999-09-24 2004-01-06 Olympus Corporation Focusing system for a microscope and a reflected illumination fluorescence microscope using the focusing system
US20040217257A1 (en) * 2003-05-01 2004-11-04 Eastman Kodak Company Scene-based method for determining focus
JP2005284155A (en) * 2004-03-30 2005-10-13 Fuji Photo Film Co Ltd Manual focusing device and focusing assist program
TW200804962A (en) * 2006-07-06 2008-01-16 Asia Optical Co Inc Semi-automatic zooming method for a manually-zooming optical component and a videotaping device using such method,
CN101303269A (en) * 2007-05-09 2008-11-12 奥林巴斯株式会社 Optical system evaluation apparatus, optical system evaluation method and program thereof
US20120281132A1 (en) * 2010-11-08 2012-11-08 Yasunobu Ogura Image capturing device, image capturing method, program, and integrated circuit
CN102967983A (en) * 2012-11-07 2013-03-13 苏州科达科技股份有限公司 Automatic focusing method of camera
CN103513395A (en) * 2012-06-15 2014-01-15 中兴通讯股份有限公司 Passive auto-focusing method and device
US20140300724A1 (en) * 2000-05-03 2014-10-09 Leica Biosystems Imaging, Inc. Optimizing virtual slide image quality
CN104102069A (en) * 2013-04-11 2014-10-15 展讯通信(上海)有限公司 Focusing method and device of imaging system, and imaging system
CN104243815A (en) * 2014-08-25 2014-12-24 联想(北京)有限公司 Focusing method and electronic equipment
CN104503189A (en) * 2014-12-31 2015-04-08 信利光电股份有限公司 Automatic focusing method
CN105827944A (en) * 2015-11-25 2016-08-03 维沃移动通信有限公司 Focusing method and mobile terminal
TW201638622A (en) * 2015-04-28 2016-11-01 信泰光學(深圳)有限公司 Automatic focusing method and image capturing device utilizing the same
CN106257914A (en) * 2015-06-19 2016-12-28 奥林巴斯株式会社 Focus detection device and focus detecting method
CN106375647A (en) * 2015-07-23 2017-02-01 杭州海康威视数字技术股份有限公司 Method, device and system for adjusting camera back focus
CN106791387A (en) * 2016-12-12 2017-05-31 中国航空工业集团公司洛阳电光设备研究所 A kind of high-definition camera Atomatic focusing method that gondola is patrolled and examined for power network
CN207215686U (en) * 2017-09-20 2018-04-10 深圳市瀚海基因生物科技有限公司 Systems for optical inspection and Sequence Detection System
CN108076268A (en) * 2016-11-15 2018-05-25 谷歌有限责任公司 The equipment, system and method for auto-focusing ability are provided based on object distance information

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6674574B1 (en) * 1999-09-24 2004-01-06 Olympus Corporation Focusing system for a microscope and a reflected illumination fluorescence microscope using the focusing system
US20140300724A1 (en) * 2000-05-03 2014-10-09 Leica Biosystems Imaging, Inc. Optimizing virtual slide image quality
US20040217257A1 (en) * 2003-05-01 2004-11-04 Eastman Kodak Company Scene-based method for determining focus
JP2005284155A (en) * 2004-03-30 2005-10-13 Fuji Photo Film Co Ltd Manual focusing device and focusing assist program
TW200804962A (en) * 2006-07-06 2008-01-16 Asia Optical Co Inc Semi-automatic zooming method for a manually-zooming optical component and a videotaping device using such method,
CN101303269A (en) * 2007-05-09 2008-11-12 奥林巴斯株式会社 Optical system evaluation apparatus, optical system evaluation method and program thereof
US20120281132A1 (en) * 2010-11-08 2012-11-08 Yasunobu Ogura Image capturing device, image capturing method, program, and integrated circuit
CN103513395A (en) * 2012-06-15 2014-01-15 中兴通讯股份有限公司 Passive auto-focusing method and device
CN102967983A (en) * 2012-11-07 2013-03-13 苏州科达科技股份有限公司 Automatic focusing method of camera
CN104102069A (en) * 2013-04-11 2014-10-15 展讯通信(上海)有限公司 Focusing method and device of imaging system, and imaging system
CN104243815A (en) * 2014-08-25 2014-12-24 联想(北京)有限公司 Focusing method and electronic equipment
CN104503189A (en) * 2014-12-31 2015-04-08 信利光电股份有限公司 Automatic focusing method
TW201638622A (en) * 2015-04-28 2016-11-01 信泰光學(深圳)有限公司 Automatic focusing method and image capturing device utilizing the same
CN106257914A (en) * 2015-06-19 2016-12-28 奥林巴斯株式会社 Focus detection device and focus detecting method
CN106375647A (en) * 2015-07-23 2017-02-01 杭州海康威视数字技术股份有限公司 Method, device and system for adjusting camera back focus
CN105827944A (en) * 2015-11-25 2016-08-03 维沃移动通信有限公司 Focusing method and mobile terminal
CN108076268A (en) * 2016-11-15 2018-05-25 谷歌有限责任公司 Device, system and method for providing auto-focus capability based on object distance information
CN106791387A (en) * 2016-12-12 2017-05-31 中国航空工业集团公司洛阳电光设备研究所 Automatic focusing method for a high-definition pod-mounted camera used in power grid inspection
CN207215686U (en) * 2017-09-20 2018-04-10 深圳市瀚海基因生物科技有限公司 Optical inspection system and sequence detection system

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763401A (en) * 2021-09-10 2021-12-07 南京比邻智能软件有限公司 Rapid multi-point autofocus method, system and application device thereof

Similar Documents

Publication Publication Date Title
US11156823B2 (en) Digital microscope apparatus, method of searching for in-focus position thereof, and program
US11575823B2 (en) Imaging method, device and system
US11086118B2 (en) Self-calibrating and directional focusing systems and methods for infinity corrected microscopes
CN108693625B (en) Imaging method, device and system
WO2015068360A1 (en) Microscope system and autofocusing method
US20200404186A1 (en) Defocus amount measuring device, defocus amount measuring method, defocus amount measuring program, and discriminator
US20150358533A1 (en) Control method for imaging apparatus and imaging system
US11243389B2 (en) Optical scanning arrangement and method
CN113467067B (en) Automatic focusing method and device of microscopic imaging system based on multi-image area relation
US9851549B2 (en) Rapid autofocus method for stereo microscope
CN108693624B (en) Imaging method, device and system
CN112322713B (en) Imaging method, device and system and storage medium
CN112333378A (en) Imaging method, device and system
WO2018188440A1 (en) Imaging method, device and system
CN112291469A (en) Imaging method, device and system
CN113366364A (en) Real-time focusing in slide scanning system
JP5471715B2 (en) Focusing device, focusing method, focusing program, and microscope
JP2009109682A (en) Automatic focus adjusting device and automatic focus adjusting method
KR102010818B1 (en) Apparatus for capturing images of blood cell
CN108693113B (en) Imaging method, device and system
CN111647506B (en) Positioning method, positioning device and sequencing system
JP5960006B2 (en) Sample analyzer, sample analysis method, sample analysis program, and particle track analyzer
KR101873318B1 (en) Celll imaging device and mehtod therefor
EP4025949A1 (en) Microscope and method for imaging an object using a microscope

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40036782
Country of ref document: HK