CN118158524A - Imaging method, device and system - Google Patents

Imaging method, device and system

Info

Publication number: CN118158524A
Application number: CN202410235075.1A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: lens, value, image, light, optionally
Legal status: Pending (assumed; not a legal conclusion)
Inventors: 李林森, 孙瑞涛, 徐家宏, 周志良, 姜泽飞, 颜钦
Current and original assignee: Genemind Biosciences Co Ltd (listing assumed; not verified)
Application filed by Genemind Biosciences Co Ltd
Priority to CN202410235075.1A


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses an imaging method and an imaging system. The imaging method uses the imaging system to image an object; the imaging system comprises a lens, and the object comprises a first object, a second object and a third object located at different positions of a first preset track. The imaging method comprises the following step: moving the lens and the first preset track relative to each other according to a first predetermined relationship, so that a clear image of the third object is obtained by the imaging system without focusing, the first predetermined relationship being determined by the focal plane position of the first object and the focal plane position of the second object. The imaging method has high imaging efficiency, and even when focus tracking fails it can still refocus quickly according to the first predetermined relationship, avoiding blurred images caused by defocusing.

Description

Imaging method, device and system
Technical Field
The present invention relates to the field of optical detection, and in particular, to an imaging method, apparatus, and system.
Background
In the related art, the focal length of the camera is quickly adjusted at each shot to find the sharpest focal plane and obtain a clear picture; this process is called focus tracking.
However, in practical applications there is often external interference: for example, stray light, or dust or scratches on the surface of the object, can cause focus tracking to fail, and if the camera cannot refocus after such a failure, the captured image is blurred. For example, when the camera is used for sequencing and the object is a nucleic acid molecule located in a chip, bubbles in the liquid inside the chip, large fluorescent impurities, or dust and scratches on the chip surface can easily cause focus tracking to fail.
Disclosure of Invention
Sequencing platforms that acquire nucleic acid information based on imaging, such as the second- and third-generation sequencing platforms currently on the market that read nucleic acid information from photographs, include a process of photographing nucleic acid placed in a reactor using an imaging system.
Often, the reactor is also referred to as a chip (flowcell), which may contain one or more parallel channels for introducing and carrying reagents to create the environment required for the sequencing reaction. The chip can be formed by bonding two pieces of glass. The sequencing process comprises photographing fixed areas of the chip with a camera over multiple rounds; the area photographed each time is called a Fov (field of view), each round of photographing is called a cycle, and reagents are introduced again between two cycles to carry out the chemical reaction.
During normal photographing, the camera can usually achieve focus automatically, i.e. find the clearest focal plane position. When interference is encountered, focus tracking may fail.
Figures 1-3 illustrate data for successful and abnormal or failed focus tracking found in the inventors' experiments.
Taking two rows of Fov continuously photographed in the same cycle as an example, the height coordinate (Z value) of the objective lens at the time of photographing is recorded. As shown in fig. 1, the abscissa is the Fov serial number: the first half of the Fov are photographed from the left side of the flowcell to the right, and the second half from right to left after a line feed. The ordinate is the height of the microscope objective relative to the camera, i.e. the Z value, in um; negative values indicate that the objective is located below the camera, and the greater the absolute value of the Z value, the farther the objective is from the camera.
Fig. 1 shows the Z-value curve for 300 Fov images all captured with successful focus tracking, and fig. 2 shows the Z-value curve for 200 captured images that include a partial focus-tracking abnormality (reflected as partly abnormal Z values); in this case the abnormal portion of the curve, i.e. the images corresponding to the convex portion, are non-clear/blurred images.
Because camera focus tracking has certain limitations, the camera easily defocuses after encountering interference, and after defocusing the distance between the objective lens and the focal plane is too large, i.e. the objective ends up far from the focal plane; for subsequent Fov the objective cannot return to the focal plane even after the interference is eliminated, as shown in fig. 3. The first 1-200 Fov in fig. 3 belong to one cycle, and the later Fov belong to another cycle. Fig. 3 shows that after the 268th Fov (in the first row of the second cycle) focus tracking fails, and even after the disturbance disappears it does not recover until the cycle ends.
Failure to focus means a blurred image, which leads to loss of information, so this is a problem that must be addressed. In reality interference cannot be completely eliminated, but at a minimum it is desirable that a clear image can again be obtained after the interference disappears.
In analyzing a large amount of focus-tracking data, both successful and abnormal, the inventors found that, with the objective lens fixed, the Z-value curves of the same normally focus-tracked Fov across different cycles (i.e. different times) show a certain regularity. Fig. 4 shows the Z-value curves corresponding to the clear pictures obtained by normal focus tracking of 300 Fov over 4 different cycles.
The inventors found two laws:
1) The same location (Fov) may have different focal planes in different cycles, but its focal plane position relative to the other Fov of the same cycle is substantially unchanged. I.e. in physical location, there is a correlation between the focal planes of different Fov of the same cycle.
2) The 300 Fov of each curve in the figure are shot half from the left side of one row of the flowcell to the right, and the other half from right to left after a line feed. Due to deformation of the flowcell and/or a height difference between its left and right sides, the focal planes of consecutive Fov in the same row and same direction (for example, left to right or right to left) follow a clear pattern and can be well fitted by a straight line.
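As a sketch of rule 2), a simple least-squares line fit over the Z values of consecutive Fov in one row can quantify the trend. All numbers below are hypothetical, chosen only to illustrate the fit; they are not data from the patent.

```python
# Hypothetical per-Fov focal plane Z values (um) for one row, one cycle;
# the linear trend stands in for the flowcell tilt described in rule 2).
fov_index = list(range(10))
z_values = [-3380.0 + 0.8 * i for i in fov_index]  # illustrative numbers

# Ordinary least-squares fit of Z against Fov index.
n = len(fov_index)
mean_x = sum(fov_index) / n
mean_y = sum(z_values) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(fov_index, z_values))
         / sum((x - mean_x) ** 2 for x in fov_index))
intercept = mean_y - slope * mean_x
print(slope, intercept)  # slope is the per-Fov Z trend along the row
```

With noise-free inputs the fit recovers the underlying line exactly; on real focusing data the residuals would indicate how well the straight-line assumption holds.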
Regarding the above rules, the inventors hypothesize that possible reasons include: because the same Fov must be repeatedly shot in different cycles, after heating and reagent circulation the pressure in the chip changes and the focal plane shifts as a whole. Meanwhile, since each Fov is small relative to the entire chip, the surface flatness of each Fov can be regarded as unchanged, which is why the relative focal plane position between adjacent Fov remains unchanged.
Based on the discovered rules, the inventors developed a set of algorithms that, assisted by software and without replacing hardware, give the camera a focal plane prediction capability. Specifically, for example, in cycle 1, for a plurality of Fov on the same preset track (the first preset track, for example the same row), the focal planes of two Fov may be obtained by focusing, their focal plane difference calculated, a relationship (for example the first predetermined relationship) obtained by linear fitting, and the focal plane positions of the other Fov on the row predicted using that relationship. For cycle 2 and subsequent cycles, the focal plane of any other Fov of the current cycle can be predicted by memorizing the focal planes of the Fov of cycle 1 or any previous cycle and focusing on one Fov to determine its focal plane position in the current cycle.
Using linear regression to establish the relationship, expressed as formula (a) y = kx + b, it is necessary to determine the slope k (which may also be called the trend k) and the intercept b (which may also be called the base offset b). From rule 1), k = 1 is known, so formula (a) can be reduced to formula (b) y = x + b, and b can be determined from the relative focal plane positions (Z values) of any two Fov on the same track in the same cycle.
For example, for cycle 1, the base offset b can be calculated from the overall focal plane difference (e.g. from one end of the track to the other). Specifically, focusing yields cyc1FovZ(r) and cyc1FovZ(l), the focal plane Z values of two objects (which may be called two positions or two Fov) at one end and the other end of a track in cycle 1, and the intercept b = (cyc1FovZ(r) - cyc1FovZ(l)) / FovNum, where FovNum is the number of Fov between the two positions. Formula (b) can then be used to predict the focal plane Z value of any other position on the track: cyc1FovZ(n+1) = cyc1FovZ(n) + b, where cyc1FovZ(n) and cyc1FovZ(n+1) are the Z values of two adjacent positions on the track.
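The cycle-1 calculation can be sketched as follows. The Z values and FovNum below are hypothetical, chosen only to show the arithmetic of formula (b); walking the predicted offset across the track should land back on the focused Z value at the far end.

```python
# Sketch of the cycle-1 prediction in formula (b): y = x + b.
# cyc1_z_left / cyc1_z_right and fov_num are hypothetical values.
cyc1_z_left = -3380.0   # focused focal plane Z at the left end of the track (um)
cyc1_z_right = -3372.0  # focused focal plane Z at the right end (um)
fov_num = 16            # number of Fov steps between the two positions

b = (cyc1_z_right - cyc1_z_left) / fov_num  # per-Fov base offset

def predict_next_z(current_z: float) -> float:
    """Formula (b): Z of the adjacent Fov on the same track, same cycle."""
    return current_z + b

z = cyc1_z_left
track = [z]
for _ in range(fov_num):
    z = predict_next_z(z)
    track.append(z)
print(track[-1])  # matches the focused right-end Z
```

Only the two end positions need actual focusing; every intermediate Fov gets its Z value from the recurrence.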
It should be noted that b may be determined from the focal plane information of any two Fov on the same track. Likewise, cyc1FovZ(n+1) may be determined using the determined formula (b) and the focal plane coordinate of any one focused Fov, for example using the determined relationship (b) and either of the determined values cyc1FovZ(r) and cyc1FovZ(l).
After the linear relationship of a cycle is determined, for any subsequent cycle shooting the same track/same Fov, the focal plane position of any Fov of the current cycle can be predicted from the determined linear relationship and the focal plane position of any other Fov of the current cycle. For example, to predict the focal plane position of Fov(N+1) (the Fov at the (N+1)-th position) in the current cycle from the focal plane position of Fov(N) (the Fov at the N-th position) in the same cycle, substitute the Z value curFovZ(N) of the N-th Fov as the independent variable x into formula (b); y is then curFovZ(N+1).
In addition, after the linear relationship of a certain cycle is determined, for any subsequent cycle shooting the same track/same Fov, the focal plane position of one Fov of the current cycle may also be combined with the determined linear relationship to predict the focal plane position of another Fov of the current cycle. For example, with formula (b) determined in a previous cycle, to predict the focal plane position of Fov(N+1) in the current cycle from the focal plane position of Fov(N) in the current cycle, curFovZ(N+1) can be determined from the focal plane positions of Fov(N) and Fov(N+1) in the previous cycle, denoted preFovZ(N) and preFovZ(N+1) respectively, using formula (c): curFovZ(N+1) = curFovZ(N) + (preFovZ(N+1) - preFovZ(N)).
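Formula (c) can be sketched as follows, with hypothetical Z values: the previous cycle's focused Z values are "memorized", and only one focused position of the current cycle is needed to propagate the whole row.

```python
# Sketch of formula (c): curFovZ(N+1) = curFovZ(N) + (preFovZ(N+1) - preFovZ(N)).
# All Z values (um) are hypothetical illustrations.
pre_fov_z = [-3380.0, -3379.5, -3379.1, -3378.4]  # previous cycle, obtained by focusing
cur_fov_z = [-3377.8]                             # current cycle, only Fov(0) focused

for n in range(len(pre_fov_z) - 1):
    # Carry over the previous cycle's per-Fov focal plane differences.
    cur_fov_z.append(cur_fov_z[n] + (pre_fov_z[n + 1] - pre_fov_z[n]))

print(cur_fov_z)
```

This reflects rule 1): the whole focal plane may shift between cycles (here by a constant 2.2 um), but the relative positions between Fov are preserved and reused.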
It should be noted that the relationship established above, through the discovery and illustration of the rules, is presented as a linear relationship for convenience of description and understanding. Those skilled in the art will understand that the first preset track may be a straight line or a curve, and any curve may be regarded as a fit of a plurality of line segments. Accordingly, for the case where the first preset track is a curve, following the concepts of the present invention, the curved first preset track can be treated as a set of line segments and a first predetermined relationship comprising a set of linear relationships established, enabling prediction of the focal plane position of an object on the track without focusing.
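A minimal sketch of the piecewise-linear idea for a curved track follows. The segment boundaries and per-Fov offsets are hypothetical (the patent gives no concrete values); each segment carries its own base offset b, and prediction walks across segments.

```python
# Hypothetical curved track approximated by two line segments,
# each with its own per-Fov Z offset b (um per Fov).
segments = [
    {"start": 0, "end": 100, "b": 0.5},
    {"start": 100, "end": 200, "b": -0.2},
]

def predict_z(z_at_start: float, fov: int) -> float:
    """Predict the focal plane Z of `fov` from the focused Z at Fov 0,
    accumulating each segment's per-Fov offset along the way."""
    z = z_at_start
    for seg in segments:
        steps = min(fov, seg["end"]) - seg["start"]
        if steps > 0:
            z += steps * seg["b"]
        if fov <= seg["end"]:
            break
    return z

print(predict_z(-3380.0, 150))
```

The set of per-segment linear relations plays the role of the "first predetermined relation comprising a set of linear relations" described above.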
With the imaging method of this embodiment, and without replacing hardware, the camera can be returned to the vicinity of the focal plane and photographing restarted. Based on the above findings and explanation, the present invention provides an imaging method, an imaging apparatus, an imaging system, and a sequencing system.
The imaging method of the embodiment of the invention uses an imaging system to image an object, the imaging system comprises a lens, the object comprises a first object, a second object and a third object which are positioned at different positions of a first preset track, and the imaging method comprises the following steps: the lens and the first preset track are relatively moved according to a first preset relation, so that a clear image of the third object is obtained by using the imaging system without focusing, and the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object.
The imaging system of the embodiment of the invention images an object, the imaging system comprises a lens and a control device, the object comprises a first object, a second object and a third object which are positioned at different positions of a first preset track, and the control device is used for: the lens and the first preset track are relatively moved according to a first preset relation, so that a clear image of the third object is obtained by using the imaging system without focusing, and the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object.
In the imaging method and the imaging system, the first predetermined relationship is determined through the focusing positions of the first object and the second object; when other objects on the first preset track are imaged, focal plane prediction can be performed directly according to the first predetermined relationship, and a clear image of the third object can be obtained without focusing. The method and system are particularly suitable for scenes with a large number of objects where images of the objects must be acquired rapidly and continuously.
The sequencing device comprises the imaging system of the embodiment.
A computer-readable storage medium of an embodiment of the present invention stores a program for execution by a computer; executing the program comprises completing the steps of the method of the above embodiment. The computer-readable storage medium may include: read-only memory, random access memory, magnetic or optical disk, etc.
An imaging system according to an embodiment of the present invention is for imaging an object, the imaging system comprising a lens and a control device, the object comprising a first object, a second object and a third object located at different positions of a first preset trajectory, the control device comprising a computer executable program, executing the computer executable program comprising steps for performing the method of the above embodiment.
A computer program product of an embodiment of the invention contains instructions which, when executed by a computer, cause the computer to perform the steps of the method of the above embodiment.
Additional aspects and advantages of embodiments of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the invention.
Drawings
The foregoing and/or additional aspects and advantages of embodiments of the invention will become apparent and may be readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a graph of Z values corresponding to successful focus tracking in sequence determination.
FIG. 2 is a graph of Z values in sequencing in which focus tracking fails at the Fov of the abnormal convex portion.
Fig. 3 is a graph of Z values in sequencing in which focus tracking fails and cannot be recovered before the end of the cycle even after the disturbance has disappeared.
Fig. 4 is a schematic diagram of different focus positions formed by focus data of an object at the time of sequence measurement.
Fig. 5 is a schematic structural view of a first preset track and a second preset track according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of a focus position formed by focus data of an object without interference at the time of sequence determination.
Fig. 7 is a schematic diagram of a focus position formed by focus data of an object when refocusing is successful with interference in sequence determination.
Fig. 8 is a schematic diagram of a focus position formed by focus data of an object when re-focusing cannot be performed under disturbance in sequence measurement.
Fig. 9 is a flowchart of a focusing method according to an embodiment of the present invention.
Fig. 10 is a schematic diagram of a positional relationship between a lens and an object according to an embodiment of the present invention.
Fig. 11 is a partial schematic configuration diagram of an imaging system of an embodiment of the present invention.
Fig. 12 is a schematic diagram of connected areas of an image according to an embodiment of the present invention.
Fig. 13 is another flow chart of the focusing method according to the embodiment of the invention.
Fig. 14 is a schematic flow chart of a focusing method according to an embodiment of the invention.
Fig. 15 is a schematic flow chart of a focusing method according to an embodiment of the invention.
Fig. 16 is a schematic view of a focusing method according to another embodiment of the invention.
Fig. 17 is a block diagram of an imaging system according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the present invention and are not to be construed as limiting the present invention.
In the description of the present invention, it should be understood that "first," "second," "third," "fourth," and "fifth" are merely for convenience of description and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present invention, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the description of the present invention, unless explicitly stated and limited otherwise, "connected" is to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; may be mechanically connected, may be electrically connected, or may be in communication with each other; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
The orientation or positional relationship indicated by the terms "center", "thickness", "upper", "lower", "front", "rear", and the like are based on the orientation or positional relationship shown in the detailed description or drawings, and are merely for convenience of description and to simplify the description, rather than to indicate or imply that the apparatus or elements being referred to must have a particular orientation, be configured and operate in a particular orientation.
The term "constant", in relation to a distance, an object distance, and/or a relative position, etc., may be absolute or relative: a relatively constant value, range of values, or amount may be maintained within a certain deviation range or a preset acceptable range. Unless otherwise stated, "constant" with respect to distance, object distance, and/or relative position means relatively constant.
The following disclosure provides a number of embodiments or examples for implementing the technical solutions of the present invention. The present invention may repeat reference numerals and/or letters in the various examples, and this repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
"Sequencing" as referred to herein means nucleic acid sequencing, including DNA sequencing and/or RNA sequencing, and including long-fragment sequencing and/or short-fragment sequencing. The so-called "sequencing reaction" refers to the reaction of such sequencing.
The embodiment of the invention provides an imaging method, which uses an imaging system to image an object. Referring to fig. 5, 11 and 12, the imaging system includes a lens 104, the object includes a first object 42, a second object 44 and a third object 46 located at different positions of a first preset track 43, and the imaging method includes: the lens 104 and the first preset track 43 are relatively moved according to a first predetermined relationship determined by the focal plane position of the first object 42 and the focal plane position of the second object 44 to obtain an image of the third object 46 using the imaging system without focusing.
In the imaging method, the first predetermined relationship is determined by the focusing positions of the first object 42 and the second object 44; when other objects on the first preset track are imaged, focal plane prediction can be performed directly according to the first predetermined relationship, and a clear image of the third object can be obtained without focusing. The method is particularly suitable for scenes with a large number of objects where images of the objects must be acquired rapidly and continuously.
Specifically, in the example of fig. 5, the first preset track 43 may be a linear track, the first object 42 and the second object 44 are located at two positions of the linear track, for example, at both ends of the linear track, it may be understood that the number of the third objects 46 may be plural, the plural third objects 46 are sequentially arranged at the first preset track 43, and the third object 46 is located between the first object 42 and the second object 44. It is to be appreciated that in other examples, the third object 46 may be located in other locations than the locations of the first object 42 and the second object 44. In other examples, the first preset track 43 may be a non-linear track, such as a curved track, which may be considered as a fit of a plurality of line segments, the first object, the second object, and the third object being located in the same line segment in the curved track.
In some embodiments, the first predetermined relationship may be a linear relationship. In one embodiment, referring to fig. 5, when the first preset track 43 is one or more channels 52 of the chip 500 used in the sequencing process, the imaged third object 46 is at one or more positions (FOVs) within the channels 52. When photographing, the lens and the first preset track 43 may move relative to each other in a first direction A; for example, the lens 104 is stationary, the lens 104 includes an optical axis OP, and the first preset track 43 moves in a direction perpendicular to the optical axis OP. It will be appreciated that in some embodiments the first preset track 43 can also move in a direction parallel to the optical axis OP; the first preset track 43 may be moved according to the actual adjustment requirements.
The imaging system includes a camera 108, the lens 104 may be mounted on the camera 108, and the camera 108 captures light passing through the lens 104 for imaging.
In some embodiments, the lens 104 is moved relative to the first preset track 43, including at least one of: fixing the lens 104 and moving the first preset track 43; fixing the first preset track 43 and moving the lens 104; while moving the lens 104 and the first preset track 43.
In this way, the lens 104 and the first preset track 43 have various moving modes, so that the adaptability is high, and the application range of the imaging method is improved.
Specifically, when the first preset track 43 is moved, the first preset track 43 may be placed on a stage, and the stage may drive the first preset track 43 and the object to translate back and forth along a direction perpendicular to the optical axis OP of the lens 104, so as to place one of the third objects 46 under the lens 104, so that the imaging system images the third object 46.
When the lens 104 is moved, the lens 104 may be mounted on a driving mechanism, and the driving mechanism may electrically or manually drive the lens 104 to translate back and forth along a direction perpendicular to the optical axis OP of the lens 104, so that the lens 104 moves above one of the third objects 46, and the imaging system images the object.
While moving the lens 104 and the first preset track 43, it may be understood that the lens 104 may be moved first, and then the first preset track 43 may be moved, so that one of the third objects 46 is located below the lens 104; alternatively, the first preset track 43 may be moved first, and then the lens 104 may be moved so that the lens 104 is located above one of the third objects 46, or the first preset track 43 may be moved while the lens 104 is moved so that the lens is located above one of the third objects 46.
In some embodiments, the determining of the first predetermined relationship comprises: focusing the first object 42 with the imaging system to determine a first coordinate; focusing the second object 44 with the imaging system to determine a second coordinate; a first predetermined relationship is established based on first coordinates reflecting the focal plane position of the first object 42 and second coordinates reflecting the focal plane position of the second object 44. Thus, the first predetermined relation can be predetermined, and when imaging other objects is performed, a clear image of the other objects can be obtained by using the imaging system without focusing according to the first predetermined relation, thereby simplifying the imaging method and improving the efficiency of the imaging method.
Specifically, according to one embodiment described above, referring to FIG. 5, when the imaged third object 46 is one or more locations of the chip 500 used for sequencing, the first object 42, the second object 44, and the third object 46 may be located within the same channel of the chip 500.
Preferably, the first object 42, the third object 46, and the second object 44 are sequentially arranged on the first preset track 43. According to one embodiment described above, the first direction a is a left-to-right direction of the chip 500, that is, the first object 42, the third object 46, and the second object 44 are sequentially arranged on the first preset track 43 along the left-to-right direction a of the chip 500. In other embodiments, the first object 42, the third object 46, and the second object 44 may also be arranged in other orders in the first preset track 43.
It will be appreciated that, in determining the first predetermined relationship, two objects may be selected on the first preset track 43: the first object 42 and the second object 44 are focused to obtain their focus positions. In particular, as is evident from the foregoing, the relative focal plane position between two FOVs, in particular adjacent FOVs, on the first preset track 43 remains unchanged during sequencing. Thus, by focusing the first object 42 and the second object 44, their focal plane coordinate data can be obtained to determine the so-called first predetermined relationship. With this first predetermined relationship, an image of any third object on the first preset track 43 can be obtained without focusing.
Thus, by way of example, the first object 42 and the second object 44 may be the start and end FOVs, respectively, of the first preset track in a cycle (i.e. the same time period), such as the two end FOVs of the same line of the same channel, as shown in fig. 5. The third object 46 may be any one or more FOVs between the first object 42 and the second object 44. It will be appreciated that the first object 42 and the second object 44 may equally be FOVs at other positions, and the third object 46 need not be located between them: based on the rule that two points determine a straight line (the first predetermined relationship), it suffices to select two positions (objects) on the first preset track, obtain their corresponding focal plane positions, and derive from them the first predetermined relationship for the first preset track 43, through which an image of the third object can be obtained by the imaging system without focusing. In an actual application scenario, a coordinate system may be established to quantify the relative positional relationship, including the so-called focal plane position; for example, when image signals are acquired on the sequencing platform, xy may represent the plane of the first/second preset track and Z the optical axis direction of the objective lens, forming a three-dimensional coordinate system in which the focal plane position of each position includes a focal plane Z value.
It should be noted that the mentioned cycle reflects the influence of the time factor/image acquisition period. Generally, in a high-precision imaging system, for example a microscope system with a 60x objective lens and a depth of field of 200 nm, the fluctuation caused by the back-and-forth mechanical motion of the first/second preset track, or of the platform carrying it, is likely to exceed the depth of field. Preferably, therefore, when continuously imaging multiple objects at higher precision with the imaging method of any of the above or following embodiments, it is more accurate to re-fit the first predetermined relationship based on focusing data for a plurality of objects located on the same preset track if they are not in the same image acquisition time period (for example, in different mechanical motion directions). Those skilled in the art will appreciate that in a multi-object continuous imaging scenario with relatively low precision, the large depth of field means that focal plane position deviations caused by mechanical reciprocation may be ignored; i.e., for a plurality of objects on the same preset track, imaging may be performed in different image acquisition cycles using the first and/or second predetermined relationship determined in any previous image acquisition cycle.
The effect of Z-value prediction using the above prediction strategy is shown in figs. 6 to 8.
In figs. 6 to 8, curve C5 is the Z-value curve obtained from the actual photographing results of the camera (the focal plane line formed by the actual focusing positions), photographing being performed using only the camera's focus tracking. Curve C6 is the predicted Z-value curve (the focal plane line formed by the predicted focus positions).
Fig. 6 shows Z-value predictions for multiple FOVs of one cycle in the interference-free state; figs. 7 and 8 show Z-value predictions with interference and defocusing. In fig. 7, refocusing succeeds after the defocusing; in fig. 8, refocusing cannot be performed after the defocusing.
In some embodiments, the objects include a fourth object 47 and a fifth object 48 positioned at different locations of the second preset track 45, and the imaging method includes: the lens 104 and the second preset track 45 are relatively moved according to a second predetermined relationship, which is determined by the focal plane position of the fourth object 47 and the first predetermined relationship, to obtain an image of the fifth object 48 using the imaging system without focusing, the second preset track 45 being different from the first preset track 43. Based on the first predetermined relationship of the first preset track 43 and the focal plane position of any object on the second preset track 45, a second predetermined relationship corresponding to the second preset track 45 is determined, and by using the second predetermined relationship, a clear image of any object on the second preset track 45 can be obtained without focusing, so that clear images of more objects can be obtained, and the user requirement is met.
Specifically, the second preset track 45 may be a track adjacent to the first preset track 43; in the above embodiment, the second preset track 45 is a parallel channel adjacent to the first preset track 43. The second preset track 45 may be a linear track, with the fourth object 47 and the fifth object 48 located at two positions of the linear track; for example, the fourth object 47 is located at one end of the linear track and the fifth object 48 in the middle. It is understood that there may be a plurality of fifth objects 48, arranged sequentially on the second preset track 45, each at a position different from that of the fourth object 47. It will be appreciated that in other examples the second preset track 45 may be a non-linear track, such as a curved track, which may be seen as a fit of a plurality of line segments, with the fourth object 47 and the fifth object 48 located in the same line segment of the curved track.
In some embodiments, the second predetermined relationship may be a linear relationship.
In one embodiment, referring to fig. 5, when the second preset track 45 is one or more channels 52 of the chip 500 used in the sequencing process, the imaged fifth object 48 is at one or more positions (FOVs) within the channels 52, and the lens and the second preset track can be relatively moved in a second direction B when photographing; e.g., the lens, which includes an optical axis, remains stationary while the second preset track 45 moves in a direction perpendicular to the optical axis. It will be appreciated that in some embodiments the second preset track 45 is also capable of movement in a direction parallel to the optical axis OP. The second preset track 45 may be moved according to the actual adjustment requirements.
It will be appreciated that, for other ways of relative movement of the lens 104 and the second preset track 45, reference may be made to the above explanation of the relative movement of the lens 104 and the first preset track 43, which will not be expanded here to avoid redundancy. It should be noted that in the example of fig. 5 the first preset track 43 and the second preset track 45 are two adjacent channels 52 on the chip 500; thus, when the chip 500 moves, the first preset track 43 and the second preset track 45 move synchronously.
In some embodiments, the determining of the second predetermined relationship comprises: focusing the fourth object 47 with the imaging system to determine a fourth coordinate; the second predetermined relationship is established in accordance with the first predetermined relationship and a fourth coordinate, which reflects the focal plane position of the fourth object 47. Thus, the second predetermined relationship can be predetermined, and when imaging of other objects is performed, a clear image of other objects can be obtained without focusing by using the imaging system according to the second predetermined relationship, simplifying the imaging method and improving the efficiency of the imaging method.
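Under the linear-relationship assumption of the surrounding embodiments, determining the second predetermined relationship can be sketched as reusing the slope of the first predetermined relationship and anchoring the intercept at the fourth coordinate, so only one focusing operation is needed per new track. All names and numbers below are illustrative assumptions:

```python
def second_relationship(slope, x4, z4):
    """Build the second predetermined relationship: keep the slope fitted
    on the first preset track, and set the intercept from the fourth
    object's focus coordinate (x4, z4) on the second preset track."""
    intercept = z4 - slope * x4
    return slope, intercept

# Hypothetical example: the first track gave slope -0.004; the fourth
# object at x=0 mm on the second track focuses at Z=-1.30 mm.
slope2, b2 = second_relationship(-0.004, 0.0, -1.30)
z5 = slope2 * 10.0 + b2  # predicted Z of a fifth object at x=10 mm
```

This reflects the design choice stated above: the tracks are parallel channels on the same chip, so their focal plane lines share a slope and differ only by an offset.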
Specifically, according to one embodiment described above, referring to fig. 5, when the fifth object 48 being imaged is one or more locations of the chip 500 used for sequencing, the fourth object 47 and the plurality of fifth objects 48 may be located within the same channel of the chip 500.
Preferably, the fourth object 47 and the plurality of fifth objects 48 are sequentially arranged on the second preset track 45. According to one embodiment described above, the second direction B is the direction from the right to the left of the chip 500, and the fourth object 47 and the plurality of fifth objects 48 are arranged in sequence along it on the second preset track 45. In other embodiments, the fourth object 47 and the fifth objects 48 may be arranged in other orders on the second preset track 45.
It will be appreciated that the determination of the second predetermined relationship may be referred to above in the explanation of the determination of the first predetermined relationship and will not be discussed in detail herein to avoid redundancy.
In certain embodiments, the imaging method comprises: after the image of the third object 46 is acquired, the lens 104 is moved relative to the first preset track 43 and/or the second preset track 45 to acquire an image of the fifth object 48 using the imaging system without focusing. In this way, after the image of the third object 46 on the first preset track 43 is acquired, the image of the fifth object 48 on the second preset track 45 is acquired, so as to further realize the imaging of the objects of different preset tracks.
Specifically, in the above embodiment, after the image capturing of the one or more third objects 46 on the first preset track 43 is completed, the lens 104 and the chip 500 are relatively moved along the third direction C, i.e., the direction perpendicular to the extending direction of the channel 52, so that the lens 104 is located above the fifth object 48, and then the image of the fifth object 48 is captured by the imaging system according to the second predetermined relationship without focusing. In the illustrated embodiment, the third direction C is perpendicular to the first direction a and the second direction B.
Further, in the example shown in fig. 5, the first preset tracks 43 and the second preset tracks 45 are alternately arranged at intervals from top to bottom. After the image acquisition of the one or more third objects 46 of the first preset track 43 is completed from top to bottom, the lens 104 and the chip 500 are moved relatively so that the lens 104 is located above the fifth objects 48 of the second preset track 45, and the images of the one or more fifth objects 48 of the second preset track 45 are acquired. Then the lens 104 and the chip 500 are moved relatively so that the lens 104 is located above the third object 46 of the next first preset track 43, and the image of that third object 46 is acquired, until clear images of all the objects on the first preset tracks 43 and the second preset tracks 45 have been acquired.
In summary, since the focal plane position (e.g., Z value) of the object whose image is to be acquired is predicted according to the first predetermined relationship or the second predetermined relationship, imaging of other FOVs is performed without focusing, which improves imaging efficiency and accuracy. Furthermore, applying this determination method to photographing in the same area and similar areas enables rapid, continuous image acquisition of multiple objects. Since the focusing process can be omitted during continuous shooting, rapid scanning photography can be realized. Further, combined with the camera's automatic focus-tracking system, the focal plane prediction technique yields better picture quality and solves the problem that the camera cannot re-track focus after defocusing caused by interference. In a broader sense, the focal plane prediction technique gives the camera a certain intelligence: prior knowledge can assist the focusing process, making it fast, or even allow it to be omitted. This has important extended applications, especially for intelligent cameras in the shooting process.
In some embodiments, the imaging system comprises an imaging device 102 and a stage, the imaging device 102 comprises a lens 104 and a focusing module 106, the lens 104 comprises an optical axis OP, the lens 104 is capable of moving along the optical axis OP, and the first preset track 43 and/or the second preset track 45 are located on the stage.
Hereinafter, the focusing process used when the present invention determines the first predetermined relationship or the second predetermined relationship will be described in specific embodiments. It should be noted that, unless specifically stated otherwise, elements of the same name in different embodiments should be understood within the illustration of their own embodiment and should not be cross-interpreted or confused with same-named elements of other embodiments.
Embodiment one:
Referring to fig. 9-11, focusing includes the steps of: (a) emitting light onto the object using the focusing module; (b) moving the lens to a first set position; (c) Moving the lens from the first setting position to the object in a first setting step length and judging whether the focusing module receives light reflected by the object or not; (d) When the focusing module receives light reflected by an object, the lens is moved from the current position to a second set position, the second set position is positioned in a first range, and the first range is a range which comprises the current position and allows the lens to move along the optical axis direction; (e) Moving the lens from the second set position by a second set step length, and obtaining an image of the object by the imaging device at each step position, wherein the second set step length is smaller than the first set step length; (f) And evaluating the image of the object, and focusing according to the obtained image evaluation result.
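Steps (b) and (c) amount to a coarse approach loop: step the lens from the first set position toward the object until the focusing module receives reflected light. A minimal sketch, with the hardware replaced by simulated integer positions in micrometres (all names and numbers hypothetical):

```python
def coarse_approach(start_um, reflection_um, s1_um):
    """Move the lens from the first set position toward the object in
    first-set-step increments (step (c)) until reflected light would be
    detected, simulated here as reaching the position reflection_um.
    Integer micrometre positions keep the sketch exact."""
    pos, steps = start_um, 0
    while pos > reflection_um:   # focusing module sees no reflection yet
        pos -= s1_um             # one first-set-step toward the object
        steps += 1
    return pos, steps

# Hypothetical run: start at 0 um, reflection detected at -50 um,
# first set step of 10 um.
pos, n = coarse_approach(0, -50, 10)
```

The position returned plays the role of the "current position" of step (d), around which the first range for the fine search is defined.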
By using this imaging method, the plane of clear imaging of the target object, i.e., the clear plane (focal plane), can be found rapidly and accurately. The method is particularly suitable for devices comprising precision optical systems in which a clear plane is not easily found, such as optical detection devices with high-magnification lenses. Thus, the cost can be reduced.
Specifically, in the focusing step, the object is an object whose focal plane position needs to be acquired, for example, if a first predetermined relationship needs to be determined, two objects may be selected in the first preset track, and two objects located in the first preset track 43 are focused sequentially or simultaneously to acquire two sets of focal plane position data, where one of the two sets of focal plane position data is focal plane position data of the first object 42, and the other one of the two sets of focal plane position data is focal plane position data of the second object 44; if the second predetermined relationship needs to be determined, an object may be selected for focusing on the second preset track, and focal plane position data of the object may be obtained as focal plane position data of the fourth object 47, so as to determine the second predetermined relationship in combination with the first predetermined relationship.
Referring to figs. 10 and 11, in the embodiment of the present invention, the objects are a plurality of positions (FOVs) of the sample 300 used in sequencing; specifically, when the first predetermined relationship is determined, the object to be focused may be the first object or the second object, and when the second predetermined relationship is determined, it may be the fourth object or the fifth object. The sample 300 includes a carrying device 200 and a sample 302 to be detected located on the carrying device; the sample 302 to be detected is a biological molecule, such as a nucleic acid, and the lens 104 is located above the carrying device 200. The carrying device 200 has a front panel 202 and a rear panel (lower panel), each having two surfaces, with the sample 302 to be tested attached to the upper surface of the lower panel, i.e., located below the lower surface 204 of the front panel 202. Since the imaging device 102 collects the image of the sample 302 to be measured, that is, the position (FOV) corresponding to the sample 302 when photographing, and the sample 302 is located below the lower surface 204 of the front panel 202 of the carrying device 200, the lens 104 is moved at the start of the focusing process to find the medium interface 204 where the sample 302 is located, so as to improve the success rate of clear image collection by the imaging device 102. In the embodiment of the present invention, the sample 302 to be measured is a solution, the front panel 202 of the carrying device 200 is glass, and the medium interface 204 between the carrying device 200 and the sample 302 is the lower surface 204 of the front panel 202, i.e., the interface between the two media, glass and solution.
The sample 302 to be measured for which the imaging device 102 needs to collect an image is located below the lower surface 204 of the front panel 202, and the image collected by the imaging device 102 is used to determine a clear surface for finding a clear image of the sample 302 to be measured, which may be referred to as focusing. In one example, the front panel 202 has a thickness of 0.175mm.
In other embodiments, the carrier 200 may be a slide, with the sample 302 to be measured placed on the slide, or with the sample 302 to be measured sandwiched between two slides. In another embodiment, the carrying device 200 may be a reaction device, for example, a chip similar to a sandwich structure with carrying panels on top and bottom, and the sample 302 to be measured is disposed on the chip.
In this embodiment, referring to fig. 11, the imaging device 102 includes a microscope 107 and a camera 108, and the lens 104 includes an objective lens 110 and a camera lens 112 of the microscope. The focusing module 106 may be fixed to the camera lens 112 by a dichroic beam splitter 114 located between the camera lens 112 and the objective lens 110. The dichroic beam splitter 114 comprises a dual C-mount splitter. The dichroic beam splitter 114 may reflect light emitted by the focusing module 106 to the objective lens 110 and may allow visible light to pass through and into the camera 108 via the camera lens 112, as shown in fig. 11.
In the embodiment of the present invention, the movement of the lens 104 is along the optical axis OP. The movement of the lens 104 may refer to the movement of the objective lens 110, and the position of the lens 104 may refer to the position of the objective lens 110. In other embodiments, other lenses of the moving lens 104 may be selected to achieve focus. In addition, the microscope 107 further includes a tube lens 111 (tube lens) located between the objective lens 110 and the camera 108.
In this embodiment, the stage can drive the sample 300 to move in a plane (e.g., the XY plane) perpendicular to the optical axis OP (e.g., the Z axis) of the lens 104, and/or can drive the sample 300 to move along the optical axis OP (e.g., the Z axis) of the lens 104.
In other embodiments, the plane in which the stage moves the sample 300 is not perpendicular to the optical axis OP, i.e. the included angle between the moving plane of the sample and the XY plane is not 0, and the imaging method is still applicable.
In addition, the imaging device 102 can also drive the objective lens 110 to move along the optical axis OP direction of the lens 104 for focusing. In some examples, the imaging device 102 uses a drive such as a stepper motor or voice coil motor to drive the objective lens 110.
In this embodiment, when the coordinate system is established, as shown in fig. 10, the positions of the objective lens 110, the stage, and the sample 300 may be set on the negative axis of the Z axis, and the first set position may be a coordinate position on the negative axis of the Z axis. It will be appreciated that in other embodiments, the relationship between the coordinate system and the camera and the objective lens 110 may be adjusted according to the actual situation, which is not particularly limited herein.
In one example, the imaging device 102 comprises a total internal reflection fluorescence microscope with the objective lens 110 at 60x magnification, and a first set step S1 = 0.01 mm. This value of S1 is suitable: too large an S1 risks stepping across the acceptable focus range, while too small an S1 increases the time overhead.
When the focusing module 106 does not receive the light reflected by the object, the lens 104 is made to move towards the sample 300 and the object with a first set step.
In this embodiment, the imaging system may be applied to a sequencing system, or the sequencing system may include an imaging system.
In this embodiment, based on the current position, the first range includes a first section and a second section that are opposite to each other, and the second section is defined to be closer to the sample, and the step (e) includes: (i) When the second set position is located in the second interval, the lens is moved from the second set position to a direction away from the object, and an imaging device is used for acquiring images of the object at each step of position; or (ii) when the second setting position is located in the first section, moving the lens from the second setting position to a direction approaching the object, and performing image acquisition on the object by using the imaging device at each position. Therefore, the movement of the lens can be controlled according to the specific position of the second setting position, and the required image can be quickly acquired.
Specifically, in one example, the current position may be taken as the origin oPos and a coordinate axis Z1 established along the optical axis direction of the lens, with the first interval as the positive interval and the second interval as the negative interval. The intervals span plus or minus rLen, that is, the first range is [oPos - rLen, oPos + rLen]. The second set position is located in the negative interval, at (oPos - 3*r0), where r0 represents the second set step size. The imaging device starts image acquisition at (oPos - 3*r0) and the lens moves in the direction away from the object.
Note that, the coordinate axis Z1 established in the above example coincides with the Z axis of fig. 10, and the first range is located in the negative section of the Z axis. In this way, the control of the imaging method can be simplified, for example, the corresponding relation between the position of the lens on the coordinate axis Z1 and the position on the Z axis can be known only by knowing the position relation between the origin of the Z axis and the origin oPos.
In this embodiment, step (f) includes: comparing the image evaluation result with a preset condition; if the image evaluation result meets the preset condition, the position of the lens 104 corresponding to the image is saved. If the image evaluation result does not meet the preset condition, the lens 104 is moved to a third set position located in the other section of the first range, different from the section containing the second set position, and reverse photographing and focusing is started. For example, if the image evaluation result does not meet the preset condition while performing part (i) of step (e), moving the lens 104 to the third set position corresponds to moving the lens to the starting position for part (ii) of step (e), after which reverse photographing and focusing, i.e., the process of part (ii) of step (e), is performed. In this way, the focusing position is searched for within the first range, effectively improving the efficiency of the imaging method.
Specifically, referring to the example of this embodiment, the second set position (oPos - 3*r0) is located in the negative interval; the lens moves upward from the second set position, with the imaging device 102 capturing an image at each step. If the image evaluation result does not meet the preset condition, the lens 104 is moved to a third set position located in the positive interval, for example (oPos + 3*r0); the imaging device 102 then starts image acquisition from (oPos + 3*r0), moving in the direction approaching the object, and focuses according to the obtained image evaluation results. When the image evaluation result satisfies the preset condition, the current position of the lens 104 corresponding to the image is saved, so that the imaging device 102 can output a clear image when photographing during the sequencing reaction.
In some embodiments, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) includes: the lens moves in the coarse step until the first evaluation value of the image at the corresponding position is not greater than the first threshold; the lens 104 then switches to the fine step and continues to move until the second evaluation value of the image at the corresponding position reaches its maximum, and the position of the lens 104 corresponding to that image is saved. Thus, the coarse step allows the lens 104 to approach the focus position quickly, while the fine step ensures that the lens 104 can reach the focus position.
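The coarse-then-fine search can be sketched as follows. Here `first_eval` and `second_eval` stand in for the spot-size and Score evaluations computed from captured images, and the simple one-directional hill climb is an assumption of this sketch, not the patent's exact control logic:

```python
def focus_search(z0, coarse, fine, first_eval, second_eval, first_threshold):
    """Coarse phase: advance in coarse steps while the first evaluation
    value (spot size) stays above the threshold. Fine phase: advance in
    fine steps while the second evaluation value (Score) keeps rising,
    and return the position where it peaked."""
    z = z0
    while first_eval(z) > first_threshold:
        z += coarse
    best_s = second_eval(z)
    while True:
        s_next = second_eval(z + fine)
        if s_next <= best_s:
            return z  # Score stopped rising: current position is the peak
        z, best_s = z + fine, s_next

# Hypothetical synthetic evaluations: spot size shrinks toward z=100,
# Score peaks at z=102; threshold 260 as in the example above.
z_focus = focus_search(0, 10, 1,
                       lambda z: abs(z - 100) * 10 + 200,
                       lambda z: -(z - 102) ** 2,
                       260)
```

With these synthetic curves the coarse phase stops once the spot-size value drops to the threshold and the fine phase then climbs to the Score peak.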
Specifically, the position of the lens 104 corresponding to the image with the largest second evaluation value may be saved as the focus position. At each position, image acquisition is performed by the imaging device 102, and a first evaluation value and a second evaluation value are calculated for the acquired image.
In one example, during sequencing, the object is provided with an optically detectable label, such as a fluorescent label, which is excited to fluoresce by a laser of a specific wavelength, and the image acquired by the imaging device 102 includes a bright spot that may correspond to the location of the fluorescent molecule. It can be appreciated that when the lens 104 is in the in-focus position, the size of the bright spot corresponding to the position of the fluorescent molecule is smaller and the brightness is higher in the acquired image; when the lens 104 is in the out-of-focus position, the size of the bright spot corresponding to the position of the fluorescent molecule is larger and the brightness is lower in the acquired image.
In this embodiment, the size of the bright spots on the image and the intensity of the bright spots are used to evaluate the image.
For example, the first evaluation value is used to reflect the size of the bright spots of the image. In one example, the first evaluation value is determined by counting the connected-domain sizes of the bright spots on the image, where connected pixels (pixel connectivity) whose values are larger than the average pixel value of the image are defined as one connected domain (connected component). The first evaluation value may be determined, for example, by calculating the size of the connected domain corresponding to each bright spot and taking the average of these sizes to represent a characteristic of the image as its first evaluation value; alternatively, the connected-domain sizes corresponding to the bright spots may be sorted from small to large and the size at the 50th, 60th, 70th, 80th, or 90th percentile taken as the first evaluation value of the image.
In one example, the connected-domain size corresponding to a bright spot of an image is Area = A*B, where A represents the connected-domain size of the row centered on the center of the matrix corresponding to the bright spot, and B represents the connected-domain size of the column centered on that center. The matrix corresponding to a bright spot is defined as a k1*k2 matrix with odd numbers of rows and columns, comprising k1*k2 pixels.
In one example, the image is binarized and converted into a digital matrix before the connected-domain size is calculated. For example, with the average pixel value of the image as the reference, pixels not smaller than the average are marked 1 and pixels smaller than the average are marked 0, as shown in fig. 12. In fig. 12, the center of the matrix corresponding to the bright spot is bolded and enlarged, and the thick-line frame indicates a 3*3 matrix. The connected pixels marked 1 form a connected domain, and the connected-domain size corresponding to the bright spot is Area = 3*6.
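A pure-Python sketch of this Area = A*B measure, assuming the binarization against the image mean and the row/column run counting described above (all helper names are hypothetical):

```python
def binarize(img):
    """Mark pixels not smaller than the image mean as 1, others as 0."""
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return [[1 if p >= mean else 0 for p in row] for row in img]

def run_length(bits, idx):
    """Length of the run of 1s in `bits` that contains index `idx`."""
    if not bits[idx]:
        return 0
    left = right = idx
    while left > 0 and bits[left - 1]:
        left -= 1
    while right < len(bits) - 1 and bits[right + 1]:
        right += 1
    return right - left + 1

def spot_area(img, r, c):
    """Area = A*B for the spot centered at (r, c): A is the run of 1s
    through the center along its row, B the run along its column."""
    b = binarize(img)
    A = run_length(b[r], c)
    B = run_length([row[c] for row in b], r)
    return A * B
```

For a sharply focused spot both runs are short, so Area is small; a defocused spot spreads and Area grows, which is why Area can serve as the first evaluation value.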
The first threshold may be set based on empirical or a priori data. In one example, where the first evaluation value reflects the size of the bright spots on the image, the inventors observed that the connected-domain Area first becomes smaller and then larger as the lens approaches and then leaves the clear plane, and determined the first threshold from the magnitude and variation of the Area value over many clear-plane-finding focusing runs. In one example, the first threshold is set to 260. Note that the first threshold is related to the coarse and fine step settings: it is sized so that, once the first evaluation value reaches it, the focal plane of the imaging device imaging the subject will not be stepped across by a further coarse step.
In some embodiments, the second or third evaluation value is determined by counting the scores of the bright spots of the image, the Score of a bright spot of an image being Score = ((k1*k2 - 1)*CV - EV) / ((CV + EV)/(k1*k2)), where CV represents the center pixel value of the matrix corresponding to the bright spot and EV represents the sum of the non-center pixel values of that matrix. In this way, the second evaluation value or the third evaluation value can be determined.
Specifically, after the bright spots of the image are determined, the Score values of all the bright spots may be ranked in ascending order. When the number of bright spots is larger than a preset number, for example a preset number of 30 with 50 bright spots, the second evaluation value may take the Score value at the 50th, 60th, 70th, 80th, or 90th percentile, thereby excluding the interference of the lowest-quality 50%, 60%, 70%, 80%, or 90% of the bright spots. In general, bright spots that are concentrated and show large differences between center and edge intensities/pixel values are considered bright spots corresponding to molecules to be detected. In nucleic acid detection, the molecule to be detected may represent a nucleic acid molecule corresponding to the target detection object.
When the number of bright spots is smaller than the preset number, for example 10 bright spots, fewer than the preset number, percentile statistics over so few bright spots are not meaningful, so the largest Score value is taken to represent the image, i.e., the maximum (top-percentile) Score value is taken as the third evaluation value.
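The Score formula and the percentile-or-maximum selection can be sketched as below; the 80th-percentile default and the index convention `int(p * (n - 1))` are illustrative assumptions, not fixed by the text:

```python
def spot_score(patch):
    """Score = ((k1*k2 - 1)*CV - EV) / ((CV + EV)/(k1*k2)) for a k1*k2
    patch centered on the spot; CV is the center pixel value, EV the sum
    of the non-center pixel values."""
    k1, k2 = len(patch), len(patch[0])
    cv = patch[k1 // 2][k2 // 2]
    ev = sum(sum(row) for row in patch) - cv
    return ((k1 * k2 - 1) * cv - ev) / ((cv + ev) / (k1 * k2))

def image_evaluation(scores, preset_count, percentile=0.8):
    """Second evaluation value (a percentile Score) when there are more
    spots than preset_count, else the third evaluation value (the
    maximum Score)."""
    s = sorted(scores)
    if len(s) > preset_count:
        return s[int(percentile * (len(s) - 1))]
    return s[-1]
```

A concentrated spot (large CV, small EV) yields a large positive Score, so maximizing the Score during the fine search drives the lens toward the clear plane.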
In the present embodiment, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels; the preset condition is that the number of the bright spots on the image is larger than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold value, and the second evaluation value of the image at the corresponding position is the largest in the second evaluation values of N images before and after the image at the corresponding position; or the preset condition is that the number of the bright spots on the image is smaller than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold value, and the third evaluation value of the image at the corresponding position is the largest in the third evaluation values of N images before and after the current image. Therefore, the imaging method is more accurate in focusing by adopting different evaluation values for evaluation according to the number of the bright spots of the image.
Specifically, in one example, the first evaluation value may be the connected-domain size corresponding to the bright spots of the image as in the above embodiment. The second and third evaluation values are both based on the Score; they differ in how the Score is selected, according to whether the number of bright spots is statistically significant: for example, a percentile Score value for the second evaluation value and the maximum Score value for the third.
In one example, single-molecule sequencing is performed, and the bright spots on the acquired images may come from one or several optically detectable marker molecules carried by the sample to be tested, as well as from other sources of interference.
In this embodiment, bright-spot detection targets the bright spots corresponding to/coming from the labeling molecules, for example using a k1 x k2 matrix. Specifically, bright spots on an image are detected using the following method:
Bright-spot detection is performed on the image using a k1 x k2 matrix: when the central pixel value of the matrix is not less than any non-central pixel value of the matrix, the matrix is judged to correspond to one bright spot. Both k1 and k2 are odd numbers greater than 1, and the k1 x k2 matrix covers k1 x k2 pixel points.
The method is based on the difference between the brightness/intensity of the signal generated by fluorescence and the background brightness/intensity, and can simply and rapidly detect signals from the labeled molecules. In some embodiments, a bright spot further requires that the central pixel value of the matrix be greater than a first preset value and that every non-central pixel value of the matrix be greater than a second preset value.
The first preset value and the second preset value can be set according to experience, or according to the pixel/intensity data of normal bright spots in a certain number of normal images. Normal images and normal bright spots are images obtained by the imaging system at the clear-surface position that look normal to the naked eye, e.g. sharp, with a clean background and uniform brightness. In one embodiment, the first preset value and the second preset value are related to the average pixel value of the image. For example, by setting the first preset value to 1.4 times the average pixel value of the image and the second preset value to 1.1 times the average pixel value of the image, interference can be eliminated and a bright-spot detection result from the marks can be obtained.
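The detection rule above (centre pixel not less than any non-centre pixel, centre above 1.4x the image mean, non-centre pixels above 1.1x the mean) can be sketched as follows; the function name and parameters are illustrative, and NumPy is used only for convenience:

```python
import numpy as np

def detect_bright_spots(img, k1=3, k2=3, c1=1.4, c2=1.1):
    """Detect candidate bright spots with a k1 x k2 window whose centre
    pixel is >= every non-centre pixel, with the centre above c1 * mean
    and every non-centre pixel above c2 * mean (c1/c2 follow the
    1.4x / 1.1x example in the text). Returns (row, col) positions."""
    img = np.asarray(img, dtype=float)
    mean = img.mean()
    r1, r2 = k1 // 2, k2 // 2
    spots = []
    for y in range(r1, img.shape[0] - r1):
        for x in range(r2, img.shape[1] - r2):
            win = img[y - r1:y + r1 + 1, x - r2:x + r2 + 1]
            centre = win[r1, r2]
            # Flatten the window and drop the centre pixel.
            others = np.delete(win.ravel(), r1 * k2 + r2)
            if (centre >= others.max() and centre > c1 * mean
                    and (others > c2 * mean).all()):
                spots.append((y, x))
    return spots
```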
Specifically, in one example, the image is a color image in which one pixel point has three pixel values; the color image can be converted into a grayscale image before detection, to reduce the computation and complexity of the detection process. The conversion may use, but is not limited to, a floating-point algorithm, an integer method, a shift method, or an average method. Of course, it is also possible to detect the color image directly: the pixel-value comparisons above then become comparisons of three-dimensional values (triples), and the relative order of multi-dimensional values can be customized according to experience and need. For example, three-dimensional value a can be regarded as larger than three-dimensional value b when any two of the dimensions of a are larger than the corresponding dimensions of b.
In another example, the image is a grayscale image, whose pixel values are its gray values; the average pixel value of the image is therefore the average gray value of the image.
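As an illustration of the average method mentioned above (a hypothetical helper; weighted floating-point variants would use coefficients such as 0.299R + 0.587G + 0.114B instead):

```python
import numpy as np

def to_gray(rgb):
    """Average-method grayscale conversion: one gray value per pixel as
    the mean of its three channel values. Illustrative helper only."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb.mean(axis=-1)
```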
In one example, the first threshold is 260, the preset number is 30, and N = 2. That is, when the first evaluation value of the image at the corresponding position is not more than 260 and the number of bright spots is more than 30, the second evaluation value of the image at each position is computed, and the position whose image has the largest second evaluation value is taken as the clear-surface position, provided the 2 positions before and the 2 positions after it each satisfy: the second evaluation value of the corresponding image is greater than zero. When the first evaluation value of the image at the corresponding position is not more than 260 and the number of bright spots is less than 30, the third evaluation value of the image at each position is computed, and the position whose image has the largest third evaluation value is taken as the clear-surface position, provided the 2 positions before and the 2 positions after it each satisfy: the third evaluation value of the corresponding image is greater than zero.
If no image meeting the above conditions is found, it is determined that the image evaluation result does not meet the preset conditions.
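The selection of the clear-surface position under these conditions can be sketched as follows; `values` holds the second or third evaluation values per position, and the helper is a hypothetical reading of the text, not the patented procedure:

```python
def find_clear_face(values, n=2):
    """Return the index with the largest evaluation value such that the
    n positions on each side all have evaluation values > 0; return None
    when no index qualifies (the preset condition is not met)."""
    best = None
    for i in range(n, len(values) - n):
        neighbours = values[i - n:i] + values[i + 1:i + n + 1]
        if all(v > 0 for v in neighbours):
            if best is None or values[i] > values[best]:
                best = i
    return best
```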
In one example, k1 = k2 = 3; the 3x3 matrix then contains 9 pixels, and EV is the sum of the 8 non-central pixel values.
In this embodiment, if focusing cannot be completed according to the image evaluation result, the lens is moved, in the direction perpendicular to the optical axis, to the next image acquisition area (FOV) of the subject for focusing. Refocusing can thus start from another object, avoiding endless focusing on an object that cannot be focused and saving time.
In this embodiment, the imaging method further includes: prompting a focusing failure when the number of objects on which focusing has been unsuccessful is greater than a preset number. The cause of the failure can then be removed manually instead of focusing indefinitely, which saves time. Possible causes include an incorrectly placed object or a fault in the imaging apparatus; after the focusing failure is prompted, the cause can be eliminated manually. In one example, the preset number is 3, i.e. focusing failure is prompted when focusing has been unsuccessful on more than 3 objects. The prompt may take the form of a displayed image, text, a played sound, or the like.
In this embodiment, the imaging method further includes: judging whether the position of the lens exceeds a first range, and exiting focusing when it does. Stopping focusing when the lens position exceeds the first range avoids overly long focusing times and increased power consumption.
Specifically, in the example of the above embodiment, the first range is [oPos + rLen, oPos - rLen].
In the present embodiment, while the lens 104 moves, it is determined whether the current position of the lens 104 exceeds the fourth set position; when it does, the movement of the lens 104 is stopped. The first set position and the fourth set position thus bound the movement range (the first range) of the lens 104, so that the lens 104 stops moving when focusing is unsuccessful, avoiding wasted resources or equipment damage, or can be refocused when focusing is unsuccessful, improving the automation of the imaging method.
For example, in a total internal reflection imaging system, to find the medium interface quickly, the movement range of the lens 104 may be set as small as still satisfies this embodiment. On a total internal reflection imaging device with a 60x objective lens, for instance, the movement range of the lens 104 may be set to 200 μm ± 10 μm, or to [190 μm, 250 μm], according to the light-path characteristics and empirical summary.
In the present embodiment, given the determined movement range and either one of the fourth set position and the first set position, the other set position can be determined. In one example, the fourth set position is set to the position one depth of field below the lowest position of the upper surface 205 of the front panel 202 of the reaction apparatus 200, and the movement range of the lens 104 is set to 250 μm, from which the first set position is determined. In this example, the coordinate position one depth of field lower is smaller along the negative Z axis.
Specifically, in the present embodiment, the movement range is a section of the negative Z axis. In one example, the first set position is denoted nearlimit and the fourth set position farlimit; the coordinate positions corresponding to both lie on the negative Z axis, with nearlimit = -6000 um and farlimit = -6350 um, so the movement range defined between them is 350 um. When the coordinate position corresponding to the current position of the lens 104 is smaller than that of the fourth set position, the current position of the lens 104 is judged to exceed the fourth set position. In fig. 10, farlimit is the position one depth of field L below the lowest position of the upper surface 205 of the front panel 202 of the reactor 200, where L is the depth of field of the lens 104.
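A minimal check against these limits can be sketched as follows; the default values follow the nearlimit = -6000 um / farlimit = -6350 um example, and the helper itself is an assumption:

```python
def within_travel_range(z, nearlimit=-6000.0, farlimit=-6350.0):
    """A lens coordinate z (um, on the negative Z axis) is valid while it
    lies between the first set position (nearlimit) and the fourth set
    position (farlimit); z < farlimit means the far limit is exceeded."""
    return farlimit <= z <= nearlimit
```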
It should be noted that, in other embodiments, the coordinate positions corresponding to the first setting position and/or the fourth setting position may be specifically set according to the actual situation, which is not specifically limited herein.
In the present embodiment, the focusing module 106 includes a light source 116 and a light sensor 118, the light source 116 is used for emitting light onto the object, and the light sensor 118 is used for receiving light reflected by the object. Thus, the focusing module 106 can emit and receive light.
Specifically, in the embodiment of the present invention, the light source 116 may be an infrared light source and the light sensor 118 may be a photodiode, giving low cost and high detection accuracy. The infrared light emitted by the light source 116 is reflected by the dichroic beam splitter into the objective lens 110 and projected onto the sample 300 and the object via the objective lens 110; the object may reflect this infrared light back through the objective lens 110. In the embodiment of the present invention, when the sample 300 includes the carrier 200 and the sample 302 to be measured, the received light reflected by the object is the light reflected by the lower surface 204 of the front panel of the carrier 200.
Whether the infrared light reflected by the object can enter the objective lens 110 and be received by the photosensor 118 depends mainly on the distance between the objective lens 110 and the object. Therefore, when the focusing module 106 receives the infrared light reflected by the object, it can be determined that the distance between the objective lens 110 and the object is within the range suitable for optical imaging, and the imaging device 102 can be used for imaging. In one example, the distance is 20-40 um.
At this time, the lens 104 is moved by a second set step smaller than the first set step, so that the imaging system can find the optimal imaging position of the lens 104 in a smaller range.
In this embodiment, referring to fig. 13, when the focusing module 106 receives the light reflected by the object, the imaging method further includes the steps of: step G, moving the lens 104 toward the object in a third set step smaller than the first set step and larger than the second set step, calculating a first light intensity parameter from the intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is larger than a first set light intensity threshold; and performing step D when the first light intensity parameter is greater than the first set light intensity threshold. Comparing the first light intensity parameter with the first set light intensity threshold eliminates interference with focusing from light signals that are very weak compared with the reflection from the medium interface.
When the first light intensity parameter is not greater than the first set light intensity threshold, the lens 104 is made to move continuously towards the object in a third set step.
In this embodiment, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118. In this way, the first light intensity parameter is calculated by the average of the light intensities of the light received by the two light sensors 118, so that the elimination of the weak light signal is more accurate.
Specifically, the first light intensity parameter may be denoted SUM, i.e. SUM = (PD1 + PD2)/2, where PD1 and PD2 represent the light intensities received by the two light sensors 118, respectively. In one example, the first set intensity threshold nSUM = 40.
In one example, the third set step s2=0.005 mm. It will be appreciated that in other examples, the third setting step may take other values, and is not specifically limited herein.
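The weak-signal filter described above can be sketched as follows; the parameter names mirror the text, the defaults are the example values, and the helper itself is illustrative:

```python
def sum_above_threshold(pd1, pd2, n_sum=40.0):
    """First light-intensity parameter SUM = (PD1 + PD2) / 2; the search
    only proceeds once SUM exceeds the threshold nSUM (40 in the example),
    so reflections much weaker than the medium-interface reflection are
    ignored."""
    return (pd1 + pd2) / 2.0 > n_sum
```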
Embodiment two:
It should be noted that the structural diagram of the imaging system in this embodiment may be that of the first embodiment: the focusing method, or focusing logic, of the second embodiment differs from that of the first embodiment, but the imaging-system structure used is basically the same.
Referring to fig. 10, 11 and 14, focusing includes the steps of: S11, emitting light to the object using the focusing module 106; S12, moving the lens 104 to a first set position; S13, moving the lens 104 from the first set position toward the object in a first set step and judging whether the focusing module 106 receives light reflected by the object; S14, when the focusing module 106 receives the reflected light, moving the lens 104 in a second set step smaller than the first set step, collecting an image of the object with the imaging device 102, and judging whether the sharpness value of the collected image reaches the set threshold; S15, when the sharpness value of the image reaches the set threshold, saving the current position of the lens 104 as a save position.
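Steps S11-S15 can be sketched as a coarse-to-fine search along the Z axis; the two callables, their method names, and all numeric defaults are illustrative assumptions, not the patented interfaces:

```python
def focus(focus_module, camera, s1=10, s2=1, start=-6000, farlimit=-6350,
          threshold=260):
    """Coarse approach at step s1 until the focusing module sees the
    reflected light, then fine approach at step s2 until the image
    sharpness reaches the threshold; the clear-face position is returned
    (saved). Units are arbitrary; None means the travel range ran out."""
    z = start                                   # S12: first set position
    while not focus_module.light_received(z):   # S13: coarse search
        z -= s1
        if z < farlimit:                        # travel range exceeded
            return None
    while camera.sharpness(z) < threshold:      # S14: fine search
        z -= s2
        if z < farlimit:
            return None
    return z                                    # S15: save position
```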
With this focusing method, the plane in which the target object images clearly, i.e. the clear plane/clear surface, can be found quickly and accurately. The method is particularly suitable for devices containing precision optical systems in which the clear plane is not easily found, such as optical detection devices with high-magnification lenses.
Specifically, in the focusing step, the object is an object whose focal plane position needs to be acquired. For example, if a first predetermined relationship needs to be determined, two objects may be selected in the first preset track 43 and focused sequentially or simultaneously to acquire two sets of focal plane position data: one set is the focal plane position data of the first object 42, the other that of the second object 44. If a second predetermined relationship needs to be determined, two objects may be selected in the second preset track 45 and focused sequentially or simultaneously to obtain two sets of focal plane position data: one set is the focal plane position data of the fourth object 47, the other that of the fifth object 48.
Referring to fig. 10, in the embodiment of the present invention, the objects are a plurality of positions (FOVs) of the sample 300 used in sequencing. Specifically, when the first predetermined relationship is determined, the object to be focused may be the first object or the second object; when the second predetermined relationship is determined, it may be the fourth object or the fifth object. The sample 300 includes a carrying device 200 and a sample 302 to be tested located on it; the sample 302 to be tested is a biological molecule, such as a nucleic acid, and the lens 104 is located above the carrying device 200. The carrying device 200 has a front panel 202 and a rear panel (lower panel), each with two surfaces; the sample 302 to be tested is attached to the upper surface of the lower panel, i.e. it lies below the lower surface 204 of the front panel 202. Since the imaging device 102 collects images of the sample 302 to be tested, i.e. of the position (FOV) corresponding to the sample when photographing, and the sample lies below the lower surface 204 of the front panel 202 of the carrying device 200, the focusing process starts by moving the lens 104 to find the medium interface 204 where the sample 302 is located, which improves the success rate of collecting clear images with the imaging device 102. In the embodiment of the present invention, the sample 302 to be tested is a solution, the front panel 202 of the carrying device 200 is glass, and the medium interface 204 between the carrying device 200 and the sample 302 to be tested is the lower surface 204 of the front panel 202, i.e. the interface between the two media (glass and solution).
The sample 302 to be tested, whose image the imaging device 102 needs to collect, is located below the lower surface 204 of the front panel 202, and the image collected by the imaging device 102 is used to determine the clear surface, i.e. to find a clear image of the sample 302; this may be referred to as focusing. In one example, the front panel 202 has a thickness of 0.175 mm.
In this embodiment, the carrying device 200 may be a slide, the sample 302 to be measured is placed on the slide, or the sample 302 to be measured is clamped between two slides. In some embodiments, the carrying device 200 may be a reaction device, for example, a chip similar to a sandwich structure with carrying panels on top and bottom, and the sample 302 to be measured is disposed on the chip.
In this embodiment, referring to fig. 11, the imaging device 102 includes a microscope 107 and a camera 108; the lens 104 includes an objective lens 110 and a camera lens 112 of the microscope, and the focusing module 106 may be fixed to the camera lens 112 through a dichroic beam splitter 114 located between the camera lens 112 and the objective lens 110. The dichroic beam splitter 114 comprises a dual C-mount splitter. It may reflect the light emitted by the focusing module 106 toward the objective lens 110 while allowing visible light to pass through into the camera 108 via the camera lens 112, as shown in fig. 11.
In the embodiment of the present invention, the lens 104 moves along the optical axis OP. Movement of the lens 104 may refer to movement of the objective lens 110, and the position of the lens 104 may refer to the position of the objective lens 110. In other embodiments, another lens within the lens 104 may be moved to achieve focusing. In addition, the microscope 107 further includes a tube lens 111 located between the objective lens 110 and the camera 108.
In this embodiment, the stage can drive the sample 300 to move in a plane (e.g. the XY plane) perpendicular to the optical axis OP (e.g. the Z axis) of the lens 104, and/or can drive the sample 300 to move along the optical axis OP (e.g. the Z axis) of the lens 104.
In other embodiments, the plane in which the stage moves the sample 300 is not perpendicular to the optical axis OP, i.e. the included angle between the moving plane of the sample and the XY plane is not 0, and the imaging method is still applicable.
In addition, the imaging device 102 can also drive the objective lens 110 to move along the optical axis OP of the lens 104 for focusing. In some examples, the imaging device 102 uses a drive such as a stepper motor or voice coil motor to drive the objective lens 110.
In this embodiment, when the coordinate system is established, as shown in fig. 10, the positions of the objective lens 110, the stage, and the sample 300 may be set on the negative axis of the Z axis, and the first set position may be a coordinate position on the negative axis of the Z axis. It will be appreciated that in other embodiments, the relationship between the coordinate system and the camera and the objective lens 110 may be adjusted according to the actual situation, which is not particularly limited herein.
In one example, the imaging device 102 comprises a total internal reflection fluorescence microscope with a 60x objective lens 110, and the first set step S1 = 0.01 mm. This value of S1 is suitable: too large an S1 may step over the acceptable focus range, while too small an S1 increases the time overhead.
When the focusing module 106 does not receive the light reflected by the object, the lens 104 is further moved along the optical axis OP toward the sample 300 and the object by the first set step.
In the present embodiment, when the sharpness value of the image does not reach the set threshold, the lens 104 is made to continue moving along the optical axis OP in the second set step.
In this embodiment, the imaging system may be applied to a sequencing system, or the sequencing system may include an imaging system.
In this embodiment, while the lens 104 moves, it is determined whether the current position of the lens 104 exceeds the second set position; when it does, the lens 104 is stopped or the focusing step is performed. The first set position and the second set position thus bound the movement range of the lens 104, so that the lens 104 stops moving when focusing is unsuccessful, avoiding wasted resources or equipment damage, or can be refocused when focusing is unsuccessful, improving the automation of the imaging method.
For example, in a total internal reflection imaging system, to find the medium interface quickly, the movement range of the lens 104 may be set as small as still satisfies this embodiment. On a total internal reflection imaging device with a 60x objective lens, for instance, the movement range of the lens 104 may be set to 200 μm ± 10 μm, or to [190 μm, 250 μm], according to the light-path characteristics and empirical summary.
In this embodiment, given the determined movement range and either one of the second set position and the first set position, the other set position can be determined. In one example, the second set position is set to the position one depth of field below the lowest position of the upper surface 205 of the front panel 202 of the reaction apparatus 200, and the movement range of the lens 104 is set to 250 μm, from which the first set position is determined. In this example, the coordinate position one depth of field lower is smaller along the negative Z axis.
Specifically, in the embodiment of the present invention, the movement range is a section of the negative Z axis. In one example, the first set position is denoted nearlimit and the second set position farlimit; the coordinate positions corresponding to both lie on the negative Z axis, with nearlimit = -6000 um and farlimit = -6350 um, so the movement range defined between them is 350 um. When the coordinate position corresponding to the current position of the lens 104 is smaller than that of the second set position, the current position of the lens 104 is judged to exceed the second set position. In fig. 10, farlimit is the position one depth of field L below the lowest position of the upper surface 205 of the front panel 202 of the reactor 200, where L is the depth of field of the lens 104.
It should be noted that, in other embodiments, the coordinate position corresponding to the first setting position and/or the second setting position may be specifically set according to the actual situation, which is not specifically limited herein.
In the present embodiment, the focusing module 106 includes a light source 116 and a light sensor 118, the light source 116 is used to emit light onto the object, and the light sensor 118 is used to receive the light reflected by the object. Thus, the focusing module 106 can emit and receive light.
Specifically, in the embodiment of the present invention, the light source 116 may be an infrared light source and the light sensor 118 may be a photodiode, giving low cost and high detection accuracy. The infrared light emitted by the light source 116 is reflected by the dichroic beam splitter into the objective lens 110 and projected onto the sample 300 and the object via the objective lens 110; the object may reflect this infrared light back through the objective lens 110. In the embodiment of the present invention, when the sample 300 includes the carrier 200 and the sample 302 to be measured, the received light reflected by the object is the light reflected by the lower surface 204 of the front panel of the carrier 200.
Whether the infrared light reflected by the object can enter the objective lens 110 and be received by the photosensor 118 depends mainly on the distance between the objective lens 110 and the object. Therefore, when the focusing module 106 receives the infrared light reflected by the object, it can be determined that the distance between the objective lens 110 and the object is within the range suitable for optical imaging, and the imaging device 102 can be used for imaging. In one example, the distance is 20-40 um.
At this time, the lens 104 is moved by a second set step smaller than the first set step, so that the imaging system can find the optimal imaging position of the lens 104 in a smaller range.
In this embodiment, the sharpness value of the image may serve as the evaluation value for focusing. In one example, whether the sharpness value of the image captured by the imaging device 102 reaches the set threshold may be judged by a hill-climbing algorithm from image processing: by calculating the sharpness value of the image output by the imaging device 102 at each position of the objective lens 110, it is determined whether the sharpness value has reached its maximum at the peak, and hence whether the lens 104 has reached the position of the clear surface when the imaging device 102 images. It will be appreciated that in other embodiments, other image processing algorithms may be used to determine whether the sharpness value reaches the maximum at the peak.
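A minimal hill-climb of this kind can be sketched as follows; the `sharpness_at` callable stands in for imaging and scoring at each objective position and is an assumed interface:

```python
def climb_to_peak(sharpness_at, z, step):
    """Keep stepping while the sharpness value rises, and stop at the
    last position before it falls, i.e. the peak of the sharpness curve.
    `sharpness_at` maps a lens position to a sharpness value."""
    current = sharpness_at(z)
    while True:
        candidate = sharpness_at(z + step)
        if candidate <= current:   # sharpness stopped rising: peak found
            return z
        z, current = z + step, candidate
```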
When the sharpness value of the image reaches the set threshold, saving the current position of the lens 104 as a save position enables the imaging device 102 to output a clear image when photographing during the sequencing reaction.
In this embodiment, referring to fig. 15, when the focusing module 106 receives the light reflected by the object, focusing further includes the steps of: s16, enabling the lens 104 to move towards the object in a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value; when the first light intensity parameter is greater than the first set light intensity threshold, step S14 is performed. Thus, by comparing the first light intensity parameter with the first set light intensity threshold value, interference of the focusing/focusing caused by the light signal which is very weak in comparison with the reflected light of the medium interface can be eliminated.
When the first light intensity parameter is not greater than the first set light intensity threshold, the lens 104 is further moved toward the object along the optical axis OP by a third set step.
In the present embodiment, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118. In this way, the first light intensity parameter is calculated by the average of the light intensities of the light received by the two light sensors 118, so that the elimination of the weak light signal is more accurate.
Specifically, the first light intensity parameter may be denoted SUM, i.e. SUM = (PD1 + PD2)/2, where PD1 and PD2 represent the light intensities received by the two light sensors 118, respectively. In one example, the first set intensity threshold nSUM = 40.
In one example, the third set step s2=0.005 mm. It will be appreciated that in other examples, the third setting step may take other values, and is not specifically limited herein.
In another embodiment, referring to fig. 16, when the focusing module 106 receives the light reflected by the object, the method further includes the following steps: s16, enabling the lens 104 to move towards the object in a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value; when the first light intensity parameter is greater than the first set light intensity threshold, S17, moving the lens 104 to the object in a fourth set step smaller than the third set step and greater than the second set step, calculating a second light intensity parameter according to the light intensity of the light received by the focusing module 106, and judging whether the second light intensity parameter is less than the second set light intensity threshold; when the second light intensity parameter is smaller than the second set light intensity threshold, step S14 is performed. Therefore, by comparing the first light intensity parameter with the first set light intensity threshold value, interference of light signals with very weak contrast with the reflected light of the medium interface on focusing/focusing can be eliminated; and by comparing the second light intensity parameter with the second set light intensity threshold, the interference of the strong reflected light signal at the non-medium interface position, such as the light signal reflected by the oil/air of the objective lens 110, on focusing/focusing can be eliminated.
When the first light intensity parameter is not greater than the first set light intensity threshold, the lens 104 continues to move toward the object along the optical axis OP by the third set step.
When the second light intensity parameter is not less than the second set light intensity threshold, the lens 104 continues to move toward the object along the optical axis OP by the fourth set step.
In one example, the third set step S2 = 0.005 mm and the fourth set step S3 = 0.002 mm. It will be appreciated that in other examples, the third set step and the fourth set step may take other values, which are not specifically limited herein.
In the present embodiment, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the object, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118, the light intensities of the light received by the two light sensors 118 have a first difference value, and the second light intensity parameter is a difference value between the first difference value and the set compensation value. In this way, the second light intensity parameter is calculated by the light intensity of the light received by the two light sensors 118, so that the rejection of the strongly reflected light signal is more accurate.
Specifically, the first light intensity parameter may be denoted SUM, i.e., SUM = (PD1 + PD2)/2, where PD1 and PD2 represent the light intensities of the light received by the two light sensors 118, respectively. In one example, the first set light intensity threshold nSUM = 40. The second light intensity parameter may be denoted err and the compensation value offset, i.e., err = (PD1 - PD2) - offset, where (PD1 - PD2) is the first difference value. In the ideal case, the first difference value may be zero. In one example, the second set light intensity threshold nErr = 10 and offset = 30.
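The two light-intensity checks above can be sketched in code. This is an illustrative rendering only: the function names and the way PD1 and PD2 are sampled are assumptions, while the constants nSUM = 40, nErr = 10, and offset = 30 are the example values given in the text.

```python
# Sketch of the two light-intensity gates used while searching for the
# medium interface. PD1 and PD2 are the intensities read from the two
# light sensors 118; the constants are the example values from the text.
N_SUM = 40    # first set light intensity threshold (nSUM)
N_ERR = 10    # second set light intensity threshold (nErr)
OFFSET = 30   # set compensation value (offset)

def first_gate_passed(pd1: float, pd2: float) -> bool:
    """First gate: rejects signals far weaker than the interface reflection."""
    sum_value = (pd1 + pd2) / 2          # first light intensity parameter SUM
    return sum_value > N_SUM

def second_gate_passed(pd1: float, pd2: float) -> bool:
    """Second gate: rejects strong non-interface reflections (e.g. oil/air)."""
    err = (pd1 - pd2) - OFFSET           # second light intensity parameter err
    return err < N_ERR
```

For example, with PD1 = 60 and PD2 = 50, SUM = 55 passes the first gate and err = -20 passes the second, so the readings would be treated as the medium-interface reflection rather than noise or a stray strong reflection.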
In the present embodiment, when the lens 104 is moved by the second set step, it is determined whether the first sharpness value of the image corresponding to the current position of the lens 104 is greater than the second sharpness value of the image corresponding to the previous position of the lens 104. When the first sharpness value is greater than the second sharpness value and the sharpness difference between them is greater than the set difference, the lens 104 continues to move toward the object by the second set step; when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, the lens 104 continues to move toward the object by a fifth set step smaller than the second set step, so that the sharpness value of the image acquired by the imaging device 102 reaches the set threshold; when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is greater than the set difference, the lens 104 is moved away from the object by the second set step; when the second sharpness value is greater than the first sharpness value and the sharpness difference between them is less than the set difference, the lens 104 is moved away from the object by the fifth set step, so that the sharpness value of the image acquired by the imaging device 102 reaches the set threshold. In this way, the position of the lens 104 corresponding to the peak of the sharpness value can be found more accurately, so that the image output by the imaging device is clear.
Specifically, the second set step may be regarded as a coarse step Z1, the fifth set step as a fine step Z2, and a coarse adjustment range Z3 may be set. The coarse adjustment range Z3 is set so that the movement of the lens 104 can be stopped when the sharpness value of the image cannot reach the set threshold, thereby saving resources.
With the current position of the lens 104 as the start point T, the coarse adjustment range Z3 defines the adjustment range on the Z axis as (T, T+Z3). The lens 104 is first moved in a first direction (e.g., the direction along the optical axis OP toward the object) within (T, T+Z3) by the step Z1, and the first sharpness value R1 of the image acquired by the imaging device 102 at the current position of the lens 104 is compared with the second sharpness value R2 of the image acquired by the imaging device 102 at the previous position of the lens 104.
When R1 > R2 and R1 - R2 > R0, the sharpness value of the image is approaching the set threshold but is still far from it, so the lens 104 continues to move in the first direction by the step Z1 to approach the set threshold quickly.
When R1 > R2 and R1 - R2 < R0, the sharpness value of the image is approaching the set threshold and is already near it, so the lens 104 is moved in the first direction by the step Z2, approaching the set threshold with a smaller step.
When R2 > R1 and R2 - R1 > R0, the sharpness value of the image has crossed the set threshold and is far from it, so the lens 104 is moved in a second direction (e.g., the direction along the optical axis OP away from the object) opposite the first direction by the step Z1 to approach the set threshold quickly.
When R2 > R1 and R2 - R1 < R0, the sharpness value of the image has crossed the set threshold and is near it, so the lens 104 is moved in the second direction by the step Z2, approaching the set threshold with a smaller step.
In this embodiment, the fifth set step may be adjusted as the lens 104 moves, so that the step used to approach the set threshold is neither too large nor too small.
In one example, T = 0, Z1 = 100, Z2 = 40, Z3 = 2100, and the adjustment range is (0, 2100). It should be noted that these values are measurement values used during image acquisition by the imaging device 102 as the lens 104 moves, and the measurement values are related to the light intensity. The set threshold may be understood as the peak value of the focusing curve, a range centered on the peak value, or a range containing the peak value.
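The coarse/fine adjustment described above can be sketched as a search loop. This is a minimal sketch, not the patented implementation: `sharpness_at` is a hypothetical callback returning the sharpness value of the image acquired with the lens at a given position, the `max_moves` guard is an added safety assumption, and the defaults are the example values T = 0, Z1 = 100, Z2 = 40, Z3 = 2100.

```python
def focus_search(sharpness_at, target, r0, t=0, z1=100, z2=40, z3=2100,
                 max_moves=200):
    """Move the lens within (t, t + z3), switching between the coarse step
    Z1 and the fine step Z2 until the sharpness value reaches the set
    threshold `target`; r0 is the set difference between readings."""
    pos, prev = t, sharpness_at(t)
    direction, step = 1, z1            # +1: toward the object; start coarse
    for _ in range(max_moves):         # guard against oscillating forever
        nxt = pos + direction * step
        if not (t < nxt <= t + z3):
            break                      # would leave the coarse adjustment range
        pos = nxt
        cur = sharpness_at(pos)
        if cur >= target:
            return pos                 # sharpness value reached the set threshold
        if cur > prev:
            step = z1 if cur - prev > r0 else z2   # approaching: coarse or fine
        else:
            direction = -direction                 # crossed the peak: reverse
            step = z1 if prev - cur > r0 else z2
        prev = cur
    return None                        # set threshold not reached in range
```

With a synthetic sharpness curve peaking at position 960, the loop climbs in coarse steps, overshoots, reverses with the fine step, and settles on the peak position.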
Referring to fig. 17 and 5, an imaging system 100 according to an embodiment of the present invention is used for imaging an object. The imaging system includes a lens 104 and a control device 101, the object includes a first object 42, a second object 44, and a third object 46 located at different positions of a first preset track 43, and the control device 101 includes a computer executable program which, when executed, performs the steps of the imaging method of any of the above embodiments.
In the imaging system 100, the first predetermined relationship is determined by the focusing positions of the first object 42 and the second object 44. When imaging other objects on the first preset track, focal plane prediction can be performed directly according to the first predetermined relationship, and a clear image of the third object can be acquired without focusing.
It should be noted that the explanation and description of the technical features and advantages of the imaging method in any of the above embodiments and examples are also applicable to the imaging system 100 of the present embodiment, and are not further detailed herein to avoid redundancy.
In some embodiments, the third object 46 is located between the first object 42 and the second object 44.
In some embodiments, the lens 104 is fixed, the lens 104 includes an optical axis OP, and the first preset track 43 is movable in a direction perpendicular or parallel to the optical axis OP.
In some embodiments, the determining of the first predetermined relationship comprises:
focusing the first object 42 with the imaging system to determine a first coordinate;
Focusing the second object 44 with the imaging system to determine a second coordinate;
a first predetermined relationship is established based on first coordinates reflecting the focal plane position of the first object 42 and second coordinates reflecting the focal plane position of the second object 44.
In certain embodiments, the first preset track 43 is a linear or non-linear track; and/or the first predetermined relationship is a linear relationship.
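Since the first predetermined relationship may be linear, determining it from two focusing results amounts to fitting a line through two points. The sketch below is illustrative only: the function name is an assumption, `x` stands for the position of an object along the first preset track 43, and `z` for the focal-plane (lens) coordinate found by focusing, as in the first and second coordinates above.

```python
def fit_focal_relation(x1, z1, x2, z2):
    """Build the first predetermined relationship from the focusing results
    of the first object (x1, z1) and the second object (x2, z2).
    Returns a function mapping a track position x to a predicted focal-plane
    coordinate z, so other objects on the track can be imaged without
    focusing."""
    slope = (z2 - z1) / (x2 - x1)
    return lambda x: z1 + slope * (x - x1)

# Hypothetical numbers: focusing at track positions 0.0 and 20.0 gave
# focal-plane coordinates 10.0 and 14.0; the third object sits between them.
predict = fit_focal_relation(0.0, 10.0, 20.0, 14.0)
```

The second predetermined relationship can then reuse the slope of the first, shifted so the line passes through the fourth coordinate obtained by focusing the fourth object, which matches how its determination is described below.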
In some embodiments, the objects include a fourth object 47 and a fifth object 48 located at different positions of the second preset track 45, and the control device 101 is configured to:
The lens 104 and the second preset track 45 are relatively moved according to a second predetermined relationship, which is determined by the focal plane position of the fourth object 47 and the first predetermined relationship, to obtain an image of the fifth object 48 using the imaging system without focusing, the second preset track 45 being different from the first preset track 43.
In some embodiments, the lens 104 is fixed, the lens 104 includes an optical axis OP, and the second preset track 45 is movable in a direction perpendicular or parallel to the optical axis OP.
In some embodiments, the determining of the second predetermined relationship comprises:
Focusing the fourth object 47 with the imaging system to determine a fourth coordinate;
The second predetermined relationship is established in accordance with the first predetermined relationship and a fourth coordinate, which reflects the focal plane position of the fourth object 47.
In certain embodiments, the control device 101 is configured to: after the image of the third object 46 is acquired, the lens 104 is moved relative to the first preset track 43 and/or the second preset track 45 to acquire an image of the fifth object 48 using the imaging system without focusing.
In some embodiments, the imaging system comprises an imaging device 102 and a stage 103, the imaging device 102 comprises a lens 104 and a focusing module 106, the lens 104 comprises an optical axis OP, the lens 104 is capable of moving along the optical axis OP, and the first preset track 43 and/or the second preset track 45 are located on the stage 103.
In some embodiments, the control device 101 is configured to perform the steps of:
(a) Emitting light onto the object using the focusing module 106;
(b) Moving the lens 104 to a first set position;
(c) Moving the lens 104 from the first setting position to the object in a first setting step length and judging whether the focusing module 106 receives the light reflected by the object;
(d) When the focusing module 106 receives light reflected by an object, the lens 104 is moved from the current position to a second set position, the second set position is located in a first range, and the first range is a range including the current position and allowing the lens 104 to move along the optical axis OP direction;
(e) Moving the lens 104 from the second set position by a second set step, the second set step being smaller than the first set step, and obtaining an image of the subject with the imaging device 102 at each position;
(f) And evaluating the image of the object, and focusing according to the obtained image evaluation result.
In some embodiments, based on the current position, the first range includes a first interval and a second interval that are opposite, the second interval is defined to be closer to the object, and step (e) includes:
(i) When the second setting position is located in the second section, the lens 104 is moved from the second setting position in a direction away from the subject, and an image of the subject is obtained with the imaging device 102 at each position; or alternatively
(ii) When the second setting position is located in the first section, the lens 104 is moved from the second setting position in a direction approaching the subject, and an image of the subject is obtained by the imaging device 102 at each position.
In certain embodiments, step (f) comprises: comparing the image evaluation result with a preset condition, and if the image evaluation result meets the preset condition, storing the position of the lens 104 corresponding to the image;
if the image evaluation result does not meet the preset condition, the lens 104 is moved to a third set position, and the third set position is located in another section different from the section where the second set position is located in the first range.
In some embodiments, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) includes: moving the lens 104 by the coarse step until the first evaluation value of the image at the corresponding position is not greater than the first threshold, then switching to the fine step and continuing to move until the second evaluation value of the image at the corresponding position reaches a maximum, and saving the position of the lens 104 corresponding to the image with the maximum second evaluation value.
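The coarse-then-fine variant of step (f) just described can be sketched as follows. The names are assumptions: `evaluate` is a hypothetical callback returning the first and second evaluation values of the image acquired at a lens position, and the stopping rule (stop the fine phase once the second evaluation value stops increasing) is one simple way to realize "move until the second evaluation value is maximum".

```python
def coarse_fine_focus(evaluate, start, coarse, fine, first_threshold, limit):
    """evaluate(pos) -> (first_eval, second_eval) for the image at pos.
    Phase 1: advance by the coarse step while first_eval > first_threshold.
    Phase 2: advance by the fine step, tracking the maximum second_eval,
    and return the lens position where it peaked."""
    pos = start
    while pos <= limit and evaluate(pos)[0] > first_threshold:
        pos += coarse                       # coarse phase
    best_pos, best_val = pos, evaluate(pos)[1]
    while pos + fine <= limit:
        pos += fine                         # fine phase
        val = evaluate(pos)[1]
        if val <= best_val:
            break                           # second evaluation value peaked
        best_pos, best_val = pos, val
    return best_pos
```

With a first evaluation value that falls off linearly and a second evaluation value peaking at position 70, the coarse phase stops at 60 and the fine phase homes in on 70.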
In some embodiments, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels;
The preset condition is that the number of bright spots on the image is larger than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold, and the second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after it; or
The preset condition is that the number of bright spots on the image is smaller than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold value, and the third evaluation value of the image at the corresponding position is the largest among the third evaluation values of N images before and after the current image.
In certain embodiments, the imaging system includes a speckle detection module for:
performing bright spot detection on the image using a k1 x k2 matrix, which includes judging that a matrix whose central pixel value is not less than any non-central pixel value of the matrix corresponds to one bright spot, where k1 and k2 are both odd numbers greater than 1, and the k1 x k2 matrix includes k1 x k2 pixel points.
In some embodiments, the central pixel value of the matrix corresponding to one bright spot is greater than a first preset value, and any one of the non-central pixel values of the matrix is greater than a second preset value, the first preset value and the second preset value being related to the average pixel value of the image.
In some embodiments, the first evaluation value is determined by counting the size of the connected domain corresponding to the bright spots of the image, where the size of the connected domain corresponding to one bright spot is Area = A x B, A represents the size of the connected domain of the row centered on the center of the matrix corresponding to the bright spot, B represents the size of the connected domain of the column centered on the center of the matrix corresponding to the bright spot, and connected pixel points larger than the average pixel value of the image are defined as one connected domain.
In some embodiments, the second evaluation value and/or the third evaluation value is determined by counting the scores of the bright spots of the image, where the Score of one bright spot = ((k1 x k2 - 1) x CV - EV) / ((CV + EV) / (k1 x k2)), CV represents the central pixel value of the matrix corresponding to the bright spot, and EV represents the sum of the non-central pixel values of the matrix corresponding to the bright spot.
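The bright spot test and score formula above can be sketched for a single k1 x k2 window. This is an illustrative sketch under stated assumptions: the window is a nested list of pixel values, sliding the window over the whole image and choosing the first and second preset values (which the text only says are related to the average pixel value) are left out.

```python
def spot_score(window):
    """Score of one bright spot for a k1 x k2 window (nested list of pixels):
    Score = ((k1*k2 - 1)*CV - EV) / ((CV + EV) / (k1*k2)),
    where CV is the central pixel value and EV the sum of the others."""
    k1, k2 = len(window), len(window[0])
    cv = window[k1 // 2][k2 // 2]
    ev = sum(sum(row) for row in window) - cv
    return ((k1 * k2 - 1) * cv - ev) / ((cv + ev) / (k1 * k2))

def is_bright_spot(window, first_preset, second_preset):
    """A window corresponds to a bright spot when its central pixel value is
    not less than any non-central value, exceeds the first preset value, and
    every non-central value exceeds the second preset value."""
    k1, k2 = len(window), len(window[0])
    cv = window[k1 // 2][k2 // 2]
    others = [v for i, row in enumerate(window) for j, v in enumerate(row)
              if (i, j) != (k1 // 2, k2 // 2)]
    return cv > first_preset and all(cv >= v > second_preset for v in others)
```

A sharp, isolated spot (high center, flat surroundings) yields a large Score, so ranking images by their summed spot scores favors the in-focus position.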
In some embodiments, the focus module 106 includes a light source 116 and a light sensor 118, the light source 116 is configured to emit light onto the subject, and the light sensor 118 is configured to receive light reflected by the subject.
In some embodiments, when the focusing module 106 receives the light reflected by the object, the control device 101 is further configured to:
Moving the lens 104 to the object in a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
when the first light intensity parameter is greater than the first set light intensity threshold, the lens 104 is moved from the current position to the second set position.
In some embodiments, the focusing module 106 includes two light sensors 118, where the two light sensors 118 are configured to receive light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118.
In some embodiments, the control device 101 is configured to, as the lens 104 moves: judging whether the current position of the lens 104 exceeds a fourth set position;
when the current position of the lens 104 exceeds the fourth set position, the movement of the lens 104 is stopped.
In certain embodiments, the control device 101 is configured to:
Emitting light onto the object using the focusing module 106;
moving the lens 104 to a first set position;
moving the lens 104 from the first setting position to the object in a first setting step and judging whether the focusing module 106 receives the light reflected by the object;
When the focusing module 106 receives the light reflected by the object, the lens 104 is moved by a second set step length smaller than the first set step length, the imaging device 102 is used for acquiring an image of the object, and whether the sharpness value of the image acquired by the imaging device 102 reaches a set threshold value is judged;
When the sharpness value of the image reaches the set threshold, the current position of the lens 104 is saved as a save position.
In some embodiments, the focus module 106 includes a light source 116 and a light sensor 118, the light source 116 for emitting light onto the subject and the light sensor 118 for receiving light reflected by the subject.
In some embodiments, when the focusing module 106 receives the light reflected by the object, the control device 101 is configured to:
Moving the lens 104 toward the object by a third set step smaller than the first set step and larger than the second set step, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
when the first light intensity parameter is greater than the first set light intensity threshold, performing the step of moving the lens 104 by the second set step, acquiring an image of the object with the imaging device 102, and judging whether the sharpness value of the image acquired by the imaging device 102 reaches the set threshold.
In some embodiments, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the subject, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118.
In some embodiments, when the focusing module 106 receives the light reflected by the object, the control device 101 is configured to:
Moving the lens 104 toward the object by a third set step smaller than the first set step and larger than the second set step, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module 106, and judging whether the first light intensity parameter is greater than a first set light intensity threshold;
When the first light intensity parameter is greater than the first set light intensity threshold, the lens 104 is made to move towards the object in a fourth set step length smaller than the third set step length and greater than the second set step length, and a second light intensity parameter is calculated according to the light intensity of the light received by the focusing module 106, and whether the second light intensity parameter is smaller than the second set light intensity threshold is judged;
When the second light intensity parameter is smaller than the second set light intensity threshold, performing the step of moving the lens 104 by the second set step, acquiring an image of the object with the imaging device 102, and judging whether the sharpness value of the image acquired by the imaging device 102 reaches the set threshold.
In some embodiments, the focusing module 106 includes two light sensors 118, the two light sensors 118 are configured to receive light reflected by the subject, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors 118, the light intensities of the light received by the two light sensors 118 have a first difference value, and the second light intensity parameter is a difference value between the first difference value and the set compensation value.
In some embodiments, when moving the lens 104 by the second set step, the control device 101 is configured to: judge whether the first sharpness value of the image corresponding to the current position of the lens 104 is greater than the second sharpness value of the image corresponding to the previous position of the lens 104;
When the first sharpness value is greater than the second sharpness value and the sharpness difference between the first sharpness value and the second sharpness value is greater than the set difference, continuing to move the lens 104 toward the subject in a second set step size;
When the first sharpness value is greater than the second sharpness value and the sharpness difference between the first sharpness value and the second sharpness value is less than the set difference, causing the lens 104 to continue moving toward the subject at a fifth set step that is less than the second set step to cause the sharpness value of the image captured by the imaging device 102 to reach the set threshold;
moving the lens 104 away from the subject by a second set step when the second sharpness value is greater than the first sharpness value and a sharpness difference between the second sharpness value and the first sharpness value is greater than the set difference;
When the second sharpness value is greater than the first sharpness value and the sharpness difference between the second sharpness value and the first sharpness value is less than the set difference, the lens 104 is moved away from the subject by a fifth set step to bring the sharpness value of the image acquired by the imaging device 102 to the set threshold.
In some embodiments, the control device 101 is configured to, as the lens 104 moves: judging whether the current position of the lens 104 exceeds a second set position;
When the current position of the lens 104 exceeds the second set position, the lens 104 is stopped moving or the focusing step is performed.
A computer-readable storage medium of an embodiment of the present invention stores a program for execution by a computer; executing the program includes completing the steps of the imaging method of any of the above embodiments. The computer-readable storage medium may include: a read-only memory, a random access memory, a magnetic disk, an optical disk, or the like.
A computer program product of an embodiment of the invention contains instructions which, when executed by a computer, cause the computer to perform the steps of the imaging method of any of the embodiments described above.
In the description of the present specification, reference to the terms "one embodiment," "certain embodiments," "illustrative embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
In addition, each functional unit in each embodiment of the present invention may be integrated into one processing module, each unit may exist alone physically, or two or more units may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules. The integrated modules may also be stored in a computer readable storage medium if implemented in the form of software functional modules and sold or used as a stand-alone product.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.

Claims (10)

1. An imaging method, wherein an object is imaged with an imaging system, the imaging system comprising a lens, the object comprising a first object, a second object, and a third object located at different positions of a first preset trajectory, the imaging method comprising:
Relatively moving the lens and the first preset track according to a first preset relation, so as to obtain a clear image of the third object without focusing by using the imaging system, wherein the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object;
Optionally, the object further includes a fourth object and a fifth object located at different positions of the second preset track, and the imaging method further includes:
And enabling the lens and the second preset track to relatively move according to a second preset relation so as to obtain a clear image of the fifth object by using the imaging system without focusing, wherein the second preset relation is determined by the focal plane position of the fourth object and the first preset relation, and the second preset track is different from the first preset track.
2. The method of claim 1, wherein the third object is located between the first object and the second object;
optionally, the lens is fixed, the lens comprises an optical axis, and the first preset track is capable of moving along a direction perpendicular to or parallel to the optical axis;
optionally, the determining of the first predetermined relationship includes:
focusing the first object by using the imaging system, and determining a first coordinate;
Focusing the second object by using the imaging system to determine a second coordinate;
Establishing the first preset relation according to the first coordinate and the second coordinate, wherein the first coordinate reflects the focal plane position of the first object, and the second coordinate reflects the focal plane position of the second object;
Optionally, the first preset track is a linear or nonlinear track; and/or the first predetermined relationship is a linear relationship;
Optionally, the lens is fixed, the lens comprises an optical axis, and the second preset track is capable of moving along a direction perpendicular to or parallel to the optical axis;
optionally, the determining of the second predetermined relationship includes:
focusing the fourth object by using the imaging system to determine a fourth coordinate;
Establishing the second predetermined relationship according to the first predetermined relationship and the fourth coordinate, wherein the fourth coordinate reflects the focal plane position of the fourth object;
optionally, the imaging method comprises: after the clear image of the third object is acquired, the lens and the first preset track and/or the second preset track are moved relatively to acquire the clear image of the fifth object by using the imaging system without focusing.
3. The method of claim 2, wherein the imaging system comprises an imaging device and a stage, the imaging device comprising the lens and a focusing module, the lens comprising an optical axis, the lens being movable in the direction of the optical axis for the focusing, the first preset track and/or the second preset track being located on the stage;
optionally, the focusing comprises the steps of:
(a) Transmitting light to the object by utilizing the focusing module;
(b) Moving the lens to a first set position;
(c) Moving the lens from the first setting position to the object in a first setting step length and judging whether the focusing module receives the light reflected by the object or not;
(d) When the focusing module receives the light reflected by the object, the lens is moved from a current position to a second set position, the second set position is positioned in a first range, and the first range is a range which comprises the current position and allows the lens to move along the optical axis direction;
(e) Moving the lens from the second set position by a second set step size, wherein the second set step size is smaller than the first set step size, and the imaging device is used for obtaining the image of the object at each step position;
(f) Evaluating the image of the object, and realizing focusing according to the obtained image evaluation result;
optionally, based on the current position, the first range includes a first section and a second section that are opposite, the second section is defined to be closer to the object, and step (e) includes:
(i) When the second setting position is located in the second section, moving the lens from the second setting position to a direction away from the object, and obtaining an image of the object with the imaging device at each position; or alternatively
(Ii) When the second setting position is located in the first section, moving the lens from the second setting position to a direction approaching the object, and obtaining an image of the object with the imaging device at each position;
Optionally, step (f) comprises: comparing the image evaluation result with a preset condition, and if the image evaluation result meets the preset condition, storing the position of a lens corresponding to the image;
If the image evaluation result does not meet the preset condition, the lens is moved to a third set position, and the third set position is located in another section which is different from the section where the second set position is located in the first range;
Optionally, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) includes: moving the lens by the coarse step until the first evaluation value of the image at the corresponding position is not greater than a first threshold, then switching to the fine step and continuing to move until the second evaluation value of the image at the corresponding position reaches a maximum, and saving the position of the lens corresponding to the image when the second evaluation value is maximum;
Optionally, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels;
The preset condition is that the number of bright spots on the image is larger than a preset value, a first evaluation value of the image at the corresponding position is not larger than a first threshold, and a second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after it; or
The preset condition is that the number of the bright spots on the image is smaller than the preset value, the first evaluation value of the image at the corresponding position is not larger than the first threshold value, and the third evaluation value of the image at the corresponding position is the largest in the third evaluation values of N images before and after the current image;
Optionally, the bright spots on the image are detected using the following method:
detecting bright spots of the image using a k1×k2 matrix, wherein when the central pixel value of the matrix is not smaller than any non-central pixel value of the matrix, the matrix corresponds to one bright spot; k1 and k2 are both odd numbers larger than 1, and the k1×k2 matrix contains k1×k2 pixel points;
Optionally, the central pixel value of the matrix corresponding to one bright spot is larger than a first preset value, every non-central pixel value of the matrix is larger than a second preset value, and the first preset value and the second preset value are related to the average pixel value of the image;
Optionally, the first evaluation value is determined by counting the sizes of the connected domains corresponding to the bright spots of the image, where the size of one connected domain corresponding to a bright spot of the image is Area = A×B, A represents the size of the connected domain of the row centered on the center of the matrix corresponding to the bright spot, B represents the size of the connected domain of the column centered on the center of the matrix corresponding to the bright spot, and connected pixel points larger than the average pixel value of the image are defined as one connected domain;
Optionally, the second evaluation value and/or the third evaluation value is determined by counting the scores of the bright spots of the image, where the score of one bright spot of the image is Score = ((k1×k2-1)×CV - EV)/((CV + EV)/(k1×k2)), CV represents the central pixel value of the matrix corresponding to the bright spot, and EV represents the sum of the non-central pixel values of the matrix corresponding to the bright spot;
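[Editorial note] The bright-spot rule and the two evaluation formulas above can be sketched as follows. This is an illustrative reading, not the patented implementation: the claims only state that the first and second preset values are related to the average pixel value, so the multiples used here (2× and 1.2× the image mean) are assumptions, as are all function and variable names.

```python
def _window(img, y, x, r1, r2):
    """Flatten the k1 x k2 neighbourhood of (y, x), row-major."""
    return [img[j][i] for j in range(y - r1, y + r1 + 1)
                      for i in range(x - r2, x + r2 + 1)]

def detect_bright_spots(img, k1=3, k2=3):
    """A pixel is a bright spot when its value is >= every non-central
    pixel of the k1 x k2 matrix around it, exceeds the first preset
    value, and all non-central pixels exceed the second preset value."""
    h, w = len(img), len(img[0])
    mean = sum(map(sum, img)) / (h * w)
    t1, t2 = 2.0 * mean, 1.2 * mean          # assumed preset values
    r1, r2 = k1 // 2, k2 // 2
    center = r1 * k2 + r2                    # index of CV in the window
    spots = []
    for y in range(r1, h - r1):
        for x in range(r2, w - r2):
            win = _window(img, y, x, r1, r2)
            cv = win[center]
            others = win[:center] + win[center + 1:]
            if cv >= max(others) and cv > t1 and min(others) > t2:
                spots.append((y, x))
    return spots

def spot_score(img, y, x, k1=3, k2=3):
    """Score = ((k1*k2 - 1)*CV - EV) / ((CV + EV) / (k1*k2))."""
    r1, r2 = k1 // 2, k2 // 2
    win = _window(img, y, x, r1, r2)
    cv = img[y][x]
    ev = sum(win) - cv                       # sum of non-central pixels
    return ((k1 * k2 - 1) * cv - ev) / ((cv + ev) / (k1 * k2))

def spot_area(img, y, x):
    """First-evaluation-value contribution of one spot: Area = A * B,
    the run lengths of above-mean pixels through (y, x) along its row
    and its column."""
    mean = sum(map(sum, img)) / (len(img) * len(img[0]))
    def run(vec, i):
        a = b = i
        while a > 0 and vec[a - 1] > mean:
            a -= 1
        while b < len(vec) - 1 and vec[b + 1] > mean:
            b += 1
        return b - a + 1
    return run(img[y], x) * run([row[x] for row in img], y)
```

On a synthetic 7×7 image with a single peak surrounded by above-mean pixels, `detect_bright_spots` returns only the peak coordinate, and `spot_area` returns the 3×3 run product.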
Optionally, the focusing module comprises a light source for emitting the light onto the object and a light sensor for receiving the light reflected by the object;
Optionally, when the focusing module receives the light reflected by the object, the focusing further includes the steps of:
Moving the lens toward the object in a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
when the first light intensity parameter is larger than the first set light intensity threshold value, the lens is moved from the current position to the second set position;
Optionally, the focusing module comprises two light sensors, the two light sensors are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors;
Optionally, when the lens moves, judging whether the current position of the lens exceeds a fourth set position;
and stopping moving the lens when the current position of the lens exceeds the fourth set position.
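[Editorial note] A minimal sketch of the light-intensity-guided coarse approach described above, under stated assumptions: `move` and `read_sensors` are hypothetical hardware callbacks (the claims name no API), the first light intensity parameter is the mean of the two sensor readings, and the fourth set position acts as a travel limit.

```python
def coarse_approach(move, read_sensors, step3, intensity_threshold, limit,
                    start=0.0):
    """Advance the lens toward the object in the third set step until the
    first light intensity parameter (mean of the two sensor readings)
    exceeds the first set intensity threshold; stop at the travel limit."""
    pos = start
    while pos <= limit:
        s1, s2 = read_sensors()
        if (s1 + s2) / 2.0 > intensity_threshold:
            return pos           # hand over to image-based fine focusing
        pos = move(pos, step3)
    return None                  # limit reached without enough reflection
```

With a simulated rig whose sensor readings grow with lens position, the search stops at the first position whose mean intensity clears the threshold.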
4. The method of claim 3, wherein the focusing comprises the steps of:
Transmitting light to the object by utilizing the focusing module;
moving the lens to a first set position;
Moving the lens from the first set position toward the object in a first set step length and judging whether the focusing module receives the light reflected by the object;
When the focusing module receives light reflected by the object, the lens is moved in a second set step length smaller than the first set step length, the imaging device is used for acquiring an image of the object, and whether the sharpness value of the image acquired by the imaging device reaches a set threshold value is judged;
when the sharpness value of the image reaches the set threshold value, the current position of the lens is saved as a saving position;
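[Editorial note] The two-phase sequence of claim 4 can be sketched as follows; `light_detected` and `sharpness` are illustrative stand-ins for the focusing module and the image evaluation, which the claim leaves abstract.

```python
def autofocus(light_detected, sharpness, step1, step2,
              sharp_threshold, start, limit):
    """Coarse phase: move in the first set step until reflected light is
    detected. Fine phase: move in the smaller second set step until the
    image sharpness reaches the set threshold; that position is saved."""
    pos = start
    while not light_detected(pos):            # coarse approach
        pos += step1
        if pos > limit:
            return None
    while sharpness(pos) < sharp_threshold:   # fine, image-based search
        pos += step2
        if pos > limit:
            return None
    return pos                                # the saved position
```

For example, with light first detectable at position 5 and sharpness peaking at position 8, the coarse phase overshoots to 6 and the fine phase walks up in half-unit steps until the threshold is met.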
optionally, the focusing module comprises a light source for emitting light onto the object and a light sensor for receiving light reflected by the object;
Optionally, when the focusing module receives the light reflected by the object, the focusing further comprises the steps of:
Moving the lens with a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
When the first light intensity parameter is larger than the first set light intensity threshold value, performing the steps of enabling the lens to move in the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches the set threshold value or not;
Optionally, the focusing module includes two light sensors, the two light sensors are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors;
Optionally, when the focusing module receives the light reflected by the object, the focusing further comprises the following steps:
Moving the lens with a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
When the first light intensity parameter is larger than the first set light intensity threshold, the lens is enabled to move towards the object in a fourth set step length smaller than the third set step length and larger than the second set step length, a second light intensity parameter is calculated according to the light intensity of the light received by the focusing module, and whether the second light intensity parameter is smaller than the second set light intensity threshold is judged;
When the second light intensity parameter is smaller than the second set light intensity threshold value, performing the steps of enabling the lens to move in the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches the set threshold value or not;
Optionally, the focusing module includes two light sensors, the two light sensors are used for receiving light reflected by the object, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors, the light intensities of the light received by the two light sensors have a first difference value, and the second light intensity parameter is a difference value between the first difference value and a set compensation value;
Optionally, when the lens is moved by the second set step length, judging whether a first sharpness value of the image corresponding to the current position of the lens is larger than a second sharpness value of the image corresponding to the previous position of the lens;
continuing to move the lens toward the object with the second set step length when the first sharpness value is greater than the second sharpness value and the sharpness difference between the first sharpness value and the second sharpness value is greater than a set difference;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, continuing to move the lens toward the object with a fifth set step length smaller than the second set step length until the sharpness value of the image acquired by the imaging device reaches the set threshold value;
moving the lens away from the object by the second set step length when the second sharpness value is greater than the first sharpness value and the sharpness difference between the second sharpness value and the first sharpness value is greater than the set difference;
moving the lens away from the object by the fifth set step length, until the sharpness value of the image acquired by the imaging device reaches the set threshold value, when the second sharpness value is greater than the first sharpness value and the sharpness difference between the second sharpness value and the first sharpness value is less than the set difference;
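[Editorial note] The four cases above amount to a hill climb that reverses direction when sharpness falls and shrinks the step near the peak. A compact sketch of one decision (names illustrative):

```python
def next_move(s_curr, s_prev, step2, step5, set_diff):
    """Return (direction, step) for the next lens move: +1 toward the
    object while sharpness is rising, -1 away when it is falling; keep
    the second set step while the sharpness change exceeds set_diff,
    switch to the finer fifth set step once the change drops below it."""
    direction = 1 if s_curr > s_prev else -1
    step = step2 if abs(s_curr - s_prev) > set_diff else step5
    return direction, step
```

The small-change cases signal that the lens is near the focal peak, so the finer step avoids overshooting the set sharpness threshold.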
Optionally, when the lens moves, judging whether the current position of the lens exceeds a second set position;
and stopping moving the lens or performing the focusing step when the current position of the lens exceeds the second set position.
5. An imaging system for imaging an object, the imaging system comprising a lens and a control device, the object comprising a first object, a second object and a third object located at different positions of a first preset track, the control device being adapted to:
Relatively moving the lens and the first preset track according to a first preset relation, so as to obtain a clear image of the third object without focusing by using the imaging system, wherein the first preset relation is determined by the focal plane position of the first object and the focal plane position of the second object;
optionally, the third object is located between the first object and the second object;
optionally, the lens is fixed, the lens comprises an optical axis, and the first preset track is capable of moving along a direction perpendicular to or parallel to the optical axis;
Optionally, the determination of the first preset relation includes:
focusing the first object by using the imaging system, and determining a first coordinate;
Focusing the second object by using the imaging system to determine a second coordinate;
Establishing the first preset relation according to the first coordinate and the second coordinate, wherein the first coordinate reflects the focal plane position of the first object, and the second coordinate reflects the focal plane position of the second object;
Optionally, the first preset track is a linear or nonlinear track; and/or the first preset relation is a linear relation;
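[Editorial note] If the first preset relation is linear, as the claim allows, it can be built directly from the two focusing results. A sketch under that assumption, where x is a position along the track and z the in-focus lens coordinate (variable names are illustrative):

```python
def first_relation(x1, z1, x2, z2):
    """Fit the line z(x) through the first object's focus result
    (x1, z1) and the second object's (x2, z2); objects elsewhere on the
    track can then be imaged at z(x) without refocusing."""
    slope = (z2 - z1) / (x2 - x1)
    return lambda x: z1 + slope * (x - x1)
```

A third object halfway between the first two is imaged at the interpolated lens coordinate, which is how the clear image is obtained without a further focusing step.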
Optionally, the objects include a fourth object and a fifth object located at different positions of the second preset track, and the control device is configured to:
Relatively moving the lens and the second preset track according to a second preset relation, wherein the second preset relation is determined by the focal plane position of the fourth object and the first preset relation, and the second preset track is different from the first preset track, so that an image of the fifth object is obtained by using the imaging system without focusing;
Optionally, the lens is fixed, the lens comprises an optical axis, and the second preset track is capable of moving along a direction perpendicular to or parallel to the optical axis;
Optionally, the determination of the second preset relation includes:
focusing the fourth object by using the imaging system to determine a fourth coordinate;
establishing the second preset relation according to the first preset relation and the fourth coordinate, wherein the fourth coordinate reflects the focal plane position of the fourth object;
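[Editorial note] One plausible reading of the second preset relation, offered as an assumption only: reuse the slope fitted on the first track, shifted so the line passes through the fourth object's focus result. The claim states merely that the relation is determined by the fourth object's focal plane position and the first preset relation.

```python
def second_relation(first, x4, z4):
    """Shift the first preset relation by a constant offset so that it
    passes through the fourth object's focus result (x4, z4)."""
    offset = z4 - first(x4)
    return lambda x: first(x) + offset
```

The fifth object on the second track is then imaged at the shifted coordinate without refocusing.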
Optionally, the control device is configured to: after a clear image of the third object is acquired, the lens is moved relative to the first preset track and/or the second preset track to acquire an image of the fifth object using the imaging system without focusing.
6. The system of claim 5, wherein the imaging system comprises an imaging device and a stage, the imaging device comprising the lens and a focusing module, the lens comprising an optical axis, the lens being movable in the direction of the optical axis, the first preset track and/or the second preset track being located on the stage;
Optionally, the control device is configured to perform the following steps:
(a) Transmitting light to the object by utilizing the focusing module;
(b) Moving the lens to a first set position;
(c) Moving the lens from the first set position toward the object in a first set step length and judging whether the focusing module receives the light reflected by the object;
(d) When the focusing module receives the light reflected by the object, the lens is moved from a current position to a second set position, the second set position is positioned in a first range, and the first range is a range which comprises the current position and allows the lens to move along the optical axis direction;
(e) Moving the lens from the second set position by a second set step size, wherein the second set step size is smaller than the first set step size, and the imaging device is used for obtaining the image of the object at each step position;
(f) Evaluating the image of the object, and realizing focusing according to the obtained image evaluation result;
optionally, based on the current position, the first range includes a first section and a second section that are opposite, the second section is defined to be closer to the object, and step (e) includes:
(i) when the second set position is located in the second section, moving the lens from the second set position in a direction away from the object, and obtaining an image of the object with the imaging device at each position; or
(ii) when the second set position is located in the first section, moving the lens from the second set position in a direction approaching the object, and obtaining an image of the object with the imaging device at each position;
Optionally, step (f) comprises: comparing the image evaluation result with a preset condition, and if the image evaluation result meets the preset condition, storing the position of the lens corresponding to the image;
if the image evaluation result does not meet the preset condition, moving the lens to a third set position, the third set position being located in a section of the first range other than the section in which the second set position is located;
Optionally, the image evaluation result includes a first evaluation value and a second evaluation value, the second set step includes a coarse step and a fine step, and step (f) comprises: moving the lens with the coarse step until the first evaluation value of the image at the corresponding position is not larger than a first threshold value, then moving the lens with the fine step until the second evaluation value of the image at the corresponding position reaches a maximum, and storing the position of the lens corresponding to the image at which the second evaluation value is maximum;
Optionally, the image evaluation result includes a first evaluation value, a second evaluation value, and a third evaluation value, and the image includes a plurality of pixels;
the preset condition is that the number of bright spots on the image is larger than a preset value, the first evaluation value of the image at the corresponding position is not larger than a first threshold value, and the second evaluation value of the image at the corresponding position is the largest among the second evaluation values of the N images before and after the image at the corresponding position; or
the preset condition is that the number of bright spots on the image is smaller than the preset value, the first evaluation value of the image at the corresponding position is not larger than the first threshold value, and the third evaluation value of the image at the corresponding position is the largest among the third evaluation values of the N images before and after the current image;
optionally, the imaging system includes a bright spot detection module for:
detecting bright spots of the image using a k1×k2 matrix, wherein when the central pixel value of the matrix is not smaller than any non-central pixel value of the matrix, the matrix corresponds to one bright spot; k1 and k2 are both odd numbers larger than 1, and the k1×k2 matrix contains k1×k2 pixel points;
Optionally, the central pixel value of the matrix corresponding to one bright spot is larger than a first preset value, every non-central pixel value of the matrix is larger than a second preset value, and the first preset value and the second preset value are related to the average pixel value of the image;
Optionally, the first evaluation value is determined by counting the sizes of the connected domains corresponding to the bright spots of the image, where the size of one connected domain corresponding to a bright spot of the image is Area = A×B, A represents the size of the connected domain of the row centered on the center of the matrix corresponding to the bright spot, B represents the size of the connected domain of the column centered on the center of the matrix corresponding to the bright spot, and connected pixel points larger than the average pixel value of the image are defined as one connected domain;
Optionally, the second evaluation value and/or the third evaluation value is determined by counting the scores of the bright spots of the image, where the score of one bright spot of the image is Score = ((k1×k2-1)×CV - EV)/((CV + EV)/(k1×k2)), CV represents the central pixel value of the matrix corresponding to the bright spot, and EV represents the sum of the non-central pixel values of the matrix corresponding to the bright spot;
Optionally, the focusing module comprises a light source for emitting the light onto the object and a light sensor for receiving the light reflected by the object;
optionally, when the focusing module receives the light reflected by the object, the control device is further configured to:
Moving the lens toward the object in a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
when the first light intensity parameter is larger than the first set light intensity threshold value, the lens is moved from the current position to the second set position;
Optionally, the focusing module comprises two light sensors, the two light sensors are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors;
Optionally, the control device is configured to, while the lens is moving, judge whether the current position of the lens exceeds a fourth set position;
stopping moving the lens when the current position of the lens exceeds the fourth set position;
optionally, the control device is configured to:
Transmitting light to the object by utilizing the focusing module;
moving the lens to a first set position;
Moving the lens from the first set position toward the object in a first set step length and judging whether the focusing module receives the light reflected by the object;
When the focusing module receives light reflected by the object, the lens is moved in a second set step length smaller than the first set step length, the imaging device is used for acquiring an image of the object, and whether the sharpness value of the image acquired by the imaging device reaches a set threshold value is judged;
when the sharpness value of the image reaches the set threshold value, the current position of the lens is saved as a saving position;
optionally, the focusing module comprises a light source for emitting light onto the object and a light sensor for receiving light reflected by the object;
Optionally, when the focusing module receives the light reflected by the object, the control device is configured to:
Moving the lens with a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
When the first light intensity parameter is larger than the first set light intensity threshold value, performing the steps of enabling the lens to move in the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches the set threshold value or not;
Optionally, the focusing module includes two light sensors, the two light sensors are used for receiving light reflected by the object, and the first light intensity parameter is an average value of light intensities of the light received by the two light sensors;
Optionally, when the focusing module receives the light reflected by the object, the control device is configured to:
Moving the lens with a third set step length smaller than the first set step length and larger than the second set step length, calculating a first light intensity parameter according to the light intensity of the light received by the focusing module, and judging whether the first light intensity parameter is larger than a first set light intensity threshold value;
When the first light intensity parameter is larger than the first set light intensity threshold, the lens is enabled to move towards the object in a fourth set step length smaller than the third set step length and larger than the second set step length, a second light intensity parameter is calculated according to the light intensity of the light received by the focusing module, and whether the second light intensity parameter is smaller than the second set light intensity threshold is judged;
When the second light intensity parameter is smaller than the second set light intensity threshold value, performing the steps of enabling the lens to move in the second set step length, acquiring an image of the object by using the imaging device, and judging whether the sharpness value of the image acquired by the imaging device reaches the set threshold value or not;
Optionally, the focusing module includes two light sensors, the two light sensors are used for receiving light reflected by the object, the first light intensity parameter is an average value of light intensities of the light received by the two light sensors, the light intensities of the light received by the two light sensors have a first difference value, and the second light intensity parameter is a difference value between the first difference value and a set compensation value;
Optionally, when moving the lens by the second set step length, the control device is configured to: judge whether a first sharpness value of the image corresponding to the current position of the lens is larger than a second sharpness value of the image corresponding to the previous position of the lens;
continue to move the lens toward the object with the second set step length when the first sharpness value is greater than the second sharpness value and the sharpness difference between the first sharpness value and the second sharpness value is greater than a set difference;
when the first sharpness value is greater than the second sharpness value and the sharpness difference between them is less than the set difference, continue to move the lens toward the object with a fifth set step length smaller than the second set step length until the sharpness value of the image acquired by the imaging device reaches the set threshold value;
move the lens away from the object by the second set step length when the second sharpness value is greater than the first sharpness value and the sharpness difference between the second sharpness value and the first sharpness value is greater than the set difference;
move the lens away from the object by the fifth set step length, until the sharpness value of the image acquired by the imaging device reaches the set threshold value, when the second sharpness value is greater than the first sharpness value and the sharpness difference between the second sharpness value and the first sharpness value is less than the set difference;
Optionally, the control device is configured to, while the lens is moving, judge whether the current position of the lens exceeds a second set position;
and stopping moving the lens or performing the focusing step when the current position of the lens exceeds the second set position.
7. A sequencing device comprising the imaging system of any one of claims 5-6.
8. A computer-readable storage medium storing a program for execution by a computer, wherein executing the program comprises performing the steps of the method of any one of claims 1-4.
9. An imaging system for imaging an object, the imaging system comprising a lens and a control device, the object comprising a first object, a second object and a third object located at different positions of a first preset track, characterized in that the control device comprises a computer executable program, the execution of which comprises performing the steps of the method of any one of claims 1-4.
10. A computer program product comprising instructions which, when executed by a computer, cause the computer to perform the steps of the method of any of claims 1-4.
CN202410235075.1A 2018-07-23 2018-07-23 Imaging method, device and system Pending CN118158524A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410235075.1A CN118158524A (en) 2018-07-23 2018-07-23 Imaging method, device and system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202410235075.1A CN118158524A (en) 2018-07-23 2018-07-23 Imaging method, device and system
CN201810814359.0A CN112333378A (en) 2018-07-23 2018-07-23 Imaging method, device and system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810814359.0A Division CN112333378A (en) 2018-07-23 2018-07-23 Imaging method, device and system

Publications (1)

Publication Number Publication Date
CN118158524A true CN118158524A (en) 2024-06-07

Family

ID=74319229

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202410235075.1A Pending CN118158524A (en) 2018-07-23 2018-07-23 Imaging method, device and system
CN201810814359.0A Pending CN112333378A (en) 2018-07-23 2018-07-23 Imaging method, device and system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810814359.0A Pending CN112333378A (en) 2018-07-23 2018-07-23 Imaging method, device and system

Country Status (1)

Country Link
CN (2) CN118158524A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113763401B (en) * 2021-09-10 2023-06-27 南京比邻智能软件有限公司 Quick multi-point automatic focusing method, system and application equipment thereof

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4608043B2 (en) * 1999-09-24 2011-01-05 オリンパス株式会社 Microscope focus detector
US7668362B2 (en) * 2000-05-03 2010-02-23 Aperio Technologies, Inc. System and method for assessing virtual slide image quality
US7171054B2 (en) * 2003-05-01 2007-01-30 Eastman Kodak Company Scene-based method for determining focus
JP4374574B2 (en) * 2004-03-30 2009-12-02 富士フイルム株式会社 Manual focus adjustment device and focus assist program
TW200804962A (en) * 2006-07-06 2008-01-16 Asia Optical Co Inc Semi-automatic zooming method for a manually-zooming optical component and a videotaping device using such method,
JP5489392B2 (en) * 2007-05-09 2014-05-14 オリンパス株式会社 Optical system evaluation apparatus, optical system evaluation method, and optical system evaluation program
CN102696219B (en) * 2010-11-08 2016-03-23 松下电器产业株式会社 Camera head, image capture method and integrated circuit
CN103513395B (en) * 2012-06-15 2018-05-04 中兴通讯股份有限公司 A kind of passive auto-focusing method and device
CN102967983A (en) * 2012-11-07 2013-03-13 苏州科达科技股份有限公司 Automatic focusing method of camera
CN104102069B (en) * 2013-04-11 2017-03-15 展讯通信(上海)有限公司 A kind of focusing method of imaging system and device, imaging system
CN104243815B (en) * 2014-08-25 2017-09-29 联想(北京)有限公司 A kind of focusing method and electronic equipment
CN104503189A (en) * 2014-12-31 2015-04-08 信利光电股份有限公司 Automatic focusing method
TW201638622A (en) * 2015-04-28 2016-11-01 信泰光學(深圳)有限公司 Automatic focusing method and image capturing device utilizing the same
JP6474693B2 (en) * 2015-06-19 2019-02-27 オリンパス株式会社 Focus detection apparatus, focus detection method, and recording medium
CN106375647B (en) * 2015-07-23 2020-05-29 杭州海康威视数字技术股份有限公司 Method, device and system for adjusting back focus of camera
CN105827944B (en) * 2015-11-25 2019-05-17 维沃移动通信有限公司 A kind of focusing method and mobile terminal
US10027879B2 (en) * 2016-11-15 2018-07-17 Google Llc Device, system and method to provide an auto-focus capability based on object distance information
CN106791387A (en) * 2016-12-12 2017-05-31 中国航空工业集团公司洛阳电光设备研究所 A kind of high-definition camera Atomatic focusing method that gondola is patrolled and examined for power network
CN207215686U (en) * 2017-09-20 2018-04-10 深圳市瀚海基因生物科技有限公司 Systems for optical inspection and Sequence Detection System

Also Published As

Publication number Publication date
CN112333378A (en) 2021-02-05

Similar Documents

Publication Publication Date Title
CN107076964B (en) Image-based laser auto-focusing system
US11156823B2 (en) Digital microscope apparatus, method of searching for in-focus position thereof, and program
US10890750B2 (en) Observation system, observation program, and observation method
US9001200B2 (en) Cell characterization using multiple focus planes
US11575823B2 (en) Imaging method, device and system
CN108693625B (en) Imaging method, device and system
JP2018512609A (en) Method, system and apparatus for automatically focusing a microscope on a substrate
JP2015230393A (en) Control method of imaging apparatus, and imaging system
CN105026977A (en) Information processing device, information processing method, and information processing program
CN102236148B (en) Focus detection apparatus
US9851549B2 (en) Rapid autofocus method for stereo microscope
CN112322713B (en) Imaging method, device and system and storage medium
CN108693624B (en) Imaging method, device and system
WO2018188441A1 (en) Imaging method, device and system
CN118158524A (en) Imaging method, device and system
JP2009109682A (en) Automatic focus adjusting device and automatic focus adjusting method
CN112291469A (en) Imaging method, device and system
KR20190022028A (en) Apparatus for capturing images of blood cell
CN108693113B (en) Imaging method, device and system
CN114326074A (en) Method and microscope for generating a sample overview image
JP2013088570A (en) Microscope apparatus
CN113366364A (en) Real-time focusing in slide scanning system
KR101873318B1 (en) Celll imaging device and mehtod therefor
CN111647506B (en) Positioning method, positioning device and sequencing system
JP5960006B2 (en) Sample analyzer, sample analysis method, sample analysis program, and particle track analyzer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination