WO2014208224A1 - Autofocus device - Google Patents

Autofocus device

Info

Publication number
WO2014208224A1
Authority
WO
WIPO (PCT)
Prior art keywords
focus
area
focusing
observation
calculation
Application number
PCT/JP2014/063496
Other languages
English (en)
Japanese (ja)
Inventor
塩田 敬司
Original Assignee
Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Application filed by Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Priority to JP2014559024A (patent JP5792401B2)
Publication of WO2014208224A1


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/24 Base structure
    • G02B21/241 Devices for focusing
    • G02B21/244 Devices for focusing using image analysis techniques
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H04N23/672 Focus control based on electronic image sensor signals based on the phase difference signals
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/365 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image

Definitions

  • the present invention relates to an autofocus device suitable as a device for observing a surgical site during surgery or the like.
  • a surgical microscope, because of its high observation magnification and shallow depth of field, easily goes out of focus with even a slight movement.
  • most surgical microscopes are equipped with an electric focus mechanism; when moving the microscope, the surgeon drives the electric focus mechanism with a foot switch or hand switch and adjusts the focus manually.
  • a surgical microscope incorporating an autofocus function has also been developed.
  • an apparatus that automatically performs autofocus control at the end of movement of a surgical microscope is disclosed.
  • autofocus control is normally performed so as to focus on a given area (the in-focus area) in the observation image, as disclosed in Japanese Unexamined Patent Publication No. 2013-25132.
  • the user can designate a position such as the center of the observation range as the focus area. In this case, the user performs focusing by positioning the main subject at the center of the observation range.
  • focus control is sometimes adopted in which the area where the main subject is most likely present is estimated by subject pattern recognition among a plurality of focusing areas provided in the observation range, and that area is automatically focused. Focus control is also sometimes adopted in which a specific pattern such as a person's face is detected from the image in the observation range, and the focusing area where this pattern is located is automatically focused.
  • a database for recognizing the subject pattern or the specific pattern is required.
  • databases of people, faces, and the like have already been constructed and are used for the autofocus control of general cameras that photograph people, landscapes, and so on.
  • the surgical situation varies not only with the department or facility but also with the individual patient, even for the same type of case.
  • An object of the present invention is to provide an autofocus device capable of automatically and surely focusing on a region of interest or its vicinity.
  • An autofocus device includes: an observation unit having a focus mechanism; a calculation unit that obtains a focus position by focus calculation from an image acquired by the observation unit; an area setting unit that sets a first in-focus area that is a partial region of the observation region of the observation unit, and that sets a second in-focus area in place of the first when no focus position is obtained by the focus calculation for the first in-focus area; and a drive control unit that drives the focus mechanism based on the focus position obtained by the calculation unit.
  • FIG. 1 is a configuration diagram showing a schematic configuration of a medical observation apparatus incorporating an autofocus device according to a first embodiment of the present invention.
  • although this embodiment is applied to a video observation type surgical microscope as an example, it can also be applied to an optical observation type surgical microscope, an endoscope, and other medical observation apparatuses.
  • the medical observation apparatus 1 includes a microscope 10, a camera control unit (hereinafter referred to as CCU) 20, a 3D monitor 30, and an AF control unit 40.
  • the microscope 10 is supported by the arm 51 of the gantry 50 and is movably disposed in the operative field.
  • the microscope 10 is used for 3D observation of an observation image, and includes imaging units 14R and 14L as observation units.
  • the observation images for the right eye and the left eye are photoelectrically converted by the imaging units 14R and 14L, and a right eye image and a left eye image are acquired.
  • the microscope 10 has an optical system including an objective lens 11, zoom lenses 12R and 12L, and imaging lenses 13R and 13L arranged on the optical axis from the observation region side to the imaging surfaces of the imaging units 14R and 14L.
  • the objective lens 11 is driven by the motor 15 and moves in the optical axis direction, thereby setting a focus state.
  • the motor (M) 15 is controlled and rotated by a motor driver 46 described later, thereby moving the objective lens 11 forward and backward in the optical axis direction.
  • an encoder (E) 16 is also provided.
  • the encoder 16 detects the position (lens position) in the movable region of the objective lens 11 and outputs the detection result to a lens position detection unit 44 described later.
  • the zoom lenses 12R and 12L are for controlling the magnification of an observation image incident through the objective lens 11.
  • the zoom lens 12R zooms the observation image for the right eye, and the zoom lens 12L zooms the observation image for the left eye.
  • the imaging lenses 13R and 13L guide the zoomed observation images for the right eye and the left eye to the imaging surfaces of the imaging units 14R and 14L, respectively.
  • the imaging units 14R and 14L are configured by an imaging element such as a CCD or a CMOS sensor, and photoelectrically convert observation images incident on the imaging surface, respectively, to obtain a right-eye image and a left-eye image.
  • the image signals for the right eye and the left eye from the imaging units 14R and 14L are supplied to the CCU 20.
  • an example in which a 3D observation image is acquired as the microscope 10 is shown, but the present invention can be similarly applied to an apparatus that acquires a 2D observation image.
  • the CCU 20 performs predetermined image signal processing on the input right-eye and left-eye image signals to convert them into standard 3D video signals.
  • the 3D video signal from the CCU 20 is supplied to the 3D monitor 30.
  • the 3D monitor 30 displays a microscope observation image on a display screen based on the input 3D video signal.
  • the video signal from the CCU 20 is also supplied to the AF control unit 40 provided in the gantry 50.
  • the AF control unit 40 can set a plurality of focus areas of a predetermined size in the imaging area (observation visual field range) of the microscope 10 based on user settings, and performs focus control so that the image portion in the focus area is automatically brought into focus. Furthermore, in the present embodiment, reliable focus control is possible by changing the focus area.
  • the video signal from the CCU 20 is input to the detection unit 41 of the AF control unit 40.
  • the detection unit 41 performs a detection process on the video signal in the in-focus area set by the in-focus area setting unit 42 and outputs the detection result to the contrast calculation unit 43.
  • the contrast calculation unit 43 is controlled by the control unit 49 and obtains the contrast of the in-focus area based on the detection processing result of the detection unit 41.
  • the output of the lens position detection unit 44 is given to the contrast calculation unit 43.
  • the lens position detection unit 44 is given a detection result of the position in the movable region of the objective lens 11 from the encoder 16 of the microscope 10.
  • the lens position detection unit 44 obtains the current lens position based on the output of the encoder 16 and outputs lens position information to the contrast calculation unit 43.
  • the contrast calculation unit 43 has a memory (not shown), and stores the contrast value of the focus area at each lens position of the objective lens 11 in the memory.
  • the contrast value of the focus area at each lens position stored in the memory by the contrast calculation unit 43 is given to the focus position calculation control unit 45 via the control unit 49.
  • the control unit 49 determines that the peak value can be detected only when the contrast value exceeds a predetermined threshold (hereinafter referred to as the in-focus determination threshold), and gives the contrast value and lens position information from the contrast calculation unit 43 to the in-focus position calculation control unit 45.
  • the control unit 49 gives the focusing area setting unit 42 a determination result indicating whether or not the peak contrast value is equal to or less than the focus determination threshold.
  • the motor driver 46 is controlled by the focus position calculation control unit 45 to drive the motor 15 and move the objective lens 11 forward and backward.
  • the in-focus position calculation control unit 45 can move the objective lens 11 to a desired position by controlling the motor driver 46.
  • the in-focus position calculation control unit 45 is controlled by the control unit 49 to move the objective lens 11 forward and backward, and determines, based on the contrast value of the in-focus area at each lens position, at which lens position an in-focus state is obtained. For example, the in-focus position calculation control unit 45 obtains the lens position at which the objective lens 11 is in focus (hereinafter referred to as the in-focus position) by detecting the peak of the contrast value. The in-focus position calculation control unit 45 then controls the motor driver 46 to move the objective lens 11 to the lens position at which the in-focus state is obtained.
  • the focus area setting unit 42 determines whether or not to change the focus area depending on whether the peak contrast value is equal to or less than the focus determination threshold. That is, in the initial state the focus area setting unit 42 sets the area specified by the user as the focus area, and when the contrast value recorded in the memory does not exceed the focus determination threshold during the focusing operation, the focus area is changed under the control of the control unit 49. In this case, the focus area setting unit 42 sets a new focus area at a position close to the focus area set by the user. The focus area setting unit 42 may also newly set a focus area that includes the focus area set immediately before. The focus area setting unit 42 changes the focus area until the contrast value exceeds the focus determination threshold.
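The focus-area change described above can be sketched as a simple search outward from the user-set area. This is an illustrative Python sketch, not code from the patent: the `Area` abstraction, the `contrast_peak` callable, and the threshold value are all assumptions.

```python
# Illustrative sketch of the focus-area change described above; all names
# (Area, contrast_peak, FOCUS_THRESHOLD) and the threshold value are
# assumptions, not taken from the patent.
from dataclasses import dataclass

FOCUS_THRESHOLD = 0.5  # in-focus determination threshold (assumed scale)

@dataclass(frozen=True)
class Area:
    offset: int  # horizontal offset from the user-set area (0 = user area)

def candidate_areas(max_offset):
    """Yield areas outward from the user-set area: 0, -1, +1, -2, +2, ..."""
    yield Area(0)
    for d in range(1, max_offset + 1):
        yield Area(-d)
        yield Area(+d)

def select_focus_area(contrast_peak, max_offset=3):
    """Return the first area whose peak contrast exceeds the threshold.

    `contrast_peak` maps an Area to the peak contrast value recorded for
    that area over the lens sweep (the role played by the contrast
    calculation unit 43 in the text above).
    """
    for area in candidate_areas(max_offset):
        if contrast_peak(area) > FOCUS_THRESHOLD:
            return area
    return None  # no area produced a usable contrast peak
```

The search order mirrors the text: the user-set area first, then areas increasingly far from it, stopping as soon as a usable contrast peak is found.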
  • FIG. 2 shows a focusing area 62 designated by the user in the observation visual field range 61.
  • the example of FIG. 2 shows that the user has set the focus area at the center of the observation visual field range 61.
  • FIG. 3 shows a focus area newly set by the focus area setting unit 42 when the contrast value is equal to or less than the focus determination threshold value.
  • the focus area setting unit 42 sets a new focus area 63a at a position close to the focus area 62 set by the user, for example at a position adjacent to the right or left of the focus area 62 as shown in FIG. 3.
  • the focus area setting unit 42 may newly set focus areas 63a on both the left and right sides of the focus area 62, or may set one of the left and right focus areas 63a first and set the other focus area 63a if the contrast value of the first is equal to or less than the focus determination threshold.
  • the focus area setting unit 42 may also set an area including the focus area 62 and the two focus areas 63a as the new focus area.
  • when the contrast value of the new focus area 63a is equal to or less than the focus determination threshold, the focus area setting unit 42 further sets new focus areas 63b to the left and right of the focus areas 63a. In this case too, the focus area setting unit 42 may set the focus areas 63b on both sides at once, or may set one of them first and set the other when the contrast value of the first is equal to or less than the focus determination threshold. Further, the focus area setting unit 42 may set an area including the focus area 62, the two focus areas 63a, and the two focus areas 63b as the new focus area.
  • the focus area setting unit 42 changes the focus area until the contrast value exceeds the focus determination threshold value.
  • although FIG. 3 shows an example in which the newly set focus areas extend to the left and right of the focus area 62 set by the user, the focus areas may obviously be set so as to extend vertically, or so as to widen. In the present embodiment, the focus area may be changed so as to be switched or expanded sequentially from positions close to the user-set focus area toward positions farther away.
  • FIG. 4 is a flowchart for explaining the operation during focusing.
  • FIGS. 5 and 7 are explanatory diagrams illustrating examples of observation images.
  • FIGS. 6 and 8 are explanatory diagrams showing changes in contrast values with respect to lens positions.
  • the observation visual field range 70 covers, for example, a surgical site such as in brain surgery, and an observation image 71 of the observation visual field range 70 is obtained by the microscope 10.
  • the observation image 71 indicates that the blood vessel 73 exists in a part of the brain parenchyma 72, for example. Note that the brain parenchyma 72 generally has little change in color and unevenness.
  • the AF control unit 40 performs the focusing operation shown in FIG. 4 at a predetermined timing. Further, the AF control unit 40 may perform the focusing operation at a predetermined time interval.
  • in step S1, the focusing area setting unit 42 sets the focusing area 62 specified in advance by the user in the detection unit 41.
  • the detection unit 41 detects the video signal in the set in-focus area from the output of the CCU 20 and outputs the detection result to the contrast calculation unit 43.
  • the contrast calculation unit 43 is also given information on the current lens position of the objective lens 11 from the lens position detection unit 44; it obtains a contrast value from the detection result of the detection unit 41 (step S3) and stores the current lens position and the contrast value in the memory in association with each other (step S4).
  • the control unit 49 controls the in-focus position calculation control unit 45 to read the contrast values stored in the memory of the contrast calculation unit 43 while changing the lens position of the objective lens 11 (step S6), and detects the peak of the contrast value. If the peak of the contrast value cannot be detected, the control unit 49 controls the in-focus position calculation control unit 45 to repeat the movement of the objective lens in step S6; if the peak can be detected, it is determined in step S7 whether or not the peak contrast value exceeds the focus determination threshold.
  • FIG. 6 corresponds to FIG. 5 and shows the change of the contrast value according to the change of the lens position, with the horizontal axis representing the lens position of the objective lens 11 and the vertical axis representing the contrast value.
  • the blood vessel 73 passes over the focusing area 62 designated in advance by the user, and the contrast value in the focusing area 62 is sufficiently high. Thereby, as shown in FIG. 6, the contrast value exceeds the focus determination threshold value.
  • the control unit 49 outputs the contrast value and lens position information to the in-focus position calculation control unit 45.
  • the in-focus position calculation control unit 45 drives the motor driver 46 to move the objective lens to the lens position at which the peak of the contrast value was obtained.
  • the objective lens 11 is thus driven to the lens position where the in-focus state is obtained (step S9).
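The sweep in steps S2 to S9 above can be sketched as a simple loop. This is a hedged illustration: `measure_contrast` stands in for the detection and contrast path (units 41 and 43), and the threshold value is assumed.

```python
# Illustrative sketch of steps S2-S9: record the contrast value at each
# lens position, then report the contrast peak only when that peak exceeds
# the in-focus determination threshold. Names and the threshold value are
# assumptions, not taken from the patent.
FOCUS_THRESHOLD = 0.5

def contrast_sweep(lens_positions, measure_contrast):
    """Steps S2-S6: store (lens position, contrast) pairs over the sweep."""
    return [(pos, measure_contrast(pos)) for pos in lens_positions]

def in_focus_position(samples):
    """Step S7: return the peak lens position, or None when the peak does
    not exceed the threshold (which triggers a focus-area change, step S8)."""
    peak_pos, peak_val = max(samples, key=lambda s: s[1])
    return peak_pos if peak_val > FOCUS_THRESHOLD else None
```

A `None` return here corresponds to the branch in step S8, where the focusing area is changed and the sweep repeated; a position return corresponds to step S9, where the motor driver moves the objective lens to that position.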
  • the observation visual field range 80 is a range slightly shifted from the observation visual field range 70 of FIG. 5, and the observation image 81 shows that the blood vessel 73 exists in the peripheral region of the observation visual field range 80.
  • the focusing area setting unit 42 of the AF control unit 40 sets the focusing area 62 specified in advance by the user in the detection unit 41 in step S1.
  • the processing in steps S2 to S6 is repeated to obtain the contrast value and the lens position, and it is determined whether or not the contrast value peak has exceeded the in-focus determination threshold value (step S7).
  • FIG. 8 corresponds to FIG. 7, and shows the change in contrast value with respect to the lens position by the same notation method as in FIG.
  • FIG. 8A shows a change in contrast value for the in-focus area 62.
  • the contrast value is low at any lens position, and is below the focus determination threshold value. When the contrast value is less than or equal to the focus determination threshold, it is considered difficult to detect the focus position.
  • control unit 49 notifies the in-focus area setting unit 42 that the contrast value is equal to or less than the in-focus determination threshold, and causes the in-focus area setting unit 42 to change the in-focus area (step S8).
  • the focusing area setting unit 42 sets the focusing area 63a in the detection unit 41. In this way, the processing of steps S2 to S6 is repeated for the in-focus area 63a, the contrast value and the lens position are obtained, and it is determined whether or not the peak of the contrast value exceeds the in-focus determination threshold (step S7).
  • the blood vessel 73 does not pass over the focusing area 63a, and the contrast value in the focusing area 63a is relatively low. That is, also in this case, the change in the contrast value is the same as in FIG. 8A, and the contrast value is equal to or less than the focus determination threshold value at any lens position.
  • the focus area setting unit 42 further changes the focus area and sets the focus area 63b in the detection unit 41 (step S8). In this way, the processing of steps S2 to S6 is repeated for the in-focus area 63b, the contrast value and the lens position are obtained, and it is determined whether or not the peak of the contrast value exceeds the in-focus determination threshold (step S7).
  • the blood vessel 73 passes over the focusing area 63b, and the contrast value in the focusing area 63b is sufficiently high.
  • the change in the contrast value becomes, for example, that shown in FIG. 8B. That is, in this case, the contrast value exceeds the focus determination threshold.
  • the in-focus position calculation control unit 45 drives the motor driver 46 to move the objective lens to the lens position at which the peak of the contrast value was obtained, and an in-focus state is obtained (step S9).
  • when the in-focus position cannot be obtained in the in-focus area designated in advance by the user, the in-focus position is obtained by sequentially changing or expanding the in-focus area from positions close to the user-set area toward positions farther away.
  • the focus region is automatically switched to obtain the focus position.
  • a focus position is obtained in a focus region having a high contrast in a region near the region of interest. In this way, it is possible to automatically and reliably focus on the region of interest or the vicinity thereof.
  • FIG. 4 shows an example in which the focus area setting unit 42 changes the focus area when the peak contrast value of the focus area is equal to or less than the focus determination threshold. It is also possible to calculate the contrast values for a plurality of focus areas, determine the focus area used for focus control according to the contrast value obtained for each focus area, and obtain the in-focus position.
  • FIG. 9 is a flowchart for explaining a focusing operation in this case
  • FIG. 10 is an explanatory diagram for explaining a method of determining a focusing area. In FIG. 9, the same steps as those in FIG. 4 are assigned the same reference numerals, and duplicate description is omitted.
  • the focus area setting unit 42 sets the focus area specified by the user and one or more focus areas around the focus area in the detection unit 41.
  • the processing in steps S2 to S6 is repeated, and the contrast value and the lens position are obtained and stored. That is, in this case, the contrast values of all the focus areas are obtained at each lens position, and finally peak detection is performed for each focus area.
  • in step S13, the control unit 49 determines whether or not there is a focus area whose peak contrast value exceeds the focus determination threshold. If none exists, the focus area setting unit 42 newly sets or expands focus areas around the areas set in step S11 (step S14), and steps S2 to S6 and S13 are repeated.
  • if such an area exists, the control unit 49 selects the focus area closest to the focus area set in advance by the user. The lens position of the contrast peak obtained for the selected focus area is then given to the in-focus position calculation control unit 45, which controls the motor driver 46 to move the lens position of the objective lens 11 to the in-focus position.
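The selection rule in steps S13 to S15 might look like the following sketch. The data shapes, the `distance` callable, and the threshold value are illustrative assumptions, not taken from the patent.

```python
# Sketch of steps S13-S15: among the focus areas whose contrast peak
# exceeds the threshold, choose the one closest to the user-set area.
# Data shapes and the threshold value are assumptions.
FOCUS_THRESHOLD = 0.5

def pick_focus_area(peaks, user_area, distance):
    """`peaks` maps area id -> (peak lens position, peak contrast value).

    Returns (area id, lens position) for the qualifying area closest to
    `user_area`, or None when no area qualifies (step S14: reset/expand
    the focus areas and sweep again).
    """
    qualifying = [(area, pos) for area, (pos, val) in peaks.items()
                  if val > FOCUS_THRESHOLD]
    if not qualifying:
        return None
    return min(qualifying, key=lambda item: distance(item[0], user_area))
```

Selecting by distance rather than by highest contrast is the point made in the text: it keeps the focus state close to what the user intended.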
  • the observation visual field range 90 of FIG. 10 is a range slightly shifted from the observation visual field range 70 of FIG. 5, and the observation image 91 shows that the blood vessel 73 exists in a region slightly off the center of the observation visual field range 90.
  • FIG. 10 shows that the focusing area setting unit 42 sets the focusing area 62 designated by the user and the eight surrounding focusing areas 92a to 92h in step S11.
  • the blood vessel 73 passes through the focus areas 92d, 92g, and 92h, and it is considered that only the contrast values of these focus areas 92d, 92g, and 92h exceed the focus determination threshold value.
  • the contrast value of the in-focus region 92d is higher than the contrast values of the other in-focus regions 92g and 92h.
  • the control unit 49 selects the in-focus area 92g in step S15 and controls the in-focus position calculation control unit 45 to determine the in-focus position using the lens position of the contrast peak obtained for the in-focus area 92g.
  • the contrast values of all the in-focus areas can be obtained simultaneously, and the processing time in the in-focus operation can be shortened. Further, since the focus area is selected based on the distance from the focus area set by the user, it is easy to obtain the focus state intended by the user.
  • FIG. 11 is a block diagram showing a second embodiment of the present invention.
  • in the second embodiment, focus control by the phase difference method is performed using the fact that the microscope 10 can acquire observation images for the right eye and the left eye. Even when the microscope is two-dimensional, this embodiment can be applied if the imaging unit has AF pixels for autofocus or a sensor for phase difference detection.
  • the medical observation apparatus 100 differs from the first embodiment in that it adopts an AF control unit 101 in which the focus area setting unit 42, the contrast calculation unit 43, and the focus position calculation control unit 45 are replaced by a focus area setting unit 102, a phase difference calculation unit 103, and a focus position calculation control unit 104, respectively.
  • the focusing area setting unit 102 sets different focusing areas in the detection unit 41 for the observation image for the right eye and the observation image for the left eye.
  • the in-focus area setting unit 102 sets a reference in-focus area, which is a relatively wide in-focus area, for one of the observation images for the right eye and the left eye, and sets a comparative in-focus area, which is a relatively narrow in-focus area, for the other observation image.
  • the detection unit 41 detects the video signals of the reference and comparative focus areas set for the right eye and the left eye, respectively, and outputs the detection result to the phase difference calculation unit 103.
  • the phase difference calculation unit 103 compares the detection result for the observation image in which the reference focusing area is set (hereinafter referred to as the reference detection result) with the detection result for the observation image in which the comparative focusing area is set (hereinafter referred to as the comparative detection result) while shifting the comparative detection result. That is, the phase difference calculation unit 103 finds which part of the reference in-focus area the comparative detection result matches, and detects the positional shift between the matched part and the comparative in-focus area as the phase difference. Note that the phase difference calculation unit 103 determines a match based on, for example, a luminance distribution or a level distribution for each color.
  • the phase difference obtained by the phase difference calculation unit 103 corresponds to the focus shift of the microscope 10.
  • Information on the phase difference obtained by the phase difference calculation unit 103 is given to the in-focus position calculation control unit 104 via the control unit 49.
  • the in-focus position calculation control unit 104 calculates and determines the in-focus position based on the phase difference obtained by the phase difference calculation unit 103.
  • the in-focus position calculation control unit 104 controls the motor driver 46 to move the objective lens 11 forward and backward to move to the in-focus position. As a result, the microscope 10 is brought into focus.
  • the phase difference calculation unit 103 can perform the match comparison when observation images having the feature points necessary for focusing are included in both the reference focusing area and the comparative focusing area.
  • when both the reference focusing area and the comparative focusing area include observation images having many similar feature points, the match comparison can be difficult. Therefore, the comparative focusing area is set to a relatively narrow area so that the number of feature points it contains is reduced, which improves the phase difference detection accuracy.
  • when the comparative focusing area contains no feature points, however, the phase difference calculation unit 103 cannot detect a correct phase difference.
  • the control unit 49 determines that the phase difference cannot be detected when the calculation result of the phase difference calculation unit 103 does not reach a predetermined threshold at any shift position in the comparative focusing area, and gives the determination result to the in-focus area setting unit 102.
  • in this case, the focusing area setting unit 102 shifts or widens the comparative focusing area so that an observation image having the feature points necessary for focusing is included in the comparative focusing area.
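The shift-and-match search described above can be sketched in one dimension as follows. The sum-of-absolute-differences score and the match threshold are assumptions; the patent only states that matching uses a luminance distribution or a level distribution for each color.

```python
# Minimal 1-D sketch of the phase-difference search: slide the narrow
# comparative profile across the wide reference profile and report the
# shift with the best match. The SAD-based score and the threshold are
# assumptions, not the patent's actual matching criterion.
def phase_difference(reference, comparative, match_threshold=0.9):
    """Return the best-matching pixel shift, or None when no shift reaches
    `match_threshold` (triggering a shift or widening of the comparative
    focusing area, as described above)."""
    n = len(comparative)
    best_shift, best_score = None, 0.0
    for shift in range(len(reference) - n + 1):
        window = reference[shift:shift + n]
        sad = sum(abs(a - b) for a, b in zip(window, comparative))
        score = 1.0 / (1.0 + sad)  # 1.0 means a perfect match
        if score > best_score:
            best_shift, best_score = shift, score
    return best_shift if best_score >= match_threshold else None
```

The returned shift corresponds to the phase difference, which the in-focus position calculation control unit converts into a focus shift amount; a `None` return corresponds to the determination that the phase difference cannot be detected.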
  • FIG. 12 is a flowchart for explaining the focusing operation.
  • 13 and 14 are explanatory diagrams for explaining the focusing operation by the phase difference method.
  • FIG. 13 shows an example in which a feature point is included in the comparative focus area.
  • FIG. 13A shows an observation visual field range 111L for the left eye
  • FIG. 13B shows an observation visual field range 111R for the right eye
  • the observation image 112L in the observation visual field range 111L indicates, for example, that the blood vessel 114L exists in a part of the brain parenchyma 113L.
  • the observation image 112R in the observation visual field range 111R indicates that the blood vessel 114R exists in a part of the brain parenchyma 113R, for example.
  • the brain parenchyma 113L and 113R are the same observation object, and the blood vessels 114L and 114R are also the same observation object.
  • the positional shift (phase difference) of each observation object on the observation visual field ranges 111L and 111R corresponds to the focus shift amount.
  • the focusing area setting unit 102 sets a reference focusing area and a comparative focusing area in step S21 of FIG.
  • the example of FIG. 13 shows the reference focusing area 115L set in the observation visual field range 111L for the left eye and the comparative focusing area 115R set in the observation visual field range 111R for the right eye.
  • the detection unit 41 detects the video signals in the focusing regions 115L and 115R and outputs the detection result to the phase difference calculation unit 103 (step S22).
  • FIG. 13C shows the detection results of the in-focus areas 115L and 115R as images.
  • the reference focusing area 115L includes the image portion 114La of the blood vessel 114L at its approximate center.
  • the comparison focus area 115R includes the image portion 114Ra of the blood vessel 114R at the right end.
  • the phase difference calculation unit 103 performs a match comparison with the equally sized portion of the detection result of the reference focusing area 115L while moving the detection result of the comparative focusing area 115R step by step, for example pixel by pixel.
  • step-moving the detection result of the comparative focusing area 115R to the range indicated by the broken line in FIG. 13C the detection result of the reference focusing area 115L of the same size is matched.
  • the amount of movement in this case corresponds to the phase difference.
  • the phase difference calculation unit 103 outputs the obtained phase difference to the focus position calculation control unit 104.
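The pixel-by-pixel coincidence comparison described above is essentially one-dimensional block matching. The sketch below illustrates the idea with a sum-of-absolute-differences (SAD) search over luminance profiles; the function name, the SAD criterion, and the acceptance threshold are illustrative assumptions, not the implementation specified by the patent.

```python
def phase_difference(reference, comparative):
    """Estimate the horizontal shift (phase difference) between a reference
    focusing area and a wider comparative focusing area by exhaustive
    sum-of-absolute-differences (SAD) matching.

    reference   -- 1-D luminance profile of the reference focusing area
    comparative -- 1-D luminance profile of the comparative focusing area,
                   at least as long as `reference`
    Returns (best_shift, best_sad), or (None, None) when no position
    matches well enough to be trusted.
    """
    n = len(reference)
    if len(comparative) < n:
        return None, None
    best_shift, best_sad = None, float("inf")
    # Shift the comparison window pixel by pixel and score each position.
    for shift in range(len(comparative) - n + 1):
        window = comparative[shift:shift + n]
        sad = sum(abs(a - b) for a, b in zip(reference, window))
        if sad < best_sad:
            best_shift, best_sad = shift, sad
    # Hypothetical acceptance threshold: reject featureless areas where
    # even the best position matches poorly (no detectable phase difference).
    if best_sad > 0.25 * n * 255:
        return None, None
    return best_shift, best_sad
```

With 8-bit luminance values, the best shift plays the role of the phase difference handed to the focus position calculation; the rejection branch corresponds to the "phase difference cannot be detected" outcome of step S24.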
  • if no match is found, the control unit 49 determines in step S24 that the phase difference cannot be detected.
  • the focus position calculation control unit 104 calculates the focus shift amount from the phase difference obtained by the phase difference calculation unit 103 and obtains the in-focus position (step S25). In step S26, the focus position calculation control unit 104 controls the motor driver 46 to move the objective lens 11 forward or backward to the in-focus position. In this way, the microscope 10 is brought into focus.
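Steps S25 and S26 map the measured phase difference to a focus shift amount and a target lens position. The exact relation between phase difference and defocus depends on the optics and is not specified here; the sketch below assumes a simple linear calibration, with both the in-focus phase offset and the micrometres-per-pixel coefficient as hypothetical values.

```python
def focus_shift_from_phase(phase_px, in_focus_phase_px, um_per_px):
    """Convert a measured phase difference (pixels) into a focus shift (um).

    in_focus_phase_px -- phase difference observed when already in focus
                         (a calibration value; hypothetical here)
    um_per_px         -- micrometres of defocus per pixel of phase
                         difference (hypothetical linear coefficient)
    A positive result means the lens must advance; negative, retract.
    """
    return (phase_px - in_focus_phase_px) * um_per_px


def target_lens_position(current_um, phase_px,
                         in_focus_phase_px=12.0, um_per_px=5.0):
    """Target objective-lens position (steps S25/S26): the current
    position plus the computed focus shift. Defaults are illustrative."""
    return current_um + focus_shift_from_phase(
        phase_px, in_focus_phase_px, um_per_px)
```

Because the mapping is computed in one step rather than searched, this is what lets the phase difference method reach the in-focus position faster than a contrast search.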
  • FIG. 13 shows an example in which an observation image of a feature point, such as a blood vessel, is included in the comparative focusing area initially set by the focusing area setting unit 102.
  • depending on the observation object, however, no feature point may be included in the comparative focusing area.
  • for example, when the magnification is increased from the state of FIG. 13, the feature point may fall outside the comparative focusing area.
  • FIG. 14 shows an example of this case
  • FIGS. 14A and 14B show the observation visual field ranges 121L and 121R when the magnification is increased from the state of FIGS. 13A and 13B.
  • the observation image 112L is enlarged to become an observation image 122L
  • the observation image 112R is enlarged to become an observation image 122R.
  • the blood vessel 114L is observed as the blood vessel 124L
  • the blood vessel 114R is observed as the blood vessel 124R.
  • the reference focusing area 115L includes the feature points of the blood vessel 124L
  • the comparative focusing area 115R does not include the feature points of the blood vessel 124R.
  • in this case, the control unit 49 determines in step S24 that the phase difference cannot be detected, and outputs the determination result to the focusing area setting unit 102.
  • the focusing area setting unit 102 enlarges the comparison focusing area.
  • FIG. 14C shows the comparative focusing area 115R expanded into the comparative focusing area 125R.
  • this comparative focusing area 125R includes the edge portion of the blood vessel 124R as a feature point. Accordingly, the phase difference can now be detected by the phase difference calculation unit 103: the phase difference is detected in steps S22 and S23, and in steps S25 and S26 the in-focus position is calculated and the objective lens 11 is moved to the in-focus position.
  • thus, in the present embodiment, a focusing area for detecting the phase difference is set, and control is performed to detect the phase difference while gradually expanding the focusing area until a feature point is included in it.
  • this makes it possible to reliably detect the phase difference and establish the in-focus state. For example, even when the enlargement magnification is increased, the focusing area is enlarged so that the phase difference can still be detected, and the in-focus state can be reliably maintained.
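The retry logic of steps S22 to S24 and FIG. 14 — detect, and if detection fails, widen the comparative focusing area and try again — can be sketched as a small control loop. The growth factor, the size cap, and the callable interface are illustrative assumptions, not values from the patent.

```python
def detect_with_expanding_area(detect_phase, area, max_area, grow):
    """Retry phase detection while gradually expanding the comparative
    focusing area (the control of steps S22-S24 and FIG. 14).

    detect_phase -- callable taking (width, height) and returning the
                    phase difference, or None when detection fails
    area         -- initial (width, height) of the comparative area
    max_area     -- upper bound on (width, height)
    grow         -- multiplicative growth factor per retry (assumed here)
    Returns (phase, final_area); phase is None if detection never succeeds.
    """
    w, h = area
    while True:
        phase = detect_phase((w, h))
        if phase is not None:
            return phase, (w, h)          # feature point found; done
        if w >= max_area[0] and h >= max_area[1]:
            return None, (w, h)           # give up: area cannot grow further
        # Widen the comparative focusing area and try again.
        w = min(int(w * grow), max_area[0])
        h = min(int(h * grow), max_area[1])
```

The loop terminates either when a feature point enters the widened area or when the area has reached the full observation visual field, mirroring the "phase difference detection impossible" branch.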
  • FIG. 15 is a block diagram showing a third embodiment of the present invention. In FIG. 15, the same components as those in FIGS. 1 and 11 are denoted by the same reference numerals, and their description is omitted.
  • This embodiment is a combination of the focus control by the contrast method in the first embodiment and the focus control by the phase difference method in the second embodiment.
  • the medical observation apparatus 130 of the present embodiment differs from the first and second embodiments in that it adopts an AF control unit 131 comprising a focusing area setting unit 132 having the functions of the focusing area setting units 42 and 102, a contrast/phase difference calculation unit 133 having the functions of the contrast calculation unit 43 and the phase difference calculation unit 103, and a focus position calculation control unit 135 having the functions of the focus position calculation control units 45 and 105.
  • the operation at the time of focus control by the contrast method is the same as that of the first embodiment, and the operation at the time of focus control by the phase difference method is the same as that of the second embodiment.
  • the contrast method and the phase difference method are switched according to the flowchart of FIG.
  • the phase difference method enables faster focus control than the contrast method.
  • the contrast method, on the other hand, can perform focus control with higher accuracy than the phase difference method. Therefore, in the present embodiment, the phase difference method is adopted first to enable high-speed focus control.
  • by this control, the objective lens 11 is positioned in the vicinity of the position where the contrast peak is approximately obtained.
  • by then adopting the contrast method and moving the objective lens 11 only in the vicinity of this position, high-precision focus control is possible in a relatively short time.
  • the focusing area setting unit 132 sets, for example, a plurality of focusing areas for the contrast method as shown in FIG. 10 and the like, and sets a focusing area for the phase difference method as in the second embodiment.
  • the detection unit 41 detects the video signal in the set focus area and outputs the detection result to the contrast / phase difference calculation unit 133.
  • in step S33, the contrast/phase difference calculation unit 133 obtains the contrast value and the phase difference. As the contrast value, only the value at a predetermined objective lens position is obtained at this stage.
  • the control unit 49 determines in step S34 whether or not the phase difference has been detected by the contrast/phase difference calculation unit 133. When the phase difference cannot be detected, the focusing area setting unit 132 widens the comparative focusing area as shown in FIG. 14 through steps S38 and S39. When the phase difference can then be detected by the contrast/phase difference calculation unit 133, the process proceeds to step S36 through steps S34 and S35, the focus position calculation control unit 135 calculates the in-focus position based on the phase difference, and in step S37 the motor driver 46 is controlled to move the objective lens 11 to the in-focus position.
  • if the phase difference cannot be detected even after widening the comparative focusing area, focus control by the phase difference method is switched to focus control by the contrast method. That is, the process proceeds from step S34 through steps S38, S39, and S45 to step S46, the focus position calculation control unit 135 moves the objective lens 11, and steps S32 and S33 are then repeated.
  • in step S46, the direction of change of the contrast value is detected by comparison with the previous contrast value, and the movement direction is controlled so that the objective lens 11 moves in the direction in which the contrast value increases.
  • the process proceeds from step S38 to step S40, where the comparative focusing area is initialized and narrowed, and then steps S32 and S33 are repeated.
  • as a result, focus control by the phase difference method is performed again, and the transition to the in-focus state is made relatively quickly.
  • when focus is obtained by the phase difference method in step S37, or when the phase difference cannot be detected during focus control by the contrast method, focus control by the contrast method is newly started or continued. That is, the process proceeds from steps S37, S35, and S39 to step S45, and the objective lens 11 is moved under control until a contrast peak is detected (step S46).
  • when the peak is detected, the focus position calculation control unit 135 performs the in-focus position calculation by the contrast method in step S47 and controls the motor driver 46 to move the objective lens 11 to the in-focus position (step S48). High-precision focus control is thereby achieved.
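The two-stage control of this third embodiment — a fast jump near the in-focus position using the phase difference result, followed by a contrast hill-climb to the peak (steps S45 to S48) — can be sketched as follows. The step size, the stopping rule, and all names are illustrative assumptions; a real implementation would also have to tolerate noisy contrast readings.

```python
def hybrid_autofocus(phase_target, contrast_at, step_um=2.0):
    """Two-stage focus search: phase-difference coarse positioning,
    then contrast-method refinement by hill-climbing.

    phase_target -- lens position (um) estimated by the phase difference
                    method (stage 1: a single fast jump)
    contrast_at  -- callable(position_um) -> contrast value
    step_um      -- hypothetical fine-search step size
    Returns the lens position at the detected contrast peak.
    """
    # Stage 1: the phase difference method places the lens near the peak.
    pos = phase_target
    # Stage 2 (step S46): pick the uphill direction by comparing the
    # contrast one step ahead with the current contrast.
    direction = 1.0
    prev = contrast_at(pos)
    if contrast_at(pos + step_um) < prev:
        direction = -1.0
    # Climb while contrast keeps increasing; stop once it falls
    # (peak detected, steps S47/S48 would then move to this position).
    while True:
        nxt = pos + direction * step_um
        c = contrast_at(nxt)
        if c <= prev:
            return pos
        pos, prev = nxt, c
```

Starting the climb from `phase_target` rather than from one end of the lens travel is what gives the hybrid scheme its combination of speed and accuracy.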
  • the present invention is not limited to the above-described embodiments as they are; in the implementation stage, the constituent elements can be modified and embodied without departing from the scope of the invention.
  • various inventions can be formed by appropriately combining the plurality of constituent elements disclosed in the above embodiments. For example, some of the constituent elements shown in the embodiments may be deleted.
  • constituent elements across different embodiments may also be combined as appropriate.


Abstract

An autofocus device comprising: a calculation unit that receives an observation image from an observation unit including a focusing mechanism and performs a focusing contrast calculation on the observation image to acquire an in-focus position; an area specification unit that specifies a part of the observation area of the observation unit as a first focusing area to be subjected to the focusing calculation, and that modifies the first focusing area to specify a second focusing area when an in-focus position cannot be acquired from the focusing calculation on the first focusing area; and a drive control unit that drives the focusing mechanism based on the in-focus position acquired by the calculation unit.
PCT/JP2014/063496 2013-06-24 2014-05-21 Dispositif de mise au point automatique WO2014208224A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2014559024A JP5792401B2 (ja) 2013-06-24 2014-05-21 オートフォーカス装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-131723 2013-06-24
JP2013131723 2013-06-24

Publications (1)

Publication Number Publication Date
WO2014208224A1 true WO2014208224A1 (fr) 2014-12-31

Family

ID=52141576

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/063496 WO2014208224A1 (fr) 2013-06-24 2014-05-21 Dispositif de mise au point automatique

Country Status (2)

Country Link
JP (1) JP5792401B2 (fr)
WO (1) WO2014208224A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017040726A (ja) * 2015-08-18 2017-02-23 キヤノン株式会社 撮像装置及びその制御方法
WO2020203810A1 (fr) * 2019-03-29 2020-10-08 Sony Corporation Système de traitement d'images, dispositif de traitement d'images, et procédé de traitement d'images

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3851894B1 (fr) * 2020-01-20 2024-06-12 CellaVision AB Système et procédé d'imagerie de microscopie numérique

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0529015U (ja) * 1991-09-27 1993-04-16 株式会社ニコン 自動焦点調節カメラ
JP2007034261A (ja) * 2005-06-22 2007-02-08 Fujifilm Corp 自動合焦制御装置およびその制御方法
JP2007310004A (ja) * 2006-05-16 2007-11-29 Citizen Holdings Co Ltd 自動合焦点装置
JP2009210815A (ja) * 2008-03-04 2009-09-17 Olympus Imaging Corp 自動焦点調節装置
WO2012029357A1 (fr) * 2010-08-30 2012-03-08 オリンパスメディカルシステムズ株式会社 Endoscope
WO2013058145A1 (fr) * 2011-10-21 2013-04-25 オリンパス株式会社 Dispositif d'imagerie, dispositif d'endoscope et procédé de commande de dispositif d'imagerie

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6317416A (ja) * 1986-06-09 1988-01-25 Minolta Camera Co Ltd 焦点検出装置
JP2840562B2 (ja) * 1995-01-17 1998-12-24 キヤノン株式会社 焦点検出装置を有するカメラ
JP2000102040A (ja) * 1998-09-28 2000-04-07 Olympus Optical Co Ltd 電子ステレオカメラ
JP3263931B2 (ja) * 1999-09-22 2002-03-11 富士重工業株式会社 ステレオマッチング装置
JP2001203883A (ja) * 2000-01-21 2001-07-27 Sharp Corp 画像処理装置
JP2009014445A (ja) * 2007-07-03 2009-01-22 Konica Minolta Holdings Inc 測距装置
JP5814692B2 (ja) * 2011-08-15 2015-11-17 キヤノン株式会社 撮像装置及びその制御方法、プログラム


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017040726A (ja) * 2015-08-18 2017-02-23 キヤノン株式会社 撮像装置及びその制御方法
WO2020203810A1 (fr) * 2019-03-29 2020-10-08 Sony Corporation Système de traitement d'images, dispositif de traitement d'images, et procédé de traitement d'images
JP2020162803A (ja) * 2019-03-29 2020-10-08 ソニー株式会社 画像処理システム、画像処理装置、及び画像処理方法

Also Published As

Publication number Publication date
JP5792401B2 (ja) 2015-10-14
JPWO2014208224A1 (ja) 2017-02-23

Similar Documents

Publication Publication Date Title
JP5284731B2 (ja) 立体画像撮影表示システム
CN108697308B (zh) 图像处理装置、图像处理方法和记录介质
US9485405B2 (en) Focus control device, endoscope device, and focus control method
JP5178553B2 (ja) 撮像装置
JP5415973B2 (ja) 撮像装置、内視鏡システム及び撮像装置の作動方法
US10827906B2 (en) Endoscopic surgery image processing apparatus, image processing method, and program
JP2020114491A (ja) データユニットを有する手術用顕微鏡及び画像をオーバレイするための方法
JP6758287B2 (ja) 制御装置及び医療用撮像システム
WO2020095366A1 (fr) Dispositif d'imagerie, dispositif d'endoscope, et procédé de fonctionnement de dispositif d'imagerie
WO2016117107A1 (fr) Dispositif d'endoscope et procédé de commande de focalisation pour le dispositif d'endoscope
JP6736670B2 (ja) 内視鏡システム
JP2017038285A (ja) 医療用観察装置、制御装置、制御装置の作動方法および制御装置の作動プログラム
JP5792401B2 (ja) オートフォーカス装置
JP2014175965A (ja) 手術用カメラ
JP4668581B2 (ja) 手術用顕微鏡
JP2013043007A (ja) 焦点位置制御装置、内視鏡装置及び焦点位置制御方法
JP2003334160A (ja) 立体視内視鏡システム
JP7207296B2 (ja) 撮像装置とフォーカス制御方法およびフォーカス判定方法
WO2020095365A1 (fr) Dispositif d'imagerie, dispositif endoscope, et procédé de fonctionnement de dispositif d'imagerie
JP6489876B2 (ja) 撮像装置及びその制御方法
JP2015152830A (ja) 撮像装置及びその制御方法、プログラム、記憶媒体
JP6329400B2 (ja) 画像処理装置、画像処理方法およびプログラム、並びに撮像装置
EP4205691A1 (fr) Dispositif de traitement d'image médicale et système d'observation médicale
JP6882016B2 (ja) 撮像装置、撮像システム、撮像装置の制御方法、および、プログラム
JP6900228B2 (ja) 撮像装置、撮像システム、撮像装置の制御方法、および、プログラム

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2014559024

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14817908

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14817908

Country of ref document: EP

Kind code of ref document: A1