WO2013164208A1 - Method to be carried out when operating a microscope and microscope - Google Patents
Method to be carried out when operating a microscope and microscope
- Publication number
- WO2013164208A1 (PCT/EP2013/058179)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- representation
- microscope
- manipulation
- representative
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/002—Scanning microscopes
- G02B21/0024—Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
- G02B21/008—Details of detection or image processing, including general computer control
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the invention relates to a method to be carried out when operating a microscope and/or for displaying an object recorded with a microscope, or parts of an object recorded with a microscope.
- the invention also relates to a microscope, in particular a scanning microscope and/or laser scanning microscope and/or confocal scanning microscope, for carrying out the method according to the invention, and to a microscope, in particular a scanning microscope and/or laser scanning microscope and/or confocal scanning microscope, with a display and means for presenting an object recorded with a microscope, or parts of such an object, on the display.
- in order to be able to display an object on a display, image data of the object must first be generated.
- image data can contain, for example, for each scanned object point information about the power of detection light emanating from this object point and / or about the wavelength of the detection light and information about the position of the respective object point, for example in the form of coordinate information.
- a three-dimensional scanning of an object is possible, for example, with a confocal scanning microscope.
- in confocal scanning microscopy, a sample is illuminated with a light beam in order to observe the detection light emitted by the sample as reflection or fluorescence light.
- the focus of an illumination light beam is moved in a sample plane by means of a controllable beam deflection device, generally by tilting two mirrors; the deflection axes are usually perpendicular to each other, so that one mirror deflects in the x-direction and the other in the y-direction.
- the tilting of the mirror is accomplished, for example, with the help of galvanometer actuators.
- the power of the detection light coming from the object is measured as a function of the position of the scanning beam.
- the control elements are equipped with sensors for determining the current mirror position.
- a confocal scanning microscope generally comprises a light source, focusing optics with which the light of the source is focused onto a pinhole (the so-called excitation pinhole), a beam splitter, a beam deflection device for beam control, microscope optics, a detection pinhole and detectors for detecting the detection or fluorescence light.
- the illumination light is coupled in via a beam splitter.
- the fluorescence or reflection light coming from the object passes back to the beam splitter via the beam deflector, passes through it, and is subsequently focused onto the detection aperture behind which the detectors are located.
- This detection arrangement is called Descan arrangement.
- Detection light that does not come directly from the focus region takes a different light path and does not pass through the detection aperture, so that one obtains point information that results in a three-dimensional image by sequentially scanning the object with the focus of the illumination light beam.
- a three-dimensional image is achieved by layerwise image data acquisition.
- commercial scanning microscopes usually consist of a scan module flanged to the stand of a classical light microscope, the scan module containing all of the additional elements mentioned above that are necessary for scanning a sample.
- in multiphoton microscopy, a detection pinhole can be dispensed with, since the excitation probability depends on the square of the photon density and thus on the square of the illumination light intensity, which is naturally much higher in the focus than in the neighboring regions. The fluorescence light to be detected therefore very probably originates for the most part from the focus region, which makes a further discrimination, by means of a pinhole arrangement, of fluorescence photons from the focus region against fluorescence photons from neighboring regions superfluous.
- a three-dimensional image or the generation of image data that allow a three-dimensional representation of the object can also be done with other types of microscopes.
- SPIM (selective plane illumination microscopy) technology may be mentioned in this respect, in which an object is illuminated from different directions with a light sheet.
- a researcher wants to present the object in such a way that the aspects important to him are clearly visible.
- the user of a microscope, or the researcher who evaluates the image data obtained with a microscope, therefore has the need to influence the type of representation, such as the size of the representation of the object or the viewing direction.
- the object is achieved by a method characterized by the following steps: a. representing a representative of the object on the display or on another display; b. performing at least one manipulation on the representative and/or on the representation of the representative by means of an input means; c. deriving at least one representation parameter for the representation of the object or of the part of the object from the manipulation, and/or deriving at least one microscope control parameter from the manipulation; d. representing the object or the part of the object taking into account the derived representation parameter and/or the derived microscope control parameter.
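In code, the chain of steps b to d can be pictured roughly as follows. This is a minimal sketch: the class names, parameter names, and the concrete mapping from manipulations to parameters are illustrative assumptions, not anything specified by the patent.

```python
# Hypothetical sketch of steps b-d; names and mapping are assumptions.
from dataclasses import dataclass, field

@dataclass
class Manipulation:                # step b: what the user did to the representative
    kind: str                      # e.g. "rotate" or "cut_plane"
    payload: dict = field(default_factory=dict)

def derive_parameters(m: Manipulation) -> dict:
    """Step c: derive representation and/or microscope control parameters."""
    if m.kind == "rotate":
        # a changed viewing direction affects only the rendering
        return {"representation": {"view_rotation": m.payload["angles"]}}
    if m.kind == "cut_plane":
        # a new cutting plane may affect both rendering and acquisition
        return {"representation": {"clip_planes": m.payload["planes"]},
                "microscope": {"scan_planes": m.payload["planes"]}}
    raise ValueError(f"unknown manipulation: {m.kind!r}")

# steps b -> c; step d would render the object using these parameters
print(derive_parameters(Manipulation("rotate", {"angles": (0.0, 30.0, 0.0)})))
```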
- the further object is achieved by a microscope characterized by: a. means for displaying a representative of the object on the display or on another display; b. means for performing at least one manipulation on the representative and/or on the representation of the representative by means of an input means; c. means for deriving at least one representation parameter for the representation of the object or of the part of the object from the manipulation, and/or for deriving at least one microscope control parameter from the manipulation; d. means for displaying the object or the part of the object taking into account the derived representation parameter and/or the derived microscope control parameter.
- the invention has the advantage that the user can quickly and efficiently tell the executing system, for example a microscope with a computer-controlled display, which type of representation is desired and under which boundary conditions.
- the representative can in particular be a further object that is easy and quick for the user to comprehend, such as a sphere.
- the actually displayed, for example biological, object may have a much more complicated structure, in particular form.
- this reduces the risk that the user loses orientation due to the large amount of detailed information in the displayed object and has to think long about which viewing direction is currently selected or which subregion is currently displayed or faded in.
- the representative in its three-dimensional form corresponds to an abstraction of the real object, for example a mouse or a brain.
- the invention has the further advantage that the representative has a consistent quality, in particular quality of presentation, and thus also permits a manipulation of qualitatively poor real data (for example strongly noisy objects or objects with little contrast) in a consistent manner.
- the invention has the advantage that, when communicating his wishes regarding the presentation, the user is largely not burdened with information that he does not need for this process.
- the invention also has the very special advantage that the desired representation can be achieved even if the available image data do not permit it, because according to a particular embodiment of the method according to the invention the process of image data acquisition can be directly influenced in order to achieve the desired representation.
- a manipulation carried out on the representative and/or its representation effects, in particular automatically, a corresponding modification of the representation of the object.
- the change in the representation of the object is based on a change in at least one representation parameter and / or is based on a change in at least one microscope control parameter.
- this gives the user the advantage that he can enter the manipulation, for example with the help of a computer mouse, directly in relation to the representative or the representation of the representative, and that a manipulation of the representation of the object then takes place and/or at least one microscope control parameter is changed as a result of the manipulation in order, for example, to effect a special representation of the object or of subregions of the object.
- for example, the user touches the representative with a displayed mouse pointer, fixes a specific position of the mouse pointer relative to the representative by clicking a mouse button, and then rotates and/or shifts the representative by moving the mouse.
- it can be provided that the representation of the object is simultaneously rotated and/or shifted in the same way.
- it can be provided that the representative is displayed two-dimensionally while the object is displayed three-dimensionally with a stereoscopic or holographic display.
- the presentation of the representative can, for example, take place on a separate 2D display while the object is displayed three-dimensionally with the stereoscopic or holographic display.
- it is also possible for the representative to be displayed, but two-dimensionally, on the stereoscopic or holographic display on which the object is also displayed three-dimensionally.
- the use of at least one 2D representative in representations on 3D stereo monitors may be helpful if the representative is shown as a 3D rendering on a 2D screen while the actual 3D object has its representation on the stereo monitor.
- the aforementioned embodiments have the very special advantage that the effect of the mouse pointer "hovering" over the stereoscopic or holographic representative is avoided, and thus better and simpler control, in particular 3D control, is made possible.
- the user can mark a subregion of the representative, for example by specifying cutting planes; an area defined by the planes can then be removed from the representative or from the representation of the representative, for example by double-clicking or single-clicking a mouse button while the mouse pointer is in this area, whereby the corresponding area of the displayed object is also removed, so that the user gets a clear view into the interior of the object or onto the resulting cut surfaces.
- for offline operation, object image data are first generated with a microscope; in the offline operation, only display parameters for the representation of the object or of the part of the object are then derived from the manipulation.
- corresponding microscope control parameters are stored in a microscope for a multiplicity of possible display modes, which are retrieved as needed, for example by a control device, and used.
- the user does not even need to know which microscope settings he must make in order to be able to record suitable image data for a particular display.
- the representative is a virtual object.
- the representative can be another multi-dimensional, in particular three-dimensional, object and / or be part of another multi-dimensional, in particular three-dimensional, object.
- the representative is a geometric body, for example a sphere, a cube or a cylinder, and/or is part of a geometric body.
- a basic geometric figure has the particular advantage that the user can easily keep track because he usually knows geometric basic figures from his wealth of experience.
- a basic figure adapted to the object as a representative, e.g. the stylized figure of a mouse or a brain, can help to standardize microscopic representations and to repeat them in a uniform manner.
- another embodiment may be the simultaneous representation of the representative within the 3D image, allowing the user to orient better by means of the abstract primitive.
- the representative has a coordinate system that is displayed to the user.
- the representative has a coordinate system which is displayed to the user in the form of coordinate axes and/or by displaying characteristic planes or areas and/or by displaying the principal planes.
- Such an embodiment makes it easier for the user, for example, to assess which viewing direction is currently selected and / or which quadrants are currently displayed or hidden, for example.
- the representative or the surface of the representative can be displayed transparently or semi-transparently, or parts of the representative can be displayed transparently or semi-transparently.
- the surface of the representative can have a texture and/or characters and/or a pattern and/or a color, and each of these properties, as well as the transparency, can also vary in time or depend on state.
- a manipulation may include, for example, a rotation and / or a cutting of a partial area.
- the manipulation can include cutting away portions located in a quadrant of a coordinate system of the representative and/or the addition or modification of one or more cutting planes.
- the manipulation can involve the cutting away of a part of the representative or the cutting out of a part of the depiction of the representative.
- the manipulation can include marking a part of the representative.
- a chronological sequence of individual representations which can be, for example, two-dimensional or three-dimensional, takes place.
- the representations may be, for example, a sequence of sectional images of adjacent object planes.
- the manipulation can involve the addition or modification of a start marker, in particular a start plane, and an end marker, in particular an end plane, for example for a temporally successive representation of adjacent cutting planes.
- the areas that can be selected by manipulation can also be curved surfaces.
- the representative and/or the representation of the representative can be adapted in size, color and contrast to the perception possibilities of the user, in particular when a 3D image of an object is, e.g., very noisy.
- this has the particular advantage that a clear cutting plane or cutting surface can nevertheless be defined.
- a temporal change of the cutting planes or cut surfaces defined with the help of the representative is reproduced as a temporally changing representation of the object.
- the representative or its representation can be adapted to the n different channels of a 3D image, in particular such that a manipulation of the representative or of its representation enables individual control of individual channels, simultaneously or separately in time.
- a cutting plane can have an adjustable start and end point over which a gradient, e.g. of transparency, can be set.
- the manipulation may include shifting the representation of the representative on the display or the other display.
- the presentation parameter may contain information about the outer boundary and / or the shape of a part of the object to be displayed.
- the presentation parameter can contain information about a time sequence of different representations and/or representations of different parts of the object.
- the presentation parameter may include information about the position of the representation of the object or a part of the object to be displayed.
- a volume rendering method is used for the representation of the object.
- a volume is divided into volume segments; the user selects volume segments, and a subset of the data is displayed as a function of this selection of the volume segments.
- different combinations of rendering methods are used in different volume segments depending on the choice of the user.
- the data of different image channels or combinations of image channels are displayed in different volume segments depending on the user's choice.
- the microscope control parameter includes at least one piece of information that is directly or indirectly relevant to the microscopic imaging and/or the generation of image data of the object.
- the microscope control parameter may, for example, include information regarding at least one object plane to be scanned. It can also be provided that the microscope control parameter contains information relating to at least one sequence of scanning planes of the object to be scanned and / or that the microscope control parameter contains information regarding the temporal position of a focus of an illumination light beam.
- the microscope control parameter contains information regarding the scan speed.
- the microscope control parameter contains information relating to a scan inaccuracy and / or a resolution.
- provision can be made for image data to be obtained, in particular automatically, outside a region of interest with a lower resolution than within this region of interest.
- the microscope control parameter contains information regarding an object manipulation.
- with a manipulation of the representative or of the representation of the representative, the user can also trigger a real manipulation of the object, for example photobleaching of an area or the application of a voltage with microelectrodes.
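The microscope control parameters listed above (scan planes, plane sequences, scan speed, region-dependent resolution, object manipulations such as photobleaching) can be pictured as one parameter record. The following sketch is purely illustrative; every field name is an assumption:

```python
# Illustrative grouping of the microscope control parameters named above;
# all field names are assumptions, not taken from the patent.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Plane = Tuple[float, float, float, float]       # (a, b, c, d) of a plane equation

@dataclass
class MicroscopeControlParameters:
    scan_planes: Optional[List[Plane]] = None   # object planes to be scanned
    plane_sequence: Optional[List[int]] = None  # temporal order of the scan planes
    scan_speed: Optional[float] = None          # e.g. lines per second
    roi_resolution: Optional[float] = None      # resolution inside the region of interest
    outside_resolution: Optional[float] = None  # coarser resolution outside the ROI
    bleach_region: Optional[Plane] = None       # region for a photobleaching manipulation

params = MicroscopeControlParameters(scan_speed=400.0, roi_resolution=0.2,
                                     outside_resolution=1.0)
```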
- the representation of the object or of the part of the object can take place on a stereoscopic display, it being possible for the user or users to wear spectacles which ensure that only the information intended for the respective eye falls into that eye.
- these can be shutter glasses or polarization glasses.
- a hologram can be generated from the generated image data and displayed to the user on the display.
- Such a design has the very special advantage that the user does not need to wear glasses.
- the representative may also be displayed on such a stereoscopic or holographic display. This may be the same display on which the object is also displayed; however, it is also possible that another display is used to represent the representative.
- the object and/or the representative can be displayed as a hologram and/or as a stereoscopic image and/or in perspective in a two-dimensional representation.
- the further display is the display of a portable computer and / or a laptop and / or a mobile phone and / or a tablet PC.
- the manipulation is input to and / or with one of the aforementioned devices.
- as input means, all devices are suitable with which it is possible to communicate, to a system for displaying the image information and/or to a microscope, information regarding the desired representation.
- the input means has a computer mouse or a touch-sensitive surface or a touch-sensitive screen or a motion sensor.
- the input means may be a wireless device, such as a laptop or a smartphone. Particularly advantageous is the use of devices that have their own display for representing the representative.
- with regard to the input means there are basically no restrictions. For example, it can be provided that the input means detects a touch or an approach capacitively or inductively. It can also be provided that the input device transmits information by radio, by sound transmission or by light transmission, for example to a system for displaying image information and/or to a microscope.
- volume rendering, in particular alpha blending, is performed.
- the method is carried out with a computer and / or a scanning microscope and / or a laser scanning microscope and / or a confocal scanning microscope.
- for execution with a computer or with a programmable control device of a microscope, in particular a confocal scanning microscope, a computer program product may advantageously serve that can be loaded directly into the memory of a digital computer and that includes software code sections with which the method according to the invention, in particular at least one of the specific embodiments described above, can be carried out when the computer program product runs on a computer or on the programmable control device of a microscope.
- a microscope in particular a scanning microscope and / or laser scanning microscope and / or a confocal scanning microscope can be advantageously used for carrying out the method.
- the microscope according to the invention is designed, in a particular embodiment, such that a manipulation of the representative and/or of its representation effects, in particular automatically by means of a control device, in particular a control computer, a corresponding change of the representation of the object.
- the change in the representation of the object is based on a change in at least one presentation parameter and / or is based on a change in at least one microscope control parameter.
- the method according to the invention and/or the microscope according to the invention, which can be embodied in particular as a scanning microscope and/or laser scanning microscope and/or confocal scanning microscope, can advantageously be used to display object data of an, in particular biological, object obtained with another or with the same microscope.
- Fig. 1 is a schematic illustration of the possible components of a computer system or of a microscope control device for carrying out an embodiment of the invention
- FIG. 2 shows a schematic illustration of sectional polygons for the calculation of the representation of the object to be displayed
- FIG. 3 shows a display which displays a representative and a representation of the object to the user
- FIG. 4 shows a further schematic illustration of the sectional polygon calculation
- FIG. 5 shows a subdivision into volume segments
- FIG. 10 shows an exemplary embodiment of a representative and a representation of the object that are displayed to the user
- Fig. 11 an embodiment of a microscope according to the invention.
- Figure 11 shows an embodiment of a microscope according to the invention, which is designed as a confocal scanning microscope.
- the microscope has a light source 1101.
- the illumination light beam 1102 generated by the light source 1101 strikes, after passing through the illumination pinhole 1103, a dichroic main beam splitter 1104, which deflects the illumination light beam to a beam deflection device 1105 that includes a gimbal-mounted scanning mirror.
- the beam deflection device 1105 guides the focus of the illumination light beam 1102 via a scanning optical system 1106 and a tube optical system 1107 and the objective 1108 over or through the object 1109.
- the detection light 1110 emanating from the object 1109 travels back along the same light path, namely through the objective 1108, the tube optics 1107 and the scanning optics 1106 and through the beam deflection device 1105, and, after passing the main beam splitter 1104 and the detection pinhole 1111, strikes the detection device 1112, which generates electrical signals proportional to the power of the detection light.
- the generated electrical detection signals are forwarded to a programmable control device 1113, which shows the user a representation 1115 of the object 1109 on a display 1114.
- as a means for displaying the representation 1116 of a representative of the object 1109, a portable computer 1117 is provided, which has a further display 1118 on which the representation 1116 of the representative is shown.
- the representative is shown as a sphere, which can be seen in particular in Figure 3.
- the portable computer 1117 may be formed, for example, as a smartphone.
- the display 1118 of the portable computer 1117 is designed as a touch-sensitive display and acts as an input means for performing at least one manipulation on the representative and/or on the representation of the representative.
- a mouse (not shown) may be connected to the portable computer 1117.
- the control device 1113 wirelessly receives signals with information regarding an input manipulation and derives from the manipulation at least one presentation parameter for the representation of the object or of the part of the object and/or at least one microscope control parameter.
- the control device 1113 may control the beam deflection device according to the manipulation to allow a desired display, in particular to generate image data specific to a desired display.
- among others, the display 1114 serves as a means for displaying the object or the part of the object taking into account the derived representation parameter and/or the derived microscope control parameter.
- FIG. 3a shows a representation 301 of a spherical representative and a representation 302 of an object.
- from the representative 301, a segment has been removed by manipulation. Accordingly, the electronic system implementing the method has removed the corresponding segment of the representation 302 of the object, so that the user can view the resulting cut surfaces and gain insight into the interior of the displayed object.
- the image data consist of intensity data as a function of location.
- these data are in the form of a regular grid for the spatial coordinates x, y and z, which is referred to below as the image stack.
- a volume element within this grid is called a voxel.
- the arrangement in a grid is advantageous, but not mandatory for the use of the method.
- at each grid point there may be multiple intensity values whose data come from different detectors or different device settings. They are referred to here as image channels.
- the data may also be in the form of a time-varying n-dimensional space grid.
- further dimensions in microscopy include, for example, besides x, y and z, also time.
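As a concrete picture of such an image stack, the following sketch uses a NumPy array; the axis order and sizes are assumptions, not prescribed by the document:

```python
import numpy as np

# One possible layout (an assumption): z-planes x height x width x channels,
# with time as an optional leading dimension.
z_planes, height, width, n_channels = 64, 256, 256, 2
stack = np.zeros((z_planes, height, width, n_channels), dtype=np.uint16)

voxel = stack[10, 100, 120]          # all channel intensities of one voxel
channel0 = stack[..., 0]             # one image channel of the whole stack
timelapse = np.stack([stack] * 5)    # time-varying grid: (t, z, y, x, channel)
```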
- Figure 1 shows those components of a computer system or of a preferably programmable microscope control device that are important for a typical embodiment of alpha blending: the input devices 101, a computer unit 102 with a central processing unit (CPU) 104 and memory 105, a graphics card 106 and a monitor 103.
- the graphics card includes a graphics processing unit (GPU) 107, a texture storage unit 111, and an image memory 112, the content of which is displayed on the monitor 103.
- alpha blending consists of several computational steps, some running on the CPU and others on the GPU.
- rasterization 109 and fragment shading 110 are involved in the calculations in the GPU 107.
- from the intensity data of the image stack, a texture with a color and transparency value for each voxel is generated. It is customary to assign to each image channel a color with red Ir, green Ig and blue Ib components, which can also be selected by the user.
- the texture thus created is written by the CPU 104 into the texture memory 111 of the graphics card 106.
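A minimal sketch of this texture-building step. It assumes a simple mapping in which each channel is tinted with its (Ir, Ig, Ib) color and the transparency follows the intensity; the patent does not fix the alpha mapping, so that part is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
stack = rng.integers(0, 4096, size=(16, 64, 64, 2), dtype=np.uint16)  # toy image stack

def make_rgba_texture(stack, channel_colors):
    """Tint each channel with its (Ir, Ig, Ib) color and derive the
    transparency from the intensity (the alpha mapping is an assumption)."""
    norm = stack.astype(np.float32) / float(stack.max())      # (z, y, x, c) in [0, 1]
    colors = np.asarray(channel_colors, dtype=np.float32)     # (c, 3)
    rgb = np.clip(np.einsum('zyxc,ck->zyxk', norm, colors), 0.0, 1.0)
    alpha = norm.max(axis=-1, keepdims=True)                  # opacity follows intensity
    return np.concatenate([rgb, alpha], axis=-1)              # (z, y, x, RGBA)

texture = make_rgba_texture(stack, channel_colors=[(1, 0, 0), (0, 1, 0)])
print(texture.shape)  # (16, 64, 64, 4)
```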
- triangle coordinates and texture coordinates are passed to the vertex shading unit 108, where the triangle coordinates are transformed into the observer coordinate system.
- the triangles are processed in the order of decreasing distance to the observer.
- the transformed triangle coordinates and texture coordinates are then passed to the primitive assembly and rasterization unit 109, where the pixel coordinates for all pixels of each triangle are compiled.
- the associated texture coordinates are interpolated for each pixel.
- the fragment shading unit 110 calculates interpolated intensities and, with the transparency value α, determines new color values R', G' and B' (for example R' = α·Ir + (1 − α)·R, and correspondingly for G' and B'), which are then placed in the frame buffer 112.
- the content of the frame buffer 112 is transferred to the output device 103.
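The compositing step can be illustrated as follows; the sketch assumes the standard back-to-front "over" blending formula given above:

```python
import numpy as np

def blend_over(frame_rgb, frag_rgb, alpha):
    """One blending step for a frame-buffer pixel:
    R' = alpha * Ir + (1 - alpha) * R, likewise for G and B."""
    return alpha * np.asarray(frag_rgb, float) + (1.0 - alpha) * np.asarray(frame_rgb, float)

# fragments composited back-to-front (decreasing distance to the observer)
pixel = np.zeros(3)                                   # frame buffer starts black
for frag_rgb, alpha in [((1.0, 0.0, 0.0), 0.3),       # far red slice
                        ((0.0, 1.0, 0.0), 0.5)]:      # near green slice
    pixel = blend_over(pixel, frag_rgb, alpha)
print(pixel)                                          # [0.15 0.5  0.  ]
```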
- section planes can be described in the form of a normalized plane equation a·x + b·y + c·z + d = 0 with a² + b² + c² = 1; points that fulfill this condition lie in the plane.
- the coefficients a, b, c and d determine the location and orientation of the plane.
- the distance D of any point (x, y, z) from the plane can be calculated as D = a·x + b·y + c·z + d.
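A short sketch of the normalized plane equation and the signed distance D:

```python
import numpy as np

def normalize_plane(a, b, c, d):
    """Scale the coefficients so that a^2 + b^2 + c^2 = 1."""
    return np.array([a, b, c, d], float) / np.linalg.norm([a, b, c])

def signed_distance(plane, point):
    """D = a*x + b*y + c*z + d; the sign indicates the half-volume."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

plane = normalize_plane(0.0, 0.0, 2.0, -4.0)     # the plane z = 2
print(signed_distance(plane, (1.0, 1.0, 5.0)))   # 3.0
```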
- in the described alpha blending, the triangles are cut before the rasterization 109 of the GPU 107.
- the result of this operation is 0 to 2 triangles, which are located in the half-volume that is not to be cut off.
- a suitable algorithm for this operation is Sutherland-Hodgman polygon clipping, which will be discussed later. For multiple cutting planes, only the parts that are not cut away by any of the planes are shown.
- Figure 3a shows a representation of the representative 301 and a representation 302 of the object in one embodiment of the method for 2 cutting planes 303.
- the representation of the representative 301 is a computer graphic and has the same orientation as the representation 302 of the object. This equal orientation allows a quick, targeted choice of the volume segments to be cut away.
- Figure 3b shows a representation of the representative 301 for the case of 3 orthogonal cutting planes.
- the position and orientation of the cutting planes are visualized (304). A mouse click on a sphere segment toggles the visibility of the data in the corresponding volume segment.
- the cutting planes are managed in a computer program in the form of the coefficients of a plane equation (4).
- the display is in the form of boundary lines of the respective plane with the boundary surfaces of the volume, directly in the projected image; these lines can be generated with the algorithm that is also used in the generation of the polygons for the above-described alpha blending method.
- Figure 4a shows section planes 402, 403 and 404.
- the position and orientation of the cutting planes can be changed by the user.
- a shift is preferably made by dragging with an input device in the projected image.
- an input device is preferably also used for this, with the aid of the representation of a special manipulator in the projected image.
- alternatively, other control elements such as sliders can be used.
- a number of cutting planes other than three can also be used. The cutting planes can be switched off by the user, and their starting position can be chosen differently.
- the section planes can also be displayed in other areas of the output device or on other output devices.
- for the numbering of the volume segments, the slice planes are numbered from 0 to N−1. Let Dn be the distance of a point (x, y, z) to slice plane n; the volume segment number s is then obtained by setting bit n of s whenever Dn is non-negative.
- Figure 5 shows such a subdivision with the corresponding numbers s of the volume segments in decimal notation.
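The bit-wise segment numbering can be written compactly; the sketch assumes planes given as coefficient quadruples (a, b, c, d):

```python
def segment_number(point, planes):
    """Bit n of the segment number s is set iff the distance Dn of the
    point to cutting plane n (given as (a, b, c, d)) is non-negative."""
    s = 0
    for n, (a, b, c, d) in enumerate(planes):
        if a * point[0] + b * point[1] + c * point[2] + d >= 0:
            s |= 1 << n
    return s

planes = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0)]  # 3 orthogonal planes -> 8 segments
print(segment_number((1, -2, 3), planes))            # bits 0 and 2 set -> s = 5
```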
- with a computer mouse, the user selects one or more volume segments by mouse click. This choice is stored by the computer program per volume segment, based on the volume segment number.
- the representative can also be displayed on another input or output device.
- another input device such as a touchpad can be used.
- for the representative, a slightly different coordinate transformation than for the projection of the volume can be used. It makes sense, for example, to use a parallel projection for the representative or its representation while a perspective projection is used for the volume. Even a slightly different direction of projection may be useful if this provides better orientation given the arrangement of the input and output devices.
- the plane equations of the section planes and the information about selected volume segments are used to re-project the volume.
- the method used for this is described using alpha blending. However, it can also be applied in a similar way to other volume rendering methods.
- to account for the specified slice planes, the polygons described above are already processed by the CPU before the transfer of the corresponding triangles to the vertex shading unit 108 of the GPU.
- the rasterization unit 109 is not used for the consideration of the cutting planes.
- in FIG. 4b, a cut polygon 401 and the cut lines 405, 406 and 407 of the 3 cutting planes 402, 403 and 404 with the polygon are shown.
- Figure 4c shows a decomposition of the polygon 401 into triangles. The goal is a decomposition of the individual triangles into sub-triangles that each lie in only one of the volume segments. In addition, it must be determined for each sub-triangle in which volume segment it lies. The sub-triangles can then be treated differently for processing on one or more GPUs according to user preferences.
- the triangle 408 is chosen as an example.
- this triangle is first broken down into three sub-triangles by the first cut line 405.
- on the triangle edge 409, two new vertices arise.
- the first plane equation is used to divide the triangle into up to 3 sub-triangles, none of which lies in both half-volumes simultaneously.
- a table index is determined by giving the vertices of the triangle a number k from 0 to 2 and setting one bit in the table index for each vertex whose distance from the plane is non-negative.
- Figure 6 shows a non-negative vertex as a black circle for all possible cases.
- the table contains the information on how many sub-triangles arise and which vertices k of the original triangle represent the vertices of the sub-triangles.
- the table also contains columns that indicate whether each new sub-triangle has a non-negative distance to the cutting plane. If that is the case, for the first cutting plane the lowest bit of the volume segment number of the sub-triangle is set. This information is carried along in the further steps with the other cutting planes: for the second cutting plane, the second bit is set, and so on.
- Figures 7a-c illustrate this algorithm again for the case of three cutting planes.
- the result is a list of sub-triangles numbered according to the number of the volume segment (5) in which they are located. Depending on the selection of the user, those triangles, with their associated texture coordinates, that lie in the segments for which the user has allowed the representation are then transferred to the vertex shading unit 108 of the GPU 107.
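The patent describes this decomposition with a precomputed lookup table; the sketch below achieves the same result by repeatedly clipping each triangle against both half-volumes of every plane while accumulating the segment bits. It is an equivalent formulation, not the table-driven method itself:

```python
import numpy as np

def clip_triangle(tri, plane, keep_positive=True):
    """Clip one triangle against one plane (Sutherland-Hodgman style) and
    fan-triangulate the kept part into 0-2 sub-triangles."""
    a, b, c, d = plane
    dist = [a * x + b * y + c * z + d for (x, y, z) in tri]
    keep = [(di >= 0) == keep_positive for di in dist]
    out = []
    for i in range(3):
        j = (i + 1) % 3
        if keep[i]:
            out.append(np.asarray(tri[i], float))
        if keep[i] != keep[j]:  # the edge crosses the plane: add the intersection
            t = dist[i] / (dist[i] - dist[j])
            out.append(np.asarray(tri[i], float) * (1 - t) + np.asarray(tri[j], float) * t)
    return [(out[0], out[k], out[k + 1]) for k in range(1, len(out) - 1)]

def split_by_planes(tri, planes):
    """Split a triangle by all cutting planes; each sub-triangle carries the
    volume segment number s, built up bit by bit (one bit per plane)."""
    parts = [(tri, 0)]
    for n, plane in enumerate(planes):
        parts = [(sub, s | (keep << n))
                 for (t, s) in parts
                 for keep in (1, 0)
                 for sub in clip_triangle(t, plane, bool(keep))]
    return parts  # list of (sub-triangle, segment number s)

tri = ((0, 0, 0), (4, 0, 0), (0, 4, 0))
parts = split_by_planes(tri, [(1.0, 0.0, 0.0, -1.0)])  # single plane x = 1
print([s for _, s in parts])  # [1, 0, 0]: one triangle with x >= 1, two with x < 1
```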
- the described triangulation method is only one of the possible embodiments for this step of the method according to the invention.
- other information in the table is also possible, e.g. a multiple …
- the Sutherland-Hodgman polygon clipping algorithm can also be used in modified form; the modification is necessary because the original form yields only the part of the polygon on one side of the plane.
- another embodiment uses the Sutherland-Hodgman polygon clipping algorithm twice, applying different signs of the coefficients in the plane equation, to produce the sub-triangles for the two half-volumes.
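Reusing clip_triangle from the sketch above, the "twice with different signs" variant looks like this:

```python
# Flipping the signs of all coefficients (a, b, c, d) and clipping again
# yields the other half-volume (uses clip_triangle defined earlier).
tri = ((0, 0, 0), (4, 0, 0), (0, 4, 0))
plane = (1.0, 0.0, 0.0, -1.0)                    # x = 1
front = clip_triangle(tri, plane)                # part with x >= 1
flipped = tuple(-c for c in plane)               # (-1, 0, 0, 1)
back = clip_triangle(tri, flipped)               # part with x <= 1
print(len(front), len(back))                     # 1 2
```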
- modern GPUs also feature a geometry shader unit with which a triangle decomposition can be executed on the GPU.
- the flexibility of the GPUs also allows execution of clipping algorithms in the fragment shader unit 110. For this, the x, y and z coordinates can be transferred to the fragment shader unit. There, the plane equations are then used to determine whether the fragment is visible.
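A CPU-side emulation of this per-fragment test might look as follows; the function name and the set-based selection are assumptions:

```python
# Emulates the per-fragment test: compute the segment number from the plane
# equations and keep the fragment only if its segment is selected.
def fragment_visible(xyz, planes, selected_segments):
    s = 0
    for n, (a, b, c, d) in enumerate(planes):
        if a * xyz[0] + b * xyz[1] + c * xyz[2] + d >= 0:
            s |= 1 << n
    return s in selected_segments

planes = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0)]
print(fragment_visible((1, -2, 3), planes, {0, 5}))  # True: segment s = 5 is enabled
```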
- a user can decide separately for each individual image channel whether the data should be displayed in selected segments.
- Figure 8 shows an arrangement of controls 801 and an image display 802. The selection is made via switches 803 and 804 per channel, but for all volume segments. In addition, a separate choice of rendering method per channel is possible.
- Figure 9 shows the representation of a representative in which a mouse click on a volume segment 901 applies the settings for this volume segment.
- the sphere has the same orientation as the represented projected image of the volume data. This selection can be used particularly advantageously, in accordance with FIG. 10, for the case of co-planar cutting planes.
- the volume segments 1001, 1002 and 1003 are separated by the co-planar section planes.
- different settings can then be defined with the aid of the switches 1004, without the user losing the overview.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Optics & Photonics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Microscopes, Condenser (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/398,513 US10261306B2 (en) | 2012-05-02 | 2013-04-19 | Method to be carried out when operating a microscope and microscope |
JP2015509364A JP6255008B2 (ja) | 2012-05-02 | 2013-04-19 | 顕微鏡の稼働時に実施する方法および顕微鏡 |
GB1421358.1A GB2517110B (en) | 2012-05-02 | 2013-04-19 | Method to be carried out when operating a microscope and microscope |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102012009257.1 | 2012-05-02 | ||
DE102012009257.1A DE102012009257B4 (de) | 2012-05-02 | 2012-05-02 | Verfahren zur Ausführung beim Betreiben eines Mikroskops und Mikroskop |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013164208A1 true WO2013164208A1 (de) | 2013-11-07 |
Family
ID=48184187
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2013/058179 WO2013164208A1 (de) | 2012-05-02 | 2013-04-19 | Verfahren zur ausführung beim betreiben eines mikroskops und mikroskop |
Country Status (5)
Country | Link |
---|---|
US (1) | US10261306B2 (de) |
JP (1) | JP6255008B2 (de) |
DE (1) | DE102012009257B4 (de) |
GB (1) | GB2517110B (de) |
WO (1) | WO2013164208A1 (de) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102013021542A1 (de) * | 2013-12-18 | 2015-06-18 | Carl Zeiss Microscopy Gmbh | Mikroskop und Verfahren zur SPIM Mikroskopie |
JP2015125177A (ja) | 2013-12-25 | 2015-07-06 | ソニー株式会社 | 顕微鏡システムおよびその画像データ転送方法 |
JP6171970B2 (ja) | 2014-02-10 | 2017-08-02 | ソニー株式会社 | レーザ走査型顕微鏡装置および制御方法 |
CN105426079B (zh) * | 2015-11-25 | 2018-11-23 | 小米科技有限责任公司 | 图片亮度的调整方法及装置 |
JP6766882B2 (ja) * | 2016-11-24 | 2020-10-14 | 株式会社ニコン | 画像処理装置、顕微鏡システム、画像処理方法、およびコンピュータプログラム |
JP7015144B2 (ja) * | 2017-10-25 | 2022-02-02 | オリンパス株式会社 | 画像処理装置および顕微鏡システム |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002052393A1 (en) * | 2000-12-22 | 2002-07-04 | The Government Of The United States Of America, As Represented By The Secretary, Department Of Health And Human Services | Positioning an item in three dimensions via a graphical representation |
EP1235049A2 (de) * | 2001-02-21 | 2002-08-28 | Leica Microsystems Heidelberg GmbH | Verfahren und Anordnung zur Abbildung und Vermessung mikroskopischer dreidimensionaler Strukturen |
WO2003036566A2 (de) * | 2001-10-22 | 2003-05-01 | Leica Microsystems Wetzlar Gmbh | Verfahren und vorrichtung zur erzeugung lichtmikroskopischer, dreidimensionaler bilder |
US20060173268A1 (en) * | 2005-01-28 | 2006-08-03 | General Electric Company | Methods and systems for controlling acquisition of images |
US20110141103A1 (en) * | 2009-12-11 | 2011-06-16 | Mds Analytical Technologies (Us) Inc. | Integrated Data Visualization for Multi-Dimensional Microscopy |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6272235B1 (en) * | 1997-03-03 | 2001-08-07 | Bacus Research Laboratories, Inc. | Method and apparatus for creating a virtual microscope slide |
JP3867143B2 (ja) * | 2003-06-25 | 2007-01-10 | 独立行政法人産業技術総合研究所 | 三次元顕微鏡システムおよび画像表示方法 |
JP2005326601A (ja) * | 2004-05-14 | 2005-11-24 | Olympus Corp | 顕微鏡装置、立体画像出力方法および立体画像出力プログラム |
JP4450797B2 (ja) * | 2006-01-05 | 2010-04-14 | ザイオソフト株式会社 | 画像処理方法および画像処理プログラム |
JP4906442B2 (ja) * | 2006-08-29 | 2012-03-28 | オリンパス株式会社 | 顕微鏡撮像システム、顕微鏡撮像方法、及び、記録媒体 |
WO2008132563A2 (en) * | 2006-10-31 | 2008-11-06 | Koninklijke Philips Electronics N.V. | Fast volume rendering |
JP4912862B2 (ja) * | 2006-12-27 | 2012-04-11 | オリンパス株式会社 | 顕微鏡 |
JP2008175896A (ja) * | 2007-01-16 | 2008-07-31 | Nikon Corp | 画像表示装置 |
US20090208143A1 (en) * | 2008-02-19 | 2009-08-20 | University Of Washington | Efficient automated urothelial imaging using an endoscope with tip bending |
JP5191333B2 (ja) * | 2008-09-26 | 2013-05-08 | オリンパス株式会社 | 顕微鏡システム、該プログラム、及び該方法 |
EP2339389B1 (de) | 2008-09-26 | 2014-07-30 | Olympus Corporation | Mikroskopsystem, Speichermedium mit Steuerungsprogramm und Steuerungsverfahren |
JP5185769B2 (ja) * | 2008-10-22 | 2013-04-17 | オリンパス株式会社 | 顕微鏡システム |
JP5096301B2 (ja) * | 2008-12-12 | 2012-12-12 | 株式会社キーエンス | 撮像装置 |
JP5154392B2 (ja) * | 2008-12-12 | 2013-02-27 | 株式会社キーエンス | 撮像装置 |
JP5479950B2 (ja) * | 2010-03-03 | 2014-04-23 | オリンパス株式会社 | 顕微鏡装置及び観察位置再現方法 |
EP2553513B1 (de) * | 2010-03-28 | 2020-10-28 | Ecole Polytechnique Federale de Lausanne (EPFL) | Refraktionstomografie mit komplexem index und erhöhter auflösung |
US20130249903A1 (en) * | 2010-10-13 | 2013-09-26 | Hitachi, Ltd. | Medical image display device, medical information management server |
US20130106888A1 (en) * | 2011-11-02 | 2013-05-02 | Microsoft Corporation | Interactively zooming content during a presentation |
US9258550B1 (en) * | 2012-04-08 | 2016-02-09 | Sr2 Group, Llc | System and method for adaptively conformed imaging of work pieces having disparate configuration |
2012
- 2012-05-02 DE DE102012009257.1A patent/DE102012009257B4/de active Active
2013
- 2013-04-19 WO PCT/EP2013/058179 patent/WO2013164208A1/de active Application Filing
- 2013-04-19 US US14/398,513 patent/US10261306B2/en active Active
- 2013-04-19 GB GB1421358.1A patent/GB2517110B/en active Active
- 2013-04-19 JP JP2015509364A patent/JP6255008B2/ja active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002052393A1 (en) * | 2000-12-22 | 2002-07-04 | The Government Of The United States Of America, As Represented By The Secretary, Department Of Health And Human Services | Positioning an item in three dimensions via a graphical representation |
EP1235049A2 (de) * | 2001-02-21 | 2002-08-28 | Leica Microsystems Heidelberg GmbH | Verfahren und Anordnung zur Abbildung und Vermessung mikroskopischer dreidimensionaler Strukturen |
WO2003036566A2 (de) * | 2001-10-22 | 2003-05-01 | Leica Microsystems Wetzlar Gmbh | Verfahren und vorrichtung zur erzeugung lichtmikroskopischer, dreidimensionaler bilder |
US20060173268A1 (en) * | 2005-01-28 | 2006-08-03 | General Electric Company | Methods and systems for controlling acquisition of images |
US20110141103A1 (en) * | 2009-12-11 | 2011-06-16 | Mds Analytical Technologies (Us) Inc. | Integrated Data Visualization for Multi-Dimensional Microscopy |
Non-Patent Citations (1)
Title |
---|
"Living Image Software - User's Manual Version 3.2", 1 January 2009 (2009-01-01), XP055072971, Retrieved from the Internet <URL:http://www.perkinelmer.com/Content/LST_Software_Downloads/Living Image 32 User Manual_PN125112-8234.pdf> [retrieved on 20130725] * |
Also Published As
Publication number | Publication date |
---|---|
DE102012009257A1 (de) | 2013-11-07 |
GB2517110B (en) | 2020-08-05 |
JP6255008B2 (ja) | 2017-12-27 |
US10261306B2 (en) | 2019-04-16 |
JP2015518582A (ja) | 2015-07-02 |
GB2517110A (en) | 2015-02-11 |
US20150143274A1 (en) | 2015-05-21 |
DE102012009257B4 (de) | 2023-10-05 |
GB201421358D0 (en) | 2015-01-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102012009257B4 (de) | Verfahren zur Ausführung beim Betreiben eines Mikroskops und Mikroskop | |
DE102012110508B4 (de) | Roboter Einstellvorrichtung mit 3-D Display | |
DE3856127T2 (de) | Anzeigeverfahren und -vorrichtung für dreidimensionale Computergrafiken | |
WO2003036566A2 (de) | Verfahren und vorrichtung zur erzeugung lichtmikroskopischer, dreidimensionaler bilder | |
EP2156410A1 (de) | Verfahren zum darstellen von bildobjekten in einem virtuellen dreidimensionalen bildraum | |
WO2006092251A1 (de) | Verfahren und vorrichtung zur bestimmung von optischen überdeckungen mit ar-objekten | |
DE19955690A1 (de) | System und Verfahren zur auf einer Volumendarstellung basierenden Segmentation | |
JP6766882B2 (ja) | 画像処理装置、顕微鏡システム、画像処理方法、およびコンピュータプログラム | |
WO2009062492A2 (de) | Verfahren zum darstellen von bildobjekten in einem virtuellen dreidimensionalen bildraum | |
EP1235049A2 (de) | Verfahren und Anordnung zur Abbildung und Vermessung mikroskopischer dreidimensionaler Strukturen | |
WO2018185201A2 (de) | Mikroskopanordnung zur aufnahme und darstellung dreidimensionaler bilder einer probe | |
DE102012106890A1 (de) | Dreidimensionale Darstellung von Objekten | |
EP2528042A1 (de) | Verfahren und Vorrichtung zum Re-Meshing von 3D-Polygonmodellen | |
EP2457219B1 (de) | Verfahren und anordnung zur generierung von darstellungen anisotroper eigenschaften sowie ein entsprechendes computerprogramm und ein entsprechendes computerlesbares speichermedium | |
DE102019217878A1 (de) | VERGRÖßERNDE BEOBACHTUNGSVORRICHTUNG | |
Schulze-Döbold et al. | Volume rendering in a virtual environment | |
DE102019005635A1 (de) | Intuitives Bearbeiten dreidimensionaler Modelle | |
DE10336492A1 (de) | Vorrichtung zur Bearbeitung eines zweidimensionalen Bildes in einem dreidimensionalen Raum | |
DE10315592A1 (de) | Verfahren zum Ausführen von Interaktionen an sich räumlich und zeitlich verändernden mikroskopischen Objekten und System hierzu | |
EP2864830A1 (de) | Mikroskop | |
DE102005050350A1 (de) | System und Verfahren zur Überwachung einer technischen Anlage sowie Datenbrille auf Basis eines solchen Systems | |
DE102012108249A1 (de) | Verfahren und Vorrichtung zur Verbesserung der Wiedergabe stereoskopischer Bilder | |
EP2043005B1 (de) | Verfahren und Vorrichtung zum Messen und Simulieren von Abläufen in einem Mikroskopsystem | |
Bryden et al. | Interactive visualization of APT data at full fidelity | |
DE102015100680B4 (de) | Verfahren und Vorrichtungen zur Umgebungsdarstellung |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13718563 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015509364 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 14398513 Country of ref document: US |
ENP | Entry into the national phase |
Ref document number: 1421358 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20130419 |
WWE | Wipo information: entry into national phase |
Ref document number: 1421358.1 Country of ref document: GB |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13718563 Country of ref document: EP Kind code of ref document: A1 |