WO2022194965A1 - Microscope system and corresponding system, method and computer program - Google Patents


Info

Publication number
WO2022194965A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
activated
visual control
microscope
overlay
Prior art date
Application number
PCT/EP2022/056886
Other languages
French (fr)
Inventor
Peter TEN HAVE
Siddharth Vikal
Edward DUNLAP
Alexander Wiethoff
Bastian RENNER
Svenja DITTRICH
Veronika THALHAMMER
Original Assignee
Leica Instruments (Singapore) Pte. Ltd.
Leica Microsystems Cms Gmbh
Application filed by Leica Instruments (Singapore) Pte. Ltd., Leica Microsystems Cms Gmbh filed Critical Leica Instruments (Singapore) Pte. Ltd.
Priority to EP22715612.2A priority Critical patent/EP4308990A1/en
Publication of WO2022194965A1 publication Critical patent/WO2022194965A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/368Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements details of associated display arrangements, e.g. mounting of LCD monitor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Examples relate to a microscope system, such as a surgical microscope system, and to a corresponding system, method and computer program for a microscope system.
  • Surgical microscope systems are complex devices that provide a large number of functionalities. These functionalities are often accessible via haptic buttons, such as buttons that are located at handles of the surgical microscope system, or buttons that are arranged on a foot pedal of the surgical microscope system.
  • access to the functionality may be provided visually, e.g., via a display and a corresponding input device.
  • a display and a corresponding input device may be used to provide access to the functionality.
  • a user interface being used to control the microscope system may be designed in a manner that focuses on usability and reachability of the elements of the user interface to support the surgeon and assistants during surgery.
  • a user interface is presented that uses a button to show or hide the remaining control elements of the user interface, which provides easy access to the remaining control elements without overly obstructing the view on the sample while being collapsed.
  • a two- or three-tiered arc-shaped user interface is introduced that is designed to take into account the shape and reach of the hand, while providing a layout that is intuitive and that can be controlled safely in stressful situations.
  • the system comprises one or more processors and one or more storage devices.
  • the system is configured to obtain image data from an optical imaging sensor of a microscope of the microscope system.
  • the system is configured to obtain a sensor signal from a touch interface of a touch screen of the microscope system.
  • the sensor signal represents a touch input obtained via the touch interface.
  • the system is configured to generate a display signal for a display of the touch screen of the microscope system.
  • the display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data.
  • the visual control overlay is controlled via the touch input obtained via the touch interface.
  • the visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation.
  • the visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.
  • the visual control overlay comprises one or more second touch-activated buttons that are arranged adjacent to the first touch-activated button.
  • the one or more second touch-activated buttons may be fanned out around the first touch-activated button.
  • the one or more second touch-activated buttons may be assigned different functionalities of the microscope system.
  • the visual control overlay is context-dependent, i.e., the functionality being accessible via the visual control overlay may change depending on the context it is being used in.
  • the microscope system may be suitable for performing imaging in two or more different imaging modes (e.g., in a reflectance imaging mode and in a fluorescence imaging mode).
  • the system may be configured to generate the visual control overlay such that a functionality associated with the one or more second touch-activated buttons is dependent on an imaging mode being performed by the microscope system.
  • the functionality most relevant to the respective imaging mode may be accessible via the visual control overlay.
  • the visual control overlay may follow a design paradigm in which the first touch-activated button is used to show or hide the remaining control elements (i.e., to collapse or expand the visual control overlay), the one or more second touch-activated buttons are used to select the functionality being controlled, and the third touch-controlled control element is used to manipulate a value of the functionality being selected.
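The three-tier design paradigm can be sketched as a small state model. The following Python sketch is purely illustrative; the class and method names are assumptions and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RotaryMenu:
    """Illustrative model of the three-tier visual control overlay."""
    functions: tuple                # functionalities behind the second-tier buttons
    expanded: bool = False          # toggled by the first touch-activated button
    selected: Optional[str] = None  # functionality chosen via a second-tier button
    values: dict = field(default_factory=dict)

    def toggle(self) -> None:
        # first tier: show or hide the remaining control elements
        self.expanded = not self.expanded
        if not self.expanded:
            self.selected = None    # collapsing also hides the third tier

    def select(self, name: str) -> None:
        # second tier: pick the functionality to be controlled
        if self.expanded and name in self.functions:
            self.selected = name

    def adjust(self, delta: float) -> None:
        # third tier: manipulate the value of the selected functionality
        if self.selected is not None:
            self.values[self.selected] = self.values.get(self.selected, 0.0) + delta

menu = RotaryMenu(functions=("zoom", "focus", "brightness"))
menu.toggle()        # expand the overlay
menu.select("zoom")  # third-tier element now controls zoom
menu.adjust(0.5)
```

A collapsed menu ignores `select` and `adjust`, mirroring the behavior that the further control elements are hidden until the first button is actuated.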
  • the one or more second touch-activated buttons may each be associated with a functionality, which is selected upon actuation of the respective button of the one or more second touch-activated buttons.
  • the system may be configured to generate the visual control overlay such that each of the one or more second touch-activated buttons is associated with a single functionality of the microscope system.
  • the one or more second touch-activated buttons may cover a wide range of functionalities.
  • one of the one or more second touch-activated buttons may be associated with an optics-related functionality of the microscope system (such as zooming or focusing).
  • one of the one or more second touch-activated buttons may be associated with an illumination-related functionality of the microscope system (e.g., brightness control, or switching between white light and fluorescence excitation illumination).
  • one of the one or more second touch-activated buttons may be associated with an imaging-mode-related functionality of the microscope system (e.g., to switch between fluorescence imaging and reflectance imaging, or to select a fluorescence imaging mode).
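Since the functionality assigned to the second-tier buttons may depend on the imaging mode, the context-dependent assignment can be sketched as a simple lookup. The mode and function names below are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative mapping from imaging mode to the functionalities that the
# one or more second touch-activated buttons expose in that mode.
MODE_FUNCTIONS = {
    "reflectance":  ("zoom", "focus", "brightness"),
    "fluorescence": ("zoom", "focus", "excitation_illumination"),
}

def buttons_for_mode(mode: str) -> tuple:
    """Return one functionality per second-tier button for the current mode."""
    return MODE_FUNCTIONS[mode]
```

Switching the imaging mode then simply re-labels the fanned-out buttons, so the functionality most relevant to the mode stays within reach.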
  • the visual control overlay comprises, at least if one of the one or more second touch-activated buttons is actuated via the touch interface, a third touch-controlled control element.
  • the third touch-controlled control element may be arc-shaped.
  • the one or more second touch-activated buttons may be arranged between the third touch-controlled control element and the first touch-activated button.
  • such a three-tiered and arc-shaped visual control overlay is designed to take into account the shape and reach of the hand, while providing a layout that is intuitive and that can be controlled safely in stressful situations.
  • the visual control overlay has a substantially arc-shaped layout, which may provide a high degree of usability, e.g., as the other control elements are within reach when the user places a thumb on the first touch-controlled button.
  • the one or more second touch-activated buttons and the third touch-controlled control element may be spread-out radially around the first touch-activated button.
  • the one or more second touch-activated buttons and the third touch-controlled control element may be contained within a circular sector of a circle being spanned around the first touch-activated button, the circular sector having an angle of less than 180 degrees (e.g., at most 160 degrees).
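The fanned-out arrangement within a circular sector can be sketched geometrically: button centers are spaced evenly on an arc around the first button. The helper below is a sketch under the assumption of screen coordinates (y grows downward) with the sector opening upward; the 156-degree default follows the example angle mentioned later in the text, and the function name is an assumption.

```python
import math

def fan_out_positions(cx: float, cy: float, n: int,
                      radius: float, sector_deg: float = 156.0):
    """Evenly space n button centers on an arc of sector_deg degrees,
    centered directly above the anchor (cx, cy); screen y grows downward."""
    half = math.radians(sector_deg) / 2.0
    positions = []
    for i in range(n):
        # angular offset from "straight up" (-90 degrees in screen coordinates)
        t = 0.0 if n == 1 else -half + (i / (n - 1)) * 2.0 * half
        angle = -math.pi / 2.0 + t
        positions.append((cx + radius * math.cos(angle),
                          cy + radius * math.sin(angle)))
    return positions

# four second-tier buttons fanned out around a first button at (400, 500)
pts = fan_out_positions(400, 500, 4, radius=120)
```

All positions end up equidistant from the anchor and above it, matching the "same distance to the first touch-activated button" property described for the fanned-out layout.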
  • the third touch-controlled control element may be used to alter a value of the function being selected.
  • the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises at least one of a button (e.g., two buttons for increasing or decreasing a value, or two or more buttons for selecting among a set of pre-defined values), a toggle (e.g., for selecting among two pre-defined values, such as on and off) and a slider control (e.g., to control a numerical value).
  • the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises two buttons (e.g., for selecting among two values, or for increasing or decreasing a numerical value), a slider control (e.g., for manipulating a numerical value), or two buttons and a slider control (e.g., with the buttons being used for increasing or decreasing a numerical value, and the slider control being used to directly manipulate the numerical value).
  • a textual representation of the numerical value may also be provided to enable a precise setting of the numerical value.
  • the visual control overlay may also include a numerical representation of the slider control.
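One way to couple an arc-shaped slider with its numerical readout is to map the (clamped) value to a handle angle on the arc. The sketch below is an assumption about how such a mapping could look; neither the function name nor the 156-degree arc span is prescribed by the text.

```python
def slider_state(value: float, lo: float, hi: float, arc_deg: float = 156.0):
    """Clamp value to [lo, hi] and map it to a handle angle on the arc,
    together with the textual representation shown next to the slider."""
    value = max(lo, min(hi, value))
    frac = (value - lo) / (hi - lo)
    return {"angle_deg": frac * arc_deg, "label": f"{value:g}"}
```

Clamping keeps the handle on the arc even for out-of-range input, and the label provides the precise numerical setting mentioned above.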
  • the system may be configured to toggle between showing and hiding the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons.
  • the third touch-controlled control element may be used to alter a value of the function being selected.
  • the one or more second touch-activated buttons may be used to select a functionality.
  • the third touch-controlled control element may be used to actually effect a change in the microscope system.
  • the third touch-controlled control element may be suitable for controlling a functionality of the microscope system, the functionality of the microscope system being coupled to a functionality that is associated with the second touch-activated button being actuated.
  • the visual control overlay is controlled via the touch-screen display.
  • the touch targets of the touch-screen may be adjusted, in particular with respect to the third touch-controlled control element.
  • the system may be configured to control the visual control overlay via the touch input obtained via the touch interface. Touch input that is registered radially outward of the third touch-controlled control element may be used to control the third touch-controlled control element (thereby increasing the size of the touch target).
  • the system is configured to generate the display signal such that the first touch-activated button is constantly shown overlaid over the representation of the image data.
  • as the first touch-activated button is constantly shown, it can be used to expand or collapse the visual control overlay at any time.
  • the visual control overlay may be designed such that it keeps out of the way even when it is shown in its entirety, e.g., so the user is at any time able to see the live view on the sample.
  • the system may be configured to generate the visual control overlay such that the visual control overlay is constrained within a bottom half of the display. Consequently, the view on the sample may be unobstructed at any time in the top half of the display.
  • the system may be configured to generate the visual control overlay such that the first touch-activated button covers at most 5% of an overall surface of the display.
  • the first touch-activated button may be visible, and thus accessible, at any time, while obstructing only a small part of the view on the sample.
  • the system may be configured to generate the visual control overlay such that the visual control overlay covers at most 30% of an overall surface of the display. Thus, the majority of the view on the sample may remain unobstructed.
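The coverage limits in this and the preceding bullets (first button at most 5 %, overall overlay at most 30 % of the display surface) amount to simple area arithmetic; the helper below is a sketch, not part of the disclosure.

```python
def coverage_ok(button_area: float, overlay_area: float,
                display_w: int, display_h: int,
                button_limit: float = 0.05, overlay_limit: float = 0.30) -> bool:
    """True if the first button covers at most 5 % and the whole overlay
    at most 30 % of the display surface."""
    display_area = display_w * display_h
    return (button_area <= button_limit * display_area
            and overlay_area <= overlay_limit * display_area)
```

For a 1920x1080 display this allows the first button up to 103,680 px² and the whole overlay up to 622,080 px².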
  • the visual control overlay itself may change based on the input of the user, e.g., to indicate to the user the functionality presently being controlled.
  • touch-activated control elements of the visual control overlay, such as the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element, may have an active state and a passive state.
  • the system may be configured to generate the visual control overlay such that the touch-activated control elements are shown in different colors depending on their state (e.g., with “highlighted” colors if active).
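The state-dependent coloring can be sketched as a minimal lookup; the concrete color values below are assumptions, since the text only requires that active elements be visually highlighted.

```python
# Assumed colors; the disclosure only requires distinct active/passive styling.
STATE_COLORS = {"active": "#ffd200", "passive": "#9e9e9e"}

def element_color(is_active: bool) -> str:
    """Color used to render a touch-activated control element."""
    return STATE_COLORS["active" if is_active else "passive"]
```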
  • a surgical microscope system comprising the system introduced above, the microscope and the display.
  • the touch-screen may be arranged at a base unit of the surgical microscope system.
  • the touch-screen may be arranged at the microscope of the surgical microscope system.
  • Such a system may be particularly beneficial in a surgical setting, as it is designed to provide an intuitive control of the (surgical) microscope system in stressful situations.
  • the method comprises obtaining image data from an optical imaging sensor of a microscope of the microscope system.
  • the method comprises obtaining a sensor signal from a touch interface of a touch screen of the microscope system.
  • the sensor signal represents a touch input obtained via the touch interface.
  • the method comprises generating a display signal for a display of the touch screen of the microscope system.
  • the display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data.
  • the visual control overlay is controlled via the touch input obtained via the touch interface.
  • the visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation.
  • the visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.
  • Fig. 1a shows a block diagram of an example of a system for a microscope system
  • Fig. 1b shows a schematic diagram of an example of a surgical microscope system
  • Fig. 2 shows a flow chart of an example of a method for a microscope system
  • Figs. 3a to 3e show various states of an exemplary rotary menu
  • Fig. 3f shows two different states of a “rotary menu” button
  • Fig. 4 shows an exemplary placement of a rotary menu on top of a live view
  • Fig. 5 shows an exemplary transition between different states of the rotary menu
  • Fig. 6 shows a schematic diagram of a system comprising a microscope and a computer system.
  • Fig. 1a shows a block diagram of an example of a system 110 for a microscope system 100.
  • the system 110 comprises one or more processors 114 and one or more storage devices 116.
  • the system further comprises one or more interfaces 112.
  • the one or more processors 114 are coupled to the one or more storage devices 116 and to the optional one or more interfaces 112.
  • the functionality of the system is provided by the one or more processors, in conjunction with the one or more interfaces (for exchanging information, e.g., with an optical imaging sensor of a microscope) and/or with the one or more storage devices (for storing and/or retrieving information).
  • the system is configured to obtain (e.g., receive) image data from an optical imaging sensor of a microscope 120 of the microscope system.
  • the system is configured to obtain (e.g., receive) a sensor signal from a touch interface of a touch screen 130 of the microscope system.
  • the sensor signal represents a touch input obtained via the touch interface.
  • the system is configured to generate a display signal for a display of the touch screen of the microscope system.
  • the display signal comprises a continuously updated representation 140 of the image data and a visual control overlay 150 being overlaid over the representation of the image data.
  • the visual control overlay is controlled via the touch input obtained via the touch interface.
  • the visual control overlay comprises a first touch-activated button 152 that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation.
  • the visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.
  • the visual control overlay further comprises one or more second touch-activated buttons 154.
  • the one or more second touch-activated buttons may be fanned out around the first touch- activated button.
  • the visual control overlay further comprises, at least if one of the one or more second touch-activated buttons is actuated via the touch interface, a third touch-controlled control element 156, which may be arc-shaped.
  • the one or more second touch-activated buttons may be arranged between the third touch-controlled control element and the first touch-activated button.
  • Embodiments of the present disclosure relate to a system, method and computer program for a microscope system.
  • a microscope is an optical instrument that is suitable for examining objects that are too small to be examined by the human eye (alone).
  • a microscope may provide an optical magnification of a sample.
  • the optical magnification is often provided for a camera or an imaging sensor, such as an optical imaging sensor of the microscope 120 that is shown in Fig. 1b.
  • the microscope 120 may further comprise one or more optical magnification components that are used to magnify a view on the sample, such as an objective (i.e., lens).
  • the object being viewed through the microscope may be a sample of organic tissue, e.g., arranged within a petri dish or present in a part of a body of a patient.
  • the microscope system 100 may be a microscope system for use in a laboratory, e.g., a microscope that may be used to examine the sample of organic tissue in a petri dish.
  • the microscope 120 may be part of a (neuro)surgical microscope system 100, e.g., a microscope to be used during a (neuro)surgical procedure. Such a system is shown in Fig. 1b, for example.
  • an object being viewed through the microscope, and shown in the image data, may be a sample of organic tissue of a patient.
  • Although embodiments are described in connection with a microscope, they may also be applied, in a more general manner, to any optical device.
  • the above system 110 is suitable for use with the microscope system comprising microscope 120, e.g., as part of the microscope system 100.
  • Fig. 1b shows a block diagram of the microscope system 100 comprising the system 110, the microscope 120 and the touch screen 130.
  • the microscope system shown in Fig. 1b is a surgical microscope system. However, the system 110 may be used with other microscope systems or optical systems as well.
  • the surgical microscope system 100 of Fig. 1b comprises a number of optional components, such as a base unit 105 (comprising the system 110) with a (rolling) stand, the touch-screen 130, a (robotic or manual) arm 160 which holds the microscope 120 in place, and which is coupled to the base unit 105 and to the microscope 120, and steering handles 170 that are attached to the microscope 120.
  • the touch screen 130 may be arranged at the base unit 105 of the microscope system.
  • the term “(surgical) microscope system” is used, in order to cover the portions of the system that are not part of the actual microscope (which comprises optical components), but which are used in conjunction with the microscope, such as the touch-screen or an illumination system.
  • the system is configured to obtain image data from the optical imaging sensor of the micro scope.
  • the optical imaging sensor may comprise or be an APS (Active Pixel Sensor)- or a CCD (Charge-Coupled-Device)-based imaging sensor.
  • in APS-based imaging sensors, light is recorded at each pixel using a photo-detector and an active amplifier of the pixel.
  • CMOS (Complementary Metal-Oxide-Semiconductor)
  • sCMOS (Scientific CMOS)
  • in CCD-based imaging sensors, incoming photons are converted into electron charges at a semiconductor-oxide interface, which are subsequently moved between capacitive bins in the imaging sensors by a control circuitry of the imaging sensors to perform the imaging.
  • the system is configured to obtain (i.e., receive or read out) the image data from the optical imaging sensor.
  • the image data may be obtained by receiving the image data from the optical imaging sensor (e.g., via the interface 112), by reading the image data out from a memory of the optical imaging sensor (e.g., via the interface 112), or by reading the image data from a storage device 116 of the system 110, e.g., after the image data has been written to the storage device 116 by the optical imaging sensor or by another system or processor.
  • the system is configured to generate the display signal for the display of the touch screen of the microscope system.
  • the display signal may be a signal for driving (e.g., control ling) the display of the touch-screen 130.
  • the display signal may comprise video data and/or control instructions for driving the display.
  • the display sig nal may be provided via one of the one or more interfaces 112 of the system.
  • the system 110 may comprise a video interface 112 that is suitable for providing the video signal to the display of the touch screen.
  • the display signal comprises two components: the continuously updated representation 140 of the image data and the visual control overlay 150 being overlaid over the representation of the image data.
  • the continuously updated representation of the image data may be a live video that is generated based on the image data generated by the optical imaging sensor of the microscope.
  • the image data, and thus also the representation of the image data, may show a view through the microscope on the sample being observed via the microscope, e.g., a view on a surgical site being observed through the surgical microscope.
  • the representation 140 of the image data is continuously updated, so both the image data may be video data (i.e., continuously updated image data), and the representation may be a (live) video of the video data generated by the optical imaging sensor.
  • the representation 140 of the image data may (always) be shown in the background, with the visual control overlay 150 being shown in the foreground, partially obstructing the representation 140 of the image data.
  • the visual control overlay is at the core of the present concept - it provides the user with a user interface for controlling the (surgical) microscope system.
  • the visual control overlay may be a user interface for controlling the (surgical) microscope system.
  • the visual control overlay is also denoted “rotary menu”, as it provides a touch-controlled menu to the user that is arc-shaped and that allows the manipulation of values through a rotary movement.
  • Figs. 1c to 1f and 3a to 5e show different examples of such a visual control overlay.
  • a first touch-activated button 152, one or more second touch-activated buttons 154, and a third touch-controlled control element 156.
  • the one or more second touch-activated buttons 154 and the third touch-controlled control element merely serve as an example for the at least one further touch-controlled control element.
  • Any combination of touch- controlled user interface elements, arranged in any direction relative to the first touch- activated button may be used to implement the at least one further touch-controlled control element.
  • a first button, e.g., an “activation button”, that “unfolds” an arbitrary number of second-level buttons (e.g., the one or more second touch-activated buttons) that are geometrically arranged relative to that activation button.
  • the one or more second touch-activated buttons may be two or more second touch-activated buttons, or three or more second touch-activated buttons, or four or more second touch-activated buttons.
  • the third-level control element might become visible only on activation.
  • the second and third level may be arranged in any position or shape on the screen, relative to the activation button, with the half-moon shape only being one example.
  • the first touch-activated button 152 is used to, or configured to, expand or collapse the visual control overlay upon actuation (i.e., to toggle between showing and hiding the remaining elements of the visual control overlay upon actuation).
  • the one or more second touch-activated buttons are used to, or configured to, select a functionality of the microscope system that is to be controlled.
  • the third touch-controlled element is used to control the microscope system, e.g., by selecting or adjusting a value associated with the functionality being selected through one of the one or more second touch-activated buttons.
  • the first touch-activated button is used for showing or hiding (the rest of) the menu.
  • the first touch-activated button may always be visible.
  • the system may be configured to generate the display signal such that the first touch-activated button is constantly shown overlaid over the representation of the image data.
  • a visual representation of the first touch-activated button may change.
  • the button may be shown 310a with a pictogram representing settings being adjusted (e.g., by showing sliders being adjusted, as shown in Fig.
  • the button may be shown 320b with a pictogram representing a “cancel” operation, thereby providing an affordance for the user to re-use the same button for opening and closing the menu.
  • the one or more second touch-activated buttons 154 are fanned out around the first touch-activated button.
  • the term “fanned out” indicates that the one or more second touch-activated buttons are arranged at different sides of the first touch-activated button, located adjacent to the first touch-activated button, with each second touch-activated button having (substantially) the same distance to the first touch-activated button.
  • a rotary menu is presented, with rows of buttons being spread out radially around the first touch-activated button. Consequently, as shown in Figs. 1c to 1f, the one or more second touch-activated buttons and the third touch-controlled control element may be spread out radially around the first touch-activated button.
  • the one or more second touch-activated buttons may be arranged on a first circle, or rather a portion of a first circle, around the first touch-activated button, the shape of the one or more second touch-activated buttons following the circle (with a center of gravity of the one or more second touch-activated buttons being placed on the circle).
  • the third touch-controlled element may be arranged on a second circle, or rather a portion of a second circle, around the first touch-activated button, with a shape of the third touch-controlled element following the second circle (thus being arc-shaped).
  • the third touch-controlled element, and/or the one or more second touch-activated buttons may have a shape that is centered around the respective circle, thus creating an arc that covers a portion of the circle.
  • the second circle may be larger than the first circle (and thus further out than the first circle relative to the first touch-activated button).
  • the one or more second touch-activated buttons may be arranged radially between the first touch-activated button and the third touch-controlled control element.
  • the one or more second touch-activated buttons and the third touch-controlled control element are constrained to a portion of the circle, e.g., to provide intuitive touch targets to a hand when the first touch-activated button has been activated via a thumb of the hand.
  • the one or more second touch-activated buttons and the third touch-controlled control element may be constrained to a semi-circle (or less than a semi-circle) “above” (i.e., above on the display) or “below” (on the display) the first touch-activated button (and centered around the first touch-activated button).
  • the one or more second touch-activated buttons and the third touch-controlled control element may be contained within a circular sector of a circle being spanned around the first touch-activated button.
  • the circular sector may have an angle of less than 180 degrees (or at most 170 degrees, or at most 160 degrees). In the examples shown in Figs. 1c to 1f, the circular sector has an angle of 156 degrees.
  • the one or more second touch-activated buttons (as a group) and the third touch-controlled element may be horizontally (i.e., laterally) centered around the first touch-activated button.
  • the circular sector may be horizontally aligned with, and/or horizontally centered around, the first touch-activated button.
  • the (entire) second touch-activated buttons and the third touch-controlled control element, and thus the circular sector, may be arranged, within the display signal/on the display, above a center point of the first touch-activated button.
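  • the radial arrangement described in the preceding bullets can be sketched in a few lines of code. The following is a minimal illustration only, not part of the disclosure; the function name, the 156-degree default sector and the screen-coordinate convention (y growing downward) are assumptions:

```python
import math

def fan_out_positions(center, radius, n_buttons, sector_deg=156.0):
    """Place n_buttons on an arc of sector_deg degrees around center,
    horizontally centered 'above' the first touch-activated button
    (screen coordinates, y growing downward)."""
    cx, cy = center
    # 90 degrees points straight up; spread the sector evenly around it.
    start = 90.0 + sector_deg / 2.0
    step = sector_deg / (n_buttons - 1) if n_buttons > 1 else 0.0
    positions = []
    for i in range(n_buttons):
        a = math.radians(start - i * step)
        positions.append((cx + radius * math.cos(a), cy - radius * math.sin(a)))
    return positions
```

  The second circle, carrying the arc-shaped third element, would reuse the same sector with a larger radius.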
  • the visual control overlay is constrained to a small portion of the screen, e.g., so a large part of the representation 140 of the image data can be seen even if the visual control overlay is shown.
  • the system may be configured to generate the visual control overlay such that the first touch-activated button covers at most 5% (or at most 3%, or at most 2%) of an overall surface of the display (or of the representation of the image data).
  • considering the entire visual control overlay, i.e., the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element, at most 30% of the display may be covered.
  • the system may be configured to generate the visual control overlay such that the visual control overlay covers at most 30% of an overall surface of the display (or of the representation of the image data). However, in some implementations, the visual control overlay may cover more than 30% of the overall surface of the display.
  • the unfolded visual control overlay may extend across the entire display (without extending across the borders of the display). Furthermore, the visual control overlay may be restricted to the bottom half (or the top half) of the display. In other words, the system may be configured to generate the visual control overlay such that the visual control overlay is constrained within a bottom (or top) half of the display. For example, the visual control overlay may be centered horizontally at the bottom or top of the display.
  • the first button, and correspondingly the other touch-activated control elements that are arranged relative to the first button, may be arranged (e.g., dragged) to any position of the user interface, as long as the visual control overlay can be fully unfolded without extending across the borders of the display.
  • the position of the remaining touch-controlled control elements may be adapted (e.g., rotated around the first touch-activated button), depending on the position of the first touch-activated button on the display.
  • the one or more second touch-activated buttons 154 are used to select the functionality to control via the visual control overlay, i.e., a functionality to control via the third touch-controlled control element.
  • each of the one or more second touch-activated buttons is associated with a functionality, which may be controlled upon actuation of the respective touch-activated button.
  • at least one of the one or more second touch-activated buttons may be associated with an optics-related functionality of the microscope system, such as zoom or autofocus.
  • At least one of the one or more second touch-activated buttons may be associated with an illumination-related functionality of the microscope system (e.g., brightness control or activation of illumination elements). Additionally, or alternatively, at least one of the one or more second touch-activated buttons may be associated with an imaging-mode-related functionality of the microscope system, e.g., to switch between reflectance imaging and fluorescence imaging, or to switch on fluorescence imaging in addition to reflectance imaging. Each button may include a pictogram representing the functionality, as shown in Figs. 3b to 3f.
  • the system may be configured to generate the visual control overlay such that each of the one or more second touch-activated buttons is associated with a single functionality of the microscope system (at a time).
  • the functionality may change depending on the microscope operation.
  • the microscope system may be suitable for performing imaging in two or more different imaging modes (e.g., reflectance imaging mode, fluorescence imaging mode in a first frequency band, fluorescence imaging mode in a second frequency band etc.).
  • the one or more second touch-activated buttons may be associated with different functionalities in different imaging modes.
  • the system may be configured to generate the visual control overlay such that a functionality associated with the one or more second touch-activated buttons is dependent on an imaging mode being performed by the microscope system.
  • In a first imaging mode (e.g., reflectance imaging mode), at least one of the one or more second touch-activated buttons may be associated with a different functionality than in a second imaging mode (e.g., fluorescence imaging mode).
  • the number of second touch-activated buttons may be the same in the two different imaging modes.
  • the third touch-controlled control element is used to control the selected functionality.
  • the third touch-controlled control element may be suitable for controlling a functionality of the microscope system, the functionality of the microscope system being coupled to a functionality that is associated with the second touch-activated button being actuated.
  • while the present disclosure refers to the third touch-controlled element merely as an element, it can comprise one or multiple sub-elements.
  • the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises at least one of a button and a slider control.
  • the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises or is composed of two buttons, such that the third touch-controlled control element comprises or is composed of a slider control, or such that the third touch-controlled control element comprises or is composed of two buttons and a slider control. Examples for these alternatives are shown in Figs. 1c to 1f.
  • the third touch-controlled control element is composed of a slider 156.
  • the third touch-controlled control element is composed of a slider 156 and two buttons 156a; 156b (for increasing or decreasing a numerical value, for example).
  • buttons 156a; 156b may be integrated within the slider (as shown in Fig. 3b) or separate from the slider, as shown in Fig. 1d.
  • the third touch-controlled control element is also composed of a slider, with the relative proportions of the two portions 156c; 156d representing the numerical value being selected, with portion 156c representing the numerical value being selected, and portion 156d (which also includes the portion denoted 156e) representing the complementary value.
  • the third touch-controlled control element may further comprise a threshold value, which is below the maximal value, and which may limit the values that are selectable via the third touch-controlled control element.
  • portion 156e represents a portion of the third touch-controlled touch element that is beyond the threshold (the threshold being indicated by the dashed line).
  • the portion 156e of the slider on the right of the dashed line may represent an upper threshold which can never be reached by changing the slider, with portion 156c representing the actual value, and portion 156d representing the complementary value.
  • such a slider may be used to adjust a brightness level of an illumination system of the microscope system, with a threshold being applied to limit the selectable brightness.
  • the portion 156c covers values 0% to 30%
  • the portion 156d (without portion 156e), i.e., the complement, covers values 30% to 70%
  • the portion 156e, which cannot be reached and is marked as “forbidden zone”, covers 70% to 100%.
  • the same principle may be used for other numerical values as well, such as the working distance.
  • the threshold may be updated dynamically, depending on the current state of the microscope. For example, in case of the illumination, the threshold may be dynamically updated depending on the working distance of the microscope.
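  • the brightness example above can be captured by a small clamping rule. The following is an illustrative sketch only; the function name and the 0.7 default threshold merely mirror the 70% figure from the example:

```python
def slider_value_from_touch(touch_frac, threshold=0.7):
    """Map a touch position along the slider (0.0 = minimum end,
    1.0 = maximum end) to the selected value; positions inside the
    'forbidden zone' beyond the threshold are clamped to the threshold."""
    return min(max(touch_frac, 0.0), threshold)
```

  A dynamically updated threshold (e.g., derived from the current working distance) would simply be passed in as the `threshold` argument before each evaluation.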
  • the third touch-controlled control element is composed of two buttons 156f; 156g, which may be used to select one of two values (e.g., “on” / “off” or “activate” / “deactivate”). In some examples, however, the third touch-controlled control element may comprise or be composed of more than two buttons, e.g., three buttons or four buttons.
  • the term “touch-controlled control element” is used for touch-activated buttons (156a; 156b; 156f; 156g) as well as for the touch-controlled slider controller.
  • the visual control overlay may also include a numerical representation of the slider control, e.g., within the slider control or on top of the slider control, as shown (320) in Figs. 3b to 3e.
  • any combination of sliders, toggles and click button controls may be used for the third touch-controlled control element.
  • the visual control overlay may be generated with different levels of expansion - only showing the first touch-activated button, or showing the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element.
  • the inclusion of the third touch-controlled control element may be made conditional on a functionality being selected via one of the one or more second touch-activated buttons (as the third touch-controlled control element might only serve a purpose once a functionality is selected).
  • the system may be configured to toggle including the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons, e.g., toggle between showing and hiding the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons.
  • one of the functionalities may be pre-selected upon expansion of the visual control overlay (e.g., by selecting a default functionality or by selecting the last-selected functionality).
  • the third touch-controlled control element may always be shown, even directly after the expansion.
  • the visual appearance of at least one of the one or more second touch-activated buttons, and, optionally, the third touch-controlled control element may be changed depending on whether one of the functionalities has been selected.
  • the touch-controlled control elements, such as the first touch-activated button, the one or more second touch-activated buttons and/or the third touch-controlled control element, may have an active state and a passive state.
  • the system may be configured to generate the visual control overlay such that the touch-controlled control elements are shown in different colors depending on their state. For example, if a button/control element is activated, it may be shown (i.e., “highlighted”) with a lighter and/or more noticeable color (e.g., a bright color instead of black or grey).
  • the visual control overlay is controlled via the touch input obtained via the touch interface.
  • the touch input is obtained via the sensor signal that is obtained from the touch interface of the touch screen 130 of the microscope system.
  • the sensor signal represents the touch input obtained via the touch interface.
  • the sensor signal may comprise information on one or more coordinates at which a touch input has occurred.
  • the touch input is used to control the visual control overlay.
  • the system may be configured to control the visual control overlay via the touch input obtained via the touch interface.
  • the system may be configured to locate the touch input (e.g., in coordinates) relative to the visual control overlay, e.g., relative to the buttons and/or element of the visual control overlay, and to control the visual control overlay based on a location of the touch input, and, with respect to the slider control, optionally based on a movement of the touch input.
  • the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element may have so-called touch targets, which are regions of a coordinate system representing at least the visual control overlay. If a touch input intersects with a touch target, the respective button (or control element) is actuated. In case of a button, the actuation may activate or deactivate the button. In case of the slider control, an actuation may set the slider value to the position of the touch input.
  • a touch movement after the initial touch input may be taken into account as well, increasing or decreasing the numerical value associated with the slider control depending on a direction of the touch movement.
  • a touch movement from left to right or bottom to top may increase the value, and a touch movement from right to left or from top to bottom may decrease the value (with diagonal movements being considered similarly).
  • the value may be set to the position at the end of the touch movement.
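  • the movement-based adjustment described above can be sketched as follows; the function name, the sensitivity constant and the normalized 0..1 value range are illustrative assumptions, not part of the disclosure:

```python
def adjust_slider(value, start, end, sensitivity=0.005):
    """Adjust a normalized slider value (0.0..1.0) based on a touch
    movement from start to end (screen coordinates, y growing downward):
    left-to-right and bottom-to-top movements increase the value, the
    opposite directions decrease it; diagonal movements combine both."""
    dx = end[0] - start[0]   # rightward movement is positive
    dy = start[1] - end[1]   # upward movement is positive
    return min(max(value + (dx + dy) * sensitivity, 0.0), 1.0)
```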
  • touch input that is registered radially outward of the third touch-controlled control element (i.e., radially outwards within the same sector of the circle) may be used to control the third touch-controlled control element (e.g., both for single touches happening radially outward of the third touch-controlled control element and for touch movements starting and/or ending radially outward of the third touch-controlled control element).
  • as the slider can be controlled via single touch inputs as well as, optionally, via touch movements, the third touch-controlled control element is touch-controlled, instead of being merely touch-activated.
  • the third touch-controlled touch element may be a third touch-activated control element, accepting only single touch inputs.
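  • the hit-testing described in the bullets above, including the rule that touches radially outward of the arc still control the third element, can be sketched as a simple polar classification. Names, default radii and the sector angle below are illustrative assumptions only:

```python
import math

def route_touch(touch, center, r_first=40.0, r_second=120.0, sector_deg=156.0):
    """Classify a touch point relative to the first touch-activated
    button at center (screen coordinates, y growing downward)."""
    dx = touch[0] - center[0]
    dy = center[1] - touch[1]          # flip so that 'up' is positive
    r = math.hypot(dx, dy)
    if r <= r_first:
        return "first"                 # open/close button
    angle = math.degrees(math.atan2(dy, dx))
    if abs(angle - 90.0) > sector_deg / 2.0:
        return "none"                  # outside the circular sector
    # inside the sector: inner ring -> second buttons; everything
    # radially outward (even beyond the arc) -> third element
    return "second" if r <= r_second else "third"
```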
  • the one or more interfaces 112 may correspond to one or more inputs and/or outputs for receiving and/or transmitting information, which may be in digital (bit) values according to a specified code, within a module, between modules or between modules of different entities.
  • the one or more interfaces 112 may comprise interface circuitry configured to receive and/or transmit information.
  • the one or more processors 114 may be implemented using one or more processing units, one or more processing devices, any means for processing, such as a processor, a computer or a programmable hardware component being operable with accordingly adapted software.
  • the described function of the one or more processors 114 may as well be implemented in software, which is then executed on one or more programmable hardware components.
  • Such hardware components may comprise a general-purpose processor, a Digital Signal Processor (DSP), a micro-controller, etc.
  • the one or more storage devices 116 may comprise at least one element of the group of a computer readable storage medium, such as a magnetic or optical storage medium, e.g., a hard disk drive, a flash memory, Floppy Disk, Random Access Memory (RAM), Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), an Electronically Erasable Programmable Read Only Memory (EEPROM), or a network storage.
  • the system and microscope system may comprise one or more additional optional features corresponding to one or more aspects of the proposed concept or one or more examples described above or below.
  • Fig. 2 shows a flow chart of an example of a corresponding method for a microscope system, e.g., for the microscope system 100 of Figs. 1a to 1f.
  • the method comprises obtaining 210 image data from an optical imaging sensor of a microscope of the microscope system.
  • the method comprises obtaining 220 a sensor signal from a touch interface of a touch screen of the microscope system.
  • the sensor signal represents a touch input obtained via the touch interface.
  • the method comprises generating 230 a display signal for a display of the touch screen of the microscope system.
  • the display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data.
  • the visual control overlay is controlled via the touch input obtained via the touch interface.
  • the visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation.
  • the visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.
  • the method may comprise providing the display signal to the display of the touch-screen.
  • the method may comprise one or more additional optional features corresponding to one or more aspects of the proposed concept or one or more examples described above or below.
  • Various aspects of the present disclosure relate to a radial menu with a live background, in the following denoted a “rotary menu” (e.g., the visual control overlay 150 introduced in connection with Figs. 1a to 2).
  • the present disclosure may thus relate to microscope on-screen controls.
  • the User Interface (UI)/User experience (UX) may enable a selection and control of microscope functions, in an intuitive and clear manner, while keeping a clean UI and where possible showing the immediate effect of the selected changes.
  • the Ro tary Menu is may be a context menu that is (only) available during surgery and located at the bottom of the Live View (e.g., the representation of the image data). The menu may be opened and closed with a one-button handling.
  • the “rotary menu” is shown with respect to an illumination control screen.
  • the shown illumination control screen has a live image from the microscope in the background.
  • a single button (e.g., the first touch-activated button) brings up the radial menu with 4 “petals” (e.g., the one or more second touch-activated buttons) and a further circumferential bar (e.g., the third touch-controlled control element).
  • the screen can detect touch radially outward of the circumferential bar.
  • the "Rotary Menu" button (e.g., the first touch-activated button) 310 at the bottom changes to an "X" button for closing the menu again.
  • the user can change simple functions that need to be accessed quickly and easily (e.g., changing the illumination).
  • Figs. 3a to 3e show various states of an exemplary rotary menu.
  • the rotary menu comprises the “rotary menu” button 310 (which may correspond to the first touch-activated button 152 of Figs. 1a to 2), to open/close the rotary menu, a slider or toggle buttons 315 (which may correspond to the third touch-controlled control element 156), which is the area to manipulate values, and four buttons 311-314 (e.g., the one or more second touch-activated buttons 154) for changing the content of the slider or toggle buttons 315.
  • a description of the currently activated function and a numerical value (if applicable) representing a value of the currently activated function may be shown.
  • the rotary menu may contain four sections, the description, the slider or toggle buttons, the buttons for changing the content of the slider or toggle buttons, and the button or buttons to open/close the rotary menu.
  • In Fig. 3a, a generic representation is shown, where button 310 and button 312 are highlighted.
  • the third touch-controlled control element 315 relates to the function associated with button 312.
  • buttons 311-314 in Figs. 3b to 3e are “video focus” 311, which may be moved upwards or downwards until a terminal position is reached, “illumination” 312, which may also be set to a value between 0 and 100, “auto focus” 313, which may be set to on or off, and “bright care” 314, which may also be set to on or off.
  • via the buttons 315b; 315d beneath the slider, the user can change the parameters that need to be adjusted by the slider.
  • two ON / OFF Toggle buttons 315e; 315f may be shown, which function as classic buttons, as do the buttons 311-314.
  • if a button is activated, the background may turn "Aquamarine" and the icon of the button may get the "focused" status, for example. If a maximal or minimal value or position is reached, one of the buttons 315b; 315d may be disabled.
  • buttons 315b and 315d may be shown for decreasing and increasing the value.
  • the buttons are shown within the slider. However, as shown in Fig. 1d, they may also be separate from the slider.
  • the numerical value may be controlled by placing the finger on any position on the slider, with the numerical value being adjusted according to the position of the finger on the slider.
  • the slider 315 in the middle provides the manipulation of values in two directions (up and down); it can be changed using the slide gesture or by tapping on the "+" or "−" buttons 315b; 315d.
  • the two buttons 315e; 315f on the left and the right side are used to toggle values between two states (currently only on/off).
  • the slider of the Rotary Menu may be displayed in white in any circumstances.
  • the user may receive direct feedback on the changes by the visual appearance and manipulation of the slider itself. Also, a quick tap on the "+" or "−" or on any other area of the slider may make fast manipulation possible.
  • In Fig. 3d, a more detailed view of the touch targets 331-335a/b for actuating the buttons 311-315b/d is shown.
  • the rotary menu, and the description, is center-aligned around a central axis 330.
  • In Fig. 3e, the angles between the different elements are shown.
  • Fig. 3e shows a horizontal axis 340 and a vertical axis 341.
  • the angle between the horizontal axis 340 and the elements 314; 315 is 22 degrees (the same applies between the horizontal axis 340 and the elements 311; 315 on the other side).
  • The angle between the gaps separating the buttons 311-315 is 34 degrees in the example.
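  • one consistent reading of the stated angles - a 22-degree margin at each side of the horizontal axis and 34 degrees between successive gaps - places the gap directions symmetrically about the vertical central axis. The sketch below illustrates only that reading, not a statement of the actual design:

```python
def petal_gap_angles(edge_deg=22.0, step_deg=34.0, n_gaps=5):
    """Directions, in degrees from the horizontal axis 340, of the gaps
    separating the touch-controlled elements."""
    return [edge_deg + step_deg * k for k in range(n_gaps)]
```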
  • Fig. 3f shows two different states for button 310 according to an example.
  • Button 310a may be shown when the rotary menu is collapsed, while button 310b may be shown when the rotary menu is open, as shown in Figs. 3a to 3e.
  • Fig. 4 shows an exemplary placement of the rotary menu with first touch-controlled button 410; four second touch-controlled buttons 420 and two third touch-controlled elements 430 on top of a live view 440.
  • the description of the functionality is shown.
  • the rotary menu may be constrained to the bottom half of the screen, and may be centered horizontally.
  • Fig. 5 shows an exemplary transition between different states of the rotary menu.
  • Fig. 5 shows a first view 510, where the rotary menu is collapsed, a second view 520, in which the illumination button is activated, with the illumination being controlled via the slider, and a third view 530, in which the autofocus button is activated, with the choice being between “on” and “off”.
  • When the rotary menu button at the bottom of the menu is pressed, the menu collapses again, and the first view 510 is shown again. More details and aspects of the rotary menu are mentioned in connection with the proposed concept or one or more examples described above or below (e.g., Figs. 1a to 2, 6).
  • the rotary menu may comprise one or more additional optional features corresponding to one or more aspects of the proposed concept or one or more examples described above or below.
  • While some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
  • a microscope comprising a system as described in connection with one or more of the Figs. 1 to 5.
  • a microscope may be part of or connected to a system as described in connection with one or more of the Figs. 1 to 5.
  • Fig. 6 shows a schematic illustration of a system 600 configured to perform a method described herein.
  • the system 600 comprises a microscope 610 and a computer system 620.
  • the microscope 610 is configured to take images and is connected to the computer system 620.
  • the computer system 620 is configured to execute at least a part of a method described herein.
  • the computer system 620 may be configured to execute a machine learning algorithm.
  • the computer system 620 and microscope 610 may be separate entities but can also be integrated together in one common housing.
  • the computer system 620 may be part of a central processing system of the microscope 610 and/or the computer system 620 may be part of a subcomponent of the microscope 610, such as a sensor, an actuator, a camera or an illumination unit, etc. of the microscope 610.
  • the computer system 620 may be a local computer device (e.g., personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g., a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers).
  • the computer system 620 may comprise any circuit or combination of circuits.
  • the computer system 620 may include one or more processors which can be of any type.
  • processor may mean any type of computational circuit, such as but not limited to a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), multiple core processor, a field programmable gate array (FPGA), for example, of a microscope or a microscope component (e.g., camera) or any other type of processor or processing circuit.
  • circuits may be a custom circuit, an application-specific integrated circuit (ASIC), or the like, such as, for example, one or more circuits (such as a communication circuit) for use in wireless devices like mobile telephones, tablet computers, laptop computers, two-way radios, and similar electronic systems.
  • the computer system 620 may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like.
  • the computer system 620 may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system 620.
  • Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
  • embodiments of the invention can be implemented in hardware or in software.
  • the implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable.
  • Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
  • embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer.
  • the program code may, for example, be stored on a machine readable carrier.
  • Further embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier.
  • an embodiment of the present invention is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
  • a further embodiment of the present invention is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor.
  • the data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitionary.
  • A further embodiment of the present invention is an apparatus as described herein comprising a processor and the storage medium.
  • A further embodiment of the invention is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein.
  • The data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
  • A further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein.
  • A further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
  • A further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver.
  • The receiver may, for example, be a computer, a mobile device, a memory device or the like.
  • The apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
  • In some embodiments, a programmable logic device (for example, a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein.
  • A field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein.
  • Generally, the methods are preferably performed by any hardware apparatus.

Abstract

Examples relate to a microscope system, such as a surgical microscope system, and to a corresponding system, method and computer program for a microscope system. The system comprises one or more processors and one or more storage devices. The system is configured to obtain image data from an optical imaging sensor of a microscope of the microscope system. The system is configured to obtain a sensor signal from a touch interface of a touch screen of the microscope system. The sensor signal represents a touch input obtained via the touch interface. The system is configured to generate a display signal for a display of the touch screen of the microscope system. The display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data. The visual control overlay is controlled via the touch input obtained via the touch interface. The visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation, three or more second touch-activated buttons that are fanned out around the first touch-activated button, and, at least if one of the three or more second touch-activated buttons is actuated via the touch interface, a third touch-controlled control element that is arc-shaped. The three or more second touch-activated buttons are arranged between the third touch-controlled control element and the first touch-activated button.

Description

Microscope System and Corresponding System, Method and Computer Program
Technical field
Examples relate to a microscope system, such as a surgical microscope system, and to a corresponding system, method and computer program for a microscope system.
Background
Surgical microscope systems are complex devices that provide a large number of functionalities. These functionalities are often accessible via haptic buttons, such as buttons that are located at handles of the surgical microscope system, or buttons that are arranged on a foot pedal of the surgical microscope system.
In some cases, access to the functionality may be provided visually, e.g., via a display and a corresponding input device. However, as surgical microscope systems are used in high-stake situations where the concentration of the surgeon and of the assistants is paramount, the design of a user interface being used to control the surgical microscope system is of increased importance.
Summary
Various examples of the present disclosure are based on the finding that, while touchscreens provide means that are suitable for controlling a (surgical) microscope system, a user interface being used to control the microscope system may be designed in a manner that focuses on usability and reachability of the elements of the user interface to support the surgeon and assistants during surgery. In the present disclosure, a user interface is presented that uses a button to show or hide the remaining control elements of the user interface, which provides easy access to the remaining control elements without overly obstructing the view on the sample while being collapsed. Furthermore, a two- or three-tiered arc-shaped user interface is introduced that is designed to take into account the shape and reach of the hand, while providing a layout that is intuitive and that can be controlled safely in stressful situations.

Various aspects of the present disclosure relate to a system for a microscope system. The system comprises one or more processors and one or more storage devices. The system is configured to obtain image data from an optical imaging sensor of a microscope of the microscope system. The system is configured to obtain a sensor signal from a touch interface of a touch screen of the microscope system. The sensor signal represents a touch input obtained via the touch interface. The system is configured to generate a display signal for a display of the touch screen of the microscope system. The display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data. The visual control overlay is controlled via the touch input obtained via the touch interface. The visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation.
The visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button. By using a visual control overlay with a first button that is used to show or hide the remaining elements of the touch-based interface, the menu is always accessible via the first button, while obstructing only a minor portion of the view on the sample. Furthermore, the visual control overlay is overlaid over the live image, providing the user operating the microscope system with a sufficient view on the sample being observed, e.g., on the surgical site.
In various examples, the visual control overlay comprises one or more second touch-activated buttons that are arranged adjacent to the first touch-activated button. For example, the one or more second touch-activated buttons may be fanned out around the first touch-activated button. For example, the one or more second touch-activated buttons may be assigned different functionalities of the microscope system.
In various examples, the visual control overlay is context-dependent, i.e., the functionality being accessible via the visual control overlay may change depending on the context it is being used in. For example, the microscope system may be suitable for performing imaging in two or more different imaging modes (e.g., in a reflectance imaging mode and in a fluorescence imaging mode). The system may be configured to generate the visual control overlay such that a functionality associated with the one or more second touch-activated buttons is dependent on an imaging mode being performed by the microscope system. Thus, the functionality most relevant to the respective imaging mode may be accessible via the visual control overlay.
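The context-dependent assignment of button functionality can be sketched as a simple lookup from imaging mode to button labels. The mode names and functionality labels below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical mapping from imaging mode to the functionalities exposed by the
# second-level buttons. Mode and function names are illustrative only.
BUTTON_FUNCTIONS = {
    "reflectance": ["zoom", "focus", "brightness", "imaging_mode"],
    "fluorescence": ["zoom", "focus", "excitation_intensity", "imaging_mode"],
}


def second_button_functions(imaging_mode: str) -> list[str]:
    """Return the functionalities assigned to the second-level buttons
    for the currently active imaging mode."""
    try:
        return BUTTON_FUNCTIONS[imaging_mode]
    except KeyError:
        raise ValueError(f"unknown imaging mode: {imaging_mode!r}")
```

With such a table, switching the imaging mode only requires re-rendering the button labels; the layout of the overlay itself stays unchanged.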
In general, the visual control overlay may follow a design paradigm in which the first touch-activated button is used to show or hide the remaining control elements (i.e., to collapse or expand the visual control overlay), the one or more second touch-activated buttons are used to select the functionality being controlled, and the third touch-controlled control element is being used to manipulate a value of the functionality being selected. Thus, the one or more second touch-activated buttons may each be associated with a functionality, which is selected upon actuation of the respective button of the one or more second touch-activated buttons. In particular, the system may be configured to generate the visual control overlay such that each of the one or more second touch-activated buttons is associated with a single functionality of the microscope system.
For example, the one or more second touch-activated buttons may cover a wide range of functionalities. For example, one of the one or more second touch-activated buttons may be associated with an optics-related functionality of the microscope system (such as zooming or focusing). Additionally or alternatively, one of the one or more second touch-activated buttons may be associated with an illumination-related functionality of the microscope system (e.g., brightness control, or switching between white light and fluorescence excitation illumination). Additionally or alternatively, one of the one or more second touch-activated buttons may be associated with an imaging-mode-related functionality of the microscope system (e.g., to switch between fluorescence imaging and reflectance imaging, or to select a fluorescence imaging mode).
In various examples, the visual control overlay comprises, at least if one of the one or more second touch-activated buttons is actuated via the touch interface, a third touch-controlled control element. For example, the third touch-controlled control element may be arc-shaped. For example, the one or more second touch-activated buttons may be arranged between the third touch-controlled control element and the first touch-activated button. As outlined above, such a three-tiered and arc-shaped visual control overlay is designed to take into account the shape and reach of the hand, while providing a layout that is intuitive and that can be controlled safely in stressful situations.

In various examples, the visual control overlay has a substantially arc-shaped layout, which may provide a high degree of usability, e.g., as the other control elements are within reach when the user places a thumb on the first touch-controlled button. For example, the one or more second touch-activated buttons and the third touch-controlled control element may be spread-out radially around the first touch-activated button. The one or more second touch-activated buttons and the third touch-controlled control element may be contained within a circular sector of a circle being spanned around the first touch-activated button, the circular sector having an angle of less than 180 degrees (e.g., at most 160 degrees).
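One way to compute such a fanned-out layout is to place the button centers at equal angular steps along an arc centered above the first (hub) button. The following sketch assumes a 156-degree sector (matching the figures) and screen coordinates with y growing downward; the radius and the exact sector angle are illustrative assumptions:

```python
import math


def fan_out_positions(cx, cy, radius, n_buttons, sector_deg=156.0):
    """Place n button centers on an arc around the hub button at (cx, cy).

    Angles are measured from the positive x-axis; the sector is centered
    on 90 degrees, i.e., straight "up" on the screen. Radius and sector
    angle are illustrative, not values mandated by the disclosure.
    """
    if n_buttons == 1:
        angles = [90.0]
    else:
        start = 90.0 + sector_deg / 2.0
        step = sector_deg / (n_buttons - 1)
        angles = [start - i * step for i in range(n_buttons)]
    positions = []
    for a in angles:
        rad = math.radians(a)
        # Screen y grows downward, so "above" the hub means smaller y.
        positions.append((cx + radius * math.cos(rad),
                          cy - radius * math.sin(rad)))
    return positions
```

For four buttons this yields angles of 168, 116, 64 and 12 degrees, symmetric about the vertical through the hub button, which matches the "fanned out and horizontally centered" arrangement described in the text.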
In general, the third touch-controlled control element may be used to alter a value of the function being selected. For example, the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises at least one of a button (e.g., two buttons for increasing or decreasing a value, or two or more buttons for selecting among a set of pre-defined values), a toggle (e.g., for selecting among two pre-defined values, such as on and off) and a slider control (e.g., to control a numerical value).
For example, through experimentation, the following layouts have proven to be intuitive. For example, the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises two buttons (e.g., for selecting among two values, or for increasing or decreasing a numerical value), a slider control (e.g., for manipulating a numerical value), or two buttons and a slider control (e.g., with the buttons being used for increasing or decreasing a numerical value, and the slider control being used to directly manipulate the numerical value).
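A slider laid along the arc can map the angular position of a touch around the hub button to a numerical value. The sketch below assumes a sweep from 168 down to 12 degrees (mirroring a 156-degree sector) and linear interpolation; both the angular range and the value range are assumptions for illustration:

```python
import math


def arc_slider_value(touch_x, touch_y, cx, cy,
                     start_deg=168.0, end_deg=12.0,
                     v_min=0.0, v_max=100.0):
    """Map a touch position to a slider value by its angle around the hub
    button at (cx, cy). The 168-to-12 degree sweep is illustrative only."""
    # atan2 with flipped y, since screen coordinates grow downward.
    angle = math.degrees(math.atan2(cy - touch_y, touch_x - cx))
    hi, lo = start_deg, end_deg
    # Clamp to the slider's angular range, then interpolate linearly.
    angle = max(min(angle, hi), lo)
    t = (hi - angle) / (hi - lo)
    return v_min + t * (v_max - v_min)
```

A touch straight above the hub (90 degrees) lands at the middle of the value range, and rotating the thumb clockwise along the arc increases the value, matching the "rotary movement" described for the rotary menu.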
In case a numerical value is being manipulated, a textual representation of the numerical value may also be provided to enable a precise setting of the numerical value. For example, if the visual control overlay is generated with a slider control, the visual control overlay may also include a numerical representation of the slider control.
In some cases, it may be beneficial to show the third touch-controlled control element only once one of the functions is selected, e.g., to indicate selection of one of the functionalities, or to further reduce the obstruction on the live view. For example, the system may be configured to toggle between showing and hiding the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons.
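This show/hide behavior can be modeled as a small state holder: tapping a second-level button selects its functionality and shows the third-level element, and tapping the same button again hides it. This is one plausible interpretation of the toggling described above, not the only behavior the disclosure covers:

```python
class RotaryMenuState:
    """Minimal sketch of the selection state driven by the second-level
    buttons; functionality names are arbitrary strings."""

    def __init__(self):
        self.selected = None               # functionality currently selected
        self.third_element_visible = False

    def tap_second_button(self, functionality):
        if self.selected == functionality and self.third_element_visible:
            # Tapping the active button again hides the third-level element.
            self.third_element_visible = False
            self.selected = None
        else:
            # Selecting a (new) functionality shows the third-level element.
            self.selected = functionality
            self.third_element_visible = True
```

Keeping this state separate from the rendering makes it easy to re-draw the arc-shaped control whenever the selection changes.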
As pointed out above, the third touch-controlled control element may be used to alter a value of the function being selected. In other words, the one or more second touch-activated buttons may be used to select a functionality, and the third touch-controlled control element may be used to actually effect a change in the microscope system. For example, the third touch-controlled control element may be suitable for controlling a functionality of the microscope system, the functionality of the microscope system being coupled to a functionality that is associated with the second touch-activated button being actuated.
In general, the visual control overlay is controlled via the touch-screen display. To improve the usability in scenarios in which a user is unable to take the eyes off the sample, e.g., during surgery, the touch targets of the touch-screen may be adjusted, in particular with respect to the third touch-controlled control element. For example, the system may be configured to control the visual control overlay via the touch input obtained via the touch interface. Touch input that is registered radially outward of the third touch-controlled control element may be used to control the third touch-controlled control element (thereby increasing the size of the touch target).
In various examples, the system is configured to generate the display signal such that the first touch-activated button is constantly shown overlaid over the representation of the image data. When the first touch-activated button is constantly shown, it can be used to expand or collapse the visual control overlay at any time.
In general, the visual control overlay may be designed such that it keeps out of the way even when it is shown in its entirety, e.g., so the user is at any time able to see the live view on the sample. For example, the system may be configured to generate the visual control overlay such that the visual control overlay is constrained within a bottom half of the display. Consequently, the view on the sample may be unobstructed at any time in the top half of the display. Additionally or alternatively, the system may be configured to generate the visual control overlay such that the first touch-activated button covers at most 5% of an overall surface of the display. For example, the first touch-activated button may be visible, and thus accessible, at any time, while obstructing only a small part of the view on the sample. Additionally or alternatively, the system may be configured to generate the visual control overlay such that the visual control overlay covers at most 30% of an overall surface of the display. Thus, the majority of the view on the sample may remain unobstructed.
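These layout constraints can be expressed as simple checks against the display size. The 5% and 30% budgets come from the text; treating the button and the overlay as axis-aligned bounding rectangles is a simplifying assumption of this sketch:

```python
def overlay_within_budget(display_w, display_h, button_rect, overlay_rect):
    """Check the layout constraints described above.

    Rects are (x, y, width, height) tuples with y growing downward.
    Using bounding rectangles for the arc-shaped overlay is a
    simplifying assumption.
    """
    display_area = display_w * display_h

    def area(rect):
        return rect[2] * rect[3]

    in_bottom_half = overlay_rect[1] >= display_h / 2  # bottom half only
    button_ok = area(button_rect) <= 0.05 * display_area   # <= 5% budget
    overlay_ok = area(overlay_rect) <= 0.30 * display_area  # <= 30% budget
    return in_bottom_half and button_ok and overlay_ok
```

Such a check could run at design time or whenever the overlay is re-laid out for a new display resolution.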
The visual control overlay itself may change based on the input of the user, e.g., to indicate to the user the functionality presently being controlled. For example, touch-activated control elements of the visual control overlay, such as the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element, may have an active state and a passive state. The system may be configured to generate the visual control overlay such that the touch-activated control elements are shown in different colors depending on their state (e.g., with “highlighted” colors if active).
Various embodiments of the present disclosure further relate to a surgical microscope system comprising the system introduced above, the microscope and the display. For example, the touch-screen may be arranged at a base unit of the surgical microscope system. Alternatively, the touch-screen may be arranged at the microscope of the surgical microscope system. Such a system may be particularly beneficial in a surgical setting, as it is designed to provide an intuitive control of the (surgical) microscope system in stressful situations.
Various aspects of the present disclosure relate to a corresponding method for a microscope system. The method comprises obtaining image data from an optical imaging sensor of a microscope of the microscope system. The method comprises obtaining a sensor signal from a touch interface of a touch screen of the microscope system. The sensor signal represents a touch input obtained via the touch interface. The method comprises generating a display signal for a display of the touch screen of the microscope system. The display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data. The visual control overlay is controlled via the touch input obtained via the touch interface. The visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation. The visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.

Various aspects of the present disclosure relate to a corresponding computer program with a program code for performing the above method when the computer program is executed on a processor.
Short description of the Figures
Some examples of apparatuses and/or methods will be described in the following by way of example only, and with reference to the accompanying figures, in which:
Fig. 1a shows a block diagram of an example of a system for a microscope system;
Fig. 1b shows a schematic diagram of an example of a surgical microscope system;
Figs. 1c to 1f show different examples of a visual control overlay;
Fig. 2 shows a flow chart of an example of a method for a microscope system;
Figs. 3a to 3e show various states of an exemplary rotary menu;
Fig. 3f shows two different states of a “rotary menu” button;
Fig. 4 shows an exemplary placement of a rotary menu on top of a live view;
Fig. 5 shows an exemplary transition between different states of the rotary menu; and
Fig. 6 shows a schematic diagram of a system comprising a microscope and a computer system.
Detailed Description
Various examples will now be described more fully with reference to the accompanying drawings in which some examples are illustrated. In the figures, the thicknesses of lines, layers and/or regions may be exaggerated for clarity.

Fig. 1a shows a block diagram of an example of a system 110 for a microscope system 100. The system 110 comprises one or more processors 114 and one or more storage devices 116. Optionally, the system further comprises one or more interfaces 112. The one or more processors 114 are coupled to the one or more storage devices 116 and to the optional one or more interfaces 112. In general, the functionality of the system is provided by the one or more processors, in conjunction with the one or more interfaces (for exchanging information, e.g., with an optical imaging sensor of a microscope) and/or with the one or more storage devices (for storing and/or retrieving information).
The system is configured to obtain (e.g., receive) image data from an optical imaging sensor of a microscope 120 of the microscope system. The system is configured to obtain (e.g., receive) a sensor signal from a touch interface of a touch screen 130 of the microscope system. The sensor signal represents a touch input obtained via the touch interface. The system is configured to generate a display signal for a display of the touch screen of the microscope system. The display signal comprises a continuously updated representation 140 of the image data and a visual control overlay 150 being overlaid over the representation of the image data. The visual control overlay is controlled via the touch input obtained via the touch interface. The visual control overlay comprises a first touch-activated button 152 that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation. The visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button. In various examples of the present disclosure, the visual control overlay further comprises one or more second touch-activated buttons 154. For example, the one or more second touch-activated buttons may be fanned out around the first touch-activated button. Optionally, the visual control overlay further comprises, at least if one of the one or more second touch-activated buttons is actuated via the touch interface, a third touch-controlled control element 156, which may be arc-shaped. For example, the one or more second touch-activated buttons may be arranged between the third touch-controlled control element and the first touch-activated button.
Embodiments of the present disclosure relate to a system, method and computer program for a microscope system. In general, a microscope is an optical instrument that is suitable for examining objects that are too small to be examined by the human eye (alone). For example, a microscope may provide an optical magnification of a sample. In modern microscopes, the optical magnification is often provided for a camera or an imaging sensor, such as an optical imaging sensor of the microscope 120 that is shown in Fig. 1b. The microscope 120 may further comprise one or more optical magnification components that are used to magnify a view on the sample, such as an objective (i.e., lens).
There are a variety of different types of microscopes. If the microscope is used in the medical or biological fields, the object being viewed through the microscope may be a sample of organic tissue, e.g., arranged within a petri dish or present in a part of a body of a patient. For example, the microscope system 100 may be a microscope system for use in a laboratory, e.g., a microscope that may be used to examine the sample of organic tissue in a petri dish. Alternatively, the microscope 120 may be part of a (neuro)surgical microscope system 100, e.g., a microscope to be used during a (neuro)surgical procedure. Such a system is shown in Fig. 1b, for example. Accordingly, an object being viewed through the microscope, and shown in the image data, may be a sample of organic tissue of a patient. Although embodiments are described in connection with a microscope, they may also be applied, in a more general manner, to any optical device.
The above system 110 is suitable for use with the microscope system comprising microscope 120, e.g., as part of the microscope system 100. Fig. 1b shows a block diagram of the microscope system 100 comprising the system 110, the microscope 120 and the touch screen 130. The microscope system shown in Fig. 1b is a surgical microscope system. However, the system 110 may be used with other microscope systems or optical systems as well. The surgical microscope system 100 shown in Fig. 1b comprises a number of optional components, such as a base unit 105 (comprising the system 110) with a (rolling) stand, the touch-screen 130, a (robotic or manual) arm 160 which holds the microscope 120 in place, and which is coupled to the base unit 105 and to the microscope 120, and steering handles 170 that are attached to the microscope 120. For example, the touch screen 130 may be arranged at the base unit 105 of the microscope system. In the context of this application, the term “(surgical) microscope system” is used, in order to cover the portions of the system that are not part of the actual microscope (which comprises optical components), but which are used in conjunction with the microscope, such as the touch-screen or an illumination system.

The system is configured to obtain image data from the optical imaging sensor of the microscope. For example, the optical imaging sensor may comprise or be an APS (Active Pixel Sensor)- or a CCD (Charge-Coupled-Device)-based imaging sensor. For example, in APS-based imaging sensors, light is recorded at each pixel using a photo-detector and an active amplifier of the pixel. APS-based imaging sensors are often based on CMOS (Complementary Metal-Oxide-Semiconductor) or S-CMOS (Scientific CMOS) technology.
In CCD-based imaging sensors, incoming photons are converted into electron charges at a semiconductor-oxide interface, which are subsequently moved between capacitive bins in the imaging sensors by a control circuitry of the imaging sensors to perform the imaging. The system is configured to obtain (i.e., receive or read out) the image data from the optical imaging sensor. The image data may be obtained by receiving the image data from the optical imaging sensor (e.g., via the interface 112), by reading the image data out from a memory of the optical imaging sensor (e.g., via the interface 112), or by reading the image data from a storage device 116 of the system 110, e.g., after the image data has been written to the storage device 116 by the optical imaging sensor or by another system or processor.
The system is configured to generate the display signal for the display of the touch screen of the microscope system. In general, the display signal may be a signal for driving (e.g., controlling) the display of the touch-screen 130. For example, the display signal may comprise video data and/or control instructions for driving the display. For example, the display signal may be provided via one of the one or more interfaces 112 of the system. Accordingly, the system 110 may comprise a video interface 112 that is suitable for providing the video signal to the display of the touch screen.
The display signal comprises two components: the continuously updated representation 140 of the image data and the visual control overlay 150 being overlaid over the representation of the image data. For example, the continuously updated representation of the image data may be a live video that is generated based on the image data generated by the optical imaging sensor of the microscope. The image data, and thus also the representation of the image data, may show a view through the microscope on the sample being observed via the microscope, e.g., a view on a surgical site being observed through the surgical microscope. The representation 140 of the image data is continuously updated, so both the image data may be video data (i.e., continuously updated image data), and the representation may be a (live) video of the video data generated by the optical imaging sensor. For example, the representation 140 of the image data may (always) be shown in the background, with the visual control overlay 150 being shown in the foreground, partially obstructing the representation 140 of the image data.
The visual control overlay is at the core of the present concept: it provides the user with a user interface for controlling the (surgical) microscope system. In other words, the visual control overlay may be a user interface for controlling the (surgical) microscope system. In the context of the present disclosure, the visual control overlay is also denoted “rotary menu”, as it provides a touch-controlled menu to the user that is arc-shaped and that allows the manipulation of values through a rotary movement. Figs. 1c to 1f and 3a to 5 show different examples of such a visual control overlay.
The examples shown in Figs. 1c to 1f have three groups of elements in common: a first touch-activated button 152, one or more second touch-activated buttons 154, and a third touch-controlled control element 156. However, the one or more second touch-activated buttons 154 and the third touch-controlled control element merely serve as an example for the at least one further touch-controlled control element. Any combination of touch-controlled user interface elements, arranged in any direction relative to the first touch-activated button, may be used to implement the at least one further touch-controlled control element. Various examples may use a first button (e.g., an “activation button”) that “unfolds” an arbitrary number of second-level buttons (e.g., the one or more second touch-activated buttons) that are geometrically arranged relative to that activation button. For example, the one or more second touch-activated buttons may be two or more second touch-activated buttons, or three or more second touch-activated buttons, or four or more second touch-activated buttons. For example, in Figs. 1c to 1f, four second touch-activated buttons are used. Each of these second-level buttons can activate any other third-level touch-activated control element, for example a slider, a toggle or a set of buttons (or any other graphical user interface control element). For example, the third-level control element might become visible only on activation. Furthermore, the second and third level may be arranged in any position or shape on the screen, relative to the activation button, with the half-moon shape only being one example.
The first touch-activated button 152 is used to, or configured to, expand or collapse the visual control overlay upon actuation (i.e., to toggle between showing and hiding the remaining elements of the visual control overlay upon actuation). In the examples given in Figs. 1c to 1f, the one or more second touch-activated buttons are used to, or configured to, select a functionality of the microscope system that is to be controlled. The third touch-controlled element is used to control the microscope system, e.g., by selecting or adjusting a value associated with the functionality being selected through one of the one or more second touch-activated buttons.
As laid out above, the first touch-activated button is used for showing or hiding the (rest of the) menu. As such, the first touch-activated button may always be visible. In other words, the system may be configured to generate the display signal such that the first touch-activated button is constantly shown overlaid over the representation of the image data. Depending on whether the other elements of the visual control overlay are being shown, a visual representation of the first touch-activated button may change. As shown in Fig. 3f, if the visual control overlay is closed (i.e., collapsed), the button may be shown 310a with a pictogram representing settings being adjusted (e.g., by showing sliders being adjusted, as shown in Fig. 3f). If the visual control overlay is opened (i.e., expanded), the button may be shown 320b with a pictogram representing a “cancel” operation, thereby providing an affordance for the user to re-use the same button for opening and closing the menu.
In Figs. 1c to 1f, the one or more second touch-activated buttons 154 are fanned out around the first touch-activated button. In this context, the term “fanned out” indicates that the one or more second touch-activated buttons are arranged at different sides of the first touch-activated button, located adjacent to the first touch-activated button, with each second touch-activated button having (substantially) the same distance to the first touch-activated button.
In various examples of the present disclosure, a rotary menu is presented, with rows of buttons being spread out radially around the first touch-activated button. Consequently, as shown in Figs. 1c to 1f, the one or more second touch-activated buttons and the third touch-controlled control element may be spread out radially around the first touch-activated button. In other words, the one or more second touch-activated buttons may be arranged on a first circle, or rather a portion of a first circle, around the first touch-activated button, the shape of the one or more second touch-activated buttons following the circle (with a center of gravity of the one or more second touch-activated buttons being placed on the circle). Similarly, the third touch-controlled element may be arranged on a second circle, or rather a portion of a second circle, around the first touch-activated button, with a shape of the third touch-controlled element following the second circle (thus being arc-shaped). In other words, the third touch-controlled element, and/or the one or more second touch-activated buttons, may have a shape that is centered around the respective circle, thus creating an arc that covers a portion of the circle. As the third touch-controlled element is located further away from the first touch-activated button than the one or more second touch-activated buttons (i.e., the one or more second touch-activated buttons are arranged between the third touch-controlled control element and the first touch-activated button), the second circle may be larger than the first circle (and thus further out than the first circle relative to the first touch-activated button). For example, as shown in Figs. 1c to 1f and 3a to 5, the one or more second touch-activated buttons may be arranged radially between the first touch-activated button and the third touch-controlled control element.
In various implementations, the one or more second touch-activated buttons and the third touch-controlled control element are constrained to a portion of the circle, e.g., to provide intuitive touch targets to a hand when the first touch-activated button has been activated via a thumb of the hand. For example, the one or more second touch-activated buttons and the third touch-controlled control element may be constrained to a semi-circle (or less than a semi-circle) "above" (i.e., above on the display) or "below" (on the display) the first touch-activated button (and centered around the first touch-activated button). In other words, the one or more second touch-activated buttons and the third touch-controlled control element may be contained within a circular sector of a circle being spanned around the first touch-activated button. For example, the circular sector may have an angle of less than 180 degrees (or at most 170 degrees, or at most 160 degrees). In the examples shown in Figs. 1c to 1f, the circular sector has an angle of 156 degrees. For example, the one or more second touch-activated buttons (as a group) and the third touch-controlled element may be horizontally (i.e., laterally) centered around the first touch-activated button. In other words, the circular sector may be horizontally aligned with, and/or horizontally centered around, the first touch-activated button. The (entire) second touch-activated buttons and the third touch-controlled control element, and thus the circular sector, may be arranged, within the display signal/on the display, above a center point of the first touch-activated button. In various examples, the visual control overlay is constrained to a small portion of the screen, e.g., so a large part of the representation 140 of the image data can be seen even if the visual control overlay is shown.
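As a rough sketch of this fanned-out geometry, the following helper (an assumption for illustration, not from the disclosure) places button centers evenly on a 156-degree arc that is horizontally centered above the first button:

```python
import math

def fan_positions(center, radius, n, sector_deg=156.0):
    """Place n button centers on an arc of sector_deg degrees around
    `center`, horizontally centered and above the center (screen y grows
    downward). A sketch only; a real layout also accounts for gaps."""
    cx, cy = center
    half = math.radians(sector_deg) / 2.0
    # angles measured from the upward vertical; negative = left side
    angles = [0.0] if n == 1 else [-half + i * (2 * half) / (n - 1) for i in range(n)]
    return [(cx + radius * math.sin(a), cy - radius * math.cos(a)) for a in angles]
```

For four second buttons on the first circle and one arc-shaped third element on a larger second circle, the same helper can be called twice with different radii.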
For example, the system may be configured to generate the visual control overlay such that the first touch-activated button covers at most 5% (or at most 3%, or at most 2%) of an overall surface of the display (or of the representation of the image data). As for the entire visual control overlay (i.e., the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element), at most 30% of the display may be covered. In other words, the system may be configured to generate the visual control overlay such that the visual control overlay covers at most 30% of an overall surface of the display (or of the representation of the image data). However, in some implementations, the visual control overlay may cover more than 30% of the overall surface of the display. If needed, the unfolded visual control overlay may extend across the entire display (without extending across the borders of the display). Furthermore, the visual control overlay may be restricted to the bottom half (or the top half) of the display. In other words, the system may be configured to generate the visual control overlay such that the visual control overlay is constrained within a bottom (or top) half of the display. For example, the visual control overlay may be centered horizontally at the bottom or top of the display. In some examples, the first button, and correspondingly the other touch-activated control elements that are arranged relative to the first button, may be arranged (e.g., dragged) to any position of the user interface, as long as the visual control overlay can be fully unfolded without extending across the borders of the display. For example, the position of the remaining touch-controlled control elements may be adapted (e.g., rotated around the first touch-activated button), depending on the position of the first touch-activated button on the display.
In the following, the behavior of the one or more second touch-activated buttons 154 and of the third touch-controlled control element 156 is discussed. In Figs. 1c to 1f, four second touch-activated buttons 154a-154d are shown. Furthermore, various components 156a-156f of the third touch-controlled control element 156 are distinguished.
In general, the one or more second touch-activated buttons 154 are used to select the functionality to control via the visual control overlay. In other words, upon actuation of one of the one or more second touch-activated buttons, a functionality to control (via the third touch-controlled control element) may be selected. Thus, each of the one or more second touch-activated buttons is associated with a functionality, which may be controlled upon actuation of the respective touch-activated button. For example, at least one of the one or more second touch-activated buttons may be associated with an optics-related functionality of the microscope system, such as zoom or autofocus. Additionally or alternatively, at least one of the one or more second touch-activated buttons may be associated with an illumination-related functionality of the microscope system (e.g., brightness control or activation of illumination elements). Additionally or alternatively, at least one of the one or more second touch-activated buttons may be associated with an imaging-mode-related functionality of the microscope system, e.g., to switch between reflectance imaging and fluorescence imaging, or to switch on fluorescence imaging in addition to reflectance imaging. Each button may include a pictogram representing the functionality, as shown in Figs. 3b to 3f.
For example, the system may be configured to generate the visual control overlay such that each of the one or more second touch-activated buttons is associated with a single functionality of the microscope system (at a time). However, the functionality may change depending on the microscope operation. For example, the microscope system may be suitable for performing imaging in two or more different imaging modes (e.g., reflectance imaging mode, fluorescence imaging mode in a first frequency band, fluorescence imaging mode in a second frequency band etc.). The one or more second touch-activated buttons may be associated with different functionalities in different imaging modes. For example, the system may be configured to generate the visual control overlay such that a functionality associated with the one or more second touch-activated buttons is dependent on an imaging mode being performed by the microscope system. In other words, in a first imaging mode (e.g., reflectance imaging mode), at least one of the one or more second touch-activated buttons may be associated with a different functionality than in a second imaging mode (e.g., fluorescence imaging mode). However, the number of second touch-activated buttons may be the same in the two different imaging modes.
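A minimal sketch of such mode-dependent assignment follows; the mode names and the fluorescence-mode function names are invented for illustration (the disclosure's own examples are video focus, illumination, auto focus and bright care):

```python
# Assumed mapping: the same four buttons carry different functions per
# imaging mode, but the button count stays constant across modes.
BUTTON_FUNCTIONS = {
    "reflectance": ("video_focus", "illumination", "autofocus", "bright_care"),
    "fluorescence": ("video_focus", "excitation_power", "autofocus", "overlay_opacity"),
}

def button_function(mode: str, index: int) -> str:
    """Return the function assigned to second button `index` in `mode`."""
    return BUTTON_FUNCTIONS[mode][index]
```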
While the one or more second touch-activated buttons are used to select the functionality being controlled, the third touch-controlled control element is used to control the selected functionality. In other words, the third touch-controlled control element may be suitable for controlling a functionality of the microscope system, the functionality of the microscope system being coupled to a functionality that is associated with the second touch-activated button being actuated. While the present disclosure refers to the third touch-controlled element merely as element, it can comprise one or multiple sub-elements. For example, the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises at least one of a button and a slider control. In particular, as shown in Figs. 1c to 1f, the system may be configured to generate the visual control overlay such that the third touch-controlled control element comprises or is composed of two buttons, such that the third touch-controlled control element comprises or is composed of a slider control, or such that the third touch-controlled control element comprises or is composed of two buttons and a slider control. Examples for these alternatives are shown in Figs. 1c to 1f. In Fig. 1c, the third touch-controlled control element is composed of a slider 156. In Fig. 1d, the third touch-controlled control element is composed of a slider 156 and two buttons 156a; 156b (for increasing or decreasing a numerical value, for example). For example, the buttons 156a; 156b may be integrated within the slider (as shown in Fig. 3b) or separate from the slider, as shown in Fig. 1d.
In Fig. 1e, the third touch-controlled control element is also composed of a slider, with the relative proportions of the two portions 156c; 156d indicating the numerical value being selected: portion 156c represents the numerical value being selected, and portion 156d (which also includes the portion denoted 156e) represents the complementary value. Optionally, the third touch-controlled control element may further comprise a threshold value, which is below the maximal value, and which may limit the values that are selectable via the third touch-controlled control element. In Fig. 1e, portion 156e represents a portion of the third touch-controlled touch element that is beyond the threshold (the threshold being indicated by the dashed line). For example, the portion 156e of the slider on the right of the dashed line may represent an upper threshold which can never be reached by changing the slider, with portion 156c representing the actual value, and portion 156d representing the complementary value. For example, such a slider may be used to adjust a brightness level of an illumination system of the microscope system, with a threshold being applied to limit the selectable brightness. For example, if the numerical value is 30% and the threshold is at 70%, the portion 156c covers values 0% to 30%, the portion 156d (without portion 156e), i.e., the complement, covers values 30% to 70%, and the portion 156e, which cannot be reached and is marked as "forbidden zone", covers 70% to 100%. However, the same principle may be used for other numerical values as well, such as the working distance. For example, the threshold may be updated dynamically, depending on the current state of the microscope. For example, in the case of the illumination, the threshold may be dynamically updated depending on the working distance of the microscope.
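The 30%/70% example can be expressed as a slider clamped to a dynamic threshold. The sketch below uses the document's numbers; the class and method names are assumptions:

```python
class ThresholdSlider:
    """Slider over 0..100 whose selectable range is capped by a dynamic
    threshold; the range beyond it is the 'forbidden zone' (156e)."""

    def __init__(self, value: float = 30.0, threshold: float = 70.0):
        self.threshold = threshold
        self.value = min(value, threshold)

    def set_value(self, requested: float) -> float:
        # values beyond the threshold can never be reached
        self.value = max(0.0, min(requested, self.threshold))
        return self.value

    def portions(self):
        """(selected 156c, complement 156d, forbidden 156e) in percent."""
        return (self.value, self.threshold - self.value, 100.0 - self.threshold)
```

Updating `threshold` at runtime (e.g., when the working distance changes) models the dynamically updated brightness limit.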
In Fig. 1f, the third touch-controlled control element is composed of two buttons 156f; 156g, which may be used to select one of two values (e.g., "on" / "off" or "activate" / "deactivate"). In some examples, however, the third touch-controlled control element may comprise or be composed of more than two buttons, e.g., three buttons or four buttons. In the context of the present disclosure, the term "touch-controlled control element" is used for touch-activated buttons (156a; 156b; 156f; 156g) as well as for the touch-controlled slider controller. In case a slider is used, and thus a numerical value is being adjusted, the visual control overlay may also include a numerical representation of the slider control, e.g., within the slider control or on top of the slider control, as shown (320) in Figs. 3b to 3e. However, any combination of sliders, toggles and click button controls may be used for the third touch-controlled control element.
In general, the visual control overlay may be generated with different levels of expansion: only showing the first touch-activated button, or showing the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element. In some cases, however, the inclusion of the third touch-controlled control element may be made conditional on a functionality being selected via one of the one or more second touch-activated buttons (as the third touch-controlled control element might only serve a purpose once a functionality is selected). Thus, the system may be configured to toggle including the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons, e.g., toggle between showing and hiding the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons. Alternatively, one of the functionalities may be pre-selected upon expansion of the visual control overlay (e.g., by selecting a default functionality or by selecting the last-selected functionality). Thus, the third touch-controlled control element may always be shown, even directly after the expansion.
To help the user understand which of the functionalities is being selected, or to see whether a functionality is selected at all, the visual appearance of at least one of the one or more second touch-activated buttons, and, optionally, the third touch-controlled control element, may be changed depending on whether one of the functionalities has been selected. For example, the touch-controlled control elements, such as the first touch-activated button, the one or more second touch-activated buttons and/or the third touch-controlled control element, may have an active state and a passive state. The system may be configured to generate the visual control overlay such that the touch-controlled control elements are shown in different colors depending on their state. For example, if a button/control element is activated, it may be shown (i.e., "highlighted") with a lighter and/or more noticeable color (e.g., a bright color instead of black or grey).
The visual control overlay is controlled via the touch input obtained via the touch interface. In turn, the touch input is obtained via the sensor signal that is obtained from the touch interface of the touch screen 130 of the microscope system. The sensor signal represents the touch input obtained via the touch interface. For example, the sensor signal may comprise information on one or more coordinates at which a touch input has occurred.
The touch input is used to control the visual control overlay. For example, the system may be configured to control the visual control overlay via the touch input obtained via the touch interface. In general, the system may be configured to locate the touch input (e.g., in coordinates) relative to the visual control overlay, e.g., relative to the buttons and/or element of the visual control overlay, and to control the visual control overlay based on a location of the touch input, and, with respect to the slider control, optionally based on a movement of the touch input.
In general, the first touch-activated button, the one or more second touch-activated buttons and the third touch-controlled control element may have so-called touch targets, which are regions of a coordinate system representing at least the visual control overlay. If a touch input intersects with a touch target, the respective button (or control element) is actuated. In case of a button, the actuation may activate or deactivate the button. In case of the slider control, an actuation may set the slider value to the position of the touch input. Optionally, and additionally or alternatively to the previous option, when a touch input is registered at the slider control, a touch movement after the initial touch input may be taken into account as well, increasing or decreasing the numerical value associated with the slider control depending on a direction of the touch movement. For example, a touch movement from left to right or bottom to top may increase the value, and a touch movement from right to left or from top to bottom may decrease the value (with diagonal movements being considered similarly). Alternatively, the value may be set to an end-position of the end of the touch movement. In any case, touch inputs radially outside the third touch-controlled control element (i.e., radially outwards within the same sector of the circle) may be used for controlling the third touch-controlled control element as well. For example, touch input that is registered radially outward of the third touch-controlled control element may be used to control the third touch-controlled control element (e.g., both for single touches happening radially outward of the third touch-controlled control element and for touch movements starting and/or ending radially outward of the third touch-controlled control element).
As the slider can be controlled via single touch inputs as well as, optionally, via touch movements, the third touch-controlled control element is touch-controlled, instead of being merely touch-activated. However, in some cases, the third touch-controlled touch element may be a third touch-activated control element, accepting only single touch inputs.
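The touch-target logic above, including the rule that touches radially outward of the slider arc still control the slider, can be sketched as a hit test in polar coordinates. All radii and the sector angle below are assumed placeholder values:

```python
import math

def classify_touch(touch, center, r_first=40.0, r_second=(60.0, 110.0),
                   r_slider=120.0, sector_deg=156.0):
    """Simplified hit test for the rotary menu overlay (a sketch).
    Touches radially outward of the slider arc, but within the sector,
    still count as slider input, as described above."""
    dx = touch[0] - center[0]
    dy = center[1] - touch[1]                      # flip: screen y grows downward
    r = math.hypot(dx, dy)
    angle = abs(math.degrees(math.atan2(dx, dy)))  # 0 degrees = straight up
    if r <= r_first:
        return "first_button"
    if angle > sector_deg / 2.0:
        return "outside"
    if r_second[0] <= r <= r_second[1]:
        return "second_buttons"
    if r >= r_slider:
        return "slider"
    return "outside"
```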
The one or more interfaces 112 may correspond to one or more inputs and/or outputs for receiving and/or transmitting information, which may be in digital (bit) values according to a specified code, within a module, between modules or between modules of different entities. For example, the one or more interfaces 112 may comprise interface circuitry configured to receive and/or transmit information. In embodiments, the one or more processors 114 may be implemented using one or more processing units, one or more processing devices, any means for processing, such as a processor, a computer or a programmable hardware component being operable with accordingly adapted software. In other words, the described function of the one or more processors 114 may as well be implemented in software, which is then executed on one or more programmable hardware components. Such hardware components may comprise a general-purpose processor, a Digital Signal Processor (DSP), a micro-controller, etc. In at least some embodiments, the one or more storage devices 116 may comprise at least one element of the group of a computer readable storage medium, such as a magnetic or optical storage medium, e.g., a hard disk drive, a flash memory, Floppy-Disk, Random Access Memory (RAM), Programmable Read Only Memory (PROM), Erasable Programmable Read Only Memory (EPROM), an Electronically Erasable Programmable Read Only Memory (EEPROM), or a network storage.
More details and aspects of the system and microscope system are mentioned in connection with the proposed concept or one or more examples described above or below (e.g., Figs. 2 to 6). The system and microscope system may comprise one or more additional optional features corresponding to one or more aspects of the proposed concept or one or more examples described above or below.
Fig. 2 shows a flow chart of an example of a corresponding method for a microscope system, e.g., for the microscope system 100 of Figs. 1a to 1f. The method comprises obtaining 210 image data from an optical imaging sensor of a microscope of the microscope system. The method comprises obtaining 220 a sensor signal from a touch interface of a touch screen of the microscope system. The sensor signal represents a touch input obtained via the touch interface. The method comprises generating 230 a display signal for a display of the touch screen of the microscope system. The display signal comprises a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data. The visual control overlay is controlled via the touch input obtained via the touch interface. The visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation. The visual control overlay further comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button. For example, the method may comprise providing the display signal to the display of the touch screen.
As indicated above, features described in connection with the system 110, the microscope 120 and the microscope system 100 of Figs. 1a to 1f may be likewise applied to the method of Fig. 2.
More details and aspects of the method are mentioned in connection with the proposed concept or one or more examples described above or below (e.g., Figs. 1a to 1f, 3a to 6). The method may comprise one or more additional optional features corresponding to one or more aspects of the proposed concept or one or more examples described above or below.
Various aspects of the present disclosure relate to a radial menu with live background, in the following denoted a "rotary menu" (e.g., the visual control overlay 150 introduced in connection with Figs. 1a to 2). The present disclosure may thus relate to microscope on-screen controls. The User Interface (UI)/User Experience (UX) may enable a selection and control of microscope functions, in an intuitive and clear manner, while keeping a clean UI and where possible showing the immediate effect of the selected changes. For example, the Rotary Menu may be a context menu that is (only) available during surgery and located at the bottom of the Live View (e.g., the representation of the image data). The menu may be opened and closed with a one-button handling.
In the following, the "rotary menu" is shown with respect to an illumination control screen. However, the same concept can be applied to other control screens of the microscope as well, e.g., to a control screen for activating / deactivating external devices, to a control screen for controlling recording of the surgical procedure, or to a control screen for controlling different imaging modes. The shown illumination control screen has a live image from the microscope in the background. A single button (e.g., the first touch-activated button) brings up the radial menu with 4 "petals" (e.g., the one or more second touch-activated buttons), each corresponding to a different function. A further circumferential bar (e.g., the third touch-controlled control element) is displayed depending on the selected function, which can either comprise separate buttons or a slider bar. The screen can detect touch radially outward of the circumferential bar.
For example, the "Rotary Menu" button (e.g., the first touch-activated button) 310 at the bottom changes to an "X" button for closing the menu again. With the rotary menu the user can change simple functions that need to be accessed fast and easily by the user (e.g., change the illumination).
Figs. 3a to 3e show various states of an exemplary rotary menu. The rotary menu comprises the "rotary menu" button 310 (which may correspond to the first touch-activated button 152 of Figs. 1a to 2), to open/close the rotary menu, a slider or toggle buttons 315 (which may correspond to the third touch-controlled control element 156), which is the area to manipulate values, and four buttons 311-314 (e.g., the one or more second touch-activated buttons 154) for changing the content of the slider or toggle buttons 315. As shown in Fig. 3b, above the rotary menu, a description of the currently activated function and a numerical value (if applicable) representing a value of the currently activated function may be shown. For example, the rotary menu may contain four sections: the description, the slider or toggle buttons, the buttons for changing the content of the slider or toggle buttons, and the button or buttons to open/close the rotary menu. In Fig. 3a, a generic representation is shown, where button 310 and button 312 are highlighted. The third touch-controlled control element 315 relates to the function associated with button 312.
In general, the functionalities assigned to the buttons may be shown on the respective buttons, e.g., as text or as pictograms, as shown in Figs. 3a to 3e. The functionalities assigned to buttons 311-314 in Figs. 3b to 3e are "video focus" 311, which may be moved upwards or downwards until a terminal position is reached, "illumination" 312, which may also be set to a value between 0 and 100, "auto focus" 313, which may be set to on or off, and "bright care" 314, which may also be set to on or off.
Using the toggle buttons 315b; 315d beneath the slider, the user can change the parameters that need to be adjusted by the slider. Alternatively, two ON / OFF toggle buttons 315e; 315f may be shown, which function as classic buttons, as do the buttons 311-314. Once pressed, the background may turn "Aquamarine" and the icon of the button may get the "focused" status, for example. For example, if a maximal or minimal value or position is reached, one of the buttons 315b; 315d may be disabled.
In Fig. 3b, the button "illumination" 312 is activated. As the value for "illumination" can be set to a numerical value on a scale (e.g., from 0 to 100%), element 315 is a slider, with a highlighted portion 315a representing the numerical value, and a complementary non-highlighted portion 315c representing the complement to the numerical value. In some examples, buttons 315b and 315d may be shown for decreasing and increasing the value. In Fig. 3b, the buttons are shown within the slider. However, as shown in Fig. 1d, they may also be separate from the slider. Also, the numerical value may be controlled by placing the finger on any position on the slider, with the numerical value being adjusted according to the position of the finger on the slider.
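Mapping the finger position on the slider to a value is a simple linear interpolation. The helper below is a sketch with assumed parameter names; a real implementation would follow the slider's arc rather than a straight line:

```python
def slider_value_from_touch(x: float, slider_left: float, slider_width: float,
                            vmin: float = 0.0, vmax: float = 100.0) -> float:
    """Map a horizontal finger position on the slider to a value."""
    t = (x - slider_left) / slider_width
    t = max(0.0, min(1.0, t))  # clamp to the slider's extent
    return vmin + t * (vmax - vmin)
```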
In Fig. 3c, the button "bright care" 314 is activated. As the value for "bright care" may be set to "on" or "off", area 315 is shown as toggle buttons 315e; 315f, with toggle button 315f (e.g., "on") being active and thus highlighted.
The slider 315 in the middle provides the manipulation of values in two directions (up and down); it can be changed using the slide gesture or by tapping on the "+" or "−" buttons 315b; 315d. The two buttons 315e; 315f on the left and the right side are used to toggle values between two states (currently only on/off).
The slider of the Rotary Menu may be displayed in white in any circumstances. The user may receive direct feedback on the changes by the visual appearance and manipulation of the slider itself. Also, a quick tap on the "+" or "−", or on any other area of the slider, may make fast manipulation possible.
In Fig. 3d, a more detailed view on the touch targets 331-335a/b for actuating the buttons 311-315b/d is shown. The rotary menu, and the description, is center-aligned around a central axis 330. In Fig. 3e, the angles between the different elements are shown. Fig. 3e shows a horizontal axis 340 and a vertical axis 341. The rotary menu shown in Figs. 3a to 3e, or at least the elements 311-315 thereof, creates an arch that spans a circular sector of 156 degrees of a circle being spanned around button 310. Thus, in the example, the angle between the horizontal axis 340 and the elements 314; 315 is 22 degrees (the same between the horizontal axis 340 and the elements 311; 315 on the other side). An angle between gaps separating the buttons 311-315 is 34 degrees in the example.
Fig. 3f shows two different states for button 310 according to an example. Button 310a may be shown when the rotary menu is collapsed, while button 310b may be shown when the rotary menu is open, as shown in Figs. 3a to 3e.
Fig. 4 shows an exemplary placement of the rotary menu with first touch-controlled button 410, four second touch-controlled buttons 420 and two third touch-controlled elements 430 on top of a live view 440. On top of the rotary menu, the description of the functionality is shown. As shown in Fig. 4, the rotary menu may be constrained to the bottom half of the screen, and may be centered horizontally.
Fig. 5 shows an exemplary transition between different states of the rotary menu. Fig. 5 shows a first view 510, where the rotary menu is collapsed, a second view 520, in which the illumination button is activated, with the illumination being controlled via the slider, and a third view 530, in which the autofocus button is activated, with the choice being between "on" and "off". Once the rotary menu button on the bottom of the menu is pressed, the menu collapses again, and the first view 510 is shown again. More details and aspects of the rotary menu are mentioned in connection with the proposed concept or one or more examples described above or below (e.g., Figs. 1a to 2, 6). The rotary menu may comprise one or more additional optional features corresponding to one or more aspects of the proposed concept or one or more examples described above or below.
As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items and may be abbreviated as "/".
Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step. Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
Some embodiments relate to a microscope comprising a system as described in connection with one or more of the Figs. 1 to 5. Alternatively, a microscope may be part of or connected to a system as described in connection with one or more of the Figs. 1 to 5. Fig. 6 shows a schematic illustration of a system 600 configured to perform a method described herein. The system 600 comprises a microscope 610 and a computer system 620. The microscope 610 is configured to take images and is connected to the computer system 620. The computer system 620 is configured to execute at least a part of a method described herein. The computer system 620 may be configured to execute a machine learning algorithm. The computer system 620 and microscope 610 may be separate entities but can also be integrated together in one common housing. The computer system 620 may be part of a central processing system of the microscope 610 and/or the computer system 620 may be part of a subcomponent of the microscope 610, such as a sensor, an actuator, a camera or an illumination unit, etc. of the microscope 610.
The computer system 620 may be a local computer device (e.g., personal computer, laptop, tablet computer or mobile phone) with one or more processors and one or more storage devices or may be a distributed computer system (e.g., a cloud computing system with one or more processors and one or more storage devices distributed at various locations, for example, at a local client and/or one or more remote server farms and/or data centers). The computer system 620 may comprise any circuit or combination of circuits. In one embodiment, the computer system 620 may include one or more processors which can be of any type. As used herein, processor may mean any type of computational circuit, such as but not limited to a microprocessor, a microcontroller, a complex instruction set computing (CISC) microprocessor, a reduced instruction set computing (RISC) microprocessor, a very long instruction word (VLIW) microprocessor, a graphics processor, a digital signal processor (DSP), multiple core processor, a field programmable gate array (FPGA), for example, of a microscope or a microscope component (e.g., camera) or any other type of processor or processing circuit. Other types of circuits that may be included in the computer system 620 may be a custom circuit, an application-specific integrated circuit (ASIC), or the like, such as, for example, one or more circuits (such as a communication circuit) for use in wireless devices like mobile telephones, tablet computers, laptop computers, two-way radios, and similar electronic systems. The computer system 620 may include one or more storage devices, which may include one or more memory elements suitable to the particular application, such as a main memory in the form of random access memory (RAM), one or more hard drives, and/or one or more drives that handle removable media such as compact disks (CD), flash memory cards, digital video disk (DVD), and the like. 
The computer system 620 may also include a display device, one or more speakers, and a keyboard and/or controller, which can include a mouse, trackball, touch screen, voice-recognition device, or any other device that permits a system user to input information into and receive information from the computer system 620.
Some or all of the method steps may be executed by (or using) a hardware apparatus, like, for example, a processor, a microprocessor, a programmable computer or an electronic circuit. In some embodiments, one or more of the most important method steps may be executed by such an apparatus.
Depending on certain implementation requirements, embodiments of the invention can be implemented in hardware or in software. The implementation can be performed using a non-transitory storage medium such as a digital storage medium, for example a floppy disc, a DVD, a Blu-Ray, a CD, a ROM, a PROM, an EPROM, an EEPROM or a FLASH memory, having electronically readable control signals stored thereon, which cooperate (or are capable of cooperating) with a programmable computer system such that the respective method is performed. Therefore, the digital storage medium may be computer readable. Some embodiments according to the invention comprise a data carrier having electronically readable control signals, which are capable of cooperating with a programmable computer system, such that one of the methods described herein is performed.
Generally, embodiments of the present invention can be implemented as a computer program product with a program code, the program code being operative for performing one of the methods when the computer program product runs on a computer. The program code may, for example, be stored on a machine readable carrier.
Other embodiments comprise the computer program for performing one of the methods described herein, stored on a machine readable carrier.
In other words, an embodiment of the present invention is, therefore, a computer program having a program code for performing one of the methods described herein, when the computer program runs on a computer.
A further embodiment of the present invention is, therefore, a storage medium (or a data carrier, or a computer-readable medium) comprising, stored thereon, the computer program for performing one of the methods described herein when it is performed by a processor. The data carrier, the digital storage medium or the recorded medium are typically tangible and/or non-transitory. A further embodiment of the present invention is an apparatus as described herein comprising a processor and the storage medium.
A further embodiment of the invention is, therefore, a data stream or a sequence of signals representing the computer program for performing one of the methods described herein. The data stream or the sequence of signals may, for example, be configured to be transferred via a data communication connection, for example, via the internet.
A further embodiment comprises a processing means, for example, a computer or a programmable logic device, configured to, or adapted to, perform one of the methods described herein. A further embodiment comprises a computer having installed thereon the computer program for performing one of the methods described herein.
A further embodiment according to the invention comprises an apparatus or a system configured to transfer (for example, electronically or optically) a computer program for performing one of the methods described herein to a receiver. The receiver may, for example, be a computer, a mobile device, a memory device or the like. The apparatus or system may, for example, comprise a file server for transferring the computer program to the receiver.
In some embodiments, a programmable logic device (for example, a field programmable gate array) may be used to perform some or all of the functionalities of the methods described herein. In some embodiments, a field programmable gate array may cooperate with a microprocessor in order to perform one of the methods described herein. Generally, the methods are preferably performed by any hardware apparatus.
List of reference Signs
100 Microscope system
105 Base unit
110 System
112 One or more interfaces
114 One or more processors
120 Microscope
130 Touch screen
140 Representation of image data
150 Visual control overlay
152 First touch-activated button
154, 154a-d One or more second touch-activated buttons
156, 156a-g Third touch-controlled control element
160 Arm
170 Handles
210 Obtaining image data
220 Obtaining a sensor signal
230 Generating a display signal
310 Rotary menu button
311-314 Function buttons
315 Slider
315a Highlighted portion
315b Button
315c Non-highlighted portion
315d Button
315e Button
315f Button
320 Description
330 Central axis
331-335 Touch targets
340 Horizontal axis
341 Vertical axis
Line indicating angle between horizontal axis and buttons/slider
Line indicating angle between buttons
First touch-controlled button
Second touch-controlled buttons
Third touch-controlled elements
Live view
510 First view
520 Second view
530 Third view
600 System
610 Microscope
620 Computer system

Claims
1. A system (110; 620) for a microscope system (100; 600), the system comprising one or more processors (114) and one or more storage devices (116), wherein the system is configured to: obtain image data from an optical imaging sensor of a microscope (120; 610) of the microscope system; obtain a sensor signal from a touch interface of a touch screen (130) of the microscope system, the sensor signal representing a touch input obtained via the touch interface; generate a display signal for a display of the touch screen of the microscope system, the display signal comprising a continuously updated representation (140) of the image data and a visual control overlay (150) being overlaid over the representation of the image data, the visual control overlay being controlled via the touch input obtained via the touch interface, wherein the visual control overlay comprises a first touch-activated button (152; 310) that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation, and wherein the visual control overlay comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.
2. The system according to claim 1, wherein the visual control overlay comprises one or more second touch-activated buttons (154; 311; 312; 313; 314) that are arranged adjacent to the first touch-activated button.
3. The system according to claim 1, wherein the one or more second touch-activated buttons are fanned out around the first touch-activated button.
4. The system according to one of the claims 2 or 3, wherein the microscope system is suitable for performing imaging in two or more different imaging modes, wherein the system is configured to generate the visual control overlay such that a functionality associated with the one or more second touch-activated buttons is dependent on an imaging mode being performed by the microscope system.
5. The system according to one of the claims 2 to 4, wherein the system is configured to generate the visual control overlay such that each of the one or more second touch-activated buttons is associated with a single functionality of the microscope system.
6. The system according to one of the claims 2 to 5, wherein the visual control overlay comprises, at least if one of the one or more second touch-activated buttons is actuated via the touch interface, a third touch-controlled control element (156; 315), with the one or more second touch-activated buttons being arranged between the third touch-controlled control element and the first touch-activated button.
7. The system according to claim 6, wherein the one or more second touch-activated buttons and the third touch-controlled control element are spread out radially around the first touch-activated button, and/or wherein the one or more second touch-activated buttons and the third touch-controlled control element are contained within a circular sector of a circle being spanned around the first touch-activated button, the circular sector having an angle of less than 180 degrees.
8. The system according to one of the claims 6 or 7, wherein the system is configured to generate the visual control overlay such that the third touch-controlled control element comprises at least one of a button, a toggle and a slider control.
9. The system according to one of the claims 6 to 8, wherein the system is configured to toggle between showing and hiding the third touch-controlled control element in the visual control overlay upon actuating one of the one or more second touch-activated buttons.
10. The system according to one of the claims 6 to 9, wherein the system is configured to control the visual control overlay via the touch input obtained via the touch interface, wherein touch input that is registered radially outward of the third touch-controlled control element is used to control the third touch-controlled control element.
11. The system according to one of the claims 1 to 10, wherein the system is configured to generate the display signal such that the first touch-activated button is constantly shown overlaid over the representation of the image data, and/or wherein the system is configured to generate the visual control overlay such that the visual control overlay is constrained within a bottom half of the display, and/or wherein the system is configured to generate the visual control overlay such that the first touch-activated button covers at most 5% of an overall surface of the display, and/or wherein the system is configured to generate the visual control overlay such that the visual control overlay covers at most 30% of an overall surface of the display.
12. The system according to one of the claims 1 to 11, wherein touch-activated control elements of the visual control overlay have an active state and a passive state, wherein the system is configured to generate the visual control overlay such that the touch-activated control elements are shown in different colors depending on their state.
13. A surgical microscope system (100; 600) comprising the system (110; 620) according to one of the claims 1 to 12, the microscope (120; 610) and the display (130).
14. A method for a microscope system, the method comprising: obtaining (210) image data from an optical imaging sensor of a microscope of the microscope system; obtaining (220) a sensor signal from a touch interface of a touch screen of the microscope system, the sensor signal representing a touch input obtained via the touch interface; generating (230) a display signal for a display of the touch screen of the microscope system, the display signal comprising a continuously updated representation of the image data and a visual control overlay being overlaid over the representation of the image data, the visual control overlay being controlled via the touch input obtained via the touch interface, wherein the visual control overlay comprises a first touch-activated button that is configured to show or hide the remaining touch-activated control elements of the visual control interface upon actuation and wherein the visual control overlay comprises at least one further touch-controlled control element being shown or hidden based on the actuation of the first touch-activated button.
15. A computer program with a program code for performing the method according to claim 14 when the computer program is executed on a processor.
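The sector-constrained layout of claim 7 and the radial touch-targeting of claim 10 admit a simple geometric illustration: a touch lying within the sector spanned around the first touch-activated button, at or beyond the radius of the third touch-controlled control element, is routed to that element. The coordinate convention and all function names below are illustrative assumptions:

```python
import math

# Sketch of radial hit-testing around the first touch-activated button:
# controls occupy a circular sector of less than 180 degrees (claim 7),
# and touch input registered radially outward of the third touch-controlled
# control element (e.g. a slider arc) still controls it (claim 10).

def touch_controls_slider(touch, center, slider_radius,
                          sector_start_deg, sector_end_deg):
    """True if the touch lies within the sector at or beyond the slider arc."""
    dx, dy = touch[0] - center[0], touch[1] - center[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360  # angle around the first button
    distance = math.hypot(dx, dy)                   # radial distance from its center
    in_sector = sector_start_deg <= angle <= sector_end_deg
    return in_sector and distance >= slider_radius
```

With the first button at the origin, a slider arc of radius 100 and a 90-degree sector, a touch at (150, 0) controls the slider even though it lies outside the arc itself, while a touch inside the arc or outside the sector does not.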
PCT/EP2022/056886 2021-03-18 2022-03-16 Microscope system and corresponding system, method and computer program WO2022194965A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22715612.2A EP4308990A1 (en) 2021-03-18 2022-03-16 Microscope system and corresponding system, method and computer program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021106671 2021-03-18
DE102021106671.9 2021-03-18

Publications (1)

Publication Number Publication Date
WO2022194965A1 true WO2022194965A1 (en) 2022-09-22

Family

ID=81327833

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/056886 WO2022194965A1 (en) 2021-03-18 2022-03-16 Microscope system and corresponding system, method and computer program

Country Status (2)

Country Link
EP (1) EP4308990A1 (en)
WO (1) WO2022194965A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170196453A1 (en) * 2016-01-13 2017-07-13 Novartis Ag Apparatuses and methods for parameter adjustment in surgical procedures
US20190099226A1 (en) * 2017-10-04 2019-04-04 Novartis Ag Surgical suite integration and optimization
US20190294317A1 (en) * 2018-03-26 2019-09-26 Microscopes International, Llc Interface for display of multi-layer images in digital microscopy
US20210018741A1 (en) * 2018-03-23 2021-01-21 Leica Microsystems Cms Gmbh Microscope System and Method for Controlling a Microscope System of this Type


Also Published As

Publication number Publication date
EP4308990A1 (en) 2024-01-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22715612; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 2022715612; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2022715612; Country of ref document: EP; Effective date: 20231018)