WO2021165361A1 - Graphical user interface handling a plurality of visualisation devices


Info

Publication number
WO2021165361A1
Authority
WO
WIPO (PCT)
Prior art keywords
visualisation, monitor device, user interface, graphical user, image data
Application number
PCT/EP2021/053964
Other languages
French (fr)
Inventor
Line Sandahl UBBESEN
Original Assignee
Ambu A/S
Application filed by Ambu A/S
Publication of WO2021165361A1

Classifications

    • A: HUMAN NECESSITIES
      • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
            • A61B 1/00002: Operational features of endoscopes
              • A61B 1/00039: Operational features of endoscopes provided with input arrangements for the user
                • A61B 1/0004: Operational features of endoscopes provided with input arrangements for the user for electronic operation
              • A61B 1/00043: Operational features of endoscopes provided with output arrangements
                • A61B 1/00045: Display arrangement
                  • A61B 1/0005: Display arrangement combining images e.g. side-by-side, superimposed or tiled
              • A61B 1/00059: Operational features of endoscopes provided with identification means for the endoscope
    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                  • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
          • G06F 2203/00: Indexing scheme relating to G06F3/00 - G06F3/048
            • G06F 2203/048: Indexing scheme relating to G06F3/048
              • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas
      • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H 30/00: ICT specially adapted for the handling or processing of medical images
            • G16H 30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
          • G16H 40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H 40/60: ICT specially adapted for the operation of medical equipment or devices
              • G16H 40/63: ICT specially adapted for the operation of medical equipment or devices for local operation
              • G16H 40/67: ICT specially adapted for the operation of medical equipment or devices for remote operation
    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N 1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
            • H04N 1/0035: User-machine interface; Control console
              • H04N 1/00405: Output means
                • H04N 1/00408: Display of information to the user, e.g. menus
                  • H04N 1/00411: Display of information to the user, the display also being used for user input, e.g. touch screen
                  • H04N 1/00413: Display of information to the user using menus, i.e. presenting the user with a plurality of selectable options
                    • H04N 1/00416: Multi-level menus
                      • H04N 1/00419: Arrangements for navigating between pages or parts of the menu
                        • H04N 1/00424: Arrangements for navigating between pages or parts of the menu using a list of graphical elements, e.g. icons or icon bar

Definitions

  • The present disclosure relates to a visualisation device, such as an endoscope, and to a medical visualisation system, such as an endoscope system, comprising a visualisation device. More specifically, the present disclosure relates to a graphical user interface and a monitor device having such a graphical user interface for interacting with the medical visualisation system.
  • a visualisation device may be utilized to visually examine certain areas of the body of a person, such as inside a body cavity of the person.
  • a visualisation device may be used to inspect the airways, the digestive tract, or the intestines.
  • A visualisation device may be provided with a camera and be attached to a monitor device, such as a monitor with a display screen. A video output from the camera of the visualisation device may be received and displayed at the monitor device, thereby allowing an operator to control the visualisation device to inspect an area of interest.
  • a visualisation device may be an endoscope, such as a disposable endoscope.
  • An endoscope comprises an operating handle at the proximal end and an insertion tube extending from the handle towards the distal end.
  • the handle is configured to be held by an operator and inter alia comprises externally protruding operating members connected to internal control means allowing the operator to control the movement of a bending section at the distal end of the insertion tube, while advancing the distal end of the insertion tube to a desired location e.g. within a body cavity of a person.
  • Using an attached monitor device, such as a monitor with a display screen, the location to which the distal end has been advanced may be inspected with the endoscope.
  • the monitor device of a medical visualisation system may be provided with some functionality, such as ability to save still images and/or video sequences of the view from the attached visualisation device. Furthermore, the monitor device may comprise some image processing capabilities, and may be configured to output a video or image output, e.g. to an external display.
  • It may be desired to use a plurality of visualisation devices, e.g. a video laryngoscope and an endoscope, simultaneously, e.g. to inspect different positions within a body cavity of the person.
  • the video laryngoscope may be used to visually aid the operator in inserting an endoscope into the airways.
  • a second inserted visualisation device may be used to confirm the anticipated position of a first inserted visualisation device.
  • the present disclosure relates to a visualisation device, such as an endoscope, and a visualisation system, such as an endoscope system.
  • the visualisation device may be a disposable camera endoscope.
  • the visualisation device may be a video laryngoscope, an endotracheal tube and/or a laryngeal mask.
  • the visualisation system may further comprise a monitor device for being connected to the visualisation device, e.g. the monitor device may be configured to receive image data from the visualisation device.
  • the present disclosure further relates to a graphical user interface for such monitor device of a medical visualisation system.
  • a medical visualisation system and a method performed at a monitor device of the medical visualisation system are disclosed.
  • the medical visualisation system comprises a visualisation device, such as an endoscope, such as a disposable endoscope.
  • the visualisation device may be a video laryngoscope, an endotracheal tube and/or a laryngeal mask.
  • the visualisation device has an image sensor configured to generate image data indicative of a view from the visualisation device.
  • the medical visualisation system may comprise a plurality of visualisation devices each having an image sensor configured to generate image data indicative of a view from the visualisation device.
  • the plurality of visualisation devices may include a first visualisation device and/or a second visualisation device.
  • the first visualisation device may comprise a first image sensor configured to generate image data indicative of a view from the first visualisation device.
  • the second visualisation device may comprise a second image sensor configured to generate image data indicative of a view from the second visualisation device.
  • the image sensor(s) may be any sensor capable of detecting and conveying information used to make an image.
  • the image sensor(s) may comprise a CCD or CMOS sensor, or similar.
  • the image sensor(s) may generate image data corresponding to a square image, i.e. having equal height and width.
  • the image data generated by the image sensors may correspond to a 300x300 pixel image, or a 400x400 pixel image, or a 600x600 pixel image, or an 800x800 pixel image.
  • the image sensor may generate image data corresponding to a non-square image, which is cropped to form a square image, such as a square image having 300x300 pixels, 400x400 pixels, 600x600 pixels, or 800x800 pixels.
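  • As an illustration of the cropping mentioned above, the following minimal sketch centre-crops a non-square frame to a square of configurable side length. It assumes the frame is held as a NumPy array; the function name and the centre-crop strategy are illustrative assumptions and are not taken from the disclosure, which only states the resulting square sizes.

        import numpy as np

        def centre_crop_square(frame: np.ndarray, side: int = 400) -> np.ndarray:
            """Centre-crop a (height, width, channels) frame to a side x side square."""
            height, width = frame.shape[:2]
            top = max((height - side) // 2, 0)
            left = max((width - side) // 2, 0)
            return frame[top:top + side, left:left + side]

        # Example: crop a hypothetical 400x640 sensor frame to a 400x400 square image.
        frame = np.zeros((400, 640, 3), dtype=np.uint8)
        assert centre_crop_square(frame, side=400).shape[:2] == (400, 400)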
  • the medical visualisation system further comprises a monitor device receiving and/or being operable to receive the image data generated by the image sensor.
  • The monitor device may receive and/or be operable to receive the image data, e.g. as the image data is being generated, e.g. within the limitations of the hardware.
  • the monitor device comprises a first housing extending in a first direction from a first housing side to a second housing side and in a second direction perpendicular to the first direction from a third housing side to a fourth housing side.
  • the monitor device comprises a display, e.g. a touch sensitive display.
  • the display may be accommodated in the first housing.
  • The display may have a first length in the first direction and a second length in the second direction. The second length may be longer than the first length, e.g. the display may be a 16:9 or 16:10 display.
  • the first length may be longer than the second length, or the first length and the second length may be substantially the same.
  • the touch sensitive display may be any suitable type of touch display, e.g. capacitive touch display or resistive touch display.
  • the monitor device may comprise one or more connection ports configured to receive a connector of the visualisation device.
  • The connection ports and the corresponding connector of the visualisation device may be proprietary plug-and-socket connectors, or any standard connector capable of transmitting therethrough at least the image data from the image sensor.
  • the connector and connection ports may be configured to supply power to the components of the visualisation device.
  • the one or more connection ports may be arranged on the first housing.
  • the one or more connection ports may be provided on the third housing side, and/or on the fourth housing side.
  • the monitor device may comprise an on/off button.
  • the on/off button may be arranged on the first housing.
  • the on/off button may be provided on the third housing side or on the fourth housing side.
  • the one or more connection ports may be provided on the third housing side and the on/off button may be provided on the fourth housing side.
  • the monitor device may establish connection to a visualisation device, such as the first visualisation device and/or the second visualisation device.
  • Establishing connection to the visualisation device may include receiving a device connector of the respective visualisation device in a connection port of the one or more connection ports of the monitor device.
  • Establishing connection to a visualisation device may include obtaining device identifier information from a device identifier (e.g. EPROM, QR-code, NFC, RFID or similar) of the visualisation device.
  • establishing connection to the first visualisation device may include obtaining first device identifier information from a first device identifier of the first visualisation device and/or establishing connection to the second visualisation device may include obtaining second device identifier information from a second device identifier of the second visualisation device.
  • the monitor device may comprise a processing unit and memory.
  • the processing unit and/or the memory may be accommodated in the first housing.
  • the monitor device may comprise a second housing, and the processing unit and/or the memory may be accommodated in the second housing.
  • the monitor device may comprise an orientation sensor, e.g. for determining the orientation of the monitor device, such as of the first housing, relative to gravity.
  • the orientation sensor may comprise one or more accelerometers and/or a gyroscope.
  • the orientation sensor may be accommodated in the first housing.
  • the processing unit may be connected to the touch sensitive display to control display of information with the touch sensitive display, and the processing unit may be adapted to receive a signal from the touch sensitive display indicative of touch inputs on the touch sensitive display.
  • the monitor device may detect user inputs, e.g. in the form of touch inputs, with the touch sensitive display.
  • Touch inputs may, for example, comprise single tap(s), double tap(s), or swipe(s) on the touch sensitive display.
  • the processing unit may be connected to the orientation sensor to receive an orientation signal indicative of the orientation of the monitor device, such as of the first housing of the monitor device.
  • the processing unit may be connected to the memory and be adapted to read and write data from and to the memory.
  • the monitor device may comprise a power unit for powering the monitor device.
  • the power unit may comprise a rechargeable battery and/or a power connection for connecting the power unit to an external power supply, such as a conventional AC power socket.
  • the power unit may be accommodated in the first housing. Alternatively, the power unit may be accommodated in the second housing.
  • the monitor device may comprise a graphical user interface.
  • the monitor device and/or the processing unit of the monitor device may display with the touch sensitive display the graphical user interface.
  • the graphical user interface may comprise one or more portions, such as a plurality of portions.
  • the portions may be non-overlapping portions, such as a plurality of non-overlapping portions.
  • the portions may include a first portion and/or a second portion.
  • the portions may further include a third portion and/or a fourth portion.
  • the second portion and/or the fourth portion may be designated as background portions, e.g. the second portion may be a first background portion and/or the fourth portion may be a second background portion.
  • Each of the plurality of portions may extend substantially throughout the first length in the first direction.
  • the first portion may be arranged between the fourth portion and the second portion along the second direction.
  • the fourth portion may be arranged between the third portion and the first portion along the second direction.
  • the third portion may be arranged between a side of the first housing, e.g. the third housing side, and the fourth portion along the second direction.
  • the second portion may be arranged between another side of the first housing, e.g. the fourth housing side, and the first portion along the second direction.
  • the first portion and the fourth portion may be arranged between the second portion and the third portion along the second direction.
  • the first portion of the graphical user interface may be square.
  • the first portion of the graphical user interface may occupy the centre of the touch sensitive display.
  • the first portion of the graphical user interface may be larger along the second direction than the second portion, the third portion and/or the fourth portion, individually and/or collectively.
  • the first portion of the graphical user interface may extend throughout more than 40% of the second length in the second direction, such as more than 50% of the second length in the second direction, such as more than 60% of the second length in the second direction.
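  • To make the portion geometry above concrete, the following sketch computes a hypothetical layout for a landscape 1280x720 display: a square first portion flanked by the third and fourth portions on one side and the second portion on the other, with each portion spanning the full first length. The resolution and the split of the remaining width are assumptions for illustration only.

        from dataclasses import dataclass

        @dataclass
        class Rect:
            x: int  # offset along the second direction (width)
            w: int  # extent along the second direction
            h: int  # extent along the first direction (full first length)

        def layout_portions(display_w: int = 1280, display_h: int = 720) -> dict:
            first_w = display_h                        # square first portion
            remaining = display_w - first_w
            second_w = remaining // 2                  # actionable items
            third_w = remaining // 4                   # menu items, battery, time
            fourth_w = remaining - second_w - third_w  # background / secondary live view
            return {
                "third": Rect(0, third_w, display_h),
                "fourth": Rect(third_w, fourth_w, display_h),
                "first": Rect(third_w + fourth_w, first_w, display_h),
                "second": Rect(third_w + fourth_w + first_w, second_w, display_h),
            }

        portions = layout_portions()
        assert portions["first"].w > 0.4 * 1280  # first portion spans more than 40% of the second length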
  • the monitor device may display a live representation of the image data, e.g. within the first portion of the graphical user interface.
  • the live representation of the image data may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the first portion of the graphical user interface.
  • The visualisation device, e.g. an endoscope, may comprise a handle and an elongated flexible member extending from the handle to a distal end.
  • the image sensor may be arranged at the distal end of the elongated flexible member.
  • the image data may be indicative of a view from the distal end of the elongated flexible member.
  • the handle may comprise a control button adapted to receive an input in a first input direction and/or in a second input direction.
  • the first input direction and the second input direction may be opposite.
  • the touch input in the first input direction may cause a distal portion of the elongated flexible member to bend in a first bending direction and/or may cause movement of the image sensor in a first image sensor direction.
  • the touch input in the second input direction may cause the distal portion of the elongated flexible member to bend in a second bending direction and/or may cause movement of the image sensor in a second image sensor direction.
  • the live representation of the image data may have directions corresponding to directions of the image sensor generating the image data.
  • the first bending direction may correspond to a first image direction of a representation of the image data, such as the live representation of the image data.
  • the second bending direction may correspond to a second image direction of the representation of the image data, such as the live representation of the image data.
  • the first image direction and/or the second image direction may be parallel to the first direction of the first housing.
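  • The relationship described above between an input on the control button, the resulting bending direction and the corresponding image direction of the live representation can be sketched as a simple mapping. The direction names and the convention that the first direction appears as "up" on the live representation are illustrative assumptions.

        # Hypothetical mapping of control-button inputs to bending directions and,
        # in turn, to image directions of the live representation.
        INPUT_TO_BENDING = {"first_input": "first_bending", "second_input": "second_bending"}
        BENDING_TO_IMAGE = {"first_bending": "image_up", "second_bending": "image_down"}

        def on_control_button(input_direction: str) -> str:
            """Return the image direction associated with a control-button input."""
            return BENDING_TO_IMAGE[INPUT_TO_BENDING[input_direction]]

        assert on_control_button("first_input") == "image_up"
        assert on_control_button("second_input") == "image_down"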
  • the monitor device may provide one or more actionable items.
  • One or more actionable items may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the second portion of the graphical user interface.
  • One or more actionable menu items may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the third portion of the graphical user interface.
  • a battery indicator may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the third portion of the graphical user interface.
  • a time indicator may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the third portion of the graphical user interface.
  • the one or more actionable items may comprise an image capture button and/or a video capture button.
  • In response to detecting activation of the image capture button, e.g. by a user providing a touch input, e.g. a single tap, at the respective location of the touch sensitive display, an image data file corresponding to the image data received when the image capture button was activated may be stored, e.g. in memory of the monitor device.
  • In response to detecting activation of the video capture button, e.g. by a user providing a touch input, e.g. a single tap, a video sequence of image data corresponding to the image data received when the video capture button was activated may be stored, e.g. in memory of the monitor device.
  • a first activation of the video capture button may start collection of image data for the video sequence, and a second activation of the video capture button (subsequent to the first activation of the video capture button) may stop the collection of image data for the video sequence.
  • the stored video sequence may correspond to the image data received between the first activation and the second activation of the video capture button.
  • the video capture button may be displayed in a first appearance prior to the first activation and after the second activation.
  • the video capture button may be displayed in a second appearance after the first activation and before the second activation.
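  • A minimal sketch of the video capture button behaviour described above: the first activation starts collecting image data and switches the button to its second appearance, and the next activation stops the collection, returns the stored sequence and restores the first appearance. Class and attribute names are assumptions.

        class VideoCaptureButton:
            """Toggle-style video capture button (illustrative sketch)."""

            def __init__(self):
                self.recording = False
                self.appearance = "first"  # e.g. an idle icon
                self.frames = []

            def on_frame(self, frame):
                # Called for each frame of image data received from the image sensor.
                if self.recording:
                    self.frames.append(frame)

            def activate(self):
                if not self.recording:
                    self.recording = True        # first activation: start collecting
                    self.appearance = "second"   # e.g. a 'recording' icon
                    self.frames = []
                    return None
                self.recording = False           # second activation: stop and store
                self.appearance = "first"
                return list(self.frames)

        button = VideoCaptureButton()
        button.activate()
        button.on_frame("frame-1")
        button.on_frame("frame-2")
        assert button.activate() == ["frame-1", "frame-2"]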
  • the monitor device may be adapted to establish connection to a first visualisation device of a plurality of visualisation devices.
  • The plurality of visualisation devices may be substantially similar, or the plurality of visualisation devices may comprise visualisation devices of different types.
  • the first visualisation device may be an endoscope and a second visualisation device may be a laryngoscope.
  • the first visualisation device may be an endoscope and the second visualisation device may be an endoscope.
  • the first visualisation device may be a laryngoscope and the second visualisation device may be an endoscope.
  • the first visualisation device may be a medical airway device such as an endotracheal tube or a laryngeal mask and the second visualisation device may be an endoscope.
  • the first visualisation device may be an endoscope and the second visualisation device may be a medical airway device such as an endotracheal tube or a laryngeal mask.
  • the monitor device may display, e.g. with the touch sensitive display, within the first portion of the graphical user interface, a first live representation of first image data generated by a first image sensor of the first visualisation device.
  • While the first visualisation device is connected to the monitor device, the monitor device may be further adapted to establish connection to a second visualisation device of the plurality of visualisation devices.
  • In response to establishing the connection to the second visualisation device, the monitor device concurrently displays a second live representation of second image data generated by a second image sensor of the second visualisation device and the first live representation of first image data generated by the first image sensor of the first visualisation device.
  • the second live representation is displayed in the fourth portion and extending into the first portion of the graphical user interface.
  • the first live representation is displayed in the second portion and extending into the first portion of the graphical user interface.
  • the first live representation is displayed in reduced size compared to the second live representation.
  • Thereby, a monitor device for an endoscope system and a method are provided which allow handling of multiple simultaneously connected visualisation devices by providing a dual-view mode. Consequently, the present disclosure provides the user with the ability to display input from two different scopes, to inspect two positions simultaneously, and to view the two inputs on the same display. Importantly, the present disclosure provides logic for determining which visualisation device to display largest and which to show smallest.
  • the disclosure provides a solution for altering the displayed content, in a predictive manner and with minimum visual disturbance for the user, upon connection of an additional endoscope and/or disconnection of one of two endoscopes.
  • the system facilitates the user in continuing an ongoing procedure while connecting or disconnecting additional endoscopes.
  • the monitor device may comprise a plurality of connection ports including a first connection port and a second connection port.
  • a first connector of the first visualisation device may be received by the first connection port.
  • a second connector of the second visualisation device may be received by the second connection port.
  • the live representations may be overlaid with an indicator.
  • a first indicator may be overlaid on the first live representation and/or a second indicator may be overlaid on the second live representation.
  • the first connection port may be labelled with a first port indicator resembling the first indicator.
  • the second connection port may be labelled with a second port indicator resembling the second indicator.
  • the monitor device may further display a rearrange icon.
  • the monitor device may be adapted, e.g. with the touch sensitive display, to detect a first user input corresponding to selection of the rearrange icon.
  • In response to detecting the first user input, the monitor device may replace display of the second live representation, in the fourth portion and extending into the first portion of the graphical user interface, with display of the first live representation, and/or the monitor device may replace display of the first live representation, in the second portion and extending into the first portion of the graphical user interface, with display of the second live representation.
  • the second live representation may be displayed in reduced size compared to the first live representation.
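  • Putting the dual-view behaviour together, the following sketch keeps track of which live representation is shown large (in the fourth portion, extending into the first portion) and which is shown reduced (in the second portion, extending into the first portion): a newly connected device takes the large slot, selecting the rearrange icon swaps the two, and on disconnection the remaining device returns to full size. The class and its names are assumptions; the disclosure describes the behaviour, not this data structure.

        class DualViewManager:
            """Illustrative bookkeeping for the single-view and dual-view modes."""

            def __init__(self):
                self.large = None  # shown in the fourth portion, extending into the first
                self.small = None  # shown reduced in the second portion

            def connect(self, device_id):
                if self.large is None:
                    self.large = device_id   # single view: fills the first portion
                else:
                    self.small = self.large  # previously connected device is reduced
                    self.large = device_id   # newly connected device is shown largest

            def rearrange(self):
                if self.small is not None:   # rearrange icon swaps the two views
                    self.large, self.small = self.small, self.large

            def disconnect(self, device_id):
                if device_id == self.large:  # remaining device shown full size
                    self.large, self.small = self.small, None
                elif device_id == self.small:
                    self.small = None

        views = DualViewManager()
        views.connect("first scope")
        views.connect("second scope")    # dual view: second scope shown largest
        views.rearrange()                # first scope now shown largest
        views.disconnect("first scope")
        assert views.large == "second scope" and views.small is None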
  • the monitor device may open a first procedure session corresponding to the first device identifier information obtained from the first device identifier of the first visualisation device.
  • The monitor device may open a second procedure session corresponding to the second device identifier information obtained from the second device identifier of the second visualisation device.
  • a procedure session may be created for each connected visualisation device.
  • the monitor device may associate the first image file with the first procedure session; and/or associate the second image file with the second procedure session.
  • A procedure session may be implemented by creating a folder in the file system of the monitor device, wherein image files and video sequences obtained from a visualisation device are stored in the folder corresponding to that visualisation device.
  • Hence, association of an image file with a procedure session may be implemented by storing the image file in the folder of the procedure session. Opening a procedure session may further comprise creating a log, registering the time and date for initiating the procedure, registering information about the visualisation device, registering the software version and/or other information.
  • the monitor device may determine, based on the device identifier information, whether the visualisation device has been previously connected to the monitor device. For example, in accordance with determining that the visualisation device has previously been connected to the monitor device, the monitor device may reopen the procedure session corresponding to the device identifier information; and/or in accordance with determining that the visualisation device has not previously been connected to the monitor device, the monitor device may create the procedure session, e.g. a new procedure session, corresponding to the device identifier information.
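  • A sketch of the procedure-session handling described above, assuming sessions are implemented as per-device folders keyed by a serial number read from the device identifier: a previously connected device reopens its existing session, while a new device gets a newly created session with a small log. The paths, the log format and the function names are illustrative assumptions.

        import json
        import time
        from pathlib import Path

        def open_procedure_session(root: Path, device_info: dict) -> Path:
            """Reopen or create the session folder for the given device identifier information."""
            session_dir = root / device_info["serial_number"]
            if session_dir.exists():
                return session_dir            # previously connected: reopen the session
            session_dir.mkdir(parents=True)   # not previously connected: create the session
            log = {
                "started": time.strftime("%Y-%m-%d %H:%M:%S"),
                "device": device_info,        # e.g. type, brand, version, batch number
                "software_version": "1.0.0",  # assumed placeholder
            }
            (session_dir / "session_log.json").write_text(json.dumps(log, indent=2))
            return session_dir

        def associate_file(session_dir: Path, name: str, data: bytes) -> Path:
            """Associate an image file or video sequence with a session by storing it in the session folder."""
            path = session_dir / name
            path.write_bytes(data)
            return path

        # Example with a hypothetical device identifier read from an EPROM.
        session = open_procedure_session(Path("sessions"), {"serial_number": "SN-0001", "type": "endoscope"})
        associate_file(session, "image_001.raw", b"...")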
  • a third user input e.g. with the touch sensitive display, corresponding to selection of a first actionable menu item of the one or more actionable menu items, may be detected.
  • the monitor device may display a primary menu, e.g. associated with the first actionable menu item.
  • the primary menu may be displayed within the fourth portion of the graphical user interface obscuring the live representation in the fourth portion of the graphical user interface (e.g. the second live representation).
  • the primary menu may comprise one or more primary actionable items including a first primary actionable item. While the primary menu is displayed, the monitor device may be adapted to detect a fourth user input corresponding to selection of the first primary actionable item.
  • the monitor device may display a secondary menu associated with the primary actionable item in the first portion, and optionally the second portion and/or the fourth portion, of the touch sensitive display.
  • the monitor device may cease display of the primary menu associated with the first actionable menu item and display the partly obscured live representation (e.g. the second live representation) in the fourth portion and extending into the first portion of the graphical user interface.
  • In response to detecting disconnection of one of the connected visualisation devices, the monitor device may display, within the first portion of the graphical user interface, the live representation of image data generated by the image sensor of the visualisation device still being connected.
  • the monitor device may be adapted to detect disconnection of the first visualisation device from the monitor device, and in response to detecting disconnection of the first visualisation device from the monitor device, the monitor device may display, within the first portion of the graphical user interface, the second live representation of second image data generated by the second image sensor of the second visualisation device.
  • Fig. 1 schematically illustrates an exemplary medical visualisation system
  • Fig. 4 is a block diagram of an exemplary monitor device
  • Figs. 6A-6B schematically illustrate exemplary user interactions with an exemplary graphical user interface
  • Fig. 7 schematically illustrates an exemplary graphical user interface
  • Figs. 8A-8D schematically illustrate exemplary user interactions with an exemplary graphical user interface
  • Figs. 9A-9D schematically illustrate exemplary user interactions with an exemplary graphical user interface
  • the visualisation device 4 may be connected to the monitor device 20.
  • a device cable 14 extending from the handle 6 terminates in a device connector 16 connected to a connection port 40 of the monitor device 20.
  • the monitor device 20 is operable to receive image data generated by the image sensor 12 of the visualisation device 4.
  • the monitor device 20 may receive image data generated by the image sensor 12 via the device cable 14, the connector 16 and connection port 40.
  • the handle 6 comprises a control button 7 adapted to receive an input in a first input direction and/or in a second input direction.
  • the touch input in the first input direction on the control button 7 causes a distal portion 9 of the elongated flexible member 8 to bend in a first bending direction, e.g. via wires extending from the handle, through the elongated flexible member 8 to the distal portion 9.
  • the touch input in the second input direction on the control button 7 causes the distal portion 9 of the elongated flexible member 8 to bend in a second bending direction.
  • the first input direction and the second input direction may be opposite.
  • the first bending direction and the second bending direction may be opposite.
  • Bending the distal portion 9 of the elongated flexible member 8 may cause a movement of the distal end 10 and the image sensor 12 in a direction relative to the image sensor 12. Thereby, when viewing an image generated by the image sensor 12, a direction in the image, e.g. up or down, may correspond to a respective input on the control button 7.
  • the monitor device may comprise an on/off button 41, which may be provided on the fourth housing side 24, as illustrated.
  • FIG. 3 schematically illustrates an exemplary monitor device 20, such as the monitor device 20 as illustrated in Figs. 1-2.
  • a device connector 16 may be connected to a connection port 40.
  • the monitor device 20 may be provided with a graphical user interface 27.
  • the graphical user interface 27 may be displayed with the touch sensitive display 26, and the user may interact with the graphical user interface 27, e.g. by means of providing touch inputs on the touch sensitive display 26.
  • the graphical user interface 27 is displayed with the touch sensitive display 26.
  • the graphical user interface 27 comprises a plurality of non-overlapping portions 31, 32, 33, 34.
  • Each of the portions 31, 32, 33, 34 extends substantially throughout the first length L1 in the first direction x1.
  • The non-overlapping portions include a first portion 31, a second portion 32, a third portion 33 and a fourth portion 34.
  • the first portion 31 is arranged between the fourth portion 34 and the second portion 32 along the second direction x2.
  • the fourth portion 34 is arranged between the third portion 33 and the first portion 31 along the second direction x2.
  • the third portion 33 is arranged between a side of the first housing, e.g. the third housing side 23, and the fourth portion 34 along the second direction x2.
  • the second portion 32 is arranged between another side of the first housing 25, e.g. the fourth housing side 24, and the first portion 31 along the second direction x2.
  • the first portion 31 and the fourth portion 34 are arranged between the second portion 32 and the third portion 33 along the second direction.
  • the monitor device 20 displays a live representation 70 of the image data within the first portion 31 of the touch sensitive display 26.
  • The first bending direction and the second bending direction of the distal portion 9 of the elongated flexible member 8, as described with respect to Fig. 1, may correspond to a first image direction 37 and a second image direction 38 of the live representation 70, respectively.
  • The first image direction 37 and the second image direction 38 may be parallel to the first direction x1, as illustrated.
  • the first image direction 37 and the second image direction 38 may be opposite, as illustrated.
  • A user operating the control button 7 of the visualisation device 2 may cause the distal portion 9 of the elongated flexible member 8 to bend in a direction corresponding to the first image direction 37 or the second image direction 38 of the live representation 70.
  • the monitor device 20 displays with the touch sensitive display 26 one or more actionable items 36 within the second portion 32 of the graphical user interface 27.
  • the actionable items 36 may comprise an image capture button 36a, e.g. for storing an image data file corresponding to the image data received when the image capture button 36a was activated.
  • the actionable items 36 may comprise a video capture button 36b, e.g. for storing a video sequence of image data corresponding to the image data received when the video capture button 36b was activated.
  • the monitor device 20 displays with the touch sensitive display 26 one or more actionable menu items 42 within the third portion 33 of the graphical user interface 27.
  • the actionable menu items 42 may, for example, comprise a login menu item for initiating a login procedure, a settings menu item for accessing a settings menu, an archive menu item for browsing an archive, and a default menu item for returning to a default view. Also a battery indicator 50 is displayed in the third portion 33.
  • Fig. 4 is a block diagram of an exemplary monitor device 20, such as the monitor device 20 of the previous figures.
  • the monitor device 20 comprises a processing unit 60 and memory 62.
  • the memory 62 may comprise both volatile and non-volatile memory.
  • The monitor device 20 also comprises an orientation sensor 64 for determining the orientation of the first housing 25 relative to gravity.
  • the orientation sensor 64 may comprise one or more accelerometers and/or a gyroscope.
  • The monitor device 20 comprises an input/output module 66, such as for receiving image data from the image sensor 12 via the connectors of the visualisation device 4.
  • The input/output module 66 may also comprise an Ethernet connector, a WiFi transceiver, a Bluetooth transceiver, video connectors, USB ports, etc., and respective controllers.
  • the monitor device 20 also comprises the touch sensitive display 26 as described earlier.
  • The monitor device 20 may display information, graphical user interface objects, images, buttons, etc., with the touch sensitive display 26.
  • the monitor device 20 also comprises a microphone 68.
  • the monitor device 20 comprises a power unit 61 for powering the monitor device 20.
  • the power unit 61 may comprise a rechargeable battery 61a.
  • the power unit 61 may comprise a power connection 61b for connecting the power unit 61 to an external power supply, such as a conventional AC power socket.
  • The components of the monitor device 20 may be interconnected by buses or signal lines. Some or all of the components of the monitor device may be accommodated in the first housing 25 as illustrated. However, alternatively, some of the components, e.g. the processing unit 60, the memory 62, the input/output module 66 and/or the power unit 61, may be accommodated in a second housing of the monitor device 20.
  • the monitor device 20 may display content with the touch sensitive display 26.
  • The monitor device 20 may display content by the processing unit 60 transmitting instructions to the touch sensitive display 26 indicative of the content to be displayed.
  • the monitor device 20 may receive user input with the touch sensitive display 26.
  • the monitor device 20 may detect user inputs with the touch sensitive display 26.
  • a user providing a touch input on the touch sensitive display 26 causes a change in one or more electrical parameters of the touch sensitive display 26 indicative of at least the location of the touch input.
  • Information of the touch input is transmitted from the touch sensitive display 26 to the processing unit 60, and the processing unit 60 may determine whether the touch input corresponds to an action to perform, e.g. whether the location of the touch input corresponds to the location of a soft-button displayed at the touch sensitive display.
  • the user may interact with the monitor device 20 via the graphical user interface 27 by providing user inputs, e.g. by means of providing touch inputs on the touch sensitive display 26, and the monitor device 20 may detect such user inputs with the touch sensitive display 26.
  • A touch input, e.g. a single tap, double tap, swipe or similar, and the location of the touch input on the touch sensitive display 26 are registered by the touch sensitive display 26, which transmits information of the touch input (e.g. including the type of touch (double tap, single tap, swipe, etc.) and/or the location of the touch) to the processing unit 60 of the monitor device 20.
  • The processing unit 60 interprets the information received and determines whether the touch input corresponds to activation of an action, e.g. whether the touch input corresponds to activation of a button displayed with the touch sensitive display 26 at the location of the touch input. In response to a determination that the touch input corresponds to activation of an action, the processing unit 60 performs the respective action.
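  • The touch handling described above amounts to hit-testing: the display reports the type and location of a touch, and the processing unit checks whether the location falls within a displayed soft button and, if so, performs the associated action. The following generic dispatcher is a sketch with assumed names; it is not taken from the disclosure.

        from typing import Callable, Dict, Tuple

        Rect = Tuple[int, int, int, int]  # (x, y, width, height) on the display

        class TouchDispatcher:
            """Map touch locations reported by the display to soft-button actions."""

            def __init__(self):
                self._buttons: Dict[str, Tuple[Rect, Callable[[], None]]] = {}

            def register(self, name: str, rect: Rect, action: Callable[[], None]) -> None:
                self._buttons[name] = (rect, action)

            def on_touch(self, x: int, y: int, kind: str = "single_tap") -> bool:
                """Return True if the touch activated a registered button."""
                for rect, action in self._buttons.values():
                    bx, by, bw, bh = rect
                    if bx <= x < bx + bw and by <= y < by + bh:
                        action()
                        return True
                return False

        dispatcher = TouchDispatcher()
        dispatcher.register("image_capture", (1100, 100, 80, 80), lambda: print("store image file"))
        dispatcher.on_touch(1120, 130)  # falls inside the image capture button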
  • the user may tap the image capture button 36a.
  • The tap and the location of the tap are registered by the touch sensitive display 26, which transmits the information of the tap to the processing unit 60 of the monitor device 20.
  • the processing unit 60 interprets the information received and determines that the user tapped the location corresponding to the image capture button 36a.
  • the processing unit 60 stores, in memory 62 an image data file corresponding to the image data received.
  • the user may tap the video capture button 36b.
  • The tap and the location of the tap are registered by the touch sensitive display 26, which transmits the information of the tap to a processing unit 60 (see Fig. 4) of the monitor device 20.
  • the processing unit 60 interprets the information received and determines that the user tapped the location corresponding to the video capture button 36b.
  • the processing unit 60 starts collection of image data received from the image sensor 12 and temporarily stores the data in memory 62. To stop the recording, the user may tap the video capture button 36b again.
  • The processing unit 60 determines, based on the signal received from the touch sensitive display 26, that the user tapped the video capture button 36b and stops collecting image data received from the image sensor 12.
  • The processing unit 60 reads the temporarily stored data from the memory 62 and creates a complete video sequence based thereon, which the processing unit 60 stores in the memory 62.
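  • A sketch of the storage side of the recording flow above: while recording is active, incoming image data is appended to a temporary file, and on the second tap the temporarily stored data is read back and stored as a complete sequence. The file layout is an illustrative assumption; no actual video encoding is shown.

        from pathlib import Path

        class VideoRecorder:
            """Illustrative temporary-storage recorder (not an actual video encoder)."""

            def __init__(self, memory_dir: Path):
                self.memory_dir = memory_dir
                self.temp_path = memory_dir / "recording.tmp"
                self.active = False

            def start(self) -> None:
                self.memory_dir.mkdir(parents=True, exist_ok=True)
                self.temp_path.write_bytes(b"")  # temporary store for incoming data
                self.active = True

            def add_frame(self, frame_bytes: bytes) -> None:
                if self.active:
                    with self.temp_path.open("ab") as f:
                        f.write(frame_bytes)

            def stop(self, name: str) -> Path:
                """Read the temporarily stored data and store it as a complete sequence."""
                self.active = False
                final_path = self.memory_dir / name
                final_path.write_bytes(self.temp_path.read_bytes())
                self.temp_path.unlink()
                return final_path

        recorder = VideoRecorder(Path("memory"))
        recorder.start()
        recorder.add_frame(b"frame-1")
        recorder.stop("video_001.raw")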
  • Figs. 5A-5D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures.
  • the monitor device 20 has a first connection port 40a, a second connection port 40b, and a third connection port 40c.
  • Each of the connection ports 40a-40c is further labelled with a respective port indicator: the first connection port 40a is labelled with a first port indicator 106, the second connection port 40b with a second port indicator 108, and the third connection port 40c with a third port indicator 109.
  • the port indicators 106, 108, 109 may be printed or engraved on the monitor device 20, such as on the first housing 25 of the monitor device 20.
  • Alternatively, the port indicators 106, 108, 109 may be displayed with the touch sensitive display 26 in the vicinity of the respective connection ports.
  • Fig. 5A schematically illustrates a monitor device 20 with no connected visualisation devices, e.g. connection ports 40a-40c are empty.
  • this situation may correspond to a situation where a user initially powers on the device, e.g. by pressing the on/off button 41.
  • an animation 100 of connecting the visualisation device to the monitor device 20 is displayed within the first portion 31 of the graphical user interface.
  • Fig. 5B schematically illustrates the monitor device 20, e.g. following the situation as shown in Fig. 5A, where, in comparison to the example of Fig. 5A, a first visualisation device has been connected, by a first device connector 16a of the first visualisation device being received at the first connection port 40a.
  • a first live representation 70a of first image data generated by a first image sensor of the first visualisation device is displayed within the first portion 31 of the graphical user interface.
  • Fig. 5C schematically illustrates the monitor device 20, e.g. following the situation as shown in Fig. 5B, where, in comparison to the example of Fig. 5B, a second visualisation device has been connected, by a second device connector 16b of the second visualisation device being received at the second connection port 40b. The second visualisation device is connected while the first visualisation device is also connected to the monitor device 20.
  • the monitor device 20 enters a dual view mode, wherein the monitor device concurrently displays the first live representation 70a of the first image data generated by the first image sensor and a second live representation 70b of second image data generated by a second image sensor of the second visualisation device.
  • the second live representation 70b is displayed in the fourth portion 34 of the graphical user interface and extending into the first portion 31 of the graphical user interface.
  • the first live representation 70a is displayed in the second portion 32 of the graphical user interface and extending into the first portion 31 of the graphical user interface.
  • the first live representation 70a is displayed in reduced size compared to the second live representation 70b.
  • Thus, the live representation from the newly connected visualisation device (e.g. the second visualisation device) is displayed in full size, while the live representation from the previously connected visualisation device is displayed in reduced size.
  • The newly connected visualisation device is shown largest, as the present inventors have found that a newly connected visualisation device is typically connected with the intention of using that device, and it is therefore advantageous to show the newly connected visualisation device with the largest image.
  • the present disclosure thereby provides a solution for optimally showing live representations from two simultaneously connected visualisation devices, utilizing a conventional sized display screen. Hence, production costs may be lowered as the need for customized components is reduced.
  • the live representations may be overlaid with an indicator to allow mapping between the displayed representations and the physical visualisation device.
  • A first indicator 102 is overlaid on the first live representation 70a, and a second indicator 104 is overlaid on the second live representation 70b.
  • The first connection port 40a is labelled with a first port indicator 106 resembling the first indicator 102, and the second connection port 40b is labelled with a second port indicator 108 resembling the second indicator 104.
  • In response to establishing connection to the second visualisation device, while the first visualisation device remains connected, the monitor device further displays a rearrange icon 36c.
  • Upon disconnection of one of the visualisation devices, the monitor device displays the live representation of the remaining visualisation device in the first portion 31 of the graphical user interface.
  • the monitor device 20 e.g. the processing unit of the monitor device, may be adapted to detect disconnection of a visualisation device, such as of the first visualisation device from the monitor device 20.
  • In response thereto, the monitor device displays, within the first portion of the graphical user interface, the second live representation of second image data generated by the second image sensor of the second visualisation device.
  • Figs. 6A-6B schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures.
  • Fig. 6A particularly illustrates the situation where a user provides a first user input 110, e.g. a touch input, such as a tap, at the touch sensitive display 26 at a location corresponding to the rearrange icon 36c.
  • The monitor device 20, such as the processing unit of the monitor device 20, may determine, based on the signal from the touch sensitive display 26, that the first user input 110 corresponds to selection of the rearrange icon 36c.
  • the display of the second live representation 70b in the fourth portion 34 and extending into the first portion 31 of the graphical user interface is replaced with display of the first live representation 70a.
  • The display of the first live representation 70a in the second portion 32 and extending into the first portion 31 of the graphical user interface is replaced with display of the second live representation 70b. Consequently, the second live representation 70b is displayed in reduced size compared to the first live representation 70a.
  • The first indicator 102 is repositioned so that it remains overlaid on the first live representation 70a, and the second indicator 104 is repositioned so that it remains overlaid on the second live representation 70b.
  • the monitor device 20 displays one or more actionable items 36 within the second portion 32 of the graphical user interface.
  • the actionable items 36 may comprise an image capture button 36a, e.g. for storing an image data file corresponding to the image data received when the image capture button 36a was activated.
  • the actionable items 36 may comprise a video capture button 36b, e.g. for storing a video sequence of image data corresponding to the image data received when the video capture button 36b was activated.
  • Fig. 7 schematically illustrates an exemplary user interaction with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures. Particularly, Fig. 7 illustrates that also when having a plurality of visualisation devices connected (e.g. a first visualisation device and a second visualisation device, as illustrated), the monitor device displays the one or more actionable items 36, allowing, e.g., storing of image data as well as video sequences. The actionable items 36 are displayed within the second portion 32 of the graphical user interface.
  • The monitor device 20 is adapted, e.g. via the touch sensitive display, to detect a second user input 112 corresponding to selection of the image capture button 36a.
  • In response thereto, the monitor device 20, such as the processing unit of the monitor device 20, stores a first image file corresponding to the first image data received when the second user input 112 was detected and stores a second image file corresponding to the second image data received when the second user input 112 was detected.
  • Hence, both an image file corresponding to the first visualisation device and an image file corresponding to the second visualisation device may be stored. The same may apply to video capturing.
  • Similarly, in response to detection of a user input corresponding to selection of the video capture button 36b, the monitor device 20, such as the processing unit of the monitor device 20, stores a first video sequence of image data corresponding to the first image data received when the user input was detected and stores a second video sequence of image data corresponding to the second image data received when the user input was detected.
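  • The dual-capture behaviour described above can be sketched as follows: a single activation of the image capture button stores one image file per connected visualisation device, each associated with that device's procedure session (here represented as a per-device folder). Names and folder layout are assumptions consistent with the session sketch above.

        import time
        from pathlib import Path

        def capture_all(latest_frames: dict, sessions_root: Path) -> list:
            """Store one image file per connected device on a single button press.

            `latest_frames` maps a device serial number to the most recently
            received frame for that device (as bytes) -- an assumed in-memory
            representation of the incoming image data.
            """
            stamp = time.strftime("%Y%m%d_%H%M%S")
            stored = []
            for serial, frame in latest_frames.items():
                session_dir = sessions_root / serial  # the device's procedure session
                session_dir.mkdir(parents=True, exist_ok=True)
                path = session_dir / f"image_{stamp}.raw"
                path.write_bytes(frame)
                stored.append(path)
            return stored

        # One tap on the image capture button stores a file for each connected scope.
        capture_all({"SN-0001": b"first-frame", "SN-0002": b"second-frame"}, Path("sessions"))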
  • a visualisation device When a visualisation device is connected, e.g. when the monitor device 20 and/or the processing unit of the monitor device detects connection of a visualisation device, device identifier information from a device identifier of the respective visualisation device may be obtained.
  • The visualisation device(s) may be fitted with an EPROM (alternatively a QR code, an RFID tag, an NFC tag, etc. may be used), which the monitor device 20 is able to read.
  • the processing unit of the monitor device may execute a process for interrogating the device identifier, via the device connector and connection port.
  • the EPROM may store information of the visualisation device, e.g. a serial number of the visualisation device, which may uniquely identify the visualisation device.
  • the device identifier information may be indicative of the type of visualisation device, e.g. whether it is an endoscope or a laryngoscope, brand of the visualisation device, production version, batch number etc.
  • the monitor device may open (create or reopen, depending on whether the visualisation device has previously been connected) a procedure session corresponding to the device identifier information. For example, a first procedure session may be opened corresponding to the first device identifier information obtained from the first visualisation device, and a second procedure session may be opened corresponding to the second device identifier information obtained from the second visualisation device.
  • The procedure sessions may be unique, and therefore reconnecting a previously connected visualisation device causes the monitor device to reopen the previously created session.
  • Stored video sequences or image files may be associated with the procedure session for the respective visualisation device.
  • the first image file stored in response to detection of the second user input 112 may be associated with the first procedure session
  • the second image file stored in response to detection of the second user input 112 may be associated with the second procedure session.
  • a procedure session may be implemented by creating a folder in the file system of the monitor device, wherein image files and video sequences obtained from a visualisation device are stored in the folder corresponding to the visualisation device.
  • Figs. 8A-8D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures. Particularly, the examples may follow from Fig. 5B, wherein only a single visualisation device is connected.
  • the monitor device 20 may receive a first primary user input 180 corresponding to selection of a first actionable menu item 42a of the one or more actionable menu items 42 (cf. Fig. 3).
  • the first primary user input 180 is a touch input, e.g. a tap, on an icon, i.e. the first actionable menu item, indicating access to an archive.
  • the monitor device 20 detects the primary user input 180 with the touch sensitive display 26, and in response to detecting the first primary user input 180, the monitor device 20 displays a primary menu 182, as illustrated in Fig. 8B.
  • the primary menu 182 is associated with the first actionable menu item 42a.
  • the primary menu 182 is displayed within the fourth portion 34 of the graphical user interface, and without obscuring part of the first portion 31 of the graphical user interface.
  • the primary menu 182 comprises one or more primary actionable items including a first primary actionable item 183a.
  • the primary menu 182 is an archive menu with a first primary actionable item 183a to retrieve recent (e.g. recently stored) images and videos, and a second primary actionable item 183b to search for stored images and videos.
  • the monitor device 20 may detect with the touch sensitive display 26 a second primary user input 184 corresponding to selection of the first primary actionable item 183a, as illustrated in Fig. 8C.
  • the monitor device displays a secondary menu 186 associated with the primary actionable item 183a in the first portion 31, and optionally the second portion 32, of the graphical user interface, as illustrated in Fig. 8D.
  • the secondary menu 186 also extends into the fourth portion 34 of the graphical user interface.
  • the second primary user input 184 in Fig. 8C need not be received within a predetermined time from receiving/detecting the first primary user input and/or from displaying the primary menu. Nevertheless, since two consecutive inputs are required to display the secondary menu 186, the likelihood of receiving two unintentional inputs causing the live representation of the image data to be obstructed may be decreased.
  • Figs. 9A-9D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures.
  • Fig. 9A illustrates a situation, which may continue from the situation illustrated in, e.g., Fig. 5C.
  • Fig. 9A illustrates the monitor device 20 and the graphical user interface in a situation where two visualisation devices are connected, where one live representation, e.g. the second live representation 70b, is displayed in the fourth portion 34 of the graphical user interface and extends into the first portion 31, and the first live representation 70a is displayed in the second portion 32 of the graphical user interface and extends into the first portion 31.
  • the monitor device receives a third user input 190, e.g. corresponding to the first primary user input 180 of Fig. 8A, to an actionable menu item, such as the first actionable menu item 42a.
  • the monitor device 20 detects the third user input 190 with the touch sensitive display 26, and in response to detecting the third user input 190, the monitor device 20 displays, within the fourth portion 34 of the graphical user interface, the primary menu 192 associated with the actionable menu item receiving the third user input 190.
  • the monitor device displays the first primary menu 192 within the fourth portion 34 of the graphical user interface.
  • since the monitor device 20 is operating in the dual view mode, where live representations of two connected visualisation devices are displayed, including the live representation 70b in the fourth portion 34, display of the first primary menu 192 within the fourth portion 34 of the graphical user interface causes a part of the live representation 70b of the image data to be obscured.
  • the primary menu 192 is associated with the first actionable menu item 42a.
  • the primary menu 192 comprises one or more primary actionable items including a first primary actionable item 193a.
  • the primary menu 192 is an archive menu (similar to the primary menu 182 of Figs. 8B and 8C) with a first primary actionable item 193a to retrieve recent (e.g. recently stored) images and videos, and a second primary actionable item 193b to search for stored images and videos.
  • the monitor device 20 may detect with the touch sensitive display 26 a fourth user input 194 corresponding to selection of the first primary actionable item 193a, as illustrated in Fig. 9C, e.g. corresponding to the second primary user input 184 of Fig. 8C.
  • the monitor device displays a secondary menu 196 associated with the primary actionable item 193a in the first portion 31, and optionally the second portion 32, of the graphical user interface, as illustrated in Fig. 9D.
  • the secondary menu 196 also extends into the fourth portion 34 of the graphical user interface.
  • in accordance with not detecting the fourth user input 194 within a threshold amount of time, e.g. 5 seconds, after detection of the third user input 190, the monitor device may cease display of the primary menu 192 and display the entire live representation 70b of the image data in the fourth portion 34 and extending into the first portion 31 of the graphical user interface.
  • obscuring part of the live representation 70b of the image data in the dual view mode may be limited by a timeout.
  • the likelihood of receiving two unintentional inputs causing the live representations 70a, 70b to be completely obstructed may be decreased.
  • the monitor device 20, in the dual view mode (Figs. 9A-9D), needs to receive two consecutive inputs, e.g. within a time frame, to display a menu which covers the entire live representation of the image data displayed in the first portion 31 of the graphical user interface.
  • connection port(s); 38 second image direction; 40 connection port(s); 42 actionable menu item(s); 44 invert view button; 46 inverted view mode indicator; 50 battery indicator; 60 processing unit; 61 power supply; 61a battery; 61b power connection; 62 memory

Abstract

Disclosed are a method and a monitor device of a medical visualisation system comprising a plurality of visualisation devices each having an image sensor configured to generate image data indicative of a view from the visualisation device, the monitor device being operable to receive the image data as the image data is being generated by the image sensors of the plurality of visualisation devices. The monitor device concurrently displays, in a fourth portion and extending into a first portion of the graphical user interface, a second live representation of second image data generated by a second image sensor of the second visualisation device; and, in a second portion and extending into the first portion of the graphical user interface, a first live representation of first image data generated by the first image sensor of the first visualisation device.

Description

GRAPHICAL USER INTERFACE HANDLING A PLURALITY OF VISUALISATION DEVICES
The present disclosure relates to a visualisation device, such as an endoscope and a medical visualisation system, such as an endoscope system, comprising a visualisation device. More specifically the present disclosure relates to a graphical user interface and a monitor device having such graphical user interface for interacting with the medical visualisation system.
BACKGROUND
A visualisation device may be utilized to visually examine certain areas of the body of a person, such as inside a body cavity of the person. For example, a visualisation device may be used to inspect the airways, the digestive tract, or the intestines.
A visualisation device may be provided with a camera and be attached to a monitor device, such as a monitor with a display screen. A video output from the camera of the visualisation device may be received and displayed at the monitor device, thereby allowing an operator to control the visualisation device to inspect an area of interest.
For example, a visualisation device may be an endoscope, such as a disposable endoscope. An endoscope comprises an operating handle at the proximal end and an insertion tube extending from the handle towards the distal end. The handle is configured to be held by an operator and inter alia comprises externally protruding operating members connected to internal control means allowing the operator to control the movement of a bending section at the distal end of the insertion tube, while advancing the distal end of the insertion tube to a desired location e.g. within a body cavity of a person. By means of an attached monitor device, such as a monitor with a display screen, the location to which the distal end has been advanced may be inspected using the endoscope.
The monitor device of a medical visualisation system may be provided with some functionality, such as ability to save still images and/or video sequences of the view from the attached visualisation device. Furthermore, the monitor device may comprise some image processing capabilities, and may be configured to output a video or image output, e.g. to an external display.
In some procedures it may be desirable to use a plurality of visualisation devices, e.g. a video laryngoscope and an endoscope simultaneously, e.g. to inspect different positions within a body cavity of the person. For example, the video laryngoscope may be used to visually aid the operator in inserting an endoscope into the airways. In another example, a second inserted visualisation device may be used to confirm the anticipated position of a first inserted visualisation device.
SUMMARY
The present disclosure relates to a visualisation device, such as an endoscope, and a visualisation system, such as an endoscope system. Particularly, but not exclusively the visualisation device may be a disposable camera endoscope. Alternatively, the visualisation device may be a video laryngoscope, an endotracheal tube and/or a laryngeal mask. The visualisation system may further comprise a monitor device for being connected to the visualisation device, e.g. the monitor device may be configured to receive image data from the visualisation device. The present disclosure further relates to a graphical user interface for such monitor device of a medical visualisation system.
It is an object of the present disclosure to provide a solution which at least improves the solutions of the prior art. Particularly, it is an object of the present disclosure to provide a graphical user interface for a medical visualisation system which facilitates and enhances human interaction with the system.
It is a further object of the present disclosure to provide a system and method facilitating enhanced control and usability of a medical visualisation system.
Accordingly, a medical visualisation system and a method performed at a monitor device of the medical visualisation system are disclosed.
The medical visualisation system comprises a visualisation device, such as an endoscope, such as a disposable endoscope. Alternatively, the visualisation device may be a video laryngoscope, an endotracheal tube and/or a laryngeal mask. The visualisation device has an image sensor configured to generate image data indicative of a view from the visualisation device. The medical visualisation system may comprise a plurality of visualisation devices each having an image sensor configured to generate image data indicative of a view from the visualisation device. The plurality of visualisation devices may include a first visualisation device and/or a second visualisation device. The first visualisation device may comprise a first image sensor configured to generate image data indicative of a view from the first visualisation device. The second visualisation device may comprise a second image sensor configured to generate image data indicative of a view from the second visualisation device. The image sensor(s) may be any sensor capable of detecting and conveying information used to make an image. For example, the image sensor(s) may comprise a CCD or CMOS sensor, or similar. The image sensor(s) may generate image data corresponding to a square image, i.e. having equal height and width. For example, the image data generated by the image sensors may correspond to a 300x300 pixel image, or a 400x400 pixel image, or a 600x600 pixel image, or an 800x800 pixel image. Alternatively or additionally, the image sensor may generate image data corresponding to a non-square image, which is cropped to form a square image, such as a square image having 300x300 pixels, 400x400 pixels, 600x600 pixels, or 800x800 pixels.
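By way of illustration only, the following minimal Python sketch shows how image data from a non-square sensor could be centre-cropped to one of the square formats mentioned above (e.g. 400x400 pixels). The function name and array shapes are assumptions made for this example and are not taken from the disclosure.

```python
import numpy as np

def centre_crop_to_square(frame: np.ndarray, side: int = 400) -> np.ndarray:
    """Centre-crop a (height, width, channels) frame to a square image.

    Hypothetical helper illustrating the square formats discussed above
    (e.g. 300x300, 400x400, 600x600 or 800x800 pixels).
    """
    height, width = frame.shape[:2]
    size = min(height, width, side)
    top = (height - size) // 2
    left = (width - size) // 2
    return frame[top:top + size, left:left + size]

# Example: a 480x640 sensor frame cropped to a 400x400 square image.
sensor_frame = np.zeros((480, 640, 3), dtype=np.uint8)
square = centre_crop_to_square(sensor_frame, side=400)
assert square.shape[:2] == (400, 400)
```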
The medical visualisation system further comprises a monitor device receiving and/or being operable to receive the image data generated by the image sensor. The monitor device may receive and/or be operable to receive the image data, e.g. as the image data is being generated, e.g. within limitation of the hardware. The monitor device comprises a first housing extending in a first direction from a first housing side to a second housing side and in a second direction perpendicular to the first direction from a third housing side to a fourth housing side. The monitor device comprises a display, e.g. a touch sensitive display. The display may be accommodated in the first housing. The display may have a first length in the first direction and a second length in the second direction. The second length may be longer than the first length, e.g. the display may be a 16:9 or 16:10 display. Alternatively, the first length may be longer than the second length, or the first length and the second length may be substantially the same. The touch sensitive display may be any suitable type of touch display, e.g. capacitive touch display or resistive touch display.
The monitor device may comprise one or more connection ports configured to receive a connector of the visualisation device. The connection ports and the corresponding connector of the visualisation device may be proprietary plug and socket connectors, or any standard connector capable of transmitting therethrough at least the image data from the image sensor. Furthermore, the connector and connection ports may be configured to supply power to the components of the visualisation device.
The one or more connection ports may be arranged on the first housing. The one or more connection ports may be provided on the third housing side, and/or on the fourth housing side.
The monitor device may comprise an on/off button. The on/off button may be arranged on the first housing. The on/off button may be provided on the third housing side or on the fourth housing side. The one or more connection ports may be provided on the third housing side and the on/off button may be provided on the fourth housing side.
The monitor device may establish connection to a visualisation device, such as the first visualisation device and/or the second visualisation device. Establishing connection to the visualisation device may include receiving a device connector of the respective visualisation device in a connection port of the one or more connection ports of the monitor device. Establishing connection to a visualisation device may include obtaining device identifier information from a device identifier (e.g. EPROM, QR-code, NFC, RFID or similar) of the visualisation device. For example, establishing connection to the first visualisation device may include obtaining first device identifier information from a first device identifier of the first visualisation device and/or establishing connection to the second visualisation device may include obtaining second device identifier information from a second device identifier of the second visualisation device.
The monitor device may comprise a processing unit and memory. The processing unit and/or the memory may be accommodated in the first housing. Alternatively, the monitor device may comprise a second housing, and the processing unit and/or the memory may be accommodated in the second housing. The monitor device may comprise an orientation sensor, e.g. for determining the orientation of the monitor device, such as of the first housing, relative to gravity. The orientation sensor may comprise one or more accelerometers and/or a gyroscope. The orientation sensor may be accommodated in the first housing. The processing unit may be connected to the touch sensitive display to control display of information with the touch sensitive display, and the processing unit may be adapted to receive a signal from the touch sensitive display indicative of touch inputs on the touch sensitive display. Thereby, the monitor device may detect user inputs, e.g. in the form of touch inputs, with the touch sensitive display. Touch inputs may, for example, comprise single tap(s), double tap(s), or swipe(s) on the touch sensitive display. The processing unit may be connected to the orientation sensor to receive an orientation signal indicative of the orientation of the monitor device, such as of the first housing of the monitor device. The processing unit may be connected to the memory and be adapted to read and write data from and to the memory.
The monitor device may comprise a power unit for powering the monitor device. The power unit may comprise a rechargeable battery and/or a power connection for connecting the power unit to an external power supply, such as a conventional AC power socket. The power unit may be accommodated in the first housing. Alternatively, the power unit may be accommodated in the second housing.
The monitor device may comprise a graphical user interface. The monitor device and/or the processing unit of the monitor device may display with the touch sensitive display the graphical user interface. The graphical user interface may comprise one or more portions, such as a plurality of portions. The portions may be non-overlapping portions, such as a plurality of non-overlapping portions. The portions may include a first portion and/or a second portion. The portions may further include a third portion and/or a fourth portion. The second portion and/or the fourth portion may be designated as background portions, e.g. the second portion may be a first background portion and/or the fourth portion may be a second background portion. Each of the plurality of portions may extend substantially throughout the first length in the first direction. The first portion may be arranged between the fourth portion and the second portion along the second direction. The fourth portion may be arranged between the third portion and the first portion along the second direction. The third portion may be arranged between a side of the first housing, e.g. the third housing side, and the fourth portion along the second direction. The second portion may be arranged between another side of the first housing, e.g. the fourth housing side, and the first portion along the second direction. The first portion and the fourth portion may be arranged between the second portion and the third portion along the second direction. The first portion of the graphical user interface may be square. The first portion of the graphical user interface may occupy the centre of the touch sensitive display. The first portion of the graphical user interface may be larger along the second direction than the second portion, the third portion and/or the fourth portion, individually and/or collectively. The first portion of the graphical user interface may extend throughout more than 40% of the second length in the second direction, such as more than 50% of the second length in the second direction, such as more than 60% of the second length in the second direction.
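The arrangement of the four portions along the second direction (third, fourth, first, second) can be illustrated with a small layout sketch. The sketch below is an example under stated assumptions, not the disclosed implementation: the widths, the Rect helper and the 1280x800 display size are chosen only to show one way the square first portion could occupy more than 60% of the second length while the remaining width is split between the other portions.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int       # offset along the second direction (x2)
    y: int       # offset along the first direction (x1)
    width: int
    height: int

def layout_portions(second_length: int, first_length: int,
                    third_width: int = 80) -> dict:
    """Hypothetical layout of the four non-overlapping portions.

    Each portion spans the full first length (the display height). The
    square first portion takes up most of the width; the third portion
    sits at one side, the fourth portion lies between the third and the
    first portions, and the second portion lies between the first portion
    and the opposite side.
    """
    first_width = first_length                    # square first portion
    remaining = second_length - first_width - third_width
    fourth_width = remaining // 2
    second_width = remaining - fourth_width
    x = 0
    third = Rect(x, 0, third_width, first_length)
    x += third_width
    fourth = Rect(x, 0, fourth_width, first_length)
    x += fourth_width
    first = Rect(x, 0, first_width, first_length)
    x += first_width
    second = Rect(x, 0, second_width, first_length)
    return {"first": first, "second": second, "third": third, "fourth": fourth}

# Example for a 1280x800 (16:10) display: the square first portion is
# 800 px wide, i.e. 62.5% of the second length.
portions = layout_portions(second_length=1280, first_length=800)
```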
The monitor device may display a live representation of the image data, e.g. within the first portion of the graphical user interface. The live representation of the image data may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the first portion of the graphical user interface.
The visualisation device, e.g. an endoscope, may comprise a handle and an elongated flexible member extending from the handle to a distal end. The image sensor may be arranged at the distal end of the elongated flexible member. The image data may be indicative of a view from the distal end of the elongated flexible member. The handle may comprise a control button adapted to receive an input in a first input direction and/or in a second input direction. The first input direction and the second input direction may be opposite. The touch input in the first input direction may cause a distal portion of the elongated flexible member to bend in a first bending direction and/or may cause movement of the image sensor in a first image sensor direction. The touch input in the second input direction may cause the distal portion of the elongated flexible member to bend in a second bending direction and/or may cause movement of the image sensor in a second image sensor direction. The live representation of the image data may have directions corresponding to directions of the image sensor generating the image data. The first bending direction may correspond to a first image direction of a representation of the image data, such as the live representation of the image data. The second bending direction may correspond to a second image direction of the representation of the image data, such as the live representation of the image data. The first image direction and/or the second image direction may be parallel to the first direction of the first housing.
The monitor device may provide one or more actionable items. One or more actionable items may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the second portion of the graphical user interface. One or more actionable menu items may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the third portion of the graphical user interface. A battery indicator may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the third portion of the graphical user interface. A time indicator may be displayed, e.g. by the processing unit, with the touch sensitive display, e.g. within the third portion of the graphical user interface.
The one or more actionable items, e.g. displayed within the second portion of the graphical user interface of the monitor device, may comprise an image capture button and/or a video capture button. In response to activation of the image capture button, e.g. by a user providing a touch input, e.g. a single tap, at the respective location of the touch sensitive display, an image data file corresponding to the image data received when the image capture button was activated may be stored, e.g. in memory of the monitor device. In response to activation of the video capture button, e.g. by a user providing a touch input, e.g. a single tap, at the respective location of the touch sensitive display, a video sequence of image data corresponding to the image data received when the video capture button was activated may be stored, e.g. in memory of the monitor device. A first activation of the video capture button may start collection of image data for the video sequence, and a second activation of the video capture button (subsequent to the first activation of the video capture button) may stop the collection of image data for the video sequence. The stored video sequence may correspond to the image data received between the first activation and the second activation of the video capture button. The video capture button may be displayed in a first appearance prior to the first activation and after the second activation. The video capture button may be displayed in a second appearance after the first activation and before the second activation.
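As an illustration of the image and video capture behaviour described above, the following Python sketch models the two buttons; the class names and the storage interface are hypothetical and serve only to show the toggle logic (first activation starts collection, second activation stops and stores) and the two button appearances.

```python
class InMemoryStorage:
    """Hypothetical stand-in for the memory of the monitor device."""

    def __init__(self):
        self.images = []
        self.videos = []

    def save_image(self, frame):
        self.images.append(frame)

    def save_video(self, frames):
        self.videos.append(list(frames))


class CaptureController:
    """Sketch of the image capture and video capture button behaviour."""

    def __init__(self, storage):
        self.storage = storage
        self.recording = False
        self.frames = []

    def on_image_capture_pressed(self, current_frame):
        # Store an image data file corresponding to the image data
        # received when the image capture button was activated.
        self.storage.save_image(current_frame)

    def on_video_capture_pressed(self):
        # The first activation starts collection of image data; the
        # second activation stops it and stores the video sequence.
        if not self.recording:
            self.recording = True
            self.frames = []
        else:
            self.recording = False
            self.storage.save_video(self.frames)

    def on_frame_received(self, frame):
        if self.recording:
            self.frames.append(frame)

    @property
    def video_button_appearance(self):
        # First appearance before the first activation and after the
        # second activation; second appearance while recording.
        return "second" if self.recording else "first"
```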
The monitor device may be adapted to establish connection to a first visualisation device of a plurality of visualisation devices. The plurality of visualisation devices may be a plurality of visualisation devices being substantially similar, or the plurality of visualisation devices may comprise visualisation devices of different types. For example, the first visualisation device may be an endoscope and a second visualisation device may be a laryngoscope. Alternatively, the first visualisation device may be an endoscope and the second visualisation device may be an endoscope. Alternatively, the first visualisation device may be a laryngoscope and the second visualisation device may be an endoscope. Alternatively, the first visualisation device may be a medical airway device such as an endotracheal tube or a laryngeal mask and the second visualisation device may be an endoscope. Alternatively, the first visualisation device may be an endoscope and the second visualisation device may be a medical airway device such as an endotracheal tube or a laryngeal mask.
In response to establishing the connection to the first visualisation device, the monitor device may display, e.g. with the touch sensitive display, within the first portion of the graphical user interface, a first live representation of first image data generated by a first image sensor of the first visualisation device.
While the first visualisation device is connected to the monitor device, the monitor device may be further adapted to establish connection to a second visualisation device of the plurality of visualisation devices.
In response to establishing the connection to the second visualisation device the monitor device concurrently displays a second live representation of second image data generated by a second image sensor of the second visualisation device and the first live representation of first image data generated by the first image sensor of the first visualisation device. The second live representation is displayed in the fourth portion and extending into the first portion of the graphical user interface. The first live representation is displayed in the second portion and extending into the first portion of the graphical user interface. The first live representation is displayed in reduced size compared to the second live representation.
Accordingly, a monitor device for an endoscope system and a method are disclosed, which allow handling of multiple simultaneously connected visualisation devices by providing a dual view mode. Consequently, the present disclosure provides the user with the ability to display input from two different scopes, to inspect two positions simultaneously, and to view the two inputs on the same display. Importantly, the present disclosure provides logic for determining which visualisation device to display largest and which to show smallest.
Furthermore, the disclosure provides a solution for altering the displayed content, in a predictive manner and with minimum visual disturbance for the user, upon connection of an additional endoscope and/or disconnection of one of two endoscopes. Thereby, the system facilitates the user in continuing an ongoing procedure while connecting or disconnecting additional endoscopes.
The monitor device may comprise a plurality of connection ports including a first connection port and a second connection port. To establish connection to the first visualisation device a first connector of the first visualisation device may be received by the first connection port. To establish connection to the second visualisation device a second connector of the second visualisation device may be received by the second connection port.
The live representations may be overlaid with an indicator. For example, a first indicator may be overlaid on the first live representation and/or a second indicator may be overlaid on the second live representation. The first connection port may be labelled with a first port indicator resembling the first indicator. The second connection port may be labelled with a second port indicator resembling the second indicator. Thereby, the live representations may be easily mapped to the physical connection port and consequently to the connected visualisation device.
In response to establishing connection to the second visualisation device, the monitor device may further display a rearrange icon. The monitor device may be adapted, e.g. with the touch sensitive display, to detect a first user input corresponding to selection of the rearrange icon. In response to detection of the first user input the monitor device may replace display of the second live representation in the fourth portion and extending into the first portion of the graphical user interface, with display of the first live representation and/or the monitor device may replace display of the first live representation in the second portion and extending into the first portion of the graphical user interface, with display of the second live representation. The second live representation may be displayed in reduced size compared to the first live representation.
While the first visualisation device and the second visualisation device are connected to the monitor device, the monitor device, e.g. by the touch sensitive display, may be adapted to detect a second user input corresponding to selection of the image capture button of the one or more actionable items. In response to detection of the second user input, the monitor device may store a first image file corresponding to the first image data received when the second user input was detected and store a second image file corresponding to the second image data received when the second user input was detected. Thus, when the monitor device is operating in dual view mode, activation of the image capture button may cause two images to be captured, one from each connected visualisation device. Establishing connection to a visualisation device may include obtaining device identifier information from a device identifier (e.g. EPROM, QR-code, NFC, RFID or similar) of the visualisation device. In response to establishing connection to the first visualisation device the monitor device may open a first procedure session corresponding to the first device identifier information obtained from the first device identifier of the first visualisation device. In response to establishing connection to the second visualisation device the monitor device may open a second procedure session corresponding to the second device identifier information obtained from the second device identifier of the second visualisation device. Hence, a procedure session may be created for each connected visualisation device.
In response to detection of the second user input, i.e. the user input corresponding to selection of the image capture button of the one or more actionable items, the monitor device may associate the first image file with the first procedure session; and/or associate the second image file with the second procedure session.
A procedure session may be implemented by creating a folder in the file system of the monitor device, wherein image files and video sequences obtained from a visualisation device are stored in the folder corresponding to the visualisation device. Hence, association of an image file with a procedure session may be implemented by storing the image file in the folder of the procedure session. Opening a procedure session may further comprise creating a log, registering the time and date for initiating the procedure, registering information about the visualisation device, registering software version and/or other information.
In opening a procedure session the monitor device may determine, based on the device identifier information, whether the visualisation device has been previously connected to the monitor device. For example, in accordance with determining that the visualisation device has previously been connected to the monitor device, the monitor device may reopen the procedure session corresponding to the device identifier information; and/or in accordance with determining that the visualisation device has not previously been connected to the monitor device, the monitor device may create the procedure session, e.g. a new procedure session, corresponding to the device identifier information.
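A minimal sketch of opening a procedure session as described above is given below, assuming the session is implemented as a folder named after the device identifier information; the directory layout, file names and log fields are illustrative assumptions only.

```python
import json
import time
from pathlib import Path

def open_procedure_session(root: Path, device_identifier: str,
                           device_info: dict) -> Path:
    """Open (create or reopen) a procedure session for a visualisation device.

    The session is a folder keyed by the device identifier information
    (e.g. a serial number read from the EPROM). Image files and video
    sequences captured from that device are later stored in this folder.
    """
    session_dir = root / device_identifier
    if session_dir.exists():
        # The device has previously been connected: reopen its session.
        return session_dir
    # Not previously connected: create a new session with a simple log
    # registering the start time and information about the device.
    session_dir.mkdir(parents=True)
    log = {"started": time.strftime("%Y-%m-%d %H:%M:%S"),
           "device": device_info}
    (session_dir / "session_log.json").write_text(json.dumps(log, indent=2))
    return session_dir

# Example: each connected device gets its own session, and a captured
# image is associated with a session by storing it in that folder.
first_session = open_procedure_session(Path("archive"), "SN-0001",
                                        {"type": "endoscope"})
second_session = open_procedure_session(Path("archive"), "SN-0002",
                                        {"type": "laryngoscope"})
(first_session / "image_0001.png").write_bytes(b"placeholder image data")
```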
A third user input, e.g. with the touch sensitive display, corresponding to selection of a first actionable menu item of the one or more actionable menu items, may be detected. In response to detection of the third user input the monitor device may display a primary menu, e.g. associated with the first actionable menu item. The primary menu may be displayed within the fourth portion of the graphical user interface obscuring the live representation in the fourth portion of the graphical user interface (e.g. the second live representation).
The primary menu may comprise one or more primary actionable items including a first primary actionable item. While the primary menu is displayed, the monitor device may be adapted to detect a fourth user input corresponding to selection of the first primary actionable item.
In accordance with detecting the fourth user input within a threshold amount of time, e.g. 5 seconds, after detection of the third user input, the monitor device may display a secondary menu associated with the primary actionable item in the first portion, and optionally the second portion and/or the fourth portion, of the touch sensitive display.
In accordance with not detecting the fourth user input within the threshold amount of time after detection of the third user input, the monitor device may cease display of the primary menu associated with the first actionable menu item and display the partly obscured live representation (e.g. the second live representation) in the fourth portion and extending into the first portion of the graphical user interface.
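The timeout behaviour described in the two preceding paragraphs can be summarised in a short sketch. The controller class, the ui callbacks and the polling approach below are assumptions chosen for illustration; only the 5-second threshold is taken from the description.

```python
import time

MENU_TIMEOUT_SECONDS = 5.0   # example threshold from the description

class MenuController:
    """Sketch of the primary/secondary menu behaviour in dual view mode."""

    def __init__(self, ui):
        self.ui = ui                       # hypothetical display interface
        self.primary_menu_shown_at = None

    def on_third_user_input(self):
        # Display the primary menu, obscuring part of the live
        # representation shown in the fourth portion.
        self.ui.show_primary_menu()
        self.primary_menu_shown_at = time.monotonic()

    def on_fourth_user_input(self):
        if self.primary_menu_shown_at is None:
            return
        elapsed = time.monotonic() - self.primary_menu_shown_at
        self.primary_menu_shown_at = None
        if elapsed <= MENU_TIMEOUT_SECONDS:
            # Two consecutive inputs within the time frame: display the
            # secondary menu covering the first portion.
            self.ui.show_secondary_menu()

    def on_tick(self):
        # Called periodically: if no fourth user input arrived in time,
        # cease display of the primary menu and restore the full live
        # representation in the fourth portion.
        if (self.primary_menu_shown_at is not None and
                time.monotonic() - self.primary_menu_shown_at > MENU_TIMEOUT_SECONDS):
            self.primary_menu_shown_at = None
            self.ui.hide_primary_menu()
            self.ui.show_full_live_representation()
```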
In response to the user disconnecting one of the connected visualisation devices, the monitor device may display, within the first portion of the graphical user interface, the live representation of image data generated by the image sensor of the visualisation device still being connected. For example, the monitor device may be adapted to detect disconnection of the first visualisation device from the monitor device, and in response to detecting disconnection of the first visualisation device from the monitor device, the monitor device may display, within the first portion of the graphical user interface, the second live representation of second image data generated by the second image sensor of the second visualisation device. Similarly, the monitor device may be adapted to detect disconnection of the second visualisation device from the monitor device, and in response to detecting disconnection of the second visualisation device from the monitor device, the monitor device may display, within the first portion of the graphical user interface, the first live representation of first image data generated by the first image sensor of the first visualisation device.
BRIEF DESCRIPTION OF THE FIGURES
Embodiments of the disclosure will be described in more detail in the following with regard to the accompanying figures. The figures show one way of implementing the present invention and are not to be construed as being limiting to other possible embodiments falling within the scope of the attached claim set.
Fig. 1 schematically illustrates an exemplary medical visualisation system,
Fig. 2 schematically illustrates an exemplary monitor device,
Fig. 3 schematically illustrates an exemplary monitor device,
Fig. 4 is a block diagram of an exemplary monitor device,
Figs. 5A-5D schematically illustrate exemplary user interactions with an exemplary graphical user interface,
Figs. 6A-6B schematically illustrate exemplary user interactions with an exemplary graphical user interface,
Fig. 7 schematically illustrates an exemplary graphical user interface,
Figs. 8A-8D schematically illustrate exemplary user interactions with an exemplary graphical user interface,
Figs. 9A-9D schematically illustrate exemplary user interactions with an exemplary graphical user interface.
DETAILED DESCRIPTION
Various exemplary embodiments and details are described hereinafter, with reference to the figures when relevant. It should be noted that the figures may or may not be drawn to scale and that elements of similar structures or functions are represented by like reference numerals throughout the figures. It should also be noted that the figures are only intended to facilitate the description of the embodiments. They are not intended as an exhaustive description of the invention or as a limitation on the scope of the invention. In addition, an illustrated embodiment need not have all the aspects or advantages shown. An aspect or an advantage described in conjunction with a particular embodiment is not necessarily limited to that embodiment and can be practiced in any other embodiments even if not so illustrated, or if not so explicitly described.
Fig. 1 schematically illustrates an exemplary medical visualisation system 2 comprising a visualisation device 4 and a monitor device 20. The visualisation device 4 has an image sensor 12, e.g. a CCD or a CMOS, configured to generate image data indicative of a view from the visualisation device 4. In the illustrated example, the visualisation device 4 is an endoscope comprising a handle 6 and an elongated flexible member 8, e.g. an insertion tube, extending from the handle 6 to a distal end 10. The image sensor 12 may be configured to generate image data indicative of a view from the distal end 10 of the elongated flexible member 8.
The visualisation device 4 may be connected to the monitor device 20. In the illustrated example, a device cable 14 extending from the handle 6 terminates in a device connector 16 connected to a connection port 40 of the monitor device 20. The monitor device 20 is operable to receive image data generated by the image sensor 12 of the visualisation device 4. For example, the monitor device 20 may receive image data generated by the image sensor 12 via the device cable 14, the connector 16 and connection port 40.
The handle 6 comprises a control button 7 adapted to receive an input in a first input direction and/or in a second input direction. The touch input in the first input direction on the control button 7 causes a distal portion 9 of the elongated flexible member 8 to bend in a first bending direction, e.g. via wires extending from the handle, through the elongated flexible member 8 to the distal portion 9. The touch input in the second input direction on the control button 7 causes the distal portion 9 of the elongated flexible member 8 to bend in a second bending direction. The first input direction and the second input direction may be opposite. The first bending direction and the second bending direction may be opposite. Bending the distal portion 9 of the elongated flexible member 8 may cause a movement of the distal end 10 and the image sensor 12 in a direction relative to the image sensor 12. Thereby, seeing an image generated by the image sensor 12, a direction, e.g. up or down, in the image may correspond to a respective input on the control button 7.
Fig. 2 schematically illustrates an exemplary monitor device 20, such as the monitor device 20 as illustrated in Fig. 1. The monitor device 20 comprises a first housing 25. The first housing 25 extends in a first direction x1 from a first housing side 21 to a second housing side 22 and in a second direction x2 perpendicular to the first direction x1 from a third housing side 23 to a fourth housing side 24. The monitor device comprises a touch sensitive display 26 accommodated in the first housing 25. The touch sensitive display 26 has a first length L1 in the first direction x1 and a second length L2 in the second direction x2. The second length L2 may be longer than the first length L1, as illustrated. The monitor device may comprise one or more connection port(s) 40, such as three connection ports 40, as illustrated. The connection ports 40 may allow visualisation devices to be connected. The connection port(s) 40 may be arranged at the third housing side 23, as illustrated. Alternatively or additionally, connection port(s) 40 may be arranged at the fourth housing side 24.
The monitor device may comprise an on/off button 41, which may be provided on the fourth housing side 24, as illustrated.
Fig. 3 schematically illustrates an exemplary monitor device 20, such as the monitor device 20 as illustrated in Figs. 1-2. As illustrated a device connector 16 may be connected to a connection port 40.
The monitor device 20 may be provided with a graphical user interface 27. The graphical user interface 27 may be displayed with the touch sensitive display 26, and the user may interact with the graphical user interface 27, e.g. by means of providing touch inputs on the touch sensitive display 26.
The graphical user interface 27 is displayed with the touch sensitive display 26. The graphical user interface 27 comprises a plurality of non-overlapping portions 31, 32, 33, 34. Each of the portions 31, 32, 33, 34 extends substantially throughout the first length L1 in the first direction x1. The non-overlapping portions include a first portion 31, a second portion 32, a third portion 33 and a fourth portion 34. The first portion 31 is arranged between the fourth portion 34 and the second portion 32 along the second direction x2. The fourth portion 34 is arranged between the third portion 33 and the first portion 31 along the second direction x2. The third portion 33 is arranged between a side of the first housing, e.g. the third housing side 23, and the fourth portion 34 along the second direction x2. The second portion 32 is arranged between another side of the first housing 25, e.g. the fourth housing side 24, and the first portion 31 along the second direction x2. The first portion 31 and the fourth portion 34 are arranged between the second portion 32 and the third portion 33 along the second direction.
The monitor device 20 displays a live representation 70 of the image data within the first portion 31 of the touch sensitive display 26. The first bending direction and the second bending direction of the distal portion 9 of the elongated flexible member 8, as described with respect to Fig. 1, may correspond to a first image direction 37 and a second image direction 38 of the live representation 70, respectively. The first image direction 37 and the second image direction 38 may be parallel to the first direction x1, as illustrated. The first image direction 37 and the second image direction 38 may be opposite, as illustrated. Thereby, a user operating the control button 7 of the visualisation device 4 may cause the distal portion 9 of the elongated flexible member 8 to bend in a direction corresponding to the first image direction 37 or the second image direction 38 of the live representation 70.
The monitor device 20 displays with the touch sensitive display 26 one or more actionable items 36 within the second portion 32 of the graphical user interface 27. The actionable items 36 may comprise an image capture button 36a, e.g. for storing an image data file corresponding to the image data received when the image capture button 36a was activated. Alternatively or additionally, the actionable items 36 may comprise a video capture button 36b, e.g. for storing a video sequence of image data corresponding to the image data received when the video capture button 36b was activated.
The monitor device 20 displays with the touch sensitive display 26 one or more actionable menu items 42 within the third portion 33 of the graphical user interface 27. The actionable menu items 42 may, for example, comprise a login menu item for initiating a login procedure, a settings menu item for accessing a settings menu, an archive menu item for browsing an archive, and a default menu item for returning to a default view. Also a battery indicator 50 is displayed in the third portion 33.
Fig. 4 is a block diagram of an exemplary monitor device 20, such as the monitor device 20 of the previous figures. The monitor device 20 comprises a processing unit 60 and memory 62. The memory 62 may comprise both volatile and non-volatile memory. The monitor device 20 also comprises an orientation sensor 64 for determining the orientation of the first housing 25 relative to gravity. The orientation sensor 64 may comprise one or more accelerometers and/or a gyroscope. The monitor device 20 comprises an input/output module 66, such as for receiving image data from the image sensor 12 via connectors of the visualisation device 4. The input/output module 66 may also comprise an ethernet connector, WiFi transceiver, Bluetooth transceiver, video connectors, USB ports, etc., and respective controllers. The monitor device 20 also comprises the touch sensitive display 26 as described earlier. The monitor device 20 may display information, graphical user interface objects, images, buttons, etc., with the touch sensitive display 26. The monitor device 20 also comprises a microphone 68. The monitor device 20 comprises a power unit 61 for powering the monitor device 20. The power unit 61 may comprise a rechargeable battery 61a. The power unit 61 may comprise a power connection 61b for connecting the power unit 61 to an external power supply, such as a conventional AC power socket. The components of the monitor device 20 may be interconnected by buses or signal lines. Some or all of the components of the monitor device may be accommodated in the first housing 25 as illustrated. However, alternatively some of the components, e.g. the processing unit 60, the memory 62, the input/output module 66 and/or the power unit 61 may be accommodated in a second housing of the monitor device 20.
The power unit 61 may comprise components for, e.g. indirectly measuring, capacity of the rechargeable battery 61a. For example, the power unit 61 may comprise a voltage gauge to measure the voltage of the rechargeable battery 61a. Based on the measured voltage, the remaining capacity of the rechargeable battery 61a may be estimated, e.g. by the processing unit 60. The power unit 61 may also comprise components for measuring power consumption of the monitor device 20. For example, the power unit 61 may comprise a power meter to measure the rate at which the monitor device 20 consumes power from the rechargeable battery 61a. The voltage gauge may be a low current consumption integrated circuit or a resistor coupled in parallel with the battery. A current sensor may be provided, and the power may be computed as the product of the voltage and current. Additionally, an integrated circuit may be provided that includes a voltage gauge and a current sensor, and which outputs a power value in digital form.
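The indirect battery estimation described above can be sketched as follows; the linear voltage-to-capacity model and all numeric parameters are illustrative assumptions, the only relationship taken from the description being that power may be computed as the product of the measured voltage and current.

```python
def estimate_battery_state(voltage_v: float, current_a: float,
                           nominal_voltage_v: float = 7.4,
                           cutoff_voltage_v: float = 6.0,
                           capacity_wh: float = 30.0) -> dict:
    """Estimate remaining capacity and power consumption (rough sketch).

    The remaining capacity is estimated from the measured voltage using a
    crude linear model; the power consumption is the product of the
    measured voltage and current.
    """
    fraction = (voltage_v - cutoff_voltage_v) / (nominal_voltage_v - cutoff_voltage_v)
    fraction = max(0.0, min(1.0, fraction))
    power_w = voltage_v * current_a
    remaining_wh = fraction * capacity_wh
    runtime_h = remaining_wh / power_w if power_w > 0 else float("inf")
    return {"remaining_fraction": fraction,
            "power_w": power_w,
            "estimated_runtime_h": runtime_h}

# Example: 7.0 V at 1.2 A gives 8.4 W and roughly 71% estimated capacity.
state = estimate_battery_state(voltage_v=7.0, current_a=1.2)
```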
The monitor device 20 may display content with the touch sensitive display 26. For example, the monitor device 20 may display content by the processing device 60 transmitting instructions to the touch sensitive display 26 indicative of the content to be displayed. The monitor device 20 may receive user input with the touch sensitive display 26. Particularly, the monitor device 20 may detect user inputs with the touch sensitive display 26. For example, a user providing a touch input on the touch sensitive display 26 causes a change in one or more electrical parameters of the touch sensitive display 26 indicative of at least the location of the touch input. Information of the touch input is transmitted from the touch sensitive display 26 to the processing unit 60, and the processing unit 60 may determine whether the touch input corresponds to an action to perform, e.g. whether the location of the touch input corresponds to the location of a soft-button displayed at the touch sensitive display.
The user may interact with the monitor device 20 via the graphical user interface 27 by providing user inputs, e.g. by means of providing touch inputs on the touch sensitive display 26, and the monitor device 20 may detect such user inputs with the touch sensitive display 26. A touch input, e.g. a single tap, double tap, swipe or similar, and the location of the touch input on the touch sensitive display 26 is registered by the touch sensitive display 26, which transmits information of the touch input (e.g. including type of touch (double tap, single tap, swipe, etc.) and/or location of the touch) to the processing unit 60 of the monitor device 20. The processing unit 60 interprets the information received and determines whether the touch input corresponds to activation of an action, e.g. whether the touch input corresponds to activation of a button displayed with the touch sensitive display 26 at the location of the touch input. In response to a determination that the touch input corresponds to activation of an action, the processing unit 60 performs the respective action.
For example, with reference to Figs. 3 and 4, to capture an image corresponding to the presently shown live representation 70, e.g. corresponding to the image data received from the image sensor, the user may tap the image capture button 36a. The tap and the location of the tap is registered by the touch sensitive display 26, which transmits the information of the tap to the processing unit 60 of the monitor device 20. The processing unit 60 interprets the information received and determines that the user tapped the location corresponding to the image capture button 36a. In response thereto, the processing unit 60 stores, in memory 62 an image data file corresponding to the image data received.
In further reference to Figs. 3 and 4, to capture a video sequence corresponding to the shown live representation 70 over a period of time, e.g. corresponding to the image data received from the image sensor over a period of time, the user may tap the video capture button 36b. The tap and the location of the tap is registered by the touch sensitive display 26, which transmits the information of the tap to a processing unit 60 (see Fig. 4) of the monitor device 20. The processing unit 60 interprets the information received and determines that the user tapped the location corresponding to the video capture button 36b. In response thereto, the processing unit 60 starts collection of image data received from the image sensor 12 and temporarily stores the data in memory 62. To stop the recording, the user may tap the video capture button 36b again. The processing unit 60 determines, based on the signal received from the touch sensitive display 26, that the user tapped the video capture button 36b and stops collecting image data received from the image sensor 12. The processing unit 60 reads the temporarily stored data from the memory 62 and creates a complete video sequence based thereon, which the processing unit 60 stores in the memory 62.
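The tap handling walked through in the preceding paragraphs amounts to a simple hit test followed by dispatch to the action of the button at the touch location. The sketch below illustrates this; the SoftButton class, coordinates and callback signature are assumptions for the example.

```python
from typing import Callable, List, Tuple

class SoftButton:
    """Hypothetical soft button with a rectangular hit area and an action."""

    def __init__(self, name: str, x: int, y: int, width: int, height: int,
                 action: Callable[[], None]):
        self.name = name
        self.rect = (x, y, width, height)
        self.action = action

    def contains(self, location: Tuple[int, int]) -> bool:
        x, y, w, h = self.rect
        px, py = location
        return x <= px < x + w and y <= py < y + h

def dispatch_touch(buttons: List[SoftButton], touch_type: str,
                   location: Tuple[int, int]) -> None:
    """Dispatch a reported touch to the button containing its location.

    The touch sensitive display reports the touch type (tap, double tap,
    swipe, ...) and its location; the processing unit checks whether the
    location corresponds to a displayed button and performs its action.
    """
    if touch_type != "tap":
        return
    for button in buttons:
        if button.contains(location):
            button.action()
            return

# Example with a hypothetical image capture button area.
buttons = [SoftButton("image_capture", 1100, 300, 120, 120,
                      action=lambda: print("store image data file"))]
dispatch_touch(buttons, "tap", (1150, 350))
```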
Figs. 5A-5D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures. In the illustrated examples the monitor device 20 has a first connection port 40a, a second connection port 40b, and a third connection port 40c. Each of the connection ports 40a-40c is further labelled with a respective port indicator. For example, the first connection port 40a is labelled with a first port indicator 106, the second connection port 40b is labelled with a second port indicator 108, and the third connection port 40c is labelled with a third port indicator 109. The port indicators 106, 108, 109 may be printed or engraved on the monitor device 20, such as on the first housing 25 of the monitor device 20. Alternatively or additionally, the port indicators 106, 108, 109 may be displayed with the touch sensitive display 26 in the vicinity of the respective connection ports.
Fig. 5A schematically illustrates a monitor device 20 with no connected visualisation devices, e.g. the connection ports 40a-40c are empty. For example, this situation may correspond to a situation where a user initially powers on the device, e.g. by pressing the on/off button 41. In this situation, since no visualisation device is connected, an animation 100 of connecting a visualisation device to the monitor device 20 is displayed within the first portion 31 of the graphical user interface.
Fig. 5B schematically illustrates the monitor device 20, e.g. following the situation as shown in Fig. 5A, where, in comparison to the example of Fig. 5A, a first visualisation device has been connected, by a first device connector 16a of the first visualisation device being received at the first connection port 40a. In this situation, e.g. in response to establishing the connection to the first visualisation device, a first live representation 70a of first image data generated by a first image sensor of the first visualisation device is displayed within the first portion 31 of the graphical user interface.
Fig. 5C schematically illustrates the monitor device 20, e.g. following the situation as shown in Fig. 5B, where, in comparison to the example of Fig. 5B, a second visualisation device has been connected, by a second device connector 16b of the second visualisation device being received at the second connection port 40b. The second visualisation device is connected while the first visualisation device is also connected to the monitor device 20.
Consequently, e.g. in response to establishing connection to the second visualisation device, the monitor device 20 enters a dual view mode, wherein the monitor device concurrently displays the first live representation 70a of the first image data generated by the first image sensor and a second live representation 70b of second image data generated by a second image sensor of the second visualisation device.
The second live representation 70b is displayed in the fourth portion 34 of the graphical user interface and extending into the first portion 31 of the graphical user interface. The first live representation 70a is displayed in the second portion 32 of the graphical user interface and extending into the first portion 31 of the graphical user interface.
Furthermore, as conventional display screens have a non-square aspect ratio, e.g. 16:9 or 16:10, and because the image data from the image sensor may be square (or may be cropped to a square format), e.g. due to conventions or de-facto standards within the field of medical visualisation, showing the two live representations 70a, 70b side by side while utilizing the entire height of the display for a maximum size view, will result in (further) cropping, hiding or distortion of the representations or part thereof. In medical imaging, it is important that the user is aware in case he/she chooses to deviate from a standard or de-facto standard way of viewing the representation. Thus, it may be advantageous to avoid hiding or distorting part of a conventional view, unless deliberately and knowingly chosen by the operator. Thus, to maximize one image, it is necessary to reduce the other. As seen, the first live representation 70a is displayed in reduced size compared to the second live representation 70b. The newly connected visualisation device, e.g. the second visualisation device, is displayed in full size, while the previously connected visualisation device is displayed in reduced size. It is preferred that the newly connected visualisation device is shown biggest, as it has been found by the present inventors that a newly connected visualisation device is connected with an intention of using that device, and that therefore it was found advantageous to show the newly connected visualisation device with the biggest image. The present disclosure thereby provides a solution for optimally showing live representations from two simultaneously connected visualisation devices, utilizing a conventional sized display screen. Hence, production costs may be lowered as the need for customized components is reduced.
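The sizing trade-off described above can be made concrete with a small sketch: two full-height square views cannot fit side by side on a 16:9 or 16:10 display, so one view is shown at full height and the other reduced. The reduction factor and the display dimensions below are assumptions for illustration.

```python
def dual_view_sizes(display_width: int, display_height: int,
                    small_fraction: float = 0.45) -> tuple:
    """Compute sizes for a full-size and a reduced square live view.

    The newly connected device gets the largest possible square view
    (limited by the display height); the previously connected device is
    shown as a reduced square next to it.
    """
    large_side = display_height                        # full-height square
    small_side = int(display_height * small_fraction)  # reduced square
    # Two full-height squares would need 2 * display_height in width,
    # which a 16:9 or 16:10 display cannot provide without cropping,
    # hiding or distortion; hence the reduced second view.
    assert large_side + small_side <= display_width
    return (large_side, large_side), (small_side, small_side)

# Example for a 1280x800 display: an 800x800 view next to a 360x360 view.
large, small = dual_view_sizes(1280, 800)
```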
The live representations may be overlaid with an indicator to allow mapping between the displayed representations and the physical visualisation devices. For example, a first indicator 102 is overlaid on the first live representation 70a, and a second indicator 104 is overlaid on the second live representation 70b. Furthermore, the first connection port 40a is labelled with a first port indicator 106 resembling the first indicator 102, and the second connection port 40b is labelled with a second port indicator 108 resembling the second indicator 104. Thereby, the operator is able to identify the visualisation device corresponding to a given live representation.
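One conceivable way to realise this mapping, not mandated by the disclosure, is to key the indicator glyphs by connection port so that the same symbol appears both next to the physical port and overlaid on the corresponding live representation; the names and glyphs in the short Python sketch below are assumptions.

```python
# Hypothetical indicator glyphs; the disclosure only requires that the on-screen
# indicator resembles the label printed next to the physical connection port.
PORT_INDICATORS = {
    "port_1": "\u25CF",  # filled circle: shown at port 40a and over representation 70a
    "port_2": "\u25A0",  # filled square: shown at port 40b and over representation 70b
}


def indicator_for(port_id: str) -> str:
    """Return the indicator to overlay on the live representation fed by port_id."""
    return PORT_INDICATORS[port_id]


if __name__ == "__main__":
    print(indicator_for("port_1"), indicator_for("port_2"))
```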
In response to establishing connection to the second visualisation device, while the first visualisation device remains connected, the monitor device further displays a rearrange icon 36c.
As illustrated in Fig. 5D, in the event that the user disconnects the first visualisation device, leaving only the second visualisation device connected, the monitor device displays the live representation of the remaining visualisation device in the first portion 31 of the graphical user interface. The monitor device 20, e.g. the processing unit of the monitor device, may be adapted to detect disconnection of a visualisation device, such as of the first visualisation device, from the monitor device 20. In response to detecting disconnection of the first visualisation device from the monitor device 20, as illustrated in Fig. 5D, the monitor device displays, within the first portion of the graphical user interface, the second live representation of second image data generated by the second image sensor of the second visualisation device.
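The connect/disconnect behaviour of Figs. 5A-5D may be summarised as a small state machine. The Python sketch below is illustrative only; the class name, method names and returned descriptions are hypothetical and do not form part of the disclosed device.

```python
class ViewController:
    """Minimal sketch of the view-mode transitions of Figs. 5A-5D."""

    def __init__(self) -> None:
        self.connected: list[str] = []  # connection port ids, in order of connection

    def on_connect(self, port_id: str) -> str:
        self.connected.append(port_id)
        if len(self.connected) == 1:
            # Single device: its live representation fills the first portion.
            return f"single view: {port_id} in first portion"
        # Dual view mode: newest device full size, earlier device reduced.
        return (f"dual view: {self.connected[-1]} full size, "
                f"{self.connected[-2]} reduced")

    def on_disconnect(self, port_id: str) -> str:
        self.connected.remove(port_id)
        if len(self.connected) == 1:
            # Remaining device takes over the first portion (Fig. 5D).
            return f"single view: {self.connected[0]} in first portion"
        return "no device connected: show connection hints"


if __name__ == "__main__":
    ctrl = ViewController()
    print(ctrl.on_connect("port_1"))
    print(ctrl.on_connect("port_2"))
    print(ctrl.on_disconnect("port_1"))
```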
Figs. 6A-6B schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures.
Fig. 6A schematically illustrates a monitor device 20, e.g. following the situation as shown in Fig. 5C, where a first visualisation device and a second visualisation device are connected to the monitor device 20 at the same time. As mentioned above, the monitor device displays a rearrange icon 36c when both a first visualisation device and a second visualisation device are connected.
Fig. 6A particularly illustrates the situation where a user provides a first user input 110, e.g. a touch input, such as a tap, at the touch sensitive display 26 at a location corresponding to the rearrange icon 36c. Thus, the monitor device 20, such as the processing unit of the monitor device 20, may determine, based on the signal from the touch sensitive display 26, that the first user input 110 corresponds to selection of the rearrange icon 36c.
As illustrated in Fig. 6B, in response to detecting the first user input 110, the display of the second live representation 70b in the fourth portion 34 and extending into the first portion 31 of the graphical user interface is replaced with display of the first live representation 70a. Furthermore, also in response to detecting the first user input 110, the display of the first live representation 70a in the second portion 32 and extending into the first portion 31 of the graphical user interface is replaced with display of the second live representation 70b. Consequently, the second live representation 70b is displayed in reduced size compared to the first live representation 70a.
The first indicator 102 is repositioned to continuously be overlaid on the first live representation 70a, and the second indicator 104 is repositioned to continuously be overlaid on the second live representation 70b.
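The effect of the rearrange icon may be understood as swapping the slot assignments of the two live representations while leaving the port-keyed indicators untouched; the minimal Python sketch below uses hypothetical names and is given only as an aid to understanding.

```python
def on_rearrange_tap(assignments: dict[str, str]) -> dict[str, str]:
    """Swap which live representation occupies the large and the reduced slot.

    `assignments` maps a slot name to the connection port feeding it, e.g.
    {"large": "port_2", "reduced": "port_1"}. The overlaid indicators follow
    their representation automatically because they are keyed by port, not slot.
    """
    return {"large": assignments["reduced"], "reduced": assignments["large"]}


if __name__ == "__main__":
    before = {"large": "port_2", "reduced": "port_1"}
    print(on_rearrange_tap(before))  # {'large': 'port_1', 'reduced': 'port_2'}
```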
As explained with respect to Fig. 3, the monitor device 20 displays one or more actionable items 36 within the second portion 32 of the graphical user interface. The actionable items 36 may comprise an image capture button 36a, e.g. for storing an image data file corresponding to the image data received when the image capture button 36a was activated. Alternatively or additionally, the actionable items 36 may comprise a video capture button 36b, e.g. for storing a video sequence of image data corresponding to the image data received when the video capture button 36b was activated.
Fig. 7 schematically illustrates an exemplary user interaction with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures. Particularly, Fig. 7 illustrates that, also when a plurality of visualisation devices are connected (e.g. a first visualisation device and a second visualisation device, as illustrated), the monitor device displays the one or more actionable items 36, allowing, e.g., storing of image data as well as of video sequences. The actionable items 36 are displayed within the second portion 32 of the graphical user interface.
The monitor device 20 is adapted, e.g. via the touch sensitive display, to detect a second user input 112 corresponding to selection of the image capture button 36a. In response to detection of the second user input 112, the monitor device 20, such as the processing unit of the monitor device 20, stores a first image file corresponding to the first image data received when the second user input 112 was detected and stores a second image file corresponding to the second image data received when the second user input 112 was detected. Thus, both an image file corresponding to the first visualisation device and an image file corresponding to the second visualisation device may be stored. The same may apply to video capturing. For example, in response to detection of a user input corresponding to selection of the video capture button 36b, the monitor device 20, such as the processing unit of the monitor device 20, stores a first video sequence of image data corresponding to the first image data received when the user input was detected and stores a second video sequence of image data corresponding to the second image data received when the user input was detected.
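A minimal sketch of this dual capture behaviour is given below, assuming that the most recently received frame per connection port and a per-device procedure session folder are available; the function name, the data structures, the file format and the timestamped file name are all assumptions.

```python
import time
from pathlib import Path


def on_image_capture(latest_frames: dict[str, bytes],
                     session_dirs: dict[str, Path]) -> list[Path]:
    """Store one image file per connected visualisation device.

    `latest_frames` maps connection port id -> most recently received frame,
    and `session_dirs` maps connection port id -> the folder of that device's
    procedure session. Both structures are assumptions for this sketch.
    """
    stamp = time.strftime("%Y%m%d-%H%M%S")
    stored: list[Path] = []
    for port_id, frame in latest_frames.items():
        path = session_dirs[port_id] / f"image-{stamp}.raw"
        path.write_bytes(frame)  # one file per device, triggered by the same user input
        stored.append(path)
    return stored
```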
When a visualisation device is connected, e.g. when the monitor device 20 and/or the processing unit of the monitor device detects connection of a visualisation device, device identifier information from a device identifier of the respective visualisation device may be obtained. For example, the visualisation device(s) may be fitted with an EPROM (alternatively a QR code, an RFID tag, an NFC tag, etc. may be used), which the monitor device 20 is able to read. For example, the processing unit of the monitor device may execute a process for interrogating the device identifier via the device connector and connection port. The EPROM may store information about the visualisation device, e.g. a serial number of the visualisation device, which may uniquely identify the visualisation device. The device identifier information may also be indicative of the type of visualisation device, e.g. whether it is an endoscope or a laryngoscope, the brand of the visualisation device, the production version, a batch number, etc. In response to detecting connection of a visualisation device, and after obtaining the device identifier information, the monitor device may open (create or reopen, depending on whether the visualisation device has previously been connected) a procedure session corresponding to the device identifier information. For example, a first procedure session may be opened corresponding to the first device identifier information obtained from the first visualisation device, and a second procedure session may be opened corresponding to the second device identifier information obtained from the second visualisation device. The procedure sessions may be unique, and reconnecting a previously connected visualisation device therefore causes the monitor device to reopen the previously created session.
Stored video sequences or image files may be associated with the procedure session for the respective visualisation device. For example, the first image file stored in response to detection of the second user input 112 may be associated with the first procedure session, and the second image file stored in response to detection of the second user input 112 may be associated with the second procedure session. A procedure session may be implemented by creating a folder in the file system of the monitor device, wherein image files and video sequences obtained from a visualisation device are stored in the folder corresponding to that visualisation device.
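One way to realise such a procedure session, consistent with the folder-based implementation mentioned above, is to key the folder by the serial number read from the device identifier so that reconnecting the same device reopens its existing folder; the Python sketch below is an assumption with hypothetical names.

```python
from pathlib import Path


def open_procedure_session(serial_number: str,
                           archive_root: Path = Path("archive")) -> Path:
    """Create or reopen the procedure session folder for a visualisation device.

    The serial number is assumed to be read from the device identifier (e.g. an
    EPROM) when the connection is established; reconnecting the same device
    therefore maps back to its existing folder instead of creating a new session.
    """
    session_dir = archive_root / serial_number
    session_dir.mkdir(parents=True, exist_ok=True)  # reopen if it already exists
    return session_dir


if __name__ == "__main__":
    print(open_procedure_session("SN-0001"))
```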
Figs. 8A-8D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures. Particularly, the examples may follow from Fig. 5B, wherein only a single visualisation device is connected.
As illustrated in Fig. 8A, the monitor device 20 may receive a first primary user input 180 corresponding to selection of a first actionable menu item 42a of the one or more actionable menu items 42 (cf. Fig. 3). In the illustrated example, the first primary user input 180 is a touch input, e.g. a tap, on an icon, i.e. the first actionable menu item, indicating access to an archive.
The monitor device 20 detects the first primary user input 180 with the touch sensitive display 26, and in response to detecting the first primary user input 180, the monitor device 20 displays a primary menu 182, as illustrated in Fig. 8B. The primary menu 182 is associated with the first actionable menu item 42a. The primary menu 182 is displayed within the fourth portion 34 of the graphical user interface, without obscuring any part of the first portion 31 of the graphical user interface. The primary menu 182 comprises one or more primary actionable items including a first primary actionable item 183a. In the illustrated example, the primary menu 182 is an archive menu with a first primary actionable item 183a to retrieve recent (e.g. recently stored) images and videos, and a second primary actionable item 183b to search for stored images and videos. While displaying the primary menu 182, the monitor device 20 may detect with the touch sensitive display 26 a second primary user input 184 corresponding to selection of the first primary actionable item 183a, as illustrated in Fig. 8C. In response to detecting the second primary user input 184, the monitor device displays a secondary menu 186 associated with the first primary actionable item 183a in the first portion 31, and optionally the second portion 32, of the graphical user interface, as illustrated in Fig. 8D. Optionally, the secondary menu 186 also extends into the fourth portion 34 of the graphical user interface.
In the illustrated example, the user provides the touch input 184 on the first primary actionable item 183a, which is a button for retrieving recent images and videos (Fig. 8C). In response to detecting the touch input 184 with the touch sensitive display, the monitor device 20 displays the secondary menu 186, in the illustrated example a list of the recently stored procedures (Fig. 8D), from which the user may navigate to retrieve the images and videos stored therein. The stored procedures in the list 186 are named according to the date on which they were performed, and the list 186 shows the timeframe of each procedure and any notes that the user may have added to the procedure.
As seen in Figs. 8A-8D, the user needs to make two consecutive inputs to display a menu which covers part of, or the entire, live representation of the image data displayed in the first portion 31 of the graphical user interface. Thereby, unintentional touch inputs on the screen are less likely to interfere with the display of the live representation of the image data, which could potentially be life threatening in case the operator is performing a critical procedure with the aid of the live representation of the image data. In some examples, the primary menu (e.g. 182 as illustrated in Fig. 8B), displayed in response to the first primary user input (e.g. 180 as illustrated in Fig. 8A), ceases to be displayed if a second primary user input (e.g. the second primary user input 184 in Fig. 8C) is not received within a predetermined time from receiving/detecting the first primary user input and/or from displaying the primary menu. Thereby, the likelihood of two unintentional inputs causing the live representation of the image data to be obstructed may be decreased.
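The two-step menu activation with a timeout may be sketched as follows; the 5 second default reflects the example threshold mentioned for the dual view mode further below, while the class name, method names and returned descriptions are purely illustrative assumptions.

```python
from __future__ import annotations

import time


class MenuGuard:
    """Sketch of the two-step menu activation with a timeout (Figs. 8A-8D)."""

    def __init__(self, timeout_s: float = 5.0) -> None:
        self.timeout_s = timeout_s
        self._primary_opened_at: float | None = None

    def on_menu_item_tap(self) -> str:
        # The first input only opens the primary menu outside the first portion.
        self._primary_opened_at = time.monotonic()
        return "show primary menu (live view in first portion stays visible)"

    def on_primary_item_tap(self) -> str:
        # A second input within the timeout is required before anything may
        # cover the live representation in the first portion.
        if (self._primary_opened_at is not None
                and time.monotonic() - self._primary_opened_at <= self.timeout_s):
            return "show secondary menu over the first portion"
        return "ignore: primary menu timed out"

    def tick(self) -> str | None:
        # Called periodically; hides the primary menu after the timeout so an
        # unintentional first tap cannot leave the interface half-open.
        if (self._primary_opened_at is not None
                and time.monotonic() - self._primary_opened_at > self.timeout_s):
            self._primary_opened_at = None
            return "hide primary menu, restore full live representation"
        return None
```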
Figs. 9A-9D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20, such as the monitor device 20 of any of the previous figures. Fig. 9A illustrates a situation which may continue from the situation illustrated in, e.g., Fig. 5C.
Fig. 9A illustrates the monitor device 20 and the graphical user interface in a situation where two visualisation devices are connected, and where one live representation, e.g. the second live representation 70b, is displayed in the fourth portion 34 of the graphical user interface and extends into the first portion 31, while the first live representation 70a is displayed in the second portion 32 of the graphical user interface and extends into the first portion 31.
As illustrated, the monitor device receives a third user input 190, e.g. corresponding to the first primary user input 180 of Fig. 8A, to an actionable menu item, such as to the first actionable menu item 42a. The monitor device 20 detects the third user input 190 with the touch sensitive display 26, and in response to detecting the third user input 190, the monitor device 20 displays, within the fourth portion 34 of the graphical user interface, the primary menu 192 associated with the actionable menu item receiving the third user input 190. Because the monitor device 20 is operating in the dual view mode, where live representations of two connected visualisation devices are displayed, including displaying the live representation 70b in the fourth portion 34, display of the primary menu 192 within the fourth portion 34 of the graphical user interface causes a part of the live representation 70b of the image data to be obscured.
The primary menu 192 is associated with the first actionable menu item 42a. The primary menu 192 comprises one or more primary actionable items including a first primary actionable item 193a. In the illustrated example, the primary menu 192 is an archive menu (similar to the primary menu 182 of Figs. 8B and 8C) with a first primary actionable item 193a to retrieve recent (e.g. recently stored) images and videos, and a second primary actionable item 193b to search for stored images and videos.
While displaying the primary menu 192, the monitor device 20 may detect with the touch sensitive display 26 a fourth user input 194 corresponding to selection of the first primary actionable item 193a, as illustrated in Fig. 9C, e.g. corresponding to the second primary user input 184 of Fig. 8C. In response to detecting the fourth user input 194, the monitor device displays a secondary menu 196 associated with the first primary actionable item 193a in the first portion 31, and optionally the second portion 32, of the graphical user interface, as illustrated in Fig. 9D. Optionally, the secondary menu 196 also extends into the fourth portion 34 of the graphical user interface.
In accordance with not receiving and/or detecting the fourth user input 194 within a threshold amount of time, e.g. 5 seconds, after receipt/detection of the third user input 190, the monitor device may cease display of the primary menu 192 and display the entire live representation 70b of the image data in the fourth portion 34 and extending into the first portion 31 of the graphical user interface. Thus, the obscuring of part of the live representation 70b of the image data in the dual view mode may be limited by a timeout. Furthermore, the likelihood of two unintentional inputs causing the live representations 70a, 70b to be completely obstructed may be decreased.
Similar to when operating in the normal view mode (Figs. 8A-8D), the monitor device 20, in the dual view mode (Figs. 9A-9D), needs to receive two consecutive inputs, e.g. within a time frame, to display a menu, which covers the entire live representation of the image data displayed in the first portion 31 of the graphical user interface.
The invention has been described with reference to preferred embodiments. However, the scope of the invention is not limited to the illustrated embodiments, and alterations and modifications can be carried out without deviating from the scope of the invention. Throughout the description, the use of the terms "first", "second", "third", "fourth", "primary", "secondary", "tertiary" etc. does not imply any particular order of importance but is included to identify individual elements. Furthermore, the labelling of a first element does not imply the presence of a second element and vice versa.
LIST OF REFERENCES
2 medical visualisation system
4 visualisation device
6 handle
7 control button
8 elongated flexible member
9 distal part
10 distal end of elongated flexible member
12 image sensor
14 device cable
16 device connector
20 monitor device
21 first housing side
22 second housing side
23 third housing side
24 fourth housing side
25 first housing
26 touch sensitive display
27 graphical user interface
31 first portion
32 second portion
33 third portion
34 fourth portion
36 actionable item(s)
37 first image direction
38 second image direction
40 connection port(s)
42 actionable menu item(s)
44 invert view button
46 inverted view mode indicator
50 battery indicator
60 processing unit
61 power supply
61a battery
61b power connection
62 memory
64 orientation sensor
66 input/output
68 microphone
70 live representation of image data
x1 first direction
x2 second direction
L1 first length
L2 second length

Claims

1. A monitor device of a medical visualisation system comprising a plurality of visualisation devices each having an image sensor configured to generate image data indicative of a view from the visualisation device, the monitor device being operable to receive the image data as the image data is being generated by the image sensors of the plurality of visualisation devices, the monitor device comprising a first housing extending in a first direction from a first housing side to a second housing side and in a second direction perpendicular to the first direction from a third housing side to a fourth housing side, the monitor device comprising a touch sensitive display accommodated in the first housing and having a first length in the first direction and a second length in the second direction, and the monitor device comprising a graphical user interface comprising a plurality of non-overlapping portions including a first portion, a second portion, a third portion and a fourth portion, wherein the first portion and the fourth portion are arranged between the second portion and the third portion along the second direction, and wherein the fourth portion is arranged between the first portion and the third portion along the second direction, the monitor device displays the graphical user interface with the touch sensitive display, wherein the monitor device is adapted to establish connection to a first visualisation device of the plurality of visualisation devices; and in response to establishing the connection to the first visualisation device, the monitor device displays, within the first portion of the graphical user interface, a first live representation of first image data generated by a first image sensor of the first visualisation device; and wherein the monitor device, while the first visualisation device is connected to the monitor device, is further adapted to establish connection to a second visualisation device of the plurality of visualisation devices, and in response to establishing the connection to the second visualisation device the monitor device concurrently displays:
- in the fourth portion and extending into the first portion of the graphical user interface, a second live representation of second image data generated by a second image sensor of the second visualisation device; and
- in the second portion and extending into the first portion of the graphical user interface, the first live representation of first image data generated by the first image sensor of the first visualisation device, wherein the first live representation is displayed in reduced size compared to the second live representation.
2. Monitor device according to claim 1, wherein the monitor device comprises a plurality of connection ports for receiving connectors of the visualisation devices, the plurality of connection ports including a first connection port and a second connection port, and wherein to establish connection to the first visualisation device a first connector of the first visualisation device is received by the first connection port, and to establish connection to the second visualisation device a second connector of the second visualisation device is received by the second connection port.
3. Monitor device according to any of the preceding claims, wherein a first indicator is overlaid on the first live representation, and a second indicator is overlaid on the second live representation.
4. Monitor device according to claim 3 as dependent on claim 2 wherein the first connection port is labelled with a first port indicator resembling the first indicator and the second connection port is labelled with a second port indicator resembling the second indicator.
5. Monitor device according to any of the preceding claims, wherein in response to establishing connection to the second visualisation device, the monitor device further displays a rearrange icon and is adapted to detect a first user input corresponding to selection of the rearrange icon, and in response to detection of the first user input the monitor device:
- replaces display of the second live representation in the fourth portion and extending into the first portion of the graphical user interface, with display of the first live representation; and
- replaces display of the first live representation in the second portion and extending into the first portion of the graphical user interface, with display of the second live representation, wherein the second live representation is displayed in reduced size compared to the first live representation.
6. Monitor device according to any of the preceding claims, wherein the monitor device displays one or more actionable items within the second portion of the graphical user interface, and wherein the one or more actionable items comprise an image capture button, and wherein the monitor device is adapted to, while the first visualisation device and the second visualisation device are connected to the monitor device, detect a second user input corresponding to selection of the image capture button, and in response to detection of the second user input the monitor device:
- stores a first image file corresponding to the first image data received when the second user input was detected; and
- stores a second image file corresponding to the second image data received when the second user input was detected.
7. Monitor device according to any of the preceding claims, wherein establishing connection to the first visualisation device includes obtaining first device identifier information from a first device identifier of the first visualisation device, and in response to establishing connection to the first visualisation device the monitor device opens a first procedure session corresponding to the first device identifier information, and establishing connection to the second visualisation device includes obtaining second device identifier information from a second device identifier of the second visualisation device, and in response to establishing connection to the second visualisation device the monitor device opens a second procedure session corresponding to the second device identifier information.
8. Monitor device according to claim 7 as dependent on claim 6, wherein in response to detection of the second user input the monitor device:
- associates the first image file with the first procedure session; and associates the second image file with the second procedure session.
9. Monitor device according to any of the preceding claims, wherein the monitor device displays one or more actionable menu items within the third portion of the graphical user interface, and wherein the monitor device is further adapted to detect a third user input corresponding to selection of a first actionable menu item of the one or more actionable menu items, and in response to detection of the third user input the monitor device displays a primary menu associated with the first actionable menu item within the fourth portion of the graphical user interface obscuring the second live representation in the fourth portion of the graphical user interface, wherein the primary menu comprises one or more primary actionable items including a first primary actionable item, while the primary menu is displayed, the monitor device is adapted to detect a fourth user input corresponding to selection of the first primary actionable item, and in accordance with detecting the fourth user input within a threshold amount of time after detection of the third user input, the monitor device displays a secondary menu associated with the primary actionable item in the first portion, and optionally the second portion and/or the fourth portion, of the touch sensitive display, in accordance with not detecting the fourth user input within the threshold amount of time after detection of the third user input, the monitor device ceases display of the primary menu associated with the first actionable menu item and displays the second live representation in the fourth portion and extending into the first portion of the graphical user interface.
10. Monitor device according to any of the preceding claims, wherein the monitor device is adapted to detect disconnection of the first visualisation device from the monitor device, and in response to detecting disconnection of the first visualisation device from the monitor device, the monitor device displays, within the first portion of the graphical user interface, the second live representation of second image data generated by the second image sensor of the second visualisation device.
PCT/EP2021/053964 2020-02-21 2021-02-18 Graphical user interface handling a plurality of visualisation devices WO2021165361A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPA202070112 2020-02-21
DKPA202070112 2020-02-21

Publications (1)

Publication Number Publication Date
WO2021165361A1 true WO2021165361A1 (en) 2021-08-26

Family

ID=74672323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/053964 WO2021165361A1 (en) 2020-02-21 2021-02-18 Graphical user interface handling a plurality of visualisation devices

Country Status (1)

Country Link
WO (1) WO2021165361A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009112644A (en) * 2007-11-08 2009-05-28 Olympus Medical Systems Corp Image processor
US20160278611A1 (en) * 2013-03-20 2016-09-29 Covidien Lp System and method for enhancing picture-in-picture display for imaging devices used for surgical procedures
US20190033571A1 (en) * 2014-05-30 2019-01-31 General Electric Company Systems and methods for providing monitoring state-based selectable buttons to non-destructive testing devices
US20190142262A1 (en) * 2017-11-15 2019-05-16 Aircraft Medical Ltd. Multifunctional visualization instrument

Similar Documents

Publication Publication Date Title
US20210259536A1 (en) Video laryngoscope systems and methods
US11910998B2 (en) Medical visualisation system including a monitor and a graphical user interface therefore
EP1632168B1 (en) Device for detecting shape of endoscope
EP2100552A1 (en) Capsule guiding system
JP6901734B2 (en) Electrocardiographic data transmission system
US20070078300A1 (en) System and method for detecting content in-vivo
WO2007036941A2 (en) System and method for detecting content in-vivo
CN111479495B (en) Monitor and display screen switching method thereof
JP2020146482A5 (en)
CN101248452B (en) Image display apparatus
WO2021165361A1 (en) Graphical user interface handling a plurality of visualisation devices
EP2359742A1 (en) Medical device system, capsule medical device system, and method for displaying posture items of body to be tested
EP4106614A1 (en) Capturing and browsing images in a medical visualisation system
WO2021165358A1 (en) User interface for a medical visualisation system
EP4106597A1 (en) Rotational user interface for a medical visualisation system
EP4356231A1 (en) Medical visualisation system
WO2022263376A1 (en) Medical visualisation device with programmable buttons
EP3709887B1 (en) Ultrasonic probe and ultrasonic measurement system
EP4106599A1 (en) Battery monitoring for a medical visualisation system
JPH06133937A (en) At-home medical care data management system
US20240095917A1 (en) Examination support device, examination support method, and storage medium storing examination support program
EP4336387A1 (en) Configuration of a medical visualisation system
JP2008307280A (en) Bioinformation collection system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21706904

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21706904

Country of ref document: EP

Kind code of ref document: A1