CN115135219A - Capturing and viewing images in a medical visualization system - Google Patents

Capturing and viewing images in a medical visualization system

Info

Publication number: CN115135219A
Application number: CN202180015434.7A
Authority: CN (China)
Prior art keywords: monitor device, user input, detecting, response, image
Legal status: Pending (assumed status; Google has not performed a legal analysis)
Other languages: Chinese (zh)
Inventor: 莱恩·桑达尔·乌贝森
Current assignee: Ambu AS
Original assignee: Ambu AS
Application filed by Ambu AS
Publication of CN115135219A

Classifications

    • A61B 1/0004: Operational features of endoscopes provided with input arrangements for the user, for electronic operation
    • A61B 1/0005: Operational features of endoscopes provided with output arrangements; display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B 1/00059: Operational features of endoscopes provided with identification means for the endoscope
    • A61B 5/743: Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
    • A61B 5/7435: Displaying user selection data, e.g. icons in a graphical user interface
    • A61B 5/7475: User input or interface means, e.g. keyboard, pointing device, joystick
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G16H 30/20: ICT specially adapted for the handling or processing of medical images, e.g. DICOM, HL7 or PACS
    • G16H 40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • H04N 1/00411: Display of information to the user, e.g. menus, the display also being used for user input, e.g. touch screen
    • H04N 1/00424: Arrangements for navigating between pages or parts of the menu using a list of graphical elements, e.g. icons or icon bar

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method and a medical visualization system are disclosed. The medical visualization system comprises a visualization apparatus having an image sensor configured to generate image data indicative of a view from the visualization apparatus, and a monitor device. The monitor device is adapted to detect a first user input corresponding to selection of an image capture button and, in response to detecting the first user input, to: store a first image file corresponding to the image data received when the first user input is detected; associate the first image file with a program session; and display a first representation of a still image corresponding to the stored first image file within a background portion of the graphical user interface. After a predetermined delay following detection of the first user input, the monitor device displays an animation that transitions the first representation to a folder icon.

Description

Capturing and viewing images in a medical visualization system
The present disclosure relates to visualization devices (e.g., endoscopes) and medical visualization systems (e.g., endoscopic systems) including visualization devices. More particularly, the present disclosure relates to graphical user interfaces and monitor devices having such graphical user interfaces for interacting with medical visualization systems.
Background
Visualization devices may be used to visually inspect certain regions of a human body, such as the interior of a human body cavity. For example, the visualization device may be used to view the airway, digestive tract, or intestinal tract.
The visualization device may be provided with a camera and attached to a monitor device (e.g. a monitor with a display screen), where video output by the camera of the visualization device may be received and displayed, thereby allowing an operator to control the visualization device to view the area of interest.
For example, the visualization device may be an endoscope, such as a disposable endoscope. The endoscope includes an operating handle at a proximal end and an insertion tube extending from the handle toward a distal end. The handle is configured to be held by an operator and in particular comprises an externally protruding operating member connected to an internal control device, allowing the operator to control the movement of the curved section at the distal end of the insertion tube when advancing the distal end of the insertion tube to a desired position (e.g. within a body cavity of a person). With the aid of an attached monitor device (e.g., a monitor having a display screen), the endoscope can be used to view the position to which the distal end has been advanced.
The monitor device of the medical visualization system may be provided with some functionality, such as the ability to save still images and/or video sequences from the view of the attached visualization device. Further, the monitor device may include some image processing capabilities and may be configured to output video or image output (e.g., to an external display).
Disclosure of Invention
The present disclosure relates to visualization devices (e.g., endoscopes) and visualization systems (e.g., endoscope systems). In particular, but not exclusively, the visualization means may be a disposable camera endoscope. Alternatively, the visualization device may be a video laryngoscope, an endotracheal tube, and/or a laryngeal mask. The visualization system may further comprise a monitor device for connection to the visualization device, e.g. the monitor device may be configured to receive image data from the visualization device. The disclosure further relates to a graphical user interface for such a monitor device of a medical visualization system.
It is an object of the present disclosure to provide a solution that at least improves the prior art solutions. In particular, it is an object of the present disclosure to provide a graphical user interface for a medical visualization system that facilitates and enhances human interaction with the system.
It is a further object of the present disclosure to provide a system and method that facilitates enhanced control and usability of a medical visualization system.
Accordingly, a medical visualization system and a method performed at a monitor device of the medical visualization system are disclosed.
Medical visualization systems include visualization devices, such as endoscopes (e.g., disposable endoscopes). Alternatively, the visualization device may be a video laryngoscope, an endotracheal tube, and/or a laryngeal mask. The visualization apparatus has an image sensor configured to generate image data indicative of a view from the visualization apparatus. The medical visualization system may comprise a plurality of visualization apparatuses, each having an image sensor configured to generate image data indicative of a view from the visualization apparatus. The plurality of visualization devices may comprise a first visualization device and/or a second visualization device. The first visualization apparatus may comprise a first image sensor configured to generate image data indicative of a view from the first visualization apparatus. The second visualization apparatus may comprise a second image sensor configured to generate image data indicative of a view from the second visualization apparatus. The image sensor(s) may be any sensor capable of detecting and communicating information used to make an image. For example, the image sensor(s) may comprise a CCD or CMOS sensor or the like. The image sensor(s) may generate image data corresponding to a square image (i.e., having equal height and width). For example, the image data generated by the image sensor may correspond to a 300 x 300 pixel image, a 400 x 400 pixel image, a 600 x 600 pixel image, or an 800 x 800 pixel image. Alternatively or additionally, the image sensor may generate image data corresponding to a non-square image that is cropped to form a square image, such as a square image having 300 x 300 pixels, 400 x 400 pixels, 600 x 600 pixels, or 800 x 800 pixels.
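As an illustration of the cropping mentioned above, the following is a minimal sketch of center-cropping a non-square sensor frame to a square image (here 400 x 400 pixels). The NumPy array layout and the function name are assumptions made for the example, not details of the disclosed system.

```python
import numpy as np

def center_crop_to_square(frame: np.ndarray, side: int = 400) -> np.ndarray:
    """Center-crop a (height, width, channels) frame to a side x side square.

    Assumes the frame is at least `side` pixels in both dimensions.
    """
    height, width = frame.shape[:2]
    top = (height - side) // 2
    left = (width - side) // 2
    return frame[top:top + side, left:left + side]

# Example: crop a simulated 480 x 640 sensor frame to 400 x 400.
sensor_frame = np.zeros((480, 640, 3), dtype=np.uint8)
square = center_crop_to_square(sensor_frame, side=400)
assert square.shape == (400, 400, 3)
```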
The medical visualization system further comprises a monitor device receiving and/or operable to receive image data generated by the image sensor. The monitor device may receive and/or be operable to receive image data, such as while the image data is being generated (e.g., within the limits of hardware). The monitor apparatus includes a first housing extending from a first housing side to a second housing side in a first direction and extending from a third housing side to a fourth housing side in a second direction perpendicular to the first direction. The monitor device comprises a display, for example a touch sensitive display. The display may be accommodated in the first housing. The display may have a first length in a first direction and a second length in a second direction. The second length may be longer than the first length, for example the display may be a 16:9 or 16:10 display. Alternatively, the first length may be longer than the second length, or the first length and the second length may be substantially the same. The touch sensitive display may be any suitable type of touch display, such as a capacitive touch display or a resistive touch display.
The monitor device may include one or more connection ports configured to receive a connector of the visualization device. The connection port and the corresponding connector of the visualization device may be a proprietary plug and socket connector or any standard connector through which at least image data from the image sensor can be transmitted. Further, the connector and the connection port may be configured to supply power to components of the visualization device.
The one or more connection ports may be disposed on the first housing. The one or more connection ports may be provided on the third housing side and/or on the fourth housing side. The monitor device may include an on/off button. The on/off button may be disposed on the first housing. The on/off button may be provided on the third housing side or on the fourth housing side. The one or more connection ports may be provided on the third housing side and the on/off button may be provided on the fourth housing side.
The monitor device may establish a connection with the visualization device (e.g. the first visualization device and/or the second visualization device). Establishing a connection with a visualization device may comprise receiving a device connector of the respective visualization device in one of the one or more connection ports of the monitor device. Establishing a connection with a visualization device may include obtaining device identifier information from a device identifier (e.g., EPROM, QR code, NFC, RFID, or the like) of the visualization device. For example, establishing a connection with a first visualization device may include obtaining first device identifier information from a first device identifier of the first visualization device, and/or establishing a connection with a second visualization device may include obtaining second device identifier information from a second device identifier of the second visualization device.
The monitor device may include a processing unit and a memory. The processing unit and/or the memory may be accommodated in the first housing. Alternatively, the monitor device may include a second housing, and the processing unit and/or the memory may be accommodated in the second housing. The monitor device may include an orientation sensor, for example, for determining an orientation of the monitor device relative to gravity (e.g., an orientation of the first housing relative to gravity). The orientation sensor may comprise one or more accelerometers and/or gyroscopes. The orientation sensor may be housed in the first housing. The processing unit may be connected to the touch-sensitive display to control display of information with the touch-sensitive display, and the processing unit may be adapted to receive signals from the touch-sensitive display indicative of touch inputs on the touch-sensitive display. Thus, the monitor device may utilize the touch sensitive display to detect user input, for example in the form of touch input. The touch input may, for example, comprise a single click(s), a double click(s), or a swipe(s) on the touch-sensitive display. The processing unit may be connected to the orientation sensor to receive an orientation signal indicative of an orientation of the monitor device (e.g., an orientation of a first housing of the monitor device). The processing unit may be connected to the memory and adapted to read data from the memory and write data to the memory.
The monitor device may include a power unit for powering the monitor device. The power unit may include a rechargeable battery and/or a power connection for connecting the power unit to an external power source (e.g., a conventional AC power outlet). The power unit may be accommodated in the first housing. Alternatively, the power unit may be accommodated in the second housing.
The monitor device may include a graphical user interface. The monitor device and/or a processing unit of the monitor device may display a graphical user interface with a touch-sensitive display. The graphical user interface may comprise one or more portions, such as a plurality of portions. These portions may be non-overlapping portions, such as a plurality of non-overlapping portions. The portions may include the first portion and/or the second portion. The portions may further include a third portion and/or a fourth portion. The second portion and/or the fourth portion may be denoted as background portions, e.g. the second portion may be a first background portion and/or the fourth portion may be a second background portion. Each of the plurality of portions may extend substantially over a first length in a first direction. The first portion may be disposed between the fourth portion and the second portion along the second direction. The fourth portion may be disposed between the third portion and the first portion along the second direction. The third portion may be disposed between one side (e.g., a third housing side) of the first housing and the fourth portion along the second direction. The second portion may be disposed between the other side (e.g., a fourth housing side) of the first housing and the first portion along the second direction. The first portion and the fourth portion may be disposed between the second portion and the third portion along the second direction. The first portion of the graphical user interface may be square. The first portion of the graphical user interface may occupy a center of the touch-sensitive display. The first portion of the graphical user interface may be individually and/or collectively larger than the second, third, and/or fourth portions along the second direction. The first portion of the graphical user interface may extend over more than 40% of the second length in the second direction, such as over more than 50% of the second length in the second direction, such as over more than 60% of the second length in the second direction.
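Purely to illustrate one possible arrangement of the four portions along the long axis of the display, the sketch below computes non-overlapping rectangles with a square, centred first portion occupying more than 60% of the second length. All concrete pixel values and the equal split of the remaining width are assumptions for the example, not values taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int       # position along the second (long) direction
    y: int       # position along the first (short) direction
    width: int
    height: int

def layout_portions(second_length: int = 1280, first_length: int = 800) -> dict:
    """Split the display into third, fourth, first and second portions.

    The first portion is a square (side = first_length) centred along the
    long axis; the remaining width is divided between the other portions.
    """
    first_side = first_length                # square first portion
    remaining = second_length - first_side
    third_w = remaining // 4                 # assumed split of the remaining width
    fourth_w = remaining // 4
    second_w = remaining - third_w - fourth_w
    x = 0
    third = Rect(x, 0, third_w, first_length); x += third_w
    fourth = Rect(x, 0, fourth_w, first_length); x += fourth_w
    first = Rect(x, 0, first_side, first_length); x += first_side
    second = Rect(x, 0, second_w, first_length)
    return {"third": third, "fourth": fourth, "first": first, "second": second}

portions = layout_portions()
# The square first portion occupies 800/1280 = 62.5% of the long axis,
# consistent with "more than 60% of the second length".
assert portions["first"].width / 1280 > 0.6
```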
The monitor device may display a live representation of the image data, for example within the first portion of the graphical user interface. A live representation of the image data may be displayed, e.g., by the processing unit, with the touch-sensitive display, e.g., within the first portion of the graphical user interface.
A visualization device (e.g., an endoscope) may include a handle and an elongated flexible member extending from the handle to a distal end. The image sensor may be disposed at a distal end of the elongated flexible member. The image data may indicate a view from the distal end of the elongate flexible member. The handle may comprise control buttons adapted to receive input in the first input direction and/or in the second input direction. The first input direction and the second input direction may be opposite. A touch input in a first input direction may cause the distal portion of the elongate flexible member to bend in a first bending direction and/or may cause the image sensor to move in a first image sensor direction. A touch input in the second input direction may cause the distal portion of the elongate flexible member to bend in the second bending direction and/or may cause the image sensor to move in the second image sensor direction. The live representation of the image data may have an orientation corresponding to an orientation of an image sensor generating the image data. The first bending direction may correspond to a first image direction of a representation of the image data (e.g., a live representation of the image data). The second bending direction may correspond to a second image direction of a representation of the image data (e.g., a live representation of the image data). The first image direction and/or the second image direction may be parallel to the first direction of the first housing.
The monitor device may provide one or more actionable items. One or more actionable items may be displayed, for example, by the processing unit, with the touch-sensitive display, for example, within the second portion of the graphical user interface. One or more actionable menu items may be displayed, for example, by the processing unit, using the touch-sensitive display, for example, within the third portion of the graphical user interface. The battery indicator may be displayed, for example, by the processing unit with the touch-sensitive display, for example, within the third portion of the graphical user interface. The time indicator may be displayed, for example, by the processing unit with the touch-sensitive display, for example, within the third portion of the graphical user interface.
The one or more actionable items (e.g., displayed within the second portion of the graphical user interface of the monitor device) may include an image capture button and/or a video capture button. In response to activation of the image capture button, for example by a user providing a touch input (e.g., a single tap) at the respective location of the touch-sensitive display, an image data file corresponding to the image data received when the image capture button is activated may be stored, for example in a memory of the monitor device. In response to activation of the video capture button, for example by a user providing a touch input (e.g., a single tap) at the respective location of the touch-sensitive display, a video sequence corresponding to the image data received while the video capture button is activated may be stored, for example in a memory of the monitor device. A first activation of the video capture button may begin collecting image data for the video sequence, and a second activation of the video capture button (after the first activation) may stop collecting image data for the video sequence. The stored video sequence may correspond to the image data received between the first and second activations of the video capture button. The video capture button may be displayed with a first appearance before the first activation and after the second activation, and with a second appearance after the first activation and before the second activation.
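The toggling behaviour of the video capture button described above can be summarised as a small state machine: the first activation begins collecting frames and switches the button to its second appearance, the second activation stops collection and stores the sequence. The sketch below is illustrative; the class and method names are assumptions, not part of the disclosure.

```python
class VideoCaptureButton:
    """Toggle between idle (first appearance) and recording (second appearance)."""

    def __init__(self):
        self.recording = False
        self.frames = []            # temporarily collected image data
        self.stored_sequences = []  # stored video sequences

    def activate(self):
        if not self.recording:
            # First activation: start collecting image data.
            self.recording = True
            self.frames = []
        else:
            # Second activation: stop collecting and store the video sequence.
            self.recording = False
            self.stored_sequences.append(list(self.frames))
            self.frames = []

    @property
    def appearance(self) -> str:
        return "second" if self.recording else "first"

    def on_frame(self, frame):
        """Called for every frame received from the image sensor."""
        if self.recording:
            self.frames.append(frame)

button = VideoCaptureButton()
button.activate()              # start recording -> second appearance
button.on_frame("frame-1")
button.on_frame("frame-2")
button.activate()              # stop recording -> first appearance, sequence stored
assert button.appearance == "first" and len(button.stored_sequences[0]) == 2
```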
The monitor device may open a program session, for example, in response to establishing a connection with a visualization device (e.g., a first visualization device). Establishing a connection with a visualization device may include obtaining device identifier information from a device identifier (e.g., EPROM, QR code, NFC, RFID, or the like) of the visualization device. In response to establishing the connection with the first visualization device, the monitor device may open a first program session corresponding to the first device identifier information obtained from the first device identifier of the first visualization device. In response to establishing the connection with the second visualization device, the monitor device may open a second program session corresponding to the second device identifier information obtained from the second device identifier of the second visualization device. Thus, a program session may be created for each individual visualization device. The program session may be implemented by creating a folder in the file system of the monitor device, so that the image files and video sequences obtained from the visualization device may be stored in the folder corresponding to that visualization device. Thus, the association of image files with a program session may be accomplished by storing the image files in the folder of the program session. Opening a program session may further include creating a log, logging the time and date at which the program session was started, logging information about the visualization device, logging software versions, and/or logging other information.
When the program session is opened, the monitor device may determine whether the visualization device has been previously connected to the monitor device based on the device identifier information. For example, in accordance with a determination that the visualization device has been previously connected to the monitor device, the monitor device may reopen a program session corresponding to the device identifier information; and/or in accordance with a determination that the visualization device has not been previously connected to the monitor device, the monitor device may create a program session (e.g., a new program session) corresponding to the device identifier information.
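One way to realise the session handling described above is to key program sessions by the device identifier information and to back each session with a folder, reopening the folder when a previously connected device is recognised. The sketch below assumes a simple on-disk layout with illustrative file and field names; it is not the actual file system of the monitor device.

```python
import json
import time
from pathlib import Path

def open_program_session(root: Path, device_identifier: str) -> Path:
    """Reopen the session folder for a known device, or create a new one.

    The folder name is derived from the device identifier; a small log file
    records when the session was started and which device it belongs to.
    """
    session_dir = root / f"session_{device_identifier}"
    if session_dir.exists():
        # Device was previously connected: reopen the existing program session.
        return session_dir
    # Device not seen before: create a new program session.
    session_dir.mkdir(parents=True)
    log = {
        "device_identifier": device_identifier,
        "started": time.strftime("%Y-%m-%d %H:%M:%S"),
        "software_version": "example-1.0",   # illustrative value
    }
    (session_dir / "session_log.json").write_text(json.dumps(log, indent=2))
    return session_dir

# Example: the same identifier maps to the same session folder on reconnection.
root = Path("/tmp/monitor_archive")
first = open_program_session(root, "endoscope-0001")
second = open_program_session(root, "endoscope-0001")
assert first == second
```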
The folder icon may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device. The folder icon may be displayed within a background portion of the graphical user interface. The background portion may be a portion other than the first portion of the graphical user interface. For example, the background portion may be the second portion or the fourth portion of the graphical user interface. The folder icon may include a visual representation of the number of stored files stored during the program session.
A first user input corresponding to selection of an image capture button of the one or more actionable items can be received and/or detected on the touch-sensitive display, such as when a live representation is displayed within the first portion and the one or more actionable items are displayed in the second portion. The monitor device (e.g., with a touch-sensitive display) may detect the first user input. In response to detecting the first user input, the monitor device may store a first image file corresponding to image data received upon detecting the first user input. The monitor device may further associate the first image file with the program session in response to the first user input. For example, the monitor device may locate the first image file in a folder of a program session (e.g., a program session corresponding to a connected visualization device).
Also, in response to detecting the first user input, a first representation of a still image corresponding to the stored first image file may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within a background portion of the graphical user interface. Providing a representation of the captured still image informs the operator that the image is stored and provides an example of the stored image, allowing the operator to quickly confirm that the image shows what he/she intended to capture.
After a predetermined delay after detecting the first user input, an animation of transitioning the first representation to the folder icon may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device. Thus, the operator can be visually notified that the captured image is stored and placed in the folder represented by the folder icon. Thus, the operator is made aware of where he/she can retrieve the image just captured.
The inventors have investigated the preferred time delay and found that, optimally, the predetermined delay is between 1 and 8 seconds, such as between 3 and 7 seconds, such as between 4 and 6 seconds, such as 5 seconds, or between 1.5 and 3 seconds, such as 1.5 seconds or 2 seconds. The inventors have also investigated the preferred duration of the animation and found that, optimally, the duration of the animation may be between 100 ms and 1500 ms, such as between 300 ms and 1000 ms, such as between 300 ms and 800 ms, such as between 300 ms and 600 ms or between 500 ms and 700 ms. For example, the duration of the animation may be 400 ms, 500 ms, or 600 ms. The duration of the animation may be a fraction of the predetermined delay, such as between 1/7 and 1/13 of the predetermined delay, such as between 1/8 and 1/12 of the predetermined delay, such as between 1/9 and 1/11 of the predetermined delay, such as 1/10 of the predetermined delay.
Also in response to detecting the first user input, optionally after displaying the animation that transitions the first representation to the folder icon, the display of the visual representation of the number of stored files of the folder icon may be updated, for example by increasing the number of stored files.
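The image capture flow described above (store the file, associate it with the program session, show a still representation, and, after the predetermined delay, animate it into the folder icon and update the file count) can be modelled as follows. The 2-second delay, the 1/10 animation ratio, and all names are example values within the ranges discussed above, not the device's actual implementation.

```python
import asyncio

PREDETERMINED_DELAY_S = 2.0                        # e.g. within the 1.5-3 s range above
ANIMATION_DURATION_S = PREDETERMINED_DELAY_S / 10  # e.g. 1/10 of the predetermined delay

class CaptureUi:
    """Minimal model of the capture flow: store, preview, animate, count."""

    def __init__(self, session_folder: str):
        self.session_folder = session_folder
        self.stored_files = []
        self.folder_icon_count = 0
        self.background_widget = None   # what is currently shown in the background portion

    async def on_image_capture(self, image_data: bytes):
        # Store the image file and associate it with the program session.
        filename = f"image_{len(self.stored_files) + 1:03d}.png"
        self.stored_files.append((self.session_folder, filename, image_data))
        # Show a still-image representation in the background portion.
        self.background_widget = ("still_representation", filename)
        # After the predetermined delay, animate the representation into the folder icon.
        await asyncio.sleep(PREDETERMINED_DELAY_S)
        self.background_widget = ("animating_to_folder", filename)
        await asyncio.sleep(ANIMATION_DURATION_S)
        # Update the folder icon's visual file count.
        self.folder_icon_count += 1
        self.background_widget = ("folder_icon", self.folder_icon_count)

ui = CaptureUi(session_folder="session_endoscope-0001")
asyncio.run(ui.on_image_capture(b"\x89PNG..."))
assert ui.folder_icon_count == 1
```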
After detecting the first user input, a second user input corresponding to selection of the image capture button may be received and/or detected. The monitor means may be adapted to detect the second user input. In response to detecting the second user input, the monitor device may store a second image file corresponding to image data received when the second user input is detected. The monitor device may further associate a second image file with the program session in response to a second user input. For example, the monitor device may locate the second image file in a folder of a program session (e.g., a program session corresponding to a connected visualization device).
Also, in response to detecting the second user input, a second representation of the still image corresponding to the stored second image file may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within a background portion of the graphical user interface. After a predetermined delay after detecting the second user input, an animation of transitioning the second representation to the folder icon may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device.
Also in response to detecting the second user input, optionally after displaying the animation that transitions the second representation to the folder icon, the display of the visual representation of the number of stored files of the folder icon may be updated, for example by increasing the number of stored files.
By providing representations, animations and folder icons associated with the captured still image(s) within the background portion (e.g., within the second portion or fourth portion of the graphical user interface), indications related to image capture may be provided while a live representation of the image data is shown within the first portion of the graphical user interface and without interfering with the live representation of the image data.
The monitor device may be adapted to establish a connection with the second visualization device, for example while the first visualization device is still connected, or after the connection with the first visualization device has been disconnected. In establishing the connection with the second visualization device, the monitor device may obtain the second device identifier information from a second device identifier of the second visualization device. In response to establishing the connection with the second visualization device, the monitor device may open a second program session, e.g., corresponding to the second device identifier information.
A live representation of the second image data generated by the second image sensor of the second visualization apparatus may be displayed. A live representation of the second image data may be displayed in the first portion of the graphical user interface.
The second folder icon may be displayed, for example, by the monitor device (e.g., with a touch-sensitive display of the monitor device). The second folder icon may be displayed within a background portion of the graphical user interface, such as within the second portion or the fourth portion of the graphical user interface. The second folder icon may include a visual representation of the number of stored files stored during the second program session. The display of the second folder icon may replace the display of the folder icon associated with the previously described visualization device (e.g., the first visualization device), for example if the connection between the monitor device and the previously described visualization device is disconnected.
A third user input corresponding to selection of the image capture button may be received and/or detected when the second visualization device is connected to the monitor device and/or when a live representation of the second image data is displayed within the first portion and the one or more actionable items are displayed in the second portion. The monitor means may be adapted to detect a third user input, for example using a touch sensitive display. In response to detecting the third user input, the monitor device may store a third image file corresponding to second image data received upon detecting the third user input. The monitor device may further associate a third image file with the second program session in response to a third user input. For example, the monitor device may locate the third image file in a second folder of a second program session (e.g., a program session corresponding to the connected second visualization device).
Moreover, in response to detecting the third user input, a third representation of the still image corresponding to the stored third image file may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within a background portion of the graphical user interface. After a predetermined delay after detecting the third user input, an animation transitioning the third representation to the second folder icon may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device.
Also in response to detecting the third user input, optionally after displaying the animation that transitions the third representation to the second folder icon, the display of the visual representation of the number of stored files of the second folder icon may be updated, for example by increasing the number of stored files.
The video capture button may be displayed in a first appearance. A fourth user input corresponding to selection of the video capture button may be received and/or detected. The monitor device (e.g. with a touch sensitive display) may be adapted to detect the fourth user input. In response to detecting the fourth user input, the monitor device may change the appearance of the video capture button to the second appearance. Further, also in response to detecting the fourth user input, the monitor device may begin collecting and temporarily storing image data received from an image sensor (corresponding to the image sensor of the connected visualization device (e.g., the first visualization device and/or the second visualization device)) in the memory.
After detecting the fourth user input, a fifth user input corresponding to selection of the video capture button may be received and/or detected. The monitor means may be adapted to detect a fifth user input, for example using a touch sensitive display. In response to detecting the fifth user input, the monitor device changes the appearance of the video capture button to the first appearance. Also in response to detecting the fifth user input, the monitor device stores a first video data file corresponding to image data received between the detection of the fourth user input and the fifth user input. Also in response to detecting the fifth user input, the monitor device associates the first video data file with a program session (corresponding to an open program session according to the connected visualization device, e.g. the program session may be a program session corresponding to the first visualization device and/or a second program session corresponding to the second visualization device).
Also, in response to detecting the fifth user input, a fourth representation of a frame corresponding to the stored first video data file may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within a background portion of the graphical user interface. After a predetermined delay after detecting the fifth user input, an animation may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, transitioning the fourth representation to a folder icon (e.g., the folder icon associated with the first visualization device or the second folder icon associated with the second visualization device).
Also in response to detecting the fifth user input, optionally after displaying the animation that transitions the fourth representation to the folder icon, the display of the visual representation of the number of stored files of the folder icon may be updated, for example by increasing the number of stored files.
A sixth user input corresponding to selection of the folder icon may be received and/or detected. The monitor device may be adapted to detect the sixth user input, for example using a touch sensitive display. In response to detecting the sixth user input, a first plurality of representations corresponding to the first plurality of stored image files stored during the program session may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device. The first plurality of representations may be displayed within a background portion of the graphical user interface, such as within a fourth portion of the graphical user interface.
A seventh user input may be received and/or detected. The seventh user input may correspond to selection of a primary representation of the first plurality of representations displayed in response to detecting the sixth user input. The primary representation may correspond to a primary stored image file. The monitor device may be adapted to detect a seventh user input. In response to detecting the seventh user input, an enlarged representation of the primary stored image file may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within the first portion of the graphical user interface. Further, in response to detecting the seventh user input, thumbnail representations of the second plurality of stored image files stored during the program session may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within the first portion of the graphical user interface.
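The behaviour described above, where selecting a representation shows an enlarged version in the first portion together with thumbnails of the session's stored files, can be sketched as a simple view model. Class and field names are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class GalleryView:
    """Model of the enlarged-image view shown in the first portion."""
    stored_files: List[str]
    enlarged: str = ""
    thumbnails: List[str] = field(default_factory=list)

    def select(self, filename: str) -> None:
        """Show the selected file enlarged, with thumbnails of the session files."""
        if filename not in self.stored_files:
            raise ValueError(f"{filename} is not part of this program session")
        self.enlarged = filename
        self.thumbnails = list(self.stored_files)  # thumbnails include the selected file

gallery = GalleryView(stored_files=["image_001.png", "image_002.png", "video_001.mp4"])
gallery.select("image_002.png")
assert gallery.enlarged == "image_002.png"
assert "image_002.png" in gallery.thumbnails
```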
Further in response to detecting the sixth user input, a session summary icon may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within a background portion of the graphical user interface. An eighth user input corresponding to selection of the session summary icon may be received and/or detected. The monitor means may be adapted to detect an eighth user input, for example using a touch sensitive display. In response to detecting the eighth user input, a third plurality of representations corresponding to a third plurality of stored image files stored during the program session may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within the first portion of the graphical user interface. The third plurality of representations may be displayed in a grid-like pattern.
Also, for example, in response to detecting the eighth user input, general information for the program session may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, in the second portion of the graphical user interface. Also, for example, in response to detecting the eighth user input, the comment field may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within the fourth portion of the graphical user interface.
A ninth user input corresponding to selecting the comment field may be received and/or detected. The monitor means may be adapted to detect a ninth user input. In response to detecting the ninth user input, a virtual keyboard may be displayed, for example within the first portion of the graphical user interface, and optionally extended into the second portion and/or the fourth portion of the graphical user interface. The virtual keyboard may be displayed by the monitor device and/or with a touch sensitive display of the monitor device. The virtual keyboard is configured for entering text in the comment field.
A keyboard user input sequence corresponding to typing text with the displayed virtual keyboard may be received and/or detected. The monitor means may be adapted to detect the keyboard user input sequence, for example using the touch-sensitive display. In response to detecting the keyboard user input sequence, a corresponding text string may be displayed in the comment field. A tenth user input may be received and/or detected indicating acceptance of the text entered using the displayed virtual keyboard, e.g., the user may press an accept button. The monitor device may be adapted to detect the tenth user input, and in response to detecting the tenth user input, the monitor device may store the entered text and associate the text as an annotation of the program session.
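As a sketch of how accepted comment text could be stored as an annotation of the program session, the example below writes the comment into the session folder created earlier. The file name and JSON format are assumptions for the example, not details of the disclosed device.

```python
import json
from pathlib import Path

def save_session_comment(session_dir: Path, text: str) -> None:
    """Store accepted comment text as an annotation of the program session.

    The comment is written next to the session log so that it can be kept
    (and later exported) together with the images of the session.
    """
    comment_file = session_dir / "comment.json"
    comment_file.write_text(json.dumps({"comment": text}, ensure_ascii=False, indent=2))

# Example: text typed on the virtual keyboard is stored once the user accepts it.
session_dir = Path("/tmp/monitor_archive/session_endoscope-0001")
session_dir.mkdir(parents=True, exist_ok=True)
save_session_comment(session_dir, "Mucosa appears normal; no abnormalities observed.")
```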
The monitor device may be adapted to detect that the visualization device is disconnected. In response to the visualization device being disconnected from the monitor device, for example in response to detecting the visualization device being disconnected, the same content as in response to detecting the sixth user input as described above may be displayed, for example by the monitor device and/or with a touch-sensitive display of the monitor device, i.e., displaying a first plurality of representations corresponding to a first plurality of stored image files stored during a program session, for example where the program session corresponds to the visualization device just removed. The first plurality of representations may be displayed within a background portion of the graphical user interface, such as within a fourth portion of the graphical user interface. Further, a session summary icon may be displayed within a background portion of the graphical user interface, also in response to the visualization device being disconnected from the monitor device.
The one or more actionable menu items (e.g., displayed in the third portion of the graphical user interface) may include an archive menu item. An eleventh user input corresponding to selection of the archive menu item may be received and/or detected. The monitor device may be adapted to detect the eleventh user input, for example using the touch-sensitive display. In response to detecting the eleventh user input, a primary archive menu associated with the archive menu item may be displayed, e.g., within the fourth portion of the graphical user interface. The primary archive menu may be displayed by the monitor device and/or with the touch-sensitive display of the monitor device. The primary archive menu may include one or more primary actionable archive items, including, for example, a first primary actionable archive item and/or a second primary actionable archive item.
While displaying the primary archive menu, a twelfth user input corresponding to selection of the first primary actionable archive item may be received and/or detected. The monitor device may be adapted to detect the twelfth user input. In response to detecting the twelfth user input, a secondary archive menu associated with the first primary actionable archive item may be displayed, for example, by the monitor device and/or utilizing the touch-sensitive display of the monitor device. The secondary archive menu may be displayed in the first portion of the graphical user interface, optionally extending into the second portion and/or the fourth portion of the graphical user interface.
The monitor device may include features to which access should be restricted. Thus, the monitor device may support authentication of users who are allowed to access the restricted features, and the monitor device may operate in an authorized state and/or an unauthorized state.
In accordance with the monitor device operating in the authorized state, the secondary archive menu may include a list of stored program sessions, such as a complete list of all program sessions stored in the memory of the monitor device. In accordance with the monitor device operating in the unauthorized state and a setting requiring authorization being activated, the secondary archive menu may comprise an empty list or a list of a subset of stored program sessions. The subset may be, for example, the program sessions recorded on the current day or the last recorded session. In accordance with the monitor device operating in the unauthorized state and the setting requiring authorization being deactivated, the secondary archive menu may include a list of stored program sessions, such as a complete list of all program sessions stored in the memory of the monitor device.
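The authorization-dependent behaviour of the secondary archive menu described above can be expressed as a small selection function: the full list when authorized (or when authorization is not required), and only a subset such as the current day's sessions, or an empty list, otherwise. The function and field names below are assumptions for the example.

```python
from datetime import date
from typing import List, Optional

def sessions_for_archive_menu(
    all_sessions: List[dict],
    authorized: bool,
    authorization_required: bool,
    today: Optional[date] = None,
) -> List[dict]:
    """Return the program sessions to list in the secondary archive menu.

    Authorized users (or any user when authorization is not required) see the
    complete list; unauthorized users see only the current day's sessions when
    the authorization setting is active (possibly an empty list).
    """
    if authorized or not authorization_required:
        return list(all_sessions)
    today = today or date.today()
    # Unauthorized and authorization required: show only today's sessions.
    return [s for s in all_sessions if s["date"] == today]

sessions = [
    {"id": "endoscope-0001", "date": date(2021, 2, 1)},
    {"id": "endoscope-0002", "date": date.today()},
]
assert len(sessions_for_archive_menu(sessions, authorized=True, authorization_required=True)) == 2
assert len(sessions_for_archive_menu(sessions, authorized=False, authorization_required=True)) == 1
```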
While displaying a secondary archive menu that includes a list of stored program sessions or a list of a subset of stored program sessions, a thirteenth user input corresponding to selection of a first stored program session in the list may be received and/or detected. The monitor means may be adapted to detect the thirteenth user input, for example using the touch-sensitive display. In response to detecting the thirteenth user input, a fourth plurality of representations corresponding to a fourth plurality of stored image files stored during the first stored program session may be displayed, for example in the first portion of the graphical user interface. The monitor device and/or the touch-sensitive display of the monitor device may display the fourth plurality of representations.
Also in response to detecting the thirteenth user input, general information of the first stored program session may be displayed, for example, in a fourth portion of the graphical user interface. Also in response to detecting the thirteenth user input, a comment field may be displayed, for example, in a fourth portion of the graphical user interface.
The graphical user interface and/or graphical user interface content displayed in response to the thirteenth user input may correspond to the graphical user interface and/or graphical user interface content displayed in response to the eighth user input if the program sessions are the same.
A fourteenth user input may be received and/or detected that corresponds to selection of a selected representation of the fourth plurality of representations. The selected representation may correspond to a selected stored image file. The monitor means may be adapted to detect a fourteenth user input. In response to detecting the fourteenth user input, a magnified representation of the selected stored image file may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within the first portion of the graphical user interface. Further in response to detecting the fourteenth user input, thumbnail representations of the fifth plurality of stored image files stored during the first stored program session may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within the first portion of the graphical user interface. These thumbnail representations may include thumbnail representations of the selected stored image files.
Also, in response to detecting the fourteenth user input, image information associated with the selected stored image file may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within the second portion of the graphical user interface. The image information associated with the selected stored image file may include information indicative of a visualization device. The information indicative of the visualization device may have been obtained from the device identifier information.
A fifteenth user input may be received and/or detected that corresponds to selection of a selected thumbnail from the displayed thumbnail representations. The selected thumbnail may correspond to a second selected stored image file. The monitor means is further adapted to detect a fifteenth user input. In response to detecting the fifteenth user input, a magnified representation of the second selected stored image file may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, within the first portion of the graphical user interface.
Moreover, in response to detecting the fifteenth user input, thumbnail representations of the sixth plurality of stored image files stored during the first stored program session may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within the first portion of the graphical user interface. The sixth plurality of stored image files may be the same as the fifth plurality of stored image files.
The export icon may be displayed, for example, in response to detecting the fourteenth user input. A sixteenth user input corresponding to selection of the export icon may be received and/or detected. The monitor device may be adapted to detect the sixteenth user input, for example using the touch-sensitive display. In response to detecting the sixteenth user input, the fourth plurality of representations corresponding to the fourth plurality of stored image files may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example, in the first portion of the graphical user interface. Each of the fourth plurality of representations may include a selection indicator, wherein the selection indicator of the selected representation may be activated. Further, in response to detecting the sixteenth user input, an export menu including an export confirmation icon may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., within the second portion and/or the fourth portion of the graphical user interface.
Alternatively and/or additionally, an export icon may be displayed in response to detecting the seventh user input. In such cases, in response to detecting a user input corresponding to selection of the export icon, a second plurality of representations corresponding to the second plurality of stored image files may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example in the first portion of the graphical user interface. Each of the second plurality of representations may include a selection indicator, wherein the selection indicator of the primary stored image file may be activated. Similarly to the case described above, an export menu including an export confirmation icon may be displayed.
Alternatively and/or additionally, an export icon may be displayed in response to detecting the eighth user input. In such cases, in response to detecting the user input corresponding to selection of the export icon, a third plurality of representations corresponding to the third plurality of stored image files may be displayed, for example, by the monitor device and/or with a touch-sensitive display of the monitor device, for example in the first portion of the graphical user interface. Each of the third plurality of representations may include a selection indicator, wherein none of the selection indicators is initially activated. Similar to the case described above, an export menu including an export confirmation icon may be displayed.
Alternatively and/or additionally, the export icon may be displayed in response to detecting the thirteenth user input. In such cases, in response to detecting the user input corresponding to selection of the export icon, a fourth plurality of representations corresponding to the fourth plurality of stored image files may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, e.g., in the first portion of the graphical user interface. Each of the fourth plurality of representations may include a selection indicator, wherein none of the selection indicators is initially activated. Similar to the case described above, an export menu including an export confirmation icon may be displayed.
After detecting the sixteenth user input, a seventeenth user input can be received and/or detected that corresponds to selecting one or more of the selection indicators (e.g., of the fourth plurality of representations). The monitor device may be adapted to detect a seventeenth user input, for example using a touch sensitive display. In response to detecting the seventeenth user input, a selection indicator of the plurality of representations corresponding to the selected one or more selection indicators may be activated.
After detecting the sixteenth user input and/or after detecting the seventeenth user input, an eighteenth user input corresponding to selecting the export confirmation icon may be received and/or detected. The monitor device may be adapted to detect an eighteenth user input, for example with a touch sensitive display. In response to detecting the eighteenth user input, the stored image file corresponding to the selected one or more selection indicators may be transmitted to an auxiliary device, such as a USB drive or a remote server. For example, the monitor device may transmit the stored image file corresponding to the selected one or more selection indicators to the auxiliary device, e.g., via bluetooth, USB, LAN, WiFi, or any other suitable connection.
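By way of illustration only, the following minimal sketch (Python; the function name export_files, the directory arguments, and the example file names are hypothetical and not part of the disclosure) shows one way the transfer of the selected stored image files to an auxiliary device, here modelled as a mounted directory such as a USB drive mount point, could be implemented:

```python
import shutil
from pathlib import Path

def export_files(selected_paths, destination_dir):
    """Copy the stored image files whose selection indicators are activated
    to an auxiliary device, modelled here as a mounted directory
    (e.g. a USB drive mount point or a network share)."""
    destination = Path(destination_dir)
    destination.mkdir(parents=True, exist_ok=True)
    exported = []
    for source in map(Path, selected_paths):
        target = destination / source.name
        shutil.copy2(source, target)  # copy2 preserves file timestamps
        exported.append(target)
    return exported

# Hypothetical usage (paths are placeholders):
# export_files(["/data/sessions/serial_1234/img_0001.png",
#               "/data/sessions/serial_1234/img_0002.png"],
#              "/media/usb0/export")
```

A transfer over Bluetooth, WiFi, or LAN as mentioned above would replace the local file copy with the corresponding transport while keeping the same selection logic.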
The delete icon may be displayed, for example, in response to detecting: a fourteenth user input, a seventh user input, an eighth user input, a thirteenth user input, a sixteenth user input, and/or a seventeenth user input. A nineteenth user input corresponding to selection of the delete icon may be received and/or detected. The monitor device may be adapted to detect the nineteenth user input, for example using a touch-sensitive display. In response to detecting the nineteenth user input, a confirmation dialog may be displayed, e.g., by the monitor device and/or with a touch-sensitive display of the monitor device, indicating potential deletion of one or more image files (e.g., corresponding to the displayed enlarged representation or to stored image files having their selection indicators activated).
A twentieth user input to the confirmation dialog may be received. The monitor device may be adapted to detect the twentieth user input, for example using a touch-sensitive display. The one or more image files may be deleted (e.g., removed from memory) based on a twentieth user input indicating that the user confirms deletion of the one or more image files. Further, the display of the enlarged representation may be replaced with an enlarged representation of a second image file. Upon a twentieth user input indicating that the user cancels deletion, deletion of the one or more image files may be forgone. Further, the display of the enlarged representation may be maintained, for example, within the first portion of the graphical user interface.
A delete icon may be displayed in response to detecting the fourteenth user input. In such cases, in response to detecting the nineteenth user input, a confirmation dialog may be displayed indicating potential deletion of the selected stored image file. In response to a twentieth user input indicating that the user confirms deletion of the selected stored image file, the selected stored image file may be deleted, and the display of the enlarged representation of the selected stored image file may be replaced with an enlarged representation of the second selected stored image file. In response to a twentieth user input indicating that the user cancels deletion of the selected stored image file, deletion may be forgone, and display of the enlarged representation of the selected stored image file within the first portion of the graphical user interface may be maintained.
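As a non-limiting sketch of the confirm-then-delete behaviour described above (Python; the callback names ask_confirmation and delete_file are hypothetical placeholders standing in for the confirmation dialog and the file removal, respectively):

```python
def confirm_and_delete(image_files, ask_confirmation, delete_file):
    """Delete the listed image files only if the user confirms via the
    confirmation dialog; otherwise forgo deletion and keep the display."""
    if not image_files:
        return []
    if not ask_confirmation(f"Delete {len(image_files)} file(s)?"):
        return []  # user cancelled: nothing is deleted
    deleted = []
    for image_file in image_files:
        delete_file(image_file)  # e.g. remove the file from memory/storage
        deleted.append(image_file)
    return deleted

# Hypothetical usage with a stubbed dialog and deletion callback:
# confirm_and_delete(["img_0001.png"],
#                    ask_confirmation=lambda message: True,
#                    delete_file=lambda path: print("deleted", path))
```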
Drawings
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. The drawings illustrate one way of implementing the invention and are not to be construed as limiting with respect to other possible embodiments falling within the scope of the appended claims.
Figure 1 schematically illustrates an exemplary medical visualization system,
figure 2 schematically illustrates an exemplary monitor device,
figure 3 schematically illustrates an exemplary monitor device,
figure 4 is a block diagram of an exemplary monitor device,
figures 5A-5H schematically illustrate exemplary user interaction with an exemplary graphical user interface,
figures 6A-6D schematically illustrate exemplary user interaction with an exemplary graphical user interface,
figures 7A-7F schematically illustrate exemplary user interactions with an exemplary graphical user interface,
figures 8A-8H schematically illustrate exemplary user interaction with an exemplary graphical user interface,
figures 9A-9K schematically illustrate exemplary user interactions with an exemplary graphical user interface,
figures 10A-10D schematically illustrate exemplary user interactions with an exemplary graphical user interface, and
figures 11A-11E schematically illustrate exemplary user interactions with an exemplary graphical user interface.
Detailed Description
Various exemplary embodiments and details are described below with reference to the drawings (when relevant). It should be noted that the figures may or may not be drawn to scale and that elements that are structurally or functionally similar are represented by like reference numerals throughout the figures. It should also be noted that the drawings are only intended to facilitate the description of the embodiments. The drawings are not intended as an exhaustive description of the invention or as a limitation on the scope of the invention. Further, the embodiments shown need not have all of the aspects or advantages shown. Aspects or advantages described in connection with a particular embodiment are not necessarily limited to that embodiment, and may be practiced in any other embodiment, even if not so shown or not so explicitly described.
Fig. 1 schematically illustrates an exemplary medical visualization system 2 comprising a visualization apparatus 4 and a monitor apparatus 20. The visualization device 4 has an image sensor 12 (e.g. CCD or CMOS) configured to generate image data indicative of the views from the visualization device 4. In the illustrated example, the visualization device 4 is an endoscope that includes a handle 6 and an elongated flexible member 8 (e.g., insertion tube) extending from the handle 6 to a distal end 10. The image sensor 12 may be configured to generate image data indicative of a view from the distal end 10 of the elongate flexible member 8.
The visualization means 4 may be connected to a monitor means 20. In the illustrated example, the device cable 14 extending from the handle 6 terminates in a device connector 16 connected to a connection port 40 of the monitor device 20. The monitor device 20 is operable to receive image data generated by the image sensor 12 of the visualization device 4. For example, the monitor device 20 may receive image data generated by the image sensor 12 via the device cable 14, the connector 16, and the connection port 40.
The handle 6 comprises a control button 7 adapted to receive input in a first input direction and/or in a second input direction. A touch input on the control button 7 in the first input direction causes the distal portion 9 of the elongated flexible member 8 to bend in a first bending direction, e.g. via a wire extending from the handle through the elongated flexible member 8 to the distal portion 9. A touch input on the control button 7 in the second input direction causes the distal portion 9 of the elongated flexible member 8 to bend in a second bending direction. The first input direction and the second input direction may be opposite. The first bending direction and the second bending direction may be opposite. Bending the distal portion 9 of the elongated flexible member 8 may cause the distal end 10, and thereby the image sensor 12, to move in a corresponding direction. Thus, when viewing the image generated by the image sensor 12, a direction (e.g., up or down) in the image may correspond to a respective input on the control button 7.
Fig. 2 schematically illustrates an exemplary monitor device 20, such as the monitor device 20 illustrated in fig. 1. The monitor device 20 includes a first housing 25. The first housing 25 extends in a first direction x1 from the first housing side 21 to the second housing side 22 and in a second direction x2 perpendicular to the first direction x1 from the third housing side 23 to the fourth housing side 24. The monitor device includes a touch-sensitive display 26 accommodated in a first housing 25. The touch sensitive display 26 has a first length L1 in the first direction x1 and a second length L2 in the second direction x 2. As illustrated, the second length L2 may be longer than the first length L1.
As illustrated, the monitor device may include one or more connection ports 40, such as three connection ports 40. The connection port 40 may allow for connection of a visualization device. As illustrated, the connection port(s) 40 may be arranged at the third housing side 23. Alternatively or additionally, the connection port(s) 40 may be arranged at the fourth housing side 24.
As illustrated, the monitor device may include an on/off button 41, which may be disposed on the fourth housing side 24.
Fig. 3 schematically illustrates an exemplary monitor device 20, such as the monitor device 20 illustrated in fig. 1-2. As illustrated, the device connector 16 may be connected to the connection port 40.
The monitor device 20 may be provided with a graphical user interface 27. The graphical user interface 27 may be displayed with a touch sensitive display 26 and a user may interact with the graphical user interface 27, for example by providing touch input on the touch sensitive display 26.
The graphical user interface 27 is displayed using the touch sensitive display 26. The graphical user interface 27 comprises a plurality of non-overlapping portions 31, 32, 33, 34. Each of the portions 31, 32, 33, 34 extends substantially over the first length L1 in the first direction x1. The non-overlapping portions include a first portion 31, a second portion 32, a third portion 33, and a fourth portion 34. The first portion 31 is arranged between the fourth portion 34 and the second portion 32 along the second direction x2. The fourth portion 34 is arranged between the third portion 33 and the first portion 31 along the second direction x2. The third portion 33 is arranged between one side of the first housing (e.g., the third housing side 23) and the fourth portion 34 along the second direction x2. The second portion 32 is arranged between the other side of the first housing 25 (e.g., the fourth housing side 24) and the first portion 31 along the second direction x2. The first portion 31 and the fourth portion 34 are arranged between the second portion 32 and the third portion 33 along the second direction.
The monitor device 20 displays a live representation 70 of the image data within the first portion 31 of the touch sensitive display 26. As described with respect to fig. 1, the first and second bending directions of the distal portion 9 of the elongated flexible member 8 may correspond to the first and second image directions 37, 38, respectively, of the live representation 70. As shown, the first image direction 37 and the second image direction 38 may be parallel to the first direction x1. As illustrated, the first image direction 37 and the second image direction 38 may be opposite. Thus, a user operating the control button 7 of the visualization device 4 may cause the distal portion 9 of the elongated flexible member 8 to bend in a direction corresponding to the first image direction 37 or the second image direction 38 of the live representation 70.
The monitor device 20 displays one or more actionable items 36 within the second portion 32 of the graphical user interface 27 using the touch-sensitive display 26. Actionable items 36 may include an image capture button 36a, for example, for storing an image data file corresponding to image data received when the image capture button 36a is activated. Alternatively or additionally, the actionable item 36 may include a video capture button 36b, for example, for storing a video sequence of image data corresponding to image data received when the video capture button 36b is activated.
The monitor device 20 displays one or more actionable menu items 42 within the third portion 33 of the graphical user interface 27 using the touch-sensitive display 26. Actionable menu items 42 may include, for example, a login menu item for initiating a login procedure, a settings menu item for accessing a settings menu, an archive menu item for browsing archives, and a default menu item for returning to a default view. A battery indicator 50 is also displayed in the third portion 33.
Fig. 4 is a block diagram of an exemplary monitor device 20, such as the monitor device 20 of the previous figures. The monitor device 20 includes a processing unit 60 and a memory 62. The memory 62 may include both volatile and non-volatile memory. The monitor device 20 further comprises an orientation sensor 64 for determining the orientation of the first housing 25 with respect to gravity. The orientation sensor 64 may include one or more accelerometers and/or gyroscopes. The monitor device 20 comprises an input/output module 66, for example for receiving image data from the image sensor 12 via a connector of the visualization device 4. The input/output module 66 may also include an Ethernet connector, WiFi transceiver, Bluetooth transceiver, video connector, USB port, etc., and corresponding controllers. The monitor device 20 also includes the touch sensitive display 26 as described earlier. The monitor device 20 may utilize the touch sensitive display 26 to display information, graphical user interface objects, images, buttons, and the like. The monitor device 20 also includes a microphone 68. The monitor device 20 includes a power unit 61 for supplying power to the monitor device 20. The power unit 61 may include a rechargeable battery 61a. The power unit 61 may include a power connector 61b for connecting the power unit 61 to an external power source (e.g., a conventional AC power outlet). The components of the monitor device 20 may be interconnected by buses or signal lines. As illustrated, some or all of the components of the monitor device may be housed in the first housing 25. Alternatively, however, some components (e.g., the processing unit 60, the memory 62, the input/output module 66, and/or the power unit 61) may be housed in a second housing of the monitor device 20.
The power unit 61 may, for example, include means for indirectly measuring the capacity of the rechargeable battery 61a. For example, the power unit 61 may include a voltmeter to measure the voltage of the rechargeable battery 61a. Based on the measured voltage, the remaining capacity of the rechargeable battery 61a may be estimated, for example, by the processing unit 60. The power unit 61 may also include means for measuring the power consumption of the monitor device 20. For example, the power unit 61 may include a power meter to measure the rate at which the monitor device 20 consumes power from the rechargeable battery 61a. The voltmeter may be a low current consuming integrated circuit or a resistor coupled in parallel with the battery. A current sensor may be provided, and the power may be calculated as the product of the voltage and the current. Additionally, an integrated circuit may be provided that includes both the voltmeter and the current sensor, and that outputs the power value in digital form.
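The voltage-based capacity estimate and the power measurement could, for example, be combined as in the following sketch (Python; the discharge curve values, pack capacity, and function names are hypothetical and given only for illustration):

```python
from bisect import bisect_left

# Hypothetical discharge curve for a Li-ion pack: (voltage, remaining fraction).
DISCHARGE_CURVE = [(3.0, 0.0), (3.4, 0.1), (3.6, 0.3), (3.7, 0.5),
                   (3.9, 0.8), (4.2, 1.0)]

def remaining_fraction(voltage_v):
    """Estimate the remaining capacity fraction by linear interpolation
    on the discharge curve."""
    points = DISCHARGE_CURVE
    if voltage_v <= points[0][0]:
        return 0.0
    if voltage_v >= points[-1][0]:
        return 1.0
    idx = bisect_left([v for v, _ in points], voltage_v)
    (v0, f0), (v1, f1) = points[idx - 1], points[idx]
    return f0 + (f1 - f0) * (voltage_v - v0) / (v1 - v0)

def remaining_runtime_hours(voltage_v, capacity_wh, power_draw_w):
    """Combine the capacity estimate with the measured power consumption
    to estimate how long the monitor device can keep running."""
    energy_left_wh = remaining_fraction(voltage_v) * capacity_wh
    return energy_left_wh / power_draw_w if power_draw_w > 0 else float("inf")

# Example: a 3.8 V reading on a 40 Wh pack at 12 W draw gives roughly 2.2 h.
print(round(remaining_runtime_hours(3.8, 40.0, 12.0), 1))
```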
The monitor device 20 may utilize the touch sensitive display 26 to display content. For example, the monitor device 20 may display content by the processing unit 60 transmitting instructions to the touch-sensitive display 26 indicating the content to be displayed. The monitor device 20 may utilize the touch sensitive display 26 to receive user input. In particular, the monitor device 20 may utilize the touch sensitive display 26 to detect user input. For example, a user providing a touch input on the touch-sensitive display 26 causes a change in one or more electrical parameters of the touch-sensitive display 26 that are indicative of at least a location of the touch input. Information of the touch input is transmitted from the touch-sensitive display 26 to the processing unit 60, and the processing unit 60 may determine whether the touch input corresponds to an action to be performed, e.g., whether the location of the touch input corresponds to the location of a soft button displayed on the touch-sensitive display.
The user may interact with the monitor device 20 by providing user input via the graphical user interface 27 (e.g., by providing touch input on the touch-sensitive display 26), and the monitor device 20 may utilize the touch-sensitive display 26 to detect such user input. Touch inputs (e.g., single tap, double tap, swipe, or the like) and the locations of the touch inputs on the touch-sensitive display 26 are registered by the touch-sensitive display 26, which transmits information of the touch inputs (e.g., including the type of touch (double tap, single tap, swipe, etc.) and/or the location of the touch) to the processing unit 60 of the monitor device 20. The processing unit 60 interprets the received information and determines whether the touch input corresponds to activation of an action, for example whether the touch input corresponds to activation of a button displayed with the touch-sensitive display 26 at the location of the touch input. In response to determining that the touch input corresponds to activation of an action, the processing unit 60 performs the corresponding action.
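A simplified sketch of the hit-testing step performed by the processing unit (Python; the SoftButton structure, coordinates, and button names are hypothetical and only illustrate the principle):

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class SoftButton:
    name: str
    x: float
    y: float
    width: float
    height: float
    action: Callable[[], None]

    def contains(self, tx: float, ty: float) -> bool:
        # True if the touch location falls within the button's rectangle.
        return (self.x <= tx <= self.x + self.width and
                self.y <= ty <= self.y + self.height)

def dispatch_touch(buttons: List[SoftButton], tx: float, ty: float) -> Optional[str]:
    """Run the action of the button whose area contains the touch and return
    its name; return None if the touch does not hit any button."""
    for button in buttons:
        if button.contains(tx, ty):
            button.action()
            return button.name
    return None

# Example: a tap at (120, 415) activates the image-capture button.
buttons = [SoftButton("image_capture", 100, 400, 60, 60, lambda: print("capture image")),
           SoftButton("video_capture", 100, 480, 60, 60, lambda: print("start/stop video"))]
dispatch_touch(buttons, 120, 415)
```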
For example, referring to fig. 3 and 4, to capture an image corresponding to the currently shown live representation 70 (e.g., corresponding to image data received from an image sensor), the user may tap the image capture button 36 a. The tap and the location of the tap are registered by the touch sensitive display 26, which transmits information of the tap to the processing unit 60 of the monitor device 20. The processing unit 60 interprets the received information and determines that the user has tapped the location corresponding to the image capture button 36 a. In response to this, the processing unit 60 stores an image data file corresponding to the received image data in the memory 62.
With further reference to fig. 3 and 4, to capture a video sequence corresponding to the live representation 70 shown over a period of time (e.g., corresponding to image data received from an image sensor over a period of time), the user may tap the video capture button 36 b. The tap and the location of the tap are registered by the touch sensitive display 26, which transmits information of the tap to the processing unit 60 of the monitor device 20 (see fig. 4). The processing unit 60 interprets the received information and determines that the user tapped the location corresponding to the video capture button 36 b. In response, the processing unit 60 begins collecting image data received from the image sensor 12 and temporarily storing the data in the memory 62. To stop recording, the user may tap the video capture button 36b again. The processing unit 60 determines that the user has tapped the video capture button 36b and stops collecting image data received from the image sensor 12 based on the signal received from the touch sensitive display 26. The processing unit 60 reads the temporarily stored data from the memory 62 and creates a complete video sequence based thereon, which the processing unit 60 stores in the memory 62.
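The toggle behaviour of the video capture button could be modelled as in the following sketch (Python; the class and method names are hypothetical, and frames are represented as plain byte strings rather than actual image data):

```python
class VideoRecorder:
    """Toggle-style recorder: the first tap starts collecting frames in a
    temporary buffer, the second tap assembles them into a stored sequence."""

    def __init__(self):
        self.recording = False
        self._buffer = []
        self.saved_sequences = []

    def on_video_button_tap(self):
        if not self.recording:
            self.recording = True
            self._buffer = []  # start temporary storage
        else:
            self.recording = False
            # Create a "complete video sequence" from the buffered frames;
            # here the sequence is simply the list of collected frames.
            self.saved_sequences.append(list(self._buffer))
            self._buffer = []

    def on_frame(self, frame_bytes):
        if self.recording:
            self._buffer.append(frame_bytes)

# Example: tap to start, receive three frames, tap to stop.
rec = VideoRecorder()
rec.on_video_button_tap()
for frame in (b"frame1", b"frame2", b"frame3"):
    rec.on_frame(frame)
rec.on_video_button_tap()
print(len(rec.saved_sequences[0]))  # 3
```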
Fig. 5A-5H schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (e.g., the monitor device 20 of any of the previous figures).
In fig. 5A, as illustrated, a first visualization device (not shown) has been connected to the monitor device 20 by the first device connector 16a of the first visualization device received at the first connection port 40 a. A first live representation 70a of the first image data generated by the first image sensor of the first visualization device is displayed within the first portion 31 of the graphical user interface.
When a visualization device is connected (e.g. when the monitor device 20 and/or a processing unit of the monitor device detects the connection of the visualization device), the device identifier information may be obtained from the device identifier of the respective visualization device. For example, the visualization device may be equipped with an EPROM (alternatively, a QR code, RFID tag, NFC, etc. may be used) that the monitor device 20 can read. For example, the processing unit of the monitor device may perform a process of interrogating the device identifier via the device connector and the connection port. The EPROM can store information of the visualization device (e.g., a serial number of the visualization device) that can uniquely identify the visualization device. Also, the device identifier information may indicate the type of visualization device (e.g., whether it is an endoscope or a laryngoscope), the brand of the visualization device, the product version, the lot number, and the like.
In response to detecting the connection of the visualization device and after obtaining the device identifier information, the monitor device may open (create or reopen, depending on whether the visualization device has been previously connected) a program session corresponding to the device identifier information. For example, a first program session may be opened that corresponds to first device identifier information obtained from a first visualization device, and a second program session may be opened that corresponds to second device identifier information obtained from a second visualization device (if connected). The program session may be unique and therefore reconnecting a previously connected visualization device may cause the monitor device 20 to reopen a previously created session. The program session may be implemented by creating a folder in the file system of the monitor device 20, for example in a memory of the monitor device 20. The image files and video sequences obtained with a particular visualization device may be stored in a folder corresponding to the visualization device.
Thus, in opening a program session, the monitor device 20 may determine whether a visualization device has been previously connected to the monitor device, e.g., based on the device identifier information. Thus, if it is determined that the visualization device has been previously connected to the monitor device 20, the monitor device 20 reopens the program session corresponding to the device identifier information, and if it is determined that the visualization device has not been previously connected to the monitor device 20, the monitor device 20 creates a new program session corresponding to the device identifier information.
The folders of sessions and images/videos may be based on device identifier information of the attached visualization device. Thus, when a new endoscope is attached, a new folder is created, and if a previously attached endoscope is reattached, the previously created folder is reopened, and any captured still images or videos are saved to the existing folder. Thus, if the endoscope is accidentally pulled out and reinserted, the captured images/videos are not saved into a new folder, but are organized in the same folder.
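A minimal sketch of this folder-per-device behaviour (Python; the directory layout and the "serial" key are assumptions for illustration, not the actual file system used by the monitor device):

```python
from pathlib import Path

def open_program_session(base_dir, device_identifier_info):
    """Create the session folder for a newly connected visualization device,
    or reopen the existing folder if the same device was connected before
    (e.g. after being accidentally unplugged and reinserted)."""
    session_dir = Path(base_dir) / f"session_{device_identifier_info['serial']}"
    reopened = session_dir.exists()
    session_dir.mkdir(parents=True, exist_ok=True)
    return session_dir, reopened

# Example: the first connection creates the folder, a reconnection reopens it.
info = {"serial": "ABC123", "type": "endoscope"}
first_dir, was_reopened = open_program_session("./monitor_sessions", info)
second_dir, was_reopened_again = open_program_session("./monitor_sessions", info)
print(was_reopened, was_reopened_again)  # False True
```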
The folder icon 152 is displayed within a background portion (e.g., the second portion 32 and/or the fourth portion 34) of the graphical user interface. In this example, a folder icon 152 is displayed within the fourth portion. Alternatively, the folder icon 152 may be shown in the second portion 32. The folder icon 152 includes a visual representation 158 of the number of files stored (e.g., during a program session).
As illustrated in fig. 5A, the user may provide a first user input 150 corresponding to selection of the image capture button 36 a. The monitor device 20 (e.g., utilizing the touch sensitive display 26) is adapted to detect the first user input 150. In response to detecting the first user input 150, the monitor device stores a first image file corresponding to image data received at the time the first user input 150 was detected, for example corresponding to the live representation 70a displayed in the first portion 31. The monitor device 20 associates the first image file with the program session. For example, the monitor device 20 may store the first image file in a folder corresponding to the program session.
Further, as shown in fig. 5B, in response to detecting the first user input 150 and the storing of the first image file, the monitor device displays a first representation 154 of a still image corresponding to the stored first image file within a background portion (e.g., the fourth portion 34 as shown) of the graphical user interface. Thereby, the operator is informed that the monitor device 20 has stored the image and is further shown a preview of the stored image, allowing the operator to quickly confirm that the image shows what he/she intended it to show.
As shown in FIG. 5C, after a predetermined delay after detecting the first user input, the monitor device displays an animation 156 that transitions the first representation 154 to the folder icon 152. The predetermined delay may be between 1 second and 5 seconds, such as between 1.5 seconds and 3 seconds, such as 1.5 seconds or such as 2 seconds. Thus, the operator is visually notified that the captured image is stored and placed in the folder represented by the folder icon 152. The duration of the animation 156 may be between 100ms and 1500ms, such as between 300ms and 1000ms, such as between 300ms and 600ms, such as 400ms or 500 ms.
Further, in response to detecting the first user input 150 and the storage of the first image file (see FIG. 5B), the display of the visual representation 158 of the number of stored files stored during the program session is updated by increasing the number of stored files. Alternatively, after the animation 156 is displayed (e.g., in the transition between fig. 5C and 5D), the display of the visual representation 158 of the number of stored files stored during the program session may be updated by increasing the number of stored files.
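The timing of the capture feedback (preview, delay, animation, counter update) could be expressed as in the following sketch (Python; the 1.5 s delay and 400 ms duration are example values taken from the ranges given above, and the event names are hypothetical):

```python
PREVIEW_DELAY_S = 1.5       # delay before the preview animates into the folder
ANIMATION_DURATION_S = 0.4  # duration of the shrink-to-folder animation

def capture_feedback_timeline(stored_file_count):
    """Return (time offset in seconds, event) pairs describing the UI feedback
    that follows a successful image capture. In this variant the folder
    counter is updated once the animation has finished."""
    end = PREVIEW_DELAY_S + ANIMATION_DURATION_S
    return [
        (0.0, "show_preview"),
        (PREVIEW_DELAY_S, "animate_preview_to_folder_icon"),
        (end, f"set_folder_count:{stored_file_count + 1}"),
    ]

print(capture_feedback_timeline(stored_file_count=3))
```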
Fig. 5E to 5H show the capture of a second image file after the first image file has been stored (as explained with reference to fig. 5A to 5D).
As illustrated in fig. 5E, the user may provide a second user input 151 corresponding to selection of the image capture button 36 a. In response to detecting the second user input 151, the monitor device stores a second image file corresponding to the image data received at the time the second user input 151 was detected, for example corresponding to the live representation 70a displayed in the first portion 31. The monitor device 20 associates the second image file with the program session, for example the monitor device 20 may store the second image file in a folder corresponding to the program session, i.e. in the same folder as the first image file.
As shown in fig. 5F, in response to detecting the second user input 151 and the storage of the second image file, the monitor device displays a second representation 155 of a still image corresponding to the stored second image file within a background portion (e.g., fourth portion 34 as shown) of the graphical user interface.
As shown in fig. 5G, after a predetermined delay after detecting the second user input 151, the monitor device displays an animation 157 that transitions the second representation 155 to the folder icon 152. The duration of the animation 157 may be between 100ms and 1500ms, such as between 300ms and 1000ms, such as between 300ms and 600ms, such as 400ms or 500 ms.
Fig. 6A-6D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (e.g., the monitor device 20 of any of the previous figures).
As seen in fig. 6A, the first visualization device, which was connected to the monitor device as illustrated by the device connector 16a in fig. 5A to 5H, has been disconnected. Instead, a second visualization device having a device connector 16b has been connected to the monitor device 20 via the second connection port 40b. The second visualization device may equally well be connected to the first connection port 40a.
As explained above, when the visualization device is connected, the device identifier information may be obtained from the device identifier of the corresponding visualization device, and the monitor device 20 may open a program session based on the device identifier information. Thus, in establishing a connection with the second visualization device (including obtaining the second device identifier information from the second device identifier of the second visualization device), the monitor device 20 may open a second program session corresponding to the second device identifier information, e.g., different from the program session as described with respect to fig. 5A-5H.
Thus, as illustrated in fig. 6A, the monitor device 20 displays a live representation 70b of the second image data generated by the second image sensor of the second visualization device, and the folder icon 162 is displayed within a background portion (e.g., the second portion 32 and/or the fourth portion 34) of the graphical user interface. In this example, a folder icon 162 is displayed within the fourth portion. Alternatively, the folder icon 162 may be shown in the second portion 32. Folder icon 162 includes a visual representation 168 of the number of files stored (e.g., stored during the second program session). Since the second program session is different from the program sessions of fig. 5A-5H (because the visualization means is different), the visual representation 168 may have a different count than the visual representation 158 of fig. 5A-5H.
As illustrated in fig. 6A, the user may provide a third user input 160 corresponding to selection of the image capture button 36A. The monitor device 20 (e.g. utilizing the touch sensitive display 26) is adapted to detect the third user input 160. In response to detecting the third user input 160, the monitor device stores a third image file corresponding to the second image data received at the time the third user input 160 was detected, e.g. corresponding to the live representation 70b displayed in the first portion 31. The monitor device 20 associates the third image file with the second program session. For example, the monitor device 20 stores the third image file in a folder corresponding to the second program session.
As shown in fig. 6B, in response to detecting the third user input 160 and the storage of the third image file, the monitor device 20 displays a third representation 164 of the still image corresponding to the stored third image file within a background portion of the graphical user interface (e.g., the fourth portion 34 as shown).
As shown in fig. 6C, after a predetermined delay after detecting the third user input, the monitor device displays an animation 166 transitioning the third representation 164 to the folder icon 162. The predetermined delay may be the same delay as explained with respect to fig. 5C. The animation 166 may have a duration like the duration of the animation 156 as explained with respect to fig. 5C.
Further, in response to detecting the third user input 160 and the storage of the third image file (see FIG. 6B), the display of the visual representation 168 of the number of stored files stored during the second program session is updated by increasing the number of stored files.
Fig. 7A-7F schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (such as the monitor device 20 of any of the previous figures). The example graphical user interface of fig. 7A may follow, for example, the example graphical user interface of fig. 5H. Thus, the folder icon 172 illustrated in fig. 7A-7F may correspond to the folder icon 152 of fig. 5A-5H, and the program session as mentioned with respect to fig. 7A-7F may be the same program session as explained with respect to fig. 5A-5H.
As illustrated in fig. 7A, the user may provide a fourth user input 170 corresponding to selection of the video capture button 36 b. The video capture button 36b may be initially displayed in a first appearance as illustrated in fig. 7A. The monitor device 20 (e.g., with the touch sensitive display 26) is adapted to detect the fourth user input 170. In response to detecting the fourth user input 170, the monitor device changes the appearance of the video capture button 36B to a second appearance as shown in fig. 7B. Further, in response to detecting the fourth user input 170, the monitor device begins collecting image data received from the image sensor and temporarily storing the data in the memory.
After detecting the fourth user input 170, the monitor device is adapted to detect a fifth user input 171 (as provided in fig. 7C) corresponding to the selection of the video capture button 36 b.
In response to detecting the fifth user input 171, the monitor device stops recording. More specifically, the monitor device stores a first video data file corresponding to image data received between the detection of the fourth user input 170 (fig. 7A) and the detection of the fifth user input 171. The monitor device 20 (e.g., a processing unit of the monitor device 20) may read the temporarily stored data from the memory and create the first video data file based thereon. The monitor device 20 further associates the first video data file with the program session. For example, the monitor device 20 may store the first video data file in a folder corresponding to the program session.
Further, as shown in fig. 7D, in response to detecting the fifth user input 171 and the storage of the first video data file, the monitor device displays a fourth representation 174 of a frame corresponding to the stored first video data file within a background portion of the graphical user interface (e.g., the fourth portion 34 as shown). As illustrated, the representation 174 may include an indicator indicating that the representation corresponds to a video data file; in this example, the indicator is shown as a "play" icon. Similarly, an indicator may be provided on the representation of an image file, different from the indicator provided on the representation of a video data file. Thus, the indicators may visually distinguish whether a video or a still image has been captured.
Further, as also shown in fig. 7D, in response to detecting the fifth user input 171, the monitor device changes the appearance of the video capture button 36b to the first appearance.
Further, in response to detecting the fifth user input 171 and the storage of the first video data file, the display of the visual representation 178 of the number of stored files stored during the program session is updated by increasing the number of stored files.
As shown in fig. 7E, after a predetermined delay after detecting the fifth user input 171, the monitor device displays an animation 176 that transitions the fourth representation 174 to the folder icon 172. The predetermined delay may be the same delay as explained with respect to fig. 5C. The animation 176 may have a duration like that of the animation 156 as explained with respect to fig. 5C.
Fig. 8A-8H schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (e.g., the monitor device 20 of any of the previous figures). The exemplary graphical user interface of fig. 8A may follow, for example, the exemplary graphical user interface of fig. 7F. Thus, the folder icon 802 illustrated in fig. 8A-8F may correspond to the folder icon 172 of fig. 7A-7F, and the program session as mentioned with respect to fig. 8A-8F may be the same program session as explained with respect to fig. 7A-7F.
As illustrated in fig. 8A, the user may provide a sixth user input 800 corresponding to selecting a folder icon 802. The monitor device 20 is adapted to detect the sixth user input 800 and, as illustrated in fig. 8B, in response to detecting the sixth user input 800, the monitor device 20 displays within a background portion (e.g., the second portion or the fourth portion 34) of the graphical user interface a first plurality of representations 804 corresponding to the first plurality of stored image files stored during the program session. In this example, the first plurality of representations 804 is displayed within the fourth portion 34 of the graphical user interface. Alternatively, the first plurality of representations 804 may be displayed in the second portion 32 of the graphical user interface. Advantageously, the first plurality of representations 804 are displayed in the same portion as folder icon 802.
Also, in response to detecting the sixth user input 800, a session information box 805 and a session summary icon 807 are displayed within a background portion of the graphical user interface, such as within the fourth portion 34 of the graphical user interface.
The exemplary graphical user interface of fig. 8B may alternatively be achieved by disconnecting the visualization device from the monitor device 20. For example, in response to the visualization device being disconnected from the monitor device 20, the monitor device 20 may display a graphical user interface as illustrated in fig. 8B, e.g., including the first plurality of representations 804 corresponding to the first plurality of stored image files stored during the program session, as well as a session information box 805 and a session summary icon 807.
As illustrated in fig. 8C, the user may provide a seventh user input 806 corresponding to selection of a primary representation 804a of the first plurality of representations 804. The primary representation 804a corresponds to a primary stored image file. The monitor device 20 is adapted to detect the seventh user input 806 and, as illustrated in fig. 8D, in response to detecting the seventh user input 806, the monitor device 20 displays an enlarged representation 808 of the primary stored image file within the first portion 31 of the graphical user interface. In addition, the monitor device 20 displays thumbnail representations 809 of a second plurality of stored image files stored during the program session. The displayed thumbnail representations 809 distinguish this view from a live view (e.g., fig. 3), thereby informing the user that the monitor device 20 is not currently showing a live representation of image data received from the visualization device, but rather that the monitor device 20 is displaying a stored image.
As illustrated in fig. 8E, stemming from the situation as described with respect to fig. 8B, the user may provide an eighth user input 810 corresponding to selecting session summary icon 807. The monitor device 20 is adapted to detect an eighth user input 810 and, as illustrated in fig. 8F, in response to detecting the eighth user input 810, the monitor device 20 displays a third plurality of representations 812 in the first portion of the graphical user interface corresponding to the third plurality of stored image files stored during the program session. Further, in response to detecting the eighth user input 810, the monitor device displays general information 814 for the program session (e.g., in the second portion 32 of the graphical user interface) and a comment field 816 in the fourth portion 34 of the graphical user interface.
The monitor device also displays an export icon 932 and a delete icon 946 in the second portion of the graphical user interface. These will be described in more detail later.
As illustrated in fig. 8G, the user may provide a ninth user input 818 corresponding to selecting the comment field 816. The monitor device 20 is adapted to detect the ninth user input 818 and, as illustrated in fig. 8H, in response to detecting the ninth user input 818, the monitor device 20 displays a virtual keyboard 820 in the first portion 31 of the graphical user interface and optionally extends the virtual keyboard into the second portion 32 of the graphical user interface. The virtual keyboard is configured for entering text in the comment field 816. Accordingly, the monitor device 20 is further adapted to detect (e.g., with the touch-sensitive display 26) a sequence of keyboard user inputs corresponding to typing text using the displayed virtual keyboard, and to detect a tenth user input 822 indicating acceptance of text typed using the displayed virtual keyboard 820. For example, as illustrated, the tenth user input 822 may correspond to selecting a "save" button. In response to detecting the tenth user input 822, the monitor device 20 stores the entered text and associates the text as an annotation to the program session.
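A sketch of associating the typed text with the program session (Python; storing the comment in a JSON file next to the session's image files is an assumption for illustration only, not the actual storage format of the monitor device):

```python
import json
from pathlib import Path

def save_session_comment(session_dir, text):
    """Store the text typed with the virtual keyboard and associate it with
    the program session by writing it alongside the session's image files."""
    meta_path = Path(session_dir) / "session_meta.json"
    meta = json.loads(meta_path.read_text()) if meta_path.exists() else {}
    meta["comment"] = text
    meta_path.write_text(json.dumps(meta, indent=2))
    return meta_path

# Hypothetical usage:
# save_session_comment("./monitor_sessions/session_ABC123", "Annotation text")
```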
Fig. 9A-9K schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (e.g., the monitor device 20 of any of the previous figures). The monitor device 20 may or may not be connected to a visualization device.
Referring to FIG. 9A, the one or more actionable menu items 42 displayed in the third portion 33 of the graphical user interface include a first actionable menu item that is an archive menu item 42 a.
As illustrated in fig. 9B, an eleventh user input 900 may be received corresponding to selection of the archive menu item 42 a. The monitor device 20 is adapted to detect an eleventh user input 900 and, as illustrated in fig. 9C, in response to detecting the eleventh user input 900, the monitor device 20 displays a main filing menu 902 associated with the filing menu item 42a within the fourth portion of the graphical user interface. Primary archive menu 902 includes one or more primary actionable archive items (including first primary actionable archive item 903 a). In the illustrated example, the one or more primary actionable archive items include a first primary actionable item 903a to retrieve a most recent (e.g., most recently stored) image and video, and a second primary actionable item 903b to search for the stored image and video.
As illustrated in fig. 9C, while primary archive menu 902 is displayed, a twelfth user input 904 may be received corresponding to selection of a first primary actionable archive item 903 a. The monitor device is adapted to detect a twelfth user input 904 and, as illustrated in fig. 9D, in response to detecting the twelfth user input 904, the monitor device 20 displays a secondary archiving menu 906a associated with the first primary actionable archiving item in the first portion of the graphical user interface, optionally extending the secondary archiving menu into the second portion 32 and/or the fourth portion 34 of the graphical user interface.
The contents of the secondary archiving menu may be subject to authorization, for example to whether the user is logged into the monitor device (e.g., authenticated by typing in a username and password) and whether the logged-in user has certain privileges. When the monitor device is operating in an authorized state, the secondary archiving menu may include a list of stored (e.g., all stored) program sessions, such as the secondary archiving menu 906a illustrated in fig. 9D. When the monitor device 20 is operating in an unauthorized state and a setting requiring authorization is activated, the secondary archiving menu may include an empty list (e.g., the secondary archiving menu 906c illustrated in fig. 9F) or a list of a subset of the stored program sessions (e.g., the secondary archiving menu 906b illustrated in fig. 9E). When the monitor device 20 is operating in an unauthorized state and the setting requiring authorization is disabled, the secondary archiving menu may include a list of stored program sessions, such as the secondary archiving menu 906a illustrated in fig. 9D.
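The authorization-dependent content of the secondary archiving menu could be selected as in the following sketch (Python; the "public" flag used to form the restricted subset is a hypothetical example of how such a subset might be defined):

```python
def visible_sessions(all_sessions, authorized, require_authorization):
    """Decide which stored program sessions the secondary archiving menu
    lists, depending on the authorization state and the setting that
    requires authorization."""
    if authorized:
        return list(all_sessions)        # authorized: full list
    if require_authorization:
        # Unauthorized with the setting activated: empty list, or optionally
        # a restricted subset (here: sessions explicitly marked as public).
        return [s for s in all_sessions if s.get("public", False)]
    return list(all_sessions)            # setting disabled: full list

sessions = [{"id": "s1", "public": True}, {"id": "s2", "public": False}]
print([s["id"] for s in visible_sessions(sessions, authorized=False,
                                          require_authorization=True)])  # ['s1']
```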
As illustrated in fig. 9G, while the secondary archiving menu 906a is displayed (the illustrated example shows the secondary archiving menu 906a according to fig. 9D; however, the same would apply to the secondary archiving menu 906b according to fig. 9E), a thirteenth user input 908 may be received corresponding to selection of a first stored program session 907. The monitor device is adapted to detect the thirteenth user input 908 and, as illustrated in fig. 9H, in response to detecting the thirteenth user input 908, the monitor device displays a fourth plurality of representations 912 in the first portion 31 of the graphical user interface corresponding to a fourth plurality of stored image files stored during the first stored program session 907. Further, also in response to detecting the thirteenth user input 908, the monitor device 20 displays general information 914 of the first stored program session 907, for example in the second portion 32 of the graphical user interface. Further, also in response to detecting the thirteenth user input 908, the monitor device 20 displays a comment field 916, for example in the fourth portion 34 of the graphical user interface.
In the case where the first stored program session 907 selected by the thirteenth user input 908 is the same as the program sessions of fig. 8A-8H, the graphical user interface as provided in response to detecting the thirteenth user input 908 and as illustrated in fig. 9H would be the same as the graphical user interface as provided in response to detecting the eighth user input 810 and as illustrated in fig. 8F. Thus, the disclosure regarding the selection of the comment field 816 by the ninth user input 818 as described with respect to FIG. 8F applies mutatis mutandis to the case of FIG. 9H. Similarly, the following disclosure with respect to fig. 9H applies mutatis mutandis to the case of fig. 8F.
As illustrated in fig. 9H, a fourteenth user input 918 may be received corresponding to selection of a selected representation 919 of the fourth plurality of representations 912. The selected representation 919 corresponds to a selected stored image file. The monitor device is adapted to detect the fourteenth user input 918 and, as illustrated in fig. 9J, in response to detecting the fourteenth user input 918, the monitor device displays a magnified representation 920 of the selected stored image file within the first portion 31 of the graphical user interface. Further, in response to detecting the fourteenth user input 918, the monitor device 20 displays thumbnail representations 922 of a fifth plurality of stored image files stored during the first stored program session 907. The thumbnail representations 922 include a thumbnail representation 920a of the selected stored image file. The thumbnail representations 922 further include further thumbnails 924 corresponding to further stored image files.
Further, in response to detecting the fourteenth user input 918, the monitor device 20 displays image information 922 associated with the selected stored image file within the second portion 32 of the graphical user interface. Image information 922 associated with the selected stored image file includes information indicative of a visualization device. The information is saved based on device identifier information obtained in response to detecting the visualization device.
As illustrated in fig. 9J, a fifteenth user input 926 may be received corresponding to selection of a selected thumbnail 924 in the displayed thumbnail representations 922. The selected thumbnail corresponds to a second selected stored image file. The monitor device is adapted to detect the fifteenth user input 926 and, as illustrated in fig. 9K, in response to detecting the fifteenth user input 926, the monitor device displays an enlarged representation 928 of the second selected stored image file within the first portion 31 of the graphical user interface. The thumbnail representations 922 are updated to display representations 930 of a sixth plurality of stored image files stored during the first stored program session. In the illustrated example, the thumbnail representations 922 (as seen in fig. 9J) of the fifth plurality of stored image files and the thumbnail representations 930 (as seen in fig. 9K) of the sixth plurality of stored image files represent the same plurality of stored image files. However, in some examples, the thumbnail representations may differ, for example where the number of images is so large that they cannot all be displayed at the same time.
Fig. 10A-10D schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (e.g., the monitor device 20 of any of the previous figures). The monitor device 20 may or may not be connected to a visualization device. The example graphical user interface of fig. 10A may follow, for example, the example graphical user interface of fig. 9J, 9K, or 8D. In the illustrated example, the monitor device 20 displays an enlarged representation 1001 of a stored image file. The enlarged representation 1001 may correspond to the enlarged representation 920 of the selected stored image file of fig. 9J, the enlarged representation 928 of the second selected stored image file of fig. 9K, or the enlarged representation 808 of the primary stored image file of fig. 8D.
Referring to fig. 10A, an export icon 932 is displayed, for example, in the second portion of the graphical user interface. As illustrated, a sixteenth user input 934 corresponding to selection of the export icon 932 may be received. The monitor device 20 is adapted to detect the sixteenth user input 934 and, in response to detecting the sixteenth user input 934, as illustrated in fig. 10B, the monitor device displays, in the first portion 31 of the graphical user interface, a plurality of representations 1002 corresponding to a plurality of stored image files; for example, the plurality of representations 1002 may be the fourth plurality of representations 912 corresponding to the fourth plurality of stored image files, as illustrated with respect to fig. 9H. Each of the plurality of representations 1002 includes a selection indicator 1004, wherein the selection indicator 1004a of the representation of the stored image file (whose enlarged representation 1001 was displayed upon detection of the sixteenth user input 934) is activated.
Further, in response to detecting the sixteenth user input 934, the monitor device displays an export menu 1006 in the second and fourth portions 32, 34 of the graphical user interface. As illustrated, the export menu 1006 may be divided into two parts. The export menu 1006 includes an export confirmation icon 1008.
After detecting the sixteenth user input 934, and as illustrated in fig. 10C, a seventeenth user input 1010 may be received, the seventeenth user input corresponding to selection of one or more of the selection indicators 1004 of the plurality of representations 1002. In the illustrated example, the seventeenth user input 1010 corresponds to selection of the second selection indicator 1004b of the selection indicators 1004. The monitor device 20 is adapted to detect the seventeenth user input 1010 and, in response to detecting the seventeenth user input 1010, the monitor device 20 activates the selection indicators corresponding to the selected one or more selection indicators. Thus, in the illustrated example, as illustrated in fig. 10D, the second selection indicator 1004b is activated. In a case where the seventeenth user input 1010 corresponds to selection of an indicator that is already selected (e.g., the selection indicator 1004a), the monitor device 20 would instead deactivate the selection indicator 1004a.
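A sketch of the selection-indicator toggling described above (Python; the indicator identifiers are represented as strings for illustration only):

```python
def toggle_selection(selected_ids, tapped_id):
    """Activate the tapped selection indicator, or deactivate it if it was
    already active (as when tapping an indicator that is already selected)."""
    updated = set(selected_ids)
    if tapped_id in updated:
        updated.discard(tapped_id)
    else:
        updated.add(tapped_id)
    return updated

# Example: 1004a starts selected; tapping 1004b adds it, tapping 1004a removes it.
selection = {"1004a"}
selection = toggle_selection(selection, "1004b")  # {'1004a', '1004b'}
selection = toggle_selection(selection, "1004a")  # {'1004b'}
print(sorted(selection))
```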
As illustrated, the export menu 1006 includes an export summary that indicates the number of files and the file types to be exported.
As illustrated, the user may select an export type, such as DICOM or Basic (e.g., jpg, png, tiff, etc.), and may enter patient details such as patient ID, first name, last name, date of birth, etc. In some examples, export via DICOM may be used only by authenticated users.
As illustrated in fig. 10D, after selecting a file for export, the user may provide an eighteenth user input 1010 corresponding to selecting export confirmation icon 1008. The monitor device is adapted to detect an eighteenth user input 1010 and, in response to detecting the eighteenth user input 1010, the monitor device transmits the stored image file corresponding to the selected one or more selection indicators 1004a, 1004b to an auxiliary device (e.g., a USB drive or server) via a local area network or a wide area network.
Other graphical user interfaces may also include the export icon 932, for example the interface illustrated in fig. 8F following the eighth user input 810 corresponding to selecting the session summary icon 807 of fig. 8E, and the interface illustrated in fig. 9H following the thirteenth user input 908 corresponding to selecting the first stored program session 907 of fig. 9G. Providing input corresponding to selection of the export icon 932 in these instances will result in a graphical user interface as illustrated in fig. 10B and the associated description, with the modification that none of the selection indicators will be activated, i.e., none of the stored image files are pre-selected for export.
Fig. 11A-11E schematically illustrate exemplary user interactions with a graphical user interface of a monitor device 20 (e.g., the monitor device 20 of any of the previous figures). The monitor device 20 may or may not be connected to a visualization device. The example graphical user interface of fig. 11A may follow, for example, the example graphical user interface of fig. 9J, 9K, or 8D. In the illustrated example, the monitor device 20 displays an enlarged representation 1101 of a stored image file. The enlarged representation 1101 may correspond to the enlarged representation 920 of the selected stored image file of fig. 9J, the enlarged representation 928 of the second selected stored image file of fig. 9K, or the enlarged representation 808 of the primary stored image file of fig. 8D.
A nineteenth user input 948 may be received that corresponds to selection of the delete icon 946. The monitor device 20 is adapted to detect the nineteenth user input 948, and as illustrated in fig. 11B, in response to detecting the nineteenth user input 948, the monitor device 20 displays a confirmation dialog 950 indicating a potential deletion of the stored image file (corresponding to the enlarged representation 1101 displayed when the nineteenth user input 948 was detected).
The user provides a twentieth user input 952 to the confirmation dialog 950. The monitor device is adapted to detect the twentieth user input 952 and determine whether the twentieth user input 952 indicates that the user confirms deletion of the stored image file or that the user cancels deletion of the stored image file.
In one example, as illustrated in fig. 11B, the twentieth user input 952 indicates that the user has cancelled deletion of the selected stored image file. The monitor device therefore foregoes deleting the selected stored image file and keeps displaying the enlarged representation 1101 of the stored image file within the first portion 31 of the graphical user interface, as shown in fig. 11C.
In another example, as illustrated in fig. 11D, the twentieth user input 952 indicates that the user confirms deletion of the stored image file. The monitor device therefore deletes the stored image file and replaces the display of the enlarged representation 1101 of the stored image file with an enlarged representation 1102 of another stored image file, as illustrated in fig. 11E.
As seen, for example, in fig. 8F following the eighth user input 810 corresponding to selecting the session summary icon 807 of fig. 8E, and in fig. 9H following the thirteenth user input 908 corresponding to selecting the first stored program session 907 of fig. 9G, a graphical user interface including a selection interface as explained with respect to figs. 10B-10D may also include the delete icon 946. Multiple image files may be selected for deletion via the selection indicators explained with respect to figs. 10B-10D, and subsequent user input to the delete icon 946 displays a confirmation dialog, similar to the confirmation dialog 950, indicating potential deletion of the selected stored image files. Confirmation or cancellation, similar to that described with respect to figs. 11B and 11D, causes deletion of the selected stored image files or cancellation of the deletion, respectively.
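The confirm-or-cancel branching of figs. 11B-11E can be summarised, under the assumption of a simple storage API (store.delete and store.list_files are hypothetical), as:

```python
def handle_delete_confirmation(store, file_id, confirmed):
    """Apply the outcome of the deletion confirmation dialog.

    confirmed=True:  delete the file and return the id of another stored
                     image file to show in the enlarged view (or None).
    confirmed=False: forgo deletion and keep showing the same file.
    """
    if not confirmed:
        return file_id  # keep the current enlarged representation on screen
    store.delete(file_id)           # assumed storage API
    remaining = store.list_files()  # assumed storage API
    return remaining[0] if remaining else None
```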
Exemplary embodiments of the present disclosure are set forth in the following items:
1. a medical visualization system comprising a visualization apparatus having an image sensor configured to generate image data indicative of a view from the visualization apparatus,
the medical visualization system further includes a monitor device operable to receive the image data while the image sensor is generating the image data, the monitor device including a first housing extending from a first housing side to a second housing side in a first direction and from a third housing side to a fourth housing side in a second direction perpendicular to the first direction, the monitor device including a touch-sensitive display housed in the first housing and having a first length in the first direction and a second length in the second direction, the monitor device displaying a graphical user interface with the touch-sensitive display,
wherein the monitor device:
-opening a program session;
-displaying a live representation of the image data within the first portion of the graphical user interface;
-displaying one or more actionable items within a second portion of the graphical user interface, wherein the one or more actionable items comprise an image capture button, the second portion and the first portion being non-overlapping; and
-displaying a folder icon within a background portion of the graphical user interface, the background portion and the first portion being non-overlapping,
and wherein the monitor device is further adapted to detect a first user input corresponding to selection of the image capture button, and in response to detecting the first user input, the monitor device:
-storing a first image file corresponding to image data received upon detection of the first user input;
-associating the first image file with the program session; and
-displaying a first representation of a still image corresponding to the stored first image file within a background portion of the graphical user interface;
the monitor device displays an animation that transitions the first representation to the folder icon after a predetermined delay after detecting the first user input.
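Item 1 above describes a capture sequence: store the frame, associate it with the open program session, display a representation in the background portion, and animate that representation into the folder icon after a predetermined delay. A minimal sketch of such a sequence follows; the ui, storage and session objects and their methods are assumptions of this example, and the 2-second delay is merely one value within the ranges recited in items 4 and 5.

```python
import threading

PREDETERMINED_DELAY_S = 2.0  # one value within the 1-5 s range of item 4

def on_image_capture(ui, storage, session, current_frame):
    """Handle a first user input selecting the image capture button."""
    image_file = storage.store_still_image(current_frame)   # assumed API
    session.associate(image_file)                            # link the file to the program session
    representation = ui.show_representation(image_file)      # shown in the background portion
    # After the predetermined delay, animate the representation into the folder icon
    # (the animation itself would last e.g. 400-500 ms, cf. item 5).
    threading.Timer(
        PREDETERMINED_DELAY_S,
        lambda: ui.animate_to_folder_icon(representation),
    ).start()
```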
2. The medical visualization system according to item 1, wherein the monitor device is adapted to establish a connection with the visualization device, including obtaining device identifier information from a device identifier of the visualization device, and wherein the monitor device opens the program session in response to establishing the connection with the visualization device, and the program session corresponds to the device identifier information.
3. The medical visualization system according to item 2, wherein, in opening the program session, the monitor device determines whether the visualization device has been previously connected to the monitor device based on the device identifier information; and wherein,
in accordance with a determination that the visualization device has been previously connected to the monitor device, the monitor device reopens the program session corresponding to the device identifier information; and
in accordance with a determination that the visualization device has not been previously connected to the monitor device, the monitor device creates a program session corresponding to the device identifier information.
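Item 3 amounts to a reopen-or-create decision keyed on the device identifier information; a hedged sketch follows, in which session_store and its methods are assumed for illustration only.

```python
def open_program_session(session_store, device_identifier_info):
    """Reopen the program session of a previously connected visualization
    device, or create a new session for a device seen for the first time."""
    existing = session_store.find_by_device_id(device_identifier_info)  # assumed lookup
    if existing is not None:
        existing.reopen()
        return existing
    return session_store.create(device_identifier_info)  # assumed factory call
```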
4. The medical visualization system according to any of the preceding items, wherein the predetermined delay is between 1 and 5 seconds, such as between 1.5 and 3 seconds, such as 1.5 or 2 seconds.
5. The medical visualization system according to any of the preceding items, wherein a duration of the animation of the transition of the first representation to the folder icon is between 100ms and 1500ms, such as between 300ms and 1000ms, such as between 300ms and 600ms, such as 400ms or 500 ms.
6. The medical visualization system of any of the preceding items, wherein the folder icon includes a visual representation of a number of stored files stored during the program session, and wherein, in response to detecting the first user input, the monitor device updates the display of the visual representation of the number of stored files stored during the program session, including increasing the number of stored files.
7. The medical visualization system according to any of the preceding items, wherein, after detecting the first user input, the monitor device is further adapted to detect a second user input corresponding to a selection of the image capture button, and in response to detecting the second user input, the monitor device:
-storing a second image file corresponding to image data received upon detection of the second user input;
-associating the second image file with the program session; and
-displaying a second representation of the still image corresponding to the stored second image file within the background portion of the graphical user interface;
after the predetermined delay after detecting the second user input, the monitor device displays an animation that transitions the second representation to the folder icon.
8. The medical visualization system according to any of the preceding items, wherein the monitor device is adapted to establish a connection with a second visualization device, including obtaining second device identifier information from a second device identifier of the second visualization device, the monitor device opening a second program session corresponding to the second device identifier information in response to establishing the connection with the second visualization device.
9. The medical visualization system according to item 8, wherein the monitor device:
-displaying a live representation of second image data generated by a second image sensor of the second visualization apparatus;
-displaying a second folder icon within a background portion of the graphical user interface;
and wherein the monitor device is adapted to detect a third user input corresponding to selection of the image capture button, and in response to detecting the third user input, the monitor device:
-storing a third image file corresponding to second image data received upon detection of the third user input; and
-associating the third image file with the second program session.
10. The medical visualization system according to item 9, wherein, in response to detecting the third user input, the monitor device displays a third representation of the still image corresponding to the stored third image file within the background portion of the graphical user interface,
after the predetermined delay after detecting the third user input, the monitor device displays an animation that transitions the third representation to the second folder icon.
11. The medical visualization system according to any of the preceding items, wherein the one or more actionable items include a video capture button displayed with a first appearance, and wherein the monitor device is adapted to detect a fourth user input corresponding to selection of the video capture button, and in response to detecting the fourth user input, the monitor device changes the appearance of the video capture button to a second appearance,
after detecting the fourth user input, the monitor device is adapted to detect a fifth user input corresponding to selection of the video capture button, and in response to detecting the fifth user input, the monitor device:
-changing the appearance of the video capture button to the first appearance;
-storing a first video data file corresponding to image data received between the detection of the fourth user input and the fifth user input;
-associating the first video data file with the program session; and
-displaying a fourth representation of a frame corresponding to the stored first video data file within a background portion of the graphical user interface,
after the predetermined delay after detecting the fifth user input, the monitor device displays an animation that transitions the fourth representation to the folder icon.
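Item 11 describes a two-press video capture interaction: the first press switches the button to a recording appearance, and the second press restores the first appearance and stores the frames received in between as a video data file associated with the program session. One possible sketch, with ui, storage and session again assumed:

```python
class VideoCaptureButton:
    """Toggles video recording per item 11: first press starts recording,
    second press stops, stores and associates the video data file."""

    def __init__(self, ui, storage, session):
        self.ui, self.storage, self.session = ui, storage, session
        self.recording = False
        self.frames = []

    def on_press(self):
        if not self.recording:
            self.recording = True
            self.ui.set_button_appearance("recording")  # second appearance
        else:
            self.recording = False
            self.ui.set_button_appearance("idle")       # back to the first appearance
            # Frames received between the fourth and fifth user inputs.
            video_file = self.storage.store_video(self.frames)  # assumed API
            self.session.associate(video_file)
            self.ui.show_representation(video_file)      # fourth representation
            self.frames = []

    def on_frame(self, frame):
        if self.recording:
            self.frames.append(frame)
```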
12. The medical visualization system according to any of the preceding items, wherein the monitor device is adapted to detect a sixth user input corresponding to selection of the folder icon, and in response to detecting the sixth user input, the monitor device displays, within a background portion of the graphical user interface, a first plurality of representations corresponding to a first plurality of stored image files stored during the program session.
13. The medical visualization system of item 12, wherein the monitor device is further adapted to detect a seventh user input corresponding to selecting a primary representation of the first plurality of representations displayed in response to detecting the sixth user input, the primary representation corresponding to a primary stored image file, and in response to detecting the seventh user input, the monitor device:
-displaying a magnified representation of the primary stored image file within the first portion of the graphical user interface; and
-displaying thumbnail representations of a second plurality of the stored image files stored during the program session.
14. The medical visualization system according to any of the items 12 to 13, wherein, in response to detecting the sixth user input, the monitor device displays a session summary icon within a background portion of the graphical user interface,
the monitor device is adapted to detect an eighth user input corresponding to selection of the session summary icon, and in response to detecting the eighth user input, the monitor device:
-displaying, in the first portion of the graphical user interface, a third plurality of representations corresponding to a third plurality of stored image files stored during the program session;
-displaying general information of the program session; and
- displaying a comment field.
15. The medical visualization system of item 14, wherein the monitor device is further adapted to detect a ninth user input corresponding to selection of the comment field, and in response to detecting the ninth user input, the monitor device displays a virtual keyboard in the first portion of the graphical user interface and optionally extends the virtual keyboard into the second portion of the graphical user interface, the virtual keyboard configured for entering text in the comment field,
the monitor device is further adapted to:
-detecting a sequence of keyboard user inputs corresponding to typing text using the displayed virtual keyboard; and
-detecting a tenth user input indicating acceptance of text entered using the displayed virtual keyboard,
in response to detecting the tenth user input, the monitor device stores the entered text and associates the text as an annotation to the program session.
16. The medical visualization system of any of the preceding items, wherein, in response to the visualization device being disconnected from the monitor device, the monitor device displays, within a background portion of the graphical user interface, a first plurality of representations corresponding to a first plurality of stored image files stored during the program session.
17. The medical visualization system of item 16, wherein the monitor device further displays a session summary icon within the background portion of the graphical user interface in response to the visualization device being disconnected from the monitor device.
18. The medical visualization system of any of the preceding items, wherein the monitor device displays one or more actionable menu items within a third portion of the graphical user interface, wherein the one or more actionable menu items include an archive menu item, the third portion and the first portion being non-overlapping,
the monitor device is adapted to detect an eleventh user input corresponding to selection of the archive menu item, and in response to detecting the eleventh user input, the monitor device displays a primary archive menu associated with the archive menu item within a fourth portion of the graphical user interface, wherein the primary archive menu includes one or more primary actionable archive items, including a first primary actionable archive item, the fourth portion and the first portion being non-overlapping,
the monitor device is adapted to detect a twelfth user input corresponding to selection of the first primary actionable archive item when the primary archive menu is displayed, and in response to detecting the twelfth user input, the monitor device displays a secondary archive menu associated with the first primary actionable archive item in the first portion of the graphical user interface,
in accordance with the monitor device operating in an authorized state, the secondary archive menu includes a list of stored program sessions,
in accordance with the monitor device operating in an unauthorized state and a require-authorization setting being enabled, the secondary archive menu includes an empty list or a list of a subset of the stored program sessions,
in accordance with the monitor device operating in an unauthorized state and the require-authorization setting being disabled, the secondary archive menu includes the list of stored program sessions.
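Item 18's three-way branching on the authorization state and the (assumed) require-authorization setting could, for illustration, be expressed as:

```python
def secondary_archive_menu_sessions(all_sessions, authorized, require_authorization):
    """Return the stored program sessions to list in the secondary archive menu."""
    if authorized:
        return list(all_sessions)
    if require_authorization:
        # Unauthorized while authorization is required: an empty list here
        # (a permitted subset could be returned instead).
        return []
    # Authorization not required: full list even when unauthorized.
    return list(all_sessions)
```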
19. The medical visualization system of item 18, wherein, when displaying the secondary archive menu including the list of stored program sessions or the list of the subset of the stored program sessions, the monitor device is adapted to detect a thirteenth user input corresponding to selection of a first stored program session of the list of stored program sessions or of the list of the subset of the stored program sessions, and in response to detecting the thirteenth user input, the monitor device:
-displaying, in the first portion of the graphical user interface, a fourth plurality of representations corresponding to a fourth plurality of stored image files stored during the first stored program session;
-displaying general information of the first stored program session; and
- displaying a comment field.
20. The medical visualization system of item 19, wherein the monitor device is further adapted to detect a fourteenth user input corresponding to selection of a selected representation of the fourth plurality of representations, wherein the selected representation corresponds to a selected stored image file, and in response to detecting the fourteenth user input, the monitor device:
-displaying a magnified representation of the selected stored image file within the first portion of the graphical user interface; and
-displaying thumbnail representations of a fifth plurality of the stored image files stored during the first stored program session.
21. The medical visualization system of item 20, wherein the thumbnail representations include thumbnail representations of the selected stored image file.
22. The medical visualization system according to any of the items 20 to 21, wherein, in response to detecting the fourteenth user input, the monitor device further displays image information associated with the selected stored image file within the second portion of the graphical user interface.
23. The medical visualization system of item 22, wherein the image information associated with the selected stored image file includes information indicative of the visualization device.
24. The medical visualization system according to any one of the items 20 to 23, wherein the monitor device is further adapted to detect a fifteenth user input corresponding to a selection of a selected thumbnail of the displayed thumbnail representations, wherein the selected thumbnail corresponds to a second selected stored image file, and in response to detecting the fifteenth user input, the monitor device:
-displaying a magnified representation of the second selected stored image file within the first portion of the graphical user interface; and
-displaying thumbnail representations of a sixth plurality of the stored image files stored during the first stored program session.
25. The medical visualization system according to any of the items 20 to 24, wherein, in response to detecting the fourteenth user input, the monitor device further displays an export icon, and wherein the monitor device is further adapted to detect a sixteenth user input corresponding to selecting the export icon, and in response to detecting the sixteenth user input, the monitor device:
-displaying, in a first portion of the graphical user interface, the fourth plurality of representations corresponding to the fourth plurality of stored image files, wherein each of the fourth plurality of representations comprises a selection indicator, wherein the selection indicator of the selected representation is activated; and
- displaying an export menu comprising an export confirmation icon.
26. The medical visualization system of item 25, wherein the monitor device is further adapted to detect a seventeenth user input after detecting the sixteenth user input, the seventeenth user input corresponding to selection of one or more of the selection indicators of the fourth plurality of representations, and in response to detecting the seventeenth user input, the monitor device activates the selection indicator of the fourth plurality of representations corresponding to the selected one or more selection indicators.
27. The medical visualization system according to any of the items 25 to 26, wherein the monitor device is further adapted to detect an eighteenth user input corresponding to the selection of the export confirmation icon after detecting the sixteenth user input, and in response to detecting the eighteenth user input, the monitor device transmits the stored image file corresponding to the selected one or more selection indicators to an auxiliary device.
28. The medical visualization system according to any of the items 20 to 27, wherein, in response to detecting the fourteenth user input, the monitor device further displays a delete icon, and wherein the monitor device is further adapted to detect a nineteenth user input corresponding to selecting the delete icon, and in response to detecting the nineteenth user input, the monitor device displays a confirmation dialog indicating a potential deletion of the selected stored image file,
the monitor device is adapted to detect a twentieth user input to the confirmation dialog, and in accordance with the twentieth user input indicating user confirmation of deletion of the selected stored image file, the monitor device:
-deleting the selected stored image file; and
-replacing the display of the enlarged representation of the selected stored image file with the enlarged representation of the second selected stored image file,
in accordance with the twentieth user input indicating user cancellation of deletion of the selected stored image file, the monitor device:
- foregoing deletion of the selected stored image file; and
-maintaining display of the enlarged representation of the selected stored image file within the first portion of the graphical user interface.
The invention has been described with reference to the preferred embodiments. However, the scope of the present invention is not limited to the illustrated embodiments, and variations and modifications may be carried out without departing from the scope of the present invention.
Throughout the description, the use of the terms "first," "second," "third," "fourth," "primary," "secondary," "tertiary," etc. does not imply any particular order or importance; such terms are included only to identify individual elements. Further, the designation of a first element does not imply the presence of a second element, and vice versa.
List of reference numerals
2 medical visualization system
4 visualization device
6 handle
7 control button
8 elongated flexible member
9 distal part
10 distal end of an elongate flexible member
12 image sensor
14 device cable
16 device connector
20 monitor device
21 first shell side
22 second shell side
23 third shell side
24 fourth shell side
25 first casing
26 touch sensitive display
27 graphic user interface
31 first part
32 second part
33 third part
34 fourth section
36 actionable items
37 first image orientation
38 second image orientation
40 connecting port
42 actionable menu items
44 inverted view button
46 inverted view mode indicator
50 Battery indicator
60 processing unit
61 Power supply
61a battery
61b electric power connector
62 memory
64 orientation sensor
66 input/output
68 microphone
70 live representation of image data
x1 first direction
x2 second direction
L1 first length
L2 second length

Claims (17)

1. A medical visualization system comprising a visualization apparatus having an image sensor configured to generate image data indicative of a view from the visualization apparatus,
the medical visualization system further includes a monitor device operable to receive the image data while the image sensor is generating the image data, the monitor device including a first housing extending from a first housing side to a second housing side in a first direction and from a third housing side to a fourth housing side in a second direction perpendicular to the first direction, the monitor device including a touch-sensitive display housed in the first housing and having a first length in the first direction and a second length in the second direction, the monitor device displaying a graphical user interface with the touch-sensitive display,
wherein the monitor device:
-opening a program session;
-displaying a live representation of the image data within the first portion of the graphical user interface;
-displaying one or more actionable items within a second portion of the graphical user interface, wherein the one or more actionable items comprise an image capture button, the second portion and the first portion being non-overlapping; and
-displaying a folder icon within a background portion of the graphical user interface, the background portion and the first portion being non-overlapping,
and wherein the monitor device is further adapted to detect a first user input corresponding to selection of the image capture button, and in response to detecting the first user input, the monitor device:
-storing a first image file corresponding to image data received upon detection of the first user input;
-associating the first image file with the program session; and
-displaying a first representation of a still image corresponding to the stored first image file within a background portion of the graphical user interface;
the monitor device displays an animation that transitions the first representation to the folder icon after a predetermined delay after detecting the first user input.
2. The medical visualization system according to claim 1, wherein the monitor device is adapted to establish a connection with the visualization device, including obtaining device identifier information from a device identifier of the visualization device, and wherein the monitor device opens the program session in response to establishing the connection with the visualization device, and the program session corresponds to the device identifier information.
3. The medical visualization system according to claim 2, wherein, in opening the program session, the monitor device determines whether the visualization device has been previously connected to the monitor device based on the device identifier information; and wherein,
in accordance with a determination that the visualization device has been previously connected to the monitor device, the monitor device reopens the program session corresponding to the device identifier information; and
in accordance with a determination that the visualization device has not been previously connected to the monitor device, the monitor device creates a program session corresponding to the device identifier information.
4. The medical visualization system according to any of the preceding claims, wherein the predetermined delay is between 1 and 5 seconds, such as between 1.5 and 3 seconds, such as 1.5 or 2 seconds.
5. The medical visualization system according to any of the preceding claims, wherein a duration of the animation transitioning the first representation to the folder icon is between 100ms and 1500ms, such as between 300ms and 1000ms, such as between 300ms and 600ms, such as 400ms or 500 ms.
6. The medical visualization system according to any of the preceding claims, wherein the folder icon includes a visual representation of a number of stored files stored during the program session, and wherein, in response to detecting the first user input, the monitor device updates the display of the visual representation of the number of stored files stored during the program session, including increasing the number of stored files.
7. The medical visualization system according to any one of the preceding claims, wherein after detecting the first user input, the monitor device is further adapted to detect a second user input corresponding to a selection of the image capture button, and in response to detecting the second user input, the monitor device:
-storing a second image file corresponding to image data received upon detection of the second user input;
-associating the second image file with the program session; and
-displaying a second representation of the still image corresponding to the stored second image file within the background portion of the graphical user interface;
after the predetermined delay after detecting the second user input, the monitor device displays an animation that transitions the second representation to the folder icon.
8. The medical visualization system according to any of the preceding claims, wherein the monitor device is adapted to establish a connection with a second visualization device, comprising obtaining second device identifier information from a second device identifier of the second visualization device, the monitor device opening a second program session corresponding to the second device identifier information in response to establishing the connection with the second visualization device.
9. The medical visualization system of claim 8, wherein the monitor device:
-displaying a live representation of second image data generated by a second image sensor of the second visualization apparatus;
-displaying a second folder icon within a background portion of the graphical user interface;
and wherein the monitor device is adapted to detect a third user input corresponding to selection of the image capture button, and in response to detecting the third user input, the monitor device:
-storing a third image file corresponding to second image data received upon detection of the third user input; and
-associating the third image file with the second program session.
10. The medical visualization system of claim 9, wherein, in response to detecting the third user input, the monitor device displays a third representation of a still image corresponding to the stored third image file within the background portion of the graphical user interface,
after the predetermined delay after detecting the third user input, the monitor device displays an animation that transitions the third representation to the second folder icon.
11. The medical visualization system according to any of the preceding claims, wherein the one or more actionable items include a video capture button displayed with a first appearance, and wherein the monitor device is adapted to detect a fourth user input corresponding to a selection of the video capture button, and in response to detecting the fourth user input, the monitor device changes the appearance of the video capture button to a second appearance,
after detecting the fourth user input, the monitor device is adapted to detect a fifth user input corresponding to selection of the video capture button, and in response to detecting the fifth user input, the monitor device:
-changing the appearance of the video capture button to the first appearance;
-storing a first video data file corresponding to image data received between the detection of the fourth user input and the fifth user input;
-associating the first video data file with the program session; and
-displaying a fourth representation of a frame corresponding to the stored first video data file within the background portion of the graphical user interface,
after the predetermined delay after detecting the fifth user input, the monitor device displays an animation that transitions the fourth representation to the folder icon.
12. The medical visualization system according to any of the preceding claims, wherein the monitor device is adapted to detect a sixth user input corresponding to a selection of the folder icon, and in response to detecting the sixth user input, the monitor device displays, within a background portion of the graphical user interface, a first plurality of representations corresponding to a first plurality of stored image files stored during the program session.
13. The medical visualization system of claim 12, wherein the monitor device is further adapted to detect a seventh user input corresponding to a selection of a primary representation of the first plurality of representations displayed in response to detecting the sixth user input, the primary representation corresponding to a primary stored image file, and in response to detecting the seventh user input, the monitor device:
-displaying a magnified representation of the primary stored image file within the first portion of the graphical user interface; and
-displaying thumbnail representations of a second plurality of the stored image files stored during the program session.
14. The medical visualization system according to any of the claims 12 to 13, wherein in response to detecting the sixth user input, the monitor device displays a session summary icon within a background portion of the graphical user interface,
the monitor device is adapted to detect an eighth user input corresponding to selection of the session summary icon, and in response to detecting the eighth user input, the monitor device:
-displaying, in the first portion of the graphical user interface, a third plurality of representations corresponding to a third plurality of stored image files stored during the program session;
-displaying general information of the program session; and
- displaying a comment field.
15. The medical visualization system according to claim 14, wherein the monitor device is further adapted to detect a ninth user input corresponding to a selection of the comment field, and in response to detecting the ninth user input, the monitor device displays a virtual keyboard in the first portion of the graphical user interface and optionally extends the virtual keyboard into the second portion of the graphical user interface, the virtual keyboard being configured for entering text in the comment field,
the monitor device is further adapted to:
-detecting a sequence of keyboard user inputs corresponding to typing text using the displayed virtual keyboard; and
-detecting a tenth user input indicating acceptance of text entered using the displayed virtual keyboard,
in response to detecting the tenth user input, the monitor device stores the entered text and associates the text as an annotation of the program session.
16. The medical visualization system according to any of the preceding claims, wherein the monitor device displays a first plurality of representations corresponding to a first plurality of stored image files stored during the program session within a background portion of the graphical user interface in response to the visualization device being disconnected from the monitor device.
17. The medical visualization system of claim 16, wherein the monitor device further displays a session summary icon within the background portion of the graphical user interface in response to the visualization device being disconnected from the monitor device.

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DKPA202070111 2020-02-21
DKPA202070111 2020-02-21
PCT/EP2021/053963 WO2021165360A1 (en) 2020-02-21 2021-02-18 Capturing and browsing images in a medical visualisation system

Publications (1)

Publication Number Publication Date
CN115135219A true CN115135219A (en) 2022-09-30

Family

ID=74672322

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180015434.7A Pending CN115135219A (en) 2020-02-21 2021-02-18 Capturing and viewing images in a medical visualization system

Country Status (3)

Country Link
EP (1) EP4106614B1 (en)
CN (1) CN115135219A (en)
WO (1) WO2021165360A1 (en)


Also Published As

Publication number Publication date
EP4106614A1 (en) 2022-12-28
WO2021165360A1 (en) 2021-08-26
EP4106614B1 (en) 2024-06-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination