JP6994466B2 - Method and system for interacting with medical information - Google Patents
Method and system for interacting with medical information
- Publication number
- JP6994466B2 (application JP2018546777A)
- Authority
- JP
- Japan
- Prior art keywords
- medical
- electric field
- reference object
- gesture
- detection unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/25—User interfaces for surgical systems
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/044—Digitisers, e.g. for touch screens or touch pads, characterised by capacitive transducing means
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] using icons
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. for image manipulation such as dragging, rotation, expansion or change of colour
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/1415—Digital output to display device; cooperation and interconnection of the display device with other functional units, with means for detecting differences between the image stored in the host and the images displayed on the displays
- A61B2034/254—User interfaces for surgical systems being adapted depending on the stage of the surgical procedure
- A61B34/35—Surgical robots for telesurgery
- G06F2203/04108—Touchless 2D digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch but is proximate to the digitiser's interaction surface, without distance measurement in the Z direction
- G09G2354/00—Aspects of interface with display user
- G09G2370/022—Centralised management of display operation, e.g. in a server instead of locally
- G09G2380/08—Biomedical applications
- G16H40/63—ICT specially adapted for the management or operation of medical equipment or devices, for local operation
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Robotics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- User Interface Of Digital Computer (AREA)
Description
In one embodiment, the optical sensor comprises a camera.
In one embodiment, the system further comprises a projector for projecting at least one reference icon onto a reference surface imaged by the camera, each of the at least one reference icon corresponding to a respective one of the at least one virtual icon.
In one embodiment, the camera comprises one of a 2D camera, a monochrome camera, a stereo camera, and a time-of-flight camera.
In one embodiment, the body part comprises one of a hand and at least one finger.
In one embodiment, the step of detecting the position and orientation of the reference object is performed using a single sensor adapted to determine the position and orientation of the reference object and to determine the gesture performed by the medical practitioner.
In one embodiment, the detecting step is performed using a camera.
In one embodiment, the method further comprises a step of projecting at least one reference icon onto a reference surface imaged by the camera, each of the at least one reference icon corresponding to a respective one of the at least one virtual icon.
In one embodiment, the camera comprises one of a 2D camera, a monochrome camera, a stereo camera, and a time-of-flight camera.
In one embodiment, the body part comprises one of a hand and at least one finger.
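The embodiments above render the sensed 3D position and orientation of the reference object as a virtual representation inside the GUI. A minimal sketch of that mapping, where the class name, field names, and the sensing-volume dimensions are illustrative assumptions rather than anything specified by the patent:

```python
from dataclasses import dataclass


@dataclass
class VirtualRepresentation:
    """Pose of the reference object's avatar inside the GUI."""
    x: float
    y: float
    depth: float
    angle_deg: float


def to_gui_pose(position_3d, orientation_deg, gui_width, gui_height,
                sensing_volume=(300.0, 300.0, 200.0)):
    """Scale a sensed (x, y, z) in millimetres into GUI coordinates.

    The sensing-volume dimensions are assumptions; a real system would
    calibrate them against the contactless sensing unit in use.
    """
    sx, sy, sz = sensing_volume
    x, y, z = position_3d
    return VirtualRepresentation(
        x=(x / sx) * gui_width,      # left-right position inside the GUI
        y=(y / sy) * gui_height,     # up-down position inside the GUI
        depth=z / sz,                # normalised hover distance
        angle_deg=orientation_deg,   # orientation mirrored as-is
    )


pose = to_gui_pose((150.0, 75.0, 100.0), 30.0, 800, 600)
```

The same normalised pose can drive either the single-sensor (camera) or the two-sensor (electric field plus optical) variants of the claims, since only the source of the coordinates changes.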
Single air tap: causes the selected region and the system state to react.
Swipe left: activates the zoom mode.
Swipe up: activates the pan mode; panning is performed by moving the cursor to the desired location with one finger.
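The gesture vocabulary above (air tap, swipe left for zoom, swipe up for pan) amounts to a dispatch table from recognised gestures to viewer commands. A hedged sketch, with the gesture names and mode strings being illustrative assumptions:

```python
def dispatch_gesture(gesture, state):
    """Map a recognised contactless gesture to a viewer command.

    `state` is a dict holding the current interaction mode; in a real
    system the resulting commands would drive the medical image viewer.
    """
    if gesture == "single_air_tap":
        state["selection_active"] = True   # region and system state react
    elif gesture == "swipe_left":
        state["mode"] = "zoom"             # activate zoom mode
    elif gesture == "swipe_up":
        state["mode"] = "pan"              # pan by moving the cursor
    else:
        raise ValueError(f"unrecognised gesture: {gesture}")
    return state


state = {"mode": "idle", "selection_active": False}
dispatch_gesture("swipe_left", state)
```

Keeping the mapping in one table makes it easy to mirror the commands of an ordinary peripheral device, which is the correspondence claim 10 relies on.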
Claims (14)
- A system enabling medical information corresponding to a contactless gesture performed by a medical practitioner to be displayed on a display unit, the system comprising:
a contactless sensing unit for detecting at least a 3D position of a reference object used by the medical practitioner; and
at least one control unit in communication with the contactless sensing unit for:
determining the contactless gesture using the 3D position of the reference object detected by the contactless sensing unit;
identifying a command for the medical information corresponding to the contactless gesture, and executing the command so as to display the medical information on the display unit;
generating a graphical user interface (GUI) comprising a virtual representation of the reference object, and at least one of a virtual representation of the contactless sensing unit and at least one virtual icon, wherein a position of the virtual representation of the reference object within the GUI is chosen as a function of the 3D position of the reference object detected by the contactless sensing unit, and each of the at least one virtual icon corresponds to one of a respective operation mode, a respective user notification, and a respective system setting option; and
displaying the GUI adjacent to the medical information,
wherein the contactless sensing unit is further adapted to detect an orientation of the reference object, an orientation of the virtual representation of the reference object within the GUI being chosen as a function of the orientation of the reference object detected by the contactless sensing unit,
wherein the contactless sensing unit comprises a single sensor adapted to determine the 3D position and the orientation of the reference object and to determine the contactless gesture performed by the medical practitioner,
wherein the single sensor comprises a camera configured to image a reference surface on which the at least one virtual icon is displayed, and
wherein the system further comprises a projector for projecting the at least one virtual icon onto the reference surface imaged by the camera.
- The system of claim 1, wherein the camera comprises one of a 2D camera, a monochrome camera, a 3D camera, a stereo camera, and a time-of-flight camera.
- A system enabling medical information corresponding to a contactless gesture performed by a medical practitioner to be displayed on a display unit, the system comprising:
a contactless sensing unit comprising an electric field sensor for detecting at least a 3D position of a reference object used by the medical practitioner; and
at least one control unit in communication with the contactless sensing unit for:
determining the contactless gesture using the 3D position of the reference object detected by the contactless sensing unit;
identifying a command for the medical information corresponding to the contactless gesture, and executing the command so as to display the medical information on the display unit;
generating a graphical user interface (GUI) comprising a virtual representation of the reference object, and at least one of a virtual representation of the contactless sensing unit and at least one virtual icon, wherein a position of the virtual representation of the reference object within the GUI is chosen as a function of the 3D position of the reference object detected by the contactless sensing unit, and each of the at least one virtual icon corresponds to one of a respective operation mode, a respective user notification, and a respective system setting option; and
displaying the GUI adjacent to the medical information,
wherein the contactless sensing unit is further adapted to detect an orientation of the reference object, an orientation of the virtual representation of the reference object within the GUI being chosen as a function of the orientation of the reference object detected by the contactless sensing unit,
wherein the contactless sensing unit comprises a first sensor for determining the 3D position of the reference object and a second sensor for determining the orientation of the reference object, the contactless gesture being determined by one of the first sensor and the second sensor, and
wherein the first sensor comprises the electric field sensor and the second sensor comprises an optical sensor for determining the orientation of the reference object.
- The system of claim 3, wherein the optical sensor comprises a camera, the camera comprising one of a 2D camera, a monochrome camera, a 3D camera, a stereo camera, and a time-of-flight camera.
- The system of claim 4, wherein the camera is positioned so as to image a region located above the electric field sensor.
- The system of claim 5, further comprising a projector for projecting at least one reference icon onto or around the electric field sensor, each of the at least one reference icon corresponding to a respective one of the at least one virtual icon.
- The system of claim 5, further comprising a screen on which at least one reference icon is displayed, each of the at least one reference icon corresponding to a respective one of the at least one virtual icon, the electric field sensor being positioned on the screen.
- The system of any one of claims 1 to 7, wherein the reference object comprises a body part of the medical practitioner.
- The system of any one of claims 1 to 7, wherein the reference object is made of one of a conductive material and a semiconductor material.
- The system of any one of claims 1 to 9, wherein the command corresponds to a given known command of a peripheral device connectable to a computer machine.
- The system of any one of claims 1 to 10, wherein the medical information comprises medical images, 3D models, and any combination or sequence thereof.
- The system of any one of claims 1 to 11, wherein the command for the medical information comprises a command causing a modification of at least one characteristic of an already displayed medical image.
- The system of any one of claims 1 to 12, wherein the at least one control unit is adapted to change an appearance of one of the at least one virtual icon in response to a given selection made by the medical practitioner.
- A computer-implemented method enabling medical information corresponding to a contactless gesture performed by a medical practitioner to be displayed on a display unit, the method comprising:
detecting, by a contactless sensing unit, a 3D position of a reference object used by the medical practitioner;
determining the contactless gesture using the detected 3D position of the reference object;
identifying a command for the medical information corresponding to the contactless gesture, and executing the command so as to display the medical information on the display unit;
generating a graphical user interface (GUI) comprising a virtual representation of the reference object, and at least one of a virtual representation of the contactless sensing unit and at least one virtual icon, wherein a position of the virtual representation of the reference object within the GUI is chosen as a function of the detected 3D position of the reference object, and each of the at least one virtual icon corresponds to one of a respective operation mode, a respective user notification, and a respective system setting option; and
displaying the GUI along with the medical information on the display unit,
wherein the contactless sensing unit is further adapted to detect an orientation of the reference object, an orientation of the virtual representation of the reference object within the GUI being chosen as a function of the orientation of the reference object detected by the contactless sensing unit,
wherein the contactless sensing unit comprises a single sensor adapted to determine the 3D position and the orientation of the reference object and to determine the contactless gesture performed by the medical practitioner,
wherein the single sensor comprises a camera configured to image a reference surface on which the at least one virtual icon is displayed, and
wherein the computer-implemented method further comprises projecting the at least one virtual icon onto the reference surface imaged by the camera.
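The claimed method chains four steps: detect the reference object's 3D position, determine the gesture from it, identify and execute the corresponding command, and display the result. An end-to-end sketch under stated assumptions: the trajectory samples, the toy gesture classifier, and the command table are all stand-ins, not the patent's actual recognition algorithm:

```python
COMMANDS = {
    # Illustrative mapping; claim 10 notes that commands may mirror those
    # of an ordinary peripheral device connectable to a computer machine.
    "swipe_left": "zoom_in",
    "swipe_up": "pan",
    "single_air_tap": "select",
}


def determine_gesture(positions, tap_threshold=20.0):
    """Classify a short trajectory of (x, y, z) samples into a gesture.

    Toy classifier: a dip in z beyond the threshold is an air tap;
    otherwise dominant horizontal motion is a left swipe and dominant
    vertical motion is an up swipe.
    """
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    dz = min(p[2] for p in positions) - positions[0][2]
    if dz < -tap_threshold:
        return "single_air_tap"
    return "swipe_left" if abs(dx) >= abs(dy) else "swipe_up"


def run_pipeline(positions):
    """Detect -> determine gesture -> identify command (display elided)."""
    gesture = determine_gesture(positions)
    command = COMMANDS[gesture]
    return gesture, command


gesture, command = run_pipeline([(0, 0, 100), (-40, 5, 100), (-90, 8, 100)])
```

The displaying step is elided here; in the claimed system it would render the GUI with the reference object's virtual representation alongside the medical information.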
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562260428P | 2015-11-27 | 2015-11-27 | |
US62/260,428 | 2015-11-27 | ||
PCT/IB2016/056228 WO2017089910A1 (en) | 2015-11-27 | 2016-10-17 | Method and system for interacting with medical information |
Publications (3)
Publication Number | Publication Date |
---|---|
JP2019501747A JP2019501747A (ja) | 2019-01-24 |
JP2019501747A5 JP2019501747A5 (ja) | 2019-12-05 |
JP6994466B2 (ja) | 2022-01-14 |
Family
ID=58764004
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
- JP2018546777A Active JP6994466B2 (ja) Method and system for interacting with medical information
Country Status (5)
Country | Link |
---|---|
US (2) | US11256334B2 (ja) |
EP (1) | EP3380031A4 (ja) |
JP (1) | JP6994466B2 (ja) |
CA (1) | CA3002918C (ja) |
WO (1) | WO2017089910A1 (ja) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018020661A1 (ja) * | 2016-07-29 | 2018-02-01 | Mitsubishi Electric Corporation | Display device, display control device, and display control method |
JP7267209B2 (ja) * | 2017-06-08 | 2023-05-01 | Medos International Sarl | User interface systems for sterile fields and other working environments |
US10675094B2 (en) * | 2017-07-21 | 2020-06-09 | Globus Medical Inc. | Robot surgical platform |
DE102018130640A1 (de) * | 2017-12-18 | 2019-06-19 | Löwenstein Medical Technology S.A. Luxembourg | Virtual operation |
JP2021531120A (ja) * | 2018-07-24 | 2021-11-18 | Sony Group Corporation | Video routing in the operating room |
US20220104694A1 (en) * | 2020-10-02 | 2022-04-07 | Ethicon Llc | Control of a display outside the sterile field from a device within the sterile field |
US11830602B2 (en) | 2020-10-02 | 2023-11-28 | Cilag Gmbh International | Surgical hub having variable interconnectivity capabilities |
US11992372B2 (en) | 2020-10-02 | 2024-05-28 | Cilag Gmbh International | Cooperative surgical displays |
US11963683B2 (en) | 2020-10-02 | 2024-04-23 | Cilag Gmbh International | Method for operating tiered operation modes in a surgical system |
US11883022B2 (en) | 2020-10-02 | 2024-01-30 | Cilag Gmbh International | Shared situational awareness of the device actuator activity to prioritize certain aspects of displayed information |
US11748924B2 (en) | 2020-10-02 | 2023-09-05 | Cilag Gmbh International | Tiered system display control based on capacity and user operation |
US11877897B2 (en) | 2020-10-02 | 2024-01-23 | Cilag Gmbh International | Situational awareness of instruments location and individualization of users to control displays |
US11672534B2 (en) | 2020-10-02 | 2023-06-13 | Cilag Gmbh International | Communication capability of a smart stapler |
US20240016364A1 (en) * | 2020-11-30 | 2024-01-18 | Sony Group Corporation | Surgery system, surgery control device, control method, and program |
US20230023635A1 (en) * | 2021-07-22 | 2023-01-26 | Cilag Gmbh International | Hub identification and tracking of objects and personnel within the or to overlay data that is custom to the user's need |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140085185A1 (en) | 2011-03-24 | 2014-03-27 | Beth Israel Deaconess Medical Center | Medical image viewing and manipulation contactless gesture-responsive system and method |
US20150084866A1 (en) | 2012-06-30 | 2015-03-26 | Fred Thomas | Virtual hand based on combined data |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7370983B2 (en) * | 2000-03-02 | 2008-05-13 | Donnelly Corporation | Interior mirror assembly with display |
US8411034B2 (en) * | 2009-03-12 | 2013-04-02 | Marc Boillot | Sterile networked interface for medical systems |
US8681098B2 (en) | 2008-04-24 | 2014-03-25 | Oblong Industries, Inc. | Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes |
US8578282B2 (en) * | 2006-03-15 | 2013-11-05 | Navisense | Visual toolkit for a virtual user interface |
US8390438B2 (en) * | 2008-09-24 | 2013-03-05 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Robotic catheter system including haptic feedback |
JP5698733B2 (ja) * | 2009-05-04 | 2015-04-08 | Oblong Industries, Inc. | Detecting, representing, and interpreting three-space input: gestural continuum subsuming freespace, proximal, and surface-contact modes |
US9439736B2 (en) * | 2009-07-22 | 2016-09-13 | St. Jude Medical, Atrial Fibrillation Division, Inc. | System and method for controlling a remote medical device guidance system in three-dimensions using gestures |
US9274699B2 (en) * | 2009-09-03 | 2016-03-01 | Obscura Digital | User interface for a large scale multi-user, multi-touch system |
US8878779B2 (en) * | 2009-09-21 | 2014-11-04 | Extreme Reality Ltd. | Methods circuits device systems and associated computer executable code for facilitating interfacing with a computing platform display screen |
US9146674B2 (en) * | 2010-11-23 | 2015-09-29 | Sectra Ab | GUI controls with movable touch-control objects for alternate interactions |
JP6110404B2 (ja) * | 2011-12-23 | 2017-04-05 | Koninklijke Philips N.V. | Method and apparatus for interactive display of three-dimensional ultrasound images |
US20160103500A1 (en) * | 2013-05-21 | 2016-04-14 | Stanley Innovation, Inc. | System and method for a human machine interface utilizing near-field quasi-state electrical field sensing technology |
US9020194B2 (en) * | 2013-06-14 | 2015-04-28 | Qualcomm Incorporated | Systems and methods for performing a device action based on a detected gesture |
WO2015047595A1 (en) * | 2013-09-27 | 2015-04-02 | Smiths Medical Asd, Inc. | Infusion pump with touchless user interface and related methods |
US20150253860A1 (en) | 2014-03-07 | 2015-09-10 | Fresenius Medical Care Holdings, Inc. | E-field sensing of non-contact gesture input for controlling a medical device |
WO2015188011A1 (en) * | 2014-06-04 | 2015-12-10 | Quantum Interface, Llc. | Dynamic environment for object and attribute display and interaction |
US9839490B2 (en) * | 2015-01-28 | 2017-12-12 | Terarecon, Inc. | Touchless advanced image processing and visualization |
US10755128B2 (en) * | 2018-12-18 | 2020-08-25 | Slyce Acquisition Inc. | Scene and user-input context aided visual search |
2016
- 2016-10-17 US US15/775,639 patent/US11256334B2/en active Active
- 2016-10-17 WO PCT/IB2016/056228 patent/WO2017089910A1/en active Application Filing
- 2016-10-17 JP JP2018546777A patent/JP6994466B2/ja active Active
- 2016-10-17 EP EP16868102.1A patent/EP3380031A4/en active Pending
- 2016-10-17 CA CA3002918A patent/CA3002918C/en active Active
2022
- 2022-01-20 US US17/580,356 patent/US11662830B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140085185A1 (en) | 2011-03-24 | 2014-03-27 | Beth Israel Deaconess Medical Center | Medical image viewing and manipulation contactless gesture-responsive system and method |
US20150084866A1 (en) | 2012-06-30 | 2015-03-26 | Fred Thomas | Virtual hand based on combined data |
Non-Patent Citations (1)
Title |
---|
TAN, Justin H. et al., "Informatics in Radiology: Developing a Touchless User Interface for Intraoperative Image Control during Interventional Radiology Procedures", RadioGraphics, April 2013, Vol. 33, No. 2, pages E61-E70 |
Also Published As
Publication number | Publication date |
---|---|
US11256334B2 (en) | 2022-02-22 |
US20220147150A1 (en) | 2022-05-12 |
WO2017089910A1 (en) | 2017-06-01 |
EP3380031A4 (en) | 2018-11-21 |
US20180329504A1 (en) | 2018-11-15 |
CA3002918C (en) | 2019-01-08 |
JP2019501747A (ja) | 2019-01-24 |
EP3380031A1 (en) | 2018-10-03 |
US11662830B2 (en) | 2023-05-30 |
CA3002918A1 (en) | 2017-06-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6994466B2 (ja) | Method and system for interacting with medical information | |
US10403402B2 (en) | Methods and systems for accessing and manipulating images comprising medically relevant information with 3D gestures | |
CA2960725C (en) | Surgical system user interface using cooperatively-controlled robot | |
US8036917B2 (en) | Methods and systems for creation of hanging protocols using eye tracking and voice command and control | |
US20100013764A1 (en) | Devices for Controlling Computers and Devices | |
US7694240B2 (en) | Methods and systems for creation of hanging protocols using graffiti-enabled devices | |
DK2649409T3 (en) | SYSTEM WITH INTEGRATION OF 3D USER INTERFACE | |
Hatscher et al. | GazeTap: towards hands-free interaction in the operating room | |
US20140085185A1 (en) | Medical image viewing and manipulation contactless gesture-responsive system and method | |
US20160180046A1 (en) | Device for intermediate-free centralised control of remote medical apparatuses, with or without contact | |
EP3454177B1 (en) | Method and system for efficient gesture control of equipment | |
US20160004315A1 (en) | System and method of touch-free operation of a picture archiving and communication system | |
JP6488153B2 (ja) | Cursor control method, cursor control program, scroll control method, scroll control program, cursor display system, and medical device | |
USRE48221E1 (en) | System with 3D user interface integration | |
Manolova | System for touchless interaction with medical images in surgery using Leap Motion | |
KR20160023015A (ko) | Method for providing medical images | |
Hurka et al. | Method, accuracy and limitation of computer interaction in the operating room by a navigated surgical instrument | |
De Paolis | A touchless gestural platform for the interaction with the patients data | |
US20160004318A1 (en) | System and method of touch-free operation of a picture archiving and communication system | |
Stuij | Usability evaluation of the kinect in aiding surgeon computer interaction | |
O’Hara et al. | Interactions for Clinicians | |
De Paolis et al. | An Advanced Modality of Visualization and Interaction with Virtual Models of the Human Body | |
De Paolis et al. | A New Interaction Modality for the Visualization of 3D Models of Human Organ |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A529 | Written submission of copy of amendment under article 34 pct |
Free format text: JAPANESE INTERMEDIATE CODE: A529 Effective date: 20180724 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20191017 |
|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20191017 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20191023 |
|
A977 | Report on retrieval |
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20200925 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20200929 |
|
A601 | Written request for extension of time |
Free format text: JAPANESE INTERMEDIATE CODE: A601 Effective date: 20201211 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20210322 |
|
A02 | Decision of refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A02 Effective date: 20210525 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20210917 |
|
C60 | Trial request (containing other claim documents, opposition documents) |
Free format text: JAPANESE INTERMEDIATE CODE: C60 Effective date: 20210917 |
|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A821 Effective date: 20210917 |
|
A911 | Transfer to examiner for re-examination before appeal (zenchi) |
Free format text: JAPANESE INTERMEDIATE CODE: A911 Effective date: 20211021 |
|
C21 | Notice of transfer of a case for reconsideration by examiners before appeal proceedings |
Free format text: JAPANESE INTERMEDIATE CODE: C21 Effective date: 20211026 |
|
TRDD | Decision of grant or rejection written | ||
A01 | Written decision to grant a patent or to grant a registration (utility model) |
Free format text: JAPANESE INTERMEDIATE CODE: A01 Effective date: 20211116 |
|
A61 | First payment of annual fees (during grant procedure) |
Free format text: JAPANESE INTERMEDIATE CODE: A61 Effective date: 20211213 |
|
R150 | Certificate of patent or registration of utility model |
Ref document number: 6994466 Country of ref document: JP Free format text: JAPANESE INTERMEDIATE CODE: R150 |