WO2016158000A1 - Information Processing Apparatus, Information Processing Method, and Information Processing System - Google Patents
- Publication number
- WO2016158000A1 (application PCT/JP2016/053136, JP 2016053136 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- pointer
- information processing
- information
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/08—Cursor circuits
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00048—Constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/40—Support for services or applications
- H04L65/401—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
- H04L65/4015—Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/14—Solving problems related to the presentation of information to be displayed
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and an information processing system.
- HMD (head-mounted display)
- The HMD is a display device worn on the user's head. In recent years, in addition to serving as a display for AV devices and computer games, it has come into use as a display device that allows a user to check information while working in a work environment.
- an HMD is used as a display device for projecting an endoscope image.
- the surgeon wears the HMD and performs an operation while viewing the video displayed on the HMD.
- Conventionally, it has been common for an endoscope image to be displayed on a monitor installed near the operator, so the operator had to move his or her line of sight frequently between the monitor and the affected area.
- the surgeon can confirm the endoscope image and the affected area displayed on the display unit of the HMD without greatly moving the line of sight.
- The present disclosure therefore proposes a new and improved information processing apparatus, information processing method, and information processing system capable of facilitating mutual communication when an image is shared by a plurality of users during surgery.
- According to the present disclosure, an information processing apparatus is provided that includes a processing unit that changes, based on a user attribute of the user wearing a head-mounted display, a pointer attribute of a pointer displayed at a user-designated position in the display area of the head-mounted display.
- Also provided is an information processing method that includes acquiring a user attribute of the user wearing a head-mounted display, and changing, based on the user attribute, a pointer attribute of a pointer displayed at a user-designated position in the display area of the head-mounted display.
- Further provided is an information processing system comprising a head-mounted display and a processing unit that changes, based on a user attribute of the user wearing the head-mounted display, a pointer attribute of a pointer displayed at a user-designated position in the display area of the head-mounted display.
- FIG. 2 is a functional block diagram illustrating the functional configuration of the HMD and the processor unit that constitute an information processing system according to an embodiment of the present disclosure.
- FIG. 3 is a flowchart showing the pointer display process according to the same embodiment. The remaining drawings include an explanatory diagram showing another example of the display format of a pointer, an explanatory diagram showing an example in which additional information is displayed near the pointer, a flowchart showing the pointer operation process according to the same embodiment, an explanatory diagram explaining that pointer operation process, and an explanatory diagram showing a display example of the HMD in the case where a paint function is provided.
- FIG. 1 is an explanatory diagram showing a state in which endoscopic surgery is performed by a plurality of users wearing the HMD 100.
- an image 300 of the affected area captured by the endoscopic device is displayed on the display unit of the HMD 100.
- the patient is operated while visually recognizing the displayed image 300.
- Each surgeon, that is, each of the users P1 to P3 of the HMD 100, shares and observes the image 300. As shown in FIG. 1, by displaying pointers 311, 312, and 313 that can each be operated by the corresponding user, it becomes possible to point to a specific part of the image 300.
- the pointers 311, 312, and 313 are operated by a non-contact operation technique such as a corresponding user's line of sight or gesture, for example.
- Since the HMD 100 can be operated in a non-contact manner, such operation is unlikely to hinder the work; an operation input using the line of sight or a gesture is therefore effective.
- the information processing apparatus changes the pointer attributes of the corresponding pointers 311, 312, and 313 according to the user attributes of the users P1 to P3.
- the user attribute is user identification information for identifying a user who uses the HMD, and includes information such as a user ID, affiliation, job type, and class.
- the pointer attribute is information representing specifications related to pointer display such as shape and color.
- the colors of the pointers 311, 312, and 313 displayed in the image 300 are made different according to the user attribute.
- the users P1 to P3 can clearly recognize the pointer that can be operated by the user, and can also clearly recognize which user is operating the other pointer.
- additional information that can specify the operator of the pointer such as a user name may be displayed in the vicinity of the displayed pointer.
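The mapping from user attributes to pointer attributes described above can be sketched as a simple lookup. This is a hypothetical illustration: the field names, job types, and colors are assumptions for the sketch, not values from the disclosure.

```python
# Hypothetical sketch: deriving pointer attributes (color, shape, label)
# from user attributes. Field names and values are illustrative only.

def pointer_attributes(user_attrs: dict) -> dict:
    """Look up display attributes for a user's pointer from user attributes."""
    colors = {"surgeon": "red", "assistant": "green", "nurse": "blue"}
    return {
        "color": colors.get(user_attrs.get("job_type"), "white"),
        "shape": "arrow",
        # Label shown near the pointer so others can identify the operator.
        "label": user_attrs.get("name", user_attrs.get("user_id", "")),
    }

attrs = pointer_attributes({"user_id": "P1", "name": "Dr. A", "job_type": "surgeon"})
print(attrs["color"], attrs["label"])  # red Dr. A
```

A real implementation would draw these values from the setting storage described later rather than from an in-code table.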
- the information processing apparatus may be able to operate the pointers of other users in accordance with the authority information included in the user attributes.
- the authority information is information indicating whether or not other users can operate the pointer.
- For example, a user may be allowed to operate the pointer of a user who has lower authority than himself or herself.
- a user with higher authority can operate a pointer of a user with lower authority to directly instruct or instruct a user with lower authority.
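The authority check just described could be sketched as follows. The three-level ordering mirrors the upper/middle/lower levels mentioned later in the disclosure; the function itself is an illustrative assumption.

```python
# Hypothetical sketch: authority-based pointer operation. The three
# authority levels and their ordering are illustrative assumptions.

LEVELS = {"lower": 0, "middle": 1, "upper": 2}

def can_operate(operator_level: str, owner_level: str) -> bool:
    """Another user's pointer may be operated only with strictly higher authority."""
    return LEVELS[operator_level] > LEVELS[owner_level]

print(can_operate("upper", "lower"))  # True
print(can_operate("lower", "upper"))  # False
```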
- the HMD 100 is a display device that displays information such as an input video from an external device such as an endoscope device.
- the HMD 100 is, for example, a goggle-shaped non-transparent HMD, and is used in a state where the user wears the head.
- The HMD 100 includes a main body provided with a display unit that presents information to the wearer, and an upper fixing part and a rear fixing part for fixing the main body to the head. When the HMD 100 is fixed to the wearer's head by these fixing parts, the display unit of the main body is positioned in front of the wearer's left and right eyes.
- the body part is a part covering both eyes of the wearer.
- the main body may be configured to cover, for example, the vicinity of the left and right temples of the wearer.
- the front of the wearer's eyes can be almost completely covered at the time of wearing, and external images are not incident on the wearer's eyes, making it easier to view the image.
- The main body is provided with a first display element (reference numeral 132 in FIG. 2) that presents the left-eye image on the first display unit, and a second display element (reference numeral 134 in FIG. 2) that presents the right-eye image on the second display unit.
- Each display element presents, for example, an image of an endoscope apparatus provided by the processor unit 200, an image captured by a camera of the main body, or the like.
- the main body is provided with a cable (not shown) connected to the processor unit 200 in order to transmit and receive information to and from the processor unit 200.
- Information communication between the HMD 100 and the processor unit 200 may be wired communication or wireless communication.
- the information displayed on the display unit of the HMD 100 can be operated from, for example, a remote controller 102 such as a foot switch on which the wearer performs a stepping operation with his / her foot, a line of sight of the wearer of the HMD 100, a gesture, or the like.
- Input information from the remote controller 102 and input information such as the wearer's gaze direction acquired by the gaze detection function and the wearer's gesture acquired by the gesture detection function are output to the processor unit 200.
- the HMD 100 includes a display port 110, an image generation unit 120, display elements 132 and 134, and a sensor unit 140.
- the display port 110 is an interface that receives input information from the processor unit 200.
- a cable that enables information communication with the processor unit 200 is connected to the display port 110.
- image signals to be output to the display elements 132 and 134 and information visually recognized by the wearer of the HMD 100 are input to the display port 110, for example.
- Information input from the display port 110 is output to the image generation unit 120.
- the image generation unit 120 generates image signals to be output to the display elements 132 and 134 based on the information acquired via the processor unit 200, respectively.
- The image generation unit 120 performs a shift process that causes a shift between the left-eye image signal output to the first display element 132 and the right-eye image signal output to the second display element 134.
- The shift amount between the left-eye image signal and the right-eye image signal is determined according to, for example, the distance between the display elements 132 and 134 and the wearer's eyes, the interval between the wearer's eyes, and the virtual image position.
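As a rough illustration of how such a shift amount might be derived, a simple pinhole-style model relates the horizontal disparity to the interpupillary distance and the virtual image distance. All parameter names and numbers below are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical sketch of the left/right image shift. In a simple pinhole
# model the disparity grows with interpupillary distance (IPD) and shrinks
# as the virtual image is placed farther away. All numbers are assumed.

def pixel_shift(ipd_mm: float, virtual_image_mm: float,
                focal_mm: float, pixel_pitch_mm: float) -> float:
    """Horizontal shift (in pixels) between the left-eye and right-eye
    image signals to place the virtual image at the given depth."""
    disparity_mm = focal_mm * ipd_mm / virtual_image_mm
    return disparity_mm / pixel_pitch_mm

# e.g. 64 mm IPD, virtual image at 2 m, 20 mm focal length, 0.01 mm pixels
print(round(pixel_shift(64.0, 2000.0, 20.0, 0.01), 1))  # 64.0
```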
- the image generation unit 120 outputs the generated image signal to the first display element 132 and the second display element 134.
- the image generation unit 120 generates a pointer image signal based on a detection result of a line-of-sight sensor 142 or a gesture sensor 146 described later.
- the pointer image signal is a signal for displaying a pointer indicating a designated position designated by the user in the display area.
- the image generation unit 120 also generates pointer image signals of other users.
- the image generation unit 120 superimposes the pointer image signal on the image signal generated based on the information from the processor unit 200 and outputs the image signal to the display elements 132 and 134.
- the display elements 132 and 134 emit image light toward the display unit based on the image signal input from the image generation unit 120.
- the display elements 132 and 134 are arranged to face the display unit in the front-rear direction of the wearer's face when the HMD 100 is worn, for example. Thereby, the optical axis of the image light emitted from the display elements 132 and 134 becomes substantially parallel to the direction of the line of sight when the wearer faces the front.
- the display elements 132 and 134 are composed of, for example, organic EL (Electroluminescence) elements.
- By using organic EL elements as the display elements 132 and 134, it is possible to realize compactness, high contrast, quick response, and the like.
- the display elements 132 and 134 have a configuration in which, for example, a plurality of red organic EL elements, green organic EL elements, blue organic EL elements, and the like are arranged in a matrix. Each of these elements is driven by an active matrix type or passive matrix type drive circuit to emit light at a predetermined timing, brightness, or the like.
- By controlling the drive circuit based on the image signal generated by the image generation unit 120, a predetermined image is displayed as a whole on the display elements 132 and 134 and provided to the wearer via the display unit.
- a plurality of eyepieces may be disposed as an optical system between the display elements 132 and 134 and the display unit.
- these eyepieces and the wearer's eyes face each other at a predetermined distance, it becomes possible for the wearer to observe a virtual image that appears to display an image at a predetermined position (virtual image position).
- a 3D image can be provided.
- the virtual image position and the size of the virtual image are set by the configuration of the display elements 132 and 134 and the optical system.
- the sensor unit 140 includes various sensors that acquire various types of information in the HMD 100.
- the sensors provided in the HMD 100 include a line-of-sight sensor 142, an iris sensor 144, a gesture sensor 146, and the like.
- the line-of-sight sensor 142 detects the line-of-sight direction of the user wearing the HMD 100.
- the line-of-sight sensor 142 is provided on the side of the main body of the HMD 100 that faces the eye of the wearer.
- the line-of-sight sensor 142 includes, for example, a light source that irradiates the wearer's eyeball with light in the infrared band (infrared light) and an imaging unit that photographs the wearer's eyeball.
- the line-of-sight sensor 142 irradiates light from the light source to the eyeball of the wearer observing the display surface of the display unit, and images the eyeball irradiated with light by the imaging unit.
- a captured image captured by the imaging unit of the line-of-sight sensor 142 is output to the processor unit 200.
- the iris sensor 144 detects the iris of the eyeball of the user wearing the HMD 100.
- The iris sensor 144 includes an image sensor such as a CMOS or CCD sensor, and a processing unit that obtains an iris image from the captured eyeball image, performs polar coordinate conversion and feature extraction, and calculates iris authentication information for performing iris authentication.
- the iris authentication information acquired by the iris sensor 144 is used to acquire user identification information representing the user attribute of the user wearing the HMD 100.
- the iris sensor 144 may automatically operate when it is detected that the user has worn the HMD 100, for example.
- the gesture sensor 146 detects a gesture performed by a user wearing the HMD 100.
- the gesture sensor 146 is provided, for example, in the main body of the HMD 100 so as to detect the outside world.
- the gesture sensor 146 is a 3D motion sensor, for example.
- The gesture sensor 146 acquires three-dimensional information representing the position of a detected object, with the sensor as a reference, and outputs it to the processor unit 200.
- the processor unit 200 is a control device that controls devices connected to the processor unit 200.
- the processor unit 200 includes, for example, one or a plurality of HMDs 100 mounted on the users P1 to P3 shown in FIG. 1, an external device such as an endoscope apparatus, a display for an unspecified user to view information, and the like. Is connected.
- the processor unit 200 processes information input from an external device into information that can be displayed on the display unit or display of the HMD 100, and outputs the information to each display device. Further, the processor unit 200 switches information displayed on the display unit of the HMD 100 based on operation inputs from the remote controller, the line-of-sight detection function, and the gesture detection function of each HMD 100.
- The processor unit 200 includes an image input unit 211, a detection information input unit 212, a detection information processing unit 213, an image processing unit 214, a display control unit 215, an output unit 216, an operation input unit 217, and a setting storage unit 218.
- the image input unit 211 is an interface that receives an image input to the processor unit 200 from an external device.
- the endoscope apparatus 10 is shown as an external device.
- An image of the affected part captured by a camera (not shown) of the endoscope apparatus 10 is input to the image input unit 211.
- the image input unit 211 outputs the input image to the image processing unit 214.
- In addition to the image of the affected part acquired by the endoscope apparatus 10, a medical image such as a microscope image acquired by a microscope or an ultrasonic image acquired by an ultrasonic examination apparatus may be input to the image input unit 211.
- These medical images may be used as the same image shared among a plurality of users wearing the HMD 100.
- vital sign information such as blood pressure, body temperature, pulse rate, and respiratory rate may be input to the image input unit 211 from a biological information monitor.
- the detection information input unit 212 is an interface through which detection information is input from the sensor unit 140 of the HMD 100 connected to the processor unit 200.
- the detection information input to the detection information input unit 212 is output to the detection information processing unit 213.
- The detection information processing unit 213 performs arithmetic processing for acquiring user identification information based on detection information input from the HMD 100 connected to the processor unit 200. First, the detection information processing unit 213 performs iris authentication processing based on the iris authentication information acquired by the iris sensor 144. In the iris authentication process, the detection information processing unit 213 compares the iris master information of each user, stored in advance, with the iris authentication information to identify the user. Once the user is authenticated, user identification information can be acquired as information representing user attributes, based on user-specific information (for example, a user ID) associated with the iris master information. Note that the iris master DB storing the iris master information of each user may be held in the processor unit 200 or may reside on a server reachable via a network.
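The comparison step above can be sketched as follows. This is a hypothetical illustration, not the disclosed implementation: iris recognition systems of the Daugman style compare binary feature codes by normalized Hamming distance, and the code length, database layout, and threshold here are all assumptions.

```python
# Hypothetical sketch of the iris-matching step. Binary iris codes are
# compared by normalized Hamming distance; the code length, database
# layout, and acceptance threshold are illustrative assumptions.

def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length iris codes."""
    assert len(code_a) == len(code_b)
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)

def identify_user(auth_code, iris_master_db, threshold=0.32):
    """Return the user ID of the closest master code within threshold, else None."""
    best_id, best_d = None, 1.0
    for user_id, master_code in iris_master_db.items():
        d = hamming_distance(auth_code, master_code)
        if d < best_d:
            best_id, best_d = user_id, d
    return best_id if best_d <= threshold else None

iris_master_db = {"P1": [0, 1, 1, 0] * 8, "P2": [1, 0, 0, 1] * 8}
print(identify_user([0, 1, 1, 0] * 8, iris_master_db))  # P1
```

A query that matches no stored master closely enough yields None, i.e., the wearer remains unidentified.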
- the user identification information of each user may be stored in the setting storage unit 218 or may be on a server that can be connected via a network.
- The detection information processing unit 213 outputs the identified user ID, or the user identification information of each user acquired from the user ID, to the image processing unit 214.
- the image processing unit 214 processes the image input to the processor unit 200 into an image to be displayed on the HMD 100.
- The image processing unit 214 generates, for example, a left-eye image to be displayed on the first display unit of the HMD 100 and a right-eye image to be displayed on the second display unit from an image captured by the camera of the endoscope apparatus 10.
- the image processed by the image processing unit 214 is output to the display control unit 215. Further, the image processing unit 214 acquires pointer display format information for each user based on the user identification information, generates pointer images, and outputs the pointer images to the display control unit 215.
- the display control unit 215 controls information to be displayed on the display unit of the HMD 100.
- the display control unit 215 controls to display the instructed information based on the display switching instruction obtained from the detection information of the remote controller 102 or the sensor unit 140. For example, based on the display switching instruction, the display control unit 215 determines which image of displayable information such as an image of the endoscope apparatus 10 or a video see-through image is to be displayed.
- the display control unit 215 outputs the information to each HMD 100 via the output unit 216.
- the operation input unit 217 is an input unit that receives an operation input from the wearer of the HMD 100.
- information displayed on the display unit of the HMD 100 can be switched by, for example, the remote controller 102.
- An operation input to the remote controller 102 is output to the operation input unit 217, and the operation input unit 217 outputs this operation input information to the display control unit 215.
- the display control unit 215 outputs the instructed information to the HMD 100 via the output unit 216 based on the display switching instruction from the remote controller 102.
- the setting storage unit 218 is a storage unit that stores display setting information of the HMD 100 corresponding to each user identification information.
- the display setting information stored in the setting storage unit 218 includes, for example, various setting information such as image quality, image orientation, and image arrangement.
- the image quality setting information is information representing setting values such as image brightness and hue.
- the image orientation information is information representing the display orientation of the image displayed on the display unit.
- the image display direction represents a change with respect to the display state of the reference image.
- the setting storage unit 218 stores user identification information of each user.
- the user identification information is information in which the user ID is associated with information such as the user's affiliation, job type, and class.
- the user identification information includes authority information indicating whether or not other users can operate the pointer.
- the authority information is set according to the job type, class, role, etc. of the user.
- Further, the setting storage unit 218 stores, as pointer attributes, pointer display format information such as color and shape, a display user name shown together with the pointer, and the like, associated with at least one piece of information included in the user identification information. Accordingly, the image processing unit 214 can generate a pointer image for each user by referring to the pointer display format information.
- FIG. 3 is a flowchart showing pointer display processing according to the present embodiment.
- FIG. 4 is an explanatory diagram showing another example of a pointer display format.
- FIG. 5 is an explanatory diagram showing an example in which additional information is displayed near the pointer.
- When the user first wears the HMD 100, the user is authenticated (S100).
- the user authentication is performed by the detection information processing unit 213 of the processor unit 200 based on the iris authentication information acquired based on the detection result of the iris sensor 144 of the HMD 100.
- the detection information processing unit 213 compares the iris master information and iris authentication information stored in advance for each user, identifies the user, and acquires the identified user ID of the user.
- When the user is specified, the image processing unit 214 generates a pointer image for the user based on the user identification information indicating the user attribute (S110).
- The image processing unit 214 refers to the setting storage unit 218, acquires the pointer display format information corresponding to the input user ID, and generates the user's pointer image.
- the image processing unit 214 specifies the pointer display format information corresponding to the user attribute of the specified user, and generates a pointer image.
- the user identification information for determining the pointer display format information is set in advance.
- the pointer display format information may be determined based on authority information included in the user identification information.
- the authority information is divided into, for example, upper, middle, and lower levels, and the pointer image of a user having higher authority is created based on the pointer display format information set for users with higher authority.
- since there may be no authority information, or no hierarchical relationship between users, the pointer display format information may be determined based on one or more items of the user identification information rather than the authority information alone.
- in this case, settings can be made so that the same pointer image is not displayed for different users.
- the pointer image generated in step S110 is output to the HMD 100 together with the image to be displayed on the display unit of the HMD 100.
- the image generation unit 120 of the HMD 100 performs predetermined processing on the input image, calculates the display position of the pointer image from the detection result of the sensor unit 140, superimposes the pointer image signal on the image signal, and outputs the result to the display elements 132 and 134.
- the initial display position of the pointer image may be set in advance, for example at the center of the display area, or may be a position designated by the user through the direction of the line of sight or a gesture.
- a pointer having, for example, a shape corresponding to the user attributes of the user is displayed.
- a plurality of colors may be set so that users can be distinguished even with the same shape (arrow), as shown in FIG. 1.
- shapes such as circles and squares may also be set so that users can be distinguished.
- additional pointer information that specifically identifies the user may be set in the pointer display format information. For example, as shown in FIG. 5, displaying the user name as pointer additional information 311a, 312a, 313a in the vicinity of each pointer makes it possible to distinguish the users more clearly.
- the name of the user is displayed as additional pointer information.
- the name of the person in charge, such as surgeon or assistant, may be displayed as additional pointer information.
- the pointer is displayed on the display unit of the HMD 100.
- the pointers of other users are also displayed in the display unit. That is, when the image being observed is a shared image shared by a plurality of users, a plurality of pointers are displayed.
- the display of each pointer differs depending on the user attributes. Therefore, even when an image is shared by a plurality of users, it is clear which pointer corresponds to each user, and communication between users can proceed smoothly.
- FIG. 6 is a flowchart showing pointer operation processing according to the present embodiment.
- FIG. 7 is an explanatory diagram illustrating pointer operation processing according to the present embodiment.
- the pointer operation processing is processing for determining, for example by the detection information processing unit 213, whether another user's pointer can be operated, based on the users' authority information. Such processing is performed, for example, when an instruction to change the pointer operated by the user is given while an image is shared by a plurality of users and a plurality of pointers are displayed on the display unit.
- the instruction to change the pointer to be operated can be given by, for example, a foot switch or a gesture.
- a description will be given for the situation where the users P1 to P3 share the image 300 as shown in FIG. 7. In FIG. 7, the pointers of the users P1, P2, and P3 correspond to the pointer 311, the pointer 312, and the pointer 313, respectively. At this time, consider a case where the user P1 operates the pointer 313 of another user P3.
- the detection information processing unit 213 determines whether or not the user P1 can operate the pointer 313 of another user P3 (S200). Whether or not the user P1 has an intention to operate the pointer 313 of the user P3 may be determined based on, for example, whether or not the designated position designated by the user P1 is at the display position of the pointer 313 of the user P3.
- the designated position by the user P1 is specified by, for example, the detection result of the line-of-sight sensor 142 or the gesture sensor 146.
- based on the detection result of the line-of-sight sensor 142 or the gesture sensor 146 of the HMD 100 worn by the user P1, the detection information processing unit 213 detects that the designated position of the user P1 is at the display position of the pointer 313 of the user P3, that is, that the user P1 is designating the pointer 313 of the user P3. The detection information processing unit 213 then confirms the authority information of the users P1 and P3, which is acquired from the user identification information in the setting storage unit 218. The detection information processing unit 213 compares the authority information of the user P1 and the user P3 and identifies the user having the higher authority. When the user P1 has higher authority than the user P3, the user P1 is regarded as having the authority to operate the pointer 313, and the process proceeds to step S220.
- when the user P1 cannot operate the pointer 313, the detection information processing unit 213 may generate information notifying that the pointer 313 of the user P3 cannot be operated, output the information to the HMD 100 via the output unit 216, and notify the user P1 with a message, voice, or the like.
- the detection information processing unit 213 determines whether a user other than the user P1 is operating the pointer 313 (S220). A user other than the user P3, the original user of the pointer 313, may also be operating it. Therefore, if someone is operating the pointer 313, the detection information processing unit 213 determines whether the user P1 has higher authority than the user who is operating the pointer 313 (S230).
- when the user P1 has higher authority, the detection information processing unit 213 determines that the user P1 can operate the pointer 313 in place of the user currently operating it (S240).
- otherwise, the detection information processing unit 213 determines that the user P1 cannot operate the pointer 313 (S210), and the process is terminated.
- if it is determined in step S220 that the pointer 313 is not being operated, the user P1 can operate the pointer 313 (S240).
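The decision flow of steps S200 to S240 can be condensed into a single predicate. This is a sketch under assumptions: the function name and the numeric higher-is-greater authority encoding are illustrative, not taken from the specification:

```python
# Sketch of the operation-permission decision (steps S200-S240).
# Numeric ranks (higher = more authority) are an assumed encoding.

def can_operate(requester_rank, owner_rank, operator_rank=None):
    """Return True if a requester may operate another user's pointer.

    operator_rank is the rank of a user currently operating the pointer,
    or None if nobody is operating it (checked in step S220).
    """
    # S200: the requester must out-rank the pointer's original user.
    if requester_rank <= owner_rank:
        return False          # S210: operation refused (user may be notified)
    # S220/S230: if someone is already operating the pointer, the
    # requester must also out-rank that current operator.
    if operator_rank is not None and requester_rank <= operator_rank:
        return False          # S210
    return True               # S240: operation permitted

# User P1 (rank 3) takes over the pointer of user P3 (rank 1):
print(can_operate(requester_rank=3, owner_rank=1))   # True
# ...but not while an equally ranked user is already operating it:
print(can_operate(3, 1, operator_rank=3))            # False
```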
- the operation availability information is output to the HMD 100 via the output unit 216 together with the image generated by the display control unit 215 and the pointer image.
- based on these pieces of information, the image generation unit 120 of the HMD 100 generates an image signal to be output to the display elements 132 and 134 and a pointer image signal for each user observing the shared image.
- the image generation unit 120 confirms whether the display position of another user's pointer can be changed, based on the determination result of the processing in FIG. 6.
- when the change is permitted, the image generation unit 120 changes the display position of the other user's pointer based on the user's own line-of-sight position or gesture. At this time, the user's own pointer is not moved.
- the image generation unit 120 superimposes the pointer image signal on the image signal generated based on the information from the processor unit 200 and outputs the image signal to the display elements 132 and 134.
- the operation of another user's pointer continues until the user gives an instruction to return to operating the user's own pointer, or an instruction to operate the pointer of yet another user.
- the pointer operation processing according to the present embodiment has been described above. By enabling other users' pointers to be operated according to the users' authority information, a user having higher authority can operate a pointer of a user having lower authority and thereby give instructions and guidance directly to that user.
- the surgeon has higher authority and the assistant has lower authority.
- in medical diagnosis such as ultrasonic diagnosis, the doctor has the highest authority, followed by the laboratory technician and then the patient.
- in telemedicine, a medical worker who guides a patient at a remote location has higher authority, and the remote patient has lower authority.
- in education, an educator has higher authority and a student has lower authority.
- the present disclosure is not limited to such an example; for example, the number of pointers displayed on the display unit of the HMD 100 may be changed according to the authority information.
- for example, in endoscopic surgery, it is assumed that a surgeon, an assistant, and a scopist who operates the endoscope device each wear the HMD 100 and that all users share and observe the same image during the operation.
- the operator, assistant, and scopist have higher authority in this order.
- the scopist's pointer, for example, may not need to be displayed in many cases.
- in such a case, the pointer of a user having authority lower than that of the operator may not be displayed.
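As an illustration of this display restriction, here is a minimal sketch; the rank values and threshold scheme are assumptions, not part of the disclosure:

```python
# Hypothetical sketch: display only the pointers of users whose
# authority rank meets a threshold (e.g. hide the scopist's pointer).

def visible_pointers(users, min_rank):
    """Return the names of users whose pointers are displayed."""
    return [u["name"] for u in users if u["rank"] >= min_rank]

team = [
    {"name": "surgeon",   "rank": 3},
    {"name": "assistant", "rank": 2},
    {"name": "scopist",   "rank": 1},
]
print(visible_pointers(team, min_rank=2))  # ['surgeon', 'assistant']
```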
- the information processing system 1 including the HMD 100 according to the present embodiment has been described above.
- according to the present embodiment, when a plurality of users wearing the HMD 100 share and observe an image, the display mode of the pointer operated by each user is changed based on the user attributes of each user and displayed in the display area.
- thereby, even when a plurality of pointers are displayed in the display area, the user can clearly recognize the pointer that the user can operate, and can clearly recognize which user is operating each of the other pointers.
- the information processing system 1 may be able to operate the pointers of other users according to the authority information included in the user attributes.
- a user with higher authority can operate a pointer of a user with lower authority to directly instruct or instruct a user with lower authority.
- in the above embodiment, the user identification information, which is a user attribute, is acquired based on iris authentication information acquired by the iris sensor, but the present disclosure is not limited to such an example.
- the user identification information may be acquired by biometric authentication information other than iris authentication information, or may be acquired, using near field communication (NFC) technology, from an IC card or the like in which the user identification information is stored.
- the information processing system capable of displaying a pointer in the display area of the HMD and pointing to a specific part has been described, but the present disclosure is not limited to such an example.
- a paint function for superimposing and displaying lines and figures in the display area may be further provided.
- as shown in FIG. 8, a pen icon 331 that enables a line to be drawn and a graphic icon 333 that enables a graphic to be inserted are provided as paint objects 330 at the upper right of the display area. The user can, for example, select an icon with the line of sight and then move the line of sight to draw a line along the gaze on a desired portion. Such an operation can also be performed by a gesture instead of the line of sight.
- An information processing apparatus comprising: a processing unit that changes a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a user-specified position based on a user attribute of a user who wears the head-mounted display.
- the information processing apparatus according to (1), wherein the processing unit displays, in a display area of a head-mounted display worn by each of a plurality of users, the same image shared among the users and pointers of at least some of the plurality of users.
- the information processing apparatus according to any one of (1) to (9), wherein the pointer attribute is display format information of a pointer.
- the pointer display format information is information related to a shape or a color of the pointer.
- the designated position of the user is determined by a line of sight of the user.
- the designated position of the user is determined by a gesture of the user.
- An information processing system comprising: a head-mounted display; and a processing unit that changes, based on a user attribute of a user wearing the head-mounted display, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
- (16) The information processing system according to (15), wherein the head-mounted display includes a line-of-sight sensor.
- (17) The information processing system according to (15) or (16), wherein the head-mounted display includes a gesture sensor.
- (18) The information processing system according to any one of (15) to (17), further including an imaging device that images an affected area.
Abstract
Description
1. Overview
2. Configuration
2.1. HMD
2.2. Processor unit
3. Display control processing
3.1. Pointer display processing
3.2. Pointer operation processing
First, an overview of processing performed by an information processing apparatus according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram showing endoscopic surgery being performed by a plurality of users wearing the HMD 100.
First, the configurations of the HMD 100 and the processor unit 200 constituting the information processing system 1 will be described based on FIG. 2. FIG. 2 is a functional block diagram showing the functional configurations of the HMD 100 and the processor unit 200 constituting the information processing system 1 according to the present embodiment. Note that FIG. 2 shows only the functional units involved in display control of the display unit of the HMD 100; in practice, other functional units are also provided. The processor unit 200 functions as a display control device that controls the display of the HMD 100 based on operation inputs to the HMD 100.
The HMD 100 is a display device that displays information such as video input from an external device such as an endoscope device. The HMD 100 is, for example, a goggle-shaped non-transmissive HMD, and is used while worn on the user's head. The HMD 100 consists of a main body including a display unit that presents information to the wearer, and an upper fixing part and a rear fixing part for fixing the main body to the head. When the HMD 100 is fixed to the wearer's head by the fixing parts, the display unit of the main body is positioned in front of the wearer's left and right eyes.
The processor unit 200 is a control device that controls the devices connected to it. Connected to the processor unit 200 are, for example, one or more of the HMDs 100 worn by the users P1 to P3 shown in FIG. 1, external devices such as an endoscope device, and a display on which unspecified users view information. For example, the processor unit 200 processes information input from an external device into information that can be displayed on the display unit of the HMD 100 or on the display, and outputs it to each display device. The processor unit 200 also switches the information displayed on the display unit of each HMD 100 based on operation inputs from the remote controller, line-of-sight detection function, or gesture detection function of each HMD 100.
Hereinafter, pointer display control processing in the information processing system 1 according to the present embodiment shown in FIG. 2 will be described based on FIGS. 3 to 7.
First, pointer display processing in the information processing system 1 according to the present embodiment will be described based on FIGS. 3 to 5. FIG. 3 is a flowchart showing the pointer display processing according to the present embodiment. FIG. 4 is an explanatory diagram showing another example of a pointer display format. FIG. 5 is an explanatory diagram showing an example in which additional information is displayed near a pointer.
In the present embodiment, when an image is shared by a plurality of users, the pointers of other users may also be made operable based on the users' authority information. Hereinafter, pointer operation processing in the information processing system 1 according to the present embodiment will be described based on FIGS. 6 and 7. FIG. 6 is a flowchart showing the pointer operation processing according to the present embodiment. FIG. 7 is an explanatory diagram illustrating the pointer operation processing according to the present embodiment.
The information processing system 1 including the HMD 100 according to the present embodiment has been described above. According to the present embodiment, when a plurality of users wearing the HMD 100 share and observe an image, the display format of the pointer operated by each user is changed based on the user attributes of each user and displayed in the display area. Thereby, even when a plurality of pointers are displayed in the display area, they are displayed in different colors and shapes, so each user can clearly recognize the pointer that the user can operate and can clearly recognize which user is operating each of the other pointers.
(1)
An information processing apparatus comprising a processing unit that changes, based on a user attribute of a user wearing a head-mounted display, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
(2)
The information processing apparatus according to (1), wherein the processing unit displays, in a display area of a head-mounted display worn by each of a plurality of users, the same image shared among the users and pointers of at least some of the plurality of users.
(3)
The information processing apparatus according to (2), wherein whether or not the pointers of other users are displayed is determined based on a relationship of user attributes among the users.
(4)
The information processing apparatus according to (2) or (3), wherein the processing unit determines, based on a relationship of user attributes among the users, whether or not the pointers of other users can be operated.
(5)
The information processing apparatus according to any one of (2) to (4), wherein the same image shared among the users is a medical image.
(6)
The information processing apparatus according to any one of (1) to (5), wherein the user attribute is authority information of the user.
(7)
The information processing apparatus according to (6), wherein the authority information of the user is authority information among medical workers.
(8)
The information processing apparatus according to any one of (1) to (7), wherein the user attribute is acquired by biometric authentication.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the user attribute is acquired by near field communication.
(10)
The information processing apparatus according to any one of (1) to (9), wherein the pointer attribute is display format information of the pointer.
(11)
The information processing apparatus according to (10), wherein the display format information of the pointer is information related to a shape or a color of the pointer.
(12)
The information processing apparatus according to any one of (1) to (11), wherein the position designated by the user is determined by the user's line of sight.
(13)
The information processing apparatus according to any one of (1) to (11), wherein the position designated by the user is determined by the user's gesture.
(14)
An information processing method comprising: acquiring a user attribute of a user wearing a head-mounted display; and changing, based on the user attribute, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
(15)
An information processing system comprising: a head-mounted display; and a processing unit that changes, based on a user attribute of a user wearing the head-mounted display, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
(16)
The information processing system according to (15), wherein the head-mounted display includes a line-of-sight sensor.
(17)
The information processing system according to (15) or (16), wherein the head-mounted display includes a gesture sensor.
(18)
The information processing system according to any one of (15) to (17), further comprising an imaging device that images an affected area.
10 Endoscope device
100 HMD
102 Remote controller
110 Display port
120 Image generation unit
132 First display element
134 Second display element
140 Sensor unit
142 Line-of-sight sensor
144 Iris sensor
146 Gesture sensor
200 Processor unit
211 Image input unit
212 Detection information input unit
213 Detection information processing unit
214 Image processing unit
215 Display control unit
216 Output unit
217 Operation input unit
218 Setting storage unit
Claims (18)
- An information processing apparatus comprising a processing unit that changes, based on a user attribute of a user wearing a head-mounted display, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
- The information processing apparatus according to claim 1, wherein the processing unit displays, in a display area of a head-mounted display worn by each of a plurality of users, the same image shared among the users and pointers of at least some of the plurality of users.
- The information processing apparatus according to claim 2, wherein whether or not the pointers of other users are displayed is determined based on a relationship of user attributes among the users.
- The information processing apparatus according to claim 2, wherein the processing unit determines, based on a relationship of user attributes among the users, whether or not the pointers of other users can be operated.
- The information processing apparatus according to claim 2, wherein the same image shared among the users is a medical image.
- The information processing apparatus according to claim 1, wherein the user attribute is authority information of the user.
- The information processing apparatus according to claim 6, wherein the authority information of the user is authority information among medical workers.
- The information processing apparatus according to claim 1, wherein the user attribute is acquired by biometric authentication.
- The information processing apparatus according to claim 1, wherein the user attribute is acquired by near field communication.
- The information processing apparatus according to claim 1, wherein the pointer attribute is display format information of the pointer.
- The information processing apparatus according to claim 10, wherein the display format information of the pointer is information related to a shape or a color of the pointer.
- The information processing apparatus according to claim 1, wherein the position designated by the user is determined by the user's line of sight.
- The information processing apparatus according to claim 1, wherein the position designated by the user is determined by the user's gesture.
- An information processing method comprising: acquiring a user attribute of a user wearing a head-mounted display; and changing, based on the user attribute, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
- An information processing system comprising: a head-mounted display; and a processing unit that changes, based on a user attribute of a user wearing the head-mounted display, a pointer attribute of a pointer displayed in a display area of the head-mounted display in accordance with a position designated by the user.
- The information processing system according to claim 15, wherein the head-mounted display includes a line-of-sight sensor.
- The information processing system according to claim 15, wherein the head-mounted display includes a gesture sensor.
- The information processing system according to claim 15, further comprising an imaging device that images an affected area.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/553,192 US10854168B2 (en) | 2015-03-30 | 2016-02-03 | Information processing apparatus, information processing method, and information processing system |
JP2017509340A JP6693507B2 (ja) | 2015-03-30 | 2016-02-03 | Information processing apparatus, information processing method, and information processing system |
EP16771884.0A EP3279780A4 (en) | 2015-03-30 | 2016-02-03 | Information processing device, information processing method, and information processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015068899 | 2015-03-30 | ||
JP2015-068899 | 2015-03-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016158000A1 true WO2016158000A1 (ja) | 2016-10-06 |
Family
ID=57006954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/053136 WO2016158000A1 (ja) | 2016-02-03 | Information processing apparatus, information processing method, and information processing system |
Country Status (4)
Country | Link |
---|---|
US (1) | US10854168B2 (ja) |
EP (1) | EP3279780A4 (ja) |
JP (1) | JP6693507B2 (ja) |
WO (1) | WO2016158000A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018169709A (ja) * | 2017-03-29 | 2018-11-01 | Fuji Xerox Co., Ltd. | Content display device and content display program |
JP2019013397A (ja) * | 2017-07-05 | 2019-01-31 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
JP2019088734A (ja) * | 2017-11-17 | 2019-06-13 | Sony Corporation | Surgery system, information processing apparatus, and information processing method |
JP2019537149A (ja) * | 2017-03-31 | 2019-12-19 | Beijing 7Invensun Technology Co., Ltd. | Eye tracking device and head-mounted display device |
WO2020137162A1 (ja) * | 2018-12-28 | 2020-07-02 | Fujifilm Corporation | Ultrasonic diagnostic apparatus and control method for ultrasonic diagnostic apparatus |
JP2020520521A (ja) * | 2017-05-16 | 2020-07-09 | Koninklijke Philips N.V. | Augmented reality for collaborative interventions |
US11026560B2 (en) * | 2018-03-15 | 2021-06-08 | Sony Olympus Medical Solutions Inc. | Medical display control apparatus and display control method |
WO2022208600A1 (ja) * | 2021-03-29 | 2022-10-06 | Kyocera Corporation | Wearable terminal device, program, and display method |
KR102458495B1 (ko) * | 2022-03-17 | 2022-10-25 | MediThinQ Co., Ltd. | Three-dimensional pointing system for remote collaborative care support and control method thereof |
JP7500866B2 (ja) | 2021-03-29 | 2024-06-17 | Kyocera Corporation | Wearable terminal device, program, and display method |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3525023A1 (en) * | 2018-02-09 | 2019-08-14 | Leica Instruments (Singapore) Pte. Ltd. | Arm adapted to be attached to a microscope, and microscope |
US10795435B2 (en) * | 2018-07-19 | 2020-10-06 | Samsung Electronics Co., Ltd. | System and method for hybrid eye tracker |
JP7318258B2 (ja) | 2019-03-26 | 2023-08-01 | Kobelco Construction Machinery Co., Ltd. | Remote operation system and remote operation server |
USD953456S1 (en) * | 2020-06-08 | 2022-05-31 | Nautilus, Inc. | Display of a stationary exercise machine |
USD953457S1 (en) * | 2020-06-08 | 2022-05-31 | Nautilus, Inc. | Display of a stationary exercise machine |
KR20230027299A (ko) * | 2020-06-29 | 2023-02-27 | Snap Inc. | Eyewear with shared gaze-responsive viewing |
CN113194299B (zh) * | 2021-07-01 | 2021-08-31 | Shenzhen Xiuyuan Cultural Creativity Co., Ltd. | Real-time image sharing method for oral treatment in a smart medical scenario |
WO2023119986A1 (ja) * | 2021-12-21 | 2023-06-29 | Kubota Corporation | Agricultural machine and gesture recognition system for agricultural machine |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08171586A (ja) * | 1994-12-16 | 1996-07-02 | Hitachi Ltd | Collaborative work support method, information processing system, and plant control system |
JP2002368762A (ja) * | 2001-06-06 | 2002-12-20 | Olympus Optical Co Ltd | Local network system, network system, video conference system, and mobile communication device |
JP2010170354A (ja) * | 2009-01-23 | 2010-08-05 | Seiko Epson Corp | Shared information display device, shared information display method, and computer program |
JP2010213129A (ja) * | 2009-03-12 | 2010-09-24 | Brother Ind Ltd | Video conference device, video conference system, video conference control method, and program for video conference device |
JP2013054661A (ja) * | 2011-09-06 | 2013-03-21 | Nec Biglobe Ltd | Information display system, information display method, and information display program |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001117046A (ja) | 1999-10-22 | 2001-04-27 | Shimadzu Corp | Head-mounted display system with line-of-sight detection function |
JP2001238205A (ja) * | 2000-02-24 | 2001-08-31 | Olympus Optical Co Ltd | Endoscope system |
US7242389B1 (en) * | 2003-10-07 | 2007-07-10 | Microsoft Corporation | System and method for a large format collaborative display for sharing information |
US7331929B2 (en) * | 2004-10-01 | 2008-02-19 | General Electric Company | Method and apparatus for surgical operating room information display gaze detection and user prioritization for control |
EP2453290A1 (en) * | 2010-11-11 | 2012-05-16 | BAE Systems PLC | Image presentation method and apparatus therefor |
US9063566B2 (en) * | 2011-11-30 | 2015-06-23 | Microsoft Technology Licensing, Llc | Shared collaboration using display device |
WO2014021602A2 (ko) * | 2012-07-31 | 2014-02-06 | Intellectual Discovery Co., Ltd. | Wearable electronic device and control method thereof |
US9710968B2 (en) * | 2012-12-26 | 2017-07-18 | Help Lightning, Inc. | System and method for role-switching in multi-reality environments |
US9678567B2 (en) * | 2014-07-16 | 2017-06-13 | Avaya Inc. | Indication of eye tracking information during real-time communications |
US20160027218A1 (en) * | 2014-07-25 | 2016-01-28 | Tom Salter | Multi-user gaze projection using head mounted display devices |
US9538962B1 (en) * | 2014-12-31 | 2017-01-10 | Verily Life Sciences Llc | Heads-up displays for augmented reality network in a medical environment |
-
2016
- 2016-02-03 JP JP2017509340A patent/JP6693507B2/ja active Active
- 2016-02-03 WO PCT/JP2016/053136 patent/WO2016158000A1/ja active Application Filing
- 2016-02-03 EP EP16771884.0A patent/EP3279780A4/en not_active Ceased
- 2016-02-03 US US15/553,192 patent/US10854168B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3279780A4 * |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018169709A (ja) * | 2017-03-29 | 2018-11-01 | Fuji Xerox Co., Ltd. | Content display device and content display program |
JP2019537149A (ja) * | 2017-03-31 | 2019-12-19 | Beijing 7Invensun Technology Co., Ltd. | Eye tracking device and head-mounted display device |
US11143869B2 (en) | 2017-03-31 | 2021-10-12 | Beijing 7Invensun Technology Co., Ltd. | Eye tracking device and head-mounted display device |
JP2020520521A (ja) * | 2017-05-16 | 2020-07-09 | Koninklijke Philips N.V. | Augmented reality for collaborative interventions |
JP2019013397A (ja) * | 2017-07-05 | 2019-01-31 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
JP6996883B2 (ja) | 2017-07-05 | 2022-01-17 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
US11224329B2 (en) | 2017-07-05 | 2022-01-18 | Sony Olympus Medical Solutions Inc. | Medical observation apparatus |
JP7155511B2 (ja) | 2017-11-17 | 2022-10-19 | Sony Group Corporation | Surgery system, information processing apparatus, and information processing method |
JP2019088734A (ja) * | 2017-11-17 | 2019-06-13 | Sony Corporation | Surgery system, information processing apparatus, and information processing method |
US11587670B2 (en) | 2017-11-17 | 2023-02-21 | Sony Corporation | Surgery system, information processing apparatus, and information processing method |
US11026560B2 (en) * | 2018-03-15 | 2021-06-08 | Sony Olympus Medical Solutions Inc. | Medical display control apparatus and display control method |
WO2020137162A1 (ja) * | 2018-12-28 | 2020-07-02 | Fujifilm Corporation | Ultrasonic diagnostic apparatus and control method for ultrasonic diagnostic apparatus |
JP7501044B2 (ja) | 2020-03-30 | 2024-06-18 | Seiko Epson Corporation | Display system, information processing apparatus, and display control method for display system |
WO2022208600A1 (ja) * | 2021-03-29 | 2022-10-06 | Kyocera Corporation | Wearable terminal device, program, and display method |
JP7500866B2 (ja) | 2021-03-29 | 2024-06-17 | Kyocera Corporation | Wearable terminal device, program, and display method |
KR102458495B1 (ko) * | 2022-03-17 | 2022-10-25 | MediThinQ Co., Ltd. | Three-dimensional pointing system for remote collaborative care support and control method thereof |
WO2023177003A1 (ko) * | 2022-03-17 | 2023-09-21 | MediThinQ Co., Ltd. | Three-dimensional pointing system for remote collaborative care support and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
US20180122333A1 (en) | 2018-05-03 |
US10854168B2 (en) | 2020-12-01 |
EP3279780A1 (en) | 2018-02-07 |
EP3279780A4 (en) | 2018-11-07 |
JP6693507B2 (ja) | 2020-05-13 |
JPWO2016158000A1 (ja) | 2018-01-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016158000A1 (ja) | Information processing apparatus, information processing method, and information processing system | |
JP6574939B2 (ja) | Display control device, display control method, display control system, and head-mounted display | |
EP3107474B1 (en) | Multi-display control device for a surgical system | |
US11819273B2 (en) | Augmented and extended reality glasses for use in surgery visualization and telesurgery | |
US9898662B2 (en) | Information processing apparatus, information processing method, and information processing system | |
US10506913B2 (en) | Apparatus operation device, apparatus operation method, and electronic apparatus system | |
CN117441362A (zh) | 具有远程装置功能的远程控件对的指示 | |
TW201505603A (zh) | Information processing apparatus, information processing method, and information processing system | |
JP6507827B2 (ja) | Display system | |
US11094283B2 (en) | Head-wearable presentation apparatus, method for operating the same, and medical-optical observation system | |
WO2021062375A1 (en) | Augmented and extended reality glasses for use in surgery visualization and telesurgery | |
JP2016189120A (ja) | Information processing apparatus, information processing system, and head-mounted display | |
JP6589855B2 (ja) | Head-mounted display, control device, and control method | |
JP6617766B2 (ja) | Medical observation system, display control system, and display control device | |
US11224329B2 (en) | Medical observation apparatus | |
EP4034028A1 (en) | Augmented and extended reality glasses for use in surgery visualization and telesurgery |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16771884 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017509340 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15553192 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2016771884 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |