AU2016270422B2 - Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system - Google Patents
- Publication number
- AU2016270422B2
- Authority
- AU
- Australia
- Prior art keywords
- interest
- image
- view
- dental area
- head
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00043—Operational features of endoscopes provided with output arrangements
- A61B1/00045—Display arrangement
- A61B1/00048—Constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/045—Control thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/24—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/361—Image-producing devices, e.g. surgical cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/634—Warning indications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00203—Electrical control of surgical instruments with speech control or speech recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30036—Dental; Teeth
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Abstract
A head mounted magnifying camera is in communication with a computer system having a head-mounted display with a center of a field of view. A magnified view of a live dental area of interest is provided, image capture of the live dental area of interest is performed and downloaded into the computer system, or an image of a stored dental area of interest is uploaded from the computer system and matched to the live dental area of interest. The image of the live dental area of interest is automatically tracked in the computer system, and the image of the live dental area of interest is kept in the center of the field of view of the head-mounted display.
Description
Apparatus and Method for Image Capture of Medical or Dental Images Using a Head Mounted Camera and Computer System
[01] Background
[02] Field of the Technology
The invention relates to the field of dental loupes and image tracking and capture.
[03] Description of the Prior Art
[04] Google Glass is a wearable computer with an optical head-mounted display (OHMD). It was developed by Google with the mission of producing a mass-market ubiquitous computer. Google Glass displays information in a smartphone-like, hands-free format. Wearers communicate with the Internet via natural language voice commands. Google Glass became officially available to the general public on May 15th, 2014. The development and manufacture of head mounted computer displays, such as those marketed under the brand Google Glass, has made computer displays readily available to the computer user in a display unit mounted in a glasses frame. This allows the user to see and interact by appropriate means with a computer system, while simultaneously allowing the user to operate hands-free and to have a substantially unimpeded view of any other field of view. Google Glass essentially duplicates an aviator's view of instrument readings on the cockpit glass or face shield while allowing simultaneous normal viewing through the same cockpit glass or face shield.
[05] Using these features, several proofs of concept for Google Glass have been proposed in healthcare. These have generally involved audiovisual communication and information retrieval during a variety of medical procedures.
[06] On March 31, 2014, a startup company Dentyzion, founded by four dental students at the University of Michigan School of Dentistry, announced the world's first Google Glass loupes. The dental technology and digital marketing consulting company designed the add-on to Glass and collaborated with SurgiTel, who manufactured the magnifying optics according to the model. The loupe lenses are critical for Glass' functionality in dentistry and surgery by allowing dentists and surgeons to continue to use magnification while using Google Glass. The design is reported to have been well received amongst dental students and faculty at the University of Michigan. SurgiTel's Micro 2.5x optics that were installed are light weight and do not affect the comfort and balance for Glass users. Dentyzion is currently collaborating with the University of Michigan to implement Google Glass in the dental curriculum. Dentyzion, however, merely mechanically mounted dental loupes into the frames of a Google Glass having a telescopic camera directed to the same or nearly the same field of view of the loupes for the purposes of creating an audiovisual source for use in and for enhancing dental education.
[07] Users of dental loupes can readily testify that it takes practice and training to use them, since the magnified field of view seen through the loupes can rapidly change as the user moves his head. The problem becomes particularly acute when the user changes his line of sight to another field of view other than that seen through the loupes and then returns his eyes to the field of view provided by the loupes. Typically, the angle or position of the head has changed enough when the user changes his line of sight, so that upon return to the field of view of the loupes, a completely different location is seen in magnified view than that which was being viewed before the user looked away. The user then has to move his head angle to hunt for the prior magnified field of view based on a magnified scene or view that may not be clearly related in the user's mind to the prior field of view which is desired. This hunting for the prior field of view is distracting and may entail wasted time and effort from the desired task at hand unless the user is practiced in mentally switching between scales and views in an effortless manner.
Brief Summary
[08] The illustrated embodiments of the invention include a method including the steps of providing a head mounted magnifying camera
communicated with a computer system with a head-mounted display having a center of a field of view, providing a magnified view of a live dental area of interest, actuating image capture of the live dental area of interest and
downloading it into the computer system, or uploading an image of a stored dental area of interest from the computer system and matching it to the live dental area of interest, automatically tracking the image of the live dental area of interest in the computer system, and keeping the image of the live dental area of interest in the center of the field of view of the head-mounted display.
[09] The method further includes the step of automatically
compensating for small movements of the head mounted magnifying camera shifting the center of the field of view by using a tracking program in the computer system to keep the live dental area of interest in the center of the field of view of the head-mounted display.
[10] The method further includes the steps of providing a micro-pan and tilt mechanism coupled to the camera or a lens of the camera for larger movements of the head that take the live dental area of interest out of the center of the field of view beyond that which can be compensated by the tracking program; and automatically moving the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display.
[11] The step of automatically tracking the image of the live dental area of interest in the computer system includes the steps of providing a micro-pan and tilt mechanism coupled to the camera or a lens of the camera; and
automatically moving the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display.
[12] The method further includes the steps of storing a wide field of view around the image of the live dental area of interest that exceeds the range of the pan and tilt mechanism's ability to stay pointed at the live dental area of interest, displaying an "out-of-frame" message or icon in the head-mounted display, controlling the pan and tilt mechanism to direct the head-mounted camera to the live dental area of interest and/or displaying directional arrows in the head-mounted display to cue the user to the direction in which the head-mounted camera needs to be turned in order to bring the image of the live dental area of interest toward the center of the field of view by using an image recognition program in the computer system using the stored wide field of view around the image of the live dental area of interest, reacquiring at least a portion of the wide field of view around the image of the live dental area of interest, identifying the desired live dental area of interest by the image recognition program, and moving the desired live dental area of interest into the center of the field of view using the tracking program.
[13] The step of storing a wide field of view around the image of the live dental area of interest includes taking and storing a fast image capture of an enhanced wide field of view using a telescopic camera control.
[14] In the step of storing, the wide field of view is stored in a fast frame shot at the time of image capture during a time interval which is short enough that the user does not perceive that the camera has taken a telescopic enhanced wide field of view image.
[15] The step of storing a wide field of view around the image of the live dental area of interest includes taking a high resolution wide angle image, but displaying only a center portion of the high resolution wide angle image in the head-mounted display in a magnified view.
[16] The method further includes releasing or erasing the captured image by a deactivation command from the user.
[17] The illustrated embodiments of the apparatus also include a computer system with a memory, a head mounted magnifying camera
communicated with the computer system for providing a magnified image of a live dental area of interest, and a head-mounted display having a center of a field of view and communicated with the computer system, where the computer system automatically tracks the magnified image of the live dental area of interest to keep it in the center of the field of view of the head-mounted display, and where the memory stores a captured image of the live dental area of interest.
[18] The computer system automatically compensates for small movements of the head mounted magnifying camera shifting the center of the field of view to keep the live dental area of interest in the center of the field of view of the head-mounted display by image tracking.
[19] The apparatus further includes a micro-pan and tilt mechanism coupled to the camera or a lens of the camera for larger movements of the head that take the live dental area of interest out of the center of the field of view beyond that which can be compensated by the tracking program, and the computer system automatically moves the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display by image tracking.
[20] The apparatus further includes a micro-pan and tilt mechanism coupled to the camera or a lens of the camera; and the computer system automatically moves the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display by image tracking.
[21] The computer system stores an image with a wide field of view around the image of the live dental area of interest that exceeds the range of the pan and tilt mechanism's ability to stay pointed at the live dental area of interest, displays an "out-of-frame" message or icon in the head-mounted display, controls the pan and tilt mechanism to direct the head-mounted camera to the live dental area of interest and/or displays directional arrows in the head-mounted display to cue the user to the direction in which the head-mounted camera needs to be turned in order to bring the image of the live dental area of interest toward the center of the field of view using image recognition on the stored wide field of view around the image of the live dental area of interest, reacquires at least a portion of the wide field of view around the image of the live dental area of interest, identifies the desired live dental area of interest by image recognition, and moves the desired live dental area of interest into the center of the field of view by image tracking.
[22] The computer system stores the wide field of view around the live dental area of interest by taking and storing a fast image capture of an enhanced wide field of view using a telescopic camera control.
[23] The computer system takes and stores the fast image capture by taking a fast frame shot at the time of image capture during a time interval which is short enough so that the user does not perceive that the camera has taken a telescopic enhanced wide field of view image.
[24] The computer system stores an image with a wide field of view around the image of the live dental area of interest by taking a high resolution wide angle image, but displaying only a center portion of the high resolution wide angle image in the head-mounted display in a magnified view.
[25] While the apparatus and method has or will be described for the sake of grammatical fluidity with functional explanations, it is to be expressly understood that the claims, unless expressly formulated under 35 USC 112, are not to be construed as necessarily limited in any way by the construction of "means" or "steps" limitations, but are to be accorded the full scope of the meaning and equivalents of the definition provided by the claims under the judicial doctrine of equivalents, and in the case where the claims are expressly formulated under 35 USC 112 are to be accorded full statutory equivalents under 35 USC 112. The disclosure can be better visualized by turning now to the following drawings wherein like elements are referenced by like numerals.
Brief Description of the Drawings
[26] Fig. 1 is a block diagram of a remote head-mounted camera and display subsystem wirelessly communicated with a computer system.
[27] Fig. 2 is a depiction of a user wearing the head-mounted camera and display subsystem of Fig. 1.
[28] Fig. 3 is a flow diagram of the operation of the apparatus of Figs. 1 and 2 according to the illustrated embodiments of the invention.
[29] The disclosure and its various embodiments can now be better understood by turning to the following detailed description of the preferred embodiments which are presented as illustrated examples of the embodiments defined in the claims. It is expressly understood that the embodiments as defined by the claims may be broader than the illustrated embodiments described below.
Detailed Description of the Preferred Embodiments
[31] In the illustrated embodiment of the invention the dental loupes are replaced by a head mounted magnifying camera communicated with a computer system with a head-mounted display, similar to that shown as Google Glass disclosed in US Pat. Pub. 2013/0044042, incorporated herein by reference, except that the camera is modified in various particulars as set forth below and its output is used to provide a magnified view of the dental area of interest subject to certain controls as disclosed below. Head-mounted camera and display system 12, which includes a camera 22 and close-up display 24 mounted in a glasses frame 28, is worn by a user or dental practitioner as shown in Fig. 2. Head-mounted camera and display system 12 communicates wirelessly with a computer system 20, which includes a processor 14, display 16 and memory 18 as shown in Fig. 1.
[32] The user first turns on Google Glass 22 at step 30. Once the user acquires the desired view of the tooth or dental area of interest, he actuates image capture at step 32 in Fig. 3 with any kind of motion using a foot switch, orally or facially activated switch, finger flexure switch, uttering a word such as "capture", or blinking an eye, or uploads a stored dental image from memory 18 by an appropriate command. Thereafter, an automatic tracking program in the computer system 20, to which the head mounted magnifying camera 22 is communicated, keeps the captured image in the center of the head-mounted display 24 in step 34. Image tracking programs are well known to the art.
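The tracking step at step 34 can be sketched in code. The patent does not specify a tracking algorithm, so the following Python sketch uses simple normalized cross-correlation to locate the captured template in each live frame and to compute the shift needed to re-center it; the function names and frame dimensions are hypothetical illustrations, not taken from the patent.

```python
import numpy as np

def locate_roi(frame, template):
    """Find the captured template (region of interest) inside a live grayscale
    frame by normalized cross-correlation; return the (row, col) of the
    best-matching window's top-left corner."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            window = frame[r:r + th, c:c + tw]
            w = window - window.mean()
            denom = np.sqrt((w * w).sum() * (t * t).sum())
            score = (w * t).sum() / denom if denom else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

def centering_shift(frame_shape, roi_pos, roi_shape):
    """Pixel shift needed to move the tracked ROI to the display center."""
    fr, fc = frame_shape[0] // 2, frame_shape[1] // 2
    rr = roi_pos[0] + roi_shape[0] // 2
    rc = roi_pos[1] + roi_shape[1] // 2
    return fr - rr, fc - rc
```

In practice an optimized matcher (e.g. an FFT-based correlation or a dedicated tracking library) would replace the brute-force double loop, but the geometry of the re-centering shift is the same.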
[33] Small movements of the camera 22 shifting the live field of view (FOV) as detected at step 36 can be automatically compensated by the tracking program to keep the live view matching the stored captured image in the center of the head display 24 as determined by an image identification program. For larger movements of the head that would tend to take the location of the desired captured image out of the field of live view of the camera 22, a conventional micro-pan and tilt mechanism (not shown) included in frame 28 and coupled to camera 22, or included in camera 22 and coupled to the lens of the camera 22, automatically moves the camera angle in step 38 to keep the location of the captured image in the live field of view of the camera 22 and hence in the center of the screen of the head-mounted display 24.
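The two-tier compensation of steps 36 and 38 might be dispatched as follows. This is a hypothetical sketch: the digital margin, the pixels-per-degree conversion for the pan/tilt mount, and the function name are assumptions, not details given in the patent.

```python
def compensate(offset_px, digital_margin_px, px_per_degree):
    """Choose how to re-center the tracked area: small shifts are absorbed
    digitally by moving the displayed crop; larger ones are converted into
    a pan/tilt angle command for the camera mount.

    offset_px: (dy, dx) of the ROI center relative to the display center.
    Returns ('digital', (dy, dx)) or ('pan_tilt', (tilt_deg, pan_deg)).
    """
    dy, dx = offset_px
    if abs(dy) <= digital_margin_px and abs(dx) <= digital_margin_px:
        return ('digital', (dy, dx))
    return ('pan_tilt', (dy / px_per_degree, dx / px_per_degree))
```

A small offset such as (3, -2) pixels would be handled digitally, while an offset of 120 pixels would be translated into an angular command for the micro-pan and tilt mechanism.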
[34] For extremely large movements that exceed the range of even the pan and tilt mechanism's ability to stay pointed at the location of the captured image as determined at step 40, the computer system 20 stores a wide field of view around the captured image at the time of actuation at step 32 in memory 18. This can be accomplished by a fast image capture of a wider field of view using conventional telescopic camera control included in camera 22. The wider field of view is taken in a fast enough frame shot at the time of image capture such that the user is unaware that the camera 22 has even taken the wide angle shot.
[35] Alternatively, the camera 22 can be permanently adjusted to take high resolution wide angle shots. The captured image in the computer is thus a wide angle shot, but only the center of the high resolution live shot is displayed in the head-mounted display 24 in magnified scale under software control in processor 14.
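The alternative of paragraph [35] amounts to cropping: the stored frame is a high-resolution wide-angle shot, but only its magnified center is rendered in display 24. A minimal sketch of that crop, with hypothetical names, follows.

```python
# Sketch of paragraph [35]: store a high-resolution wide-angle frame,
# display only a magnified center crop. Names are illustrative only.

def center_crop(frame, out_h, out_w):
    """Return the centered out_h x out_w window of a 2-D frame."""
    h, w = len(frame), len(frame[0])
    top, left = (h - out_h) // 2, (w - out_w) // 2
    return [row[left:left + out_w] for row in frame[top:top + out_h]]

# A toy 6x6 "wide-angle frame" whose pixel value encodes its position.
wide = [[c + 10 * r for c in range(6)] for r in range(6)]
assert center_crop(wide, 2, 2) == [[22, 23], [32, 33]]
```

Because the full wide-angle frame stays in memory 18, the out-of-frame recovery of paragraph [36] can search it even though the user never saw it.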
[36] After an extremely large movement taking the desired location out of the maximum field of view of the camera 22, an "out-of-frame" message or icon appears in the head-mounted display 24 at step 40. As the user returns the camera angle toward the original field of view, the live shot will begin to pick up parts of the image in the stored wide angle shot in memory 18 of the view of interest. An image recognition program in processor 14 at step 42 then directs the angle of the camera's pan and tilt mechanism to the direction of the location of the captured image, or displays directional arrows in the display 24 which cue the user to the direction in which the head mounted camera 22 needs to be turned in order to once again be pointed at the desired location matching that of the captured image. Image recognition programs are well known in the art. The last view of the captured image may be maintained as a captured image in display 24 in such an instance. As soon as the live shot regains the location of the desired image, it is identified as such by the image recognition program in processor 14 and moved by the tracking program in processor 14 into the center of the display 24 and maintained there as long as it is in the live field of view of the camera 22.
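The directional-arrow cueing of step 42 reduces to comparing where the recognized target lies against the frame center. The mapping below, including the dead-zone parameter, is a hypothetical sketch and not the patent's recognition program.

```python
# Illustrative cue generation for steps 40-42: the offset between the
# matched target location and the frame center yields arrow cues.
# Function name, coordinates, and dead_zone are assumptions.

def direction_cue(match_center, frame_center, dead_zone=5):
    """Map a (x, y) match-to-center offset to on-screen arrow cues."""
    dx = match_center[0] - frame_center[0]
    dy = match_center[1] - frame_center[1]
    cues = []
    if dx < -dead_zone:
        cues.append("left")    # turn head left to recenter target
    elif dx > dead_zone:
        cues.append("right")
    if dy < -dead_zone:
        cues.append("up")
    elif dy > dead_zone:
        cues.append("down")
    return cues or ["centered"]

assert direction_cue((100, 240), (320, 240)) == ["left"]
assert direction_cue((320, 238), (320, 240)) == ["centered"]
```

The same offset could instead be fed to the pan/tilt servos, which is the automatic alternative the paragraph describes.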
[37] The operator may move around, viewing the entire area or a different view, while the display screen 24 would still be showing a live view of the particular tooth or area in question. The displayed image would change since the camera angle is changing, but the area of interest would stay centered in the field of view, avoiding the need for the user to constantly hunt for the tooth or area by controlling his/her head position.
[38] The captured image is released or erased by a deactivation command from the user at step 44, which is an interrupt that can be entered from any point in the program, which again could be made by any kind of motion or saying a word, such as "release". The computer system may also store multiple captured images which can be orally labeled by the user at the time of capture and reacquired at any time by calling out the file name of the stored captured image, such as by saying the name of the tooth. The user can thus go quickly back using the computer tracking and direction to a previously captured location without the need to recapture the location.
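The orally labeled, reacquirable captures of paragraph [38] behave like a named store keyed by the spoken label. The class below is an illustrative assumption; the patent does not specify this interface.

```python
# Sketch of paragraph [38]: captures labeled by a spoken name (e.g. a
# tooth name) are stored, reacquired by calling the name, and erased
# by a "release" command. All names here are illustrative assumptions.

class CaptureStore:
    def __init__(self):
        self._images = {}

    def capture(self, name, image):
        self._images[name] = image     # oral label given at capture time

    def reacquire(self, name):
        return self._images.get(name)  # e.g. user says the tooth name

    def release(self, name):
        self._images.pop(name, None)   # "release" erases the capture

store = CaptureStore()
store.capture("upper left molar", [[1, 2]])
assert store.reacquire("upper left molar") == [[1, 2]]
store.release("upper left molar")
assert store.reacquire("upper left molar") is None
```

Hooking the same `capture` call to a patient record, as paragraph [39] suggests, would only require persisting the dictionary to the database in memory 18.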
[39] Record keeping in a database in memory 18 can also be easily achieved by similarly using oral commands from the user at step 32 to store any captured image into a patient record. Thus, well centered images of a tooth before, during and after any procedure may be captured and stored in a readily accessible patient record without the need to stop or delay the procedure for the purposes of making a photographic record.
[40] Many alterations and modifications may be made by those having ordinary skill in the art without departing from the spirit and scope of the embodiments. Therefore, it must be understood that the illustrated embodiment has been set forth only for the purposes of example and that it should not be taken as limiting the embodiments as defined by the following claims and its various embodiments.
[41] Therefore, it must be understood that the illustrated embodiment has been set forth only for the purposes of example and that it should not be taken as limiting the embodiments as defined by the following claims. For example, notwithstanding the fact that the elements of a claim are set forth below in a certain combination, it must be expressly understood that the embodiments include other combinations of fewer, more or different elements, which are disclosed above even when not initially claimed in such combinations. A teaching that two elements are combined in a claimed combination is further to be understood as also allowing for a claimed combination in which the two elements are not combined with each other, but may be used alone or combined in other combinations. The excision of any disclosed element of the embodiments is explicitly contemplated as within the scope of the embodiments.
[42] The words used in this specification to describe the various embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification structure, material or acts beyond the scope of the commonly defined meanings. Thus if an element can be understood in the context of this specification as including more than one meaning, then its use in a claim must be understood as being generic to all possible meanings supported by the specification and by the word itself.
[43] The definitions of the words or elements of the following claims are, therefore, defined in this specification to include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements in the claims below or that a single element may be substituted for two or more elements in a claim. Although elements may be described above as acting in certain combinations and even initially claimed as such, it is to be expressly understood that one or more elements from a claimed combination can in some cases be excised from the combination and that the claimed
combination may be directed to a subcombination or variation of a
subcombination.
[44] Insubstantial changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of the claims. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements.
[45] The claims are thus to be understood to include what is specifically illustrated and described above, what is conceptionally equivalent, what can be obviously substituted and also what essentially incorporates the essential idea of the embodiments.
Claims (17)
1. A method comprising:
providing a head mounted magnifying camera communicated with a computer system with a head-mounted display having a center of a field of view;
providing a magnified view of a live dental area of interest; actuating image capture of the live dental area of interest and downloading it into the computer system, or uploading an image of a stored dental area of interest from the computer system and matching it to the live dental area of interest;
automatically tracking the image of the live dental area of interest in the computer system; and
keeping the image of the live dental area of interest in the center of the field of view of the head-mounted display.
2. The method of claim 1 further comprising automatically compensating for small movements of the head mounted magnifying camera shifting the center of the field of view by using a tracking program in the computer system to keep the live dental area of interest in the center of the field of view of the head-mounted display.
3. The method of claim 2 further comprising:
providing a micro-pan and tilt mechanism coupled to the camera or a lens of the camera for larger movements of the head that take the live dental area of interest out
of the center of the field of view beyond that which can be compensated by the tracking program; and
automatically moving the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display.
4. The method of claim 1 where automatically tracking the image of the live dental area of interest in the computer system comprises:
providing a micro-pan and tilt mechanism coupled to the camera or a lens of the camera; and
automatically moving the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display.
5. The method of claim 3 further comprising:
storing a wide field of view around the image of the live dental area of interest that exceeds the range of the pan and tilt mechanism's ability to stay pointed at the live dental area of interest;
displaying an "out-of-frame" message or icon in the head-mounted display; controlling the pan and tilt mechanism to direct the head-mounted camera to the live dental area of interest and/or displaying directional arrows in the head-mounted display to cue the user to the direction in which the head-mounted camera needs to be turned in order to bring the live dental area of interest toward the
center of the field of view by using an image recognition program in the computer system using the stored wide field of view around the image of the live dental area of interest;
reacquiring at least a portion of the wide field of view around the image of the live dental area of interest;
identifying the desired live dental area of interest by the image recognition program; and
moving the desired live dental area of interest into the center of the field of view using the tracking program.
6. The method of claim 5 where storing a wide field of view around the image of the live dental area of interest comprises taking and storing a fast image capture of an enhanced wide field of view using a telescopic camera control.
7. The method of claim 6 where the wide field of view is stored in a fast frame shot at the time of image capture during a time interval which is short enough so that the user does not perceive that the camera has taken a telescopic enhanced wide field of view image.
8. The method of claim 5 where storing a wide field of view around the image of the live dental area of interest comprises taking a high resolution wide angle image, but displaying only a center portion of the high resolution wide angle image in the head-mounted display in a magnified view.
9. The method of claim 1 further comprising releasing or erasing the captured image by a deactivation command from the user.
10. An apparatus comprising:
a computer system with a memory;
a head mounted magnifying camera communicated with the computer system for providing a magnified image of a live dental area of interest; and
a head-mounted display having a center of a field of view and communicated with the computer system;
where the computer system automatically tracks the magnified image of the live dental area of interest to keep it in the center of the field of view of the head-mounted display, and where the memory stores a captured image of the live dental area of interest.
11. The apparatus of claim 10 where the computer system automatically compensates for small movements of the head mounted magnifying camera shifting the center of the field of view to keep the live dental area of interest in the center of the field of view of the head-mounted display by image tracking.
12. The apparatus of claim 11 further comprising:
a micro-pan and tilt mechanism coupled to the camera or a lens of the camera for larger movements of the head that take the live dental area of interest out of the
center of the field of view beyond that which can be compensated by the tracking program; and
where the computer system automatically moves the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display by image tracking.
13. The apparatus of claim 10 further comprising:
a micro-pan and tilt mechanism coupled to the camera or a lens of the camera; and
where the computer system automatically moves the camera angle using the micro-pan and tilt mechanism to keep the live dental area of interest in the center of the field of view of the head-mounted display by image tracking.
14. The apparatus of claim 12 where the computer system stores an image with a wide field of view around the image of the live dental area of interest that exceeds the range of the pan and tilt mechanism's ability to stay pointed at the live dental area of interest, displays an "out-of-frame" message or icon in the head-mounted display, controls the pan and tilt mechanism to direct the head-mounted camera to the live dental area of interest and/or displays directional arrows in the head-mounted display to cue the user to the direction in which the head-mounted camera needs to be turned in order to bring the live dental area of interest toward the center of the field of view using image recognition on the stored wide field of view around the image of the live dental area of interest, reacquires at least a portion of the wide field of view around the image of the live dental area of interest, identifies the desired live dental area of interest by image recognition, and moves the desired live dental area of interest into the center of the field of view by image tracking.
15. The apparatus of claim 14 where the computer system stores the wide field of view around the image of the live dental area of interest by taking and storing a fast image capture of an enhanced wide field of view using a telescopic camera control.
16. The apparatus of claim 15 where the computer system takes and stores the fast image capture by taking a fast frame shot at the time of image capture during a time interval which is short enough so that the user does not perceive that the camera has taken a telescopic enhanced wide field of view image.
17. The apparatus of claim 14 where the computer system stores an image with a wide field of view around the image of the live dental area of interest by taking a high resolution wide angle image, but displaying only a center portion of the high resolution wide angle image in the head-mounted display in a magnified view.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/732,483 | 2015-06-05 | ||
US14/732,483 US10473942B2 (en) | 2015-06-05 | 2015-06-05 | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system |
PCT/US2016/032470 WO2016195972A1 (en) | 2015-06-05 | 2016-05-13 | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2016270422A1 AU2016270422A1 (en) | 2017-08-17 |
AU2016270422B2 true AU2016270422B2 (en) | 2020-12-03 |
Family
ID=57441429
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2016270422A Active AU2016270422B2 (en) | 2015-06-05 | 2016-05-13 | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system |
Country Status (6)
Country | Link |
---|---|
US (1) | US10473942B2 (en) |
EP (1) | EP3304173B1 (en) |
CN (1) | CN107850778B (en) |
AU (1) | AU2016270422B2 (en) |
CA (1) | CA2974733C (en) |
WO (1) | WO2016195972A1 (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL244255A (en) * | 2016-02-23 | 2017-04-30 | Vertical Optics Llc | Wearable vision redirecting devices |
DE102017108235A1 (en) * | 2017-04-18 | 2018-10-18 | Bredent Medical Gmbh & Co. Kg | Glasses with at least a partially transparent screen and method of operating glasses |
CN110265121A (en) * | 2017-04-23 | 2019-09-20 | 奥康科技有限公司 | Wearable device and to wearable device positioning method and medium |
US10877262B1 (en) | 2017-06-21 | 2020-12-29 | Itzhak Luxembourg | Magnification glasses with multiple cameras |
CN110611763A (en) * | 2018-06-16 | 2019-12-24 | 仁明(杭州)智能科技有限公司 | System and method for adjusting orientation of image on head-mounted camera |
CN111067468B (en) * | 2019-12-30 | 2023-03-24 | 北京双翼麒电子有限公司 | Method, apparatus, and storage medium for controlling endoscope system |
US11166006B2 (en) | 2020-01-22 | 2021-11-02 | Photonic Medical Inc. | Open view, multi-modal, calibrated digital loupe with depth sensing |
US20220226065A1 (en) * | 2021-01-20 | 2022-07-21 | CinVivo | Camera System for Healthcare |
CN113079315B (en) * | 2021-03-25 | 2022-04-22 | 联想(北京)有限公司 | Image processing method, image processing device, electronic equipment and computer readable storage medium |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050206583A1 (en) * | 1996-10-02 | 2005-09-22 | Lemelson Jerome H | Selectively controllable heads-up display system |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7710654B2 (en) * | 2003-05-12 | 2010-05-04 | Elbit Systems Ltd. | Method and system for improving audiovisual communication |
KR101189550B1 (en) * | 2008-03-21 | 2012-10-11 | 아츠시 타카하시 | Three-dimensional digital magnifier operation supporting system |
JP5036612B2 (en) * | 2008-03-28 | 2012-09-26 | 三洋電機株式会社 | Imaging device |
US10602921B2 (en) * | 2010-06-10 | 2020-03-31 | Cao Group, Inc. | Virtual dental operatory |
US20120056993A1 (en) * | 2010-09-08 | 2012-03-08 | Salman Luqman | Dental Field Visualization System with Improved Ergonomics |
DE202011005311U1 (en) * | 2011-04-15 | 2011-11-15 | Jörg Vollstedt | dental camera |
US9285592B2 (en) | 2011-08-18 | 2016-03-15 | Google Inc. | Wearable device with input and output structures |
US8643951B1 (en) * | 2012-03-15 | 2014-02-04 | Google Inc. | Graphical menu and interaction therewith through a viewing window |
US20150020796A1 (en) * | 2012-09-07 | 2015-01-22 | Hamilton Beach Brands, Inc. | Grill and Method of Assembling Same |
IL221863A (en) * | 2012-09-10 | 2014-01-30 | Elbit Systems Ltd | Digital system for surgical video capturing and display |
US9819843B2 (en) * | 2012-09-20 | 2017-11-14 | Zeriscope Inc. | Head-mounted systems and methods for providing inspection, evaluation or assessment of an event or location |
CN203101728U (en) * | 2012-11-27 | 2013-07-31 | 天津市天堰医教科技开发有限公司 | Head type display for assisting medical operation teaching |
JP5411380B1 (en) * | 2013-05-20 | 2014-02-12 | 正一 中村 | Medical imaging and recording device |
KR102119659B1 (en) * | 2013-09-23 | 2020-06-08 | 엘지전자 주식회사 | Display device and control method thereof |
US20150207961A1 (en) * | 2014-01-17 | 2015-07-23 | James Albert Gavney, Jr. | Automated dynamic video capturing |
US11103122B2 (en) * | 2014-07-15 | 2021-08-31 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
EP3146715B1 (en) * | 2014-05-20 | 2022-03-23 | University Of Washington Through Its Center For Commercialization | Systems and methods for mediated-reality surgical visualization |
IL235073A (en) * | 2014-10-07 | 2016-02-29 | Elbit Systems Ltd | Head-mounted displaying of magnified images locked on an object of interest |
US9990296B2 (en) * | 2015-07-31 | 2018-06-05 | Oracle International Corporation | Systems and methods for prefetching data |
-
2015
- 2015-06-05 US US14/732,483 patent/US10473942B2/en active Active
-
2016
- 2016-05-13 EP EP16803959.2A patent/EP3304173B1/en active Active
- 2016-05-13 CA CA2974733A patent/CA2974733C/en active Active
- 2016-05-13 WO PCT/US2016/032470 patent/WO2016195972A1/en active Application Filing
- 2016-05-13 CN CN201680032065.1A patent/CN107850778B/en active Active
- 2016-05-13 AU AU2016270422A patent/AU2016270422B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050206583A1 (en) * | 1996-10-02 | 2005-09-22 | Lemelson Jerome H | Selectively controllable heads-up display system |
Also Published As
Publication number | Publication date |
---|---|
EP3304173A4 (en) | 2018-05-30 |
CN107850778A (en) | 2018-03-27 |
CN107850778B (en) | 2020-09-29 |
CA2974733C (en) | 2023-05-23 |
EP3304173B1 (en) | 2020-04-29 |
US10473942B2 (en) | 2019-11-12 |
WO2016195972A1 (en) | 2016-12-08 |
EP3304173A1 (en) | 2018-04-11 |
US20160358327A1 (en) | 2016-12-08 |
CA2974733A1 (en) | 2016-12-08 |
AU2016270422A1 (en) | 2017-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2016270422B2 (en) | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system | |
CN103917913B (en) | Head mounted display, the method controlling optical system and computer-readable medium | |
US20170147880A1 (en) | Staredown to Produce Changes in Information Density and Type | |
JP6751401B2 (en) | Improving visual perception of displayed color symbology | |
CN104067160B (en) | Make picture material method placed in the middle in display screen using eyes tracking | |
US8971570B1 (en) | Dual LED usage for glint detection | |
US20180067312A1 (en) | Graphic Interface for Real-Time Vision Enhancement | |
US11844573B2 (en) | Active visual alignment stimuli in fundus photography | |
EP3725254A2 (en) | Microsurgery system with a robotic arm controlled by a head-mounted display | |
JP2014505897A5 (en) | ||
US20130106674A1 (en) | Eye Gaze Detection to Determine Speed of Image Movement | |
CN107749952B (en) | Intelligent unmanned photographing method and system based on deep learning | |
JPH11501403A (en) | Microscopes, especially surgical microscopes | |
US9916771B2 (en) | Portable vision aid with motion pan | |
CN110100199B (en) | System and method for acquisition, registration and multimedia management | |
Yamashita et al. | 8K ultra-high-definition microscopic camera for ophthalmic surgery | |
CN110554501B (en) | Head mounted display and method for determining line of sight of user wearing the same | |
EP3047883A1 (en) | Compressible eyecup assemblies in a virtual reality headset | |
CN106331411A (en) | Digital slice manufacturing device | |
KR20180062953A (en) | Display apparatus and method of displaying using context display and projectors | |
KR200294068Y1 (en) | Digital Microscope | |
JP7145944B2 (en) | Display device and display method using means for providing visual cues | |
US20030112506A1 (en) | Method and apparatus for viewing a scene | |
CN113366367A (en) | Electronic magnifier | |
US10620432B1 (en) | Devices and methods for lens position adjustment based on diffraction in a fresnel lens |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) |