US20200178785A1 - Virtual dental operatory - Google Patents

Virtual dental operatory

Info

Publication number
US20200178785A1
US20200178785A1 (Application US16/790,572)
Authority
US
United States
Prior art keywords
image
virtual
rendering
operatory
dental
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/790,572
Inventor
Steven D Jensen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cao Group LLC
Cao Group Inc
Original Assignee
Cao Group LLC
Cao Group Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cao Group LLC, Cao Group Inc filed Critical Cao Group LLC
Priority to US16/790,572 priority Critical patent/US20200178785A1/en
Assigned to CAO GROUP LLC reassignment CAO GROUP LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JENSEN, STEVEN D
Publication of US20200178785A1 publication Critical patent/US20200178785A1/en
Abandoned legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/24Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/14Applications or adaptations for dentistry
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/46Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment with special arrangements for interfacing with the operator or the patient
    • A61B6/461Displaying means of special interest
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/48Diagnostic techniques
    • A61B6/486Diagnostic techniques involving generating temporal series of image data
    • A61B6/51
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B6/56Details of data transmission or power supply, e.g. use of slip rings
    • A61B6/563Details of data transmission or power supply, e.g. use of slip rings involving image data transmission via a network

Definitions

  • the present invention relates to the field of dental operatory and more particularly relates to devices, systems and methods of providing virtual live views of a treatment site.
  • the modern dental operatory is centered on an adjustable dental chair.
  • the dental chair has the purpose of providing the patient with a comfortable seat while at the same time allowing the dental professional to maneuver the dental chair in multiple directions to provide easier access to the patient's oral cavity.
  • a dental professional can recline, raise, lower, or spin the dental chair in order to maneuver the patient to allow the dental professional the best access to the patient's oral cavity to perform dental treatment procedures.
  • the oral cavity is generally a small area in which it is difficult to perform dental treatment procedures.
  • dental treatment procedures usually include performing high tolerance procedures on a very small scale. Due to the high tolerances and small scale of most dental treatment procedures, dental professionals usually try to use some form of magnification to enhance their ability to perform the dental treatment procedure.
  • the nature of the oral cavity often makes it very difficult for a dental professional to use magnification tools effectively.
  • many dental professionals may use magnification lenses that attach to a pair of conventional glasses and are hand adjustable. The magnification lenses, however, are usually heavy and are an annoyance when worn for long periods of time.
  • the oral cavity is generally a dark area, which further increases the challenges associated with performing dental treatment procedures.
  • most dental professionals now use a high-powered overhead light to flood the oral cavity with light to help the dental professional adequately see the treatment site.
  • although the high-powered lights help provide light within the oral cavity, many times the dental professional's hands or tools block the light, thus limiting the effectiveness of the high-powered light.
  • Embodiments of the present invention include methods, devices, and systems that light and magnify a treatment site within the oral cavity without interfering with a dental treatment procedure.
  • example embodiments of the present invention provide a virtual operatory system that includes an image capture subsystem connected to an image display subsystem.
  • the image capture subsystem can capture and communicate a live image of a treatment site to the image display subsystem for the dental professional to view while performing a dental treatment procedure.
  • the virtual operatory system can also include an image rendering subsystem that can render the live image of the treatment site in one or more ways to provide an improved visual to the dental professional of the treatment site.
  • FIG. 1 illustrates various exemplary devices and components of a virtual dental operatory system
  • FIG. 2 illustrates an exemplary implementation of the system of FIG. 1 according to principles described herein;
  • FIG. 3 illustrates exemplary components of an image capture subsystem according to principles described herein;
  • FIG. 4 illustrates exemplary components of an image rendering subsystem according to principles described herein;
  • FIG. 5 illustrates exemplary components of an image display subsystem according to principles described herein;
  • FIG. 6 illustrates an exemplary method of performing virtual dental operatory
  • FIG. 7 illustrates an example computing device according to principles described herein.
  • Embodiments of the present invention include methods, devices, and systems that light and magnify a treatment site within the oral cavity without interfering with a dental treatment procedure.
  • example embodiments of the present invention provide a virtual operatory system that includes an image capture subsystem connected to an image display subsystem.
  • the image capture subsystem can capture and communicate a live image of a treatment site to the image display subsystem for the dental professional to view while performing a dental treatment procedure.
  • the virtual operatory system can also include an image rendering subsystem that can render the live image of the treatment site in one or more ways to provide an improved visual to the dental professional of the treatment site.
  • the virtual dental operatory system allows the treatment site located within an oral cavity of a patient to be magnified without having to use heavy, bulky optical magnifiers that are often uncomfortable to wear. Moreover, the virtual dental operatory system can provide an image where the treatment site is lighted well, or at least where the image of the treatment site is rendered such that the treatment site appears lighted. Furthermore, the virtual dental operatory system is configured such that the dental professional can perform the procedure while minimizing the opportunity for the dental professional or the dental professional's tools to block the lighting effect.
  • FIG. 1 illustrates one example embodiment of a virtual dental operatory system 100 .
  • operatory means any dental procedure, including but not limited to removal of a cavity, adding fillings, root canals, bridges, orthodontics, surgery, and/or any or all treatments that take place around or within the oral cavity.
  • the virtual dental operatory system 100 can be arranged and configured to cooperate with a standard dental chair wherein a patient 102 sits while a dental professional 104 performs a dental treatment procedure.
  • although FIG. 1 illustrates one arrangement, the virtual dental operatory system 100 can be arranged in almost any arrangement as will be described further below.
  • FIG. 1 illustrates that the virtual dental operatory system 100 can include an image capture subsystem 200 .
  • the image capture subsystem 200 is configured to capture an image of a treatment site and communicate that image to various other subsystems within the virtual dental operatory system 100 .
  • the image capture subsystem 200 can be configured to capture and communicate a live video image of a treatment site.
  • portions of the image capture subsystem 200 can be mounted to a wall or ceiling by way of an adjustable mounting system 106 that allows the dental professional 104 to position the image capture subsystem 200 in the best location to capture the best image.
  • portions of the image capture subsystem 200 can be located directly on dental tools that the dental professional is using, for example, a dental drill housing or similar dental tool.
  • portions of the image capture subsystem 200 can be a free standing hand-held tool that a dental assistant can maneuver and position while the dental professional 104 performs a dental treatment procedure.
  • the image capture subsystem can capture an image of at least a portion of the patient's 102 oral cavity 110 .
  • the image capture subsystem can then communicate the image of the oral cavity 110 to an image display subsystem 400 a and/or 400 b such that the dental professional can view the image during the dental procedure.
  • FIG. 1 illustrates that part of the image display subsystem 400 a and/or 400 b may include a monitor that is mounted to a wall or ceiling by way of an adjustable mounting system 108 that allows the dental professional to maneuver the monitor to almost any position or location.
  • the monitor can display a virtual view 112 of the treatment site within the oral cavity 110 .
  • the image display subsystem 400 a and/or 400 b can include goggles that include a display that allows the dental professional 104 to view the virtual view 112 of the treatment site within the oral cavity by wearing the goggles.
  • FIG. 2 further illustrates an example embodiment of the virtual dental operatory system 100 .
  • the virtual dental operatory system 100 can include the image capture subsystem 200 that can capture an image of a treatment area within a patient's oral cavity to produce a captured image.
  • the image capture subsystem 200 can be communicably connected to an image rendering subsystem 300 .
  • the image capture subsystem 200 can communicate the captured image to the image rendering subsystem 300 .
  • the image rendering subsystem 300 can apply one or more rendering routines to the captured image to produce a rendered image.
  • the image rendering subsystem 300 can be communicably connected to an image display subsystem 400 .
  • the image rendering subsystem 300 can communicate the rendered image to the image display subsystem 400 .
  • the image display subsystem 400 can display the rendered image on one or more image display devices.
  • the image capture subsystem 200 , the image rendering subsystem 300 and the image display subsystem 400 can communicate using any suitable communication technologies, devices, networks, media, and protocols supportive of remote data communications.
  • the subsystems can communicate over a network using any communication platforms and technologies suitable for transporting captured images and/or communication signals, including known communication technologies, devices, transmission media, and protocols supportive of remote data communications, examples of which include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Mark-up Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Evolution Data Optimized Protocol (“EVDO”), Time Division Multiple Access (“TDMA”) technologies, radio frequency (“RF”) signaling technologies, wireless communication technologies (e.g., Bluetooth, Wi-Fi, etc.), optical transport and signaling technologies, live transmission technologies (e.g., media streaming technologies), media file transfer technologies, in-band and out-of-band signaling technologies, and other suitable communications technologies.
  • the network may include one or more networks or types of networks (and communication links thereto) capable of carrying communications, captured images, and/or data signals between the subsystems.
  • the network may include, but is not limited to, one or more wireless networks, cable networks, hybrid fiber coax networks, optical fiber networks, broadband networks, narrowband networks, the Internet, wide area networks, local area networks, public networks, private networks, packet-switched networks, and any other networks capable of carrying data and/or communications signals between the subsystems. Communications between the subsystems may be transported using any one of above-listed networks, or any combination or sub-combination of the above-listed networks.
  • the various subsystems illustrated in FIGS. 1 and 2 can each include various facilities that assist them in performing various tasks within the virtual dental operatory system 100 .
  • FIG. 3 illustrates one example of the image capture subsystem 200 .
  • the image capture subsystem 200 can include a capture facility 202 .
  • the capture facility 202 can include any devices that provide for the capture of an image, and in particular, the capture of a live video image.
  • the capture facility 202 can include a camera.
  • the camera can be a live feed auto-focus camera.
  • the camera can have various optical magnification lenses to provide optical magnification of the treatment site.
  • for example, the SONY ALPHA A350 is a camera with live-feed and auto-focus capabilities.
  • the capture facility can be configured to focus on a particular instrument such that the camera's focus automatically adjusts depending on the location of the particular instrument within the oral cavity. This feature can be provided by image recognition routines that are part of the capture facility 202 .
  • the image capture subsystem 200 can further include a communication facility 204 .
  • Communication facility 204 may be configured to facilitate communication between the image capture subsystem and the image rendering subsystem 300 and/or the image display subsystem 400 .
  • communication facility 204 may be configured to transmit and/or receive communication signals, captured images, metadata and/or any other data to/from the image capture subsystem 200 .
  • communication facility 204 may transmit data representative of one or more captured images to the image rendering subsystem 300 .
  • Data representative of captured images may be transmitted in one or more media content streams, as one or more data files, or in any other suitable manner as may serve a particular implementation.
  • Communication facility 204 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • the image capture subsystem 200 can include a storage facility 206 .
  • Storage facility 206 may be configured to maintain captured data 208 representative of one or more captured images. It will be recognized that storage facility 206 may maintain additional or alternative data as may serve a particular implementation.
  • the captured data can be used by the dentist to demonstrate a particular procedure to another patient or an insurance company, or can be kept on file for other purposes; for example, if a particular procedure fails, the dentist can review the procedure in an attempt to identify the cause of failure.
  • FIG. 4 illustrates an example embodiment of the image rendering subsystem 300 .
  • the image rendering subsystem 300 can include a rendering facility 302 configured to perform one or more rendering routines on the captured image.
  • the rendering routines can perform digital enhancements that render or change the captured image to a rendered image.
  • the rendering or changes to the captured image can enhance the captured image in one or more ways such that the resulting rendered image allows the dental professional a superior view or visual access to the treatment site.
  • rendering routines performed on the captured image can magnify, brighten, increase/decrease contrast, enlarge, zoom in, zoom out, provide a three-dimensional model, change colors, add lighting effects, and/or change or render the captured image in any way to produce a customized rendered image for the dental professional's use.
  • the image rendering subsystem 300 can include a user input interface that allows the dental professional to customize the rendered image by inputting rendering specifications.
  • the rendering facility can be provided with pre-set rendering routines that easily allow the dental professional to render the captured image in a preset way and to switch back and forth between two or more pre-set rendering routines during a procedure.
  • the image rendering subsystem 300 can further include a communication facility 304 .
  • Communication facility 304 may be configured to facilitate communication between the image rendering subsystem 300 and the image capture subsystem 200 and/or the image display subsystem 400 .
  • communication facility 304 may be configured to transmit and/or receive communication signals, captured images, metadata and/or any other data to/from the image rendering subsystem 300 .
  • communication facility 304 may transmit data representative of one or more rendered images to the image display subsystem 400 .
  • Data representative of rendered images may be transmitted in one or more media content streams, as one or more data files, or in any other suitable manner as may serve a particular implementation.
  • Communication facility 304 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • the image rendering subsystem 300 can include a storage facility 306 .
  • Storage facility 306 may be configured to maintain rendered data 308 representative of one or more rendered images. It will be recognized that storage facility 306 may maintain additional or alternative data as may serve a particular implementation.
  • the rendered data 308 can be used by the dentist to demonstrate a particular procedure to another patient or an insurance company, or can be kept on file for other purposes; for example, if a particular procedure fails, the dentist can review the rendered data in an attempt to identify the cause of failure.
  • FIG. 5 illustrates an example embodiment of the image display subsystem 400 .
  • the image display subsystem 400 can include a presentation facility 402 .
  • the presentation facility 402 can use data received from either the image capture subsystem 200 or the image rendering subsystem 300 to create a visual depiction of the treatment site.
  • the presentation facility 402 can include a monitor or projector that displays a visual depiction of the treatment site.
  • the presentation facility 402 can include goggles (e.g., virtual reality goggles) that are capable of producing a visual depiction of the treatment site to the wearer of the goggles.
  • the goggles can be TECHWOOD TG-06V IC Goggles from WELTON ELECTRONICS. The goggles can be worn like traditional glasses or goggles with the depiction of the treatment site viewed directly through the goggles.
  • the dental professional can perform a dental procedure on the patient, while directly monitoring an enhanced view of the dental procedure through the virtual operatory system 100 .
  • the dental professional is provided with a superior view of the oral cavity during the procedure (including magnification and lighting effects).
  • the image display subsystem 400 can further include a communication facility 404 .
  • Communication facility 404 may be configured to facilitate communication between the image display subsystem 400 and the image capture subsystem 200 and/or the image rendering subsystem 300 .
  • communication facility 404 may be configured to transmit and/or receive communication signals, captured images, rendered images, metadata and/or any other data to/from the image display subsystem 400 .
  • communication facility 404 may receive data representative of one or more rendered images from the image rendering subsystem 300 .
  • Data representative of rendered images may be transmitted in one or more media content streams, as one or more data files, or in any other suitable manner as may serve a particular implementation.
  • Communication facility 404 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • FIG. 6 illustrates an exemplary method 600 of virtual dental operatory. While FIG. 6 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 6 . The steps shown in FIG. 6 may be performed by any component or combination of components of the virtual dental operatory system 100 .
  • an image representative of a treatment site in a patient's oral cavity is captured during the performance of a dental treatment procedure to produce a captured image.
  • FIG. 3 illustrates that the image capture subsystem 200 can include the capture facility that is configured to capture an image of the treatment site.
  • a camera or similar image capturing device can be used to capture the image.
  • the captured image is rendered to enhance the visual properties of the treatment site to produce a rendered image.
  • FIG. 4 illustrates that the image rendering subsystem 300 can include a rendering facility that is configured to apply one or more rendering routines to the captured image to produce a rendered image that is enhanced and customized to provide a superior visual depiction of the treatment site.
  • the rendered image is displayed to the dental professional during the performance of the dental treatment procedure.
  • FIG. 5 illustrates that the image display subsystem 400 can include a presentation facility that is configured to display the rendered image.
  • virtual reality goggles can be used to display the rendered image to the dental professional.
  • one or more of the components and/or processes described herein may be implemented and/or performed by one or more appropriately configured computing devices.
  • one or more of the systems and/or components described above may include or be implemented by any computer hardware and/or computer-implemented instructions (e.g., software), or combinations of computer-implemented instructions and hardware, configured to perform one or more of the processes described herein.
  • system components may be implemented on one physical computing device or may be implemented on more than one physical computing device. Accordingly, system components may include any number of computing devices, and may employ any of a number of computer operating systems.
  • one or more of the processes described herein may be implemented at least in part as instructions executable by one or more computing devices.
  • a processor (e.g., a microprocessor) receives instructions from a tangible computer-readable medium (e.g., a memory, etc.) and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
  • Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
  • a computer-readable medium includes any medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and/or volatile media.
  • Non-volatile media may include, for example, optical or magnetic disks and other persistent memory.
  • Volatile media may include, for example, dynamic random-access memory (“DRAM”), which typically constitutes a main memory.
  • Computer-readable media include, for example, a floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other tangible medium from which a computer may read.
  • FIG. 7 illustrates an exemplary computing device 700 that may be configured to perform one or more of the processes described herein.
  • computing device 700 may include a communication interface 702 , a processor 704 , a storage device 706 , and an input/output (“I/O”) module 708 communicatively connected via a communication infrastructure 710 .
  • While an exemplary computing device 700 is shown in FIG. 7, the components illustrated in FIG. 7 are not intended to be limiting. Additional or alternative components may be used in other embodiments. Components of computing device 700 shown in FIG. 7 will now be described in additional detail.
  • Communication interface 702 may be configured to communicate with one or more computing devices. Examples of communication interface 702 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, and any other suitable interface. In at least one embodiment, communication interface 702 may provide a direct connection between system 100 and one or more of provisioning systems via a direct link to a network, such as the Internet. Communication interface 702 may additionally or alternatively provide such a connection through, for example, a local area network (such as an Ethernet network), a personal area network, a telephone or cable network, a satellite data connection, a dedicated URL, or any other suitable connection. Communication interface 702 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • Processor 704 generally represents any type or form of processing unit capable of processing data or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 704 may direct execution of operations in accordance with one or more applications 712 or other computer-executable instructions such as may be stored in storage device 706 or another computer-readable medium.
  • Storage device 706 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or device.
  • storage device 706 may include, but is not limited to, a hard drive, network drive, flash drive, magnetic disc, optical disc, random access memory (“RAM”), dynamic RAM (“DRAM”), other non-volatile and/or volatile data storage units, or a combination or sub-combination thereof.
  • Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 706 .
  • data representative of one or more executable applications 712 (which may include, but are not limited to, one or more of the software applications described herein) configured to direct processor 704 to perform any of the operations described herein may be stored within storage device 706 .
  • data may be arranged in one or more databases residing within storage device 706 .
  • I/O module 708 may be configured to receive user input and provide user output and may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities.
  • I/O module 708 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touch screen component (e.g., touch screen display), a receiver (e.g., an RF or infrared receiver), and/or one or more input buttons.
  • I/O module 708 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers.
  • I/O module 708 is configured to provide graphical data to a display for presentation to a user.
  • the graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
  • any of the facilities described herein may be implemented by or within one or more components of computing device 700 .
  • one or more applications 712 residing within storage device 706 may be configured to direct processor 704 to perform one or more processes or functions associated with capture facility 202 , communication facility 204 , rendering facility 302 , communication facility 304 , presentation facility 402 , and/or communication facility 404 .
  • storage facilities 206 and/or 306 may be implemented by or within storage device 706 .

Abstract

Embodiments of the present invention include methods, devices, and systems that light and magnify a treatment site within the oral cavity without interfering with a dental treatment procedure. In particular, example embodiments of the present invention provide a virtual operatory system that includes an image capture subsystem connected to an image display subsystem. For example, the image capture subsystem can capture and communicate a live image of a treatment site to the image display subsystem for the dental professional to view while performing a dental treatment procedure. In at least one example embodiment, the virtual operatory system can also include an image rendering subsystem that can render the live image of the treatment site in one or more ways to provide an improved visual to the dental professional of the treatment site.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority as a continuation of U.S. application Ser. No. 13/158,067, filed Jun. 10, 2011, which in turn claims the benefit of U.S. Provisional Patent Application No. 61/353,614, filed Jun. 10, 2010, the contents of both applications being incorporated herein by reference in their entirety.
  • TECHNICAL FIELD OF THE INVENTION
  • The present invention relates to the field of dental operatory and more particularly relates to devices, systems and methods of providing virtual live views of a treatment site.
  • BACKGROUND OF THE INVENTION
  • The modern dental operatory is centered on an adjustable dental chair. The dental chair has the purpose of providing the patient with a comfortable seat while at the same time allowing the dental professional to maneuver the dental chair in multiple directions to provide easier access to the patient's oral cavity. For example, a dental professional can recline, raise, lower, or spin the dental chair in order to maneuver the patient to allow the dental professional the best access to the patient's oral cavity to perform dental treatment procedures.
  • Although the dental chair can assist the dental professional in gaining access to the patient's oral cavity, the oral cavity is generally a small area in which it is difficult to perform dental treatment procedures. For example, dental treatment procedures usually include performing high tolerance procedures on a very small scale. Due to the high tolerances and small scale of most dental treatment procedures, dental professionals usually try to use some form of magnification to enhance their ability to perform the dental treatment procedure. The nature of the oral cavity, however, often makes it very difficult for a dental professional to use magnification tools effectively. For example, many dental professionals may use magnification lenses that attach to a pair of conventional glasses and are hand adjustable. The magnification lenses, however, are usually heavy and are an annoyance when worn for long periods of time.
  • In addition to the small scale of dental treatment procedures, the oral cavity is generally a dark area, which further increases the challenges associated with performing dental treatment procedures. In particular, most dental professionals now use a high-powered overhead light to flood the oral cavity with light to help the dental professional adequately see the treatment site. Although the high-powered lights help provide light within the oral cavity, many times the dental professional's hands or tools block the light, thus limiting the effectiveness of the high-powered light.
  • What is needed in the art are systems, devices and methods that easily allow a dental professional to light and magnify a treatment site within the oral cavity without interfering with the dental treatment procedure.
  • SUMMARY OF THE INVENTION
  • Embodiments of the present invention include methods, devices, and systems that light and magnify a treatment site within the oral cavity without interfering with a dental treatment procedure. In particular, example embodiments of the present invention provide a virtual operatory system that includes an image capture subsystem connected to an image display subsystem. For example, the image capture subsystem can capture and communicate a live image of a treatment site to the image display subsystem for the dental professional to view while performing a dental treatment procedure. In at least one example embodiment, the virtual operatory system can also include an image rendering subsystem that can render the live image of the treatment site in one or more ways to provide an improved visual to the dental professional of the treatment site.
  • Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the invention. The features and advantages of the invention may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features of the present invention will become more fully apparent from the following description and appended claims or may be learned by the practice of the invention as set forth hereinafter.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific example implementations thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical implementations of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
  • FIG. 1 illustrates various exemplary devices and components of a virtual dental operatory system;
  • FIG. 2 illustrates an exemplary implementation of the system of FIG. 1 according to principles described herein;
  • FIG. 3 illustrates exemplary components of an image capture subsystem according to principles described herein;
  • FIG. 4 illustrates exemplary components of an image rendering subsystem according to principles described herein;
  • FIG. 5 illustrates exemplary components of an image display subsystem according to principles described herein;
  • FIG. 6 illustrates an exemplary method of performing virtual dental operatory; and
  • FIG. 7 illustrates an example computing device according to principles described herein.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments of the present invention include methods, devices, and systems that light and magnify a treatment site within the oral cavity without interfering with a dental treatment procedure. In particular, example embodiments of the present invention provide a virtual operatory system that includes an image capture subsystem connected to an image display subsystem. For example, the image capture subsystem can capture and communicate a live image of a treatment site to the image display subsystem for the dental professional to view while performing a dental treatment procedure. In at least one example embodiment, the virtual operatory system can also include an image rendering subsystem that can render the live image of the treatment site in one or more ways to provide an improved visual to the dental professional of the treatment site.
  • As will be further explained below, the virtual dental operatory system allows the treatment site located within an oral cavity of a patient to be magnified without having to use heavy, bulky optical magnifiers that are often uncomfortable to wear. Moreover, the virtual dental operatory system can provide an image where the treatment site is lighted well, or at least where the image of the treatment site is rendered such that the treatment site appears lighted. Furthermore, the virtual dental operatory system is configured such that the dental professional can perform the procedure while minimizing the opportunity for the dental professional or the dental professional's tools to block the lighting effect.
  • FIG. 1 illustrates one example embodiment of a virtual dental operatory system 100. As used herein, operatory means any dental procedure, including but not limited to removal of a cavity, adding fillings, root canals, bridges, orthodontics, surgery, and/or any or all treatments that take place around or within the oral cavity. As illustrated in FIG. 1, the virtual dental operatory system 100 can be arranged and configured to cooperate with a standard dental chair wherein a patient 102 sits while a dental professional 104 performs a dental treatment procedure. Although FIG. 1 illustrates one arrangement, the virtual dental operatory system 100 can be arranged in almost any arrangement as will be described further below.
  • FIG. 1 illustrates that the virtual dental operatory system 100 can include an image capture subsystem 200. As illustrated in FIG. 1, the image capture subsystem 200 is configured to capture an image of a treatment site and communicate that image to various other subsystems within the virtual dental operatory system 100. In particular, the image capture subsystem 200 can be configured to capture and communicate a live video image of a treatment site.
  • As illustrated in FIG. 1, portions of the image capture subsystem 200 can be mounted to a wall or ceiling by way of an adjustable mounting system 106 that allows the dental professional 104 to position the image capture subsystem 200 in the best location to capture the best image. In alternative embodiments, portions of the image capture subsystem 200 can be located directly on dental tools that the dental professional is using, for example, a dental drill housing or similar dental tool. Alternatively, portions of the image capture subsystem 200 can be a free standing hand-held tool that a dental assistant can maneuver and position while the dental professional 104 performs a dental treatment procedure.
  • As shown in FIG. 1, the image capture subsystem can capture an image of at least a portion of the patient's 102 oral cavity 110. The image capture subsystem can then communicate the image of the oral cavity 110 to an image display subsystem 400 a and/or 400 b such that the dental professional can view the image during the dental procedure. For example, FIG. 1 illustrates that part of the image display subsystem 400 a and/or 400 b may include a monitor that is mounted to a wall or ceiling by way of an adjustable mounting system 108 that allows the dental professional to maneuver the monitor to almost any position or location. As illustrated, the monitor can display a virtual view 112 of the treatment site within the oral cavity 110. In an alternative, or in the same embodiment, the image display subsystem 400 a and/or 400 b can include goggles that include a display that allows the dental professional 104 to view the virtual view 112 of the treatment site within the oral cavity by wearing the goggles. These and other components will be explained in more detail below with reference to FIGS. 2 through 5.
  • In addition to the various example components and subsystems illustrated in FIG. 1, FIG. 2 further illustrates an example embodiment of the virtual dental operatory system 100. As discussed above with reference to FIG. 1, the virtual dental operatory system 100 can include the image capture subsystem 200 that can capture an image of a treatment area within a patient's oral cavity to produce a captured image.
  • As illustrated in FIG. 2, the image capture subsystem 200 can be communicably connected to an image rendering subsystem 300. The image capture subsystem 200 can communicate the captured image to the image rendering subsystem 300. After receiving the captured image, the image rendering subsystem 300 can apply one or more rendering routines to the captured image to produce a rendered image.
  • As further illustrated in FIG. 2, the image rendering subsystem 300 can be communicably connected to an image display subsystem 400. In particular, the image rendering subsystem 300 can communicate the rendered image to the image display subsystem 400. After receiving the rendered image, the image display subsystem 400 can display the rendered image on one or more image display devices.
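  • To make the FIG. 2 data flow concrete, the following minimal Python sketch (an editorial illustration only; the class and method names are assumptions and do not appear in the original disclosure) wires three stand-in subsystems together in the same capture-render-display order:

```python
# Minimal sketch of the FIG. 2 pipeline: capture -> render -> display.
# All class and method names are illustrative assumptions, not part of the patent.
from dataclasses import dataclass, field
from typing import Callable, List

import numpy as np


@dataclass
class ImageCaptureSubsystem:
    """Stand-in for the image capture subsystem 200: produces captured images."""
    source: Callable[[], np.ndarray]

    def capture(self) -> np.ndarray:
        return self.source()


@dataclass
class ImageRenderingSubsystem:
    """Stand-in for the image rendering subsystem 300: applies rendering routines."""
    routines: List[Callable[[np.ndarray], np.ndarray]] = field(default_factory=list)

    def render(self, captured: np.ndarray) -> np.ndarray:
        rendered = captured
        for routine in self.routines:
            rendered = routine(rendered)
        return rendered


@dataclass
class ImageDisplaySubsystem:
    """Stand-in for the image display subsystem 400: shows the rendered image."""
    sink: Callable[[np.ndarray], None]

    def display(self, rendered: np.ndarray) -> None:
        self.sink(rendered)


if __name__ == "__main__":
    # Hypothetical wiring with a dummy grey frame and a print-only display sink.
    capture = ImageCaptureSubsystem(source=lambda: np.full((480, 640, 3), 128, np.uint8))
    rendering = ImageRenderingSubsystem(routines=[lambda img: img])
    display = ImageDisplaySubsystem(sink=lambda img: print("displaying", img.shape))
    display.display(rendering.render(capture.capture()))
```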
  • The image capture subsystem 200, the image rendering subsystem 300 and the image display subsystem 400 (the “subsystems”) can communicate using any suitable communication technologies, devices, networks, media, and protocols supportive of remote data communications. For example, the subsystems can communicate over a network using any communication platforms and technologies suitable for transporting captured images and/or communication signals, including known communication technologies, devices, transmission media, and protocols supportive of remote data communications, examples of which include, but are not limited to, data transmission media, communications devices, Transmission Control Protocol (“TCP”), Internet Protocol (“IP”), File Transfer Protocol (“FTP”), Telnet, Hypertext Transfer Protocol (“HTTP”), Hypertext Transfer Protocol Secure (“HTTPS”), Session Initiation Protocol (“SIP”), Simple Object Access Protocol (“SOAP”), Extensible Mark-up Language (“XML”) and variations thereof, Simple Mail Transfer Protocol (“SMTP”), Real-Time Transport Protocol (“RTP”), User Datagram Protocol (“UDP”), Global System for Mobile Communications (“GSM”) technologies, Code Division Multiple Access (“CDMA”) technologies, Evolution Data Optimized Protocol (“EVDO”), Time Division Multiple Access (“TDMA”) technologies, radio frequency (“RF”) signaling technologies, wireless communication technologies (e.g., Bluetooth, Wi-Fi, etc.), optical transport and signaling technologies, live transmission technologies (e.g., media streaming technologies), media file transfer technologies, in-band and out-of-band signaling technologies, and other suitable communications technologies.
  • Moreover, the network may include one or more networks or types of networks (and communication links thereto) capable of carrying communications, captured images, and/or data signals between the subsystems. For example, the network may include, but is not limited to, one or more wireless networks, cable networks, hybrid fiber coax networks, optical fiber networks, broadband networks, narrowband networks, the Internet, wide area networks, local area networks, public networks, private networks, packet-switched networks, and any other networks capable of carrying data and/or communications signals between the subsystems. Communications between the subsystems may be transported using any one of above-listed networks, or any combination or sub-combination of the above-listed networks.
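  • As one purely illustrative possibility among the transports listed above, image data could be JPEG-encoded and pushed over a plain TCP socket with a simple length prefix. The sketch below assumes OpenCV for the camera and encoder and makes up the host, port, and framing; none of these choices are required by the system:

```python
# Hypothetical sketch: stream JPEG-encoded captured frames over TCP.
# TCP is only one of the many transports listed above; host/port are invented.
import socket
import struct

import cv2  # OpenCV, used here as a convenient camera and JPEG encoder


def stream_frames(host: str = "127.0.0.1", port: int = 5005) -> None:
    camera = cv2.VideoCapture(0)              # camera of the capture facility
    sock = socket.create_connection((host, port))
    try:
        while True:
            ok, frame = camera.read()
            if not ok:
                break
            ok, jpeg = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 90])
            if not ok:
                continue
            payload = jpeg.tobytes()
            # 4-byte big-endian length prefix, then the JPEG payload.
            sock.sendall(struct.pack(">I", len(payload)) + payload)
    finally:
        camera.release()
        sock.close()
```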
  • The various subsystems illustrated in FIGS. 1 and 2 can each include various facilities that assist the various subsystems in performing various tasks within the virtual dental operatory system 100. For example, FIG. 3 illustrates one example of the image capture subsystem 200. As illustrated, the image capture subsystem 200 can include a capture facility 202. The capture facility 202 can include any devices that provide for the capture of an image, and in particular, the capture of a live video image.
  • For example, the capture facility 202 can include a camera. In one embodiment, the camera can be a live-feed auto-focus camera. The camera can have various optical magnification lenses to provide optical magnification of the treatment site. For example, the SONY ALPHA A350 is a camera with live-feed and auto-focus capabilities.
  • The capture facility can be configured to focus on a particular instrument such that the camera's focus automatically adjusts depending on the location of the particular instrument within the oral cavity. This feature can be provided by image recognition routines that are part of the capture facility 202.
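  • The disclosure does not specify how such image recognition routines would be implemented. As a loudly hypothetical stand-in, the sketch below locates the brightest (most reflective) region in each frame, treats it as the instrument tip, and keeps a digital zoom window centered on it:

```python
# Hypothetical stand-in for "focus follows the instrument": find the brightest
# blob in the frame, assume it is the reflective instrument tip, and zoom there.
import cv2
import numpy as np


def instrument_centre(frame: np.ndarray) -> tuple:
    """Very crude instrument detector: centroid of the brightest pixels."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 230, 255, cv2.THRESH_BINARY)
    moments = cv2.moments(mask)
    if moments["m00"] == 0:                      # nothing bright enough found
        h, w = gray.shape
        return w // 2, h // 2
    return (int(moments["m10"] / moments["m00"]),
            int(moments["m01"] / moments["m00"]))


def zoom_on_instrument(frame: np.ndarray, zoom: float = 2.0) -> np.ndarray:
    """Digitally magnify a window centred on the detected instrument."""
    h, w = frame.shape[:2]
    cx, cy = instrument_centre(frame)
    half_w, half_h = int(w / (2 * zoom)), int(h / (2 * zoom))
    x0 = min(max(cx - half_w, 0), w - 2 * half_w)
    y0 = min(max(cy - half_h, 0), h - 2 * half_h)
    crop = frame[y0:y0 + 2 * half_h, x0:x0 + 2 * half_w]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
```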
  • The image capture subsystem 200 can further include a communication facility 204. Communication facility 204 may be configured to facilitate communication between the image capture subsystem and the image rendering subsystem 300 and/or the image display subsystem 400. In particular, communication facility 204 may be configured to transmit and/or receive communication signals, captured images, metadata and/or any other data to/from the image capture subsystem 200. For example, communication facility 204 may transmit data representative of one or more captured images to the image rendering subsystem 300.
  • Data representative of captured images may be transmitted in one or more media content streams, as one or more data files, or in any other suitable manner as may serve a particular implementation. Communication facility 204 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • In addition to the communication facility 204, the image capture subsystem 200 can include a storage facility 206. Storage facility 206 may be configured to maintain captured data 208 representative of one or more captured images. It will be recognized that storage facility 206 may maintain additional or alternative data as may serve a particular implementation. In one example, the captured data can be used by the dentist to demonstrate a particular procedure to another patient or an insurance company, or can be kept on file for other purposes; for example, if a particular procedure fails, the dentist can review the procedure in an attempt to identify the cause of failure.
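  • A storage facility for this captured data could be as simple as writing the live feed to a timestamped video file that can later be replayed for a patient, an insurer, or a failure review. The directory layout, codec, and class name below are assumptions made only for illustration:

```python
# Hypothetical sketch of storage facility 206: archive the captured live feed
# so a procedure can be reviewed later.  Paths, codec, and sizes are invented.
import datetime
import pathlib

import cv2
import numpy as np


class CapturedDataStore:
    def __init__(self, root: str = "captured_data", fps: float = 30.0,
                 frame_size: tuple = (1280, 720)) -> None:
        pathlib.Path(root).mkdir(parents=True, exist_ok=True)
        stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
        path = str(pathlib.Path(root) / f"procedure_{stamp}.avi")
        self._frame_size = frame_size
        self._writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"MJPG"),
                                       fps, frame_size)

    def append(self, frame: np.ndarray) -> None:
        # VideoWriter silently drops frames that do not match the configured size.
        self._writer.write(cv2.resize(frame, self._frame_size))

    def close(self) -> None:
        self._writer.release()
```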
  • FIG. 4 illustrates an example embodiment of the image rendering subsystem 300. The image rendering subsystem 300 can include a rendering facility 302 configured to perform one or more rendering routines on the captured image. The rendering routines can perform digital enhancements that render or change the captured image to a rendered image. The rendering or changes to the captured image can enhance the captured image in one or more ways such that the resulting rendered image allows the dental professional a superior view or visual access to the treatment site.
  • For example, rendering routines performed on the captured image can magnify, brighten, increase/decrease contrast, enlarge, zoom in, zoom out, provide a three-dimensional model, change colors, add lighting effects, and/or change or render the captured image in any way to produce a customized rendered image for the dental professional's use.
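  • As a sketch of what a few of these rendering routines might look like in code (the operations and parameter values below are editorial assumptions chosen only to illustrate brightening, contrast adjustment, digital zoom, and a lighting effect, not the routines actually used):

```python
# Illustrative rendering routines for a rendering facility such as 302.
# Parameter values are arbitrary examples, not specified by the patent.
import cv2
import numpy as np


def brighten(image: np.ndarray, gain: float = 1.0, bias: int = 40) -> np.ndarray:
    """Lift pixel intensities so a dim treatment site appears better lighted."""
    return cv2.convertScaleAbs(image, alpha=gain, beta=bias)


def adjust_contrast(image: np.ndarray, alpha: float = 1.4) -> np.ndarray:
    """Increase (>1) or decrease (<1) contrast about mid-grey."""
    stretched = (image.astype(np.float32) - 128.0) * alpha + 128.0
    return np.clip(stretched, 0, 255).astype(np.uint8)


def digital_zoom(image: np.ndarray, factor: float = 2.0) -> np.ndarray:
    """Magnify the centre of the image by cropping and resampling."""
    h, w = image.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = image[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_CUBIC)


def enhance_lighting(image: np.ndarray) -> np.ndarray:
    """Equalize local contrast so shadowed areas of the site remain visible."""
    lab = cv2.cvtColor(image, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)
```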
  • The image rendering subsystem 300 can include a user input interface that allows the dental professional to customize the rendered image by inputting rendering specifications. The rendering facility can be provided with pre-set rendering routines that easily allow the dental professional to render the captured image in a preset way and to switch back and forth between two or more pre-set rendering routines during a procedure.
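  • One way (again, an assumption rather than the disclosed design) to realize such pre-set rendering routines is a keyed table of routine chains that the user input interface can switch between mid-procedure; the sketch below reuses the illustrative routines from the previous example:

```python
# Hypothetical pre-set rendering routines: each preset is a named chain of
# routines that can be switched during a procedure.  Reuses brighten,
# enhance_lighting, digital_zoom, and adjust_contrast from the sketch above.
from typing import Callable, Dict, List

import numpy as np

Routine = Callable[[np.ndarray], np.ndarray]

PRESETS: Dict[str, List[Routine]] = {
    "overview":  [],                                      # pass the feed through
    "bright":    [brighten, enhance_lighting],
    "fine_work": [lambda img: digital_zoom(img, 3.0), adjust_contrast],
}


class RenderingFacility:
    """Stand-in for rendering facility 302 with switchable presets."""

    def __init__(self, preset: str = "overview") -> None:
        self.preset = preset

    def select_preset(self, preset: str) -> None:
        # Called from the user input interface (e.g., a key press or foot switch).
        if preset in PRESETS:
            self.preset = preset

    def render(self, captured: np.ndarray) -> np.ndarray:
        rendered = captured
        for routine in PRESETS[self.preset]:
            rendered = routine(rendered)
        return rendered
```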
  • The image rendering subsystem 300 can further include a communication facility 304. Communication facility 304 may be configured to facilitate communication between the image rendering subsystem 300 and the image capture subsystem 200 and/or the image display subsystem 400. In particular, communication facility 304 may be configured to transmit and/or receive communication signals, captured images, metadata and/or any other data to/from the image rendering subsystem 300. For example, communication facility 304 may transmit data representative of one or more rendered images to the image display subsystem 400.
  • Data representative of rendered images may be transmitted in one or more media content streams, as one or more data files, or in any other suitable manner as may serve a particular implementation. Communication facility 304 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • In addition to the communication facility 304, the image rendering subsystem 300 can include a storage facility 306. Storage facility 306 may be configured to maintain rendered data 308 representative of one or more rendered images. It will be recognized that storage facility 306 may maintain additional or alternative data as may serve a particular implementation. In one example, the rendered data 308 can be used by the dentist to demonstrate a particular procedure to another patient or an insurance company, or can be kept on file for other purposes; for example, if a particular procedure fails, the dentist can review the rendered data in an attempt to identify the cause of failure.
  • FIG. 5 illustrates an example embodiment of the image display subsystem 400. For example, the image display subsystem 400 can include a presentation facility 402. The presentation facility 402 can use data received from either the image capture subsystem 200 or the image rendering subsystem 300 to create a visual depiction of the treatment site.
  • For example, the presentation facility 402 can include a monitor or projector that displays a visual depiction of the treatment site. In another example, the presentation facility 402 can include goggles (e.g., virtual reality goggles) that are capable of producing a visual depiction of the treatment site to the wearer of the goggles. In one embodiment, the goggles can be TECHWOOD TG-06V IC Goggles from WELTON ELECTRONICS. The goggles can be worn like traditional glasses or goggles with the depiction of the treatment site viewed directly through the goggles.
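  • For a monitor, a presentation facility is essentially a window that refreshes with each rendered frame; for video goggles, one commonly assumed approach is to present the same rendered view to each eye side by side. The sketch below illustrates both under those assumptions (no true stereo depth is produced):

```python
# Illustrative presentation facility: show rendered frames in a monitor window,
# or as a side-by-side pair suitable for simple video goggles (assumed design).
import cv2
import numpy as np


def show_on_monitor(rendered: np.ndarray, window: str = "Virtual view 112") -> None:
    cv2.imshow(window, rendered)
    cv2.waitKey(1)                  # brief poll keeps the live view refreshing


def show_in_goggles(rendered: np.ndarray, window: str = "Goggles") -> None:
    """Duplicate the rendered view for the left and right eyes."""
    eye = cv2.resize(rendered, (rendered.shape[1] // 2, rendered.shape[0] // 2))
    cv2.imshow(window, np.hstack([eye, eye]))
    cv2.waitKey(1)
```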
  • With the goggles, or the monitor, the dental professional can perform a dental procedure on the patient, while directly monitoring an enhanced view of the dental procedure through the virtual operatory system 100. Thus, the dental professional is provided with a superior view of the oral cavity during the procedure (including magnification and lighting effects).
  • The image display subsystem 400 can further include a communication facility 404. Communication facility 404 may be configured to facilitate communication between the image display subsystem 400 and the image capture subsystem 200 and/or the image rendering subsystem 300. In particular, communication facility 404 may be configured to transmit and/or receive communication signals, captured images, rendered images, metadata and/or any other data to/from the image display subsystem 400. For example, communication facility 404 may receive data representative of one or more rendered images from the image rendering subsystem 300.
  • Data representative of rendered images may be transmitted in one or more media content streams, as one or more data files, or in any other suitable manner as may serve a particular implementation. Communication facility 404 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
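  • The receiving side of such a communication facility could mirror the hypothetical TCP sender sketched earlier, reading a length prefix and then a JPEG payload for each rendered image (again, purely an assumed transport and framing):

```python
# Hypothetical counterpart to the earlier TCP sender: receive length-prefixed
# JPEG frames of rendered images and yield them for display.
import socket
import struct

import cv2
import numpy as np


def _recv_exact(conn: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf


def receive_frames(host: str = "0.0.0.0", port: int = 5005):
    server = socket.create_server((host, port))
    conn, _ = server.accept()
    try:
        while True:
            (length,) = struct.unpack(">I", _recv_exact(conn, 4))
            payload = _recv_exact(conn, length)
            frame = cv2.imdecode(np.frombuffer(payload, dtype=np.uint8),
                                 cv2.IMREAD_COLOR)
            if frame is not None:
                yield frame              # hand each rendered image to the display
    finally:
        conn.close()
        server.close()
```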
  • FIG. 6 illustrates an exemplary method 600 of virtual dental operatory. While FIG. 6 illustrates exemplary steps according to one embodiment, other embodiments may omit, add to, reorder, and/or modify any of the steps shown in FIG. 6. The steps shown in FIG. 6 may be performed by any component or combination of components of the virtual dental operatory system 100.
  • In step 602, an image representative of a treatment site in a patient's oral cavity is captured during the performance of a dental treatment procedure to produce a captured image. To illustrate, FIG. 3 illustrates that the image capture subsystem 200 can include the capture facility that is configured to capture an image of the treatment site. For example, a camera or similar image capturing device can be used to capture the image.
  • In step 604, the captured image is rendered to enhance the visual properties of the treatment site to produce a rendered image. To illustrate, FIG. 4 illustrates that the image rendering subsystem 300 can include a rendering facility that is configured to apply one or more rendering routines to the captured image to produce a rendered image that is enhanced and customized to provide a superior visual depiction of the treatment site.
  • In step 606, the rendered image is displayed to the dental professional during the performance of the dental treatment procedure. To illustrate, FIG. 5 illustrates that the image display subsystem 400 can include a presentation facility that is configured to display the rendered image. For example, virtual reality goggles can be used to display the rendered image to the dental professional.
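  • Putting steps 602, 604, and 606 together in a single process, a sketch of method 600 might look like the loop below. Every implementation choice here (OpenCV capture, the RenderingFacility preset class from the earlier sketch, the window name, and the keys) is an assumption layered on the previous examples:

```python
# Single-process sketch of method 600: capture (602), render (604), display (606).
# Assumes the RenderingFacility class from the earlier preset sketch is in scope.
import cv2


def run_method_600(device_index: int = 0) -> None:
    camera = cv2.VideoCapture(device_index)
    facility = RenderingFacility(preset="bright")
    try:
        while True:
            ok, captured = camera.read()               # step 602: capture image
            if not ok:
                break
            rendered = facility.render(captured)       # step 604: apply routines
            cv2.imshow("Virtual view 112", rendered)   # step 606: display image
            key = cv2.waitKey(1) & 0xFF
            if key == ord("f"):                        # hypothetical preset switch
                facility.select_preset("fine_work")
            elif key == ord("q"):                      # hypothetical exit key
                break
    finally:
        camera.release()
        cv2.destroyAllWindows()
```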
  • In certain embodiments, one or more of the components and/or processes described herein may be implemented and/or performed by one or more appropriately configured computing devices. To this end, one or more of the systems and/or components described above may include or be implemented by any computer hardware and/or computer-implemented instructions (e.g., software), or combinations of computer-implemented instructions and hardware, configured to perform one or more of the processes described herein. In particular, system components may be implemented on one physical computing device or may be implemented on more than one physical computing device. Accordingly, system components may include any number of computing devices, and may employ any of a number of computer operating systems.
  • In certain embodiments, one or more of the processes described herein may be implemented at least in part as instructions executable by one or more computing devices. In general, a processor (e.g., a microprocessor) receives instructions from a tangible computer-readable medium (e.g., a memory, etc.) and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions may be stored and/or transmitted using any of a variety of known computer-readable media.
  • A computer-readable medium (also referred to as a processor-readable medium) includes any medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and/or volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random-access memory (“DRAM”), which typically constitutes a main memory. Common forms of computer-readable media include, for example, a floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other tangible medium from which a computer may read.
  • FIG. 7 illustrates an exemplary computing device 700 that may be configured to perform one or more of the processes described herein. As shown in FIG. 7, computing device 700 may include a communication interface 702, a processor 704, a storage device 706, and an input/output (“I/O”) module 708 communicatively connected via a communication infrastructure 710. While an exemplary computing device 700 is shown in FIG. 7, the components illustrated in FIG. 7 are not intended to be limiting. Additional or alternative components may be used in other embodiments. Components of computing device 700 shown in FIG. 7 will now be described in additional detail.
  • Communication interface 702 may be configured to communicate with one or more computing devices. Examples of communication interface 702 include, without limitation, a wired network interface (such as a network interface card), a wireless network interface (such as a wireless network interface card), a modem, and any other suitable interface. In at least one embodiment, communication interface 702 may provide a direct connection between system 100 and one or more provisioning systems via a direct link to a network, such as the Internet. Communication interface 702 may additionally or alternatively provide such a connection through, for example, a local area network (such as an Ethernet network), a personal area network, a telephone or cable network, a satellite data connection, a dedicated URL, or any other suitable connection. Communication interface 702 may be configured to interface with any suitable communication media, protocols, and formats, including any of those mentioned above.
  • Processor 704 generally represents any type or form of processing unit capable of processing data or interpreting, executing, and/or directing execution of one or more of the instructions, processes, and/or operations described herein. Processor 704 may direct execution of operations in accordance with one or more applications 712 or other computer-executable instructions such as may be stored in storage device 706 or another computer-readable medium.
  • Storage device 706 may include one or more data storage media, devices, or configurations and may employ any type, form, and combination of data storage media and/or devices. For example, storage device 706 may include, but is not limited to, a hard drive, network drive, flash drive, magnetic disc, optical disc, random access memory (“RAM”), dynamic RAM (“DRAM”), other non-volatile and/or volatile data storage units, or a combination or sub-combination thereof. Electronic data, including data described herein, may be temporarily and/or permanently stored in storage device 706. For example, data representative of one or more executable applications 712 (which may include, but are not limited to, one or more of the software applications described herein) configured to direct processor 704 to perform any of the operations described herein may be stored within storage device 706. In some examples, data may be arranged in one or more databases residing within storage device 706.
  • I/O module 708 may be configured to receive user input and provide user output and may include any hardware, firmware, software, or combination thereof supportive of input and output capabilities. For example, I/O module 708 may include hardware and/or software for capturing user input, including, but not limited to, a keyboard or keypad, a touch screen component (e.g., touch screen display), a receiver (e.g., an RF or infrared receiver), and/or one or more input buttons.
  • I/O module 708 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, I/O module 708 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
  • In some examples, any of the facilities described herein may be implemented by or within one or more components of computing device 700. For example, one or more applications 712 residing within storage device 706 may be configured to direct processor 704 to perform one or more processes or functions associated with capture facility 202, communication facility 204, rendering facility 302, communication facility 304, presentation facility 402, and/or communication facility 404. Likewise, storage facilities 206 and/or 306 may be implemented by or within storage device 706.
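One hypothetical way to picture the facilities hosted on a single computing device is as callables composed by one application; the class below is purely illustrative, and its names do not come from the disclosure.

```python
# Hypothetical composition of capture, rendering, and presentation facilities
# into a single application running on one computing device.
from dataclasses import dataclass
from typing import Callable

import numpy as np

@dataclass
class VirtualOperatoryApp:
    capture_facility: Callable[[], np.ndarray]               # grabs a captured image
    rendering_facility: Callable[[np.ndarray], np.ndarray]   # applies a preset rendering routine
    presentation_facility: Callable[[np.ndarray], None]      # shows the rendered image

    def run_once(self) -> None:
        """Capture, render, and display a single frame."""
        self.presentation_facility(self.rendering_facility(self.capture_facility()))
```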
  • The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described implementations are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims (17)

What is claimed is:
1. A virtual operatory system, comprising:
a video camera configured to capture images of a clinical treatment site producing a live video stream;
a rendering facility communicatively coupled to the video camera and configured to receive the captured video image; and,
at least one pair of virtual reality goggles;
wherein the captured video is rendered by the rendering facility with preset rendering routines to produce a rendered image stream with enhanced visual properties of the treatment site which is then displayed with the virtual reality goggles, the rendering facility being capable of switching between two or more preset rendering routines during a procedure.
2. The system recited in claim 1, wherein the capture facility includes a digital camera.
3. The system recited in claim 2, wherein the camera is an autofocus camera.
4. The system recited in claim 3, wherein the camera has a variety of optical magnifications.
5. The system recited in claim 1, wherein the one or more rendering routines applied to the captured image include adding a lighting effect.
6. The system recited in claim 4, wherein the one or more rendering routines applied to the captured image include adding a magnification effect.
7. The system recited in claim 4, wherein the one or more rendering routines applied to the captured image include selecting a portion of the captured image and focusing on the selected portion.
8. The system recited in claim 1, wherein the presentation facility includes a monitor that displays the rendered image.
9. The system recited in claim 1, wherein the presentation facility includes a virtual reality screen that displays the rendered image.
10. A virtual operatory system, comprising:
a video camera configured to capture images of a clinical treatment site and produce a live video stream;
a rendering facility communicatively coupled to the video camera and configured to receive the captured live video stream;
a plurality of preset rendering routines to enhance the visual properties of the treatment site; and,
a virtual reality screen;
wherein the rendering facility renders the live video stream with at least two of the plurality of preset rendering routines, producing an enhanced video stream, and switches back and forth between said at least two preset rendering routines during a procedure and the enhanced video stream is displayed on the virtual reality screen.
11. The virtual operatory system of claim 10, wherein the capturing of the image is provided by a digital camera.
12. The virtual operatory system of claim 10, wherein the displaying of the rendered image is provided by a monitor.
13. The virtual operatory system of claim 10 wherein the rendering of the captured image includes adding lighting effects to the captured image.
14. The virtual operatory system of claim 10, wherein the rendering of the captured image includes magnification of the captured image.
15. The virtual operatory system of claim 10, wherein the displaying of the rendered image is provided by a monitor by way of modeling software.
16. The virtual operatory system of claim 10, wherein the displaying of the rendered image is a projection of a holographic image.
17. The virtual operatory system of claim 10, wherein the displaying of the rendered image is a projection onto a background screen.