WO2015196388A1 - Intra-oral imaging using operator interface with gesture recognition - Google Patents

Intra-oral imaging using operator interface with gesture recognition

Info

Publication number
WO2015196388A1
WO2015196388A1 (PCT/CN2014/080732)
Authority
WO
WIPO (PCT)
Prior art keywords
intra
oral
camera
processor
movement
Prior art date
Application number
PCT/CN2014/080732
Other languages
French (fr)
Inventor
Yingqian WU
Wei Wang
Guijian WANG
Yan Zhang
Original Assignee
Carestream Health, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carestream Health, Inc. filed Critical Carestream Health, Inc.
Priority to JP2016574384A priority Critical patent/JP2017525411A/en
Priority to EP14895612.1A priority patent/EP3160356A4/en
Priority to US15/315,002 priority patent/US20170300119A1/en
Priority to PCT/CN2014/080732 priority patent/WO2015196388A1/en
Publication of WO2015196388A1 publication Critical patent/WO2015196388A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/24 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/50 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
    • A61B6/51 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for dentistry
    • A61B6/512 Intraoral means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56 Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/555 Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/81 Camera processing pipelines; Components thereof for suppressing or minimising disturbance in the image signal generation

Definitions

  • the invention relates generally to the field of intra-oral imaging and more particularly relates to methods and apparatus for gesture-based operator interface interaction provided with the intra-oral camera.
  • a succession of digital images, such as video images, can be obtained from the mouth of the patient.
  • the images obtained from an intra-oral camera 30 are generally displayed on a display monitor 20 that is visible to both a practitioner 16 and a patient 14. This capability allows practitioner 16 to more clearly visualize a problem area and can help to provide a better understanding of a recommended procedure for the patient.
  • a number of display features are available for the obtained digital images. Standard display functions, such as zooming in or out, panning, brightness or color adjustment, and other functions are readily available on a graphical user interface for improving the visibility of an affected area.
  • the same display can be used as the operator interface for displaying or for entry of information about the patient, such as previous treatment or history data, patient identification, scheduling, and so on.
  • a practical problem that affects use of the display as an operator interface relates to command or instruction entry. If the practitioner is required to repeatedly switch between imaging functions and instruction or data entry, moving between the mouth of the patient and a keyboard or computer mouse, touch screen, or other data entry, selection, or pointing device, there is a potential risk of infection.
  • the present invention is directed to improvements in intra-oral image capture and display.
  • Embodiments of the present invention address the problem of practitioner interaction with the intra-oral imaging apparatus for display of images and entry of instructions and patient data during an imaging session.
  • the intra-oral camera itself is configured to sense gestural movements that are intended as instructions affecting displayed information during the patient imaging session.
  • embodiments of the present invention allow the dental practitioner to enter instructions for control of the display contents, entry of imaging parameters, or entry of patient data or other instructions using detected motion of the camera itself.
  • a method for obtaining an intra-oral image executed at least in part by a computer system and comprising: emitting illumination from an intra-oral camera toward an object that is within the mouth of a patient; obtaining image data content of the object at an image sensor of the intra-oral camera; displaying the image content obtained from the imaging sensor; obtaining one or more movement signals indicative of movement of the intra-oral camera along at least two of three mutually orthogonal axes; interpreting the one or more obtained movement signals as an operator instruction corresponding to a predetermined movement pattern; and changing at least the display of the image content according to the operator instruction.
  • an intra-oral imaging apparatus comprising: an intra-oral camera comprising: (i) a light source that is energizable to emit illumination toward an object that is within the mouth of a patient; (ii) an imaging sensor that is energizable to obtain image content of the object; (iii) a motion sensing element that provides one or more signals indicative of acceleration of the intra-oral camera along at least two of three mutually orthogonal axes; a display that displays obtained image content from the imaging sensor and that provides a graphical user interface for control of intra-oral camera imaging; a processor that is in signal communication with the motion sensing element and is configured to recognize an operator instruction according to the signals indicative of a predetermined movement pattern for the camera, detected by the motion sensing element; wherein the recognized operator instruction relates to the displayed image content for the patient and changes at least the graphical user interface on the display; and a switch that is in signal communication with the processor; wherein a switch position indicates to the processor either the imaging mode or the command mode of the intra-oral camera.
  • Figure 1 A is a perspective view showing use of an intra-oral camera for obtaining an image from the mouth of a patient.
  • Figure 1B is a perspective view showing the use of an intra-oral camera for entering operator instructions in a command mode.
  • Figure 2 is a schematic block diagram showing components of an intra-oral imaging system according to an embodiment of the present invention.
  • Figure 3 is a perspective view that shows an intra-oral camera moved in different directions to provide user interface instructions.
  • Figure 4 is a perspective view that shows three conventional Cartesian coordinate axes.
  • Figure 5 is a graph showing accelerometer data measured over time for rotational movement.
  • Figure 6 is a graph showing accelerometer data measured over time for linear movement.
  • Figure 7 is a table showing a number of exemplary movement pattern vectors for instruction entry.
  • Figure 8 is a logic flow diagram that shows a sequence for entry of an operator instruction when using the intra-oral camera in a command mode.
  • an image sensor for example, is energizable to record image data when it receives the necessary power and enablement signals.
  • two elements are considered to be substantially orthogonal if their angular orientations differ from each other by 90 degrees +/- 12 degrees.
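The 90 degrees +/- 12 degrees tolerance can be expressed as a small predicate. This is an illustrative sketch only; the function name and the angle-in-degrees convention are ours, not the patent's:

```python
# Hypothetical helper: decide whether two angular orientations are
# "substantially orthogonal" per the 90 +/- 12 degree rule stated above.

def substantially_orthogonal(angle_a_deg: float, angle_b_deg: float,
                             tolerance_deg: float = 12.0) -> bool:
    """Return True when the two orientations differ by 90 degrees,
    within the stated tolerance."""
    diff = abs(angle_a_deg - angle_b_deg) % 180.0  # fold into [0, 180)
    return abs(diff - 90.0) <= tolerance_deg

# Example: orientations of 0 and 85 degrees differ by 85 degrees,
# which is within the 12-degree tolerance of 90 degrees.
```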
  • actuable has its conventional meaning, relating to a device or component that is capable of effecting an action in response to a stimulus, such as in response to an electrical signal, for example.
  • the terms “user”, “viewer”, “technician”, “practitioner”, and “operator” are considered to be equivalent when referring to the person who operates the intra-oral imaging system, enters commands or instructions, and views its results.
  • the term “instructions” is used to include entry of commands or of selections such as on-screen button selection, listings, hyperlinks, or menu selections. Instructions can relate to commands that initiate image capture, adjustments to or selections of imaging parameters or imaging process, such as still or video image capture or other selection of commands or parameters that control the functions and performance of an imaging apparatus, including commands that adjust the appearance of displayed features.
  • Figure 1A shows dental practitioner 16 obtaining an image from the mouth of patient 14 using intra-oral camera 30 and viewing results on display 20. Entry of instructions, such as those needed to pan, zoom, or otherwise adjust what appears on display 20, is difficult for practitioner 16 without either a staff assistant or some type of hands-free input device.
  • Figure 1B shows the practitioner 16 using intra-oral camera 30 in a command mode for instruction entry, according to an embodiment of the present invention.
  • the practitioner 16 can enter instructions to perform functions such as pan or zoom of the display; brightness, color, contrast, or other image quality adjustment; display and selection from a pull-down menu 44 or control button 28; on-screen cursor 24 positioning; or data entry, such as from an on-screen keypad 29.
  • the schematic block diagram of Figure 2 shows an intra-oral imaging apparatus 10 for obtaining an image of one or more objects, such as teeth, in the mouth of the patient.
  • the components housed within a chassis 32 of an intra-oral camera 30 are shown within a dashed outline.
  • a light source 46 provides illumination to the object, such as single color or white light illumination or infrared (IR) or ultraviolet (UV) light, or a combination of light having different spectral content.
  • Light source 46 is in signal communication with and controlled by one or more signals from a processor 40.
  • Optics 12 such as one or more lens elements, filters, polarizers, and other components, condition and direct the imaged light to an image sensor 42, such as a CCD (Charge-Coupled Device) array or CMOS (Complementary Metal-Oxide Semiconductor) array, which is in signal communication with processor 40 and provides image data to processor 40.
  • An optional switch 22 is provided for manually switching camera 30 from a command mode into an imaging mode. It should be noted that the function of switch 22 for switching between camera modes can alternately be executed automatically by interpreting detected camera 30 motion, since pre-determined movements of camera 30 that are used for instruction entry, as described subsequently, are different from movement patterns typically used during image capture. The function of switch 22 can also be executed by determining camera 30 focus. According to an embodiment of the present invention, command mode is disabled during imaging, either for single images or for video imaging.
  • keyboard or mouse command entry at the graphical user interface of display 70 (Figure 2) overrides command entry from movement patterns.
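The mode-selection rules above (optional switch 22, automatic motion-based switching, and keyboard/mouse override at the graphical user interface) might be combined as in the following sketch. The priority ordering and parameter names are our reading of the text, not a claimed algorithm:

```python
# Hypothetical mode resolution: GUI entry and active capture take priority,
# then the optional switch, then automatic detection based on the fact that
# instruction-entry movements are faster than typical imaging movements.

IMAGING, COMMAND = "imaging", "command"

def resolve_mode(keyboard_active: bool,
                 switch_position,
                 gesture_speed_detected: bool,
                 capture_in_progress: bool) -> str:
    if keyboard_active:
        return IMAGING            # GUI command entry overrides movement patterns
    if capture_in_progress:
        return IMAGING            # command mode is disabled during imaging
    if switch_position is not None:
        return switch_position    # optional switch 22 sets the mode directly
    # Automatic fallback: fast movement suggests instruction entry.
    return COMMAND if gesture_speed_detected else IMAGING
```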
  • a motion sensing element 50, such as a 3-D accelerometer or a set of multiple accelerometers, provides motion information that is used for user interface instruction entry.
  • Processor 40 uses this information in order to detect operator instructions, as described in more detail subsequently.
  • Processor 40 is a control logic processor that obtains the image data from image sensor 42 and instruction-related motion data from motion sensing element 50 and provides this data for display.
  • Processor 40 is in signal communication with a host computer or other processor 60 over a communication link 58, which may be a wired or wireless communication link.
  • Host computer 60 is in signal communication with a display 70 for display of the obtained image and patient information and for entry of user interface instructions.
  • Display 70 provides a graphical user interface for controlling and using intra-oral camera 30. According to an embodiment of the present invention, the graphical user interface on display 70 displays the command that has been entered using camera 30 movement. In addition, commands entered according to spatial movement patterns of camera 30 change the display of acquired image content on the graphical user interface.
  • Host computer 60 is also in signal communication with a memory 62 for short- or long-term storage of patient image data and related patient information, such as treatment history and personal information about the patient.
  • the functions of processor 40 and host computer 60 can alternately be performed by a more powerful processor 40 on intra-oral camera 30 itself, eliminating the need for the external host computer 60.
  • Memory 62 can alternately be provided with processor 40 on camera 30 itself.
  • Processor 40 may also connect directly to display 70 for display of image content obtained from image sensor 42.
  • processor 40 can have only a data conditioning function and be primarily a transmitter device that simply provides all of its acquired data to host computer 60 for more complex image processing and motion analysis functions for recognizing operator commands. It can be appreciated that compact packaging of intra-oral camera 30 may dictate how much processing and storage capability can be provided within the body of camera 30.
  • Figure 3 shows the hand-held intra-oral camera 30 used for user interface instruction entry in a command mode. Arrows indicate some of the possible motion that can be provided for entering commands.
  • Figure 4 shows the three orthogonal axes for 3-dimensional (3-D) movement, conventionally known as Cartesian coordinate axes and identified as x, y, and z axes, respectively.
  • Accelerometers used for motion sensing element 50 can measure movement in space with respect to any of the x, y, and z axes, as well as rotation relative to the axes, as shown.
  • Accelerometers can be micro-electromechanical system (MEMS) devices, such as those conventionally used in various types of smart phone and handheld personal computer pads and similar devices.
  • the accelerometer output is a movement signal that is indicative of static acceleration, such as due to gravity, and dynamic acceleration from hand and arm movement of the operator and from other movement, such as from hand vibration. Since there is always some inherent noise in the accelerometer output, the measured activity from the movement signal is generally non-zero.
  • a single 3-D accelerometer is used to detect motion along any of the three coordinate axes of Figure 4.
  • three accelerometers are used, each sensing motion along a corresponding one of the three orthogonal axes, respectively, as shown in Figure 4. It can be appreciated that one, two, or three accelerometers can alternately be used, in various configurations, for various levels of measurement range and accuracy, each accelerometer providing a corresponding movement signal for interpretation by processor 40 (Figure 2).
  • Figures 5 and 6 show characteristic curves obtained from sampling the movement signal data from motion sensing element 50 when using multiple accelerometers.
  • Figure 5 shows normalized accelerometer data collected, over time, when the intra-oral camera 30 is moved rotationally in a clockwise (CW) circle.
  • Figure 6 shows normalized accelerometer data collected, over time, when the intraoral camera 30 is moved horizontally along a line from left (L) to right (R). The acceleration scale is normalized relative to gravity.
  • These characteristic curves provide sufficient information for identifying the movement path and duration and are interpreted for entry of various user interface instructions according to movement of intra-oral camera 30 in command mode.
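As one illustration of how such characteristic curves could be summarized, per-axis peak-to-peak activity distinguishes a linear stroke (one dominant axis, as in Figure 6) from a circular pattern that excites two axes comparably (as in Figure 5). This is a simplified sketch under our own assumptions, not the patent's classifier:

```python
# Hypothetical summary of normalized accelerometer traces: measure the
# peak-to-peak range of each axis and count how many axes are "active".

def axis_activity(samples):
    """Peak-to-peak range of one axis's normalized acceleration samples."""
    return max(samples) - min(samples)

def classify_pattern(x, y, z, threshold=0.5):
    """Return 'linear' when exactly one axis dominates, 'rotational' when
    two or more axes show comparable activity, else 'none'."""
    active = [a for a in (axis_activity(x), axis_activity(y), axis_activity(z))
              if a >= threshold]
    if len(active) >= 2:
        return "rotational"
    if len(active) == 1:
        return "linear"
    return "none"
```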
  • intra-oral camera 30 can be in an imaging mode or in a command mode, according to the position of optional switch 22 (Figure 2).
  • mode selection can alternately be performed in an automated manner, such as by sensing whether or not camera 30 is focused on a tooth or other object or is removed from the mouth and held in a position from which no object is in focus.
  • Methods of focus detection that can be used for this type of automatic mode determination are well known to those skilled in the imaging arts.
  • Still other methods of determining the mode of intra-oral camera 30 relate to detecting movement of camera 30 as reported by motion sensing element 50.
  • the predetermined movement patterns of camera 30 that are used to enter instructions are generally executed at speeds that would cause significant amounts of blur in obtained images.
  • An alternate source for movement sensing relates to image blur.
  • the camera 30 mode for imaging mode or command mode, is determined using a combination of both acceleration data and focus detection.
  • Image analysis can thus detect camera 30 motion, so that the movement signal is derived from accelerometer data and, optionally, from image analysis.
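A combined decision of this kind might look like the following sketch. The focus score and thresholds are hypothetical; an actual system could substitute a lens-focus measure or an image-sharpness (blur) metric for `focus_score`:

```python
# Hypothetical combination of the two cues: command mode is inferred when
# movement is fast enough to blur a captured image AND no intra-oral
# object is in sharp focus.

def in_command_mode(accel_magnitude: float, focus_score: float,
                    accel_threshold: float = 1.5,
                    focus_threshold: float = 0.4) -> bool:
    moving_fast = accel_magnitude > accel_threshold  # normalized to gravity
    out_of_focus = focus_score < focus_threshold
    return moving_fast and out_of_focus
```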
  • Figure 7 shows a table that lists, by way of example and not by way of limitation, some of the characteristic movement patterns of camera 30 that can be readily detected by the one or more accelerometers used in motion sensing element 50, consistent with an embodiment of the present invention.
  • Each block of the table shows a movement pattern with its movement pattern vector and shows the corresponding orthogonal axes over which movement can be sensed.
  • a dot in each block represents the beginning of a movement pattern; the arrow shows movement direction.
  • a vector V1a shows a left-to-right (L-R) movement pattern, measured relative to x and y axes.
  • a vector V1b shows the opposite right-to-left (R-L) movement pattern, measured relative to x and y axes.
  • a vector V1c shows a vertical movement pattern in the upward direction, measured relative to x and y axes.
  • a vector V2a shows a vertical movement pattern in the downward direction, measured relative to x and y axes.
  • Vectors V2b and V2c show right angle movement patterns, relative to x-y axes.
  • Vectors V3a and V3b show triangular movement patterns, relative to x-y axes.
  • Vectors V3c and V4a show circular movement patterns in different clockwise (CW) and counter-clockwise (CCW) directions, measured relative to the indicated x-z and y-z axes.
  • Each of these characteristic movement patterns can be detected using the arrangement of one, two, or three accelerometers for motion sensing element 50 as described previously with reference to Figure 2.
  • Each of these and other movement patterns can be used to provide a movement signal that indicates an operator instruction.
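A pairing of the Figure 7 pattern vectors with operator instructions could be as simple as a lookup table. The specific instruction assignments below are invented for illustration; the patent leaves this pairing configurable:

```python
# Purely illustrative mapping of movement pattern vectors to instructions.

PATTERN_TO_INSTRUCTION = {
    "V1a": "pan_right",    # left-to-right stroke (x-y plane)
    "V1b": "pan_left",     # right-to-left stroke (x-y plane)
    "V1c": "scroll_up",    # upward vertical stroke
    "V2a": "scroll_down",  # downward vertical stroke
    "V3c": "zoom_in",      # clockwise circle
    "V4a": "zoom_out",     # counter-clockwise circle
}

def instruction_for(pattern_id: str) -> str:
    """Look up the operator instruction for a detected pattern vector."""
    return PATTERN_TO_INSTRUCTION.get(pattern_id, "unrecognized")
```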
  • image processing can also be used to supplement or to verify or validate movement data from motion sensing element 50 according to detection of motion blur.
  • the logic flow diagram of Figure 8 shows a sequence of steps used for executing user interface instructions according to detected movement in command mode. This sequence of steps runs continuously when in command mode.
  • In a mode decision step S100, the mode of operation of intra-oral camera 30 is detected as either imaging mode or command mode.
  • switch 22 indicates the camera mode as either command mode or imaging mode. If in imaging mode, movement of the intra-oral camera 30 is not interpreted for command entry. If the camera 30 is in command mode, then a pattern acquisition step S110 executes, acquiring the measurement data from motion sensing element 50.
  • When motion sensing element 50 is a single 3-D accelerometer, a time series of 3-dimensional vector data acquired by the accelerometer is provided as input to the gesture detection steps shown here.
  • a number of measurements are obtained for characterizing the movement pattern, as was described previously with reference to the graphs of Figures 5 and 6.
  • An optional noise compensation step S120 eliminates noise data from random movement, such as unintended or incidental movement along or about one or possibly two of the orthogonal axes. This noise removal is useful because the practitioner is not likely to move the camera 30 in precisely one direction or to provide rotation that is symmetrical about a single axis, for example. Vibration from the dental office environment or from nearby equipment can also add some noise content. With respect to the example of Figure 6, for example, movement along the y and z directions appears to be unintended, whereas movement along the x axis appears to be intentional and is prominent.
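One possible form of this noise compensation, sketched under our own assumption of a simple dominance test: axes whose activity is small relative to the most active axis are treated as unintended movement and suppressed.

```python
# Hypothetical noise suppression: given per-axis activity values (e.g.
# peak-to-peak acceleration for x, y, z), zero any axis that falls below
# a fraction of the dominant axis's activity.

def suppress_minor_axes(ranges, ratio=0.25):
    dominant = max(ranges)
    if dominant == 0:
        return [0.0 for _ in ranges]
    return [r if r >= ratio * dominant else 0.0 for r in ranges]

# With Figure 6 in mind: strong x activity with small, incidental y and z
# activity. suppress_minor_axes([2.0, 0.3, 0.2]) keeps x and zeroes y and z.
```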
  • a pattern identification step S130 can then be executed, identifying the most likely movement pattern indicated by the measured data, such as that shown in the table of Figure 7. Once the most likely pattern is identified, an instruction identification step S140 then correlates the movement pattern to a corresponding instruction. An instruction execution step S150 then executes the entered instruction. The sequence continues for additional command entry, looping back to mode decision step S100 as shown.
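The control flow of this command-mode sequence can be sketched with stub calls. All of the camera methods here are placeholders of our own naming; only the loop structure follows the steps of Figure 8:

```python
# Hypothetical command-mode loop following the Figure 8 sequence.
# `camera` is any object exposing the stub methods used below.

def run_command_loop(camera, max_iterations=100):
    for _ in range(max_iterations):
        if camera.mode() != "command":              # S100: mode decision
            continue                                # imaging mode: ignore motion
        data = camera.acquire_motion_data()         # S110: pattern acquisition
        data = camera.compensate_noise(data)        # S120: optional noise removal
        pattern = camera.identify_pattern(data)     # S130: most likely pattern
        if pattern is None:
            continue                                # ambiguous: no instruction
        instruction = camera.instruction_for(pattern)  # S140: correlate
        camera.execute(instruction)                 # S150: execute, then loop
```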
  • Because ambiguous movement data is possible, the practitioner observes the display 20 screen to ascertain that the intended instruction has been received.
  • a prompt or other verification is posted to the display screen, requesting clarification or verification of the entered command.
  • This verification can also be provided with a movement pattern that is not likely to be ambiguous, such as the movement pattern shown by vector V1c or V2a in Figure 7.
  • a standard set of predetermined movement patterns for intra-oral camera 30 is provided, with each pattern identifying a unique, corresponding instruction for operator entry, such as the set of movement patterns shown in Figure 7.
  • Gesture training software, optionally provided as part of intra-oral imaging system 10 (Figure 2), uses many of the same components that are employed for gesture detection. According to an embodiment of the present invention, software containing training algorithms for resetting and calibrating gestures with intra-oral camera 30 is provided as part of processor 40 (Figure 2).
  • the practitioner has a setup utility that allows redefinition of one or more movement patterns as well as allowing additional movement patterns to be defined and correlated with particular operator instructions.
  • This utility can be particularly useful for customizing how the imaging system performs various functions.
  • a zoom-in viewing function may be customized to zoom in fixed, discrete increments, such as at 100%, 150%, and 200%, with an increment change effected with each completion of a movement pattern.
  • zoom-in can be continuous, so that zoom operation continuously enlarges the imaged object as long as the operator continues the corresponding movement pattern.
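The discrete-increment customization described above might be modeled as follows; the level values come from the example given, while the function name is ours:

```python
# Hypothetical discrete zoom: each completed movement pattern advances the
# view to the next preset level, wrapping back to 100%.

ZOOM_LEVELS = [100, 150, 200]  # percent, per the example above

def next_zoom(current: int) -> int:
    """Advance to the next discrete zoom increment."""
    i = ZOOM_LEVELS.index(current)
    return ZOOM_LEVELS[(i + 1) % len(ZOOM_LEVELS)]
```

Continuous zoom would instead apply a small scale factor on every sampling interval for as long as the corresponding movement pattern continues.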
  • the setup utility can also be used to adjust sensitivity and sampling rate of motion sensing element 50 to suit the preferences of the dental practitioner who uses intra-oral imaging system 10.
  • Motion sensing element 50 can use any suitable number of accelerometers or other devices for measuring motion along orthogonal axes.
  • Options for motion sensing element 50 include the use of only one or two MEMS accelerometers or the use of three or more accelerometers for measuring movement in appropriate directions.
  • MEMS accelerometer devices are advantaged for size, availability, and cost; other accelerometer types can alternately be used.
  • gyroscopes, magnetometers, and other devices that are capable of measuring movement along an axis or rotation about an axis can be used.
  • a host processor or computer executes a program with stored instructions that provide imaging functions and instruction sensing functions in accordance with the method described.
  • a computer program of an embodiment of the present invention can be utilized by a suitable, general-purpose computer system, such as a personal computer or workstation.
  • many other types of computer systems can be used to execute the computer program of the present invention, including networked processors.
  • the computer program for performing the method of the present invention may be stored in a computer readable storage medium.
  • This medium may comprise, for example: magnetic storage media such as a magnetic disk (such as a hard drive) or magnetic tape or other portable type of magnetic disk; optical storage media such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices such as random access memory (RAM) or read only memory (ROM); or any other physical device or medium employed to store a computer program.
  • the computer program for performing the method of the present invention may also be stored on computer readable storage medium that is connected to the image processor by way of the internet or other communication medium. Those skilled in the art will readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
  • the computer program product of the present invention may make use of various image manipulation algorithms and processes that are well known. It will be further understood that the computer program product embodiment of the present invention may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes may include conventional utilities that are within the ordinary skill of the image processing arts. Additional aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images or co-operating with the computer program product of the present invention, are not specifically shown or described herein and may be selected from such algorithms, systems, hardware, components and elements known in the art.


Abstract

A method for obtaining an intra-oral image, the method executed at least in part by a computer system, emits illumination from an intra-oral camera toward an object that is within the mouth of a patient, then obtains image data content of the object at an image sensor of the intra-oral camera. The image content obtained from the imaging sensor is displayed, and one or more movement signals indicative of movement of the intra-oral camera along at least two of three mutually orthogonal axes are obtained. The one or more obtained movement signals are interpreted as an operator instruction corresponding to a predetermined movement pattern. At least the display of the image content is changed according to the operator instruction.

Description

INTRA-ORAL IMAGING USING OPERATOR INTERFACE WITH
GESTURE RECOGNITION
FIELD OF THE INVENTION
The invention relates generally to the field of intra-oral imaging and more particularly relates to methods and apparatus for gesture-based operator interface interaction provided with the intra-oral camera.
BACKGROUND OF THE INVENTION
Dental practitioners have recognized the value of intra-oral imaging apparatus for improving diagnostic capability, for maintaining more accurate patient records, and for improving their communication with patients. Advantages such as these, coupled with ongoing improvements in capability, compactness, affordability, and usability have made intra-oral imaging systems attractive for use in dental offices and clinics.
Using an intra-oral camera, a succession of digital images, such as video images, can be obtained from the mouth of the patient. As shown in Figure 1 A, the images obtained from an intra-oral camera 30 are generally displayed on a display monitor 20 that is visible to both a practitioner 16 and a patient 14. This capability allows practitioner 16 to more clearly visualize a problem area and can help to provide a better understanding of a recommended procedure for the patient.
A number of display features are available for the obtained digital images. Standard display functions, such as zooming in or out, panning, brightness or color adjustment, and other functions are readily available on a graphical user interface for improving the visibility of an affected area. In addition, the same display can be used as the operator interface for displaying or for entry of information about the patient, such as previous treatment or history data, patient identification, scheduling, and so on. A practical problem that affects use of the display as an operator interface relates to command or instruction entry. If the practitioner is required to repeatedly switch between imaging functions and instruction or data entry, moving between the mouth of the patient and a keyboard or computer mouse, touch screen, or other data entry, selection, or pointing device, there is a potential risk of infection. Continually changing gloves or using replaceable covers or films for keyboard or touch screen and other devices are options; however, these can be impractical for reasons of usability, efficiency, likelihood of error, and cost. Often, a "four hands" solution is the only workable arrangement; for this, the dental practitioner enlists the assistance of another staff member for help with image view adjustment, patient data entry, and instruction entry during an imaging session.
There have been a number of solutions proposed for addressing this problem and allowing the practitioner to interact with the imaging system directly. These include, for example, the use of foot pedal control devices, voice sensing, infrared source tracking, and other mechanisms for instruction entry. Understandably, solutions such as these can be error-prone, can be difficult to calibrate or adjust, and can be awkward to set up and use.
Thus, there is a need for apparatus and methods that allow the dental practitioner to obtain images and enter patient data or instructions without requiring assistance from other members of the staff and without setting the intra-oral camera aside in order to change gloves.
SUMMARY OF THE INVENTION
The present invention is directed to improvements in intra-oral image capture and display. Embodiments of the present invention address the problem of practitioner interaction with the intra-oral imaging apparatus for display of images and entry of instructions and patient data during an imaging session.
It is a feature of embodiments of the present invention that the intra-oral camera itself is configured to sense gestures that are intended as instructions affecting displayed information during the patient imaging session.
Advantageously, embodiments of the present invention allow the dental practitioner to enter instructions for control of the display contents, entry of imaging parameters, or entry of patient data or other instructions using detected motion of the camera itself.
According to an embodiment of the present invention, there is provided a method for obtaining an intra-oral image, the method executed at least in part by a computer system and comprising: emitting illumination from an intra-oral camera toward an object that is within the mouth of a patient; obtaining image data content of the object at an image sensor of the intra-oral camera; displaying the image content obtained from the imaging sensor; obtaining one or more movement signals indicative of movement of the intra-oral camera along at least two of three mutually orthogonal axes; interpreting the one or more obtained movement signals as an operator instruction corresponding to a predetermined movement pattern; and changing at least the display of the image content according to the operator instruction.
According to another aspect of the present invention, there is provided an intra-oral imaging apparatus comprising: an intra-oral camera comprising: (i) a light source that is energizable to emit illumination toward an object that is within the mouth of a patient; (ii) an imaging sensor that is energizable to obtain image content of the object; (iii) a motion sensing element that provides one or more signals indicative of acceleration of the intra-oral camera along at least two of three mutually orthogonal axes; a display that displays obtained image content from the imaging sensor and that provides a graphical user interface for control of intra-oral camera imaging; a processor that is in signal communication with the motion sensing element and is configured to recognize an operator instruction according to the signals indicative of a predetermined movement pattern for the camera, detected by the motion sensing element; wherein the recognized operator instruction relates to the displayed image content for the patient and changes at least the graphical user interface on the display; and a switch that is in signal communication with the processor; wherein a switch position indicates to the processor either to acquire image content or to obtain an operator instruction.
These objects are given only by way of illustrative example, and such objects may be exemplary of one or more embodiments of the invention. Other desirable objectives and advantages inherently achieved by the disclosed invention may occur or become apparent to those skilled in the art. The invention is defined by the appended claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features, and advantages of the invention will be apparent from the following more particular description of the embodiments of the invention, as illustrated in the accompanying drawings.
The elements of the drawings are not necessarily to scale relative to each other.
Figure 1 A is a perspective view showing use of an intra-oral camera for obtaining an image from the mouth of a patient.
Figure 1B is a perspective view showing the use of an intra-oral camera for entering operator instructions in a command mode.
Figure 2 is a schematic block diagram showing components of an intra-oral imaging system according to an embodiment of the present invention.
Figure 3 is a perspective view that shows an intra-oral camera moved in different directions to provide user interface instructions.
Figure 4 is a perspective view that shows three conventional Cartesian coordinate axes.
Figure 5 is a graph showing accelerometer data measured over time for rotational movement.
Figure 6 is a graph showing accelerometer data measured over time for linear movement.
Figure 7 is a table showing a number of exemplary movement pattern vectors for instruction entry.
Figure 8 is a logic flow diagram that shows a sequence for entry of an operator instruction when using the intra-oral camera in a command mode.
DETAILED DESCRIPTION OF THE INVENTION
Figures provided herein are given in order to illustrate key principles of operation and component relationships along their respective optical paths according to the present invention and are not drawn with intent to show actual size or scale. Some exaggeration may be necessary in order to emphasize basic structural relationships or principles of operation. Some conventional components that would be needed for implementation of the described embodiments are not shown in the drawings in order to simplify description of the invention itself, including, for example, components that provide power and transmit data in a wired or wireless manner. In the drawings and text that follow, like components are designated with like reference numerals, and similar descriptions concerning components and arrangement or interaction of components already described are omitted.
In the context of the present disclosure, the terms "first", "second", and so on, do not necessarily denote any ordinal, sequential, or priority relation, but are simply used to more clearly distinguish one element or set of elements from another, unless specified otherwise.
In the context of the present disclosure, the term "energizable" describes a component or device that is enabled to perform a function upon receiving power and, optionally, upon receiving an enabling signal. An image sensor, for example, is energizable to record image data when it receives the necessary power and enablement signals.
In the context of the present disclosure, two elements are considered to be substantially orthogonal if their angular orientations differ from each other by 90 degrees +/- 12 degrees.
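The "substantially orthogonal" tolerance defined above can be expressed as a simple predicate. The following is a minimal illustrative sketch, not part of the disclosed apparatus; the function name is hypothetical.

```python
# Hedged sketch: two angular orientations are "substantially orthogonal"
# if they differ by 90 degrees +/- 12 degrees, as defined in the text.
def substantially_orthogonal(angle_a_deg: float, angle_b_deg: float) -> bool:
    diff = abs(angle_a_deg - angle_b_deg) % 180.0  # fold into [0, 180)
    return abs(diff - 90.0) <= 12.0

print(substantially_orthogonal(0.0, 85.0))  # True: within the 12-degree tolerance
print(substantially_orthogonal(0.0, 70.0))  # False: 20 degrees away from 90
```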
In the context of the present disclosure, the term "actuable" has its conventional meaning, relating to a device or component that is capable of effecting an action in response to a stimulus, such as in response to an electrical signal, for example.
In the context of the present disclosure, the terms "user", "viewer", "technician", "practitioner", and "operator" are considered to be equivalent when referring to the person who operates the intra-oral imaging system, enters commands or instructions, and views its results. The term "instructions" is used to include entry of commands or of selections such as on-screen button selection, listings, hyperlinks, or menu selections. Instructions can relate to commands that initiate image capture, adjustments to or selections of imaging parameters or imaging process, such as still or video image capture or other selection of commands or parameters that control the functions and performance of an imaging apparatus, including commands that adjust the appearance of displayed features.
Figure 1A shows dental practitioner 16 obtaining an image from the mouth of patient 14 using intra-oral camera 30 and viewing results on display 20. Entry of instructions, such as those needed to pan, zoom, or otherwise adjust what appears on display 20, is difficult for practitioner 16 without either the help of a staff assistant or the use of some type of hands-free input device. Figure 1B shows the practitioner 16 using intra-oral camera 30 in a command mode for instruction entry, according to an embodiment of the present invention. By making any of a number of predetermined gestures with intra-oral camera 30, such as briefly moving the camera 30 in a horizontal direction from left to right or moving the camera in a circular motion about an axis as shown in Figure 1B, the practitioner 16 can enter instructions to perform functions such as pan or zoom of the display; brightness, color, contrast, or other image quality adjustment; display and selection from a pull-down menu 44 or control button 28; on-screen cursor 24 positioning; or data entry, such as from an on-screen keypad 29.
Consistent with an embodiment of the present invention, the schematic block diagram of Figure 2 shows an intra-oral imaging apparatus 10 for obtaining an image of one or more objects, such as teeth, in the mouth of the patient. The components housed within a chassis 32 of an intra-oral camera 30 are shown within a dashed outline. A light source 46 provides illumination to the object, such as single color or white light illumination or infrared (IR) or ultraviolet (UV) light, or a combination of light having different spectral content. Light source 46 is in signal communication with and controlled by one or more signals from a processor 40. Optics 12, such as one or more lens elements, filters, polarizers, and other components, condition and direct the imaged light to an image sensor 42, such as a CCD (Charge-Coupled Device) array or CMOS (Complementary Metal-Oxide Semiconductor) array, which is in signal communication with processor 40 and provides image data to processor 40. An optional switch 22 is provided for manually switching camera 30 from a command mode into an imaging mode. It should be noted that the function of switch 22 for switching between camera modes can alternately be executed automatically by interpreting detected camera 30 motion, since pre-determined movements of camera 30 that are used for instruction entry, as described subsequently, are different from movement patterns typically used during image capture. The function of switch 22 can also be executed by determining camera 30 focus. According to an embodiment of the present invention, command mode is disabled during imaging, either for single images or for video imaging.
According to an alternate embodiment of the present invention, keyboard or mouse command entry at the graphical user interface of display 70 (Figure 2) overrides command entry from movement patterns.
In the Figure 2 embodiment, a motion sensing element 50, such as a 3-D accelerometer or a set of multiple accelerometers, provides motion information that is used for user interface instruction entry. Processor 40 uses this information in order to detect operator instructions, as described in more detail subsequently.
Processor 40 is a control logic processor that obtains the image data from image sensor 42 and instruction-related motion data from motion sensing element 50 and provides this data for display. Processor 40 is in signal communication with a host computer or other processor 60 over a communication link 58, which may be a wired or wireless communication link. Host computer 60 is in signal communication with a display 70 for display of the obtained image and patient information and for entry of user interface instructions. Display 70 provides a graphical user interface for controlling and using intra-oral camera 30. According to an embodiment of the present invention, the graphical user interface on display 70 displays the command that has been entered using camera 30 movement. In addition, commands entered according to spatial movement patterns of camera 30 change the display of acquired image content on the graphical user interface. Host computer 60 is also in signal communication with a memory 62 for short- or long-term storage of patient image data and related patient information, such as treatment history and personal information about the patient.
Still considering the arrangement of Figure 2, it can be appreciated that the functions described for processor 40 and host computer 60 can be performed by a more powerful processor 40 on intra-oral camera 30 itself, thus eliminating the need for the external host computer 60. Memory 62 can also be provided to processor 40 and stored on camera 30 itself. Processor 40 may also connect directly to display 70 for display of image content obtained from image sensor 42. Alternately, processor 40 can have only a data conditioning function and be primarily a transmitter device that simply provides all of its acquired data to host computer 60 for more complex image processing and motion analysis functions for recognizing operator commands. It can be appreciated that compact packaging of intra-oral camera 30 may dictate how much processing and storage capability can be provided within the body of camera 30.
Figure 3 shows the hand-held intra-oral camera 30 used for user interface instruction entry in a command mode. Arrows indicate some of the possible motion that can be provided for entering commands. Figure 4 shows the three orthogonal axes for 3-dimensional (3-D) movement, conventionally known as Cartesian coordinate axes and identified as x, y, and z axes, respectively.
Accelerometers used for motion sensing element 50 can measure movement in space with respect to any of the x, y, and z axes, as well as rotation relative to the axes, as shown.
Accelerometers can be micro-electromechanical system (MEMS) devices, such as those conventionally used in various types of smart phone and handheld personal computer pads and similar devices. The accelerometer output is a movement signal that is indicative of static acceleration, such as due to gravity, and dynamic acceleration from hand and arm movement of the operator and from other movement, such as from hand vibration. Since there is always some inherent noise in the accelerometer output, the measured activity from the movement signal is generally non-zero.
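The movement signal described above mixes static acceleration (gravity), dynamic acceleration from deliberate hand movement, and sensor noise. One common way to separate these components, offered here only as an illustrative sketch (the filter and its coefficient are assumptions, not taken from the disclosure), is a simple low-pass filter that tracks the slow-moving gravity estimate and subtracts it:

```python
def split_static_dynamic(samples, alpha=0.1):
    """Separate a single-axis accelerometer stream (in units of g) into a
    static (gravity) component and a dynamic (hand-movement) component
    using an exponential low-pass filter. `alpha` is an assumed coefficient."""
    gravity = samples[0]
    static, dynamic = [], []
    for a in samples:
        gravity = alpha * a + (1 - alpha) * gravity  # slow-moving gravity estimate
        static.append(gravity)
        dynamic.append(a - gravity)                  # what hand motion adds
    return static, dynamic

# A still camera reads roughly 1 g plus sensor noise on its vertical axis,
# so the measured activity is non-zero even with no gesture in progress:
readings = [1.0, 1.02, 0.98, 1.01, 0.99]
static, dynamic = split_static_dynamic(readings)
print(max(abs(d) for d in dynamic) < 0.05)  # True: residual noise, not a gesture
```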
According to an embodiment of the present invention, a single 3-D accelerometer is used to detect motion along any of the three coordinate axes of Figure 4. According to an alternate embodiment of the present invention, three accelerometers are used, each sensing motion along a corresponding one of the three orthogonal axes, respectively, as shown in Figure 4. It can be appreciated that one, two or three accelerometers can alternately be used, in various configurations, for various levels of measurement range and accuracy, each accelerometer providing a corresponding movement signal for interpretation by processor 40 (Figure 2).
Figures 5 and 6 show characteristic curves obtained from sampling the movement signal data from motion sensing element 50 when using multiple accelerometers. Figure 5 shows normalized accelerometer data collected, over time, when the intra-oral camera 30 is moved rotationally in a clockwise (CW) circle. Figure 6 shows normalized accelerometer data collected, over time, when the intraoral camera 30 is moved horizontally along a line from left (L) to right (R). The acceleration scale is normalized relative to gravity. These characteristic curves provide sufficient information for identifying the movement path and duration and are interpreted for entry of various user interface instructions according to movement of intra-oral camera 30 in command mode. As noted earlier, intra-oral camera 30 can be in an imaging mode or in a command mode, according to the position of optional switch 22 (Figure 2). It can be appreciated that mode selection can alternately be performed in an automated manner, such as by sensing whether or not camera 30 is focused on a tooth or other object or is removed from the mouth and held in a position from which no object is in focus. Methods of focus detection that can be used for this type of automatic mode determination are well known to those skilled in the imaging arts. Still other methods of determining the mode of intra-oral camera 30 relate to detecting movement of camera 30 as reported by motion sensing element 50. The predetermined movement patterns of camera 30 that are used to enter instructions are generally executed at speeds that would cause significant amounts of blur in obtained images.
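The observation above, that instruction gestures are executed at speeds well above those typical of image capture, suggests one possible automated mode test. The sketch below is an assumption about how such a test might look; the threshold value and function name are illustrative only and do not appear in the disclosure.

```python
GESTURE_THRESHOLD_G = 0.5  # assumed: deliberate gestures exceed this peak (in g)

def infer_mode(dynamic_accel_g):
    """Guess the camera mode from the dynamic acceleration component,
    normalized relative to gravity: brisk movement that would blur an
    image implies command mode; near-still motion implies imaging mode."""
    peak = max(abs(a) for a in dynamic_accel_g)
    return "command" if peak >= GESTURE_THRESHOLD_G else "imaging"

print(infer_mode([0.02, -0.03, 0.01]))    # imaging: only small incidental motion
print(infer_mode([0.1, 0.8, -0.9, 0.2]))  # command: gesture-scale acceleration
```

In practice, as the text notes, such a motion test could be combined with focus detection rather than used alone.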
An alternate source for movement sensing relates to image blur.
According to an alternate embodiment of the present invention, the camera 30 mode, for imaging mode or command mode, is determined using a combination of both acceleration data and focus detection. Image analysis detects camera 30 motion and provides a movement signal that is indicative of accelerometer data and, optionally, image analysis.
Figure 7 shows a table that lists, by way of example and not by way of limitation, some of the characteristic movement patterns of camera 30 that can be readily detected by the one or more accelerometers used in motion sensing element 50, consistent with an embodiment of the present invention. Each block of the table shows a movement pattern with its movement pattern vector and shows the corresponding orthogonal axes over which movement can be sensed. A dot in each block represents the beginning of a movement pattern; the arrow shows movement direction. A vector V1a shows a left-to-right (L-R) movement pattern, measured relative to x and y axes. A vector V1b shows the opposite right-to-left (R-L) movement pattern, measured relative to x and y axes. A vector V1c shows a vertical movement pattern in the upward direction, measured relative to x and y axes. A vector V2a shows a vertical movement pattern in the downward direction, measured relative to x and y axes. Vectors V2b and V2c show right angle movement patterns, relative to x-y axes. Vectors V3a and V3b show triangular movement patterns, relative to x-y axes. Vectors V3c and V4a show circular movement patterns in different clockwise (CW) and counter-clockwise (CCW) directions, measured relative to the indicated x-z and y-z axes. In addition to those patterns shown, other movement patterns, such as "w" shaped movement patterns, relative to two or more axes, can be used. Each of these characteristic movement patterns can be detected using the arrangement of one, two, or three accelerometers for motion sensing element 50 as described previously with reference to Figure 2. Each of these and other movement patterns can be used to provide a movement signal that indicates an operator instruction. As noted previously, image processing can also be used to supplement or to verify or validate movement data from motion sensing element 50 according to detection of motion blur.
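For the straight-line patterns of Figure 7, one simple way to match a measured gesture against the predetermined set, sketched here under assumptions (unit direction vectors and cosine similarity are illustrative choices, not taken from the disclosure), is:

```python
import math

# Linear movement patterns of Figure 7 as unit direction vectors in the
# x-y plane: V1a left-to-right, V1b right-to-left, V1c up, V2a down.
PATTERNS = {
    "V1a (L-R)":  (1.0, 0.0),
    "V1b (R-L)":  (-1.0, 0.0),
    "V1c (up)":   (0.0, 1.0),
    "V2a (down)": (0.0, -1.0),
}

def classify_direction(dx: float, dy: float) -> str:
    """Return the pattern whose direction best matches the measured net
    displacement (dx, dy), using cosine similarity against each pattern."""
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        raise ValueError("no net movement to classify")
    return max(PATTERNS,
               key=lambda name: (dx * PATTERNS[name][0] +
                                 dy * PATTERNS[name][1]) / norm)

print(classify_direction(0.9, 0.1))     # V1a (L-R): mostly rightward
print(classify_direction(-0.05, -1.2))  # V2a (down): mostly downward
```

Circular and triangular patterns would need a richer comparison over the whole time series, not just the net displacement.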
Consistent with an embodiment of the present invention, the logic flow diagram of Figure 8 shows a sequence of steps used for executing user interface instructions according to detected movement in command mode. This sequence of steps runs continuously when in command mode. In a mode decision step S100, the mode of operation of intra-oral camera 30 is detected as either imaging mode or command mode. For the camera embodiment shown in Figure 2, switch 22 indicates the camera mode as either command mode or imaging mode. If in imaging mode, movement of the intra-oral camera 30 is not interpreted for command entry. If the camera 30 is in command mode, then a pattern acquisition step S110 executes, acquiring the measurement data from motion sensing element 50. Where motion sensing element 50 is a single 3-D accelerometer, for example, a time series of 3-dimensional vector data acquired by the accelerometer is provided as input to the gesture detection steps shown here. A number of measurements are obtained for characterizing the movement pattern, as was described previously with reference to the graphs of Figures 5 and 6. An optional noise compensation step S120 eliminates noise data from random movement, such as unintended or incidental movement along or about one or possibly two of the orthogonal axes. This noise removal is useful because the practitioner is not likely to move the camera 30 in precisely one direction or to provide rotation that is symmetrical about a single axis, for example. Vibration from the dental office environment or from nearby equipment can also add some noise content. With respect to the example of Figure 6, for example, movement along the y and z directions appears to be unintended, whereas movement along the x axis appears to be intentional and is prominent.
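One plausible form for the optional noise compensation step, offered as a sketch under assumptions (the suppression ratio and function name are illustrative, not specified in the disclosure), is to zero out axes whose excursion is small relative to the dominant axis:

```python
def suppress_incidental_axes(series_by_axis, ratio=0.3):
    """Sketch of noise compensation: zero out any axis whose peak
    excursion falls below `ratio` times the dominant axis's peak,
    treating it as unintended or incidental movement."""
    peaks = {ax: max(abs(v) for v in vals)
             for ax, vals in series_by_axis.items()}
    dominant = max(peaks.values())
    return {ax: (vals if peaks[ax] >= ratio * dominant
                 else [0.0] * len(vals))
            for ax, vals in series_by_axis.items()}

# Deliberate left-to-right sweep along x, with incidental y/z wobble:
cleaned = suppress_incidental_axes({
    "x": [0.1, 0.9, 1.0, 0.4],
    "y": [0.05, -0.04, 0.06, 0.02],
    "z": [-0.03, 0.02, 0.01, 0.04],
})
print(cleaned["y"])  # [0.0, 0.0, 0.0, 0.0] -- suppressed as noise
```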
Continuing with the sequence shown in Figure 8, a pattern identification step S130 can then be executed, identifying the most likely movement pattern indicated by the measured data, such as that shown in the table of Figure 7. Once the most likely pattern is identified, an instruction identification step S140 then correlates the movement pattern to a corresponding instruction. An instruction execution step S150 then executes the entered instruction. The sequence continues for additional command entry, looping back to mode decision step S100 as shown.
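The instruction identification and execution steps amount to a lookup from pattern to handler. A minimal sketch follows; the handler names and the particular pattern-to-command assignments are hypothetical stand-ins for real display commands:

```python
# Hypothetical display-command handlers (illustrative only):
def zoom_in():  return "zoom-in"
def zoom_out(): return "zoom-out"
def pan_left(): return "pan-left"

INSTRUCTION_TABLE = {           # pattern -> instruction mapping
    "V3c (CW circle)":  zoom_in,
    "V4a (CCW circle)": zoom_out,
    "V1b (R-L)":        pan_left,
}

def handle_gesture(pattern_name: str):
    """Look up and execute the instruction for the identified movement
    pattern; return None for unrecognized or unmapped gestures."""
    handler = INSTRUCTION_TABLE.get(pattern_name)
    return handler() if handler else None

print(handle_gesture("V3c (CW circle)"))  # zoom-in
print(handle_gesture("unknown"))          # None
```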
Ambiguous movement data is possible, and the practitioner observes the display 20 screen to ascertain that the intended instruction has been received. In some cases, a prompt or other verification is posted to the display screen, requesting clarification or verification of the entered command. Such verification can be useful, for example, with a movement pattern that may be ambiguous, such as the movement pattern shown by vector V1c or V2a in Figure 7.
Consistent with an embodiment of the present invention, a standard set of predetermined movement patterns for intra-oral camera 30 is provided, with each pattern identifying a unique, corresponding instruction for operator entry, such as the set of movement patterns shown in Figure 7. A default set of movement characteristics is initially used. However, motion sensing element 50 can further be trained to identify additional instructions or trained to respond to a particular set of instructions from the practitioner. Gesture training software, optionally provided as part of intra-oral imaging system 10 (Figure 2), uses many of the same components that are employed for gesture detection. According to an embodiment of the present invention, software containing training algorithms for resetting and calibrating gestures with intra-oral camera 30 is provided as part of processor 40 (Figure 2).
According to an embodiment of the present invention, the practitioner has a setup utility that allows redefinition of one or more movement patterns as well as allowing additional movement patterns to be defined and correlated with particular operator instructions. This utility can be particularly useful for customizing how the imaging system performs various functions. As just one example, a zoom-in viewing function may be customized to zoom in fixed, discrete increments, such as at 100%, 150%, and 200%, with an increment change effected with each completion of a movement pattern. Alternately, zoom-in can be continuous, so that zoom operation continuously enlarges the imaged object as long as the operator continues the corresponding movement pattern. In addition, the setup utility can also be used to adjust sensitivity and sampling rate of motion sensing element 50 to suit the preferences of the dental practitioner who uses intra-oral imaging system 10.
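The fixed-increment zoom customization mentioned above can be sketched as follows; the step values match the 100%, 150%, 200% example in the text, while the function name is illustrative:

```python
# Discrete zoom customization: each completed movement pattern advances
# the display through fixed increments, wrapping back to the start.
ZOOM_STEPS = [100, 150, 200]  # percent, as in the example above

def next_zoom(current: int) -> int:
    """Return the next configured zoom level after `current`."""
    i = ZOOM_STEPS.index(current)
    return ZOOM_STEPS[(i + 1) % len(ZOOM_STEPS)]

print(next_zoom(100))  # 150
print(next_zoom(200))  # 100 (wraps around)
```

A continuous-zoom variant would instead keep enlarging while the movement pattern continues, as the text describes.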
Motion sensing element 50 can use any suitable number of accelerometers or other devices for measuring motion along orthogonal axes. Options for motion sensing element 50 include the use of only one or two accelerometers, or the use of three or more accelerometers for measuring movement in appropriate directions. MEMS accelerometer devices are advantaged for size, availability, and cost; other accelerometer types can alternately be used. Alternately, gyroscopes, magnetometers, and other devices that are capable of measuring movement along an axis or rotation about an axis can be used.
Consistent with an embodiment of the present invention, a host processor or computer executes a program with stored instructions that provide imaging functions and instruction sensing functions in accordance with the method described. As can be appreciated by those skilled in the image processing arts, a computer program of an embodiment of the present invention can be utilized by a suitable, general-purpose computer system, such as a personal computer or workstation. However, many other types of computer systems can be used to execute the computer program of the present invention, including networked processors. The computer program for performing the method of the present invention may be stored in a computer readable storage medium. This medium may comprise, for example; magnetic storage media such as a magnetic disk (such as a hard drive) or magnetic tape or other portable type of magnetic disk; optical storage media such as an optical disc, optical tape, or machine readable bar code; solid state electronic storage devices such as random access memory (RAM), or read only memory (ROM); or any other physical device or medium employed to store a computer program. The computer program for performing the method of the present invention may also be stored on computer readable storage medium that is connected to the image processor by way of the internet or other communication medium. Those skilled in the art will readily recognize that the equivalent of such a computer program product may also be constructed in hardware.
It will be understood that the computer program product of the present invention may make use of various image manipulation algorithms and processes that are well known. It will be further understood that the computer program product embodiment of the present invention may embody algorithms and processes not specifically shown or described herein that are useful for implementation. Such algorithms and processes may include conventional utilities that are within the ordinary skill of the image processing arts. Additional aspects of such algorithms and systems, and hardware and/or software for producing and otherwise processing the images or co-operating with the computer program product of the present invention, are not specifically shown or described herein and may be selected from such algorithms, systems, hardware, components and elements known in the art.
The invention has been described in detail with particular reference to a presently preferred embodiment, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention. For example, the function of optional switch 22 (Figure 2) for changing the intra-oral camera 30 between imaging and command modes can be effected by sensing, so that the camera 30 is automatically placed in command mode when movement or imaging data indicate that the camera 30 is not in the patient's mouth. Various movement patterns can be provided in addition to the examples shown in Figure 7. The image processing and operation logic functions described with reference to Figure 2 can be performed on a single processor that resides internally on intra-oral camera 30, on an external host processor 60, or on some combination of internal and external logic processing devices, including one or more networked computers, for example. The presently disclosed embodiments are therefore considered in all respects to be illustrative and not restrictive. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.

Claims

WHAT IS CLAIMED IS:
1. A method for obtaining an intra-oral image, comprising:
emitting illumination from an intra-oral camera toward an object within the mouth of a patient;
obtaining image data content of the object at an image sensor of the intra-oral camera;
displaying the image content obtained from the image sensor;
obtaining one or more movement signals indicative of movement of the intra-oral camera along at least two of three mutually orthogonal axes;
interpreting the one or more obtained movement signals as an operator instruction corresponding to a predetermined movement pattern; and
changing at least the display of the image content according to the operator instruction.
2. The method of claim 1 further comprising changing one or more imaging parameters according to the one or more obtained movement signals.
3. The method of claim 1 further comprising providing an operator interface that is responsive to the one or more obtained movement signals.
4. The method of claim 1 wherein changing the display of the image content changes the zoom magnification of the displayed image content.
5. The method of claim 1 wherein changing the display of the image content changes the panning of the displayed image content.
6. The method of claim 1 further comprising indicating a menu selection on the display according to the operator instruction.
7. The method of claim 1 wherein interpreting the one or more obtained signals further comprises using results from training software.
8. The method of claim 1 wherein obtaining the one or more signals indicative of movement of the intra-oral camera further comprises sensing a switch position.
9. An intra-oral imaging apparatus comprising:
an intra-oral camera comprising:
(i) a light source that is energizable to emit illumination toward an object that is within the mouth of a patient;
(ii) an imaging sensor that is energizable to obtain image content of the object; and
(iii) a motion sensing element that provides one or more signals indicative of acceleration of the intra-oral camera along at least two of three mutually orthogonal axes;
a display that displays obtained image content from the imaging sensor and that provides a graphical user interface for control of intra-oral camera imaging;
a processor in signal communication with the motion sensing element and configured to recognize an operator instruction according to the signals indicative of a predetermined movement pattern for the camera, detected by the motion sensing element, wherein the recognized operator instruction relates to the displayed image content for the patient and changes at least the graphical user interface on the display; and
a switch that is in signal communication with the processor, wherein a switch position indicates to the processor either to acquire image content or to obtain an operator instruction.
10. The intra-oral imaging apparatus of claim 9 wherein the motion sensing element comprises one or more accelerometers.
11. The intra-oral imaging apparatus of claim 9 wherein the motion sensing element comprises a gyroscope or a magnetometer.
12. The intra-oral imaging apparatus of claim 9 wherein the processor is within a chassis of the intra-oral camera.
13. The intra-oral imaging apparatus of claim 9 wherein the operator instruction from the movement pattern appears on the graphical user interface of the display.
14. The intra-oral imaging apparatus of claim 9 wherein the operator instruction performs a pan or zoom adjustment of the displayed image content.
15. The intra-oral imaging apparatus of claim 12 wherein the processor is in signal communication with the display.
16. The intra-oral imaging apparatus of claim 9 wherein the intra-oral camera provides a signal that indicates whether it is in an imaging mode, within the patient's mouth, or in a command mode, outside the patient's mouth.
17. The intra-oral imaging apparatus of claim 12 wherein the processor is a first processor within the chassis of the intra-oral camera, the apparatus further comprising a second processor in signal communication with the first processor, wherein the second processor is further in signal communication with the display.
18. The intra-oral imaging apparatus of claim 17 wherein the signal communication between the first and second processors is wireless communication.
19. The intra-oral imaging apparatus of claim 16 wherein the intra-oral camera detects motion blur to determine whether it is in the imaging or the command mode.
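The motion-blur test of claim 19, used to distinguish the imaging mode (camera held steady in the mouth) from the command mode (camera being gestured outside the mouth), can be sketched with a standard sharpness measure such as the variance of the Laplacian. This Python sketch is illustrative only; the function names and the threshold value are assumptions, not details from the patent, and a real device would tune the threshold per camera.

```python
import numpy as np

def laplacian_variance(gray):
    """Sharpness score: variance of a 4-neighbour Laplacian.

    Motion-blurred frames have weak edges and therefore low variance.
    """
    g = np.asarray(gray, dtype=float)
    lap = (-4 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return lap.var()

def infer_mode(gray, blur_threshold=50.0):
    """Return 'command' when the frame looks motion-blurred, else 'imaging'.

    blur_threshold is an assumed tuning constant chosen for this example.
    """
    return "command" if laplacian_variance(gray) < blur_threshold else "imaging"
```

A sharp, high-contrast frame scores well above the threshold, while a smeared or featureless frame falls below it and flips the camera into command mode.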
PCT/CN2014/080732 2014-06-25 2014-06-25 Intra-oral imaging using operator interface with gesture recognition WO2015196388A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2016574384A JP2017525411A (en) 2014-06-25 2014-06-25 Intraoral imaging using an operator interface with gesture recognition
EP14895612.1A EP3160356A4 (en) 2014-06-25 2014-06-25 Intra-oral imaging using operator interface with gesture recognition
US15/315,002 US20170300119A1 (en) 2014-06-25 2014-06-25 Intra-oral imaging using operator interface with gesture recognition
PCT/CN2014/080732 WO2015196388A1 (en) 2014-06-25 2014-06-25 Intra-oral imaging using operator interface with gesture recognition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2014/080732 WO2015196388A1 (en) 2014-06-25 2014-06-25 Intra-oral imaging using operator interface with gesture recognition

Publications (1)

Publication Number Publication Date
WO2015196388A1 (en) 2015-12-30

Family

ID=54936457

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/080732 WO2015196388A1 (en) 2014-06-25 2014-06-25 Intra-oral imaging using operator interface with gesture recognition

Country Status (4)

Country Link
US (1) US20170300119A1 (en)
EP (1) EP3160356A4 (en)
JP (1) JP2017525411A (en)
WO (1) WO2015196388A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018053044A1 (en) * 2016-09-14 2018-03-22 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with operation based on movement detection
US10213180B2 (en) 2016-09-14 2019-02-26 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with operation based on magnetic field detection
CN109414153A (en) * 2016-05-26 2019-03-01 口腔智能镜公司 Dental-mirrors and its application with integrated camera
US10299742B2 (en) 2016-09-14 2019-05-28 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with fault condition detection
CN110035700A (en) * 2016-09-14 2019-07-19 登塔尔图像科技公司 The operation based on state of multiplanar imaging sensor and the imaging system comprising multiplanar imaging sensor
JP2020505189A (en) * 2017-01-06 2020-02-20 フォトニケア,インコーポレイテッド Self-oriented imaging device and method of using same
US11571274B2 (en) 2019-03-18 2023-02-07 J. Morita Mfg. Corp. Dental instrument, control method thereof, and three-dimensional measuring

Families Citing this family (4)

Publication number Priority date Publication date Assignee Title
KR102539649B1 (en) * 2017-03-20 2023-06-01 쓰리세이프 에이/에스 3D scanner system with handheld scanner
CN108511050B (en) * 2018-02-12 2022-01-07 苏州佳世达电通有限公司 Mouth scanning machine, mouth scanning system and control method of mouth scanning machine
KR102237033B1 (en) * 2019-03-06 2021-04-07 주식회사 디디에스 Oral scanner that can automatically change a scan mode and method for scanning using thereof
CN110859640A (en) * 2019-11-13 2020-03-06 先临三维科技股份有限公司 Scanner, operation method, device and system thereof, storage medium and processor

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
JP3532660B2 (en) * 1995-06-09 2004-05-31 オリンパス株式会社 Body cavity observation device
US6175301B1 (en) * 1999-03-19 2001-01-16 Gregory H. Piesinger Low tire pressure warning system
US7173604B2 (en) * 2004-03-23 2007-02-06 Fujitsu Limited Gesture identification of controlled devices
WO2007137093A2 (en) * 2006-05-16 2007-11-29 Madentec Systems and methods for a hands free mouse
US20080097176A1 (en) * 2006-09-29 2008-04-24 Doug Music User interface and identification in a medical device systems and methods
JP5570801B2 (en) * 2009-12-23 2014-08-13 株式会社モリタ製作所 Medical treatment equipment
JP2011218140A (en) * 2010-03-23 2011-11-04 Panasonic Corp Intraoral camera
US8571281B2 (en) * 2010-07-13 2013-10-29 Carestream Health, Inc. Dental shade mapping
WO2012076013A1 (en) * 2010-12-06 2012-06-14 3Shape A/S System with 3d user interface integration
NZ611792A (en) * 2010-12-22 2014-12-24 Spark Dental Technology Ltd Dental charting system
JP5651132B2 (en) * 2011-01-11 2015-01-07 株式会社アドバンス Intraoral radiography display system
US10372292B2 (en) * 2013-03-13 2019-08-06 Microsoft Technology Licensing, Llc Semantic zoom-based navigation of displayed content

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
EP1277432A1 (en) * 2001-07-17 2003-01-22 Byteworks GmbH & Co. KG Apparatus for viewing and taking pictures of objects in the mouth of a patient and processing
CN102193626A (en) * 2010-03-15 2011-09-21 欧姆龙株式会社 Gesture recognition apparatus, method for controlling gesture recognition apparatus, and control program
CN103393423A (en) * 2013-08-02 2013-11-20 广州医学院第一附属医院 Oral cavity detecting system

Non-Patent Citations (1)

Title
See also references of EP3160356A4 *

Cited By (19)

Publication number Priority date Publication date Assignee Title
CN109414153A (en) * 2016-05-26 2019-03-01 口腔智能镜公司 Dental-mirrors and its application with integrated camera
JP2019526424A (en) * 2016-09-14 2019-09-19 デンタル・イメージング・テクノロジーズ・コーポレーション Multidimensional imaging sensor with fault condition detection
KR20190047034A (en) * 2016-09-14 2019-05-07 덴탈 이미징 테크놀로지스 코퍼레이션 Multidimensional Imaging Sensor with Magnetic Field Sensing Based Operation
JP2019534117A (en) * 2016-09-14 2019-11-28 デンタル・イメージング・テクノロジーズ・コーポレーション Multidimensional imaging sensor with operation based on motion detection
US10925571B2 (en) 2016-09-14 2021-02-23 Dental Imaging Technologies Corporation Intra-oral imaging sensor with operation based on output of a multi-dimensional sensor
CN110035700A (en) * 2016-09-14 2019-07-19 登塔尔图像科技公司 The operation based on state of multiplanar imaging sensor and the imaging system comprising multiplanar imaging sensor
CN110035699A (en) * 2016-09-14 2019-07-19 登塔尔图像科技公司 The multiplanar imaging sensor operated based on mobile detection
US10390788B2 (en) 2016-09-14 2019-08-27 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with operation based on detection of placement in mouth
WO2018053044A1 (en) * 2016-09-14 2018-03-22 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with operation based on movement detection
CN110035699B (en) * 2016-09-14 2024-02-06 登塔尔图像科技公司 Multidimensional imaging sensor operating based on movement detection
US10213180B2 (en) 2016-09-14 2019-02-26 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with operation based on magnetic field detection
US10299742B2 (en) 2016-09-14 2019-05-28 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with fault condition detection
US10932733B2 (en) 2016-09-14 2021-03-02 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor with operation based on movement detection
KR102281221B1 (en) 2016-09-14 2021-07-26 덴탈 이미징 테크놀로지스 코퍼레이션 Multidimensional imaging sensor with magnetic field sensing-based actuation
EP3512427B1 (en) * 2016-09-14 2023-06-28 Dental Imaging Technologies Corporation Multiple-dimension imaging sensor and state-based operation of an imaging system including a multiple-dimension imaging sensor
US11576568B2 (en) 2017-01-06 2023-02-14 Photonicare Inc. Self-orienting imaging device and methods of use
JP7092382B2 (en) 2017-01-06 2022-06-28 フォトニケア,インコーポレイテッド Self-oriented imaging device and how to use it
JP2020505189A (en) * 2017-01-06 2020-02-20 フォトニケア,インコーポレイテッド Self-oriented imaging device and method of using same
US11571274B2 (en) 2019-03-18 2023-02-07 J. Morita Mfg. Corp. Dental instrument, control method thereof, and three-dimensional measuring

Also Published As

Publication number Publication date
US20170300119A1 (en) 2017-10-19
EP3160356A4 (en) 2018-01-24
JP2017525411A (en) 2017-09-07
EP3160356A1 (en) 2017-05-03

Similar Documents

Publication Publication Date Title
US20170300119A1 (en) Intra-oral imaging using operator interface with gesture recognition
US8411034B2 (en) Sterile networked interface for medical systems
US11625841B2 (en) Localization and tracking method and platform, head-mounted display system, and computer-readable storage medium
EP3389020B1 (en) Information processing device, information processing method, and program
US10492873B2 (en) Medical spatial orientation system
US9134800B2 (en) Gesture input device and gesture input method
JP2020510461A (en) Augmented reality intervention system providing context overlay
JP5412227B2 (en) Video display device and display control method thereof
JP6390799B2 (en) Input device, input method, and program
CN103677259B (en) For guiding the method for controller, multimedia device and its target tracker
EP2189835A1 (en) Terminal apparatus, display control method, and display control program
US20170122771A1 (en) Sensor output configuration
CN108027656A (en) Input equipment, input method and program
WO2018074045A1 (en) Information processing device, information processing method, and program
KR101365083B1 (en) Interface device using motion recognition and control method thereof
KR20170117650A (en) Photographing method and electronic device supporting the same
US20190354177A1 (en) Information processing apparatus, information processing method, and non-transitory computer readable recording medium
US10823964B2 (en) Work assistance apparatus, work assistance method, and computer-readable, non-transitory recording medium recording work assistance program executed by computer
US9300908B2 (en) Information processing apparatus and information processing method
JP6679083B2 (en) Information processing system, information processing method, wearable terminal, and program
JP2023168746A (en) Information processing apparatus, information processing system, information processing method, and program
JP6008904B2 (en) Display control apparatus, display control method, and program
JP6661882B2 (en) Information processing apparatus, tremor information display method, information processing system, and program
JP7427937B2 (en) Image processing device, image processing method, and program
WO2024106223A1 (en) Imaging device, imaging work assistance method, and imaging work assistance program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 14895612; Country of ref document: EP; Kind code of ref document: A1
WWE Wipo information: entry into national phase
Ref document number: 15315002; Country of ref document: US
REEP Request for entry into the european phase
Ref document number: 2014895612; Country of ref document: EP
WWE Wipo information: entry into national phase
Ref document number: 2014895612; Country of ref document: EP
ENP Entry into the national phase
Ref document number: 2016574384; Country of ref document: JP; Kind code of ref document: A
NENP Non-entry into the national phase
Ref country code: DE