WO2021157415A1 - Toothbrush system, and program for toothbrush system - Google Patents

Toothbrush system, and program for toothbrush system

Info

Publication number
WO2021157415A1
WO2021157415A1 (PCT/JP2021/002528)
Authority
WO
WIPO (PCT)
Prior art keywords
toothbrush
posture
brush
information
present disclosure
Prior art date
Application number
PCT/JP2021/002528
Other languages
French (fr)
Japanese (ja)
Inventor
真美 筒井
真人 布村
浩輝 篠田
侑樹 二之宮
Original Assignee
Panasonic IP Management Co., Ltd. (パナソニックIPマネジメント株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2021157415A1 publication Critical patent/WO2021157415A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C: DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C17/00Devices for cleaning, polishing, rinsing or drying teeth, teeth cavities or prostheses; Saliva removers; Dental appliances for receiving spittle
    • A61C17/16Power-driven cleaning or polishing devices
    • A61C17/22Power-driven cleaning or polishing devices with brushes, cushions, cups, or the like

Definitions

  • This disclosure relates to a toothbrush system capable of estimating a toothbrushing part and a program for a toothbrush system.
  • Patent Documents 1 and 2 describe that the accuracy of identifying the tooth-brushing site is improved by using a camera, a temperature sensor, or a distance sensor.
  • Patent Document 3 points out that using many sensors in combination, as in Patent Documents 1 and 2, makes the device vulnerable to dirt and complicates the wiring, and simplifies the configuration with means for detecting contact or proximity.
  • Patent Document 4 discloses a technique for detecting the brushed site by using images taken by a camera of a personal hygiene device, assumed to be a smartphone or a tablet PC, together with an inertial sensor mounted on the toothbrush.
  • The present disclosure provides a toothbrush system capable of highly accurate detection of the tooth-brushing site.
  • The toothbrush system of the present disclosure comprises a toothbrush that has a brush portion for brushing teeth and a posture detection sensor for detecting its posture, and a terminal device. The system further comprises a posture information generation unit that generates posture information indicating the posture of the toothbrush based on the output of the posture detection sensor, and an estimation unit that estimates the contact site of the brush portion in the dentition based on the posture information and on image information, acquired from a camera, that includes the face of the person who has inserted the toothbrush into the oral cavity.
  • FIG. 1 is a view showing a side surface of a toothbrush of a toothbrush system and a front surface of a terminal device.
  • FIG. 2 is a view showing the front surface of the toothbrush of the toothbrush system and the front surface of the terminal device.
  • FIG. 3 is a block diagram showing a functional configuration of the toothbrush system in the present disclosure.
  • FIG. 4 is a diagram showing variations of image information including a face portion during tooth brushing.
  • FIG. 5 is a flowchart showing a classification narrowing method based on a threshold value determination of posture information (yaw angle).
  • FIG. 6 is a flowchart showing a classification method based on threshold determination of posture information (pitch angle and roll angle).
  • FIG. 7 is a flowchart showing an estimation method of the estimation unit based on the posture information when the brush portion is determined to be the buccal side based on the image information.
  • FIG. 8 is a flowchart showing an estimation method of the estimation unit based on the posture information when the brush unit is determined to be the lingual side based on the image information.
  • the toothbrush system 100 is a system capable of acquiring a positional relationship between a tooth and a toothbrush in the oral cavity, and includes a toothbrush 110 and a terminal device 150.
  • The toothbrush 110 is an instrument that is partially inserted into the oral cavity while held in the hand to brush the teeth, and includes a handle portion 111, a shaft portion 112, a brush portion 113, and a posture detection sensor 120. In the present embodiment, the toothbrush 110 further includes a first communication device 117 and a control means 118 that makes the posture information generation unit 121 function.
  • The handle portion 111 is the portion gripped when brushing the teeth, and has a substantially cylindrical shape.
  • the toothbrush 110 is an electric toothbrush in which the brush portion 113 is driven by power, and the drive means 119, the control means 118, and the power supply portion 116 are provided inside the handle portion 111.
  • a drive switch 114 that can be operated by fingers is attached to the outer peripheral surface of the handle portion 111.
  • The drive means 119 is a rotary actuator such as an electric motor, a linear actuator using electromagnetic force, a piezoelectric element, a magnetostrictive element, or the like, and operates the brush portion 113 through mechanical elements such as gears and links.
  • the power supply unit 116 is a battery that supplies electric power to the drive means 119, the posture detection sensor 120, the first communication device 117, and the like. At least one of a primary battery, a secondary battery, and a commercial power source is used for the power supply unit 116.
  • the drive switch 114 is a device that can determine whether or not to drive the brush unit 113.
  • the drive switch 114 is a push button switch, and the operation of the drive means 119 can be turned on and off via the control means 118 by being operated by a finger.
  • the control means 118 is a device that controls the operation of the toothbrush 110 by inputting and outputting information.
  • the control means 118 is a device called SoC (System on a Chip), and a function for controlling the operation of the toothbrush 110 is mounted on the semiconductor chip.
  • the control means 118 may be an analog control device such as a simple electrical contact or a relay.
  • the first communication device 117 is a device for performing information communication with the terminal device 150.
  • the first communication device 117 can transmit the posture information determined by the posture information generation unit 121 based on the information detected by the posture detection sensor 120.
  • The communication method of the first communication device 117 is not particularly limited and may be either wired or wireless; in the present embodiment, wireless communication such as Bluetooth (registered trademark) is adopted.
  • the first communication device 117 may be incorporated in the control means 118.
  • the posture detection sensor 120 is a sensor attached to the handle portion 111 to detect the posture of the toothbrush 110.
  • The type of the posture detection sensor 120 and the physical quantities it can detect are not particularly limited; examples include a one-axis, two-axis, or three-axis acceleration sensor, a gyro sensor, and a geomagnetic sensor, and the sensor only needs to include at least one of these. Further, an inertial sensor capable of detecting position, direction, acceleration, and speed in one unit may be adopted as the posture detection sensor 120.
  • the posture information generation unit 121 is a processing unit that generates posture information indicating the posture of the toothbrush 110 based on the output of the posture detection sensor 120. In the case of the present embodiment, the posture information generation unit 121 functions by causing the control means 118 to execute the program.
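As a concrete illustration (not from the patent, which leaves the computation unspecified): when the posture detection sensor 120 is an acceleration sensor, the posture information generation unit 121 could derive pitch and roll angles from the gravity vector. The function name and axis convention below are assumptions for illustration.

```python
import math

def posture_from_accel(ax, ay, az):
    """Estimate pitch and roll (in degrees) from a 3-axis accelerometer
    reading, taking gravity as the reference direction. The axis
    convention here is an illustrative assumption, not the patent's."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Handle held horizontally, bristles (Y) pointing straight up against gravity:
print(posture_from_accel(0.0, 1.0, 0.0))  # (0.0, 90.0)
```

Note that gravity alone cannot disambiguate rotation about the vertical axis, which is one reason a gyro or geomagnetic sensor may be combined as described above.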
  • the shaft portion 112 is a rod-shaped portion connected in a protruding shape from the end portion of the handle portion 111 in the axial direction of the handle portion 111 (Z-axis direction in FIGS. 1 and 2). When the tooth is wiped with the toothbrush 110, at least a part of the shaft portion 112 may be inserted into the oral cavity together with the brush portion 113.
  • The brush portion 113 is connected to the tip portion of the shaft portion 112 opposite to the handle portion 111 (the Z+ side end in FIGS. 1 and 2), and has a plurality of brush bristles extending in a direction (the Y-axis direction in FIGS. 1 and 2) intersecting the axial direction of the shaft portion 112 (the Z-axis direction in FIGS. 1 and 2).
  • the brush portion 113 is formed by flocking the brush bristles on the base member, and is removable from the handle portion 111.
  • the terminal device 150 is a device capable of performing information communication with the toothbrush 110, and performs information processing by executing a program.
  • The terminal device 150 includes an input means 152, a display means 153, a camera 154, a processing unit 160 that realizes an estimation unit 161, and a second communication device 151.
  • the type of the terminal device 150 is not particularly limited, and the processing unit 160 may be realized by a device dedicated to the toothbrush 110. Further, as in the case of the present embodiment, the processing unit 160 may be realized by executing the program on a general-purpose device such as a so-called smartphone. Further, the terminal device 150 may be either a portable type or a stationary type.
  • the toothbrush system 100 may connect a plurality of terminal devices 150 and a plurality of toothbrushes 110 via a network.
  • the second communication device 151 is a device for performing information communication with at least the toothbrush 110.
  • the second communication device 151 receives the posture information detected by the posture detection sensor 120 and generated by the posture information generation unit 121.
  • the terminal device 150 may transmit information to the toothbrush 110 via the second communication device 151 to change the vibration state of the brush unit 113.
  • the input means 152 is a man-machine interface such as a so-called touch panel.
  • the input means 152 is a touch panel provided on the front surface of the display means 153.
  • the input means 152 may be a physical push button such as a volume button.
  • the display means 153 is a device such as a liquid crystal display device or an organic EL (Electroluminescence) display device that visually displays text, an image, a video, or the like.
  • the camera 154 images the face of a person who has the toothbrush 110 inserted into the oral cavity and generates image information.
  • The type of the camera 154 is not particularly limited; examples include an apparatus comprising a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor and a lens system that forms an image on the image sensor.
  • The camera 154 is a so-called in-camera, which captures an image of the face present on the side on which the display means 153 displays images and the like.
  • the face part is the part of the face that includes the lips into which the toothbrush is inserted, and further includes the cheeks, chin, etc.
  • the face may include the nose, eyes, eyebrows, forehead, ears, temporomandibular joints, facial contours and the like.
  • the processing unit 160 is an arithmetic unit that realizes various processing units by executing a program, and in the case of the present embodiment, the estimation unit 161 is made to function by executing the program.
  • the estimation unit 161 estimates the contact portion of the brush unit 113 in the dentition based on the posture information transmitted from the toothbrush 110 and the image information including the face portion obtained from the camera 154. The details of the estimation method will be described later.
  • FIG. 4 is a diagram showing variations of image information including a face portion during tooth brushing.
  • The estimation unit 161 estimates the bulging position of the cheek and the state of the lips based on the image information acquired from the camera 154, and estimates the contact site of the brush portion 113 in the dentition based additionally on the posture of the toothbrush 110 and the like.
  • The estimation method of the estimation unit 161 is not particularly limited. For example, artificial intelligence such as deep learning may be trained using, as teacher information, a large number of pieces of image information including the face portion as shown in FIG. 4 (such as an image in which the brush portion 113 is inserted on the right side) in different states during brushing, together with the contact site of the brush portion 113 and the posture information of the toothbrush 110 at that time. The contact site of the brush portion 113 may then be estimated from the image information acquired from the camera 154 and the posture information using the trained artificial intelligence.
  • The sites estimated as contact sites of the brush portion 113 may be, for each tooth, (1) the maxillary side or mandibular side and (2) the lingual side, buccal side, or occlusal (meshing) side (in the case of molars). If estimating the position of each tooth is difficult, the dentition may be divided into five regions: leftmost, left, center, right, and rightmost.
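The classification labels described above can be captured in a small data model; the following is a sketch whose names are ours, not the patent's:

```python
from dataclasses import dataclass

JAW = ("maxillary", "mandibular")            # (1) upper / lower jaw
SURFACE = ("lingual", "buccal", "occlusal")  # (2) tongue / cheek / meshing side
# Coarse fallback regions when per-tooth estimation is difficult:
REGIONS = ("leftmost", "left", "center", "right", "rightmost")

@dataclass(frozen=True)
class ContactSite:
    jaw: str      # one of JAW
    surface: str  # one of SURFACE
    region: str   # one of REGIONS

site = ContactSite(jaw="mandibular", surface="buccal", region="left")
print(site)  # ContactSite(jaw='mandibular', surface='buccal', region='left')
```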
  • According to this configuration, the tooth-brushing site can be estimated accurately.
  • The estimation unit 161 may estimate whether the brush portion 113 is on the lingual side or the buccal side based on the deformation of at least one of the lips and the cheeks included in the image information acquired from the camera 154. Specifically, based on the image information, it is estimated that the brush portion 113 is on the buccal side when the lips and cheeks are in certain states shown in FIG. 4, and on the lingual side when they are in the other states. It is conceivable that the estimation unit 161 makes this determination using artificial intelligence. In this case, since the image information only needs to be classified into two states, the lingual side and the buccal side, the artificial intelligence can be trained easily, and the estimation process using the trained artificial intelligence is also easy.
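A toy stand-in for this two-class decision is shown below. The patent envisions a trained model; the single feature and the threshold here are hypothetical placeholders.

```python
def lingual_or_buccal(cheek_bulge_score: float, threshold: float = 0.5) -> str:
    """Binary classification of the brush position from image features.
    A visible cheek bulge suggests the brush is on the buccal side;
    otherwise assume the lingual side. The score and threshold are
    hypothetical placeholders for a trained two-class model."""
    return "buccal" if cheek_bulge_score > threshold else "lingual"

print(lingual_or_buccal(0.8))  # buccal
print(lingual_or_buccal(0.1))  # lingual
```

Restricting the image model to a binary output, as the text notes, keeps both training and inference simple compared with predicting the full contact site from images alone.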
  • The estimation unit 161 may also estimate that the brush portion 113 is outside the oral cavity; in that case, the subsequent processing need not be performed.
  • After the above two-way classification is completed, the estimation unit 161 may estimate the portion of the dentition where the brush portion 113 is located by classifying the posture information with predetermined threshold values.
  • Based on the pitch angle (angle around the Y-axis), yaw angle (angle around the X-axis), and roll angle (angle around the Z-axis) included in the posture information, the estimation unit 161 may perform threshold determinations as shown in FIGS. 5 to 8 to estimate the portion of the dentition with which the brush portion 113 is in contact.
  • the X-axis, Y-axis, and Z-axis shown in FIGS. 1 and 2 do not indicate the three axes of space, but indicate the three axes fixed to the toothbrush 110.
  • the X-axis, Y-axis, and Z-axis follow the posture of the toothbrush 110.
  • The origins of the X-axis, Y-axis, and Z-axis are at the center of gravity of the toothbrush 110. The Z-axis is the alignment direction of the handle portion 111, the shaft portion 112, and the brush portion 113; the Y-axis is the direction in which the brush bristles of the brush portion 113 extend; and the X-axis is fixed in the direction orthogonal to the Y-axis and the Z-axis.
  • The nth range (n is an integer) shown in FIGS. 5 and 6 indicates a preset angle range; ranges indicated by adjacent numbers are adjacent to each other, and the angle ranges do not overlap.
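The two-stage threshold procedure of FIGS. 5 and 6 might look as follows. The concrete angle ranges and thresholds are invented for illustration, since the published text does not give numeric values.

```python
# Hypothetical yaw "nth ranges": adjacent and non-overlapping, per the text.
YAW_RANGES = [(-180.0, -60.0), (-60.0, 60.0), (60.0, 180.0)]

def yaw_range_index(yaw_deg: float) -> int:
    """FIG. 5 style narrowing: map the yaw angle to its preset range number."""
    for n, (lo, hi) in enumerate(YAW_RANGES, start=1):
        if lo <= yaw_deg < hi:
            return n
    return len(YAW_RANGES)  # clamp yaw == +180 into the last range

def classify_by_thresholds(yaw_deg, pitch_deg, roll_deg):
    """FIG. 6 style classification: after yaw narrows the candidates,
    pitch and roll thresholds pick the jaw and surface. All thresholds
    here are assumptions, not the patent's values."""
    n = yaw_range_index(yaw_deg)
    jaw = "maxillary" if pitch_deg > 0.0 else "mandibular"
    surface = "buccal" if abs(roll_deg) < 45.0 else "lingual"
    return n, jaw, surface

print(classify_by_thresholds(0.0, 30.0, 10.0))  # (2, 'maxillary', 'buccal')
```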
  • The estimation process can be sped up by limiting the image-based estimation to some sites and estimating the others by threshold determination.
  • The estimation unit 161 may also use the posture information for at least one of estimating whether the brush portion 113 is on the occlusal (meshing) surface and estimating whether it is on the front side or the lingual side. Specifically, based on the roll angle included in the posture information, it may be determined whether the brush portion 113 is in contact with the occlusal surface of the molars or with the central lingual side, and this determination procedure may be omitted from the classification procedures of FIGS. 5 to 8.
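The roll-angle test described here could be sketched like this; the threshold is an assumption, as the patent gives no numeric value.

```python
def occlusal_or_central_lingual(roll_deg: float,
                                threshold_deg: float = 60.0) -> str:
    """Distinguish the molar occlusal (meshing) surface from the central
    lingual side using only the roll angle: a small |roll| means the
    bristles press roughly straight onto the chewing surface, while a
    large |roll| suggests the brush is rotated toward the central
    lingual side. The 60-degree threshold is illustrative."""
    return ("molar_occlusal" if abs(roll_deg) < threshold_deg
            else "central_lingual")

print(occlusal_or_central_lingual(15.0))   # molar_occlusal
print(occlusal_or_central_lingual(110.0))  # central_lingual
```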
  • In Operation Example 3, erroneous detection of the occlusal surface or the central lingual side by the image information of Operation Example 2 can be excluded, which both speeds up the processing and improves its accuracy.
  • the present disclosure is not limited to the above-described embodiment.
  • Another embodiment realized by arbitrarily combining the components described in the present specification, or by excluding some of them, may also be an embodiment of the present disclosure.
  • The present disclosure also includes modifications obtained by applying various changes conceivable by those skilled in the art to the above embodiment, within the scope of the gist of the present disclosure, that is, within the meaning indicated by the wording of the claims.
  • the toothbrush 110 may not include the posture information generation unit 121, and the processing unit 160 of the terminal device 150 may make the posture information generation unit 121 function.
  • the camera 154 may be a separate body from the toothbrush 110 and the terminal device 150, and may be provided by, for example, a mirror (smart mirror) in the washroom. Further, the smart mirror may function as the terminal device 150.
  • the user interface between the device and the user of the present disclosure may have various embodiments.
  • the user interface may be configured to include an input interface and / or an output interface.
  • the input interface is used by the user to input information to the devices of the present disclosure.
  • the output interface is used by the apparatus of the present disclosure to output information to the user.
  • Various embodiments can be considered as the input interface.
  • the input interface may be composed of mechanical operating members.
  • the input interface may be configured by a transparent plate-shaped operating member installed above the display. As such a transparent plate-shaped operating member, a contact type or a non-contact type can be used.
  • The device of the present disclosure may use, as an input interface, a camera that photographs the user's motion, the motion being recognized by the device. Further, the device of the present disclosure may receive sounds emitted by the user as an input interface; a smart speaker or the like corresponds to such a configuration.
  • various embodiments can be considered as the output interface.
  • The output interface may be configured as a display. As the display, a segment-type display device, a liquid crystal display, an organic EL display, or the like can be used. The output interface may also be configured by turning a light such as an LED on or off, or by displaying an image with a display or a projector.
  • the output interface may be configured by the device of the present disclosure producing a sound. Further, the device of the present disclosure may configure an output interface by stimulating the user's tactile sensation.
  • various embodiments can be considered with respect to where the user interface is provided.
  • The user interface may be provided in the device of the present disclosure, or may be provided separately from it. When the user interface is provided separately, the user interface and the device of the present disclosure can communicate by wire or wirelessly. In this case, they may communicate directly, or indirectly via the Internet, an access point, or the like. In the case of wireless communication, a mobile communication method or another standard may be used, and either long-range radio or proximity (short-range) radio may be used.
  • the control means in the present disclosure may be any one capable of controlling the device in the present disclosure.
  • the apparatus of the present disclosure may be controlled by a controller or a control unit or a word similar thereto in addition to the control means.
  • the control means can be realized in various aspects.
  • A processor may be used as the control means. If a processor is used, various processes can be executed by having the processor read a program from the storage medium in which it is stored and execute it. Since the processing content can then be changed by changing the program stored in the storage medium, the degree of freedom in changing the control content is increased.
  • Examples of the processor include a CPU (Central Processing Unit) and an MPU (Micro-Processing Unit).
  • Examples of the storage medium include a hard disk, a flash memory, an optical disk, and the like.
  • As the control means, wired logic, in which the program cannot be rewritten, may be used. Using wired logic as the control means is effective in improving the processing speed. Examples of wired logic include an ASIC (Application Specific Integrated Circuit).
  • The control means may also be realized by combining a processor and wired logic. Doing so can improve the processing speed while increasing the degree of freedom in software design. Further, the control means and a circuit having a function different from that of the control means may be configured as one semiconductor element.
  • Examples of circuits having another function include an A/D conversion circuit and a D/A conversion circuit.
  • the control means may be composed of one semiconductor element or a plurality of semiconductor elements. When composed of a plurality of semiconductor elements, each control described in the claims may be realized by different semiconductor elements. Further, the control means may be configured by a configuration including a semiconductor element and a passive component such as a resistor or a capacitor.
  • the communicator in the present disclosure may be any one that enables communication between the device of the present disclosure and an external device.
  • The element that enables communication between the device of the present disclosure and an external device may also be referred to as a communication means, a communication device, a transmission/reception means, a transmission/reception unit, or by similar wording.
  • the communicator can be realized in various ways. For example, the communicator may be connected to an external device by wire, or may be connected to an external device by wireless communication. A communicator that connects the device of the present disclosure and an external device by wire is effective in terms of communication security and communication stability.
  • Examples of the wired connection communicator include a wired LAN (Local Area Network) based on the Ethernet (Ethernet: registered trademark) standard, and a wired connection using an optical fiber cable.
  • the wireless connection communicator includes a wireless connection with an external device via a base station or the like, or a direct wireless connection with an external device.
  • Examples of wireless connection with an external device via a base station or the like include an IEEE 802.11-compliant wireless LAN that communicates wirelessly with a Wi-Fi (registered trademark) router, third-generation mobile communication systems (commonly known as 3G), fourth-generation mobile communication systems (commonly known as 4G), WiMAX (registered trademark), and LPWA (Low Power Wide Area); through these, the device of the present disclosure can communicate with an external device.
  • Examples of the communicator that directly wirelessly connects the device of the present disclosure and an external device include communication by Bluetooth (registered trademark), communication by NFC (Near Field Communication) via a loop antenna, infrared communication, and the like.
  • Because the tooth-brushing site can be determined accurately with a simple configuration, the present disclosure is applicable to manual toothbrushes, electric toothbrushes, and the like.
  • Reference numerals: 100 Toothbrush system; 110 Toothbrush; 111 Handle portion; 112 Shaft portion; 113 Brush portion; 114 Drive switch; 116 Power supply unit; 117 First communication device; 118 Control means; 119 Drive means; 120 Posture detection sensor; 121 Posture information generation unit; 150 Terminal device; 151 Second communication device; 152 Input means; 153 Display means; 154 Camera; 160 Processing unit; 161 Estimation unit

Abstract

A toothbrush system (100) comprising: a toothbrush (110) that has a brush part (113) for brushing teeth and an orientation-sensing sensor (120) for sensing the orientation of the toothbrush (110); and a terminal device (150). The toothbrush system (100) also comprises: an orientation information generation unit (121) that generates, on the basis of the output from the orientation-sensing sensor (120), orientation information indicating the orientation of the toothbrush (110); and an estimation unit (161) that estimates a contact site at which the brush part (113) contacts the teeth, on the basis of the orientation information and of image information, acquired from a camera (154), that includes the face of a person into whose oral cavity the toothbrush (110) has been inserted.

Description

Toothbrush system and program for toothbrush system
 The present disclosure relates to a toothbrush system capable of estimating the site being brushed, and to a program for the toothbrush system.
 Conventionally, there is a method of estimating the brushed site from posture information of a toothbrush obtained using an inertial sensor. By identifying the brushed site, a brushing technique suitable for oral hygiene can be taught.
 In general, in brushed-site detection using only an inertial sensor, when the sensor outputs are similar, detailed sites cannot be sufficiently distinguished and erroneous detection may occur. In particular, it is difficult to determine whether the toothbrush is inserted on the buccal side or the lingual side of the teeth, and whether it is inserted on the left side or the right side.
 Therefore, in order to reduce erroneous detection, means other than the inertial sensor may be used together. For example, Patent Documents 1 and 2 describe that the accuracy of identifying the brushed site is improved by using a camera, a temperature sensor, or a distance sensor.
 Patent Document 3 points out that using many sensors in combination, as in Patent Documents 1 and 2, makes the device vulnerable to dirt and complicates the wiring, and simplifies the configuration with means for detecting contact or proximity.
 Patent Document 4 discloses a technique for detecting the brushed site by using images taken by a camera of a personal hygiene device, assumed to be a smartphone or a tablet PC, together with an inertial sensor mounted on the toothbrush.
 However, with conventional techniques that mount a sensor on the brush portion, complication of the toothbrush structure is unavoidable. Moreover, with the techniques described in any of these documents, it is difficult to accurately determine the brushed site, such as whether the toothbrush is placed on the buccal side or the lingual side.
 Patent Document 1: Japanese Unexamined Patent Application Publication No. 2009-240760. Patent Document 2: Japanese Unexamined Patent Application Publication No. 2009-240759. Patent Document 3: Japanese Unexamined Patent Application Publication No. 2011-139844. Patent Document 4: Japanese Translation of PCT International Application Publication No. 2019-500632.
 The present disclosure provides a toothbrush system capable of highly accurate detection of the tooth-brushing site.
 The toothbrush system of the present disclosure comprises a toothbrush that has a brush portion for brushing teeth and a posture detection sensor for detecting its posture, and a terminal device. The system further comprises a posture information generation unit that generates posture information indicating the posture of the toothbrush based on the output of the posture detection sensor, and an estimation unit that estimates the contact site of the brush portion in the dentition based on the posture information and on image information, acquired from a camera, that includes the face of the person who has inserted the toothbrush into the oral cavity.
FIG. 1 is a view showing a side surface of the toothbrush of the toothbrush system and the front surface of the terminal device. FIG. 2 is a view showing the front surface of the toothbrush of the toothbrush system and the front surface of the terminal device. FIG. 3 is a block diagram showing the functional configuration of the toothbrush system in the present disclosure. FIG. 4 is a diagram showing variations of image information including the face portion during tooth brushing. FIG. 5 is a flowchart showing a classification narrowing method based on threshold determination of posture information (yaw angle). FIG. 6 is a flowchart showing a classification method based on threshold determination of posture information (pitch angle and roll angle). FIG. 7 is a flowchart showing the estimation method of the estimation unit based on the posture information when the brush portion is determined to be on the buccal side from the image information. FIG. 8 is a flowchart showing the estimation method of the estimation unit based on the posture information when the brush portion is determined to be on the lingual side from the image information.
 Hereinafter, embodiments will be described in detail with reference to the drawings. However, unnecessarily detailed description may be omitted. For example, detailed description of already well-known matters and duplicate description of substantially identical configurations may be omitted. This is to avoid making the following description unnecessarily redundant and to facilitate understanding by those skilled in the art.
 The accompanying drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and are not intended to limit the subject matter described in the claims.
 Hereinafter, embodiments will be described with reference to the figures.
 [1-1. Configuration]
 FIG. 1 is a view showing a side of the toothbrush of the toothbrush system and the front of the terminal device. FIG. 2 is a view showing the back of the toothbrush of the toothbrush system and the front of the terminal device. FIG. 3 is a block diagram showing the functional configuration of the toothbrush system in the present disclosure. The toothbrush system 100 is a system capable of acquiring the positional relationship between the teeth and the toothbrush in the oral cavity, and includes a toothbrush 110 and a terminal device 150.
 The toothbrush 110 is an instrument that brushes the teeth while held in the hand with part of it inserted into the oral cavity, and includes a handle portion 111, a shaft portion 112, a brush portion 113, and a posture detection sensor 120. In the present embodiment, the toothbrush 110 further includes a first communication device 117 and a control means 118 that implements a posture information generation unit 121.
 The handle portion 111 is the portion gripped when brushing the teeth, and has a substantially cylindrical shape. In the present embodiment, the toothbrush 110 is an electric toothbrush in which the brush portion 113 is driven by power, and a drive means 119, the control means 118, and a power supply unit 116 are provided inside the handle portion 111. A drive switch 114 that can be operated with the fingers is attached to the outer peripheral surface of the handle portion 111.
 The drive means 119 is, for example, a rotary actuator such as an electric motor, or a linear actuator using electromagnetic force, a piezoelectric element, a magnetostrictive element, or the like, and operates the brush portion 113 via mechanical elements such as gears and links.
 The power supply unit 116 supplies electric power to the drive means 119, the posture detection sensor 120, the first communication device 117, and the like. At least one of a primary battery, a secondary battery, and a commercial power source is used as the power supply unit 116.
 The drive switch 114 is a device that determines whether or not to drive the brush portion 113. In the present embodiment, the drive switch 114 is a push-button switch, and operating it with the fingers turns the operation of the drive means 119 on and off via the control means 118.
 The control means 118 is a device that governs the operation of the toothbrush 110 by inputting and outputting information. In the present embodiment, the control means 118 is a so-called SoC (System on a Chip), in which functions for controlling the operation of the toothbrush 110 are implemented on a semiconductor chip. The control means 118 may instead be an analog control device such as a simple electrical contact or a relay.
 The first communication device 117 is a device for communicating information with the terminal device 150. In the present embodiment, the first communication device 117 can transmit the posture information generated by the posture information generation unit 121 based on the information detected by the posture detection sensor 120. The communication method of the first communication device 117 is not particularly limited and may be either wired or wireless; in the present embodiment, wireless communication such as Bluetooth (registered trademark) is adopted. The first communication device 117 may be incorporated into the control means 118.
 The posture detection sensor 120 is a sensor attached to the handle portion 111 that detects the posture of the toothbrush 110. The type of the posture detection sensor 120 and the kinds of physical quantities it can detect are not particularly limited; examples include a one-axis, two-axis, or three-axis acceleration sensor, a gyro sensor, and a geomagnetic sensor, and at least one of these suffices. An inertial sensor capable of detecting position, orientation, acceleration, and velocity in a single unit may also be adopted as the posture detection sensor 120.
 The posture information generation unit 121 is a processing unit that generates posture information indicating the posture of the toothbrush 110 based on the output of the posture detection sensor 120. In the present embodiment, the posture information generation unit 121 is implemented by causing the control means 118 to execute a program.
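 As one illustration of what a posture information generation unit such as 121 might compute, the pitch and roll of a nearly stationary toothbrush can be estimated from the gravity vector measured by a 3-axis acceleration sensor. This is a minimal sketch under an assumed axis convention (axes fixed to the toothbrush as in FIGS. 1 and 2), not the implementation of the disclosure:

```python
import math

def posture_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Estimate pitch and roll (in degrees) from a 3-axis accelerometer
    reading taken while the toothbrush is nearly still, so that the
    measured vector is dominated by gravity. The mapping of sensor axes
    to the toothbrush's X/Y/Z axes is an assumption for illustration."""
    # Pitch: tilt of the handle axis away from horizontal.
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    # Roll: rotation about the handle axis (bristle direction).
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

In practice a gyro sensor would be fused with the accelerometer to track these angles during motion; the sketch above only covers the static case.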
 The shaft portion 112 is a rod-shaped portion connected to and protruding from an end of the handle portion 111 in the axial direction of the handle portion 111 (the Z-axis direction in FIGS. 1 and 2). When brushing the teeth with the toothbrush 110, at least part of the shaft portion 112 may be inserted into the oral cavity together with the brush portion 113.
 The brush portion 113 is connected to the tip of the shaft portion 112 on the side opposite the handle portion 111 (the Z+ side end in FIGS. 1 and 2), and has a plurality of bristles extending in a direction (the Y-axis direction in FIGS. 1 and 2) that intersects the axial direction of the shaft portion 112 (the Z-axis direction in FIGS. 1 and 2). In the present embodiment, the brush portion 113 is formed by implanting bristles into a base member, and is detachable from the handle portion 111.
 The terminal device 150 is a device capable of communicating information with the toothbrush 110, and performs information processing by executing programs. The terminal device 150 includes an input means 152, a display means 153, a camera 154, a processing unit 160 that implements an estimation unit 161, and a second communication device 151. The type of the terminal device 150 is not particularly limited; the processing unit 160 may be realized by a device dedicated to the toothbrush 110, or, as in the present embodiment, by causing a general-purpose device such as a so-called smartphone to execute a program. The terminal device 150 may be either portable or stationary. The toothbrush system 100 may also connect a plurality of terminal devices 150 and a plurality of toothbrushes 110 via a network.
 The second communication device 151 is a device for communicating information with at least the toothbrush 110. In the present embodiment, the second communication device 151 receives the posture information that the posture information generation unit 121 generated from the detections of the posture detection sensor 120. The terminal device 150 may also transmit information to the toothbrush 110 via the second communication device 151, for example to change the vibration state of the brush portion 113.
 The input means 152 is a man-machine interface such as a so-called touch panel. In the present embodiment, the input means 152 is a touch panel provided on the front of the display means 153. The input means 152 may instead be a physical push button such as a volume button. The display means 153 is a device, such as a liquid crystal display device or an organic EL (Electro Luminescence) display device, that visibly displays text, images, video, and the like.
 The camera 154 images the face of the person who has inserted the toothbrush 110 into the oral cavity and generates image information. The type of the camera 154 is not particularly limited; an example is a device including a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor together with a lens system that forms an image on the image sensor. In the present embodiment, the camera 154 is a so-called in-camera that images a face located on the side on which the display means 153 displays images and the like.
 The face here is the part of the face that includes the lips into which the toothbrush is inserted, and further includes the cheeks, chin, and the like. The face may additionally include the nose, eyes, eyebrows, forehead, ears, temporomandibular joints, facial contour, and the like.
 The processing unit 160 is an arithmetic device that realizes various processing units by executing programs; in the present embodiment, it executes a program to implement the estimation unit 161.
 The estimation unit 161 estimates the contact portion of the brush portion 113 on the dentition based on the posture information transmitted from the toothbrush 110 and on the image information, including the face, obtained from the camera 154. Details of the estimation method will be described later.
 (Operation Example 1)
 The operation of the toothbrush system 100 configured as described above will now be described. FIG. 4 is a diagram showing variations of image information including the face during tooth brushing. As shown in the figure, the estimation unit 161 estimates the bulging position of the cheek and the state of the lips based on the image information acquired from the camera 154, and estimates the contact portion of the brush portion 113 on the dentition based on the posture of the toothbrush 110 and the like. The estimation method of the estimation unit 161 is not particularly limited. For example, an artificial intelligence such as a deep learning model may be trained using, as teacher information, a large number of images of mutually different brushing states, such as the face images shown in a to e of FIG. 4 or images in which the brush portion 113 is inserted on the right side, together with the contact portion of the brush portion 113 and the posture information of the toothbrush 110 at that time. The contact portion of the brush portion 113 may then be estimated from the image information acquired from the camera 154 and the posture information using the trained artificial intelligence.
 The portions estimated as the contact portion of the brush portion 113 may be, for each tooth, (1) the maxillary side or mandibular side, and (2) the lingual side, buccal side, or occlusal side (in the case of molars). When it is difficult to estimate the position of each individual tooth, the dentition may instead be divided into five regions, leftmost, left, center, right, and rightmost, for estimation.
 As described above, by using artificial intelligence with the posture information as an input vector and, as teacher information, the posture information together with the maxillary side/mandibular side, the lingual side/buccal side/occlusal side, the position of each tooth in the dentition, or the five regions of leftmost/left/center/right/rightmost, the brushed portion can be estimated accurately.
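 The AI-based estimation described above can be pictured, in greatly simplified form, as a classifier over a vector that concatenates image-derived features with the posture information. The following sketch substitutes a nearest-centroid classifier for the trained deep learning model; all feature layouts and labels are hypothetical illustrations, not part of the disclosure:

```python
import math

def combine_features(image_feat: list, posture: list) -> list:
    """Concatenate image-derived features (e.g. hypothetical cheek-bulge
    descriptors) with posture information (e.g. pitch, yaw, roll) into
    a single input vector for the classifier."""
    return list(image_feat) + list(posture)

class NearestCentroidEstimator:
    """Stand-in for the trained AI: assigns a combined feature vector to
    the class whose centroid (mean of its teacher samples) is nearest."""
    def __init__(self):
        self.centroids = {}

    def fit(self, samples: list) -> None:
        # samples: list of (feature_vector, label) teacher pairs.
        sums = {}
        for vec, label in samples:
            acc, n = sums.get(label, ([0.0] * len(vec), 0))
            sums[label] = ([a + v for a, v in zip(acc, vec)], n + 1)
        self.centroids = {lbl: [a / n for a in acc]
                          for lbl, (acc, n) in sums.items()}

    def predict(self, vec: list) -> str:
        return min(self.centroids,
                   key=lambda lbl: math.dist(vec, self.centroids[lbl]))
```

A deep learning model would replace this classifier in practice; the sketch only shows the shape of the input/output mapping (combined feature vector in, dentition-region label out).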
 (Operation Example 2)
 The estimation unit 161 may estimate whether the brush portion 113 is on the lingual side or the buccal side based on the deformation of at least one of the lips and the cheek included in the image information acquired from the camera 154. Specifically, based on the image information, the brush portion 113 is estimated to be on the buccal side when the lips and cheeks are in the states shown in a to c of FIG. 4, and on the lingual side when they are in the states shown in d and e of FIG. 4. The estimation unit 161 may, for example, make this determination using artificial intelligence. In this case, since the image information only needs to be classified into two states, lingual side and buccal side, training the artificial intelligence becomes easy, and estimation processing using the trained artificial intelligence also becomes easy.
 The estimation unit 161 may also estimate that the brush portion 113 is outside the oral cavity. In this case, subsequent processing may be omitted.
 After completing the above two-class estimation, the estimation unit 161 may estimate the portion of the dentition where the brush portion 113 is located by classifying the posture information against predetermined thresholds.
 An example of the threshold determination will now be described. The estimation unit 161 may perform threshold determinations as shown in FIG. 5, FIG. 6, FIG. 7, or FIG. 8 based on the pitch angle (angle around the Y axis), yaw angle (angle around the X axis), and roll angle (angle around the Z axis) included in the posture information, to estimate the portion of the dentition that the brush portion 113 is in contact with. The X, Y, and Z axes shown in FIGS. 1 and 2 do not denote three spatial axes but three axes fixed to the toothbrush 110. Accordingly, when the posture of the toothbrush 110 changes, the X, Y, and Z axes follow the posture of the toothbrush 110. For example, the origin of the X, Y, and Z axes is at the center of gravity of the toothbrush 110; the Z axis is fixed in the direction in which the handle portion 111, the shaft portion 112, and the brush portion 113 are arranged; the Y axis is fixed in the direction in which the bristles of the brush portion 113 extend; and the X axis is fixed in the direction orthogonal to the Y and Z axes.
 The n-th ranges (n being an integer) shown in FIGS. 5 and 6 denote preset angle ranges; ranges indicated by adjacent numbers are described as being adjacent, and the angle ranges are described as non-overlapping.
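 The threshold determination of FIGS. 5 and 6 can be sketched as comparing an angle against preset, non-overlapping ranges. The boundary values below are illustrative placeholders; the disclosure does not specify numerical thresholds:

```python
def classify_yaw(yaw_deg: float) -> str:
    """Narrow down candidate dentition regions from the yaw angle by
    comparing it against preset, non-overlapping, adjacent angle ranges
    (the "n-th ranges" of FIG. 5). Boundary values are assumptions for
    illustration, not values from the disclosure."""
    ranges = [
        (-90.0, -30.0, "range 1"),  # e.g. handle swung far to one side
        (-30.0, 30.0, "range 2"),   # e.g. handle roughly straight ahead
        (30.0, 90.0, "range 3"),    # e.g. handle swung to the other side
    ]
    for lo, hi, name in ranges:
        if lo <= yaw_deg < hi:
            return name
    return "out of range"
```

The pitch/roll classification of FIG. 6 would follow the same pattern with its own preset ranges, and the result of each comparison selects the next branch of the flowchart.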
 According to Operation Example 2 above, by limiting image-based estimation to part of the process and performing the rest by threshold determination, the estimation processing can be sped up.
 (Operation Example 3)
 Using the posture information, the estimation unit 161 may perform at least one of estimating whether the brush portion 113 is on an occlusal surface and estimating whether the brush portion 113 is on the front side or the lingual side. Specifically, based on the roll angle included in the posture information, it may be determined whether the brush portion 113 is in contact with the occlusal surface of the molars or with the central lingual side, and the steps corresponding to that determination may be omitted from the classification procedures of FIGS. 5 to 8.
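 The roll-angle pre-check described in this operation example might look like the following sketch; the angle ranges are assumptions for illustration only:

```python
def occlusal_precheck(roll_deg: float):
    """Pre-classify from the roll angle alone before running the full
    threshold procedure of FIGS. 5 to 8. Bristles pointing roughly
    straight up or down suggest the occlusal surface of the molars.
    All boundary values here are hypothetical placeholders."""
    if -20.0 <= roll_deg <= 20.0:
        return "molar occlusal surface"
    if 160.0 <= abs(roll_deg) <= 180.0:
        return "central lingual side"
    return None  # undecided: fall through to the FIG. 5-8 procedures
```

When the pre-check returns a result, the corresponding branches can be skipped in the later classification, which is the speed-up this operation example describes.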
 According to Operation Example 3, erroneous detection in Operation Example 2 of the occlusal surface or the central lingual side from the image information can be excluded, which speeds up processing and improves accuracy.
 The present disclosure is not limited to the above embodiment. For example, another embodiment realized by arbitrarily combining the components described in this specification, or by excluding some of them, may also be an embodiment of the present disclosure. The present disclosure also includes modifications obtained by making various changes that those skilled in the art can conceive of, within a scope that does not depart from the gist of the present disclosure, that is, the meaning indicated by the wording of the claims.
 For example, the toothbrush 110 may omit the posture information generation unit 121, and the processing unit 160 of the terminal device 150 may implement the posture information generation unit 121 instead.
 The camera 154 may also be separate from the toothbrush 110 and the terminal device 150, and may be provided, for example, by a washroom mirror (smart mirror). The smart mirror may also function as the terminal device 150.
 Various embodiments are conceivable for the user interface (sometimes referred to as input means or the like) between the device of the present disclosure and the user. The user interface may be configured to include an input interface, an output interface, or both. The input interface is used by the user to input information to the device of the present disclosure; the output interface is used by the device of the present disclosure to output information to the user. Various embodiments are conceivable for the input interface. For example, the input interface may be composed of mechanical operating members, or of a transparent plate-shaped operating member placed over a display; such a transparent plate-shaped member may be of the contact type or the non-contact type. The input interface may also be realized by photographing the user's movements with a camera and having the device of the present disclosure recognize those movements, or by having the device of the present disclosure receive sounds uttered by the user; a smart speaker is an example of such a configuration. Various embodiments are likewise conceivable for the output interface. For example, the output interface may be a display, such as a segment-type display device, a liquid crystal display, or an organic EL display; it may be realized by turning light on and off with an LED or the like; it may display images on a display or a projector; it may produce sound; or it may stimulate the user's sense of touch. Various embodiments are also conceivable for where the user interface is provided. The user interface may be provided on the device of the present disclosure, or separately from it. When the user interface is provided separately, the user interface and the device of the present disclosure are made capable of communicating by wire or wirelessly. In this case, they may communicate directly, or indirectly via the Internet, an access point, or the like. When communicating wirelessly, they may communicate using a mobile communication system or in accordance with other standards, and either long-range or short-range wireless may be used.
 The control means in the present disclosure may be anything capable of controlling the device of the present disclosure. When expressing the subject matter of the invention, that which controls the device of the present disclosure may be referred to not only as control means but also as a controller, a control unit, or similar wording. The control means can be realized in various forms. For example, a processor may be used as the control means. If a processor is used, various processes can be executed by having the processor read a program from a storage medium storing the program and execute it. Since the processing content can then be changed by changing the program stored in the storage medium, the freedom to change the control content is increased. Examples of the processor include a CPU (Central Processing Unit) and an MPU (Micro-Processing Unit); examples of the storage medium include a hard disk, a flash memory, and an optical disc. Wired logic whose program cannot be rewritten may also be used as the control means; wired logic is effective for improving processing speed, an example being an ASIC (Application Specific Integrated Circuit). The control means may also be realized by combining a processor and wired logic, which can improve processing speed while increasing the freedom of software design. The control means and a circuit having a different function, such as an A/D conversion circuit or a D/A conversion circuit, may be configured on a single semiconductor element. The control means may be composed of one semiconductor element or a plurality of semiconductor elements; when composed of a plurality, each control described in the claims may be realized by mutually different semiconductor elements. Furthermore, the control means may be configured by a combination of a semiconductor element and passive components such as resistors or capacitors.
 The communicator in the present disclosure may be anything that enables communication between the device of the present disclosure and an external apparatus. When expressing the subject matter of the invention, that which enables this communication may be referred to not only as a communicator but also as communication means, a communication device, transmission/reception means, a transmission/reception unit, or similar wording. The communicator can be realized in various forms. For example, the communicator may connect to the external apparatus by wire, or may connect to it by wireless communication. A communicator that connects the device of the present disclosure and the external apparatus by wire is effective for communication security and communication stability; examples of wired connections include a wired LAN (Local Area Network) based on the Ethernet (registered trademark) standard and a wired connection using an optical fiber cable. Wireless communicators include those that connect to the external apparatus via a base station or the like, and those that connect to it directly. Examples of wireless connection via a base station or the like include an IEEE 802.11 wireless LAN communicating with a WiFi (registered trademark) router, third-generation mobile communication systems (commonly known as 3G), fourth-generation mobile communication systems (commonly known as 4G), WiMAX (registered trademark) conforming to IEEE 802.16, and LPWA (Low Power Wide Area). A communicator that connects the device of the present disclosure directly and wirelessly to the external apparatus is effective for improving communication security, and allows the device of the present disclosure to communicate with the external apparatus even where no relay equipment such as a WiFi (registered trademark) router exists. Examples of such direct wireless connection include communication by Bluetooth (registered trademark), communication by NFC (Near Field Communication) via a loop antenna, and infrared communication.
 Since the above-described embodiments are intended to illustrate the technology in the present disclosure, various changes, replacements, additions, omissions, and the like can be made within the scope of the claims or equivalents thereof.
 The present disclosure can accurately determine the brushed site with a simple configuration, and is therefore applicable to manual toothbrushes, electric toothbrushes, and the like.
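As an illustrative aside (not part of the disclosure), the posture-only portion of the estimation, such as deciding from the posture detection sensor alone whether the brush portion faces an occlusal surface, could be sketched as follows. The axis convention (z normal to the bristle face) and the 30-degree threshold are assumptions for illustration only.

```python
import math

def classify_brush_face(gx: float, gy: float, gz: float) -> str:
    """Classify the bristle direction from a gravity vector.

    Hypothetical axes: x along the brush shaft, z normal to the bristle
    face. Returns 'occlusal' when the bristles point roughly up or down
    (toward a chewing surface); otherwise 'side', which a system like the
    one claimed would disambiguate (buccal vs. lingual) using the camera
    image information.
    """
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    if norm == 0:
        raise ValueError("zero gravity vector")
    # Angle between the bristle normal (z axis) and gravity, in degrees.
    tilt = math.degrees(math.acos(max(-1.0, min(1.0, gz / norm))))
    if tilt < 30 or tilt > 150:
        return "occlusal"   # bristles roughly vertical
    return "side"           # bristles roughly horizontal

print(classify_brush_face(0.0, 0.0, 9.8))   # bristles straight down
print(classify_brush_face(9.8, 0.0, 0.0))   # bristles sideways
```

This is only a sketch of the kind of rule claim 3 describes in words; a real implementation would also filter sensor noise and handle motion acceleration.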
100 Toothbrush system
110 Toothbrush
111 Handle portion
112 Shaft portion
113 Brush portion
114 Drive switch
116 Power supply unit
117 First communication device
118 Control means
119 Drive means
120 Posture detection sensor
121 Posture information generation unit
150 Terminal device
151 Second communication device
152 Input means
153 Display means
154 Camera
160 Processing unit
161 Estimation unit

Claims (4)

  1.  A toothbrush system comprising:
     a toothbrush having a brush portion for brushing teeth and a posture detection sensor for detecting a posture; and
     a terminal device,
    the toothbrush system comprising:
     a posture information generation unit that generates posture information indicating the posture of the toothbrush based on an output of the posture detection sensor; and
     an estimation unit that estimates a contact site of the brush portion in a dentition based on the posture information and on image information, acquired from a camera, that includes the face of a person who has inserted the toothbrush into the oral cavity.
  2.  The toothbrush system according to claim 1, wherein
     the estimation unit estimates whether the brush portion is on the lingual side or the buccal side based on deformation of at least one of the lips and a cheek included in the image information.
  3.  The toothbrush system according to claim 1 or 2, wherein,
     in making at least one of an estimation of whether the brush portion is on an occlusal surface and an estimation of whether the brush portion is on the front side or the lingual side, the estimation unit uses the posture information without using the image information.
  4.  A program for a toothbrush system comprising a toothbrush having a brush portion for brushing teeth and a posture detection sensor for detecting a posture, and a terminal device, the program causing a computer to function as:
     a posture information generation unit that generates posture information indicating the posture of the toothbrush based on an output of the posture detection sensor; and
     an estimation unit that estimates a contact site of the brush portion of the toothbrush in a dentition based on the posture information and on image information, acquired from a camera, that includes the face of a person who has inserted the toothbrush into the oral cavity.
PCT/JP2021/002528 2020-02-07 2021-01-26 Toothbrush system, and program for toothbrush system WO2021157415A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020019376A JP7336688B2 (en) 2020-02-07 2020-02-07 Toothbrush system and program for toothbrush system
JP2020-019376 2020-02-07

Publications (1)

Publication Number Publication Date
WO2021157415A1 true WO2021157415A1 (en) 2021-08-12

Family

ID=77199316

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/002528 WO2021157415A1 (en) 2020-02-07 2021-01-26 Toothbrush system, and program for toothbrush system

Country Status (2)

Country Link
JP (1) JP7336688B2 (en)
WO (1) WO2021157415A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009240760A (en) * 2008-03-14 2009-10-22 Omron Healthcare Co Ltd Electric toothbrush
WO2017102859A1 (en) * 2015-12-15 2017-06-22 Koninklijke Philips N.V. System and method for tracking an oral care device
WO2017157411A1 (en) * 2016-03-14 2017-09-21 Kolibree Oral hygiene system with visual recognition for compliance monitoring

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR112018070811A2 (en) 2016-04-15 2019-02-05 Koninklijke Philips Nv method for providing feedback, system for providing feedback
KR102006610B1 (en) 2017-07-27 2019-08-05 키튼플래닛 주식회사 Method and apparatus for providing tooth-brushing guide information using augmented reality
RU2020132321A (en) 2018-03-01 2022-04-01 Конинклейке Филипс Н.В. METHOD FOR LOCATION OF ORAL CARE DEVICE


Also Published As

Publication number Publication date
JP7336688B2 (en) 2023-09-01
JP2021122609A (en) 2021-08-30

Similar Documents

Publication Publication Date Title
AU2021236443B2 (en) Oral hygiene systems and methods
JP7220644B2 (en) wearable projection device
JP7203856B2 (en) System for classifying use of handheld consumer devices
US8393037B2 (en) Electric toothbrush
KR101548156B1 (en) A wireless exoskeleton haptic interface device for simultaneously delivering tactile and joint resistance and the method for comprising the same
JP5482209B2 (en) electric toothbrush
US11484113B2 (en) Electric toothbrush, system, brushing site detection method, and computer-readable recording medium
US10380914B2 (en) Imaging gloves including wrist cameras and finger cameras
RU2759877C2 (en) Method for determining the orientation of the head of the user during teeth brushing
CN111614919B (en) Image recording device and head-mounted display
JP2020196060A (en) Teaching method
JP2019017418A5 (en)
WO2021157415A1 (en) Toothbrush system, and program for toothbrush system
CN111108463A (en) Information processing apparatus, information processing method, and program
US20230359422A1 (en) Techniques for using in-air hand gestures detected via a wrist-wearable device to operate a camera of another device, and wearable devices and systems for performing those techniques
KR20190056944A (en) A toothbrush guide module interlocked with a smart toothbrush and a system of smart toothbrush comprising the toothbrush guide module and the smart toothbrush
JP7281692B2 (en) toothbrush system
JP2010057593A (en) Walking assisting system for vision challenging person
WO2022176943A1 (en) Intraoral camera system and image display method
CN205568142U (en) Stationery box with function is corrected to position of sitting
Devi et al. Microcontroller Based Gesture Controlled Wheelchair Using Accelerometer
US20230096570A1 (en) Electronic device and method for processing scanned image of three dimensional scanner
Mayol-Cuevas Wearable visual robots
Mateen et al. Artificial Gloves For Kinesics Communication To Articulate Dysfunction Individual Using ESP 32
KR20230023940A (en) Wearable electronic device and method for providing information of brushing theeth in wearable electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21750145

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21750145

Country of ref document: EP

Kind code of ref document: A1