CN113516175B - Method and device for identifying brushing area, brushing equipment and storage medium - Google Patents


Info

Publication number
CN113516175B
CN113516175B · Application CN202110627164.7A
Authority
CN
China
Prior art keywords: brushing, area, matched, target, zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110627164.7A
Other languages
Chinese (zh)
Other versions
CN113516175A (en)
Inventor
方睿
蒙元鹏
何金国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Youpin International Science And Technology Shenzhen Co ltd
Original Assignee
Youpin International Science And Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Youpin International Science And Technology Shenzhen Co ltd filed Critical Youpin International Science And Technology Shenzhen Co ltd
Priority to CN202110627164.7A priority Critical patent/CN113516175B/en
Publication of CN113516175A publication Critical patent/CN113516175A/en
Application granted granted Critical
Publication of CN113516175B publication Critical patent/CN113516175B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C: DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C17/00: Devices for cleaning, polishing, rinsing or drying teeth, teeth cavities or prostheses; Saliva removers; Dental appliances for receiving spittle
    • A61C17/16: Power-driven cleaning or polishing devices
    • A61C17/22: Power-driven cleaning or polishing devices with brushes, cushions, cups, or the like
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Abstract

The application relates to a method and an apparatus for identifying brushing areas, a brushing device, and a storage medium. The method for identifying a brushing area comprises the following steps: acquiring current posture information of the brushing device; screening out at least one matched brushing area that matches the current posture information; and identifying a target brushing area from the at least one matched brushing area, the target brushing area characterizing the area in which the user is currently brushing. This method improves the accuracy of brushing-area identification.

Description

Method and device for identifying brushing area, brushing equipment and storage medium
Technical Field
The application relates to the technical field of intelligent tooth brushing, in particular to a tooth brushing area identification method, a tooth brushing area identification device, tooth brushing equipment and a storage medium.
Background
With the development of intelligent brushing technology, methods for identifying brushing areas have emerged. Such a method refers to a technique by which the brushing equipment automatically identifies the area being brushed.
At present, the brushing area is identified mainly by acquiring posture information of the brushing equipment through a sensor on the equipment, and then identifying the user's brushing area from that posture information.
However, if the user's head position changes during brushing, the brushing area may be misjudged.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a method, apparatus, brushing device, and storage medium for identifying brushing areas that can improve accuracy in identifying brushing areas.
A method of identifying brushing zones, comprising:
acquiring current posture information of tooth brushing equipment;
screening at least one matched brushing area matched with the current posture information;
a target brushing zone is identified from at least one of the matching brushing zones, the target brushing zone being used to characterize the brushing zone in which the user is currently brushing.
In one embodiment, the identifying a target brushing zone from at least one of the matching brushing zones comprises:
if the current posture information corresponds to the initial brushing area, identifying a target brushing area from at least one matched brushing area according to a first preset rule;
and if the current posture information corresponds to the non-initial brushing area, identifying the target brushing area from at least one matched brushing area according to a second preset rule.
In one embodiment, the identifying a target brushing zone from at least one of the matching brushing zones according to a first preset rule comprises:
if there is exactly one matched brushing area, taking that matched brushing area as the target brushing area;
and if there are two or more matched brushing areas, determining the target brushing area and the alternative brushing areas according to the priority of each matched brushing area, wherein the alternative brushing areas are the matched brushing areas other than the target brushing area.
In one embodiment, the determining the target brushing zone based on the priority of each of the matching brushing zones comprises:
determining whether a first preset oral area exists in more than two of the matched brushing areas;
if so, taking one of the first preset oral areas with the highest priority as the target brushing area according to the priority of the first preset oral area;
if not, using one of the matching brushing zones with the highest priority as the target brushing zone according to the priority of each matching brushing zone.
In one embodiment, the method further comprises:
acquiring set auxiliary judgment information;
and determining the priority of each brushing area according to the auxiliary judgment information, wherein the higher the correlation between the brushing area and the auxiliary judgment information is, the higher the priority of the brushing area is.
In one embodiment, the identifying a target brushing zone from at least one of the matching brushing zones according to a second preset rule comprises:
acquiring a last brushing area corresponding to last posture information, wherein the last brushing area at least comprises a last target brushing area;
pairing each matched brushing area with each previous brushing area to obtain a pairing result, wherein the pairing result comprises the similarity of each matched brushing area to each previous brushing area;
and determining the target brushing area and the alternative brushing area according to the pairing result.
In one embodiment, the determining the target brushing area and the alternative brushing area based on the pairing result comprises:
obtaining the candidate brushing area corresponding to the greatest similarity, wherein the candidate brushing area is at least one of the matched brushing areas;
if there is one candidate brushing area, taking it as the target brushing area, and taking the previous brushing area corresponding to the greatest similarity as the previous target brushing area;
if there are two or more candidate brushing areas, taking one candidate area as the target brushing area and the other candidate areas as alternative brushing areas for the next brushing-area identification.
In one embodiment, the pairing of each of the matched brushing zones with each of the previous brushing zones comprises:
if there is exactly one matched brushing zone and exactly one previous brushing zone, taking the matched brushing zone as the target brushing zone;
if the matched brushing zones and the previous brushing zones are not both exactly one, pairing each matched brushing zone with each previous brushing zone by similarity;
wherein when the number of the previous brushing zones is more than two, the previous brushing zone further comprises a previous alternative brushing zone.
In one embodiment, the pairing of each of the matched brushing areas with each of the previous brushing areas to obtain a pairing result includes:
determining the distance of each matched brushing area from each previous brushing area;
and deriving the similarity from the distance, wherein the smaller the distance, the greater the similarity.
In one embodiment, prior to the identifying of a target brushing zone from at least one of the matching brushing zones, the method comprises:
acquiring posture change information of the brushing equipment;
performing a logic pruning judgment according to the posture change information;
and eliminating redundant brushing areas from the matched brushing areas according to the result of the logic pruning judgment.
In one embodiment, the results of the logical pruning determination include at least one of a brushing zone switch determination, a brushing zone ipsilateral determination, and a brushing zone commutation determination.
In one embodiment, the screening out at least one matching brushing zone for which the current pose information matches comprises:
acquiring threshold values corresponding to a plurality of preset oral areas respectively;
and determining the matched brushing area according to the current posture information, wherein the matched brushing area is at least one of a plurality of preset oral areas.
In one embodiment, the determining the matching brushing zone based on the current pose information comprises:
determining a target threshold value matched with the current gesture information;
and taking the preset oral cavity area corresponding to the target threshold value as the matched tooth brushing area.
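The threshold-based matching above is described only in prose. A minimal Python sketch could look as follows; the zone names and Euler-angle ranges are illustrative assumptions, since the patent does not disclose concrete values:

```python
# Hypothetical per-zone Euler-angle thresholds, in degrees.
# Each zone matches when the current pose falls inside all of its ranges.
ZONE_THRESHOLDS = {
    "upper_left_outside": {"roll": (-90, -30), "pitch": (-20, 20)},
    "upper_left_inside":  {"roll": (30, 90),   "pitch": (-20, 20)},
    "lower_left_outside": {"roll": (-90, -30), "pitch": (-60, -20)},
}

def match_zones(pose):
    """Return every preset oral zone whose thresholds contain the pose.

    Overlapping ranges mean several zones can match at once, which is
    exactly why a later step must pick one target zone among them.
    """
    matched = []
    for zone, limits in ZONE_THRESHOLDS.items():
        if all(lo <= pose[axis] <= hi for axis, (lo, hi) in limits.items()):
            matched.append(zone)
    return matched
```

For example, `match_zones({"roll": -45, "pitch": 0})` matches only the upper left outside zone under the ranges assumed above.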
An identification device for brushing zones, comprising:
the posture information acquisition module is used for acquiring the current posture information of the tooth brushing equipment;
The screening module is used for screening at least one matched brushing area matched with the current posture information;
an identification module for identifying a target brushing zone from at least one of the matching brushing zones, the target brushing zone being indicative of a brushing zone in which the user is currently brushing.
A brushing device comprising a memory storing a computer program and a processor which, when executing the computer program, implements the steps of the method described above.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method described above.
In the above method and apparatus for identifying brushing areas, brushing device, and storage medium, the method comprises: acquiring current posture information of the brushing device; screening out at least one matched brushing area that matches the current posture information; and identifying the target brushing area, which characterizes the area the user is currently brushing, from the at least one matched brushing area. Because the target brushing area is identified from one or more possible matched brushing areas, situations such as changes in the user's head position are taken into account, and the accuracy of brushing-area identification is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application or of conventional techniques, the drawings required for describing them are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application; other drawings may be obtained from them by a person of ordinary skill in the art without inventive effort.
FIG. 1 is a flow chart of a method for identifying brushing zones according to one embodiment;
FIG. 2 is a flow chart of another method for identifying brushing zones according to one embodiment;
FIG. 3 is a detailed flow chart of step 240 of FIG. 2, provided by one embodiment;
FIG. 4 is a detailed flow chart of step 330 of FIG. 3, provided by one embodiment;
FIG. 5 is a detailed flow chart of step 250 of FIG. 2, provided by one embodiment;
FIG. 6 is a schematic illustration of a scenario featuring a brushing zone identification provided by one embodiment;
FIG. 7 is a schematic illustration of another brushing zone identification scenario provided by one embodiment;
FIG. 8 is a flow chart of another method for identifying brushing zones, provided in one embodiment;
FIG. 9 is a detailed flow chart of the steps of FIG. 1 provided in one embodiment;
FIG. 10 is a schematic structural view of a device for identifying brushing zones according to one embodiment.
Detailed Description
In order to facilitate an understanding of the present application, a more complete description of the present application will now be provided with reference to the relevant figures. Examples of the present application are given in the accompanying drawings. This application may, however, be embodied in many different forms and is not limited to the embodiments described herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein in the description of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
It will be understood that the terms "first," "second," and the like, as used herein, may be used to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another element.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," and/or the like, specify the presence of stated features, integers, steps, operations, elements, components, or groups thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, or groups thereof. Also, the term "and/or" as used in this specification includes any and all combinations of the associated listed items.
Referring to fig. 1, fig. 1 is a flow chart of a method for identifying brushing zones according to one embodiment. In one embodiment, as shown in FIG. 1, a method of identifying brushing zones is provided, including steps 110 through 130.
Step 110, obtaining current posture information of the brushing device.
Here, the brushing device includes, but is not limited to, a smart electric toothbrush. A smart electric toothbrush can analyze the user's brushing habits, record information for each tooth, and display the data on a smartphone through a mobile application. The current posture information refers to the posture information of the brushing device at the current moment. Optionally, the posture information at least includes Euler angles. The Euler angles are a combination of attitude angles such as the roll angle, pitch angle, and heading (yaw) angle, without limitation here. The currently matching brushing posture can be identified from the current posture information.
Specifically, once the brushing device is in its working state, the current brushing posture and position can be determined from the angular velocity data and acceleration data detected in real time by the device's built-in sensors, which include a gyroscope and an accelerometer: the angular velocity data of the electric toothbrush is obtained through the gyroscope, and the acceleration data through the accelerometer; the process by which the sensors acquire the data is not repeated here. The micro control unit of the electric toothbrush is connected to the built-in sensors, so it can acquire data such as angles transmitted by the sensors in real time.
And 120, screening out at least one matched brushing area matched with the current posture information.
Here, a matched brushing area refers to a brushing area that matches the current posture information. In other words, the matched brushing areas are all the brushing areas in which the user could currently be brushing under the current posture information. Specifically, because the user's head position changes during brushing, the number of matched brushing areas may be one or more, as determined by the actual situation; this embodiment is not limited in this respect. In this step, at least one matched brushing area that matches the current posture information is screened out, taking the changing head position into account.
Step 130, identifying a target brushing zone from at least one of the matching brushing zones, the target brushing zone being used to characterize the brushing zone in which the user is currently brushing.
Wherein the target brushing zone refers to the final determined brushing zone. Specifically, the target brushing zone is one of the matching brushing zones. At this step, a target brushing zone is identified from one or more possible all brushing zones, resulting in a brushing zone in which the user is currently brushing.
In this embodiment, since the target brushing area is identified from at least one matching brushing area, i.e., the brushing area currently being brushed by the user is identified from one or more possible brushing target brushing areas, the accuracy of identifying the brushing area is improved by taking into account the occurrence of a change in the user's head motion, etc.
In one embodiment, a plurality of preset oral areas are predefined in the oral cavity of the human body, and the matched brushing area is at least one of the preset oral areas. The predetermined oral area may be precisely the area and orientation of each tooth or may be a general division of the oral area.
For example, the plurality of preset oral regions include, but are not limited to, lower left bite, lower left inner side, lower left outer side, lower middle inner side, lower right outer side, lower right bite, lower right inner side, upper right outer side, upper right bite, upper right inner side, upper middle outer side, upper middle inner side, upper left outer side, upper left bite, and upper left inner side regions.
As another example, the plurality of preset oral regions include, but are not limited to: the lower left bite, lower left inner side, and lower left outer side grouped as a lower left zone; the upper left bite, upper left inner side, and upper left outer side grouped as an upper left zone; with the right and middle zones treated in a similar manner.
As another example, each of the small brushing zones divided as described above can be further subdivided, such as the zone for the lower left bite, or divided into the area and orientation of each individual tooth as mentioned previously.
It will be appreciated that the above-described division of the predetermined oral area is merely an example, and the present embodiment is not limited thereto.
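The region divisions quoted above can be written down as data. The underscore names and the quadrant grouping below are illustrative assumptions, not identifiers from the patent:

```python
# The 15 fine-grained preset oral regions listed in the description.
FINE_ZONES = [
    "lower_left_bite", "lower_left_inside", "lower_left_outside",
    "lower_middle_inside", "lower_right_outside", "lower_right_bite",
    "lower_right_inside", "upper_right_outside", "upper_right_bite",
    "upper_right_inside", "upper_middle_outside", "upper_middle_inside",
    "upper_left_outside", "upper_left_bite", "upper_left_inside",
]

def coarse_zone(fine_zone):
    """Map a fine region name like 'lower_left_bite' to its coarse zone
    ('lower_left'), mirroring the grouped division in the example above."""
    jaw, side, _ = fine_zone.split("_", 2)
    return f"{jaw}_{side}"
```

Grouped this way, the 15 fine regions collapse into six coarse zones (lower/upper × left/middle/right).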
In one embodiment, step 110, obtaining current pose information for a brushing device comprises the steps of:
acquiring position data of a sensing device of the brushing apparatus, and angular velocity and acceleration during brushing;
coordinate transformation is carried out on the position data of the sensing device, so that the position data of the tooth brushing equipment are obtained;
filtering the position data of the tooth brushing equipment and the angular speed and acceleration data in the tooth brushing process;
and carrying out attitude calculation on the angular velocity data in the tooth brushing process after the filtering treatment.
The Euler angles of the brushing device can be obtained through the attitude calculation, thereby yielding the position data and posture data of the brushing device.
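The attitude calculation is described only at a high level. One common way to realize it, shown here as a sketch rather than as the patent's actual filter, is a complementary filter that fuses gyroscope and accelerometer data into roll and pitch angles:

```python
import math

def euler_from_imu(gyro, accel, prev_roll, prev_pitch, dt, alpha=0.98):
    """Fuse gyroscope rates (deg/s) and an accelerometer gravity vector
    into roll/pitch angles in degrees.

    A basic complementary filter: the integrated gyro rate tracks fast
    motion, while the accelerometer's tilt estimate corrects slow drift.
    The blend factor `alpha` is an assumed tuning value.
    """
    # Tilt estimated from the gravity direction
    accel_roll = math.degrees(math.atan2(accel[1], accel[2]))
    accel_pitch = math.degrees(math.atan2(-accel[0],
                                          math.hypot(accel[1], accel[2])))
    # Blend integrated gyro rates with the accelerometer estimate
    roll = alpha * (prev_roll + gyro[0] * dt) + (1 - alpha) * accel_roll
    pitch = alpha * (prev_pitch + gyro[1] * dt) + (1 - alpha) * accel_pitch
    return roll, pitch
```

With the toothbrush held still and level (zero rates, gravity along z), the filter returns zero roll and pitch, and a pure roll rate integrates into a growing roll angle.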
Referring to fig. 2, fig. 2 is a flow chart of another method for identifying brushing zones, according to one embodiment. In one embodiment, as shown in FIG. 2, another method of identifying brushing zones includes steps 210 through 250.
Step 210, obtaining current posture information of the brushing device.
The present step may refer to the description of any one of the above embodiments, which is not repeated.
Step 220, screening out at least one matched brushing area matched with the current posture information.
The present step may refer to the description of any one of the above embodiments, which is not repeated.
Step 230, judging whether the current posture information corresponds to an initial brushing area.
In this step, if the current posture information corresponds to the initial brushing area, step 240 is performed; if the current posture information corresponds to a non-initial brushing zone, step 250 is performed.
Step 240, identifying a target brushing zone from at least one of the matched brushing zones according to a first preset rule.
Here, the initial brushing zone refers to the zone where the user begins brushing. Specifically, the initial brushing zone may be the zone where the user has just started brushing, or the zone where the user resumes brushing after pausing midway; this is not limited here. The first preset rule is the manner in which the target brushing zone is identified from the matched brushing zones when the current posture information corresponds to the initial brushing zone, i.e., to the posture at the start of brushing.
Step 250, identifying a target brushing zone from at least one of the matched brushing zones according to a second preset rule.
The non-initial brushing zone is the counterpart of the initial brushing zone: any zone other than the one where the user begins brushing. The second preset rule is the manner in which the target brushing zone is identified from the matched brushing zones when the current posture information corresponds to a non-initial brushing zone, i.e., to a posture after brushing has begun.
In this embodiment, different preset rules are adopted to identify the target brushing area according to whether the current posture information corresponds to the initial brushing area or the non-initial brushing area, so that accuracy of brushing area identification can be further improved.
Referring to fig. 3, fig. 3 is a detailed flow chart of step 240 of fig. 2 provided by one embodiment. In one embodiment, as shown in FIG. 3, step 240, identifying a target brushing zone from at least one of the matching brushing zones according to a first preset rule, comprises steps 310 through 330.
Step 310, determining whether there is exactly one matched brushing zone.
In this step, if there is exactly one matched brushing zone, step 320 is performed; if there are two or more matched brushing zones, step 330 is performed.
Step 320, regarding the matching brushing zone as the target brushing zone.
In this step, if the number of matched brushing zones is one, the matched brushing zone is directly set as the target brushing zone.
And 330, determining the target brushing area and the alternative brushing areas according to the priority of each matched brushing area, wherein the alternative brushing areas are other brushing areas except the target brushing area in the matched brushing areas.
The priority may be determined from general human brushing habits or from the individual user's brushing habits. Specifically, both follow certain patterns, so the preset oral zones can be prioritized; after the matched brushing zones are obtained, the target brushing zone and the alternative brushing zones are determined according to the priorities of the matched brushing zones. The alternative brushing zones serve as reference zones for the next brushing-zone identification. Optionally, the highest-priority matched brushing zone is taken as the target brushing zone, and the other matched brushing zones are taken as the alternative brushing zones. The higher the priority of a matched brushing zone, the higher the likelihood that it is the target brushing zone; in other words, the higher the priority, the higher the probability of a match.
In this embodiment, when the number of matching brushing areas is more than two, the target brushing area is determined according to the priority of the matching brushing areas, and the obtained target brushing area is more accurate in consideration of the brushing habit of the user. In addition, the obtained alternative brushing area is used as a reference area for the next brushing area identification, so that the result of the last brushing area identification is also referred to in the next brushing area identification process, and the accuracy of brushing area identification is further improved.
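The first preset rule described above can be sketched as follows; the function name and the priority map are assumptions introduced for illustration:

```python
def select_by_priority(matched_zones, priority):
    """First preset rule: the highest-priority matched zone becomes the
    target; the remaining matched zones are kept as alternative zones,
    which serve as reference zones for the next identification round.
    `priority` maps zone name -> rank, where a larger rank wins.
    """
    if len(matched_zones) == 1:
        # A single matched zone is taken directly as the target.
        return matched_zones[0], []
    ordered = sorted(matched_zones, key=lambda z: priority.get(z, 0),
                     reverse=True)
    return ordered[0], ordered[1:]
```

For example, with priorities `{"a": 1, "b": 3, "c": 2}`, the matched zones `["a", "b", "c"]` yield target `"b"` with alternatives `["c", "a"]`.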
In one embodiment, prior to determining the target brushing zone and the alternate brushing zone based on the priority of each of the matching brushing zones, further comprising:
acquiring set auxiliary judgment information;
and determining the priority of each brushing area according to the auxiliary judgment information, wherein the higher the correlation between the brushing area and the auxiliary judgment information is, the higher the priority of the brushing area is.
The auxiliary judgment information is information for determining the priority of the brushing area. Specifically, the auxiliary judgment information may be input by the user, or may be generated according to the brushing habit of the user during the use of the brushing device, which is not limited herein. In this embodiment, the higher the correlation between the brushing area and the assistance determination information, the higher the priority of the brushing area.
Illustratively, for the initial brushing zone, the selection criterion is to prioritize the zone that is easy to reach with the brushing hand (i.e., the non-dominant side; for example, if the user brushes with the right hand, the left side is the non-dominant side, and the user may previously have entered their handedness into the brushing device as auxiliary judgment information). As another example, the system's default auxiliary judgment information may be used: for instance, the outer tooth surfaces are the preferred choice, the inner tooth surfaces are preferred when no outer surface is available, and when only occlusal surfaces are available, one is selected at random from the currently available zones; the target brushing zone and the alternative brushing zones are thereby determined.
Illustratively, if the matched brushing zones include the lower left, upper left, and upper middle zones, and the user has entered right-handed brushing (and other input criteria) as auxiliary judgment information, the target brushing zone is determined to be the lower left zone.
It can be appreciated that in this embodiment, a preliminary correction is made through the auxiliary judgment, so the resulting target brushing zone is more accurate, which benefits the identification of subsequent non-initial brushing zones and reduces the possibility of misjudgment.
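Deriving priorities from the auxiliary judgment information might be sketched as below, using a hypothetical handedness hint as the auxiliary input; the two-level ranking is an assumption, and a real implementation could use any correlation measure:

```python
def priorities_from_aux(zones, aux):
    """Assign higher priority to zones more correlated with the auxiliary
    judgment information. Here the hint is handedness: a right-handed
    user tends to start on the left, non-dominant side, so left-side
    zones are ranked above the rest."""
    preferred_side = "left" if aux.get("handedness") == "right" else "right"
    return {z: (2 if preferred_side in z else 1) for z in zones}
```

The resulting map can be fed directly into a priority-based selection of the target brushing zone.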
Referring to fig. 4, fig. 4 is a detailed flow chart of step 330 of fig. 3 according to one embodiment. In one embodiment, as shown in FIG. 4, step 330, determining the target brushing zone and the alternate brushing zone based on the priority of each of the matching brushing zones, comprises steps 410 through 430.
Step 410, determining whether a first preset oral zone exists among the two or more matched brushing zones.
Wherein the first preset oral area may be determined based on the brushing habits of the user. Generally, the area where the user is used to first begin brushing will be the first preset oral area. In this step, if there is a first predetermined oral area in more than two matching brushing areas, then step 420 is performed; if there are no first predetermined oral areas in more than two matching brushing zones, step 430 is performed.
Illustratively, the first preset oral zone includes at least one of an outer side and a lower occlusal surface. If at least one of these exists among the two or more matched brushing zones, step 420 is performed.
And step 420, taking one of the first preset oral areas with the highest priority as the target brushing area according to the priority of the first preset oral area.
In this step, the first preset oral zone with the highest priority is taken as the target brushing zone according to the priorities of the first preset oral zones. If more than one first preset oral zone shares the highest priority, one of them is taken as the target brushing zone.
Illustratively, if both an outer side and a lower occlusal surface exist among the two or more matched brushing zones, one of them is taken as the target brushing zone; if only one of the two exists among the matched brushing zones, that one is taken as the target brushing zone.
Step 430, taking the matched brushing area with the highest priority as the target brushing area according to the priority of each matched brushing area.
In this step, if no first preset oral area exists among the two or more matched brushing areas, the matched brushing area with the highest priority is taken as the target brushing area.
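As a minimal sketch, the selection logic of steps 410 through 430 can be expressed as follows; the zone names, the membership of the first preset oral area, and the numeric priorities are illustrative assumptions, not values fixed by the present embodiment.

```python
# Hypothetical sketch of steps 410-430. Zone names, the first preset
# oral area, and the priority values are assumed for illustration.
FIRST_PRESET = {"outer_side", "lower_occlusal"}  # assumed membership

def pick_target(matched, priority):
    """matched: list of matched brushing areas; priority: zone -> int.
    Returns (target_zone, alternative_zones)."""
    preset_hits = [z for z in matched if z in FIRST_PRESET]
    # Step 410: prefer first-preset areas when any are present,
    # otherwise fall back to all matched areas (step 430).
    pool = preset_hits if preset_hits else matched
    target = max(pool, key=lambda z: priority[z])  # steps 420/430
    alternatives = [z for z in matched if z != target]
    return target, alternatives
```

With two or more matched areas, a first-preset area is preferred even when a non-preset area carries a higher priority, reflecting the two-stage check above.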
Referring to fig. 5, fig. 5 is a detailed flow chart of step 250 of fig. 2 provided by one embodiment. In one embodiment, as shown in FIG. 5, step 250, identifying a target brushing zone from at least one of the matching brushing zones according to a second preset rule, comprises steps 510 through 530.
Step 510, obtaining a previous brushing area corresponding to the previous posture information, where the previous brushing area at least includes a previous target brushing area.
Wherein, the previous posture information refers to the posture information used in the last brushing-area identification, and the previous brushing area is the brushing area determined according to the previous posture information. The previous target brushing area is the target brushing area determined in the last brushing-area identification, and the previous brushing area includes at least the previous target brushing area. Optionally, the previous brushing area further includes a previous alternative brushing area, as determined by the actual situation; this is not limited herein.
Step 520, pairing each matched brushing area with each previous brushing area by similarity to obtain a pairing result, wherein the pairing result comprises the similarity of each matched brushing area with each previous brushing area.
In this step, each matched brushing area is similarity-paired with each previous brushing area, thereby obtaining the similarity of each matched brushing area to each previous brushing area. Illustratively, if there are 5 matched brushing areas and 4 previous brushing areas, the pairing result includes 5 × 4 = 20 similarities.
Step 530, determining the target brushing area and the alternative brushing area according to the pairing result.
In this step, the target brushing zone and the alternate brushing zone are determined based on the pairing result.
In this embodiment, the result of the previous brushing-area identification is also referenced when identifying the current target brushing area, which further improves the accuracy of brushing-area identification.
In one embodiment, step 530, determining the target brushing zone and the alternate brushing zone based on the pairing result comprises the steps of:
obtaining a candidate brushing area corresponding to the greatest similarity, wherein the candidate brushing area is at least one of the matched brushing areas;
if there is one candidate brushing area, taking the candidate brushing area as the target brushing area, and taking the previous brushing area corresponding to the greatest similarity as the previous target brushing area;
if there are two or more candidate brushing areas, taking one of them as the target brushing area, and taking the other candidate areas as the previous alternative brushing areas used in the next brushing-area identification.
Wherein, the candidate brushing areas refer to the matched brushing areas with the greatest similarity. The number of candidate brushing areas is determined by the actual situation and is not limited in this embodiment.
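The rule above can be sketched as follows; representing the pairing result as a tuple-keyed dictionary is an assumption made for illustration, not a structure prescribed by the embodiment.

```python
# Hypothetical sketch of step 530. The pairing result is assumed to be a
# dict mapping (matched_zone, previous_zone) -> similarity.
def determine_target(pairing):
    """Returns (target, previous_target, alternatives_for_next_round)."""
    best = max(pairing.values())
    # Candidate brushing areas: matched areas reaching the greatest similarity.
    candidates = sorted({m for (m, _), s in pairing.items() if s == best})
    # The previous brushing area in the best pair becomes the previous target.
    previous_target = next(p for (_, p), s in pairing.items() if s == best)
    return candidates[0], previous_target, candidates[1:]
```

With a single candidate, it becomes the target and its paired previous area the previous target; with ties, the extra candidates are carried forward for the next identification.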
In one embodiment, step 520, similarity-pairing each of the matched brushing areas with each of the previous brushing areas, comprises the steps of:
if there is exactly one matched brushing area and exactly one previous brushing area, taking the matched brushing area as the target brushing area;
if the matched brushing areas and the previous brushing areas do not both number one, pairing each matched brushing area with each previous brushing area by similarity;
wherein, when there are two or more previous brushing areas, the previous brushing areas further comprise a previous alternative brushing area.
In this embodiment, the matched brushing areas and the previous brushing areas not both numbering one means that at least one of them numbers two or more. Specifically, there may be two or more matched brushing areas, or two or more previous brushing areas, or two or more of each.
In one embodiment, similarity-pairing each of the matched brushing areas with each of the previous brushing areas to obtain a pairing result comprises:
Determining a distance of each of the matching brushing zones from each of the previous brushing zones;
and taking the distance as the similarity, wherein the smaller the distance is, the larger the similarity is.
Wherein, the distance is a parameter indicating how far apart two objects are; in this embodiment, it indicates how far a matched brushing area is from a previous brushing area, and optionally, the smaller the distance, the closer the matched brushing area and the previous brushing area. The distance may be an actual spatial distance or a custom distance, which is not limited in this embodiment; a spatial distance refers to the distance between points, lines, and surfaces in three-dimensional solid geometry. In this embodiment, the distance is used as the similarity: the smaller the distance, the greater the similarity.
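One way to realize "the smaller the distance, the greater the similarity" is a monotone decreasing mapping of distance to similarity; the Euclidean distance and the 1/(1+d) mapping below are illustrative assumptions, since the embodiment fixes neither.

```python
import math

# Hypothetical sketch: zone positions are assumed to be 3-D points in the
# oral reference coordinate system. The embodiment does not fix the mapping,
# only that similarity must grow as distance shrinks.
def similarity(p, q):
    d = math.dist(p, q)        # Euclidean distance between the two zones
    return 1.0 / (1.0 + d)     # monotone: smaller distance -> larger value
```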
Referring to fig. 6, fig. 6 is a schematic illustration of a scenario for brushing-area identification, provided in one embodiment. In one embodiment, as shown in FIG. 6, n denotes the current brushing-area identification, n-1 denotes the previous identification, and n+1 denotes the next identification, where n ≥ 2. In the (n-1)-th identification, two matched brushing areas A1 and B1 are identified; A1 is determined to be the target brushing area and B1 the alternative brushing area. In the current identification, the matched brushing area obtained is C1, and the distance between C1 and A1 and the distance between C1 and B1 are calculated respectively. If the distance between C1 and B1 is smaller than the distance between C1 and A1, C1 is the currently identified target brushing area, and B1 is taken as the previous target brushing area. In the (n+1)-th identification, since the identified matched brushing area is D1 and the previous brushing area is C1, D1 is the target brushing area of the (n+1)-th identification.
Referring to fig. 7, fig. 7 is a schematic illustration of another brushing-area identification scenario provided by one embodiment. n denotes the current brushing-area identification, n-1 the previous identification, and n+1 the next identification, where n ≥ 2. In the (n-1)-th identification, two matched brushing areas A2 and B2 are identified; A2 is determined to be the target brushing area and B2 the alternative brushing area. In the current identification, two matched brushing areas C2 and D2 are obtained, and the distance between C2 and A2, the distance between C2 and B2, the distance between D2 and A2, and the distance between D2 and B2 are calculated respectively. If the distance between C2 and B2 and the distance between D2 and B2 are equal and the smallest, one of C2 and D2 is the target brushing area and the other is the alternative brushing area. In the (n+1)-th identification, E2 is obtained as the matched brushing area, and E2 is the target brushing area; the distance between E2 and C2 and the distance between E2 and D2 are calculated respectively, and if the distance between E2 and C2 is smaller than the distance between E2 and D2, C2 is taken as the previous target brushing area.
It will be appreciated that the above scenario illustrations are merely examples; actual brushing conditions vary, and this embodiment is not limited thereto.
Referring to fig. 8, fig. 8 is a flow chart of another method for identifying brushing areas, according to one embodiment. This embodiment is suitable for scenarios in which computational complexity needs to be reduced. In one embodiment, as shown in FIG. 8, another method of identifying brushing areas includes steps 810 through 860.
Step 810, obtaining current posture information of the brushing device.
For this step, reference may be made to the description of any of the above embodiments, which is not repeated here.
Step 820, screening out at least one matching brushing area for which the current posture information is matched.
For this step, reference may be made to the description of any of the above embodiments, which is not repeated here.
Step 830, obtaining posture change information of the brushing device.
Wherein, the posture change information refers to information describing the change in posture. Specifically, it can be obtained from the current posture information and the previous posture information.
Step 840, performing a logical pruning determination according to the posture change information.
In this step, the logical pruning determination aims to eliminate interfering data, thereby reducing the amount of computation and improving the efficiency of brushing-area identification.
And 850, eliminating redundant brushing areas in the matched brushing areas according to the result of the logic pruning judgment.
Wherein, redundant brushing areas refer to inaccurate areas among the matched brushing areas. In this step, removing the redundant brushing areas from the matched brushing areas according to the result of the logical pruning determination reduces the number of similarity pairings. Illustratively, with 5 matched brushing areas and 2 previous brushing areas, 5 × 2 = 10 similarities would normally be computed; if 2 redundant brushing areas are identified and removed, 3 matched brushing areas remain and only 3 × 2 = 6 similarities need be computed. Specifically, a redundant brushing area of this embodiment is one that does not match the result of the logical pruning determination. For example, if the previous target brushing area is the bottom-right outer area, the current matched brushing areas are the bottom-right occlusal, bottom-right inner, top-right outer, top-right occlusal, top-left outer, top-left occlusal, and top-left inner areas, and the logical pruning determination finds that no side change has occurred, then the top-left outer, top-left occlusal, and top-left inner areas are redundant brushing areas and need to be removed.
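The worked example above can be sketched as follows; the string-encoded zone names and the substring side test are assumptions made for this illustration.

```python
# Hypothetical sketch of step 850: drop matched areas whose side
# contradicts the pruning result. Zone naming (substring "left"/"right")
# is an assumption, not the embodiment's representation.
def prune(matched, previous_target, side_changed):
    if side_changed:
        return list(matched)  # a side change is possible; keep everything
    side = "left" if "left" in previous_target else "right"
    return [z for z in matched if side in z]

matched = ["bottom_right_occlusal", "bottom_right_inner", "top_right_outer",
           "top_right_occlusal", "top_left_outer", "top_left_occlusal",
           "top_left_inner"]
kept = prune(matched, "bottom_right_outer", side_changed=False)
```

Here the three top-left areas are dropped and four matched areas remain, so fewer similarity pairings need to be computed in step 520.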
Step 860 identifies a target brushing zone from at least one of the matching brushing zones, the target brushing zone being used to characterize the brushing zone in which the user is currently brushing.
For this step, reference may be made to the description of any of the above embodiments, which is not repeated here.
In this embodiment, redundant brushing areas are eliminated from the matched brushing areas according to the result of the logical pruning determination before the target brushing area is identified, so that inaccurate matched brushing areas are excluded and unnecessary computation is reduced.
In one embodiment, the result of the logical pruning determination includes at least one of a brushing-area switching determination, a brushing-area same-side determination, and a brushing-area reversal determination.
Wherein, the brushing-area switching determination determines whether the user has switched brushing areas. The brushing-area same-side determination determines whether the user is still brushing on the same side, for example, whether the user has switched from the right side of the oral cavity to the left side, or from the left side to the right side. The brushing-area reversal determination determines whether the user has reversed the brush head while brushing.
Specifically, if the user has not switched brushing areas, the brushing area can be identified based on historical data; if the user has switched brushing areas, the next-stage determination, namely the brushing-area same-side determination, is performed.
Since the posture data of the two sides of the oral cavity are relatively close during brushing, the same-side brushing areas need to be further distinguished. If the user has not switched sides, the brushing area can be identified based on historical data; if the user has switched, the next-stage determination, namely the brushing-area reversal determination, is performed.
The brushing-area reversal determination addresses the case where the user has not switched brushing areas but has reversed the brush head. If the user has not reversed the brush head, the specific brushing-area determination can be made from historical data; if the user has reversed the brush head, the data identified by the toothbrush can be adjusted accordingly.
In one embodiment, the brushing-area switching determination can be made from the heading angle and/or the roll angle. For example: the heading angle exceeds a certain threshold at the end of the last brushing action; or the maximum variation of the heading angle exceeds a certain value while the user is not brushing; or the maximum variation of the roll angle exceeds a certain value while the user is not brushing; or the maximum variations of both the heading angle and the roll angle exceed certain values while the user is not brushing.
In one embodiment, the brushing-area same-side determination comprises the steps of:
determining whether the variation of the heading angle of the toothbrush exceeds a certain value;
when the variation of the heading angle of the toothbrush exceeds the certain value, determining whether the variation of the roll angle exceeds a certain value.
If the heading-angle variation exceeds the specified value while the roll-angle variation remains smaller than the specified value, no brushing side change has occurred.
In one embodiment, the brushing-area reversal determination can be made by determining whether the variation of the roll angle is greater than a certain value.
Specifically, if the variation of the roll angle is greater than the certain value, a brush-head reversal has occurred.
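The two clear-cut threshold tests above, the switching determination and the reversal determination, can be sketched as follows; the embodiment only says "a certain value", so the numeric thresholds are assumptions.

```python
# Hypothetical sketch of the switching and reversal determinations.
# The degree thresholds are assumed; the embodiment leaves them open.
HEADING_THRESHOLD = 60.0  # degrees, assumed
ROLL_THRESHOLD = 90.0     # degrees, assumed

def zone_switched(d_heading, d_roll):
    """Switching determination: heading and/or roll variation exceeds a value."""
    return abs(d_heading) > HEADING_THRESHOLD or abs(d_roll) > ROLL_THRESHOLD

def head_reversed(d_roll):
    """Reversal determination: roll variation greater than a value."""
    return abs(d_roll) > ROLL_THRESHOLD
```

In a full implementation these tests would gate the same-side determination described above, forming the staged pruning pipeline.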
Referring to fig. 9, fig. 9 is a detailed flow chart of step 120 of fig. 1 provided by one embodiment. In one embodiment, as shown in FIG. 9, step 120, screening out at least one matched brushing area matching the current posture information, includes steps 910 through 920.
Step 910, obtaining thresholds corresponding to a plurality of preset oral areas respectively.
In this step, each preset oral area corresponds to one or more thresholds, and the thresholds are used to demarcate the position of each preset oral area. Optionally, the thresholds corresponding to a preset oral area are its thresholds under different head movements.
In this embodiment, thresholds are established for each preset oral area under different head movements, so that the threshold of each preset oral area under each head movement can be obtained.
Step 920, determining the matched brushing area according to the current posture information, wherein the matched brushing area is at least one of a plurality of preset oral areas.
In this step, since the thresholds are the thresholds of the preset oral areas under different head movements, the possible brushing areas can be determined from the current posture information and used as the matched brushing areas. Specifically, because the current head movement of the user is unknown, this step determines the matched brushing areas from the current posture information while taking into account the possible brushing areas under different head movements.
In this embodiment, by acquiring the thresholds corresponding to the plurality of preset oral areas and determining the matched brushing areas from the current posture information, the possible brushing areas under different head movements are taken into account, preparing for accurate identification of the subsequent brushing areas.
It will be appreciated that, due to individual human variability, the threshold descriptions of the preset oral areas may overlap, i.e., different preset oral areas may share the same threshold description. Therefore, multiple solutions may exist when determining the matched brushing areas from the current posture information, i.e., the current posture information may correspond to one or more matched brushing areas.
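A minimal sketch of steps 910 and 920, with overlapping intervals showing how one posture can match several areas; the zone names and the heading-angle intervals are assumptions, since the embodiment does not publish concrete thresholds.

```python
# Hypothetical sketch of steps 910-920. Each preset oral area is given a
# heading-angle interval (assumed values); intervals may overlap, so one
# posture reading can match more than one area.
ZONE_THRESHOLDS = {
    "lower_right_outer": (-90.0, -30.0),
    "lower_occlusal":    (-45.0,  15.0),
    "lower_left_outer":  ( 10.0,  90.0),
}

def match_zones(heading):
    """Return every preset oral area whose interval contains the heading."""
    return [zone for zone, (lo, hi) in ZONE_THRESHOLDS.items()
            if lo <= heading <= hi]
```

A heading of -40° falls inside two overlapping intervals, so two matched brushing areas are returned and the later steps must disambiguate between them.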
In one embodiment, the determining the matching brushing zone based on the current pose information comprises:
and determining a target threshold value matched with the current gesture information.
And taking the preset oral cavity area corresponding to the target threshold value as the matched tooth brushing area.
In this embodiment, the target threshold refers to the threshold matched by the current posture information. Specifically, the current posture information is converted into the same description as the thresholds, so that the target threshold matched by the current posture information can be determined, and the preset oral area corresponding to the target threshold is taken as the matched brushing area.
Alternatively, in this embodiment, a reference coordinate system may be established in advance for the user's oral cavity. It should be noted that the reference coordinate system is used to define the preset areas within the oral cavity and is distinct from the sensor coordinate system of the brushing device. A threshold is correspondingly established for each oral area defined within the oral cavity, i.e., the positional relationship of each oral area within the oral cavity is made explicit through threshold division.
In one embodiment, the reference coordinate system established for the oral cavity may be a global coordinate system, and the sensor coordinate system may be a body coordinate system. For example, the sensor of the brushing device acquires the posture information of a brushing posture in real time, i.e., the coordinates of that posture in the sensor coordinate system, which corresponds to the body coordinate system of the brushing device. The current first coordinate of the current brushing posture in the body coordinate system can be determined from the current posture information, and then converted into the current second coordinate in the reference coordinate system using a direction cosine matrix (Direction Cosine Matrix, DCM). Determining the current second coordinate of the current brushing posture in the reference coordinate system means knowing the position of the current posture information in the reference coordinate system, i.e., the matched preset oral area of the current brushing posture within the oral cavity, and vice versa. Meanwhile, when the previous brushing area was identified, the previous second coordinate in the reference coordinate system of the previous posture information corresponding to the previous brushing area was also determined. A coordinate calculation can therefore be performed from the current second coordinate and the previous second coordinate, so that the distance between the matched brushing area and the previous brushing area can be determined, the matched brushing area corresponding to the current second coordinate and the previous brushing area corresponding to the previous second coordinate.
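The body-frame to reference-frame conversion described above can be sketched as follows; the yaw-only DCM and the coordinate values are illustrative assumptions (a real brushing device would build the full DCM from its attitude estimate).

```python
import math

# Hypothetical sketch of the DCM conversion: rotate a body-frame (sensor)
# coordinate into the oral reference frame, then measure the distance to a
# previous zone. A yaw-only rotation is assumed for simplicity.
def dcm_yaw(psi):
    c, s = math.cos(psi), math.sin(psi)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def to_reference(dcm, p_body):
    """Current second coordinate = DCM applied to the current first coordinate."""
    return tuple(sum(dcm[i][j] * p_body[j] for j in range(3)) for i in range(3))

def zone_distance(p_ref, q_ref):
    """Distance between the matched area and the previous brushing area."""
    return math.dist(p_ref, q_ref)
```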
To quickly understand the correspondence between the sensor coordinate system and the oral reference coordinate system: in the present invention, a reference coordinate system is established in advance for the user's oral cavity, and a plurality of preset oral areas are constructed in the reference coordinate system. The plurality of preset oral areas may be constructed with reference to the description of the above embodiments, which is not repeated here. After the plurality of preset oral areas are divided, a threshold description is established for each defined preset oral area within the oral cavity; the threshold description is used to demarcate the position of each preset oral area.
It should be understood that, although the steps in the flowcharts of fig. 1 to 5 and 8 to 9 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the order of execution of these steps is not strictly limited, and the steps may be performed in other orders. Moreover, at least a portion of the steps of fig. 1 to 5 and 8 to 9 may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and the order of their execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
Referring to fig. 10, fig. 10 is a schematic structural view of a tooth brushing region identification device according to one embodiment. In one embodiment, as shown in fig. 10, there is provided a brushing zone identification device comprising a gesture information acquisition module 1010, a screening module 1020, and an identification module 1030, wherein:
the gesture information acquisition module 1010 is configured to acquire current gesture information of the brushing device; the screening module 1020 is configured to screen at least one matching brushing area that matches the current posture information; the identification module 1030 is configured to identify a target brushing zone from at least one of the matching brushing zones, the target brushing zone being configured to characterize a brushing zone in which the user is currently brushing.
In one embodiment, the recognition module 1030 includes: the first recognition unit is used for recognizing a target brushing area from at least one matched brushing area according to a first preset rule if the current posture information corresponds to the initial brushing area; and the second identification unit is used for identifying the target brushing area from at least one matched brushing area according to a second preset rule if the current posture information corresponds to the non-initial brushing area.
In one embodiment, the first identification unit comprises: a first identification subunit, configured to take the matched brushing area as the target brushing area if there is one matched brushing area; and a second identification subunit, configured to determine the target brushing area and the alternative brushing areas according to the priority of each matched brushing area if there are two or more matched brushing areas, wherein the alternative brushing areas are the brushing areas other than the target brushing area among the matched brushing areas.
In one embodiment, the second identification subunit is specifically configured to determine whether a first preset oral area exists among the two or more matched brushing areas; if so, take the first preset oral area with the highest priority as the target brushing area according to the priorities of the first preset oral areas; if not, take the matched brushing area with the highest priority as the target brushing area according to the priority of each matched brushing area.
In one embodiment, the first identification unit further comprises: an auxiliary judgment information subunit, configured to obtain set auxiliary judgment information; and the priority determining subunit is used for determining the priority of each brushing area according to the auxiliary judging information, wherein the higher the correlation between the brushing area and the auxiliary judging information is, the higher the priority of the brushing area is.
In one embodiment, the second identification unit comprises: a previous-brushing-area identification subunit, configured to obtain a previous brushing area corresponding to previous posture information, where the previous brushing area at least includes a previous target brushing area; a pairing subunit, configured to pair each matched brushing area with each previous brushing area by similarity to obtain a pairing result, where the pairing result includes the similarity of each matched brushing area with each previous brushing area; and a third identification subunit, configured to determine the target brushing area and the alternative brushing area according to the pairing result.
In one embodiment, the third identification subunit is specifically configured to obtain the candidate brushing area corresponding to the greatest similarity, where the candidate brushing area is at least one of the matched brushing areas; if there is one candidate brushing area, take it as the target brushing area, and take the previous brushing area corresponding to the greatest similarity as the previous target brushing area; if there are two or more candidate brushing areas, take one of them as the target brushing area and the other candidate areas as the previous alternative brushing areas used in the next brushing-area identification.
In one embodiment, the pairing subunit is specifically configured to take the matched brushing area as the target brushing area if there is exactly one matched brushing area and exactly one previous brushing area; and, if the matched brushing areas and the previous brushing areas do not both number one, pair each matched brushing area with each previous brushing area by similarity; wherein, when there are two or more previous brushing areas, the previous brushing areas further include a previous alternative brushing area.
In one embodiment, the third identification subunit is specifically further configured to determine the distance between each matched brushing area and each previous brushing area, and take the distance as the similarity, wherein the smaller the distance, the greater the similarity.
In one embodiment, the apparatus further comprises: the posture change information acquisition module is used for acquiring posture change information of the tooth brushing equipment; the judging module is used for carrying out logic pruning judgment according to the attitude change information; and the rejecting module is used for rejecting redundant brushing areas in the matched brushing areas according to the result of the logic pruning judgment.
In one embodiment, the result of the logical pruning determination includes at least one of a brushing-area switching determination, a brushing-area same-side determination, and a brushing-area reversal determination.
In one embodiment, the screening module 1020 includes: the threshold value acquisition unit is used for acquiring threshold values corresponding to a plurality of preset oral cavity areas respectively; and the screening unit is used for determining the matched brushing areas according to the current posture information, wherein the matched brushing areas are at least one of a plurality of preset oral areas.
In one embodiment, the screening unit is specifically configured to determine a target threshold value for matching the current pose information; and taking the preset oral cavity area corresponding to the target threshold value as the matched tooth brushing area.
Specific limitations on the brushing-area identification device can be found in the above description of the brushing-area identification method and are not repeated here. The various modules in the above brushing-area identification device can be implemented in whole or in part by software, hardware, or combinations thereof. The above modules can be embedded in, or independent of, the processor of the brushing device in hardware form, or stored in software form in the memory of the brushing device so that the processor can invoke and execute the operations corresponding to the above modules. It should be noted that, in the embodiments of the present application, the division into modules is schematic and is merely a division by logical function; other division manners may be used in actual implementation.
In one embodiment, a brushing device is provided that includes a memory having a computer program stored therein and a processor that executes the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, implements the steps of the method embodiments described above.
Those skilled in the art will appreciate that implementing all or part of the above-described methods may be accomplished by a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the flows of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include read-only memory (Read-Only Memory, ROM), magnetic tape, floppy disk, flash memory, optical memory, and the like. Volatile memory may include random access memory (Random Access Memory, RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM).
In the description of the present specification, reference to the terms "some embodiments," "other embodiments," "desired embodiments," and the like, means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic descriptions of the above terms do not necessarily refer to the same embodiment or example.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as the combinations of these technical features involve no contradiction, they should be considered within the scope of this specification.
The above embodiments merely represent several implementations of the present application; their description is relatively specific and detailed, but is not to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the appended claims.

Claims (14)

1. A method for identifying brushing areas, comprising:
acquiring current posture information of a tooth brushing device;
screening out at least one matched brushing area that matches the current posture information;
if the current posture information corresponds to an initial brushing area, identifying a target brushing area from the at least one matched brushing area according to a first preset rule;
if the current posture information corresponds to a non-initial brushing area, acquiring the previous brushing area corresponding to the previous posture information, wherein the previous brushing area at least comprises a previous target brushing area;
performing similarity pairing between each matched brushing area and each previous brushing area to obtain a pairing result, wherein the pairing result comprises the similarity between each matched brushing area and each previous brushing area; and
determining the target brushing area and an alternative brushing area according to the pairing result, wherein the target brushing area represents the area the user is currently brushing.
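The control flow of claim 1 can be sketched in Python as follows; the function names, data shapes, and tie-breaking behavior are illustrative assumptions made for exposition, not the patented implementation:

```python
# Illustrative sketch of the claim-1 flow. The area representation, the
# match_zones screening function, and the similarity metric are all
# assumptions, not values from the patent.

def identify_target_zone(pose, prev_zones, match_zones, similarity):
    """Return (target_zone, alternate_zones) for the current pose.

    pose        -- current posture information of the brush
    prev_zones  -- brushing areas identified at the previous step
                   (empty list when this is the initial brushing area)
    match_zones -- callable: pose -> non-empty list of candidate areas
    similarity  -- callable: (area, area) -> similarity score
    """
    candidates = match_zones(pose)      # screen areas matching the pose
    if not prev_zones:                  # initial brushing area:
        # first preset rule (sketch): candidates are assumed pre-sorted
        # by priority, so the first one wins
        target, *alternates = candidates
        return target, alternates
    # non-initial: pair every candidate with every previous area
    scored = [(similarity(c, p), c) for c in candidates for p in prev_zones]
    best_score = max(s for s, _ in scored)
    best = [c for s, c in scored if s == best_score]
    return best[0], best[1:]            # target + alternative areas
```

With a hypothetical similarity function both branches can be exercised: with no previous area the highest-priority candidate is returned; otherwise the candidate most similar to a previous area wins.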
2. The method of claim 1, wherein the identifying a target brushing area from the at least one matched brushing area according to a first preset rule comprises:
if there is one matched brushing area, taking the matched brushing area as the target brushing area; and
if there are two or more matched brushing areas, determining the target brushing area and alternative brushing areas according to the priority of each matched brushing area, wherein the alternative brushing areas are the matched brushing areas other than the target brushing area, and the priority is determined according to common human brushing habits or the user's personalized brushing habits.
3. The method of claim 2, wherein the determining the target brushing area according to the priority of each matched brushing area comprises:
determining whether a first preset oral area exists among the two or more matched brushing areas;
if so, taking the first preset oral area with the highest priority as the target brushing area according to the priority of each first preset oral area, wherein a first preset oral area is an area where the user habitually begins brushing; and
if not, taking the matched brushing area with the highest priority as the target brushing area according to the priority of each matched brushing area.
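The priority rule of claim 3 reduces to a small selection function; the zone names, priority values, and the membership of the "first preset oral area" set below are illustrative assumptions:

```python
# Sketch of the claim-3 priority rule; priorities and area names are
# assumptions, not values from the patent.

def pick_by_priority(matched, priority, habitual_start_areas):
    """Pick the target area from two or more matched brushing areas.

    matched              -- matched brushing areas (len >= 2)
    priority             -- dict: area -> priority (higher wins)
    habitual_start_areas -- first preset oral areas, i.e. areas where
                            the user habitually begins brushing
    """
    preferred = [a for a in matched if a in habitual_start_areas]
    pool = preferred if preferred else matched   # claim 3's if/else
    target = max(pool, key=lambda a: priority[a])
    alternatives = [a for a in matched if a != target]
    return target, alternatives
```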
4. The method according to claim 2, wherein the method further comprises:
acquiring preset auxiliary judgment information; and
determining the priority of each brushing area according to the auxiliary judgment information, wherein the higher the correlation between a brushing area and the auxiliary judgment information, the higher the priority of that brushing area.
5. The method of claim 1, wherein the determining the target brushing area and the alternative brushing area according to the pairing result comprises:
obtaining the candidate brushing area corresponding to the greatest similarity, wherein the candidate brushing area is at least one of the matched brushing areas;
if there is one candidate brushing area, taking the candidate brushing area as the target brushing area, and taking the previous brushing area corresponding to the greatest similarity as the previous target brushing area; and
if there are two or more candidate brushing areas, taking one candidate brushing area as the target brushing area, and carrying the other candidate brushing areas over as previous alternative brushing areas for the next round of brushing-area identification.
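Claim 5's resolution of the pairing result can be sketched as below; representing the pairing result as (similarity, matched area, previous area) tuples is an assumption made for exposition:

```python
# Sketch of the claim-5 rule. A pairing result is assumed to be a list of
# (similarity, matched_area, previous_area) tuples.

def resolve_pairing(pairing_result):
    best = max(s for s, _, _ in pairing_result)
    winners = [(m, p) for s, m, p in pairing_result if s == best]
    target, prev_target = winners[0]
    # remaining maximum-similarity candidates carry over as previous
    # alternative areas for the next round of identification
    carry_over = [m for m, _ in winners[1:]]
    return target, prev_target, carry_over
```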
6. The method of claim 1, wherein the performing similarity pairing between each matched brushing area and each previous brushing area comprises:
if there is exactly one matched brushing area and exactly one previous brushing area, taking the matched brushing area as the target brushing area; and
if the numbers of matched brushing areas and previous brushing areas are not both one, performing similarity pairing between each matched brushing area and each previous brushing area;
wherein, when there are two or more previous brushing areas, the previous brushing areas further comprise a previous alternative brushing area.
7. The method of claim 1, wherein the performing similarity pairing between each matched brushing area and each previous brushing area to obtain a pairing result comprises:
determining the distance between each matched brushing area and each previous brushing area; and
using the distance as the similarity measure, wherein the smaller the distance, the greater the similarity.
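One way to realize claim 7's distance-based similarity is to give each brushing area a reference orientation vector and map distance monotonically to similarity; the orientation vectors and the 1/(1+d) mapping below are assumptions, not the patented metric:

```python
import math

# Sketch of the claim-7 metric: each area is assumed to carry a reference
# orientation vector; smaller Euclidean distance means greater similarity.

def similarity(orient_a, orient_b):
    d = math.dist(orient_a, orient_b)   # Euclidean distance (Python 3.8+)
    return 1.0 / (1.0 + d)              # d -> 0 gives similarity -> 1
```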
8. The method of any one of claims 1 to 7, further comprising, before the identifying a target brushing area from the at least one matched brushing area:
acquiring posture change information of the tooth brushing device;
performing a logical pruning judgment according to the posture change information; and
eliminating redundant brushing areas from the matched brushing areas according to the result of the logical pruning judgment.
9. The method of claim 8, wherein the result of the logical pruning judgment comprises at least one of a brushing-area switching judgment, a brushing-area same-side judgment, and a brushing-area side-reversal judgment.
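The pruning step of claims 8 and 9 amounts to filtering candidates through a set of judgment rules; the rule encoding and the side-reversal example below are illustrative assumptions:

```python
# Sketch of the claims-8/9 logical pruning; a rule returns False when a
# candidate area is redundant given the posture change.

def prune(candidates, pose_change, rules):
    return [a for a in candidates
            if all(rule(a, pose_change) for rule in rules)]
```

For instance, a hypothetical side-reversal rule could drop same-side candidates after a large roll flip of the brush.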
10. The method of any one of claims 1 to 7, wherein the screening out at least one matched brushing area that matches the current posture information comprises:
acquiring the threshold values respectively corresponding to a plurality of preset oral areas; and
determining the matched brushing area according to the current posture information, wherein the matched brushing area is at least one of the plurality of preset oral areas.
11. The method of claim 10, wherein the determining the matched brushing area according to the current posture information comprises:
determining a target threshold value that matches the current posture information; and
taking the preset oral area corresponding to the target threshold value as the matched brushing area.
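The threshold screening of claims 10 and 11 can be sketched with per-area angle windows; the area names, the choice of pitch and roll as the posture signal, and the angle values are all illustrative assumptions:

```python
# Sketch of the claims-10/11 screening; the thresholds are illustrative.

AREA_THRESHOLDS = {
    # area: (pitch_min, pitch_max, roll_min, roll_max) in degrees
    "upper-outer": (-90, -30, -45, 45),
    "lower-outer": (30, 90, -45, 45),
}

def matched_areas(pitch, roll):
    """Return every preset oral area whose thresholds contain the pose."""
    return [area for area, (p0, p1, r0, r1) in AREA_THRESHOLDS.items()
            if p0 <= pitch <= p1 and r0 <= roll <= r1]
```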
12. A device for identifying brushing areas, comprising:
a posture information acquisition module, used for acquiring current posture information of a tooth brushing device;
a screening module, used for screening out at least one matched brushing area that matches the current posture information; and
an identification module, used for: identifying a target brushing area from the at least one matched brushing area according to a first preset rule if the current posture information corresponds to an initial brushing area; if the current posture information corresponds to a non-initial brushing area, acquiring the previous brushing area corresponding to the previous posture information, wherein the previous brushing area at least comprises a previous target brushing area; performing similarity pairing between each matched brushing area and each previous brushing area to obtain a pairing result, wherein the pairing result comprises the similarity between each matched brushing area and each previous brushing area; and determining the target brushing area and an alternative brushing area according to the pairing result, wherein the target brushing area represents the area the user is currently brushing.
13. A brushing device, comprising a memory and a processor, the memory storing a computer program, wherein the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 11.
14. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the method of any one of claims 1 to 11.
CN202110627164.7A 2021-06-04 2021-06-04 Method and device for identifying brushing area, brushing equipment and storage medium Active CN113516175B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110627164.7A CN113516175B (en) 2021-06-04 2021-06-04 Method and device for identifying brushing area, brushing equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113516175A CN113516175A (en) 2021-10-19
CN113516175B true CN113516175B (en) 2024-04-09

Family

ID=78065428

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110627164.7A Active CN113516175B (en) 2021-06-04 2021-06-04 Method and device for identifying brushing area, brushing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113516175B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114299576A (en) * 2021-12-24 2022-04-08 广州星际悦动股份有限公司 Oral cavity cleaning area identification method, tooth brushing information input system and related device
CN114387295A (en) * 2021-12-25 2022-04-22 广州星际悦动股份有限公司 Motion trajectory generation method and device, electric toothbrush and storage medium
CN114343901A (en) * 2021-12-31 2022-04-15 广州星际悦动股份有限公司 Control method of oral cleaning device and setting method of oral cleaning strategy
CN114329991A (en) * 2021-12-31 2022-04-12 广州星际悦动股份有限公司 Oral cavity cleaning scoring method and device, oral cavity cleaning device and storage medium
CN114577259A (en) * 2022-02-08 2022-06-03 深圳市云顶信息技术有限公司 Tooth brushing information feedback method and device, electronic equipment and storage medium
CN117193611A (en) * 2022-05-31 2023-12-08 广州星际悦动股份有限公司 Oral care instruction method, device, display equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110537989A (en) * 2019-09-11 2019-12-06 爱芽(北京)科技有限公司 Tooth cleaning method and system
CN112051868A (en) * 2020-09-10 2020-12-08 湖北咿呀医疗投资管理股份有限公司 Method and system for positioning track of intelligent electric toothbrush

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5482209B2 (en) * 2010-01-08 2014-05-07 オムロンヘルスケア株式会社 electric toothbrush



Similar Documents

Publication Publication Date Title
CN113516175B (en) Method and device for identifying brushing area, brushing equipment and storage medium
US11517789B2 (en) Method for monitoring swimming state by means of wearable device, and wearable device
WO2019033586A1 (en) Swimming exercise analysis method based on smartwatch and smartwatch
WO2018040757A1 (en) Wearable device and method of using same to monitor motion state
CN111199230B (en) Method, device, electronic equipment and computer readable storage medium for target detection
CN109685037B (en) Real-time action recognition method and device and electronic equipment
JP6835218B2 (en) Crowd state recognizer, learning method and learning program
CN108245869B (en) Swimming information detection method and device and electronic equipment
CN111291865B (en) Gait recognition method based on convolutional neural network and isolated forest
CN111345817A (en) QRS complex position determination method, device, equipment and storage medium
CN111288986A (en) Motion recognition method and motion recognition device
EP3718056A1 (en) Selecting learning model
Çatal et al. LatentSLAM: unsupervised multi-sensor representation learning for localization and mapping
KR20230080938A (en) Method and apparatus of gesture recognition and classification using convolutional block attention module
CN111803902B (en) Swimming stroke identification method and device, wearable device and storage medium
CN112884132B (en) Tooth brushing detection method and device based on neural network, electric toothbrush and medium
CN114067406A (en) Key point detection method, device, equipment and readable storage medium
CN111160173B (en) Gesture recognition method based on robot and robot
CN110555353B (en) Action recognition method and device
KR101870542B1 (en) Method and apparatus of recognizing a motion
CN113916223B (en) Positioning method and device, equipment and storage medium
Kubota et al. Structured learning for partner robots based on natural communication
Awais et al. Online intention learning for human-robot interaction by scene observation
US10661733B2 (en) Interaction method, interaction apparatus and vehicle-mounted device
CN112527118B (en) Head posture recognition method based on dynamic time warping

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant