CN113869296A - Terminal equipment and automatic control method thereof - Google Patents

Terminal equipment and automatic control method thereof

Info

Publication number
CN113869296A
CN113869296A CN202010611667.0A
Authority
CN
China
Prior art keywords
confidence
mode
information
automatic control
recognition result
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010611667.0A
Other languages
Chinese (zh)
Inventor
朱泽春
李志鹏
乔中义
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Joyoung Household Electrical Appliances Co Ltd
Original Assignee
Hangzhou Joyoung Household Electrical Appliances Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Joyoung Household Electrical Appliances Co Ltd filed Critical Hangzhou Joyoung Household Electrical Appliances Co Ltd
Priority to CN202010611667.0A priority Critical patent/CN113869296A/en
Publication of CN113869296A publication Critical patent/CN113869296A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a terminal device and an automatic control method thereof. The terminal device comprises an information acquisition apparatus for acquiring target information, and the method comprises the following steps: acquiring a recognition result of the target information; determining a confidence of the recognition result; and determining a corresponding execution mode according to the confidence. With the scheme of the embodiment, the user's confirmation intention can be acquired accurately, the practical accuracy of image recognition and similar algorithms is improved, information is confirmed more reliably, user interventions and misoperations are reduced, the probability that the user must correct errors is lowered, and the user experience is improved.

Description

Terminal equipment and automatic control method thereof
Technical Field
The present disclosure relates to terminal application technologies, and in particular, to a terminal device and an automatic control method thereof.
Background
Currently, mobile internet applications include many scenarios in which a user must confirm an operation, particularly confirmation based on the recognition result of image information acquired by a camera, for example: face-recognition confirmation when paying for a purchase on a mobile terminal; face or fingerprint recognition confirmation when entering through a smart door lock at home; and confirmation by an intelligent household appliance that recognizes food material types from images during automatic cooking.
The prior art has the defect that, in a prompt box or confirmation page containing a confirm button and a cancel button, the user must confirm the operation by clicking or sliding a button; an inadvertent touch or other touch event may therefore cause a mistaken click or misoperation, which inconveniences the user and degrades the user experience.
Disclosure of Invention
The application provides a terminal device and an automatic control method thereof, which can accurately acquire the user's confirmation intention, improve the practical accuracy of image recognition and similar algorithms, reduce the number of user interventions, reduce misoperation, and reduce the probability that the user must correct errors.
The embodiment of the application provides an automatic control method of terminal equipment, wherein the terminal equipment can comprise an information acquisition device for acquiring target information; the method may include:
acquiring a recognition result of the target information;
determining a confidence of the recognition result;
and determining a corresponding execution mode according to the confidence.
In an exemplary embodiment of the present application, the execution mode may include an automatic control mode and a one-key confirmation mode.
In an exemplary embodiment of the present application, the determining the corresponding execution mode according to the confidence may include:
and when the confidence is greater than or equal to a first confidence threshold, entering the automatic control mode according to the recognition result.
In an exemplary embodiment of the present application, the automatic control mode may include: and directly entering an execution flow of the function corresponding to the identification result.
In an exemplary embodiment of the present application, the determining the corresponding execution mode according to the confidence may include:
entering the one-key confirmation mode when the confidence is less than the first confidence threshold and greater than or equal to a second confidence threshold; wherein the first confidence threshold is greater than the second confidence threshold.
In an exemplary embodiment of the present application, the one-key confirmation mode may include: providing one or more option controls, so that confirmation can be completed with a single action on an option control;
wherein an option control either represents that the recognition result is correct, or represents a candidate that may be the correct result.
In an exemplary embodiment of the present application, the execution mode may further include: a one-key revocation mode;
the determining the corresponding execution mode according to the confidence may include: entering the one-key revocation mode when the confidence is less than a second confidence threshold; and/or,
the method may further comprise: after the automatic control mode is executed, providing an option control entering the one-key revocation mode, and revoking an execution process corresponding to the automatic control mode when the option control of the one-key revocation mode is triggered.
In an exemplary embodiment of the present application, the method may further include:
after the recognition result of the target information is obtained, performing information verification on the recognition result; when the recognition result is verified to be correct, determining the confidence of the recognition result; and entering a one-key return mode when verification of the recognition result fails.
In an exemplary embodiment of the present application, the method may further include: after the recognition result of the target information is obtained, determining the preference corresponding to the recognition result, and determining the corresponding execution mode according to the confidence and the preference.
The embodiment of the present application further provides a terminal device, which may include a processor and a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are executed by the processor, the method for automatically controlling a terminal device is implemented.
The terminal device of the embodiment of the application may comprise an information acquisition apparatus for acquiring target information, and the method may include: acquiring a recognition result of the target information; determining a confidence of the recognition result; and determining a corresponding execution mode according to the confidence. With the scheme of the embodiment, the user's confirmation intention can be acquired accurately, the practical accuracy of image recognition and similar algorithms is improved, information is confirmed more reliably, user interventions and misoperations are reduced, the probability that the user must correct errors is lowered, and the user experience is improved.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. Other advantages of the present application may be realized and attained by the instrumentalities and combinations particularly pointed out in the specification and the drawings.
Drawings
The accompanying drawings are included to provide an understanding of the present disclosure and are incorporated in and constitute a part of this specification, illustrate embodiments of the disclosure and together with the examples serve to explain the principles of the disclosure and not to limit the disclosure.
Fig. 1 is a flowchart of a terminal interaction method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an interaction pattern of a one-touch return according to an embodiment of the present application;
fig. 3 is a flowchart of a method for determining confidence of information to be confirmed according to an embodiment of the present application;
fig. 4 is a flowchart illustrating a method for determining processing preference of information to be confirmed according to an embodiment of the present application;
fig. 5 is a first schematic diagram of a terminal interaction method according to an embodiment of the present application;
fig. 6 is a second schematic diagram of a terminal interaction method according to an embodiment of the present application;
FIG. 7 is a schematic diagram of an auto-confirm mode according to an embodiment of the present application;
FIG. 8 is a schematic view of a one-touch confirmation mode according to an embodiment of the present application;
FIG. 9 is a diagram illustrating a key revocation mode according to an embodiment of the present application;
FIG. 10 is a schematic view illustrating a message to be confirmed being an image of food material in an oven according to an embodiment of the present application;
fig. 11 is a schematic diagram of a terminal structure according to an embodiment of the present application.
Detailed Description
The present application describes embodiments, but the description is illustrative rather than limiting and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the embodiments described herein. Although many possible combinations of features are shown in the drawings and discussed in the detailed description, many other combinations of the disclosed features are possible. Any feature or element of any embodiment may be used in combination with or instead of any other feature or element in any other embodiment, unless expressly limited otherwise.
The present application includes and contemplates combinations of features and elements known to those of ordinary skill in the art. The embodiments, features and elements disclosed in this application may also be combined with any conventional features or elements to form a unique inventive concept as defined by the claims. Any feature or element of any embodiment may also be combined with features or elements from other inventive aspects to form yet another unique inventive aspect, as defined by the claims. Thus, it should be understood that any of the features shown and/or discussed in this application may be implemented alone or in any suitable combination. Accordingly, the embodiments are not limited except as by the appended claims and their equivalents. Furthermore, various modifications and changes may be made within the scope of the appended claims.
Further, in describing representative embodiments, the specification may have presented the method and/or process as a particular sequence of steps. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described. Other orders of steps are possible as will be understood by those of ordinary skill in the art. Therefore, the particular order of the steps set forth in the specification should not be construed as limitations on the claims. Further, the claims directed to the method and/or process should not be limited to the performance of their steps in the order written, and one skilled in the art can readily appreciate that the sequences may be varied and still remain within the spirit and scope of the embodiments of the present application.
The embodiment of the application provides an automatic control method of terminal equipment, wherein the terminal equipment can comprise an information acquisition device for acquiring target information; as shown in fig. 1, the method may comprise steps S101-S103:
S101, acquiring a recognition result of the target information;
S102, determining a confidence of the recognition result;
S103, determining a corresponding execution mode according to the confidence.
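As a rough, non-authoritative sketch of steps S101-S103, the dispatch by confidence can be written as follows in Python; the `recognize` callable, the mode names, and the threshold values are illustrative assumptions, not taken from the patent:

```python
def automatic_control(target_info, recognize, t1=0.9, t2=0.6):
    """Select an execution mode from the recognition confidence.

    recognize(target_info) is a hypothetical function returning
    (result, confidence); t1 and t2 stand in for the first and second
    confidence thresholds (t1 > t2), with illustrative values.
    """
    result, confidence = recognize(target_info)        # S101 + S102
    if confidence >= t1:                               # high confidence
        return ("automatic_control_mode", result)
    if confidence >= t2:                               # medium confidence
        return ("one_key_confirmation_mode", result)
    return ("one_key_revocation_mode", result)         # low confidence
```

Under these assumptions, a recognizer that returns ("beef", 0.95) would enter the automatic control mode directly.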
In an exemplary embodiment of the present application, information to be confirmed about a control instruction may be received and taken as the target information; the information to be confirmed may be feedback information for the control instruction.
In the exemplary embodiment of the present application, the control instruction may include, but is not limited to, a voice control instruction, an operation control instruction (e.g., a touch operation on a display device, a key operation on a preset key, etc.), a gesture instruction, a card-swiping instruction, and the like, for example, starting a household appliance, swiping a card to open a door, face-recognition entry, or face-recognition payment.
In an exemplary embodiment of the present application, the target information may include any one or more of: state descriptive information, image information, biometric information, payment information, and sensor measurement data;
the biometric information may include any one of: iris information, fingerprint information, voice information, face information, and gesture information.
In an exemplary embodiment of the present application, for example, when the scheme of the embodiment of the present application is applied to an intelligent oven in the field of home appliances, the target information received by the terminal device may be an image of food material inside the oven, food material type information after food material identification, baking tray and grill position information, and the like. When the scheme of the embodiment of the application is applied to management systems such as entrance guard attendance in the security field, the target information received by the terminal equipment can be face image information, fingerprint image information, finger vein or palm vein image information, iris information and other biological characteristic information which uniquely identifies natural people.
In the exemplary embodiment of the present application, the target information may be information to be confirmed generated by the terminal device itself, or sent by another device or terminal. When the target information is generated by the terminal itself, the scheme of the embodiment can be implemented by the terminal device alone. The terminal device may include a request module, a display module, a response module, and a sensing module. The request module, located in the terminal device, is responsible for requesting and issuing information to be confirmed to the system. The display module, located in the terminal device, serves as the carrier that provides the several confirmation interaction manners (confirmation execution modes). The response module, located in the terminal device, is responsible for responding to request instructions from the applications of the local machine and sending the generated information to be confirmed to the corresponding application. The sensing module, located in the terminal device, collects raw information data, including but not limited to images and video shot by a camera on the terminal device, bar-code and two-dimensional-code information scanned by the camera, and data collected by various sensors; a recognition result is generated after this data is processed by the local processor. The scheme of the embodiment thus realizes a safety control method based on multiple confirmation interaction manners in an application environment that requires no networking.
In an exemplary embodiment of the present application, the terminal device may include, but is not limited to: household appliances, home master control units, mobile phones, palm computers, intelligent wearable devices and the like.
In an exemplary embodiment of the present application, the man-machine interaction mode adopted by the terminal device may include: display interaction and/or voice interaction.
In an exemplary embodiment of the present application, the display interaction may be interacted through a display module (e.g., a display screen, a touch display screen, etc.), and the voice interaction may be interacted through a voice recognition and playing module.
In an exemplary embodiment of the present application, the method may further include: after the recognition result of the target information is obtained, performing information verification on the recognition result; when the recognition result is verified to be correct, determining the confidence of the recognition result; and entering a one-key return mode when verification of the recognition result fails.
In an exemplary embodiment of the present application, the method may further include: after receiving the target information, performing information verification on the target information; when the target information is verified to be correct, recognizing the target information; and entering a one-key return mode when verification of the target information fails.
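The verification step described above can be sketched as a small guard in front of recognition; `verify` and `recognize` are assumed callables and the returned mode strings are illustrative:

```python
def verify_and_recognize(target_info, verify, recognize):
    """Check the raw target information before recognizing it.

    When verification fails, the terminal enters the one-key return mode
    (switch back to the initial page and re-request); otherwise the
    recognition result is produced for later confidence scoring.
    """
    if not verify(target_info):
        return ("one_key_return_mode", None)
    return ("recognized", recognize(target_info))
```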
In the exemplary embodiment of the present application, after the target information is received, it may be checked by a preset security control module. When the check fails, the target information itself is incorrect, so a one-key return can trigger a new request. As shown in fig. 2, when the check of the target information or the recognition result fails, the one-key return may be implemented on the current terminal. The terminal may include a display module (such as a touch display module); the main body area of the display module may display, but is not limited to, request-failure information or returned-error information. The one-key return is realized by detecting the user's trigger operation on this information and switching to the initial page, or by switching to the initial page automatically after a time limit is reached, so that the desired function can be requested again (for example, by clicking a button).
In the exemplary embodiment of the application, the one-key return mode realizes convenient operation when the user's intention changes, and avoids the interaction drawbacks of manual return and repeated user operations.
In an exemplary embodiment of the present application, a confidence of the recognition result may be determined when both the target information and the recognition result are verified to be correct, and a corresponding execution mode is determined according to the confidence.
In an exemplary embodiment of the present application, the execution mode may include an automatic control mode and a one-key confirmation mode.
In an exemplary embodiment of the present application, the automatic control mode may include: and directly entering an execution flow of the function corresponding to the identification result.
In exemplary embodiments of the present application, the automatic control mode may also be referred to as an automatic confirmation mode, which may include, but is not limited to: directly presenting the result confirmed for the target information in a man-machine interaction manner (such as display or voice), and, when no change or cancel command for the result is detected within a preset time length, directly entering the execution flow of the function corresponding to the confirmed information.
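The timed auto-confirmation described above might look like the following sketch, where `wait_for_cancel(timeout_s)` is an assumed helper that returns True if a change or cancel command arrives within the time limit:

```python
def auto_confirm(result, wait_for_cancel, timeout_s=5.0):
    """Automatic confirmation mode: present the result, then proceed
    unless the user cancels within the preset time length."""
    print(f"Confirming: {result}")        # shown via display or voice
    if wait_for_cancel(timeout_s):        # user changed or cancelled
        return "cancelled"
    return "execute_function"             # enter the execution flow
```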
in an exemplary embodiment of the present application, the one-touch confirmation mode may include: providing one or more option controls for a piece of confirmation through the option controls;
wherein the option control is used for representing the correctness of the recognition result; or to characterize the correct options possible.
In an exemplary embodiment of the present application, the one-key confirmation mode may include, but is not limited to: providing one or more options to be confirmed (namely, the option controls mentioned above) in the man-machine interaction manner, where the options may include the target information. The options other than the target information may be alternatives ordered from high to low by confidence and/or preference. When any one of the options is selected, one-key confirmation is realized and the execution flow of the function corresponding to the selected option is entered directly.
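Ordering the candidate options for the one-key confirmation mode can be sketched as below; the (label, confidence, preference) tuple shape is an assumption made for illustration:

```python
def build_confirmation_options(candidates):
    """Sort candidates by confidence and preference, high to low,
    and mark the top option as the default selected one."""
    ordered = sorted(candidates, key=lambda c: (c[1], c[2]), reverse=True)
    return [{"label": label, "selected": i == 0}
            for i, (label, _conf, _pref) in enumerate(ordered)]
```

For candidates [("pork", 0.8, 0.1), ("beef", 0.6, 0.9)], "pork" would be listed first and selected by default.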
In an exemplary embodiment of the present application, the determining the corresponding execution mode according to the confidence may include:
and when the confidence coefficient is greater than or equal to a first confidence coefficient threshold value, entering the automatic control mode according to the recognition result.
In an exemplary embodiment of the present application, the determining the corresponding execution mode according to the confidence may further include:
entering the one-key confirmation mode when the confidence is less than the first confidence threshold and greater than or equal to a second confidence threshold; wherein the first confidence threshold is greater than the second confidence threshold.
In an exemplary embodiment of the present application, as shown in fig. 5, when the target information received by the terminal passes the security control module (i.e., the confidence is greater than or equal to the first confidence threshold and the recognition result is considered trustworthy), the terminal may automatically perform the confirmation operation and display the confirmed result information in the display module. When the target information does not pass the security control module (i.e., the confidence is less than the first confidence threshold but greater than or equal to the second confidence threshold, so the recognition result is not fully trusted), the terminal may display the recognition result in the display module, request the user to select a target option according to his or her intention to complete the confirmation operation, and then display the confirmed result information. Finally, the terminal device responds to the event after the confirmation operation.
In an exemplary embodiment of the present application, the logic of the security control module may be as shown in fig. 6. The confidence (or reliability, security, score, or any other parameter that quantifies the credibility of the target information) and the user's habitual preference are each sorted from high to low. The module then judges whether the comprehensive confidence of the target information exceeds a set threshold (e.g., the first confidence threshold mentioned above). If it does, the automatic confirmation interaction manner may be invoked. If it does not, the module further judges whether the comprehensive user preference of the target information exceeds its own set threshold: if so, the one-key confirmation interaction manner may be invoked with the item of highest preference set as the default, selected option; if not, the one-key confirmation interaction manner may be invoked with the item of highest confidence set as the default, selected option.
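The decision logic of fig. 6 can be summarized in a short sketch; the threshold values and return labels here are assumptions, not values from the patent:

```python
def security_control(confidence, preference,
                     conf_threshold=0.9, pref_threshold=0.8):
    """Choose the interaction manner and how the default option is picked.

    High comprehensive confidence -> automatic confirmation; otherwise
    one-key confirmation, defaulting by preference when preference is
    high enough, else by confidence.
    """
    if confidence >= conf_threshold:
        return ("automatic_confirmation", None)
    if preference >= pref_threshold:
        return ("one_key_confirmation", "default_by_preference")
    return ("one_key_confirmation", "default_by_confidence")
```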
In an exemplary embodiment of the present application, the automatic confirmation mode (that is, the automatic control mode or automatic confirmation interaction manner) shown in fig. 7 may be executed, according to the confidence, after the recognition result passes the security control module (for example, the recognition result is confirmed to be trustworthy), and may be implemented on the terminal device. The terminal may include a display module (such as a touch display module). The confirmed result information is displayed in the main body area of the display module, and this area may attract the user's attention through color, font, flicker, dynamic visual effects, and similar techniques. If the user's intention is consistent with the confirmed result information, no further operation is needed, and the confirmation event is completed automatically after the time limit is reached. The automatic confirmation mode realizes automatic confirmation of high-confidence information, can effectively capture the user's intention, and avoids mistaken clicks and misoperations during confirmation interaction.
In the exemplary embodiment of the present application, the one-key confirmation mode (i.e., the one-key confirmation interaction manner) shown in fig. 8 may be executed after the target information passes the security control module (e.g., when the recognition result is not fully trusted), and may be implemented on the terminal device. The terminal device may include a display module (such as a touch display module). The main body area of the display module may contain the result information to be confirmed and confirmation buttons (i.e., the one or more option controls provided in the one-key confirmation mode mentioned above) for the user to click. The most likely result may be placed at the top of the area and selected by default: if the default item matches the user's intention, the user simply clicks the confirm button; if not, the user clicks the target option, after which confirmation executes automatically without a second click. This completes the one-key confirmation interaction. By pushing the most likely option as the default, the mode anticipates the user's intention, and the single-button design avoids mistaken clicks and misoperations during confirmation interaction.
In an exemplary embodiment of the present application, the execution mode may further include: a one-key revocation mode;
the determining the corresponding execution mode according to the confidence may include: entering the one-key revocation mode when the confidence is less than the second confidence threshold; and/or,
the method may further comprise: after the automatic control mode is executed, providing an option control entering the one-key revocation mode, and revoking an execution process corresponding to the automatic control mode when the option control of the one-key revocation mode is triggered.
In an exemplary embodiment of the application, the one-key revocation mode may provide, in a man-machine interaction manner, a selection item corresponding to the target information and a one-key revocation option; when the one-key revocation option is selected, the execution flow corresponding to the automatic control mode is revoked.
In an exemplary embodiment of the present application, the one-key revocation mode may adopt a slide-to-revoke confirmation interaction, as shown in fig. 9, which may be executed on the terminal when the confirmed result information displayed in the main body area of the display module is inconsistent with the user's intention. The terminal may include a touch display module whose main body area displays the confirmed result information. If the user's intention is inconsistent with the displayed information, the user can tap the main body area, a cancel button is activated and slides out from the bottom of the display module, and the user drags the selected main body area onto the cancel button, completing the one-key slide-to-revoke of the confirmation event. This interaction provides a convenient way to cancel when the user's intention changes, and avoids the interaction drawbacks of manual return and repeated operations.
In an exemplary embodiment of the present application, the one-key revocation mode may further allow the user to revoke the current execution flow with one key when an error is found in the execution flow corresponding to the automatic control mode.
In an exemplary embodiment of the present application, the display and touch functions of the display module may also be replaced by voice broadcast and voice control functions of the voice interaction module.
In an exemplary embodiment of the present application, the method may further include: after the recognition result of the target information is obtained, the preference degree corresponding to the recognition result is determined, and the corresponding execution mode is determined according to the confidence degree and the preference degree.
In the exemplary embodiments of the present application, only the confidence determination or only the preference determination may be performed on the recognition result, or both the confidence determination and the preference determination may be performed on the target information, for example, the confidence determination first and then the preference determination.
The confidence level judgment and preference level judgment scheme in the exemplary embodiment of the present application may be implemented by a preset security control module.
In the exemplary embodiments of the present application, the confidence determination and the preference determination are described in detail below, respectively.
In an exemplary embodiment of the present application, as shown in fig. 3, the performing confidence level determination on the recognition result may include steps S201 to S203:
S201, detecting the matching degree between the execution function corresponding to the control instruction of the target information and the function description and/or function display fed back in the recognition result;
S202, giving a corresponding confidence score according to the matching degree;
S203, comparing the confidence score with a pre-stored confidence threshold (such as the first confidence threshold), judging that the recognition result is credible when the confidence score is greater than or equal to the confidence threshold, and judging that the recognition result is not credible when the confidence score is smaller than the confidence threshold.
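Steps S201-S203 can be sketched as follows in Python; the feature-set representation, function names, and default threshold are illustrative assumptions, not details from the embodiment:

```python
# Hypothetical sketch of steps S201-S203. Feature sets, names, and the
# default threshold are assumptions for illustration, not from the patent.

def judge_confidence(instruction_features: set, result_features: set,
                     confidence_threshold: float = 0.9) -> bool:
    # S201: matching degree = share of the instruction's functional
    # features that also appear in the recognition-result feedback
    if not instruction_features:
        return False
    matching_degree = (len(instruction_features & result_features)
                       / len(instruction_features))

    # S202: give a confidence score according to the matching degree
    # (here the score is simply the matching degree itself)
    confidence_score = matching_degree

    # S203: credible only when the score reaches the stored threshold
    return confidence_score >= confidence_threshold

# Example: a "bake pizza" instruction vs. the recognized in-oven image
instruction = {"food:pizza", "heat:up_and_down", "tray:middle"}
result = {"food:pizza", "heat:up_and_down", "tray:middle"}
print(judge_confidence(instruction, result))  # True: every feature matches
```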
In an exemplary embodiment of the present application, the confidence level may be replaced by any parameter that identifies the credibility of the recognition result, such as reliability, security, matching degree, matching score, and the like.
In an exemplary embodiment of the present application, the matching degree may be determined by the matching degree of one or more functional features between the two (the execution function corresponding to the control instruction, and the function description and/or function display fed back in the recognition result). For example, when the control instruction is pizza baking, the food material corresponding to the execution function should be pizza, the corresponding heating manner is simultaneous upper and lower heating, and the position of the baking tray is the middle layer. If the recognition result for the pizza-baking control instruction is food material image information from inside the oven, then the food material information, heating manner information, baking tray position information, and so on contained in the image should be the same as the information corresponding to the execution function, and at least the food material information should be the same. The matching degree of the two can therefore be confirmed based on the matching degree of at least one of these functional features: when the matching degree is high, a high confidence score can be given, and when it is low, a low confidence score can be given.
In an exemplary embodiment of the present application, multiple functional features may be compared, and different matching degree scores may be given according to the different matching degrees: the higher the matching degree, the higher the matching degree score, and vice versa. After the comparison is completed, the resulting matching degree scores can be weighted and summed to obtain the final confidence score, where the weighting coefficients can be set to different values according to the functional features: the greater the importance of a functional feature, the larger its weighting coefficient, and the smaller its importance, the smaller the coefficient.
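A minimal sketch of the weighted calculation described above, assuming hypothetical per-feature matching scores and weighting coefficients (the specific values are illustrative only):

```python
# Illustrative weighted combination of per-feature matching scores; the
# feature names, scores, and coefficients below are assumed values.

def weighted_confidence(feature_scores: dict, weights: dict) -> float:
    """Weight each matching degree score by its feature's importance
    coefficient and normalize by the total weight."""
    total_weight = sum(weights.values())
    return sum(feature_scores[f] * weights[f] for f in feature_scores) / total_weight

# The food material matters most, so it carries the largest coefficient
scores = {"food_material": 1.0, "heating_mode": 0.8, "tray_position": 0.5}
weights = {"food_material": 0.6, "heating_mode": 0.25, "tray_position": 0.15}
print(round(weighted_confidence(scores, weights), 3))  # 0.875
```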
In the exemplary embodiments of the present application, the set confidence threshold (e.g., the first confidence threshold) need not be fixed, and may differ according to the usage scenario of the recognition result. For example, in a scenario where an intelligent cooking device confirms the food material information and executes an automatic cooking program after recognizing the food material, the confidence threshold of the recognition result may be required to be not lower than 90%; in a scenario of confirming information and carrying out a currency payment operation after face recognition, the confidence threshold may be required to be not lower than 95%; and in a scenario where information is confirmed after a fire is detected and an early-warning operation is performed during fire monitoring, the confidence threshold may be required to be higher still, for example, 98%.
In an exemplary embodiment of the application, the same recognition result may trigger different subsequent responses, and the confidence thresholds set for them may also differ. For example, after a confirmation operation on the recognition result generated by face recognition, one response may be to perform a money payment and another may be to open an entrance guard; the confidence threshold set for the former should be higher.
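The scenario-dependent thresholds mentioned above (90%, 95%, 98%) could be organized as a simple lookup, sketched here with hypothetical scenario keys:

```python
# Sketch of scenario-dependent confidence thresholds, using the three
# example figures from the text; the scenario keys are made-up names.
SCENARIO_THRESHOLDS = {
    "cooking_food_recognition": 0.90,  # automatic cooking after recognition
    "face_recognition_payment": 0.95,  # currency payment after face match
    "fire_early_warning": 0.98,        # fire monitoring / early warning
}

def is_credible(scenario: str, confidence_score: float) -> bool:
    return confidence_score >= SCENARIO_THRESHOLDS[scenario]

# The same 0.93 score is credible for cooking but not for payment
print(is_credible("cooking_food_recognition", 0.93))  # True
print(is_credible("face_recognition_payment", 0.93))  # False
```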
In an exemplary embodiment of the present application, as shown in fig. 4, the determining the preference degree of the target information may include steps S301 to S304:
s301, obtaining preference related data of the function corresponding to the function description and/or the function display fed back in the recognition result.
In an exemplary embodiment of the present application, the obtaining of the preference-related data of the function corresponding to the function description and/or the function presentation fed back in the recognition result may include:
acquiring historical execution data of the function corresponding to the function description and/or function display fed back in the recognition result;
performing data statistics on the execution preferences of the function according to the historical execution data;
sorting the plurality of execution parameters of the function according to the execution preferences, wherein the sorting comprises arranging the parameters in descending order of execution preference;
and using the arranged plurality of execution parameters as the preference-related data.
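The four sub-steps above can be sketched as follows, using occurrence frequency in the historical execution data as a stand-in for execution preference (an assumption; the embodiment does not fix how preference is quantified):

```python
# Sketch of the four sub-steps. Frequency of occurrence in the history
# stands in for "execution preference" - an assumption, since the text
# does not specify how preference is quantified.
from collections import Counter

def preference_related_data(historical_execution_data: list) -> list:
    counts = Counter(historical_execution_data)          # data statistics
    return [param for param, _ in counts.most_common()]  # descending order

history = ["start@7:10-7:30", "high_speed_10-15min", "start@7:10-7:30",
           "low_speed_2-5min", "start@7:10-7:30", "start@18:00-18:30"]
print(preference_related_data(history)[0])  # start@7:10-7:30 (most preferred)
```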
In the exemplary embodiment of the present application, the user's habitual preference is obtained from multiple pieces of historical data (i.e., historical execution data) recorded after the user issued such requests; the user's operation intention for such requests is statistically evaluated from the historical data to obtain the execution preferences for such requests, thereby obtaining the preference-related data.
In an exemplary embodiment of the present application, for example, for a control scheme of a range hood, the historical execution data may include: operating from 7:10 to 7:25 with high-speed rotation for 10 minutes followed by low-speed operation for 5 minutes; from 18:20 to 18:38 with high-speed rotation for 15 minutes followed by low-speed operation for 3 minutes; from 7:11 to 7:29 with high-speed rotation for 13 minutes followed by low-speed operation for 5 minutes; and from 19:00 to 19:15 with high-speed rotation for 11 minutes followed by low-speed operation for 4 minutes. Preference-related data can be counted from this historical execution data, including: starting the range hood at 7:10-7:30, high-speed rotation for 10-15 minutes, low-speed operation for 2-5 minutes, and starting the range hood at 18:00-18:30.
In the exemplary embodiment of the present application, it can be seen from the above historical execution data that the preference-related data are: starting the range hood at 7:10-7:30, high-speed rotation for 10-15 minutes, low-speed rotation for 2-5 minutes, and starting the range hood at 18:00-18:30. Among these, the execution preference for starting the range hood at 7:10-7:30 is the largest, followed by low-speed rotation for 2-5 minutes, then high-speed rotation for 10-15 minutes, while starting at 18:00-18:30 is the smallest. This execution preference ranking may be used as the preference-related data in the range hood control scheme.
S302, comparing the execution parameters involved in the function description and/or the function presentation with the preference related data.
In an exemplary embodiment of the present application, for example, when the execution parameter involved in the function description and/or function display is operating from 7:15 to 7:30 with low-speed operation for 7 minutes after high-speed rotation for 8 minutes, comparison with the preference-related data above shows that 7:15-7:30 conforms to the greatest preference of starting the range hood at 7:10-7:30, but low-speed operation for 7 minutes after high-speed rotation for 8 minutes conforms to neither the preference of high-speed rotation for 10-15 minutes nor that of low-speed rotation for 2-5 minutes.
S303, calculating the preference score of the recognition result according to the comparison result.
In an exemplary embodiment of the present application, the calculating a preference score of the recognition result according to the comparison result may include:
acquiring execution parameters matched with the preference related data in the execution parameters related to the function description and/or the function display;
carrying out weighted calculation on the matched execution parameters according to corresponding weighted numerical values to obtain the preference degree score;
wherein the weighted values are set according to the preference of the corresponding execution parameters; the higher the preference, the larger the weighted value, and the lower the preference, the smaller the weighted value.
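A sketch of the weighted preference score calculation applied to the range-hood example discussed in the text; the coincidence scores and weighting coefficients shown are illustrative assumptions:

```python
# Sketch of the weighted preference score. The per-parameter coincidence
# scores and weighting coefficients below are illustrative assumptions.

def preference_score(coincidence_scores: dict, weights: dict) -> float:
    """Weight each parameter's coincidence score by a coefficient that
    grows with that parameter's preference level, and sum the results."""
    return sum(coincidence_scores[p] * weights[p] for p in coincidence_scores)

# 7:15-7:30 coincides with the strongest preference (start at 7:10-7:30);
# the high/low speed durations do not coincide, so they score 0.
scores = {"start_time": 1.0, "high_speed_duration": 0.0, "low_speed_duration": 0.0}
weights = {"start_time": 0.5, "high_speed_duration": 0.3, "low_speed_duration": 0.2}
total = preference_score(scores, weights)
print(total)         # 0.5
print(total >= 0.6)  # False: below a 0.6 threshold -> non-preference info
```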
In an exemplary embodiment of the present application, a preference coincidence score may be obtained for each item according to its degree of coincidence with the different preference-related data: the higher the coincidence, the higher the preference coincidence score. The resulting preference coincidence scores can then be weighted and summed, where each weighting coefficient is determined by the preference level of the corresponding preference-related data: the higher the preference level of a given item of preference-related data, the higher its weighting coefficient, and conversely, the lower the preference level, the lower the coefficient.
In the exemplary embodiment of the present application, it can be seen from the above embodiments that 7:15-7:30 conforms to the greatest preference of starting the range hood at 7:10-7:30, but low-speed operation for 7 minutes after high-speed rotation for 8 minutes conforms to neither the preference of high-speed rotation for 10-15 minutes nor that of low-speed rotation for 2-5 minutes. Since the time window 7:15-7:30 coincides closely with the preference for starting the range hood at 7:10-7:30, it obtains a higher preference coincidence score; since the high-speed and low-speed durations do not conform to their respective preferences, the preference coincidence scores of the latter two items can be lower, for example, 0. In addition, since the preference for starting the range hood at 7:10-7:30 is the greatest, the weighting coefficient of the preference coincidence score corresponding to 7:15-7:30 is also higher.
S304, comparing the preference score with a pre-stored preference threshold; when the preference score is greater than or equal to the preference threshold, judging that the recognition result is preference information, and when the preference score is smaller than the preference threshold, judging that the recognition result is non-preference information.
In the exemplary embodiment of the present application, the set preference threshold (or habit preference threshold) may be determined comprehensively according to the confirmation operation frequency and the history of such requests in the terminal.
In an exemplary embodiment of the present application, it may be determined whether the recognition result of the current target information is habit information or preference information of the user by comparing the calculated preference degree score with a preference degree threshold.
In an exemplary embodiment of the present application, the performing confidence judgment and preference judgment on the recognition result may include:
when the result of the confidence determination on the recognition result is that the confidence score is smaller than the confidence threshold (such as the first confidence threshold), performing the preference determination on the recognition result and taking the preference determination result as the determination result of the recognition result; or,
respectively carrying out the confidence determination and the preference determination on the recognition result, performing a weighted calculation on the confidence score obtained in the confidence determination and the preference score obtained in the preference determination to obtain a comprehensive judgment score, and comparing the comprehensive judgment score with a preset comprehensive judgment threshold; when the comprehensive judgment score is greater than or equal to the comprehensive judgment threshold, confirming that the recognition result is credible, and when it is smaller than the comprehensive judgment threshold, confirming that the recognition result is not credible.
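The second variant above (weighted combination of the two scores) can be sketched as follows; the weights and the comprehensive judgment threshold are illustrative, as the embodiment does not specify values:

```python
# Sketch of the second variant: weighted combination of the confidence
# score and the preference score. The weights and the comprehensive
# judgment threshold are assumed values for illustration.

def comprehensive_judgment(confidence_score: float, pref_score: float,
                           w_conf: float = 0.6, w_pref: float = 0.4,
                           threshold: float = 0.8) -> bool:
    combined = w_conf * confidence_score + w_pref * pref_score
    return combined >= threshold

print(comprehensive_judgment(0.92, 0.70))  # True: 0.832 >= 0.8, credible
print(comprehensive_judgment(0.85, 0.50))  # False: 0.71 < 0.8, not credible
```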
In the exemplary embodiment of the present application, the above gives two processing embodiments for the case where both the confidence determination and the preference determination are performed.
In the exemplary embodiment of the present application, a corresponding execution mode is provided according to the determination result (confidence determination and/or preference determination), so that preliminary automatic processing of the recognition result is realized.
In an exemplary embodiment of the present application, as shown in fig. 6, the providing a corresponding execution mode according to the determination result to perform a preliminary automatic processing on the target information may include:
providing the automatic confirmation mode when it is determined that the target information is credible and/or the target information is preference information;
providing the one-key confirmation mode when it is determined that the target information is not credible and the target information is preference information;
providing the one-key revocation mode when it is determined that the target information is not credible and/or the target information is not preference information.
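The three branches above can be sketched as a mode-selection function. Because two of the branches use "and/or", the conditions can overlap; this sketch resolves the overlap by requiring both conditions for the automatic confirmation mode (one possible reading of the text, not the only one):

```python
# Sketch of the three-branch mode selection. Two branches use "and/or",
# so the conditions can overlap; this sketch requires both conditions
# for the automatic confirmation mode as one possible resolution.

def choose_execution_mode(credible: bool, is_preference: bool) -> str:
    if credible and is_preference:
        return "automatic_confirmation_mode"
    if not credible and is_preference:
        return "one_key_confirmation_mode"
    return "one_key_revocation_mode"

print(choose_execution_mode(True, True))    # automatic_confirmation_mode
print(choose_execution_mode(False, True))   # one_key_confirmation_mode
print(choose_execution_mode(False, False))  # one_key_revocation_mode
```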
In the exemplary embodiments of the present application, the scheme is described below by means of several specific application embodiments.
In an exemplary embodiment of the application, in an application embodiment of an intelligent cooking device, for example an intelligent cooking device such as a rice cooker or an oven, based on an interaction scheme of image recognition: when the confidence of the image-recognized food material is higher than the threshold, the automatic cooking control process is entered without interaction with the user; when the confidence is lower than the threshold, interaction occurs so that the user can confirm. This scheme reduces the number of times the user must intervene on one hand, and on the other hand also reduces the probability that the user must correct errors, improving the accuracy of the image recognition algorithm.
In an exemplary embodiment of the application, in an application embodiment of an intelligent oven in the home appliance field, as shown in fig. 10, when the target information received by the terminal is an image of the food material in the oven, together with the food material type information after food material identification and the baking tray grill position information, the threshold parameter (for example, the confidence threshold) in the safety control module is the confidence score of the food material recognition result, the set confidence threshold may be 90 points, and the user's habitual preference stored in the safety control module may be a taste preference for the oven's cooking of the food material, such as a moderate, crispy, or soft taste.
In an exemplary embodiment of the application, after the food material is placed in the oven and the one-key cooking button for food material identification is clicked, if the confidence score of the food material recognition result is higher than the set threshold (i.e., the confidence threshold, such as 90 points), the automatic confirmation interaction mode may be executed: for example, the main display area displays the image of the food material in the oven and the confirmation result information after food material identification, and then responds to the confirmation operation event, that is, automatically calls the corresponding cooking process to cook the food material.
In an exemplary embodiment of the application, if the confidence score of the food material recognition result is lower than the set threshold (for example, 90 points), the one-key confirmation interaction mode may be executed: the main display area displays the image of the food material in the oven and the food material category items (the category with the highest confidence score may be placed at the top and selected by default). When the default selected food material category is consistent with the user's intention, the user may click the one-key cooking button; when it is inconsistent, the user may click the target option and then the one-key cooking button to complete the confirmation automatically.
In an exemplary embodiment of the present application, in an application embodiment of voice interaction in the home appliance field, if the target information received by the terminal is voice instruction information for controlling a home appliance, the threshold parameter (the confidence threshold, for example the first confidence threshold) in the security control module may be a value measuring the reliability score of the voice instruction information after voice recognition, and the user's habitual preference in the security control module may be the user's habitual way of expressing a certain control instruction by voice. Taking a range hood with a voice interaction function as an example, after the user issues an instruction to start, stop, or adjust the air volume by voice: if the reliability score (confidence score) of the voice instruction recognition result is higher than the set threshold (the confidence threshold, such as the first confidence threshold), the automatic confirmation interaction mode can be executed (i.e., the automatic confirmation mode is provided), and the voice interaction module can broadcast the confirmed instruction result information and then respond to the confirmation operation event, that is, automatically control the working state of the range hood. If the reliability score of the voice instruction recognition result is lower than the set threshold, the one-key confirmation interaction mode can be executed (i.e., the one-key confirmation mode is provided): the voice interaction module broadcasts the recognized voice instruction result as a query, and the user need only answer with an affirmative word when the query matches the user's intention; when it does not match, the user answers with a negative word and interaction takes place again.
In an exemplary embodiment of the present application, in an application embodiment of a management system for entrance guard, attendance, and the like in the security field, the target information received by the terminal may be biometric information that uniquely identifies a natural person, such as face image information, fingerprint image information, or finger vein or palm vein image information, and the threshold parameter (the confidence threshold, for example the first confidence threshold) in the security control module is the matching degree of the biometric information that uniquely identifies the natural person. When a living body approaches the data acquisition device, the device automatically acquires images and performs recognition. If the matching degree score (confidence score) of the biometric recognition result is higher than the set threshold (the confidence threshold, such as the first confidence threshold), the automatic confirmation interaction mode can be executed (i.e., the automatic confirmation mode is provided): the main display area can display the confirmation result information of the target person, and the confirmation operation can then be responded to, that is, the verification passes, the gate is opened, the attendance information is recorded, and so on. If the matching degree score of the biometric recognition result is lower than the set threshold, the one-key confirmation interaction mode can be executed (i.e., the one-key confirmation mode is provided): the main display area prompts re-verification, or password entry for verification, or a request for assistance from management personnel.
In an exemplary embodiment of the present application, when a recognition result received by the terminal passes verification by the security control module (i.e., the recognition result is credible and/or the recognition result is preference information), the terminal automatically performs the confirmation operation and displays the confirmed result information in the display module. When the recognition result received by the terminal does not pass verification by the security control module (i.e., the recognition result is not credible and/or is not preference information), the terminal displays the recognition result in the display module and requests the user to select a target option according to the user's intention and complete the confirmation operation; the confirmed result information is then displayed in the display module, and finally the event after the confirmation operation is responded to. According to the scheme of the embodiments of the present application, the user's confirmation intention can be accurately acquired, repeated confirmation operations by the user on high-reliability recognition results are avoided, losses caused by the user's mistaken clicks and misoperations are also avoided, and the user interaction experience is improved.
The embodiment of the present application further provides a terminal device 1, as shown in fig. 11, which may include a processor 11 and a computer-readable storage medium 12, where the computer-readable storage medium 12 stores instructions, and when the instructions are executed by the processor 11, the method for automatically controlling a terminal device described in any one of the above items is implemented.
In an exemplary embodiment of the present application, the terminal device 1 may further include a human-computer interaction module 13, and when the instructions are executed by the processor 11 and the human-computer interaction module 13, the terminal interaction method described in one of the above items is implemented.
In the exemplary embodiment of the present application, any embodiment of the foregoing method embodiments may be applicable to the terminal embodiment, and details are not repeated here.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit. Such software may be distributed on computer readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). The term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data, as is well known to those of ordinary skill in the art. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media as known to those skilled in the art.

Claims (10)

1. An automatic control method of a terminal device, characterized in that the terminal device comprises an information acquisition device for acquiring target information; the method comprises the following steps:
acquiring an identification result of the target information;
determining a confidence of the recognition result;
and determining a corresponding execution mode according to the confidence.
2. The automatic control method of terminal equipment according to claim 1, wherein the execution modes include an automatic control mode and a one-touch confirmation mode.
3. The method of claim 2, wherein said determining a corresponding execution mode according to said confidence level comprises:
and when the confidence coefficient is greater than or equal to a first confidence coefficient threshold value, entering the automatic control mode according to the recognition result.
4. The automatic control method of a terminal device according to claim 2 or 3, wherein the automatic control mode includes: directly entering the execution flow of the function corresponding to the recognition result.
5. The method of claim 2, wherein said determining a corresponding execution mode according to said confidence level comprises:
entering the one-key confirmation mode when the confidence is smaller than a first confidence threshold and greater than or equal to a second confidence threshold; wherein the first confidence threshold is greater than the second confidence threshold.
6. The automatic control method of a terminal device according to claim 2 or 5, wherein the one-key confirmation mode includes: providing one or more option controls for one-key confirmation through the option controls;
wherein the option controls are used to characterize whether the recognition result is correct, or to characterize possible correct options.
7. The automatic control method of a terminal device according to claim 2, wherein the execution mode further includes: a one-key revocation mode;
the determining of the corresponding execution mode according to the confidence comprises: entering the one-key revocation mode when the confidence is less than a second confidence threshold; and/or,
the method further comprises the following steps: after the automatic control mode is executed, providing an option control entering the one-key revocation mode, and revoking an execution process corresponding to the automatic control mode when the option control of the one-key revocation mode is triggered.
8. The automatic control method of a terminal device according to any one of claims 1 to 3, characterized in that the method further comprises:
after the recognition result of the target information is obtained, performing information verification on the recognition result; when the recognition result is verified to be correct, determining the confidence of the recognition result; and entering a one-key return mode when the recognition result is verified to be erroneous.
9. The automatic control method of a terminal device according to any one of claims 1 to 3, characterized in that the method further comprises: after the recognition result of the target information is obtained, the preference degree corresponding to the recognition result is determined, and the corresponding execution mode is determined according to the confidence degree and the preference degree.
10. A terminal device comprising a processor and a computer-readable storage medium having instructions stored thereon, wherein the instructions, when executed by the processor, implement the automatic control method of the terminal device according to any one of claims 1 to 9.
CN202010611667.0A 2020-06-30 2020-06-30 Terminal equipment and automatic control method thereof Pending CN113869296A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010611667.0A CN113869296A (en) 2020-06-30 2020-06-30 Terminal equipment and automatic control method thereof

Publications (1)

Publication Number Publication Date
CN113869296A 2021-12-31

Family

ID=78981228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010611667.0A Pending CN113869296A (en) 2020-06-30 2020-06-30 Terminal equipment and automatic control method thereof

Country Status (1)

Country Link
CN (1) CN113869296A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117196560A (en) * 2023-11-07 2023-12-08 深圳市慧云智跑网络科技有限公司 Data acquisition method and system of card punching equipment based on Internet of things
CN117196560B (en) * 2023-11-07 2024-02-13 深圳市慧云智跑网络科技有限公司 Data acquisition method and system of card punching equipment based on Internet of things

Similar Documents

Publication Publication Date Title
US10573171B2 (en) Method of associating user input with a device
US10698990B2 (en) Eye movement traces authentication system, method, and non-transitory computer readable medium, the same which integrate with face recognition and hand recognition
CN109974235B (en) Method and device for controlling household appliance and household appliance
US7623970B2 (en) Personal authentication method and device
CN107329688B (en) Fingerprint acquisition method and terminal
US9760383B2 (en) Device configuration with multiple profiles for a single user using remote user biometrics
CN109074819A (en) Preferred control method based on operation-sound multi-mode command and the electronic equipment using it
US9600304B2 (en) Device configuration for multiple users using remote user biometrics
CN110895934A (en) Household appliance control method and device
CN106843669A (en) Application interface operating method and device
EP3339742B1 (en) Food preparation entity
CN107450329A (en) The control method and its device of home appliance
CN113869296A (en) Terminal equipment and automatic control method thereof
CN113760123A (en) Screen touch optimization method and device, terminal device and storage medium
CN113028597B (en) Voice control method and device
CN108363915A (en) unlocking method, mobile terminal and computer readable storage medium
CN115731684B (en) Control method and device of wireless key switch equipment
US11245543B2 (en) Identifying abnormal usage of electronic device
CN113038257B (en) Volume adjusting method and device, smart television and computer readable storage medium
CN113591600A (en) Cooking equipment control method and device based on user identification and gas stove
CN108732937B (en) Remote controller, mobile terminal, display method of control interface and medium
CN116305049B (en) Visual control system and method for tablet personal computer
CN111859103A (en) Control device, home appliance, communication device, server, and information presentation system
CN109993139B (en) Gesture recognition method, gesture recognition device and equipment
CN105912253B (en) Virtual photographing key triggering method and device and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination