CN112000024A - Method, device and equipment for controlling household appliance - Google Patents

Method, device and equipment for controlling household appliance

Info

Publication number
CN112000024A
CN112000024A (application CN202010904880.0A; granted as CN112000024B)
Authority
CN
China
Prior art keywords
action
actions
acquiring
action set
control instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010904880.0A
Other languages
Chinese (zh)
Other versions
CN112000024B (en)
Inventor
吴红东
董航
王守峰
原勇健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Haier Technology Co Ltd
Original Assignee
Qingdao Haier Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Haier Technology Co Ltd filed Critical Qingdao Haier Technology Co Ltd
Priority to CN202010904880.0A priority Critical patent/CN112000024B/en
Publication of CN112000024A publication Critical patent/CN112000024A/en
Application granted granted Critical
Publication of CN112000024B publication Critical patent/CN112000024B/en
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The application relates to the technical field of the Internet of Things and discloses a method for controlling household appliances. The method comprises the following steps: acquiring the category of a played video and acquiring a control instruction set; determining a recommended scene rule according to the category of the played video and the control instruction set; and controlling the household appliances according to the recommended scene rule. By acquiring the category of the played video and the control instruction set, determining a recommended scene rule from them, and controlling the household appliances accordingly, the household appliances can be linked while the user is playing a video, improving the user's video-watching experience. The application also discloses a device and equipment for controlling the household appliance.

Description

Method, device and equipment for controlling household appliance
Technical Field
The present application relates to the field of internet of things technology, and for example, to a method, an apparatus, and a device for controlling a home appliance.
Background
With the rapid development of Internet of Things and smart home technologies, various household appliances in the home, such as lighting systems, electric curtains, digital cinema systems and networked appliances, can be controlled according to scene rules through the Internet of Things.
In the process of implementing the embodiments of the present disclosure, at least the following problem was found in the related art: existing household appliance control technology cannot link household appliances while a user watches a video, resulting in a poor user experience.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
The embodiments of the disclosure provide a method, a device and equipment for controlling household appliances, so that household appliances can be linked while a video is playing.
In some embodiments, the method for controlling an electric home appliance includes:
acquiring the category of a played video and acquiring a control instruction set;
determining a recommended scene rule according to the category of the played video and the control instruction set;
controlling the household appliance according to the recommended scene rule;
the control instruction set comprises control instructions corresponding to the household electrical appliance; the recommended scene rule comprises the corresponding relation between the category of the played video and the control instruction.
In some embodiments, the obtaining of the category of the played video includes:
acquiring a key frame of the played video;
and acquiring the category of the played video according to the key frame.
In some embodiments, obtaining the category of the played video according to the key frame includes:
acquiring character information in the key frame;
and judging the type of the played video according to the text information.
In some embodiments, obtaining the control instruction set includes:
acquiring a first action set of the household appliance in a first time period;
acquiring the control instruction set according to the first action set;
the first action set comprises actions corresponding to the home appliance executing the control command.
In some embodiments, obtaining the control instruction set according to the first action set includes:
and acquiring an alternative action set according to the first action set, and acquiring the control instruction set according to the alternative action set.
In some embodiments, obtaining a set of alternative actions from the first set of actions comprises:
the method comprises the steps of obtaining the support degree of each action in a first action set, screening out the actions smaller than a preset first support degree threshold value in the first action set, and storing the actions into a third action set; determining the third set of actions as the set of alternative actions; or the like, or, alternatively,
acquiring a third action set according to the frequency of actions in the first action set; acquiring a second action set of the household appliance in more than two second time periods, and acquiring a fourth action set according to the frequency of actions in the second action set; acquiring actions corresponding to the control instruction executed by the household appliance in more than two second time periods and storing the actions into a second action set; acquiring the support degree of each action in the first action set and the second action set; screening out actions smaller than a preset first support degree threshold value from the first action set and storing the actions into a third action set; screening out actions smaller than a preset second support degree threshold value from the second action set and storing the actions into a fourth action set; taking intersection of the third action set and the fourth action set to obtain an alternative action set;
the second action set, the third action set, the fourth action set and the alternative action set respectively comprise actions corresponding to the home appliance device executing the control instruction; the support degree is the proportion of the frequency of each action in the action set to the frequency of all actions in the set.
In some embodiments, obtaining the control instruction set from the set of alternative actions includes:
and acquiring a first confidence coefficient among the actions in the alternative action set, and acquiring the control instruction set according to the first confidence coefficient.
In some embodiments, determining a recommended scene rule based on the category of the played video and the set of control instructions comprises:
acquiring a second confidence coefficient between the category of the played video and the control instruction in the control instruction set;
and generating a recommended scene rule according to the second confidence degree.
In some embodiments, the apparatus for controlling an electric home appliance includes a processor and a memory storing program instructions, and the processor is configured to execute the method for controlling an electric home appliance when executing the program instructions.
In some embodiments, the equipment includes the above-described device for controlling the household appliance.
The method, the device and the equipment for controlling the household appliance provided by the embodiments of the disclosure can achieve the following technical effects: by acquiring the category of the played video and the control instruction set, determining a recommended scene rule from the category and the control instruction set, and controlling the household appliances according to the recommended scene rule, the household appliances can be linked while the user is playing a video, improving the user's video-watching experience.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example in the accompanying drawings, which are not limiting; in the drawings, elements having the same reference numerals denote like elements, wherein:
fig. 1 is a schematic diagram of a method for controlling an electric home appliance according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a method for obtaining a category of a screen-projected video according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a method for obtaining recommended scene rules according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an apparatus for controlling a home device according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram of a system for controlling an electrical home device according to an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may be practiced without these details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
The terms "first," "second," and the like in the description, claims and drawings of the embodiments of the present disclosure are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that data so used may be interchanged under appropriate circumstances so that the embodiments of the present disclosure described herein can be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprising" and "having," as well as any variations thereof, are intended to cover non-exclusive inclusions.
The term "plurality" means two or more unless otherwise specified.
In the embodiment of the present disclosure, the character "/" indicates that the preceding and following objects are in an or relationship. For example, A/B represents: a or B.
The term "and/or" is an associative relationship that describes objects, meaning that three relationships may exist. For example, a and/or B, represents: a or B, or A and B.
With reference to fig. 1, an embodiment of the present disclosure provides a method for controlling an electrical home appliance, including:
step S101, acquiring the type of a played video and acquiring a control instruction set;
step S102, determining a recommended scene rule according to the category of the played video and a control instruction set;
and step S103, controlling the household appliance according to the recommended scene rule.
The control instruction set comprises control instructions corresponding to the household electrical appliance; the recommended scene rule includes a corresponding relation between the category of the played video and the control instruction.
By obtaining the category of the played video and the control instruction set, determining a recommended scene rule according to the category of the played video and the control instruction set, and controlling the household appliances according to the recommended scene rule, linkage between household appliances can be performed while the user is playing a video, improving the user's video-watching experience.
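The three steps S101 to S103 can be sketched as a minimal control loop. All function and rule names below are illustrative assumptions; the patent does not specify an API:

```python
# Minimal sketch of the three-step method (S101-S103); every name here
# is hypothetical, chosen only to mirror the patent's step structure.

def get_video_category(played_video):
    # S101a: in practice derived from key frames + OCR (described later)
    return played_video.get("category", "unknown")

def get_control_instruction_set(action_log):
    # S101b: in practice mined from the appliances' action history
    return {entry["instruction"] for entry in action_log}

def control_appliances(category, instruction_set, scene_rules):
    # S102 + S103: look up the recommended scene rule for this category
    # and return only instructions the appliances actually support
    recommended = scene_rules.get(category, [])
    return [i for i in recommended if i in instruction_set]

scene_rules = {"horror": ["turn on green light", "close curtain"]}
video = {"category": "horror"}
log = [{"instruction": "turn on green light"},
       {"instruction": "close curtain"}]
print(control_appliances(get_video_category(video),
                         get_control_instruction_set(log), scene_rules))
```

The rule base is a plain category-to-instructions mapping here; the patent's scene rule base plays the same role.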
Optionally, the obtaining of the category of the played video includes: acquiring a key frame of a played video; and acquiring the category of the played video according to the key frame.
Optionally, the key frames are extracted from the played video using the video data set VideoNet. In some embodiments, shot segmentation is performed on the played video based on the VideoNet data set, and 5 key frames are extracted from each shot segmentation unit according to sharpness.
Optionally, the obtaining the category of the played video according to the key frame includes: acquiring character information in the key frame; and judging the type of the played video according to the text information.
In some embodiments, the key frame is processed by an OCR (Optical Character Recognition) algorithm to obtain the text information in the key frame; when text information is detected, keywords in the text information are extracted, and the video category corresponding to the keywords is matched in a preset video database; the preset video database stores the correspondence between keywords and video categories.
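The keyword-matching step can be sketched as follows. The keyword database and the OCR output are invented examples; a real system would obtain the text from an OCR engine and a much larger database:

```python
# Sketch of matching OCR'd key-frame text against a preset
# keyword -> video-category database. All entries are hypothetical.

VIDEO_KEYWORD_DB = {
    "ghost": "horror",
    "haunted": "horror",
    "laugh": "comedy",
}

def classify_from_text(ocr_text):
    """Return the category of the first keyword found in the text,
    or None when no keyword matches (category not detected)."""
    for word in ocr_text.lower().split():
        if word in VIDEO_KEYWORD_DB:
            return VIDEO_KEYWORD_DB[word]
    return None

print(classify_from_text("The Haunted House"))
```

A `None` result corresponds to the "category not detected" branch of the flow in FIG. 2.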
Optionally, a control instruction set is obtained, comprising: acquiring a first action set of the household appliance in a first time period; acquiring a control instruction set according to the first action set; the first action set comprises actions corresponding to the execution of the control command by the household appliance.
Optionally, the actions corresponding to the control instructions executed by all the household appliances in a first time period are acquired through an intelligent scene recommendation module to obtain the first action set; the relevance among the actions corresponding to the control instructions executed by all the household appliances is acquired according to the first action set, and the control instruction set is acquired according to the relevance. Optionally, the relevance is that an action corresponding to a control instruction executed by one household appliance is a precondition for another household appliance to execute an action corresponding to a control instruction.
Optionally, obtaining a set of control instructions according to the first set of actions includes: and acquiring an alternative action set according to the first action set, and acquiring a control instruction set according to the alternative action set.
Optionally, obtaining the alternative action set according to the first action set includes: the method comprises the steps of obtaining the support degree of each action in a first action set, screening out the actions smaller than a preset first support degree threshold value in the first action set, and storing the actions into a third action set; determining the third action set as an alternative action set; or, acquiring a third action set according to the frequency of the actions in the first action set; acquiring a second action set of the household appliance in more than two second time periods, and acquiring a fourth action set according to the frequency of actions in the second action set; acquiring actions corresponding to the control command executed by the household appliance in more than two second time periods and storing the actions into a second action set; acquiring the support degree of each action in the first action set and the second action set; screening out actions smaller than a preset first support degree threshold value from the first action set and storing the actions into a third action set; screening out actions smaller than a preset second support degree threshold value from the second action set and storing the actions into a fourth action set; taking intersection of the third action set and the fourth action set to obtain an alternative action set; the second action set, the third action set, the fourth action set and the alternative action set respectively comprise actions corresponding to the execution control instruction of the household appliance device; the support degree is the proportion of the frequency of each action in the action set to the frequency of all actions in the set.
Optionally, the support degree of each action in the first action set is a proportion of the frequency of each action in all actions in the first action set.
Optionally, the actions in the first action set are sorted from high to low according to their frequency to obtain an FP-tree of the actions; the support degree of each action in the FP-tree is obtained, actions whose support is smaller than the first support threshold S1 are screened out, and the minimum-support FP(x) tree of the remaining actions, that is, the third action set, is obtained.
Optionally, the support is a percentage of the frequency of each action to the frequency of all actions in the FP-tree.
Optionally, the action sets corresponding to the control instructions executed by the household appliances in more than two second time periods are selected to obtain the second action sets. Optionally, each second action set includes actions corresponding to one or more control instructions. In some embodiments, 19:00 to 19:20 on three consecutive days is selected as the second time period: the second action set for 19:00 to 19:20 on day 1 comprises opening the curtain 5 times, turning on the air conditioner 1 time, turning on the lamp 10 times and turning on the television 4 times; the second action set for day 2 comprises opening the curtain 4 times, turning on the lamp 12 times and turning off the lamp 4 times; and the second action set for day 3 comprises opening the curtain 5 times, turning on the air conditioner 1 time, turning on the lamp 10 times and turning on the water heater 4 times. The support degree of each action in the second action set of each second time period is calculated, and actions whose support is smaller than the second support threshold S2 are screened out. For example, the support degrees of opening the curtain are 5/20, 4/20 and 5/20 respectively, all exceeding the second support threshold S2 = 3/20, so the action of opening the curtain is stored into the fourth action set; the support degrees of turning on the water heater are 0/20, 0/20 and 4/20 respectively, satisfying the second support threshold S2 = 3/20 on only one day, so the water heater action is not regular and is deleted. Optionally, the intersection of the third action set and the fourth action set is taken to obtain the alternative action set.
In this way, by acquiring second action sets of the household appliances over more than two second time periods, screening according to the second support threshold and deleting actions corresponding to control instructions that do not meet it, the actions in the alternative action set become more regular, so that the selected actions represent the user's behaviour habits in a given time period.
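The worked example above can be reproduced directly. This sketch models the per-period screening and the deletion of irregular actions as a per-day threshold test followed by a set intersection (an interpretation of the described procedure, not code from the patent):

```python
from fractions import Fraction

# Per-day supports over three second time periods, second support
# threshold S2 = 3/20; keep only actions meeting the threshold on
# every day. Counts are those of the worked example in the text.

daily_actions = [
    {"open curtain": 5, "air conditioner": 1, "lamp on": 10, "tv on": 4},
    {"open curtain": 4, "lamp on": 12, "lamp off": 4},
    {"open curtain": 5, "air conditioner": 1, "lamp on": 10,
     "water heater": 4},
]
S2 = Fraction(3, 20)

def regular_actions(days, threshold):
    """Actions whose support meets the threshold in every period."""
    kept = None
    for counts in days:
        total = sum(counts.values())
        passing = {a for a, freq in counts.items()
                   if Fraction(freq, total) >= threshold}
        kept = passing if kept is None else kept & passing
    return kept

print(regular_actions(daily_actions, S2))
```

"Open curtain" survives every day (5/20, 4/20, 5/20 ≥ 3/20) while the water heater passes only on day 3, matching the deletion in the text.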
Optionally, the support degree S is obtained by calculating

S = Q / P

where Q is the frequency with which the household appliance executes the action corresponding to the control instruction, and P is the frequency with which all the household appliances execute the actions corresponding to their control instructions.
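The formula S = Q / P applied to a whole first action set gives each action's share of all recorded actions. The action names and counts below are illustrative:

```python
# Direct implementation of S = Q / P: each action's support is its
# own frequency Q over the total frequency P of all actions.

def supports(action_freqs):
    total = sum(action_freqs.values())  # P
    return {action: q / total for action, q in action_freqs.items()}

first_action_set = {"open curtain": 5, "lamp on": 10, "tv on": 5}
print(supports(first_action_set))
```

By construction the support values of one action set sum to 1.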
Optionally, obtaining a control instruction set according to the alternative action set includes: and acquiring a first confidence degree between the actions in the alternative action set, and acquiring a control instruction set according to the first confidence degree.
Optionally, the alternative action set is screened through the FP-growth algorithm using the first confidence, that is, the conditional probability confidence, together with the time difference. In some embodiments, the conditional probability confidence between an action A and an action B corresponding to control instructions executed by household appliances is the percentage of occurrences in the first action set in which action B is executed, directly or indirectly, after action A, out of the total number of occurrences of action A.
Optionally, the conditional probability confidence is obtained by calculating

confidence(A → B) = P(B|A) = P(AB) / P(A)

where P(AB) is the probability of action A and action B occurring together, P(A) is the probability of action A occurring, and P(B|A) is the probability of action B occurring given that action A has occurred. In some embodiments, the actions in the alternative action set are screened according to a conditional probability confidence threshold: pairs of actions with confidence greater than the threshold are selected, and then further screened according to the time difference between the actions corresponding to the control instructions executed by the household appliances to obtain the control instruction set. In some embodiments, a time difference of Δt minutes is set; if action D occurs within Δt minutes of action C, there is a relationship between action C and action D, and "execute control instruction C, then execute control instruction D" is stored in the control instruction set.
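The two screens (confidence threshold and Δt window) can be combined into one pass over a timestamped action log. This is an interpretive sketch, not the patent's algorithm; the log, threshold and window are invented:

```python
from collections import Counter

# Mine ordered action pairs (A, B) where B follows A within dt
# minutes, keeping pairs whose confidence count(A then B) / count(A)
# exceeds the threshold. Hypothetical log of (minute, action) tuples.

def mine_rules(log, conf_threshold, dt_minutes):
    """log must be sorted by time."""
    count_a = Counter(action for _, action in log)
    pair_counts = Counter()
    for i, (t1, a) in enumerate(log):
        seen = set()
        for t2, b in log[i + 1:]:
            if t2 - t1 > dt_minutes:
                break  # log is sorted, no later event can qualify
            if b != a and b not in seen:
                pair_counts[(a, b)] += 1
                seen.add(b)
    return {(a, b) for (a, b), n in pair_counts.items()
            if n / count_a[a] > conf_threshold}

log = [(0, "tv on"), (1, "lamp on"), (30, "tv on"), (32, "lamp on"),
       (60, "tv on"), (90, "lamp on")]
print(mine_rules(log, conf_threshold=0.5, dt_minutes=5))
```

"Lamp on" follows "tv on" within 5 minutes on 2 of 3 occasions (confidence 2/3), so the pair becomes a control instruction sequence; the reverse pair never qualifies.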
Optionally, determining a recommended scene rule according to the category of the played video and the control instruction set includes: acquiring a second confidence coefficient between the category of the played video and a control instruction in the control instruction set; and generating a recommended scene rule according to the second confidence degree.
Optionally, the second confidence is the ratio of the support of the household appliances executing the action corresponding to the control instruction to the support of the category of video played in the first time period. In some embodiments, the support of the appliances executing "turn on green light and close curtain" is 27.5% and the support of playing horror films in the first time period is 50%, so

second confidence = 27.5% / 50% = 55%

Optionally, when the second confidence is greater than 50%, the correspondence between the category of the played video and the control instruction set is determined to be a recommended scene rule, namely "horror film: turn on green light, close curtain". In some embodiments, a second confidence of 55% indicates that the household appliances execute the action "turn on green light, close curtain" corresponding to the control instruction, and 55% makes it suitable as a scene rule.
In some embodiments, when horror films are watched, for more than 90% of horror films the household appliances execute the control instruction corresponding to "turn on green light and close curtain"; when comedies are watched, for more than 95% of comedies the household appliances execute "open curtains, no lights during the daytime period, and yellow incandescent light during the night period". Optionally, the category of the played video and the control instruction are stored in the scene rule base in the form of rules. In some embodiments, a rule attribute such as "light" is an attribute relied upon by a video category such as "horror"; the played video has one or more categories; and each rule maps one video category to one control instruction, such as "horror film, turn on green light" and "horror film, close curtain".
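Generating rules from the second confidence can be sketched as a ratio test over per-instruction supports. The 27.5% / 50% figures mirror the worked example above; the third instruction is an invented negative case:

```python
# Second confidence = support(instruction) / support(video category);
# a (category, instruction) rule is recommended when it exceeds 50%.

def recommend_rules(category, category_support, instruction_supports,
                    threshold=0.5):
    rules = []
    for instruction, s in instruction_supports.items():
        second_confidence = s / category_support
        if second_confidence > threshold:
            rules.append((category, instruction))
    return rules

rules = recommend_rules(
    "horror", 0.50,
    {"turn on green light": 0.275,   # 0.275 / 0.50 = 55% -> kept
     "close curtain": 0.275,         # 55% -> kept
     "tv on": 0.10})                 # 20% -> rejected
print(rules)
```

Each kept pair corresponds to one stored rule of the scene rule base ("horror film, turn on green light", etc.).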
Controlling the household appliance according to the recommended scene rule, comprising: and controlling the household appliances used for playing the video and/or controlling other household appliances except the household appliances playing the video.
In some embodiments, if the recommended scene rule is "horror film, turn down the television brightness", the scene recommendation module controls the television according to the rule and lowers its brightness. In some embodiments, the recommended scene rule is "war film, increase the projection brightness", and the scene recommendation module controls the projector accordingly to increase the projection brightness. In some embodiments, the recommended scene rule is "war film, turn up the television volume", and the scene recommendation module increases the television volume accordingly.
With reference to fig. 2, a method for obtaining a category of a screen projection video provided by an embodiment of the present disclosure includes:
in step S201, the video identification system obtains a uniform resource locator URL of the screen-projected content, i.e., a source of the screen-projected content.
Step S202, the video identification system detects the video category according to the source of the screen projection content; in the case where the video category is detected, step S203 is executed; in the case where the video category is not detected, step S204 is performed.
Step S203, the detected video category is sent to an IoT (Internet of Things) cloud platform; the IoT cloud platform sends the control instruction corresponding to the video category to the household appliances and triggers them to execute the control instruction.
And step S204, extracting key frames of the video content under the condition that the screen projection content is judged to be the video.
And step S205, acquiring character information in the key frame through an OCR algorithm.
Step S206, obtaining video classification according to the character information in the key frame; then step S203 is performed.
In some embodiments, a user projects the screen of a mobile phone to a television. Optionally, when the video recognition system does not detect the video category, shot segmentation is performed on the screen-projected content based on the VideoNet data set, key frames are extracted from each shot segmentation unit according to sharpness, and whether the screen-projected content is a video is judged from the key frames. When the screen-projected content is a video, 5 key frames are extracted from it; optionally, the text information in the 5 key frames is acquired through the OCR algorithm, and the video is classified according to the acquired text information. When the video category is acquired, it is sent to the IoT cloud platform and stored. Optionally, the key frames are extracted by the multimedia processing tool ffmpeg. In some embodiments, when the screen-projected content is determined not to be a video, no further operation is performed. The VideoNet data set labels videos in three dimensions: category labels for the whole video in the event dimension, category and position boxes for shot key frames in the object dimension, and category labels for shot key frames in the scene dimension. Therefore, by labelling the screen-projected content in three dimensions through the VideoNet data set before recognition, the rich semantic associations among the dimensions are fully used to build a model that reflects the semantic relations among multi-dimensional content, improving video recognition accuracy and making the video category easy to obtain.
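Key-frame extraction with ffmpeg, as mentioned above, is typically done with its `select` filter keeping only I-frames (a common proxy for key frames). This sketch only builds the command line; the file paths are illustrative and whether I-frame selection matches the patent's sharpness-based selection is an assumption:

```python
# Build an ffmpeg command that extracts up to max_frames I-frames.
# The "\\," escapes the comma inside the filter expression.

def ffmpeg_keyframe_cmd(video_path, out_pattern, max_frames=5):
    return ["ffmpeg", "-i", video_path,
            "-vf", "select=eq(pict_type\\,I)",
            "-vsync", "vfr",
            "-frames:v", str(max_frames),
            out_pattern]

print(ffmpeg_keyframe_cmd("cast_video.mp4", "key_%02d.png"))
```

The resulting list can be passed to `subprocess.run` when ffmpeg is installed.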
Referring to fig. 3, a method for obtaining a recommended scene rule according to an embodiment of the present disclosure includes:
step S301, acquiring an action corresponding to the home appliance device executing the control instruction in the first time period, that is, a first action set.
Step S302, according to the attributes, conditions and actions in the scene model, data cleaning is performed on the actions corresponding to the control instructions, and the data are mapped into a rule list in the scene model data structure.
Step S303, storing the action corresponding to the control command executed by the home appliance to the FP tree structure, and counting the frequency of the action corresponding to the control command executed by the home appliance.
Step S304, a third action set and a fourth action set are obtained; optionally, a third action set is obtained according to the frequency of the actions in the first action set; and acquiring a second action set of the household appliance in more than two second time periods, and acquiring a fourth action set according to the frequency of actions in the second action set.
Step S305, taking intersection of the third action set and the fourth action set to obtain an alternative action set; and acquiring a first confidence coefficient between each action in the alternative action set through an FP-growth algorithm.
And S306, acquiring a control instruction set from the alternative action set according to the first confidence between the actions in the alternative action set.
Step S307, determining a recommended scene rule according to a second confidence coefficient between the played video category and the control instruction in the control instruction set.
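Taken together, steps S301 through S307 reduce to frequency filtering, set intersection, and confidence scoring. The sketch below shows that pipeline with plain counting in place of a real FP-tree/FP-growth implementation; action names, thresholds, and the session structure are illustrative assumptions.

```python
from collections import Counter

def frequent_actions(action_log, min_support):
    """Actions whose relative frequency meets min_support (cf. step S304)."""
    counts, total = Counter(action_log), len(action_log)
    return {a for a, c in counts.items() if c / total >= min_support}

def alternative_actions(first_log, second_log, min_support=0.3):
    """Intersection of the frequent sets from two observation windows (cf. S305)."""
    return frequent_actions(first_log, min_support) & frequent_actions(second_log, min_support)

def confidence(transactions, a, b):
    """confidence(a -> b) = P(b | a), over per-session action transactions."""
    with_a = [t for t in transactions if a in t]
    return sum(1 for t in with_a if b in t) / len(with_a) if with_a else 0.0

first = ["close_curtain", "dim_light", "close_curtain", "tv_on", "dim_light", "close_curtain"]
second = ["close_curtain", "dim_light", "ac_on", "close_curtain", "dim_light"]
candidates = alternative_actions(first, second)            # {'close_curtain', 'dim_light'}
sessions = [{"close_curtain", "dim_light"}, {"close_curtain"}, {"dim_light"}]
conf = confidence(sessions, "close_curtain", "dim_light")  # 0.5
```

An FP-growth implementation would build the FP-tree of step S303 to avoid rescanning the log, but yields the same support and confidence values on this toy data.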
With the method for obtaining a recommended scene rule provided by this embodiment, when a user decides which film to watch and how to run linked operations such as controlling the lights and the curtains, the user need not customize scenes in the intelligent scene recommendation module: the intelligent scene recommendation system of the disclosed embodiments recommends them, generating the scene model automatically from the user's habits. That is, without the user setting up any scene, and with intelligent scene recommendation enabled, scene rules for linked operation are generated automatically from the habits observed while the user controls the home appliances over a period of time.
In some embodiments, a user plays a horror film on the television, which triggers the intelligent scene recommendation module to acquire the actions corresponding to the control instructions of the home appliances within a period of time, i.e., a first action set. Data cleaning is performed on the actions corresponding to the executed control instructions according to the attributes, conditions and actions in the scene model, and the data is mapped into a rule list {rule attribute, rule condition, rule action} in the scene-model data structure. Optionally, the rule attribute includes "light", "temperature", and the like; the rule condition is the category of the played video; and the rule action is a control instruction for the home appliance.
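The mapping into the {rule attribute, rule condition, rule action} triple can be sketched as below; the field and value names are assumptions standing in for the patent's scene-model data structure.

```python
def to_rule(video_category, device_attribute, control_instruction):
    """Map one cleaned action record into the {rule attribute, rule condition,
    rule action} triple of the scene-model rule list."""
    return {
        "rule_attribute": device_attribute,   # e.g. "light", "temperature"
        "rule_condition": video_category,     # category of the played video
        "rule_action": control_instruction,   # appliance control instruction
    }

rule = to_rule("horror", "light", "turn_on_green")
```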
Optionally, the user projects audio, photos, videos and the like from an iOS device or an Apple Mac computer to an AirPlay-compatible device through the multi-screen interaction technology AirPlay, mapping content from the small screen to the large screen for display, wireless audio, picture sharing and similar operations. Optionally, the user interconnects wireless and wired networks that include personal computers, consumer appliances, and mobile devices through the DLNA (Digital Living Network Alliance) protocol, enabling unrestricted sharing and growth of digital media and content services.
OCR algorithms comprise image processing and character recognition. Image processing preprocesses the video key frame: graying, binarization, noise reduction, skew correction, character segmentation, and so on. In some embodiments, the key frame is first grayed, i.e., the color picture is converted into a grayscale picture; graying methods include the component method, the maximum-value method, the average method, or the weighted-average method. The key frame is then binarized: a threshold splits the key-frame data into two parts, the pixel group above the threshold and the pixel group below it, so the binarized key frame contains only pure white and pure black. Optionally, binarization methods include the bimodal method, the parametric method, the iterative method, or Otsu's method (OTSU). Optionally, the binarized key frame is denoised; denoising reduces the influence of the imaging device, external environmental noise, and the like introduced during digitization and transmission, making it easier to extract the character information in the key frame. Skew correction and character segmentation follow, where character segmentation splits text lines and individual characters to obtain the character information in the key frame.
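The graying and binarization steps can be illustrated without an imaging library. The sketch below uses the weighted-average graying method (ITU-R BT.601 weights) and a fixed threshold in place of an adaptive one such as Otsu's; the pixel values are illustrative.

```python
def to_gray(pixel):
    """Weighted-average graying of one (R, G, B) pixel (ITU-R BT.601 weights)."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

def binarize(gray_rows, threshold=128):
    """Split gray values into pure white (255) / pure black (0) around a threshold."""
    return [[255 if v > threshold else 0 for v in row] for row in gray_rows]

gray = [[to_gray(p) for p in row] for row in [[(250, 250, 250), (10, 10, 10)]]]
bw = binarize(gray)  # [[255, 0]]
```

Otsu's method would instead choose the threshold that maximizes between-class variance of the gray histogram, rather than using a fixed 128.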
After image processing is finished, character recognition is performed on the acquired character information: the characters in the key frame are recognized through the distinguishing features of different characters, keywords are extracted, and the characters in the keywords are then post-processed, i.e., optimized. In some embodiments, OCR confuses visually similar characters; if the extracted keyword differs from "Iron Man" by one look-alike character, post-processing corrects it to the keyword "Iron Man". In some embodiments, the keywords extracted by the OCR algorithm from the video key frames are "Iron Man", "Captain America", "The Avengers", and so on, and the video categories matched for these keywords from a preset video category database are "science fiction", "action", and so on, the preset video category database storing the correspondence between keywords and video categories. Acquiring the character information in the key frames through an OCR algorithm supports recognition of multiple languages with a high recognition rate, so the acquired video keywords are more accurate, the success rate of matching the correct video category is higher, and the user experience is improved.
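The keyword-to-category matching step can be sketched as a lookup; the database entries and category names below are illustrative assumptions standing in for the preset video category database.

```python
# Illustrative stand-in for the preset video-category database.
CATEGORY_DB = {
    "Iron Man": {"science fiction", "action"},
    "Captain America": {"science fiction", "action"},
    "The Conjuring": {"horror"},
}

def match_categories(keywords):
    """Union of the categories matched by each recognized keyword."""
    cats = set()
    for kw in keywords:
        cats |= CATEGORY_DB.get(kw, set())  # unknown keywords contribute nothing
    return cats

cats = match_categories(["Iron Man", "unrecognized text"])  # {'science fiction', 'action'}
```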
Optionally, the category of the played video is obtained by "search by image": key-point descriptors are obtained with the SIFT (Scale-Invariant Feature Transform) algorithm, similarity is computed from the Euclidean distance between descriptors, the video corresponding to the picture is found from the similarity, and the video's category is then obtained. Optionally, feature detection is performed on the video key frame with SIFT, comprising scale-space extremum detection, key-point localization, orientation assignment, and key-point descriptors. In some embodiments, scale invariance is obtained by constructing a scale space and detecting its extremum points; the key points are then filtered and precisely localized, orientations are assigned to them, and key-point descriptors are generated. The Euclidean distance between key-point descriptors is used as the similarity measure for key points in the key frames. Key-frame similarity is computed from the Euclidean distances of the key-point descriptors, the corresponding video is matched in a preset video database according to the similarity, and the video's category is obtained. The preset video database stores the direct correspondence between similarity and video, and each video in it carries its category. Obtaining the video category by search-by-image with the SIFT algorithm allows fast, accurate matching in the feature database, so the user obtains the video category more conveniently and quickly, improving the user experience.
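The similarity computation over descriptors reduces to a Euclidean distance and a nearest-neighbor lookup. The sketch below uses short toy vectors in place of real 128-dimensional SIFT descriptors; the database and its keys are assumptions.

```python
import math

def euclidean(d1, d2):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(d1, d2)))

def best_match(query, descriptor_db):
    """Key of the stored descriptor nearest to the query descriptor."""
    return min(descriptor_db, key=lambda k: euclidean(query, descriptor_db[k]))

db = {"clip_a": [0.0, 0.0, 0.0], "clip_b": [1.0, 1.0, 1.0]}
match = best_match([0.1, 0.0, 0.0], db)  # 'clip_a'
```

In practice the descriptors would come from a SIFT implementation (e.g. OpenCV's), and the database side would index many descriptors per video for fast matching.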
As shown in fig. 4, an apparatus for controlling a home appliance according to an embodiment of the present disclosure includes a processor (processor) 100 and a memory (memory) 101 storing program instructions. Optionally, the apparatus may also include a communication interface (Communication Interface) 102 and a bus 103. The processor 100, the communication interface 102, and the memory 101 may communicate with each other via the bus 103. The communication interface 102 may be used for information transfer. The processor 100 may call the program instructions in the memory 101 to perform the method for controlling the home appliance of the above-described embodiments.
Further, the program instructions in the memory 101 may be implemented in the form of software functional units and stored in a computer readable storage medium when sold or used as a stand-alone product.
The memory 101, which is a computer-readable storage medium, may be used for storing software programs, computer-executable programs, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 100 executes functional applications and data processing by executing program instructions/modules stored in the memory 101, that is, implements the method for controlling the home appliance in the above-described embodiments.
The memory 101 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal device, and the like. In addition, the memory 101 may include a high-speed random access memory, and may also include a nonvolatile memory.
According to the device for controlling the household appliances, the category of the played video is acquired, the control instruction set is acquired, the recommended scene rule is determined according to the category of the played video and the control instruction set, and the household appliances are controlled according to the recommended scene rule, so that the household appliances can be linked when the video is played by a user, and the user experience in watching the video is improved.
The device provided by the embodiment of the disclosure comprises the device for controlling the household appliance.
Optionally, the device is an intelligent household appliance with a display screen, such as an intelligent television or an intelligent refrigerator with a display screen.
Optionally, the device is a mobile terminal such as a smartphone or a tablet.
Optionally, the device is a server or an intelligent gateway.
The equipment provided by the embodiments of the disclosure acquires the category of the played video and a control instruction set, determines the recommended scene rule according to them, and controls the home appliances according to that rule, so that the home appliances operate in linkage while the user plays a video, improving the user's viewing experience.
As shown in fig. 5, in some embodiments, the present disclosure provides a system for controlling home appliances, comprising: an IOT cloud platform 1, a router 2, a smart TV 3, a curtain 4, and a lamp 5. Optionally, the IOT cloud platform 1 acquires the category of the content displayed on the smart TV 3 through the router 2; the IOT cloud platform 1 then sends the home-appliance control instructions corresponding to that category to the curtain 4 and the lamp 5 through the router 2, triggering the curtain 4 and the lamp 5 to execute the corresponding control instructions.
Optionally, the IOT cloud platform 1 sends the home-appliance control command corresponding to the category of the displayed content to other home appliances through the router 2, for example a smart speaker or a smart air conditioner.
In some embodiments, a user connects home appliances such as a television, curtains, and lighting to Wi-Fi through a mobile phone and binds them to the same IOT cloud platform; optionally, each appliance is a smart device with a Wi-Fi module. The user then projects a video from the phone to the television. The television judges whether the projected content is a video and, if so, dynamically detects the video category through automatic video recognition. For example, if the category is horror, the IOT cloud platform obtains the category "horror" through the router, looks up the corresponding home-appliance control instructions, namely closing the curtain and turning on the green light, and sends them to the corresponding appliances: the curtain is triggered to execute the close command, and the lamp to execute the green-light-on command.
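The cloud side of this scenario reduces to a category-to-commands lookup over the recommended scene rules; the rule table and command names in the sketch below are illustrative assumptions.

```python
# Illustrative recommended-scene rule table: video category -> appliance commands.
SCENE_RULES = {
    "horror": [("curtain", "close"), ("lamp", "green_on")],
    "romance": [("lamp", "warm_dim")],
}

def commands_for(category):
    """Commands the IOT cloud platform would push for a detected video category."""
    return SCENE_RULES.get(category, [])

cmds = commands_for("horror")  # [('curtain', 'close'), ('lamp', 'green_on')]
```

Each (device, command) pair would then be routed to the bound appliance through the router, as described above.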
The embodiment of the disclosure provides a computer-readable storage medium, which stores computer-executable instructions configured to execute the method for controlling the home appliance.
The disclosed embodiments provide a computer program product comprising a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the above-described method for controlling an electric home appliance.
The computer-readable storage medium described above may be a transitory computer-readable storage medium or a non-transitory computer-readable storage medium.
The technical solution of the embodiments of the present disclosure may be embodied as a software product stored in a storage medium and including one or more instructions that enable a computer device (which may be a personal computer, a server, or a network device) to execute all or some of the steps of the methods of the disclosed embodiments. The storage medium may be a non-transitory storage medium, including a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code, or it may be a transitory storage medium.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method or apparatus that includes the element. In this document, each embodiment may be described with emphasis on its differences from other embodiments, and the same and similar parts between embodiments may be referred to each other. For the methods, products, etc. disclosed in the embodiments, if they correspond to a method section disclosed herein, the description of the method section may be consulted for the relevant details.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It can be clearly understood by the skilled person that, for convenience and brevity of description, the specific working processes of the system, the apparatus and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units may be merely a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (10)

1. A method for controlling an appliance, comprising:
acquiring the category of a played video and acquiring a control instruction set;
determining a recommended scene rule according to the category of the played video and the control instruction set;
controlling the household appliance according to the recommended scene rule;
the control instruction set comprises control instructions corresponding to the household electrical appliance; the recommended scene rule comprises the corresponding relation between the category of the played video and the control instruction.
2. The method of claim 1, wherein the obtaining of the category of the played video comprises:
acquiring a key frame of the played video;
and acquiring the category of the played video according to the key frame.
3. The method of claim 2, wherein obtaining the category of the played video according to the key frame comprises:
acquiring character information in the key frame;
and determining the category of the played video according to the character information.
4. The method of claim 1, wherein acquiring the control instruction set comprises:
acquiring a first action set of the household appliance in a first time period;
acquiring the control instruction set according to the first action set;
the first action set comprises actions corresponding to the home appliance device executing the control instruction.
5. The method of claim 4, wherein acquiring the control instruction set according to the first action set comprises:
and acquiring an alternative action set according to the first action set, and acquiring the control instruction set according to the alternative action set.
6. The method of claim 5, wherein acquiring the alternative action set according to the first action set comprises:
the method comprises the steps of obtaining the support degree of each action in a first action set, screening out the actions smaller than a preset first support degree threshold value in the first action set, and storing the actions into a third action set; determining the third set of actions as the set of alternative actions; or the like, or, alternatively,
acquiring a third action set according to the frequency of actions in the first action set; acquiring a second action set of the household appliance in more than two second time periods, and acquiring a fourth action set according to the frequency of actions in the second action set; acquiring actions corresponding to the control instruction executed by the household appliance in more than two second time periods and storing the actions into a second action set; acquiring the support degree of each action in the first action set and the second action set; screening out actions smaller than a preset first support degree threshold value from the first action set and storing the actions into a third action set; screening out actions smaller than a preset second support degree threshold value from the second action set and storing the actions into a fourth action set; taking intersection of the third action set and the fourth action set to obtain an alternative action set;
the second action set, the third action set, the fourth action set and the alternative action set respectively comprise actions corresponding to the home appliance device executing the control instruction; the support degree is the proportion of the frequency of each action in the action set to the frequency of all actions in the set.
7. The method of claim 5, wherein acquiring the control instruction set according to the alternative action set comprises:
and acquiring a first confidence coefficient among the actions in the alternative action set, and acquiring the control instruction set according to the first confidence coefficient.
8. The method of any one of claims 1 to 7, wherein determining recommended scene rules according to the category of the played video and the control instruction set comprises:
acquiring a second confidence coefficient between the category of the played video and the control instruction in the control instruction set;
and generating a recommended scene rule according to the second confidence degree.
9. An apparatus for controlling an appliance comprising a processor and a memory storing program instructions, wherein the processor is configured to perform the method for controlling an appliance of any of claims 1 to 8 when executing the program instructions.
10. A device comprising means for controlling an electric household appliance according to claim 9.
CN202010904880.0A 2020-09-01 2020-09-01 Method, device and equipment for controlling household appliance Active CN112000024B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010904880.0A CN112000024B (en) 2020-09-01 2020-09-01 Method, device and equipment for controlling household appliance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010904880.0A CN112000024B (en) 2020-09-01 2020-09-01 Method, device and equipment for controlling household appliance

Publications (2)

Publication Number Publication Date
CN112000024A true CN112000024A (en) 2020-11-27
CN112000024B CN112000024B (en) 2022-08-05

Family

ID=73464970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010904880.0A Active CN112000024B (en) 2020-09-01 2020-09-01 Method, device and equipment for controlling household appliance

Country Status (1)

Country Link
CN (1) CN112000024B (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103747292A (en) * 2014-01-10 2014-04-23 北京酷云互动科技有限公司 Television program-associated application program recommending method and recommending device
CN105072483A (en) * 2015-08-28 2015-11-18 深圳创维-Rgb电子有限公司 Smart home equipment interaction method and system based on smart television video scene
US20170201791A1 (en) * 2015-08-28 2017-07-13 Shenzhen Skyworth-Rgb Electronic Co., Ltd Interactive method on intelligent home appliance based on smart tv video scenes and the system thereof
CN105955045A (en) * 2016-05-31 2016-09-21 微鲸科技有限公司 Intelligent film-watching scene implementation system and method
US20190198019A1 (en) * 2017-12-26 2019-06-27 Baidu Online Network Technology (Beijing) Co., Ltd Method, apparatus, device, and storage medium for voice interaction
CN109669358A (en) * 2018-11-06 2019-04-23 闽江学院 A kind of curtain control method and device
CN110289078A (en) * 2019-06-28 2019-09-27 青岛海尔科技有限公司 A kind of recipe recommendation method and device based on wisdom domestic operation system
CN111158886A (en) * 2019-12-31 2020-05-15 青岛海尔科技有限公司 Method and device for optimizing task scheduling of operating system and intelligent equipment

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113009839A (en) * 2021-02-18 2021-06-22 青岛海尔科技有限公司 Scene recommendation method and device, storage medium and electronic equipment
CN113009839B (en) * 2021-02-18 2023-07-21 青岛海尔科技有限公司 Scene recommendation method and device, storage medium and electronic equipment
WO2023159821A1 (en) * 2022-02-23 2023-08-31 青岛海尔科技有限公司 Method and device for determining operational behavior, storage medium, and electronic device
CN116033209A (en) * 2022-08-29 2023-04-28 荣耀终端有限公司 Screen projection method and electronic equipment
CN116033209B (en) * 2022-08-29 2023-10-20 荣耀终端有限公司 Screen projection method and electronic equipment

Also Published As

Publication number Publication date
CN112000024B (en) 2022-08-05

Similar Documents

Publication Publication Date Title
CN112000024B (en) Method, device and equipment for controlling household appliance
CN110119711B (en) Method and device for acquiring character segments of video data and electronic equipment
JP5934653B2 (en) Image classification device, image classification method, program, recording medium, integrated circuit, model creation device
US10621755B1 (en) Image file compression using dummy data for non-salient portions of images
US8848985B2 (en) Face-image registration device, face-image registration method, face-image registration program, and storage medium
WO2018028583A1 (en) Subtitle extraction method and device, and storage medium
US9247106B2 (en) Color correction based on multiple images
CN110633669B (en) Mobile terminal face attribute identification method based on deep learning in home environment
US20170339340A1 (en) Device, system and method for cognitive image capture
US10853407B2 (en) Correlating image annotations with foreground features
US9749710B2 (en) Video analysis system
US20150169978A1 (en) Selection of representative images
CN108154086B (en) Image extraction method and device and electronic equipment
CN111274442B (en) Method for determining video tag, server and storage medium
CN108959462B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN110856037B (en) Video cover determination method and device, electronic equipment and readable storage medium
CN109271552B (en) Method and device for retrieving video through picture, electronic equipment and storage medium
US20220172476A1 (en) Video similarity detection method, apparatus, and device
CN112581355A (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN113591758A (en) Human behavior recognition model training method and device and computer equipment
CN112200844A (en) Method, device, electronic equipment and medium for generating image
US20170013309A1 (en) System and method for product placement
US9866894B2 (en) Method for annotating an object in a multimedia asset
KR20150101846A (en) Image classification service system based on a sketch user equipment, service equipment, service method based on sketch and computer readable medium having computer program recorded therefor
US11283945B2 (en) Image processing apparatus, image processing method, program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant