CN115599006B - Control method and device of intelligent equipment and intelligent equipment - Google Patents
Control method and device of intelligent equipment and intelligent equipment
- Publication number
- CN115599006B (application CN202110719627.2A)
- Authority
- CN
- China
- Prior art keywords
- user
- target
- data
- control
- orientation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/042—Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
- G05B19/0423—Input/output
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/25—Pc structure of the system
- G05B2219/25257—Microcontroller
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses a control method for an intelligent device, comprising: receiving orientation data and action data of a user; determining a target intelligent device according to the orientation data of the user and determining a control instruction according to the action data of the user; and sending the control instruction to the target intelligent device so that the target intelligent device executes the corresponding operation. The method not only ensures that the control instruction is determined correctly, but also avoids the situation in which the intelligent device to be controlled cannot be accurately identified when multiple intelligent devices are present. The application also discloses a control apparatus for an intelligent device, and an intelligent device.
Description
Technical Field
The application relates to the technical field of intelligent household appliances, in particular to a control method and device for intelligent equipment and the intelligent equipment.
Background
At present, interaction between users and intelligent devices is a research hotspot in the smart home field. Existing research mainly focuses on identifying, by analyzing human bodies, faces, gestures and the like in images, the smart home device that the user actually intends to control and the actual control instruction, thereby completing the interaction between the user and the smart home device.
However, in implementing the embodiments of the present disclosure, it is found that at least the following problems exist in the related art:
For example, a smart home usually contains multiple intelligent devices, such as a smart television, a smart refrigerator, a smart washing machine, a smart lamp, a smart microwave oven, a smart water heater and a smart air conditioner. In an actual scene in which several smart home devices can all be controlled by gestures, facial expressions or body movements, a gesture control instruction issued by the user may trigger responses from several intelligent devices at once, resulting in erroneous interactions.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, and is intended to neither identify key/critical elements nor delineate the scope of such embodiments, but is intended as a prelude to the more detailed description that follows.
The embodiments of the disclosure provide a control method and apparatus for an intelligent device, and an intelligent device, so as to solve the technical problem of adapting to diversified usage scenarios.
In some embodiments, the method comprises:
Receiving orientation data and action data of a user;
Determining target intelligent equipment according to the orientation data of the user and determining a control instruction according to the action data of the user; and
And sending the control instruction to the target intelligent equipment so as to enable the target intelligent equipment to execute corresponding operation.
Specifically, the determining the target intelligent device according to the orientation data of the user includes:
determining a target direction according to the orientation data of the user; and
And determining the target intelligent device in the target direction.
Specifically, the determining the target direction according to the orientation data of the user includes:
Receiving orientation data of a user, wherein the orientation data comprises at least two of a palm orientation direction vector, a human face orientation direction vector and a human chest front orientation direction vector; and
Calculating a direction vector of the target intelligent device by adopting a formula delta=aα+bβ+cγ, wherein α is a direction vector of the palm orientation of the user, β is a direction vector of the face orientation of the user, γ is a direction vector of the front orientation of the chest of the user, δ is a direction vector of the target intelligent device, a, b and c are weighting coefficients of the palm orientation direction vector, the face orientation direction vector and the front orientation direction vector of the chest of the user, a, b and c are real numbers greater than or equal to 0, and a+b+c=1.
Specifically, the determining the target intelligent device in the target direction includes:
receiving location data of a user;
Searching all intelligent devices in the target direction; and
And determining one intelligent device closest to the position of the user in all intelligent devices as a target intelligent device.
Specifically, the determining the control instruction according to the action data of the user includes:
Receiving action data of a user, wherein the action data comprises at least two of gesture action data, face action data and human limb action data;
if the corresponding instructions of the action data in the database are the same, determining the corresponding instructions as the control instructions; or alternatively
If the corresponding instructions of the action data in the database are different and the action data comprises gesture action data, determining the instruction corresponding to the gesture action data as the control instruction; otherwise, determining the instruction corresponding to the human body limb action data as the control instruction;
The database is a database associated with the intelligent equipment, and the database records basic information and basic instructions of all the intelligent equipment, wherein the instructions comprise instructions corresponding to gesture action data, instructions corresponding to face action data and instructions corresponding to human body limb action data.
Specifically, the determining the control instruction according to the action data of the user includes:
Transmitting the information of the target intelligent equipment and the control instruction to a human-computer interaction interface, wherein the information of the target intelligent equipment comprises an identification number and position information of the target intelligent equipment;
If the control center receives, through the man-machine interaction interface, information from the user indicating that both the information of the target intelligent device and the control instruction are correct, the orientation data, the action data, the information of the target intelligent device and the control instruction are stored, and the control instruction is sent to the target intelligent device. If the information received from the user indicates that only the information of the target intelligent device is incorrect and the control instruction is correct, a correction target intelligent device is determined according to the orientation data of the user, and the information of the correction target intelligent device together with the control instruction is sent to the man-machine interaction interface. If the information received from the user indicates that both the information of the target intelligent device and the control instruction are incorrect, a correction target intelligent device is determined according to the orientation data of the user, and information requesting the user to input a corrected control instruction is sent to the man-machine interaction interface. If the information received from the user indicates that only the control instruction is incorrect, information requesting the user to input a corrected control instruction is sent to the man-machine interaction interface.
Specifically, the determining the correction target intelligent device according to the orientation data of the user includes:
And adjusting the weight coefficients of the direction vector of the palm orientation, the direction vector of the face orientation and the direction vector of the front face orientation of the chest of the human body to obtain a correction target direction, and determining one intelligent device closest to the position of the user in all intelligent devices in the correction target direction as a correction target intelligent device.
In some embodiments, the apparatus comprises:
A receiving module configured to receive orientation data and motion data of a user;
A determining module configured to determine a target smart device according to the orientation data of the user and determine a control instruction according to the action data of the user;
the control module is configured to send the control instruction to the target intelligent device so as to enable the target intelligent device to execute corresponding operation;
and the storage module is configured to store the orientation data, the action data, the information of the target intelligent device and the control instruction.
In some embodiments, the apparatus comprises a processor and a memory storing program instructions, the processor being configured to, when executing the program instructions, perform a control method for a smart device as described above.
In some embodiments, the smart device comprises a control apparatus for a smart device as described above.
The control method and device for the intelligent equipment and the intelligent equipment provided by the embodiment of the disclosure can realize the following technical effects:
The target intelligent device and the control instruction are determined by identifying the orientation data and the action data of the user, which ensures the correct determination of the control instruction and avoids the situation in which the intelligent device to be controlled cannot be accurately identified when multiple intelligent devices are present.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
One or more embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like reference numerals refer to similar elements, and in which:
fig. 1 is a schematic flow chart of a control method for an intelligent device according to an embodiment of the disclosure;
FIG. 2 is a flow chart of another control method for a smart device provided by an embodiment of the present disclosure;
Fig. 3 is a schematic view of an actual scenario of a control method for an intelligent device according to an embodiment of the disclosure;
fig. 4 is a flowchart of another practical scenario of a control method for a smart device according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of a control device for a smart device according to an embodiment of the present disclosure;
Fig. 6 is a schematic diagram of another control device for an intelligent device according to an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and techniques of the disclosed embodiments can be understood in more detail, a more particular description of the embodiments of the disclosure, briefly summarized below, may be had by reference to the appended drawings, which are not intended to be limiting of the embodiments of the disclosure. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, one or more embodiments may still be practiced without these details. In other instances, well-known structures and devices may be shown simplified in order to simplify the drawing.
The terms first, second and the like in the description and in the claims of the embodiments of the disclosure and in the above-described figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe embodiments of the present disclosure. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion.
The term "plurality" means two or more, unless otherwise indicated.
In the embodiment of the present disclosure, the character "/" indicates that the front and rear objects are an or relationship. For example, A/B represents: a or B.
The term "and/or" is an associative relationship that describes an object, meaning that there may be three relationships. For example, a and/or B, represent: a or B, or, A and B.
A smart device herein refers to any device, appliance, or machine having computing processing capabilities.
Referring to fig. 1, an embodiment of the present disclosure provides a control method for an intelligent device, including:
S01, the control center receives the orientation data and the action data of the user;
s02, the control center determines target intelligent equipment according to the orientation data of the user and determines a control instruction according to the action data of the user; and
S03, the control center sends the control instruction to the target intelligent equipment so as to enable the target intelligent equipment to execute corresponding operation.
By adopting the control method for an intelligent device provided by the embodiments of the disclosure, the target intelligent device and the control instruction are determined by identifying the orientation data and the action data of the user. The method is therefore suitable for actual scenes in which more than one intelligent device exists, ensures the correct determination of the control instruction, and achieves accurate interaction between the user and the smart home device when multiple intelligent devices are present.
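To make the S01-S03 flow above concrete, the following is a minimal Python sketch of one control cycle. All function and parameter names are illustrative assumptions, not part of the patent text; the individual steps are elaborated in the sketches that follow.

```python
def control_center_step(orientation_data, action_data, user_pos, devices,
                        determine_direction, select_device, resolve_instruction,
                        send_instruction):
    """One control cycle: S01 data has been received; S02 determines the target
    device and instruction; S03 sends the instruction to that device."""
    direction = determine_direction(orientation_data)           # S02: target direction
    device_id = select_device(user_pos, direction, devices)     # S02: nearest device in that direction
    instruction = resolve_instruction(action_data)              # S02: control instruction
    if device_id is not None and instruction is not None:
        send_instruction(device_id, instruction)                # S03: dispatch to the target device
    return device_id, instruction
```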
Optionally, the determining, by the control center, the target smart device according to the orientation data of the user in step S02 includes:
S02a, the control center determines a target direction according to the orientation data of the user; and
S02b, the control center determines the target intelligent device in the target direction.
Optionally, the determining, by the control center in step S02a, the target direction according to the orientation data of the user includes:
the control center receives the orientation data of the user, wherein the orientation data comprises at least two of a palm orientation direction vector, a human face orientation direction vector and a human chest front orientation direction vector; and
The control center calculates the direction vector of the target intelligent device by adopting a formula delta=aα+bβ+cγ, wherein α is the direction vector of the palm direction of the user, β is the direction vector of the face direction of the user, γ is the direction vector of the front face direction of the chest of the user, δ is the direction vector of the target intelligent device, a, b and c are respectively the weight coefficients of the palm direction vector, the face direction vector and the front face direction vector of the chest of the human, a, b and c are real numbers greater than or equal to 0, and a+b+c=1.
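As a minimal illustration of the formula δ=aα+bβ+cγ above, the following Python sketch combines the three orientation vectors into a target direction. The function name, the NumPy dependency and the default weights (taken from the a=0.34, b=0.33, c=0.33 example later in the description) are assumptions for illustration only.

```python
import numpy as np

def target_direction(alpha, beta, gamma, a=0.34, b=0.33, c=0.33):
    """Weighted combination of the palm (alpha), face (beta) and chest (gamma)
    orientation vectors: delta = a*alpha + b*beta + c*gamma, normalized."""
    assert a >= 0 and b >= 0 and c >= 0 and abs(a + b + c - 1.0) < 1e-6
    delta = (a * np.asarray(alpha, float)
             + b * np.asarray(beta, float)
             + c * np.asarray(gamma, float))
    norm = np.linalg.norm(delta)
    return delta / norm if norm > 0 else delta

# Example: palm points along +x, face and chest are turned slightly toward +y.
delta = target_direction([1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.9, 0.1, 0.0])
```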
Optionally, determining, by the control center in the target direction in step S02b, the target smart device includes:
The control center receives the position data of the user;
the control center searches all intelligent devices in the target direction; and
The control center determines the intelligent device closest to the position of the user, among all intelligent devices in the target direction, as the target intelligent device. In an actual scene many intelligent devices exist, and several of them often lie in the same direction; since a user usually operates near the device to be controlled, by combining the target direction with the device closest to the user, the control center can effectively judge the user's actual intention and determine the smart home device actually meant.
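A hedged sketch of the selection step described above: among the intelligent devices lying in the target direction, pick the one closest to the user. The angular tolerance used to decide whether a device lies "in the target direction" is an assumption; the patent text does not fix how that test is performed.

```python
import numpy as np

def select_target_device(user_pos, delta, devices, max_angle_deg=15.0):
    """devices: iterable of (device_id, position); delta: target direction (unit vector).
    Returns the id of the closest device within max_angle_deg of delta, or None."""
    user_pos = np.asarray(user_pos, float)
    delta = np.asarray(delta, float)
    candidates = []
    for device_id, pos in devices:
        to_dev = np.asarray(pos, float) - user_pos
        dist = np.linalg.norm(to_dev)
        if dist == 0:
            continue
        cos_angle = float(np.dot(to_dev / dist, delta))
        angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
        if angle <= max_angle_deg:          # device lies "in the target direction"
            candidates.append((dist, device_id))
    return min(candidates, key=lambda item: item[0])[1] if candidates else None
```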
Optionally, the determining, by the control center in step S02, a control instruction according to the action data of the user includes:
The control center receives action data of a user, wherein the action data comprises at least two of gesture action data, face action data and human limb action data;
if the corresponding instructions of the action data in the database are the same, the control center determines the corresponding instruction as the actual instruction; or
if the corresponding instructions of the action data in the database are different and the action data comprises gesture action data, the control center determines the instruction corresponding to the gesture action data as the actual instruction; otherwise, the control center determines the instruction corresponding to the human body limb action data as the actual instruction;
The database is a database of the related intelligent devices, and basic information and basic instructions of all the intelligent devices are recorded in the database, wherein the instructions comprise instructions corresponding to gesture action data, instructions corresponding to face action data and instructions corresponding to human body limb action data.
Here, according to statistical results, most users are still accustomed to controlling smart home devices through gesture action data. Therefore, when the corresponding instructions of the action data in the database differ, the control center determines the instruction corresponding to the gesture action data as the actual instruction if gesture action data exists; when the action data does not include gesture action data, the control center determines the instruction corresponding to the human body limb action data as the actual instruction, since most users are then accustomed to expressing instructions through their limbs. Optionally, the human body limb action data includes human body upper limb action data and/or human body lower limb action data. When the corresponding instructions of the action data in the database differ and the action data includes human body upper limb action data but no gesture action data, the control center determines the instruction corresponding to the human body upper limb action data as the actual instruction; otherwise, the control center determines the instruction corresponding to the human body lower limb action data as the actual instruction.
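The priority rule described above can be sketched as follows. The modality names and the database lookup callable are hypothetical placeholders; the rule implemented is: use the common instruction if all modalities agree, otherwise prefer gesture data, then upper-limb data, then lower-limb data.

```python
def resolve_instruction(action_data, lookup):
    """action_data: dict keyed by modality ('gesture', 'face', 'upper_limb',
    'lower_limb') with raw action data as values; lookup(modality, data)
    returns the matching instruction from the database, or None."""
    instructions = {m: lookup(m, d) for m, d in action_data.items() if d is not None}
    found = [i for i in instructions.values() if i is not None]
    if found and len(set(found)) == 1:       # all modalities map to the same instruction
        return found[0]
    for modality in ("gesture", "upper_limb", "lower_limb"):
        if instructions.get(modality) is not None:   # fall back by priority
            return instructions[modality]
    return None
```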
Optionally, as shown in connection with fig. 2, an embodiment of the present disclosure provides another control method for an intelligent device, including:
S00, the intelligent devices and the control center are powered on;
S01, the control center receives the orientation data and the action data of the user;
s02, the control center determines target intelligent equipment according to the orientation data of the user and determines a control instruction according to the action data of the user;
s03, the control center sends the control instruction to the target intelligent equipment so as to enable the target intelligent equipment to execute corresponding operation;
s04, the control center sends information and control instructions of the target intelligent equipment to the man-machine interaction interface, wherein the information of the target intelligent equipment comprises identification numbers and position information of the target intelligent equipment;
S05, the control center judges whether the information of the target intelligent equipment and the control instruction are correct is received or not;
S051, if the control center receives, through the man-machine interaction interface, information from the user indicating that both the information of the target intelligent device and the control instruction are correct, the orientation data, the action data, the information of the target intelligent device and the control instruction are stored, and the control instruction is sent to the target intelligent device;
S052, if the control center receives, through the man-machine interaction interface, information from the user indicating that only the information of the target intelligent device is incorrect and the control instruction is correct, a correction target intelligent device is determined according to the orientation data of the user, and the information of the correction target intelligent device together with the control instruction is sent to the man-machine interaction interface;
S053, if the control center receives, through the man-machine interaction interface, information from the user indicating that both the information of the target intelligent device and the control instruction are incorrect, a correction target intelligent device is determined according to the orientation data of the user, and information requesting the user to input a corrected control instruction is sent to the man-machine interaction interface;
S054, if the control center receives, through the man-machine interaction interface, information from the user indicating that only the control instruction is incorrect, the control center sends information requesting the user to input a corrected control instruction to the man-machine interaction interface.
Here, if the same orientation data and action data are received again in the same scene, that is, a scene containing the same intelligent devices with the same position information, the control center can directly send the control instruction to the target intelligent device according to the previously stored orientation data, action data, information of the target intelligent device and control instruction, so that the target intelligent device performs the corresponding operation. The process of determining the target intelligent device and the control instruction in step S02 is thus skipped, which greatly improves the control efficiency. On the other hand, since determining the target intelligent device and the control instruction has a certain error rate, user confirmation is sought through the man-machine interaction interface. If the user confirms that the target intelligent device and the control instruction determined by the processor are correct, the control center stores the orientation data, the action data, the information of the target intelligent device and the control instruction, and can directly reuse them in the same scene to save computation in the future. If the user indicates that they are not completely correct, the control center corrects them and sends the result to the man-machine interaction interface again to seek the user's confirmation; this correction may be repeated several times until the user confirms that the target intelligent device and the control instruction are completely correct.
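A minimal sketch of the reuse step described above, assuming the control center keys stored results by scene and by the received orientation and action data; the class and key layout are illustrative assumptions, not part of the patent text.

```python
class ConfirmedMappingCache:
    """Stores user-confirmed (device, instruction) results keyed by scene and input."""

    def __init__(self):
        self._store = {}

    def remember(self, scene_id, orientation_key, action_key, device_id, instruction):
        self._store[(scene_id, orientation_key, action_key)] = (device_id, instruction)

    def recall(self, scene_id, orientation_key, action_key):
        # Returns (device_id, instruction) if this exact case was confirmed before, else None.
        return self._store.get((scene_id, orientation_key, action_key))
```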
Optionally, the determining, by the control center, the correction target intelligent device according to the orientation data of the user includes:
The control center adjusts the weight coefficients of the palm oriented direction vector, the face oriented direction vector and the human chest front oriented direction vector to obtain a correction target direction, and one intelligent device closest to the position of the user in all intelligent devices is determined to be the correction target intelligent device in the correction target direction.
Here, the control center may adjust the weighting coefficients of the palm orientation direction vector, the face orientation direction vector and the human chest front orientation direction vector according to the actual conditions. For example, when the actual scene includes only gesture data and face data, c=0 may be set initially with a and b both equal to 0.5; when the target direction needs to be corrected, each adjustment may increase a by 0.1 and decrease b by 0.1, or decrease a by 0.1 and increase b by 0.1. Alternatively, c=0 may be set initially with a=0.8 and b=0.2, because in most cases the user mainly expresses the actual requirement through gestures and the face action serves as an aid; when the target direction needs to be corrected, each adjustment may increase a by 0.05 and decrease b by 0.05, or decrease a by 0.05 and increase b by 0.05. Or, when the user's hand actions are inconvenient and the actual requirement is mainly expressed through face actions, a=0, b=0.8 and c=0.2 may be set initially; when the target direction needs to be corrected, each adjustment may increase b by 0.05 and decrease c by 0.05, or decrease b by 0.05 and increase c by 0.05.
For another example, when the actual scene includes only gesture data and human body data, b=0 may be set initially with a and c both equal to 0.5; when the target direction needs to be corrected, each adjustment may increase a by 0.1 and decrease c by 0.1, or decrease a by 0.1 and increase c by 0.1. Alternatively, b=0 may be set initially with a=0.8 and c=0.2, because in most cases the user mainly expresses the actual requirement through gestures and the human body action serves as an aid; when the target direction needs to be corrected, each adjustment may increase a by 0.05 and decrease c by 0.05, or decrease a by 0.05 and increase c by 0.05. Or, when the user's hand motions are inconvenient and the actual requirement is mainly expressed through human body motions, a=0.2 and c=0.8 may be set initially; when the target direction needs to be corrected, each adjustment may increase a by 0.05 and decrease c by 0.05, or decrease a by 0.05 and increase c by 0.05.
For another example, when the user's gesture actions cannot be expressed in the actual scene and the action data only includes face data and body data, a=0 may be set initially with b and c both equal to 0.5; when the target direction needs to be corrected, each adjustment may increase b by 0.1 and decrease c by 0.1, or decrease b by 0.1 and increase c by 0.1. Alternatively, a=0 may be set with b=0.8 and c=0.2 initially, since some users have rich facial actions and prefer to express the actual requirement through face actions with human body actions as an aid; when the target direction needs to be corrected, each adjustment may increase b by 0.1 and decrease c by 0.1, or decrease b by 0.1 and increase c by 0.1. Or, when the user's face actions are inconvenient and the actual requirement is mainly expressed through human body actions, b=0.2 and c=0.8 may be set initially; when the target direction needs to be corrected, each adjustment may increase b by 0.1 and decrease c by 0.1, or decrease b by 0.1 and increase c by 0.1. More specifically, when the user's action data includes gesture data, face data and body data, the three weighting coefficients may be set according to the actual situations of different users, for example initially with approximately equal weights a=0.34, b=0.33 and c=0.33; when the target direction needs to be corrected, a, b and c may each be increased or decreased accordingly at each adjustment. Setting different weighting coefficients suits different actual scenes, meets various operation requirements in the actual environment, and adapts to different user groups.
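The weight-correction procedure illustrated above amounts to shifting a fixed step between two of the active weights while keeping all weights non-negative and summing to 1. A minimal sketch, assuming the caller decides which weight to increase and which to decrease:

```python
def adjust_weights(a, b, c, increase, decrease, step=0.1):
    """Shift `step` of weight from `decrease` to `increase`; names are 'a', 'b' or 'c'."""
    weights = {"a": a, "b": b, "c": c}
    shift = min(step, weights[decrease])     # never drive a weight below 0
    weights[increase] += shift
    weights[decrease] -= shift
    assert abs(sum(weights.values()) - 1.0) < 1e-6
    return weights["a"], weights["b"], weights["c"]

# Example: gesture-and-face scene starting from a=0.5, b=0.5, c=0,
# corrected once toward the palm orientation vector.
a, b, c = adjust_weights(0.5, 0.5, 0.0, increase="a", decrease="b")
```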
Alternatively, the orientation data of the user, the motion data of the user, the position data of the user and the position data of the intelligent device may be collected by a collecting device, where the collecting device may be a device with at least one of a monocular RGB camera and a depth camera.
In practical applications (as shown in fig. 3 and fig. 4), the intelligent devices 13, 14, ..., 18 are six identical or different intelligent devices, each of which may be one of a smart television, a smart refrigerator, a smart air conditioner, a smart water heater, a smart microwave oven, a smart lamp, a smart washing machine, and the like.
After the user 12 enters the room, the collecting device 11 starts to collect the orientation data of the user, the action data of the user, the position data of the user, and the position information of the six intelligent devices 13, 14, ..., 18, and sends all the information to the computer 19. Of course, the collecting device 11 may first collect the position information of the six intelligent devices 13, 14, ..., 18 and then collect the orientation data, action data and position data of the user; alternatively, a plurality of collecting devices 11 may collect, simultaneously or not, the orientation data of the user, the action data of the user, the position data of the user and the position information of the six intelligent devices 13, 14, ..., 18.
M02, the computer 19 determines a target direction according to the received direction data of the user and a formula δ=aα+bβ+cγ, where the direction data includes a palm-facing direction vector, a face-facing direction vector, and a human chest front-facing direction vector, α is the palm-facing direction vector, β is the face-facing direction vector, γ is the human chest front-facing direction vector, δ is the target smart device direction vector, a, b, and c are weighting coefficients of the palm-facing direction vector, the face-facing direction vector, and the human chest front-facing direction vector, respectively, and a=0.34, b=0.33, and c=0.33 are set.
And M03, the computer 19 determines one intelligent device closest to the position of the user as a target intelligent device according to the position information of the user and the position information of the intelligent device in the target direction.
M04, the computer 19 matches the corresponding instructions in the associated database according to the action data. If the gesture action data, the face action data and the human body action data of the user 12 correspond to the same instruction in the database, the computer 19 determines that instruction as the control instruction; if the corresponding instructions of the action data in the database are different, the computer 19 determines the instruction corresponding to the gesture action data as the control instruction; of course, if no gesture data of the user is present, the computer 19 determines the instruction corresponding to the human body limb action data as the control instruction. Optionally, the human body limb action data may be further divided into human body upper limb action data and/or human body lower limb action data, each corresponding to its own instructions in the database; when human body upper limb action data exists, the computer 19 determines the instruction corresponding to the human body upper limb action data as the control instruction, and otherwise determines the instruction corresponding to the human body lower limb action data as the control instruction.
M05, the computer 19 sends the information of the target intelligent device and the control instruction to the man-machine interaction interface and waits for the user to confirm whether they are correct. If the user confirms that the target intelligent device and the control instruction are correct, the information and the control instruction of the target intelligent device are stored in a memory and can be directly reused in the same scene later. If the user indicates that they are not completely correct, the computer 19 corrects them and sends the corrected result to the man-machine interaction interface again to seek the user's confirmation, until the user confirms that the target intelligent device and the control instruction are completely correct.
As shown in conjunction with fig. 5, an embodiment of the present disclosure provides a control apparatus for an intelligent device, including a receiving module 21, a determining module 22, a control module 23 and a storage module 24.
A receiving module 21 configured to receive orientation data and motion data of a user;
a determining module 22 configured to determine a target smart device from the orientation data of the user and to determine a control instruction from the action data of the user;
A control module 23 configured to send a control instruction to the target smart device, so that the target smart device performs a corresponding operation;
The storage module 24 is configured to store orientation data, motion data, information of the target smart device, and control instructions.
By adopting the control apparatus for an intelligent device provided by the embodiments of the present disclosure, when there are multiple targets to be identified or multiple intelligent devices, the actual intelligent device and the actual instruction can be determined from the operation data obtained by the acquisition module, so that the user's intention is accurately identified and the actual intelligent device is accurately controlled.
As shown in connection with fig. 6, an embodiment of the present disclosure provides a control apparatus for a smart device, including a processor (processor) 100 and a memory (memory) 101. Optionally, the apparatus may further comprise a communication interface (Communication Interface) 102 and a bus 103. The processor 100, the communication interface 102, and the memory 101 may communicate with each other via the bus 103. The communication interface 102 may be used for information transfer. The processor 100 may call logic instructions in the memory 101 to perform the control method for the smart device of the above-described embodiments.
Further, the logic instructions in the memory 101 described above may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand alone product.
The memory 101 is a computer readable storage medium that can be used to store a software program, a computer executable program, such as program instructions/modules corresponding to the methods in the embodiments of the present disclosure. The processor 100 executes functional applications and data processing by running program instructions/modules stored in the memory 101, i.e. implements the control method for the smart device in the above-described embodiments.
The memory 101 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for a function; the storage data area may store data created according to the use of the terminal device, etc. Further, the memory 101 may include a high-speed random access memory, and may also include a nonvolatile memory.
The embodiment of the disclosure provides an intelligent device, which comprises the control device for the intelligent device.
The disclosed embodiments provide a computer-readable storage medium storing computer-executable instructions configured to perform the above-described control method for an intelligent device.
The disclosed embodiments provide a computer program product comprising a computer program stored on a computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, cause the computer to perform the control method for a smart device as described above.
The computer readable storage medium may be a transitory computer readable storage medium or a non-transitory computer readable storage medium.
Embodiments of the present disclosure may be embodied in a software product stored on a storage medium, including one or more instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described above in embodiments of the present disclosure. And the aforementioned storage medium may be a non-transitory storage medium including: a plurality of media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or a transitory storage medium.
The above description and the drawings illustrate embodiments of the disclosure sufficiently to enable those skilled in the art to practice them. Other embodiments may involve structural, logical, electrical, process, and other changes. The embodiments represent only possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in, or substituted for, those of others. Moreover, the terminology used in the present application is for the purpose of describing embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" (the) are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this disclosure is meant to encompass any and all possible combinations of one or more of the associated listed. Furthermore, when used in the present disclosure, the terms "comprises," "comprising," and/or variations thereof, mean that the recited features, integers, steps, operations, elements, and/or components are present, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising one …" does not exclude the presence of other like elements in a process, method or apparatus that includes such elements. In this context, each embodiment may be described with emphasis on the differences from the other embodiments, and the same similar parts between the various embodiments may be referred to each other. For the methods, products, etc. disclosed in the embodiments, if they correspond to the method sections disclosed in the embodiments, the description of the method sections may be referred to for relevance.
Those of skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. The skilled person may use different methods for each particular application to achieve the described functionality, but such implementation should not be considered to be beyond the scope of the embodiments of the present disclosure. It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system, apparatus and unit may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the embodiments disclosed herein, the disclosed methods, articles of manufacture (including but not limited to devices, apparatuses, etc.) may be practiced in other ways. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the above units may be merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form. The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to implement the present embodiment. In addition, each functional unit in the embodiments of the present disclosure may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than that disclosed in the description, and sometimes no specific order exists between different operations or steps. For example, two consecutive operations or steps may actually be performed substantially in parallel, they may sometimes be performed in reverse order, which may be dependent on the functions involved. Each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Claims (8)
1. A control method for an intelligent device, comprising:
Receiving orientation data and action data of a user;
Determining target intelligent equipment according to the orientation data of the user and determining a control instruction according to the action data of the user; and
The control instruction is sent to the target intelligent equipment so that the target intelligent equipment executes corresponding operation;
the determining the target intelligent device according to the orientation data of the user comprises: determining a target direction according to the orientation data of the user; and determining a target intelligent device in the target direction;
The determining a target direction according to the orientation data of the user comprises: receiving orientation data of a user, wherein the orientation data comprises at least two of a palm orientation direction vector, a human face orientation direction vector and a human chest front orientation direction vector; and calculating a direction vector of the target smart device by adopting the formula δ=aα+bβ+cγ, wherein α is the direction vector of the palm orientation of the user, β is the direction vector of the face orientation of the user, γ is the direction vector of the front orientation of the chest of the user, δ is the direction vector of the target smart device, and a, b and c are respectively weighting coefficients of the palm orientation direction vector, the face orientation direction vector and the human chest front orientation direction vector, wherein a, b and c are real numbers greater than or equal to 0, and a+b+c=1.
2. The control method according to claim 1, wherein the determining a target smart device in the target direction includes:
receiving location data of a user;
Searching all intelligent devices in the target direction; and
And determining one intelligent device closest to the position of the user in all intelligent devices as a target intelligent device.
3. The control method according to claim 2, wherein the determining a control instruction according to the action data of the user includes:
Receiving action data of a user, wherein the action data comprises at least two of gesture action data, face action data and human limb action data;
if the corresponding instructions of the action data in the database are the same, determining the corresponding instructions as the control instructions; or alternatively
If the corresponding instructions of the motion data in the database are different and the gesture motion data are included in the motion data, determining the instruction corresponding to the gesture motion data as the control instruction; otherwise, determining an instruction corresponding to the human body limb action data as a control instruction;
The database is a database associated with the intelligent equipment, and the database records basic information and basic instructions of all the intelligent equipment, wherein the instructions comprise instructions corresponding to gesture action data, instructions corresponding to face action data and instructions corresponding to human body limb action data.
4. A control method according to claim 3, wherein said determining a control instruction according to said user's motion data further comprises:
Transmitting the information of the target intelligent equipment and the control instruction to a human-computer interaction interface, wherein the information of the target intelligent equipment comprises an identification number and position information of the target intelligent equipment;
If the control center receives that the information sent by the user through the man-machine interaction interface is that the information and the control instruction of the target intelligent equipment are correct, storing orientation data, action data and the information and the control instruction of the target intelligent equipment, and sending the control instruction to the target intelligent equipment; if the control center receives that the information sent by the user through the man-machine interaction interface only comprises incorrect information of the target intelligent device and not comprises incorrect control instructions, the information and the control instructions of the target intelligent device are sent to the man-machine interaction interface after the target intelligent device is corrected according to the orientation data of the user; if the control center receives that the information sent by the user through the man-machine interaction interface comprises incorrect information of the target intelligent device or incorrect control instructions, the target intelligent device is determined to be corrected according to the orientation data of the user, and the information requesting the user to input the correction control instructions is sent to the man-machine interaction interface; if the control center receives that the information sent by the user through the man-machine interaction interface only comprises incorrect control instructions, the control center sends information requesting the user to input correction control instructions to the man-machine interaction interface.
5. The control method according to claim 4, wherein the determining a correction target smart device according to the orientation data of the user includes:
And adjusting the weight coefficients of the direction vector of the palm orientation, the direction vector of the face orientation and the direction vector of the front face orientation of the chest of the human body to obtain a correction target direction, and determining one intelligent device closest to the position of the user in all intelligent devices in the correction target direction as a correction target intelligent device.
6. A control device for an intelligent device, comprising:
A receiving module configured to receive orientation data and motion data of a user;
A determining module configured to determine a target smart device according to the orientation data of the user and determine a control instruction according to the action data of the user; the determining the target intelligent device according to the orientation data of the user comprises: determining a target direction according to the orientation data of the user; and determining a target intelligent device in the target direction; the determining a target direction according to the orientation data of the user comprises: receiving orientation data of a user, wherein the orientation data comprises at least two of a palm orientation direction vector, a human face orientation direction vector and a human chest front orientation direction vector; and calculating a direction vector of the target smart device by adopting the formula δ=aα+bβ+cγ, wherein α is the direction vector of the palm orientation of the user, β is the direction vector of the face orientation of the user, γ is the direction vector of the front orientation of the chest of the user, δ is the direction vector of the target smart device, and a, b and c are respectively weighting coefficients of the palm orientation direction vector, the face orientation direction vector and the human chest front orientation direction vector, wherein a, b and c are real numbers greater than or equal to 0, and a+b+c=1;
the control module is configured to send the control instruction to the target intelligent device so as to enable the target intelligent device to execute corresponding operation;
and the storage module is configured to store the orientation data, the action data, the information of the target intelligent device and the control instruction.
7. A control apparatus for a smart device comprising a processor and a memory storing program instructions, wherein the processor is configured to perform a control method for a smart device as claimed in any one of claims 1 to 5 when executing the program instructions.
8. A smart device comprising a control apparatus for a smart device as claimed in claim 6 or 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110719627.2A CN115599006B (en) | 2021-06-28 | 2021-06-28 | Control method and device of intelligent equipment and intelligent equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110719627.2A CN115599006B (en) | 2021-06-28 | 2021-06-28 | Control method and device of intelligent equipment and intelligent equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115599006A CN115599006A (en) | 2023-01-13 |
CN115599006B true CN115599006B (en) | 2024-06-25 |
Family
ID=84841190
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110719627.2A Active CN115599006B (en) | 2021-06-28 | 2021-06-28 | Control method and device of intelligent equipment and intelligent equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115599006B (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106775910A (en) * | 2016-11-22 | 2017-05-31 | 杭州赫智电子科技有限公司 | A kind of man-machine interaction method and system based on interactive device |
CN110535732A (en) * | 2019-07-29 | 2019-12-03 | 深圳绿米联创科技有限公司 | A kind of apparatus control method, device, electronic equipment and storage medium |
CN112130918A (en) * | 2020-09-25 | 2020-12-25 | 深圳市欧瑞博科技股份有限公司 | Intelligent device awakening method, device and system and intelligent device |
CN112417923A (en) * | 2019-08-20 | 2021-02-26 | 云丁网络技术(北京)有限公司 | System, method and apparatus for controlling smart devices |
CN112581664A (en) * | 2020-12-26 | 2021-03-30 | 深兰盛视科技(苏州)有限公司 | Control method and device based on intelligent identification, electronic equipment and storage medium |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4965767B2 (en) * | 2000-03-29 | 2012-07-04 | キヤノン株式会社 | Image processing apparatus and control method thereof |
US20120060127A1 (en) * | 2010-09-06 | 2012-03-08 | Multitouch Oy | Automatic orientation of items on a touch screen display utilizing hand direction |
US20120095575A1 (en) * | 2010-10-14 | 2012-04-19 | Cedes Safety & Automation Ag | Time of flight (tof) human machine interface (hmi) |
TWI562800B (en) * | 2014-07-01 | 2016-12-21 | Univ Shu Te | An automatic boot rescue device |
US9821470B2 (en) * | 2014-09-17 | 2017-11-21 | Brain Corporation | Apparatus and methods for context determination using real time sensor data |
US9860077B2 (en) * | 2014-09-17 | 2018-01-02 | Brain Corporation | Home animation apparatus and methods |
CN108717271A (en) * | 2018-05-30 | 2018-10-30 | 辽东学院 | Man-machine interaction control method, device, system and readable storage medium storing program for executing |
CN212489368U (en) * | 2020-05-08 | 2021-02-09 | 邓小龙 | Novel mobile phone quilt |
CN112000224A (en) * | 2020-08-24 | 2020-11-27 | 北京华捷艾米科技有限公司 | Gesture interaction method and system |
CN112217699A (en) * | 2020-10-09 | 2021-01-12 | 珠海格力电器股份有限公司 | Control method and device of intelligent household equipment |
CN112711979A (en) * | 2020-11-18 | 2021-04-27 | 北京邮电大学 | Non-contact vital sign monitoring under slow random motion based on biological radar |
CN112526892B (en) * | 2020-12-18 | 2022-08-05 | 青岛海尔科技有限公司 | Method and device for controlling intelligent household equipment and electronic equipment |
- 2021-06-28 CN CN202110719627.2A patent/CN115599006B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN115599006A (en) | 2023-01-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105095882B (en) | Gesture recognition method and device | |
CN109028515A (en) | Control method, air conditioner and the storage medium of air conditioner | |
CN102270036A (en) | Vision-based hand movement recognition system and method thereof | |
CN112526892B (en) | Method and device for controlling intelligent household equipment and electronic equipment | |
CN110533694A (en) | Image processing method, device, terminal and storage medium | |
CN107590474B (en) | Unlocking control method and related product | |
CN112198810B (en) | Method and device for navigation control of intelligent household equipment and intelligent household equipment | |
CN114704934B (en) | Method and device for controlling direct current air conditioner and direct current air conditioner | |
CN115599006B (en) | Control method and device of intelligent equipment and intelligent equipment | |
CN116311515A (en) | Gesture recognition method, device, system and storage medium | |
CN110657556A (en) | Method for controlling remote controller and remote controller | |
CN111104541A (en) | Efficient face picture retrieval method and device | |
CN113375307B (en) | Control method and control device for air conditioner and air conditioner | |
CN113531797B (en) | Method and device for preheating air conditioner, air conditioner and air conditioning system | |
CN109885232A (en) | Control method, device and system of data input equipment | |
CN112902394A (en) | Equipment control method, air conditioning unit and readable storage medium | |
CN111880488A (en) | Method, device and equipment for acquiring position of household appliance | |
CN103984415A (en) | Information processing method and electronic equipment | |
CN110941187A (en) | Household appliance control method and device | |
CN115264842A (en) | Method and device for controlling air conditioner, electronic equipment and storage medium | |
CN114840086A (en) | Control method, electronic device and computer storage medium | |
CN114622798A (en) | Method and device for opening door of household appliance and household appliance | |
CN113932387A (en) | Method and device for controlling air conditioner and air conditioner | |
CN114690651A (en) | Method and device for screen image transmission, smart home equipment and system | |
CN113380250B (en) | Information processing method, device and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |