CN114679612A - Intelligent household system and control method thereof - Google Patents

Intelligent household system and control method thereof

Info

Publication number
CN114679612A
CN114679612A
Authority
CN
China
Prior art keywords
track
display device
head
image
control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210252875.5A
Other languages
Chinese (zh)
Inventor
武志涛
程万胜
魏东
孙伟忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology Liaoning USTL
Original Assignee
University of Science and Technology Liaoning USTL
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology Liaoning USTL
Priority to CN202210252875.5A
Publication of CN114679612A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/4222Remote control device emulator integrated into a non-television apparatus, e.g. a PDA, media center or smart toy
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42221Transmission circuitry, e.g. infrared [IR] or radio frequency [RF]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C2201/00Transmission systems of control signals via wireless link
    • G08C2201/90Additional features
    • G08C2201/93Remote control using other portable devices, e.g. mobile phone, PDA, laptop

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a smart home system and a control method thereof. The smart home system comprises a first display device and a second display device that are communicatively connected, and further comprises: a head-mounted device provided with a camera and an input module, the head-mounted device being communicatively connected with the first display device; a wristband device provided with a speaker module, the wristband device being communicatively connected with the second display device; and a fingerstall device communicatively connected with the wristband device.

Description

Intelligent household system and control method thereof
Technical Field
The invention relates to the field of smart homes, and in particular to a smart home system and a control method thereof.
Background
When people watch videos on an intelligent terminal (such as a mobile phone or a computer), the video played on the terminal can be cast to a television. In the current screen-casting approach, the user selects the target television on the intelligent terminal and then starts the cast.
Disclosure of Invention
The invention provides a smart home system and a smart home control method for casting a video playing on an intelligent terminal to a television.
To achieve this purpose, the invention adopts the following technical solution:
A smart home system comprises a first display device and a second display device that are communicatively connected, and further comprises:
a head-mounted device provided with a camera and an input module, the head-mounted device being communicatively connected with the first display device;
a wristband device provided with a speaker module, the wristband device being communicatively connected with the second display device;
a fingerstall device communicatively connected with the wristband device.
Further, the first display device comprises a mobile phone and/or a computer, and the second display device comprises a television.
Further, the input module comprises a key and/or a touch pad.
The smart home control method is based on any one of the systems provided by the invention and comprises the following steps:
S1: acquiring a first image through the camera of the head-mounted device, and executing S2 when the first display device is present in the first image;
S2: outputting first prompt information through the head-mounted device, and executing S3 when first control information is acquired through the input module of the head-mounted device;
S3: acquiring a second image through the camera of the head-mounted device, and executing S4 when the second display device is present in the second image;
S4: outputting second prompt information through the head-mounted device, and, when second control information is acquired through the input module of the head-mounted device, sending a first control instruction from the head-mounted device to the first display device, the first control instruction being a screen-casting instruction.
Further, in S1, when a plurality of first display devices are present in the first image, the first prompt information in S2 includes identifications of the plurality of first display devices in the first image, the first control information is used to select a first target display device from the plurality of first display devices in the first image, and in S4 the head-mounted device sends the first control instruction to the first target display device.
Further, in S1, when a plurality of first display devices are present in the first image, the first display device closest to the center of the first image is determined as the first target display device, and in S4 the head-mounted device sends the first control instruction to the first target display device.
Further, the method also comprises the following steps:
S5: executing S6 when the wristband device detects a first track of the fingerstall device on a target surface;
S6: detecting, by the wristband device, a second track of the fingerstall device on the target surface;
S7: executing S8 when the wristband device detects a third track of the fingerstall device on the target surface;
S8: calculating, by the wristband device, a fourth track, generating a second control instruction according to the fourth track, and sending the second control instruction to the second display device, wherein the fourth track is the mapping, on a first plane, of the part of the second track other than the third track, and the first plane is constructed from the first track.
Further, the target surface is the palm of the hand wearing the wristband device.
Further, S1 includes: S11: presetting a first standard track; S12: judging whether the first track is the first standard track; if so, executing S3 and outputting first voice information; if not, outputting second voice information.
Further, S3 includes: S31: calculating a fifth track, wherein the fifth track is the mapping of the second track on the first plane; S32: presetting a sixth standard track; S33: judging whether the fifth track includes a sixth track; if so, executing S4 and outputting third voice information, wherein the sixth track is identical to the sixth standard track.
Compared with the prior art, the invention has the following beneficial effects:
Before casting, the first display device and the second display device are identified from images acquired by the camera of the head-mounted device, and a control instruction is sent to the first display device to make it cast the video it is playing to the second display device; the user therefore does not have to manually select the second display device in the video-playing app on the first display device. During casting, the wristband device sends control instructions to the second display device, for example to adjust the volume, replacing the traditional remote-controller-based control.
Drawings
FIG. 1 is a schematic diagram of the smart home system in Embodiment 1;
FIG. 2 is a flowchart of the smart home control method in Embodiment 2.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Fixed connections in the embodiments can be made by welding, bonding, screws, interference fit, or the like.
Embodiment 1:
A smart home system comprises a first display device and a second display device. The first display device may be a mobile phone and/or a computer; that is, there may be one or more first display devices, which is not limited here. The second display device may be a television. The first display device is communicatively connected with the second display device so that a video played by the first display device can be cast to the second display device.
It should be noted that casting the video played by the first display device to the second display device mainly involves two processes: selecting the second display device in the video-playing app on the first display device, and then casting the video played by the first display device to the second display device. The smart home system of this embodiment is designed mainly for the first process; the second process can be completed with existing technology. To this end, as shown in FIG. 1, the smart home system of this embodiment further comprises a head-mounted device, a wristband device, and a fingerstall device.
The head-mounted device may be an AR device or a VR device. It is provided with a camera and an input module. Through the camera, the head-mounted device can acquire an image of the scene in front of it; the input module may be a key and/or a touch pad, through which the user can input instructions to the head-mounted device. The head-mounted device is communicatively connected with the first display device so that it can send control instructions to the first display device; specifically, the two may be connected via Wi-Fi or Bluetooth.
The wristband device is provided with a speaker module and is communicatively connected with the second display device so that the wristband device can send control instructions to the second display device; specifically, the two may be connected via Wi-Fi or Bluetooth.
The fingerstall device is communicatively connected with the wristband device so that the wristband device can detect the motion track of the fingerstall device, from which the wristband device generates control instructions; specifically, the two may be connected via NFC.
Before casting, the first display device and the second display device are identified from the images acquired by the camera of the head-mounted device, and a control instruction is sent to the first display device to make it cast the video it is playing to the second display device; the user therefore does not need to manually select the second display device in the video-playing app on the first display device. During casting, the wristband device sends control instructions to the second display device, for example to adjust the volume, replacing the traditional remote-controller-based control.
Embodiment 2:
A smart home control method is based on the smart home system of Embodiment 1. As shown in FIG. 2, the control method of this embodiment comprises the following steps:
S1: acquiring a first image through the camera of the head-mounted device, and executing S2 when the first display device is present in the first image.
In the control method of this embodiment, when the user wants to cast a video played by the first display device to the second display device, the user first wears the head-mounted device and looks at the first display device. The camera of the head-mounted device then acquires a first image containing the first display device, i.e. "acquiring a first image through the camera of the head-mounted device" is performed, and S2 is executed when the first display device is present in the first image. In a specific implementation, to facilitate identifying the first display device in the first image, an image of the first display device may be pre-stored in the head-mounted device.
S2: outputting first prompt information through the head-mounted device, and executing S3 when first control information is acquired through the input module of the head-mounted device.
In the control method of this embodiment, the first prompt information is used to remind the user that the first display device, i.e. the video source, has been identified. In a specific implementation, the first prompt information may be image, video, or audio information; for example, a preset image or video may be shown through the lens of the head-mounted device.
In the control method of this embodiment, the first control information is used for the user to confirm the first prompt information and start the subsequent steps. In a specific implementation, when the input module is a key, the first control information may be a single click, and S3 is executed when the head-mounted device detects the single click; when the input module is a touch pad, the first control information may be a swipe along a preset track.
S3: acquiring a second image through a camera of the head-mounted device, and executing S4 when the second display device exists in the second image.
In the control method of this embodiment, after the first control information has been acquired through the input module of the head-mounted device, the user looks at the second display device. The camera of the head-mounted device then acquires a second image containing the second display device, i.e. "acquiring a second image through the camera of the head-mounted device" is performed, and S4 is executed when the second display device is present in the second image. In an implementation, to facilitate identifying the second display device in the second image, an image of the second display device may be pre-stored in the head-mounted device.
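S1 and S3 both rely on recognizing a pre-stored device image within a camera frame. The patent does not specify the recognition algorithm; purely as an illustration, a minimal sketch using OpenCV template matching is given below, where the file names and the 0.8 confidence threshold are assumptions and not part of the disclosed method.

```python
import cv2

def find_display_device(frame_path, template_path, threshold=0.8):
    """Return the bounding box of the pre-stored device image in the
    camera frame, or None if the match confidence is below threshold."""
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None  # the display device is not present in the image
    h, w = template.shape
    x, y = max_loc
    return (x, y, w, h)  # top-left corner plus template size

# Example: proceed only when the device is found in the first image
# box = find_display_device("first_image.png", "phone_template.png")
# if box is not None:
#     ...  # output the first prompt information (S2)
```

In practice a feature-based or learned detector would be more robust to scale and viewing angle; template matching is used here only to keep the sketch short.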
S4: outputting second prompt information through the head-mounted device, and, when second control information is acquired through the input module of the head-mounted device, sending a first control instruction from the head-mounted device to the first display device, the first control instruction being a screen-casting instruction.
In the control method of this embodiment, the second prompt information is used to remind the user that the second display device, i.e. the casting target, has been identified. The second control information is used for the user to confirm the second prompt information and to instruct the head-mounted device to send the first control instruction to the first display device.
In the control method of this embodiment, the first control instruction may include name information of the second display device and is further used to control the first display device to select the second display device according to that name information. It should be noted that, as described above, once the second display device has been selected on the first display device, the video played by the first display device can be cast to the second display device using existing technology, so this is not described in detail here. In addition, implementing this control method also involves remote control of a mobile phone or a computer, which can likewise be realized with existing remote control technology, so it is not described in detail here.
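The patent leaves the format of the first control instruction open. Purely as an illustrative sketch, and assuming the existing Wi-Fi connection between the head-mounted device and the first display device, the instruction could be a small JSON message naming the second display device; the port number and the field names below are hypothetical.

```python
import json
import socket

def send_cast_instruction(first_display_ip, second_display_name, port=8765):
    """Send a hypothetical screen-casting instruction to the first display
    device, naming the second display device as the casting target."""
    instruction = {
        "type": "screen_cast",          # the first control instruction
        "target": second_display_name,  # name information of the second display device
    }
    payload = json.dumps(instruction).encode("utf-8")
    with socket.create_connection((first_display_ip, port), timeout=5) as sock:
        sock.sendall(payload)

# send_cast_instruction("192.168.1.20", "living-room-tv")
```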
In the control method of this embodiment, there may be a plurality of first display devices in the first image. For this case, this embodiment further provides the following alternative embodiments.
In an alternative embodiment, in S1, when a plurality of first display devices are present in the first image, the first prompt information in S2 includes identifications of the plurality of first display devices in the first image, the first control information is used to select a first target display device from the plurality of first display devices in the first image, and in S4 the head-mounted device sends the first control instruction to the first target display device.
In this alternative embodiment, when a plurality of first display devices are present in the first image, the first target display device, i.e. the video source, is determined by the user's selection.
In this alternative embodiment, the identifier of a first display device is used to prompt the user to select the first target display device from the plurality of first display devices. In a specific implementation, the identifier of a first display device may be a label mapped one-to-one to that device together with the device's name. For example, when the first image contains a mobile phone and a computer, the text "1 - mobile phone" and "2 - computer" is displayed through the lens of the head-mounted device to prompt the user to select one of them as the first target display device. Correspondingly, if the user wants to select the mobile phone, the user single-clicks the key of the head-mounted device, and if the user wants to select the computer, the user double-clicks the key; the head-mounted device then determines the mobile phone or the computer as the first target display device according to the single or double click.
In an alternative embodiment, in S1, when a plurality of first display devices are present in the first image, the first display device closest to the center of the first image is determined as the first target display device, and in S4 the head-mounted device sends the first control instruction to the first target display device.
In this alternative embodiment, when a plurality of first display devices are present in the first image, the first display device closest to the center of the first image is determined as the first target display device. In a specific implementation, the distance from the center of each first display device in the first image to the center of the first image may be calculated, and the first display device with the smallest distance is determined as the first target display device.
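A minimal sketch of the center-distance rule described above, assuming each detected first display device is represented by an (x, y, w, h) bounding box in pixel coordinates of the first image:

```python
def pick_target_by_center(boxes, image_width, image_height):
    """Return the index of the bounding box whose center is closest to the
    center of the first image; boxes are (x, y, w, h) tuples."""
    cx, cy = image_width / 2.0, image_height / 2.0

    def distance_sq(box):
        x, y, w, h = box
        bx, by = x + w / 2.0, y + h / 2.0
        return (bx - cx) ** 2 + (by - cy) ** 2

    return min(range(len(boxes)), key=lambda i: distance_sq(boxes[i]))

# Example: two candidate devices in a 1920x1080 first image
# boxes = [(100, 200, 300, 200), (800, 400, 320, 180)]
# target_index = pick_target_by_center(boxes, 1920, 1080)  # -> 1
```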
It should be understood that when the control method of this embodiment is implemented there may also be a plurality of second display devices in the second image; in that case, the idea of the above alternative embodiments can be applied to improve the adaptability of S4.
In addition, while a video is being cast to the second display device, the second display device is usually controlled with a remote controller, for example to raise or lower the volume. The disadvantage is that the user usually has to look at the remote controller while operating it and may miss the content shown on the television at that moment. To address this disadvantage, this embodiment also provides the following alternative embodiments.
In an alternative embodiment, the method further comprises the following steps:
S5: executing S6 when the wristband device detects a first track of the fingerstall device on the target surface;
S6: detecting, by the wristband device, a second track of the fingerstall device on the target surface;
S7: executing S8 when the wristband device detects a third track of the fingerstall device on the target surface;
S8: calculating, by the wristband device, a fourth track, generating a second control instruction according to the fourth track, and sending the second control instruction to the second display device, wherein the fourth track is the mapping, on a first plane, of the part of the second track other than the third track, and the first plane is constructed from the first track.
In this alternative embodiment, the user wears the wristband device on one hand and the fingerstall device on a finger of the other hand; specifically, the wristband device may be worn on the user's left hand and the fingerstall device on the index finger of the right hand (for convenience, this configuration is assumed in the following description). In addition, a wireless communication module, such as an NFC module, is provided in both the wristband device and the fingerstall device, so that the motion track of the fingerstall device can be detected by the wristband device when the fingerstall device is close to it, for example within a range of about 20 cm around the wristband device. It should be noted that methods for detecting the motion track of an electronic device provided with an NFC module belong to the prior art and are therefore not described in detail here.
In this alternative embodiment, the target surface is the palm of the hand wearing the wristband device, so that the palm effectively becomes a "touchpad". When the control method of this alternative embodiment is executed, the user opens the left palm and slides the right index finger over it, thereby drawing the first, second, and third tracks of the fingerstall device on the target surface (from which the fourth track is then calculated); these tracks are detected by the wristband device, which then generates the corresponding second control instruction for controlling the second display device. In summary, the control method of this alternative embodiment can be performed without the user looking at the wristband device or the fingerstall device, which solves the technical problem of using a remote controller.
In this alternative embodiment, the first track serves two purposes. First, it initiates the control method of this alternative embodiment: the steps that generate the second control instruction are started when the wristband device detects the first track. Second, it is used to construct the first plane; in an implementation, the first plane can be constructed from any three non-collinear points taken from the first track.
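As an illustration of constructing the first plane, the sketch below takes three spread-out points of the first track and derives the plane's unit normal; it assumes the wristband device reports track points as 3-D coordinates, which the patent does not state explicitly.

```python
import numpy as np

def plane_from_track(track_points):
    """Build the first plane from three points of the first track.
    Returns (origin, normal) with a unit-length normal vector."""
    p1, p2, p3 = (np.asarray(track_points[i], dtype=float)
                  for i in (0, len(track_points) // 2, -1))
    normal = np.cross(p2 - p1, p3 - p1)
    norm = np.linalg.norm(normal)
    if norm < 1e-9:
        raise ValueError("chosen points are (nearly) collinear; pick other points")
    return p1, normal / norm
```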
In this alternative embodiment, the first plane brings the following benefits. First, since the first plane is constructed from the first track drawn by the user rather than from a preset flat structure or device, it is convenient for the user and makes the algorithm more flexible. Second, the first plane reduces the dimensionality of the second track, and the target instruction is generated from the two-dimensional fourth track, which reduces the difficulty of the algorithm.
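To illustrate the dimensionality reduction, the following sketch maps a 3-D track onto the first plane and expresses it in 2-D plane coordinates (as needed for the fourth and fifth tracks); the particular choice of in-plane axes is an implementation assumption.

```python
import numpy as np

def project_track_to_plane(track_points, origin, normal):
    """Map a 3-D track onto the first plane and return 2-D coordinates.
    `origin` and `normal` come from plane_from_track()."""
    pts = np.asarray(track_points, dtype=float)
    # Remove the out-of-plane component of each point.
    offsets = pts - origin
    in_plane = offsets - np.outer(offsets @ normal, normal)
    # Build two orthonormal in-plane axes (u, v) for 2-D coordinates.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(normal @ helper) > 0.9:      # avoid a helper parallel to the normal
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(normal, helper)
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    return np.stack([in_plane @ u, in_plane @ v], axis=1)
```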
In this alternative embodiment, the second track is the set of tracks drawn by the fingerstall device on the target surface after the first track, and it comprises two parts: the part used to calculate the fourth track, and the third track. In a specific scenario, after the user's right index finger finishes drawing the first track on the left palm, it continues to draw further tracks (i.e. the second track) on the palm; when the wristband device detects that these tracks include the third track, the fourth track is calculated from the part between the first track and the third track (i.e. the part of the second track other than the third track). The third track therefore serves two purposes in this alternative embodiment: it ends detection, i.e. when the wristband device detects the third track it stops detecting the second track and starts generating the second control instruction; and it lets the wristband device determine which part of the second track is used to generate the fourth track, that is, which part is used to generate the second control instruction, so as to prevent misjudgment.
In an alternative embodiment, S1 includes: S11: presetting a first standard track; S12: judging whether the first track is the first standard track; if so, executing S3 and outputting first voice information; if not, outputting second voice information.
In this alternative embodiment, the first standard track may in practice be a clockwise circle drawn on the left palm by the right index finger (i.e. the fingerstall device); the first voice information may be "please input an instruction" and the second voice information may be "please redraw". In a specific scenario, when the user wants to send a control instruction to the television, the right index finger draws the first track on the left palm: if the first track is the same as the first standard track, the first voice information "please input an instruction" is output to remind the user that an instruction can now be input; if not, the second voice information "please redraw" is output to remind the user that the drawing was wrong. Of course, it should be understood that the first standard track may be another pattern, such as a circular arc, and the first and second voice information may take other forms, such as a single or double beep, as long as the two are easy to distinguish.
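If the first standard track is taken to be a clockwise circle, one simple way to check a candidate track (expressed in 2-D coordinates on the first plane) is to verify that it is roughly closed and round and that its signed area indicates clockwise orientation. The tolerances and the y-up coordinate convention below are assumptions for illustration.

```python
import numpy as np

def is_clockwise_circle(track_2d, closure_tol=0.15, roundness_tol=0.25):
    """Rough check that a 2-D track is a closed, clockwise, circle-like loop.
    Coordinates are assumed to use a y-up convention, in which a clockwise
    loop has negative signed (shoelace) area."""
    pts = np.asarray(track_2d, dtype=float)
    center = pts.mean(axis=0)
    radii = np.linalg.norm(pts - center, axis=1)
    mean_r = radii.mean()
    if mean_r == 0:
        return False
    # Closed: start and end points are near each other relative to the circumference.
    closed = np.linalg.norm(pts[0] - pts[-1]) < closure_tol * 2 * np.pi * mean_r
    # Round: radii do not deviate too much from the mean radius.
    round_enough = radii.std() < roundness_tol * mean_r
    # Clockwise: shoelace signed area is negative.
    x, y = pts[:, 0], pts[:, 1]
    signed_area = 0.5 * np.sum(x * np.roll(y, -1) - np.roll(x, -1) * y)
    return closed and round_enough and signed_area < 0
```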
In an alternative embodiment, S3 includes: S31: calculating a fifth track, wherein the fifth track is the mapping of the second track on the first plane; S32: presetting a sixth standard track; S33: judging whether the fifth track includes a sixth track; if so, executing S4 and outputting third voice information, wherein the sixth track is identical to the sixth standard track.
In this alternative embodiment, the sixth track is the projection of the third track on the first plane. In an implementation, the sixth standard track may be a clockwise circle on the first plane or another pattern, and the third voice information may be "drawing finished" or another cue, such as a beep, to remind the user that the drawing is complete.
In an alternative embodiment, S3 further includes: S34: presetting a time-interval threshold, calculating the time interval from a first moment to the current moment, and outputting fourth voice information when the time interval is greater than or equal to the time-interval threshold and the fifth track does not contain the sixth track, wherein the first moment is the moment at which the first track was detected.
In this alternative embodiment, if the user does not draw the third track for a long time after finishing the first track, the fourth voice information is output to remind the user.
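A minimal sketch of the time-interval check in S34, assuming the first moment is recorded with a monotonic clock and using a hypothetical 10-second threshold:

```python
import time

def should_prompt_timeout(first_moment, interval_threshold_s, sixth_track_found):
    """Return True when the fourth voice information should be output: the time
    since the first track was detected has reached the threshold and the fifth
    track still does not contain the sixth track."""
    interval = time.monotonic() - first_moment
    return interval >= interval_threshold_s and not sixth_track_found

# first_moment = time.monotonic()  # recorded when the first track is detected
# if should_prompt_timeout(first_moment, 10.0, False):
#     ...  # output the fourth voice information
```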
In an alternative embodiment, S4 includes: S41: calculating the fourth track, wherein the fourth track is the part of the fifth track other than the sixth track; S42: presetting an instruction set, wherein the instruction set comprises a plurality of instructions and each instruction is mapped to a different fourth standard track; S43: matching the fourth track against the fourth standard track mapped to each instruction in turn; S44: if exactly one instruction is matched, determining that instruction as the control instruction and sending it to the display terminal; if no instruction is matched, or two or more instructions are matched, outputting fifth voice information.
In another alternative embodiment, S4 includes: S45: calculating the fourth track, wherein the fourth track is the part of the fifth track other than the sixth track; S46: presetting an instruction set, wherein the instruction set comprises a plurality of instructions and each instruction is mapped to a different fourth standard track; S47: matching the fourth track against the fourth standard track mapped to each instruction in turn; S48: determining the instruction corresponding to the fourth standard track with the highest matching degree as the control instruction.
In the above alternative embodiments, the fourth standard tracks need to be made clearly different from the sixth standard track in advance, so that the two are not confused during recognition.
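A minimal sketch of matching the 2-D fourth track against the fourth standard tracks in the instruction set: both tracks are resampled to a fixed number of points, normalized for position and scale, and compared by average point distance. The resampling count, the normalization, and the 0.25 threshold are assumptions; with a threshold the sketch behaves like the unique-match variant (S41-S44), and taking the smallest distance instead gives the best-match variant (S45-S48).

```python
import numpy as np

def resample(track, n=32):
    """Resample a 2-D track to n points spaced evenly along its arc length."""
    pts = np.asarray(track, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    cum = np.concatenate([[0.0], np.cumsum(seg)])
    targets = np.linspace(0.0, cum[-1], n)
    x = np.interp(targets, cum, pts[:, 0])
    y = np.interp(targets, cum, pts[:, 1])
    return np.stack([x, y], axis=1)

def normalize(track):
    """Translate the centroid to the origin and scale to unit size."""
    pts = track - track.mean(axis=0)
    scale = np.abs(pts).max()
    return pts / scale if scale > 0 else pts

def match_instruction(fourth_track, instruction_set, threshold=0.25):
    """instruction_set maps instruction names to their fourth standard tracks.
    Returns the single instruction within threshold, or None when no
    instruction (or more than one) matches."""
    probe = normalize(resample(fourth_track))
    scores = {}
    for name, standard in instruction_set.items():
        ref = normalize(resample(standard))
        scores[name] = np.linalg.norm(probe - ref, axis=1).mean()
    hits = [name for name, s in scores.items() if s < threshold]
    return hits[0] if len(hits) == 1 else None  # else: output fifth voice information

# Best-match variant (S45-S48): min(scores, key=scores.get)
```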
The above description covers only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any equivalent replacement or change made according to the technical solutions and inventive concept of the present invention by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A smart home system comprising a first display device and a second display device that are communicatively connected, characterized in that the system further comprises:
a head-mounted device provided with a camera and an input module, the head-mounted device being communicatively connected with the first display device;
a wristband device provided with a speaker module, the wristband device being communicatively connected with the second display device;
a fingerstall device communicatively connected with the wristband device.
2. The system of claim 1, wherein the first display device comprises a mobile phone and/or a computer, and the second display device comprises a television.
3. The system of claim 1, wherein the input module comprises a key and/or a touch pad.
4. A smart home control method based on the system according to any one of claims 1 to 3, the control method comprising the following steps:
S1: acquiring a first image through the camera of the head-mounted device, and executing S2 when the first display device is present in the first image;
S2: outputting first prompt information through the head-mounted device, and executing S3 when first control information is acquired through the input module of the head-mounted device;
S3: acquiring a second image through the camera of the head-mounted device, and executing S4 when the second display device is present in the second image;
S4: outputting second prompt information through the head-mounted device, and, when second control information is acquired through the input module of the head-mounted device, sending a first control instruction from the head-mounted device to the first display device, the first control instruction being a screen-casting instruction.
5. The control method according to claim 4, wherein in S1, when a plurality of first display devices are present in the first image, the first prompt information in S2 includes identifications of the plurality of first display devices in the first image, the first control information is used to select a first target display device from the plurality of first display devices in the first image, and in S4 the head-mounted device sends the first control instruction to the first target display device.
6. The control method according to claim 4, wherein in S1, when a plurality of first display devices are present in the first image, the first display device closest to the center of the first image is determined as the first target display device, and in S4 the head-mounted device sends the first control instruction to the first target display device.
7. The control method according to any one of claims 4 to 6, characterized by further comprising the following steps:
S5: executing S6 when the wristband device detects a first track of the fingerstall device on a target surface;
S6: detecting, by the wristband device, a second track of the fingerstall device on the target surface;
S7: executing S8 when the wristband device detects a third track of the fingerstall device on the target surface;
S8: calculating, by the wristband device, a fourth track, generating a second control instruction according to the fourth track, and sending the second control instruction to the second display device, wherein the fourth track is the mapping, on a first plane, of the part of the second track other than the third track, and the first plane is constructed from the first track.
8. The control method according to claim 7, wherein the target surface is the palm of the hand wearing the wristband device.
9. The control method according to claim 7, wherein S1 includes: S11: presetting a first standard track; S12: judging whether the first track is the first standard track; if so, executing S3 and outputting first voice information; if not, outputting second voice information.
10. The control method according to claim 7, wherein S3 includes: S31: calculating a fifth track, wherein the fifth track is the mapping of the second track on the first plane; S32: presetting a sixth standard track; S33: judging whether the fifth track includes a sixth track; if so, executing S4 and outputting third voice information, wherein the sixth track is identical to the sixth standard track.
CN202210252875.5A 2022-03-15 2022-03-15 Intelligent household system and control method thereof Pending CN114679612A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210252875.5A CN114679612A (en) 2022-03-15 2022-03-15 Intelligent household system and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210252875.5A CN114679612A (en) 2022-03-15 2022-03-15 Intelligent household system and control method thereof

Publications (1)

Publication Number Publication Date
CN114679612A 2022-06-28

Family

ID=82074606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210252875.5A Pending CN114679612A (en) 2022-03-15 2022-03-15 Intelligent household system and control method thereof

Country Status (1)

Country Link
CN (1) CN114679612A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808791A (en) * 2015-04-03 2015-07-29 杨皓 Method for inputting or controlling electronic equipment by triggering skin surface through finger
CN105338032A (en) * 2014-08-06 2016-02-17 中国银联股份有限公司 Smart glasses based multi-screen synchronizing system and multi-screen synchronizing method
CN105744293A (en) * 2016-03-16 2016-07-06 北京小米移动软件有限公司 Video live broadcast method and device
CN107003738A (en) * 2014-12-03 2017-08-01 微软技术许可有限责任公司 Fixation object application launcher
CN107272892A (en) * 2017-05-31 2017-10-20 北京数科技有限公司 A kind of virtual touch-control system, method and device
US20180069955A1 (en) * 2015-04-08 2018-03-08 Samsung Electronics Co., Ltd. Method and apparatus for interworking between electronic devices
KR20180049739A (en) * 2016-11-03 2018-05-11 엘지전자 주식회사 Hmd mobile terminal and operating method thereof
CN111190488A (en) * 2019-12-30 2020-05-22 华为技术有限公司 Device control method, communication apparatus, and storage medium
CN111481177A (en) * 2020-03-27 2020-08-04 深圳光启超材料技术有限公司 Head-mounted device, screen projection system and method, and computer-readable storage medium
CN112929734A (en) * 2021-02-05 2021-06-08 维沃移动通信有限公司 Screen projection method and device and electronic equipment

Similar Documents

Publication Publication Date Title
US10007354B2 (en) Method and apparatus for controlling smart device
US10453331B2 (en) Device control method and apparatus
US20160217794A1 (en) Information processing apparatus, information processing method, and program
RU2642410C2 (en) Intelligent household appliance control method, device and system, mobile and wearable intelligent household appliances and apparatus
CN105608861B (en) Control method of electronic device and device
EP2980675A2 (en) Mobile device and method of pairing the same with electric device
US10120441B2 (en) Controlling display content based on a line of sight of a user
CN107810459A (en) The pairing of media streaming device
EP3136793A1 (en) Method and apparatus for awakening electronic device
US11197064B2 (en) Display device, display control method, and program
JP2017529814A (en) Method and apparatus for controlling intelligent devices
US20150346816A1 (en) Display device using wearable eyeglasses and method of operating the same
CN104813642A (en) Methods, apparatuses and computer readable medium for triggering a gesture recognition mode and device pairing and sharing via non-touch gestures
JP6423129B1 (en) Smart device control method and apparatus
CN111580661A (en) Interaction method and augmented reality device
CN103873959A (en) Control method and electronic device
WO2020006746A1 (en) Method and apparatus for recognizing downlink transmission
CN112911190A (en) Remote assistance method, electronic equipment and system
WO2020078319A1 (en) Gesture-based manipulation method and terminal device
CN112565837A (en) Media content system for communicating playback markers between networked playback devices
JP2018531453A (en) Timing method and apparatus
JP2018531453A6 (en) Timing method and apparatus
US20210307104A1 (en) Method and apparatus for controlling intelligent voice control device and storage medium
CN111123716B (en) Remote control method, remote control device, and computer-readable storage medium
CN106896917B (en) Method and device for assisting user in experiencing virtual reality and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220628