CN109088803B - AR remote control device, intelligent home remote control system and method - Google Patents


Info

Publication number
CN109088803B
Authority
CN
China
Prior art keywords
gesture
intelligent home
instruction
remote control
control instruction
Prior art date
Legal status
Active
Application number
CN201811102831.4A
Other languages
Chinese (zh)
Other versions
CN109088803A (en)
Inventor
卫荣杰
房晓俊
Current Assignee
Tapuyihai Shanghai Intelligent Technology Co ltd
Original Assignee
Tapuyihai Shanghai Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Tapuyihai Shanghai Intelligent Technology Co ltd filed Critical Tapuyihai Shanghai Intelligent Technology Co ltd
Priority to CN201811102831.4A
Publication of CN109088803A
Application granted
Publication of CN109088803B

Classifications

    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention discloses an AR remote control device, an intelligent home remote control system, and a method, relating to the AR field. The AR remote control device comprises: an AR display device for capturing images of the smart home; and an instruction control device for receiving a control instruction and sending a control instruction coding signal to the smart home according to that instruction. A remote controller based on AR technology feels concise and convenient to any user, and is especially valuable for patients who have lost the ability to speak but retain vision and awareness; supplemented with language semantic recognition technology, it is also friendly to elderly and child users.

Description

AR remote control device, intelligent home remote control system and method
Technical Field
The invention relates to the field of remote control devices, in particular to a remote control device based on an AR technology, an intelligent home remote control system based on the AR technology and a remote control implementation method.
Background
Remote control of electronic/smart home devices is generally done with an infrared remote controller, and different devices require different remote controllers. As a result, users often find themselves operating the television with the air conditioner's remote controller, or failing to find the remote controller they want on the first try.
In the prior art, unified control is performed through a smart speaker, but the drawback is that the user's verbal command does not necessarily match the control command of a specific electronic/smart home device; for example, the user's command may be ambiguous. The smart speaker therefore needs the neural network search capability of an AI chip to perform further semantic analysis and recover the user's intended operation command, and introducing an AI chip inevitably increases the cost of the product.
In addition, users who retain vision and awareness but have lost the ability to speak cannot operate smart home devices through a smart speaker at all.
The invention with application number 201810050821.4 provides a near-eye see-through head-mounted display optical system comprising a first lens, a second lens, and a miniature image display, wherein the first and second lenses are attached to the miniature image display and are free-form-surface lenses of uniform thickness. This near-eye see-through optical architecture not only reduces the number of refractions of light within the system but also eliminates the aberration of light emitted by the miniature image display in all directions, so that the image is free of aberration at every direction and angle.
The utility model with application number 201821172477.8 provides an AR display device on the basis of the above optical system. The AR display device comprises at least one display/projector and at least one optical means; a real image is formed through the optical means, the display/projector forms a virtual image in the optical means, and the real and virtual images appear combined in the same field of view. The optical means comprises at least one main lens and at least one free-form lens for reflecting the image projected by the display/projector; the main lens reflects the projected image a second time. The AR device achieves a large viewing angle, small volume, and light weight, giving the wearer an immersive AR experience.
The application with number 201810994555.0 further provides a head-mounted virtual-real interaction device and a virtual-real interaction method based on the AR display device. The head-mounted virtual-real interaction device comprises a head-mounted AR display device for displaying at least a 3D virtual image in its field of view, and a virtual-real interactor for acquiring the behavior, actions, or control instructions of the wearer and changing or controlling the content of the 3D virtual image accordingly.
Disclosure of Invention
Based on the light-transmitting AR technology or image AR technology of the cited patents, the present invention provides an AR remote control device, a smart home remote control system, and a method, aiming to offer a novel remote control device and remote control mode so that, with AR technology, users can operate a remote controller more conveniently and quickly, and the applicable user group is wider.
The technical scheme provided by the invention is as follows:
an AR remote control device, comprising: the AR display device is used for capturing images of intelligent home/intelligent household appliances; the instruction control equipment is used for receiving the control instruction and sending a control instruction coding signal to the intelligent home according to the control instruction.
In the technical scheme, the instruction control equipment and the AR display equipment are combined to form the AR remote control device, and the visual remote control device can drive any intelligent home in the home and is convenient to use.
Further, the instruction control apparatus includes: the instruction generation module is used for receiving the control instruction; and the communication module is used for sending a control instruction coding signal to the intelligent home according to the control instruction.
Further, the instruction generation module includes: a gesture interaction component, a head-aiming component (helmet sight), an eye-tracking component (eye-tracking camera), a smart glove, a smart ring, a control handle, and/or a voice control component. The gesture interaction component deems a user control instruction received when it detects that a gesture made by the user matches a preset gesture; the voice control component receives control instructions input by voice.
In the technical scheme, the control modes on the AR remote control device are various, so that the realization possibility and the application range of the AR remote control device are greatly enriched.
Further, the gesture interaction component's detection that a gesture made by the user matches the preset gesture is specifically: the gesture interaction component deems the user's gesture to match the preset gesture when it detects that a first gesture made by the user matches the preset first gesture, a second gesture made by the user matches the preset second gesture, and the time difference between the first and second gestures falls within a preset time range.
In this technical scheme, the linked gesture pair serves as the control instruction input, giving high recognition accuracy.
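The linkage-gesture check described above can be sketched as follows. This is an illustrative sketch only: the gesture names and the time window are invented stand-ins for the preset gestures and preset time range, which the patent does not fix.

```python
# Illustrative sketch of the linkage-gesture check: a control input is
# accepted only when a first gesture and a second gesture both match their
# presets AND occur within a preset time window. Gesture names and the
# window length below are assumptions, not values from the patent.

PRESET_FIRST = "point"    # stands in for the preset first gesture
PRESET_SECOND = "pinch"   # stands in for the preset second gesture
MAX_GAP_SECONDS = 1.5     # stands in for the preset time range

def linkage_gesture_matched(events):
    """events: chronological list of (timestamp, gesture_name) pairs.
    Returns True when a first/second gesture pair matches the presets
    within the preset time range."""
    for i, (t1, g1) in enumerate(events):
        if g1 != PRESET_FIRST:
            continue
        for t2, g2 in events[i + 1:]:
            if g2 == PRESET_SECOND and (t2 - t1) <= MAX_GAP_SECONDS:
                return True
    return False
```

A gesture sensor such as Leap Motion would supply the event stream; here it is abstracted to plain tuples.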
Further, the control instructions include gesture instructions, voice instructions, head-of-sight instructions, eye-movement instructions, and/or menu instructions.
Further, the camera in the AR remote control device may include: a depth camera, an RGB camera, a TOF camera, a fisheye camera, and/or a wide-angle camera. The AR remote control device further includes: an image recognition module for acquiring images or identification codes of smart home devices in the environment; and an image comparison module for determining the type, brand, and model of the smart home device by comparison against the image or identification code.
Further, the AR remote control device further includes a storage module for pre-storing images or identification codes of smart home devices (bar codes, two-dimensional codes, AR recognition trigger marks, and the like attached to the device housing, used to determine type, brand, and model), to be called when the image comparison module performs comparison. The storage module also pre-stores the control instruction coding signals of smart home devices of determined type, brand, and model, which the instruction control device sends to the specific smart home device.
In the technical scheme, the AR remote control device stores various intelligent home data, so that the intelligent home control is more humanized and more convenient.
Further, the AR remote control device further includes a network searcher for searching for the images or identification codes of smart home devices of determined type, brand, and model, to be called when the image comparison module performs comparison. The network searcher also searches for the control instruction coding signals of those devices, which the instruction control device sends to the specific smart home device.
In this technical scheme, if the information is not stored locally, it can be downloaded automatically over the network, giving a higher degree of intelligence and a better user experience.
Further, the AR remote control device further includes: and the virtual object generation module is used for assisting the instruction control equipment to clarify the control instruction.
Further, the virtual object generated by the virtual object generating module includes: virtual control instruction menus, virtual intermediaries, 3D virtual space environments, or 3D map images.
In this technical scheme, the virtual object generation module further assists the user in clarifying the control instruction, making control more accurate, convenient, and diversified.
The invention also provides an intelligent home remote control system, which comprises: the intelligent household remote control device comprises an intelligent household and an AR remote control device, wherein the AR remote control device is used for sending control instruction coding signals to the intelligent household.
Further, the smart home remote control system further includes: and the auxiliary camera module is used for acquiring the remote image of the intelligent home and sending the remote image to the AR remote control device.
Further, the auxiliary camera module includes: a network camera, a drone-mounted camera, or an unmanned-vehicle-mounted camera.
Further, the AR remote control device comprises an instruction control device, wherein the instruction control device is used for receiving the control instruction and sending a control instruction coding signal to the intelligent home according to the control instruction.
Further, the control instructions include gesture instructions, voice instructions, head-of-sight instructions, eye-movement instructions, and/or menu instructions.
Further, the instruction control apparatus includes: the instruction generation module is used for receiving the control instruction; and the communication module is used for sending a control instruction coding signal to the intelligent home according to the control instruction.
Further, the instruction generation module includes: a gesture interaction component, a head-aiming component, an eye-tracking component, a smart glove, a smart ring, a control handle, and/or a voice control component, wherein the voice control component receives control instructions input by voice. The AR remote control device further comprises a semantic recognition module, which issues query information when it detects that the user's behavior meets a preset trigger condition, and receives the object confirmation instruction the user inputs in response to the query information.
The gesture interaction component deems a user control instruction received when it detects that a gesture made by the user matches a preset gesture.
Further, the smart home remote control system further includes: the image recognition module is used for acquiring images or recognition codes of the intelligent home in the environment; and the image comparison module is used for comparing and determining the type, brand and model of the intelligent home according to the image or the identification code of the intelligent home.
Further, the smart home remote control system includes: a storage module and/or a network searcher. The storage module pre-stores images or identification codes of smart home devices of determined type, brand, and model, to be called when the image comparison module performs comparison; it also pre-stores the corresponding control instruction coding signals, which the instruction control device sends to the specific smart home device. The network searcher searches for the images or identification codes of smart home devices of determined type, brand, and model for the image comparison module, and also searches for the corresponding control instruction coding signals for the instruction control device to send.
The invention also provides a smart home remote control method, comprising the following steps:
S100: acquiring and displaying an image of the smart home;
S300: displaying the object to be controlled in the image of the smart home according to the specifically pointed-to smart home device (i.e., when a smart home confirmation instruction input by the user is received, confirming, according to that instruction, the smart home device to be controlled, i.e., the object to be controlled, in the displayed image);
S500: generating the corresponding control instruction coding signal according to the control instruction for the object to be controlled (i.e., when an event control instruction issued for the smart home device to be controlled is received, generating the corresponding coding signal according to that event control instruction);
S600: sending the control instruction coding signal to the object to be controlled, so that the object to be controlled executes the corresponding operation according to the control signal.
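Steps S100 through S600 can be sketched as a minimal pipeline. Everything here, the device names, byte codes, and the callable standing in for the camera, is an illustrative assumption; in the patent the coding signals come from a pre-stored database.

```python
# Minimal sketch of the S100-S600 remote control method. The code table
# and the recorded "transport" are invented for illustration.

CODE_TABLE = {  # hypothetical (device, event) -> control instruction coding signal
    ("tv", "power_on"): b"\xA1\x01",
    ("tv", "power_off"): b"\xA1\x00",
}

sent = []  # stand-in transport: records (device, signal) pairs

def s100_acquire_image(camera):
    return camera()                      # S100: capture the smart-home image

def s300_confirm_object(image, confirm_instruction):
    return confirm_instruction           # S300: user confirms a device in the image

def s500_encode(device, event):
    return CODE_TABLE[(device, event)]   # S500: look up the coding signal

def s600_send(device, signal):
    sent.append((device, signal))        # S600: transmit (e.g. IR/Wi-Fi) to the device

def remote_control(camera, confirm_instruction, event):
    image = s100_acquire_image(camera)
    device = s300_confirm_object(image, confirm_instruction)
    signal = s500_encode(device, event)
    s600_send(device, signal)
    return signal
```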
Further, before step S300, the method further includes: S200: when a gesture made by the user is detected to match the preset gesture, determining the specifically pointed-to smart home device in the image (the smart home confirmation instruction input by the user is deemed received).
Further, step S200 specifically includes: S210: when the first gesture made by the user is detected to match the preset first gesture, the second gesture made by the user to match the preset second gesture, and the time difference between the two gestures falls within the preset time range, the user's gesture is deemed to match the preset gestures.
Further, the step S300 includes: s310: and displaying an object to be controlled in the image of the smart home according to the voice command of the smart home (namely, when receiving the voice input smart home confirmation command, confirming the displayed object to be controlled in the image of the smart home according to the smart home confirmation command).
Further, the step S100 further includes: s150: when the behavior is detected to accord with the preset triggering condition, inquiry information is sent out; the step S300 includes: s330: and displaying an object to be controlled in the image according to the voice command of the smart home (namely, when receiving a smart home confirmation command input according to the query information, confirming the smart home to be controlled in the displayed image according to the smart home confirmation command).
Further, the step S100 of obtaining an image of the smart home includes: s110: acquiring images of the intelligent home around the view field of the AR remote control device through a camera; or, S120: and receiving the images around the field of view of the auxiliary camera module, which are sent by the auxiliary camera module.
Further, between step S300 and step S500, the method further includes: S470: searching a preset database for the type, brand, and model of the object to be controlled (i.e., for the virtual control instruction menu of the smart home device to be controlled) and displaying the virtual control instruction menu. Step S500 then specifically comprises: S510: generating the corresponding control instruction coding signal according to the control instruction issued through the virtual control instruction menu of the object to be controlled (i.e., when the user issues an event control instruction through the displayed menu, generating the corresponding coding signal according to that event control instruction).
Further, before step S470, the method further includes: S410: when the type, brand, and model of the object to be controlled (i.e., the virtual control instruction menu of the smart home device to be controlled) are not found in the preset database, scanning the identification information of the object to be controlled; S430: downloading, over the network and according to the identification information, the control instruction coding signals and/or the virtual control instruction menu of the object to be controlled into the preset database.
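The S470/S410/S430 fallback can be sketched as a local lookup with a networked download on a miss. The database contents, identification-code format, and remote catalog below are illustrative assumptions.

```python
# Sketch of S470 (look up the virtual control menu locally), S410 (scan the
# identification info when not found), and S430 (download and cache it).
# All entries are invented for illustration.

local_db = {"AC-BrandX-1000": ["power", "temp_up", "temp_down"]}

remote_catalog = {  # stand-in for the networked menu source
    "QR:tv-brandy-55": ("TV-BrandY-55", ["power", "volume_up"]),
}

def scan_identification(device_model):
    """S410: stand-in for scanning the bar/QR code on the device housing."""
    return "QR:" + device_model.lower()

def get_control_menu(device_model):
    menu = local_db.get(device_model)         # S470: try the preset database
    if menu is None:
        ident = scan_identification(device_model)
        model, menu = remote_catalog[ident]   # S430: networked download
        local_db[model] = menu                # cache into the preset database
    return menu
```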
Further, the step S100 includes: s180: according to the voice command, displaying a 3D virtual space environment or a 3D map image related to the voice command (namely, when receiving a voice control command issued by a user, acquiring and displaying the 3D virtual space environment or the 3D map image related to the voice control command); the step S300 includes: s350: and displaying the object to be controlled in the 3D virtual space environment or the 3D map image according to the intelligent home with specific pointing direction (namely, when receiving an intelligent home confirmation instruction input by a user according to the 3D virtual space environment or the 3D map image, confirming the displayed intelligent home to be controlled in the 3D virtual space environment or the 3D map image according to the intelligent home confirmation instruction).
Compared with the prior art, the AR remote control device, the intelligent home remote control system and the intelligent home remote control method have the technical effects or beneficial effects that: various intelligent electrical appliances/intelligent home furnishing are visually remotely controlled, the congenital defect of direct remote control of the existing single voice remote controller or infrared single remote controller is overcome, and one AR remote controller can drive on/off, operation and adjustment of any intelligent electrical appliance/intelligent home furnishing in home.
Drawings
The above features, technical characteristics, advantages, and implementations of the AR remote control device, smart home remote control system, and method will be further described below with reference to the accompanying drawings in a clearly understandable manner.
FIG. 1 is a schematic diagram of the AR remote control device of the present invention;
FIG. 2 is a schematic diagram of an image-type AR headset with a gesture interaction component according to the present invention;
FIG. 3 is a schematic diagram of a light transmissive AR headset with a gesture interaction component of the present invention;
FIG. 4 is a schematic diagram of a structure of a gesture interaction component sensing fingertip of the present invention;
FIG. 5 is a schematic diagram of a first embodiment of the smart home remote control system of the present invention;
FIG. 6 is a flowchart of a first embodiment of the smart home remote control method of the present invention;
FIG. 7 is a flowchart of a modification of the first embodiment of the smart home remote control method of the present invention;
FIG. 8 is a schematic diagram of a second embodiment of the smart home remote control system of the present invention;
FIG. 9 is a schematic view of an image in a head-up view of a space in an apartment according to the present invention;
FIG. 10 is a flow chart of a second embodiment of the smart home remote control method of the present invention;
FIG. 11 is a schematic diagram of a third embodiment of the smart home remote control system of the present invention;
FIG. 12 is a flow chart of a third embodiment of the smart home remote control method of the present invention;
FIG. 13 is a flowchart of a modification of the third embodiment of the smart home remote control method of the present invention;
FIG. 14 is a schematic representation of a three-dimensional overhead view of a virtual apartment space of the present invention.
Reference numerals illustrate:
100 AR remote control device/remote controller, 1 AR display device, 2 instruction control device, 21 instruction generation module, 22 communication module, 3 image recognition module, 4 image comparison module, 5 storage module, 6 network searcher, 7 virtual object generation module, 8 gesture interaction component, 200 smart home device/intelligent appliance, 300 auxiliary camera module.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will explain the specific embodiments of the present invention with reference to the accompanying drawings. It is evident that the drawings in the following description are only examples of the invention, from which other drawings and other embodiments can be obtained by a person skilled in the art without inventive effort.
For the sake of simplicity of the drawing, the parts relevant to the present invention are shown only schematically in the figures, which do not represent the actual structure thereof as a product. Additionally, in order to simplify the drawing for ease of understanding, components having the same structure or function in some of the drawings are shown schematically with only one of them, or only one of them is labeled. Herein, "a" means not only "only this one" but also "more than one" case.
Example 1
In one embodiment of a device according to the present invention, as shown in fig. 1, an AR remote control device (remote controller) 100 includes: the AR display device 1 is used for capturing images of smart home/intelligent home appliances; the instruction control device 2 is in communication connection with the AR display device 1, and is used for receiving a control instruction and sending a control instruction coding signal to the intelligent home according to the control instruction.
Specifically, AR (Augmented Reality) is a technology that calculates the position and angle of the camera image in real time and overlays corresponding images, videos, and 3D models; its goal is to fit the virtual world onto the real world on screen and allow interaction between them.
The AR display device 1 includes: an immersive head-mounted display device, an AR all-in-one unit, a digital glass mirror, a digital glass desktop, a display, a smart mobile device (smartphone, smart tablet, etc.), or an AR helmet connectable to a smart mobile device. The immersive head-mounted display device includes an image-type head-mounted display, a transmission-type head-mounted display, or a combined image/transmission head-mounted display. Wearable smart devices include, for example, AR glasses and AR helmets.
The instruction control device may accept control instructions input by the user (by voice or gesture). By content, control instructions divide into: smart home confirmation instructions, control instructions, event control instructions, etc.
The smart home confirmation/control instruction determines which smart home device the user wants to control; the event control instruction determines what event the device is controlled to perform, for example: turning on/off a television, audio system, light, air conditioner, humidifier, fan, air purifier, smart curtain, sweeping robot, webcam, smart ventilation, door access control, and the like.
The instruction control device generates the corresponding control instruction coding signal according to the received control instruction and the smart home device to be controlled (i.e., the specifically designated device), sends the coding signal to that smart home device, and thereby makes it execute the corresponding operation.
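The split between confirmation instructions (which device) and event control instructions (what the device should do) can be sketched as a small state machine. The instruction kinds and device names are illustrative assumptions.

```python
# Sketch of the confirmation/event instruction split: a confirmation
# instruction selects WHICH smart home device, an event control instruction
# says WHAT it should do. Names are invented for illustration.

state = {"target": None}

def handle_instruction(kind, payload):
    if kind == "confirm":                  # e.g. user points at / names a device
        state["target"] = payload
        return None
    if kind == "event":                    # e.g. "power_on", "volume_up"
        if state["target"] is None:
            raise RuntimeError("no smart home device confirmed yet")
        return (state["target"], payload)  # pair handed to the encoder/transmitter
    raise ValueError("unknown instruction kind: " + kind)
```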
Optionally, the instruction control apparatus 2 includes: an instruction generation module 21 for receiving the control instruction; and the communication module 22 is used for sending a control instruction coding signal to the intelligent home according to the control instruction.
Specifically, the communication module includes: an infrared emission component, NFC (Near Field Communication), Wi-Fi, Bluetooth, ZigBee, Li-Fi, Z-Wave, etc.
Li-Fi (Light Fidelity) is visible-light wireless communication, i.e., getting online by lamplight: the light emitted by LED lamps serves as the transmission medium for the network signal, and information is transmitted wirelessly in rapid light pulses. No Wi-Fi signal is needed; a user can get online under an LED lamp. The principle is to encode information in the light at different rates, e.g., LED on for 1 and off for 0, transmitting information through fast, high-frequency switching. An LED optical network carries the network signal over visible light and can directly reuse the existing lighting of flashlights, lamps, street lights, and indoor and public illumination, so the lamps perform the double duty of illumination and networking. Li-Fi is characterized by low radiation, low energy consumption, low carbon footprint, and environmental friendliness.
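The on/off principle mentioned above (LED on for 1, off for 0) amounts to on-off keying. The toy round-trip below illustrates only that idea; real Li-Fi uses far more elaborate modulation and framing.

```python
# Toy on-off-keying illustration of the Li-Fi principle: bytes become a
# string of LED states ('1' = on, '0' = off) and back. This is only the
# on/off idea from the text, not a real Li-Fi modulation scheme.

def to_light_pulses(data):
    """Encode bytes as a string of LED states, MSB first."""
    return "".join(format(byte, "08b") for byte in data)

def from_light_pulses(pulses):
    """Decode a string of '1'/'0' LED states back into bytes."""
    return bytes(int(pulses[i:i + 8], 2) for i in range(0, len(pulses), 8))
```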
The instruction generation module 21 includes: a gesture interaction component, a head-aiming component, an eye-tracking component, a smart glove, a smart ring, a control handle, and/or a voice control component. The different components divide control instructions by input mode into: gesture instructions, voice instructions, head-aiming instructions, eye-movement instructions, and/or menu instructions.
When the head-aiming component is used in AR glasses or an AR helmet, it relies on an IMU sensor such as a gyroscope in the glasses or helmet to determine orientation.
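One plausible way such head aiming could select a device is to compare the IMU yaw angle against known device bearings; the patent does not specify this algorithm, and the bearings and tolerance below are invented for illustration.

```python
# Hypothetical head-aim selection sketch: the device whose known bearing is
# closest to the IMU yaw (within a tolerance) is treated as the aimed-at
# smart home device. Bearings and tolerance are invented.

DEVICE_BEARINGS = {"tv": 0.0, "air_conditioner": 90.0, "lamp": 200.0}
TOLERANCE_DEG = 15.0

def angular_diff(a, b):
    """Smallest absolute difference between two angles in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def aimed_device(yaw_deg):
    best = min(DEVICE_BEARINGS,
               key=lambda name: angular_diff(yaw_deg, DEVICE_BEARINGS[name]))
    if angular_diff(yaw_deg, DEVICE_BEARINGS[best]) <= TOLERANCE_DEG:
        return best
    return None  # aiming at nothing registered
```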
All components in the instruction generation module essentially realize the positioning and clicking functions of a mouse.
The gesture interaction component is used for considering that the control input by the user is received when the gesture action made by the user is detected to be the same as the preset gesture; specifically, the gesture interaction means is a motion sensor for gestures, for example: LEAP MOTION, uSens Fingo, etc.
The voice control component is used for receiving a control instruction of voice input, and includes a microphone and a semantic speech determination sub-module.
Preferably, the gesture interaction component 8 is configured as follows: it considers that the gesture action made by the user is the same as the preset gesture when it detects that a first gesture action made by the user is the same as a preset first gesture among the preset gestures, a second gesture action made by the user is the same as a preset second gesture among the preset gestures, and the time difference between the first and second gesture actions is within a preset time range.
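The two-gesture rule above can be sketched as a simple check. Gesture names and the window length are illustrative assumptions, not values from the patent:

```python
# Illustrative sketch (not the patent's implementation): a confirmation is
# accepted only when the first and second gestures each match their presets
# and occur within a preset time window of each other.
PRESET_FIRST = "point_at"      # assumed name of the preset first gesture
PRESET_SECOND = "finger_snap"  # assumed name of the preset second gesture
MAX_INTERVAL = 2.0             # assumed preset time range, in seconds

def matches_preset(first, second, t_first, t_second,
                   max_interval=MAX_INTERVAL):
    """Return True when both gestures match and their time gap fits."""
    if first != PRESET_FIRST or second != PRESET_SECOND:
        return False
    return 0 <= t_second - t_first <= max_interval
```

The ordering constraint (second gesture after the first) falls out of the non-negative time difference.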
The AR remote control device 100 further includes: the image recognition module 3 is configured to obtain an image or an identification code (the identification code may be a bar code, a two-dimensional code, an AR recognition trigger flag, etc.) of the smart home in the environment; and the image comparison module 4 is electrically connected with the image recognition module 3 and is used for comparing and determining the type, brand and model of the intelligent home according to the image or the recognition code of the intelligent home.
The storage module 5 is electrically connected with the image comparison module 4 and the instruction control equipment, and is used for pre-storing the images or identification codes of smart homes of determined types, brands and models, to be called when the image comparison module performs its comparison. The storage module is further used for pre-storing the control instruction coding signals (and virtual control instruction menus) of smart homes of determined types, brands and models, the control instruction coding signals being sent to the specific smart home by the instruction control equipment.
Specifically, the working principle of the gesture interaction component 8 is explained using Leap Motion as an example: it is a micron-level 3D hand interaction device that can track motion as small as 0.01 mm, has a 150-degree field of view, can track the motion of all 10 fingers of one person, and runs at up to 290 frames per second. The space created by Leap Motion captures gestures and reconstructs the skeleton of the hand and arm; one human hand has 29 bones, 29 joints, 123 ligaments, 48 nerves and 30 arteries. The Leap Motion controller carries an almost complete model of this anatomy, which means it will not mistake a non-human-hand object for a hand when detecting gesture movements.
The Leap Motion space is overlaid onto the AR three-dimensional display space, so gestures can interact with, and receive feedback from, objects in the AR three-dimensional display space, with the Leap Motion controller tracking them accurately. Technically, this is an interactive 3D space of 8 cubic feet. The Leap Motion controller can track all 10 fingers with an accuracy of up to 1/100 mm, far more accurate than existing motion control techniques. With a 150-degree ultra-wide field of view, the user can move the hands in 3D space as freely as in the real world. The Leap Motion controller tracks the user's hand movements at over 200 frames per second, keeping the virtual hand closely synchronized with the real one.
The identified objects include the Thumb, Index finger, Middle finger, Ring finger and Pinky finger; for each, its initial position start(X, Y, Z), end position end(X, Y, Z) and direction (pitch, roll, yaw) can be acquired.
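The per-finger data listed above can be held in a small container like the following. The class and the pitch/yaw derivation are illustrative assumptions for this document, not the Leap Motion SDK's actual API:

```python
import math
from dataclasses import dataclass

# Illustrative container for the per-finger data described above: a start
# position, an end position, and an orientation derived from the fingertip
# direction vector. Names and conventions are assumptions, not the Leap SDK.
@dataclass
class FingerPose:
    name: str      # "Thumb", "Index", "Middle", "Ring", "Pinky"
    start: tuple   # (x, y, z) initial position
    end: tuple     # (x, y, z) end position

    def direction(self):
        """Unit vector pointing from start to end."""
        dx, dy, dz = (e - s for s, e in zip(self.start, self.end))
        length = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
        return (dx / length, dy / length, dz / length)

    def pitch_yaw(self):
        """Pitch and yaw (radians) of the pointing direction."""
        dx, dy, dz = self.direction()
        pitch = math.asin(max(-1.0, min(1.0, dy)))
        yaw = math.atan2(dx, dz)
        return pitch, yaw
```

Roll is omitted since it cannot be recovered from a single direction vector; the real SDK derives it from the full hand basis.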
Fig. 2 is an AR head-mounted display (video see-through HMD) with a gesture interaction component, and Fig. 3 is another AR head-mounted display (optical see-through HMD) with a gesture interaction component.
Specific examples of identifying the control instruction by the gesture interaction component 8 are as follows:
1. Build a 3D virtual scene space with the Unity3D engine, and create certain 3D virtual objects in the virtual scene space, such as a virtual control instruction menu.
2. Access the six-degrees-of-freedom (6DoF) SDK produced by Qualcomm. An SDK (Software Development Kit) is generally a collection of development tools with which software engineers create application software for a particular software package, software framework, hardware platform, operating system, etc. The position of the device in the virtual scene is calculated from the gyroscope positioning data on the device and mapped into the virtual scene space created by Unity3D, realizing a 6DoF effect that allows rotating and walking in 3D space.
3. Access the gesture-recognition (spatial parameter) SDK provided by Leap Motion, and add a hand model (comprising hand and arm) to the virtual scene space. The Leap Motion driver and hardware device support are required here. The Leap SDK passes the gesture information parameters detected by the driver to Unity3D; mapping this information onto the hand model simulates the real hand as a virtual hand presented in the virtual 3D scene space.
4. The gesture information is analyzed and calculated in the Unity3D engine to obtain specific gesture shapes, for example a "point-at" action made with the index finger and a "finger snap" made with the middle finger and thumb, where the "point-at" corresponds to selection with a mouse and the "finger snap" corresponds to a mouse click.
Analysis yields the beginning and end of the "point-at" action. What is pointed at, i.e. the smart home or the virtual control instruction menu, is determined by the extension of the index fingertip (as shown in Fig. 4). When the spatial distance between the index fingertip and the smart appliance or virtual control instruction menu is smaller than a certain threshold, the "point-at" state is entered; when, after a few seconds, this distance becomes larger than the threshold, the "point-at" state is left, and the whole process is recognized as one "point-at" action.
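The enter/leave logic above can be sketched as a small state holder. Threshold values are invented for illustration; the slightly larger leave threshold (hysteresis) is a common design choice to avoid flicker at the boundary, not something the patent specifies:

```python
import math

# Minimal sketch of the enter/leave logic described above: the "point-at"
# state is entered when the index fingertip comes within a threshold
# distance of a target, and left when it moves back beyond a (slightly
# larger) threshold. All values are assumed, in metres.
ENTER_THRESHOLD = 0.05
LEAVE_THRESHOLD = 0.08   # hysteresis, an added design choice

class PointAtDetector:
    def __init__(self):
        self.active = False

    def update(self, fingertip, target):
        """Feed one (x, y, z) sample pair; return whether 'point-at' is active."""
        d = math.dist(fingertip, target)
        if not self.active and d < ENTER_THRESHOLD:
            self.active = True    # "point-at" state entered
        elif self.active and d > LEAVE_THRESHOLD:
            self.active = False   # "point-at" state left
        return self.active
```

With a single shared threshold, tiny tracking jitter near the boundary would toggle the state rapidly; the two-threshold variant keeps one enter and one leave event per gesture.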
It is worth mentioning that if the "point-at" target is far away and the image of the specific smart home is not clear enough, the distant smart home can be zoomed in and displayed through the wide-angle camera on the AR device, after which the "point-at" operation is performed.
5. Once the "point-at" action can be identified, the beginning and end of the "finger snap" action are analyzed according to the distance between the fingertip of the middle finger and the fingertip of the thumb (as shown in Fig. 4): when the distance between the two fingertips is smaller than a certain threshold, the "snap" state is entered, and when it becomes larger than the threshold, the "snap" state is left. The "point-at" and the "snap" form a joint operation: there is a time requirement between completing the former action and the latter, and if it is exceeded, the two actions will not trigger any operation even when performed one after the other. Of course, the "confirm" action can be freely set and is not limited to the gestures described above.
If the "point-at" and "snap" actions are both recognized within the interval time, a gesture instruction is considered to have been received, and the smart home to be controlled, as well as what to do with it, is determined from the smart home pointed at and the virtual control instruction menu.
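Putting the two gestures together, the dispatch logic can be sketched as below. The event names, the window length, and the command string format are all assumptions for illustration:

```python
# Sketch of the combined rule described above: a "point-at" event selects a
# target, and a "finger snap" within the allowed interval turns the
# selection into a command. Names and timings are assumptions.
SNAP_WINDOW = 2.0  # assumed seconds allowed between pointing and snapping

class GestureCommander:
    def __init__(self):
        self.target = None
        self.t_pointed = None

    def on_point_at(self, target, t):
        """Record what was pointed at and when."""
        self.target = target
        self.t_pointed = t

    def on_snap(self, t):
        """Return a command string, or None if no valid pointing preceded it."""
        if self.target is None or t - self.t_pointed > SNAP_WINDOW:
            return None
        cmd, self.target = "confirm:" + self.target, None
        return cmd
```

A snap arriving too late leaves the system idle, matching the requirement that the two actions cause no operation when the time window is exceeded.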
6. The Unity3D engine publishes application programs for the corresponding hardware platforms (including Android, iOS, PSP, Xbox, PC, etc.).
The virtual control instruction menu displayed by the AR display device is led out from the specific smart home to be controlled; that is, on the display of the AR display device one sees a scene in which virtual objects and real entities are combined. The virtual control instruction menu may be displayed as text buttons, icon buttons, or a segment of animation or video. Preferably, the AR remote control device also includes a speaker and provides audio or acoustic feedback, so as to enhance the user's experience when interacting with the virtual control instruction menu.
Besides control technologies such as head aiming (helmet sight), eye tracking and smart gloves, the AR glasses can be combined with 2 further control technologies: one is (bare-hand) gesture control (requiring the cooperation of motion-sensing hardware), and the other is voice control (requiring the cooperation of a microphone and hardware such as a voice and semantic judgment module). The AR remote controller controls the corresponding smart home according to the control instruction input by the user, replacing the original remote controllers; the user can issue control instructions in different ways, which gives multiple choices and broadens the range of applicable users.
A first system embodiment according to the present invention, as shown in fig. 5, comprises: a smart home 200 (i.e. each smart home 1-n in fig. 5) and an AR remote control device 100 (i.e. the AR remote controller in fig. 5), the AR remote control device 100 being configured to send a control command encoding signal to the smart home.
The AR remote control device in this embodiment is the AR remote control device in the above embodiment, and the same parts are not described here again. Smart homes/smart appliances include, but are not limited to: television, stereo, lighting lamp, air conditioner, washing machine, refrigerator, humidifier, electric fan, air purifier, smart curtain, sweeping robot, network camera, smart ventilation/exhaust mechanism, door control, microwave oven, electric cooker, oven, range hood, egg beater, bread machine, high-speed blender, computer, PAD, mobile phone, socket (power strip), alarm clock, smoke detector, air switch, etc.
The AR remote control device 100 includes an instruction control device, configured to receive the control instruction, and send a control instruction encoding signal to the smart home according to the control instruction. The control instructions include gesture instructions, voice instructions, head-of-sight instructions, eye-movement instructions, and/or menu instructions.
The instruction control apparatus includes: the instruction generation module is used for receiving the control instruction; and the communication module is used for sending a control instruction coding signal to the intelligent home according to the control instruction.
Optionally, the instruction generation module includes: a gesture interaction component, a head aiming component (helmet sight), an eye tracking component, a smart glove, a smart ring, a control handle and/or a voice control component. The voice control component is used for receiving a control instruction of voice input; the gesture interaction component is configured to consider that a control input by the user has been received when the gesture action made by the user is detected to be the same as the preset gesture.
The intelligent home remote control system further comprises: the image recognition module is used for acquiring images or recognition codes of the intelligent home in the environment; the image comparison module is used for comparing and determining the type, brand and model of the intelligent home according to the image or the identification code of the intelligent home;
The storage module is used for pre-storing images or identification codes of the intelligent home with determined types, brands and models and is used for being called when the image comparison module compares the images; the storage module is also used for pre-storing control instruction coding signals of the intelligent home with determined types, brands and models, and the control instruction coding signals are sent to the specific intelligent home by the instruction control equipment;
the network searcher is used for searching the images or the identification codes of the intelligent home with the determined types, brands and models and calling the images or the identification codes when the images are compared by the image comparison module;
the network searcher is further used for searching the control instruction coding signals of the intelligent home with the determined types, brands and models, and the control instruction coding signals are sent to the specific intelligent home by the instruction control equipment;
the virtual object generation module is used for assisting the instruction control equipment to clarify the control instruction; the virtual object generation module includes: virtual control instruction menu.
In other embodiments, the smart home remote control system may include only one storage module or web searcher.
The same parts of the AR remote control device in this embodiment as those of the AR remote control device in the above device embodiment are not described here again. The smart home 200 is communicatively connected to the AR remote control device 100.
The intelligent home remote control system of the embodiment can control the intelligent home by using the AR remote control device, has wide application range and convenient use, and greatly improves the use experience of users.
According to a first method embodiment of the present invention, applied to the smart home 200 and the AR remote control device 100 of the above system embodiment, as shown in Fig. 6, the method includes:
S100: acquiring and displaying an image of the smart home; when the image of the smart home is small (i.e. the appliance is far away), it can be zoomed in and enlarged through the wide-angle camera;
S200: when the detected gesture action is the same as the preset gesture, a smart home confirmation instruction/control instruction input by the user is considered to have been received;
S300: when an input smart home confirmation instruction/control instruction is received, identifying the object to be controlled (the selected smart home to be controlled) in the displayed image of the smart home according to the instruction;
S410: when the virtual control instruction menu of the smart home to be controlled is not found in a preset database (local library), scanning the identification information of the smart home to be controlled; S430: downloading, over the network, the virtual control instruction menu of the smart home to be controlled into the preset database according to the identification information;
S470: searching the preset database for the virtual control instruction menu of the smart home to be controlled, and displaying the virtual control instruction menu;
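Steps S410/S430/S470 amount to a cache-with-network-fallback lookup, which can be sketched as below. The database contents and the fetch function are stand-ins invented for illustration, not a real service:

```python
# Hedged sketch of steps S410/S430/S470: look the appliance's virtual menu
# up in a local preset database first; on a miss, scan its identification
# code, fetch the menu over the network, and cache it for next time.
local_db = {"tv-brandA-m1": ["power", "volume+", "volume-"]}  # assumed data

def fetch_menu_online(ident_code):
    """Placeholder for the networked download in S430."""
    return ["power", "mode", "temp+", "temp-"]  # assumed payload

def get_menu(ident_code):
    menu = local_db.get(ident_code)
    if menu is None:                          # S410: not found locally
        menu = fetch_menu_online(ident_code)  # S430: networked download
        local_db[ident_code] = menu           # cache into the preset database
    return menu                               # S470: found, ready to display
```

After the first miss, subsequent lookups for the same appliance are served locally, which matches the embodiment where the downloaded menu is stored into the preset database.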
S500: when an event control instruction issued for the smart home to be controlled is received, generating a corresponding control instruction coding signal according to the event control instruction. Optionally, step S500 specifically includes: S510: when an event control instruction issued by the user according to the displayed virtual control instruction menu of the smart home to be controlled is received, generating a corresponding control instruction coding signal according to the event control instruction (the event control instruction may be input by voice, gesture, etc., depending on the modes supported by the AR remote controller);
S600: sending the control instruction coding signal to the smart home to be controlled, so that the smart home to be controlled executes the corresponding operation according to the control instruction coding signal.
Preferably, the step S200 specifically includes:
S210: when the detected first gesture is the same as the preset first gesture among the preset gestures, the detected second gesture is the same as the preset second gesture among the preset gestures, and the time difference between the first and second gestures is within the preset time range, the detected gesture is considered to be the same as the preset gestures.
Specifically, take AR glasses serving as the AR display device, together with a gesture interaction component, as an example:
the user aims at a certain intelligent electrical appliance through an (depth or wide angle) camera on the AR glasses/the helmet, and the user inputs an intelligent home confirmation instruction/control instruction in a gesture mode. For example: the middle refrigerator is selected, the sounding finger represents confirming selection of the refrigerator, the whole operation represents intelligent house confirming instructions/control instructions, and the refrigerator is confirmed to serve as the intelligent house to be controlled.
The specific smart home is determined by comparing the smart appliance database (i.e. the local preset database) in the memory of the AR glasses with the graphic image/identification code of the smart home to be controlled transmitted by the camera. If necessary, a corresponding virtual control instruction menu can then be obtained; the menu displays more detailed operation instructions and/or options. The user enters the virtual control instruction menu of the smart home through the gesture interaction component and inputs event control instructions such as query, on/off or adjustment. After the corresponding control instruction coding signal is generated, the short-range communication chip on the AR glasses (i.e. the instruction sender) sends it to the smart home, which is equipped with a corresponding short-range communication chip (instruction receiver); the smart appliance performs the related operations after receiving the control instruction coding signal, thereby responding to the user's control instruction.
When the AR glasses cannot find the corresponding virtual control instruction menu and control instruction coding signal in the smart appliance database, both can be downloaded over the network according to the graphic image/identification code of the smart home to be controlled.
In other embodiments, there may be no local preset database, with a direct network search performed instead; or there may be only a local preset database, without a network search function.
It should be noted that, as shown in Fig. 7, which is a flowchart of a modification of the first embodiment of the smart home remote control method of the present invention, S200 and S300 of this embodiment may be supplemented or replaced by S310. S310: when a voice-input smart home confirmation instruction/control instruction is received, confirming the smart home to be controlled in the displayed image of the smart home according to the instruction.
The basic smart home remote control in this embodiment is a visual remote control of smart appliances, overcoming the inherent shortcomings of voice-only remote control or direct remote control with a conventional remote controller; the AR remote controller can drive the switching, operation and adjustment of any smart appliance in the home. A remote controller based on AR technology achieves a more intuitive and efficient remote control effect, whether at short range or over a local or wide area network.
Specifically, in a modification of this embodiment, the user may input the smart home confirmation instruction/control instruction by voice, and likewise the (event) control instruction may also be input by voice. Because the virtual selection menu is displayed or prompted in the AR glasses/helmet, voice assistance is more convenient, e.g. "page turning", "next line", "left/right", "confirm"; that is, even if the user is unfamiliar with the system's specific voice control commands, performing simple spoken operations under the display of the virtual menu makes the whole operation process smoother and more efficient for some users.
The AR remote control device or the network stores various image and graphic databases of the smart homes, control instruction coding signals, user usage records and the like, making appliance control more humanized and more convenient, and providing powerful background support for further development and improvement of the user experience.
The technical effects or beneficial effects of the AR remote control device, the intelligent home remote control system and the method of the embodiment are as follows: the intelligent electric appliance remote control device has the advantages that the intelligent electric appliance remote control device is high in universality, the intelligent electric appliance is incorporated into a remote control system, remote control can be realized only through one AR remote controller, the virtual menu function is convenient for remote control implementation, and the voice auxiliary function enriches remote control means.
Based on the AR combination of the virtual (virtual selection menu) and the real (specific smart appliance), any user of the remote controller will find it concise and convenient; in particular it is a boon for patients who have (temporarily) lost the ability to speak or to control their muscles, and, assisted by the language semantic recognition technology, it also brings warmth to the elderly and children.
Embodiment Two
According to the above embodiments of the present invention, including improvements of the device, system and method embodiments, apart from the parts that are the same, as shown in Figs. 1 and 8, the smart home remote control system further includes: an auxiliary camera module 300, communicatively connected to the AR remote control device 100, configured to acquire images of the remote smart home/smart appliance and send them to the AR remote control device 100. To control a smart appliance that is not in front of the user, network technology must be introduced; within the home, a Wi-Fi signal can serve as the carrier of the control instruction.
The auxiliary camera module 300 further includes: a network camera, a robot-mounted camera or a drone-mounted camera. Wide area network or IoT technology is needed to remotely control smart appliances in the home from the office, as shown in Fig. 9. The network camera in the home cooperates with the AR glasses for control: the AR glasses connect to the indoor network camera through a local or wide area network, view the specific smart appliance through the network camera, display its virtual selection menu, and then perform head aiming, voice, gesture or other operations. The router in the home forms a control center for command forwarding, and each networked smart appliance receives the instruction signal from the router.
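The command-forwarding idea can be demonstrated with a loopback sketch: the controller sends a control instruction coding signal over the LAN and a networked appliance receives it. In a real home the router would relay the datagram; here both ends run on localhost, and the payload format is invented:

```python
import socket

# Loopback demonstration of the forwarding described above: the AR remote
# controller sends a control-instruction coding signal and a networked
# appliance (a UDP listener on localhost) receives it. Addresses, ports
# and the payload format are arbitrary illustration choices.
def demo_forward(code: bytes) -> bytes:
    appliance = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    appliance.bind(("127.0.0.1", 0))   # appliance listens on an OS-chosen port
    appliance.settimeout(2.0)
    port = appliance.getsockname()[1]

    controller = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    controller.sendto(code, ("127.0.0.1", port))  # the router would relay this

    received, _ = appliance.recvfrom(1024)
    controller.close()
    appliance.close()
    return received
```

UDP is used only to keep the sketch short; the embodiment itself leaves the transport (Wi-Fi, ZigBee, etc.) open.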
Through the internet environment, the range of identifiable smart appliance objects is greatly expanded: the cloud databases of all major brand manufacturers can be accessed, the latest remote control operation instructions searched, and remote operation performed over the Internet of Things (LoRa or NB-IoT); of course, each smart appliance must have an IoT communication chip embedded so as to receive instruction signals from the Internet of Things.
Every smart appliance has its own logo; by recognizing the logo, the AR device can more easily lock onto the brand manufacturer, access the manufacturer's cloud database, and determine the specific model or style of the smart appliance.
An intelligent home remote control method, as shown in Fig. 10, in which step S100 of obtaining the image of the smart home includes: S110: acquiring images of the smart home within the field of view of the AR remote control device through a camera; or S120: receiving forwarded images of the smart home within the field of view of the auxiliary camera module.
Specifically, take a robot-mounted or drone-mounted camera communicatively connected to the AR remote controller as an example: the on-board camera can move under the user's control and send the acquired images to the AR remote controller, so that a user wearing the AR remote controller can remotely control the smart home, realizing remote control of the smart home in space/room B from space/room A.
In addition, compared with a fixed network camera, a smart mobile small robot is flexible: equipped with an indoor positioning sensor, it can be driven into any room in the home, search and observe through its eyes (cameras), perform AR "capture" of the smart homes/appliances, and thus realize remote control.
The smart mobile terminal (mobile phone or PAD) can also send wireless remote control instruction signals to the smart appliance by SMS, WeChat, MMS, APP or cellular mobile communication.
The remote control modes on the AR remote control device are also various, including head aiming (helmet sight), eye tracking, voice, smart gloves, control handles, bare-hand gestures and interaction with a virtual caretaker, which greatly enriches the implementation possibilities and application range of the AR remote control device.
The technical effects or beneficial effects of the AR remote control device, the intelligent home remote control system and the method of the embodiment are as follows: in the embodiment, the long distance is realized by the auxiliary camera module based on remote control of the Internet or the Internet of things, and the visual remote control operation can be performed on any intelligent home/intelligent electric appliance in any space of the home, so that the method is accurate, specific and visual. The remote controller based on the AR technology has more visual and efficient remote control effect no matter in short distance or in local area network or wide area network.
The AR device or the network stores various smart appliance image and graphic databases, remote control instruction databases, user usage records and the like, making smart appliance control more humanized and more convenient, and providing powerful background support for further development and improvement of the user experience.
Embodiment Three
According to the above two device/system embodiments of the present invention, based on an improvement of the first AR remote control device 100, apart from the same parts, the virtual object generation module further includes: virtual control instruction menus, virtual intermediaries, 3D virtual space environments, and/or 3D map imagery.
The voice control component is used for receiving a control instruction input by the user's voice; for issuing inquiry information when the user's behavior is detected to meet a preset trigger condition; and for receiving a control instruction input by the user in response to the inquiry information.
Based on an improvement of the first system embodiment, as shown in Fig. 11, apart from the same parts, the virtual object generation module in the smart home remote control system further includes: virtual intermediaries, 3D virtual space environments, and/or 3D map imagery. The voice control component is used for receiving a control instruction input by the user's voice; for issuing inquiry information when the user's behavior is detected to meet a preset trigger condition; and for receiving a control instruction input by the user in response to the inquiry information.
According to the foregoing method embodiments, based on an improvement of the first method embodiment, an intelligent home remote control method based on a virtual intermediary includes:
Before step S100, the method further includes: S150: when the detected behavior meets a preset trigger condition (for example, the user's gesture action is the same as a preset gesture, the user gazes at the same place for a certain time, the ambient light stays below a certain level for a certain time, etc.), issuing inquiry information (via the virtual intermediary).
Step S300 includes: S330: when a smart home confirmation instruction/control instruction input by voice in response to the inquiry information is received, confirming the smart home to be controlled, or its virtual image, in the displayed image according to the instruction.
It should be noted that S410, S430 and S470 are optional steps.
Specifically, take an AR remote controller whose AR display device is an AR helmet as an example:
When the virtual intermediary (e.g. a butler sprite) detects through the eye-tracking camera that the user has stared at the air conditioner for longer than a preset duration (e.g. 5 seconds), the user's behavior is considered to meet the preset trigger condition, and the butler sprite issues the inquiry "Master, do you need to open the virtual control instruction menu of the air conditioner?". The user may input the smart home confirmation instruction in different ways, such as gesture or voice; for example, the user answers "Yes", which is the smart home confirmation instruction for the air conditioner's virtual control instruction menu. The corresponding virtual control instruction menu is then searched for in the preset database or on the internet and displayed for the user, and the user issues an event control instruction according to the virtual control instruction menu.
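The gaze-dwell trigger in this example can be sketched as an accumulator over eye-tracking samples. The 5-second dwell matches the example above; the class name and query wording are illustrative:

```python
# Sketch of the trigger condition described above: accumulate the time the
# user's gaze stays on one appliance, and fire the butler's inquiry once a
# preset dwell time is reached. Names and wording are assumptions.
DWELL_SECONDS = 5.0

class GazeTrigger:
    def __init__(self):
        self.current = None   # appliance currently gazed at
        self.elapsed = 0.0    # accumulated dwell time on it

    def update(self, target, dt):
        """Feed one eye-tracking sample; return an inquiry string when triggered."""
        if target != self.current:
            self.current, self.elapsed = target, 0.0  # gaze moved: reset
        elif target is not None:
            self.elapsed += dt
            if self.elapsed >= DWELL_SECONDS:
                self.elapsed = 0.0
                return f"Master, open the control menu of the {target}?"
        return None
```

Resetting the accumulator whenever the gaze target changes means only a continuous stare meets the trigger condition, as in the embodiment.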
Another example: when the light sensor at the window frame senses that the light intensity has weakened to a certain value, the butler sprite issues the inquiry "Master, shall I draw the curtains of all the rooms?". After confirmation, the butler sprite issues a curtain-drawing instruction to all the smart curtains of all rooms; the smart homes to be controlled corresponding to the smart home confirmation instruction are the smart curtains of all rooms, and the event control instruction is to draw the curtains.
It should be noted that S100, S150 and S330 in this embodiment may be replaced by S180 and S350 described below.
As shown in Fig. 13, S180: when a voice control instruction issued by the user is received, acquiring and displaying a 3D virtual space environment or 3D map image related to the voice control instruction; S350: when a smart home confirmation instruction input by the user according to the 3D virtual space environment or 3D map image is received, confirming the smart home to be controlled in the displayed 3D virtual space environment or 3D map image according to the instruction.
Specifically, in some cases the voice control instruction issued by the user is not specific enough. For example, the user says: "Housekeeper sprite, please turn on the air conditioner, cooling mode, 26 °C." A planar or three-dimensional scene of the housekeeper sprite and the apartment of fig. 14 then appears in the AR helmet, and the housekeeper sprite points to a room (for example, the user's home is a three-room apartment) and asks: "Master, do you mean the air conditioner of this room?" If the user denies, the housekeeper sprite points to another room and continues asking. Once the user confirms, the smart home to be controlled is determined, and the housekeeper sprite turns on the air conditioner of that room according to the previous event control instruction: cooling mode, 26 °C.
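The room-by-room disambiguation dialogue can be sketched as a loop over candidate rooms. The room list and the `confirm` callback are illustrative assumptions standing in for the gesture or voice yes/no answer.

```python
def disambiguate_room(rooms, confirm):
    """Ask about each candidate room in turn until the user confirms one.

    `confirm(room)` stands in for the user's gesture/voice yes-no answer.
    Returns the confirmed room, or None if every room is denied.
    """
    for room in rooms:
        if confirm(room):
            return room
    return None


# The user meant the bedroom in a three-room apartment.
answers = {"living_room": False, "bedroom": True, "study": False}
room = disambiguate_room(["living_room", "bedroom", "study"], answers.get)
if room is not None:
    # Apply the previously captured event control instruction.
    print(f"Turning on the {room} air conditioner: cooling mode, 26 °C")
```

The key point the patent makes is that the ambiguous part of the instruction (which room) is resolved interactively, while the unambiguous part (mode and temperature) is carried over unchanged.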
The housekeeper sprite appears in conjunction with the three-dimensional scene of the apartment (as shown in fig. 14), or within the field of view of a certain spatial scene, as shown in fig. 9. In this embodiment, a virtual management intermediary, the housekeeper sprite, is generated on the AR remote control device; it interacts smoothly with the user of the remote control and clarifies the user's instruction object, specific instruction and instruction degree. This greatly simplifies the implementation of the technique, and the result is more intelligent, highly reliable and widely applicable.
The technical effects or beneficial effects of the AR remote control device, the smart home remote control system and the method of this embodiment are as follows: on AR equipment, including an AR all-in-one device, AR glasses, a smartphone or a PAD, a virtual management intermediary, the housekeeper sprite, is generated; it interacts smoothly with the user of the remote control and clarifies the user's instruction object, specific instruction and instruction degree, which greatly simplifies the implementation of the technique and yields stronger intelligence, high reliability and a wide application range.
The remote control modes on the AR equipment are also various, including head aiming (helmet sight), eye-movement tracking, voice, smart gloves, control handles, freehand gestures, and interaction with the virtual housekeeper, which greatly enriches the implementation possibilities and application range of the AR remote control equipment.
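Among these modes, the freehand-gesture mode recited in the claims matches a first and a second gesture against presets within a preset time range. A minimal sketch of that check follows; the gesture labels and the 2-second window are assumptions for illustration, not values from the patent.

```python
PRESET_FIRST = "point"   # assumed preset first gesture
PRESET_SECOND = "pinch"  # assumed preset second gesture
TIME_WINDOW_S = 2.0      # assumed preset time range between the two gestures


def gestures_match(first, second):
    """Claim-style check: both detected gestures match their presets and the
    time difference between them is within the preset time range."""
    (g1, t1), (g2, t2) = first, second
    return (g1 == PRESET_FIRST
            and g2 == PRESET_SECOND
            and abs(t2 - t1) <= TIME_WINDOW_S)


print(gestures_match(("point", 0.0), ("pinch", 1.2)))  # within the window
print(gestures_match(("point", 0.0), ("pinch", 3.5)))  # too slow: rejected
```

Requiring two distinct gestures inside a short window is a common guard against accidental activation: a single stray hand movement cannot trigger a control instruction on its own.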
It should be noted that the step identifiers in the flow charts of this patent specification do not limit the order of the steps; the sequence may be adjusted where reasonable.
It should be noted that the above embodiments can be freely combined as needed. The foregoing is merely a preferred embodiment of the present invention; those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications are also to be regarded as within the scope of the present invention.
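Putting the pieces together, the S100 → S200/S300 → S500 → S600 flow recited in the method claims below can be sketched as a minimal pipeline. Every function name, the stub camera, and the signal format are illustrative assumptions; the patent does not specify an encoding.

```python
def acquire_image(camera):
    """S100: acquire and display the image of the smart home (camera stubbed)."""
    return camera()


def determine_target(image, pointed_device):
    """S200/S300: determine the specifically pointed-to device in the image."""
    return pointed_device if pointed_device in image else None


def encode_instruction(device, instruction):
    """S500: generate the control instruction coding signal (format assumed)."""
    return f"{device}:{instruction}"


def send_signal(bus, signal):
    """S600: send the coding signal to the object to be controlled."""
    bus.append(signal)
    return signal


bus = []
image = acquire_image(lambda: {"air_conditioner", "curtain", "tv"})
target = determine_target(image, "air_conditioner")
if target:
    send_signal(bus, encode_instruction(target, "power_on"))
print(bus)
```

The sketch only shows the ordering of the four steps; in the device itself, acquisition, target determination, and signal transmission are performed by the AR display equipment, the gesture interaction component, and the communication module respectively.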

Claims (24)

1. An AR remote control device, comprising:
the AR display equipment is used for capturing images of the intelligent home;
the instruction control equipment is used for receiving a control instruction and sending a control instruction coding signal to the intelligent home according to the control instruction;
the instruction control apparatus includes:
the instruction generation module is used for receiving the control instruction;
the communication module is used for sending a control instruction coding signal to the intelligent home according to the control instruction;
the instruction generation module includes: a gesture interaction component;
wherein, in detecting that a gesture action is the same as a preset gesture, the gesture interaction component is specifically configured as follows:
the gesture interaction component considers the detected gesture to be the same as the preset gesture when the detected first gesture is the same as a preset first gesture among the preset gestures, the detected second gesture is the same as a preset second gesture among the preset gestures, and the time difference between the first gesture and the second gesture is within a preset time range.
2. The AR remote control device of claim 1, wherein the instruction generation module comprises: a head-aiming component, an eye-tracking component, a smart glove, a smart ring, a control handle, and/or a voice control component;
the voice control component is used for receiving the control instruction input by voice.
3. The AR remote control of claim 1, wherein the control instructions comprise gesture instructions, voice instructions, head-aiming instructions, eye-movement instructions, and/or menu instructions.
4. The AR remote control device of claim 1, further comprising:
the image recognition module is used for acquiring images or recognition codes of the intelligent home in the environment;
and the image comparison module is used for comparing and determining the type, brand and model of the intelligent home according to the image or the identification code of the intelligent home.
5. The AR remote control device of claim 4, further comprising: a storage module, used for pre-storing images or identification codes of intelligent homes of determined type, brand and model, to be called when the image comparison module performs comparison; and/or,
the storage module is used for pre-storing control instruction coding signals of intelligent homes of determined type, brand and model, the control instruction coding signals being sent to the specific intelligent home by the instruction control equipment.
6. The AR remote control device of claim 4 or 5, further comprising: a network searcher, used for searching for images or identification codes of intelligent homes of determined type, brand and model, to be called when the image comparison module performs comparison; and/or,
the network searcher is used for searching for control instruction coding signals of intelligent homes of determined type, brand and model, the control instruction coding signals being sent to the specific intelligent home by the instruction control equipment.
7. The AR remote control device of claim 5, further comprising: and the virtual object generation module is used for assisting the instruction control equipment to clarify the control instruction.
8. The AR remote control device of claim 7, wherein the virtual object generated by the virtual object generation module comprises: virtual control instruction menus, virtual intermediaries, 3D virtual space environments, or 3D map images.
9. An intelligent home remote control system, comprising: an intelligent home and an AR remote control device, wherein the AR remote control device is used for sending a control instruction coding signal to the intelligent home;
the AR remote control device includes an instruction control apparatus including:
the instruction generation module is used for receiving the control instruction;
the communication module is used for sending a control instruction coding signal to the intelligent home according to the control instruction;
the instruction generation module includes: a gesture interaction component;
wherein, in detecting that a gesture action is the same as a preset gesture, the gesture interaction component is specifically configured as follows:
the gesture interaction component considers the detected gesture to be the same as the preset gesture when the detected first gesture is the same as a preset first gesture among the preset gestures, the detected second gesture is the same as a preset second gesture among the preset gestures, and the time difference between the first gesture and the second gesture is within a preset time range.
10. The smart home remote control system of claim 9, further comprising: and the auxiliary camera module is used for acquiring the remote image of the intelligent home and sending the remote image to the AR remote control device.
11. The intelligent home remote control system of claim 10, wherein the auxiliary camera module comprises: a network camera or an unmanned aerial vehicle on-board camera.
12. The intelligent home remote control system of claim 11, wherein the network camera or the unmanned aerial vehicle includes an instruction control device for receiving the control instruction and sending a control instruction coding signal to the intelligent home according to the control instruction.
13. The smart home remote control system of claim 12, wherein the control instructions comprise gesture instructions, voice instructions, head-aiming instructions, eye-movement instructions, and/or menu instructions.
14. The smart home remote control system of claim 9, wherein the instruction generation module comprises: a head aiming component, an eye tracking component, a smart glove, a smart ring, a control handle and/or a voice control component,
The voice control component is used for receiving the control instruction of voice input.
15. The smart home remote control system as claimed in any one of claims 9 to 11, further comprising:
the image recognition module is used for acquiring images or recognition codes of the intelligent home in the environment;
and the image comparison module is used for comparing and determining the type, brand and model of the intelligent home according to the image or the identification code of the intelligent home.
16. The smart home remote control system of claim 15, wherein the smart home remote control system comprises: a storage module;
the storage module is used for pre-storing images or identification codes of intelligent homes of determined type, brand and model, to be called when the image comparison module performs comparison; and/or,
the storage module is used for pre-storing control instruction coding signals of intelligent homes of determined type, brand and model, the control instruction coding signals being sent to the specific intelligent home by the instruction control equipment.
17. The smart home remote control system of claim 15, wherein the smart home remote control system comprises: a network searcher;
The network searcher is used for searching for images or identification codes of intelligent homes of determined type, brand and model, to be called when the image comparison module performs comparison; and/or,
the network searcher is used for searching for control instruction coding signals of intelligent homes of determined type, brand and model, the control instruction coding signals being sent to the specific intelligent home by the instruction control equipment.
18. A smart home remote control method, characterized by comprising the following steps:
S100: acquiring and displaying the image of the smart home;
S300: displaying an object to be controlled in the image of the smart home according to the specifically pointed-to smart home;
S500: generating a corresponding control instruction coding signal according to the control instruction for the object to be controlled;
S600: sending the control instruction coding signal to the object to be controlled;
the step S300 further includes:
S200: when a detected gesture action is the same as a preset gesture, determining the specifically pointed-to smart home in the image of the smart home;
the step S200 specifically comprises the following steps:
S210: when the detected first gesture is the same as a preset first gesture among the preset gestures, the detected second gesture is the same as a preset second gesture among the preset gestures, and the time difference between the first gesture and the second gesture is within a preset time range, the detected gesture is considered to be the same as the preset gesture.
19. The method for remote control of smart home as claimed in claim 18, wherein said step S300 comprises:
S310: displaying the object to be controlled in the image of the smart home according to the voice instruction for the smart home.
20. The smart home remote control method as claimed in claim 19, wherein:
the step S100 further includes:
S150: sending out inquiry information when detecting that a behavior meets a preset trigger condition;
the step S300 includes:
S330: displaying the object to be controlled in the image according to the voice instruction for the smart home.
21. The method for remote control of smart home as claimed in claim 18, wherein the step S100 of obtaining the image of the smart home comprises:
S110: acquiring images of the smart home around the field of view of the AR remote control device through a camera; or,
S120: receiving forwarded images of the smart home around the field of view of the auxiliary camera module.
22. The method for remotely controlling a smart home as claimed in claim 18, wherein the method further comprises, between step S300 and step S500:
S470: searching for the type, brand and model of the object to be controlled in a preset database, and displaying a corresponding virtual control instruction menu;
The step S500 specifically comprises the following steps:
S510: generating a corresponding control instruction coding signal according to the control instruction issued through the virtual control instruction menu of the object to be controlled.
23. The method for remote control of a smart home as claimed in claim 22, wherein, before step S470, the method further comprises:
S410: scanning identification information of the object to be controlled when the type, brand and model of the object to be controlled are not found in the preset database;
S430: searching for and downloading the control instruction coding signals of the object to be controlled to the preset database according to the identification information, in a networked manner.
24. The method for remotely controlling a smart home according to claim 18, wherein:
the step S100 includes:
S180: displaying, according to the voice instruction, a 3D virtual space environment or 3D map image related to the voice instruction;
the step S300 includes:
S350: displaying the object to be controlled in the 3D virtual space environment or the 3D map image according to the specifically pointed-to smart home.
CN201811102831.4A 2018-09-20 2018-09-20 AR remote control device, intelligent home remote control system and method Active CN109088803B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811102831.4A CN109088803B (en) 2018-09-20 2018-09-20 AR remote control device, intelligent home remote control system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811102831.4A CN109088803B (en) 2018-09-20 2018-09-20 AR remote control device, intelligent home remote control system and method

Publications (2)

Publication Number Publication Date
CN109088803A CN109088803A (en) 2018-12-25
CN109088803B true CN109088803B (en) 2023-11-07

Family

ID=64842061

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811102831.4A Active CN109088803B (en) 2018-09-20 2018-09-20 AR remote control device, intelligent home remote control system and method

Country Status (1)

Country Link
CN (1) CN109088803B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109905299A (en) * 2019-02-18 2019-06-18 珠海格力电器股份有限公司 Control method of household appliance and electronic equipment
CN110321003A (en) * 2019-05-30 2019-10-11 苏宁智能终端有限公司 Smart home exchange method and device based on MR technology
CN110109370A (en) * 2019-06-05 2019-08-09 重庆邮电大学 Intelligent appliance control system based on GAN algorithm and MR platform
CN112751734A (en) * 2019-10-29 2021-05-04 珠海市一微半导体有限公司 Household appliance control method based on cleaning robot, cleaning robot and chip
CN111176126A (en) * 2019-12-30 2020-05-19 创维集团有限公司 Equipment control method, system and storage medium based on voice recognition
CN116489268A (en) * 2020-08-05 2023-07-25 华为技术有限公司 Equipment identification method and related device
CN112017418A (en) * 2020-08-27 2020-12-01 上海博泰悦臻电子设备制造有限公司 Sunroof control method, system, medium, and apparatus for vehicle
CN112115855B (en) * 2020-09-17 2022-11-01 四川长虹电器股份有限公司 Intelligent household gesture control system and control method based on 5G
US11170540B1 (en) 2021-03-15 2021-11-09 International Business Machines Corporation Directional based commands
CN114296361A (en) * 2021-12-28 2022-04-08 广州河东科技有限公司 Intelligent household equipment configuration method and device, electronic equipment and storage medium
CN115225419A (en) * 2022-06-17 2022-10-21 宜百科技(深圳)有限公司 AR system capable of controlling smart home
CN116540613A (en) * 2023-06-20 2023-08-04 威海海鸥智能电子科技有限责任公司 Intelligent control equipment for electric appliance

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN202931503U (en) * 2012-10-31 2013-05-08 康佳集团股份有限公司 Hand gesture recognition remote control system and television
WO2015090185A1 (en) * 2013-12-20 2015-06-25 乐视致新电子科技(天津)有限公司 Method for generating wireless gesture remote control instruction and wireless remote controller
CN104977904A (en) * 2014-04-04 2015-10-14 浙江大学 Visible and controllable intelligent household control system and control method thereof
CN105807624A (en) * 2016-05-03 2016-07-27 惠州Tcl移动通信有限公司 Method for controlling intelligent home equipment through VR equipment and VR equipment
CN105955042A (en) * 2016-05-27 2016-09-21 浙江大学 Virtuality reality type visible and controllable intelligent household control system and method
CN106054650A (en) * 2016-07-18 2016-10-26 汕头大学 Novel intelligent household system and multi-gesture control method thereof
CN106249611A (en) * 2016-09-14 2016-12-21 深圳众乐智府科技有限公司 A kind of Smart Home localization method based on virtual reality, device and system
CN106445156A (en) * 2016-09-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Method, device and terminal for intelligent home device control based on virtual reality
CN106502118A (en) * 2016-12-21 2017-03-15 惠州Tcl移动通信有限公司 A kind of intelligent home furnishing control method and system based on AR photographic head
CN206075026U (en) * 2016-07-18 2017-04-05 汕头大学 A kind of intelligent household control terminal based on many gesture controls
CN106681354A (en) * 2016-12-02 2017-05-17 广州亿航智能技术有限公司 Flight control method and flight control device for unmanned aerial vehicles
CN106713082A (en) * 2016-11-16 2017-05-24 惠州Tcl移动通信有限公司 Virtual reality method for intelligent home management
CN106896732A (en) * 2015-12-18 2017-06-27 美的集团股份有限公司 The methods of exhibiting and device of household electrical appliance
CN107168076A (en) * 2017-05-19 2017-09-15 上海青研科技有限公司 A kind of eye control intelligent domestic system and method based on AR or VR
CN107272902A (en) * 2017-06-23 2017-10-20 深圳市盛路物联通讯技术有限公司 Smart home service end, control system and control method based on body feeling interaction
CN107705539A (en) * 2016-12-21 2018-02-16 深圳中盛智兴科技有限公司 Intelligent remote controller and intelligent main equipment, intelligent distant control method and system
CN108509049A (en) * 2018-04-19 2018-09-07 北京华捷艾米科技有限公司 The method and system of typing gesture function


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research and Design of a Kinect-based Somatosensory Control System for Smart Homes; Fu Damei; Ni Ying; Value Engineering (Issue 32); 174-176 *
Application and Prospects of Somatosensory Technology in Smart Homes; Yang Dong; Zhang Jianqiang; Cao Peng; Xu Guoxiang; Video Engineering (Issue 21); 71-75 *

Also Published As

Publication number Publication date
CN109088803A (en) 2018-12-25

Similar Documents

Publication Publication Date Title
CN109088803B (en) AR remote control device, intelligent home remote control system and method
US20210365228A1 (en) Controlling external devices using reality interfaces
EP2093650B1 (en) User interface system based on pointing device
JP7095602B2 (en) Information processing equipment, information processing method and recording medium
TWI423112B (en) Portable virtual human-machine interaction device and method therewith
JP6517255B2 (en) Character image generation apparatus, character image generation method, program, recording medium, and character image generation system
JP2014535154A (en) Light control method and lighting apparatus using the light control method
JP2009134718A5 (en)
JP6569726B2 (en) Information processing apparatus, information processing method, and program
US11265428B2 (en) Information processing apparatus and non-transitory computer readable medium for operating a target object in a real space through a virtual interface by detecting a motion of a user between a display surface displaying the virtual interface and the user
US11514705B2 (en) Information processing apparatus and non-transitory computer readable medium to allow operation without contact
WO2019077897A1 (en) Information processing device, information processing method, and program
CN208723929U (en) A kind of AR remote controler
CN109324693A (en) AR searcher, the articles search system and method based on AR searcher
CN106569409A (en) Graph capturing based household equipment control system, device and method
US20210160150A1 (en) Information processing device, information processing method, and computer program
CN209514548U (en) AR searcher, the articles search system based on AR searcher
WO2022009338A1 (en) Information processing terminal, remote control method, and program
CN111033606A (en) Information processing apparatus, information processing method, and program
US11449451B2 (en) Information processing device, information processing method, and recording medium
JP2019139793A (en) Character image generation device, character image generation method, program and recording medium
CN117788754A (en) Virtual space interaction method, device, equipment and medium
CN118092634A (en) Man-machine interaction method, device, equipment and medium
WO2019082520A1 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant