CN112994992B - Intelligent home implementation method and system, machine vision device and user equipment - Google Patents

Intelligent home implementation method and system, machine vision device and user equipment

Info

Publication number
CN112994992B
Authority
CN
China
Prior art keywords
local image
local
target object
state information
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911274275.3A
Other languages
Chinese (zh)
Other versions
CN112994992A (en)
Inventor
艾本仁
杜聚龙
李晓荔
鲍海兵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baustem Information Technology Co ltd
Original Assignee
Beijing Baustem Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baustem Information Technology Co ltd
Priority to CN201911274275.3A
Publication of CN112994992A
Application granted
Publication of CN112994992B
Legal status: Active (current)
Anticipated expiration

Links

Images

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00: Data switching networks
    • H04L 12/28: Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803: Home automation networks
    • H04L 12/2816: Controlling appliance services of a home automation network by calling their functionalities
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Alarm Systems (AREA)
  • Telephonic Communication Services (AREA)

Abstract

An implementation method, a system, a machine vision device, and a user equipment for a smart home are provided, wherein the smart home system comprises a local machine vision device, a vision sensor, and a smart home application. The vision sensor is used for monitoring a target object and sending monitoring data to the local machine vision device; the local machine vision device is used for extracting state information of the target object from the monitoring data and sending the state information to the smart home application; and the smart home application is used for synthesizing and displaying the state information with a local image according to the correspondence between the vision sensor and the local image. Through the embodiments of the application, user privacy and security are taken into account while a more optimized smart home experience is provided.

Description

Intelligent home implementation method and system, machine vision device and user equipment
Technical Field
The present disclosure relates to smart home technologies, and in particular, to a smart home implementation method and system, a machine vision device, a user equipment, and a computer-readable storage medium.
Background
In the smart home field, it is usually difficult to reconcile ease of use with security. Most apps in related solutions present the user with a list of the smart devices in the user's environment: the user must manually select a device in the list and then operate the corresponding device. When the environment contains several devices of the same kind, the user must additionally label the corresponding devices by hand, for example the door sensor of the kitchen, the door sensor of the entrance, the kitchen light, and the corridor light, which is cumbersome and unintuitive.
One method is to transmit images of the user environment in real time directly from a camera, so that the user can directly operate the smart devices shown in the image; however, this raises user privacy and security concerns.
Disclosure of Invention
The present application provides a smart home implementation method and system, a machine vision device, a user equipment, and a computer-readable storage medium, so as to take user privacy and security as well as usability into account.
An embodiment of the present application provides a smart home system, including: a local machine vision device, a vision sensor, and a smart home application, wherein
The vision sensor is used for monitoring a target object and sending monitoring data to the local machine vision device;
the local machine vision device is used for extracting state information of the target object according to the monitoring data and sending the state information to the intelligent home application;
and the intelligent home application is used for synthesizing and displaying the state information and the local image according to the corresponding relation between the visual sensor and the local image.
In an embodiment, the state information of the target object is non-sensitive information, and includes identification information, location information, and state parameters of the target object.
In an embodiment, the smart home application is further configured to obtain the local image in advance, and store the local image in the smart home application and the local machine vision device.
In an embodiment, the local image comprises a 3D image and an overall planar 2D image of the user environment.
In an embodiment, the smart home application is further configured to determine a correspondence between the visual sensor and the local image according to a position of the visual sensor in the local image and an actual position of the visual sensor.
In an embodiment, the smart home application is configured to determine, according to a correspondence between the visual sensor and a local image, location information of the target object in the local image, and combine and display the location information and the state information with the local image.
In an embodiment, when the target object is a person, the smart home application is further configured to find an image matching the target object from a local photo library, and synthesize and display the image with the local image.
In an embodiment, the smart home system further includes: cloud system, wherein
The cloud system is used for providing a cloud visual identification model and a visual identification algorithm for the smart home application and the local machine visual device.
In an embodiment, the smart home system further includes: control device, wherein
The intelligent household application is also used for receiving a control instruction and sending the control instruction to the control device;
and the control device is used for controlling the corresponding target object according to the control instruction.
The embodiment of the application further provides an implementation method of smart home, which includes:
the vision sensor monitors a target object and sends monitoring data to the local machine vision device;
the local machine vision device extracts the state information of the target object according to the monitoring data and sends the state information to the intelligent home application;
and the intelligent home application synthesizes and displays the state information and the local image according to the corresponding relation between the visual sensor and the local image.
In an embodiment, the method further comprises:
the intelligent home application acquires the local image in advance, and stores the local image in the intelligent home application and the local machine vision device.
In an embodiment, the method further comprises:
and the intelligent home application determines the corresponding relation between the visual sensor and the local image according to the position of the visual sensor in the local image and the actual position of the visual sensor.
In an embodiment, the synthesizing and displaying the state information and the local image by the smart home application according to the corresponding relationship between the visual sensor and the local image includes:
and the intelligent home application determines the position information of the target object in the local image according to the corresponding relation between the visual sensor and the local image, and synthesizes and displays the position information and the state information with the local image.
In one embodiment, when the target object is a human, the method further comprises:
and the intelligent home application finds an image matched with the target object from a local photo library, synthesizes the image with the local image and displays the image.
The embodiment of the application further provides an implementation method of smart home, which includes:
the local machine vision device acquires monitoring data obtained by monitoring a target object by a vision sensor;
and the local machine vision device extracts the state information of the target object according to the monitoring data and sends the state information to the intelligent home application, so that the intelligent home application synthesizes and displays the state information and a local image.
The embodiment of the application further provides an implementation method of smart home, which includes:
the intelligent home application acquires state information of a target object sent by a local machine vision device;
and the intelligent home application synthesizes and displays the state information and the local image according to the corresponding relation between the visual sensor and the local image.
An embodiment of the present application further provides a machine vision device, including: a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the above smart home implementation method.
An embodiment of the present application further provides a user equipment, including: a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the above smart home implementation method.
An embodiment of the present application also provides a computer-readable storage medium storing computer-executable instructions for performing the above smart home implementation method.
Compared with the prior art, the smart home system of the embodiments of the present application includes a local machine vision device, a vision sensor, and a smart home application. The vision sensor monitors a target object and sends monitoring data to the local machine vision device; the local machine vision device extracts state information of the target object from the monitoring data and sends it to the smart home application; and the smart home application synthesizes and displays the state information with a local image according to the correspondence between the vision sensor and the local image. The embodiments of the present application thus take user privacy and security into account while providing a more optimized smart home experience.
Additional features and advantages of the present application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the present application. Other advantages of the present application may be realized and attained by the instrumentalities and combinations particularly pointed out in the specification and the drawings.
Drawings
The drawings are intended to provide an understanding of the present disclosure, and are to be considered as forming a part of the specification, and are to be used together with the embodiments of the present disclosure to explain the present disclosure without limiting the present disclosure.
Fig. 1 is a schematic composition diagram of an intelligent home system according to an embodiment of the present application;
fig. 2 is a flowchart of a method for implementing smart home (applied to a smart home system) according to an embodiment of the present application;
fig. 3 is a flowchart of a method for implementing smart home according to another embodiment of the present application;
fig. 4 is a flowchart of an implementation method of smart home (applied to a local machine vision device) according to an embodiment of the present application;
fig. 5 is a flowchart of a method for implementing smart home (applied to smart home application) according to an embodiment of the present application;
fig. 6 is a schematic diagram of an application example of the present application.
Detailed Description
The description herein describes embodiments, but is intended to be exemplary rather than limiting, and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible within the scope of the embodiments described herein. Although many possible combinations of features are shown in the drawings and discussed in the detailed description, many other combinations of the disclosed features are possible. Any feature or element of any embodiment may be used in combination with, or instead of, any other feature or element in any other embodiment, unless expressly limited otherwise.
The present application includes and contemplates combinations of features and elements known to those of ordinary skill in the art. The embodiments, features and elements disclosed in this application may also be combined with any conventional features or elements to form a unique inventive concept as defined by the claims. Any feature or element of any embodiment may also be combined with features or elements from other inventive aspects to form yet another unique inventive aspect, as defined by the claims. Thus, it should be understood that any of the features shown and/or discussed in this application may be implemented alone or in any suitable combination. Accordingly, the embodiments are not to be restricted except in light of the attached claims and their equivalents. Furthermore, various modifications and changes may be made within the scope of the appended claims.
Further, in describing representative embodiments, the specification may have presented the method and/or process as a particular sequence of steps. However, to the extent that the method or process does not rely on the particular order of steps set forth herein, the method or process should not be limited to the particular sequence of steps described. Other orders of steps are possible as will be understood by those of ordinary skill in the art. Therefore, the particular order of the steps set forth in the specification should not be construed as limitations on the claims. Further, the claims directed to the method and/or process should not be limited to the performance of their steps in the order written, and one skilled in the art can readily appreciate that the sequences may be varied and still remain within the spirit and scope of the embodiments of the present application.
As shown in fig. 1, the smart home system according to the embodiment of the present application includes: local machine vision device, vision sensor and intelligent house application.
Wherein the local machine vision device and vision sensor are disposed in a user environment. The vision sensor may include a camera.
The local machine vision device can be deployed as an independent device, in home equipment such as a home gateway and a router, or together with a vision sensor to form an independent vision sensor device.
The smart home application is arranged in the user equipment.
The smart home system may further include a cloud system configured to provide a cloud visual recognition model and a cloud visual recognition algorithm for the smart home application and the local machine vision device.
The cloud system, deployed in the cloud, provides the complete machine vision functions of training, evaluation, and inference, while the locally deployed machine vision device performs recognition and state judgment of the local monitored objects (i.e., target objects).
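The patent does not specify how the cloud-trained model reaches the home. The following is a minimal Python sketch of this cloud/local split, assuming a hypothetical download endpoint and an ONNX model file, with inference then running entirely on the local device:

    # Minimal sketch of the cloud/local split described above. The model URL,
    # the ONNX format, and the transport are assumptions for illustration; the
    # patent only states that training and evaluation happen in the cloud and
    # that recognition and state judgment run locally.
    import urllib.request
    import onnxruntime as ort

    MODEL_URL = "https://cloud.example.com/models/home-vision.onnx"  # hypothetical

    def fetch_cloud_model(path: str = "home-vision.onnx") -> ort.InferenceSession:
        """Download the cloud-trained recognition model and load it for local inference."""
        urllib.request.urlretrieve(MODEL_URL, path)
        return ort.InferenceSession(path)

    # Recognition then runs on the local machine vision device, so raw camera
    # frames never have to leave the home network.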
The intelligent home system can further comprise a control device, wherein the control device is used for receiving a control instruction from the intelligent home application and controlling the corresponding target object according to the control instruction.
As shown in fig. 2, the implementation method of the smart home in the embodiment of the present application includes:
step 201, a vision sensor monitors a target object and sends monitoring data to a local machine vision device.
Wherein the target object may include, but is not limited to, a person, a door, a window, a curtain, a bed, a seat, a sofa, lighting, and the like.
The monitoring data may be a picture taken by a vision sensor.
In an embodiment, before step 201, the method further comprises:
the intelligent home application acquires the local image in advance, and stores the local image in the intelligent home application and the local machine vision device.
The local image may include a 3D image of the user environment and an overall planar 2D image. The local image is stored as offline files on the user equipment and within the user's home environment, for example on the local machine vision device. Since the 3D and 2D images of the user environment are sensitive information and are kept only on the user equipment and within the home, user privacy is protected.
In an embodiment, before step 201, the method further comprises:
and the intelligent home application determines the corresponding relation between the visual sensor and the local image according to the position of the visual sensor in the local image and the actual position of the visual sensor.
The correspondence may be a conversion relationship: the smart home application in the user equipment calculates, from each camera's position in the local image, the conversion between that camera's coordinates and the local image coordinates, obtaining a coordinate conversion matrix M_camera-n for each camera.
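As one illustration of how such a matrix could be obtained, the sketch below assumes each camera's view is related to the 2D local image by a planar homography estimated from at least four marker correspondences; the patent itself does not fix the transform model:

    # Hedged sketch: estimating M_camera-n as a planar homography with OpenCV.
    import cv2
    import numpy as np

    def compute_camera_matrix(camera_pts, plan_pts):
        """Estimate the camera-to-local-image transform from >= 4 point pairs.

        camera_pts: pixel coordinates of reference markers in the camera view
        plan_pts:   coordinates of the same markers in the local (plan) image
        """
        camera_pts = np.asarray(camera_pts, dtype=np.float32)
        plan_pts = np.asarray(plan_pts, dtype=np.float32)
        M, _ = cv2.findHomography(camera_pts, plan_pts, cv2.RANSAC)
        return M  # the coordinate conversion matrix M_camera-n for this camera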
Step 202, the local machine vision device extracts the state information of the target object according to the monitoring data, and sends the state information to the smart home application.
The state information of the target object is non-sensitive information. Here, sensitive information refers to user personal information and user environment information, such as the user's name, phone number, home address, personal photos, and photos of the inside or outside of the home; non-sensitive information refers to information other than user personal information and user environment information. In this embodiment, the state information of the target object may include identification information, position information, and state parameters of the target object.
When the target object is a person, the state information of the target object may further include a feature value.
For example: object: door; position: …; state: door opened 30 degrees. Object: person; feature value: …; position: …; state: lying down.
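For illustration only, such a state message could be structured as follows; the field names are assumptions, since the embodiment only enumerates identification, position, state parameters, and (for persons) a feature value:

    # Illustrative (non-normative) shape of the non-sensitive state message.
    from dataclasses import dataclass, asdict
    from typing import Optional
    import json

    @dataclass
    class TargetState:
        object_type: str                # identification, e.g. "door" or "person"
        position: tuple                 # position coordinates in the camera frame
        state: str                      # state parameter, e.g. "opened 30 degrees"
        feature: Optional[list] = None  # feature value, persons only

    msg = TargetState("door", (412, 230), "opened 30 degrees")
    payload = json.dumps(asdict(msg))   # sent on to the smart home application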
The local machine vision device can identify monitoring data according to a model and an algorithm provided by a cloud system, and extract state information of a target object.
The local machine vision device can send the state information to the intelligent household application in the user equipment through the internet.
And 203, synthesizing and displaying the state information and the local image by the intelligent home application according to the corresponding relation between the visual sensor and the local image.
In an embodiment, the smart home application determines position information of the target object in the local image according to a corresponding relationship between the visual sensor and the local image, and synthesizes and displays the position information and the state information with the local image.
According to the conversion matrix M_camera-n between each camera's coordinates and the local image coordinates, the smart home application obtains the coordinate information of the target object in the local image, composites the target object's coordinate information and state information into the local image, and generates an image that is displayed to the user.
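A sketch of this compositing step, reusing the homography-style matrix from the earlier example, might look as follows; the overlay style (a marker plus a text label) is an illustrative assumption:

    # Hedged sketch: map a camera-frame point through M_camera-n and draw it.
    import cv2
    import numpy as np

    def composite_state(plan_img, M, camera_xy, label):
        """Map a camera-frame point into the local image and draw its state."""
        pt = np.array([[camera_xy]], dtype=np.float32)     # shape (1, 1, 2)
        x, y = cv2.perspectiveTransform(pt, M)[0, 0]
        out = plan_img.copy()
        cv2.circle(out, (int(x), int(y)), 8, (0, 255, 0), -1)
        cv2.putText(out, label, (int(x) + 12, int(y)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        return out

    # e.g. composite_state(plan, M_camera_1, (412, 230), "door: opened 30 deg")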
In an embodiment, when the target object is a person, the smart home application finds an image matched with the target object from a local photo library, and synthesizes and displays the image and the local image.
As shown in fig. 3, in an embodiment, after step 203, the method further includes:
step 301, the smart home application receives a control instruction and sends the control instruction to the control device.
The user can check the state of the target object in the image displayed by the smart home application, and can directly issue a control instruction to perform an operation, such as opening or closing a window, or switching a socket on or off.
And the intelligent home application sends the control instruction of the user to the control device.
And step 302, the control device controls the corresponding target object according to the control instruction.
Each target object may be provided with a control device which, after receiving the control instruction, executes the corresponding action according to it.
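As a sketch of this control path, the snippet below forwards a user instruction to the control device of a target object; the HTTP transport, the gateway address, and the per-object endpoint are illustrative assumptions, since the patent does not specify a control protocol:

    # Hedged sketch: forward a control instruction to a control device.
    import json
    import urllib.request

    def send_control(target_id: str, action: str,
                     base_url: str = "http://gateway.local") -> int:
        """Ask the control device attached to target_id to perform action."""
        body = json.dumps({"target": target_id, "action": action}).encode()
        req = urllib.request.Request(f"{base_url}/control/{target_id}", data=body,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return resp.status  # the control device executes the action on success

    # e.g. send_control("window-livingroom", "open")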
Through the embodiments of the present application, user privacy and security are taken into account while a more optimized smart home experience is provided.
As shown in fig. 4, for the local machine vision device, the implementation method of the smart home includes:
step 401, the local machine vision device acquires a vision sensor to monitor a target object to obtain monitoring data.
Wherein the target object may include, but is not limited to, a person, a door, a window, a curtain, a bed, a seat, a sofa, lighting, and the like.
The monitoring data may be a picture taken by a vision sensor.
And 402, the local machine vision device extracts the state information of the target object according to the monitoring data and sends the state information to the intelligent home application, so that the intelligent home application synthesizes and displays the state information and a local image.
The state information of the target object is non-sensitive information and may include identification information, position information and state parameters of the target object.
When the target object is a person, the state information of the target object may further include a feature value.
The local machine vision device can identify monitoring data according to a model and an algorithm provided by a cloud system, and extract state information of a target object.
The local machine vision device may send the state information to the smart home application in the user equipment via the internet.
As shown in fig. 5, for the smart home application, the implementation method of the smart home includes:
step 501, the smart home application acquires state information of a target object sent by a local machine vision device.
In an embodiment, before step 501, the method further comprises:
the intelligent home application acquires the local image in advance, and stores the local image in the intelligent home application and the local machine vision device.
The local image may include a 3D image of the user environment and an overall planar 2D image. The local image is stored as offline files on the user equipment and within the user's home environment, for example on the local machine vision device. Since the 3D and 2D images of the user environment are sensitive information and are kept only on the user equipment and within the home, user privacy is protected.
In an embodiment, before step 501, the method further comprises:
and the intelligent home application determines the corresponding relation between the visual sensor and the local image according to the position of the visual sensor in the local image and the actual position of the visual sensor.
The correspondence may be a conversion relationship: the smart home application in the user equipment calculates, from each camera's position in the local image, the conversion between that camera's coordinates and the local image coordinates, obtaining a coordinate conversion matrix M_camera-n for each camera.
And 502, synthesizing and displaying the state information and the local image by the intelligent home application according to the corresponding relation between the visual sensor and the local image.
In an embodiment, the smart home application determines position information of the target object in the local image according to a corresponding relationship between the visual sensor and the local image, and synthesizes and displays the position information and the state information with the local image.
According to the conversion relationship M_camera-n between each camera's coordinates and the local image coordinates, the smart home application obtains the coordinate information of the target object in the local image, composites the target object's coordinate information and state information into the local image, and generates an image that is displayed to the user.
In an embodiment, when the target object is a person, the smart home application finds an image matched with the target object from a local photo library, and synthesizes and displays the image and the local image.
In an embodiment, after step 502, the method further includes:
and the intelligent household application receives the control instruction and sends the control instruction to the control device.
The user can check the state of the target object in the image displayed by the smart home application, and can directly issue a control instruction to perform an operation, such as opening or closing a window, or switching a socket on or off.
And the intelligent household application sends the control instruction of the user to the control device.
The following is a description of an application example.
1. Using a mobile phone application, a drone, or other camera equipment, the 3D image and the overall 2D image of the user environment are acquired in advance and stored as offline files on the user equipment and within the user's home environment, for example on the local machine vision system.
2. The smart home application in the user equipment calculates the conversion between each camera's coordinates and the indoor 3D image coordinates from the camera positions in the image, obtaining a coordinate conversion matrix M_camera-n for each camera.
3. The cameras and local machine vision devices deployed in the user environment monitor target objects, including but not limited to people, doors, windows, curtains, beds, seats, sofas, and lighting, together with their position coordinates and states. For example: object: door; position: …; state: door opened 30 degrees. Object: person; feature value: …; position: …; state: lying down.
4. The non-sensitive information detected by the local machine vision device, such as the identification, features, position, and state of the target object, is transmitted to the user equipment over the Internet.
5. The smart home application in the user equipment obtains the coordinate information of the target object in the 3D image according to the conversion relationship M_camera-n between each camera's coordinates and the indoor 3D image coordinates, composites that coordinate information and the state information obtained in step 3 into the 3D image, and generates an image displayed to the user.
When the target object is a person, the smart home application in the user equipment can find the photo of the person matching the feature value in the local photo library and include it in the image synthesis to ease browsing; for example, the corresponding avatar is superimposed on the detected person.
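A minimal sketch of this matching step is given below, assuming the feature value is an embedding vector that is compared against precomputed embeddings of the local photo library by cosine similarity; the threshold and the library layout are illustrative assumptions:

    # Hedged sketch: match a person's feature value against a local photo library.
    import numpy as np

    def match_photo(feature: np.ndarray, library: dict, threshold: float = 0.6):
        """Return the library photo whose embedding best matches the feature value.

        library: mapping of photo path -> precomputed embedding vector
        """
        best_path, best_score = None, threshold
        for path, emb in library.items():
            score = float(np.dot(feature, emb) /
                          (np.linalg.norm(feature) * np.linalg.norm(emb)))
            if score > best_score:
                best_path, best_score = path, score
        return best_path  # None if nothing clears the threshold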
As shown in fig. 6, a 3D or 2D image is presented on the user device, overlaid with the real-time state information of the corresponding smart devices and real-time lighting information, and the corresponding avatars are superimposed. The user can browse the 3D or 2D image, view the corresponding states, and directly perform operations such as opening or closing a window or switching a socket.
Correspondingly, the smart home system of the embodiment of the invention comprises: a local machine vision device, a vision sensor, and a smart home application, wherein
The vision sensor is used for monitoring a target object and sending monitoring data to the local machine vision device;
the local machine vision device is used for extracting state information of the target object according to the monitoring data and sending the state information to the intelligent home application;
and the intelligent home application is used for synthesizing and displaying the state information and the local image according to the corresponding relation between the visual sensor and the local image.
In an embodiment, the state information of the target object is non-sensitive information, and includes identification information, location information, and state parameters of the target object.
In an embodiment, the smart home application is further configured to obtain the local image in advance, and store the local image in the smart home application and a local machine vision device.
In an embodiment, the local image comprises a 3D image and an overall planar 2D image of the user environment.
In an embodiment, the smart home application is further configured to determine a correspondence between the visual sensor and the local image according to a position of the visual sensor in the local image and an actual position of the visual sensor.
In an embodiment, the smart home application is configured to determine, according to a correspondence between the visual sensor and a local image, location information of the target object in the local image, and combine and display the location information and the state information with the local image.
In an embodiment, when the target object is a person, the smart home application is further configured to find an image matching the target object from a local photo library, and synthesize and display the image and the local image.
In an embodiment, the smart home system further includes: cloud system, wherein
The cloud system is used for providing a cloud visual identification model and a visual identification algorithm for the smart home application and the local machine visual device.
In an embodiment, the smart home system further includes: control device, wherein
The intelligent household application is also used for receiving a control instruction and sending the control instruction to the control device;
and the control device is used for controlling the corresponding target object according to the control instruction.
An embodiment of the present invention further provides a machine vision apparatus, including: a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the above smart home implementation method.
An embodiment of the present invention further provides a user equipment, including: a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements the above smart home implementation method.
An embodiment of the present invention also provides a computer-readable storage medium storing computer-executable instructions for performing the above smart home implementation method.
In this embodiment, the storage medium may include, but is not limited to: a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media capable of storing program code.
It will be understood by those of ordinary skill in the art that all or some of the steps of the methods, systems, and functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, or suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those skilled in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer. In addition, communication media typically embodies computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and includes any information delivery media, as known to those skilled in the art.

Claims (11)

1. A smart home system, characterized by comprising: a local machine vision device, a vision sensor, and a smart home application, wherein,
the vision sensor is used for monitoring a target object and sending monitoring data to the local machine vision device;
the local machine vision device is used for extracting state information of the target object according to the monitoring data and sending the state information to the intelligent home application;
the smart home application is used for synthesizing and displaying the state information and the local image according to the correspondence between the vision sensor and the local image; and is further used for determining the correspondence between the vision sensor and the local image according to the position of the vision sensor in the local image and the actual position of the vision sensor;
the smart home application is configured to synthesize and display the state information and the local image according to a corresponding relationship between the visual sensor and the local image, and includes:
and the intelligent home application determines the position information of the target object in the local image according to the corresponding relation between the visual sensor and the local image, synthesizes and displays the state information and the position information of the target object in the local image with the local image, or synthesizes and displays the state information, the position information of the target object in the local image and an image matched with the target object in a local photo library with the local image.
2. The smart home system of claim 1,
the state information of the target object is non-sensitive information and comprises identification information, position information and state parameters of the target object.
3. The smart home system of claim 1,
the smart home application is further configured to obtain the local image in advance, and store the local image in the smart home application and a local machine vision device.
4. The smart home system of claim 3,
the local image comprises a 3D image of the user environment and an overall planar 2D image.
5. The smart home system of claim 1, further comprising: a cloud system, wherein,
the cloud system is used for providing a cloud visual identification model and a visual identification algorithm for the smart home application and the local machine visual device.
6. The smart home system according to any one of claims 1 to 5, further comprising: a control device, wherein,
the intelligent household application is also used for receiving a control instruction and sending the control instruction to the control device;
and the control device is used for controlling the corresponding target object according to the control instruction.
7. A smart home implementation method, characterized by comprising the following steps:
the vision sensor monitors a target object and sends monitoring data to the local machine vision device;
the local machine vision device extracts the state information of the target object according to the monitoring data and sends the state information to the intelligent home application;
the intelligent home application synthesizes and displays the state information and the local image according to the corresponding relation between the visual sensor and the local image;
the method further comprises the following steps:
the intelligent home application determines the corresponding relation between the visual sensor and the local image according to the position of the visual sensor in the local image and the actual position of the visual sensor;
the smart home application synthesizes and displays the state information and the local image according to the corresponding relation between the visual sensor and the local image, and the method comprises the following steps:
the intelligent home application determines the position information of the target object in the local image according to the corresponding relation between the visual sensor and the local image, and synthesizes and displays the state information and the position information of the target object in the local image with the local image; or synthesizing and displaying the state information, the position information of the target object in the local image, and the image matched with the target object in the local photo library and the local image.
8. The method of claim 7, further comprising:
the intelligent home application acquires the local image in advance, and stores the local image in the intelligent home application and the local machine vision device.
9. A smart home implementation method, characterized by comprising the following steps:
the smart home application acquires state information of a target object sent by a local machine vision device, the state information being extracted from monitoring data obtained by a vision sensor monitoring the target object;
the intelligent home application synthesizes and displays the state information and the local image according to the corresponding relation between the visual sensor and the local image;
the intelligent home application synthesizing and displaying the state information and the local image according to the corresponding relation between a visual sensor and the local image comprises the following steps:
and the intelligent home application determines the position information of the target object in the local image according to the corresponding relation between the visual sensor and the local image, synthesizes and displays the state information and the position information of the target object in the local image with the local image, or synthesizes and displays the state information, the position information of the target object in the local image and an image matched with the target object in a local photo library with the local image.
10. A user equipment, comprising: a memory, a processor, and a computer program stored on the memory and runnable on the processor, wherein the processor implements the smart home implementation method according to claim 9 when executing the program.
11. A computer-readable storage medium storing computer-executable instructions for performing the method of any one of claims 7-9.
CN201911274275.3A 2019-12-12 2019-12-12 Intelligent home implementation method and system, machine vision device and user equipment Active CN112994992B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911274275.3A CN112994992B (en) 2019-12-12 2019-12-12 Intelligent home implementation method and system, machine vision device and user equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911274275.3A CN112994992B (en) 2019-12-12 2019-12-12 Intelligent home implementation method and system, machine vision device and user equipment

Publications (2)

Publication Number Publication Date
CN112994992A CN112994992A (en) 2021-06-18
CN112994992B (en) 2022-09-06

Family

ID=76331749

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911274275.3A Active CN112994992B (en) 2019-12-12 2019-12-12 Intelligent home implementation method and system, machine vision device and user equipment

Country Status (1)

Country Link
CN (1) CN112994992B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104301661A (en) * 2013-07-19 2015-01-21 中兴通讯股份有限公司 Intelligent household monitoring method and client and related devices
CN104426726A (en) * 2013-09-11 2015-03-18 郑州朗鑫智能电子科技有限公司 Intelligent home system for protecting privacy and information safety of user
CN204903983U (en) * 2015-08-21 2015-12-23 杨珊珊 Smart home systems and unmanned vehicles , intelligent maincenter equipment thereof
CN106302057A (en) * 2016-09-30 2017-01-04 北京小米移动软件有限公司 Intelligent home equipment control method and device
CN109543569A (en) * 2018-11-06 2019-03-29 深圳绿米联创科技有限公司 Target identification method, device, visual sensor and smart home system
CN110119201A (en) * 2019-04-22 2019-08-13 珠海格力电器股份有限公司 A kind of method and apparatus of virtual experience household appliance collocation domestic environment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9652959B2 (en) * 2015-04-07 2017-05-16 Vivint, Inc. Smart wake
US10602035B2 (en) * 2017-09-19 2020-03-24 Google Llc Temperature-controlled camera assembly


Also Published As

Publication number Publication date
CN112994992A (en) 2021-06-18

Similar Documents

Publication Publication Date Title
US20240119815A1 (en) Virtual enhancement of security monitoring
CN105760106B (en) A kind of smart home device exchange method and device
CN110168485B (en) Augmented reality control of internet of things devices
CN105074615B (en) virtual sensor system and method
US11070728B2 (en) Methods and systems of multi-camera with multi-mode monitoring
EP3023873A1 (en) Electronic device and method for providing map service
US10140832B2 (en) Systems and methods for behavioral based alarms
US20140257532A1 (en) Apparatus for constructing device information for control of smart appliances and method thereof
CN107113544A (en) The 3D mappings of internet of things equipment
EP4045872A1 (en) Navigation using selected visual landmarks
US11074451B2 (en) Environment-based application presentation
US20150356802A1 (en) Low Power Door-Lock Apparatus Based On Battery Using Face Recognition
CN106054620A (en) Smart control apparatus and system
US10666768B1 (en) Augmented home network visualization
US10847014B1 (en) Recording activity detection
US11430215B2 (en) Alerts of mixed reality devices
US11495054B2 (en) Motion-based human video detection
KR20170066054A (en) Method and apparatus for providing audio
JP2014042160A (en) Display terminal, setting method of target area of moving body detection and program
CN108765581A (en) A kind of method and device showing label in virtual three-dimensional space
WO2022052613A1 (en) Camera control method and apparatus, electronic device, and storage medium
KR20170104953A (en) Method and apparatus for managing system
US20190020498A1 (en) Intelligent Smart Room Control System
TWI433568B (en) Human-environment interactive system and portable device using the same
CN112994992B (en) Intelligent home implementation method and system, machine vision device and user equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant