CN113467267A - Control method of intelligent home system and intelligent home system - Google Patents


Info

Publication number
CN113467267A
CN113467267A (Application CN202110858497.0A)
Authority
CN
China
Prior art keywords
user, indoor, three-dimensional modeling, information, target equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110858497.0A
Other languages
Chinese (zh)
Inventor
唐楚强
吴斌
杨会敏
刘光有
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date
Application filed by Gree Electric Appliances Inc of Zhuhai
Priority to CN202110858497.0A
Publication of CN113467267A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer, electric
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems, electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention belongs to the field of intelligent homes, and specifically relates to a control method for an intelligent home system and an intelligent home system. The control method of the intelligent home system comprises the following steps: acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired image information, and mapping the indoor three-dimensional modeling information onto a control terminal; calibrating target devices on the indoor three-dimensional modeling information through the control terminal; and acquiring the user's position information, and popping up a control interface for a target device on the control terminal when the user is detected approaching that device. By locating the different smart devices, the intelligent home system of the invention achieves intelligent control of the smart devices in different scenes.

Description

Control method of intelligent home system and intelligent home system
Technical Field
The invention belongs to the field of intelligent homes, and specifically relates to a control method for an intelligent home system and an intelligent home system.
Background
A user's indoor environment contains many different devices, and the devices a user needs to control may differ from scene to scene. Locating the user's devices and tracking the positional relationship between the user and those devices in real time helps the user control different devices in different scenes, achieving intelligent device control across scenarios.
The present invention has been made in view of this situation.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a control method for an intelligent home system, and an intelligent home system, that achieve intelligent control of smart devices in different scenes by locating the different smart devices.
In order to solve the technical problem, the invention provides a control method of an intelligent home system, which comprises the following steps:
acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, and mapping the indoor three-dimensional modeling information on a control terminal;
calibrating target equipment on indoor three-dimensional modeling information through a control terminal;
and when the condition that the user controls the target equipment is met, popping up a control interface of the target equipment on the control terminal.
Further optionally, the condition for user control of the target device is met
when it is determined that the user is close to the target device,
and/or when the user selects the target device on the indoor three-dimensional modeling information through the control terminal.
Further optionally, the acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, and mapping the indoor three-dimensional modeling information on the control terminal includes:
acquiring indoor environment image information, and acquiring depth information of each pixel point based on the indoor environment image information;
determining the position coordinate of each pixel point through the depth information of each pixel point, and constructing indoor three-dimensional modeling information based on the position coordinate of each pixel point;
and mapping the position coordinates of each pixel point to a control terminal according to a certain proportion, and presenting the three-dimensional modeling information on the control terminal.
Further optionally, the calibrating the target device on the indoor three-dimensional modeling information by the control terminal includes
According to the imaging of the target equipment on the three-dimensional modeling information, determining the position coordinates of the target equipment on the three-dimensional modeling information;
and binding the position coordinates of the target equipment with the ID label of the target equipment to finish the calibration of the target equipment.
Further optionally, the determining the position coordinates of the target device on the three-dimensional modeling information according to the imaging of the target device on the three-dimensional modeling information comprises
Selecting at least four pixel point coordinates to form a rectangular frame that frames the imaging of the target device on the three-dimensional modeling information;
and determining the coordinates of the central point of the rectangular frame, wherein the coordinates of the central point of the rectangular frame are regarded as the position coordinates of the target equipment on the three-dimensional modeling information.
Further optionally, the determining that the user is close to the target device includes
Acquiring user image information, and acquiring depth information of each pixel point based on the user image information;
determining the position coordinate of each pixel point through the depth information of each pixel point, and determining the position of a user in the indoor three-dimensional modeling information based on the position coordinate of each pixel point;
and determining the distance between the user and the target device based on the positions of the user and the target device in the indoor three-dimensional modeling information, and judging that the user approaches the target device when the distance between the user and the target device is smaller than or equal to a set distance.
Further optionally, the control terminal is a mobile terminal, and when the distance between the user holding the mobile terminal and the target device is less than or equal to the set distance, a control interface of the target device pops up on the mobile terminal.
Further optionally, the user selecting the target device on the indoor three-dimensional modeling information through the control terminal includes:
And when the pixel point at the selected position of the indoor three-dimensional modeling information of the user is located in the pixel point range corresponding to the target equipment, judging that the target equipment is selected by the user.
The invention also provides an intelligent household system, which comprises
The visual sensing module is used for acquiring indoor environment image information and user image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, determining position information of a user in the indoor three-dimensional modeling information based on the acquired user image information, and mapping the indoor three-dimensional modeling information and the position information of the user in an indoor three-dimensional model on the control terminal;
the device calibration module is used for determining the position coordinate of the target device according to the imaging of the target device in the indoor three-dimensional modeling information and binding the position coordinate of the target device with the ID label of the target device;
the control module acquires the user's position information through the visual sensing module to determine the distance between the user and the target device, and, when it detects that the user is close to the target device, pops up a control interface for the target device on the control terminal according to the control information of the target device calibrated by the device calibration module; and/or, when it detects that the user has selected the target device on the indoor three-dimensional modeling information, pops up a control interface for the target device on the control terminal according to the control information of the target device calibrated by the device calibration module.
The invention also proposes a non-transitory computer-readable storage medium on which program instructions are stored, which program instructions, when executed by one or more processors, are adapted to implement the control method of any one of the above.
The invention also proposes a control terminal, which employs the method of any one of the above, or has a non-transitory computer-readable storage medium as described above.
After adopting the technical scheme, compared with the prior art, the invention has the following beneficial effects:
the intelligent household appliance control method and the intelligent household appliance system can determine the positions of different intelligent devices by establishing the three-dimensional modeling information of the indoor environment, and automatically pop up the control interface of the target device on the control terminal when the condition that the user controls the target device is met by binding the ID labels of the different intelligent devices with the intelligent devices, thereby realizing the intelligent control of the intelligent devices in different scenes.
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and together with the description serve to explain the invention without limiting its scope. It is obvious that the drawings in the following description show only some embodiments, and that a person skilled in the art can derive other drawings from them without inventive effort. In the drawings:
FIG. 1: the first control flow chart of an embodiment of the invention.
FIG. 2: the second control flow chart of an embodiment of the invention.
FIG. 3: a target device calibration flow chart of the invention.
FIG. 4: a schematic diagram of the pixel coordinates of the rectangular frame used to select the target device.
It should be noted that the drawings and the description are not intended to limit the scope of the inventive concept in any way, but to illustrate it by a person skilled in the art with reference to specific embodiments.
Detailed Description
In the description of the present invention, it should be noted that the terms "inside", "outside", and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, which are only for convenience in describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and operate, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," "contacting," and "communicating" are to be construed broadly: for example, as fixed, detachable, or integral connections; as mechanical or electrical connections; and as direct connections or indirect connections through an intermediary. The specific meanings of these terms in the present invention can be understood by those skilled in the art on a case-by-case basis.
A user's indoor environment is complex and varied, and the devices a user needs to control may differ from scene to scene. Addressing the problem that a user's many indoor devices (such as an air conditioner, a television, a humidifier, and an electric fan) cannot be located and intelligently controlled in time, this embodiment provides a control method for an intelligent home system, which comprises the following steps:
acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, and mapping the indoor three-dimensional modeling information on a control terminal;
calibrating target equipment on indoor three-dimensional modeling information through a control terminal;
and when the condition that the user controls the target equipment is met, popping up a control interface of the target equipment on the control terminal.
Specifically, as shown in fig. 1, the control method of the present embodiment includes the following steps:
s1, acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, and mapping the indoor three-dimensional modeling information on the control terminal; the embodiment acquires the image information of the indoor environment by arranging the visual sensor in the indoor environment, and the visual sensor can be selected from TOF, a plurality of biological cameras or laser radar and the like. The visual sensor is used for acquiring indoor space characteristic information, can utilize depth information to complete three-dimensional modeling of an indoor environment, and can present indoor three-dimensional modeling information constructed by the TOF module on the control terminal. The control terminal can be a display screen of a mobile phone, an IPAD or a cabinet air conditioner.
S2, calibrating the target device on the indoor three-dimensional modeling information through the control terminal. Each device has a corresponding ID tag, and the server stores the device control information corresponding to each ID tag. The ID tag of a device is a mark defined by the manufacturer before the product ships, and each device-specific ID tag has function control information on the server; for a fan, for example, the functions corresponding to its ID tag on the server may include speeding up, slowing down, and changing direction, which the user uses to control the device. The target device is located by determining its position on the indoor three-dimensional modeling information on the control terminal, and that position is bound to the corresponding ID tag to complete the calibration of the target device.
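The binding described in S2 can be sketched as a small registry that ties a calibrated model-space position to a factory ID tag whose control functions live server-side. This is a minimal illustration, not the patented implementation; the tag names, function names, and registry API are all hypothetical.

```python
# Hypothetical sketch of binding a calibrated position to a device ID tag.
# The server-side function table and all names are assumptions for illustration.

SERVER_CONTROL_INFO = {
    "FAN-001": ["speed_up", "slow_down", "change_direction"],
    "TV-001": ["change_channel", "volume_up", "volume_down"],
}

class DeviceRegistry:
    def __init__(self):
        self.devices = {}  # id_tag -> (x, y, z) position in the 3D model

    def calibrate(self, id_tag, position):
        """Bind a device's model-space position to its factory ID tag."""
        if id_tag not in SERVER_CONTROL_INFO:
            raise KeyError(f"unknown ID tag: {id_tag}")
        self.devices[id_tag] = position

    def controls_for(self, id_tag):
        """Look up the control functions the server stores for this tag."""
        return SERVER_CONTROL_INFO[id_tag]

registry = DeviceRegistry()
registry.calibrate("FAN-001", (1.2, 0.4, 2.0))
print(registry.controls_for("FAN-001"))
```

Once a tag is bound, popping up a control interface reduces to looking up `controls_for(id_tag)` for the device whose position triggered the condition.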
S3, when the condition for user control of the target device is met, popping up a control interface for the target device on the control terminal so that the user can control it. The condition is met when the user is judged to be approaching the target device, and/or when the user selects the target device on the indoor three-dimensional modeling information through the control terminal. For example, when the indoor three-dimensional image information is mapped onto the user's mobile phone and the user has finished calibrating the target device through the phone, the phone can automatically pop up the device's control interface once the user is recognized as approaching it: when the user approaches the television, the phone pops up the television's control interface, enabling functions such as changing channels and adjusting the volume. When the indoor three-dimensional image information is mapped onto an air-conditioner screen, the indoor three-dimensional image is displayed on that screen, and when the user touches the position corresponding to the target device with a finger, the control interface of the target device pops up automatically on the air-conditioner screen.
Further optionally, the acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, and mapping the indoor three-dimensional modeling information on the control terminal includes:
acquiring indoor environment image information, and acquiring depth information of each pixel point based on the indoor environment image information;
determining the position coordinate of each pixel point through the depth information of each pixel point, and constructing indoor three-dimensional modeling information based on the position coordinate of each pixel point;
and mapping the position coordinates of each pixel point to a control terminal according to a certain proportion, and presenting the three-dimensional modeling information on the control terminal.
Specifically, as shown in fig. 2, in this embodiment the vision sensing module continuously acquires a certain number of frames of indoor image information and accurately obtains the depth information of each indoor pixel point. Each pixel point then has a definite coordinate (X, Y, Z) in a coordinate system whose origin is the position of the vision sensing module; the installation position of the module can be chosen freely as needed, for example on top of a cabinet air conditioner or on a wall. A control terminal such as an air-conditioner screen or a mobile-phone APP can present the indoor three-dimensional modeling information constructed by the vision sensing module, because a fixed proportional relation exists between the pixel coordinates imaged by the vision sensing module and the pixel coordinates displayed on the control terminal. For example, the origin coordinate O1 of the vision sensing module maps to a point O2 on the phone or air-conditioner screen, and a definite mapping relation exists between O1 and O2, so that the pixel coordinates on the control terminal match the pixel coordinate information produced by the vision sensing module. That is, the vision sensing module acquires real-time indoor environment coordinate information and maps it onto the air-conditioner screen or the mobile-phone APP.
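The proportional mapping between sensor space and terminal space described above can be sketched as a linear transform. This is a sketch under assumed values: the scale factor, the terminal origin O2, and the function name are hypothetical, not taken from the patent.

```python
# Hypothetical sketch: map a sensor-space pixel coordinate to a terminal
# (screen) coordinate via a fixed scale and the offset between the two origins.

def map_to_terminal(point, scale, terminal_origin):
    """Map a sensor-space (x, y, z) point to terminal screen space.

    point           -- (x, y, z) in the vision module's coordinate system
    scale           -- assumed fixed proportional factor between the spaces
    terminal_origin -- O2, where the sensor origin O1 lands on the screen
    """
    x, y, z = point
    ox, oy = terminal_origin
    # Depth (z) is carried through unchanged so the terminal can render the
    # 3D model; only the screen-plane coordinates are rescaled and shifted.
    return (ox + scale * x, oy + scale * y, z)

# The sensor origin O1 = (0, 0, 0) maps exactly onto O2 on the screen.
print(map_to_terminal((0, 0, 0), scale=100.0, terminal_origin=(160, 120)))
# -> (160.0, 120.0, 0)
```

Because the mapping is fixed, the same transform (and its inverse) lets touches on the terminal be translated back into sensor-space coordinates for calibration.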
Further optionally, the calibrating the target device on the indoor three-dimensional modeling information by the control terminal includes:
according to the imaging of the target equipment on the three-dimensional modeling information, determining the position coordinates of the target equipment on the three-dimensional modeling information;
and binding the position coordinates of the target equipment with the ID label of the target equipment to finish the calibration of the target equipment.
Specifically, as shown in fig. 2 and 3, for an indoor environment device such as a humidifier, its imaging in the indoor three-dimensional modeling information must first be found. The user then manually determines the position of the target device on the control terminal; the position coordinates of the target device are obtained from the position the user determined, and the user inputs the ID tag of the corresponding device to complete the binding.
Further optionally, the determining the position coordinates of the target device on the three-dimensional modeling information according to the imaging of the target device on the three-dimensional modeling information comprises
As shown in fig. 3, selecting at least four pixel coordinates to form a rectangular frame to select the imaging of the target device on the three-dimensional modeling information;
and determining the coordinates of the central point of the rectangular frame, wherein the coordinates of the central point of the rectangular frame are regarded as the position coordinates of the target equipment on the three-dimensional modeling information.
Specifically, the user can manually mark at least 4 pixel coordinates of the corresponding device on the mobile-phone APP or the air-conditioner screen, and a rectangular frame formed from the pixel coordinates the user selected frames the target device. For example, as shown in FIG. 4, the user selects four pixel point coordinates (X1, Y1, Z1), (X2, Y2, Z2), (X3, Y3, Z3), and (X4, Y4, Z4) to form a rectangular frame. Choosing these four coordinates requires the user to locate the device's imaging in the indoor three-dimensional model; for a humidifier, for instance, its imaging in the three-dimensional modeling information must first be found. Meanwhile, the visual sensing module obtains the coordinates of the 4 pixel points, and after the user finally confirms them as the basis of the device's positioning information, the center point coordinates O((X4+X1)/2, (Y2+Y1)/2) of the frame are taken as the position coordinates of the device. The user is then required to input the ID tag of the target device on the operation interface to complete the binding of the corresponding device.
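The center-point step above can be sketched as follows. Note this sketch takes the center of the bounding box of all four corners, a generalization assumed for illustration, since the patent's own formula O((X4+X1)/2, (Y2+Y1)/2) depends on a particular corner ordering; the function name is hypothetical.

```python
# Hypothetical sketch: take the center of the user-drawn rectangular frame
# as the device's position coordinates in the 3D model.

def frame_center(corners):
    """corners: list of four (x, y, z) pixel coordinates of the frame."""
    if len(corners) != 4:
        raise ValueError("a rectangular frame needs exactly four corners")
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    zs = [c[2] for c in corners]
    # Center of the axis-aligned bounding box of the four corners; for an
    # axis-aligned rectangle this equals averaging opposite corners.
    return ((min(xs) + max(xs)) / 2,
            (min(ys) + max(ys)) / 2,
            (min(zs) + max(zs)) / 2)

corners = [(0, 0, 1), (4, 0, 1), (4, 2, 1), (0, 2, 1)]
print(frame_center(corners))  # -> (2.0, 1.0, 1.0)
```

The resulting center is what gets bound to the device's ID tag in the calibration step.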
Further optionally, the determining that the user is close to the target device includes
Acquiring user image information, and acquiring depth information of each pixel point based on the user image information;
determining the position coordinate of each pixel point through the depth information of each pixel point, and determining the position of a user in the indoor three-dimensional modeling information based on the position coordinate of each pixel point;
and determining the distance between the user and the target device based on the positions of the user and the target device in the indoor three-dimensional modeling information, and judging that the user approaches the target device when the distance between the user and the target device is smaller than or equal to a set distance.
After the target device is calibrated, control can be triggered by the distance between the user and the target device, since each ID tag has corresponding control information on the server. When the user is detected approaching the target device, this indicates an intention to control it; the control information of the target device is popped up on the control terminal, and the user can then control the device accordingly. Specifically, the visual sensing module also continuously acquires a certain number of frames of user image information and accurately obtains the depth information of each pixel point in the user image, so that each pixel point has a definite coordinate (X, Y, Z) in a coordinate system whose origin is the position of the visual sensing module. The position of the user in the three-dimensional modeling information is determined from the coordinates of each pixel point in the user image, and from it the relative position between the user and the target device. When the distance between the user and the target device is less than or equal to a set distance, the user is judged to be approaching the target device, and the control terminal pops up the control interface of the target device.
When the control terminal is a mobile terminal such as a mobile phone or an iPad, to avoid popping up a control interface on the mobile terminal when other family members approach the smart device during indoor activities, this embodiment requires that the control interface of the target device pop up only when the distance between the user holding the mobile terminal and the target device is less than or equal to the set distance. That is, the distance between the mobile terminal and the target device is obtained, and when it is less than or equal to the set distance, the mobile terminal pops up the control interface of the target device so that the user can control it. The set distance can be adjusted to the actual situation; 0.5 m is one possible choice.
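The proximity trigger described above amounts to a Euclidean distance threshold in model space. The sketch below follows the embodiment's 0.5 m example; the function names are hypothetical.

```python
import math

# Hypothetical sketch of the proximity trigger: pop up a control interface
# when the user (or the mobile terminal they hold) is within a set distance
# of a calibrated device. The 0.5 m default follows the embodiment's example.

SET_DISTANCE_M = 0.5

def distance(a, b):
    """Euclidean distance between two (x, y, z) positions in the 3D model."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def should_pop_up(user_pos, device_pos, set_distance=SET_DISTANCE_M):
    """True when the user is judged to be approaching the device."""
    return distance(user_pos, device_pos) <= set_distance

print(should_pop_up((0.0, 0.0, 0.0), (0.3, 0.0, 0.0)))  # -> True
print(should_pop_up((0.0, 0.0, 0.0), (2.0, 0.0, 0.0)))  # -> False
```

Restricting the check to the position of the mobile terminal itself, rather than any detected person, implements the embodiment's guard against other family members triggering the pop-up.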
Further optionally, the user selecting the target device on the indoor three-dimensional modeling information through the control terminal includes:
When the pixel point at the position the user selects on the indoor three-dimensional modeling information falls within the pixel point range corresponding to the target device, it is judged that the user has selected the target device. Taking an air-conditioner screen as the control terminal, the screen presents the indoor three-dimensional imaging and the user touches the corresponding position with a finger; if the pixel point of the touched position falls within the pixel point range corresponding to the target device, the air-conditioner screen automatically pops up the relevant control information of the target device.
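The touch selection above is a point-in-rectangle hit test against each device's calibrated frame. The sketch below uses screen-space rectangles; the tag names and function signature are hypothetical.

```python
# Hypothetical sketch of touch selection: a touch selects a device when the
# touched pixel falls inside that device's calibrated rectangular frame.

def hit_test(touch, frames):
    """Return the ID tag of the first device whose frame contains the touch.

    touch  -- (x, y) screen pixel of the touch
    frames -- {id_tag: (x_min, y_min, x_max, y_max)} calibrated frames
    """
    tx, ty = touch
    for id_tag, (x0, y0, x1, y1) in frames.items():
        if x0 <= tx <= x1 and y0 <= ty <= y1:
            return id_tag
    return None  # touch landed outside every device's pixel range

frames = {"TV-001": (100, 50, 300, 200), "FAN-001": (400, 80, 500, 260)}
print(hit_test((150, 120), frames))  # -> TV-001
print(hit_test((10, 10), frames))    # -> None
```

A `None` result corresponds to a touch that selects no device, in which case no control interface pops up.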
For the control method of this embodiment, in one specific implementation the visual sensing module is a TOF module. The TOF module acquires the depth information of each point of the indoor environment and realizes the three-dimensional modeling shown in the figures on the air-conditioner screen or the mobile-phone APP. A user U completes the annotation of a device A as shown in the figures, and the TOF module obtains the coordinates of the 4 pixel points of device A from the air-conditioner screen or the mobile-phone APP via wireless or serial communication. After user U completes the annotation of device A and inputs the corresponding device ID tag, the indoor position information of device A and the server's control information for device A are essentially determined. When user U comes within the set distance of device A, the control information of device A pops up on the phone screen and user U can control device A; alternatively, when the user clicks the position corresponding to the target device on the indoor three-dimensional imaging presented on the air-conditioner screen, the screen automatically pops up the relevant control information of the target device.
This embodiment has still provided an intelligent home systems, include
The visual sensing module is used for acquiring indoor environment image information and user image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, determining position information of a user in the indoor three-dimensional modeling information based on the acquired user image information, and mapping the indoor three-dimensional modeling information and the position information of the user in an indoor three-dimensional model on the control terminal;
the device calibration module is used for determining the position coordinate of the target device according to the imaging of the target device in the indoor three-dimensional modeling information and binding the position coordinate of the target device with the ID label of the target device;
the control module acquires the user's position information through the visual sensing module to determine the distance between the user and the target device, and, when it detects that the user is close to the target device, pops up a control interface for the target device on the control terminal according to the control information of the target device calibrated by the device calibration module; and/or, when it detects that the user has selected the target device on the indoor three-dimensional modeling information, pops up a control interface for the target device on the control terminal according to the control information of the target device calibrated by the device calibration module.
In this embodiment, the visual sensing module is at least one of a TOF camera, at least two binocular cameras, or a lidar. The visual sensing module is connected to the control terminal via wireless communication or serial-port communication.
This embodiment also proposes a non-transitory computer-readable storage medium storing program instructions which, when executed by one or more processors, implement any of the control methods described above.
The invention also proposes a control terminal that employs any of the methods described above, or that incorporates a non-transitory computer-readable storage medium as described above.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. A control method of an intelligent home system is characterized by comprising the following steps:
acquiring indoor environment image information, constructing indoor three-dimensional modeling information based on the acquired indoor environment image information, and mapping the indoor three-dimensional modeling information on a control terminal;
calibrating target equipment on indoor three-dimensional modeling information through a control terminal;
and when the condition that the user controls the target equipment is met, popping up a control interface of the target equipment on the control terminal.
2. The control method of the smart home system according to claim 1, wherein the condition for the user to control the target device includes:
When it is determined that the user is close to the target device,
and/or when the user selects the target equipment on the indoor three-dimensional modeling information through the control terminal.
3. The method for controlling the smart home system according to claim 1 or 2, wherein the obtaining indoor environment image information, constructing indoor three-dimensional modeling information based on the obtained indoor environment image information, and mapping the indoor three-dimensional modeling information on the control terminal includes:
acquiring indoor environment image information, and acquiring depth information of each pixel point based on the indoor environment image information;
determining the position coordinate of each pixel point through the depth information of each pixel point, and constructing indoor three-dimensional modeling information based on the position coordinate of each pixel point;
and mapping the position coordinates of each pixel point to a control terminal according to a certain proportion, and presenting the three-dimensional modeling information on the control terminal.
4. The control method of the intelligent home system according to claim 1 or 2, wherein calibrating the target device on the indoor three-dimensional modeling information by the control terminal comprises:
According to the imaging of the target equipment on the three-dimensional modeling information, determining the position coordinates of the target equipment on the three-dimensional modeling information;
and binding the position coordinates of the target equipment with the ID label of the target equipment to finish the calibration of the target equipment.
5. The method according to claim 4, wherein the determining the position coordinates of the target device on the three-dimensional modeling information according to the imaging of the target device on the three-dimensional modeling information comprises:
Selecting at least four pixel point coordinates to form a rectangular frame that encloses the imaging of the target device on the three-dimensional modeling information;
and determining the coordinates of the central point of the rectangular frame, wherein the coordinates of the central point of the rectangular frame are regarded as the position coordinates of the target equipment on the three-dimensional modeling information.
6. The method according to claim 5, wherein the step of determining that the user is close to the target device comprises:
Acquiring user image information, and acquiring depth information of each pixel point based on the user image information;
determining the position coordinate of each pixel point through the depth information of each pixel point, and determining the position of a user in the indoor three-dimensional modeling information based on the position coordinate of each pixel point;
and determining the distance between the user and the target device based on the positions of the user and the target device in the indoor three-dimensional modeling information, and judging that the user approaches the target device when the distance between the user and the target device is smaller than or equal to a set distance.
7. The method according to claim 6, wherein the control terminal is a mobile terminal, and when a distance between a user holding the mobile terminal and the target device is less than or equal to a set distance, a control interface of the target device pops up on the mobile terminal.
8. The method for controlling the smart home system according to claim 5, wherein the user selecting the target device on the indoor three-dimensional modeling information through the control terminal includes:
And when the pixel point at the selected position of the indoor three-dimensional modeling information of the user is located in the pixel point range corresponding to the target equipment, judging that the target equipment is selected by the user.
9. An intelligent home system, characterized by comprising:
a visual sensing module, configured to acquire indoor environment image information and user image information, construct indoor three-dimensional modeling information from the acquired indoor environment image information, determine the user's position in the indoor three-dimensional modeling information from the acquired user image information, and map the indoor three-dimensional modeling information and the user's position in the indoor three-dimensional model onto the control terminal;
a device calibration module, configured to determine the position coordinates of the target device from its imaging in the indoor three-dimensional modeling information, and to bind those position coordinates to the target device's ID tag; and
a control module, which obtains the user's position through the visual sensing module to determine the distance between the user and the target device and, upon detecting that the user is close to the target device, pops up the target device's control interface on the control terminal according to the control information calibrated by the device calibration module; and/or, upon detecting that the user has selected the target device on the indoor three-dimensional modeling information, pops up the target device's control interface on the control terminal according to the control information calibrated by the device calibration module.
10. A non-transitory computer readable storage medium having stored thereon program instructions which, when executed by one or more processors, are operable to implement a control method according to any one of claims 1 to 8.
11. A control terminal employing the method of any one of claims 1 to 7 or having a non-transitory computer-readable storage medium of claim 10.
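The depth-to-coordinates construction and proportional mapping in claim 3 can be sketched as follows. The pinhole back-projection model, the intrinsics, and the fixed scale factor are illustrative assumptions, not the patent's specified method.

```python
# Hypothetical claim-3 sketch: per-pixel depth -> 3D points -> scaled terminal view.
def build_model(depth_map, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Derive a 3D position coordinate for every pixel from its depth value."""
    return {
        (u, v): ((u - cx) * d / fx, (v - cy) * d / fy, d)
        for (u, v), d in depth_map.items()
    }

def map_to_terminal(points, scale=0.5):
    """Map the model coordinates onto the control terminal at a fixed proportion."""
    return {px: (x * scale, y * scale, z * scale)
            for px, (x, y, z) in points.items()}
```

The scale factor stands in for the claim's "certain proportion"; a real terminal would choose it from its screen resolution and the model's extent.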
CN202110858497.0A 2021-07-28 2021-07-28 Control method of intelligent home system and intelligent home system Pending CN113467267A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110858497.0A CN113467267A (en) 2021-07-28 2021-07-28 Control method of intelligent home system and intelligent home system

Publications (1)

Publication Number Publication Date
CN113467267A true CN113467267A (en) 2021-10-01

Family

ID=77882935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110858497.0A Pending CN113467267A (en) 2021-07-28 2021-07-28 Control method of intelligent home system and intelligent home system

Country Status (1)

Country Link
CN (1) CN113467267A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110568769A (en) * 2019-08-23 2019-12-13 广州佳恒工程技术有限公司 Intelligent household control system
CN115068945A (en) * 2022-08-19 2022-09-20 深圳市必凡娱乐科技有限公司 Information interaction method and system in game process

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108064447A (en) * 2017-11-29 2018-05-22 深圳前海达闼云端智能科技有限公司 Method for displaying image, intelligent glasses and storage medium
CN108168539A (en) * 2017-12-21 2018-06-15 儒安科技有限公司 A kind of blind man navigation method based on computer vision, apparatus and system
CN108931924A (en) * 2018-08-06 2018-12-04 珠海格力电器股份有限公司 The control method and device of smart home system, processor, storage medium
CN109407541A (en) * 2019-01-08 2019-03-01 京东方科技集团股份有限公司 The control method and device of smart home device
CN109491263A (en) * 2018-12-13 2019-03-19 深圳绿米联创科技有限公司 Intelligent home equipment control method, device, system and storage medium
CN109507904A (en) * 2018-12-18 2019-03-22 珠海格力电器股份有限公司 Home equipment management method, server and management system
CN110572305A (en) * 2019-08-26 2019-12-13 珠海格力电器股份有限公司 Smart home equipment binding method and system, smart home equipment and mobile terminal
CN111340939A (en) * 2020-02-21 2020-06-26 广东工业大学 Indoor three-dimensional semantic map construction method

Similar Documents

Publication Publication Date Title
CN107229231B (en) Household equipment management method and device
EP3526658B1 (en) Method and device for obtaining real time status and controlling of transmitting devices
CN108931924B (en) Control method and device of intelligent household system, processor and storage medium
CN109682023B (en) Air conditioner air supply track display method and device
CN113467267A (en) Control method of intelligent home system and intelligent home system
US10735820B2 (en) Electronic device and method for controlling the electronic device
CN111107333B (en) Brightness correction method, system, equipment and computer readable storage medium
WO2020052167A1 (en) Method and device for determining air blowing angle range of air conditioner, and air conditioner
KR20150136981A (en) Apparatus and method for controlling internet of things devices
US9482606B2 (en) Method for processing data and electronic device thereof
WO2014121521A1 (en) A method, system and processor for instantly recognizing and positioning an object
CN107562288A (en) Response method based on infrared contactor control device, infrared contactor control device and medium
KR20170066054A (en) Method and apparatus for providing audio
JP5554301B2 (en) Air conditioner
WO2019214641A1 (en) Optical tag based information apparatus interaction method and system
CN112017133B (en) Image display method and device and electronic equipment
CN113870390A (en) Target marking processing method and device, electronic equipment and readable storage medium
CN111161130B (en) Video correction method based on three-dimensional geographic information
CN111240217B (en) State detection method and device, electronic equipment and storage medium
CN109507904B (en) Household equipment management method, server and management system
CN111652942A (en) Calibration method of camera module, first electronic device and second electronic device
KR20080086292A (en) Multimedia table device
CN107886540B (en) Method for identifying and positioning articles in refrigeration equipment and refrigeration equipment
JP2005115069A (en) Display device
WO2009025499A1 (en) Remote control system using natural view user interface, remote control device, and method therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20211001