WO2020191755A1 - Control method implemented for a smart home, and smart device - Google Patents

Control method implemented for a smart home, and smart device

Info

Publication number
WO2020191755A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
user
user event
sensing
event information
Prior art date
Application number
PCT/CN2019/080226
Other languages
English (en)
Chinese (zh)
Inventor
李修球
Original Assignee
李修球
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 李修球 filed Critical 李修球
Priority to PCT/CN2019/080226 priority Critical patent/WO2020191755A1/fr
Publication of WO2020191755A1 publication Critical patent/WO2020191755A1/fr

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B44/00Circuit arrangements for operating electroluminescent light sources
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the present invention relates to the field of smart homes, in particular to an implementation method for a smart home and a smart device.
  • a smart home establishes a safe, comfortable, convenient, intelligent, and warm home environment based on the Internet of Things and broadband networks, relying on new-generation information technologies such as the mobile Internet and cloud computing, and enables intelligent provision of services as well as two-way intelligent interaction between people and household facilities.
  • the lighting system in a smart home system addresses the most basic need of users.
  • a smart home system mainly controls the lights, because lights are not themselves smart devices, whereas other inherently smart devices (such as air conditioners, electric curtains, and TVs) can be controlled simply through network communication or infrared emission.
  • the smart home control center can be any of a variety of devices (for example, smart TVs, indoor intercom extensions, smart robots, smart gateways, smart routers, magic mirrors, smart speakers, refrigerators, air conditioners, etc.), but these devices themselves have no lighting function; they perform intelligent control by communicating with smart switch modules over multi-network technology, so users must first purchase smart lighting switches. In reality, however, most lighting circuits in new and old buildings are designed as single-live-wire loops, so such a smart switch can control at most three circuits, and all single-live-wire circuits suffer from the problem of LED lights glowing dimly at night.
  • the smart switch can only be operated through passive intelligent methods such as panel control, APP software control, remote control, and voice control.
  • for panel control, the number of panels (lighting panels, video intercom panels, air conditioning panels, alarm keypads, etc.) that need to be installed in the home is large, they are not attractive, construction is difficult, and the user must walk to the panel to operate it;
  • for APP software control, although it spares the user from walking to a panel, it requires the user to control home equipment by clicking through the hierarchical menus of the APP software, and the mobile phone must be carried at home at all times; the operation is inconvenient and not proactive, especially for the elderly, children, and the disabled;
  • for the remote-control mode, the remote control is not kept in a fixed position and is not easy to find when needed; voice control is limited by recognition distance, manual wake-up, and recognition-rate issues, as well as situations where it is inconvenient or impossible to speak; for gesture control, the current technology is immature and works only at very close range, and video-based gesture recognition raises user privacy concerns, so gesture control confined to a small specific area is no different from direct panel control and remains only a concept. Therefore, the existing lighting control methods all have limitations to a greater or lesser degree.
  • smart devices currently communicate with peripheral smart devices through multi-network technology to control their output; at most, they have their own display-screen output or voice output.
  • the display output and voice output methods also have limitations: voice output is unsuitable for people with disabilities such as the deaf-mute, and display output requires the user to walk close to the screen to see it, making it also unsuitable for the visually impaired and people with limited mobility. Even if users accept these passive operation methods, the intelligently controlled lights only switch and dim, and no other smarter functions are given to the lights.
  • as a result, the existing smart devices give users a poor experience.
  • the technical problem to be solved by the present invention is the poor user experience in the prior art.
  • the technical solution adopted by the present invention to solve this technical problem is: constructing an implementation method for a smart home, applied in a smart device, which includes the following steps:
  • Step S11. Acquire user characteristic information in real time, the user characteristic information including: identity, location, and posture;
  • Step S12. Input the location information and site information into a pre-established user event model, and obtain user event information according to the output result of the user event model, wherein the site information includes at least one of the following: house structure information, indoor space information, environmental information, time information;
  • Step S13. Compare the user event information with a pre-stored first association table, and when the user event information exists in the first association table, obtain LED control information corresponding to the user event information,
  • wherein the first association table includes multiple user event information and LED control information corresponding to each user event information;
  • Step S14. Drive at least one set of LED light strips according to the LED control information, wherein the at least one set of LED light strips is arranged on the smart device.
  • when the user event information does not exist in the first association table, the following steps are performed:
  • Step S15. Compare the user event information with a pre-stored second association table, and when the user event information exists in the second association table, obtain the room light control information corresponding to the user event information, wherein the second association table includes multiple user event information and room light control information corresponding to each user event information;
  • Step S16. Control the lights installed at the specific location of the room according to the room light control information.
  • the LED control information includes: color, brightness, lighting mode, and duration.
  • the step S11 includes:
  • Acquire image data from a camera in real time, and recognize the image data to obtain user characteristic information in real time.
  • the step S11 further includes:
  • Acquire sensing data of a human body sensing module in real time, and determine the location information of the user in real time according to the sensing data.
  • the step S11 further includes:
  • Acquire sensing data of at least one distance sensing module in real time, wherein the at least one distance sensing module is arranged on the outer periphery of the smart device, and a sensing area larger than 180 degrees is formed around the periphery of the smart device;
  • the location information of the user is determined in real time according to the sensing data.
  • the step S11 further includes:
  • the sensing data of the lower sensing module and the sensing data of the upper sensing module are acquired in real time, wherein the lower sensing module and the upper sensing module are respectively arranged on the casing of the smart device, the lower sensing module performs distance sensing on targets in the lower detection space it forms, the upper sensing module performs distance sensing on targets in the upper detection space it forms, and the lower detection space and the upper detection space at least partially overlap;
  • the three-dimensional position information of the user is determined in real time according to the sensing data of the lower detection space and the sensing data of the upper detection space.
  • the user event information includes at least one of the following: entering, going out, getting up at night, waking up, falling, sitting for a long time without rising, being sedentary, lying for a long time without rising, taking medicine, and illegal intrusion.
  • the present invention also constructs an intelligent device, including:
  • at least one set of LED light strips;
  • the first acquisition module is configured to acquire user characteristic information in real time, where the user characteristic information includes: identity, location, and posture;
  • the second acquisition module is used to input the location information and site information into a pre-established user event model, and acquire user event information according to the output result of the user event model, where the site information includes at least one of the following: house structure information, indoor space information, environmental information, and time information;
  • the third acquisition module is configured to compare the user event information with a pre-stored first association table, and when the user event information exists in the first association table, obtain the LED control information corresponding to the user event information, wherein the first association table includes multiple user event information and LED control information corresponding to each user event information;
  • the driving module is used to drive at least one group of LED light strips according to the LED control information.
  • the at least one set of LED light strips includes:
  • the first LED light strip arranged on the front; and/or,
  • the second LED light strips arranged on the left and right sides; and/or,
  • the third LED light strip arranged at the top; and/or,
  • the fourth LED light strip arranged at the bottom.
  • it also includes:
  • the fourth acquiring module is configured to compare the user event information with a pre-stored second association table when the user event information does not exist in the first association table, and, when the user event information exists in the second association table, to acquire the room light control information corresponding to the user event information, where the second association table includes multiple user event information and room light control information corresponding to each user event information;
  • the light control module is used to control the lights installed at a specific location in the room according to the room light control information.
  • it also includes a charging module and an energy storage module.
  • it also includes a camera, and,
  • the first acquisition module is configured to acquire image data from the camera in real time and recognize the image data to acquire user characteristic information in real time.
  • it also includes a human body sensing module, and,
  • the first acquisition module is also used to acquire sensing data of the human body sensing module in real time, and determine the user's location information in real time according to the sensing data.
  • it further includes at least one distance sensing module arranged on the outer periphery of the smart device, and a sensing area larger than 180 degrees is formed on the periphery of the smart device, and,
  • the first acquisition module is configured to acquire sensing data of at least one distance sensing module in real time, and determine the location information of the user in real time according to the sensing data.
  • it also includes:
  • the upper sensing module is used for distance sensing of the target in the upper detection space it forms;
  • the lower sensing module is used for distance sensing of the target in the lower detection space it forms;
  • the first acquisition module is also used to acquire the sensing data of the lower sensing module and the sensing data of the upper sensing module in real time, and to determine the user's three-dimensional location information in real time according to the sensing data of the lower detection space and the sensing data of the upper detection space.
  • by implementing the present invention, the user event can be intelligently determined, and at least one set of LED light strips can then be controlled accordingly according to the user event.
  • indoor lighting can thus be controlled without adding smart switches, and the control is actively perceptive, proactively providing care services to users, especially special groups such as the deaf-mute and the elderly, so that users enjoy a convenient, intelligent, comfortable, and fast experience.
  • FIG. 1 is a flowchart of Embodiment 1 of a method for implementing a smart home according to the present invention
  • FIG. 2 is a logical structure diagram of Embodiment 1 of the smart device of the present invention.
  • FIG. 1 is a flowchart of Embodiment 1 of a smart home implementation method of the present invention.
  • the smart home implementation method of this embodiment is applied to a smart device, where the smart device is, for example, a smart TV, an indoor intercom extension, a smart robot, a smart gateway, a smart router, a magic mirror, a smart speaker, a smart refrigerator, or a smart air conditioner.
  • the implementation method of the smart home of this embodiment includes the following steps:
  • Step S11. Acquire user characteristic information in real time, the user characteristic information including: identity, location, and posture;
  • Step S12. Input the location information and site information into a pre-established user event model, and obtain user event information according to the output result of the user event model, wherein the site information includes at least one of the following: house structure information, indoor space information, environmental information, time information;
  • Step S13. Compare the user event information with a pre-stored first association table, and when the user event information exists in the first association table, obtain LED control information corresponding to the user event information,
  • wherein the first association table includes multiple user event information and LED control information corresponding to each user event information;
  • Step S14. Drive at least one set of LED light strips according to the LED control information, wherein the at least one set of LED light strips is arranged on the smart device.
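The S11-S14 loop above can be sketched in a few lines of code. This is purely an illustrative sketch, not part of the disclosed embodiments: the table contents and all names (`control_cycle`, `demo_model`, the driver callback) are assumptions.

```python
# Illustrative sketch of steps S11-S14; table contents and all
# names here are assumptions, not part of the patent text.

FIRST_ASSOCIATION_TABLE = {
    # user event -> LED control information (color, brightness %, mode, seconds)
    "entering":  {"color": "white", "brightness": 100, "mode": "on",    "duration": 300},
    "night_up":  {"color": "warm",  "brightness": 50,  "mode": "on",    "duration": 120},
    "sedentary": {"color": "red",   "brightness": 80,  "mode": "flash", "duration": 60},
}

def control_cycle(user_features, site_info, event_model, drive_leds):
    """One pass of steps S11-S14."""
    # S12: feed the location and site information to the pre-established model
    event = event_model(user_features["location"], site_info)
    # S13: look the resulting user event up in the first association table
    control = FIRST_ASSOCIATION_TABLE.get(event)
    if control is not None:
        drive_leds(control)        # S14: drive the on-device LED strips
    return control

# Minimal stand-ins for the model and the LED driver:
demo_model = lambda location, site: "entering" if location == "doorway" else "unknown"
applied = []
result = control_cycle(
    {"identity": "user1", "location": "doorway", "posture": "standing"},
    {"time": "19:02"}, demo_model, applied.append)
```

Events absent from the table simply produce no LED action, which is where step S15's second table would take over.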
  • the on-site information in step S12 includes: house structure information, indoor space information, environmental information, and time information; of course, it may also include on-site audio information.
  • the house structure information can be input into the smart device by the user, property-management personnel, or the manufacturer's maintenance personnel before the smart device is used for the first time, for example, by importing a picture of the house structure to a designated path on the smart device;
  • alternatively, during use of the smart device, it can be obtained through big-data analysis of the house structure data of other smart devices in the community LAN, or learned by machine learning from information, sensed by distance sensors and environmental sensors, such as the frequency, location, speed, dwell time, time period, and lighting of the user's activity areas.
  • the indoor space information includes, for example, the indoor layout, furniture placement, etc.
  • it can be obtained by photographing the room with the device's own camera before first use and processing the images into a panoramic picture; it can also be obtained by using the device's own distance-sensing probe to detect distances in all directions of the room and generate a digital map of the room.
  • the environmental information is obtained through one or more of the device's own smoke sensor, gas sensor, CO sensor, VOC sensor, temperature and humidity sensor, PM2.5 sensor, formaldehyde sensor, CO2 sensor, and illuminance sensor.
  • the time information can be obtained through the device's own clock-generation module, and the clock can also be calibrated by communicating with at least one of the following: the user's mobile terminal, a remote server, other smart devices in the community LAN, the unit door phone, etc.
  • the audio information can be obtained through the device's own microphone (or microphone array).
  • the user event model in step S12 is established in advance; the specific establishment steps are:
  • collect training sample data, where the training sample data includes user characteristic information and site information in specific time periods, together with the corresponding user event information;
  • use a preset training method to train on the characteristic information and user event information to obtain the user event model.
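The patent does not name a concrete training method. As a hedged illustration only, a minimal "user event model" could be a 1-nearest-neighbour rule over hand-picked numeric features (hour of day, room code); the features, samples, and names below are all assumptions.

```python
# Toy stand-in for the "preset training method": 1-nearest-neighbour
# over (hour, room_code) features. Feature choice is an assumption.

def train_event_model(samples):
    """samples: list of ((hour, room_code), event_label) pairs."""
    def model(hour, room_code):
        # return the label of the closest training sample (squared distance)
        best = min(samples,
                   key=lambda s: (s[0][0] - hour) ** 2 + (s[0][1] - room_code) ** 2)
        return best[1]
    return model

samples = [((7, 0), "waking up"),
           ((23, 1), "getting up at night"),
           ((8, 2), "going out")]
model = train_event_model(samples)
```

A deployed system would use a far richer feature set (posture, dwell time, house structure) and a proper learner, but the table-in/table-out shape is the same.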
  • the user event information may include: entering, going out, getting up at night, getting up, lying down, falling, sitting for a long time without rising, being sedentary, taking medicine, and illegal intrusion.
  • step S11 may obtain user characteristic information in the following manner:
  • the image data is recognized to obtain user characteristic information in real time.
  • specifically, the user's identity, location, and posture are acquired through image recognition, where the user's identity can be acquired through facial-feature recognition in the captured image data, or through iris features.
  • the location of the user can be obtained from the user's size in the captured image data and the positions of other reference objects.
  • the user's posture can be obtained by recognizing feature points of the arms, legs, head, etc. in the captured image data.
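As a toy illustration of keypoint-based posture recognition (not the patent's method; real systems use far richer models), one could classify posture from the vertical spread between head and feet. All thresholds and keypoint names are assumptions.

```python
# Hypothetical posture classifier over detected body keypoints.
# Image y-coordinates grow downward; thresholds are illustrative.

def classify_posture(keypoints):
    """keypoints: dict of name -> (x, y) in image coordinates."""
    head_y = keypoints["head"][1]
    feet_y = max(keypoints["left_foot"][1], keypoints["right_foot"][1])
    spread = feet_y - head_y        # vertical extent of the body
    if spread > 150:
        return "standing"
    if spread > 60:
        return "sitting"
    return "lying"

posture = classify_posture(
    {"head": (100, 40), "left_foot": (95, 260), "right_foot": (110, 255)})
```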
  • the camera may be a group of multiple cameras with overlapping shooting ranges, or a single 180° camera.
  • step S11 may also obtain user characteristic information in the following manner:
  • Acquire sensing data of a human body sensing module in real time, wherein the human body sensing module is set on the smart device and includes, for example, an infrared sensor or an infrared array sensor;
  • the location information of the user is determined in real time according to the sensing data.
  • in this manner, the user's location information is acquired through the sensing data of the human body sensing module.
  • the location information can be used independently, or it can be fused with location information acquired in other ways to improve the accuracy of the acquired user location.
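The fusion step described above could, as one hypothetical realization, be a confidence-weighted average of the location estimates; the weights and data shapes below are illustrative assumptions.

```python
# Hypothetical fusion of location estimates (e.g. camera + infrared)
# by confidence-weighted averaging; weights are illustrative.

def fuse_locations(estimates):
    """estimates: list of ((x, y), weight) location estimates."""
    total = sum(w for _, w in estimates)
    x = sum(p[0] * w for p, w in estimates) / total
    y = sum(p[1] * w for p, w in estimates) / total
    return (x, y)

fused = fuse_locations([((2.0, 1.0), 0.8),   # camera estimate, higher confidence
                        ((3.0, 1.5), 0.2)])  # infrared estimate, lower confidence
```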
  • step S11 may also obtain user characteristic information in the following manner:
  • Acquire sensing data of at least one distance sensing module in real time, where the distance sensing module can be a radar sensor with multiple transmitters and multiple receivers, or at least two radar/ultrasonic sensors each with a single transmitter and a single receiver;
  • the location information of the user is determined in real time according to the sensing data.
  • for example, multiple distance sensors can be arranged on the periphery of a smart speaker to form a 360-degree sensing area around it, so that the user's horizontal position within a 360-degree range can be measured.
  • the location information can be used independently, or it can be fused with location information obtained in other ways to improve the accuracy of the obtained user location.
  • step S11 may also obtain user characteristic information in the following manner:
  • the sensing data of the lower sensing module and the sensing data of the upper sensing module are acquired in real time, wherein the lower sensing module and the upper sensing module are respectively arranged on the casing of the smart device; the lower sensing module performs distance sensing on target objects in the lower detection space it forms; the upper sensing module performs distance sensing on target objects in the upper detection space it forms; and the lower detection space and the upper detection space at least partially overlap.
  • the lower sensing module includes at least two single-transmitter single-receiver radar/ultrasonic sensors, and/or at least one multi-transmitter single-receiver radar sensor;
  • the upper sensing module includes at least one single-transmitter single-receiver radar/ultrasonic sensor, and/or at least one radar sensor with multiple transmitters and multiple receivers;
  • the three-dimensional position information of the user is determined in real time according to the sensing data of the lower detection space and the sensing data of the upper detection space.
  • since two sensing modules are arranged along the height direction of the smart device, and the smart device is installed according to the type of user and the indoor environment, the user's height at home is ensured to fall between the upper and lower detection spaces.
  • the horizontal position of the user can therefore be determined from the sensing data of the lower sensing module, and the user's height from the sensing data of the upper sensing module, so as to finally determine the user's three-dimensional position.
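A sketch of this two-stage reconstruction, under an assumed sensor layout (two lower sensors on the floor plane for trilateration, one upper sensor whose slant range gives the height); none of this geometry is specified in the patent.

```python
# Assumed layout: two lower sensors at (0, 0) and (baseline, 0) on the
# floor plane, one upper sensor mounted at (0, 0, upper_mount_h).
import math

def horizontal_position(d1, d2, baseline):
    """Trilaterate the user's (x, y) from the two lower-module ranges."""
    x = (d1 ** 2 - d2 ** 2 + baseline ** 2) / (2 * baseline)
    y = math.sqrt(max(d1 ** 2 - x ** 2, 0.0))
    return x, y

def user_height(upper_range, upper_mount_h, x, y):
    """Recover the target height from the upper sensor's slant range."""
    horiz = math.hypot(x, y)
    drop = math.sqrt(max(upper_range ** 2 - horiz ** 2, 0.0))
    return upper_mount_h - drop

x, y = horizontal_position(d1=5.0, d2=5.0, baseline=6.0)   # symmetric target
h = user_height(upper_range=math.hypot(5.0, 0.8), upper_mount_h=2.5, x=x, y=y)
```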
  • the number of LED strips on the smart device can be flexibly set to one or more according to actual applications.
  • the at least one set of LED light strips includes: a first LED light strip arranged on the front; a second LED light strip arranged on the left and right sides; a third LED light strip arranged at the top; and a fourth LED light strip arranged at the bottom.
  • the plurality of lamp beads in the first LED light strip are arranged in a circular, square, spiral, octagonal shape, etc.; the plurality of lamp beads in the second LED light strip are arranged in a rectangular, oval, or striped shape.
  • the multiple lamp beads in the third and fourth LED light strips are arranged in rectangular, elliptical, bar-shaped and other shapes.
  • the LED control information includes: color, brightness, lighting mode, and duration.
  • the LED light strip includes three groups of red, green, and blue lamp beads, and these lamp beads can be connected in series or in parallel.
  • by controlling each group of lamp beads, the color and brightness of the corresponding LED light strip can be adjusted.
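As an illustrative sketch of turning the LED control information (colour plus brightness) into per-channel PWM duty cycles for the red/green/blue bead groups; the colour table and the 0-100 brightness scale are assumptions:

```python
# Hypothetical mapping of (colour, brightness %) to RGB PWM duties.
COLOURS = {"red": (255, 0, 0), "warm_yellow": (255, 200, 60), "white": (255, 255, 255)}

def rgb_duties(colour, brightness_pct):
    """Return (r, g, b) duty cycles in [0.0, 1.0]."""
    r, g, b = COLOURS[colour]
    scale = brightness_pct / 100.0
    return tuple(round(c / 255.0 * scale, 3) for c in (r, g, b))

duty = rgb_duties("warm_yellow", 50)   # e.g. the 50 % wake-up light
```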
  • the user event information and the corresponding LED control information stored in the first association table can be customized by the user, or can be the system default.
  • for example, if the detected user event is that the user enters the door, the first (front) LED light strip can be controlled to light up at 100% brightness, so that the lights turn on automatically on entry; if the detected user event is that the user goes out to work, the light strips can be controlled accordingly;
  • if the detected user event is that the user gets up at night, the LED light strip lights up at 50% brightness, which illuminates the floor for the user without being too dazzling, and the side light strips facing the toilet turn on at 50% brightness to guide and illuminate the user's path to the toilet; if the detected user event information is that the user has been sitting for a long time without rising (the user sits in one place for a long time, but the hands, feet, head, etc. still move), the second (left and right side) LED light strips are controlled to flash red and yellow, flashing continuously for one minute to remind the user that it is time to get up and move;
  • if the detected user event information is that the user is sedentary (the user sits in one place for a long time with hands, feet, and head motionless, different from the previous case), the first (front) LED light strip is controlled to flash red frequently, and an automatic alarm is raised or other family members are asked for help if necessary; if the detected user event information is that the user wakes up in the morning and no smart curtain is installed, the second (left and right side) LED light strips are first controlled to light up, and after a preset time (for example, 2 minutes) the first (front) LED light strip is controlled to light up, with the brightness increased gradually, for example from 10% to 80%, in a yellow light comfortable to human eyes, so as to illuminate the room for the user while giving the user's eyes time to adapt.
  • these light-based prompts can also be combined with the existing voice output method.
  • for alarm events, the first (front) LED light strip and the second (left and right side) LED light strips can be controlled to flash once per second, or to flash according to a data encoding, for example 2, 1, and 3 times; if the user is hearing-impaired or dislikes the alarm sound, then for an emergency distress event the first (front) LED light strip can be controlled to flash once and the second (left and right side) LED light strip to flash once, equivalent to flashing according to a data code, for example 1, 1, and 2 times, to confirm that the alarm message has been sent; if the detected user event information is that the user prefers to sleep with a light on at night, the first (front) LED light strip can be turned off while the second (left and right side) LED light strip is kept on for a preset time (for example, 30 minutes).
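The data-coded flashing described above can be illustrated by expanding a flash code such as (1, 1, 2) into a timed on/off schedule; the timing constants are assumptions.

```python
# Expand a flash code into (state, seconds) steps for one strip,
# with a longer pause between digit groups. Timings are illustrative.

def blink_schedule(code, flash_s=0.5, gap_s=1.0):
    steps = []
    for i, n in enumerate(code):
        for _ in range(n):
            steps += [("on", flash_s), ("off", flash_s)]
        if i < len(code) - 1:
            steps.append(("off", gap_s))   # pause between digit groups
    return steps

sched = blink_schedule((1, 1, 2))   # the "alarm sent" code above
```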
  • Step S15. Compare the user event information with a pre-stored second association table, and when the user event information exists in the second association table, obtain the room light control information corresponding to the user event information, wherein the second association table includes multiple user event information and room light control information corresponding to each user event information;
  • Step S16. Control the lights installed at the specific location of the room according to the room light control information.
  • user events can also be linked to the room lights, and there are several ways to control the room lights:
  • in the first way, the smart device takes power from a single live wire; that is, the smart device is connected into the AC circuit of the room light, can be powered directly by its energy storage module, and controls the light through its internal interface according to the room light control information. Because the smart device itself has built-in LED lights for illumination, it can replace the room's original light circuit: users do not need to upgrade the original lights to LED lights and can even directly remove the original old bulbs.
  • in the second way, the room light control information can be transmitted to the corresponding smart switch through bus communication, with the device's energy storage module supplying the electrical energy for smart control;
  • in the third way, the room light control information can be transmitted to the corresponding smart switch through wireless communication.
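The two-table dispatch of steps S13/S15-S16 can be sketched as follows; both tables and their contents are illustrative assumptions:

```python
# Illustrative two-table dispatch: the first association table drives
# the on-device LED strips, the second drives the room lights; the
# second is only consulted when the first has no entry (per S15).

FIRST_TABLE  = {"entering": "front strip 100%"}
SECOND_TABLE = {"getting up at night": "hallway light 50%"}

def dispatch(event, drive_led, drive_room_light):
    if event in FIRST_TABLE:            # S13 -> S14
        drive_led(FIRST_TABLE[event])
        return "led"
    if event in SECOND_TABLE:           # S15 -> S16
        drive_room_light(SECOND_TABLE[event])
        return "room"
    return None

led_cmds, room_cmds = [], []
r1 = dispatch("entering", led_cmds.append, room_cmds.append)
r2 = dispatch("getting up at night", led_cmds.append, room_cmds.append)
```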
  • the smart device of this embodiment includes: a first acquisition module 10, a second acquisition module 20, a third acquisition module 30, a drive module 40 and multiple sets of LED light strips 51...52.
  • the first acquiring module 10 is used to acquire user characteristic information in real time, and the user characteristic information includes: identity, location, and posture;
  • the second acquiring module 20 is used to input the position information and site information into the pre-established user event model, and to acquire user event information according to the output result of the user event model,
  • wherein the on-site information includes at least one of the following: house structure information, indoor space information, environmental information, and time information;
  • third acquisition module 30 is configured to compare the user event information with a pre-stored first association table, and when the user event information exists in the first association table, obtain LED control information corresponding to the user event information,
  • the first association table includes multiple user event information and LED control information corresponding to each user event information;
  • the driving module 40 is used to drive at least one set of LED light strips according to the LED control information.
  • the smart device of this embodiment further includes a camera 11 and a sensing module 12, wherein the camera 11 is preferably a 180° camera, or multiple cameras are provided, and the shooting ranges of adjacent cameras overlap.
  • the sensing module 12 is, for example, an infrared sensing module or a distance sensing module. Preferably, the number of sensing modules can be set to multiple, thereby increasing the sensing range.
  • the first acquisition module 10 is configured to acquire image data from the camera 11 in real time and recognize the image data to acquire user characteristic information in real time.
  • in one embodiment, the sensing module 12 is a human body sensing module, and the first acquiring module 10 is also used to acquire sensing data of the human body sensing module in real time, and to determine the user's location information in real time according to the sensing data.
  • in another embodiment, the sensing module 12 includes at least one distance sensing module arranged on the outer periphery of the smart device, forming a sensing area larger than 180 degrees around the periphery of the smart device, and the first acquiring module 10 is also used to acquire sensing data of the at least one distance sensing module in real time, and to determine the location information of the user in real time according to the sensing data.
  • the sensing module includes an upper sensing module and a lower sensing module, wherein the upper sensing module is used for distance sensing of targets in the upper detection space it forms, and the lower sensing module is used for distance sensing of targets in the lower detection space it forms; moreover, the first acquisition module 10 is also used to acquire the sensing data of the lower sensing module and of the upper sensing module in real time, and to determine the user's three-dimensional position coordinates in real time from the sensing data in the lower detection space and the sensing data in the upper detection space.
  • the at least one set of LED light strips includes: a first LED light strip arranged on the front; a second LED light strip arranged on the left and right sides; a third LED light strip arranged on the top; and a fourth LED light strip arranged at the bottom.
  • the plurality of lamp beads in the first LED light strip are arranged in a circular, square, spiral, octagonal, or similar shape;
  • the plurality of lamp beads in the second LED light strip are arranged in a rectangular, oval, bar, or similar shape;
  • the plurality of lamp beads in the third and fourth LED light strips are arranged in a rectangular, oval, bar, or similar shape.
  • the LED control information includes: color, brightness, lighting mode, and duration.
  • the LED light strip includes three groups of red, green, and blue lamp beads, which can be connected in series or in parallel;
  • by controlling these lamp beads, the color and brightness of the corresponding LED light strip can be adjusted.
  • the smart device of the present invention further includes a charging module and an energy storage module; power is supplied to each module of the smart device through the energy storage module, so that even if the smart device obtains power through a single live-wire loop, the LED lights will not flicker.
  • the smart device of the present invention can also perform linkage control between user events and lights installed at specific locations in the room. It specifically includes a fourth acquisition module and a light control module. The fourth acquisition module is used, when the user event information does not exist in the first association table, to compare the user event information with a pre-stored second association table and, when the user event information exists in the second association table, to obtain the room light control information corresponding to the user event information, wherein the second association table includes a plurality of user event information items and the room light control information corresponding to each; the light control module is used to control the lights installed at specific locations in the room according to the room light control information.
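As a rough sketch (not the patent's actual implementation), the two-table dispatch performed by the third and fourth acquisition modules can be illustrated as follows; the event names, control fields, and table contents are purely illustrative placeholders:

```python
from dataclasses import dataclass
from typing import Optional, Union

@dataclass
class LedControl:
    color: str        # e.g. "warm white"
    brightness: int   # percent
    mode: str         # lighting mode, e.g. "steady", "breathing"
    duration_s: int   # how long the effect lasts

# Hypothetical first association table: user event -> LED control information.
FIRST_ASSOCIATION = {
    "user_enters_room": LedControl("warm white", 80, "steady", 600),
    "user_sits_on_sofa": LedControl("soft yellow", 40, "steady", 1800),
}

# Hypothetical second association table: user event -> room light control information.
SECOND_ASSOCIATION = {
    "user_enters_bedroom": "bedroom_ceiling_light_on",
}

def dispatch(user_event: str) -> Optional[Union[LedControl, str]]:
    """Look up the event in the first table (drives the device's LED strips);
    fall back to the second table (room lights) only when the event is absent
    from the first table, mirroring the fourth acquisition module."""
    if user_event in FIRST_ASSOCIATION:
        return FIRST_ASSOCIATION[user_event]
    if user_event in SECOND_ASSOCIATION:
        return SECOND_ASSOCIATION[user_event]
    return None  # unrecognized event: no control action
```

The fallback ordering reflects the description above: on-device LED strip control takes priority, and room-light linkage is consulted only for events missing from the first table.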
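The fusion of the upper and lower detection spaces into three-dimensional position coordinates could be sketched as follows, assuming each distance sensing module reports a bearing and a range; the sensing heights and the averaging rule are illustrative assumptions, not the patent's method:

```python
import math

def plane_hit_to_xy(angle_deg: float, distance_m: float) -> tuple:
    """Convert one distance-sensor reading (bearing + range) within a
    horizontal sensing plane into x/y coordinates relative to the device."""
    a = math.radians(angle_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a))

def fuse_3d(lower_hit: tuple, upper_hit: tuple,
            lower_h: float = 0.3, upper_h: float = 1.5) -> tuple:
    """Fuse one hit from the lower detection space and one from the upper
    detection space into a rough 3-D position: x/y are averaged between the
    two planes, and z is taken as the midpoint of the two sensing heights
    (both heights are placeholder values)."""
    lx, ly = plane_hit_to_xy(*lower_hit)
    ux, uy = plane_hit_to_xy(*upper_hit)
    return ((lx + ux) / 2, (ly + uy) / 2, (lower_h + upper_h) / 2)
```

A target standing 1 m straight ahead, detected in both planes, would resolve to roughly (1.0, 0.0) in x/y with a z between the two sensing heights.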

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Optics & Photonics (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The present invention relates to an implementation method for a smart home and a smart device. The implementation method comprises the steps of: acquiring user characteristic information in real time (S11); respectively inputting position information and on-site information into a pre-built user event model and acquiring user event information (S12); comparing the user event information with a pre-stored first association table and, when the user event information exists in the first association table, acquiring LED control information corresponding to the user event information (S13); and controlling at least one LED light strip on the basis of the LED control information (S14).
PCT/CN2019/080226 2019-03-28 2019-03-28 Procédé de commande mise en œuvre pour une maison intelligente et un dispositif intelligent WO2020191755A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/080226 WO2020191755A1 (fr) 2019-03-28 2019-03-28 Procédé de commande mise en œuvre pour une maison intelligente et un dispositif intelligent

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/080226 WO2020191755A1 (fr) 2019-03-28 2019-03-28 Procédé de commande mise en œuvre pour une maison intelligente et un dispositif intelligent

Publications (1)

Publication Number Publication Date
WO2020191755A1 true WO2020191755A1 (fr) 2020-10-01

Family

ID=72609585

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/080226 WO2020191755A1 (fr) 2019-03-28 2019-03-28 Procédé de commande mise en œuvre pour une maison intelligente et un dispositif intelligent

Country Status (1)

Country Link
WO (1) WO2020191755A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114545837A (zh) * 2022-03-04 2022-05-27 珠海市一微机器人技术有限公司 一种机器人交互方法、芯片以及机器人
CN114609921A (zh) * 2022-03-03 2022-06-10 江苏悦达绿色建筑科技有限公司 一种高舒适低能耗三恒家居环境控制系统及方法
CN116156695A (zh) * 2023-04-21 2023-05-23 永林电子股份有限公司 一种居家用led智能系统
CN116406058A (zh) * 2023-04-12 2023-07-07 永林电子股份有限公司 一种led智能控制系统
CN117478119A (zh) * 2023-12-27 2024-01-30 深圳市华腾智能科技有限公司 一种指划区控制的智能开关面板及其控制方法

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140354187A1 (en) * 2013-05-28 2014-12-04 Abl Ip Holding Llc Distributed processing using resources of intelligent lighting elements of a lighting system
CN104244528A (zh) * 2014-09-22 2014-12-24 小米科技有限责任公司 智能灯的控制方法及装置
US20150115834A1 (en) * 2013-10-30 2015-04-30 Samsung Electronics Co., Ltd. Smart home network apparatus and control method thereof
CN104703369A (zh) * 2015-04-01 2015-06-10 山东共达电声股份有限公司 一种控制灯具的方法及装置
CN106304533A (zh) * 2015-06-12 2017-01-04 泉州市金太阳照明科技有限公司 一种灯光场景智能控制系统
CN107085695A (zh) * 2015-12-11 2017-08-22 富奇想股份有限公司 智能系统及信息和服务提供方法
CN107770915A (zh) * 2017-11-16 2018-03-06 万子恺 一种智能灯光控制系统及灯光控制的方法
CN109445393A (zh) * 2018-11-14 2019-03-08 重庆工业职业技术学院 基于移动终端定位的光感测自动照明系统
CN109445288A (zh) * 2018-09-28 2019-03-08 深圳慧安康科技有限公司 一种智慧家庭普及应用的实现方法


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114609921A (zh) * 2022-03-03 2022-06-10 江苏悦达绿色建筑科技有限公司 一种高舒适低能耗三恒家居环境控制系统及方法
CN114609921B (zh) * 2022-03-03 2023-02-03 江苏悦达绿色建筑科技有限公司 一种高舒适低能耗三恒家居环境控制系统及方法
CN114545837A (zh) * 2022-03-04 2022-05-27 珠海市一微机器人技术有限公司 一种机器人交互方法、芯片以及机器人
CN116406058A (zh) * 2023-04-12 2023-07-07 永林电子股份有限公司 一种led智能控制系统
CN116156695A (zh) * 2023-04-21 2023-05-23 永林电子股份有限公司 一种居家用led智能系统
CN117478119A (zh) * 2023-12-27 2024-01-30 深圳市华腾智能科技有限公司 一种指划区控制的智能开关面板及其控制方法
CN117478119B (zh) * 2023-12-27 2024-04-16 深圳市华腾智能科技有限公司 一种指划区控制的智能开关面板及其控制方法

Similar Documents

Publication Publication Date Title
WO2020191755A1 (fr) Procédé de commande mise en œuvre pour une maison intelligente et un dispositif intelligent
CN109917666B (zh) 智慧家庭的实现方法及智能装置
US20240127090A1 (en) Intelligent Docking System for Devices
CN109644166B (zh) 用于智能网络的智能模块和智能网络系统
US10709335B2 (en) Infant monitoring system with observation-based system control and feedback loops
CN105116859B (zh) 一种利用无人飞行器实现的智能家居系统及方法
CN109634129B (zh) 主动关怀的实现方法、系统及装置
US20170238401A1 (en) Solid State Lighting Systems
US20150204561A1 (en) Control System With Mobile Sensors
KR101436306B1 (ko) 보안기능을 갖는 조명 시스템
CN106131075A (zh) 一种基于tcp/ip协议的摇头灯及其控制系统
EP4155782A1 (fr) Systèmes et procédés de détection d'ultrasons dans des dispositifs intelligents
CN107526298A (zh) 智能家居服务机器人
CN111886633A (zh) 基于所分析的视频流的利用智能音频提示的婴儿监视
CN203784722U (zh) 多功能一体化智能电灯
CN110275443A (zh) 主动的智能控制方法、系统及智能装置
KR102190190B1 (ko) 인공지능을 활용한 공동주택 정보통신 통합 관리 시스템
CN108965717A (zh) 环目摄像机
CN207053531U (zh) 一种基于物联网的智能家居系统
CN107949121A (zh) 一种用于建筑高效节能型智慧照明系统及其工作方法
CN111948995A (zh) 一种基于物联网的远程舞台智能化控制系统
WO2021218597A1 (fr) Appareil, système et procédé de commande intelligente
CN109547916B (zh) 智能照明设备、通信系统及室内定位方法
CN110888333A (zh) 一种智能家居系统的场景启动装置
JP6624375B2 (ja) 監視システム、監視方法及びセンサーライト

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19921252

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19921252

Country of ref document: EP

Kind code of ref document: A1