CN118170036A - Intelligent home environment atmosphere adjusting method, system, device and storage medium - Google Patents


Info

Publication number
CN118170036A
CN118170036A (application CN202410272108.XA)
Authority
CN
China
Prior art keywords
environment
data
current
user
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410272108.XA
Other languages
Chinese (zh)
Inventor
黄立清
陈明亮
Current Assignee
Xiamen Leelen Technology Co Ltd
Original Assignee
Xiamen Leelen Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Xiamen Leelen Technology Co Ltd filed Critical Xiamen Leelen Technology Co Ltd
Priority to CN202410272108.XA priority Critical patent/CN118170036A/en
Publication of CN118170036A publication Critical patent/CN118170036A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Air Conditioning Control Device (AREA)

Abstract

A method, system, device and storage medium for adjusting the environmental atmosphere of a smart home. The method comprises the following steps: acquiring an environment mode selection instruction; acquiring, according to the instruction, the environmental data parameters preset for the environment mode selected by the user, and the environmental data of the user's current surroundings; issuing a current-environment image acquisition instruction according to the preset environmental data parameters and acquiring a current environment image; processing the current environment image to obtain current environment data, and transmitting the current environment data and the user's current environment data to a pre-trained environmental atmosphere adjustment model; comparing, on the basis of the user's current environment data, the current environment data with the preset environmental data parameters to obtain a comparison result; and issuing a corresponding environmental atmosphere adjustment instruction based on the comparison result. With this method the user need not adjust values repeatedly: selecting the desired environment mode adjusts the whole environmental atmosphere with one key, which reduces user operations and effectively improves the user experience.

Description

Intelligent home environment atmosphere adjusting method, system, device and storage medium
Technical Field
The invention relates to the technical field of smart homes, and in particular to a method, system, device and storage medium for adjusting the environmental atmosphere of a smart home.
Background
A smart home connects household devices such as audio and video equipment, lighting, curtains, air conditioners and security systems through the Internet and the Internet of Things, providing functions such as appliance control, lighting adjustment, remote control by telephone, burglar alarms and environmental monitoring. Compared with a traditional home, a smart home not only provides living functions but also integrates building, network communication, information appliances and equipment automation, offering all-round information interaction while saving energy costs.
However, existing smart home control mainly adjusts individual products independently, which makes it difficult to achieve an ideal environmental atmosphere through the coordination of multiple devices; operation is cumbersome and a user's need for overall adjustment is hard to satisfy. To address this, the industry commonly adopts a device group control mode: the devices to be controlled with one key are combined into a device group card on an app interaction page; the user taps the card, enters the group control page, manually selects the control attributes the devices share, and controls the devices under those attributes, thereby adjusting multiple devices within an environment.
Although device group control can, to some extent, control multiple devices simultaneously, it still has shortcomings: the overall environmental atmosphere cannot be adjusted with one key, the adjustment values a user habitually applies cannot be recognised, and the user experience is poor. That is, existing methods for adjusting the environmental atmosphere of a smart home suffer from poor user experience.
Disclosure of Invention
The main object of the invention is to provide a method, system, device and storage medium for adjusting the environmental atmosphere of a smart home, so as to solve the technical problem that existing adjustment methods offer a poor user experience.
To achieve the above object, the invention provides a smart home environmental atmosphere adjustment method, comprising the following steps: acquiring an environment mode selection instruction, the instruction comprising the environment mode selected by the user; acquiring, according to the instruction, the environmental data parameters preset for the selected environment mode and the environmental data of the user's current surroundings; issuing a current-environment image acquisition instruction according to the preset environmental data parameters and acquiring a current environment image; processing the current environment image to obtain current environment data, and transmitting the current environment data and the user's current environment data to a pre-trained environmental atmosphere adjustment model; comparing, on the basis of the user's current environment data, the current environment data with the preset environmental data parameters to obtain a comparison result; and issuing a corresponding environmental atmosphere adjustment instruction based on the comparison result, so that the current environment is adjusted to the environment mode selected by the user.
Optionally, acquiring the environment mode selection instruction specifically comprises: providing a first interactive interface to obtain the instruction, the interface containing one or more preset environment modes for the user to select.
Optionally, the current-environment image acquisition instruction comprises an instruction to acquire images of the current control states of one or more smart home devices, and the current environment data comprises the current control states of the one or more smart home devices.
Optionally, the preset environmental data parameters comprise one or more preset environments in which a user is located and the preset control parameters of one or more smart home devices in each of those environments. Comparing, on the basis of the user's current environment data, the current environment data with the preset environmental data parameters specifically comprises: determining the applicable preset environment from the user's current environment data, and comparing the current control states of the smart home devices with the preset control parameters of those devices in that preset environment to obtain the comparison result; the comparison result is specifically the difference between the current control state and the preset control parameter of each smart home device. Issuing a corresponding environmental atmosphere adjustment instruction based on the comparison result specifically comprises: for each smart home device whose current control state differs from its preset control parameter, issuing an environmental atmosphere adjustment instruction to that device, so that its current control state is adjusted to the preset control parameter.
Optionally, processing the current environment image specifically comprises: converting the current environment image into bitmap-form data, and converting the bitmap-form data into tensor-form data.
Optionally, constructing the environmental atmosphere adjustment model comprises at least the following steps: acquiring first image data corresponding to the control states of each smart home device in the environments where different preset users are located, and second image data corresponding to the overall environmental atmosphere; based on the preset environment modes, classifying and storing the first and second image data into the folders of the corresponding preset environment modes to form a local data set; creating a custom data set based on the PyTorch library and the local data set; preprocessing the data of the data set and converting it into a format the perception model can use; and constructing a perception model with a convolutional neural network, the model being able to extract features from the image data and realise adjustment of the environmental atmosphere.
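The custom data set built from the mode folders can be sketched as follows. This is an illustrative stand-in only: it follows the `__len__`/`__getitem__` protocol that `torch.utils.data.Dataset` expects but deliberately avoids importing torch, and the class name, folder-to-label scheme and sample format are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the custom data set; labels are derived from the
# preset-environment-mode folder a sample was filed under.

class EnvironmentModeDataset:
    def __init__(self, samples_by_mode):
        # samples_by_mode: {"bright_atmosphere": [img0, img1, ...], ...}
        # Sorting the mode names gives each mode a stable class index.
        self.modes = sorted(samples_by_mode)
        self.items = [(img, self.modes.index(mode))
                      for mode in self.modes
                      for img in samples_by_mode[mode]]

    def __len__(self):
        return len(self.items)

    def __getitem__(self, idx):
        # Returns (image, label) pairs, as a torch DataLoader would consume.
        image, label = self.items[idx]
        return image, label

ds = EnvironmentModeDataset({
    "bright_atmosphere": ["img_a.png", "img_b.png"],
    "low_light_rest": ["img_c.png"],
})
```

In a real PyTorch pipeline the class would subclass `torch.utils.data.Dataset` and `__getitem__` would load and preprocess the image file rather than return its path.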
Optionally, the perception model is trained after it is constructed, and a digit recognition test and an image recognition test are carried out after training to judge whether the model is accurate. If the model is accurate, the constructed perception model is saved and converted into a form that can run on a mobile application or other software platform, and used as the environmental atmosphere adjustment model; if it is not, a new perception model is constructed with the convolutional neural network, then trained and subjected to the digit recognition and image recognition tests again, until an accurate model is obtained.
Corresponding to the smart home environmental atmosphere adjustment method, the invention provides a smart home environmental atmosphere adjustment system comprising a terminal device, an image acquisition device, smart home devices and a cloud, the terminal device being provided with the environmental atmosphere adjustment model and an image processing program. The terminal device is configured to acquire an environment mode selection instruction, the instruction comprising the environment mode selected by the user; to acquire, according to the instruction, the environmental data parameters preset for the selected environment mode and the environmental data of the user's current surroundings; and to issue a current-environment image acquisition instruction to the image acquisition device according to the preset environmental data parameters. The image acquisition device is configured to acquire a current environment image according to that instruction. The terminal device is further configured to process the current environment image with the image processing program to obtain current environment data, and to transmit the current environment data and the user's current environment data to the pre-trained environmental atmosphere adjustment model; the model compares, on the basis of the user's current environment data, the current environment data with the preset environmental data parameters to obtain a comparison result, and issues a corresponding environmental atmosphere adjustment instruction to the terminal device on that basis. The terminal device is further configured to adjust the smart home devices through the cloud based on the adjustment instruction, so that the current environment is adjusted to the environment mode selected by the user.
Corresponding to the smart home environmental atmosphere adjustment method, the invention provides a smart home environmental atmosphere adjustment device, comprising: an instruction acquisition module for acquiring an environment mode selection instruction, the instruction comprising the environment mode selected by the user; a data acquisition module for acquiring, according to the instruction, the environmental data parameters preset for the selected environment mode and the environmental data of the user's current surroundings; an image acquisition module for issuing a current-environment image acquisition instruction according to the preset environmental data parameters and acquiring a current environment image; an image processing module for processing the current environment image to obtain current environment data and transmitting the current environment data and the user's current environment data to a pre-trained environmental atmosphere adjustment model; a comparison module for comparing, on the basis of the user's current environment data, the current environment data with the preset environmental data parameters to obtain a comparison result; and an environmental atmosphere adjustment module for issuing a corresponding environmental atmosphere adjustment instruction based on the comparison result, so that the current environment is adjusted to the environment mode selected by the user.
In addition, in order to achieve the above object, the present invention further provides a computer-readable storage medium on which a smart home environmental atmosphere adjustment program is stored; when executed by a processor, the program implements the steps of the smart home environmental atmosphere adjustment method described above.
The beneficial effects of the invention are as follows:
(1) Compared with the prior art, the invention compares the current environment data with the preset environmental data parameters on the basis of the user's current environment data, and issues a corresponding environmental atmosphere adjustment instruction based on the comparison result, so that the current environment is adjusted to the environment mode selected by the user. The user need not adjust values repeatedly: selecting the desired environment mode adjusts the whole environmental atmosphere with one key, which reduces user operations and effectively improves the user experience.
(2) Compared with the prior art, the environment mode selection instruction is acquired through a first interactive interface containing one or more preset environment modes for the user to choose from, which makes the adjustment options visual and more intuitive, facilitates user operation, and further improves the user experience.
(3) Compared with the prior art, an adjustment instruction is issued to each smart home device based on the difference between that device's current control state and its preset control parameter, so that the current control state is adjusted to the preset control parameter. The smart home devices are adjusted automatically and quickly, without the user repeatedly switching between and adjusting each group of devices, which improves both the user experience and the efficiency of adjusting the environmental atmosphere.
(4) Compared with the prior art, processing the current environment image converts it into tensor-form data, which facilitates the subsequent comparison processing.
(5) Compared with the prior art, constructing the perception model, training it, and subjecting it to a digit recognition test and an image recognition test makes it possible to judge whether the model is accurate, yielding a more accurate environmental atmosphere adjustment model and ensuring the accuracy of the adjustment.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of an embodiment of a method for intelligent home environmental atmosphere control according to the present invention;
FIG. 2 is a schematic diagram of an embodiment of a first interactive interface according to the present invention;
FIG. 3 is a schematic diagram of an embodiment of the present invention prompting that the environmental atmosphere adjustment is complete;
Fig. 4 is a frame diagram of an embodiment of the intelligent home environment atmosphere control device of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1, the smart home environmental atmosphere adjustment method comprises the following steps: acquiring an environment mode selection instruction, the instruction comprising the environment mode selected by the user; acquiring, according to the instruction, the environmental data parameters preset for the selected environment mode and the environmental data of the user's current surroundings; issuing a current-environment image acquisition instruction according to the preset environmental data parameters and acquiring a current environment image; processing the current environment image to obtain current environment data, and transmitting the current environment data and the user's current environment data to a pre-trained environmental atmosphere adjustment model; comparing, on the basis of the user's current environment data, the current environment data with the preset environmental data parameters to obtain a comparison result; and issuing a corresponding environmental atmosphere adjustment instruction based on the comparison result, so that the current environment is adjusted to the environment mode selected by the user.
Preferably, the environmental data of the user's current surroundings at least includes the current time (for example, morning, noon, evening, a certain time period, or a certain point in time), the user's current location (for example, a certain city), the weather at that location (for example, sunny, cloudy or rainy), and the area the user is currently in (for example, a living room, dining room, study, master bedroom or secondary bedroom, the areas being divided according to the functional requirements of the house and the living habits of the occupants). Specifically, the user's current environment data, such as the current time, the user's location, the weather at that location and the current area, can be obtained through the app's location interface.
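Assembling the user's current environment data can be sketched as below. The time-bucket boundaries and the city/weather/area defaults are illustrative assumptions; the patent only states that these values come from the app's location interface.

```python
# Hypothetical sketch: build the "environment where the user is currently
# located" as a dict of time bucket, city, weather and area.
from datetime import datetime

def time_bucket(hour):
    # Assumed boundaries for morning / noon / evening.
    if 5 <= hour < 11:
        return "morning"
    if 11 <= hour < 14:
        return "noon"
    return "evening"

def current_user_context(now=None, city="A", weather="sunny", area="living_room"):
    # In a real app, city/weather/area would come from the location interface
    # and a weather service rather than defaults.
    now = now or datetime.now()
    return {"time": time_bucket(now.hour), "city": city,
            "weather": weather, "area": area}
```

The resulting dict is what later steps use to select the applicable preset environment.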
Comparing the current environment data with the preset environmental data parameters on the basis of the user's current environment data yields a comparison result, and a corresponding environmental atmosphere adjustment instruction is issued on that basis, adjusting the current environment to the environment mode selected by the user. The user need not adjust values repeatedly: selecting the desired environment mode adjusts the whole environmental atmosphere with one key, which reduces user operations and effectively improves the user experience.
In this embodiment, the environment mode selection instruction is acquired as follows: a first interactive interface is provided, containing one or more preset environment modes for the user to select. Preferably, each preset environment mode is displayed in the first interactive interface with its mode name (such as bright atmosphere, comfortable environment or low-light rest) and an image of the overall environmental atmosphere as the main content; the user taps a preset environment mode in the interface to generate the environment mode selection instruction, as shown in fig. 2. Alternatively, the user may speak the name of a preset environment mode to generate the instruction by voice.
By providing the first interactive interface to obtain the environment mode selection instruction, with one or more preset environment modes included for the user to select, the invention makes the environmental atmosphere adjustment options visual and more intuitive, facilitates user operation, and further improves the user experience.
In this embodiment, the method further comprises: after the current environment has been adjusted to the environment mode selected by the user, prompting that the environmental atmosphere adjustment is complete. Preferably, this is done by updating the display in the first interactive interface (for example, showing the successful control result in the form of a toast), as shown in fig. 3. Alternatively, completion can be announced by voice broadcast.
By prompting that the adjustment is complete after the current environment has been adjusted to the selected environment mode, the user can learn the adjustment result without querying separately, which saves the user's time and further improves the user experience.
In this embodiment, the current-environment image acquisition instruction comprises an instruction to acquire images of the current control states of one or more smart home devices, and the current environment data comprises the current control states of the one or more smart home devices.
In this embodiment, the current-environment image acquisition instruction is issued to an image acquisition module, which is invoked to acquire the current environment image. For example, the image acquisition module is specifically a camera device, which, according to the instruction, captures images of the current control state of the curtain (such as half open, fully open or another state) and of the current control state of the lamp (such as on, off or another state).
In this embodiment, the preset environmental data parameters comprise one or more preset environments in which a user is located and the preset control parameters of one or more smart home devices in each of those environments. Accordingly, acquiring the environmental data parameters preset for the selected environment mode according to the environment mode selection instruction determines which smart home device control states must be captured; the current-environment image acquisition instruction is then issued according to the preset parameters and the current environment image is acquired. Each preset environment mode is given its corresponding preset environmental data parameters during the development phase (by default obtained in advance through extensive user research and user interviews).
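One possible shape for these preset environmental data parameters is a nested mapping from mode to preset environment to per-device control parameters; the layout, keys and values below are assumptions, since the patent does not specify a storage format.

```python
# Assumed layout: mode -> (preset time, preset area) -> device control params.
# Real keys would also cover location and weather; two fields keep this short.

PRESETS = {
    "low_light_rest": {
        ("night", "bedroom"): {"curtain": "fully_closed", "lamp": "dim"},
        ("night", "living_room"): {"curtain": "fully_closed", "lamp": "off"},
    },
}

def devices_to_capture(mode, time, area):
    # The preset parameters determine which device states the image
    # acquisition instruction must cover.
    return sorted(PRESETS[mode][(time, area)])
```

Looking up the selected mode against the user's current context thus yields both the capture list and the target values used later in the comparison.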
Preferably, the environment in which a preset user is located at least includes a preset time (for example, morning, noon, evening, a certain time period, or a certain point in time), a preset user location (for example, a certain city), the weather at that location (for example, sunny, cloudy or rainy), and the area the preset user is in (for example, a living room, dining room, study, master bedroom or secondary bedroom, divided according to the functional requirements of the house and the living habits of the occupants).
In this embodiment, comparing the current environment data with the preset environmental data parameters on the basis of the user's current environment data specifically comprises: determining the applicable preset environment from the user's current environment data, and comparing the current control states of the smart home devices with the preset control parameters of those devices in that preset environment to obtain the comparison result; the comparison result is specifically the difference between the current control state and the preset control parameter of each smart home device.
Further, issuing a corresponding environmental atmosphere adjustment instruction based on the comparison result specifically comprises: for each smart home device whose current control state differs from its preset control parameter, issuing an environmental atmosphere adjustment instruction to that device, so that its current control state is adjusted to the preset control parameter.
For example, suppose the environment mode selected by the user is "low-light rest". In this mode, the preset time of one preset environment is night (for ease of understanding only the preset time is used here, so the preset user location, its weather and the preset area are omitted), the preset control parameter of the curtain in that environment is fully closed, and the preset control parameter of the lamp is dim. If the current time in the user's environment data is night (again, only the current time is used for ease of understanding), the current control state of the curtain is fully open and the current control state of the lamp is off, then the corresponding adjustment instructions are issued to the smart home devices: a "fully close" instruction to the curtain, so that it adjusts its current control state (fully open) to the preset control parameter (fully closed), and a "turn on and dim" instruction to the lamp, so that it adjusts its current control state (off) to the preset control parameter (dim).
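The curtain-and-lamp example above can be expressed directly as the claimed comparison. Device names and state strings are taken from the example; the instruction format is an illustrative assumption.

```python
# "Low light rest" example: diff current control states against presets and
# emit one adjustment instruction per device that differs.

preset = {"curtain": "fully_closed", "lamp": "dim"}    # preset control parameters
current = {"curtain": "fully_open", "lamp": "off"}     # perceived control states

def adjustment_instructions(current, preset):
    # Devices already in the preset state produce no instruction.
    return [{"device": dev, "set_to": want}
            for dev, want in preset.items()
            if current.get(dev) != want]

instructions = adjustment_instructions(current, preset)
```

Here both devices differ from their presets, so both receive an instruction; a device already in its preset state would be skipped.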
By issuing an adjustment instruction to each smart home device based on the difference between that device's current control state and its preset control parameter, the devices are adjusted automatically and quickly; the user need not repeatedly switch between and adjust the various device groups, which improves the user experience and the efficiency of adjusting the smart home's environmental atmosphere.
In this embodiment, processing the current environment image specifically comprises: converting the current environment image into bitmap-form data, and converting the bitmap-form data into tensor-form data. Preferably, the image information is converted into Bitmap format by the app's image processing program (PreviewView), and the Bitmap-format data is then converted into a Tensor.
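The Bitmap-to-tensor conversion can be sketched as below. Here a bitmap is modelled as a height x width x RGB list of 0-255 integers, converted to the channel-first, 0-1-normalised layout that PyTorch tensors conventionally use (the same convention as torchvision's `ToTensor`); the plain-list representation is an assumption so the sketch stays dependency-free.

```python
# Hypothetical sketch: HWC uint8 bitmap -> CHW float "tensor" in [0, 1].

def bitmap_to_tensor(bitmap):
    height, width = len(bitmap), len(bitmap[0])
    channels = len(bitmap[0][0])
    # Reorder from height x width x channel to channel x height x width
    # and scale each 0-255 value into [0, 1].
    return [[[bitmap[y][x][c] / 255.0 for x in range(width)]
             for y in range(height)]
            for c in range(channels)]

pixel = bitmap_to_tensor([[[255, 0, 128]]])   # a 1x1 RGB bitmap
```

In the app itself this step would produce a real `torch.Tensor` (or its mobile equivalent) rather than nested lists.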
In this embodiment, the construction of the environmental atmosphere adjustment model at least includes the following steps:
S10, acquiring first image data corresponding to the control states of each smart home device in the environments where different preset users are located, and second image data corresponding to the overall environmental atmosphere. Preferably, the environments are formed by permutation and combination of different preset times, different preset user locations, the weather at those locations, and the areas the preset users are in; furthermore, according to the needs of the practical application, different temperature and humidity conditions and the like may also be taken into account when forming the environments.
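The permutation and combination of preset environments in step S10 can be enumerated directly; the concrete value lists below are illustrative assumptions.

```python
# Enumerate every preset environment as the Cartesian product of the
# factors named in step S10.
from itertools import product

times = ["morning_8:00", "morning_10:00", "evening"]
cities = ["A", "B"]
weathers = ["sunny", "rainy"]
areas = ["living_room", "study"]

environments = [
    {"time": t, "city": c, "weather": w, "area": a}
    for t, c, w, a in product(times, cities, weathers, areas)
]
# Each environment then gets its own archive of first and second image data.
```

This makes the data-collection scope explicit: every combination needs its first image data (per-device control states) and second image data (overall atmosphere).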
Taking the living room area as an example, in step S10, first image data corresponding to the control state of each intelligent home equipment and second image data corresponding to the overall environment atmosphere are acquired when the user is located in City A at 8:00 on a sunny morning; the same first and second image data are then acquired when the user is located in City A at 10:00 on a rainy morning; and further when the user is located in City B at 10:00 on a rainy morning; and so on. The image acquisition process for other areas and other preset environments is similar to the above and is not described here in detail; images can be acquired according to the actual needs of the preset environment modes.
It should be noted that: because the final environment atmosphere adjustment model must achieve the same preset environment mode, through corresponding adjustments, for different cities, times, weather and areas, a large amount of relevant data needs to be acquired, permuted and combined, and uniformly archived before the deep learning model is trained, which facilitates the model's subsequent data comparison processing.
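The permutation and combination of acquisition conditions described above can be sketched with the standard library as follows. The concrete condition values are hypothetical; only the enumeration mechanism is illustrated:

```python
import itertools

# Hypothetical factor values; the patent only requires that preset
# environments be formed by permutation and combination of such factors.
times = ["morning 8:00", "night"]
cities = ["City A", "City B"]
weather = ["sunny", "rainy"]
areas = ["living room", "bedroom"]

# Every combination of the four factors yields one preset environment
# for which first and second image data must be acquired.
environments = list(itertools.product(times, cities, weather, areas))
print(len(environments))  # 2 * 2 * 2 * 2 = 16 preset environments
```

Adding further factors (temperature, humidity, etc.) multiplies the number of environments, which is why a large amount of image data must be archived before training.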
S20, based on a preset environment mode, storing the first image data and the second image data into corresponding preset environment mode folders in a classified mode to form a local data set;
S30, creating a custom data set based on the PyTorch library and the local data set, which at least comprises the following steps:
Based on the PyTorch library and the local data set, first inherit the Dataset class in torch.utils.data, then initialize the parent class in the initialization function, traverse the paths of the local data set (pictures + folder names), and finally place the pictures (as paths) and labels (folder names) into tuple() structures.
Since tuples are immutable, once a tuple is created its contents cannot be altered, which helps ensure that the sample information is not accidentally modified during subsequent processing. Preferably, the pictures are the first image data and the second image data, and the folder names are the preset environment mode folder names.
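The custom data set described in the two steps above may be sketched as follows. To keep the sketch dependency-free, a plain class is used in place of `torch.utils.data.Dataset`; in the real implementation the class would inherit `Dataset` and call `super().__init__()` as stated above. The folder and file names are hypothetical:

```python
import os
import tempfile

class EnvironmentDataset:
    """Sketch of the custom data set: each subfolder of `root` is one preset
    environment mode, and each sample is an immutable (picture path, label)
    tuple whose label is the folder name."""
    def __init__(self, root):
        self.samples = []
        for mode in sorted(os.listdir(root)):            # folder name = label
            mode_dir = os.path.join(root, mode)
            for fname in sorted(os.listdir(mode_dir)):   # picture paths
                self.samples.append((os.path.join(mode_dir, fname), mode))

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]

# Build a tiny throwaway local data set: one folder per preset environment mode.
root = tempfile.mkdtemp()
for mode in ("low_light_rest", "bright_reading"):
    os.makedirs(os.path.join(root, mode))
    open(os.path.join(root, mode, "img0.png"), "w").close()

ds = EnvironmentDataset(root)
print(len(ds), ds[0][1])
```

Because each sample is a tuple, its path and label cannot be altered after creation, matching the immutability property noted above.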
S40, preprocessing the data of the data set and converting it into a format usable by the perception model, which at least comprises the following steps:
Normalize the data of the data set and flatten each image with img = img.reshape(-1); then one-hot encode the labels of the data set by creating an all-zero array and setting the corresponding position to 1; finally, return the actually required object type (np.float32) for the subsequent establishment of the perception model.
In "img = img.reshape(-1)" above, reshape is a NumPy array method for changing the shape of an array. Here -1 is a special value that makes reshape automatically calculate the size of the dimension so that the total number of elements remains unchanged. Images are typically three-dimensional (height, width, channel), but in some cases (e.g., when an image is to be flattened for input to a fully connected layer) these dimensions must be collapsed; reshape(-1) flattens the image into a one-dimensional array while keeping the total number of pixels unchanged.
The above "np.float32" is a data type used in NumPy to represent a 32-bit floating point number; it provides sufficient precision while not taking up as much memory as a 64-bit floating point number.
Through the preprocessing steps described above, the data set has been converted into a format usable by the perception model. By passing the input data (normalized and flattened images) and the corresponding one-hot encoded labels to the network, the network can be trained to recognize the environment modes in the image data and learn to map inputs to the correct categories.
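The preprocessing of step S40 can be sketched in plain Python as follows (the original uses NumPy's reshape(-1) and np.float32; plain lists and floats are used here so the sketch stands alone):

```python
def preprocess(img, label_index, num_classes):
    """Flatten the image (the role of img.reshape(-1)), normalize pixel values
    to [0, 1], and one-hot encode the label: an all-zero array with a 1 at the
    corresponding position, as in step S40."""
    flat = [p / 255.0 for row in img for p in row]   # flatten + normalize
    one_hot = [0.0] * num_classes
    one_hot[label_index] = 1.0                       # set corresponding position
    return flat, one_hot

# A 2x2 grayscale image labeled as class 2 of 4 preset environment modes.
flat, one_hot = preprocess([[0, 255], [128, 64]], 2, 4)
print(flat)
print(one_hot)
```

In the real pipeline the flattened image and the one-hot label would both be cast to np.float32 before being fed to the network.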
S50, constructing a perception model through a convolutional neural network, the model being able to extract features from image data and realize adjustment of the environment atmosphere, which at least comprises the following steps:
S51, constructing the perception model through a convolutional neural network. For the first convolution layer: define the number of input channels, with each output channel corresponding to one bias, and apply the ReLU activation function after the convolution operation. For the first pooling layer, 2x2 pooling is adopted to reduce the spatial size of the data while retaining important feature information; through pooling, the image data are uniformly compressed to a scale of 14x14.
S52, after the first convolution layer and pooling layer, the network stacks a second convolution layer and pooling layer to further extract and abstract image features. Unlike the first convolution layer, the second convolution layer uses 64 different convolution kernels, which generate 64 convolution feature maps. Each convolution kernel performs a convolution operation with the input data (usually the output of the previous layer) to generate a feature map; by increasing the number of feature maps, the network can learn more diversified features. After the second convolution layer and pooling layer, the image size is further reduced to 7x7.
S53, initializing a fully connected layer for processing the image, adding the corresponding bias and flattening the data in the fully connected layer, and finally applying the ReLU activation function again to increase the nonlinearity of the network, helping it learn and adapt to complex patterns.
Further, to prevent overfitting, dropout is defined between the fully connected layer and the output layer so that a certain number of neurons do not output; a softmax classifier is used, with cross-entropy as the loss function.
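The size reduction performed by the two pooling layers in S51 and S52 can be traced with a small stand-alone sketch. The 14x14 and 7x7 figures above imply a 28x28 input, which is assumed here; the convolution layers themselves are omitted, and only the 2x2 pooling that drives the size reduction is shown:

```python
def max_pool_2x2(fmap):
    """2x2 max pooling with stride 2: halves the spatial size while keeping
    the strongest activation in each window, as in steps S51 and S52."""
    h, w = len(fmap), len(fmap[0])
    return [[max(fmap[y][x], fmap[y][x + 1], fmap[y + 1][x], fmap[y + 1][x + 1])
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A 28x28 feature map (the input size implied by the 14x14 and 7x7 figures).
fmap = [[(y * 28 + x) % 7 for x in range(28)] for y in range(28)]
p1 = max_pool_2x2(fmap)   # after the first pooling layer: 14x14
p2 = max_pool_2x2(p1)     # after the second pooling layer: 7x7
print(len(p1), len(p2))   # 14 7
```

The 7x7 output is then flattened and fed to the fully connected layer of S53, matching the 7x7 output feature mentioned in the test description below.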
In this embodiment, after the perception model is built it is trained, and after training, a digit recognition test and an image recognition test are performed to judge whether the model is accurate. If the model is accurate, the constructed perception model is saved and converted into a form that can run on a mobile application or other software platform, to be used as the environment atmosphere adjustment model; if the model is inaccurate, a perception model is constructed again through the convolutional neural network to obtain a new model, which is trained and tested again until the model is accurate.
Preferably, in this embodiment, the cuda() method is used to train the model on the GPU: the model, the data (inputs and labels) and the loss function are all transferred to the GPU for calculation, and the whole training data set is used to train the model 30 times (i.e., 30 epochs), after which the accuracy of the whole model stabilizes at 99.2%.
Preferably, in this embodiment, three test images (curtain half open, no light, midday daylight) are used to describe the test: a test image is first input, and a selective search algorithm in the model code extracts about 2000 candidate object regions (region proposals) to be identified; the Tensor data are input into the CNN class function, and the data size is reduced to 7x7 as the output feature; the feature data extracted by the CNN are then activated with ReLU and input into an SVM (support vector machine) classifier for initial classification, comparison and judgment; after judgment, the final result is output through the print() function. If the model outputs "low light rest" (given midday, curtain half open and no light), this matches the expected result, indicating that the whole model has a certain accuracy.
According to the invention, by constructing and training the perception model and performing the digit recognition test and image recognition test to judge whether the model is accurate, a more accurate environment atmosphere adjustment model can be obtained, ensuring the accuracy of environment atmosphere adjustment.
The invention also correspondingly provides an intelligent home environment atmosphere adjusting system, which comprises terminal equipment, image acquisition equipment, intelligent home equipment and a cloud, wherein an environment atmosphere adjustment model and an image processing program are arranged in the terminal equipment. The terminal equipment is used for acquiring an environment mode selection instruction, the instruction comprising the environment mode selected by the user; acquiring, according to the environment mode selection instruction, the environment data parameters preset for the environment mode selected by the user and the environment data where the user is currently located; and issuing a current environment image acquisition instruction to the image acquisition equipment according to the preset environment data parameters. The image acquisition equipment is used for acquiring a current environment image according to the current environment image acquisition instruction. The terminal equipment is further used for processing the current environment image based on the image processing program to obtain current environment data, and transmitting the current environment data and the environment data where the user is currently located to the pre-trained environment atmosphere adjustment model. The terminal equipment is further used for comparing, through the environment atmosphere adjustment model and based on the environment data where the user is currently located, the current environment data with the preset environment data parameters to obtain a comparison result, and issuing a corresponding environment atmosphere adjustment instruction to the terminal equipment based on the comparison result. The terminal equipment is further used for adjusting the intelligent home equipment through the cloud based on the environment atmosphere adjustment instruction, so as to adjust the current environment to the environment mode selected by the user.
As shown in fig. 4, the present invention further correspondingly provides an intelligent home environment atmosphere adjusting device, which includes: an instruction acquisition module 10, configured to acquire an environment mode selection instruction, the instruction comprising the environment mode selected by the user; a data acquisition module 20, configured to acquire, according to the environment mode selection instruction, the environment data parameters preset for the selected environment mode and the environment data where the user is currently located; an image acquisition module 30, configured to issue a current environment image acquisition instruction according to the preset environment data parameters and acquire a current environment image; an image processing module 40, configured to process the current environment image to obtain current environment data, and transmit the current environment data and the environment data where the user is currently located to the pre-trained environment atmosphere adjustment model; a comparison module 50, configured to compare, based on the environment data where the user is currently located, the current environment data with the preset environment data parameters to obtain a comparison result; and an environment atmosphere adjusting module 60, configured to issue a corresponding environment atmosphere adjusting instruction based on the comparison result, so as to adjust the current environment to the environment mode selected by the user.
The embodiment of the present invention also provides a computer readable storage medium, which may be the computer readable storage medium contained in the memory of the above embodiment, or may be a stand-alone computer readable storage medium that is not assembled into the device. The computer readable storage medium stores at least one instruction, which is loaded and executed by a processor to implement the intelligent home environment atmosphere adjustment method shown in fig. 1. The computer readable storage medium may be a read-only memory, a magnetic disk, an optical disk, or the like.
It should be noted that, in the present specification, each embodiment is described in a progressive manner, each embodiment focusing on its differences from the other embodiments; for identical or similar parts between the embodiments, reference may be made to each other. The system, apparatus, and storage medium embodiments are described relatively simply because they are substantially similar to the method embodiments, and reference should be made to the description of the method embodiments for the relevant points.
Also, herein, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
While the foregoing description illustrates and describes preferred embodiments of the present invention, it is to be understood that the invention is not limited to the forms disclosed herein; it is capable of use in various other combinations, modifications, and environments, and of changes or modifications within the scope of the inventive concept described herein, whether guided by the above teachings or by the skill or knowledge of the relevant art. Modifications and variations that do not depart from the spirit and scope of the invention are intended to fall within the scope of the appended claims.

Claims (10)

1. The intelligent home environment atmosphere adjusting method is characterized by comprising the following steps of:
acquiring an environment mode selection instruction, wherein the instruction comprises an environment mode selected by a user;
Acquiring environment data parameters preset by the environment mode selected by the user and environment data where the user is currently located according to the environment mode selection instruction;
issuing a current environment image acquisition instruction according to preset environment data parameters, and acquiring a current environment image;
processing the current environment image to obtain current environment data and transmitting the current environment data and the current environment data of the user to an environment atmosphere adjusting model obtained through pre-training;
Comparing the current environmental data with preset environmental data parameters based on the current environmental data of the user to obtain a comparison result;
based on the comparison result, a corresponding environment atmosphere adjusting instruction is issued to realize the adjustment of the current environment to the environment mode selected by the user.
2. The smart home environment atmosphere regulating method according to claim 1, wherein: the method comprises the steps of obtaining an environment mode selection instruction, specifically: providing a first interactive interface to obtain the environment mode selection instruction, wherein the interface comprises one or more preset environment modes for selection by a user.
3. The smart home environment atmosphere regulating method according to claim 1, wherein: the current environment image acquisition instruction comprises an image acquisition instruction for one or more intelligent home equipment current control states;
The current environment data comprises one or more current control states of the intelligent household equipment.
4. A smart home environment atmosphere regulating method according to claim 3, characterized in that: the preset environment data parameters comprise one or more environments where preset users are located and one or more preset control parameters of intelligent household equipment in the environments where the preset users are located;
Based on the current environmental data of the user, comparing the current environmental data with preset environmental data parameters to obtain a comparison result, wherein the comparison result specifically comprises: determining the environment of the preset user according to the current environment data of the user, and comparing the current control state of the intelligent home equipment with preset control parameters of the intelligent home equipment in the environment of the preset user to obtain a comparison result; the comparison result is specifically the difference between the current control state of the same intelligent home equipment and preset control parameters;
Based on the comparison result, a corresponding environmental atmosphere adjusting instruction is issued, specifically: based on the difference between the current control state and the preset control parameters of the same intelligent household equipment, a corresponding environment atmosphere adjusting instruction is issued to the intelligent household equipment, so that the current control state is adjusted to the preset control parameters.
5. The smart home environment atmosphere regulating method according to claim 1, wherein: the current environment image is processed, specifically: converting the current environment image into bitmap form data, and converting the bitmap form data into tensor form data.
6. The smart home environment atmosphere regulating method according to claim 1, wherein: the construction of the environmental atmosphere adjustment model at least comprises the following steps:
acquiring first image data corresponding to control states of all intelligent home equipment under environments where different preset users are located and second image data corresponding to overall environmental atmosphere;
Based on a preset environment mode, storing the first image data and the second image data into corresponding preset environment mode folders in a classified mode to form a local data set;
creating a custom data set based on PyTorch libraries and the local data set;
preprocessing the data of the data set, and converting the data into a format which can be used by a perception model;
and constructing a perception model through a convolutional neural network, wherein the model can extract characteristics from image data and realize adjustment of environmental atmosphere.
7. The intelligent home environment atmosphere adjustment method according to claim 6, wherein: training the perception model after the perception model is built, and carrying out digital recognition test and image recognition test on the perception model after the perception model is built so as to judge whether the model is accurate;
If the model is accurate, the constructed perception model is stored and converted into a form capable of running on a mobile application or other software platforms to be used as an environmental atmosphere adjustment model; if the model is inaccurate, constructing a perception model again through a convolutional neural network to obtain a new model, and training, digital identification testing and image identification testing the new model again until the model is accurate.
8. The intelligent home environment atmosphere adjusting system is characterized by comprising terminal equipment, image acquisition equipment, intelligent home equipment and a cloud end, wherein an environment atmosphere adjusting model and an image processing program are arranged in the terminal equipment;
The terminal equipment is used for acquiring an environment mode selection instruction, wherein the instruction comprises an environment mode selected by a user; acquiring environment data parameters preset by the environment mode selected by the user and environment data where the user is currently located according to the environment mode selection instruction; issuing a current environment image acquisition instruction to the image acquisition equipment according to preset environment data parameters;
the image acquisition equipment is used for acquiring a current environment image according to the current environment image acquisition instruction;
The terminal device is further used for processing the current environment image based on the image processing program to obtain current environment data and transmitting the current environment data and the current environment data of the user to the environment atmosphere adjustment model obtained through pre-training;
The terminal equipment is further used for comparing the current environment data with preset environment data parameters based on the current environment data of the user through the environment atmosphere adjustment model to obtain a comparison result; based on the comparison result, a corresponding environmental atmosphere adjusting instruction is issued to the terminal equipment;
the terminal equipment is further used for adjusting the intelligent home equipment through the cloud based on the environment atmosphere adjusting instruction so as to adjust the current environment to the environment mode selected by the user.
9. An intelligent home environment atmosphere adjusting device, which is characterized by comprising:
The instruction acquisition module is used for acquiring an environment mode selection instruction, wherein the instruction comprises an environment mode selected by a user;
the data acquisition module is used for acquiring environment data parameters preset by the environment mode selected by the user and environment data where the user is currently located according to the environment mode selection instruction;
the image acquisition module is used for issuing a current environment image acquisition instruction according to preset environment data parameters and acquiring a current environment image;
The image processing module is used for processing the current environment image to obtain current environment data and transmitting the current environment data and the current environment data of the user to a pre-trained environment atmosphere adjustment model;
the comparison module is used for comparing the current environment data with preset environment data parameters based on the current environment data of the user to obtain a comparison result;
And the environmental atmosphere adjusting module is used for issuing a corresponding environmental atmosphere adjusting instruction based on the comparison result so as to realize the adjustment of the current environment to the environment mode selected by the user.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon an intelligent home environment atmosphere adjustment program, which when executed by a processor, implements the steps of the intelligent home environment atmosphere adjustment method according to any one of claims 1 to 7.
CN202410272108.XA 2024-03-11 2024-03-11 Intelligent home environment atmosphere adjusting method, system, device and storage medium Pending CN118170036A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410272108.XA CN118170036A (en) 2024-03-11 2024-03-11 Intelligent home environment atmosphere adjusting method, system, device and storage medium


Publications (1)

Publication Number Publication Date
CN118170036A true CN118170036A (en) 2024-06-11

Family

ID=91359697



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination