CN111222392A - Intelligent courtyard control system - Google Patents

Intelligent courtyard control system

Info

Publication number: CN111222392A
Application number: CN201910833510.XA
Authority: CN (China)
Prior art keywords: lamp, model, lamps, courtyard, image
Legal status: Pending
Language: Chinese (zh)
Inventors: 陈炯, 孙亮程
Current Assignee: Elega Garden Solution Ningbo Co ltd
Original Assignee: Elega Garden Solution Ningbo Co ltd
Priority date / Filing date: 2019-09-04 (priority to CN201910833510.XA)
Publication date: 2020-06-02 (publication of CN111222392A)
Application filed by Elega Garden Solution Ningbo Co ltd

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 20/00 Scenes; Scene-specific elements
            • G06F ELECTRIC DIGITAL DATA PROCESSING
                • G06F 18/00 Pattern recognition
                    • G06F 18/20 Analysing
                        • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
                            • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
                        • G06F 18/24 Classification techniques
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N 23/80 Camera processing pipelines; Components thereof

Abstract

The invention discloses an intelligent courtyard control system. It solves the prior-art problem that a customer selecting a lamp cannot know the actual lighting effect at the installation site, so the chosen lamp may fail to meet the user's requirements, wasting resources and increasing cost. The system comprises a camera unit, an identification unit and an interactive simulation unit which are connected. A lamp image is collected and the lamp type is identified; the corresponding lamp model is called up according to the lamp type and placed in the courtyard scene model according to the user's instruction; the irradiation region is obtained by combining the placement state of the lamp model; the filter scheme set for that region is applied to process the light effect; and the light effect of the lamp is displayed in the courtyard scene model, so that the user can experience the actual effect of the lamp in the scene when buying it and obtain a better use experience.

Description

Intelligent courtyard control system
Technical Field
The invention relates to the technical field of lamp simulation, in particular to an intelligent courtyard control system.
Background
At present, when purchasing a lamp a user can only learn the lamp's specifications and cannot know the actual lighting effect after the lamp is installed at the installation site, so the user often has to rely on imagination when choosing. The selected lamp may therefore fail to meet the user's requirements and end up discarded or idle, which wastes resources and increases cost.
Disclosure of Invention
The invention mainly solves the prior-art problem that a customer selecting a lamp cannot know the actual lighting effect at the installation site, so that the lamp may not meet the customer's requirements, resources are wasted and cost is increased. To this end, the invention provides an intelligent courtyard control system.
The technical problem of the invention is mainly solved by the following technical scheme. An intelligent courtyard control system is characterized in that it comprises:
the camera unit, which is used for collecting a lamp image;
the identification unit, which identifies the lamp type according to the image;
the interactive simulation unit, which is configured with a courtyard scene model, lamp models and filter processing schemes for the light effects of the lamps, places the lamp models in the courtyard scene model according to instructions, and processes the light effect at the position of each lamp model according to the corresponding filter processing scheme;
wherein the camera unit is connected with the identification unit, and the identification unit is connected with the interactive simulation unit.
The system identifies the lamp from the image captured by the camera unit; the interactive simulation unit places the corresponding lamp model according to the user's instruction and renders the lighting effect in combination with different scenes, so that the user can experience the actual effect of the lamp in the scene while buying it and obtain a better use experience. The courtyard scene model can be a three-dimensional picture of a courtyard. Each lamp type has a lamp model with a corresponding light-effect filter scheme; the filter scheme is a preset image-processing scheme used to process the picture within the set range of the lamp model so as to show the light effect after the lamp is placed. The camera unit and the interactive simulation unit can be arranged separately or integrated into one device: when separate, a camera unit and an operation terminal are provided respectively; when integrated, they form an operation terminal with a camera, such as a mobile phone.
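As a purely illustrative sketch that is not part of the original disclosure, a filter processing scheme of the kind described above could be an image-processing routine that brightens and warm-tints the pixels lying inside the set range of a placed lamp model. The function name, the boolean-mask representation of the set range and the gain values below are assumptions made only for illustration.

```python
# Illustrative sketch only (assumed names and values): one possible filter
# processing scheme that brightens and warm-tints the scene picture inside
# the set range of a placed lamp model.
import numpy as np
from PIL import Image

def apply_light_filter(scene: Image.Image, mask: np.ndarray,
                       brightness: float = 1.6,
                       warm_gain: tuple = (1.10, 1.05, 0.90)) -> Image.Image:
    """Return a copy of the scene with the masked region brightened and warmed."""
    img = np.asarray(scene).astype(np.float32)      # H x W x 3 courtyard picture
    lit = img.copy()
    lit[..., 0] *= brightness * warm_gain[0]        # red channel
    lit[..., 1] *= brightness * warm_gain[1]        # green channel
    lit[..., 2] *= brightness * warm_gain[2]        # blue channel
    img[mask] = np.clip(lit[mask], 0, 255)          # apply only inside the set range
    return Image.fromarray(img.astype(np.uint8))
```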
As a preferred scheme, the identification unit is arranged in the cloud and comprises a classifier for identifying lamps; the image data collected by the camera unit is transmitted to the identification unit through a network, and the identification unit sends the processing result to the interactive simulation unit through the network. By arranging the identification unit in the cloud and having the user transmit the image shot by the camera unit to it for recognition, the invention reduces the requirements on the user-side equipment and makes the system more convenient to use.
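A minimal sketch of how the camera-side device might hand a captured image to the cloud-based identification unit over the network is given below; the endpoint URL and the JSON fields are hypothetical, not part of the disclosure.

```python
# Illustrative sketch only: send a captured lamp photo to a hypothetical
# cloud recognition endpoint and return the lamp type it reports.
from typing import Optional
import requests

RECOGNITION_URL = "https://example.com/api/recognize-lamp"   # placeholder endpoint

def recognize_lamp(image_path: str, timeout: float = 10.0) -> Optional[str]:
    """POST the image to the identification unit; return the lamp type, or None."""
    with open(image_path, "rb") as f:
        resp = requests.post(RECOGNITION_URL, files={"image": f}, timeout=timeout)
    resp.raise_for_status()
    result = resp.json()                 # e.g. {"lamp_type": "LAMP-042", "score": 0.97}
    return result.get("lamp_type")       # None signals that no lamp was recognised
```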
Preferably, the interactive simulation unit comprises a touch display screen. In this scheme the touch display screen can display the courtyard scene model and the lamp models, and the user can place and manipulate a lamp model through the touch display screen.
As a preferred solution, control of the intelligent courtyard control system comprises the following steps:
s1, collecting a lamp image;
s2, identifying the model of the lamp according to the image;
s3, calling a corresponding lamp model from the courtyard scene model, and placing the lamp model at a designated position of the courtyard scene model according to an instruction;
s4, establishing a light effect area according to the information of the placement position of the lamp model;
and S5, carrying out light effect processing on the light effect area by adopting a filter processing scheme.
In this scheme the lamp type is identified from the lamp image, the corresponding lamp model is called up according to the lamp type, and the lamp model is placed in the courtyard scene model according to the user's instruction. The irradiation region is obtained by combining the placement state of the lamp model, the filter scheme set for that region is applied for light-effect processing, and the light effect of the lamp is displayed in the courtyard scene model, so that the user can experience the actual effect of the lamp in the scene while buying it and obtain a better use experience.
As a preferred scheme, the specific process of identifying the model of the lamp according to the image includes:
s21, constructing a classifier for identifying the model of the lamp;
s22, placing the collected lamp image into a classifier for detection, and obtaining lamp model information;
S23, the interactive simulation unit judges whether a lamp type was obtained; if not, it prompts an error indicating that no corresponding lamp exists; if so, it extracts the lamp model corresponding to the lamp type and executes step S3. In this scheme the lamp image is recognised by the constructed classifier, which is simple and fast to operate and achieves high recognition accuracy.
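By way of illustration only, steps S22 and S23 could be realised roughly as follows; the classifier interface, the class-name list and the lamp-model catalogue are assumptions introduced for this sketch.

```python
# Illustrative sketch of S22-S23 (assumed interfaces): run the trained
# classifier on the captured image and look up the matching lamp model,
# or return None so the caller can prompt "no corresponding lamp".
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def identify_and_fetch(image_path, classifier, class_names, lamp_catalogue,
                       threshold=0.5):
    """Return the lamp model for the recognised lamp type, or None."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(classifier(x), dim=1)[0]
    score, idx = probs.max(dim=0)
    if score.item() < threshold:
        return None                                    # S23: no corresponding lamp
    return lamp_catalogue.get(class_names[idx.item()])  # lamp model used in S3
```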
As a preferable scheme, the specific process of constructing the classifier in step S21 includes:
s211, collecting sample images, wherein the sample images are collected under the conditions of different illumination and different angles, and comprise images of lamps of various models;
S212, performing mirror transformation, scaling, rotation and cropping on some of the sample images to expand the number of sample images;
s213, marking the model of each lamp on the sample image;
and S214, inputting the marked sample image into a VGG network for training to obtain a lamp model classifier.
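A minimal training sketch for steps S211 to S214 is given below, assuming the labelled sample images are stored one folder per lamp type; the folder layout, the use of torchvision transforms for the mirror/scale/rotate/crop expansion and the hyper-parameters are all assumptions, not the patented procedure.

```python
# Illustrative sketch of S211-S214 (assumed dataset layout "samples/<lamp-type>/*.jpg"):
# expand the samples with mirror, scaling, rotation and cropping, then fine-tune
# a VGG network so that each output class is one lamp type.
import torch
from torch import nn, optim
from torchvision import datasets, models, transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(),                      # mirror transformation
    transforms.RandomAffine(degrees=15, scale=(0.8, 1.2)),  # rotation and scaling
    transforms.RandomResizedCrop(224),                      # cropping
    transforms.ToTensor(),
])

dataset = datasets.ImageFolder("samples", transform=augment)  # folder name = lamp-type label (S213)
loader = torch.utils.data.DataLoader(dataset, batch_size=16, shuffle=True)

model = models.vgg16(weights="IMAGENET1K_V1")
model.classifier[6] = nn.Linear(4096, len(dataset.classes))   # one output per lamp type

criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

model.train()
for epoch in range(10):                                       # S214: train the classifier
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```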
As a preferred scheme, the lamp model has position information, light-emission angle information, and a corresponding irradiation area set according to the lamp type. The position information is the coordinates at which the model is placed in the courtyard scene model. The light-emission angle information is the included angle between the emission direction of the lamp model and the horizontal. The irradiation area is a sector-shaped region set in the emission direction of the lamp; its shape and size are set according to the lamp type, so each lamp type has its own irradiation area: lamps of different types may have sectors of different angular width or different radius, and hence irradiation areas of different shape and size.
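To make the information attached to a placed lamp model concrete, a minimal sketch of such a record is shown below; the field names and the sector parameterisation are assumptions for illustration only.

```python
# Illustrative sketch (assumed field names): the data a placed lamp model carries,
# i.e. its coordinates in the courtyard scene model, the angle between its emission
# direction and the horizontal, and a sector-shaped irradiation area fixed per lamp type.
from dataclasses import dataclass

@dataclass
class IrradiationArea:
    sector_angle_deg: float    # angular width of the sector, set per lamp type
    sector_radius: float       # reach of the sector, set per lamp type

@dataclass
class PlacedLampModel:
    lamp_type: str             # type returned by the identification unit
    x: float                   # placement coordinates in the courtyard scene model
    y: float
    emission_angle_deg: float  # angle between the emission direction and the horizontal
    area: IrradiationArea
```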
As a preferable scheme, the specific process of step S4 includes:
s41, acquiring position information and light-emitting angle information of the lamp model according to the placing state of the lamp model;
S42, at the position of the lamp model and towards the emission direction, framing a region on the courtyard scene model according to the shape of the irradiation area, and acquiring the image inside the framed region to establish the light effect area. In this scheme a region is framed on the courtyard scene model according to the position of the lamp model, the emission direction and the irradiation area; the image inside the region is copied, the light effect area is built on the framed region from the acquired image, and the light effect area covers the framed region.
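A rough sketch of steps S41 and S42 follows: from the lamp's position and emission angle, frame a sector-shaped region on the courtyard scene picture and copy the pixels inside it as the light effect area. It reuses the hypothetical PlacedLampModel structure sketched above and treats the scene as a 2D picture; the actual disclosure is not limited to this geometry.

```python
# Illustrative sketch of S41-S42 (assumed 2D geometry): frame the sector-shaped
# irradiation region around the lamp position and copy the image inside it.
import numpy as np

def frame_light_effect_area(scene: np.ndarray, lamp) -> tuple:
    """Return (boolean mask of the framed region, copy of the image inside it)."""
    h, w = scene.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dx, dy = xs - lamp.x, ys - lamp.y
    dist = np.hypot(dx, dy)
    # angle of each pixel as seen from the lamp, measured from the horizontal
    ang = np.degrees(np.arctan2(-dy, dx))            # image y axis points downwards
    diff = (ang - lamp.emission_angle_deg + 180) % 360 - 180
    mask = (dist <= lamp.area.sector_radius) & \
           (np.abs(diff) <= lamp.area.sector_angle_deg / 2)
    patch = np.where(mask[..., None], scene, 0)      # keep only the framed region
    return mask, patch
```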
Therefore, the invention has the advantages that the lamp is identified from the image captured by the camera unit, the interactive simulation unit places the corresponding lamp model according to the user's instruction, and the lighting effect is rendered in combination with different scenes, so that the user can experience the actual effect of the lamp in the scene while buying it and obtain a better use experience.
Drawings
FIG. 1 is a block diagram of one configuration of the present invention;
FIG. 2 is a schematic flow diagram of the process of the present invention.
1 - camera unit; 2 - identification unit; 3 - interactive simulation unit.
Detailed Description
The technical scheme of the invention is further specifically described by the following embodiments and the accompanying drawings.
Embodiment:
The intelligent courtyard control system comprises a camera unit 1, an identification unit 2 and an interactive simulation unit 3, as shown in FIG. 1. The camera unit is used for collecting lamp images and is either a standalone camera or a camera component integrated in other equipment. The identification unit is internally provided with a classifier for identifying the lamp type and identifies the lamp type from the image; it is a remote server arranged in the cloud, the camera unit is connected to the identification unit through a network, and the captured image is sent to the identification unit over the network. The interactive simulation unit comprises a touch display screen and is configured with a courtyard scene model, lamp models and filter processing schemes for the light effects of the lamps. The courtyard scene model can be a three-dimensional image of a courtyard; each lamp model is a lamp image or a three-dimensional model built for the lamp; a filter processing scheme is a preset image-processing scheme used to process the picture within the set range of the lamp model so as to show the light effect after the lamp is placed. The user places the selected lamp model through the touch display screen, and the interactive simulation unit places the lamp model in the courtyard scene model according to the placement instruction and applies the filter processing scheme at the position of the lamp model, thereby displaying the light effect in that scene. The identification unit is connected to the interactive simulation unit through a network.
A control process using the intelligent courtyard control system, as shown in FIG. 2, comprises the following steps:
s1, a camera unit collects a lamp image;
s2, the identification unit identifies the model of the lamp according to the image; the specific process comprises the following steps:
s21, constructing a classifier for identifying the model of the lamp;
s22, placing the collected lamp image into a classifier for detection, and obtaining lamp model information; wherein the process of establishing a classifier comprises:
s211, collecting sample images, wherein the sample images are collected under the conditions of different illumination and different angles, and comprise images of lamps of various models;
S212, performing mirror transformation, scaling, rotation and cropping on some of the sample images to expand the number of sample images;
s213, marking the model of each lamp on the sample image;
and S214, inputting the marked sample image into a VGG network for training to obtain a lamp model classifier.
S23, the interactive simulation unit judges whether a lamp type was obtained; if not, it prompts an error indicating that no corresponding lamp exists; if so, it extracts the lamp model corresponding to the lamp type and executes step S3.
S3, the interactive simulation unit calls the corresponding lamp model and places it at the designated position of the courtyard scene model according to the instruction. The lamp model has position information, light-emission angle information and a corresponding irradiation area set according to the lamp type. The position information is the coordinates at which the model is placed in the courtyard scene model. The light-emission angle information is the included angle between the emission direction of the lamp model and the horizontal. The irradiation area is a sector-shaped region set in the emission direction of the lamp; its shape and size are set according to the lamp type, so each lamp type has its own irradiation area: lamps of different types may have sectors of different angular width or different radius, and hence irradiation areas of different shape and size.
S4, establishing a light effect area according to the information of the placement position of the lamp model; the specific process comprises the following steps:
s41, acquiring position information and light-emitting angle information of the lamp model according to the placing state of the lamp model;
S42, at the position of the lamp model and towards the emission direction, framing a region on the courtyard scene model according to the shape of the irradiation area, and acquiring the image inside the framed region to establish the light effect area.
S5, performing light-effect processing on the light effect area by applying the filter processing scheme.
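Purely as an illustration of how the embodiment's steps S1 to S5 fit together, the sketch below chains the hypothetical helpers from the earlier sketches (identify_and_fetch, frame_light_effect_area and apply_light_filter); none of these names come from the disclosure itself.

```python
# Illustrative end-to-end sketch of S1-S5, reusing the hypothetical helpers
# identify_and_fetch, frame_light_effect_area and apply_light_filter above.
import numpy as np
from PIL import Image

def simulate_lamp_in_courtyard(photo_path, scene_path, classifier, class_names,
                               lamp_catalogue, place_at, emission_angle_deg):
    lamp = identify_and_fetch(photo_path, classifier,
                              class_names, lamp_catalogue)        # S1-S2: identify the lamp type
    if lamp is None:
        raise ValueError("No corresponding lamp")                 # error prompt of S23
    lamp.x, lamp.y = place_at                                     # S3: place at the designated position
    lamp.emission_angle_deg = emission_angle_deg
    scene = np.asarray(Image.open(scene_path).convert("RGB"))
    mask, _ = frame_light_effect_area(scene, lamp)                # S4: establish the light effect area
    return apply_light_filter(Image.fromarray(scene), mask)       # S5: filter processing of the light effect
```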
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.
Although terms such as camera unit, identification unit and interactive simulation unit are used frequently herein, the possibility of using other terms is not excluded. These terms are used merely to describe and explain the essence of the invention more conveniently; construing them as any kind of additional limitation would be contrary to the spirit of the invention.

Claims (8)

1. An intelligent courtyard control system, characterized in that it comprises:
the camera unit, which is used for collecting a lamp image;
the identification unit, which identifies the lamp type according to the image;
the interactive simulation unit, which is configured with a courtyard scene model, lamp models and filter processing schemes for the light effects of the lamps, places the lamp models in the courtyard scene model according to instructions, and processes the light effect at the position of each lamp model according to the corresponding filter processing scheme;
wherein the camera unit is connected with the identification unit, and the identification unit is connected with the interactive simulation unit.
2. The intelligent courtyard control system of claim 1, wherein the identification unit is arranged in the cloud and comprises a classifier for identifying lamps, image data collected by the camera unit is transmitted to the identification unit through a network, and the identification unit sends the processing result to the interactive simulation unit through the network.
3. The intelligent courtyard control system of claim 1 or claim 2, wherein the interactive simulation unit comprises a touch display screen.
4. The intelligent courtyard control system of claim 1, wherein the specific process of identifying the lamp type from the image comprises:
s21, constructing a classifier for identifying the model of the lamp;
s22, placing the collected lamp image into a classifier for detection, and obtaining lamp model information;
S23, the interactive simulation unit judges whether a lamp type was obtained; if not, it prompts an error indicating that no corresponding lamp exists; if so, it extracts the lamp model corresponding to the lamp type.
5. The intelligent yard control system of claim 4, wherein the specific process of constructing the classifier in step S21 includes:
s211, collecting sample images, wherein the sample images are collected under the conditions of different illumination and different angles, and comprise images of lamps of various models;
S212, performing mirror transformation, scaling, rotation and cropping on some of the sample images to expand the number of sample images;
s213, marking the model of each lamp on the sample image;
and S214, inputting the marked sample image into a VGG network for training to obtain a lamp model classifier.
6. The intelligent courtyard control system of claim 4, wherein the lamp model has position information, light-emission angle information, and a corresponding irradiation area set according to the lamp type.
7. The intelligent courtyard control system of claim 1, wherein the interactive simulation unit performs the following steps:
s3, calling a corresponding lamp model from the courtyard scene model, and placing the lamp model at a designated position of the courtyard scene model according to an instruction;
s4, establishing a light effect area according to the information of the placement position of the lamp model;
and S5, performing light-effect processing on the light effect area by applying the filter processing scheme.
8. The intelligent yard control system of claim 7, wherein the specific process of step S4 includes:
s41, acquiring position information and light-emitting angle information of the lamp model according to the placing state of the lamp model;
and S42, at the position of the lamp model and towards the emission direction, framing a region on the courtyard scene model according to the shape of the irradiation area, and acquiring the image inside the framed region to establish the light effect area.

Priority Applications (1)

Application Number: CN201910833510.XA
Priority Date: 2019-09-04
Filing Date: 2019-09-04
Title: Intelligent courtyard control system

Applications Claiming Priority (1)

Application Number: CN201910833510.XA
Priority Date: 2019-09-04
Filing Date: 2019-09-04
Title: Intelligent courtyard control system

Publications (1)

Publication Number: CN111222392A
Publication Date: 2020-06-02

Family

Family ID: 70830889

Family Applications (1)

Application Number: CN201910833510.XA
Title: Intelligent courtyard control system
Status: Pending
Priority Date: 2019-09-04
Filing Date: 2019-09-04

Country Status (1)

Country: CN
Publication: CN111222392A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN107689073A * | 2016-08-05 | 2018-02-13 | 阿里巴巴集团控股有限公司 (Alibaba Group Holding Ltd) | Image set generation method and device, and image recognition model training method and system
US20190220694A1 * | 2018-01-18 | 2019-07-18 | Accenture Global Solutions Limited | Dynamically identifying object attributes via image analysis

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
林智奇 et al., "LED 室内照明仿真平台的研究与实现" [Research and Implementation of an LED Indoor Lighting Simulation Platform] *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN114114936A * | 2021-11-10 | 2022-03-01 | 深圳市欧瑞博科技股份有限公司 (Shenzhen ORVIBO Technology Co., Ltd.) | Grouping method and grouping device for intelligent lamps, intelligent equipment and storage medium


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination