CN117671464B - Equipment internet of things data management system based on edge computing - Google Patents


Info

Publication number
CN117671464B
CN117671464B CN202410148643.4A
Authority
CN
China
Prior art keywords: monitoring, moment, image, shading, brightness
Legal status: Active
Application number: CN202410148643.4A
Other languages: Chinese (zh)
Other versions: CN117671464A
Inventors: 冯钰洋, 蔡俊松, 肖高峰, 刘磊, 郁波, 潘保成, 余海平, 罗国卿, 任俊杰, 李龙江
Current Assignee: Energy Solution Development Shenzhen Co ltd
Original Assignee: Energy Solution Development Shenzhen Co ltd
Application filed by Energy Solution Development Shenzhen Co ltd
Priority to CN202410148643.4A
Publication of CN117671464A
Application granted
Publication of CN117671464B
Legal status: Active

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an equipment internet of things data management system based on edge computing, belonging to the technical field of data processing, which comprises an image set generating unit, a shading coefficient generating unit and an image processing unit. The image set generating unit generates an image set; the shading coefficient generating unit generates a shading coefficient sequence; and the image processing unit uploads all processed monitoring images to the edge computing terminal. The disclosed system processes the images of monitoring videos captured by camera equipment, taking into account both the brightness change of the images over the whole time period and the change of illumination intensity in the monitored area, thereby improving the quality of the captured images and their usability. Because the whole data processing flow runs at the edge and does not need to be uploaded to a central terminal for further processing, data processing efficiency is improved.

Description

Equipment internet of things data management system based on edge computing
Technical Field
The invention belongs to the technical field of data processing, and particularly relates to an equipment internet of things data management system based on edge computing.
Background
Monitoring systems are applied in industrial production, the traditional Internet, mobile communication technology, and other fields. With the wide application of network video monitoring, massive volumes of monitoring data are generated, which puts great pressure on data processing. Classified storage of image frames is one way to store network video monitoring data; in the prior art, however, the image quality of frames stored in this way is poor. Edge computing places data processing and analysis closer to the data source, reducing transmission delay and network congestion and improving processing speed and efficiency. How to realize data processing for monitoring equipment based on edge computing has therefore become a problem to be solved urgently.
Disclosure of Invention
In order to solve the problems, the invention provides an equipment internet of things data management system based on edge computing.
The technical scheme of the invention is as follows: the equipment internet of things data management system based on edge computing comprises an image set generating unit, a shading coefficient generating unit and an image processing unit;
the image set generating unit is used for acquiring a monitoring video of the monitored area through a camera, preprocessing the monitoring video, and generating an image set;
the shading coefficient generating unit is used for generating a corresponding shading coefficient for each monitoring image in the image set, forming a shading coefficient sequence;
the image processing unit is used for processing each monitoring image in the image set according to the shading coefficient sequence and uploading all processed monitoring images to the edge computing terminal.
Further, the image set generating unit preprocesses the monitoring video as follows: the monitoring video of the monitored area is split into frames using OpenCV, yielding a monitoring image for each moment; these images form the image set.
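A minimal sketch of this frame-splitting step, assuming OpenCV's Python bindings (`opencv-python`). The capture object is passed in so the loop can also be exercised with a stand-in, and the `stride` parameter (keep every k-th frame) is an illustrative extra, not something the patent specifies.

```python
def open_video(path):
    """Open a surveillance video with OpenCV (lazy import so the
    frame loop below can be tested without opencv installed)."""
    import cv2  # requires the opencv-python package
    return cv2.VideoCapture(path)

def split_into_frames(cap, stride=1):
    """Read frames from a capture object exposing read() -> (ok, frame)
    and release(), keeping every stride-th frame as the image set."""
    frames = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of stream
            break
        if index % stride == 0:
            frames.append(frame)
        index += 1
    cap.release()
    return frames
```

With a real file this would be called as `split_into_frames(open_video("monitor.mp4"))`; the file name is illustrative.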
Further, the shading coefficient generating unit generates the shading coefficient sequence through the following sub-steps:
A1, obtain the brightness value of every pixel in the monitoring image at each moment, and construct a brightness change function for each monitoring image;
A2, correct the brightness change function of the monitoring image at each moment to obtain the shading correction factor of the monitoring image at each moment;
A3, take the difference between the shading correction factor of the monitoring image at the final moment and that at the initial moment as the shading change factor; if the ratio of the shading change factor to the shading correction factor at the initial moment is greater than or equal to 0.5, go to A4, otherwise go to A5;
A4, take the mean of the shading change factor and the shading correction factor of the monitoring image at each moment as the shading coefficient of that moment, generating the shading coefficient sequence;
A5, take the shading correction factor of the monitoring image at each moment as the shading coefficient of that moment, generating the shading coefficient sequence.
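Under the assumption that the per-moment shading correction factors from A2 are already available, the branch in steps A3-A5 can be sketched as follows (function and variable names are illustrative):

```python
def shading_coefficient_sequence(correction_factors):
    """A3-A5: turn per-moment shading correction factors (in time order)
    into the shading coefficient sequence."""
    F = list(correction_factors)
    change = F[-1] - F[0]  # shading change factor: final minus initial (A3)
    if F[0] != 0 and change / F[0] >= 0.5:
        # A4: large first-to-last drift -> mean of the change factor and
        # each moment's correction factor
        return [(change + f) / 2 for f in F]
    # A5: drift is small -> the correction factors are used directly
    return F
```

The zero-check on the initial factor is an added safeguard against division by zero; the patent text does not address that case.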
The beneficial effects of the above further scheme are: the maximum brightness values of the rows and columns of pixels in each moment's monitoring image are summed to obtain a function characterizing the brightness change of that image, and the brightness change function at the current moment is corrected using the functions at the adjacent preceding and following moments, yielding a shading correction factor for each moment. The invention then judges whether the difference between the shading correction factors at the first and last moments (namely the shading change factor) is too large, and generates the shading coefficient sequence accordingly.
Further, in A1, the brightness change function f_t of the monitoring image at time t is expressed as: f_t = Σ_{m=1}^{M} S_m + Σ_{n=1}^{N} S_n; where S_m denotes the maximum brightness value among the pixels in the m-th row of the monitoring image at time t, S_n denotes the maximum brightness value among the pixels in the n-th column, M denotes the number of pixel rows of the monitoring image at time t, N denotes the number of pixel columns, and max(·) denotes the maximum value operation used to obtain S_m and S_n.
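The row/column-maximum sum in A1 can be computed directly with NumPy; the single-channel grayscale input and the function name are illustrative:

```python
import numpy as np

def brightness_change_value(gray):
    """f_t = sum of per-row maxima (S_m) plus per-column maxima (S_n)
    of a single-channel brightness image."""
    row_max = gray.max(axis=1)  # S_m for m = 1..M
    col_max = gray.max(axis=0)  # S_n for n = 1..N
    return float(row_max.sum() + col_max.sum())
```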
Further, in A2, the shading correction factor F_t of the monitoring image at time t is calculated as: F_t = (f_{t-1} + f_t + f_{t+1}) / 3; where f_t denotes the brightness change function of the monitoring image at time t, f_{t-1} denotes the brightness change function of the monitoring image at time t-1, and f_{t+1} denotes the brightness change function of the monitoring image at time t+1.
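A direct sketch of this correction, averaging each moment's brightness change function with its temporal neighbors. The treatment of the endpoint moments, which lack one neighbor, is an added convention not given in the text:

```python
def shading_correction_factors(f):
    """F_t = (f_{t-1} + f_t + f_{t+1}) / 3 for interior moments;
    the first and last moments keep their own value (added convention)."""
    T = len(f)
    return [
        (f[t - 1] + f[t] + f[t + 1]) / 3 if 0 < t < T - 1 else float(f[t])
        for t in range(T)
    ]
```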
Further, the image processing unit processes each monitoring image in the image set through the following steps:
B1, determine the shading intensity at each moment according to the shading coefficient sequence;
B2, process the pixel values of the monitoring image at each moment according to the shading intensity at that moment;
B3, apply mean filtering and then denoising to each monitoring image whose pixel values have been processed.
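The mean filtering stage of B3 can be written as below; this pure-NumPy version mirrors what `cv2.blur(img, (k, k))` does with replicated borders, and the default kernel size is illustrative:

```python
import numpy as np

def mean_filter(img, k=3):
    """k x k mean filter over a single-channel image, replicating
    edge pixels so the output keeps the input's shape."""
    pad = k // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out
```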
The beneficial effects of the above further scheme are: the elements of the shading coefficient sequence are arranged in time order, while the elements of the reconstructed shading coefficient sequence are arranged in ascending order of value; the medians of the two sequences (one taken with respect to time order, the other with respect to value order), together with the illumination brightness, determine the shading intensity at each moment. The RGB components of the pixels of each moment's monitoring image are then adjusted according to that moment's shading intensity, improving image quality.
Further, B1 comprises the following sub-steps:
B11, sort the shading coefficients of the shading coefficient sequence in ascending order to generate a reconstructed shading coefficient sequence;
B12, collect the illumination brightness of the monitored area at each moment;
B13, determine the shading intensity at each moment according to the shading coefficient sequence, the reconstructed shading coefficient sequence, and the illumination brightness of the monitored area at each moment.
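The two medians used in B13 can be read off as follows — one from the time-ordered sequence, one from the ascending-order reconstruction of B11. Taking the upper-middle element for even-length sequences is an illustrative convention:

```python
def sequence_medians(shading_coefficients):
    """Return (c1, c2): the middle element of the time-ordered shading
    coefficient sequence, and the middle element after sorting (B11)."""
    seq = list(shading_coefficients)
    reconstructed = sorted(seq)  # B11: ascending order
    mid = len(seq) // 2
    return seq[mid], reconstructed[mid]
```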
Further, in B13, the shading intensity q_t at time t is computed from the median c_1 of the shading coefficient sequence, the median c_2 of the reconstructed shading coefficient sequence, and the illumination brightness L_t of the monitored area at time t.
Further, in B2, the pixel values of the monitoring image at each moment are processed as follows: the red, green, and blue component values of each pixel in the monitoring image are each increased by the shading intensity of the corresponding moment, and the resulting sums are taken as the pixel's new red, green, and blue component values.
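The per-pixel adjustment in B2 amounts to adding the moment's shading intensity to every channel; the clip to the 8-bit range is an added safeguard the text does not mention:

```python
import numpy as np

def apply_shading_intensity(frame, intensity):
    """B2: add shading intensity q_t to the R, G and B components of
    every pixel, clipped to [0, 255] (clipping is an added safeguard)."""
    shifted = frame.astype(int) + int(round(intensity))
    return np.clip(shifted, 0, 255).astype(np.uint8)
```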
The beneficial effects of the invention are as follows: the disclosed equipment internet of things data management system based on edge computing processes the images of monitoring videos captured by camera equipment, taking into account both the brightness change of the images over the whole time period and the change of illumination intensity in the monitored area, thereby improving the quality of the captured images and their usability. Because the whole data processing flow runs at the edge and does not need to be uploaded to a central terminal for further processing, data processing efficiency is improved.
Drawings
Fig. 1 is a schematic structural diagram of the equipment internet of things data management system based on edge computing.
Description of the embodiments
Embodiments of the present invention are further described below with reference to the accompanying drawings.
As shown in fig. 1, the invention provides an equipment internet of things data management system based on edge computing, which comprises an image set generating unit, a shading coefficient generating unit and an image processing unit;
the image set generating unit is used for acquiring a monitoring video of the monitored area through a camera, preprocessing the monitoring video, and generating an image set;
the shading coefficient generating unit is used for generating a corresponding shading coefficient for each monitoring image in the image set, forming a shading coefficient sequence;
the image processing unit is used for processing each monitoring image in the image set according to the shading coefficient sequence and uploading all processed monitoring images to the edge computing terminal.
In the embodiment of the invention, the image set generating unit preprocesses the monitoring video as follows: the monitoring video of the monitored area is split into frames using OpenCV, yielding a monitoring image for each moment; these images form the image set.
In an embodiment of the invention, the shading coefficient generating unit generates the shading coefficient sequence through the following sub-steps:
A1, obtain the brightness value of every pixel in the monitoring image at each moment, and construct a brightness change function for each monitoring image;
A2, correct the brightness change function of the monitoring image at each moment to obtain the shading correction factor of the monitoring image at each moment;
A3, take the difference between the shading correction factor of the monitoring image at the final moment and that at the initial moment as the shading change factor; if the ratio of the shading change factor to the shading correction factor at the initial moment is greater than or equal to 0.5, go to A4, otherwise go to A5;
A4, take the mean of the shading change factor and the shading correction factor of the monitoring image at each moment as the shading coefficient of that moment, generating the shading coefficient sequence;
A5, take the shading correction factor of the monitoring image at each moment as the shading coefficient of that moment, generating the shading coefficient sequence.
In the invention, the maximum brightness values of the rows and columns of pixels in each moment's monitoring image are summed to obtain a function characterizing the brightness change of that image, and the brightness change function at the current moment is corrected using the functions at the adjacent preceding and following moments, yielding a shading correction factor for each moment. The invention then judges whether the difference between the shading correction factors at the first and last moments (namely the shading change factor) is too large, and generates the shading coefficient sequence accordingly.
In the embodiment of the present invention, in A1, the brightness change function f_t of the monitoring image at time t is expressed as: f_t = Σ_{m=1}^{M} S_m + Σ_{n=1}^{N} S_n; where S_m denotes the maximum brightness value among the pixels in the m-th row of the monitoring image at time t, S_n denotes the maximum brightness value among the pixels in the n-th column, M denotes the number of pixel rows of the monitoring image at time t, N denotes the number of pixel columns, and max(·) denotes the maximum value operation used to obtain S_m and S_n.
In the embodiment of the present invention, in A2, the shading correction factor F_t of the monitoring image at time t is calculated as: F_t = (f_{t-1} + f_t + f_{t+1}) / 3; where f_t denotes the brightness change function of the monitoring image at time t, f_{t-1} denotes the brightness change function of the monitoring image at time t-1, and f_{t+1} denotes the brightness change function of the monitoring image at time t+1.
In an embodiment of the present invention, the image processing unit processes each monitoring image in the image set through the following steps:
B1, determine the shading intensity at each moment according to the shading coefficient sequence;
B2, process the pixel values of the monitoring image at each moment according to the shading intensity at that moment;
B3, apply mean filtering and then denoising to each monitoring image whose pixel values have been processed.
In the invention, the elements of the shading coefficient sequence are arranged in time order, while the elements of the reconstructed shading coefficient sequence are arranged in ascending order of value; the medians of the two sequences (one taken with respect to time order, the other with respect to value order), together with the illumination brightness, determine the shading intensity at each moment. The RGB components of the pixels of each moment's monitoring image are then adjusted according to that moment's shading intensity, improving image quality.
In an embodiment of the present invention, B1 comprises the following sub-steps:
B11, sort the shading coefficients of the shading coefficient sequence in ascending order to generate a reconstructed shading coefficient sequence;
B12, collect the illumination brightness of the monitored area at each moment;
B13, determine the shading intensity at each moment according to the shading coefficient sequence, the reconstructed shading coefficient sequence, and the illumination brightness of the monitored area at each moment.
In the embodiment of the present invention, in B13, the shading intensity q_t at time t is computed from the median c_1 of the shading coefficient sequence, the median c_2 of the reconstructed shading coefficient sequence, and the illumination brightness L_t of the monitored area at time t.
In the embodiment of the invention, in B2, the pixel values of the monitoring image at each moment are processed as follows: the red, green, and blue component values of each pixel in the monitoring image are each increased by the shading intensity of the corresponding moment, and the resulting sums are taken as the pixel's new red, green, and blue component values.
Those of ordinary skill in the art will recognize that the embodiments described herein are intended to help the reader understand the principles of the present invention, and that the scope of the invention is not limited to these specific statements and embodiments. Those of ordinary skill in the art can make various other specific modifications and combinations based on the teachings of the present disclosure without departing from its spirit, and such modifications and combinations remain within the scope of the present disclosure.

Claims (5)

1. The data management system of the equipment internet of things based on edge computing is characterized by comprising an image set generating unit, a shading coefficient generating unit and an image processing unit;
the image set generating unit is used for acquiring a monitoring video of the monitored area through a camera, preprocessing the monitoring video, and generating an image set;
the shading coefficient generating unit is used for generating a corresponding shading coefficient for each monitoring image in the image set, forming a shading coefficient sequence;
the image processing unit is used for processing each monitoring image in the image set according to the shading coefficient sequence and uploading all processed monitoring images to the edge computing terminal;
the generating of the shading coefficient sequence by the shading coefficient generating unit comprises the following sub-steps:
a1, obtaining brightness values of all pixel points in each moment monitoring image, and constructing a brightness change function for each moment monitoring image;
a2, correcting the brightness change function of the monitoring image at each moment to obtain a brightness correction factor of the monitoring image at each moment;
a3, taking the difference value between the shading correction factor of the monitoring image at the final moment and the shading correction factor of the monitoring image at the initial moment as a shading change factor, judging whether the ratio between the shading change factor and the shading correction factor of the monitoring image at the initial moment is greater than or equal to 0.5, if so, entering A4, otherwise, entering A5;
a4, taking the mean value of the shading change factors and the shading correction factors of the monitoring images at all times as the shading coefficient of the monitoring images at all times to generate a shading coefficient sequence;
a5, taking the shading correction factors of the monitoring images at all the moments as shading coefficients of the monitoring images at all the moments, and generating a shading coefficient sequence;
in the A1, the brightness change function f of the monitoring image at the time t t The expression of (2) is:wherein S is m Representing the maximum brightness value in the m-th row pixel point in the monitoring image at the moment t, S n The maximum brightness value in the nth row of pixels in the monitoring image at the moment t is represented, M represents the number of rows of pixels in the monitoring image at the moment t, N represents the number of columns of pixels in the monitoring image at the moment t, and max (·) represents the maximum value operation;
in the A2, the brightness correction factor F of the monitoring image at the moment t t The calculation formula of (2) is as follows:wherein f t Represents the brightness change function of the monitoring image at the moment t, f t-1 Represents the brightness change function of the monitoring image at the time t-1, f t+1 A brightness change function of the monitoring image at the time t+1 is represented;
the image processing unit processes each monitoring image in the image set, and comprises the following steps:
b1, determining the brightness intensity at each moment according to the brightness coefficient sequence;
b2, carrying out pixel value processing on the monitoring images at all the moments according to the brightness intensity at all the moments;
and B3, sequentially carrying out mean value filtering processing and denoising processing on the monitoring images at all moments after the pixel value processing.
2. The data management system of the equipment internet of things based on edge computing according to claim 1, wherein the image set generating unit preprocesses the monitoring video as follows: the monitoring video of the monitored area is split into frames using OpenCV, yielding a monitoring image for each moment; these images form the image set.
3. The data management system of the equipment internet of things based on edge computing according to claim 1, wherein B1 comprises the following sub-steps:
B11, sort the shading coefficients of the shading coefficient sequence in ascending order to generate a reconstructed shading coefficient sequence;
B12, collect the illumination brightness of the monitored area at each moment;
B13, determine the shading intensity at each moment according to the shading coefficient sequence, the reconstructed shading coefficient sequence, and the illumination brightness of the monitored area at each moment.
4. The data management system of the equipment internet of things based on edge computing according to claim 3, wherein in B13 the shading intensity q_t at time t is computed from the median c_1 of the shading coefficient sequence, the median c_2 of the reconstructed shading coefficient sequence, and the illumination brightness L_t of the monitored area at time t.
5. The data management system of the equipment internet of things based on edge computing according to claim 1, wherein in B2 the pixel values of the monitoring image at each moment are processed as follows: the red, green, and blue component values of each pixel in the monitoring image are each increased by the shading intensity of the corresponding moment, and the resulting sums are taken as the pixel's new red, green, and blue component values.
CN202410148643.4A 2024-02-02 2024-02-02 Equipment internet of things data management system based on edge computing Active CN117671464B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410148643.4A CN117671464B (en) 2024-02-02 2024-02-02 Equipment internet of things data management system based on edge computing


Publications (2)

Publication Number Publication Date
CN117671464A (en) 2024-03-08
CN117671464B (en) 2024-04-16

Family

ID=90075424

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410148643.4A Active CN117671464B (en) 2024-02-02 2024-02-02 Equipment internet of things data management system based on edge computing

Country Status (1)

Country Link
CN (1) CN117671464B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110189349A (en) * 2019-06-03 2019-08-30 湖南国科微电子股份有限公司 Image processing method and device
CN110798592A (en) * 2019-10-29 2020-02-14 普联技术有限公司 Object movement detection method, device and equipment based on video image and storage medium
CN116708724A (en) * 2023-08-07 2023-09-05 江苏省电子信息产品质量监督检验研究院(江苏省信息安全测评中心) Sample monitoring method and system based on machine vision
CN116828209A (en) * 2023-08-30 2023-09-29 华洋通信科技股份有限公司 Method and system for transmitting intelligent video monitoring data under mine
CN117292330A (en) * 2023-11-27 2023-12-26 山东海博科技信息系统股份有限公司 Intelligent monitoring system suitable for time sequence data operation and maintenance
CN117409000A (en) * 2023-12-14 2024-01-16 华能澜沧江水电股份有限公司 Radar image processing method for slope


Also Published As

Publication number Publication date
CN117671464A (en) 2024-03-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant