CN208335208U - Image fusion acquisition system containing meteorological parameters - Google Patents


Publication number: CN208335208U
Authority
CN
China
Legal status: Active
Application number: CN201820313952.2U
Other languages: Chinese (zh)
Inventor
阮驰
冯亚闯
陈小川
王允韬
马新旭
Current Assignee: XiAn Institute of Optics and Precision Mechanics of CAS
Original Assignee: XiAn Institute of Optics and Precision Mechanics of CAS
Application filed by XiAn Institute of Optics and Precision Mechanics of CAS
Priority: CN201820313952.2U
Application granted; published as CN208335208U

Classification: Image Processing

Abstract

The utility model relates to an image fusion acquisition system containing meteorological parameters, which solves the problem that existing image measurement relies on image brightness information, and that image brightness information is disturbed by the shooting environment. The system comprises a fusion acquisition unit and a server unit; the server unit comprises a first controller, a first wireless communication device and at least one database, wherein the first wireless communication device and the database are respectively connected with the first controller; the fusion acquisition unit comprises a second controller, a second wireless communication device, a storage device, a camera, a positioning device, a timing device and a plurality of meteorological sensors, wherein the second wireless communication device, the storage device, the camera, the positioning device, the timing device and the meteorological sensors are respectively connected with the second controller; the second wireless communication device communicates with the first wireless communication device.

Description

Image fusion acquisition system containing meteorological parameters
Technical Field
The utility model relates to the field of images, and in particular to an image fusion acquisition system containing meteorological parameters.
Background
At present, a traditional image is usually expressed as raw brightness information or as a feature representation computed from brightness, as in remote sensing image measurement, spectral image measurement, infrared image measurement and the like. Such image measurement depends heavily on the brightness information of the image, yet that brightness information is clearly disturbed by the shooting environment: the features are all built on visible light, and under different environments the light conditions of the same scene differ considerably.
An image is usually represented by one channel (grey-scale image), three channels (colour image) or multiple channels (spectral image), and feature expression in computer vision applications is likewise computed from these image parameters. However, the same scene usually appears differently under different shooting conditions: as the light weakens, the image transitions from light to dark; as air humidity increases, the spectral curve of ground objects in the image changes noticeably; at different shooting temperatures the captured image looks different; and visibility directly affects the sharpness of the image. Therefore, an image representation that depends only on the collected brightness values of the channels is incomplete: it lacks the environmental information present when the image was shot, and ignores shooting meteorological conditions such as atmospheric temperature, humidity, visibility and illumination, so that image information is missing.
SUMMARY OF THE UTILITY MODEL
The utility model aims to solve the problem that existing image measurement relies on image brightness information, which is disturbed by the shooting environment, and provides an image fusion acquisition system containing meteorological parameters, which can store images containing multiple meteorological parameters and thereby further improve the accuracy of the captured images.
The technical scheme of the utility model is as follows:
an image fusion acquisition system containing meteorological parameters comprises a fusion acquisition unit and a server unit; the server unit comprises a first controller, a first wireless communication device and at least one database, wherein the first wireless communication device and the database are respectively connected with the first controller; the fusion acquisition unit comprises a second controller, a second wireless communication device, a storage device, a camera, a positioning device, a timing device and a plurality of meteorological sensors, wherein the second wireless communication device, the storage device, the camera, the positioning device, the timing device and the meteorological sensors are respectively connected with the second controller; the second wireless communication device communicates with the first wireless communication device; the database stores images in a specific format including meteorological information; the second controller associates the image data with the meteorological data according to the position identifier acquired by the positioning device and the time identifier generated by the timing device, sends the image data and the meteorological data to the server unit in the specific format, and stores them in a database of the server unit; the specific format comprises a file header and a file body, the file header comprises a file registration code and file image parameters, and the file image parameters comprise an image format, an image size, an image channel, reserved information bits, a time identifier and a position identifier; the file body comprises meteorological data and image data.
Further, the camera is a video camera, a still camera or a pan-tilt camera for capturing images.
Further, the weather sensors include visibility instruments, illuminance sensors, sensors for detecting PM2.5, and/or automated weather stations.
Further, the positioning device comprises a Beidou satellite positioning system or a GPS positioning system.
Further, the second wireless communication device and the first wireless communication device are in wired connection, WiFi network connection or cellular connection.
Meanwhile, the utility model also provides an image storage method containing meteorological parameters based on the above system, comprising the following steps:
1) collecting data;
acquiring images and meteorological parameters of the same environmental scene, and storing data in a storage device according to a position identifier and a time identifier at a meteorological acquisition point;
2) meteorological data processing and correlation;
the second controller associates the image data with the meteorological data according to the position identifier acquired by the positioning device and the time identifier generated by the timing device, generates an image containing meteorological parameters, and transmits the generated image to the server unit in the specific format to be stored in a database of the server unit; the specific format comprises a file header and a file body, the file header comprises a file registration code and file image parameters, and the file image parameters comprise an image format, an image size, an image channel, a reserved information bit, a time identification bit and a position identification; the file body comprises meteorological data and image data;
3) processing and classifying images;
3.1) standardizing the meteorological data and then obtaining the weather data feature F_wea using a fully-connected network;
3.2) constructing an adaptive convolutional neural network using the weather data feature F_wea obtained in step 3.1);
3.3) extracting the remote sensing image feature F_rgb using the adaptive convolutional neural network constructed in step 3.2), and classifying it with a SoftMax classifier;
3.4) training and testing the adaptive convolutional neural network, and classifying the remote sensing images with the trained adaptive convolutional neural network;
4) storing;
images generated in a particular format containing meteorological parameters are accumulated in a database.
Further, step 3.1) specifically sets the initial weather feature vector as x^0; the fully-connected network has L layers, and the process from layer l to layer l+1 is:

x^{l+1} = sigmoid(W^{l+1} x^l + b^{l+1})    (1)

wherein W^{l+1} is the weight of layer l+1, taking a random initialization value; b^{l+1} is the bias vector of layer l+1, taking a random initialization value; x^l is the output of layer l; x^{l+1} is the output of layer l+1; sigmoid is an activation function.
Repeating formula (1) yields the output x^{L+1} of layer L+1, which is recorded as the final output weather data feature F_wea.
Further, step 3.2) specifically sets the convolution kernel parameters of layer l of the original convolutional neural network as W_conv^l; the weather data feature F_wea obtained in step 3.1) is used to weight the convolution kernel parameters to obtain new convolution kernel parameters Ŵ_conv^l, the process being as follows:

Ŵ_conv^l = W_conv^l ⊙ reshape(W_transfer · F_wea)    (2)

wherein W_transfer is a transformation matrix, reshape is a deformation function, and ⊙ denotes element-wise multiplication; Ŵ_conv^l obtained by formula (2) is the adaptive version of the original convolution kernel parameters W_conv^l.
Further, step 3.3) specifically: the adaptive convolutional neural network is a multi-layer network structure, each layer consisting of three operations (convolution, activation and pooling), and the calculation from layer l to layer l+1 is given by:

C_k^{l+1} = Ŵ_k^{l+1} * Z^l + b_k^{l+1}    (3)
A^{l+1} = max(0, C^{l+1})    (4)
Z^{l+1} = Pooling(A^{l+1})    (5)

wherein formula (3) represents the convolution operation, formula (4) the activation operation, and formula (5) the pooling operation. In formula (3), C_k^{l+1} is the output of the convolution operation in layer l+1, Ŵ_k^{l+1} denotes the kth filter in layer l+1, b_k^{l+1} denotes the weight bias of the kth filter in layer l+1, and Z^l represents the output of layer l. In formula (4), A^{l+1} represents the activation output in layer l+1, and max refers to the maximum value operation. In formula (5), Z^{l+1} represents the overall output of layer l+1, and Pooling refers to the pooling operation.
The first-layer input of the convolutional neural network is an RGB image I_rgb, thus Z^1 = I_rgb; the adaptive convolution kernel parameters Ŵ_conv^l of layer l are obtained in step 3.2). The output Z^{L+1} of the last network layer is obtained through layer-by-layer forward propagation and recorded as the final remote sensing image feature F_rgb, which is then classified with a SoftMax classifier.
Further, step 3.4) is specifically as follows:
4a) training: the parameters of the fully-connected network of step 3.1) and the adaptive convolutional neural network of step 3.2) are trained on the acquired data set using the error back-propagation algorithm, with the data set serving as the training set;
4b) testing: the pictures in the test set and the corresponding weather data are input into the trained overall network, and the classification accuracy of the overall network is calculated from the difference between the predicted classification and the actual classification; with R the number of correctly classified images and R_total the number of samples in the test set, the classification accuracy is:

accuracy = R / R_total × 100%    (6)

4c) classification: an arbitrary remote sensing image and the weather data corresponding to the image are input into the network, which outputs the remote sensing scene category corresponding to the image.
The utility model has the advantages that:
1. The utility model provides a method and system for fusing meteorological parameters with images bearing time and position identifiers, which can present the actual conditions of a scene more comprehensively and accurately, and allows the user to further process images according to the time, position and meteorological parameters at the moment of shooting.
2. The system and method of the utility model can obtain all-round meteorological information of the same scene, such as temperature, humidity, brightness and pressure, while collecting scene grey-scale information, avoiding the influence of this meteorological information on the captured image data; the captured image thus constitutes a complete scene representation.
3. The image format of the utility model is a self-defined ZCP format containing a time identifier, a position identifier, image data, meteorological parameter data and the like; a vehicle-mounted meteorological image fusion acquisition unit is realized, so that the user can acquire comprehensive scene information quickly and conveniently.
4. The utility model constructs a convolutional neural network whose parameters are adaptively adjusted according to weather features, and uses weather features and image features simultaneously, overcoming the drawback that traditional methods are limited by environmental influences such as illumination; the expression of the scene is thereby more refined and the learned features more generalizable, improving the accuracy of scene classification.
5. Besides the brightness values of the collected scene image, the utility model also considers the environmental information at the time of shooting; in this way, ambiguity problems in scene perception and understanding can be effectively avoided.
6. The utility model breaks through the limitation of the image information expression mode in existing methods; through multi-feature fusion, a correct expression mode of the image scene is obtained, overcoming the difficulties that ground objects in remote sensing images are complex and inter-class similarity is large, and can be used in aspects such as geographical conditions survey, military reconnaissance and environmental monitoring.
Drawings
FIG. 1 is a system structure diagram of the present invention;
FIG. 2 is a flow chart of a method of generating an image including meteorological parameters according to the present invention;
FIG. 3 is a ZCP structure diagram of the storage format containing meteorological parameter images of the present invention;
FIG. 4 is a flow chart of a data association method of the present invention;
FIG. 5 is a schematic diagram of the present invention using a fully connected neural network to extract weather features;
FIG. 6 is a schematic diagram of the present invention using weather features to construct an adaptive convolutional neural network;
fig. 7 is a schematic diagram of the present invention for extracting image features by using an adaptive convolutional neural network.
Detailed Description
The technical solution of the present invention is clearly and completely described below with reference to the drawings of the specification.
The utility model provides an image fusion acquisition system containing meteorological parameters and an image storage method, which record image information while acquiring the meteorological information of the same moment and the same scene as the image, such as temperature, humidity, atmospheric pressure, rainfall, wind speed, wind direction, visibility and illuminance, as well as the grey-scale/spectral information of the captured scene. Since image data is affected by weather information, an image containing environmental weather information is a complete image representation. The utility model applies characteristics such as cross-discipline applicability, high resolution, and fast, lossless acquisition to research on computer vision tasks, changing the traditional understanding of images and perfecting the expression mode of images. On the one hand, the utility model can serve the design of spectral imaging systems; on the other hand, the equipment can also be used in fields such as precision agriculture and intelligent transportation, promoting the development of computer vision technology. The utility model can drive the development of disciplines such as optical imaging systems, computer vision, intelligent driving and robotics, and has great dual academic and economic value.
The utility model provides an image fusion acquisition system containing meteorological parameters. The system can be configured as a monitoring system for any environment, for example in outdoor remote monitoring systems such as road traffic, construction sites and forest parks; the system monitors the meteorological information and image information of the outdoor scene, processes the collected data in the server unit, and improves the accuracy and integrity of the monitored images.
As shown in fig. 1, the image fusion acquisition system containing meteorological parameters comprises a server unit and a fusion acquisition unit. The server unit comprises a first controller, a first wireless communication device and a plurality of databases; the first wireless communication device and the databases are respectively connected with the first controller, and the databases store images containing weather information in a specific format. The fusion acquisition unit comprises a second controller, a second wireless communication device, a storage device, a camera, a plurality of meteorological sensors, a positioning device and a timing device, wherein the second wireless communication device, the storage device, the camera, the meteorological sensors, the positioning device and the timing device are respectively connected with the second controller, and the second wireless communication device communicates with the first wireless communication device. The fusion acquisition unit is used for capturing image data of an environmental scene and acquiring environmental meteorological data, associating the image data and the meteorological data according to the acquired position identifier and the generated time identifier, and sending them to the server unit in a specific format.
The fusion acquisition unit can classify the environmental meteorological data and the image data according to the positioning information and the time information, specifically, the meteorological data and the image data within a preset distance range and a preset time period are associated, the associated data are generated into an image format containing meteorological parameters according to a specific format and are sent to the first wireless communication device through the second wireless communication device, and the first controller stores the received images in a corresponding database according to the geographical position or the time relationship.
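The association rule described above (matching image records with weather records that fall within a preset time period and distance range) can be sketched as follows. This is a minimal illustration only: the record fields, the 60-second window and the 100-metre radius are assumed values, not parameters fixed by the utility model.

```python
from dataclasses import dataclass
import math

@dataclass
class WeatherRecord:
    timestamp: float          # seconds since epoch (time identifier)
    lat: float                # position identifier
    lon: float
    data: dict                # e.g. {"temperature": 21.5, "humidity": 0.6}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def associate(image_ts, image_lat, image_lon, weather_records,
              max_dt=60.0, max_dist=100.0):
    """Return the closest-in-time weather record within the preset
    time window (max_dt seconds) and distance range (max_dist metres),
    or None when no record qualifies."""
    candidates = [w for w in weather_records
                  if abs(w.timestamp - image_ts) <= max_dt
                  and haversine_m(image_lat, image_lon, w.lat, w.lon) <= max_dist]
    if not candidates:
        return None
    return min(candidates, key=lambda w: abs(w.timestamp - image_ts))
```

The associated pair would then be serialized into the specific format and sent onward; choosing the closest-in-time candidate is one reasonable tie-breaking policy among several the text leaves open.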
The camera is a video camera, a still camera or a pan-tilt camera for capturing images in a specific range, and can shoot images continuously or at specific time intervals. The meteorological sensor is a sensor for detecting visibility, illuminance, wind speed, wind direction, PM2.5, temperature, humidity, atmospheric pressure, ultraviolet rays or rainfall, and can specifically be a visibility instrument, an illuminance sensor, a sensor for detecting PM2.5 or an automatic meteorological station. The positioning device can have the functions of a Beidou satellite positioning system or a GPS positioning system. The connection between the second wireless communication device and the first wireless communication device is a wired connection, a WiFi network connection or a cellular connection. Data are transmitted among the components in the fusion acquisition unit through buses or serial communication such as RS232 and RS485 interfaces.
The second controller can parse and process the weather-format data sent by the automatic weather station or other weather acquisition systems, while controlling the sensor state, the camera state, and the working modes of the second wireless communication device, the positioning device and the timing device. The second controller converts the meteorological parameters collected by the meteorological sensors into a set of meteorological data in a specific format, processes the image information captured by the camera into a required format such as jpg or bmp, converts the time information provided by the timing device into a time identifier and the positioning information provided by the positioning device into a position identifier, and then converts the meteorological data and the image data into the specific format according to the time identifier and the position identifier.
The processing functions of the second controller may also be performed on a server unit, which may be configured to collect weather data from the weather collection device and to collect positioning and image data from the intelligent terminal at specific periods. In the server unit, images of the intelligent terminals in the monitoring range of each weather collecting device are accumulated, correlated and stored in a database.
An image storage method including meteorological parameters as shown in fig. 2, comprising the steps of:
1) collecting data;
acquiring images and meteorological parameters of the same environmental scene, and storing data in a storage device according to a position identifier and a time identifier at a meteorological acquisition point;
2) meteorological data processing and correlation;
the second controller associates the image data with the meteorological data according to the position identifier acquired by the positioning device and the time identifier generated by the timing device, generates an image containing meteorological parameters, and transmits the generated image to the server unit in the specific format to be stored in a database of the server unit; the specific format comprises a file header and a file body, the file header comprises a file registration code and file image parameters, and the file image parameters comprise an image format, an image size, an image channel, a reserved information bit, a time identification bit and a position identification; the file body comprises meteorological data and image data;
3) processing and classifying images;
3.1) standardizing the meteorological data and then obtaining the weather data feature F_wea using a fully-connected network;
3.2) constructing an adaptive convolutional neural network using the weather data feature F_wea obtained in step 3.1);
3.3) extracting the remote sensing image feature F_rgb using the adaptive convolutional neural network constructed in step 3.2), and classifying it with a SoftMax classifier;
3.4) training and testing the adaptive convolutional neural network, and classifying the remote sensing images with the trained adaptive convolutional neural network;
4) storing;
images generated in a particular format containing meteorological parameters are accumulated in a database.
The specific steps by which the utility model realizes data association are as follows:
step 3.1, extracting weather data characteristics by using a full-connection network;
as shown in fig. 5, the collected weather conditions of the present invention are 34 types, as follows:
Thus, the initial weather feature is a 34-dimensional vector, with each element of the vector being either 1 or 0, representing the presence or absence of that kind of weather. Because there is very strong correlation between the various kinds of weather, the utility model inputs the initial weather feature into a fully-connected network to obtain the final weather feature. Let the initial weather feature vector be x^0 ∈ R^34 (R represents the rational numbers; x^0 is a 34-dimensional rational vector); the fully-connected network has L layers, and the process from layer l to layer l+1 is:

x^{l+1} = sigmoid(W^{l+1} x^l + b^{l+1})    (1)

wherein W^{l+1} is the weight of layer l+1, a random initialization value; b^{l+1} is the bias vector of layer l+1, a random initialization value; x^l is the output of layer l; x^{l+1} is the output of layer l+1; sigmoid refers to an activation function.
Repeating the above process L times yields the output x^{L+1} of layer L+1, which is recorded as the final network output F_wea; the parameters W and b are random initialization values.
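The fully-connected forward pass of formula (1) can be sketched in NumPy as below. The 34-dimensional 0/1 input follows the description above; the two hidden layer sizes and the weight scale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_fc(layer_sizes):
    """Randomly initialised weights W and biases b for each layer,
    matching the random initialization values in the description."""
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.standard_normal((n_out, n_in)) * 0.1
        b = rng.standard_normal(n_out) * 0.1
        params.append((W, b))
    return params

def fc_forward(x, params):
    """Repeated application of formula (1): x^{l+1} = sigmoid(W^{l+1} x^l + b^{l+1})."""
    for W, b in params:
        x = sigmoid(W @ x + b)
    return x  # final output: the weather data feature F_wea

# 34-dimensional 0/1 indicator of the observed weather conditions
x0 = np.zeros(34)
x0[[2, 7]] = 1.0                     # e.g. two weather types present
params = init_fc([34, 64, 32])       # illustrative layer sizes
F_wea = fc_forward(x0, params)
```

In training, these randomly initialised parameters would be updated by error back-propagation as step 3.4 describes.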
3.2, constructing an adaptive convolutional neural network by using the weather data characteristics generated in the step 3.1;
as shown in FIG. 6, the convolutional neural network is a multi-layer network structure, each layer is composed of convolution, activation and pooling, and the related parameter is a convolution kernel WconvAnd a weight bias bconvThe initial values of these parameters are randomly generated, and the convolution kernel parameters of the l layer of the original convolution neural network are set as
It is possible to generate a weather signature F by using the weather signature F generated in step 1weaTo weight the convolution kernel parameters to obtain new convolution kernel parametersThe process is as follows:
wherein, WtransferIs a transformation matrix, since here FweaIs generally equal toAre different, while the subsequent element multiplication operation requires both dimensions to be the same, so the transformation matrix and reshape function are introduced here,the purpose that the two dimensions are the same is achieved through the combined action; reshape is a deformation function, and the transformation matrix and the deformation function have the effect of transforming the weather feature vector FweaIs converted into andis determined by the dimension of the matrix of the object,obtained by multiplication of a representative element, formula (2)Is the original convolution kernel parameterCompared with the original convolution kernel, the new convolution kernel can effectively combine with weather information to extract more semantic features in the image;
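Formula (2) — projecting F_wea with a transformation matrix, reshaping the result to the kernel's dimensions, and weighting element-wise — might be sketched as follows; the kernel and feature shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def adapt_kernel(W_conv, F_wea, W_transfer):
    """Formula (2): W_hat = W_conv ⊙ reshape(W_transfer · F_wea).
    W_transfer maps the weather feature to exactly as many elements
    as the kernel has, so element-wise multiplication is well defined."""
    w = W_transfer @ F_wea                 # shape (W_conv.size,)
    return W_conv * w.reshape(W_conv.shape)

# illustrative shapes: 16 filters of size 3x3 over 8 input channels
W_conv = rng.standard_normal((16, 8, 3, 3))
F_wea = rng.random(32)                     # weather feature from step 3.1
W_transfer = rng.standard_normal((W_conv.size, F_wea.size)) * 0.1
W_hat = adapt_kernel(W_conv, F_wea, W_transfer)
```

W_transfer is itself a learnable parameter here, trained jointly with the rest of the network.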
3.3, extracting the depth characteristics of the remote sensing image by using the self-adaptive convolutional neural network constructed in the step 3.2;
as shown in fig. 7, the adaptive convolutional neural network is a multi-layer network structure, each layer is composed of convolution, activation and pooling, and the calculation from the l-th layer to the l + 1-th layer can be obtained by the following formula:
wherein equation (3) represents convolution operation, equation (4) represents activation operation, and equation (5) represents pooling operation; in the formula (3), the first and second groups,is output for the convolution operation in layer l +1,indicating the kth filter in the l +1 th layer,indicating the weight bias of the kth filter in the l +1 th layer,represents the output of the l-th layer; in the formula (4), the first and second groups,represents the activation operation output in the l +1 th layer, and max refers to the maximum value operation; in the formula (5), Zl+1Representing the overall output of the l +1 th layer, posing refers to pooling operation, because the finally obtained image feature should be a feature vector, so the pooling operation of the last layer of the convolutional neural network in the present invention is global average pooling (global average pooling).
The first layer input of the convolutional neural network is an RGB image IrgbThus Z is1=IrgbThe convolution kernel parameter of the adaptive layer l is obtained in step 2Through forward propagation layer by layer (L layers), the output Z of the last layer of network is obtainedL+1And finally, the output is recorded as the characteristic F of the remote sensing imagergbAnd classifying the features by using a SoftMax classifier so as to achieve the purpose of classifying the remote sensing images.
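A single layer of formulas (3)-(5) — valid convolution with the adapted kernel, the max(0, ·) activation, and pooling (global average pooling at the last layer) — can be sketched as below. The loop-based convolution, the tensor shapes, and the stride-2 subsampling used as the intermediate pooling stand-in are illustrative assumptions.

```python
import numpy as np

def conv2d(Z, W, b):
    """Formula (3): valid convolution of input Z (C,H,W) with
    K filters W (K,C,kh,kw) plus a per-filter bias b (K,)."""
    C, H, Wd = Z.shape
    K, _, kh, kw = W.shape
    out = np.zeros((K, H - kh + 1, Wd - kw + 1))
    for k in range(K):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[k, i, j] = np.sum(W[k] * Z[:, i:i + kh, j:j + kw]) + b[k]
    return out

def layer(Z, W_hat, b, last=False):
    C = conv2d(Z, W_hat, b)            # formula (3)
    A = np.maximum(0.0, C)             # formula (4): max(0, ·)
    if last:                           # formula (5): global average pooling
        return A.mean(axis=(1, 2))     # yields the feature vector F_rgb
    return A[:, ::2, ::2]              # simple stride-2 pooling stand-in

rng = np.random.default_rng(2)
Z1 = rng.random((3, 8, 8))                         # RGB image I_rgb, so Z^1 = I_rgb
W_hat = rng.standard_normal((4, 3, 3, 3)) * 0.1    # adapted kernels from formula (2)
b = np.zeros(4)
F_rgb = layer(Z1, W_hat, b, last=True)             # one-layer network for brevity
```

Stacking L such layers and feeding F_rgb to a SoftMax classifier completes the forward pass the description outlines.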
Step 3.4, training and testing the adaptive convolutional neural network, and classifying the remote sensing images by using the trained network; each acquired scene image has corresponding weather data and scene category labels, and the acquired data is divided into two parts which are respectively a training set and a testing set;
(3.4a) training: the network comprises two sub-network modules in total, namely the fully-connected network module of step 3.1 and the adaptive convolutional neural network module of step 3.2; the parameters of the two modules are trained on the data set acquired by the utility model using the error back-propagation algorithm, with the data set serving as the training set;
(3.4b) testing: the pictures in the test set and the corresponding weather data are input into the trained overall network, and the classification accuracy of the overall network is calculated from the difference between the predicted classification and the actual classification; with R the number of correctly classified images and R_total the number of samples in the test set, accuracy represents the classification accuracy:

accuracy = R / R_total × 100%    (6)

(3.4c) classification: an arbitrary remote sensing image and the weather data corresponding to the image are input into the network, which outputs the remote sensing scene category corresponding to the image.
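The accuracy metric of formula (6) is a plain proportion of correct predictions; a one-function sketch (the label lists are illustrative):

```python
def accuracy(pred, actual):
    """Formula (6): accuracy = R / R_total × 100%, where R is the
    number of correctly classified test images and R_total the
    number of samples in the test set."""
    R = sum(p == a for p, a in zip(pred, actual))
    return R / len(actual) * 100.0
```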
As shown in FIG. 3, the utility model comprises a structure chart of the ZCP format for storing images with meteorological parameters. The ZCP storage file format comprises header information, image data captured by the camera and stored in an image file format, and meteorological data recording the environmental meteorological parameters, shooting position information and time information at the time the image was captured. Specifically, the format comprises a file header and a file body. The file header is 100 bytes, divided into a file registration code (bytes 0-19) and parameters of the image in the file, comprising the image format, image size and image channels (bytes 20-31), reserved information bits (bytes 32-51), a time identifier (bytes 52-59) and a position identifier (bytes 60-67). The file body is divided into meteorological data and image data: the meteorological data comprise visibility, temperature, humidity, wind speed, wind direction, illuminance, atmospheric pressure and the like (bytes 68-99), and the image data are stored in binary format (from byte 100 to the end of the file).
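The byte layout above can be sketched with Python's `struct`. The byte ranges follow the description; the field encodings (ASCII registration code, packed image parameters, an epoch-seconds time identifier, fixed-point lat/lon position, and the particular eight weather floats chosen) are illustrative assumptions, since the text fixes only the ranges, not the encodings.

```python
import struct

def pack_zcp(reg_code, img_format, width, height, channels,
             timestamp, lat, lon, weather, image_bytes):
    """Pack one ZCP record: a 100-byte header region followed by
    binary image data (byte 100 to the end of the file)."""
    header = struct.pack("20s", reg_code.encode("ascii"))           # bytes 0-19: registration code
    header += struct.pack("<4sHHI", img_format.encode("ascii"),
                          width, height, channels)                  # bytes 20-31: image parameters
    header += bytes(20)                                             # bytes 32-51: reserved
    header += struct.pack("<d", timestamp)                          # bytes 52-59: time identifier
    header += struct.pack("<ii", int(lat * 1e6), int(lon * 1e6))    # bytes 60-67: position identifier
    header += struct.pack("<8f", *weather)                          # bytes 68-99: weather data
    assert len(header) == 100
    return header + image_bytes

# visibility, temperature, humidity, wind speed, wind direction,
# illuminance, atmospheric pressure, and one spare slot
weather = (10000.0, 21.5, 0.62, 3.4, 270.0, 50000.0, 101325.0, 0.0)
blob = pack_zcp("ZCP0", "jpg\x00", 640, 480, 3,
                1.6e9, 34.233, 108.911, weather, b"\xff\xd8")
```

Unpacking reverses the same `struct` formats over the fixed byte ranges, which is what makes the fixed 100-byte header convenient for the server unit.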
The program of the present invention may be stored and provided to a computer using any type of non-transitory computer readable medium, and the program may be stored in a carrier comprising one of an electronic signal, an optical signal, a radio signal, and a computer readable storage medium. Non-transitory computer readable media include any type of tangible storage media: magnetic storage media (e.g., floppy disks, magnetic tapes, hard drives), magneto-optical storage media (e.g., magneto-optical disks), compact disc read-only memories (CD-ROMs), CD-R, CD-R/W, and semiconductor memories (e.g., mask read-only memories, programmable read-only memories (PROMs), erasable programmable read-only memories (EPROMs), flash read-only memories, random access memories (RAMs)). Programs may also be provided to a computer using any type of transitory computer-readable medium, including electrical signals, optical signals, and electromagnetic waves, which may deliver the programs to the computer via wired (e.g., electrical and optical) or wireless communication lines.
The meteorological parameters calculated for the scene are accumulated in a weather database and associated with the captured images; each image is then combined with its meteorological parameters, together with the time and position identifications, into the specific format and stored for later viewing or processing. The system includes a weather collection device comprising a controller with instruction storage; when the stored instructions are executed, the device collects data from a plurality of sensors in real time, so that the meteorological parameters can be processed into a time-identified set of weather data, and the controller packages the collected information into data in the specific format. The communication device is configured to communicate with the server unit to transfer the formatted data and instructions over any one of near field communication (NFC), Bluetooth, radio frequency identification (RFID), or WiFi connections.
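The collection step above — polling several sensors and tagging the readings with a single acquisition time — can be sketched as follows. The record type, function names, and stand-in sensor callables are assumptions for illustration; the patent only states that sensor data are grouped into a time-identified set of weather data.

```python
import time
from dataclasses import dataclass, field

@dataclass
class WeatherRecord:
    # Hypothetical record type: one acquisition time shared by all readings.
    timestamp: float
    readings: dict = field(default_factory=dict)

def collect(sensors: dict) -> WeatherRecord:
    """Poll every sensor once and return a time-identified set of readings."""
    record = WeatherRecord(timestamp=time.time())
    for name, read_fn in sensors.items():
        record.readings[name] = read_fn()
    return record

# Example with stand-in sensor callables (real sensors would be driver calls):
sensors = {
    "visibility_m": lambda: 9500.0,
    "temperature_c": lambda: 21.3,
    "humidity_pct": lambda: 48.0,
}
rec = collect(sensors)
```

Tagging the whole set with one timestamp, rather than timestamping each sensor separately, matches the described behavior of associating a single time identification with the meteorological data stored alongside each image.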
In one embodiment, the fusion acquisition system of the utility model is applied to a vehicle-mounted device together with an intelligent terminal. The vehicle-mounted acquisition device can monitor visibility, temperature, humidity, wind speed, wind direction, and the like at any time, and the intelligent terminal combines the images captured by the user with the meteorological parameters, so that the user can grasp the information of the shooting scene more comprehensively and further process the images.

Claims (5)

1. An image fusion acquisition system containing meteorological parameters is characterized in that: comprises a fusion acquisition unit and a server unit;
the server unit comprises a first controller, a first wireless communication device and at least one database, wherein the first wireless communication device and the database are respectively connected with the first controller;
the fusion acquisition unit comprises a second controller, a second wireless communication device, a storage device, a camera, a positioning device, a timing device and a plurality of meteorological sensors, wherein the second wireless communication device, the storage device, the camera, the positioning device, the timing device and the meteorological sensors are respectively connected with the second controller; the second wireless communication device communicates with the first wireless communication device;
the database stores images that are in the specific format and include meteorological information; the second controller sends the data to the server unit in the specific format, and the data are stored in the database of the server unit;
the specific format comprises a file header and a file body, the file header comprises a file registration code and file image parameters, and the file image parameters comprise an image format, an image size, an image channel, a reserved information bit, a time identification bit and a position identification; the file body includes weather data and image data.
2. The image fusion acquisition system including meteorological parameters according to claim 1, wherein: the camera is a still camera, a video camera, or a pan-tilt camera for capturing images.
3. The image fusion acquisition system containing meteorological parameters according to claim 1 or 2, wherein: the weather sensors include visibility instruments, illuminance sensors, sensors for detecting PM2.5, and/or automated weather stations.
4. The image fusion acquisition system including meteorological parameters according to claim 3, wherein: the positioning device comprises a Beidou satellite positioning or GPS positioning system.
5. The image fusion acquisition system including meteorological parameters according to claim 4, wherein: the second wireless communication device and the first wireless communication device are in wired connection, WiFi network connection or cellular connection.
CN201820313952.2U 2018-03-07 2018-03-07 Image fusion acquisition system containing meteorological parameters Active CN208335208U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201820313952.2U CN208335208U (en) 2018-03-07 2018-03-07 Image fusion acquisition system containing meteorological parameters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201820313952.2U CN208335208U (en) 2018-03-07 2018-03-07 Image fusion acquisition system containing meteorological parameters

Publications (1)

Publication Number Publication Date
CN208335208U true CN208335208U (en) 2019-01-04

Family

ID=64782675

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201820313952.2U Active CN208335208U (en) 2018-03-07 2018-03-07 Image fusion acquisition system containing meteorological parameters

Country Status (1)

Country Link
CN (1) CN208335208U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111209980A (en) * 2019-12-25 2020-05-29 深圳供电局有限公司 Environment detection method and device, electronic equipment and computer readable storage medium
CN111209980B (en) * 2019-12-25 2024-02-09 深圳供电局有限公司 Environment detection method and device, electronic equipment and computer readable storage medium
CN112649900A (en) * 2020-11-27 2021-04-13 上海眼控科技股份有限公司 Visibility monitoring method, device, equipment, system and medium
CN115290526A (en) * 2022-09-29 2022-11-04 南通炜秀环境技术服务有限公司 Air pollutant concentration detection method based on data analysis

Similar Documents

Publication Publication Date Title
CN108537122B (en) Image fusion acquisition system containing meteorological parameters and image storage method
CN107862705B (en) Unmanned aerial vehicle small target detection method based on motion characteristics and deep learning characteristics
CN112990262B (en) Integrated solution system for monitoring and intelligent decision of grassland ecological data
CN108109385B (en) System and method for identifying and judging dangerous behaviors of power transmission line anti-external damage vehicle
CN208335208U (en) Image fusion acquisition system containing meteorological parameters
CN106682592B (en) Image automatic identification system and method based on neural network method
CN111582234B (en) Large-scale oil tea tree forest fruit intelligent detection and counting method based on UAV and deep learning
CN109116298B (en) Positioning method, storage medium and positioning system
CN109444912B (en) Driving environment sensing system and method based on cooperative control and deep learning
CN111458721B (en) Exposed garbage identification and positioning method, device and system
CN111444801A (en) Real-time detection method for infrared target of unmanned aerial vehicle
CN109977908B (en) Vehicle driving lane detection method based on deep learning
CN111474955B (en) Identification method, device and equipment for unmanned aerial vehicle graph signaling system and storage medium
CN112613438A (en) Portable online citrus yield measuring instrument
CN112633120A (en) Intelligent roadside sensing system based on semi-supervised learning and model training method
CN112037252A (en) Eagle eye vision-based target tracking method and system
CN114898238A (en) Wild animal remote sensing identification method and device
CN116448773A (en) Pavement disease detection method and system with image-vibration characteristics fused
CN106453523A (en) Intelligent weather identification system and method
EP3698336A1 (en) Intrusion detection methods and devices
CN116258940A (en) Small target detection method for multi-scale features and self-adaptive weights
CN118097463A (en) Lodging area identification method and system based on crop remote sensing image
CN117423077A (en) BEV perception model, construction method, device, equipment, vehicle and storage medium
CN112966780A (en) Animal behavior identification method and system
CN202887397U (en) Geological environment disaster video monitor

Legal Events

Date Code Title Description
GR01 Patent grant