CN116016805A - Data processing method, device, electronic equipment and storage medium - Google Patents

Data processing method, device, electronic equipment and storage medium

Info

Publication number
CN116016805A
Authority
CN
China
Prior art keywords
monitoring
data
sensor
data obtained
image
Prior art date
Legal status
Granted
Application number
CN202310302059.5A
Other languages
Chinese (zh)
Other versions
CN116016805B (en)
Inventor
陈友明
陈思竹
樊荣
Current Assignee
Sichuan Honghe Communication Group Co ltd
Original Assignee
Sichuan Honghe Communication Group Co ltd
Priority date
Filing date
Publication date
Application filed by Sichuan Honghe Communication Group Co ltd
Priority to CN202310302059.5A
Publication of CN116016805A
Application granted
Publication of CN116016805B
Legal status: Active

Landscapes

  • Closed-Circuit Television Systems (AREA)

Abstract

A data processing method, a data processing apparatus, an electronic device and a storage medium relate to the field of computer technology. The data processing method comprises the following steps: acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor; and acquiring, from the acquired image data and/or video data, the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor. With the data processing method, apparatus, electronic device and storage medium, the monitoring data obtained by a monitoring device or the sensing data obtained by a sensor can be acquired even when the monitoring device does not provide its raw monitoring data or the sensor does not provide its raw sensing data.

Description

Data processing method, device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a data processing method, a data processing device, an electronic device, and a storage medium.
Background
Many managed areas are subject to management requirements under different standards and are therefore equipped with a number of management systems, security systems, monitoring systems and early-warning systems. For example, gas stations and oil depots are usually fitted with a large number of monitoring devices and various sensors for real-time management and risk early warning, thereby ensuring site safety. These monitoring devices and sensors may come from different manufacturers. Monitoring devices or sensors from different manufacturers may use different data protocols, and may even use data protocols developed by the manufacturers themselves. For confidentiality reasons, manufacturers are unwilling to disclose the details of their data protocols to third parties, which makes subsequent use of the data obtained by these monitoring devices and sensors difficult and may turn the data into islands of information.
Disclosure of Invention
In order to obtain data from monitoring devices or sensors of different manufacturers, the present application provides a data processing method, a data processing apparatus, an electronic device and a storage medium.
In a first aspect, the present application provides a data processing method, comprising the following steps: S01, acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor; S02, acquiring, from the acquired image data and/or video data, the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor.
In a second aspect, the present application provides a data processing apparatus, comprising: a first acquisition module, configured to acquire image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor; and a second acquisition module, configured to acquire, from the acquired image data and/or video data, the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor.
In a third aspect, the present application provides an electronic device comprising a memory, a processor and computer instructions stored in the memory, wherein the processor executes the computer instructions to implement the steps of the data processing method of the first aspect.
In a fourth aspect, the present application provides a computer storage medium storing computer software instructions for use with the first aspect or the second aspect described above, the instructions comprising a program designed to execute the above aspects.
With the data processing method, apparatus, electronic device and storage medium of the present application, the monitoring data obtained by a monitoring device or the sensing data obtained by a sensor can be acquired even when the monitoring device does not provide its raw monitoring data or the sensor does not provide its raw sensing data.
Drawings
Fig. 1 is a flow chart of a data processing method of the present application.
Detailed Description
The data processing method, apparatus, electronic device and storage medium of the present application are described in detail below with reference to the accompanying drawings.
A gas station is provided with multiple monitoring devices and various sensors for ensuring its safety. These monitoring devices and sensors may come from different manufacturers, may use different data protocols, and may even use data protocols developed by the manufacturers themselves. For confidentiality reasons, manufacturers are unwilling to disclose the details of these data protocols to third parties.
In a practical application scenario, the monitoring device may not provide the raw monitoring data it obtains to its user and may only display that data. In this case, the user can only view the monitoring data through a display device connected to the monitoring device and cannot obtain the raw monitoring data directly from the monitoring device, so the raw monitoring data cannot be processed further or used effectively.
Similarly, the sensor may not provide the raw sensing data it obtains to its user and may only display the sensing data through a display device connected to the sensor. In this case, the user cannot obtain the raw sensing data directly from the sensor and therefore cannot perform subsequent processing on it or make effective use of it. For example, a site may be fitted with a thermometer that consists of a temperature sensor and a display screen connected to the temperature sensor, where the display screen shows in real time the temperature value obtained by the temperature sensor. The user of the thermometer may be unable to obtain that temperature value directly and thus cannot further process it or make effective use of it.
The application provides a data processing method, a data processing device, electronic equipment and a storage medium. By using the data processing method, the data processing device, the electronic equipment and the storage medium, the monitoring data obtained by the monitoring equipment and/or the sensing data obtained by the sensor can be obtained.
Fig. 1 is a flow chart of a data processing method of the present application. As shown in fig. 1, the method comprises the steps of:
s01, acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor; s02, acquiring the monitoring data acquired by the monitoring equipment and/or the sensing data acquired by the sensor from the acquired image data and/or video data containing the monitoring data acquired by the monitoring equipment and/or the sensing data acquired by the sensor.
Acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor may be done as follows: use a camera to film the display device that displays the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor, and read the image data and/or video data captured by the camera. For example, for a thermometer comprising a temperature sensor and a display screen connected to the temperature sensor, a camera may be used to film the display screen, which shows in real time the temperature value obtained by the temperature sensor, and the video data captured by the camera may then be read.
In a practical application scenario, the video data captured by a camera may need to be read in real time. If the number of cameras whose video data must be read in real time is large, and/or the resolution of the captured video is high (for example, 4096×3112 pixels), and/or the frame rate of the captured video is high (for example, 120 frames per second), the amount of video data to be read in real time is large. Reading may then fail to keep up, so that the video data read at a given moment is actually video captured by the camera several seconds earlier, and real-time performance is poor.
To solve this problem, the video data captured by the camera may be read using the following steps:
step S0111, a first process is created;
step S0112, creating a second process;
step S0113, a first queue is created;
step S0114, the first process reads a frame of the video stream captured by the camera from the camera; step S0115 is then executed;
step S0115, the first process checks whether the first queue contains any image; if it does, all images in the first queue are deleted, the frame just read from the camera is put into the first queue, and step S0114 is executed; if it does not, the frame just read from the camera is put into the first queue, and step S0114 is executed;
step S0116, the second process reads the image in the first queue from the first queue, and then step S0116 is executed again.
The time interval between two successive executions of step S0114 is adjusted according to the actual situation, so that if a certain frame of the video stream captured by the camera is read when step S0114 is executed one time, the frame immediately following it is read when step S0114 is executed the next time.
Alternatively, the video data captured by the camera may be read using the following steps:
step S0121, a first process is created;
step S0122, creating a second process;
step S0123, a first queue is created;
step S0124, the first process reads a frame of the video stream captured by the camera from the camera; step S0125 is then executed;
step S0125, the first process checks whether the first queue contains any image; if it does, all images in the first queue are deleted, the frame just read from the camera is put into the first queue, and step S0124 is executed; if it does not, the frame just read from the camera is put into the first queue, and step S0124 is executed;
step S0126, the second process reads the image in the first queue from the first queue, then deletes that image from the first queue, and then step S0126 is executed again.
The time interval between two successive executions of step S0124 is adjusted according to the actual situation, so that if a certain frame of the video stream captured by the camera is read when step S0124 is executed one time, the frame immediately following it is read when step S0124 is executed the next time.
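For illustration, a minimal Python sketch of this producer-consumer arrangement (following the second variant, steps S0121 to S0126) is given below. It assumes that OpenCV (cv2) is used to read the camera's video stream and that the camera is reachable at a hypothetical RTSP address; the first process keeps only the newest frame in the queue, and the second process takes (and thereby removes) that frame for further processing.

    import multiprocessing as mp
    import queue

    import cv2


    def first_process(camera_url, first_queue):
        # Steps S0124/S0125: read frames and keep only the newest one in the queue.
        capture = cv2.VideoCapture(camera_url)
        while True:
            ok, frame = capture.read()          # read one frame from the camera
            if not ok:
                break
            try:                                # if the queue already holds an image,
                while True:                     # delete it so only the newest frame remains
                    first_queue.get_nowait()
            except queue.Empty:
                pass
            first_queue.put(frame)              # put the newly read frame into the queue
        capture.release()


    def second_process(first_queue):
        # Step S0126: read (and thereby remove) the image from the queue, then repeat.
        while True:
            frame = first_queue.get()
            # ... character detection / recognition on `frame` would go here ...
            print("received frame with shape", frame.shape)


    if __name__ == "__main__":
        camera_url = "rtsp://192.0.2.10:554/stream1"    # hypothetical camera address
        first_queue = mp.Queue(maxsize=1)               # the "first queue"
        producer = mp.Process(target=first_process, args=(camera_url, first_queue))
        consumer = mp.Process(target=second_process, args=(first_queue,))
        producer.start()
        consumer.start()
        producer.join()
        consumer.terminate()

Because the producer discards any stale image before putting in the newest one, the consumer always sees the most recent frame rather than a backlog of old frames, which preserves real-time behavior even when the consumer is slower than the camera.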
The following method may also be employed to obtain image data and/or video data containing monitoring data obtained by the monitoring device and/or sensing data obtained by the sensor: obtain the image data and/or video data from the data transmission line between the monitoring device and/or the sensor and the display device connected to it. For example, the connector of the monitoring device that was originally connected to the display may instead be connected to the input of a video distributor, which has one input and at least two outputs; one output of the video distributor is connected to the display and another output is connected to a video capture card. Image data and/or video data containing the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor can then be obtained through the video capture card.
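By way of illustration only, the snippet below shows how one frame might be read from such a capture card; it assumes the card is exposed to the operating system as local video device 0 (a placeholder index) and that OpenCV is used.

    import cv2

    capture = cv2.VideoCapture(0)       # assumption: the capture card appears as video device 0
    ok, frame = capture.read()          # one frame of the mirrored display output
    if ok:
        cv2.imwrite("capture_card_frame.png", frame)
    capture.release()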
Acquiring the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor from the obtained image data and/or video data may comprise the following steps: step S021, detecting whether the obtained image data and/or video data contain characters; if characters are detected, step S023 is executed; step S023, recognizing the characters contained in the obtained image data and/or video data. When step S021 is executed and characters are detected, step S022 may also be executed before step S023: determining the positions of the characters contained in the obtained image data and/or video data. The position of a character may be its position within the obtained image data and/or video data.
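A minimal sketch of steps S021 to S023 is given below. It assumes that a trained detection model (described next) is available as a callable returning character regions, and that pytesseract is used as the character recognizer; the application itself does not prescribe a particular recognition engine.

    import cv2
    import pytesseract  # assumed OCR engine; the application does not name a specific one


    def extract_characters(image, detection_model):
        # Steps S021/S022: the detection model reports whether characters are present
        # and where they are, as a list of (x, y, w, h) boxes.
        boxes = detection_model(image)
        readings = []
        for (x, y, w, h) in boxes:
            crop = image[y:y + h, x:x + w]
            # Step S023: recognize the characters inside each detected region.
            text = pytesseract.image_to_string(crop, config="--psm 7")  # treat as one text line
            readings.append(text.strip())
        return readings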
Detecting whether the obtained image data and/or video data contain characters and, if they do, determining the positions of those characters may be done as follows: establish a detection model; obtain a specified number of images containing monitoring data obtained by the monitoring device and/or sensing data obtained by the sensor; label the obtained images to form a training set; input the images of the training set into the established detection model and train it; when the loss function value of the detection model is less than or equal to a preset loss-function threshold, the trained detection model is obtained; input an image to be detected into the trained detection model to determine whether it contains characters and, if it does, the positions of the characters it contains. Since video data is essentially a collection of images arranged at a certain frame rate, this method is also suitable for detecting characters in the obtained video data.
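A compact training-loop sketch is shown below. It assumes a PyTorch implementation in which the detection model, the labeled training set and the loss function are supplied by the caller in the form described above; none of these names come from the original text, and the batch size, learning rate and epoch limit are placeholder values.

    import torch
    from torch.utils.data import DataLoader


    def train_detection_model(model, train_set, loss_fn, loss_threshold=0.05,
                              learning_rate=1e-3, max_epochs=100):
        # Train the established detection model on the labeled training set until
        # its loss function value is less than or equal to the preset threshold.
        loader = DataLoader(train_set, batch_size=8, shuffle=True)
        optimizer = torch.optim.Adam(model.parameters(), lr=learning_rate)
        for _ in range(max_epochs):
            epoch_loss = 0.0
            for images, targets in loader:
                optimizer.zero_grad()
                predictions = model(images)
                loss = loss_fn(predictions, targets)   # loss function of the detection model
                loss.backward()
                optimizer.step()
                epoch_loss += loss.item()
            epoch_loss /= len(loader)
            if epoch_loss <= loss_threshold:           # training is complete
                break
        return model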
The building of the detection model may comprise the step of setting a target detection box.
The loss function value of the detection model may be calculated using the following formula:
L = -\frac{1}{N}\sum_{i=1}^{N}\bigl[y_i\ln p_i + (1 - y_i)\ln(1 - p_i)\bigr] + \frac{1}{M}\sum_{i=1}^{M}\bigl(h_i - \hat{h}_i\bigr)^2
wherein L is the loss function of the detection model; N is the number of target detection boxes; y_i is the label value indicating whether the i-th target detection box contains characters: if the i-th target detection box contains characters, y_i = 1, and if it contains no characters, y_i = 0; e is the constant serving as the base of the logarithm (the logarithms above are taken to base e); p_i is the value predicted by the detection model for whether the i-th target detection box contains characters: if the detection model predicts that the i-th target detection box contains characters, p_i = 1, and if it predicts that it contains no characters, p_i = 0; M is the number of target detection boxes, among the N target detection boxes, that the detection model identifies as containing characters; h_i is the height label value of the i-th target detection box; and ĥ_i is the height value predicted by the detection model for the i-th target detection box.
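The following sketch evaluates a loss of the form written above (a cross-entropy term over the N target detection boxes plus a squared height error over the M boxes identified as containing characters); the equal weighting of the two terms and the small constant added for numerical stability are assumptions, not taken from the original text.

    import numpy as np


    def detection_loss(y, p, h_label, h_pred, eps=1e-7):
        # y: 0/1 labels for whether each of the N target detection boxes contains characters
        # p: values predicted by the detection model for the same boxes
        # h_label, h_pred: height label values and predicted heights of the M boxes
        #                  identified by the model as containing characters
        y = np.asarray(y, dtype=float)
        p = np.clip(np.asarray(p, dtype=float), eps, 1.0 - eps)  # keep the logarithm finite
        cross_entropy = -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
        h_label = np.asarray(h_label, dtype=float)
        h_pred = np.asarray(h_pred, dtype=float)
        height_error = np.mean((h_label - h_pred) ** 2) if h_label.size else 0.0
        return cross_entropy + height_error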
The present application also provides a data processing apparatus, comprising: a first acquisition module, configured to acquire image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor; and a second acquisition module, configured to acquire, from the acquired image data and/or video data, the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor.
The first acquisition module may be specifically configured to: use a camera to film the display device that displays the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor, and read the image data and/or video data captured by the camera.
The first acquisition module may also be specifically configured to: obtain image data and/or video data containing monitoring data obtained by the monitoring device and/or sensing data obtained by the sensor from the data transmission line between the monitoring device and/or the sensor and the display device connected to it.
The second acquisition module may be specifically configured to: detect whether the obtained image data and/or video data contain characters; if they do, determine the positions of the characters contained in the obtained image data and/or video data; and recognize those characters.
The second acquisition module may also be specifically configured to: establish a detection model; obtain a specified number of images containing monitoring data obtained by the monitoring device and/or sensing data obtained by the sensor; label the obtained images to form a training set; input the images of the training set into the established detection model and train it; when the loss function value of the detection model is less than or equal to a preset loss-function threshold, obtain the trained detection model; and input an image to be detected into the trained detection model to determine whether it contains characters and, if it does, the positions of the characters it contains.
The establishing of the detection model may include setting a target detection box.
The loss function value of the detection model may be calculated using the following formula:
L = -\frac{1}{N}\sum_{i=1}^{N}\bigl[y_i\ln p_i + (1 - y_i)\ln(1 - p_i)\bigr] + \frac{1}{M}\sum_{i=1}^{M}\bigl(h_i - \hat{h}_i\bigr)^2
wherein L is the loss function of the detection model; N is the number of target detection boxes; y_i is the label value indicating whether the i-th target detection box contains characters: if the i-th target detection box contains characters, y_i = 1, and if it contains no characters, y_i = 0; e is the constant serving as the base of the logarithm (the logarithms above are taken to base e); p_i is the value predicted by the detection model for whether the i-th target detection box contains characters: if the detection model predicts that the i-th target detection box contains characters, p_i = 1, and if it predicts that it contains no characters, p_i = 0; M is the number of target detection boxes, among the N target detection boxes, that the detection model identifies as containing characters; h_i is the height label value of the i-th target detection box; and ĥ_i is the height value predicted by the detection model for the i-th target detection box.
The present application also provides an electronic device, comprising a memory and a processor, the memory being connected to the processor, wherein computer instructions are stored in the memory and can be run on the processor to implement the steps of the data processing method described above.
The present application also provides a storage medium storing computer software instructions for use with the data processing method or the data processing apparatus of the present application, the instructions comprising a program designed to execute the data processing method of the present application.
A gas station is provided with multiple management systems, security systems, monitoring systems and early-warning systems. For various reasons, the manufacturers of these systems often do not provide data interfaces to third parties, so the systems are likely to become information islands. The data processing method, apparatus, electronic device and storage medium of the present application can acquire the information obtained or generated by each of these systems. In combination with a unified information management system, the information obtained or generated by each system can be managed in a unified way, so that a user can view the key information of every system on the same operation interface and grasp it quickly, and can also issue operation instructions to each system from that interface for cross-system management. This breaks the information islands formed by the individual systems and improves management efficiency. The unified information management system can, in combination with OCR (Optical Character Recognition) technology, implement unified single-screen management, that is, unified management of multiple different devices or systems on the same operation interface.

Claims (10)

1. A data processing method, comprising the steps of:
s01, acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor;
s02, acquiring the monitoring data acquired by the monitoring equipment and/or the sensing data acquired by the sensor from the acquired image data and/or video data containing the monitoring data acquired by the monitoring equipment and/or the sensing data acquired by the sensor.
2. The data processing method according to claim 1, wherein the specific method for acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor is:
using an image pickup device to film a display device that displays the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor, and reading the image data and/or video data obtained by the image pickup device.
3. The data processing method according to claim 2, wherein the reading of the video data captured by the image capturing device includes the steps of:
step S0111, a first process is created;
step S0112, creating a second process;
step S0113, a first queue is created;
step S0114, the first process reads a frame of the video stream captured by the image capturing device from the image capturing device; step S0115 is then executed;
step S0115, the first process judges whether the first queue contains an image; if it does, all images in the first queue are deleted, the frame read from the image capturing device is put into the first queue, and step S0114 is executed; if it does not, the frame read from the image capturing device is put into the first queue, and step S0114 is executed;
in step S0116, the second process reads the image in the first queue from the first queue, and then performs step S0116.
4. The data processing method according to claim 2, wherein the reading of the video data captured by the image capturing device includes the steps of:
step S0121, a first process is created;
step S0122, creating a second process;
step S0123, a first queue is created;
step S0124, the first process reads a frame of the video stream captured by the image capturing device from the image capturing device; step S0125 is then executed;
step S0125, the first process judges whether the first queue contains an image; if it does, all images in the first queue are deleted, the frame read from the image capturing device is put into the first queue, and step S0124 is executed; if it does not, the frame read from the image capturing device is put into the first queue, and step S0124 is executed;
in step S0126, the second process reads the image in the first queue from the first queue, then deletes the image in the first queue, and then executes step S0126.
5. The data processing method according to claim 1, wherein the specific method for acquiring image data and/or video data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor is:
image data and/or video data containing monitoring data obtained by the monitoring device and/or sensing data obtained by the sensor are obtained from a data transmission line between a display device connected with the monitoring device and/or the sensor and the monitoring device and/or the sensor.
6. The data processing method according to claim 1, wherein the specific method for acquiring the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor from the acquired image data and/or video data containing the monitoring data obtained by the monitoring device and/or the sensing data obtained by the sensor is as follows:
detecting whether the obtained image data and/or video data containing monitoring data obtained by the monitoring device and/or sensing data obtained by the sensor contain characters; if characters are contained, determining the positions of the characters contained in the obtained image data and/or video data; and identifying the characters contained in the obtained image data and/or video data;
the specific method for determining the positions of characters contained in the obtained image data and/or video data containing the monitoring data obtained by the monitoring equipment and/or the sensing data obtained by the sensor comprises the following steps:
establishing a detection model;
acquiring a specified number of image data containing monitoring data obtained by a monitoring device and/or sensing data obtained by a sensor;
marking the obtained image data containing the monitoring data obtained by the monitoring equipment and/or the sensing data obtained by the sensor in a specified quantity to obtain a training set;
inputting the image data in the training set into the established detection model, and training the established detection model;
when the loss function value of the detection model is smaller than or equal to a preset loss function threshold value, obtaining a detection model after training;
inputting image data to be detected containing monitoring data obtained by monitoring equipment and/or sensing data obtained by a sensor into a detection model after training is completed, determining whether the image data to be detected containing the monitoring data obtained by the monitoring equipment and/or the sensing data obtained by the sensor contains characters, and if the image data to be detected contains characters, determining the positions of the characters contained in the image data to be detected containing the monitoring data obtained by the monitoring equipment and/or the sensing data obtained by the sensor.
7. The data processing method of claim 6, wherein the loss function value of the detection model is calculated using the following formula:
L = -\frac{1}{N}\sum_{i=1}^{N}\bigl[y_i\ln p_i + (1 - y_i)\ln(1 - p_i)\bigr] + \frac{1}{M}\sum_{i=1}^{M}\bigl(h_i - \hat{h}_i\bigr)^2
wherein L is the loss function of the detection model; N is the number of target detection boxes; y_i is the label value indicating whether the i-th target detection box contains characters: if the i-th target detection box contains characters, y_i = 1, and if it contains no characters, y_i = 0; e is the constant serving as the base of the logarithm (the logarithms above are taken to base e); p_i is the value predicted by the detection model for whether the i-th target detection box contains characters: if the detection model predicts that the i-th target detection box contains characters, p_i = 1, and if it predicts that it contains no characters, p_i = 0; M is the number of target detection boxes, among the N target detection boxes, that the detection model identifies as containing characters; h_i is the height label value of the i-th target detection box; and ĥ_i is the height value predicted by the detection model for the i-th target detection box.
8. A data processing apparatus, comprising: the first acquisition module is used for acquiring image data and/or video data containing monitoring data obtained by the monitoring equipment and/or sensing data obtained by the sensor; the second acquisition module is used for acquiring the monitoring data acquired by the monitoring equipment and/or the sensing data acquired by the sensor from the acquired image data and/or video data containing the monitoring data acquired by the monitoring equipment and/or the sensing data acquired by the sensor.
9. An electronic device, comprising: a memory and a processor, the memory being coupled to the processor, the memory having stored therein computer instructions that are executed by the processor to implement the steps in the data processing method of any of claims 1-7.
10. A storage medium storing computer software instructions containing a program designed to perform the steps of the data processing method of any one of claims 1-7.
CN202310302059.5A 2023-03-27 2023-03-27 Data processing method, device, electronic equipment and storage medium Active CN116016805B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310302059.5A CN116016805B (en) 2023-03-27 2023-03-27 Data processing method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310302059.5A CN116016805B (en) 2023-03-27 2023-03-27 Data processing method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116016805A (en) 2023-04-25
CN116016805B (en) 2023-06-20

Family

ID=86025154

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310302059.5A Active CN116016805B (en) 2023-03-27 2023-03-27 Data processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116016805B (en)


Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103678916A (en) * 2013-12-12 2014-03-26 电子科技大学 Universal method and device for collecting custody data of custody instruments automatically
CN108256526A (en) * 2017-12-07 2018-07-06 上海理工大学 A kind of automobile license plate position finding and detection method based on machine vision
CN109490843A (en) * 2018-11-15 2019-03-19 成都傅立叶电子科技有限公司 A kind of normalization radar screen monitoring method and system
US20220227367A1 (en) * 2019-06-06 2022-07-21 Mobileye Vision Technologies Ltd. Systems and methods for vehicle navigation
CN110515544A (en) * 2019-08-06 2019-11-29 科华恒盛股份有限公司 The method and terminal device of data storage
CN111258774A (en) * 2020-01-07 2020-06-09 深圳壹账通智能科技有限公司 Flow processing method and device, computer equipment and storage medium
CN211350112U (en) * 2020-02-26 2020-08-25 深圳智信生物医疗科技有限公司 Automatic identification system of monitor
CN112434586A (en) * 2020-11-16 2021-03-02 中山大学 Multi-complex scene target detection method based on domain adaptive learning
CN112506676A (en) * 2020-12-02 2021-03-16 深圳市广和通无线股份有限公司 Inter-process data transmission method, computer device and storage medium
CN112819074A (en) * 2021-02-02 2021-05-18 上海明略人工智能(集团)有限公司 Loss function optimization method, device and equipment for target detection model
CN115393781A (en) * 2021-05-08 2022-11-25 华为技术有限公司 Video monitoring data processing method and device
CN113554022A (en) * 2021-06-07 2021-10-26 华北电力科学研究院有限责任公司 Automatic acquisition method and device for detection test data of power instrument
CN113886494A (en) * 2021-09-30 2022-01-04 完美世界(北京)软件科技发展有限公司 Message storage method, device, equipment and computer readable medium for instant messaging
CN113657385A (en) * 2021-10-20 2021-11-16 山东摄云信息技术有限公司 Data detection method and device of electronic metering device and electronic equipment
CN113989721A (en) * 2021-10-29 2022-01-28 北京百度网讯科技有限公司 Target detection method and training method and device of target detection model
CN113778723A (en) * 2021-11-11 2021-12-10 中汽数据(天津)有限公司 Data playback method, electronic device and readable storage medium
CN114581516A (en) * 2022-01-30 2022-06-03 天津大学 Monocular vision-based multi-unmanned aerial vehicle intelligent identification and relative positioning method
CN114842285A (en) * 2022-03-23 2022-08-02 超级视线科技有限公司 Roadside berth number identification method and device
CN115774796A (en) * 2022-07-29 2023-03-10 宁波星巡智能科技有限公司 Video data step-by-step storage method and device, electronic equipment and storage medium
CN115393682A (en) * 2022-08-17 2022-11-25 龙芯中科(南京)技术有限公司 Target detection method, target detection device, electronic device, and medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
C. Chen et al.: "Region Proposal Network with Graph Prior and IoU-Balance Loss for Landmark Detection in 3D Ultrasound", 2020 IEEE 17th International Symposium on Biomedical Imaging (ISBI)
X.-Y. Zhu, J. Zhang and G.-S. Chen: "Context Combined Cascaded Region Proposal Network with Dual Loss Function for Accurate Car Detection", 2018 14th IEEE International Conference on Solid-State and Integrated Circuit Technology (ICSICT)
张童, 谭南林, 包辰铭: "Real-time infrared pedestrian detection method applied to embedded platforms" (应用于嵌入式平台的实时红外行人检测方法), Laser & Infrared (激光与红外), no. 02
陈伟骏, 周长胜, 黄宏博, 彭帅, 崇美英: "A survey of object detection algorithms based on convolutional neural networks" (基于卷积神经网络的目标检测算法综述), Journal of Beijing Information Science & Technology University (Natural Science Edition), no. 02

Also Published As

Publication number Publication date
CN116016805B (en) 2023-06-20

Similar Documents

Publication Publication Date Title
CN110705405A (en) Target labeling method and device
CN108337505B (en) Information acquisition method and device
US20090097754A1 (en) Video communication device and image processing system and method of the same
CN103824064A (en) Huge-amount human face discovering and recognizing method
CN112669344A (en) Method and device for positioning moving object, electronic equipment and storage medium
CN111898581A (en) Animal detection method, device, electronic equipment and readable storage medium
CN113096158A (en) Moving object identification method and device, electronic equipment and readable storage medium
JP2011133984A (en) Motion feature extraction device and motion feature extraction method
CN112422909A (en) Video behavior analysis management system based on artificial intelligence
CN106506932A (en) The acquisition methods and device of image
WO2015069063A1 (en) Method and system for creating a camera refocus effect
JP6618349B2 (en) Video search system
CN112073713B (en) Video leakage test method, device, equipment and storage medium
CN116016805B (en) Data processing method, device, electronic equipment and storage medium
CN111753766A (en) Image processing method, device, equipment and medium
JPWO2020039897A1 (en) Station monitoring system and station monitoring method
CN108737733B (en) Information prompting method and device, electronic equipment and computer readable storage medium
KR101925799B1 (en) Computer program for preventing information spill displayed on display device and security service using the same
CN112802112B (en) Visual positioning method, device, server and storage medium
CN111507140A (en) Portrait comparison method, system, electronic equipment and readable storage medium
CN110213457B (en) Image transmission method and device
US11544833B1 (en) System and method for capturing by a device an image of a light colored object on a light colored background for uploading to a remote server
CN114095783A (en) Picture uploading method and device, computer equipment and storage medium
CN113887384A (en) Pedestrian trajectory analysis method, device, equipment and medium based on multi-trajectory fusion
CN113947795A (en) Mask wearing detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant