CN114359818A - Utilization rate analysis method and device, computer equipment and storage medium - Google Patents


Info

Publication number
CN114359818A
Authority
CN
China
Prior art keywords
time
image frame
utilization rate
video
equipment
Prior art date
Legal status
Granted
Application number
CN202210256376.3A
Other languages
Chinese (zh)
Other versions
CN114359818B (en)
Inventor
陈兴委
李嘉鑫
Current Assignee
Shenzhen Huafu Technology Co ltd
Original Assignee
Shenzhen Huafu Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huafu Information Technology Co ltd filed Critical Shenzhen Huafu Information Technology Co ltd
Priority to CN202210256376.3A priority Critical patent/CN114359818B/en
Publication of CN114359818A publication Critical patent/CN114359818A/en
Application granted granted Critical
Publication of CN114359818B publication Critical patent/CN114359818B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 - Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

An embodiment of the invention discloses a utilization rate analysis method and apparatus, a computer device, and a storage medium. The method comprises the following steps: pulling a video to be analyzed from a media service; processing the video to be analyzed to obtain image frames; performing edge calculation on the image frames to obtain a calculation result; and calculating the utilization rate according to the calculation result. Because the image frames are processed on the edge side, the method greatly reduces data traffic and improves data security; edge calculation also offers high real-time performance, which helps handle sudden incidents promptly and yields highly accurate calculation results.

Description

Utilization rate analysis method and device, computer equipment and storage medium
Technical Field
The invention relates to utilization rate analysis, and in particular to a utilization rate analysis method and apparatus, a computer device, and a storage medium.
Background
With the development of new technologies such as the intelligent Internet of Things, deep learning, and video analysis, traditional factories have begun to transform into smart factories, using technical means to help enterprises reduce labor costs, improve production efficiency, and raise product quality in areas such as equipment, process, and manufacturing.
In terms of production efficiency, improving the utilization rate is an important goal. The utilization rate is the proportion of time actually needed to produce the required output relative to the full-load operating capacity of the equipment over a scheduled period. A factory's utilization rate can be divided into personnel utilization rate and equipment utilization rate. Most existing smart factories compute utilization statistics simply by collecting production line data and monitoring station and equipment states; the resulting calculations contain errors, and equipment failures cannot be handled in time.
Therefore, it is necessary to design a new method that reduces data traffic, improves data security, offers high real-time performance, facilitates timely handling of sudden incidents, and produces accurate calculation results.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a utilization rate analysis method, a utilization rate analysis device, computer equipment and a storage medium.
In order to achieve the purpose, the invention adopts the following technical scheme: the utilization rate analysis method comprises the following steps:
pulling a video to be analyzed from a media service;
processing the video to be analyzed to obtain an image frame;
performing edge calculation on the image frame to obtain a calculation result;
and calculating the utilization rate according to the calculation result to obtain the utilization rate.
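The four steps above can be sketched as a minimal pipeline. This is an illustrative sketch only: `pull_stream`, `decode`, `edge_infer`, and `utilization_calc` are hypothetical stand-ins for the media-service pull, the decoder, the edge model, and the utilization formula, not functions from the patent.

```python
# Minimal sketch of the claimed pipeline; every stage is injected as a
# callable so the control flow can be shown without real video I/O.
def run_pipeline(pull_stream, decode, edge_infer, utilization_calc, url):
    video = pull_stream(url)                    # pull the video to be analyzed
    frames = decode(video)                      # decode into image frames
    results = [edge_infer(f) for f in frames]   # edge calculation per frame
    return utilization_calc(results)            # utilization from the results
```

With trivial stub callables, e.g. `run_pipeline(lambda u: [1, 2, 3], list, lambda f: f * 2, sum, "rtsp://example")`, the stages compose exactly as the claim lists them.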
The further technical scheme is as follows: the processing the video to be analyzed to obtain an image frame includes:
and decoding the video to be analyzed to obtain an image frame.
The further technical scheme is as follows: the calculation result comprises the working time of the personnel on duty, the working time of the equipment, the loss time of the equipment, the load time and the downtime;
the performing edge calculation on the image frame to obtain a calculation result includes:
calculating the on-duty working time of the personnel by adopting an edge calculation method for the image frame so as to obtain the on-duty working time of the personnel;
calculating the equipment operation time and the equipment loss time of the image frame by adopting an edge calculation method to obtain the equipment operation time and the equipment loss time;
and calculating the load time and the downtime of the image frame by adopting an edge calculation method to obtain the load time and the downtime.
The further technical scheme is as follows: the calculating the on-duty working time of the personnel on the duty by adopting an edge calculating method to obtain the on-duty working time of the personnel comprises the following steps:
framing a working area for the image frame at the edge side and importing a face picture of a related person;
associating the operation area with the face picture of the related personnel to obtain associated information;
identifying related personnel and located area information in the image frame;
and determining the on-duty working time of the related personnel according to the associated information, the related personnel and the located area information to obtain the on-duty working time of the personnel.
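The association-and-containment check behind the steps above can be illustrated as follows; the detection tuples and the rectangle encoding are illustrative assumptions, not the patent's data format.

```python
def is_on_duty(detections, region, person_id):
    """True if the person associated with the region is detected inside it.

    detections: iterable of (person_id, cx, cy) face detections for one frame,
    region: (x1, y1, x2, y2) of the framed working area.
    """
    x1, y1, x2, y2 = region
    return any(pid == person_id and x1 <= cx <= x2 and y1 <= cy <= y2
               for pid, cx, cy in detections)
```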
The further technical scheme is as follows: the calculating the equipment operation time and the equipment loss time of the image frame by adopting an edge calculation method to obtain the equipment operation time and the equipment loss time comprises the following steps:
and identifying the color of the safety lamp in the image frame through a safety lamp video identification algorithm at the edge side so as to determine the working time and the loss time of the equipment.
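One hedged way to realize the red/non-red decision is a hue test on the lamp's average pixel colour; the thresholds below are illustrative assumptions, not taken from the patent's algorithm.

```python
import colorsys

def lamp_state(r, g, b):
    """Classify the safety lamp's mean RGB colour (0-255 per channel).

    Red lamp -> 'loss' (equipment error); any other lit colour -> 'running';
    dark or washed-out pixels -> 'off'. Thresholds are illustrative.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if v < 0.2 or s < 0.3:
        return "off"
    return "loss" if (h < 30 / 360 or h > 330 / 360) else "running"
```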
The further technical scheme is as follows: the method for calculating the load time and the stop time of the image frame by adopting an edge calculation method to obtain the load time and the stop time comprises the following steps:
the readings of the instrument panel within the image frame are identified at the edge side by a meter video recognition algorithm to determine the load time as well as the down time.
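The reading-to-state rule can be sketched in one line; taking a strictly positive pointer reading as load and anything else as downtime is an assumption consistent with the description later in the text.

```python
def panel_state(reading: float) -> str:
    """Dashboard pointer reading above zero -> equipment under load."""
    return "load" if reading > 0 else "down"
```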
The further technical scheme is as follows: the utilization rate includes a personnel utilization rate and an equipment utilization rate.
The present invention also provides a utilization rate analyzing apparatus, including:
the pulling unit is used for pulling a video to be analyzed from a media service;
the processing unit is used for processing the video to be analyzed to obtain an image frame;
the edge calculation unit is used for carrying out edge calculation on the image frame to obtain a calculation result;
and the utilization rate calculating unit is used for calculating the utilization rate according to the calculation result so as to obtain the utilization rate.
The invention also provides computer equipment which comprises a memory and a processor, wherein the memory is stored with a computer program, and the processor realizes the method when executing the computer program.
The invention also provides a storage medium storing a computer program which, when executed by a processor, implements the method described above.
Compared with the prior art, the invention has the following beneficial effects: the method pulls the video to be analyzed, decodes it to obtain image frames, performs edge calculation, and then calculates the utilization rate. Because the image frames are processed on the edge side, data traffic is greatly reduced and data security is improved; the edge calculation mode offers high real-time performance, allows sudden accidents to be handled in time, and yields highly accurate calculation results.
The invention is further described below with reference to the accompanying drawings and specific embodiments.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic view of an application scenario of the utilization rate analysis method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a utilization rate analysis method provided by an embodiment of the present invention;
fig. 3 is a schematic sub-flow chart of a utilization rate analysis method provided by an embodiment of the present invention;
fig. 4 is a schematic sub-flow chart of a utilization rate analysis method provided by an embodiment of the present invention;
FIG. 5 is a schematic view of a utilization display interface provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of an attendance display interface provided by an embodiment of the present invention;
FIG. 7 is a schematic illustration of a personnel utilization rate analysis page provided by an embodiment of the present invention;
FIG. 8 is a schematic diagram of a monitor page according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a real-time video preview interface provided by an embodiment of the invention;
FIG. 10 is a diagram illustrating a create task provided by an embodiment of the invention;
FIG. 11 is a schematic diagram of a selection algorithm provided by an embodiment of the present invention;
FIG. 12 is a schematic diagram illustrating multiple regions according to an embodiment of the present invention;
FIG. 13 is a schematic diagram showing a structured analysis provided by an embodiment of the present invention;
fig. 14 is a schematic block diagram of a utilization rate analyzing apparatus provided by an embodiment of the present invention;
fig. 15 is a schematic block diagram of an edge calculating unit of the utilization rate analyzing apparatus provided by the embodiment of the present invention;
fig. 16 is a schematic block diagram of an on-duty working time calculation subunit of the utilization rate analyzing apparatus provided by an embodiment of the present invention;
FIG. 17 is a schematic block diagram of a computer device provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
Referring to fig. 1 and fig. 2, fig. 1 is a schematic view of an application scenario of the utilization rate analysis method according to an embodiment of the present invention, and fig. 2 is a schematic flow chart of the method. The utilization rate analysis method is applied to a server, which exchanges data with a camera and a terminal, as shown in fig. 1. The server comprises a service background, a media service, and a video structured analysis service. The service background provides web services for terminal access, stores internal service data such as channels, task information, and analysis results through middleware (MySQL), and issues media tasks and analysis tasks to the media service and the video structured analysis service respectively. The media service provides the video stream, that is, pulling, pushing, and local recording of the video to be analyzed; the video source supports IPC, NVR, and streams pushed by third-party services, and after pulling, the stream can be pushed via Real-Time Streaming Protocol (RTSP), Real-Time Messaging Protocol (RTMP), HTTP-FLV, or HLS (an HTTP-based adaptive bitrate streaming protocol). The pushed stream is used by the video structured analysis service on one hand and watched on a front-end page, that is, the terminal, on the other. The video structured analysis service adopts the Huawei MindX SDK, whose framework provides plug-ins such as encoding/decoding, analysis, and frame drawing, optimized for the chip; the analysis results and the edge calculation results to be processed are pushed to the service background through a private TCP protocol to calculate the utilization rate. The server is an ADLINK DLAP221 edge computing box and can connect to a cloud platform through the MQTT protocol.
The box uses a Huawei Ascend 310 processor to support multi-channel real-time video analysis. The chassis adopts an industrial-grade design suitable for factory environments and offers low cost, small size, and low power consumption. Compared with cloud computing, edge computing has the following advantages: sensitive data is processed on the edge side, which greatly reduces data traffic and improves data security; and its high real-time performance helps handle sudden accidents in time.
The video structured analysis service uses GStreamer, a pipeline-based streaming media processing framework, as its system framework; custom algorithm plug-ins support dynamic loading of models, and each analysis task is managed in the form of a pipeline.
Fig. 2 is a schematic flow chart of a utilization rate analysis method provided by an embodiment of the present invention. As shown in fig. 2, the method includes the following steps S110 to S140.
S110, pulling the video to be analyzed from the media service.
In this embodiment, the video to be analyzed refers to the video captured by a camera in real time and uploaded to the media service.
Specifically, a video stream such as rtsp, a video file, etc. is acquired through a video source input node.
And S120, processing the video to be analyzed to obtain an image frame.
In this embodiment, the image frames refer to the frame images obtained by decoding the video to be analyzed.
Specifically, the video to be analyzed is decoded to obtain an image frame.
And decoding the video to be analyzed through a video source input node to obtain an image frame.
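Since the analysis tasks described later configure a frame extraction interval, the decoded stream is typically subsampled before inference; a generic sketch of this step (the interval value is illustrative):

```python
def sample_frames(frames, every_n=5):
    """Yield every n-th decoded frame (the configurable extraction interval)."""
    for i, frame in enumerate(frames):
        if i % every_n == 0:
            yield frame
```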
And S130, performing edge calculation on the image frame to obtain a calculation result.
In this embodiment, the calculation results include the on-duty working time of the personnel, the working time of the equipment, the equipment loss time, the load time and the downtime. Specifically, edge calculation is performed on image frames obtained from a video source input node through an analysis node, and a calculation result is output.
In an embodiment, referring to fig. 3, the step S130 may include steps S131 to S133.
S131, calculating the on-duty working time of the personnel by adopting an edge calculation method for the image frame to obtain the on-duty working time of the personnel.
In this embodiment, the working time of the person on duty refers to the time of the relevant person at the corresponding position.
In an embodiment, referring to fig. 4, the step S131 may include steps S1311 to S1314.
S1311, framing a work area on the image frame on the edge side, and importing a face picture of the related person.
In this embodiment, the purpose of framing the work area is to determine whether a person is at the corresponding position, and the purpose of importing the face picture of the relevant person is to confirm that it is the corresponding person at that position.
And S1312, associating the work area with the face picture of the related person to obtain associated information.
In this embodiment, the related information refers to a one-to-one corresponding relationship between the work area and the face picture of the related person.
And S1313, identifying related personnel and located area information in the image frame.
In this embodiment, the face information of the relevant person in the image frame is identified, and the region information refers to the position information of the person on the image frame.
And S1314, determining the on Shift operation time of the related personnel according to the associated information, the related personnel and the located region information to obtain the on Shift operation time of the personnel.
In this embodiment, the person is determined to be in the area according to the associated information, the related personnel, and the located area information, and the corresponding time is counted as on-duty working time; if the person is not in the area, or a different person is there, the person is considered off duty, and that time is not counted as on-duty working time.
Specifically, the on-duty working time of the personnel is obtained through a personnel-on-duty video recognition algorithm. First, a working area is framed in a web page, face pictures of the related personnel are imported, and the area is associated with the personnel. When a relevant person is detected in the image frame and identified as being in the designated area in the video, the person is judged to be on duty; otherwise, the person is considered off duty. The attendance time is generally fixed. Further, personnel utilization rate = actual working time / attendance time = on-duty working time / attendance time.
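Accumulating on-duty time frame by frame and applying the formula above can be sketched as follows; the per-frame flags and the sampling interval are illustrative assumptions.

```python
def on_duty_seconds(on_duty_flags, frame_interval_s):
    """Sum the sampled intervals in which the person was judged on duty."""
    return sum(frame_interval_s for flag in on_duty_flags if flag)

def personnel_utilization(on_duty_s, attendance_s):
    """Personnel utilization = actual (on-duty) working time / attendance time."""
    return on_duty_s / attendance_s
```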
And S132, calculating the equipment operation time and the equipment loss time of the image frame by adopting an edge calculation method to obtain the equipment operation time and the equipment loss time.
In this embodiment, the equipment working time refers to the length of time the equipment operates; the equipment loss time refers to the length of time during which the equipment has run into an error and is not operating.
Specifically, the color of the safety lamp within the image frame is identified on the edge side by a safety lamp video recognition algorithm to determine the equipment working time and the equipment loss time. When the equipment is running, the safety lamp shows a non-red color and the equipment is regarded as operating; when an error occurs during operation, the safety lamp turns red and the equipment is regarded as not operating. The safety lamp video recognition algorithm identifies the colors of the equipment's safety lamp in the image frames and measures the duration of each color, thereby determining the working time and the loss time. Further, equipment utilization rate = (working time - loss time) / working time.
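Under the assumption that the whole observed period counts as working time and red-lamp intervals count as loss time (an interpretation, since the patent does not define the bookkeeping), the timing and the formula can be sketched as:

```python
def equipment_times(lamp_states, frame_interval_s):
    """Working time = all observed time; loss time = red-lamp ('loss') time."""
    working = frame_interval_s * len(lamp_states)
    loss = sum(frame_interval_s for s in lamp_states if s == "loss")
    return working, loss

def equipment_utilization(working_s, loss_s):
    """Equipment utilization = (working time - loss time) / working time."""
    return (working_s - loss_s) / working_s
```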
And S133, calculating the load time and the downtime of the image frame by adopting an edge calculation method to obtain the load time and the downtime.
In this embodiment, the load time refers to a time period during which the reading of the instrument panel of the device is not zero; the down time refers to the length of time that the instrument panel reads zero.
In particular, the readings of the instrument panel within the image frame are identified on the edge side by a meter video recognition algorithm to determine the load time as well as the down time.
The load time and the downtime are obtained through an instrument video recognition algorithm: the video source is aimed at the equipment's instrument panel, and if the pointer reading is greater than 0 the equipment is judged to be under load; otherwise it is judged to be stopped. In addition, time utilization rate (availability) = (load time - downtime) / load time. The instrument video recognition algorithm refers to an algorithm that recognizes the readings of the equipment's instrument panel.
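The reading-based split and the availability formula quoted above can be sketched as follows; the per-sample interval is an assumption.

```python
def load_and_down_time(readings, frame_interval_s):
    """Split sampled dashboard readings into load time (reading > 0) and downtime."""
    load = sum(frame_interval_s for r in readings if r > 0)
    down = frame_interval_s * len(readings) - load
    return load, down

def availability(load_s, down_s):
    """Time utilization (availability) = (load time - downtime) / load time."""
    return (load_s - down_s) / load_s
```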
And S140, calculating the utilization rate according to the calculation result to obtain the utilization rate.
In this embodiment, the utilization rates include a personnel utilization rate and an equipment utilization rate.
The calculation result is sent to the service background through an HTTP request to calculate the utilization rates, which include the personnel utilization rate and the equipment utilization rate. Personnel utilization rate = actual working time / attendance time; equipment utilization rate = (working time - loss time) / working time; time utilization rate (availability) = (load time - downtime) / load time; performance utilization rate (performance index) = theoretical takt time × input quantity / utilized time; yield (quality index) = (input quantity - defective quantity) / input quantity; overall equipment effectiveness (OEE) = availability × performance index × quality index.
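Composing the indices listed above into OEE can be sketched as follows; the variable names are illustrative, and the formulas mirror the description.

```python
def oee(load_s, down_s, takt_s, input_qty, defective_qty):
    """Overall equipment effectiveness = availability x performance x quality."""
    availability = (load_s - down_s) / load_s            # time utilization rate
    utilized_s = load_s - down_s                         # utilized time
    performance = takt_s * input_qty / utilized_s        # performance index
    quality = (input_qty - defective_qty) / input_qty    # quality index
    return availability * performance * quality
```

For example, with 100 s of load time, 20 s of downtime, a 2 s theoretical takt, 30 units input, and 3 defects, the three indices are 0.8, 0.75, and 0.9 respectively.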
Video detection and calculation are performed on the edge side for the personnel, equipment, and products in the factory, so that utilization rate reports are generated automatically and sensitive data is prevented from being transmitted outward.
Referring to fig. 5, an analysis result report is shown, that is, a utilization rate report and event record screenshots, including equipment utilization data, workshop attendance, product production status, and safety alarm event records over about 30 days. Referring to fig. 6, the attendance of the staff is shown, and a personnel flow thermodynamic diagram of the workshop is generated from the recognition results. Referring to FIG. 7, employee utilization trends and rankings are shown. Referring to fig. 8, the equipment utilization rate and production line status over a period of time are shown. Referring to fig. 9, real-time monitoring of the workshop is shown, that is, a real-time display of the video to be detected: by clicking the single-screen or four-screen split view on the terminal interface, one or four video source monitoring pictures and their algorithm results are displayed simultaneously, and a stream of result cards is shown on the right in real time, each card containing a screenshot of the algorithm target, the algorithm name, the algorithm result, and the time of the result. Specifically, a multi-algorithm, multi-region analysis task can be created through the server as follows. Referring to fig. 10, an analysis task is created and a channel is selected, binding a video stream to the channel; formats such as RTSP/RTMP/NVR/video file are supported. Referring to fig. 11, one or more algorithms are selected, and algorithm parameters such as confidence, frame extraction interval, and minimum recognition target size can be configured. Referring to FIG. 12, a region is drawn and one or more algorithms are selected for it. Referring to fig. 13, the final analysis results are shown, that is, the utilization rate and the detection results.
According to the utilization rate analysis method, the video to be analyzed is pulled and decoded to obtain image frames, edge calculation is performed, and then the utilization rate is calculated. Because the image frames are processed on the edge side, data traffic is greatly reduced and data security is improved; the edge calculation mode offers high real-time performance, allows sudden accidents to be handled in time, and yields highly accurate calculation results.
Fig. 14 is a schematic block diagram of a utilization rate analyzing apparatus 300 according to an embodiment of the present invention. As shown in fig. 14, the present invention also provides a utilization rate analyzing apparatus 300 corresponding to the above utilization rate analyzing method. The utilization rate analyzing apparatus 300 includes a unit for executing the utilization rate analyzing method described above, and the apparatus may be disposed in a server. Specifically, referring to fig. 14, the utilization rate analyzing apparatus 300 includes a pulling unit 301, a processing unit 302, an edge calculating unit 303, and a utilization rate calculating unit 304.
A pull unit 301, configured to pull a video to be analyzed from a media service; a processing unit 302, configured to process the video to be analyzed to obtain an image frame; an edge calculation unit 303, configured to perform edge calculation on the image frame to obtain a calculation result; the utilization rate calculating unit 304 is configured to perform utilization rate calculation according to the calculation result to obtain the utilization rate.
In an embodiment, the processing unit 302 is configured to decode the video to be analyzed to obtain an image frame.
In one embodiment, as shown in fig. 15, the edge calculation unit 303 includes an on-duty working time calculation subunit 3031, a device time calculation subunit 3032, and a meter time calculation subunit 3033.
An on-duty working time calculating subunit 3031, configured to calculate the on-duty working time of the personnel for the image frame by using an edge calculation method, so as to obtain the on-duty working time of the personnel; a device time calculating subunit 3032, configured to calculate the equipment working time and the equipment loss time for the image frame by using an edge calculation method, so as to obtain the equipment working time and the equipment loss time; and a meter time calculating subunit 3033, configured to calculate the load time and the downtime for the image frames by using an edge calculation method, so as to obtain the load time and the downtime.
In one embodiment, as shown in fig. 16, the on-duty working time calculation subunit 3031 includes a framing module 30311, an association module 30312, an identification module 30313, and a time determination module 30314.
A framing module 30311, configured to frame a work area for the image frame on the edge side and import face pictures of the related personnel; an association module 30312, configured to associate the work area with the face pictures of the related personnel to obtain association information; an identification module 30313, configured to identify the related personnel and located area information in the image frame; and a time determination module 30314, configured to determine the on-duty working time of the related personnel according to the association information, the related personnel, and the located area information, so as to obtain the on-duty working time of the personnel.
In an embodiment, the device time calculating subunit 3032 is configured to recognize the color of the safety lamp in the image frame through a safety lamp video recognition algorithm on the edge side to determine the equipment working time and the equipment loss time.
In an embodiment, the meter time calculation subunit 3033 is configured to identify the reading of the instrument panel in the image frame by a meter video recognition algorithm on the edge side to determine the load time and the downtime.
It should be noted that, as can be clearly understood by those skilled in the art, the specific implementation process of the utilization rate analyzing apparatus 300 and each unit may refer to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, no further description is provided herein.
The utilization rate analyzing apparatus 300 described above may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 17.
Referring to fig. 17, fig. 17 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a server, wherein the server may be an independent server or a server cluster composed of a plurality of servers.
Referring to fig. 17, the computer device 500 includes a processor 502, memory, and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 comprises program instructions that, when executed, cause the processor 502 to perform a utilization analysis method.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for running the computer program 5032 in the non-volatile storage medium 503, and when the computer program 5032 is executed by the processor 502, the processor 502 can be caused to execute a utilization rate analysis method.
The network interface 505 is used for network communication with other devices. Those skilled in the art will appreciate that the configuration shown in fig. 17 is a block diagram of only a portion of the configuration relevant to the present application and does not constitute a limitation on the computer device 500 to which the present application is applied; a particular computer device 500 may include more or fewer components than those shown, may combine certain components, or may have a different arrangement of components.
Wherein the processor 502 is configured to run the computer program 5032 stored in the memory to implement the following steps:
pulling a video to be analyzed from a media service; processing the video to be analyzed to obtain an image frame; performing edge calculation on the image frame to obtain a calculation result; and calculating the utilization rate according to the calculation result to obtain the utilization rate.
Wherein the calculation result comprises the on-duty working time of the personnel, the equipment operation time, the equipment loss time, the load time, and the downtime.
The utilization rate includes a personnel utilization rate and an equipment utilization rate.
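The embodiment does not fix the exact formulas for the two utilization rates; as a minimal sketch, assuming each rate is the ratio of the relevant accumulated time to the length of the observation window (the window length and the example figures below are hypothetical):

```python
def utilization(active_seconds: float, total_seconds: float) -> float:
    """Generic time-ratio utilization over an observation window."""
    if total_seconds <= 0:
        raise ValueError("observation window must be positive")
    return active_seconds / total_seconds

# Hypothetical 8-hour shift (28800 s):
personnel_utilization = utilization(21600, 28800)  # on-duty working time / shift length
equipment_utilization = utilization(18000, 28800)  # load time / shift length
```

Which accumulated time goes in the numerator (load time, operation time minus loss time, etc.) is a design choice the patent leaves to the implementation.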
In an embodiment, when the processor 502 implements the step of processing the video to be analyzed to obtain the image frame, the following steps are specifically implemented:
and decoding the video to be analyzed to obtain an image frame.
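Decoding itself would typically be delegated to a library such as FFmpeg or OpenCV (the embodiment does not name one); what can be sketched independently is the frame-sampling arithmetic that decides which decoded frames are handed to the edge-side algorithms. The function name and the one-frame-per-interval policy are assumptions:

```python
def sample_frame_indices(fps: float, duration_s: float, every_s: float) -> list[int]:
    """Indices of decoded frames to keep when analyzing one frame every
    `every_s` seconds of a stream running at `fps` frames per second."""
    step = max(1, round(fps * every_s))       # never skip below one frame
    total = int(fps * duration_s)             # total decoded frames in the clip
    return list(range(0, total, step))

# 25 fps stream, 10 s clip, analyze one frame per second -> every 25th frame
indices = sample_frame_indices(25, 10, 1.0)
```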
In an embodiment, when the processor 502 implements the step of performing the edge calculation on the image frame to obtain the calculation result, the following steps are specifically implemented:
calculating the on-duty working time of the personnel by adopting an edge calculation method for the image frame so as to obtain the on-duty working time of the personnel; calculating the equipment operation time and the equipment loss time of the image frame by adopting an edge calculation method to obtain the equipment operation time and the equipment loss time; and calculating the load time and the downtime of the image frame by adopting an edge calculation method to obtain the load time and the downtime.
In an embodiment, when implementing the step of calculating the on-duty working time of the person by using the edge calculation method for the image frame to obtain the on-duty working time of the person, the processor 502 specifically implements the following steps:
framing a working area in the image frame on the edge side and importing face pictures of the relevant personnel; associating the working area with the face pictures of the relevant personnel to obtain association information; identifying the relevant personnel and their located area information in the image frame; and determining the on-duty working time of the relevant personnel according to the association information, the relevant personnel, and the located area information, so as to obtain the on-duty working time of the personnel.
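The association and accumulation steps above can be condensed into a toy accumulator. The recognizers themselves (face recognition, region detection) are treated here as black boxes that have already produced per-frame results; the data layout and the one-second analysis cadence are assumptions:

```python
def on_duty_seconds(frames, assigned_region, dt_s=1.0):
    """frames: for each analyzed image frame, a dict {person_id: region_id}
    describing who was recognized inside which framed working area.
    assigned_region: the association information {person_id: region_id}.
    Every frame in which a person is seen inside their own working area
    credits dt_s seconds of on-duty working time."""
    totals = {pid: 0.0 for pid in assigned_region}
    for detections in frames:
        for pid, region in detections.items():
            if assigned_region.get(pid) == region:
                totals[pid] += dt_s
    return totals

# Toy trace: alice leaves her area "A" for one frame; bob appears once in "B".
frames = [{"alice": "A"}, {"alice": "B"}, {"alice": "A", "bob": "B"}]
print(on_duty_seconds(frames, {"alice": "A", "bob": "B"}))
# → {'alice': 2.0, 'bob': 1.0}
```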
In an embodiment, when the processor 502 implements the step of performing the equipment operation time and equipment loss time calculation on the image frame by using the edge calculation method to obtain the equipment operation time and the equipment loss time, the following steps are specifically implemented:
identifying the color of the safety lamp in the image frame through a safety-lamp video recognition algorithm on the edge side, so as to determine the equipment operation time and the equipment loss time.
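A toy version of the safety-lamp step: the mapping of lamp colors to states is an assumption (a clearly green lamp counted as operation time, anything else as loss time), as is the crude mean-RGB rule standing in for a real video recognition algorithm:

```python
def classify_safety_lamp(mean_rgb):
    """Crude color rule over the mean RGB of the lamp's pixel region.
    Assumed mapping: dominantly green lamp = equipment operating;
    red, amber, or unlit = loss time."""
    r, g, b = mean_rgb
    if g > 1.5 * r and g > 1.5 * b and g > 80:
        return "operation"
    return "loss"

def accumulate_equipment_time(lamp_samples, dt_s=1.0):
    """lamp_samples: one mean-RGB triple per analyzed image frame."""
    times = {"operation": 0.0, "loss": 0.0}
    for rgb in lamp_samples:
        times[classify_safety_lamp(rgb)] += dt_s
    return times

samples = [(30, 200, 40), (200, 30, 30), (20, 210, 50)]
print(accumulate_equipment_time(samples))  # → {'operation': 2.0, 'loss': 1.0}
```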
In an embodiment, when the processor 502 implements the step of performing the load time and downtime calculation on the image frame by using the edge calculation method to obtain the load time and the downtime, the following steps are specifically implemented:
the readings of the instrument panel within the image frame are identified on the edge side by a meter video recognition algorithm so as to determine the load time as well as the downtime.
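Once the meter recognition algorithm has turned each analyzed frame into a numeric reading, the time accounting reduces to thresholding. The threshold semantics below (at or above threshold = load time, zero reading = downtime, anything in between counted toward neither) are an assumption, as is the hypothetical percent unit:

```python
def accumulate_load_time(readings, load_threshold, dt_s=1.0):
    """readings: dashboard values recognized in successive analyzed frames
    (e.g. spindle load in %, a hypothetical unit). Returns accumulated
    (load_seconds, down_seconds)."""
    load_s = down_s = 0.0
    for value in readings:
        if value >= load_threshold:
            load_s += dt_s        # machine under load
        elif value == 0:
            down_s += dt_s        # machine stopped
        # intermediate readings: idling, credited to neither bucket
    return load_s, down_s

load_s, down_s = accumulate_load_time([0, 5, 80, 90, 0], load_threshold=50)
# → (2.0, 2.0)
```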
It should be understood that, in the embodiment of the present application, the processor 502 may be a central processing unit (CPU), and the processor 502 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
It will be understood by those skilled in the art that all or part of the flow of the methods of the above embodiments may be implemented by a computer program instructing associated hardware. The computer program comprises program instructions and may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the above method embodiments.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer-readable storage medium. The storage medium stores a computer program, wherein the computer program, when executed by a processor, causes the processor to perform the steps of:
pulling a video to be analyzed from a media service; processing the video to be analyzed to obtain an image frame; performing edge calculation on the image frame to obtain a calculation result; and calculating the utilization rate according to the calculation result to obtain the utilization rate.
Wherein the calculation result comprises the on-duty working time of the personnel, the equipment operation time, the equipment loss time, the load time, and the downtime.
The utilization rate includes a personnel utilization rate and an equipment utilization rate.
In an embodiment, when the processor executes the computer program to implement the step of processing the video to be analyzed to obtain the image frames, the following steps are specifically implemented:
and decoding the video to be analyzed to obtain an image frame.
In an embodiment, when the processor executes the computer program to implement the step of performing the edge calculation on the image frame to obtain the calculation result, the processor specifically implements the following steps:
calculating the on-duty working time of the personnel by adopting an edge calculation method for the image frame so as to obtain the on-duty working time of the personnel; calculating the equipment operation time and the equipment loss time of the image frame by adopting an edge calculation method to obtain the equipment operation time and the equipment loss time; and calculating the load time and the downtime of the image frame by adopting an edge calculation method to obtain the load time and the downtime.
In an embodiment, when the processor executes the computer program to implement the step of performing the on-duty working time calculation on the image frame by using the edge calculation method to obtain the on-duty working time of the personnel, the following steps are specifically implemented:
framing a working area in the image frame on the edge side and importing face pictures of the relevant personnel; associating the working area with the face pictures of the relevant personnel to obtain association information; identifying the relevant personnel and their located area information in the image frame; and determining the on-duty working time of the relevant personnel according to the association information, the relevant personnel, and the located area information, so as to obtain the on-duty working time of the personnel.
In an embodiment, when the processor executes the computer program to implement the step of performing the equipment operation time and equipment loss time calculation on the image frame by using the edge calculation method to obtain the equipment operation time and the equipment loss time, the following steps are specifically implemented:
identifying the color of the safety lamp in the image frame through a safety-lamp video recognition algorithm on the edge side, so as to determine the equipment operation time and the equipment loss time.
In an embodiment, when the processor executes the computer program to implement the step of performing the load time and downtime calculation on the image frame by using the edge calculation method to obtain the load time and the downtime, the following steps are specifically implemented:
the readings of the instrument panel within the image frame are identified on the edge side by a meter video recognition algorithm so as to determine the load time as well as the downtime.
The storage medium may be a USB flash disk, a removable hard disk, a read-only memory (ROM), a magnetic disk, an optical disk, or any other computer-readable storage medium capable of storing a computer program.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented in electronic hardware, computer software, or a combination of both. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above in general functional terms. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, various elements or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined, and deleted according to actual needs. The units in the device of the embodiment of the invention can be merged, divided, and deleted according to actual needs. In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a storage medium. Based on such understanding, the technical solution of the present invention essentially or partially contributes to the prior art, or all or part of the technical solution can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and various equivalent modifications and substitutions can be easily made by those skilled in the art within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. The utilization rate analysis method is characterized by comprising the following steps:
pulling a video to be analyzed from a media service;
processing the video to be analyzed to obtain an image frame;
performing edge calculation on the image frame to obtain a calculation result;
and calculating the utilization rate according to the calculation result to obtain the utilization rate.
2. The utilization rate analysis method of claim 1, wherein the processing the video to be analyzed to obtain an image frame comprises:
and decoding the video to be analyzed to obtain an image frame.
3. The utilization rate analysis method of claim 1, wherein the calculation result comprises the on-duty working time of the personnel, the equipment operation time, the equipment loss time, the load time, and the downtime;
the performing edge calculation on the image frame to obtain a calculation result includes:
calculating the on-duty working time of the personnel by adopting an edge calculation method for the image frame so as to obtain the on-duty working time of the personnel;
calculating the equipment operation time and the equipment loss time of the image frame by adopting an edge calculation method to obtain the equipment operation time and the equipment loss time;
and calculating the load time and the downtime of the image frame by adopting an edge calculation method to obtain the load time and the downtime.
4. The utilization rate analysis method of claim 3, wherein the performing the on-duty working time calculation on the image frame by using the edge calculation method to obtain the on-duty working time of the personnel comprises:
framing a working area in the image frame on the edge side and importing face pictures of the relevant personnel;
associating the working area with the face pictures of the relevant personnel to obtain association information;
identifying the relevant personnel and their located area information in the image frame;
and determining the on-duty working time of the relevant personnel according to the association information, the relevant personnel, and the located area information, so as to obtain the on-duty working time of the personnel.
5. The utilization rate analysis method of claim 3, wherein the performing the equipment operation time and equipment loss time calculation on the image frame by using the edge calculation method to obtain the equipment operation time and the equipment loss time comprises:
identifying the color of the safety lamp in the image frame through a safety-lamp video recognition algorithm on the edge side, so as to determine the equipment operation time and the equipment loss time.
6. The utilization rate analysis method of claim 3, wherein the performing the load time and downtime calculation on the image frame by using the edge calculation method to obtain the load time and the downtime comprises:
identifying the readings of the instrument panel within the image frame on the edge side by a meter video recognition algorithm, so as to determine the load time as well as the downtime.
7. The utilization rate analysis method according to claim 1, wherein the utilization rates include a person utilization rate and an equipment utilization rate.
8. Utilization rate analytical equipment, its characterized in that includes:
the system comprises a pulling unit, a video analyzing unit and a video analyzing unit, wherein the pulling unit is used for pulling a video to be analyzed from a media service;
the processing unit is used for processing the video to be analyzed to obtain an image frame;
the edge calculation unit is used for carrying out edge calculation on the image frame to obtain a calculation result;
and the utilization rate calculating unit is used for calculating the utilization rate according to the calculation result so as to obtain the utilization rate.
9. A computer device, characterized in that the computer device comprises a memory, on which a computer program is stored, and a processor, which when executing the computer program implements the method according to any of claims 1 to 7.
10. A storage medium, characterized in that the storage medium stores a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 7.
CN202210256376.3A 2022-03-16 2022-03-16 Utilization rate analysis method and device, computer equipment and storage medium Active CN114359818B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210256376.3A CN114359818B (en) 2022-03-16 2022-03-16 Utilization rate analysis method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210256376.3A CN114359818B (en) 2022-03-16 2022-03-16 Utilization rate analysis method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114359818A true CN114359818A (en) 2022-04-15
CN114359818B CN114359818B (en) 2022-06-14

Family

ID=81094573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210256376.3A Active CN114359818B (en) 2022-03-16 2022-03-16 Utilization rate analysis method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114359818B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115311796A (en) * 2022-07-11 2022-11-08 西安电子科技大学广州研究院 Edge intelligent security alarm system
CN116362454A (en) * 2023-06-03 2023-06-30 宁德时代新能源科技股份有限公司 Yield analysis system and method, electronic equipment, storage medium and product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160149850A1 (en) * 2014-11-24 2016-05-26 LinkedIn Corporation Intelligent scheduling for employee activation
CN111193662A (en) * 2019-12-27 2020-05-22 浙江华工赛百数据系统有限公司 Edge computing gateway based on visual identification
CN112016369A (en) * 2019-05-31 2020-12-01 阿里巴巴集团控股有限公司 Method and system for determining production progress and order checking and edge server
CN112689069A (en) * 2020-12-18 2021-04-20 上海上实龙创智能科技股份有限公司 Production line error correction auxiliary system and method based on edge gateway
CN113238544A (en) * 2021-04-27 2021-08-10 深圳市益普科技有限公司 Equipment data acquisition system and method based on action signals
CN113723315A (en) * 2021-09-01 2021-11-30 常熟希那基汽车零件有限公司 Raspberry pie-based utilization rate monitoring system
CN113780906A (en) * 2020-06-09 2021-12-10 富鼎电子科技(嘉善)有限公司 Machine management method and device and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160149850A1 (en) * 2014-11-24 2016-05-26 LinkedIn Corporation Intelligent scheduling for employee activation
CN112016369A (en) * 2019-05-31 2020-12-01 阿里巴巴集团控股有限公司 Method and system for determining production progress and order checking and edge server
CN111193662A (en) * 2019-12-27 2020-05-22 浙江华工赛百数据系统有限公司 Edge computing gateway based on visual identification
CN113780906A (en) * 2020-06-09 2021-12-10 富鼎电子科技(嘉善)有限公司 Machine management method and device and computer readable storage medium
CN112689069A (en) * 2020-12-18 2021-04-20 上海上实龙创智能科技股份有限公司 Production line error correction auxiliary system and method based on edge gateway
CN113238544A (en) * 2021-04-27 2021-08-10 深圳市益普科技有限公司 Equipment data acquisition system and method based on action signals
CN113723315A (en) * 2021-09-01 2021-11-30 常熟希那基汽车零件有限公司 Raspberry pie-based utilization rate monitoring system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115311796A (en) * 2022-07-11 2022-11-08 西安电子科技大学广州研究院 Edge intelligent security alarm system
CN116362454A (en) * 2023-06-03 2023-06-30 宁德时代新能源科技股份有限公司 Yield analysis system and method, electronic equipment, storage medium and product
CN116362454B (en) * 2023-06-03 2023-10-20 宁德时代新能源科技股份有限公司 Yield analysis system and method, electronic equipment, storage medium and product

Also Published As

Publication number Publication date
CN114359818B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN114359818B (en) Utilization rate analysis method and device, computer equipment and storage medium
CN111723727B (en) Cloud monitoring method and device based on edge computing, electronic equipment and storage medium
CN110572579B (en) Image processing method and device and electronic equipment
CN104980752B (en) The method and system of multichannel self-adaptive parallel transcoding are realized using CPU and GPU
CN110348522B (en) Image detection and identification method and system, electronic equipment, and image classification network optimization method and system
CN111629264B (en) Web-based separate front-end image rendering method
US11551447B2 (en) Real-time video stream analysis system using deep neural networks
CN114943936A (en) Target behavior identification method and device, electronic equipment and storage medium
CN113395523B (en) Image decoding method, device, equipment and storage medium based on parallel threads
CN102148983A (en) Method for solving over-high occupancy of high-resolution image resource
CN110704268B (en) Automatic testing method and device for video images
US11893791B2 (en) Pre-processing image frames based on camera statistics
CN112040090A (en) Video stream processing method and device, electronic equipment and storage medium
US20230168945A1 (en) Efficient High Bandwidth Shared Memory Architectures for Parallel Machine Learning and AI Processing of Large Data Sets and Streams
CN117176990A (en) Video stream processing method and device, electronic equipment and storage medium
CN115147752A (en) Video analysis method and device and computer equipment
EP3029940A1 (en) Method and device for post processing of a video stream
CN114039279A (en) Control cabinet monitoring method and system in rail transit station
TW202249035A (en) Cloud-edge collaborative processing method of industrial internet, electronic device, and storage medium
CN113596395A (en) Image acquisition method and monitoring equipment
CN110321857B (en) Accurate passenger group analysis method based on edge calculation technology
CN109194981B (en) Remote video tracking method, device and storage medium
CN113747195A (en) Video data processing method, device, equipment and storage medium
CN111866586A (en) Underground video data processing method and device, electronic equipment and storage medium
EP3975159A1 (en) Method and a system for measuring the latency of a graphical display output

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Patentee after: Shenzhen Huafu Technology Co.,Ltd.

Address before: 518000 Room 201, building A, 1 front Bay Road, Shenzhen Qianhai cooperation zone, Shenzhen, Guangdong

Patentee before: SHENZHEN HUAFU INFORMATION TECHNOLOGY Co.,Ltd.