CN112819306A - Method, system, device and medium for evaluating work efficiency based on computer vision - Google Patents

Method, system, device and medium for evaluating work efficiency based on computer vision

Info

Publication number
CN112819306A
CN112819306A
Authority
CN
China
Prior art keywords
work
worker
workers
working
information
Prior art date
Legal status
Pending
Application number
CN202110090185.XA
Other languages
Chinese (zh)
Inventor
刘金杉
卢昱杰
王瑞
刘博�
张自然
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University filed Critical Tongji University
Priority to CN202110090185.XA priority Critical patent/CN112819306A/en
Publication of CN112819306A publication Critical patent/CN112819306A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398 Performance of employee with respect to a job function
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2458 Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462 Approximate or statistical queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Probability & Statistics with Applications (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Economics (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Fuzzy Systems (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Game Theory and Decision Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • General Factory Administration (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a method, system, device and medium for evaluating work efficiency based on computer vision, belonging to the field of computer vision. The method comprises the following steps: acquiring video images of a construction site; identifying workers from the video images through a deep learning module, identifying the workers' work types through a pedestrian re-identification module, and determining the work type information of the workers; continuously identifying and tracking the workers and recording their work information, which at least includes position information; obtaining preset judgment conditions according to the work type information and judging, in combination with the position information, whether a worker is in a working state; and determining the worker's efficiency from the duration of the working state and a preset standard working time. The invention judges workers' position information against three kinds of preset conditions, namely the working area, the working path area and work type clustering, to obtain the workers' working states and efficiency, and can make up for some deficiencies of existing evaluation methods.

Description

Method, system, device and medium for evaluating work efficiency based on computer vision
Technical Field
The invention relates to the field of computer vision, and in particular to a method, system, device and medium for evaluating work efficiency based on computer vision.
Background
Infrastructure construction strongly drives China's domestic demand, creates a large number of jobs and makes an important contribution to GDP growth. For building construction projects the input-output ratio is a key concern, so construction efficiency receives close attention: the working efficiency of every work type directly affects the final cost and output, and the monitoring, control and evaluation of construction efficiency have therefore long been focal issues.
China's engineering construction industry is economically enormous and has long been a pillar of economic construction and social development. However, statistics show that both the growth rate of output value and the profit rate on output value in the construction industry tend to be low, and under fierce market competition the profit margins of construction enterprises are squeezed further. The main causes are long-standing characteristics of the industry, such as an extensive development model, the low skill level of construction workers, an incomplete supervision system and outdated technical equipment.
With the engineering construction industry facing this severe market situation, automating the management of construction crews with the computer vision technology strongly advocated by the state can effectively improve construction efficiency and enterprise profit margins, and also helps reduce safety accidents on construction sites. The key task in automating worker management is to obtain the labor-state information of on-site construction workers, such as worker positions, the number of workers, workload and progress, worker safety status, and labor time and efficiency.
A 2019 master's thesis from South China University of Technology, "Computer-vision-based analysis of construction workers' labor state", recognizes workers' working states from the pattern of change of the angles between the limbs during specific tasks, distinguishes whether a worker is working by means of an improved DTW algorithm, and then evaluates construction efficiency with the rule "total workload / (number of workers × labor time)". The method proposed in that thesis addresses the discrimination of workers' working states from the perspective of human posture recognition.
Another 2019 master's thesis, "Research on worker activity recognition and management based on data collected by intelligent mobile terminals", classifies and recognizes workers' postures, actions and behaviors and, based on the recognition results, defines work attributes as no work, auxiliary work and effective work, from which the workers' "effective work rate" is calculated and evaluated. The thesis differentiates work attributes to a certain extent and thus addresses the evaluation of worker efficiency, but different discrimination methods must be used for different work types, so the discrimination is highly complex.
An invention patent disclosed in 2020, "Method for detecting the on-duty state of workers based on improved YOLOv3" (application publication number CN111860152A), proposes an improved ILF-YOLOv3 algorithm based on YOLOv3 that improves detection precision and recall and can further distinguish workers' working states through posture recognition. However, its labeling process is complex and it cannot recognize worker states in a variety of complex scenes, so its practical operability is weak.
The invention "apparatus and method for analyzing status of manual work by worker" and program disclosed in 2019 (application publication No. CN111565293A) proposes a method for determining status of manual work by worker using computer vision technology. The method can be used for pertinently judging the specific working state, but has lower universality on the work.
The invention "Method, system, device and computer equipment for detecting personnel states" (application publication number CN111860152A), disclosed in 2020 by Zhejiang Dahua Technology, provides a software and hardware platform that uses computer vision to detect and identify the number of people on duty; it counts personnel off-duty time by confirming personnel identity. The drawbacks of this technology are that the position information of the target is not considered and that group behavior and activity types cannot be distinguished.
At present, the attention of scholars at home and abroad to efficiency management and evaluation methods for construction sites focuses mainly on the factors influencing construction efficiency, improvement methods, quantification methods and evaluation models. Research on improvement methods is mostly limited to special environments or specific project types and lacks generality. Research on the influencing factors shows two polarized tendencies in its angle of analysis: one is analysis of national policy, government support, enterprise management systems, employee training systems and the like, which is too macroscopic to be practically operable; the other is research on special environmental influences and special process flows, whose analysis generalizes poorly because of the limitations of its content and scenes.
In summary, efficiency evaluation of construction sites at the present stage mainly considers workload and completion time, and such methods cannot properly evaluate the work concentration of individual workers.
Disclosure of Invention
In view of the poor efficiency-evaluation results obtained for construction sites in the prior art, the object of the present invention is to provide a method, system, device and medium for evaluating work efficiency based on computer vision.
To achieve this object, the technical solution of the invention is as follows:
In one aspect, the present invention provides a method for evaluating work efficiency based on computer vision, comprising the following steps:
acquiring a video image of a construction site;
identifying workers from the video images through a deep learning module, identifying the workers' work types through a pedestrian re-identification module, and determining the work type information of the workers;
continuously identifying and capturing workers, and recording work information of the workers, wherein the work information at least comprises position information;
acquiring a preset judgment condition according to the work type information of a worker, and judging whether the worker is in a working state or not according to the work information of the worker and the preset judgment condition;
and determining the working efficiency of the worker according to the duration of the working state or the non-working state and the preset standard working time.
Preferably, the step of obtaining a preset judgment condition according to the work type information of a worker and judging whether the worker is in a working state according to the work information of the worker and the preset judgment condition comprises:
detecting whether the time during which the position of a worker of a certain work type is outside the working area corresponding to that work type exceeds a first threshold; if so, the worker is judged to be in a non-working state, and the length of time beyond the first threshold is the duration of the non-working state.
In another preferred embodiment, the step of obtaining a preset judgment condition according to the work type information of a worker and judging whether the worker is in a working state according to the work information of the worker and the preset judgment condition is:
detecting whether the time during which the moving path of a worker of a certain work type lies outside the working path area corresponding to that work type exceeds a second threshold; if so, the worker is judged to be in a non-working state, and the length of time beyond the second threshold is the duration of the non-working state.
In another preferred embodiment, the step of obtaining a preset judgment condition according to the work type information of a worker and judging whether the worker is in a working state according to the work information of the worker and the preset judgment condition is:
detecting whether the duration of a work type cluster exceeds a third threshold; if so, the workers participating in the cluster are judged to be in a non-working state, and the length of time beyond the third threshold is the duration of their non-working state.
Preferably, work type clustering is judged by the following steps:
detecting whether there are several workers whose mutual distances are smaller than a fourth threshold;
and judging whether those workers belong to different work types; if so, they are judged to form a work type cluster.
Preferably, the working path area is determined by:
determining the position of each corresponding work element according to the work type information;
connecting the working elements with each other;
and expanding each connecting line outwards by a certain distance to form a strip-shaped area, the union of these strip-shaped areas being the working path area.
Further, the work information also includes posture information, whose categories include walking, squatting, standing still and others, and the method further comprises the steps of:
judging whether the time during which the worker's posture information stays in one of the categories walking, squatting or standing still exceeds a fifth threshold; if so, the worker is judged to be in an abnormal state, and the length of time beyond the fifth threshold is the duration of the abnormal state;
and determining the duration of the worker's working or non-working state according to the result of manually checking the abnormal state, and thereby determining the worker's efficiency.
In another aspect, the present invention further provides a system for evaluating work efficiency based on computer vision, comprising:
The acquisition module is used for acquiring a video image and determining a preset judgment condition according to the work type information;
a deep learning module to identify a worker from the video images;
the pedestrian re-identification module is used for determining the identity and the work type of a worker;
a capturing and recording module for continuously identifying and capturing workers and recording work information of the workers;
the judging module is used for judging whether the worker is in a working state or not according to the working information of the worker and the preset judging condition; and
the calculation module is used for calculating the working efficiency of the worker according to the duration of the working state or the non-working state of the worker and the preset standard working time.
In yet another aspect, the present invention provides an electronic device comprising
A memory storing executable program code; and
a processor coupled with the memory;
wherein the processor executes the computer vision-based work efficiency evaluation method as described above when calling the executable program code stored in the memory.
In yet another aspect, the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, performs the steps of the method as described above.
By adopting the above technical solution, the invention has the following beneficial effects:
1. the application of computer vision, deep learning and pedestrian re-identification makes it possible to determine the identity, work type and position of workers on a construction site, so that the working state of a worker can be determined from the working area and path defined for that work type during construction together with the worker's position information, and the work efficiency can then be obtained;
2. recognizing that existing evaluation approaches cannot objectively evaluate idling, the scheme of the invention combines the working patterns of construction-site workers and makes an objective evaluation by using the analysis of worker position information, working path information, whether workers are clustered, and posture information as the basis for judging whether a worker is working; the evaluation result is accurate and efficient;
3. the evaluation method of the invention can optimize labor employment relations, and the labor resources required to complete the work can be derived, based on the recognition results, by a method of reducing non-working time.
Drawings
FIG. 1 is a flowchart of a method for evaluating work efficiency based on computer vision according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a worker non-working state determination based on a working area according to a first embodiment;
FIG. 3 is a schematic diagram illustrating a worker non-working state determination process based on a working area according to an embodiment;
FIG. 4 is a schematic diagram illustrating a worker non-working state determination based on a working path region according to a first embodiment;
FIG. 5 is a schematic diagram illustrating a worker non-working state determination process based on a working path region according to an embodiment;
FIG. 6 is a schematic diagram illustrating a non-working status determination of workers based on work category clustering according to an embodiment;
FIG. 7 is a schematic diagram illustrating a non-working status determining process of workers based on work category clustering according to an embodiment;
FIG. 8 is a flowchart of another method for evaluating work efficiency based on computer vision according to one embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a computer vision-based work efficiency evaluation system according to the present invention;
fig. 10 is a schematic structural diagram of an electronic device according to the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings. It should be noted that the description of the embodiments is provided to help understanding of the present invention, but the present invention is not limited thereto. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
It should be noted that in the description of the present invention, the terms "upper", "lower", "left", "right", "front", "rear", and the like indicate orientations or positional relationships based on structures shown in the drawings, and are only used for convenience in describing the present invention, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
In the technical scheme, the terms "first" and "second" are only used for referring to the same or similar structures or corresponding structures with similar functions, and are not used for ranking the importance of the structures, or comparing the sizes or other meanings.
In addition, unless expressly stated or limited otherwise, the terms "mounted" and "connected" are to be construed broadly, e.g., the connection may be a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; the two structures can be directly connected or indirectly connected through an intermediate medium, and the two structures can be communicated with each other. To those skilled in the art, the specific meanings of the above terms in the present invention can be understood in light of the present general concepts, in connection with the specific context of the scheme.
Example one
A method for evaluating work efficiency based on computer vision, as shown in FIG. 1, comprises the following steps:
step S1, acquiring a video image of a construction site;
step S2, identifying workers from the video images through the deep learning module, identifying the workers' work types through the pedestrian re-identification module, and determining the work type information of the workers;
step S3, continuously identifying and capturing workers, and recording the work information of the workers, wherein the work information at least comprises position information;
step S4, acquiring preset judgment conditions according to the work type information of workers, and judging whether the workers are in working states or not according to the working information of the workers and the preset judgment conditions;
and step S5, determining the working efficiency of the worker according to the duration of the working state or the non-working state and the preset standard working time.
In use, several sets of image acquisition equipment are set up on the construction site where worker efficiency is to be evaluated; the image acquisition equipment includes, but is not limited to, high-definition digital cameras with stabilizers, industrial cameras and video surveillance cameras. The number and arrangement of the image acquisition devices depend on the site environment, the goal being to cover every area where workers may appear.
The image acquisition equipment runs continuously, and the captured video image (site image) information is transmitted through image transmission equipment to the device on which the method runs. Persons in the video images are identified by a deep learning method, their characteristic information is recognized by pedestrian re-identification, and the identity information of each worker is recorded. Each worker is then continuously identified and tracked to obtain the worker's work information; in this embodiment the work information is the worker's position information, which is finally output as a record with coordinates.
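The recording step described above can be viewed as a small per-frame pipeline. The sketch below is only an illustration under assumed interfaces: detect_workers and identify_work_type are hypothetical placeholders for the deep-learning detector/tracker and the pedestrian re-identification step, which the text does not specify at code level.

```python
# Minimal sketch of the position-recording loop (assumed interfaces).
from dataclasses import dataclass
from typing import Callable, Iterable, List, Tuple

@dataclass
class PositionRecord:
    worker_id: str                 # identity from pedestrian re-identification
    work_type: str                 # e.g. "concrete pouring", "rebar binding"
    timestamp_min: float           # minutes since the start of the shift
    xy: Tuple[float, float]        # site coordinates of the worker

def record_positions(frames: Iterable, timestamps_min: Iterable[float],
                     detect_workers: Callable, identify_work_type: Callable) -> List[PositionRecord]:
    """Turn a stream of site-camera frames into coordinate records, one per worker per frame."""
    records: List[PositionRecord] = []
    for frame, t in zip(frames, timestamps_min):
        for worker_id, xy in detect_workers(frame):           # detector + tracker (placeholder)
            work_type = identify_work_type(worker_id)         # work type from re-ID features (placeholder)
            records.append(PositionRecord(worker_id, work_type, t, xy))
    return records
```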
The working state of a worker can then be judged by analyzing the worker's position information against one or more preset judgment conditions. For example:
1. The working area of a certain work type is used as the judgment condition. The judging method is to detect whether the time during which the position of a worker of that work type is outside the working area corresponding to the work type exceeds a first threshold; if so, the worker is judged to be in a non-working state, and the length of time beyond the first threshold is the duration of the non-working state.
In reality the range of working activity of a given work type is determinate. According to this rule, a working area for the work type is planned in advance, the real-time position of a worker of that work type is acquired, and a timer is started when the worker's real-time position leaves the planned working area. If the worker's position returns to the planned working area before the timer reaches the set first threshold, the worker is considered to have been working the whole time; if the timer reaches or exceeds the first threshold and the worker's real-time position is still outside the defined working area, the worker is judged to be in a non-working state, as shown in fig. 2.
Based on the above method, take an 8-hour shift as an example with the first threshold set to 10 minutes. FIG. 3 shows the working area and working path of a concrete pouring worker. Suppose the worker works normally during working hours, has no emergencies, and the following occurs: the worker leaves the work site once for 70 minutes because of personal matters, then returns to the post and continues working, and works normally at the post the rest of the time.
According to this judgment method, the worker has left the working area for longer than the first threshold and is therefore marked as non-working. Finally, the worker's efficiency for the day is: [1 - (70 - 10)/(60 × 8)] × 100% = 87.5%.
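As an illustration of judgment condition 1, the sketch below accumulates the time a worker spends outside the working area and counts only the portion of each stay that exceeds the first threshold, as in the example above; the sampling format and the helper name are assumptions.

```python
# Sketch of the working-area check (judgment condition 1), under an assumed data layout.
def non_working_minutes_area(samples, t0_min=10.0):
    """samples: time-ordered (timestamp_minutes, inside_working_area) pairs.
    Returns the sum of (t_n - t0) over every stay outside the area longer than t0."""
    total, outside, run_start = 0.0, False, None
    for t, inside in samples:
        if not inside and not outside:            # the worker has just left the working area
            outside, run_start = True, t
        elif inside and outside:                  # the worker has come back
            outside = False
            run = t - run_start
            if run > t0_min:                      # only time beyond the first threshold counts
                total += run - t0_min
    if outside:                                   # still outside at the end of the records
        run = samples[-1][0] - run_start
        if run > t0_min:
            total += run - t0_min
    return total

# The worked example: one 70-minute absence during a 480-minute shift.
samples = [(0, True), (100, False), (170, True), (480, True)]
print(non_working_minutes_area(samples))  # 60.0 minutes counted as non-working
```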
2. The working path area of a certain work type is used as the judgment condition. The judging method is to detect whether the time during which the moving path of a worker of that work type lies outside the working path area corresponding to the work type exceeds a second threshold; if so, the worker is judged to be in a non-working state, and the length of time beyond the second threshold is the duration of the non-working state.
The working path area is determined by the following steps: determining the position of each work element corresponding to the work type according to the work type information; connecting the work elements to one another; and expanding each connecting line outwards by a certain distance to form a strip-shaped area, the union of these strip-shaped areas being the working path area.
In reality the raw materials, tools, equipment, work sites and work objects used by a given work type are generally fixed, so during work the workers of that work type should stay close to these elements and shuttle among them; that is, a worker's moving path should only appear near the lines connecting them. All the elements are therefore connected by straight lines, and each line is extended several metres to both sides to form a strip-shaped area; this area is the working path area. If the worker's position during working time stays within the working path area, the worker is considered to be working; if it appears outside the working path area, the worker is considered to be in a non-working state, as shown in fig. 4.
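A possible way to construct such a buffered path area is sketched below with the shapely library; the choice of shapely, the element coordinates and the 2-metre half-width are assumptions made for illustration only.

```python
# Sketch: connect the work elements pairwise and buffer each line into a strip;
# the union of the strips is the working path area (shapely is an assumed tool choice).
from itertools import combinations
from shapely.geometry import LineString, Point
from shapely.ops import unary_union

def working_path_area(work_elements, half_width_m=2.0):
    """work_elements: list of (x, y) site coordinates of the work elements of one work type."""
    strips = [LineString([a, b]).buffer(half_width_m)
              for a, b in combinations(work_elements, 2)]
    return unary_union(strips)

def on_path(area, xy) -> bool:
    """True if the worker's position lies inside the working path area."""
    return area.contains(Point(xy))

# Illustrative elements: bar stock, bending machine and work face of a rebar crew.
area = working_path_area([(0, 0), (20, 5), (35, 30)], half_width_m=2.0)
print(on_path(area, (10, 2.5)), on_path(area, (10, 25)))  # True False
```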
In addition, possible emergencies should be excluded when using this method, for example a worker briefly going to the toilet, temporarily going to the office area, or operating other equipment such as a distribution box. There are two ways to exclude such emergencies: (1) by mathematical statistics, subtract the average daily emergency time of workers from the counted daily non-working time; (2) mark elements such as offices and toilets, obtain a specific area around them in the same way as in the evaluation method (connect them with the other elements and extend the lines to both sides), and remove moving paths appearing in that area from the non-working state.
Based on the above method, take an 8-hour shift as an example, with the second threshold set to 10 minutes and the extension distance of the connecting lines between work elements set to 2 metres. FIG. 5 shows the position points of the equipment (work elements) used by a rebar-binding worker and the corresponding path. Suppose the worker works normally during working hours, has no emergencies, and the following occurs: the worker leaves the working path for personal matters while remaining inside the defined working area for 130 minutes, then returns to the post and continues working, and works normally at the post the rest of the time.
According to this judgment method, the worker has left the working path area for longer than the second threshold and is marked as non-working. Finally, the worker's efficiency for the day is: [1 - (130 - 10)/(60 × 8)] × 100% = 75%.
3. Work type clustering is used as the judgment condition. The judging method is to detect whether the duration of a work type cluster exceeds a third threshold; if so, the workers participating in the cluster are judged to be in a non-working state, and the length of time beyond the third threshold is the duration of their non-working state.
Work type clustering is judged by the following steps: detecting whether there are several workers whose mutual distances are smaller than a fourth threshold; and judging whether those workers belong to different work types; if so, they are judged to form a work type cluster.
When work type clustering is checked, each worker is first positioned and the worker's work type determined. Then, for each worker (the central person), the 3 other workers nearest to the central person (the surrounding persons) are found from the positioning information and their straight-line distances are compared with the fourth threshold; a surrounding person whose distance is smaller than the fourth threshold is counted as clustered with the central person. If 4 or more workers are mutually clustered and their work types are not all the same, a clustering phenomenon is judged to exist, as shown in fig. 6.
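The clustering check can be sketched as below. For brevity the sketch groups every worker lying within the fourth threshold of the central person instead of examining only the 3 nearest workers, which is a simplification of the rule above; the names, coordinates and work types are illustrative.

```python
# Sketch of the work type clustering check (simplified neighbourhood rule).
from math import dist

def clustered_workers(workers, d4_m=1.0, min_group=4):
    """workers: list of (worker_id, work_type, (x, y)). Returns the ids flagged as clustered."""
    flagged = set()
    for i, (wid, wtype, xy) in enumerate(workers):
        # the central person plus every surrounding person within the fourth threshold
        group = [(wid, wtype)] + [(w, t) for j, (w, t, p) in enumerate(workers)
                                  if j != i and dist(xy, p) < d4_m]
        work_types = {t for _, t in group}
        if len(group) >= min_group and len(work_types) > 1:   # 4 or more people, mixed work types
            flagged.update(w for w, _ in group)
    return flagged

# The worked example: 2 masonry workers and 2 rebar workers within 1 m of each other.
site = [("m1", "masonry", (0.0, 0.0)), ("m2", "masonry", (0.4, 0.2)),
        ("r1", "rebar", (0.2, 0.5)), ("r2", "rebar", (0.5, 0.6)),
        ("c1", "concrete", (30.0, 40.0))]
print(sorted(clustered_workers(site)))  # ['m1', 'm2', 'r1', 'r2']
```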
Based on the above method, take an 8-hour shift as an example, with the third threshold set to 10 minutes and the fourth threshold set to 1 metre. FIG. 7 shows the working areas, equipment position points and paths of masonry workers and rebar-binding workers. Suppose the workers work normally during working hours, have no emergencies, and the following occurs: 2 masonry workers and 2 rebar-binding workers, 4 in total, gather for personal reasons with mutual distances of less than 1 m while remaining inside their respective working areas, and this lasts 250 minutes; they then return to their posts and continue working, and work normally at their posts the rest of the time.
According to this judgment method, all 4 workers exhibit work type clustering for longer than the third threshold and are all marked as non-working. Finally, the efficiency of each of the 4 workers for the day is: [1 - (250 - 10)/(60 × 8)] × 100% = 50%.
It should be understood that the above three position-based methods for judging whether a worker is working can be applied individually or jointly, but are usually applied in sequence: first judge whether the worker is inside the working area, then whether the moving path lies inside the working path area, and then whether workers of different work types show a clustering phenomenon. The sum of the non-working time found by the three judgment methods over a period (one working day) is the total non-working time for that day; its percentage of the standard working time is calculated, and the complement of that percentage to 100% is the worker's efficiency.
The worker's work efficiency is calculated by the following formula:
\eta = \left( 1 - \frac{1}{T} \sum_{n=1}^{K} (t_n - t_0) \right) \times 100\% \qquad (1)

where η is the work efficiency; T is the standard working time; K is the total number of times the worker exceeds a region threshold within the standard working time; t_n is the duration of the n-th excess within the standard working time; and t_0 is the corresponding time threshold. The region thresholds are the working area, the working path area and the work type cluster under the three judgment methods, and the time thresholds are the first, second and third thresholds respectively.
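A direct numerical transcription of formula (1) can be sketched as follows; it reproduces the three worked examples above. The function name and argument layout are assumptions, not part of the invention.

```python
# Sketch of formula (1): eta = (1 - sum(t_n - t0) / T) x 100%.
def work_efficiency(excess_durations_min, t0_min=10.0, standard_min=8 * 60):
    """excess_durations_min: durations t_n (minutes) of each detected excess of a region threshold."""
    lost = sum(tn - t0_min for tn in excess_durations_min if tn > t0_min)
    return max(0.0, 1.0 - lost / standard_min)

print(f"{work_efficiency([70]):.1%}")    # 87.5%  (working-area example)
print(f"{work_efficiency([130]):.1%}")   # 75.0%  (working-path example)
print(f"{work_efficiency([250]):.1%}")   # 50.0%  (clustering example)
```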
In another embodiment, the work information includes posture information in addition to the position information. The posture information is obtained by recognizing the worker's posture with deep learning; the postures are divided into four categories, namely walking, squatting, standing still and others, and the categories of the posture information are the same.
Therefore, on the basis of the above embodiment, the method of the invention further provides a step of judging the worker's working state from the posture information, as shown in fig. 8. The method comprises:
step S6, judging whether the time during which the worker's posture information stays in one of the categories walking, squatting or standing still exceeds a fifth threshold; if so, the worker is judged to be in an abnormal state, and the length of time beyond the fifth threshold is the duration of the abnormal state;
and step S7, determining the duration of the worker's working or non-working state according to the result of manually checking the abnormal state, and thereby determining the worker's efficiency.
That is, while the worker's position information is continuously identified and recorded by the deep learning algorithm, the worker's posture information is recognized and recorded synchronously. The posture information serves as an auxiliary judgment: after one or more of the position-based judgment methods have been applied, the posture information is examined, and when a worker's posture stays in the category walking, squatting or standing still for too long within a period of time (for example, a working day), that is, longer than the fifth threshold, the worker is considered abnormal and is marked as being in an abnormal state. The video is then watched and screened manually to determine whether the abnormal state corresponds to an emergency, a working state or a non-working state, and according to the manual result the worker's efficiency can be calculated with formula (1). In addition, the manually checked results of abnormal states can serve as empirical data and provide a basis for deducting emergency time from workers' non-working time in the future.
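The posture-based auxiliary check can be sketched as a run-length scan over posture labels, as below; the label strings and the 30-minute value used for the fifth threshold are illustrative assumptions, since the text does not fix that value.

```python
# Sketch of the posture check: flag runs of a single watched posture longer than the fifth threshold.
def abnormal_runs(posture_samples, t5_min=30.0,
                  watched=("walking", "squatting", "standing_still")):
    """posture_samples: time-ordered (timestamp_minutes, posture_label) pairs.
    Returns (start, end, label) for every run of a watched posture longer than t5."""
    runs, start, label = [], None, None
    for t, p in posture_samples:
        if p != label:                                   # the posture label has changed
            if label in watched and start is not None and t - start > t5_min:
                runs.append((start, t, label))
            start, label = t, p
    if label in watched and start is not None:           # close the final run
        t_end = posture_samples[-1][0]
        if t_end - start > t5_min:
            runs.append((start, t_end, label))
    return runs

samples = [(0, "other"), (60, "standing_still"), (130, "walking"), (480, "other")]
print(abnormal_runs(samples))  # [(60, 130, 'standing_still'), (130, 480, 'walking')]
```

Each returned run would be marked as an abnormal state and handed to manual screening, as described above.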
Example two
A computer vision-based work efficiency evaluation system, as shown in FIG. 9, comprises:
The acquisition module is used for acquiring the video image and determining a preset judgment condition according to the work type information;
the deep learning module is used for identifying workers from the video images;
the pedestrian re-identification module is used for determining the identity and the work type of a worker;
the system comprises a capturing and recording module, a monitoring module and a control module, wherein the capturing and recording module is used for continuously identifying and capturing workers and recording the work information of the workers;
the judging module is used for judging whether the worker is in a working state or not according to the working information of the worker and a preset judging condition; and
and the calculation module is used for calculating the working efficiency of the worker according to the duration of the working state or the non-working state of the worker and the preset standard working time.
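As a rough illustration of how the modules of FIG. 9 could be composed, a skeleton follows; every interface in it is a placeholder assumed for the sketch, not the implementation of the invention.

```python
# Skeleton wiring of the modules in FIG. 9 (all interfaces are placeholders).
class EfficiencyEvaluationSystem:
    def __init__(self, acquisition, deep_learning, reid, recorder, judge, calculator):
        self.acquisition = acquisition      # acquisition module: video images + preset judgment conditions
        self.deep_learning = deep_learning  # deep learning module: identify workers in frames
        self.reid = reid                    # pedestrian re-identification module: identity + work type
        self.recorder = recorder            # capturing and recording module: track workers, log work info
        self.judge = judge                  # judging module: working vs non-working state
        self.calculator = calculator        # calculation module: efficiency from state durations

    def evaluate_shift(self, video_stream, standard_min=8 * 60):
        conditions = self.acquisition.preset_conditions()
        records = self.recorder.track(video_stream, self.deep_learning, self.reid)
        states = self.judge.judge(records, conditions)
        return self.calculator.efficiency(states, standard_min)
```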
EXAMPLE III
An electronic device, as shown in FIG. 10, includes
A memory storing executable program code; and
a processor coupled to the memory;
when the processor calls the executable program codes stored in the memory, the steps of the working efficiency evaluation method based on computer vision in the first embodiment are executed.
Example four
A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method as disclosed in the first embodiment.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the described embodiments. It will be apparent to those skilled in the art that various changes, modifications, substitutions and alterations can be made to these embodiments without departing from the principles and spirit of the invention, and such changes remain within the scope of protection of the invention.

Claims (10)

1. A work efficiency evaluation method based on computer vision, characterized in that it comprises the following steps:
acquiring a video image of a construction site;
identifying workers from the video images through a deep learning module, identifying the workers' work types through a pedestrian re-identification module, and determining the work type information of the workers;
continuously identifying and capturing workers, and recording work information of the workers, wherein the work information at least comprises position information;
acquiring a preset judgment condition according to the work type information of a worker, and judging whether the worker is in a working state or not according to the work information of the worker and the preset judgment condition;
and determining the working efficiency of the worker according to the duration of the working state or the non-working state and the preset standard working time.
2. The computer vision-based work efficiency evaluation method according to claim 1, characterized in that the step of obtaining a preset judgment condition according to the work type information of a worker and judging whether the worker is in a working state according to the work information of the worker and the preset judgment condition comprises:
detecting whether the time during which the position of a worker of a certain work type is outside the working area corresponding to that work type exceeds a first threshold; if so, the worker is judged to be in a non-working state, and the length of time beyond the first threshold is the duration of the non-working state.
3. The computer vision-based work efficiency evaluation method according to claim 1, characterized in that the step of obtaining a preset judgment condition according to the work type information of a worker and judging whether the worker is in a working state according to the work information of the worker and the preset judgment condition comprises:
detecting whether the time during which the moving path of a worker of a certain work type lies outside the working path area corresponding to that work type exceeds a second threshold; if so, the worker is judged to be in a non-working state, and the length of time beyond the second threshold is the duration of the non-working state.
4. The computer vision-based work efficiency evaluation method according to claim 1, characterized in that the step of obtaining a preset judgment condition according to the work type information of a worker and judging whether the worker is in a working state according to the work information of the worker and the preset judgment condition comprises:
detecting whether the duration of a work type cluster exceeds a third threshold; if so, the workers participating in the cluster are judged to be in a non-working state, and the length of time beyond the third threshold is the duration of their non-working state.
5. The computer vision-based work efficiency evaluation method according to claim 4, characterized in that work type clustering is judged by the following steps:
detecting whether there are several workers whose mutual distances are smaller than a fourth threshold;
and judging whether those workers belong to different work types; if so, they are judged to form a work type cluster.
6. The computer vision-based work efficiency evaluation method according to claim 3, characterized in that the working path area is determined by:
determining the position of each work element corresponding to the work type according to the work type information;
connecting the work elements to one another;
and expanding each connecting line outwards by a certain distance to form a strip-shaped area, the union of these strip-shaped areas being the working path area.
7. The computer vision-based work efficiency evaluation method according to claim 1, characterized in that the work information further comprises posture information, the categories of which include walking, squatting, standing still and others, and the method further comprises the steps of:
judging whether the time during which the worker's posture information stays in one of the categories walking, squatting or standing still exceeds a fifth threshold; if so, the worker is judged to be in an abnormal state, and the length of time beyond the fifth threshold is the duration of the abnormal state;
and determining the duration of the worker's working or non-working state according to the result of manually checking the abnormal state, and thereby determining the worker's efficiency.
8. A work efficiency evaluation system based on computer vision, characterized in that it comprises:
The acquisition module is used for acquiring a video image and determining a preset judgment condition according to the work type information;
a deep learning module to identify a worker from the video images;
the pedestrian re-identification module is used for determining the identity and the work type of a worker;
a capturing and recording module for continuously identifying and capturing workers and recording work information of the workers;
the judging module is used for judging whether the worker is in a working state or not according to the working information of the worker and the preset judging condition; and
the calculation module is used for calculating the working efficiency of the worker according to the duration of the working state or the non-working state of the worker and the preset standard working time.
9. An electronic device, characterized in that it comprises:
A memory storing executable program code; and
a processor coupled with the memory;
wherein the processor, when calling the executable program code stored in the memory, executes the computer vision-based work efficiency evaluation method of any one of claims 1 to 7.
10. A computer-readable storage medium storing a computer program, characterized in that: the computer program, when being executed by a processor, performs the steps of the method according to any of the claims 1-7.
CN202110090185.XA 2021-01-22 2021-01-22 Method, system, device and medium for evaluating work efficiency based on computer vision Pending CN112819306A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110090185.XA CN112819306A (en) 2021-01-22 2021-01-22 Method, system, device and medium for evaluating work efficiency based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110090185.XA CN112819306A (en) 2021-01-22 2021-01-22 Method, system, device and medium for evaluating work efficiency based on computer vision

Publications (1)

Publication Number Publication Date
CN112819306A true CN112819306A (en) 2021-05-18

Family

ID=75858855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110090185.XA Pending CN112819306A (en) 2021-01-22 2021-01-22 Method, system, device and medium for evaluating work efficiency based on computer vision

Country Status (1)

Country Link
CN (1) CN112819306A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210225029A1 (en) * 2018-05-16 2021-07-22 Panasonic Intellectual Property Management Co., Ltd. Work analyzing system and work analyzing method
CN113570749A (en) * 2021-07-22 2021-10-29 用友汽车信息科技(上海)股份有限公司 Method for determining station operation time, station operation system and readable storage medium
CN114758267A (en) * 2022-03-14 2022-07-15 北京明略软件系统有限公司 Method and device for determining loading and unloading operation efficiency
CN118211817A (en) * 2024-05-22 2024-06-18 汶上如意润发纺织有限公司 Operator safety management system suitable for cotton bale plucker

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107992836A (en) * 2017-12-12 2018-05-04 中国矿业大学(北京) A kind of recognition methods of miner's unsafe acts and system
CN108596148A (en) * 2018-05-10 2018-09-28 华南理工大学 A kind of construction worker's labour state analysis system based on computer vision and method
CN109978390A (en) * 2019-03-29 2019-07-05 嘉应学院 Office efficiency assessment system and its method based on image recognition
CN110292387A (en) * 2019-06-28 2019-10-01 广西慧云信息技术有限公司 A kind of device and statistical method counting agriculture labourer's working efficiency
CN111613028A (en) * 2020-06-24 2020-09-01 上海明略人工智能(集团)有限公司 Personnel monitoring system and personnel monitoring method
CN111798581A (en) * 2020-06-09 2020-10-20 上海万位科技有限公司 Personnel management method and system based on indoor and outdoor fusion position service

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107992836A (en) * 2017-12-12 2018-05-04 中国矿业大学(北京) A kind of recognition methods of miner's unsafe acts and system
CN108596148A (en) * 2018-05-10 2018-09-28 华南理工大学 A kind of construction worker's labour state analysis system based on computer vision and method
CN109978390A (en) * 2019-03-29 2019-07-05 嘉应学院 Office efficiency assessment system and its method based on image recognition
CN110292387A (en) * 2019-06-28 2019-10-01 广西慧云信息技术有限公司 A kind of device and statistical method counting agriculture labourer's working efficiency
CN111798581A (en) * 2020-06-09 2020-10-20 上海万位科技有限公司 Personnel management method and system based on indoor and outdoor fusion position service
CN111613028A (en) * 2020-06-24 2020-09-01 上海明略人工智能(集团)有限公司 Personnel monitoring system and personnel monitoring method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
罗德焕: "Computer-vision-based analysis of construction workers' labor state", China Master's Theses Full-Text Database, Engineering Science and Technology I *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210225029A1 (en) * 2018-05-16 2021-07-22 Panasonic Intellectual Property Management Co., Ltd. Work analyzing system and work analyzing method
US11842511B2 (en) * 2018-05-16 2023-12-12 Panasonic Intellectual Property Management Co., Ltd. Work analyzing system and work analyzing method
CN113570749A (en) * 2021-07-22 2021-10-29 用友汽车信息科技(上海)股份有限公司 Method for determining station operation time, station operation system and readable storage medium
CN114758267A (en) * 2022-03-14 2022-07-15 北京明略软件系统有限公司 Method and device for determining loading and unloading operation efficiency
CN118211817A (en) * 2024-05-22 2024-06-18 汶上如意润发纺织有限公司 Operator safety management system suitable for cotton bale plucker

Similar Documents

Publication Publication Date Title
CN112819306A (en) Method, system, device and medium for evaluating work efficiency based on computer vision
CN110781914B (en) Equipment fault monitoring and processing method, device, equipment and storage medium
CN107169426A (en) A kind of detection of crowd's abnormal feeling and localization method based on deep neural network
CN112785798A (en) Behavior analysis method for construction project constructors of electric power substation engineering
CN110414400B (en) Automatic detection method and system for wearing of safety helmet on construction site
CN106991668B (en) Evaluation method for pictures shot by skynet camera
CN112560745B (en) Method for discriminating personnel on electric power operation site and related device
CN113903081A (en) Visual identification artificial intelligence alarm method and device for images of hydraulic power plant
CN112580995A (en) Construction safety big data monitoring system and safety risk dynamic evaluation method
CN113240249B (en) Urban engineering quality intelligent evaluation method and system based on unmanned aerial vehicle augmented reality
CN115550609B (en) Building internet of things monitoring system capable of realizing automatic adaptation
CN116777395A (en) Intelligent building supervision acceptance system for building engineering
CN116805204B (en) Intelligent plant monitoring method and system
CN117114619B (en) Project security management system based on big data analysis
CN110543866A (en) Safety management system and method for capital construction engineering constructors
CN114140999A (en) Engineering supervision system based on communication of Internet of things
CN117035419B (en) Intelligent management system and method for enterprise project implementation
CN115995056A (en) Automatic bridge disease identification method based on deep learning
CN116993665A (en) Intelligent detection method for construction progress of construction engineering working face based on computer vision
CN112150098A (en) Electric power facility anti-terrorism security supervision method and system
CN114548912A (en) Whole-process tracking method and system for building engineering project management
CN114091944A (en) Cloud-end-coordinated distribution network engineering field operation analysis decision system
CN113421168B (en) Intelligent machining system for mechanical basic parts
CN116344378B (en) Intelligent detection system for photovoltaic panel production and detection method thereof
CN117079203A (en) Road construction monitoring method, system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20210518