CN109492602B - Process timing method and system based on human body language


Info

Publication number
CN109492602B
Authority
CN
China
Prior art keywords
worker
production process
video data
timing
peak value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811393385.7A
Other languages
Chinese (zh)
Other versions
CN109492602A (en)
Inventor
杜吉祥
黄金龙
张洪博
卢孔知
周以重
池守敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huaqiao University
Original Assignee
Huaqiao University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huaqiao University
Priority to CN201811393385.7A
Publication of CN109492602A
Application granted
Publication of CN109492602B
Legal status: Active
Anticipated expiration

Classifications

    • G06V 40/20: Recognition of biometric, human-related or animal-related patterns in image or video data; movements or behaviour, e.g. gesture recognition
    • G06V 20/40: Scenes; scene-specific elements in video content
    • G07C 1/10: Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people, together with the recording, indicating or registering of other data, e.g. of signs of identity
    • G06F 2218/10: Feature extraction by analysing the shape of a waveform, e.g. extracting parameters relating to peaks

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • General Factory Administration (AREA)

Abstract

The invention discloses a process timing method and system based on human body language. The method comprises: acquiring video data of a worker in a production process; identifying the worker's body language information from the video data; establishing, from the video data and the identified body language information, a two-dimensional graph of the body-language coordinates against the video frames; denoising the graph; marking the coordinate data that represent the body language information in the denoised graph and labelling the peaks; and, taking the interval between two peaks as one process cycle according to the marked peaks, counting the process time in the worker's production process to obtain the process timing. In this way, the processes in a worker's production can be timed automatically without manual participation, so that process timing is more objective and accurate.

Description

Process timing method and system based on human body language
Technical Field
The invention relates to the technical field of process timing, and in particular to a process timing method and system based on human body language.
Background
With the development of computer science and technology, the sewing industry can use modern computing to digitize the handling of standard sewing times, giving enterprises an intelligent, friendly and convenient working environment. Computer software can build a powerful static database that is richly illustrated and easy to search, so that ordinary managers can easily select and copy data and quickly carry out style analysis and time analysis. The software can present the analysis results as graphs and tables and can perform operations such as process balancing, personnel allocation, machine allocation, production scheduling and capacity analysis, providing strong support for management in the sewing industry.
In the current sewing industry, the computer management systems in use are based purely on conventional software technology, which limits their scope of application and reliability. In particular, for timing each sewing process, the prevailing approach is to have workers swipe a card at the start and at the end of the process to verify identity and record the times. These operations require a great deal of manual participation, which easily wastes personnel, and the human factor makes the timing results neither objective nor accurate.
Disclosure of Invention
In view of the above, the present invention provides a process timing method and system based on human body language, which can automatically time the processes in a worker's production without manual participation, avoiding the human factor in process timing and making the timing more objective and accurate.
According to one aspect of the invention, a process timing method based on human body limb language is provided, which comprises the following steps:
acquiring video data in a production process of a worker;
identifying body language information of the worker in the production process from the video data according to the acquired video data;
according to the acquired video data and the recognized body language information, a two-dimensional coordinate graph based on the video frames of the video data and the recognized body language information is established;
carrying out image denoising on the established two-dimensional coordinate graph;
marking coordinate data which represent body language information in the two-dimensional coordinate graph after the image is denoised, and marking a peak value; wherein the peak values include a maximum peak value and a minimum peak value;
and taking two peak value intervals as a process period according to the marked peak values, and counting the process time in the production process of the worker to obtain the process timing in the production process of the worker.
Wherein, the acquiring of the video data in the production process of a worker comprises:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
Wherein, according to the obtained video data, identifying the body language information of the worker in the production process from the video data comprises:
and identifying the body language information of the worker in the production process from the video data by adopting a body language identification mode according to the acquired video data.
Wherein, according to the peak value noted, two peak value intervals are taken as a process period, and the process time in the production process of the worker is counted to obtain the process timing in the production process of the worker, and the method comprises the following steps:
and according to the time data respectively corresponding to the maximum peak interval and the minimum peak interval, counting the working procedure time in the production process of the worker to obtain the working procedure timing in the production process of the worker.
Wherein, after taking the interval between two peaks as one process cycle according to the marked peaks and counting the process time in the production process of the worker to obtain the process timing, the method further comprises:
according to the obtained process timing of the worker, collecting the process times of the same recurring process, comparing these process times with one another to identify abnormal process times, and automatically prompting the worker about any abnormal process time that is found.
According to an aspect of the present invention, there is provided a human body limb language based procedure timing system, comprising:
the system comprises a video acquisition unit, a limb language identification unit, a coordinate establishment unit, an image denoising unit, a peak value acquisition unit and a time statistic unit;
the video acquisition unit is used for acquiring video data in the production process of a worker;
the limb language identification unit is used for identifying the limb language information of the worker in the production process from the video data according to the acquired video data;
the coordinate establishing unit is used for establishing a two-dimensional coordinate graph based on the video frame of the video data and the recognized body language information according to the acquired video data and the recognized body language information;
the image denoising unit is used for denoising the image of the established two-dimensional coordinate graph;
the peak value taking unit is used for marking coordinate data which represent the body language information in the two-dimensional coordinate graph after the image is denoised and marking a peak value; wherein the peak values include a maximum peak value and a minimum peak value;
and the time counting unit is used for taking the interval of two peak values as a process period according to the marked peak values, counting the process time in the production process of the worker and obtaining the process timing in the production process of the worker.
The video acquisition unit is specifically configured to:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
Wherein, the body language identification unit is specifically configured to:
and identifying the body language information of the worker in the production process from the video data by adopting a body language identification mode according to the acquired video data.
Wherein, the time statistic unit is specifically configured to:
and according to the time data respectively corresponding to the maximum peak interval and the minimum peak interval, counting the working procedure time in the production process of the worker to obtain the working procedure timing in the production process of the worker.
Wherein, the human body language-based process timing system further comprises:
a process timing analysis unit, configured to collect, from the obtained process timing of the worker, the process times of the same recurring process, compare them with one another to identify abnormal process times, and automatically prompt the worker about any abnormal process time that is found.
It can be seen that, with the above scheme, video data of a worker in a production process is acquired; the worker's body language information is identified from the video data; a two-dimensional graph of the body-language coordinates against the video frames is established from the video data and the identified body language information; the graph is denoised; the coordinate data representing the body language information in the denoised graph are marked and the peaks labelled; and, taking the interval between two peaks as one process cycle according to the marked peaks, the process time in the worker's production process is counted to obtain the process timing. The processes in a worker's production can thus be timed automatically, without manual participation, avoiding the human factor and making the process timing more objective and accurate.
Furthermore, with the above scheme, the process times of the same recurring process can be collected from the obtained process timing of the worker, compared with one another to identify abnormal process times, and used to automatically prompt the worker when an abnormality is found, so that workers whose efficiency is abnormal can be alerted in time and the working efficiency of the production process improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart illustrating an embodiment of a human body language-based process timing method according to the present invention;
FIG. 2 is an example, in an embodiment of the human body language-based process timing method of the present invention, of a two-dimensional graph established from the video frames of a worker's production video and the identified body language information of the worker;
FIG. 3 is an example, in an embodiment of the human body language-based process timing method of the present invention, of the two-dimensional waveform obtained, after image denoising, from the video frames of a worker's production video and the identified body language information of the worker;
FIG. 4 is a flowchart illustrating another embodiment of the human body language-based procedure timing method according to the present invention;
FIG. 5 is a schematic diagram of an embodiment of a human body language-based process timing system according to the present invention;
FIG. 6 is a schematic structural diagram of another embodiment of the human body language-based process timing system of the present invention;
FIG. 7 is a schematic structural diagram of another embodiment of the human body limb language-based process timing system of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be noted that the following examples are only illustrative of the present invention, and do not limit the scope of the present invention. Similarly, the following examples are only some but not all examples of the present invention, and all other examples obtained by those skilled in the art without any inventive work are within the scope of the present invention.
The invention provides a process timing method based on human body limb language, which can automatically time a process in the production process of a worker without manual participation, avoids manual participation factors in the process timing and enables the process timing to be more objective and accurate.
Referring to fig. 1, fig. 1 is a flowchart illustrating a process timing method based on human body language according to an embodiment of the present invention. It should be noted that the method of the present invention is not limited to the flow sequence shown in fig. 1 if the results are substantially the same. As shown in fig. 1, the method comprises the steps of:
s101: video data in a production process of a worker is acquired.
The acquiring of the video data in the production process of a worker may include:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
S102: and identifying the body language information of the worker in the production process from the video data according to the acquired video data.
Wherein, the identifying of the body language information of the worker in the production process from the video data according to the acquired video data may include:
identifying the body language information of the worker in the production process from the acquired video data by means of OpenPose (a human pose estimation method).
S103: and establishing a two-dimensional coordinate graph based on the video frame of the video data and the recognized body language information according to the acquired video data and the recognized body language information.
Referring to FIG. 2, FIG. 2 is an example, in an embodiment of the human body language-based process timing method of the present invention, of a two-dimensional graph established from the video frames of a worker's production video and the identified body language information. As shown in FIG. 2, taking a sewing process as an example, most sewing processes in practice consist of a sequence of operations by the worker such as taking material, sewing and placing material, and in a normal process it is easy to observe that the worker always takes the material with the left hand, then sews it under the sewing machine, and places the finished product in the finished-product area once sewing is complete.
As shown in FIG. 2, when the worker takes the material with the left hand, the worker's left wrist is at that moment farthest from the origin in the X direction of the video frame, so the X coordinate of the left wrist in the frame reaches a maximum; plotting this value against the video frame number gives the two-dimensional graph shown in FIG. 2. In this graph the X axis is the video frame number and the Y axis is the X coordinate of the worker's left wrist in the frame, and each maximum corresponds to a peak point, i.e. a peak point represents a moment at which the worker takes material. The processes in the worker's production can then be timed automatically from these characteristics of the production process and the two-dimensional graph.
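As a concrete illustration of steps S102 and S103, the following Python sketch reads the per-frame JSON files that OpenPose writes when run with its --write_json option, extracts the X coordinate of the left wrist, and plots it against the frame number as in FIG. 2. It is only a minimal sketch under stated assumptions: the JSON layout is that of recent OpenPose releases (BODY_25/COCO keypoint order, left wrist at index 7), a single worker is assumed to be in view, and the 0.1 confidence cut-off is an illustrative value not taken from the patent.

```python
import json
from pathlib import Path
import matplotlib.pyplot as plt

LEFT_WRIST = 7  # index of the left wrist in OpenPose's BODY_25/COCO keypoint order

def left_wrist_x_per_frame(json_dir):
    """Read OpenPose --write_json output (one *_keypoints.json file per frame,
    sorted by file name) and return the left-wrist X coordinate for each frame."""
    xs = []
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        people = json.loads(path.read_text()).get("people", [])
        if not people:
            xs.append(None)                      # no worker detected in this frame
            continue
        kp = people[0]["pose_keypoints_2d"]      # flat list [x0, y0, c0, x1, y1, c1, ...]
        x, _, conf = kp[3 * LEFT_WRIST: 3 * LEFT_WRIST + 3]
        xs.append(x if conf > 0.1 else None)     # drop low-confidence detections
    return xs

def plot_wrist_curve(xs):
    """Plot frame number against the left-wrist X coordinate, mirroring FIG. 2."""
    ys = [float("nan") if x is None else x for x in xs]
    plt.plot(range(len(ys)), ys)
    plt.xlabel("video frame number")
    plt.ylabel("left-wrist X coordinate (pixels)")
    plt.show()
```

Each maximum of this curve then corresponds to a material-taking moment, which is the feature exploited in the following steps.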
S104: and carrying out image denoising on the established two-dimensional coordinate graph.
S105: marking coordinate data which represent body language information in the two-dimensional coordinate graph after the image is denoised, and marking a peak value; wherein the peak comprises a maximum peak and a minimum peak.
Referring to FIG. 3, FIG. 3 is an example, in an embodiment of the human body language-based process timing method of the present invention, of the two-dimensional waveform obtained, after image denoising, from the video frames of a worker's production video and the identified body language information. As shown in FIG. 3, taking a sewing process as an example, the first curve from the top is the two-dimensional waveform of one body-language coordinate of the worker, here the X coordinate of the wrist joint, plotted against the frame number of the production video, and the second curve from the top is the corresponding waveform of the Y coordinate of the wrist joint; the peaks are marked with dots, and each peak can represent a moment at which the worker's left hand takes material. Below the dotted peak points are two scatter line plots produced by a binary classifier, which may be built with GoogLeNet (a convolutional neural network), that classifies whether the current workbench is empty: when the workbench is not empty, a high threshold is set and the scatter plot with a process threshold of 600 is selected; otherwise, a lower threshold is set and the scatter plot with a process threshold of 580 is selected. As shown in FIG. 3, at the peak points the high-threshold scatter (threshold 600) is clearly more intermittent, indicating that the workbench is empty for a period of time.
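Steps S104 to S106 can be sketched as follows: the wrist-coordinate curve is denoised, its maxima and minima are labelled, and the interval between two adjacent maxima is converted into a cycle time in seconds. This is a non-authoritative sketch with several assumptions: the patent does not name a denoising algorithm, so a simple moving average is used here; the peak heights 600 and 580 reuse the example threshold values from the description, and the choice between them, which the patent makes with a GoogLeNet binary classifier of whether the workbench is empty, is reduced to a boolean parameter; the minimum peak distance of 30 frames and the 25 fps frame rate are likewise illustrative values.

```python
import numpy as np
from scipy.signal import find_peaks

def smooth_curve(xs, window=9):
    """Denoise the per-frame coordinate curve with a centred moving average;
    missed detections (None) are first filled by linear interpolation."""
    y = np.array([np.nan if x is None else x for x in xs], dtype=float)
    idx = np.arange(len(y))
    good = ~np.isnan(y)
    if good.any():
        y = np.interp(idx, idx[good], y[good])
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode="same")

def mark_peaks(smoothed, table_empty=False, min_cycle_frames=30):
    """Label maxima (material-taking moments) and minima on the smoothed curve.
    A higher height threshold (600) is used when the workbench is not empty and
    a lower one (580) when it is, mimicking the example values in the text."""
    height = 580 if table_empty else 600
    maxima, _ = find_peaks(smoothed, height=height, distance=min_cycle_frames)
    minima, _ = find_peaks(-smoothed, distance=min_cycle_frames)
    return maxima, minima

def cycle_times_seconds(maxima, fps=25.0):
    """Treat the interval between two adjacent maxima as one process cycle and
    convert it from frames to seconds."""
    return np.diff(np.asarray(maxima)) / fps

# Example with hypothetical numbers: maxima at frames 0, 260 and 515 at 25 fps
# give cycle times of 10.4 s and 10.2 s.
```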
S106: and taking two peak value intervals as a process period according to the marked peak value, and counting the process time in the production process of the worker to obtain the process timing in the production process of the worker.
Wherein, taking the interval between two peaks as one process cycle according to the marked peaks and counting the process time in the production process of the worker to obtain the process timing may include:
counting the process time in the production process of the worker according to the obtained time data corresponding respectively to the intervals between maximum peaks and between minimum peaks, to obtain the process timing in the production process of the worker.
Wherein, after taking the interval between two peaks as one process cycle according to the marked peaks and counting the process time in the production process of the worker to obtain the process timing, the method may further include:
collecting, from the obtained process timing of the worker, the process times of the same recurring process, comparing them with one another to identify abnormal process times, and automatically prompting the worker about any abnormal process time that is found.
In this embodiment, the human body language-based process timing method can be applied to the sewing industry: processes in production are timed from visually captured video, so that human factors in the process timing can be avoided.
In this embodiment, the method can use computer vision technology, in particular human pose estimation, to analyse the working time of workers in the production process, i.e. the process time, automatically extract the time taken to produce each product, and provide core data for subsequent management, work-efficiency analysis and the like.
In this embodiment, no one needs to hold a stopwatch or count time manually: the video is captured by a camera and its content is analysed automatically to obtain the process time, achieving an unattended process timing procedure free of human factors and filling the gap of vision-based process timing systems for sewn products in the market.
It can be seen that, with the above scheme, video data of a worker in a production process is acquired; the worker's body language information is identified from the video data; a two-dimensional graph of the body-language coordinates against the video frames is established from the video data and the identified body language information; the graph is denoised; the coordinate data representing the body language information in the denoised graph are marked and the peaks labelled; and, taking the interval between two peaks as one process cycle according to the marked peaks, the process time in the worker's production process is counted to obtain the process timing. The processes in a worker's production can thus be timed automatically, without manual participation, avoiding the human factor and making the process timing more objective and accurate.
Referring to fig. 4, fig. 4 is a flowchart illustrating another embodiment of the human body language-based procedure timing method according to the present invention. In this embodiment, the method includes the steps of:
s401: video data in a production process of a worker is acquired.
As described above in S101, further description is omitted here.
S402: and identifying the body language information of the worker in the production process from the video data according to the acquired video data.
As described above in S102, further description is omitted here.
S403: and establishing a two-dimensional coordinate graph based on the video frame of the video data and the recognized body language information according to the acquired video data and the recognized body language information.
As described above in S103, which is not described herein.
S404: and carrying out image denoising on the established two-dimensional coordinate graph.
S405: marking coordinate data which represent body language information in the two-dimensional coordinate graph after the image is denoised, and marking a peak value; wherein the peak comprises a maximum peak and a minimum peak.
As described above in S105, which is not described herein.
S406: and taking two peak value intervals as a process period according to the marked peak value, and counting the process time in the production process of the worker to obtain the process timing in the production process of the worker.
As described above in S106, and will not be described herein.
S407: collecting, from the obtained process timing of the worker, the process times of the same recurring process, comparing them with one another to identify abnormal process times, and automatically prompting the worker about any abnormal process time that is found.
It can be seen that, in this embodiment, the process times of the same recurring process can be collected from the obtained process timing of the worker, compared with one another to identify abnormal process times, and used to automatically prompt the worker when an abnormality is found, so that workers whose efficiency is abnormal can be alerted in time and the working efficiency of the production process improved.
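A minimal sketch of the comparison in S407, assuming that cycles deviating from the median cycle time by more than a fixed tolerance (30% here, an illustrative value not specified in the patent) are treated as abnormal and reported for a prompt:

```python
import numpy as np

def flag_abnormal_cycles(cycle_seconds, tolerance=0.30):
    """Compare each cycle of the same process against the median cycle time and
    return (index, time) pairs for cycles deviating by more than `tolerance`."""
    times = np.asarray(cycle_seconds, dtype=float)
    median = np.median(times)
    deviating = np.abs(times - median) > tolerance * median
    return [(i, t) for i, (t, bad) in enumerate(zip(times, deviating)) if bad]

# Example: flag_abnormal_cycles([10.4, 10.2, 16.9, 10.3]) -> [(2, 16.9)],
# so the worker would be prompted about the abnormally long third cycle.
```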
The invention also provides a process timing system based on human body limb language, which can automatically time the process in the production process of workers without manual participation, avoids the manual participation factor in the process timing and enables the process timing to be more objective and accurate.
Referring to fig. 5, fig. 5 is a schematic structural diagram of an embodiment of the human body language-based procedure timing system of the present invention. In this embodiment, the human body limb language-based process timing system 50 includes a video obtaining unit 51, a limb language recognition unit 52, a coordinate establishing unit 53, an image denoising unit 54, a peak value obtaining unit 55, and a time counting unit 56.
The video acquiring unit 51 is used for acquiring video data in the production process of a worker.
The body language identification unit 52 is configured to identify body language information in the production process of the worker from the video data according to the acquired video data.
The coordinate establishing unit 53 is configured to establish a two-dimensional coordinate map based on the video frame of the video data and the identified body language information according to the acquired video data and the identified body language information.
The image denoising unit 54 is configured to denoise an image of the established two-dimensional coordinate graph.
The peak value taking unit 55 is configured to label coordinate data representing the body language information in the two-dimensional coordinate graph after the image denoising, and label a peak value; wherein the peak comprises a maximum peak and a minimum peak.
The time counting unit 56 is configured to take an interval between two peak values as a process cycle according to the marked peak value, count the process time in the production process of the worker, and obtain the process timing in the production process of the worker.
Optionally, the video obtaining unit 51 may be specifically configured to:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
Optionally, the body language identification unit 52 may be specifically configured to:
and identifying the body language information of the worker in the production process from the video data by adopting a body language identification OpenPose mode according to the acquired video data.
Optionally, the time statistic unit 56 may be specifically configured to:
and counting the working procedure time in the production process of the worker according to the obtained time data respectively corresponding to the maximum peak interval and the minimum peak interval to obtain the working procedure timing in the production process of the worker.
In this embodiment, the video acquiring unit 51 may be any terminal capable of acquiring video data in a production process of a worker by using an image capturing method, such as a video camera, a mobile phone, and the like, which is not limited by the invention.
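As an illustration of how the video acquiring unit 51 could be realised in software, the sketch below reads frames from a camera index or a video file with OpenCV; the patent only requires that video of the worker be captured, so the library choice and the hypothetical file name are assumptions.

```python
import cv2  # OpenCV, assumed here as one possible capture backend

def capture_frames(source=0, max_frames=None):
    """Yield frames from a camera index (e.g. 0) or a video file path."""
    cap = cv2.VideoCapture(source)
    count = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok or (max_frames is not None and count >= max_frames):
            break
        yield frame
        count += 1
    cap.release()

# Example: frames = list(capture_frames("worker.mp4", max_frames=500))
```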
Referring to fig. 6, fig. 6 is a schematic structural diagram of another embodiment of the human body limb language-based procedure timing system of the present invention. Different from the previous embodiment, the human body language-based process timing system 60 of the present embodiment further includes: a process timing analysis unit 61.
The process timing analysis unit 61 is configured to collect, from the obtained process timing of the worker, the process times of the same recurring process, compare them with one another to identify abnormal process times, and automatically prompt the worker about any abnormal process time that is found.
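Purely as an illustration of how units 51 to 56 and 61 could be composed in software, the sketch below wires them together as injected callables; the patent describes functional units, not concrete classes, so every name here is hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class ProcessTimingSystem:
    """Hypothetical composition of the units described above."""
    acquire: Callable[[], Sequence]             # video acquiring unit 51
    recognize: Callable[[Sequence], Sequence]   # body language identification unit 52
    to_curve: Callable[[Sequence], Sequence]    # coordinate establishing unit 53
    denoise: Callable[[Sequence], Sequence]     # image denoising unit 54
    mark_peaks: Callable[[Sequence], Sequence]  # peak value taking unit 55
    to_times: Callable[[Sequence], Sequence]    # time counting unit 56
    analyze: Callable[[Sequence], list]         # process timing analysis unit 61

    def run(self):
        curve = self.to_curve(self.recognize(self.acquire()))
        cycles = self.to_times(self.mark_peaks(self.denoise(curve)))
        return cycles, self.analyze(cycles)     # process timings and abnormal cycles
```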
Each unit module of the human body limb language-based procedure timing system 50/60 can respectively execute the corresponding steps in the above method embodiments, and therefore, the description of each unit module is omitted here, and please refer to the description of the corresponding steps above in detail.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a human body language-based procedure timing system according to another embodiment of the present invention. Each unit module of the human body limb language-based process timing system can respectively execute the corresponding steps in the method embodiment. For a detailed description of the above method, please refer to the above method, which is not repeated herein.
In this embodiment, the process timing system based on human body language includes: a processor 71, a memory 72 coupled to the processor 71, a calculator 73, and an analyzer 74.
The processor 71 is configured to obtain video data of a worker in a production process, identify body language information of the worker in the production process from the video data according to the obtained video data, establish a two-dimensional coordinate graph based on a video frame of the video data and the identified body language information according to the obtained video data and the identified body language information, perform image denoising on the established two-dimensional coordinate graph, and label coordinate data representing the body language information in the two-dimensional coordinate graph after the image denoising to label a peak value; wherein the peak comprises a maximum peak and a minimum peak.
The memory 72 is used for storing an operating system, instructions executed by the processor 71, and the like.
The calculator 73 is configured to take an interval between two peak values as a process cycle according to the marked peak value, and count the process time in the production process of the worker to obtain the process timing in the production process of the worker.
The analyzer 74 is configured to collect, from the obtained process timing of the worker, the process times of the same recurring process, compare them with one another to identify abnormal process times, and automatically prompt the worker about any abnormal process time that is found.
Optionally, the processor 71 may be specifically configured to:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
Optionally, the processor 71 may be specifically configured to:
and identifying the body language information of the worker in the production process from the video data by adopting a body language identification OpenPose mode according to the acquired video data.
Optionally, the calculator 73 may be specifically configured to:
and counting the working procedure time in the production process of the worker according to the obtained time data respectively corresponding to the maximum peak interval and the minimum peak interval to obtain the working procedure timing in the production process of the worker.
It can be seen that, with the above scheme, video data of a worker in a production process is acquired; the worker's body language information is identified from the video data; a two-dimensional graph of the body-language coordinates against the video frames is established from the video data and the identified body language information; the graph is denoised; the coordinate data representing the body language information in the denoised graph are marked and the peaks labelled; and, taking the interval between two peaks as one process cycle according to the marked peaks, the process time in the worker's production process is counted to obtain the process timing. The processes in a worker's production can thus be timed automatically, without manual participation, avoiding the human factor and making the process timing more objective and accurate.
Furthermore, with the above scheme, the process times of the same recurring process can be collected from the obtained process timing of the worker, compared with one another to identify abnormal process times, and used to automatically prompt the worker when an abnormality is found, so that workers whose efficiency is abnormal can be alerted in time and the working efficiency of the production process improved.
In the several embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be substantially or partially implemented in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor (processor) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only a part of the embodiments of the present invention, and not intended to limit the scope of the present invention, and all equivalent devices or equivalent processes performed by the present invention through the contents of the specification and the drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (8)

1. A process timing method based on human body limb language is characterized by comprising the following steps:
acquiring video data in a production process of a worker;
according to the obtained video data, identifying body language information of the worker in the production process from the video data;
according to the acquired video data and the recognized body language information, a two-dimensional coordinate graph based on the video frames of the video data and the recognized body language information is established;
carrying out image denoising on the established two-dimensional coordinate graph;
marking coordinate data which represent body language information in the two-dimensional coordinate graph after the image is denoised, and marking a peak value; wherein the peak values include a maximum peak value and a minimum peak value;
taking two peak value intervals as a working procedure period according to the marked peak values, and counting the working procedure time in the production process of the worker to obtain the working procedure timing in the production process of the worker; the method comprises the following steps:
according to the marked peak value including the maximum peak value and the minimum peak value, two adjacent peak value intervals are taken as a working procedure period, coordinate conversion is carried out on the marked peak value intervals, the peak value intervals are converted into time data of the same timing standard, time data corresponding to the adjacent maximum peak value intervals or the adjacent minimum peak value intervals respectively are obtained, working procedure time in the production process of the worker is counted according to the obtained time data corresponding to the maximum peak value intervals and the minimum peak value intervals respectively, and working procedure timing in the production process of the worker is obtained.
2. The human body limb language based process timing method as claimed in claim 1, wherein the obtaining video data of a worker in a production process comprises:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
3. The human body limb language based process timing method as claimed in claim 1 or 2, wherein the identifying the limb language information of the worker in the production process from the video data according to the acquired video data comprises:
and identifying the body language information of the worker in the production process from the video data by adopting a body language identification mode according to the acquired video data.
4. The human body limb language based process timing method as claimed in claim 1, wherein after taking two peak intervals as a process cycle according to the marked peaks, counting the process time in the production process of the worker and obtaining the process timing in the production process of the worker, further comprising:
according to the obtained process timing in the worker production process, counting the same periodic process time in the obtained process timing in the worker production process, comparing the counted process time of each same periodic process, comparing the abnormal conditions of the process time, and automatically prompting the abnormal conditions of the process time for the worker according to the abnormal conditions of the process time.
5. A process timing system based on human body limb language is characterized by comprising:
the system comprises a video acquisition unit, a limb language identification unit, a coordinate establishment unit, an image denoising unit, a peak value acquisition unit and a time statistic unit;
the video acquisition unit is used for acquiring video data in the production process of a worker;
the body language identification unit is used for identifying body language information of the worker in the production process from the video data according to the acquired video data;
the coordinate establishing unit is used for establishing a two-dimensional coordinate graph based on the video frame of the video data and the recognized body language information according to the acquired video data and the recognized body language information;
the image denoising unit is used for denoising the image of the established two-dimensional coordinate graph;
the peak value taking unit is used for marking coordinate data which represent body language information in the two-dimensional coordinate graph after the image is denoised and marking a peak value; wherein the peak values include a maximum peak value and a minimum peak value;
the time counting unit is used for taking the interval of two peak values as a process period according to the marked peak values, counting the process time in the production process of the worker and obtaining the process timing in the production process of the worker; the method comprises the following steps:
and according to the time data respectively corresponding to the maximum peak interval and the minimum peak interval, counting the working procedure time in the production process of the worker to obtain the working procedure timing in the production process of the worker.
6. The human body limb language-based process timing system of claim 5, wherein the video acquisition unit is specifically configured to:
and video data in the production process of one worker is acquired by adopting a camera shooting mode.
7. The human body limb language-based process timing system of claim 5 or 6, wherein the limb language identification unit is specifically configured to:
and identifying the body language information of the worker in the production process from the video data by adopting a body language identification mode according to the acquired video data.
8. The body extremity language based procedure timing system of claim 5, further comprising:
and the process timing analysis unit is used for counting the same periodic process time in the obtained process timing in the production process of the worker according to the obtained process timing in the production process of the worker, comparing the counted process time of each same periodic process, comparing the abnormal conditions of the process time, and automatically prompting the abnormal conditions of the process time for the worker according to the abnormal conditions of the process time.
CN201811393385.7A 2018-11-21 2018-11-21 Process timing method and system based on human body language Active CN109492602B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811393385.7A CN109492602B (en) 2018-11-21 2018-11-21 Process timing method and system based on human body language


Publications (2)

Publication Number Publication Date
CN109492602A CN109492602A (en) 2019-03-19
CN109492602B true CN109492602B (en) 2020-11-03

Family

ID=65697282

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811393385.7A Active CN109492602B (en) 2018-11-21 2018-11-21 Process timing method and system based on human body language

Country Status (1)

Country Link
CN (1) CN109492602B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111507231B (en) * 2020-04-10 2023-06-23 盛景智能科技(嘉兴)有限公司 Automatic detection method and system for correctness of process steps
US11348355B1 (en) 2020-12-11 2022-05-31 Ford Global Technologies, Llc Method and system for monitoring manufacturing operations using computer vision for human performed tasks

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8718337B1 (en) * 2010-06-30 2014-05-06 Imdb.Com, Inc. Identifying an individual for a role

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105469073A (en) * 2015-12-16 2016-04-06 安徽创世科技有限公司 Kinect-based call making and answering monitoring method of driver
CN108062533A (en) * 2017-12-28 2018-05-22 北京达佳互联信息技术有限公司 Analytic method, system and the mobile terminal of user's limb action

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8718337B1 (en) * 2010-06-30 2014-05-06 Imdb.Com, Inc. Identifying an individual for a role

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Single image super-resolution method using shared-space sparse representation; Zhang Jian, Peng Jialin, Du Jixiang; Journal of Huaqiao University (Natural Science Edition); 2018-03-31; Vol. 39, No. 2; pp. 268-273 *

Also Published As

Publication number Publication date
CN109492602A (en) 2019-03-19


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant