CN110135332B - Bearing assembly production line efficiency monitoring method

Bearing assembly production line efficiency monitoring method

Info

Publication number
CN110135332B
Authority
CN
China
Prior art keywords
hand
foot
station
point
efficiency
Prior art date
Legal status
Expired - Fee Related
Application number
CN201910395920.0A
Other languages
Chinese (zh)
Other versions
CN110135332A (en)
Inventor
于新
王坤
姜盛乾
杨兆军
赵安然
陈传海
Current Assignee
Jilin University
Original Assignee
Jilin University
Priority date
Filing date
Publication date
Application filed by Jilin University filed Critical Jilin University
Priority to CN201910395920.0A priority Critical patent/CN110135332B/en
Publication of CN110135332A publication Critical patent/CN110135332A/en
Application granted granted Critical
Publication of CN110135332B publication Critical patent/CN110135332B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06Q10/06395 Quality analysis or management
    • G06Q10/06398 Performance of employee with respect to a job function
    • G06Q50/04 Manufacturing
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The invention belongs to the field of intelligent manufacturing and particularly relates to a method for monitoring the efficiency of a bearing assembly production line. The monitoring method comprises the following steps: step one, monitoring the single-point efficiency, production-line efficiency, production rate, operation rate and quality coefficient of the assembly line; step two, recording invalid operation actions in the efficiency display, together with the detailed per-station completion status of those actions, for managers to review. The method introduces motion-sensing technology: it identifies the bearing assembly operation actions from skeleton-point data collected by a Kinect V2, completes the identification and judgment of effective actions, and monitors the efficiency of the production floor in real time from the planned operation count, the actual output reported by sensors and the quality-inspection results updated in real time by inspectors. This resolves the problems of judging a factory's actual production efficiency only through daily, weekly or monthly inspections and fills a gap in the current field.

Description

Bearing assembly production line efficiency monitoring method
Technical Field
The invention belongs to the field of intelligent manufacturing, and particularly relates to a bearing assembly production line efficiency monitoring method.
Background
Because bearing products are highly varied, most assembly lines adopt manual operation to realize flexible production. A production line centered on operators can maintain high productivity only if there is no interference and no quality loss. In actual production this requirement cannot be met; for many reasons a considerable amount of invalid work occurs on the assembly line: for example, fatigue or laziness of operators causes products to accumulate at single stations, and a poor operator state raises the defect rate of assembled products, producing rejects or rework and lowering the actual efficiency of the production line.
With the continuous development of the Internet, most factories in China are advancing from the automation of Industry 2.0 to the informatization of Industry 3.0. Production lines are no longer inspected on site but monitored directly through electronic dashboards; yet in actual production, because the actual operating conditions cannot be predicted, real-time assembly efficiency is difficult to feed back directly.
Disclosure of Invention
The invention introduces motion-sensing technology: it identifies the bearing assembly operation actions from the skeleton-point data collected by a Kinect V2, completes the identification and judgment of effective actions, and monitors the efficiency of the production floor in real time from the planned operation count, the actual output reported by sensors and the quality-inspection results updated in real time by inspectors. This resolves the problems of judging a factory's actual production efficiency only through daily, weekly or monthly inspections and fills a gap in the current field.
The technical solution of the invention is described below with reference to the drawings:
a monitoring method for efficiency of a bearing assembly production line comprises the following steps:
step one, monitoring the assembly-line single-point efficiency i1, the production-line efficiency I, the production rate O1, the operation rate O2 and the quality coefficient O3;
wherein the assembly-line single-point efficiency i1 comprises three parameters, namely the production rate O1, the operation rate O2 and the quality coefficient O3, with i1 = O1 · O2 · O3; the production-line efficiency equals the product of the single-point efficiencies, i.e. I = i1 × i2 × … × in; the production rate is O1 = P1 / P2, where P1 is the actual production number and P2 the planned production number; an infrared sensor installed at the exit of each line counts the produced parts and thereby continuously corrects the actual production number P1; the planned production number P2 starts counting when the system starts and is automatically increased by 1 every t seconds, t being adjusted to the actual condition of the production line; the operation rate is O2 = P3 / P4, where P3 is the effective operation count and P4 the planned operation count; the planned operation count P4 is automatically increased by 1 every T seconds after the system starts, T being adjusted to the actual condition of the production line; the quality coefficient is O3 = P5 / P6, where P5 is the good-product count and P6 the total production quantity; when the assembly of a product is finished, a quality inspector checks it and, if it is qualified, places it in the qualified channel, otherwise in the unqualified channel for repair; the good-product count is fed back by the infrared sensor of the qualified channel, and the total production quantity is the sum of the counts passing through the qualified and unqualified channels;
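Purely as an illustration, and not part of the claimed method, the ratios and products defined above can be computed from the counters as in the following sketch; all function and variable names are assumptions introduced for this example:

```python
def single_point_efficiency(p1, p2, p3, p4, p5, p6):
    """Single-point efficiency i1 = O1 * O2 * O3.

    p1: actual production number    p2: planned production number
    p3: effective operation count   p4: planned operation count
    p5: good-product count          p6: total production quantity
    """
    o1 = p1 / p2 if p2 else 0.0  # production rate
    o2 = p3 / p4 if p4 else 0.0  # operation rate
    o3 = p5 / p6 if p6 else 0.0  # quality coefficient
    return o1 * o2 * o3

def line_efficiency(single_point_efficiencies):
    """Production-line efficiency I = i1 * i2 * ... * in."""
    product = 1.0
    for i_k in single_point_efficiencies:
        product *= i_k
    return product

# Example with made-up counter values for a three-station line:
stations = [
    single_point_efficiency(95, 100, 93, 100, 90, 95),
    single_point_efficiency(98, 100, 97, 100, 96, 98),
    single_point_efficiency(92, 100, 90, 100, 88, 92),
]
print(round(line_efficiency(stations), 3))
```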
and step two, recording the invalid operation actions in the efficiency display, and recording in detail the completion status of each station for those invalid operation actions, for managers to review.
The effective operation count P3 is measured and calculated as follows:
1) First, a Kinect V2 is used to collect 25 human skeleton points, including the head, neck, shoulder center, left thumb $A_1(x_{1,i}, y_{1,i}, z_{1,i})$, right thumb $A_2(x_{2,i}, y_{2,i}, z_{2,i})$, left fingertip $A_3(x_{3,i}, y_{3,i}, z_{3,i})$, right fingertip $A_4(x_{4,i}, y_{4,i}, z_{4,i})$, left hand $A_5(x_{5,i}, y_{5,i}, z_{5,i})$, right hand $A_6(x_{6,i}, y_{6,i}, z_{6,i})$, left wrist, right wrist, left elbow, right elbow, left shoulder, right shoulder, spine, hip center, left hip, left knee, left ankle, left foot $A_7(x_{7,i}, y_{7,i}, z_{7,i})$, right hip, right knee, right ankle and right foot $A_8(x_{8,i}, y_{8,i}, z_{8,i})$, where $A_1(x_{1,i}, y_{1,i}, z_{1,i})$ denotes the coordinates of the left-thumb point in the i-th frame;
2) the assembly operation is carried out at a plurality of different stations; one effective operation is not the completion of the work of a single station by the operator but the consecutive completion of the operations of several stations;
3) marking the drop points of both feet and both hands in each station: the left-hand drop point $B_{\text{station 1, left hand},i}(X_{\text{left hand},i}, Y_{\text{left hand},i}, Z_{\text{left hand},i})$, the right-hand drop point $B_{\text{station 1, right hand},i}(X_{\text{right hand},i}, Y_{\text{right hand},i}, Z_{\text{right hand},i})$, the left-foot drop point $B_{\text{station 1, left foot},i}(X_{\text{left foot},i}, Y_{\text{left foot},i}, Z_{\text{left foot},i})$ and the right-foot drop point $B_{\text{station 1, right foot},i}(X_{\text{right foot},i}, Y_{\text{right foot},i}, Z_{\text{right foot},i})$, where $B_{\text{station 1, left hand},i}$ denotes the i-th drop point to be reached by the left hand; the left foot and right foot each have only one drop point within a station, while the left and right hands may have several drop points within the same station;
4) determining whether the operator completes the specified action at station 1;
flow 1: first, the left hand, right hand, left foot and right foot must each be within 7 cm of their specified coordinates; second, both hands must complete the grasping action; the specific calculation is as follows:
the distance between the left hand and left-hand drop point 1 is
$l_1 = \sqrt{(x_{5,i} - X_{\text{left hand},1})^2 + (y_{5,i} - Y_{\text{left hand},1})^2 + (z_{5,i} - Z_{\text{left hand},1})^2}$
and the distance l2 between the right hand and right-hand drop point 1, the distance l3 between the left foot and left-foot drop point 1 and the distance l4 between the right foot and right-foot drop point 1 are obtained in the same way; at the same time, the angle θ1 between the left-hand-to-left-thumb vector $\overrightarrow{A_5A_1}$ and the left-hand-to-left-fingertip vector $\overrightarrow{A_5A_3}$ is computed, together with the angle θ2 between the right-hand-to-right-thumb vector $\overrightarrow{A_6A_2}$ and the right-hand-to-right-fingertip vector $\overrightarrow{A_6A_4}$; when l1, l2, l3 and l4 are all less than 7 cm and θ1 and θ2 are both less than 30°, the two-hand drop-point index is incremented (i = i + 1) and the process turns to flow 2;
flow 2: after flow 1 is completed, it must be evaluated whether the installation action has been completed, i.e. whether the left and right hands reach their respective drop points 2 and complete the release action; the specific calculation is as follows:
the distance l5 between the left hand and left-hand drop point 2 and the distance l6 between the right hand and right-hand drop point 2 are calculated, together with the angle θ3 between the left-hand-to-left-thumb vector $\overrightarrow{A_5A_1}$ and the left-hand-to-left-fingertip vector $\overrightarrow{A_5A_3}$ and the angle θ4 between the right-hand-to-right-thumb vector $\overrightarrow{A_6A_2}$ and the right-hand-to-right-fingertip vector $\overrightarrow{A_6A_4}$; when l5 and l6 are both less than 7 cm and θ3 and θ4 are both greater than 30°, flow 2 is considered complete and the process turns to flow 3;
flow 3: if the station requires only one back-and-forth operation, the station operation is considered complete; otherwise i = i + 1 and the process returns to flow 1;
5) determining in the same way whether the operator completes the specified actions at the other stations;
the procedure of this step is the same as that of step 4); when the operator has completed the operation actions of all the stations, one effective operation action is considered complete and 1 is added to P3.
The invention has the following beneficial effects:
The invention recognizes the actions of assembly-line operators through motion-sensing technology and completes assembly-line efficiency monitoring using the actual output reported by sensors and the quality-inspection results updated in real time by inspectors, which reduces the workload of specialized personnel and lowers labor costs. No comparable technical research exists in this field at present, and the method fills this gap.
Drawings
FIG. 1 is a schematic diagram of the distribution of skeletal joint points in accordance with the present invention;
FIG. 2 is a schematic view of an assembly sequence of the present invention;
FIG. 3 is a schematic view of the skeleton drop points of the present invention.
Detailed Description
A monitoring method for efficiency of a bearing assembly production line comprises the following steps:
step one, monitoring the assembly-line single-point efficiency i1, the production-line efficiency I, the production rate O1, the operation rate O2 and the quality coefficient O3;
wherein the assembly-line single-point efficiency i1 comprises three parameters, namely the production rate O1, the operation rate O2 and the quality coefficient O3, with i1 = O1 · O2 · O3.
The production-line efficiency equals the product of the single-point efficiencies, i.e. I = i1 × i2 × … × in.
The production rate is O1 = P1 / P2, where P1 is the actual production number and P2 the planned production number. An infrared sensor installed at the exit of each line counts the produced parts and thereby continuously corrects the actual production number P1; the planned production number P2 starts counting when the system starts and is automatically increased by 1 every t seconds, t being adjusted to the actual condition of the production line.
operating Rate O2Number of effective operations P3Number of planned operations P4Planning the number of operations P4After the system is started, every T seconds, T is adjusted according to the actual condition of the production line and is automaticAdding 1;
number of valid operations P3The measuring and calculating method comprises the following steps:
referring to fig. 1, 1) first, Kinect V2 is used to collect 25 human bone points including head, neck, shoulder center, left thumb A1(x1,i,y1,i,z1,i) The right thumb A2(x2,i,y2,i,z2,i) Left fingertip A3(x3,i,y3,i,z3,i) Right fingertip A4(x4,i,y4,i,z4,i) Left hand A5(x5,i,y5,i,z5,i) Right hand A6(x6,i,y6,i,z6,i) Left wrist, right wrist, left elbow, right elbow, left shoulder, right shoulder, spine, hip center, left hip, left knee, left ankle, left foot A7(x7,i,y7,i,z7,i) Right hip, right knee, right ankle, right foot A8(x8,i,y8,i,z8,i). Wherein A is1(x1,i,y1,i,z1,i) A coordinate point representing the left thumb point of the ith frame.
2) The positions where the operator works are marked as station 1, station 2, and so on. Referring to fig. 2, the assembly operator carries out the assembly operation at a plurality of different stations; one effective operation is not the completion of the work of a single station but the consecutive completion of the operations of several stations.
Referring to fig. 3, 3) the drop points of both feet and both hands in each station are marked: the left-hand drop point $B_{\text{station 1, left hand},i}(X_{\text{left hand},i}, Y_{\text{left hand},i}, Z_{\text{left hand},i})$, the right-hand drop point $B_{\text{station 1, right hand},i}(X_{\text{right hand},i}, Y_{\text{right hand},i}, Z_{\text{right hand},i})$, the left-foot drop point $B_{\text{station 1, left foot},i}(X_{\text{left foot},i}, Y_{\text{left foot},i}, Z_{\text{left foot},i})$ and the right-foot drop point $B_{\text{station 1, right foot},i}(X_{\text{right foot},i}, Y_{\text{right foot},i}, Z_{\text{right foot},i})$, where $B_{\text{station 1, left hand},i}$ denotes the i-th drop point to be reached by the left hand. The left foot and right foot each have only one drop point within a station, while the left and right hands may have several drop points within the same station.
4) Determine whether the operator completes the specified action at station 1.
Flow 1: first, the left hand, right hand, left foot and right foot must each be within 7 cm of their specified coordinates; second, both hands must complete the grasping action. The specific calculation is as follows.
The distance between the left hand and left-hand drop point 1 is
$l_1 = \sqrt{(x_{5,i} - X_{\text{left hand},1})^2 + (y_{5,i} - Y_{\text{left hand},1})^2 + (z_{5,i} - Z_{\text{left hand},1})^2}$
and the distance l2 between the right hand and right-hand drop point 1, the distance l3 between the left foot and left-foot drop point 1 and the distance l4 between the right foot and right-foot drop point 1 are obtained in the same way. At the same time, the angle θ1 between the left-hand-to-left-thumb vector $\overrightarrow{A_5A_1}$ and the left-hand-to-left-fingertip vector $\overrightarrow{A_5A_3}$ is computed, together with the angle θ2 between the right-hand-to-right-thumb vector $\overrightarrow{A_6A_2}$ and the right-hand-to-right-fingertip vector $\overrightarrow{A_6A_4}$. When l1, l2, l3 and l4 are all less than 7 cm and θ1 and θ2 are both less than 30°, the two-hand drop-point index is incremented (i = i + 1) and the process turns to flow 2.
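A minimal sketch of this flow-1 test follows, assuming the joint and drop-point coordinates are available as (x, y, z) tuples in meters; the helper names are hypothetical and simply restate the 7 cm distance and 30° grasp-angle conditions given above:

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle_deg(origin, tip_a, tip_b):
    """Angle at `origin` between the vectors origin->tip_a and origin->tip_b."""
    u = [a - o for a, o in zip(tip_a, origin)]
    v = [b - o for b, o in zip(tip_b, origin)]
    dot = sum(ua * va for ua, va in zip(u, v))
    norm = math.sqrt(sum(ua * ua for ua in u)) * math.sqrt(sum(va * va for va in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def flow1_reached(points, drop_points_1, dist_limit=0.07, grasp_limit=30.0):
    """points / drop_points_1: dicts keyed by "A1".."A8" holding (x, y, z) tuples.

    Flow-1 conditions: both hands and both feet lie within 7 cm (0.07 m) of their
    first drop points, and both hands are closed (thumb/fingertip angle < 30 deg).
    """
    within_reach = all(
        distance(points[j], drop_points_1[j]) < dist_limit
        for j in ("A5", "A6", "A7", "A8")  # left hand, right hand, left foot, right foot
    )
    theta1 = angle_deg(points["A5"], points["A1"], points["A3"])  # left-hand grasp angle
    theta2 = angle_deg(points["A6"], points["A2"], points["A4"])  # right-hand grasp angle
    return within_reach and theta1 < grasp_limit and theta2 < grasp_limit
```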
Flow 2: after flow 1 is completed, it must be evaluated whether the installation action has been completed, i.e. whether the left and right hands reach their respective drop points 2 and complete the release action. The specific calculation is as follows.
The distance l5 between the left hand and left-hand drop point 2 and the distance l6 between the right hand and right-hand drop point 2 are calculated, together with the angle θ3 between the left-hand-to-left-thumb vector $\overrightarrow{A_5A_1}$ and the left-hand-to-left-fingertip vector $\overrightarrow{A_5A_3}$ and the angle θ4 between the right-hand-to-right-thumb vector $\overrightarrow{A_6A_2}$ and the right-hand-to-right-fingertip vector $\overrightarrow{A_6A_4}$. When l5 and l6 are both less than 7 cm and θ3 and θ4 are both greater than 30°, flow 2 is considered complete and the process turns to flow 3.
Flow 3: if the station requires only one back-and-forth operation, the station operation is considered complete; otherwise i = i + 1 and the process returns to flow 1.
5) Determine in the same way whether the operator completes the specified actions at the other stations.
The procedure of this step is the same as that of step 4); when the operator has completed the operation actions of all the stations, one effective operation action is considered complete and 1 is added to P3.
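The counting of the effective operation count P3 across flows and stations can be sketched as a small state machine; the following is an assumed layout rather than the patent's own code, with the grasp and release predicates standing for the flow-1 and flow-2 tests described above:

```python
def station_completed(frames, grasp_reached, release_done, cycles_required=1):
    """frames: successive per-frame joint dicts for one station;
    grasp_reached / release_done: predicates such as the flow-1 test sketched
    above and its flow-2 counterpart (release at drop point 2, angle > 30 deg);
    cycles_required: how many back-and-forth operations the station needs."""
    cycles, holding = 0, False
    for points in frames:
        if not holding and grasp_reached(points):   # flow 1: grasp at drop point 1
            holding = True
        elif holding and release_done(points):      # flow 2: release at drop point 2
            holding = False
            cycles += 1
            if cycles >= cycles_required:           # flow 3: station finished
                return True
    return False

def count_effective_operations(routes):
    """routes: one entry per candidate operation; each entry is a list of
    (frames, grasp_reached, release_done, cycles_required) tuples, one per station."""
    p3 = 0
    for stations in routes:
        if all(station_completed(*s) for s in stations):
            p3 += 1  # every station completed consecutively -> one effective operation
    return p3
```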
The quality coefficient is O3 = P5 / P6, where P5 is the good-product count and P6 the total production quantity. When the assembly of a product is finished, a quality inspector checks it and, if it is qualified, places it in the qualified channel, otherwise in the unqualified channel for repair; the good-product count is fed back by the infrared sensor of the qualified channel, and the total production quantity is the sum of the counts passing through the qualified and unqualified channels.
And step two, recording the invalid operation actions in the efficiency display, and recording in detail the completion status of each station for those invalid operation actions, for managers to review.

Claims (1)

1. A bearing assembly production line efficiency monitoring method is characterized by comprising the following steps:
step one, monitoring the assembly-line single-point efficiency i1, the production-line efficiency I, the production rate O1, the operation rate O2 and the quality coefficient O3;
wherein the assembly-line single-point efficiency i1 comprises three parameters, namely the production rate O1, the operation rate O2 and the quality coefficient O3, with i1 = O1 · O2 · O3; the production-line efficiency equals the product of the single-point efficiencies, i.e. I = i1 × i2 × … × in; the production rate is O1 = P1 / P2, where P1 is the actual production number and P2 the planned production number; an infrared sensor installed at the exit of each line counts the produced parts and thereby continuously corrects the actual production number P1; the planned production number P2 starts counting when the system starts and is automatically increased by 1 every t seconds, t being adjusted to the actual condition of the production line; the operation rate is O2 = P3 / P4, where P3 is the effective operation count and P4 the planned operation count; the planned operation count P4 is automatically increased by 1 every T seconds after the system starts, T being adjusted to the actual condition of the production line; the quality coefficient is O3 = P5 / P6, where P5 is the good-product count and P6 the total production quantity; when the assembly of a product is finished, a quality inspector checks it and, if it is qualified, places it in the qualified channel, otherwise in the unqualified channel for repair; the good-product count is fed back by the infrared sensor of the qualified channel, and the total production quantity is the sum of the counts passing through the qualified and unqualified channels;
step two, recording the invalid operation actions in the efficiency display, and recording in detail the completion status of each station for those invalid operation actions, for managers to review;
the effective operation count P3 is measured and calculated as follows:
1) First, a Kinect V2 is used to collect 25 human skeleton points, including the head, neck, shoulder center, left thumb $A_1(x_{1,i}, y_{1,i}, z_{1,i})$, right thumb $A_2(x_{2,i}, y_{2,i}, z_{2,i})$, left fingertip $A_3(x_{3,i}, y_{3,i}, z_{3,i})$, right fingertip $A_4(x_{4,i}, y_{4,i}, z_{4,i})$, left hand $A_5(x_{5,i}, y_{5,i}, z_{5,i})$, right hand $A_6(x_{6,i}, y_{6,i}, z_{6,i})$, left wrist, right wrist, left elbow, right elbow, left shoulder, right shoulder, spine, hip center, left hip, left knee, left ankle, left foot $A_7(x_{7,i}, y_{7,i}, z_{7,i})$, right hip, right knee, right ankle and right foot $A_8(x_{8,i}, y_{8,i}, z_{8,i})$, where $A_1(x_{1,i}, y_{1,i}, z_{1,i})$ denotes the coordinates of the left-thumb point in the i-th frame;
2) the assembly operation is carried out at a plurality of different stations; one effective operation is not the completion of the work of a single station by the operator but the consecutive completion of the operations of several stations;
3) marking the drop points of both feet and both hands in each station: the left-hand drop point $B_{\text{station 1, left hand},i}(X_{\text{left hand},i}, Y_{\text{left hand},i}, Z_{\text{left hand},i})$, the right-hand drop point $B_{\text{station 1, right hand},i}(X_{\text{right hand},i}, Y_{\text{right hand},i}, Z_{\text{right hand},i})$, the left-foot drop point $B_{\text{station 1, left foot},i}(X_{\text{left foot},i}, Y_{\text{left foot},i}, Z_{\text{left foot},i})$ and the right-foot drop point $B_{\text{station 1, right foot},i}(X_{\text{right foot},i}, Y_{\text{right foot},i}, Z_{\text{right foot},i})$, where $B_{\text{station 1, left hand},i}$ denotes the i-th drop point to be reached by the left hand; the left foot and right foot each have only one drop point within a station, while the left and right hands may have several drop points within the same station;
4) determining whether the operator completes the specified action at station 1;
flow 1: first, the left hand, right hand, left foot and right foot must each be within 7 cm of their specified coordinates; second, both hands must complete the grasping action; the specific calculation is as follows:
the distance between the left hand and left-hand drop point 1 is
$l_1 = \sqrt{(x_{5,i} - X_{\text{left hand},1})^2 + (y_{5,i} - Y_{\text{left hand},1})^2 + (z_{5,i} - Z_{\text{left hand},1})^2}$
and the distance l2 between the right hand and right-hand drop point 1, the distance l3 between the left foot and left-foot drop point 1 and the distance l4 between the right foot and right-foot drop point 1 are obtained in the same way; at the same time, the angle θ1 between the left-hand-to-left-thumb vector $\overrightarrow{A_5A_1}$ and the left-hand-to-left-fingertip vector $\overrightarrow{A_5A_3}$ is computed, together with the angle θ2 between the right-hand-to-right-thumb vector $\overrightarrow{A_6A_2}$ and the right-hand-to-right-fingertip vector $\overrightarrow{A_6A_4}$; when l1, l2, l3 and l4 are all less than 7 cm and θ1 and θ2 are both less than 30°, the two-hand drop-point index is incremented (i = i + 1) and the process turns to flow 2;
flow 2: after flow 1 is completed, it must be evaluated whether the installation action has been completed, i.e. whether the left and right hands reach their respective drop points 2 and complete the release action; the specific calculation is as follows:
the distance l5 between the left hand and left-hand drop point 2 and the distance l6 between the right hand and right-hand drop point 2 are calculated, together with the angle θ3 between the left-hand-to-left-thumb vector $\overrightarrow{A_5A_1}$ and the left-hand-to-left-fingertip vector $\overrightarrow{A_5A_3}$ and the angle θ4 between the right-hand-to-right-thumb vector $\overrightarrow{A_6A_2}$ and the right-hand-to-right-fingertip vector $\overrightarrow{A_6A_4}$; when l5 and l6 are both less than 7 cm and θ3 and θ4 are both greater than 30°, flow 2 is considered complete and the process turns to flow 3;
flow 3: if the station requires only one back-and-forth operation, the station operation is considered complete; otherwise i = i + 1 and the process returns to flow 1;
5) determining in the same way whether the operator completes the specified actions at the other stations;
the procedure of this step is the same as that of step 4); when the operator has completed the operation actions of all the stations, one effective operation action is considered complete and 1 is added to P3.
CN201910395920.0A 2019-05-14 2019-05-14 Bearing assembly production line efficiency monitoring method Expired - Fee Related CN110135332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910395920.0A CN110135332B (en) 2019-05-14 2019-05-14 Bearing assembly production line efficiency monitoring method

Publications (2)

Publication Number Publication Date
CN110135332A CN110135332A (en) 2019-08-16
CN110135332B true CN110135332B (en) 2022-05-31

Family

ID=67573713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910395920.0A Expired - Fee Related CN110135332B (en) 2019-05-14 2019-05-14 Bearing assembly production line efficiency monitoring method

Country Status (1)

Country Link
CN (1) CN110135332B (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004185252A (en) * 2002-12-03 2004-07-02 Mazda Motor Corp Production support program, method and system for assembly production
CN101901009A (en) * 2009-03-24 2010-12-01 厦门智赢信息科技有限公司 Assembly line work data acquisition and control system
CN103370733A (en) * 2010-09-24 2013-10-23 纽乐金集团 Method, system and apparatus for automatic quality control using a plurality of computers
CN104463452A (en) * 2014-11-29 2015-03-25 中山市铧禧电子科技有限公司 Management method of production management system
CN105252532A (en) * 2015-11-24 2016-01-20 山东大学 Method of cooperative flexible attitude control for motion capture robot
CN108090448A (en) * 2017-12-20 2018-05-29 吉林大学 Model is worth evaluation method in a kind of Virtual assemble
CN108133119A (en) * 2018-01-19 2018-06-08 吉林大学 Swing acts time study method in a kind of Virtual assemble
CN108268017A (en) * 2018-02-02 2018-07-10 南京水木自动化科技有限公司 The appraisal procedure and optimization method and its apparatus for evaluating of tank production line efficiency processed
CN108958478A (en) * 2018-06-14 2018-12-07 吉林大学 Action recognition and appraisal procedure are ridden in a kind of operation of Virtual assemble

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on low cost virtual assembly training platform based on somatosensory technology; Shengqian Jiang et al.; IEEE International Conference on Industrial Engineering & Engineering Management; 2018-02-12; pp. 250-254 *
Research on human action recognition based on RGB-D cameras; 乔文静; China Master's Theses Full-text Database, Information Science and Technology; 2017-03-15; I138-5552 *

Also Published As

Publication number Publication date
CN110135332A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN107861478B (en) A kind of parallel control method in intelligence workshop and system
CN111708332B (en) Digital twin system of production line
CN112270508A (en) Digital twin smart cloud scheduling method meeting personalized customized production
CN110567974B (en) Cloud artificial intelligence based surface defect detection system
CN101276209A (en) Steel plate cooling control analog system and method
CN108681878A (en) The acquisition of intelligent plant production line data and billboard management system and method
CN108288119A (en) A kind of intelligentized factory control method and system
CN107480901A (en) Garment production management method and system
CN105171936A (en) Stone deep processing modularization automatic production line and production method
CN106257510A (en) Operational data processing method based on Intelligent worn device and system
JP7086280B2 (en) Work support device
Wang et al. Analysis of a linear walking worker line using a combination of computer simulation and mathematical modeling approaches
CN111308925A (en) Information processing apparatus, production instruction support method, and computer program
CN107765566A (en) One kind is based on the five-axle number control machine tool simulated training system of " internet+"
CN113591295A (en) Power cable management method based on digital twinning
CN110135332B (en) Bearing assembly production line efficiency monitoring method
CN109189112A (en) A kind of idler roller strip tension sliding-mode control and control device
Szajna et al. The application of augmented reality technology in the production processes
CN111338304A (en) Method for real-time prediction and information communication of production line yield by applying artificial intelligence cloud computing
Wei et al. Predictive maintenance system for production line equipment based on digital twin and augmented reality
WO2021147347A1 (en) Acquisition network system for industrial big data and application method therefor
CN107045316A (en) A kind of precision machined PLC integrated control systems
CN108107882A (en) Service robot automatic Calibration and detecting system based on optical motion tracking
CN108427389A (en) Leather optimal layout based on industry internet coordinates cutting method with multimachine
CN118014314B (en) Semiconductor work management system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220531