CN117636451A - Job detection method, job detection device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN117636451A
CN117636451A (application CN202210969057.7A)
Authority
CN
China
Prior art keywords
point information
action
image
characteristic point
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210969057.7A
Other languages
Chinese (zh)
Inventor
海涵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210969057.7A priority Critical patent/CN117636451A/en
Publication of CN117636451A publication Critical patent/CN117636451A/en
Pending legal-status Critical Current

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • General Factory Administration (AREA)

Abstract

The present disclosure relates to a job detection method, apparatus, electronic device, and readable storage medium. The method includes: collecting an image of a workbench during operation, identifying feature point information in the image, determining a target action appearing in the image according to the feature point information, determining the degree of deviation of the target action, and sending out prompt information when the degree of deviation is higher than a preset threshold. Because the image data of the workbench during operation is collected by equipment, manual supervision of workers is not needed, saving time and human resources; moreover, when a problem is found in a manufactured product, its cause can be traced back from the collected image data of the worker's operation, further reducing the defect rate of manufactured products.

Description

Job detection method, job detection device, electronic equipment and readable storage medium
Technical Field
The disclosure relates to the field of computer technology, and in particular, to a job detection method, a job detection device, an electronic device and a readable storage medium.
Background
On an intelligent-manufacturing workshop assembly line, workers operate continuously. Supervising them manually is time-consuming and labor-intensive, but without supervision the speed and compliance of their operations cannot be ensured, and when a problem is found in a manufactured product its cause cannot be traced, which increases the defect rate of manufactured products.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a job detection method, apparatus, electronic device, and readable storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a job detection method, including: collecting an image of a workbench during operation; identifying feature point information in the image; determining a target action appearing in the image according to the feature point information; determining the degree of deviation of the target action; and sending out prompt information when the degree of deviation is higher than a preset threshold.
Optionally, the feature point information includes material feature point information and action feature point information, and identifying the feature point information in the image includes: identifying the material feature point information and the action feature point information in the image.
Optionally, the job detection method further includes: determining a material attribute in the image according to the material feature point information, the material attribute including material type, material position and material color; and sending out the prompt information when the material attribute does not meet a specified condition.
Optionally, the target actions include a material-placing action and a release-film tearing action, and determining the target actions appearing in the image according to the feature point information includes: determining the material-placing action and the release-film tearing action according to the material type, the material position and the action feature point information.
Optionally, the job detection method further includes: determining a completion rate and a completion speed of the material-placing action and the release-film tearing action; and adjusting the transfer speed of each material according to the completion rate and the completion speed.
According to a second aspect of embodiments of the present disclosure, there is provided a job detection apparatus, including: an acquisition module configured to acquire images of a workbench during operation; an identification module configured to identify feature point information in the image; and a processing module configured to determine a target action appearing in the image according to the feature point information. The processing module is further configured to determine a degree of deviation of the target action, and to send out prompt information when the degree of deviation is higher than a preset threshold.
Optionally, the identification module is further configured to identify the material feature point information and the action feature point information in the image.
Optionally, the processing module is further configured to determine a material attribute in the image according to the material feature point information, the material attribute including material type, material position and material color, and to send out the prompt information when the material attribute does not meet a specified condition.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, comprising: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to execute the executable instructions to implement the steps of the job detection method described previously.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the job detection method provided by the first aspect of the present disclosure.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: an image of a workbench during operation is collected, feature point information in the image is identified, a target action appearing in the image is determined according to the feature point information, the degree of deviation of the target action is determined, and prompt information is sent out when the degree of deviation is higher than a preset threshold. Because the image data of the workbench during operation is collected by equipment, non-compliant operations in the production process are avoided; moreover, when a problem is found in a manufactured product, its cause can be traced back from the collected image data of the worker's operation, reducing the defect rate of manufactured products.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of a computer system shown in an exemplary embodiment of the present disclosure.
Fig. 2 is a flowchart of a job detection method shown in an exemplary embodiment of the present disclosure.
Fig. 3 is a block diagram illustrating a job detection apparatus according to an exemplary embodiment.
Fig. 4 is a block diagram of an apparatus according to an example embodiment.
Fig. 5 is a block diagram of an apparatus according to an example embodiment.
Fig. 6 is a block diagram illustrating an apparatus for job detection, according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims.
It should be noted that all actions of acquiring signals, information or data in the present application are performed in compliance with the applicable data protection rules and policies of the country where they take place, and with authorization from the owner of the corresponding device.
Referring to fig. 1, fig. 1 is a schematic diagram of a computer system according to an exemplary embodiment of the present disclosure. The computer system includes a camera 110, a terminal 120, a user terminal 130, and a server 140. The server 140 is communicatively connected to the camera 110, the terminal 120, and the user terminal 130, respectively.
The camera 110 is used to capture video recordings or images of personnel (workers) while they work on the intelligent-manufacturing plant line; the recordings or images are then uploaded to the server 140. One or more cameras 110 may be included in the computer system, only one of which is shown in fig. 1.
The terminal 120 may include at least one of a notebook computer, a desktop computer, and a tablet computer. The terminal 120 may be used to receive video recordings, images, job detection results, or the like of the worker's job transmitted by the server 140. Terminal 120 includes a display; the display is used for displaying video recordings, images or operation detection results of the worker operation in real time.
The user terminal 130 may include at least one of a smart phone, a tablet computer and a smart watch. The user terminal 130 may be associated with a worker operating on the intelligent-manufacturing shop line and may be used to receive the prompt information sent by the server 140.
Fig. 2 is a flowchart of a job detection method shown in an exemplary embodiment, used in an electronic device such as the camera or the server shown in fig. 1. The job detection method includes the following steps.
In step S11, an image of the workbench during operation is acquired.
By way of example, image data of a worker working at a workbench may be acquired by the camera shown in fig. 1; the image data may be, but is not limited to, videos and pictures of the worker's operation. When workers operate on an intelligent-manufacturing assembly line, they usually assemble finished products, and the compliance of their material-placing actions and other operation actions is an important index to be supervised. It can be understood that the collected images of the worker's operation include both material images and worker action images, that is, the assembly actions performed during the operation, the placement positions of the materials, which assembly materials are used, the kinds of materials placed on the workbench, and the like.
In step S12, feature point information in the image is identified.
The feature point information includes material feature point information and action feature point information. The material feature point information is used to represent the material attribute (which may also be called the type of the material) and the material position, and the action feature point information is used to represent the worker's operation actions.
In step S13, a target action appearing in the image is determined from the feature point information.
For example, the worker's operation actions can be determined according to the action feature point information in the feature point information, and the target action, which is defined manually in advance, is screened out from these operation actions; for example, the target action may be a material-placing action or a release-film tearing action.
In step S14, the degree of deviation of the target action is determined.
The target action can be determined according to the action feature points and then compared with a predefined standard action to determine the degree of deviation of the target action.
In step S15, a prompt message is sent out if the deviation is higher than a preset threshold.
When the degree of deviation of the target action is higher than the preset threshold, the target action is not compliant; prompt information is then sent out to remind the worker to correct the current operation action, or the non-compliance is reported to a supervisor so that the supervisor can discover it as soon as possible.
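The flow of steps S13 to S15 can be sketched as follows. This is a minimal illustration under assumed representations: keypoints as (x, y) tuples, the degree of deviation as the mean Euclidean distance to a standard action's keypoints, and a simple message string as the prompt. The patent does not fix any of these choices.

```python
import math

def deviation_degree(action_points, standard_points):
    """Mean Euclidean distance between corresponding keypoints (assumed metric)."""
    dists = [math.dist(p, q) for p, q in zip(action_points, standard_points)]
    return sum(dists) / len(dists)

def check_action(action_points, standard_points, threshold):
    """Steps S14-S15: compute the deviation and prompt when it exceeds the threshold."""
    dev = deviation_degree(action_points, standard_points)
    if dev > threshold:
        return f"prompt: action deviates by {dev:.2f}, please correct"
    return "ok"

detected = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)]
standard = [(0.0, 0.0), (1.0, 1.5), (2.0, 2.0)]
print(check_action(detected, standard, threshold=0.3))  # "ok": mean deviation ~0.17
```

A lower threshold would flag the same pose: `check_action(detected, standard, threshold=0.1)` returns a prompt string instead.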
Because the image data of the workbench during operation is collected by equipment, manual supervision of workers is not needed, saving time and human resources; moreover, when a problem is found in a manufactured product, its cause can be traced back from the collected image data of the worker's operation, further reducing the defect rate of manufactured products.
Fig. 3 is a flowchart of a job detection method shown in an exemplary embodiment, used in an electronic device such as the camera or the server shown in fig. 1. Note that the job detection method shown in fig. 3 is consistent with the job detection method shown in fig. 2; for details not described here, refer to fig. 2. The job detection method includes the following steps.
In step S21, an image of the workbench during operation is acquired.
By way of example, image data of a worker working at a workbench may be acquired by the camera shown in fig. 1; the image data may be, but is not limited to, videos and pictures of the worker's operation. When workers operate on an intelligent-manufacturing assembly line, they usually assemble finished products, and the compliance of their material-placing actions and other operation actions is an important index to be supervised. It can be understood that the collected images of the worker's operation include both material images and worker action images, that is, the assembly actions performed during the operation, the placement positions of the materials, which assembly materials are used, the kinds of materials placed on the workbench, and the like.
In step S22, feature point information in the image is identified.
The feature point information includes material feature point information and action feature point information. The material feature point information is used to represent material attributes, the material attributes including material type, material position and material color, and the action feature point information is used to represent the worker's operation actions.
For example, when the image is a video of a worker's operation, key frames in the video may be obtained, a key frame being a video frame that can represent the worker's operation. Image detection algorithms common in the art, such as classification algorithms and object detection algorithms, are then used to detect the material feature point information, and the material type and material position are obtained from it. For example, the material feature point information of a material image in a key frame can be obtained through an image detection algorithm, the material feature points being points that can represent the edges of the material contour; the material position is then determined according to the material feature point information. After the material position is determined, image features of the material region in the key frame can be extracted, and the material type can be obtained from these image features combined with classification algorithms common in the art.
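The contour-based position step described above can be sketched as follows, assuming the material feature points are (x, y) pixel coordinates on the material contour and taking their axis-aligned bounding box as the material position; both assumptions are illustrative, since the disclosure does not fix a position representation.

```python
def material_position(feature_points):
    """Derive the material position from contour feature points.

    feature_points: (x, y) pixel coordinates on the material's contour.
    Returns the bounding box (left, top, right, bottom) of the contour.
    """
    xs = [p[0] for p in feature_points]
    ys = [p[1] for p in feature_points]
    return (min(xs), min(ys), max(xs), max(ys))

# Four hypothetical contour points detected in a key frame:
outline = [(12, 40), (88, 35), (90, 120), (10, 118)]
print(material_position(outline))  # (10, 35, 90, 120)
```

The region inside this box is what the description crops to extract image features for material-type classification.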
It should be noted that when the material attribute is detected to be inconsistent with a specified condition, prompt information is sent. For example, if only materials related to the operation are allowed on the workbench and other unrelated materials are not, and the identified material type is a material that is not allowed, it can be determined that a foreign object has been placed on the workbench. The server can then send prompt information to the user terminal of the worker corresponding to the workbench, prompting the worker to remove the foreign object so that the materials are compliant, or notify a supervisor so that the supervisor can discover the foreign object as soon as possible. Whether the material placement position is compliant can also be determined from the material position: material placement on the workbench should conform to the relevant regulations, and if the position of a material is detected not to conform, an abnormality of the material placement position can be determined, and prompt information can be sent through the server to the user terminal of the worker corresponding to the workbench, prompting the worker to correct the position of the material.
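The compliance checks in this paragraph can be sketched as follows; the allowed-type set, the rectangular placement zone, and the prompt strings are all illustrative assumptions, not conditions specified by the disclosure.

```python
# Hypothetical compliance rules: only these material types are allowed,
# and materials must sit inside this (left, top, right, bottom) zone.
ALLOWED_TYPES = {"screw", "bracket", "release_film"}
PLACEMENT_ZONE = (0, 0, 200, 150)

def check_material(material_type, position):
    """Return a list of prompt messages; an empty list means the material is compliant."""
    prompts = []
    if material_type not in ALLOWED_TYPES:
        prompts.append(f"foreign object '{material_type}': please remove it")
    x, y = position
    left, top, right, bottom = PLACEMENT_ZONE
    if not (left <= x <= right and top <= y <= bottom):
        prompts.append(f"'{material_type}' misplaced at {position}: please correct")
    return prompts

print(check_material("screw", (50, 60)))       # [] -- compliant
print(check_material("coffee_cup", (50, 60)))  # one foreign-object prompt
```

In the system of fig. 1, the returned messages would be what the server 140 forwards to the worker's user terminal 130.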
In step S23, a target action appearing in the image is determined from the feature point information.
For example, the worker's operation actions can be determined according to the action feature point information in the feature point information, and the target action, which is defined manually in advance, is screened out from these operation actions; for example, the target action may be a material-placing action or a release-film tearing action. The material-placing action and the release-film tearing action can be determined according to the material type, the material position and the action feature point information.
The method for determining the worker's operation actions according to the action feature point information includes: detecting a person image in the images, and determining the worker's operation actions according to the person image. For example, when the image data is a video of the worker's operation, key frames in the video may be obtained, a key frame being a video frame that can represent the worker's operation, and the person image in the key frame is then cropped out using image detection algorithms common in the art; for example, a binary classification algorithm detects the face region in the key frame, and the whole person image is then located from the face region. Because a binary face-detection classifier is simple and computationally light, it greatly improves the efficiency of detecting the person image. Hand skeleton points and human skeleton points are then detected in the person image, for example with a key point detection technique, yielding hand and human skeleton point detection results. The worker's operation actions are determined from these results, the actions are classified, the times of the key actions are determined, and the target action is screened out from the operation actions; for example, the target action may be a material-placing action or a release-film tearing action.
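The final screening step above — picking target actions out of the classified per-frame operation actions — can be sketched as follows, assuming each frame has already been labeled by the action classifier; the label names and the run-merging rule are illustrative assumptions.

```python
# Hypothetical target-action labels defined in advance:
TARGET_ACTIONS = {"place_material", "tear_release_film"}

def screen_target_actions(frame_labels, fps=30):
    """Merge consecutive frames with the same target label into action segments.

    frame_labels: per-frame action labels from the classifier.
    Returns (label, start_seconds, end_seconds) for each target-action run,
    which also gives the times of the key actions mentioned in the description.
    """
    segments = []
    start = None
    for i, label in enumerate(frame_labels + [None]):  # sentinel flushes the last run
        if start is not None and (i == len(frame_labels) or label != frame_labels[start]):
            if frame_labels[start] in TARGET_ACTIONS:
                segments.append((frame_labels[start], start / fps, i / fps))
            start = None
        if start is None and i < len(frame_labels):
            start = i
    return segments

labels = ["idle"] * 3 + ["place_material"] * 6 + ["idle"] * 2
print(screen_target_actions(labels, fps=3))  # [('place_material', 1.0, 3.0)]
```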
In one embodiment, the worker's actions in the operation video may be detected using an optical flow histogram, which counts the optical flow information in the video to represent the motion of objects, allowing a computer to distinguish the actions in the video. However, a worker's action usually lasts a long time from beginning to end, while optical flow only describes the movement of an object between two adjacent frames; the optical flow histogram divides time coarsely and is limited in the temporal dimension, so accurately describing long-lasting motion requires combining information from multiple consecutive frames. In another embodiment, trajectory features can therefore be used to describe the motion state of the person. The horizontal and vertical components of the optical flow at each pixel are separated into maps representing the displacements in the horizontal and vertical directions; taking the horizontal displacements and rescaling their values into the range 0 to 255 yields a grayscale image called the horizontal optical-flow grayscale map, and the vertical optical-flow grayscale map is obtained in the same way. The horizontal and vertical optical-flow grayscale maps can be used as input to a convolutional neural network to extract the motion features in the video. The information in a video can be divided into static and dynamic aspects: static information is the appearance of objects in the image, including the related scenes and objects (such as stationary materials), and can be obtained from still picture frames; dynamic information is the motion of a person or object across the video sequence and can be obtained from the optical-flow grayscale maps.
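The rescaling that produces an optical-flow grayscale map can be sketched as follows: one flow component (horizontal or vertical displacement, in pixels) is clipped and linearly mapped into the 0-255 range. The clipping bound is an assumed parameter; the disclosure only states that the displacement values are put between 0 and 255.

```python
def flow_to_gray(component, max_disp=20.0):
    """Map one optical-flow component into an 8-bit grayscale image.

    component: 2-D list of per-pixel displacements (horizontal OR vertical).
    Displacements in [-max_disp, +max_disp] map linearly to [0, 255];
    zero motion lands mid-gray, extreme motion is clipped.
    """
    gray = []
    for row in component:
        out_row = []
        for d in row:
            d = max(-max_disp, min(max_disp, d))  # clip extreme motion
            out_row.append(round((d + max_disp) * 255 / (2 * max_disp)))
        gray.append(out_row)
    return gray

flow_x = [[0.0, 20.0], [-20.0, 10.0]]
print(flow_to_gray(flow_x))  # [[128, 255], [0, 191]]
```

Applying the same function to the vertical component gives the vertical optical-flow grayscale map, and the two maps together form the motion input described above.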
A two-stream convolutional neural network, widely used in video action recognition, uses two different networks to process static and dynamic information simultaneously. It consists of a temporal-stream convolutional neural network and a spatial-stream convolutional neural network. Because the two streams are independent convolutional neural networks, the optical flow is decomposed into horizontal and vertical optical-flow maps, which are fed into the temporal-stream network to extract motion features, letting the computer learn the motion information in the optical-flow maps automatically. The convolutional networks thus have highly efficient feature-expression capability: features from low-level pixels to high-level semantics are extracted layer by layer, making recognition of the worker's operation actions more effective.
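The two-stream idea can be illustrated with a toy late-fusion step: the spatial stream scores an action from still frames (appearance) while the temporal stream scores it from the optical-flow maps (motion), and the per-class scores are averaged. The class list and averaging fusion are illustrative assumptions; real two-stream networks are learned convolutional models, not hand-written score tables.

```python
# Hypothetical action classes for this production line:
ACTIONS = ["place_material", "tear_release_film", "idle"]

def fuse(spatial_scores, temporal_scores):
    """Late fusion: average the per-class scores of the two streams
    and return the winning action label with the fused scores."""
    fused = [(s + t) / 2 for s, t in zip(spatial_scores, temporal_scores)]
    best = max(range(len(fused)), key=fused.__getitem__)
    return ACTIONS[best], fused

# Appearance alone favors "place_material", but the motion evidence
# from the optical-flow stream tips the fused decision:
action, scores = fuse([0.6, 0.3, 0.1], [0.2, 0.7, 0.1])
print(action)  # tear_release_film
```

This shows why the streams are kept independent: each can be trained on its own modality and they only meet at the score level.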
In step S24, the degree of deviation of the target action is determined.
The target action determined in the above steps is compared with a predefined standard action to determine its degree of deviation.
In step S25, a prompt message is sent out if the deviation is higher than a preset threshold.
When the degree of deviation of the target action is higher than the preset threshold, the target action is not compliant; prompt information is then sent out to remind the worker to correct the current operation action, or the non-compliance is reported to a supervisor so that the supervisor can discover it as soon as possible.
It should be noted that the present disclosure also determines a completion rate and a completion speed of the material-placing action and the release-film tearing action, and then adjusts the transfer speed of each material according to the completion rate and the completion speed, ensuring that the transfer speed of the materials stays within a reasonable range.
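The speed adjustment can be sketched as a simple proportional rule with clamping; the rule, its constants, and the use of the completion rate as the scaling factor are all assumptions, since the disclosure only states that the transfer speed is kept within a reasonable range.

```python
def adjust_transfer_speed(current_speed, completion_rate, min_speed=0.5, max_speed=2.0):
    """Scale the line's transfer speed by the fraction of target actions
    completed on time, clamped to an assumed reasonable range."""
    new_speed = current_speed * completion_rate
    return max(min_speed, min(max_speed, new_speed))

print(adjust_transfer_speed(1.0, 0.8))  # 0.8 -- slow down when workers fall behind
print(adjust_transfer_speed(1.0, 1.2))  # 1.2 -- speed up when capacity allows
print(adjust_transfer_speed(1.0, 0.1))  # 0.5 -- clamped to the minimum
```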
It should be noted that the server shown in fig. 1 is further configured to store at least one of the compliance status of the material attributes of the workbench, the workers' operation actions, the compliance status of those actions and the times of the actions, and to store the image data of the workers working at the workbench, so that when a problem occurs in a manufactured product, its cause can be traced back by replaying the image data, thereby reducing the defect rate of manufactured products.
In summary, the job detection method provided in the present disclosure includes: collecting an image of a workbench during operation, identifying feature point information in the image, determining a target action appearing in the image according to the feature point information, determining the degree of deviation of the target action, and sending out prompt information when the degree of deviation is higher than a preset threshold. Because the image data of the workbench during operation is collected by equipment, manual supervision of workers is not needed, saving time and human resources; moreover, when a problem is found in a manufactured product, its cause can be traced back from the collected image data of the worker's operation, further reducing the defect rate of manufactured products.
Fig. 4 is a block diagram of a job detection device according to an example embodiment. Referring to fig. 4, the apparatus 20 includes an acquisition module 201, an identification module 202, and a processing module 203.
The acquisition module 201 is configured to acquire an image of a workbench during operation;
the identifying module 202 is configured to identify feature point information in the image;
the processing module 203 is configured to determine a target action appearing in the image according to the feature point information;
the processing module 203 is further configured to determine a degree of deviation of the target action;
the processing module 203 is further configured to send out a prompt message if the deviation is higher than a preset threshold.
Optionally, the identifying module 202 is further configured to identify the material feature point information and the action feature point information in the image.
Optionally, the processing module 203 is further configured to determine a material attribute in the image according to the material feature point information; the material attribute comprises material type, material position and material color;
and sending out the prompt information under the condition that the material attribute does not accord with the specified condition.
Optionally, the processing module 203 is further configured to determine the material-placing action and the release-film tearing action according to the material type, the material position and the action feature point information.
Optionally, the processing module 203 is further configured to determine a completion rate and a completion speed of the material-placing action and the release-film tearing action, and to adjust the transfer speed of each material according to the completion rate and the completion speed.
The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be elaborated here.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the job detection method provided by the present disclosure.
Fig. 5 is a block diagram illustrating an apparatus for job detection, according to an example embodiment. For example, the apparatus 800 may be the camera, terminal or user terminal in fig. 1, or a mobile phone, computer, digital broadcast terminal, messaging device, tablet device, personal digital assistant, etc.
Referring to fig. 5, apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the apparatus 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. Processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the job detection method described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on the device 800, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 806 provides power to the various components of the apparatus 800. The power component 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the apparatus 800 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 800 is in an operational mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The input/output interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, or the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the apparatus 800. For example, the sensor assembly 814 may detect the on/off state of the apparatus 800 and the relative positioning of components, such as the display and keypad of the apparatus 800. The sensor assembly 814 may also detect a change in position of the apparatus 800 or a component of the apparatus 800, the presence or absence of user contact with the apparatus 800, the orientation or acceleration/deceleration of the apparatus 800, and a change in the temperature of the apparatus 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 and other devices. The apparatus 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the job detection method described above.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as the memory 804 including instructions executable by the processor 820 of the apparatus 800 to perform the job detection method described above. For example, the non-transitory computer readable storage medium may be a ROM, Random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, or the like.
The apparatus may be a stand-alone electronic device or part of a stand-alone electronic device. For example, in one embodiment, the apparatus may be an integrated circuit (IC) or a chip, where the integrated circuit may be a single IC or a collection of ICs; the chip may include, but is not limited to, a GPU (Graphics Processing Unit), a CPU (Central Processing Unit), an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an SoC (System on Chip), and the like. The integrated circuit or chip may be configured to execute executable instructions (or code) to implement the job detection method described above. The executable instructions may be stored on the integrated circuit or chip, or may be retrieved from another device or apparatus; for example, the integrated circuit or chip may include a processor, a memory, and an interface for communicating with other devices. The executable instructions may be stored in the memory and, when executed by the processor, implement the job detection method described above; alternatively, the integrated circuit or chip may receive the executable instructions through the interface and transmit them to the processor for execution, so as to implement the job detection method described above.
In another exemplary embodiment, a computer program product is also provided, the computer program product comprising a computer program executable by a programmable apparatus, the computer program having code portions for performing the job detection method described above when executed by the programmable apparatus.
Fig. 6 is a block diagram illustrating an apparatus for job detection, according to an example embodiment. For example, the apparatus 1900 may be provided as a server as shown in fig. 1. Referring to fig. 6, the apparatus 1900 includes a processing component 1922 that further includes one or more processors and memory resources represented by memory 1932 for storing instructions, such as application programs, that can be executed by the processing component 1922. The application programs stored in memory 1932 may include one or more modules each corresponding to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the job detection method described above.
The apparatus 1900 may further comprise a power component 1926 configured to perform power management of the apparatus 1900, a wired or wireless network interface 1950 configured to connect the apparatus 1900 to a network, and an input/output interface 1958. The apparatus 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. A job detection method, comprising:
collecting an image during operation of a workbench;
identifying feature point information in the image;
determining a target action appearing in the image according to the characteristic point information;
determining the deviation degree of the target action;
and sending out prompt information under the condition that the deviation degree is higher than a preset threshold value.
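The claimed pipeline (collect an image, extract characteristic points, recognize the target action, score its deviation, and prompt above a preset threshold) can be sketched as follows. This is a minimal illustration only: the characteristic points, the reference action template, and the threshold value are hypothetical placeholders, not the patent's actual implementation.

```python
import math

# Hypothetical reference template: expected keypoint positions for the target action.
ACTION_TEMPLATE = [(10.0, 20.0), (15.0, 25.0), (30.0, 40.0)]
DEVIATION_THRESHOLD = 5.0  # the preset threshold of claim 1 (value is illustrative)

def deviation_degree(observed, template):
    """Mean Euclidean distance between observed and template keypoints."""
    return sum(math.dist(o, t) for o, t in zip(observed, template)) / len(template)

def detect_job(observed_points):
    """Return a prompt message when the action deviates beyond the threshold."""
    dev = deviation_degree(observed_points, ACTION_TEMPLATE)
    if dev > DEVIATION_THRESHOLD:
        return f"prompt: action deviates by {dev:.1f}"
    return None  # action within tolerance, no prompt
```

In practice the observed points would come from a feature-point recognizer run on the captured image; here they are passed in directly to keep the sketch self-contained.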
2. The method according to claim 1, wherein the characteristic point information includes material characteristic point information and action characteristic point information; identifying feature point information in the image, comprising:
and identifying the material characteristic point information and the action characteristic point information in the image.
3. The method as recited in claim 2, further comprising:
determining material properties in the image according to the material characteristic point information; the material attribute comprises material type, material position and material color;
and sending out the prompt information under the condition that the material attribute does not accord with the specified condition.
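A minimal check of claim 3's material-attribute condition (type, position, and color against a specified condition) might look like the sketch below; the attribute names, spec values, and position tolerance are hypothetical assumptions for illustration.

```python
# Hypothetical specified condition for one material (names and values illustrative).
SPEC = {"type": "film", "position": (120, 80), "color": "blue"}
POSITION_TOLERANCE = 10  # pixels

def material_conforms(attrs, spec=SPEC, tol=POSITION_TOLERANCE):
    """Check material type, position (within tolerance), and color against the spec."""
    dx = attrs["position"][0] - spec["position"][0]
    dy = attrs["position"][1] - spec["position"][1]
    return (attrs["type"] == spec["type"]
            and dx * dx + dy * dy <= tol * tol
            and attrs["color"] == spec["color"])
```

A non-conforming result would trigger the prompt information of claim 3.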
4. The method according to claim 3, wherein the target actions include a material placing action and a release-film tearing action, and the determining the target actions appearing in the image according to the characteristic point information comprises:
determining the material placing action and the release-film tearing action according to the material type, the material position, and the action characteristic point information.
5. The method as recited in claim 4, further comprising:
determining the completion rate and the completion speed of the material placing action and the release-film tearing action;
and adjusting the transfer speed of each material according to the completion rate and the completion speed.
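Claim 5's feedback from completion rate and completion speed to material transfer speed could be sketched as a simple proportional rule; the target rate, speed-up factor, and floor are illustrative assumptions, not values taken from the patent.

```python
def adjust_transfer_speed(current_speed, completion_rate, completion_speed,
                          target_rate=0.95, min_speed=0.1):
    """Slow the material transfer when too many actions are incomplete;
    otherwise ramp up gently, capped by the operator's completion speed.
    All factors here are illustrative."""
    if completion_rate < target_rate:
        # Proportionally slow down so the operator can keep up.
        return max(min_speed, current_speed * completion_rate / target_rate)
    # Never feed material faster than actions are being completed.
    return min(completion_speed, current_speed * 1.05)
```

For example, a 50% completion rate roughly halves the transfer speed, while a fully keeping-up operator lets the speed creep upward.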
6. A job detection apparatus, comprising:
the acquisition module is configured to acquire images during operation of the workbench;
an identification module configured to identify feature point information in the image;
a processing module configured to determine a target action occurring in the image from the feature point information;
the processing module is further configured to determine a degree of deviation of the target action;
the processing module is further configured to send out prompt information under the condition that the deviation degree is higher than a preset threshold value.
7. The apparatus according to claim 6, wherein
the identification module is further configured to identify the material characteristic point information and the action characteristic point information in the image.
8. The apparatus according to claim 7, wherein
the processing module is further configured to determine material properties in the image according to the material characteristic point information; the material attribute comprises material type, material position and material color;
and sending out the prompt information under the condition that the material attribute does not accord with the specified condition.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the executable instructions to implement the steps of the method of any one of claims 1 to 5.
10. A computer readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the method of any of claims 1 to 5.
CN202210969057.7A 2022-08-12 2022-08-12 Job detection method, job detection device, electronic equipment and readable storage medium Pending CN117636451A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210969057.7A CN117636451A (en) 2022-08-12 2022-08-12 Job detection method, job detection device, electronic equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN117636451A true CN117636451A (en) 2024-03-01

Family

ID=90029132

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210969057.7A Pending CN117636451A (en) 2022-08-12 2022-08-12 Job detection method, job detection device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN117636451A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination