CN111242943A - Image processing method, image processing apparatus, image processing device, and storage medium - Google Patents

Info

Publication number
CN111242943A
CN111242943A (application CN202010076749.XA)
Authority
CN
China
Prior art keywords
image
target
detection model
detected
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010076749.XA
Other languages
Chinese (zh)
Other versions
CN111242943B (en)
Inventor
黎少伟
江列琼
方超
艾义
殷年俊
龙建军
郑光文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010076749.XA priority Critical patent/CN111242943B/en
Publication of CN111242943A publication Critical patent/CN111242943A/en
Application granted granted Critical
Publication of CN111242943B publication Critical patent/CN111242943B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/13Satellite images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Quality & Reliability (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the invention disclose an image processing method, apparatus, device, and storage medium. The method includes: receiving an image to be detected and acquiring a target detection model corresponding to the image to be detected; detecting, in a mapping table, a target image processor bound to the target detection model, where the mapping table records the binding relationships between a plurality of image processors and a plurality of detection models, each image processor is used to run one or more detection models, and each detection model is trained on corresponding training images and the defect supervision labels of those images; and, if the target image processor is found, transmitting the image to be detected to the target image processor so that defect detection is performed on it by running the target detection model on the target image processor. Embodiments of the invention can improve the accuracy of image processing.

Description

Image processing method, image processing apparatus, image processing device, and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an image processing device, and a storage medium.
Background
With the development of artificial intelligence, image recognition technology, as one of the important research directions of artificial intelligence, is widely applied in the field of image processing, for example license plate recognition and automatic image review in unmanned aerial vehicle (UAV) inspection projects. In automatic image review for UAV inspection, the uploaded images shot by the UAV are read by artificial intelligence instead of manually, and corresponding image detection models are called to identify defects in the images and to classify and mark those defects.
At present, in automatic image review for UAV inspection, an image processing device generally includes a plurality of image processors, each image processor is designated to run one detection model, and when the image processing device is initialized, the detection models corresponding to all the image processors start running. When an image to be detected needs to be detected, the detection model needed for the image is determined, and the image is transmitted to the image processor running that detection model. With this approach, once an image processor fails, the detection model bound to it stops running, and images that need this model either cannot be detected at all or are detected by other models, which reduces the fault tolerance of the image processing device and affects image processing accuracy. Therefore, how to perform image processing accurately has become a hot issue in recent research in the field of image processing.
Disclosure of Invention
The embodiment of the invention provides an image processing method, an image processing device, image processing equipment and a storage medium, which can improve the accuracy of image processing.
In one aspect, an embodiment of the present invention provides an image processing method, including:
receiving an image to be detected, and acquiring a target detection model corresponding to the image to be detected;
detecting a target image processor bound with the target detection model in a mapping table, wherein the mapping table records the binding relationship between a plurality of image processors and a plurality of detection models, each image processor is used for operating one or more detection models, and each detection model is obtained based on a corresponding training image and defect supervision labels corresponding to the training image;
and if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to carry out defect detection on the image to be detected by running the target detection model in the target image processor.
In one aspect, an embodiment of the present invention provides an image processing apparatus, including:
the receiving unit is used for receiving an image to be detected;
the acquisition unit is used for acquiring a target detection model corresponding to the image to be detected;
the processing unit is used for detecting a target image processor bound with the target detection model in a mapping table, the mapping table records the binding relationship between a plurality of image processors and a plurality of detection models, each image processor is used for operating one or more detection models, and each detection model is obtained based on a corresponding training image and defect supervision labels corresponding to the training image;
and the transmission unit is used for transmitting the image to be detected to the target image processor if the target image processor is obtained through detection so as to carry out defect detection on the image to be detected by operating the target detection model in the target image processor.
In one aspect, an embodiment of the present invention provides an image processing device, including: a processor adapted to implement one or more instructions; and
a computer storage medium storing one or more instructions adapted to be loaded by the processor and to perform the steps of:
receiving an image to be detected, and acquiring a target detection model corresponding to the image to be detected;
detecting a target image processor bound with the target detection model in a mapping table, wherein the mapping table records the binding relationship between a plurality of image processors and a plurality of detection models, each image processor is used for operating one or more detection models, and each detection model is obtained based on a corresponding training image and defect supervision labels corresponding to the training image;
and if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to carry out defect detection on the image to be detected by running the target detection model in the target image processor.
In one aspect, an embodiment of the present invention provides a computer storage medium, where computer program instructions are stored, and when executed by a processor, the computer program instructions are configured to execute the image processing method as described above.
In the embodiment of the invention, when the image processing device receives an image to be detected, it determines the target detection model required to detect the image, and then searches a mapping table for the target image processor bound to the target detection model, where the mapping table records the binding relationships between a plurality of image processors and a plurality of detection models. After the target image processor is found, the image to be detected is transmitted to it, and the target image processor performs defect detection on the image by running the target detection model. Because any detection model can be run on any image processor, the failure of a single image processor does not prevent any detection model from running, which avoids images going undetected or being misidentified and thus improves the accuracy of image processing.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an image processing system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating interaction among modules in an image processing apparatus according to an embodiment of the present invention;
FIG. 3a is a schematic diagram of dynamically scheduling a target detection model according to an embodiment of the present invention;
FIG. 3b is a schematic diagram of scheduling a detection model according to an embodiment of the present invention;
FIG. 3c is an interaction diagram for dynamically scheduling a target detection model according to an embodiment of the present invention;
FIG. 4 is a flowchart illustrating an image processing method according to an embodiment of the present invention;
FIG. 5 is a flow chart of another image processing method according to an embodiment of the present invention;
FIG. 6a is a schematic diagram of determining a target detection model according to an embodiment of the present invention;
FIG. 6b is a schematic diagram of another way of determining a target detection model according to an embodiment of the present invention;
FIG. 6c is a schematic diagram of yet another way of determining a target detection model according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
The embodiment of the invention provides an image processing scheme that can be applied to application scenarios in the intelligent manufacturing industry with many models and uneven traffic across models, such as UAV inspection. In a specific implementation, a mapping table records the binding relationships between a plurality of image processors and a plurality of detection models; when an image processing device receives an image to be detected, it acquires the target detection model corresponding to the image, detects in the mapping table the target image processor bound to the target detection model, and, if the target image processor is found, transmits the image to the target image processor so that defect detection is performed by running the target detection model on it. Because any detection model can be run on any image processor, the failure of a single image processor does not prevent any detection model from running, which avoids images going undetected or being misidentified and thus improves the accuracy of image processing.
Based on the image processing scheme, an embodiment of the present invention provides an image processing system, and referring to fig. 1, a schematic structural diagram of the image processing system provided in the embodiment of the present invention is shown. The image processing system shown in fig. 1 may include an image acquisition device 101 and an image processing device 102, wherein the image acquisition device 101 is configured to acquire an image to be detected and transmit the acquired image to be detected to the image processing device 102. The image capturing device 101 may be a stand-alone device, or may be a device configured in the image processing device 102. The image processing device 102 is configured to receive the image to be detected transmitted by the image acquisition device 101, and perform defect detection on the image to be detected.
In one embodiment, the image processing device 102 may store a plurality of detection models, each of which is used to detect one or more defects. For example, in a UAV inspection application, one detection model may be used to detect one or more of the following defects: a power transmission channel hazard source, insulator burst on a power transmission line tower, vibration damper loss, pin loss, and so on. Optionally, each detection model may be trained based on corresponding training images and the defect supervision labels of those images. Because each detection model detects different defects, when training a detection model, images that include the defects it is meant to detect are selected as its training images. For example, if a detection model is used to detect insulator burst on power transmission line towers, images with the insulator burst defect are selected as its training images; if a detection model is used to detect power transmission channel hazard sources, images with that defect may be selected as its training images.
In one embodiment, the image processing system shown in fig. 1 may include a model training module 1021, and the model training module 1021 may be located in the image processing apparatus 102 or may be independent from the image processing apparatus 102. In the embodiment of the present invention, the model training module 1021 is located in the image processing apparatus 102 as an example. The model training module 1021 is configured to train each detection model, and specifically, the model training module 1021 trains each detection model based on a training image and a defect supervision label corresponding to each detection model to obtain a trained detection model.
In one embodiment, the image processing device 102 may further include an artificial intelligence management module 1022 connected to the model training module 1021. The model training module 1021 transmits the trained detection models to the artificial intelligence management module 1022, which manages and schedules them.
In one embodiment, the image processing device 102 may further include a plurality of image processors 1023 (also referred to as graphics cards). An image processor is a microprocessor dedicated to image- and graphics-related operations on devices such as personal computers, game consoles, and some mobile devices. In the embodiment of the present invention, each image processor 1023 may be configured to run any one or more detection models so that defect detection is implemented by running those models.
In one embodiment, the image processing device 102 further includes a data collection module 1024, the data collection module 1024 may be connected to the image capturing device 101, and the image processing device 102 receives the image to be detected transmitted by the image capturing device 101 through the data collection module 1024.
In one embodiment, the image processing device 102 may further include a matching module 1025 connected to the data collection module 1024. After the data collection module 1024 receives the image to be detected, it may transmit the image to the matching module 1025, and the matching module 1025 may determine the target detection model required to detect the image based on the position of the image capturing device when the image was captured.
In one embodiment, the image processing device may further include an image processor scheduling module 1026, which may be connected with the artificial intelligence management module 1022 and the image processors 1023.
In summary, the interaction among the image acquisition device, the image processing device, and the modules of the image processing device may refer to fig. 2. Specifically:
the model training module 1021 trains to obtain a plurality of detection models, and transmits the plurality of detection models to the artificial intelligence management module 1022; after the image acquisition device 101 acquires the image to be detected, the image to be detected is transmitted to a data collection module 1024 in the image processing device; the data collection module 1024 transmits the image to be detected to the matching module 1025, the matching module 1025 performs matching processing on the image to be detected to obtain a target detection module corresponding to the image to be detected, then the artificial intelligence management module 1022 is notified, the artificial intelligence management module 1022 performs dynamic scheduling, and the target detection module is scheduled to the target image processor to operate so as to realize defect detection of the image to be detected. Optionally, the image processing apparatus may further include an image defect type classification module, and after the artificial intelligence module dynamically schedules the detection model to perform defect detection on the image to be detected, the detection result is transmitted to the image defect type classification module, so as to classify and sort the image.
In an embodiment, a schematic diagram of the artificial intelligence management module 1022 dynamically scheduling the target detection model for defect detection is shown in fig. 3a, which may specifically include:
in fig. 3a, the image processing device 102 includes a matching module 1025 that notifies the artificial intelligence management module 1022 after determining the target detection model; the artificial intelligence module 1022 first searches a mapping table, and detects whether a target image processor bound to the target detection model exists in the mapping table; if so, transmitting the image to be detected to a target image processor so as to detect the defect of the image to be detected through a target detection model operated in the target image processing; if the artificial intelligence management module 1022 does not find the target image processor bound to the target detection model in the mapping table, the notification image processor scheduling module 1026 selects the target image processor bound to the target detection model from the plurality of image processors according to the operating condition of each image processor, and schedules the target detection module to the target image processor for operation; further, the target detection model and the target image processor are bound, and the binding relationship between the target detection model and the target image processor is recorded in the mapping table, so that if a new image to be detected needs to be detected by the target detection model, the new image to be detected can be directly transmitted to the target image processor for detection according to the record in the mapping table.
In the prior art, the binding relationship between detection models and image processors is fixed in advance: when the image processing device is initialized, all detection models are loaded onto their bound image processors to run, and when a detection model needs to be called for defect detection, the image to be detected is transmitted to that image processor. Referring to fig. 3b, a schematic diagram of defect detection of an image to be detected in the prior art is shown. In fig. 3b, the image processor corresponding to each detection model is preset, for example detection model 1 runs on image processor 1, detection model 2 runs on image processor 2, detection model 3 runs on image processor 3, and the target detection model runs on the target image processor; after the target detection model required for defect detection of an image is determined, the image is transmitted to the target image processor. Since the detection model running on each image processor is fixed but the detection models required by incoming images are not, there may be 100 images that need detection model 1 while far fewer images need detection models 2 and 3. This causes unbalanced resource usage and affects image processing efficiency. It also gives the image processing device poor fault tolerance: if the image processor bound to a detection model fails, that detection model can no longer be used, and images that subsequently need it either cannot be detected or are detected incorrectly, which reduces image processing accuracy.
In one embodiment, the interaction between the various modules in fig. 3a may be as described with reference to fig. 3c. If the detection result is that the target image processor bound to the target detection model is found in the mapping table, which may be represented as "[ a target image processor bound to the target detection model exists in the mapping table = TRUE ]", the artificial intelligence management module 1022 has found the target image processor and transmits the image to be detected to it. If the detection result is that no such processor is found, which may be represented as "[ a target image processor bound to the target detection model exists in the mapping table = FALSE ]", the artificial intelligence management module 1022 notifies the image processor scheduling module 1026 to allocate a target image processor to the target detection model, and after allocation is completed the binding relationship between the target detection model and the target image processor is saved in the mapping table.
Practice comparing the prior-art scheme with the image processing scheme of the embodiment of the invention shows the following. Under the existing fixed binding between image processors and detection models, the number of image processors is positively correlated with the total number of detection models; that is, as many image processors are needed as there are detection models. In the image processing method of the embodiment of the invention, image processors and detection models are bound dynamically, so the number of image processors is positively correlated with the number of detection models that are actually running; that is, the number of image processors that need to be started changes with the number of running detection models, not all image processors need to be started, the number of image processors running simultaneously is reduced, and the running overhead of the image processing device is therefore reduced.
Compared with the prior art, the image processing system of the embodiment of the invention records the dynamic binding relationships between a plurality of image processors and a plurality of detection models in a mapping table; when an image to be detected is received and its target detection model is determined, the target image processor bound to the target detection model is looked up in the mapping table, and if it is found, the image is transmitted to it so that it performs defect detection by running the target detection model. Because the binding between an image processor and its one or more detection models is not fixed, even if any image processor fails, all detection models can be dynamically rebound to other image processors, which improves the fault tolerance of the image processing device. In addition, because image processors and detection models are bound dynamically, the situation in which one image processor runs many detection models while another sits idle is avoided, so resources are used more evenly and image processing efficiency is improved.
Based on the image processing system, an embodiment of the present invention provides an image processing method, and referring to fig. 4, a flowchart of the image processing method according to the embodiment of the present invention is shown. The image processing method shown in fig. 4 may be performed by an image processing apparatus, which may be the image processing apparatus in the image processing system shown in fig. 1, and may include any one or more of the following modules: the system comprises a data collection module, a matching module, an artificial intelligence management module, an image processor scheduling module, a model training module and the like. Specifically, the image processing method shown in fig. 4 may be executed by scheduling the respective modules by a processor of the image processing apparatus. The image processing method shown in fig. 4 may include the steps of:
step S401, receiving an image to be detected, and acquiring a target detection model corresponding to the image to be detected.
The image to be detected is an image acquired by an image acquisition device that needs defect detection. The image acquisition device may be configured in the image processing device, such as an image sensor in the image processing device; alternatively, it may be a device independent of the image processing device, such as a camera or a video camera; or it may be an image sensor configured in another terminal device, such as an image sensor on an unmanned aerial vehicle or in an unmanned vehicle.
In one embodiment, a possible defect in the image to be detected actually refers to a possible defect in the object being photographed. For example, if the image to be detected is obtained by an unmanned aerial vehicle photographing a power transmission channel between two signal towers, the photographed object is the power transmission channel between the towers, and a possible defect in the image refers to a possible defect in that channel, for example the presence of a hazard source (a dangerous obstacle such as a mountain fire, a dangerous vehicle, or a foreign object). If the image to be detected is obtained by photographing an insulator on a power transmission line between signal towers, the photographed object is the insulator, and the possible defects of the image may include insulator burst.
In one embodiment, the target detection model corresponding to the image to be detected is a detection model that can be used to detect the defects that the image may contain. A plurality of detection models may be stored in the image processing device, each of which can be used to detect one or more defects. Each detection model is trained on a training image set and the defect supervision label of each training image in the set. To ensure the accuracy of the detection model, the training image set may include positive sample training images that contain defects and negative sample training images without defects; the defect supervision label of a positive sample image describes the defect it contains, such as insulator burst or vibration damper loss, and the defect label of a negative sample image indicates that no defect exists. In other embodiments, to save training cost, each detection model may also be trained using only positive sample training images, that is, its training image set contains only positive samples. It should be understood that each detection model detects different defects, so the training image set used to train each detection model is different.
Optionally, in step S401, the obtaining of the target detection model corresponding to the image to be detected includes: predicting the defect type included in the image to be detected; and searching a detection model capable of detecting the defect type from the plurality of detection models as a target detection model.
The defect type included in the image to be detected may be predicted as follows: determine the target shooting position of the image acquisition device when the image was shot, and obtain the defect type corresponding to the target shooting position according to a preset correspondence between shooting positions of the image acquisition device and defect types. For example, the preset correspondence may be: if the image acquisition device is at a shooting position from which an insulator of a signal tower can be photographed, the corresponding defect may be insulator burst; if it is at a position from which the power transmission channel between signal towers can be photographed, the corresponding defect may be a power transmission channel hazard source; if it is at a position from which a vibration damper on the power transmission line can be photographed, the corresponding defect may be vibration damper loss. Under this correspondence, if it is detected that the power transmission channel can be photographed from the target shooting position at which the image to be detected was shot, the defect type included in the image is predicted to be a power transmission channel hazard source.
Optionally, if one detection model capable of detecting the defect type is found from the plurality of detection models, the found detection model can be directly used as a target detection model; if at least two detection models capable of detecting the defect type are found from the plurality of detection models, one target detection model can be selected from the at least two detection models according to the operation resources required by the operation of the at least two detection models or the number of the defect types capable of being detected by the at least two detection models. For example, one of the at least two detection models with the least required operation resources is selected as the target detection model, so that the power consumption overhead of the image processing device can be saved; or, one of the at least two detection models with the least number of types of defects capable of being detected is selected as the target detection model, so that the accuracy of defect detection can be improved.
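As a sketch of the optional selection step above (illustrative only; the dictionary fields required_resources and defect_types are assumptions, not defined by the patent):

```python
def pick_target_model(candidates):
    """Among the models able to detect the predicted defect type, prefer the one
    needing the fewest computing resources (saves power on the device)."""
    return min(candidates, key=lambda m: m["required_resources"])

def pick_most_specific_model(candidates):
    """Alternative policy from the text: prefer the model that detects the
    fewest defect types, for higher detection accuracy."""
    return min(candidates, key=lambda m: len(m["defect_types"]))

models = [
    {"name": "model_X", "required_resources": 4, "defect_types": ["insulator burst", "pin loss"]},
    {"name": "model_Y", "required_resources": 2, "defect_types": ["insulator burst"]},
]
print(pick_target_model(models)["name"])          # model_Y
print(pick_most_specific_model(models)["name"])   # model_Y
```

Either policy yields a single target detection model when more than one candidate can detect the predicted defect type.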
And S402, detecting the target image processor bound with the target detection model in the mapping table.
In one embodiment, the plurality of detection models stored in the image processing device may run on a plurality of image processors; an image processor is a microprocessor dedicated to image- and graphics-related operations on devices such as personal computers, game consoles, and some mobile devices. The plurality of image processors in the embodiment of the present invention may all be configured in the image processing device; or they may be configured in other terminal devices independent of the image processing device; or some of them may be configured in the image processing device and the others in other terminal devices independent of it. In the following description of the embodiment of the present invention, the case where all the image processors are configured in the image processing device is taken as an example.
In a specific implementation, which detection models can run simultaneously on one image processor can be determined from the computing capability of that image processor and the capability each detection model requires; the detection models running on each image processor are regarded as bound to that image processor, and the binding relationship between each image processor and its detection models is stored in the mapping table, so that the image processor corresponding to any detection model can be looked up from the mapping table.
In the embodiment of the invention, when the image processing device is initialized, no detection model runs on any image processor. Only when the image processing device receives an image to be detected does it determine the detection model corresponding to that image, schedule an image processor to bind to the model so that the model runs on it, and record the binding relationship in the mapping table; all subsequent images that need this detection model are transmitted to that image processor. As images to be detected continue to arrive over time, the binding relationships between the plurality of image processors and the plurality of detection models accumulate in the mapping table of the image processing device.
Based on the above, after the image processing device obtains the target detection model corresponding to the image to be detected, it may first search from the mapping table whether the mapping table records the target image processor bound to the target detection model, where the target image processor may be any one of the plurality of image processors bound to the target detection model. If the target image processor bound with the target detection model is recorded in the mapping table, step S403 is executed, that is, the image processing device may directly transmit the received image to be detected to the target image processor; if the target image processor bound with the target detection model is not recorded in the mapping table, the target image processor bound with the target detection model can be selected from the plurality of image processors according to the model binding condition of each image processor, then the target detection model and the target image processor are bound, and the binding relationship is recorded in the mapping table.
The model binding condition of each image processor may include the number of detection models the image processor can bind and the number it has already bound. In a specific implementation, the number of detection models each image processor can still bind is determined from its model binding condition, and the image processor that can still bind the largest number of detection models is selected as the target image processor for the target detection model. Optionally, for any image processor, the difference between the number of detection models it can bind and the number it has already bound is the number of detection models it can still bind.
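A minimal sketch of this selection rule, assuming each image processor's binding condition is represented as a small record (the field names are illustrative):

```python
def select_target_processor(binding_conditions):
    """Pick the image processor with the largest remaining binding capacity,
    i.e. (models it can bind) - (models it has already bound)."""
    def remaining(p):
        return p["can_bind"] - p["bound"]
    candidates = [p for p in binding_conditions if remaining(p) > 0]
    if not candidates:
        return None                      # no processor can take another model right now
    return max(candidates, key=remaining)

processors = [
    {"addr": "gpu:0", "can_bind": 4, "bound": 3},
    {"addr": "gpu:1", "can_bind": 4, "bound": 1},
]
print(select_target_processor(processors)["addr"])   # gpu:1
```

Returning None when every processor is full is one possible way to signal that scheduling must wait; the patent does not prescribe this behaviour.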
And S403, if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to perform defect detection on the image to be detected by running the target detection model in the target image processor.
In one embodiment, if the image processing device finds the target image processor matching the target detection model in the mapping table, it transmits the image to be detected to the target image processor, which runs the target detection model to perform defect detection on the image and may output the defect label of the image after detection is finished. Optionally, the image processing device may further perform subsequent processing, such as classification, on the images using the defect label of each image.
In the embodiment of the invention, when the image processing device receives an image to be detected, it determines the target detection model required to detect the image, and then searches a mapping table for the target image processor bound to the target detection model, where the mapping table records the binding relationships between a plurality of image processors and a plurality of detection models. After the target image processor is found, the image to be detected is transmitted to it, and the target image processor performs defect detection on the image by running the target detection model. Because any detection model can be run on any image processor, the failure of a single image processor does not prevent any detection model from running, which avoids images going undetected or being misidentified and thus improves the accuracy of image processing.
Based on the image processing method, another image processing method is further provided in the embodiment of the present invention, and with reference to fig. 5, a flowchart of another image processing method provided in the embodiment of the present invention is shown. The flowchart shown in fig. 5 may be executed by an image processing apparatus, which may be the image processing apparatus in the image processing system shown in fig. 1, and may include any one or more of the following modules: the system comprises a data collection module, a matching module, an artificial intelligence management module, an image processor scheduling module, a model training module and the like. Specifically, the image processing method shown in fig. 5 may be executed by scheduling the respective modules by a processor of the image processing apparatus. The image processing method shown in fig. 5 may include the steps of:
and S501, receiving an image to be detected and acquiring a target shooting position corresponding to the image to be detected.
In an embodiment, some possible implementations included in step S501 may refer to descriptions of corresponding steps in the embodiment in fig. 4, and are not described herein again.
And S502, determining a target detection model corresponding to the image to be detected according to the target shooting position.
In one embodiment, before a subject is photographed to obtain an image to be detected, portions of the subject where defects may occur may be predetermined, and then the image capturing apparatus may be adjusted to capture an image including the portions where defects may occur. Therefore, the defect detection can be realized without shooting all parts of the shooting object, the number of images to be detected which need to be detected is reduced, the power consumption expense of the image processing equipment is reduced, and the defect detection efficiency can be improved. Based on this, as can be seen from the foregoing, each detection model is used to identify one or more defects, and the determining, by the image processing apparatus in step S502, a target detection model corresponding to an image to be detected according to a target shooting position may include: acquiring a target defect type corresponding to the target shooting position according to the corresponding relation between the plurality of shooting positions and the plurality of defect types; and selecting a detection model matched with the target defect type from a plurality of detection models as a target detection model.
For example, assume the image acquisition device is an image sensor on an unmanned aerial vehicle and the photographed object consists of the signal towers of a power transmission line and the power transmission channel between them. Referring to fig. 6a, a schematic diagram of determining a target detection model according to an embodiment of the present invention, fig. 6a shows the image acquisition device, i.e. the image sensor on the unmanned aerial vehicle, and the photographed object, i.e. several signal towers and the power transmission channel between them. Assuming that defects easily occur at the cross arm 601 of a signal tower, the vibration damper 602 on the power transmission line, and the insulator 603, the correspondence between the three shooting positions from which the image acquisition device can photograph 601, 602, and 603 and the defect types may be preset. In a specific implementation: 601 can be photographed from shooting position A, its possible defect is cross arm skew, and the detection model for cross arm skew is model A; 602 can be photographed from shooting position B, its possible defect is vibration damper loss, and the detection model for vibration damper loss is model B; 603 can be photographed from shooting position C, its possible defect is insulator burst, and the detection model for insulator burst is model C. In fig. 6a, if the target shooting position is A, it can be found from the correspondence between shooting positions and defect types that the defect of the image to be detected is cross arm skew, and detection model A, capable of detecting cross arm skew, is selected from the plurality of detection models as the target detection model.
In other embodiments, on the basis of the above method, the image processing apparatus may further directly establish a correspondence between the shooting position and the detection model according to the correspondence between the shooting position and the defect type and the correspondence between the defect type and the detection model, so that time for determining the target detection model according to the target shooting position may be saved to a certain extent. In specific implementation, determining the target detection model corresponding to the image to be detected according to the target shooting position includes: traversing corresponding relations between a plurality of preset detection models and shooting positions; and determining a detection model corresponding to the target shooting position as a target detection model.
Following the description of fig. 6a, the correspondence between shooting positions and detection models established by the image processing device from the position-to-defect-type and defect-type-to-model correspondences may be: shooting position A corresponds to model A, shooting position B corresponds to model B, and shooting position C corresponds to model C. After the target shooting position is obtained, if it is position A, the target detection model corresponding to the image to be detected can be quickly determined to be model A from this correspondence.
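A direct position-to-model table as described above might look like the following sketch (the positions and model names are the illustrative ones from fig. 6a, not values fixed by the patent):

```python
# Direct correspondence between shooting positions and detection models (illustrative).
POSITION_TO_MODEL = {
    "A": "model_A",   # cross arm skew
    "B": "model_B",   # vibration damper loss
    "C": "model_C",   # insulator burst
}

def target_model_for(shooting_position: str):
    """Traverse the preset correspondence and return the matching detection model."""
    return POSITION_TO_MODEL.get(shooting_position)   # None if no preset position matches

print(target_model_for("A"))   # model_A
```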
In the above two embodiments, even if the image capturing device is preset to shoot at position A, B, or C, its actual position may deviate from the preset position due to external factors (such as strong wind) during shooting. In that case no target detection model corresponding to the target shooting position may be found, and defect detection may even fail. The above approach is therefore better suited to scenes in which the shooting scene changes little.
In another embodiment, to ensure that defect detection can be performed in more shooting scenes, the embodiment of the present invention further provides, on the basis of the above method, another way of determining the target detection model corresponding to the image to be detected from the target shooting position. In a specific implementation: correspondences between a plurality of shooting position ranges and detection models are stored in advance; the target shooting position is matched against the preset shooting position ranges to obtain the matching range; and the detection model corresponding to the matching shooting position range is determined, from the correspondence between shooting position ranges and detection models, as the target detection model for the image to be detected. A shooting position range can be set according to factors such as the performance of the image acquisition device, and it may be a position interval or a position area. In brief, for any part of the photographed object where a defect may occur, the range of positions from which the image acquisition device can photograph that part is set as the shooting position range corresponding to the defect, and the correspondence between the detection model needed to detect that defect and the shooting position range is then established.
For example, referring to fig. 6b, another schematic diagram of determining a target detection model according to the embodiment of the present invention, under the same assumptions as fig. 6a, the photographed object consists of the signal towers of a power transmission line and the power transmission channel between them, and its possibly defective parts may include the cross arm 601 of a signal tower, the vibration damper 602 on the power transmission line, the insulator 603, and so on. Suppose it is determined through prior measurement and experiments that an image including the cross arm 601 can be acquired when the image acquisition device is anywhere between position D and position E, and similarly that an image including the vibration damper on the power transmission line can be acquired when it is anywhere between position F and position G. Then the shooting position range (D, E) can be set to correspond to the detection model for cross arm skew (model A, as above) and the range (F, G) to the detection model for vibration damper loss (model B). In this case the shooting position range can be regarded as a position interval.
For another example, assuming that the image capturing device can capture an image including the cross arm from any position within the semicircular area containing positions D and E shown as 604 in fig. 6c, the semicircular area 604 can be set as a shooting position range whose corresponding detection model is model A. In this case the shooting position range can be regarded as a position area.
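The two kinds of shooting position range (a position interval along the flight path and a position area around a part) could be matched as in the sketch below; the coordinates, radii, and the half-plane test for the semicircle are illustrative assumptions, not values from the patent.

```python
import math

def in_interval(pos: float, lo: float, hi: float) -> bool:
    """Shooting position range as a 1-D interval, e.g. between positions D and E."""
    return lo <= pos <= hi

def in_semicircle(pos, center, radius: float) -> bool:
    """Shooting position range as a semicircular area like 604 in fig. 6c."""
    dx, dy = pos[0] - center[0], pos[1] - center[1]
    return math.hypot(dx, dy) <= radius and dy >= 0   # inside the circle, on one side

RANGE_TO_MODEL = [
    (lambda p: in_interval(p[0], 10.0, 20.0), "model_A"),        # (D, E) -> cross arm skew
    (lambda p: in_interval(p[0], 30.0, 40.0), "model_B"),        # (F, G) -> vibration damper loss
    (lambda p: in_semicircle(p, (15.0, 0.0), 8.0), "model_A"),   # area 604 -> cross arm skew
]

def model_for_shooting_position(pos):
    for matches, model in RANGE_TO_MODEL:
        if matches(pos):
            return model
    return None   # no range matched; the target detection model cannot be determined

print(model_for_shooting_position((12.0, 0.0)))   # model_A
```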
And step S503, detecting the target image processor bound with the target detection model in the mapping table.
In one embodiment, the form of recording the binding relationship between each detection model and the image processor in the mapping table may be: { model flag mTag, image processor address addr, last scheduling time }, where the model flag is used to flag the model, and may be the name of the model or the identification code of the model, and the last scheduling time refers to the latest scheduled time of the model.
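The record form above could be represented as follows (a sketch; the field types and any name beyond mTag and addr are assumptions):

```python
from dataclasses import dataclass

@dataclass
class MappingEntry:
    """One mapping table record: {model flag mTag, image processor address addr,
    last scheduling time}."""
    mTag: str               # model name or identification code
    addr: str               # address of the bound image processor
    last_scheduled: float   # timestamp of the model's most recent scheduling

entry = MappingEntry(mTag="model_A", addr="gpu:1", last_scheduled=1_700_000_000.0)
```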
In one embodiment, after determining the target detection model corresponding to the image to be detected, the image processing device first searches the mapping table for the target image processor. Optionally, when detecting the target image processor bound to the target detection model in the mapping table, the image processing device follows an artificial intelligence (AI) scheduling protocol that includes the model flag of the target detection model to be scheduled, the current calling time, and the scheduling timeout. The scheduling timeout refers to how long the target image processor has been searched for in the mapping table without being found: for example, if the scheduling timeout is 5 minutes, the search in the mapping table has lasted 5 minutes and the target image processor has not been found. In other embodiments, the image processing device may instead preset a scheduling time threshold, and the scheduling timeout is the difference between the time already spent searching the mapping table and that threshold; for example, if the scheduling time threshold is 5 minutes and 8 minutes have been spent searching the mapping table, the scheduling timeout is 3 minutes. It should be understood that these are only two possible definitions of the scheduling timeout listed in the embodiment of the present invention; in a specific application, the definition may be set according to the actual situation.
In one embodiment, the AI scheduling protocol may be expressed as: { scheduling timeout, model flag mTag, time: current calling time }. Optionally, the image processing device may preset a cutoff duration for giving up scheduling the target image processor from the mapping table; if it detects that the scheduling timeout in the AI scheduling protocol is greater than or equal to the cutoff duration, it stops searching the mapping table, in which case the target image processor is regarded as not detected in the mapping table.
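One way to read this timeout mechanism is as a bounded lookup that retries until the cutoff duration is reached, as in the sketch below; the polling loop and the parameter values are assumptions, not taken from the patent.

```python
import time

def find_bound_processor(mapping_table: dict, m_tag: str,
                         cutoff_seconds: float = 300.0, poll: float = 1.0):
    """Search the mapping table for the processor bound to m_tag; give up once
    the scheduling timeout reaches the preset cutoff duration."""
    started = time.monotonic()
    while True:
        entry = mapping_table.get(m_tag)
        if entry is not None:
            return entry                                 # target image processor found
        if time.monotonic() - started >= cutoff_seconds:
            return None                                  # treated as "not detected"
        time.sleep(poll)                                 # a binding may appear shortly
```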
And step S504, if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to carry out defect detection on the image to be detected by running the target detection model in the target image processor.
And step S505, if the target image processor is not detected in the mapping table, traversing the model binding condition of each image processor.
And S506, selecting a target image processor from the plurality of image processors according to the model binding condition of each image processor, and binding the target image processor with the target detection model.
And step S507, recording the binding relationship between the target image processor and the target detection model into a mapping table.
In one embodiment, the image processing device may rely on an image processor deployment protocol when performing step S506. The protocol may include one or more of: how many detection models each image processor has bound, the timeout of each detection model bound to each image processor, and the image processor list. The timeout of a first detection model bound to a first image processor (the first detection model may be any detection model) recorded in the image processor deployment protocol refers to the difference between the current time and the time when that detection model, bound on the first image processor, was last called for defect detection.
Alternatively, the image processor deployment protocol may be represented as follows: [ mNum: how many detection models are bound to each image processor, List: image processor list, time: timeout time of bound detection models on respective image processors ].
In practice, after an image processor and a detection model are bound, the image processing device may not receive, for a long time, any image that needs that detection model. To prevent the model from occupying the image processor indefinitely and leaving fewer image processors available for other detection models to bind, a duration threshold may be set; if, according to the image processor deployment protocol, the timeout of a detection model bound on any image processor is detected to be greater than or equal to the duration threshold, the binding relationship between that image processor and that detection model is released.
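A sketch of this timeout-based unbinding, reusing the MappingEntry record sketched earlier (the duration threshold value is illustrative):

```python
import time

def release_stale_bindings(mapping_table: dict, duration_threshold: float = 600.0) -> None:
    """Unbind every detection model whose timeout (time since it was last called
    for defect detection) has reached the duration threshold."""
    now = time.time()
    for m_tag in list(mapping_table):
        entry = mapping_table[m_tag]
        if now - entry.last_scheduled >= duration_threshold:
            del mapping_table[m_tag]   # free the image processor for other models to bind
```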
In one embodiment, in the process of selecting the target image processor from the plurality of image processors according to the model binding condition of each image processor, if a fault is detected on an image processor, a fault flag may be added to that image processor, so that image processors carrying the fault flag are not traversed when binding relationships between image processors and detection models are subsequently established; this improves the efficiency of establishing binding relationships.
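A minimal sketch of skipping fault-flagged image processors when establishing new bindings (the fault flag is modelled here simply as membership in a set, which is an assumption of this sketch):

```python
def bindable_gpus(gpu_list, faulty_gpus):
    """Return only the image processors that do not carry a fault flag."""
    return [gpu for gpu in gpu_list if gpu not in faulty_gpus]

# "gpu-1" was flagged as faulty, so it is not traversed when binding a detection model.
print(bindable_gpus(["gpu-0", "gpu-1", "gpu-2"], faulty_gpus={"gpu-1"}))   # ['gpu-0', 'gpu-2']
```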
In an embodiment, for other possible implementations of steps S504 to S507, reference may be made to the description of the relevant steps in the embodiment of fig. 4; details are not repeated here.
Step S508: if it is detected that no new image to be detected corresponding to the target detection model has been received within the time length threshold, release the binding relationship between the target detection model and the target image processor.
In an embodiment, not receiving a new image to be detected corresponding to the target detection model within the time length threshold may mean that the timeout time of the target detection model bound to the target image processor is greater than or equal to the time length threshold; at this time, in order to give other detection models more choices when binding an image processor, the binding relationship between the target image processor and the target detection model may be released.
In the embodiment of the invention, when the image processing device receives the image to be detected, the target detection model required for detecting the image to be detected is determined according to the target shooting position of the image acquisition device when the image to be detected was acquired; the target image processor bound to the target detection model is then searched for in the mapping table, which records the binding relationships between the plurality of image processors and the plurality of detection models. If the target image processor is found, the image to be detected is transmitted to the target image processor, and the target image processor runs the target detection model to perform defect detection on the image to be detected. If the target image processor is not found, a target image processor is selected according to the model binding conditions of the plurality of image processors, the binding relationship between the target image processor and the target detection model is established, and the binding relationship is recorded in the mapping table, so that the mapping table is updated dynamically. Meanwhile, because the image to be detected is transmitted to the target image processor for defect detection and each detection model can run on one image processor, the failure of any single image processor does not affect the running of the other detection models; this avoids situations in which an image to be detected cannot be recognized or is recognized incorrectly, and improves the accuracy of image processing.
Based on the description of the embodiments of the image processing method, an embodiment of the invention also provides an application scenario of the image processing method. Taking application to unmanned aerial vehicle inspection as an example, an application scenario diagram may be as shown in fig. 6a. The application scenario includes the image acquisition device and the image processing device; it is assumed that the image acquisition device is an unmanned aerial vehicle, and may specifically be an image sensor on the unmanned aerial vehicle. The shooting object comprises signal towers of a power transmission line and the power transmission channels between the signal towers. It is assumed that the inspection points included in the shooting object shown in fig. 6a (i.e. the aforementioned possible defect locations in the shooting object) may be a cross arm 601 on a signal tower of the power transmission line, a vibration damper 602 on the power transmission line, and an insulator 603. It is assumed that the correspondence between shooting positions and defect types and the correspondence between defect types and detection models are specified in advance. Specifically, the defect type corresponding to shooting position A is cross arm deflection, the defect type corresponding to shooting position B is vibration damper loss, and the defect type corresponding to shooting position C is insulator burst; the detection model corresponding to cross arm deflection is model A, the detection model corresponding to vibration damper loss is model B, and the detection model corresponding to insulator burst is model C.
Suppose the image acquisition device captures, at shooting position A, an image to be detected that includes a cross arm, adds the target shooting position to the image to be detected, and then transmits the image to the image processing device. The image processing device obtains the target shooting position corresponding to the image to be detected, matches the target shooting position against the pre-stored correspondences between the plurality of shooting positions and defect types, determines that the target shooting position matches shooting position A, and finds that the defect type corresponding to shooting position A is cross arm deflection. Further, the image processing device determines, according to the correspondence between defect types and detection models, that the detection model corresponding to cross arm deflection is model A, and determines model A as the target detection model corresponding to the target shooting position, that is, the target detection model corresponding to the image to be detected.
Then, the image processing device searches the mapping table for the target image processor bound to the target detection model. If the target image processor is found, the image to be detected is transmitted to the target image processor, so that the target image processor runs the target detection model to perform defect detection on the image to be detected. If the target image processor is not found, one image processor is selected from the plurality of image processors as the target image processor, the target image processor is bound with the target detection model, and the image to be detected is transmitted to the target image processor, so that the target image processor runs the target detection model to perform defect detection on the image to be detected. Finally, after the target image processor completes defect detection on the image to be detected, the defect label of the image to be detected is output in the user interface of the image processing device.
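The two correspondences used in this scenario can be illustrated with the Python sketch below; the table contents follow the example above, while the function name and dictionary layout are assumptions made for illustration only.

```python
# Correspondences assumed in the inspection scenario above.
POSITION_TO_DEFECT = {
    "A": "cross arm deflection",
    "B": "vibration damper loss",
    "C": "insulator burst",
}
DEFECT_TO_MODEL = {
    "cross arm deflection": "model A",
    "vibration damper loss": "model B",
    "insulator burst": "model C",
}

def target_model_for(shooting_position: str) -> str:
    """Map the target shooting position carried by the image to its target detection model."""
    defect_type = POSITION_TO_DEFECT[shooting_position]
    return DEFECT_TO_MODEL[defect_type]

# An image captured at shooting position A is routed to model A for cross arm defect detection.
print(target_model_for("A"))   # model A
```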
In the embodiment of the invention, when the image processing device receives the image to be detected, the target detection model required for detecting the image to be detected is determined according to the target shooting position of the image acquisition device when the image to be detected was acquired; the target image processor bound to the target detection model is then searched for in the mapping table, which records the binding relationships between the plurality of image processors and the plurality of detection models. If the target image processor is found, the image to be detected is transmitted to the target image processor, and the target image processor runs the target detection model to perform defect detection on the image to be detected. If the target image processor is not found, a target image processor is selected according to the model binding conditions of the plurality of image processors, the binding relationship between the target image processor and the target detection model is established, and the binding relationship is recorded in the mapping table, so that the mapping table is updated dynamically. Meanwhile, because the image to be detected is transmitted to the target image processor for defect detection and each detection model can run on one image processor, the failure of any single image processor does not affect the running of the other detection models; this avoids situations in which an image to be detected cannot be recognized or is recognized incorrectly, and improves the accuracy of image processing.
Based on the above embodiment of the image processing method, an embodiment of the present invention provides an image processing apparatus. Referring to fig. 7, which is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention, the image processing apparatus shown in fig. 7 may operate the following units:
a receiving unit 701 configured to receive an image to be detected;
an obtaining unit 702, configured to obtain a target detection model corresponding to the image to be detected;
a processing unit 703, configured to detect, in a mapping table, a target image processor bound to the target detection model, where the mapping table records the binding relationships between a plurality of image processors and a plurality of detection models, each image processor is configured to run one or more detection models, and each detection model is obtained by training based on a corresponding training image and a defect supervision label corresponding to the training image;
and the transmission unit 704 is used for transmitting the image to be detected to the target image processor if the target image processor is obtained by detection, so as to perform defect detection on the image to be detected by running the target detection model in the target image processor.
In one embodiment, the processing unit 703 is further configured to: if the target image processor is not detected in the mapping table, traverse the model binding condition of each image processor, where the model binding condition includes the threshold on the number of detection models that can run on each image processor and the number of detection models already bound; select a target image processor from the plurality of image processors according to the model binding condition of each image processor, and bind the target image processor with the target detection model; and record the binding relationship between the target image processor and the target detection model in the mapping table.
In one embodiment, the processing unit 703 is further configured to: and if detecting that a new image to be detected corresponding to the target detection model is not received within the time length threshold, removing the binding relationship between the target detection model and the target image processor.
In one embodiment, the obtaining unit 702, when obtaining the target detection model corresponding to the image to be detected, performs the following operations: acquiring a target shooting position corresponding to the image to be detected, wherein the target shooting position is used for indicating the position of image acquisition equipment when the image to be detected is acquired; and determining a target detection model corresponding to the image to be detected according to the target shooting position.
In one embodiment, each of the plurality of detection models is used to identify one or more types of defects, and the obtaining unit 702 performs the following operations when determining the target detection model corresponding to the image to be detected according to the target shooting position: acquiring a target defect type corresponding to the target shooting position according to the corresponding relation between the plurality of shooting positions and the plurality of defect types; and selecting a detection model matched with the target defect type from a plurality of detection models as a target detection model.
In one embodiment, the obtaining unit 702, when determining the target detection model corresponding to the image to be detected according to the target shooting position, performs the following operations: matching the target shooting position with a preset shooting position range to obtain a matched shooting position range; and determining the detection model corresponding to the matched shooting position range as the target detection model corresponding to the image to be detected according to the corresponding relation between the shooting position ranges and the detection models.
In one embodiment, the obtaining unit 702, when determining the target detection model corresponding to the image to be detected according to the target shooting position, performs the following operations: traversing corresponding relations between a plurality of preset detection models and shooting positions; and determining a detection model corresponding to the target shooting position as a target detection model.
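The three ways of determining the target detection model from the target shooting position described for the obtaining unit 702 can be sketched in Python as follows; the range boundaries, dictionary layouts and example values are illustrative assumptions.

```python
from typing import Dict, Optional, Tuple

def model_by_defect_type(pos: str, pos_to_defect: Dict[str, str], defect_to_model: Dict[str, str]) -> str:
    """Variant 1: shooting position -> defect type -> detection model."""
    return defect_to_model[pos_to_defect[pos]]

def model_by_position_range(pos: float, range_to_model: Dict[Tuple[float, float], str]) -> Optional[str]:
    """Variant 2: match the position against preset shooting position ranges."""
    for (low, high), model in range_to_model.items():
        if low <= pos <= high:
            return model
    return None

def model_by_direct_mapping(pos: str, pos_to_model: Dict[str, str]) -> Optional[str]:
    """Variant 3: traverse a preset correspondence between shooting positions and detection models."""
    return pos_to_model.get(pos)

# Variant 2 with a position expressed as a distance (in metres) along the transmission line.
print(model_by_position_range(12.5, {(0.0, 10.0): "model A", (10.0, 20.0): "model B"}))   # model B
```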
According to an embodiment of the present invention, the steps involved in the image processing methods shown in fig. 4 and 5 may be performed by units in the image processing apparatus shown in fig. 7. For example, step S401 illustrated in fig. 4 may be performed by the receiving unit 701 and the acquiring unit 702 in the image processing apparatus illustrated in fig. 7, step S402 may be performed by the processing unit 703 in the image processing apparatus illustrated in fig. 7, and step S403 may be performed by the transmitting unit 704 in the image processing apparatus illustrated in fig. 7; as another example, step S501 shown in fig. 5 may be performed by the receiving unit 701 and the acquiring unit 702 in the image processing apparatus shown in fig. 7, step S502 may be performed by the processing unit 703 in the image processing apparatus shown in fig. 7, step S503 may be performed by the processing unit 703 in the image processing apparatus shown in fig. 7, step S504 may be performed by the transmitting unit 704 in the image processing apparatus shown in fig. 7, and steps S505 to S508 may be performed by the processing unit 703 in the image processing apparatus shown in fig. 7.
According to another embodiment of the present invention, the units in the image processing apparatus shown in fig. 7 may be respectively or entirely combined into one or several other units to form the image processing apparatus, or some unit(s) thereof may be further split into multiple units with smaller functions to form the image processing apparatus, which may achieve the same operation without affecting the achievement of the technical effects of the embodiments of the present invention. The units are divided based on logic functions, and in practical application, the functions of one unit can be realized by a plurality of units, or the functions of a plurality of units can be realized by one unit. In other embodiments of the present invention, the image processing apparatus may also include other units, and in practical applications, these functions may also be implemented by being assisted by other units, and may be implemented by cooperation of a plurality of units.
According to another embodiment of the present invention, the image processing apparatus shown in fig. 7 may be constructed by running a computer program (including program code) capable of executing the steps involved in the methods shown in fig. 4 or fig. 5 on a general-purpose computing device, such as a computer, that includes processing elements such as a Central Processing Unit (CPU), a random access storage medium (RAM), a read-only storage medium (ROM), and storage elements, so as to implement the image processing method of the embodiments of the present invention. The computer program may be recorded on, for example, a computer-readable storage medium, and may be loaded into and run in the above computing device via the computer-readable storage medium.
In the embodiment of the invention, when the image processing device receives the image to be detected, the target detection model required for detecting the image to be detected is determined; the target image processor bound to the target detection model is then searched for in the mapping table, which records the binding relationships between the plurality of image processors and the plurality of detection models. After the target image processor is found, the image to be detected is transmitted to the target image processor, and the target image processor runs the target detection model to perform defect detection on the image to be detected. Because each detection model can run on one image processor, the failure of any single image processor does not affect the running of the other detection models; this avoids situations in which an image to be detected cannot be recognized or is recognized incorrectly, and improves the accuracy of image processing.
Based on the embodiments of the image processing method and the image processing apparatus, the embodiment of the invention also provides an image processing device. Fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention. The terminal shown in fig. 8 may include at least a processor 801, an input interface 802, an output interface 803, and a computer storage medium 804. The processor 801, the input interface 802, the output interface 803, and the computer storage medium 804 may be connected by a bus or other means.
The computer storage medium 804 may be stored in a memory of the image processing device, and the computer storage medium 804 is used for storing a computer program, the computer program comprising program instructions; the processor 801 is used for executing the program instructions stored in the computer storage medium 804. The processor 801 (or CPU, Central Processing Unit) is the computing core and control core of the terminal and is adapted to implement one or more instructions, and in particular to load and execute the one or more instructions so as to implement the corresponding method flow or corresponding function. In one embodiment, the processor 801 according to the embodiment of the present invention may be configured to perform: receiving an image to be detected, and acquiring a target detection model corresponding to the image to be detected; detecting a target image processor bound with the target detection model in a mapping table, wherein the mapping table records the binding relationships between a plurality of image processors and a plurality of detection models, each image processor is used for running one or more detection models, and each detection model is obtained by training based on a corresponding training image and defect supervision labels corresponding to the training image; and if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to carry out defect detection on the image to be detected by running the target detection model in the target image processor.
An embodiment of the present invention further provides a computer storage medium (Memory), which is a memory device in the node device and is used to store programs and data. It is understood that the computer storage medium herein may include a built-in storage medium in the terminal, and may also include an extended storage medium supported by the terminal. The computer storage medium provides a storage space that stores the operating system of the terminal. One or more instructions, which may be one or more computer programs (including program code), are also stored in the storage space and are suitable for being loaded and executed by the processor 801. The computer storage medium may be a high-speed RAM memory, or may be a non-volatile memory, such as at least one disk memory; optionally, it may also be at least one computer storage medium located remotely from the aforementioned processor.
In one embodiment, one or more instructions stored in a computer storage medium may be loaded and executed by the processor 801 to implement the corresponding steps of the method in the embodiment of the image processing method described above with respect to fig. 4 and 5, and in particular, one or more instructions stored in the computer storage medium may be loaded and executed by the processor 801 to implement the following steps: receiving an image to be detected, and acquiring a target detection model corresponding to the image to be detected; detecting a target image processor bound with the target detection model in a mapping table, wherein the mapping table records the binding relationship between a plurality of image processors and a plurality of detection models, each image processor is used for operating one or more detection models, and each detection model is obtained based on a corresponding training image and defect supervision labels corresponding to the training image; and if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to carry out defect detection on the image to be detected by running the target detection model in the target image processor.
In one embodiment, one or more instructions in the computer storage medium are loaded by the processor 801 to further perform the following steps: if the target image processor is not detected in the mapping table, traversing the model binding condition of each image processor, wherein the model binding condition comprises the threshold on the number of detection models that can run on each image processor and the number of detection models already bound; selecting a target image processor from the plurality of image processors according to the model binding condition of each image processor, and binding the target image processor with the target detection model; and recording the binding relationship between the target image processor and the target detection model into the mapping table.
In one embodiment, the loading of one or more instructions in a computer storage medium by processor 801 further performs the steps of: and if detecting that a new image to be detected corresponding to the target detection model is not received within the time length threshold, removing the binding relationship between the target detection model and the target image processor.
In one embodiment, the processor 801, when acquiring a target detection model corresponding to the image to be detected, performs the following operations: acquiring a target shooting position corresponding to the image to be detected, wherein the target shooting position is used for indicating the position of image acquisition equipment when the image to be detected is acquired; and determining a target detection model corresponding to the image to be detected according to the target shooting position.
In one embodiment, each of the plurality of detection models is used for identifying one or more types of defects, and the processor 801 performs the following operations when determining the target detection model corresponding to the image to be detected according to the target shooting position: acquiring a target defect type corresponding to the target shooting position according to the corresponding relation between the plurality of shooting positions and the plurality of defect types; and selecting a detection model matched with the target defect type from a plurality of detection models as a target detection model.
In one embodiment, the processor 801, when determining the target detection model corresponding to the image to be detected according to the target shooting position, performs the following operations: matching the target shooting position with a preset shooting position range to obtain a matched shooting position range; and determining the detection model corresponding to the matched shooting position range as the target detection model corresponding to the image to be detected according to the corresponding relation between the shooting position ranges and the detection models.
In one embodiment, the processor 801, when determining the target detection model corresponding to the image to be detected according to the target shooting position, performs the following operations: traversing corresponding relations between a plurality of preset detection models and shooting positions; and determining a detection model corresponding to the target shooting position as a target detection model.
In the embodiment of the invention, when the image processing device receives the image to be detected, the target detection model required for detecting the image to be detected is determined; the target image processor bound to the target detection model is then searched for in the mapping table, which records the binding relationships between the plurality of image processors and the plurality of detection models. After the target image processor is found, the image to be detected is transmitted to the target image processor, and the target image processor runs the target detection model to perform defect detection on the image to be detected. Because each detection model can run on one image processor, the failure of any single image processor does not affect the running of the other detection models; this avoids situations in which an image to be detected cannot be recognized or is recognized incorrectly, and improves the accuracy of image processing.
The above disclosure is intended to be illustrative of only some embodiments of the invention, and is not intended to limit the scope of the invention.

Claims (10)

1. An image processing method, comprising:
receiving an image to be detected, and acquiring a target detection model corresponding to the image to be detected;
detecting a target image processor bound with the target detection model in a mapping table, wherein the mapping table records the binding relationship between a plurality of image processors and a plurality of detection models, each image processor is used for operating one or more detection models, and each detection model is obtained based on a corresponding training image and defect supervision labels corresponding to the training image;
and if the target image processor is obtained through detection, transmitting the image to be detected to the target image processor so as to carry out defect detection on the image to be detected by running the target detection model in the target image processor.
2. The method of claim 1, wherein the method further comprises:
if the target image processor is not detected in the mapping table, traversing the model binding condition of each image processor, wherein the model binding condition comprises a quantity threshold of detection models that can run on each image processor and a quantity of detection models already bound;
selecting a target image processor from the plurality of image processors according to the model binding condition of each image processor, and binding the target image processor with the target detection model;
and recording the binding relation between the target image processor and the target detection model into the mapping table.
3. The method of claim 2, wherein the method further comprises:
and if detecting that a new image to be detected corresponding to the target detection model is not received within the time length threshold, removing the binding relationship between the target detection model and the target image processor.
4. The method of claim 1, wherein the obtaining of the target detection model corresponding to the image to be detected comprises:
acquiring a target shooting position corresponding to the image to be detected, wherein the target shooting position is used for indicating the position of image acquisition equipment when the image to be detected is acquired;
and determining a target detection model corresponding to the image to be detected according to the target shooting position.
5. The method as claimed in claim 4, wherein each of the plurality of detection models is used for identifying one or more types of defects, and the determining the target detection model corresponding to the image to be detected according to the target shooting position comprises:
acquiring a target defect type corresponding to the target shooting position according to the corresponding relation between the plurality of shooting positions and the plurality of defect types;
and selecting a detection model matched with the target defect type from a plurality of detection models as a target detection model.
6. The method as claimed in claim 4, wherein said determining the target detection model corresponding to the image to be detected according to the target shooting position comprises:
matching the target shooting position with a preset shooting position range to obtain a matched shooting position range;
and determining the detection model corresponding to the matched shooting position range as the target detection model corresponding to the image to be detected according to the corresponding relation between the shooting position ranges and the detection models.
7. The method as claimed in claim 4, wherein said determining the target detection model corresponding to the image to be detected according to the target shooting position comprises:
traversing corresponding relations between a plurality of preset detection models and shooting positions;
and determining a detection model corresponding to the target shooting position as a target detection model.
8. An image processing apparatus characterized by comprising:
the receiving unit is used for receiving an image to be detected;
the acquisition unit is used for acquiring a target detection model corresponding to the image to be detected;
the processing unit is used for detecting a target image processor bound with the target detection model in a mapping table, the mapping table records the binding relationship between a plurality of image processors and a plurality of detection models, each image processor is used for operating one or more detection models, and each detection model is obtained based on a corresponding training image and defect supervision labels corresponding to the training image;
and the transmission unit is used for transmitting the image to be detected to the target image processor if the target image processor is obtained through detection so as to carry out defect detection on the image to be detected by operating the target detection model in the target image processor.
9. An image processing apparatus characterized by comprising:
a processor adapted to implement one or more instructions; and,
a computer storage medium having stored thereon one or more instructions adapted to be loaded by the processor and to perform the image processing method according to any of claims 1-7.
10. A computer storage medium having computer program instructions stored therein, which when executed by a processor, are adapted to perform the image processing method of any of claims 1-7.
CN202010076749.XA 2020-01-22 2020-01-22 Image processing method, image processing apparatus, storage medium, and image processing device Active CN111242943B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010076749.XA CN111242943B (en) 2020-01-22 2020-01-22 Image processing method, image processing apparatus, storage medium, and image processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010076749.XA CN111242943B (en) 2020-01-22 2020-01-22 Image processing method, image processing apparatus, storage medium, and image processing device

Publications (2)

Publication Number Publication Date
CN111242943A true CN111242943A (en) 2020-06-05
CN111242943B CN111242943B (en) 2022-10-28

Family

ID=70874942

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010076749.XA Active CN111242943B (en) 2020-01-22 2020-01-22 Image processing method, image processing apparatus, storage medium, and image processing device

Country Status (1)

Country Link
CN (1) CN111242943B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112654999A (en) * 2020-07-21 2021-04-13 华为技术有限公司 Method and device for determining labeling information
CN112686852A (en) * 2020-12-25 2021-04-20 浙江伟星实业发展股份有限公司 Product defect identification method, device, equipment and storage medium
CN113965241A (en) * 2020-07-01 2022-01-21 深圳中科保泰科技有限公司 Method and related device for endowing artificial intelligence to unmanned aerial vehicle patrolling, mining and patrolling
WO2022036953A1 (en) * 2020-08-19 2022-02-24 上海商汤智能科技有限公司 Defect detection method and related apparatus, device, storage medium, and computer program product
CN116091982A (en) * 2023-04-03 2023-05-09 浪潮电子信息产业股份有限公司 Image detection method, device, electronic equipment and computer readable storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040194097A1 (en) * 2003-03-28 2004-09-30 Emulex Corporation Hardware assisted firmware task scheduling and management
CN106843815A (en) * 2017-01-18 2017-06-13 电子科技大学 The optimization method that on-chip multi-processor system multithreading runs simultaneously
CN108563949A (en) * 2018-04-16 2018-09-21 电子科技大学 For the duty mapping method of multi-core processor information security
CN108804211A (en) * 2018-04-27 2018-11-13 西安华为技术有限公司 Thread scheduling method, device, electronic equipment and storage medium
CN109117260A (en) * 2018-08-30 2019-01-01 百度在线网络技术(北京)有限公司 A kind of method for scheduling task, device, equipment and medium
CN109471711A (en) * 2018-11-12 2019-03-15 中国银行股份有限公司 A kind of method and device of task processing
CN110378900A (en) * 2019-08-01 2019-10-25 北京迈格威科技有限公司 The detection method of product defects, apparatus and system

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113965241A (en) * 2020-07-01 2022-01-21 深圳中科保泰科技有限公司 Method and related device for endowing artificial intelligence to unmanned aerial vehicle patrolling, mining and patrolling
CN112654999A (en) * 2020-07-21 2021-04-13 华为技术有限公司 Method and device for determining labeling information
CN112654999B (en) * 2020-07-21 2022-01-28 华为技术有限公司 Method and device for determining labeling information
WO2022036953A1 (en) * 2020-08-19 2022-02-24 上海商汤智能科技有限公司 Defect detection method and related apparatus, device, storage medium, and computer program product
JP2022548438A (en) * 2020-08-19 2022-11-21 上▲海▼商▲湯▼智能科技有限公司 Defect detection method and related apparatus, equipment, storage medium, and computer program product
CN112686852A (en) * 2020-12-25 2021-04-20 浙江伟星实业发展股份有限公司 Product defect identification method, device, equipment and storage medium
CN116091982A (en) * 2023-04-03 2023-05-09 浪潮电子信息产业股份有限公司 Image detection method, device, electronic equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN111242943B (en) 2022-10-28

Similar Documents

Publication Publication Date Title
CN111242943B (en) Image processing method, image processing apparatus, storage medium, and image processing device
CN108399782A (en) Method, apparatus, system, equipment and the storage medium of outdoor reversed guide-car
CN105022694A (en) Test case generation method and system for mobile terminal test
CN112017323A (en) Patrol alarm method and device, readable storage medium and terminal equipment
CN106980572B (en) Online debugging method and system for distributed system
CN112862821A (en) Water leakage detection method and device based on image processing, computing equipment and medium
CN112751910A (en) Information collection method and device
CN108629310B (en) Engineering management supervision method and device
CN109597389B (en) Test system of embedded control system
CN111832983A (en) Detection method, device and system for asset entry information
CN115904719B (en) Data acquisition method and device, electronic equipment and storage medium
CN110751055B (en) Intelligent manufacturing system
CN109086185B (en) Fault detection method, device and equipment of storage cluster and storage medium
CN117214598A (en) Cable monitoring system and method based on inspection image
CN114034972B (en) Intelligent cable fault determining method and device based on image data
CN114328063A (en) Simulation method and device based on mapping mechanism and electronic equipment
CN114721968A (en) Test method, test device and storage medium
CN114237981A (en) Data recovery method, device, equipment and storage medium
CN110765846B (en) Snapshot push test method and device
JP2022175146A (en) inference device
CN112235925A (en) Visual navigation aid equipment state early warning method and related device
CN112203016A (en) Monitoring processing method, device, equipment and medium for video acquisition equipment
CN114298990B (en) Detection method and device of vehicle-mounted image pickup device, storage medium and vehicle
CN113806119B (en) Memory card processing method, device, equipment and storage medium
CN111079466A (en) Vehicle identification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40024732

Country of ref document: HK

GR01 Patent grant
GR01 Patent grant