CN107862679A - Method and device for determining an image detection region - Google Patents

Method and device for determining an image detection region

Info

Publication number
CN107862679A
CN107862679A (application number CN201710986253.4A)
Authority
CN
China
Prior art keywords
pixel
image
object to be detected
contour
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710986253.4A
Other languages
Chinese (zh)
Other versions
CN107862679B (en)
Inventor
周民刚
吕建涛
王克生
袁旗
沈明锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Goertek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Inc filed Critical Goertek Inc
Priority to CN201710986253.4A priority Critical patent/CN107862679B/en
Publication of CN107862679A publication Critical patent/CN107862679A/en
Application granted granted Critical
Publication of CN107862679B publication Critical patent/CN107862679B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/194 - Segmentation; Edge detection involving foreground-background segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and device for determining an image detection region. The method includes: obtaining, from a detection image corresponding to an object to be detected, the pixels in the image corresponding to the object to be detected; processing the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected; calculating, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel; obtaining, along the gradient direction calculated for each pixel, the pixels of at least one pixel unit; and combining each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and using the pixel combination region as the image detection region. According to an embodiment of the present invention, this avoids the problem of a large maintenance workload for the whole detection device caused by the diversity of detection programs.

Description

Method and device for determining an image detection region
Technical field
The present invention relates to the technical field of appearance defect detection, and more particularly to a method and device for determining an image detection region.
Background art
Machine vision detection refers to photographing an object with a camera and then processing the captured image to detect features of the object and thereby obtain the quality condition of the object.
Machine vision detection is being applied more and more widely, mainly in dimensional measurement and appearance defect detection. Appearance defect detection plays a decisive role in improving the quality of the whole product. Machine vision detection requires a suitable detection region in order to avoid interference from features in other regions.
At present, because objects vary in shape, different detection regions need to be set for the captured images, and different detection programs therefore need to be developed. This diversity of detection programs results in a large maintenance workload for the whole detection device.
Accordingly, it is desirable to provide a new technical solution that addresses the above technical problem in the prior art.
Summary of the invention
An object of the present invention is to provide a new solution for determining an image detection region.
According to a first aspect of the present invention, there is provided a method of determining an image detection region, including:
obtaining, from a detection image corresponding to an object to be detected, the pixels in the image corresponding to the object to be detected;
processing the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected;
calculating, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel;
obtaining, along the gradient direction calculated for each pixel, the pixels of at least one pixel unit;
combining each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and using the pixel combination region as the image detection region.
Optionally, the detection image is a grayscale image, and the detection image includes the image corresponding to the object to be detected and a background image,
and obtaining, from the detection image corresponding to the object to be detected, the pixels in the image corresponding to the object to be detected includes:
segmenting each pixel of the detection image using a Blob algorithm to obtain the pixels in the image corresponding to the object to be detected and the pixels in the background image.
Optionally, processing the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected includes:
grouping the pixels in the image corresponding to the object to be detected into blocks using a Blob algorithm to obtain multiple groups of pixels;
using the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected.
Optionally, before using the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected, the method further includes:
calculating the area or perimeter of the region formed by the outermost pixels of each group of pixels;
comparing the area or perimeter of the region formed by the outermost pixels of each group of pixels with a preset contour area threshold or a preset contour perimeter threshold to obtain a comparison result;
determining, according to the comparison result, at least one group of pixels that meets the user's requirements;
and using the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected then includes:
using the outermost pixels of the at least one group of pixels that meets the user's requirements as the pixels corresponding to the at least one contour of the object to be detected.
Optionally, calculating, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel includes:
obtaining the gray values of the 8 pixels adjacent to each pixel corresponding to the at least one contour of the object to be detected;
calculating the gradient direction of each pixel according to the gray values of its 8 adjacent pixels.
Optionally, with the length direction of the detection image as the X-axis direction and the width direction of the detection image as the Y-axis direction, calculating the gradient direction of each pixel according to the gray values of its 8 adjacent pixels includes:
obtaining, based on the following formulas, the gradient magnitude Magnitude_x of each pixel along the X-axis direction and the gradient magnitude Magnitude_y along the Y-axis direction,
Magnitude_x = [(P3 - P1) + 2 × (P5 - P4) + (P8 - P6)] / 4
Magnitude_y = [(P6 - P1) + 2 × (P7 - P2) + (P8 - P3)] / 4
where P1, P2, P3, P4, P5, P6, P7 and P8 represent the gray values of the pixels located at the upper left, directly above, upper right, directly left, directly right, lower left, directly below and lower right of the current pixel, respectively;
obtaining, based on the following formula, the gradient angle α,
α = atan2(Magnitude_y, Magnitude_x),
and obtaining the gradient direction of each pixel according to the gradient angle α.
According to a second aspect of the present invention, there is provided a device for determining an image detection region, including:
a first acquisition module, configured to obtain, from a detection image corresponding to an object to be detected, the pixels in the image corresponding to the object to be detected;
a processing module, configured to process the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected;
a setup module, configured to calculate, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel;
a second acquisition module, configured to obtain, along the gradient direction calculated for each pixel, the pixels of at least one pixel unit;
a composite module, configured to combine each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and to use the pixel combination region as the image detection region.
Optionally, the detection image is a grayscale image, and the detection image includes the image corresponding to the object to be detected and a background image,
and the first acquisition module is further configured to: segment each pixel of the detection image using a Blob algorithm to obtain the pixels in the image corresponding to the object to be detected and the pixels in the background image.
Optionally, the processing module is further configured to:
group the pixels in the image corresponding to the object to be detected into blocks using a Blob algorithm to obtain multiple groups of pixels;
use the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected.
According to a third aspect of the present invention, there is provided a device for determining an image detection region, including a memory and a processor, wherein the memory stores executable instructions, and the executable instructions control the processor to perform the method of determining an image detection region described in any of the above.
The method and device for determining an image detection region provided by the present invention are suitable for detecting objects of different shapes. Compared with the prior art, it is no longer necessary to develop different detection programs, which avoids the problem of a large maintenance workload for the whole detection device caused by the diversity of detection programs.
Further features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments of the present invention with reference to the accompanying drawings.
Brief description of the drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Fig. 1 shows a flowchart of a method of determining an image detection region according to an embodiment of the present invention.
Fig. 2 shows a schematic diagram of pixels on a contour according to an embodiment of the present invention.
Fig. 3 shows a schematic diagram of gradient directions according to an embodiment of the present invention.
Fig. 4 shows a schematic diagram of a detection image according to an embodiment of the present invention.
Fig. 5 shows a schematic structural diagram of a device for determining an image detection region according to an embodiment of the present invention.
Fig. 6 shows another schematic structural diagram of a device for determining an image detection region according to an embodiment of the present invention.
Detailed description of embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless specifically stated otherwise, the relative arrangement of components and steps, the numerical expressions and the numerical values set forth in these embodiments do not limit the scope of the invention.
The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application or its use.
Techniques, methods and apparatus known to a person of ordinary skill in the relevant art may not be discussed in detail, but, where appropriate, such techniques, methods and apparatus should be considered part of the specification.
In all the examples shown and discussed herein, any specific value should be interpreted as merely exemplary and not as a limitation. Therefore, other examples of the exemplary embodiments may have different values.
It should be noted that similar reference numerals and letters denote similar items in the following figures; therefore, once an item is defined in one figure, it need not be further discussed in subsequent figures.
An embodiment of the present invention provides a method of determining an image detection region. Fig. 1 shows a flowchart of the method of determining an image detection region according to an embodiment of the present invention. Referring to Fig. 1, the method includes at least the following steps S101 to S105.
Step S101: from a detection image corresponding to an object to be detected, obtain the pixels in the image corresponding to the object to be detected;
Step S102: process the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected;
Step S103: for each pixel corresponding to the at least one contour of the object to be detected, calculate the gradient direction of that pixel;
Step S104: along the gradient direction calculated for each pixel, obtain the pixels of at least one pixel unit;
Step S105: combine each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and use the pixel combination region as the image detection region.
The detection image in the present invention is an image obtained by photographing the object to be detected with a camera. The detection image is a grayscale image. A grayscale image is typically displayed as gray levels ranging from the darkest black to the brightest white, giving 256 gray levels.
In an embodiment of the present invention, the detection image includes both the image corresponding to the object to be detected and a background image. It should be noted that the image corresponding to the object to be detected and the background image have high contrast, which facilitates the subsequent segmentation.
In an embodiment of the present invention, a Blob algorithm is used to segment each pixel of the detection image to obtain the pixels in the image corresponding to the object to be detected and the pixels in the background image. A Blob algorithm analyses connected regions of identical pixels in the detection image; such a connected region is called a Blob. Specifically, a gray threshold is preset, the gray value of each pixel of the detection image is compared with the gray threshold, and according to the comparison result the pixel is determined to be a pixel in the image corresponding to the object to be detected or a pixel in the background image. For example, the object to be detected, which is not a white object, is placed on a white background plate and photographed to obtain a grayscale image. Each pixel of the grayscale image is segmented using the Blob algorithm with a preset gray threshold of 254: the gray value of each pixel is compared with the preset gray threshold; when the gray value of a pixel is greater than 254, the pixel is determined to be a pixel in the background image, and when the gray value of a pixel is less than 254, the pixel is determined to be a pixel in the image corresponding to the object to be detected. It should be noted that the preset gray threshold of 254 is merely an example and does not limit the present invention in any way. The preset gray threshold may be any value between 0 and 255. In addition, the preset gray threshold may be set according to the brightness of the image corresponding to the object to be detected and the brightness of the background image.
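For illustration only, the thresholding just described can be sketched in a few lines of Python with NumPy. The detection image is assumed to be already loaded as a 2-D grayscale array; the function name and the default threshold of 254 are illustrative assumptions taken from the example above.

```python
import numpy as np

def segment_object_pixels(gray, threshold=254):
    """Split a grayscale detection image into object and background pixels by
    comparing each gray value with a preset threshold: values above the
    threshold are treated as the bright background plate, the rest as the
    object to be detected. Returns a boolean mask that is True for object pixels."""
    gray = np.asarray(gray)
    object_mask = gray <= threshold   # True for pixels of the image corresponding to the object
    return object_mask

# Usage sketch: np.argwhere(segment_object_pixels(image)) lists the (row, column)
# coordinates of the pixels in the image corresponding to the object to be detected.
```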
After the pixels in the image corresponding to the object to be detected have been obtained, in an embodiment of the present invention a Blob algorithm is used to group these pixels into blocks, obtaining multiple groups of pixels, where the gray values of the pixels within each group are identical or similar; "similar" here means that the differences between the gray values of the pixels are within a preset gray difference range. Then, the outermost pixels of each group of pixels are used as the pixels corresponding to the at least one contour of the object to be detected.
In an embodiment of the present invention, after the Blob algorithm has grouped the pixels in the image corresponding to the object to be detected into blocks, multiple groups of pixels are obtained; each group of pixels may correspond to a region, and each region may correspond to at least one contour, so that multiple contours can be obtained. Depending on the user's requirements, quality inspection may not be needed for all of the obtained contours. In that case, the area of the region formed by the outermost pixels of each group of pixels is first calculated and then compared with a preset contour area threshold to obtain a comparison result, and the pixel groups exceeding the preset contour area threshold are selected from the comparison results. Alternatively, the perimeter of the region formed by the outermost pixels of each group of pixels is first calculated and then compared with a preset contour perimeter threshold to obtain a comparison result, and the pixel groups exceeding the preset contour perimeter threshold are selected from the comparison results. In this way, the pixels corresponding to the at least one contour of the object to be detected that meets the user's requirements are obtained.
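The patent describes the Blob processing generically; as one possible illustrative substitute, OpenCV's contour functions (cv2.findContours, cv2.contourArea, cv2.arcLength) can perform the grouping, outermost-pixel extraction and area or perimeter filtering. The function name and the min_area/min_perimeter parameters stand in for the preset thresholds mentioned in the text and are assumptions.

```python
import cv2
import numpy as np

def extract_candidate_contours(object_mask, min_area=None, min_perimeter=None):
    """Group the object pixels into connected blocks, take the outermost pixels
    of each block as a contour, and keep only the contours whose area or
    perimeter exceeds the preset threshold (if a threshold is given)."""
    mask_u8 = object_mask.astype(np.uint8) * 255
    # OpenCV 4.x signature: returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask_u8, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

    selected = []
    for contour in contours:
        if min_area is not None and cv2.contourArea(contour) < min_area:
            continue
        if min_perimeter is not None and cv2.arcLength(contour, True) < min_perimeter:
            continue
        selected.append(contour)   # the contour points are the outermost pixels of one block
    return selected
```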
After pixel corresponding at least one profile of examined object is obtained, gradient corresponding to each pixel is calculated Direction.In one embodiment of the present of invention, first, it is adjacent to obtain each pixel corresponding at least one profile of examined object 8 pixels gray value, then, according to the gray value of 8 adjacent pixels of each pixel, each pixel is calculated Corresponding gradient direction.
Fig. 2 shows a schematic diagram of pixels on a contour according to an embodiment of the present invention. Referring to Fig. 2, each grid cell in the figure represents one pixel, and the shaded cells form part of the image corresponding to the object to be detected. The contour of the object to be detected shown in Fig. 2 is a curve, and pixel A lies on this contour. Taking pixel A as an example, the determination of the gradient direction of pixel A is described below. First, based on formulas (1) and (2), the gradient magnitude Magnitude_x of pixel A along the X-axis direction and the gradient magnitude Magnitude_y along the Y-axis direction are obtained,
Magnitude_x = [(P3 - P1) + 2 × (P5 - P4) + (P8 - P6)] / 4        (formula (1))
Magnitude_y = [(P6 - P1) + 2 × (P7 - P2) + (P8 - P3)] / 4        (formula (2))
where P1, P2, P3, P4, P5, P6, P7 and P8 represent the gray values of the pixels located at the upper left, directly above, upper right, directly left, directly right, lower left, directly below and lower right of pixel A, respectively. Then, based on formula (3), the gradient angle α is obtained,
α = atan2(Magnitude_y, Magnitude_x)        (formula (3))
Finally, the gradient direction of pixel A is obtained from the gradient angle α. Fig. 3 shows a schematic diagram of the gradient directions according to an embodiment of the present invention. Referring to Fig. 3, there are 8 gradient directions in total, and each gradient direction corresponds to a range of gradient angles: the first gradient direction corresponds to (-22.5°, 22.5°], the second to (22.5°, 67.5°], the third to (67.5°, 112.5°], the fourth to (112.5°, 157.5°], the fifth to (157.5°, 180°] together with (-180°, -157.5°], the sixth to (-157.5°, -112.5°], the seventh to (-112.5°, -67.5°], and the eighth to (-67.5°, -22.5°]. The gradient angle α obtained from formula (3) is checked against these ranges to determine which range it falls in, and thus the gradient direction of pixel A. Starting from pixel A, the pixels of at least one pixel unit are then obtained along the gradient direction of pixel A.
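The gradient computation of formulas (1) to (3) and the 8-direction quantisation can be sketched as follows. This is a minimal illustration assuming the grayscale image is indexed as gray[row][column] with the row index growing downward; pixels on the image border are not handled, and the function name is an assumption.

```python
import math

def gradient_direction(gray, y, x):
    """Gradient angle and quantised direction of contour pixel (y, x),
    following formulas (1)-(3) above."""
    p1 = float(gray[y - 1][x - 1])  # upper left
    p2 = float(gray[y - 1][x])      # directly above
    p3 = float(gray[y - 1][x + 1])  # upper right
    p4 = float(gray[y][x - 1])      # directly left
    p5 = float(gray[y][x + 1])      # directly right
    p6 = float(gray[y + 1][x - 1])  # lower left
    p7 = float(gray[y + 1][x])      # directly below
    p8 = float(gray[y + 1][x + 1])  # lower right

    magnitude_x = ((p3 - p1) + 2.0 * (p5 - p4) + (p8 - p6)) / 4.0  # formula (1)
    magnitude_y = ((p6 - p1) + 2.0 * (p7 - p2) + (p8 - p3)) / 4.0  # formula (2)
    alpha = math.degrees(math.atan2(magnitude_y, magnitude_x))     # formula (3), in degrees

    # Quantise the angle into direction indices 0..7 (first to eighth gradient
    # direction): index 0 covers (-22.5°, 22.5°], index 1 covers (22.5°, 67.5°],
    # and so on; index 4 covers the wrap-around range (157.5°, 180°] together
    # with (-180°, -157.5°].
    direction = int(math.ceil((alpha - 22.5) / 45.0)) % 8
    return alpha, direction
```

Applied to pixel A in Fig. 2, the returned direction index selects one of the 8 directions of Fig. 3 along which the pixels of at least one pixel unit are then collected.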
The method of determining an image detection region provided by the present invention is further described below with reference to a specific embodiment.
Fig. 4 shows a schematic diagram of a detection image according to an embodiment of the present invention. Referring to Fig. 4, the detection image includes the image corresponding to a black material and the image corresponding to a white background. The black material is a square part with a square through hole in its central region. First, a Blob algorithm is used to segment each pixel of the detection image shown in Fig. 4, obtaining the pixels in the image corresponding to the black material and the pixels in the white background image. Then, the Blob algorithm is used to group the pixels in the image corresponding to the black material into blocks; since the gray value of every pixel in the image corresponding to the black material is 0, a single group of pixels is obtained. The outermost pixels of this group of pixels are used as the pixels corresponding to the contours of the black material. Referring to Fig. 4, two contours, namely a first contour and a second contour, can be obtained from this group of pixels. Assuming that the first contour and the second contour contain 2000 and 1000 pixels respectively, the gradient direction of each pixel of the first contour and of each pixel of the second contour is calculated. Then, along the gradient direction of each pixel, the pixels of 5 pixel units are obtained. In this way, 5 × 2000 and 5 × 1000 pixels are obtained for the first contour and the second contour respectively, forming a first boundary image composed of the 5 × 2000 pixels and a second boundary image composed of the 5 × 1000 pixels. The first boundary image and the second boundary image serve as defect detection images. In embodiments of the present invention, quality inspection of the image detection region, for example appearance defect detection, is preferably carried out using methods such as histogram analysis or blob detection.
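Continuing the illustration, the final combination step (walking a number of pixel units from every contour pixel along its quantised gradient direction and merging the visited pixels with the contour pixels into the pixel combination region) could look like the sketch below. The direction-to-offset table and the function name are illustrative assumptions, using array coordinates in which the row index grows downward, consistent with gradient_direction() above.

```python
import numpy as np

# (dy, dx) offsets for the 8 quantised gradient directions returned by
# gradient_direction(), assuming the row index grows downward.
_DIRECTION_OFFSETS = [
    (0, 1),    # first direction,   around 0°
    (1, 1),    # second direction,  around 45°
    (1, 0),    # third direction,   around 90°
    (1, -1),   # fourth direction,  around 135°
    (0, -1),   # fifth direction,   around ±180°
    (-1, -1),  # sixth direction,   around -135°
    (-1, 0),   # seventh direction, around -90°
    (-1, 1),   # eighth direction,  around -45°
]

def build_detection_region(contour_pixels, directions, shape, steps=5):
    """Mark each contour pixel plus the `steps` pixels reached by walking along
    its gradient direction; the True cells of the returned mask form the pixel
    combination region used as the image detection region."""
    region = np.zeros(shape, dtype=bool)
    height, width = shape
    for (y, x), d in zip(contour_pixels, directions):
        dy, dx = _DIRECTION_OFFSETS[d]
        region[y, x] = True
        for k in range(1, steps + 1):
            ny, nx = y + k * dy, x + k * dx
            if 0 <= ny < height and 0 <= nx < width:
                region[ny, nx] = True
    return region
```

For the example above, walking 5 pixel units from the 2000 pixels of the first contour and the 1000 pixels of the second contour would mark up to 5 × 2000 and 5 × 1000 additional pixels, corresponding to the first and second boundary images described in the text.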
Based on the same inventive concept, the present invention provides a device for determining an image detection region. Fig. 5 shows a schematic structural diagram of the device for determining an image detection region according to an embodiment of the present invention. Referring to Fig. 5, the device includes at least: a first acquisition module 510, configured to obtain, from a detection image corresponding to an object to be detected, the pixels in the image corresponding to the object to be detected; a processing module 520, configured to process the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected; a setup module 530, configured to calculate, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel; a second acquisition module 540, configured to obtain, along the gradient direction calculated for each pixel, the pixels of at least one pixel unit; and a composite module 550, configured to combine each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and to use the pixel combination region as the image detection region.
In an embodiment of the present invention, the detection image is a grayscale image, and the detection image includes the image corresponding to the object to be detected and a background image.
In an embodiment of the present invention, the first acquisition module 510 is further configured to: segment each pixel of the detection image using a Blob algorithm to obtain the pixels in the image corresponding to the object to be detected and the pixels in the background image.
In an embodiment of the present invention, the processing module 520 is further configured to: group the pixels in the image corresponding to the object to be detected into blocks using a Blob algorithm to obtain multiple groups of pixels; and use the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected.
In an embodiment of the present invention, the processing module 520 is further configured to: calculate the area or perimeter of the region formed by the outermost pixels of each group of pixels; compare the area or perimeter of the region formed by the outermost pixels of each group of pixels with a preset contour area threshold or a preset contour perimeter threshold to obtain a comparison result; determine, according to the comparison result, at least one group of pixels that meets the user's requirements; and use the outermost pixels of the at least one group of pixels that meets the user's requirements as the pixels corresponding to the at least one contour of the object to be detected.
Fig. 6 shows another schematic structural diagram of the device for determining an image detection region according to an embodiment of the present invention. Referring to Fig. 6, the device includes at least a memory 620 and a processor 610, wherein the memory 620 stores executable instructions, and the executable instructions control the processor 610 to perform the method of determining an image detection region described in any of the above.
The present invention may be a system, a method and/or a computer program product. The computer program product may include a computer readable storage medium carrying computer readable program instructions for causing a processor to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electric storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse passing through a fiber-optic cable), or an electrical signal transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium, or to an external computer or external storage device via a network, for example the Internet, a local area network, a wide area network and/or a wireless network. The network may include copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object oriented programming languages such as Smalltalk or C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, an electronic circuit, for example a programmable logic device, a field-programmable gate array (FPGA) or a programmable logic array (PLA), may be personalized by utilizing state information of the computer readable program instructions, and the electronic circuit may execute the computer readable program instructions so as to carry out aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the present invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, a special purpose computer or another programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer readable program instructions may also be stored in a computer readable storage medium, where the instructions cause a computer, a programmable data processing apparatus and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein comprises an article of manufacture including instructions which implement aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The computer readable program instructions may also be loaded onto a computer, another programmable data processing apparatus or another device, so that a series of operational steps are performed on the computer, other programmable data processing apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable data processing apparatus or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functionality and operation of possible implementations of systems, methods and computer program products according to multiple embodiments of the present invention. In this regard, each block in a flowchart or block diagram may represent a module, a program segment or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in a block may occur in an order different from that noted in the drawings. For example, two successive blocks may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a special purpose hardware-based system that performs the specified functions or acts, or by a combination of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are all equivalent.
While various embodiments of the present invention have been described above, the above description is exemplary rather than exhaustive and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the illustrated embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present invention is defined by the appended claims.

Claims (10)

1. A method of determining an image detection region, comprising:
obtaining, from a detection image corresponding to an object to be detected, the pixels in the image corresponding to the object to be detected;
processing the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected;
calculating, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel;
obtaining, along the gradient direction calculated for each pixel, the pixels of at least one pixel unit;
combining each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and using the pixel combination region as the image detection region.
2. The method according to claim 1, wherein the detection image is a grayscale image and the detection image comprises the image corresponding to the object to be detected and a background image, and
obtaining, from the detection image corresponding to the object to be detected, the pixels in the image corresponding to the object to be detected comprises:
segmenting each pixel of the detection image using a Blob algorithm to obtain the pixels in the image corresponding to the object to be detected and the pixels in the background image.
3. The method according to claim 1, wherein processing the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected comprises:
grouping the pixels in the image corresponding to the object to be detected into blocks using a Blob algorithm to obtain multiple groups of pixels;
using the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected.
4. The method according to claim 3, wherein, before using the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected, the method further comprises:
calculating the area or perimeter of the region formed by the outermost pixels of each group of pixels;
comparing the area or perimeter of the region formed by the outermost pixels of each group of pixels with a preset contour area threshold or a preset contour perimeter threshold to obtain a comparison result;
determining, according to the comparison result, at least one group of pixels that meets the user's requirements;
and wherein using the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected comprises:
using the outermost pixels of the at least one group of pixels that meets the user's requirements as the pixels corresponding to the at least one contour of the object to be detected.
5. The method according to any one of claims 1 to 4, wherein calculating, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel comprises:
obtaining the gray values of the 8 pixels adjacent to each pixel corresponding to the at least one contour of the object to be detected;
calculating the gradient direction of each pixel according to the gray values of its 8 adjacent pixels.
6. The method according to claim 5, wherein, with the length direction of the detection image as the X-axis direction and the width direction of the detection image as the Y-axis direction,
calculating the gradient direction of each pixel according to the gray values of its 8 adjacent pixels comprises:
obtaining, based on the following formulas, the gradient magnitude Magnitude_x of each pixel along the X-axis direction and the gradient magnitude Magnitude_y along the Y-axis direction,
Magnitude_x = [(P3 - P1) + 2 × (P5 - P4) + (P8 - P6)] / 4,
Magnitude_y = [(P6 - P1) + 2 × (P7 - P2) + (P8 - P3)] / 4,
wherein P1, P2, P3, P4, P5, P6, P7 and P8 represent the gray values of the pixels located at the upper left, directly above, upper right, directly left, directly right, lower left, directly below and lower right of the current pixel, respectively;
obtaining, based on the following formula, the gradient angle α,
α = atan2(Magnitude_y, Magnitude_x),
and obtaining the gradient direction of each pixel according to the gradient angle α.
7. A device for determining an image detection region, comprising:
a first acquisition module, configured to obtain, from a detection image corresponding to an object to be detected, the pixels in the image corresponding to the object to be detected;
a processing module, configured to process the pixels in the image corresponding to the object to be detected to obtain the pixels corresponding to at least one contour of the object to be detected;
a setup module, configured to calculate, for each pixel corresponding to the at least one contour of the object to be detected, the gradient direction of that pixel;
a second acquisition module, configured to obtain, along the gradient direction calculated for each pixel, the pixels of at least one pixel unit;
a composite module, configured to combine each pixel with the at least one pixel obtained along the gradient direction calculated for that pixel to form a pixel combination region, and to use the pixel combination region as the image detection region.
8. The device according to claim 7, wherein the detection image is a grayscale image and the detection image comprises the image corresponding to the object to be detected and a background image, and
the first acquisition module is further configured to: segment each pixel of the detection image using a Blob algorithm to obtain the pixels in the image corresponding to the object to be detected and the pixels in the background image.
9. The device according to claim 7, wherein the processing module is further configured to:
group the pixels in the image corresponding to the object to be detected into blocks using a Blob algorithm to obtain multiple groups of pixels;
use the outermost pixels of each group of pixels as the pixels corresponding to the at least one contour of the object to be detected.
10. A device for determining an image detection region, comprising a memory and a processor, wherein the memory stores executable instructions, and the executable instructions control the processor to perform the method of determining an image detection region according to any one of claims 1 to 6.
CN201710986253.4A 2017-10-20 2017-10-20 Method and device for determining image detection area Active CN107862679B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710986253.4A CN107862679B (en) 2017-10-20 2017-10-20 Method and device for determining image detection area

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710986253.4A CN107862679B (en) 2017-10-20 2017-10-20 Method and device for determining image detection area

Publications (2)

Publication Number Publication Date
CN107862679A true CN107862679A (en) 2018-03-30
CN107862679B CN107862679B (en) 2020-12-18

Family

ID=61697622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710986253.4A Active CN107862679B (en) 2017-10-20 2017-10-20 Method and device for determining image detection area

Country Status (1)

Country Link
CN (1) CN107862679B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596876A (en) * 2018-03-28 2018-09-28 潍坊路加精工有限公司 Method for automatically setting detection area
CN111242888A (en) * 2019-12-03 2020-06-05 中国人民解放军海军航空大学 Image processing method and system based on machine vision

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6233364B1 (en) * 1998-09-18 2001-05-15 Dainippon Screen Engineering Of America Incorporated Method and system for detecting and tagging dust and scratches in a digital image
CN101799434A (en) * 2010-03-15 2010-08-11 深圳市中钞科信金融科技有限公司 Printing image defect detection method
CN103208117A (en) * 2013-03-21 2013-07-17 袁景 Intelligent multifunctional belt surface patch edge detection method
CN104981105A (en) * 2015-07-09 2015-10-14 广东工业大学 Detecting and error-correcting method capable of rapidly and accurately obtaining element center and deflection angle
CN105809131A (en) * 2016-03-08 2016-07-27 宁波裕兰信息科技有限公司 Method and system for carrying out parking space waterlogging detection based on image processing technology
CN106650553A (en) * 2015-10-30 2017-05-10 比亚迪股份有限公司 License plate recognition method and system
CN106815846A (en) * 2016-12-30 2017-06-09 歌尔科技有限公司 Image vignetting cropping range determination method, and image vignetting compensation method and device
CN106845494A (en) * 2016-12-22 2017-06-13 歌尔科技有限公司 Method and device for detecting contour corner points in an image

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6233364B1 (en) * 1998-09-18 2001-05-15 Dainippon Screen Engineering Of America Incorporated Method and system for detecting and tagging dust and scratches in a digital image
CN101799434A (en) * 2010-03-15 2010-08-11 深圳市中钞科信金融科技有限公司 Printing image defect detection method
CN103208117A (en) * 2013-03-21 2013-07-17 袁景 Intelligent multifunctional belt surface patch edge detection method
CN104981105A (en) * 2015-07-09 2015-10-14 广东工业大学 Detecting and error-correcting method capable of rapidly and accurately obtaining element center and deflection angle
CN106650553A (en) * 2015-10-30 2017-05-10 比亚迪股份有限公司 License plate recognition method and system
CN105809131A (en) * 2016-03-08 2016-07-27 宁波裕兰信息科技有限公司 Method and system for carrying out parking space waterlogging detection based on image processing technology
CN106845494A (en) * 2016-12-22 2017-06-13 歌尔科技有限公司 Method and device for detecting contour corner points in an image
CN106815846A (en) * 2016-12-30 2017-06-09 歌尔科技有限公司 Image vignetting cropping range determination method, and image vignetting compensation method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
杨合超 et al.: "几种图像分割技术的比较" (A comparison of several image segmentation techniques), 《电脑知识与技术》 (Computer Knowledge and Technology) *
林灿尧 et al.: "《水利信息化新技术及应用》" (New Technologies and Applications of Water Conservancy Informatization), 31 July 2013, 长江出版社 (Changjiang Press) *
黄祚继 et al.: "《多源遥感数据目标地物的分类与优化》" (Classification and Optimization of Target Ground Objects from Multi-Source Remote Sensing Data), 31 May 2017 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108596876A (en) * 2018-03-28 2018-09-28 潍坊路加精工有限公司 Method for automatically setting detection area
CN108596876B (en) * 2018-03-28 2020-07-03 潍坊路加精工有限公司 Method for automatically setting detection area
CN111242888A (en) * 2019-12-03 2020-06-05 中国人民解放军海军航空大学 Image processing method and system based on machine vision

Also Published As

Publication number Publication date
CN107862679B (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN109613002B (en) Glass defect detection method and device and storage medium
CN103279765B (en) Steel wire rope surface damage detection method based on images match
JP6811217B2 (en) Crack identification method, crack identification device, crack identification system and program on concrete surface
JP2003520969A (en) Method and system for detecting defects in printed circuit boards
Mery et al. Image processing for fault detection in aluminum castings
CN110910445B (en) Object size detection method, device, detection equipment and storage medium
CN104053984A (en) Image examination method and image examination apparatus
CN107862679A (en) The determination method and device of image detection region
JP7226493B2 (en) Contact wire wear inspection method
CN104899844A (en) Image defogging method and device
CN109300127A (en) Defect inspection method, device, computer equipment and storage medium
JP2011145179A (en) Sensory test device and sensory test method
CN113674277B (en) Unsupervised domain adaptive surface defect region segmentation method and device and electronic equipment
Mir et al. Machine learning-based evaluation of the damage caused by cracks on concrete structures
CN108760755A (en) A kind of dust granule detection method and device
JP2020135051A (en) Fault inspection device, fault inspection method, fault inspection program, learning device and learned model
CN105915763A (en) Video denoising and detail enhancement method and device
TW202020802A (en) Information processing method and computer program
CN106530274B (en) A kind of localization method of girder steel crackle
CN115731390A (en) Method and equipment for identifying rock mass structural plane of limestone tunnel
CN115713622A (en) Casting defect detection method and system based on three-dimensional model and flaw detection image
Prabha et al. Defect detection of industrial products using image segmentation and saliency
US8306311B2 (en) Method and system for automated ball-grid array void quantification
CN115100110A (en) Defect detection method, device and equipment for polarized lens and readable storage medium
CN107203976A (en) A kind of adaptive non-local mean denoising method and system detected based on noise

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20201014

Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Applicant after: GoerTek Optical Technology Co.,Ltd.

Address before: 261031 Dongfang Road, Weifang high tech Development Zone, Shandong, China, No. 268

Applicant before: GOERTEK Inc.

GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20221123

Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong

Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee before: GoerTek Optical Technology Co.,Ltd.