CN107240101A - Target area detection method and device, image segmentation method and device - Google Patents

Target area detection method and device, image segmentation method and device

Info

Publication number
CN107240101A
CN107240101A
Authority
CN
China
Prior art keywords
target
target segment
image
line
row
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710237893.5A
Other languages
Chinese (zh)
Other versions
CN107240101B (en)
Inventor
苏衍昌
蒋均
王志华
黄巧文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guilin Measuring & Cutting Tool Co ltd
Urit Medical Electronic Co Ltd
Original Assignee
Urit Medical Electronic Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Urit Medical Electronic Co Ltd filed Critical Urit Medical Electronic Co Ltd
Priority to CN201710237893.5A priority Critical patent/CN107240101B/en
Publication of CN107240101A publication Critical patent/CN107240101A/en
Application granted granted Critical
Publication of CN107240101B publication Critical patent/CN107240101B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/155Segmentation; Edge detection involving morphological operators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/187Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/28Indexing scheme for image data processing or generation, in general involving image processing hardware

Abstract

The invention discloses a target area detection method, an image segmentation method, and corresponding devices, relating to the technical field of image processing. The method includes: target segment detection, in which an image to be detected is binarized line by line to obtain a binary map, and the start position, end position, and row number of each target segment in the binary map are detected as the position information of that target segment; target area detection, in which the position information of target areas is obtained; and target segmentation, in which the target areas are segmented out of the image to be detected according to their position information. The target segment detection, target area detection, and target segmentation are performed in parallel on different images to be detected. The method and device improve the applicability and real-time performance of image segmentation.

Description

Target area detection method and device, image segmentation method and device
Technical field
The present invention relates to the technical field of image processing, and in particular to a target area detection method and device, and an image segmentation method and device.
Background technology
In image processing applications it is often necessary to segment targets from an image to be detected, for example to segment particulate targets such as cells or crystals from a microscopic image.
However, on the one hand, the multiple targets present in an image to be detected may lie at any position in the image, and the target density varies from image to image. Existing image segmentation techniques, such as threshold-based segmentation, edge-detection-based segmentation, and region-based segmentation, are each suited only to certain specific images to be detected, so their applicability is low.
On the other hand, existing image segmentation techniques can only compute in a serial manner, so the computer must perform a massive amount of computation during processing; this takes a large amount of time and the real-time performance is poor.
Summary of the invention
The inventors found the above problems in the prior art and therefore propose a new technical solution directed at at least one of those problems.
One object of the present invention is to provide an image segmentation solution with higher applicability and better real-time performance.
According to a first embodiment of the present invention, a target area detection method is provided, including: obtaining a binary map corresponding to an image to be processed and the position information of the target segments in every row of the binary map; reading the target segments in the binary map row by row, and merging target segments whose column coordinate intervals overlap into one target area; and recording the position information of the target area according to the position information of the target segments it contains.
Optionally, reading the target segments in the binary map row by row and merging target segments whose column coordinate intervals overlap into one target area includes: reading the target segments in the binary map row by row, taking one of the target segments as an initial target segment; and merging the initial target segment with the target segments in other rows whose column coordinate intervals overlap that of the initial target segment into one target area.
Optionally, the method further includes: if, in the binary map, the target segments of two consecutive rows have no column coordinate interval overlapping that of the already merged target segments, the merging of the current target area is finished; the target segments in the binary map that have not yet been merged are then read row by row again as initial target segments, and the merging of the next target area is carried out.
Optionally, the position information of a target segment includes the start coordinate and end coordinate of the target segment; the position information of a target area includes the minimum start column coordinate, the maximum end column coordinate, the start row number, and the end row number of the target area.
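By way of illustration only (not part of the claimed subject matter), the position information described above could be represented in software by two small record types; the field names below are illustrative assumptions rather than terms used in the patent:

```python
from dataclasses import dataclass

@dataclass
class TargetSegment:
    # A run of foreground (value-1) pixels within one row of the binary map.
    row: int         # row number where the segment lies
    start_col: int   # column coordinate of the first 1-valued pixel
    end_col: int     # column coordinate where the value changes from 1 to 0

@dataclass
class TargetArea:
    # Bounding information of a group of merged, overlapping segments.
    min_start_col: int  # minimum start column over all merged segments
    max_end_col: int    # maximum end column over all merged segments
    start_row: int      # first row containing a merged segment
    end_row: int        # last row containing a merged segment
```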
According to another embodiment of the present invention, an image segmentation method is provided, including: target segment detection, in which an image to be detected is binarized line by line to obtain a binary map, and the start coordinate and end coordinate of each target segment in the binary map are detected as the position information of that target segment; target area detection, in which the position information of target areas is obtained by the target area detection method of any of the preceding embodiments; and target segmentation, in which the target areas are segmented out of the image to be detected according to their position information.
Optionally, the target segment detection, the target area detection, and the target segmentation are performed in parallel on different images to be detected.
Optionally, performing the target segment detection, the target area detection, and the target segmentation in parallel on different images to be detected includes: performing the target segment detection on a first image; performing the target area detection on the first image while performing the target segment detection on a second image; performing the target segmentation on the first image and the target segment detection on a third image while performing the target area detection on the second image; performing the target area detection on the third image and the target segment detection on a fourth image while performing the target segmentation on the second image; and continuing the above processing, in input order, for each subsequently input image to be detected, until all the images to be detected have been processed.
Optionally, the target segment detection includes: performing downsampling, edge detection, dilation, and erosion on the image to be detected so as to obtain the binary map; and reading the binary map line by line, saving the column coordinate of the first pixel whose value is 1 as the start position of a target segment, saving the column coordinate of the first pixel whose value changes from 1 to 0 as the end position of the target segment, and saving the row number where the target segment lies, thereby completing the target segment positioning.
Optionally, the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning are performed in parallel on different rows of the image to be detected.
Optionally, performing the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning in parallel on different rows of the image to be detected includes: performing the downsampling on the image to be detected line by line and outputting the result line by line; while performing the downsampling on the row given by a first preset value, performing the edge detection on rows 1 to (first preset value − 1), and while performing the downsampling on row (first preset value + 1), performing the edge detection on rows 2 to the first preset value, continuing in this way in the input order of the rows and outputting the result line by line; while performing the edge detection on the row given by a second preset value, performing the dilation on rows 1 to (second preset value − 1), and while performing the edge detection on row (second preset value + 1), performing the dilation on rows 2 to the second preset value, continuing in this way in the input order of the rows and outputting the result line by line; while performing the dilation on the row given by a third preset value, performing the erosion on rows 1 to (third preset value − 1), and while performing the dilation on row (third preset value + 1), performing the erosion on rows 2 to the third preset value, continuing in this way in the input order of the rows, outputting the result line by line, and performing the target segment positioning.
According to yet another embodiment of the present invention, a target area detection device is provided, including: a target segment information acquisition unit, configured to obtain a binary map corresponding to an image to be processed and the position information of the target segments in every row of the binary map; a target area merging unit, configured to read the target segments in the binary map row by row and merge target segments whose column coordinate intervals overlap into one target area; and a target area information acquisition unit, configured to record the position information of the target area according to the position information of the target segments it contains.
Optionally, the target area merging unit is configured to read the target segments in the binary map row by row, take one of the target segments as an initial target segment, and merge the initial target segment with the target segments in other rows whose column coordinate intervals overlap that of the initial target segment into one target area.
Optionally, the device further includes a merge-finished judging unit configured to judge, if the target segments of two consecutive rows in the binary map have no column coordinate interval overlapping that of the already merged target segments, that the merging of the current target area is finished, and to notify the target area merging unit to read the target segments in the binary image that have not yet been merged row by row again as initial target segments and carry out the merging of the next target area.
Optionally, the position information of a target segment includes the start coordinate and end coordinate of the target segment; the position information of a target area includes the minimum start coordinate, the maximum end coordinate, the start row number, and the end row number of the target area.
According to yet another embodiment of the present invention, an image segmentation device is provided, including: a target segment detection unit, configured to binarize an image to be detected line by line to obtain a binary map and detect the start coordinate and end coordinate of each target segment in the binary map as the position information of that target segment; the target area detection device of any of the preceding embodiments, configured to obtain the position information of target areas; and a target segmentation unit, configured to segment the target areas out of the image to be detected according to their position information.
Optionally, the target segment detection unit, the target area detection device, and the target segmentation unit perform their respective processing in parallel on different images to be detected.
Optionally, the target segment detection unit includes: a binary map acquisition subunit, configured to perform downsampling, edge detection, dilation, and erosion on the image to be detected so as to obtain the binary map; and a target segment positioning subunit, configured to read the binary map line by line, save the column coordinate of the first pixel whose value is 1 as the start position of a target segment, save the column coordinate of the first pixel whose value changes from 1 to 0 as the end position of the target segment, and save the row number where the target segment lies, thereby completing the target segment positioning.
Optionally, the target segment detection unit performs the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning in parallel on different rows of the image to be detected.
According to yet another embodiment of the present invention, an image segmentation device is provided, including a memory and a processor coupled to the memory, the processor being configured to perform, based on instructions stored in the memory, the target area detection method or image segmentation method of any of the preceding embodiments.
According to yet another embodiment of the present invention, a computer-readable storage medium is provided, on which a computer program is stored, the program, when executed by a processor, implementing the image segmentation method of any of the above embodiments.
One advantage of the embodiments of the present invention is that target areas are determined by detecting and merging target segments row by row, which improves the applicability of the image segmentation method, and that the different image segmentation steps are performed on multiple images in parallel at the same time, which improves the real-time performance of the image segmentation method.
Brief description of the drawings
The accompanying drawings, which constitute part of the specification, describe embodiments of the present invention and, together with the description, serve to explain the principles of the present invention.
With reference to the accompanying drawings, the present invention can be understood more clearly from the following detailed description, in which:
Fig. 1 shows the flow chart of one embodiment of the target area detection method according to the present invention.
Fig. 2 shows the flow chart of another embodiment of the target area detection method according to the present invention.
Fig. 3 shows the flow chart of one embodiment of the image partition method according to the present invention.
Fig. 4 shows the schematic diagram of one embodiment of the image partition method according to the present invention.
Fig. 5 shows the flow chart of another embodiment of the image partition method according to the present invention.
Fig. 6 shows the structure chart of one embodiment of the target area detection means according to the present invention.
Fig. 7 shows the structure chart of another embodiment of the target area detection means according to the present invention.
Fig. 8 shows the structure chart of one embodiment of the image segmenting device according to the present invention.
Fig. 9 shows the structure chart of another embodiment of the image segmenting device according to the present invention.
Figure 10 shows the structure chart of another embodiment of the image segmenting device according to the present invention.
Detailed description of the embodiments
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the present invention.
At the same time, it should be understood that, for ease of description, the sizes of the various parts shown in the drawings are not drawn according to actual proportional relationships.
The following description of at least one exemplary embodiment is in fact merely illustrative and is in no way intended to limit the present invention or its application or use.
Techniques, methods, and devices known to a person of ordinary skill in the relevant art may not be discussed in detail, but where appropriate such techniques, methods, and devices should be regarded as part of the specification.
In all examples shown and discussed here, any specific value should be interpreted as merely exemplary rather than as a limitation. Therefore, other examples of the exemplary embodiments may have different values.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; once an item is defined in one drawing, it therefore need not be further discussed in subsequent drawings.
Fig. 1 shows the flow chart of one embodiment of the target area detection method according to the present invention.
As shown in Fig. 1, in step 101, a binary map corresponding to an image to be processed and the position information of the target segments in every row of the binary map are obtained.
For example, the rows of the binary map can be read starting from the 1st row, and the start position, end position, and row number of each target segment contained in a row can be recorded as the position information of that target segment.
In step 102, the target segments are read row by row in the binary map, and it is judged whether the column coordinate intervals where the target segments lie overlap. If so, the target segments are merged into one target area (step 103); if not, the target segments are judged as not belonging to one target area (step 104).
For example, jump to the 2nd row, read all the target segments of that row, and judge whether the positions of these target segments intersect the positions of the target segments recorded in the 1st row; target segments whose positions intersect are merged into the same target area. After the merging of a row is completed, jump to the next row, and mark the target segments that have already been merged.
In one embodiment, it is also judged whether the current target area has finished merging. As shown in Fig. 2, in step 201, a target segment in the binary image that has not yet been merged is read row by row as the initial target segment, and the target segments in other rows whose column coordinate intervals overlap that of the initial target segment are merged into one target area.
In step 202, it is judged whether, in the binary map, the target segments of two consecutive rows have no column coordinate interval overlapping that of the already merged target segments. If so, it is judged that the merging of the current target area is finished, and step 201 is executed again; if not, the target segments whose column coordinate intervals overlap that of the initial target segment are merged into one target area.
For example, once the merging of the current target area is finished, the start row number, end row number, minimum start address, and maximum end address of this merge are saved as the position information of the target area; the binary map is then scanned again, and target segments that have been marked as already merged are not taken as initial target segments.
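As an illustration only, the row-wise merging described above can be sketched in software as follows, assuming the segments are supplied as one list of (start_col, end_col) tuples per row; the function and variable names are illustrative assumptions, and the hardware embodiment would apply the same logic in a streaming fashion rather than over in-memory lists:

```python
def overlaps(a, b):
    # True when the column intervals [a0, a1] and [b0, b1] intersect.
    return a[0] <= b[1] and b[0] <= a[1]

def merge_target_areas(segments_by_row):
    """segments_by_row: one list per image row, each entry a (start_col, end_col) tuple."""
    merged = [[False] * len(row) for row in segments_by_row]
    areas = []
    for r0, row0 in enumerate(segments_by_row):
        for s0, seg0 in enumerate(row0):
            if merged[r0][s0]:
                continue                          # already part of an earlier target area
            # Take this segment as the initial target segment of a new target area.
            merged[r0][s0] = True
            area_segs = [seg0]
            min_c, max_c, end_row = seg0[0], seg0[1], r0
            for r in range(r0 + 1, len(segments_by_row)):
                grew = False
                for s, seg in enumerate(segments_by_row[r]):
                    if not merged[r][s] and any(overlaps(seg, m) for m in area_segs):
                        merged[r][s] = True       # column intervals overlap: merge into the area
                        area_segs.append(seg)
                        min_c, max_c = min(min_c, seg[0]), max(max_c, seg[1])
                        end_row, grew = r, True
                if not grew and r - end_row >= 2:
                    break                         # two consecutive rows without overlap: area finished
            areas.append((min_c, max_c, r0, end_row))  # min start col, max end col, start row, end row
    return areas
```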
In the above embodiment, the present invention only needs the position information of the target segments in the image: the target segments contained in every row of the binary map are merged row by row, and the position information of the target areas in the image to be detected is thereby obtained. This is applicable to various types of images to be detected, thereby improving the applicability of image segmentation.
Fig. 3 shows the flow chart of one embodiment of the image partition method according to the present invention.
As shown in Fig. 3, in step 301, target segment detection is performed to obtain the position information of the target segments.
For example, the image to be detected is binarized, and the start position, end position, and row number of the target segments in every row of the binary map are then obtained.
In step 302, target area detection is performed to obtain the position information of the target areas.
For example, the target area detection method of the above embodiments can be performed, so as to obtain the minimum start position, maximum end position, start row number, and end row number of each target area.
In step 303, target segmentation is performed, and the target areas are segmented out of the image to be detected.
For example, the position of a target area is determined from its minimum start position, maximum end position, start row number, and end row number, and the target area is segmented out of the image to be detected.
In one embodiment, the target segment detection, the target area detection, and the target segmentation are performed in parallel on different images to be detected.
As shown in Fig. 4, target segment detection is performed on a first image and the detection result is stored in memory 41; target segment detection is then performed on a second image, with the detection result stored in memory 42, while target area detection is performed on the first image, with the merging result stored in memory 43; target area detection is performed on the second image, with the merging result stored in memory 44, while target segmentation is performed on the first image, with the segmentation result output, and target segment detection is performed on a third image, with the detection result stored in memory 41; target segmentation is performed on the second image, with the segmentation result output, while target area detection is performed on the third image, with the merging result stored in memory 43, and target segment detection is performed on a fourth image, with its detection result stored in memory 42; this cycle continues in input order.
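A minimal software analogue of this three-stage pipeline is sketched below, assuming the three processing steps are available as functions and using one thread per stage with bounded queues standing in for the ping-pong memories 41-44; all names and the queue depth of 2 are illustrative assumptions:

```python
import threading
import queue

def pipeline(images, detect_segments, detect_areas, segment_targets):
    q_seg, q_area, results = queue.Queue(2), queue.Queue(2), []

    def stage1():
        for img in images:                          # target segment detection
            q_seg.put((img, detect_segments(img)))
        q_seg.put(None)

    def stage2():
        while (item := q_seg.get()) is not None:    # target area detection
            img, segs = item
            q_area.put((img, detect_areas(segs)))
        q_area.put(None)

    def stage3():
        while (item := q_area.get()) is not None:   # target segmentation
            img, areas = item
            results.append(segment_targets(img, areas))

    threads = [threading.Thread(target=f) for f in (stage1, stage2, stage3)]
    for t in threads: t.start()
    for t in threads: t.join()
    return results
```

With three or more images in flight, each stage is busy with a different image at any time, mirroring the schedule of Fig. 4.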
In one embodiment, a high-resolution image to be detected can be divided into several sub-images by row number, and these sub-images are then processed in parallel according to the method described above.
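For instance, splitting an image into horizontal row bands could look like the following sketch (the band height of 256 rows is an arbitrary assumption):

```python
import numpy as np

def split_into_subimages(image, rows_per_band=256):
    # Divide the image into horizontal bands so each band can be processed in parallel.
    return [image[r:r + rows_per_band] for r in range(0, image.shape[0], rows_per_band)]
```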
In the above embodiments, the present invention performs the target segment detection, the target area detection, and the target segmentation in parallel on different images to be detected at the same time, thereby accelerating processing and improving the real-time performance of image segmentation.
Fig. 5 shows the flow chart of another embodiment of the image partition method according to the present invention.
As shown in Fig. 5, the above target segment detection can be divided into:
Step 501, binarizing the image to be detected to obtain a binary map.
For example, downsampling, edge detection, dilation, and erosion can be performed on the image to be detected, so as to obtain the binary map.
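A software analogue of this preprocessing chain is sketched here with OpenCV; the scale factor, kernel size, and Canny thresholds are illustrative assumptions, not values prescribed by the patent:

```python
import cv2
import numpy as np

def make_binary_map(image_gray):
    small = cv2.resize(image_gray, None, fx=0.5, fy=0.5,
                       interpolation=cv2.INTER_AREA)   # downsampling
    edges = cv2.Canny(small, 50, 150)                  # edge detection
    kernel = np.ones((3, 3), np.uint8)
    dilated = cv2.dilate(edges, kernel)                # dilation closes gaps in the edges
    cleaned = cv2.erode(dilated, kernel)               # erosion removes isolated noise
    return (cleaned > 0).astype(np.uint8)              # 0/1 binary map
```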
Step 502, reading the binary map line by line and performing target segment positioning.
For example, the binary map can be read line by line; the column coordinate of the first pixel whose value is 1 is saved as the start position of a target segment, the column coordinate of the first pixel whose value changes from 1 to 0 is saved as the end position of the target segment, and the row number where the target segment lies is saved.
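The scan just described can be mirrored in software as in this sketch (the trailing-zero padding is an added convenience so that a run reaching the last column is still closed; names are illustrative):

```python
def find_segments_in_row(row_bits, row_number):
    """row_bits: sequence of 0/1 pixel values for one row of the binary map."""
    segments, start = [], None
    for col, v in enumerate(list(row_bits) + [0]):        # pad so a run ending at the edge is closed
        if v == 1 and start is None:
            start = col                                    # first 1-valued pixel: segment start
        elif v == 0 and start is not None:
            segments.append((row_number, start, col))      # value changes 1 -> 0: segment end
            start = None
    return segments
```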
In one embodiment, the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning are performed in parallel on different rows of the image to be detected.
For example, downsampling is performed on the image to be detected line by line, and rows 1 to 1024 are output in sequence; while the 6th row is being downsampled, edge processing is performed on rows 1 to 5, and the edge detection result of the 1st row is obtained and output; while the 7th row is being downsampled, edge processing is performed on rows 2 to 6, and the edge detection result of the 2nd row is obtained and output, and so on; while edge detection is performed on the 4th row (depending on the size of the edge detection operator, this may be performed jointly on rows 4 to 8), dilation is performed on the edge detection results of rows 1 to 3, and the dilation result of the 1st row is obtained and output; while edge detection is performed on the 5th row (depending on the size of the edge detection operator, this may be performed jointly on rows 5 to 9), dilation is performed on the edge detection results of rows 2 to 4, and the dilation result of the 2nd row is obtained and output, and so on; while dilation is performed on the 4th row (depending on the size of the dilation operator, this may be performed jointly on rows 4 to 6), erosion is performed on the dilation results of rows 1 to 3, and the erosion result of the 1st row is obtained and output; while dilation is performed on the 5th row (depending on the size of the dilation operator, this may be performed jointly on rows 5 to 7), erosion is performed on the dilation results of rows 2 to 4, and the erosion result of the 2nd row is obtained and output; processing continues in this way in the input order of the rows, the results are output line by line, and target segment positioning is performed.
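The row-level pipelining described above can be mimicked in software with generators that each keep only the small line buffer their operator needs; the 3×3 window size is an illustrative assumption, and np.roll wraps at the image border where a hardware line buffer would pad instead:

```python
from collections import deque
import numpy as np

def windowed(rows, window=3):
    # Emit a stack of `window` consecutive rows as soon as they are buffered,
    # mimicking the line buffers of the hardware pipeline.
    buf = deque(maxlen=window)
    for row in rows:
        buf.append(np.asarray(row))
        if len(buf) == window:
            yield np.stack(buf)

def dilate_rows(rows):
    # 3x3 dilation of the middle row of each window, one output row at a time.
    for win in windowed(rows, 3):
        v = win.max(axis=0)                                              # vertical maximum
        yield np.maximum.reduce([v, np.roll(v, 1), np.roll(v, -1)])      # horizontal maximum

def erode_rows(rows):
    # 3x3 erosion, analogous to dilate_rows but taking minima.
    for win in windowed(rows, 3):
        v = win.min(axis=0)
        yield np.minimum.reduce([v, np.roll(v, 1), np.roll(v, -1)])
```

Chaining `erode_rows(dilate_rows(edge_rows))` then behaves like the hardware arrangement in which erosion of row r can start as soon as the dilation results of rows r−1 to r+1 are available.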
In the above embodiments, the present invention can simultaneously perform the downsampling, edge detection, dilation, erosion, and target segment positioning in parallel on different rows of the image to be detected, so that the processing time is greatly reduced and the real-time performance of image segmentation is improved.
Fig. 6 shows the structure chart of one embodiment of the target area detection means according to the present invention.
As shown in Fig. 6, the device includes a target segment information acquisition unit 61, a target area merging unit 62, and a target area information acquisition unit 63.
The target segment information acquisition unit 61 obtains the binary map corresponding to the image to be processed and the position information of the target segments in every row of the binary map.
The target area merging unit 62 reads the target segments row by row in the binary map, and merges target segments whose column coordinate intervals overlap into one target area.
In one embodiment, the target area merging unit 62 reads the target segments row by row in the binary map, takes one of the target segments as the initial target segment, and merges the initial target segment with the target segments in other rows whose column coordinate intervals overlap that of the initial target segment into one target area.
The target area information acquisition unit 63 records the position information of the target area according to the position information of the target segments contained in the target area.
In one embodiment, the device further includes a merge-finished judging unit 71.
If the target segments of two consecutive rows in the binary map have no column coordinate interval overlapping that of the already merged target segments, the merge-finished judging unit 71 judges that the merging of the target area is finished, and notifies the target area merging unit 62 to read the target segments in the binary image that have not yet been merged row by row again as initial target segments and carry out the merging of the next target area.
For example, once the merging of a target area is finished, the process jumps back to the 1st row of the binary map, reads the position information of an initial target segment that has not yet been merged, compares it with the target segments in the 2nd row that have not yet been merged, and so on, until all target segments have been merged.
In the above embodiment, the present invention only needs the position information of the target segments in the image: the target segments contained in every row of the binary map are merged row by row, and the position information of the target areas in the image to be detected is thereby obtained. This is applicable to various types of images to be detected, thereby improving the applicability of image segmentation.
Fig. 8 shows the structure chart of one embodiment of the image segmenting device according to the present invention.
As shown in Fig. 8, the device includes a target segment detection unit 81, a target area detection device 82, and a target segmentation unit 83. The function of the target area detection device 82 is described in any of the above embodiments and is not repeated here.
The target segment detection unit 81 binarizes the image to be detected line by line to obtain a binary map, and detects the start coordinate and end coordinate of each target segment in the binary map as the position information of that target segment; the target segmentation unit 83 segments the target areas out of the image to be detected according to their position information. For example, from the position information of the target areas obtained by the target area detection device 82, the target segmentation unit 83 can calculate the coordinates of the four vertices of each target area, thereby determining the position of the target area in the image to be detected and reading it out.
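As an illustration, the four vertex coordinates and the cropped region can be obtained from the recorded position information as in this sketch, where the tuple layout follows the merging sketch above and is an assumption:

```python
def crop_target_area(image, area):
    # area: (min_start_col, max_end_col, start_row, end_row)
    min_c, max_c, r0, r1 = area
    vertices = [(r0, min_c), (r0, max_c), (r1, min_c), (r1, max_c)]  # four corners as (row, col)
    return vertices, image[r0:r1 + 1, min_c:max_c + 1]               # read the target area out
```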
In one embodiment, the target segment detection unit 81, the target area detection device 82, and the target segmentation unit 83 perform their respective processing in parallel on different images to be detected.
For example, the parallel-processing capability of an FPGA (Field-Programmable Gate Array) can be used so that the target segment detection unit 81, the target area detection device 82, and the target segmentation unit 83 process different images to be detected at the same time; the target segment detection results and target area detection results of the different images to be detected produced during processing can be stored in different SRAMs (Static Random Access Memory).
In another embodiment, as shown in Fig. 9, the target segment detection unit 81 includes a binary map acquisition subunit 911 and a target segment positioning subunit 912.
The binary map acquisition subunit 911 performs downsampling, edge detection, dilation, and erosion on the image to be detected, so as to obtain the binary map. The target segment positioning subunit 912 reads the binary map line by line, saves the column coordinate of the first pixel whose value is 1 as the start position of a target segment, saves the column coordinate of the first pixel whose value changes from 1 to 0 as the end position of the target segment, and saves the row number where the target segment lies, thereby completing the target segment positioning.
For example, the target segment detection unit 81 can implement the edge detection through Sobel gradient calculation, non-maximum suppression, double-threshold detection, and edge linking; this processing can also be performed in parallel on different images to be detected at the same time to complete the edge detection.
In one embodiment, the target segment detection unit 81 performs the downsampling, edge detection, dilation, erosion, and target segment positioning in parallel on different rows of the image to be detected.
In the above embodiments, the device of the present invention can perform the target segment detection, target area detection, and target segmentation in parallel on different images to be detected at the same time, and the target segment detection unit contained in the device can at the same time perform the downsampling, edge detection, dilation, erosion, and target segment positioning in parallel on different rows of the image to be detected, so that the processing time is greatly reduced and the real-time performance of image segmentation is improved.
So far, the target area detection method and device and the image segmentation method and device according to the present invention have been described in detail. In order to avoid obscuring the concept of the present invention, some details well known in the art are not described. Based on the above description, those skilled in the art can fully understand how to implement the technical solutions disclosed herein.
The method and system of the present invention may be implemented in many ways, for example by software, hardware, firmware, or any combination of software, hardware, and firmware. The above order of the steps of the method is merely illustrative, and the steps of the method of the present invention are not limited to the order specifically described above unless otherwise specifically stated. In addition, in some embodiments, the present invention may also be embodied as programs recorded in a recording medium, these programs including machine-readable instructions for implementing the method according to the present invention. Thus, the present invention also covers a recording medium storing a program for performing the method according to the present invention.
Although some specific embodiments of the present invention have been described in detail by way of example, those skilled in the art should understand that the above examples are merely illustrative and are not intended to limit the scope of the present invention. Those skilled in the art should understand that the above embodiments can be modified without departing from the scope and spirit of the present invention. The scope of the present invention is defined by the following claims.

Claims (20)

1. A target area detection method, including:
obtaining a binary map corresponding to an image to be processed and the position information of the target segments in every row of the binary map;
reading the target segments in the binary map row by row, and merging target segments whose column coordinate intervals overlap into one target area; and
recording the position information of the target area according to the position information of the target segments contained in the target area.
2. The method according to claim 1, wherein reading the target segments in the binary map row by row and merging target segments whose column coordinate intervals overlap into one target area includes:
reading the target segments in the binary map row by row, taking one of the target segments as an initial target segment; and
merging the initial target segment with the target segments in other rows whose column coordinate intervals overlap that of the initial target segment into one target area.
3. The method according to claim 2, further including:
if, in the binary map, the target segments of two consecutive rows have no column coordinate interval overlapping that of the already merged target segments, finishing the merging of the target area; and
reading the target segments in the binary map that have not yet been merged row by row again as initial target segments, and carrying out the merging of the next target area.
4. The method according to claim 1, wherein
the position information of a target segment includes the start coordinate and the end coordinate of the target segment; and
the position information of a target area includes the minimum start column coordinate, the maximum end column coordinate, the start row number, and the end row number of the target area.
5. An image segmentation method, including:
target segment detection, in which an image to be detected is binarized line by line to obtain a binary map, and the start coordinate and end coordinate of each target segment in the binary map are detected as the position information of that target segment;
target area detection, in which the position information of target areas is obtained by the target area detection method according to any one of claims 1 to 4; and
target segmentation, in which the target areas are segmented out of the image to be detected according to their position information.
6. The method according to claim 5, wherein
the target segment detection, the target area detection, and the target segmentation are performed in parallel on different images to be detected.
7. The method according to claim 6, wherein performing the target segment detection, the target area detection, and the target segmentation in parallel on different images to be detected includes:
performing the target segment detection on a first image;
performing the target area detection on the first image while performing the target segment detection on a second image;
performing the target segmentation on the first image and the target segment detection on a third image while performing the target area detection on the second image;
performing the target area detection on the third image and the target segment detection on a fourth image while performing the target segmentation on the second image; and
continuing the above processing, in input order, for each subsequently input image to be detected, until all the images to be detected have been processed.
8. The method according to claim 5, wherein the target segment detection includes:
performing downsampling, edge detection, dilation, and erosion on the image to be detected, so as to obtain the binary map; and
reading the binary map line by line, saving the column coordinate of the first pixel whose value is 1 as the start position of a target segment, saving the column coordinate of the first pixel whose value changes from 1 to 0 as the end position of the target segment, and saving the row number where the target segment lies, thereby completing the target segment positioning.
9. The method according to claim 8, wherein
the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning are performed in parallel on different rows of the image to be detected.
10. The method according to claim 9, wherein performing the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning in parallel on different rows of the image to be detected includes:
performing the downsampling on the image to be detected line by line, and outputting the result line by line;
while performing the downsampling on the row given by a first preset value, performing the edge detection on rows 1 to (first preset value − 1); while performing the downsampling on row (first preset value + 1), performing the edge detection on rows 2 to the first preset value; continuing in this way in the input order of the rows, and outputting the result line by line;
while performing the edge detection on the row given by a second preset value, performing the dilation on rows 1 to (second preset value − 1); while performing the edge detection on row (second preset value + 1), performing the dilation on rows 2 to the second preset value; continuing in this way in the input order of the rows, and outputting the result line by line; and
while performing the dilation on the row given by a third preset value, performing the erosion on rows 1 to (third preset value − 1); while performing the dilation on row (third preset value + 1), performing the erosion on rows 2 to the third preset value; continuing in this way in the input order of the rows, outputting the result line by line, and performing the target segment positioning.
11. A target area detection device, including:
a target segment information acquisition unit, configured to obtain a binary map corresponding to an image to be processed and the position information of the target segments in every row of the binary map;
a target area merging unit, configured to read the target segments in the binary map row by row and merge target segments whose column coordinate intervals overlap into one target area; and
a target area information acquisition unit, configured to record the position information of the target area according to the position information of the target segments contained in the target area.
12. The device according to claim 11, wherein
the target area merging unit is configured to read the target segments in the binary map row by row, take one of the target segments as an initial target segment, and merge the initial target segment with the target segments in other rows whose column coordinate intervals overlap that of the initial target segment into one target area.
13. The device according to claim 12, further including:
a merge-finished judging unit, configured to judge, if the target segments of two consecutive rows in the binary map have no column coordinate interval overlapping that of the already merged target segments, that the merging of the target area is finished, and to notify the target area merging unit to read the target segments in the binary image that have not yet been merged row by row again as initial target segments and carry out the merging of the next target area.
14. The device according to claim 11, wherein
the position information of a target segment includes the start coordinate and the end coordinate of the target segment; and
the position information of a target area includes the minimum start coordinate, the maximum end coordinate, the start row number, and the end row number of the target area.
15. An image segmentation device, including:
a target segment detection unit, configured to binarize an image to be detected line by line to obtain a binary map, and detect the start coordinate and end coordinate of each target segment in the binary map as the position information of that target segment;
the target area detection device according to any one of claims 11 to 14, configured to obtain the position information of target areas; and
a target segmentation unit, configured to segment the target areas out of the image to be detected according to their position information.
16. The device according to claim 15, wherein
the target segment detection unit, the target area detection device, and the target segmentation unit perform their respective processing in parallel on different images to be detected.
17. The device according to claim 15, wherein the target segment detection unit includes:
a binary map acquisition subunit, configured to perform downsampling, edge detection, dilation, and erosion on the image to be detected, so as to obtain the binary map; and
a target segment positioning subunit, configured to read the binary map line by line, save the column coordinate of the first pixel whose value is 1 as the start position of a target segment, save the column coordinate of the first pixel whose value changes from 1 to 0 as the end position of the target segment, and save the row number where the target segment lies, thereby completing the target segment positioning.
18. The device according to claim 17, wherein
the target segment detection unit performs the downsampling, the edge detection, the dilation, the erosion, and the target segment positioning in parallel on different rows of the image to be detected.
19. An image segmentation device, including:
a memory; and
a processor coupled to the memory, the processor being configured to perform, based on instructions stored in the memory, the method according to any one of claims 1 to 10.
20. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 10.
CN201710237893.5A 2017-04-13 2017-04-13 Target area detection method and device, and image segmentation method and device Active CN107240101B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710237893.5A CN107240101B (en) 2017-04-13 2017-04-13 Target area detection method and device, and image segmentation method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710237893.5A CN107240101B (en) 2017-04-13 2017-04-13 Target area detection method and device, and image segmentation method and device

Publications (2)

Publication Number Publication Date
CN107240101A true CN107240101A (en) 2017-10-10
CN107240101B CN107240101B (en) 2021-01-29

Family

ID=59983858

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710237893.5A Active CN107240101B (en) 2017-04-13 2017-04-13 Target area detection method and device, and image segmentation method and device

Country Status (1)

Country Link
CN (1) CN107240101B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107885439A (en) * 2017-12-01 2018-04-06 维沃移动通信有限公司 A kind of note dividing method and mobile terminal
CN116934774A (en) * 2023-06-30 2023-10-24 安徽大学 Quick and high-precision panoramic image clipping method

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101198033A (en) * 2007-12-21 2008-06-11 北京中星微电子有限公司 Locating method and device for foreground image in binary image
CN101231699A (en) * 2007-12-11 2008-07-30 长安大学 Method for detecting vehicle existence base on image texture
US20090067721A1 (en) * 2007-09-12 2009-03-12 Kim Min-Seok Image processing apparatus and method and a computer readable medium having computer executable instructions stored thereon for performing the method
CN101587622A (en) * 2009-06-18 2009-11-25 任芳 Forest rocket detection and recognition methods and equipment based on video image intelligent analysis
US20100183225A1 (en) * 2009-01-09 2010-07-22 Rochester Institute Of Technology Methods for adaptive and progressive gradient-based multi-resolution color image segmentation and systems thereof
CN101799968A (en) * 2010-01-13 2010-08-11 任芳 Detection method and device for oil well intrusion based on video image intelligent analysis
US20100246926A1 (en) * 2009-03-30 2010-09-30 Ge Healthcare Bio-Sciences Corp. System and method for distinguishing between biological materials
CN102169093A (en) * 2010-12-20 2011-08-31 湖南大学 Multi-station machine vision imaging detection method and system based on graphics processor
CN102346913A (en) * 2011-09-20 2012-02-08 宁波大学 Simplification method of polygon models of image
CN102831683A (en) * 2012-08-28 2012-12-19 华南理工大学 Pedestrian flow counting-based intelligent detection method for indoor dynamic cold load
US20130236092A1 (en) * 2008-02-04 2013-09-12 Eyep, Inc. Modified Propagated Last Labeling System and Method for Connected Components
CN104008401A (en) * 2014-05-07 2014-08-27 中国科学院信息工程研究所 Method and device for image character recognition
CN104680531A (en) * 2015-02-28 2015-06-03 西安交通大学 Connection flux statistical information extraction method and VLSI structure
CN105225236A (en) * 2015-09-21 2016-01-06 中国科学院半导体研究所 A kind of bianry image connected region paralleled detection method and system
CN105513062A (en) * 2015-11-30 2016-04-20 合肥工业大学 Multi-target pixel counting method based on color linear array CCD
CN105740827A (en) * 2016-02-02 2016-07-06 大连楼兰科技股份有限公司 Stop line detection and ranging algorithm on the basis of quick sign communication
CN105844621A (en) * 2016-03-17 2016-08-10 阜阳市飞扬印务有限公司 Method for detecting quality of printed matter


Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
LIFENG HE: "A RUN-BASED ONE-AND-A-HALF-SCAN CONNECTED-COMPONENT LABELING ALGORITHM", 《INTERNATIONAL JOURNAL OF PATTERN RECOGNITION》 *
P. CHEN 等: "Block-run-based connected component labelling algorithm for GPGPU using shared memory", 《ELECTRONICS LETTERS》 *
Zhang Guilin et al.: "A connected-region labeling algorithm based on run-length codes", 《Journal of Huazhong University of Science and Technology》 *
Fan Mingzhe: "Research on detection of artificial shallow-buried bunker targets based on infrared images", 《China Master's Theses Full-text Database, Information Science and Technology (monthly)》 *
Lu Kezhong et al.: "A parallel region labeling algorithm based on run-length codes", 《Computer Engineering and Applications》 *
Lu Jianhua: "A license plate location algorithm based on mathematical morphology and line scanning", 《Computer Era》 *
Gao Hongbo et al.: "A new algorithm for connected-region labeling in binary images", 《Journal of Computer Applications》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107885439A (en) * 2017-12-01 2018-04-06 维沃移动通信有限公司 A kind of note dividing method and mobile terminal
CN107885439B (en) * 2017-12-01 2020-06-26 维沃移动通信有限公司 Note segmentation method and mobile terminal
CN116934774A (en) * 2023-06-30 2023-10-24 安徽大学 Quick and high-precision panoramic image clipping method
CN116934774B (en) * 2023-06-30 2024-03-22 安徽大学 Quick and high-precision panoramic image clipping method

Also Published As

Publication number Publication date
CN107240101B (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN104732510B (en) A kind of camera lens blackspot detection method and device
CN108805871A (en) Blood-vessel image processing method, device, computer equipment and storage medium
US8811750B2 (en) Apparatus and method for extracting edge in image
CN105893957B (en) View-based access control model lake surface ship detection recognition and tracking method
EP4060616A1 (en) Super-resolution reconstruction preprocessing method and super-resolution reconstruction method for ultrasound contrast image
TW201800975A (en) Human hand detection tracking method and device
CN107240101A (en) Target area detection method and device, image partition method and device
CN113269720A (en) Defect detection method and system for straight welded pipe and readable medium
CN114581744A (en) Image target detection method, system, equipment and storage medium
CN106537451B (en) A kind of blood vessel ridge point extracting method and device based on image gradient vector flow field
Wang et al. Prohibited items detection in baggage security based on improved YOLOv5
CN110132975B (en) Method and device for detecting surface of cladding of nuclear fuel rod
CN113449538A (en) Visual model training method, device, equipment and storage medium
CN106228571B (en) The object tracking detection method and device of object manipulator
CN109636832A (en) Stop detection method, device, electronic equipment and storage medium
Zhou et al. BV-Net: Bin-based Vector-predicted Network for tubular solder joint detection
KR20220098309A (en) Object detection method, apparatus and electronic device
US9275467B2 (en) Incremental contour-extraction scheme for binary image segments
CN113902742B (en) TFT-LCD detection-based defect true and false judgment method and system
Gharsallah et al. Image segmentation for defect detection based on level set active contour combined with saliency map
CN107346543B (en) Blood vessel center line processing method and device, terminal and storage medium
CN114612710A (en) Image detection method, image detection device, computer equipment and storage medium
CN105719269B (en) A kind of target object determines method and device
CN114037820A (en) Infrared weak and small multi-target detection method and device based on multi-feature fusion
CN112884755A (en) Method and device for detecting contraband

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220714

Address after: 541004 No. d-07, information industry park, high tech Zone, Guilin, Guangxi Zhuang Autonomous Region

Patentee after: URIT Medical Electronic Co.,Ltd.

Patentee after: GUILIN MEASURING & CUTTING TOOL Co.,Ltd.

Address before: 541004 No. d-07, information industry park, high tech Zone, Guilin, Guangxi Zhuang Autonomous Region

Patentee before: URIT Medical Electronic Co.,Ltd.

TR01 Transfer of patent right