CN104182751B - Object edge extracting method and device - Google Patents

Info

Publication number: CN104182751B (application CN201410361293.6A)
Authority: CN (China)
Prior art keywords: line segment, image, coordinate, edge, edge line
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN201410361293.6A
Other languages: Chinese (zh)
Other versions: CN104182751A (en)
Inventors: 徐晓舟 (Xu Xiaozhou), 陈志军 (Chen Zhijun), 秦秋平 (Qin Qiuping)
Current Assignee: Xiaomi Inc (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Xiaomi Inc
Application filed by Xiaomi Inc
Priority to CN201410361293.6A
Publication of CN104182751A
Application granted
Publication of CN104182751B


Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to an object edge extraction method and device, and belongs to the field of image processing. The method includes: shooting at least two frames of images of a target object, and obtaining the sensor change parameter detected from the time each frame is shot to the time the next frame is shot; for a first image and a second image among the at least two frames, estimating a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter; extracting the edge line segments in the second image as second image edge line segments; filtering the second image edge line segments according to the second object edge line segment; and taking the remaining second image edge line segments as the object edge line segments of the second image. By using the object edge extraction result already obtained for one frame to extract the object edge in another frame, the present invention reduces the amount of calculation, saves edge extraction time, and improves edge extraction speed.

Description

Object edge extracting method and device
Technical field
The present disclosure relates to the field of image processing, and in particular to an object edge extraction method and device.
Background
An image edge is a basic feature of an image, and with the development and popularization of image recognition technology, research on edge extraction has become increasingly important. When extracting image edges, the image is first denoised to obtain a grayscale image, the edges of the grayscale image are extracted with an operator such as Sobel or Canny, and line-segment detection is then performed by a Hough transform to obtain the edge line segments in the grayscale image.
The object in an image may carry various patterns, such as the patterns printed on a bank card. If the edges of the object are to be extracted while the patterns on it are complex, the above edge extraction method yields many line segments, including both the edge line segments of the object and the edge line segments of the patterns on it. In that case, growing processing can be applied to the extracted line segments, and the line segments that intersect after growing and can enclose a closed region are taken as the edge line segments of the object.
In the course of implementing the present invention, the inventors found defects in the related art, for example: many line segments are extracted from the image, the amount of calculation for the line-segment growing processing is very large, and too much time is consumed, so that edge extraction is very slow.
Summary
In order to solve the problems in the related art, the present disclosure provides an object edge extraction method and device. The technical solutions are as follows:
According to a first aspect of the embodiments of the present disclosure, an object edge extraction method is provided, the method including:
shooting at least two frames of images of a target object, and obtaining the sensor change parameter detected from the time each frame is shot to the time the next frame is shot;
for a first image and a second image among the at least two frames, estimating a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot;
extracting the edge line segments in the second image as second image edge line segments;
filtering the second image edge line segments according to the estimated second object edge line segment;
taking the second image edge line segments remaining after filtering as the object edge line segments of the second image.
Estimating the second object edge line segment in the second image according to the first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot includes:
obtaining the coordinates of at least one endpoint of the first object edge line segment as at least one first coordinate;
obtaining the coordinate offset detected from the time the first image is shot to the time the second image is shot;
calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and taking the resulting line segment as the second object edge line segment.
Filtering the second image edge line segments according to the estimated second object edge line segment includes:
transforming the second object edge line segment from an original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;
transforming the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
in the specified coordinate system, filtering out the third edge points outside a preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
Taking the second image edge line segments remaining after filtering as the object edge line segments of the second image includes:
calculating the mean of the specified coordinates of the third edge points within the preset range of the second edge points;
inversely transforming the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and taking the line segment obtained by the inverse transformation as the object edge line segment of the second image.
Transforming the second object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge points includes:
transforming the second object edge line segment from the original coordinate system to the specified coordinate system using the following formula to obtain the specified coordinates of the second edge points:
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
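As an informal illustration only, the following Python sketch strings the claimed steps together. All helper names are hypothetical (they do not come from the patent), and the conventional extraction used for the first frame is passed in as a stub.

```python
# Hypothetical end-to-end sketch of the claimed steps; helper names are illustrative only.

def extract_object_edges(frames, sensor_changes, conventional_extract,
                         estimate, extract_segments, filter_segments):
    """frames: captured images of the target object.
    sensor_changes[i]: change parameter detected between shooting frames[i] and frames[i + 1]."""
    # The first frame is processed with the conventional (expensive) pipeline.
    object_edges = [conventional_extract(frames[0])]
    for i in range(1, len(frames)):
        estimated = estimate(object_edges[i - 1], sensor_changes[i - 1])   # second object edge segments
        candidates = extract_segments(frames[i])                           # second image edge segments
        object_edges.append(filter_segments(estimated, candidates))        # keep segments near the estimate
    return object_edges
```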
According to a second aspect of the embodiments of the present disclosure, an object edge extraction device is provided, the device including:
a change parameter acquisition module, configured to shoot at least two frames of images of a target object and obtain the sensor change parameter detected from the time each frame is shot to the time the next frame is shot;
an estimation module, configured to, for a first image and a second image among the at least two frames, estimate a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot;
an edge extraction module, configured to extract the edge line segments in the second image as second image edge line segments;
a filtering module, configured to filter the second image edge line segments according to the estimated second object edge line segment;
an object edge extraction module, configured to take the second image edge line segments remaining after filtering as the object edge line segments of the second image.
The estimation module includes:
a first coordinate acquisition unit, configured to obtain the coordinates of at least one endpoint of the first object edge line segment as at least one first coordinate;
a coordinate offset acquisition unit, configured to obtain the coordinate offset detected from the time the first image is shot to the time the second image is shot;
a second coordinate acquisition unit, configured to calculate the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
a connection unit, configured to connect the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and take the resulting line segment as the second object edge line segment.
The filtering module includes:
a first transform unit, configured to transform the second object edge line segment from an original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;
a second transform unit, configured to transform the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
a filtering unit, configured to, in the specified coordinate system, filter out the third edge points outside a preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
The object edge extraction module includes:
a mean calculation unit, configured to calculate the mean of the specified coordinates of the third edge points within the preset range of the second edge points;
an inverse transform unit, configured to inversely transform the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and take the line segment obtained by the inverse transformation as the object edge line segment of the second image.
The first transform unit is configured to transform the second object edge line segment from the original coordinate system to the specified coordinate system using the following formula to obtain the specified coordinates of the second edge points:
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In the method and device provided by these embodiments, the object edge line segment in the second image is estimated according to the object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot, and the edge line segments extracted from the second image are filtered according to the estimated object edge line segment, where the first image and the second image are different frames of images of the target object. This makes full use of the camera device's ability to shoot multiple frames: the object edge of one frame is extracted by using the object edge extraction result already obtained for another frame, which reduces the amount of calculation, saves edge extraction time, and improves edge extraction speed.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present invention, and together with the specification serve to explain the principles of the invention.
Fig. 1 is a flowchart of an object edge extraction method according to an exemplary embodiment;
Fig. 2 is a flowchart of an object edge extraction method according to an exemplary embodiment;
Fig. 3a is a schematic diagram of a bank card according to an exemplary embodiment;
Fig. 3b is a schematic diagram of bank card edges according to an exemplary embodiment;
Fig. 3c is a schematic diagram of second edge points according to an exemplary embodiment;
Fig. 3d is a schematic diagram of third edge points according to an exemplary embodiment;
Fig. 3e is a schematic diagram of a preset range according to an exemplary embodiment;
Fig. 3f is a schematic diagram of object edge line segments according to an exemplary embodiment;
Fig. 4 is a block diagram of an object edge extraction device according to an exemplary embodiment;
Fig. 5 is a block diagram of a device for object edge extraction according to an exemplary embodiment.
Detailed description of the embodiments
To make the purpose, technical solutions, and advantages of the present disclosure clearer, the present disclosure is described in further detail below with reference to the embodiments and the accompanying drawings. The exemplary embodiments of the present disclosure and their descriptions are used to explain the present disclosure and are not intended to limit it.
The embodiments of the present disclosure provide an object edge extraction method and device, which are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an object edge extraction method according to an exemplary embodiment. As shown in Fig. 1, the method includes the following steps:
In step 101, at least two frames of images of a target object are shot, and the sensor change parameter detected from the time each frame is shot to the time the next frame is shot is obtained.
In step 102, for a first image and a second image among the at least two frames, a second object edge line segment in the second image is estimated according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot.
In step 103, the edge line segments in the second image are extracted as second image edge line segments.
In step 104, the second image edge line segments are filtered according to the estimated second object edge line segment.
In step 105, the second image edge line segments remaining after filtering are taken as the object edge line segments of the second image.
In the method provided by this embodiment, the object edge line segment in the second image is estimated according to the object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot, and the edge line segments extracted from the second image are filtered according to the estimated object edge line segment, where the first image and the second image are different frames of images of the target object. This makes full use of the camera device's ability to shoot multiple frames: the object edge of one frame is extracted by using the object edge extraction result already obtained for another frame, which reduces the amount of calculation, saves edge extraction time, and improves edge extraction speed.
Estimating the second object edge line segment in the second image according to the first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot includes:
obtaining the coordinates of at least one endpoint of the first object edge line segment as at least one first coordinate;
obtaining the coordinate offset detected from the time the first image is shot to the time the second image is shot;
calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and taking the resulting line segment as the second object edge line segment.
Filtering the second image edge line segments according to the estimated second object edge line segment includes:
transforming the second object edge line segment from an original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;
transforming the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
in the specified coordinate system, filtering out the third edge points outside a preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
Taking the second image edge line segments remaining after filtering as the object edge line segments of the second image includes:
calculating the mean of the specified coordinates of the third edge points within the preset range of the second edge points;
inversely transforming the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and taking the line segment obtained by the inverse transformation as the object edge line segment of the second image.
Transforming the second object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge points includes:
transforming the second object edge line segment from the original coordinate system to the specified coordinate system using the following formula to obtain the specified coordinates of the second edge points:
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
All of the optional technical solutions above may be combined in any manner to form alternative embodiments of the present invention, which are not described one by one here.
Fig. 2 is a flowchart of an object edge extraction method according to an exemplary embodiment. The method is executed by an image processing device. As shown in Fig. 2, the method includes the following steps:
201. The image processing device shoots at least two frames of images of a target object, and obtains the sensor change parameter detected from the time each frame is shot to the time the next frame is shot.
The target object may be a bank card, an identity card, or another item. The image processing device is used to shoot images of the target object and perform edge extraction on the shot images; it may be a mobile phone, a computer, or the like, which is not limited in the embodiments of the present invention. When an image of the target object needs to be obtained, the camera device is aimed at the target object to shoot, and the image processing device performs edge extraction and image segmentation on the shot image to obtain the image of the target object. Afterwards, the image processing device may further perform operations such as feature extraction and image recognition on the image of the target object, which are not described here.
In practical applications, the target object may carry patterns. When the image processing device performs edge extraction on a shot image, it extracts both the edge line segments of the target object and the edge line segments of the patterns on it, making it difficult to distinguish the edge line segments of the target object. Referring to Fig. 3a, taking a bank card as the target object, the dashed lines represent the actual edges of the bank card, which carries the "Bank of China" and "VISA" patterns. When edge extraction is performed on the image of the bank card, the image shown in Fig. 3b is obtained, which includes both the edges of the bank card (shown as solid lines) and the edges of the "Bank of China" and "VISA" patterns.
In order to extract the edge line segments of the target object, the image processing device shoots at least two frames of images of the target object and detects, in real time during shooting, the sensor change parameter from the time each frame is shot to the time the next frame is shot. The sensor change parameter represents the moving direction and moving distance of the image processing device between shooting one frame and shooting the next; when the target object does not move, the image processing device can estimate the edge line segments of the next frame from the edge line segments extracted from one frame and the sensor change parameter. The sensor change parameter may be the coordinate offset of the image processing device and may be detected by a gravity sensor configured in the image processing device.
In this embodiment, obtaining the sensor change parameter detected from the time each frame is shot to the time the next frame is shot may include: during shooting, whenever the image processing device detects that a frame has been shot, it records the current sensor parameter, and for any two shot frames it calculates the difference between the sensor parameters recorded when the two frames were shot to obtain the sensor change parameter. Alternatively, during shooting, whenever the image processing device detects that a frame has been shot, it obtains the current sensor parameter, calculates the difference between the current sensor parameter and the sensor parameter of the previous frame, and records the difference as the sensor change parameter corresponding to the current frame; then, for any two shot frames, the sensor change parameters corresponding to each frame shot between the two are summed to obtain the sensor change parameter between the two frames. A sketch of this bookkeeping is given below.
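A minimal Python sketch of the two variants just described, under the assumption that the sensor reading is a 2D offset; the function and variable names are illustrative and not taken from the patent.

```python
# Sketch of the sensor-change bookkeeping described above (names are illustrative).
readings = []  # sensor reading recorded at the moment each frame is shot, e.g. an (x, y) offset

def record_frame_reading(sensor_xy):
    """Variant 1: store the raw reading taken when a frame is shot."""
    readings.append(sensor_xy)

def change_between(frame_a, frame_b):
    """Change parameter between two frames: difference of their recorded readings."""
    (ax, ay), (bx, by) = readings[frame_a], readings[frame_b]
    return (bx - ax, by - ay)

def change_by_accumulation(frame_a, frame_b):
    """Variant 2: equivalent result obtained by summing the per-frame deltas."""
    dx = sum(readings[i + 1][0] - readings[i][0] for i in range(frame_a, frame_b))
    dy = sum(readings[i + 1][1] - readings[i][1] for i in range(frame_a, frame_b))
    return (dx, dy)
```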
202. For a first image and a second image among the at least two frames, the image processing device determines the first object edge line segment in the first image and obtains the coordinates of at least one endpoint of the first object edge line segment as at least one first coordinate.
In this embodiment, the image processing device performs edge extraction on the first image to obtain the first object edge line segment. The first object edge line segment consists of at least one line segment and therefore also includes at least one endpoint. The image processing device may establish a coordinate system in advance; when the first object edge line segment is extracted, the coordinates of its at least one endpoint are determined according to the position of that endpoint in the coordinate system.
Before step 202, the method may further include: the image processing device establishes the coordinate system with the lower-left corner of the first image as the origin, the lower edge of the first image as the horizontal axis (X-axis), and the left edge of the first image as the vertical axis (Y-axis). This embodiment does not limit the origin, horizontal axis, or vertical axis of the established coordinate system.
It should be noted that the image processing device estimates the object edge line segments of the next frame from the object edge line segments of the previous frame. For the first frame, the image processing device may perform edge extraction on the first frame, apply growing processing to the obtained edge line segments, and take the line segments that intersect after growing and can enclose a closed region as the object edge line segments. In practical applications, the image processing device needs to perform edge extraction on the target object many times; once it has extracted the object edge line segments of the first frame, it can estimate the object edge line segments of the next frame from those segments and the sensor change parameter, and thereby extract the object edge line segments of the next frame, which greatly reduces the amount of calculation for edge extraction in the next frame and improves its edge extraction speed.
203. The image processing device obtains the coordinate offset detected from the time the first image is shot to the time the second image is shot, calculates the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate, connects the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and takes the resulting line segment as the second object edge line segment.
In this embodiment, the coordinate offset detected from the time the first image is shot to the time the second image is shot represents the moving direction and moving distance of the image processing device. When the image processing device shoots the first image and then, after moving, shoots the second image, the corresponding pixels in the second image also move relative to those in the first image: their moving direction is opposite to that of the image processing device, and their moving distance equals that of the image processing device. The image processing device therefore calculates the difference between the at least one first coordinate and the coordinate offset as the coordinates of the at least one endpoint in the second image, denoted as at least one second coordinate. Once the at least one second coordinate is obtained, the image processing device connects the corresponding coordinate points according to the connection relationship of the at least one endpoint in the first image to obtain at least one line segment, which is taken as the second object edge line segment.
The image processing device may calculate the at least one second coordinate using the following formula:
x1 = x0 − Δx, y1 = y0 − Δy;
where (x1, y1) is the second coordinate of an endpoint, (x0, y0) is the first coordinate of the endpoint, and (Δx, Δy) is the coordinate offset of the image processing device.
For example, suppose the unit of the coordinate system is mm and the first coordinate of an endpoint of the first object edge line segment is (x0, y0). If the coordinate offset detected by the image processing device is (1, 0), the image processing device has moved 1 mm in the positive X direction; compared with its position in the first image, the endpoint therefore moves 1 mm in the negative X direction in the second image, and its second coordinate is (x0 − 1, y0). A sketch of this estimation step follows.
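A minimal sketch of steps 202 and 203, assuming each edge segment is stored as a pair of endpoint coordinates; the data layout is an assumption made for illustration only.

```python
# Sketch of steps 202-203: predict the second object edge segments by shifting the
# endpoints of the first-image segments by the detected coordinate offset.
# The segment/offset formats are assumptions for illustration only.

def estimate_second_object_edges(first_edges, offset):
    """first_edges: list of ((x1, y1), (x2, y2)) endpoint pairs from the first image.
    offset: (dx, dy) coordinate offset detected between shooting the two images."""
    dx, dy = offset
    predicted = []
    for (x1, y1), (x2, y2) in first_edges:
        # Each endpoint moves opposite to the device, so subtract the offset.
        predicted.append(((x1 - dx, y1 - dy), (x2 - dx, y2 - dy)))
    return predicted
```

With the (1, 0) offset from the example above, an endpoint at (x0, y0) is predicted at (x0 − 1, y0).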
204. The image processing device denoises the second image to obtain a grayscale image of the second image, extracts the edges of the grayscale image, and uses a Hough transform detection operator to obtain the edge line segments in the grayscale image as second image edge line segments.
In this embodiment, the image processing device performs edge extraction on the second image: it denoises the second image using an image denoising algorithm to obtain the grayscale image of the second image, extracts the edges of the grayscale image using an edge operator, and uses a Hough transform detection operator to obtain the edge line segments in the grayscale image as the second image edge line segments. The second image edge line segments consist of at least one line segment and also include at least one endpoint.
The image denoising algorithm may be Gaussian filtering, mean filtering, or the like, and the edge operator may be the Sobel operator, the Canny operator, or the like, which are not limited in this embodiment. A sketch of this step is given below.
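A minimal sketch of step 204 using OpenCV, which the patent does not name; this is one possible realization, and the thresholds and kernel size are illustrative only.

```python
# Illustrative realization of step 204 with OpenCV (an assumption, not the patent's code).
import cv2
import numpy as np

def extract_second_image_edge_segments(second_image_bgr):
    gray = cv2.cvtColor(second_image_bgr, cv2.COLOR_BGR2GRAY)  # grayscale image
    denoised = cv2.GaussianBlur(gray, (5, 5), 0)               # denoising (Gaussian filtering)
    edges = cv2.Canny(denoised, 50, 150)                       # edge map (Canny operator)
    # Hough transform line-segment detection; each row is (x1, y1, x2, y2).
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180,
                               threshold=80, minLineLength=30, maxLineGap=10)
    return [] if segments is None else [tuple(int(v) for v in s) for s in segments[:, 0]]
```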
It should be noted that this embodiment takes the case where step 204 is performed after step 203 as an example. In fact, step 204 may also be performed before step 202, or simultaneously with steps 202 and 203, which is not limited in this embodiment.
205. The image processing device filters the second image edge line segments according to the second object edge line segment, and takes the second image edge line segments remaining after filtering as the object edge line segments of the second image.
In this embodiment, the image processing device estimates the second object edge line segment from the first object edge line segment and the coordinate offset, and extracts the second image edge line segments, which include both the edge line segments of the target object and the edge line segments of the patterns on it. The edge line segments close to the estimated second object edge line segment can be regarded as edge line segments of the target object, while those far from it can be regarded as edge line segments of the patterns on the target object. The image processing device therefore filters the second image edge line segments according to the second object edge line segment.
In this embodiment, "the image processing device filters the second image edge line segments according to the second object edge line segment" may include the following steps 205a-205c:
205a. The image processing device transforms the second object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, and transforms the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points.
A straight line in the original coordinate system is transformed into a point in the specified coordinate system, and a point in the original coordinate system is transformed into a curve in the specified coordinate system; the image processing device thus transforms the second object edge line segment into the second edge points and the second image edge line segments into the third edge points.
In this embodiment, the specified coordinate system may be an alpha-distance coordinate system. The image processing device applies a Hough transform to the second object edge line segment and the second image edge line segments, transforming both from the original coordinate system to the specified coordinate system. Further, the image processing device may transform the second object edge line segment from the original coordinate system to the specified coordinate system using the following formula:
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point. Alpha may represent the inclination angle of the second object edge line segment in the original coordinate system, and distance may represent the distance from the origin of the original coordinate system to the second object edge line segment.
It also applies the following formula to transform the second image edge line segments from the original coordinate system to the specified coordinate system:
where (x3, y3) and (x4, y4) are the original coordinates of the two endpoints of a second image edge line segment, and (alpha', distance') is the specified coordinate of the third edge point. Alpha' may represent the inclination angle of the second image edge line segment in the original coordinate system, and distance' may represent the distance from the origin of the original coordinate system to the second image edge line segment. A sketch of such a transform is given below.
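The patent gives the exact formulas only as images, so the sketch below uses a standard parameterization consistent with the description above: alpha as the segment's inclination angle and distance as the perpendicular distance from the origin to the line through the endpoints. It is an assumption, not the patent's formula.

```python
# Assumed line-to-(alpha, distance) mapping consistent with the text above.
import math

def segment_to_alpha_distance(x1, y1, x2, y2):
    alpha = math.atan2(y2 - y1, x2 - x1)        # inclination angle of the segment
    length = math.hypot(x2 - x1, y2 - y1)
    distance = abs(x1 * y2 - x2 * y1) / length  # perpendicular distance from the origin to the line
    return alpha, distance
```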
Fig. 3c is a schematic diagram of second edge points according to an exemplary embodiment, and Fig. 3d is a schematic diagram of third edge points according to an exemplary embodiment. When the image processing device transforms the second object edge line segment from the original coordinate system to the specified coordinate system, the second edge points obtained in the specified coordinate system are as shown in Fig. 3c; when it transforms the second image edge line segments from the original coordinate system to the specified coordinate system, the third edge points obtained in the specified coordinate system are as shown in Fig. 3d.
205b. In the specified coordinate system, the image processing device filters out the third edge points outside the preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
The preset range may be determined from the specified coordinates of a second edge point and a preset distance, and the preset distance may be determined according to the accuracy requirement of the edge extraction or set by a technician. The image processing device may take the second edge point as the center and the preset distance as the radius to determine a circular region and use the circular region as the preset range of the second edge point; third edge points outside the circular region may be regarded as transformed from the edge line segments of the patterns on the target object and are therefore filtered out. Of course, the image processing device may instead determine a square region centered on the second edge point, with the preset distance as the distance from the center to each side, and use the square region as the preset range; this embodiment does not limit the preset range.
Referring to Fig. 3e, the preset range of each second edge point determined by the image processing device is shown as a dashed region in Fig. 3e. The image processing device filters out the third edge points outside these preset ranges and retains the third edge points within them.
In this embodiment, the image processing device performs cluster rejection on the third edge points in the specified coordinate system according to the preset range, filtering out the third edge points outside the preset range and retaining only those within it, so that in the end only the line segments in the edge region of the target object are extracted and the line segments of non-edge regions are filtered out. In particular, for a target object such as a bank card or a credit card on which the card number is embossed, the texture of the embossed card number can be mistaken for edges during edge extraction and affect the result; in this embodiment the line segments of the card number can be filtered out, avoiding the influence of such pseudo-edges. A sketch of this filtering step follows.
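A minimal sketch of step 205b under the circular-region variant; applying a plain Euclidean radius in the (alpha, distance) plane follows the description above, and the names are illustrative.

```python
# Sketch of step 205b: keep only the third edge points lying within the preset
# circular range of some second edge point in the (alpha, distance) plane.
import math

def filter_third_points(second_points, third_points, preset_distance):
    """second_points, third_points: lists of (alpha, distance) pairs."""
    kept = []
    for a3, d3 in third_points:
        if any(math.hypot(a3 - a2, d3 - d2) <= preset_distance for a2, d2 in second_points):
            kept.append((a3, d3))
    return kept
```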
205c. The image processing device calculates the mean of the specified coordinates of the third edge points within the preset range of each second edge point, inversely transforms the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and takes the line segment obtained by the inverse transformation as the object edge line segment of the second image.
In this embodiment, the third edge points within the preset range may be regarded as transformed from the edge line segments of the target object, so the mean of the specified coordinates of the third edge points within the preset range of each second edge point can be calculated, the coordinate point corresponding to the mean specified coordinate is taken as the transformed coordinate point of the object edge line segment, and inversely transforming this coordinate point from the specified coordinate system back to the original coordinate system yields the object edge line segment.
Referring to Fig. 3e, the image processing device calculates the mean of the specified coordinates of the third edge points within each preset range and inversely transforms the coordinate points corresponding to the mean specified coordinates from the specified coordinate system back to the original coordinate system; the resulting object edge line segments are shown as the solid edge lines in Fig. 3f. A sketch of this step follows.
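A minimal sketch of step 205c. The patent's inverse-transform formula is given only as an image in the source, so the mapping back to the original coordinate system below (foot of the perpendicular from the origin plus a clipped direction vector, with an assumed sign convention) is an illustration, not the patent's formula.

```python
# Sketch of step 205c: average the retained (alpha, distance) points, then map the
# mean back to a segment in the original coordinate system (assumed sign convention).
import math

def mean_specified_coordinate(points):
    n = len(points)
    return (sum(a for a, _ in points) / n, sum(d for _, d in points) / n)

def alpha_distance_to_segment(alpha, distance, t_min, t_max):
    """Return two points on the recovered line, clipped to the parameter range
    [t_min, t_max] along the line direction, as the object edge segment."""
    fx, fy = -distance * math.sin(alpha), distance * math.cos(alpha)  # foot of the perpendicular from the origin
    dx, dy = math.cos(alpha), math.sin(alpha)                         # line direction
    return ((fx + t_min * dx, fy + t_min * dy), (fx + t_max * dx, fy + t_max * dy))
```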
In the method provided by this embodiment, the object edge line segment in the second image is estimated according to the object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot, and the edge line segments extracted from the second image are filtered according to the estimated object edge line segment, where the first image and the second image are different frames of images of the target object. This makes full use of the camera device's ability to shoot multiple frames: the object edge of one frame is extracted by using the object edge extraction result already obtained for another frame, which reduces the amount of calculation, saves edge extraction time, and improves edge extraction speed.
Fig. 4 is a block diagram of an object edge extraction device according to an exemplary embodiment. Referring to Fig. 4, the device includes: a change parameter acquisition module 401, an estimation module 402, an edge extraction module 403, a filtering module 404, and an object edge extraction module 405.
The change parameter acquisition module 401 is configured to shoot at least two frames of images of a target object and obtain the sensor change parameter detected from the time each frame is shot to the time the next frame is shot;
the estimation module 402 is configured to, for a first image and a second image among the at least two frames, estimate a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot;
the edge extraction module 403 is configured to extract the edge line segments in the second image as second image edge line segments;
the filtering module 404 is configured to filter the second image edge line segments according to the estimated second object edge line segment;
the object edge extraction module 405 is configured to take the second image edge line segments remaining after filtering as the object edge line segments of the second image.
In the device provided by this embodiment, the object edge line segment in the second image is estimated according to the object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot, and the edge line segments extracted from the second image are filtered according to the estimated object edge line segment, where the first image and the second image are different frames of images of the target object. This makes full use of the camera device's ability to shoot multiple frames: the object edge of one frame is extracted by using the object edge extraction result already obtained for another frame, which reduces the amount of calculation, saves edge extraction time, and improves edge extraction speed.
The estimation module 402 includes:
a first coordinate acquisition unit, configured to obtain the coordinates of at least one endpoint of the first object edge line segment as at least one first coordinate;
a coordinate offset acquisition unit, configured to obtain the coordinate offset detected from the time the first image is shot to the time the second image is shot;
a second coordinate acquisition unit, configured to calculate the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
a connection unit, configured to connect the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and take the resulting line segment as the second object edge line segment.
The filtering module 404 includes:
a first transform unit, configured to transform the second object edge line segment from an original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;
a second transform unit, configured to transform the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
a filtering unit, configured to, in the specified coordinate system, filter out the third edge points outside a preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
The object edge extraction module 405 includes:
a mean calculation unit, configured to calculate the mean of the specified coordinates of the third edge points within the preset range of the second edge points;
an inverse transform unit, configured to inversely transform the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and take the line segment obtained by the inverse transformation as the object edge line segment of the second image.
The first transform unit is configured to transform the second object edge line segment from the original coordinate system to the specified coordinate system using the following formula to obtain the specified coordinates of the second edge points:
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
All of the optional technical solutions above may be combined in any manner to form alternative embodiments of the present invention, which are not described one by one here.
With regard to the device in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiments of the method and will not be elaborated here.
It should be noted that when the object edge extraction device provided in the above embodiment extracts an object edge, the division into the functional modules described above is merely illustrative; in practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the image processing device may be divided into different functional modules to complete all or part of the functions described above. In addition, the object edge extraction device provided in the above embodiment belongs to the same concept as the object edge extraction method embodiments; its specific implementation process is detailed in the method embodiments and is not repeated here.
Fig. 5 is a block diagram of a device 500 for object edge extraction according to an exemplary embodiment. For example, the device 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
Referring to Fig. 5, the device 500 may include one or more of the following components: a processing component 502, a memory 504, a power component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.
The processing component 502 generally controls the overall operation of the device 500, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 502 may include one or more processors 520 to execute instructions so as to complete all or part of the steps of the above method. In addition, the processing component 502 may include one or more modules to facilitate interaction between the processing component 502 and other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
The memory 504 is configured to store various types of data to support operation on the device 500. Examples of such data include instructions for any application or method operating on the device 500, contact data, phonebook data, messages, pictures, videos, and so on. The memory 504 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The power component 506 provides power to the various components of the device 500. The power component 506 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 500.
The multimedia component 508 includes a screen providing an output interface between the device 500 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe. In some embodiments, the multimedia component 508 includes a front camera and/or a rear camera. When the device 500 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera can receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focusing and optical zoom capabilities.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a microphone (MIC), which is configured to receive external audio signals when the device 500 is in an operating mode such as a call mode, a recording mode, or a speech recognition mode. The received audio signals may be further stored in the memory 504 or sent via the communication component 516. In some embodiments, the audio component 510 also includes a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, volume buttons, a start button, and a lock button.
The sensor component 514 includes one or more sensors for providing status assessments of various aspects of the device 500. For example, the sensor component 514 may detect the open/closed state of the device 500 and the relative positioning of components such as the display and keypad of the device 500; the sensor component 514 may also detect a change in the position of the device 500 or of a component of the device 500, the presence or absence of user contact with the device 500, the orientation or acceleration/deceleration of the device 500, and a change in the temperature of the device 500. The sensor component 514 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate wired or wireless communication between the device 500 and other devices. The device 500 can access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio-frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the device 500 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions is also provided, such as the memory 504 including instructions, where the instructions can be executed by the processor 520 of the device 500 to complete the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium is provided; when the instructions in the storage medium are executed by a processor of a mobile terminal, the mobile terminal is enabled to perform an object edge extraction method, the method including:
shooting at least two frames of images of a target object, and obtaining the sensor change parameter detected from the time each frame is shot to the time the next frame is shot;
for a first image and a second image among the at least two frames, estimating a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot;
extracting the edge line segments in the second image as second image edge line segments;
filtering the second image edge line segments according to the estimated second object edge line segment;
taking the second image edge line segments remaining after filtering as the object edge line segments of the second image.
Estimating the second object edge line segment in the second image according to the first object edge line segment in the first image and the sensor change parameter detected from the time the first image is shot to the time the second image is shot includes:
obtaining the coordinates of at least one endpoint of the first object edge line segment as at least one first coordinate;
obtaining the coordinate offset detected from the time the first image is shot to the time the second image is shot;
calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and taking the resulting line segment as the second object edge line segment.
Filtering the second image edge line segments according to the estimated second object edge line segment includes:
transforming the second object edge line segment from an original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;
transforming the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
in the specified coordinate system, filtering out the third edge points outside a preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
Taking the second image edge line segments remaining after filtering as the object edge line segments of the second image includes:
calculating the mean of the specified coordinates of the third edge points within the preset range of the second edge points;
inversely transforming the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and taking the line segment obtained by the inverse transformation as the object edge line segment of the second image.
Transforming the second object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge points includes:
transforming the second object edge line segment from the original coordinate system to the specified coordinate system using the following formula to obtain the specified coordinates of the second edge points:
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
Other embodiments of the present invention will readily occur to those skilled in the art after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or conventional technical means in the art not disclosed in the present disclosure. The specification and embodiments are to be regarded as exemplary only, and the true scope and spirit of the present invention are indicated by the following claims.
It should be understood that the present invention is not limited to the precise structures described above and shown in the accompanying drawings, and various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (10)

1. An object edge extraction method, characterized in that the method comprises:
capturing at least two frames of images of an object, and obtaining a sensor change parameter detected from the time each frame is captured to the time the next frame is captured;
for a first image and a second image among the at least two frames, estimating a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is captured to the time the second image is captured;
extracting edge line segments in the second image as second image edge line segments, the second image edge line segments including edge line segments of the object and edge line segments of a pattern on the object;
filtering the edge line segments of the pattern on the object out of the second image edge line segments according to the estimated second object edge line segment, and taking the second image edge line segments remaining after filtering as the object edge line segments of the second image.
2. The method according to claim 1, characterized in that estimating the second object edge line segment in the second image according to the first object edge line segment in the first image and the sensor change parameter detected from the time the first image is captured to the time the second image is captured comprises:
obtaining the coordinate of at least one endpoint of the first object edge line segment as at least one first coordinate;
obtaining the coordinate offset detected from the time the first image is captured to the time the second image is captured;
calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and taking the resulting line segment as the second object edge line segment.
3. The method according to claim 1, characterized in that filtering the edge line segments of the pattern on the object out of the second image edge line segments according to the estimated second object edge line segment comprises:
transforming the second object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, wherein a straight line in the original coordinate system transforms to a point in the specified coordinate system, and a point in the original coordinate system transforms to a curve in the specified coordinate system;
transforming the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
in the specified coordinate system, filtering out the third edge points that fall outside a preset range around the second edge points, according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
4. The method according to claim 3, characterized in that taking the second image edge line segments remaining after filtering as the object edge line segments of the second image comprises:
calculating the mean of the specified coordinates of the multiple third edge points within the preset range around the second edge points;
inverse-transforming the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and taking the line segment obtained by the inverse transform as the object edge line segment of the second image.
5. The method according to claim 3, characterized in that transforming the second object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge points comprises:
transforming the second object edge line segment from the original coordinate system to the specified coordinate system using the following equations to obtain the specified coordinates of the second edge points:
$$\mathrm{alpha} = \arctan\!\left(\frac{y_2 - y_1}{x_2 - x_1}\right)$$
$$\mathrm{distance} = \left|\frac{y_1 - \tan(\mathrm{alpha})\, x_1}{\sqrt{\tan(\mathrm{alpha})^{2} + 1}}\right|$$
where $(x_1, y_1)$ and $(x_2, y_2)$ are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
6. An object edge extraction device, characterized in that the device comprises:
a change parameter acquisition module, configured to capture at least two frames of images of an object and obtain a sensor change parameter detected from the time each frame is captured to the time the next frame is captured;
an estimation module, configured to, for a first image and a second image among the at least two frames, estimate a second object edge line segment in the second image according to a first object edge line segment in the first image and the sensor change parameter detected from the time the first image is captured to the time the second image is captured;
an edge extraction module, configured to extract edge line segments in the second image as second image edge line segments, the second image edge line segments including edge line segments of the object and edge line segments of a pattern on the object;
a filtering module, configured to filter the edge line segments of the pattern on the object out of the second image edge line segments according to the estimated second object edge line segment;
an object edge extraction module, configured to take the second image edge line segments remaining after filtering as the object edge line segments of the second image.
7. The device according to claim 6, characterized in that the estimation module comprises:
a first coordinate acquisition unit, configured to obtain the coordinate of at least one endpoint of the first object edge line segment as at least one first coordinate;
a coordinate offset acquisition unit, configured to obtain the coordinate offset detected from the time the first image is captured to the time the second image is captured;
a second coordinate acquisition unit, configured to calculate the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;
a connection unit, configured to connect the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first object edge line segment, and take the resulting line segment as the second object edge line segment.
8. The device according to claim 6, characterized in that the filtering module comprises:
a first transform unit, configured to transform the second object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, wherein a straight line in the original coordinate system transforms to a point in the specified coordinate system, and a point in the original coordinate system transforms to a curve in the specified coordinate system;
a second transform unit, configured to transform the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;
a filtering unit, configured to, in the specified coordinate system, filter out the third edge points that fall outside a preset range around the second edge points, according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
9. The device according to claim 8, characterized in that the object edge extraction module comprises:
a mean calculation unit, configured to calculate the mean of the specified coordinates of the multiple third edge points within the preset range around the second edge points;
an inverse transform unit, configured to inverse-transform the coordinate point corresponding to the mean specified coordinate from the specified coordinate system back to the original coordinate system, and take the line segment obtained by the inverse transform as the object edge line segment of the second image.
10. The device according to claim 8, characterized in that the first transform unit is configured to transform the second object edge line segment from the original coordinate system to the specified coordinate system using the following equations, obtaining the specified coordinates of the second edge points:
$$\mathrm{alpha} = \arctan\!\left(\frac{y_2 - y_1}{x_2 - x_1}\right)$$
$$\mathrm{distance} = \left|\frac{y_1 - \tan(\mathrm{alpha})\, x_1}{\sqrt{\tan(\mathrm{alpha})^{2} + 1}}\right|$$
where $(x_1, y_1)$ and $(x_2, y_2)$ are the original coordinates of the two endpoints of the second object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
CN201410361293.6A 2014-07-25 2014-07-25 Object edge extracting method and device Active CN104182751B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410361293.6A CN104182751B (en) 2014-07-25 2014-07-25 Object edge extracting method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410361293.6A CN104182751B (en) 2014-07-25 2014-07-25 Object edge extracting method and device

Publications (2)

Publication Number Publication Date
CN104182751A CN104182751A (en) 2014-12-03
CN104182751B true CN104182751B (en) 2017-12-12

Family

ID=51963778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410361293.6A Active CN104182751B (en) 2014-07-25 2014-07-25 Object edge extracting method and device

Country Status (1)

Country Link
CN (1) CN104182751B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108732484A (en) * 2017-04-20 2018-11-02 深圳市朗驰欣创科技股份有限公司 Detection method and detecting system for component positioning
CN107403452A (en) * 2017-07-27 2017-11-28 深圳章鱼信息科技有限公司 Object identification method and its device based on FIG pull handle
CN111104940A (en) * 2018-10-26 2020-05-05 深圳怡化电脑股份有限公司 Image rotation correction method and device, electronic equipment and storage medium
CN114339371A (en) * 2021-12-30 2022-04-12 咪咕音乐有限公司 Video display method, device, equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8554008B2 (en) * 2010-04-13 2013-10-08 Vivante Corporation Anti-aliasing system and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101170647A (en) * 2006-10-25 2008-04-30 Fujifilm Corporation Method of detecting specific object region and digital camera
CN101493889A (en) * 2008-01-23 2009-07-29 Huawei Technologies Co., Ltd. Method and apparatus for tracking video object
CN102387303A (en) * 2010-09-02 2012-03-21 Olympus Corporation Image processing apparatus, image processing method, and image pickup apparatus
CN102800088A (en) * 2012-06-28 2012-11-28 Huazhong University of Science and Technology Automatic dividing method of ultrasound carotid artery plaque

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A fast edge-based road detection algorithm; Yang Wenjie et al.; Computer Science; 2006-12-31; Vol. 33, No. 5; pp. 257-260 *

Also Published As

Publication number Publication date
CN104182751A (en) 2014-12-03

Similar Documents

Publication Publication Date Title
CN104243819B (en) Photo acquisition methods and device
CN105069786B (en) Line detection method and device
CN105809704B (en) Identify the method and device of image definition
CN104484871B (en) edge extracting method and device
CN104504684B (en) Edge extraction method and device
CN105488511B (en) The recognition methods of image and device
CN106384098A (en) Image-based head posture detection method, device and terminal
WO2017215224A1 (en) Fingerprint input prompting method and apparatus
CN106250894A (en) Card image recognition methods and device
CN106296665B (en) Card image fuzzy detection method and apparatus
CN104182751B (en) Object edge extracting method and device
CN108062547A (en) Character detecting method and device
CN105426878B (en) Face cluster method and device
CN106296570A (en) Image processing method and device
CN106127751A (en) image detecting method, device and system
CN104933700B (en) A kind of method and apparatus carrying out picture material identification
CN105117680B (en) A kind of method and apparatus of the information of ID card
JP2017521742A (en) Method and apparatus for acquiring iris image, and iris identification device
CN109034150A (en) Image processing method and device
CN107463903A (en) Face key independent positioning method and device
CN106598429A (en) Method and device for adjusting window of mobile terminal
CN106557755A (en) Fingerprint template acquisition methods and device
CN106056117A (en) Image processing method and device for rectangular object
CN107911576A (en) Image processing method, device and storage medium
CN105931239A (en) Image processing method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant