CN104182751A - Method and device for edge extraction of target object - Google Patents

Method and device for edge extraction of target object

Info

Publication number
CN104182751A
Authority
CN
China
Prior art keywords
line segment
image
coordinate
edge line
specified
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410361293.6A
Other languages
Chinese (zh)
Other versions
CN104182751B (en)
Inventor
徐晓舟
陈志军
秦秋平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Technology Co Ltd
Xiaomi Inc
Original Assignee
Xiaomi Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaomi Inc filed Critical Xiaomi Inc
Priority to CN201410361293.6A priority Critical patent/CN104182751B/en
Publication of CN104182751A publication Critical patent/CN104182751A/en
Application granted granted Critical
Publication of CN104182751B publication Critical patent/CN104182751B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention discloses a method and a device for edge extraction of a target object, belonging to the field of image processing. The method comprises the following steps: shooting at least two frames of images of the target object and obtaining the sensor variation parameter detected from the shooting of each frame to the shooting of the next frame; for a first image and a second image among the at least two frames, estimating a second target-object edge line segment in the second image according to a first target-object edge line segment in the first image and the sensor variation parameter; extracting the edge line segments in the second image as second-image edge line segments; filtering the second-image edge line segments according to the second target-object edge line segment; and taking the remaining second-image edge line segments as the target-object edge line segments of the second image. Because the edge of the target object in one frame is extracted by using the existing edge extraction result of another frame, the amount of calculation is reduced, the time for edge extraction is shortened, and the speed of edge extraction is increased.

Description

Method and device for extracting the edge of a target object
Technical field
The present disclosure relates to the field of image processing, and in particular to a method and device for extracting the edge of a target object.
Background
Image edges are an essential feature of an image, and with the development and spread of image recognition technology, research on edge extraction has become increasingly important. To extract the edges of an image, the image is first denoised to obtain a grayscale image; the edges of the grayscale image are then extracted with an operator such as Sobel or Canny; and line segments are detected by a Hough transform, yielding the edge line segments of the grayscale image.
An object in an image may carry various patterns, such as the printing on a bank card. If the pattern on the object is complex, the above edge extraction method extracts a large number of line segments, some belonging to the edge of the object itself and others to the pattern on the object. In that case the extracted line segments are extended, and the extended line segments that intersect to enclose a closed region are taken as the edge line segments of the object.
In the course of making the present invention, the inventors found the related art to be deficient. For example, a large number of line segments are extracted from the image, so extending them involves a very heavy computational load and takes too much time, making edge extraction very slow.
Summary of the invention
To solve the problems in the related art, the present disclosure provides a method and device for extracting the edge of a target object. The technical solution is as follows:
According to a first aspect of the embodiments of the present disclosure, a method for extracting the edge of a target object is provided, the method comprising:
capturing at least two frames of images of the target object, and obtaining the sensor variation parameter detected from the capture of each frame to the capture of the next frame;
for a first image and a second image among the at least two frames, estimating a second target-object edge line segment in the second image according to a first target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image;
extracting the edge line segments in the second image as second-image edge line segments;
filtering the second-image edge line segments according to the estimated second target-object edge line segment; and
taking the second-image edge line segments remaining after filtering as the target-object edge line segments of the second image.
The estimating of the second target-object edge line segment in the second image according to the first target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image comprises:
obtaining the coordinates of at least one endpoint of the first target-object edge line segment as at least one first coordinate;
obtaining the coordinate offset detected from the capture of the first image to the capture of the second image;
calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate; and
connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, and taking the resulting line segment as the second target-object edge line segment.
The filtering of the second-image edge line segments according to the estimated second target-object edge line segment comprises:
transforming the second target-object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinates of second edge points, where a straight line in the original coordinate system transforms into a point in the specified coordinate system and a point in the original coordinate system transforms into a curve in the specified coordinate system;
transforming the second-image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points; and
filtering out, in the specified coordinate system, the third edge points outside a preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
The taking of the second-image edge line segments remaining after filtering as the target-object edge line segments of the second image comprises:
calculating the mean of the specified coordinates of the third edge points within the preset range of a second edge point; and
inversely transforming the coordinate point corresponding to the mean from the specified coordinate system back to the original coordinate system, and taking the line segment obtained by the inverse transformation as a target-object edge line segment of the second image.
The transforming of the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge point comprises:
applying the following formulas to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system:

alpha = arctan((y2 − y1) / (x2 − x1));

distance = |y1 − tan(alpha)·x1| / √(tan(alpha)² + 1);

where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second target-object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
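The formulas above can be sketched as a small Python helper. This is an illustrative sketch, not part of the patent; the function name is a hypothetical label, and it assumes x2 ≠ x1 so that the arctangent is defined.

```python
import math

def line_to_specified(x1, y1, x2, y2):
    # alpha: inclination angle of the segment through (x1, y1) and (x2, y2)
    alpha = math.atan((y2 - y1) / (x2 - x1))  # assumes x2 != x1
    # distance: perpendicular distance from the origin to the segment's line
    distance = abs(y1 - math.tan(alpha) * x1) / math.sqrt(math.tan(alpha) ** 2 + 1)
    return alpha, distance
```

For the segment from (0, 1) to (1, 2), this yields alpha = π/4 and distance = 1/√2, the perpendicular distance from the origin to the line y = x + 1.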
According to a second aspect of the embodiments of the present disclosure, a device for extracting the edge of a target object is provided, the device comprising:
a variation-parameter acquisition module, configured to capture at least two frames of images of the target object and obtain the sensor variation parameter detected from the capture of each frame to the capture of the next frame;
an estimation module, configured to, for a first image and a second image among the at least two frames, estimate a second target-object edge line segment in the second image according to a first target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image;
an edge extraction module, configured to extract the edge line segments in the second image as second-image edge line segments;
a filtering module, configured to filter the second-image edge line segments according to the estimated second target-object edge line segment; and
a target-object edge extraction module, configured to take the second-image edge line segments remaining after filtering as the target-object edge line segments of the second image.
The estimation module comprises:
a first coordinate acquisition unit, configured to obtain the coordinates of at least one endpoint of the first target-object edge line segment as at least one first coordinate;
a coordinate offset acquisition unit, configured to obtain the coordinate offset detected from the capture of the first image to the capture of the second image;
a second coordinate acquisition unit, configured to calculate the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate; and
a connection unit, configured to connect the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, and take the resulting line segment as the second target-object edge line segment.
The filtering module comprises:
a first transformation unit, configured to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge points, where a straight line in the original coordinate system transforms into a point in the specified coordinate system and a point in the original coordinate system transforms into a curve in the specified coordinate system;
a second transformation unit, configured to transform the second-image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the third edge points; and
a filtering unit, configured to filter out, in the specified coordinate system, the third edge points outside the preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
The target-object edge extraction module comprises:
a mean calculation unit, configured to calculate the mean of the specified coordinates of the third edge points within the preset range of a second edge point; and
an inverse transformation unit, configured to inversely transform the coordinate point corresponding to the mean from the specified coordinate system back to the original coordinate system, and take the line segment obtained by the inverse transformation as a target-object edge line segment of the second image.
The first transformation unit is configured to apply the following formulas to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge point:

alpha = arctan((y2 − y1) / (x2 − x1));

distance = |y1 − tan(alpha)·x1| / √(tan(alpha)² + 1);

where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second target-object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
The technical solutions provided by the embodiments of the present disclosure may include the following beneficial effects:
In the method and device provided by this embodiment, the target-object edge line segment in the second image is estimated according to the target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image, and the edge line segments extracted from the second image are filtered according to the estimated segment, the first image and the second image being different frames of the target object. Taking full advantage of a camera's ability to capture multiple frames, the existing edge extraction result of one frame is used to extract the target-object edge of another frame, which reduces the amount of computation, saves edge extraction time, and increases edge extraction speed.
It should be understood that the above general description and the following detailed description are merely exemplary and do not limit the present disclosure.
Brief description of the drawings
The accompanying drawings herein are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention, and together with the specification serve to explain the principles of the invention.
Fig. 1 is a flowchart of a method for extracting the edge of a target object according to an exemplary embodiment;
Fig. 2 is a flowchart of a method for extracting the edge of a target object according to an exemplary embodiment;
Fig. 3a is a schematic diagram of a bank card according to an exemplary embodiment;
Fig. 3b is a schematic diagram of the edges of a bank card according to an exemplary embodiment;
Fig. 3c is a schematic diagram of the second edge points according to an exemplary embodiment;
Fig. 3d is a schematic diagram of the third edge points according to an exemplary embodiment;
Fig. 3e is a schematic diagram of the preset range according to an exemplary embodiment;
Fig. 3f is a schematic diagram of the target-object edge line segments according to an exemplary embodiment;
Fig. 4 is a block diagram of a device for extracting the edge of a target object according to an exemplary embodiment;
Fig. 5 is a block diagram of a device for extracting the edge of a target object according to an exemplary embodiment.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the present disclosure clearer, the disclosure is described in further detail below with reference to the embodiments and the accompanying drawings. The exemplary embodiments of the disclosure and their descriptions are used to explain the disclosure, not to limit it.
The embodiments of the present disclosure provide a method and device for extracting the edge of a target object, which are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a method for extracting the edge of a target object according to an exemplary embodiment; as shown in Fig. 1, the method comprises the following steps.
In step 101, at least two frames of images of the target object are captured, and the sensor variation parameter detected from the capture of each frame to the capture of the next frame is obtained.
In step 102, for a first image and a second image among the at least two frames, a second target-object edge line segment in the second image is estimated according to a first target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image.
In step 103, the edge line segments in the second image are extracted as second-image edge line segments.
In step 104, the second-image edge line segments are filtered according to the estimated second target-object edge line segment.
In step 105, the second-image edge line segments remaining after filtering are taken as the target-object edge line segments of the second image.
In the method provided by this embodiment, the target-object edge line segment in the second image is estimated according to the target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image, and the edge line segments extracted from the second image are filtered according to the estimated segment, the first image and the second image being different frames of the target object. Taking full advantage of a camera's ability to capture multiple frames, the existing edge extraction result of one frame is used to extract the target-object edge of another frame, which reduces the amount of computation, saves edge extraction time, and increases edge extraction speed.
Optionally, the estimating of the second target-object edge line segment in the second image according to the first target-object edge line segment in the first image and the sensor variation parameter detected from the capture of the first image to the capture of the second image comprises:
obtaining the coordinates of at least one endpoint of the first target-object edge line segment as at least one first coordinate;
obtaining the coordinate offset detected from the capture of the first image to the capture of the second image;
calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate; and
connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, and taking the resulting line segment as the second target-object edge line segment.
Optionally, the filtering of the second-image edge line segments according to the estimated second target-object edge line segment comprises:
transforming the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge points, where a straight line in the original coordinate system transforms into a point in the specified coordinate system and a point in the original coordinate system transforms into a curve in the specified coordinate system;
transforming the second-image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the third edge points; and
filtering out, in the specified coordinate system, the third edge points outside the preset range of the second edge points according to the specified coordinates of the second edge points and the specified coordinates of the third edge points.
Optionally, the taking of the second-image edge line segments remaining after filtering as the target-object edge line segments of the second image comprises:
calculating the mean of the specified coordinates of the third edge points within the preset range of a second edge point; and
inversely transforming the coordinate point corresponding to the mean from the specified coordinate system back to the original coordinate system, and taking the line segment obtained by the inverse transformation as a target-object edge line segment of the second image.
Optionally, the transforming of the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the second edge point comprises:
applying the following formulas to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system:

alpha = arctan((y2 − y1) / (x2 − x1));

distance = |y1 − tan(alpha)·x1| / √(tan(alpha)² + 1);

where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second target-object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
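The patent states the forward formulas but gives no explicit formula for the inverse transformation that maps an averaged specified coordinate back to a line in the original coordinate system. A minimal sketch under a stated assumption: since |y1 − tan(alpha)·x1| loses the sign of the intercept, a positive intercept is assumed here, and the function name is a hypothetical label.

```python
import math

def specified_to_line(alpha, distance):
    # Recover a line y = m*x + b from a specified coordinate (alpha, distance).
    # Assumption: positive intercept, since the forward formula takes an
    # absolute value and the sign cannot be recovered from (alpha, distance).
    m = math.tan(alpha)
    b = distance * math.sqrt(m ** 2 + 1)
    return m, b
```

For example, specified_to_line(π/4, 1/√2) recovers slope 1 and intercept 1, i.e. the line y = x + 1, consistent with the forward formulas applied to a segment on that line.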
All the optional technical solutions above may be combined in any manner to form optional embodiments of the present invention, which are not described one by one here.
Fig. 2 is a flowchart of a method for extracting the edge of a target object according to an exemplary embodiment. The method is executed by an image processing device and, as shown in Fig. 2, comprises the following steps.
201: The image processing device captures at least two frames of images of the target object and obtains the sensor variation parameter detected from the capture of each frame to the capture of the next frame.
The target object may be an article such as a bank card or an identity card. The image processing device captures images of the target object and performs edge extraction on the captured images; it may be a mobile phone, a computer, or the like, which the embodiment of the present invention does not limit. When an image of the target object is needed, the camera is aimed at the target object to capture an image, and the image processing device performs edge extraction and image segmentation on the captured image to obtain the image of the target object. The device may then perform operations such as feature extraction and image recognition on the image of the target object, which are not detailed here.
In practical applications, the target object may carry a pattern, so when the image processing device performs edge extraction on a captured image it extracts both the edge line segments of the target object and the edge line segments of the pattern on it, making the edge line segments of the target object difficult to distinguish. Referring to Fig. 3a, taking a bank card as the target object, the dashed line represents the actual edge of the card, which carries the "Bank of China" and "VISA" patterns. Performing edge extraction on an image of this card yields the image shown in Fig. 3b, which contains both the edge of the card (shown as a solid line) and the edges of the "Bank of China" and "VISA" patterns.
To extract the edge line segments of the target object, the image processing device captures at least two frames of the target object and, during capture, detects in real time the sensor variation parameter from the capture of each frame to the capture of the next frame. The sensor variation parameter represents the direction and distance of the device's movement between the two captures; when the target object does not move, the device can estimate the edge line segments of the next frame from the edge line segments extracted from one frame and the sensor variation parameter. The sensor variation parameter may be the coordinate offset of the device and may be detected by a gravity sensor with which the device is equipped.
In this embodiment, obtaining the sensor variation parameter detected from the capture of each frame to the capture of the next frame may proceed as follows: during capture, the device records the current sensor parameters each time a frame is taken, and for any two captured frames it calculates the difference of the sensor parameters at the two captures as the sensor variation parameter. Alternatively, each time a frame is taken the device obtains the current sensor parameters, calculates their difference from the sensor parameters at the previous frame, and records the result as the sensor variation parameter of the current frame; then, for any two captured frames, it sums the per-frame variation parameters of the frames captured between them to obtain the sensor variation parameter between the two frames.
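The second bookkeeping scheme above (recording a per-frame delta and summing the deltas between any two frames) can be sketched as follows; the (dx, dy) tuple format and function name are illustrative assumptions, not taken from the patent.

```python
def offset_between(deltas, i, j):
    """Sum the per-frame sensor deltas between frame i and frame j (i < j).

    deltas[k] is the (dx, dy) change detected from frame k to frame k + 1,
    so the offset between frames i and j is the sum of deltas[i:j].
    """
    dx = sum(d[0] for d in deltas[i:j])
    dy = sum(d[1] for d in deltas[i:j])
    return dx, dy
```

For example, with per-frame deltas [(1, 0), (0, 2), (1, 1)], the offset between frame 0 and frame 3 is (2, 3).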
202: For a first image and a second image among the at least two frames, the image processing device determines the first target-object edge line segment in the first image and obtains the coordinates of at least one endpoint of that segment as at least one first coordinate.
In this embodiment, the image processing device performs edge extraction on the first image to obtain the first target-object edge line segment, which consists of at least one line segment and includes at least one endpoint. The device may establish a coordinate system in advance and, when extracting the first target-object edge line segment, determine the coordinates of its endpoints from their positions in that coordinate system.
Before step 202, the method may further comprise: the image processing device establishes the coordinate system with the lower-left corner of the first image as the origin, the lower edge of the first image as the horizontal axis (X-axis), and the left edge of the first image as the vertical axis (Y-axis). This embodiment does not limit the origin or axes of the coordinate system.
It should be noted that the image processing device estimates the target-object edge line segments of the next frame from those of the previous frame. For the first frame, the device may perform edge extraction on the frame, extend the obtained edge line segments, and take the extended line segments that intersect to enclose a closed region as the target-object edge line segments. In practice, the device needs to perform edge extraction on the target object repeatedly; once the target-object edge line segments of the first frame have been extracted, those of each subsequent frame can be estimated from the previous frame's result and the sensor variation parameter, which greatly reduces the amount of computation for each subsequent extraction and increases its speed.
203: The image processing device obtains the coordinate offset detected from the capture of the first image to the capture of the second image, calculates the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate, and connects the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, taking the resulting line segment as the second target-object edge line segment.
In this embodiment, the coordinate offset detected from the capture of the first image to the capture of the second image represents the direction and distance of the device's movement. When the device has captured the first image, moved, and then captured the second image, the corresponding pixels in the second image have moved relative to those in the first image, in the direction opposite to the device's movement and by an equal distance. The device therefore calculates the difference between each first coordinate and the coordinate offset, which gives the coordinate of the corresponding endpoint in the second image, denoted a second coordinate. Having obtained the at least one second coordinate, the device connects the corresponding coordinate points according to the connection relationship of the endpoints in the first image to obtain at least one line segment, which it takes as the second target-object edge line segment.
The image processing device may apply the following formulas to calculate a second coordinate:

x1 = x0 − Δx, y1 = y0 − Δy;

where (x1, y1) is the second coordinate of an endpoint, (x0, y0) is the first coordinate of that endpoint, and (Δx, Δy) is the coordinate offset of the image processing device.
For example, suppose the unit of the coordinate system is mm and the first coordinate of an endpoint of the first target-object edge line segment is (x0, y0). If the device detects a coordinate offset of (1, 0), the device has moved 1 mm in the positive X direction; compared with the endpoint in the first image, the endpoint in the second image has moved 1 mm in the negative X direction, so its second coordinate is (x0 − 1, y0).
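Step 203's endpoint shift can be sketched in a few lines of Python. The names are illustrative; the patent specifies only the subtraction itself.

```python
def estimate_second_coords(first_coords, offset):
    """Shift each endpoint of the first image's edge segment.

    Each endpoint moves opposite to the device, so the coordinate offset
    (dx, dy) is subtracted from each first coordinate.
    """
    dx, dy = offset
    return [(x - dx, y - dy) for (x, y) in first_coords]
```

With endpoints [(5, 5), (9, 5)] and a detected offset of (1, 0), this returns [(4, 5), (8, 5)], matching the 1 mm negative-X shift in the example above.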
204: The image processing device denoises the second image to obtain its grayscale image, extracts the edges of the grayscale image, and applies Hough-transform line detection to obtain the edge line segments in the grayscale image as the second-image edge line segments.
In this embodiment, to perform edge extraction on the second image, the device applies an image denoising algorithm to the second image to obtain its grayscale image, extracts the edges of the grayscale image with an edge operator, and applies Hough-transform line detection to obtain the edge line segments in the grayscale image as the second-image edge line segments. The second-image edge line segments consist of at least one line segment and include at least one endpoint.
The denoising algorithm may be Gaussian filtering, mean filtering, or the like, and the edge operator may be the Sobel operator, the Canny operator, or the like; this embodiment does not limit them.
It should be noted that this embodiment is described with step 204 performed after step 203. In fact, step 204 may also be performed before step 202, or simultaneously with steps 202-203; this embodiment does not limit the order of execution.
205. The image processing apparatus filters the second image edge line segments according to the second target-object edge line segment, and takes the second image edge line segments remaining after filtering as the target-object edge line segments of the second image.

In this embodiment, the image processing apparatus estimates the second target-object edge line segment from the first target-object edge line segment and the coordinate offset, and extracts the second image edge line segments. The second image edge line segments include both the edge line segments of the target object itself and the edge line segments of patterns on the target object. Edge line segments close to the estimated second target-object edge line segment can be regarded as edges of the target object, while edge line segments far from the estimated second target-object edge line segment can be regarded as edges of patterns on the target object. For this reason, the image processing apparatus filters the second image edge line segments according to the second target-object edge line segment.

In this embodiment, "the image processing apparatus filters the second image edge line segments according to the second target-object edge line segment" may include the following steps 205a-205c:
205a. The image processing apparatus transforms the second target-object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinate of a second edge point, and transforms the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points.

In this transform, a straight line in the original coordinate system becomes a point in the specified coordinate system, and a point in the original coordinate system becomes a curve in the specified coordinate system. The image processing apparatus thus transforms the second target-object edge line segment into the second edge point, and the second image edge line segments into the third edge points.

In this embodiment, the specified coordinate system may be an alpha-distance coordinate system. The image processing apparatus applies a Hough transform to the second target-object edge line segment and to the second image edge line segments, thereby transforming both from the original coordinate system to the specified coordinate system. Further, the image processing apparatus may apply the following formulas to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system:
alpha = arctan((y2 - y1) / (x2 - x1));

distance = |y1 - tan(alpha)·x1| / √(tan(alpha)² + 1);
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second target-object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point. alpha represents the inclination angle of the second target-object edge line segment in the original coordinate system, and distance represents the distance from the origin of the original coordinate system to the second target-object edge line segment.

The apparatus likewise applies the following formulas to transform a second image edge line segment from the original coordinate system to the specified coordinate system:
alpha' = arctan((y4 - y3) / (x4 - x3));

distance' = |y3 - tan(alpha')·x3| / √(tan(alpha')² + 1);
where (x3, y3) and (x4, y4) are the original coordinates of the two endpoints of the second image edge line segment, and (alpha', distance') is the specified coordinate of the third edge point. alpha' represents the inclination angle of the second image edge line segment in the original coordinate system, and distance' represents the distance from the origin of the original coordinate system to the second image edge line segment.
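The two formulas above can be sketched as a small helper (illustrative, not from the patent); math.atan2 is used in place of a bare arctan of the slope so that reversed or near-vertical segments are handled without a division by zero:

```python
import math

# Sketch of the (alpha, distance) mapping in step 205a: a segment's
# inclination angle and the distance from the origin to its supporting line.

def to_alpha_distance(p1, p2):
    x1, y1 = p1
    x2, y2 = p2
    alpha = math.atan2(y2 - y1, x2 - x1)   # arctan((y2 - y1) / (x2 - x1))
    t = math.tan(alpha)
    distance = abs(y1 - t * x1) / math.sqrt(t * t + 1)
    return alpha, distance

# A horizontal segment at height 5: alpha = 0, distance = 5.
print(to_alpha_distance((0.0, 5.0), (10.0, 5.0)))
```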
Fig. 3c is a schematic diagram of second edge points according to an exemplary embodiment, and Fig. 3d is a schematic diagram of third edge points according to an exemplary embodiment. When the image processing apparatus transforms the second target-object edge line segment from the original coordinate system to the specified coordinate system, the resulting second edge points in the specified coordinate system are as shown in Fig. 3c; when it transforms the second image edge line segments, the resulting third edge points are as shown in Fig. 3d.
205b. In the specified coordinate system, the image processing apparatus filters out the third edge points outside a preset range of the second edge point, according to the specified coordinates of the second edge point and the third edge points.

The preset range may be determined from the specified coordinate of the second edge point and a preset distance, where the preset distance may be determined by the accuracy requirement of the edge extraction or set by a technician. The image processing apparatus may take the second edge point as the center and the preset distance as the radius to define a circular region, and use this circular region as the preset range of the second edge point; third edge points outside the circular region can be regarded as transforms of the edge line segments of patterns on the target object and are filtered out. Of course, the image processing apparatus may instead take the second edge point as the center and the preset distance as the distance from the center to each side to define a square region as the preset range; this embodiment does not limit the preset range.

Referring to Fig. 3e, the preset ranges of the second edge points determined by the image processing apparatus are the dashed regions in Fig. 3e; the image processing apparatus filters out the third edge points outside these preset ranges and retains the third edge points within them.

In this embodiment, the image processing apparatus performs clustering rejection on the third edge points in the specified coordinate system according to the preset range, filtering out the third edge points outside the preset range and retaining only those within it, so that ultimately only line segments in the edge region of the target object are extracted, while line segments in non-edge regions are filtered out. This is particularly useful for objects such as bank cards or credit cards, on which the card number is embossed: when extracting the edges of such an object, the texture of the embossed card number may be mistaken for an edge and disturb the edge extraction. In this embodiment the card-number line segments are filtered out, avoiding the influence of such pseudo-edges.
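A hypothetical sketch of the circular preset-range filtering in step 205b. Note that in practice alpha (radians) and distance (pixels or mm) have very different scales, so a real implementation would likely scale or weight the two axes; the names and the plain Euclidean metric here are illustrative assumptions:

```python
import math

# Keep only the third edge points within a circular preset range
# (radius = preset distance) around a second edge point in the
# (alpha, distance) coordinate system.

def filter_third_points(second_point, third_points, preset_distance):
    a0, d0 = second_point
    return [
        (a, d) for (a, d) in third_points
        if math.hypot(a - a0, d - d0) <= preset_distance
    ]

second = (0.0, 5.0)                               # estimated card edge
thirds = [(0.02, 5.1), (0.01, 4.9), (0.3, 12.0)]  # last one: embossed digits
print(filter_third_points(second, thirds, 0.5))
```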
205c. The image processing apparatus calculates the mean of the specified coordinates of the third edge points within the preset range of the second edge point, inversely transforms the coordinate point corresponding to this mean from the specified coordinate system to the original coordinate system, and takes the line segment obtained by the inverse transform as a target-object edge line segment of the second image.

In this embodiment, the third edge points within the preset range can be regarded as transforms of the target object's edge line segments. The mean of the specified coordinates of the third edge points within the preset range of the second edge point is calculated, and the coordinate point corresponding to this mean is taken as the point obtained by transforming the target object's edge line segment; inversely transforming this coordinate point from the specified coordinate system to the original coordinate system yields the target-object edge line segment.

Referring to Fig. 3e, the image processing apparatus calculates the mean of the specified coordinates of the third edge points within each preset range and inversely transforms the corresponding coordinate points from the specified coordinate system to the original coordinate system; the resulting target-object edge line segments are shown as the solid edge lines in Fig. 3f.
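Step 205c's averaging and inverse transform can be sketched as follows, assuming a non-vertical line and representing the recovered line as slope and intercept. Since distance is unsigned, (alpha, distance) alone does not determine the sign of the intercept; a positive sign is assumed here, and all names are illustrative:

```python
import math

# Average the retained (alpha, distance) points and invert the mapping
# to recover a line y = slope * x + intercept in the original system.

def average_and_invert(points):
    alpha = sum(a for a, _ in points) / len(points)
    distance = sum(d for _, d in points) / len(points)
    slope = math.tan(alpha)
    # distance = |c| / sqrt(slope^2 + 1) for the line y = slope*x + c,
    # so |c| = distance * sqrt(slope^2 + 1); the sign (which side of the
    # origin the edge lies on) is assumed positive in this sketch.
    intercept = distance * math.sqrt(slope * slope + 1)
    return slope, intercept

print(average_and_invert([(0.0, 4.9), (0.0, 5.1)]))
```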
In the method provided by this embodiment, the target-object edge line segment in the second image is estimated from the target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image, and the edge line segments extracted from the second image are filtered according to the estimated edge line segment, where the first image and the second image are different frames of the target object. This makes full use of the camera device's ability to shoot multiple frames: the existing edge-extraction result of one frame is used to extract the target-object edges of another frame, which reduces the amount of computation, saves edge-extraction time, and increases edge-extraction speed.
Fig. 4 is a block diagram of a target-object edge extraction device according to an exemplary embodiment. Referring to Fig. 4, the device comprises: a variation-parameter acquisition module 401, an estimation module 402, an edge extraction module 403, a filtering module 404, and a target-object edge extraction module 405.

The variation-parameter acquisition module 401 is configured to shoot at least two frames of images of a target object and obtain the sensor variation parameter detected between shooting each frame of image and shooting the next frame of image;

The estimation module 402 is configured to, for a first image and a second image among the at least two frames of images, estimate the second target-object edge line segment in the second image according to the first target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image;

The edge extraction module 403 is configured to extract the edge line segments in the second image as second image edge line segments;

The filtering module 404 is configured to filter the second image edge line segments according to the estimated second target-object edge line segment;

The target-object edge extraction module 405 is configured to take the second image edge line segments remaining after filtering as the target-object edge line segments of the second image.

In the device provided by this embodiment, the target-object edge line segment in the second image is estimated from the target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image, and the edge line segments extracted from the second image are filtered according to the estimated edge line segment, where the first image and the second image are different frames of the target object. This makes full use of the camera device's ability to shoot multiple frames: the existing edge-extraction result of one frame is used to extract the target-object edges of another frame, which reduces the amount of computation, saves edge-extraction time, and increases edge-extraction speed.
The estimation module 402 comprises:

a first coordinate acquisition unit, configured to obtain the coordinates of at least one endpoint of the first target-object edge line segment as at least one first coordinate;

a coordinate-offset acquisition unit, configured to obtain the coordinate offset detected between shooting the first image and shooting the second image;

a second coordinate acquisition unit, configured to calculate the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;

a connection unit, configured to connect the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, and take the resulting line segment as the second target-object edge line segment.
The filtering module 404 comprises:

a first transform unit, configured to transform the second target-object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinate of the second edge point, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;

a second transform unit, configured to transform the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of the third edge points;

a filter unit, configured to, in the specified coordinate system, filter out the third edge points outside the preset range of the second edge point according to the specified coordinates of the second edge point and the third edge points.

The target-object edge extraction module 405 comprises:

a mean calculation unit, configured to calculate the mean of the specified coordinates of the third edge points within the preset range of the second edge point;

an inverse transform unit, configured to inversely transform the coordinate point corresponding to the mean from the specified coordinate system to the original coordinate system, and take the line segment obtained by the inverse transform as a target-object edge line segment of the second image.
The first transform unit is configured to apply the following formulas to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinate of the second edge point:
alpha = arctan((y2 - y1) / (x2 - x1));

distance = |y1 - tan(alpha)·x1| / √(tan(alpha)² + 1);
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second target-object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.

All the above optional technical solutions may be combined in any manner to form optional embodiments of the present invention, which will not be described here one by one.

Regarding the device in the above embodiment, the specific manner in which each module performs its operations has been described in detail in the embodiment of the related method and will not be elaborated here.

It should be noted that when the target-object edge extraction device provided by the above embodiment extracts the edges of a target object, the division into the above functional modules is described only as an example; in practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the image processing apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the target-object edge extraction device provided by the above embodiment belongs to the same concept as the embodiment of the target-object edge extraction method; for its specific implementation process, refer to the method embodiment, which is not repeated here.
Fig. 5 is a block diagram of a device 500 for target-object edge extraction according to an exemplary embodiment. For example, the device 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, fitness equipment, a personal digital assistant, or the like.

Referring to Fig. 5, the device 500 may comprise one or more of the following components: a processing component 502, a memory 504, a power component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.

The processing component 502 generally controls the overall operations of the device 500, such as operations associated with display, telephone calls, data communication, camera operation, and recording. The processing component 502 may include one or more processors 520 to execute instructions so as to complete all or part of the steps of the above method. In addition, the processing component 502 may include one or more modules to facilitate interaction between the processing component 502 and other components; for example, it may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.

The memory 504 is configured to store various types of data to support operation of the device 500. Examples of such data include instructions for any application or method operated on the device 500, contact data, phonebook data, messages, pictures, videos, and so on. The memory 504 may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.

The power component 506 provides power to the various components of the device 500. It may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device 500.

The multimedia component 508 includes a screen providing an output interface between the device 500 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 508 includes a front camera and/or a rear camera. When the device 500 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front camera and the rear camera may be a fixed optical lens system or may have focal length and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a microphone (MIC) configured to receive external audio signals when the device 500 is in an operating mode such as a call mode, a recording mode, or a speech recognition mode. The received audio signal may be further stored in the memory 504 or transmitted via the communication component 516. In some embodiments, the audio component 510 also includes a speaker for outputting audio signals.

The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, volume buttons, a start button, and a lock button.

The sensor component 514 includes one or more sensors for providing status assessments of various aspects of the device 500. For example, the sensor component 514 may detect the open/closed state of the device 500 and the relative positioning of components (for example, the display and keypad of the device 500); it may also detect a change in position of the device 500 or of one of its components, the presence or absence of user contact with the device 500, the orientation or acceleration/deceleration of the device 500, and a change in the temperature of the device 500. The sensor component 514 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. It may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 514 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.

The communication component 516 is configured to facilitate wired or wireless communication between the device 500 and other devices. The device 500 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 516 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 also includes a near-field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio-frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.

In an exemplary embodiment, the device 500 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components, for performing the above method.

In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, for example the memory 504 comprising instructions, which can be executed by the processor 520 of the device 500 to perform the above method. For example, the non-transitory computer-readable storage medium may be a ROM, a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
A non-transitory computer-readable storage medium, wherein when the instructions in the storage medium are executed by a processor of a mobile terminal, the mobile terminal is enabled to perform a target-object edge extraction method, the method comprising:

shooting at least two frames of images of a target object, and obtaining the sensor variation parameter detected between shooting each frame of image and shooting the next frame of image;

for a first image and a second image among the at least two frames of images, estimating a second target-object edge line segment in the second image according to a first target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image;

extracting the edge line segments in the second image as second image edge line segments;

filtering the second image edge line segments according to the estimated second target-object edge line segment;

taking the second image edge line segments remaining after filtering as the target-object edge line segments of the second image.
Estimating the second target-object edge line segment in the second image according to the first target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image comprises:

obtaining the coordinates of at least one endpoint of the first target-object edge line segment as at least one first coordinate;

obtaining the coordinate offset detected between shooting the first image and shooting the second image;

calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;

connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, and taking the resulting line segment as the second target-object edge line segment.
Filtering the second image edge line segments according to the estimated second target-object edge line segment comprises:

transforming the second target-object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinate of a second edge point, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;

transforming the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;

in the specified coordinate system, filtering out the third edge points outside a preset range of the second edge point according to the specified coordinates of the second edge point and the third edge points.

Taking the second image edge line segments remaining after filtering as the target-object edge line segments of the second image comprises:

calculating the mean of the specified coordinates of the third edge points within the preset range of the second edge point;

inversely transforming the coordinate point corresponding to the mean from the specified coordinate system to the original coordinate system, and taking the line segment obtained by the inverse transform as a target-object edge line segment of the second image.

Transforming the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinate of the second edge point comprises:

applying the following formulas to transform the second target-object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinate of the second edge point:
alpha = arctan((y2 - y1) / (x2 - x1));

distance = |y1 - tan(alpha)·x1| / √(tan(alpha)² + 1);
where (x1, y1) and (x2, y2) are the original coordinates of the two endpoints of the second target-object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
Those skilled in the art will readily conceive of other embodiments of the present invention after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present invention that follow its general principles and include common knowledge or conventional technical means in the art not disclosed in this disclosure. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present invention being indicated by the following claims.

It should be understood that the present invention is not limited to the precise structures described above and illustrated in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present invention is limited only by the appended claims.

Claims (10)

1. A target-object edge extraction method, characterized in that the method comprises:

shooting at least two frames of images of a target object, and obtaining the sensor variation parameter detected between shooting each frame of image and shooting the next frame of image;

for a first image and a second image among the at least two frames of images, estimating a second target-object edge line segment in the second image according to a first target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image;

extracting the edge line segments in the second image as second image edge line segments;

filtering the second image edge line segments according to the estimated second target-object edge line segment;

taking the second image edge line segments remaining after filtering as the target-object edge line segments of the second image.
2. The method according to claim 1, characterized in that estimating the second target-object edge line segment in the second image according to the first target-object edge line segment in the first image and the sensor variation parameter detected between shooting the first image and shooting the second image comprises:

obtaining the coordinates of at least one endpoint of the first target-object edge line segment as at least one first coordinate;

obtaining the coordinate offset detected between shooting the first image and shooting the second image;

calculating the difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate;

connecting the coordinate points corresponding to the at least one second coordinate according to the connection relationship of the at least one endpoint in the first target-object edge line segment, and taking the resulting line segment as the second target-object edge line segment.
3. The method according to claim 1, characterized in that filtering the second image edge line segments according to the estimated second target-object edge line segment comprises:

transforming the second target-object edge line segment from the original coordinate system to a specified coordinate system to obtain the specified coordinate of a second edge point, where a straight line in the original coordinate system is transformed into a point in the specified coordinate system and a point in the original coordinate system is transformed into a curve in the specified coordinate system;

transforming the second image edge line segments from the original coordinate system to the specified coordinate system to obtain the specified coordinates of third edge points;

in the specified coordinate system, filtering out the third edge points outside a preset range of the second edge point according to the specified coordinates of the second edge point and the third edge points.
4. The method according to claim 3, wherein using the second image edge line segment remaining after the filtering as the target object edge line segment of the second image comprises:
calculating a mean of the specified coordinates of a plurality of third edge points within the preset range of the second edge point; and
inversely transforming a coordinate point corresponding to the mean from the specified coordinate system to the original coordinate system, and using the line segment obtained by the inverse transformation as the target object edge line segment of the second image.
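Claims 3 and 4 together describe a filter-then-average step in the transformed (alpha, distance) space. The following is a minimal sketch under stated assumptions: the function names are illustrative, and the "preset range" is modeled as independent tolerances on alpha and distance, which the patent does not specify:

```python
def filter_and_average(second_point, third_points, alpha_tol, dist_tol):
    """Keep the third edge points whose (alpha, distance) coordinates lie
    within the preset range of the second edge point, then return the mean
    of the surviving points' specified coordinates (or None if none survive)."""
    kept = [(a, d) for (a, d) in third_points
            if abs(a - second_point[0]) <= alpha_tol
            and abs(d - second_point[1]) <= dist_tol]
    if not kept:
        return None  # nothing within the preset range
    mean_alpha = sum(a for a, _ in kept) / len(kept)
    mean_dist = sum(d for _, d in kept) / len(kept)
    return mean_alpha, mean_dist

second = (0.10, 50.0)  # estimated segment in (alpha, distance) space
third = [(0.11, 51.0), (0.09, 49.0), (0.80, 120.0)]  # detected segments
print(filter_and_average(second, third, alpha_tol=0.05, dist_tol=5.0))
```

Here the outlier (0.80, 120.0) is discarded and the mean of the two surviving points, approximately (0.10, 50.0), would then be inversely transformed back to a line in the original image coordinates.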
5. The method according to claim 3, wherein transforming the second target object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinate of the second edge point comprises:
applying the following formulas to transform the second target object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinate of the second edge point:

alpha = arctan((y2 − y1) / (x2 − x1));

distance = |y1 − tan(alpha) · x1| / sqrt(tan(alpha)^2 + 1);

wherein (x1, y1) and (x2, y2) are the original coordinates of the two end points of the second target object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
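The two formulas of claim 5 translate directly into code. This sketch assumes the segment is not vertical (x2 ≠ x1, so the arctangent of the slope is defined); the function name is illustrative:

```python
import math

def segment_to_specified_coords(x1, y1, x2, y2):
    """Map a line segment to its (alpha, distance) specified coordinate:
    alpha is the inclination angle of the segment, and distance is the
    unsigned distance from the origin to the segment's supporting line."""
    alpha = math.atan((y2 - y1) / (x2 - x1))
    distance = abs(y1 - math.tan(alpha) * x1) / math.sqrt(math.tan(alpha) ** 2 + 1)
    return alpha, distance

# A horizontal segment at height 5: inclination 0, distance 5 from the origin.
print(segment_to_specified_coords(0.0, 5.0, 10.0, 5.0))
# → (0.0, 5.0)
```

This is the property claim 3 relies on: every collinear segment maps to the same (alpha, distance) point, so nearby detected segments cluster around the estimated one in the specified coordinate system.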
6. A target object edge extraction device, wherein the device comprises:
a change parameter acquisition module, configured to capture at least two frames of images of a target object and to obtain a sensor change parameter detected from when each frame of image is captured until the next frame of image is captured;
an estimation module, configured to, for a first image and a second image among the at least two frames of images, estimate a second target object edge line segment in the second image according to a first target object edge line segment in the first image and the sensor change parameter detected from when the first image is captured until the second image is captured;
an edge extraction module, configured to extract edge line segments of the second image as second image edge line segments;
a filtering module, configured to filter the second image edge line segments according to the estimated second target object edge line segment; and
a target object edge extraction module, configured to use the second image edge line segments remaining after the filtering as the target object edge line segment of the second image.
7. The device according to claim 6, wherein the estimation module comprises:
a first coordinate acquisition unit, configured to obtain a coordinate of at least one end point of the first target object edge line segment as at least one first coordinate;
a coordinate offset acquisition unit, configured to obtain a coordinate offset detected from when the first image is captured until the second image is captured;
a second coordinate acquisition unit, configured to calculate a difference between the at least one first coordinate and the coordinate offset to obtain at least one second coordinate; and
a connection unit, configured to connect coordinate points corresponding to the at least one second coordinate according to a connection relationship of the at least one end point in the first target object edge line segment, and to use the resulting line segment as the second target object edge line segment.
8. The device according to claim 6, wherein the filtering module comprises:
a first transformation unit, configured to transform the second target object edge line segment from an original coordinate system to a specified coordinate system to obtain a specified coordinate of a second edge point, wherein a straight line in the original coordinate system is transformed into a point in the specified coordinate system, and a point in the original coordinate system is transformed into a curve in the specified coordinate system;
a second transformation unit, configured to transform the second image edge line segment from the original coordinate system to the specified coordinate system to obtain specified coordinates of third edge points; and
a filtering unit, configured to, in the specified coordinate system, filter out the third edge points outside a preset range of the second edge point according to the specified coordinate of the second edge point and the specified coordinates of the third edge points.
9. The device according to claim 8, wherein the target object edge extraction module comprises:
a mean calculation unit, configured to calculate a mean of the specified coordinates of a plurality of third edge points within the preset range of the second edge point; and
an inverse transformation unit, configured to inversely transform a coordinate point corresponding to the mean from the specified coordinate system to the original coordinate system, and to use the line segment obtained by the inverse transformation as the target object edge line segment of the second image.
10. The device according to claim 8, wherein the first transformation unit is configured to apply the following formulas to transform the second target object edge line segment from the original coordinate system to the specified coordinate system to obtain the specified coordinate of the second edge point:

alpha = arctan((y2 − y1) / (x2 − x1));

distance = |y1 − tan(alpha) · x1| / sqrt(tan(alpha)^2 + 1);

wherein (x1, y1) and (x2, y2) are the original coordinates of the two end points of the second target object edge line segment, and (alpha, distance) is the specified coordinate of the second edge point.
CN201410361293.6A 2014-07-25 2014-07-25 Object edge extracting method and device Active CN104182751B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410361293.6A CN104182751B (en) 2014-07-25 2014-07-25 Object edge extracting method and device


Publications (2)

Publication Number Publication Date
CN104182751A true CN104182751A (en) 2014-12-03
CN104182751B CN104182751B (en) 2017-12-12

Family

ID=51963778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410361293.6A Active CN104182751B (en) 2014-07-25 2014-07-25 Object edge extracting method and device

Country Status (1)

Country Link
CN (1) CN104182751B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107403452A (en) * 2017-07-27 2017-11-28 深圳章鱼信息科技有限公司 Object identification method and its device based on FIG pull handle
CN108732484A (en) * 2017-04-20 2018-11-02 深圳市朗驰欣创科技股份有限公司 Detection method and detecting system for component positioning
CN111104940A (en) * 2018-10-26 2020-05-05 深圳怡化电脑股份有限公司 Image rotation correction method and device, electronic equipment and storage medium
CN114339371A (en) * 2021-12-30 2022-04-12 咪咕音乐有限公司 Video display method, device, equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101170647A (en) * 2006-10-25 2008-04-30 富士胶片株式会社 Method of detecting specific object region and digital camera
CN101493889A (en) * 2008-01-23 2009-07-29 华为技术有限公司 Method and apparatus for tracking video object
US20110249901A1 (en) * 2010-04-13 2011-10-13 Vivante Corporation Anti-Aliasing System and Method
CN102387303A (en) * 2010-09-02 2012-03-21 奥林巴斯株式会社 Image processing apparatus, image processing method, and image pickup apparatus
CN102800088A (en) * 2012-06-28 2012-11-28 华中科技大学 Automatic dividing method of ultrasound carotid artery plaque


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YANG Wenjie et al.: "A Fast Edge-Based Road Detection Algorithm", Computer Science (《计算机科学》) *




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant