CN108388854A - A kind of localization method based on improvement FAST-SURF algorithms - Google Patents

A localization method based on an improved FAST-SURF algorithm

Info

Publication number
CN108388854A
CN108388854A
Authority
CN
China
Prior art keywords
point
algorithms
matching
surf
fast
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810140177.XA
Other languages
Chinese (zh)
Inventor
汪宇翔
王兴龙
罗元
张毅
胡章芳
张瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Post and Telecommunications
Original Assignee
Chongqing University of Post and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Post and Telecommunications filed Critical Chongqing University of Post and Telecommunications
Priority to CN201810140177.XA priority Critical patent/CN108388854A/en
Publication of CN108388854A publication Critical patent/CN108388854A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, by matching or filtering
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The present invention claims a localization method based on an improved FAST-SURF algorithm, addressing the slow and imprecise positioning that industrial robots face when grasping objects. From the target-object images acquired by a binocular vision system, feature points are first extracted by preliminary processing with the improved FAST algorithm; feature-point pairs are then matched with the SURF algorithm and screened with RANSAC; finally, the matched feature-point pairs are processed by the triangulation principle to accurately locate the three-dimensional coordinates of the object. The method reduces processing time and improves accuracy, making it well suited to locating target objects in the industrial-robot setting considered here.

Description

A localization method based on an improved FAST-SURF algorithm
Technical field
The invention belongs to the field of localization technology, and in particular relates to a target localization method based on an improved FAST-SURF algorithm.
Background technology
Industrial robots are widely used in industrial tasks such as logistics sorting and object stacking, and picking up and putting down target objects is one of their essential tasks. Most industrial robots still grasp by repeating motions through pre-taught points. Such a working mode demands a highly stable environment and cannot cope with sudden changes. With the continuous development of industrial production, the demands on the practicality and intelligence of robot systems keep rising. Through a vision system, an industrial robot can identify and locate a target object, improving the robustness and real-time performance of the system so that it can handle unexpected situations. In recent years, visual recognition and localization have accordingly found ever wider industrial application.
Binocular vision localization imitates the way human binocular vision perceives distance in order to measure the three-dimensional position of a scene. Based on the triangulation principle, the same scene is imaged by two cameras from different positions, and range information is recovered from the parallax; this is an important topic in robot-vision research. The drawback of current object-localization techniques is that image processing is not fast enough and positioning is inaccurate.
Summary of the invention
The present invention aims to solve the above problems of the prior art by proposing a localization method, based on an improved FAST-SURF algorithm, that improves the overall matching precision. The technical scheme of the invention is as follows:
A localization method based on an improved FAST-SURF algorithm, comprising the following steps:
S11: acquire image information of the object to be located with a binocular vision system;
S12: perform preliminary processing on the object image information obtained in step S11 with the improved FAST algorithm to obtain feature points; the improvement to the FAST algorithm consists mainly in reducing the number of comparison points examined from 16 to 12, which speeds up detection;
S13: using the feature points obtained in step S12, match and screen feature-point pairs in the target area with the SURF algorithm;
S14: screen the matched pairs obtained in step S13 with the RANSAC algorithm to obtain multiple groups of matched feature-point pairs;
S15: from the multiple groups of matched feature-point pairs obtained in step S14, compute the three-dimensional coordinates of the points with the triangulation algorithm, thereby accurately locating the three-dimensional coordinates of the object.
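The step sequence S11 to S15 can be read as a simple pipeline. The sketch below is an assumed structure for illustration only, not code from the patent; the stage functions are injected so each can stand for the component described in the corresponding step.

```python
# High-level sketch of the S11-S15 pipeline (illustrative structure, not
# the patented implementation). Each injected function stands for one
# stage: improved-FAST detection, SURF matching, RANSAC screening, and
# triangulation.

def locate(left_img, right_img, detect, describe_and_match, ransac, triangulate):
    """Compose the pipeline described in steps S11-S15."""
    kps_l = detect(left_img)                                       # S12: improved FAST
    kps_r = detect(right_img)
    pairs = describe_and_match(left_img, kps_l, right_img, kps_r)  # S13: SURF matching
    inliers = ransac(pairs)                                        # S14: RANSAC screening
    return triangulate(inliers)                                    # S15: 3-D coordinates
```

Injecting trivial stand-ins is enough to exercise the control flow; the later sections sketch the individual stages in more detail.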
Further, the preliminary processing of step S12 with the improved FAST algorithm to obtain feature points specifically comprises the steps:
S21: FAST algorithm: choose a pixel P from the picture as the centre of a discretized Bresenham circle of radius 3 pixels; there are 16 pixels on the circle boundary;
S22: improvement of the FAST algorithm: the 16 surrounding comparison pixels are reduced to 12; the detection criterion is that if 9 contiguous pixels are all brighter than Ip + t or all darker than Ip - t, the pixel is a feature-point candidate;
S23: if the grey values of the ring of pixels around the point under test differ sufficiently from the grey value of the candidate point, the candidate is accepted as a feature point;
S24: apply non-maximum suppression to the feature points and output the result.
Further, step S23 specifically comprises: for any point in the two-dimensional image taken as circle centre, with coordinates (x, y), the feature-point description formula is as follows:

S(P→x) = d (darker), if I_x <= I_P - t;  S(P→x) = s (similar), if I_P - t < I_x < I_P + t;  S(P→x) = b (brighter), if I_x >= I_P + t   (2)

where I_P is the grey value of the centre pixel, I_x the grey value of circle pixel x, and t the threshold. When the grey value I_x of a surrounding circle pixel x lies below I_P - t, that pixel is classed as darker and S(P→x) = d; similarly for the similar class s and the brighter class b. In this way each pixel on the edge of the circular region centred on candidate feature point p is assigned one of the 3 classes d, s and b. Count the number of pixels labelled d or b; if it is greater than n, the point is a candidate feature point. Candidate feature points are thus obtained by comparing the grey values of the pixels on the fixed-radius circle, and an ID3 classifier then judges from the 12 features whether a candidate is a feature point. Denote the numbers of d, s and b obtained in the steps above by P_d, P_s and P_b; the computed S(P→x) must correspond to one of the cases of formula (2). Let K_p = true when P is a feature point and false otherwise, which decides whether a given pixel is a feature point.
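The d/s/b classification and the 9-contiguous-of-12 segment test described above can be sketched as follows. This is a minimal illustration under the standard FAST conventions, not the patented implementation; the threshold value and the wrap-around handling of contiguity are assumptions made for the example.

```python
# Sketch of the simplified FAST segment test of steps S21-S23: each of the
# 12 circle pixels is classified as darker (d), similar (s) or brighter (b)
# relative to the centre intensity ip with threshold t, and the centre is a
# corner candidate when 9 contiguous circle pixels are all d or all b.

def classify(ip, ix, t):
    """Classify one circle pixel against centre intensity ip (formula (2))."""
    if ix <= ip - t:
        return "d"          # darker
    if ix >= ip + t:
        return "b"          # brighter
    return "s"              # similar

def is_corner_candidate(ip, circle, t=20, n_contig=9):
    """True if n_contig contiguous circle pixels are all 'd' or all 'b'.

    `circle` holds the 12 sampled intensities (the patent reduces the
    classic 16-point Bresenham circle to 12 comparison points).
    """
    labels = [classify(ip, ix, t) for ix in circle]
    m = len(labels)
    for lab in ("d", "b"):
        for start in range(m):
            # wrap around the circle when checking contiguity
            if all(labels[(start + k) % m] == lab for k in range(n_contig)):
                return True
    return False
```

A real detector would run this test at every pixel and then apply the non-maximum suppression of step S24.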
Further, in step S13 the SURF algorithm performs the matching of feature-point pairs in the target area, and the matching results are then screened by distance discrimination and angle discrimination respectively to obtain feature-point pairs of high accuracy; specifically:
Interest points are extracted with a detector based on Gaussian derivatives computed via the integral image. A square region centred on each extracted interest point is divided into a 4 × 4 grid of cells, each cell containing 5 × 5 regularly spaced sample points. For each cell, the Haar wavelet responses dx and dy are computed at these 5 × 5 sample points and summed to form the 4-D vector V = (Σdx, Σdy, Σ|dx|, Σ|dy|); concatenating V over all 4 × 4 cells yields the 64-D descriptor vector.
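The descriptor assembly just described can be sketched as below. The Haar wavelet responses are taken as given (computing them from an integral image is omitted), and the final unit-normalisation is an assumption following common SURF practice rather than something stated in the patent.

```python
import numpy as np

# Sketch of the 64-D SURF descriptor assembly: a 4x4 grid of cells, each
# with 5x5 sampled Haar responses dx, dy, contributes the 4-vector
# (sum dx, sum dy, sum |dx|, sum |dy|); 16 cells x 4 values = 64-D.

def surf_descriptor(dx, dy):
    """Build a 64-D descriptor from 20x20 grids of Haar responses dx, dy."""
    dx = np.asarray(dx, dtype=float).reshape(20, 20)
    dy = np.asarray(dy, dtype=float).reshape(20, 20)
    vec = []
    for i in range(4):
        for j in range(4):
            cx = dx[5 * i:5 * i + 5, 5 * j:5 * j + 5]  # one 5x5 cell of dx
            cy = dy[5 * i:5 * i + 5, 5 * j:5 * j + 5]  # matching cell of dy
            vec += [cx.sum(), cy.sum(), np.abs(cx).sum(), np.abs(cy).sum()]
    v = np.array(vec)                                  # 64-D vector
    return v / (np.linalg.norm(v) + 1e-12)             # unit-normalise (assumed)
```

Descriptors built this way are then compared (typically by Euclidean distance) to form the preliminary matched pairs that the following screening steps refine.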
Further, the preliminary matched point pairs obtained by the SURF algorithm are subjected to a consistency check of the distance and angle matching parameters; if a pair satisfies both matching parameters simultaneously, its interest point is judged a correct feature point; otherwise the pair is treated as a false match and removed from further consideration.
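The patent does not spell out the distance and angle parameters; the sketch below implements one plausible reading, which is an assumption and not the patent's exact rule: the Euclidean distance and the direction angle of the line joining the two matched points are compared against their medians over all pairs, and a pair is kept only when both values fall within the given tolerances.

```python
import math

# Sketch of a distance-and-angle consistency screen for matched pairs
# (assumed interpretation): correct stereo matches should share a similar
# displacement length and direction, so pairs far from the median of
# either statistic are rejected as false matches.

def screen_matches(pairs, dist_tol=10.0, ang_tol=0.2):
    """pairs: list of ((x1, y1), (x2, y2)). Returns the consistent subset."""
    def dist(p):
        (x1, y1), (x2, y2) = p
        return math.hypot(x2 - x1, y2 - y1)

    def ang(p):
        (x1, y1), (x2, y2) = p
        return math.atan2(y2 - y1, x2 - x1)

    if not pairs:
        return []
    ds = sorted(dist(p) for p in pairs)
    angs = sorted(ang(p) for p in pairs)
    med_d, med_a = ds[len(ds) // 2], angs[len(angs) // 2]
    # keep a pair only if BOTH parameters are consistent with the median
    return [p for p in pairs
            if abs(dist(p) - med_d) <= dist_tol
            and abs(ang(p) - med_a) <= ang_tol]
```

The surviving pairs are what the RANSAC step of S14 then screens geometrically.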
Further, in step S14 the screening of the matched pairs with the RANSAC algorithm specifically comprises the steps:
S51: randomly draw four samples from the data set and compute the transformation matrix, denoted M;
S52: compute the projection error of every datum in the data set against model M; if the error is below the threshold, add the datum to the inlier set I;
S53: if the number of elements of the current inlier set I exceeds that of the best inlier set I_best, update I_best = I, and update the iteration count k;
S54: if the number of iterations exceeds k, exit; otherwise increment the iteration count by 1 and repeat the steps above.
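Steps S51 to S54 can be sketched as a generic RANSAC loop. The patent only says a transformation matrix M is computed from four random samples; for brevity this sketch fits a 2-D affine transform by least squares, which is an assumption made to keep the example short, not the patent's stated model.

```python
import random
import numpy as np

# Sketch of the RANSAC screening of steps S51-S54 with an affine model
# (assumed in place of the unspecified transformation matrix M).

def fit_affine(src, dst):
    """Least-squares 2x3 affine M with dst ~ M @ [x, y, 1]."""
    A = np.hstack([np.asarray(src, float), np.ones((len(src), 1))])
    M, *_ = np.linalg.lstsq(A, np.asarray(dst, float), rcond=None)
    return M.T                               # shape (2, 3)

def project_error(M, src, dst):
    """Per-point Euclidean projection error of dst against M applied to src."""
    A = np.hstack([np.asarray(src, float), np.ones((len(src), 1))])
    pred = A @ M.T
    return np.linalg.norm(pred - np.asarray(dst, float), axis=1)

def ransac(src, dst, k=100, thresh=2.0, seed=0):
    """Return indices of the largest inlier set found in k iterations."""
    rng = random.Random(seed)
    best = []
    for _ in range(k):                        # S54: iterate at most k times
        idx = rng.sample(range(len(src)), 4)  # S51: 4 random samples
        M = fit_affine([src[i] for i in idx], [dst[i] for i in idx])
        err = project_error(M, src, dst)      # S52: projection errors
        inliers = [i for i, e in enumerate(err) if e < thresh]
        if len(inliers) > len(best):          # S53: keep the largest set
            best = inliers
    return best
```

With a mostly consistent translation between point sets, the loop recovers the inliers and rejects gross outliers, which is exactly the false-match rejection the patent attributes to this step.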
Further, in step S15 the computation of the three-dimensional coordinates of the points from the multiple groups of matched feature-point pairs with the triangulation algorithm, to accurately locate the three-dimensional coordinates of the object, specifically comprises:
S51: take the left camera O1 as the origin of the coordinate system, with the point to be found denoted P(X, Y, Z); let the feature-point coordinates in the left image be (u0, v0) and in the right image (u1, v1), let b be the distance between the two cameras, and f the camera focal length; then

Z = b · f / (u0 - u1),  X = u0 · Z / f,  Y = v0 · Z / f

and the three-dimensional coordinates of each feature point of the pairs obtained by the algorithm above are computed with this formula;
S52: the three-dimensional coordinates of the feature points are processed and combined into a summary coordinate, giving the three-dimensional coordinates of the object to be located.
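The triangulation of S51 and the combination of S52 can be sketched as follows, using the standard parallel-axis stereo relations stated above. Taking the mean as the "summary coordinate" is an assumption about the combination step, which the patent leaves unspecified.

```python
# Sketch of step S15: depth from disparity for each matched pair, then the
# per-feature 3-D points are averaged into one object coordinate.

def triangulate(u0, v0, u1, b, f):
    """3-D point in the left-camera frame from one stereo correspondence.

    b: baseline between the cameras; f: focal length in pixels; image
    coordinates are taken relative to the principal points.
    """
    disparity = u0 - u1
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity")
    Z = b * f / disparity
    return (u0 * Z / f, v0 * Z / f, Z)

def locate_object(pairs, b, f):
    """Average the per-feature 3-D points into one summary coordinate."""
    pts = [triangulate(u0, v0, u1, b, f) for (u0, v0), (u1, _) in pairs]
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))
```

For example, with a 0.1 m baseline, an 800-pixel focal length and a 40-pixel disparity, the depth works out to Z = 0.1 * 800 / 40 = 2 m.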
Advantages and beneficial effects of the present invention:
The innovation of the invention is to reduce the 16 comparison points examined by the FAST algorithm to 12, which shortens the time FAST needs to detect feature points; at the same time, false matched pairs are rejected with the RANSAC algorithm, improving accuracy.
To address the slow and imprecise positioning of industrial robots when grasping objects, a new layered target-object localization strategy is proposed: image feature points are extracted with the improved FAST algorithm, and feature points in the target area are then matched with the SURF algorithm; compared with SURF alone, this also reduces the mismatching of isolated points, improving the overall matching precision and speeding up the overall matching.
Description of the drawings
Fig. 1 is the system flow chart of the localization method based on the improved FAST-SURF algorithm in the preferred embodiment provided by the present invention;
Fig. 2 is the principle diagram of the improved FAST algorithm;
Fig. 3 is the schematic diagram of triangulation.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described clearly and in detail below with reference to the accompanying drawings. The described embodiments are only some of the embodiments of the invention.
The technical solution by which the present invention solves the above technical problems is:
In an industrial environment, to address the slow and imprecise positioning of industrial robots when grasping objects, a layered localization strategy based on an improved FAST-SURF algorithm is proposed that locates objects quickly and accurately. First, the target-object images acquired by the binocular vision system undergo preliminary processing with the improved FAST algorithm to extract the image feature points; then feature-point pairs in the target area are matched and screened with the SURF algorithm; finally, the matched feature-point pairs are processed by the triangulation principle to obtain the three-dimensional coordinates of the object.
As shown in Fig. 1, the system block diagram of the localization method based on the improved FAST-SURF algorithm proposed by the invention: according to the target-object images acquired by the binocular vision system, preliminary processing with the improved FAST algorithm extracts the image feature points; feature-point pairs in the target area are then matched and screened with the SURF algorithm; finally, the matched feature-point pairs are processed by the triangulation principle to accurately locate the three-dimensional coordinates of the object.
As shown in Fig. 2, the principle of the improved FAST algorithm in the invention: the 16 surrounding comparison pixels are reduced to 12, and the detection criterion is that if 9 contiguous pixels are all brighter than Ip + t or all darker than Ip - t, the pixel is a feature-point candidate. If the grey values of the ring of pixels around the point under test differ sufficiently from the grey value of the candidate point, the candidate is accepted as a feature point. For any point in the two-dimensional image taken as circle centre, with coordinates (x, y), the feature-point description formula is:

S(P→x) = d (darker), if I_x <= I_P - t;  S(P→x) = s (similar), if I_P - t < I_x < I_P + t;  S(P→x) = b (brighter), if I_x >= I_P + t   (2)

When the grey value I_x of a surrounding circle pixel x lies below I_P - t, that pixel is classed as darker and S(P→x) = d; similarly for the similar class s and the brighter class b. In this way each pixel on the edge of the circular region centred on candidate feature point p is assigned one of the 3 classes d, s and b. Count the number of pixels labelled d or b; if it is greater than n (with a 9-point segment test, i.e. 3/4 of the 12 circle pixels), the point is a candidate feature point. Candidate feature points are thus obtained by comparing the grey values of the pixels on the fixed-radius circle, and an ID3 classifier judges from the 12 features whether a candidate is a feature point. Denote the numbers of d, s and b obtained above by P_d, P_s and P_b; the computed S(P→x) must correspond to one of the cases of formula (2). Let K_p = true when P is a feature point and false otherwise, which decides whether a given pixel is a feature point.
Non-maximum suppression is then applied to the feature points, and the feature points are output.
As shown in Fig. 3, the triangulation algorithm in the invention: the improved FAST-SURF algorithm above first matches the feature points, then the matched pairs are fed into the triangulation computation to obtain the three-dimensional coordinates of each feature point; the coordinates of the feature points are then processed and combined into a summary coordinate, yielding the three-dimensional information of the target object.
The above embodiments should be understood as merely illustrating, not limiting, the present invention. Having read this disclosure, those skilled in the art can make various changes or modifications to the invention, and such equivalent variations and modifications likewise fall within the scope of the claims of the invention.

Claims (7)

1. A localization method based on an improved FAST-SURF algorithm, characterized by comprising the following steps:
S11: acquiring image information of the object to be located with a binocular vision system;
S12: performing preliminary processing on the object image information obtained in step S11 with the improved FAST algorithm to obtain feature points, the improvement to the FAST algorithm consisting mainly in reducing the number of comparison points examined from 16 to 12, which speeds up detection;
S13: using the feature points obtained in step S12, matching and screening feature-point pairs in the target area with the SURF algorithm;
S14: screening the matched pairs obtained in step S13 with the RANSAC algorithm to obtain multiple groups of matched feature-point pairs;
S15: computing, from the multiple groups of matched feature-point pairs obtained in step S14, the three-dimensional coordinates of the points with the triangulation algorithm, thereby accurately locating the three-dimensional coordinates of the object.
2. The localization method based on an improved FAST-SURF algorithm according to claim 1, characterized in that the preliminary processing of step S12 with the improved FAST algorithm to obtain feature points specifically comprises the steps:
S21: FAST algorithm: choosing a pixel P from the picture as the centre of a discretized Bresenham circle of radius 3 pixels, there being 16 pixels on the circle boundary;
S22: improvement of the FAST algorithm: reducing the 16 surrounding comparison pixels to 12, the detection criterion being that if 9 contiguous pixels are all brighter than Ip + t or all darker than Ip - t, the pixel is a feature-point candidate;
S23: if the grey values of the ring of pixels around the point under test differ sufficiently from the grey value of the candidate point, accepting the candidate as a feature point;
S24: applying non-maximum suppression to the feature points and outputting the result.
3. The localization method based on an improved FAST-SURF algorithm according to claim 2, characterized in that step S23 specifically comprises: for any point in the two-dimensional image taken as circle centre, with coordinates (x, y), the feature-point description formula is as follows:

S(P→x) = d (darker), if I_x <= I_P - t;  S(P→x) = s (similar), if I_P - t < I_x < I_P + t;  S(P→x) = b (brighter), if I_x >= I_P + t   (2)

When the grey value I_x of a surrounding circle pixel x lies below I_P - t, that pixel is classed as darker and S(P→x) = d; similarly for the similar class s and the brighter class b; in this way each pixel on the edge of the circular region centred on candidate feature point p is assigned one of the 3 classes d, s and b; the number of pixels labelled d or b is counted, and if it is greater than n, the point is a candidate feature point; candidate feature points are thus obtained by comparing the grey values of the pixels on the fixed-radius circle, and an ID3 classifier then judges from the 12 features whether the candidate is a feature point; denoting the numbers of d, s and b obtained above by P_d, P_s and P_b, the computed S(P→x) must correspond to one of the cases of formula (2); let K_p = true when P is a feature point and false otherwise, which decides whether a given pixel is a feature point.
4. The localization method based on an improved FAST-SURF algorithm according to claim 2, characterized in that in step S13 the SURF algorithm performs the matching of feature-point pairs in the target area, and the matching results are then screened by distance discrimination and angle discrimination respectively to obtain feature-point pairs of high accuracy, specifically:
interest points are extracted with a detector based on Gaussian derivatives computed via the integral image; a square region centred on each extracted interest point is divided into a 4 × 4 grid of cells, each cell containing 5 × 5 regularly spaced sample points; for each cell, the Haar wavelet responses dx and dy are computed at the 5 × 5 sample points and summed to form the 4-D vector V = (Σdx, Σdy, Σ|dx|, Σ|dy|); concatenating V over all 4 × 4 cells yields the 64-D descriptor vector.
5. The localization method based on an improved FAST-SURF algorithm according to claim 4, characterized in that the preliminary matched point pairs obtained by the SURF algorithm are subjected to a consistency check of the distance and angle matching parameters; if a pair satisfies both matching parameters simultaneously, its interest point is judged a correct feature point; otherwise the pair is treated as a false match and removed from further consideration.
6. The localization method based on an improved FAST-SURF algorithm according to claim 4, characterized in that in step S14 the screening of the matched pairs with the RANSAC algorithm specifically comprises the steps:
S51: randomly drawing four samples from the data set and computing the transformation matrix, denoted M;
S52: computing the projection error of every datum in the data set against model M, and adding a datum to the inlier set I if its error is below the threshold;
S53: if the number of elements of the current inlier set I exceeds that of the best inlier set I_best, updating I_best = I, and updating the iteration count k;
S54: if the number of iterations exceeds k, exiting; otherwise incrementing the iteration count by 1 and repeating the steps above.
7. The localization method based on an improved FAST-SURF algorithm according to claim 4, characterized in that the computation in step S15 of the three-dimensional coordinates of the points from the multiple groups of matched feature-point pairs with the triangulation algorithm, to accurately locate the three-dimensional coordinates of the object, specifically comprises:
S51: taking the left camera O1 as the origin of the coordinate system, with the point to be found denoted P(X, Y, Z); letting the feature-point coordinates in the left image be (u0, v0) and in the right image (u1, v1), b the distance between the two cameras, and f the camera focal length, then

Z = b · f / (u0 - u1),  X = u0 · Z / f,  Y = v0 · Z / f

and computing with this formula the three-dimensional coordinates of each feature point of the pairs obtained by the algorithm above;
S52: processing the three-dimensional coordinates of the feature points and combining them into a summary coordinate, giving the three-dimensional coordinates of the object to be located.
CN201810140177.XA 2018-02-11 2018-02-11 A kind of localization method based on improvement FAST-SURF algorithms Pending CN108388854A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810140177.XA CN108388854A (en) 2018-02-11 2018-02-11 A kind of localization method based on improvement FAST-SURF algorithms


Publications (1)

Publication Number Publication Date
CN108388854A true CN108388854A (en) 2018-08-10

Family

ID=63068548

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810140177.XA Pending CN108388854A (en) 2018-02-11 2018-02-11 A kind of localization method based on improvement FAST-SURF algorithms

Country Status (1)

Country Link
CN (1) CN108388854A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105678307A (en) * 2016-01-11 2016-06-15 河海大学 Mobile terminal real-time characteristic detection and matching method based on FAST-SURF
CN106908040A (en) * 2017-03-06 2017-06-30 哈尔滨工程大学 A kind of binocular panorama visual robot autonomous localization method based on SURF algorithm
CN107067415A (en) * 2017-03-21 2017-08-18 南京航空航天大学 A kind of quick accurate positioning method of target based on images match


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
刘代志: "Remote Sensing Geophysics and National Security", Xi'an Map Press, 30 September 2012 *
周莉莉, 姜枫: "Image matching algorithm based on FAST and BRIEF", Computer Engineering and Design *
杨威 et al.: "Remote-sensing image matching algorithm based on FAST-9 and SURF", National Security Geophysics Series (VIII): Remote Sensing Geophysics and National Security *
燕磊 et al.: "Automatic image registration by feature matching based on Oriented FAST and MAD", Computer and Digital Engineering *
赖庆峰: "Research on wear detection of straddle-type monorail contact wire based on binocular vision", China Masters' Theses Full-text Database, Information Science and Technology *
赵小川: "Creative Design in Robot Technology", Beihang University Press, 31 January 2013 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109799394A (en) * 2018-12-20 2019-05-24 上海玄彩美科网络科技有限公司 A kind of field distribution collecting method and equipment
CN111882618A (en) * 2020-06-28 2020-11-03 北京石油化工学院 Left and right view feature point matching processing method, terminal and system in binocular ranging
CN111882618B (en) * 2020-06-28 2024-01-26 北京石油化工学院 Left-right view characteristic point matching processing method, terminal and system in binocular ranging
CN112562000A (en) * 2020-12-23 2021-03-26 安徽大学 Robot vision positioning method based on feature point detection and mismatching screening

Similar Documents

Publication Publication Date Title
CN106650640B (en) Negative obstacle detection method based on laser radar point cloud local structure characteristics
CN108898047B (en) Pedestrian detection method and system based on blocking and shielding perception
CN109102547A (en) Robot based on object identification deep learning model grabs position and orientation estimation method
CN108717531B (en) Human body posture estimation method based on Faster R-CNN
CN104778721B (en) The distance measurement method of conspicuousness target in a kind of binocular image
CN105740780B (en) Method and device for detecting living human face
CN107392929B (en) Intelligent target detection and size measurement method based on human eye vision model
CN103824070B (en) A kind of rapid pedestrian detection method based on computer vision
EP2085928B1 (en) Detection of blobs in images
CN108981672A (en) Hatch door real-time location method based on monocular robot in conjunction with distance measuring sensor
CN105335725A (en) Gait identification identity authentication method based on feature fusion
CN107292869B (en) Image speckle detection method based on anisotropic Gaussian kernel and gradient search
CN104751146B (en) A kind of indoor human body detection method based on 3D point cloud image
CN108007388A (en) A kind of turntable angle high precision online measuring method based on machine vision
CN108564120B (en) Feature point extraction method based on deep neural network
CN112818925B (en) Urban building and crown identification method
CN107240112B (en) Individual X corner extraction method in complex scene
CN106682641A (en) Pedestrian identification method based on image with FHOG- LBPH feature
CN108681711A (en) A kind of natural landmark extracting method towards mobile robot
CN111709988B (en) Method and device for determining characteristic information of object, electronic equipment and storage medium
CN113850865A (en) Human body posture positioning method and system based on binocular vision and storage medium
CN108388854A (en) A kind of localization method based on improvement FAST-SURF algorithms
CN116883588A (en) Method and system for quickly reconstructing three-dimensional point cloud under large scene
CN106709432B (en) Human head detection counting method based on binocular stereo vision
CN113689365B (en) Target tracking and positioning method based on Azure Kinect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20180810