CN109974853B - Bionic visual target detection and tracking method based on multispectral composition - Google Patents

Bionic visual target detection and tracking method based on multispectral composition

Info

Publication number
CN109974853B
CN109974853B
Authority
CN
China
Prior art keywords
target
motion
eye
tracking
bionic
Prior art date
Legal status
Active
Application number
CN201811628870.8A
Other languages
Chinese (zh)
Other versions
CN109974853A (en)
Inventor
娄小平
李巍
祝连庆
孟晓辰
樊凡
潘志康
董明利
Current Assignee
Beijing Information Science and Technology University
Original Assignee
Beijing Information Science and Technology University
Priority date
Filing date
Publication date
Application filed by Beijing Information Science and Technology University
Priority to CN201811628870.8A
Publication of CN109974853A
Application granted
Publication of CN109974853B
Legal status: Active

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G06V10/507 Summing image-intensity values; Histogram projection analysis
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/2823 Imaging spectrometer
    • G01J2003/2826 Multispectral imaging, e.g. filter imaging

Abstract

The invention discloses a bionic visual target detection and tracking method based on multispectral compounding, which comprises the following steps: simulating the head-eye coordinated movement mechanism of a primate and building a multispectral composite bionic vision system comprising a neck first-level macro-motion pan-tilt, an eye second-level micro-motion pan-tilt, a thermal infrared camera and a visible light camera; setting a motion threshold η_N for the neck first-level macro-motion pan-tilt and a motion threshold η_E for the eye second-level micro-motion pan-tilt, introducing the visual axis rotation angle β of the multispectral composite bionic vision system, and controlling the system in stages according to β to realize head-eye coordinated movement control; extracting features of the moving target from the infrared image and the visible light image for coarse positioning and precise positioning respectively, estimating the state of the moving target in the new frame with a KCF tracking method, and introducing a proportion peak value as the judgment condition for the reliability of the tracking result, thereby realizing real-time online detection and tracking of the moving target.

Description

Bionic visual target detection and tracking method based on multispectral composition
Technical Field
The invention relates to a bionic visual target detection and tracking method based on multispectral compounding, belongs to the technical field of bionic robots, and can be applied in the fields of service robots and intelligent monitoring.
Background
In recent years, with the continuous integration of bionics and machine vision, bionic vision robot technology has been developing at an unprecedented speed. Constructing intelligent bionic eyes by referring to the imaging mechanism of the human visual system and realizing real-time detection and tracking of moving targets is a key research direction of bionic vision robotics, with broad application prospects in related fields such as intelligent video surveillance and service robot systems. Many researchers at home and abroad have explored and studied bionic vision robot target tracking systems from different angles and directions.
Northwestern University in the United States, in collaboration with several other institutions, simulated the distribution of cone and rod cells in the human retina and fabricated a hemispherical retina detector, realizing variable-resolution curved-surface imaging for the first time. Vizgaitis et al. of the U.S. Army Night Vision and Electronic Sensors Directorate, simulating human-eye image acquisition, designed a dual-band, multi-field-of-view infrared optical image acquisition system capable of all-weather, large-field-of-view target tracking. Wang Xiangyin et al. of Zhejiang University proposed a parallel bionic eye structure based on flexible driving and built a human-eye-like visual perception system from bionic pneumatic muscle actuators, a CMOS sensor and a three-dimensional acceleration sensor. To achieve robust infrared target tracking, Yang Fucai et al. of the School of Control Science and Engineering, Hebei University of Technology, proposed an infrared target tracking method based on sparse-coding histogram features and a disturbance perception model. To improve the computation speed of tracking algorithms, Henriques et al. of the University of Coimbra, Portugal, proposed the kernelized correlation filter (KCF) tracking algorithm, which collects positive and negative samples in the area surrounding the target using a circulant matrix and trains a target detector with ridge regression, meeting the real-time requirements of target tracking. Le Hao of the School of Electrical Engineering, Shandong University, simulated the movement of the human eyeball and developed a four-degree-of-freedom stepper-motor drive control system for bionic eyes that realizes the basic movement functions of a bionic eyeball. Liu Yi et al. of Dalian University of Technology imitated the movement of the human head, eyes and neck, designed a human-eye and human-neck visual system, and realized a static target positioning task for a robot. However, most existing research on target detection and tracking for bionic mobile robots is based on engineering methods: the single-channel target images obtained by the left and right cameras are processed separately, effective image information is not shared, and the left and right eyes and the head and neck lack an effective coordinated linkage mechanism, so the image information cannot be fully exploited.
In view of the above, the present invention considers the respective advantages and disadvantages of visible light images and infrared images in target detection and tracking applications, combines the effective feature information extracted from both, and adopts a head-eye coordinated motion control strategy to achieve real-time online detection and tracking of moving targets.
Disclosure of Invention
The invention discloses a bionic visual target detection and tracking method based on multispectral compounding, which addresses the problem that a single-band image sensor can hardly guarantee the accuracy of target detection and tracking in complex dynamic environments with changing illumination and temperature, by combining the inherent imaging characteristics of visible light images and infrared images. The method simulates the imaging characteristics of human cone and rod cells, combines the effective target information detected in the infrared and visible light images, and adopts a head-eye coordinated motion control strategy to realize real-time online detection and tracking of a moving target.
In order to achieve this purpose, the technical scheme of the invention is as follows: a bionic visual target detection and tracking method based on multispectral compounding comprises the following steps:
step 1, simulating the head-eye coordinated movement mechanism of a primate, and constructing a multispectral composite bionic vision system which comprises a neck first-level macro-motion pan-tilt, an eye second-level micro-motion pan-tilt, a thermal infrared camera, a visible light camera and a head-eye coordination control system;
step 2, setting a motion threshold η_N for the neck first-level macro-motion pan-tilt and a motion threshold η_E for the eye second-level micro-motion pan-tilt, introducing the visual axis rotation angle β of the multispectral composite bionic vision system, and controlling the system in stages according to β to realize head-eye coordinated movement control;
step 3, coarsely positioning the target in the image coordinate system using the thermal infrared camera; when the target lies in the central area of the image coordinate system, positioning the precise position of the target using the visible light camera and estimating the motion state of the target in the new frame with a KCF tracking method; otherwise, adjusting the pointing angle of the thermal infrared camera through the head-eye coordination control system and repeating step 3;
step 4, introducing a proportion peak value to judge the reliability of the tracking result for the motion state of the target in the new frame; if the reliability satisfies the judgment condition, feeding the target motion state in the new frame back to the head-eye coordination control system to complete real-time tracking of the moving target; otherwise, repeating step 3.
Further, the neck first-level macro-motion pan-tilt and the eye second-level micro-motion pan-tilt in step 1 are both 4-degree-of-freedom pan-tilts.
Further, the staged control of the multispectral composite bionic vision system in step 2 specifically comprises three cases (a sketch follows the list):
1) 0 < β < η_N: the neck first-level macro-motion pan-tilt completes the fixation point transfer task, with rotation angle η_N = β, and the eye second-level micro-motion pan-tilt is not required to participate in the motion;
2) η_N < β < η_E: the neck first-level macro-motion pan-tilt and the eye second-level micro-motion pan-tilt are required to move simultaneously to complete the fixation point transfer task, with the neck first-level macro-motion pan-tilt rotating by η_N and the eye second-level micro-motion pan-tilt rotating by η_E = β - η_N;
3) η_E < β: the neck first-level macro-motion pan-tilt and the eye second-level micro-motion pan-tilt are required to move simultaneously to complete the fixation point transfer task, with the neck rotation angle η_N = β - η_E and the eyeball rotation angle η_E.
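A minimal sketch of the staged allocation rule above follows, written in Python for illustration only. It assumes the three cases exactly as stated, with the threshold values used later in the experiments (η_N = 30°, η_E = 45°) as defaults; the function and variable names are not part of the original disclosure.

def split_rotation(beta, eta_n=30.0, eta_e=45.0):
    """Split a required visual-axis rotation beta (degrees) between the neck
    first-level macro-motion pan-tilt and the eye second-level micro-motion
    pan-tilt, following the three cases stated above.
    Returns (neck_angle, eye_angle) in degrees."""
    if beta < eta_n:            # case 1: the neck pan-tilt alone transfers the fixation point
        return beta, 0.0
    elif beta < eta_e:          # case 2: neck rotates by its threshold, eye covers the rest
        return eta_n, beta - eta_n
    else:                       # case 3: eye rotates by its threshold, neck covers the rest
        return beta - eta_e, eta_e

for b in (12.0, 38.0, 70.0):
    print(b, split_rotation(b))

With the experimental thresholds, for example, a 38° demand is met by rotating the neck to its 30° limit and the eye by the remaining 8°.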
Further, the step 3 specifically includes:
acquiring a large-field-of-view, low-resolution image with the thermal infrared camera, extracting CENTRIST features, and coarsely determining the position of the target in the image coordinate system with a trained linear SVM classifier;
calculating the rotation angle β required for the visual axis of the bionic vision system from the known camera focal length f and the offset Δx of the target center point from the image principal point,
β = arctan(Δx / f);
the head-eye coordination control system drives the two-stage pan-tilt to rotate in stages according to the visual axis rotation angle β, so that the target is kept in the central area of the infrared image;
acquiring a small-field-of-view, high-resolution image with the visible light camera, extracting HOG features from the current frame image, determining the precise position of the moving target, and estimating the state of the moving target in the new frame with the KCF tracking method.
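As an illustration of the coarse-positioning geometry above, the following Python sketch computes the visual-axis rotation angle from the pixel offset of the detected target center under the usual pinhole-camera assumption; the conversion of the pixel offset to sensor millimetres via a pixel pitch is an added assumption for the example and is not stated in the original text.

import math

def visual_axis_rotation_angle(target_center_x, principal_point_x,
                               focal_length_mm, pixel_pitch_mm):
    """Rotation angle beta (degrees) that brings the target onto the optical axis.
    Assumes a pinhole camera, i.e. beta = arctan(delta_x / f), with delta_x and f
    in the same metric units."""
    delta_x_px = target_center_x - principal_point_x   # offset in pixels
    delta_x_mm = delta_x_px * pixel_pitch_mm            # offset on the sensor
    return math.degrees(math.atan2(delta_x_mm, focal_length_mm))

# Illustrative numbers only: target detected 80 px right of the principal point,
# 17 um pixels behind a 25 mm lens.
beta = visual_axis_rotation_angle(400, 320, 25.0, 0.017)
print(f"required visual-axis rotation: {beta:.2f} deg")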
further, the step 4 specifically includes:
introducing a proportion peak value s to judge the reliability of the tracking result for the target motion state in the new frame, where a larger value of s indicates a more reliable tracking result:
s = (max f(z) - μ_Φ) / σ_Φ
wherein f(z) is the response value of the classifier on the new frame image, Φ is the 20% region of the response distribution map centered on the maximum value, and μ_Φ and σ_Φ are respectively the mean and the standard deviation within the region Φ;
the scale peaks were normalized:
Figure BDA0001928548570000052
wherein the normalized O_t takes values in [0, 1], and s_t is the proportion peak of the t-th frame;
setting a threshold θ as the criterion for classifier relocation, i.e., when the detected peak O_t of the classifier is smaller than the relocation judgment threshold θ, the tracking result is unreliable and the target must be relocated using the CENTRIST features of the infrared image; when the detected peak O_t of the classifier is larger than the relocation judgment threshold θ, the tracking result is valid, the HOG features of the detected target are extracted, positive and negative samples are selected around the detected target and added to the training set, and the SVM classifier is updated.
The target motion state in the new frame whose reliability satisfies the judgment condition is fed back to the head-eye coordination control system to complete real-time tracking of the moving target.
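A minimal Python sketch of the reliability check described above follows. The proportion peak s is computed as the peak response minus the mean of the 20% region centered on the peak, divided by that region's standard deviation, matching the quantities named above; the running-maximum normalization used to map s into [0, 1] is an assumption added for illustration, since the original normalization formula is given only as an image, and the default threshold of 0.55 is the value used in the experiments.

import numpy as np

def proportion_peak(response):
    """Proportion peak s of a response map:
    s = (max f(z) - mu_phi) / sigma_phi, where phi is the 20% of the map
    (by area) centered on the maximum response."""
    h, w = response.shape
    py, px = np.unravel_index(np.argmax(response), response.shape)
    side = 0.2 ** 0.5                                  # 20% of the area per axis
    wh, ww = max(1, int(h * side)), max(1, int(w * side))
    y0, y1 = max(0, py - wh // 2), min(h, py + wh // 2 + 1)
    x0, x1 = max(0, px - ww // 2), min(w, px + ww // 2 + 1)
    region = response[y0:y1, x0:x1]
    return (response[py, px] - region.mean()) / (region.std() + 1e-12)

class ReliabilityGate:
    """Decide between updating the tracker and relocating with the infrared image."""
    def __init__(self, theta=0.55):
        self.theta = theta      # relocation judgment threshold
        self.s_max = 1e-12      # running maximum, used for the assumed normalization
    def check(self, response):
        s = proportion_peak(response)
        self.s_max = max(self.s_max, s)
        o_t = s / self.s_max    # assumed normalization into [0, 1]
        return "update" if o_t > self.theta else "relocate"

# Illustrative use with a synthetic response map that has one sharp peak.
rng = np.random.default_rng(0)
resp = rng.random((64, 64)) * 0.1
resp[32, 32] = 1.0
print(ReliabilityGate().check(resp))    # expected: "update"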
The beneficial effects of the invention are as follows: the method fuses the acquired infrared and visible light multimodal image information and adopts a head-eye coordinated motion control strategy to detect and track the target, which overcomes the failures that single-band image detection and tracking methods suffer under complex conditions such as the target leaving the field of view, partial occlusion and interference from similar targets. Experiments show that, compared with traditional single-channel target tracking methods based on visible light and on infrared images, the success rate of the multispectral image-based target tracking method is improved by 13.6% and 7.8% respectively, and the robustness is high.
Drawings
FIG. 1 is a diagram of the multispectral composite bionic vision system;
FIG. 2 is a diagram of the dual-field-of-view multispectral imaging optical system;
FIG. 3 shows the precise positioning of a target in the small field of view;
FIG. 4 shows the coarse positioning of a target in the large field of view.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings.
In order to verify the accuracy of the bionic visual target detection and tracking method based on multispectral compounding and the effectiveness of the active vision control theory, the following experiments were carried out for specific analysis.
The multispectral composite bionic vision system shown in FIG. 1 was built from: a Belgian Xenics thermal infrared camera (model Raven-640-Analog); a Daheng Imaging Mercury-series color industrial camera (model MER-310-12UC); and a self-built two-stage, 4-degree-of-freedom bionic vision pan-tilt whose accuracy, verified by a Chinese metrology institute, is ±2.4′ for left-right rotation and ±0.6′ for up-down pitch. An STM32F4 development board serves as the pan-tilt motor control module; a ZYNQ-7000 series ZedBoard embedded development board, with a 766 MHz processor running Ubuntu 14.04, serves as the infrared image acquisition and processing module; and the main control PC, with an Intel(R) Core(TM) i7-4790K CPU @ 4.00 GHz, serves as the visible light image acquisition and processing module. In the experiments, the neck motion threshold η_N was set to 30°, the eye motion threshold η_E to 45°, and the relocation judgment threshold θ to 0.55. The dual-field-of-view multispectral imaging optical system of the multispectral composite bionic vision system is shown in FIG. 2.
To quantitatively evaluate the bionic visual target detection and tracking method based on multispectral compounding, the center position error and the success rate plot are used as quantitative indicators of tracking accuracy. Here, the center position error is defined as the average Euclidean distance between the center of the tracking display frame and the image center origin, and the success rate plot is defined as the percentage of frames in the detection video whose center position error satisfies the threshold, relative to the total number of frames. For comparison, the multispectral bionic visual target detection and tracking method is analyzed against visible-light-only and thermal-infrared-only single-channel detection and tracking methods. In the experiments, the visible light and infrared images were both resized to 640 × 480 pixels; the HOG feature was selected for target detection in the visible light image, as shown in FIG. 3; the CENTRIST feature was selected for target detection in the thermal infrared image, as shown in FIG. 4; the error threshold was set to 50 pixels; and the comparison was carried out in laboratory, corridor and meeting room scenes. The experimental results are shown in Table 1:
TABLE 1 quantitative comparison of target tracking accuracy for visible and infrared images
Figure BDA0001928548570000071
As can be seen from Table 1, compared with single-channel detection and tracking based on the visible light image alone or on the infrared image alone, the success rate of the multispectral bionic visual target detection and tracking method is improved by 13.6% and 7.8% respectively. Detection and tracking failures for the visible light image occur mainly under partial occlusion and when the target leaves the field of view, while failures for the infrared image occur mainly under interference from similar objects (such as a display or a radiator panel) and scale changes. A comparison of the running times of the three methods shows that, because the search range in the visible light image is reduced, the multispectral detection and tracking method takes only slightly more time than the thermal-infrared-based method and markedly less time than the visible-light-based method, and can basically meet the real-time requirements of target tracking.
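For clarity, the short Python sketch below computes the two evaluation metrics defined above, the center position error and the success rate at a given pixel threshold, for one sequence; the data values are illustrative and are not taken from the experiments.

import numpy as np

def center_position_error(track_centers, image_center):
    """Average Euclidean distance between tracked box centers and the image center."""
    diffs = np.asarray(track_centers, dtype=float) - np.asarray(image_center, dtype=float)
    return float(np.linalg.norm(diffs, axis=1).mean())

def success_rate(track_centers, image_center, threshold_px=50.0):
    """Fraction of frames whose center position error is within the pixel threshold."""
    diffs = np.asarray(track_centers, dtype=float) - np.asarray(image_center, dtype=float)
    errors = np.linalg.norm(diffs, axis=1)
    return float((errors <= threshold_px).mean())

# Tracked centers over five frames of a 640 x 480 image (illustrative only).
centers = [(322, 238), (330, 246), (350, 260), (410, 300), (331, 242)]
print(center_position_error(centers, (320, 240)))   # mean distance to the image center
print(success_rate(centers, (320, 240)))            # fraction of frames within 50 px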
The described embodiments are only some embodiments of the invention, not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the scope of the present invention.

Claims (5)

1. A bionic visual target detection and tracking method based on multispectral compounding is characterized by comprising the following steps:
step 1, simulating the head-eye coordinated movement mechanism of a primate, and constructing a multispectral composite bionic vision system which comprises a neck first-level macro-motion pan-tilt, an eye second-level micro-motion pan-tilt, a thermal infrared camera, a visible light camera and a head-eye coordination control system;
step 2, setting a motion threshold η_N for the neck first-level macro-motion pan-tilt and a motion threshold η_E for the eye second-level micro-motion pan-tilt, introducing the visual axis rotation angle β of the multispectral composite bionic vision system, and controlling the system in stages according to β to realize head-eye coordinated movement control;
step 3, coarsely positioning the target in the image coordinate system using the thermal infrared camera; when the target lies in the central area of the image coordinate system, positioning the precise position of the target using the visible light camera and estimating the motion state of the target in the new frame with a KCF tracking method; otherwise, adjusting the pointing angle of the thermal infrared camera through the head-eye coordination control system and repeating step 3;
step 4, introducing a proportion peak value to judge the reliability of the tracking result for the motion state of the target in the new frame; if the reliability satisfies the judgment condition, feeding the target motion state in the new frame back to the head-eye coordination control system to complete real-time tracking of the moving target; otherwise, repeating step 3.
2. The method according to claim 1, wherein the neck first-level macro-motion pan-tilt and the eye second-level micro-motion pan-tilt in step 1 are both 4-degree-of-freedom pan-tilts.
3. The method according to claim 1, wherein the staged control of the multispectral composite bionic vision system in step 2 specifically comprises three cases:
1) 0 < β < η_N: the neck first-level macro-motion pan-tilt completes the fixation point transfer task, with rotation angle η_N = β, and the eye second-level micro-motion pan-tilt is not required to participate in the motion;
2) η_N < β < η_E: the neck first-level macro-motion pan-tilt and the eye second-level micro-motion pan-tilt are required to move simultaneously to complete the fixation point transfer task, with the neck first-level macro-motion pan-tilt rotating by η_N and the eye second-level micro-motion pan-tilt rotating by η_E = β - η_N;
3) η_E < β: the neck first-level macro-motion pan-tilt and the eye second-level micro-motion pan-tilt are required to move simultaneously to complete the fixation point transfer task, with the neck rotation angle η_N = β - η_E and the eyeball rotation angle η_E.
4. The method according to claim 1, wherein the step 3 specifically comprises:
acquiring a large-field-of-view, low-resolution image with the thermal infrared camera, extracting CENTRIST features, and coarsely determining the position of the target in the image coordinate system with a trained linear SVM classifier;
calculating the rotation angle β required for the visual axis of the bionic vision system from the known camera focal length f and the offset Δx of the target center point from the image principal point,
β = arctan(Δx / f);
the head-eye coordination control system drives the two-stage pan-tilt to rotate in stages according to the visual axis rotation angle β, so that the target is kept in the central area of the infrared image;
acquiring a small-field-of-view, high-resolution image with the visible light camera, extracting HOG features from the current frame image, determining the precise position of the moving target, and estimating the state of the moving target in the new frame with the KCF tracking method.
5. The method according to claim 1, wherein the step 4 specifically comprises:
introducing a proportion peak value s to judge the reliability of the tracking result for the target motion state in the new frame, where a larger value of s indicates a more reliable tracking result:
s = (max f(z) - μ_Φ) / σ_Φ
wherein f(z) is the response value of the classifier on the new frame image, Φ is the 20% region of the response distribution map centered on the maximum value, and μ_Φ and σ_Φ are respectively the mean and the standard deviation within the region Φ;
the proportion peak is normalized:
Figure FDA0002684282820000032
wherein the normalized O_t takes values in [0, 1], and s_t is the proportion peak of the t-th frame;
setting a threshold θ as the classifier relocation judgment condition, i.e., when the detected peak O_t of the classifier is smaller than the relocation judgment threshold θ, the tracking result is unreliable and the target must be relocated using the CENTRIST features of the infrared image; when the detected peak O_t of the classifier is larger than the relocation judgment threshold θ, the tracking result is valid, the HOG features of the detected target are extracted, positive and negative samples are selected around the detected target and added to the training set, and the SVM classifier is updated;
and feeding back the target motion state in the new frame whose reliability satisfies the judgment condition to the head-eye coordination control system to complete real-time tracking of the moving target.
CN201811628870.8A 2018-12-28 2018-12-28 Bionic visual target detection and tracking method based on multispectral composition Active CN109974853B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811628870.8A CN109974853B (en) 2018-12-28 2018-12-28 Bionic visual target detection and tracking method based on multispectral composition


Publications (2)

Publication Number Publication Date
CN109974853A CN109974853A (en) 2019-07-05
CN109974853B true CN109974853B (en) 2020-12-04

Family

ID=67076545

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811628870.8A Active CN109974853B (en) 2018-12-28 2018-12-28 Bionic visual target detection and tracking method based on multispectral composition

Country Status (1)

Country Link
CN (1) CN109974853B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111292376B (en) * 2020-02-13 2023-06-30 北京理工大学 Visual target tracking method of bionic retina
CN112936348B (en) * 2021-03-04 2022-11-18 哈尔滨工业大学 Sensor integration device for sensing RGB-D-T information of intelligent robot
CN113033891B (en) * 2021-03-19 2022-01-18 河北水熠木丰工程技术有限责任公司 High pier bridge health monitoring method and device
CN115278055A (en) * 2022-06-24 2022-11-01 维沃移动通信有限公司 Shooting method, shooting device and electronic equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103903279A (en) * 2014-03-21 2014-07-02 上海大学 Parallel tracking system and method based on bionic binocular vision onboard platform
CN106646694A (en) * 2016-11-02 2017-05-10 北京信息科技大学 Bionic vision imaging technology based on visible light and near-infrared rays
CN108616744A (en) * 2017-01-23 2018-10-02 上海爱观视觉科技有限公司 A kind of bionical binocular vision calibration system and calibration method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Neuromimetic Robots Inspired by Insect Vision; Franceschini, Nicolas et al.; Mining Smartness from Nature; 2008-01-31; full text *
Multispectral visual image processing method based on human-eye-like adaptive adjustment; Song Yanming; Infrared and Laser Engineering; 2017-09-30; full text *

Also Published As

Publication number Publication date
CN109974853A (en) 2019-07-05

Similar Documents

Publication Publication Date Title
CN109974853B (en) Bionic visual target detection and tracking method based on multispectral composition
CN109308693A (en) By the target detection and pose measurement list binocular vision system of a ptz camera building
Cheng et al. Appearance-based gaze estimation via evaluation-guided asymmetric regression
CN110070074B (en) Method for constructing pedestrian detection model
WO2020172783A1 (en) Head posture tracking system used for transcranial magnetic stimulation diagnosis and treatment
CN109443206B (en) System and method for measuring tail end pose of mechanical arm based on color spherical light source target
KR20170103931A (en) Image identification system and identification method
CN106796449A (en) Eye-controlling focus method and device
CN105740846A (en) Horizontal visual angle estimation and calibration method based on depth camera
CN102125422A (en) Pupil center-corneal reflection (PCCR) based sight line evaluation method in sight line tracking system
CN110458025A (en) A kind of personal identification and localization method based on binocular camera
CN113591703B (en) Method for locating personnel in classroom and classroom integrated management system
CN111898553A (en) Method and device for distinguishing virtual image personnel and computer equipment
Qiu et al. An underwater micro cable-driven pan-tilt binocular vision system with spherical refraction calibration
Quan et al. Research on Fast Recognition and Localization of an Electric Vehicle Charging Port Based on a Cluster Template Matching Algorithm
CN105856201A (en) Three-degree-of-freedom robot vision servo platform
Weiss et al. Symmetric or not? A holistic approach to the measurement of fluctuating asymmetry from facial photographs
CN115082555A (en) High-precision displacement real-time measurement system and method of RGBD monocular camera
CN115585810A (en) Unmanned vehicle positioning method and device based on indoor global vision
CN109472797A (en) Aquaculture fish three-dimensional coordinate acquisition methods based on computer vision technique
Zhang et al. Object detection based on deep learning and b-spline level set in color images
Sakurai et al. A study on the gaze range calculation method during an actual car driving using eyeball angle and head angle information
Cui et al. Trajectory simulation of badminton robot based on fractal brown motion
CN112556655A (en) Forestry fire prevention monocular positioning method and system
Yu et al. An improved unscented kalman filtering combined with feature triangle for head position tracking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant