CN111340884A - Binocular heterogeneous camera and RFID dual target positioning and identity identification method


Info

Publication number
CN111340884A
Authority
CN
China
Prior art keywords
camera
binocular
target
coordinate system
image
Prior art date
Legal status
Granted
Application number
CN202010110245.5A
Other languages
Chinese (zh)
Other versions
CN111340884B (en)
Inventor
周海波
王硕
李晨铭
陈胜勇
杨守瑞
赵萌
沈创芸
刘彪
韩慧轩
刘淑键
Current Assignee
Tianjin University of Technology
Original Assignee
Tianjin University of Technology
Priority date
Filing date
Publication date
Application filed by Tianjin University of Technology filed Critical Tianjin University of Technology
Priority to CN202010110245.5A priority Critical patent/CN111340884B/en
Publication of CN111340884A publication Critical patent/CN111340884A/en
Application granted granted Critical
Publication of CN111340884B publication Critical patent/CN111340884B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 7/73 — Image analysis; determining position or orientation of objects or cameras using feature-based methods
    • G06T 5/80
    • G06T 7/85 — Analysis of captured images to determine intrinsic or extrinsic camera parameters (camera calibration); stereo camera calibration
    • G06T 2207/10016 — Image acquisition modality: video; image sequence

Abstract

The invention discloses a binocular heterogeneous camera and RFID dual target positioning and identity identification method. On the basis of combining RFID positioning information with binocular-camera stereoscopic-vision positioning information, the method links a target's position information, identity information and image information, so that the position and identity of a target are obtained at the same time as its image is collected. This makes up for the inability of vision alone to accurately identify individual livestock and poultry, and also for the low positioning accuracy and lack of video images when an electronic tag is used alone.

Description

Binocular heterogeneous camera and RFID dual target positioning and identity identification method
Technical Field
The invention belongs to the technical field of target positioning and identity identification, and particularly relates to a binocular heterogeneous camera and RFID dual target positioning and identity identification method.
Background
Target positioning and identity identification technology has wide application value, particularly in the livestock and poultry breeding industry. Because the movement patterns of individual livestock and poultry are unpredictable, their physical condition currently has to be monitored by manual observation, so the invention has great application value in this setting.
Existing target positioning and identity recognition schemes typically rely on a single technology. For example, in the livestock and poultry breeding industry, electronic tags are used to identify individual animals, and additional sensors such as attitude sensors, pedometers and thermometers are built into the tags to monitor the animals' activity; in personnel-flow monitoring, a surveillance camera acquires image information, and the people in the images are located and identified by face recognition to obtain their identity information and activity status.
Both solutions have technical drawbacks: the former can accurately identify the target's identity but cannot provide video images of it, and obtaining more information requires adding more sensors to the electronic tag, which inevitably increases the tag's size, power consumption and cost; the latter provides information-rich video images of the target, but because face recognition technology is not yet mature enough, the accuracy of identity recognition cannot be guaranteed.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a binocular heterogeneous camera and RFID dual target positioning and identity identification method.
The invention is realized by the following technical scheme:
a binocular heterogeneous camera and RFID dual target positioning and identity identification method comprises the following steps:
step S1, calibrating the binocular heterogeneous camera, establishing the mapping relation between the image coordinate system and the binocular heterogeneous camera coordinate system, and obtaining the intrinsic parameter matrix, extrinsic parameter matrix and distortion parameters relating the two coordinate systems;
step S2, obtaining the pose transformation matrix from the camera coordinate system to the farm coordinate system from the pose transformation between the camera and the robot and the pose transformation between the robot and the farm, and using this matrix to convert binocular-camera three-dimensional coordinates into farm three-dimensional coordinates;
step S3, collecting binocular images, detecting the regions of the targets in each image of the binocular camera, matching the same target in the two images into a group, and finally extracting the local image of each grouped target;
step S4, calculating the camera world coordinates of the target from the target image coordinates and the homography matrix between the image coordinate system and the world coordinate system, and then converting the camera world coordinates into farm world coordinates;
step S5, obtaining the coordinates of the target in the farm world coordinate system and its identity (ID) information from the RFID positioning information;
and step S6, matching the farm world coordinates calculated by binocular vision against the farm world coordinates obtained by RFID, associating targets whose coordinates are identical or close as the same target, and thereby linking the position information, identity information and image information of each target together.
In the above technical solution, in step S1, the binocular camera calibration process mainly comprises two stages: monocular camera calibration and binocular (stereo) calibration. Monocular camera calibration yields the camera intrinsic matrix

$$K=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

(focal lengths fx, fy; principal point cx, cy) and the distortion parameters: radial distortion coefficients k1, k2, k3 and tangential distortion coefficients p1, p2;

binocular (stereo) calibration yields the extrinsic parameter matrix of the camera pair, which consists of a rotation matrix Rot and a translation vector Trans and can be written in homogeneous form as

$$T=\begin{bmatrix} \mathrm{Rot} & \mathrm{Trans} \\ \mathbf{0}^{\mathsf T} & 1 \end{bmatrix}$$
in the above technical solution, in step S2, different coordinate systems are established for the farm patrol environment, including a camera coordinate system C, a robot coordinate system R, and a world coordinate system w of the farm as a whole;
establishing a pose matrix in each coordinate system respectively, wherein the pose matrix comprises a rotation matrix Rot and a translation matrix Trans, and a conversion formula from a camera coordinate system to a farm coordinate system is expressed as follows:
Figure BDA0002389755060000023
in the above technical solution, in step S3, firstly, the image collected by the binocular camera is subjected to distortion removal by using the distortion parameter specified in step S1, and the image is used as an input image; then, carrying out target detection on the input image by using a target detection neural network detection model to obtain the region of the target in the image; and finally, performing feature extraction and matching on the targets detected in the two images, finding out the corresponding relation of the targets, and obtaining the coordinates of the target images.
In the above technical solution, in step S4, the images whose distortion was corrected in step S3 are first stereo-rectified, so that the two undistorted images are strictly aligned in the horizontal direction and their epipolar lines lie on the same horizontal line; any point in one image and its matching point in the other image then have the same row index, and corresponding points can be found by a one-dimensional search along that row;

then, three-dimensional reconstruction is performed on the target image coordinates obtained in step S3 to obtain the coordinates of the target in the binocular camera coordinate system;

and finally, the coordinates of the target in the binocular camera coordinate system are converted into the farm coordinate system.
The invention has the advantages and beneficial effects that:
the invention establishes the relation between the position information, the identity information and the image information of the target on the basis of combining the RFID positioning information and the binocular camera stereoscopic vision positioning information, thereby acquiring the position and the identity information of the target while acquiring the target image. The method makes up the defects that the identity of the livestock and the poultry cannot be accurately identified by means of vision alone, and also makes up the defects that the positioning precision is low and a video image cannot be obtained by means of an electronic tag alone.
Drawings
Fig. 1 is a topological diagram of the binocular heterogeneous camera and RFID dual target location and identity recognition method of the present invention.
Fig. 2 is a schematic pose diagram of the camera coordinate system to the farm coordinate system.
Fig. 3 is an explanatory diagram of step S3, taking vehicle detection as an example.
Fig. 4 is a schematic diagram of obtaining the target's coordinates in the farm world coordinate system in step S5.
Fig. 5 is a schematic diagram of step S6.
For a person skilled in the art, other relevant figures can be obtained from the above figures without inventive effort.
Detailed Description
In order to make the technical solution of the present invention better understood, the technical solution of the present invention is further described below with reference to specific examples.
A binocular heterogeneous camera and RFID dual target positioning and identity identification method is shown in figure 1 and specifically comprises the following steps:
and step S1, calibrating the binocular camera, establishing a mapping relation between an image coordinate system and a binocular camera coordinate system, and obtaining an internal reference matrix, an external reference matrix and distortion parameters of the image coordinate system and the binocular camera coordinate system.
In the binocular camera calibration process, the method mainly comprises two processes of monocular camera calibration and binocular camera calibration. Obtaining through monocular camera calibration: camera internal reference matrix
Figure BDA0002389755060000041
(focal length: fx, fy; principal point: cx, cy); and distortion parameter: radial distortion coefficients k1, k2, k 3; tangential distortion coefficient: p1, p 2;
obtaining an external parameter matrix of the camera through binocular camera calibration, wherein the external parameter matrix comprises a rotation matrix Rot and a translation matrix Trans, and the external parameter matrix comprises the following formula:
Figure BDA0002389755060000042
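As an illustration of this two-stage calibration, the sketch below uses OpenCV's standard checkerboard routines; the pattern size, square size and variable names are assumptions made for the example, not parameters taken from the patent, and for a heterogeneous (e.g. infrared plus color) pair the calibration board must of course be visible to both cameras.

```python
import cv2
import numpy as np

# Assumed checkerboard geometry (inner corners and square size) -- illustrative only.
PATTERN = (9, 6)
SQUARE = 0.025  # metres

def calibrate_stereo(left_imgs, right_imgs):
    """Monocular calibration of each camera, then stereo calibration for Rot/Trans."""
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

    obj_pts, pts_l, pts_r, size = [], [], [], None
    for img_l, img_r in zip(left_imgs, right_imgs):
        gray_l = cv2.cvtColor(img_l, cv2.COLOR_BGR2GRAY)
        gray_r = cv2.cvtColor(img_r, cv2.COLOR_BGR2GRAY)
        size = gray_l.shape[::-1]
        ok_l, c_l = cv2.findChessboardCorners(gray_l, PATTERN)
        ok_r, c_r = cv2.findChessboardCorners(gray_r, PATTERN)
        if ok_l and ok_r:
            obj_pts.append(objp)
            pts_l.append(c_l)
            pts_r.append(c_r)

    # Monocular calibration: intrinsic matrix K (fx, fy, cx, cy) and distortion (k1, k2, p1, p2, k3).
    _, K_l, D_l, _, _ = cv2.calibrateCamera(obj_pts, pts_l, size, None, None)
    _, K_r, D_r, _, _ = cv2.calibrateCamera(obj_pts, pts_r, size, None, None)

    # Binocular (stereo) calibration: rotation Rot and translation Trans between the two cameras.
    _, K_l, D_l, K_r, D_r, Rot, Trans, _, _ = cv2.stereoCalibrate(
        obj_pts, pts_l, pts_r, K_l, D_l, K_r, D_r, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_l, D_l, K_r, D_r, Rot, Trans
```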
and step S2, determining pose transformation between the camera and the robot and pose transformation between the robot and the farm, acquiring a pose transformation matrix from a camera coordinate system to a farm coordinate system, and converting the binocular camera three-dimensional coordinate to the farm three-dimensional coordinate by using the pose transformation matrix.
Specifically, the method comprises the following steps: in order to locate the target in the camera in the farm, different coordinate systems are required to be established for the farm patrol environment, including a camera coordinate system C, a robot coordinate system R and a world coordinate system w of the whole farm.
And respectively establishing a pose matrix in each coordinate system, wherein the pose matrix comprises a rotation matrix Rot and a translation matrix Trans.
The pose from the camera coordinate system to the farm coordinate system is shown in fig. 2 and is represented by the following formula:
Figure BDA0002389755060000043
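A minimal sketch of composing this pose chain with homogeneous matrices is given below; the rotation and translation values are placeholders for illustration only, not values from the patent.

```python
import numpy as np

def pose(rot, trans):
    """Build a 4x4 homogeneous pose matrix from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rot
    T[:3, 3] = trans
    return T

# Placeholder poses: robot pose in the farm frame (W <- R) and camera pose on the robot (R <- C).
T_w_r = pose(np.eye(3), [2.0, 3.0, 0.0])   # robot at (2, 3, 0) in the farm frame
T_r_c = pose(np.eye(3), [0.1, 0.0, 0.5])   # camera mounted 0.5 m above the robot base

# Chain the poses: farm <- robot <- camera.
T_w_c = T_w_r @ T_r_c

# Convert a 3-D point from the binocular-camera frame into farm coordinates.
p_cam = np.array([0.4, -0.2, 3.5, 1.0])    # homogeneous point in the camera frame
p_farm = T_w_c @ p_cam
print(p_farm[:3])
```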
step S3, collecting binocular images, detecting the areas of a plurality of targets in the images of the binocular camera by using an image detection technology, matching the same targets in the two images into a group by using an image matching technology, and finally extracting the local images of the targets in each group.
Specifically, the method comprises the following steps: firstly, removing distortion of an image acquired by a binocular camera by using the distortion parameter marked in the step S1 to obtain an input image; then, carrying out target detection on the input image by using a target detection neural network detection model to obtain the region of the target in the image; and finally, carrying out feature extraction and matching on the targets detected in the two images, finding out the corresponding relation of the targets, and obtaining a plurality of groups of target image coordinates subjected to distortion removal. Taking vehicle detection as an example, the process is shown in fig. 3.
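A hedged sketch of this detect-then-match stage follows; detect_targets stands in for whichever detection network is used (the patent does not name one), and ORB descriptors are just one possible choice for the cross-view matching — as noted later, a heterogeneous camera pair may need different feature methods for the two views.

```python
import cv2

def detect_targets(image):
    """Stand-in for the target-detection neural network of step S3. The patent does not
    name a specific model, so this placeholder just returns one full-frame box (x, y, w, h);
    replace it with a real detector in practice."""
    h, w = image.shape[:2]
    return [(0, 0, w, h)]

def match_targets(img_l, img_r, K_l, D_l, K_r, D_r):
    """Undistort both views, detect targets in each, and pair up the same physical target."""
    # 1. Undistort with the parameters obtained in step S1.
    und_l = cv2.undistort(img_l, K_l, D_l)
    und_r = cv2.undistort(img_r, K_r, D_r)
    gray_l = cv2.cvtColor(und_l, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(und_r, cv2.COLOR_BGR2GRAY)

    # 2. Detect candidate targets in each view.
    boxes_l = detect_targets(und_l)
    boxes_r = detect_targets(und_r)

    # 3. Match detections across views by comparing local feature descriptors (ORB here).
    orb = cv2.ORB_create()
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    pairs = []
    for (xl, yl, wl, hl) in boxes_l:
        _, des_l = orb.detectAndCompute(gray_l[yl:yl + hl, xl:xl + wl], None)
        best, best_score = None, 0
        for (xr, yr, wr, hr) in boxes_r:
            _, des_r = orb.detectAndCompute(gray_r[yr:yr + hr, xr:xr + wr], None)
            if des_l is None or des_r is None:
                continue
            score = len(bf.match(des_l, des_r))
            if score > best_score:
                best, best_score = (xr, yr, wr, hr), score
        if best is not None:
            pairs.append(((xl, yl, wl, hl), best))  # same target grouped across the two views
    return pairs
```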
Step S4, calculating the camera world coordinates of the targets from the several groups of undistorted target image coordinates and the homography matrix between the image coordinate system and the world coordinate system, and then converting the camera world coordinates into farm world coordinates.

Specifically: first, the images whose distortion was corrected in step S3 are stereo-rectified, so that the two undistorted images are strictly aligned in the horizontal direction and their epipolar lines lie on the same horizontal line; any point in one image and its matching point in the other image then have the same row index, and corresponding points can be found by a one-dimensional search along that row.

Then, three-dimensional reconstruction is performed on the matched target image coordinates from step S3 to obtain the coordinates of the target in the binocular camera coordinate system.

Finally, the coordinates in the camera coordinate system are converted into the farm coordinate system according to the conversion formula of step S2.
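A sketch of this rectify-triangulate-transform pipeline with OpenCV is shown below; it is an illustrative sketch under the assumption that the step-S1 calibration results and the step-S2 camera-to-farm pose T_w_c are available, not the patent's exact implementation.

```python
import cv2
import numpy as np

def locate_in_farm(pt_l, pt_r, K_l, D_l, K_r, D_r, Rot, Trans, size, T_w_c):
    """Triangulate one matched image-point pair and express it in the farm frame."""
    # Stereo rectification: aligns the epipolar lines of both views with the image rows,
    # so corresponding points share a row and can be found by a 1-D search.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K_l, D_l, K_r, D_r, size, Rot, Trans)

    # Map the matched pixel coordinates into the rectified frames.
    pl = cv2.undistortPoints(np.array([[pt_l]], np.float32), K_l, D_l, R=R1, P=P1)
    pr = cv2.undistortPoints(np.array([[pt_r]], np.float32), K_r, D_r, R=R2, P=P2)

    # Three-dimensional reconstruction: triangulate to homogeneous coordinates in the
    # rectified left-camera frame.
    X_h = cv2.triangulatePoints(P1, P2, pl.reshape(2, 1), pr.reshape(2, 1))
    X_cam = (X_h[:3] / X_h[3]).ravel()

    # Undo the rectifying rotation to return to the original left-camera frame, then
    # apply the camera-to-farm pose T_w_c from step S2.
    X_cam = np.linalg.inv(R1) @ X_cam
    X_farm = T_w_c @ np.append(X_cam, 1.0)
    return X_farm[:3]
```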
Step S5, the coordinates of any target in the farm world coordinate system, together with its identity (ID) information, can be obtained through existing electronic-tag positioning technologies such as Bluetooth, UWB, RFID or WIFI.

For example, referring to fig. 4, the distances from the target are measured from known reference points at (x1, y1) and (x2, y2); the intersection of the two corresponding circles yields two candidate positions, and measuring the distance from a third reference point at (x3, y3) resolves the ambiguity and gives the final two-dimensional coordinates of the target on the farm. In this way the two-dimensional coordinates of all tagged livestock and poultry on the farm are obtained.
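This two-circles-plus-a-third-reference idea can be sketched as a simple least-squares trilateration; the reference-point coordinates and measured distances in the example are purely illustrative, not values from the patent.

```python
import numpy as np

def trilaterate(anchors, dists):
    """Estimate a 2-D tag position from distances to three or more fixed reference points.

    anchors: (N, 2) array of reference coordinates (x1, y1), (x2, y2), (x3, y3), ...
    dists:   (N,)  array of measured tag-to-reference distances.
    Subtracting the first circle equation from the others yields a linear system A p = b.
    """
    anchors = np.asarray(anchors, float)
    dists = np.asarray(dists, float)
    x0, y0, d0 = anchors[0, 0], anchors[0, 1], dists[0]
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d0 ** 2 - dists[1:] ** 2
         + anchors[1:, 0] ** 2 - x0 ** 2
         + anchors[1:, 1] ** 2 - y0 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos  # (x, y) of the tagged animal in the farm frame

# Illustrative reference points and distances: the tag comes out at roughly (5, 5).
print(trilaterate([(0, 0), (10, 0), (0, 10)], [7.07, 7.07, 7.07]))
```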
Step S6, the coordinates of the target in the farm world coordinate system obtained in step S4 are reduced to two-dimensional farm coordinates and matched against the coordinates obtained in step S5; targets whose coordinates are identical or close are associated as the same target, so that the position information and identity information of each target are linked with its image information.
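This association can be sketched as a nearest-neighbour match between the vision-derived and RFID-derived 2-D coordinates; the distance threshold below is an assumed parameter, not one specified by the patent.

```python
import numpy as np

def match_vision_to_rfid(vision_xy, rfid_xy, rfid_ids, max_dist=1.0):
    """Associate each binocular-vision target with the nearest RFID tag.

    vision_xy: (N, 2) 2-D farm coordinates of targets from binocular vision (height dropped).
    rfid_xy:   (M, 2) 2-D farm coordinates of tags from RFID positioning.
    rfid_ids:  length-M list of tag identities.
    Returns a list of (vision_index, tag_id) links; unmatched targets map to None.
    """
    vision_xy = np.asarray(vision_xy, float)
    rfid_xy = np.asarray(rfid_xy, float)
    links = []
    for i, p in enumerate(vision_xy):
        d = np.linalg.norm(rfid_xy - p, axis=1)
        j = int(np.argmin(d))
        links.append((i, rfid_ids[j] if d[j] <= max_dist else None))
    return links

# Illustrative call: two detected animals, three tags read on the farm.
print(match_vision_to_rfid([(2.1, 3.0), (8.4, 1.2)],
                           [(2.0, 3.1), (5.0, 5.0), (8.5, 1.0)],
                           ["tag-017", "tag-042", "tag-063"]))
```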
Furthermore, because existing RFID positioning technology still has a certain error, the information of the two electronic tags nearest to a target is obtained from repeated measurements. If both of these tags are very close to the target, it cannot be reliably decided which tag the target point corresponds to, so a confidence value is introduced to adapt the judgment to different environments. The confidence of each candidate match is obtained experimentally as a function of d_max and d_min (the farthest and nearest measured positions, respectively), and the corresponding confidence function is obtained by curve-fitting the data measured in different environments.

The confidences of the two nearest electronic tags, together with their stored information such as the tag IDs, are sent back to the computer, so that the corresponding information can be overlaid on the target livestock or poultry in the video image, as shown in fig. 5.
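The patent obtains its confidence function by curve-fitting experimental data and does not reproduce it here, so the sketch below only illustrates the general idea with an assumed linear normalisation between d_min and d_max.

```python
def match_confidence(d_tag, d_min, d_max):
    """Illustrative confidence in [0, 1]: highest when the tag is the unambiguous nearest one.

    d_tag: distance between the vision-derived target position and a candidate tag position.
    d_min, d_max: nearest and farthest distances observed in the calibration experiments.
    The real function in the patent is obtained by curve fitting measured data; this linear
    ramp is only an assumed stand-in with the same qualitative behaviour.
    """
    if d_max <= d_min:
        return 1.0
    c = (d_max - d_tag) / (d_max - d_min)
    return max(0.0, min(1.0, c))

# The confidences of the two nearest tags are then sent back with their IDs and overlaid
# on the target animal in the video image (illustrative values).
print(match_confidence(0.3, d_min=0.1, d_max=2.0))
print(match_confidence(1.8, d_min=0.1, d_max=2.0))
```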
This overcomes the inability of vision alone to accurately identify individual livestock and poultry, and also the low positioning accuracy and lack of video images when an electronic tag is used alone.
Further, the binocular system may be composed of heterogeneous cameras, such as an infrared camera and a color camera, or of two identical cameras, such as two color cameras. What the two configurations have in common is that both require the two cameras to share a certain overlapping field of view to meet the requirements of binocular positioning. They differ in that: (1) the left and right cameras of a binocular system composed of identical cameras can use the same detection method, whereas the left and right cameras of a binocular system composed of heterogeneous cameras need different detection methods; (2) in feature matching, the feature extraction and measurement methods for the left and right images are the same for identical cameras but different for heterogeneous cameras.
Further, the RFID positioning system may be implemented with a variety of technologies, including positioning based on received signal strength (RSSI), time of flight (TOF) or time difference of arrival (TDOA), but the electronic-tag positioning technology used must satisfy the following conditions: (1) the target position can be located; (2) the tag number (ID) can be acquired; (3) the positioning range of the tags covers the positioning range of the binocular camera; and (4) the positioning speed and precision of the tags meet the requirements of the actual application.
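For instance, an RSSI-based variant typically estimates the tag-to-reader distance with a log-distance path-loss model before the trilateration of step S5; the reference power and path-loss exponent below are assumed example values, not parameters from the patent.

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-45.0, path_loss_exp=2.2):
    """Log-distance path-loss model: estimated distance in metres from a received signal strength."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exp))

# Example: a reading of -60 dBm maps to roughly 4.8 m with these assumed parameters.
print(round(rssi_to_distance(-60.0), 2))
```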
The invention has been described in an illustrative manner, and it is to be understood that any simple variations, modifications or other equivalent changes which can be made by one skilled in the art without departing from the spirit of the invention fall within the scope of the invention.

Claims (5)

1. A binocular heterogeneous camera and RFID dual target positioning and identity identification method, characterized in that the method comprises the following steps:
step S1, calibrating the binocular heterogeneous camera, establishing the mapping relation between the image coordinate system and the binocular heterogeneous camera coordinate system, and obtaining the intrinsic parameter matrix, extrinsic parameter matrix and distortion parameters relating the two coordinate systems;
step S2, obtaining the pose transformation matrix from the camera coordinate system to the farm coordinate system from the pose transformation between the camera and the robot and the pose transformation between the robot and the farm, and using this matrix to convert binocular-camera three-dimensional coordinates into farm three-dimensional coordinates;
step S3, collecting binocular images, detecting the regions of the targets in each image of the binocular camera, matching the same target in the two images into a group, and finally extracting the local image of each grouped target;
step S4, calculating the camera world coordinates of the target from the target image coordinates and the homography matrix between the image coordinate system and the world coordinate system, and then converting the camera world coordinates into farm world coordinates;
step S5, obtaining the coordinates of the target in the farm world coordinate system and its identity (ID) information from the RFID positioning information;
and step S6, matching the farm world coordinates calculated by binocular vision against the farm world coordinates obtained by RFID, associating targets whose coordinates are identical or close as the same target, and thereby linking the position information, identity information and image information of each target together.
2. The binocular heterogeneous camera and RFID dual target positioning and identity recognition method according to claim 1, wherein in step S1 the binocular camera calibration process mainly comprises two stages: monocular camera calibration and binocular (stereo) calibration; monocular camera calibration yields the camera intrinsic matrix

$$K=\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$

(focal lengths fx, fy; principal point cx, cy) and the distortion parameters: radial distortion coefficients k1, k2, k3 and tangential distortion coefficients p1, p2;

and binocular (stereo) calibration yields the extrinsic parameter matrix of the camera pair, which consists of a rotation matrix Rot and a translation vector Trans and can be written in homogeneous form as

$$T=\begin{bmatrix} \mathrm{Rot} & \mathrm{Trans} \\ \mathbf{0}^{\mathsf T} & 1 \end{bmatrix}$$
3. The binocular heterogeneous camera and RFID dual target positioning and identity recognition method according to claim 1, wherein in step S2 different coordinate systems are established for the farm patrol environment, including the camera coordinate system C, the robot coordinate system R and the world coordinate system W of the farm as a whole;

a pose matrix consisting of a rotation matrix Rot and a translation vector Trans is established for each coordinate system, and the conversion from the camera coordinate system to the farm coordinate system is expressed as

$${}^{W}T_{C}={}^{W}T_{R}\,{}^{R}T_{C},\qquad {}^{X}T_{Y}=\begin{bmatrix} \mathrm{Rot} & \mathrm{Trans} \\ \mathbf{0}^{\mathsf T} & 1 \end{bmatrix}$$
4. The binocular heterogeneous camera and RFID dual target positioning and identity recognition method according to claim 1, wherein in step S3 the images captured by the binocular camera are first undistorted using the distortion parameters obtained in step S1 to give the input images; then a target-detection neural network model is applied to the input images to obtain the region of each target in the image; and finally, feature extraction and matching are performed on the targets detected in the two images to find their correspondence and obtain the target image coordinates.
5. The binocular heterogeneous camera and RFID dual target positioning and identity recognition method according to claim 4, wherein in step S4 the images whose distortion was corrected in step S3 are first stereo-rectified so that the two undistorted images are strictly aligned in the horizontal direction and their epipolar lines lie on the same horizontal line; any point in one image and its matching point in the other image then have the same row index, and corresponding points can be found by a one-dimensional search along that row;

then, three-dimensional reconstruction is performed on the target image coordinates obtained in step S3 to obtain the coordinates of the target in the binocular camera coordinate system;

and finally, the coordinates of the target in the binocular camera coordinate system are converted into the farm coordinate system.
CN202010110245.5A 2020-02-24 2020-02-24 Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID Active CN111340884B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010110245.5A CN111340884B (en) 2020-02-24 2020-02-24 Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010110245.5A CN111340884B (en) 2020-02-24 2020-02-24 Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID

Publications (2)

Publication Number Publication Date
CN111340884A (en) 2020-06-26
CN111340884B (en) 2023-07-07

Family

ID=71181849

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010110245.5A Active CN111340884B (en) 2020-02-24 2020-02-24 Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID

Country Status (1)

Country Link
CN (1) CN111340884B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112288801A (en) * 2020-10-30 2021-01-29 天津理工大学 Four-in-one self-adaptive tracking shooting method and device applied to inspection robot
CN112991465A (en) * 2021-03-26 2021-06-18 禾多科技(北京)有限公司 Camera calibration method and device, electronic equipment and computer readable medium


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050257748A1 (en) * 2002-08-02 2005-11-24 Kriesel Marshall S Apparatus and methods for the volumetric and dimensional measurement of livestock
CN102902271A (en) * 2012-10-23 2013-01-30 上海大学 Binocular vision-based robot target identifying and gripping system and method
CN103884280A (en) * 2014-03-14 2014-06-25 中国农业大学 Mobile system for monitoring body sizes and weights of pigs in multiple pigsties
CN105432486A (en) * 2015-11-03 2016-03-30 内蒙古农业大学 Feeding detection system and feeding detection method thereof
CN105784083A (en) * 2016-04-05 2016-07-20 北京农业信息技术研究中心 Cow shape measuring method and cow shape measuring system based on stereo vision technology
CN107527075A (en) * 2016-06-20 2017-12-29 杭州海康威视数字技术股份有限公司 RFID label tag is established with personnel's corresponding relation and trajectory track method and device
CN105973228A (en) * 2016-06-28 2016-09-28 江苏环亚医用科技集团股份有限公司 Single camera and RSSI (received signal strength indication) based indoor target positioning system and method
CN108010085A (en) * 2017-11-30 2018-05-08 西南科技大学 Target identification method based on binocular Visible Light Camera Yu thermal infrared camera
CN108614980A (en) * 2018-04-16 2018-10-02 西南科技大学 A kind of the dynamic object positioning system and method for combining RFID and laser intelligence
CN108875647A (en) * 2018-06-22 2018-11-23 成都睿畜电子科技有限公司 A kind of motion track monitoring method and system based on livestock identity
CN109258479A (en) * 2018-09-21 2019-01-25 贵州民族大学 Milk production of cow automatic monitoring system based on image recognition
CN110597333A (en) * 2019-10-24 2019-12-20 任明乐 Pig house environmental monitoring system based on thing networking

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
刘帅等 (Liu Shuai et al.), 《GIS三维建模方法》 [GIS 3D Modeling Methods], China Science and Technology Press (中国科学技术出版社), 31 March 2016, pages 27-30 *
刘煜等 (Liu Yu et al.), 《阵列相机成像技术与应用》 [Array Camera Imaging Technology and Applications], National University of Defense Technology Press (国防科技大学出版社), 30 April 2018, pages 28-29 *
赵小川 (Zhao Xiaochuan), 《MATLAB图像处理—程序实现与模块化仿真(第2版)》 [MATLAB Image Processing: Program Implementation and Modular Simulation, 2nd ed.], Beihang University Press (北京航空航天大学出版社), 31 December 2018, pages 245-247 *
郭宝龙等 (Guo Baolong et al.), 《数字图像处理系统工程导论》 [Introduction to Digital Image Processing Systems Engineering], Xidian University Press (西安电子科技大学出版社), 31 July 2012, pages 155-158 *
陆新征等 (Lu Xinzheng et al.), 《第25届全国结构工程学术会议论文集 第3册》 [Proceedings of the 25th National Conference on Structural Engineering, Vol. 3], Engineering Mechanics Press (《工程力学》杂志社), 31 August 2016, pages 493-494 *


Also Published As

Publication number Publication date
CN111340884B (en) 2023-07-07

Similar Documents

Publication Publication Date Title
CN110426051B (en) Lane line drawing method and device and storage medium
CN103411553B (en) The quick calibrating method of multi-linear structured light vision sensors
CN111340797A (en) Laser radar and binocular camera data fusion detection method and system
CN107657639A (en) A kind of method and apparatus of quickly positioning target
CN101441769A (en) Real time vision positioning method of monocular camera
Momeni-k et al. Height estimation from a single camera view
CN111340884B (en) Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID
CN105307115A (en) Distributed vision positioning system and method based on action robot
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN112033408B (en) Paper-pasted object space positioning system and positioning method
CN112518748B (en) Automatic grabbing method and system for visual mechanical arm for moving object
CN110930382A (en) Point cloud splicing precision evaluation method and system based on calibration plate feature point extraction
CN106682579B (en) Unmanned aerial vehicle binocular vision image processing system for detecting icing of power transmission line
CN113988228B (en) Indoor monitoring method and system based on RFID and vision fusion
CN100553349C (en) Determine the method for target topological relation and the camera calibration target that can put arbitrarily
CN111932617B (en) Method and system for realizing real-time detection and positioning of regular objects
CN109635692B (en) Scene re-identification method based on ultrasonic sensor
US20240085448A1 (en) Speed measurement method and apparatus based on multiple cameras
CN115457130A (en) Electric vehicle charging port detection and positioning method based on depth key point regression
CN112884832B (en) Intelligent trolley track prediction method based on multi-view vision
CN114937233A (en) Identification method and identification device based on multispectral data deep learning
CN111899289B (en) Infrared image and visible light image registration method based on image characteristic information
Kheng et al. Stereo vision with 3D coordinates for robot arm application guide
CN110135238B (en) Markless Internet of things equipment identification method based on mobile AR
CN110853080A (en) Method for measuring size of field fruit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant