CN111383264B - Positioning method, positioning device, terminal and computer storage medium - Google Patents


Info

Publication number
CN111383264B
Authority
CN
China
Prior art keywords
infrared
target
image
dimensional coordinate
mapping model
Prior art date
Legal status
Active
Application number
CN201811636299.4A
Other languages
Chinese (zh)
Other versions
CN111383264A
Inventor
熊友军
庞建新
张惊涛
张万里
Current Assignee
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date
Filing date
Publication date
Application filed by Ubtech Robotics Corp
Priority to CN201811636299.4A
Publication of CN111383264A
Application granted
Publication of CN111383264B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The invention is applicable to the technical field of computer applications, and provides a positioning method, a positioning device, a terminal, and a computer storage medium. The positioning method comprises: acquiring a pre-constructed first mapping model; shooting a target object to be positioned with a thermal infrared imager to obtain a first image, and acquiring first target three-dimensional coordinates of each pixel point in a first scanned image of the target object, the first scanned image being obtained by scanning with an infrared scanning line of an infrared laser sensor; and mapping the first target three-dimensional coordinates to first target two-dimensional coordinates on the first image by using the first mapping model, thereby obtaining distance data corresponding to the first target two-dimensional coordinates. This solves the technical problem that the thermal infrared imager can only give azimuth information of survivors and cannot obtain specific distance information.

Description

Positioning method, positioning device, terminal and computer storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a positioning method, a positioning device, a terminal, and a computer storage medium.
Background
When searching for survivors in a post-disaster scene, manual searching carries considerable risk and can easily cause unnecessary casualties. Sending in a robot equipped with suitable sensors for advance detection avoids much of that risk: the approximate positions of survivors can be derived from the sensor data, which assists later manual rescue.
Under low-light conditions, or when there is heavy smoke or dust at the rescue scene, a conventional visible-light camera cannot reliably search for survivors, whereas a thermal sensor can still identify a victim's position in the camera image, for example with a blob-detection algorithm. However, the thermal sensor only gives the azimuth of a survivor and cannot provide specific distance information, which hampers search-and-rescue work.
Disclosure of Invention
In view of the above, the embodiments of the present invention provide a positioning method, a positioning device, a terminal, and a computer storage medium, which can solve the technical problem that the distance information of the survivor cannot be obtained.
A first aspect of an embodiment of the present invention provides a positioning method applied to a terminal, where the terminal is configured with an infrared laser sensor and a thermal infrared imager, the positioning method includes:
acquiring a pre-constructed first mapping model; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager;
shooting a target object to be positioned by using the thermal infrared imager to obtain a first image, and acquiring first target three-dimensional coordinates of each pixel point in a first scanned image of the target object, the first scanned image being obtained by scanning with an infrared scanning line of the infrared laser sensor;
and mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate.
A second aspect of an embodiment of the present invention provides a positioning device configured to a terminal, the terminal being configured with an infrared laser sensor and a thermal infrared imager, the positioning device comprising:
the acquisition unit is used for acquiring a pre-constructed first mapping model; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager;
the shooting unit is used for shooting a target object to be positioned by using the thermal infrared imager to obtain a first image, and acquiring first target three-dimensional coordinates of each pixel point of the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and the positioning unit is used for mapping the first target three-dimensional coordinate into the first target two-dimensional coordinate on the first image by using the first mapping model, and obtaining distance data corresponding to the first target two-dimensional coordinate.
A third aspect of the embodiments of the present invention provides a terminal comprising an infrared laser sensor, a thermal infrared imager, a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the method described above when executing the computer program.
A fourth aspect of the embodiments of the present invention provides a computer storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method described above.
In the embodiment of the invention, when a terminal performs a positioning operation by using its configured infrared laser sensor and thermal infrared imager, it acquires a pre-constructed first mapping model, shoots a target object to be positioned with the thermal infrared imager to obtain a first image, and acquires first target three-dimensional coordinates of the target object obtained by scanning with an infrared scanning line of the infrared laser sensor. For example, if the target object is a survivor in a post-disaster scene, the first mapping model maps the first target three-dimensional coordinates to first target two-dimensional coordinates on the first image, yielding distance data corresponding to those two-dimensional coordinates, that is, the distance to the survivor. This solves the technical problem that the thermal infrared imager can only give azimuth information of survivors and cannot obtain specific distance information, so that search-and-rescue work can proceed smoothly.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an implementation flow of a positioning method according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a specific implementation of the construction of the first mapping model according to the embodiment of the present invention;
FIG. 3 is a schematic structural view of a calibration plate according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a specific implementation of step 202 of a positioning method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a second scanned image, a second image, and a third image obtained when the calibration plate is photographed by using an infrared laser sensor, a NoIR camera, and a thermal infrared imager according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a positioning device according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to illustrate the technical scheme of the invention, the following description is made by specific examples.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In a specific implementation, the terminal described in the embodiment of the present invention may be a terminal device configured with a positioning device, such as a robot, a computer, a mobile phone, or an intelligent wearable device. The terminal device is further configured with an infrared laser sensor and a thermal infrared imager. For convenience of description, the invention is illustrated by taking a robot as an example.
Referring to fig. 1, fig. 1 is a schematic flow chart of an implementation of a positioning method according to an embodiment of the present invention. The main execution body of the positioning method in this embodiment is a positioning device, and the device can be implemented by software and/or hardware and integrated in a terminal, so that the positioning device is suitable for situations requiring positioning. The positioning method as shown in fig. 1 may include: steps 101 to 103.
Step 101, a first mapping model which is built in advance is obtained; and the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager.
In this embodiment of the present application, the infrared laser sensor may be a 2D infrared laser sensor or a 3D infrared laser sensor.
In this embodiment of the present application, the infrared laser sensor and the thermal infrared imager are disposed perpendicular to each other, and a Z axis of the thermal infrared imager is parallel to a scanning plane of the infrared laser sensor, so as to avoid an error caused by parallax.
In the embodiment of the invention, the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor to coordinates on an image shot by the thermal infrared imager. That is, the 3D coordinates are mapped to 2D coordinates.
Optionally, as shown in fig. 2, the step of constructing the first mapping model may include: steps 201 to 202.
Step 201, calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the thermal infrared imager; the NoIR camera is a camera without an infrared filter mounted.
In this embodiment, the NoIR camera is used as an auxiliary camera to calibrate the thermal infrared imager together with the infrared laser sensor, producing the first mapping model. With that model, the three-dimensional coordinates obtained by the infrared laser sensor's scan can be mapped to coordinates on the image shot by the thermal infrared imager, the distance data for those image coordinates can be read off, and the shot target object can thereby be positioned.
Specifically, in the process of calibrating the thermal infrared imager and the infrared laser sensor by using the NoIR camera as an auxiliary camera, the thermal infrared imager and the NoIR camera need to be calibrated first to obtain an internal reference matrix and a second mapping model of the thermal infrared imager.
The internal reference matrix of the thermal infrared imager can be obtained with a general calibration method, specifically as follows:
according to the principle of small hole imaging, an imaging geometric model of the infrared thermal imager is established:
wherein: (u, v, 1) T The (X/Z, Y/Z, 1) is the homogeneous coordinate of any pixel point in the pixel coordinate system, and the (X/Z, Y/Z, 1) is the homogeneous coordinate corresponding to the point in the world coordinate system; f (f) x ,f y Referred to as the equivalent focal length of the camera in the u-axis and v-axis directions, u0, v0 are the actual center point coordinates of the image plane. In the pair f x ,f y ,u 0 ,v 0 When the calibration is carried out, shooting with different angles and different distances can be carried out on the calibration plate by using the infrared thermal imager, and f is obtained by calculation according to the corner points in the calibration plate x ,f y ,u 0 ,v 0 Is a value of (2).
In practical application, calibrating the thermal infrared imager yields its internal reference matrix

$$M = \begin{pmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{pmatrix}$$

together with its distortion coefficients. Once the internal reference matrix and the distortion coefficients are obtained, the original image shot by the thermal infrared imager can be rectified and undistorted according to them.
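As an illustration of the pinhole model above, the following Python sketch projects a 3D camera-frame point to pixel coordinates through the internal reference matrix. The intrinsic values are made up for illustration, not calibrated parameters from this patent.

```python
import numpy as np

# Hypothetical intrinsic parameters of a thermal camera (illustrative values only).
fx, fy = 400.0, 400.0   # equivalent focal lengths along the u and v axes (pixels)
u0, v0 = 320.0, 240.0   # image-plane center point (pixels)

M = np.array([[fx, 0.0, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])

def project(point_xyz):
    """Project a 3D camera-frame point (X, Y, Z) to pixel coordinates (u, v)."""
    X, Y, Z = point_xyz
    # Normalize to (X/Z, Y/Z, 1) and apply the internal reference matrix M.
    uvw = M @ np.array([X / Z, Y / Z, 1.0])
    return uvw[0], uvw[1]

u, v = project((0.5, -0.25, 2.0))  # a point 0.5 m right, 0.25 m up, 2 m ahead
```

With these values the point lands at pixel (420, 190), i.e. right of and above the image center, as expected for a point right of and above the optical axis.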
Similarly, when the NoIR camera is calibrated, its internal reference matrix and distortion coefficients can be obtained in the same manner and used to rectify and undistort the original image shot by the NoIR camera.
Optionally, calibrating the thermal infrared imager and the NoIR camera to obtain the second mapping model may include: shooting a calibration plate with the thermal infrared imager and the NoIR camera respectively to obtain a fourth image and a fifth image of the calibration plate, and obtaining pixel point coordinates of corresponding positions between the fourth image and the fifth image; the second mapping model is then built as:

$$\begin{pmatrix} x_1 \\ y_1 \\ 1 \end{pmatrix} \sim H \begin{pmatrix} x_2 \\ y_2 \\ 1 \end{pmatrix}$$

(equality up to scale), and the calibration parameter $H$ is calculated from the pixel point coordinates of corresponding positions between the fourth image and the fifth image, giving the second mapping model; where $(x_1, y_1)$ is a coordinate on the fourth image and $(x_2, y_2)$ is a coordinate on the fifth image.
It should be noted that the fourth image and the fifth image are both images that have undergone rectification and distortion removal, that is, they are free of image distortion; every image described later is likewise an image that has been rectified and undistorted using the internal reference matrix and distortion coefficients of the camera that shot it.
In the embodiment of the present invention, the calibration plate may be a black-and-white checkerboard calibration plate or another calibration plate containing a plurality of corner points. To improve calibration accuracy, as shown in fig. 3, the calibration plate 3a may instead be formed by two triangular metal plates 31 set at an angle to each other, each metal plate being provided with circular through holes 32 distributed in an array. The set angle may be chosen according to the application, for example 130 to 155 degrees. The circular through holes 32 distributed in the array can be used for calibrating the internal reference matrix and distortion coefficients of the camera.
When the thermal infrared imager and the NoIR camera are calibrated to obtain the second mapping model, the calibration plate may be photographed with the thermal infrared imager and the NoIR camera respectively, and 4 pairs of corresponding vertices of the calibration plate shown in fig. 3 may be selected to calculate the calibration parameter H; specifically, from the coordinates of the 4 pairs of vertices, the calibration parameter H in the second mapping model is obtained using the pose estimation algorithm solvePnP.
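The patent obtains H with solvePnP; as an illustrative alternative under the same planar-homography model, the following Python sketch estimates H from 4 corresponding point pairs by the direct linear transform (DLT) and applies it. The point values and the reference homography `H_true` are hypothetical, chosen only to exercise the code.

```python
import numpy as np

def estimate_homography(src_pts, dst_pts):
    """Estimate a 3x3 homography H with dst ~ H @ src from >= 4 point pairs (DLT)."""
    A = []
    for (x1, y1), (x2, y2) in zip(src_pts, dst_pts):
        # Each correspondence contributes two rows of the DLT system A h = 0.
        A.append([x1, y1, 1, 0, 0, 0, -x2 * x1, -x2 * y1, -x2])
        A.append([0, 0, 0, x1, y1, 1, -y2 * x1, -y2 * y1, -y2])
    # The solution is the right singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] = 1

def apply_homography(H, pt):
    """Map a 2D point through H, dividing out the homogeneous scale."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w

# Hypothetical ground-truth homography and 4 vertex pairs for the demonstration.
H_true = np.array([[1.2, 0.1, 5.0],
                   [0.05, 0.9, -3.0],
                   [0.001, 0.002, 1.0]])
src = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0), (100.0, 100.0)]
dst = [apply_homography(H_true, p) for p in src]
H = estimate_homography(src, dst)
```

With exactly 4 non-degenerate pairs the DLT system has a one-dimensional null space, so the recovered H reproduces `H_true` up to numerical precision.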
And 202, calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model.
Optionally, as shown in fig. 4, in step 202, calibrating the thermal infrared imager and the infrared laser sensor by using the internal reference matrix of the thermal infrared imager and the second mapping model to obtain the first mapping model may include: steps 401 to 404.
Step 401, scanning a calibration board by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration board, shooting the calibration board by using the NoIR camera to obtain a second image of the calibration board, and shooting the calibration board by using the infrared thermal imager to obtain a third image of the calibration board.
Since the NoIR camera is a camera to which an infrared filter is not attached, it can capture an image formed by infrared rays.
For example, as shown in fig. 5, while the calibration plate is scanned with the infrared scanning line of the infrared laser sensor to obtain the second target three-dimensional coordinates of each pixel point in the second scanned image 5a, the NoIR camera can simultaneously capture the image formed by the infrared scanning line falling on the calibration plate, such as the line image 51 in the second image 5b shot by the NoIR camera; image 5c in fig. 5 is the third image obtained by shooting the calibration plate with the thermal infrared imager.
And step 402, extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image.
After the second image and the second scanning image are acquired, a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image can be extracted.
For example, as shown in fig. 5, the coordinates of the intersection points A2, B2, and C2 of the infrared scanning line with the calibration plate are obtained by performing line detection and segmentation on the second image 5b. Coordinates of several equally spaced points between A2 and B2, for example two such points, are then obtained by interpolation, and likewise two equally spaced points between B2 and C2, giving 7 second target two-dimensional coordinates in the second image corresponding to 7 second target three-dimensional coordinates in the second scanned image.
It should be noted that the number of second target two-dimensional coordinates obtained here is merely illustrative; in other embodiments of the present invention, more or fewer second target two-dimensional coordinates may be selected for calculating the external reference matrix. Obtaining the coordinates of the intersection points A2, B2, and C2 of the infrared scanning line with the calibration plate through line detection and segmentation of the second image, and using them as second target two-dimensional coordinates, is both simpler and more accurate than alternative acquisition methods.
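The equally spaced interpolation between detected intersection points can be sketched as follows; the intersection pixel coordinates A2, B2, C2 are hypothetical values, not measurements from the patent.

```python
def interpolate_points(p, q, n):
    """Return n equally spaced interior points on segment p-q (endpoints excluded)."""
    return [(p[0] + (q[0] - p[0]) * k / (n + 1),
             p[1] + (q[1] - p[1]) * k / (n + 1)) for k in range(1, n + 1)]

# Hypothetical intersection pixels of the infrared scan line with the calibration plate.
A2, B2, C2 = (100.0, 200.0), (220.0, 206.0), (340.0, 212.0)

# 3 intersections + 2 interpolated points per segment = 7 second-target 2D coordinates.
points = ([A2] + interpolate_points(A2, B2, 2) + [B2]
          + interpolate_points(B2, C2, 2) + [C2])
```

Each of the 7 pixel coordinates then pairs with the corresponding 3D point on the laser scan line for the extrinsic calibration step.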
And step 403, mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model.
After 7 second target two-dimensional coordinates corresponding to 7 second target three-dimensional coordinates in the second scanned image in the second image are obtained, the second mapping model can be utilized to map the 7 second target two-dimensional coordinates into 7 third target two-dimensional coordinates corresponding to coordinate points on a third image.
And step 404, determining an external parameter matrix in the first mapping model by using the third target two-dimensional coordinate, the internal parameter matrix of the thermal infrared imager and the second target three-dimensional coordinate, and obtaining the first mapping model.
After the steps 401 to 403, 7 pairs of coordinate points corresponding to the second scan image and the third image one by one can be obtained. And determining an external reference matrix in the first mapping model by using the 7 third target two-dimensional coordinates, the 7 second target three-dimensional coordinates and the internal reference matrix of the infrared thermal imager, and obtaining the first mapping model.
For example, the first mapping model is constructed as:

$$Z \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} = M \, [R \mid t] \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}$$

and the rotation matrix and translation vector in the first mapping model are obtained using the pose estimation algorithm solvePnP. Here $M$ is the internal reference matrix of the thermal infrared imager, $R$ is the rotation matrix in the first mapping model, $t$ is the translation vector in the first mapping model, $(X, Y, Z)$ is a three-dimensional coordinate in the scanned image obtained by the infrared laser sensor, and $(x, y)$ is the corresponding coordinate on the image shot by the thermal infrared imager.
And 102, shooting a target object to be positioned by using the thermal infrared imager to obtain a first image, and acquiring first target three-dimensional coordinates of each pixel point in the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor.
And step 103, mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model, and obtaining distance data corresponding to the first target two-dimensional coordinate.
In the embodiment of the invention, after the first mapping model is obtained, when a certain target object needs to be positioned, the target object needing to be positioned can be directly shot by using the thermal infrared imager to obtain a first image, the first target three-dimensional coordinates of each pixel point in the first scanning image of the target object, which is obtained by scanning the infrared scanning line of the infrared laser sensor, are obtained, and then the first mapping model is used for mapping the first target three-dimensional coordinates into the first target two-dimensional coordinates on the first image, so that the distance data corresponding to the first target two-dimensional coordinates can be obtained.
Specifically, the infrared laser sensor emits an infrared scanning line (infrared laser) in a given direction and receives the infrared laser reflected by an obstacle. From the time difference between emission and reception, the distance to the nearest obstacle in that direction can be calculated. After this distance data is mapped to the corresponding first target two-dimensional coordinate in the first image, that coordinate carries the corresponding distance. This solves the technical problem that the thermal infrared imager can only give azimuth information of survivors and cannot obtain specific distance information, and facilitates the smooth progress of search-and-rescue work.
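The round-trip time measurement described above reduces to the standard time-of-flight relation d = c·Δt/2; a minimal sketch (the 20 ns round trip is a made-up example value):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(delta_t_seconds):
    """Range from the round-trip time of an infrared laser pulse: d = c * dt / 2."""
    return C * delta_t_seconds / 2.0

d = tof_distance(20e-9)  # a 20 ns round trip corresponds to roughly 3 m
```

The factor of 2 accounts for the pulse travelling to the obstacle and back; the resulting distance is what gets attached to the mapped two-dimensional coordinate.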
The embodiment of the invention also provides a positioning device which comprises a module for executing each step in the positioning method, and the device can be integrated with the terminal. Where not described in detail in the device, reference is made to the description of the method described above.
Referring to fig. 6, fig. 6 is a schematic block diagram of a positioning device according to an embodiment of the present invention, where the positioning device is configured on the terminal, and the terminal is configured with an infrared laser sensor and a thermal infrared imager, and the positioning device includes: an acquisition unit 61, a shooting unit 62, and a positioning unit 63.
An obtaining unit 61, configured to obtain a first mapping model that is built in advance; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager;
the shooting unit 62 is configured to shoot a target object to be positioned by using the thermal infrared imager, obtain a first image, and obtain first target three-dimensional coordinates of each pixel point of the first scanned image of the target object, where the first scanned image is obtained by scanning an infrared scanning line of the infrared laser sensor;
and the positioning unit 63 is configured to map the first target three-dimensional coordinate to a first target two-dimensional coordinate on the first image by using the first mapping model, so as to obtain distance data corresponding to the first target two-dimensional coordinate.
Optionally, the positioning device further comprises a construction unit;
the construction unit is used for calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the thermal infrared imager; the NoIR camera is a camera without an infrared filter; and calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model.
The construction unit is also specifically configured to:
scanning a calibration plate by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration plate, shooting the calibration plate by using the NoIR camera to obtain a second image of the calibration plate, and shooting the calibration plate by using the infrared thermal imager to obtain a third image of the calibration plate;
extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image;
mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model;
and determining an external parameter matrix in the first mapping model by using the third target two-dimensional coordinate, the internal parameter matrix of the infrared thermal imager and the second target three-dimensional coordinate, and obtaining the first mapping model.
The construction unit is also specifically configured to:
and obtaining a rotation matrix and a translation vector in the first mapping model by using a pose estimation algorithm solvepnp according to the third target two-dimensional coordinate, the internal reference matrix of the infrared thermal imager and the second target three-dimensional coordinate.
The construction unit is also specifically configured to:
shooting a calibration plate by using the thermal infrared imager and the NoIR camera respectively to obtain a fourth image and a fifth image of the calibration plate, and obtaining pixel point coordinates of a corresponding position between the fourth image and the fifth image;
building the second mapping model as $(x_1, y_1, 1)^T \sim H \,(x_2, y_2, 1)^T$ (equality up to scale), and calculating the calibration parameter H by using pixel point coordinates of corresponding positions between the fourth image and the fifth image to obtain the second mapping model; where $(x_1, y_1)$ is a coordinate on the fourth image and $(x_2, y_2)$ is a coordinate on the fifth image.
Optionally, the calibration plate is formed by mutually forming a set angle by two triangular metal plates, and each metal plate is provided with circular through holes distributed in an array.
Fig. 7 is a schematic diagram of a terminal according to an embodiment of the present invention. The terminal of this embodiment is configured with a thermal infrared imager and an infrared laser sensor. As shown in fig. 7, the terminal 7 may further include: a processor 70, a memory 71, and a computer program 72 stored in the memory 71 and executable on the processor 70, e.g. a positioning program. The processor 70, when executing the computer program 72, implements the steps of the positioning method embodiment described above, such as steps 101 to 103 shown in fig. 1; alternatively, it implements the functions of the modules/units of the apparatus embodiments described above, such as the functions of the units 61 to 63 shown in fig. 6.
By way of example, the computer program 72 may be partitioned into one or more modules/units, which are stored in the memory 71 and executed by the processor 70 to carry out the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, the instruction segments being used to describe the execution of the computer program 72 in the terminal 7. For example, the computer program 72 may be divided into an acquisition unit, a photographing unit, and a positioning unit (units in a virtual device), each of which functions specifically as follows:
the acquisition unit is used for acquiring a pre-constructed first mapping model; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager;
the photographing unit is used for photographing a target object to be positioned by using the thermal infrared imager to obtain a first image, and for acquiring first target three-dimensional coordinates of each pixel point in a first scanning image of the target object, the first scanning image being obtained by scanning the target object with an infrared scanning line of the infrared laser sensor;
the positioning unit is used for mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model, and for obtaining distance data corresponding to the first target two-dimensional coordinate.
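The positioning unit's lookup can be sketched as follows, assuming (hypothetically — the patent does not fix a specific distance definition here) that the distance data for a pixel is the Euclidean range from the sensor origin to the scanned 3D point; the coordinates are illustrative example values:

```python
import math

# Assumed example data: one scanned point and the pixel the first mapping
# model projects it to (both values hypothetical).
X = (0.3, 0.4, 1.2)   # first target three-dimensional coordinate, meters
uv = (190, 120)       # first target two-dimensional coordinate, pixels

# Distance data keyed by the mapped pixel: here, the Euclidean range
# from the sensor origin to the scanned point.
distance_map = {uv: math.sqrt(X[0]**2 + X[1]**2 + X[2]**2)}

d = distance_map[(190, 120)]  # -> 1.3
```

With this structure, querying the pixel where the target appears in the thermal image directly yields its range, which is the per-pixel "distance data" the positioning unit reports.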
The terminal 7 may include, but is not limited to, a processor 70, a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal 7 and is not limiting of the terminal 7, and may include more or fewer components than shown, or may combine some components, or different components, e.g., the terminal device may also include an input-output device, a network access device, a bus, etc.
The processor 70 may be a central processing unit (Central Processing Unit, CPU), or may be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the terminal 7, such as a hard disk or a memory of the terminal 7. The memory 71 may also be an external storage device of the terminal 7, such as a plug-in hard disk provided on the terminal 7, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like. Further, the memory 71 may also include both an internal storage unit and an external storage device of the terminal 7. The memory 71 is used for storing computer programs and other programs and data required by the terminal device. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The descriptions of the foregoing embodiments each have their own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other manners. For example, the apparatus/terminal device embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer readable storage medium, and the computer program, when executed by a processor, may implement the steps of each of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content of the computer readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals and telecommunication signals.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention.

Claims (7)

1. A positioning method applied to a terminal, wherein the terminal is configured with an infrared laser sensor and a thermal infrared imager, the positioning method comprising:
acquiring a pre-constructed first mapping model; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager;
photographing a target object to be positioned by using the thermal infrared imager to obtain a first image, and acquiring first target three-dimensional coordinates of each pixel point in a first scanning image of the target object, the first scanning image being obtained by scanning the target object with an infrared scanning line of the infrared laser sensor;
mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate;
the construction step of the first mapping model comprises the following steps:
calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the thermal infrared imager; the NoIR camera is a camera without an infrared filter;
calibrating the infrared thermal imager and the infrared laser sensor by using an internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model;
the calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model comprises the following steps:
scanning a calibration plate by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration plate, shooting the calibration plate by using the NoIR camera to obtain a second image of the calibration plate, and shooting the calibration plate by using the infrared thermal imager to obtain a third image of the calibration plate;
extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image;
mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model;
determining an external reference matrix in the first mapping model by using the third target two-dimensional coordinate, the internal reference matrix of the infrared thermal imager and the second target three-dimensional coordinate, and obtaining the first mapping model;
the extracting the second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image includes: and carrying out linear detection and segmentation on the second image to obtain coordinates of an intersection point of the infrared scanning line and the calibration plate as the second target two-dimensional coordinates.
2. The positioning method of claim 1, wherein the determining the reference matrix in the first mapping model using the third target two-dimensional coordinates, the reference matrix of the thermal infrared imager, and the second target three-dimensional coordinates comprises:
obtaining a rotation matrix and a translation vector in the first mapping model by using the pose estimation algorithm solvePnP, according to the third target two-dimensional coordinate, the internal reference matrix of the thermal infrared imager, and the second target three-dimensional coordinate.
3. The positioning method of claim 1, wherein calibrating the thermal infrared imager and the NoIR camera to obtain the reference matrix and the second mapping model of the thermal infrared imager comprises:
shooting a calibration plate by using the thermal infrared imager and the NoIR camera respectively to obtain a fourth image and a fifth image of the calibration plate, and obtaining pixel point coordinates of a corresponding position between the fourth image and the fifth image;
constructing the second mapping model as [x1, y1, 1]^T = H·[x2, y2, 1]^T, and calculating the calibration parameter H by using the pixel point coordinates of the corresponding positions between the fourth image and the fifth image to obtain the second mapping model; wherein (x1, y1) are the coordinates on said fourth image, and (x2, y2) are the coordinates on said fifth image.
4. A positioning method according to any one of claims 1-3, characterized in that the calibration plate is formed by two triangular metal plates arranged at a set angle to each other, and each metal plate is provided with circular through holes distributed in an array.
5. A positioning device configured in a terminal, wherein the terminal is provided with an infrared laser sensor and a thermal infrared imager, the positioning device comprising:
the acquisition unit is used for acquiring a pre-constructed first mapping model; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning the infrared laser sensor into coordinates on an image shot by the thermal infrared imager;
the photographing unit is used for photographing a target object to be positioned by using the thermal infrared imager to obtain a first image, and for acquiring first target three-dimensional coordinates of each pixel point in a first scanning image of the target object, the first scanning image being obtained by scanning the target object with an infrared scanning line of the infrared laser sensor;
the positioning unit is used for mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate;
the positioning device further comprises a construction unit;
the construction unit is used for calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the thermal infrared imager; the NoIR camera is a camera without an infrared filter;
calibrating the infrared thermal imager and the infrared laser sensor by using an internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model;
the calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model comprises the following steps:
scanning a calibration plate by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration plate, shooting the calibration plate by using the NoIR camera to obtain a second image of the calibration plate, and shooting the calibration plate by using the infrared thermal imager to obtain a third image of the calibration plate;
extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image;
mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model;
determining an external reference matrix in the first mapping model by using the third target two-dimensional coordinate, the internal reference matrix of the infrared thermal imager and the second target three-dimensional coordinate, and obtaining the first mapping model;
the extracting the second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image includes: performing straight-line detection and segmentation on the second image to obtain coordinates of intersection points of the infrared scanning line and the calibration plate as the second target two-dimensional coordinates.
6. A terminal comprising an infrared laser sensor, a thermal infrared imager, a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 4 when executing the computer program.
7. A computer storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 4.
CN201811636299.4A 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium Active CN111383264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811636299.4A CN111383264B (en) 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811636299.4A CN111383264B (en) 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium

Publications (2)

Publication Number Publication Date
CN111383264A CN111383264A (en) 2020-07-07
CN111383264B true CN111383264B (en) 2023-12-29

Family

ID=71220525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811636299.4A Active CN111383264B (en) 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium

Country Status (1)

Country Link
CN (1) CN111383264B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085771B (en) * 2020-08-06 2023-12-05 深圳市优必选科技股份有限公司 Image registration method, device, terminal equipment and computer readable storage medium
CN114155349B (en) * 2021-12-14 2024-03-22 杭州联吉技术有限公司 Three-dimensional image construction method, three-dimensional image construction device and robot

Citations (5)

Publication number Priority date Publication date Assignee Title
CN1354355A (en) * 2001-12-10 2002-06-19 西安交通大学 Laser linear scanning three-dimensional measurement double liquid knife virtual grid mapping calibrating method and equipment
CN1491403A (en) * 2001-10-29 2004-04-21 ���ṫ˾ Non-flat image processing apparatus and image processing method, and recording medium and computer program
CN101854846A (en) * 2007-06-25 2010-10-06 真实成像有限公司 Method, device and system for thermography
CN108921889A (en) * 2018-05-16 2018-11-30 天津大学 A kind of indoor 3-D positioning method based on Augmented Reality application
CN109000566A (en) * 2018-08-15 2018-12-14 深圳科瑞技术股份有限公司 Scanning three-dimensional imaging laser and CCD two-dimensional imaging combination measurement method and device

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
US20120121128A1 (en) * 2009-04-20 2012-05-17 Bent 360: Medialab Inc. Object tracking system
US20120078088A1 (en) * 2010-09-28 2012-03-29 Point of Contact, LLC. Medical image projection and tracking system
US20180100927A1 (en) * 2016-10-12 2018-04-12 Faro Technologies, Inc. Two-dimensional mapping system and method of operation

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN1491403A (en) * 2001-10-29 2004-04-21 ���ṫ˾ Non-flat image processing apparatus and image processing method, and recording medium and computer program
CN1354355A (en) * 2001-12-10 2002-06-19 西安交通大学 Laser linear scanning three-dimensional measurement double liquid knife virtual grid mapping calibrating method and equipment
CN101854846A (en) * 2007-06-25 2010-10-06 真实成像有限公司 Method, device and system for thermography
CN108921889A (en) * 2018-05-16 2018-11-30 天津大学 A kind of indoor 3-D positioning method based on Augmented Reality application
CN109000566A (en) * 2018-08-15 2018-12-14 深圳科瑞技术股份有限公司 Scanning three-dimensional imaging laser and CCD two-dimensional imaging combination measurement method and device

Also Published As

Publication number Publication date
CN111383264A (en) 2020-07-07

Similar Documents

Publication Publication Date Title
CN111145238B (en) Three-dimensional reconstruction method and device for monocular endoscopic image and terminal equipment
CN107633536B (en) Camera calibration method and system based on two-dimensional plane template
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
CN105096329B (en) Method for accurately correcting image distortion of ultra-wide-angle camera
CN109727278B (en) Automatic registration method for airborne LiDAR point cloud data and aerial image
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
CN111862180B (en) Camera set pose acquisition method and device, storage medium and electronic equipment
WO2021139176A1 (en) Pedestrian trajectory tracking method and apparatus based on binocular camera calibration, computer device, and storage medium
CN111461963B (en) Fisheye image stitching method and device
CN112470192A (en) Dual-camera calibration method, electronic device and computer-readable storage medium
CN112927306B (en) Calibration method and device of shooting device and terminal equipment
CN110619660A (en) Object positioning method and device, computer readable storage medium and robot
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN112686950B (en) Pose estimation method, pose estimation device, terminal equipment and computer readable storage medium
CN115035235A (en) Three-dimensional reconstruction method and device
CN111340737A (en) Image rectification method, device and electronic system
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN109598763A (en) Camera calibration method, device, electronic equipment and computer readable storage medium
CN111882655A (en) Method, apparatus, system, computer device and storage medium for three-dimensional reconstruction
CN110136205B (en) Parallax calibration method, device and system of multi-view camera
CN109658451B (en) Depth sensing method and device and depth sensing equipment
CN115661258A (en) Calibration method and device, distortion correction method and device, storage medium and terminal
AU2020294259B2 (en) Object association method, apparatus and system, electronic device, storage medium and computer program
CN115018922A (en) Distortion parameter calibration method, electronic device and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant