CN111383264A - Positioning method, positioning device, terminal and computer storage medium - Google Patents

Positioning method, positioning device, terminal and computer storage medium

Info

Publication number
CN111383264A
CN111383264A (application CN201811636299.4A)
Authority
CN
China
Prior art keywords
image
target
thermal imager
infrared
dimensional coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811636299.4A
Other languages
Chinese (zh)
Other versions
CN111383264B (en)
Inventor
熊友军 (Xiong Youjun)
庞建新 (Pang Jianxin)
张惊涛 (Zhang Jingtao)
张万里 (Zhang Wanli)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ubtech Robotics Corp
Original Assignee
Ubtech Robotics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ubtech Robotics Corp filed Critical Ubtech Robotics Corp
Priority to CN201811636299.4A priority Critical patent/CN111383264B/en
Publication of CN111383264A publication Critical patent/CN111383264A/en
Application granted granted Critical
Publication of CN111383264B publication Critical patent/CN111383264B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/70 Determining position or orientation of objects or cameras

Abstract

The invention is applicable to the technical field of computer applications, and provides a positioning method, a positioning device, a terminal and a computer storage medium. The positioning method comprises: acquiring a pre-constructed first mapping model; shooting a target object to be positioned with an infrared thermal imager to obtain a first image, and acquiring a first target three-dimensional coordinate of each pixel point in a first scanned image of the target object obtained by scanning with an infrared scanning line of an infrared laser sensor; and mapping the first target three-dimensional coordinate to a first target two-dimensional coordinate on the first image by using the first mapping model, so as to obtain distance data corresponding to the first target two-dimensional coordinate. This solves the technical problem that an infrared thermal imager can only give the azimuth information of survivors and cannot obtain specific distance information.

Description

Positioning method, positioning device, terminal and computer storage medium
Technical Field
The invention belongs to the technical field of computers, and particularly relates to a positioning method, a positioning device, a positioning terminal and a computer storage medium.
Background
When searching for survivors in a post-disaster scene, manual searching often carries considerable risk and can easily cause unnecessary casualties. Using a robot equipped with suitable sensors for advance detection avoids much of that risk, and the approximate position of a survivor can also be obtained from the sensor data, assisting later manual rescue.
In low-light conditions, or when a rescue scene contains heavy smoke or dust, a conventional visible-light camera sensor cannot be used to search for survivors accurately; only a thermal sensor can be used, identifying the position of the wounded in the camera image with a blob detection algorithm. However, the thermal sensor can only provide the direction of a survivor and cannot yield specific distance information, which hinders search and rescue work.
Disclosure of Invention
In view of this, embodiments of the present invention provide a positioning method, an apparatus, a terminal, and a computer storage medium, which can solve the technical problem that distance information of survivors cannot be obtained.
A first aspect of an embodiment of the present invention provides a positioning method, which is applied to a terminal, where the terminal is configured with an infrared laser sensor and an infrared thermal imager, and the positioning method includes:
acquiring a pre-constructed first mapping model; the first mapping model is used for mapping a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image shot by the infrared thermal imager;
shooting a target object to be positioned by using the infrared thermal imager to obtain a first image, and acquiring a first target three-dimensional coordinate of each pixel point in the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate.
A second aspect of an embodiment of the present invention provides a positioning device, configured in a terminal, where the terminal is configured with an infrared laser sensor and an infrared thermal imager, and the positioning device includes:
the device comprises an acquisition unit, a mapping unit and a mapping unit, wherein the acquisition unit is used for acquiring a first mapping model which is constructed in advance; the first mapping model is used for mapping a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image shot by the infrared thermal imager;
the shooting unit is used for shooting a target object to be positioned by using the infrared thermal imager to obtain a first image and acquiring a first target three-dimensional coordinate of each pixel point of the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and the positioning unit is used for mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate.
A third aspect of the embodiments of the present invention provides a terminal, including an infrared laser sensor, an infrared thermal imager, a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method when executing the computer program.
A fourth aspect of the embodiments of the present invention provides a computer storage medium storing a computer program, wherein the computer program is configured to implement the steps of the method when executed by a processor.
In the embodiment of the invention, when a terminal performs a positioning operation with its configured infrared laser sensor and infrared thermal imager, it acquires the pre-constructed first mapping model, shoots the target object to be positioned with the infrared thermal imager to obtain a first image, and acquires the first target three-dimensional coordinates of the target object obtained by scanning with the infrared scanning line of the infrared laser sensor. If the target object is, for example, a survivor in a post-disaster scene, the first target three-dimensional coordinates can be mapped by the first mapping model to first target two-dimensional coordinates on the first image, yielding the distance data corresponding to those two-dimensional coordinates. The distance information of the survivor is thus obtained, which solves the technical problem that an infrared thermal imager can only give the azimuth information of survivors and cannot obtain specific distance information, and favors the smooth development of search and rescue work.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention; for those skilled in the art, other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flow chart illustrating an implementation of a positioning method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a specific implementation of the construction of the first mapping model according to the embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a calibration board provided in an embodiment of the present invention;
fig. 4 is a flowchart illustrating a specific implementation of step 202 of a positioning method according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a second scanned image, a second image and a third image obtained when a calibration board is photographed by an infrared laser sensor, a NoIR camera and an infrared thermal imager according to an embodiment of the present invention;
FIG. 6 is a schematic view of a positioning device according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In specific implementation, the terminal described in the embodiment of the present invention may be a terminal device configured with a positioning device, such as a robot, a computer, a mobile phone, and an intelligent wearable device, and the terminal device is further configured with an infrared laser sensor and an infrared thermal imager.
Referring to fig. 1, fig. 1 is a schematic flow chart illustrating an implementation of a positioning method according to an embodiment of the present invention. The main execution body of the positioning method in this embodiment is a positioning device, which can be implemented by software and/or hardware, integrated in a terminal, and is suitable for a situation requiring positioning. The positioning method shown in fig. 1 may include: step 101 to step 103.
Step 101, acquiring a first mapping model which is constructed in advance; the first mapping model is used for mapping the three-dimensional coordinates obtained by scanning of the infrared laser sensor to coordinates on an image shot by the infrared thermal imager.
In the embodiment of the application, the infrared laser sensor can be a 2D infrared laser sensor or a 3D infrared laser sensor.
In the embodiment of the present application, the infrared laser sensor and the infrared thermal imager are mounted perpendicular to each other, and the Z axis of the infrared thermal imager is parallel to the scanning plane of the infrared laser sensor, so as to avoid errors caused by parallax.
In an embodiment of the present invention, the first mapping model is configured to map a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image captured by the infrared thermal imager. I.e. mapping 3D coordinates to 2D coordinates.
Optionally, as shown in fig. 2, the constructing step of the first mapping model may include: step 201 to step 202.
Step 201, calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the infrared thermal imager; the NoIR camera is a camera without an infrared filter mounted.
In this embodiment, the NoIR camera is used as an auxiliary camera to calibrate the infrared thermal imager and the infrared laser sensor to obtain the first mapping model, so that a three-dimensional coordinate obtained by scanning of the infrared laser sensor is mapped to a coordinate on an image shot by the infrared thermal imager, distance data of a corresponding coordinate on the image shot by the infrared thermal imager is obtained, and a shot target object is positioned.
Specifically, in the process of calibrating the infrared thermal imager and the infrared laser sensor by using the NoIR camera as the auxiliary camera, the infrared thermal imager and the NoIR camera need to be calibrated first to obtain an internal reference matrix and a second mapping model of the infrared thermal imager.
Calibrating the infrared thermal imager to obtain its internal reference matrix can be implemented with a general calibration method, specifically as follows:
according to the pinhole imaging principle, an imaging geometric model of the infrared thermal imager is established:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X/Z \\ Y/Z \\ 1 \end{bmatrix}$$

where $(u, v, 1)^T$ is the homogeneous coordinate of any pixel point in the pixel coordinate system, and $(X/Z, Y/Z, 1)$ is the corresponding homogeneous coordinate of the point in the world coordinate system; $f_x, f_y$ are the equivalent focal lengths of the camera in the u-axis and v-axis directions, and $u_0, v_0$ are the coordinates of the actual center point of the image plane. To calibrate $f_x, f_y, u_0, v_0$, the infrared thermal imager is used to photograph the calibration board at different angles and distances, and the values of $f_x, f_y, u_0, v_0$ are calculated from the corner points of the calibration board.
It should be noted that, in practical application, calibrating the infrared thermal imager yields not only its internal reference matrix

$$M = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

but also its distortion coefficients. After the internal reference matrix and the distortion coefficients of the infrared thermal imager are obtained, the original image shot by the infrared thermal imager is corrected and undistorted according to them.
Similarly, when the NoIR camera is calibrated, the corresponding internal reference matrix and distortion coefficient can be obtained in the same way, and the original image shot by the NoIR camera is corrected and distortion eliminated by using the internal reference matrix and distortion coefficient.
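As an illustration of this calibration step, here is a minimal sketch using OpenCV's standard routines (cv2.findCirclesGrid, cv2.calibrateCamera, cv2.undistort); the grid size, file path, and variable names are assumptions for the example, not details taken from the patent.

```python
import glob
import cv2
import numpy as np

# Minimal intrinsic-calibration sketch (assumed OpenCV workflow, not the
# patent's exact implementation). The 4x5 hole grid and the "calib/*.png"
# path are placeholders.
pattern_size = (4, 5)
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in sorted(glob.glob("calib/*.png")):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, centers = cv2.findCirclesGrid(img, pattern_size,
                                         flags=cv2.CALIB_CB_SYMMETRIC_GRID)
    if found:
        obj_points.append(objp)
        img_points.append(centers)

# K is the internal reference matrix [[fx, 0, u0], [0, fy, v0], [0, 0, 1]];
# dist holds the distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, img.shape[::-1], None, None)

undistorted = cv2.undistort(img, K, dist)  # correct and undistort a raw shot
```

The same procedure would be repeated for the NoIR camera to obtain its own internal reference matrix and distortion coefficients.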
Optionally, calibrating the infrared thermal imager and the NoIR camera to obtain the second mapping model may include: shooting a calibration board with the infrared thermal imager and the NoIR camera respectively to obtain a fourth image and a fifth image of the calibration board, and obtaining the pixel point coordinates of corresponding positions between the fourth image and the fifth image; the second mapping model is then constructed as:

$$\begin{bmatrix} x_1 \\ y_1 \\ 1 \end{bmatrix} = H \begin{bmatrix} x_2 \\ y_2 \\ 1 \end{bmatrix}$$

The calibration parameter H is calculated from the pixel point coordinates of the corresponding positions between the fourth image and the fifth image, which yields the second mapping model; here $(x_1, y_1)$ is a coordinate on the fourth image and $(x_2, y_2)$ is a coordinate on the fifth image.
It should be noted that the fourth image and the fifth image are both images that have been subjected to correction and distortion removal processing, that is, there is no distortion of the images, and each of the images described later is also an image that has been subjected to correction and distortion removal processing using the internal reference matrix and distortion coefficient of each shooting camera.
In the embodiment of the present invention, the calibration board may be a black-and-white checkerboard calibration board, or another calibration board containing a number of corner points. To improve calibration precision, as shown in fig. 3, the calibration board 3a may be formed of two triangular metal plates 31 set at an angle to each other, each metal plate provided with circular through holes 32 distributed in an array. The set angle may be chosen according to the application; for example, it may be set to 130 to 155 degrees. The circular through holes 32 distributed in an array in the calibration board can be used to calibrate the internal reference matrix and the distortion coefficients of the camera.
When the infrared thermal imager and the NoIR camera are calibrated to obtain the second mapping model, the calibration board may be photographed with the infrared thermal imager and the NoIR camera respectively, and 4 pairs of vertexes of the calibration board shown in fig. 3 are selected to calculate the calibration parameter H. The specific calculation may be: according to the coordinates of the 4 pairs of vertexes, the calibration parameter H in the second mapping model is obtained with the pose estimation algorithm solvepnp; a hedged code sketch follows.
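The text obtains H with the pose estimation algorithm solvepnp; for a plane-to-plane 2D mapping, the same parameter is conventionally fit directly as a 3x3 homography, which is the route taken in this sketch. All pixel coordinates below are invented placeholders, not values from the patent.

```python
import cv2
import numpy as np

# Hedged sketch: estimate the calibration parameter H of the second mapping
# model from 4 corresponding vertex pairs. Coordinates are placeholders.
pts_fifth = np.float32([[102, 85], [498, 90], [510, 402], [95, 396]])   # NoIR image (x2, y2)
pts_fourth = np.float32([[60, 48], [260, 51], [266, 208], [57, 205]])   # thermal image (x1, y1)

# Exact 4-point fit; cv2.findHomography would do a least-squares fit for more pairs.
H = cv2.getPerspectiveTransform(pts_fifth, pts_fourth)

# Apply the model: (x1, y1, 1)^T ~ H (x2, y2, 1)^T
p = H @ np.array([200.0, 150.0, 1.0])
x1, y1 = p[0] / p[2], p[1] / p[2]
print(f"NoIR pixel (200, 150) maps to thermal pixel ({x1:.1f}, {y1:.1f})")
```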
Step 202, calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model.
Optionally, as shown in fig. 4, in the step 202, calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model, which may include: step 401 to step 404.
Step 401, scanning a calibration board by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration board, simultaneously shooting the calibration board by using the NoIR camera to obtain a second image of the calibration board, and shooting the calibration board by using the infrared thermal imager to obtain a third image of the calibration board.
Since the NoIR camera is a camera without an infrared filter, it can capture images formed by infrared light.
For example, as shown in fig. 5, while the calibration board is scanned by the infrared scanning line of the infrared laser sensor to obtain the second target three-dimensional coordinate of each pixel point in a second scanned image 5a, the NoIR camera can capture the infrared scanning line projected onto the calibration board, for example the image 51 of the scanning line in the second image 5b shot by the NoIR camera; the image 5c in fig. 5 is the third image of the calibration board shot by the infrared thermal imager.
Step 402, extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image.
After the second image and the second scanned image are acquired, a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image can be extracted.
For example, as shown in fig. 5, the coordinates of the intersection points A2, B2 and C2 of the infrared scanning line and the calibration board are obtained by line detection and segmentation on the second image 5b, and the coordinates of several equally spaced points between A2 and B2 (for example, two equally spaced points) are obtained by interpolation; two equally spaced points between B2 and C2 are obtained in the same way, giving 7 second target two-dimensional coordinates in the second image corresponding to the 7 second target three-dimensional coordinates in the second scanned image.
It should be noted that the number of second target two-dimensional coordinates obtained here is merely an example; in other embodiments of the present invention, more or fewer second target two-dimensional coordinates may be selected for the calculation of the external reference matrix. Meanwhile, obtaining the coordinates of the intersection points A2, B2 and C2 of the infrared scanning line and the calibration board as second target two-dimensional coordinates by line detection and segmentation on the second image keeps the acquisition simple and the coordinates accurate; a code sketch of this extraction follows.
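The following is a hedged sketch of how these intersection and interpolated points might be extracted; the Hough-transform approach and all numeric values are assumptions, since the patent only states that line detection, segmentation, and interpolation are used. A synthetic image stands in for the real second image 5b.

```python
import cv2
import numpy as np

# Hedged sketch of step 402: detect the infrared scanning line in the second
# image, then interpolate equally spaced points between the intersections.
gray = np.zeros((480, 640), np.uint8)
cv2.line(gray, (120, 240), (320, 236), 255, 2)   # laser segment on plate 1
cv2.line(gray, (320, 236), (520, 244), 255, 2)   # laser segment on plate 2

edges = cv2.Canny(gray, 50, 150)
segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                           minLineLength=40, maxLineGap=5)

# Suppose segmentation recovered these intersections (placeholder values):
A2 = np.float32([120, 240]); B2 = np.float32([320, 236]); C2 = np.float32([520, 244])

def interior_points(p, q, n):
    """n equally spaced points strictly between p and q."""
    return [(1 - t) * p + t * q for t in np.linspace(0.0, 1.0, n + 2)[1:-1]]

# 3 intersections + 2 interpolated points per segment = 7 second target
# two-dimensional coordinates, matching the worked example above.
pts_2d = [A2] + interior_points(A2, B2, 2) + [B2] + interior_points(B2, C2, 2) + [C2]
```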
Step 403, mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model.
After the 7 second target two-dimensional coordinates in the second image corresponding to the 7 second target three-dimensional coordinates in the second scanned image are obtained, the second mapping model can map them to the 7 third target two-dimensional coordinates of the corresponding points on the third image.
Step 404, determining an external reference matrix in the first mapping model by using the third target two-dimensional coordinate, the internal reference matrix of the infrared thermal imager and the second target three-dimensional coordinate, and obtaining the first mapping model.
After the above steps 401 to 403, 7 pairs of coordinate points corresponding to each other in the second scanned image and the third image are obtained. The external reference matrix in the first mapping model is then determined using the 7 third target two-dimensional coordinates, the 7 second target three-dimensional coordinates and the internal reference matrix of the infrared thermal imager, and the first mapping model is obtained.
For example, the first mapping model is constructed as follows:
$$s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = M \left[ R \mid t \right] \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

The rotation matrix and the translation vector in the first mapping model are obtained with the pose estimation algorithm solvepnp. Here M is the internal reference matrix of the infrared thermal imager, R is the rotation matrix in the first mapping model, t is the translation vector in the first mapping model, s is the projective scale factor, (X, Y, Z) is a three-dimensional coordinate in the scanned image obtained by the infrared laser sensor, and (x, y) is a coordinate on the image shot by the infrared thermal imager.
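In OpenCV terms, step 404 corresponds to a PnP solve. The following hedged sketch uses cv2.solvePnP as the text names it; all numeric correspondences and the intrinsic matrix are invented placeholders, not data from the patent.

```python
import cv2
import numpy as np

# Hedged sketch of step 404: recover R and t of the first mapping model from
# the 7 pairs of laser 3D coordinates and thermal-image 2D coordinates.
pts_3d = np.float32([[-0.6, 0.0, 2.4], [-0.4, 0.0, 2.2], [-0.2, 0.0, 2.0],
                     [0.0, 0.0, 1.9], [0.2, 0.0, 2.0], [0.4, 0.0, 2.2],
                     [0.6, 0.0, 2.4]])           # (X, Y, Z) in the laser frame, meters
pts_2d = np.float32([[60, 118], [100, 123], [140, 128], [180, 131],
                     [220, 128], [260, 123], [300, 118]])  # thermal pixels (x, y)
M = np.float32([[320, 0, 160], [0, 320, 120], [0, 0, 1]])  # intrinsics (assumed)

ok, rvec, t = cv2.solvePnP(pts_3d, pts_2d, M, distCoeffs=None)
R, _ = cv2.Rodrigues(rvec)  # rotation matrix of the first mapping model

# Check the model s*(x, y, 1)^T = M [R|t] (X, Y, Z, 1)^T by reprojecting:
reproj, _ = cv2.projectPoints(pts_3d, rvec, t, M, None)
```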
Step 102, shooting a target object to be positioned by using the infrared thermal imager to obtain a first image, and obtaining first target three-dimensional coordinates of each pixel point in the first scanning image of the target object, which are obtained by scanning the infrared scanning line of the infrared laser sensor.
Step 103, mapping the first target three-dimensional coordinate to a first target two-dimensional coordinate on the first image by using the first mapping model, so as to obtain distance data corresponding to the first target two-dimensional coordinate.
In the embodiment of the present invention, after the first mapping model is obtained, when a target object needs to be located, the target object may be directly photographed with the infrared thermal imager to obtain a first image; the first target three-dimensional coordinate of each pixel point in the first scanned image of the target object, obtained by scanning with the infrared scanning line of the infrared laser sensor, is acquired; and the first target three-dimensional coordinate is then mapped by the first mapping model to a first target two-dimensional coordinate on the first image, so that the distance data corresponding to the first target two-dimensional coordinate is obtained.
Specifically, the infrared laser sensor emits an infrared scanning line (infrared laser) in a given direction and receives the infrared laser reflected by an obstacle; from the time difference between emission and reception it calculates the distance to the nearest obstacle in that direction. That distance data is then mapped to the corresponding first target two-dimensional coordinate in the first image, so that the first target two-dimensional coordinate carries the corresponding distance data. This solves the technical problem that an infrared thermal imager can only give the azimuth information of survivors and cannot obtain specific distance information, and facilitates the smooth development of search and rescue work; a worked sketch of this arithmetic follows.
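To make the time-of-flight arithmetic and the final pixel lookup concrete, here is a hedged sketch; the helper names and all input values are invented, and M, R, t stand for the intrinsic matrix and extrinsic parameters obtained above.

```python
import numpy as np

# Hedged sketch of step 103: convert a laser time of flight into a range and
# attach it to the pixel the first mapping model projects the point onto.
C = 299_792_458.0  # speed of light, m/s

def tof_to_range(dt):
    """Round-trip time between emission and reception -> one-way distance."""
    return C * dt / 2.0

def project(M, R, t, p3d):
    """First mapping model: s*(x, y, 1)^T = M [R|t] (X, Y, Z, 1)^T."""
    p = M @ (R @ p3d + t)
    return p[:2] / p[2]

# Assumed calibration results and one laser return (placeholder values):
M = np.array([[320.0, 0, 160], [0, 320, 120], [0, 0, 1]])
R, t = np.eye(3), np.array([0.0, -0.05, 0.0])
laser_point = np.array([0.1, 0.0, 2.0])   # 3D coordinate from the scan
dt = 13.34e-9                             # ~2 m range: 4 m round trip / c

u, v = np.round(project(M, R, t, laser_point)).astype(int)
distance_map = {(u, v): tof_to_range(dt)}  # pixel -> distance, in meters
print(distance_map)
```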
The embodiment of the present invention further provides a positioning apparatus, which includes a module for executing each step in the aforementioned positioning method, and the apparatus can be integrated in the aforementioned terminal. Where not described in detail in the apparatus, reference is made to the description of the aforementioned process.
Referring to fig. 6, fig. 6 is a schematic block diagram of a positioning apparatus provided in an embodiment of the present invention, where the positioning apparatus is configured in the terminal, and the terminal is configured with an infrared laser sensor and an infrared thermal imager, and the positioning apparatus includes: an acquisition unit 61, a photographing unit 62, and a positioning unit 63.
An obtaining unit 61, configured to obtain a first mapping model that is constructed in advance; the first mapping model is used for mapping a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image shot by the infrared thermal imager;
a shooting unit 62, configured to shoot a target object to be positioned by using the infrared thermal imager to obtain a first image, and obtain a first target three-dimensional coordinate of each pixel point of the first scanned image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and a positioning unit 63, configured to map the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model, so as to obtain distance data corresponding to the first target two-dimensional coordinate.
Optionally, the positioning device further comprises a construction unit;
the construction unit is used for calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the infrared thermal imager; the NoIR camera is a camera without an infrared filter; and calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model.
The construction unit is further specifically configured to:
scanning a calibration plate by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration plate, meanwhile, shooting the calibration plate by using the NoIR camera to obtain a second image of the calibration plate, and shooting the calibration plate by using the infrared thermal imager to obtain a third image of the calibration plate;
extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image;
mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model;
and determining an external reference matrix in the first mapping model by using the third target two-dimensional coordinate, the internal reference matrix of the infrared thermal imager and the second target three-dimensional coordinate, and obtaining the first mapping model.
The construction unit is further specifically configured to:
and obtaining a rotation matrix and a translation vector in the first mapping model by utilizing a pose estimation algorithm solvepnp according to the three-dimensional coordinate of the third target, the internal reference matrix of the infrared thermal imager and the three-dimensional coordinate of the second target.
The construction unit is further specifically configured to:
shooting a calibration plate by using the infrared thermal imager and the NoIR camera respectively to obtain a fourth image and a fifth image of the calibration plate, and obtaining pixel point coordinates of corresponding positions between the fourth image and the fifth image;
constructing the second mapping model as
Figure BDA0001930126760000111
Calculating to obtain a calibration parameter H by using the pixel point coordinates of the corresponding position between the fourth image and the fifth image, and obtaining the first mapping image; wherein (x)1,y1) (x) is a coordinate on the fourth image2,y2) Are coordinates on the fifth image.
Optionally, the calibration plate is formed by two triangular metal plates forming a set angle with each other, and each metal plate is provided with circular through holes distributed in an array manner.
Fig. 7 is a schematic diagram of a terminal according to an embodiment of the present invention. The terminal of this embodiment is provided with an infrared thermal imager and an infrared laser sensor; as shown in fig. 7, the terminal 7 may further include a processor 70, a memory 71, and a computer program 72, e.g. a positioning program, stored in the memory 71 and executable on the processor 70. When the processor 70 executes the computer program 72, the steps in the above embodiment of the positioning method are implemented, such as steps 101 to 103 shown in fig. 1. Alternatively, when the processor 70 executes the computer program 72, the functions of the modules/units in the above device embodiments are implemented, such as the functions of the units 61 to 63 shown in fig. 6.
Illustratively, the computer program 72 may be divided into one or more modules/units, which are stored in the memory 71 and executed by the processor 70 to carry out the invention. One or more of the modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 72 in the terminal 7. For example, the computer program 72 may be divided into an acquisition unit, a shooting unit, and a positioning unit (unit in a virtual device), and the specific functions of each module are as follows:
the device comprises an acquisition unit, a mapping unit and a mapping unit, wherein the acquisition unit is used for acquiring a first mapping model which is constructed in advance; the first mapping model is used for mapping a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image shot by the infrared thermal imager;
the shooting unit is used for shooting a target object to be positioned by using the infrared thermal imager to obtain a first image and acquiring a first target three-dimensional coordinate of each pixel point of the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and the positioning unit is used for mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate.
The terminal 7 may include, but is not limited to, a processor 70 and a memory 71. It will be appreciated by those skilled in the art that fig. 7 is only an example of the terminal 7 and does not constitute a limitation of the terminal 7; it may comprise more or fewer components than those shown, or combine some components, or have different components; for example, the terminal device may further comprise input/output devices, network access devices, buses, etc.
The Processor 70 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 71 may be an internal storage unit of the terminal 7, such as a hard disk or a memory of the terminal 7. The memory 71 may also be an external storage device of the terminal 7, such as a plug-in hard disk provided on the terminal 7, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 71 may also include both an internal storage unit of the terminal 7 and an external storage device. The memory 71 is used for storing computer programs and other programs and data required by the terminal device. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules, so as to perform all or part of the functions described above. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described apparatus/terminal device embodiments are merely illustrative: the division into modules or units is only a logical functional division, and other divisions are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method according to the embodiments of the present invention may also be implemented by a computer program instructing related hardware, where the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, it may implement the steps of the method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution media, and the like. It should be noted that the content contained in the computer readable medium may be suitably increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunications signals.
The above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A positioning method is applied to a terminal, and is characterized in that the terminal is provided with an infrared laser sensor and an infrared thermal imager, and the positioning method comprises the following steps:
acquiring a pre-constructed first mapping model; the first mapping model is used for mapping a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image shot by the infrared thermal imager;
shooting a target object to be positioned by using the infrared thermal imager to obtain a first image, and acquiring a first target three-dimensional coordinate of each pixel point in the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate.
2. The positioning method of claim 1, wherein the step of constructing the first mapping model comprises:
calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the infrared thermal imager; the NoIR camera is a camera without an infrared filter;
and calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model.
3. The positioning method according to claim 2, wherein the calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model comprises:
scanning a calibration plate by using an infrared scanning line of an infrared laser sensor to obtain a second target three-dimensional coordinate of each pixel point in a second scanning image of the calibration plate, meanwhile, shooting the calibration plate by using the NoIR camera to obtain a second image of the calibration plate, and shooting the calibration plate by using the infrared thermal imager to obtain a third image of the calibration plate;
extracting a second target two-dimensional coordinate corresponding to the second target three-dimensional coordinate in the second image;
mapping the second target two-dimensional coordinate into a third target two-dimensional coordinate by using the second mapping model;
and determining an external reference matrix in the first mapping model by using the third target two-dimensional coordinate, the internal reference matrix of the infrared thermal imager and the second target three-dimensional coordinate, and obtaining the first mapping model.
4. The method of claim 3, wherein said determining an external reference matrix in the first mapping model using the third target two-dimensional coordinates, the internal reference matrix of the infrared thermal imager, and the second target three-dimensional coordinates comprises:
and obtaining a rotation matrix and a translation vector in the first mapping model by utilizing a pose estimation algorithm solvepnp according to the three-dimensional coordinate of the third target, the internal reference matrix of the infrared thermal imager and the three-dimensional coordinate of the second target.
5. The method according to claim 1, wherein the calibrating the thermal infrared imager and the NoIR camera to obtain the second mapping model comprises:
shooting a calibration plate by using the infrared thermal imager and the NoIR camera respectively to obtain a fourth image and a fifth image of the calibration plate, and obtaining pixel point coordinates of corresponding positions between the fourth image and the fifth image;
constructing the second mapping model as

$$\begin{bmatrix} x_1 \\ y_1 \\ 1 \end{bmatrix} = H \begin{bmatrix} x_2 \\ y_2 \\ 1 \end{bmatrix}$$

and calculating the calibration parameter H from the pixel point coordinates of the corresponding positions between the fourth image and the fifth image to obtain the second mapping model; wherein $(x_1, y_1)$ is a coordinate on the fourth image and $(x_2, y_2)$ is a coordinate on the fifth image.
6. The positioning method according to any one of claims 3-5, wherein the calibration plate is formed by two triangular metal plates at a set angle to each other, and each metal plate is provided with circular through holes distributed in an array.
7. A positioning apparatus configured in a terminal, characterized in that the terminal is provided with an infrared laser sensor and an infrared thermal imager, and the positioning apparatus comprises:
the device comprises an acquisition unit, a mapping unit and a mapping unit, wherein the acquisition unit is used for acquiring a first mapping model which is constructed in advance; the first mapping model is used for mapping a three-dimensional coordinate obtained by scanning of the infrared laser sensor to a coordinate on an image shot by the infrared thermal imager;
the shooting unit is used for shooting a target object to be positioned by using the infrared thermal imager to obtain a first image and acquiring a first target three-dimensional coordinate of each pixel point of the first scanning image of the target object, which is obtained by scanning an infrared scanning line of the infrared laser sensor;
and the positioning unit is used for mapping the first target three-dimensional coordinate into a first target two-dimensional coordinate on the first image by using the first mapping model to obtain distance data corresponding to the first target two-dimensional coordinate.
8. The positioning device of claim 7, further comprising a building unit;
the construction unit is used for calibrating the infrared thermal imager and the NoIR camera to obtain an internal reference matrix and a second mapping model of the infrared thermal imager; the second mapping model is used for mapping pixel coordinates of an image shot by the NoIR camera to coordinates on the image shot by the infrared thermal imager; the NoIR camera is a camera without an infrared filter;
and calibrating the infrared thermal imager and the infrared laser sensor by using the internal reference matrix of the infrared thermal imager and the second mapping model to obtain the first mapping model.
9. A terminal comprising an infrared laser sensor, an infrared thermal imager, a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 6 when executing the computer program.
10. A computer storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN201811636299.4A 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium Active CN111383264B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811636299.4A CN111383264B (en) 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811636299.4A CN111383264B (en) 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium

Publications (2)

Publication Number Publication Date
CN111383264A (en) 2020-07-07
CN111383264B CN111383264B (en) 2023-12-29

Family

ID=71220525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811636299.4A Active CN111383264B (en) 2018-12-29 2018-12-29 Positioning method, positioning device, terminal and computer storage medium

Country Status (1)

Country Link
CN (1) CN111383264B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1354355A (en) * 2001-12-10 2002-06-19 西安交通大学 Laser linear scanning three-dimensional measurement double liquid knife virtual grid mapping calibrating method and equipment
CN1491403A (en) * 2001-10-29 2004-04-21 Sony Corp Non-flat image processing apparatus and image processing method, and recording medium and computer program
CN101854846A (en) * 2007-06-25 2010-10-06 真实成像有限公司 Method, device and system for thermography
US20120078088A1 (en) * 2010-09-28 2012-03-29 Point of Contact, LLC. Medical image projection and tracking system
US20120121128A1 (en) * 2009-04-20 2012-05-17 Bent 360: Medialab Inc. Object tracking system
US20180100927A1 (en) * 2016-10-12 2018-04-12 Faro Technologies, Inc. Two-dimensional mapping system and method of operation
CN108921889A (en) * 2018-05-16 2018-11-30 天津大学 A kind of indoor 3-D positioning method based on Augmented Reality application
CN109000566A (en) * 2018-08-15 2018-12-14 深圳科瑞技术股份有限公司 Scanning three-dimensional imaging laser and CCD two-dimensional imaging combination measurement method and device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112085771A (en) * 2020-08-06 2020-12-15 深圳市优必选科技股份有限公司 Image registration method and device, terminal equipment and computer readable storage medium
CN112085771B (en) * 2020-08-06 2023-12-05 深圳市优必选科技股份有限公司 Image registration method, device, terminal equipment and computer readable storage medium
CN114155349A (en) * 2021-12-14 2022-03-08 杭州联吉技术有限公司 Three-dimensional mapping method, three-dimensional mapping device and robot
CN114155349B (en) * 2021-12-14 2024-03-22 杭州联吉技术有限公司 Three-dimensional image construction method, three-dimensional image construction device and robot

Also Published As

Publication number Publication date
CN111383264B (en) 2023-12-29

Similar Documents

Publication Publication Date Title
CN111145238B (en) Three-dimensional reconstruction method and device for monocular endoscopic image and terminal equipment
CN109035320B (en) Monocular vision-based depth extraction method
CN109146980B (en) Monocular vision based optimized depth extraction and passive distance measurement method
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
CN105096329B (en) Method for accurately correcting image distortion of ultra-wide-angle camera
JPWO2018235163A1 (en) Calibration apparatus, calibration chart, chart pattern generation apparatus, and calibration method
CN108805938B (en) Detection method of optical anti-shake module, mobile terminal and storage medium
WO2021139176A1 (en) Pedestrian trajectory tracking method and apparatus based on binocular camera calibration, computer device, and storage medium
CN111461963B (en) Fisheye image stitching method and device
CN112686950B (en) Pose estimation method, pose estimation device, terminal equipment and computer readable storage medium
CN108182708B (en) Calibration method and calibration device of binocular camera and terminal equipment
WO2019232793A1 (en) Two-camera calibration method, electronic device and computer-readable storage medium
CN112927306B (en) Calibration method and device of shooting device and terminal equipment
CN110619660A (en) Object positioning method and device, computer readable storage medium and robot
EP3944194A1 (en) Fisheye camera calibration system, method and apparatus, and electronic device and storage medium
CN110136048B (en) Image registration method and system, storage medium and terminal
CN112288826A (en) Calibration method and device of binocular camera and terminal
Perdigoto et al. Calibration of mirror position and extrinsic parameters in axial non-central catadioptric systems
CN114549660B (en) Multi-camera calibration method, device and equipment based on cylindrical self-identification marker
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN210986289U (en) Four-eye fisheye camera and binocular fisheye camera
CN116051652A (en) Parameter calibration method, electronic equipment and storage medium
CN113592934B (en) Target depth and height measuring method and device based on monocular camera
CN115601449A (en) Calibration method, panoramic image generation method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant