WO2018233514A1 - Pose measurement method and device, and storage medium - Google Patents



Publication number
WO2018233514A1
Authority
WO
WIPO (PCT)
Prior art keywords
dimensional, real, virtual, rotation, matrix
Prior art date
Application number
PCT/CN2018/090821
Other languages
English (en)
Chinese (zh)
Inventor
徐坤
周轶
范国田
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Publication of WO2018233514A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G01C 11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C 11/08 Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken

Definitions

  • the present invention relates to the field of measurement technology, and in particular, to a pose measurement method, device, and storage medium.
  • Object pose measurement has important application value in many fields such as modern communication, national defense and aerospace.
  • the measurement of the antenna pose is closely related to the coverage of the base station; for example, during the robot assembly process, the pose of the robot determines the accuracy of the assembly. How to achieve the pose measurement of objects has always been the focus of research.
  • In the related art, pose measurement usually requires dedicated measurement personnel to operate measuring instruments directly on the object.
  • This approach is feasible only for objects that can be approached and operated on at close range.
  • For objects such as antennas and aircraft that must be measured from a distance,
  • such measurement methods are very dangerous: the personal safety of the measurement personnel cannot be ensured, and excessive manpower is required. It is therefore very necessary to provide a simple and feasible pose measurement method.
  • The embodiments of the invention provide a pose measurement method, device, and storage medium, which solve the problems that pose measurement methods in the related art are labor-intensive and cannot guarantee the personal safety of measurement personnel.
  • A pose measurement method, including: acquiring a plurality of photos of an object photographed at different positions and the posture parameters of the photographing device when the photos are taken; constructing a virtual three-dimensional scene according to the object photos and matching the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameters of the photographing device; and calculating the posture parameters of the object according to the position information of the object in the real three-dimensional environment.
  • the posture parameter of the photographing device includes rotation angle information of the photographing device; and the matching the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the photographing device, including:
  • the posture parameter of the photographing device includes the positioning information of the photographing device; and the matching the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the photographing device, further includes:
  • a panning and scaling transformation matrix of the virtual three-dimensional scene to the real three-dimensional environment is determined according to the positioning information and the position of the photographing device in the virtual three-dimensional scene.
  • the calculating a rotation transformation matrix of the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix comprises:
  • the averages of the rotation angles of the respective rotation transformation matrices about the three coordinate axes are calculated, and the final rotation transformation matrix is reconstructed from the average values.
  • the calculating the attitude parameter of the object according to the location information of the object in the real three-dimensional environment comprises:
  • Two pairs of points with the same name are determined according to the straight line of the same name calibrated in the two target photos;
  • the rotation transformation matrix and the translation and scaling transformation matrix are used to transform the positions of the centers of the two phase points, and the positioning values and altitude values of the objects in the real three-dimensional environment are obtained according to the transformed positions.
  • the two pairs of the same name are determined according to the same-named line calibrated in the two target photos, including:
  • the intersection of the epipolar line and the same-named straight line in the other target photo is the same-named point of the endpoint.
  • the determining, according to the two pairs of phase names of the same name, the positions of two phase points in the virtual three-dimensional scene including:
  • a linear equation of the same name pair is formed into a linear equation group, and the linear equation group is solved by a matrix method, and the position of the phase point in the virtual three-dimensional scene is determined based on the solution.
  • forming the linear equations of the same-named point pairs into a linear equation group, solving the linear equation group by a matrix method, and determining the position of the phase points in the virtual three-dimensional scene based on the solution, includes:
  • a new weighting factor for each phase point is obtained from the previous solution;
  • the linear equations of each phase point are divided by the corresponding weighting factors to obtain a new linear equation group, and a new solution is calculated. This step is repeated until the new solution is equal to the previous solution; the step is then performed once more, and the final solution obtained is the position of the phase point.
  • the calculating the attitude parameter of the object according to the location information of the object in the real three-dimensional environment comprises:
  • the attitude angle of the object is calculated based on the position information of the straight line in the real space.
  • a pose measuring apparatus includes a camera, a processor, and a memory, wherein the memory stores a pose measurement program; the processor is configured to execute the stored in the memory The program is used to implement the steps in the pose measurement method described above.
  • A computer readable storage medium on which a pose measurement program is stored, the pose measurement program being executed by a processor to implement the steps of the pose measurement method described above.
  • In the embodiments of the invention, the object is photographed at different positions by the photographing device, and the photos of the object are analyzed in combination with the posture parameters of the photographing device, so that the actual pose parameter information of the object can be obtained. The embodiments of the invention thus realize remote measurement of the pose of the object; the operation is very simple and convenient, measurement personnel are not required to climb, and the risk of measurement is effectively eliminated.
  • FIG. 1 is a flowchart of a pose measurement method according to an embodiment of the present invention.
  • FIGS. 2a and 2b are schematic diagrams showing the result of calculating the phase point of the same name in an embodiment of the present invention.
  • FIG. 3 is a flow chart of measuring an antenna with the trifocal tensor according to an embodiment of the present invention.
  • FIG. 4 is a schematic block diagram of a pose measuring device according to an embodiment of the present invention.
  • the pose measurement method includes the following steps:
  • Step 101 Acquire a plurality of photos of the objects photographed at different positions and posture parameters of the photographing device when the photographs are taken.
  • The object here is not limited to an antenna; it may also be an object such as a robot or an aircraft whose attitude parameters need to be monitored.
  • For example, the posture parameters of a robot arm need to be tested during assembly to ensure the accuracy of the assembly; or, in an aircraft wind test, the flight attitude is detected.
  • The photographing device here includes a camera, a video camera, a mobile phone, a tablet computer, and the like; the type of the photographing device is not specifically limited.
  • When the photographing device is a movable device, it can acquire the posture parameter information by itself, and the user can freely choose shooting locations to obtain photos of the object taken at different positions.
  • the attitude parameters of the photographing apparatus include positioning information (for example, global positioning system GPS information) and information such as three rotation angles (rotation angles with respect to three coordinate axes).
  • When the photographing device is a movable device, these parameters can be obtained directly from the corresponding sensors; otherwise, the rotation angles and GPS information can be pre-configured and the configuration read directly when needed.
  • Step 102 Construct a virtual three-dimensional scene according to the object photo, and match the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the shooting device.
  • Matching the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameters of the photographing device mainly consists of calculating the rotation transformation matrix from the rotation angle information of the photographing device, and calculating the translation and scaling matrix from the positioning information of the photographing device.
  • The transformation from the virtual 3D scene to the real 3D environment can then be achieved with the rotation transformation matrix and the translation and scaling transformation matrices.
  • Step 103 Calculate the attitude parameter of the object according to the position information of the object in the real three-dimensional environment.
  • In step 102, the conversion information from the virtual three-dimensional scene to the real three-dimensional environment has been acquired. With this conversion information, the position information of the antenna in real space can be obtained after the user calibrates, in the object photos, the position to be measured.
  • the attitude parameter of the object can be directly calculated according to the position information of the object in the real space.
  • In summary, the pose measurement method photographs an object at different positions with a photographing device, analyzes the photos of the object in combination with the posture parameters of the photographing device, and finally obtains the actual pose parameters of the object in the photos through mathematical algorithms. The embodiments of the invention realize remote measurement of the pose of the object and do not require the measuring personnel to climb, so the operation is very simple and convenient, the risk of measurement is effectively eliminated, and the personal safety of the measuring personnel is ensured.
  • the antenna is taken as an example to introduce the technical solution through the pose measurement process of the antenna.
  • Step 101 Acquire a plurality of photos of the objects photographed at different positions and posture parameters of the photographing device when the photographs are taken.
  • In this example, a mobile phone is used as the measuring device: photos of the antenna are taken with the phone's camera, while the GPS position information and posture parameters are acquired by the phone's sensors (for example, a GPS sensor, a gyroscope, an accelerometer, and an electronic compass). Several antenna photos are taken at different locations (the number can be anywhere from 5 to 10).
  • Each time a photo is taken, the posture parameters of the mobile phone are acquired and recorded; the posture parameters here include the GPS position and the rotation angles about the three coordinate axes.
  • Step 102 Construct a virtual three-dimensional scene according to the object photo, and match the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the shooting device.
  • the rotation transformation matrix is calculated by the three rotation angle information acquired when the shooting device is photographed, and the following steps are included:
  • Step 21 Acquire a first rotation matrix R SfM of the photographing device from the two-dimensional image plane to the virtual three-dimensional scene by using the SFM algorithm.
  • The rotation matrix in the external parameters of each photographing device (the rotation matrix from the two-dimensional image plane to the SFM space) is obtained by the SFM algorithm; this is the first rotation matrix, and
  • through this matrix the attitude of the photographing device in the virtual three-dimensional space relative to the two-dimensional image plane can be known.
  • Step 22 Acquire a second rotation matrix of the photographing device from the two-dimensional image plane to the real three-dimensional environment according to the rotation angle information of the photographing device.
  • the rotation matrix of the photographing device in the real three-dimensional environment can be obtained according to the three rotation angles of the photographing device.
  • the following formula is used to calculate the attitude of the phone's own coordinate system with respect to the real three-dimensional environment (the geodetic coordinate system):
  • R_camera = R_phone * R_x(180) * R_z(-90)  (1)
  • where R_x(180) means the coordinate system is first rotated 180 degrees counterclockwise around the x-axis of the phone's own coordinate system,
  • R_z(-90) means the coordinate system is then rotated 90 degrees clockwise around the z-axis,
  • and R_phone is the rotation matrix built from the three rotation angles recorded by the photographing device when shooting.
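Formula (1) can be sketched as follows (a minimal illustration, assuming R_phone is supplied as a 3x3 rotation matrix and that positive angles are counterclockwise, as stated in the text):

```python
import numpy as np

def rot_x(deg):
    # Counterclockwise rotation about the x-axis, angle in degrees.
    a = np.radians(deg)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])

def rot_z(deg):
    # Counterclockwise rotation about the z-axis; a negative angle is clockwise.
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

def camera_rotation(r_phone):
    # Formula (1): R_camera = R_phone * R_x(180) * R_z(-90).
    return r_phone @ rot_x(180) @ rot_z(-90)
```

The result is again orthogonal, since it is a product of rotation matrices.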
  • Step 23 Calculate a rotation transformation matrix of the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix.
  • From step 21 and step 22 we have obtained, respectively, the rotation matrix R_SfM of the photographing device in the SFM space (the virtual three-dimensional scene) and the rotation matrix R_camera of the photographing device in the real three-dimensional environment. The rotation transformation matrix R_trans is then obtained by formula (2):
  • R_trans = R_camera * (R_SfM)^-1  (2)
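Formula (2) in code (a sketch; since a rotation matrix is orthogonal, the inverse of R_SfM is simply its transpose):

```python
import numpy as np

def rotation_transform(r_camera, r_sfm):
    # Formula (2): R_trans = R_camera * (R_SfM)^-1.
    # Rotation matrices are orthogonal, so the inverse equals the transpose.
    return r_camera @ r_sfm.T
```

By construction, applying R_trans to R_SfM recovers R_camera, which is exactly the virtual-to-real alignment the matrix is meant to express.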
  • Since the photographing device photographs the object at several different positions, several rotation transformation matrices are obtained, one per position. To ensure the accuracy of the rotation transformation matrix, the embodiment of the present invention
  • averages the rotation transformation matrices obtained from the different positions to obtain the final rotation transformation matrix. Specifically, the averages of the rotation angles of the respective rotation transformation matrices about the three coordinate axes are computed, and the final rotation transformation matrix is reconstructed from the average values.
  • That is, each rotation transformation matrix is decomposed into three rotation angles R_x, R_y, R_z; the angles are averaged per axis, and the final rotation transformation matrix is reconstructed from the three average angles.
  • the conversion process of the matrix and the rotation angle here is a technique well known to those skilled in the art and will not be described in detail herein.
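The averaging step can be sketched with SciPy's rotation utilities (an illustrative sketch; the "xyz" Euler convention is an assumption, since the text does not fix one, and naive per-axis averaging assumes the angles do not straddle the ±180° wrap-around):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def average_rotation_transform(matrices):
    # Decompose each rotation transformation matrix into three rotation
    # angles (R_x, R_y, R_z), average per axis, and rebuild the final matrix.
    angles = np.array([Rotation.from_matrix(m).as_euler("xyz", degrees=True)
                       for m in matrices])
    mean = angles.mean(axis=0)
    return Rotation.from_euler("xyz", mean, degrees=True).as_matrix()
```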
  • The coordinate system of real space is defined as follows: the Y axis represents the true north direction, the X axis the true east direction, and the Z axis the vertical direction from the center of the earth toward the sky. It can be seen that the side of the building is nearly perpendicular to the Z axis; therefore, to make the conversion more accurate,
  • the positions of the point cloud need to be translated and scaled.
  • When calculating the conversion parameters, since scaling uses the same ratio on every coordinate axis, the values in the Z-axis direction can be ignored;
  • the calculation using the X-axis and Y-axis values is explained below.
  • Suppose the position of the photographing device converted to meters is (X_i, Y_i) (that is, the position of the photographing device in the real three-dimensional environment, obtained from its GPS information), and the corresponding position of the photographing device in the three-dimensional reconstruction space is (x_i, y_i). With scaling parameter s and translation parameters (t_x, t_y), each photo contributes the equations X_i = s * x_i + t_x and Y_i = s * y_i + t_y,
  • where n represents the number of photographs of the object; for several photos this is an overdetermined system of equations.
  • It can be solved by the least squares method, QR decomposition, singular value decomposition (SVD), or other common methods in matrix theory, yielding the target translation and scaling conversion parameters.
  • Alternatively, an interface can be provided for the user to input the zoom parameter: the user supplies the average moving distance between each two antenna photos at shooting time, and this value divided by the average distance between neighboring photographing-device positions in the virtual three-dimensional scene gives the scaling parameter. Normally, the user's average moving distance is between 0.3 and 1 meter.
  • The translation parameter can then still be solved with the formula above. The reason for providing this method is that GPS positioning accuracy is low. For example, if the user takes 5 photos moving 0.5 meters between adjacent photos, the total range of movement is only 2.5 meters, while GPS accuracy is often only around 3 meters; it is therefore quite possible that the user moves but the GPS readings cluster too densely.
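Assuming the linear model X_i = s * x_i + t_x, Y_i = s * y_i + t_y (one reading of the overdetermined system described above, with a single scale shared by both axes), the conversion parameters can be fit by least squares. The stacking of the unknowns [s, t_x, t_y] below is one possible arrangement:

```python
import numpy as np

def fit_translation_scale(virtual_xy, real_xy):
    # Least-squares fit of real = s * virtual + t, same scale s on both axes.
    # virtual_xy, real_xy: (n, 2) arrays of camera positions.
    v = np.asarray(virtual_xy, float)
    r = np.asarray(real_xy, float)
    n = len(v)
    # Unknowns: [s, t_x, t_y]; each photo contributes two rows.
    A = np.zeros((2 * n, 3))
    b = np.zeros(2 * n)
    A[0::2, 0] = v[:, 0]; A[0::2, 1] = 1.0; b[0::2] = r[:, 0]
    A[1::2, 0] = v[:, 1]; A[1::2, 2] = 1.0; b[1::2] = r[:, 1]
    (s, tx, ty), *_ = np.linalg.lstsq(A, b, rcond=None)
    return s, (tx, ty)
```

`np.linalg.lstsq` solves the overdetermined system in the least-squares sense; QR or SVD-based solvers would give the same result.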
  • Step 103 Calculate the attitude parameter of the object according to the position information of the object in the real three-dimensional environment.
  • the antenna attitude parameters include the attitude angle of the antenna, as well as GPS and altitude (height) information.
  • We first introduce the acquisition of the GPS and altitude (height) information.
  • After the three-dimensional reconstruction, the internal and external parameters of the photographing devices can be obtained.
  • Same-named phase points representing the antenna position are then used to calculate the three-dimensional spatial position of the antenna target, as follows:
  • Two pairs of points with the same name are determined according to the straight line of the same name calibrated in the two target photos;
  • the rotation transformation matrix, translation and scaling transformation matrix are used to transform the position of the center of the two phase points, and the GPS and altitude values of the antenna in the real three-dimensional environment are obtained according to the transformed position.
  • the user after taking a picture of the antenna, the user selects two photos from the captured photos as the target picture, and calibrates the straight line of the same name indicating the position of the antenna in the target picture.
  • the final GPS and altitude values are determined based on the information of the line of the same name.
  • The method of inputting same-named straight line pairs combined with the epipolar constraint of binocular vision is used to obtain the same-named phase points, as follows:
  • the epipolar lines at the two endpoints of the same-named line in one of the target photos are calculated according to the epipolar constraint principle of binocular vision;
  • the intersection of each epipolar line with the same-named line in the other target photo is the same-named point of the corresponding endpoint.
  • The user selects two photos as target photos, selects the antenna target in each target photo, and marks the same-named straight line pair representing the antenna.
  • According to the epipolar constraint principle of binocular vision, the epipolar line of endpoint a of the straight line l in FIG. 2a is calculated; its intersection a' with the same-named straight line l' in FIG. 2b is the same-named point of a. In this way the same-named point pairs, and thus the same-named straight line pair, are obtained.
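In homogeneous coordinates this construction reduces to two one-line operations (a sketch; F denotes the fundamental matrix from the first target photo to the second, and points/lines are 3-vectors, none of which the text names explicitly):

```python
import numpy as np

def same_name_point(F, endpoint, line_other):
    # Epipolar transfer of a line endpoint, all in homogeneous coordinates.
    # F: 3x3 fundamental matrix photo1 -> photo2.
    # endpoint: (3,) endpoint of the same-named line in photo1.
    # line_other: (3,) coefficients of the same-named line in photo2.
    epiline = F @ endpoint          # epipolar line of the endpoint in photo2
    p = np.cross(epiline, line_other)  # intersection of the two lines
    return p / p[2]                 # normalized same-named point (x, y, 1)
```

The cross product of two homogeneous line vectors is their intersection point, which is why no explicit line-equation solving is needed.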
  • The above method of acquiring same-named phase points slightly increases the difficulty of the user's operation,
  • but current straight-line matching algorithms cannot reliably match corresponding straight lines automatically, whereas the method of the embodiment of the present invention guarantees the accuracy of the straight-line matching and of the same-named points.
  • the embodiment of the present invention is implemented as follows:
  • the linear equations of the same-named pair are formed into a linear equation, and the linear equations are solved by the matrix method.
  • the position of the phase points in the virtual three-dimensional scene is determined based on the solution result.
  • Each phase point here corresponds to a shooting angle, that is, the shooting angle of the two target photos (different shooting positions) mentioned above.
  • X is the position coordinate of the three-dimensional space of the phase point to be calculated.
  • To solve it, the same-named phase points and the projection matrices of the photographing devices are needed.
  • A is a 4 x 4 matrix. Normally, due to the presence of noise, the equation AX = 0 cannot hold exactly; instead, one finds the X that minimizes ||AX|| subject to the constraint ||X|| = 1.
  • the optimal solution of this equation is the eigenvector corresponding to the minimum eigenvalue of the matrix A T A.
  • the general solution such as SVD, QR decomposition can be used to solve X.
  • The solved X = (a, b, c, d) is converted to the homogeneous coordinate X1 = (a/d, b/d, c/d, 1),
  • giving the final target coordinate T = (a/d, b/d, c/d).
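The linear solution can be sketched as a standard DLT-style triangulation consistent with the 4x4 system described above (the exact row construction from the projection matrices is an assumption, since the text does not spell it out):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    # Linear triangulation of one same-named phase point pair.
    # P1, P2: 3x4 projection matrices; x1, x2: image coordinates (u, v).
    # Each view contributes two rows of the form u * p3 - p1, v * p3 - p2,
    # where p1, p2, p3 are the rows of its projection matrix.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The minimizer of ||AX|| with ||X|| = 1 is the right singular vector of
    # the smallest singular value (eigenvector of A^T A, as stated above).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]              # homogeneous solution (a, b, c, d)
    return X[:3] / X[3]     # final target coordinate T
```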
  • One problem with the linear method is that the minimized quantity ||AX|| is an algebraic error: the spatial point X computed this way does not exactly satisfy the linear equations, and the residual has no direct geometric meaning. What one actually wants to minimize is the distance between the real image point x and the projection of X onto the photo. This amounts to dividing each linear equation by a weighting factor, so that the final minimized error matches the coordinate error in the photographic sense.
  • The embodiment of the present invention implements this with an iterative method: a new weighting factor for each view is computed from the previous solution, the linear equations of each view are divided by the corresponding weighting factors to obtain a new linear equation group, and a new solution is calculated. This step is repeated until the new solution equals the previous one; the solution step is then performed once more with the final weighting factors, and the solution so calculated is taken as the result.
  • Dividing each view's equations by its optimal weighting factor makes the minimized equation correspond to the coordinate error in the photographic sense, and the obtained solution is the position of the final target space point.
  • Such a linear iterative method has high accuracy, generally converges in a small number of iterations, and is simple to implement.
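The iteration can be sketched as follows (an illustrative sketch: the choice of weighting factor w = p3 . X, with p3 the third row of a view's projection matrix, is the usual one for making the algebraic error approximate the pixel error; the text does not state the factor explicitly):

```python
import numpy as np

def triangulate_iterative(P1, P2, x1, x2, iters=10):
    # Iteratively reweighted linear triangulation (sketch).
    # Each view's two equation rows are divided by w = p3 . X from the
    # previous solution, so the minimized error approaches reprojection error.
    rows = [(x1[0], P1[2], P1[0]), (x1[1], P1[2], P1[1]),
            (x2[0], P2[2], P2[0]), (x2[1], P2[2], P2[1])]
    w = [1.0, 1.0]                       # one weighting factor per view
    X = np.array([0.0, 0.0, 0.0, 1.0])
    for _ in range(iters):
        A = np.vstack([(u * p3 - p) / w[i // 2]
                       for i, (u, p3, p) in enumerate(rows)])
        _, _, Vt = np.linalg.svd(A)
        X = Vt[-1]                       # new homogeneous solution
        w = [P1[2] @ X, P2[2] @ X]       # new weighting factor per view
    return X[:3] / X[3]
```

A fixed iteration count stands in for the "repeat until the solution stops changing" test; in noiseless cases the very first pass already returns the exact point.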
  • the GPS position and altitude in the real three-dimensional scene are calculated according to the position coordinates of the center of the phase connecting the two ends, thereby representing the The GPS position and altitude of the antenna.
  • According to the translation and scaling matrix obtained in step 102, the real-space coordinates (X, Y) of the center point are obtained from its rotation-transformed coordinates.
  • The (X, Y) coordinates are then subjected to the inverse projection transformation to obtain the GPS value.
  • The target's altitude is obtained from the transformed coordinates in the same way.
  • For the attitude angle, the spatial position coordinates of the two phase points may be transformed with the rotation transformation matrix and the translation and scaling matrix,
  • giving the coordinates of the two phase points directly in the real three-dimensional environment, from which the attitude angle of the antenna can be calculated.
  • Alternatively, the coordinate points representing the spatial straight line are re-acquired by the trifocal tensor method, and the attitude angle of the antenna is calculated from these points.
  • Specifically: determine the positions of the same straight line calibrated in three target photos; calculate the position information of the straight line in the real three-dimensional environment with the trifocal tensor algorithm, using those positions and the position information of the photographing devices; and calculate the attitude angle of the antenna from the position information of the straight line in real space.
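Once two points of the antenna axis are known in the real coordinate system (X = true east, Y = true north, Z = up, as defined earlier), the attitude angles follow from the direction vector. The azimuth/downtilt conventions below are assumptions for illustration, since the text does not define them:

```python
import numpy as np

def attitude_angles(p_top, p_bottom):
    # Attitude angles of the antenna axis from two real-space points.
    # Frame: X = east, Y = north, Z = up (assumed convention).
    d = np.asarray(p_top, float) - np.asarray(p_bottom, float)
    # Azimuth: angle from true north, measured clockwise, in degrees.
    azimuth = np.degrees(np.arctan2(d[0], d[1])) % 360
    # Downtilt: angle of the axis away from the vertical, in degrees.
    downtilt = np.degrees(np.arctan2(np.hypot(d[0], d[1]), d[2]))
    return azimuth, downtilt
```

A perfectly vertical antenna gives a downtilt of 0; an axis leaning due east at 45 degrees gives azimuth 90 and downtilt 45.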
  • Let the projection matrices of the three photographing devices centered at {C0, C1, C2} be P1, P2, P3.
  • the embodiment of the present invention further provides a pose measuring device for implementing the above-described pose measurement method.
  • The device includes a processor 42 and a memory 41 storing instructions executable by the processor 42, wherein:
  • the processor 42 may be a general-purpose processor, such as a central processing unit (CPU), or may be a digital signal processor (DSP), an application specific integrated circuit (ASIC), or One or more integrated circuits configured to implement embodiments of the present invention.
  • the memory 41 is configured to store the program code and transmit the program code to the CPU.
  • The memory 41 may include a volatile memory such as a random access memory (RAM); the memory 41 may also include a non-volatile memory such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 41 may also include a combination of the above types of memories.
  • A pose measuring device includes a camera, a processor, and a memory, wherein a pose measurement program is stored in the memory; the processor is configured to execute the pose measurement program stored in the memory to perform the following steps:
  • the attitude parameter of the object is calculated according to the position information of the object in the real three-dimensional environment.
  • the posture parameter of the photographing device includes rotation angle information of the photographing device; and matching the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the photographing device, including:
  • a rotation transformation matrix of the virtual three-dimensional scene to the real three-dimensional environment is calculated according to the first rotation matrix and the second rotation matrix.
  • the posture parameter of the shooting device includes positioning information of the shooting device; matching the virtual three-dimensional scene with the real three-dimensional environment according to the posture parameter of the shooting device, further includes:
  • the translation and scaling transformation matrix of the virtual three-dimensional scene to the real three-dimensional environment is determined according to the positioning information and the position of the photographing device in the virtual three-dimensional scene.
  • calculating a rotation transformation matrix of the virtual three-dimensional scene to the real three-dimensional environment according to the first rotation matrix and the second rotation matrix including:
  • the averages of the rotation angles of the respective rotation transformation matrices about the three coordinate axes are calculated, and the final rotation transformation matrix is reconstructed from the average values.
  • calculating the attitude parameter of the object according to the position information of the object in the real three-dimensional environment including:
  • Two pairs of points with the same name are determined according to the straight line of the same name calibrated in the two target photos;
  • the rotation transformation matrix and the translation and scaling transformation matrix are used to transform the position of the center of the two phase points, and the position value and the altitude value of the object in the real three-dimensional environment are obtained according to the transformed position.
  • two pairs of the same name are determined according to the straight line of the same name calibrated in the two target photos, including:
  • intersection of the polar line and the line of the same name in the other target photo is the point of the same name of the endpoint.
  • determining the positions of the two phase points in the virtual three-dimensional scene based on the two pairs of the same name including:
  • the linear equations of the same-named pair are formed into a linear system of equations, and the linear equations are solved by the matrix method.
  • the position of the phase points in the virtual three-dimensional scene is determined based on the solution.
  • forming the linear equations of the pairs of points of the same name into a linear equation group, solving the linear equation group by the matrix method, and determining the position of the phase point in the virtual three-dimensional scene based on the solution includes:
  • a new weighting factor is obtained for each phase point from the current solution;
  • the linear equations of each phase point are respectively divided by the corresponding weighting factors to obtain a new linear equation group, from which a new solution is calculated; this step is repeated until the new solution equals the previous solution, and the final solution so calculated is the position of the phase point.
  • calculating the attitude parameter of the object according to the position information of the object in the real three-dimensional environment includes:
  • the attitude angle of the object is calculated from the position information of the straight line in real space.
  • the embodiment of the invention further provides a computer readable storage medium.
  • the computer readable storage medium herein stores one or more programs.
  • the computer readable storage medium may include a volatile memory such as a random access memory; it may also include a non-volatile memory such as a read only memory, a flash memory, a hard disk or a solid state disk; it may also include a combination of the above categories of memory.
  • One or more programs in a computer readable storage medium may be executed by one or more processors to implement the pose measurement methods provided in the method embodiments.
  • embodiments of the present invention can be provided as a method, system, or computer program product. Accordingly, the present invention can take the form of a hardware embodiment, a software embodiment, or a combination of software and hardware. Moreover, the present invention can take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage and optical storage, etc.) including computer usable program code.
  • the computer program instructions can also be stored in a computer readable memory that can direct a computer or other programmable data processing device to operate in a particular manner, such that the instructions stored in the computer readable memory produce an article of manufacture comprising an instruction device;
  • the instruction device implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • these computer program instructions can also be loaded onto a computer or other programmable data processing device such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing;
  • the instructions executed on the computer or other programmable device thereby provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • the actual pose parameter information of the object can be obtained by photographing the object at different positions with the photographing device and analyzing the photographs of the object in combination with the pose parameters of the photographing device;
  • remote measurement of the pose of the object is thereby realized; the operation is simple and convenient, and no measurement personnel are required to climb, which effectively eliminates measurement risk.
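As a numerical illustration of the rotation-averaging step described above, the sketch below decomposes each candidate rotation transformation matrix into rotation angles about the three coordinate axes, averages them, and reconstructs the final rotation transformation matrix. The Z-Y-X Euler convention is an assumption chosen for illustration; the text does not fix a convention.

```python
import numpy as np

def euler_zyx_from_matrix(R):
    # Rotation angles about the three coordinate axes (Z-Y-X convention).
    yaw = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arcsin(-R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.array([yaw, pitch, roll])

def matrix_from_euler_zyx(angles):
    # Rebuild R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    yaw, pitch, roll = angles
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def average_rotation(matrices):
    # Average the per-axis rotation angles, then reconstruct the final matrix.
    angles = np.mean([euler_zyx_from_matrix(R) for R in matrices], axis=0)
    return matrix_from_euler_zyx(angles)
```

Angle averaging of this kind is only well-behaved when the candidate rotations are close to one another (no wrap-around near ±180°), which matches the use case of several near-identical candidate transforms.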
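The endpoint-matching step above (intersecting the epipolar line of an endpoint with the line of the same name in the other target photo) reduces, in homogeneous image coordinates, to two cross products. In this sketch the epipolar line is assumed to be already available as two points lying on it, and the coordinates are illustrative only:

```python
import numpy as np

def line_through(p, q):
    # Homogeneous line through two 2D points given as (x, y).
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersect(l1, l2):
    # Intersection of two homogeneous lines, dehomogenized to (x, y).
    x = np.cross(l1, l2)
    return x[:2] / x[2]

# Epipolar line of an endpoint in the other photo (given directly here).
epipolar = line_through((0.0, 1.0), (4.0, 1.0))   # horizontal line y = 1
same_name = line_through((2.0, 0.0), (2.0, 5.0))  # line of the same name, x = 2
pt = intersect(epipolar, same_name)               # point of the same name
```

The dehomogenization step assumes the two lines are not parallel; parallel lines yield a point at infinity (third coordinate zero), which a robust implementation would have to check for.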
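The iterative solution of the phase-point linear system described above can be sketched as an iteratively reweighted least-squares loop. The per-equation weighting function is left as a parameter, since the text only states that each phase point's equations are divided by its current weighting factor until successive solutions agree:

```python
import numpy as np

def solve_reweighted(A, b, weights_fn, tol=1e-10, max_iter=50):
    # Solve A x = b in the least-squares sense, rescaling each equation by a
    # per-equation weighting factor recomputed from the current solution,
    # iterating until the solution stops changing.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(max_iter):
        w = weights_fn(x)                        # new weighting factor per equation
        Aw, bw = A / w[:, None], b / w           # divide each equation by its factor
        x_new = np.linalg.lstsq(Aw, bw, rcond=None)[0]
        if np.allclose(x_new, x, atol=tol):      # converged: new solution equals previous
            return x_new
        x = x_new
    return x
```

With a constant weighting function the loop converges immediately to the ordinary least-squares solution; in the patent's setting the weights would be recomputed from the current phase-point estimate at each pass.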
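Finally, the attitude angle of the object can be derived from the positions of the two endpoints of the straight line in real space. The azimuth/tilt convention below is a hypothetical choice for illustration, as the text does not specify one:

```python
import math

def attitude_angles(p1, p2):
    # Direction vector of the straight line in real space.
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    # Azimuth: heading of the horizontal projection, measured from +y here.
    azimuth = math.degrees(math.atan2(dx, dy))
    # Tilt: elevation of the line above the horizontal plane.
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, tilt
```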

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed are a pose measurement method and device, and a storage medium. The pose measurement method comprises: obtaining photographs of an object taken at different positions and the pose parameters of the photographing device when the photographs were taken (step 101); constructing a virtual three-dimensional scene from the photographs of the object, and matching the virtual three-dimensional scene with the real three-dimensional scene according to the pose parameters of the photographing device (step 102); and calculating a pose parameter of the object according to position information of the object in the real three-dimensional scene (step 103).
PCT/CN2018/090821 2017-06-21 2018-06-12 Pose measurement method and device, and storage medium WO2018233514A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710475557.4A CN109099888A (zh) 2017-06-21 2017-06-21 Pose measurement method, device and storage medium
CN201710475557.4 2017-06-21

Publications (1)

Publication Number Publication Date
WO2018233514A1 true WO2018233514A1 (fr) 2018-12-27

Family

ID=64735483

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/090821 WO2018233514A1 (fr) 2018-06-12 Pose measurement method and device, and storage medium

Country Status (2)

Country Link
CN (1) CN109099888A (fr)
WO (1) WO2018233514A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110296686B (zh) * 2019-05-21 2021-11-09 北京百度网讯科技有限公司 Vision-based positioning method, apparatus and device
CN112815923B (zh) * 2019-11-15 2022-12-30 华为技术有限公司 Visual positioning method and apparatus
CN113781548B (zh) * 2020-06-10 2024-06-14 华为技术有限公司 Pose measurement method for multiple devices, electronic device and system
CN114274147B (zh) * 2022-02-10 2023-09-22 北京航空航天大学杭州创新研究院 Target tracking control method and apparatus, robotic arm control device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102157011A (zh) * 2010-12-10 2011-08-17 北京大学 Method for dynamic texture acquisition and virtual-real fusion using a mobile photographing device
CN103217147A (zh) * 2012-01-19 2013-07-24 株式会社东芝 Measuring device and measuring method
CN103245337A (zh) * 2012-02-14 2013-08-14 联想(北京)有限公司 Method for acquiring the position of a mobile terminal, mobile terminal and position detection system
CN103759716A (zh) * 2014-01-14 2014-04-30 清华大学 Method for measuring the position and attitude of a dynamic target based on monocular vision at the end of a robotic arm
US20150029345A1 (en) * 2012-01-23 2015-01-29 Nec Corporation Camera calibration device, camera calibration method, and camera calibration program
CN106296801A (zh) * 2015-06-12 2017-01-04 联想(北京)有限公司 Method for establishing a three-dimensional image model of an object, and electronic device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015133206A1 (fr) * 2014-03-05 2015-09-11 コニカミノルタ株式会社 Image processing device, method and program
CN106569591A (zh) * 2015-10-26 2017-04-19 苏州梦想人软件科技有限公司 Tracking method and tracking system based on computer vision tracking and sensor tracking
CN105528082B (zh) * 2016-01-08 2018-11-06 北京暴风魔镜科技有限公司 Three-dimensional space and gesture recognition tracking interaction method, apparatus and system
CN106651942B (zh) * 2016-09-29 2019-09-17 苏州中科广视文化科技有限公司 Feature-point-based three-dimensional rotational motion detection and rotation axis positioning method

Also Published As

Publication number Publication date
CN109099888A (zh) 2018-12-28

Similar Documents

Publication Publication Date Title
WO2021212844A1 Point cloud stitching method and apparatus, device and storage medium
EP3309751B1 Image processing device, method and program
US9466143B1 (en) Geoaccurate three-dimensional reconstruction via image-based geometry
CN108592950B Method for calibrating the relative mounting angle of a monocular camera and an inertial measurement unit
WO2018233514A1 Pose measurement method and device, and storage medium
CN106530358A Method for calibrating a PTZ camera using only two scene images
CN110969665B Extrinsic parameter calibration method, apparatus, system and robot
CN112288853B Three-dimensional reconstruction method, three-dimensional reconstruction apparatus and storage medium
JP2013539147A5 (fr)
IL214151A (en) Method and device for restoring 3D character
CN113048980B Pose optimization method, apparatus, electronic device and storage medium
WO2021004416A1 Method and apparatus for establishing a beacon map based on visual beacons
CN113820735A Method for determining position information, position measurement device, terminal and storage medium
JP7114686B2 Augmented reality device and positioning method
CN115423863B Camera pose estimation method, apparatus and computer-readable storage medium
Duran et al. Accuracy comparison of interior orientation parameters from different photogrammetric software and direct linear transformation method
JP6928217B1 Measurement processing device, method and program
Tjahjadi et al. Single image orientation of UAV's imagery using orthogonal projection model
El-Ashmawy A comparison study between collinearity condition, coplanarity condition, and direct linear transformation (DLT) method for camera exterior orientation parameters determination
KR20210009019A System for determining the position and attitude of a camera using vector dot products and three-dimensional coordinate transformation
Bakuła et al. Capabilities of a smartphone for georeferenced 3dmodel creation: An evaluation
CN108764161B Remote sensing image processing method and apparatus based on a polar coordinate system for resolving ill-conditioned singularities caused by sparse arrays
CN113223163A Point cloud map construction method and apparatus, device and storage medium
Zhuo et al. A target localization method for UAV image sequences based on DEM matching
CN113034615A Device calibration method for multi-source data fusion and related apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18821112

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18821112

Country of ref document: EP

Kind code of ref document: A1