CN112051546B - Device for realizing relative positioning and corresponding relative positioning method - Google Patents

Device for realizing relative positioning and corresponding relative positioning method

Info

Publication number
CN112051546B
CN112051546B CN201910485778.9A CN201910485778A
Authority
CN
China
Prior art keywords
positioning
information
imaging
marks
mark
Prior art date
Legal status
Active
Application number
CN201910485778.9A
Other languages
Chinese (zh)
Other versions
CN112051546A (en)
Inventor
方俊
牛旭恒
李江亮
Current Assignee
Beijing Whyhow Information Technology Co Ltd
Original Assignee
Beijing Whyhow Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Whyhow Information Technology Co Ltd filed Critical Beijing Whyhow Information Technology Co Ltd
Priority to CN201910485778.9A priority Critical patent/CN112051546B/en
Priority to EP20818983.7A priority patent/EP3982084A1/en
Priority to JP2021571442A priority patent/JP2022536617A/en
Priority to PCT/CN2020/093689 priority patent/WO2020244480A1/en
Priority to TW109119021A priority patent/TWI812865B/en
Publication of CN112051546A publication Critical patent/CN112051546A/en
Priority to US17/536,752 priority patent/US20220083071A1/en
Application granted granted Critical
Publication of CN112051546B publication Critical patent/CN112051546B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves
    • G01S5/163 Determination of attitude

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An apparatus for achieving relative positioning and a corresponding relative positioning method. The positioning apparatus comprises one or more first positioning marks capable of determining a plane and one or more second positioning marks, at least a portion of which lies outside the plane in which the first positioning marks lie. The method comprises: obtaining an image containing the positioning apparatus captured by an imaging device, together with physical position information of the first positioning marks and the second positioning marks; determining imaging position information of the first positioning marks and the second positioning marks on the image; and determining position information and/or attitude information of the imaging device relative to the positioning apparatus at the time the image was captured, based on the physical position information and imaging position information of the first positioning marks and the second positioning marks and intrinsic parameter information of the camera of the imaging device.

Description

Device for realizing relative positioning and corresponding relative positioning method
Technical Field
The present invention relates to the field of positioning technologies, and in particular, to a device for implementing relative positioning and a corresponding relative positioning method.
Background
The statements in this section merely provide background information related to the present disclosure to facilitate an understanding of the present disclosure. The matters described in this section do not constitute prior art to the technical solutions of the present application unless explicitly stated.
In many application scenarios, it is necessary to determine the location of a device or machine. For example, in navigating a vehicle, it is necessary to determine the position of the vehicle. In addition, in some factories, robots or autonomous vehicles are used for delivery or transfer of goods, and during delivery or transfer of goods, the positions of the robots or autonomous vehicles need to be determined.
Currently, the main positioning methods are based on wireless signals, such as GPS positioning, Wi-Fi positioning, and Bluetooth positioning, but these methods are susceptible to signal interference, and it is difficult for them to deliver accurate positioning results.
A positioning method based on visual markers can overcome these drawbacks to some extent, but it is limited by imaging precision, and some applications are currently restricted to short-range positioning. For example, visual markers known as AR markers may be used in some augmented reality applications to determine the position and pose of a camera at close range. Fig. 1 shows an exemplary AR marker, which is similar to a two-dimensional code. Visual markers may also be used in some robotic applications to determine the position and pose of a camera mounted on a nearby robot. However, existing visual markers are generally planar printed objects; when the visual marker is far away, the number of pixels it occupies in the image decreases, so the positioning result based on the visual marker becomes unstable or exhibits a large error, and the position and pose of the camera cannot be determined accurately.
Therefore, the invention provides a device capable of realizing higher relative positioning precision and a corresponding relative positioning method.
Disclosure of Invention
One aspect of the invention relates to an apparatus for achieving relative positioning, comprising: one or more first positioning markers capable of determining a plane; and one or more second locating marks, wherein at least a portion of the second locating marks are located outside of a plane in which the first locating marks are located.
Optionally, the device comprises at least three first positioning marks which are positioned on the same plane and are not collinear, and the second positioning mark is positioned outside the plane of the first positioning mark.
Optionally, the distance between the plane of the first positioning mark and the second positioning mark is at least 0.1 cm.
Optionally, the distance between the plane where the first positioning mark is located and the second positioning mark is greater than 1/10 of the shortest distance between the first positioning marks.
Optionally, the device includes four first positioning marks, and any three of the four first positioning marks are not collinear.
Optionally, the four first positioning marks are arranged in a rectangular shape, and the second positioning mark is located at a middle position of the rectangle in the horizontal direction.
Optionally, one or more of the first and second locating marks are configured to serve as a data light source capable of communicating information.
Optionally, the apparatus further includes: one or more data light sources for conveying information.
One aspect of the present invention relates to a relative positioning method implemented using a positioning device comprising one or more first positioning marks capable of determining a plane and one or more second positioning marks, at least a portion of which lie outside the plane in which the first positioning marks lie, the method comprising: obtaining an image containing the positioning device captured by an imaging device, together with physical position information of the first positioning marks and the second positioning marks; determining imaging position information of the first positioning marks and the second positioning marks on the image; and determining position information and/or attitude information of the imaging device relative to the positioning device at the time the image was captured, based on the physical position information and imaging position information of the first positioning marks and the second positioning marks and intrinsic parameter information of the camera of the imaging device.
Optionally, the physical location information of the first positioning mark and the second positioning mark includes relative physical location information between the positioning marks or absolute physical location information of the positioning marks.
Optionally, wherein the physical location information of the first and second positioning markers is obtained at least in part by communication between the imaging device and the positioning apparatus.
Optionally, the positioning device further includes one or more data light sources for transmitting information, or one or more positioning marks of the first positioning mark and the second positioning mark are configured to be used as data light sources capable of transmitting information, and the physical position information of the first positioning mark and the second positioning mark is obtained by the following method: identifying, by the imaging device, information conveyed by the data light source; and obtaining physical position information of the first positioning mark and the second positioning mark through the information.
Optionally, the information transmitted by the data light source includes identification information of the positioning device, which can be used to obtain absolute physical location information of the positioning device, the first positioning marker or the second positioning marker.
Optionally, determining the position information and/or attitude information of the imaging device relative to the positioning device at the time the image was captured includes: determining the position information and/or attitude information of the imaging device relative to the positioning device at the time the image was captured using a PnP method; and/or determining a perspective deformation associated with the first positioning marks and the second positioning marks, and determining the position information and/or attitude information of the imaging device relative to the positioning device at the time the image was captured from the perspective deformation.
Another aspect of the invention relates to a storage medium in which a computer program is stored which, when being executed by a processor, can be used to implement the above-mentioned method.
Yet another aspect of the invention relates to an electronic device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, is operable to carry out the method described above.
Drawings
Embodiments of the invention are further described below with reference to the accompanying drawings, in which:
FIG. 1 illustrates an exemplary AR marker;
FIG. 2 illustrates a front view of an optical label according to one embodiment of the present invention;
FIG. 3 illustrates a side view of the optical label shown in FIG. 2;
FIG. 4 illustrates a perspective view of the optical label shown in FIG. 2;
FIG. 5 illustrates an exemplary positioning device on which positioning markers do not have depth differences;
fig. 6 shows an image obtained when the positioning device shown in fig. 5 is photographed from the left side using an imaging device;
fig. 7 shows an image obtained when the positioning device shown in fig. 5 is photographed from the right side using an imaging device;
FIG. 8 illustrates a perspective view of a positioning device with positioning markers having depth differences, according to one embodiment of the invention;
fig. 9 shows an image obtained when the positioning device shown in fig. 8 is photographed from the left side using an imaging device;
fig. 10 shows an image obtained when the positioning device shown in fig. 8 is photographed from the right side using an imaging device; and
fig. 11 illustrates a relative positioning method according to an embodiment of the invention.
Detailed Description
For the purpose of making the technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail by way of specific embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
For ease of explanation of the present invention, an optical communication device is described below as an example, but it will be understood by those skilled in the art from the description herein that the aspects of the present invention are not limited to optical communication devices, but may be applied to any device having the relative positioning features of the present invention.
An optical communication device is also referred to as an optical label, and the two terms are used interchangeably herein. An optical label conveys information by emitting different light; it has the advantages of a long recognition distance and loose requirements on visible-light conditions, and the information it conveys can change over time, providing a large information capacity and flexible configuration capability. Compared with a traditional two-dimensional code, an optical label offers a longer recognition distance and a stronger information interaction capability, and can therefore provide great convenience to users.
An optical label typically includes a controller and at least one light source, the controller driving the light source in different driving modes to convey different information outwards, and selecting a driving mode for each light source according to the information to be conveyed. For example, in different driving modes, the controller may control the light-emitting manner of the light source with different driving signals, so that when the optical label is photographed with a device having an imaging function, the imaging of the light source takes on different appearances (e.g., different colors, patterns, or brightness). By analyzing the imaging of the light sources in the optical label, the driving mode of each light source at that moment can be determined, and thus the information conveyed by the optical label at that moment can be recovered.
In order to provide corresponding services to users based on optical labels, each optical label may be assigned identification information (ID) by the manufacturer, manager, or user of the optical label to uniquely identify it. Typically, the controller in the optical label drives the light source to convey this identification information outwards, and a user device can capture images of the optical label to obtain the identification information it conveys, so that a corresponding service can be accessed based on that identification information, e.g., accessing a web page associated with the identification information, or obtaining other information associated with it (e.g., the position information of the optical label corresponding to the identification information). The device may acquire multiple images containing the optical label through continuous image capture with its camera, and identify the information conveyed by the optical label by analyzing the imaging of the optical label (or of the individual light sources in the optical label) in each image.
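As a rough illustration of the decoding step just described, the following sketch assumes a hypothetical on/off brightness modulation in which each light source conveys one bit per captured frame; the actual driving modes (colors, patterns, brightness levels) and frame synchronization of a real optical label are not specified here. The function decode_light_source, its arguments, and the thresholding strategy are illustrative assumptions, not part of the invention.

```python
import numpy as np

def decode_light_source(frames, roi):
    """Recover a bit sequence from one light source of an optical label.

    frames: list of grayscale images (NumPy arrays) captured in succession;
    roi: (x, y, w, h) rectangle covering the imaging of that light source.
    Assumes a simple on/off brightness encoding, one bit per frame.
    """
    x, y, w, h = roi
    levels = [float(np.mean(f[y:y + h, x:x + w])) for f in frames]
    threshold = (max(levels) + min(levels)) / 2.0  # midpoint between "on" and "off"
    return [1 if level > threshold else 0 for level in levels]
```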
The identification information (ID) of each optical label and other information, such as service information related to the optical label and description information or attributes of the optical label, e.g., position information, physical size information, physical shape information, and pose or orientation information, may be stored on a server. The optical label may also have uniform or default physical size information, physical shape information, and the like. A device may use the identification information of a recognized optical label to query the server for additional information related to that optical label. The position information of the optical label may refer to its actual location in the physical world, which may be indicated by geographic coordinate information.
Fig. 2 shows a front view of an optical label according to an embodiment of the invention. It comprises three data light sources 205 for conveying information outwards, four first positioning marks 201 located on the two sides of the three data light sources 205, and one second positioning mark 202 located above the three data light sources 205. The four first positioning marks 201 lie in the same plane and are not collinear, while the second positioning mark 202 lies outside the plane of the four first positioning marks 201, i.e., there is a depth difference between the first positioning marks 201 and the second positioning mark 202.
In the embodiment shown in fig. 2, the four first positioning marks 201 are arranged in a rectangle and the second positioning mark 202 is located at the horizontal midpoint of the rectangle, but those skilled in the art will understand that this is not a limitation, and the first positioning marks 201 and the second positioning mark 202 may be arranged otherwise. The embodiment of fig. 2 also shows some features intended to make the optical label more aesthetically pleasing (e.g., the two ear-like portions at the top of the optical label), which are merely exemplary and do not limit the invention.
The data light source 205 may be any light source that can be used to convey information outwards; for example, the data light source 205 may be a single LED lamp, an array of LED lamps, a display screen or a portion thereof, or even an illuminated area (e.g., an illuminated area on a wall). The data light source 205 may have any surface shape, such as circular, square, rectangular, or bar-shaped. Various common optical components may be included in or attached to the data light source 205, such as a light guide plate, a soft-light panel, or a diffuser. In some embodiments, the optical label may have one or more data light sources 205, and is not limited to three.
The first positioning mark 201 and/or the second positioning mark 202 may be devices that do not actively emit light; they may also be actively light-emitting devices, such as lamps, so that they can be used in scenes with no or weak ambient light. The first positioning mark 201 and/or the second positioning mark 202 may have any suitable surface shape, e.g., circular, square, rectangular, triangular, hexagonal, or oval. The first positioning mark 201 and/or the second positioning mark 202 may also be three-dimensional marks, e.g., spheres, cylinders, or cubes.
In some embodiments, the number of first positioning marks 201 may be three, or may be more than four. Three non-collinear first positioning marks 201 are sufficient to determine a plane, and the second positioning mark 202 is located outside the plane determined by the three first positioning marks 201.
In some embodiments, the second positioning mark 202 may be located elsewhere on the optical label, such as below or in the middle of the first positioning mark 201, and the optical label may have more than one second positioning mark 202 therein. In one embodiment, the optical tag includes two second positioning marks 202, the two second positioning marks 202 being located above and below the first positioning mark 201, respectively.
In some embodiments, the data light source 205 and the first positioning marker 201 may be located in the same plane or in different planes. In some embodiments, the data light source 205 and the second positioning marker 202 may be located in the same plane or in different planes.
Fig. 3 shows a side view of the optical label shown in fig. 2, and fig. 4 shows a perspective view of the optical label shown in fig. 2. As is clear from fig. 3 and 4, the second positioning mark 202 is located out of the plane of the four first positioning marks 201, i.e. there is a depth difference between the first positioning mark 201 and the second positioning mark 202.
The positioning device comprising the first positioning marks and the second positioning mark has been described above using an optical label as an example, but those skilled in the art will appreciate that the solution of the present invention is not limited to optical labels and may be applied to any device satisfying the following conditions: the device comprises at least three first positioning marks that lie in the same plane and are not collinear, and one or more second positioning marks located outside the plane of the first positioning marks. Hereinafter, such a device may be referred to as a "positioning device". In some embodiments, the positioning device may include only the first positioning marks and the second positioning marks used for relative positioning, without any data light source for conveying information outwards; alternatively, the first positioning marks and/or the second positioning marks in the positioning device may serve as positioning marks while also being configured as data light sources capable of conveying information outwards. In some embodiments, it is advantageous for the positioning device to be usable not only for relative positioning but also for conveying information outwards, which allows the imaging device to learn identification information of the positioning device; this identification information can be used to obtain the absolute position of the positioning device, so that the absolute position of the imaging device can be determined from the absolute position of the positioning device and the relative position of the imaging device with respect to the positioning device.
Depending on the application scenario, the distance or depth difference between the plane in which the first positioning marks 201 lie and the second positioning mark 202 may take different values. In general, if the required positioning range is relatively large (e.g., relative positioning must be achievable over a relatively large range around the positioning device), a relatively large depth difference is needed; a relatively large depth difference is also needed if the resolution of the imaging device is relatively low. Conversely, if the required positioning range is relatively small and/or the resolution of the imaging device is relatively high, a relatively small depth difference may suffice. Theoretically, for the change in the imaging position of the second positioning mark relative to the first positioning marks to be observable in the presence of a depth difference (i.e., for the change to be distinguishable in pixels), the image of a length equal to the depth difference generally needs to span at least two pixels. Assuming the resolution of the imaging device is R and the farthest distance required for positioning is D, the smallest depth difference is the actual object length at distance D whose image, at resolution R, covers 2 pixels. More specifically, assume the resolution of the imaging device is R=Rx×Ry, the size of the image sensor is L×W, the x-direction focal length of the imaging device is fx, the y-direction focal length is fy, the farthest distance required for positioning is D, the size of the object is x×y, and the imaged size of the object is u×v (the sizes x, y, u, v above are the projected sizes of the object on the x and y axes of the camera coordinate system). The following formulas can be obtained:
fx/D=u*(L/Rx)/x
fy/D=v*(W/Ry)/y
Let u=v=2 pixels,
x_min=2*L*D/(fx*Rx)
y_min=2*W*D/(fy*Ry)
The smaller of x_min and y_min may be taken as the minimum value depth_min of the depth difference, i.e., depth_min = min(x_min, y_min).
For example, with an imaging device having a resolution of 4K (2160×3840), when the recognition distance is 10 meters, the image of a 2/3 cm line segment is about 2 pixels, so the minimum depth difference needs to be 2/3 cm. If the recognition distance becomes 50 meters (e.g., for some outdoor positioning scenarios), the minimum depth difference needs to be 10/3 cm; if the recognition distance becomes 1.5 meters (e.g., for some indoor positioning scenarios), the minimum depth difference needs to be 0.1 cm.
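The formulas above translate directly into a small helper. The sketch below is only an illustration of the calculation; the sensor size, focal length, and pixel counts used in the example call are assumed values, not parameters given in this description.

```python
def min_depth_difference(Rx, Ry, L, W, fx, fy, D, u=2, v=2):
    """Smallest depth difference whose image still spans u x v pixels at distance D.

    Implements x_min = u*L*D/(fx*Rx) and y_min = v*W*D/(fy*Ry) from the text,
    with depth_min = min(x_min, y_min).
    Rx, Ry: resolution in pixels; L, W: sensor width/height; fx, fy: physical
    focal lengths (same length unit as L, W, and D).
    """
    x_min = u * L * D / (fx * Rx)
    y_min = v * W * D / (fy * Ry)
    return min(x_min, y_min)

# Example with assumed values: a 3840 x 2160 sensor measuring 6.4 mm x 3.6 mm,
# a 4.3 mm focal length, and positioning required up to D = 10 m (10000 mm).
depth_min = min_depth_difference(3840, 2160, 6.4, 3.6, 4.3, 4.3, 10000)
# roughly 7.8 mm with these assumed parameters
```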
In some embodiments, the plane in which the first positioning marks lie is preferably at a distance of at least 0.1 cm, at least 0.2 cm, at least 0.5 cm, at least 0.8 cm, at least 1 cm, at least 1.5 cm, or the like, from the second positioning mark. In some embodiments, the distance or depth difference between the plane in which the first positioning marks lie and the second positioning mark may also be determined from the distances between the first positioning marks; preferably, this distance or depth difference is greater than 1/10, 1/8, 1/5, 1/4, or 1/3, and so on, of the shortest distance between the first positioning marks.
A relative positioning method implemented using the positioning device of the present invention for determining position information and/or attitude information of an imaging device relative to the positioning device is described below. The position information and the attitude information may be collectively referred to as "pose information". It should be noted that, depending on the needs of the application scenario, it may not be necessary to obtain both the position information and the attitude information of the imaging device; only one of them may be obtained, for example, only the position information of the imaging device.
In one embodiment, to obtain the position and/or attitude of the imaging device relative to the positioning device, a coordinate system may be established based on the positioning device; this coordinate system may be referred to as the world coordinate system or the positioning-device coordinate system. The first positioning marks and the second positioning mark on the positioning device constitute spatial points in this world coordinate system and have corresponding coordinates in it. After an image containing the positioning device has been captured with the imaging device, the image points corresponding to these spatial points can be found in the image, for example based on physical or geometric characteristics of the positioning device, and the imaging position of each image point in the image can be determined. From the coordinates of each spatial point in the world coordinate system and the imaging position of each corresponding image point in the image, combined with the intrinsic parameter information of the camera, the pose information (R, t) of the imaging device in the world coordinate system at the time the image was captured can be calculated, where R is a rotation matrix representing the attitude information of the camera in the world coordinate system and t is a displacement vector representing the position information of the camera in the world coordinate system. Methods for calculating R and t are known in the art; for example, R and t may be calculated using the 3D-2D PnP (Perspective-n-Point) method, which is not described in detail here so as not to obscure the present invention.
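As an illustration of this calculation, the sketch below uses OpenCV's solvePnP function with five markers: four coplanar marks and one mark offset out of their plane, as in the embodiments described below. The marker coordinates, the pixel coordinates, and the camera intrinsic matrix are all hypothetical values chosen only to make the example self-contained; they are not taken from this description.

```python
import numpy as np
import cv2

# Positioning-device (world) coordinates of five markers, in meters (illustrative):
# four coplanar marks forming a 0.20 m x 0.10 m rectangle in the z = 0 plane,
# plus one mark at the centre protruding 0.02 m out of that plane.
object_points = np.array([
    [0.00, 0.00, 0.00],
    [0.20, 0.00, 0.00],
    [0.20, 0.10, 0.00],
    [0.00, 0.10, 0.00],
    [0.10, 0.05, 0.02],   # the mark with the depth difference
], dtype=np.float64)

# Pixel coordinates of the same markers found in the captured image
# (hypothetical values; in practice obtained by image analysis).
image_points = np.array([
    [612.4, 388.1], [961.7, 392.6], [958.3, 571.9],
    [615.0, 566.2], [786.9, 470.4],
], dtype=np.float64)

# Assumed camera intrinsic matrix and distortion coefficients.
K = np.array([[3000.0, 0.0, 960.0],
              [0.0, 3000.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)            # 3x3 rotation matrix (attitude)
camera_pos = (-R.T @ tvec).ravel()    # camera position in world coordinates
```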
When calculating the pose information (R, t) of the imaging device, the position of each image point corresponding to each positioning mark on the positioning device must be determined accurately, which can be challenging if all of these positioning marks lie in the same plane. Fig. 5 illustrates an exemplary positioning device whose positioning marks have no depth difference. Specifically, the positioning device includes five positioning marks P1, P2, P3, P4, P5, indicated by solid black dots, which all lie in the same plane. Of these five positioning marks, the four positioning marks P1, P2, P3, P4 form a rectangle, and the positioning mark P5 is located at the center of the rectangle. Fig. 6 shows an image obtained when the positioning device shown in fig. 5 is photographed from the left side using an imaging device, and fig. 7 shows an image obtained when it is photographed from the right side. It can be seen that the imaging of the positioning device exhibits a corresponding perspective distortion: according to the "near objects appear larger, far objects appear smaller" principle of visual imaging, when the positioning device is photographed from the left side, the imaged distance between the positioning marks P1 and P2 is greater than that between the positioning marks P3 and P4; conversely, when the positioning device is photographed from the right side, the imaged distance between P1 and P2 is smaller than that between P3 and P4. From the coordinates of the positioning marks P1, P2, P3, P4, P5 in the world coordinate system and the imaging positions of these positioning marks, the pose information of the imaging device in the world coordinate system can be calculated, for example, using a PnP (Perspective-n-Point) method. In many cases, however, there are errors in determining the imaging positions of the positioning marks, which lead to errors in the calculated pose information of the imaging device, and these errors become more pronounced when the imaging device is relatively far from the positioning device. For example, when the positioning device is far from the imaging device, its image is small; the imaged distance between P1 and P2 or between P3 and P4 then occupies only a few pixels (e.g., fewer than a few tens of pixels), and the pixel difference between these two distances is even smaller (e.g., only 1-3 pixels). Because of image-processing errors (e.g., a pixel error of 1-2 pixels), it is then difficult to determine the difference between the two distances accurately, and accordingly difficult to determine the pose information of the imaging device accurately.
By adopting the positioning device of the present invention, errors in the determined pose information of the imaging device can be greatly reduced or eliminated. Fig. 8 shows a perspective view of a positioning device according to an embodiment of the invention, which is similar to the positioning device shown in fig. 5, except that the positioning mark P5 at the central position is moved, or protrudes, out of the plane of the other four positioning marks P1, P2, P3, P4. In this way, when the positioning device shown in fig. 8 is placed in the manner shown in fig. 5 and photographed from the left or the right using an imaging device, different imaging results are obtained. Fig. 9 shows an image obtained when the positioning device is photographed from the left side, and fig. 10 shows an image obtained when it is photographed from the right side; the dotted circles in figs. 9 and 10 indicate the imaging position of the positioning mark P5 before it was moved, i.e., its imaging position in the positioning device shown in fig. 5. As can be seen from figs. 9 and 10, because the positioning mark P5 protrudes out of the plane in which the other four positioning marks P1, P2, P3, P4 lie, when the imaging device photographs the positioning device from different positions, the imaging position of P5 changes significantly relative to the imaging positions of P1, P2, P3, P4, and analyzing this change helps to reduce or eliminate errors in the calculated pose information of the imaging device. Figs. 9 and 10 show images obtained when the positioning device is photographed from the left and right sides, respectively; those skilled in the art will appreciate that similar effects are observed when photographing from other directions. In addition, in some embodiments, any one of the positioning marks P1, P2, P3, P4 in the positioning device may be omitted.
Fig. 11 illustrates a relative positioning method according to an embodiment of the present invention that determines position information and/or pose information of an imaging apparatus with respect to a positioning device by analyzing an image containing the positioning device taken by the imaging apparatus. The positioning means may comprise at least three first positioning marks which lie in the same plane and are not collinear, and one or more second positioning marks which lie outside the plane in which the first positioning marks lie. The method may be performed by an imaging device, but may also be performed by other devices or apparatuses (e.g., a server). For example, the imaging device may send its captured image containing the positioning device to the server, after which the server may analyze the image to determine the position of the imaging device relative to the positioning device, so that software deployment or computing power deployment at the imaging device may be simplified. The method shown in fig. 11 includes the steps of:
step 1101: physical location information of a first location indicator and a second location indicator on the location device is obtained.
The physical position information of the first positioning marks and the second positioning mark may be obtained in various ways. For example, in some application scenarios (e.g., in an automated factory), the positioning device has a fixed specification or model, so that the imaging device, a server, etc., may know the physical position information of the first positioning marks and the second positioning mark on the positioning device in advance. In some application scenarios, the positioning device has a communication capability, and the imaging device may communicate with the positioning device (e.g., via wireless signals or light) to obtain the physical position information of its first positioning marks and second positioning mark. The imaging device may obtain this physical position information directly, or may obtain other information about the positioning device (e.g., identification information, specification information, or model information) and determine the physical position information of the first positioning marks and the second positioning mark by query or analysis using that other information. For example, for the positioning device in the form of an optical label shown in fig. 2 of the present application, the imaging device may recognize the identification information conveyed by the optical label and use it to obtain, by query, the physical position information of each positioning mark on the optical label. The imaging device may also send the recognized information conveyed by the optical label to a server, so that the server can use it to obtain, by query, the physical position information of each positioning mark on the optical label. The imaging device may send any information it obtains by communicating with the positioning device to another device or apparatus, such as a server.
The physical position information of the first positioning marks and the second positioning mark may be relative physical position information or absolute physical position information. In some embodiments, it may be the relative positional relationship (e.g., relative distance and relative direction) between the individual positioning marks. In some embodiments, it may be the coordinate information of each positioning mark in a coordinate system established based on the positioning device; for example, one positioning mark may be taken as the origin of the coordinate system, and the position of each positioning mark may then be represented by its coordinates in that system. In some embodiments, the physical position information of the first positioning marks and the second positioning mark may be the absolute physical position of each positioning mark in the real world. Those skilled in the art will appreciate that absolute physical position information of the positioning marks is not necessary for determining the relative pose information between the imaging device and the positioning device; however, by using the absolute physical position information of the positioning marks, the absolute pose information of the imaging device in the real world may be further determined based on the relative pose information between the imaging device and the positioning device.
Step 1102: an image containing a positioning device captured by an imaging apparatus is obtained.
The imaging device referred to herein may be a device carried by a user (e.g., a mobile phone, a tablet, smart glasses, a smart helmet, a smart watch, etc.), but it will be appreciated that it may also be a machine capable of autonomous movement, e.g., an unmanned aerial vehicle, a driverless car, a robot, etc., on which a camera is mounted.
Step 1103: and obtaining imaging position information of the first positioning mark and the second positioning mark on the image by analyzing the image.
After the image taken by the imaging device containing the positioning means is obtained, the imaging positions of the first positioning mark and the second positioning mark of the positioning means on the image, which can be represented by corresponding pixel coordinates, for example, can be determined by analyzing the image.
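The description does not prescribe how the imaging positions are extracted; one simple possibility, sketched below under the assumption that the positioning marks image as bright blobs against a darker background, is thresholding followed by contour centroids (OpenCV 4 API). A real implementation would also need to associate each detected blob with a specific mark.

```python
import cv2

def find_marker_centers(image_bgr, min_area=20):
    """Locate candidate positioning-mark centres in an image.

    A minimal sketch: assumes the marks image as bright blobs against a darker
    background; real systems would add shape/size filtering and must still
    match each blob to a specific mark. Returns a list of (x, y) pixel coords.
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] >= min_area:  # m00 is the blob area for a binary contour
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers
```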
Step 1104: and determining the position information and/or the posture information of the imaging device relative to the positioning device when the image is shot according to the physical position information and the imaging position information of the first positioning mark and the second positioning mark and the internal reference information of the imaging device.
The camera of the imaging device has corresponding intrinsic parameter information; the intrinsic parameters of a camera are parameters related to the camera's own characteristics, such as its focal length and pixel size. The imaging device can obtain the intrinsic parameter information of the camera it used when capturing the image. Other devices or apparatuses (e.g., a server) may also receive this intrinsic parameter information from the imaging device; for example, the imaging device may upload the intrinsic parameter information of its camera together with the image it uploads to the server. In some embodiments, the imaging device may instead upload the model information of its camera to the server, and the server may obtain the corresponding intrinsic parameter information from that model information.
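For reference, intrinsic parameters such as the focal length and pixel size mentioned here are commonly arranged into a 3x3 pinhole-camera matrix. The helper below is a generic sketch; the lens and sensor figures in the example call are assumptions, and the principal point is simply placed at the image centre.

```python
import numpy as np

def intrinsic_matrix(f_mm, pixel_size_mm, width_px, height_px):
    """Build a pinhole-camera intrinsic matrix from physical parameters.

    f_mm: physical focal length; pixel_size_mm: pixel pitch; the principal
    point is assumed to lie at the image centre (a common simplification).
    """
    f_px = f_mm / pixel_size_mm                  # focal length in pixels
    return np.array([[f_px, 0.0, width_px / 2.0],
                     [0.0, f_px, height_px / 2.0],
                     [0.0, 0.0, 1.0]])

# e.g. a hypothetical 4.3 mm lens with 1.6 um pixels on a 3840 x 2160 sensor
K = intrinsic_matrix(4.3, 0.0016, 3840, 2160)
```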
After the physical position information of the first positioning marks and the second positioning mark, their imaging position information, and the intrinsic parameter information of the camera of the imaging device have been obtained, the position information and/or attitude information of the imaging device relative to the positioning device may be determined using various methods known in the art, for example, the 3D-2D PnP (Perspective-n-Point) method, also referred to as solvePnP. Representative methods in current use include the P3P method, the iterative method, the EPnP method, the DLT method, and so on.
In some embodiments, after the physical position information of the first positioning marks and the second positioning mark, their imaging position information, and the intrinsic parameter information of the camera of the imaging device have been obtained, the position information and/or attitude information of the imaging device relative to the positioning device at the time the image was captured may also be determined by analyzing the perspective deformation of these positioning marks or of the pattern they form. For example, the imaging device may determine its distance information and direction information relative to the positioning device in various ways. In one embodiment, the imaging device may determine its relative distance from the positioning device by comparing the actual size of the pattern of positioning marks with the imaged size of that pattern (the larger the image, the closer the distance; the smaller the image, the farther the distance). The imaging device may also determine its direction relative to the positioning device by comparing the actual shape of the pattern of positioning marks with its imaged shape. For example, for the image shown in fig. 9, it can be determined that the imaging device photographed the positioning device from the left side, and for the image shown in fig. 10, that it photographed the positioning device from the right side. The imaging device may also determine its attitude information relative to the positioning device from the imaging of the positioning device. For example, when the imaging position or imaging area of the positioning device lies at the center of the imaging device's field of view, the imaging device can be considered to be currently facing the positioning device. The orientation of the image of the positioning device may further be taken into account when determining the attitude of the imaging device. As the attitude of the imaging device changes, the imaging position and/or imaging orientation of the positioning device on the imaging device change accordingly, so attitude information of the imaging device relative to the positioning device can be obtained from the imaging of the positioning device on the imaging device.
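The "larger image means closer" relation mentioned above follows from the pinhole model; the sketch below shows that distance estimate with assumed values for the pattern width, its imaged width, and the focal length in pixels. It deliberately ignores foreshortening when the device is viewed obliquely, which the full method addresses through the depth difference of the second positioning mark.

```python
def distance_from_apparent_size(real_width_m, imaged_width_px, fx_px):
    """Estimate camera-to-device distance from the imaged size of the
    positioning-mark pattern, using the pinhole relation
        imaged_width_px = fx_px * real_width_m / distance.
    A rough estimate that ignores perspective foreshortening for oblique views.
    """
    return fx_px * real_width_m / imaged_width_px

# e.g. a 0.20 m wide marker pattern imaged 350 px wide with fx = 3000 px
d = distance_from_apparent_size(0.20, 350.0, 3000.0)   # about 1.7 m
```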
In the above embodiments, the positioning marks are mainly described in the form of dots, but it will be understood by those skilled in the art that in some embodiments, the first positioning mark and/or the second positioning mark may have other shapes. For example, in one embodiment, the two first positioning marks 201 shown in fig. 2 (e.g., the two first positioning marks 201 on the left or the two first positioning marks 201 on the right) may be replaced by one stripe-shaped positioning mark. In one embodiment, the first locator mark may include a bar locator mark and a dot locator mark that is non-collinear with the bar locator mark to collectively define a plane. In one embodiment, the first positioning mark may comprise two strip-shaped positioning marks lying in the same plane, for example, for the positioning device shown in fig. 8, the positioning marks P1 and P2 may be replaced with one strip-shaped positioning mark connecting P1 and P2, and the positioning marks P3 and P4 may be replaced with one strip-shaped positioning mark connecting P3 and P4. In one embodiment, a planar polygonal frame (e.g., a triangular frame or a rectangular frame) may be used as the first positioning mark, which itself defines a plane, for example, for the positioning device shown in fig. 8, a rectangular frame connecting the positioning marks P1, P2, P3, P4 may be used instead of the positioning marks P1, P2, P3, P4. In one embodiment, a planar positioning mark (e.g., a triangular positioning mark or a rectangular positioning mark) may be used as the first positioning mark, which itself defines a plane, for example, for the positioning device shown in fig. 8, the rectangular plates defined by the positioning marks P1, P2, P3, P4 may be used instead of the positioning marks P1, P2, P3, P4. Similarly, the second positioning mark may have other shapes, such as a bar shape, a polygonal frame shape, a planar shape, or the like. In one case, the second positioning mark as a whole may be located out of the plane in which the first positioning mark is located; in another case, the second positioning mark may intersect the plane defined by the first positioning mark, as long as a portion of the second positioning mark is located outside the plane in which the first positioning mark is located.
Position calculation results obtained from a number of experiments using a positioning device according to one embodiment of the present invention are described below.
1. X-direction (left-right) experiment: the relative positioning results were tested at a position 1 meter in front of the positioning device, with the shooting position of the imaging device moved 0.5 m, 1 m, and 1.5 m to the left and to the right, respectively. The positioning device serves as the origin of the spatial coordinate system; the X coordinate of the imaging device is negative when it shoots from the left side and positive when it shoots from the right side. The data are in millimeters (mm); X, Y, Z denote the calculated coordinates of the imaging device, and X0, Y0, Z0 denote its actual coordinates when photographing the positioning device.
2. Y-direction (up-down) verification experiment: the relative positioning results were tested at a position 2 meters in front of the positioning device, with the shooting position of the imaging device moved 0.5 m, 1 m, and 1.5 m upward and downward, respectively. The Y coordinate of the imaging device is negative when it shoots from above and positive when it shoots from below. The data are in millimeters (mm); X, Y, Z denote the calculated coordinates of the imaging device, and X0, Y0, Z0 denote its actual coordinates when photographing the positioning device.
As can be seen from the above test results, with the positioning device of the present invention, when the imaging device photographs from a position several meters away from the positioning device, the error between the X, Y, Z coordinate values calculated by relative positioning and the actual coordinate values is typically only a few tens of millimeters, providing relatively high relative positioning accuracy.
References herein to "various embodiments," "some embodiments," "one embodiment," "an embodiment," and the like mean that a particular feature, structure, or property described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in one embodiment," or "in an embodiment" in various places throughout this document do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or properties may be combined in any suitable manner in one or more embodiments. Thus, a particular feature, structure, or property described in or shown for one embodiment may be combined, in whole or in part, with features, structures, or properties of one or more other embodiments without limitation, provided the combination is not illogical or inoperable. Expressions such as "according to A," "based on A," "through A," or "using A" are meant to be non-exclusive, i.e., "according to A" may cover "according to A only" as well as "according to A and B," unless it is specifically stated, or clear from the context, that the meaning is "according to A only." In this application, some exemplary operational steps are described in a certain order for clarity of explanation, but those skilled in the art will understand that not every one of these steps is essential, and some of them may be omitted or replaced by other steps. The steps need not be performed sequentially in the manner shown; rather, some of the steps may be performed in a different order, or in parallel, as required by actual needs, provided the new order of execution is not illogical or inoperable.
Having thus described several aspects of at least one embodiment of the invention, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the invention. Although the invention has been described in terms of preferred embodiments, it is not limited to the embodiments described herein, and encompasses various changes and modifications that may be made without departing from the scope of the invention.

Claims (15)

1. An apparatus for achieving relative positioning, comprising:
one or more first positioning markers capable of determining a plane; and
one or more second locating marks, wherein at least a portion of the second locating marks are located outside the plane of the first locating marks,
wherein the apparatus is configurable to be photographed by an imaging device to obtain an image containing the apparatus, and wherein, based on physical position information of the first and second positioning marks, imaging position information of the first and second positioning marks on the image, and intrinsic parameter information of the camera of the imaging device, position information and/or attitude information of the imaging device relative to the apparatus at the time of photographing the image can be determined,
Wherein the determining of the position information and/or the pose information of the imaging apparatus relative to the device at the time of capturing the image includes:
determining position information and/or pose information of the imaging apparatus with respect to the device when the image is photographed using a PnP method; and/or
A perspective deformation associated with the first and second positioning markers is determined, and position information and/or pose information of the imaging apparatus relative to the device at the time of capturing the image is determined from the perspective deformation.
2. The apparatus of claim 1, wherein the apparatus includes at least three first locating marks that are in a same plane and are not collinear, and the second locating mark is located outside of a plane in which the first locating marks are located.
3. The apparatus of claim 2, wherein the plane in which the first locating marks lie is located at a distance of at least 0.1 cm from the second locating mark.
4. The apparatus of claim 2, wherein a distance of a plane in which the first locating marks lie from the second locating mark is greater than 1/10 of a shortest distance between the respective first locating marks.
5. The apparatus of claim 1, wherein four first positioning markers are included in the apparatus, any three of the four first positioning markers being non-collinear.
6. The apparatus of claim 5, wherein the four first positioning marks are arranged in a rectangular shape, and the second positioning mark is positioned at a middle position of the rectangle in a horizontal direction.
7. The apparatus of claim 1, wherein one or more of the first and second locating marks are configured to serve as a data light source capable of communicating information.
8. The apparatus of claim 1, further comprising:
one or more data light sources for conveying information.
9. A relative positioning method implemented using a positioning device comprising one or more first positioning markers capable of determining a plane and one or more second positioning markers, at least a portion of the second positioning markers being located outside the plane in which the first positioning markers are located, the method comprising:
obtaining an image containing the positioning device captured by an imaging apparatus and physical position information of the first positioning mark and the second positioning mark;
determining imaging position information of the first positioning mark and the second positioning mark on the image; and
determining position information and/or attitude information of the imaging device relative to the positioning device when the image is taken, based on the physical position information and imaging position information of the first positioning marks and the second positioning marks and intrinsic parameter information of the camera of the imaging device,
wherein the determining of the position information and/or attitude information of the imaging device relative to the positioning device when capturing the image includes:
determining position information and/or attitude information of the imaging device relative to the positioning device at the time of capturing the image using a PnP method; and/or
determining a perspective deformation associated with the first positioning marks and the second positioning marks, and determining position information and/or attitude information of the imaging device relative to the positioning device at the time of capturing the image from the perspective deformation.
10. The method of relative positioning according to claim 9, wherein,
the physical location information of the first and second positioning marks comprises relative physical location information between the positioning marks or absolute physical location information of the positioning marks.
11. The method of relative positioning according to claim 9, wherein,
Physical location information of the first and second locating marks is obtained at least in part by communication between the imaging device and the locating means.
12. The relative positioning method of claim 9, wherein the positioning device further comprises one or more data light sources for conveying information, or one or more of the first positioning marks and the second positioning marks are configured to serve as data light sources capable of conveying information,
and wherein the physical position information of the first positioning marks and the second positioning marks is obtained by:
identifying, by the imaging device, information conveyed by the data light sources; and
obtaining the physical position information of the first positioning marks and the second positioning marks from the identified information.
13. The relative positioning method of claim 12, wherein the information conveyed by the data light sources comprises identification information of the positioning device, which can be used to obtain absolute physical position information of the positioning device, the first positioning marks or the second positioning marks.
14. A storage medium having stored therein a computer program which, when executed by a processor, is operable to carry out the method of any of claims 9-13.
15. An electronic device comprising a processor and a memory, the memory having stored therein a computer program which, when executed by the processor, is operable to carry out the method of any of claims 9-13.
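The following sketches are editorial illustrations only and are not part of the claims. First, a minimal sketch of the mark layout recited in claims 5 and 6, written as coordinates in the positioning device's own frame; the rectangle dimensions and the out-of-plane offset are assumed values.

```python
import numpy as np

# Assumed dimensions in metres; the claims do not fix these values.
W, H = 0.30, 0.10   # rectangle formed by the four first positioning marks
D = 0.03            # out-of-plane offset of the second positioning mark

# Four first positioning marks: coplanar (Z = 0) and arranged as a rectangle,
# so any three of them are non-collinear (claims 5 and 6).
first_marks = np.array([
    [0.0, 0.0, 0.0],
    [W,   0.0, 0.0],
    [W,   H,   0.0],
    [0.0, H,   0.0],
], dtype=np.float64)

# Second positioning mark: at the horizontal middle of the rectangle and
# displaced out of the plane of the first marks (claim 6).
second_mark = np.array([[W / 2.0, H / 2.0, D]], dtype=np.float64)

# Physical positions of all five marks, shape 5 x 3.
object_points = np.vstack([first_marks, second_mark])
```

With these assumed numbers the out-of-plane offset (0.03 m) exceeds 1/10 of the shortest distance between the first positioning marks (0.10 m / 10 = 0.01 m), so the layout is also consistent with the constraint of claim 4.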
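Second, a non-authoritative sketch of the PnP branch of claim 9 using OpenCV's solvePnP; the image coordinates and the camera intrinsic parameters are placeholders, and the claimed method is not tied to this particular library.

```python
import cv2
import numpy as np

# Physical positions of the five positioning marks in the positioning
# device's frame (metres) -- same hypothetical layout as sketched above.
object_points = np.array([
    [0.00, 0.00, 0.00],
    [0.30, 0.00, 0.00],
    [0.30, 0.10, 0.00],
    [0.00, 0.10, 0.00],
    [0.15, 0.05, 0.03],   # second mark, outside the plane of the first four
], dtype=np.float64)

# Imaging positions of the same marks on the captured image, in pixels
# (placeholder values standing in for the detected mark centres).
image_points = np.array([
    [412.0, 310.0],
    [598.0, 305.0],
    [602.0, 368.0],
    [415.0, 372.0],
    [506.0, 330.0],
], dtype=np.float64)

# Intrinsic parameter information of the imaging element (assumed values):
# focal lengths, principal point, and distortion coefficients.
camera_matrix = np.array([[1000.0,    0.0, 640.0],
                          [   0.0, 1000.0, 360.0],
                          [   0.0,    0.0,   1.0]], dtype=np.float64)
dist_coeffs = np.zeros(5)  # assume an undistorted (or already rectified) image

# Perspective-n-Point: rvec/tvec describe the positioning device's pose
# in the camera frame at the time the image was captured.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    # Invert to express the imaging device's position and attitude
    # relative to the positioning device, as recited in claim 9.
    R_cam_in_device = R.T
    t_cam_in_device = -R.T @ tvec
```

The perspective-deformation branch of the same claim could be realized along similar lines, for example by estimating and decomposing a homography over the coplanar first positioning marks, with the out-of-plane second positioning mark helping to disambiguate the recovered pose.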
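Finally, a minimal sketch of the lookup described in claims 12 and 13, assuming a hypothetical registry keyed by the identification information decoded from the data light source; the registry contents and the decoding step are placeholders.

```python
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

# Hypothetical registry (e.g. served by a backend) mapping a positioning
# device's identification information to the absolute physical positions
# of its marks; the values below are placeholders.
MARK_REGISTRY: Dict[str, Dict[str, List[Point3D]]] = {
    "device-0001": {
        "first_marks":  [(10.00, 5.00, 2.00), (10.30, 5.00, 2.00),
                         (10.30, 5.10, 2.00), (10.00, 5.10, 2.00)],
        "second_marks": [(10.15, 5.05, 2.03)],
    },
}

def lookup_mark_positions(device_id: str) -> Dict[str, List[Point3D]]:
    """Return the absolute physical positions of the positioning marks for
    the device identified by `device_id` (the claim-13 style lookup)."""
    return MARK_REGISTRY[device_id]

# `device_id` would be recovered by the imaging device from the information
# conveyed by the data light source; the decoding itself is outside this sketch.
positions = lookup_mark_positions("device-0001")
```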
CN201910485778.9A 2019-06-05 2019-06-05 Device for realizing relative positioning and corresponding relative positioning method Active CN112051546B (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN201910485778.9A CN112051546B (en) 2019-06-05 2019-06-05 Device for realizing relative positioning and corresponding relative positioning method
EP20818983.7A EP3982084A1 (en) 2019-06-05 2020-06-01 Relative positioning device, and corresponding relative positioning method
JP2021571442A JP2022536617A (en) 2019-06-05 2020-06-01 Apparatus for realizing relative positioning and corresponding relative positioning method
PCT/CN2020/093689 WO2020244480A1 (en) 2019-06-05 2020-06-01 Relative positioning device, and corresponding relative positioning method
TW109119021A TWI812865B (en) 2019-06-05 2020-06-05 Device, method, storage medium and electronic apparatus for relative positioning
US17/536,752 US20220083071A1 (en) 2019-06-05 2021-11-29 Relative positioning device, and corresponding relative positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910485778.9A CN112051546B (en) 2019-06-05 2019-06-05 Device for realizing relative positioning and corresponding relative positioning method

Publications (2)

Publication Number Publication Date
CN112051546A (en) 2020-12-08
CN112051546B (en) 2024-03-08

Family

ID=73609222

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910485778.9A Active CN112051546B (en) 2019-06-05 2019-06-05 Device for realizing relative positioning and corresponding relative positioning method

Country Status (1)

Country Link
CN (1) CN112051546B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008084523A1 (en) * 2007-01-10 2008-07-17 Tamura Corporation Position information detection device, position information detection method, and position information detection program
CN102012706A (en) * 2010-10-01 2011-04-13 苏州佳世达电通有限公司 Electronic device capable of automatically positioning and moving and method for automatically returning moving element thereof
KR101365291B1 (en) * 2012-11-30 2014-02-19 충남대학교산학협력단 Method and apparatus for estimating location in the object
KR20150008295A (en) * 2013-07-12 2015-01-22 한국교통연구원 User device locating method and apparatus for the same
CN105612401A (en) * 2013-09-30 2016-05-25 独立行政法人产业技术综合研究所 Marker image processing system
CN207424896U (en) * 2017-11-09 2018-05-29 北京外号信息技术有限公司 An expandable optical label structure
CN108154533A (en) * 2017-12-08 2018-06-12 北京奇艺世纪科技有限公司 Position and attitude determination method, apparatus and electronic device
CN108713179A (en) * 2017-09-18 2018-10-26 深圳市大疆创新科技有限公司 Movable object control method, device and system
CN109341691A (en) * 2018-09-30 2019-02-15 百色学院 Intelligent indoor positioning system and its localization method based on icon-based programming
CN210225419U (en) * 2019-06-05 2020-03-31 北京外号信息技术有限公司 Optical communication device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"A Quick Algorithm to Detect LED Array from the Background in Image Sensor Based Visible Light Communication";Peng Liu et al.;《2018 IEEE International Conference on Imaging Systems and Techniques (IST)》;20181216;第1-5页 *
"基于人工路标的机器人视觉定位研究";韩笑等;《河南机电高等专科学校学报》;20150925;第23卷(第5期);第9-13页 *

Also Published As

Publication number Publication date
CN112051546A (en) 2020-12-08

Similar Documents

Publication Publication Date Title
CN210225419U (en) Optical communication device
US9448758B2 (en) Projecting airplane location specific maintenance history using optical reference points
US7374103B2 (en) Object localization
CN110782492B (en) Pose tracking method and device
CN110572630A (en) Three-dimensional image shooting system, method, device, equipment and storage medium
CN111026107A (en) Method and system for determining the position of a movable object
US20200134839A1 (en) System and method for reverse optical tracking of a moving object
TWI812865B (en) Device, method, storage medium and electronic apparatus for relative positioning
CN113196165A (en) Information projection system, control device, and information projection method
CN112528699B (en) Method and system for obtaining identification information of devices or users thereof in a scene
CN112051546B (en) Device for realizing relative positioning and corresponding relative positioning method
JP7412260B2 (en) Positioning system, positioning device, positioning method and positioning program
CN112074706A (en) Accurate positioning system
CN103968829A (en) Three-dimensional space orientation tracking method and three-dimensional space orientation tracking system based on virtual marker
CN111753565B (en) Method and electronic equipment for presenting information related to optical communication device
CN113008135B (en) Method, apparatus, electronic device and medium for determining a position of a target point in space
JPH10149435A (en) Environment recognition system and mark used therefor
CN109323691B (en) Positioning system and positioning method
CN111639735A (en) Device for positioning and positioning method based on device
CN114693749A (en) Method and system for associating different physical coordinate systems
CN212460586U (en) Device for positioning
CN116710975A (en) Method for providing navigation data for controlling a robot, method and device for manufacturing at least one predefined point-symmetrical area
CN112648936A (en) Stereoscopic vision detection method and detection device based on differential projection
CN220170495U (en) Calibration system of equipment
CN112417904B (en) Method and electronic device for presenting information related to an optical communication device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant