CN113712665A - Positioning method and device based on positioning marker and computer storage medium - Google Patents

Positioning method and device based on positioning marker and computer storage medium

Info

Publication number
CN113712665A
Authority
CN
China
Prior art keywords
positioning
target
code
localization
marker
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111280102.XA
Other languages
Chinese (zh)
Other versions
CN113712665B (en)
Inventor
白汝乐
赵英含
徐颢
倪自强
赵海霞
田庆
谢永召
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baihui Weikang Technology Co Ltd
Original Assignee
Beijing Baihui Weikang Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baihui Weikang Technology Co Ltd filed Critical Beijing Baihui Weikang Technology Co Ltd
Priority to CN202111280102.XA priority Critical patent/CN113712665B/en
Publication of CN113712665A publication Critical patent/CN113712665A/en
Application granted granted Critical
Publication of CN113712665B publication Critical patent/CN113712665B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 19/00 Dental auxiliary appliances
    • A61C 19/06 Implements for therapeutic treatment
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61C DENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C 3/00 Dental tools or instruments
    • A61C 3/02 Tooth drilling or cutting instruments; Instruments acting like a sandblast machine
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1408 Methods for optical code recognition the method being specifically adapted for the type of code
    • G06K 7/1417 2D bar codes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06K GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K 7/10 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K 7/14 Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation using light without selection of wavelength, e.g. sensing reflected white light
    • G06K 7/1404 Methods for optical code recognition
    • G06K 7/1439 Methods for optical code recognition including a method step for retrieval of the optical code
    • G06K 7/1443 Methods for optical code recognition including a method step for retrieval of the optical code locating of the code in an image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Electromagnetism (AREA)
  • Dentistry (AREA)
  • Artificial Intelligence (AREA)
  • Epidemiology (AREA)
  • Surgery (AREA)
  • Toxicology (AREA)
  • Robotics (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A positioning method, a positioning device and a computer storage medium based on a positioning marker. The method mainly includes: collecting the reference positioning marks on the positioning marker to obtain at least two target images; identifying the target positioning marks in each target image according to the reference positioning marks; and determining a target positioning result of the positioning marker according to the positions of the target positioning marks in each target image and the positions of the reference positioning marks on the positioning marker. In this way, a more accurate and stable tracking and positioning effect can be achieved.

Description

Positioning method and device based on positioning marker and computer storage medium
Technical Field
Embodiments of the invention relate to visual positioning technology, and in particular to a positioning method and device based on a positioning marker and a computer storage medium.
Background
In oral robotic surgery, an optical positioning tracker is needed to track the positional relationship between the robot end and the patient's oral cavity in real time, so as to guide the robot to accurately complete oral surgery tasks such as drilling and grinding inside the patient's mouth.
In view of the above, how to provide a positioning technology for the robot end position that can guide an oral surgery robot to accurately complete various oral surgery operations is a technical problem to be solved by the present application.
Disclosure of Invention
In view of the above, one of the technical problems to be solved by the embodiments of the present invention is to provide a positioning method based on a positioning marker, which can provide an accurate and stable tracking and positioning effect.
According to a first aspect of the present invention, there is provided a localization marker-based localization method, comprising: acquiring reference positioning marks on the positioning markers to obtain at least two target images; identifying target positioning marks in the target images according to the reference positioning marks; and determining a target positioning result of the positioning marker according to the positions of the target positioning markers in the target images and the positions of the reference positioning markers on the positioning marker.
According to a second aspect of the present invention, there is provided a storage medium having stored thereon computer instructions which, when executed by a processor, cause the processor to perform the method of the first aspect described above.
According to a third aspect of the present invention, there is provided a localization marker-based localization apparatus comprising: the acquisition module is used for acquiring reference positioning marks on the positioning markers and acquiring at least two target images; the identification module is used for identifying the target positioning identifier in each target image according to the reference positioning identifier; and the positioning module is used for determining a target positioning result of the positioning marker according to the positions of the target positioning markers in the target images and the positions of the reference positioning markers on the positioning marker.
As can be seen from the above technical solutions, according to the positioning method, apparatus and computer storage medium based on a positioning marker provided in the embodiments of the present invention, the reference positioning identifier on the positioning marker is collected, the target positioning identifier in each target image is identified according to the reference positioning identifier, and the target positioning result of the positioning marker is then determined according to the position of the target positioning identifier in the target image and the position of the reference positioning identifier on the positioning marker. Therefore, an accurate positioning and tracking effect can be provided based on the positioning marker alone.
Furthermore, the positioning technique of the present application can be used in combination with a robot for performing oral surgery. By mounting the positioning marker at the end of the robot's mechanical arm, the positional relationship between the arm end and the patient's oral cavity can be tracked, which helps guide the arm to accurately carry out various oral surgery tasks in the patient's mouth and improves the success rate of the surgery.
In addition, by replacing the traditional planar positioning marker with a cylindrical positioning marker, the present application allows a richer range of arm-end angular poses to be tracked, thereby providing a more stable tracking and positioning effect.
Drawings
Some specific embodiments of the present application will be described in detail hereinafter by way of illustration and not limitation with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
fig. 1 is a schematic view illustrating an application scenario of the localization method based on the localization marker of the present invention.
Fig. 2 shows a schematic view of an embodiment of the localization marker of the present invention.
Fig. 3 shows a schematic plan-view development of a reference location marker on a localization marker of the present invention.
Fig. 4 shows a schematic view of an embodiment of each reference location code on the localization marker of the present invention.
Fig. 5 to 12 are schematic flow charts illustrating localization methods based on localization markers according to first to eighth embodiments of the present invention.
Fig. 13 shows a schematic structural diagram of a positioning marker-based positioning device according to a tenth embodiment of the invention.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions in the embodiments of the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the drawings of the embodiments of the present invention. It is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention shall fall within the scope of protection of the embodiments of the present invention.
The following will further describe the specific implementation of the embodiments of the present invention with reference to the drawings of the embodiments of the present invention.
The application provides a positioning method and device based on a positioning marker and a computer storage medium, which can be used in combination with technologies in various fields to provide an accurate and stable navigation and positioning effect. For example, the present application can be applied to the navigation and positioning of an oral surgery robot (refer to fig. 1): the positioning marker 1 can be mounted at the end of the mechanical arm 2 of the oral surgery robot, and the optical positioning tracker 3 can track the positional relationship between the end of the mechanical arm 2 and the patient's oral cavity in real time by collecting the reference positioning mark on the positioning marker 1, so as to guide the robot to complete oral surgery operations such as drilling and grinding in the patient's oral cavity.
It should be noted that the present application can also be applied in combination with other technical fields that need to provide a location support service, and the present application is not limited thereto.
The localization technology of the present application is realized based on localization markers, and the basic structural design of the localization markers will be described in detail below with reference to fig. 2 to 4.
Referring to fig. 2, in the present embodiment, the positioning marker 1 may be cylindrical, and the reference positioning mark may include at least one reference positioning code 10 disposed on the positioning marker 1.
Optionally, each reference positioning code may include a two-dimensional code mainly composed of an edge region and an identification region, wherein the identification region is enclosed by the edge region and may be composed of at least a 3 × 3 grid array.
Optionally, the reference positioning identifier may further include one or more reference identity codes circumferentially distributed on the positioning marker, and each reference identity code has the same two-dimensional code.
Alternatively, as shown in fig. 2 and 3, the reference positioning identifier may include a plurality of reference positioning codes 10 and a plurality of reference identity codes 12 circumferentially distributed on the positioning marker 1, and each reference positioning code 10 and each reference identity code 12 may be printed or engraved on the positioning marker 1 according to a certain arrangement. For example, in the example shown in fig. 3, the first row consists of the reference identity codes 12, which can be used to distinguish different positioning markers in the same field of view and can also be regarded as the identity code of the positioning marker; in the first row, the two-dimensional code of each reference identity code 12 is the same, so as to ensure that, in different postures, reference identity codes 12 on the positioning marker remain within the tracking field of view of the imaging device (i.e., the optical positioning tracker 3 of fig. 1). The second to fourth rows are the reference positioning codes 10, in which the two-dimensional code of each reference positioning code 10 is different, and the imaging device (i.e., the optical positioning tracker 3 of fig. 1) can calculate the current pose of the positioning marker by locating all the reference positioning codes 10 in its field of view.
In this embodiment, the size of the identification area in the reference location code may be adjusted arbitrarily according to the size of the circumference of the location marker, which is not limited in this application.
For example, in the example shown in fig. 4, each reference location code 10 may be formed by a 6 × 6 grid array, including an edge region 10a and an identification region 10b formed by a 4 × 4 grid array. Wherein, each grid forming the edge area 10a can be black to independently separate each reference positioning code, and each grid in the identification area can be coded by 0 and 1 (0 is black, 1 is white).
In this embodiment, the Hamming distance between any two reference positioning codes should not be less than 1, and the Hamming distance inside each reference positioning code should not be less than 1.
Specifically, the encoding principle of each reference location code should satisfy the following three conditions:
First, the pattern of each reference positioning code must not be too simple, for example entirely black or entirely white; the cells should alternate between black and white as much as possible so as to avoid confusion with other objects in the application environment.
Secondly, the Hamming distance between any two reference positioning codes should not be less than 1, which is specifically defined as follows:

$$D(c_i, c_j) = \min_{k \in \{0,1,2,3\}} H\big(c_i, R_k(c_j)\big) \ge 1$$

where $D(c_i, c_j)$ is the distance between two reference positioning codes $c_i$ and $c_j$, $H$ is the Hamming distance, and $R_k$ is the operator that rotates a reference positioning code (i.e., the two-dimensional code) clockwise by $k \times 90°$.
Third, the internal Hamming distance of each reference positioning code should also not be less than 1, which is defined as follows:

$$D(c_i) = \min_{\substack{k, l \in \{0,1,2,3\} \\ k \ne l}} H\big(R_k(c_i), R_l(c_i)\big) \ge 1$$

where $D(c_i)$ represents the internal Hamming distance of the reference positioning code, i.e., the distance between any two of the 4 codes obtained by rotating the reference positioning code 4 times, which should not be less than 1.
When the reference positioning codes meet the three conditions, it can be ensured that each reference positioning code on the positioning marker can be successfully tracked and identified by an imaging device (such as an optical positioning tracker), and four outer corner points of each reference positioning code can be distinguished.
Moreover, the encoding principle of each reference identity code in the reference positioning identifier may also refer to the above-mentioned principles of the positioning code, which is not described in detail in this application.
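For illustration only, the following Python/NumPy sketch checks the three encoding conditions above for a set of candidate 4 × 4 identification regions; the function and variable names are hypothetical and not part of the patent.

```python
import numpy as np
from itertools import combinations

def rotations(code: np.ndarray):
    """Return the four codes obtained by rotating a square 0/1 grid by 0/90/180/270 degrees."""
    return [np.rot90(code, k) for k in range(4)]

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing cells between two equally sized 0/1 grids."""
    return int(np.sum(a != b))

def pair_distance(ci: np.ndarray, cj: np.ndarray) -> int:
    """D(ci, cj): minimum Hamming distance over all 90-degree rotations of cj."""
    return min(hamming(ci, r) for r in rotations(cj))

def internal_distance(ci: np.ndarray) -> int:
    """Internal distance: minimum Hamming distance between any two distinct rotations of ci."""
    return min(hamming(a, b) for a, b in combinations(rotations(ci), 2))

def is_valid_dictionary(codes) -> bool:
    """Check the three encoding conditions described above for a list of 4x4 identification regions."""
    for c in codes:
        # condition 1: the pattern must not be entirely black or entirely white
        if c.min() == c.max():
            return False
        # condition 3: internal Hamming distance >= 1 (the code is not rotationally symmetric)
        if internal_distance(c) < 1:
            return False
    # condition 2: pairwise distance >= 1 under rotation
    return all(pair_distance(a, b) >= 1 for a, b in combinations(codes, 2))
```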
To sum up, compared with a traditional planar positioning marker, the cylindrical positioning marker adopted in the present application enables the imaging device to track the reference positioning identifiers (including the reference positioning codes and the reference identity codes) on the positioning marker at any angle. Therefore, the range of arm angular poses that can be tracked is larger, and the positioning and tracking effect is more stable.
First embodiment
Fig. 5 shows a flow chart of the localization method based on localization markers according to the first embodiment of the present invention.
As shown in the figure, the method of the present embodiment mainly includes the following steps:
step S502, collecting reference positioning marks on the positioning markers, and acquiring at least two target images.
Optionally, a reference localization marker on the localization marker may be acquired with the imaging device.
For example, in the example shown in fig. 1, the positioning marker 1 may be disposed at the end of a robot arm 2 (e.g., an oral surgery robot arm), and the reference positioning mark on the positioning marker 1 is acquired by an imaging device 3 (e.g., an optical positioning tracker) so as to track and position the relationship between the end of the robot arm 2 and the patient's oral cavity.
Alternatively, the imaging device may include one of a binocular camera and a multi-view camera.
In one embodiment, a binocular camera may be used to synchronously acquire at least one reference location code on the location marker to obtain two target images.
In another embodiment, a multi-view camera may be used to synchronously acquire at least one reference positioning code on the positioning marker to obtain a plurality of target images, and two target images meeting a predetermined screening rule (e.g., imaging quality, imaging angle, etc.) may be screened out from the plurality of target images, so as to improve the accuracy of the subsequent localization, tracking and identification.
Step S504, according to the reference positioning mark, the target positioning mark in each target image is identified.
Optionally, the two target images may be respectively identified, and a target location code matched with the reference location code in each target image is obtained.
Specifically, each edge point in the target image can be found using the Sobel operator, each connected region in the target image can be determined according to the edge points, and candidate regions can then be screened out from the connected regions and identified, so as to determine the target positioning identifier in the target image that matches the reference positioning identifier.
Step S506, determining the target positioning result of the positioning marker according to the position of the target positioning marker in each target image and the position of the reference positioning marker on the positioning marker.
Optionally, the same target location code in the two target images may be identified through corner matching, three-dimensional coordinate information of each target location code is calculated according to two-dimensional coordinate information of the same target location code in the two target images, and the pose of the positioning marker under the imaging device (optical positioning tracker) is calculated according to the three-dimensional coordinate information of the target location code and the three-dimensional coordinate information of the reference location code on the positioning marker.
In summary, the target image is generated by collecting the reference positioning identifier on the positioning marker, the target positioning identifier in the target image is identified according to the reference positioning identifier, and the target positioning result of the positioning marker is determined based on the positions of the target positioning identifier and the reference positioning identifier in respective coordinate systems.
Second embodiment
Fig. 6 shows a localization method based on localization markers according to a second embodiment of the present application, which mainly shows a specific implementation of step S504 described above, specifically as follows:
for each of the two acquired target images, the following steps are performed
Step S602, determining each connected region in the target image according to each edge point identified from the target image.
In this embodiment, the Sobel operator can be used to identify edge points in the target image, i.e., sharp edges in the target image, which can correspond to the black edge regions of the reference positioning codes on the positioning marker.
Step S604, screening each connected region based on a preset screening rule, and determining a candidate region in the target image.
In this embodiment, if a connected region can be fitted by a quadrilateral region (e.g., a rectangular region), the connected region can be determined as a candidate region, and each candidate region screened out can correspond to a reference positioning code (i.e., two-dimensional code) on the positioning marker.
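As an illustrative sketch only (not part of the patent), the quadrilateral fit used in this screening step could be approximated with OpenCV's convex hull and polygon approximation; the helper below assumes the connected region is given as a list of (y, x) pixel coordinates and returns four corner points or None.

```python
import numpy as np
import cv2

def fit_quadrilateral(region_points, epsilon_ratio: float = 0.05):
    """Try to fit a convex quadrilateral to a connected region (list of (y, x) points).
    Returns the 4 corner points as (x, y) pairs, or None if no quadrilateral fits."""
    pts = np.array([(x, y) for y, x in region_points], dtype=np.int32).reshape(-1, 1, 2)
    hull = cv2.convexHull(pts)
    peri = cv2.arcLength(hull, True)
    approx = cv2.approxPolyDP(hull, epsilon_ratio * peri, True)
    if len(approx) != 4:
        return None
    return approx.reshape(4, 2).astype(float)
```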
In step S606, projective transformation is performed on the candidate region, and a transformation region of the candidate region is acquired.
In this embodiment, it is assumed that the four corner points of each reference positioning code on the cylindrical positioning marker are located on the same plane in space; thus, the candidate region in the target image can be converted into a standard square by performing projective rectification on the candidate region.
Step S608, identifying a conversion area, and if the identification result of the conversion area matches the reference positioning code, determining the conversion area as the target positioning code.
Alternatively, the candidate region may be decoded, and the decoding result of the candidate region may be matched with each reference location code on the location marker, and if the matching is successful, the candidate region is determined as the target location code.
In summary, the identification technology of the target positioning identifier adopted in the embodiment has the advantage of high identification accuracy, and can be beneficial to improving the accuracy of the subsequent positioning identification result.
Third embodiment
Fig. 7 shows a positioning method based on a positioning marker in a third embodiment of the present application, which mainly shows a specific implementation of step S602 described above, and as shown in the figure, this embodiment mainly includes the following steps:
Step S702, using the Sobel operator to obtain the gradient and the gradient angle corresponding to each pixel point in the target image.
Optionally, the directional gradient values corresponding to each pixel point (i.e., the gradient values along the X direction and along the Y direction of the target image) may be obtained according to the Sobel operator matrices and the gray value of each pixel point; the gradient corresponding to each pixel point may then be obtained according to its directional gradient values and a preset gradient conversion rule, and the gradient angle corresponding to each pixel point may be obtained according to its directional gradient values and a preset gradient angle conversion rule.
Specifically, the Sobel operator matrices can be expressed as:

$$G_x = \begin{bmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{bmatrix} \ast A, \qquad G_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ +1 & +2 & +1 \end{bmatrix} \ast A$$

where $G_x$ represents the directional gradient value of each pixel point along the X direction of the target image, $G_y$ represents the directional gradient value of each pixel point along the Y direction of the target image, $\ast$ is the convolution symbol, and $A$ represents the gray values corresponding to the pixel points in the target image.
Alternatively, the preset gradient conversion rule may be expressed as:

$$G = \sqrt{G_x^2 + G_y^2}$$

where $G$ represents the gradient corresponding to each pixel point.
Alternatively, the preset gradient angle conversion rule may be expressed as:

$$\theta = \arctan\left(\frac{G_y}{G_x}\right)$$

where $\theta$ represents the gradient angle corresponding to each pixel point.
Step S704, determining each edge point in each pixel point according to the preset gradient high threshold, the preset gradient low threshold, each gradient corresponding to each pixel point, and each gradient angle.
Alternatively, the edge points may include a first edge point and a second edge point.
Optionally, each pixel point with a gradient greater than the gradient high threshold may be determined as the first edge point according to a preset gradient high threshold.
Optionally, if the gradient of a pixel point lies between the preset gradient high threshold and the preset gradient low threshold, a first edge point adjacent to the pixel point exists, and the difference between the gradient angle of the pixel point and the gradient angle of that first edge point is smaller than a preset difference, the pixel point is determined as a second edge point.
Optionally, the preset difference (i.e., the allowed difference between gradient angles $\theta$) is not greater than eight degrees.
In particular, the gradient $G$ of every pixel point in the target image may be traversed: each pixel point whose gradient is greater than the preset gradient high threshold (maxThres) is determined as a first edge point, and each pixel point whose gradient is lower than the preset gradient low threshold (minThres) is discarded.

Furthermore, for a pixel point whose gradient $G$ lies between the preset gradient high threshold (maxThres) and the preset gradient low threshold (minThres), the 8-neighborhood of the current pixel point (i.e., the 8 pixels adjacent to it) is searched for a first edge point. If a first edge point exists, the absolute difference between the gradient angle $\theta$ of the current pixel point and the gradient angle of the adjacent first edge point is further calculated; if this difference is smaller than the preset difference (for example, 8 degrees), the current pixel point is determined as a second edge point, and otherwise it is discarded.

Finally, according to the determined first edge points and second edge points, the edge point set $E$ of the target image can be obtained.
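The following Python/NumPy sketch illustrates the gradient computation and the two-threshold edge-point selection described above; it is a minimal interpretation, not the patent's implementation. maxThres/minThres correspond to the preset high and low gradient thresholds, and arctan2 is used instead of arctan(G_y/G_x) to avoid division by zero.

```python
import numpy as np
from scipy.signal import convolve2d

# Sobel kernels for the X and Y directional gradients
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)

def sobel_edge_points(gray: np.ndarray, max_thres: float, min_thres: float,
                      angle_diff_deg: float = 8.0) -> np.ndarray:
    """Return a boolean mask of edge points using the two-threshold rule described above."""
    gx = convolve2d(gray.astype(float), KX, mode="same", boundary="symm")
    gy = convolve2d(gray.astype(float), KY, mode="same", boundary="symm")
    grad = np.hypot(gx, gy)                    # gradient magnitude G
    theta = np.degrees(np.arctan2(gy, gx))     # gradient angle in degrees

    first = grad > max_thres                   # first edge points
    edges = first.copy()
    h, w = gray.shape
    # candidate pixels: gradient between the low and high thresholds
    for y, x in zip(*np.where((grad >= min_thres) & (grad <= max_thres))):
        y0, y1 = max(0, y - 1), min(h, y + 2)
        x0, x1 = max(0, x - 1), min(w, x + 2)
        nb_first = first[y0:y1, x0:x1]
        if not nb_first.any():
            continue
        # keep the pixel if its angle is close to an adjacent first edge point's angle
        nb_theta = theta[y0:y1, x0:x1][nb_first]
        if np.min(np.abs(nb_theta - theta[y, x])) < angle_diff_deg:
            edges[y, x] = True
    return edges
```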
Step S706, based on each edge point, determines each connected region in each edge point of the target image.
In this embodiment, 8-connected regions can be found in the edge point set $E$ by testing whether a given pixel point of the target image belongs to a connected region.

Specifically, the coordinates of a pixel point $p$ in the target image can be defined as $p = (x_p, y_p)$. According to the following preset formula, it can be determined whether the pixel point $p$ is located in a connected region $C_i$.

The preset formula can be expressed as: if there exists a point $q = (x_q, y_q) \in C_i$ such that $\max(|x_p - x_q|, |y_p - y_q|) \le 1$, then $p \in C_i$.
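A minimal sketch of grouping the edge points into 8-connected regions (the membership test above) might look as follows; it is illustrative only.

```python
import numpy as np
from collections import deque

def connected_regions(edge_mask: np.ndarray):
    """Group edge points into 8-connected regions; returns a list of lists of (y, x) coordinates."""
    h, w = edge_mask.shape
    visited = np.zeros_like(edge_mask, dtype=bool)
    regions = []
    for sy, sx in zip(*np.where(edge_mask)):
        if visited[sy, sx]:
            continue
        queue, region = deque([(sy, sx)]), []
        visited[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            region.append((y, x))
            # a neighbour q belongs to the same region if max(|dy|, |dx|) <= 1
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and edge_mask[ny, nx] and not visited[ny, nx]:
                        visited[ny, nx] = True
                        queue.append((ny, nx))
        regions.append(region)
    return regions
```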
In summary, the connected regions among the edge points of the target image can be accurately identified by means of the Sobel operator, which helps improve the accuracy of the subsequent positioning result.
Fourth embodiment
Fig. 8 shows a flowchart of a positioning method based on a positioning marker according to a fourth embodiment of the present application, which is a specific implementation of step S606. As shown in the figure, the present embodiment mainly includes the following steps:
and S802, determining the relative vertex positions of the reference positioning code according to the preset side length of the reference positioning code.
In this embodiment, assuming that the reference location code is composed of a 6 × 6 grid array as shown in fig. 4, the length of the location edge of the reference location code can be determined to be 6, and the relative vertex positions of the four corner points of the reference location code are (0, 0), (0, 6), (6, 0), (6, 6).
Step S804, acquiring a homography matrix according to the target vertex positions of the candidate area relative to the target image and the relative vertex positions of the reference positioning code.
In this embodiment, the transformation of the candidate region in the target image to the standard square can be described by a 3 × 3 homography matrix, which is expressed as follows:

$$s \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = H \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix}, \qquad H = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix}$$

where $H$ represents the homography matrix, $(u_i, v_i)$ is the target vertex position of one corner point of the quadrilateral fitted to the candidate region, $(x_i, y_i)$ is the relative vertex position of the corresponding corner point of the reference positioning code, and $s$ is an unknown scale factor. Four vertex pairs are thereby obtained; each pair yields three equations, so a total of 12 equations can be listed, as follows:

When $i = 0$, eliminating the scale factor via the cross product $(x_0, y_0, 1)^T \times H\,(u_0, v_0, 1)^T = \mathbf{0}$ gives the following three equations:

$$\begin{aligned} y_0\,(h_{31} u_0 + h_{32} v_0 + h_{33}) - (h_{21} u_0 + h_{22} v_0 + h_{23}) &= 0 \\ (h_{11} u_0 + h_{12} v_0 + h_{13}) - x_0\,(h_{31} u_0 + h_{32} v_0 + h_{33}) &= 0 \\ x_0\,(h_{21} u_0 + h_{22} v_0 + h_{23}) - y_0\,(h_{11} u_0 + h_{12} v_0 + h_{13}) &= 0 \end{aligned}$$

In the same way, when $i$ is 1, 2 and 3, another 9 equations can be listed, which are not described herein.

For the 9 unknowns $h_{11}, \ldots, h_{33}$ in the above 12 equations, the homography matrix can be solved by singular value decomposition: the equations are stacked into the form $A\mathbf{h} = \mathbf{0}$ with $\mathbf{h} = (h_{11}, h_{12}, \ldots, h_{33})^T$, $\mathbf{h}$ is taken as the right singular vector of $A$ corresponding to its smallest singular value, and $H$ is obtained by reshaping $\mathbf{h}$ into a 3 × 3 matrix.
in step S806, projection conversion is performed on the candidate region using the homography matrix, and a conversion region of the candidate region is obtained.
In this embodiment, a square conversion region with a side length of 6 (which may also be regarded as a 6 × 6 grid array) may be obtained according to the inverse of the solved homography matrix and the coordinates of the four target vertices of the candidate region.
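For illustration, a homography can be estimated from the four vertex pairs by stacking the per-pair equations and solving with SVD, as sketched below in Python/NumPy; this is one standard DLT formulation and is only assumed to match the patent's computation.

```python
import numpy as np

def homography_dlt(img_pts, ref_pts):
    """Estimate the 3x3 homography mapping image corner points to reference corner points
    by stacking the cross-product (DLT) equations and solving with SVD."""
    rows = []
    for (u, v), (x, y) in zip(img_pts, ref_pts):
        p = [u, v, 1.0]
        rows.append([0, 0, 0] + [-c for c in p] + [y * c for c in p])
        rows.append(p + [0, 0, 0] + [-x * c for c in p])
        rows.append([-y * c for c in p] + [x * c for c in p] + [0, 0, 0])
    A = np.asarray(rows, dtype=float)            # 12 x 9 for four vertex pairs
    _, _, vt = np.linalg.svd(A)
    h = vt[-1]                                   # right singular vector of the smallest singular value
    return (h / h[-1]).reshape(3, 3)             # normalise so that h33 = 1

def warp_point(H, pt):
    """Apply a homography to a single 2D point (homogeneous normalisation included)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

For example, H = homography_dlt(candidate_corners, [(0, 0), (6, 0), (6, 6), (0, 6)]) maps the fitted quadrilateral onto the 6 × 6 reference square; the conversion region can then be filled by mapping each grid position back into the image with np.linalg.inv(H) (e.g., via warp_point) and sampling the image there.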
Fifth embodiment
Fig. 9 shows a flowchart of a positioning method based on a positioning marker according to a fifth embodiment of the present application, which is a specific implementation of step S608 described above. As shown in the figure, the present embodiment mainly includes the following steps:
step S902, identifying the conversion region, and obtaining the identification code of the conversion region.
Optionally, since the conversion region is formed by a 6 × 6 grid array, the identification code of the conversion region can be obtained by decoding the 4 × 4 identification region within it, according to the edge region of the reference positioning code and the specific distribution form of the identification region.
Step S904, comparing the identification code with the reference positioning code, and determining the conversion area as the target positioning code in response to the comparison result that the identification code matches with the reference positioning code.
Optionally, the identification code of the conversion region may be matched with each reference location code on the location marker, and if the matching is successful, the conversion region may be considered as a successfully tracked two-dimensional code, and the conversion region may be determined as the target location code.
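The decoding and matching step could be sketched as follows (illustrative Python/NumPy only; the 0 = black, 1 = white convention follows the description of fig. 4, and the helper names are hypothetical):

```python
import numpy as np

def decode_identification_region(cells: np.ndarray):
    """Decode a 6x6 conversion region (0 = black, 1 = white).
    Returns the inner 4x4 identification code, or None if the border is not all black."""
    if cells.shape != (6, 6):
        raise ValueError("expected a 6x6 cell grid")
    border = np.concatenate([cells[0, :], cells[-1, :], cells[:, 0], cells[:, -1]])
    if border.any():                 # the edge region must be entirely black
        return None
    return cells[1:5, 1:5]

def match_reference_code(code: np.ndarray, reference_codes):
    """Compare a decoded 4x4 code against the reference positioning codes,
    allowing for the unknown 90-degree rotation of the marker in the image."""
    for idx, ref in enumerate(reference_codes):
        for k in range(4):
            if np.array_equal(np.rot90(code, k), ref):
                return idx, k        # matched reference code index and rotation
    return None
```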
In summary, the fourth and fifth embodiments of the present application perform projective rectification on the candidate region in the target image to convert it into a conversion region matching the reference positioning code, and then decode the conversion region and match it against the reference positioning codes, so that the target positioning code in the target image can be accurately identified, which helps improve the accuracy of the subsequent positioning identification.
Sixth embodiment
Fig. 10 shows a flowchart of a positioning method based on a positioning marker according to a sixth embodiment of the present application, which is a specific implementation of step S506 described above. As shown in the figure, the present embodiment mainly includes the following steps:
step S1002, obtaining intermediate positioning information of the target positioning identifier according to the position of the target positioning identifier in each target image, and continuing to execute step S1006.
Optionally, the coordinate positions of the same target positioning code in the two target images can be identified, two pieces of two-dimensional coordinate information of the target positioning code corresponding to the two target images are obtained, and then the three-dimensional coordinate information of the target positioning code can be obtained according to the two pieces of two-dimensional coordinate information of the target positioning code.
Alternatively, the positions corresponding to the target location codes in the two target images (for example, the positions of the corner points of the target location codes) may be identified, the first position information and the second position information corresponding to the two target images of each target location code are obtained, and then the first position information and the second position information are matched to determine the same target location code in the two target images.
Optionally, the three-dimensional coordinate information of the target location code may include coordinates of each target corner point corresponding to each corner point of the target location code.
Specifically, when the same target location code is extracted from two target images generated by the imaging device (optical location tracker), three-dimensional homogeneous coordinates of each corner point of the target location code under the imaging device (optical location tracker) can be obtained through two-dimensional homogeneous coordinates of each corner point of the target location code in the two target images, which is specifically as follows:
assume that two projection matrices of an imaging device (binocular camera) are respectively expressed as:
Figure 969104DEST_PATH_IMAGE037
and
Figure 851610DEST_PATH_IMAGE038
the three-dimensional homogeneous coordinate of a certain corner point of the target positioning code in the space is defined as
Figure 266411DEST_PATH_IMAGE039
(which contains 4 parameters), the two-dimensional homogeneous coordinate of the same angular point of the target positioning code in the two target images is
Figure 208959DEST_PATH_IMAGE040
And
Figure 646893DEST_PATH_IMAGE041
(each containing 3 parameters), then there are two mapping equations, which are expressed as follows:
Figure 333090DEST_PATH_IMAGE042
Figure 602397DEST_PATH_IMAGE043
from the two mapping equations above, the following two equations can be derived:
Figure 715846DEST_PATH_IMAGE044
Figure 641077DEST_PATH_IMAGE045
the two equations can be collated into
Figure 865385DEST_PATH_IMAGE046
In the form of (a), wherein,
Figure 192461DEST_PATH_IMAGE047
wherein the content of the first and second substances,
Figure 775015DEST_PATH_IMAGE048
to represent
Figure 187541DEST_PATH_IMAGE049
To (1) a
Figure 215540DEST_PATH_IMAGE050
Go and do benefitSolved by least square method
Figure 397123DEST_PATH_IMAGE039
I.e. three-dimensional homogeneous coordinates of a certain corner of the target location code in space (corresponding to the intermediate location information of the target location marker).
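A minimal Python/NumPy sketch of this linear triangulation, under the assumption that P1 and P2 are the calibrated 3 × 4 projection matrices of the two cameras, is:

```python
import numpy as np

def triangulate_corner(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one corner point from two views.
    P1, P2 are 3x4 projection matrices; pt1, pt2 are the (u, v) pixel coordinates
    of the same corner in the two target images."""
    u1, v1 = pt1
    u2, v2 = pt2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # least-squares solution of A X = 0: right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]   # de-homogenise to a 3D point in the camera (tracker) frame
```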
Step S1004, obtaining the reference positioning information of the reference positioning mark according to the position of the reference positioning mark on the positioning marker, and continuing to execute step S1006.
Optionally, the coordinate position of the reference location code in the local coordinate system of the location marker may be identified, and the three-dimensional coordinate information of the reference location code is obtained.
Optionally, the three-dimensional coordinate information of the reference location code may include coordinates of each reference corner point corresponding to each corner point of the reference location code.
Step S1006, determining the target positioning result of the positioning marker according to the intermediate positioning information and the reference positioning information.
Specifically, the posture parameter and the position parameter of the localization marker under the imaging device (such as the optical localization tracker) can be determined according to the three-dimensional coordinate information of the target localization code under the imaging device (such as the optical localization tracker) and the three-dimensional coordinate information of the reference localization code under the localization marker.
It should be further noted that, in this embodiment, the execution sequence between step S1002 and step S1004 may be set to be executed synchronously or sequentially according to an actual situation, which is not limited in this application.
Seventh embodiment
Fig. 11 shows a flowchart of a positioning method based on a positioning marker according to a seventh embodiment of the present application, which is a specific implementation of step S506 described above. As shown in the figure, the present embodiment mainly includes the following steps:
Step S1102, determining the target gravity center point of the target positioning code according to at least three target corner point coordinates and the target gravity center point conversion rule.
In this embodiment, a target gravity center point of the target location code in the local coordinate system corresponding to the imaging device may be determined according to at least three corner points of the target location code.
Optionally, the target gravity center point conversion rule is expressed as:

$$\bar{q} = \frac{1}{n} \sum_{i=1}^{n} q_i$$

where $\bar{q}$ represents the target gravity center point of the target positioning code, $q_i$ represents the target corner point coordinates of the target positioning code, and $n$ is the number of corner points of the target positioning code, with $n$ not less than 3.
Step S1104, obtaining a target vector of the target location code according to the coordinates of each target corner point and the target gravity center point, and continuing to execute step S1110.
Alternatively, the target vectors of the target positioning code may be represented as:

$$q_i' = q_i - \bar{q}, \qquad i = 1, \ldots, n$$
step S1106 determines a reference gravity center point of the reference positioning code according to each reference corner point coordinate corresponding to each target corner point coordinate and the reference gravity center point conversion rule.
In this embodiment, a reference gravity center point of the reference location code in the local coordinate system corresponding to the location marker may be determined according to at least three corner points of the reference location code.
Alternatively, the reference gravity center point conversion rule may be expressed as:

$$\bar{p} = \frac{1}{n} \sum_{i=1}^{n} p_i$$

where $\bar{p}$ represents the reference gravity center point of the reference positioning code, $p_i$ represents the reference corner point coordinates of the reference positioning code, and $n$ is the number of corner points of the reference positioning code, with $n$ not less than 3.
Step S1108, obtaining a reference vector of the reference positioning code according to the coordinates of each reference corner point and the reference gravity point, and continuing to execute step S1110.
Optionally, the reference vectors are represented as:

$$p_i' = p_i - \bar{p}, \qquad i = 1, \ldots, n$$
step S1110, a covariance matrix is obtained according to the target vector and the reference vector.
Alternatively, the covariance matrix can be expressed as:

$$W = Q' \, (P')^{T}$$

where $W$ represents the covariance matrix, $P'$ is the 3 × n matrix whose columns are the reference vectors $p_i'$, $Q'$ is the 3 × n matrix whose columns are the target vectors $q_i'$, and $(\cdot)^{T}$ denotes the transposed matrix.
In step S1112, the attitude parameter and the position parameter of the positioning marker corresponding to the imaging device are determined based on the singular value decomposition result obtained by the covariance matrix.
Specifically, the attitude parameters of the localization markers corresponding to the imaging device may be determined based on the singular value decomposition result obtained by the covariance matrix.
In this embodiment, the singular value decomposition result of the covariance matrix can be expressed as:

$$W = U \Sigma V^{T}$$

where $U$ and $V$ are the parameters to be solved.

According to the obtained parameters $U$ and $V$, the attitude parameter of the positioning marker with respect to the imaging device (e.g., the orientation of the robot arm end in the world coordinate system) can be determined, expressed as:

$$R = U V^{T}$$

where $R$ represents the attitude parameter.
Then, according to the attitude parameter $R$, the target gravity center point $\bar{q}$ and the reference gravity center point $\bar{p}$, the position parameter of the positioning marker with respect to the imaging device is obtained, expressed as:

$$t = \bar{q} - R\,\bar{p}$$

where $t$ represents the position parameter.
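The pose computation of this embodiment corresponds to the classical SVD-based (Kabsch/Umeyama-style) rigid alignment; a minimal Python/NumPy sketch under that assumption is shown below. The determinant check guarding against a reflection solution is an addition not mentioned in the text.

```python
import numpy as np

def marker_pose(ref_corners, target_corners):
    """Estimate the pose (R, t) of the positioning marker in the imaging-device frame
    from n >= 3 corresponding corner points: ref_corners in the marker's local frame,
    target_corners triangulated in the camera frame, both given as (n, 3) arrays."""
    P = np.asarray(ref_corners, dtype=float)
    Q = np.asarray(target_corners, dtype=float)
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)   # reference / target gravity center points
    Pc, Qc = (P - p_bar).T, (Q - q_bar).T           # 3 x n matrices of centred vectors
    W = Qc @ Pc.T                                   # covariance matrix
    U, _, Vt = np.linalg.svd(W)
    R = U @ Vt                                      # attitude parameter
    if np.linalg.det(R) < 0:                        # guard against a reflection solution
        U[:, -1] *= -1
        R = U @ Vt
    t = q_bar - R @ p_bar                           # position parameter
    return R, t
```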
Eighth embodiment
Fig. 12 shows a flowchart of a localization marker-based localization method according to an eighth embodiment of the present application, and as shown in the drawing, this embodiment mainly includes the following steps:
step S1202, collecting reference identity codes on the positioning markers, and acquiring at least two target images.
In this embodiment, the acquisition of the reference identity code and the acquisition of the reference location code may be performed synchronously, and the specific acquisition means may refer to the above detailed description of the reference location code (for example, refer to the description of step S502), which is not described herein again.
Step S1204, identifying the target identity code in each target image according to the reference identity code, and determining the identity information of the positioning marker accordingly.
In this embodiment, the identification processing of the target identity code is substantially the same as the identification processing means of the target location code, and reference may be specifically made to the specific identification means related to the target location code (for example, refer to the description of step S504), which is not described herein again.
To sum up, the present application can determine the attitude and position of the positioning marker relative to the imaging device according to the position of the target positioning identifier in the target image and the position of the reference positioning identifier on the positioning marker, and therefore has the advantage of positioning accuracy.
Ninth embodiment
A ninth embodiment of the present application provides a computer storage medium having stored thereon computer instructions, which, when executed by a processor, cause the processor to perform the method of any one of the first to eighth embodiments described above.
Tenth embodiment
Fig. 13 shows a basic architecture diagram of a localization marker based localization apparatus of a tenth embodiment of the present application. As shown in the figure, the positioning apparatus 1300 based on the positioning marker of the present embodiment mainly includes: an acquisition module 1302, an identification module 1304, and a location module 1306.
The acquisition module 1302 is configured to acquire the reference location identifier on the location marker 1310, and acquire at least two target images.
Optionally, the positioning marker 1310 is cylindrical, and the reference positioning identifier includes at least one reference positioning code provided on the positioning marker 1310; the reference positioning code comprises an edge area and an identification area, wherein the edge area surrounds the identification area, and the identification area is at least formed by a 3 x 3 grid array.
Optionally, the positioning marker 1310 is disposed at the end of the robot arm, and the identification region of the reference positioning code is formed by a 4 × 4 grid array.
Optionally, the reference positioning identifier comprises a plurality of the reference positioning codes circumferentially distributed on the positioning marker 1310; the reference positioning codes comprise two-dimensional codes, the Hamming distance between any two reference positioning codes is not less than 1, and the internal Hamming distance of each reference positioning code is not less than 1.
Optionally, the acquisition module 1302 is further configured to synchronously acquire the reference positioning code on the positioning marker 1310 to obtain two target images; the acquisition module includes one of a binocular camera and a multi-view camera.
Optionally, the reference positioning identifier further includes one or more reference identification codes circumferentially distributed on the positioning marker 1310, and each reference identification code has the same two-dimensional code.
Optionally, the acquisition module 1302 is further configured to acquire the reference identity code on the positioning marker 1310 to obtain at least two target images.
The identifying module 1304 is configured to identify a target location identifier in each target image according to the reference location identifier.
Optionally, the identifying module 1304 is further configured to identify the two target images, and obtain the target positioning code in each target image, which is identical to the reference positioning code.
Optionally, the identifying module 1304 is further configured to, for each of the target images, perform the following steps: determining each connected region in each edge point of the target image according to each edge point identified from the target image; screening each connected region based on a preset screening rule, and determining a candidate region in the target image; performing projection conversion on the candidate region to obtain a conversion region of the candidate region; and identifying the conversion area, and if the identification result of the conversion area is matched with the reference positioning code, determining the conversion area as the target positioning code.
Optionally, the identifying module 1304 is further configured to obtain, by using a sobel operator, gradients and gradient angles corresponding to each pixel point in the target image; determining each edge point in each pixel point according to a preset gradient high threshold value, a preset gradient low threshold value, each gradient corresponding to each pixel point and each gradient angle; determining, based on the edge points, the connected regions in the edge points of the target image.
Optionally, the identifying module 1304 is further configured to obtain the directional gradient values corresponding to each pixel point according to the Sobel operator matrices and the gray value of each pixel point; obtain the gradient of each pixel point according to its directional gradient values and a preset gradient conversion rule; and obtain the gradient angle of each pixel point according to its directional gradient values and a preset gradient angle conversion rule. The Sobel operator matrices, the preset gradient conversion rule and the preset gradient angle conversion rule are the same as those described above for the third embodiment, namely

$$G_x = \begin{bmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{bmatrix} \ast A, \qquad G_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ +1 & +2 & +1 \end{bmatrix} \ast A, \qquad G = \sqrt{G_x^2 + G_y^2}, \qquad \theta = \arctan\!\left(\frac{G_y}{G_x}\right)$$

where $G_x$ and $G_y$ represent the directional gradient values of each pixel point along the X and Y directions of the target image, $\ast$ is the convolution symbol, $A$ represents the gray values of the pixel points in the target image, $G$ represents the gradient of each pixel point, and $\theta$ represents the gradient angle of each pixel point.
Optionally, the edge points comprise a first edge point and a second edge point; the identifying module 1304 is further configured to determine, according to the preset gradient high threshold, each pixel point whose gradient is greater than the gradient high threshold as the first edge point, and to determine a pixel point as the second edge point if, according to the preset gradient high threshold and the preset gradient low threshold, the gradient of the pixel point is between the two thresholds, a first edge point adjacent to the pixel point exists, and the difference between the gradient angle of the pixel point and the gradient angle of that first edge point is smaller than a preset difference.
Optionally, the preset difference is not greater than eight degrees.
Optionally, the identifying module 1304 is further configured to determine, for each of the connected regions, the connected region as the candidate region if the connected region can be fitted to a quadrilateral region.
Optionally, the identifying module 1304 is further configured to determine, according to a preset side length of the reference positioning code, each relative vertex position of the reference positioning code; acquiring a homography matrix according to the target vertex positions of the candidate region relative to the target image and the relative vertex positions of the reference positioning codes; and performing projection conversion on the candidate region by using the homography matrix to obtain a conversion region of the candidate region.
Optionally, the identifying module 1304 is further configured to identify the transition region, and obtain an identification code of the transition region; and comparing the identification code with the reference positioning code, responding to a comparison result that the identification code is matched with the reference positioning code, and determining the conversion area as the target positioning code.
Optionally, the identifying module 1304 is further configured to identify the target identity code in each target image according to the reference identity code, and accordingly determine the identity information of the positioning marker 1310.
The positioning module 1306 determines the target positioning result of the positioning marker 1310 according to the position of the target positioning identifier in each target image and the position of the reference positioning identifier on the positioning marker 1310.
Optionally, the positioning module 1306 is further configured to obtain intermediate positioning information of the target positioning identifier according to the position of the target positioning identifier in each target image, and obtain reference positioning information of the reference positioning identifier according to the position of the reference positioning identifier on the positioning marker 1310; determining a target positioning result of the positioning marker 1310 according to the intermediate positioning information and the reference positioning information.
Optionally, the positioning module 1306 is further configured to identify coordinate positions of the same target positioning code in the two target images, obtain two pieces of two-dimensional coordinate information of the target positioning code corresponding to the two target images, and obtain three-dimensional coordinate information of the target positioning code according to the two pieces of two-dimensional coordinate information.
Optionally, the positioning module 1306 is further configured to identify a coordinate position of the reference positioning code in the local coordinate system of the positioning marker 1310, and obtain three-dimensional coordinate information of the reference positioning code.
Optionally, the positioning module 1306 is further configured to identify respective positions corresponding to the respective target positioning codes in the two target images, and obtain respective first position information and respective second position information of the respective target positioning codes corresponding to the two target images; and matching each piece of first position information with each piece of second position information, and determining the same target positioning code in the two target images.
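A minimal sketch of how the three-dimensional coordinate information of a matched target positioning code could be recovered from its two two-dimensional coordinates, assuming calibrated 3 × 4 projection matrices P_left and P_right for the binocular camera; OpenCV triangulation is used here for illustration rather than any method mandated by the text.

import cv2
import numpy as np

def triangulate_code(corners_left, corners_right, P_left, P_right):
    """Recover n x 3 corner coordinates of one code seen in both target images."""
    pts_l = np.asarray(corners_left, dtype=np.float64).T   # 2 x n pixel coordinates, left image
    pts_r = np.asarray(corners_right, dtype=np.float64).T  # 2 x n pixel coordinates, right image
    pts_4d = cv2.triangulatePoints(P_left, P_right, pts_l, pts_r)
    return (pts_4d[:3] / pts_4d[3]).T                      # n x 3 Euclidean coordinates

In practice the correspondence between the two target images would first be established by matching the per-image position information of each code, as described above.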
Optionally, the three-dimensional coordinate information of the target positioning code includes target corner coordinates corresponding to each corner of the target positioning code, and the three-dimensional coordinate information of the reference positioning code includes reference corner coordinates corresponding to each corner of the reference positioning code; the positioning module 1306 is further configured to determine a target gravity point of the target positioning code according to at least three target corner point coordinates and a target gravity point conversion rule, and obtain a target vector of the target positioning code according to each target corner point coordinate and the target gravity point; determining a reference gravity center point of the reference positioning code according to each reference corner point coordinate corresponding to each target corner point coordinate and a reference gravity center point conversion rule, and acquiring a reference vector of the reference positioning code according to each reference corner point coordinate and the reference gravity center point; acquiring a covariance matrix according to the target vector and the reference vector; determining the position parameters and the attitude parameters of the positioning markers 1310 corresponding to the imaging device according to the singular value decomposition result obtained by the covariance matrix; the target gravity center point conversion rule is expressed as:
\bar{P} = \frac{1}{n} \sum_{i=1}^{n} P_i

wherein \bar{P} represents the target gravity center point, P_i represents the target corner point coordinates, n is the number of the corner points of the target positioning code, and n is not less than 3;

the reference gravity center point conversion rule is expressed as:

\bar{Q} = \frac{1}{n} \sum_{i=1}^{n} Q_i

wherein \bar{Q} represents the reference gravity center point, Q_i represents the reference corner point coordinates, n is the number of the corner points of the reference positioning code, and n is not less than 3;

the target vector is represented as:

P'_i = P_i - \bar{P}

the reference vector is represented as:

Q'_i = Q_i - \bar{Q}

the covariance matrix is expressed as:

H = X Y^{T}

wherein H represents the covariance matrix, X is the matrix of 3 × n columns formed by the target vectors P'_i, Y is the matrix of 3 × n columns formed by the reference vectors Q'_i, and the superscript T denotes the transposed matrix;

the singular value decomposition result of the covariance matrix is expressed as:

H = U S V^{T}

wherein U and V are the parameters to be solved;

the localization marker 1310 corresponds to the pose parameters of the imaging device expressed as:

R = V U^{T}

wherein R represents the attitude parameter;

the positional parameters of the localization marker 1310 corresponding to the imaging device are represented as:

t = \bar{Q} - R \bar{P}

wherein t represents the location parameter.
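A minimal numerical sketch, under the assumption of corresponding n × 3 arrays of target and reference corner coordinates (n not less than 3), of the centroid/covariance/SVD alignment outlined above; the determinant-based reflection guard is an implementation detail added here and not stated in the text.

import numpy as np

def estimate_pose(target_pts, reference_pts):
    """Kabsch-style rigid alignment: returns attitude (rotation) and position (translation)."""
    P = np.asarray(target_pts, dtype=np.float64)
    Q = np.asarray(reference_pts, dtype=np.float64)
    # Gravity center points of the target and reference corner coordinates
    p_bar, q_bar = P.mean(axis=0), Q.mean(axis=0)
    # Target vectors and reference vectors as 3 x n matrices of centred coordinates
    X, Y = (P - p_bar).T, (Q - q_bar).T
    # Covariance matrix and its singular value decomposition
    H = X @ Y.T
    U, S, Vt = np.linalg.svd(H)
    # Attitude parameter (rotation), with a reflection guard on the determinant
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    # Position parameter (translation)
    t = q_bar - R @ p_bar
    return R, t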
In addition, the positioning apparatus based on the positioning marker 1310 according to the embodiment of the present invention can also be used to implement other steps in the foregoing positioning method based on the positioning marker, and has the beneficial effects of the corresponding method step embodiments, which are not described herein again.
In summary, the positioning method, the positioning device and the computer storage medium based on the positioning marker provided in the embodiments of the present application acquire images of the reference positioning identifier on the positioning marker to obtain the target images, identify the target positioning identifier in each target image according to the reference positioning identifier, and determine the target positioning result of the positioning marker based on the positions of the reference positioning identifier and the target positioning identifier in their respective coordinate systems. The present application can therefore provide an accurate positioning effect, and the cylindrical design of the positioning marker allows the marker to be tracked over a wider range of angular postures, improving the stability of the tracking effect.
In addition, by arranging the positioning marker at the end of a robot arm used for performing oral surgery and adopting the positioning technique of the present application, the positional relationship between the end of the robot arm and the patient's oral cavity can be determined accurately, so that the robot arm can be guided to accurately complete drilling, polishing and other oral surgery operations in the patient's oral cavity.
It should be noted that, according to the implementation requirement, each component/step described in the embodiment of the present invention may be divided into more components/steps, and two or more components/steps or partial operations of the components/steps may also be combined into a new component/step to achieve the purpose of the embodiment of the present invention.
The above-described method according to an embodiment of the present invention may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded through a network to be stored in a local recording medium, so that the method described herein can be processed, as software stored on a recording medium, by a general-purpose computer, a dedicated processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be appreciated that the computer, processor, microprocessor controller or programmable hardware includes memory components (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the positioning method described herein. Further, when a general-purpose computer accesses code for implementing the positioning method shown here, execution of the code transforms the general-purpose computer into a special-purpose computer for executing the positioning method shown here.
Those of ordinary skill in the art will appreciate that the various illustrative elements and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The above embodiments are only for illustrating the embodiments of the present invention and not for limiting the embodiments of the present invention, and those skilled in the art can make various changes and modifications without departing from the spirit and scope of the embodiments of the present invention, so that all equivalent technical solutions also belong to the scope of the embodiments of the present invention, and the scope of patent protection of the embodiments of the present invention should be defined by the claims.

Claims (23)

1. A localization method based on localization markers, comprising:
acquiring reference positioning marks on the positioning markers to obtain at least two target images;
identifying target positioning marks in the target images according to the reference positioning marks;
and determining a target positioning result of the positioning marker according to the positions of the target positioning markers in the target images and the positions of the reference positioning markers on the positioning marker.
2. The localization marker-based localization method according to claim 1, wherein the localization marker is cylindrical, and the reference localization marker comprises at least one reference localization code provided on the localization marker; wherein,
the reference positioning code comprises an edge area and an identification area, wherein the edge area surrounds the identification area, and the identification area is at least formed by a 3-by-3 grid array.
3. The localization marker-based localization method according to claim 2, wherein the localization marker is disposed at an end of a robot arm, and the identification region of the reference localization code is formed by a 4 x 4 grid array.
4. The localization marker-based localization method according to claim 2, wherein the reference localization signature comprises a plurality of the reference localization codes circumferentially distributed on the localization marker; wherein,
the reference positioning codes comprise two-dimensional codes, the Hamming distance between any two reference positioning codes is not less than 1, and the internal Hamming distance of each reference positioning code is not less than 1.
5. The localization marker-based localization method according to claim 2, wherein the acquiring of the reference localization markers on the localization markers and the acquiring of the at least two target images comprises:
synchronously acquiring the reference positioning code on the positioning marker by using imaging equipment to obtain two target images;
wherein the imaging device includes one of a binocular camera and a multi-view camera.
6. The positioning method based on positioning markers according to claim 5, wherein the identifying the target positioning markers in each target image according to the reference positioning markers comprises:
and identifying the two target images, and acquiring a target positioning code matched with the reference positioning code in each target image.
7. The positioning method based on positioning markers according to claim 6, wherein the identifying the two target images and the obtaining the target positioning code matching the reference positioning code in each target image comprises:
for each of the target images, performing the steps of:
determining each connected region in each edge point of the target image according to each edge point identified from the target image;
screening each connected region based on a preset screening rule, and determining a candidate region in the target image;
performing projection conversion on the candidate region to obtain a conversion region of the candidate region;
and identifying the conversion area, and if the identification result of the conversion area is matched with the reference positioning code, determining the conversion area as the target positioning code.
8. The method according to claim 7, wherein the determining the connected regions of the edge points of the target image according to the edge points identified from the target image comprises:
obtaining gradients and gradient angles corresponding to the pixel points in the target image by using a sobel operator;
determining each edge point in each pixel point of the target image according to a preset gradient high threshold value, a preset gradient low threshold value, each gradient corresponding to each pixel point and each gradient angle;
based on the edge points, the connected regions are determined.
9. The method of claim 8, wherein obtaining the gradients and gradient angles corresponding to the pixel points in the target image by using a sobel operator comprises:
obtaining gradient values in each direction corresponding to each pixel point according to the sobel operator matrix and each gray value corresponding to each pixel point;
obtaining each gradient corresponding to each pixel point according to each direction gradient value corresponding to each pixel point and a preset gradient conversion rule;
obtaining each gradient angle corresponding to each pixel point according to each direction gradient value corresponding to each pixel point and a preset gradient angle conversion rule;
the sobel operator matrix is expressed as:
G_x = \begin{bmatrix} -1 & 0 & +1 \\ -2 & 0 & +2 \\ -1 & 0 & +1 \end{bmatrix} * A , \qquad G_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ +1 & +2 & +1 \end{bmatrix} * A

wherein G_x represents the directional gradient value of each pixel point along the X direction of the target image, G_y represents the directional gradient value of each pixel point along the Y direction of the target image, * is the convolution symbol, and A represents the gray values corresponding to the pixel points in the target image;

the preset gradient conversion rule is expressed as:

G = \sqrt{G_x^2 + G_y^2}

wherein G represents the gradient corresponding to each pixel point;

the preset gradient angle conversion rule is expressed as:

\theta = \arctan( G_y / G_x )

wherein \theta represents the gradient angle corresponding to each pixel point.
10. The localization marker-based localization method according to claim 8, wherein the edge points comprise a first edge point and a second edge point; wherein,
determining each edge point in each pixel point according to a preset gradient high threshold, a preset gradient low threshold, each gradient and each gradient angle corresponding to each pixel point comprises:
according to the preset gradient high threshold, determining each pixel point with the gradient larger than the gradient high threshold as the first edge point;
according to the preset gradient high threshold and the preset gradient low threshold, if the gradient of the pixel point is between the preset gradient high threshold and the preset gradient low threshold, the first edge point adjacent to the pixel point exists at the same time, and the difference value between the gradient angle of the pixel point and the gradient angle of the first edge point is smaller than a preset difference value, the pixel point is determined as the second edge point.
11. The localization marker-based localization method according to claim 10, wherein the preset difference is not more than eight degrees.
12. The localization marker-based localization method according to claim 7, wherein the screening each of the connected regions based on a preset screening rule to determine the candidate region in the target image comprises:
for each of the connected regions, determining the connected region as the candidate region if the connected region can be fitted to a quadrilateral region.
13. The localization marker-based localization method according to claim 7, wherein the performing projection transformation on the candidate region, and obtaining the transformation region of the candidate region comprises:
determining the relative vertex positions of the reference positioning codes according to the preset side lengths of the reference positioning codes;
acquiring a homography matrix according to the target vertex positions of the candidate area relative to the target image and the corresponding vertex positions of the reference positioning code;
and performing projection conversion on the candidate region by using the homography matrix to obtain a conversion region of the candidate region.
14. The positioning method according to claim 7, wherein the identifying the transition region, and if the identification result of the transition region matches the reference positioning code, determining the transition region as the target positioning code comprises:
identifying the conversion area, and obtaining an identification code of the conversion area;
and comparing the identification code with the reference positioning code, responding to a comparison result that the identification code is matched with the reference positioning code, and determining the conversion area as the target positioning code.
15. The localization marker-based localization method according to claim 7, wherein determining the target localization result of the localization marker according to the positions of the target localization markers in the respective target images and the positions of the reference localization markers on the localization marker comprises:
obtaining intermediate positioning information of the target positioning identifier according to the position of the target positioning identifier in each target image, and obtaining reference positioning information of the reference positioning identifier according to the position of the reference positioning identifier on the positioning marker;
and determining a target positioning result of the positioning marker according to the intermediate positioning information and the reference positioning information.
16. The localization marker-based localization method according to claim 15,
the obtaining of the intermediate positioning information of the target positioning identifier according to the position of the target positioning identifier in each target image includes:
identifying the coordinate positions of the same target positioning code in the two target images, acquiring two-dimensional coordinate information of the target positioning code corresponding to the two target images, and acquiring three-dimensional coordinate information of the target positioning code according to the two-dimensional coordinate information;
the obtaining the reference positioning information of the reference positioning identifier according to the position of the reference positioning identifier on the positioning marker comprises:
and identifying the coordinate position of the reference positioning code under the local coordinate system of the positioning marker to obtain the three-dimensional coordinate information of the reference positioning code.
17. The localization marker-based localization method of claim 16, further comprising:
identifying each position corresponding to each target positioning code in the two target images to obtain each first position information and each second position information of each target positioning code corresponding to the two target images;
and matching each piece of first position information with each piece of second position information, and determining the same target positioning code in the two target images.
18. The positioning method based on positioning markers according to claim 16, wherein the three-dimensional coordinate information of the target positioning code comprises target corner point coordinates corresponding to each corner point of the target positioning code, the three-dimensional coordinate information of the reference positioning code comprises reference corner point coordinates corresponding to each corner point of the reference positioning code, and each target corner point coordinate corresponds to each reference corner point coordinate; wherein,
the determining a target positioning result of the positioning marker according to the intermediate positioning information and the reference positioning information includes:
determining a target gravity center point of the target positioning code according to at least three target corner point coordinates and a target gravity center point conversion rule, and acquiring a target vector of the target positioning code according to each target corner point coordinate and the target gravity center point;
determining a reference gravity center point of the reference positioning code according to each reference corner point coordinate corresponding to each target corner point coordinate and a reference gravity center point conversion rule, and acquiring a reference vector of the reference positioning code according to each reference corner point coordinate and the reference gravity center point;
acquiring a covariance matrix according to the target vector and the reference vector;
determining attitude parameters and position parameters of the positioning markers corresponding to the imaging equipment according to singular value decomposition results obtained by the covariance matrix;
the target gravity center point conversion rule is expressed as:
\bar{P} = \frac{1}{n} \sum_{i=1}^{n} P_i

wherein \bar{P} represents the target gravity center point, P_i represents the target corner point coordinates, n is the number of the corner points of the target positioning code, and n is not less than 3;

the reference gravity center point conversion rule is expressed as:

\bar{Q} = \frac{1}{n} \sum_{i=1}^{n} Q_i

wherein \bar{Q} represents the reference gravity center point, Q_i represents the reference corner point coordinates, n is the number of the corner points of the reference positioning code, and n is not less than 3;

the target vector is represented as:

P'_i = P_i - \bar{P}

the reference vector is represented as:

Q'_i = Q_i - \bar{Q}

the covariance matrix is expressed as:

H = X Y^{T}

wherein H represents the covariance matrix, X is the matrix of 3 × n columns formed by the target vectors P'_i, Y is the matrix of 3 × n columns formed by the reference vectors Q'_i, and the superscript T denotes the transposed matrix;

the singular value decomposition result of the covariance matrix is expressed as:

H = U S V^{T}

wherein U and V are the parameters to be solved;

the pose parameters of the localization markers corresponding to the imaging device are expressed as:

R = V U^{T}

wherein R represents the pose parameters;

the positional parameters of the localization markers corresponding to the imaging device are expressed as:

t = \bar{Q} - R \bar{P}

wherein t represents the location parameter.
19. The localization marker-based localization method according to claim 2, wherein the reference localization signature further comprises one or more reference identity codes circumferentially distributed on the localization marker; wherein, each reference identity code has the same two-dimensional code.
20. The localization marker-based localization method of claim 19, further comprising:
acquiring the reference identity codes on the positioning markers to obtain at least two target images;
and identifying the target identity code in each target image according to the reference identity code so as to determine the identity information of the positioning marker.
21. A computer storage medium having stored thereon computer instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1 to 20.
22. A localization device based on localization markers, comprising:
the acquisition module is used for acquiring reference positioning marks on the positioning markers and acquiring at least two target images;
the identification module is used for identifying the target positioning identifier in each target image according to the reference positioning identifier;
and the positioning module is used for determining a target positioning result of the positioning marker according to the positions of the target positioning markers in the target images and the positions of the reference positioning markers on the positioning marker.
23. The positioning device of claim 22,
the positioning marker is arranged at the tail end of the mechanical arm;
the acquisition module comprises one of a binocular camera and a multi-view camera.
CN202111280102.XA 2021-11-01 2021-11-01 Positioning method and device based on positioning marker and computer storage medium Active CN113712665B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111280102.XA CN113712665B (en) 2021-11-01 2021-11-01 Positioning method and device based on positioning marker and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111280102.XA CN113712665B (en) 2021-11-01 2021-11-01 Positioning method and device based on positioning marker and computer storage medium

Publications (2)

Publication Number Publication Date
CN113712665A true CN113712665A (en) 2021-11-30
CN113712665B CN113712665B (en) 2022-04-22

Family

ID=78686281

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111280102.XA Active CN113712665B (en) 2021-11-01 2021-11-01 Positioning method and device based on positioning marker and computer storage medium

Country Status (1)

Country Link
CN (1) CN113712665B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114199252A (en) * 2021-12-06 2022-03-18 北京云迹科技股份有限公司 Indoor positioning method and device, electronic equipment and storage medium
CN114523471A (en) * 2022-01-07 2022-05-24 中国人民解放军海军军医大学第一附属医院 Error detection method based on associated identification and robot system
CN114536402A (en) * 2022-02-16 2022-05-27 中国医学科学院北京协和医院 Robot system fault detection processing method based on associated identification and robot system
CN116687569A (en) * 2023-07-28 2023-09-05 深圳卡尔文科技有限公司 Coded identification operation navigation method, system and storage medium
CN116843748A (en) * 2023-09-01 2023-10-03 上海仙工智能科技有限公司 Remote two-dimensional code and object space pose acquisition method and system thereof
WO2024001847A1 (en) * 2022-06-28 2024-01-04 中兴通讯股份有限公司 2d marker, and indoor positioning method and apparatus

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040002642A1 (en) * 2002-07-01 2004-01-01 Doron Dekel Video pose tracking system and method
US20100168763A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Configuration marker design and detection for instrument tracking
WO2015118157A1 (en) * 2014-02-10 2015-08-13 Navigate Surgical Technologies, Inc. System and method for determining the three-dimensional location and orientation of identification markers
CN106137131A (en) * 2016-07-18 2016-11-23 山东省肿瘤防治研究院 A kind of noinvasive tumor-localizing system
EP3733360A1 (en) * 2019-04-29 2020-11-04 Arrival Limited A system for preventing collisions in a robotic cell environment
CN112013850A (en) * 2020-10-16 2020-12-01 北京猎户星空科技有限公司 Positioning method, positioning device, self-moving equipment and storage medium
CN113499166A (en) * 2021-06-21 2021-10-15 西安交通大学 Autonomous stereoscopic vision navigation method and system for corneal transplantation surgical robot

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040002642A1 (en) * 2002-07-01 2004-01-01 Doron Dekel Video pose tracking system and method
US20100168763A1 (en) * 2008-12-31 2010-07-01 Intuitive Surgical, Inc. Configuration marker design and detection for instrument tracking
CN102341054A (en) * 2008-12-31 2012-02-01 直观外科手术操作公司 Configuration marker design and detection for instrument tracking
WO2015118157A1 (en) * 2014-02-10 2015-08-13 Navigate Surgical Technologies, Inc. System and method for determining the three-dimensional location and orientation of identification markers
CN106137131A (en) * 2016-07-18 2016-11-23 山东省肿瘤防治研究院 A kind of noinvasive tumor-localizing system
EP3733360A1 (en) * 2019-04-29 2020-11-04 Arrival Limited A system for preventing collisions in a robotic cell environment
CN112013850A (en) * 2020-10-16 2020-12-01 北京猎户星空科技有限公司 Positioning method, positioning device, self-moving equipment and storage medium
CN113499166A (en) * 2021-06-21 2021-10-15 西安交通大学 Autonomous stereoscopic vision navigation method and system for corneal transplantation surgical robot

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114199252A (en) * 2021-12-06 2022-03-18 北京云迹科技股份有限公司 Indoor positioning method and device, electronic equipment and storage medium
CN114199252B (en) * 2021-12-06 2024-02-09 北京云迹科技股份有限公司 Indoor positioning method and device, electronic equipment and storage medium
CN114523471A (en) * 2022-01-07 2022-05-24 中国人民解放军海军军医大学第一附属医院 Error detection method based on associated identification and robot system
EP4209312A1 (en) * 2022-01-07 2023-07-12 The First Affiliated Hospital of Naval Medical University Error detection method and robot system based on association identification
CN114536402A (en) * 2022-02-16 2022-05-27 中国医学科学院北京协和医院 Robot system fault detection processing method based on associated identification and robot system
CN114536402B (en) * 2022-02-16 2024-04-09 中国医学科学院北京协和医院 Robot system fault detection processing method based on association identification and robot system
WO2024001847A1 (en) * 2022-06-28 2024-01-04 中兴通讯股份有限公司 2d marker, and indoor positioning method and apparatus
CN116687569A (en) * 2023-07-28 2023-09-05 深圳卡尔文科技有限公司 Coded identification operation navigation method, system and storage medium
CN116687569B (en) * 2023-07-28 2023-10-03 深圳卡尔文科技有限公司 Coded identification operation navigation method, system and storage medium
CN116843748A (en) * 2023-09-01 2023-10-03 上海仙工智能科技有限公司 Remote two-dimensional code and object space pose acquisition method and system thereof
CN116843748B (en) * 2023-09-01 2023-11-24 上海仙工智能科技有限公司 Remote two-dimensional code and object space pose acquisition method and system thereof

Also Published As

Publication number Publication date
CN113712665B (en) 2022-04-22

Similar Documents

Publication Publication Date Title
CN113712665B (en) Positioning method and device based on positioning marker and computer storage medium
CN106372702B (en) Positioning identifier and positioning method thereof
Ha et al. Deltille grids for geometric camera calibration
US9292755B2 (en) Shape detection and ellipse fitting of a polygon
CN107609451A (en) A kind of high-precision vision localization method and system based on Quick Response Code
CN101162524B (en) Image-processing apparatus and method
WO2018133130A1 (en) 3d marker model construction and real-time tracking using monocular camera
JP4708752B2 (en) Information processing method and apparatus
US7812871B2 (en) Index identification method and apparatus
CN107481276B (en) Automatic identification method for marker point sequence in three-dimensional medical image
CN109215016B (en) Identification and positioning method for coding mark
CN104766309A (en) Plane feature point navigation and positioning method and device
JP2008046687A (en) Photographic environment calibration method and information processor
KR20180105875A (en) Camera calibration method using single image and apparatus therefor
JP4502361B2 (en) Index attitude detection method and apparatus
JP2017003525A (en) Three-dimensional measuring device
CN104166995B (en) Harris-SIFT binocular vision positioning method based on horse pace measurement
JP5988364B2 (en) Image processing apparatus and method
KR101673144B1 (en) Stereoscopic image registration method based on a partial linear method
CN115147588A (en) Data processing method and device, tracking mark, electronic device and storage medium
CN111179347B (en) Positioning method, positioning equipment and storage medium based on regional characteristics
CN111667429B (en) Target positioning correction method for inspection robot
CN114926542A (en) Mixed reality fixed reference system calibration method based on optical positioning system
CN108229625B (en) Coding method and device
JP6734213B2 (en) Information processing device and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: 100191 Room 501, floor 5, building 9, No. 35 Huayuan North Road, Haidian District, Beijing

Patentee after: Beijing Baihui Weikang Technology Co.,Ltd.

Address before: 100191 Room 608, 6 / F, building 9, 35 Huayuan North Road, Haidian District, Beijing

Patentee before: Beijing Baihui Wei Kang Technology Co.,Ltd.