CN216535498U - Positioning device based on object in space - Google Patents
Positioning device based on object in space
- Publication number
- CN216535498U (application CN202120678700.1U)
- Authority
- CN
- China
- Prior art keywords
- positioning device
- space
- display board
- limiting
- positioning
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Length Measuring Devices By Optical Means (AREA)
Abstract
The utility model discloses a positioning device based on an object in space, comprising: a support part provided with a feature part and a limiting part. The feature part comprises a connecting mechanism and a display board; the connecting mechanism is connected with the support part, and the display board carries an optical feature for camera capture and recognition. The limiting part is arranged on one side of the feature part and is used to constrain and spatially position an object to be positioned. For objects in a specific scene, by using the recognition features of objects with different error characteristics in the same scene, two objects can be brought into a specific angular and positional relationship; through the spatial association of the two corresponding objects, images of the two different objects can be acquired and their positions mutually corrected, so that the optical positioning accuracy of one or both objects is improved. The resulting accurately located and oriented augmented reality helps the user carry out an accurate and complete operation.
Description
Technical Field
The utility model relates to the technical field of image processing, in particular to a positioning device based on an object in space.
Background
Augmented reality technology generally captures an image of a real scene through a camera, analyzes and processes the captured image, and overlays additional information on the real scene for display to the user. The process of analyzing and processing images of a real scene often includes locating objects in the scene. Under certain specific requirements, the accuracy demanded of positioning objects in the scene is extremely high, and the positioning accuracy achievable in the prior art cannot meet this requirement.
For example, when augmented reality technology is applied to surgical navigation, the positional relationship between the medical instruments, the patient and the scene must be determined very accurately to ensure that accurate navigation information is provided to the user. Puncture navigation based on augmented reality can deliver fast and accurate surgical navigation with simple equipment that is easy to learn and use. One of the cores of such precise navigation is accurate spatial localization of the surgical instruments based on visible-light patterns, together with registration of the virtual organ to the real human body. Before localization, however, rapid positional matching and correction are essential: in a specific operation, the positions of the surgical instrument and the camera must first be calibrated, otherwise the accuracy of the navigated positions throughout the subsequent navigation process, and hence the accuracy of the later augmented display, cannot be guaranteed. Calibrating the surgical instrument against the camera relies on accurate spatial localization of a recognizable pattern on the object to be positioned. Owing to design constraints of the instrument, recognizable patterns of different sizes and shapes have different intrinsic spatial localization accuracies, determined by the spatial distribution of their feature points or by the production process, which makes positioning of the surgical instrument very inconvenient.
SUMMARY OF THE UTILITY MODEL
In view of the above-mentioned drawbacks or shortcomings, it is an object of the present invention to provide a positioning device based on an object in space.
In order to achieve the above purpose, the technical scheme of the utility model is as follows:
A positioning device based on an object in space comprises: a support part, and a characteristic part and a limiting part arranged on the support part;
the characteristic part comprises a display board, the display board is connected with the supporting part, and optical characteristics used for shooting and identifying are arranged on the display board;
the limiting part is arranged to limit an object to be positioned.
The characteristic part further comprises a connecting mechanism, and the display board is connected with the supporting part through the connecting mechanism.
The connecting mechanism comprises an articulated mechanism, and the display board can be flip-mounted on the support part through the articulated mechanism.
The optical features include one or any combination of specific patterns, structures, colors for being optically recognized.
The optical feature is a pattern attached to or printed on the display board, and the pattern is a two-dimensional code.
The limiting part is arranged on one side of the characteristic part, and when the object to be positioned is moved to a preset position, the limiting part limits the object to be positioned and forms a specific spatial position relation with the object to be positioned.
The limiting part is of a detachable structure and can be installed on one side of the characteristic part or replaced.
The limiting part is a cylindrical structure, a positioning groove is formed in the cylindrical structure, and an opening spanning a horizontal angle α is formed in the positioning groove.
A through hole or a blind hole for limiting the object to be positioned is arranged along the central axis of the cylindrical structure.
The display board is provided with a shielding piece for shielding the optical characteristic piece.
The object to be positioned is a surgical instrument.
The object to be positioned is a puncture needle.
Compared with the prior art, the utility model has the beneficial effects that:
the utility model provides a positioning device based on objects in space: for objects in a specific scene, by using the recognition features of objects with different error characteristics in the same scene, two objects can be brought into a specific angular and positional relationship; through the spatial association of the two corresponding objects, images of the two different objects can then be acquired and their positions mutually corrected, improving the optical positioning accuracy of one or both of them.
Drawings
FIG. 1 is a first schematic view of the positioning device based on an object in space according to the utility model;
FIG. 2 is a second schematic view of the positioning device based on an object in space according to the utility model;
FIG. 3 is a third schematic view of the positioning device based on an object in space according to the utility model;
FIG. 4 is a flow chart of a positioning method of the present invention;
FIG. 5 is an exemplary illustration of an embodiment of the utility model;
FIG. 6 is a schematic diagram of the mutual calibration according to the utility model.
Detailed Description
The present invention will now be described in detail with reference to the drawings, wherein the described embodiments are only some, but not all embodiments of the utility model. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, belong to the scope of the present invention.
Example 1
In a precision surgical scenario, the actual position of an object and its position in the image must be determined accurately. Under certain specific requirements the accuracy demanded of positioning objects in the scene is extremely high; for example, in a medical procedure, the positional relationship between the medical instrument, the patient and the scene must be determined accurately so that accurate navigation information can be provided to the user. Based on this requirement, the utility model provides an augmented reality approach based on correcting the position of an object in space, which can be applied to surgical scenes, simulated surgical scenes for teaching, and positioning during games.
Taking a surgical implementation scenario as an example, the utility model provides the user with localization of a surgical instrument inside a subject. The user is the observer of the entire in-vivo navigation process and is also the operator who advances the instrument into the subject. The subject may be a person or other animal on which the user needs to operate. The instrument may be any tool that can be introduced into the body of the subject, for example a medical instrument such as a puncture needle, biopsy needle, radio-frequency or microwave ablation needle, ultrasound probe, rigid endoscope, endoscopic oval forceps, electric knife, or stapler. Preferably, the positioning device is a fixture in the surgical scene, and the object to be positioned is a medical instrument in the surgical scene.
As shown in fig. 1, the utility model provides a positioning device based on an object in space, comprising: a support part 1, and a feature part 2 and a limiting part 3 arranged on the support part 1;
the feature part 2 comprises a display board 22, the display board 22 is connected with the support part 1, and an optical feature part 221 used for shooting and recognizing is arranged on the display board 22;
the limiting part 3 is arranged on one side of the characteristic part 2 and used for limiting an object to be positioned.
Preferably, in the utility model, the feature part 2 further comprises a connecting mechanism 21, and the display board 22 is connected with the support part 1 through the connecting mechanism 21. The display board 22 can be mounted on the support part 1 through the connecting mechanism 21 so that it can be turned over. The connecting mechanism 21 may be a hinge structure connected to the support part 1 so as to fold over, or may be articulated through a hinge piece. When the angle of the display board 22 needs to be adjusted, the adjustment is achieved by rotating the connecting mechanism 21 until the board reaches the angle best suited for image capture.
Further, as shown in FIG. 2, the display board 22 is provided with a shielding member 222 for shielding the optical feature. The shielding member 222 may be a baffle that opens and closes vertically or horizontally; it can be opened when a planar identification object is to be recognized and closed to shield the planar identification object when a needle-shaped identification object is to be recognized.
The optical feature 221 includes one or any combination of specific patterns, structures and colors to be recognized. Patterns, figures, structures or two-dimensional codes can be applied to the positioning device by a printing process, and recognizable patterns have different spatial accuracies depending on their layout rules and production characteristics. Combining recognizable patterns with different characteristics makes rapid spatial calibration possible. Illustratively, in the utility model, the optical feature is a pattern attached to or printed on the display board 22, and the pattern is a two-dimensional code. The camera recognizes the two-dimensional code, and the spatial position information of the object is calculated from the position information of several feature points on the two-dimensional code pattern, for example as sketched below. The spatial position information comprises one or more of the spatial coordinates and the placement orientation of the positioning device, so that the fixed positioning device can be precisely located in space.
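As an illustration only (the patent does not prescribe an algorithm), recovering the board pose from the detected corner points of such a code could be sketched with OpenCV as follows; the marker size, corner layout and camera intrinsics are assumptions:

```python
# Minimal sketch: estimate the display-board pose from the four detected
# corner pixels of the printed two-dimensional code.  Marker edge length,
# corner order and camera intrinsics are illustrative assumptions.
import cv2
import numpy as np

MARKER_SIZE = 0.04  # assumed edge length of the printed code, in metres

# 3D coordinates of the four code corners in the display-board frame (z = 0 plane)
OBJECT_POINTS = np.array([
    [0.0,         0.0,         0.0],
    [MARKER_SIZE, 0.0,         0.0],
    [MARKER_SIZE, MARKER_SIZE, 0.0],
    [0.0,         MARKER_SIZE, 0.0],
], dtype=np.float64)

def locate_display_board(corner_pixels, camera_matrix, dist_coeffs):
    """Return rotation (rvec) and translation (tvec) of the board in the
    camera frame, computed from the detected corner pixels of the code."""
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS,
                                  np.asarray(corner_pixels, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    return rvec, tvec
```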
In the utility model, the limiting part 3 is a detachable structure that can be mounted on one side of the feature part or replaced. For example, the limiting part 3 may be separated from the overall structure according to sterilization/disinfection requirements: the overall structure can be reused and only needs disinfection, while the limiting part 3, which will contact the sterile surgical instrument, may be a sterilized, single-use component. At the site of use the limiting part 3 is combined with the overall structure, which improves intra-operative safety. In addition, in the utility model, the limiting part is arranged on one side of the feature part; when the object to be positioned is moved to a preset position, the limiting part constrains the object to be positioned and forms a specific spatial positional relationship with it. For application scenarios during surgery, the camera is a head-mounted optical camera. When a user wears the head-mounted optical camera, its acquisition angle stays consistent with the user's viewing direction regardless of the user's posture.
The object to be positioned is a surgical instrument, and the marker fixed on the surface of the instrument may also be a three-dimensional figure; for example, during the design and production of the instrument, the marker pattern may be the handle of the instrument or some structure fixed on the side of the handle. Spatial positioning using a three-dimensional pattern requires a longer computation time than a planar pattern, but gives higher spatial positioning accuracy for a stationary or slowly moving target. The instrument may be, for example, a medical instrument such as a puncture needle, biopsy needle, radio-frequency or microwave ablation needle, ultrasound probe, rigid endoscope, endoscopic oval forceps, electric knife, or stapler. As shown in fig. 2, taking the puncture needle as an example, when the needle tip of the puncture needle moves to the limiting part, the needle tip and the positioning device form a specific spatial positional relationship for positioning, and the position information of the needle tip is corrected according to the position information of the limiting part.
Illustratively, as shown in fig. 3, the limiting part 3 is a cylindrical structure, a positioning groove 31 for positioning is formed at the top end of the cylindrical structure, and an opening spanning a horizontal angle α is formed in the positioning groove 31. With this structure, when the needle tip sits in the opening of the positioning groove, the needle body can be held by hand anywhere within the 90-degree range from vertical to horizontal, and the needle tip will not slide off the surface as long as the needle body is moved within the horizontal angle α. In another embodiment, a through hole or a blind hole is arranged along the central axis of the cylindrical structure so that it can accommodate the inserted puncture needle, thereby constraining the straight line on which the needle body lies.
Example 2
As shown in fig. 4, a method for positioning an object in a space includes:
s1, capturing a positioning device image in the space, and identifying the identification characteristic of the positioning device in the positioning device image to obtain the spatial position information of the positioning device;
In order to calibrate the position of an object to be positioned, the specific spatial position information of a fixed object is first acquired. This spatial position information comprises at least the spatial coordinates and/or the orientation of the positioning device, so that the fixed positioning device can be precisely located in space.
In this embodiment, the identification characteristics of the positioning device comprise at least the morphological characteristics of the positioning device body and/or the marker identification characteristics of the positioning device. The morphological characteristics of the positioning device body comprise at least its structure, shape and color, but in a specific implementation they are not limited to this and may be other recognizable characteristics of the object. For example, the utility model can fix an object of fixed shape in place; before calibration, the structure and shape of the positioning device are first recognized, and during recognition the user can be prompted through different display modes as to whether the capture and recognition succeeded. The positioning device is then located and identified to obtain accurate spatial position information of the surgical positioning device.
In addition, in the present invention, the mark identification characteristic of the positioning device at least includes a pattern, a figure or a two-dimensional code provided on the positioning device. The patterns, the graphs or the two-dimensional codes can be arranged on the positioning device through a printing process, and the identifiable patterns have different space accuracy according to the rules and the production characteristics of the patterns. The combination of recognizable patterns with different characteristics is fully utilized to realize the rapid space calibration of the navigation instrument.
Illustratively, in the utility model, a rectangular information board printed with a two-dimensional code may be used. The device that captures the image of the positioning device is any device capable of image acquisition whose acquisition angle is consistent with the user's viewing direction. In use, the user may wear the camera on the body, for example on the head. Optionally, the camera is a head-mounted optical camera; when worn, its acquisition angle stays consistent with the user's viewing direction regardless of the user's posture. The method comprises: acquiring an image of the positioning device with the camera, recognizing the marker identification characteristics of the positioning device, obtaining the morphological characteristics of the positioning device body from them, and obtaining the position and orientation of the positioning device in an XYZ spatial coordinate system, where the Z coordinate is the coordinate along the depth direction of the camera and the X and Y coordinates are perpendicular to the Z axis. The current spatial coordinates of the positioning device are recorded as X1, Y1 and Z1, for example as in the sketch below.
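Assuming the OpenCV rotation-vector/translation output from the earlier sketch, one illustrative way (not specified in the patent) to express the device pose as the transform T_{C←Q} used later, and to read off (X1, Y1, Z1), is:

```python
# Sketch: turn an OpenCV (rvec, tvec) pose into the 4x4 homogeneous
# transform T_{C<-Q}; its translation column gives the device coordinates
# (X1, Y1, Z1) in the camera frame.  Variable names are illustrative.
import cv2
import numpy as np

def pose_to_matrix(rvec, tvec):
    """Build the transform from the positioning-device frame Q to the camera
    frame C out of an OpenCV rotation vector and translation vector."""
    R, _ = cv2.Rodrigues(rvec)            # 3x3 rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(tvec).ravel()
    return T

# Usage (with the earlier sketch):
# T_C_Q = pose_to_matrix(rvec, tvec)
# X1, Y1, Z1 = T_C_Q[:3, 3]               # spatial coordinates of the device
```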
S2, when the object to be positioned is at a preset position in space, capturing an image of the object to be positioned in space, and identifying the identification characteristics of the object to be positioned in that image to obtain its spatial position information;
In a specific surgical scene an instrument is required for the operation; the object to be positioned is the moving instrument, and its spatial position information comprises its spatial coordinates and/or its orientation.
The identification characteristics of the object to be positioned at least comprise the morphological characteristics of the body of the object to be positioned and/or the identification characteristics of the mark of the object to be positioned; the morphological characteristics of the body of the object to be positioned at least comprise the structure, the shape or the color of the body of the object to be positioned; the mark identification characteristics of the object to be positioned at least comprise patterns, graphs or two-dimensional codes arranged on the object to be positioned.
The two-dimensional code is a black-and-white planar pattern; its corner points are very easy to identify, and the code can be located by identifying at least 3 of its points. Since the two-dimensional code is fixed to the object or instrument, the object or instrument to which it is fixed can thereby be located.
Alternatively, the marker identification characteristic of the object to be positioned may also be another planar pattern such as a checkerboard. Using a two-dimensional code or a checkerboard as the marker allows the object or instrument to be located more accurately and quickly, so that a fast-moving instrument can be navigated more accurately.
Optionally, the mark fixed on the surface of the instrument may be a three-dimensional figure, for example, in the design and production process of the instrument, the pattern of the mark may be the handle of the instrument, or some structure fixed on the side of the handle. The calculation time required for recognition is longer than that of a plane figure, but the spatial positioning accuracy of a fixed or slow moving target is higher.
Illustratively, the object to be positioned in the utility model is a puncture needle in an operation, and the end part of the puncture needle is provided with an identification structure and printed with a two-dimensional code.
Capturing an image of the object to be positioned when it is at the preset position in space specifically comprises the following:
The positioning device is fixedly arranged in space and the object to be positioned is a moving object; when the object to be positioned moves to the preset position in space, an image of it is captured. The preset position in space can be defined as the position at which the object to be positioned coincides with a preset position of the positioning device, or, according to the actual operating requirements, as the moment when a certain part of the object to be positioned reaches a fixed position or completes a specified action.
Specifically: the positioning device is fixedly arranged in space and the object to be positioned is a moving object; when the object to be positioned moves to the preset position in space, it is recognized according to its marker identification characteristics, its orientation is obtained, and its current spatial coordinates are recorded as X2, Y2 and Z2. The preset position in space is a position at which the object to be positioned has a specific positional relationship with a preset associated point, line or surface on the positioning device, the specific relationship including coincidence or partial coincidence of points, lines or surfaces.
Illustratively, the information board is used as the positioning device and the puncture needle is the object to be positioned. When the user holds the puncture needle so that the needle tip B coincides with point A on the information board, the positions of the two objects are located and mutually calibrated. In connection with Embodiment 1, referring to figs. 1 and 3, when the needle tip or needle body of the puncture needle is constrained by the limiting part 3 of the positioning device, i.e. the needle tip sits in the positioning groove 31 or the needle body is placed in the through hole or blind hole of the limiting part, the positions of the puncture needle and the positioning device are located and mutually calibrated; a sketch of such a coincidence check follows.
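One way such a coincidence/preset-position check could be expressed, as a hedged sketch with assumed names and an illustrative 2 mm tolerance (the patent itself does not prescribe a threshold):

```python
# Sketch: decide whether needle-tip point B has reached the preset position,
# i.e. coincides with calibration point A on the positioning device.
# Both points are 3D coordinates already expressed in the camera frame.
import numpy as np

def at_preset_position(p_A_camera, p_B_camera, tol=0.002):
    """True when the two optically estimated points agree to within `tol` metres."""
    return float(np.linalg.norm(np.asarray(p_A_camera) - np.asarray(p_B_camera))) < tol
```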
S3, correcting the spatial position information of the object to be positioned according to the spatial position information of the positioning device and the spatial preset position; and/or correcting the spatial position information of the positioning device according to the spatial position information of the object to be positioned;
This can comprise two processes, with the two objects corrected relative to each other according to the actual situation. For example, the theoretical position information of the object to be positioned is calculated from the spatial position information of the positioning device and the preset position in space;
correcting the spatial position information of the object to be positioned according to the theoretical position information of the object to be positioned;
and/or calculating theoretical position information of the positioning device according to the spatial position information of the object to be positioned and a spatial preset position;
and correcting the spatial position information of the positioning device according to the theoretical position information of the positioning device.
For example, as shown in fig. 5, position information of the object in space is calculated according to the captured image of the positioning device, and the coordinates of the point a are calculated according to the captured features of the positioning device (mainly the pattern features on the panel);
when a doctor holds the object to be positioned (a puncture needle) and places the needle-tip point B on point A of the positioning device, the coordinates of point B of the puncture needle can be calculated from the easily recognizable feature arranged at the end of the puncture needle;
it is known that points A and B coincide at this moment, but the coordinates of A and B obtained in steps 1 and 2 are not necessarily the same. According to the spatial geometric characteristics of the two objects, the X and Y coordinates of point A on the positioning device are accurate while its Z coordinate is relatively inaccurate, whereas the Z coordinate of point B on the object to be positioned (the puncture needle) is relatively accurate. Therefore the X2 and Y2 coordinates of the object to be positioned are corrected from the X1 and Y1 coordinates of the positioning device, and the Z1 coordinate of the positioning device is corrected from the Z2 coordinate of the object to be positioned. The corresponding positions of the two structures in the database are adjusted as follows (a sketch follows the formula):
X2=X1;Y2=Y1;Z1=Z2;
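A minimal sketch of this per-axis adjustment, assuming NumPy arrays and the axis-reliability assignment stated above (variable names are illustrative, not from the patent):

```python
# Sketch of the mutual correction X2 = X1, Y2 = Y1, Z1 = Z2 for the
# coincident points A (positioning device) and B (puncture needle tip).
import numpy as np

def corrected_point(p_device, p_needle):
    """Return the mutually corrected coordinates of the coincident point:
    X and Y taken from the positioning device, Z taken from the needle."""
    X1, Y1, _ = p_device   # (X1, Y1, Z1) from the board pattern
    _, _, Z2 = p_needle    # (X2, Y2, Z2) from the needle identifier
    return np.array([X1, Y1, Z2])
```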
the specific mutual calibration method is composed of the following 2 parts, and a mutual calibration schematic diagram is shown in fig. 5:
(1) the coordinates of the needle point in the coordinate system of the needle identification object are manually determined in advance.
(2) A hole is made in the recognition board parallel to the z-axis and perpendicular to the Oxy-plane; the point at the bottom of the hole is the Calibration Point. By the design of the positioning device model, the coordinates p_Q of the calibration point in the positioning-device coordinate system are determined. During calibration the identification needle is inserted into the hole, ensuring that the needle tip sits at the calibration point. Since the coordinates of the calibration point in the camera coordinate system are the same however they are computed, coordinate conversion gives the relation T_{C←Q} p_Q = T_{C←N} p_N.
At this moment the calibration point has the following 2 expressions in the needle coordinate system:
(a) the coordinates in the coordinate system of the object to be positioned, recognized via the needle identifier and determined directly by the manual point calibration: p_N^(a);
(b) the coordinates in the coordinate system of the object to be positioned, recognized via the positioning device (recognition board) and obtained through coordinate conversion: p_N^(b) = T_{N←C} T_{C←Q} p_Q.
These 2 coordinates are both representations of the calibration point in the needle-identifier coordinate system. Assuming that the z component is more accurate in expression (a) and the x and y components are more accurate in expression (b), the result after mutual calibration is p_N = (p_N^(b)_x, p_N^(b)_y, p_N^(a)_z).
where:
C: camera coordinate system
Q: positioning device coordinate system
N: puncture needle coordinate system
T_{B←A}: coordinate transformation matrix from coordinate system A to coordinate system B
p_A: point p expressed in coordinate system A
v_A: vector v expressed in coordinate system A
Point calibration method with the positioning device: the camera recognizes the positioning device and the puncture needle, giving T_{C←Q} and T_{C←N}. The puncture needle tip is placed on a fixed point on the recognition board. From the machining model of the recognition board, the coordinates of this fixed point in the recognition-board coordinate system, i.e. p_Q, are known. Since the coordinates of the point are the same in the camera coordinate system however they are computed, the following coordinate relation is obtained:
T_{C←Q} p_Q = T_{C←N} p_N
and the coordinates of the point in the puncture-needle coordinate system follow, i.e. p_N = T_{N←C} T_{C←Q} p_Q, for example as in the sketch below.
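The following minimal sketch (not part of the patent; the names, the NumPy formulation and the per-axis fusion rule are assumptions taken from the description above) shows how the point-calibration relation could be evaluated:

```python
# Sketch of the point calibration T_{C<-Q} p_Q = T_{C<-N} p_N, solved for the
# calibration point in the needle-identifier frame, plus the assumed per-axis
# fusion (z from the manual estimate, x and y from the converted estimate).
import numpy as np

def tip_in_needle_frame(T_C_Q, T_C_N, p_Q):
    """Return p_N = inv(T_{C<-N}) @ T_{C<-Q} @ p_Q for 4x4 homogeneous transforms."""
    p_Q_h = np.append(np.asarray(p_Q, dtype=float), 1.0)   # homogeneous point
    p_N_h = np.linalg.inv(T_C_N) @ T_C_Q @ p_Q_h
    return p_N_h[:3]

def fuse_point_estimates(p_N_manual, p_N_converted):
    """Mutual calibration of the two expressions (a) and (b) of the point:
    keep z from the manual estimate (a), x and y from the converted one (b)."""
    return np.array([p_N_converted[0], p_N_converted[1], p_N_manual[2]])
```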
In addition, the utility model can also calibrate using direction calibration.
This mutual calibration method likewise consists of the following 2 parts, and a schematic diagram is shown in fig. 6:
(1) The direction vector v_N of the puncture needle in the needle-identifier coordinate system is determined manually in advance.
(2) A hole is machined in the positioning device parallel to the z-axis and perpendicular to the Oxy-plane. The point at the bottom of the hole is the Calibration Point, and the direction of the hole is called the Calibration Direction. By the design of the positioning device model, the direction vector v_Q of the hole direction in the positioning-device coordinate system is determined. During calibration the identification needle is inserted into the hole, ensuring that the needle tip sits at the calibration point. Since the calibration direction remains unchanged in the camera coordinate system, coordinate conversion gives the following relation:
T_{C←Q} v_Q = T_{C←N} v_N
At this time the calibration direction has the following 2 expressions in the needle coordinate system:
(a) the direction vector in the coordinate system of the object to be positioned, recognized via the needle identifier and determined directly by the manual direction calibration: v_N^(a);
(b) the direction vector in the coordinate system of the object to be positioned, recognized via the recognition board and obtained through coordinate conversion: v_N^(b) = T_{N←C} T_{C←Q} v_Q.
These 2 vectors are both representations of the calibration direction in the needle-identifier coordinate system. Assuming that the w component is more accurate in expression (a) and the u and v components are more accurate in expression (b), the result after mutual calibration is v_N = (v_N^(b)_u, v_N^(b)_v, v_N^(a)_w).
Direction calibration method with the positioning device: the camera recognizes the positioning board and the puncture needle, giving T_{C←Q} and T_{C←N}. The needle tip of the puncture needle is inserted into the fixed hole on the recognition board. From the machining model of the recognition board, the direction vector of the hole in the recognition-board coordinate system, i.e. v_Q, is known. Since the direction vector does not change direction in the camera coordinate system, the following conversion relation is obtained:
T_{C←Q} v_Q = T_{C←N} v_N
and a representation of the direction vector in the puncture-needle coordinate system follows, i.e. v_N = T_{N←C} T_{C←Q} v_Q.
After the direction calibration, when the camera identifies the needle identifier in real time, the direction of the needle tip can be calculated in real time according to the following formula:
v_C = T_{C←N} v_N
where T_{C←N} is obtained from the camera's real-time recognition of the needle identifier, and v_N is the calibration result computed by the mutual calibration or the direction calibration of the positioning device described above; a sketch follows.
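As a companion to the point-calibration sketch above, a minimal sketch (again an assumption-laden illustration, not the patented implementation) of the direction calibration and the real-time use of its result might look as follows:

```python
# Sketch of the direction calibration T_{C<-Q} v_Q = T_{C<-N} v_N and of the
# real-time tip direction v_C = T_{C<-N} v_N.  Direction vectors transform
# with the rotation blocks of the 4x4 transforms only.
import numpy as np

def direction_in_needle_frame(T_C_Q, T_C_N, v_Q):
    """v_N = R_{N<-C} R_{C<-Q} v_Q (a rotation's inverse is its transpose)."""
    return T_C_N[:3, :3].T @ (T_C_Q[:3, :3] @ np.asarray(v_Q, dtype=float))

def fuse_direction_estimates(v_N_manual, v_N_converted):
    """Mutual calibration of the two direction estimates: keep the w component
    from the manual estimate (a), u and v from the converted one (b)."""
    v = np.array([v_N_converted[0], v_N_converted[1], v_N_manual[2]])
    return v / np.linalg.norm(v)

def tip_direction_in_camera(T_C_N, v_N):
    """Real-time needle direction in the camera frame: v_C = R_{C<-N} v_N."""
    return T_C_N[:3, :3] @ np.asarray(v_N, dtype=float)
```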
In one embodiment, the camera captures video of the subject and the instrument in real time. The user can view the video in which not only the surface portions of the object and the instrument captured by the camera are displayed, but also the internal organs of the object, lesions, and portions of the instrument within the object that are not actually visible, are displayed three-dimensionally in the corresponding positions. In other words, in the video, the parts of the internal body organs, lesions, and instruments that are not actually visible are aligned with the human body and the actual instruments, thereby guiding the user in a virtual scene image similar to the real environment and the positions where the instruments are operated.
The utility model performs recognition based on the positioning device and the object to be positioned, uses optical identification objects with different error characteristics in the same scene, and improves the optical positioning accuracy of one or both of them through the spatial association of the two corresponding objects. For identification objects with different error characteristics, the correspondence between the coordinates of the different recognition patterns in the same space is determined by matching the spatially associated geometric structures of the instruments. Using known confidence values, the spatially recognized positions of the different recognition patterns are calibrated against each other.
It will be appreciated by those skilled in the art that the above embodiments are merely preferred embodiments of the utility model; modifications and variations made by those skilled in the art that embody the principles of the utility model and achieve its objects remain within its scope.
Claims (11)
1. A positioning device based on an object in space, comprising: the support part, and the characteristic part and the limiting part which are arranged on the support part;
the characteristic part comprises a display board, the display board is connected with the supporting part, and optical characteristics used for shooting and identifying are arranged on the display board;
the limiting part is used for limiting an object to be positioned;
the object to be positioned is a surgical instrument.
2. The object-based positioning device in space of claim 1, wherein the feature further comprises a connection mechanism by which the display board is connected to the support portion.
3. The object-based positioning device in space of claim 2, wherein the connecting mechanism comprises a hinge mechanism, and the display board is flip-mounted on the supporting portion via the hinge mechanism.
4. The object-based positioning device in space of claim 1, wherein the optical features comprise one or any combination of specific patterns, structures, colors for being recognized.
5. The object-based positioning device in space according to claim 3, wherein the optical feature is a pattern attached to or printed on the display board, and the pattern is a two-dimensional code.
6. The positioning device for the object in the space according to claim 1, wherein the limiting part is arranged on one side of the feature part, and when the object to be positioned is moved to a predetermined position, the limiting part limits the object to be positioned and forms a specific spatial position relationship with the object to be positioned.
7. The object-based positioning device in space according to claim 1 or 6, wherein the limiting part is a detachable structure and can be mounted on one side of the feature part or replaced.
8. The object-based positioning device in space according to claim 1 or 6, wherein the limiting part is a cylindrical structure, a positioning groove is formed in the cylindrical structure, and an opening spanning a horizontal angle α is formed in the positioning groove.
9. The positioning device for objects in space according to claim 8, wherein a through hole or a blind hole for limiting the object to be positioned is arranged along the central axis of the cylindrical structure.
10. The object-based positioning device in space of claim 1, wherein the display board is provided with a shielding member for shielding the optical feature.
11. The device according to any of claims 1-6 and 9-10, wherein the object to be positioned is a puncture needle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202120678700.1U CN216535498U (en) | 2021-04-01 | 2021-04-01 | Positioning device based on object in space |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202120678700.1U CN216535498U (en) | 2021-04-01 | 2021-04-01 | Positioning device based on object in space |
Publications (1)
Publication Number | Publication Date |
---|---|
CN216535498U true CN216535498U (en) | 2022-05-17 |
Family
ID=81537441
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202120678700.1U Active CN216535498U (en) | 2021-04-01 | 2021-04-01 | Positioning device based on object in space |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN216535498U (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022206406A1 (en) * | 2021-04-01 | 2022-10-06 | 上海复拓知达医疗科技有限公司 | Augmented reality system and method based on spatial position of corrected object, and computer-readable storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10165981B2 (en) | Surgical navigation method | |
US20230285086A1 (en) | Systems, methods and devices to scan 3d surfaces for intra-operative localization | |
JP6889703B2 (en) | Methods and devices for observing 3D surface images of patients during surgery | |
CN113940755B (en) | Surgical planning and navigation method integrating surgical operation and image | |
EP2953569B1 (en) | Tracking apparatus for tracking an object with respect to a body | |
US5765561A (en) | Video-based surgical targeting system | |
EP3254621A1 (en) | 3d image special calibrator, surgical localizing system and method | |
CN109998678A (en) | Augmented reality assisting navigation is used during medicine regulation | |
KR102105974B1 (en) | Medical imaging system | |
WO2022206417A1 (en) | Object space calibration positioning method | |
US20200360093A1 (en) | System and method to conduct bone surgery | |
CN112043382A (en) | Surgical navigation system and use method thereof | |
Agustinos et al. | Visual servoing of a robotic endoscope holder based on surgical instrument tracking | |
Liu et al. | On-demand calibration and evaluation for electromagnetically tracked laparoscope in augmented reality visualization | |
CN216535498U (en) | Positioning device based on object in space | |
Chan et al. | A needle tracking device for ultrasound guided percutaneous procedures | |
WO2022206406A1 (en) | Augmented reality system and method based on spatial position of corrected object, and computer-readable storage medium | |
Bucholz et al. | Automated rejection of contaminated surface measurements for improved surface registration in image guided neurosurgery | |
WO2022206436A1 (en) | Dynamic position identification and prompt system and method | |
KR20230059159A (en) | Apparatus and method for positioning a patient's body and tracking the patient's position during surgery | |
KR101592444B1 (en) | Medical image augmentation device using transparent display and method thereof | |
US11806093B1 (en) | Apparatus and method for tracking hand-held surgical tools | |
KR20090038717A (en) | Medical instrument used in medical navigation and medical navigation method | |
US20220338937A1 (en) | Device For Navigating A Medical Instrument Relative To A Patient Anatomy | |
CN115886695A (en) | Handheld robot-assisted endoscope |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | |