CN111481293B - Multi-viewpoint optical positioning method and system based on optimal viewpoint selection - Google Patents

Multi-viewpoint optical positioning method and system based on optimal viewpoint selection

Info

Publication number
CN111481293B
CN111481293B (application CN202010301658.1A)
Authority
CN
China
Prior art keywords
viewpoint
markers
viewpoints
positioning
optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010301658.1A
Other languages
Chinese (zh)
Other versions
CN111481293A (en)
Inventor
张楠 (Zhang Nan)
武博 (Wu Bo)
叶灿 (Ye Can)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Capital Medical University
Original Assignee
Capital Medical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Capital Medical University filed Critical Capital Medical University
Priority to CN202010301658.1A priority Critical patent/CN111481293B/en
Publication of CN111481293A publication Critical patent/CN111481293A/en
Application granted granted Critical
Publication of CN111481293B publication Critical patent/CN111481293B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems

Abstract

The invention discloses a multi-viewpoint optical positioning method and system based on optimal viewpoint selection. Because the viewpoints occupy different spatial positions, with different baseline distances between them and different angles to the optical positioning markers, the positioning accuracy of the stereoscopic vision models formed by different viewpoint pairs differs. When some viewpoints cannot capture the optical positioning markers because the light path is occluded, the invention completes acquisition and positioning of the markers by selecting optimal viewpoints that are not occluded. When several viewpoints can image the optical positioning markers simultaneously, the optimal viewpoint pair is selected according to the relative relationship between the markers and the viewpoints, thereby achieving positioning and tracking of the markers under multiple viewpoints.

Description

Multi-viewpoint optical positioning method and system based on optimal viewpoint selection
Technical Field
The invention relates to the technical field of surgical navigation, in particular to a multi-viewpoint optical positioning method and system based on optimal viewpoint selection.
Background
Surgery is currently moving towards increasingly elaborate and complex procedures. Surgical navigation systems are therefore important for shortening operation time, reducing surgical trauma and improving surgical accuracy. The accuracy of a navigation system depends primarily on the accuracy of its spatial localization technique. Optical positioning offers the highest positioning precision among the localization methods used in current surgical navigation systems, and high-precision optical positioning is the main development direction for surgical navigation positioning systems.
An internationally advanced optical positioning surgical navigation system is the Polaris optical motion capture system of Northern Digital Inc. (NDI), Canada. In patent publication EP1034440B1, NDI proposes a method of determining the position of an object in space in which a pair of sensors receives signals emitted from sources on the object and the object's position is then computed from the resulting sensor images.
However, when an obstacle lies between a viewpoint and the target, that viewpoint cannot receive the infrared light emitted by the target, and the system cannot locate the target.
Disclosure of Invention
The object of the invention is achieved by the following technical solution.
To solve the problem that an optical positioning marker cannot be tracked when its light path is occluded, the invention discloses a multi-viewpoint optical positioning method and system based on optimal viewpoint selection. Images of the optical positioning markers are acquired synchronously from a plurality of viewpoints, so that marker images are not lost when individual viewpoints are occluded. When some viewpoints cannot capture the optical positioning markers because of occlusion, the invention completes acquisition and positioning of the markers by selecting optimal viewpoints that are not occluded, and then determines the pose of the positioning instrument from the positional relationship between the optical positioning markers and the tip point of the positioning instrument.
Specifically, according to an aspect of the present invention, there is provided a multi-viewpoint optical positioning method based on optimal viewpoint selection, including:
establishing a parallel multi-viewpoint array, and obtaining a multi-viewpoint unified coordinate system through a camera calibration method and coordinate system transformation;
a viewpoint selecting step, namely selecting the viewpoints to be used for positioning the optical markers according to the occlusion state between each viewpoint and the optical positioning markers and the spatial position relationship between the optical markers and each viewpoint, and reconstructing the spatial coordinates of the optical positioning markers;
and a positioning step, namely determining the pose of the positioning instrument according to the position relation between the optical positioning marker and the tip point of the positioning instrument.
The multi-viewpoint spatial arrangement and unified coordinate system establishment comprises the following steps:
(1) placing a plurality of acquisition viewpoints in parallel in a space;
(2) obtaining internal and external parameters of the plurality of acquisition viewpoints by adopting a camera calibration method;
(3) synchronously acquiring images from multiple viewpoints, and calculating the central pixel coordinates of the optical positioning markers under each viewpoint;
(4) determining the corresponding relation between the optical positioning markers under each viewpoint, performing three-dimensional reconstruction on the space coordinates of the optical positioning markers by using a binocular vision principle, and calculating the space coordinates of each optical positioning marker under different viewpoint pairs;
(5) calculating a transformation matrix between the spatial coordinate systems of different viewpoint pairs using coordinate system transformation.
The viewpoint selection and surgical instrument positioning steps are as follows:
(1) controlling multi-viewpoint synchronous image acquisition, and calculating the central pixel coordinates of the optical positioning markers and the number of the optical positioning markers under each viewpoint;
(2) determining the corresponding relation between the optical positioning markers in each viewpoint and judging whether each viewpoint is shielded or not;
(3) selecting a viewpoint pair to be used for positioning the optical positioning marker according to the shielding condition of each viewpoint and the optical positioning marker and the spatial position relation between the optical marker and each viewpoint;
(4) carrying out three-dimensional reconstruction on the space coordinates of each optical positioning marker by using a binocular vision principle;
(5) the spatial coordinates of the reconstructed optical positioning markers are converted into spatial coordinates in a reference coordinate system.
Preferably, the parallel multi-viewpoint array has parallel imaging planes of the viewpoints, optical axes of the viewpoints are parallel, and optical centers of the viewpoints are on a horizontal line.
Preferably, the optical positioning marker is actively and/or passively luminescent.
Preferably, the plurality of viewpoints are three or more viewpoints.
The occlusion state between each viewpoint and the optical positioning markers is judged from the number of markers captured in each viewpoint: if a positioning instrument is located using N (N ≥ 3) optical positioning markers, a viewpoint that captures fewer than N markers is an occluded viewpoint, and a viewpoint that captures exactly N markers is an unoccluded viewpoint.
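By way of illustration only, this occlusion test can be sketched in a few lines of Python; the dictionary layout, the function name and N = 3 are assumptions for the example, not part of the claimed method:

```python
# Hypothetical sketch: classify viewpoints as occluded or unoccluded by the
# number of marker image points they captured (N markers per instrument).

def split_viewpoints_by_occlusion(markers_per_view, n_markers=3):
    """markers_per_view: dict {view_id: list of (u, v) marker centroids}."""
    unoccluded = [v for v, pts in markers_per_view.items() if len(pts) == n_markers]
    occluded = [v for v, pts in markers_per_view.items() if len(pts) < n_markers]
    return unoccluded, occluded

# Example: viewpoint 2 sees only two of the three markers, so it is occluded.
views = {0: [(310, 240), (350, 242), (400, 239)],
         1: [(290, 241), (330, 243), (381, 240)],
         2: [(300, 238), (344, 241)]}
print(split_viewpoints_by_occlusion(views))   # -> ([0, 1], [2])
```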
The specific rule for selecting the viewpoint pair to be used for positioning the optical positioning marker is:
if a positioning instrument is located using N (N ≥ 3) optical positioning markers and some viewpoints are occluded, the viewpoints whose number of captured markers equals N are selected;
when there are a plurality of viewpoints from which the N optical positioning markers can be captured, a viewpoint pair having the largest baseline distance is selected.
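A minimal sketch of the largest-baseline rule, under the assumption that the optical center of every unoccluded viewpoint is known in the unified coordinate system; the helper name and the example coordinates are illustrative only:

```python
import itertools
import numpy as np

def select_max_baseline_pair(centers):
    """centers: dict {view_id: optical-center 3-vector in the unified frame}.
    Returns the unoccluded viewpoint pair with the largest baseline distance."""
    best_pair, best_dist = None, -1.0
    for a, b in itertools.combinations(centers.keys(), 2):
        dist = float(np.linalg.norm(np.asarray(centers[a]) - np.asarray(centers[b])))
        if dist > best_dist:
            best_pair, best_dist = (a, b), dist
    return best_pair, best_dist

centers = {0: [0.0, 0, 0], 1: [0.2, 0, 0], 2: [0.9, 0, 0], 3: [1.1, 0, 0]}
print(select_max_baseline_pair(centers))  # -> ((0, 3), 1.1)
```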
According to another aspect of the present invention, there is provided a multi-viewpoint optical positioning system based on optimal viewpoint selection, comprising:
the multi-view spatial arrangement and unified coordinate system establishing module establishes a parallel multi-view array, and obtains a multi-view unified coordinate system through a camera calibration method and coordinate system transformation;
the viewpoint selecting module is used for selecting the viewpoints to be used for positioning the optical positioning markers according to the occlusion state between each viewpoint and the optical positioning markers and the spatial position relationship between the optical markers and each viewpoint, and for reconstructing the spatial coordinates of the optical positioning markers;
and the positioning module is used for determining the pose of the positioning instrument according to the position relation between the optical positioning marker and the tip point of the positioning instrument.
According to another aspect of the present invention, there is provided an electronic apparatus including: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the above method steps.
The invention has the following beneficial effects: images of the optical positioning markers are obtained from multiple viewpoints, which solves the problem that positioning fails when some viewpoints are occluded from the optical positioning markers; a unified multi-viewpoint coordinate system is established using coordinate system transformation; and a multi-viewpoint optical positioning method based on optimal viewpoint selection is provided.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to refer to like parts throughout the drawings. In the drawings:
fig. 1 is a diagram of a hardware device architecture according to an embodiment of the present invention.
Fig. 2 is a flowchart of a multi-view optical positioning method according to an embodiment of the present invention.
FIG. 3 is a flowchart illustrating steps and processes for establishing a multi-view unified coordinate system according to an embodiment of the present invention.
FIG. 4 is a flowchart illustrating the steps and processes for selecting a viewpoint according to the present invention.
Fig. 5 is a diagram illustrating a relationship between depth and parallax.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
FIG. 1 illustrates a hardware infrastructure architecture diagram according to an embodiment of the invention. The invention provides a multi-viewpoint optical positioning system comprising two stereo cameras (each stereo camera comprises two video cameras 1, so there are four video cameras 1 in total), a workstation 2, a display 3, a direct-current power supply 4 and a surgical instrument 5. The surgical instrument 5 carries three optical positioning markers 6. The four cameras 1 are arranged horizontally in space, and the distances between the pairs of cameras 1 are d1, d2, d3, d4, d5 and d6. The four cameras 1 synchronously capture images of the optical positioning markers 6 on the surgical instrument 5; the images are received and processed by the workstation 2, the spatial coordinates of the tip point of the surgical instrument 5 are calculated from the positional relationship between each optical positioning marker 6 on the surgical instrument 5 and the instrument tip, and the result is shown on the display 3, so that the spatial coordinates of the surgical instrument tip are displayed in real time.
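As a rough illustration of the tip computation just described (not the patented implementation), the sketch below assumes the tip offset has been pre-calibrated in a local frame spanned by the three markers; the function names and the particular frame construction are assumptions:

```python
import numpy as np

def marker_frame(p1, p2, p3):
    """Build an orthonormal frame (R, origin) from three non-collinear marker positions."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    x = (p2 - p1) / np.linalg.norm(p2 - p1)
    n = np.cross(p2 - p1, p3 - p1)
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)
    R = np.column_stack((x, y, z))      # columns are the frame axes in world coordinates
    return R, p1

def tip_position(p1, p2, p3, tip_in_marker_frame):
    """World position of the instrument tip, given the pre-calibrated tip offset."""
    R, origin = marker_frame(p1, p2, p3)
    return origin + R @ np.asarray(tip_in_marker_frame)
```

Any frame that is rigidly attached to the three markers would serve; the only requirement is that the tip offset was calibrated in that same frame.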
As shown in fig. 2, the multi-viewpoint optical positioning method based on optimal viewpoint selection of the present invention is divided into three steps. First, a multi-viewpoint spatial arrangement and unified coordinate system establishment step: a parallel multi-viewpoint array is established, and a unified multi-viewpoint coordinate system is obtained by camera calibration and coordinate system transformation. Secondly, a viewpoint selecting step: the viewpoint pair to be used for positioning the optical positioning markers, and hence for determining their spatial coordinates, is selected according to the occlusion state between each viewpoint and the optical positioning markers on the surgical instrument and the spatial position relationship between the markers and each viewpoint. Thirdly, a surgical instrument positioning step: the pose of the surgical instrument is determined from the positional relationship between the optical positioning markers and the tip point of the surgical instrument.
Referring to fig. 3, the steps and processes for spatial arrangement of multiple viewpoints and establishment of a unified coordinate system according to the present invention will be described in detail.
Step S1: establishing parallel multi-view array
Two stereo cameras are horizontally placed in the space.
Step S2: obtaining internal and external parameters of each viewpoint through calibration
The internal parameters of the four viewpoints and the external parameters between the different viewpoints are obtained using Zhang Zhengyou's planar calibration method.
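For orientation, a condensed sketch of how such internal and external parameters could be obtained with OpenCV's implementation of Zhang's method; the checkerboard size, square size and variable names are assumptions, and the patent does not prescribe this toolchain:

```python
import cv2
import numpy as np

def calibrate_viewpoint(images, board_size=(9, 6), square=0.025):
    """Estimate one viewpoint's intrinsics from checkerboard images (Zhang's method)."""
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square
    obj_pts, img_pts, image_size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]                      # (width, height)
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)
    return K, dist, obj_pts, img_pts, image_size

# External parameters (R, T) between two viewpoints that observed the same boards
# could then be obtained with, for example:
#   _, _, _, _, _, R, T, E, F = cv2.stereoCalibrate(
#       obj_pts, img_pts_left, img_pts_right, K_left, dist_left, K_right, dist_right,
#       image_size, flags=cv2.CALIB_FIX_INTRINSIC)
```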
Step S3: multi-view synchronous image acquisition
The stereo cameras attached to the IEEE 1394 cards of the same computer are controlled to acquire images synchronously.
Step S4: calculating the coordinates and number of the central pixels of the image of the optical positioning marker
The central pixel coordinates and the number of image points of the optical positioning markers under each viewpoint are obtained using a region growing method.
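An illustrative sketch of the centroid extraction, implemented here with a brightness threshold and connected components rather than the region growing routine of the embodiment; the threshold and minimum-area values are assumptions:

```python
import cv2

def marker_centroids(gray, threshold=200, min_area=5):
    """Return the centre pixel coordinates of bright marker blobs in one view."""
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    pts = [tuple(centroids[i]) for i in range(1, n)        # label 0 is the background
           if stats[i, cv2.CC_STAT_AREA] >= min_area]
    return pts, len(pts)
```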
Step S5: matching of marker images under multiple viewpoints
Images from all viewpoints are acquired synchronously, and the central pixel coordinates and the number of image points of the optical positioning markers under each viewpoint are calculated. Because the optical centers of the multiple viewpoints lie approximately on one straight line and the imaging planes are approximately parallel, the image point of an optical positioning marker in one viewpoint and its matching image point in another viewpoint are considered to lie close to the same horizontal line, with the difference of their vertical coordinates within 2 pixels. When several image points lie near the same horizontal line, the matching relationship of the marker image points is determined from the relative order of their horizontal coordinates. In this way, the correspondence between the marker image points on each image is determined.
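A simplified sketch of this row-constrained matching; the 2-pixel tolerance comes from the text above, while the greedy nearest-column disambiguation is an assumption standing in for the relative-horizontal-coordinate rule:

```python
def match_markers(points_left, points_right, row_tol=2):
    """Match marker image points across two views whose image rows are nearly aligned.
    points_*: lists of (u, v) centroids. Returns a list of (i_left, i_right) pairs."""
    pairs, used = [], set()
    for i, (ul, vl) in enumerate(points_left):
        # candidates lying on (almost) the same image row in the other view
        cands = [(j, ur) for j, (ur, vr) in enumerate(points_right)
                 if abs(vr - vl) <= row_tol and j not in used]
        if not cands:
            continue
        # disambiguate several candidates by their horizontal coordinate
        j, _ = min(cands, key=lambda c: abs(c[1] - ul))
        pairs.append((i, j))
        used.add(j)
    return pairs
```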
Step S6: establishing a binocular vision system with two viewpoints, and calculating the space coordinates of the markers
After the correspondence between the marker image points on each image is determined, the spatial coordinates of the markers on the instrument are reconstructed three-dimensionally using the binocular vision principle, and the spatial coordinates of each optical positioning marker under the different binocular vision systems are calculated.
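The binocular reconstruction can be illustrated with linear triangulation as sketched below; the projection matrices P1 and P2 are assumed to come from the calibration step, and this is not necessarily the exact reconstruction used in the embodiment:

```python
import cv2
import numpy as np

def triangulate_markers(P1, P2, pts1, pts2):
    """pts1, pts2: matched (u, v) marker centroids in the two views (lists of pairs).
    P1, P2: 3x4 projection matrices K[R|t] of the selected viewpoint pair."""
    x1 = np.asarray(pts1, dtype=float).T          # shape (2, N)
    x2 = np.asarray(pts2, dtype=float).T
    X_h = cv2.triangulatePoints(P1, P2, x1, x2)   # homogeneous coordinates, shape (4, N)
    X = (X_h[:3] / X_h[3]).T                      # (N, 3) Euclidean marker coordinates
    return X
```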
Step S7: calculating the transformation relationship between the coordinates of the optical positioning markers in different binocular vision systems
The spatial coordinates of the optical positioning markers reconstructed by a binocular stereo viewpoint system depend on the positions in space of the two viewpoints forming that system, so the coordinates of the same markers reconstructed by different binocular stereo viewpoint systems differ. The coordinate transformation of the optical positioning markers between different binocular vision systems is a rigid-body transformation. The transformation matrices between the spatial coordinate systems of the different binocular vision systems are obtained by coordinate system transformation, as shown in formula (1).
P′ = R′_i P_i + T′_i (1)
where P′ denotes the coordinates of a point in space in the reference coordinate system, P_i denotes the spatial coordinates reconstructed using a different binocular vision system, and R′_i and T′_i are the parameters that transform the coordinate system formed by that binocular vision system to the reference coordinate system. In the present invention, the coordinate system reconstructed with viewpoints 1 and 2 (i.e., the viewpoints represented by the first and second cameras on the left in fig. 1) is generally taken as the reference coordinate system.
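The parameters R′_i and T′_i of formula (1) can be estimated from the coordinates of the same markers expressed in the two binocular systems, for example with the SVD-based rigid registration sketched below; the patent does not prescribe a particular solver, so this choice is an assumption:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares R, T such that dst ≈ R @ src + T; src and dst are (N, 3) arrays
    of the same markers expressed in two binocular coordinate systems."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    T = c_dst - R @ c_src
    return R, T
```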
Referring to fig. 4, the steps and flow of the viewpoint selecting step according to the present invention will be described in detail.
Step S11: multi-view synchronous image acquisition
The stereo cameras attached to the IEEE 1394 cards of the same computer are controlled to acquire images synchronously.
Step S12: calculating the coordinates and number of the central pixels of the image points of the optical positioning markers
The central pixel coordinates and the number of image points of the optical positioning markers under each viewpoint are obtained using a region growing method.
Step S13: matching optical positioning marker images under multiple viewpoints
After the images of the optical positioning markers on the surgical instrument are obtained under the various viewpoints, the correspondence between the marker image points under each viewpoint is determined by the stereo matching principle. Image points produced by reflections from the instruments or other factors are eliminated, and the number of marker image points in each viewpoint is then counted.
Step S14: judging whether each viewpoint is shielded or not
After the marker images under the multiple viewpoints are matched, since the number of optical positioning markers on each surgical instrument is fixed (three), the occlusion state between each viewpoint and the optical positioning markers is judged from the number of marker image points under that viewpoint. If occlusion is found, the occluded viewpoints are excluded and the flow proceeds to step S15; otherwise the flow proceeds directly to step S15.
Step S15: determining whether there are more than two viewpoints to obtain a complete image of the marker on the surgical instrument
If there are no more than two viewpoints to obtain an image of the optical positioning markers on the complete surgical instrument, then return is made to step S11. If so, the process proceeds to step S16.
Step S16: determining whether there are only two viewpoints to obtain a complete image of an optical positioning marker on a surgical instrument
If there are only two viewpoints to obtain the image of the optical positioning markers on the complete surgical instrument, the process proceeds directly to step S18 to perform three-dimensional reconstruction and calculate the three-dimensional spatial coordinates of each optical positioning marker. If not, the flow proceeds to step S17.
Step S17: selecting the viewpoint pair with the maximum distance from the base line for reconstruction
When more than two viewpoints can capture the complete set of optical positioning markers on the surgical instrument, the viewpoint pair with the largest baseline distance is selected.
Step S18: calculating spatial coordinates of optical positioning markers using three-dimensional reconstruction
The spatial coordinates of the optical positioning markers on the instrument are reconstructed three-dimensionally using the binocular vision principle, and the spatial coordinates of each marker under the selected binocular vision system are calculated.
Step S19: transforming the coordinates of the optical positioning markers into a reference coordinate system
The spatial coordinates of the markers on the surgical instrument are converted into coordinates in the reference coordinate system using the transformation matrix between the coordinate system of the binocular vision system formed by the two viewpoints used for reconstruction and that of the reference binocular vision system.
The invention also provides a method for selecting the optimal viewpoint. When several viewpoints can acquire image points of the optical positioning markers simultaneously, an optimal viewpoint pair must be chosen. In the depth direction, the optimal depth measurement range of a binocular viewpoint pair with a fixed baseline and focal length is fixed. The relationship between the measured depth and the parallax is shown in formula (2):
Z = bf / (x_l - x_r) = bf / d (2)
where Z is the depth, b is the baseline distance between the two cameras, f is the focal length of the cameras, x_l and x_r are the horizontal coordinates of the marker image points in the pixel coordinate systems of the left and right viewpoints, and d = x_l - x_r is the parallax.
As can be seen from fig. 5, at positions of large depth the ability of the parallax to resolve depth deteriorates. According to formula (2), for a given depth the parallax is proportional to the baseline distance between the two cameras, so the camera pair with the larger baseline distance is selected to position the surgical instrument.
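To make the effect of the baseline concrete, the short numeric sketch below differentiates formula (2) to estimate the depth error caused by a one-pixel disparity error; the focal length, baselines and depth are arbitrary example values:

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Formula (2): Z = b * f / d."""
    return baseline_m * focal_px / disparity_px

def depth_resolution(baseline_m, focal_px, depth_m, disparity_step_px=1.0):
    """Approximate depth change caused by a one-pixel disparity change:
    dZ ≈ Z^2 / (b * f) * dd, obtained by differentiating formula (2)."""
    return depth_m ** 2 / (baseline_m * focal_px) * disparity_step_px

f = 1200.0                      # focal length in pixels (example value)
for b in (0.10, 0.50):          # short vs. long baseline in metres
    print(b, depth_resolution(b, f, depth_m=2.0))
# The 0.50 m baseline gives roughly five times finer depth resolution at 2 m,
# which is why the viewpoint pair with the largest baseline distance is selected.
```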
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (8)

1. A multi-viewpoint optical positioning method based on optimal viewpoint selection is characterized by comprising the following steps:
establishing a parallel multi-viewpoint array, and obtaining a multi-viewpoint unified coordinate system through a camera calibration method and coordinate system transformation;
a viewpoint selecting step, namely selecting the optimal viewpoints to be used for positioning the optical markers according to the occlusion state between each viewpoint and the optical positioning markers and the spatial position relationship between the optical markers and each viewpoint, and reconstructing the spatial coordinates of the optical positioning markers; the occlusion state between each viewpoint and the optical positioning markers is judged from the number of markers captured in each viewpoint: if a positioning instrument is located using N optical positioning markers, a viewpoint that captures fewer than N markers is an occluded viewpoint, and a viewpoint that captures exactly N markers is an unoccluded viewpoint; the specific rules for selecting the optimal viewpoints to be used for positioning the optical markers are: if a positioning instrument is located using N optical positioning markers and some viewpoints are occluded, the viewpoints whose number of captured markers equals N are selected; when there are a plurality of viewpoints from which the N optical positioning markers can be captured, the viewpoint pair having the largest baseline distance is selected;
and a positioning step, namely determining the pose of the positioning instrument according to the position relation between the optical positioning marker and the tip point of the positioning instrument.
2. The method of claim 1,
the parallel multi-viewpoint array is characterized in that the imaging planes of all viewpoints are parallel, the optical axes of all viewpoints are parallel, and the optical centers of all viewpoints are on a horizontal line.
3. The method according to claim 1, wherein the spatial arrangement of the multiple viewpoints and the establishment of the unified coordinate system comprises the following steps:
(1) placing a plurality of acquisition viewpoints in parallel in a space;
(2) obtaining internal and external parameters of the plurality of acquisition viewpoints by adopting a camera calibration method;
(3) synchronously acquiring images from multiple viewpoints, and calculating the central pixel coordinates of the optical positioning markers under each viewpoint;
(4) determining the corresponding relation between the optical positioning markers under each viewpoint, performing three-dimensional reconstruction on the space coordinates of the optical positioning markers by using a binocular vision principle, and calculating the space coordinates of each optical positioning marker under different viewpoint pairs;
(5) and calculating a conversion matrix between space coordinate systems of different viewpoint pairs by using coordinate system transformation.
4. The method according to claim 1, wherein the viewpoint selecting comprises the following specific steps:
(1) controlling multi-viewpoint synchronous image acquisition, and calculating the central pixel coordinates of the optical positioning markers and the number of the optical positioning markers under each viewpoint;
(2) determining the corresponding relation between the optical positioning markers in each viewpoint and judging whether each viewpoint is shielded or not;
(3) selecting a viewpoint pair to be used for positioning the optical positioning marker according to the shielding condition of each viewpoint and the optical positioning marker and the spatial position relation between the optical marker and each viewpoint;
(4) carrying out three-dimensional reconstruction on the space coordinates of each optical positioning marker by using a binocular vision principle;
(5) the spatial coordinates of the reconstructed optical positioning markers are converted into spatial coordinates in a reference coordinate system.
5. The method of claim 1,
the optical positioning markers are actively and/or passively luminescent.
6. The method of claim 1,
the multi-view refers to three or more views.
7. A multi-viewpoint optical positioning system based on optimal viewpoint selection, comprising:
the multi-view spatial arrangement and unified coordinate system establishing module is used for establishing a parallel multi-view array and obtaining a multi-view unified coordinate system through a camera calibration method and coordinate system transformation;
the viewpoint selecting module is used for selecting the optimal viewpoints to be used for positioning the optical positioning markers according to the occlusion state between each viewpoint and the optical positioning markers and the spatial position relationship between the optical markers and each viewpoint, and for reconstructing the spatial coordinates of the optical positioning markers; the occlusion state between each viewpoint and the optical positioning markers is judged from the number of markers captured in each viewpoint: if a positioning instrument is located using N optical positioning markers, a viewpoint that captures fewer than N markers is an occluded viewpoint, and a viewpoint that captures exactly N markers is an unoccluded viewpoint; the specific rules for selecting the optimal viewpoints to be used for positioning the optical markers are: if a positioning instrument is located using N optical positioning markers and some viewpoints are occluded, the viewpoints whose number of captured markers equals N are selected; when there are a plurality of viewpoints from which the N optical positioning markers can be captured, the viewpoint pair having the largest baseline distance is selected;
and the positioning module is used for determining the pose of the positioning instrument according to the position relation between the optical positioning marker and the tip point of the positioning instrument.
8. An electronic device, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 6.
CN202010301658.1A 2020-04-16 2020-04-16 Multi-viewpoint optical positioning method and system based on optimal viewpoint selection Active CN111481293B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010301658.1A CN111481293B (en) 2020-04-16 2020-04-16 Multi-viewpoint optical positioning method and system based on optimal viewpoint selection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010301658.1A CN111481293B (en) 2020-04-16 2020-04-16 Multi-viewpoint optical positioning method and system based on optimal viewpoint selection

Publications (2)

Publication Number Publication Date
CN111481293A CN111481293A (en) 2020-08-04
CN111481293B true CN111481293B (en) 2021-09-17

Family

ID=71789989

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010301658.1A Active CN111481293B (en) 2020-04-16 2020-04-16 Multi-viewpoint optical positioning method and system based on optimal viewpoint selection

Country Status (1)

Country Link
CN (1) CN111481293B (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101299270B (en) * 2008-05-27 2010-06-02 东南大学 Multiple video cameras synchronous quick calibration method in three-dimensional scanning system
CN101750607B (en) * 2008-07-25 2012-11-14 清华大学 Instrument identifying method for passive optical position fixing navigation system
CN104647390B (en) * 2015-02-11 2016-02-10 清华大学 For the multiple-camera associating active tracing order calibration method of mechanical arm remote operating
CN209220493U (en) * 2018-02-28 2019-08-09 安徽福晴医疗科技有限公司 A kind of distribution optical alignment tracking system
CN110619658B (en) * 2019-09-16 2022-04-19 北京地平线机器人技术研发有限公司 Object tracking method, object tracking device and electronic equipment

Also Published As

Publication number Publication date
CN111481293A (en) 2020-08-04

Similar Documents

Publication Publication Date Title
US11025889B2 (en) Systems and methods for determining three dimensional measurements in telemedicine application
EP3125806B1 (en) Quantitative three-dimensional imaging of surgical scenes
CN109949899B (en) Image three-dimensional measurement method, electronic device, storage medium, and program product
CN108734744B (en) Long-distance large-view-field binocular calibration method based on total station
US20170180704A1 (en) Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US20160295194A1 (en) Stereoscopic vision system generatng stereoscopic images with a monoscopic endoscope and an external adapter lens and method using the same to generate stereoscopic images
US11354810B2 (en) Methods and systems for imaging a scene, such as a medical scene, and tracking objects within the scene
TWI520576B (en) Method and system for converting 2d images to 3d images and computer-readable medium
CN114998499A (en) Binocular three-dimensional reconstruction method and system based on line laser galvanometer scanning
CN105783880B (en) A kind of monocular laser assisted bay section docking calculation
Mahdy et al. Projector calibration using passive stereo and triangulation
CN106500625A (en) A kind of telecentricity stereo vision measuring apparatus and its method for being applied to the measurement of object dimensional pattern micron accuracies
CN105496556A (en) High-precision optical positioning system for surgical navigation
CN102881040A (en) Three-dimensional reconstruction method for mobile photographing of digital camera
CN112215871B (en) Moving target tracking method and device based on robot vision
CN113040909A (en) Optical tracking system and method based on near-infrared three-eye stereo vision
CN111481293B (en) Multi-viewpoint optical positioning method and system based on optimal viewpoint selection
CN109938841B (en) Surgical instrument navigation system based on multi-view camera coordinate fusion
CN111829435A (en) Multi-binocular camera and line laser cooperative detection method
WO2018222181A1 (en) Systems and methods for determining three dimensional measurements in telemedicine application
CN114155280A (en) Binocular line tracking method, device and equipment
CN110068308B (en) Distance measurement method and distance measurement system based on multi-view camera
US20230355319A1 (en) Methods and systems for calibrating instruments within an imaging system, such as a surgical imaging system
KR20080107345A (en) Stereo camera and method for recognizing stereoscopic image of stereo camera
US20230147711A1 (en) Methods for generating stereoscopic views in multicamera systems, and associated devices and systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information (inventor before: Zhang Nan; inventors after: Zhang Nan, Wu Bo, Ye Can)
GR01 Patent grant