US20240159527A1 - Opto-mechanical camera position tracking device - Google Patents
- Publication number
- US20240159527A1 (U.S. application Ser. No. 18/111,533)
- Authority
- US
- United States
- Prior art keywords
- encoder
- camera
- pedestal
- tilt
- jib
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B21/00—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
- G01B21/22—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring angles or tapers; for testing the alignment of axes
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
- G03B17/561—Support related camera accessories
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
Abstract
The present invention relates to an opto-mechanical camera position tracking device which gathers real-time positional data of the camera in three-dimensional (3D) space and the zoom value of the camera lens without using any line of sight or markers and has zero drift. The said camera tracking device attaches to an existing camera crane, detects the necessary positional and optical data, processes it, and sends it to a rendering engine for seamless, real-time visualization.
Description
- The present invention generally relates to the field of positional tracking.
- Particularly, the present invention relates to opto-mechanical-device-based positional tracking of a camera for virtual reality and/or augmented reality applications with improved features and characteristics.
- Virtual Reality (VR) comprises a computer simulated environment that can simulate a physical presence in places in the real world or imagined worlds. Conventionally, virtual reality environments are primarily visual experiences, displayed on a screen (e.g., and viewed by a user using 3D glasses) or through special stereoscopic display head gear. The simulated environments can be configured to be similar to the real world in order to create lifelike experiences, or the simulated environments can differ significantly from the real world, such as in VR games.
- Augmented Reality (AR) generally refers to a computer simulated environment combined with the real world. Conventionally, the elements of the real world are augmented with computer generated graphics. Often, translucent stereoscopic headsets are worn by the user in AR simulations to enable a wearer to view the real world through the headset while also being able to view computer generated graphics.
- Movement of participants and/or objects in interactive VR and AR simulations may be tracked using various methods and devices such as magnetic tracking, acoustic tracking, inertial tracking, optical tracking with or without markers varying in parameters such as tracking precision, tracking volume, tracking markers, manufacturing cost, and complexity of user setup.
- Certain positional tracking systems currently known in the art fully or partially rely on tracking markers attached to objects, and then track the marked objects. In such systems, a tracked object typically must be covered with large tracking markers that can encode several bits of data, such that typically only large objects can be tracked.
- For this reason, most such systems, known as “PTAM/SLAM systems” (acronyms for Parallel Tracking and Mapping for Small AR Workspaces, and Simultaneous Localization and Mapping, respectively), locate the camera on the head-mounted display (HMD) and place the tracking markers on the walls of the environment. This approach has several disadvantages, for example: it typically requires the VR user to greatly modify the appearance of his or her environment by covering all viewing directions with large tracking markers; it typically requires the user to perform a complex calibration step in order to map the environment; and the tracking cameras attached to the HMD typically require a good lens for precise tracking, which increases the weight of the HMD, typically significantly. Since the tracking markers are typically complex in design in such implementations, their decoding is usually performed on the PC instead of an onboard processor in or near the HMD, and this typically increases the amount of data sent from the camera to the computer and the tracking latency.
- The fixed sensors and markers must have a clear line of sight to the camera; otherwise there will be drift, lag, or blackout in the video output. This prevents the creative and technical team from working seamlessly in the given shooting environment. Current camera trackers confine the movement of the camera to the area where line of sight is available between the camera and the markers or sensors, which constrains the creative and technical team. Current camera tracking also requires substantial time to calibrate for each individual camera and specific lens. This results in high set-up time and a large amount of time to change from one camera lens to another, causing a huge loss of shooting time on production days. Most camera tracking systems are fixed, and those camera tracking systems which are dismantlable and mountable are not easy to move, set up, or calibrate. Most camera tracking systems cannot work outdoors due to the various limitations listed above.
- Therefore, in light of foregoing discussion, there exists a need to overcome the drawbacks associated with the existing state of the art.
- The present invention describes an opto-mechanical camera position tracking device.
- The primary object of the present invention is to provide an opto-mechanical camera position tracking device.
- A further object of the present invention is to provide a camera position tracking device which gathers real-time positional data of the camera in three-dimensional (3D) space and the zoom value of the camera lens.
- Embodiments of the present disclosure present technological improvements as solutions to one or more of the above-mentioned technical problems recognized by the inventor in conventional practices and the existing state of the art.
- The present disclosure seeks to provide an opto-mechanical camera position tracking device which gathers real-time positional data of the camera in three-dimensional (3D) space and the zoom value of the camera lens.
- According to an aspect of the present invention, the said camera tracking device attaches to an existing camera crane, detects the necessary positional data (jib, camera—pan, jib, camera—tilt) and optical data (lens—zoom), processes it, and sends it to a rendering engine using FreeD protocol.
- The objects and the advantages of the invention are achieved by the process elaborated in the present disclosure.
- The accompanying drawings constitute a part of this specification and illustrate one or more embodiments of the invention. Preferred embodiments of the invention are described in the following with reference to the drawings, which are for the purpose of illustrating the present preferred embodiments of the invention and not for the purpose of limiting the same.
- For simplicity and clarity of illustration, the drawing figures illustrate the general manner of construction, and descriptions and details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the invention. Additionally, elements in the drawing figures are not necessarily drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of embodiments of the present invention. The same reference numerals in different figures denote the same elements.
- In the drawings:
- FIG. 1 illustrates the dolly wheel encoder unit (100) in accordance with an embodiment of the present invention;
- FIG. 2 illustrates the lens encoder unit (200) in accordance with an embodiment of the present invention;
- FIG. 3 illustrates the pedestal pan encoder unit (300) in accordance with embodiments of the present invention;
- FIG. 4 illustrates the head pan and tilt encoder unit (400) in accordance with embodiments of the present invention;
- FIG. 5 illustrates the pedestal tilt encoder unit (500) in accordance with embodiments of the present invention.
- The following detailed description illustrates embodiments of the present disclosure and ways in which the disclosed embodiments can be implemented. Although some modes of carrying out the present disclosure have been disclosed, those skilled in the art would recognize that other embodiments for carrying out or practicing the present disclosure are also possible.
- The present invention describes an opto-mechanical camera position tracking device.
- Typical prior art lacks the ability to track the position of the camera and camera data for use in virtual production in films and augmented reality graphics in the broadcast industry seamlessly in real time.
- The embodiments of the present invention relate to an opto-mechanical camera position tracking device which gathers real-time positional data of the camera in three-dimensional (3D) space and the zoom value of the camera lens. The said camera tracking device attaches to an existing camera crane or jib, detects the necessary positional and optical data, processes it, and sends it to a rendering engine for seamless, real-time visualization.
- According to the embodiments of the present invention, the camera position tracking device comprises of a Dolly Wheel Encoder Unit (100); a Lens Encoder Unit (200); a Pedestal Pan Encoder Unit (300); a Head Pan and Tilt Encoder Unit (400) and a Pedestal Tilt Encoder Unit (500).
- The dolly wheel encoder unit (100) consists of an encoder (101), an encoder shaft screw (102), an encoder wheel (103), a grub screw (104), a spring (105), a trolley wheel extension rod (106) and a wheel encoder mounting plate (107).
- The jib pedestal is mounted on the wheels and placed on the track. The Dolly Wheel Encoder (100) along with its parts is fixed to the base of the jib pedestal to collect the rotational data of the wheels.
- When the jib is moved on the track, the encoder wheel (103) rotates and the rotational data of the wheel is converted to digital data. The digital data is sent to the printed circuit board (PCB) through the cables. The PCB along with the processor converts the data to the FreeD protocol and sends the FreeD data to the computer system.
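The conversion from wheel rotation to track position described above can be sketched as follows. This is an illustrative example only; the counts-per-revolution and wheel diameter are assumed values, not taken from the patent.

```python
import math

COUNTS_PER_REV = 4096      # assumed encoder counts per wheel revolution
WHEEL_DIAMETER_MM = 100.0  # assumed diameter of the encoder wheel (103)

def dolly_position_mm(encoder_counts: int) -> float:
    """Linear distance travelled along the track for a raw encoder count."""
    circumference = math.pi * WHEEL_DIAMETER_MM
    return encoder_counts / COUNTS_PER_REV * circumference
```

One full wheel revolution thus maps to one circumference of travel; the resolution of the position estimate is the circumference divided by the counts per revolution.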
- The lens encoder unit (200) consists of a teeth (201), a drive mounting bracket (202), an encoder (203), a hub (204), and a grub screw (206).
- The focus control of a camera consists of a ring with gear teeth. The Lens Encoder Unit (200) along with its components is fixed to the gear teeth of the focus ring of a camera to collect the rotational data of the gear teeth.
- The gear teeth in the ring move when the focus of the camera is adjusted. The rotational data of the gear teeth is converted to digital data. The digital data is sent to the PCB through the cables. The PCB along with the processor converts the data to the FreeD protocol and sends the FreeD data to the computer system. The Lens Encoder Unit (200) can also process the zoom data.
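A lens ring encoder reports raw counts, not optical values, so a calibration table is typically used to map counts to a zoom or focus value. The sketch below shows one plausible approach using linear interpolation between calibration points; the table entries are invented for illustration and are not from the patent.

```python
from bisect import bisect_left

# Assumed (encoder_count, focal_length_mm) calibration pairs, e.g. as
# captured during an auto-calibration pass across the lens range.
CAL = [(0, 24.0), (1000, 35.0), (2500, 70.0), (4000, 200.0)]

def zoom_mm(count: int) -> float:
    """Interpolated focal length for a raw lens-ring encoder count."""
    counts = [c for c, _ in CAL]
    if count <= counts[0]:
        return CAL[0][1]
    if count >= counts[-1]:
        return CAL[-1][1]
    i = bisect_left(counts, count)
    (c0, z0), (c1, z1) = CAL[i - 1], CAL[i]
    t = (count - c0) / (c1 - c0)  # fractional position between the two points
    return z0 + t * (z1 - z0)
```

Clamping at the table ends keeps the output sane if the ring is driven slightly past its calibrated range.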
- The pedestal pan encoder unit (300) consists of a dolly wheel encoder (301), a gear (304), a pinion gear (305) and a shaft (306).
- The Jib Arm is mounted above the jib pedestal. The Pedestal Pan Encoder Unit (300) along with its parts mentioned above is fixed to the connecting point of the Jib Arm and Jib Pedestal.
- When the Jib Arm is moved left and right (Pan Left, Pan Right), the mounted gear beneath the Jib Arm rotates and the rotational data of the gear is converted to digital data. The digital data is sent to the PCB through the cables. The PCB along with the processor converts the data to the FreeD protocol and sends the FreeD data to the computer system.
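Because the encoder rides on a pinion gear (305) meshed with the pan gear, the pan angle follows from the encoder count and the gear ratio. A minimal sketch, with assumed resolution and ratio values:

```python
COUNTS_PER_REV = 8192  # assumed encoder counts per pinion revolution
GEAR_RATIO = 5.0       # assumed pinion revolutions per full pan revolution

def pan_angle_deg(counts: int) -> float:
    """Pan angle in degrees, normalised to the range [-180, 180)."""
    degrees = counts / (COUNTS_PER_REV * GEAR_RATIO) * 360.0
    return (degrees + 180.0) % 360.0 - 180.0
```

The same count-to-angle conversion applies to the tilt axes, with the appropriate ratio for each gear pair.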
- The head pan and tilt encoder unit (400) consists of a teeth gear (401), an arm (402), an attachment shaft (306), drive mounting bracket (202), encoder (405); a head pan encoder clamp (406), a head pan encoder clamp connector (407), a tilt encoder mounting block (408), a tilt encoder mounting shaft (306), and a tilt encoder mounting shaft tap (410).
- The Camera is mounted on the Jib head. The Head Pan and Tilt Encoder Unit (400) along with its parts is fixed to the Jib Head.
- When the Jib Head is moved left and right (Pan Left, Pan Right) or moved up and down (Tilt Up, Tilt Down), the mounted gear near the Jib Head rotates and the rotational data of the gear is converted to digital data. The digital data is sent to the PCB through the cables. The PCB along with the processor converts the data to the FreeD protocol and sends the FreeD data to the computer system.
- The pedestal tilt encoder unit (500) consists of an encoder (501), a pedestal tilt block (502), a pedestal tilt block arm (503), a pedestal tilt block connect (504), and a pedestal tilt block motor mount (505).
- The Jib Arm is mounted above the Jib pedestal. The Pedestal Tilt Encoder Unit (500) with its parts is fixed to the connecting point of the Jib Arm and Jib Pedestal.
- When the Jib Arm is moved up and down (Tilt Up, Tilt Down), the mounted gear beneath the Jib Arm rotates and the rotational data of the gear is converted to digital data. The digital data is sent to the PCB through the cables. The PCB along with the processor converts the data to the FreeD protocol and sends the FreeD data to the computer system.
- The camera is mounted on the Jib Head, hence the entire camera movement is based on the movement of JIB.
- The translation data of the jib is taken through the connected wires from the control unit. The data includes jib forward and backward data, data from the dolly wheel, pan and tilt data from the pedestal, pan and tilt data from the head, rotation data of the head, and the optical data of the lens.
- The custom-designed printed circuit board (PCB) with the processor is used to compile and synchronize the above-mentioned data and send it in the open-source FreeD protocol, which can be consumed by leading virtual production engines used in the broadcast and film-making industries.
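For illustration, packing the synchronized data into a FreeD "Type D1" camera position message and sending it over UDP might look like the sketch below. The field scaling (angles in 1/32768 degree, positions in 1/64 mm, 24-bit signed fields, checksum byte) follows common descriptions of the FreeD format; verify against the actual FreeD specification before relying on it, and note the port number is an arbitrary example.

```python
import socket

def _int24(value: int) -> bytes:
    """Signed 24-bit big-endian field (two's complement)."""
    return (value & 0xFFFFFF).to_bytes(3, "big")

def freed_d1_packet(cam_id, pan, tilt, roll, x, y, z, zoom, focus) -> bytes:
    """Build a 29-byte FreeD Type D1 message from tracking values."""
    body = bytes([0xD1, cam_id])
    body += _int24(round(pan * 32768))   # angles in 1/32768 degree
    body += _int24(round(tilt * 32768))
    body += _int24(round(roll * 32768))
    body += _int24(round(x * 64))        # positions in 1/64 mm
    body += _int24(round(y * 64))
    body += _int24(round(z * 64))
    body += _int24(zoom)                 # raw lens encoder values
    body += _int24(focus)
    body += b"\x00\x00"                  # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF
    return body + bytes([checksum])

def send_packet(packet: bytes, host="127.0.0.1", port=40000):
    """Send one FreeD datagram to a rendering engine listener."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(packet, (host, port))
```

Streaming the same packet to several destinations is just repeated `sendto` calls, which matches the stated ability to feed multiple systems simultaneously.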
- A visual tool simulates the working of the camera tracker in real-time for easy diagnosis and observation of the camera tracker.
- As the entire camera tracker's sensors and encoders are mounted on the jib, no additional markers or sensors need to be mounted on the ceiling or walls. The camera tracker therefore provides an easy working environment for the technical crew and is not defined or restricted by a specific perimeter. This feature also allows the present invention to be used outdoors seamlessly.
- The present invention is an external-marker-less, encoder-based camera tracking system which reduces the set-up time substantially to a couple of hours and eliminates lag, drift, and blackout because no line of sight is required. The entire camera tracker's sensors and encoders are mounted on the jib. It is easily portable and hence can be used both indoors and outdoors.
- According to the embodiments of the present invention, the present invention provides easy set-up and calibration and also enables the user to easily check whether all the sensors are working.
- The present invention is compatible with a variety of jibs, professional cameras, and lenses, and its auto-calibration feature allows calibration of different professional cameras and lenses in a few minutes.
- The present invention allows tracking data from a camera to be streamed to multiple systems simultaneously for additional augmentation.
Claims (8)
1. An opto-mechanical camera position tracking device, the said device comprising:
a dolly wheel encoder unit (100);
a lens encoder unit (200);
a pedestal pan encoder unit (300);
a head pan and tilt encoder unit (400);
a pedestal tilt encoder unit (500);
a printed circuit board (PCB);
characterized by sensors and encoders mounted on the jib and camera, gathering real-time positional data of the camera in three-dimensional (3D) space and the zoom value of the camera lens without using any line of sight or markers, and having zero drift.
2. The device as claimed in claim 1 , wherein the dolly wheel encoder unit (100) consists of an encoder (101), an encoder shaft screw (102), an encoder wheel (103), a grub screw (104), a spring (105), a trolley wheel extension rod (106), and a wheel encoder mounting plate (107) and is fixed to the base of the jib pedestal to collect the rotational data of the wheels.
3. The device as claimed in claim 1, wherein the lens encoder unit (200) consists of teeth (201), a drive mounting bracket (202), an encoder (203), a hub (204), and a grub screw (206) and is fixed to the gear teeth of the zoom ring of a camera to collect the rotational data of the gear teeth.
4. The device as claimed in claim 1 , wherein the pedestal pan encoder unit (300) consists of a dolly wheel encoder (301), a gear (304), a pinion gear (305), and a shaft (306) and is connected to the intersecting point of the jib arm and jib pedestal.
5. The device as claimed in claim 1, wherein the head pan and tilt encoder unit (400) consists of a teeth gear (401), an arm (402), an attachment shaft (306), a drive mounting bracket (202), an encoder (405), a head pan encoder clamp (406), a head pan encoder clamp connector (407), a tilt encoder mounting block (408), a tilt encoder mounting shaft (306), and a tilt encoder mounting shaft tap (410) and is connected to the jib head on which the camera is mounted.
6. The device as claimed in claim 1 , wherein the pedestal tilt encoder unit (500) consists of an encoder (501), a pedestal tilt block (502), a pedestal tilt block arm (503), a pedestal tilt block connect (504), and a pedestal tilt block motor mount (505) and is connected to the intersecting point of the jib arm and jib pedestal.
7. The device as claimed in claim 1, wherein the printed circuit board (PCB) with the processor is used to compile and synchronize the data, and the processed data is sent to the real-time rendering machine via the connectivity port for real-time visualization.
8. The device as claimed in claim 1 , wherein the tracking data from a camera is streamed to multiple systems simultaneously for additional augmentation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
IN202241065501 | 2022-11-15 | ||
IN202241065501 | 2022-11-15 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20240159527A1 true US20240159527A1 (en) | 2024-05-16 |
US12111149B2 US12111149B2 (en) | 2024-10-08 |
Family
ID=91029062
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/111,533 Active US12111149B2 (en) | 2022-11-15 | 2023-02-18 | Opto-mechanical camera position tracking device |
Country Status (1)
Country | Link |
---|---|
US (1) | US12111149B2 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110280565A1 (en) * | 2010-05-14 | 2011-11-17 | Chapman Leonard T | Dual loop camera stabilization systems and methods |
US20210321022A1 (en) * | 2020-04-10 | 2021-10-14 | Ricky Galante | Means and apparatus for enhancing visibility of proximate subsurface features for vessel |
Also Published As
Publication number | Publication date |
---|---|
US12111149B2 (en) | 2024-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7170736B2 (en) | interactive augmentation or virtual reality device | |
EP3350653B1 (en) | General spherical capture methods | |
CN104536579B (en) | Interactive three-dimensional outdoor scene and digital picture high speed fusion processing system and processing method | |
JP7047394B2 (en) | Head-mounted display device, display system, and control method for head-mounted display device | |
CN108377381A (en) | Immersion VR Video Rendering method and devices | |
US20070247457A1 (en) | Device and Method for Presenting an Image of the Surrounding World | |
US11353708B1 (en) | Custom mixed reality smart glasses and software for vision impaired use | |
JP2016062486A (en) | Image generation device and image generation method | |
CN108022302A (en) | A kind of sterically defined AR 3 d display devices of Inside-Out | |
WO2017065348A1 (en) | Collaboration method using head mounted display | |
CN203746012U (en) | Three-dimensional virtual scene human-computer interaction stereo display system | |
CN112655202B (en) | Reduced bandwidth stereoscopic distortion correction for fisheye lenses of head-mounted displays | |
WO2018158765A1 (en) | Display system with video see-through | |
US10366542B2 (en) | Audio processing for virtual objects in three-dimensional virtual visual space | |
US11151804B2 (en) | Information processing device, information processing method, and program | |
CN111947650A (en) | Fusion positioning system and method based on optical tracking and inertial tracking | |
US12111149B2 (en) | Opto-mechanical camera position tracking device | |
JP7501044B2 (en) | DISPLAY SYSTEM, INFORMATION PROCESSING DEVICE, AND DISPLAY CONTROL METHOD FOR DISPLAY SYSTEM | |
US9989762B2 (en) | Optically composited augmented reality pedestal viewer | |
US10922888B2 (en) | Sensor fusion augmented reality eyewear device | |
KR200398885Y1 (en) | Augmented reality experiencing telescope by using visual tracking | |
CN114202639A (en) | High-presence visual perception method based on VR and related device | |
US11619814B1 (en) | Apparatus, system, and method for improving digital head-mounted displays | |
CN112053444A (en) | Method for superimposing virtual objects based on optical communication means and corresponding electronic device | |
EP4325842A1 (en) | Video display system, information processing device, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |