CN106780617B - Virtual reality system and positioning method thereof - Google Patents
- Publication number: CN106780617B
- Application number: CN201611045598.1A
- Authority
- CN
- China
- Prior art keywords
- camera
- straight line
- light source
- determining
- coordinates
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
Abstract
The invention discloses a virtual reality system and a positioning method thereof. The system comprises a spherical light source arranged on a target to be positioned, a first camera and a second camera arranged around the target, and a processing unit connected to both cameras. The two cameras acquire images of the spherical light source at the same frequency, and the light source appears in the images as a circular light spot. For each pair of frames acquired at the same moment, the processing unit determines a first straight line through the center of the circular spot in the first camera's image and the optical center of the first camera, and a second straight line through the center of the circular spot in the second camera's image and the optical center of the second camera, then determines the position of the spherical light source from the two lines. No curve needs to be fitted in advance to the relation between the source-to-camera distance and the spot diameter, so the loss of positioning accuracy caused by curve fitting is avoided.
Description
Technical Field
The invention relates to the technical field of virtual reality, in particular to a virtual reality system and a positioning method thereof.
Background
Existing virtual reality positioning schemes generally use a single camera to track a spherical light source mounted on the target to be positioned. The center of the circular light spot that the light source forms in the camera image gives the direction of the light source in the camera coordinate system. When the diameter of the spherical light source is fixed, the spot diameter is a monotonically decreasing function of the distance between the light source and the camera, so the spatial position of the light source can be determined by combining this distance with the direction.
However, the spot diameter does not decrease linearly with distance, and strict curve fitting is required to achieve a good positioning effect; the curve-fitting process thus becomes a bottleneck and limits positioning accuracy.
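To see why fitting is needed, consider a pinhole-camera approximation (an illustrative assumption; the patent only states that the relation is monotonically decreasing and nonlinear): a sphere of real diameter D at distance Z in front of a camera with focal length f appears as a spot of roughly d = f * D / Z pixels, inversely proportional to distance rather than linear in it. The numeric values below are assumptions for illustration only:

```python
# Pinhole approximation (an illustrative assumption, not the patent's model).
f = 800.0   # assumed focal length, in pixels
D = 0.04    # assumed sphere diameter, in meters

def spot_diameter_px(Z):
    """Apparent spot diameter in pixels at distance Z (meters): d = f * D / Z."""
    return f * D / Z

# Monotonically decreasing but nonlinear: halving the distance doubles the
# diameter instead of changing it by a fixed amount.
print(spot_diameter_px(1.0), spot_diameter_px(2.0), spot_diameter_px(4.0))
```

A lookup from measured spot diameter back to distance therefore needs a fitted curve over the working range, which is exactly the step the invention removes.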
Disclosure of Invention
The invention provides a virtual reality system and a positioning method thereof that solve the problem of existing virtual reality positioning schemes, which require strict curve fitting of the functional relation between the source-to-camera distance and the circular-spot diameter, a process that limits positioning accuracy.
According to one aspect of the present invention, there is provided a virtual reality system comprising: a spherical light source disposed on the target to be positioned, a first camera and a second camera disposed around the target to be positioned, and a processing unit connected to the first camera and the second camera, respectively;
the first camera and the second camera acquire images of the spherical light source according to the same frequency, and the spherical light source presents circular light spots in the images;
the processing unit is used for receiving two frames of images acquired by the first camera and the second camera at the same moment, determining a first straight line passing through the circle center of a circular light spot in the image acquired by the first camera and the optical center of the first camera, determining a second straight line passing through the circle center of the circular light spot in the image acquired by the second camera and the optical center of the second camera, and determining the position of the spherical light source according to the first straight line and the second straight line.
According to another aspect of the present invention, there is provided a virtual reality positioning method, comprising:
setting a spherical light source on a target to be positioned, and arranging a first camera and a second camera around the target to be positioned;
acquiring images by using a first camera and a second camera according to the same frequency, wherein a spherical light source presents a circular light spot in the images;
for two frames of images acquired at the same moment by the first camera and the second camera, determining a first straight line passing through the circle center of a circular light spot in the image acquired by the first camera and the optical center of the first camera, and a second straight line passing through the circle center of the circular light spot in the image acquired by the second camera and the optical center of the second camera;
and determining the position of the spherical light source according to the first straight line and the second straight line.
The beneficial effects of the invention are as follows: in the embodiment of the invention, two cameras arranged around a spherical light source collect images; a first straight line is determined through the center of the circular spot in the first camera's image and the optical center of the first camera, a second straight line through the center of the circular spot in the second camera's image and the optical center of the second camera, and the position of the spherical light source is then determined from the two lines. Because the diameter of the spot in the image plays no part in the positioning process, no curve needs to be fitted in advance to the relation between the source-to-camera distance and the spot diameter, and the loss of positioning accuracy caused by curve fitting is avoided.
Drawings
FIG. 1 is a functional block diagram of a virtual reality system provided by one embodiment of the invention;
FIG. 2 is a positioning schematic diagram of a virtual reality system according to an embodiment of the invention;
fig. 3 is a flowchart of a positioning method of a virtual reality system according to an embodiment of the present invention.
Detailed Description
The design concept of the invention is as follows: the existing virtual reality positioning scheme uses a single camera and relies on the principle that objects appear larger when near and smaller when far; a good positioning effect is achieved only after strict curve fitting of the relation between the source-to-camera distance and the spot diameter, and this fitting becomes a bottleneck that limits positioning accuracy. The invention instead tracks the spherical light source with two cameras simultaneously, obtains the spatial direction of the light source as observed from two different positions, and measures the position of the light source directly using the principle that two lines intersect at a point. Because the diameter of the light source in the image plays no part in positioning, no curve needs to be fitted in advance to the distance-diameter relation, and the loss of positioning accuracy caused by curve fitting is avoided.
Example 1
Fig. 1 is a functional block diagram of a virtual reality system according to an embodiment of the present invention. As shown in fig. 1, the virtual reality system includes a spherical light source 110 disposed on the target to be positioned, a first camera 120 and a second camera 130 disposed around the target to be positioned, and a processing unit 140 connected to the first camera 120 and the second camera 130, respectively. The spherical light source 110 may be an infrared light source or a visible light source. Compared with a visible light source, an infrared light source better resists interference such as background colors and ambient light, so an infrared light source gives better results. The spherical light source 110 may be disposed on a head-mounted display device of the virtual reality system, to track the movement of the user's head, or on a handle controller, to track the movement of the user's hand.
The first camera 120 and the second camera 130 collect images of the spherical light source 110 at the same frequency, and the spherical light source 110 respectively presents a circular spot in the images collected by the first camera 120 and the second camera 130.
The processing unit 140 receives two frames of images acquired by the first camera 120 and the second camera 130 at the same time, determines a first straight line passing through the center of a circular light spot in the image acquired by the first camera 120 and the optical center of the first camera 120, and a second straight line passing through the center of a circular light spot in the image acquired by the second camera 130 and the optical center of the second camera 130, and determines the position of the spherical light source 110 according to the first straight line and the second straight line.
The processing unit 140 includes an image processing module 141 and a coordinate calculation module 142. The coordinate calculation module 142 establishes a spatial coordinate system with the optical center of the first camera 120 as an origin, the x-axis is parallel to the bottom edge of the image plane of the first camera 120, the y-axis is perpendicular to the bottom edge of the image plane of the first camera 120, and the z-axis is the optical axis of the first camera 120. The coordinate calculation module 142 determines the coordinates of the optical center of the second camera 130 based on the relative position between the second camera 130 and the first camera 120.
During use of the virtual reality system of this embodiment, the relative positions of the first camera 120 and the second camera 130 are fixed. The manufacturer may adjust and fix the relative positions of the two cameras during production, so that they are not user-adjustable; alternatively, the user may adjust the relative positions to specified values before use. To simplify computation and increase processing speed, the first camera 120 and the second camera 130 in this embodiment are preferably of the same model, so that the images they acquire are the same size and the distance from each image plane to its optical center is equal. The image plane of the first camera 120 is perpendicular to the ground, with its bottom edge parallel to the ground; the image plane of the second camera 130 lies in the same plane as that of the first camera 120, the bottom edges of the two image planes lie on the same straight line, and the distance between the two cameras is a preset distance x0. The coordinate calculation module 142 therefore takes the coordinates of the optical center of the first camera 120 as (0, 0, 0) and the coordinates of the optical center of the second camera 130 as (x0, 0, 0).
The image processing module 141 receives the two frames acquired by the first camera 120 and the second camera 130 at the same moment, identifies the circular light spot in each frame by scanning and the like, and determines the center coordinates of each spot. Since the first camera 120 and the second camera 130 are of the same model, the distance from each image plane to its optical center is z0; the image processing module thus determines the center of the circular spot in the image acquired by the first camera 120 as (x1, y1, z0), and the center of the circular spot in the image acquired by the second camera 130 as (x2, y2, z0), as shown in fig. 2.
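The patent does not specify the spot-identification algorithm beyond "scanning and the like". One minimal sketch, assuming a grayscale image in which the spot is the brightest region, is an intensity-threshold centroid (the function name and the threshold value are assumptions):

```python
import numpy as np

def spot_center(image, threshold=200):
    """Estimate the center of a bright circular spot as the intensity
    centroid of all pixels at or above `threshold`.

    `image` is a 2-D grayscale array; `threshold` is an assumed
    brightness cutoff separating the light-source spot from the
    background. Returns (x, y) in pixel coordinates, or None if no
    pixel reaches the threshold.
    """
    rows, cols = np.nonzero(image >= threshold)
    if rows.size == 0:
        return None
    # For a symmetric circular spot the centroid coincides with the
    # geometric center of the circle.
    return (cols.mean(), rows.mean())
```

The pixel center found this way would still have to be converted to the (x, y, z0) camera coordinates used below, via the image size and pixel pitch.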
The coordinate calculation module 142 determines the first straight line through the two points given by the optical center (0, 0, 0) of the first camera 120 and the spot center (x1, y1, z0) in the image acquired by the first camera 120; the equation of the first straight line is:

x / x1 = y / y1 = z / z0
the coordinate calculation module 142 calculates the coordinates (x 0 0, 0) and the center coordinates (x) of the circular spot in the image acquired by the second camera 130 2 ,y 2 ,z 0 ) Determining a second straight line passing through the two points, wherein the equation of the second straight line is as follows:
the processing unit 140 further includes a linear relationship determination module 143 and a position determination module 144. The linear relationship determination module 143 is configured to determine a positional relationship between the first line and the second line. In theory, the first straight line and the second straight line both pass through the center of the spherical light source 110, so that the two straight lines necessarily have an intersection point, but are affected by an error, and the first straight line and the second straight line actually calculated are likely to have no intersection point. Therefore, the straight line relation judging module 143 judges the position relation between the first straight line and the second straight line first, when judging that the first straight line and the second straight line intersect, the position determining module 144 calculates an intersection point of the two straight lines according to the equation of the first straight line and the equation of the second straight line, and takes the coordinate of the intersection point as the space coordinate of the spherical light source 110; when determining that the first line and the second line are different, the position determining module 144 uses the coordinates of the midpoint of the common vertical line segment of the first line and the second line as the spatial coordinates of the spherical light source 110.
In the process of determining the position of the spherical light source 110, the diameter of the circular light spot represented by the spherical light source 110 in the image is not used, so that curve fitting is not required to be performed on the relationship between the distance between the spherical light source 110 and the camera and the diameter of the circular light spot in advance, and the accuracy of positioning is prevented from being affected by the curve fitting process.
Example two
Fig. 3 is a flowchart of a positioning method of a virtual reality system according to an embodiment of the present invention, where, as shown in fig. 3, the positioning method of a virtual reality system provided in this embodiment includes:
step S310: a spherical light source is arranged on the target to be positioned, and a first camera and a second camera are arranged around the target to be positioned. The spherical light source can be an infrared light source or a visible light source, and the infrared light source has better effect. The spherical light source may be disposed on a head-mounted display device of the virtual reality system for tracking movement of the user's head; may also be provided on the handle controller for tracking the movement of the user's hand.
Step S320: the first camera and the second camera are used for collecting images according to the same frequency, and the spherical light source presents circular light spots in the images.
Step S330: for two frames of images acquired at the same moment by the first camera and the second camera, determining a first straight line passing through the circle center of a circular light spot in the image acquired by the first camera and the optical center of the first camera, and a second straight line passing through the circle center of the circular light spot in the image acquired by the second camera and the optical center of the second camera.
Preferably, step S330 specifically includes: establishing a space coordinate system by taking an optical center of the first camera as an origin, wherein an x-axis is parallel to the bottom edge of the image plane of the first camera, a y-axis is perpendicular to the bottom edge of the image plane of the first camera, and a z-axis is an optical axis of the first camera; determining the coordinates of the optical center of the second camera according to the relative position between the second camera and the first camera; respectively determining coordinates of circular light spots in two frames of images acquired at the same time by the first camera and the second camera; determining a first straight line passing through the two points according to the coordinates of the circle centers of the circular light spots in the image acquired by the first camera and the coordinates of the optical center of the first camera; and determining a second straight line passing through the two points according to the coordinates of the circle center of the circular light spot in the image acquired by the second camera and the coordinates of the optical center of the second camera.
In order to simplify the operation process and increase the processing speed, the embodiment further comprises adjusting the position of the first camera before starting to acquire the image, so that the image plane of the first camera is perpendicular to the ground and the bottom edge of the image plane is parallel to the ground; adjusting the position of the second camera to enable the image plane of the second camera to be on the same plane as the image plane of the first camera, and enabling the bottom edge of the image plane of the second camera to be on the same straight line with the bottom edge of the image plane of the first camera; and adjusting the distance between the first camera and the second camera to be equal to the preset distance.
Step S340: determine the position of the spherical light source from the first straight line and the second straight line. First determine the positional relationship between the two lines: when they intersect, take the coordinates of their intersection point as the spatial coordinates of the spherical light source; when they are skew, take the coordinates of the midpoint of their common perpendicular segment as the spatial coordinates of the spherical light source.
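Steps S310 through S340 can be checked numerically with synthetic data. The camera separation, image-plane distance, and source position below are all assumed values for illustration; with noise-free data the two lines intersect exactly at the source:

```python
import numpy as np

# Assumed setup: camera separation x0 = 0.2 m, image-plane distance z0 = 0.05 m.
x0, z0 = 0.2, 0.05
c1 = np.array([0.0, 0.0, 0.0])       # optical center of the first camera (origin)
c2 = np.array([x0, 0.0, 0.0])        # optical center of the second camera

source = np.array([0.3, 0.4, 2.0])   # the "unknown" spherical light source

def project(center, point):
    """Spot center on the image plane z = z0 in front of a camera at `center`
    (both cameras look along the z-axis, per the setup in steps S310-S320)."""
    d = point - center
    return center + (z0 / d[2]) * d  # scaled so the result lies on z = z0

p1, p2 = project(c1, source), project(c2, source)  # step S330: spot centers
d1, d2 = p1 - c1, p2 - c2                          # directions of the two lines

# Step S340: closest point between the two lines (here their intersection).
aa, bb, cc = d1 @ d1, d1 @ d2, d2 @ d2
w = c1 - c2
den = aa * cc - bb * bb
t1 = (bb * (d2 @ w) - cc * (d1 @ w)) / den
t2 = (aa * (d2 @ w) - bb * (d1 @ w)) / den
recovered = ((c1 + t1 * d1) + (c2 + t2 * d2)) / 2
```

The recovered position matches the synthetic source, and at no point is the spot diameter used, which is the property the method claims.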
The foregoing is merely a specific embodiment of the invention; those skilled in the art can make further modifications and variations in light of the above teachings. It is to be understood that the foregoing detailed description is provided to illustrate the invention more fully, and that the scope of the invention is defined by the appended claims.
It should be noted that:
various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some or all of the components in accordance with embodiments of the present invention may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present invention can also be implemented as an apparatus or device program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present invention may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
The virtual reality system of this invention conventionally includes a processor and a computer program product or computer readable medium in the form of a memory. The memory may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. The memory has memory space for program code to perform any of the method steps described above. For example, the memory space for the program code may include individual program code for implementing the various steps in the above method, respectively. The program code can be read from or written to one or more computer program products. These computer program products comprise a program code carrier such as a hard disk, a Compact Disc (CD), a memory card or a floppy disk. Such computer program products are typically portable or fixed storage units. The memory cells may be similarly arranged memory segments, memory spaces, etc. The program code may be compressed, for example, in a suitable form. Typically, the storage unit comprises computer readable code, i.e. code that can be read by e.g. a processor, for performing the steps of the method according to the invention, which code, when executed, causes the virtual reality system to perform the steps of the method described above.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description. The language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter.
Claims (6)
1. A virtual reality system, comprising: a spherical light source disposed on the target to be positioned, a first camera and a second camera disposed around the target to be positioned, and a processing unit connected to the first camera and the second camera, respectively;
the first camera and the second camera acquire images of the spherical light source according to the same frequency, and the spherical light source presents circular light spots in the images;
the processing unit is used for receiving two frames of images acquired by the first camera and the second camera at the same moment, determining a first straight line passing through the circle center of a circular light spot in the image acquired by the first camera and the optical center of the first camera, and a second straight line passing through the circle center of the circular light spot in the image acquired by the second camera and the optical center of the second camera, and determining the position of the spherical light source according to the first straight line and the second straight line;
wherein the spherical light source is an infrared light source; alternatively, the spherical light source is a visible light source; the first camera and the second camera are cameras with the same model;
the image plane of the first camera is perpendicular to the ground, and the bottom edge of the image plane is parallel to the ground;
the image plane of the second camera is on the same plane as the image plane of the first camera, and the bottom edge of the image plane of the second camera is on the same straight line with the bottom edge of the image plane of the first camera;
the distance between the first camera and the second camera is a preset distance.
2. The system of claim 1, wherein the processing unit comprises an image processing module and a coordinate calculation module;
the coordinate calculation module is used for establishing a space coordinate system by taking the optical center of the first camera as an origin, wherein the x-axis is parallel to the bottom edge of the image plane of the first camera, the y-axis is perpendicular to the bottom edge of the image plane of the first camera, and the z-axis is the optical axis of the first camera; determining the coordinates of the optical center of the second camera according to the relative position between the second camera and the first camera;
the image processing module is used for receiving two frames of images acquired by the first camera and the second camera at the same moment, identifying circular light spots in the two frames of images, and respectively determining the center coordinates of the circular light spots;
the coordinate calculation module is further used for determining a first straight line passing through the two points according to the coordinates of the optical center of the first camera and the circle center coordinates of the circular light spots in the image acquired by the first camera; and determining a second straight line passing through the two points according to the coordinates of the optical center of the second camera and the center coordinates of the circular light spot in the image acquired by the second camera.
3. The system of claim 2, wherein the processing unit further comprises a line relationship determination module and a position determination module;
the line relationship determination module is configured to determine the positional relationship between the first straight line and the second straight line;
when the line relationship determination module determines that the first straight line and the second straight line intersect, the position determination module takes the coordinates of their intersection point as the spatial coordinates of the spherical light source;
when the line relationship determination module determines that the first straight line and the second straight line are skew, the position determination module takes the coordinates of the midpoint of the common perpendicular segment of the first straight line and the second straight line as the spatial coordinates of the spherical light source.
4. A method for locating a virtual reality system, comprising:
setting a spherical light source on a target to be positioned, and arranging a first camera and a second camera around the target to be positioned;
acquiring images by using a first camera and a second camera according to the same frequency, wherein a spherical light source presents a circular light spot in the images;
for two frames of images acquired at the same moment by the first camera and the second camera, determining a first straight line passing through the circle center of a circular light spot in the image acquired by the first camera and the optical center of the first camera, and a second straight line passing through the circle center of the circular light spot in the image acquired by the second camera and the optical center of the second camera;
determining the position of the spherical light source according to the first straight line and the second straight line;
the spherical light source is an infrared light source; alternatively, the spherical light source is a visible light source;
the first camera and the second camera are cameras with the same model; before starting to acquire the image, the method further comprises:
adjusting the position of the first camera to enable the image plane of the first camera to be perpendicular to the ground and the bottom edge of the image plane to be parallel to the ground;
adjusting the position of the second camera to enable the image plane of the second camera to be on the same plane as the image plane of the first camera, and enabling the bottom edge of the image plane of the second camera to be on the same straight line with the bottom edge of the image plane of the first camera;
and adjusting the distance between the first camera and the second camera to enable the distance to be equal to a preset distance.
5. The method of claim 4, wherein determining a first line passing through the center of the circular spot in the image acquired by the first camera and the optical center of the first camera, and a second line passing through the center of the circular spot in the image acquired by the second camera and the optical center of the second camera, specifically comprises:
establishing a space coordinate system by taking an optical center of the first camera as an origin, wherein an x-axis is parallel to the bottom edge of the image plane of the first camera, a y-axis is perpendicular to the bottom edge of the image plane of the first camera, and a z-axis is an optical axis of the first camera; determining the coordinates of the optical center of the second camera according to the relative position between the second camera and the first camera;
respectively determining coordinates of circular light spots in two frames of images acquired at the same time by the first camera and the second camera;
determining a first straight line passing through the two points according to the coordinates of the circle centers of the circular light spots in the image acquired by the first camera and the coordinates of the optical center of the first camera;
and determining a second straight line passing through the two points according to the coordinates of the circle center of the circular light spot in the image acquired by the second camera and the coordinates of the optical center of the second camera.
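The two straight lines of claim 5 can be sketched in a few lines of code. This is an illustrative reading of the claim, not the patent's implementation: it assumes an ideal pinhole model, a focal length f expressed in pixel units, and the rectified setup of claim 4, in which the second camera's optical center is offset from the origin along the x-axis by the preset baseline. All names (`ray_through_spot`, `baseline`, the sample pixel values) are hypothetical.

```python
# Sketch of claim 5: build the line through a camera's optical center and
# the circle center of the spot on its image plane. Under the patent's
# coordinate system the image plane sits at z = f in front of the origin,
# so the ray direction in camera coordinates is (u - cu, v - cv, f).

def ray_through_spot(optical_center, spot_px, principal_px, f):
    """Return (point, direction) for the straight line through the optical
    center and the spot center.

    spot_px / principal_px: pixel coordinates (u, v) of the spot center and
    the principal point; f: focal length in pixel units (illustrative).
    """
    u, v = spot_px
    cu, cv = principal_px
    direction = (u - cu, v - cv, float(f))
    return tuple(float(c) for c in optical_center), direction

# First camera at the origin (claim 5's coordinate system); the second
# camera's optical center is offset along x by the preset baseline (claim 4).
# Because the image planes are coplanar with parallel axes, the same
# direction formula applies to both cameras.
baseline = 0.2  # metres, illustrative
line1 = ray_through_spot((0.0, 0.0, 0.0), (420.0, 310.0), (320.0, 240.0), 800.0)
line2 = ray_through_spot((baseline, 0.0, 0.0), (260.0, 310.0), (320.0, 240.0), 800.0)
print(line1)  # -> ((0.0, 0.0, 0.0), (100.0, 70.0, 800.0))
```

In this rectified configuration the two directions differ only in their x-component (the disparity), which is what makes the subsequent intersection step of claim 6 well conditioned.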
6. The method of claim 5, wherein determining the position of the spherical light source according to the first straight line and the second straight line specifically comprises:
when the first straight line and the second straight line intersect, calculating the coordinates of their intersection point as the spatial coordinates of the spherical light source;
and when the first straight line and the second straight line are skew (do not intersect), calculating the coordinates of the midpoint of their common perpendicular segment as the spatial coordinates of the spherical light source.
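The two branches of claim 6 can be covered by one closest-approach computation: when the lines truly intersect, the two closest points coincide, so the midpoint of the (degenerate) common perpendicular segment is the intersection itself. The following is a minimal sketch, not the patent's implementation; `locate_source` and `eps` are hypothetical names.

```python
# Sketch of claim 6: midpoint of the common perpendicular segment between
# lines p1 + t*d1 and p2 + s*d2, using the standard closest-approach
# solution of the 2x2 linear system in t and s.

def locate_source(p1, d1, p2, d2, eps=1e-9):
    """Return the point halfway between the closest points of two lines."""
    sub = lambda a, b: tuple(x - y for x, y in zip(a, b))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero iff the direction vectors are parallel
    if abs(denom) < eps:
        raise ValueError("lines are parallel; source cannot be triangulated")
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    q1 = tuple(p + t * x for p, x in zip(p1, d1))  # closest point on line 1
    q2 = tuple(p + s * x for p, x in zip(p2, d2))  # closest point on line 2
    return tuple((x + y) / 2 for x, y in zip(q1, q2))

# Skew example: the x-axis versus a vertical line through (0, 1, 1).
print(locate_source((0, 0, 0), (1, 0, 0), (0, 1, 1), (0, 0, 1)))
# -> (0.0, 0.5, 0.0): midpoint of the common perpendicular segment
```

In practice the two rays almost never intersect exactly because of pixel quantization of the spot centers, which is presumably why the claim specifies the midpoint fallback: it is the point that minimizes the sum of squared distances to both lines.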
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611045598.1A CN106780617B (en) | 2016-11-24 | 2016-11-24 | Virtual reality system and positioning method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106780617A CN106780617A (en) | 2017-05-31 |
CN106780617B true CN106780617B (en) | 2023-11-10 |
Family
ID=58974278
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611045598.1A Active CN106780617B (en) | 2016-11-24 | 2016-11-24 | Virtual reality system and positioning method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106780617B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107644443B (en) * | 2017-09-01 | 2020-07-28 | 北京七鑫易维信息技术有限公司 | Parameter setting method and device in sight tracking equipment |
WO2019232800A1 (en) * | 2018-06-08 | 2019-12-12 | Hangzhou Taruo Information Technology Co., Ltd. | Automatic follow focus for camera device |
CN112052847B (en) * | 2020-08-17 | 2024-03-26 | 腾讯科技(深圳)有限公司 | Image processing method, apparatus, electronic device, and computer-readable storage medium |
CN112837375A (en) * | 2021-03-17 | 2021-05-25 | 北京七维视觉传媒科技有限公司 | Method and system for camera positioning inside real space |
CN115307559B (en) * | 2022-07-08 | 2023-10-24 | 国网湖北省电力有限公司荆州供电公司 | Target positioning method, remote laser cleaning method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004092826A1 (en) * | 2003-04-18 | 2004-10-28 | Appro Technology Inc. | Method and system for obtaining optical parameters of camera |
CN102855626A (en) * | 2012-08-09 | 2013-01-02 | 深圳先进技术研究院 | Methods and devices for light source direction calibration and human information three-dimensional collection |
CN103175851A (en) * | 2011-12-21 | 2013-06-26 | 北京兆维电子(集团)有限责任公司 | Calibrating tool and calibrating method for multi-camera scanning system |
CN206282232U (en) * | 2016-11-24 | 2017-06-27 | 北京小鸟看看科技有限公司 | A kind of virtual reality system |
Non-Patent Citations (3)
Title |
---|
CHOI K S; EDMUND Y. Automatic source camera identification using the intrinsic lens radial distortion. Optics Express, 2006. * |
XING Dekui; DA Feipeng; ZHANG Hu. Research and application of a precise positioning method for circular targets. Chinese Journal of Scientific Instrument, No. 12. * |
LI Hongyan. Obtaining the image coordinates of a circle center based on the geometric invariance of projective transformation. Computer and Modernization, No. 09. * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106780617B (en) | Virtual reality system and positioning method thereof | |
JP7057454B2 (en) | Improved camera calibration system, target, and process | |
EP3008708B1 (en) | Vision augmented navigation | |
US10127732B1 (en) | Stereo-based calibration apparatus | |
US10802606B2 (en) | Method and device for aligning coordinate of controller or headset with coordinate of binocular system | |
CN111462213A (en) | Equipment and method for acquiring 3D coordinates and dimensions of object in motion process | |
US10948994B2 (en) | Gesture control method for wearable system and wearable system | |
CN110378966B (en) | Method, device and equipment for calibrating external parameters of vehicle-road coordination phase machine and storage medium | |
CN110176038B (en) | Method, system and storage medium for calibrating camera of vehicle | |
WO2017126172A1 (en) | Information processing device, information processing method, and recording medium | |
CN113034612B (en) | Calibration device, method and depth camera | |
EP3343242B1 (en) | Tracking system, tracking device and tracking method | |
CN103344190B (en) | A kind of elastic arm attitude measurement method based on line sweep and system | |
US10634918B2 (en) | Internal edge verification | |
US11747651B2 (en) | Computer-implemented method for determining centring parameters for mobile terminals, mobile terminal and computer program | |
CN108416787A (en) | Workpiece linear edge localization method applied to Machine Vision Detection | |
CN111488775A (en) | Device and method for judging degree of fixation | |
CN112017212A (en) | Training and tracking method and system of face key point tracking model | |
US20180040138A1 (en) | Camera-based method for measuring distance to object (options) | |
CN110966981A (en) | Distance measuring method and device | |
US10358091B2 (en) | Automated vehicle mirror adjustment | |
JP2009260475A (en) | Information processor, information processing method, and program | |
CN110533775B (en) | Glasses matching method and device based on 3D face and terminal | |
CN110415196A (en) | Method for correcting image, device, electronic equipment and readable storage medium storing program for executing | |
CN206282232U (en) | A kind of virtual reality system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||