KR20130110798A - Building internal navigation apparatus and method for controlling distance and speed of camera - Google Patents
- Publication number
- KR20130110798A (Application No. KR1020120033044A)
- Authority
- KR
- South Korea
- Prior art keywords
- building
- camera
- moving speed
- eye camera
- right eye
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Image Analysis (AREA)
Abstract
The building navigation device of the present invention, capable of adjusting the camera interval and moving speed, includes left eye and right eye camera units for navigating a building, a moving speed adjusting unit for adjusting the moving speed of the left eye and right eye camera units, a camera spacing adjusting unit for adjusting the distance between the left eye and right eye camera units, and a control unit. When, during building navigation, the building photographed by the left eye and right eye camera units is not located within the comfort zone of the preset fusional area of Panum, the control unit extracts pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed and, by controlling the moving speed adjusting unit and the camera spacing adjusting unit, adjusts the spacing and moving speed of the left eye and right eye camera units to the extracted values. The pre-measured camera spacing and moving speed values are obtained in advance by varying the two cameras' spacing and moving speed for the scene in which the building is photographed, measuring the stimulus differences a person can perceive in response to each change, and measuring and storing the two-camera spacing and moving speed at which the just noticeable difference (JND) occurs.
Through this, when navigating the inside or outside of a building, the present invention varies the two-camera spacing and moving speed for the scene in which the building is photographed, measures the stimulus differences a person can perceive in response to each change, determines in advance the spacing and speed values at which the just noticeable difference (JND) occurs, and performs building navigation while adjusting the two cameras to those pre-measured values. This resolves the visual fatigue factors that can arise when a three-dimensional image is generated by navigating the inside or outside of a building.
Description
The present invention relates to a building navigation apparatus and method capable of adjusting the camera interval and moving speed. In particular, to resolve the visual fatigue factors that can arise when stereo rendering generates a three-dimensional image by navigating the inside or outside of a building, the spacing of the two cameras and their moving speed are varied for the scene in which the building is photographed, the stimulus differences a person can perceive in response to each change are measured, the two-camera spacing and moving speed values at which the just noticeable difference (JND) occurs are measured in advance, and building navigation is performed while the two cameras' spacing and moving speed are adjusted to the pre-measured values.
Recently, as demand for stereo 3D content has increased, research on stereo 3D content for various applications has been actively conducted. In addition, visualizations that use stereo rendering to let viewers immerse themselves in imaginary structures built with modeling tools such as Maya, 3D MAX, and Rhino are increasingly common.
Stereo 3D content works on the principle that images corresponding to the left and right visual fields are projected using the human binocular cue, so that the viewer feels a sense of three-dimensionality and depth.
Therefore, in stereo rendering, two cameras are arranged in the virtual space in a configuration similar to the human eyes to acquire images, and each image is guided to form on the retina of the corresponding eye, giving the user a sense of depth. Because such stereoscopic images differ from real-world human perception in how the images are acquired and processed, symptoms such as visual fatigue and dizziness may appear.
Recently, various approaches have been proposed to reduce the visual fatigue caused by stereo 3D contents.
Among them, a method of controlling image parallax using image processing generates an intermediate-view image when the parallax of the image exceeds the user's binocular spacing, taking the user's binocular disparity into account. By filling the holes produced in the captured image using the left and right image information of the interpolated view, it obtains more natural results than the conventional approach of interpolating from the pixel information around the holes.
In addition, there are methods that apply a saliency map and a depth map, finding the region of the image on which visual attention is concentrated and applying stereo rendering accordingly.
The saliency map approximates visual stimuli from the color, orientation, curvature, size, and motion of elements in the image.
M. Lang et al. used a saliency map to change the binocular disparity so that objects in the corresponding region were located in the comfort zone, and used an image warping technique to prevent holes from appearing in the image. However, this method has the limitation that distortion due to warping occurs.
Thus, in many cases, hole filling or image warping techniques based on image processing are mainly used to reduce the visual fatigue occurring in stereo 3D content.
However, while these methods can be applied to stereo synthesis based on 2D images, they are not applicable to the navigation of a 3D building model.
SUMMARY OF THE INVENTION The present invention has been made to solve the above problems. When a user navigates the inside or outside of a building, the spacing of the two cameras and their moving speed are varied for the scene in which the building is photographed, the stimulus differences a person can perceive in response to each change are measured, and the two-camera spacing and moving speed values at which the just noticeable difference (JND) occurs are measured in advance. The purpose of the present invention is to resolve the visual fatigue factors that can arise when a three-dimensional image is generated by navigating the inside or outside of a building, by adjusting the two cameras' spacing and moving speed to those pre-measured values.
To achieve the above object, the building navigation apparatus of the present invention, capable of adjusting the camera interval and moving speed, includes a moving speed adjusting unit for adjusting the moving speed of the left eye and right eye camera units used to navigate the building; a camera spacing adjusting unit for adjusting the distance between the left eye camera unit and the right eye camera unit; and a control unit which, when the building photographed by the left eye and right eye camera units is not located within the comfort zone of the preset fusional area of Panum during building navigation, extracts the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, and controls the moving speed adjusting unit and the camera spacing adjusting unit so as to adjust the spacing and moving speed of the left eye and right eye camera units to the extracted values. The pre-measured camera spacing and moving speed values are obtained by varying the two cameras' spacing and moving speed for the scene in which the building is photographed, measuring the stimulus differences a person can perceive in response to each change, and measuring and storing in advance the spacing and speed at which the just noticeable difference occurs.
The apparatus further includes a storage unit storing the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scenes in which a plurality of buildings are photographed; a stereo rendering unit which stereo-renders the left eye image and right eye image photographed through the left eye and right eye camera units to generate a 3D-format image containing both; and a display unit which outputs the 3D-format image generated by the stereo rendering unit.
When the building image captured by the left eye and right eye camera units leaves the fusional area of Panum in the negative parallax or positive parallax direction, the controller adjusts the distance between the left eye camera and the right eye camera according to the pre-measured camera spacing and moving speed values, so that the object is positioned within the comfort zone.
When the building is visible to only one of the left eye camera and the right eye camera, the controller determines that the building is located in the retinal rivalry area outside the comfort zone, and adjusts the distance between the left eye camera and the right eye camera according to the pre-measured camera spacing and moving speed values, so that the building is either located within the comfort zone or moved outside it so as to be invisible to both cameras.
The building navigation method of the present invention is carried out in the building navigation device and comprises: a step (a) of setting the fusional area of Panum and the comfort zone through the left eye camera and the right eye camera during building navigation; a step (b) of, when the building is located outside the fusional area of Panum while navigating the building, extracting the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed and adjusting the spacing and moving speed of the left eye camera and the right eye camera to the extracted values; and a step (c) of, when the building is within the fusional area of Panum but located in the retinal rivalry area outside the comfort zone, extracting the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed and adjusting the spacing and moving speed of the left eye camera and the right eye camera to the extracted values.
The pre-measured camera spacing and moving speed values are obtained by varying the two cameras' spacing and moving speed for the scene in which the building is photographed, measuring the stimulus differences a person can perceive in response to each change, and measuring and storing in advance the spacing and speed at which the just noticeable difference (JND) occurs.
In step (b), when the building leaves the fusional area of Panum in the negative parallax or positive parallax direction, the spacing and moving speed of the left eye camera and the right eye camera are adjusted to the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, so that the building is located within the comfort zone.
In step (c), when the building is visible to only one of the left eye camera and the right eye camera and is therefore located in the retinal rivalry area outside the comfort zone, the spacing and moving speed of the left eye camera and the right eye camera are adjusted to the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, so that the building is either located within the comfort zone or moved outside it so as to be invisible to both cameras.
When navigating the inside or outside of a building, the present invention varies the spacing of the two cameras and their moving speed for the scene in which the building is photographed, measures the stimulus differences a person can perceive in response to each change, measures the spacing and speed at which the just noticeable difference (JND) occurs, and performs building navigation while adjusting the two cameras to those measured values. This resolves the visual fatigue factors that can arise when a three-dimensional image is generated by navigating the inside or outside of a building.
FIG. 1 is a view for explaining the fusional area of Panum formed by two cameras according to an embodiment of the present invention;
FIG. 2 is a view for explaining a comfort zone formed by two cameras according to an embodiment of the present invention;
FIG. 3 is a view for explaining an off-axis-based camera interval adjustment method according to an embodiment of the present invention;
FIG. 4 is an internal configuration diagram of a building navigation apparatus capable of adjusting the two-camera interval and moving speed according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating reference images used during JND measurement according to an embodiment of the present invention;
FIG. 6 is a view for explaining the range of the JND according to an embodiment of the present invention;
FIG. 7 is a flowchart illustrating a process of adjusting the two-camera distance and moving speed during building navigation according to an embodiment of the present invention;
FIG. 8 is a graph comparing changes in the two-camera distance during building navigation according to an embodiment of the present invention; and
FIG. 9 is a view comparing results before and after adjusting the two-camera interval and moving speed according to an embodiment of the present invention.
DETAILED DESCRIPTION Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily implement the present invention.
FIG. 1 is a view for explaining the fusional area of Panum formed by two cameras according to an embodiment of the present invention, and FIG. 2 is a view for explaining a comfort zone formed by two cameras according to an embodiment of the present invention.
Referring to FIGS. 1 and 2, when a person looks at the screen, the image projected on the screen is formed on both eyes, and the distance between the two image points on the eyes is called parallax. Parallax is divided into positive parallax, zero parallax, and negative parallax.
Positive parallax refers to the case where the image is fused behind the screen and the parallax is less than or equal to the interocular distance. The larger the parallax value, the farther behind the screen the image appears. In regions with positive parallax, uncrossed disparity occurs.
Negative parallax refers to the case where the image is fused in front of the screen; the lines of sight cross, giving the impression that the object protrudes toward the viewer. In regions with negative parallax, crossed disparity occurs.
Zero parallax refers to the case where the image is formed in two dimensions on the screen plane, that is, when the parallax is zero. Since the image is formed on the screen plane, the user feels no stereoscopic effect.
In other words, the fusional area of Panum represents the region where objects are naturally fused so that the stereoscopic effect is properly displayed, that is, the region of positive and negative parallax around the zero-parallax plane (the horopter).
In general, when the convergence state of the eyes leaves the fusional area of Panum, the stereoscopic effect is lost and visual fatigue such as diplopia, superimposition, retinal rivalry, and suppression occurs. Therefore, in the present invention, a fatigue-producing region and a fatigue-free region are defined based on the fusional area of Panum, and the fatigue-free region is defined as the comfort zone.
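The parallax classification above can be sketched in a few lines of Python. This is an illustrative sketch only; the sign convention and the `classify_parallax` helper are assumptions for illustration, not part of the patent:

```python
def classify_parallax(x_left, x_right, eps=1e-6):
    """Classify screen parallax from the horizontal screen positions (cm)
    of the same point as projected for the left and right eye.

    Parallax is taken as x_right - x_left: positive means the point fuses
    behind the screen, negative means it appears in front of the screen,
    and ~zero means it lies on the screen plane.
    """
    p = x_right - x_left
    if p > eps:
        return "positive"   # behind the screen (uncrossed disparity)
    if p < -eps:
        return "negative"   # in front of the screen (crossed disparity)
    return "zero"           # on the screen plane; no stereoscopic effect

print(classify_parallax(10.0, 12.5))  # a point fused behind the screen
print(classify_parallax(12.5, 10.0))  # a point protruding toward the viewer
```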
FIG. 3 is a diagram illustrating an off-axis-based camera spacing method according to an embodiment of the present invention.
The off-axis method skews and intersects the two view spaces so that the convergence of the eyes can be expressed and a virtual screen corresponding to the zero-parallax plane can be defined.
In the off-axis method, there are two ways to change the distance of two cameras corresponding to the binocular spacing, that is, the left and right eyes.
The first is to reduce the distance between the left and right cameras. Reducing the distance between the two cameras reduces the effective binocular disparity, because the intersection angle of the gaze vectors of the two view spaces decreases.
Thus, the comfort zone shown in FIG. 2 can be defined through the off-axis method.
The second method is to adjust the convergence angle of the view spaces directly. However, directly changing the convergence of the view spaces causes the depth of the virtual screen to change continuously, and the retinal rivalry area can become larger than when only the camera interval is adjusted, which is visually more uncomfortable. Therefore, in the present invention, the binocular disparity is reduced by reducing the distance between the two cameras, corresponding to the first method.
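The off-axis construction described above can be sketched with a standard asymmetric-frustum formulation. The function and its parameter names are illustrative assumptions, not taken from the patent:

```python
def off_axis_frustums(eye_sep, convergence, near, half_width):
    """Left/right asymmetric (off-axis) frustum bounds for a stereo pair.

    Cameras are offset by +/- eye_sep/2 with parallel view axes and share a
    virtual screen at distance `convergence`. Returns the (left, right)
    horizontal frustum bounds at the near plane for each camera, where
    `half_width` is the near-plane half width of a centered frustum.
    """
    # The screen center sits at -/+ eye_sep/2 relative to each camera;
    # projecting that offset onto the near plane gives the frustum skew.
    shift = (eye_sep / 2.0) * (near / convergence)
    left_cam = (-half_width + shift, half_width + shift)
    right_cam = (-half_width - shift, half_width - shift)
    return left_cam, right_cam
```

Halving `eye_sep` halves the frustum skew, which is exactly the first method: reducing the camera distance reduces the on-screen disparity without moving the virtual screen.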
FIG. 4 is an internal configuration diagram of a building navigation apparatus capable of adjusting the two-camera interval and moving speed according to an exemplary embodiment of the present invention, FIG. 5 is a diagram illustrating reference images used during JND measurement according to an exemplary embodiment of the present invention, and FIG. 6 is a view for explaining the range of the JND according to an embodiment of the present invention.
Referring to FIGS. 4 to 6, the building navigation device capable of adjusting the two-camera interval and moving speed includes a control unit 410, a left eye camera unit 421, a right eye camera unit 422, a moving speed adjusting unit 430, a camera spacing adjusting unit 440, a stereo rendering unit 450, a display unit 460, and a storage unit 470.
The left eye camera unit 421 captures the left eye image of the building being navigated.
The right eye camera unit 422 captures the right eye image of the building being navigated.
The moving speed adjusting unit 430 adjusts the moving speed of the left eye camera unit 421 and the right eye camera unit 422 under the control of the control unit 410.
The camera spacing adjusting unit 440 adjusts the distance between the left eye camera unit 421 and the right eye camera unit 422 under the control of the control unit 410.
The stereo rendering unit 450 stereo-renders the left eye image and the right eye image captured through the left eye and right eye camera units to generate a 3D-format image containing both.
Three-dimensional formats include the top-and-bottom format, the side-by-side format, and the interlaced format.
The display unit 460 outputs the 3D-format image generated by the stereo rendering unit 450.
The storage unit 470 stores the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scenes in which the buildings are photographed.
Referring to FIGS. 5 and 6, the just noticeable difference (JND) is the minimum stimulus difference a person can recognize in response to a change in the stimulus. The JND with respect to the change in the binocular spacing of the image is found, the corresponding two-camera spacing and moving speed values are extracted, and these values are applied to adjust the spacing and moving speed of the left eye and right eye cameras during building navigation.
In the present invention, to measure the JND, which is the minimum perceptible difference with respect to the change in the binocular spacing of the three-dimensional image, a specific building is captured by the two cameras while the binocular spacing of the image is changed continuously, and the JND is measured by this method.
The user compares two images, and the binocular spacing at the moment the two images begin to look different is stored. The experimental material consists of images at five binocular spacings for each scene, and the JND is measured for about 20 main buildings encountered while navigating the building; from this, the two-camera spacing and speed values are extracted.
The five binocular spacings of the experimental images are obtained by dividing the range from 0 to the maximum of 6.5 cm, the average binocular spacing of an adult, into five stages.
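For illustration, the five stages might be generated as evenly spaced levels. Even spacing is an assumption; the patent only states that the 0-6.5 cm range is divided into five stages:

```python
def spacing_levels(max_spacing=6.5, steps=5):
    """Binocular-spacing levels from 0 up to the 6.5 cm adult average,
    divided evenly into the given number of stages."""
    return [max_spacing * i / (steps - 1) for i in range(steps)]

print(spacing_levels())  # [0.0, 1.625, 3.25, 4.875, 6.5]
```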
FIG. 5 shows four of the approximately 20 buildings used for the JND measurements.
FIGS. 5(a) and 5(b) are general three-dimensional scenes with many buildings, FIG. 5(c) is a scene in which retinal rivalry occurs, and FIG. 5(d) is a scene in which the building protrudes toward the viewer.
FIG. 6 shows the JND measurement ranges for the buildings shown in FIG. 5. The horizontal axis represents the binocular spacing in each scene, and the vertical axis represents the range of image change perceived by the user. As shown in the graph of FIG. 6, as the binocular spacing increases for the four buildings shown in FIG. 5, the JND range perceived by the user also increases.
This is consistent with Weber's law: the larger the stereoscopic stimulus, the less sensitive the observer is to changes in the stimulus.
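Weber's law can be stated as ΔI = k·I: the just-noticeable change ΔI grows in proportion to the stimulus magnitude I, so larger stimuli tolerate larger unnoticed changes. A minimal sketch, where the constant k is a placeholder rather than a value measured in the patent:

```python
def weber_jnd(intensity, k=0.1):
    """Just-noticeable difference under Weber's law: delta_I = k * I.
    k is an illustrative Weber fraction, not a value from the patent."""
    return k * intensity

# The larger the stimulus, the larger the change needed before it is noticed:
assert weber_jnd(2.0) > weber_jnd(1.0)
```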
Accordingly, using the measured JNDs, the two-camera spacing and moving speed values at which the JND occurs are measured in advance for the approximately 20 buildings, and building navigation is performed based on these pre-measured spacing and speed values.
In this way, the two-camera distance and moving speed can be changed so that the building stays within the comfort zone without the user noticing the change in the image.
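The pre-measured values can be thought of as a per-scene lookup table consulted during navigation. The sketch below uses invented scene names and numbers purely for illustration; the patent stores such (spacing, speed) pairs for about 20 key buildings:

```python
# Hypothetical pre-measured table: per scene, the camera spacing (cm) and
# moving speed at which the JND occurred. All values are invented.
JND_TABLE = {
    "lobby":     {"spacing": 4.9, "speed": 1.2},
    "stairwell": {"spacing": 2.1, "speed": 0.6},
    "corridor":  {"spacing": 2.8, "speed": 0.8},
}

def adjust_cameras(scene, default_spacing=6.5, default_speed=1.0):
    """Look up the pre-measured spacing/speed for a scene, falling back to
    fixed defaults (6.5 cm adult spacing) when the scene was not measured."""
    entry = JND_TABLE.get(scene)
    if entry is None:
        return default_spacing, default_speed
    return entry["spacing"], entry["speed"]
```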
The control unit 410 controls the overall operation of the building navigation device.
When a visual fatigue factor occurs while navigating the building, the control unit 410 extracts the pre-measured two-camera spacing and moving speed values for the current scene and adjusts the spacing and moving speed of the left eye and right eye camera units accordingly.
Meanwhile, in the present invention, visual fatigue factors may occur depending on the distance between the left eye camera unit 421 and the right eye camera unit 422 and their moving speed; the cases in which they occur are described below with reference to FIG. 7.
FIG. 7 is a flowchart illustrating a process of adjusting the camera interval and moving speed based on the JND during building navigation according to an embodiment of the present invention.
Referring to FIG. 7, when building navigation starts, the control unit 410 sets the fusional area of Panum and the comfort zone through the left eye camera unit 421 and the right eye camera unit 422.
The control unit 410 then determines whether the building being photographed is located outside the fusional area of Panum (step S702).
If the building is located outside the fusional area of Panum, the control unit 410 extracts, in step S703, the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, and adjusts the distance and moving speed of the left eye camera unit 421 and the right eye camera unit 422 to the extracted values.
Two cases arise in which the building is located outside the fusional area of Panum.
First, a visual fatigue factor occurs when the building leaves the fusional area of Panum of FIG. 2 in the negative parallax direction.
This happens when the two cameras and the building are very close together. To reposition a building that suddenly pops toward the viewer during navigation into the comfort zone within the fusional area of Panum, the control unit 410 extracts the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, and adjusts the spacing and moving speed of the left eye camera unit 421 and the right eye camera unit 422 to the extracted values.
Second, if the building leaves the fusional area of Panum in the positive parallax direction, the building is too far from the two cameras' field of view, making it difficult to perceive the stereoscopic effect and causing visual fatigue.
In this case, the control unit 410 likewise extracts the pre-measured camera spacing and moving speed values for the scene and adjusts the spacing and moving speed of the left eye camera unit 421 and the right eye camera unit 422 so that the building is positioned within the comfort zone.
To keep the user from perceiving the adjustment, the spacing and moving speed are changed gradually rather than abruptly.
On the other hand, if it is determined in step S702 that the building is not outside the fusional area of Panum, the control unit 410 determines whether the building is located in the retinal rivalry area outside the comfort zone.
Although the building is located within the fusional area of Panum, a visual fatigue factor occurs when the building lies in the retinal rivalry area due to the difference between the view spaces of the left eye camera unit 421 and the right eye camera unit 422.
This occurs, for example, when a camera passes through a door or a narrow passageway during navigation and a column or corner is visible to one camera but not to the other.
At this time, the control unit 410 extracts the pre-measured camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, and adjusts the distance and moving speed of the left eye camera unit 421 and the right eye camera unit 422 to the extracted values.
When the distance between the left eye camera unit 421 and the right eye camera unit 422 is adjusted, the building is either located within the comfort zone or moved outside the comfort zone so as to be invisible to both cameras.
In this way, the retinal rivalry is removed and the visual fatigue factor is resolved.
Meanwhile, when adjusting the distance and moving speed of the left eye camera unit 421 and the right eye camera unit 422, the values are changed gradually so that the user does not perceive the change in the image.
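The decision flow of FIG. 7 can be summarized in a short Python sketch. The state names and the `lookup` callback are invented for illustration; the patent's actual logic operates on the photographed geometry rather than on string flags:

```python
def navigation_step(building_state, lookup):
    """One control decision per navigation step.

    `building_state` says where the photographed building sits relative to
    Panum's fusional area and the comfort zone; `lookup` returns the
    pre-measured (spacing, speed) pair for the current scene.
    """
    if building_state == "outside_fusion_area":
        # Negative- or positive-parallax escape: pull the building back
        # into the comfort zone using the pre-measured values.
        return lookup()
    if building_state == "retinal_rivalry":
        # Visible to only one camera: re-adjust so the building is either
        # fused comfortably or excluded from both views.
        return lookup()
    return None  # inside the comfort zone: leave spacing and speed alone

spacing_speed = navigation_step("retinal_rivalry", lambda: (2.1, 0.6))
```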
FIG. 8 is a graph comparing changes in the two-camera interval during building navigation according to an embodiment of the present invention, and FIG. 9 is a view comparing results before and after adjusting the two-camera interval and moving speed according to an embodiment of the present invention.
Referring to FIGS. 8 and 9, to verify the method proposed in the present invention, the results of building navigation with the distance and moving speed of the left eye camera and the right eye camera held fixed were compared with the results of building navigation in which the distance and moving speed of the left and right cameras were changed based on the two-camera spacing and moving speed values measured in advance in consideration of the just noticeable difference (JND).
In this experiment, a 120 Hz LCD monitor and shutter glasses were used in a graphics environment based on a GeForce Quadro FX3800.
FIG. 8 is a graph showing the variation of the binocular spacing, that is, the two-camera spacing, over time during the building navigation process. Compared with navigation performed with the two cameras fixed at 6.5 cm, the adult average binocular spacing, and moved at a fixed speed, when the distance and speed of the two cameras are changed dynamically based on the just noticeable difference (JND) values obtained through the experiment, the two-camera spacing can be seen to change dynamically according to the characteristics of each scene.
FIG. 9 compares the experimental results of FIG. 8 scene by scene, shown as red-blue anaglyphs.
The top rows of FIGS. 9(a) to 9(e) are images obtained with the two-camera interval and moving speed fixed, and the bottom rows are images obtained with the method of adjusting the two-camera interval and moving speed according to an embodiment of the present invention.
In FIG. 9(a), since the building is far from the field of view, neither method changes the two-camera distance or moving speed. In FIGS. 9(b) to 9(e), however, the difference between the two navigation methods is pronounced; in particular, it becomes larger in scenes where retinal rivalry occurs, such as a staircase or a corridor.
As described above, the present invention can provide a comfortable image to the user by performing building navigation while adjusting the two-camera distance and moving speed in consideration of the just noticeable difference (JND) for the scene in which the building is photographed.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, these embodiments are provided by way of illustration only and are not to be construed as limiting; it will be readily apparent to those skilled in the art that various substitutions, modifications, and alterations can be made without departing from the scope of the invention.
410: control unit 421: left eye camera unit
422: right eye camera unit 430: moving speed control unit
440: camera spacing controller 450: stereo rendering unit
460: display unit 470: storage unit
Claims (8)
A movement speed adjusting unit for adjusting the moving speeds of a left eye camera unit and a right eye camera unit;
A camera spacing adjusting unit for adjusting the distance between the left eye camera unit and the right eye camera unit; and
A control unit which, when the building image captured by the left eye and right eye camera units is not located within the comfort zone of a preset fusional area of Panum during navigation of the building, extracts the pre-measured two-camera spacing and moving speed values that yield a just noticeable difference (JND: Just Noticeable Difference) for the scene in which the building is photographed, and controls the movement speed adjusting unit and the camera spacing adjusting unit to adjust the distance and moving speed of the left eye camera unit and the right eye camera unit using the extracted values,
Wherein the pre-measured two-camera spacing and moving speed values are obtained by varying the two-camera spacing and moving speed for the scene in which the building is photographed, measuring the stimulus differences a person can perceive in response to each change, and measuring and storing in advance the spacing and speed values at which the just noticeable difference (JND) occurs.
A storage unit for storing the pre-measured two-camera spacing and moving speed values that yield a just noticeable difference (JND) for the scenes in which a plurality of buildings are photographed;
A stereo rendering unit configured to stereo-render the left eye image and the right eye image photographed through the left eye and right eye camera units to generate a 3D-format image including both; and
A display unit configured to output the 3D-format image generated by the stereo rendering unit.
When the building image captured by the left eye and right eye camera units leaves the fusional area of Panum in a negative parallax or positive parallax direction, the control unit adjusts the distance and moving speed of the left eye camera and the right eye camera according to the pre-measured two-camera spacing and moving speed values so that the object is positioned within the comfort zone.
When the building is visible to only one of the left eye camera and the right eye camera, the control unit determines that the building is in the retinal rivalry area outside the comfort zone, and adjusts the distance and moving speed of the left eye camera and the right eye camera according to the pre-measured two-camera spacing and moving speed values so that the building is either positioned within the comfort zone or moved outside it so as to be invisible to the left eye camera and the right eye camera.
(A) setting the fusional area of Panum and a comfort zone through a left eye camera and a right eye camera during building navigation;
(B) when the building is located outside the fusional area of Panum while navigating the building, extracting the pre-measured two-camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, and adjusting the spacing and moving speed of the left eye camera and the right eye camera to the extracted values; and
(C) when the building is within the fusional area of Panum but located in the retinal rivalry area outside the comfort zone while navigating the building, extracting the pre-measured two-camera spacing and moving speed values that yield a just noticeable difference (JND) for the scene in which the building is photographed, and adjusting the distance and moving speed of the left eye camera and the right eye camera to the extracted values.
While changing the distance between the two cameras and the moving speed of the scene in which the building is photographed, the difference between the stimuli that can be perceived by the human being is measured in response to the change of the stimulus, and the minimum stimulus difference (JND) Building navigation method, characterized in that the value is measured in advance and stored the two cameras when the distance and the moving speed value.
When the building deviates from the fusion region of the phantom in the direction of negative parallax or positive parallax, two pre-measured camera intervals and moving speeds may give a minimum stimulus difference (JND) for the scene where the building is imaged. Building navigation method characterized in that the building is located in the comfort zone by adjusting the distance and the moving speed of the left eye camera and the right eye camera by a value.
When the building is visible to only one of the left eye camera and the right eye camera, and the building is located in the retinal contention area beyond the comfort zone, a minimum stimulus difference (JND) may be given to the scene where the building is captured. The distance between the left eye camera and the right eye camera and the moving speed of the left eye camera and the right eye camera by using two pre-measured camera spacing and moving speed values, so that the building is located in the comfort zone or out of the comfort zone. Building navigation method characterized in that it is not visible to the camera.
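The method claims rely on a pre-measurement step: sweep candidate inter-camera distances and moving speeds, ask an observer whether the change from a baseline is perceptible, and store the pairs whose change stays below the just-noticeable difference (JND). The sketch below illustrates that sweep under stated assumptions; the observer is a stand-in predicate with made-up thresholds, and all names are hypothetical.

```python
# Sketch of the JND pre-measurement sweep described in the method claims.

def measure_jnd_settings(distances, speeds, perceptible):
    """Return the (distance, speed) pairs whose change is NOT perceptible."""
    safe = []
    for d in distances:
        for s in speeds:
            if not perceptible(d, s):
                safe.append((d, s))
    return safe

# Stand-in for a human observer: a change is deemed perceptible once either
# parameter moves past a fixed threshold from the baseline. Real JND data
# would come from subjective viewing tests, not a formula.
def toy_observer(distance, speed, base=(0.065, 1.0), jnd=(0.01, 0.3)):
    return abs(distance - base[0]) > jnd[0] or abs(speed - base[1]) > jnd[1]

safe_pairs = measure_jnd_settings(
    distances=[0.05, 0.06, 0.07, 0.08],
    speeds=[0.8, 1.0, 1.2],
    perceptible=toy_observer,
)
```

The resulting `safe_pairs` would populate the storage unit consulted at runtime in steps (b) and (c).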
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120033044A KR101320477B1 (en) | 2012-03-30 | 2012-03-30 | Building internal navication apparatus and method for controlling distance and speed of camera |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020120033044A KR101320477B1 (en) | 2012-03-30 | 2012-03-30 | Building internal navication apparatus and method for controlling distance and speed of camera |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20130110798A true KR20130110798A (en) | 2013-10-10 |
KR101320477B1 KR101320477B1 (en) | 2013-10-23 |
Family
ID=49632563
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120033044A KR101320477B1 (en) | 2012-03-30 | 2012-03-30 | Building internal navication apparatus and method for controlling distance and speed of camera |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101320477B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101428384B1 (en) * | 2013-05-01 | 2014-08-13 | 세종대학교산학협력단 | Method and apparatus for navigation in free form architecture |
US10067663B2 (en) | 2015-12-03 | 2018-09-04 | Hyundai Motor Company | System and method for setting a three-dimensional effect |
WO2018177031A1 (en) * | 2017-03-29 | 2018-10-04 | 迎刃而解有限公司 | Reflective surround display system |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103118255A (en) * | 2013-01-25 | 2013-05-22 | 深圳广晟信源技术有限公司 | Self-adaptation quantifying method based on concave model and self-adaptation quantifying device based on concave model |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100249824B1 (en) * | 1997-12-11 | 2000-03-15 | 정선종 | 3d navigation system and method for virtual worlds using 3d device |
KR100500898B1 (en) * | 2003-12-18 | 2005-07-18 | 한국전자통신연구원 | 3d space modeling apparatus using space information and method therefor |
KR20050063302A (en) * | 2003-12-22 | 2005-06-28 | 한국전자통신연구원 | The navigation method and system in 3d virtual world |
WO2008073135A2 (en) | 2006-03-30 | 2008-06-19 | Rajasingham Arjuna I | Virtual and real navigation systems |
2012
- 2012-03-30 KR KR1020120033044A patent/KR101320477B1/en not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
KR101320477B1 (en) | 2013-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6380881B2 (en) | Stereoscopic image display apparatus, image processing apparatus, and stereoscopic image processing method | |
TWI523488B (en) | A method of processing parallax information comprised in a signal | |
US9754379B2 (en) | Method and system for determining parameters of an off-axis virtual camera | |
US20010033327A1 (en) | Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus | |
US9049435B2 (en) | Image providing apparatus and image providing method based on user's location | |
JP2014500674A (en) | Method and system for 3D display with adaptive binocular differences | |
JP2002223458A (en) | Stereoscopic video image generator | |
US9294751B2 (en) | Method and system for disparity adjustment during stereoscopic zoom | |
KR101320477B1 (en) | Building internal navication apparatus and method for controlling distance and speed of camera | |
US20120188226A1 (en) | Method and system for displaying stereoscopic images | |
US9628770B2 (en) | System and method for stereoscopic 3-D rendering | |
Baker | Generating images for a time-multiplexed stereoscopic computer graphics system | |
TW201733351A (en) | Three-dimensional auto-focusing method and the system thereof | |
CN103609104A (en) | Interactive user interface for stereoscopic effect adjustment | |
JP2012244453A (en) | Image display device, image display system, and three-dimensional spectacles | |
KR20120070363A (en) | Stereoscopic 3d display device and method of driving the same | |
US20140347451A1 (en) | Depth Adaptation for Multi-View System | |
US20120120051A1 (en) | Method and system for displaying stereoscopic images | |
JP2014053782A (en) | Stereoscopic image data processor and stereoscopic image data processing method | |
JP5037713B1 (en) | Stereoscopic image display apparatus and stereoscopic image display method | |
JP2011181991A (en) | 3d video display device | |
KR20120017653A (en) | Display apparatus and method for providing osd applying thereto | |
Yoon et al. | Saliency-guided stereo camera control for comfortable vr explorations | |
KR101428384B1 (en) | Method and apparatus for navigation in free form architecture | |
US20190149811A1 (en) | Information processing apparatus, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant | ||
FPAY | Annual fee payment | Payment date: 20170920; Year of fee payment: 5 ||
LAPS | Lapse due to unpaid annual fee |