CN1647114A - Method and system for displaying stereoscopic image - Google Patents

Method and system for displaying stereoscopic image

Info

Publication number
CN1647114A
CN1647114A (application numbers CNA038090252A and CN03809025A)
Authority
CN
China
Prior art keywords
stereo
picture
display device
data
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CNA038090252A
Other languages
Chinese (zh)
Inventor
尹炳二
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Geo Rae Co Ltd
Original Assignee
Geo Rae Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/280,239 external-priority patent/US20030117395A1/en
Application filed by Geo Rae Co Ltd filed Critical Geo Rae Co Ltd
Publication of CN1647114A publication Critical patent/CN1647114A/en
Pending legal-status Critical Current

Classifications

    • H04N 13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • G03B 35/00: Stereoscopic photography
    • H04N 13/128: Adjusting depth or disparity
    • H04N 13/133: Equalising the characteristics of different image components, e.g. their average brightness or colour balance
    • H04N 13/139: Format conversion, e.g. of frame-rate or size
    • H04N 13/243: Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • H04N 13/25: Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics
    • H04N 13/275: Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N 13/296: Image signal generators; synchronisation or control thereof
    • H04N 13/344: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • H04N 13/371: Image reproducers using viewer tracking for tracking viewers with different interocular distances, or rotational head movements around the vertical axis
    • H04N 13/383: Image reproducers using viewer tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H04N 13/398: Image reproducers; synchronisation or control thereof
    • H04N 13/204: Image signal generators using stereoscopic image cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

The invention relates to a system for displaying stereoscopic images. The system comprises a display device (3900), an indicator controller (3920), a computing device (3950), and a display driver (3940). The display device (3900) displays at least one stereoscopic image comprising a pair of two-dimensional plane images. The indicator controller (3920) controls the movement of at least one input-device indicator displayed on the two-dimensional plane images, the indicator being configured to move to a target location on those images. The computing device (3950) determines a location value for the target location of the indicator and, based on the determined location value, determines the center points of the two-dimensional plane images. The display driver (3940) moves the displayed images based on the determined target location value.

Description

Method and system for displaying stereoscopic images
Technical field
The present invention relates to a method and system for generating and/or displaying a more realistic stereoscopic image. More particularly, the present invention relates to a method and system for displaying a more realistic stereoscopic image on a pair of display devices.
Background art
In general, humans perceive their environment and recognize objects with both eyes. Because the two eyes are separated by a predetermined interocular distance, an object is initially perceived as two slightly different images, one formed in each of the left and right eyes. When these two images partially overlap, the human brain recognizes them as a single object. Because the two different images delivered by the left eye and the right eye are combined in the brain, the partially overlapped image is perceived with a sensation of three dimensions.
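The binocular principle described above is commonly formalized by the standard pinhole-stereo relation Z = f·B/d, which relates the image-position difference (disparity) of a point to its depth. The following sketch illustrates that relation with invented numbers; the formula is textbook stereo geometry, not a method taken from this patent.

```python
# Illustrative only: standard rectified-stereo depth relation Z = f * B / d.
# Focal length (pixels), baseline (metres), and disparity values are invented.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth of a point seen by a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# A point whose left/right image positions differ by 8 px, viewed by cameras
# 6.5 cm apart (roughly the human interocular distance) with f = 800 px:
z = depth_from_disparity(800.0, 0.065, 8.0)
print(round(z, 2))  # 6.5 (metres)
```

Nearer objects produce larger disparities, which is why the overlap of the two eye images carries depth information.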
Using this principle, various conventional three-dimensional (3D) image generation and reproduction systems employing cameras and display devices have been developed.
As one example of such systems, U.S. Patent No. 4,729,017 discloses a "stereoscopic display method and apparatus." With a relatively simple structure, the apparatus allows an observer to visually inspect a stereoscopic image.
As another example, U.S. Patent No. 5,978,143 discloses a "stereoscopic recording and display system," in which the viewer can easily control, within the scene, the picture content of the stereoscopic display recorded by the stereoscopic cameras.
As yet another example, U.S. Patent No. 6,005,607 discloses a "stereoscopic computer graphics image generating apparatus and stereoscopic television apparatus," which stereoscopically displays two-dimensional images generated from three-dimensional structure information.
Summary of the invention
One aspect of the present invention provides a method of displaying a stereoscopic image. The method comprises providing a pair of stereoscopic cameras and generating at least one stereoscopic image comprising a pair of two-dimensional plane images generated by the respective cameras. The method further comprises detecting the motion of each stereoscopic camera and transmitting the generated stereoscopic image together with the detection data. The method further comprises receiving the stereoscopic image and the detection data, and displaying the received two-dimensional images on a pair of display devices, respectively. The method also comprises providing, based on the detection data, an indication signal representing the direction of motion of the stereoscopic cameras.
Another aspect of the present invention provides a system for displaying a stereoscopic image. The system comprises a pair of stereoscopic cameras, a pair of motion detection devices, a transmitter, a receiver, and a pair of display devices. The pair of stereoscopic cameras generates at least one stereoscopic image comprising a pair of two-dimensional plane images. Each of the motion detection devices detects the motion of one of the stereoscopic cameras. The transmitter transmits the generated images and the detected motion data, and the receiver receives them. The pair of display devices displays the two-dimensional plane images, respectively, and provides an indication signal representing the direction of motion of each stereoscopic camera.
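The indication-signal idea above, a signal telling the viewer which way each camera is moving, can be sketched as follows. The data layout, threshold, and direction labels are assumptions for illustration, not the patent's specification.

```python
# Hypothetical sketch: map a camera's detected motion vector (dx, dy) to a
# coarse direction label that a display device could present as an
# indication signal.  The deadband suppresses sensor noise near zero.

def indication_signal(dx: float, dy: float, deadband: float = 0.01) -> str:
    """Return a direction label for a detected camera motion vector."""
    horiz = "right" if dx > deadband else "left" if dx < -deadband else ""
    vert = "up" if dy > deadband else "down" if dy < -deadband else ""
    return (vert + "-" + horiz).strip("-") or "stationary"

print(indication_signal(0.2, 0.0))    # right
print(indication_signal(0.0, -0.3))   # down
print(indication_signal(-0.2, 0.4))   # up-left
print(indication_signal(0.0, 0.0))    # stationary
```

Each display device would receive the motion data alongside its image stream and render the resulting label (or an equivalent arrow) for the viewer.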
Another aspect of the present invention provides a method of displaying a stereoscopic image. The method comprises receiving at least one stereoscopic image, comprising a pair of two-dimensional plane images, together with motion detection data for a pair of stereoscopic cameras. The method further comprises displaying the pair of two-dimensional plane images on a pair of display devices, and providing, based on the detection data, an indication signal representing the direction of motion of the stereoscopic cameras.
Another aspect of the present invention provides a method of displaying a stereoscopic image. The method comprises generating at least one stereoscopic image from three-dimensional structure data, the stereoscopic image comprising a pair of two-dimensional images projected by a pair of projection portions, respectively. The method further comprises detecting the motion of the pair of projection portions and displaying the stereoscopic image. The method also comprises providing, based on the detection data, an indication signal regarding the motion of each projection portion.
Another aspect of the present invention provides a system for displaying a three-dimensional image. The system comprises a receiver and a pair of display devices. The receiver receives at least one stereoscopic image and motion data for a pair of stereoscopic cameras, the stereoscopic image comprising a pair of two-dimensional plane images generated by the respective cameras. The pair of display devices displays the pair of two-dimensional images and provides an indication signal representing the direction of motion of the stereoscopic cameras.
Another aspect of the present invention provides a method that is used to show stereo-picture.This method is included on one group of display device, shows at least one stereo-picture, and this stereo-picture comprises a pair of two-dimensional image, and this to two-dimensional image on, at least one input equipment indicator is provided.This method also is included in this on the two-dimensional image, and described at least one input equipment indicator is moved to a second place from a primary importance, and determines a positional value, represents the second place of described at least one input equipment indicator.This method comprises and is described two-dimensional image according to described definite positional value, computing center's point respectively, and the central point that moves described two-dimensional image is with consistent with the described central point that calculates.
Another aspect of the present invention provides a method that shows stereo-picture.This method is included on one group of display device and shows at least one stereo-picture, and described stereo-picture comprises a pair of two-dimensional plane image, and this to the two-dimensional plane image on, at least one input equipment indicator is provided.This method also comprises storing represents the data that concern between the central point of at least one indicator post and each described two-dimensional image, and is the amount of at least one indicator calculating motion on described two-dimensional image.This method also comprises according to the amount that calculates and the data of storage, determines the center position of each described two-dimensional image respectively, and moves the central point of described two-dimensional image respectively according to the center position of determining.
Another aspect of the present invention provides a method that shows stereo-picture.This method is included on one group of display device and shows at least one stereo-picture, and described stereo-picture comprises a pair of two-dimensional plane image, and this to the two-dimensional plane image on, at least one input equipment indicator is provided.This method comprises, moves the target location of described at least one input equipment indicator to the described two-dimensional image, and for determining a positional value in this target location on the described two-dimensional image.This method also comprises, according to the positional value of determining, and the central point of the described two-dimensional image that calculating will be moved, and according to the center point value that is calculated, the central point of mobile two-dimensional image respectively.
Another aspect of the present invention provides a system for displaying a stereoscopic image. The system comprises a pair of display devices, an input device, a computing device, and a display driver. The pair of display devices displays at least one stereoscopic image comprising a pair of two-dimensional images. The input device controls the movement of at least one input-device indicator displayed on the two-dimensional images, the indicator being configured to move to a target location on the images. The computing device determines a location value for the target location of the indicator and, based on that value, determines the center points of the two-dimensional images. The display driver moves the displayed images based on the determined target location value.
Another aspect of the present invention provides a method of adjusting the display direction of a stereoscopic image according to the positions of a pair of stereoscopic cameras with respect to a scene. The method comprises detecting the position of each camera with respect to the scene being photographed. The method further comprises transmitting the camera position data to a pair of display devices, receiving the data, and determining a display device adjustment value based on the data. The method also comprises adjusting the position of at least one display device according to the adjustment value.
Another aspect of the present invention provides a system for adjusting the display direction of a stereoscopic image according to the positions of a pair of stereoscopic cameras. The system comprises a pair of position detection devices, a transmitter, a receiver, a pair of display devices, and a display device controller. Each position detection device detects the position of one of the stereoscopic cameras. The transmitter transmits at least one stereoscopic image, comprising a pair of two-dimensional images, together with the position detection data, and the receiver receives them. The pair of display devices displays the received two-dimensional images, respectively. The display device controller determines a display device adjustment value from the received position data and adjusts the position of at least one display device according to the determined value.
Another aspect of the present invention provides a system for adjusting the display direction of a stereoscopic image according to the positions of a pair of stereoscopic cameras. The system comprises a receiver, a pair of display devices, and a computing device. The receiver receives at least one stereoscopic image, comprising a pair of two-dimensional images, together with position detection data for the cameras. The display devices display the two-dimensional images, respectively. The computing device determines a display device adjustment value from the received position data and adjusts the position of at least one display device according to the determined value.
Another aspect of the present invention provides a method of adjusting the display direction of a stereoscopic image according to the positions of a pair of stereoscopic cameras. The method comprises generating at least one stereoscopic image comprising a pair of two-dimensional images generated by the respective cameras, and detecting the position of each camera with respect to the scene to be photographed. The method further comprises transmitting the stereoscopic image and the position data to a pair of display devices, and receiving them. The method also comprises determining a display image adjustment value from the position data, and adjusting the display direction of at least one received two-dimensional image according to the determined value.
Another aspect of the present invention provides a system for adjusting the display direction of a stereoscopic image according to the positions of a pair of stereoscopic cameras. The system comprises a pair of stereoscopic cameras, a pair of position detection devices, a transmitter, a receiver, a computing device, and a display driver. The cameras generate at least one stereoscopic image comprising a pair of two-dimensional images. The position detection devices detect the positions of the respective cameras. The transmitter transmits the generated image and the position data, and the receiver receives them. The computing device determines a display image adjustment value from the position data, and the display driver adjusts the display direction of at least one received two-dimensional image according to the determined value.
Yet another aspect of the present invention provides a method of adjusting the display direction of a stereoscopic image. The method comprises providing a pair of projection portions configured to generate at least one stereoscopic image, comprising a pair of two-dimensional images, from three-dimensional structure data of a scene. The method further comprises displaying the two-dimensional images on a pair of display devices, respectively, and detecting the relative position of each projection portion with respect to the scene. The method also comprises determining a display device adjustment value from the position values, and adjusting, according to the adjustment value, the display angle of at least one two-dimensional image with respect to the observer's eye directed at the corresponding display device.
Another aspect of the present invention provides an information communication system. The system comprises a first portable device and a second portable device. The first portable device comprises a pair of stereoscopic cameras, which generates and transmits a first stereoscopic image, and a pair of display screens configured to receive and display a second stereoscopic image. The second portable device communicates with the first portable device and likewise comprises a pair of stereoscopic cameras and a pair of display screens. The cameras of the second device generate the second stereoscopic image and transmit it to the first portable device, and the display screens of the second device receive the first stereoscopic image from the first portable device and display it. Each pair of stereoscopic cameras is spaced apart by a predetermined distance, and each of the first and second stereoscopic images comprises a pair of two-dimensional images generated by the respective cameras.
Another aspect of the present invention provides a portable communication device. The device comprises a pair of stereoscopic cameras, a transmitter, a receiver, and a pair of display screens. The cameras generate a stereoscopic image of a first scene, and the transmitter transmits it. The receiver receives a stereoscopic image of a second scene different from the first scene, and the display screens display the image of the second scene. Each stereoscopic image comprises a pair of two-dimensional images.
Description of drawings
FIG. 1A shows a typical three-dimensional image generation and reproduction apparatus.
FIG. 1B shows another typical three-dimensional image generation and reproduction apparatus.
FIGS. 2A and 2B show the photographing ratio of a camera.
FIGS. 3A and 3B show the screen ratio of a display device displaying a photographed image.
FIG. 4A shows the variation of the distance between the objective lens and the film according to the variation of a camera's focal length.
FIG. 4B shows the variation of the photographing ratio according to the variation of the camera's focal length.
FIG. 4C shows the relationship between the photographing ratio and the focal length of a camera.
FIG. 4D shows an exemplary table presenting the maximum and minimum photographing ratios of a camera.
FIG. 5A shows a photographing ratio calculation device according to one aspect of the invention.
FIG. 5B shows a photographing ratio calculation device according to another aspect of the invention.
FIG. 6A shows an exemplary flowchart explaining the operation of the photographing ratio calculation device of FIG. 5A.
FIG. 6B shows an exemplary flowchart explaining the operation of the photographing ratio calculation device of FIG. 5B.
FIG. 7 shows a camera comprising the photographing ratio calculation device shown in FIGS. 5A and 5B.
FIG. 8 shows a system for displaying a stereoscopic image such that the photographing ratio (A:B:C) is substantially the same as the screen ratio (D:E:F).
FIG. 9 shows an exemplary flowchart explaining the operation of the image size adjustment portion of FIG. 8.
FIG. 10 is a conceptual diagram explaining the image size adjustment in each display device.
FIG. 11 shows an exemplary flowchart explaining the overall operation of the system shown in FIG. 8.
FIG. 12 shows an example of the display system according to one aspect of the invention.
FIG. 13 shows a stereoscopic (three-dimensional) display system according to one aspect of the invention, comprising an eye position fixing device.
FIG. 14 shows the relationship between a displayed image and an observer's eyes.
FIG. 15 shows a stereoscopic image display system according to one aspect of the invention.
FIG. 16 shows an exemplary flowchart explaining the operation of the system shown in FIG. 15.
FIG. 17 is a conceptual diagram explaining the operation of the display devices of FIG. 15.
FIG. 18 shows a stereoscopic image display system according to another aspect of the invention.
FIG. 19 shows an exemplary flowchart explaining the operation of the system of FIG. 18.
FIG. 20 shows a conceptual diagram explaining the operation of the system of FIG. 18.
FIG. 21A shows an eye lens motion detection device.
FIG. 21B is a conceptual diagram explaining the motion of the eye lens.
FIG. 22 is a conceptual diagram explaining the movement of the center point of a displayed image.
FIG. 23 shows a camera system for a stereoscopic display system according to one aspect of the invention.
FIG. 24 shows a display system corresponding to the camera system shown in FIG. 23.
FIG. 25 shows an exemplary flowchart explaining the operation of the camera and display systems shown in FIGS. 23 and 24.
Figure 26 A is a concept map, has shown the parameter of one group of stereocamera.
Figure 26 B is a concept map, has shown the parameter of observer's eyes.
Figure 27 is a concept map, has shown the motion of one group of stereocamera.
Figure 28 is a concept map, is used for the distance between the person and object according to the observation, explains the motion of described eye lens.
Figure 29 has shown a three-dimensional display system according to another aspect of the present invention, is used to control one group of stereocamera.
Figure 30 shows an exemplary block diagram of described controller of camera as shown in figure 29.
Figure 31 has shown an exemplary process flow diagram, is used to explain the operation of described controller of camera according to an aspect of the present invention.
Figure 32 A has shown an exemplary chart, and this chart is used for controlling level and vertical engine.
Figure 32 B has shown a concept map, has explained the motion of described camera.
Figure 33 has shown an exemplary process flow diagram, is used to explain the operation of system shown in Figure 29.
Figure 34 has shown according to another aspect of the present invention, is used for stereocamera controller system of a three-dimensional display system.
Figure 35 has shown an exemplary form, has presented the relation between camera adjusted value and the selecteed camera.
Figure 36 A is a top plan of many group stereocameras.
Figure 36 B is a front elevation view of described many group stereocameras.
Figure 37 has shown an exemplary process flow diagram, is used to explain the operation of system shown in Figure 34.
Figure 38 has shown a three-dimensional display system according to another aspect of the present invention.
Figure 39 has shown the example of a 3-D display image.
Figure 40 A-40H has shown concept map, has explained the relation between 3D mouse cursor and eye lens position.
Figure 41 has shown an exemplary block diagram of described display device shown in Figure 38.
Figure 42 has shown an exemplary process flow diagram, is used for explaining the operation of the display device of Figure 41.
Figure 43 A-43C has shown concept map, has explained the position of the described central point that calculates described eye lens and the method for the distance between two positions.
Figure 44 is a concept map, is used to explain the method for determining position of the intermediate point of the image that is shown.
FIG. 45 shows a stereoscopic display system according to another aspect of the invention.
FIG. 46 shows an exemplary block diagram of the display devices shown in FIG. 45.
FIG. 47 is a conceptual diagram explaining camera control according to the motion of the eye lenses.
FIG. 48 shows an exemplary flowchart explaining the operation of the system shown in FIG. 45.
FIG. 49 shows a stereoscopic display system according to another aspect of the invention.
FIG. 50 shows an exemplary block diagram of the camera controller shown in FIG. 49.
FIG. 51 shows an exemplary flowchart explaining the camera controller of FIG. 50.
FIG. 52 shows an exemplary table explaining the relationship between spatial magnification and camera distance.
FIG. 53 shows an exemplary flowchart explaining the overall operation of the system shown in FIG. 49.
FIG. 54 shows a stereoscopic display system according to another aspect of the invention.
FIG. 55 shows an exemplary table explaining the relationship between camera motion and display angle.
FIG. 56 shows an exemplary flowchart explaining the overall operation of the system shown in FIG. 54.
FIG. 57 shows a stereoscopic display system according to another aspect of the invention.
FIG. 58 shows an exemplary block diagram of the display devices in FIG. 57.
FIGS. 59A and 59B are conceptual diagrams explaining the adjustment of a displayed image.
FIG. 60 shows an exemplary flowchart explaining the operation of the system of FIG. 57.
FIG. 61 shows an exemplary block diagram of a system for transmitting a stereoscopic image and its photographing ratio.
FIG. 62 shows an exemplary block diagram of a system for storing a stereoscopic image and its photographing ratio in a persistent memory.
FIG. 63 shows an exemplary format of the data stored in the recording medium of FIG. 62.
FIG. 64 shows an exemplary block diagram of a pair of portable communication devices, each comprising a pair of digital cameras and a pair of display screens.
FIG. 65 shows an exemplary block diagram of a portable communication device for displaying a stereoscopic image according to a photographing ratio and a screen ratio.
FIGS. 66A and 66B show exemplary block diagrams of a portable communication device for controlling the position of a stereoscopic image.
FIG. 67 shows an exemplary block diagram of a portable communication device for controlling the spatial magnification of a stereoscopic image.
FIG. 68 shows a conceptual diagram explaining a portable communication device with separate display screens.
FIGS. 69A and 69B show exemplary block diagrams explaining the generation of a stereoscopic image from three-dimensional structure data.
Figure 70 has shown a three-dimensional display system, is used to make the resolution unanimity between described stereocamera and the display device.
Detailed Description of the Embodiments
Figure 1A shows a typical three-dimensional image generating and reproducing apparatus. The system of Figure 1A uses two display devices to display a stereo image. The apparatus comprises a pair of stereo cameras 110 and 120 separated from each other by a predetermined distance. The cameras 110 and 120 may be separated by a distance equal to that between an observer's two eyes, and photograph an object 100 from two different positions. Each of the cameras 110 and 120 provides its photographed image, simultaneously or sequentially, to the display devices 140 and 150, respectively. The display devices 140 and 150 are positioned so that the observer views the images displayed on the devices 140 and 150 with his left eye and right eye, respectively. The observer recognizes a three-dimensional image by sensing and combining the left and right images, side by side or in sequence. That is, when the observer sees each of a pair of stereo images with the corresponding eye, a single image (object) having three-dimensional character is perceived.
Figure 1B shows another typical three-dimensional image generating and reproducing apparatus. The system of Figure 1B uses a single display device to display a stereo image. The apparatus comprises a pair of stereo cameras 110 and 120 separated from each other by a predetermined distance, which photograph the same object 100 from two different positions. Each of the cameras 110 and 120 provides its photographed image to a synthesizing device 130. The synthesizing device 130 receives the two images from the left and right cameras 110 and 120 and projects the received images in sequence onto the display device 160. The synthesizing device 130 may be located at the camera site or at the screen site. The observer wears special glasses 170 that allow each eye to see only the image intended for it. The glasses 170 may comprise a filter or a shutter that allows the observer to see the images alternately. The display device 160 may comprise a liquid crystal display (LCD) or 3D glasses such as a head-mounted display (HMD). In this way, the observer recognizes a three-dimensional image by perceiving the left and right images in sequence with the corresponding eyes.
Here, the size of the displayed image is determined by the distance between the two cameras and the object to be photographed by the cameras, and by the size of the object. Further, because the distance between the left and right images displayed on the display devices is in the same ratio as the distance between an observer's left and right eyes, the observer feels as if he were observing the actual object in three-dimensional form.
In the above technique, when the object moves, when the camera moves, or when a zoom-in (magnifying) or zoom-out (reducing) function is performed on the object, i.e., whenever the situation is not a fixed object photographed by a fixed camera, the distance between the camera and the photographed object, or the size of the photographed object, changes. In these cases, an observer may perceive through the image a sense of distance that differs from the actual distance between the camera and the object.
Moreover, even when the distance between the object and the stereo cameras is fixed during photographing, each observer has his own specific interocular distance, a biometric measure obtained by measuring the distance between the center points of the observer's eyes. For example, an adult's interocular distance differs considerably from a child's, and the interocular distance also varies among observers of the same age. Meanwhile, in existing stereoscopic display systems, the distance between the center points of each pair of stereo images is fixed at a typical adult value (e.g., 70 millimeters), as illustrated in Figures 1A and 1B. However, as mentioned above, each observer has his own individual interocular distance. This mismatch may cause headaches and a distorted three-dimensional sensation when the observer views stereo images; in some cases, no three-dimensional effect is perceived at all.
To display a realistic three-dimensional image, one aspect of the present invention adjusts the displayed image or the display device so that the screen ratio (D:E:F) of the display device is substantially equal to the shooting ratio (A:B:C) of the camera. Hereinafter, the term "three-dimensional image" is used with the same meaning as "stereo image". A stereo image comprises a pair of two-dimensional images generated by a pair of stereo cameras, and stereo images comprise a plurality of such stereo image pairs.
Shooting ratio (A:B:C) and screen ratio (D:E:F)
Figures 2A and 2B show the shooting ratio of a camera. This ratio relates to the range, or size, of the space that can be captured in one scene, proportional to the range seen through the camera's viewfinder. The shooting ratio comprises three parameters (A, B, C). The parameters A and B are defined as the horizontal and vertical lengths, respectively, of the space containing the object 22 photographed by the camera 20. The parameter C is defined as the perpendicular distance between the camera 20 and the object 22. Generally, a camera has its own horizontal and vertical ranges within which it can photograph an object, and the ratio of the horizontal to the vertical length is normally constant, e.g., 4:3 or 16:9. Thus, once either the horizontal or the vertical length is determined, the other length is determined automatically. In one embodiment of the present invention, the camera 20 comprises a video camera, a still camera, an analog camera, or a digital camera.
For purposes of explanation, suppose the object 22 is placed 10 meters from the camera 20 and is photographed so that the object 22 is contained within a single film or picture frame, as shown in Figures 2A and 2B. If the horizontal length (A) is 20 meters, the vertical length (B) will be 15 meters for a 4:3 camera ratio. Since the distance between the camera 20 and the object is 10 meters, the shooting ratio is 20:15:10 = 2:1.5:1. In one embodiment of the invention, the shooting ratio at the time an object is photographed can be determined from the optical characteristics of the camera's objective lens, e.g., its maximum and minimum shooting ratios.
Figures 3A and 3B represent the screen ratio of a display device that displays a photographed image. The screen ratio relates to the range an observer can see through a display device. The screen ratio comprises three parameters (D:E:F). The parameters D and E are defined as the horizontal and vertical lengths, respectively, of the image displayed on the display device 24. The parameter F is defined as the perpendicular distance between the display device and the observer's eye 26. For convenience, only one eye 26 and one display device 24 are shown in Figures 3A and 3B, rather than two eyes and a pair of display devices. F can be measured automatically using a distance-sensing sensor, measured manually, or fixed. In one embodiment of the invention, the parameters D and E are adjusted so that the shooting ratio (A:B:C) and the screen ratio (D:E:F) are equal. In this way, the adjusted size of the image on the display device 24 corresponds to the size of the image photographed by the camera 20; that is, the size of the displayed image the observer sees is proportional to the size of the image generated by the camera 20. Thus, by always maintaining the relation A:B:C = D:E:F, a more realistic three-dimensional image is provided to the observer. Accordingly, in one embodiment of the present invention, if the camera photographs an object with a larger shooting ratio, the image is displayed with a correspondingly larger screen ratio.
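The relation A:B:C = D:E:F can be made concrete with a short sketch. The following Python fragment is an illustration only; the function name and the numeric example are not taken from the patent, and the patent's image size adjusting parts additionally account for the physical screen size (G, H). It computes the displayed image dimensions D and E that preserve the shooting ratio for a given screen-to-observer distance F, using the similar-triangle relations D = (A/C)·F and E = (B/C)·F:

```python
def display_size(a, b, c, f):
    """Given a shooting ratio (a : b : c) and the screen-to-observer
    distance f, return the horizontal and vertical image sizes (d, e)
    that keep the screen ratio d : e : f equal to a : b : c."""
    d = (a / c) * f          # horizontal size of the displayed image
    e = (b / c) * f          # vertical size of the displayed image
    return d, e

# Example: the shooting ratio 2 : 1.5 : 1 from the text, observer 0.5 m
# from the screen.
d, e = display_size(2.0, 1.5, 1.0, 0.5)
print(d, e)  # 1.0 0.75 -> draw the image 1.0 m wide and 0.75 m tall
```

With a larger shooting ratio (longer focal length), the same viewing distance yields a larger displayed image, which is exactly the behavior the text describes.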
Figure 4A shows the variation of the distance between the objective lens and the film according to the variation of the focal length of the camera 20. (Note that although the term "film" is used in this specification, the term is not limited to analog image recording media. For example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) image sensor may be used to capture images in a digital environment.) The camera 20 may have more focal ranges, but Figure 4A illustrates four focal ranges as an example.
As shown in Figure 4A, the distance between the film and the objective lens ranges from d1 to d4 according to the focal length of the camera 20. The focal length can be adjusted by a focus adjustment part of the camera 20 (described below). When the focal length is infinity (∞), the distance (d1) is the shortest. When the camera 20 is configured to have an infinite focal length, the camera 20 receives the maximum amount of light through the objective lens. When the focal length is 0.5 meters, the camera receives the minimum amount of light through the objective lens, and the distance (d4) is the longest. That is, the amount of light entering the camera 20 varies according to the focal length of the camera 20.
Since the position of the objective lens is normally fixed, in order to change the distance from d1 to d4, the film moves within a position range from Ps to Pe, equal to "d", according to the focal length. The focus adjustment part of the camera 20 adjusts the position of the film, for example from Ps to P1. The focus adjustment of the camera 20 can be performed manually or automatically.
Figure 4B shows the variation of the shooting ratio according to the focal length of the camera 20. The shooting ratio (A:B:C) can be expressed as (A/C : B/C). When the camera is configured to have an infinite focal length, the value A/C or B/C is at its maximum, shown in Figure 4B as "2.0/1". Conversely, when the camera 20 is configured to have, for example, a focal length of 0.5 meters, the value A/C or B/C is at its minimum, shown in Figure 4B as "1.0/1". That is, the more light the camera receives, the larger the shooting ratio; similarly, the longer the focal length, the larger the shooting ratio.
Figure 4C shows the relation between the shooting ratio and the focal length of a camera. The focal length of the camera can be determined, for example, by detecting the current scale position of the camera's focus adjustment part. As shown in Figure 4C, when the camera has a focal range of 0.3 meters to infinity and photographs an object, the focus adjustment part is at a scale position between 0.3 meters and infinity. In this case, the shooting ratio varies linearly, as shown in Figure 4C. If the camera has a focus adjustment part that is adjusted automatically when photographing an object, the shooting ratio can be determined by automatically detecting the adjusted current focal length.
Figure 4D shows an exemplary table presenting the maximum and minimum shooting ratios of a camera. As described above, when the focal length is longest, e.g., the infinite distance shown in Figure 4D, the camera has its maximum shooting ratio (A:B:C = 3:2:1). When the focal length is shortest, e.g., 0.3 meters as shown in Figure 4D, the camera has its minimum shooting ratio (A:B:C = 1.5:1:1). The maximum and minimum shooting ratios of a camera are determined by its optical characteristics. In one embodiment, a camera manufacturer may provide the maximum and minimum shooting ratios in the camera's technical specification. The table in Figure 4D is used to determine the shooting ratio when the focus adjustment part is positioned on a scale between 0.3 meters and infinity.
Method and system for calculating the shooting ratio of a camera
Figure 5A shows a shooting ratio calculation apparatus according to one aspect of the present invention. The apparatus comprises a focus adjustment part (FAP) 52, an FAP position detecting part 54, a memory 56, and a shooting ratio calculating part 58. In one embodiment, the shooting ratio calculation apparatus may be embedded in the camera 20.
The focus adjustment part 52 adjusts the focus of the objective lens of the camera 20. The focus adjustment part 52 may perform its function manually or automatically. In one embodiment of the invention, the focus adjustment part 52 may include ten scale positions between 0.3 meters and infinity, and when the camera 20 photographs an object, the focus adjustment part 52 is positioned at one of these scale positions. In one embodiment of the invention, the focus adjustment part 52 may be a known focus adjustment part used in a typical camera.
The FAP position detecting part 54 detects the current scale position of the focus adjustment part 52 among the scale positions. In one embodiment of the invention, the FAP position detecting part 54 may comprise a known position detection sensor that detects the scale value at which the focus adjustment part 52 is positioned. In another embodiment of the invention, since the variation of the scale position is proportional to the distance between the objective lens and the film, as shown in Figure 4A, the FAP position detecting part 54 may comprise a known distance measuring sensor that measures the distance between the objective lens and the film.
The memory 56 stores data representing the maximum and minimum shooting ratios of the camera 20. In one embodiment of the invention, the memory 56 comprises a read-only memory (ROM), a flash memory, or a programmable read-only memory. The same applies to all other memories described in this specification.
The shooting ratio calculating part 58 calculates a shooting ratio (A:B:C) according to the detected scale position and the maximum and minimum shooting ratios. In one embodiment of the invention, the shooting ratio calculating part 58 comprises a digital signal processor (DSP) that calculates the ratio (A:B:C) using Formulas I and II below.
Formula I:
A = ((A_max - A_min) / c) × (S_cur / S_tot) + A_min / c
Formula II:
B = ((B_max - B_min) / c) × (S_cur / S_tot) + B_min / c
In Formulas I and II, the parameters A_max and B_max represent the horizontal and vertical length values (A and B) of the maximum shooting ratio, respectively; in the example of Figure 4D these are "3" and "2". The parameters A_min and B_min represent the horizontal and vertical length values (A and B) of the minimum shooting ratio, respectively; in the example of Figure 4D these are "1.5" and "1". The parameters S_cur and S_tot represent the currently detected scale value and the total scale value, respectively. The parameter "c" represents the distance value of the maximum or minimum shooting ratio. Since the shooting ratio (A:B:C) represents the relative proportion among the three parameters A, B, and C, the parameters can be simplified as shown in Figure 4D. For example, the shooting ratio A:B:C = 300:200:100 is the same as A:B:C = 3:2:1. In one embodiment of the invention, the parameter "c" has the value "1", as shown in Figure 4D.
In another embodiment of the invention, the shooting ratio calculating part 58 calculates a shooting ratio (A:B:C) such that the ratio falls between the maximum and minimum shooting ratios while remaining proportional to the value of the detected scale position. Thus, any other formula may be used to calculate the shooting ratio, as long as the resulting ratio falls between the maximum and minimum shooting ratios and is proportional to the value of the detected scale position.
Referring to Figure 6A, the operation of the shooting ratio calculation apparatus of Figure 5A is explained. The camera 20 photographs an object (602). In one embodiment of the invention, the camera 20 comprises a single camera. In another embodiment of the invention, the camera 20 comprises a pair of stereo cameras, as shown in Figure 1A. In either case, for convenience, the operation of the apparatus is described with respect to a single camera.
The maximum and minimum shooting ratios are provided from the memory 56 to the shooting ratio calculating part 58 (604). In one embodiment of the invention, the shooting ratio calculating part 58 may store the maximum and minimum shooting ratios internally; in this case, the memory 56 may be omitted from the apparatus.
The FAP position detecting part 54 detects the current position of the focus adjustment part 52 while the camera 20 photographs the object (606). The focal length may change while the camera photographs the object. The detected current position of the focus adjustment part 52 is provided to the shooting ratio calculating part 58.
The shooting ratio calculating part 58 calculates the horizontal value (A) of the current shooting ratio from Formula I (608). Suppose the detected current position value is "5" out of a total scale value of "10". Using Formula I and the table in Figure 4D, the horizontal value A is obtained as follows.
A = ((A_max - A_min) / c) × (S_cur / S_tot) + A_min / c = ((3 - 1.5) / 1) × (5 / 10) + 1.5 / 1 = 2.25
The shooting ratio calculating part 58 calculates the vertical value (B) of the current shooting ratio from Formula II (610). In the above example, using Formula II and the table in Figure 4D, the vertical value B is obtained as follows.
B = ((B_max - B_min) / c) × (S_cur / S_tot) + B_min / c = ((2 - 1) / 1) × (5 / 10) + 1 / 1 = 1.5
The shooting ratio calculating part 58 obtains the parameter C (612) from the maximum and minimum ratios used to calculate the parameters A and B. Referring to the table in Figure 4D, the distance value (C) is "1". The shooting ratio calculating part 58 then provides the current shooting ratio (A:B:C) (614). In the above example, the current shooting ratio is 2.25:1.5:1.
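Formulas I and II amount to a linear interpolation between the minimum and maximum shooting ratios, driven by the normalized scale position. The following sketch illustrates this; the function and parameter names are ours, not the patent's, and the Figure 4D values are used as defaults:

```python
def shooting_ratio_from_scale(s_cur, s_tot,
                              a_max=3.0, b_max=2.0,   # maximum ratio 3:2:1 (Figure 4D)
                              a_min=1.5, b_min=1.0,   # minimum ratio 1.5:1:1 (Figure 4D)
                              c=1.0):
    """Formulas I and II: interpolate the shooting ratio (A:B:C) from the
    detected scale position s_cur out of s_tot total scale positions."""
    a = ((a_max - a_min) / c) * (s_cur / s_tot) + a_min / c   # Formula I
    b = ((b_max - b_min) / c) * (s_cur / s_tot) + b_min / c   # Formula II
    return a, b, c

# Worked example from the text: scale position 5 of 10 gives 2.25:1.5:1.
print(shooting_ratio_from_scale(5, 10))  # (2.25, 1.5, 1.0)
```

At the scale extremes the function returns exactly the minimum (1.5, 1.0, 1.0) and maximum (3.0, 2.0, 1.0) shooting ratios, as the text requires.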
Figure 5B shows a block diagram of a shooting ratio calculation apparatus according to another aspect of the present invention. The apparatus comprises an iris 62, an iris opening detecting part 64, a memory 66, and a shooting ratio calculating part 68. In one embodiment of the invention, the shooting ratio calculation apparatus is embedded in the camera 20.
The iris 62 is a device that adjusts the amount of light entering the camera 20 according to its degree of opening. When the opening of the iris 62 is at its maximum, the maximum amount of light falls on the film of the camera 20. This maximum opening corresponds to the longest focal length and the maximum shooting ratio. Conversely, when the opening of the iris 62 is at its minimum, the minimum amount of light enters the camera 20. This minimum opening corresponds to the shortest focal length and the minimum shooting ratio. In one embodiment of the invention, the iris 62 may be a known iris used in a typical camera.
The iris opening detecting part 64 detects the degree of opening of the iris 62. The degree of opening of the iris 62 may be quantized into a range, e.g., 1-10, where degree "10" represents the maximum opening of the iris 62 and degree "1" represents the minimum opening. The memory 66 stores data representing the maximum and minimum shooting ratios of the camera 20.
The shooting ratio calculating part 68 calculates a shooting ratio (A:B:C) according to the detected degree of opening and the maximum and minimum shooting ratios. In one embodiment of the invention, the shooting ratio calculating part 68 comprises a digital signal processor (DSP) that calculates the ratio (A:B:C) using Formulas III and IV below.
Formula III:
A = ((A_max - A_min) / c) × (I_cur / I_largest) + A_min / c
Formula IV:
B = ((B_max - B_min) / c) × (I_cur / I_largest) + B_min / c
In Formulas III and IV, the parameters A_max, B_max, A_min, B_min, and "c" are the same as those used in Formulas I and II. The parameters I_cur and I_largest represent the currently detected degree of opening and the maximum degree of opening, respectively.
Referring to Figure 6B, the operation of this shooting ratio calculation apparatus is described. The first two processes, 702 and 704, are the same as the corresponding processes in Figure 6A.
While the camera 20 photographs the object, the iris opening detecting part 64 detects the current degree of opening of the iris 62 (706). The detected degree of opening of the iris 62 is provided to the shooting ratio calculating part 68.
The shooting ratio calculating part 68 calculates the horizontal value (A) of the current shooting ratio from Formula III (708). Suppose the detected current degree of opening is "2" out of a maximum degree of opening of "10". Using Formula III and Figure 4D, the horizontal value A is obtained as follows.
A = ((A_max - A_min) / c) × (I_cur / I_largest) + A_min / c = ((3 - 1.5) / 1) × (2 / 10) + 1.5 / 1 = 1.8
The shooting ratio calculating part 68 calculates the vertical value (B) of the current shooting ratio from Formula IV (710). In the above example, using Formula IV and Figure 4D, the vertical value B is obtained as follows.
B = ((B_max - B_min) / c) × (I_cur / I_largest) + B_min / c = ((2 - 1) / 1) × (2 / 10) + 1 / 1 = 1.2
The shooting ratio calculating part 68 obtains the parameter C (712) from the maximum and minimum ratios used to calculate the parameters A and B. Referring to Figure 4D, the distance value is "1". The shooting ratio calculating part 68 provides the current shooting ratio (A:B:C) (714). In the above example, the current shooting ratio is 1.8:1.2:1.
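Formulas III and IV are the same linear interpolation as Formulas I and II, with the normalized iris opening I_cur/I_largest taking the place of the scale fraction. A sketch under the same assumptions as before (names are ours; Figure 4D values as defaults):

```python
def shooting_ratio_from_iris(i_cur, i_largest,
                             a_max=3.0, b_max=2.0,   # maximum ratio 3:2:1 (Figure 4D)
                             a_min=1.5, b_min=1.0,   # minimum ratio 1.5:1:1 (Figure 4D)
                             c=1.0):
    """Formulas III and IV: interpolate the shooting ratio (A:B:C) from the
    detected iris opening i_cur out of a maximum opening of i_largest."""
    a = ((a_max - a_min) / c) * (i_cur / i_largest) + a_min / c   # Formula III
    b = ((b_max - b_min) / c) * (i_cur / i_largest) + b_min / c   # Formula IV
    return a, b, c

# Worked example from the text: opening 2 of 10 gives 1.8:1.2:1
# (rounded to suppress floating-point noise).
a, b, c = shooting_ratio_from_iris(2, 10)
print(round(a, 6), round(b, 6), c)  # 1.8 1.2 1.0
```

Either detector (focus scale or iris opening) thus yields the shooting ratio through the same interpolation between the camera's minimum and maximum ratios.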
Figure 7 shows a camera including the shooting ratio calculation apparatus shown in Figures 5A and 5B. The camera 20 comprises an image data processing apparatus 70, a microcomputer 72, a shooting ratio calculation apparatus 74, and a data combiner 76.
In one embodiment of the invention, the camera 20 comprises an analog camera or a digital camera. When the camera 20 photographs an object, the image data processing apparatus 70 performs typical image processing on the photographed image under the control of the microcomputer 72. In one embodiment of the invention, the image data processing apparatus 70 may comprise a digitizer that converts the photographed analog image into digital values, a memory that stores the digitized data, and a digital signal processor (DSP) that performs image data processing on the digitized image data (none of which are shown). The image data processing apparatus 70 provides the processed data to the data combiner 76.
In one embodiment, the shooting ratio calculation apparatus 74 comprises the apparatus shown in Figure 5A or 5B. The shooting ratio calculation apparatus 74 calculates a shooting ratio (A:B:C). The calculated shooting ratio (A:B:C) data are provided from the apparatus 74 to the data combiner 76.
The microcomputer 72 controls the image data processing apparatus 70, the shooting ratio calculation apparatus 74, and the data combiner 76 so that the camera 20 outputs the combined data 78. In one embodiment of the invention, the microcomputer 72 controls the image data processing apparatus 70 so that the apparatus properly processes the digital image data. In this embodiment of the invention, the microcomputer 72 controls the shooting ratio calculation apparatus 74 to calculate a shooting ratio for the photographed image, and controls the data combiner 76 to combine the processed data with the shooting ratio data corresponding to the processed data. In one embodiment of the invention, the microcomputer 72 may provide a synchronization signal to the data combiner 76 for combining the image data and the ratio data. As described above, as long as the current scale position of the focus adjustment part or the degree of opening of the iris is not changed, the shooting ratio remains constant. The microcomputer 72 can detect changes in the scale position or the degree of opening and control the data combiner 76 so that the image data and the corresponding ratio data are properly combined.
In one embodiment of the invention, the microcomputer 72 is programmed to perform the above functions and may be a typical microcomputer product available from companies such as Intel, IBM, and Motorola. The same applies to the other microcomputers described in this specification.
The data combiner 76, under the control of the microcomputer 72, combines the image data from the image data processing apparatus 70 with the calculated shooting ratio (A:B:C) data. The combiner 76 outputs the combined data 78, in which the image data and the ratio data may be synchronized with each other. In one embodiment of the invention, the combiner 76 may comprise a known multiplexer.
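The patent leaves the exact layout of the combined data 78 open (a known multiplexer suffices). Purely as an illustration, the sketch below tags a stream of processed frames with "ratio" records, emitting a new record only when the shooting ratio changes, which mirrors the microcomputer 72 detecting changes in the scale position or iris opening. The record layout is our assumption, not the patent's:

```python
def combine(frames, ratios):
    """Multiplex processed image frames with their shooting-ratio data.
    `frames` is a list of frame payloads and `ratios` the (A, B, C) ratio
    in effect for each frame. A 'ratio' record is emitted only when the
    ratio changes, so unchanged frames share the last transmitted ratio."""
    out, last = [], None
    for frame, ratio in zip(frames, ratios):
        if ratio != last:                      # scale position or iris changed
            out.append(("ratio", ratio))       # send the new shooting ratio
            last = ratio
        out.append(("image", frame))           # then the synchronized frame
    return out

stream = combine(["f0", "f1", "f2"],
                 [(2.25, 1.5, 1), (2.25, 1.5, 1), (1.8, 1.2, 1)])
print(stream)
# [('ratio', (2.25, 1.5, 1)), ('image', 'f0'), ('image', 'f1'),
#  ('ratio', (1.8, 1.2, 1)), ('image', 'f2')]
```

The display site's data extractor would perform the inverse operation, splitting each received record back into ratio data and image data.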
Method and system for controlling the screen ratio according to a shooting ratio
Figure 8 shows a system for displaying stereo images such that the shooting ratio (A:B:C) is substantially the same as the screen ratio (D:E:F). The system comprises a camera site 80 and a display site 82. The camera site 80 transmits a shooting ratio (A:B:C) together with the photographed image to the display site 82. The display site 82 displays the transmitted image such that the screen ratio (D:E:F) is substantially the same as the shooting ratio (A:B:C). In one embodiment of the invention, the camera site 80 may comprise a single camera and the display site 82 a single display device. In another embodiment of the invention, the camera site 80 may comprise a pair of stereo cameras and the display site 82 a pair of display devices, as shown in Figure 8.
The embodiment of the camera site 80 shown in Figure 8 comprises a pair of stereo cameras 110 and 120 and transmitters 806 and 808. The left and right stereo cameras 110 and 120 may be placed with respect to the object to be photographed as shown in Figure 1A. The cameras 110 and 120 comprise the elements described in connection with Figure 7. Each of the cameras 110 and 120 provides its own combined data 802 and 804 to the transmitters 806 and 808, respectively. Each of the transmitters 806 and 808 transmits the combined data 802 and 804 through a network 84 to the display site 82. The network 84 may comprise a wired or a wireless transmission. In one embodiment of the invention, each of the transmitters 806 and 808 is separate from the cameras 110 and 120. In another embodiment of the invention, each of the transmitters 806 and 808 may be embedded in the cameras 110 and 120. For simplicity, suppose the two shooting ratios are A1:B1:C1 and A2:B2:C2, respectively.
In one embodiment of the invention, the shooting ratios A1:B1:C1 and A2:B2:C2 are substantially the same. In one embodiment of the invention, the data 802 and 804 may be combined and transmitted together to the display site 82. In one embodiment of the invention, the shooting ratio may have a standard data format at each camera and display site, so that the display site can easily recognize the shooting ratio.
Described demonstration place 82 comprises 820,832, one group of display device 86,88 of a bank of receivers.Each receiver 820,832 receives the data that are synthesized that send from described camera place 80, and provides each data set to described display device 86,88 respectively.In one embodiment of the invention, each and described display device 86,88 are separated in the described receiver 820,832.In another embodiment of the present invention, described receiver 820,832 can be embedded in the described display device 86,88.Described display device 86,88 comprises data extractor 822,834, picture size adjustment member 828 and 840, and display screen 830 and 842.Described data extractor 822 and 834 separates described shooting ratio data (824,838) and described view data (826,836) from received data.In one embodiment of the invention, each in the described data extractor 822 and 834 comprises a typical multiplexer.
The image-size adjustment portion 828 adjusts the size of the image to be displayed on the display screen 830 according to the photographing ratio (A1:B1:C1), the screen-to-observer distance (F1), and the display-screen size values (G1, H1). The screen-to-observer distance (F1) represents the distance between the display screen 830 and the observer's eye, e.g., the left eye aligned with the screen 830. In one embodiment of the invention, the distance F1 may be fixed. In that case, the observer's eyes may be placed in an eye-fixing structure, which will be described below in more detail. Also, the image-size adjustment portion 828 may store the fixed value F1 therein. The screen size values G1 and H1 represent the horizontal and vertical dimensions of the screen 830, respectively. In one embodiment of the invention, the size values G1 and H1 may be stored in the image-size adjustment portion 828.
The image-size adjustment portion 840 adjusts the size of the image to be displayed on the screen 842 according to the photographing ratio (A2:B2:C2), the screen-to-observer distance (F2), and the display-screen size values (G2, H2). The screen-to-observer distance (F2) represents the distance between the display screen 842 and the observer's eye, e.g., the right eye aligned with the screen 842. In one embodiment of the invention, the distance F2 may be fixed. In one embodiment of the invention, the screen-to-observer distance (F2) and the screen-to-observer distance (F1) are substantially the same. The screen size values G2 and H2 represent the horizontal and vertical dimensions of the screen 842, respectively. In one embodiment of the invention, the display-screen size values G2 and H2 are substantially identical to the display-screen size values G1 and H1.
The operation of the image-size adjustment portions 828 and 840 will be explained in greater detail in connection with Fig. 9 and Figure 10. Because the operations of the two image-size adjustment portions 828 and 840 are substantially the same, for convenience only the operation of the image-size adjustment portion 828 is explained.
The image data 826, the photographing-ratio data (A1:B1:C1), and the screen-to-observer distance (F1) are provided to the image-size adjustment portion 828 (902). The screen ratio (D1:E1:F1) is calculated from the photographing ratio (A1:B1:C1) and the screen-to-observer distance (F1) using Formula V below (904). Since the value F1 is given, the screen-ratio parameters D1 and E1 are obtained from Formula V.
Formula V:
A1 : B1 : C1 = D1 : E1 : F1
D1 = (A1 × F1) / C1
E1 = (B1 × F1) / C1
The horizontal and vertical screen size values (G1, H1) of the display screen 830 are provided to the image-size adjustment portion 828 (906). In one embodiment of the invention, the screen size values G1 and H1 and the distance value F1 are fixed and stored in the image-size adjustment portion 828. In another embodiment of the invention, the screen size values G1 and H1 and the distance value F1 are manually provided to the image-size adjustment portion 828.
The image enlargement (reduction) ratios d and e are calculated from Formula VI below (908). The ratios d and e represent the horizontal and vertical enlargement (reduction) ratios for the display screen 830, respectively.
Formula VI:
d = D1 / G1
e = E1 / H1
The enlargement or reduction of the provided image 826 is thus performed with respect to the screen size (G1, H1). If the calculated value "D1" is greater than the horizontal screen size value (G1), the provided image needs to be enlarged by the factor "d". If the calculated value "D1" is less than the horizontal screen size value (G1), the provided image needs to be reduced by the factor "d". The same applies to the calculated value "E1". This enlargement or reduction lets an observer perceive the image at the same ratio at which the camera 110 photographed the object. The combination of the display devices 86 and 88 provides a more realistic three-dimensional image to the observer.
It is determined whether the enlargement (reduction) ratios (d, e) are greater than "1" (910). If both ratios (d, e) are greater than 1, the image data 826 is enlarged by the ratios "d" and "e", respectively, as shown in Figure 10A (912). In one embodiment of the invention, the portion of the image that exceeds the screen size (G1, H1) is cut off, as shown in Figure 10A (914).
If the two ratios "d" and "e" are not both greater than 1, it is determined whether the enlargement (reduction) ratios (d, e) are less than "1" (916). If both ratios d and e are less than 1, the image data 826 is reduced by the ratios "d" and "e", respectively, as shown in Figure 10B (918). In one embodiment of the invention, the blank portion of the screen is filled with a background color, e.g., black, as shown in Figure 10B (920).
If the two ratios d and e equal 1, no adjustment is made to the size of the image (922). In this case, since the enlargement (reduction) ratio is 1, the image is neither enlarged nor reduced, as shown in Figure 10C.
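The size-adjustment procedure of steps 902-922 can be sketched as follows. This is a minimal illustrative sketch only: the function name, the tuple arguments, and the string labels are assumptions for illustration, not part of the disclosed apparatus.

```python
def adjust_image_size(shooting_ratio, f1, screen_size):
    """Compute the screen ratio (Formula V) and scale factors (Formula VI)."""
    a1, b1, c1 = shooting_ratio      # photographing ratio A1:B1:C1
    g1, h1 = screen_size             # horizontal/vertical screen size G1, H1
    # Formula V: A1:B1:C1 = D1:E1:F1, with F1 (screen-observer distance) given
    d1 = a1 * f1 / c1
    e1 = b1 * f1 / c1
    # Formula VI: horizontal and vertical enlargement (reduction) ratios
    d = d1 / g1
    e = e1 / h1
    if d > 1 and e > 1:
        action = "enlarge"   # step 912; overflow beyond (G1, H1) is cut off (914)
    elif d < 1 and e < 1:
        action = "reduce"    # step 918; blank screen area filled with black (920)
    else:
        action = "none"      # step 922; applies when d and e equal 1
    return d, e, action
```

Displaying the scaled image then reproduces the photographing ratio on the screen, which is the stated goal of steps 1240-1260 below.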
Now, with reference to Figure 11, the overall operation of the system shown in Fig. 8 is described. An object is photographed using a pair of stereoscopic cameras 110 and 120 (1120), as exemplified in Figure 1A. Each of the cameras 110 and 120 calculates its photographing ratio (A1:B1:C1) and (A2:B2:C2), respectively (1140), for example, using the method shown in Figure 6.
The image data of the photographed image and the calculated photographing ratio are synthesized in each of the stereoscopic cameras 110 and 120 (1160). The synthesized data are denoted by reference numerals 802 and 804 in Fig. 8. In one embodiment of the invention, the synthesizing is performed for each frame of the image data. In one embodiment of the invention, as long as the photographing ratio remains unchanged, the synthesizing is not performed, and only the image data, without the photographing ratio, is transmitted to the display site 82. In this case, when the photographing ratio changes, the synthesizing restarts. Alternatively, the photographing ratio is not synthesized but is transmitted separately from the image data. Each of the transmitters 806 and 808 transmits the synthesized data through the communication network 84 to the display site 82 (1180).
Each of the receivers 820 and 832 receives the data transmitted from the camera site 80 (1200). The photographing ratio and the image data are separated from the synthesized data (1220). As an alternative to 1200 and 1220, the image data and the photographing ratio may be received separately when they were not synthesized for transmission. In one embodiment of the invention, the synthesized data may not include a photographing ratio. In that case, the most recently received photographing ratio is used to calculate the screen ratio. In one embodiment of the invention, the screen ratio may remain unchanged until a new photographing ratio is received.
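The synthesize/separate steps (1160, 1220) can be sketched with a hypothetical wire format. The length-prefixed JSON header below is purely an illustrative assumption; the patent only requires that the photographing ratio and the image data be combinable at the camera site and separable at the display site, and that a missing ratio fall back to the last one received.

```python
import json
import struct

def synthesize(image_bytes, shooting_ratio=None):
    """Pack optional photographing-ratio metadata ahead of a frame's image data."""
    header = json.dumps({"ratio": shooting_ratio}).encode()
    return struct.pack(">I", len(header)) + header + image_bytes

def extract(packet, last_ratio=None):
    """Recover (ratio, image); fall back to the most recently received ratio."""
    (hlen,) = struct.unpack(">I", packet[:4])
    header = json.loads(packet[4:4 + hlen])
    ratio = header["ratio"] if header["ratio"] is not None else last_ratio
    return ratio, packet[4 + hlen:]
```

When the ratio is unchanged, the camera site can send frames with `shooting_ratio=None` and the display site keeps using `last_ratio`, matching the frame-skipping embodiment described above.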
The screen ratios (D1:E1:F1) and (D2:E2:F2) of each of the display devices 86 and 88 are calculated (1240) using the method described in connection with Fig. 9. The stereoscopic image is displayed such that each of the photographing ratios (A1:B1:C1) and (A2:B2:C2) is substantially the same as each of the screen ratios (D1:E1:F1) and (D2:E2:F2) (1260). In this case, the image may be enlarged or reduced according to the screen size of each of the display devices 86 and 88, as discussed in connection with Fig. 9 and Figure 10.
Method and system for controlling the display position of a stereoscopic image
Figure 12 shows examples of the display system according to an embodiment of the invention. Figure 12A shows a head-mounted display (HMD) system. The HMD system comprises a pair of display screens 1200 and 1220. For convenience, the electronic display mechanism shown in Fig. 8 is omitted in this HMD system. An observer wears the HMD on his or her head and observes stereoscopic images through each of the display screens 1200 and 1220. Thus, in one embodiment of the invention, the screen-to-eye distance (F) can be fixed. In another embodiment of the invention, the distance (F) can be measured by a known distance-detecting sensor and provided to the HMD system. Another embodiment of the invention comprises a stereoscopic display system as shown in Figure 1B. Still another embodiment of the display device comprises a pair of projection devices that project a pair of stereoscopic images onto a screen.
Figure 12B shows a stereoscopic display system according to another embodiment of the invention. The display system comprises a V-shaped mirror 1240 and a pair of display devices 1260 and 1280. In one embodiment of the invention, the display devices 1260 and 1280 are substantially the same as the display devices 86 and 88 in Fig. 8, except that each further comprises an inversion portion (not shown). The inversion portion reverses the left and right sides of the image to be displayed. The V-shaped mirror 1240 reflects the images coming from the display devices 1260 and 1280 into an observer's eyes. Thus, the observer sees the images reflected from the V-shaped mirror 1240. The stereoscopic system comprising the V-shaped mirror is disclosed in U.S. patent application 10/067,628, filed on February 4, 2002 by the inventor of this patent. For convenience, the invention is hereinafter described mainly in terms of the display system shown in Figure 12B; however, the invention is equally applicable to other display systems, for example, the display system shown in Figure 12A.
Figure 13 shows a stereoscopic display system according to an aspect of the invention, comprising an eye-position fixing device 1300. Referring to Figs. 13A and 13B, the eye-position fixing device 1300 is located in front of the V-shaped mirror 1240, at a predetermined distance from the mirror 1240. The eye-position fixing device 1300 serves to fix the distance between the mirror 1240 and the observer's eyes. The eye-position fixing device 1300 also serves to position the observer's eyes such that each eye of the observer is substantially perpendicular to each mirrored (virtual) image. A pair of holes 1320 and 1340 defined in the device 1300 is configured such that the observer can observe the center point of each reflected image. In one embodiment of the invention, the size of each of the holes 1320 and 1340 is large enough that the observer can see the entire half (left or right half) of the V-shaped mirror 1240 located at a predetermined distance and position, as shown in Figs. 13A and 13B. In one embodiment of the invention, the eye-position fixing device 1300 may be used to fix the position of an observer's eyes as needed, in connection with other aspects of the invention discussed below.
Figure 14 A has shown the relation between a described image that is shown and an observer's the eyes.Distance (W d) distance between each the central point (1430,1440) in the image (1410,1420) that is shown of representative.Distance (W a) represent the distance between the central point (1450,1460) of each eye of observer.Described distance W aVary with each individual.Usually, described distance increases along with growing up of people, and when he or she arrived specific age, described distance just no longer changed.An adult mean distance may be 70 millimeters.Some may be 80 millimeters, and some other people may be 60 millimeters.Distance (V a) represent the distance between the central point (1470,1480) of each eye lens of observer.Here, eye lens are illustrated in piece of meat circle, transparent after the pupil of eyes.Described eye lens move along with the motion of eyes.Described apart from V a(F) changes according to the distance between object and described observer's the eyes.Described distance (F) is big more, described value V aBig more.With reference to figure 14B, when seeing an object, the observer is distal to, for example, and 10,000 meters, V aHas maximal value (V Amax), with described distance W aSubstantially the same.
Traditional three-dimensional display system display image is not considered described value W aThis represents described distance value (W d) all the same for all observers, and do not consider that they have different W aThe fact of value.These traditional systems cause some undesirable problems, as headache or dim eyesight, and the damage of three-dimensional sensation.In order to produce more real 3-D view, alleviate observer's headache or dim eyesight simultaneously, described distance W dNeed by considering described distance W aBe determined again.Consider W aValue can provide better to the observer, more real 3-D view.In one embodiment of the invention, described distance W dBe adjusted, make described distance W dWith W aBasic identical.
Figure 15 shows a stereoscopic image display system according to an aspect of the invention. Again, the system can be used together with other systems, for example an HMD system, a display system with a V-shaped mirror as shown in Figs. 13A and 13B, or a projection display system.
The system shown in Figure 15 comprises a pair of display devices 1260 and 1280 and a pair of input devices 1400 and 1500. Each input device 1400 and 1500 provides the distance value Wa to each display device 1260 and 1280. In one embodiment of the invention, each input device 1400 and 1500 comprises a keyboard, a mouse, a pointing device, or a remote controller. In one embodiment of the invention, one of the input devices 1400 and 1500 may be omitted, and the other input device is used to provide the distance value Wa to both display devices 1260 and 1280.
The display devices 1260 and 1280 comprise interfaces 1510 and 1550, microcomputers 1520 and 1560, display drivers 1530 and 1570, and display screens 1540 and 1580, respectively. In one embodiment of the invention, each of the display screens 1540 and 1580 comprises a liquid-crystal display (LCD) screen, a cathode-ray tube (CRT) screen, or a plasma display panel (PDP) screen. The interfaces 1510 and 1550 provide interfaces between the input devices 1400 and 1500 and the microcomputers 1520 and 1560, respectively. In one embodiment of the invention, each of the interfaces 1510 and 1550 comprises a typical input-device controller and/or a typical interface module (not shown).
There are several methods of measuring and providing the distance (Wa). For example, an optometrist can measure an observer's Wa value using an eye-examination apparatus. In this case, the observer can input the value (Wa) through the input devices 1400 and 1500. Alternatively, for example, an eye-lens movement detector can be used to measure the Wa value. In this case, the Wa value can be provided from the detector to either of the input devices 1400 and 1500, or to the interfaces 1510 and 1550 in Figure 15.
As another example, as shown in Figure 14C, the Wa value can be measured using a pair of parallel pipes 200 and 220, each about 1 meter long and about 1 millimeter in diameter, held about 1 centimeter from the observer's eyes. Both ends of each pipe 200, 220 are open. The pipe distance (Pd) can be adjusted between about 40 millimeters and 120 millimeters by widening or narrowing the distance between the pipes 200 and 220. The pipes 200 and 220 remain parallel while the distance between them is widened or narrowed. A scale 240 can be attached to the pipes 200 and 220, as shown in Figure 14C, so that the scale 240 can measure the distance between the pipes 200 and 220. When the observer completely sees each far hole 260, 280 through the hole nearer to the respective eye, the scale 240 indicates the observer's Wa value. In another embodiment, red and blue materials (paper, plastic, or glass) may cover the holes 260 and 280, respectively. In this case, the pipe distance (Pd) equals the Wa value when the observer, looking through the holes 260 and 280, observes purple formed by the combination of the red and the blue.
Each microcomputer 1520 and 1560 determines, according to the provided Wa value, the amount of movement for the displayed image such that the Wd value becomes substantially the same as the Wa value. In one embodiment of the invention, each microcomputer (1520, 1560) initializes the distance value Wd and determines the amount of movement for the displayed image according to the value Wa and the initial value Wd. Each display driver 1530 and 1570 moves the displayed image according to the determined amount of movement and displays the moved image on each display screen 1540 and 1580. In one embodiment of the invention, each microcomputer (1520, 1560) may incorporate the functions of the display drivers 1530 and 1570. In this case, the display drivers 1530 and 1570 may be omitted.
Referring to Figure 16, the operation of the system shown in Figure 15 is described. A pair of stereoscopic images is displayed on the pair of display screens 1540 and 1580 (1610). The stereoscopic images can be provided by the stereoscopic cameras 110 and 120, respectively, as shown in Figure 1A. The distance (Wd) between the center points of the displayed images is initialized (1620). In one embodiment of the invention, the initial value may comprise the average adult interocular distance value, e.g., "70 millimeters". The distance (Wa) between the center points of the observer's eyes is provided (1630).
It is then determined whether Wa equals Wd (1640). If Wa equals Wd, no movement is performed on the displayed images (1680). In this case, since the distance (Wa) between the center points of the observer's eyes and the distance (Wd) between the center points of the displayed images are the same, no adjustment is made to the displayed images.
If Wa is not equal to Wd, it is determined whether Wa is greater than Wd (1650). If Wa is greater than Wd, the distance (Wd) needs to be increased until Wd equals Wa. In this case, the left image 1750 displayed on the left screen 1540 is moved to the left, and the right image 1760 displayed on the right screen 1580 is moved to the right, until the two values are substantially the same, as shown in Figure 17A. Referring to Figure 17B, the movement of the displayed images 1750 and 1760 is illustrated conceptually for the display system with a V-shaped mirror. Since the V-shaped mirror reflects the displayed images, received from the display devices 1260 and 1280, to an observer, in order for the observer to see the adjusted images via the mirror as shown in Figure 17A, the displayed images 1750 and 1760 need to be moved with respect to the V-shaped mirror as shown in Figure 17B. That is, when the displayed images 1750 and 1760 are moved as shown in Figure 17B, the observer looking at the V-shaped mirror perceives the movement of the images as shown in Figure 17A.
With respect to the HMD system shown in Figure 12A, the movement directions of the displayed images are the same as those shown in Figure 17A. With respect to the projection display system described in connection with Figure 15, since the projection display system projects images onto a screen located across from the projection system, the movement directions of the displayed images are opposite to those shown in Figure 17A.
If it is determined that Wa is not greater than Wd, the distance Wd needs to be reduced until Wd equals Wa. Thus, the left image 1770 displayed on the display device 1260 is moved to the right, and the right image 1780 displayed on the display device 1280 is moved to the left, until the two values are substantially the same, as shown in Figures 17C and 17D. The explanation of the movement of the displayed images in Figures 17A and 17B applies equally to the systems in Figures 17C and 17D.
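The decision flow of Figure 16 can be sketched as follows. This is a minimal sketch under an illustrative assumption: the difference Wa − Wd is split evenly across the two images, which the patent does not specify; the function name and sign convention are likewise hypothetical.

```python
def image_shift(wa, wd):
    """Per-image horizontal shift implementing the flow of Figure 16.

    Returns the signed shift for (left image, right image): negative
    means toward the left, positive toward the right.
    """
    if wa == wd:                # steps 1640/1680: no adjustment needed
        return 0.0, 0.0
    delta = (wa - wd) / 2.0
    # Wa > Wd: left image moves left, right image moves right (Fig. 17A);
    # Wa < Wd: the directions reverse (Fig. 17C). delta's sign covers both.
    return -delta, +delta
```

For a V-shaped-mirror system the on-screen directions are mirrored (Fig. 17B/17D), and for a projection system they are reversed, so the signs returned here would be flipped accordingly before driving the display.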
Figure 18 shows a stereoscopic image display system according to another embodiment of the invention. The system comprises an input device 1810, a microcomputer 1820, a pair of servo mechanisms 1830 and 1835, and a pair of display devices 1840 and 1845. The input device 1810 provides an observer's input, e.g., the distance value Wa, to each display device 1840 and 1845. In one embodiment of the invention, the input device 1810 can be, for example, a keyboard, a mouse, a pointing device, or a remote controller. For convenience, the interface is omitted.
The microcomputer 1820 determines, according to the provided value Wa, the amounts of movement for the display devices 1840 and 1845 such that the Wd value becomes substantially the same as the Wa value. In one embodiment of the invention, the microcomputer 1820 initializes the distance value (Wd) and determines the amounts of movement for the display devices 1840 and 1845 according to the value Wa and the initial value Wd. Each of the servo mechanisms 1830 and 1835 moves the display devices 1840 and 1845, respectively, according to the determined amount of movement.
Referring to Figure 19, the operation of the system in Figure 18 is explained. Each stereoscopic image is displayed on the display devices 1840 and 1845 (1850). The distance (Wd) between the center points of the displayed images is initialized (1855). In one embodiment of the invention, the initial value may be "70 millimeters". The distance (Wa) between the center points of an observer's two eyes is provided to the microcomputer 1820 (1860). It is determined whether Wa equals Wd (1870). If Wa equals Wd, the display devices 1840 and 1845 are not moved (1910). If it is determined that Wa is greater than Wd (1880), the servo mechanisms 1830 and 1835 move the display devices 1840 and 1845 in directions (1842, 1844) such that Wd increases to reach Wa, as shown in Figures 20A and 20B. If it is determined that Wa is not greater than Wd, the servo mechanisms 1830 and 1835 move the display devices 1840 and 1845 in directions (1846, 1848) such that Wd decreases to reach Wa, as shown in Figures 20C and 20D.
In another embodiment of the invention, the distance (Va) can be detected automatically using a known eye-lens movement detector. This embodiment of the invention is described in connection with Figure 21A. The detector 2100 detects the distance Va between the center points of the observer's eye lenses. The detector 2100 also detects the position of each eye lens. In Figures 21A and 21B, A2L and A2R represent the center points of the observer's eye lenses, and A3L and A3R represent the center points of the observer's eyes. As can be seen from Figures 21A and 21B, the position of A3L is fixed, while the position of A2L is movable. The detector 2100 detects the current position of each eye lens. In one embodiment of the invention, the detector 2100 comprises a known eye-lens detection sensor, such as the one disclosed in U.S. Patent 5,526,089.
The detected distance and position values are provided to a microcomputer 2120. The microcomputer 2120 receives the distance value Va and determines the amount of movement for the displayed images, or the amount of movement for the display devices, similarly to the description in connection with Figures 15-20. The determined amount is used to control the movement of the displayed images or the movement of the display devices. Further, the microcomputer 2120 determines the new positions of the center points of the images according to the position values of the eye lenses. Thus, the microcomputer 2120 controls the display drivers (1530, 1570) or the servo mechanisms (1830, 1835) to move the stereoscopic images from the current center points 2210 and 2230 of the images to new center points, for example, 2220 and 2240, as shown in Figure 22.
Method and system for providing movement information of stereoscopic cameras
Figure 23 shows a camera system for a stereoscopic display system according to an aspect of the invention. The camera system serves to provide photographed image data and camera-movement detection data to a display site. The camera system comprises a pair of stereoscopic cameras 2200 and 2210, movement-detecting devices 2220 and 2230, synthesizers 2240 and 2250, and transmitters 2280 and 2290. Each stereoscopic camera 2200, 2210 photographs an image and provides the photographed image data to each synthesizer 2240, 2250.
The movement-detecting devices 2220 and 2230 detect the movement of the cameras 2200 and 2210, respectively. The movement of the cameras 2200 and 2210 may comprise upward and downward directions, as well as leftward and rightward movement, as shown in Figure 23. Each detecting device (2220, 2230) provides detection data to each synthesizer 2240 and 2250. In one embodiment of the invention, if the detecting devices 2220 and 2230 do not detect any movement of the cameras 2200 and 2210, the devices 2220 and 2230 may provide no detection data to the synthesizers 2240 and 2250, or may provide detection data representing no movement. In one embodiment of the invention, each of the movement-detecting devices 2220 and 2230 comprises a typical motion-detection sensor. The motion-detection sensor may provide text or graphic detection data to the synthesizers 2240 and 2250.
The synthesizers 2240 and 2250 synthesize the image data and the movement detection data, and provide the synthesized data 2260 and 2270 to the transmitters 2280 and 2290, respectively. If the synthesizers 2240 and 2250 receive detection data representing no movement from the movement-detecting devices 2220 and 2230, or if the synthesizers 2240 and 2250 do not receive any movement data, each synthesizer (2240, 2250) provides only the image data, without movement detection data, to the transmitters 2280 and 2290. In one embodiment of the invention, each synthesizer 2240 and 2250 comprises a typical multiplexer. Each of the transmitters 2280 and 2290 transmits the synthesized data 2260 and 2270 to the display site through a communication network (not shown).
Figure 24 shows a display system corresponding to the camera system shown in Figure 23. The display system serves to inform an observer of camera movement. The display system comprises a pair of receivers 2300 and 2310, data extractors 2320 and 2330, image processors 2340 and 2360, microcomputers 2350 and 2370, on-screen display (OSD) circuits 2390 and 2410, synthesizers 2380 and 2400, display drivers 2420 and 2430, and display screens 2440 and 2450.
Each of the receivers 2300 and 2310 receives the synthesized data transmitted from the camera system and provides the received data to the data extractors 2320 and 2330, respectively. Each of the data extractors 2320 and 2330 separates the image data and the movement detection data from the received data. The image data are provided to the image processors 2340 and 2360. The movement detection data are provided to the microcomputers 2350 and 2370. The image processors 2340 and 2360 perform typical image-data processing on the image data and provide the processed data to the synthesizers 2380 and 2400, respectively.
Each of the microcomputers 2350 and 2370 determines camera-movement information from the movement detection data. In one embodiment of the invention, each microcomputer (2350, 2370) determines camera-movement information for at least four directions, e.g., up, down, left, and right. The microcomputers 2350 and 2370 provide the determined camera-movement information to the OSD circuits 2390 and 2410, respectively. Each OSD circuit 2390 and 2410 generates OSD data representing the camera movement according to the determined movement information. In one embodiment of the invention, the OSD data comprise arrow indications 2442-2448 representing the movement of the cameras 2200 and 2210. The arrows 2442 and 2448 represent upward and downward movement of each camera, respectively. The arrows 2444 and 2446 represent movement of each camera in a direction that widens or narrows the distance between the cameras, respectively.
The synthesizers 2380 and 2400 synthesize the processed image data and the OSD data, and provide the synthesized images to the display drivers 2420 and 2430, respectively. Each display driver 2420 and 2430 displays the synthesized image on each of the display screens 2440 and 2450.
Referring to Figure 25, the operation of the camera and display systems shown in Figures 23 and 24 is described. Each of the stereoscopic cameras 2200 and 2210 photographs an object (2460). The pair of movement-detecting devices 2220 and 2230 detects the movement of the cameras 2200 and 2210, respectively (2470). The photographed image data and the movement detection data are synthesized in each of the synthesizers 2240 and 2250 (2480). The synthesized data 2260 and 2270 are transmitted to the display site through a communication network (2490). In other embodiments, the synthesizing and separating shown in the chart are not performed.
The data transmitted from the camera system are provided to the data extractors 2320 and 2330 through the receivers 2300 and 2310 (2500). The image data and the movement detection data are separated in the data extractors 2320 and 2330 (2510). The image data are provided to the image processors 2340 and 2360, and each image processor 2340 and 2360 processes the image data (2520). The movement detection data are provided to the microcomputers 2350 and 2370, and each microcomputer 2350 and 2370 determines movement information from the movement detection data (2520).
In the OSD circuits 2390 and 2410, OSD data corresponding to the movement information are generated according to the determined movement information (2530). The processed image data and the OSD data are synthesized in the synthesizers 2380 and 2400 (2540). The synthesized data are displayed on the display screens 2440 and 2450 (2550). When the OSD data are displayed on the display screens 2440 and 2450, this means that at least one of the cameras 2200 and 2210 has moved. Accordingly, the displayed image also moves along the movement direction of the cameras 2200 and 2210. This serves to guide an observer's eye lenses to track the movement of the cameras 2200 and 2210. In one embodiment of the invention, the arrows 2442-2448 are displayed before the image moves due to the camera movement, so that the observer can anticipate the movement of the image in advance.
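The OSD generation step (2530) can be sketched as a simple lookup from detected movement to the arrow indications 2442-2448 of Fig. 24. The encoding of the detection data as direction strings, and the function name, are illustrative assumptions; the patent leaves the sensor's data format open.

```python
# Hypothetical mapping from detected camera movement to OSD arrow
# indications, using the reference numerals of Fig. 24.
ARROWS = {
    "up": 2442,      # camera moving upward
    "widen": 2444,   # cameras moving apart
    "narrow": 2446,  # cameras moving together
    "down": 2448,    # camera moving downward
}

def osd_for(detection):
    """Return the OSD arrow numbers for a list of detected movements.

    An empty result corresponds to "no movement": no OSD data is
    generated and only the processed image data is displayed.
    """
    return [ARROWS[d] for d in detection if d in ARROWS]
```

In the voice-message embodiment described below, the same lookup could select a spoken phrase instead of an arrow numeral.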
In another embodiment of the invention, the display system allows the observer to recognize the motion of the cameras 2200 and 2210 by providing a voice message representing the camera motion. For example, the voice message may be "the stereoscopic cameras are moving upward" or "the cameras are moving to the right." In this embodiment of the invention, the OSD circuits 2390 and 2410 may be omitted. In yet another embodiment of the invention, both the OSD data representing the camera motion and the voice message may be provided to the observer.
In one embodiment of the invention, the camera and display systems shown in Figures 23 and 24 may include the function of displaying the images such that the photographing ratio (A:B:C) is substantially the same as the screen ratio (A:B:C), as discussed in connection with Figures 7-11. In another embodiment of the invention, the system may include the function of displaying stereoscopic images such that the distance between the center points of the stereoscopic images is substantially the same as the distance between the center points of the observer's eyes, as discussed in connection with Figures 15-22.
Method and System for Controlling the Motion of Stereoscopic Cameras Based on the Motion of an Observer's Eye Lenses
Another aspect of the invention provides a stereoscopic display system that controls the motion of the cameras based on the motion of an observer's eye lenses. Before describing this aspect of the invention, the relationship between an observer's eyes and a pair of stereoscopic cameras will be described with reference to Figures 26-28.
Figure 26A is a conceptual diagram showing the parameters of the stereoscopic cameras. Each of the cameras 30 and 32 comprises an objective lens 34 and 36, respectively. The camera parameters include C2L, C2R, C3L, C3R, SCL, SCR, Vc and Wc. C2L and C2R represent the center points of the objective lenses 34 and 36, respectively. C3L and C3R represent the rotation axes of the cameras 30 and 32, respectively. SCL represents the line connecting C2L and C3L. SCR represents the line connecting C2R and C3R. Vc represents the distance between C2L and C2R. Wc represents the distance between C3L and C3R.
The rotation axes C3L and C3R do not move; they are the axes around which the cameras 30 and 32 rotate. The rotation axes C3L and C3R allow the cameras 30 and 32, respectively, to rotate in a reciprocating manner, like an automobile windshield wiper, as shown in Figures 27B-27E. Figure 27A shows the default positions of the cameras 30 and 32. Figures 27B-27D show the horizontal motion of the cameras 30 and 32. Figure 27E shows the vertical motion of the cameras 30 and 32. In one embodiment of the invention, the cameras 30 and 32 remain substantially parallel to each other while they move and after they have moved, as shown in Figures 27B-27E. Figure 27F is a front view of one of the stereoscopic cameras, illustrating the motion of the camera in eight directions. The diagonal motions 46a-46d can be performed by combining the horizontal and vertical motions. For example, the motion "46a" is realized by moving the camera to the left and upward.
Figure 26B is a conceptual diagram showing the parameters of an observer's eyes. Each of the eyes 38 and 40 comprises an eye lens 42 and 44, respectively. Each eye lens is located substantially on the outer surface of the eye. This means that the distance between the center point of each eye and its eye lens is substantially equal to the radius of the eye. The eye lenses move as the eyes rotate. The eye parameters include A2L, A2R, A3L, A3R, SAL, SAR, Va and Wa. A2L and A2R represent the center points of the eye lenses 42 and 44, respectively. Each of the eye lenses 42 and 44 performs substantially the same function, in terms of receiving images, as the objective lenses 34 and 36 of the stereoscopic cameras 30 and 32. Thus, the eye parameters A2L and A2R may correspond to the camera parameters C2L and C2R.
A3L and A3R represent the rotation axes of the eyes 38 and 40, respectively. The rotation axes A3L and A3R are the axes around which the eyes 38 and 40 rotate. The rotation axes A3L and A3R allow the eyes 38 and 40 to rotate, as shown in Figures 28B-28D. Just as the rotation axes C3L and C3R of the stereoscopic cameras 30 and 32 do not move while the cameras 30 and 32 rotate, the rotation axes A3L and A3R of the observer's eyes 38 and 40 do not move while the eyes 38 and 40 rotate. Thus, the eye parameters A3L and A3R may correspond to the camera parameters C3L and C3R.
SAL represents the line connecting A2L and A3L. SAR represents the line connecting A2R and A3R. As shown in Figures 26A and 26B, the eye parameters SAL and SAR may correspond to the camera parameters SCL and SCR, respectively. Va represents the distance between A2L and A2R. Wa represents the distance between A3L and A3R. Similarly, the eye parameters Va and Wa may correspond to the camera parameters Vc and Wc, respectively.
Referring to Figures 28A-28C, it can be seen that when the viewing direction of the eyes 38 and 40 changes, only the directions of SAL and SAR change, while the rotation axes A3L and A3R remain fixed. This means that Wa remains constant while SAL and SAR vary. Thus, in order to control the motion of the cameras 30 and 32 according to the motion of the eyes 38 and 40 while the distance Wc remains constant, the directions of the camera lines SCL and SCR need to be controlled according to the directions of the eye lines SAL and SAR.
Figure 28A shows an example of an eye configuration in which an observer views an object located at least "10,000 meters" away from him or her. This example corresponds to a camera configuration in which the focal distance of the cameras is infinity. As mentioned above, when an observer views an object located, for example, farther than "10,000 meters," the distance (Va) between the center points A2L and A2R of the eye lenses 42 and 44 is substantially the same as the distance (Wa) between the center points A3L and A3R of the eyes 38 and 40.
When the observer views an object located directly in front of him or her at a distance of about "10 meters" or closer, the observer's left eye rotates clockwise and the right eye rotates counterclockwise, as shown in Figure 28B. As a result, the distance Va becomes shorter than the distance Wa. If the observer views an object located ahead of him or her and to the right, both eyes rotate clockwise, as shown in Figure 28C. In this situation as well, the distance Va may be less than the distance Wa. Figure 28D illustrates the motion of the eyes in eight directions.
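The Va/Wa relationship described above can be sketched numerically. In this illustrative Python sketch (not part of the disclosure), the eyeball radius and the axis spacing Wa are assumed values; each eye lens is modeled as lying on the eyeball surface and rotating inward toward an object on the median axis.

```python
import math

def lens_separation(wa, r, d):
    """Distance Va between the eye-lens center points A2L and A2R when
    both eyes converge on an object a distance d straight ahead.
    wa: distance Wa between the rotation axes A3L and A3R
    r:  eyeball radius (the lens sits on the eyeball surface)"""
    theta = math.atan2(wa / 2.0, d)      # inward rotation of each eye
    return wa - 2.0 * r * math.sin(theta)

WA, R = 0.065, 0.012                     # assumed: 65 mm axis spacing, 12 mm eye radius
va_far = lens_separation(WA, R, 10_000)  # "infinite" viewing distance: Va ~ Wa
va_near = lens_separation(WA, R, 10)     # object 10 m ahead: Va < Wa
```

Consistent with Figures 28A and 28B, the separation at 10,000 meters is indistinguishable from Wa, while at 10 meters it is measurably smaller.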
Figure 29 shows a stereoscopic display system for controlling a pair of stereoscopic cameras according to another aspect of the invention. The system comprises a camera site and a display site. The display site transmits eye lens movement data to the camera site. The camera site controls the pair of stereoscopic cameras 30 and 32 based on the eye lens movement data.
The display site comprises an eye lens movement detection device 3000, a transmitter 3010, a pair of display devices 2980 and 2990, a pair of receivers 2960 and 2970, and a V-shaped mirror 2985. When the camera site transmits stereoscopic images to the display site through a pair of transmitters 2900 and 2930, the display site receives the images and displays them on the display devices 2980 and 2990. An observer views the stereoscopic images through the V-shaped mirror, which reflects the displayed images to the observer. While the observer views the images, his or her eye lenses may move in certain directions, e.g., latitudinally (up or down) and longitudinally (clockwise or counterclockwise). Again, another display device may be used, for example, an HMD or a projection display device, as described above.
The eye lens movement detection device 3000 detects the movement of each of the observer's eye lenses while the observer views the three-dimensional image through the V-shaped mirror. The movement includes the current positions of the eye lenses. The detection device 3000 is substantially the same as the device 2100 shown in Figure 21A. The detection device 3000 may convert the movement of the eye lenses into data that a microcomputer 2940 at the camera site can recognize, and provide the converted data to the transmitter 3010. In one embodiment of the invention, the detection data may comprise a pair of (x, y) values for each eye lens.
The transmitter 3010 transmits the eye lens movement data to the camera site through a communication network 3015. The detection data may include identification data that identifies each of the left and right eye lenses to the camera site. In one embodiment of the invention, the display site may comprise a pair of transmitters, each transmitting the left or the right eye lens movement data to the camera site. In one embodiment of the invention, before the movement data are transmitted, data modification, such as encoding and/or modulation adapted for the transmission, may be performed.
The camera site comprises the pair of stereoscopic cameras 30 and 32, a receiver 2950, a microcomputer 2940, a pair of camera controllers 2910 and 2920, and the pair of transmitters 2900 and 2930. The receiver 2950 receives the eye lens movement data from the display site and provides the data to the microcomputer 2940. The microcomputer 2940 determines each set of eye lens movement data from the received data, and provides the left and right eye lens movement data to the camera controllers 2910 and 2920, respectively. In one embodiment of the invention, the camera site may comprise a pair of receivers, each receiving the left or the right eye lens movement data, respectively, from the display site. In this situation, each receiver provides the respective eye lens detection data directly to the corresponding camera controller 2910 or 2920, and the microcomputer 2940 may be omitted.
The camera controllers 2910 and 2920 control each of the cameras 30 and 32 based on the received eye lens movement data. That is, each of the camera controllers 2910 and 2920 controls the respective camera 30 or 32 to move in substantially the same direction as the direction of movement of the eye lenses 42 and 44. Referring to Figure 30, the camera controllers 2910 and 2920 comprise servo controllers 3140 and 3190, horizontal motors 3120 and 3160, and vertical motors 3130 and 3180, respectively. Each of the servo controllers 3140 and 3190 controls the horizontal and vertical motors (3120, 3160, 3130, 3180) based on the received eye lens movement data. Each of the horizontal motors 3120 and 3160 moves the cameras 30 and 32, respectively, in the horizontal direction. Each of the vertical motors 3130 and 3180 moves the cameras 30 and 32, respectively, in the vertical direction.
Figure 31 shows a flowchart illustrating the operation of the camera controllers 2910 and 2920 according to one aspect of the invention. Figure 32A shows a table used for controlling the horizontal and vertical motors. Figure 32B shows a conceptual diagram explaining the motion of the camera. Referring to Figures 31 and 32, the operation of the camera controllers 2910 and 2920 will be described. Because the operations of the camera controllers 2910 and 2920 are substantially the same, only the operation of the camera controller 2910 is described. The servo controller 3140 initializes the camera adjustment values (3200). In one embodiment of the invention, the initialization of the camera adjustment values comprises setting a default value, for example, "(x, y) = (0, 0)," representing no motion. These values correspond to the eye lens movement data detected when the observer gazes in the forward direction without moving his or her eye lenses. In one embodiment of the invention, the initialization may comprise setting the relationship between the adjustment values and the actual amount of motion of the camera 30, as shown in Figure 32A.
The eye lens movement data are provided to the servo controller 3140 (3210). In one embodiment of the invention, the eye lens movement data comprise (x, y) coordinate values, where x and y represent the horizontal and vertical movements, respectively, of each eye lens.
The servo controller 3140 determines the camera adjustment values (X, Y) based on the provided eye lens movement data. It is determined whether X equals "0" (3230). If X is "0," the servo controller 3140 does not operate the horizontal motor 3120 (3290). If X is not "0," it is determined whether X is greater than "0" (3240). If X is greater than "0," the servo controller 3140 operates the horizontal motor 3120 to move the camera 30 to the right (3270). As shown in Figure 32A, for example, if the value X is "1," the amount of motion is "2°" and the direction is clockwise (the θ3 direction). If the value X is "2," the motion is "4°," in the clockwise direction.
If X is not greater than "0," meaning that X is less than "0," the servo controller 3140 operates the horizontal motor 3120 to move the camera 30 in the counterclockwise (θ1) direction (3260). Referring to Figures 32A and 32B, if the value X is "-1," the amount of motion is "2°" and the direction is counterclockwise. If the value X is "-3," the motion is "6°," in the counterclockwise (θ1) direction.
Similarly, it is determined whether Y equals "0" (3300). If Y is "0," the servo controller 3140 does not operate the vertical motor 3130 (3290). If Y is not "0," it is determined whether Y is greater than "0" (3310). If Y is greater than "0," the servo controller 3140 operates the vertical motor 3130 to move the camera 30 in the positive latitudinal (upward: θ2) direction (3320). If the value Y is "2," the motion is "4°," in the upward direction.
If Y is not greater than "0," the servo controller 3140 operates the vertical motor 3130 to move the camera 30 downward (3330). If Y is "-3," the motion is "6°," in the negative latitudinal (downward: θ4) direction.
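The Figure 31 decision flow, together with the Figure 32A mapping of one adjustment unit to 2° of rotation, can be sketched as follows. The tuple-based motor command format is an assumption made for illustration; the patent describes the servo controller driving the motors directly.

```python
STEP_DEG = 2.0  # Figure 32A: each adjustment unit corresponds to 2 degrees

def servo_commands(x, y):
    """Turn camera adjustment values (X, Y) into motor commands,
    following the Figure 31 flow: the sign selects the direction,
    the magnitude selects the angle, and zero means no motion."""
    cmds = []
    if x > 0:
        cmds.append(("horizontal", "clockwise", x * STEP_DEG))          # theta3: right
    elif x < 0:
        cmds.append(("horizontal", "counterclockwise", -x * STEP_DEG))  # theta1: left
    if y > 0:
        cmds.append(("vertical", "up", y * STEP_DEG))                   # theta2
    elif y < 0:
        cmds.append(("vertical", "down", -y * STEP_DEG))                # theta4
    return cmds  # empty list for (0, 0): neither motor is operated
```

For example, the adjustment value (-3, 2) produces a 6° counterclockwise horizontal motion and a 4° upward vertical motion, matching the worked values in the text.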
Now, the overall operation of the system shown in Figure 29 will be described in connection with Figure 33. The eye lens movement detection device 3000 is provided at the display site of the system (3020). While the observer views the stereoscopic images, the movement of the observer's eye lenses is detected by the eye lens movement detection device 3000 (3030). The eye lens movement data are transmitted to the camera site through the transmitter 3010 and the communication network 3015 (3040). As mentioned above, either a single transmitter or a pair of transmitters may be used.
The receiver 2950 at the camera site receives the eye lens movement data from the display site (3050). The camera adjustment values are determined based on the eye lens movement data (3060). The stereoscopic cameras 30 and 32 are controlled by the determined camera adjustment values (3070). In this way, the stereoscopic cameras 30 and 32 are controlled so that they track the movement of the eye lenses. From the observer's point of view, as soon as he or she moves his or her eye lenses in a particular direction, the stereoscopic image also moves in the direction of the eye lens movement.
Figure 34 shows a stereoscopic camera control system for a stereoscopic display system according to another aspect of the invention. For convenience, the display site is not shown. This aspect of the invention selects, from among multiple pairs of stereoscopic cameras, the pair corresponding to the amount of movement of the eye lenses, rather than controlling the motion of a single pair of stereoscopic cameras.
The system comprises a microcomputer 3430, a memory 3440, camera selectors 3420 and 3425, and multiple pairs of stereoscopic cameras 30a and 32a, 30b and 32b, and 30c and 32c. The memory 3440 stores a table as shown in Figure 35. The table shows the relationship between the camera adjustment values and the selected cameras. The camera adjustment value "(0, 0)" corresponds to, for example, the camera pair C33, as shown in Figures 35 and 36B. The camera adjustment value "(1, 0)" corresponds to the camera pair C34, as shown in Figures 35 and 36B. The camera adjustment value "(2, 2)" corresponds to the camera pair C15, as shown in the figures. In one embodiment of the invention, another pair of stereoscopic cameras may be selected from the multiple pairs of cameras, for example, one camera of the C34 pair and one camera of the C32 pair.
Figure 36A is a top plan view of the multiple pairs of stereoscopic cameras. In one embodiment of the invention, the contour obtained by connecting all of the objective lenses of the multiple pairs of stereoscopic cameras is similar to the contour of the outwardly exposed portion of the observer's eyes.
The microcomputer 3430 determines the camera adjustment values based on the received eye lens movement data. The microcomputer 3430 also determines first and second camera selection signals based on the table stored in the memory 3440. The first selection signal is determined based on the movement of the observer's left eye lens, and is used to control the camera selector 3420. The second selection signal is determined based on the movement of the observer's right eye lens, and is used to control the camera selector 3425. The microcomputer 3430 provides each selection signal to the camera selectors 3420 and 3425, respectively.
The camera selectors 3420 and 3425 select the respective cameras based on the selection signals. In one embodiment of the invention, a base camera pair shown in Figure 36B (for example, C33) photographs an object and transmits the images to the display site through the transmitters 2900 and 2930, respectively. In this embodiment of the invention, if the camera selectors 3420 and 3425 select another pair of cameras, the selected pair of cameras photographs the object and transmits the images to the display site through the transmitters 2900 and 2930. In one embodiment of the invention, all of the cameras are turned on, and a first pair of cameras is connected to the transmitters 2900 and 2930, respectively. In this embodiment of the invention, when a second pair of cameras is selected, the first pair of cameras is disconnected from the transmitters 2900 and 2930, and the second pair of cameras is connected to the transmitters 2900 and 2930, respectively. In another embodiment of the invention, only the selected pair of cameras is turned on, and the non-selected cameras remain turned off. In one embodiment of the invention, each of the camera selectors 3420 and 3425 comprises a switch that switches between the multiple pairs of stereoscopic cameras 30a and 32a, 30b and 32b, and 30c and 32c and the transmitters 2900 and 2930, respectively.
Referring to Figure 37, the operation of the system shown in Figure 34 will be explained. The base camera pair in Figure 36 (for example, C33) photographs an object (3710). The eye lens movement data are received from the display site (3720). The camera adjustment values are determined based on the received eye lens movement data (3730). Exemplary camera adjustment values are shown in the table of Figure 35. The camera selection signals are determined based on the determined camera adjustment values (3740), for example, using the relationships in the table of Figure 35. It is determined whether a new pair of cameras has been selected (3750). If no new pair of cameras has been selected, the images output from the base cameras are transmitted to the display site (3780). If a new pair of cameras is selected (for example, C35), the base cameras (C33) are disconnected from the transmitters 2900 and 2930, and the new cameras (C35) are connected to the transmitters 2900 and 2930 (3760). The selected cameras (C35) photograph the object (3770), and the images output from the selected cameras are transmitted to the display site (3790).
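A minimal sketch of the Figure 35 lookup and the switching decision of Figure 37 follows. The table below contains only the three adjustment-value/camera-pair entries named in the text, and the function name and the fall-back behavior for unlisted values are hypothetical.

```python
# Hypothetical fragment of the Figure 35 table: camera adjustment
# value -> selected camera pair (only the pairs named in the text).
SELECTION_TABLE = {
    (0, 0): "C33",   # base pair, no eye lens movement
    (1, 0): "C34",
    (2, 2): "C15",
}

def select_pair(adjust, current="C33"):
    """Return (pair, changed): the camera pair mapped to the adjustment
    value, and whether the transmitters 2900/2930 must be switched
    from the currently connected pair to a newly selected one."""
    pair = SELECTION_TABLE.get(adjust, current)  # unlisted value: keep current pair
    return pair, pair != current
```

When `changed` is false the images from the current pair continue to be transmitted (step 3780); when it is true the current pair is disconnected and the new pair connected (step 3760).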
With regard to the embodiments described in connection with Figures 29-37, the camera control may be used in remote control technology, such as remote surgical operations, the remote control of vehicles, airplanes or fighter planes, or the remote control of construction surveying or automatic assembly equipment.
Method and System for Displaying Stereoscopic Images Using a 3D Mouse to Guide the Motion of an Observer's Eye Lenses
Figure 38 shows a stereoscopic display system according to another aspect of the invention. The stereoscopic display system uses a three-dimensional input device to guide the motion of an observer's eye lenses. The system also uses the three-dimensional input device to adjust the displayed images such that the longitudinal and latitudinal positions of the center points of the observer's eye lenses are substantially the same as the longitudinal and latitudinal positions of the center points of the displayed images. In one embodiment of the invention, the three-dimensional input device comprises a 3D mouse (described further below).
The system comprises a pair of stereoscopic cameras 30 and 32, a pair of transmitters 2900 and 2930, a pair of display devices 3900 and 3910, a 3D mouse 3920, and an input device 3990. The stereoscopic cameras 30 and 32, the pair of transmitters 2900 and 2930, and a pair of receivers 2960 and 2970 are the same as those shown in Figure 29. The display devices 3900 and 3910 display the stereoscopic images transmitted from the camera site. The devices 3900 and 3910 also display a pair of 3D mouse cursors that guide the motion of the observer's eye lenses.
In one embodiment of the invention, the output of the 3D mouse is provided to the display devices 3900 and 3910, as shown in Figure 38. In this embodiment of the invention, the pair of 3D mouse cursors is displayed and moved by the 3D mouse 3920.
In one embodiment of the invention, the shape of the 3D mouse cursor includes a square, an arrow, a cross, a square with a crosshair as shown in Figures 40A-40H, cross lines, or a crosshair. In one embodiment of the invention, for convenience, a pair of square mouse cursors 400 and 420 with crosshairs, as shown in Figure 40, is used. In one embodiment of the invention, when an observer viewing the displayed images adjusts a distance value (described in more detail in connection with Figures 39 and 40), the distance (Md) between the 3D mouse cursors 400 and 420 is adjusted. Also, in this embodiment of the invention, the size of the 3D mouse cursors may be adjusted. In this embodiment of the invention, the observer adjusts the distance value, for example, by rotating the scroll button of the 3D mouse. For example, by rotating the scroll button backward (toward the user), the observer can change the setting from a larger distance value to a smaller one (10,000 m - 100 m - 5 m - 1 m - 0.5 m - 5 cm). And by rotating the scroll button forward (the direction opposite the backward direction), the observer can change the setting from a smaller distance value to a larger one (5 cm - 0.5 m - 1 m - 5 m - 100 m - 10,000 m). Hereinafter, the distance value of 10,000 meters is often treated as an infinitely large value, or infinity.
Figure 39 shows an example of a displayed three-dimensional image. The image comprises a mountain image portion 3810, a tree image portion 3820, a house image portion 3830, and a person image portion 3840. Suppose that the mountain image portion 3810, the tree image portion 3820, the house image portion 3830, and the person image portion 3840 were photographed at distances of "about 10,000 meters," "about 100 meters," "about 5 meters," and "about 1 meter," respectively, from the stereoscopic cameras 30 and 32.
When an observer wants to view the mountain image portion 3810 shown in Figure 39, he or she can set the distance value to "10,000 meters" or greater. In this situation, the mouse cursor distance Md has the value Md0, which equals the Wa (Vamax) value, as shown in Figure 40A. As mentioned above, when the observer views an object at an infinite distance, Va has its maximum value (Vamax). Also, the observer's lines of sight Ls1 and Ls2 are substantially parallel to each other; Ls1 and Ls2 are the extensions of SAL and SAR (each connecting A2 and A3), as shown in Figures 40A and 40B. This means that if the observer views the displayed images with his or her eye lenses spaced apart by the distance Wa, as shown in Figures 40A and 40B, the observer feels a sense of distance as if viewing an object "d0 (10,000 meters)" away. This is because human eyes are spaced about 60-80 millimeters apart from each other, and a three-dimensional sensation is perceived in the brain from the combined images of the two eyes. Thus, when the observer views the two mouse cursors spaced apart by Md = Wa, he or she perceives a single (three-dimensional) mouse cursor located at infinity between the two mouse cursors (400, 420).
When the observer sets the distance value (d1) to, for example, "100 meters," and views the tree image 3820, Md has the value Md1, which is less than Md0, as shown in Figures 40C and 40D. Also, the observer's lines of sight Ls1 and Ls2 are no longer parallel. Thus, when the two lines of sight are extended beyond the screen, they converge at a virtual point "M," as shown in Figure 40D. The point "O" represents the midpoint between the center points of the two eyes. Similarly, if the observer views the displayed images with his or her eye lenses spaced apart by Md1, as shown in Figures 40C and 40D, the observer feels a sense of distance as if viewing an object "d1 (100 meters)" away. The distance between M and O is not a physical length but a virtual one. However, because the observer feels the sense of distance, the distance between M and O can be regarded, in terms of the distance and direction of the observer's eye lenses, as an actual distance between the observer's eyes and a real object. That is, when the observer views the two mouse cursors 400 and 420 spaced apart by Md1, he or she perceives a single (three-dimensional) mouse cursor located at the point M, 100 meters away.
When the observer sets a smaller distance value (d2) of, for example, "5 meters," and views the house image 3830, Md has the value Md2, which is less than Md1, as shown in Figures 40E and 40F. Also, when the two lines of sight are extended beyond the screen, they converge at a virtual point "M," as shown in Figure 40F. Similarly, in this situation, when the observer views the house image 3830, the observer feels a sense of distance as if viewing an object "d2 (5 meters)" away. Thus, when the observer views the two mouse cursors 400 and 420 spaced apart by Md2, he or she perceives a single (three-dimensional) mouse cursor located at the point M, 5 meters away.
When the observer sets the distance value (d3) to the distance between the observer and the screen, for example, "50 centimeters," the mouse cursors 400 and 420 overlap each other, as shown in Figure 40G. That is, when the distance value is the same as the actual distance between the point "O" and the center point of the screen, the mouse cursors overlap each other, as shown in Figure 40G.
As can be seen from Figures 40A-40G, even though a pair of 3D mouse cursors 400 and 420 is displayed in the respective display devices 3900 and 3910, the observer sees a single three-dimensional mouse cursor and feels a sense of distance.
When the observer sets a distance value (d4) smaller than "d3," the observer's lines of sight converge and cross in front of the screen, as shown in Figure 40H. In this situation, the observer may see two mouse cursors 400 and 420, because his or her lines of sight converge in front of the screen.
As shown in Figures 40A-40H, the value of Md is determined based on the distance value set by the observer.
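Under the similar-triangles model implied by Figures 40A-40H, Md can be computed from Wa, the observer-to-screen distance, and the set distance value. The closed-form formula and the numeric values below (Wa = 65 mm, screen 0.5 m away) are illustrative assumptions; the patent itself obtains Md from a stored table.

```python
def cursor_distance(wa, screen_d, target_d):
    """Separation Md between the left and right cursors needed so the
    fused cursor appears target_d away.  By similar triangles, the line
    of sight from an eye at lateral offset Wa/2 to a point on the median
    axis target_d away crosses the screen plane (screen_d away) at a
    lateral offset of (Wa/2) * (1 - screen_d / target_d)."""
    return wa * (1.0 - screen_d / target_d)

WA, D = 0.065, 0.5                         # assumed: Wa = 65 mm, screen 0.5 m away (d3)
md_inf = cursor_distance(WA, D, 10_000)    # Fig. 40A: Md ~ Wa at "infinity"
md_overlap = cursor_distance(WA, D, 0.5)   # Fig. 40G: cursors coincide (Md = 0)
md_crossed = cursor_distance(WA, D, 0.25)  # Fig. 40H: negative => sight lines cross
```

The three sample values reproduce the qualitative behavior of the figures: Md approaches Wa as the distance value grows, collapses to zero when the distance value equals the screen distance, and goes negative (crossed lines of sight) below it.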
Figure 41 shows an exemplary block diagram of the display system shown in Figure 38. Because each of the display devices 3900 and 3910 performs substantially the same function, only one display device 3900 is shown in Figure 41.
The display device 3900 comprises a display screen 3930, a display driver 3940, a microcomputer 3950, a memory 3960, and interfaces 3970 and 3980. The display device 3900 adjusts the distance (Md) between the pair of 3D mouse cursors 400 and 420 based on the set distance values, as shown in Figures 40A-40H. The display device 3900 moves the center point of the displayed image according to the motion of the 3D mouse cursors. In one embodiment of the invention, the display device 3900 moves the displayed image such that the longitudinal and latitudinal positions of the center points of the observer's eye lenses are substantially the same as the longitudinal and latitudinal positions of the center point of the displayed image.
The 3D mouse 3920 detects its amount of movement. The detected amount of movement is provided to the microcomputer 3950 through the interface 3970. The distance value set by the observer is provided to the microcomputer 3950 through the 3D mouse 3920 and the interface 3970. In one embodiment of the invention, the interface 3970 comprises a mouse controller. In another embodiment of the invention, the distance value may be provided to the microcomputer 3950 through the input device 3990 and the interface 3980.
The input device 3990 provides the properties of the 3D mouse to the microcomputer 3950 through the interface 3980, such as the minimum detection amount (Am), the automatic sensitivity (Bm), the mouse cursor size (Cm), the observer-to-screen distance (d), and the observer's eye data, such as Wa, SAL and SAR. The minimum detection amount represents the minimum amount of movement of the 3D mouse that can be detected. That is, the motion of the 3D mouse can be detected only when it exceeds the minimum detection amount. In one embodiment of the invention, the minimum detection amount is set when the 3D mouse is manufactured. The automatic sensitivity represents the sensitivity level of the motion of the mouse cursors with respect to the motion of the 3D mouse. This means that the scroll button of the 3D mouse has different sensitivities, that is, it is more or less sensitive depending on the distance value. For example, if the distance value is greater than 1,000 meters, a "1 millimeter" rotation of the scroll button may increase or decrease the distance by 2,000 meters. If the distance value is between 100 meters and 1,000 meters, a "1 millimeter" rotation of the scroll button may increase or decrease the distance by 100 meters. Similarly, if the distance value is less than 1 meter, a "1 millimeter" rotation of the scroll button may increase or decrease the distance by 10 centimeters.
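The graded scroll sensitivity can be sketched as a step function. Only the three bands quoted above come from the text; the step assumed for the unspecified 1-100 meter band, and the clamping at zero, are illustrative assumptions.

```python
def scroll_step(distance_m):
    """Distance change (in meters) produced by a 1 mm rotation of the
    scroll button, following the graded sensitivity described above."""
    if distance_m > 1000:
        return 2000.0   # stated: > 1,000 m band
    if distance_m >= 100:
        return 100.0    # stated: 100 m - 1,000 m band
    if distance_m < 1:
        return 0.1      # stated: < 1 m band (10 cm)
    return 1.0          # assumed step for the unspecified 1-100 m band

def apply_scroll(distance_m, mm_of_rotation):
    """Forward (positive) rotation increases the distance value,
    backward (negative) rotation decreases it, never below zero."""
    return max(0.0, distance_m + mm_of_rotation * scroll_step(distance_m))
```

For example, one millimeter of forward rotation at a 500-meter setting moves the distance value to 600 meters.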
In one embodiment of the invention, the mouse cursor size can also be adjusted. The distance (d) represents the distance between the midpoint of the observer's eyes and the screen, as shown in Figure 43A. In one embodiment of the invention, the screen comprises a V-shaped mirror, an HMD screen, a projection screen, or a display device screen, as shown in Figure 1B.
The input device 3990 also provides the properties of the display device to the microcomputer 3950 through the interface 3980. In one embodiment of the invention, the properties of the display device comprise the resolution of the display device and the screen size of the display device 3900. The resolution represents the horizontal and vertical numbers of pixels of the device 3900. For example, if the resolution of the display device 3900 is 640 x 480, the horizontal number of pixels is 640 and the vertical number of pixels is 480. The size comprises the horizontal and vertical lengths of the display device 3900. Using the resolution and the screen size of the display device 3900, the length of one pixel can be obtained, for example, "1 millimeter" per 10 pixels.
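The pixel-length calculation can be sketched as follows; the function name and the example screen dimensions (a 64 mm x 48 mm screen) are illustrative assumptions, chosen so that the result matches the "1 millimeter per 10 pixels" example.

```python
def pixel_length_mm(resolution, screen_size_mm):
    """Physical length of one pixel, from the display resolution
    (horizontal, vertical pixel counts) and the screen size in mm."""
    h_pixels, v_pixels = resolution
    width_mm, height_mm = screen_size_mm
    return width_mm / h_pixels, height_mm / v_pixels
```

For example, a 640 x 480 display on an assumed 64 mm x 48 mm screen gives 0.1 mm per pixel in each direction, i.e. 1 millimeter per 10 pixels.
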
In one embodiment of the invention, the input device 3990 comprises a keyboard, a remote controller, a pointing input device, or the like. In one embodiment of the invention, the interface 3980 comprises an input device controller. In one embodiment of the invention, the properties of the 3D mouse are stored in the memory 3960. In one embodiment of the invention, the observer's eye data are detected using a detection device directed at the motion of the eye lenses, or are provided to the display device 3900 by the observer.
The microcomputer 3950 determines the mouse cursor distance (M_d) according to the distance value set by the observer. A table (not shown) showing, based on the observer's eye data, the relation between the distance value and the value M_d illustrated in Figures 40A-40H is stored in the memory 3960. The microcomputer 3950 determines the cursor distance (M_d) by referring to this table, and provides the determined value to the display driver 3940. The display driver 3940 displays the pair of mouse cursors 400 and 420 on the display screen 3930 according to the determined distance M_d. The microcomputer 3950 also determines the new positions of the mouse cursors 400 and 420 and, according to the positions of the mouse cursors 400 and 420, calculates the amount of movement for the central point of the displayed image. The memory 3960 may also store the data needed to calculate the amount of movement for the central point of the displayed image.
Referring to Figure 42, the operation of the display devices 3900 and 3910 will be described. The properties of the 3D mouse are set in each of the display devices 3900 and 3910 (4200). As described above, the properties of the 3D mouse comprise the minimum detection amount (A_m), the motion sensitivity (B_m), and the mouse cursor size (C_m). The properties of the 3D mouse may be provided by the observer or stored in the memory 3960.
The properties of the display device are provided to the display devices 3900 and 3910 (4205). In one embodiment of the invention, the properties of the display device may be stored in the memory 3960.
The observer's eye data are provided to the display devices 3900 and 3910 (4210). As described above, the observer's eye data may be detected automatically by a detection device, or may be provided to the display devices 3900 and 3910 by the observer. In one embodiment of the invention, the observer's eye data comprise the distance between the central points of the eyes (W_a), and the S_A values (S_AL and S_AR) between the eye lens central point (A_2) and the eye central point (A_3).
An observer-screen distance (d) is provided to each of the display devices 3900 and 3910 through, for example, the input device 3990 (4220).
The mouse cursor position and the distance value are initialized (4230). In one embodiment of the invention, the initialization is performed with an infinitely large distance value. In this case, the left and right mouse cursors are located at (-W_a/2, 0, 0) and (W_a/2, 0, 0), respectively, where the origin of the coordinate system is the point O (0, 0, 0), as shown in Figure 43A. Likewise, the positions of the central points of the displayed images are (-W_a/2, 0, 0) and (W_a/2, 0, 0), respectively.
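The initialization of step 4230 can be sketched as follows, assuming the coordinate convention of Figure 43A (origin at the point O midway between the eyes); the function name is hypothetical.

```python
def initialize_cursors(w_a):
    """Initial left/right mouse-cursor positions for an infinitely
    large distance value. w_a is the eye distance W_a; the left and
    right cursors sit in front of the left and right eyes."""
    left = (-w_a / 2.0, 0.0, 0.0)
    right = (w_a / 2.0, 0.0, 0.0)
    return left, right
```
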
A three-dimensional image and the 3D mouse cursors are displayed in each of the display devices 3900 and 3910 (4240). In one embodiment of the invention, the 3D mouse cursors 400 and 420 are displayed on each three-dimensional image. Because the mouse cursor position has been initialized, the mouse cursors 400 and 420 are displayed on the image.
It is determined whether the initial distance value has been changed to another value (4250). When the observer wants to set a distance value different from the initial distance value, he or she may provide that distance value to the display devices 3900 and 3910.
If the initial distance value is changed, the 3D mouse cursor distance (M_d) is adjusted, and the 3D mouse cursor position is initialized again according to the changed distance value (4260). For example, when the initial position is (0, 0, 10,000 meters), if another distance value (for example, 100 meters) is provided as shown in Figure 40C, the mouse cursor distance (M_d) changes from M_d0 to M_d1. However, the x and y values of the point M do not change, even though the z value of the point M changes from 10,000 meters to 100 meters.
If the initial distance value is not changed, it is determined whether motion of the 3D mouse is detected (4270).
If motion of the 3D mouse is detected, a new position of the 3D mouse cursors 400 and 420 is determined (4280). In one embodiment of the invention, the new position of the mouse cursor is determined as follows. First, the number of pixels that the mouse cursor moves in the x direction is determined. For example, a movement to the left has a "-x" value, and a movement to the right has a "+x" value. The same applies to the "y" direction; that is, a downward movement has a "-y" value and an upward movement has a "+y" value. The movement in the "z" direction is determined by the distance value.
The position of the central point of the displayed image to be controlled is calculated according to the new positions of the 3D mouse cursors 400 and 420 (4290). In one embodiment of the invention, the position of the central point of each displayed image is obtained from the position value of the corresponding eye lens. In this embodiment of the invention, the position values of the eye lenses are obtained by Formulas VII and VIII, as described below. Referring to Figure 43, the method of obtaining the positions of the eye lenses will be described.
First, the value Z_L is obtained from Formula VII.
Formula VII:
Z_L = sqrt{ [I_N - (-W_a/2)]^2 + (J_N - 0)^2 + (K_N - 0)^2 } = sqrt{ (I_N + W_a/2)^2 + (J_N)^2 + (K_N)^2 }
In Figure 43A, M_N (I_N, J_N, K_N) represents the position of the central point between the two mouse cursors M_L (I_L, J_L, K_L) and M_R (I_R, J_R, K_R). Because the mouse cursor positions M_L and M_R are obtained in step 4280, the central point position M_N can be obtained. That is, I_N and J_N are obtained by averaging (I_L, I_R) and (J_L, J_R), respectively. K_N is determined by the current distance value. Z_L is the distance between the left eye central point (A_3L) and M_N.
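A minimal sketch of Formula VII, following the definitions above (I_N and J_N as averages of the cursor coordinates, K_N as the current distance value, and the left eye central point A_3L at (-W_a/2, 0, 0)); the function name is hypothetical.

```python
import math

def formula_vii(m_l, m_r, k_n, w_a):
    """Return the cursor midpoint M_N and the distance Z_L from the
    left eye central point A_3L at (-W_a/2, 0, 0) to M_N."""
    i_n = (m_l[0] + m_r[0]) / 2.0   # average of I_L and I_R
    j_n = (m_l[1] + m_r[1]) / 2.0   # average of J_L and J_R
    z_l = math.sqrt((i_n + w_a / 2.0) ** 2 + j_n ** 2 + k_n ** 2)
    return (i_n, j_n, k_n), z_l
```
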
Second, the central point position of each eye lens [(x1, y1, z1); (x2, y2, z2)] is obtained by Formula VIII. A_2L (x1, y1, z1) is the central point position of the left eye lens, and A_2R (x2, y2, z2) is the central point position of the right eye lens, as shown in Figure 43A. Figure 43B shows a three-dimensional view of the observer's eyes. Referring to Figure 43B, it can be seen how the eye lens central point (A_2L) moves along the surface of the eyeball.
Formula VIII:

x1 = (-W_a/2) + [(I_N + W_a/2) x S] / Z_L
y1 = 0 + (J_N x S) / Z_L
z1 = 0 + (K_N x S) / Z_L
x2 = (W_a/2) - [(I_N + W_a/2) x S] / Z_L
y2 = 0 + (J_N x S) / Z_L
z2 = 0 + (K_N x S) / Z_L
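Formula VIII can be sketched as follows. The function name is hypothetical, and treating S as a single value (rather than the separate S_AL and S_AR values defined earlier) follows the notation as printed.

```python
def formula_viii(m_n, z_l, w_a, s):
    """Eye-lens central points A_2L (x1, y1, z1) and A_2R (x2, y2, z2)
    per Formula VIII, transcribed as printed; s is the eye-lens
    offset S between A_2 and A_3."""
    i_n, j_n, k_n = m_n
    t = (i_n + w_a / 2.0) * s / z_l
    x1 = -w_a / 2.0 + t
    y1 = j_n * s / z_l
    z1 = k_n * s / z_l
    x2 = w_a / 2.0 - t
    y2 = j_n * s / z_l
    z2 = k_n * s / z_l
    return (x1, y1, z1), (x2, y2, z2)
```
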
In one embodiment of the invention, a digital signal processor may be used to calculate the positions of the eye lenses.
The central point of each displayed image is moved to the position (x1, y1) or (x2, y2), respectively (4300), as shown in Figure 44. In one embodiment of the invention, the blank area of the screen after the movement may be filled with a background color, such as black, as shown in Figure 44.
It is determined whether the motion of the 3D mouse has finished (4310). If the motion of the 3D mouse has not finished, steps 4280-4300 are performed until the motion is finished. This ensures that as long as the observer moves the mouse cursor, the displayed image is moved accordingly.
By using the calculation method described above, the distance between two positions can be measured. Referring to Figure 43C, M_N1 is the peak of a mountain 42, and M_N2 is a point on a house 44. Suppose the position values of M_N1 and M_N2, determined by the calculation method described above, are (0.02 m, 0.04 m, 100 m) and (0.01 m, 0 m, 10 m), respectively. These determined position values can be stored in the memory 3960, and the distance between the two positions M_N1 and M_N2 is calculated as follows.
D = sqrt{ (0.02 - 0.01)^2 + (0.04 - 0)^2 + (100 - 10)^2 } ≈ 90 (meters)
In this embodiment, the microcomputer 3950 is programmed to calculate the distance between two positions, or may include a distance measurement mode. In this case, when an observer designates a first position (A: the midpoint of the two mouse cursors 400 and 420), that position is determined and stored in the memory 3960. In one embodiment, the position value may be displayed on the display screen 3930 or provided to the observer by a voice signal. The same applies to a second position (B). In this way, the values of the first and second positions (A, B) are determined, and the distance between the positions (A, B) is calculated.
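The distance-measurement mode can be sketched with a plain Euclidean distance, reproducing the mountain/house example above; the function name is hypothetical.

```python
import math

def position_distance(p1, p2):
    """Straight-line distance between two measured positions
    (x, y, z), all coordinates in meters."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))
```

Applied to the example positions (0.02 m, 0.04 m, 100 m) and (0.01 m, 0 m, 10 m), the result is approximately 90 meters, as in the calculation above.
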
Method and system for controlling the motion of a stereoscopic camera using a 3D mouse
Figure 45 shows a stereoscopic display system according to another aspect of the present invention. The system is used to control the motion of a stereoscopic camera according to the motion of the observer's eye lenses.
The system comprises a camera site and a display site. The display site comprises a pair of transmitter/receivers 4530 and 4540, a set of display devices 4510 and 4520, an input device 3990, and a 3D mouse 3920.
The input device 3990 and the 3D mouse 3920 are substantially the same as those in the system shown in Figure 38. Referring to Figure 46, the display device 4510 comprises interfaces 3970 and 3980, a microcomputer 4820, a memory 4830, and an interface 4810. The interfaces 3970 and 3980 are substantially the same as those of the display device shown in Figure 41. The microcomputer 4820 determines the current position values of the mouse cursors and calculates the position values of the central points of the observer's eye lenses. The memory 4830 may also store the data needed to calculate the amount of movement for the central point of the displayed image.
The interface 4810 may modify the position values to suit transmission, and provides the modified data to the transmitter 4530. The transmitter 4530 transmits the modified position data to the camera site.
Referring to Figure 45, the camera site comprises a set of stereoscopic cameras 30 and 32, a pair of transmitters 4570 and 4600, a pair of servo mechanisms 4580 and 4590, and a pair of receivers 4550 and 4560. Each of the receivers 4550 and 4560 receives the position values sent from the display site, and provides the data to the pair of servo mechanisms 4580 and 4590, respectively.
The servo mechanisms 4580 and 4590 control the cameras 30 and 32, respectively, according to the received position data. In one embodiment of the invention, the servo mechanisms 4580 and 4590 control the cameras 30 and 32 such that the longitudinal and latitudinal values of the central points of the objective lenses (C_2L, C_2R of Figures 26 and 27) of the cameras 30 and 32 are substantially the same as the longitudinal and latitudinal values of the central points of the observer's eye lenses, as shown in Figures 47A and 47C.
Referring to Figure 48, the operation of the system shown in Figure 45 will be explained. The 3D mouse properties and the display device properties are set in each of the display devices 4510 and 4520 (4610). The 3D mouse properties and display device properties are substantially the same as those explained in connection with Figure 42. The observer's eye data and the observer-screen distance (d) are provided to each of the display devices 4510 and 4520 (4620). Again, the observer's eye data and the observer-screen distance (d) are substantially the same as those explained in connection with Figure 42. The 3D mouse cursor position and the distance value are initialized (4630). In one embodiment of the invention, the 3D mouse cursor position is initialized to the central point of each display device screen, and the distance value is initialized to an infinite distance value. The three-dimensional image received from the camera site and the 3D mouse cursors (400, 420) are displayed on the display devices 4510 and 4520 (4640). In one embodiment of the invention, the 3D mouse cursors may be displayed on the three-dimensional image. In this case, the portion of the image under the 3D mouse cursors (400, 420) may not be seen by the observer.
It is determined whether motion of the 3D mouse is detected (4650). If motion is detected, the new position of the 3D mouse cursors is determined (4660). The position values of the central points of the observer's eye lenses are calculated according to the new positions of the mouse cursors, respectively (4670). The new positions and the motion of the mouse cursors (400, 420) are shown in Figure 47B. The specific method of performing steps 4650-4670 is described in connection with Figures 42-44.
The position value data are sent to the camera site through each of the transmitter/receivers 4530 and 4540 (4680). As described above, the position values are calculated as long as the mouse cursors move. Thus, the position values may comprise a series of data. In one embodiment of the invention, the position values are transmitted to the camera site continuously, so that the cameras 30 and 32 are controlled in the order in which the position values are received. In another embodiment of the invention, the sequence of generated position values may be obtained and sent to the camera site, so that the cameras 30 and 32 are controlled according to the sequence. In one embodiment of the invention, the position value data are digital data and are suitably modulated for transmission.
The position value data are received in each of the receivers 4550 and 4560 (4690). In one embodiment of the invention, only one transmitter may be used instead of the two transmitters 4530 and 4540. In this case, only one receiver may be used instead of the two receivers 4550 and 4560.
Camera adjustment values are determined according to the position values, and the stereoscopic cameras 30 and 32 are controlled according to the camera adjustment values (4700). Each of the servo controllers 4580 and 4590 controls the respective camera 30 and 32 such that the central point of each camera objective lens tracks the motion of the central point of the corresponding eye lens (4710). As shown in Figure 47C, the new position values A_2L1 and A_2R1, corresponding to the new positions of the 3D mouse cursors, are calculated using Formula VIII, as described above. Each of the servo controllers 4580 and 4590 controls the cameras 30 and 32 such that the central points of the camera objective lenses are located at C_2L1 and C_2R1, respectively, as shown in Figure 47A. For this purpose, the servo controllers 4580 and 4590 set the position values of the central points of the camera objective lenses to coincide with the position values of the central points of the eye lenses. In one embodiment of the invention, the servo controllers 4580 and 4590 comprise a horizontal motor and a vertical motor, which move each camera in the horizontal (x) and vertical (y) directions, respectively. In one embodiment of the invention, only one servo controller is used to control the motion of the cameras 30 and 32, instead of the pair of servo controllers 4580 and 4590.
While each of the servo controllers 4580 and 4590 controls the stereoscopic cameras 30 and 32, the cameras 30 and 32 photograph an object. The photographed images are sent to the display site and displayed on each of the display devices 4510 and 4520 (4720, 4730).
Regarding the embodiment described in connection with Figures 45-48, the camera control can be used in remote control technology, such as a remote operation, the remote control of a vehicle, an aircraft, or a fighter plane, or the remote control of construction-survey or automatic assembly equipment.
Method and system for controlling the spatial magnification of a stereoscopic image
Figure 49 shows a stereoscopic display system according to another aspect of the present invention. The stereoscopic display system is used to adjust the spatial magnification of a stereoscopic image according to spatial magnification adjustment data provided by the observer.
The system comprises a camera site and a display site. The display site comprises an input device 4910, a set of display devices 4920 and 4930, a transmitter 4950, and a pair of receivers 4940 and 4960.
The input device 4910 provides the observer's eye distance value (W_a), shown in Figure 43A, and the spatial magnification adjustment data to at least one of the display devices 4920 and 4930. The spatial magnification refers to the spatial size that an observer perceives from the displayed image. For example, if the spatial magnification is "1", the spatial size perceived by the observer at the display site is the same as the size of the real space photographed at the camera site. If the spatial magnification is "10", the spatial size perceived by the observer at the display site is 10 times the size of the real space photographed at the camera site. And if the spatial magnification is "0.1", the spatial size perceived by the observer at the display site is one tenth of the size of the real space photographed by the cameras. The spatial magnification adjustment data represent the spatial magnification that the observer wants to set. In one embodiment of the invention, the spatial magnification adjustment data may comprise a spatial magnification of "0.1", "1", "10", or "100". The adjustment of the spatial magnification is performed by adjusting the distance between the cameras 30 and 32, as will be described in detail later.
At least one of the display devices 4920 and 4930 receives the spatial magnification adjustment data provided by the input device 4910. At least one of the display devices 4920 and 4930 provides the spatial magnification adjustment data and the eye distance value (W_a) to the transmitter 4950. The transmitter 4950 transmits the magnification adjustment data and the value W_a to the camera site. In one embodiment of the invention, the spatial magnification adjustment data and the value W_a may be provided from the input device 4910 directly to the transmitter 4950, without passing through the display devices 4920 and 4930.
The receiver 4970 receives the spatial magnification adjustment data and W_a from the transmitter 4950, and provides the data to the camera controller 4990. The camera controller 4990 controls the camera distance according to the spatial magnification adjustment data and the value W_a. The camera controller 4990 comprises a servo controller 4985 and a horizontal motor 4975, as shown in Figure 50. Referring to Figures 50-52, the operation of the camera controller 4990 will be explained.
The servo controller 4985 initializes the camera distance (C_I), for example, so that C_I and W_a are the same (5100). The spatial magnification is related to the camera distance (C_I) and the eye distance value (W_a). When C_I and W_a are the same, the spatial magnification is "1", meaning that what the observer sees has the same size as the object photographed by the cameras 30 and 32. When C_I is greater than W_a, the spatial magnification is less than "1", meaning that what the observer sees is smaller than the object photographed by the cameras 30 and 32. When C_I is less than W_a, the spatial magnification is greater than "1", meaning that what the observer sees is larger than the object photographed by the cameras 30 and 32.
The spatial magnification adjustment data (SM) are provided to the servo controller 4985 (5110). It is determined whether the adjustment data are "1" (5120). If the adjustment data are "1", the camera distance is not adjusted (5160). If the adjustment data are not "1", it is determined whether the adjustment data are greater than "1". If the adjustment data are greater than "1", the servo controller 4985 operates the motor 4975 to reduce C_I until the desired spatial magnification is obtained (5150). Referring to Figure 52, the relation between the spatial magnification and the camera distance (C_I) is shown, where W_a is 80 millimeters. Thus, when C_I is 80 millimeters, the spatial magnification is "1". In this case, if the desired spatial magnification is "10", the camera distance is adjusted to "8 millimeters", as shown in Figure 52.
If the adjustment data are less than "1", the servo controller 4985 operates the motor 4975 to enlarge C_I until the desired spatial magnification is obtained (5140). As shown in Figure 52, if the desired spatial magnification is "0.1", the camera distance is adjusted to "800 millimeters".
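The relation between the spatial magnification and the camera distance in the Figure 52 examples (W_a = 80 mm: magnification 1 at 80 mm, 10 at 8 mm, 0.1 at 800 mm) suggests C_I = W_a / SM. The inverse-proportional form is inferred from those data points rather than stated explicitly in the text, and the function name is hypothetical.

```python
def camera_distance_mm(w_a_mm, magnification):
    """Camera distance C_I (in mm) for a requested spatial
    magnification SM, assuming C_I = W_a / SM as suggested by
    the Figure 52 data points."""
    return w_a_mm / magnification
```
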
Referring to Figure 53, the operation of the entire system shown in Figure 49 will be described. A stereoscopic image is displayed by the display devices 4920 and 4930 (5010). The eye distance (W_a) and the spatial magnification adjustment data (SM) are provided to at least one of the display devices 4920 and 4930, or directly from the input device 4910 to the transmitter 4950 (5020). The eye distance (W_a) and the spatial magnification adjustment data (SM) are sent to the camera site (5030). The camera site receives the W_a and SM values, and adjusts the camera distance (C_I) according to the W_a and SM values. The stereoscopic cameras 30 and 32, with the adjusted spatial magnification, photograph an object (5050). The images are sent to the display site through the transmitters 4980 and 5000 (5060). Each of the display devices 4920 and 4930 receives and displays the images (5070).
Regarding the embodiment described in connection with Figures 49-53, the camera control can be used in remote control technology, such as a remote operation, the remote control of a vehicle, an aircraft, or a fighter plane, or the remote control of construction-survey or automatic assembly equipment.
Method and system for adjusting the display angle of a stereoscopic image according to a camera position
Figure 54 shows a stereoscopic display system according to another aspect of the present invention. The system adjusts the position of the display devices according to the relative position of the stereoscopic cameras with respect to an object 5400.
The system comprises a camera site and a display site. The camera site comprises a set of stereoscopic cameras 30 and 32, a pair of direction detection devices 5410 and 5420, and transmitters 5430 and 5440. In this embodiment of the invention, the cameras 30 and 32 may not be parallel to each other, as shown in Figure 54. The direction detection devices 5410 and 5420 detect the directions of the stereoscopic cameras 30 and 32, respectively, with respect to the object 5400 to be photographed. In one embodiment of the invention, the devices 5410 and 5420 detect the tilt angles with respect to the initial positions in which the two cameras are parallel to each other. In some situations, the cameras 30 and 32 may be tilted, for example, 10 degrees counterclockwise, as shown in Figure 54, or clockwise from the initial position. The detection devices 5410 and 5420 detect the tilt angles of the cameras 30 and 32, respectively. In one embodiment of the invention, each of the direction detection devices 5410 and 5420 comprises a typical direction sensor.
Each of the transmitters 5430 and 5440 transmits the detected direction data of the cameras 30 and 32 to the display site. If it is detected that only the camera 32 is tilted, as shown in Figure 57, the detection device 5410 detects no tilt, so that only the transmitter 5440 transmits the detected data to the display site. The same applies to the situation in which only the camera 30 is tilted.
The display site comprises a pair of receivers 5450 and 5460, a pair of display device controllers 5470 and 5500, and a set of display devices 5480 and 5490. Each of the receivers 5450 and 5460 receives the detected tilt data of the cameras 30 and 32, and provides the data to each of the display device controllers 5470 and 5500. The display device controllers 5470 and 5500 determine display adjustment values according to the received camera tilt data. The display adjustment value represents the amount of movement to be applied to the display devices 5480 and 5490. In one embodiment of the invention, the display device controllers 5470 and 5500 determine the display adjustment values according to the table shown in Figure 55. In this embodiment of the invention, if the camera 32 is tilted 10 degrees counterclockwise, as shown in Figure 54, the display device controller 5500 tilts the corresponding display device 5490 by 10 degrees in the clockwise direction, as shown in Figure 54, so that the position of the cameras with respect to the object 5400 and the position of the observer's eye lenses with respect to the screen are substantially the same. As described above, the screen may comprise a V-shaped mirror, an HMD screen, a projection screen, or a display screen 160, as shown in Figure 1B.
Referring to Figure 56, the overall operation of the system shown in Figure 54 will be explained. The set of stereoscopic cameras 30 and 32 photographs an object (5510). Each of the direction detection devices 5410 and 5420 detects the direction of a camera with respect to the object (5520). That is, for example, the tilt angle of each camera 30 and 32 from a parallel state is detected. The photographed image data (PID) and the direction detection data (DDD) are sent to the display site (5530). The PID and DDD are received at the display site, and the DDD are retrieved from the received data (5540, 5550). In one embodiment of the invention, the retrieval may be performed using a typical demultiplexer.
At least one of the display device controllers 5470 and 5500 determines the display device adjustment value according to the retrieved DDD (5560). At least one of the display device controllers 5470 and 5500 adjusts the display angle with respect to the observer's eye lenses by moving the corresponding display device (5570). The display devices 5480 and 5490 display the received stereoscopic images (5580).
Figure 57 shows a stereoscopic display system according to another aspect of the present invention. The system adjusts the displayed image according to the relative positions of the stereoscopic cameras 30 and 32 with respect to the object 5400.
Except for the display devices 5710 and 5720, the system shown in Figure 57 is substantially the same as that of Figure 54. The display devices 5710 and 5720 adjust the position of the displayed image according to the received camera direction detection data. Referring to Figure 58, an exemplary block diagram of the display device 5720 is presented. Though not shown, the display device 5710 is substantially the same as the display device 5720. The display device 5720 comprises a microcomputer 5910, a memory 5920, a display driver 5930, and a display screen 5940. The memory 5920 stores a table (not shown) showing the relation between the camera tilt angle and the adjustment amount of the displayed image. The microcomputer 5910 determines the display image adjustment value according to the received camera direction data and the table in the memory 5920. The display driver 5930 adjusts the display angle of the displayed image according to the determined adjustment value, and displays the image on the display screen 5940.
Referring to Figures 59A and 59B, the adjustment of the displayed image is presented. In one embodiment of the invention, this can be performed by partially enlarging or reducing the left or right portion of the displayed image. For example, the amount of enlargement or reduction is determined according to the tilt angle of the camera. In this embodiment of the invention, the enlargement or reduction is performed by known reduction or magnification software. The image in Figure 59A corresponds to the display device being tilted in the clockwise direction. Similarly, the image in Figure 59B corresponds to the display device being tilted in the counterclockwise direction.
Referring to Figure 60, the operation of the system of Figure 57 is explained. As can be seen from Figure 60, steps 5810-5850 are the same as the steps shown in Figure 56. The display image adjustment value is determined according to the retrieved camera direction detection data (DDD) (5860). The displayed image is adjusted according to the determined adjustment value, as shown in Figure 59 (5870). The adjusted image is displayed (5880).
Method and system for transmitting a stereoscopic image and a photographing ratio, or storing them on a persistent storage
Figure 61 shows a stereoscopic display system according to a further aspect of the invention. In this aspect of the invention, a stereoscopic image and a photographing ratio are transmitted through a network, such as the Internet, or are stored on a persistent storage, such as an optical disc or a magnetic disk.
Referring to Figure 61, combined data 620 comprising a stereoscopic image 624 and at least one photographing ratio (A : B : C) 622 of the image 624 are shown. The stereoscopic image 624 comprises a stereoscopic broadcast image, a stereoscopic advertisement image, a stereoscopic movie image, a stereoscopic product image for online shopping, or a stereoscopic image of any other kind. In one embodiment of the invention, the photographing ratio 622 may be fixed for the entire set of stereoscopic images 624. The method of combining the stereoscopic image 624 and the photographing ratio 622 has been described above in connection with Figure 7.
In one embodiment, stereo-picture 624 generates from a pair of stereocamera (not shown), and synthesizes with described shooting ratio 622.In one embodiment of the invention, described solid (broadcasting, advertisement, or film, etc.) image 624 and described shooting ratio 622 can be from Internet servers, or the computing equipment of a broadcaster transmits.Described Internet server is by an Internet Broadcast company, a Internet movies company, an Internet advertisement company, or a Internet shopping company operation.In another embodiment, described shooting ratio is not synthetic, but transmits individually with described stereo-picture.Yet for simplicity, following description is primarily aimed at described synthetic method.
Described generated data 620 by a network 625, is sent to a computing equipment 627 that is positioned at a demonstration place.In one embodiment of the invention, described network 627 can comprise the Internet, and is wired, public telephone switching network (PSTN), or a wireless network.With reference to Figure 63, shown the exemplary data layout of described generated data 620.The left side of described stereo-picture 624 and image right are embedded in the described generated data 620, make described image 624 sequentially retrieve in one group of display device 626 and 628.For example, left-side images 1 and image right 1, left-side images 2 and image right 2 sequentially are arranged in described data layout, make described image to be retrieved in proper order by that.In one embodiment, described computing equipment 627 receives described synthetic data 620, and, described stereo-picture 624 of retrieval and shooting ratio 622 from the data that receive.In another embodiment, described image 624 and shooting ratio 622 are received individually, because they are not synthesized in transmission.
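The interleaved layout of FIG. 63 (shooting ratio, then left image 1, right image 1, left image 2, right image 2, ...) can be sketched as follows. The exact byte format is not given in this excerpt, so frames are modelled as strings and the ratio as a tuple; the tagging scheme is an illustrative assumption.

```python
def compose(shooting_ratio, left_frames, right_frames):
    """Interleave left/right frames behind a shooting-ratio header,
    in the order described for the FIG. 63 data format."""
    data = [("ratio", shooting_ratio)]
    for l, r in zip(left_frames, right_frames):
        data.append(("L", l))   # left-eye image, then...
        data.append(("R", r))   # ...its paired right-eye image
    return data

def split(data):
    """Recover the ratio and the two eye streams in their original order,
    as the computing device 627 would before feeding the two displays."""
    ratio = data[0][1]
    lefts = [payload for tag, payload in data[1:] if tag == "L"]
    rights = [payload for tag, payload in data[1:] if tag == "R"]
    return ratio, lefts, rights

packed = compose((4, 3, 10), ["L1", "L2"], ["R1", "R2"])
print(split(packed))
```

The `("L", ...)`/`("R", ...)` tags play the role of the "predetermined order or data labels" by which the text says the left and right images are identified.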
The computing device 627 also provides the left and right images to the display devices 626 and 628, respectively. In one embodiment of the invention, the data format may be constructed so that the computing device 627, as it retrieves the images 624, can identify the left and right images of the stereoscopic images 624, for example, by a predetermined order or by data labels. In one embodiment of the invention, the computing device 627 may comprise any computing device capable of downloading the images 624 and the ratio 622 over the network 625, either in combined form or separately. In one embodiment, a pair of computing devices may be provided at the display site, each retrieving and providing the left and right images to the display devices 626 and 628, respectively.
The display devices 626 and 628 display the received stereoscopic images such that the screen ratio (D1:E1:F1, D2:E2:F2) of each display device 626 and 628 is substantially the same as the shooting ratio (A:B:C). In one embodiment of the invention, the screen ratios are identical (D1:E1:F1 = D2:E2:F2 = D:E:F). The display devices 626 and 628 comprise the elements of the display devices 86 and 88 disclosed in FIG. 8. In one embodiment of the invention, each display device 626 and 628 comprises a CRT, LCD, HMD, or PDP device, or a projection-type display device.
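The requirement that the screen ratio D:E:F be substantially the same as the shooting ratio A:B:C can be sketched as a proportion check. The 2% tolerance stands in for the patent's "substantially the same", which is not quantified in this excerpt.

```python
from math import isclose

def normalized(ratio):
    """Scale a (width, height, distance) triple so its first term is 1."""
    a, b, c = ratio
    return (1.0, b / a, c / a)

def ratios_match(shooting, screen, tol=0.02):
    """True when the screen ratio D:E:F is proportionally the same as the
    shooting ratio A:B:C, within an assumed tolerance."""
    return all(isclose(x, y, rel_tol=tol)
               for x, y in zip(normalized(shooting), normalized(screen)))

print(ratios_match((4, 3, 10), (40, 30, 100)))  # same proportions
print(ratios_match((4, 3, 10), (16, 9, 40)))    # different proportions
```

Only the proportions matter: a screen ten times larger than the photographed scene, viewed from ten times the distance, still satisfies the condition.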
In another embodiment of the invention, as shown in FIG. 62, the composite data is stored in a recording medium 630, such as an optical disk or a magnetic disk, and can be provided to display devices 634 and 636 by a medium retrieval device 632 at the display site. In one embodiment, the optical disk may comprise a compact disk (CD) or a digital video disk (DVD), and the magnetic disk may comprise a hard disk.
The recording medium 630 is inserted into the medium retrieval device 632, which retrieves the stereoscopic images 624 and the shooting ratio 622. In one embodiment of the invention, the medium retrieval device 632 may comprise a CD-ROM drive, a DVD-ROM drive, or a hard disk drive (HDD), together with a host computer for the drive. The medium retrieval device 632 may be embedded in a computing device (not shown).
The medium retrieval device 632 retrieves the stereoscopic images 624 and the shooting ratio 622 and provides them to the display devices 634 and 636, respectively. The exemplary data format shown in FIG. 63 also applies to the data stored in the recording medium 630. In one embodiment of the invention, the shooting ratio 622 is the same for all of the stereoscopic images. In this embodiment, the shooting ratio 622 is provided to each display device 634 and 636 only once, and the same shooting ratio is used for all of the stereoscopic images.
In one embodiment of the invention, the data format recorded on the medium 630 is constructed so that the medium retrieval device 632 can identify the left and right images of the stereoscopic images 624. The operation of the display devices 634 and 636 is the same as that of the devices 626 and 628 discussed in connection with FIG. 61.
Portable Communication Device Comprising a Pair of Digital Cameras That Generate Stereoscopic Images and a Pair of Display Screens
FIG. 64 shows an information communication system according to a further aspect of the invention. The system comprises a pair of portable communication devices 65 and 67. The device 65 comprises a pair of digital cameras 640 and 642, a pair of display screens 644 and 646, a distance input portion 648, an eye-interval input portion 650, and a spatial magnification input portion 652. The device 65 also comprises a receiver and a transmitter, or a transceiver (none shown).
The pair of digital cameras 640 and 642 generates stereoscopic images of a scene or an object, together with their shooting ratio. In one embodiment of the invention, each camera 640 and 642 comprises substantially the same elements as the camera 20 shown in FIG. 7. The device 65 transmits the generated stereoscopic images and shooting ratio to the device 67. The pair of display screens 644 and 646 displays the stereoscopic images received from the device 67.
The distance input portion 648 provides a distance value between an observer's eyes and each of the screens 644 and 646 (similar to the observer-screen distances F1 and F2 in FIG. 8). The eye-interval input portion 650 receives the distance between the central points of the observer's eyes (illustrated as W_a in FIG. 14A). The spatial magnification input portion 652 provides adjustment data for the spatial magnification to the device 65. In one embodiment of the invention, each of the distance input portion 648, the eye-interval input portion 650, and the spatial magnification input portion 652 comprises a keypad capable of entering the digits 0-9. In another embodiment, all of the input portions are contained in a single input device.
The device 67 comprises a pair of digital cameras 664 and 666, a pair of display screens 654 and 656, a distance input portion 658, an eye-interval input portion 660, and a spatial magnification input portion 662. The device 67 also comprises a receiver and a transmitter, or a transceiver (none shown).
The pair of digital cameras 664 and 666 generates stereoscopic images of a scene or an object, together with their shooting ratio. In one embodiment of the invention, each camera 664 and 666 comprises substantially the same elements as the camera 20 shown in FIG. 7. The device 67 transmits the generated stereoscopic images and shooting ratio to the device 65. The pair of display screens 654 and 656 displays the stereoscopic images received from the device 65.
The distance input portion 658, the eye-interval input portion 660, and the spatial magnification input portion 662 are substantially the same as those of the device 65.
The system shown in FIG. 64 may comprise at least one base station (not shown) that communicates with the devices 65 and 67. In one embodiment of the invention, each of the devices 65 and 67 comprises a cellular phone, an International Mobile Telecommunications-2000 (IMT-2000) device, a personal digital assistant (PDA), a handheld personal computer (PC), or another kind of portable telecommunication device.
In one embodiment of the invention, the spatial magnification adjustment data and the shooting ratio have a standard data format, so that the devices 65 and 67 can easily identify the data.
Device That Displays Stereoscopic Images Such That the Shooting Ratio and the Screen Ratio Are Substantially the Same
FIG. 65 shows a pair of information communication devices 65 and 67 according to one aspect of the invention. Each device 65 and 67 displays the stereoscopic images received from the other, such that the shooting ratio of one device is substantially the same as the screen ratio of the other. The device 65 comprises a camera portion 700, a display portion 720, and a data processor 740, for example, a microcomputer.
The camera portion 700 generates stereoscopic images and their shooting ratio and transmits them to the device 67. As noted above, communication between the devices 65 and 67 may be performed through at least one base station (not shown). The camera portion 700 comprises a pair of digital cameras 640 and 642 and a transmitter 710. Each digital camera 640 and 642 generates stereoscopic images and their shooting ratio and combines the images and ratio (composite data 702 and 704). In one embodiment of the invention, the shooting ratios provided in the composite data 702 and 704 are identical. Each digital camera 640 and 642 may comprise the elements of the camera 20 in FIG. 7.
The generation of the stereoscopic images, the calculation of the shooting ratio, and the combining of the images and ratio were explained in detail in connection with FIGS. 5-11. The transmitter 710 transmits the composite data 702 and 704 to the device 67. In another embodiment, the shooting ratio is not combined with the stereoscopic images but is transmitted separately.
In one embodiment of the invention, the transmitter 710 comprises two transmitting portions that transmit the composite data 702 and 704, respectively. The device 67 receives and displays the stereoscopic images sent from the device 65, such that the received shooting ratio is substantially the same as the screen ratio of the device 67.
The display portion 720 receives composite data 714 and 716 of stereoscopic images and their shooting ratio from the device 67 and displays the stereoscopic images, such that the received shooting ratio is substantially the same as the screen ratio of the device 65.
The display portion 720 comprises a pair of display devices 706 and 708 and a receiver 712. The receiver 712 receives the composite data 714 and 716 transmitted by the device 67 and provides the composite data 714 and 716 to the display devices 706 and 708, respectively. In one embodiment of the invention, the receiver 712 may comprise two receiving portions that receive the composite data 714 and 716, respectively. In another embodiment, the images and shooting ratio are received separately, because they were not combined for transmission.
Each display device 706 and 708 receives the images and ratio provided by the receiver 712. The devices 706 and 708 then display the stereoscopic images such that the shooting ratio is substantially the same as the screen ratio of each of the display devices 706 and 708, respectively. Each display device 706 and 708 may comprise substantially the same elements as the display device 86 or 88 shown in FIG. 8. In one embodiment, the devices 706 and 708 are connected to the distance input portion 648 shown in FIG. 64, so that the screen-observer distance of the devices 706 and 708 can be provided to the device 65. In one embodiment of the invention, the screen ratios of the devices 706 and 708 are substantially the same. The detailed operation of the display devices 706 and 708 was explained in connection with FIGS. 8-11.
The microcomputer 740 controls the operation of the camera portion 700 and the display portion 720, as well as data communication with the device 67. In one embodiment of the invention, the microcomputer 740 is programmed to control the camera portion 700 so that the digital cameras 640 and 642 generate stereoscopic images and their shooting ratio, and to control the transmitter 710 to transmit the images and ratio to the device 67 once a communication connection between the devices 65 and 67 is established. In another embodiment of the invention, the microcomputer 740 is programmed to control the power supplies of the camera portion 700 and the display portion 720 independently. In this embodiment, even when the cameras 640 and 642 are turned off, the display devices 706 and 708 can display the stereoscopic images received from the device 67. Likewise, when the display devices 706 and 708 are turned off, the cameras 640 and 642 can generate stereoscopic images and their shooting ratio and transmit the images and ratios to the device 67. In this embodiment, the device 65 may comprise an element that performs voice-signal communication with the device 67.
The device 65 may comprise a volatile memory, such as a random access memory (RAM), and/or a nonvolatile memory, such as a flash memory or a programmable read-only memory, to store the data used for communication. The device 65 may also comprise a power supply portion, such as a battery.
In another embodiment of the invention, the device 65 may comprise a transceiver that merges the transmitter 710 and the receiver 712. In that case, the separate transmitter 710 and receiver 712 may be omitted.
Although not specifically illustrated, the device 67 may be configured to comprise substantially the same elements as the device 65 and to perform substantially the same functions as the device 65 shown in FIG. 65. A detailed explanation of that embodiment is therefore omitted.
Device That Controls the Display Position of the Stereoscopic Images
FIG. 66A shows an information communication device 65 according to another aspect of the invention. In this aspect of the invention, the information communication device 65 controls the display position of the stereoscopic images according to the distance (W_a) between the central points of the observer's eyes.
In one embodiment of the invention, the device 65 moves the stereoscopic images displayed on the display screens 644 and 646 such that the distance (W_d) between the central points of the displayed stereoscopic images is substantially the same as the distance W_a. The device 65 comprises an eye-interval input portion 650, a data processor 722, for example, a microcomputer, a pair of display drivers 724 and 726, and a pair of display screens 644 and 646. The eye-interval input portion 650 and the pair of display screens 644 and 646 are substantially the same as those in FIG. 64. The microcomputer 722 controls the display drivers 724 and 726 according to the received distance W_a, such that the distance W_d is substantially the same as the distance W_a. Specifically, the display drivers 724 and 726 move the stereoscopic images displayed on the display screens 644 and 646 until W_d is substantially the same as W_a. A detailed explanation of the movement of the stereoscopic images was provided in connection with FIGS. 15-17.
In another embodiment of the invention, as shown in FIG. 66B, the device 65 moves the display screens 644 and 646 themselves, such that the distance W_d between the central points of the stereoscopic images is substantially the same as the distance W_a. In this embodiment, the device 65 comprises the eye-interval input portion 650, a microcomputer 732, a pair of servomechanisms 734 and 736, and the pair of display screens 644 and 646.
The microcomputer 732 controls the servomechanisms 734 and 736 according to the received distance W_a, such that the distance W_d is substantially the same as the distance W_a. Specifically, the servomechanisms 734 and 736 move the display screens 644 and 646 until W_d is substantially the same as W_a. A detailed explanation of the movement of the display screens was provided in connection with FIGS. 18-20.
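The adjustment in FIGS. 66A and 66B reduces to computing how far each image (or screen) must move so that W_d equals W_a. A symmetric split of the difference between the two sides is an assumption here; the excerpt only requires that the two distances end up substantially the same.

```python
def image_shift(w_d, w_a):
    """How far to move each displayed image (FIG. 66A) or screen (FIG. 66B)
    so that the distance W_d between the centers of the two displayed images
    equals the eye separation W_a.  Moving the two sides symmetrically, each
    travels half the difference (an illustrative choice, not the patent's)."""
    per_side = (w_a - w_d) / 2.0
    return -per_side, +per_side   # (left-side move, right-side move)

# Displayed centers 60 mm apart, eyes 65 mm apart:
left_move, right_move = image_shift(w_d=60.0, w_a=65.0)
print(left_move, right_move)
```

A negative value here means the left image moves outward (away from center), widening W_d until it matches W_a.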
Although not specifically illustrated, the device 67 may comprise substantially the same elements as the device 65 and perform substantially the same functions as the device 65 shown in FIGS. 66A and 66B. A detailed explanation of that embodiment is therefore omitted.
Device That Adjusts the Spatial Magnification of Stereoscopic Images
FIG. 67 shows an information communication device 65 according to a further aspect of the invention. In this aspect of the invention, the information communication device 65 adjusts the spatial magnification according to spatial magnification adjustment data. The device 65 comprises a camera portion 760, a display portion 780, and a microcomputer 750.
The camera portion 760 comprises a pair of digital cameras 640 and 642, a camera controller 742, and a transceiver 744. The transceiver 744 receives spatial magnification adjustment data from the device 67 and provides the adjustment data (C) to the camera controller 742. The spatial magnification embodiments were described in detail in connection with FIGS. 49-53. Exemplary spatial magnification adjustment data is shown in FIG. 52.
The camera controller 742 controls the distance (interval) between the digital cameras 640 and 642 according to the provided adjustment data (C). In one embodiment of the invention, the camera controller 742 comprises a motor that adjusts the camera distance and a servo controller that controls the motor (neither shown). The operation of the camera controller 742 is substantially the same as that of the controller 4990 described in connection with FIGS. 50-52. The digital cameras 640 and 642 generate stereoscopic images at the adjusted interval and transmit the stereoscopic images to the device 67 through the transceiver 744. The device 67 receives and displays the adjusted stereoscopic images. In this way, the device 67 can adjust the spatial magnification for the scene photographed by the cameras 640 and 642 of the device 65. In one embodiment of the invention, each of the devices 65 and 67 can display the current spatial magnification, such as "1", "0.5", "10", and the like, on at least one of its display screens, so that the observer knows the current spatial magnification. In another embodiment of the invention, the devices 65 and 67 can provide the observer with an audio signal representing the current spatial magnification.
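The camera controller's job can be sketched as a mapping from a requested spatial magnification to a camera interval. The inverse relation assumed below (narrowing the camera interval makes the scene appear larger) is an illustrative placeholder: the actual relation is defined in FIGS. 49-53, which are outside this excerpt.

```python
def camera_interval(eye_distance_mm, magnification):
    """Camera separation for a requested spatial magnification.

    Assumed relation for illustration only:
        interval = eye distance / magnification
    so magnification 1 reproduces the observer's own eye separation,
    and magnification 10 narrows the cameras to a tenth of it.
    """
    return eye_distance_mm / magnification

# The magnifications "0.5", "1", "10" mentioned in the text, for 65 mm eyes:
for m in (0.5, 1.0, 10.0):
    print(m, camera_interval(65.0, m))
```

The servo controller would then drive the motor until the measured interval matches the returned value.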
In another embodiment, spatial magnification adjustment data (A) can be provided to the camera controller 742, for example, through the spatial magnification input portion 652 shown in FIG. 64. This embodiment can be used when a user of the device 65 wants to provide a user of the device 67 with stereoscopic images at an already adjusted spatial magnification. In one embodiment, the operation of the camera controller 742 is substantially the same as when the adjustment data (C) is received from the device 67.
The display portion 780 comprises a pair of display screens 644 and 646 and a transceiver 746. Spatial magnification (SM) adjustment data (B) is provided to the transceiver 746 by a user of the device 65. The SM adjustment data (B) is used to adjust the interval between the cameras 664 and 666 of the device 67 (FIG. 64). The SM adjustment data (B) is also provided to at least one of the display screens 644 and 646, so that the SM adjustment data (B) is displayed on at least one of the display screens 644 and 646. This serves to notify the user of the device 65 of the current spatial magnification. The transceiver 746 transmits the SM adjustment data (B) to the device 67.
The device 67 receives the SM adjustment data (B) and adjusts the interval between the cameras 664 and 666 of the device 67 according to the adjustment data (B). The device 67 then transmits to the device 65 the stereoscopic images generated at the adjusted spatial magnification. The transceiver 746 receives the left and right images from the device 67 and provides the images to the display screens 644 and 646, respectively. The display screens 644 and 646 display the stereoscopic images. In one embodiment, each of the devices 65 and 67 of FIG. 67 can also comprise the functions of the devices 65 and 67 described in connection with FIGS. 65 and 66.
The microcomputer 750 controls the operation of the camera portion 760 and the display portion 780, as well as data communication with the device 67. In one embodiment of the invention, the microcomputer 750 is programmed to control the camera portion 760 and the display portion 780 so that the SM adjustment data (B, C) is transmitted and received between the devices once a communication connection between the devices 65 and 67 is established. In another embodiment of the invention, the microcomputer 750 is programmed to control the camera portion 760 so that the camera controller 742 adjusts the interval between the digital cameras 640 and 642 according to the SM adjustment data (A), even when no communication connection is established between the devices 65 and 67.
The device 65 may comprise a volatile memory, such as a random access memory (RAM), and/or a nonvolatile memory, such as a flash memory or a programmable read-only memory, to store the data used for communication. The device 65 may also comprise an element that performs voice-signal transmission.
Although not specifically illustrated, embodiments of the device 67 comprise substantially the same elements as the device 65 and perform substantially the same functions as the device 65 shown in FIG. 67. A detailed explanation of these embodiments is therefore omitted.
Device Comprising Separate Display Screens
In another embodiment of the invention, the communication device 65 comprises a goggle-shaped display device 649, as shown in FIG. 68. The goggle-shaped display device comprises a pair of display screens 645 and 647. In one embodiment of the invention, the display device 649 may be connected to the device 65 through a communication interface 643. In another embodiment of the invention, the display device 649 may have a wireless connection to the device 65.
The device 67 is likewise applicable to the embodiments described in connection with FIGS. 65-67. In one embodiment of the invention, each of the devices 65 and 67 may comprise a head-mounted display (HMD) device comprising a pair of display screens.
Other aspects of the present invention
FIG. 69 shows a stereoscopic display system according to a further aspect of the invention. In this aspect of the invention, stereoscopic images are generated from three-dimensional structure data. The three-dimensional structure data may comprise three-dimensional game data or three-dimensional animation data.
As an example, the three-dimensional structure data comprises pixel values (e.g., three-primary-color RGB pixel values) ranging, for example, from (0000, 0000, 0000) to (9999, 9999, 9999), at positions in a three-dimensional coordinate system (x, y, z) ranging from (000, 000, 000) to (999, 999, 999). In this case, Table 1 illustrates data #1 through data #N of the three-dimensional structure data.
Table 1
Data #1 at position (001, 004, 002): pixel value (0001, 0003, 1348)
Data #2 at position (001, 004, 004): pixel value (0010, 0033, 1234)
...
Data #N at position (025, 400, 087): pixel value (0001, 3003, 1274)
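The structure data of Table 1 can be modelled as a mapping from an integer (x, y, z) coordinate to an RGB triple. This toy container only illustrates the data shape the table describes; any real game or animation format would of course differ.

```python
# Three-dimensional structure data as in Table 1: an RGB pixel value
# (each channel 0000-9999 in the text) stored at an integer coordinate
# (x, y, z) in the range 000-999.
structure = {
    (1, 4, 2):     (1, 3, 1348),     # data #1
    (1, 4, 4):     (10, 33, 1234),   # data #2
    (25, 400, 87): (1, 3003, 1274),  # data #N
}

# Look up the pixel value stored at a given 3-D position:
print(structure[(1, 4, 4)])
```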
In one embodiment of the invention, as shown in FIG. 69A, stereoscopic images are generated from three-dimensional structure data 752 in a remote server. The three-dimensional structure data 752 is projected onto a pair of two-dimensional planes using known projection portions 754 and 756, often referred to in stereoscopic image display technology as virtual cameras or viewpoints. Each projection portion may comprise known software that performs the projection function. The projected images are stereoscopic images, each comprising one of a pair of two-dimensional images, and are transmitted to a display site. At the display site, the stereoscopic images are displayed on a pair of display devices.
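The two projection portions (virtual cameras) can be sketched with a simple pin-hole projection. This minimal model is an illustration only; the text says merely that known projection software is used, and the camera geometry below (cameras on the x-axis looking down +z, unit focal length) is assumed.

```python
def project(point, camera_x, focal=1.0):
    """Pin-hole projection of a 3-D point onto the image plane of a virtual
    camera at (camera_x, 0, 0) looking down the +z axis."""
    x, y, z = point
    return (focal * (x - camera_x) / z, focal * y / z)

# One structure-data point, seen by a left and a right virtual camera
# separated by an assumed 64 mm (0.064 units here):
point = (0.0, 1.0, 4.0)
left  = project(point, camera_x=-0.032)
right = project(point, camera_x=+0.032)
print(left, right)
```

The horizontal offset between the two projected positions is the disparity from which the displayed pair produces the stereoscopic effect.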
In another embodiment of the invention, as shown in FIG. 69B, stereoscopic images are generated from three-dimensional structure data at a display site. In this embodiment, the three-dimensional structure data can be transmitted or downloaded from a remote server to the display site. The projection portions 772 and 774 are located in a computing device 770. In one embodiment of the invention, the projection portions 772 and 774 may comprise a software module that is downloaded, together with the structure data, from the remote server to the computing device 770 at the display site. The projected images, that is, the generated stereoscopic images, are displayed by a pair of display devices 776 and 778. In another embodiment of the invention, the three-dimensional structure data is stored on a recording medium, such as an optical disk or a magnetic disk, which is inserted into the computing device 770 for retrieval, as described in connection with FIG. 62. In that case, a software module for the projection portions 772 and 774 may be included on the medium.
A method of generating stereoscopic images from three-dimensional structure data is disclosed in U.S. Patent No. 6,005,607, issued on December 21, 1999.
This aspect of the invention is applicable to all of the previously described aspects of the invention. In certain embodiments, however, some modifications may be made. As one example, the shooting ratio of the virtual cameras (projection portions, viewpoints) can be calculated by using the positions of the cameras and the object in the projection coordinate system to compute the horizontal and vertical lengths of the photographed object or scene, together with the distance between the cameras and the object (scene).
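The shooting-ratio calculation described here reduces to forming the triple (horizontal length : vertical length : camera-object distance) and putting it in lowest terms. Reducing by the greatest common divisor is an illustrative normalization; integer lengths are assumed for the sketch.

```python
from math import gcd

def shooting_ratio(width, height, distance):
    """A:B:C for a virtual camera, computed from the photographed object's
    horizontal and vertical extent and its distance from the camera in the
    projection coordinate system, reduced to smallest integer terms."""
    g = gcd(gcd(width, height), distance)
    return (width // g, height // g, distance // g)

# A 400 x 300 scene photographed from 1000 units away:
print(shooting_ratio(400, 300, 1000))
```

The resulting triple is what the display side compares against its own screen ratio D:E:F.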
As another example, control of the movement of the virtual cameras can be performed by software that identifies the positions of the virtual cameras and controls their movement.
As another example, control of the spatial magnification can be performed by using the identified positions of the virtual cameras in the projection coordinate system and adjusting the interval between the virtual cameras.
FIG. 70 shows a stereoscopic display system according to a further aspect of the invention. This aspect of the invention displays stereoscopic images such that the resolution of each display device corresponds to the resolution of each stereoscopic camera. In this aspect of the invention, the position of each photographed pixel relative to the camera frame (e.g., 640 × 480) is substantially the same as the position of the corresponding displayed pixel relative to the display screen (e.g., 1280 × 960). Referring to FIG. 70, the resolution of the display device is twice the camera resolution. Thus, a pixel in the upper-left corner of the camera frame is converted into four pixels at the same relative position on the display screen, as shown in FIG. 70. Similarly, a pixel in the lower-right corner of the camera frame is converted into four pixels at the same relative position on the display screen, as shown in FIG. 70. This aspect of the invention is applicable to all of the stereoscopic display systems described in this application.
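The pixel mapping of FIG. 70, where a display with twice the camera resolution shows each camera pixel as a 2 × 2 block at the same relative position, is exactly nearest-neighbor upscaling. A minimal sketch on a tiny frame:

```python
def upscale2x(frame):
    """Map each camera pixel to a 2x2 block of display pixels, preserving
    its relative position in the frame, as in FIG. 70 (e.g., a 640x480
    camera frame shown on a 1280x960 display)."""
    out = []
    for row in frame:
        doubled = [p for p in row for _ in (0, 1)]  # duplicate horizontally
        out.append(doubled)
        out.append(list(doubled))                   # duplicate vertically
    return out

# A 2x2 camera frame becomes a 4x4 display frame:
print(upscale2x([[1, 2],
                 [3, 4]]))
```

Pixel 1 (upper-left) and pixel 4 (lower-right) each become four display pixels in their respective corners, matching the two conversions the paragraph describes.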
The systems described above include a display site communicatively connected to a remote camera site. These inventions can, however, also be implemented without receivers/transmitters and a network, so that the functions are performed at a single site. Some of the systems described above are based on the movement or position of an observer's eye lenses. The systems can, however, also be implemented based on an observer's pupils or corneas.
The foregoing description has pointed out novel features of the invention as applied to various embodiments. Those skilled in the art will appreciate that various omissions, substitutions, and changes in the form and details of the devices and processes described may be made without departing from the invention. The scope of the invention is therefore defined by the following claims rather than by the foregoing description. All changes that come within the meaning and range of equivalency of the claims are embraced within their scope.

Claims (86)

1. A method of displaying stereoscopic images, comprising:
displaying at least one stereoscopic image on a pair of display devices, the stereoscopic image comprising a pair of two-dimensional images;
providing at least one input device pointer on the pair of two-dimensional images;
moving the at least one input device pointer from a first position to a second position on the pair of two-dimensional images;
determining a position value representing the second position of the at least one input device pointer;
calculating a central point for each of the two-dimensional images according to the determined position value; and
moving the central points of the two-dimensional images into alignment with the calculated central points.
2. The method of claim 1, wherein the at least one input device pointer comprises a pair of mouse cursors controlled by a mouse in data communication with the pair of display devices.
3. The method of claim 2, wherein the calculating of the central points comprises:
aligning the central point of each of an observer's eye lenses with the central point of the respective two-dimensional image;
calculating a position value of the central point of each of the observer's eye lenses; and
determining the central position values of the two-dimensional images according to the position values calculated for the eye lenses, respectively.
4. The method of claim 3, wherein the aligning comprises:
providing a viewing-point structure defining two openings in which the observer's eyes can be placed, each opening being aligned with the central point of one of the two-dimensional images; and
placing the observer's eyes adjacent to the viewing-point structure.
5. The method of claim 3, wherein the calculating of the position values comprises:
setting the midpoint M between the center points of the observer's eyes as the origin O(0, 0, 0);
setting a distance value (d) at which the observer views the displayed image;
determining a position M_L(I_L, J_L, K_L) for the mouse cursor displayed on one of the two-dimensional images, and a position M_R(I_R, J_R, K_R) for the mouse cursor displayed on the other two-dimensional image;
determining a center position M_N(I_N, J_N, K_N) between the positions M_L and M_R, where I_N is determined as (I_L + I_R)/2, J_N is determined as (J_L + J_R)/2, and K_N is determined as the distance value d;
determining the positions of the center points of the eyes as A_3L(−W_a/2, 0, 0) and A_3R(W_a/2, 0, 0), where W_a represents the distance between the center points of the observer's eyes;
calculating, using Formula I, the distance Z_L between the point A_3L and the point M_N, where Formula I is:
Z_L = √{[I_N − (−W_a/2)]² + (J_N − 0)² + (K_N − 0)²} = √[(I_N + W_a/2)² + (J_N)² + (K_N)²]
and
determining, using Formula II, the center point of each eye lens, A_2L(x1, y1, z1) and A_2R(x2, y2, z2), where Formula II is:
x1 = (−W_a/2) + [(I_N + W_a/2) × S] / Z_L
y1 = 0 + [(J_N) × S] / Z_L
z1 = 0 + [(K_N) × S] / Z_L
x2 = (W_a/2) − [(I_N + W_a/2) × S] / Z_L
y2 = 0 + [(J_N) × S] / Z_L
z2 = 0 + [(K_N) × S] / Z_L
wherein the center points of the two-dimensional images are determined as the positions (x1, y1) and (x2, y2).
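For illustration only (not part of the claims), the computation recited above can be sketched in Python. The function name and the reading of S as a given scale factor from eye center to eye lens are assumptions; the arithmetic follows Formulas I and II as written.

```python
import math

def eye_lens_centers(ML, MR, Wa, d, S):
    """Sketch of the claim-5 computation; variable names follow the claim.

    ML, MR: (I, J, K) positions of the left and right mouse cursors.
    Wa:     distance between the center points of the observer's eyes.
    d:      viewing distance, used as the K coordinate of the midpoint.
    S:      scale factor from eye center to eye lens (assumed given).
    """
    # Center position M_N between the two cursor positions; K_N is set to d.
    IN = (ML[0] + MR[0]) / 2
    JN = (ML[1] + MR[1]) / 2
    KN = d

    # Formula I: distance Z_L from the left eye center A_3L(-Wa/2, 0, 0) to M_N.
    ZL = math.sqrt((IN + Wa / 2) ** 2 + JN ** 2 + KN ** 2)

    # Formula II: center points of the left and right eye lenses.
    x1 = -Wa / 2 + (IN + Wa / 2) * S / ZL
    y1 = JN * S / ZL
    z1 = KN * S / ZL
    x2 = Wa / 2 - (IN + Wa / 2) * S / ZL
    y2 = JN * S / ZL
    z2 = KN * S / ZL
    return (x1, y1, z1), (x2, y2, z2)
```

With both cursors at the origin, the two lens centers come out mirror-symmetric about the midpoint M, as the symmetric forms of x1 and x2 suggest.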
6. The method of claim 5, wherein the mouse comprises a scroll button, and wherein the setting of the distance value is performed by rotating the scroll button of the mouse.
7. The method of claim 1, wherein the set of display devices comprises a single display device adapted to display the two-dimensional images sequentially.
8. The method of claim 1, wherein the set of display devices comprises a pair of display devices configured to display the two-dimensional images respectively and simultaneously.
9. The method of claim 1, further comprising generating the at least one stereoscopic image from three-dimensional structure data.
10. A method of displaying a stereoscopic image, comprising:
displaying at least one stereoscopic image on a set of display devices, the stereoscopic image comprising a pair of two-dimensional plane images;
providing at least one input device indicator on the pair of two-dimensional plane images;
storing data representing the relation between the position of the at least one indicator and the center point of each of the two-dimensional images;
calculating an amount of motion of the at least one indicator on the two-dimensional images;
determining, from the calculated amount and the stored data, a center point position for each of the two-dimensional images; and
moving the center point of each of the two-dimensional images according to the determined center point positions.
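Purely as an illustrative sketch (not part of the claim), the stored indicator-to-center relation recited in claim 10 can be applied as follows; the function and its flat 2-D point representation are hypothetical:

```python
def recenter_images(indicator_pos, stored_offsets):
    """Re-derive each two-dimensional image's center from the indicator's
    current position and the stored indicator-to-center offsets, so that
    the stored relation between indicator and centers is preserved."""
    x, y = indicator_pos
    return [(x + ox, y + oy) for (ox, oy) in stored_offsets]
```

Moving the indicator by some amount then moves every image center by the same amount, which is the effect the claim recites.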
11. The method of claim 10, further comprising generating the at least one stereoscopic image from three-dimensional structure data.
12. A method of displaying a stereoscopic image, comprising:
displaying at least one stereoscopic image on a set of display devices, the stereoscopic image comprising a pair of two-dimensional plane images;
providing at least one input device indicator on the pair of two-dimensional plane images;
moving the at least one input device indicator to a target position on the two-dimensional images;
determining a position value for the target position on the two-dimensional images;
calculating, according to the determined position value, the center points to which the two-dimensional images are to be moved; and
moving the center point of each of the two-dimensional images according to the calculated center point values.
13. The method of claim 12, wherein the at least one input device indicator comprises a pair of mouse cursors controlled by a mouse in data communication with each of the display devices.
14. The method of claim 13, wherein calculating the center points comprises:
aligning the center point of each eye lens of an observer with the center point of a respective one of the two-dimensional images; and
calculating, according to the target position value, a position value of the center point of each eye lens of the observer.
15. The method of claim 14, wherein the calculating of the position values comprises:
setting the midpoint M between the center points of the observer's eyes as the origin O(0, 0, 0);
setting a distance value (d) at which the observer views the displayed image;
determining a position M_L(I_L, J_L, K_L) for the mouse cursor displayed on one of the two-dimensional images, and a position M_R(I_R, J_R, K_R) for the mouse cursor displayed on the other two-dimensional image;
determining a center position M_N(I_N, J_N, K_N) between the positions M_L and M_R, where I_N is determined as (I_L + I_R)/2, J_N is determined as (J_L + J_R)/2, and K_N is determined as the distance value d;
determining the positions of the center points of the eyes as A_3L(−W_a/2, 0, 0) and A_3R(W_a/2, 0, 0), where W_a represents the distance between the center points of the observer's eyes;
calculating, using Formula I, the distance Z_L between the point A_3L and the point M_N, where Formula I is:
Z_L = √{[I_N − (−W_a/2)]² + (J_N − 0)² + (K_N − 0)²} = √[(I_N + W_a/2)² + (J_N)² + (K_N)²]
and
determining, using Formula II, the center point of each eye lens, A_2L(x1, y1, z1) and A_2R(x2, y2, z2), where Formula II is:
x1 = (−W_a/2) + [(I_N + W_a/2) × S] / Z_L
y1 = 0 + [(J_N) × S] / Z_L
z1 = 0 + [(K_N) × S] / Z_L
x2 = (W_a/2) − [(I_N + W_a/2) × S] / Z_L
y2 = 0 + [(J_N) × S] / Z_L
z2 = 0 + [(K_N) × S] / Z_L
wherein the center points of the two-dimensional images are determined as the positions (x1, y1) and (x2, y2).
16. The method of claim 12, further comprising generating the at least one stereoscopic image from three-dimensional structure data.
17. A system for displaying stereoscopic images, comprising:
a set of display devices configured to display at least one stereoscopic image, the stereoscopic image comprising a pair of two-dimensional images;
an input device configured to control motion of at least one input device indicator displayed on the two-dimensional images, the at least one input device indicator setting a target position on the two-dimensional images to which the displayed images are to be moved;
a computing device configured to determine a position value for the target position of the at least one indicator and, according to the determined position value, to determine the center points of the two-dimensional images; and
a display driver configured to move the displayed images according to the determined target position values.
18. The system of claim 17, wherein the at least one input device indicator comprises a pair of mouse cursors controlled by a mouse in data communication with the set of display devices.
19. The system of claim 18, further comprising a viewing-point structure defining two openings for an observer's eyes, each opening registering the center point of one of the observer's eye lenses with the center point of one of the two-dimensional images, and wherein the computing device is configured to calculate, according to the target position value, the position value of the center point of each eye lens of the observer, and to determine, according to the calculated position values, the center points to which the two-dimensional images are to be moved.
20. The system of claim 17, wherein the set of display devices comprises a single display device adapted to display the two-dimensional images sequentially.
21. The system of claim 17, wherein the set of display devices comprises a pair of display devices configured to display the two-dimensional images respectively and simultaneously.
22. The system of claim 17, wherein the set of display devices is selected from one of: a head-mounted display, a projection display device, a liquid crystal display (LCD) device, a cathode ray tube (CRT) device, and a plasma display panel device.
23. The system of claim 17, further comprising:
a receiver configured to receive three-dimensional structure data; and
a set of projection portions configured to generate the at least one stereoscopic image from the three-dimensional structure data and to provide the image to the set of display devices.
24. A method of displaying a stereoscopic image, comprising:
providing a pair of stereoscopic cameras;
generating at least one stereoscopic image comprising a pair of two-dimensional plane images generated respectively by the pair of stereoscopic cameras;
detecting the motion of each stereoscopic camera;
transmitting the generated stereoscopic image and the detection data;
receiving the stereoscopic image and the detection data;
displaying the received two-dimensional images respectively on a set of display devices; and
providing, according to the detection data, a guide signal representing the direction of motion of the stereoscopic cameras.
25. The method of claim 24, wherein providing the guide signal comprises displaying an on-screen display (OSD) data signal on each display device, the signal guiding the observer's eyes to track the motion of the cameras.
26. The method of claim 24, wherein providing the guide signal comprises providing a sound signal representing the motion of the cameras.
27. The method of claim 24, wherein the detecting of the motion is performed using a set of camera motion detection devices.
28. The method of claim 24, wherein the transmitting is performed over a communication network.
29. A system for displaying stereoscopic images, comprising:
a set of stereoscopic cameras configured to generate at least one stereoscopic image comprising a pair of two-dimensional plane images;
a set of motion detection devices, each detecting the motion of a respective one of the set of stereoscopic cameras;
a transmitter configured to transmit the generated images and the detected motion data;
a receiver configured to receive the images and the detected motion data; and
a set of display devices configured to display the pair of two-dimensional plane images respectively and to provide a guide signal representing the direction of motion of each stereoscopic camera.
30. The system of claim 29, wherein the display devices are configured to display an on-screen display (OSD) data signal guiding the observer's eyes to track the motion of the cameras.
31. The system of claim 29, further comprising a speaker providing a sound signal representing the direction of motion of the cameras.
32. The system of claim 29, wherein the transmitter is configured to transmit the images and motion data over a communication network.
33. A method of displaying a stereoscopic image, comprising:
receiving at least one stereoscopic image and motion detection data of a set of stereoscopic cameras, the stereoscopic image comprising a pair of two-dimensional plane images;
displaying the pair of two-dimensional plane images on a set of display devices; and
providing, according to the detection data, a guide signal representing the direction of motion of the stereoscopic cameras.
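As a hypothetical sketch of the guide signal of claims 33-36 (not part of the claims), detected camera motion might be mapped to an arrow direction for an OSD signal as follows; the axis convention and function name are illustrative:

```python
def guidance_arrow(dx, dy):
    """Map a detected camera motion vector (dx, dy) to an arrow direction
    guiding the observer's eyes to track the camera motion."""
    if dx == 0 and dy == 0:
        return "none"          # no motion detected, no arrow shown
    if abs(dx) >= abs(dy):     # horizontal motion dominates
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"
```

The returned direction would then be rendered by an OSD circuit as an arrow indicator, as claim 36 describes.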
34. The method of claim 33, wherein providing the guide signal comprises providing a sound signal representing the direction of motion of the cameras.
35. The method of claim 33, wherein providing the guide signal comprises displaying an on-screen display (OSD) signal on each display device, the signal guiding the observer's eyes to track the motion of the cameras.
36. The method of claim 35, wherein the OSD signal comprises an arrow indicator signal representing the direction of motion of the cameras.
37. A method of displaying a stereoscopic image, comprising:
generating at least one stereoscopic image from three-dimensional structure data, the stereoscopic image comprising a pair of two-dimensional images projected respectively by a pair of projection portions;
detecting the motion of the pair of projection portions;
displaying the stereoscopic image; and
providing, according to the detection data, a guide signal regarding the motion of each projection portion.
38. A system for displaying a three-dimensional image, comprising:
a receiver configured to receive at least one stereoscopic image and motion data of a set of stereoscopic cameras, the stereoscopic image comprising a pair of two-dimensional plane images generated respectively by the set of stereoscopic cameras; and
a set of display devices configured to display the pair of two-dimensional images and to provide a guide signal representing the direction of motion of the stereoscopic cameras.
39. The system of claim 38, wherein the set of display devices comprises:
an image processor configured to process the received stereoscopic images;
an on-screen display (OSD) circuit configured to generate an OSD signal representing the direction of motion of the stereoscopic cameras; and
a synthesizer configured to combine the processed image data and the generated OSD signal.
40. The system of claim 39, wherein the OSD circuit is configured to generate an arrow indicator signal representing the direction of motion of at least one of the cameras.
41. The system of claim 38, further comprising a V-shaped mirror located between the display devices, wherein the display devices face in opposite directions and comprise a set of inversion portions that invert the left and right sides of the two-dimensional plane images to output converted images, and wherein the V-shaped mirror is configured to receive and reflect the converted images to the observer.
42. The system of claim 38, wherein the set of display devices comprises a single display device adapted to display the two-dimensional images sequentially.
43. The system of claim 38, wherein the set of display devices comprises a pair of display devices configured to display the two-dimensional images respectively and simultaneously.
44. The system of claim 38, wherein the set of display devices is selected from one of: a head-mounted display, a projection display device, a liquid crystal display (LCD) device, a cathode ray tube (CRT) device, and a plasma display panel device.
45. A method of adjusting the display direction of a stereoscopic image according to the positions of a set of stereoscopic cameras with respect to a scene, comprising:
detecting the position of each of the set of stereoscopic cameras with respect to a scene being photographed;
transmitting the camera position data to a set of display devices;
receiving the camera position data;
determining a display device adjustment value according to the camera position data; and
adjusting the position of at least one display device according to the adjustment value.
46. The method of claim 45, further comprising positioning the stereoscopic cameras in parallel alignment, and wherein detecting the camera positions comprises detecting a tilt angle in a first direction away from the parallel alignment.
47. The method of claim 46, wherein the adjusting of the display angle comprises tilting at least one of the display devices in a second direction opposite to the first direction.
48. The method of claim 45, wherein the adjusting of the position comprises moving at least one of the display devices with respect to viewing points, the viewing points comprising the points where the observer's eyes are located, each point forming a substantially perpendicular angle with the center point of the display screen of a respective one of the display devices.
49. A system for adjusting the display direction of a stereoscopic image according to the positions of a set of stereoscopic cameras, comprising:
a set of position detection devices, each detecting the position of a respective one of the stereoscopic cameras;
a transmitter transmitting at least one stereoscopic image and the position detection data, the stereoscopic image comprising a pair of two-dimensional images;
a receiver receiving the at least one stereoscopic image and the position detection data;
a set of display devices displaying the received two-dimensional images respectively; and
a display device controller determining a display device adjustment value according to the received position detection data, and adjusting the position of at least one display device according to the determined adjustment value.
50. A system for adjusting the display direction of a stereoscopic image according to the positions of a set of stereoscopic cameras, comprising:
a receiver receiving at least one stereoscopic image and position detection data of the set of stereoscopic cameras, the stereoscopic image comprising a pair of two-dimensional images;
a set of display devices displaying the pair of two-dimensional images respectively; and
a computing device determining a display device adjustment value according to the received position detection data, and adjusting the position of at least one display device according to the determined adjustment value.
51. The system of claim 50, wherein the position detection data comprise a tilt angle of at least one of the stereoscopic cameras in a first direction, and wherein the computing device tilts at least one of the two-dimensional images in a second direction opposite to the first direction.
52. The system of claim 51, wherein the computing device is configured to enlarge or reduce a portion of at least one of the two-dimensional images.
53. The system of claim 50, wherein the computing device comprises:
a microcomputer determining a display image adjustment value according to the position detection data; and
a display driver adjusting the display direction of at least one received two-dimensional image according to the determined adjustment value.
54. A method of adjusting the display direction of a stereoscopic image according to the positions of a set of stereoscopic cameras, comprising:
generating at least one stereoscopic image comprising a pair of two-dimensional images generated respectively by a set of stereoscopic cameras;
detecting the position of each stereoscopic camera with respect to a scene to be photographed;
transmitting the at least one stereoscopic image and the position detection data to a set of display devices;
receiving the position detection data and the stereoscopic image;
determining a display image adjustment value according to the position detection data; and
adjusting the display direction of at least one received two-dimensional image according to the determined adjustment value.
55. The method of claim 54, wherein the adjusting of the position comprises moving the set of display devices with respect to viewing points, the viewing points comprising the points where the observer's eyes are located, each point forming a substantially perpendicular angle with the center point of the display screen of a respective one of the display devices.
56. The method of claim 54, wherein the set of display devices comprises a single display device adapted to display the two-dimensional images sequentially.
57. The method of claim 54, wherein the set of display devices comprises a pair of display devices configured to display the two-dimensional images respectively and simultaneously.
58. A system for adjusting the display direction of a stereoscopic image according to the positions of a set of stereoscopic cameras, comprising:
a set of stereoscopic cameras generating at least one stereoscopic image comprising a pair of two-dimensional images;
a set of position detection devices detecting the positions of the set of stereoscopic cameras respectively;
a transmitter transmitting the generated stereoscopic image and the position detection data;
a receiver receiving the position detection data and the stereoscopic image;
a computing device determining a display image adjustment value according to the position detection data; and
a display driver adjusting the display direction of at least one received two-dimensional image according to the determined adjustment value.
59. A method of adjusting the display direction of a stereoscopic image, comprising:
providing a pair of projection portions generating at least one stereoscopic image from three-dimensional structure data of a scene, the stereoscopic image comprising a pair of two-dimensional images;
displaying the pair of two-dimensional images respectively on a pair of display devices;
detecting the relative position of each projection portion with respect to the scene;
determining a display device adjustment value according to the position values; and
adjusting, according to the adjustment value, the display angle at which at least one of the two-dimensional images is directed toward a respective eye of an observer viewing the display devices.
60. An information communication system, comprising:
a first portable device comprising a pair of stereoscopic cameras and a pair of display screens, the pair of stereoscopic cameras generating and transmitting a first stereoscopic image, and the pair of display screens receiving and displaying a second stereoscopic image; and
a second portable device communicating with the first portable device and comprising a pair of stereoscopic cameras and a pair of display screens, the pair of stereoscopic cameras of the second device generating a second stereoscopic image and transmitting the second stereoscopic image to the first portable device, and the pair of display screens of the second device receiving the first stereoscopic image from the first portable device and displaying the first stereoscopic image;
wherein each pair of stereoscopic cameras is spaced apart by a predetermined interval; and
wherein each of the first and second stereoscopic images comprises a pair of two-dimensional images generated respectively by one of the pairs of stereoscopic cameras.
61. The system of claim 60, wherein at least one of the first and second portable devices further comprises:
an input portion providing a distance value (W_a) between the center points of an observer's eyes;
a computing device determining, according to the provided W_a value, an amount of motion for each displayed two-dimensional image such that the W_a value is substantially the same as the distance between the center points of the displayed images; and
a display driver moving the displayed images according to the determined amount of motion.
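An illustrative sketch (not part of the claim) of the amount-of-motion determination in claim 61: if the displayed image centers are currently some distance apart, each image is shifted so the spacing becomes W_a. The symmetric split of the correction between the two images is an assumption, as are the names:

```python
def screen_shifts_for_eye_distance(Wa, center_distance):
    """Return (left_shift, right_shift) that make the distance between
    the two displayed image centers substantially equal to Wa."""
    delta = Wa - center_distance
    # Split the correction symmetrically between the two images,
    # so the spacing changes by exactly delta.
    return (-delta / 2, delta / 2)
```

A positive delta moves the left image left and the right image right, widening the spacing; a negative delta narrows it.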
62. The system of claim 60, wherein at least one of the first and second portable devices further comprises:
an input portion providing a distance value (W_a) between the center points of an observer's eyes;
a computing device determining, according to the provided W_a value, an amount of motion for each displayed two-dimensional image such that the W_a value is substantially the same as the distance between the center points of the displayed images; and
a servo mechanism moving each display screen according to the determined amount of motion.
63. The system of claim 62, wherein the servo mechanism comprises:
a servo controller determining an adjustment amount according to the determined amount of motion; and
a horizontal motor adjusting the interval between the display screens according to the determined adjustment amount.
64. The system of claim 60, further comprising a base station communicating with the first and second portable devices.
65. The system of claim 60, wherein each portable device comprises one of a cellular phone, a personal data assistant (PDA), a hand-held personal computer (PC), and an International Mobile Telecommunications-2000 (IMT-2000) terminal.
66. The system of claim 60, wherein at least one of the first and second portable devices comprises:
an input portion providing adjustment data for a spatial magnification, the spatial magnification relating to the size of a scene photographed by one of the pairs of stereoscopic cameras; and
a camera control portion adjusting the distance between the pair of stereoscopic cameras according to the provided adjustment data.
67. The system of claim 60, wherein each of the first and second portable devices comprises:
a receiver receiving adjustment data for a spatial magnification, the spatial magnification relating to the size of a scene photographed by one of the pairs of stereoscopic cameras; and
a camera control portion adjusting the distance between the pair of stereoscopic cameras according to the received adjustment data.
68. The system of claim 67, wherein at least one of the first and second portable devices displays the received spatial magnification adjustment data.
69. The system of claim 67, wherein at least one of the first and second portable devices comprises a sound providing portion providing a sound signal representing the received spatial magnification adjustment data.
70. The system of claim 60, wherein each of the first and second portable devices comprises:
an input portion providing adjustment data for a spatial magnification, the spatial magnification relating to the size of a scene photographed in one of the first and second portable devices; and
a transmitter transmitting the spatial magnification adjustment data to the other portable device.
71. The system of claim 60, wherein the first portable device transmits to the second device the first stereoscopic image together with first and second shooting ratios (A1:B1:C1, A2:B2:C2) of the images generated in the first portable device, where A1 and A2, and B1 and B2, are defined respectively as the horizontal and vertical lengths of the scene photographed by the stereoscopic cameras of the first device, and C1 and C2 are defined respectively as the distances between the object lenses of the stereoscopic cameras and the scene; and
wherein the second portable device receives and displays the received images such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display screens of the second device is substantially the same as the corresponding shooting ratio (A1:B1:C1, A2:B2:C2), where D1 and D2, and E1 and E2, are defined respectively as the horizontal and vertical lengths of the two-dimensional images displayed in the display screens of the second device, and F1 and F2 are defined as the distances between the display screens of the second device and viewing points.
72. The system of claim 71, wherein the viewing points comprise the points where the observer's eyes are located, each point forming a substantially perpendicular angle with the center point of a respective one of the displayed images.
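For illustration (not part of the claims), the screen-ratio condition of claims 71-72 fixes the viewing distance F once a shooting ratio A:B:C and the screen dimensions D and E are known, because D:E:F must equal A:B:C. A minimal sketch, with hypothetical names:

```python
def viewing_distance_for_ratio(A, B, C, D, E):
    """Given a shooting ratio A:B:C and a display screen of horizontal
    length D and vertical length E, return the viewing distance F that
    makes the screen ratio D:E:F match A:B:C."""
    if abs(D / A - E / B) > 1e-9:
        raise ValueError("screen aspect D:E does not match scene aspect A:B")
    return C * D / A  # F scales C by the same factor as D scales A
```

For example, a 4:3:10 shooting ratio displayed on a screen measuring 8 by 6 would call for viewing from a distance of 20 in the same units.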
73. A portable communication device, comprising:
a pair of stereoscopic cameras generating a stereoscopic image of a first scene;
a transmitter transmitting the first stereoscopic image;
a receiver receiving a stereoscopic image of a second scene different from the first scene; and
a pair of display screens displaying the image of the second scene;
wherein each stereoscopic image comprises a pair of two-dimensional plane images.
74. The device of claim 73, wherein the transmitter transmits the stereoscopic image over a communication network.
75. The device of claim 73, further comprising:
an input portion providing a distance value (W_a) between the center points of an observer's eyes;
a computing device determining, according to the provided W_a value, an amount of motion for the displayed two-dimensional images such that the W_a value is substantially the same as the distance between the center points of the displayed images; and
a display driver moving the pair of displayed images according to the determined amount of motion.
76. The device of claim 73, further comprising:
an input portion providing a distance value (W_a) between the center points of an observer's eyes;
a computing device determining, according to the W_a value, an amount of motion for the displayed two-dimensional images such that the W_a value is substantially the same as the distance between the center points of the displayed images; and
a servo mechanism moving the pair of display screens according to the determined amount of motion.
77. The device of claim 76, wherein the servo mechanism comprises:
a servo controller determining an adjustment amount according to the determined amount of motion; and
a horizontal motor adjusting the interval between the display screens according to the determined adjustment amount.
78. The device of claim 73, wherein the pair of display screens is separate from the device and has a wireless connection to the device.
79. The device of claim 73, wherein the pair of display screens is located in a head-mounted display.
80. The device of claim 73, wherein the pair of display screens is located in a pair of goggles.
81., further comprise according to the described device of claim 73:
An importation is provided for the adjustment data of spatial magnification, and described spatial magnification relates to the size of a scene of being taken by described stereocamera; And
A camera control section according to the adjustment data that provide, is adjusted the distance between the described stereocamera.
82. The device of claim 73, further comprising:
A receiver that receives adjustment data for a space magnification, the space magnification relating to the size of a scene photographed by the pair of stereoscopic cameras; and
A camera control portion that adjusts the distance between the pair of stereoscopic cameras according to the received adjustment data.
83. The device of claim 73, further comprising:
An input portion that provides adjustment data for a space magnification, the space magnification relating to the size of a scene photographed by another pair of stereoscopic cameras; and
A transmitter that transmits the space magnification adjustment data to the other pair of stereoscopic cameras.
84. The device of claim 73, wherein the receiver receives the stereoscopic images of the second scene, respectively, the first and second stereoscopic images of the second scene being produced in another pair of stereoscopic cameras with photographing ratios (A1:B1:C1, A2:B2:C2), wherein A1 and A2, and B1 and B2, are defined as the horizontal and vertical lengths, respectively, of the scene photographed by the other pair of stereoscopic cameras, and C1 and C2 are defined as the distances, respectively, between the object lenses of the other pair of stereoscopic cameras and the scene; and
Wherein the display screens display the received stereoscopic images such that each screen ratio (D1:E1:F1, D2:E2:F2) of the display screens is substantially the same as each photographing ratio (A1:B1:C1, A2:B2:C2), wherein D1 and D2, and E1 and E2, are defined as the horizontal and vertical lengths, respectively, of the two-dimensional images displayed in each display screen, and F1 and F2 are defined as the distances between the display screens and the viewing points.
85. The device of claim 84, wherein the viewing points comprise the points where each of the observer's eyes is located, the points being substantially perpendicular to the center points of the displayed images, respectively.
86. The device of claim 84, wherein the device comprises a cellular phone, a personal digital assistant, a handheld personal computer (PC), or an International Mobile Telecommunications-2000 (IMT-2000) terminal.
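The numeric relationships claimed above (moving the displayed images until the gap between their center points equals the interocular distance W_a, and matching each screen ratio D:E:F to the photographing ratio A:B:C) can be sketched in code. This is an illustrative reading only, not text from the patent; the function names, the even split of the shift between the two images, and the comparison tolerance are assumptions.

```python
def image_shift(w_a_mm: float, current_center_gap_mm: float) -> float:
    """Per-image horizontal shift (mm) so the distance between the center
    points of the two displayed images becomes W_a (claims 75/76 style).
    Assumes the remaining gap is split evenly between the two images."""
    return (w_a_mm - current_center_gap_mm) / 2.0


def ratios_match(photo_ratio, screen_ratio, tol=1e-3):
    """Claim-84-style check: the screen ratio (D:E:F) is 'substantially
    the same' as the photographing ratio (A:B:C) when the two triples
    are proportional. Normalize each triple by its distance term and
    compare component-wise within a tolerance."""
    a, b, c = photo_ratio
    d, e, f = screen_ratio
    return all(abs(x / c - y / f) < tol for x, y in ((a, d), (b, e), (c, f)))


# Example: a 65 mm interocular distance with a current 60 mm center gap.
print(image_shift(65.0, 60.0))               # 2.5 mm shift per image
# A screen ratio of 8:6:20 is proportional to a photographing ratio 4:3:10.
print(ratios_match((4, 3, 10), (8, 6, 20)))  # True
```

With these assumptions, doubling all three components of a photographing ratio leaves the proportions intact, which is why the 8:6:20 screen matches the 4:3:10 shot while, say, 5:3:10 would not.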
CNA038090252A 2002-02-27 2003-02-26 Method and system for displaying stereoscopic image Pending CN1647114A (en)

Applications Claiming Priority (22)

Application Number Priority Date Filing Date Title
KR2002/10422 2002-02-27
KR20020010424 2002-02-27
KR2002/10423 2002-02-27
KR20020010422 2002-02-27
KR2002/10424 2002-02-27
KR20020010423 2002-02-27
US10/280,239 US20030117395A1 (en) 2001-08-17 2002-10-24 Method and system for calculating a photographing ratio of a camera
US10/280,465 2002-10-24
US10/280,248 2002-10-24
US10/280,251 2002-10-24
US10/280,419 2002-10-24
US10/280,419 US7084838B2 (en) 2001-08-17 2002-10-24 Method and system for controlling the motion of stereoscopic cameras using a three-dimensional mouse
US10/280,465 US7091931B2 (en) 2001-08-17 2002-10-24 Method and system of stereoscopic image display for guiding a viewer's eye motion using a three-dimensional mouse
US10/280,239 2002-10-24
US10/280,344 2002-10-24
US10/280,248 US20030107646A1 (en) 2001-08-17 2002-10-24 Method and system for adjusting display angles of a stereoscopic image based on a camera location
US10/280,436 2002-10-24
US10/280,179 2002-10-24
US10/280,251 US20030107643A1 (en) 2001-08-17 2002-10-24 Method and system for controlling the motion of stereoscopic cameras based on a viewer's eye motion
US10/280,179 US7190825B2 (en) 2001-08-17 2002-10-24 Portable communication device for stereoscopic image display and transmission
US10/280,436 US20030122925A1 (en) 2001-08-17 2002-10-24 Method and system for providing the motion information of stereoscopic cameras
US10/280,344 US20030112508A1 (en) 2001-08-17 2002-10-24 Method and system for controlling space magnification for stereoscopic images

Publications (1)

Publication Number Publication Date
CN1647114A true CN1647114A (en) 2005-07-27

Family

ID=27767994

Family Applications (2)

Application Number Title Priority Date Filing Date
CNA038090252A Pending CN1647114A (en) 2002-02-27 2003-02-26 Method and system for displaying stereoscopic image
CNA038090260A Pending CN1647115A (en) 2002-02-27 2003-02-26 Method and system for controling a stereo camera

Family Applications After (1)

Application Number Title Priority Date Filing Date
CNA038090260A Pending CN1647115A (en) 2002-02-27 2003-02-26 Method and system for controling a stereo camera

Country Status (5)

Country Link
EP (2) EP1485875A4 (en)
CN (2) CN1647114A (en)
AU (2) AU2003215852A1 (en)
CA (2) CA2476610A1 (en)
WO (2) WO2003073738A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101859580A (en) * 2009-04-03 2010-10-13 索尼公司 Information processing apparatus, information processing method and program
CN101060642B (en) * 2006-04-21 2010-12-15 三星电子株式会社 Method and apparatus for generating 3d on screen display
CN103248910A (en) * 2012-02-09 2013-08-14 三星电子株式会社 Three-dimensional imaging system and image reproducing method thereof
CN108069307A (en) * 2016-11-14 2018-05-25 杭州海康威视数字技术股份有限公司 Method and device for counting the number of people in an elevator
WO2018113657A1 (en) * 2016-12-19 2018-06-28 纳恩博(北京)科技有限公司 Image processing method and device

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101606177B (en) * 2007-01-04 2013-07-17 鸣川肇 Information processing method
EP1975752B1 (en) * 2007-03-30 2010-07-07 Abb Research Ltd. A method for operating remotely controlled cameras in an industrial process
GB2480193B (en) * 2009-01-19 2015-01-21 Minoru Inaba Stereoscopic video imaging display system
CN103459991A (en) 2011-01-31 2013-12-18 维泰克实验室技术股份有限公司 Bottle dispenser having a digital volume display
US20140002642A1 (en) 2012-06-15 2014-01-02 Elmar SWIEGOT Absolute position detection
CA3086592A1 (en) 2017-08-30 2019-03-07 Innovations Mindtrick Inc. Viewer-adjusted stereoscopic image display
KR102377499B1 (en) * 2018-02-08 2022-03-24 이노베이션스 마인드트릭 인크. Viewer-adjustable stereoscopic image display
US10701342B2 (en) * 2018-02-17 2020-06-30 Varjo Technologies Oy Imaging system and method for producing images using cameras and processor

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63244992A (en) * 1987-03-31 1988-10-12 Toshiba Corp Stereoscopic display device
JP2701317B2 (en) * 1988-05-16 1998-01-21 ミノルタ株式会社 Camera photometry and focus adjustment device
US4962422A (en) * 1988-06-23 1990-10-09 Kabushiki Kaisha Topcon Stereoscopic image display apparatus
US4987487A (en) * 1988-08-12 1991-01-22 Nippon Telegraph And Telephone Corporation Method of stereoscopic images display which compensates electronically for viewer head movement
US5119123A (en) * 1989-03-16 1992-06-02 Minolta Camera Kabushiki Kaisha Camera having a remote control function
US5175616A (en) * 1989-08-04 1992-12-29 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Of Canada Stereoscopic video-graphic coordinate specification system
JPH05292543A (en) * 1992-04-15 1993-11-05 Komatsu Ltd Visual device
JPH08511631A (en) * 1993-05-04 1996-12-03 ゼノテック リサーチ プロプライエタリー リミテッド 3D image display unit
JP3387624B2 (en) * 1994-05-20 2003-03-17 キヤノン株式会社 3D display device
JP3509282B2 (en) * 1994-05-24 2004-03-22 セイコープレシジョン株式会社 Camera ranging device
AUPN003894A0 (en) * 1994-12-13 1995-01-12 Xenotech Research Pty Ltd Head tracking system for stereoscopic display apparatus
JP3783977B2 (en) * 1997-02-17 2006-06-07 キヤノン株式会社 3D image device and 3D image display method
DE19836681B4 (en) * 1997-09-19 2008-03-27 Carl Zeiss Ag Stereoscopic recording and playback system
AUPO981997A0 (en) * 1997-10-15 1997-11-06 Lions Eye Institute Of Western Australia Incorporated, The Stereo optic disc analyser
US6271808B1 (en) * 1998-06-05 2001-08-07 Silicon Light Machines Stereo head mounted display using a single display device


Also Published As

Publication number Publication date
AU2003224349A1 (en) 2003-09-09
EP1479046A2 (en) 2004-11-24
EP1485875A4 (en) 2006-12-20
WO2003073738A2 (en) 2003-09-04
EP1485875A2 (en) 2004-12-15
CA2476612A1 (en) 2003-09-04
WO2003073739A3 (en) 2003-12-24
CA2476610A1 (en) 2003-09-04
WO2003073739A2 (en) 2003-09-04
AU2003215852A8 (en) 2003-09-09
AU2003215852A1 (en) 2003-09-09
AU2003224349A8 (en) 2003-09-09
WO2003073738A3 (en) 2003-12-04
CN1647115A (en) 2005-07-27
EP1479046A4 (en) 2007-01-31

Similar Documents

Publication Publication Date Title
CN1110943C (en) Video camera system having remote commander
CN1191558C (en) Apparatus and method for map display, computer program of said display apparatus
CN1815409A (en) System, image processing apparatus, and information processing method
CN1647114A (en) Method and system for displaying stereoscopic image
CN1289939C (en) Stereoscopic image display device
CN1614505A (en) Three-dimensional image display device, portable terminal device, display panel and fly eye lens
CN1153995C (en) Mini-Zoom projection lenses for use with pixelized panels
CN1153362A (en) Methods for creating image for three-dimensional display, for calculating depth information, and for image processing using depth information
CN1268122C (en) Display system, network answering display device, terminal apparatus and controlling program
CN1725828A (en) Information display system
CN100346352C (en) Image display apparatus and short film generation apparatus
CN1652575A (en) Head-mounted camera
CN1725829A (en) Information display system
CN101048811A (en) Display controller and display device and method
CN101079953A (en) Information processing system, information processing device, information processing method, and program
CN1794037A (en) Optical scanning device and image display apparatus using the same
CN1717065A (en) Information processing apparatus and method, recording medium, program, and display device
CN1703099A (en) Display apparatus displaying three-dimensional image and display method for displaying three-dimensional image
CN101064777A (en) Command system, imaging system, command device, imaging method, command processing method, and program
CN1910577A (en) Image file list display device
CN1476574A (en) Eyeglasses order/sale system over network and its method
CN1743840A (en) Ultrasound probe diagnosing apparatus, ultrasound diagnostic apparatus, and ultrasound probe diagnosing method
CN1867940A (en) Imaging apparatus and image processing method therefor
CN1914919A (en) Detection range adjustment device
CN1707308A (en) Head-mounted display apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication