US9667951B2 - Three-dimensional television calibration - Google Patents
- Publication number
- US9667951B2 (application US14/182,671)
- Authority
- US
- United States
- Prior art keywords
- user
- offset
- images
- screen
- perceived
- Prior art date
- 2014-02-18
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires 2035-07-11
Classifications
- H04N13/0425—
  - H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
  - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
  - H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
  - H04N13/30—Image reproducers
  - H04N13/327—Calibration thereof
- H04N13/0022—
  - H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
  - H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
  - H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
  - H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
  - H04N13/106—Processing image signals
  - H04N13/128—Adjusting depth or disparity
Definitions
- the present invention relates to methods and apparatus for calibrating three-dimensional television systems.
- Display technologies are integral to most electronic devices, being used both for watching media such as television (TV) programs and for graphical user interfaces (UIs) for computers, mobile phones and other electronic devices.
- 3D imaging works by tricking the eye into perceiving depth information through two or more images.
- Stereoscopic imaging techniques utilized for moving images such as TV involve displaying each image to be viewed by the viewer as two images: one image is arranged to be viewed by the right eye and another one is arranged to be viewed by the left eye.
- the left and right images differ slightly such that when they reach each eye the viewer can extract depth information from the images.
- Each portion of the left and right images contains views of objects captured from subtly different perspectives. As a result, corresponding views are offset by a certain number of pixels, in accordance with the depth to be perceived by the viewer.
- FIG. 1 is a simplified block diagram illustration of a three-dimensional television system constructed and operative in accordance with an embodiment of the present invention.
- FIG. 2A is a descriptive illustration showing different types of parallax in accordance with an embodiment of the present invention.
- FIG. 2B is a descriptive illustration showing vergence movements in accordance with an embodiment of the present invention.
- FIG. 2C is a descriptive illustration showing the accommodation process in accordance with an embodiment of the present invention.
- FIG. 3 is a descriptive illustration showing the stereoscopic parameters in accordance with an embodiment of the present invention.
- FIGS. 4A and 4B are pictorial illustrations showing a calibration procedure in accordance with an embodiment of the present invention.
- FIG. 5 is a pictorial illustration of a calibration procedure in accordance with another embodiment of the present invention.
- a method includes displaying a test pattern on a display device associated with a client device, the test pattern comprising a stereoscopic image having first and second images and depth characteristics associated with an offset between the first and second images; adjusting the offset between the first and second images in response to inputs received from a user of the client device viewing said stereoscopic image from at least one viewing position; and storing the adjusted offset in a storage device.
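By way of non-limiting illustration, the following sketch outlines such a calibration flow in Python; the display, remote and storage objects and their method names are placeholders assumed for this sketch rather than an actual client device API:

```python
def run_calibration(display, remote, storage):
    """Hypothetical sketch of the calibration flow: show a stereoscopic test
    pattern, let the user adjust the offset between its first and second
    images from the viewing position, then store the adjusted offset."""
    offset_px = 0
    display.show_test_pattern(offset_px)        # render the stereoscopic test pattern
    while True:
        key = remote.read_key()                 # input from the user's remote control
        if key == "LEFT":
            offset_px -= 1                      # decrease the offset between the two images
        elif key == "RIGHT":
            offset_px += 1                      # increase the offset between the two images
        elif key == "OK":
            break                               # user is satisfied with the perceived depth
        display.show_test_pattern(offset_px)    # re-render with the adjusted offset
    storage.save("adjusted_offset", offset_px)  # persist for later display of 3D streams
    return offset_px
```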
- the 3D effect is usually determined according to a fixed image size captured and/or processed before transmission.
- Current 3D-TV systems propose different ways to adjust the 3D settings on a TV set in order to calibrate the 3D effect.
- all of them require the user to adjust the 3D settings by manually entering a plurality of parameters (e.g. the viewing distance, the screen size, etc.) via a control menu displayed on the TV screen or monitor.
- the user also needs to take measurement(s) by hand of these different parameters prior to adjusting the 3D settings.
- the calibration is usually done “on the fly” by using the TV program (e.g. TV broadcast program, recorded program, Video-On-Demand program, etc.) currently being rendered on the TV display screen, without any consideration of whether or not this TV program is suitable for calibrating the stereoscopic effect. Even with these adjustments, the rendered 3D effect may not be adapted to the particular viewing conditions and audience characteristics, thereby leading to a disappointing 3D experience.
- FIG. 1 is a simplified block diagram illustration of a three-dimensional television system constructed and operative in accordance with an embodiment of the present invention.
- a headend 110 typically communicates with a plurality of client devices via a communications network 130 . Additionally or alternatively, a plurality of headends communicates with a single client device 120 or with a plurality of client devices via the communications network 130 . For simplicity of depiction and description, and without limiting the generality of the invention, only one client device 120 is illustrated in FIG. 1 .
- the communication network 130 is a one-way or two-way communication network that includes at least one of the following: a satellite based communication network; a cable based communication network; a conventional terrestrial broadcast television network; a telephony based communication network; a telephony based television broadcast network; a mobile-telephony based television broadcast network; an Internet Protocol (IP) television broadcast network; and a computer based communication network.
- the communication network 130 may, for example, be implemented by a one-way or two-way hybrid communication network, such as a combination cable-telephone network, a combination satellite-telephone network, a combination satellite-computer based communication network, or by any other appropriate network.
- Other ways of implementing the communication network 130 will be apparent to someone skilled in the art.
- the 3D-TV system 100 of FIG. 1 comprises a client device 120 disposed between a headend 110 and a display device 140 .
- Client device 120 comprises a storage device, such as a hard disk or high capacity memory.
- Client device 120 is coupled to a display device 140 .
- Client device 120 typically further comprises a tuner, a demultiplexer, a decoder, a descrambler, a receiver and a processor. It is appreciated that the client device 120 comprises standard hardware components and software components, as is well known in the art.
- Client device 120 is typically connected in operation to display device 140 via a digital AV interface (e.g. HDMI, DVI, etc.) or via an analogue AV interface (e.g. component (RGB, YPbPr), composite (NTSC, PAL, SECAM), S-video, SCART, RF coaxial, D-Terminal (D-tanshi) etc.). While shown as separate entities in FIG. 1 , the client device 120 may be integral with the display device 140 in other embodiments of the present invention.
- Display device 140 is typically operated by a user, for example via a remote control unit (RCU) 150 .
- Using the RCU 150, a user can interact with a User Interface (UI) or an Electronic Program Guide (EPG) displayed on the display device 140 and select AV content to view, such as a live event broadcast, a Video-On-Demand (VOD) asset, a recorded event, etc.
- the operating system software within client device 120 monitors user interaction with display device 140 and/or client device 120 and maintains a database of service and event information.
- the operating system software enables the user to choose an event/service to view.
- the RCU 150 is operable to communicate with the receiver of the client device 120 using any suitable connectivity link (e.g. wireless connection over Internet Protocol).
- the RCU 150 may be, for example, but without limiting the generality of the invention, a traditional remote control, a laptop computer, a desktop or personal computer (PC), a tablet computer such as an iPad™, a mobile computing device such as a Personal Digital Assistant (PDA), a mobile phone, or any suitable handheld device.
- the headend 110 of the 3D-TV system 100 is further able to provide a stereoscopic 3D image stream in the form of a set of consecutive stereoscopic images to be transmitted to the client device 120 .
- the perception of depth that is associated with 3D images is achieved by a pair of similar two-dimensional (2D) images captured from slightly different perspectives and thus slightly offset from each other.
- the offset of the two images, which in turn determines the perceived depth of the image, is determined in accordance with a fixed image size before transmission. For example, the depth may be determined to be displayed on a 3D-TV set of a particular size.
- Another 3D broadcasting technique consists of supplying a disparity map along with a 2D image stream.
- This disparity map or depth map (these expressions being used interchangeably) comprises information relating to the distance of the surfaces of scene objects from a viewpoint and typically comprises a set of values (e.g. one for each image pixel location) representing a pixel translation to apply to the 2D image stream in order to generate a further 2D image stream.
- the two 2D image streams would then combine to form a stereoscopic 3D image stream.
- the disparity map is useful in situations where scaling is needed since it is usually observed that the perceived depths are not scaled consistently. Therefore, having this disparity map enables the client device 120 to refine the 3D effect according to the display size of the 3D-TV set.
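As a non-limiting illustration of how a second view may be generated from a 2D image stream and its disparity map, the following sketch assumes a NumPy image array and a per-pixel horizontal translation in pixels; a practical renderer would additionally handle occlusions and hole filling:

```python
import numpy as np

def synthesize_second_view(image, disparity):
    """Generate the second view of a stereoscopic pair by translating each
    pixel of the 2D image horizontally by its disparity value.
    image: (H, W, 3) array; disparity: (H, W) array of integer pixel shifts."""
    h, w = disparity.shape
    out = np.zeros_like(image)
    cols = np.arange(w)
    for y in range(h):
        target = np.clip(cols + disparity[y], 0, w - 1).astype(int)  # destination column of each pixel
        out[y, target] = image[y, cols]                               # copy pixels to their shifted position
    return out
```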
- the client device 120 then performs any processing required to display the received 3D image stream on the display device 140 . This processing may include:
- a user of such a 3D-TV system 100 is able to customize a pixel offset to be used to display the 3D image stream according to the viewing conditions and audience characteristics of a particular household during a calibration procedure.
- the customized pixel offset obtained during the calibration procedure is applied to the right and left images of a received 3D image stream (either the right and left images provided by the headend 110 ; or the right (respectively left) image provided by the headend 110 and the left (respectively right) image generated using the disparity map) at the time of display.
- the calibration procedure may be done either during the installation phase, i.e. at the time when the user first installs his 3D-TV system 100, or at a later time, automatically or upon user request.
- the user is able to run a calibration application that may be locally executed on the client device 120 .
- This calibration application is an interactive application that may be written in any suitable programming language (e.g. native, C/C++, Java, HTML, Flash, etc.) and may be enabled by various ecosystems (e.g. middleware, download, web browser, runtime engine, widget, application store, etc.) as long as it is operable to run on the client device 120 .
- the calibration application does not depend on the content itself and therefore may run for different types of 3D video content such as live broadcasts, VOD programs, BluRay, etc.
- the application may be launched by the user using an RCU 150 via a dedicated menu in the UI rendered on the display device 140 .
- the calibration application gives the user—wearing 3D glasses if necessary—the opportunity to adjust the pixel offset between first (e.g. left) and second (e.g. right) images of a stereoscopic image through one or more interactive test patterns.
- the adjusted offset may then be stored on a storage device of the client device 120 and be used later to display the stereoscopic images of the 3D image stream. It will be apparent to someone skilled in the art, that such a calibration procedure may be repeated or updated as often as needed in response to a user's input or to a change in the viewing conditions and/or in the audience characteristics.
- the application typically enables a user to adjust the pixel offset between the first and second images of a stereoscopic image by reacting to/interacting with one or more interactive test patterns.
- This pixel offset is also known as the horizontal parallax. Calibrating the horizontal parallax has an influence on many stereoscopic parameters such as, for example, the intra-ocular distance, the parallax and the vergence.
- the intra-ocular distance defines the distance between the viewer's eyes. It is apparent that this distance varies from one person to another.
- the solution adopted by the 3D industry to address this issue is to use average values for every representative group of people, i.e. adults, teenagers and children.
- the typical average value selected for an adult is 65 mm (the maximum accepted value being 73 mm) and 50 mm for a child.
- the intra-ocular distance applied in theaters is usually 63.5 mm in order to be representative of a maximum number of people.
- the parallax expresses the offset between the apparent angles and positions of an object due to the distance between the eyes of the observer. Thus, the perceived 3D effect depends on both the amount and the type of parallax. There are three different types of parallax as shown in FIG. 2A :
- the vergence is the simultaneous movement of both eyes in opposite directions to obtain (or maintain) binocular vision.
- the two eyes converge to point at the same object as illustrated on FIG. 2B .
- When a person with binocular vision looks at an object, the eyes typically rotate around a vertical axis so that the image projection is at the center of the retinas. To look at a closer object, the eyes rotate towards each other (convergence), while for an object located farther away, the eyes rotate away from each other (divergence). Exaggerated convergence is also known as cross-eyed viewing, as shown in FIG. 2B.
- FIG. 3 is a pictorial illustration showing the stereoscopic parameters in accordance with an embodiment of the present invention.
- a user 101 is shown located at a viewing distance Z d in front of the display device 140 .
- a 3D object 102 with a horizontal parallax (pixel offset) P is perceived at depth Z v .
- This depth Z v is given by the following equation:
- e is the user's intra-ocular distance.
- Zv_near denotes the perceived depth of an object near to the user 101.
- Zv_far denotes the perceived depth of an object far from the user 101.
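The equation referenced above is not reproduced here; one standard form, stated as an assumption consistent with the worked example given later in the description rather than as a quotation, is:

$$Z_v = \frac{e \cdot Z_d}{e - P}$$

so that P = 0 places the object on the screen plane (Zv = Zd), P = e pushes it to infinity, and a negative P brings it in front of the screen, in agreement with the positive, zero and negative parallax cases described elsewhere in this document.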
- Δαmax = tan⁻¹(e/Zv_near) − tan⁻¹(e/Zv_far) ≈ e·(1/Zv_near − 1/Zv_far)
- Δαmax denotes the maximal parallax angle between near and far objects which enables a distortion-free fusing of stereoscopic images.
- the maximal parallax ΔPmax allowed by a stereoscopic 3D display device may be obtained by: ΔPmax ≈ Zd·Δαmax
- ΔPrel mainly depends on a psycho-optical component, where Δαmax describes the maximal parallax at which the stereo images can be fused without visible distortions.
- the ratio Zd/WD of the viewing distance to the display width may vary considerably, in a range from 1.00 to 4.00.
- the resulting range of ΔPrel is between 1/50 and 1/12.
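A minimal sketch of how these figures relate, assuming that ΔPrel denotes the maximal parallax ΔPmax normalized by the display width WD:

$$\Delta P_{rel} = \frac{\Delta P_{max}}{W_D} \approx \frac{Z_d}{W_D} \cdot \Delta\alpha_{max}$$

With Zd/WD between 1.00 and 4.00 and a maximal parallax angle on the order of one degree, ΔPrel then falls roughly within the quoted 1/50 to 1/12 range.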
- the depth reproduction is not necessarily linear.
- the depth reproduction is linear in a situation where the display setup is such that infinity parallax equals the intra-ocular distance e. Otherwise, stereoscopic distortions appear such that foreground scene elements are more elongated than background scene elements, or vice versa.
- FIGS. 4A and 4B are pictorial illustrations showing a calibration operation in accordance with an embodiment of the present invention.
- a user standing at a particular viewing position may launch a calibration application as explained hereinabove.
- the application typically shows a test pattern enabling the user 101 to calibrate the stereoscopic 3D effect.
- the test pattern comprises first (e.g. left) and second (e.g. right) images (not shown) of a stereoscopic image having depth characteristics associated with the pixel offset between the first and second images.
- the test patterns show an object 102 such as, for example, but without limiting the generality of the invention, a ball (as illustrated in FIGS. 4A and 4B ) that the user 101 is asked to position at one or more perceived distances.
- the perceived distances at which the user 101 is asked to position the object 102 may be any suitable distance enabling a stereoscopic calibration without any stereoscopic distortions.
- the calibration application may also be configured so that the user 101 is able to position the object 102 at a plurality of perceived distances such as, but not limited to, the object 102 appearing on the plane of the display screen 140, the object appearing at infinity (i.e. at a maximum distance for which the object can be seen behind the screen without stereoscopic distortions), or as close as possible to the user 101 (i.e. at a maximum distance for which the object can be seen in front of the screen without stereoscopic distortions).
- the user 101 typically positions the object 102 using relevant keys on a RCU 150 thereby adjusting the pixel offset between the first (e.g. Left) and second (e.g. Right) images of the stereoscopic image.
- Positioning the object 102 at different perceived distances through the calibration application enables the calculation of an average adjusted pixel offset customized according to the user's current viewing position (e.g. viewing distance Z d —distance between the user 101 and the display screen 140 —and angular position); and physical characteristics (e.g. intra-ocular distance e).
- the average adjusted pixel offset is typically obtained by calculating the average value of the adjusted pixel offset values corresponding to the several pre-defined positions at which the object 102 is to be rendered during the calibration.
- the average adjusted pixel offset is stored on a storage device of the client device 120 to be applied later to the stereoscopic images of the 3D image stream.
- the left and right images of the 3D image stream are shifted by the value of the average adjusted pixel offset for display.
- a simple pixel offset of the stereoscopic images enables recalibrating the stereoscopic 3D effect. Indeed, all the depth characteristics are impacted by a simple pixel offset of the stereoscopic images.
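A minimal sketch of applying a stored pixel offset at display time, assuming NumPy image arrays; splitting the offset evenly between the left and right images is an assumption of this sketch rather than a requirement of the system:

```python
import numpy as np

def shift_horizontally(img, shift):
    """Shift an (H, W, C) image horizontally by `shift` pixels,
    filling the exposed border with black instead of wrapping."""
    out = np.zeros_like(img)
    if shift > 0:
        out[:, shift:] = img[:, :-shift]
    elif shift < 0:
        out[:, :shift] = img[:, -shift:]
    else:
        out[:] = img
    return out

def apply_pixel_offset(left, right, offset_px):
    """Recalibrate the stereoscopic effect by shifting the left and right
    images of the 3D image stream in opposite directions by the stored offset."""
    half = offset_px // 2
    return shift_horizontally(left, -half), shift_horizontally(right, offset_px - half)
```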
- the pixel size depends on the display screen size (e.g. usually given as a diagonal measurement by TV manufacturers, i.e. the distance between two diagonally opposite screen corners), the aspect ratio and the screen resolution, and all of these are provided to the calibration application as known configuration parameters.
- the calculations are performed based on the following configuration parameters of the display screen 140: the diagonal, the aspect ratio and the screen resolution.
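A minimal sketch of deriving the pixel size from these parameters, and hence the physical parallax corresponding to a pixel offset, assuming the usual geometric relation between diagonal, aspect ratio and screen width (the width convention behind the example values given later in the description may differ slightly):

```python
import math

def pixel_size_cm(diagonal_inch, aspect_w, aspect_h, horizontal_resolution):
    """Physical size of one pixel (cm) from diagonal, aspect ratio and resolution."""
    diagonal_cm = diagonal_inch * 2.54
    width_cm = diagonal_cm * aspect_w / math.hypot(aspect_w, aspect_h)  # screen width
    return width_cm / horizontal_resolution

def parallax_cm(pixel_offset, diagonal_inch, aspect=(16, 9), h_res=1920):
    """Physical parallax P corresponding to a pixel offset: P = offset * pixel_size."""
    return pixel_offset * pixel_size_cm(diagonal_inch, aspect[0], aspect[1], h_res)

print(pixel_size_cm(42, 16, 9, 1920))  # ≈ 0.048 cm per pixel for a 42", 16:9, 1920x1080 screen
```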
- the 3D-TV calibration system 100 provides an improved 3D experience to the user 101 while keeping the calibration procedure as simple as possible.
- This 3D-TV calibration also provides a customized 3D experience by taking the viewing conditions and the audience characteristics of a particular household into consideration.
- FIG. 5 is a pictorial illustration of a calibration procedure in accordance with another embodiment of the present invention.
- the calibration application enables the user 101 to calibrate his 3D-TV system 100 to take into consideration further viewing positions.
- the stereoscopic 3D effect can be calibrated for different angular viewing positions.
- the calibration application typically shows a further test pattern customized for calibrating the angular viewing positions.
- the further test pattern comprises left and right images of a stereoscopic image showing a plurality of objects 103, 104, 105 and 106 at different perceived depths, as shown in FIG. 5.
- the user 101 is first asked to sit at a first viewing position (e.g. viewpoint A) and align a first object 103 perceived at a first depth (e.g. a star) with a second object 106 perceived at a second depth (e.g. dark grey box). Then, the user 101 is asked to sit at a second viewing position (e.g. viewpoint B) and align a first object 103 perceived at a first depth (e.g. a star) with a second object 104 perceived at a second depth (e.g. light grey box).
- the first and second viewing positions may be the extreme viewing positions (i.e. the outermost positions, such as viewpoints A and B in FIG. 5).
- the user 101 typically uses a RCU 150 to do this and in turn, an adjusted pixel offset value is obtained for each angular viewing position. Then, these further values may be used in relation to the ones obtained from the first test pattern to calculate the average adjusted offset value prior to storing it on the storage device of the client device 120 . Additionally and/or alternatively, the user 101 may be asked to center vertically the first object 103 perceived at a first depth with the second objects 104 - 106 perceived at a second depth thereby adjusting the vertical parallax.
- the calibration application is able to store the adjusted pixel offset values and/or the average adjusted pixel offset values calculated during the different calibration procedures described in relation to FIGS. 4A, 4B and 5 with identification data relevant to the user ( 101 ) who used the calibration application.
- This identification data typically identifies a particular user and/or type/category of user.
- This identification data may be requested by the calibration application during the calibration procedure and may be in any suitable form.
- the user 101 may enter or be requested to enter a name, a surname, a user name, etc. as identification data for a particular user and/or a category such as, for example but without limiting the generality of the invention, adult, teenager, child, etc. as identification data for a type of user.
- the calibration application is then able to generate different profiles using the stored adjusted pixel offset values and/or the average adjusted pixel offset values corresponding to different users and viewing positions using the identification data. For example, a user 101 may calibrate his 3D-TV system 100 for different viewing positions and indicate ‘user 1’ as identification data. Therefore, the calibration application may generate several profiles and/or sub-profiles for the same user 101:
- a user 101 may calibrate his 3D-TV system 100 for different viewing positions and indicate ‘adult’ as identification data. Therefore, the calibration application may generate several profiles and/or sub-profiles for this type of user:
- the user 101 does not have to calibrate the stereoscopic effect rendered by his 3D-TV system 100 each time he is watching his display device 140 . Rather, when a user 101 starts watching TV, he is able to select a profile relevant to his current viewing position. Then, when the client device 120 receives and subsequently processes a 3D image stream, the calculated adjusted pixel offset values or the average adjusted pixel offset values stored in the storage of client device 120 associated with the selected profile may be retrieved and applied to the 3D image stream. Typically, the left and right images of the 3D image stream are shifted by the value of the retrieved calculated adjusted pixel offset or the retrieved average adjusted pixel offset for display.
- a household typically comprises different types of users.
- a household may, for example, comprise a man, a woman and a child, each of them having different TV viewing habits, viewing positions and intra-ocular distances.
- each viewer is able to calibrate the 3D-TV system 100 using the calibration application. Therefore, the stereoscopic 3D effect may be calibrated for different types of users and viewing positions.
- the calibration application may store the different users' (sub-)profiles and use them when requested.
- the calibration application is further able to generate, automatically or on a per user request, and store additional profiles corresponding to different types of audiences using identification data relevant to the type of users. This is particularly useful in situations where different types of users are watching TV together from different viewing positions.
- the calibration application is typically able to use the adjusted pixel offset values and/or average adjusted pixel offset values to generate audience profiles. Many different types of audience profiles may be generated.
- the calibration application may generate an audience profile using at least two of the stored users' profiles; and/or using all of the stored users' profiles; and/or may generate an audience profile for adults using the users' profiles corresponding to adults users; and/or an audience profile for children using the users' profiles corresponding to children users; and/or a mixed audience profile combining adults and children users' profiles.
- a household comprises n adults and m children.
- Different users’ profiles corresponding to the household’s audience are available, and the calibration application has determined an average adjusted pixel offset for the adults (adults_offsetn) and for the children (children_offsetm), as set out below.
- the calibration application may generate a plurality of additional audience profiles corresponding to:
- mixed_audience1 = (adults_offsetn + children_offsetm)/2
- the coefficients ½, ¼ and ¾ are not limiting, and the calibration application may use any suitable coefficient or method to generate mixed audience profiles and/or various additional audience profiles giving more prominence to one or more categories of users.
- the generation of audience profiles is not limited to two categories of users (as shown in the example hereinabove).
- the calibration application is able to generate a plurality of additional audience profiles for more than two categories of users.
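A minimal sketch of such a combination, generalized to an arbitrary number of user categories through weights; the category names and example offset values below are placeholders rather than values from the description:

```python
def audience_offset(category_offsets, weights=None):
    """Combine per-category average adjusted pixel offsets into a single
    audience offset using a weighted average (equal weights by default)."""
    if weights is None:
        weights = {cat: 1.0 for cat in category_offsets}
    total = sum(weights[cat] for cat in category_offsets)
    return sum(category_offsets[cat] * weights[cat] for cat in category_offsets) / total

offsets = {"adults": 12.0, "children": 18.0}                   # example offsets in pixels
print(audience_offset(offsets))                                 # mixed audience: (12 + 18) / 2 = 15.0
print(audience_offset(offsets, {"adults": 1, "children": 3}))   # children-oriented: 1/4 and 3/4 weights -> 16.5
```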
- when a user 101 starts watching TV, he may be able to select a profile relevant to the current audience. Then, when the client device 120 receives and subsequently processes a 3D image stream, the calculated average pixel offset values stored in the storage of the client device 120 associated with the selected audience profile may be retrieved and applied to the 3D image stream. Typically, the left and right images of the 3D image stream are shifted by the value of the retrieved calculated adjusted pixel offset for display.
Description
- merely scaling the set of consecutive stereoscopic images (each stereoscopic image having left and right images provided by the headend 110) for display on the display device 140; or
- generating a 2D image stream using the disparity map and the initial 2D image stream provided by the headend 110 to produce a 3D image stream comprising a set of consecutive stereoscopic images, and scaling the produced 3D image stream for display on the display device 140.
- Positive parallax, for which corresponding image points are said to have positive (or uncrossed) parallax P when the point in the right eye view lies further to the right than the corresponding point in the left eye view. In such a situation, the related viewing rays converge at a 3D point so that the reproduced 3D scene is perceived behind the screen. In addition, it is to be noted that in a situation where the parallax P is equal to the viewer's intra-ocular distance, the 3D scene is reproduced at infinity. This also means that the allowed maximum value for the positive parallax is limited to the viewer's intra-ocular distance;
- Zero parallax, for which corresponding image points lie at the same position in the left and right eye views. The resulting 3D point is therefore observed on the screen plane. This situation is typically referred to as the Zero Parallax Setting (ZPS); and
- Negative parallax, for which corresponding image points are said to have negative (or crossed) parallax P when the point in the right eye view lies further to the left than the corresponding point in the left eye view. In such a situation, the related viewing rays converge at a 3D point so that the reproduced 3D scene is perceived in front of the screen.
ΔPmax ≈ Zd·Δαmax

ΔPrel mainly depends on a psycho-optical component, where Δαmax describes the maximal parallax at which the stereo images can be fused without visible distortions. Hence, it is possible to deduce a related rule of thumb for the relative parallax range ΔPrel: the ratio Zd/WD may vary considerably, in a range from 1.00 to 4.00, and the resulting range of ΔPrel is between 1/50 and 1/12.
The perceived depth Zv at which an object displayed with parallax P is seen by a user located at viewing distance Zd is given by:

Zv = e·Zd/(e − P)

where e is the user's intra-ocular distance, and the parallax P is related to the pixel offset between the left and right images by:

P = pixel_offset·pixel_size

By way of example, consider the following configuration parameters of the display screen 140:

- diagonal: 42 inches or 106.68 cm;
- aspect ratio: 16:9; and
- screen resolution: 1920×1080.

Adjusted parallaxes of

P1 = −1.335 cm and P2 = 0.9345 cm

(corresponding to adjusted pixel offsets of −30 and +21 pixels at a pixel size of 0.0445 cm), with Zv1 = Zd−25 (the object 102 perceived 25 cm in front of the screen) and with Zv2 = Zd+25 (the object 102 perceived 25 cm behind the screen), provide two equations in the two unknowns Zd and e. Solving them yields:

Zd = 141.67 cm

e = 6.23 cm
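A minimal sketch of the arithmetic behind this example, assuming the depth relation Zv = e·Zd/(e − P) used above:

```python
def solve_viewing_geometry(p1_cm, p2_cm, d_cm=25.0):
    """Recover the viewing distance Zd and intra-ocular distance e from two
    adjusted parallaxes: P1 for an object perceived d_cm in front of the
    screen (Zv1 = Zd - d) and P2 for an object perceived d_cm behind it
    (Zv2 = Zd + d), assuming P = e * (Zv - Zd) / Zv."""
    # P1 = -e*d/(Zd - d) and P2 = e*d/(Zd + d): two equations, two unknowns.
    zd = d_cm * (p2_cm - p1_cm) / (-(p1_cm + p2_cm))
    e = p2_cm * (zd + d_cm) / d_cm
    return zd, e

print(solve_viewing_geometry(-1.335, 0.9345))  # ≈ (141.67, 6.23), matching the values above
```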
TABLE 1 below lists, for various display sizes, the pixel offsets corresponding to viewing distances Zd of 150, 200, 250 and 300 cm and object distances Zv of 175, 225, 275 and 325 cm respectively:

Diagonal (inch) | Diagonal (cm) | Width (cm) | Pixel size (cm) | Zd = 150 cm, Zv = 175 cm | Zd = 200 cm, Zv = 225 cm | Zd = 250 cm, Zv = 275 cm | Zd = 300 cm, Zv = 325 cm
---|---|---|---|---|---|---|---
42 | 106.68 | 85.34 | 0.0445 | 21 | 16 | 13 | 11
46 | 116.84 | 93.47 | 0.0487 | 19 | 15 | 12 | 10
50 | 127.00 | 101.60 | 0.0529 | 18 | 14 | 11 | 10
55 | 139.70 | 111.76 | 0.0582 | 16 | 13 | 10 | 9
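Under the assumption (not stated explicitly in the table itself) that these entries are pixel offsets given by P = e·(Zv − Zd)/Zv with an average intra-ocular distance e of about 6.5 cm, they can be reproduced to within roughly one pixel of rounding; for example:

```python
def table_offset(width_cm, h_res, zd_cm, zv_cm, e_cm=6.5):
    """Pixel offset placing an object at perceived distance Zv for a viewer at Zd,
    assuming P = e*(Zv - Zd)/Zv and pixel_size = width / horizontal_resolution."""
    pixel_size = width_cm / h_res
    parallax = e_cm * (zv_cm - zd_cm) / zv_cm
    return round(parallax / pixel_size)

# 42-inch screen (width 85.34 cm) viewed from 150 cm, object perceived at 175 cm:
print(table_offset(85.34, 1920, 150, 175))  # 21, as in the first row and column of Table 1
```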
- (offset_value1; user1_profile1) for viewing position 1 (at viewing distance Zd1 in front of the screen, for example);
- (offset_value2; user1_profile2) for viewing position 2 (at viewing distance Zd2 in front of the screen);
- (offset_value3; user1_profile3) for viewing position 3 (at viewing distance Zd3, not in front of the screen, i.e. at angular position α3);
- (offset_value4; user1_profile4) for viewing position 4 (at viewing distance Zd4, not in front of the screen, i.e. at angular position α4);
- etc.
- (offset_value1; adult_profile1) for viewing position 1 (at viewing distance Zd1 in front of the screen, for example);
- (offset_value2; adult_profile2) for viewing position 2 (at viewing distance Zd2 in front of the screen);
- (offset_value3; adult_profile3) for viewing position 3 (at viewing distance Zd3, not in front of the screen, i.e. at angular position α3);
- (offset_value4; adult_profile4) for viewing position 4 (at viewing distance Zd4, not in front of the screen, i.e. at angular position α4);
- etc.
- the average of the average adjusted pixel offset values (adults_offsetn) for the n adults is given by:
  adults_offsetn = (adult1_offset + adult2_offset + … + adultn_offset)/n
- Similarly, the average of the average adjusted pixel offset values (children_offsetm) for the m children is given by:
  children_offsetm = (child1_offset + child2_offset + … + childm_offset)/m
- a mixed audience profile is typically well-suited in contexts where n=m or where n and m have a similar/close value. Such a mixed audience profile is useful in these examples since it gives the same importance to different categories of users. However, those skilled in the art will appreciate that these are non-limiting examples and that other ways of generating a mixed audience profile are possible (e.g. generating mixed audience profiles for more than two categories of users). Therefore, the average pixel offset value (mixed_audience1) used is given by:
  mixed_audience1 = ½·(adults_offsetn + children_offsetm)
- a children-oriented audience profile is typically well-suited in contexts where n<m. Therefore, the average pixel offset value (children_audience2) used is given by:
  children_audience2 = ¼·adults_offsetn + ¾·children_offsetm
- an adult-oriented audience profile is typically well-suited in contexts where n>m. Therefore, the average pixel offset value (adults_audience3) used is given by:
  adults_audience3 = ¾·adults_offsetn + ¼·children_offsetm
Claims (7)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/182,671 US9667951B2 (en) | 2014-02-18 | 2014-02-18 | Three-dimensional television calibration |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150237335A1 (en) | 2015-08-20 |
US9667951B2 (en) | 2017-05-30 |
Family
ID=53799291
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/182,671 Active 2035-07-11 US9667951B2 (en) | 2014-02-18 | 2014-02-18 | Three-dimensional television calibration |
Country Status (1)
Country | Link |
---|---|
US (1) | US9667951B2 (en) |
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6011581A (en) | 1992-11-16 | 2000-01-04 | Reveo, Inc. | Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments |
EP1489857A1 (en) | 2002-03-27 | 2004-12-22 | Sanyo Electric Co., Ltd. | 3-dimensional image processing method and device |
GB2404106A (en) | 2003-07-16 | 2005-01-19 | Sharp Kk | Generating a test image for use in assessing display crosstalk. |
US20050253924A1 (en) * | 2004-05-13 | 2005-11-17 | Ken Mashitani | Method and apparatus for processing three-dimensional images |
US20050275914A1 (en) | 2004-06-01 | 2005-12-15 | Vesely Michael A | Binaural horizontal perspective hands-on simulator |
US7538876B2 (en) | 2006-06-12 | 2009-05-26 | The Boeing Company | Efficient and accurate alignment of stereoscopic displays |
US20100220325A1 (en) | 2007-06-07 | 2010-09-02 | Stephan Otte | Method for orienting an optical element on a screen |
US20100220175A1 (en) * | 2009-02-27 | 2010-09-02 | Laurence James Claydon | Systems, apparatus and methods for subtitling for stereoscopic content |
US8287127B2 (en) * | 2009-04-03 | 2012-10-16 | Seiko Epson Corporation | Aerial three-dimensional image display systems |
US9383587B2 (en) * | 2009-07-31 | 2016-07-05 | Tibor Balogh | Method and apparatus for displaying 3D images |
GB2479784A (en) | 2010-04-23 | 2011-10-26 | Nds Ltd | Stereoscopic Image Scaling |
US9445084B2 (en) * | 2011-02-15 | 2016-09-13 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
US9081181B2 (en) * | 2011-05-19 | 2015-07-14 | Samsung Electronics Co., Ltd. | Head mounted display device and image display control method therefor |
US9049425B2 (en) * | 2011-09-28 | 2015-06-02 | Superd Co., Ltd. | Stereoscopic image processing method and system |
US9172939B2 (en) * | 2011-12-30 | 2015-10-27 | Stmicroelectronics (Canada), Inc. | System and method for adjusting perceived depth of stereoscopic images |
Legal Events
Code | Title | Description
---|---|---
AS | Assignment | Owner: CISCO TECHNOLOGY INC., CALIFORNIA. Assignment of assignors interest; assignor: GRUSZKA, THIERRY (reel/frame 032262/0359, effective date 2014-02-20); assignor: ALLIEZ, DAMIEN (reel/frame 032262/0407, effective date 2014-02-19).
STCF | Information on status: patent grant | Free format text: PATENTED CASE
CC | Certificate of correction |
AS | Assignment | Owner: NDS LIMITED, UNITED KINGDOM. Assignment of assignors interest; assignors: BEAUMARIS NETWORKS LLC, CISCO SYSTEMS INTERNATIONAL S.A.R.L., CISCO TECHNOLOGY, INC., and others (reel/frame 047420/0600, effective date 2018-10-28).
AS | Assignment | Owner: SYNAMEDIA LIMITED, UNITED KINGDOM. Change of name; assignor: NDS LIMITED (reel/frame 048513/0297, effective date 2018-11-08).
MAFP | Maintenance fee payment | Payment of maintenance fee, 4th year, large entity (original event code: M1551); entity status of patent owner: large entity; year of fee payment: 4.
AS | Assignment | Owner: SYNAMEDIA LIMITED, UNITED KINGDOM. Corrective assignment to correct the 26 application numbers erroneously recorded against the attached list, previously recorded at reel 048513, frame 0297; assignor(s) hereby confirm the change of name; assignor: NDS LIMITED (reel/frame 056623/0708, effective date 2018-11-08).
Owner name: SYNAMEDIA LIMITED, UNITED KINGDOM Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE 26 APPLICATION NUMBERS ERRONEOUSLY RECORDED AGAINST ON THE ATTACHED LIST PREVIOUSLY RECORDED AT REEL: 048513 FRAME: 0297. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:NDS LIMITED;REEL/FRAME:056623/0708 Effective date: 20181108 |