GB2547701A - Method and apparatus for autostereoscopic display platform - Google Patents

Method and apparatus for autostereoscopic display platform

Info

Publication number
GB2547701A
Authority
GB
United Kingdom
Prior art keywords
display
autostereoscopic
display platform
design
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1603430.8A
Other versions
GB201603430D0 (en)
Inventor
Tan Baolin
Lu Min
Ma Xiaoqi
Kang Jianghui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Euro Electronics (uk) Ltd
Original Assignee
Euro Electronics (uk) Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Euro Electronics (uk) Ltd filed Critical Euro Electronics (uk) Ltd
Priority to GB1603430.8A
Publication of GB201603430D0
Publication of GB2547701A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/383Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/261Image signal generators with monoscopic-to-stereoscopic image conversion

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Autostereoscopic display platform for displaying 3D CAD drawings. An autostereoscopic display platform comprising software to convert CAD drawings into autostereoscopic images, and an autostereoscopic display to view the images. The software preferably generates a right-eye image from the front CAD drawing and a left-eye image by rotating the design model by a parallax angle (α, Fig.5); these two images are then interlaced. The display may comprise an LCD display screen 1 and an optical element 2, such as a parallax barrier (4, Fig.2) or lenticular lens, through which the right and left images are projected into the respective eyes. Camera 3 may track the user's eye movements to calculate the parallax angle; the user's eye movements may be continually monitored and the parallax barrier adjusted to ensure the user remains in the 3D working area. A keyboard, mouse, or gestures may be used to control the display platform, e.g. for rotating or zooming in on the displayed object (5, Fig.2).

Description

Method and Apparatus for Autostereoscopic Display Platform
Background
This invention relates to an autostereoscopic (glasses-free) 3D display platform that lets individual customers and designers project a CAD design as a 3D floating virtual object. More particularly, the present invention relates to a display screen, an optical element overlying the display screen, a real-time eye-tracking system, 2D/3D image processing software, and a signal input and output system. The optical element overlying the display screen can be a parallax barrier or a lenticular lens sheet.
In today's information society, more and more customers pursue personalized products to satisfy individual needs, so personalized aesthetic factors increasingly affect product design. Many customers even want to get involved in the design process, and hope to see their designs displayed more intuitively and in more detail. However, traditional 2D displays of design drawings cannot present a design intuitively, make the design hard for some customers to understand, and may even mislead them. A 3D display can satisfy the visual demands of individual customers and designers, and avoid the shortcomings of 2D displays, by presenting a 2D design drawing as a 3D floating virtual object: what you see is what you get. This intuitive and accurate representation of design drawings can enhance individual customers' understanding of the design, and help them participate more effectively in personalized product design.

CAD software is the most popular computer software used in product and industrial design; it can create 2D/3D graphical representations of objects and convert 2D drawings to 3D models. However, the 3D graphical representations are still presented on a two-dimensional plane. As discussed above, a 3D CAD model is easy for professionals to understand, but not intuitive for ordinary customers. This invention converts a 3D CAD model into a 3D floating virtual object.
This patent describes a 3D display platform for individual customers and designers to display CAD designs as 3D floating virtual objects. For the convenience of viewers, the displayed 3D virtual object can be rotated and enlarged under computer instructions. This invention therefore has to solve the following problems: first, to track the eye movements of individual users in real time; second, to adjust the optical element; third, to prepare the display content and project the design as a floating virtual object; and fourth, to display the correspondingly rotated virtual object, or part of it, according to instructions.
Statements of invention
The present invention provides a 3D display platform comprising an eye-tracking system to track an individual user's position; a display screen to display 3D images; an optical element (parallax barrier or lenticular lens) overlying the display screen to convert the prepared 3D images into a 3D virtual object and propagate it to the individual user; an image conversion system to convert 2D images into the prepared 3D images to be displayed; and software that reads instructions to display the product design from various angles or to display a small part of it.
The workflow of this 3D display platform is as follows: 1) detect the individual user's eyes with the eye-tracking system; 2) adjust the optical element to project the 3D virtual object into the user's eyes once the eye positions are obtained; 3) while steps 1 and 2 proceed, set the front of the 3D CAD design model as the default initial right view and prepare the correspondingly rotated 3D CAD model as the left view; 4) display the prepared 3D model as a real 3D floating virtual object; 5) adjust the display content to follow the viewer's instructions.
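The five steps above can be sketched in Python; every class and method name here is an illustrative placeholder for the patent's components, not an actual API from the source:

```python
# Hypothetical sketch of one display cycle; all names are illustrative stubs.
class EyeTracker:
    def detect_eyes(self):
        return (320, 240), (360, 240)  # stub left/right eye pixel positions

class ParallaxBarrier:
    def adjust(self, eyes):
        self.centre = (eyes[0][0] + eyes[1][0]) / 2  # track midpoint of the eyes

class CadModel:
    def render(self, angle_deg):
        return f"view@{angle_deg:.1f}"  # stand-in for a rendered CAD image

def display_frame(model, tracker, barrier, alpha_deg=7.4):
    eyes = tracker.detect_eyes()       # 1) detect the user's eyes
    barrier.adjust(eyes)               # 2) adjust the optical element
    right = model.render(0.0)          # 3) right view: front design drawing
    left = model.render(alpha_deg)     # 3) left view: model rotated by alpha
    return (left, right)               # 4) views ready for interlacing/display

frame = display_frame(CadModel(), EyeTracker(), ParallaxBarrier())
print(frame)  # ('view@7.4', 'view@0.0')
```

Step 5 (applying viewer instructions) would simply change the angles passed to `render` on the next cycle.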
During this display process, the eye-tracking system continuously monitors the user's position. If it finds the user has moved out of the current 3D viewing area, the optical element makes the corresponding adjustment so that the user always remains in the 3D viewing area and can always view the 3D virtual object.
In order to project the 3D floating virtual object into the user's eyes, the display platform presented in this patent must monitor the eye positions of the individual user with a real-time eye-tracking system. This system comprises a camera to capture images of the user and eye-tracking software to analyse the images and report the eye locations. When the individual user runs this 3D display platform on a laptop or a tablet, the eye-tracking system uses the built-in webcam; otherwise, the user needs an additional webcam placed at the top centre of the monitor. Compared with other eye detection and tracking systems, which need to detect the user's pupil/iris or the movement of the eyeball within the orbit, this system only needs to detect the eye positions to ensure the viewer is in the 3D viewing area. Hence the grey projection algorithm, which efficiently finds the rough positions of the eyes (though it cannot give accurate pupil information), is sufficient for this platform and can be employed to detect the eye positions.
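The grey projection idea can be illustrated with a minimal sketch: sum the pixel intensities along each row of a greyscale image and take the darkest row as a rough eye line. This is a deliberate simplification for illustration, not the patent's actual implementation:

```python
import numpy as np

def rough_eye_row(grey_image: np.ndarray) -> int:
    """Return the row with the lowest summed intensity (eyes/brows are dark)."""
    horizontal_projection = grey_image.sum(axis=1)  # one value per image row
    return int(np.argmin(horizontal_projection))

# Synthetic example: a bright 100x100 face patch with a dark band at row 40.
face = np.full((100, 100), 200, dtype=np.int64)
face[40, :] = 30  # dark "eye" band
print(rough_eye_row(face))  # -> 40
```

A real system would apply the same projection along columns as well to separate the two eyes horizontally.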
The default working area of the optical element is directly opposite the monitor, because users normally sit right in front of it. When the eye-tracking system detects that the viewer's eyes are outside this default viewing area, the optical element is adjusted to guarantee the viewer is in the 3D viewing area. While this 3D display platform is working, the eye-tracking system continuously tracks the user's eyes, and the optical element (parallax barrier or lenticular lens) is adjusted according to the feedback from the eye-tracking system.
While the eye-tracking system and the optical element are working, the initial 3D images are prepared as well. In this patent, the presentation of a 3D image is achieved by projecting two different images to the right and left eyes. As is well known, when a human being watches a real 3D object, the two eyes receive two different images separated by a parallax angle α.
In this patent, the initial projection of the 3D virtual object is set as the front of the object. The image for the right eye (right view) is the front design drawing from CAD. The image for the left eye (left view) is the 3D design model rotated by a horizontal angle α, which can also be generated by CAD. The rotation angle α (parallax angle) is the angle between the sight lines of the right and left eyes. Hence α = 2 arctan(d_inter / (2D)), where d_inter is the average interocular distance of human beings (typically 65 mm) and D is the working distance of the 3D display platform. When the two images for the right and left eyes are ready, 3D image interlacing is performed so that the right and left views are projected through the optical element to the right and left eyes, respectively.
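As a quick numerical check of this formula (the function name is illustrative):

```python
import math

def parallax_angle_deg(d_inter_mm: float = 65.0, working_distance_mm: float = 500.0) -> float:
    """alpha = 2 * arctan(d_inter / (2 * D)), returned in degrees."""
    return math.degrees(2.0 * math.atan(d_inter_mm / (2.0 * working_distance_mm)))

print(round(parallax_angle_deg(), 1))  # ~7.4 degrees at D = 50 cm
```

This reproduces the 7.4° angle used in the embodiment below for a 50 cm working distance.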
Once the 3D image is prepared and displayed, the viewer sees a 3D floating virtual object, which results from the two eyes viewing two different images.
After the initial 3D virtual object (front view) is presented, the viewer can choose different viewing angles and parts by three methods: 1) keyboard input, 2) mouse click and drag, or 3) gesture recognition. The viewing angle can be changed by keying in the viewing angle in the interactive interface, by using the mouse to drag the 3D floating virtual object, or by moving the index finger up and down or left and right. The 3D floating virtual object can rotate 360° horizontally and 180° vertically. Likewise, a detailed part of the design can be presented with the zoom in/out function, which is invoked by the buttons in the interactive interface, by dragging the mouse, or by closing/opening the thumb and index finger.
Detailed Description of the Invention
An embodiment of the 3D display platform using an adjustable parallax barrier will now be described by means of an example, referring to the accompanying figures, in which:
Figure 1 is a schematic diagram of a 3D display platform to convert a CAD design to a 3D floating virtual object;
Figure 2 is a schematic diagram of a parallax barrier overlaying a display screen to produce a 3D image;
Figure 3 is a schematic diagram of an adjustable parallax barrier with an eye-tracking system to project a 3D image to different viewing areas;
Figure 4 is the workflow of the 3D display platform shown in Figure 1 with a parallax barrier as shown in Figure 2;
Figure 5 is a schematic diagram to show the parallax angle of right and left eyes;
Figure 6 is the user interface of the 3D display platform.
The present invention will be described in detail in relation to the 3D display platform shown in Figure 1. An optical element 2 overlies an LCD display screen 1, and a camera 3 on top of the LCD screen 1 detects the user's eyes. The display platform is driven by a 2D-to-3D conversion system.
Figure 2 shows the working principle of the parallax barrier 4, which serves as the optical element 2 in Figure 1. The parallax barrier 4 produces the 3D image 5 (a heart) by splitting the light from the LCD 1 and redirecting it to different viewing regions. In the 3D working area, when each eye of a viewer receives its corresponding image, the brain is convinced that the viewer is watching a 3D virtual object 5.
The adjustable parallax barrier is shown in Figure 3. With the eye-tracking system, the positions of the user's two eyes are identified by a webcam at the top centre of the monitor, which takes photos and feeds the processed result back to the platform. According to this feedback, the adjustable parallax barrier is shifted slightly in the horizontal direction to make sure the viewer stays in the 3D working area.
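The patent does not give a formula for the barrier shift, but a plausible first-order model (an assumption based on similar-triangle geometry) maps a lateral head offset x at viewing distance D to a barrier shift of roughly x·g/D, where g is the gap between barrier and LCD:

```python
# Hedged sketch: linear similar-triangles model for the barrier adjustment.
# Parameter names and the model itself are assumptions, not from the patent.
def barrier_shift_mm(eye_offset_mm: float, gap_mm: float = 1.0,
                     viewing_distance_mm: float = 500.0) -> float:
    return eye_offset_mm * gap_mm / viewing_distance_mm

print(barrier_shift_mm(50.0))  # a 50 mm head movement -> 0.1 mm barrier shift
```

The small ratio g/D explains why only a slight horizontal shift of the barrier is needed to follow large head movements.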
The workflow of the 3D display platform is shown in Figure 4. At first, two tasks, eye detection and preparation of the initial 3D virtual object 5, are carried out at the same time. To detect the user's eyes, the camera 3 on top of the LCD screen 1 takes photos of the viewer; the default position of the user is directly opposite the LCD 1. The photos are then sent to the image processing system to locate the two eyes. The grey projection algorithm is employed in the image processing and eye-tracking program. After the position of the user is detected, the position data are fed back to the system and used to adjust the adjustable parallax barrier 4.
At the same time, the initial projected 3D virtual object 5 is prepared. As shown in Figure 5, the views of the right and left eyes have a parallax angle α. This means that when the image of the product's front is set as the right view, the left view can be obtained by rotating the product by the angle α. When a viewer sits in front of a laptop, a computer or a tablet, the viewing distance is normally 40 cm to 60 cm. Hence the viewing distance D in this embodiment is set to 50 cm, and α = 2 arctan(65 / (2 × 500)) ≈ 7.4°.
In this embodiment of the platform, a 3D floating heart 5 is projected. With the front side of the heart 5 set as the right view, the left view is obtained from CAD by rotating the heart 5 by an angle of 7.4°. With both right and left views ready, interlacing the two views yields the display content. When the viewer receives these two views, he is convinced that he is viewing a floating 3D heart.
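Interlacing for a vertical parallax barrier can be sketched as a simple even/odd column interleave. Real interlacing depends on the barrier geometry and subpixel layout, so this scheme is illustrative only:

```python
import numpy as np

def interlace_columns(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Alternate pixel columns between the right and left views."""
    assert left.shape == right.shape
    out = right.copy()
    out[:, 1::2] = left[:, 1::2]  # odd columns carry the left view
    return out

# Tiny demo: a 2x4 frame whose columns alternate right (2) and left (1).
left = np.full((2, 4), 1)
right = np.full((2, 4), 2)
print(interlace_columns(left, right))  # columns read 2, 1, 2, 1
```

The barrier's slits then block each eye from seeing the columns intended for the other eye.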
When the viewer wants to see other sides of this floating 3D heart 5, or view it in more detail, he can either key in the rotation angle in the user interface shown in Figure 6 or drag the floating virtual object 5 with the mouse.
The keyed-in data are fed back to CAD, which returns the corresponding right and left views to the platform. For example, if a viewer wants to rotate the virtual object to the left by 30°, he keys in 30° in the second field after "horizontal". The value 30° is fed back to the CAD program, both views are rotated 30° to the left, and the new views are returned to the platform and displayed to the viewer. If the viewer wants to view part of the object in detail, he can use the "Zoom in" function: the instruction is sent to the system, and the corresponding display result is returned to the viewer.
Using the mouse achieves the same result as using the interface. The viewer places the cursor on the virtual object, presses the left mouse button to drag it to the desired angle, then releases the button to view a different part of the design. The "Zoom in" and "Zoom out" functions are achieved by pressing both the left and right mouse buttons. The mouse action is returned to the platform, and the data are sent to the CAD program to produce the corresponding projected content.
In this patent, the gesture instructions are defined as follows: 1) moving the fingers horizontally rotates the virtual object in the horizontal direction; 2) moving the fingers vertically rotates it in the vertical direction; 3) closing the thumb and index finger zooms the virtual object out; 4) opening the thumb and index finger zooms it in.
The gesture instructions are recognized by a webcam. When using gesture input, the viewer must hold his fingers in front of the webcam and make sure the webcam can detect them. The desired rotation angle of the floating virtual object is obtained by the following steps: 1) recognizing the starting point and end point of the finger movement with the webcam; 2) calculating the distance moved by the fingers; 3) taking this distance as the rotational arc length of the virtual object; 4) calculating the rotation angle using the working distance D as the rotational radius.
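Step 4 can be sketched as follows: treating the finger travel s as an arc on a circle of radius D gives θ = s/D radians. The function name and the exact linear model are illustrative assumptions consistent with the steps described above:

```python
import math

def gesture_rotation_deg(finger_travel_mm: float, working_distance_mm: float = 500.0) -> float:
    """Arc length over radius: theta = s / D, converted to degrees."""
    return math.degrees(finger_travel_mm / working_distance_mm)

# A 100 mm swipe at D = 50 cm rotates the virtual object about 11.5 degrees.
print(round(gesture_rotation_deg(100.0), 1))
```

The same mapping applies to horizontal and vertical swipes, only the rotation axis differs.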
After the rotation angle is sent to the system, the system produces the desired views and prepares the projected image. When the webcam recognizes the movement of the thumb and index finger, the "zoom in"/"zoom out" instruction is sent to the system, and the display platform responds accordingly.

Claims (1)

  1. An autostereoscopic 3D display platform for design works, comprising software for converting original CAD design drawings into the required 3D autostereoscopic views, and an autostereoscopic 3D display to display the 3D design result.
  2. An autostereoscopic 3D display platform for design as claimed in claim 1, wherein the workflow of the display platform runs from the software's data to the autostereoscopic 3D display system.
  3. An autostereoscopic 3D display platform for design as claimed in claim 1, wherein the parallax angle between the two views is affected by the working distance of the display platform.
  4. An autostereoscopic 3D display platform for design as claimed in claim 1, wherein the two views interlaced for display are provided by CAD software and have a parallax angle as claimed in claim 3.
  5. An autostereoscopic 3D display platform for design as claimed in claim 1, wherein the eye-tracking system monitors the eye positions during the whole working process of the display platform.
  6. An autostereoscopic 3D display platform for design as claimed in claim 1, wherein instructions can be input by keyboard, mouse or gesture.
GB1603430.8A 2016-02-28 2016-02-28 Method and apparatus for autostereoscopic display platform Withdrawn GB2547701A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1603430.8A GB2547701A (en) 2016-02-28 2016-02-28 Method and apparatus for autostereoscopic display platform


Publications (2)

Publication Number Publication Date
GB201603430D0 GB201603430D0 (en) 2016-04-13
GB2547701A true GB2547701A (en) 2017-08-30

Family

ID=55807024

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1603430.8A Withdrawn GB2547701A (en) 2016-02-28 2016-02-28 Method and apparatus for autostereoscopic display platform

Country Status (1)

Country Link
GB (1) GB2547701A (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150229914A1 (en) * 2012-10-22 2015-08-13 Uab 3D Tau Autostereoscopic system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
(NVIDIA) "NVIDIA 3D VISION PRO FOR SOLIDWORKS AND EDRAWINGS" [online], May 2013. Available from: http://www.nvidia.com/object/solidworks-edrawings.html [Accessed date 22 July 2016] *
(SEEREAL) "NextGen display technology" [online], May 2007. Available from: http://www.seereal.com/en/autostereoscopy/NextGen.php [Accessed date 22 July 2016] *
(SEEREAL) "World premiere: SeeReal hooks up to 3D CAD Workstation at Hanover Fair 2005" [online], April 2005. Available from: http://www.seereal.com/en/news/news/200504-01.php [Accessed date 22 July 2016] *


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)