US20180160093A1 - Portable device and operation method thereof - Google Patents

Portable device and operation method thereof

Info

Publication number
US20180160093A1
Authority
US
United States
Prior art keywords
portable device
module
user
image capture
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/368,693
Inventor
Sung-Yang Wu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wu Sung Yang
Original Assignee
Sung-Yang Wu
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sung-Yang Wu filed Critical Sung-Yang Wu
Priority to US15/368,693 priority Critical patent/US20180160093A1/en
Priority to CN201710080854.9A priority patent/CN108616754A/en
Publication of US20180160093A1 publication Critical patent/US20180160093A1/en
Priority to US17/184,479 priority patent/US11212501B2/en
Priority to US17/184,617 priority patent/US11240487B2/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • H04N13/117Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation the virtual viewpoint locations being selected by the viewers or determined by viewer tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/0014
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • H04N13/0203
    • H04N13/04
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A portable device includes a display module, an image capture module, and an eye-tracking module. The image capture module is used to capture images steadily. The eye-tracking module is used to track a user's viewpoint position relative to the portable device. Viewport within the images captured by the image capture module is adjusted in accordance with the user's viewpoint position for generating modified images displayed on the display module instantaneously. The modified images shown on the display module can fit the background scene seen by the user, and the augmented reality experience on the portable device may be improved accordingly.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a portable device and an operation method thereof, and more particularly, to a portable device including an image capture module and a display module and an operation method thereof.
  • 2. Description of the Prior Art
  • In recent years, portable devices such as smart phones, tablets, handheld game consoles, and car navigation systems have become popular because of their lightweight, compact displays and additional features such as wireless internet connection and image capture. A portable device equipped with a display panel and a camera may be used to perform an augmented reality (AR) function. Augmented reality is a live direct view of a physical, real-world environment whose elements are augmented or supplemented by computer-generated sensory input such as sound, video, graphics, or other information. Augmentation is generally performed in real time and in register with environmental elements. With AR technology, information about the real world surrounding the user becomes interactive and digitized. However, the user's AR experience may be compromised because the image shown on the display is generally constrained by the specification of the camera and cannot always fit the background scene when the viewing distance between the user and the display changes.
  • SUMMARY OF THE INVENTION
  • A portable device and an operation method thereof are provided in the present invention. An eye-tracking module in the portable device is used to track a user's viewpoint position relative to the portable device for adjusting viewport within images captured by an image capture module in accordance with the user's viewpoint position for generating modified images shown on a display module. Accordingly, the modified images shown on the display module may fit the background scene seen by the user, and the augmented reality experience on the portable device may be improved.
  • According to an embodiment of the present invention, an operation method of a portable device is provided. The operation method includes the following steps. Images are captured steadily by an image capture module of a portable device. A user's viewpoint position relative to the portable device is tracked. Viewport within the images captured by the image capture module is adjusted in accordance with the user's viewpoint position for generating modified images. The modified images are displayed on a display module of the portable device instantaneously.
  • According to an embodiment of the present invention, a portable device is provided. The portable device includes a display module, an image capture module, and an eye-tracking module. The image capture module is used to capture images steadily. The eye-tracking module is used to track a user's viewpoint position relative to the portable device. Viewport within the images captured by the image capture module is adjusted in accordance with the user's viewpoint position for generating modified images displayed on the display module instantaneously.
  • These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic drawing illustrating an operation method of a portable device according to an embodiment of the present invention.
  • FIG. 2 is a stereoscopic schematic drawing illustrating a portable device according to an embodiment of the present invention.
  • FIG. 3 is a cross-sectional diagram of the portable device according to the embodiment of the present invention.
  • FIG. 4 is a schematic drawing illustrating a step of adjusting viewport within an image captured by an image capture module according to an embodiment of the present invention.
  • FIG. 5 is a schematic drawing illustrating the difference between an image shown on a display module of a conventional portable device and an image shown on a display module of the portable device according to the embodiment of the present invention.
  • FIG. 6 is a schematic drawing illustrating components in the portable device and interaction between the components according to the embodiment of the present invention.
  • DETAILED DESCRIPTION
  • In the following detailed description of the invention, reference is made to the accompanying drawings, which form a part hereof, and in which is shown, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
  • The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims, along with the full scope of equivalents to which such claims are entitled. One or more implementations of the present invention will now be described with reference to the attached drawings, wherein like reference numerals are used to refer to like elements throughout, and wherein the illustrated structures are not necessarily drawn to scale.
  • Please refer to FIG. 1, FIG. 2, and FIG. 3. FIG. 1 is a schematic drawing illustrating an operation method of a portable device according to an embodiment of the present invention. FIG. 2 is a stereoscopic schematic drawing illustrating the portable device in this embodiment. FIG. 3 is a cross-sectional diagram of the portable device in this embodiment. As shown in FIGS. 1-3, an operation method of a portable device 100 includes the following steps. The portable device 100 including a display module 10, an image capture module 20, and an eye-tracking module 30 is provided. The portable device 100 may be a smart phone, a tablet, a handheld game console, or a car navigation system, but not limited thereto. In step S11, images are captured steadily by the image capture module 20 of the portable device 100. In step S12, a user's viewpoint position relative to the portable device 100 is tracked by the eye-tracking module 30. The step S11 and the step S12 may be performed simultaneously. In step S20, viewport within the images captured by the image capture module 20 is adjusted in accordance with the user's viewpoint position for generating modified images. In step S30, the modified images are displayed on the display module 10 of the portable device 100 instantaneously. In other words, the image capture module 20 is used to capture the images steadily. The eye-tracking module 30 is used to track the user's viewpoint position relative to the portable device 100. Preferably, the step of capturing the images by the image capture module 20, the step of tracking the user's viewpoint position, the step of adjusting the viewport within the images captured by the image capture module 20 in accordance with the user's viewpoint position for generating the modified images, and the step of displaying the modified images on the display module 10 may be performed simultaneously for presenting a real-time and live direct view of the environment around the user.
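  • As an illustration of steps S11 through S30, a minimal sketch of one iteration of the capture-track-adjust-display loop is given below in Python. It is not part of the patent: the function names are hypothetical, the hardware is stubbed out, and the viewport rule is a placeholder that is elaborated in the FIG. 4 sketch further below.

```python
# Hypothetical sketch of one frame of the operation method (S11, S12, S20, S30).
# Hardware access is stubbed; the viewport rule here is a placeholder.

def capture_image():
    """S11: stand-in for the image capture module 20 (rear camera)."""
    return {"width": 1920, "height": 1080}

def track_viewpoint():
    """S12: stand-in for the eye-tracking module 30; eye position relative
    to the device in millimetres (x, y, z)."""
    return (0.0, 0.0, 400.0)

def adjust_viewport(image, viewpoint, reference_mm=600.0):
    """S20: pick a centred crop whose size follows the eye-to-device
    distance (assumed linear scaling; see the FIG. 4 discussion below)."""
    scale = max(0.1, min(1.0, viewpoint[2] / reference_mm))
    w, h = int(image["width"] * scale), int(image["height"] * scale)
    return ((image["width"] - w) // 2, (image["height"] - h) // 2, w, h)

def show_on_display(image, viewport):
    """S30: stand-in for the display module 10."""
    print("displaying crop", viewport, "of frame", (image["width"], image["height"]))

# S11 and S12 may run simultaneously; they are sequential here for clarity.
frame = capture_image()
eye = track_viewpoint()
show_on_display(frame, adjust_viewport(frame, eye))
```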
  • As shown in FIG. 2, the viewpoint of the user is different from the viewpoint of the image capture module 20, and without modification the images captured by the image capture module 20 and displayed on the display module 10 will not fit the background scene seen by the user, especially when the distance between the user and the portable device 100 changes. However, by the operation method in this embodiment, the modified images displayed on the display module 10 can fit the background scene behind the portable device 100 as seen by the user.
  • As shown in FIG. 1 and FIG. 3, in some embodiments, the step of tracking the user's viewpoint position may include tracking a distance and an orientation between at least one eye E of the user and the portable device 100, and calculating a distance and an orientation between at least one eye E of the user and the image capture module 20 of the portable device 100. The distance and the orientation between at least one eye E of the user and the image capture module 20 of the portable device may be calculated in accordance with a distance and an orientation between the eye E of the user and the eye-tracking module 30 and a relative position difference between the eye-tracking module 30 and the image capture module 20. For example, ΔEyeCam stands for the distance and the orientation difference between the eye E and the image capture module 20, ΔEyeSen stands for the distance and the orientation difference between the eye E and the eye-tracking module 30, ΔSenCam stands for the distance and the orientation difference between the eye-tracking module 30 and the image capture module 20, and ΔEyeCam may be calculated by an equation listed below.

  • ΔEyeCam = ΔEyeSen + ΔSenCam = (XΔEyeSen + XΔSenCam, YΔEyeSen + YΔSenCam, ZΔEyeSen + ZΔSenCam)
  • In the above equation, XΔEyeSen stands for the position difference between the eye E and the eye-tracking module 30 along the X axis, and XΔSenCam stands for the position difference between the eye-tracking module 30 and the image capture module 20 along the X axis; YΔEyeSen and YΔSenCam stand for the corresponding position differences along the Y axis; and ZΔEyeSen and ZΔSenCam stand for the corresponding position differences along the Z axis. ΔSenCam is fixed by the geometry of the portable device 100, and ΔEyeSen may be measured by the eye-tracking module 30, so ΔEyeCam follows directly, as sketched below.
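  • In code, the equation reduces to a componentwise vector sum over (X, Y, Z). The short sketch below is a direct transcription of the equation; the numeric offsets are illustrative assumptions, not values from the patent.

```python
# ΔEyeCam = ΔEyeSen + ΔSenCam, computed componentwise over (X, Y, Z).
# The numeric offsets below are illustrative assumptions, in millimetres.

def eye_to_camera(delta_eye_sen, delta_sen_cam):
    """Offset between the eye E and the image capture module 20."""
    return tuple(es + sc for es, sc in zip(delta_eye_sen, delta_sen_cam))

delta_eye_sen = (12.0, -35.0, 380.0)   # measured by the eye-tracking module 30
delta_sen_cam = (-5.0, 140.0, -8.0)    # fixed by the device geometry
print(eye_to_camera(delta_eye_sen, delta_sen_cam))   # -> (7.0, 105.0, 372.0)
```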
  • In some embodiments, the step of tracking the user's viewpoint position may include tracking a distance and an orientation between each eye of the user and the portable device 100. Additionally, the image capture module 20 may include a 3D image capture module, and the modified images displayed on the display module 10 may include naked eye 3D images. The 3D image capture module mentioned above may include a 3D depth camera configured to record position information of objects in the images captured by the image capture module 20. Additionally, the display module 10 may be a naked eye 3D display module, but is not limited thereto. Specifically, the images captured by the image capture module 20 may include different images for the left eye and the right eye of the user, and the modified images may include different images displayed on the display module 10 for the left eye and the right eye of the user.
  • Please refer to FIG. 1, FIG. 3, and FIG. 4. FIG. 4 is a schematic drawing illustrating the step of adjusting the viewport within an image captured by the image capture module 20 according to an embodiment of the present invention. As shown in FIG. 1, FIG. 3, and FIG. 4, an image 90 is captured by the image capture module 20, and a first viewport V1 within the image 90 is formed in accordance with the user's viewpoint position for generating and/or rendering a modified image. When the portable device 100 moves closer to the user's eyes, the viewport within the image is adjusted to be a second viewport V2 shown in FIG. 4. Accordingly, the viewport within the captured image becomes smaller, and the magnification increases, as the distance between the user and the portable device 100 is reduced. In other words, the captured images are trimmed for generating and/or rendering the modified images displayed on the display module 10 in accordance with the information about the user's viewpoint position relative to the portable device 100 tracked by the eye-tracking module 30. The modified images displayed on the display module 10 thereby present a simulated viewpoint similar to that of the user instead of the viewpoint of the image capture module 20; one possible viewport rule is sketched below.
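  • The paragraph above states the FIG. 4 behaviour only qualitatively. The sketch below is one possible viewport rule consistent with it; the linear scaling law, the reference distance, and the offset-to-pixel factor are assumptions introduced for illustration.

```python
# Hypothetical viewport rule matching the stated behaviour: a shorter
# eye-to-device distance z yields a smaller crop (second viewport V2),
# which, displayed full-screen, yields a higher magnification. The crop is
# also shifted by the eye's lateral offset so it follows the viewpoint.

def viewport_rect(image_w, image_h, eye_xyz_mm, reference_mm=600.0):
    x, y, z = eye_xyz_mm
    scale = max(0.1, min(1.0, z / reference_mm))   # assumed linear in distance
    w, h = int(image_w * scale), int(image_h * scale)
    px_per_mm = 2.0                                # assumed offset-to-pixel mapping
    left = (image_w - w) // 2 + int(x * px_per_mm)
    top = (image_h - h) // 2 + int(y * px_per_mm)
    # keep the crop inside the captured frame
    left = max(0, min(left, image_w - w))
    top = max(0, min(top, image_h - h))
    return (left, top, w, h)

print(viewport_rect(1920, 1080, (0.0, 0.0, 600.0)))   # first viewport V1: full frame
print(viewport_rect(1920, 1080, (0.0, 0.0, 300.0)))   # smaller second viewport V2
```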
  • Please refer to FIG. 3 and FIG. 5. FIG. 5 is a schematic drawing illustrating the difference between an image shown on a display module 10′ of a conventional portable device 100′ and an image shown on the display module 10 of the portable device 100 in this embodiment. As shown in FIG. 3 and FIG. 5, the image displayed on the display module 10 is the modified image generated by the operation method of the present invention, and it fits the background scene. However, the image displayed on the display module 10′ of the conventional portable device 100′ does not fit the background scene. Therefore, the user's augmented reality experience on the portable device 100 may be improved.
  • Please refer to FIG. 3 and FIG. 6. FIG. 6 is a schematic drawing illustrating components in the portable device 100 and interaction between the components according to an embodiment of the present invention. As shown in FIG. 3 and FIG. 6, the display module 10 and the eye-tracking module 30 may be disposed at a front side of the portable device 100, and the image capture module 20 may be disposed at a back side of the portable device 100. The eye-tracking module 30 may include an infrared sensor or other suitable sensors. As shown in FIG. 3 and FIG. 6, the portable device 100 may further include a viewport calculation module 31, a viewport adjust module 32, a 3D modeling module 21, and a 3D rendering module 11. The viewport calculation module 31 is connected to the eye-tracking module 30 and used to calculate the distance and the orientation between at least one eye E of the user and the image capture module 20 of the portable device 100. The 3D modeling module 21 is connected to the image capture module 20 for processing the image data from the image capture module 20. The viewport adjust module 32 is connected to the viewport calculation module 31 and the 3D rendering module 11. The viewport adjust module 32 is used to adjust the viewport within the images captured by the image capture module 20 and to input the related information to the 3D rendering module 11 for generating the modified images, as sketched below.
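  • The FIG. 6 data flow can be pictured as a chain of functions, one per module. The self-contained sketch below mirrors the reference numerals used in the text; the signatures, stub bodies, and numeric values are hypothetical.

```python
# Hypothetical wiring of FIG. 6: eye-tracking module 30 -> viewport calculation
# module 31 -> viewport adjust module 32 -> 3D rendering module 11, alongside
# image capture module 20 -> 3D modeling module 21 -> 3D rendering module 11.

def eye_tracking_module_30():
    return (10.0, -30.0, 350.0)                      # ΔEyeSen in millimetres (stub)

def viewport_calculation_module_31(delta_eye_sen, delta_sen_cam=(-5.0, 140.0, -8.0)):
    # distance and orientation between the eye E and the image capture module 20
    return tuple(a + b for a, b in zip(delta_eye_sen, delta_sen_cam))

def image_capture_module_20():
    return {"width": 1920, "height": 1080}           # captured frame (stub)

def modeling_module_21(image):
    return {"image": image, "depth_map": None}       # 3D modeling of image data (stub)

def viewport_adjust_module_32(image, eye_cam, reference_mm=600.0):
    scale = max(0.1, min(1.0, eye_cam[2] / reference_mm))
    w, h = int(image["width"] * scale), int(image["height"] * scale)
    return ((image["width"] - w) // 2, (image["height"] - h) // 2, w, h)

def rendering_module_11(model, viewport):
    return {"model": model, "crop": viewport}        # modified image for display module 10

eye_cam = viewport_calculation_module_31(eye_tracking_module_30())
frame = image_capture_module_20()
modified = rendering_module_11(modeling_module_21(frame),
                               viewport_adjust_module_32(frame, eye_cam))
print(modified["crop"])
```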
  • To summarize the above descriptions, in the portable device and the operation method thereof in the present invention, the user's viewpoint position relative to the portable device is tracked and used in the calculations for adjusting the viewport within the images captured by the image capture module and for generating the modified images displayed on the display module. The modified images displayed on the display module may fit the background scene seen by the user, and the augmented reality experience on the portable device may be improved accordingly.
  • Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.

Claims (17)

What is claimed is:
1. An operation method of a portable device, comprising:
capturing images steadily by an image capture module of a portable device;
tracking a user's viewpoint position relative to the portable device;
adjusting viewport within the images captured by the image capture module in accordance with the user's viewpoint position for generating modified images; and
displaying the modified images on a display module of the portable device instantaneously.
2. The operation method of the portable device according to claim 1, wherein the modified images displayed on the display module fit a background scene behind the portable device and seen by the user.
3. The operation method of the portable device according to claim 1, wherein the step of tracking the user's viewpoint position comprises:
tracking a distance and an orientation between at least one eye of the user and the portable device.
4. The operation method of the portable device according to claim 1, wherein the step of tracking the user's viewpoint position comprises:
calculating a distance and an orientation between at least one eye of the user and the image capture module of the portable device.
5. The operation method of the portable device according to claim 4, wherein the user's viewpoint position relative to the portable device is tracked by an eye-tracking module in the portable device.
6. The operation method of the portable device according to claim 5, wherein the distance and the orientation between at least one eye of the user and the image capture module of the portable device are calculated in accordance with a distance and an orientation between the eye of the user and the eye-tracking module and a relative position difference between the eye-tracking module and the image capture module.
7. The operation method of the portable device according to claim 1, wherein the step of tracking the user's viewpoint position comprises:
tracking a distance and an orientation between each eye of the user and the portable device, wherein the image capture module comprises a 3D image capture module, and the modified images displayed on the display module comprise naked eye 3D images.
8. The operation method of the portable device according to claim 7, wherein the 3D image capture module comprises a 3D depth camera configured to record position information of objects in the images captured by the image capture module.
9. The operation method of the portable device according to claim 8, wherein the images captured by the image capture module comprise different images for the left eye and the right eye of the user, and the modified images comprise different images displayed on the display module for the left eye and the right eye of the user.
10. The operation method of the portable device according to claim 1, wherein the step of capturing the images by the image capture module, the step of tracking the user's viewpoint position, the step of adjusting the viewport within the images captured by the image capture module in accordance with the user's viewpoint position for generating the modified images, and the step of displaying the modified images on the display module are performed simultaneously.
11. A portable device, comprising:
a display module;
an image capture module configured to capture images steadily; and
an eye-tracking module configured to track a user's viewpoint position relative to the portable device, wherein viewport within the images captured by the image capture module is adjusted in accordance with the user's viewpoint position for generating modified images displayed on the display module instantaneously.
12. The portable device according to claim 11, wherein the display module and the eye-tracking module are disposed at a front side of the portable device, and the image capture module is disposed at a back side of the portable device.
13. The portable device according to claim 11, wherein the image capture module comprises a 3D depth camera configured to record position information of objects in the images captured by the image capture module.
14. The portable device according to claim 13, wherein the display module comprises a naked eye 3D display module.
15. The portable device according to claim 11, wherein the eye-tracking module comprises an infrared sensor.
16. The portable device according to claim 11, further comprising:
a viewport calculation module connected to the eye-tracking module and configured to calculate a distance and an orientation between at least one eye of the user and the image capture module of the portable device.
17. The portable device according to claim 16, further comprising:
a viewport adjust module connected to the viewport calculation module and configured to adjust the viewport within the images captured by the image capture module for generating the modified images.
US15/368,693 2016-12-05 2016-12-05 Portable device and operation method thereof Abandoned US20180160093A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US15/368,693 US20180160093A1 (en) 2016-12-05 2016-12-05 Portable device and operation method thereof
CN201710080854.9A CN108616754A (en) 2016-12-05 2017-02-15 Portable apparatus and its operating method
US17/184,479 US11212501B2 (en) 2016-12-05 2021-02-24 Portable device and operation method for tracking user's viewpoint and adjusting viewport
US17/184,617 US11240487B2 (en) 2016-12-05 2021-02-25 Method of stereo image display and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/368,693 US20180160093A1 (en) 2016-12-05 2016-12-05 Portable device and operation method thereof

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US17/184,479 Division US11212501B2 (en) 2016-12-05 2021-02-24 Portable device and operation method for tracking user's viewpoint and adjusting viewport
US17/184,617 Continuation-In-Part US11240487B2 (en) 2016-12-05 2021-02-25 Method of stereo image display and related device

Publications (1)

Publication Number Publication Date
US20180160093A1 true US20180160093A1 (en) 2018-06-07

Family

ID=62243658

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/368,693 Abandoned US20180160093A1 (en) 2016-12-05 2016-12-05 Portable device and operation method thereof
US17/184,479 Active US11212501B2 (en) 2016-12-05 2021-02-24 Portable device and operation method for tracking user's viewpoint and adjusting viewport

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/184,479 Active US11212501B2 (en) 2016-12-05 2021-02-24 Portable device and operation method for tracking user's viewpoint and adjusting viewport

Country Status (2)

Country Link
US (2) US20180160093A1 (en)
CN (1) CN108616754A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11483483B2 (en) * 2018-11-30 2022-10-25 Maxell, Ltd. Display apparatus

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110796116A (en) * 2018-11-08 2020-02-14 英属开曼群岛商麦迪创科技股份有限公司 Multi-panel display system, vehicle with multi-panel display system and display method
CN114979613A (en) * 2021-02-25 2022-08-30 吴松阳 Stereoscopic image display method and portable device for displaying stereoscopic image

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080136916A1 (en) * 2005-01-26 2008-06-12 Robin Quincey Wolff Eye tracker/head tracker/camera tracker controlled camera/weapon positioner control system

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI369636B (en) 2008-10-28 2012-08-01 Univ Nat Central Image system for adjusting displaying angle by detecting human face and visual simulation control apparatus thereof
CN102314856B (en) * 2010-07-06 2014-12-03 南通新业电子有限公司 Image processing system, display device and image display method
US8704879B1 (en) 2010-08-31 2014-04-22 Nintendo Co., Ltd. Eye tracking enabling 3D viewing on conventional 2D display
TWI544447B (en) * 2011-11-29 2016-08-01 財團法人資訊工業策進會 System and method for augmented reality
NL2010302C2 (en) 2013-02-14 2014-08-18 Optelec Dev B V A system for determining a recommended magnification factor for a magnifier such as a loupe or an electronic magnifier to be used by a person.
CN103531094A (en) 2013-09-22 2014-01-22 明基材料有限公司 Display system and method capable of adjusting image focal length automatically
TWI540880B (en) 2014-02-19 2016-07-01 大昱光電股份有限公司 Method for displaying stereoscopic image and stereoscopic image device
US9934573B2 (en) * 2014-09-17 2018-04-03 Intel Corporation Technologies for adjusting a perspective of a captured image for display
CN104539924A (en) * 2014-12-03 2015-04-22 深圳市亿思达科技集团有限公司 Holographic display method and holographic display device based on eye tracking

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080136916A1 (en) * 2005-01-26 2008-06-12 Robin Quincey Wolff Eye tracker/head tracker/camera tracker controlled camera/weapon positioner control system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11483483B2 (en) * 2018-11-30 2022-10-25 Maxell, Ltd. Display apparatus
US11831976B2 (en) 2018-11-30 2023-11-28 Maxell, Ltd. Display apparatus

Also Published As

Publication number Publication date
US11212501B2 (en) 2021-12-28
CN108616754A (en) 2018-10-02
US20210185292A1 (en) 2021-06-17

Legal Events

Date Code Title Description
STCV Information on status: appeal procedure. Free format text: NOTICE OF APPEAL FILED
STCV Information on status: appeal procedure. Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER
STCV Information on status: appeal procedure. Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STPP Information on status: patent application and granting procedure in general. Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP Information on status: patent application and granting procedure in general. Free format text: FINAL REJECTION MAILED
STCB Information on status: application discontinuation. Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION