WO2018088742A1 - Display apparatus and control method thereof - Google Patents

Display apparatus and control method thereof

Info

Publication number
WO2018088742A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
planar
display
spherical
processor
Prior art date
Application number
PCT/KR2017/012083
Other languages
French (fr)
Inventor
Yong-Deok Kim
Bo-Eun Kim
Sung-Hyun Kim
Original Assignee
Samsung Electronics Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to EP17870508.3A priority Critical patent/EP3520086B1/en
Publication of WO2018088742A1 publication Critical patent/WO2018088742A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/12Panospheric to cylindrical image transformations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0412Digitisers structurally integrated in a display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/60Editing figures and text; Combining figures or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/06Topological mapping of higher dimensional structures onto lower dimensional surfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/80Geometric correction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/20Linear translation of whole images or parts thereof, e.g. panning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting

Definitions

  • Apparatuses and methods consistent with the present disclosure relate to a display apparatus and a control method thereof, and more particularly, to a display apparatus for editing a Virtual Reality (VR) image generated by combining a plurality of images captured from a plurality of different viewpoints and converting the combined image to a planar image, and a control method thereof.
  • current VR image editing software tools are mainly intended for stitching and thus do not support editing operations such as drawing on a 360° image with a pen, inserting text into the 360° image, or the like.
  • an existing editing tool, such as a smartphone photo-editing app or Photoshop, may be used for such editing but does not provide functions specific to a 360° image.
  • the existing editing tool performs editing on a VR image generated by projecting a spherical VR image onto a plane.
  • editing may not be performed as the user intends, due to distortion that occurs in the process of projecting the spherical VR image onto the plane.
  • Exemplary embodiments of the present disclosure overcome the above disadvantages and other disadvantages not described above. Also, the present disclosure is not required to overcome the disadvantages described above, and an exemplary embodiment of the present disclosure may not overcome any of the problems described above.
  • the present disclosure provides a display apparatus for performing intuitive editing when a Virtual Reality (VR) image is displayed, and a control method thereof.
  • a display apparatus comprising: a storage configured to store a Virtual Reality (VR) image; a user interface; a display; a processor configured to: convert the VR image into a spherical VR image, obtain a planar VR image corresponding to an area of the spherical VR image according to a projection method, control the display to display the planar VR image, receive a user input, through the user interface, to select an editing tool for performing an editing operation on the planar VR image, in response to the editing operation, overlay a first object corresponding to the editing operation on the planar VR image and control the display to display the first object overlaid on the planar VR image, obtain a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system and edit the spherical VR image based on the second object.
  • in response to the user input comprising an operation for changing a size of the first object, the processor may be further configured to: change a size of the second object based on the user input, change a shape of the first object based on the second object of which the size is changed according to the projection method, and control the display to display the first object having the changed shape on the planar VR image.
  • in response to the user input comprising an operation for changing a position of the first object from a first position to a second position in the planar VR image, the processor may be further configured to: move the second object to a third position in the spherical coordinate system corresponding to the second position in the planar VR image, change the shape of the first object based on the inversely performed projection method so that the first object corresponds to the second object having the changed position, and control the display to display the first object having the changed shape in the second position on the planar VR image.
  • in response to a user input designating a projection point, a projection angle, and the projection method being received, the processor may be further configured to: identify the area of the spherical VR image based on the projection point and the projection angle, and obtain and control the display to display a planar VR image corresponding to the area based on the projection method.
  • in response to the user input comprising an operation for changing a position of the first object from the first position to a fourth position in a preset area of the planar VR image, the processor may be further configured to obtain and control the display to display a planar VR image corresponding to the fourth position.
  • the processor may be further configured to overlay a lattice type guide graphical user interface (GUI) on the planar VR image and control the display to display the lattice type guide GUI overlaid on the planar VR image, and wherein the lattice type guide GUI guides a position corresponding to the planar VR image on the spherical VR image.
  • the processor may display a planar VR image corresponding to the area of the edited spherical VR image.
  • the planar VR image may be obtained by converting a combined image, which is obtained by combining a plurality of images captured from a plurality of different viewpoints, to a plane image.
  • the first object provided from the editing tool may comprise at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
  • a method of controlling a display apparatus comprising: converting a VR image into a spherical VR image; obtaining a planar VR image corresponding to an area of the spherical VR image according to a projection method; displaying the planar VR image; receiving a user input to select an editing tool for performing an editing operation on the planar VR image; in response to the editing operation, overlaying a first object corresponding to the editing operation on the planar VR image; displaying the first object overlaid on the planar VR image; obtaining a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system; and editing the spherical VR image based on the second object.
  • the method may further comprise: in response to the user input comprising an operation for changing a size of the first object being received, changing a size of the second object based on the user input; and changing a shape of the first object based on the second object of which the size is changed according to the projection method, and displaying the first object having the changed shape on the planar VR image.
  • the method may further comprise: in response to the user input comprising an operation for changing a position of the first object from a first position to a second position in the planar VR image, moving the second object to a third position in the spherical coordinate system corresponding to the second position in the planar VR image; and changing a shape of the first object based on the projection method so that the first object corresponds to the second object having the changed position and displaying the first object having the changed shape in the second position on the planar VR image.
  • the displaying of the planar VR image may comprise: in response to the user input comprising an operation for a projection point, a projection angle, and the projection method being received, identifying the area of the spherical VR image based on the projection point and the projection angle; and obtaining and displaying a planar VR image corresponding to the area based on the projection method.
  • the method may further comprise: in response to the user input comprising an operation for changing a position of the first object from a first position to a fourth position in a preset area of the planar VR image being received, obtaining and displaying a planar VR image corresponding to the fourth position.
  • the method may further comprise: overlaying a lattice type guide graphical user interface (GUI) on the planar VR image and displaying the lattice type guide GUI overlaid on the planar VR image, wherein the lattice type guide GUI guides a position corresponding to the planar VR image on the spherical VR image.
  • the method may further comprise: displaying a planar VR image corresponding to the area of the edited spherical VR image.
  • the planar VR image may be obtained by converting a combined image, which is obtained by combining a plurality of images captured from a plurality of different viewpoints, to a plane image.
  • the first object provided from the editing tool may comprise at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
  • a non-transitory recording medium storing a program for performing an operation method of a display apparatus, the operation method comprising: converting a VR image into a spherical VR image; obtaining a planar VR image corresponding to an area of the spherical VR image according to a projection method; displaying the planar VR image; receiving a user input to select an editing tool for performing an editing operation on the planar VR image; in response to the editing operation, overlaying a first object corresponding to the editing operation on the planar VR image; displaying the first object overlaid on the planar VR image; obtaining a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system; and editing the spherical VR image based on the second object.
  • a display apparatus comprising: a processor configured to: receive a first Virtual Reality (VR) image; obtain a second VR image corresponding to an area of the first VR image by applying a projection method on the first VR image; overlay a first object corresponding to an editing operation on the second VR image; obtain a second object by inversely performing the projection method used for obtaining the second VR image on the first object, in order to project the first object as the second object in a spherical coordinate system; and edit the first VR image based on the second object.
  • the processor may be further configured to: change a first attribute of the second object based on the editing operation; and change a second attribute of the first object based on the changed first attribute of the second object, wherein the second attribute is different from the first attribute.
  • the first attribute may correspond to a size of an object, and the second attribute may correspond to a shape of an object.
  • in response to the editing operation comprising an operation for changing a position of the first object from a first position to a second position in the second VR image, the processor may be further configured to: move the second object to a third position in the spherical coordinate system corresponding to the second position in the second VR image, and change the shape of the first object according to the projection method so that the first object corresponds to the second object having the changed position.
  • a method of controlling a display apparatus comprising: receiving a first Virtual Reality (VR) image; obtaining a second VR image corresponding to an area of the first VR image by applying a projection method on the first VR image; overlaying a first object corresponding to an editing operation on the second VR image; obtaining a second object by inversely performing the projection method used for obtaining the second VR image on the first object, in order to project the first object as the second object in a spherical coordinate system; and editing the first VR image based on the second object.
  • the method may further comprise: in response to the editing comprising an operation for changing a first attribute of the first object, changing a first attribute of the second object based on the editing operation; and changing a second attribute of the first object based on the changed first attribute of the second object, wherein the second attribute is different from the first attribute.
  • a display apparatus may provide a user with an intuitive and convenient editing function by changing a shape of an object provided from an editing tool when a VR image is displayed.
  • FIG. 1A is a block diagram of a configuration of a display apparatus according to an exemplary embodiment
  • FIG. 1B is a block diagram of a detailed configuration of a display apparatus according to an exemplary embodiment
  • FIGS. 2A through 2D illustrate an example of a projection method according to an exemplary embodiment
  • FIGS. 3A through 3C illustrate a change in a size of an object according to an exemplary embodiment
  • FIGS. 4A through 4C illustrate a change in a position of an object according to an exemplary embodiment
  • FIG. 5 illustrates a type of an object according to an exemplary embodiment of the present disclosure
  • FIGS. 6A and 6B illustrate a method of changing a projection point according to an exemplary embodiment
  • FIGS. 7A through 7F illustrate a process of editing a Virtual Reality (VR) image according to an exemplary embodiment
  • FIG. 8 illustrates a screen that is being edited, according to an exemplary embodiment
  • FIG. 9 is a flowchart of a method of controlling a display apparatus according to an exemplary embodiment.
  • FIG. 1A is a block diagram of a configuration of a display apparatus 100 according to an exemplary embodiment
  • the display apparatus 100 includes a storage 110, a user interface 120, a display 130, and a processor 140.
  • the display apparatus 100 may be an apparatus that displays and edits an image or a video.
  • the display apparatus 100 may be realized as a notebook computer, a desktop personal computer (PC), a smartphone, or the like, and any apparatus that displays and edits an image or a video may be used as the display apparatus 100 without limitation.
  • the display apparatus 100 may be an apparatus that displays and edits a Virtual Reality (VR) image or video.
  • the VR image may be an image generated by combining a plurality of images captured from a plurality of different viewpoints and converting the combined image to a plane.
  • the VR image may be an image generated by capturing a plurality of images so as to include all directions based on a capturing person, stitching the plurality of captured images, and converting the stitched image to a plane.
  • the VR image is not limited thereto and thus may be generated by capturing a plurality of images so as to include only some directions rather than all directions.
  • if the plurality of captured images are stitched, a spherical VR image is generated, and an example of the spherical VR image is illustrated in FIG. 2A. Also, if the spherical VR image illustrated in FIG. 2A is converted through an equirectangular projection method, a VR image is generated, and an example of the VR image is illustrated in FIG. 2C.
  • a conversion of a spherical VR image into a planar VR image is referred to as a projection
  • a method of converting the spherical VR image into the planar VR image is referred to as a projection method.
  • The projection and the projection method will be described in detail later with reference to FIGS. 2A, 2B and 2C.
  • the display apparatus 100 may provide a function of displaying and editing a whole or a part of a VR image.
  • the storage 110 may store a VR image generated by combining a plurality of images captured from a plurality of different viewpoints and converting the combined image to a plane.
  • the VR image may be an image generated by an external apparatus rather than the display apparatus 100.
  • the display apparatus 100 may receive a VR image from an external apparatus and may store the VR image in the storage 110.
  • the display apparatus 100 may include a plurality of cameras, directly perform capturing by using the plurality of cameras, and generate a VR image by processing a plurality of captured images.
  • the user interface 120 may receive a user input.
  • the user interface 120 may receive a user input for displaying a VR image, a spherical VR image, or the like.
  • the user interface 120 may receive a user input for displaying a planar VR image corresponding to an area of a spherical VR image.
  • the user input may be an input that designates a projection point and a projection angle for designating an area.
  • the user input may also be an input that designates a projection method.
  • the user interface 120 may receive a user input for changing an area of a VR image that is currently being displayed.
  • the user interface 120 may receive a user input for editing a VR image that is being displayed.
  • the user interface 120 may receive a user input for executing an editing tool for editing a VR image.
  • the user interface 120 may also receive a user input for changing a size or a position of an object that is provided from an editing tool as the editing tool is executed.
  • the display 130 may display various types of contents under control of the processor 140.
  • the display 130 may display the VR image and the object provided from the editing tool.
  • the display 130 may also display, in real time, a VR image that is edited according to an execution of the editing tool.
  • the display 130 may be realized as a Liquid Crystal Display (LCD) panel, an Organic Light Emitting Diode (OLED), or the like but is not limited thereto.
  • the display 130 may also be realized as a flexible display, a transparent display, or the like.
  • the processor 140 controls an overall operation of the display apparatus 100.
  • the processor 140 may convert a VR image into a spherical VR image.
  • the VR image may be an image generated by converting a spherical VR image to a plane through a preset projection method and may be stored in the storage 110.
  • the processor 140 may generate a spherical VR image by inversely projecting a VR image according to a projection method used for generating a VR image. For example, if an equirectangular projection method is used when generating a VR image, the processor 140 may generate a spherical VR image by respectively mapping a width and a length of a VR image on θ and φ of a spherical coordinate system.
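  • As a rough illustration of that inverse mapping, the sketch below converts an equirectangular pixel position into the spherical angles θ and φ and then into a point on the unit sphere. It is a minimal sketch, not the patent's implementation; the function names and the exact angle conventions are assumptions.

```python
import numpy as np

def equirect_to_sphere(x, y, width, height):
    """Map an equirectangular pixel (x, y) to spherical angles.

    theta (the angle in the horizontal plane) spans [-pi, pi) across the
    image width; phi (the angle from the horizontal plane) spans
    [-pi/2, pi/2] across the image height.
    """
    theta = (x / width) * 2.0 * np.pi - np.pi
    phi = np.pi / 2.0 - (y / height) * np.pi
    return theta, phi

def sphere_to_unit_vector(theta, phi):
    """Convert spherical angles to a point on the unit sphere."""
    return np.array([
        np.cos(phi) * np.cos(theta),
        np.cos(phi) * np.sin(theta),
        np.sin(phi),
    ])

# e.g. the top-left pixel of a 3840x1920 VR image
print(equirect_to_sphere(0, 0, 3840, 1920))  # (-pi, pi/2): left edge, top of the sphere
```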
  • if another type of projection method is used when generating a VR image, the processor 140 may likewise generate a spherical VR image by inversely projecting the VR image according to that projection method.
  • the VR image may store information about a projection method.
  • the processor 140 may generate a spherical VR image based on the projection method stored in the VR image.
  • the processor 140 may determine a projection method used when generating a VR image by analyzing the VR image.
  • the processor 140 may generate a planar VR image corresponding to an area of the spherical VR image and control the display 130 to display the planar VR image.
  • the processor 140 may generate a planar VR image by projecting only an area of a spherical VR image.
  • alternatively, the processor 140 may generate a planar VR image by projecting the whole of a spherical VR image and cropping only the area of the VR image.
  • if a user input designating a projection point, a projection angle, and a projection method is received, the processor 140 may determine an area of the spherical VR image based on the projection point and the projection angle, and generate and display a planar VR image corresponding to the area based on the projection method.
  • the projection point may be a point of an area that a user wants to display on the spherical VR image.
  • the projection angle may be an angle that the area the user wants to display subtends at the center of the spherical VR image.
  • the area that the user wants to display may be a rectangular area.
  • the projection angle may include an angle formed by upper and lower edges of the rectangular area and the center of the spherical VR image and an angle formed by left and right edges of the rectangular area and the center of the spherical VR image.
  • the processor 140 may determine an area that the user wants to display by receiving only one of the two angles described above. For example, if an angle formed by left and right edges of an area that the user wants to display and the center of the spherical VR image is received, the processor 140 may determine the area that the user wants to display based on an aspect ratio of the display 130.
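  • A minimal sketch of how the remaining projection angle could be derived from the display's aspect ratio, assuming a rectilinear (pinhole) relation between the two angles; the patent does not fix a formula, so the function and its names are illustrative.

```python
import math

def vertical_fov(horizontal_fov_deg, aspect_ratio):
    """Derive the vertical projection angle from the horizontal one and the
    display aspect ratio (width / height), assuming a rectilinear relation."""
    half_h = math.radians(horizontal_fov_deg) / 2.0
    half_v = math.atan(math.tan(half_h) / aspect_ratio)
    return math.degrees(2.0 * half_v)

# e.g. a 90-degree horizontal angle on a 16:9 display
print(round(vertical_fov(90.0, 16 / 9), 1))  # 58.7
```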
  • the processor 140 may determine an area of the spherical VR image by using a projection point, a projection angle, and a projection method set by default. The processor 140 may also receive a user input for some of the projection point, the projection angle, and the projection method.
  • the processor 140 may overlay and display a first object provided from the editing tool on the planar VR image. For example, if an editing tool for adding a line onto a planar VR image is executed, the processor 140 may overlay and display a pen tool on the planar VR image.
  • the first object provided from the editing tool may include at least one selected from a tool Graphical User Interface (GUI) used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
  • the processor 140 may generate a second object by inversely performing the projection method used for generating the planar VR image on the first object, in order to project the first object as the second object in a spherical coordinate system.
  • the processor 140 may generate an edited spherical VR image based on a second object generated by inversely projecting the first object according to a projection method used for generating the planar VR image.
  • the processor 140 may generate the second object by inversely projecting the first object according to the equirectangular projection method.
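  • When the displayed area is instead produced by a rectilinear (pinhole) projection, which is one of the other projection methods mentioned in this disclosure, the inverse projection of a point of the first object could look like the sketch below. The function, its parameters, and the rotation conventions are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def planar_point_to_sphere(u, v, view_w, view_h, fov_x, center_theta, center_phi):
    """Map a pixel (u, v) of the displayed planar VR image back to spherical
    angles (theta, phi), assuming the displayed area is a rectilinear
    (pinhole) projection centered on the projection point."""
    # pixel -> normalized camera-plane coordinates
    f = (view_w / 2.0) / np.tan(fov_x / 2.0)              # focal length in pixels
    cam = np.array([1.0, (u - view_w / 2.0) / f, (view_h / 2.0 - v) / f])
    cam /= np.linalg.norm(cam)                            # viewing direction, +X forward

    # rotate so that +X points at the projection point (center_theta, center_phi)
    cp, sp = np.cos(center_phi), np.sin(center_phi)
    ct, st = np.cos(center_theta), np.sin(center_theta)
    pitch = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    yaw = np.array([[ct, -st, 0], [st, ct, 0], [0, 0, 1]])
    wx, wy, wz = yaw @ pitch @ cam

    theta = np.arctan2(wy, wx)                            # angle in the horizontal plane
    phi = np.arcsin(np.clip(wz, -1.0, 1.0))               # angle from the horizontal plane
    return theta, phi

# e.g. the center pixel of a 1280x720 view maps back to the projection point
print(planar_point_to_sphere(640, 360, 1280, 720, np.radians(90), 0.0, 0.0))  # (0.0, 0.0)
```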
  • the processor 140 may change a size of the second object in response to the user input, change a shape of the first object based on a projection method so as to enable the first object to correspond to the second object having the changed size, and display the first object having the changed shape on the planar VR image.
  • the projection method may be a projection method used for generating the planar VR image.
  • for example, if a user input for changing the size of the first object by 10 units is received, the processor 140 may change the size of the second object by 10 units.
  • however, the first object may not simply be enlarged by 10 units and displayed.
  • instead, the processor 140 may generate a first object whose shape is changed so as to correspond to the second object whose size is changed by 10 units, based on the projection method used for generating the planar VR image.
  • the size of the first object may not be simply changed by 10 units, but the shape of the first object may be changed according to at least one selected from a projection point, a projection angle, and a projection method.
  • the processor 140 may display the first object, of which the shape is changed, on the planar VR image.
  • accordingly, a user may perform editing while checking how the spherical VR image is edited, rather than merely how the planar VR image is edited.
  • if a user input for changing a position of the first object from a first position to a second position is received, the processor 140 may change a position of the second object to a third position corresponding to the second position on the spherical VR image, change a shape of the first object based on a projection method so as to enable the first object to correspond to the second object having the changed position, and display the first object having the changed shape in the second position on the planar VR image.
  • the projection method may be a projection method used for generating the planar VR image.
  • the processor 140 may change the position of the second object to the third position corresponding to the second position on the spherical VR image.
  • the third position corresponding to the second position may be determined based on the projection method used for generating the planar VR image.
  • the processor 140 may project the second object, of which the position is changed to the third position based on the projection method used for generating the planar VR image, onto a plane.
  • the processor 140 may generate the first object of which position is changed by projecting the second object of which position is changed.
  • the position of the first object may not be simply changed, but the shape of the first object may be changed according to at least one selected from a projection point, a projection angle, and a projection method.
  • the processor 140 may display the first object, of which the shape is changed, on the planar VR image.
  • the user may perform editing while checking, in real time, how the spherical VR image is edited rather than merely how the planar VR image is edited.
  • if a user input for changing the position of the first object from a first position to a fourth position in a preset area of the planar VR image is received, the processor 140 may generate and display a planar VR image corresponding to the fourth position.
  • for example, if the fourth position is a point on a left boundary of the planar VR image, the processor 140 may generate and display a planar VR image where the point of the left boundary is the projection point.
  • the processor 140 may overlay and display a lattice type guide GUI, which guides a position corresponding to a planar VR image on the spherical VR image, on the planar VR image.
  • the processor 140 may overlay and display a lattice type GUI corresponding to vertical and horizontal lines of a spherical VR image on a planar VR image.
  • the vertical and horizontal lines of the spherical VR image may respectively correspond to latitude and longitude.
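  • For an equirectangular planar VR image, those latitude and longitude lines fall on evenly spaced rows and columns, as in the sketch below. This is an assumption for the equirectangular case only; other projection methods would require projecting each line point individually, and the step size is illustrative.

```python
def lattice_guide_lines(width, height, step_deg=15):
    """Pixel rows (latitude lines) and columns (longitude lines) of the
    lattice guide GUI on an equirectangular planar VR image."""
    cols = [round(width * d / 360.0) for d in range(0, 360, step_deg)]
    rows = [round(height * d / 180.0) for d in range(0, 180, step_deg)]
    return rows, cols

rows, cols = lattice_guide_lines(3840, 1920)
print(rows[:3], cols[:3])  # [0, 160, 320] [0, 160, 320]
```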
  • the processor 140 may display a planar VR image corresponding to an area of an edited spherical VR image. For example, if an editing tool for adding a line onto a planar VR image is executed, the processor 140 may overlay and display a pen tool on the planar VR image. Also, if the pen tool is executed to add the line, the processor 140 may add the line onto the spherical VR image, convert the spherical VR image, onto which the line is added, into a planar VR image based on a projection method, and display the planar VR image.
  • FIG. 1B is a block diagram of a detailed configuration of the display apparatus 100, according to an exemplary embodiment.
  • the display apparatus 100 includes the storage 110, the user interface 120, the display 130, the processor 140, a communicator 150, an audio processor 160, a video processor 170, a speaker 180, a button 181, a camera 182, and a microphone 183.
  • the processor 140 controls an overall operation of the display apparatus 100 by using various types of programs stored in the storage 110.
  • the processor 140 includes a Random Access Memory (RAM) 141, a Read Only Memory (ROM) 142, a main Central Processing Unit (CPU) 143, a graphic processor 144, first through nth interfaces 145-1 through 145-n, and a bus 146.
  • the RAM 141, the ROM 142, the main CPU 143, the graphic processor 144, the first through nth interfaces 145-1 through 145-n, and the like may be connected to one another through the bus 146.
  • the first through nth interfaces 145-1 through 145-n are connected to various types of elements described above.
  • One of the interfaces may be a network interface that is connected to an external apparatus through a network.
  • the main CPU 143 performs booting by using an Operating System (O/S) stored in the storage 110 by accessing the storage 110.
  • the main CPU 143 also performs various types of operations by using various types of programs and the like stored in the storage 110.
  • a command set and the like for booting a system are stored in the ROM 142. If power is supplied by inputting a turn-on command, the main CPU 143 boots the system by copying the O/S stored in the storage 110 into the RAM 141 according to a command stored in the ROM 142 and executing the O/S. If the system is completely booted, the main CPU 143 performs various types of operations by copying various types of application programs stored in the storage 110 into the RAM 141 and executing the application programs copied into the RAM 141.
  • the graphic processor 144 generates a screen including various types of objects including an icon, an image, a text, and the like by using an operator (not shown) and a renderer (not shown).
  • the operator calculates attribute values such as coordinate values, shapes, sizes, colors, and the like at which objects will be displayed according to a layout of the screen based on a received control command.
  • the renderer generates a screen having various types of layouts including an object based on the attribute values calculated by the operator.
  • the screen generated by the renderer is displayed in a display area of the display 130.
  • the above-described operation of the processor 140 may be performed by a program stored in the storage 110.
  • the storage 110 stores various types of data such as an O/S software module for driving the display apparatus 100, a projection method module, an image editing module, and the like.
  • the processor 140 may display a VR image and provide an editing tool based on information stored in the storage 110.
  • the user interface 120 receives various types of user interactions.
  • the user interface 120 may be realized as various types according to various exemplary embodiments of the display apparatus 100.
  • the display apparatus 100 may be a notebook computer, a desktop PC, or the like, and the user interface 120 may be a receiver or the like for receiving an input signal from a keyboard or a mouse for interfacing with the notebook computer, the desktop PC, or the like.
  • the display apparatus 100 may be a touch-based electronic device, and the user interface 120 may be a touch screen type that forms an interactive layer structure with a touch pad for interfacing with the touch-based electronic device.
  • the user interface 120 may be used as the display 130 described above.
  • the communicator 150 is an element that performs communications with various types of external apparatuses according to various types of communication methods.
  • the communicator 150 includes a Wireless Fidelity (WiFi) chip 151, a Bluetooth chip 152, a wireless communication chip 153, a Near Field Communication (NFC) chip 154, and the like.
  • the processor 140 performs communications with various types of external apparatuses by using the communicator 150.
  • the WiFi chip 151 and the Bluetooth chip 152 respectively perform communications according to a WiFi method and a Bluetooth method. If the WiFi chip 151 and the Bluetooth chip 152 are used, various types of information may be transmitted and received by transmitting and receiving various types of connection information such as a Service Set Identifier (SSID), a session key, and the like and connecting communications by using the various types of connection information.
  • the wireless communication chip 153 refers to a chip that performs communications according to various types of communication standards such as Institute of Electrical and Electronics Engineers (IEEE), Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), and the like.
  • the NFC chip 154 refers to a chip that operates according to an NFC method using a band of 13.56 MHz among various types of Radio Frequency Identification (RFID) frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860–960 MHz, 2.45 GHz, and the like.
  • the communicator 150 may perform a unidirectional or bidirectional communication with an external apparatus. If the communicator 150 performs the unidirectional communication with the external apparatus, the communicator 150 may receive a signal from the external apparatus. If the communicator 150 performs the bidirectional communication with the external apparatus, the communicator 150 may receive a signal from the external apparatus and may transmit a signal to the external apparatus.
  • the audio processor 160 is an element that performs processing with respect to audio data.
  • the audio processor 160 may perform various types of processing, such as decoding, amplifying, noise filtering, and the like, with respect to the audio data.
  • the video processor 170 is an element that performs processing with respect to video data.
  • the video processor 170 may perform various types of image processing, such as decoding, scaling, noise filtering, frame rate converting, resolution converting, and the like, with respect to the video data.
  • the speaker 180 is an element that outputs various types of audio data, various types of notification sounds, voice messages, and the like processed by the audio processor 160.
  • the button 181 may be any of various types of buttons such as a mechanical button, a touch pad, a wheel, and the like that are formed in an arbitrary area of a front part, a side part, a back part, or the like of an external appearance of a main body of the display apparatus 100.
  • the camera 182 is an element that captures a still image or a moving picture image under control of the user.
  • the camera 182 may be realized as a plurality of cameras including a front camera, a back camera, and the like.
  • the microphone 183 is an element that receives a user voice or other sounds and converts the user voice or the other sounds into audio data.
  • the display apparatus 100 may further include a Universal Serial Bus (USB) port through which a USB connector may be connected to the display apparatus 100, various types of external input ports for connecting to external terminals such as a headset, a mouse, a Local Area Network (LAN), and the like, a Digital Multimedia Broadcasting (DMB) chip that receives and processes a DMB signal, various types of sensors, and the like.
  • FIGS. 2A through 2D illustrate an example of a projection method according to an exemplary embodiment.
  • FIG. 2A illustrates an example of a spherical VR image.
  • FIG. 2C illustrates a VR image generated by converting the spherical VR image of FIG. 2A to a plane based on an equirectangular projection method.
  • FIG. 2B illustrates an exemplary representation of the spherical VR image in FIG. 2A.
  • FIG. 2B illustrates an example of a central point O and a projection point P0 of the spherical VR image.
  • θ of a spherical coordinate system denotes an angle formed between a straight line going from the central point O to the projection point P0 and a straight line going from the central point O to a first point P1 on a horizontal plane. If the projection point P0 and the first point P1 are not on the horizontal plane, an angle may be determined based on two points on the horizontal plane onto which the projection point P0 and the first point P1 are respectively projected.
  • the horizontal plane may serve as a basis for unrolling a spherical VR image onto a plane and may be set in another direction.
  • the horizontal plane may be set so as to be orthogonal to a horizontal plane of FIG. 2B.
  • the processor 140 may determine the horizontal plane based on the projection point P0.
  • φ of the spherical coordinate system may be an angle formed between a straight line going from the central point O to a second point P2 and the horizontal plane.
  • the processor 140 may generate a VR image by converting a spherical VR image to a plane based on a correspondence relation between θ and φ of the spherical coordinate system and x and y of an orthogonal coordinate system.
  • the correspondence relation may depend on a projection method.
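  • For the equirectangular case, the correspondence relation can be written as the forward mapping below; a sketch in which θ is the angle in the horizontal plane, φ is the angle from it (matching FIG. 2B), and the pixel conventions are assumptions.

```python
import math

def sphere_to_equirect(theta, phi, width, height):
    """Equirectangular correspondence between (theta, phi) and (x, y):
    theta in [-pi, pi) maps linearly to x, phi in [-pi/2, pi/2] to y."""
    x = (theta + math.pi) / (2.0 * math.pi) * width
    y = (math.pi / 2.0 - phi) / math.pi * height
    return x, y

# e.g. the direction theta=0, phi=0 lands at the image center
print(sphere_to_equirect(0.0, 0.0, 3840, 1920))  # (1920.0, 960.0)
```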
  • shapes of circular dots displayed on the spherical VR image of FIG. 2A may be changed as the spherical VR image is projected onto a plane.
  • the shapes of the circular dots illustrated in FIG. 2A may be changed into elliptical shapes as the locations of the circular dots are closer to the upper and lower regions of the VR image of FIG. 2C.
  • This is a problem that occurs because the spherical VR image is represented on a rectangular plane, and the distortion becomes more severe as the locations of the circular dots are closer to the upper and lower regions of FIG. 2C.
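  • In the equirectangular case this distortion can be quantified: a small circular dot at latitude φ is stretched horizontally by roughly 1/cos(φ), as in the sketch below (an illustrative calculation, not taken from the patent).

```python
import math

def equirect_stretch_factor(phi_deg):
    """Horizontal stretch of a small circular dot at latitude phi when the
    sphere is unrolled with the equirectangular method."""
    return 1.0 / math.cos(math.radians(phi_deg))

for lat in (0, 30, 60, 80):
    print(lat, round(equirect_stretch_factor(lat), 2))  # 1.0, 1.15, 2.0, 5.76
```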
  • if a different projection method is used, the area where a distortion occurs may be changed.
  • The equirectangular projection method is illustrated in FIGS. 2A and 2B, but the present disclosure is not limited thereto.
  • a spherical VR image may be converted into a VR image by using various types of projection methods such as rectilinear, cylindrical, Mercator, stereographic, pannini, and ours projection methods, and the like.
  • An example of a VR image converted to a plane through various types of projection methods is illustrated in FIG. 2C.
  • FIGS. 3A through 3C illustrate a change in a size of an object according to an exemplary embodiment of the present disclosure.
  • the processor 140 may overlay and display a first object provided from the editing tool on the planar VR image.
  • the processor 140 may overlay and display a sticker having a preset shape on the planar VR image.
  • a sticker may be an arrow, an emoticon, or the like, selected from a GUI editing tool.
  • the processor 140 may also receive a user input for changing a size of the first object.
  • the processor 140 may receive a user input for changing the size of the first object from a first size 310-1 to a second size 310-2.
  • the processor 140 may change a size of a second object in response to the user input as shown in FIG. 3B.
  • the second object may be an object that is positioned on a spherical coordinate system and corresponds to the first object.
  • the processor 140 may generate a second object by inversely converting a first object based on a projection method.
  • the processor 140 may change a size of the second object by a difference d between the first size 310-1 and the second size 310-2.
  • the processor 140 may change the size of the second object from a third size 320-1 to a fourth size 320-2 on a second layer.
  • the processor 140 may change the size of the second object from the third size 320-1 to the fourth size 320-2 on the second layer according to the difference d.
  • a shape of the second object may not be changed, but merely the size of the second object may be changed.
  • the present disclosure is not limited thereto, and thus if a user input for changing the size of the first object from the first size 310-1 to the second size 310-2 is received, the processor 140 may calculate a plurality of coordinates corresponding to a plurality of vertexes of the second size on a spherical coordinate system and change the size of the second object in response to the plurality of calculated coordinates. In this case, the shape of the second object may be changed.
  • the second object and the spherical VR image may be respectively included on different layers.
  • the spherical VR image may be included on a first layer, and the second object may be included on a second layer.
  • the spherical VR image may not be changed.
  • the processor 140 may generate a first object, of which shape is changed, by converting a second object, of which a size is changed, to a plane based on a preset projection method.
  • the processor 140 may project a layer including the second object onto a plane.
  • the processor 140 may project a layer including a second object onto a plane according to a projection point, a projection angle, and a projection method used when projecting a spherical VR image onto a planar VR image.
  • an area may be distorted in a projection process. As the distortion occurs, a size of a first object may not be simply changed, but a shape of the first object may be distorted. As shown in FIG. 3C, the processor 140 may display a first object 330, of which shape is changed, on a planar VR image.
  • the processor 140 may generate an edited spherical VR image by merging a first layer including the spherical VR image with a second layer including the first object 330.
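  • A minimal sketch of such a layer merge, assuming both the spherical VR image and the overlay layer are stored in equirectangular form of the same size and the overlay carries an alpha channel; the array names and the alpha-over rule are assumptions, not taken from the patent.

```python
import numpy as np

def merge_layers(base_equirect, overlay_rgba):
    """Alpha-over merge of an overlay layer onto the spherical VR image,
    with both layers stored in equirectangular form of the same size."""
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    rgb = overlay_rgba[..., :3].astype(np.float32)
    merged = rgb * alpha + base_equirect.astype(np.float32) * (1.0 - alpha)
    return merged.astype(base_equirect.dtype)

# e.g. a fully transparent overlay leaves the base image unchanged
base = np.zeros((1920, 3840, 3), dtype=np.uint8)
overlay = np.zeros((1920, 3840, 4), dtype=np.uint8)
assert np.array_equal(merge_layers(base, overlay), base)
```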
  • the processor 140 may display a planar VR image corresponding to an area of the edited spherical VR image.
  • the processor 140 may overlay and display a lattice type guide GUI, which guides a position corresponding to the planar VR image on the spherical VR image, on the planar VR image.
  • the lattice type guide GUI may correspond to vertical and horizontal lines of the spherical VR image.
  • a distance between the vertical and horizontal lines may be preset. Alternatively, the distance may be changed under control of the user.
  • FIGS. 4A through 4C illustrate a change in a position of an object according to an exemplary embodiment of the present disclosure.
  • the processor 140 may overlay and display a first object provided from the editing tool on the planar VR image.
  • the processor 140 may also receive a user input for changing a position of the first object.
  • the processor 140 may receive a user input for changing the position of the first object from a first position 410-1 to a second position 410-2.
  • the processor 140 may change a position of a second object in response to the user input.
  • the second object may be an object that is positioned on a spherical coordinate system and corresponds to the first object.
  • the processor 140 may generate the second object by inversely converting the first object based on a projection method.
  • the processor 140 may change the position of the second object by a difference d between the first position 410-1 and the second position 410-2.
  • the processor 140 may change the position of the second object from a third position 420-1 to a fourth position 420-2 on a second layer.
  • the processor 140 may change the position of the second object from the third position 420-1 to the fourth position 420-2 on the second layer according to a distance d.
  • a shape of the second object may not be changed, but merely the position of the second object may be changed.
  • the present disclosure is not limited thereto, and thus if a user input for changing the position of the first object from the first position 410-1 to the second position 410-2 is received, the processor 140 may calculate a plurality of coordinates corresponding to a plurality of vertexes of the second position 410-2 on a spherical coordinate system and change the position of the second object in response to the plurality of calculated coordinates. In this case, the shape of the second object may be changed.
  • the second object and the spherical VR image may be respectively included on different layers.
  • the spherical VR image may be included on a first layer, and the second object may be included on a second layer.
  • the spherical VR image may not be changed.
  • the processor 140 may generate a first object, of which shape is changed, by converting a second object, of which position is changed, to a plane based on a preset projection method.
  • the processor 140 may project a layer including the second object onto a plane.
  • the processor 140 may project the layer including the second object according to a projection point, a projection angle, and a projection method used when projecting the spherical VR image onto the planar VR image.
  • an area may be distorted in a projection process. As the distortion occurs, the position of the first object may not be simply changed, but the shape of the first object may also be distorted. As shown in FIG. 4C, the processor 140 may display a first object 430 having a changed shape on a planar VR image.
  • the processor 140 may generate an edited spherical VR image by merging a first layer including the spherical VR image with a second layer including the first object 430.
  • the processor 140 may display a planar VR image corresponding to an area of the edited spherical VR image.
  • the processor 140 may overlay and display a lattice type guide GUI, which guides a position corresponding to the planar VR image on the spherical VR image, on the planar VR image.
  • the lattice type guide GUI may correspond to vertical and horizontal lines of the spherical VR image.
  • a distance between the vertical and horizontal lines may be preset. Alternatively, the distance may be changed under control of the user.
  • FIG. 5 illustrates a type of an object according to an exemplary embodiment.
  • the processor 140 may insert an image.
  • the processor 140 may change a shape of an image by using a method as described with reference to FIGS. 3A, 3B, 3C, 4A, 4B and 4C and display an image 510, of which shape is changed, on a planar VR image.
  • although the inserted image originally has a rectangular shape, the processor 140 may generate the image 510 of which the shape is changed and then display the image 510 having the changed shape on the planar VR image.
  • the processor 140 may also apply a filter to a boundary area between the image 510 having the changed shape and the planar VR image.
  • an object provided from an editing tool may include at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
  • the processor 140 may display an image, a pen, a paint, an eraser, a sticker, a text box, a moving picture image, a filter, and the like on the planar VR image according to the same method.
  • FIGS. 6A and 6B illustrate a method of changing a projection point according to an exemplary embodiment.
  • the processor 140 may overlay and display a first object on a planar VR image. If a user input for changing a position of the first object from a first position 610-1 to a second position 610-2 in a preset area of the planar VR image is received, the processor 140 may generate and display a planar VR image corresponding to the second position 610-2.
  • the processor 140 may change a projection point to a left side. If a user input for moving the first object to a left boundary is received, the processor 140 may determine that the user intends to move the object to an area other than the currently displayed planar VR image and change the displayed area.
  • the processor 140 may generate and display a planar VR image corresponding to the second position 610-2.
  • the processor 140 may change a projection point so as to display a second object in a center.
  • a building positioned in the center may be displayed to a right side due to the change in the projection point.
  • the present disclosure is not limited thereto, and thus if a user input for moving a first object to a left boundary is received, the processor 140 may change a projection point to a preset projection point. Alternatively, the processor 140 may determine a new projection point based on at least one selected from an existing projection point, a projection angle, and a changed position of the first object.
  • the processor 140 may change the projection angle. For example, if the projection point is changed, the processor 140 may change the projection angle so as to enable the projection angle to be larger in order to easily search for an area of a VR image.
  • the processor 140 may enlarge the projection angle and display the image accordingly.
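  • The sketch below shows one way this boundary behaviour could be expressed: when the dragged object enters a margin near the left or right boundary of the planar view, the projection point is shifted so that the neighbouring area becomes visible. The margin and the shift amount are assumptions, and the names are illustrative.

```python
def maybe_shift_projection_point(obj_x, view_w, center_theta, fov_x, margin=0.05):
    """Shift the projection point when the dragged object reaches a boundary
    region of the planar VR image, so the neighbouring area becomes visible."""
    if obj_x < view_w * margin:             # near the left boundary
        return center_theta - fov_x / 2.0
    if obj_x > view_w * (1.0 - margin):     # near the right boundary
        return center_theta + fov_x / 2.0
    return center_theta                     # object is not near a boundary

# e.g. dragging to the far left of a 1280-pixel-wide view turns the view 45 degrees left
print(maybe_shift_projection_point(10, 1280, 0.0, 1.5708))  # -0.7854 rad
```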
  • FIGS. 7A through 7F illustrate a process of editing a VR image according to an exemplary embodiment.
  • the processor 140 may display a VR image generated by converting a spherical VR image to a plane based on a preset projection method.
  • the processor 140 may display an area of the VR image converted to the plane.
  • the processor 140 may receive projection parameters such as a projection point, a projection angle, a projection method, and the like from the user and display an area of the VR image converted to the plane.
  • the processor 140 may also change an image, which is being displayed, by receiving projection parameters from the user in real time.
  • the processor 140 may display a whole or an area of the VR image converted to the plane according to a user input.
  • the processor 140 may change and display a projection point in real time according to a user input.
  • the projection point of FIG. 7C is moved further to the left than the projection point of FIG. 7B.
  • the processor 140 may change and display a projection angle in real time according to a user input.
  • the projection angle of FIG. 7D is smaller than the projection angle of FIG. 7C.
  • the user may enlarge an image, which is being displayed, by reducing a projection angle or may reduce an image, which is being displayed, by enlarging the projection angle.
  • the processor 140 may display an image by enlarging or reducing the image without changing a projection angle.
  • the processor 140 may also display the image by changing a projection point, a projection angle, and a projection method in real time.
  • the processor 140 may display an edited VR image in real time.
  • the processor 140 may display an editing state of the image that is being displayed, or may display the whole of a completely edited VR image as shown in FIG. 7F.
  • the processor 140 may move the projection point or enlarge and reduce the image while maintaining existing editing contents. For example, although a projection point is moved as shown in FIG. 7C or an image is enlarged as shown in FIG. 7D, the processor 140 may maintain existing editing contents.
  • the processor 140 may maintain the existing editing contents.
  • FIG. 8 illustrates a screen that is being edited according to an exemplary embodiment.
  • the processor 140 may display an area of a VR image on a whole screen or may reduce and display a whole VR image 810 in an area of the whole screen.
  • the processor 140 may display an editing result 820-1 of the area in real time.
  • the processor 140 may also display an editing result 820-2 of the whole VR image 810 that is reduced and displayed. Through this operation, the user may edit an area of an image and check how a whole area of the image is edited.
  • FIG. 9 is a flowchart of a method of controlling a display apparatus according to an exemplary embodiment.
  • the display apparatus converts a VR image into a spherical VR image.
  • the VR image is a planar VR image generated by combining a plurality of images captured from a plurality of different viewpoints.
  • alternatively, the spherical VR image may be received from a storage, in which case the conversion operation S910 by the display apparatus may be omitted.
  • the display apparatus generates and displays a planar VR image corresponding to an area of the spherical VR image.
  • the display apparatus overlays and displays a first object provided from the editing tool on the planar VR image.
  • the display apparatus generates a second object by inversely performing the projection method used for generating the planar VR image on the first object, in order to project the first object as the second object in a spherical coordinate system.
  • the display apparatus edits the spherical VR image based on the second object.
  • the method may further include, if a user input for changing a size of the first object is received, changing a size of the second object in response to the user input, and changing a shape of the first object based on the projection method so as to enable the first object to correspond to the second object having the changed size and displaying the first object having the changed shape on the planar VR image.
  • the method may further include, if a user input for changing a position of the first object from a first position to a second position is received, changing a position of the second object to a third position corresponding to the second position on the spherical VR image, and changing a shape of the first object based on a projection method so as to enable the first object to correspond to the second object having the changed position and displaying the first object having the changed shape on the planar VR image.
  • Operation 920 may further include, if a user input for a projection point, a projection angle, and a projection method is received, determining an area of the spherical VR image based on the projection point and the projection angle, and generating and displaying the planar VR image corresponding to the area based on the projection method.
  • the method may further include, if a user input for changing the position of the first object from the first position to a fourth position in a preset area of the planar VR image is received, generating and displaying a planar VR image corresponding to the fourth position.
  • the method may further include overlaying and displaying a lattice type guide GUI, which guides a position corresponding to the planar VR image on the spherical VR image, on the planar VR image.
  • the method may further include displaying a planar VR image corresponding to an area of an edited spherical VR image.
  • the first object provided from the editing tool may include at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
  • a display apparatus may provide a user with an intuitive and convenient editing function by changing a shape of an object provided from an editing tool when a VR image is displayed.
  • an image has been mainly described above, but the same method may be applied with respect to each frame of a moving picture image.
  • the user may edit each frame and may perform the same editing with respect to frames displayed for a preset time.
  • Methods according to various exemplary embodiments of the present disclosure described above may be embodied as an application type that may be installed in an existing electronic device.
  • the elements, components, methods or operations described herein may be implemented using hardware components, software components, or a combination thereof.
  • the hardware components may include a processing device.
  • the display apparatus may include a processing device, such as the image processor or the controller, that may be implemented using one or more general-purpose or special purpose computers, such as, for example, a hardware processor, a CPU, a hardware controller, an ALU, a DSP, a microcomputer, an FPGA, a PLU, a microprocessor or any other device capable of responding to and executing instructions in a defined manner.
  • the processing device may run an operating system (OS) and one or more software applications that run on the OS.
  • the processing device also may access, store, manipulate, process, and create data in response to execution of the software.
  • a processing device may include multiple processing elements and multiple types of processing elements.
  • a processing device may include multiple processors or a processor and a controller.
  • different processing configurations are possible, such as parallel processors.
  • the various exemplary embodiments described above may be embodied as software including instructions stored in machine-readable storage media (e.g., computer-readable storage media).
  • a device may be an apparatus that calls an instruction from a storage medium, may operate according to the called instruction, and may include an electronic device (e.g., an electronic device A) according to disclosed exemplary embodiments. If the instruction is executed by a processor, the processor may directly perform a function corresponding to the instruction, or the function may be performed by using other types of elements under control of the processor.
  • the instruction may include codes generated or executed by a compiler or an interpreter.
  • a machine-readable storage medium may be provided in the form of a non-transitory storage medium.
  • the term “non-transitory” means that the storage medium is tangible and does not include a signal, but it does not distinguish between semi-permanent and temporary storage of data in the storage medium.
  • a method according to various exemplary embodiments described above may be included and provided in a computer program product.
  • the computer program product may be transacted as a product between a seller and a buyer.
  • the computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)) or may be distributed online through an application store (e.g., Play Store TM). If the computer program product is distributed online, at least a part of the computer program product may be at least temporarily generated in a storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
  • exemplary embodiments described above may be embodied in a recording medium readable by a computer or a similar apparatus by using software, hardware, or a combination thereof.
  • exemplary embodiments described herein may be embodied by the processor itself.
  • exemplary embodiments such as processes and functions described herein may be embodied as additional software modules.
  • the software modules may perform one or more of the functions and operations described herein.
  • Computer instructions for performing a processing operation of a device according to the above-described various exemplary embodiments may be stored in a non-transitory computer-readable medium.
  • when executed by a processor of a particular device, the computer instructions stored in the non-transitory computer-readable medium cause the particular device to perform processing operations according to the above-described exemplary embodiments.
  • the non-transitory computer readable medium is a medium which stores data semi-permanently and is readable by devices, rather than a medium which stores data temporarily, such as a register, a cache, or a memory.
  • the aforementioned applications or programs may be stored in the non-transitory computer readable media such as compact disks (CDs), digital video disks (DVDs), hard disks, Blu-ray disks, universal serial buses (USBs), memory cards, and read-only memory (ROM).
  • Each of elements according to the above-described various exemplary embodiments may include a single entity or a plurality of entities, and some of corresponding sub elements described above may be omitted or other types of sub elements may be further included in the various exemplary embodiments.
  • Operations performed by modules, programs, or other types of elements according to the various exemplary embodiments may be sequentially, in parallel, or heuristically executed or at least some operations may be executed in different sequences or may be omitted, or other types of operations may be added.
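As a rough end-to-end sketch of the flow of FIG. 9 referenced above, the following Python example converts an equirectangular VR image to a spherical representation, inversely projects a first object drawn on the planar image to obtain a second object, and edits the spherical representation based on that second object. The equirectangular assumption, the pixel-stamping edit, and all names are illustrative and not part of the disclosure.

```python
import numpy as np

def plane_to_sphere(x, y, width, height):
    """Planar (equirectangular) pixel coordinates -> spherical angles (phi, theta)."""
    phi = (x / width - 0.5) * 2.0 * np.pi          # longitude, -pi..pi across the width
    theta = (0.5 - y / height) * np.pi             # latitude, pi/2..-pi/2 down the height
    return phi, theta

def sphere_to_plane(phi, theta, width, height):
    """Spherical angles -> planar (equirectangular) pixel coordinates."""
    x = (phi / (2.0 * np.pi) + 0.5) * width
    y = (0.5 - theta / np.pi) * height
    return x, y

def edit_vr_image(planar_vr, first_object_xy, color=(255, 0, 0)):
    """Walk through the steps of the control method on an equirectangular image:
    convert to a spherical representation, inversely project the first object to
    obtain the second object, and edit the spherical VR image based on it."""
    height, width = planar_vr.shape[:2]

    # For an equirectangular input the pixel grid already indexes (phi, theta),
    # so the spherical VR image is kept as a copy of the same array here.
    spherical_vr = planar_vr.copy()

    # Inverse projection of the first object's points yields the second object.
    second_object = [plane_to_sphere(x, y, width, height) for x, y in first_object_xy]

    # Edit the spherical VR image based on the second object (here: stamp pixels).
    for phi, theta in second_object:
        x, y = sphere_to_plane(phi, theta, width, height)
        spherical_vr[int(y) % height, int(x) % width] = color
    return spherical_vr
```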

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Geometry (AREA)

Abstract

A display apparatus is provided. The display apparatus includes a storage that stores a Virtual Reality (VR) image and a processor that converts the VR image into a spherical VR image, generates a planar VR image corresponding to an area of the spherical VR image, and controls a display to display the planar VR image.

Description

DISPLAY APPARATUS AND CONTROL METHOD THEREOF
Apparatuses and methods consistent with the present disclosure relate to a display apparatus and a control method thereof, and more particularly, to a display apparatus for editing a Virtual Reality (VR) image generated by combining a plurality of images captured from a plurality of different viewpoints and converting the combined image to a planar image, and a control method thereof.
Various types of personal capturing devices have been launched with increasing interest in Virtual Reality (VR). As a result, the number of personal contents has exponentially increased, and thus consumer demand for editing these contents has increased.
However, current VR image editing software tools are mainly intended for stitching and thus do not support editing such as drawing a picture on a 360° image with a pen, inserting text into the 360° image, or the like.
An existing editing tool, such as a photo editing app of a smartphone or Photoshop, may be used for performing this editing but does not provide a function dedicated to a 360° image. In other words, the existing editing tool performs editing on a VR image generated by projecting a spherical VR image onto a plane. In this case, editing may not be performed as the user intends, due to a distortion occurring in the process of projecting the spherical VR image onto the plane.
Therefore, there is a need for a method of performing editing without distortion by checking the distortion in real time.
Exemplary embodiments of the present disclosure overcome the above disadvantages and other disadvantages not described above. Also, the present disclosure is not required to overcome the disadvantages described above, and an exemplary embodiment of the present disclosure may not overcome any of the problems described above.
The present disclosure provides a display apparatus for performing intuitive editing when a Virtual Reality (VR) image is displayed, and a control method thereof.
According to an aspect of an exemplary embodiment, there is provided a display apparatus comprising: a storage configured to store a Virtual Reality (VR) image; a user interface; a display; a processor configured to: convert the VR image into a spherical VR image, obtain a planar VR image corresponding to an area of the spherical VR image according to a projection method, control the display to display the planar VR image, receive a user input, through the user interface, to select an editing tool for performing an editing operation on the planar VR image, in response to the editing operation, overlay a first object corresponding to the editing operation on the planar VR image and control the display to display the first object overlaid on the planar VR image, obtain a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system and edit the spherical VR image based on the second object.
In response to the user input comprising an operation for changing a size of the first object, the processor may be further configured to: change a size of the second object based on the user input, change a shape of the first object based on the second object of which the size is changed according to the projection method, and control the display to display the first object having the changed shape on the planar VR image.
In response to the user input comprising an operation for changing a position of the first object from a first position to a second position in the planar VR image, the processor may be further configured to: move the second object to a third position in the spherical coordinate system corresponding to the second position in the planar VR image, change the shape of the first object based on the inversely performed projection method so that the first object corresponds to the second object having the changed position, and control the display to display the first object having the changed shape in the second position on the planar VR image.
In response to a user input comprising an operation for a projection point, a projection angle, and the projection method, the processor may be further configured to: identify the area of the spherical VR image based on the projection point and the projection angle, obtain and control the display to display a planar VR image corresponding to the area based on the projection method.
In response to a user input comprising an operation for changing a position of the first object from a first position to a fourth position in a preset area of the planar VR image, the processor may be further configured to obtain and control the display to display a planar VR image corresponding to the fourth position.
The processor may be further configured to overlay a lattice type guide graphical user interface (GUI) on the planar VR image and control the display to display the lattice type guide GUI overlaid on the planar VR image, and wherein the lattice type guide GUI guides a position corresponding to the planar VR image on the spherical VR image.
The processor may display a plane VR image corresponding to the area of the edited spherical VR image.
The planar VR image may be obtained by converting a combined image, which is obtained by combining a plurality of images captured from a plurality of different viewpoints, to a plane image.
The first object provided from the editing tool may comprise at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
According to an aspect of an exemplary embodiment, there is provided a method of controlling a display apparatus, the method comprising: converting a VR image into a spherical VR image; obtaining a planar VR image corresponding to an area of the spherical VR image according to a projection method; displaying the planar VR image; receiving a user input to select an editing tool for performing an editing operation on the planar VR image; in response to the editing operation, overlaying a first object corresponding to the editing operation on the planar VR image; displaying the first object overlaid on the planar VR image; obtaining a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system; and editing the spherical VR image based on the second object.
The method may further comprise: in response to the user input comprising an operation for changing a size of the first object being received, changing a size of the second object based on the user input; and changing a shape of the first object based on the second object of which the size is changed according to the projection method, and displaying the first object having the changed shape on the planar VR image.
The method may further comprise: in response to the user input comprising an operation for changing a position of the first object from a first position to a second position in the planar VR image, moving the second object to a third position in the spherical coordinate system corresponding to the second position in the planar VR image; and changing a shape of the first object based on the projection method so that the first object corresponds to the second object having the changed position, and displaying the first object having the changed shape in the second position on the planar VR image.
The displaying of the planar VR image may comprise: in response to the user input comprising an operation for a projection point, a projection angle, and the projection method being received, identifying the area of the spherical VR image based on the projection point and the projection angle; and obtaining and displaying a planar VR image corresponding to the area based on the projection method.
The method may further comprise: in response to the user input comprising an operation for changing a position of the first object from a first position to a fourth position in a preset area of the planar VR image being received, obtaining and displaying a planar VR image corresponding to the fourth position.
The method may further comprise: overlaying a lattice type guide graphical user interface (GUI) on the planar VR image and displaying the lattice type guide GUI overlaid on the planar VR image, wherein the lattice type guide GUI guides a position corresponding to the planar VR image on the spherical VR image.
The method may further comprise: displaying a plane VR image corresponding to the area of the edited spherical VR image.
The planar VR image may be obtained by converting a combined image, which is obtained by combining a plurality of images captured from a plurality of different viewpoints, to a plane image.
The first object provided from the editing tool may comprise at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
According to an aspect of an exemplary embodiment, there is provided a non-transitory recording medium storing a program for performing an operation method of a display apparatus, the operation method comprising: converting a VR image into a spherical VR image; obtaining a planar VR image corresponding to an area of the spherical VR image according to a projection method; displaying the planar VR image; receiving a user input to select an editing tool for performing an editing operation on the planar VR image; in response to the editing operation, overlaying a first object corresponding to the editing operation on the planar VR image; displaying the first object overlaid on the planar VR image; obtaining a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system; and editing the spherical VR image based on the second object.
According to an aspect of an exemplary embodiment, there is provided a display apparatus comprising: a processor configured to: receive a first Virtual Reality (VR) image; obtain a second VR image corresponding to an area of the first VR image by applying a projection method on the first VR image; overlay a first object corresponding to an editing operation on the second VR image; obtain a second object by inversely performing the projection method used for obtaining the second VR image on the first object, in order to project the first object as the second object in a spherical coordinate system; and edit the first VR image based on the second object.
In response to the editing comprising an operation for changing a first attribute of the first object, the processor may be further configured to: change a first attribute of the second object based on the editing operation; and change a second attribute of the first object based on the changed first attribute of the second object, wherein the second attribute is different from the first attribute.
The first attribute may correspond to a size of an object, and the second attribute may correspond to a shape of an object.
In response to the editing comprising an operation for changing a position of the first object from a first position to a second position, the processor may be further configured to: move the second object to a third position in the spherical coordinate system corresponding to the second position in the second VR image, change the shape of the first object according to the projection method so that the first object corresponds to the second object having the changed position.
According to an aspect of an exemplary embodiment, there is provided a method of controlling a display apparatus, the method comprising: receiving a first Virtual Reality (VR) image; obtaining a second VR image corresponding to an area of the first VR image by applying a projection method on the first VR image; overlaying a first object corresponding to an editing operation on the second VR image; obtaining a second object by inversely performing the projection method used for obtaining the second VR image on the first object, in order to project the first object as the second object in a spherical coordinate system; and editing the first VR image based on the second object.
The method may further comprise: in response to the editing comprising an operation for changing a first attribute of the first object, changing a first attribute of the second object based on the editing operation; and changing a second attribute of the first object based on the changed first attribute of the second object, wherein the second attribute is different from the first attribute.
According to various exemplary embodiments of the present disclosure, a display apparatus may provide a user with an intuitive and convenient editing function by changing a shape of an object provided from an editing tool when a VR image is displayed.
Additional and/or other aspects and advantages of the disclosure will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the disclosure.
The above and/or other aspects of the present disclosure will be more apparent by describing certain exemplary embodiments of the present disclosure with reference to the accompanying drawings, in which:
FIG. 1A is a block diagram of a configuration of a display apparatus according to an exemplary embodiment;
FIG. 1B is a block diagram of a detailed configuration of a display apparatus according to an exemplary embodiment;
FIGS. 2A through 2D illustrate an example of a projection method according to an exemplary embodiment;
FIGS. 3A through 3C illustrate a change in a size of an object according to an exemplary embodiment;
FIGS. 4A through 4C illustrate a change in a position of an object according to an exemplary embodiment;
FIG. 5 illustrates a type of an object according to an exemplary embodiment;
FIGS. 6A and 6B illustrate a method of changing a projection point according to an exemplary embodiment;
FIGS. 7A through 7F illustrate a process of editing a Virtual Reality (VR) image according to an exemplary embodiment;
FIG. 8 illustrates a screen that is being edited, according to an exemplary embodiment; and
FIG. 9 is a flowchart of a method of controlling a display apparatus according to an exemplary embodiment.
-
Certain exemplary embodiments of the present disclosure will now be described in greater detail with reference to the accompanying drawings.
In the following description, same drawing reference numerals are used for the same elements even in different drawings. The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of the disclosure. Thus, it is apparent that the exemplary embodiments of the present disclosure can be carried out without those specifically defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the disclosure with unnecessary detail.
Hereinafter, various exemplary embodiments of the present disclosure will be described in detail with reference to the attached drawings.
FIG. 1A is a block diagram of a configuration of a display apparatus 100 according to an exemplary embodiment.
As shown in FIG. 1A, the display apparatus 100 includes a storage 110, a user interface 120, a display 130, and a processor 140.
The display apparatus 100 may be an apparatus that displays and edits an image or a video. For example, the display apparatus 100 may be realized as a notebook computer, a desktop personal computer (PC), a smartphone, or the like, and any apparatus which displays and edits an image or a video is not limited and may be applied to the display apparatus 100.
In particular, the display apparatus 100 may be an apparatus that displays and edits a Virtual Reality (VR) image or video. Here, the VR image may be an image generated by combining a plurality of images captured from a plurality of different viewpoints and converting the combined image to a plane.
In other words, the VR image may be an image generated by capturing a plurality of images so as to include all directions based on a capturing person, stitching the plurality of captured images, and converting the stitched image to a plane. However, the VR image is not limited thereto and thus may be generated by capturing a plurality of images so as to include merely some directions not all directions.
If a plurality of images captured from a plurality of different viewpoints are stitched, a spherical VR image is generated, and an example of the spherical VR image is illustrated in FIG. 2A. Also, if the spherical VR image illustrated in FIG. 2A is converted through an equirectangular projection method, a VR image is generated, and an example of the VR image is illustrated in FIG. 2C.
Here, a conversion of a spherical VR image into a planar VR image is referred to as a projection, and a method of converting the spherical VR image into the planar VR image is referred to as a projection method. The projection and the projection method will be described in detail later with reference to FIGS. 2A, 2B and 2C.
The display apparatus 100 may provide a function of displaying and editing a whole or a part of a VR image.
The storage 110 may store a VR image generated by combining a plurality of images captured from a plurality of different viewpoints and converting the combined image to a plane. The VR image may be an image generated by an external apparatus not the display apparatus 100. In this case, the display apparatus 100 may receive a VR image from an external apparatus and may store the VR image in the storage 110. Alternatively, the display apparatus 100 may include a plurality of cameras, directly perform capturing by using the plurality of cameras, and generate a VR image by processing a plurality of captured images.
The user interface 120 may receive a user input. For example, the user interface 120 may receive a user input for displaying a VR image, a spherical VR image, or the like.
In particular, the user interface 120 may receive a user input for displaying a planar VR image corresponding to an area of a spherical VR image. In this case, the user input may be an input that designates a projection point and a projection angle for designating an area. The user input may also be an input that designates a projection method.
Alternatively, the user interface 120 may receive a user input for changing an area of a VR image that is currently being displayed.
The user interface 120 may receive a user input for editing a VR image that is being displayed. For example, the user interface 120 may receive a user input for executing an editing tool for editing a VR image. The user interface 120 may also receive a user input for changing a size or a position of an object that is provided from an editing tool as the editing tool is executed.
The display 130 may display various types of contents under control of the processor 140. For example, the display 130 may display the VR image and the object provided from the editing tool. The display 130 may also in real time display a VR image that is edited according to an execution of the editing tool.
Also, the display 130 may be realized as a Liquid Crystal Display (LCD) panel, an Organic Light Emitting Diode (OLED), or the like but is not limited thereto. The display 130 may also be realized as a flexible display, a transparent display, or the like.
The processor 140 controls an overall operation of the display apparatus 100.
The processor 140 may convert a VR image into a spherical VR image. Here, the VR image may be an image generated by converting a spherical VR image to a plane through a preset projection method and may be stored in the storage 110.
The processor 140 may generate a spherical VR image by inversely projecting a VR image according to a projection method used for generating a VR image. For example, if an equirectangular projection method is used when generating a VR image, the processor 140 may generate a spherical VR image by respectively mapping a width and a length of a VR image on Φ and θ of a spherical coordinate system.
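As a minimal illustration of this inverse projection, the sketch below maps every pixel of an equirectangular VR image onto the unit sphere by treating the image width as longitude Φ and the image height as latitude θ. The coordinate conventions and function names are assumptions made for illustration only.

```python
import numpy as np

def vr_image_to_sphere_points(vr_image):
    """Inverse equirectangular projection: map every pixel of a planar VR image
    onto the unit sphere. The image width is assumed to span longitude phi in
    [-pi, pi] and the height to span latitude theta in [-pi/2, pi/2]."""
    height, width = vr_image.shape[:2]
    xs = (np.arange(width) + 0.5) / width           # 0..1 across the width
    ys = (np.arange(height) + 0.5) / height         # 0..1 down the height
    phi = (xs - 0.5) * 2.0 * np.pi                  # longitude per column
    theta = (0.5 - ys) * np.pi                      # latitude per row
    phi_grid, theta_grid = np.meshgrid(phi, theta)  # H x W angle grids

    # Spherical angles -> 3D points on the unit sphere.
    x = np.cos(theta_grid) * np.cos(phi_grid)
    y = np.cos(theta_grid) * np.sin(phi_grid)
    z = np.sin(theta_grid)
    points = np.stack([x, y, z], axis=-1)           # H x W x 3 unit vectors
    return points, vr_image                         # geometry plus per-pixel color
```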
According to an exemplary embodiment, even if a projection method other than the equirectangular projection method is used, the processor 140 may generate a spherical VR image by inversely projecting the VR image according to that projection method.
Here, the VR image may store information about a projection method. In this case, the processor 140 may generate a spherical VR image based on the projection method stored in the VR image. Alternatively, the processor 140 may determine a projection method used when generating a VR image by analyzing the VR image.
The processor 140 may generate a planar VR image corresponding to an area of the spherical VR image and control the display 130 to display the planar VR image. For example, the processor 140 may generate a planar VR image by projecting merely an area of a spherical VR image. Alternatively, the processor 140 may generate a planar VR image by projecting a whole of a spherical VR image and cropping merely an area of the VR image.
Here, if a user input for a projection point, a projection angle, and a projection method is received, the processor 140 may determine an area of a spherical VR image based on the projection point and the projection angle, and generate and display a planar VR image corresponding to the area based on the projection method.
Here, the projection point may be a point of an area that a user wants to display on the spherical VR image. The projection angle may be an angle of an area that the user wants to display in a center of the spherical VR image. However, the area that the user wants to display may be a rectangular area. In this case, the projection angle may include an angle formed by upper and lower edges of the rectangular area and the center of the spherical VR image and an angle formed by left and right edges of the rectangular area and the center of the spherical VR image.
However, the present disclosure is not limited thereto, and thus the processor 140 may determine an area that the user wants to display by receiving merely one of the two angles described above. For example, if an angle formed by left and right edges of an area that the user wants to display and the center of the spherical VR image is received, the processor 140 may determine the area that the user wants to display based on an aspect ratio of the display 130.
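As an illustration of deriving the missing angle from the display's aspect ratio, the sketch below assumes a rectilinear (perspective) projection, in which the image plane is flat and tangents of the half-angles scale with the aspect ratio; other projection methods would use a different relation, and all names are illustrative.

```python
import math

def view_area_angles(h_angle_deg, display_width, display_height):
    """Derive the vertical projection angle from the horizontal one and the
    display's aspect ratio, assuming a rectilinear (perspective) projection."""
    half_h = math.radians(h_angle_deg) / 2.0
    half_v = math.atan(math.tan(half_h) * display_height / display_width)
    return h_angle_deg, math.degrees(2.0 * half_v)

# Example: a 90-degree horizontal projection angle on a 16:9 display.
h_angle, v_angle = view_area_angles(90.0, 1920, 1080)
print(h_angle, round(v_angle, 1))   # 90.0 and roughly 58.7
```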
If the user input for the projection point, the projection angle, and the projection method is not received, the processor 140 may determine an area of the spherical VR image by using a projection point, a projection angle, and a projection method set by default. The processor 140 may also receive a user input for some of the projection point, the projection angle, and the projection method.
If an editing tool for editing a planar VR image is executed according to a user input, the processor 140 may overlay and display a first object provided from the editing tool on the planar VR image. For example, if an editing tool for adding a line onto a planar VR image is executed, the processor 140 may overlay and display a pen tool on the planar VR image.
Here, the first object provided from the editing tool may include at least one selected from a tool Graphical User Interface (GUI) used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
The processor 140 may generate a second object by inversely performing the projection method used for generating the planar VR image on the first object, in order to project the first object as the second object in a spherical coordinate system. In other words, the processor 140 may generate an edited sphere VR image based on a second object generated by reversely projecting the first object according to a projection method used for generating the plane VR image. For example, if an equirectangular projection method is used when generating a VR image, the processor 140 may generate the second object by inversely projecting the first object according to the equirectangular projection method. An operation of editing a spherical VR image based on the second object will be described later.
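One way to picture this step is the sketch below, which inversely projects a single point of the first object (a pixel of the displayed planar VR image) to spherical coordinates, yielding the corresponding point of the second object. A rectilinear view centered on the projection point is assumed; for a full equirectangular planar VR image the mapping reduces to the linear relation described above, and all names and conventions are illustrative.

```python
import math

def view_pixel_to_sphere(u, v, view_w, view_h, h_angle_deg, phi0, theta0):
    """Inversely project one point of the first object (pixel u, v of the
    displayed planar VR image) into spherical coordinates, giving the
    corresponding point of the second object."""
    # Pixel -> offsets on an image plane at unit distance from the viewer.
    focal = (view_w / 2.0) / math.tan(math.radians(h_angle_deg) / 2.0)
    dx = (u - view_w / 2.0) / focal           # to the right of the view center
    dy = (view_h / 2.0 - v) / focal           # above the view center

    # Camera basis for a viewer at the sphere center looking at (phi0, theta0).
    fwd = (math.cos(theta0) * math.cos(phi0),
           math.cos(theta0) * math.sin(phi0),
           math.sin(theta0))
    right = (-math.sin(phi0), math.cos(phi0), 0.0)
    up = (-math.sin(theta0) * math.cos(phi0),
          -math.sin(theta0) * math.sin(phi0),
          math.cos(theta0))

    # Ray through the pixel, expressed in world coordinates.
    rx = fwd[0] + dx * right[0] + dy * up[0]
    ry = fwd[1] + dx * right[1] + dy * up[1]
    rz = fwd[2] + dx * right[2] + dy * up[2]
    norm = math.sqrt(rx * rx + ry * ry + rz * rz)

    phi = math.atan2(ry, rx)                  # longitude of the second object point
    theta = math.asin(rz / norm)              # latitude of the second object point
    return phi, theta
```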
If a user input for changing a size of the first object is received, the processor 140 may change a size of the second object in response to the user input, change a shape of the first object based on a projection method so as to enable the first object to correspond to the second object having the changed size, and display the first object having the changed shape on the planar VR image. Here, the projection method may be a projection method used for generating the planar VR image.
For example, if a user input for changing the size of the first object by 10 units is received, the processor 140 may change the size of the second object by 10 units. In other words, although the user input for changing the size of the first object by 10 units is received, the size of the first object may not be changed by 10 units and displayed.
The processor 140 may generate the first object of which the shape is changed and corresponds to the second object of which the size is changed by 10 units based on the projection method used for generating the planar VR image. Here, the size of the first object may not be simply changed by 10 units, but the shape of the first object may be changed according to at least one selected from a projection point, a projection angle, and a projection method.
The processor 140 may display the first object, of which the shape is changed, on the planar VR image. In other words, the user may perform editing while checking how the spherical VR image, rather than merely the planar VR image, is edited.
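The size change can be sketched as follows: the second object is modeled as a small circle of a given angular radius on the sphere, its size is changed there, and the result is re-projected onto an equirectangular planar VR image, producing the changed (distorted) shape of the first object. The circle model, the equirectangular assumption, and the names are illustrative.

```python
import math

def resize_and_reproject(center, radius_rad, scale, width, height, n=16):
    """Change the size of the second object on the sphere and re-project it,
    giving the changed shape of the first object on an equirectangular planar
    VR image of size width x height. The second object is modeled as a small
    circle of angular radius radius_rad around center = (phi_c, theta_c)."""
    phi_c, theta_c = center
    new_radius = radius_rad * scale            # the size change is applied on the sphere

    outline = []
    for k in range(n):
        a = 2.0 * math.pi * k / n
        # Latitude offsets are angular; longitude offsets are divided by
        # cos(latitude) so the circle stays round on the sphere itself.
        theta = theta_c + new_radius * math.sin(a)
        phi = phi_c + new_radius * math.cos(a) / max(math.cos(theta_c), 1e-6)
        # Forward equirectangular projection back to planar coordinates.
        x = (phi / (2.0 * math.pi) + 0.5) * width
        y = (0.5 - theta / math.pi) * height
        outline.append((x, y))
    # Near the upper and lower regions of the planar VR image this outline
    # traces a horizontally stretched ellipse rather than a circle.
    return outline
```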
If a user input for changing a position of the first object from a first position to a second position is received, the processor 140 may change a position of the second object to a third position corresponding to the second position on the spherical VR image, change a shape of the first object based on a projection method so as to enable the first object to correspond to the second object having the changed position, and display the first object having the changed shape in the second position on the planar VR image. Here, the projection method may be a projection method used for generating the planar VR image.
If a user input for changing the position of the first object from the first position to the second position is received, the processor 140 may change the position of the second object to the third position corresponding to the second position on the spherical VR image. Here, the third position corresponding to the second position may be determined based on the projection method used for generating the planar VR image.
The processor 140 may project the second object, of which the position is changed to the third position based on the projection method used for generating the planar VR image, onto a plane. The processor 140 may generate the first object of which position is changed by projecting the second object of which position is changed. Here, the position of the first object may not be simply changed, but the shape of the first object may be changed according to at least one selected from a projection point, a projection angle, and a projection method.
The processor 140 may display the first object, of which the shape is changed, on the planar VR image. In other words, the user may perform editing while checking in real time how the spherical VR image, rather than merely the planar VR image, is edited.
If a user input for changing the position of the first object from the first position to a fourth position in a preset area of the planar VR image is received, the processor 140 may generate and display a planar VR image corresponding to the fourth position.
For example, if a user input for changing the position of the first object to a point of a left boundary of the planar VR image is received, the processor 140 may generate and display a planar VR image where the point of the left boundary is a projection point.
The processor 140 may overlay and display a lattice type guide GUI, which guides a position corresponding to a planar VR image on the spherical VR image, on the planar VR image.
For example, the processor 140 may overlay and display a lattice type GUI corresponding to vertical and horizontal lines of a spherical VR image on a planar VR image. Here, the vertical and horizontal lines of the spherical VR image may respectively correspond to latitude and longitude.
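A minimal sketch of such a guide, assuming an equirectangular planar VR image in which columns correspond to longitude and rows to latitude, is given below; the grid step and all names are illustrative.

```python
def lattice_guide_lines(width, height, step_deg=15):
    """Planar positions of the lattice type guide GUI lines for an
    equirectangular planar VR image in which columns correspond to longitude
    and rows correspond to latitude."""
    meridians = []     # x positions of longitude (vertical) lines
    parallels = []     # y positions of latitude (horizontal) lines
    for lon in range(-180, 181, step_deg):
        meridians.append((lon / 360.0 + 0.5) * width)
    for lat in range(-90, 91, step_deg):
        parallels.append((0.5 - lat / 180.0) * height)
    return meridians, parallels

# Example: guide lines every 15 degrees on a 4096 x 2048 planar VR image.
meridians, parallels = lattice_guide_lines(4096, 2048)
```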
The processor 140 may display a planar VR image corresponding to an area of an edited spherical VR image. For example, if an editing tool for adding a line onto a planar VR image is executed, the processor 140 may overlay and display a pen tool on the planar VR image. Also, if the pen tool is executed to add the line, the processor 140 may add the line onto the spherical VR image, convert the spherical VR image, onto which the line is added, into a planar VR image based on a projection method, and display the planar VR image.
FIG. 1B is a block diagram of a detailed configuration of the display apparatus 100, according to an exemplary embodiment. Referring to FIG. 1B, the display apparatus 100 includes the storage 110, the user interface 120, the display 130, the processor 140, a communicator 150, an audio processor 160, a video processor 170, a speaker 180, a button 181, a camera 182, and a microphone 183. Detailed descriptions of some of elements of FIG. 1B overlapping with elements of FIG. 1A will be omitted.
The processor 140 controls an overall operation of the display apparatus 100 by using various types of programs stored in the storage 110.
According to an exemplary embodiment, the processor 140 includes a Random Access Memory (RAM) 141, a Read Only Memory (ROM) 142, a main Central Processing Unit (CPU) 143, a graphic processor 144, first through nth interfaces 145-1 through 145-n, and a bus 146.
The RAM 141, the ROM 142, the main CPU 143, the graphic processor 144, the first through nth interfaces 145-1 through 145-n, and the like may be connected to one another through the bus 146.
The first through nth interfaces 145-1 through 145-n are connected to various types of elements described above. One of the interfaces may be a network interface that is connected to an external apparatus through a network.
The main CPU 143 performs booting by using an Operating System (O/S) stored in the storage 110 by accessing the storage 110. The main CPU 143 also performs various types of operations by using various types of programs and the like stored in the storage 110.
A command set and the like for booting a system are stored in the ROM 142. If power is supplied by inputting a turn-on command, the main CPU 143 boots the system by copying the O/S stored in the storage 110 into the RAM 141 according to a command stored in the ROM 142 and executing the O/S. If the system is completely booted, the main CPU 143 performs various types of operations by copying various types of application programs stored in the storage 110 into the RAM 141 and executing the application programs copied into the RAM 141.
The graphic processor 144 generates a screen including various types of objects including an icon, an image, a text, and the like by using an operator (not shown) and a renderer (not shown). The operator calculates attribute values such as coordinate values, shapes, sizes, colors, and the like at which objects will be displayed according to a layout of the screen based on a received control command. The renderer generates a screen having various types of layouts including an object based on the attribute values calculated by the operator. The screen generated by the renderer is displayed in a display area of the display 130.
The above-described operation of the processor 140 may be performed by a program stored in the storage 110.
The storage 110 stores various types of data such as an O/S software module for driving the display apparatus 100, a projection method module, an image editing module, and the like.
In this case, the processor 140 may display a VR image and provide an editing tool based on information stored in the storage 110.
The user interface 120 receives various types of user interactions. Here, the user interface 120 may be realized as various types according to various exemplary embodiments of the display apparatus 100. For example, the display apparatus 100 may be a notebook computer, a desktop PC, or the like, and the user interface 120 may be a receiver or the like for receiving an input signal from a keyboard or a mouse for interfacing with the notebook computer, the desktop PC, or the like. Also, the display apparatus 100 may be a touch-based electronic device, and the user interface 120 may be a touch screen type that forms an interactive layer structure with a touch pad for interfacing with the touch-based electronic device. In this case, the user interface 120 may be used as the display 130 described above.
The communicator 150 is an element that performs communications with various types of external apparatuses according to various types of communication methods. The communicator 150 includes a Wireless Fidelity (WiFi) chip 151, a Bluetooth chip 152, a wireless communication chip 153, a Near Field Communication (NFC) chip 154, and the like. The processor 140 performs communications with various types of external apparatuses by using the communicator 150.
The WiFi chip 151 and the Bluetooth chip 152 respectively perform communications according to a WiFi method and a Bluetooth method. If the WiFi chip 151 and the Bluetooth chip 152 are used, various types of information may be transmitted and received by transmitting and receiving various types of connection information such as a service set identifier (SSID), a session key, and the like and connecting communications by using the various types of connection information. The wireless communication chip 153 refers to a chip that performs communications according to various types of communication standards such as Institute of Electrical and Electronics Engineers (IEEE), Zigbee, 3rd Generation (3G), 3rd Generation Partnership Project (3GPP), Long Term Evolution (LTE), and the like. The NFC chip 154 refers to a chip that operates according to an NFC method using a band of 13.56 MHz among various types of Radio Frequency Identification (RFID) frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860~960 MHz, 2.45 GHz, and the like.
The communicator 150 may perform a unidirectional or bidirectional communication with an external apparatus. If the communicator 150 performs the unidirectional communication with the external apparatus, the communicator 150 may receive a signal from the external apparatus. If the communicator 150 performs the bidirectional communication with the external apparatus, the communicator 150 may receive a signal from the external apparatus and may transmit a signal to the external apparatus.
The audio processor 160 is an element that performs processing with respect to audio data. The audio processor 160 may perform various types of processing, such as decoding, amplifying, noise filtering, and the like, with respect to the audio data.
The video processor 170 is an element that performs processing with respect to video data. The video processor 170 may perform various types of image processing, such as decoding, scaling, noise filtering, frame rate converting, resolution converting, and the like, with respect to the video data.
The speaker 180 is an element that outputs various types of audio data, various types of notification sounds, voice messages, and the like processed by the audio processor 160.
The button 181 may be various types of buttons such as a mechanical button, a touch pad, a wheel, and the like that are formed in an arbitrary area of a front part, a side part, a back part, or the like of an external appearance of a main body of the display apparatus 100.
The camera 182 is an element that captures a still image or a moving picture image under control of the user. The camera 182 may be realized as a plurality of cameras including a front camera, a back camera, and the like.
The microphone 183 is an element that receives a user voice or other sounds and converts the user voice or the other sounds into audio data.
Also, although not shown in FIG. 1B, according to an exemplary embodiment, the display apparatus 100 may further include a Universal Serial Bus (USB) port through which a USB connector may be connected, various types of external input ports for connecting to external terminals such as a headset, a mouse, and a Local Area Network (LAN), a Digital Multimedia Broadcasting (DMB) chip that receives and processes a DMB signal, various types of sensors, and the like.
Hereinafter, basic elements and various exemplary embodiments to facilitate understanding of the present disclosure will be described.
FIGS. 2A through 2D illustrate an example of a projection method according to an exemplary embodiment.
FIG. 2A illustrates an example of a spherical VR image. FIG. 2C illustrates a VR image generated by converting the spherical VR image of FIG. 2A to a plane based on an equirectangular projection method.
FIG. 2B illustrates an exemplary representation of the spherical VR image in FIG. 2A. According to an exemplary embodiment, FIG. 2B illustrates an example of a central point O and a projection point P0 of the spherical VR image. Φ of a spherical coordinate system denotes an angle formed between a straight line going from the central point O to the projection point P0 and a straight line going from the central point O to a first point P1 on a horizontal plane. If the projection point P0 and the first point P1 are not on the horizontal plane, an angle may be determined based on two points on the horizontal plane onto which the projection point P0 and the first point P1 are respectively projected.
Here, the horizontal plane may be a basis for unrolling the spherical VR image onto a plane and may be set in a different direction. For example, the horizontal plane may be set so as to be orthogonal to the horizontal plane of FIG. 2B. Also, the processor 140 may determine the horizontal plane based on the projection point P0.
According to an exemplary embodiment, θ of the spherical coordinate system may be an angle formed between a straight line going from the central point O to a second point P2 and the horizontal plane.
The processor 140 may generate a VR image by converting a spherical VR image to a plane based on a correspondence relation between Φ and θ of the spherical coordinate system and x and y of an orthogonal coordinate system. The correspondence relation may depend on a projection method.
If an equirectangular projection method is used as shown in FIG. 2C, shapes of circular dots displayed on the spherical VR image of FIG. 2A may be changed as the spherical VR image is projected onto a plane. In other words, the shapes of the circular dots illustrated in FIG. 2A may be changed into elliptical shapes as the locations of the circular dots are closer to the upper and lower regions of the VR image of FIG. 2C. This is a problem occurring as the spherical VR image is illustrated on a rectangular plane, and a distortion may become serious as the locations of the circular dots are closer to the upper and lower regions of FIG. 2C. However, if another type of projection method is used, an area where a distortion occurs may be changed.
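The amount of this distortion can be illustrated numerically: under an equirectangular projection of an image whose width is twice its height, a small circular dot of angular radius r at latitude θ is projected to an ellipse whose horizontal semi-axis grows roughly as 1/cos θ while its vertical semi-axis stays constant. The values below are illustrative only.

```python
import math

def dot_semi_axes(theta_deg, radius_deg=2.0, height=2048):
    """Approximate semi-axes, in pixels, of the ellipse obtained when a circular
    dot of angular radius radius_deg at latitude theta_deg is projected
    equirectangularly onto an image of the given height (and width 2 * height)."""
    r = math.radians(radius_deg)
    vertical = r * height / math.pi                                   # same at every latitude
    horizontal = r * height / (math.pi * math.cos(math.radians(theta_deg)))
    return horizontal, vertical

for lat in (0, 45, 75):
    h_axis, v_axis = dot_semi_axes(lat)
    print(lat, round(h_axis, 1), round(v_axis, 1))
# 0  -> ~22.8 x 22.8 (still circular at the equator)
# 45 -> ~32.2 x 22.8 (elliptical)
# 75 -> ~87.9 x 22.8 (strongly stretched near the upper region of FIG. 2C)
```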
The equirectangular projection method is illustrated in FIGS. 2A and 2B, but the present disclosure is not limited thereto. For example, a spherical VR image may be converted into a VR image by using various types of projection methods such as rectilinear, cylindrical, Mercator, stereographic, and Pannini projection methods, and the like. An example of a VR image converted to a plane through various types of projection methods is illustrated in FIG. 2D.
Hereinafter, for convenience of description, an equirectangular projection method will be described as being used. However, technology of the present application may be applied even if other types of projection methods are used.
FIGS. 3A through 3C illustrate a change in a size of an object according to an exemplary embodiment of the present disclosure.
As shown in FIG. 3A, if an editing tool for editing a planar VR image is executed according to a user input when the planar VR image is displayed, the processor 140 may overlay and display a first object provided from the editing tool on the planar VR image.
For example, if a user input for displaying a plurality of stickers and selecting one of the plurality of stickers is received, the processor 140 may overlay and display a sticker having a preset shape on the planar VR image. Here, a sticker may be an arrow, an emoticon, or the like, selected from a GUI editing tool.
The processor 140 may also receive a user input for changing a size of the first object. For example, the processor 140 may receive a user input for changing the size of the first object from a first size 310-1 to a second size 310-2.
If a user input for changing the size of the first object is received, the processor 140 may change a size of a second object in response to the user input as shown in FIG. 3B. Here, the second object may be an object that is positioned on a spherical coordinate system and corresponds to the first object. In other words, the processor 140 may generate a second object by inversely converting a first object based on a projection method.
For example, if a user input for changing the size of the first object from the first size 310-1 to the second size 310-2 is received, the processor 140 may change a size of the second object by a difference d between the first size 310-1 and the second size 310-2. In other words, the processor 140 may change the size of the second object from a third size 320-1 to a fourth size 320-2 on a second layer according to the difference d. Here, a shape of the second object may not be changed, but merely the size of the second object may be changed.
However, the present disclosure is not limited thereto, and thus if a user input for changing the size of the first object from the first size 310-1 to the second size 310-2 is received, the processor 140 may calculate a plurality of coordinates corresponding to a plurality of vertexes of the second size on a spherical coordinate system and change the size of the second object in response to the plurality of calculated coordinates. In this case, the shape of the second object may be changed.
The second object and the spherical VR image may be respectively included on different layers. For example, the spherical VR image may be included on a first layer, and the second object may be included on a second layer. In other words, although the size of the second object is changed, the spherical VR image may not be changed.
The processor 140 may generate a first object, of which shape is changed, by converting a second object, of which a size is changed, to a plane based on a preset projection method. Here, the processor 140 may project a layer including the second object onto a plane.
The processor 140 may project a layer including a second object onto a plane according to a projection point, a projection angle, and a projection method used when projecting a spherical VR image onto a planar VR image.
As described above, an area may be distorted in a projection process. As the distortion occurs, a size of a first object may not be simply changed, but a shape of the first object may be distorted. As shown in FIG. 3C, the processor 140 may display a first object 330, of which shape is changed, on a planar VR image.
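This projection of the layer containing the second object can be sketched as the inverse of the earlier view-pixel mapping: each spherical point of the second object is expressed in the camera frame defined by the projection point and perspective-divided onto the displayed plane. A rectilinear view is again assumed and all names are illustrative.

```python
import math

def sphere_to_view_pixel(phi, theta, view_w, view_h, h_angle_deg, phi0, theta0):
    """Project one point of the second object (spherical coordinates) onto the
    displayed planar VR image, assuming a rectilinear view centered on the
    projection point (phi0, theta0) with horizontal projection angle
    h_angle_deg. Returns None if the point lies behind the viewer."""
    # 3D direction of the point on the unit sphere.
    px = math.cos(theta) * math.cos(phi)
    py = math.cos(theta) * math.sin(phi)
    pz = math.sin(theta)

    # Camera basis for a viewer at the sphere center looking at (phi0, theta0).
    fwd = (math.cos(theta0) * math.cos(phi0),
           math.cos(theta0) * math.sin(phi0),
           math.sin(theta0))
    right = (-math.sin(phi0), math.cos(phi0), 0.0)
    up = (-math.sin(theta0) * math.cos(phi0),
          -math.sin(theta0) * math.sin(phi0),
          math.cos(theta0))

    # Coordinates of the point in the camera frame.
    zc = px * fwd[0] + py * fwd[1] + pz * fwd[2]
    if zc <= 0.0:
        return None                       # behind the projection plane
    xc = px * right[0] + py * right[1] + pz * right[2]
    yc = px * up[0] + py * up[1] + pz * up[2]

    # Perspective division and mapping onto the view in pixels.
    focal = (view_w / 2.0) / math.tan(math.radians(h_angle_deg) / 2.0)
    u = view_w / 2.0 + focal * xc / zc
    v = view_h / 2.0 - focal * yc / zc
    return u, v
```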
If a user input for merging the first object 330 having the changed shape with the planar VR image is received, the processor 140 may generate an edited spherical VR image by merging a first layer including the spherical VR image with a second layer including the first object 330. The processor 140 may display a planar VR image corresponding to an area of the edited spherical VR image.
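The merging of the two layers can be sketched as a simple per-pixel composite, assuming both layers are stored as equirectangular arrays of the same size; the mask and names are illustrative.

```python
import numpy as np

def merge_layers(spherical_vr_layer, object_layer, object_alpha):
    """Merge the first layer (the spherical VR image, stored here as an
    equirectangular array) with the second layer containing the object,
    producing the edited spherical VR image. object_alpha is a per-pixel
    mask in [0, 1]; where it is 0 the original spherical VR image is kept."""
    alpha = object_alpha[..., None].astype(np.float32)
    merged = (1.0 - alpha) * spherical_vr_layer.astype(np.float32) \
             + alpha * object_layer.astype(np.float32)
    return merged.astype(spherical_vr_layer.dtype)
```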
The processor 140 may overlay and display a lattice type guide GUI, which guides a position corresponding to the planar VR image on the spherical VR image, on the planar VR image.
For example, the lattice type guide GUI may correspond to vertical and horizontal lines of the spherical VR image. A distance between the vertical and horizontal lines may be preset. Alternatively, the distance may be changed under control of the user.
FIGS. 4A through 4C illustrate a change in a position of an object according to an exemplary embodiment of the present disclosure.
As shown in FIG. 4A, if an editing tool for editing a planar VR image is executed according to a user input when the planar VR image is displayed, the processor 140 may overlay and display a first object provided from the editing tool on the planar VR image.
The processor 140 may also receive a user input for changing a position of the first object. For example, the processor 140 may receive a user input for changing the position of the first object from a first position 410-1 to a second position 410-2.
If the user input for changing the position of the first object is received, the processor 140 may change a position of a second object in response to the user input. Here, the second object may be an object that is positioned on a spherical coordinate system and corresponds to the first object. In other words, the processor 140 may generate the second object by inversely converting the first object based on a projection method.
For example, if a user input for changing the position of the first object from the first position 410-1 to the second position 410-2 is received, the processor 140 may change the position of the second object by a distance d between the first position 410-1 and the second position 410-2. In other words, the processor 140 may change the position of the second object from a third position 420-1 to a fourth position 420-2 on a second layer according to the distance d. Here, a shape of the second object may not be changed, but merely the position of the second object may be changed.
However, the present disclosure is not limited thereto, and thus if a user input for changing the position of the first object from the first position 410-1 to the second position 410-2 is received, the processor 140 may calculate a plurality of coordinates corresponding to a plurality of vertexes of the second position 410-2 on a spherical coordinate system and change the position of the second object in response to the plurality of calculated coordinates. In this case, the shape of the second object may be changed.
The second object and the spherical VR image may be respectively included on different layers. For example, the spherical VR image may be included on a first layer, and the second object may be included on a second layer. In other words, although the position of the second object is changed, the spherical VR image may not be changed.
The processor 140 may generate a first object, of which shape is changed, by converting a second object, of which position is changed, to a plane based on a preset projection method. Here, the processor 140 may project a layer including the second object onto a plane.
The processor 140 may project the layer including the second object according to a projection point, a projection angle, and a projection method used when projecting the spherical VR image onto the planar VR image.
As described above, an area may be distorted in the projection process. Because of this distortion, not only may the position of the first object change, but its shape may also be distorted. As shown in FIG. 4C, the processor 140 may display a first object 430 having a changed shape on a planar VR image.
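For illustration of how a projection point and projection angle determine where a point of the spherical layer lands on the displayed plane, and hence how the first object's shape becomes distorted, here is a hedged sketch using a rectilinear (gnomonic) view projection with all angles in radians. The disclosure itself mainly discusses an equirectangular method, so this particular projection, as well as all names and parameters, are assumptions for this example only.

    import numpy as np

    def sphere_to_view(lon, lat, center_lon, center_lat, fov, width, height):
        # Gnomonic (rectilinear) projection of a spherical point onto the
        # displayed view plane; returns pixel coordinates, or None if the
        # point lies on the hemisphere facing away from the projection point.
        cos_c = (np.sin(center_lat) * np.sin(lat) +
                 np.cos(center_lat) * np.cos(lat) * np.cos(lon - center_lon))
        if cos_c <= 0:
            return None
        x = np.cos(lat) * np.sin(lon - center_lon) / cos_c
        y = (np.cos(center_lat) * np.sin(lat) -
             np.sin(center_lat) * np.cos(lat) * np.cos(lon - center_lon)) / cos_c
        half = np.tan(fov / 2.0)               # half-extent of the view plane
        px = (x / half + 1.0) * 0.5 * (width - 1)
        py = (1.0 - (y / half + 1.0) * 0.5) * (height - 1)
        return px, py

Projecting each vertex of the second object with such a function and redrawing the polygon between the projected vertices yields a first object 430 whose shape is distorted, as in FIG. 4C.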
If a user input for merging the first object 430 having the changed shape with the planar VR image is received, the processor 140 may generate an edited spherical VR image by merging a first layer including the spherical VR image with a second layer including the first object 430. The processor 140 may display a planar VR image corresponding to an area of the edited spherical VR image.
The processor 140 may overlay and display a lattice type guide GUI, which guides a position corresponding to the planar VR image on the spherical VR image, on the planar VR image.
For example, the lattice type guide GUI may correspond to vertical and horizontal lines of the spherical VR image. A distance between the vertical and horizontal lines may be preset. Alternatively, the distance may be changed under control of the user.
FIG. 5 illustrates a type of an object according to an exemplary embodiment.
The processor 140 may insert an image. The processor 140 may change a shape of an image by using a method as described with reference to FIGS. 3A, 3B, 3C, 4A, 4B and 4C and display an image 510, of which shape is changed, on a planar VR image.
For example, although the image originally has a rectangular shape, the processor 140 may generate the image 510 of which shape is changed and then display the image 510 having the changed shape on the planar VR image. The processor 140 may also apply a filter to a boundary area between the image 510 having the changed shape and the planar VR image.
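One possible way to realize the boundary filter mentioned above is to feather the alpha mask of the inserted image before compositing it onto the planar VR image. The sketch below assumes a three-channel image and a binary object mask; the Gaussian blur, the sigma value, and the function name are assumptions, not a specific filter prescribed by the disclosure.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def blend_with_feathered_edge(planar_img, warped_obj, obj_mask, sigma=3.0):
        # Soften the boundary between the shape-changed image 510 and the
        # planar VR image by blurring the object's binary mask into an alpha map.
        alpha = gaussian_filter(obj_mask.astype(np.float32), sigma=sigma)
        alpha = np.clip(alpha, 0.0, 1.0)[..., None]        # H x W x 1
        blended = alpha * warped_obj + (1.0 - alpha) * planar_img
        return blended.astype(planar_img.dtype)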
However, this is merely an exemplary embodiment, and thus an object provided from an editing tool may include at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
For example, the processor 140 may display an image, a pen, a paint, an eraser, a sticker, a text box, a moving picture image, a filter, and the like on the planar VR image according to the same method.
FIGS. 6A and 6B illustrate a method of changing a projection point according to an exemplary embodiment.
As shown in FIG. 6A, the processor 140 may overlay and display a first object on a planar VR image. If a user input for changing a position of the first object from a first position 610-1 to a second position 610-2 in a preset area of the planar VR image is received, the processor 140 may generate and display a planar VR image corresponding to the second position 610-2.
For example, if a user input for moving the first object to a left boundary is received, the processor 140 may change a projection point to a left side. In this case, the processor 140 may determine that the user intends to move the object to an area other than the currently displayed planar VR image, and may change the displayed area accordingly.
As shown in FIG. 6B, the processor 140 may generate and display a planar VR image corresponding to the second position 610-2. In other words, the processor 140 may change the projection point so as to display the second object at the center. Also, a building previously positioned at the center may be displayed toward the right side due to the change in the projection point.
However, the present disclosure is not limited thereto, and thus, if a user input for moving the first object to a left boundary is received, the processor 140 may change the projection point to a preset projection point. Alternatively, the processor 140 may determine a new projection point based on at least one selected from an existing projection point, a projection angle, and a changed position of the first object.
Alternatively, if the projection point is changed, the processor 140 may change the projection angle. For example, if the projection point is changed, the processor 140 may increase the projection angle so that an area of the VR image can be searched more easily.
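A minimal sketch of this boundary behaviour is given below: when the dragged object approaches the left or right edge of the displayed area, the projection point is re-centred on the object and the projection angle is widened. The margin, scaling factor, and upper bound are hypothetical values chosen only for the example; all angles are in radians.

    def update_view_on_boundary_drag(center_lon, fov, obj_lon,
                                     margin=0.05, fov_max=2.0):
        # Recenter the projection point on the object and enlarge the
        # projection angle when the object is dragged near a lateral boundary.
        left_edge = center_lon - fov / 2.0
        right_edge = center_lon + fov / 2.0
        near_left = obj_lon < left_edge + margin * fov
        near_right = obj_lon > right_edge - margin * fov
        if near_left or near_right:
            center_lon = obj_lon               # new projection point
            fov = min(fov * 1.25, fov_max)     # wider projection angle
        return center_lon, fov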
Only a change in the position of the first object has been described above with reference to FIGS. 6A and 6B, but the present disclosure is not limited thereto. For example, if a user input for enlarging the first object to a preset size or more is received, the processor 140 may enlarge the projection angle and display accordingly.
FIGS. 7A through 7F illustrate a process of editing a VR image according to an exemplary embodiment.
As shown in FIG. 7A, the processor 140 may display a VR image generated by converting a spherical VR image to a plane based on a preset projection method.
Alternatively, as shown in FIG. 7B, the processor 140 may display an area of the VR image converted to the plane. In other words, the processor 140 may receive projection parameters such as a projection point, a projection angle, a projection method, and the like from the user and display an area of the VR image converted to the plane. The processor 140 may also change an image, which is being displayed, by receiving projection parameters from the user in real time.
The processor 140 may display a whole or an area of the VR image converted to the plane according to a user input.
As shown in FIG. 7C, if only an area of the VR image converted to the plane is displayed, the processor 140 may change and display a projection point in real time according to a user input. The projection point of FIG. 7C is moved further to the left than the projection point of FIG. 7B.
Also, as shown in FIG. 7D, the processor 140 may change and display a projection angle in real time according to a user input. The projection angle of FIG. 7D is smaller than the projection angle of FIG. 7C. In other words, the user may enlarge an image, which is being displayed, by reducing the projection angle or may reduce the image by enlarging the projection angle.
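The relation between the projection angle and the perceived zoom can be made concrete with a small helper: for a perspective-style view rendered at a fixed output resolution, the on-screen magnification scales roughly with the tangent of half the projection angle, so halving the angle roughly doubles the displayed size. This is only an illustrative approximation, not a formula from the disclosure.

    import math

    def zoom_factor(old_fov_deg, new_fov_deg):
        # Approximate magnification obtained by changing the projection angle
        # while keeping the output resolution fixed (perspective-style view).
        return (math.tan(math.radians(old_fov_deg) / 2.0) /
                math.tan(math.radians(new_fov_deg) / 2.0))

    # e.g. zoom_factor(90, 60) ~= 1.73: reducing the angle enlarges the image.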
However, the present disclosure is not limited thereto, and thus the processor 140 may display an image by enlarging or reducing the image without changing a projection angle. The processor 140 may also display the image by changing a projection point, a projection angle, and a projection method in real time.
If editing is performed by the user, the processor 140 may display the edited VR image in real time. Here, as shown in FIG. 7E, the processor 140 may display the editing state of the image that is currently displayed, or may display the whole of a completely edited VR image as shown in FIG. 7F.
Even if an input for moving the projection point or for enlarging or reducing the image is received, the processor 140 may move the projection point or enlarge or reduce the image while maintaining existing editing contents. For example, even if the projection point is moved as shown in FIG. 7C or the image is enlarged as shown in FIG. 7D, the processor 140 may maintain the existing editing contents.
In particular, even if existing editing contents are not displayed due to the movement of the projection point or the enlargement or reduction of the image, the processor 140 may maintain them.
FIG. 8 illustrates a screen that is being edited according to an exemplary embodiment.
As shown in FIG. 8, the processor 140 may display an area of a VR image on a whole screen or may reduce and display a whole VR image 810 in an area of the whole screen.
If an area of a VR image is edited according to a user input, the processor 140 may display an editing result 820-1 of the area in real time. The processor 140 may also display an editing result 820-2 of the whole VR image 810 that is reduced and displayed. Through this operation, the user may edit an area of an image and check how a whole area of the image is edited.
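Because the edits live on the spherical data, both previews in FIG. 8 can be produced from the same edited sphere: the currently displayed area on the whole screen and the reduced whole image in a corner. Below is a minimal NumPy sketch, assuming the edited spherical VR image is held as an already-merged equirectangular array and using plain subsampling for the thumbnail (a real implementation would filter before downscaling); the names and sizes are assumptions for illustration only.

    import numpy as np

    def refresh_previews(edited_sphere_img, x0, y0, view_w, view_h,
                         thumb_w=480, thumb_h=240):
        # edited_sphere_img: the edited spherical VR image stored as a full
        # equirectangular array (H x W x 3), with all layers already merged.
        # Area view 820-1: the currently displayed crop of the sphere.
        area_view = edited_sphere_img[y0:y0 + view_h, x0:x0 + view_w]
        # Whole view 820-2: the entire sphere, reduced by simple subsampling.
        h, w = edited_sphere_img.shape[:2]
        ys = np.linspace(0, h - 1, thumb_h).astype(int)
        xs = np.linspace(0, w - 1, thumb_w).astype(int)
        whole_view = edited_sphere_img[np.ix_(ys, xs)]
        return area_view, whole_view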
FIG. 9 is a flowchart of a method of controlling a display apparatus according to an exemplary embodiment.
In operation S910, the display apparatus converts a VR image into a spherical VR image. According to an exemplary embodiment, the VR image is a planar VR image generated by combining a plurality of images captured from a plurality of different viewpoints. According to an exemplary non-limiting embodiment, the spherical VR image may be received from a storage, and in that case the conversion operation S910 by the display apparatus may be omitted. In operation S920, the display apparatus generates and displays a planar VR image corresponding to an area of the spherical VR image. In operation S930, if an editing tool for editing the planar VR image is executed according to a user input, the display apparatus overlays and displays a first object provided from the editing tool on the planar VR image. In operation S940, the display apparatus generates a second object by inversely performing the projection method used for generating the planar VR image on the first object, in order to project the first object as the second object in a spherical coordinate system. In operation S950, the display apparatus edits the spherical VR image based on the second object.
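Purely to tie operations S910 through S950 together, the following sketch outlines the control flow. The helper operations (to_sphere, project_area, overlay, inverse_project, merge, display) are hypothetical placeholders supplied by the caller, not functions defined by the disclosure.

    def edit_vr_image(vr_image, events, ops):
        # Control flow of FIG. 9; `ops` bundles hypothetical implementations of
        # the individual steps (nothing here is an API defined by the disclosure).
        sphere = ops.to_sphere(vr_image)                         # S910: convert to sphere
        view = ops.project_area(sphere, point=0.0, angle=90.0)   # S920: display an area
        ops.display(view)
        first_obj = None
        for event in events:
            if event.kind == "open_editing_tool":                # S930: overlay object
                first_obj = event.tool.default_object()
                ops.display(ops.overlay(view, first_obj))
            elif event.kind == "confirm_edit" and first_obj is not None:
                second_obj = ops.inverse_project(first_obj,      # S940: inverse projection
                                                 point=0.0, angle=90.0)
                sphere = ops.merge(sphere, second_obj)           # S950: edit the sphere
                view = ops.project_area(sphere, point=0.0, angle=90.0)
                ops.display(view)
        return sphere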
Also, the method may further include, if a user input for changing a size of the first object is received, changing a size of the second object in response to the user input, and changing a shape of the first object based on the projection method so as to enable the first object to correspond to the second object having the changed size and displaying the first object having the changed shape on the planar VR image.
The method may further include, if a user input for changing a position of the first object from a first position to a second position is received, changing a position of the second object to a third position corresponding to the second position on the spherical VR image, and changing a shape of the first object based on a projection method so as to enable the first object to correspond to the second object having the changed position and displaying the first object having the changed shape on the planar VR image.
Operation S920 may further include, if a user input for a projection point, a projection angle, and a projection method is received, determining an area of the spherical VR image based on the projection point and the projection angle, and generating and displaying the planar VR image corresponding to the area based on the projection method.
Also, the method may further include, if a user input for changing the position of the first object from the first position to a fourth position in a preset area of the planar VR image is received, generating and displaying a planar VR image corresponding to the fourth position.
The method may further include overlaying and displaying a lattice type guide GUI, which guides a position corresponding to the planar VR image on the spherical VR image, on the planar VR image.
Also, the method may further include displaying a planar VR image corresponding to an area of an edited spherical VR image.
The first object provided from the editing tool may include at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
According to various exemplary embodiments of the present disclosure as described above, a display apparatus may provide a user with an intuitive and convenient editing function by changing a shape of an object provided from an editing tool when a VR image is displayed.
An equirectangular projection method has been described above as being used, but this is merely for convenience of description. Therefore, technology of the present application may be applied even if other types of projection methods are used.
Also, an image has been mainly described above, but the same method may be applied with respect to each frame of a moving picture image. The user may edit each frame and may perform the same editing with respect to frames displayed for a preset time.
Methods according to various exemplary embodiments of the present disclosure described above may be embodied as an application type that may be installed in an existing electronic device.
The methods according to the various exemplary embodiments of the present disclosure described above may also be embodied by merely upgrading software or hardware of an existing electronic device.
In addition, the various exemplary embodiments of the present disclosure described above may be performed through an embedded server included in an electronic device or an external server of the electronic device.
According to an exemplary embodiment, the elements, components, methods or operations described herein may be implemented using hardware components, software components, or a combination thereof. For example, the hardware components may include a processing device. According to an exemplary embodiment, the display apparatus may include a processing device, such as the image processor or the controller, that may be implemented using one or more general-purpose or special-purpose computers, such as, for example, a hardware processor, a CPU, a hardware controller, an ALU, a DSP, a microcomputer, an FPGA, a PLU, a microprocessor or any other device capable of responding to and executing instructions in a defined manner. The processing device may run an operating system (OS) and one or more software applications that run on the OS. The processing device also may access, store, manipulate, process, and create data in response to execution of the software. For purposes of simplicity, the processing device is described in the singular; however, one skilled in the art will appreciate that a processing device may include multiple processing elements and multiple types of processing elements. For example, a processing device may include multiple processors or a processor and a controller. In addition, different processing configurations are possible, such as parallel processors.
According to an exemplary embodiment of the present disclosure, the various exemplary embodiments described above may be embodied as software including instructions stored in machine-readable storage media (e.g., computer-readable storage media). A device may be an apparatus that calls an instruction from a storage medium and operates according to the called instruction, and may include an electronic device (e.g., an electronic device A) according to disclosed exemplary embodiments. If the instruction is executed by a processor, the processor may directly perform a function corresponding to the instruction, or the function may be performed by using other types of elements under control of the processor. The instruction may include codes generated or executed by a compiler or an interpreter. A machine-readable storage medium may be provided as a non-transitory storage medium type. Here, "non-transitory" means that a storage medium does not include a signal and is tangible, but does not distinguish between semi-permanent and temporary storage of data in the storage medium.
Also, according to an exemplary embodiment of the present disclosure, a method according to the various exemplary embodiments described above may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., a compact disc read only memory (CD-ROM)) or may be distributed online through an application store (e.g., Play Store™). If the computer program product is distributed online, at least a part of the computer program product may be temporarily generated in a storage medium such as a memory of a server of a manufacturer, a server of an application store, or a relay server.
In addition, according to an exemplary embodiment of the present disclosure, the various exemplary embodiments described above may be embodied in a recording medium readable by a computer or an apparatus similar to the computer by using software, hardware, or a combination thereof. In some cases, the exemplary embodiments described herein may be embodied by a processor itself. According to a software implementation, exemplary embodiments such as the processes and functions described herein may be embodied as separate software modules. The software modules may each perform one or more of the functions and operations described herein.
Computer instructions for performing a processing operation of a device according to the above-described various exemplary embodiments may be stored in a non-transitory computer-readable medium. The computer instructions stored in the non-transitory computer-readable medium, when executed by a processor of a particular device, enable the particular device to perform a processing operation according to the above-described exemplary embodiments. The non-transitory computer-readable medium is a medium which does not store data for a short moment, such as a register, a cache, or a memory, but stores data semi-permanently and is readable by devices. More specifically, the aforementioned applications or programs may be stored in non-transitory computer-readable media such as compact disks (CDs), digital video disks (DVDs), hard disks, Blu-ray disks, universal serial buses (USBs), memory cards, and read-only memory (ROM).
Each of elements according to the above-described various exemplary embodiments (e.g., modules or programs) may include a single entity or a plurality of entities, and some of corresponding sub elements described above may be omitted or other types of sub elements may be further included in the various exemplary embodiments. Alternatively or additionally, some elements (e.g., modules or programs) may be integrated into one entity and then may equally or similarly perform a function performed by each of corresponding elements that are not integrated. Operations performed by modules, programs, or other types of elements according to the various exemplary embodiments may be sequentially, in parallel, or heuristically executed or at least some operations may be executed in different sequences or may be omitted, or other types of operations may be added.
The foregoing exemplary embodiments and advantages are merely exemplary and are not to be construed as limiting the present disclosure. The present teaching can be readily applied to other types of apparatuses. Also, the description of the exemplary embodiments of the present disclosure is intended to be illustrative, and not to limit the scope of the claims, and many alternatives, modifications, and variations will be apparent to those skilled in the art.

Claims (15)

  1. A display apparatus comprising:
    a storage configured to store a Virtual Reality (VR) image;
    a user interface;
    a display; and
    a processor configured to:
    convert the VR image into a spherical VR image,
    obtain a planar VR image corresponding to an area of the spherical VR image according to a projection method,
    control the display to display the planar VR image,
    receive a user input, through the user interface, to select an editing tool for performing an editing operation on the planar VR image,
    in response to the editing operation, overlay a first object corresponding to the editing operation on the planar VR image and control the display to display the first object overlaid on the planar VR image,
    obtain a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system, and
    edit the spherical VR image based on the second object.
  2. The display apparatus of claim 1, wherein, in response to the user input comprising an operation for changing a size of the first object, the processor is further configured to:
    change a size of the second object based on the user input,
    change a shape of the first object based on the second object of which the size is changed according to the projection method, and
    control the display to display the first object having the changed shape on the planar VR image.
  3. The display apparatus of claim 1, wherein, in response to the user input comprising an operation for changing a position of the first object from a first position to a second position in the planar VR image, the processor is further configured to:
    move the second object to a third position in the spherical coordinate system corresponding to the second position in the planar VR image,
    change the shape of the first object based on the inversely performed projection method so that the first object corresponds to the second object having the changed position, and
    control the display to display the first object having the changed shape in the second position on the planar VR image.
  4. The display apparatus of claim 1, wherein, in response to a user input comprising an operation for a projection point, a projection angle, and the projection method, the processor is further configured to:
    identify the area of the spherical VR image based on the projection point and the projection angle,
    obtain and control the display to display a planar VR image corresponding to the area based on the projection method.
  5. The display apparatus of claim 1, wherein, in response to a user input comprising an operation for changing a position of the first object from a first position to a fourth position in a preset area of the planar VR image, the processor is further configured to obtain and control the display to display a planar VR image corresponding to the fourth position.
  6. The display apparatus of claim 1, wherein the processor is further configured to overlay a lattice type guide graphical user interface (GUI) on the planar VR image and control the display to display the lattice type guide GUI overlaid on the planar VR image, and
    wherein the lattice type guide GUI guides a position corresponding to the planar VR image on the spherical VR image.
  7. The display apparatus of claim 1, wherein the processor displays a planar VR image corresponding to the area of the edited spherical VR image.
  8. The display apparatus of claim 1, wherein the planar VR image is obtained by converting a combined image, which is obtained by combining a plurality of images captured from a plurality of different viewpoints, to a plane image.
  9. The display apparatus of claim 1, wherein the first object provided from the editing tool comprises at least one selected from a tool GUI used in an editing function, an editing content generated by the tool GUI, and a content added according to the editing function.
  10. A method of controlling a display apparatus, the method comprising:
    converting a VR image into a spherical VR image;
    obtaining a planar VR image corresponding to an area of the spherical VR image according to a projection method;
    displaying the planar VR image;
    receiving a user input to select an editing tool for performing an editing operation on the planar VR image;
    in response to the editing operation, overlaying a first object corresponding to the editing operation on the planar VR image;
    displaying the first object overlaid on the planar VR image;
    obtaining a second object by inversely performing the projection method to project the first object as the second object in a spherical coordinate system; and
    editing the spherical VR image based on the second object.
  11. The method of claim 10, further comprising:
    in response to the user input comprising an operation for changing a size of the first object being received, changing a size of the second object based on the user input; and
    changing a shape of the first object based on the second object of which the size is changed according to the projection method, and displaying the first object having the changed shape on the planar VR image.
  12. The method of claim 10, further comprising:
    in response to the user input comprising an operation for changing a position of the first object from a first position to a second position in the planar VR image, moving the second object to a third position in the spherical coordinate system corresponding to the second position in the planar VR image; and
    changing a shape of the first object based on the projection method so that the first object corresponds to the second object having the changed position, and displaying the first object having the changed shape in the second position on the planar VR image.
  13. The method of claim 10, wherein the displaying of the planar VR image comprises:
    in response to the user input comprising an operation for a projection point, a projection angle, and the projection method being received, identifying the area of the spherical VR image based on the projection point and the projection angle; and
    obtaining and displaying a planar VR image corresponding to the area based on the projection method.
  14. The method of claim 10, further comprising:
    in response to the user input comprising an operation for changing a position of the first object from a first position to a fourth position in a preset area of the planar VR image being received, obtaining and displaying a planar VR image corresponding to the fourth position.
  15. The method of claim 10, further comprising:
    overlaying a lattice type guide graphical user interface (GUI) on the planar VR image and
    displaying the lattice type guide GUI overlaid on the planar VR image,
    wherein the lattice type guide GUI guides a position corresponding to the planar VR image on the spherical VR image.