WO2017175910A1 - Image processing method and apparatus - Google Patents
Image processing method and apparatus
- Publication number
- WO2017175910A1 (PCT/KR2016/005289)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- pixels
- projection image
- pixel
- polyhedron
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/10—Selection of transformation methods according to the characteristics of the input images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/12—Panospheric to cylindrical image transformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/40—Filling a planar surface by adding surface attributes, e.g. colour or texture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/18—Image warping, e.g. rearranging pixels individually
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2211/00—Image generation
- G06T2211/40—Computed tomography
- G06T2211/416—Exact reconstruction
Definitions
- The present invention relates to an image processing method and apparatus.
- Since the virtual reality (VR) image displayed on a virtual reality device moves according to the line of sight of the user wearing the device, the virtual reality image must include all of the images surrounding the user. That is, the virtual reality image provided by a virtual reality device corresponds to all directions around the user, that is, it is a 360-degree image. Accordingly, interest in processing such 360-degree images is increasing along with interest in virtual reality devices.
- The 360-degree image processing methods according to the prior art have problems in that the image size is large, the processing efficiency is low, and the power consumption is high, and because part of the image is lost, the quality may be degraded. Therefore, there is a need for an image processing method and apparatus that can provide more efficient, higher-quality images.
- One embodiment provides an image processing method and apparatus. More specifically, it provides an image processing method and apparatus capable of providing more efficient, higher-quality images.
- According to the disclosed embodiments, a high-quality 360-degree image can be provided efficiently.
- FIG. 1 is a diagram illustrating a 360-degree image.
- FIG. 2 is a flowchart illustrating an image processing method, according to an embodiment.
- FIG. 3 is a diagram illustrating a process of projecting an acquired image onto a polyhedron, according to an embodiment.
- FIG. 4 is a diagram illustrating a projection image, according to an embodiment.
- FIG. 5 is a flowchart illustrating a rectangular image reconstruction method, according to an embodiment.
- FIG. 6 is a diagram illustrating a rectangular image reconstruction method, according to an embodiment.
- FIG. 7 is a diagram illustrating horizontal movement of pixels, according to an embodiment.
- FIG. 8 is a diagram illustrating the result of horizontal movement of pixels, according to an embodiment.
- FIG. 9 is a diagram illustrating vertical movement of pixels, according to an embodiment.
- FIG. 10 is a diagram illustrating the result of vertical movement of pixels, according to an embodiment.
- FIG. 11 is a diagram illustrating an actual rectangular image reconstructed according to an embodiment.
- FIG. 12 is a flowchart illustrating a rectangular image reconstruction method, according to another embodiment.
- FIG. 13 is a diagram illustrating a rectangular image reconstruction method, according to another embodiment.
- FIG. 14 is a diagram illustrating movement of pixels, according to another embodiment.
- FIG. 15 is a diagram illustrating vertical movement of pixels, according to another embodiment.
- FIG. 16 is a diagram illustrating a rectangular image reconstruction method, according to another embodiment.
- FIG. 17 is a diagram illustrating a reference line setting process, according to another embodiment.
- FIG. 18 is a block diagram of an image processing apparatus, according to an embodiment.
- FIG. 19 is a flowchart illustrating an image processing method, according to another embodiment.
- FIG. 20 is a flowchart illustrating a process of selecting a projection image reconstruction method, according to an embodiment.
- FIG. 21 is a flowchart illustrating a method of restoring a projection image, according to an embodiment.
- FIG. 22 is a flowchart illustrating a method of restoring a projection image, according to another embodiment.
- FIG. 23 is a block diagram of an image processing apparatus, according to another embodiment.
- FIG. 24 is a diagram illustrating projection images, according to an embodiment.
- FIGS. 25 and 26 are diagrams illustrating generation of a projection image, according to an embodiment.
- FIG. 27 is a diagram illustrating a rectangular image reconstruction method, according to an embodiment.
- FIG. 28 is a flowchart illustrating a rectangular image reconstruction method, according to another embodiment.
- FIG. 29 is a diagram illustrating a rectangular image reconstruction method, according to another embodiment.
- FIGS. 30 and 31 are diagrams illustrating movement of pixels, according to another embodiment.
- FIG. 32 is a diagram illustrating a rectangular image reconstruction method, according to another embodiment.
- FIGS. 33 and 34 are diagrams illustrating movement of pixels, according to another embodiment.
- FIG. 35 is a diagram illustrating a rectangular image reconstruction method, according to another embodiment.
- FIG. 36 is a diagram illustrating movement of pixels, according to another embodiment.
- FIG. 37 is a diagram illustrating a rectangular image reconstruction method, according to another embodiment.
- FIG. 38 is a diagram illustrating movement of pixels, according to another embodiment.
- According to one embodiment, an image processing method includes: obtaining images for at least two directions; generating a projection image by projecting the images onto a polyhedron; reconstructing the projection image into a rectangular image by moving the position of at least one pixel among the pixels of the projection image; and processing the rectangular image.
- According to another embodiment, an image processing method includes: obtaining a rectangular image; reconstructing a projection image by moving the position of at least one pixel among the pixels of the rectangular image; assembling the projection image into a polyhedron; and back-projecting the polyhedron to generate a back-projected image.
- According to one embodiment, an image processing apparatus includes: a controller configured to obtain images for at least two directions, generate a projection image by projecting the images onto a polyhedron, reconstruct the projection image into a rectangular image by moving the position of at least one pixel among the pixels of the projection image, and control processing of the rectangular image; and a memory configured to store data for the operation of the controller.
- Throughout the specification, when a part is "connected" to another part, this includes not only being "directly connected" but also being "electrically connected" with another element in between.
- When any part of the specification is said to "include" a component, this means that it may further include other components rather than excluding other components, unless otherwise stated.
- The term "part," as used herein, refers to a software or hardware component such as an FPGA or an ASIC, and a "part" performs certain roles. However, a "part" is not limited to software or hardware.
- A "part" may be configured to reside in an addressable storage medium or may be configured to execute on one or more processors.
- Thus, as an example, a "part" includes components such as software components, object-oriented software components, class components, and task components, as well as processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuits, data, databases, data structures, tables, arrays, and variables.
- the functionality provided within the components and “parts” may be combined into a smaller number of components and “parts” or further separated into additional components and “parts”.
- 1 is a diagram illustrating a 360 degree image.
- The 360-degree image is an image 120 representing the environment surrounding a specific position 110 in all directions (360 degrees) around that position.
- an image representing the surrounding environment surrounding the user 360 degrees in the virtual reality may be a 360 degree image.
- the virtual reality device may provide a 360 degree image to the user, so that even when a user wearing the virtual reality device moves or looks around in the virtual reality, an appropriate image may be provided accordingly.
- FIG. 2 is a flowchart illustrating an image processing method, according to an exemplary embodiment.
- the image processing apparatus acquires images for at least two directions.
- the image processing apparatus may acquire images in at least two directions by photographing a surrounding environment surrounding the image processing apparatus through a camera.
- the image processing apparatus may photograph the surrounding environment surrounding the image processing apparatus through at least one camera.
- the camera may be a component included in the image processing apparatus or may be configured as a separate device from the image processing apparatus.
- The image processing apparatus may photograph the surrounding environment in a panoramic manner by using a camera, or may photograph each direction, such as front, rear, left, right, up, and down.
- the image processing apparatus may obtain images for at least two directions by receiving an image from an external device. That is, the external device may photograph the surrounding environment or generate a virtual image to transmit the image to the image processing device, and the image processing device may acquire images for at least two directions by receiving the image.
- the image processing apparatus generates the projection image by projecting the images onto the polyhedron.
- the image processing apparatus may project the images to the area of the developed view corresponding to the area of the polyhedron.
- The image processing apparatus may project the images projected onto the polyhedron onto at least one plane outside the polyhedron when generating the projection image. FIGS. 3, 4, 25, and 26 illustrate processes of generating a projection image.
- FIG. 3 is a diagram illustrating a process of projecting an acquired image onto a polyhedron according to an exemplary embodiment.
- FIG. 3(a) shows an image 310 for at least two directions obtained in step 210.
- the present invention is not limited thereto, and various types of images, such as images of front, rear, left, right, up, and down, may be obtained.
- FIG. 3(b) is a diagram in which the acquired image 310 is projected onto the polyhedron 320.
- Specifically, FIG. 3(b) shows an icosahedron, used as the polyhedron 320, onto which the acquired image 310 is projected.
- the polyhedron 320 may be a polyhedron 320 composed of at least one or more triangles having the same shape and area.
- one surface of the polyhedron 320 may be formed of another polygon made of two or more triangles.
- a square formed by combining two or more of the same triangles may form one surface of the polyhedron 320.
- FIG. 4 is a diagram illustrating a projection image, according to an exemplary embodiment.
- FIG. 4 shows a projection image 410 generated by projecting the acquired image 310 onto a developed view of the icosahedron shown in FIG. 3(b).
- The projection image 410 may be the same as the developed view in which the icosahedron onto which the acquired image 310 is projected is unfolded.
- The projection image may be an icosahedral projection image generated using an icosahedral projection technique.
- FIGS. 3 and 4 illustrate the image processing process using an example in which the acquired image 310 is projected onto an icosahedron.
- However, the polyhedron is not limited to the icosahedron, and image processing may also be performed by projecting the acquired image onto various other polyhedrons, such as an octahedron or a tetrahedron.
- it is not limited to using the developed view of a polyhedron, and it is possible to generate a projection image using various methods, such as using the projection of a polyhedron.
- FIGS. 25 and 26 illustrate generating a projection image according to an embodiment.
- FIG. 25(a) shows an octahedron 2510 onto which the acquired image is projected, an octahedron being used instead of an icosahedron, and FIG. 25(c) shows a tetrahedron onto which the acquired image is projected.
- FIG. 25(b) shows a projection image 2520 generated by projecting the acquired image onto a developed view of the octahedron shown in FIG. 25(a), and FIG. 25(d) shows a projection image 2540 generated by projecting the acquired image onto a developed view of the tetrahedron shown in FIG. 25(c).
- The projection image 2540 of FIG. 25(d) may be the same as the developed view in which the tetrahedron onto which the acquired image is projected is unfolded.
- the projection image may be generated by projecting an image projected on the polyhedron onto at least one plane outside the polyhedron.
- Referring to FIG. 26, the acquired image is projected onto the octahedron 2610, and the portion of the image projected onto the upper four faces of the octahedron 2610 is projected again onto at least one plane 2620 above the octahedron, whereby a projection 2630 of the octahedron 2610 may be obtained.
- Likewise, the portion of the image projected onto the lower four faces of the octahedron 2610 is projected again onto at least one plane below the octahedron, whereby another projection of the octahedron 2610 may be obtained.
- In this way, a projection image 2640 including the two projections of the octahedron 2610 may be generated.
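- The re-projection onto a plane described above can be sketched as a central projection: a point lying on an upper face of the polyhedron is mapped onto the plane above it along the ray from the polyhedron's center through the point. The following is a minimal sketch under assumed conventions (polyhedron centered at the origin, target plane z = d); `project_to_plane` is an illustrative name, not the patent's notation.

```python
import numpy as np

def project_to_plane(point, d):
    """Project a 3-D point on a polyhedron face onto the plane z = d,
    along the ray from the polyhedron's center (assumed at the origin)."""
    p = np.asarray(point, dtype=float)
    if p[2] == 0:
        raise ValueError("ray is parallel to the plane z = d")
    # Scale the position vector so its z-component reaches the plane.
    return p * (d / p[2])
```

- For example, a face point (1, 1, 2) projected onto the plane z = 4 lands at (2, 2, 4); applying this to every image point on the upper faces yields one planar projection, and repeating it with a plane below yields the other.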
- the generated projection image may be processed as it is without reconstruction into a rectangular image.
- The image processing apparatus reconstructs the projection image 410 into a rectangular image by moving the position of at least one of its pixels. Since the projection image 410 is a two-dimensional image in which the polyhedron 320 is unfolded, as shown in FIG. 4, it contains margins that are unrelated to the 360-degree image 310. Such margins are merely blank areas in the drawing, but they correspond to data that must be handled in the image processing process. Therefore, the larger the margins, the more data the image processing apparatus has to process, resulting in lower processing efficiency.
- the process of reconstructing the projection image 410 into a rectangular image is a process of reducing unnecessary space to be processed by the image processing apparatus by reducing such a margin.
- Here, moving the position of a pixel means moving the pixel data.
- That is, the pixel data of a specific pixel is stored as the pixel data of another pixel position; no physical pixel is moved.
- According to one embodiment, when reconstructing the projection image 410 into a rectangular image, the rectangular image may be generated by moving only the positions of pixels, without deleting pixels or adding new pixels. That is, the rectangular image may be generated simply by moving pixel positions, without changing the total number of pixels or the pixel data of the projection image 410. Thus, the reconstructed rectangular image and the projection image 410 have the same area. Moreover, because only pixel positions are moved while the total number of pixels and the pixel data remain the same, the rectangular image can easily be restored to the projection image 410 if the positional movement history of the pixels, or merely the original positions of the pixels, is known.
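- Because only pixel positions change while the pixel data stay the same, recording where each moved pixel came from is enough to restore the projection image exactly. A minimal sketch of this idea, assuming the image is a NumPy array and that every moved pixel's move is recorded (`move_pixels` and `restore` are hypothetical names):

```python
import numpy as np

def move_pixels(img, moves):
    """Apply position-only moves ((src_r, src_c) -> (dst_r, dst_c)) and
    record a history mapping each destination back to its source."""
    out = img.copy()
    history = {}
    for (sr, sc), (dr, dc) in moves:
        out[dr, dc] = img[sr, sc]       # pixel data is copied, never altered
        history[(dr, dc)] = (sr, sc)    # remember the original position
    return out, history

def restore(out, history):
    """Invert the recorded moves, placing each pixel back at its origin."""
    img = out.copy()
    for (dr, dc), (sr, sc) in history.items():
        img[sr, sc] = out[dr, dc]
    return img
```

- In this sketch the history plays the role of the "positional movement history" mentioned above: replaying it in reverse reconstructs the moved pixels at their original positions.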
- According to one embodiment, when reconstructing a projection image into a rectangular image, the rectangular image may be generated by moving pixels of the projection image and then adding a margin. For example, as shown in FIGS. 27(a) to 27(c), after the pixels of a projection image generated using a developed view of an icosahedron or an octahedron are moved, at least one margin 2710, 2720, 2730 may be added around the image to form a rectangle, thereby generating a rectangular image.
- According to one embodiment, when reconstructing a projection image into a rectangular image, the rectangular image may be generated by adding a margin without moving the pixels of the projection image. For example, a rectangular image can be generated by adding at least one margin 2740 around a projection image generated using a developed view of a tetrahedron, without moving its pixels. If the reconstructed rectangular image includes a margin, the areas of the reconstructed rectangular image and the projection image may differ.
- According to one embodiment, a horizontal movement direction is determined for each row, and, according to the determined direction, the pixels included in each row may be horizontally shifted to the left or to the right so that the pixels are sequentially filled from the left edge or from the right edge of the row.
- Likewise, a vertical movement direction is determined for each column, and, according to the determined direction, the pixels included in each column may be vertically shifted up or down so that the pixels are sequentially filled from the top edge or from the bottom edge of the column.
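- The row-wise and column-wise shifts described above can be sketched as follows, assuming the margin is marked by a boolean mask and that rows with numbers at or below a threshold pack toward the left while the remaining rows pack toward the right (the mask representation and function names are illustrative assumptions, not the patent's notation):

```python
import numpy as np

def pack_rows(img, mask, threshold):
    """Shift each row's valid pixels so that rows numbered <= threshold
    fill from the left edge and the remaining rows fill from the right edge."""
    h, w = mask.shape
    out = np.zeros_like(img)
    out_mask = np.zeros_like(mask)
    for r in range(h):
        vals = img[r][mask[r]]          # valid pixels, left-to-right order
        n = len(vals)
        if r <= threshold:              # upper rows: pack toward the left
            out[r, :n] = vals
            out_mask[r, :n] = True
        else:                           # lower rows: pack toward the right
            out[r, w - n:] = vals
            out_mask[r, w - n:] = True
    return out, out_mask

def pack_columns_up(img, mask):
    """Shift each column's valid pixels so they fill from the top edge."""
    h, w = mask.shape
    out = np.zeros_like(img)
    for c in range(w):
        vals = img[:, c][mask[:, c]]    # valid pixels, top-to-bottom order
        out[:len(vals), c] = vals
    return out
```

- Applying `pack_rows` and then `pack_columns_up` mirrors the two passes of FIGS. 5 and 6: a horizontal compaction followed by a vertical compaction toward the top edge.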
- FIGS. 5 to 11 illustrate an embodiment of a method of reconstructing the projection image 410 into a rectangular image through vertical and horizontal movement of pixels, together with the results thereof.
- FIG. 5 is a flowchart illustrating a rectangular image reconstruction method according to an embodiment
- FIG. 6 is a diagram illustrating a rectangular image reconstruction method according to an embodiment.
- the image processing apparatus selects one row on a plane on which the projection image 410 is shown.
- Here, the row in which the pixel located at the top of the projection image 410 lies may be taken as the x-axis.
- the image processing apparatus determines whether the number of the selected row is less than or equal to the set value.
- a projection image is shown in FIG. 6A.
- the projection image shown in FIG. 6 (a) has a shape in which triangles are stacked in three layers.
- the set value may be set to 2/3 of the total row length.
- this is only an example, and various standards may be set according to the example.
- If it is determined in step 520 that the number of the selected row is less than or equal to the set value, the image processing apparatus moves the pixels included in the selected row so that the pixels are sequentially filled from the left end of the selected row, in step 530. If it is determined in step 520 that the number of the selected row is greater than the set value, the image processing apparatus moves the pixels included in the selected row so that the pixels are sequentially filled from the right end of the selected row, in step 540.
- In step 550, the image processing apparatus determines whether there is a row that has not yet been selected; if there is, the process returns to step 510 and the above process is repeated.
- Steps 510 to 550 are processes for reconstructing the projection image shown in FIG. 6A into the image shown in FIG. 6B, and will be described in more detail with reference to FIGS. 7 and 8.
- FIG. 7 is a diagram illustrating a horizontal shift of a pixel, according to an exemplary embodiment.
- FIG. 8 is a diagram illustrating the result of a horizontal shift of a pixel, according to an embodiment.
- each triangle constituting the projection image includes 9 pixels, and each pixel represented by the rectangle has respective pixel data.
- the row including the pixel located at the top is called the x-axis, and the row number increases as the row goes downward.
- If the set value is 2/3 of the total row length, that is, 2/3 of the height of the projection image, the pixels in each row whose row number is less than or equal to 2/3 of the total row length may be shifted to the left so that pixels are sequentially filled from the left end 710 of each row.
- The pixels included in each row whose row number is greater than 2/3 of the total row length may be shifted to the right so that pixels are sequentially filled from the right end (not shown) of each row.
- Accordingly, in FIG. 7, the rows up to and including the row containing the pixels with pixel data a, f, g, h, i, j, s, x, y, z, 1, 2 are shifted to the left, and the rows from the row containing the pixels with pixel data 3, 4, 5, 6, 7, #, $, %, ⁇, & downward are shifted to the right.
- FIG. 8 shows the result of the shifting of the pixels.
- the shift result shown in FIG. 8 is for explaining a horizontal shift of a pixel according to an exemplary embodiment. It is assumed that only pixels displayed to have pixel data in FIG. 7 are shifted, and the shift result is shown. Since the projection image shown in FIG. 6A is composed of a larger number of triangles than shown in FIG. 7, the result of the horizontal shift of the actual pixel may be different from that shown in FIG. 8. For example, the top row may have an additional pixel on the right side besides two pixels with pixel data of A and J. Also, pixels included in the lower three rows in which the pixels move to the right may be further moved to the right end not shown in FIG. 8.
- the projection image shown in FIG. 6 (a) may be reconstructed into the image shown in FIG. 6 (b).
- If it is determined in step 550 that there is no unselected row, the image processing apparatus proceeds to step 560 and selects one column on the plane on which the projection image is shown.
- Here, the column in which the leftmost pixel of the projection image lies may be taken as the y-axis.
- the image processing apparatus may move the pixels included in the selected column upward so that the pixels are sequentially filled from the upper end of the selected column.
- In step 580, the image processing apparatus determines whether there is a column that has not been selected; if there is, the process returns to step 560 and repeats the above process. If it is determined in step 580 that there is no unselected column, the reconstruction of the rectangular image is complete.
- Steps 560 to 580 are processes for reconstructing the reconstructed projection image shown in FIG. 6 (b) into the rectangular image shown in FIG. 6 (c), and will be described in more detail with reference to FIGS. 9 and 10.
- FIG. 9 is a diagram illustrating a vertical movement of pixels according to an embodiment
- FIG. 10 is a diagram illustrating a vertical movement result of pixels according to an embodiment.
- the image processing apparatus may move pixels included in each column upward so that pixels are sequentially filled from the upper end 910 of each column. Therefore, the columns on the right side of the column including the pixels having pixel data of D, G, U, Z, g, and 3 may be moved upward.
- The column containing the pixels with pixel data A, B, E, S, X, a and the column containing the pixels with pixel data J, C, F, T, Y, f are already sequentially filled with pixels from the top and therefore do not move.
- a column containing pixels that do not move will be described again below.
- The result of the vertical movement of the pixels is shown in FIG. 10.
- The movement result shown in FIG. 10 is for explaining vertical movement of pixels according to an embodiment, and illustrates the result of vertical movement applied to the horizontal movement result shown in FIG. 8. Therefore, the result of the vertical movement of the actual pixels may differ from that shown in FIG. 10.
- The vertical shifting of pixels illustrated in steps 560 to 580 and FIGS. 9 and 10 is described as a process of moving the pixels included in the selected column so that the pixels are sequentially filled from the upper end of the selected column.
- this is only an example, and it is also possible to move the pixels included in the selected column downward so that the pixels are sequentially filled from the lower end of the selected column.
- adjacent pixels in the reconstructed rectangular image may have continuous pixel data.
- For example, the pixel with pixel data A and the pixel with pixel data J in FIG. 7 were located adjacent to each other in the polyhedron before the projection image 410 was generated. That is, since the projection image 410 is the unfolded polyhedron 320, the pixel with pixel data A and the pixel with pixel data J are physically separated in the projection image 410, but the two original pixels were located adjacent to one vertex in the polyhedron 320.
- pixel data A and J are likely to have continuous data.
- continuous data refers to data that gradually change without a sudden change in data value.
- In image processing, continuous data can be processed faster and more efficiently than discontinuous data. Since adjacent pixels of the reconstructed rectangular image are likely to have continuous pixel data, fast and efficient processing becomes possible.
- the projection image shown in FIG. 6 (b) may be reconstructed into the image shown in FIG. 6 (c).
- FIG. 11 is a diagram illustrating an actual rectangular image reconstructed according to an embodiment.
- FIG. 11 shows an actual rectangular image reconstructed through the method described with reference to FIGS. 5 to 10. Referring to FIG. 11, it can be seen that the rectangular image is reconstructed so that pixels are concentrated at the upper left and the lower right, similarly to FIG. 6(c).
- According to one embodiment, a plurality of pixels including the pixels constituting a first boundary of the projection image may be moved so that the pixels constituting the first boundary are connected to a second boundary, the opposite boundary to which the first boundary was originally joined.
- FIGS. 12 to 15 and 28 to 38 illustrate specific embodiments of reconstructing a rectangular image by moving a plurality of pixels, including the pixels constituting the first boundary of the projection image, so that they are connected to the second boundary, the opposite boundary to which the first boundary was originally joined.
- When the projection image is generated using a developed view of an icosahedron or an octahedron, the first boundary and the second boundary may each consist of a plurality of lines, and the operations of setting a reference point and moving the plurality of pixels with respect to the reference point may be repeated.
- FIG. 12 is a flowchart illustrating a rectangular image reconstruction method according to another embodiment
- FIG. 13 is a diagram illustrating a rectangular image reconstruction method according to another embodiment.
- the image processing apparatus sets a reference point.
- Here, the reference point is a criterion for distinguishing pixels to be moved from pixels that are not to be moved.
- For example, the point at one-tenth of the total row length, that is, one-tenth of the horizontal length measured from the left end of the projection image, may be set as the reference point.
- the image processing apparatus selects one column on the plane in which the projection image is shown.
- If the image processing apparatus determines, by comparing the selected column with the reference point, that the column number of the selected column is smaller than the column number of the reference point, it moves the pixels included in the selected column to the opposite side of the reference point.
- Thereby, the plurality of pixels including the pixels constituting the first boundary of the projection image may be moved so as to be connected to the second boundary, the opposite boundary to which the first boundary was originally joined.
- If it is determined in step 1230 that the column number of the selected column is greater than or equal to the column number of the reference point, the image processing apparatus proceeds directly to step 1250.
- the image processing apparatus determines whether there is an unselected column, and when there is an unselected column, the image processing apparatus returns to step 1220 and repeats the above process.
- Steps 1210 to 1250 are processes for reconstructing the projection image shown in FIG. 13A into the image shown in FIG. 13B, and will be described in more detail with reference to FIG. 14.
- FIG. 14 is a diagram illustrating movement of pixels, according to another embodiment.
- Referring to FIG. 14, the point at one-tenth of the horizontal length from the left end of the projection image may be set as the reference point 1410.
- the pixels 1420 having column numbers smaller than the column numbers of the reference point are moved to the opposite sides of the reference point.
- the pixels constituting the first boundary line 1430 may be moved to be connected to the second boundary line 1440 which is the opposite boundary line connected to the first boundary line.
- The first boundary line 1430 and the second boundary line 1440 are lines that formed one edge of the polyhedron 320 before the projection image 410 was generated. That is, since the projection image 410 is the unfolded polyhedron 320, the first boundary line 1430 and the second boundary line 1440 are physically separated in the projection image 410, but in the original polyhedron 320 they were one line forming an edge. Therefore, even if the plurality of pixels including the pixels constituting the first boundary line are moved so as to be connected to the second boundary line, the opposite boundary to which the first boundary line was originally joined, the pixel data in the projection image 410 do not change; the only difference is the point with respect to which the projection image 410 is represented.
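- For an image spanning the full width, moving the columns to the left of the reference point to its opposite side, so that the first boundary rejoins the second boundary that shared the same polyhedron edge, amounts to a horizontal wrap of the pixel columns. A sketch under that assumption (`move_past_reference` is an illustrative name, not from the patent):

```python
import numpy as np

def move_past_reference(img, ref_col):
    """Move columns [0, ref_col) to the opposite side of the reference
    point, re-attaching the first boundary to the second boundary."""
    # A negative roll shifts columns left, wrapping the leftmost columns
    # around to the right edge of the image.
    return np.roll(img, -ref_col, axis=1)
```

- For example, with a reference point at one-tenth of the width, `ref_col = img.shape[1] // 10`; because the wrap is a pure permutation of column positions, it is trivially invertible, consistent with the observation above that only the representation point changes.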
- If it is determined in step 1250 that there is no unselected column, the image processing apparatus proceeds to step 1260 and selects one column on the plane on which the projection image is shown.
- steps 1260 to 1280 perform the same process as steps 560 to 580 of FIG. 5; therefore, overlapping content will be described only briefly.
- the image processing apparatus may move the pixels included in the selected column upward so that the pixels are sequentially filled from the upper end of the selected column.
- the image processing apparatus determines whether there is an unselected column, and when there is an unselected column, the process returns to step 1260 and repeats the above process.
- Steps 1260 to 1280 are processes for reconstructing the reconstructed projection image shown in FIG. 13B into the rectangular image shown in FIG. 13C, and will be described in more detail with reference to FIG. 15.
- FIG. 15 is a diagram illustrating vertical movement of pixels according to another exemplary embodiment.
- the image processing apparatus may move pixels included in each column upward so that pixels are sequentially filled from the upper end 1510 of each column. There may be a column containing pixels which are already filled with pixels from the top and are not moved. A column containing pixels that do not move will be described again below.
- the projection image shown in FIG. 13B may be reconstructed into the image shown in FIG. 13C.
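The upward movement described above can be sketched as follows. This is an illustrative Python sketch under the assumption that empty positions of the net are marked with None; all names and data are hypothetical.

```python
def compact_columns_upward(image, empty=None):
    """Move the pixels of each column upward so that, from the top of
    the column, occupied positions come first and empty ones last."""
    height, width = len(image), len(image[0])
    out = [[empty] * width for _ in range(height)]
    for c in range(width):
        r_out = 0  # next free row at the top of this column
        for r in range(height):
            if image[r][c] is not empty:
                out[r_out][c] = image[r][c]
                r_out += 1
    return out

# Hypothetical 3x2 fragment; None marks positions outside the net.
image = [[None, 'A'],
         ['B', None],
         [None, 'C']]
compacted = compact_columns_upward(image)
assert compacted == [['B', 'A'], [None, 'C'], [None, None]]
```

A column whose pixels already start at the top (such as the second column here, whose 'A' stays in place) is simply copied, matching the note that some columns do not need to be moved.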
- FIG. 28 is a flowchart illustrating a rectangular image reconstruction method, according to another embodiment.
- the image processing apparatus sets a reference point.
- a predetermined point may be set as a reference point.
- the image processing apparatus selects one row on the plane on which the projection image is shown.
- the image processing apparatus compares the selected row with the reference point; when it determines that the selected row is in the moving direction with respect to the reference point, the image processing apparatus proceeds to step 2840 and moves the pixels included in the selected row to the opposite side of the reference point.
- in this case, a plurality of pixels, including the pixels constituting a first boundary of the projection image, may be moved so that the pixels constituting the first boundary are connected to a second boundary, which is the opposite boundary connected to the first boundary.
- if the image processing apparatus determines in step 2830 that the selected row is not in the moving direction with respect to the reference point, the image processing apparatus proceeds directly to step 2850. In step 2850, the image processing apparatus determines whether there is an unselected row, and when there is an unselected row, the process returns to step 2820 and repeats the above process.
- steps 2810 to 2850 are processes for reconstructing the projection image shown in FIG. 29 (a) into the image shown in FIG. 29 (b) or 29 (d), or the projection image shown in FIG. 32 (a) into the image shown in FIG. 32 (b) or 32 (d), and will be described in more detail with reference to FIGS. 30 to 31 and 33 to 34.
- the projection image may be generated using a development view of an icosahedron or an octahedron.
- a point at 1/3 of the vertical length from the lower end of the projection image may be set as the reference point 3010.
- the pixels in the moving direction with respect to the reference point 3010, that is, the pixels below the reference point 3010, are moved to the opposite side of the reference point.
- the pixels constituting the first boundary line 3020 may be moved to be connected to the second boundary line 3030 which is an opposite boundary line connected to the first boundary line.
- both the first boundary line 3020 and the second boundary line 3030 may be composed of a plurality of lines.
- a point at 1/3 of the vertical length from the upper end of the projection image may be set as the reference point 3110.
- the pixels in the moving direction with respect to the reference point 3110, that is, the pixels above the reference point 3110, are moved to the opposite side of the reference point.
- the pixels constituting the first boundary line 3120 may be moved to be connected to the second boundary line 3130, which is the opposite boundary line connected to the first boundary line.
- both the first boundary line 3120 and the second boundary line 3130 may be formed of a plurality of lines.
- a point at 1/2 of the vertical length of the projection image may be set as the reference points 3310 and 3410.
- the pixels in the moving direction with respect to the reference point are moved to the opposite side of the reference point.
- the pixels constituting the first boundary lines 3320 and 3420 may be moved to be connected to the second boundary lines 3330 and 3430, which are opposite boundary lines connected to the first boundary lines.
- both the first boundary lines 3320 and 3420 and the second boundary lines 3330 and 3430 may be formed of a plurality of lines.
- after the pixels of the projection image are moved in steps 2810 to 2850, the image processing apparatus generates a rectangular image by adding a margin in step 2860.
- this is a process for reconstructing the images shown in FIGS. 29 (b) and 29 (d) into the images shown in FIGS. 29 (c) and 29 (e), respectively, or the images shown in FIGS. 32 (b) and 32 (d) into the images shown in FIGS. 32 (c) and 32 (e), respectively.
- a rectangular image may be generated by adding a margin after pixels of the projection image are moved.
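The margin-adding step can be sketched as follows. This is an illustrative Python sketch; the empty marker (None) and the margin value (0) are assumptions for illustration, not part of the disclosure.

```python
def add_margin(image, empty=None, margin=0):
    """Replace every still-empty position with a margin value so the
    result is a dense rectangular image."""
    return [[margin if p is empty else p for p in row] for row in image]

# Hypothetical moved image in which some positions remained empty.
moved = [['A', 'B', None],
         ['C', None, None]]
rect = add_margin(moved)
assert rect == [['A', 'B', 0], ['C', 0, 0]]
```

Only empty positions are touched, so the moved pixel data is preserved while the output becomes a fully populated rectangle.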
- FIGS. 35 and 37 are diagrams illustrating a rectangular image reconstruction method, according to another embodiment.
- the process of setting a reference point and moving a plurality of pixels to the opposite side of the reference point may be repeated.
- the process of reconstructing the image shown in FIG. 35 (a) into the image shown in FIG. 35 (b) is the same as the process of reconstructing the image shown in FIG. 13 (a) into the image shown in FIG. 13 (b).
- the process of reconstructing the image shown in FIG. 37 (a) into the image shown in FIG. 37 (b) is the same as the process of reconstructing the image shown in FIG. 32 (a) into the image shown in FIG. 32 (b).
- the description of the process is omitted.
- a process of generating the rectangular image shown in FIGS. 35 (c) and 37 (c) will be described in more detail with reference to FIGS. 36 and 38.
- FIGS. 36 and 38 are diagrams illustrating movement of pixels according to another exemplary embodiment.
- a point at 1/3 of the vertical length from the lower end of the projection image may be set as the reference point 3610.
- pixels having a row number smaller than the row number of the reference point are moved to the opposite side of the reference point.
- the pixels constituting the first boundary line 3620 may be moved to be connected to the second boundary line 3630, which is an opposite boundary line connected to the first boundary line.
- both the first boundary line 3620 and the second boundary line 3630 may include a plurality of lines.
- a point at 1/10 of the horizontal length from the left end of the projection image may be set as the reference point 3810.
- pixels having a column number smaller than the column number of the reference point are moved to the opposite side of the reference point.
- the pixels constituting the first boundary line 3820 may be moved to be connected to the second boundary line 3830, which is an opposite boundary line connected to the first boundary line.
- a plurality of reference lines may be set in the projection image, and pixels may be moved based on the plurality of reference lines.
- FIGS. 5 to 15 illustrate moving pixels so that the pixels are sequentially filled from any one of the left end, the right end, the upper end, and the lower end of the image.
- alternatively, a plurality of reference lines may be set, and the pixels may be moved so that they are sequentially filled with respect to each reference line. This is described with reference to FIGS. 16 and 17.
- FIG. 16 is a diagram illustrating a rectangular image reconstruction method according to another embodiment.
- FIG. 17 is a diagram illustrating a process of setting reference lines according to another embodiment.
- FIG. 16 (b) is a diagram in which the pixels have been moved with respect to each reference line.
- a plurality of reference lines 1710, 1720, 1730, 1740, 1750, and 1760 are set.
- in the upper two layers of FIG. 17, the pixels are moved leftward with respect to the left reference line of each triangle; the lower layer is shifted as a whole by one tenth of the horizontal length, and its pixels are then moved rightward with respect to the right reference line.
- the projection image shown in FIG. 16 (a) may be reconstructed into the image shown in FIG. 16 (b).
- the process of reconstructing the image shown in FIG. 16 (b) into the rectangular image shown in FIG. 16 (c) is a vertical movement process of the pixels, and thus description thereof will not be repeated.
- the image processing apparatus may determine whether to move at least one pixel when reconstructing a rectangular image and, when it determines that the at least one pixel is to be moved, may determine a movement direction of the at least one pixel. It is not always necessary to move every pixel, so the apparatus first decides whether a pixel is to be moved and, only if so, decides the direction of the pixel's movement. For example, in FIG. 15, the column including pixels having pixel data of A, B, E, S, X, and a and the column including pixels having pixel data of J, C, F, T, Y, and f are already filled with pixels sequentially from the top and do not need to be moved.
- the pixel movement method according to an embodiment may be applied in various ways. For example, horizontal movement may be followed by vertical movement, or pixels may be moved to the opposite side of a reference point after vertical movement.
- the rectangular image may be reconstructed using various methods.
- the rectangular image may be reconstructed using a combination of the above-described pixel movement methods.
- the image processing apparatus processes the rectangular image.
- the data to be processed may be reduced by reconstructing the projection image into a rectangular image in operation 230, and the adjacent pixels of the reconstructed rectangular image may have continuous pixel data, thereby enabling more efficient processing.
- the process of processing the rectangular image may include compression, rendering, and the like.
- the image processing method may further include generating reconstruction information required to reconstruct the processed rectangular image into the projection image.
- the reconstruction information may include the position-movement history of the pixels and/or information on the original positions of the pixels. As described above, the rectangular image can easily be restored to the projection image 410 as long as the position-movement history of the pixels and/or the original positions of the pixels are available.
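One possible realization of such reconstruction information, assumed here to be a per-pixel log of (source, destination) positions, can be sketched as follows. This is illustrative only; the patent does not prescribe this representation, and all names and data are hypothetical.

```python
# Hypothetical reconstruction info: one (source, destination) pair per
# moved pixel, recorded in the order the moves were made while the
# rectangular image was built.
moves = [((0, 5), (0, 0)), ((1, 5), (1, 0))]

def restore(image, moves, empty=None):
    """Undo the recorded moves in reverse order, returning each pixel
    to its source position in the projection image."""
    out = [row[:] for row in image]
    for (sr, sc), (dr, dc) in reversed(moves):
        out[sr][sc] = out[dr][dc]
        out[dr][dc] = empty
    return out

# 2x6 rectangular image whose two pixels were moved to column 0.
rect = [['X', None, None, None, None, None],
        ['Y', None, None, None, None, None]]
projection = restore(rect, moves)
assert projection[0][5] == 'X' and projection[0][0] is None
```

Because the log is replayed in reverse, a pixel that was moved several times ends up back at its first recorded source, which is why the movement history alone suffices for restoration.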
- the image processing method may further include transmitting the processed rectangular image and the reconstruction information. By transmitting the processed rectangular image and the reconstruction information together, the processed rectangular image on the receiving side can be easily reconstructed into the projection image 410.
- the data to be processed is reduced by reconstructing the projection image into a rectangular image, and adjacent pixels of the reconstructed rectangular image are more likely to have continuous pixel data, thereby enabling more efficient processing.
- FIG. 18 is a block diagram of an image processing apparatus, according to an exemplary embodiment.
- the image processing apparatus 1800 may include a controller 1810 and a memory 1820. According to an embodiment, the image processing apparatus 1800 may be a virtual reality device.
- the controller 1810 may control an operation of the entire image processing apparatus 1800 and may process an image by controlling the memory 1820.
- the controller 1810 may include a RAM that stores signals or data input from the outside of the image processing apparatus 1800 and serves as a storage area for the various operations performed by the electronic device, a ROM in which a control program for controlling peripheral devices is stored, and a processor.
- the processor may be implemented as a system on chip (SoC) that integrates a core and a GPU.
- the processor may include a plurality of processors.
- the controller 1810 may acquire images in at least two directions, generate the projection image by projecting the images onto a polyhedron, reconstruct a rectangular image by moving the position of at least one pixel among the pixels of the projection image, and process the rectangular image.
- the controller 1810 may perform the image processing method described with reference to FIGS. 2 to 17. Duplicate content will be briefly explained here.
- the controller 1810 may project the images to the area of the developed view corresponding to the area of the polyhedron when the images are projected onto the polyhedron. According to another exemplary embodiment, the controller 1810 may project the images projected on the polyhedron to at least one plane outside the polyhedron when the projection image is generated.
- the controller 1810 may generate the rectangular image by moving only the position of the pixel without deleting the pixel or adding a new pixel.
- the controller 1810 may generate a rectangular image by moving at least one pixel of the projection image and adding at least one margin.
- when reconstructing the projection image into a rectangular image, the controller 1810 may determine, for each row, the horizontal movement direction of the pixels included in that row, and may horizontally move the pixels included in each row left or right so that pixels are sequentially filled from the left end or the right end of the row according to the determined direction.
- when reconstructing the projection image 410 into a rectangular image, the controller 1810 may determine, for each column, the vertical movement direction of the pixels included in that column, and may vertically move the pixels included in each column upward or downward so that pixels are sequentially filled from the upper end or the lower end of the column according to the determined direction.
- when reconstructing the rectangular image, the controller 1810 may move a plurality of pixels, including the pixels constituting a first boundary of the projection image, so that the pixels constituting the first boundary are connected to a second boundary, which is the opposite boundary connected to the first boundary.
- the controller 1810 may set a plurality of reference lines in the projection image and move pixels based on the plurality of reference lines when reconstructing a rectangular image.
- the controller 1810 may determine whether to move at least one pixel when reconstructing a rectangular image, and determine a movement direction of at least one pixel when determining that at least one pixel is moved.
- the controller 1810 may generate reconstruction information required to reconstruct a rectangular image into the projection image.
- the controller 1810 may control to transmit the processed rectangular image and the reconstruction information.
- the controller 1810 may reduce unnecessary data to be processed by the image processing apparatus by reconstructing the projection image into a rectangular image.
- the memory 1820 stores a program and data necessary for the operation of the image processing apparatus 1800.
- the memory 1820 may be configured as a volatile storage medium or a nonvolatile storage medium, or may be a combination of both storage media.
- the volatile storage medium may include a semiconductor memory such as RAM, DRAM, SRAM, and the like, and the nonvolatile storage medium may include a hard disk and a flash NAND memory.
- the memory 1820 may store data for the operation of the controller 1810.
- the image processing apparatus 1800 may further include a camera unit, a receiver, a transmitter, and the like.
- the camera unit may capture a 360-degree image of the environment surrounding the image processing apparatus.
- the receiver may receive a 360 degree image from an external device.
- the transmitter may transmit the processed rectangular image and the reconstruction information.
- the receiver and the transmitter may be implemented as a communication unit.
- the image processing apparatus 1800 may reduce data to be processed, process an image such that adjacent pixels have continuous pixel data, reduce power required for data processing, and increase processing efficiency.
- FIG. 19 is a flowchart illustrating an image processing method, according to another exemplary embodiment.
- the image processing method of FIG. 19 is a method of restoring a processed rectangular image to the 360-degree image 310.
- the image processing apparatus acquires a rectangular image.
- the rectangular image may be an image generated by acquiring images 310 in at least two directions, projecting the images 310 onto a development view of the polyhedron 320 to generate the projection image 410, and reconstructing a rectangular image by moving the position of at least one pixel among the pixels of the projection image 410. That is, the rectangular image acquired in step 1910 may be an image generated by the image processing method illustrated in FIGS. 2 to 17.
- the image processing apparatus restores the projection image 410 by moving the position of at least one pixel among the pixels of the rectangular image.
- the process of restoring the rectangular image to the projection image 410 may be performed in the reverse order of the process of reconstructing the projection image 410 illustrated in FIGS. 2 to 17 into a rectangular image.
- an image processing method may include receiving reconstruction information required to reconstruct a rectangular image into the projection image 410.
- the projection image 410 may be restored based on the restoration information. That is, the restoration method may be determined based on the received restoration information.
- FIG. 20 illustrates a process of determining a restoration method based on the received restoration information.
- FIG. 20 is a flowchart illustrating a process of selecting a projection image reconstruction method, according to an exemplary embodiment.
- the image processing apparatus determines a restoration method based on restoration information.
- the image processing apparatus may determine one restoration method from among at least one restoration method. If the restoration method is determined to be method A in step 2010, the process proceeds to step 2020 to restore the projection image using method A; if the restoration method is determined to be method B, the process proceeds to step 2030 to restore the projection image using method B; and if another restoration method is determined, the process proceeds to step 2040 to restore the projection image using that method.
- the restoration methods A and B and the other restoration methods are labeled merely for convenience to distinguish them and do not refer to specific restoration methods.
- the restoration method will be described through specific embodiments.
- FIG. 21 is a flowchart illustrating a method of restoring a projection image, according to an exemplary embodiment.
- FIG. 21 illustrates a method of reconstructing a rectangular image reconstructed by the method shown in FIG. 5 into a projection image 410.
- the process of restoring the rectangular image into the projection image 410 is performed in the reverse order of the process of reconstructing the projection image 410 into the rectangular image.
- the image processing apparatus selects one column on a plane on which a rectangular image is shown.
- the image processing apparatus may move the pixels of the selected column downward. At this time, the pixels included in the selected column are moved not to the bottom end but to their original positions; this is natural, because the goal is to restore the projection image 410 to its original form.
- in step 2130, the image processing apparatus determines whether there is an unselected column; when there is an unselected column, the process returns to step 2110 and repeats the above process. If it is determined in step 2130 that there is no unselected column, the process proceeds to step 2140.
- the image processing apparatus selects one row on the plane on which the projection image 410 is shown.
- the image processing apparatus determines whether the number of the selected row is less than or equal to the set value. If the number of the selected row is less than or equal to the set value, the image processing apparatus proceeds to step 2160 and moves the pixels included in the selected row to the right; if the number of the selected row is greater than the set value, the image processing apparatus moves the pixels included in the selected row to the left in step 2170.
- the image processing apparatus determines whether there is an unselected row, and when there is an unselected row, the process returns to step 2140 and repeats the above process. If it is determined in step 2180 that there is no unselected row, the restoration of the projection image 410 is complete.
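The column-restoration step can be sketched as follows, assuming the restoration information lists, for each column, the original rows its pixels occupied. This is an illustrative Python sketch under that assumed representation, not the disclosed implementation.

```python
def restore_columns(compacted, original_rows, empty=None):
    """Move the pixels of each column back down to their original rows.

    original_rows[c] lists, top to bottom, the rows that the pixels of
    column c occupied in the projection image (restoration info).
    """
    height, width = len(compacted), len(compacted[0])
    out = [[empty] * width for _ in range(height)]
    for c in range(width):
        for i, r in enumerate(original_rows[c]):
            out[r][c] = compacted[i][c]
    return out

# Illustrative data: a 3x2 image whose pixels were compacted upward.
compacted = [['B', 'A'], [None, 'C'], [None, None]]
original_rows = [[1], [0, 2]]
restored = restore_columns(compacted, original_rows)
assert restored == [[None, 'A'], ['B', None], [None, 'C']]
```

Each pixel lands back at its recorded row, so the downward move is the exact inverse of the upward compaction performed during reconstruction.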
- FIG. 22 is a flowchart illustrating a method of restoring a projection image, according to another exemplary embodiment.
- FIG. 22 illustrates a method for reconstructing a rectangular image reconstructed by the method shown in FIG. 12 into a projection image.
- steps 2210 to 2230 are the same as steps 2110 to 2130 of FIG. 21.
- the image processing apparatus selects one column on a plane on which a rectangular image is shown.
- the image processing apparatus may move the pixels of the selected column downward.
- the image processing apparatus selects a pixel.
- the image processing apparatus determines whether the selected pixel is a pixel that was moved across the reference point. If it is determined in step 2250 that the selected pixel was moved across the reference point, the image processing apparatus moves the pixel back to the opposite side of the reference point in step 2260; that is, the pixel is restored to its original position.
- if the image processing apparatus determines in step 2250 that the selected pixel is not a pixel moved across the reference point, the image processing apparatus proceeds directly to step 2270.
- in step 2270, the image processing apparatus determines whether there is an unselected pixel; when there is an unselected pixel, the process returns to the pixel selection step and repeats the above process.
- the restoration of the projection image is completed.
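Restoring pixels that were moved across the reference point amounts to undoing the circular shift, which can be sketched as follows. This is illustrative Python; the names and data are hypothetical.

```python
def unshift_across_reference(image, ref_col):
    """Inverse of the wrap-around move: the ref_col columns that were
    carried past the second boundary return to the near side of the
    reference point."""
    return [row[-ref_col:] + row[:-ref_col] for row in image]

# A row whose two leftmost columns had been carried to the right end.
shifted = [[2, 3, 4, 5, 0, 1]]
restored = unshift_across_reference(shifted, 2)
assert restored == [[0, 1, 2, 3, 4, 5]]
```

Since the forward move was a circular shift, applying the opposite shift recovers every pixel's original position exactly.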
- the method of restoring the reconstructed rectangular image into the projection image according to the exemplary embodiment described with reference to FIGS. 21 and 22 is just one embodiment, and various reconstruction methods may be used according to the method of reconstructing the rectangular image.
- the image processing apparatus restores the projection image to the polyhedron and back-projects the polyhedron to generate the back-projection image.
- the reverse projection image may be an image representing a surrounding environment surrounding a specific location.
- when the projection image was generated by projecting the images onto an area of a development view corresponding to an area of the polyhedron, the projection image may be restored to the polyhedron by assembling the projection image.
- when the projection image was generated by projecting the images projected on the polyhedron onto at least one plane outside the polyhedron, the projection image may be restored to the polyhedron by three-dimensionalizing the projection image.
- the image processing apparatus may generate an image representing all or part of the surrounding environment surrounding a specific location.
- a reverse projection image of the environment around the user may be generated according to the user's gaze.
- the generated reverse projection image may not be an image representing the entire environment around the user, but may be an image representing a part of the environment around the user viewed according to the eyes of the user.
- the reverse projection image may be an image representing the entire environment around the user. That is, the reverse projection image may be a 360 degree image.
- the image processing apparatus may process the image efficiently by restoring the projection image from a rectangular image having less data in operation 1920.
- according to the image processing method, when the reverse projection image is generated based on the processed rectangular image, less data needs to be restored, enabling efficient processing with less power.
- FIG. 23 is a block diagram of an image processing apparatus, according to another exemplary embodiment.
- the image processing apparatus 2300 may include a receiver 2310, a memory 2320, and a controller 2330.
- the image processing apparatus 2300 may be a virtual reality device. In addition, it may be the same device as the image processing apparatus 1800 illustrated in FIG. 18.
- the receiver 2310 serves to receive a rectangular image.
- the received rectangular image may be a rectangular image that has been processed.
- the receiver 2310 may include various components such as a USB interface unit and a DVD interface unit.
- for example, the image processing apparatus 2300 may receive an image file from a USB device.
- the communication unit may be connected to a network by wire or wirelessly to perform communication with an external device, and may include a short range communication module, a mobile communication module, a wireless internet module, a wired internet module, and the like.
- the communication unit may include one or more components.
- the memory 2320 stores a program and data necessary for the operation of the image processing apparatus 2300.
- the memory 2320 may be configured as a volatile storage medium or a nonvolatile storage medium, or may be a combination of both storage media.
- Volatile storage media may include semiconductor memories such as RAM, DRAM, and SRAM, and non-volatile storage media may include hard disks and flash NAND memories.
- the memory 2320 may store data for the operation of the controller 2330.
- the controller 2330 may control an operation of the entire image processing apparatus 2300 and may process an image by controlling the memory 2320.
- the controller 2330 may include a RAM that stores signals or data input from the outside of the image processing apparatus 2300 and serves as a storage area for the various operations performed by the electronic device, a ROM in which a control program for controlling peripheral devices is stored, and a processor.
- the processor may be implemented as an SoC that integrates a core and a GPU.
- the processor may include a plurality of processors.
- the controller 2330 acquires a rectangular image, restores the projection image by moving the position of at least one pixel among the pixels of the rectangular image, restores the projection image to a polyhedron, and back-projects the polyhedron to generate a back-projection image.
- the rectangular image may be an image generated by acquiring images 310 in at least two directions, projecting the images 310 onto a development view of the polyhedron 320 to generate the projection image 410, and moving the position of at least one pixel among the pixels of the projection image to reconstruct the rectangular image. That is, the rectangular image may be an image generated by the image processing method illustrated in FIGS. 2 to 17.
- when the projection image was generated by projecting the images onto an area of a development view corresponding to an area of the polyhedron, the projection image may be restored to the polyhedron by assembling the projection image.
- when the projection image was generated by projecting the images projected on the polyhedron onto at least one plane outside the polyhedron, the projection image may be restored to the polyhedron by three-dimensionalizing the projection image.
- the controller 2330 may perform the image processing method described with reference to FIGS. 19 to 22. Duplicate content will be briefly explained here.
- the controller 2330 may control the receiver 2310 to receive the restoration information necessary to restore the rectangular image to the projection image, and may restore the projection image based on the received restoration information.
- the image processing apparatus 2300 can process efficiently with less power because less data is restored when the reverse projection image is generated based on the rectangular image.
- the polyhedron 320 is not limited to a tetrahedron; image processing may also be performed by projecting the obtained image 310 onto polyhedrons 320 of various shapes.
- FIG. 24 illustrates projection images generated using various types of polyhedrons 320.
- FIG. 24 is a diagram illustrating projection images according to an exemplary embodiment.
- FIG. 24 (a) shows a projection image that can be generated by projecting the acquired image 310 onto an icosahedron, FIGS. 24 (b) and 24 (c) onto a cube, and FIG. 24 (d) onto an octahedron.
- such polyhedrons may be polyhedrons composed of at least one triangle having the same shape and area. In this case, one face of the polyhedron may be a polygon formed by combining two or more triangles.
- FIG. 24 (c) shows a cube in which each face is a quadrangle formed by combining two triangles.
- the projection image may be generated using various types of polyhedrons without being limited to the polyhedrons shown in FIG. 24.
- the above-described embodiments can be written as a program executable on a computer and can be implemented on a general-purpose digital computer that runs the program using a computer-readable recording medium.
- the computer-readable recording medium may be a magnetic storage medium (for example, a ROM, a floppy disk, or a hard disk), an optical reading medium (for example, a CD-ROM or a DVD), or a carrier-wave storage medium (for example, transmission over the Internet).
Claims (15)
- An image processing method comprising: acquiring images for at least two or more directions; generating a projection image by projecting the images onto a polyhedron; reshaping the projection image into a rectangular image by moving positions of at least one or more pixels among the pixels of the projection image; and processing the rectangular image.
- The image processing method of claim 1, wherein the generating of the projection image by projecting the images onto the polyhedron comprises projecting the images onto a region of a development figure (net) corresponding to a region of the polyhedron when the images are projected onto the polyhedron.
- The image processing method of claim 1, wherein the generating of the projection image by projecting the images onto the polyhedron comprises projecting the images projected on the polyhedron onto at least one plane outside the polyhedron.
- The image processing method of claim 1, wherein the reshaping of the projection image into the rectangular image by moving positions of at least one or more pixels comprises generating the rectangular image by adding at least one blank region after moving the pixels of the projection image.
- The image processing method of claim 1, wherein the reshaping of the projection image into the rectangular image by moving positions of at least one or more pixels comprises: determining, for each row, a horizontal translation direction of the pixels included in the row; and horizontally moving the pixels included in each row to the left or to the right according to the determined translation direction so that the pixels are filled sequentially from the left edge or the right edge of the row.
- The image processing method of claim 1, wherein the reshaping of the projection image into the rectangular image by moving positions of at least one or more pixels comprises: determining, for each column, a vertical translation direction of the pixels included in the column; and vertically moving the pixels included in each column up or down according to the determined translation direction so that the pixels are filled sequentially from the top edge or the bottom edge of the column.
- The image processing method of claim 1, wherein the reshaping of the projection image into the rectangular image by moving positions of at least one or more pixels comprises: determining whether to move at least one or more pixels; and, when it is determined to move the at least one or more pixels, determining a moving direction of the at least one or more pixels.
- The image processing method of claim 1, wherein the reshaping of the projection image into the rectangular image by moving positions of at least one or more pixels comprises moving a plurality of pixels, including pixels constituting a first boundary of the projection image, such that the pixels constituting the first boundary are connected to a second boundary, which is the opposite boundary connected with the first boundary.
- The image processing method of claim 1, wherein the reshaping of the projection image into the rectangular image by moving positions of at least one or more pixels comprises: setting a plurality of reference lines in the projection image; and moving the pixels with respect to the plurality of reference lines.
- The image processing method of claim 1, further comprising generating reconstruction information necessary for restoring the rectangular image to the projection image.
- The image processing method of claim 1, wherein the polyhedron is a polyhedron composed of at least one or more triangles having the same shape and area.
- An image processing method comprising: acquiring a rectangular image; restoring a projection image by moving positions of at least one or more pixels among the pixels of the rectangular image; and restoring the projection image to a polyhedron and generating a back-projection image by back-projecting the polyhedron.
- The image processing method of claim 12, further comprising receiving reconstruction information necessary for restoring the rectangular image to the projection image, wherein the restoring of the projection image comprises restoring the projection image based on the received reconstruction information.
- The image processing method of claim 12, wherein the rectangular image is generated by acquiring images for at least two or more directions, generating a projection image by projecting the images onto a polyhedron, and moving positions of at least one or more pixels among the pixels of the projection image to reshape the projection image into the rectangular image.
- An image processing apparatus comprising: a controller configured to acquire images for at least two or more directions, generate a projection image by projecting the images onto a polyhedron, reshape the projection image into a rectangular image by moving positions of at least one or more pixels among the pixels of the projection image, and process the rectangular image; and a memory configured to store data for operation of the controller.
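The row-wise reshaping recited in claims 4-5 and the restoration recited in claims 10 and 12-13 can be sketched as follows. This is a minimal illustration, not the claimed method itself: the function names `compact_rows` and `restore_rows` are hypothetical, only the left-edge fill case is shown (the claims equally cover right-, top-, and bottom-edge filling), and per-row pixel coordinates stand in for whatever reconstruction information an encoder would actually signal.

```python
def compact_rows(projection, blank=None):
    """Shift each row's non-blank pixels so they fill sequentially from
    the left edge (claim 5); the remainder of the row is padded with
    blanks (claim 4). Also returns per-row reconstruction information
    (claim 10): the original column of every moved pixel."""
    rect, info = [], []
    for row in projection:
        cols = [c for c, p in enumerate(row) if p != blank]
        # non-blank pixels packed to the left, blanks appended on the right
        rect.append([row[c] for c in cols] + [blank] * (len(row) - len(cols)))
        info.append(cols)
    return rect, info

def restore_rows(rect, info, blank=None):
    """Undo the shift (claims 12-13): put each row's pixels back at the
    columns recorded in the reconstruction information."""
    projection = []
    for row, cols in zip(rect, info):
        out = [blank] * len(row)
        for i, c in enumerate(cols):
            out[c] = row[i]
        projection.append(out)
    return projection
```

On a toy 2x4 "projection image" whose blank positions mimic the empty corners of a polyhedron net, `compact_rows` yields a fully left-packed rectangle, and `restore_rows` reproduces the original exactly; the column-wise variant of claim 6 would be the same operation applied to columns instead of rows.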
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/089,224 US10825133B2 (en) | 2016-04-05 | 2016-05-18 | Method and apparatus for processing image |
CN201680084366.9A CN109074677B (zh) | 2016-04-05 | 2016-05-18 | Method and device for processing image |
KR1020187025485A KR102493124B1 (ko) | 2016-04-05 | 2016-05-18 | Image processing method and apparatus |
EP16898014.2A EP3416138B1 (en) | 2016-04-05 | 2016-05-18 | Method and apparatus for processing image |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2016/003537 WO2017175888A1 (ko) | 2016-04-05 | Image processing method and apparatus |
KRPCT/KR2016/003537 | 2016-04-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017175910A1 true WO2017175910A1 (ko) | 2017-10-12 |
Family
ID=60000414
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2016/003537 WO2017175888A1 (ko) | 2016-04-05 | Image processing method and apparatus |
PCT/KR2016/005289 WO2017175910A1 (ko) | 2016-04-05 | 2016-05-18 | Image processing method and apparatus |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2016/003537 WO2017175888A1 (ko) | 2016-04-05 | Image processing method and apparatus |
Country Status (5)
Country | Link |
---|---|
US (1) | US10825133B2 (ko) |
EP (1) | EP3416138B1 (ko) |
KR (1) | KR102493124B1 (ko) |
CN (1) | CN109074677B (ko) |
WO (2) | WO2017175888A1 (ko) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108307166A (zh) * | 2018-03-09 | 2018-07-20 | 嘀拍信息科技南通有限公司 | 一种新的全景视频传输投影模型 |
US10891711B2 (en) | 2017-04-13 | 2021-01-12 | Samsung Electronics Co., Ltd. | Image processing method and apparatus |
US10931971B2 (en) | 2016-12-27 | 2021-02-23 | Samsung Electronics Co., Ltd. | Method and apparatus for encoding and decoding 360-degree image |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109155858B (zh) | 2016-05-16 | 2022-09-13 | 三星电子株式会社 | 视频编码方法和设备、视频解码方法和设备 |
KR102176322B1 (ko) * | 2018-12-19 | 2020-11-09 | 주식회사 팀제파 | 멀티 프로젝터 제어 시스템 및 방법 |
CN111489411B (zh) * | 2019-01-29 | 2023-06-20 | 北京百度网讯科技有限公司 | 线条绘制方法、装置、图像处理器、显卡及车辆 |
GB2585645B (en) * | 2019-07-08 | 2024-04-17 | Toshiba Kk | Computer vision method and system |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100732958B1 (ko) * | 2004-08-13 | 2007-06-27 | Kyung Hee University Industry-Academic Cooperation Foundation | Method and apparatus for encoding and decoding an icosahedral panorama image |
KR101120131B1 (ko) * | 2009-05-29 | 2012-03-22 | Youngkook Electronics Co., Ltd. | Intelligent wide-area surveillance camera, control circuit and control method thereof, and video surveillance system using the same |
KR101201107B1 (ko) * | 2004-12-30 | 2012-11-13 | Microsoft Corporation | Minimizing dead zones in panoramic images |
US20140132598A1 (en) * | 2007-01-04 | 2014-05-15 | Hajime Narukawa | Method of mapping image information from one face onto another continous face of different geometry |
KR20150091517A (ko) * | 2012-12-06 | 2015-08-11 | Qualcomm Incorporated | Annular view for panorama image |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2424601A (en) | 1944-01-25 | 1947-07-29 | Joel E Crouch | Icosahedral map |
US6459451B2 (en) * | 1996-06-24 | 2002-10-01 | Be Here Corporation | Method and apparatus for a panoramic camera to capture a 360 degree image |
JP2003141562A (ja) * | 2001-10-29 | 2003-05-16 | Sony Corp | Image processing apparatus and image processing method for non-planar images, storage medium, and computer program |
US7259784B2 (en) | 2002-06-21 | 2007-08-21 | Microsoft Corporation | System and method for camera color calibration and image stitching |
EP1559060A4 (en) * | 2002-11-06 | 2007-06-13 | Geometric Informatics Inc | ANALYSIS OF GEOMETRIC SURFACES BY CONFORMITY STRUCTURE |
JP4355535B2 (ja) * | 2003-08-07 | 2009-11-04 | Iwane Laboratories, Ltd. | 360-degree image conversion processing device |
KR100755450B1 (ko) * | 2006-07-04 | 2007-09-04 | Chung-Ang University Industry-Academic Cooperation Foundation | Apparatus and method for three-dimensional reconstruction using planar homography |
US20110310219A1 (en) | 2009-05-29 | 2011-12-22 | Youngkook Electronics, Co., Ltd. | Intelligent monitoring camera apparatus and image monitoring system implementing same |
US20130044258A1 (en) | 2011-08-15 | 2013-02-21 | Danfung Dennis | Method for presenting video content on a hand-held electronic device |
KR20130043300A (ko) * | 2011-10-20 | 2013-04-30 | Samsung Electronics Co., Ltd. | Apparatus and method for correcting an image projected by a projector |
JP5870636B2 (ja) * | 2011-11-09 | 2016-03-01 | Sony Corporation | Image processing apparatus and method, and program |
JP6421445B2 (ja) * | 2014-01-24 | 2018-11-14 | Ricoh Company, Ltd. | Projection system, image processing apparatus, calibration method, system, and program |
JP6112616B2 (ja) * | 2014-04-18 | 2017-04-12 | NEC Fielding, Ltd. | Information processing apparatus, information processing system, information processing method, and program |
GB2527503A (en) | 2014-06-17 | 2015-12-30 | Next Logic Pty Ltd | Generating a sequence of stereoscopic images for a head-mounted display |
US10204658B2 (en) | 2014-07-14 | 2019-02-12 | Sony Interactive Entertainment Inc. | System and method for use in playing back panorama video content |
JP6464599B2 (ja) * | 2014-07-31 | 2019-02-06 | Ricoh Company, Ltd. | Image processing apparatus, image processing system, method for controlling image processing apparatus, and program |
2016
- 2016-04-05 WO PCT/KR2016/003537 patent/WO2017175888A1/ko active Application Filing
- 2016-05-18 WO PCT/KR2016/005289 patent/WO2017175910A1/ko active Application Filing
- 2016-05-18 KR KR1020187025485A patent/KR102493124B1/ko active IP Right Grant
- 2016-05-18 CN CN201680084366.9A patent/CN109074677B/zh active Active
- 2016-05-18 EP EP16898014.2A patent/EP3416138B1/en active Active
- 2016-05-18 US US16/089,224 patent/US10825133B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3416138A4 * |
Also Published As
Publication number | Publication date |
---|---|
KR102493124B1 (ko) | 2023-01-30 |
WO2017175888A1 (ko) | 2017-10-12 |
CN109074677A (zh) | 2018-12-21 |
EP3416138B1 (en) | 2020-09-23 |
EP3416138A4 (en) | 2019-03-20 |
CN109074677B (zh) | 2023-07-07 |
US20190108612A1 (en) | 2019-04-11 |
EP3416138A1 (en) | 2018-12-19 |
KR20180132049A (ko) | 2018-12-11 |
US10825133B2 (en) | 2020-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017175910A1 (ko) | Image processing method and apparatus | |
WO2020145689A1 (en) | Method and apparatus for improving image padding in video-based point-cloud compression codec | |
WO2020013592A1 (en) | Improved point cloud compression via color smoothing of point cloud prior to texture video generation | |
WO2019093834A1 (en) | Point cloud compression using non-orthogonal projection | |
EP3782122A1 (en) | Point cloud compression using interpolation | |
WO2018182192A1 (en) | Method and apparatus for displaying image based on user motion information | |
WO2019078696A1 (en) | COMPRESSION OF POINT CLOUD USING HYBRID TRANSFORMS | |
WO2018070810A1 (ko) | Method and apparatus for processing virtual reality image | |
WO2016048108A1 (en) | Image processing apparatus and image processing method | |
WO2020231231A1 (en) | Single-pass boundary detection in video-based point cloud compression | |
EP3632119A1 (en) | Display apparatus and server, and control methods thereof | |
WO2018093100A1 (en) | Electronic apparatus and method for processing image thereof | |
WO2017026705A1 (ko) | Electronic device for generating 360-degree three-dimensional stereoscopic image, and method therefor | |
WO2021096233A1 (en) | Electronic apparatus and control method thereof | |
WO2021133053A1 (ko) | Electronic device and control method thereof | |
WO2019022509A1 (en) | DEVICE AND METHOD FOR PROVIDING CONTENT | |
WO2018030567A1 (ko) | HMD and method for controlling the HMD | |
WO2021141400A1 (en) | Attribute transfer in v-pcc | |
WO2016126083A1 (ko) | Method, electronic device, and storage medium for notifying of surrounding situation information | |
WO2019035581A1 (ko) | Server, display apparatus, and control method thereof | |
WO2018190446A1 (ko) | Image processing method and apparatus | |
WO2016080653A1 (en) | Method and apparatus for image processing | |
WO2018124624A1 (en) | Method, device, and system for processing multimedia signal | |
WO2016072538A1 (ko) | Method for operating camera device through user interface | |
WO2023055033A1 (en) | Method and apparatus for enhancing texture details of images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 20187025485 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2016898014 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2016898014 Country of ref document: EP Effective date: 20180914 |
NENP | Non-entry into the national phase |
Ref country code: DE |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16898014 Country of ref document: EP Kind code of ref document: A1 |