WO2022230186A1 - Distribution control system, distribution control device, distribution control method, and program - Google Patents
Distribution control system, distribution control device, distribution control method, and program
- Publication number
- WO2022230186A1 (PCT/JP2021/017244, JP2021017244W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- virtual
- virtual viewpoint
- distribution control
- user
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
- G06T17/205—Re-meshing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/21805—Source of audio or video content, e.g. local disk arrays enabling multiple viewpoints, e.g. using a plurality of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234345—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements the reformatting operation being performed only on part of the stream, e.g. a region of the image or a time segment
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g 3D video
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/08—Bandwidth reduction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/36—Level of detail
Definitions
- the present invention relates to a distribution control system, a distribution control device, a distribution control method, and a program.
- Stereoscopic video content with six degrees of freedom (6DoF), typified by volumetric videos and holograms, is known.
- Volumetric video is animation data composed of polygon meshes (hereinafter also simply referred to as "mesh") and textures, and can be displayed on a display by rendering it together with the virtual environment on the client side.
- Non-Patent Document 1 proposes a method of rendering a volumetric video on the server side based on the movement of the user's head detected by an AR/VR device, which is a client, and transmitting it to the client as 2D data.
- Non-Patent Document 2 proposes a method of distributing a volumetric video generated in real time to a client and rendering it on the client side.
- Non-Patent Document 3 proposes a method of reducing the amount of data required for playback by dynamically changing the Level Of Detail of volumetric video according to the bandwidth of the communication network.
- However, volumetric video has a large amount of data, and the communication network bandwidth required for its distribution is correspondingly large, so an efficient distribution method is required.
- In the method of Non-Patent Document 1, rendering for each user must be performed on the server side, which imposes a heavy load on the server. Moreover, when the number of users increases, dividing server resources among them may degrade the quality of the video viewed by each user. Furthermore, position information must be sent from the client to the server with high frequency and low delay, so the load on the communication network is also large.
- The method of Non-Patent Document 2 requires a communication band of 4 Gbps, but it is difficult for users to always secure a stable 4 Gbps communication band.
- In addition, because the load on the communication network is heavy, the bandwidth usable by other users sharing the same communication network is narrowed, which lowers the quality of experience of those users.
- In the method of Non-Patent Document 3, when the usable bandwidth of the communication network is narrow, the image quality and Level of Detail are degraded even for the 3D data within the range visible to the user in the volumetric video being viewed (that is, the 3D data corresponding to the front of the object as seen by the user), so the quality of experience is significantly degraded.
- An embodiment of the present invention has been made in view of the above points, and aims to reduce the amount of data required to distribute stereoscopic video content while maintaining the user's quality of experience.
- To achieve the above object, a distribution control system according to an embodiment includes: an arrangement unit that arranges a plurality of virtual viewpoints centered on an object represented by the stereoscopic data constituting stereoscopic video content; a first creation unit that creates, for each of the virtual viewpoints, one-plane stereoscopic data in which the data amount of the portion of the object that cannot be visually recognized from that virtual viewpoint is reduced; and a distribution unit that distributes, to the user's terminal, the one-plane stereoscopic data of one virtual viewpoint among the one-plane stereoscopic data for the respective virtual viewpoints, according to the user's position and field of view.
- FIG. 2 is a flowchart showing an example of one-plane 3D data creation processing according to the present embodiment
- FIG. 3 is a diagram showing an example of the arrangement of virtual viewpoints
- FIG. 4 is a flowchart showing an example of correspondence table creation processing according to the present embodiment
- FIG. 5 is a diagram showing an example of a viewing angle range
- FIG. 6 is a diagram showing an example of a viewing angle range correspondence table
- FIG. 7 is a flowchart showing an example of distribution processing according to the present embodiment
- FIG. 8 is a diagram showing an example of specifying a virtual viewpoint
- The volumetric video is animation data composed of 3D data (also called three-dimensional data or stereoscopic data) represented by meshes and textures. That is, for example, if the 3D data of the frame at time t is d_t, the volumetric video is expressed as {d_t | t ∈ [t_s, t_e]}, where t_s is the start time of the volumetric video and t_e is the end time.
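As a concrete illustration of the {d_t | t ∈ [t_s, t_e]} notation, a volumetric video can be modeled as a time-indexed collection of mesh-and-texture frames. The sketch below is only illustrative; the `Frame3D` and `VolumetricVideo` names and their field choices are assumptions for the example, not structures defined in this document.

```python
from dataclasses import dataclass, field

@dataclass
class Frame3D:
    """3D data d_t of a single frame: a polygon mesh plus a texture."""
    t: int
    mesh: list     # polygon mesh (e.g. a list of faces); left empty here for brevity
    texture: bytes

@dataclass
class VolumetricVideo:
    """A volumetric video {d_t | t in [t_s, t_e]} as a time-indexed frame map."""
    frames: dict = field(default_factory=dict)  # maps time t -> Frame3D

    def d(self, t):
        """Return the 3D data d_t of the frame at time t."""
        return self.frames[t]

# a tiny three-frame video (t_s = 0, t_e = 2) with placeholder contents
video = VolumetricVideo()
for t in range(3):
    video.frames[t] = Frame3D(t=t, mesh=[], texture=b"")
```

The map-of-frames view matches how the later processing iterates frame by frame over t.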
- Although this embodiment is described below targeting volumetric video, it can be similarly applied to, for example, other stereoscopic video content having six degrees of freedom, such as holograms.
- FIG. 1 is a diagram showing an example of the overall configuration of a distribution control system 1 according to this embodiment.
- the distribution control system 1 includes a distribution control server 10, a content server 20, a client 30, and a service site server 40. Also, the distribution control server 10 and the client 30 are communicably connected via a communication network N such as the Internet. Similarly, client 30 and service site server 40 are communicably connected via communication network N.
- In the example of FIG. 1, the distribution control server 10 and the content server 20 exist within the same local network and are communicably connected within that local network; however, they may instead be communicably connected via the communication network N.
- the distribution control server 10 creates a plurality of one-plane 3D data from the 3D data that constitutes the given volumetric video, and stores the plurality of one-plane 3D data in the content server 20 .
- The one-plane 3D data is three-dimensional data representing the object, represented by the 3D data constituting the volumetric video, as viewed from a certain viewpoint, and it has a smaller amount of data than the original 3D data.
- The object is the subject of the volumetric video, and means anything that can be represented by meshes and textures, such as people, animals, plants, structures, buildings, machines, celestial bodies, and natural phenomena.
- The distribution control server 10 determines the appropriate one-plane 3D data from the user's viewpoint, spatial position, field of view (line-of-sight direction and viewing range), and so on, and distributes this one-plane 3D data to the client 30.
- the user's viewpoint and spatial position refer to the user's position in the virtual space in which the target object is arranged.
- The content server 20 stores the plurality of one-plane 3D data. In response to a data request from the distribution control server 10, the content server 20 returns the one-plane 3D data corresponding to that request to the distribution control server 10.
- The clients 30 are various terminals used by users who watch the volumetric video (e.g., XR (VR/AR/MR/SR, etc.) devices); they render the one-plane 3D data distributed from the distribution control server 10 to play the volumetric video.
- Note that XR devices also include smartphones, tablet terminals, wearable devices, and the like on which application programs that function as XR devices are installed.
- When a user views a volumetric video, the following procedure is performed.
- the user accesses the service site server 40 with the client 30 and acquires a list of contents (volumetric videos) that the user can view.
- the user selects the volumetric video they wish to view from this list and obtains a link to the selected volumetric video.
- When the client 30 accesses the link, a viewing request is transmitted to the distribution control server 10, and one-plane 3D data is returned in response to this request, thereby starting playback of the volumetric video.
- the client 30 appropriately transmits information such as the user's viewpoint, spatial position, field of view (hereinafter, information representing the user's viewpoint or spatial position and field of view is also referred to as "user viewpoint information") to the distribution control server 10.
- In response, one-plane 3D data corresponding to the user's viewpoint, spatial position, field of view, etc. is returned from the distribution control server 10 and reproduced by the client 30.
- the service site server 40 presents a list of content (volumetric video) that the user can view, and provides the client 30 with a link to content selected from this list.
- The distribution control server 10 has a one-plane 3D data creation unit 101, a correspondence table creation unit 102, a distribution control unit 103, and a distribution unit 104. These units are implemented by, for example, one or more programs installed in the distribution control server 10 causing a processor such as a CPU (Central Processing Unit) to execute processing.
- the distribution control server 10 has a correspondence table storage unit 105 .
- The correspondence table storage unit 105 is implemented by, for example, an auxiliary storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- The one-plane 3D data creation unit 101 creates a plurality of one-plane 3D data from the 3D data constituting the volumetric video given to the distribution control server 10. More specifically, the one-plane 3D data creation unit 101 arranges a plurality of virtual viewpoints at regular (or irregular) intervals around the object represented by the 3D data constituting the volumetric video, and then, for each virtual viewpoint, performs geometry simplification (Level Of Detail lowering processing) on the mesh corresponding to the back surface of the object as seen from that virtual viewpoint. In this way, by simplifying the geometry of the mesh of the part of the object that cannot be seen from the virtual viewpoint (that is, its back surface), one-plane 3D data is created for each virtual viewpoint.
- the one-plane 3D data creation unit 101 saves the created plurality of one-plane 3D data in the content server 20 .
- The correspondence table creation unit 102 creates a correspondence table (hereinafter also referred to as the "viewing angle range correspondence table") that associates each virtual viewpoint arranged when creating the plurality of one-plane 3D data with the viewing angle range of that virtual viewpoint.
- The viewing angle range of a virtual viewpoint is the range of positions from which viewing the object yields the same quality of experience as viewing it from that virtual viewpoint; in other words, it is the range from which, as from the virtual viewpoint itself, the geometrically simplified back surface of the object cannot be seen.
- Upon receiving user viewpoint information from the client 30, the distribution control unit 103 refers to the viewing angle range correspondence table and determines the appropriate one-plane 3D data from this user viewpoint information. More specifically, when at least a part of the object is within the field of view at the viewpoint or spatial position included in the user viewpoint information, the distribution control unit 103 specifies the virtual viewpoint corresponding to the viewing angle range that includes that viewpoint or spatial position, and determines the one-plane 3D data corresponding to the specified virtual viewpoint as the appropriate data.
- The distribution unit 104 transmits the one-plane 3D data determined by the distribution control unit 103 to the client 30.
- the correspondence table storage unit 105 stores the viewing angle range correspondence table created by the correspondence table creating unit 102 .
- The content server 20 has a one-plane 3D data storage unit 201.
- The one-plane 3D data storage unit 201 is implemented by, for example, an auxiliary storage device such as an HDD or SSD.
- The one-plane 3D data storage unit 201 stores each one-plane 3D data created by the one-plane 3D data creation unit 101.
- the configuration of the distribution control system 1 shown in FIG. 1 is an example, and other configurations may be used.
- the distribution control server 10 and content server 20 may be configured as an integrated server.
- FIG. 2 is a flow chart showing an example of one-plane 3D data creation processing according to the present embodiment. In the following, it is assumed that the volumetric video is provided to the distribution control server 10.
- First, the one-plane 3D data creation unit 101 acquires the 3D data for one frame from the 3D data forming the volumetric video (step S101).
- That is, the one-plane 3D data creation unit 101 acquires the 3D data d_t of the frame at time t.
- Although the 3D data is composed of meshes and textures, the textures are not processed in what follows, so only the meshes of the 3D data may be acquired in step S101 above.
- Next, the one-plane 3D data creation unit 101 arranges N virtual viewpoints around the object represented by the 3D data d_t acquired in step S101 (step S102).
- Note that the line-of-sight direction of each virtual viewpoint is toward the object.
- For example, the one-plane 3D data creation unit 101 arranges, at regular (or irregular) intervals on the circumference of a circle with a predetermined radius centered on the object, N virtual viewpoints whose line-of-sight directions face the object.
- N is a predetermined integer of 2 or more.
- FIG. 3 shows an example in which virtual viewpoints V_1 to V_8, each with the object O in its line-of-sight direction, are arranged at regular intervals on the circumference of a circle with radius R centered on the object O.
- Arranging the virtual viewpoints on the circumference of the circle is an example, and the present invention is not limited to this.
- the virtual viewpoints may be arranged on the sides or vertices of a polygon centered on the object.
- Also, in the example of FIG. 3, the virtual viewpoints V_1 to V_8 are arranged on the xy plane of an xyz space whose origin is the center of the object O, but virtual viewpoints may instead be arranged on a sphere (or a polyhedron) centered on the object O.
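The viewpoint arrangement described above (N viewpoints at regular intervals on a circle of radius R, each looking at the object) can be sketched as follows. This is a minimal illustration under assumed conventions (angles measured on the xy plane, a `VirtualViewpoint` container invented for the example), not this document's implementation.

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualViewpoint:
    index: int        # i of V_i
    position: tuple   # (x, y, z) position on the circle around the object
    angle_deg: float  # angle theta_i of the viewpoint with respect to the object

def arrange_viewpoints(center, radius, n):
    """Arrange n virtual viewpoints at regular intervals on the circumference of
    a circle of the given radius on the xy plane, centered on the object at
    `center`. Each viewpoint's line of sight runs from its position toward the
    object; irregular spacing would simply use a different list of angles."""
    cx, cy, cz = center
    viewpoints = []
    for i in range(n):
        theta = 360.0 * i / n
        rad = math.radians(theta)
        pos = (cx + radius * math.cos(rad), cy + radius * math.sin(rad), cz)
        viewpoints.append(VirtualViewpoint(index=i + 1, position=pos, angle_deg=theta))
    return viewpoints

# eight viewpoints V_1..V_8 at radius R = 5 around an object at the origin, as in FIG. 3
vps = arrange_viewpoints(center=(0.0, 0.0, 0.0), radius=5.0, n=8)
```

Arranging viewpoints on a sphere would extend the same idea with a second (polar) angle.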
- the one-plane 3D data creation unit 101 selects one virtual viewpoint from the N virtual viewpoints arranged in step S102 (step S103).
- Next, the one-plane 3D data creation unit 101 performs geometry simplification (Level Of Detail lowering processing) on the mesh corresponding to the back surface of the object as seen from the virtual viewpoint selected in step S103 (step S104).
- In other words, the one-plane 3D data creation unit 101 performs geometry simplification processing on the meshes that cannot be seen (are invisible) from the virtual viewpoint, among the meshes that make up the object.
- As a result, one-plane 3D data is created that, when viewed from the virtual viewpoint, provides a quality of experience equivalent to that of the original 3D data d_t while having a data amount reduced from that of the original 3D data d_t.
- the one-plane 3D data creation unit 101 determines whether or not all N virtual viewpoints have been selected (step S105).
- If it is determined in step S105 that there is a virtual viewpoint that has not yet been selected, the one-plane 3D data creation unit 101 returns to step S103, selects one virtual viewpoint from among those not yet selected, and executes the processing from step S104 onwards.
- On the other hand, if it is determined in step S105 that all virtual viewpoints have been selected, the one-plane 3D data creation unit 101 determines whether or not there is a next frame in the given volumetric video (step S106).
- If it is determined in step S106 that there is a next frame, the one-plane 3D data creation unit 101 returns to step S101, acquires the 3D data of the next frame, and executes the processing from step S102 onwards. That is, in this case, the one-plane 3D data creation unit 101 sets t ← t+1, returns to step S101 above, and acquires the 3D data d_t of the frame at the next time t.
- On the other hand, if it is determined in step S106 that there is no next frame, the one-plane 3D data creation unit 101 saves the created one-plane 3D data in the one-plane 3D data storage unit 201 of the content server 20 (step S107).
- That is, if d_t^i denotes the one-plane 3D data corresponding to the virtual viewpoint V_i (where i ∈ [1, N]) at time t, then {d_t^i | t ∈ [t_s, t_e], i ∈ [1, N]} is stored in the one-plane 3D data storage unit 201.
- Note that in the present embodiment the number N of arranged virtual viewpoints is common to all frames, but it may differ from frame to frame. Also, in the present embodiment, steps S102 to S105 above are repeatedly executed for each frame; however, when the 3D data d_t is the same over a certain time width, steps S102 to S105 may be executed only for the 3D data d_t of one frame included in that time width.
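A toy version of the per-viewpoint simplification in step S104 might look like the following. The back-face test via a normal/viewing-direction dot product and the crude "keep only a fraction of invisible faces" decimation are stand-ins chosen for the example; this document only specifies that the Level of Detail of the invisible part is lowered, e.g. by proper mesh simplification.

```python
def is_back_facing(face_normal, face_center, viewpoint_pos):
    """A face is treated as invisible from the virtual viewpoint when its
    outward normal points away from the viewpoint (dot product <= 0)."""
    to_view = tuple(v - c for v, c in zip(viewpoint_pos, face_center))
    return sum(n * t for n, t in zip(face_normal, to_view)) <= 0.0

def simplify_for_viewpoint(faces, viewpoint_pos, keep_ratio=0.1):
    """Split the mesh faces into those visible and invisible from the viewpoint,
    and crudely decimate the invisible ones. Dropping faces is a stand-in for
    Level of Detail lowering; a real implementation would apply proper mesh
    simplification (e.g. edge collapse) to the invisible part instead."""
    front, back = [], []
    for face in faces:
        target = back if is_back_facing(face["normal"], face["center"], viewpoint_pos) else front
        target.append(face)
    return front + back[: int(len(back) * keep_ratio)]

# two opposite faces of an object at the origin
faces = [
    {"normal": (1.0, 0.0, 0.0), "center": (1.0, 0.0, 0.0)},    # faces +x
    {"normal": (-1.0, 0.0, 0.0), "center": (-1.0, 0.0, 0.0)},  # faces -x
]
# viewed from +x, only the +x face survives at full detail
one_plane = simplify_for_viewpoint(faces, viewpoint_pos=(5.0, 0.0, 0.0))
```

Running this once per virtual viewpoint and per frame yields the set {d_t^i} stored on the content server.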
- FIG. 4 is a flowchart showing an example of correspondence table creation processing according to the present embodiment.
- First, the correspondence table creation unit 102 arranges N virtual viewpoints around the object represented by the 3D data d, as in step S102 of FIG. 2 (step S201). Note that the number of virtual viewpoints arranged and the method of arrangement (equal or unequal intervals, on a circle, on a polygon, etc.) are the same as in step S102 of FIG. 2.
- the correspondence table creation unit 102 selects one virtual viewpoint from the N virtual viewpoints arranged in step S201 (step S202).
- the correspondence table creation unit 102 calculates the boundary angle between the virtual viewpoint selected in step S202 and the virtual viewpoint adjacent thereto (hereinafter referred to as "adjacent virtual viewpoint") (step S203).
- The boundary angle is the angle between the line of sight of the virtual viewpoint and the boundary between that virtual viewpoint and an adjacent virtual viewpoint.
- For example, the adjacent virtual viewpoints of the virtual viewpoint V_2 are V_1 and V_3. Let θ_21 be the angle between the line of sight of the virtual viewpoint V_2 and the line of sight of the adjacent virtual viewpoint V_1, and let θ_22 be the angle between the line of sight of V_2 and the line of sight of the adjacent virtual viewpoint V_3. The line that bisects the angle θ_21 is the boundary with the adjacent virtual viewpoint V_1, and the line that bisects the angle θ_22 is the boundary with the adjacent virtual viewpoint V_3; the angles from the line of sight of V_2 to these boundaries are its boundary angles.
- the correspondence table creation unit 102 determines whether or not all N virtual viewpoints have been selected (step S204).
- step S204 If it is determined in step S204 above that there is a virtual viewpoint that has not yet been selected, the correspondence table creation unit 102 returns to step S202 above, and selects one virtual viewpoint from the virtual viewpoints that have not yet been selected. Then, the processing after step S203 is executed.
- On the other hand, if it is determined in step S204 that all virtual viewpoints have been selected, the correspondence table creation unit 102 calculates the viewing angle range of each virtual viewpoint from the boundary angles calculated in step S203, creates a viewing angle range correspondence table, and stores the created viewing angle range correspondence table in the correspondence table storage unit 105 (step S205).
- For example, suppose the virtual viewpoint V_i (where i ∈ [1, N]) is positioned at an angle θ_i with respect to the object, and let θ_i1 and θ_i2 be its boundary angles. Then the viewing angle range of the virtual viewpoint V_i is calculated as θ_i − θ_i1 ≤ θ ≤ θ_i + θ_i2.
- The viewing angle range correspondence table is created by associating each virtual viewpoint V_i with its viewing angle range θ_i − θ_i1 ≤ θ ≤ θ_i + θ_i2.
- An example of the viewing angle range correspondence table created in this way is shown in FIG. 6.
- Note that, in the above, a common viewing angle range correspondence table is created for the 3D data d_t at all times t ∈ [t_s, t_e]; however, when the number of arranged virtual viewpoints differs from frame to frame, a viewing angle range correspondence table may be created for each number of arranged virtual viewpoints.
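Under the assumption that viewpoints are identified by their angles around the object on a circle, the boundary-angle construction above (each range extends to the bisector between adjacent lines of sight) can be sketched as:

```python
def build_viewing_angle_table(viewpoint_angles):
    """Build a viewing angle range correspondence table from a sorted list of
    viewpoint angles theta_i (degrees, measured around the object). The
    boundary with each adjacent viewpoint is the bisector of the angle between
    the two lines of sight, so each viewpoint's range extends halfway toward
    its neighbor on either side."""
    n = len(viewpoint_angles)
    table = {}
    for i, theta in enumerate(viewpoint_angles):
        gap_prev = (theta - viewpoint_angles[(i - 1) % n]) % 360.0
        gap_next = (viewpoint_angles[(i + 1) % n] - theta) % 360.0
        lo = (theta - gap_prev / 2.0) % 360.0   # boundary with the previous viewpoint
        hi = (theta + gap_next / 2.0) % 360.0   # boundary with the next viewpoint
        table[i + 1] = (lo, hi)                 # viewing angle range of V_{i+1}
    return table

# eight equally spaced viewpoints, 45 degrees apart, as in FIG. 3
table = build_viewing_angle_table([0.0, 45.0, 90.0, 135.0, 180.0, 225.0, 270.0, 315.0])
```

With equal spacing every range is simply ±22.5° around its viewpoint; unequal spacing yields asymmetric ranges, matching the θ_i1/θ_i2 formulation.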
- FIG. 7 is a flowchart showing an example of distribution processing according to this embodiment. Steps S301 to S303 below are executed each time user viewpoint information is received, and steps S304 to S305 are executed at each frame interval. However, hereinafter, it is assumed that at least a part of the target object is included in the field of view of the user.
- the distribution control unit 103 uses the viewpoint or spatial position included in the user viewpoint information received from the client 30 to calculate the user position with respect to the object (step S301).
- Here, the user position is the angle of the viewpoint or spatial position with respect to the object. Note that the reference for this angle is the same as when the position of each virtual viewpoint with respect to the object was determined in step S203 of FIG. 4.
- the distribution control unit 103 refers to the viewing angle range correspondence table stored in the correspondence table storage unit 105, and specifies a virtual viewpoint from the user position calculated in step S301 (step S302). That is, the distribution control unit 103 identifies, among the virtual viewpoints, a virtual viewpoint corresponding to the viewing angle range including the user position.
- For example, in FIG. 8, when the user position θ_A of a certain user A satisfies θ_2 − θ_21 ≤ θ_A ≤ θ_2 + θ_22, the virtual viewpoint V_2 is identified in step S302 above.
- Similarly, when the user position θ_B of a certain user B satisfies θ_3 − θ_31 ≤ θ_B ≤ θ_3 + θ_32, the virtual viewpoint V_3 is identified in step S302 above.
- Note that, in FIG. 8, the object O exists in the line-of-sight direction of user A, while the object O does not exist in the line-of-sight direction of user B but at least a part of the object O is included in user B's field of view.
- Next, the distribution control unit 103 determines the one-plane 3D data corresponding to the virtual viewpoint identified in step S302 as the distribution target (step S303). That is, for example, when the virtual viewpoint specified in step S302 is V_i, the distribution control unit 103 determines the one-plane 3D data {d_t^i} as the distribution target.
- Next, the distribution unit 104 acquires from the content server 20, among the one-plane 3D data determined as the distribution target in step S303 above, the one-plane 3D data of the frame at the relevant time (step S304).
- Then, the distribution unit 104 distributes the one-plane 3D data acquired in step S304 to the client 30 (step S305).
- As a result, on the client 30 side, the one-plane 3D data is rendered and the object is displayed on the display.
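The table lookup of step S302 — finding the virtual viewpoint whose viewing angle range contains the user position — might be sketched as follows, with a hypothetical four-viewpoint table; wraparound past 0° has to be handled explicitly.

```python
def angle_in_range(angle, lo, hi):
    """True if `angle` (degrees) lies in the viewing angle range [lo, hi],
    where the range may wrap around 0/360 degrees."""
    angle, lo, hi = angle % 360.0, lo % 360.0, hi % 360.0
    if lo <= hi:
        return lo <= angle <= hi
    return angle >= lo or angle <= hi  # range wraps past 0 degrees

def select_viewpoint(user_angle, table):
    """Return the index of the virtual viewpoint whose viewing angle range
    contains the user position (an angle with respect to the object)."""
    for vp_index, (lo, hi) in table.items():
        if angle_in_range(user_angle, lo, hi):
            return vp_index
    return None  # user position outside every range (should not happen)

# hypothetical table for four viewpoints at 0/90/180/270 degrees
table = {1: (315.0, 45.0), 2: (45.0, 135.0), 3: (135.0, 225.0), 4: (225.0, 315.0)}
chosen = select_viewpoint(80.0, table)  # a user at 80 degrees falls in V_2's range
```

The distribution unit would then stream {d_t^i} for the chosen i, frame by frame, to the client.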
- the distribution control server 10 and the content server 20 according to this embodiment are implemented by, for example, the hardware configuration of a computer 500 shown in FIG. Note that the client 30 and the service site server 40 may also be realized with the same hardware configuration.
- a computer 500 shown in FIG. 9 has an input device 501, a display device 502, an external I/F 503, a communication I/F 504, a processor 505, and a memory device 506. Each of these pieces of hardware is communicably connected via a bus 507 .
- the input device 501 is, for example, a keyboard, mouse, touch panel, or the like.
- the display device 502 is, for example, a display. Note that the computer 500 may not have at least one of the input device 501 and the display device 502 .
- the external I/F 503 is an interface with an external device such as a recording medium 503a.
- Examples of the recording medium 503a include a CD (Compact Disc), a DVD (Digital Versatile Disc), an SD memory card (Secure Digital memory card), and a USB (Universal Serial Bus) memory.
- The communication I/F 504 is an interface for performing data communication with other devices, systems, and the like.
- the processor 505 is, for example, various arithmetic units such as a CPU.
- the memory device 506 is, for example, various storage devices such as HDD, SSD, RAM (Random Access Memory), ROM (Read Only Memory), and flash memory.
- the distribution control server 10 and the content server 20 have the hardware configuration of the computer 500 shown in FIG. 9, so that the various processes described above can be realized.
- the hardware configuration of the computer 500 shown in FIG. 9 is an example, and the computer 500 may have other hardware configurations.
- computer 500 may have multiple processors 505 and may have multiple memory devices 506 .
- As described above, the distribution control system 1 according to the present embodiment arranges a plurality of virtual viewpoints for the object represented by the 3D data constituting the stereoscopic video content, and then, for each of these virtual viewpoints, simplifies the geometry of the polygon meshes that cannot be seen from that virtual viewpoint. As a result, one-plane 3D data with a reduced data amount compared to the original 3D data is created for each virtual viewpoint.
- 1 distribution control system
- 10 distribution control server
- 20 content server
- 30 client
- 40 service site server
- 50 communication network
- 101 one-plane 3D data creation unit
- 102 correspondence table creation unit
- 103 distribution control unit
- 104 distribution unit
- 105 correspondence table storage unit
- 201 one-plane 3D data storage unit
- 500 computer
- 501 input device
- 502 display device
- 503 external I/F
- 503a recording medium
- 504 communication I/F
- 505 processor
- 506 memory device
- 507 bus
Abstract
Description
First, the overall configuration of the distribution control system 1 according to the present embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the overall configuration of the distribution control system 1 according to the present embodiment.
In the following, each of the one-plane 3D data creation process for creating a plurality of pieces of one-plane 3D data, the correspondence table creation process for creating a viewing angle range correspondence table, and the distribution process for distributing appropriate one-plane 3D data to the client 30 will be described. Note that the one-plane 3D data creation process and the correspondence table creation process are preliminary processes executed before the distribution process.
First, the one-plane 3D data creation process will be described with reference to FIG. 2. FIG. 2 is a flowchart showing an example of the one-plane 3D data creation process according to the present embodiment. In the following, it is assumed that a volumetric video has been given to the distribution control server 10.
Next, the correspondence table creation process will be described with reference to FIG. 4. FIG. 4 is a flowchart showing an example of the correspondence table creation process according to the present embodiment. In the following, a case of creating a viewing angle range correspondence table for 3D data d_t (where t ∈ [t_s, t_e]) will be described, and d = d_t is written for short.
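The concrete steps of FIG. 4 are not reproduced in this text, but the idea of a viewing angle range correspondence table can be sketched in code. The sketch below is ours, not taken from the patent: it assumes, purely for illustration, that the virtual viewpoints are placed at equal azimuth intervals around the target object and that the range of equal perceived quality is the uniform angular sector centered on each viewpoint; all names are hypothetical.

```python
def build_viewing_angle_table(num_viewpoints):
    """Map each of num_viewpoints virtual viewpoints, placed at equal
    azimuth intervals around the target object, to the range of user
    azimuths served by that viewpoint's one-plane 3D data."""
    step = 360.0 / num_viewpoints
    table = []
    for i in range(num_viewpoints):
        center = i * step                    # azimuth of virtual viewpoint i
        lo = (center - step / 2.0) % 360.0   # start of the viewing angle range
        hi = (center + step / 2.0) % 360.0   # end of the viewing angle range
        table.append({"viewpoint": i, "range": (lo, hi)})
    return table

table = build_viewing_angle_table(8)
# viewpoint 0 covers the wrap-around sector [337.5, 22.5) around azimuth 0
```

In a real system the ranges need not be uniform; they would be derived from the quality-of-experience measurements the patent's second creation unit is responsible for.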
Next, the distribution process will be described with reference to FIG. 7. FIG. 7 is a flowchart showing an example of the distribution process according to the present embodiment. Steps S301 to S303 below are executed every time user viewpoint information is received, and steps S304 to S305 are executed at every frame interval. In the following, it is assumed that at least a part of the target object is included in the user's field of view.
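The selection logic behind the distribution process can likewise be sketched: each time user viewpoint information arrives, the server computes the user's azimuth relative to the target object and looks up the virtual viewpoint whose viewing angle range contains it. The following is a minimal sketch under illustrative assumptions (function and field names are ours; the four-entry table is a toy example, not the patent's data):

```python
import math

def angle_of_user(user_pos, object_pos):
    """Azimuth of the user as seen from the target object, in [0, 360)."""
    dx = user_pos[0] - object_pos[0]
    dz = user_pos[1] - object_pos[1]
    return math.degrees(math.atan2(dz, dx)) % 360.0

def select_viewpoint(table, azimuth):
    """Return the virtual viewpoint whose viewing angle range contains
    the azimuth; ranges may wrap around 0 degrees."""
    for entry in table:
        lo, hi = entry["range"]
        if lo <= hi:
            inside = lo <= azimuth < hi
        else:  # wrap-around range such as [315, 45)
            inside = azimuth >= lo or azimuth < hi
        if inside:
            return entry["viewpoint"]
    raise ValueError("no viewing angle range contains the azimuth")

# toy table: four viewpoints at 90-degree intervals around the object
table = [
    {"viewpoint": 0, "range": (315.0, 45.0)},
    {"viewpoint": 1, "range": (45.0, 135.0)},
    {"viewpoint": 2, "range": (135.0, 225.0)},
    {"viewpoint": 3, "range": (225.0, 315.0)},
]
```

Once the viewpoint is identified, the corresponding one-plane 3D data for the current frame would be fetched from the content server and sent to the client at each frame interval.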
Finally, the hardware configurations of the distribution control server 10 and the content server 20 according to the present embodiment will be described. The distribution control server 10 and the content server 20 according to the present embodiment are realized by, for example, the hardware configuration of the computer 500 shown in FIG. 9. Note that the client 30 and the service site server 40 may also be realized by a similar hardware configuration.
As described above, the distribution control system 1 according to the present embodiment arranges a plurality of virtual viewpoints with respect to the target object represented by the 3D data constituting the stereoscopic video content and then, for each of these virtual viewpoints, simplifies the geometry of the polygon mesh of the portion that cannot be seen from that virtual viewpoint. As a result, one-plane 3D data whose data amount is reduced compared with the original 3D data is created for each virtual viewpoint.
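The per-viewpoint data reduction can be illustrated with a crude back-face test: triangles whose normals point away from a virtual viewpoint belong to the side of the object that cannot be seen from it. The patent simplifies the geometry of such meshes rather than deleting them, so the sketch below, which simply partitions out back-facing triangles, only approximates the idea; all names and the toy mesh are ours.

```python
def visible_faces(vertices, faces, normals, viewpoint):
    """Keep only the triangles that face the given virtual viewpoint;
    the remaining (back-facing) triangles are the ones whose geometry
    the per-viewpoint simplification would reduce."""
    kept = []
    for face, normal in zip(faces, normals):
        # centroid of the triangle
        cx = sum(vertices[i][0] for i in face) / 3.0
        cy = sum(vertices[i][1] for i in face) / 3.0
        cz = sum(vertices[i][2] for i in face) / 3.0
        # direction from the triangle toward the viewpoint
        to_vp = (viewpoint[0] - cx, viewpoint[1] - cy, viewpoint[2] - cz)
        dot = sum(n * d for n, d in zip(normal, to_vp))
        if dot > 0.0:  # triangle faces the viewpoint: keep full detail
            kept.append(face)
    return kept

# toy mesh: one triangle stored twice with opposite normals,
# viewed from the +z side, so only the front copy survives
vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
faces = [(0, 1, 2), (0, 1, 2)]
normals = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]
front = visible_faces(vertices, faces, normals, (0.0, 0.0, 10.0))
```

A production pipeline would apply a mesh decimation algorithm to the back-facing region instead of discarding it, preserving the object's silhouette while cutting the data amount.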
10 Distribution control server
20 Content server
30 Client
40 Service site server
50 Communication network
101 One-plane 3D data creation unit
102 Correspondence table creation unit
103 Distribution control unit
104 Distribution unit
105 Correspondence table storage unit
201 One-plane 3D data storage unit
500 Computer
501 Input device
502 Display device
503 External I/F
503a Recording medium
504 Communication I/F
505 Processor
506 Memory device
507 Bus
Claims (7)
- A distribution control system comprising:
an arrangement unit that arranges, with respect to a target object represented by stereoscopic data constituting stereoscopic video content, a plurality of virtual viewpoints centered on the target object;
a first creation unit that creates, for each of the virtual viewpoints, one-plane stereoscopic data in which a data amount of a portion of the target object that cannot be seen from the virtual viewpoint is reduced; and
a distribution unit that distributes, in accordance with a position and a field of view of a user in a virtual space in which the target object is arranged, the one-plane stereoscopic data of one virtual viewpoint among the one-plane stereoscopic data for the respective virtual viewpoints to a terminal of the user. - The distribution control system according to claim 1, wherein the first creation unit
creates the one-plane stereoscopic data in which the data amount of the portion is reduced by performing geometry simplification processing on a polygon mesh representing the portion that cannot be seen from the virtual viewpoint. - The distribution control system according to claim 1 or 2, wherein the portion that cannot be seen from the virtual viewpoint is a portion corresponding to a back surface of the target object when the target object is viewed from the virtual viewpoint.
- The distribution control system according to any one of claims 1 to 3, further comprising a second creation unit that creates a correspondence table associating each virtual viewpoint with a range in which the same quality of experience is obtained when the target object is viewed from the virtual viewpoint,
wherein, when at least a part of the target object is included in the field of view,
the distribution unit refers to the correspondence table to identify the one-plane stereoscopic data of the virtual viewpoint corresponding to the range that includes the position, and
distributes the identified one-plane stereoscopic data to the terminal. - A distribution control apparatus comprising:
an arrangement unit that arranges, with respect to a target object represented by stereoscopic data constituting stereoscopic video content, a plurality of virtual viewpoints centered on the target object;
a first creation unit that creates, for each of the virtual viewpoints, one-plane stereoscopic data in which a data amount of a portion of the target object that cannot be seen from the virtual viewpoint is reduced; and
a distribution unit that distributes, in accordance with a position and a field of view of a user in a virtual space in which the target object is arranged, the one-plane stereoscopic data of one virtual viewpoint among the one-plane stereoscopic data for the respective virtual viewpoints to a terminal of the user. - A distribution control method executed by a computer, the method comprising:
an arrangement procedure of arranging, with respect to a target object represented by stereoscopic data constituting stereoscopic video content, a plurality of virtual viewpoints centered on the target object;
a first creation procedure of creating, for each of the virtual viewpoints, one-plane stereoscopic data in which a data amount of a portion of the target object that cannot be seen from the virtual viewpoint is reduced; and
a distribution procedure of distributing, in accordance with a position and a field of view of a user in a virtual space in which the target object is arranged, the one-plane stereoscopic data of one virtual viewpoint among the one-plane stereoscopic data for the respective virtual viewpoints to a terminal of the user. - A program for causing a computer to function as the distribution control system according to any one of claims 1 to 4.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/017244 WO2022230186A1 (ja) | 2021-04-30 | 2021-04-30 | 配信制御システム、配信制御装置、配信制御方法、及びプログラム |
US18/557,268 US20240144602A1 (en) | 2021-04-30 | 2021-04-30 | Distribution control system, distribution control apparatus, distribution control method, and program |
JP2023517012A JPWO2022230186A1 (ja) | 2021-04-30 | 2021-04-30 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/017244 WO2022230186A1 (ja) | 2021-04-30 | 2021-04-30 | 配信制御システム、配信制御装置、配信制御方法、及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022230186A1 true WO2022230186A1 (ja) | 2022-11-03 |
Family
ID=83848244
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/017244 WO2022230186A1 (ja) | 2021-04-30 | 2021-04-30 | 配信制御システム、配信制御装置、配信制御方法、及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20240144602A1 (ja) |
JP (1) | JPWO2022230186A1 (ja) |
WO (1) | WO2022230186A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020116154A1 (ja) * | 2018-12-03 | 2020-06-11 | ソニー株式会社 | 情報処理装置および方法 |
WO2020137876A1 (ja) * | 2018-12-26 | 2020-07-02 | シャープ株式会社 | 生成装置、3次元データ送信装置、及び3次元データ再生装置 |
JP2020136882A (ja) * | 2019-02-19 | 2020-08-31 | 株式会社メディア工房 | 点群データ通信システム、点群データ送信装置および点群データ送信方法 |
WO2020195767A1 (ja) * | 2019-03-25 | 2020-10-01 | シャープ株式会社 | 3dモデル送信装置、及び、3dモデル受信装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022230186A1 (ja) | 2022-11-03 |
US20240144602A1 (en) | 2024-05-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21939343 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2023517012 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 18557268 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21939343 Country of ref document: EP Kind code of ref document: A1 |