WO2017104115A1 - Three-dimensional data coding method, three-dimensional data decoding method, three-dimensional data coding device, and three-dimensional data decoding device - Google Patents
- Publication number
- WO2017104115A1 (PCT/JP2016/005041)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional data
- processing unit
- encoding
- encoded
- decoding
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/597—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T9/00—Image coding
- G06T9/001—Model-based coding, e.g. wire frame
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T9/00—Image coding
- G06T9/004—Predictors, e.g. intraframe, interframe coding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/42—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
- H04N19/436—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/593—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
Definitions
- the present disclosure relates to a three-dimensional data encoding method, a three-dimensional data decoding method, a three-dimensional data encoding device, and a three-dimensional data decoding device.
- Three-dimensional data is acquired by various means, such as a distance sensor (e.g., a range finder), a stereo camera, or a combination of a plurality of monocular cameras.
- One representation of three-dimensional data is a point cloud, which represents the shape of a three-dimensional structure by a set of points in three-dimensional space; the position and color of each point are stored.
- Point clouds are expected to become the mainstream method of representing three-dimensional data, but they involve a very large amount of data. Therefore, when storing or transmitting three-dimensional data, it becomes essential to compress the data by encoding, as is done for two-dimensional video (for example, MPEG-4 AVC or HEVC standardized by MPEG).
- Point cloud compression is partially supported by a public library, the Point Cloud Library, which performs point-cloud-related processing.
- This disclosure aims to provide a three-dimensional data encoding method, a three-dimensional data decoding method, a three-dimensional data encoding device, or a three-dimensional data decoding device that can provide a random access function in encoded three-dimensional data.
- By dividing the space and encoding each division, the present disclosure also enables quantization and prediction of the space, and is therefore effective even when random access is not necessarily required.
- A three-dimensional data encoding method according to one aspect of the present disclosure is a three-dimensional data encoding method for encoding three-dimensional data, and includes dividing the three-dimensional data into first processing units, each of which is a random access unit associated with three-dimensional coordinates, and encoding each of the first processing units.
- A three-dimensional data decoding method according to one aspect of the present disclosure is a three-dimensional data decoding method for decoding three-dimensional data, and includes decoding each piece of encoded data of first processing units, each of which is a random access unit associated with three-dimensional coordinates.
- the present disclosure can provide a three-dimensional data encoding method, a three-dimensional data decoding method, a three-dimensional data encoding device, or a three-dimensional data decoding device that can provide a random access function in encoded three-dimensional data.
- FIG. 1 is a diagram illustrating a configuration of encoded three-dimensional data according to the embodiment.
- FIG. 2 is a diagram illustrating an example of a prediction structure between SPCs belonging to the lowest layer of the GOS according to the embodiment.
- FIG. 3 is a diagram illustrating an example of a prediction structure between layers according to the embodiment.
- FIG. 4 is a diagram illustrating an example of the coding order of GOS according to the embodiment.
- FIG. 5 is a diagram illustrating an example of the coding order of GOS according to the embodiment.
- FIG. 6 is a block diagram of the three-dimensional data encoding apparatus according to the embodiment.
- FIG. 7 is a flowchart of the encoding process according to the embodiment.
- FIG. 8 is a block diagram of the three-dimensional data decoding apparatus according to the embodiment.
- FIG. 9 is a flowchart of the decoding process according to the embodiment.
- FIG. 10 is a diagram illustrating an example of meta information according to the embodiment.
- A three-dimensional data encoding method according to one aspect of the present disclosure is a three-dimensional data encoding method for encoding three-dimensional data, and includes a division step of dividing the three-dimensional data into first processing units, each of which is a random access unit associated with three-dimensional coordinates, and an encoding step of generating encoded data by encoding each of the plurality of first processing units.
- the three-dimensional data encoding method can provide a random access function in the encoded three-dimensional data.
- the three-dimensional data encoding method includes a generation step of generating first information indicating the plurality of first processing units and three-dimensional coordinates associated with each of the plurality of first processing units.
- the encoded data may include the first information.
- the first information may further indicate at least one of an object, a time, and a data storage destination associated with each of the plurality of first processing units.
- the first processing unit may be further divided into a plurality of second processing units, and in the encoding step, each of the plurality of second processing units may be encoded.
- In the encoding step, the second processing unit to be processed may be encoded with reference to another second processing unit included in the same first processing unit to be processed.
- In the encoding step, one of a first type that does not refer to another second processing unit, a second type that refers to one other second processing unit, and a third type that refers to two other second processing units may be selected as the type of the second processing unit to be processed, and the second processing unit to be processed may be encoded according to the selected type.
- the frequency of selecting the first type may be changed according to the number or density of objects included in the three-dimensional data.
- the size of the first processing unit may be determined according to the number or density of objects or dynamic objects included in the three-dimensional data.
- The first processing unit may be spatially divided in a predetermined direction into a plurality of layers, each including one or more second processing units. In the encoding step, the second processing unit may be encoded with reference to a second processing unit included in the same layer as, or a layer lower than, the second processing unit to be processed.
- the second processing unit including only static objects and the second processing unit including only dynamic objects may be assigned to different first processing units.
- a plurality of dynamic objects may be individually encoded, and encoded data of the plurality of dynamic objects may be associated with a second processing unit including only static objects.
- the second processing unit may be further divided into a plurality of third processing units, and in the encoding step, each of the plurality of third processing units may be encoded.
- the third processing unit may include one or more voxels that are minimum units with which position information is associated.
- the second processing unit may include a feature point group derived from information obtained by a sensor.
- the encoded data may include information indicating the encoding order of the plurality of first processing units.
- the encoded data may include information indicating the sizes of the plurality of first processing units.
- the plurality of first processing units may be encoded in parallel.
- A three-dimensional data decoding method according to one aspect of the present disclosure is a three-dimensional data decoding method for decoding three-dimensional data, and includes a decoding step of generating three-dimensional data of a first processing unit, which is a random access unit associated with three-dimensional coordinates, by decoding each piece of encoded data of the first processing unit.
- the three-dimensional data decoding method can provide a random access function in the encoded three-dimensional data.
- A three-dimensional data encoding device according to one aspect of the present disclosure is a three-dimensional data encoding device that encodes three-dimensional data, and may include a dividing unit that divides the three-dimensional data into first processing units, each of which is a random access unit associated with three-dimensional coordinates, and an encoding unit that generates encoded data by encoding each of the plurality of first processing units.
- the three-dimensional data encoding apparatus can provide a random access function in the encoded three-dimensional data.
- A three-dimensional data decoding device according to one aspect of the present disclosure is a three-dimensional data decoding device that decodes three-dimensional data, and may include a decoding unit that generates the three-dimensional data of a first processing unit, which is a random access unit associated with three-dimensional coordinates, by decoding each piece of encoded data of the first processing unit.
- the three-dimensional data decoding apparatus can provide a random access function in the encoded three-dimensional data.
- FIG. 1 is a diagram showing a configuration of encoded three-dimensional data according to the present embodiment.
- the three-dimensional space is divided into spaces (SPC) corresponding to pictures in coding of moving images, and three-dimensional data is coded in units of spaces.
- The space is further divided into volumes (VLM), corresponding to macroblocks in video encoding, and prediction and transform are performed in units of VLMs.
- the volume includes a plurality of voxels (VXL) that are minimum units with which position coordinates are associated.
- Prediction, as in two-dimensional images, refers to another processing unit to generate predicted three-dimensional data similar to the processing unit to be processed, and encodes the difference between the predicted three-dimensional data and the processing unit to be processed.
- This prediction includes not only spatial prediction that refers to other prediction units at the same time but also temporal prediction that refers to prediction units at different times.
- A three-dimensional data encoding device (hereinafter also referred to as an encoding device) encodes a three-dimensional space represented by point group data such as a point cloud.
- When encoding, points of the point cloud are selected according to the voxel size, or a plurality of points included in one voxel are collectively encoded. If voxels are subdivided, the three-dimensional shape of the point cloud can be expressed with high accuracy; if the voxel size is increased, the three-dimensional shape of the point cloud is expressed only roughly.
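As an illustration of the voxel quantization just described, the following sketch snaps each point to a voxel grid and merges points that fall in the same voxel into one representative point. The merging-by-centroid rule is an assumption for illustration; the patent only states that points are selected or collectively encoded per voxel.

```python
def quantize_to_voxels(points, voxel_size):
    """Map each 3D point to its voxel index; merge points sharing a voxel.

    A coarse voxel_size expresses the shape only roughly, a fine one with
    high accuracy, as described above. Returns a dict mapping each
    occupied voxel index to a representative (centroid) point.
    """
    voxels = {}
    for x, y, z in points:
        idx = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        voxels.setdefault(idx, []).append((x, y, z))
    # one representative point (the centroid, an illustrative choice) per voxel
    return {
        idx: tuple(sum(c) / len(pts) for c in zip(*pts))
        for idx, pts in voxels.items()
    }
```

Two nearby points collapse into one voxel, while a distant point keeps its own.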
- In the following, the three-dimensional data is described as a point cloud, but it is not limited to a point cloud and may be three-dimensional data in any format.
- hierarchical voxels may be used.
- For the n-th layer, whether or not sample points exist in the (n−1)-th and lower layers (the layers below the n-th layer) may be indicated in order. For example, when decoding only the n-th layer, if a sample point exists in the (n−1)-th or lower layer, decoding can proceed by assuming that a sample point exists at the center of the corresponding voxel in the n-th layer.
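The layered-voxel decoding rule can be sketched as follows: decoding only at layer n places a sample at the center of every layer-n voxel that contains any sample in a lower (finer) layer. The normalized-coordinate convention below is an assumption for illustration.

```python
def decode_at_layer(fine_occupancy, fine_res, layer_res):
    """Coarse decode of hierarchical voxels.

    fine_occupancy: set of (i, j, k) occupied voxel indices at the finer
    resolution fine_res (voxels per axis). Any occupancy below a layer-n
    voxel is collapsed to one sample at that voxel's center, expressed in
    normalized [0, 1) coordinates (an illustrative convention).
    """
    ratio = fine_res // layer_res          # sub-voxels per coarse voxel edge
    centers = set()
    for i, j, k in fine_occupancy:
        ci, cj, ck = i // ratio, j // ratio, k // ratio
        centers.add(((ci + 0.5) / layer_res,
                     (cj + 0.5) / layer_res,
                     (ck + 0.5) / layer_res))
    return centers
```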
- the encoding apparatus acquires point cloud data using a distance sensor, a stereo camera, a monocular camera, a gyroscope, an inertial sensor, or the like.
- Each space is classified into one of at least three prediction structures: an intra space (I-SPC) that can be decoded independently, a predictive space (P-SPC) that uses only unidirectional reference, and a bidirectional space (B-SPC) that uses bidirectional reference.
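The three space types can be summarized by how many reference spaces each may use. The mapping below is a minimal sketch consistent with the classification above, not an implementation from the patent:

```python
from enum import Enum

class SpcType(Enum):
    I_SPC = 0   # intra: decodable with no reference
    P_SPC = 1   # predictive: one unidirectional reference
    B_SPC = 2   # bidirectional: two references

# number of reference spaces each type needs (illustrative)
REQUIRED_REFS = {SpcType.I_SPC: 0, SpcType.P_SPC: 1, SpcType.B_SPC: 2}

def can_decode(spc_type, available_refs):
    """An SPC is decodable once its required references are available."""
    return available_refs >= REQUIRED_REFS[spc_type]
```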
- the space has two types of time information, a decoding time and a display time.
- A plurality of spaces constitute a GOS (Group Of Space), which is a random access unit, and a plurality of GOSs constitute a world (WLD).
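The processing-unit hierarchy introduced so far (world ⊃ GOS ⊃ space ⊃ volume ⊃ voxel) can be sketched as nested containers. This is an illustrative data model only; the class and field names are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Voxel:            # VXL: minimum unit with which position coordinates are associated
    index: Tuple[int, int, int]
    color: Tuple[int, int, int] = (0, 0, 0)

@dataclass
class Volume:           # VLM: unit of prediction and transform
    voxels: List[Voxel] = field(default_factory=list)

@dataclass
class Space:            # SPC: unit of encoding, like a picture in video coding
    volumes: List[Volume] = field(default_factory=list)

@dataclass
class GroupOfSpaces:    # GOS: random access unit, associated with 3D coordinates
    origin: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    spaces: List[Space] = field(default_factory=list)

@dataclass
class World:            # WLD: the full encoding target area
    groups: List[GroupOfSpaces] = field(default_factory=list)
```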
- the space area occupied by the world is associated with the absolute position on the earth by GPS or latitude and longitude information. This position information is stored as meta information.
- the meta information may be included in the encoded data or may be transmitted separately from the encoded data.
- all SPCs may be adjacent three-dimensionally, or there may be SPCs that are not three-dimensionally adjacent to other SPCs.
- processing such as encoding, decoding, or referring to three-dimensional data included in a processing unit such as GOS, SPC, or VLM is also simply referred to as encoding, decoding, or referencing the processing unit.
- the three-dimensional data included in the processing unit includes at least one set of a spatial position such as three-dimensional coordinates and a characteristic value such as color information.
- a plurality of SPCs in the same GOS or a plurality of VLMs in the same SPC occupy different spaces, but have the same time information (decoding time and display time).
- the first SPC in the decoding order in GOS is I-SPC.
- There are two types of GOS: closed GOS and open GOS.
- the closed GOS is a GOS that can decode all SPCs in the GOS when decoding is started from the head I-SPC.
- In an open GOS, some SPCs whose display time is earlier than that of the first I-SPC in the GOS refer to a different GOS, so decoding cannot be performed with the GOS alone.
- Note that a WLD may be decoded in the direction opposite to the encoding order, and reverse reproduction is difficult if there are dependencies between GOSs. In such a case, therefore, a closed GOS is basically used.
- GOS has a layer structure in the height direction, and encoding or decoding is performed in order from the SPC of the lower layer.
- FIG. 2 is a diagram showing an example of a prediction structure between SPCs belonging to the lowest layer of GOS.
- FIG. 3 is a diagram illustrating an example of a prediction structure between layers.
- There are one or more I-SPCs in a GOS.
- Objects such as humans, animals, cars, bicycles, traffic lights, and landmark buildings exist in the three-dimensional space, and it is particularly effective to encode small objects as I-SPCs.
- For example, a three-dimensional data decoding device (hereinafter also referred to as a decoding device) decodes only the I-SPCs in a GOS when it decodes the GOS with low processing load or at high speed.
- the encoding device may switch the encoding interval or appearance frequency of the I-SPC according to the density of the objects in the WLD.
- The encoding device or the decoding device encodes or decodes the plurality of layers in order from the lower layer (layer 1). This makes it possible, for example, to raise the priority of data near the ground, which carries more information, for an autonomous vehicle or the like.
- encoded data used in a drone or the like may be encoded or decoded in order from the SPC of the upper layer in the height direction in the GOS.
- the encoding device or decoding device may encode or decode a plurality of layers so that the decoding device can roughly grasp GOS and gradually increase the resolution.
- the encoding device or the decoding device may encode or decode in the order of layers 3, 8, 1, 9.
- In a three-dimensional space, there are static objects or scenes, such as buildings and roads (hereinafter collectively referred to as static objects), and dynamic objects, such as cars and humans (hereinafter referred to as dynamic objects). Object detection is performed separately, for example by extracting feature points from point cloud data or from camera images such as those of a stereo camera.
- the first method is a method of encoding without distinguishing between static objects and dynamic objects.
- the second method is a method of discriminating between static objects and dynamic objects based on identification information.
- GOS is used as the identification unit.
- A GOS including only SPCs that constitute static objects and a GOS including SPCs that constitute dynamic objects are distinguished by identification information stored in the encoded data or separately from the encoded data.
- SPC may be used as an identification unit.
- the SPC including only the VLM configuring the static object and the SPC including the VLM configuring the dynamic object are distinguished by the identification information.
- VLM or VXL may be used as the identification unit.
- a VLM or VXL including only a static object and a VLM or VXL including a dynamic object are distinguished by the identification information.
- The encoding device may encode a dynamic object as one or more VLMs or SPCs, and may encode the VLMs or SPCs including only static objects and the SPCs including dynamic objects as different GOSs. Also, when the GOS size is made variable according to the size of the dynamic object, the encoding device separately stores the GOS size as meta information.
- the encoding device may encode the static object and the dynamic object independently of each other and superimpose the dynamic object on the world composed of the static objects.
- the dynamic object is composed of one or more SPCs, and each SPC is associated with one or more SPCs constituting a static object on which the SPC is superimposed.
- a dynamic object may be represented by one or more VLMs or VXLs instead of SPCs.
- the encoding device may encode the static object and the dynamic object as different streams.
- the encoding device may generate a GOS including one or more SPCs that constitute a dynamic object. Further, the encoding apparatus may set the GOS (GOS_M) including the dynamic object and the GOS of the static object corresponding to the GOS_M space area to the same size (occupy the same space area). Thereby, a superimposition process can be performed per GOS.
- the P-SPC or B-SPC constituting the dynamic object may refer to the SPC included in a different encoded GOS.
- the reference across the GOS is effective from the viewpoint of the compression rate.
- the first method and the second method may be switched according to the use of the encoded data. For example, when using encoded 3D data as a map, it is desirable to be able to separate dynamic objects, and therefore the encoding apparatus uses the second method. On the other hand, the encoding apparatus uses the first method when it is not necessary to separate dynamic objects when encoding three-dimensional data of events such as concerts or sports.
- The decoding time and display time of a GOS or SPC can be stored in the encoded data or as meta information. Moreover, the time information of all static objects may be the same; in that case, the actual decoding time and display time may be determined by the decoding device. Alternatively, a different value may be given as the decoding time for each GOS or SPC, and the same value may be given as the display time. Furthermore, like the decoder model in moving picture coding, such as HEVC's HRD (Hypothetical Reference Decoder), a model may be introduced that guarantees that a decoder having a buffer of a predetermined size can decode without failure if it reads the bitstream at a predetermined bit rate according to the decoding times.
- the coordinates of the three-dimensional space in the world are expressed by three coordinate axes (x axis, y axis, z axis) orthogonal to each other.
- encoding can be performed so that spatially adjacent GOSs are continuous in the encoded data.
- GOS in the xz plane is continuously encoded.
- the value of the y-axis is updated after encoding of all GOSs in a certain xz plane is completed. That is, as encoding progresses, the world extends in the y-axis direction.
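The coding order described above, in which all GOSs in one xz plane are encoded before the y value is updated, can be sketched as a simple index generator (the grid dimensions and loop nesting of x and z are illustrative assumptions):

```python
def gos_coding_order(nx, ny, nz):
    """Yield GOS grid indices so that all GOSs in one xz plane appear
    contiguously; y advances only after a plane is finished, so the
    encoded world extends in the y-axis direction as encoding progresses."""
    for y in range(ny):
        for z in range(nz):
            for x in range(nx):
                yield (x, y, z)
```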
- the GOS index number is set in the encoding order.
- the three-dimensional space of the world is associated with GPS or geographical absolute coordinates such as latitude and longitude in one-to-one correspondence.
- the three-dimensional space may be expressed by a relative position from a preset reference position.
- the x-axis, y-axis, and z-axis directions in the three-dimensional space are expressed as direction vectors determined based on latitude and longitude, and the direction vectors are stored as meta information together with the encoded data.
- the GOS size is fixed, and the encoding device stores the size as meta information. Further, the size of the GOS may be switched depending on, for example, whether it is an urban area or whether it is indoor or outdoor. That is, the size of the GOS may be switched according to the amount or nature of the object that is valuable as information. Alternatively, the encoding device may adaptively switch the GOS size or the I-SPC interval in the GOS in accordance with the object density or the like in the same world. For example, the higher the object density, the smaller the size of the GOS and the shorter the I-SPC interval in the GOS.
- Where the density of objects is high, the GOS is subdivided in order to realize random access with fine granularity.
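One way to picture this adaptation of GOS size and I-SPC interval to object density is the following sketch; the threshold and parameter values are invented for illustration and are not from the patent.

```python
def choose_gos_params(object_density):
    """Higher object density -> smaller GOS and shorter I-SPC interval,
    so that random access stays fine-grained where content is dense.
    All numeric values are arbitrary placeholders."""
    if object_density > 10.0:       # e.g. a dense urban area
        return {"gos_size_m": 10, "i_spc_interval": 4}
    if object_density > 1.0:        # suburban
        return {"gos_size_m": 50, "i_spc_interval": 8}
    return {"gos_size_m": 200, "i_spc_interval": 16}   # sparse
```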
- the seventh to tenth GOS exist on the back side of the third to sixth GOS, respectively.
- FIG. 6 is a block diagram of 3D data encoding apparatus 100 according to the present embodiment.
- FIG. 7 is a flowchart showing an operation example of the three-dimensional data encoding apparatus 100.
- the three-dimensional data encoding device 100 shown in FIG. 6 generates encoded three-dimensional data 112 by encoding the three-dimensional data 111.
- the three-dimensional data encoding apparatus 100 includes an acquisition unit 101, an encoding region determination unit 102, a division unit 103, and an encoding unit 104.
- the acquisition unit 101 first acquires three-dimensional data 111 that is point cloud data (S101).
- the coding area determination unit 102 determines a coding target area among the spatial areas corresponding to the acquired point cloud data (S102). For example, the coding area determination unit 102 determines a space area around the position as a coding target area according to the position of the user or the vehicle.
- the dividing unit 103 divides the point cloud data included in the encoding target area into each processing unit.
- the processing unit is the above-described GOS, SPC, or the like.
- this encoding target area corresponds to the above-described world, for example.
- the dividing unit 103 divides the point cloud data into processing units based on a preset GOS size or the presence or size of a dynamic object (S103). Further, the dividing unit 103 determines the start position of the SPC that is the head in the encoding order in each GOS.
- the encoding unit 104 generates encoded three-dimensional data 112 by sequentially encoding a plurality of SPCs in each GOS (S104).
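The flow S101–S104 can be sketched as a pipeline. The function bodies below are placeholders (a box-shaped target area around the user position and a point count standing in for actual encoding), chosen only to show the structure of the steps, not the patent's algorithms:

```python
def encode_three_dimensional_data(point_cloud, user_position, gos_size):
    """Structural mirror of steps S101-S104 (FIG. 7); not the real codec."""
    # S101: acquire point cloud data
    data = list(point_cloud)
    # S102: determine the coding target area around the user or vehicle
    # position (here: an arbitrary box with a half-width of 4 GOSs per axis)
    half_width = gos_size * 4
    area = [p for p in data
            if all(abs(c - u) <= half_width for c, u in zip(p, user_position))]
    # S103: divide the target area into processing units (GOS) on a grid
    gos_map = {}
    for p in area:
        key = tuple(int(c // gos_size) for c in p)
        gos_map.setdefault(key, []).append(p)
    # S104: encode each GOS in turn (placeholder "encoding": a point count)
    return {key: {"num_points": len(pts)} for key, pts in gos_map.items()}
```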
- the three-dimensional data encoding device 100 generates the encoded three-dimensional data 112 by encoding the three-dimensional data 111.
- the 3D data encoding apparatus 100 divides 3D data into first processing units (GOS) each of which is a random access unit and is associated with 3D coordinates.
- the processing unit (GOS) is divided into a plurality of second processing units (SPC), and the second processing unit (SPC) is divided into a plurality of third processing units (VLM).
- the third processing unit (VLM) includes one or more voxels (VXL) that are minimum units with which position information is associated.
- the three-dimensional data encoding device 100 generates encoded three-dimensional data 112 by encoding each of the plurality of first processing units (GOS). Specifically, the three-dimensional data encoding device 100 encodes each of the plurality of second processing units (SPC) in each first processing unit (GOS). In addition, the 3D data encoding apparatus 100 encodes each of the plurality of third processing units (VLM) in each second processing unit (SPC).
- In the encoding, the three-dimensional data encoding device 100 encodes the second processing unit (SPC) to be processed, included in the first processing unit (GOS) to be processed, with reference to another second processing unit (SPC) included in the same first processing unit (GOS). That is, the three-dimensional data encoding device 100 does not refer to a second processing unit (SPC) included in a first processing unit (GOS) different from the first processing unit (GOS) to be processed.
- Furthermore, the three-dimensional data encoding device 100 selects, as the type of the second processing unit (SPC) to be processed, one of a first type (I-SPC) that does not refer to another second processing unit (SPC), a second type (P-SPC) that refers to one other second processing unit (SPC), and a third type (B-SPC) that refers to two other second processing units (SPC), and encodes the second processing unit (SPC) to be processed according to the selected type.
- FIG. 8 is a block diagram of 3D data decoding apparatus 200 according to the present embodiment.
- FIG. 9 is a flowchart showing an operation example of the three-dimensional data decoding apparatus 200.
- the 3D data decoding apparatus 200 shown in FIG. 8 generates decoded 3D data 212 by decoding the encoded 3D data 211.
- the encoded three-dimensional data 211 is, for example, the encoded three-dimensional data 112 generated by the three-dimensional data encoding device 100.
- the three-dimensional data decoding apparatus 200 includes an acquisition unit 201, a decoding start GOS determination unit 202, a decoding SPC determination unit 203, and a decoding unit 204.
- the acquisition unit 201 acquires the encoded three-dimensional data 211 (S201).
- the decoding start GOS determination unit 202 determines a GOS to be decoded (S202). Specifically, the decoding start GOS determination unit 202 refers to meta information stored in the encoded three-dimensional data 211 or separately from it, and determines, as the GOS to be decoded, the GOS that includes the SPC corresponding to the specified spatial position, object, or time.
- the decoding SPC determination unit 203 determines the type (I, P, B) of SPC to be decoded in the GOS (S203). For example, the decoding SPC determination unit 203 determines whether (1) only I-SPC is decoded, (2) I-SPC and P-SPC are decoded, or (3) all types are decoded. Note that this step may not be performed if the type of SPC to be decoded has been determined in advance, such as decoding all SPCs.
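The choice in step S203 amounts to filtering the SPCs in the target GOS by type. A minimal sketch, where the mode numbers follow the three options above:

```python
DECODE_MODES = {
    1: {"I"},               # (1) decode only I-SPCs
    2: {"I", "P"},          # (2) decode I-SPCs and P-SPCs
    3: {"I", "P", "B"},     # (3) decode all types
}

def select_spcs(spc_types_in_gos, mode):
    """Return indices of the SPCs to decode under the chosen mode."""
    allowed = DECODE_MODES[mode]
    return [i for i, t in enumerate(spc_types_in_gos) if t in allowed]
```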
- The decoding unit 204 acquires the address position in the encoded three-dimensional data 211 at which the head SPC in decoding order (which is the same as the encoding order) in the GOS starts, acquires the encoded data of the head SPC from that address position, and sequentially decodes each SPC starting from the head SPC (S204).
- the address position is stored in meta information or the like.
- In this way, the three-dimensional data decoding apparatus 200 generates the decoded three-dimensional data 212. Specifically, the three-dimensional data decoding apparatus 200 decodes each piece of the encoded three-dimensional data 211 of the first processing units (GOS), each of which is a random access unit associated with three-dimensional coordinates, thereby generating the decoded three-dimensional data 212 of the first processing units (GOS). More specifically, the three-dimensional data decoding apparatus 200 decodes each of the plurality of second processing units (SPC) in each first processing unit (GOS), and decodes each of the plurality of third processing units (VLM) in each second processing unit (SPC).
- This meta information is generated by the three-dimensional data encoding apparatus 100 and is included in the encoded three-dimensional data 112 (211).
- FIG. 10 is a diagram illustrating an example of a table included in the meta information. Note that not all the tables shown in FIG. 10 need be used, and at least one table may be used.
- the address may be an address in a logical format or a physical address of an HDD or a memory.
- Information specifying a file segment may be used instead of an address.
- a file segment is a unit obtained by segmenting one or more GOSs.
- a plurality of GOSs to which the object belongs may be indicated in the object-GOS table. If the plurality of GOSs are closed GOS, the encoding device and the decoding device can perform encoding or decoding in parallel. On the other hand, if the plurality of GOSs are open GOS, the compression efficiency can be further improved by referring to the plurality of GOSs.
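Random access via the meta-information tables (such as those of FIG. 10) amounts to a lookup from a coordinate or object to a GOS, and from the GOS to the address of its head SPC. The table contents and the address values below are invented purely for illustration.

```python
# Illustrative random-access lookup using meta-information tables like those in FIG. 10.
# All identifiers, coordinates, and addresses here are invented for the example.

coord_to_gos = {(0, 0, 0): "GOS0", (100, 0, 0): "GOS1"}   # 3D coordinate -> GOS id
object_to_gos = {"traffic_light_42": ["GOS1"]}             # object -> GOS id(s) it belongs to
gos_to_address = {"GOS0": 0x0000, "GOS1": 0x8F00}          # GOS id -> address of head SPC

def gos_addresses_for_object(obj_id):
    """Resolve an object to the byte addresses where decoding can start."""
    return [gos_to_address[g] for g in object_to_gos[obj_id]]

print(gos_addresses_for_object("traffic_light_42"))  # -> [36608]
```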
- at the time of world encoding, the 3D data encoding apparatus 100 can extract feature points specific to an object from a 3D point cloud or the like, detect the object based on the feature points, and set the detected object as a random access point.
- the three-dimensional data encoding apparatus 100 generates first information indicating the plurality of first processing units (GOS) and the three-dimensional coordinates associated with each of the plurality of first processing units (GOS). The encoded three-dimensional data 112 (211) includes this first information. The first information further indicates at least one of the object, the time, and the data storage destination associated with each of the plurality of first processing units (GOS).
- the three-dimensional data decoding apparatus 200 acquires the first information from the encoded three-dimensional data 211, uses the first information to specify the encoded three-dimensional data 211 of the first processing unit corresponding to the specified three-dimensional coordinate, object, or time, and decodes that encoded three-dimensional data 211.
- the three-dimensional data encoding apparatus 100 may generate and store the following meta information. Further, the three-dimensional data decoding apparatus 200 may use this meta information at the time of decoding.
- a profile may be defined according to the application, and information indicating the profile may be included in the meta information. For example, profiles for urban areas, suburbs, or flying objects are defined, and the maximum or minimum size of the world, SPC, or VLM is defined in each profile. For example, in a city area, detailed information is required as compared to a suburb area, so the minimum size of the VLM can be set smaller.
- the meta information may include a tag value indicating the type of object.
- This tag value is associated with the VLM, SPC, or GOS constituting the object. For example, the tag value “0” may indicate “person”, the tag value “1” may indicate “car”, and the tag value “2” may indicate “traffic light”.
- a tag value indicating a size or a property such as a dynamic object or a static object may be used.
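As a minimal model of the tag values above, a table can map each value to an object type, with a dynamic/static property kept alongside; the extra `describe` helper and the property flag are illustrative assumptions.

```python
# Example tag-value table for object types, following the mapping suggested above
# (0 = person, 1 = car, 2 = traffic light). The describe() helper is illustrative.

TAG_TYPES = {0: "person", 1: "car", 2: "traffic light"}

def describe(tag_value, is_dynamic):
    """Combine the type tag with a dynamic/static property tag."""
    kind = TAG_TYPES.get(tag_value, "unknown")
    motion = "dynamic" if is_dynamic else "static"
    return f"{motion} {kind}"

print(describe(1, True))   # -> dynamic car
print(describe(2, False))  # -> static traffic light
```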
- the meta information may include information indicating the range of the space area occupied by the world.
- the meta information may store the SPC or VXL size as header information common to a plurality of SPCs such as the entire encoded data stream or SPC in GOS.
- the meta information may include identification information such as a distance sensor or a camera used for generating the point cloud, or information indicating the position accuracy of the point cloud in the point cloud.
- the meta information may include information indicating whether the world is composed of only static objects or includes dynamic objects.
- the encoding device or decoding device may encode or decode two or more different SPCs or GOSs in parallel.
- the GOS to be encoded or decoded in parallel can be determined based on meta information indicating the spatial position of the GOS.
- the encoding device or the decoding device may encode or decode the GOS or SPC included in a space that is specified based on external information such as GPS, route information, or zoom magnification.
- the decoding device may perform decoding in order from a space close to the self-position or the travel route.
- the encoding device or the decoding device may encode or decode a space far from its own position or travel route with a lower priority than a close space.
- lowering the priority means lowering the processing order, lowering the resolution (processing with thinning out), or lowering the image quality (for example, raising the encoding efficiency by increasing the quantization step).
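One way to sketch this distance-based prioritization: order the GOSs near-to-far from the self-position, and assign distant GOSs a larger quantization step. The doubling schedule and the 100 m interval are invented for illustration.

```python
# Hypothetical priority rule: GOSs nearer to the self-position (or travel route)
# are decoded first; distant ones get a larger quantization step (lower quality).
# The distance-to-step schedule is an invented example, not the patent's rule.

import math

def decode_plan(gos_centers, self_pos, base_qstep=1.0):
    """Return (gos_index, quantization_step) pairs sorted near-to-far."""
    def dist(c):
        return math.dist(c, self_pos)
    order = sorted(range(len(gos_centers)), key=lambda i: dist(gos_centers[i]))
    # double the quantization step for every 100 m of distance (illustrative)
    return [(i, base_qstep * 2 ** (dist(gos_centers[i]) // 100)) for i in order]

plan = decode_plan([(300, 0, 0), (50, 0, 0)], self_pos=(0, 0, 0))
print(plan)  # -> [(1, 1.0), (0, 8.0)]
```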
- the decoding device may decode only the lower layer when decoding the encoded data that is hierarchically encoded in the space.
- the decoding device may decode preferentially from the lower hierarchy according to the zoom magnification or use of the map.
- the encoding device or the decoding device may encode or decode, at a reduced resolution, areas other than the area within a specific height from the road surface (the area used for recognition).
- the encoding device may individually encode point clouds representing indoor and outdoor space shapes. For example, by separating a GOS representing an indoor space (indoor GOS) from a GOS representing an outdoor space (outdoor GOS), the decoding device can select the GOS to be decoded according to the viewpoint position when the encoded data is used.
- the encoding device may encode the indoor GOS and the outdoor GOS having close coordinates so as to be adjacent in the encoded stream.
- the encoding apparatus associates both identifiers, and stores information indicating the identifiers associated with each other in the encoded stream or meta information stored separately.
- the decoding apparatus can identify the indoor GOS and the outdoor GOS whose coordinates are close by referring to the information in the meta information.
- the encoding device may switch the size of GOS or SPC between the indoor GOS and the outdoor GOS. For example, the encoding apparatus sets the GOS size smaller indoors than outdoors. Further, the encoding device may change, between the indoor GOS and the outdoor GOS, the accuracy with which feature points are extracted from the point cloud, the accuracy of object detection, or the like.
- the encoding device may add information for the decoding device to display the dynamic object separately from the static object to the encoded data.
- the decoding apparatus can display a dynamic object and a red frame or explanatory characters together.
- the decoding device may display only a red frame or explanatory characters instead of the dynamic object.
- the decoding device may display a finer object type. For example, a red frame may be used for a car and a yellow frame may be used for a human.
- the encoding device or the decoding device may determine whether to encode or decode dynamic objects and static objects as different SPCs or GOSs according to the appearance frequency of dynamic objects or the ratio of static objects to dynamic objects. For example, when the appearance frequency or ratio of dynamic objects exceeds a threshold, an SPC or GOS in which dynamic objects and static objects are mixed is permitted; when it does not exceed the threshold, such a mixed SPC or GOS is not permitted.
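The threshold rule above reduces to a simple ratio test; the threshold value below is an assumption chosen for illustration.

```python
# Sketch of the threshold rule above: allow an SPC/GOS mixing dynamic and static
# objects only when the dynamic-object ratio exceeds a threshold.
# The threshold value (0.3) is an illustrative assumption.

def allow_mixed(num_dynamic, num_static, threshold=0.3):
    total = num_dynamic + num_static
    if total == 0:
        return False
    return num_dynamic / total > threshold

print(allow_mixed(5, 5))   # -> True  (ratio 0.5 exceeds 0.3: mixing permitted)
print(allow_mixed(1, 9))   # -> False (ratio 0.1: separate SPCs/GOSs required)
```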
- When detecting a dynamic object from the two-dimensional image information of a camera instead of from the point cloud, the encoding device may separately acquire information identifying the detection result (such as a frame or characters) and the object position, and encode such information as part of the three-dimensional encoded data. In this case, the decoding device superimposes the auxiliary information (frame or characters) indicating the dynamic object on the decoding result of the static object and displays them.
- the encoding device may change the density of VXL or VLM in the SPC according to the complexity of the shape of the static object. For example, the encoding device densely sets VXL or VLM when the shape of the static object is complicated. Further, the encoding apparatus may determine a quantization step or the like when quantizing the spatial position or color information according to the density of VXL or VLM. For example, the encoding apparatus sets the quantization step to be smaller as VXL or VLM is denser.
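The density-to-quantization-step relation above (denser VXL/VLM, smaller step) can be sketched as a monotone schedule; the halving rule and all constants are invented for the example.

```python
# Illustrative mapping from VXL/VLM density to quantization step: the denser the
# volume (more complex static geometry), the smaller the step.
# The halving schedule and constants are invented assumptions, not the patent's rule.

def quantization_step(voxels_per_volume, base_step=8.0, min_step=0.5):
    """Halve the step each time the density quadruples (illustrative schedule)."""
    step = base_step
    density = voxels_per_volume
    while density >= 4 and step > min_step:
        step /= 2
        density /= 4
    return max(step, min_step)

print(quantization_step(1))    # -> 8.0  (sparse: coarse quantization)
print(quantization_step(64))   # -> 1.0  (dense: fine quantization)
```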
- the encoding device or decoding device performs space encoding or decoding in units of space having coordinate information.
- the encoding device and the decoding device perform encoding or decoding in units of volume in the space.
- the volume includes a voxel that is a minimum unit with which position information is associated.
- the encoding device and the decoding device perform encoding or decoding by associating arbitrary elements using a table in which each element of spatial information, including coordinates, objects, time, and the like, is associated with a GOS, or a table in which such elements are associated with each other.
- the decoding apparatus determines coordinates using the value of the selected element, specifies a volume, voxel, or space from the coordinates, and decodes the space including the volume or voxel or the specified space.
- the encoding device determines a volume, voxel, or space that can be selected by the element by feature point extraction or object recognition, and encodes it as a randomly accessible volume, voxel, or space.
- spaces are classified into three types: an I-SPC that can be encoded or decoded by itself, a P-SPC that is encoded or decoded with reference to one processed space, and a B-SPC that is encoded or decoded with reference to two processed spaces.
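The three SPC types and their reference counts can be stated as a minimal model; the table and the `can_decode` helper are illustrative names, not the specification's.

```python
# Minimal model of the I/P/B SPC classification above: each type is defined by
# how many processed spaces it references. Names are illustrative.

SPC_REFS = {"I": 0, "P": 1, "B": 2}  # number of processed spaces referenced

def can_decode(spc_type, num_processed_refs_available):
    """An SPC is decodable once its required reference spaces are processed."""
    return num_processed_refs_available >= SPC_REFS[spc_type]

print(can_decode("I", 0))  # -> True  (I-SPC decodes by itself)
print(can_decode("B", 1))  # -> False (B-SPC needs two processed spaces)
```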
- One or more volumes correspond to static objects or dynamic objects.
- a space including only static objects and a space including only dynamic objects are encoded or decoded as different GOS. That is, SPCs that include only static objects and SPCs that include only dynamic objects are assigned to different GOSs.
- a dynamic object is encoded or decoded for each object, and is associated with one or more spaces including only static objects. That is, the plurality of dynamic objects are individually encoded, and the obtained encoded data of the plurality of dynamic objects is associated with the SPC including only the static object.
- the encoding device and the decoding device increase the priority of I-SPC in GOS and perform encoding or decoding.
- the encoding apparatus performs encoding so that degradation of I-SPC is reduced (so that original three-dimensional data is reproduced more faithfully after decoding).
- the decoding device decodes only I-SPC, for example.
- the encoding device may perform encoding by changing the frequency of using I-SPC according to the density or number (quantity) of objects in the world. That is, the encoding device changes the frequency of selecting the I-SPC according to the number or density of objects included in the three-dimensional data. For example, the encoding device increases the frequency of using the I space as the objects in the world are denser.
- the encoding apparatus sets random access points in units of GOS, and stores information indicating the space area corresponding to GOS in the header information.
- the encoding device uses, for example, a default value as the GOS space size.
- the encoding device may change the size of the GOS according to the number (amount) or density of objects or dynamic objects. For example, the encoding device reduces the GOS space size as the number of objects or dynamic objects increases.
- the space or volume includes a feature point group derived using information obtained by a sensor such as a depth sensor, gyroscope, or camera.
- the coordinates of the feature point are set at the center position of the voxel. Further, high accuracy of position information can be realized by subdividing voxels.
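Snapping a feature point to its voxel center, as described above, is a floor-and-offset operation; shrinking the voxel size (subdividing) raises positional accuracy. The function name and values are illustrative.

```python
# Sketch of setting a feature point's coordinates to its voxel center, per the
# text above. Subdividing voxels (smaller voxel_size) increases position accuracy.

def voxel_center(point, voxel_size):
    """Return the center of the voxel containing `point`."""
    return tuple((int(c // voxel_size) + 0.5) * voxel_size for c in point)

print(voxel_center((1.3, 2.7, 0.2), voxel_size=1.0))   # -> (1.5, 2.5, 0.5)
print(voxel_center((1.3, 2.7, 0.2), voxel_size=0.25))  # -> (1.375, 2.625, 0.125)
```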
- the feature point group is derived using a plurality of pictures.
- each of the plurality of pictures associated with the space has at least two types of time information: actual time information, and time information that is identical across the plurality of pictures (for example, an encoding time used for rate control).
- encoding or decoding is performed in units of GOS including one or more spaces.
- the encoding device and the decoding device refer to the space in the processed GOS and predict the P space or B space in the processing target GOS.
- the encoding device and the decoding device do not refer to a different GOS, but predict the P space or B space in the processing target GOS using a processed space within the processing target GOS.
- the encoding device and the decoding device transmit or receive the encoded stream in units of world including one or more GOSs.
- GOS has a layer structure in at least one direction in the world, and the encoding device and the decoding device perform encoding or decoding from a lower layer.
- a randomly accessible GOS belongs to the lowest layer.
- a GOS belonging to an upper layer refers only to a GOS belonging to the same layer or lower. That is, GOS includes a plurality of layers that are spatially divided in a predetermined direction and each include one or more SPCs.
- the encoding device and the decoding device encode or decode each SPC with reference to the SPC included in the same layer as the SPC or a layer lower than the SPC.
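The layer rule above (an SPC may reference only SPCs in the same layer or a lower layer of its GOS) reduces to a one-line check; the helper name is illustrative.

```python
# Hypothetical check of the layer reference rule described above: a reference is
# legal only when the referenced SPC sits in the same layer or a lower layer.

def reference_allowed(src_layer, ref_layer):
    """Layers are numbered from the lowest (0) upward."""
    return ref_layer <= src_layer

print(reference_allowed(2, 1))  # -> True  (reference down to a lower layer)
print(reference_allowed(1, 2))  # -> False (reference up to a higher layer)
```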
- the encoding device and the decoding device continuously encode or decode GOS within a world unit including a plurality of GOSs.
- the encoding device and the decoding device write or read information indicating the order (direction) of encoding or decoding as metadata. That is, the encoded data includes information indicating the encoding order of a plurality of GOSs.
- the encoding device and the decoding device encode or decode two or more different spaces or GOS in parallel.
- the encoding device and the decoding device encode or decode space or GOS space information (coordinates, size, etc.).
- the encoding device and the decoding device encode or decode a space or GOS included in a specific space that is specified based on external information related to its own position and/or region size, such as GPS, route information, or magnification.
- the encoding device or decoding device encodes or decodes a space far from its own position with a lower priority than a close space.
- the encoding apparatus sets a certain direction of the world according to the magnification or application, and encodes a GOS having a layer structure in that direction. Further, the decoding apparatus preferentially decodes, starting from the lower layer, a GOS having a layer structure in the direction of the world set according to the magnification or application.
- the encoding device changes the accuracy of feature points included in a space, the accuracy of object recognition, the size of a space area, or the like, between indoor and outdoor.
- the encoding device and the decoding device encode or decode the indoor GOS and the outdoor GOS whose coordinates are close to each other in the world, and encode or decode these identifiers in association with each other.
- the 3D data encoding device and the 3D data decoding device according to the embodiment of the present disclosure have been described, but the present disclosure is not limited to this embodiment.
- each processing unit included in the 3D data encoding apparatus or 3D data decoding apparatus is typically realized as an LSI, which is an integrated circuit. Each may be formed as an individual chip, or some or all of them may be integrated into one chip.
- circuits are not limited to LSIs, and may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- the present disclosure may be realized as a three-dimensional data encoding method or a three-dimensional data decoding method executed by a three-dimensional data encoding device or a three-dimensional data decoding device.
- the division of functional blocks in the block diagram is an example: a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality, or some functions may be transferred to other functional blocks.
- the functions of a plurality of functional blocks having similar functions may be processed by a single piece of hardware or software, in parallel or in a time-division manner.
- the present disclosure is not limited to this embodiment. Forms obtained by applying various modifications conceived by those skilled in the art to this embodiment, and forms constructed by combining constituent elements of different embodiments, may also be included within the scope of one or more aspects, as long as they do not deviate from the gist of the present disclosure.
- the present disclosure can be applied to a three-dimensional data encoding device and a three-dimensional data decoding device.
Description
101, 201 Acquisition unit
102 Encoding region determination unit
103 Division unit
104 Encoding unit
111 Three-dimensional data
112, 211 Encoded three-dimensional data
200 Three-dimensional data decoding apparatus
202 Decoding start GOS determination unit
203 Decoding SPC determination unit
204 Decoding unit
212 Decoded three-dimensional data
Claims (20)
- A three-dimensional data encoding method for encoding three-dimensional data, comprising: a division step of dividing the three-dimensional data into first processing units, each of which is a random access unit and is associated with a three-dimensional coordinate; and an encoding step of generating encoded data by encoding each of the plurality of first processing units.
- The three-dimensional data encoding method according to claim 1, further comprising a generation step of generating first information indicating the plurality of first processing units and the three-dimensional coordinates associated with each of the plurality of first processing units, wherein the encoded data includes the first information.
- The three-dimensional data encoding method according to claim 2, wherein the first information further indicates at least one of an object, a time, and a data storage destination associated with each of the plurality of first processing units.
- The three-dimensional data encoding method according to any one of claims 1 to 3, wherein, in the division step, the first processing unit is further divided into a plurality of second processing units, and in the encoding step, each of the plurality of second processing units is encoded.
- The three-dimensional data encoding method according to claim 4, wherein, in the encoding step, a second processing unit to be processed that is included in a first processing unit to be processed is encoded with reference to another second processing unit included in the first processing unit to be processed.
- The three-dimensional data encoding method according to claim 5, wherein, in the encoding step, one of a first type that refers to no other second processing unit, a second type that refers to one other second processing unit, and a third type that refers to two other second processing units is selected as the type of the second processing unit to be processed, and the second processing unit to be processed is encoded according to the selected type.
- The three-dimensional data encoding method according to claim 6, wherein, in the encoding step, the frequency of selecting the first type is changed according to the number or density of objects included in the three-dimensional data.
- The three-dimensional data encoding method according to claim 6, wherein, in the encoding step, the size of the first processing unit is determined according to the number or density of objects or dynamic objects included in the three-dimensional data.
- The three-dimensional data encoding method according to any one of claims 5 to 8, wherein the first processing unit includes a plurality of layers that are spatially divided in a predetermined direction and each include one or more second processing units, and in the encoding step, each second processing unit is encoded with reference to a second processing unit included in the same layer as, or a layer lower than, that second processing unit.
- The three-dimensional data encoding method according to any one of claims 4 to 9, wherein, in the division step, second processing units including only static objects and second processing units including only dynamic objects are assigned to different first processing units.
- The three-dimensional data encoding method according to any one of claims 4 to 9, wherein, in the encoding step, a plurality of dynamic objects are individually encoded, and the encoded data of the plurality of dynamic objects is associated with a second processing unit including only static objects.
- The three-dimensional data encoding method according to any one of claims 4 to 11, wherein, in the division step, the second processing unit is further divided into a plurality of third processing units, and in the encoding step, each of the plurality of third processing units is encoded.
- The three-dimensional data encoding method according to claim 12, wherein the third processing unit includes one or more voxels, each being a minimum unit with which position information is associated.
- The three-dimensional data encoding method according to any one of claims 4 to 13, wherein the second processing unit includes a feature point group derived from information obtained by a sensor.
- The three-dimensional data encoding method according to any one of claims 1 to 14, wherein the encoded data includes information indicating an encoding order of the plurality of first processing units.
- The three-dimensional data encoding method according to any one of claims 1 to 15, wherein the encoded data includes information indicating sizes of the plurality of first processing units.
- The three-dimensional data encoding method according to any one of claims 1 to 16, wherein, in the encoding step, the plurality of first processing units are encoded in parallel.
- A three-dimensional data decoding method for decoding three-dimensional data, comprising a decoding step of generating three-dimensional data of first processing units, each of which is a random access unit and is associated with a three-dimensional coordinate, by decoding each item of encoded data of the first processing units.
- A three-dimensional data encoding device for encoding three-dimensional data, comprising: a division unit that divides the three-dimensional data into first processing units, each of which is a random access unit and is associated with a three-dimensional coordinate; and an encoding unit that generates encoded data by encoding each of the plurality of first processing units.
- A three-dimensional data decoding device for decoding three-dimensional data, comprising a decoding unit that generates three-dimensional data of first processing units, each of which is a random access unit and is associated with a three-dimensional coordinate, by decoding each item of encoded data of the first processing units.
Priority Applications (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017556323A JP6817961B2 (en) | 2015-12-14 | 2016-12-01 | 3D data coding method, 3D data decoding method, 3D data coding device and 3D data decoding device |
KR1020187015523A KR102545015B1 (en) | 2015-12-14 | 2016-12-01 | Point cloud 3D data encoding method, point cloud 3D data decoding method, point cloud 3D data encoding apparatus and point cloud 3D data decoding apparatus |
CN202311058205.0A CN116883521A (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data encoding method, decoding method, encoding device, and decoding device |
CA3005713A CA3005713A1 (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
KR1020237019887A KR20230091200A (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data coding method, three-dimensional data decoding method, three-dimensional data coding device, and three-dimensional data decoding device |
CN201680070904.9A CN108369751B (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data encoding method, decoding method, encoding device, and decoding device |
MX2018006642A MX2018006642A (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data coding method, three-dimensional data decoding method, three-dimensional data coding device, and three-dimensional data decoding device. |
CN202311061409.XA CN116883522A (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data encoding method, decoding method, encoding device, and decoding device |
MYPI2018000862A MY190934A (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
EP16875104.8A EP3392840A4 (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
US15/996,710 US11290745B2 (en) | 2015-12-14 | 2018-06-04 | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
US17/674,380 US20220174318A1 (en) | 2015-12-14 | 2022-02-17 | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562266914P | 2015-12-14 | 2015-12-14 | |
US62/266,914 | 2015-12-14 | ||
JP2016-225504 | 2016-11-18 | ||
JP2016225504 | 2016-11-18 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/996,710 Continuation US11290745B2 (en) | 2015-12-14 | 2018-06-04 | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017104115A1 true WO2017104115A1 (en) | 2017-06-22 |
Family
ID=59056189
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/005041 WO2017104115A1 (en) | 2015-12-14 | 2016-12-01 | Three-dimensional data coding method, three-dimensional data decoding method, three-dimensional data coding device, and three-dimensional data decoding device |
Country Status (9)
Country | Link |
---|---|
US (2) | US11290745B2 (en) |
EP (1) | EP3392840A4 (en) |
JP (3) | JP6817961B2 (en) |
KR (2) | KR102545015B1 (en) |
CN (3) | CN108369751B (en) |
CA (1) | CA3005713A1 (en) |
MX (2) | MX2018006642A (en) |
MY (1) | MY190934A (en) |
WO (1) | WO2017104115A1 (en) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019012975A1 (en) * | 2017-07-10 | 2019-01-17 | Sony Corporation | Information processing device and method |
WO2019065297A1 (en) * | 2017-09-29 | 2019-04-04 | Sony Corporation | Information processing device and method |
WO2019078000A1 (en) * | 2017-10-16 | 2019-04-25 | Sony Corporation | Information processing device and method |
WO2019131880A1 (en) * | 2017-12-28 | 2019-07-04 | Panasonic Intellectual Property Corporation Of America | Coding method, decoding method, information processing method, coding device, decoding device, and information processing system |
WO2019235366A1 (en) * | 2018-06-06 | 2019-12-12 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240284A1 (en) * | 2018-06-14 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240215A1 (en) * | 2018-06-13 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240167A1 (en) * | 2018-06-12 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240286A1 (en) * | 2018-06-15 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
KR20200002213A (en) * | 2018-06-29 | 2020-01-08 | Hyundai Mnsoft, Inc. | Apparatus and method for constructing a 3d space map for route search for unmanned aerial vehicle |
WO2020008758A1 (en) * | 2018-07-06 | 2020-01-09 | Sony Corporation | Information processing device, information processing method, and program |
JP2020521359A (en) * | 2017-05-24 | 2020-07-16 | InterDigital VC Holdings, Inc. | Method and apparatus for encoding and reconstructing point clouds |
WO2020162495A1 (en) * | 2019-02-05 | 2020-08-13 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2020218593A1 (en) * | 2019-04-25 | 2020-10-29 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2020230710A1 (en) * | 2019-05-10 | 2020-11-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2021010134A1 (en) * | 2019-07-12 | 2021-01-21 | Sony Corporation | Information processing device and method |
CN112424833A (en) * | 2018-07-13 | 2021-02-26 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN113196784A (en) * | 2018-12-19 | 2021-07-30 | Sony Group Corporation | Point cloud coding structure |
CN113545099A (en) * | 2019-03-11 | 2021-10-22 | Sony Group Corporation | Information processing apparatus, reproduction processing apparatus, information processing method, and reproduction processing method |
JPWO2020138464A1 (en) * | 2018-12-28 | 2021-11-04 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
US20210375004A1 (en) * | 2019-02-28 | 2021-12-02 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2022145214A1 (en) * | 2020-12-28 | 2022-07-07 | Sony Group Corporation | Information processing device and method |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3918793A4 (en) * | 2019-01-30 | 2022-11-30 | Nokia Technologies Oy | An apparatus, a method and a computer program for volumetric video |
WO2020220249A1 (en) * | 2019-04-30 | 2020-11-05 | SZ DJI Technology Co., Ltd. | Data encoding and data decoding methods and devices, and storage medium |
WO2022252337A1 (en) * | 2021-06-04 | 2022-12-08 | Huawei Technologies Co., Ltd. | Encoding method and apparatus for 3d map, and decoding method and apparatus for 3d map |
WO2024014826A1 (en) * | 2022-07-11 | 2024-01-18 | LG Electronics Inc. | Transmission device for point cloud data, method performed by transmission device, reception device for point cloud data, and method performed by reception device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11265460A (en) * | 1998-03-18 | 1999-09-28 | Nec Corp | Memory saving device and method for voxel data |
JP2000338900A (en) * | 1999-05-27 | 2000-12-08 | Sony Corp | Display device for three-dimensional stereoscopic image and method for displaying three-dimensional stereoscopic image |
JP2005259139A (en) * | 2004-03-08 | 2005-09-22 | Samsung Electronics Co Ltd | Generating method of adaptive notation system of base-2n tree, and device and method for encoding/decoding three-dimensional volume data utilizing same |
US20090184957A1 (en) * | 2008-01-21 | 2009-07-23 | Samsung Electronics Co., Ltd. | Method and system for compressing and decoding mesh data with random accessibility in three-dimensional mesh model |
Family Cites Families (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4999705A (en) * | 1990-05-03 | 1991-03-12 | At&T Bell Laboratories | Three dimensional motion compensated video coding |
GB2259824B (en) * | 1991-09-19 | 1995-01-18 | Sony Broadcast & Communication | Data compression |
KR950009680B1 (en) * | 1992-05-19 | 1995-08-25 | GoldStar Co., Ltd. | Image decoder of image compression and decompression system |
JPH06141301A (en) * | 1992-10-27 | 1994-05-20 | Victor Co Of Japan Ltd | Picture information compressor, expander and compander |
US6674911B1 (en) * | 1995-09-14 | 2004-01-06 | William A. Pearlman | N-dimensional data compression using set partitioning in hierarchical trees |
US6604166B1 (en) * | 1998-12-30 | 2003-08-05 | Silicon Automation Systems Limited | Memory architecture for parallel data access along any given dimension of an n-dimensional rectangular data array |
CN1237817C (en) * | 2000-05-03 | 2006-01-18 | Koninklijke Philips Electronics N.V. | Encoding method for the compression of a video sequence |
JP2003533952A (en) * | 2000-05-18 | 2003-11-11 | Koninklijke Philips Electronics N.V. | Encoding method for video sequence compression |
JP4672175B2 (en) * | 2000-05-26 | 2011-04-20 | Honda Motor Co., Ltd. | Position detection apparatus, position detection method, and position detection program |
EP1297709A1 (en) * | 2000-06-14 | 2003-04-02 | Koninklijke Philips Electronics N.V. | Color video encoding and decoding method |
US7376279B2 (en) * | 2000-12-14 | 2008-05-20 | Idx Investment Corporation | Three-dimensional image streaming system and method for medical images |
CA2373707A1 (en) * | 2001-02-28 | 2002-08-28 | Paul Besl | Method and system for processing, compressing, streaming and interactive rendering of 3d color image data |
GB2378108B (en) * | 2001-07-24 | 2005-08-17 | Imagination Tech Ltd | Three dimensional graphics system |
US6965816B2 (en) * | 2001-10-01 | 2005-11-15 | Kline & Walker, Llc | PFN/TRAC system FAA upgrades for accountable remote and robotics control to stop the unauthorized use of aircraft and to improve equipment management and public safety in transportation |
KR100397511B1 (en) * | 2001-11-21 | 2003-09-13 | Electronics and Telecommunications Research Institute | Processing system and method for stereoscopic/multiview video |
CN1240225C (en) * | 2002-02-13 | 2006-02-01 | Matsushita Electric Industrial Co., Ltd. | Picture coding device and picture coding method |
US20040008779A1 (en) * | 2002-06-18 | 2004-01-15 | Lai King Chung | Techniques for video encoding and decoding |
JP2004141514A (en) * | 2002-10-28 | 2004-05-20 | Toshiba Corp | Image processing apparatus and ultrasonic diagnostic apparatus |
US8163896B1 (en) * | 2002-11-14 | 2012-04-24 | Rosetta Genomics Ltd. | Bioinformatically detectable group of novel regulatory genes and uses thereof |
JP2004186978A (en) * | 2002-12-03 | 2004-07-02 | Sanyo Electric Co Ltd | Method and device for data write and digital camera |
KR100513732B1 (en) * | 2002-12-05 | 2005-09-08 | Samsung Electronics Co., Ltd. | Method and apparatus for encoding and decoding 3 dimensional data |
US7321625B2 (en) * | 2002-12-13 | 2008-01-22 | Ntt Docomo, Inc. | Wavelet based multiresolution video representation with spatially scalable motion vectors |
JP4002878B2 (en) * | 2003-01-17 | 2007-11-07 | Matsushita Electric Industrial Co., Ltd. | Image coding method |
US7444011B2 (en) * | 2004-02-10 | 2008-10-28 | University Of Chicago | Imaging system performing substantially exact reconstruction and using non-traditional trajectories |
KR100519780B1 (en) * | 2004-02-17 | 2005-10-07 | Samsung Electronics Co., Ltd. | Method and apparatus for encoding and decoding 3D volume data |
US7630565B2 (en) * | 2004-11-30 | 2009-12-08 | Lsi Corporation | Parallel video encoder with whole picture deblocking and/or whole picture compressed as a single slice |
US20060193386A1 (en) * | 2005-02-25 | 2006-08-31 | Chia-Wen Lin | Method for fast mode decision of variable block size coding |
US8588304B2 (en) * | 2005-03-31 | 2013-11-19 | Panasonic Corporation | Video decoding device, video decoding method, video decoding program, and video decoding integrated circuit |
JP2006338630A (en) * | 2005-05-31 | 2006-12-14 | Terarikon Inc | Three-dimensional image display device for creating three-dimensional image while sequentially and partially decompressing compressed image data |
US7702141B2 (en) * | 2005-06-29 | 2010-04-20 | General Electric Company | Method for quantifying an object in a larger structure using a reconstructed image |
US7502501B2 (en) * | 2005-12-22 | 2009-03-10 | Carestream Health, Inc. | System and method for rendering an oblique slice through volumetric data accessed via a client-server architecture |
KR100846870B1 (en) * | 2006-07-06 | 2008-07-16 | Electronics and Telecommunications Research Institute | Apparatus and method of multi-stage, multi-dimensional transform based on multiple unit blocks |
JP4793366B2 (en) * | 2006-10-13 | 2011-10-12 | Victor Company of Japan, Ltd. | Multi-view image encoding device, multi-view image encoding method, multi-view image encoding program, multi-view image decoding device, multi-view image decoding method, and multi-view image decoding program |
US8103111B2 (en) | 2006-12-26 | 2012-01-24 | Olympus Imaging Corp. | Coding method, electronic camera, recording medium storing coded program, and decoding method |
RU2010120518A (en) * | 2007-10-15 | 2011-11-27 | Nokia Corporation (FI) | Transfer of motion information and single-loop coding for multiview video content |
US9641822B2 (en) * | 2008-02-25 | 2017-05-02 | Samsung Electronics Co., Ltd. | Method and apparatus for processing three-dimensional (3D) images |
KR101086772B1 (en) * | 2008-04-18 | 2011-11-25 | Industry-University Cooperation Foundation, Hanyang University | Method and apparatus for 3d mesh compression based quantization |
WO2009130561A1 (en) * | 2008-04-21 | 2009-10-29 | Nokia Corporation | Method and device for video coding and decoding |
US20090268821A1 (en) * | 2008-04-29 | 2009-10-29 | The Hong Kong University Of Science And Technology | Block parallel and fast motion estimation in video coding |
JP5309700B2 (en) * | 2008-06-03 | 2013-10-09 | Fujitsu Limited | Moving picture decoding apparatus and encoding apparatus |
US20100020877A1 (en) * | 2008-07-23 | 2010-01-28 | The Hong Kong University Of Science And Technology | Multiple reference frame motion estimation in video coding |
US8948496B2 (en) * | 2008-08-29 | 2015-02-03 | Koninklijke Philips N.V. | Dynamic transfer of three-dimensional image data |
JP5394212B2 (en) * | 2008-12-19 | 2014-01-22 | Thomson Licensing | Method for inserting data and method for reading inserted data |
US9066107B2 (en) * | 2009-01-28 | 2015-06-23 | France Telecom | Methods for encoding and decoding sequence implementing a motion compensation, corresponding encoding and decoding devices, signal and computer programs |
US8615044B2 (en) * | 2009-06-05 | 2013-12-24 | Cisco Technology, Inc. | Adaptive thresholding of 3D transform coefficients for video denoising |
WO2010150486A1 (en) * | 2009-06-22 | 2010-12-29 | Panasonic Corporation | Video coding method and video coding device |
KR101631944B1 (en) * | 2009-10-30 | 2016-06-20 | Samsung Electronics Co., Ltd. | Method and apparatus for entropy encoding and entropy decoding for accelerating video decoding |
US20120215788A1 (en) * | 2009-11-18 | 2012-08-23 | Nokia Corporation | Data Processing |
KR20110069740A (en) * | 2009-12-17 | 2011-06-23 | SK Telecom Co., Ltd. | Video coding method and apparatus |
US20110194613A1 (en) * | 2010-02-11 | 2011-08-11 | Qualcomm Incorporated | Video coding with large macroblocks |
US8468431B2 (en) | 2010-07-01 | 2013-06-18 | Densbits Technologies Ltd. | System and method for multi-dimensional encoding and decoding |
KR20120018269A (en) * | 2010-08-20 | 2012-03-02 | Electronics and Telecommunications Research Institute | Multi-dimensional layered modulation transmission apparatus and method for stereoscopic 3d video |
JP5703781B2 (en) * | 2010-09-03 | 2015-04-22 | Sony Corporation | Image processing apparatus and method |
EP2618565A4 (en) * | 2010-09-13 | 2014-04-16 | Sony Computer Entertainment Inc | Image processing device, image processing method, data structure for video files, data compression device, data decoding device, data compression method, data decoding method, and data structure for compressed video files |
SG188255A1 (en) * | 2010-09-30 | 2013-04-30 | Panasonic Corp | Image decoding method, image coding method, image decoding apparatus, image coding apparatus, program, and integrated circuit |
US8755620B2 (en) * | 2011-01-12 | 2014-06-17 | Panasonic Corporation | Image coding method, image decoding method, image coding apparatus, image decoding apparatus, and image coding and decoding apparatus for performing arithmetic coding and/or arithmetic decoding |
JP5357199B2 (en) * | 2011-03-14 | 2013-12-04 | Nippon Telegraph and Telephone Corporation | Image encoding method, image decoding method, image encoding device, image decoding device, image encoding program, and image decoding program |
MY164655A (en) * | 2011-07-29 | 2018-01-30 | Sun Patent Trust | Moving Picture Coding Method, Moving Picture Decoding Method, Moving Picture Coding Apparatus, Moving Picture Decoding Apparatus, And Decoding Apparatus |
WO2013030634A1 (en) * | 2011-08-31 | 2013-03-07 | Rocks International Group Pte Ltd | Virtual advertising platform |
EP2768396A2 (en) * | 2011-10-17 | 2014-08-27 | Butterfly Network Inc. | Transmissive imaging and related apparatus and methods |
KR20140107107A (en) * | 2011-12-28 | 2014-09-04 | Panasonic Corporation | Image playback device, image playback method, image playback program, image transmission device, image transmission method, and image transmission program |
US20150172715A1 (en) * | 2012-07-09 | 2015-06-18 | Nippon Telegraph And Telephone Corporation | Picture encoding method, picture decoding method, picture encoding apparatus, picture decoding apparatus, picture encoding program, picture decoding program, and recording media |
US9325990B2 (en) * | 2012-07-09 | 2016-04-26 | Qualcomm Incorporated | Temporal motion vector prediction in video coding extensions |
US9667942B2 (en) * | 2012-11-20 | 2017-05-30 | Qualcomm Incorporated | Adaptive luminance compensation in three dimensional video coding |
US10699361B2 (en) * | 2012-11-21 | 2020-06-30 | Ati Technologies Ulc | Method and apparatus for enhanced processing of three dimensional (3D) graphics data |
WO2014120369A1 (en) * | 2013-01-30 | 2014-08-07 | Intel Corporation | Content adaptive partitioning for prediction and coding for next generation video |
KR101834236B1 (en) * | 2013-04-02 | 2018-03-06 | Chips & Media, Inc. | Method and apparatus for processing video |
US9497485B2 (en) * | 2013-04-12 | 2016-11-15 | Intel Corporation | Coding unit size dependent simplified depth coding for 3D video coding |
WO2015031637A1 (en) * | 2013-08-29 | 2015-03-05 | Abbott Laboratories | Nutritional composition having lipophilic compounds with improved solubility and bioavailability |
TWI646828B (en) * | 2013-09-03 | 2019-01-01 | Sony Corporation | Decoding device and decoding method, encoding device and encoding method |
EP3013050A1 (en) * | 2014-10-22 | 2016-04-27 | Axis AB | Video compression with adaptive GOP length and adaptive quantization |
US10264273B2 (en) * | 2014-10-31 | 2019-04-16 | Disney Enterprises, Inc. | Computed information for metadata extraction applied to transcoding |
US9600929B1 (en) * | 2014-12-01 | 2017-03-21 | Ngrain (Canada) Corporation | System, computer-readable medium and method for 3D-differencing of 3D voxel models |
2016
- 2016-12-01 CA CA3005713A patent/CA3005713A1/en active Pending
- 2016-12-01 CN CN201680070904.9A patent/CN108369751B/en active Active
- 2016-12-01 JP JP2017556323A patent/JP6817961B2/en active Active
- 2016-12-01 MX MX2018006642A patent/MX2018006642A/en unknown
- 2016-12-01 CN CN202311058205.0A patent/CN116883521A/en active Pending
- 2016-12-01 KR KR1020187015523A patent/KR102545015B1/en active IP Right Grant
- 2016-12-01 CN CN202311061409.XA patent/CN116883522A/en active Pending
- 2016-12-01 EP EP16875104.8A patent/EP3392840A4/en active Pending
- 2016-12-01 WO PCT/JP2016/005041 patent/WO2017104115A1/en active Application Filing
- 2016-12-01 KR KR1020237019887A patent/KR20230091200A/en active IP Right Grant
- 2016-12-01 MY MYPI2018000862A patent/MY190934A/en unknown
2018
- 2018-05-30 MX MX2024007272A patent/MX2024007272A/en unknown
- 2018-06-04 US US15/996,710 patent/US11290745B2/en active Active
2022
- 2022-02-17 US US17/674,380 patent/US20220174318A1/en active Pending
- 2022-09-06 JP JP2022141511A patent/JP7411747B2/en active Active
2023
- 2023-12-25 JP JP2023217517A patent/JP2024023826A/en active Pending
Non-Patent Citations (1)
Title |
---|
"Octree-Based Progressive Geometry Coding of Point Clouds", EUROGRAPHICS SYMPOSIUM ON POINT-BASED GRAPHICS, 2006 |
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020521359A (en) * | 2017-05-24 | 2020-07-16 | InterDigital VC Holdings, Inc. | Method and apparatus for encoding and reconstructing point clouds |
JP7309616B2 (en) | 2017-05-24 | 2023-07-18 | InterDigital VC Holdings, Inc. | Method and apparatus for encoding and reconstructing point clouds |
JPWO2019012975A1 (en) * | 2017-07-10 | 2020-05-07 | Sony Corporation | Information processing device and method |
CN110832550A (en) * | 2017-07-10 | 2020-02-21 | Sony Corporation | Information processing apparatus and method |
US11508096B2 (en) | 2017-07-10 | 2022-11-22 | Sony Corporation | Information processing apparatus and method |
WO2019012975A1 (en) * | 2017-07-10 | 2019-01-17 | Sony Corporation | Information processing device and method |
EP3654293A4 (en) * | 2017-07-10 | 2020-07-29 | Sony Corporation | Information processing device and method |
JP7234925B2 (en) | 2017-07-10 | 2023-03-08 | Sony Group Corporation | Information processing device and method |
CN110832550B (en) * | 2017-07-10 | 2023-10-13 | Sony Corporation | Information processing apparatus and method |
WO2019065297A1 (en) * | 2017-09-29 | 2019-04-04 | Sony Corporation | Information processing device and method |
US11087501B2 (en) | 2017-09-29 | 2021-08-10 | Sony Corporation | Voxel correlation information processing apparatus and method |
WO2019078000A1 (en) * | 2017-10-16 | 2019-04-25 | Sony Corporation | Information processing device and method |
JPWO2019078000A1 (en) * | 2017-10-16 | 2020-12-03 | Sony Corporation | Information processing device and method |
EP3699870A4 (en) * | 2017-10-16 | 2020-12-23 | Sony Corporation | Information processing device and method |
JP7276136B2 (en) | 2017-10-16 | 2023-05-18 | Sony Group Corporation | Information processing device and method |
US11657539B2 (en) | 2017-10-16 | 2023-05-23 | Sony Corporation | Information processing apparatus and information processing method |
WO2019131880A1 (en) * | 2017-12-28 | 2019-07-04 | Panasonic Intellectual Property Corporation Of America | Coding method, decoding method, information processing method, coding device, decoding device, and information processing system |
US11533514B2 (en) | 2017-12-28 | 2022-12-20 | Panasonic Intellectual Property Corporation Of America | Encoding method, decoding method, information processing method, encoding device, decoding device, and information processing system |
JP7167144B2 (en) | 2018-06-06 | 2022-11-08 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JPWO2019235366A1 (en) * | 2018-06-06 | 2021-06-24 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019235366A1 (en) * | 2018-06-06 | 2019-12-12 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JPWO2019240167A1 (en) * | 2018-06-12 | 2021-07-26 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JP7389028B2 (en) | 2018-06-12 | 2023-11-29 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240167A1 (en) * | 2018-06-12 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JPWO2019240215A1 (en) * | 2018-06-13 | 2021-06-24 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN112262412A (en) * | 2018-06-13 | 2021-01-22 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JP7167147B2 (en) | 2018-06-13 | 2022-11-08 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240215A1 (en) * | 2018-06-13 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JP7245244B2 (en) | 2018-06-14 | 2023-03-23 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JPWO2019240284A1 (en) * | 2018-06-14 | 2021-06-24 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240284A1 (en) * | 2018-06-14 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2019240286A1 (en) * | 2018-06-15 | 2019-12-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JPWO2019240286A1 (en) * | 2018-06-15 | 2021-06-24 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JP7330962B2 (en) | 2018-06-15 | 2023-08-22 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
KR102485554B1 (en) * | 2018-06-29 | 2023-01-05 | Hyundai AutoEver Corp. | Apparatus and method for constructing a 3d space map for route search for unmanned aerial vehicle |
KR20200002213A (en) * | 2018-06-29 | 2020-01-08 | Hyundai Mnsoft, Inc. | Apparatus and method for constructing a 3d space map for route search for unmanned aerial vehicle |
JPWO2020008758A1 (en) * | 2018-07-06 | 2021-07-08 | Sony Group Corporation | Information processing device, information processing method, and program |
JP7310816B2 (en) | 2018-07-06 | 2023-07-19 | Sony Group Corporation | Information processing device, information processing method, and program |
US11516453B2 (en) | 2018-07-06 | 2022-11-29 | Sony Corporation | Information processing apparatus, information processing method, and program for point cloud sample processing |
WO2020008758A1 (en) * | 2018-07-06 | 2020-01-09 | Sony Corporation | Information processing device, information processing method, and program |
JP7528336B2 (en) | 2018-07-13 | 2024-08-05 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN112424833A (en) * | 2018-07-13 | 2021-02-26 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN113196784A (en) * | 2018-12-19 | 2021-07-30 | Sony Group Corporation | Point cloud coding structure |
JP7216351B2 (en) | 2018-12-19 | 2023-02-01 | Sony Group Corporation | Point cloud coding structure |
JP2022517060A (en) * | 2018-12-19 | 2022-03-04 | Sony Group Corporation | Point cloud coding structure |
JPWO2020138464A1 (en) * | 2018-12-28 | 2021-11-04 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
JP7410879B2 (en) | 2018-12-28 | 2024-01-10 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2020162495A1 (en) * | 2019-02-05 | 2020-08-13 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
US12073591B2 (en) * | 2019-02-28 | 2024-08-27 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
US20210375004A1 (en) * | 2019-02-28 | 2021-12-02 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
CN113545099A (en) * | 2019-03-11 | 2021-10-22 | Sony Group Corporation | Information processing apparatus, reproduction processing apparatus, information processing method, and reproduction processing method |
WO2020218593A1 (en) * | 2019-04-25 | 2020-10-29 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2020230710A1 (en) * | 2019-05-10 | 2020-11-19 | Panasonic Intellectual Property Corporation Of America | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
WO2021010134A1 (en) * | 2019-07-12 | 2021-01-21 | Sony Corporation | Information processing device and method |
JP7544050B2 (en) | 2019-07-12 | 2024-09-03 | Sony Group Corporation | Information processing device and method |
WO2022145214A1 (en) * | 2020-12-28 | 2022-07-07 | Sony Group Corporation | Information processing device and method |
Also Published As
Publication number | Publication date |
---|---|
CN116883521A (en) | 2023-10-13 |
MX2024007272A (en) | 2024-06-26 |
JP6817961B2 (en) | 2021-01-20 |
MX2018006642A (en) | 2018-08-01 |
CN116883522A (en) | 2023-10-13 |
CN108369751B (en) | 2023-09-08 |
EP3392840A4 (en) | 2019-02-06 |
CA3005713A1 (en) | 2017-06-22 |
US20180278956A1 (en) | 2018-09-27 |
EP3392840A1 (en) | 2018-10-24 |
US11290745B2 (en) | 2022-03-29 |
KR20180094870A (en) | 2018-08-24 |
MY190934A (en) | 2022-05-23 |
JP2024023826A (en) | 2024-02-21 |
JP2022177079A (en) | 2022-11-30 |
KR102545015B1 (en) | 2023-06-19 |
CN108369751A (en) | 2018-08-03 |
US20220174318A1 (en) | 2022-06-02 |
JPWO2017104115A1 (en) | 2018-09-27 |
KR20230091200A (en) | 2023-06-22 |
JP7411747B2 (en) | 2024-01-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7411747B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
JP6711913B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
US11710271B2 (en) | Three-dimensional data creation method, three-dimensional data transmission method, three-dimensional data creation device, and three-dimensional data transmission device | |
US11989921B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
JP7248582B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
US11206426B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device using occupancy patterns | |
JP7490844B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
JP2024103490A (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device | |
JP7138695B2 (en) | Three-dimensional data encoding method, three-dimensional data decoding method, three-dimensional data encoding device, and three-dimensional data decoding device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16875104 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017556323 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 3005713 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2018/006642 Country of ref document: MX |
|
ENP | Entry into the national phase |
Ref document number: 20187015523 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 201680070904.9 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |