WO2017168949A1 - Information processing device, imaging device, image reproduction device, and method and program - Google Patents
- Publication number: WO2017168949A1 (PCT application PCT/JP2017/001197)
- Authority: WIPO (PCT)
- Prior art keywords: information, image, imaging, specific area, specific
Classifications
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
- G06T7/11—Region-based segmentation
- G06T7/70—Determining position or orientation of objects or cameras
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3876—Recombination of partial images to recreate the original image
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
- G06T2207/20221—Image fusion; Image merging
Description
- This technology relates to an information processing device, an imaging device, an image reproduction device, and a method and program that make it easy to process the image of a desired imaging region in a composite image obtained by stitching together captured images acquired by a plurality of imaging units.
- Conventionally, a wide-angle image is generated using a plurality of images having different imaging directions.
- For example, images are captured in different directions by a plurality of imaging units, and the obtained captured images are stitched together using the technique disclosed in Patent Document 1, so that a composite image in which the captured images are continuous without image shift is generated.
- In this way, an omnidirectional image covering the entire celestial sphere is generated, for example.
- Patent documents: JP 2000-215317 A; JP 2001-298652 A
- Incidentally, an image in a part of the imaging region may need to be deleted or replaced.
- For example, an unintended subject, such as the photographer or equipment used at the time of imaging, may appear in the captured images.
- Conventionally, a specific area that is the target of image deletion or replacement is determined by analyzing the image.
- However, when the specific area is determined by analyzing the composite image, it must be determined for each frame if the composite image is a moving image, so the processing time becomes long.
- Accordingly, the purpose of this technology is to provide an information processing apparatus, an imaging apparatus, an image reproduction apparatus, and a method and program that can easily process the image of a desired imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units.
- A first aspect of this technology is an information processing apparatus including a specific area information generation unit that generates specific area information indicating a specific area corresponding to a specific imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units.
- In this technology, specific area information indicating the specific area corresponding to the specific imaging area in such a composite image is generated by the specific area information generation unit.
- An imaging information generation unit generates imaging information indicating the imaging position, imaging direction, imaging range, and time information of the imaging units.
- The specific area information indicates not only the specific area in the composite image but also includes application range information indicating the application time of image processing for the specific area or the imaging position at which the image processing is applied.
- The specific imaging area is, for example, an area captured by a specific one of the plurality of imaging units, an imaging area specified by an azimuth and an elevation angle, or an imaging area specified by an apex angle and an azimuth angle.
- A composite setting information generation unit is also provided that generates composite setting information for stitching together the captured images acquired by the plurality of imaging units so that they are continuous without image shift.
- The captured image composition unit that generates the composite image does so, referenced to a predetermined imaging direction, based on the composite setting information generated by the composite setting information generation unit.
- An output unit is also provided that outputs the composite image in association with the specific area information.
- The specific area information generation unit can also generate specific area information for each restriction level applied when reproducing the composite image, specific area information that takes as the specific area the image area containing a predetermined captured image, specific area information based on voice data or text data, specific area information based on acquired command information, or specific area information based on imaging information and subject position information.
- A second aspect of this technology is an information processing method that includes generating, by a specific area information generation unit, specific area information indicating the specific area of the composite image corresponding to the specific imaging area.
- A third aspect of this technology is a program that causes a computer to generate information related to a composite image obtained by stitching together captured images acquired by a plurality of imaging units, the program causing the computer to realize a function of generating specific area information indicating the specific area of the composite image corresponding to the specific imaging area.
- The fourth aspect of this technology is an imaging apparatus including a plurality of imaging units, a captured image composition unit that stitches together the captured images acquired by the plurality of imaging units to generate a composite image, and a specific area information generation unit that generates specific area information indicating the specific area of the composite image corresponding to the specific imaging area.
- In this technology, a composite image is generated by stitching together captured images with different imaging directions acquired by a plurality of imaging units, and specific area information indicating the specific area corresponding to the specific imaging area in the composite image is also generated. The captured image composition unit then performs image processing on the specific area in the composite image based on the generated specific area information, or the composite image and the specific area information are output in association with each other.
- The fifth aspect of this technology is an image reproduction apparatus including an image reading unit that reads a composite image generated by stitching together a plurality of captured images, an information reading unit that reads specific area information indicating the specific area corresponding to the specific imaging area in the composite image, and a reproduction control unit that performs reproduction control, in the composite image read by the image reading unit, on the specific area indicated by the specific area information read by the information reading unit.
- In this technology, a composite image generated by stitching together a plurality of captured images, and specific area information indicating the specific area corresponding to the specific imaging area in the composite image, are acquired.
- In the reproduction control, invalidation processing or enhancement processing is performed on the specific area indicated by the specific area information in the acquired composite image.
- As the invalidation processing, for example, a texture indicated by the specific area information is composited over the image of the specific area, or output of the image of the specific area is stopped or replaced with invalid data.
- FIG. 1 exemplifies a configuration of an imaging apparatus using the information processing apparatus of the present technology.
- the imaging device 10 includes a captured image acquisition unit 11, an imaging information generation unit 12, a combination setting information generation unit 13, and a specific area information generation unit 14.
- the imaging apparatus 10 includes a captured image composition unit 21, an encoding processing unit 22, and an output unit 23.
- the imaging device 10 may be provided with a storage unit (for example, a recording medium) 31 that is fixed or detachable. The case where the storage unit 31 is used will be described later.
- the captured image acquisition unit 11 includes a plurality of imaging units 11-1 to 11-n.
- The imaging ranges and the physical layout of the imaging units 11-1 to 11-n are set so that a composite image in which the captured images acquired by the imaging units 11-1 to 11-n are continuous without image shift can be generated.
- The captured image acquisition unit 11 includes the time code supplied from the time code generation unit 121 of the imaging information generation unit 12 in the image data of the captured images acquired by the imaging units 11-1 to 11-n, and outputs the image data to the captured image composition unit 21. Note that the composite image may be an image having a wider angle of view than the captured image obtained by a single imaging unit.
- The composite image may be, for example, an omnidirectional image, an image whose imaging range is narrower than the omnidirectional image (e.g., a hemispherical image), or an image whose imaging range is expanded in one or both of the horizontal and vertical directions.
- the imaging information generation unit 12 generates imaging information including an imaging position, an imaging direction, an imaging range, and time information.
- the imaging information generation unit 12 includes a time code generation unit 121, a clock unit 122, a sensor unit 123, and an information generation processing unit 124.
- the time code generation unit 121 generates a time code and outputs the time code to the captured image acquisition unit 11 and the information generation processing unit 124.
- the clock unit 122 generates time information and outputs the time information to the information generation processing unit 124.
- the sensor unit 123 is configured using a position sensor, an altitude sensor, an azimuth sensor, an elevation angle sensor, and the like.
- the sensor unit 123 includes, for example, imaging position information indicating the latitude, longitude, and altitude of the imaging device 10 (or the captured image acquisition unit 11 when the captured image acquisition unit 11 is separable from the main body of the imaging device 10), and the imaging device 10. Attitude information indicating the azimuth and the elevation angle to which (or, for example, any one reference imaging unit of the imaging image acquisition unit 11) is generated is generated.
- the sensor unit 123 outputs imaging position information and posture information to the information generation processing unit 124.
- The information generation processing unit 124 uses the time code, time information, imaging position information, and posture information to generate imaging information, which is meta information related to the captured images acquired by the captured image acquisition unit 11 or to the composite image (described later) generated by the captured image composition unit 21.
- the imaging information has common data and time series data.
- the common data is information that does not depend on time, and is generated, for example, at the start of imaging or before the start of imaging.
- the time-series data is information that depends on time, and is generated using imaging position information and posture information that may change with the passage of time together with a time code that is time information. Further, the time series data is periodically generated at predetermined time intervals.
- the imaging information generation unit 12 outputs the generated imaging information to the output unit 23.
- The composite setting information generation unit 13 generates composite setting information for stitching the captured images together so that they are continuous without image shift in the stitching processing performed by the captured image composition unit 21.
- The composite setting information generation unit 13 generates the composite setting information, as described later, based on physical layout information of the imaging units 11-1 to 11-n used in the captured image acquisition unit 11, lens information indicating the focal length of the lenses used in the imaging units 11-1 to 11-n, information on the projection method for projecting a wide-angle image such as an omnidirectional image onto a two-dimensional plane, image size information of the image projected onto the two-dimensional plane, and the like.
- The information used to generate the composite setting information may be input by the user or acquired automatically by the composite setting information generation unit 13. For example, the lens information may be acquired automatically by communicating with the imaging units 11-1 to 11-n and used to generate the composite setting information.
- The composite setting information generation unit 13 outputs the composite setting information to the captured image composition unit 21 and the specific area information generation unit 14.
- the specific area information generation unit 14 generates specific area information indicating a specific area corresponding to the specific imaging area in the composite image obtained by connecting the captured images acquired by the imaging units 11-1 to 11-n.
- The specific imaging area is, for example, an imaging area that contains an unintended subject.
- The specific imaging area may be the imaging area of any specific one of the imaging units 11-1 to 11-n, designated using identification information assigned in advance to each of the imaging units 11-1 to 11-n, an imaging area specified by an apex angle and an azimuth angle of a coordinate system based on the imaging position (hereinafter referred to as the "camera coordinate system"), or an imaging area specified by a real-world azimuth and elevation angle.
- the specific area information generation unit 14 sets a set of coordinates indicating the specific area corresponding to the specific imaging area in the composite image as the area setting information.
- When the specific imaging area is an area captured by the imaging unit designated by the identification information, or an imaging area specified by the apex angle and azimuth angle of the camera coordinate system, the specific area information generation unit 14 converts the imaging area into two-dimensional coordinates of the specific area in the composite image using the composite setting information.
- When the specific imaging area is specified by a real-world azimuth and elevation angle, the specific area information generation unit 14 acquires imaging information from the imaging information generation unit 12 (path not shown) and, using the acquired imaging information, converts the imaging area specified by the azimuth and elevation angle into an imaging area of the camera coordinate system. The specific area information generation unit 14 then converts the converted imaging area of the camera coordinate system into two-dimensional coordinates of the specific area in the composite image using the composite setting information.
- the specific area information generation unit 14 generates application range information indicating the application range of the image processing for the specific area indicated by the area setting information.
- The specific area information generation unit 14 outputs the specific area information, which includes the area setting information and the application range information, to the captured image composition unit 21 or the output unit 23.
- the specific area information is meta information related to a captured image acquired by the captured image acquisition unit 11 or a composite image (described later) generated by the captured image composition unit 21.
- Based on the image data of the captured images output from the captured image acquisition unit 11 and the composite setting information output from the composite setting information generation unit 13, the captured image composition unit 21 stitches together the captured images acquired by the imaging units 11-1 to 11-n to generate a composite image on a two-dimensional plane.
- The captured image composition unit 21 outputs the image data of the generated composite image to the encoding processing unit 22.
- Also, when specific area information is supplied from the specific area information generation unit 14, the captured image composition unit 21 performs image processing on the specific area of the generated composite image based on the specific area information.
- The captured image composition unit 21 outputs the image data of the composite image that has undergone the image processing of the specific area to the encoding processing unit 22.
- The encoding processing unit 22 performs image encoding processing to generate encoded data with a compressed data amount. For example, when the composite image is a moving image, the encoding processing unit 22 generates encoded data in accordance with a standard such as H.265 (ISO/IEC 23008-2 HEVC) or H.264/AVC. The encoding processing unit 22 outputs the generated encoded data to the output unit 23.
- the output unit 23 outputs the encoded data of the composite image and the meta information in association with each other.
- For example, the output unit 23 multiplexes the imaging information generated by the imaging information generation unit 12 and the specific area information generated by the specific area information generation unit 14 with the encoded data to generate multiplexed data.
- When the encoded data is of a composite image that has already undergone image processing of the specific area, the output unit 23 may generate multiplexed data by multiplexing, for example, only the imaging information generated by the imaging information generation unit 12 with the encoded data.
- The output unit 23 outputs the generated multiplexed data, or the encoded data of the composite image in which the image processing has been performed on the specific area, to an external device or the like via a transmission path.
- The storage unit 31 may store the image data of the captured images acquired by the captured image acquisition unit 11, the image data of the composite image generated by the captured image composition unit 21, or the encoded data generated by the encoding processing unit 22.
- The storage unit 31 may also store meta information, such as the imaging information generated by the imaging information generation unit 12, the composite setting information generated by the composite setting information generation unit 13, and the specific area information generated by the specific area information generation unit 14, in association with the image data of the captured images or of the composite image. Note that "association" includes adding meta information to the image data, recording the imaging information and the like in the same file as the image data, and recording the image data and the imaging information and the like on the same recording medium.
- Association also includes the case where image data and imaging information are multiplexed as described above. Further, by associating image data or encoded data with meta information and storing them in the storage unit 31, the image data or encoded data and the meta information can be output in association with each other via the storage unit 31.
- For example, the storage unit 31 stores, in association with each other, the image data of the captured images acquired by the captured image acquisition unit 11, the imaging information generated by the imaging information generation unit 12, the composite setting information generated by the composite setting information generation unit 13, and the specific area information generated by the specific area information generation unit 14.
- An external device can then read the information associated with the image data from the storage unit 31 to generate a composite image, or to perform image processing or the like on the specific area of a composite image. For this reason, the image processing of the specific area in the composite image can be performed efficiently by distributing the processing between the imaging device and the external device according to their processing capabilities.
- It is also possible to store the image data of the captured images, the composite setting information, and the like in the storage unit 31, and later read them out from the storage unit 31 so that the captured image composition unit 21 generates the composite image. In this case, the imaging operation can be prioritized even when compositing the captured images takes time.
- Storing the composite image generated by the captured image composition unit 21 reduces the amount of stored data compared with storing the captured images acquired by the captured image acquisition unit 11. Storing the encoded data generated by the encoding processing unit 22 reduces the amount of data further. In addition, if a decoding processing unit and a display unit are provided in the imaging device 10, the content of the encoded data can be checked.
- The imaging apparatus 10 performs imaging processing and composition/specific area processing.
- FIG. 2 is a flowchart showing the imaging process.
- The imaging units 11-1 to 11-n of the captured image acquisition unit 11 and the information generation processing unit 124 of the imaging information generation unit 12 share the time code generated by the time code generation unit 121, and acquisition of captured images and generation of imaging information are performed using it.
- In step ST1, the imaging device 10 generates composite setting information.
- The composite setting information generation unit 13 of the imaging apparatus 10 generates composite setting information for stitching together the captured images acquired by the imaging units 11-1 to 11-n so that they are continuous without image shift.
- The composite setting information generation unit 13 uses the physical layout information of the imaging units 11-1 to 11-n in generating the composite setting information.
- The physical layout information is information from which the imaging directions of the imaging units 11-1 to 11-n can be determined, and is input by the user, for example, according to the physical layout. When the imaging units are in a predetermined positional relationship, for example when they are attached to a mounting jig (camera rig or the like), the physical layout information may be generated automatically based on that positional relationship. Alternatively, physical layout information corresponding to the physical layout of the imaging units 11-1 to 11-n may be stored in advance in the captured image acquisition unit 11.
- The composite setting information generation unit 13 also uses the lens information of the imaging units 11-1 to 11-n in generating the composite setting information.
- The lens information is information from which the imaging areas of the imaging units 11-1 to 11-n can be determined, for example information such as the focal length.
- The lens information may be input by the user according to the lens state used in the imaging units 11-1 to 11-n, or may be acquired from the imaging units 11-1 to 11-n.
- As described above, the physical layout information indicates the imaging directions of the imaging units 11-1 to 11-n, and the lens information indicates their imaging areas. Therefore, by using the physical layout information and the lens information, the composite setting information generation unit 13 can determine, for example, which region of the omnidirectional sphere is captured by which imaging unit.
- The composite setting information generation unit 13 also uses information on the projection method for projecting a wide-angle image such as an omnidirectional image onto a two-dimensional plane in generating the composite setting information.
- For example, the projection formula used when converting an omnidirectional image into a two-dimensional planar image with an equirectangular projection, an equal-area cylindrical projection, or the like serves as the information on the projection method.
- The composite setting information generation unit 13 also uses the image size information in generating the composite setting information, setting the image size of the two-dimensional plane according to it.
- The information on the projection method indicates the correspondence between positions on the omnidirectional image and positions on the two-dimensional plane image, and the image size information indicates the image size of the two-dimensional plane image. Therefore, the correspondence between positions on the omnidirectional image and positions on a two-dimensional plane image of the size indicated by the image size information is determined from the information on the projection method and the image size information.
- In this way, the composite setting information generation unit 13 determines, for each pixel of each captured image acquired by the imaging units 11-1 to 11-n (each corresponding to the image of a partial region of the omnidirectional sphere), the corresponding pixel in the composite image, which is a two-dimensional plane of the image size indicated by the image size information, and generates composite setting information indicating the correspondence between the pixels of the captured images and those of the composite image. When a composite image is generated using the composite setting information, pixels that show the same position on the omnidirectional sphere in multiple captured images are mapped to the position on the composite image corresponding to that position on the sphere. Therefore, if the imaging ranges are set so that the desired imaging range is covered by some imaging unit, a composite image of the desired imaging range in which the captured images are continuous without image shift can be generated.
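- As a rough illustration (not part of the patent), the sketch below builds such a per-pixel correspondence for an equirectangular two-dimensional plane. The `cameras` list and its `project` method are hypothetical stand-ins for whatever the physical layout information and lens information determine.

```python
import numpy as np

def equirect_pixel_to_direction(u, v, width, height):
    """Map pixel (u, v) of a width x height equirectangular plane to a unit
    direction vector on the omnidirectional sphere."""
    lon = (u / width) * 2.0 * np.pi - np.pi       # azimuth: -pi .. +pi
    lat = np.pi / 2.0 - (v / height) * np.pi      # elevation: +pi/2 .. -pi/2
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def build_composite_setting_info(cameras, width, height):
    """For every composite-image pixel, record which imaging unit and which
    source pixel it corresponds to. Each camera exposes a hypothetical
    project(direction) -> (x, y) or None method derived from the physical
    layout and lens information."""
    mapping = np.full((height, width, 3), -1, dtype=np.int32)  # (cam, x, y)
    for v in range(height):
        for u in range(width):
            d = equirect_pixel_to_direction(u, v, width, height)
            for idx, cam in enumerate(cameras):
                hit = cam.project(d)  # source pixel if d is inside this unit's field of view
                if hit is not None:
                    mapping[v, u] = (idx, hit[0], hit[1])
                    break
            # pixels left at -1 are not covered by any imaging unit
    return mapping
```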
- FIG. 3 exemplifies a correspondence relationship between an imaging region and a region on a two-dimensional plane when six imaging units are used.
- Four of the imaging units are arranged with their imaging directions horizontal, at 90° intervals from each other.
- Of the remaining two imaging units, one is arranged with its imaging direction toward the zenith and the other toward the nadir.
- The imaging units capture the horizontal regions Pa, Pb, Pc, and Pd, the celestial region Pe, and the ground region Pf (not shown) illustrated in FIG. 3A. These regions are apparent from the physical layout information and the lens information.
- From the information on the projection method and the image size information, the correspondence between the regions Pa, Pb, Pc, Pd, Pe, and Pf and the regions on the two-dimensional plane of the image size indicated by the image size information becomes clear.
- For example, the region Ga in the two-dimensional plane (composite image) GP of the image size indicated by the image size information corresponds to the region Pa, and the regions Gb, Gc, Gd, Ge, and Gf correspond to the regions Pb, Pc, Pd, Pe, and Pf, respectively.
- The reference imaging direction corresponds to the center position of the two-dimensional plane.
- In this way, the composite setting information generation unit 13 generates composite setting information for stitching together the captured images acquired by the plurality of imaging units so that they are continuous without image shift, and the process proceeds to step ST2.
- In step ST2, the imaging device 10 performs imaging.
- image data of the captured image is generated by the imaging units 11-1 to 11-n.
- the captured image acquisition unit 11 adds the time code supplied from the time code generation unit 121 of the imaging information generation unit 12 to the generated image data.
- the captured image acquisition unit 11 generates image data to which a time code is added, and proceeds to step ST3.
- In step ST3, the imaging apparatus 10 generates imaging information.
- the imaging information generation unit 12 of the imaging apparatus 10 generates imaging information that is meta information related to the captured image.
- the imaging information includes common data that does not depend on time and time-series data that depends on time.
- the imaging information generation unit 12 generates data indicating the imaging start time using time information output from the clock unit 122, for example, as common data independent of time.
- the time-series data depending on time is data generated at predetermined time intervals.
- The time-series data uses the time code generated during imaging by the time code generation unit 121, the imaging position information indicating the latitude, longitude, and altitude of the imaging device 10 (captured image acquisition unit 11) generated by the sensor unit 123, and posture information indicating the azimuth and elevation angle that the imaging units 11-1 to 11-n are facing. Note that posture information need not be obtained for each of the imaging units 11-1 to 11-n; if the posture information of any one reference imaging unit is obtained, the posture information of the other imaging units can be calculated based on the physical layout information.
- FIG. 4 illustrates time series data of imaging information.
- The time-series data includes the time code generated by the time code generation unit 121, the information indicating the latitude, longitude, and altitude of the imaging device 10 generated by the sensor unit 123, and the posture information (for example, azimuth and elevation angle) of one imaging unit specified in advance.
- An abbreviation indicating the direction is also included.
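- As a minimal sketch (the field names are assumptions for illustration, not the patent's format), the imaging information with its common data and time-series data could be structured as follows.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CommonData:                  # time-independent part of the imaging information
    imaging_start_time: str        # e.g. generated from the clock unit at the start of imaging

@dataclass
class TimeSeriesRecord:            # one record per predetermined time interval
    time_code: str                 # time code shared with the captured image data
    latitude: float                # imaging position from the sensor unit
    longitude: float
    altitude_m: float
    azimuth_deg: float             # posture of the reference imaging unit
    elevation_deg: float

@dataclass
class ImagingInformation:          # meta information related to the images
    common: CommonData
    time_series: List[TimeSeriesRecord]
```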
- After generating the imaging information, if the imaging apparatus 10 determines that there is no user operation or external instruction indicating the end of imaging (hereinafter referred to as "user operation or the like"), it returns to step ST2; if it determines that there is an instruction indicating the end of imaging, the imaging process is terminated.
- FIG. 5 is a flowchart showing captured image composition / specific area processing.
- It is assumed that the imaging process shown in FIG. 2 has been performed, so that the image data of the captured images has been generated by the imaging units 11-1 to 11-n of the captured image acquisition unit 11, the imaging information by the imaging information generation unit 12, and the composite setting information by the composite setting information generation unit 13.
- In step ST11, the imaging apparatus 10 sets a specific imaging area and an application range.
- the specific area information generation unit 14 of the imaging device 10 sets, for example, an imaging area that includes an unintended subject as the specific imaging area.
- the specific area information generation unit 14 sets the imaging area of the imaging unit specified by the user operation or the like as the specific imaging area.
- the specific area information generation unit 14 sets the imaging region of the specified imaging unit as the specific imaging region when the imaging unit is specified by a user operation or the like using the identification information assigned to the imaging unit. For example, when the identification information of the imaging unit specified by the user operation or the like indicates the imaging unit that images the sky in FIG. 3, the region Pe is set as the specific imaging region.
- Alternatively, the specific area information generation unit 14 sets the area enclosed by specified apex angles and azimuth angles as the specific imaging area. For example, when the coordinates (apex angle: 45°, azimuth angle: 45°), (apex angle: 90°, azimuth angle: 45°), (apex angle: 45°, azimuth angle: 90°), and (apex angle: 90°, azimuth angle: 90°) are specified in the coordinate system based on the imaging position, the area enclosed by the specified coordinates is set as the specific imaging area.
- FIG. 6 is a diagram for explaining the setting of the specific imaging region.
- For example, the azimuth range specified by the user or the like in accordance with an azimuth setting instruction, and the elevation angle range specified in accordance with an elevation angle setting instruction, are set as the specific imaging region.
- FIG. 7 exemplifies a specific imaging region when the range of the azimuth and the elevation angle is specified.
- a region indicated by oblique lines with the imaging position as the center is the specific imaging region.
- FIG. 8 illustrates the specific imaging region when coordinates are specified. For example, when the coordinates (azimuth: 0°, elevation angle: 45°), (azimuth: 0°, elevation angle: 0°), (azimuth: 90°, elevation angle: 0°), and (azimuth: 90°, elevation angle: 45°) are specified, the area indicated by oblique lines, with the imaging position as the center, is the specific imaging area.
- the specific area information generation unit 14 sets an application range.
- the application range in which image processing is performed on an image in a specific area corresponding to the specific imaging area in the composite image is specified using, for example, time information or an imaging position.
- the specific area information generation unit 14 specifies a time for performing image processing on an image in the specific area based on a user operation or the like.
- the time may be specified using an absolute time such as UTC, or may be specified using a relative time such as an elapsed time from the start of imaging.
- the specification of the time may specify either the start time or the end time of the application range, or may specify the start time and the end time.
- the specific area information generation unit 14 specifies the imaging position of the application range for performing image processing on the image of the specific area based on a user operation or the like.
- For the imaging position, one or more of latitude, longitude, and altitude are specified. For example, by specifying the three coordinates (latitude: N35, longitude: E139), (latitude: N35, longitude: E140), and (latitude: N40, longitude: E139), the area enclosed by the three coordinates is set as the application range.
- For the altitude, a range such as 30 m or less is specified, for example, and the specified altitude range is set as the application range.
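- One way such an application range check could be implemented is sketched below; the `ApplicationRange` structure and all of its field names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Sequence, Tuple

@dataclass
class ApplicationRange:                        # hypothetical representation
    start_time: Optional[float] = None         # e.g. seconds from the start of imaging
    end_time: Optional[float] = None
    area: Optional[Sequence[Tuple[float, float]]] = None  # (lat, lon) polygon corners
    max_altitude_m: Optional[float] = None     # e.g. "30 m or less"

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test for whether a (lat, lon) point lies inside a polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        y1, x1 = polygon[i]
        y2, x2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def range_applies(r: ApplicationRange, t, lat, lon, alt):
    """Decide whether image processing of the specific area applies at the
    current time and imaging position."""
    if r.start_time is not None and t < r.start_time:
        return False
    if r.end_time is not None and t > r.end_time:
        return False
    if r.area is not None and not point_in_polygon(lat, lon, r.area):
        return False
    if r.max_altitude_m is not None and alt > r.max_altitude_m:
        return False
    return True
```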
- the imaging apparatus 10 sets the specific imaging area and the application range as described above, proceeds to step ST12 in FIG. 5, and generates specific area information.
- the specific area information generation unit 14 of the imaging device 10 generates specific area information based on the setting of the specific imaging area and the application range.
- The specific area information includes the area setting information and the application range information.
- the area setting information is information indicating a specific area in the composite image
- the application range information is information indicating an application range of image processing for the specific area indicated by the area setting information.
- the specific area information generation unit 14 generates, as area setting information, a set of coordinates indicating a specific area corresponding to the specific imaging area in the combined image obtained by connecting the captured images.
- When the specific imaging area is designated by identification information, the specific area information generation unit 14 calculates, based on the composite setting information generated by the composite setting information generation unit 13, the coordinate group in the composite image of the captured image acquired by the imaging unit corresponding to that identification information, and sets information indicating the calculated coordinate group as the area setting information.
- When the specific imaging area is specified by an apex angle and an azimuth angle, the specific area information generation unit 14 calculates, based on the composite setting information generated by the composite setting information generation unit 13, the coordinate group in the composite image corresponding to the specified specific imaging area, and sets information indicating the calculated coordinate group as the area setting information.
- When the specific imaging area is set by an azimuth and an elevation angle, the specific area information generation unit 14 correlates the captured image acquired by each imaging unit with the composite image based on the composite setting information generated by the composite setting information generation unit 13. In addition, the specific area information generation unit 14 determines, for each time code, which azimuth and elevation angle each imaging unit is facing from the imaging information generated by the imaging information generation unit 12. Further, the specific area information generation unit 14 determines which area of which captured image the specific imaging area corresponds to, based on the determined orientation of the imaging units.
- Then, based on the correspondence between the captured images and the composite image and the correspondence between the specific imaging area and the captured images, the specific area information generation unit 14 calculates the coordinate group in the composite image corresponding to the specific imaging area specified by the azimuth and elevation angle, and sets information indicating the calculated coordinate group as the area setting information.
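- For an equirectangular composite image, this conversion can be illustrated with the sketch below. It assumes the azimuth has already been corrected for the imaging unit's posture, so that azimuth 0 maps to the horizontal center of the plane; this simplification is not from the patent.

```python
def az_el_to_composite_xy(azimuth_deg, elevation_deg, width, height):
    """Map a real-world (azimuth, elevation) direction to pixel coordinates of
    an equirectangular composite image whose horizontal center corresponds to
    azimuth 0."""
    x = ((azimuth_deg + 180.0) % 360.0) / 360.0 * width
    y = (90.0 - elevation_deg) / 180.0 * height
    return int(round(x)) % width, min(int(round(y)), height - 1)

def region_coordinate_group(corners_az_el, width, height):
    """Convert the corners of a specific imaging area, given as
    (azimuth, elevation) pairs, into the coordinate group of the
    specific area in the composite image."""
    return [az_el_to_composite_xy(az, el, width, height)
            for az, el in corners_az_el]

# e.g. the region of FIG. 8: azimuth 0-90 degrees, elevation 0-45 degrees
corners = [(0, 45), (0, 0), (90, 0), (90, 45)]
print(region_coordinate_group(corners, width=3840, height=1920))
```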
- the specific area information generation unit 14 generates specific area information including area setting information indicating a specific area corresponding to the specific imaging area in the composite image and application range information indicating setting of the application range.
- FIG. 9 illustrates specific area information.
- In the specific area information, for example, the latitude, longitude, and altitude at which image processing of the specific area is to be performed are indicated as application range information for each time code.
- As the area setting information, the coordinate group of the specific area corresponding to the specific imaging area in the composite image is indicated for each time code.
- FIG. 9 illustrates a case where two specific areas AR1 and AR2 are provided, and coordinate groups are calculated for the two specific areas.
- As the coordinate group information, the coordinate values (horizontal pixel position, vertical pixel position) of the four corners of the area covered by each coordinate group are shown.
- Note that the application range information may include the start time and the end time as described above.
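- A record along the lines of FIG. 9 might be sketched as follows; the names and example values are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Corner = Tuple[int, int]            # (horizontal pixel position, vertical pixel position)

@dataclass
class SpecificAreaRecord:           # one record per time code (hypothetical names)
    time_code: str
    latitude: float                 # application range: imaging position at which
    longitude: float                # image processing of the specific area applies
    altitude_m: float
    areas: Dict[str, List[Corner]]  # four-corner coordinate group per specific area

record = SpecificAreaRecord(
    time_code="01:23:45:10",
    latitude=35.0, longitude=139.0, altitude_m=20.0,
    areas={"AR1": [(100, 50), (400, 50), (400, 300), (100, 300)],
           "AR2": [(900, 600), (1100, 600), (1100, 700), (900, 700)]},
)
```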
- the imaging device 10 generates specific area information as described above, proceeds to step ST13 in FIG. 5, and performs a captured image composition process.
- The captured image composition unit 21 of the imaging apparatus 10 performs stitching processing that joins the captured images acquired by the imaging units 11-1 to 11-n of the captured image acquisition unit 11, based on the composite setting information generated by the composite setting information generation unit 13.
- In the stitching processing, captured images having the same time code are stitched together to generate a single composite image.
- the imaging device 10 generates a composite image and proceeds to step ST14.
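- A minimal sketch of one such stitching step, reusing the per-pixel mapping from the earlier composite-setting sketch, might look as follows.

```python
import numpy as np

def stitch_frame(frames, mapping, width, height):
    """Generate one composite frame from captured images sharing a time code,
    using the per-pixel mapping built as composite setting information
    (mapping[v, u] = (cam, x, y); see the earlier sketch).

    frames: list of per-imaging-unit images as H x W x 3 uint8 arrays.
    """
    composite = np.zeros((height, width, 3), dtype=np.uint8)
    for v in range(height):
        for u in range(width):
            cam, x, y = mapping[v, u]
            if cam >= 0:                      # -1 marks pixels no unit covers
                composite[v, u] = frames[cam][y, x]
    return composite
```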
- In step ST14, the imaging apparatus 10 determines whether a specific area is set.
- The captured image composition unit 21 determines, based on the specific area information, whether a specific area is set for the composite image.
- If a specific area is set, the captured image composition unit 21 proceeds to step ST15.
- If no specific area is set, the captured image composition unit 21 proceeds to step ST16.
- In step ST15, the imaging apparatus 10 performs image processing on the specific area.
- The captured image composition unit 21 invalidates the image of the specific area, for example by deleting the image data of the specific area or replacing it with invalid image data, and proceeds to step ST16.
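- As an illustration, invalidating a rectangular specific area could look like the sketch below; treating the four-corner coordinate group as an axis-aligned rectangle is a simplifying assumption.

```python
import numpy as np

def invalidate_specific_area(composite, corners, fill=(0, 0, 0)):
    """Invalidate the specific area of a composite frame by overwriting it
    with invalid image data (here a solid fill). `corners` is the four-corner
    coordinate group from the area setting information."""
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    composite[y0:y1 + 1, x0:x1 + 1] = fill
    return composite
```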
- In step ST16, the imaging apparatus 10 performs encoding processing.
- The encoding processing unit 22 of the imaging apparatus 10 reduces the amount of data by performing encoding processing on the composite image generated by the captured image composition unit 21, or on the composite image that has undergone the image processing of the specific area, and the process returns to step ST13.
- the imaging apparatus 10 repeats the processing from step ST13 to step ST16 until the end of imaging.
- the processing from step ST13 to step ST16 is performed for each frame.
- The processing from step ST13 to step ST16 need not be performed strictly in step order; it may be performed by pipeline processing or parallel processing.
- the encoded data after the encoding process may be output to an external device.
- As described above, with the information processing apparatus of the present technology, it is possible to generate specific area information indicating the specific area corresponding to the specific imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units. Therefore, by using the generated specific area information, the image of the specific area in the composite image can easily be processed. For example, when the information processing apparatus of this technology is applied to an imaging apparatus, a composite image in which the image of an unintended subject is invalidated can be output from the imaging apparatus.
- Since the specific imaging area can be set based on the identification information of an imaging unit, the apex angle and azimuth angle, the azimuth and elevation angle, and the like, an imaging area containing an unintended subject can easily be set as the specific imaging area.
- Since application range information is included in the specific area information, it is possible to set the time range in which image processing of the specific area is performed, allowing processing with a high degree of freedom.
- Further, since the imaging information associated with the composite image includes imaging position information covering latitude, longitude, and altitude, and the application range information can include imaging position information, it is possible to control image processing of the specific area according to the imaging position. For example, when the imaging position is low, a subject to be invalidated may be hidden behind a building or the like and not appear in the captured image, whereas when the imaging position is high, the subject to be invalidated may appear in the captured image.
- the image processing of the specific region is not limited to being performed by the imaging apparatus, but may be performed by an external device.
- specific area information or the like is associated with image data of a composite image that has not been subjected to image processing of the specific area, or encoded data generated by encoding the image data, and is output to an external device.
- For example, the image data or encoded data may be output with the specific area information or the like multiplexed into it.
- Alternatively, the image data or encoded data and the specific area information or the like may be stored in the removable storage unit 31 and output via the storage unit 31.
- Next, an image reproduction apparatus will be described, taking as an example the case where a reproduction operation is performed using the encoded data of a composite image and specific area information.
- Here, the specific area information includes texture generation information for generating a texture; the texture generation information is generated in advance, for example, by the provider of the composite image.
- FIG. 10 illustrates the first configuration of the image reproducing device.
- the image reproduction device 50 includes an image reading unit 51, a decoding processing unit 52, a meta information reading unit 53, a texture generation unit 54, a texture storage unit 55, and a synthesis processing unit 56.
- The image reading unit 51 reads encoded data, for example from multiplexed data or a storage unit, and outputs the read encoded data to the decoding processing unit 52.
- the decoding processing unit 52 performs decoding processing of the encoded data read by the image reading unit 51, and generates image data of a composite image.
- The decoding processing unit 52 outputs the generated image data to the composition processing unit 56.
- the meta information reading unit 53 reads meta information.
- the meta information reading unit 53 reads meta information associated with the encoded data from the multiplexed data, the storage unit, and the like, and outputs the meta information to the texture generation unit 54.
- the texture generation unit 54 generates a texture.
- the texture generation unit 54 generates a texture based on the texture generation information included in the specific area information read by the meta information reading unit 53, for example.
- the texture generation unit 54 stores the generated texture in the texture storage unit 55.
- the composition processing unit 56 is a reproduction control unit that performs reproduction control on a specific area of the composite image.
- The composition processing unit 56 reads the texture corresponding to the composite image from the texture storage unit 55 and composites the read texture over the specific area in the composite image of the image data generated by the decoding processing unit 52, thereby controlling reproduction so that the image of the specific area is not reproduced as it is.
- the composition processing unit 56 outputs image data of a composite image that has been subjected to reproduction control for the specific area.
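- A sketch of this reproduction control is shown below; the time-code-keyed texture lookup and the rectangular specific area are illustrative assumptions.

```python
def reproduce_frame(composite, time_code, textures, area_corners):
    """If a texture with the same time code as the composite frame is stored,
    composite it over the specific area so the original image there is not
    reproduced as is. `textures` maps time codes to texture images that are
    assumed to cover the specific area."""
    texture = textures.get(time_code)
    if texture is None:
        return composite                 # no corresponding texture: output as is
    xs = [c[0] for c in area_corners]
    ys = [c[1] for c in area_corners]
    x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
    composite[y0:y1 + 1, x0:x1 + 1] = texture[:y1 - y0 + 1, :x1 - x0 + 1]
    return composite
```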
- FIG. 11 is a flowchart showing the operation of the first configuration of the image reproduction apparatus. Note that the following description will be given on the assumption that the image reproduction device 50 can acquire the encoded data of the composite image and the specific area information.
- In step ST21, the image reproduction device 50 reads the encoded data.
- the image reading unit 51 of the image reproducing device 50 reads the encoded data from the multiplexed data, the storage unit, etc., and proceeds to step ST22.
- In step ST22, the image reproduction device 50 performs decoding processing.
- the decoding processing unit 52 of the image reproducing device 50 performs decoding processing of the encoded data read in step ST21, generates image data of a composite image, and proceeds to step ST25.
- In step ST23, the image reproduction device 50 reads the meta information.
- the meta information reading unit 53 of the image reproducing device 50 reads meta information related to the encoded data from the multiplexed data, the storage unit, etc., and proceeds to step ST24.
- In step ST24, the image reproduction device 50 generates a texture.
- the texture generation unit 54 of the image reproduction device 50 generates a texture based on the texture generation information included in the meta information read in step ST23.
- the texture generation unit 54 stores the generated texture in the texture storage unit 55 and proceeds to step ST25.
- In step ST25, the image reproduction device 50 determines whether there is a corresponding texture.
- The composition processing unit 56 of the image reproduction device 50 determines whether a texture corresponding to the composite image generated in step ST22 is stored in the texture storage unit 55. If a texture having the same time code as the generated composite image is stored in the texture storage unit 55, the composition processing unit 56 determines that there is a corresponding texture and proceeds to step ST26. If no texture with the same time code is stored in the texture storage unit 55, the composition processing unit 56 determines that there is no corresponding texture and proceeds to step ST27.
- In step ST26, the image reproduction device 50 performs image composition.
- The composition processing unit 56 of the image reproduction device 50 composites the texture having the same time code as the composite image onto the image of the specific area, and proceeds to step ST27.
- In step ST27, the image reproduction device 50 performs image output processing.
- The composition processing unit 56 of the image reproduction device 50 outputs, to a display device or the like, the image data of the composite image with the texture composited in the specific area or, if there is no corresponding texture, the image data of the composite image as is.
- In the above description, texture generation information for generating a texture is included in the specific area information, but texture generation information for generating various textures may instead be included in the imaging information.
- In this case, the imaging information and the specific area information are output to the image reproduction device 50 together, and the specific area information includes information specifying the texture to be composited, so that a texture selected from the various textures can be composited onto the image of the specific area in the composite image.
- FIG. 12 illustrates the second configuration of the image playback device.
- the image reproduction device 50 a includes an image reading unit 51, a decoding processing unit 52, a meta information reading unit 53, a meta information storage unit 57, and an output control unit 58.
- the image reading unit 51 reads encoded data.
- the image reading unit 51 reads encoded data from multiplexed data, a storage unit, and the like, and outputs the acquired encoded data to the decoding processing unit 52.
- the decoding processing unit 52 performs decoding processing of the encoded data read by the image reading unit 51, and generates image data of a composite image.
- The decoding processing unit 52 outputs the generated image data to the output control unit 58.
- the meta information reading unit 53 reads meta information.
- the meta information reading unit 53 reads the meta information corresponding to the encoded data from the multiplexed data, the storage unit, etc., and outputs it to the meta information storage unit 57.
- the meta information storage unit 57 stores the meta information read by the meta information reading unit 53.
- the output control unit 58 is a reproduction control unit that performs reproduction control on a specific area of the composite image.
- When outputting the image data generated by the decoding processing unit 52, the output control unit 58 determines whether each pixel of the image data to be output is a pixel in the specific area, based on the specific area information stored in the meta information storage unit 57.
- the output control unit 58 outputs the image data generated by the decoding processing unit 52 when the pixel of the image data to be output is not a pixel in the specific area.
- When the pixel to be output is a pixel in the specific area, the output control unit 58 stops outputting the pixel data generated by the decoding processing unit 52, or replaces it with preset pixel data, so that the image of the specific area is not reproduced.
- the output control unit 58 outputs image data of a composite image that has been subjected to reproduction control for the specific area.
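- The per-pixel gating described here could be sketched as follows, again with a rectangular specific area assumed for simplicity.

```python
def output_pixel(decoded, u, v, area_corners, replacement=(128, 128, 128)):
    """Gate each output pixel on whether it lies inside the specific area.
    Outside the area the decoded pixel is output as is; inside it, the pixel
    is replaced with preset data (pass replacement=None to model stopping
    the output instead)."""
    xs = [c[0] for c in area_corners]
    ys = [c[1] for c in area_corners]
    if not (min(xs) <= u <= max(xs) and min(ys) <= v <= max(ys)):
        return decoded[v, u]              # outside the specific area
    return replacement                    # inside the specific area
```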
- FIG. 13 is a flowchart showing the operation of the second configuration of the image reproduction apparatus. Note that the following description will be given on the assumption that the image reproduction device 50a can acquire the encoded data of the composite image and the specific area information.
- in step ST31, the image reproduction device 50a reads meta information.
- the meta information reading unit 53 of the image reproduction device 50a reads the meta information related to the encoded data from the multiplexed data, the storage unit, or the like, stores the read meta information in the meta information storage unit 57, and proceeds to step ST32.
- in step ST32, the image reproduction device 50a reads the encoded data.
- the image reading unit 51 of the image reproduction device 50a reads the encoded data from the multiplexed data, the storage unit, or the like, and proceeds to step ST33.
- in step ST33, the image reproduction device 50a performs decoding processing.
- the decoding processing unit 52 of the image reproduction device 50a decodes the encoded data read in step ST32, generates image data of the composite image, and proceeds to step ST34.
- in step ST34, the image reproduction device 50a determines whether the output pixel belongs to the specific area.
- the output control unit 58 of the image reproduction device 50a determines whether the pixel of the output image data is a pixel in the specific area, based on the specific area information that is stored in the meta information storage unit 57 and has the same time code as the output image data, and proceeds to step ST35.
- in step ST35, the image reproduction device 50a performs output control.
- when the output pixel is not in the specific area, the output control unit 58 of the image reproduction device 50a outputs the image data generated by the decoding process as-is.
- when the output pixel is in the specific area, the output control unit 58 stops outputting the image data, or replaces the image data generated by the decoding processing unit 52 with preset image data before outputting it.
- in this way, image data of the composite image in which output control has been performed on the image of the specific area can be output from the image reproduction device.
- the image reproduction device can perform reproduction control of the image of the specific area in the composite image based on the specific area information.
- the specific area information generation unit 14 described above sets, as the specific area, the area of the composite image corresponding to the specific imaging area set based on the identification information of the imaging unit or the like; however, the specific area information may instead be generated for each restriction level applied when the composite image is reproduced.
- for example, the provider of the composite image causes the specific area information generation unit 14 to generate specific area information according to a billing contract with the user of the composite image.
- the specific area information for free users invalidates "AR1 percent" of the composite image as the specific area.
- the specific area information for free users may include information for displaying, in the invalidated specific area, textures such as guidance information recommending a paid contract, guidance information for other image content, and advertisement information.
- the specific area information for paying general users invalidates "AR2 (< AR1) percent" of the composite image as the specific area.
- the specific area information for paying general users may include information to be displayed in the invalidated specific area, such as guidance information recommending a premium contract, guidance information for other image content, and advertisement information.
- the specific area information for paying premium users does not set any specific area.
- when a composite image is generated using a plurality of content images acquired by a plurality of imaging units together with display information related to the content, specific area information may be generated that specifies the display information to be invalidated according to the billing contract with the user of the composite image.
- by selectively using the generated specific area information in this way, the reproduction of the composite image can be restricted according to the selected specific area information, as sketched below.
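A hypothetical sketch of per-restriction-level specific area information selected by billing contract; the ratios AR1/AR2 and the overlay labels are placeholders, not values from the embodiment:

```python
# Each entry plays the role of one pre-generated piece of specific area information.
SPECIFIC_AREA_BY_CONTRACT = {
    "free":    {"invalid_ratio": "AR1", "overlay": "paid-contract guidance / ads"},
    "general": {"invalid_ratio": "AR2", "overlay": "premium-contract guidance / ads"},
    "premium": {"invalid_ratio": None,  "overlay": None},  # no specific area set
}

def select_specific_area_info(contract: str) -> dict:
    """Selectively use the specific area information matching the user's contract."""
    return SPECIFIC_AREA_BY_CONTRACT[contract]
```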
- the imaging information and the specific area information may also be generated based on a captured object and its position, text data derived from audio data, command information, and the like.
- the imaging information generation unit 12 includes each captured object and its position information on the captured image in the time-series data of the imaging information. Further, the captured objects may be classified by type (for example, person, building, animal, plant, symbol, etc.), and the classification result may be included in the time-series data. Note that the process of classifying objects by type may be performed automatically using image recognition technology, or the user may specify the type of each object.
- FIG. 14 illustrates time-series data of imaging information.
- the time series data is generated for each time code, for example.
- it is assumed that captured images are acquired using the imaging units 11-1 to 11-5.
- the coordinate pairs (x, y) and (x', y') indicate diagonal corners of the smallest rectangular area including the object.
- the imaging unit 11-1 has captured the person A, and the image area of the person A is indicated by the coordinates (x1, y1) and (x1', y1') on the image.
- the imaging unit 11-2 has captured the person B, and the image area of the person B is indicated by the coordinates (x2, y2) and (x2', y2') on the image.
- the imaging unit 11-3 has captured the building A, and the image area of the building A is indicated by the coordinates (x5, y5) and (x5', y5') on the image.
- the imaging unit 11-4 has captured the dog A, and the image area of the dog A is indicated by the coordinates (x8, y8) and (x8', y8') on the image.
- the specific area information generation unit 14 sets the area of the composite image including the predetermined object as the specific area. For example, when the person A captured by the imaging unit 11-1 in FIG. 14 is detected by subject recognition processing or the like, a specific area is set according to the person A as shown in FIG. 15: the rectangular area indicated by the coordinates (xg1, yg1) and (xg1', yg1') of the person A in the composite image GP is set as the specific area. A minimal sketch of this mapping follows.
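This sketch assumes `to_composite` is the coordinate transform from a camera image to the composite image GP (derived from the composition setting information), and that the entry layout mirrors the FIG. 14 time-series data; all names are illustrative:

```python
def object_to_specific_area(entry, to_composite):
    """Map an object's rectangle, e.g. person A at (x1, y1)-(x1', y1') in the
    image of imaging unit 11-1, to (xg1, yg1)-(xg1', yg1') in the composite."""
    x0, y0 = to_composite(*entry["top_left"])
    x1, y1 = to_composite(*entry["bottom_right"])
    return {"object": entry["name"], "rect": (x0, y0, x1, y1)}
```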
- the imaging information and the specific area information may be generated based on audio data.
- the imaging information generation unit 12 converts voice data indicating the voice picked up by the microphone into text data, and uses the text data as time series data of the imaging information.
- FIG. 16 exemplifies time-series data of imaging information generated based on voice data, showing the time code and the text data of the voice picked up by microphones A to C at the timing indicated by the time code.
- the specific area information generation unit 14 defines the relationship between text data and specific areas in advance, and generates specific area information based on the recognized text data.
- the text data “A ON” is an instruction to set the imaging range of the imaging unit A as the specific area.
- the text data “A OFF” is an instruction to cancel the setting of the specific area that is the imaging range of the imaging unit A.
- the text data “all on” is an instruction to set the imaging ranges of all imaging units as the specific area.
- the text data of “all off” is an instruction to cancel all the settings of the specific area.
- in this way, the specific area information generation unit 14 sets or cancels the specific area in accordance with the instruction corresponding to the text data derived from the voice picked up by the microphone, as sketched below.
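A sketch of the text-data-to-instruction mapping described above; the recognized vocabulary ("A ON", "all off", ...) follows FIG. 16, while the function and data structures are assumptions for illustration:

```python
def update_specific_areas(text: str, active: set, all_units: set) -> set:
    """Set or cancel specific areas according to recognized text data."""
    unit, _, action = text.partition(" ")
    action = action.strip().upper()
    if unit.lower() == "all":
        # "all on": every imaging unit's range becomes a specific area; "all off": cancel all
        return set(all_units) if action == "ON" else set()
    if action == "ON":
        active.add(unit)       # e.g. "A ON": imaging unit A's range becomes a specific area
    elif action == "OFF":
        active.discard(unit)   # e.g. "A OFF": cancel the specific area of imaging unit A
    return active
```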
- the imaging information and the specific area information may be generated based on command information from an external device (for example, a remote control device).
- the imaging information generation unit 12 uses command information from an external device or a remote control device as imaging information.
- the specific area information generation unit 14 defines areas corresponding to command information in advance, and generates specific area information with the area corresponding to the command information supplied from the external device as the specific area.
- the specific area information may be generated based on the subject position information and the imaging information.
- the position range information of the subject indicates the latitude range, longitude range, and altitude of the subject to be treated as the specific area of the composite image.
- the imaging information indicates the latitude, longitude, and altitude of the imaging position.
- the imaging direction of the subject with respect to the imaging position can therefore be determined from the relationship between the position of the subject serving as the specific area and the imaging position.
- furthermore, the angle-of-view range corresponding to the latitude range and longitude range of the subject can be calculated, as in the sketch below.
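A sketch of the direction computation just described: the initial compass bearing from the imaging position to the subject position; evaluating it at both ends of the subject's latitude/longitude range yields the angle-of-view range. The great-circle bearing formula is a standard geodesy result assumed here, not text from the embodiment:

```python
import math

def bearing_deg(cam_lat, cam_lon, subj_lat, subj_lon):
    """Initial bearing (degrees clockwise from north) from the imaging
    position to the subject position, both given as latitude/longitude."""
    lat1, lat2 = math.radians(cam_lat), math.radians(subj_lat)
    d_lon = math.radians(subj_lon - cam_lon)
    x = math.sin(d_lon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(d_lon)
    return math.degrees(math.atan2(x, y)) % 360.0
```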
- in the above description, the image of the specific area is invalidated as the image processing for the specific area.
- however, processing that emphasizes the specific area may be performed instead so that the specific area draws attention.
- for example, the specific area is emphasized by image processing that displays a symbol such as an arrow or a message around the specific area, or by image processing that surrounds the specific area with a frame, as in the sketch below.
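A minimal sketch of the emphasis processing; the use of OpenCV is an assumption (any drawing library would do), and the symbol and colors are placeholders:

```python
import cv2  # assumed drawing library, not named by the embodiment

def emphasize_specific_area(frame, rect, color=(0, 0, 255)):
    """Surround the specific area with a frame and place a symbol near it
    so that the area draws the viewer's attention."""
    x0, y0, x1, y1 = rect
    cv2.rectangle(frame, (x0, y0), (x1, y1), color, thickness=3)    # enclosing frame
    cv2.putText(frame, "<<", (x1 + 5, y0 + 20),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, color, thickness=2)  # arrow-like symbol
    return frame
```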
- the series of processes described in the specification can be executed by hardware, software, or a combined configuration of both.
- for example, a program in which the processing sequence is recorded can be installed in a memory of a computer incorporated in dedicated hardware and executed.
- the program can be installed and executed on a general-purpose computer capable of executing various processes.
- the program can be recorded in advance on a hard disk, SSD (Solid State Drive), or ROM (Read Only Memory) as a recording medium.
- alternatively, the program can be stored (recorded) temporarily or permanently in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a BD (Blu-Ray Disc (registered trademark)), a magnetic disk, or a semiconductor memory card. Such a removable recording medium can be provided as so-called package software.
- the program may be transferred from the download site to the computer wirelessly or by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer can receive the program transferred in this way and install it on a recording medium such as a built-in hard disk.
- the information processing apparatus of the present technology may also have the following configurations.
- (1) An information processing apparatus including a specific area information generation unit that generates specific area information indicating a specific area corresponding to a specific imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units.
- (2) The information processing apparatus according to (1), further including an imaging information generation unit that generates imaging information indicating an imaging position, an imaging direction, an imaging range, and time information of the imaging units.
- (3) The information processing apparatus according to (2), in which the specific area information generation unit includes, in the specific area information, application range information indicating an application time of image processing for the specific area or an imaging position to which the image processing is applied.
- (4) The information processing apparatus according to (2) or (3), in which the specific area information generation unit sets, as the specific imaging area, an area imaged by a specific imaging unit among the plurality of imaging units, an imaging area specified by an azimuth and an elevation angle, or an imaging area specified by an apex angle and a sitting angle.
- (5) The information processing apparatus according to any one of (2) to (4), further including an output unit that outputs the specific area information in association with the composite image.
- (6) The information processing apparatus according to any one of (1) to (5), further including: a composition setting information generation unit that generates composition setting information for stitching together the captured images acquired by the plurality of imaging units so that they are continuous without image misalignment; and a captured image combining unit that generates the composite image based on the composition setting information generated by the composition setting information generation unit.
- (7) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information for each restriction level applied when the composite image is reproduced.
- (8) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information with an area of the composite image including a captured predetermined object as the specific area.
- (9) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information based on text data generated from audio data.
- (10) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information based on acquired command information.
- (11) The information processing apparatus according to any one of (2) to (6), in which the specific area information generation unit generates the specific area information based on position information of a subject and the imaging information.
- according to this technology, specific area information indicating a specific area corresponding to a specific imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units is generated. Further, reproduction control of the composite image is performed based on the specific area information. Therefore, processing can easily be performed, based on the specific area information, on the image of the specific area corresponding to a desired imaging area in the composite image. This technology is therefore suitable for generating and reproducing panoramic images and omnidirectional images.
- DESCRIPTION OF SYMBOLS: 10: imaging apparatus; 11: captured image acquisition unit; 11-1 to 11-n: imaging units; 12: imaging information generation unit; 13: composition setting information generation unit; 14: specific area information generation unit; 21: captured image combining unit; 22: encoding processing unit; 23: output unit; 31: storage unit; 50, 50a: image reproduction device; 51: image reading unit; 52: decoding processing unit; 53: meta information reading unit; 54: texture generation unit; 55: texture storage unit; 56: composition processing unit; 57: meta information storage unit; 58: output control unit; 121: time code generation unit; 122: clock unit; 123: sensor unit; 124: information generation processing unit
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
- Image Processing (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
Description
Another aspect of the present technology is an information processing method including generating, in a specific area information generation unit, specific area information indicating a specific area of the composite image corresponding to a specific imaging area.
Another aspect of the present technology is a program that causes a computer to realize a function of generating specific area information indicating a specific area of the composite image corresponding to a specific imaging area.
Another aspect of the present technology is an imaging apparatus including:
a plurality of imaging units;
a captured image combining unit that generates a composite image by stitching together the captured images acquired by the plurality of imaging units; and
a specific area information generation unit that generates specific area information indicating a specific area of the composite image corresponding to a specific imaging area.
Another aspect of the present technology is an image reproduction apparatus including:
an image reading unit that reads a composite image generated by stitching together a plurality of captured images;
an information reading unit that reads specific area information indicating a specific area corresponding to a specific imaging area in the composite image; and
a reproduction control unit that performs reproduction control on the specific area indicated by the specific area information read by the information reading unit, in the composite image read by the image reading unit.
1. Configuration and operation of the imaging apparatus
1-1. Configuration of the imaging apparatus
1-2. Operation of the imaging apparatus
2. Configuration and operation of the image reproduction device
2-1. First configuration and operation of the image reproduction device
2-2. Second configuration and operation of the image reproduction device
3. Modifications of the imaging information and the specific area information
<1-1. Configuration of the imaging apparatus>
FIG. 1 illustrates the configuration of an imaging apparatus using the information processing device of the present technology. The imaging apparatus 10 includes a captured image acquisition unit 11, an imaging information generation unit 12, a composition setting information generation unit 13, and a specific area information generation unit 14. The imaging apparatus 10 also includes a captured image combining unit 21, an encoding processing unit 22, and an output unit 23. Furthermore, a storage unit (for example, a recording medium) 31 may be provided in the imaging apparatus 10, either fixed or removable. The case where the storage unit 31 is used will be described later.
Next, the operation of the imaging apparatus will be described. The imaging apparatus 10 performs imaging processing and composition/specific-area processing.
Incidentally, the image processing of the specific area is not limited to being performed by the imaging apparatus and may be performed by an external device. In this case, the specific area information and the like are associated with the image data of the composite image on which the image processing of the specific area has not been performed, or with the encoded data generated by encoding this image data, and are output to the external device. For the output to the external device, the specific area information and the like may be multiplexed with the image data or the encoded data; alternatively, for example, the image data or encoded data and the specific area information and the like may be stored in the removable storage unit 31 and output via the storage unit 31.
Next, a case will be described in which an image reproduction device is used as the external device and the image processing of the specific area is performed at the time of image reproduction. In the first configuration and operation of the image reproduction device, a case of combining a texture with the specific area in the composite image will be described. For the image reproduction device, a case is illustrated in which the reproduction operation is performed using, for example, the encoded data of the composite image and the specific area information. Also, for example, texture generation information for generating a texture is included in the specific area information. The texture generation information is generated in advance, for example, at the recipient of the composite image or the like.
Next, the second configuration and operation of the image reproduction device will be described. In the second configuration and operation, a case will be described in which output control of image data is performed for the specific area of the composite image. For the image reproduction device, a case is illustrated in which the reproduction operation is performed using, for example, the encoded data of the composite image and meta information such as the specific area information.
Next, a first modification of the imaging information and the specific area information will be described. The specific area information generation unit 14 described above sets, as the specific area, the area of the composite image corresponding to the specific imaging area set based on the identification information of the imaging unit or the like; however, the specific area information may also be generated for each restriction level applied when the composite image is reproduced.
(1) An information processing apparatus including a specific area information generation unit that generates specific area information indicating a specific area corresponding to a specific imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units.
(2) The information processing apparatus according to (1), further including an imaging information generation unit that generates imaging information indicating an imaging position, an imaging direction, an imaging range, and time information of the imaging units.
(3) The information processing apparatus according to (2), in which the specific area information generation unit includes, in the specific area information, application range information indicating an application time of image processing for the specific area or an imaging position to which the image processing is applied.
(4) The information processing apparatus according to (2) or (3), in which the specific area information generation unit sets, as the specific imaging area, an area imaged by a specific imaging unit among the plurality of imaging units, an imaging area specified by an azimuth and an elevation angle, or an imaging area specified by an apex angle and a sitting angle.
(5) The information processing apparatus according to any one of (2) to (4), further including an output unit that outputs the specific area information in association with the composite image.
(6) The information processing apparatus according to any one of (1) to (5), further including: a composition setting information generation unit that generates composition setting information for stitching together the captured images acquired by the plurality of imaging units so that they are continuous without image misalignment; and a captured image combining unit that generates the composite image based on the composition setting information generated by the composition setting information generation unit.
(7) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information for each restriction level applied when the composite image is reproduced.
(8) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information with an area of the composite image including a captured predetermined object as the specific area.
(9) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information based on text data generated from audio data.
(10) The information processing apparatus according to any one of (1) to (6), in which the specific area information generation unit generates the specific area information based on acquired command information.
(11) The information processing apparatus according to any one of (2) to (6), in which the specific area information generation unit generates the specific area information based on position information of a subject and the imaging information.
Claims (20)
- 1. An information processing apparatus including a specific area information generation unit that generates specific area information indicating a specific area corresponding to a specific imaging area in a composite image obtained by stitching together captured images acquired by a plurality of imaging units.
- 2. The information processing apparatus according to claim 1, further including an imaging information generation unit that generates imaging information indicating imaging position information, an imaging direction, an imaging range, and time information of the imaging units.
- 3. The information processing apparatus according to claim 2, wherein the specific area information generation unit includes, in the specific area information, application range information indicating an application time of image processing for the specific area or an imaging position to which the image processing is applied.
- 4. The information processing apparatus according to claim 2, wherein the specific area information generation unit sets, as the specific imaging area, an area imaged by a specific imaging unit among the plurality of imaging units, an imaging area specified by an azimuth and an elevation angle, or an imaging area specified by an apex angle and a sitting angle.
- 5. The information processing apparatus according to claim 2, further including an output unit that outputs the specific area information in association with the composite image.
- 6. The information processing apparatus according to claim 1, further including: a composition setting information generation unit that generates composition setting information for stitching together the captured images acquired by the plurality of imaging units so that they are continuous without image misalignment; and a captured image combining unit that generates the composite image based on the composition setting information generated by the composition setting information generation unit.
- 7. The information processing apparatus according to claim 1, wherein the specific area information generation unit generates the specific area information for each restriction level applied when the composite image is reproduced.
- 8. The information processing apparatus according to claim 1, wherein the specific area information generation unit generates the specific area information with an area of the composite image including a captured predetermined object as the specific area.
- 9. The information processing apparatus according to claim 1, wherein the specific area information generation unit generates the specific area information based on text data generated from audio data.
- 10. The information processing apparatus according to claim 1, wherein the specific area information generation unit generates the specific area information based on acquired command information.
- 11. The information processing apparatus according to claim 2, wherein the specific area information generation unit generates the specific area information based on position information of a subject and the imaging information.
- 12. An information processing method for generating information related to a composite image obtained by stitching together captured images acquired by a plurality of imaging units, the method including generating, in a specific area information generation unit, specific area information indicating a specific area corresponding to a specific imaging area in the composite image.
- 13. A program for causing a computer to generate information related to a composite image obtained by stitching together captured images acquired by a plurality of imaging units, the program causing the computer to realize a function of generating specific area information indicating a specific area corresponding to a specific imaging area in the composite image.
- 14. An imaging apparatus including: a plurality of imaging units; a captured image combining unit that generates a composite image by stitching together the captured images acquired by the plurality of imaging units; and a specific area information generation unit that generates specific area information indicating a specific area corresponding to a specific imaging area in the composite image.
- 15. The imaging apparatus according to claim 14, which outputs the specific area information generated by the specific area information generation unit in association with the composite image generated by the captured image combining unit.
- 16. The imaging apparatus according to claim 14, wherein the captured image combining unit performs image processing of the specific area in the composite image based on the specific area information generated by the specific area information generation unit.
- 17. An image reproduction apparatus including: an image reading unit that reads a composite image generated by stitching together a plurality of captured images; an information reading unit that reads specific area information indicating a specific area corresponding to a specific imaging area in the composite image; and a reproduction control unit that performs reproduction control on the specific area indicated by the specific area information read by the information reading unit, in the composite image read by the image reading unit.
- 18. The image reproduction apparatus according to claim 17, wherein the reproduction control unit performs invalidation processing or emphasis processing of the specific area.
- 19. The image reproduction apparatus according to claim 18, wherein the reproduction control unit, as the invalidation processing, combines the texture indicated by the specific area information with the image of the specific area.
- 20. The image reproduction apparatus according to claim 18, wherein the reproduction control unit, as the invalidation processing, stops output of the image of the specific area or replaces it with invalid data.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17773499.3A EP3439281A4 (en) | 2016-03-29 | 2017-01-16 | INFORMATION PROCESSING DEVICE, IMAGING DEVICE, IMAGE REPRODUCING DEVICE, METHOD, AND PROGRAM |
CN201780019393.2A CN108886580A (zh) | 2016-03-29 | 2017-01-16 | 信息处理装置、成像装置、图像再现装置、方法和程序 |
JP2018508413A JP7056554B2 (ja) | 2016-03-29 | 2017-01-16 | 情報処理装置、撮像装置、画像再生装置、および方法とプログラム |
US16/078,745 US10878545B2 (en) | 2016-03-29 | 2017-01-16 | Information processing device, imaging apparatus, image reproduction apparatus, and method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016066188 | 2016-03-29 | ||
JP2016-066188 | 2016-03-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017168949A1 true WO2017168949A1 (ja) | 2017-10-05 |
Family
ID=59962990
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/001197 WO2017168949A1 (ja) | 2016-03-29 | 2017-01-16 | 情報処理装置、撮像装置、画像再生装置、および方法とプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10878545B2 (ja) |
EP (1) | EP3439281A4 (ja) |
JP (1) | JP7056554B2 (ja) |
CN (1) | CN108886580A (ja) |
WO (1) | WO2017168949A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019155930A1 (ja) * | 2018-02-07 | 2019-08-15 | ソニー株式会社 | 送信装置、送信方法、処理装置および処理方法 |
US12020824B2 (en) | 2020-02-06 | 2024-06-25 | Sumitomo Pharma Co., Ltd. | Virtual reality video reproduction apparatus, and method of using the same |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20180051288A (ko) * | 2016-11-08 | 2018-05-16 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
US10999602B2 (en) | 2016-12-23 | 2021-05-04 | Apple Inc. | Sphere projected motion estimation/compensation and mode decision |
US11259046B2 (en) | 2017-02-15 | 2022-02-22 | Apple Inc. | Processing of equirectangular object data to compensate for distortion by spherical projections |
US10924747B2 (en) | 2017-02-27 | 2021-02-16 | Apple Inc. | Video coding techniques for multi-view video |
US11093752B2 (en) | 2017-06-02 | 2021-08-17 | Apple Inc. | Object tracking in multi-view video |
US10754242B2 (en) | 2017-06-30 | 2020-08-25 | Apple Inc. | Adaptive resolution and projection format in multi-direction video |
US20190005709A1 (en) * | 2017-06-30 | 2019-01-03 | Apple Inc. | Techniques for Correction of Visual Artifacts in Multi-View Images |
JP2021052321A (ja) * | 2019-09-25 | 2021-04-01 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム、および画像処理システム |
GB2611154A (en) * | 2021-07-29 | 2023-03-29 | Canon Kk | Image pickup apparatus used as action camera, control method therefor, and storage medium storing control program therefor |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004105383A1 (ja) * | 2003-05-20 | 2004-12-02 | Matsushita Electric Industrial Co., Ltd. | 撮像システム |
JP2005333552A (ja) * | 2004-05-21 | 2005-12-02 | Viewplus Inc | パノラマ映像配信システム |
JP2010008620A (ja) * | 2008-06-26 | 2010-01-14 | Hitachi Ltd | 撮像装置 |
JP2014039119A (ja) * | 2012-08-13 | 2014-02-27 | Canon Inc | 画像処理装置およびその制御方法、画像処理プログラム、並びに撮像装置 |
JP2015121850A (ja) * | 2013-12-20 | 2015-07-02 | 株式会社リコー | 画像生成装置、画像生成方法、およびプログラム |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000215317A (ja) | 1998-11-16 | 2000-08-04 | Sony Corp | 画像処理方法及び画像処理装置 |
JP2001298652A (ja) | 2000-04-17 | 2001-10-26 | Sony Corp | 画像圧縮方法及び画像圧縮装置、並びにソフトウェア記憶媒体 |
EP1679885A1 (en) | 2003-08-11 | 2006-07-12 | Matsushita Electric Industrial Co., Ltd. | Photographing system and photographing method |
JP4484511B2 (ja) * | 2003-12-26 | 2010-06-16 | 三洋電機株式会社 | 画像合成装置、画像合成用集積回路、及び画像合成方法 |
US7619658B2 (en) * | 2004-11-15 | 2009-11-17 | Hewlett-Packard Development Company, L.P. | Methods and systems for producing seamless composite images without requiring overlap of source images |
JP4765613B2 (ja) * | 2005-12-22 | 2011-09-07 | 富士ゼロックス株式会社 | 画像読み取り装置及びプログラム |
JP4853320B2 (ja) * | 2007-02-15 | 2012-01-11 | ソニー株式会社 | 画像処理装置、画像処理方法 |
JP5162928B2 (ja) * | 2007-03-12 | 2013-03-13 | ソニー株式会社 | 画像処理装置、画像処理方法、画像処理システム |
JP2010049313A (ja) | 2008-08-19 | 2010-03-04 | Sony Corp | 画像処理装置、画像処理方法、プログラム |
US8862987B2 (en) * | 2009-03-31 | 2014-10-14 | Intel Corporation | Capture and display of digital images based on related metadata |
JP5299054B2 (ja) | 2009-04-21 | 2013-09-25 | ソニー株式会社 | 電子機器、表示制御方法およびプログラム |
JP5210994B2 (ja) * | 2009-08-18 | 2013-06-12 | 東芝アルパイン・オートモティブテクノロジー株式会社 | 車両用画像表示装置 |
US8447136B2 (en) * | 2010-01-12 | 2013-05-21 | Microsoft Corporation | Viewing media in the context of street-level images |
JP5361777B2 (ja) * | 2010-03-29 | 2013-12-04 | 三菱スペース・ソフトウエア株式会社 | 画素パッケージファイル再生装置および画素パッケージファイル再生プログラム |
US8810691B2 (en) * | 2010-09-03 | 2014-08-19 | Olympus Imaging Corp. | Imaging apparatus, imaging method and computer-readable recording medium |
JP2012248070A (ja) * | 2011-05-30 | 2012-12-13 | Sony Corp | 情報処理装置、メタデータ設定方法、及びプログラム |
EP2779621B1 (en) * | 2011-11-07 | 2021-12-22 | Sony Interactive Entertainment Inc. | Image generation device, image generation method and program |
KR101952684B1 (ko) * | 2012-08-16 | 2019-02-27 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어 방법, 이를 위한 기록 매체 |
US8805091B1 (en) | 2012-08-17 | 2014-08-12 | Google Inc. | Incremental image processing pipeline for matching multiple photos based on image overlap |
IN2014CH02078A (ja) * | 2013-04-24 | 2015-07-03 | Morpho Inc | |
KR102089432B1 (ko) * | 2013-06-20 | 2020-04-14 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어방법 |
JP6664163B2 (ja) * | 2015-08-05 | 2020-03-13 | キヤノン株式会社 | 画像識別方法、画像識別装置及びプログラム |
CN105306837A (zh) * | 2015-10-27 | 2016-02-03 | 浙江宇视科技有限公司 | 多图像拼接方法及装置 |
2017
- 2017-01-16 JP JP2018508413A patent/JP7056554B2/ja active Active
- 2017-01-16 US US16/078,745 patent/US10878545B2/en active Active
- 2017-01-16 EP EP17773499.3A patent/EP3439281A4/en active Pending
- 2017-01-16 CN CN201780019393.2A patent/CN108886580A/zh active Pending
- 2017-01-16 WO PCT/JP2017/001197 patent/WO2017168949A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004105383A1 (ja) * | 2003-05-20 | 2004-12-02 | Matsushita Electric Industrial Co., Ltd. | 撮像システム |
JP2005333552A (ja) * | 2004-05-21 | 2005-12-02 | Viewplus Inc | パノラマ映像配信システム |
JP2010008620A (ja) * | 2008-06-26 | 2010-01-14 | Hitachi Ltd | 撮像装置 |
JP2014039119A (ja) * | 2012-08-13 | 2014-02-27 | Canon Inc | 画像処理装置およびその制御方法、画像処理プログラム、並びに撮像装置 |
JP2015121850A (ja) * | 2013-12-20 | 2015-07-02 | 株式会社リコー | 画像生成装置、画像生成方法、およびプログラム |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019155930A1 (ja) * | 2018-02-07 | 2019-08-15 | ソニー株式会社 | 送信装置、送信方法、処理装置および処理方法 |
US11341976B2 (en) | 2018-02-07 | 2022-05-24 | Sony Corporation | Transmission apparatus, transmission method, processing apparatus, and processing method |
US12020824B2 (en) | 2020-02-06 | 2024-06-25 | Sumitomo Pharma Co., Ltd. | Virtual reality video reproduction apparatus, and method of using the same |
Also Published As
Publication number | Publication date |
---|---|
CN108886580A (zh) | 2018-11-23 |
US20190057496A1 (en) | 2019-02-21 |
EP3439281A4 (en) | 2019-04-17 |
JPWO2017168949A1 (ja) | 2019-02-07 |
EP3439281A1 (en) | 2019-02-06 |
JP7056554B2 (ja) | 2022-04-19 |
US10878545B2 (en) | 2020-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017168949A1 (ja) | 情報処理装置、撮像装置、画像再生装置、および方法とプログラム | |
CN106464804B (zh) | 成像系统、成像装置、计算机程序和系统 | |
JP7444162B2 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP2018011302A (ja) | 画像処理装置、画像処理方法、及びプログラム | |
JP2016167739A (ja) | 画像処理システム、画像処理方法およびプログラム | |
JP2003178298A (ja) | 画像処理装置及び画像処理方法、記憶媒体、並びにコンピュータ・プログラム | |
WO2020170606A1 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP2019118026A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
JP2013025649A (ja) | 画像処理装置及び画像処理方法、プログラム | |
JP2019057891A (ja) | 情報処理装置、撮像装置、情報処理方法及びプログラム | |
JP2008510357A (ja) | 画像のエンコーディング方法、エンコーディング装置、画像のデコーディング方法及びデコーディング装置 | |
JP6724659B2 (ja) | 撮影装置、方法およびプログラム | |
JP2010166218A (ja) | カメラシステム及びその制御方法 | |
JP6256513B2 (ja) | 撮像システム、撮像装置、方法およびプログラム | |
JP5045213B2 (ja) | 撮像装置、再生装置及び記録ファイル作成方法 | |
CN114175616B (zh) | 图像处理设备、图像处理方法和程序 | |
US11825191B2 (en) | Method for assisting the acquisition of media content at a scene | |
WO2021171848A1 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP2020174363A (ja) | 撮影システム、方法およびプログラム | |
WO2021181965A1 (ja) | 画像処理装置、画像処理方法、プログラム | |
WO2021181966A1 (ja) | 画像処理装置、画像処理方法、プログラム | |
JP2013186640A (ja) | 画像処理装置と画像処理方法およびプログラム | |
WO2024171796A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP2015226224A (ja) | 撮像装置 | |
JP2023042257A (ja) | 画像処理装置、画像処理方法およびプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 2018508413; Country of ref document: JP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 2017773499; Country of ref document: EP |
| ENP | Entry into the national phase | Ref document number: 2017773499; Country of ref document: EP; Effective date: 20181029 |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17773499; Country of ref document: EP; Kind code of ref document: A1 |