WO2022113583A1 - Information processing device, information processing method, and recording medium - Google Patents
Information processing device, information processing method, and recording medium
- Publication number
- WO2022113583A1 (PCT/JP2021/038810)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
Definitions
- This disclosure relates to the technical fields of information processing devices for generating three-dimensional images, information processing methods, and recording media.
- Patent Document 1 discloses a technique for creating a three-dimensional face model corresponding to a face image by mapping the value of each pixel of the face image onto the aligned three-dimensional face shape data.
- Patent Document 2 discloses that a three-dimensional shape of a subject's face is estimated and the estimation result is displayed on a display device.
- Patent Document 3 discloses that distributed processing is performed for processing related to a three-dimensional object.
- Patent Document 4 discloses that the image to be processed is thinned out to reduce the transfer load and the processing load.
- This disclosure is intended to improve the related techniques mentioned above.
- One aspect of the information processing apparatus of the present disclosure comprises: an acquisition means for acquiring an image set including a plurality of images of a subject; a first storage means for storing a plurality of the image sets; a first selection means for selecting a first predetermined number of images across the sets from the plurality of image sets; and a generation means for generating a three-dimensional image of the subject based on the first predetermined number of images.
- One aspect of the information processing method of the present disclosure acquires an image set including a plurality of images of a subject, stores a plurality of the image sets, selects a first predetermined number of images across the sets from the plurality of image sets, and generates a three-dimensional image of the subject based on the first predetermined number of images.
- One aspect of the recording medium of the present disclosure records a computer program that operates a computer to acquire an image set including a plurality of images of a subject, store a plurality of the image sets, select a first predetermined number of images across the sets from the plurality of image sets, and generate a three-dimensional image of the subject based on the first predetermined number of images.
- FIG. 1 is a block diagram showing a hardware configuration of the information processing apparatus according to the first embodiment.
- the information processing device 10 includes a processor 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, and a storage device 14.
- the information processing device 10 may further include an input device 15 and an output device 16.
- the processor 11, the RAM 12, the ROM 13, the storage device 14, the input device 15, and the output device 16 are connected via the data bus 17.
- Processor 11 reads a computer program.
- the processor 11 is configured to read a computer program stored in at least one of the RAM 12, the ROM 13, and the storage device 14.
- the processor 11 may read a computer program stored in a computer-readable recording medium by using a recording medium reading device (not shown).
- the processor 11 may acquire (that is, read) a computer program from a device (not shown) arranged outside the information processing device 10 via a network interface.
- the processor 11 controls the RAM 12, the storage device 14, the input device 15, and the output device 16 by executing the read computer program.
- a functional block for generating a three-dimensional image is realized in the processor 11.
- examples of the processor 11 include a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), and an ASIC (Application Specific Integrated Circuit).
- as the processor 11, one of the above-mentioned examples may be used, or a plurality of processors may be used in parallel.
- the RAM 12 temporarily stores the computer program executed by the processor 11.
- the RAM 12 temporarily stores data temporarily used by the processor 11 while the processor 11 is executing a computer program.
- the RAM 12 may be, for example, a DRAM (Dynamic RAM).
- the ROM 13 stores a computer program executed by the processor 11.
- the ROM 13 may also store fixed data.
- the ROM 13 may be, for example, a PROM (Programmable ROM).
- the storage device 14 stores data stored in the information processing device 10 for a long period of time.
- the storage device 14 may operate as a temporary storage device of the processor 11.
- the storage device 14 may include, for example, at least one of a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), and a disk array device.
- the input device 15 is a device that receives an input instruction from the user of the information processing device 10.
- the input device 15 may include, for example, at least one of a keyboard, a mouse and a touch panel.
- the input device 15 may be a dedicated controller (operation terminal). Further, the input device 15 may include a terminal owned by the user (for example, a smartphone, a tablet terminal, or the like).
- the input device 15 may be a device capable of voice input including, for example, a microphone.
- the output device 16 is a device that outputs information about the information processing device 10 to the outside.
- the output device 16 may be a display device (for example, a display) capable of displaying information about the information processing device 10.
- the display device here may be a television monitor, a personal computer monitor, a smartphone monitor, a tablet terminal monitor, or another mobile terminal monitor.
- the display device may be a large monitor, a digital signage, or the like installed in various facilities such as a store.
- the output device 16 may be a device that outputs information in a format other than an image.
- the output device 16 may be a speaker that outputs information about the information processing device 10 by voice.
- FIG. 2 is a block diagram showing a functional configuration of the information processing apparatus according to the first embodiment.
- the information processing apparatus 10 includes, as processing blocks for realizing its functions, an image acquisition unit 110, a first storage unit 120, a first selection unit 130, and a three-dimensional image generation unit 140.
- Each of the image acquisition unit 110, the first selection unit 130, and the three-dimensional image generation unit 140 may be realized by the processor 11 (see FIG. 1) described above.
- the first storage unit 120 may be realized by the RAM 12 (see FIG. 1) described above.
- the image acquisition unit 110 is configured to be able to acquire an image set including a plurality of images of the subject.
- the image acquisition unit 110 may directly acquire an image captured by, for example, a camera or the like, or may appropriately acquire an image stored in a storage means.
- when acquiring images from a camera, there may be a plurality of cameras, and images may be acquired from each of the plurality of cameras.
- the number of images included in the image set is not particularly limited. Specific examples of the image set will be described in detail in other embodiments described later.
- the image acquired by the image acquisition unit 110 is output to the first storage unit 120.
- the first storage unit 120 is configured to be able to store a plurality of image sets (that is, a plurality of images) acquired by the image acquisition unit 110.
- the first storage unit 120 can store, for example, two image sets.
- the first storage unit 120 may be capable of storing three or more image sets.
- the first storage unit 120 may have a function of appropriately deleting an unnecessary image set.
- the first selection unit 130 is configured to be able to select a first predetermined number of images from the image set stored in the first storage unit 120.
- the "first predetermined number of images” here is set as the number of images to be output to the three-dimensional image generation unit 140, and even if it is the minimum number of images required to generate a three-dimensional image, for example. good.
- the first predetermined number may be the same number as the number of images included in the image set, or may be a different number.
- the first selection unit 130 selects a first predetermined number of images across the set from a plurality of image sets stored in the first storage unit 120.
- the first selection unit 130 selects a first predetermined number of images from each of the first image set and the second image set. The method of selecting an image by the first selection unit 130 will be described in detail later with a specific example.
- the first predetermined number of images selected by the first selection unit 130 is output to the three-dimensional image generation unit 140.
- the three-dimensional image generation unit 140 is configured to be able to generate a three-dimensional image of a subject from a first predetermined number of images selected by the first selection unit 130. That is, the three-dimensional image generation unit 140 has a function of generating a three-dimensional image from a plurality of two-dimensional images.
- the method of generating a three-dimensional image from a plurality of images is not particularly limited, and existing techniques can be adopted as appropriate, and therefore detailed description thereof will be omitted here.
- FIG. 3 is a flowchart showing an operation flow of the information processing apparatus according to the first embodiment.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the first storage unit 120 stores the image set acquired by the image acquisition unit 110 (step S12). In the processing of steps S11 and S12, a plurality of image sets may be acquired and stored collectively, or a plurality of image sets may be acquired and stored by repeatedly executing the process of acquiring and storing one image set.
- the first selection unit 130 selects a first predetermined number of images across the set from the plurality of image sets stored in the first storage unit 120 (step S13).
- the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the first predetermined number of images selected by the first selection unit 130 (step S14).
- the series of processes described above may be repeatedly executed as appropriate. That is, a plurality of three-dimensional images may be generated by repeatedly executing the process of generating the three-dimensional image.
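The loop of steps S11 to S14 can be sketched as follows. This is a minimal illustration only, not the patented implementation; the injected callables (`camera`, `select`, `generate`) are hypothetical names used as stand-ins for the acquisition, selection, and generation units.

```python
def run_pipeline(camera, storage, select, generate, rounds):
    """Repeat steps S11-S14: acquire an image set (S11), store it (S12),
    select images across the two most recently stored sets (S13), and
    generate a 3-D image from the selection (S14)."""
    results = []
    for _ in range(rounds):
        storage.append(camera())  # S11 + S12
        if len(storage) >= 2:     # need two sets to select across them
            chosen = select(storage[-2], storage[-1])  # S13
            results.append(generate(chosen))           # S14
    return results

# toy run: two 4-image sets, selection straddles the set boundary
sets = iter([[1, 2, 3, 4], [5, 6, 7, 8]])
out = run_pipeline(camera=lambda: next(sets), storage=[],
                   select=lambda a, b: a[-2:] + b[:2],  # latter half + first half
                   generate=lambda imgs: imgs,          # stand-in for 3-D generation
                   rounds=2)
```

A round with only one stored set produces no output, mirroring the fact that straddling selection needs at least two image sets in the first storage unit.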
- FIG. 4 is a conceptual diagram showing an example of image selection in the information processing apparatus according to the first embodiment.
- the first image set and the second image set are stored in the first storage unit 120. It should be noted that one image set contains eight images. Specifically, the first image set contains eight images from No. 1 to No. 8. The second image set contains eight images from No. 9 to No. 16.
- the first selection unit 130 may select, for example, the four images in the latter half of the first image set (that is, images No. 5 to No. 8) and the four images in the first half of the second image set (that is, images No. 9 to No. 12).
- a three-dimensional image is generated from a total of eight images, the selected images Nos. 5 to 8 and the images Nos. 9 to 12.
- the unselected images (that is, images No. 1 to No. 4 and No. 13 to No. 16) may be discarded, or may be used in the same process executed at a different timing (for example, as selection candidates for the next selection by the first selection unit 130).
- in this example, images having consecutive numbers across the sets are selected, but images with non-consecutive numbers may also be selected.
- the images Nos. 1, 3, 5, and 7 in the first image set and the images Nos. 9, 11, 13, and 15 in the second image set may be selected.
- in this example, the same number of images as one image set contains (eight images) is selected, but a different number of images (for example, seven or fewer, or nine or more) may be selected.
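The straddling selection of FIG. 4 can be sketched as a small helper function. The function name and its `stride` parameter are illustrative assumptions; the two modes correspond to the consecutive selection (No. 5 to 8 plus No. 9 to 12) and the every-other-image selection (No. 1, 3, 5, 7 plus No. 9, 11, 13, 15) described above.

```python
def select_across_sets(first_set, second_set, count=8, stride=1):
    """Select `count` images straddling two stored image sets.
    stride=1: latter half of the first set plus first half of the second.
    stride=2: every other image from the head of each set."""
    half = count // 2
    if stride == 1:
        return first_set[-half:] + second_set[:half]
    return first_set[::stride][:half] + second_set[::stride][:half]

first, second = list(range(1, 9)), list(range(9, 17))  # images No. 1-8 and No. 9-16
consecutive = select_across_sets(first, second)
every_other = select_across_sets(first, second, stride=2)
```

Either way, a first predetermined number of images (eight here) is handed to the three-dimensional image generation unit, while the rest remain candidates for a later selection.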
- the selected image may be appropriately set according to various conditions.
- the image to be selected may be determined based on, for example, the quality of the image (whether the subject is facing the front, whether the image is out of focus or motion-blurred, etc.). Such an example will be described in detail in another embodiment described later.
- as described above, in the information processing apparatus 10 according to the first embodiment, a first predetermined number of images are selected across the sets from a plurality of image sets. This makes it possible to generate a three-dimensional image using a first predetermined number of images drawn from different image sets, which allows a more appropriate three-dimensional image to be generated than when only the images included in one image set are used. For example, images suitable for generating a three-dimensional image can be selected (in other words, images unsuitable for generating a three-dimensional image can be excluded).
- further, since the information processing apparatus 10 according to the first embodiment includes the first storage unit 120 that stores a plurality of image sets, it is easy to select a first predetermined number of images across the sets. Because the first storage unit 120 can store a plurality of image sets, the process of acquiring images and the process of selecting images to generate a three-dimensional image can be executed efficiently; specifically, the two processes can be executed in parallel at the same time. Moreover, the first storage unit 120 makes it possible to acquire new images without waiting for the three-dimensional image generation process to complete.
- without the first storage unit 120, the image acquisition unit 110 could not acquire a new image until the three-dimensional image generation unit 140 completed generating a three-dimensional image from one image set (that is, it would have to wait for the generation process to complete). With the first storage unit 120, however, the image acquisition unit 110 can acquire new images without waiting for the three-dimensional image generation unit 140 to finish. Specifically, the image acquisition unit 110 sequentially stores the acquired images in the first storage unit 120, and the three-dimensional image generation unit 140 acquires images not directly from the image acquisition unit 110 but from the first storage unit 120 to generate the three-dimensional image.
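The decoupling provided by the first storage unit 120 can be illustrated with a bounded buffer: acquisition keeps storing sets while generation consumes them. This is a sketch only; the thread structure and the stand-in "generation" step are assumptions for illustration, not the disclosed design.

```python
import queue
import threading

def acquire(buffer, image_sets):
    # producer: store each acquired set without waiting for 3-D generation
    for s in image_sets:
        buffer.put(s)
    buffer.put(None)  # sentinel: acquisition finished

def generate(buffer, results):
    # consumer: take image sets from the buffer, not directly from the camera,
    # and select across the two most recent sets (stand-in for generation)
    prev = None
    while (current := buffer.get()) is not None:
        if prev is not None:
            results.append(prev[-2:] + current[:2])
        prev = current

buffer = queue.Queue(maxsize=2)  # first storage unit holding two image sets
results = []
worker = threading.Thread(target=generate, args=(buffer, results))
worker.start()
acquire(buffer, [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12]])
worker.join()
```

The bounded queue mirrors a first storage unit that holds two image sets: the producer blocks only when the buffer is full, so acquisition and generation otherwise proceed in parallel.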
- the information processing apparatus 10 according to the second embodiment will be described with reference to FIG.
- the second embodiment differs from the first embodiment described above in only a part of the operation; the hardware configuration (see FIG. 1) and the functional configuration (see FIG. 2) of the apparatus may be the same as those of the first embodiment, and their description is omitted as appropriate.
- FIG. 5 is a flowchart showing an operation flow of the information processing apparatus according to the second embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the first storage unit 120 stores the image set acquired by the image acquisition unit 110 (step S12).
- the first selection unit 130 selects a first predetermined number of images across the set from the plurality of image sets stored in the first storage unit 120.
- the first selection unit 130 selects a first predetermined number of images according to the processing capacity of the three-dimensional image generation unit 140 (step S21). That is, the first selection unit 130 selects an image according to the subsequent processing capacity.
- the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the first predetermined number of images selected by the first selection unit 130 (step S14).
- the above-mentioned "processing capacity of the three-dimensional image generation unit 140" may be the processing capacity according to the specifications of the three-dimensional image generation unit 140, or may be the processing capacity at that time, taking into account the calculation load of the processing currently being executed.
- the first selection unit 130 changes the number of images to be selected (that is, the value of the first predetermined number) according to, for example, the processing capacity of the three-dimensional image generation unit 140. In this case, the number of images used by the three-dimensional image generation unit 140 to generate the three-dimensional image varies depending on the processing capacity.
- the first selection unit 130 changes the frequency of selecting an image according to the processing capacity of the three-dimensional image generation unit 140. In this case, since the frequency with which the image is output to the three-dimensional image generation unit 140 varies, the number of images processed by the three-dimensional image generation unit 140 per unit time varies.
- a first predetermined number of images are selected according to the processing capacity of the three-dimensional image generation unit 140.
- for example, when the processing capacity of the three-dimensional image generation unit 140 is reduced, the first selection unit 130 reduces the number of images to be selected, so that even the three-dimensional image generation unit 140 with reduced processing capacity can execute the three-dimensional image generation processing without any problem. Alternatively, the first selection unit 130 may reduce the frequency of selecting images, with the same effect.
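Both adjustments (fewer images, or less frequent selection) can be realized with a small planning function. The capacity scale and thresholds below are illustrative assumptions, not values from the disclosure.

```python
def plan_selection(capacity, base_count=8, base_interval=1.0):
    """Return (number of images to select, seconds between selections)
    according to the 3-D generation unit's current capacity (0.0-1.0).
    Reduced capacity first lowers the image count, then also the frequency."""
    if capacity >= 0.75:
        return base_count, base_interval          # full capacity: no change
    if capacity >= 0.5:
        return base_count // 2, base_interval     # fewer images per selection
    return base_count // 2, base_interval * 2.0   # fewer images, less often
```

A scheduler around the first selection unit could call this each cycle, so the load presented to the three-dimensional image generation unit tracks its current capacity.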
- the information processing apparatus 10 according to the third embodiment will be described with reference to FIGS. 6 to 9.
- the third embodiment shows a specific example of the images used in the first and second embodiments described above (that is, images of a subject for generating a three-dimensional image); the device configuration, operation flow, and the like may be the same as in the first and second embodiments. Therefore, in the following, the parts different from each of the above-described embodiments will be described in detail, and the description of other overlapping parts will be omitted as appropriate.
- FIG. 6 is a schematic configuration diagram showing a configuration of an imaging system to which the information processing apparatus according to the third embodiment is applied.
- the information processing apparatus 10 is connected to the imaging unit 210 and the projection unit 220.
- the image pickup unit 210 and the projection unit 220 are arranged so as to face the subject 50, respectively.
- the image pickup unit 210 is configured to be able to capture an image of the subject 50.
- the image pickup unit 210 may include a solid-state image pickup element such as a CCD (Charge Coupled Device) image sensor or a CMOS (Complementary Metal Oxide Semiconductor) image sensor. Further, the image pickup unit 210 may include an optical system for forming an image of a subject on the image pickup surface of the solid-state image pickup device, a signal processing circuit for signal processing the output of the solid-state image pickup device to obtain a brightness value for each pixel, and the like. Although one imaging unit 210 is shown here for convenience of explanation, two or more imaging units 210 may be provided. In this case, the two image pickup units 210 may be arranged so as to image the subject 50 at different angles.
- the projection unit 220 is configured to be able to project a predetermined light pattern on the subject 50.
- the projection unit 220 is not particularly limited, but may be, for example, a DLP (Digital Light Processing) projector, a liquid crystal projector, or the like.
- a DLP projector or a liquid crystal projector can project an arbitrary light pattern at high speed, which is preferable in order to shorten the time required to measure the shape of the subject 50. Shortening the measurement time is particularly suitable for measuring the three-dimensional shape of a moving object (moving object), such as when performing face recognition of a person.
- the details of the light pattern projected by the projection unit 220 on the subject 50 will be described later.
- FIG. 7 is a plan view showing an example of a sine wave pattern projected on a subject.
- the projection unit 220 projects a sinusoidal pattern (that is, a sinusoidal grid-like optical pattern) onto the subject 50 as a predetermined optical pattern.
- the sinusoidal grid phase shift method is a method of projecting a sinusoidal pattern onto a subject while shifting the phase little by little to specify the three-dimensional shape of the subject.
- the brightness value I(x, y, t) at time t at the (x, y) coordinates of the obtained image can be expressed by the following equation (1), where A is the amplitude of the sine wave, θ is the phase value, and B is the bias (center value of the sine wave).
- by obtaining the phase value θ, the three-dimensional position of the coordinates (x, y) can be determined.
- the phase value ⁇ can be calculated if there are at least three sinusoidal pattern projection images.
- the phase value ⁇ can be calculated with higher accuracy by the least squares method or the like.
- equations (6) to (8) can be obtained by deriving the amplitude A, the phase value θ, and the bias B from equations (2) to (5) by the least squares method.
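Equations (1) to (8) are reproduced as images in the original publication and are not shown here. As a sketch of the computation described, the following is the standard closed-form least-squares solution for an N-step sinusoidal phase shift with equally spaced shifts (N ≥ 3); it is offered under the assumed model I_k = A·sin(2πk/N + θ) + B, not as the patent's exact formulation.

```python
import math

def fit_sinusoid(intensities):
    """Per-pixel least-squares fit of I_k = A*sin(2*pi*k/N + theta) + B
    from N >= 3 equally spaced phase-shifted samples; returns (A, theta, B)."""
    n = len(intensities)
    s = sum(i * math.sin(2 * math.pi * k / n) for k, i in enumerate(intensities))
    c = sum(i * math.cos(2 * math.pi * k / n) for k, i in enumerate(intensities))
    theta = math.atan2(c, s)             # phase value
    amp = (2.0 / n) * math.hypot(s, c)   # amplitude of the sine wave
    bias = sum(intensities) / n          # bias (center value)
    return amp, theta, bias

# four samples of a known sinusoid recover its parameters exactly
samples = [2 * math.sin(2 * math.pi * k / 4 + 0.7) + 5 for k in range(4)]
amp, theta, bias = fit_sinusoid(samples)
```

With the four sine-pattern images of one image set (N = 4), this fit would be evaluated independently at every (x, y) pixel to obtain the phase map.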
- FIG. 8 is a plan view showing an example of a luminance inclination pattern projected on the subject.
- the projection unit 220 projects a luminance gradient pattern (that is, a light pattern in which the luminance value changes linearly at a constant rate) onto the subject 50 in addition to the above-mentioned sine wave pattern.
- the luminance gradient pattern is projected as a periodic optical pattern in which the number of repetition periods is different from that of the sinusoidal pattern.
- FIG. 8 shows a luminance gradient pattern in which the luminance increases at a constant rate from top to bottom; a luminance gradient pattern in which the luminance increases at a constant rate from bottom to top may also be used.
- the luminance value K(x, y, t) at time t at the (x, y) coordinates of the obtained image can be expressed by the following equation (9).
- where A″ is the amplitude, B″ is the bias, and α is a variable whose value changes linearly in the range −1 ≤ α ≤ 1.
- one projection unit 220 can switch between the sine wave pattern and the luminance gradient pattern at high speed for projection.
- since the sine wave pattern and the luminance gradient pattern are generated from light emitted by the same light source and projected by the one projection unit 220, the basic physical characteristics of the projection unit 220 at the time of projecting these light patterns can be assumed to be the same. That is, when the sine wave pattern and the luminance gradient pattern are projected onto the subject 50 using the same projection unit 220, the following equation (10) is considered to be satisfied.
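Equations (9) and (10) are likewise not reproduced here, but the role of the gradient pattern can be sketched: if both patterns share the amplitude A and bias B, a single gradient sample resolves the 2π ambiguity of the wrapped sinusoidal phase (temporal phase unwrapping). The model below (K = A·α + B, with α linear over [−1, 1] across the fringe periods) is an assumption consistent with the description, not the patent's exact formulation.

```python
import math

def absolute_phase(theta, k_value, amp, bias, periods):
    """Resolve the 2*pi ambiguity of wrapped phase `theta` (in [0, 2*pi))
    using one luminance-gradient sample `k_value`, assuming K = A*alpha + B
    with alpha running linearly over [-1, 1] across `periods` repetitions
    of the sine pattern."""
    alpha = (k_value - bias) / amp                   # invert K = A*alpha + B
    coarse = math.pi * periods * (alpha + 1.0)       # coarse absolute phase
    order = round((coarse - theta) / (2 * math.pi))  # integer fringe order
    return theta + 2 * math.pi * order

# a point whose true absolute phase is 13.0 rad over 8 fringe periods
true_phase, periods, amp, bias = 13.0, 8, 2.0, 5.0
wrapped = true_phase % (2 * math.pi)
k = amp * (true_phase / (math.pi * periods) - 1.0) + bias
recovered = absolute_phase(wrapped, k, amp, bias, periods)
```

The gradient image is coarse but unambiguous, while the sinusoidal phase is precise but periodic; combining them is why equation (10), the shared-amplitude/bias assumption, matters.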
- FIG. 9 is a diagram showing an example of an image set used in the information processing apparatus according to the third embodiment.
- the information processing apparatus 10 acquires, as one image set, a plurality of images including images captured while projecting the above-mentioned sine wave pattern (see FIG. 7), images captured while projecting the luminance gradient pattern (see FIG. 8), and texture images showing the state of the surface of the subject 50.
- in this example, one image set includes four images captured while projecting the sine wave pattern, two images captured while projecting the luminance gradient pattern, and two texture images, for a total of eight images.
- the configuration of the image set described above is only an example, and the number of each of the image captured by projecting the sine wave pattern, the image captured by projecting the luminance gradient pattern, and the texture image may be appropriately changed. Further, the total number of images constituting one image set may be different from the number of eight images. However, it is preferable that a plurality of images captured by projecting a sine wave pattern, images captured by projecting a luminance gradient pattern, and texture images are included in one image set.
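The example composition of one image set (four sine-pattern frames, two gradient frames, two texture frames) can be written as a simple container; the class and field names are illustrative, not from the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class ImageSet:
    sine: list = field(default_factory=list)      # 4 sine-wave pattern frames
    gradient: list = field(default_factory=list)  # 2 luminance-gradient frames
    texture: list = field(default_factory=list)   # 2 texture frames

    def frames(self):
        """All frames of the set in capture order (eight in this example)."""
        return self.sine + self.gradient + self.texture

s = ImageSet(sine=[1, 2, 3, 4], gradient=[5, 6], texture=[7, 8])
```

Since the counts per category may be changed, the container leaves them unconstrained; only the example totals eight frames.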
- the information processing apparatus 10 according to the fourth embodiment will be described with reference to FIGS. 10 to 15. The fourth embodiment differs from the above-mentioned first to third embodiments in only a part of the configuration and operation; the other parts may be the same as those of the first to third embodiments. Therefore, in the following, the parts different from each of the above-described embodiments will be described in detail, and the description of other overlapping parts will be omitted as appropriate.
- FIG. 10 is a block diagram showing a functional configuration of the information processing apparatus according to the fourth embodiment.
- the same reference numerals are given to the same components as those shown in FIG. 2.
- the information processing apparatus 10 includes, as processing blocks for realizing its functions, an image acquisition unit 110, a first storage unit 120, a first selection unit 130, a three-dimensional image generation unit 140, a second storage unit 150, a second selection unit 160, and a display unit 170. That is, the information processing apparatus 10 according to the fourth embodiment further includes the second storage unit 150, the second selection unit 160, and the display unit 170 in addition to the configuration according to the first embodiment (see FIG. 2).
- the second storage unit 150 may be realized by the above-mentioned RAM 12 (see FIG. 1).
- the second selection unit 160 and the display unit 170 may be realized by the processor 11 (see FIG. 1) described above.
- the second storage unit 150 is configured to be capable of storing a plurality of sets (hereinafter, appropriately referred to as "three-dimensional image sets") including a plurality of three-dimensional images generated by the three-dimensional image generation unit 140.
- the second storage unit 150 can store, for example, two three-dimensional image sets.
- the second storage unit 150 may be capable of storing three or more three-dimensional image sets.
- the second storage unit 150 may have a function of appropriately deleting an unnecessary three-dimensional image set.
- the second selection unit 160 is configured to be able to select a second predetermined number of three-dimensional images from the three-dimensional image set stored in the second storage unit 150.
- the "second predetermined number of sheets" is set as the number of three-dimensional images output by the display unit 170, and may be, for example, the number of three-dimensional images simultaneously displayed on a display or the like.
- the second predetermined number may be the same number as the number of three-dimensional images included in the three-dimensional image set, or may be a different number.
- the second selection unit 160 selects a second predetermined number of three-dimensional images from a plurality of three-dimensional image sets stored in the second storage unit 150 across the sets. For example, the second selection unit 160 selects a second predetermined number of images from each of the first three-dimensional image set and the second three-dimensional image set.
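As an illustrative sketch of this cross-set selection (the function name and the round-robin order are assumptions, not prescribed by the embodiment):

```python
def select_across_sets(image_sets, second_predetermined_number):
    """Select images across multiple sets in round-robin order until the
    requested count is reached (hypothetical selection policy)."""
    selected = []
    total = sum(len(s) for s in image_sets)
    index = 0
    while len(selected) < second_predetermined_number and index < total:
        set_idx = index % len(image_sets)    # alternate between the sets
        item_idx = index // len(image_sets)  # position within a set
        if item_idx < len(image_sets[set_idx]):
            selected.append(image_sets[set_idx][item_idx])
        index += 1
    return selected

first_set = ["3d-A1", "3d-A2", "3d-A3"]
second_set = ["3d-B1", "3d-B2", "3d-B3"]
print(select_across_sets([first_set, second_set], 4))
# -> ['3d-A1', '3d-B1', '3d-A2', '3d-B2']
```

The result mixes images taken from both sets, which is the point of selecting "across the sets" rather than from a single set.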
- the second predetermined number of three-dimensional images selected by the second selection unit 160 is output to the display unit 170.
- the display unit 170 outputs a second predetermined number of images selected by the second selection unit 160 as a three-dimensional image to be displayed to the user or the like. That is, the display unit 170 has a function of outputting a plurality of three-dimensional images.
- the three-dimensional image output from the display unit 170 is output to, for example, a display device having a display. This display device may be realized, for example, by the output device 16 (see FIG. 1) described above.
- the display mode of the three-dimensional image output from the display unit 170 will be described with reference to specific examples in other embodiments described later.
- FIG. 11 is a flowchart showing a flow of operation of the information processing apparatus according to the fourth embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the first storage unit 120 stores the image set acquired by the image acquisition unit 110 (step S12).
- the first selection unit 130 selects a first predetermined number of images across the set from the plurality of image sets stored in the first storage unit 120 (step S13). Then, the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the first predetermined number of images selected by the first selection unit 130 (step S14).
- the second storage unit 150 stores the three-dimensional image set generated by the three-dimensional image generation unit 140 (step S41).
- the second selection unit 160 selects a second predetermined number of three-dimensional images from the plurality of three-dimensional image sets stored in the second storage unit 150 across the sets (step S42).
- the display unit 170 outputs a second predetermined number of three-dimensional images selected by the second selection unit 160 as a three-dimensional image for display (step S43).
- the second predetermined number of three-dimensional images selected by the second selection unit 160 is displayed (presented) to the user and the like.
- FIG. 12 is a conceptual diagram showing the configuration of the image ring memory and the three-dimensional image ring memory.
- FIG. 13 is a flowchart showing the flow of operation of the image pickup thread in the information processing apparatus according to the fourth embodiment.
- FIG. 14 is a flowchart showing the flow of operation of the three-dimensional image generation thread in the information processing apparatus according to the fourth embodiment.
- FIG. 15 is a flowchart showing the flow of operation of the three-dimensional image display thread in the information processing apparatus according to the fourth embodiment.
- the first storage unit 120 includes an image ring memory capable of storing a plurality of images.
- the image ring memory here has eight buffers, and each buffer is configured to be able to store an image.
- Each buffer has a state flag indicating its status. The status is either "writable" or "written".
- Pointer # 1 points to a buffer to be worked on in the image ring memory.
- the second storage unit 150 is provided with a three-dimensional image ring memory capable of storing a plurality of three-dimensional images.
- the three-dimensional image ring memory here has five buffers, and each buffer is configured to be able to store a three-dimensional image.
- Each buffer has a state flag indicating its status. The status is either "writable" or "written".
- Pointer # 2 points to a buffer to be worked on in the 3D image ring memory.
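The two ring memories described above can be sketched as follows (a minimal illustration; the class and field names are assumptions, and "writable"/"written" are the two buffer states reconstructed from the thread descriptions):

```python
WRITABLE, WRITTEN = "writable", "written"  # assumed buffer states

class RingMemory:
    """A fixed number of buffers, each holding one image and a state flag,
    with a pointer that wraps around (cf. pointer #1 and pointer #2)."""

    def __init__(self, size):
        self.buffers = [{"state": WRITABLE, "data": None} for _ in range(size)]
        self.pointer = 0

    def current(self):
        return self.buffers[self.pointer]

    def advance(self):
        # ring memory: after the last buffer, return to the first
        self.pointer = (self.pointer + 1) % len(self.buffers)

image_ring = RingMemory(8)    # image ring memory with eight buffers
three_d_ring = RingMemory(5)  # 3D image ring memory with five buffers
print(image_ring.current()["state"])  # -> writable
```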
- the imaging thread will be described with reference to the example shown in FIG.
- the image pickup thread is a thread corresponding to a process in which an image acquisition unit 110 acquires an image captured by an image pickup unit 210 (see FIG. 6) and stores the image in the first storage unit 120.
- in step S101, the state of the buffer pointed to by the pointer # 1 is checked. If the state of the buffer pointed to by the pointer # 1 is "written", the subsequent processing is not started. In this case, for example, after a predetermined period, the process of step S101 may be executed again. On the other hand, when the state of the buffer pointed to by the pointer # 1 is "writable", the captured image is acquired and the acquired image is written to the image ring memory (step S102).
- in step S103, the state of the buffer pointed to by the pointer # 1 is set to "written"
- in step S104, the pointer # 1 is advanced to the next buffer
- the three-dimensional image generation thread will be described with reference to the example shown in FIG.
- the three-dimensional image generation thread is a thread corresponding to the process in which the first selection unit 130 selects a first predetermined number of images from the plurality of image sets stored in the first storage unit 120, and the three-dimensional image generation unit 140 generates a three-dimensional image and stores it.
- in step S201, the state of the buffer pointed to by the pointer # 1 is checked. If the state of the buffer pointed to by the pointer # 1 is "writable", the subsequent processing is not started. In this case, for example, after a predetermined period, the process of step S201 may be executed again.
- in step S202, the state of the buffer pointed to by the pointer # 2 is checked. If the state of the buffer pointed to by the pointer # 2 is "written", the subsequent processing is not started. In this case, for example, after a predetermined period, the process of step S202 may be executed again.
- in step S203, a three-dimensional image is generated, and the generated three-dimensional image is written to the three-dimensional image ring memory.
- in step S204, the state of the input buffer of the image ring memory pointed to by the pointer # 1 is set to "writable". Further, the state of the output buffer of the three-dimensional image ring memory pointed to by the pointer # 2 is set to "written" (step S205). Then, the pointer # 1 and the pointer # 2 are advanced to the next buffers (step S206).
- the three-dimensional image display thread will be described with reference to the example shown in FIG.
- the three-dimensional image display thread is a thread corresponding to the process in which the second selection unit 160 selects a second predetermined number of three-dimensional images from the plurality of three-dimensional image sets stored in the second storage unit 150, and the display unit 170 outputs them as three-dimensional images for display.
- in step S301, the state of the buffer pointed to by the pointer # 2 is checked. If the state of the buffer pointed to by the pointer # 2 is "writable", the subsequent processing is not started. In this case, for example, after a predetermined period, the process of step S301 may be executed again. On the other hand, when the state of the buffer pointed to by the pointer # 2 is "written", the selected three-dimensional image is output as a three-dimensional image for display (step S302).
- in step S303, the state of the buffer pointed to by the pointer # 2 is set to "writable"
- in step S304, the pointer # 2 is advanced to the next buffer
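The three threads above can be sketched as single steps of a producer/consumer pipeline (a simplified, single-threaded illustration; giving each ring separate read and write positions is an assumption made here so that the behavior can be simulated without real threads):

```python
WRITABLE, WRITTEN = "writable", "written"

def make_ring(size):
    return {"bufs": [{"state": WRITABLE, "data": None} for _ in range(size)],
            "write": 0, "read": 0}

def step_capture(ring, image):
    """Image pickup thread: write only into a 'writable' buffer (steps S101-S104)."""
    buf = ring["bufs"][ring["write"]]
    if buf["state"] != WRITABLE:
        return False  # buffer not free yet; retry after a predetermined period
    buf["data"], buf["state"] = image, WRITTEN
    ring["write"] = (ring["write"] + 1) % len(ring["bufs"])
    return True

def step_generate(in_ring, out_ring):
    """3D generation thread: needs a 'written' input buffer and a 'writable'
    output buffer (steps S201-S206)."""
    src = in_ring["bufs"][in_ring["read"]]
    dst = out_ring["bufs"][out_ring["write"]]
    if src["state"] != WRITTEN or dst["state"] != WRITABLE:
        return False
    dst["data"], dst["state"] = "3d(" + src["data"] + ")", WRITTEN
    src["data"], src["state"] = None, WRITABLE  # input buffer is free again
    in_ring["read"] = (in_ring["read"] + 1) % len(in_ring["bufs"])
    out_ring["write"] = (out_ring["write"] + 1) % len(out_ring["bufs"])
    return True

def step_display(ring):
    """Display thread: output a 'written' 3D image, then free its buffer
    (steps S301-S304)."""
    buf = ring["bufs"][ring["read"]]
    if buf["state"] != WRITTEN:
        return None
    image, buf["data"], buf["state"] = buf["data"], None, WRITABLE
    ring["read"] = (ring["read"] + 1) % len(ring["bufs"])
    return image

image_ring, three_d_ring = make_ring(8), make_ring(5)
step_capture(image_ring, "frame0")
step_generate(image_ring, three_d_ring)
print(step_display(three_d_ring))  # -> 3d(frame0)
```

Because each step refuses to run until its buffers are in the right state, a slow stage simply causes the upstream stage to wait, which is the back-pressure behavior the flowcharts describe.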
- a second predetermined number of three-dimensional images are selected from a plurality of three-dimensional image sets across the sets. By doing so, it is possible to display a second predetermined number of images included in different three-dimensional image sets.
- the three-dimensional image can be displayed more appropriately as compared with the case where only the images included in one three-dimensional image set are displayed.
- a 3D image suitable for display can be selected (in other words, excluding a 3D image unsuitable for display) to display the 3D image.
- FIG. 16 is a block diagram showing a functional configuration of a modified example of the information processing apparatus according to the fourth embodiment.
- the same reference numerals are given to the same components as those shown in FIG. 10.
- as processing blocks for realizing its functions, an image acquisition unit 110, a three-dimensional image generation unit 140, a second storage unit 150, a second selection unit 160, and a display unit 170 are provided. That is, the modified example of the information processing apparatus 10 according to the fourth embodiment does not include the first storage unit 120 and the first selection unit 130, as compared with the configuration according to the fourth embodiment (see FIG. 10).
- FIG. 17 is a flowchart showing an operation flow of a modified example of the information processing method according to the fourth embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the image set acquired by the image acquisition unit 110 (step S40). That is, unlike the fourth embodiment described above, the three-dimensional image is generated directly, without the storage by the first storage unit 120 and the selection by the first selection unit 130.
- the second storage unit 150 stores the three-dimensional image set generated by the three-dimensional image generation unit 140 (step S41). Then, the second selection unit 160 selects a second predetermined number of three-dimensional images from the plurality of three-dimensional image sets stored in the second storage unit 150 across the sets (step S42). After that, the display unit 170 outputs a second predetermined number of three-dimensional images selected by the second selection unit 160 as a three-dimensional image for display (step S43).
- a second predetermined number of three-dimensional images is selected from a plurality of three-dimensional image sets across the sets. By doing so, it is possible to display a second predetermined number of images included in different three-dimensional image sets.
- the information processing apparatus 10 according to the fifth embodiment will be described with reference to FIG. It should be noted that the fifth embodiment is different from the fourth embodiment described above only in a part of the operation, and other parts may be the same as the fourth embodiment. Therefore, in the following, the parts different from each of the above-described embodiments will be described in detail, and the description of other overlapping parts will be omitted as appropriate.
- FIG. 18 is a flowchart showing a flow of operation of the information processing apparatus according to the fifth embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the first storage unit 120 stores the image set acquired by the image acquisition unit 110 (step S12).
- the first selection unit 130 selects a first predetermined number of images across the set from the plurality of image sets stored in the first storage unit 120 (step S13). Then, the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the first predetermined number of images selected by the first selection unit 130 (step S14).
- the second storage unit 150 stores the three-dimensional image set generated by the three-dimensional image generation unit 140 (step S41). Then, the second selection unit 160 selects a second predetermined number of three-dimensional images from the plurality of three-dimensional image sets stored in the second storage unit 150 across the sets. At this time, in particular, in the fifth embodiment, the second selection unit 160 selects a second predetermined number of images according to the processing capacity of the display unit 170 (step S51). That is, the second selection unit 160 selects an image according to the subsequent processing capacity. After that, the display unit 170 outputs a second predetermined number of three-dimensional images selected by the second selection unit 160 as a three-dimensional image for display (step S43).
- the above-mentioned "processing capacity of the display unit 170" may be the processing capacity of the display unit 170 determined by its specifications, or may be the processing capacity available at that time in consideration of the computational load of the processing being executed.
- the second selection unit 160 changes, for example, the number of three-dimensional images to be selected (that is, the value of the second predetermined number of images) according to the processing capacity of the display unit 170. In this case, the number of three-dimensional images output by the display unit 170 varies depending on the processing capacity.
- the second selection unit 160 changes the frequency of selecting the three-dimensional image according to the processing capacity of the display unit 170. In this case, since the frequency with which the three-dimensional image is output to the display unit 170 varies, the number of three-dimensional images processed by the display unit 170 per unit time varies.
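Both adjustments can be sketched together as follows (a hypothetical policy; the function name and the scaling rule are assumptions):

```python
def selection_plan(base_count, base_interval_ms, capacity_ratio):
    """Scale how many 3D images are selected and how often they are selected,
    according to the display unit's current processing capacity
    (capacity_ratio = 1.0 means full capacity)."""
    count = max(1, int(base_count * capacity_ratio))           # fewer images when busy
    interval_ms = base_interval_ms / max(capacity_ratio, 0.1)  # select less often when busy
    return count, interval_ms

print(selection_plan(4, 100, 1.0))  # full capacity
print(selection_plan(4, 100, 0.5))  # reduced capacity: fewer, less frequent
```

With half the capacity, the plan halves the number of selected images and doubles the selection interval, so the downstream display processing is not overloaded.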
- a second predetermined number of images are selected according to the processing capacity of the display unit 170.
- the second selection unit 160 reduces the number of images to be selected.
- the display unit 170 having a reduced processing capacity can execute the process of outputting the three-dimensional image without any problem.
- the second selection unit 160 reduces the frequency of selecting the three-dimensional image.
- the display unit 170 having a reduced processing capacity can execute the process of outputting the three-dimensional image without any problem.
- the configuration considering the processing capacity of the display unit 170 of the fifth embodiment may be combined with the configuration considering the processing capacity of the three-dimensional image generation unit 140 described in the second embodiment (see FIG. 5). In this case, since the processing capacities of both the display unit 170 and the three-dimensional image generation unit 140 are taken into consideration, more efficient processing can be realized as compared with the case where the processing capacity of only one of them is taken into consideration.
- the information processing apparatus 10 according to the sixth embodiment will be described with reference to FIGS. 19 and 20.
- the sixth embodiment describes a specific example of the display mode by the display unit 170 included in the fourth and fifth embodiments described above; the other operations and the device configuration may be similar to those of the fourth and fifth embodiments. Therefore, in the following, the parts different from each of the above-described embodiments will be described in detail, and the description of other overlapping parts will be omitted as appropriate.
- FIG. 19 is a diagram showing an example of displaying a list of three-dimensional images by the information processing apparatus according to the sixth embodiment. In the following, the description will proceed assuming that the subject 50 is the face of a person.
- the information processing apparatus 10 can output in a display mode in which a plurality of three-dimensional images are displayed in a list. Specifically, the display unit 170 outputs each of the second predetermined number of three-dimensional images selected by the second selection unit 160 in a format such that they are displayed in a list. As a result, three-dimensional images are displayed in a list on a display device having a display or the like.
- each of the four three-dimensional images is displayed in four areas of upper right, lower right, upper left, and lower left, but they may be listed in a different display mode.
- the list may be displayed in such a way that a plurality of three-dimensional images are arranged in a horizontal row.
- the list may be displayed in such a way that a plurality of three-dimensional images are arranged in a vertical row. It is preferable that the plurality of three-dimensional images displayed in the list are all displayed so as to fit on the screen, but some of them may be off the screen.
- a three-dimensional image that is off the screen may be displayed on the screen by, for example, a scrolling operation or a swiping operation.
- a plurality of three-dimensional images are displayed in the same size.
- a plurality of three-dimensional images may be displayed in different sizes from each other.
- one image may be displayed larger and the other image may be displayed smaller.
- one image may be displayed small and the other image may be displayed large.
- each of a plurality of three-dimensional images may be displayed in a unique size different from that of the other three-dimensional images. For example, there may be a three-dimensional image displayed in the largest size, a three-dimensional image displayed in the second largest size, a three-dimensional image displayed in the third largest size, and a three-dimensional image displayed in the fourth largest size.
- FIG. 20 is a diagram showing an example of a display mode in the information processing apparatus according to the sixth embodiment.
- the information processing apparatus 10 may display a plurality of three-dimensional images so as to rotate.
- the upper left three-dimensional image is displayed so as to rotate, but other three-dimensional images may also be displayed so as to rotate.
- all of the displayed three-dimensional images may be displayed so as to rotate, or only some of them may be. It should be noted that which three-dimensional images rotate may be set in advance by the user or the like, or may be automatically determined according to various parameters of the three-dimensional images.
- the rotation directions thereof may be the same direction or different directions. Further, the rotation speed may be the same or different.
- the rotation direction and rotation speed of each of the three-dimensional images may change automatically midway. That is, a three-dimensional image that has been rotating clockwise may start rotating counterclockwise partway through. Further, a three-dimensional image that has been rotating relatively slowly may start rotating relatively quickly partway through.
- the rotation direction and the rotation speed may be set in advance by the user or the like, or may be automatically determined according to various parameters of the three-dimensional image.
- the information processing apparatus 10 may perform a display that emphasizes the difference between a plurality of three-dimensional images in addition to or instead of the rotation display described above. Specifically, a plurality of displayed three-dimensional images may be compared, and the portion where the difference becomes large may be highlighted. Examples of highlighting include displaying in a different color, shading, and enclosing with a frame. More specifically, for example, when one three-dimensional image is an image of a face captured during a blink (that is, with the eyes closed) and another three-dimensional image is an image of a face that is not blinking (that is, with the eyes open), the area around the eyes may be highlighted in at least one of the three-dimensional images.
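A minimal sketch of that comparison (the per-region feature values and the threshold are assumptions introduced only for illustration):

```python
def regions_to_highlight(image_a, image_b, threshold=0.3):
    """Compare two 3D images region by region and return the regions whose
    difference is large enough to be highlighted."""
    return [region for region in image_a
            if abs(image_a[region] - image_b.get(region, 0.0)) > threshold]

# hypothetical per-region feature values for two face images
eyes_open = {"eyes": 0.9, "mouth": 0.2}
eyes_closed = {"eyes": 0.1, "mouth": 0.2}
print(regions_to_highlight(eyes_open, eyes_closed))  # -> ['eyes']
```

Only the eye region exceeds the difference threshold, so that is the region that would receive the color change, shading, or frame.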
- a plurality of three-dimensional images are displayed in a list. By doing so, it is possible to present more three-dimensional images to the user as compared with the case where only one three-dimensional image is displayed. Further, by comparing a plurality of three-dimensional images side by side, it is possible to easily discriminate between them.
- the information processing apparatus 10 according to the seventh embodiment will be described with reference to FIGS. 21 to 23. It should be noted that the seventh embodiment is different from the above-mentioned fourth to sixth embodiments only in a part of the configuration and operation, and the other parts may be the same as those of the fourth to sixth embodiments. Therefore, in the following, the parts different from each of the above-described embodiments will be described in detail, and the description of other overlapping parts will be omitted as appropriate.
- FIG. 21 is a block diagram showing a functional configuration of the information processing apparatus according to the seventh embodiment.
- the same reference numerals are given to the same components as those shown in FIG. 10.
- the information processing apparatus 10 includes, as processing blocks for realizing its functions, an image acquisition unit 110, a first storage unit 120, a first selection unit 130, a three-dimensional image generation unit 140, a second storage unit 150, a second selection unit 160, a display unit 170, and a score calculation unit 180. That is, the information processing apparatus 10 according to the seventh embodiment further includes the score calculation unit 180 in addition to the configuration according to the fourth embodiment (see FIG. 10). The score calculation unit 180 may be realized by the processor 11 (see FIG. 1) described above.
- the score calculation unit 180 is configured to be able to calculate the score of each of the three-dimensional images generated by the three-dimensional image generation unit 140.
- the "score" here is a value indicating the quality of the three-dimensional image, and is calculated, for example, as a larger value as the quality of the three-dimensional image is higher. More specifically, the score is calculated according to various conditions, such as whether the three-dimensional image is free of blur and whether the target of the three-dimensional image is in a good state (for example, whether the person has his or her eyes closed).
- the above example is just an example, and the score may be calculated using other criteria.
- when the score is calculated using a plurality of conditions, for example, a score may be calculated from each condition one by one, and the average value of those scores may then be calculated and used as the final score.
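A sketch of that averaging (the individual condition scorers and the 0-100 range are hypothetical; the embodiment does not prescribe concrete conditions):

```python
def sharpness_score(image):
    # hypothetical condition: higher when the 3D image is not blurred (0-100)
    return image["sharpness"]

def eyes_open_score(image):
    # hypothetical condition: a face with closed eyes gets a low score
    return 100 if image["eyes_open"] else 0

def quality_score(image):
    """Score each condition one by one, then use the mean as the final score."""
    condition_scores = [sharpness_score(image), eyes_open_score(image)]
    return sum(condition_scores) / len(condition_scores)

face = {"sharpness": 80, "eyes_open": True}
print(quality_score(face))  # -> 90.0
```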
- Information about the score calculated by the score calculation unit 180 is output to the display unit 170.
- the display unit 170 switches the display mode of the three-dimensional image according to the score as described later.
- FIG. 22 is a flowchart showing a flow of operation of the information processing apparatus according to the seventh embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the first storage unit 120 stores the image set acquired by the image acquisition unit 110 (step S12).
- the first selection unit 130 selects a first predetermined number of images across the set from the plurality of image sets stored in the first storage unit 120 (step S13).
- the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the first predetermined number of images selected by the first selection unit 130 (step S14).
- the second storage unit 150 stores the three-dimensional image set generated by the three-dimensional image generation unit 140 (step S41).
- the score calculation unit 180 calculates the score of the three-dimensional image (step S71).
- the score calculation unit 180 outputs the calculated score to the display unit 170.
- the score may be calculated at the timing when the three-dimensional image is generated. Alternatively, the score calculation may be executed at the timing immediately before the three-dimensional image is output from the display unit 170.
- the second selection unit 160 selects a second predetermined number of three-dimensional images from the plurality of three-dimensional image sets stored in the second storage unit 150 (step S42).
- the display unit 170 outputs a second predetermined number of three-dimensional images selected by the second selection unit 160 as a three-dimensional image for display.
- a three-dimensional image is output so as to be displayed in a display mode corresponding to the score calculated by the score calculation unit 180 (step S72).
- the display mode according to the score will be described below with specific examples.
- FIG. 23 is a diagram showing an example of a display mode in the information processing apparatus according to the seventh embodiment.
- the information processing apparatus 10 may enlarge and display the three-dimensional image having the highest score. That is, the three-dimensional image having the highest score may be displayed in a large size, while the other three-dimensional images may be displayed in a small size.
- in the example described above, only the three-dimensional image having the highest score is enlarged and displayed, but a plurality of three-dimensional images having the highest scores may be enlarged and displayed. For example, three three-dimensional images may be selected in descending order of score, and the three selected three-dimensional images may be enlarged and displayed. Further, the enlargement ratio may be changed according to the score.
- the 3D image with the highest score is displayed the largest
- the 3D image with the second highest score is displayed the second largest
- the 3D image with the third highest score is displayed the third largest
- the 3D image with the fourth highest score is displayed the fourth largest
- a three-dimensional image having a low score may be enlarged and displayed.
- the score value of each 3D image may be superimposed and displayed on the 3D image.
- the score value may be displayed for all three-dimensional images, or may be displayed only for the three-dimensional image having the highest score or a plurality of three-dimensional images having the highest score.
- the score may be displayed as a numerical value or may be displayed as a graph or the like.
- the information processing apparatus 10 may, for example, highlight only a three-dimensional image having a high score. For example, the color of the three-dimensional image having the highest score, or of a plurality of three-dimensional images having the highest scores, may be changed, shaded, or surrounded by a frame. When highlighting, only the three-dimensional image having the highest score may be highlighted, or a plurality of three-dimensional images having the highest scores may be highlighted.
- the information processing apparatus 10 may display only those having a high score. For example, only the three-dimensional image whose score is equal to or higher than the predetermined threshold may be displayed, and the three-dimensional image whose score is lower than the predetermined threshold may not be displayed. In this case, the predetermined threshold value may be appropriately set by the user. Further, only a predetermined number of three-dimensional images having a high score may be displayed, and other three-dimensional images may not be displayed. In this case, the predetermined number may be appropriately set by the user.
- the information processing apparatus 10 may rearrange and display three-dimensional images in order of score.
- the three-dimensional images may be displayed by sorting them in descending order of score.
- the three-dimensional image having a lower score may not be displayed. That is, only three-dimensional images having a high score to some extent may be displayed side by side in order of score.
- the three-dimensional images may be displayed by sorting them in ascending order of score. In this case, the three-dimensional image having the higher score may not be displayed. That is, only three-dimensional images having a low score to some extent may be displayed side by side in order of score.
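The score-based filtering and ordering described above can be sketched as follows (the function name, parameters, and the example scores are assumptions):

```python
def images_to_display(scored_images, threshold=None, max_count=None,
                      lowest_first=False):
    """Sort 3D images by score, optionally keep only those at or above a
    threshold, and optionally cap how many are displayed."""
    ordered = sorted(scored_images, key=lambda pair: pair[1],
                     reverse=not lowest_first)
    if threshold is not None:
        ordered = [pair for pair in ordered if pair[1] >= threshold]
    if max_count is not None:
        ordered = ordered[:max_count]
    return [name for name, _ in ordered]

images = [("a", 40), ("b", 90), ("c", 70)]
print(images_to_display(images, threshold=50))  # -> ['b', 'c']
print(images_to_display(images, max_count=2))   # -> ['b', 'c']
```

The same helper covers all the variants mentioned: descending or ascending order, a user-set threshold, and a user-set cap on the number of displayed images.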
- the information processing apparatus 10 may rotate and display only a three-dimensional image having a high score (for the rotation display, refer to the sixth embodiment).
- the three-dimensional image having the highest score may be rotated, or a plurality of three-dimensional images having the highest scores may be rotated.
- the rotation direction and the rotation speed of the three-dimensional image may be changed according to the score. For example, an image with a high score may rotate faster, while a three-dimensional image with a low score may rotate slowly.
- the display mode of the three-dimensional image is changed according to the calculated score. In this way, it is possible to present the three-dimensional image to the user in a more appropriate manner according to the score (in other words, quality) of the three-dimensional image.
- the information processing apparatus 10 according to the eighth embodiment will be described with reference to FIGS. 24 and 25. It should be noted that the eighth embodiment is different from the above-mentioned fourth to seventh embodiments only in a part of the configuration and operation, and the other parts may be the same as those of the fourth to seventh embodiments. Therefore, in the following, the parts different from each of the above-described embodiments will be described in detail, and the description of other overlapping parts will be omitted as appropriate.
- FIG. 24 is a block diagram showing a functional configuration of the information processing apparatus according to the eighth embodiment.
- the same reference numerals are given to the same components as those shown in FIG. 10.
- the information processing apparatus 10 includes, as processing blocks for realizing its functions, an image acquisition unit 110, a first storage unit 120, a first selection unit 130, a three-dimensional image generation unit 140, a second storage unit 150, a second selection unit 160, a display unit 170, and a selection operation detection unit 190. That is, the information processing apparatus 10 according to the eighth embodiment further includes the selection operation detection unit 190 in addition to the configuration according to the fourth embodiment (see FIG. 10).
- the selection operation detection unit 190 may be realized by the processor 11 (see FIG. 1) described above.
- the selection operation detection unit 190 is configured to be able to detect an operation of selecting a three-dimensional image by the user.
- the selection operation detection unit 190 can detect the user's operation by, for example, the input device 15 (see FIG. 1).
- the selection operation detection unit 190 may detect an operation of selecting one three-dimensional image from a plurality of displayed three-dimensional images.
- the selection operation detection unit 190 may detect an operation of selecting two or more three-dimensional images from a plurality of displayed three-dimensional images.
- the selection operation detection unit 190 outputs information for identifying the selected three-dimensional image to the display unit 170.
- the display unit 170 switches the display mode of the three-dimensional image according to the selection operation as described later.
- FIG. 25 is a flowchart showing an operation flow of the information processing apparatus according to the eighth embodiment.
- the same reference numerals are given to the same processes as those shown in FIG.
- the image acquisition unit 110 first acquires an image set including a plurality of images (step S11). Then, the first storage unit 120 stores the image set acquired by the image acquisition unit 110 (step S12).
- the first selection unit 130 selects a first predetermined number of images, across the sets, from the plurality of image sets stored in the first storage unit 120 (step S13). Then, the three-dimensional image generation unit 140 generates a three-dimensional image of the subject by using the first predetermined number of images selected by the first selection unit 130 (step S14).
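The cross-set selection in step S13 can be sketched as follows. This is a minimal illustration assuming one simple policy — take the newest images regardless of which stored set they came from — and the function and variable names are hypothetical; the disclosure does not fix a concrete selection policy.

```python
def select_across_sets(image_sets, first_predetermined_number):
    """Select images across stored sets, newest first (a sketch).

    image_sets: list of image sets, oldest set first; each set is a list
    of images in capture order. Returns the most recent
    `first_predetermined_number` images, which may span set boundaries
    when the newest set alone does not contain enough images.
    """
    selected = []
    # Walk the sets from newest to oldest, taking the newest images first.
    for image_set in reversed(image_sets):
        for image in reversed(image_set):
            selected.append(image)
            if len(selected) == first_predetermined_number:
                return list(reversed(selected))
    return list(reversed(selected))  # fewer images available than requested

# Example: three stored sets of three images each; select five across sets.
sets = [["a1", "a2", "a3"], ["b1", "b2", "b3"], ["c1", "c2", "c3"]]
print(select_across_sets(sets, 5))  # ['b2', 'b3', 'c1', 'c2', 'c3']
```

The returned list crosses the boundary between the second and third sets, which is exactly the "across the sets" behavior the first selection means provides.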
- the second storage unit 150 stores the three-dimensional image set generated by the three-dimensional image generation unit 140 (step S41). Then, the second selection unit 160 selects a second predetermined number of three-dimensional images from the plurality of three-dimensional image sets stored in the second storage unit 150 across the sets (step S42). After that, the display unit 170 outputs a second predetermined number of three-dimensional images selected by the second selection unit 160 as a three-dimensional image for display (step S43).
- the selection operation detection unit 190 detects the selection operation by the user (step S81). Then, the display mode of the three-dimensional image is switched according to the detected selection operation (step S82).
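The two steps above — detecting the selection operation (S81) and switching the display mode (S82) — can be sketched as a simple dispatch. The operation names and the mode mapping below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical table mapping a detected selection operation to a display mode.
DISPLAY_MODES = {
    "single_tap": "enlarge",  # enlarge the selected 3D image
    "double_tap": "rotate",   # display the selected 3D image rotating
    "long_press": "solo",     # display only the selected 3D image
}

def switch_display_mode(selection_operation, selected_ids):
    """Return a display instruction for step S82 given the operation
    detected in step S81 (a sketch; the mapping is an assumption)."""
    mode = DISPLAY_MODES.get(selection_operation, "highlight")
    return {"mode": mode, "targets": list(selected_ids)}

print(switch_display_mode("double_tap", [3]))
# {'mode': 'rotate', 'targets': [3]}
```

Unknown operations fall back to highlighting, so the display unit 170 always receives a well-formed instruction.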
- the display mode may be appropriately switched according to the three-dimensional image selected by the user's operation.
- the display mode in this case may be the same as the display mode described in the seventh embodiment.
- the information processing apparatus 10 may, for example, enlarge and display a three-dimensional image selected by the user.
- the three-dimensional image selected by the user may be highlighted.
- only the three-dimensional image selected by the user may be displayed.
- the three-dimensional image selected by the user may be displayed while rotating.
- the information processing apparatus 10 may display a slide show with a three-dimensional image selected by the user.
- the three-dimensional images may be displayed in the order selected by the user.
- a plurality of two-dimensional images may be displayed for the three-dimensional image selected by the user. For example, if the three-dimensional image is an image of a person's face, two-dimensional images showing the face viewed from the right, the face viewed from the left, the face viewed from above, and the like may be generated based on the selected three-dimensional image and displayed.
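Generating such right, left, and top two-dimensional views can be approximated by rotating the three-dimensional image's point set and projecting it orthographically. The following is a minimal sketch on a toy point cloud, not the apparatus's actual rendering pipeline; the point data and angles are illustrative assumptions.

```python
import math

def rotate_y(points, degrees):
    """Rotate 3D points (x, y, z) about the vertical (y) axis."""
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(c * x + s * z, y, -s * x + c * z) for x, y, z in points]

def rotate_x(points, degrees):
    """Rotate 3D points (x, y, z) about the horizontal (x) axis."""
    t = math.radians(degrees)
    c, s = math.cos(t), math.sin(t)
    return [(x, c * y - s * z, s * y + c * z) for x, y, z in points]

def project(points):
    """Orthographic projection onto the image (x, y) plane."""
    return [(x, y) for x, y, _ in points]

face = [(0.0, 0.0, 1.0), (0.5, 0.2, 0.8)]  # toy point cloud of a face
right_view = project(rotate_y(face, -90))   # face seen from its right
left_view = project(rotate_y(face, 90))     # face seen from its left
top_view = project(rotate_x(face, 90))      # face seen from above
```

Each view is just the same geometry under a different rotation, which is why one selected three-dimensional image suffices to produce all of them.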
- a three-dimensional image may be displayed three-dimensionally by using AR (Augmented Reality), hologram technology, or the like.
- the display mode of the three-dimensional image is changed according to the user's selection operation. In this way, it is possible to present the three-dimensional image in a more appropriate manner according to the user's operation (in other words, reflecting the user's intention).
- the three-dimensional facial shape measuring apparatus can measure the three-dimensional shape of the face of a person, as the subject, by capturing images of the person's face with two cameras, one on the left and one on the right, and combining those images. More specifically, the right camera captures an image of the right side of the face, and the left camera captures an image of the left side of the face.
- the three-dimensional facial shape measuring apparatus may, for example, capture images while a sinusoidal pattern is projected onto the subject and perform measurement using a sinusoidal grating shift method.
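The sinusoidal grating shift (phase-shift) measurement mentioned above recovers, at each pixel, the phase of the projected sinusoid from N images in which the grating is shifted by 2π/N per step; after calibration, that wrapped phase encodes depth. A minimal per-pixel sketch follows; the 4-step variant and the synthetic intensities are assumptions for illustration, not the patent's concrete algorithm.

```python
import math

def wrapped_phase(intensities):
    """Recover the wrapped phase at one pixel from N images of a
    projected sinusoid shifted by 2*pi/N per step (N-step algorithm)."""
    n = len(intensities)
    num = sum(i_k * math.sin(2 * math.pi * k / n)
              for k, i_k in enumerate(intensities))
    den = sum(i_k * math.cos(2 * math.pi * k / n)
              for k, i_k in enumerate(intensities))
    return math.atan2(-num, den)  # wrapped to (-pi, pi]

# Synthetic pixel: I_k = A + B*cos(phase + 2*pi*k/N) with phase = 0.7 rad.
true_phase = 0.7
images = [5.0 + 2.0 * math.cos(true_phase + 2 * math.pi * k / 4)
          for k in range(4)]
print(round(wrapped_phase(images), 6))  # 0.7
```

Because the sums cancel the constant offset A and the modulation B, the recovered phase is insensitive to ambient brightness and pattern contrast, which is one reason phase-shift methods are favored for face-shape measurement.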
- Each embodiment also encompasses a processing method in which a program that operates the configuration of the embodiment so as to realize the functions of the embodiments described above is recorded on a recording medium, and the program recorded on the recording medium is read out as code and executed by a computer. That is, a computer-readable recording medium is also included in the scope of each embodiment. Further, not only the recording medium on which the above-mentioned program is recorded but also the program itself is included in each embodiment.
- the recording medium for example, a floppy (registered trademark) disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a non-volatile memory card, or a ROM can be used.
- not only a program that is recorded on the recording medium and executes processing by itself, but also a program that runs on an OS and executes processing in cooperation with other software or with the functions of an expansion board, is included in the scope of each embodiment.
- This disclosure is not limited to the above embodiment.
- This disclosure can be modified as appropriate within a scope not contrary to the gist or idea of the invention that can be read from the claims and the entire specification, and information processing apparatuses, information processing methods, computer programs, and recording media with such modifications are also included in the technical idea of this disclosure.
- (Supplementary Note 1) The information processing apparatus according to Supplementary Note 1 comprises: acquisition means for acquiring an image set including a plurality of images of a subject; first storage means for storing a plurality of the image sets; first selection means for selecting, from the plurality of image sets, a first predetermined number of images across the sets; and generation means for generating a three-dimensional image of the subject based on the first predetermined number of images.
- (Supplementary Note 2) The information processing apparatus according to Supplementary Note 2 is the information processing apparatus according to Supplementary Note 1, wherein the first selection means selects the first predetermined number of images according to the processing capacity of the generation means.
- (Supplementary Note 3) The information processing apparatus according to Supplementary Note 3 is the information processing apparatus according to Supplementary Note 1 or 2, wherein the plurality of images include an image captured with a sinusoidal grating pattern projected onto the subject, an image captured with a brightness gradient pattern, whose brightness value changes linearly, projected onto the subject, and a texture image showing the state of the surface of the subject.
- (Supplementary Note 4) The information processing apparatus according to Supplementary Note 4 is the information processing apparatus according to any one of Supplementary Notes 1 to 3, further comprising: second storage means for storing a plurality of three-dimensional image sets each including a plurality of the three-dimensional images; second selection means for selecting, from the plurality of three-dimensional image sets, a second predetermined number of the three-dimensional images across the sets; and display means for displaying the second predetermined number of the three-dimensional images.
- (Supplementary Note 5) The information processing apparatus according to Supplementary Note 5 is the information processing apparatus according to Supplementary Note 4, wherein the second selection means selects the second predetermined number of the three-dimensional images according to the processing capacity of the display means.
- (Supplementary Note 6) The information processing apparatus according to Supplementary Note 6 is the information processing apparatus according to Supplementary Note 4 or 5, wherein the display means displays a list of the second predetermined number of the three-dimensional images.
- (Supplementary Note 7) The information processing apparatus according to Supplementary Note 7 is the information processing apparatus according to any one of Supplementary Notes 4 to 6, further comprising calculation means for calculating, for each of the second predetermined number of the three-dimensional images, a score according to a predetermined evaluation criterion, wherein the display means changes the display mode of each of the second predetermined number of the three-dimensional images according to the score.
- (Supplementary Note 8) The information processing apparatus according to Supplementary Note 8 is the information processing apparatus according to any one of Supplementary Notes 4 to 7, further comprising detection means for detecting a selection operation for selecting some of the second predetermined number of the three-dimensional images, wherein the display means changes the display mode of the three-dimensional image selected by the selection operation.
- (Supplementary Note 9) The information processing method according to Supplementary Note 9 comprises: acquiring an image set including a plurality of images of a subject; storing a plurality of the image sets; selecting, from the plurality of image sets, a first predetermined number of images across the sets; and generating a three-dimensional image of the subject based on the first predetermined number of images.
- (Supplementary Note 10) The computer program according to Supplementary Note 10 causes a computer to: acquire an image set including a plurality of images of a subject; store a plurality of the image sets; select, from the plurality of image sets, a first predetermined number of images across the sets; and generate a three-dimensional image of the subject based on the first predetermined number of images.
- (Supplementary Note 11) The recording medium according to Supplementary Note 11 is a recording medium on which the computer program according to Supplementary Note 10 is recorded.
- 10 Information processing apparatus, 11 Processor, 50 Subject, 110 Image acquisition unit, 120 First storage unit, 130 First selection unit, 140 Three-dimensional image generation unit, 150 Second storage unit, 160 Second selection unit, 170 Display unit, 180 Score calculation unit, 190 Selection operation detection unit, 210 Imaging unit, 220 Projection unit
Abstract
Description
An information processing apparatus according to a first embodiment will be described with reference to FIGS. 1 to 4.
First, the hardware configuration of the information processing apparatus according to the first embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the hardware configuration of the information processing apparatus according to the first embodiment.
Next, the functional configuration of the information processing apparatus 10 according to the first embodiment will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the functional configuration of the information processing apparatus according to the first embodiment.
Next, the operation flow of the information processing apparatus 10 according to the first embodiment will be described with reference to FIG. 3. FIG. 3 is a flowchart showing the operation flow of the information processing apparatus according to the first embodiment.
Next, a concrete example of image selection by the first selection unit 130 will be described with reference to FIG. 4. FIG. 4 is a conceptual diagram showing an example of image selection in the information processing apparatus according to the first embodiment.
Next, technical effects obtained by the information processing apparatus 10 according to the first embodiment will be described.
An information processing apparatus 10 according to a second embodiment will be described with reference to FIG. 5. The second embodiment differs from the first embodiment described above only in part of its operation; for example, the hardware configuration (see FIG. 1) and functional configuration (see FIG. 2) of the apparatus may be the same as in the first embodiment. Accordingly, the parts that differ from the first embodiment are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the operation flow of the information processing apparatus according to the second embodiment will be described with reference to FIG. 5. FIG. 5 is a flowchart showing the operation flow of the information processing apparatus according to the second embodiment. In FIG. 5, the same reference numerals are given to the same processes as those shown in FIG. 3.
Next, technical effects obtained by the information processing apparatus 10 according to the second embodiment will be described.
An information processing apparatus 10 according to a third embodiment will be described with reference to FIGS. 6 to 9. The third embodiment presents concrete examples of the images used in the first and second embodiments described above (that is, images of the subject for generating a three-dimensional image); the apparatus configuration, operation flow, and the like may be the same as in the first and second embodiments. Accordingly, the parts that differ from the embodiments already described are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the configuration of the entire imaging system including the information processing apparatus according to the third embodiment will be described with reference to FIG. 6. FIG. 6 is a schematic configuration diagram showing an imaging system to which the information processing apparatus according to the third embodiment is applied.
The sinusoidal pattern projected onto the subject, and a measurement method using the sinusoidal pattern, will be described in detail with reference to FIG. 7. FIG. 7 is a plan view showing an example of the sinusoidal pattern projected onto the subject.
Next, the brightness gradient pattern projected onto the subject, and a measurement method using the brightness gradient pattern, will be described in detail with reference to FIG. 8. FIG. 8 is a plan view showing an example of the brightness gradient pattern projected onto the subject.
A=A″,B=B″ …(10)
Next, the image sets acquired by the information processing apparatus 10 according to the third embodiment will be described with reference to FIG. 9. FIG. 9 is a diagram showing an example of an image set used in the information processing apparatus according to the third embodiment.
Next, technical effects obtained by the information processing apparatus 10 according to the third embodiment will be described.
An information processing apparatus 10 according to a fourth embodiment will be described with reference to FIGS. 10 to 15. The fourth embodiment differs from the first to third embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the first to third embodiments. Accordingly, the parts that differ from the embodiments already described are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the functional configuration of the information processing apparatus 10 according to the fourth embodiment will be described with reference to FIG. 10. FIG. 10 is a block diagram showing the functional configuration of the information processing apparatus according to the fourth embodiment. In FIG. 10, the same reference numerals are given to the same elements as the components shown in FIG. 2.
Next, the operation flow of the information processing apparatus 10 according to the fourth embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart showing the operation flow of the information processing apparatus according to the fourth embodiment. In FIG. 11, the same reference numerals are given to the same processes as those shown in FIG. 3.
Next, the operation of the memories of the information processing apparatus 10 according to the fourth embodiment (that is, the first storage unit 120 and the second storage unit 150) will be described with reference to FIGS. 12 to 15. FIG. 12 is a conceptual diagram showing the configuration of an image ring memory and a three-dimensional-image ring memory. FIG. 13 is a flowchart showing the operation flow of the imaging thread in the information processing apparatus according to the fourth embodiment. FIG. 14 is a flowchart showing the operation flow of the three-dimensional-image generation thread in the information processing apparatus according to the fourth embodiment. FIG. 15 is a flowchart showing the operation flow of the three-dimensional-image display thread in the information processing apparatus according to the fourth embodiment.
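The image ring memory and three-dimensional-image ring memory of FIG. 12 can be understood as fixed-capacity circular buffers in which, once the buffer is full, the newest entry overwrites the oldest. A minimal sketch under that assumption (the capacity and method names are illustrative, not taken from the disclosure):

```python
class RingMemory:
    """Fixed-capacity ring memory: storing beyond capacity overwrites
    the oldest entry, as in the image / 3D-image ring memories."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = []

    def store(self, item):
        if len(self.entries) == self.capacity:
            self.entries.pop(0)  # drop (overwrite) the oldest entry
        self.entries.append(item)

    def newest(self, count):
        """Return up to `count` entries, oldest-to-newest order."""
        return self.entries[-count:]

ring = RingMemory(capacity=3)
for image_set in ["set1", "set2", "set3", "set4"]:
    ring.store(image_set)
print(ring.newest(2))  # ['set3', 'set4']
```

In Python, `collections.deque` with a `maxlen` argument provides this overwrite-oldest behavior out of the box; the explicit class above only makes the mechanism visible.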
Next, technical effects obtained by the information processing apparatus 10 according to the fourth embodiment will be described.
Next, a modification of the information processing apparatus 10 according to the fourth embodiment will be described with reference to FIGS. 16 and 17. The modification differs from the fourth embodiment only in part, and the other parts are generally the same as in the fourth embodiment. Accordingly, the parts that differ from the fourth embodiment are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the functional configuration of the modification of the information processing apparatus 10 according to the fourth embodiment will be described with reference to FIG. 16. FIG. 16 is a block diagram showing the functional configuration of the modification of the information processing apparatus according to the fourth embodiment. In FIG. 16, the same reference numerals are given to the same elements as the components shown in FIG. 10.
Next, the operation flow of the modification of the information processing apparatus 10 according to the fourth embodiment will be described with reference to FIG. 17. FIG. 17 is a flowchart showing the operation flow of the modification of the information processing method according to the fourth embodiment. In FIG. 17, the same reference numerals are given to the same processes as those shown in FIG. 11.
Then, the three-dimensional image generation unit 140 generates a three-dimensional image of the subject using the image set acquired by the image acquisition unit 110 (step S40). That is, unlike the fourth embodiment described above, the three-dimensional image is generated directly, without storage by the first storage unit 120 or selection by the first selection unit 130.
Next, technical effects obtained by the modification of the information processing apparatus 10 according to the fourth embodiment will be described.
An information processing apparatus 10 according to a fifth embodiment will be described with reference to FIG. 18. The fifth embodiment differs from the fourth embodiment described above only in part of its operation, and the other parts may be the same as in the fourth embodiment. Accordingly, the parts that differ from the embodiments already described are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the operation flow of the information processing apparatus 10 according to the fifth embodiment will be described with reference to FIG. 18. FIG. 18 is a flowchart showing the operation flow of the information processing apparatus according to the fifth embodiment. In FIG. 18, the same reference numerals are given to the same processes as those shown in FIG. 11.
Next, technical effects obtained by the information processing apparatus 10 according to the fifth embodiment will be described.
An information processing apparatus 10 according to a sixth embodiment will be described with reference to FIGS. 19 and 20. The sixth embodiment describes concrete examples of the display modes provided by the display unit 170 of the fourth and fifth embodiments described above; the other operations and the apparatus configuration may be the same as in the fourth and fifth embodiments. Accordingly, the parts that differ from the embodiments already described are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, a display mode realized by the information processing apparatus 10 according to the sixth embodiment will be described with reference to FIG. 19. FIG. 19 is a diagram showing an example of a list display of three-dimensional images by the information processing apparatus according to the sixth embodiment. In the following description, the subject 50 is assumed to be a person's face.
Next, another display mode realized by the information processing apparatus 10 according to the sixth embodiment will be described with reference to FIG. 20. FIG. 20 is a diagram showing an example of a display mode in the information processing apparatus according to the sixth embodiment.
Next, technical effects obtained by the information processing apparatus 10 according to the sixth embodiment will be described.
An information processing apparatus 10 according to a seventh embodiment will be described with reference to FIGS. 21 to 23. The seventh embodiment differs from the fourth to sixth embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the fourth to sixth embodiments. Accordingly, the parts that differ from the embodiments already described are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the functional configuration of the information processing apparatus 10 according to the seventh embodiment will be described with reference to FIG. 21. FIG. 21 is a block diagram showing the functional configuration of the information processing apparatus according to the seventh embodiment. In FIG. 21, the same reference numerals are given to the same elements as the components shown in FIG. 10.
Next, the operation flow of the information processing apparatus 10 according to the seventh embodiment will be described with reference to FIG. 22. FIG. 22 is a flowchart showing the operation flow of the information processing apparatus according to the seventh embodiment. In FIG. 22, the same reference numerals are given to the same processes as those shown in FIG. 11.
Next, the display modes realized by the information processing apparatus 10 according to the seventh embodiment will be described concretely with reference to FIG. 23. FIG. 23 is a diagram showing an example of a display mode in the information processing apparatus according to the seventh embodiment.
Next, technical effects obtained by the information processing apparatus 10 according to the seventh embodiment will be described.
An information processing apparatus 10 according to an eighth embodiment will be described with reference to FIGS. 24 and 25. The eighth embodiment differs from the fourth to seventh embodiments described above only in part of its configuration and operation, and the other parts may be the same as in the fourth to seventh embodiments. Accordingly, the parts that differ from the embodiments already described are described in detail below, and descriptions of overlapping parts are omitted as appropriate.
First, the functional configuration of the information processing apparatus 10 according to the eighth embodiment will be described with reference to FIG. 24. FIG. 24 is a block diagram showing the functional configuration of the information processing apparatus according to the eighth embodiment. In FIG. 24, the same reference numerals are given to the same elements as the components shown in FIG. 10.
Next, the operation flow of the information processing apparatus 10 according to the eighth embodiment will be described with reference to FIG. 25. FIG. 25 is a flowchart showing the operation flow of the information processing apparatus according to the eighth embodiment. In FIG. 25, the same reference numerals are given to the same processes as those shown in FIG. 11.
In the information processing apparatus 10 according to the eighth embodiment, the display mode may be switched as appropriate according to the three-dimensional image selected by the user's operation. The display mode in this case may be the same as the display modes described in the seventh embodiment.
Next, technical effects obtained by the information processing apparatus 10 according to the eighth embodiment will be described.
Concrete application examples of the information processing apparatuses of the first to eighth embodiments described above will be described.
Each of the embodiments described above is applicable to a three-dimensional facial shape measuring apparatus that measures the three-dimensional shape of a face. The three-dimensional facial shape measuring apparatus can measure the three-dimensional shape of the face of a person, as the subject, by capturing images of the person's face with two cameras, one on the left and one on the right, and combining those images. More specifically, the right camera captures an image of the right side of the face, and the left camera captures an image of the left side of the face. Then, by combining the right-side facial shape created from the right-side image with the left-side facial shape created from the left-side image, a three-dimensional shape of the person's entire face (for example, up to the ears) is created. The three-dimensional facial shape measuring apparatus may, for example, capture images while a sinusoidal pattern is projected onto the subject and perform measurement using a sinusoidal grating shift method.
The embodiments described above may be further described as in the following supplementary notes, but are not limited thereto.
(Supplementary Note 1) The information processing apparatus according to Supplementary Note 1 comprises: acquisition means for acquiring an image set including a plurality of images of a subject; first storage means for storing a plurality of the image sets; first selection means for selecting, from the plurality of image sets, a first predetermined number of images across the sets; and generation means for generating a three-dimensional image of the subject based on the first predetermined number of images.
(Supplementary Note 2) The information processing apparatus according to Supplementary Note 2 is the information processing apparatus according to Supplementary Note 1, wherein the first selection means selects the first predetermined number of images according to the processing capacity of the generation means.
(Supplementary Note 3) The information processing apparatus according to Supplementary Note 3 is the information processing apparatus according to Supplementary Note 1 or 2, wherein the plurality of images include an image captured with a sinusoidal grating pattern projected onto the subject, an image captured with a brightness gradient pattern, whose brightness value changes linearly, projected onto the subject, and a texture image showing the state of the surface of the subject.
(Supplementary Note 4) The information processing apparatus according to Supplementary Note 4 is the information processing apparatus according to any one of Supplementary Notes 1 to 3, further comprising: second storage means for storing a plurality of three-dimensional image sets each including a plurality of the three-dimensional images; second selection means for selecting, from the plurality of three-dimensional image sets, a second predetermined number of the three-dimensional images across the sets; and display means for displaying the second predetermined number of the three-dimensional images.
(Supplementary Note 5) The information processing apparatus according to Supplementary Note 5 is the information processing apparatus according to Supplementary Note 4, wherein the second selection means selects the second predetermined number of the three-dimensional images according to the processing capacity of the display means.
(Supplementary Note 6) The information processing apparatus according to Supplementary Note 6 is the information processing apparatus according to Supplementary Note 4 or 5, wherein the display means displays a list of the second predetermined number of the three-dimensional images.
(Supplementary Note 7) The information processing apparatus according to Supplementary Note 7 is the information processing apparatus according to any one of Supplementary Notes 4 to 6, further comprising calculation means for calculating, for each of the second predetermined number of the three-dimensional images, a score according to a predetermined evaluation criterion, wherein the display means changes the display mode of each of the second predetermined number of the three-dimensional images according to the score.
(Supplementary Note 8) The information processing apparatus according to Supplementary Note 8 is the information processing apparatus according to any one of Supplementary Notes 4 to 7, further comprising detection means for detecting a selection operation for selecting some of the second predetermined number of the three-dimensional images, wherein the display means changes the display mode of the three-dimensional image selected by the selection operation.
(Supplementary Note 9) The information processing method according to Supplementary Note 9 comprises: acquiring an image set including a plurality of images of a subject; storing a plurality of the image sets; selecting, from the plurality of image sets, a first predetermined number of images across the sets; and generating a three-dimensional image of the subject based on the first predetermined number of images.
(Supplementary Note 10) The computer program according to Supplementary Note 10 causes a computer to: acquire an image set including a plurality of images of a subject; store a plurality of the image sets; select, from the plurality of image sets, a first predetermined number of images across the sets; and generate a three-dimensional image of the subject based on the first predetermined number of images.
(Supplementary Note 11) The recording medium according to Supplementary Note 11 is a recording medium on which the computer program according to Supplementary Note 10 is recorded.
11 Processor
50 Subject
110 Image acquisition unit
120 First storage unit
130 First selection unit
140 Three-dimensional image generation unit
150 Second storage unit
160 Second selection unit
170 Display unit
180 Score calculation unit
190 Selection operation detection unit
210 Imaging unit
220 Projection unit
Claims (10)
- An information processing apparatus comprising:
acquisition means for acquiring an image set including a plurality of images of a subject;
first storage means for storing a plurality of the image sets;
first selection means for selecting, from the plurality of image sets, a first predetermined number of images across the sets; and
generation means for generating a three-dimensional image of the subject based on the first predetermined number of images. - The information processing apparatus according to claim 1, wherein the first selection means selects the first predetermined number of images according to the processing capacity of the generation means.
- The information processing apparatus according to claim 1 or 2, wherein the plurality of images include an image captured with a sinusoidal grating pattern projected onto the subject, an image captured with a brightness gradient pattern, whose brightness value changes linearly, projected onto the subject, and a texture image showing the state of the surface of the subject.
- The information processing apparatus according to any one of claims 1 to 3, further comprising:
second storage means for storing a plurality of three-dimensional image sets each including a plurality of the three-dimensional images;
second selection means for selecting, from the plurality of three-dimensional image sets, a second predetermined number of the three-dimensional images across the sets; and
display means for displaying the second predetermined number of the three-dimensional images. - The information processing apparatus according to claim 4, wherein the second selection means selects the second predetermined number of the three-dimensional images according to the processing capacity of the display means.
- The information processing apparatus according to claim 4 or 5, wherein the display means displays a list of the second predetermined number of the three-dimensional images.
- The information processing apparatus according to any one of claims 4 to 6, further comprising calculation means for calculating, for each of the second predetermined number of the three-dimensional images, a score according to a predetermined evaluation criterion,
wherein the display means changes the display mode of each of the second predetermined number of the three-dimensional images according to the score. - The information processing apparatus according to any one of claims 4 to 7, further comprising detection means for detecting a selection operation for selecting some of the second predetermined number of the three-dimensional images,
wherein the display means changes the display mode of the three-dimensional image selected by the selection operation. - An information processing method comprising:
acquiring an image set including a plurality of images of a subject;
storing a plurality of the image sets;
selecting, from the plurality of image sets, a first predetermined number of images across the sets; and
generating a three-dimensional image of the subject based on the first predetermined number of images. - A recording medium on which a computer program is recorded, the computer program causing a computer to:
acquire an image set including a plurality of images of a subject;
store a plurality of the image sets;
select, from the plurality of image sets, a first predetermined number of images across the sets; and
generate a three-dimensional image of the subject based on the first predetermined number of images.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/038,286 US20230419604A1 (en) | 2020-11-30 | 2021-10-20 | Information processing apparatus, information processing method, and recording medium |
JP2022565117A JPWO2022113583A5 (ja) | 2021-10-20 | Information processing apparatus, information processing method, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020198259 | 2020-11-30 | ||
JP2020-198259 | 2020-11-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022113583A1 true WO2022113583A1 (ja) | 2022-06-02 |
Family
ID=81754223
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/038810 WO2022113583A1 (ja) | 2020-11-30 | 2021-10-20 | Information processing apparatus, information processing method, and recording medium |
Country Status (2)
Country | Link |
---|---|
US (1) | US20230419604A1 (ja) |
WO (1) | WO2022113583A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017037089A (ja) * | 2016-10-14 | 2017-02-16 | 株式会社キーエンス | Shape measuring device |
WO2017154606A1 (ja) * | 2016-03-10 | 2017-09-14 | ソニー株式会社 | Information processing apparatus and information processing method |
WO2018135510A1 (ja) * | 2017-01-19 | 2018-07-26 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Three-dimensional reconstruction method and three-dimensional reconstruction device |
WO2019177066A1 (ja) * | 2018-03-16 | 2019-09-19 | 日本電気株式会社 | Three-dimensional shape measuring device, three-dimensional shape measuring method, program, and recording medium |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022113583A1 (ja) | 2022-06-02 |
US20230419604A1 (en) | 2023-12-28 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21897561; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2022565117; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 18038286; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21897561; Country of ref document: EP; Kind code of ref document: A1