WO2021106961A1 - Image generation device - Google Patents

Image generation device

Info

Publication number
WO2021106961A1
WO2021106961A1 (PCT/JP2020/043904, JP2020043904W)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
unit
trained model
learning
image
Prior art date
Application number
PCT/JP2020/043904
Other languages
English (en)
Japanese (ja)
Inventor
謙祐 横田
杉浦 直樹
Original Assignee
株式会社小糸製作所
Priority date
Filing date
Publication date
Application filed by 株式会社小糸製作所
Priority to JP2021561466A (JPWO2021106961A1)
Publication of WO2021106961A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Definitions

  • The present invention relates to an image generation device.
  • Patent Document 1 discloses an image generation device that generates intermediate image data by a morphing process.
  • In the morphing process, however, no difficulty of prediction is set for the new image data generated from the first image data and the second image data. There is therefore a concern that the new image data can easily be predicted from the first image data and the second image data. Further, since the generated data are intermediate between the first image data and the second image data, there is a concern that the new image data merely appear to be recalled from one of them. What is required is to generate, after setting a difficulty of prediction, image data having new design properties that are not easy to predict from the first image data and the second image data.
  • Accordingly, an object of the present invention is to provide an image generation device capable of setting a difficulty of prediction and then generating image data having new design properties that are not easy to predict from a plurality of image data.
  • The image generation device of the present invention includes: a recording unit that records a plurality of first image data and a plurality of second image data; a setting unit that sets a first weight for a first domain of the first image data and a second weight for a second domain of the second image data; a learning unit that generates, from the plurality of first image data and the plurality of second image data, a trained model corresponding to the first weight and the second weight, for each pair of values of the first weight and the second weight; a trained model storage unit that stores the plurality of trained models; a trained model selection unit that selects one trained model from the plurality of trained models stored in the trained model storage unit; a test image data input unit that inputs test image data; and an image data generation unit that generates new image data from the input test image data using the trained model selected by the trained model selection unit.
  • With this configuration, the values of the first weight and the second weight, which indicate the difficulty of prediction, can be set; a trained model is generated for each pair of weight values; and new image data are generated from the test image data using one of the plurality of trained models.
  • The degree of conversion of the test image data changes according to the trained model, that is, according to the values of the first weight and the second weight.
  • The new image data can therefore be image data that are difficult to predict from the first image data and the second image data, and can have new design properties. The image generation device can thus, after setting the difficulty of prediction, generate new image data having new design properties that are not easy to predict from the plurality of first and second image data.
  • The learning unit may generate the trained models according to the CycleGAN method.
  • The learning unit may generate each trained model by performing the calculation used in the CycleGAN method the number of learning iterations set for that model.
  • The image generation device of the present invention may further include an output unit that outputs the new image data.
  • According to the present invention, it is possible to provide an image generation device capable of setting the difficulty of prediction and then generating image data having new design properties that are not easy to predict from a plurality of image data.
  • FIG. 1 is a block diagram of an image generation device according to an embodiment of the present invention.
  • FIG. 2 is a diagram for explaining the third term in the formula of the loss function of the learning unit.
  • FIG. 3 is a flowchart showing the trained model generation steps.
  • FIG. 4 is a flowchart showing the learning process in the trained model generation step.
  • FIG. 5 is a flowchart showing an image generation step.
  • FIG. 1 is a block diagram of the image generation device 10 according to the present embodiment.
  • The image generation device 10 generates a trained model from a plurality of image data according to the CycleGAN method, a generative adversarial network (GAN) technique, and generates new image data using the generated trained model.
  • FIG. 1 shows an example of generating a trained model from the respective domains of two sets of image data.
  • Here, a domain indicates a feature of the image data.
  • The image generation device 10 includes a recording unit 21, a first image data input unit 23, a second image data input unit 25, a weight input unit 27, a learning count input unit 29, a control unit 40 including a learning unit 41 and an image data generation unit 47, a trained model storage unit 51, a test image data input unit 53, a trained model selection unit 55, and an image output unit 57.
  • Each block of the image generation device 10 may be implemented by hardware, by software, or by a combination of the two.
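  As a rough structural sketch (all class and field names below are illustrative, not taken from the patent), the units of FIG. 1 could be wired together as follows:

```python
from dataclasses import dataclass, field

@dataclass
class RecordingUnit:
    """Recording unit 21: holds the first, second, and test image data."""
    first_images: list = field(default_factory=list)
    second_images: list = field(default_factory=list)
    test_images: list = field(default_factory=list)

@dataclass
class LearningUnit:
    """Learning unit 41 (inside control unit 40); the setting unit 43 would
    populate the weights and the learning count."""
    weights: tuple = (1.0, 1.0)   # (lambda_A, lambda_B)
    learning_count: int = 100

@dataclass
class ImageGenerationDevice:
    """Image generation device 10, composing the units of FIG. 1."""
    recording: RecordingUnit = field(default_factory=RecordingUnit)
    learning: LearningUnit = field(default_factory=LearningUnit)
    trained_models: dict = field(default_factory=dict)  # trained model storage unit 51

device = ImageGenerationDevice()
device.recording.first_images.append("cat_eye_001")   # placeholder image data
print(device.learning.learning_count, len(device.recording.first_images))
```

  This is only a wiring sketch; the input units (keyboard and mouse in the embodiment) and the actual model training are omitted.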
  • The recording unit 21 records a plurality of first image data and a plurality of second image data.
  • The first image data are data whose appearance is similar to one another when each is output as an image (for example, a still image); the same applies to the second image data.
  • In this embodiment, each first image data is image data showing a cat's eyes; that is, the first image data are classified into the same category, such as cat's eyes.
  • Each second image data is image data showing a vehicle headlight; the second image data are classified into the same category, such as vehicle headlights, a category different from that of the first image data.
  • The plurality of first image data have a first domain, and the plurality of second image data have a second domain.
  • Since the first image data are image data showing a cat's eyes, the first domain indicates, for example, the size and shape of the eyes; since the second image data are image data showing a vehicle headlight, the second domain indicates, for example, the size and shape of the headlight.
  • the recording unit 21 records training image data and test image data, which will be described later.
  • the recording unit 21 is, for example, a memory.
  • The first image data input unit 23 inputs, to the learning unit 41 of the control unit 40, an instruction that causes the learning unit 41 to read the plurality of first image data recorded in the recording unit 21.
  • The second image data input unit 25 likewise inputs an instruction that causes the learning unit 41 to read the plurality of second image data recorded in the recording unit 21.
  • The weight input unit 27 inputs, to the setting unit 43 (described later) of the learning unit 41, the weight λA of the first domain of the first image data and the weight λB of the second domain of the second image data, both used during learning by the learning unit 41.
  • The weights λA and λB indicate the difficulty of prediction described later, and their values can be set as appropriate by the user.
  • The learning count input unit 29 inputs the number of learning iterations of the learning unit 41 (described later) to the setting unit 43.
  • The first image data input unit 23, the second image data input unit 25, the weight input unit 27, and the learning count input unit 29 are input devices such as a keyboard and a mouse.
  • the control unit 40 includes a CPU (Central Processing Unit) and a memory.
  • the control unit 40 comprehensively controls the operation of the image generation device 10 by reading and executing the control program recorded in the memory by the CPU.
  • The learning unit 41 has a setting unit 43.
  • The setting unit 43 sets the weights λA and λB input from the weight input unit 27 and passes them to the generation unit 45 and the identification unit 46 (described later) of the learning unit 41. The setting unit 43 also sets the learning count input from the learning count input unit 29 and passes it to the generation unit 45 and the identification unit 46.
  • The learning unit 41 has a generation unit 45 and an identification unit 46, which together form a neural network for machine learning.
  • Next, the fake image data and the training image data used by the generation unit 45 and the identification unit 46 will be described.
  • The learning of the learning unit 41 is carried out mainly by the generation unit 45 and the identification unit 46 together.
  • The fake image data are fake data obtained by converting certain image data so as to approximate the training image data; the training image data are real data that serve as the reference for improving the accuracy of the fake image data. Approximation here refers to the appearance when the image data are output as images (for example, still images).
  • When the training image data are the second image data, the fake image data are fake second image data obtained by converting the first image data so as to approximate the second image data; conversely, when the training image data are the first image data, the fake image data are fake first image data obtained by converting the second image data so as to approximate the first image data.
  • The generation unit 45 reads image data from the recording unit 21, converts them, and generates fake image data, which are input to the identification unit 46.
  • The identification unit 46 discriminates between the fake image data input from the generation unit 45 and the training image data read from the recording unit 21, calculates information on the deviation between them, and outputs that information to the generation unit 45.
  • Based on the information from the identification unit 46, the generation unit 45 then reads different image data from the recording unit 21 and converts them to generate further fake image data, which the identification unit 46 again discriminates from the training image data.
  • The generation unit 45 and the identification unit 46 thus alternately compete with each other, and as a result both deepen their learning; eventually the generation unit 45 can generate fake image data close to the training image data.
  • When the learning is completed, the identification unit 46 no longer outputs the information to the generation unit 45, and the generation unit 45 no longer generates fake image data.
  • In the CycleGAN method, the generation unit 45 and the identification unit 46 both generate and identify image data of both kinds, the first image data and the second image data; this point is described below.
  • The CycleGAN method is represented by the loss function of the following equation (1).
  • In the CycleGAN method, a special loss function called the "cycle loss", which is the third term, is added to the adversarial loss functions of the first and second terms.
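  The formula itself did not survive extraction of this text. Judging from the description of its three terms, equation (1) presumably takes the standard CycleGAN form (a reconstruction, not the patent's verbatim notation):

```latex
\mathcal{L}(G, F, D_X, D_Y)
  = \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y)
  + \mathcal{L}_{\mathrm{GAN}}(F, D_X, Y, X)
  + \mathcal{L}_{\mathrm{cyc}}(G, F)
\tag{1}
```

  where the first two terms are the adversarial losses described next, and the third term is the cycle loss of equation (2).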
  • The first term of equation (1) is the loss function for converting the first image data into fake image data that approximate the second image data.
  • Here, X denotes the first image data, Y denotes the second image data, G denotes the generation unit 45 that generates fake image data from the first image data, and D_Y denotes the identification unit 46 that distinguishes the training image data from the fake image data.
  • The second term of equation (1) is the loss function for converting the second image data into fake image data that approximate the first image data.
  • Here, Y denotes the second image data, X denotes the first image data, F denotes the generation unit 45 that generates fake image data from the second image data, and D_X denotes the identification unit 46 that distinguishes the training image data from the fake image data.
  • In the third term, the weights λA and λB set by the setting unit 43 are multiplied in as coefficients.
  • The third term of equation (1) is represented by the following equation (2), which is described below with reference to FIG. 2.
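  Equation (2) is likewise missing from this text. Given that λA and λB enter as coefficients on the reconstruction differences (real_X versus rec_X, and real_Y versus rec_Y), it presumably takes the standard cycle-consistency form:

```latex
\mathcal{L}_{\mathrm{cyc}}(G, F)
  = \lambda_A \,\mathbb{E}_{x \sim p(X)}\!\left[\lVert F(G(x)) - x \rVert_1\right]
  + \lambda_B \,\mathbb{E}_{y \sim p(Y)}\!\left[\lVert G(F(y)) - y \rVert_1\right]
\tag{2}
```

  In the notation of FIG. 2, rec_X = F(G(real_X)) and rec_Y = G(F(real_Y)).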
  • The generation unit 45 reduces, with the weight λA, the difference between the first image data real_X and the restored first image data rec_X, thereby suppressing excessive conversion from the first image data real_X to the fake image data fake_Y.
  • Because λA is multiplied in as a coefficient, the smaller λA is, the weaker this suppression becomes, the greater the degree of conversion of the first image data real_X, and the more dynamically real_X is converted. The fake image data fake_Y generated by the conversion then no longer closely resemble the second image data serving as the training image data, and become new image data with a new design that is not easy to predict from the first image data real_X.
  • In other words, the smaller λA is, the more difficult the prediction, and the harder the fake image data fake_Y are to predict from the first image data real_X.
  • The identification unit 46 denoted D_Y in FIG. 2 discriminates between the second image data, which are the training image data, and the fake image data fake_Y generated by the generation unit 45.
  • Similarly, the generation unit 45 reduces, with the weight λB, the difference between the second image data real_Y and the restored second image data rec_Y, thereby suppressing excessive conversion from the second image data real_Y to the fake image data fake_X.
  • The smaller λB is, the weaker this suppression becomes and the more dynamically real_Y is converted; the fake image data fake_X generated by the conversion then no longer closely resemble the first image data serving as the training image data, and become new image data with a new design that is not easy to predict from the second image data real_Y.
  • In other words, the smaller λB is, the more difficult the prediction, and the harder the fake image data fake_X are to predict from the second image data real_Y.
  • The identification unit 46 denoted D_X in FIG. 2 discriminates between the first image data, which are the training image data, and the fake image data fake_X generated by the generation unit 45.
  • The learning process, a calculation using the loss function of equation (1), is performed by the learning unit 41 the number of times set by the setting unit 43, whereby one trained model for the weights λA and λB is constructed.
  • A trained model is thus constructed for each pair of values of the weights λA and λB.
  • In this embodiment, first, second, and third trained models are constructed.
  • The first trained model is constructed with a weight λA1 and a weight λB1 smaller than λA1; the second trained model with a weight λA2 and a weight λB2 equal to λA2; and the third trained model with a weight λA3 and a weight λB3 larger than λA3.
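  The effect of these weight settings can be illustrated with a small numerical sketch (the weight values below are hypothetical, chosen only to mirror the three orderings λA1 > λB1, λA2 = λB2, λA3 < λB3): in the equation-(2) cycle loss, the penalty for a fixed deviation from a domain is proportional to that domain's weight, so a smaller weight leaves conversion away from that domain freer.

```python
deviation = 0.3  # hypothetical reconstruction-error magnitude, identical for both domains

# (lambda_A, lambda_B) pairs mirroring the first, second, and third trained models
weight_pairs = {"first": (10.0, 1.0), "second": (5.0, 5.0), "third": (1.0, 10.0)}

penalties = {}
for name, (lam_a, lam_b) in weight_pairs.items():
    # penalty for deviating from the first (X) domain and from the second (Y) domain
    penalties[name] = (lam_a * deviation, lam_b * deviation)
    print(name, penalties[name])
```

  Under the first model the X-domain penalty dominates, so generated images stay near the first domain; the third model is the mirror case, and the second treats both domains equally.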
  • The trained model storage unit 51 stores each trained model constructed as described above as independent data; a trained model is input to the trained model storage unit 51 each time the learning unit 41 completes one.
  • the trained model storage unit 51 is, for example, a memory.
  • The test image data input unit 53 inputs, to the image data generation unit 47, an instruction that causes the image data generation unit 47 to read the test image data recorded in the recording unit 21.
  • The test image data are the image data used when the image data generation unit 47 generates new image data: for example, image data showing a cat's eyes, like the first image data, or image data showing a vehicle headlight, like the second image data.
  • the trained model selection unit 55 selects a trained model from the trained model storage unit 51, and inputs an instruction to the image data generation unit 47 to read the selected trained model into the image data generation unit 47.
  • The test image data input unit 53 and the trained model selection unit 55 are input devices such as a keyboard and a mouse.
  • The image data generation unit 47 accesses the trained model storage unit 51 according to the instruction from the trained model selection unit 55, and reads the trained model selected by the trained model selection unit 55 from the trained model storage unit 51. The image data generation unit 47 then generates new image data from the test image data using the read trained model, and the generated new image data are input to the image output unit 57.
  • the image output unit 57 is, for example, a monitor.
  • the image output unit 57 outputs new image data generated by the image data generation unit 47 as an image.
  • the operation of the image generation device 10 includes a trained model generation step and an image generation step as main steps.
  • FIG. 3 is a flowchart showing the trained model generation steps.
  • Step S1 In this step, the first image data input unit 23 inputs an instruction that causes the learning unit 41 to read the plurality of first image data, and the learning unit 41 reads them from the recording unit 21. Likewise, the second image data input unit 25 inputs an instruction that causes the learning unit 41 to read the plurality of second image data, and the learning unit 41 reads them from the recording unit 21.
  • The process then proceeds to step S2.
  • Step S2 In this step, the weight input unit 27 inputs the weights λA1 and λB1 to the setting unit 43, and the setting unit 43 sets them as the weights λA and λB.
  • The set weights λA1 and λB1 are input to the generation unit 45 and the identification unit 46, and the process proceeds to step S3.
  • Step S3 In this step, the learning count input unit 29 inputs the learning count of the learning unit 41 to the setting unit 43, and the setting unit 43 sets it.
  • The set learning count is input to the generation unit 45 and the identification unit 46, and the process proceeds to step S4.
  • In this embodiment, the learning count is set to, for example, 100.
  • Step S4 In this step, the learning unit 41 checks the current learning count. If the count is less than 100, the process proceeds to step S5; otherwise, the process proceeds to step S7.
  • When the trained model generation step starts and the process first reaches step S4, the learning count is 0.
  • Step S5 In this step, the learning unit 41 shifts to the learning process described later. When the learning process is completed, the process proceeds to step S6.
  • Step S6 In this step, the learning unit 41 adds one to the current number of learnings, and the process returns to step S4.
  • Step S7 In this step, the first trained model, corresponding to the weights λA1 and λB1 set in step S2, is completed after 100 learning processes, and the completed first trained model is stored in the trained model storage unit 51.
  • FIG. 4 is a flowchart showing the learning process of the learning unit 41.
  • Step S11 In this step, the learning unit 41 assigns an order i to each of the first image data and each of the second image data read from the recording unit 21 in step S1. As described above, since there are 14,000 first image data and 14,000 second image data, the order i ranges from 1 to 14,000. When the order has been assigned, the process proceeds to step S12.
  • Step S12 In this step, the learning unit 41 checks the order i of the first image data and the second image data to be learned. If i is less than 14,000, the process proceeds to step S13. Otherwise, it is judged that the learning process has been performed on all of the first image data and all of the second image data with the weights λA1 and λB1 set in step S2, and the process proceeds to step S6.
  • Step S13 In this step, the learning unit 41 acquires the i-th first image data and the i-th second image data, and the process proceeds to step S14.
  • At the start, i is set to 1.
  • Step S14 In this step, the learning unit 41 performs the calculation using the loss function of equation (1) on the i-th first image data and the i-th second image data, and the process proceeds to step S15.
  • Step S15 In this step, learning is performed in the learning unit 41, and the process proceeds to step S16.
  • Step S16 In this step, the learning unit 41 adds one to the current order i, and the process returns to step S12.
  • In this way, the learning process, the calculation using the loss function of equation (1) on the 1st to 14,000th first image data and second image data, is performed 100 times, and the first trained model corresponding to the weights λA1 and λB1 is completed.
  • The second trained model, corresponding to the weights λA2 and λB2, is generated by 100 learning processes in the same manner as the first trained model, as is the third trained model, corresponding to the weights λA3 and λB3.
  • A trained model is thus generated for each pair of values of the weights λA and λB set in step S2, and each generated trained model is stored in the trained model storage unit 51.
  • When every trained model has been stored in the trained model storage unit 51, the process in the trained model generation step ends.
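  The control flow of FIGS. 3 and 4 can be sketched as follows. This is a structural sketch only: the CycleGAN update of equation (1) is stubbed out as a counter, and the counts are reduced from the embodiment's 100 iterations over 14,000 image pairs so that the sketch runs instantly.

```python
def learning_process(first_images, second_images, lam_a, lam_b):
    """FIG. 4 (steps S11-S16): one pass over all (first, second) image pairs.
    A real implementation would perform the loss-(1) update for each pair."""
    updates = 0
    for x, y in zip(first_images, second_images):  # order i = 1 .. N
        updates += 1                               # stand-in for steps S14-S15
    return updates

def generate_trained_model(first_images, second_images, lam_a, lam_b, learning_count):
    """FIG. 3 (steps S4-S7): repeat the learning process learning_count times."""
    total_updates = 0
    for _ in range(learning_count):                # embodiment: learning_count = 100
        total_updates += learning_process(first_images, second_images, lam_a, lam_b)
    return {"weights": (lam_a, lam_b), "updates": total_updates}  # stand-in model

# Toy data: 5 image pairs instead of the embodiment's 14,000.
first, second = list(range(5)), list(range(5))
storage = {}                                       # trained model storage unit 51
for lam_a, lam_b in [(10.0, 1.0), (5.0, 5.0), (1.0, 10.0)]:
    storage[(lam_a, lam_b)] = generate_trained_model(
        first, second, lam_a, lam_b, learning_count=3)

print(len(storage), storage[(5.0, 5.0)]["updates"])
```

  One model is stored per weight pair, exactly as one trained model is stored in the storage unit 51 per (λA, λB) setting in the embodiment.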
  • FIG. 5 is a flowchart showing an image generation step.
  • the image generation step is performed after a plurality of trained models have been constructed by the trained model generation step.
  • Step S21 In this step, the test image data input unit 53 causes the test image data to be input from the recording unit 21 to the image data generation unit 47. Further, the trained model selected by the trained model selection unit 55 is input from the trained model storage unit 51 to the image data generation unit 47.
  • In this embodiment, the test image data are image data showing a cat's eyes, like the first image data, and are classified into the same category as the first image data; the training image data are accordingly the second image data.
  • Step S22 In this step, the image data generation unit 47 checks which trained model has been input to it. If the input trained model is the first trained model, the process proceeds to step S23; if it is the second trained model, to step S24; and if it is the third trained model, to step S25.
  • Step S23 In this step, the image data generation unit 47 generates new image data from the test image data using the first trained model.
  • In the first trained model, the weight λA1 is larger than the weight λB1, so conversion of the test image data is strongly suppressed, and the new image data are closer to the first domain than to the second domain; they are image data with a new design that is not easy to predict from the test image data.
  • Specifically, the new image data generated in this step are image data showing the vehicle headlight that most closely resembles the cat's eyes. The new image data are then input to the image output unit 57, and the process proceeds to step S26.
  • Step S24 In this step, the image data generation unit 47 generates new image data from the test image data using the second trained model.
  • In the second trained model, since the weight λA2 is equal to the weight λB2, the new image data are intermediate image data between the cat's eyes and the vehicle headlight. The new image data are then input to the image output unit 57, and the process proceeds to step S26.
  • Step S25 In this step, the image data generation unit 47 generates new image data from the test image data using the third trained model.
  • In the third trained model, the weight λA3 is smaller than the weight λB3, so the degree of conversion of the test image data becomes larger, and the new image data are closer to the second domain than to the first domain; they are image data with a new design that is difficult to predict from the test image data.
  • Specifically, the new image data generated in this step are image data showing a vehicle headlight that approximates the cat's eyes. The new image data are then input to the image output unit 57, and the process proceeds to step S26.
  • Step S26 In this step, the image output unit 57 outputs the new image data as an image, and the process in the image generation step ends.
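  The image generation step of FIG. 5 can be sketched in the same stub style. The stored "models" here are placeholder functions whose outputs merely label the behavior described in steps S23 to S25; a real trained model would be the trained generation unit itself.

```python
# Stand-in trained models keyed by name; each maps test image data to a label
# describing the generated image, mirroring steps S23, S24, and S25.
trained_model_storage = {
    "first":  lambda img: f"headlight most resembling {img}",
    "second": lambda img: f"intermediate between {img} and a headlight",
    "third":  lambda img: f"headlight loosely approximating {img}",
}

def image_generation_step(selection, test_image):
    model = trained_model_storage[selection]  # S21-S22: select and load the model
    new_image = model(test_image)             # S23/S24/S25: generate new image data
    return new_image                          # S26: output via image output unit 57

print(image_generation_step("second", "cat's eyes"))
```

  Selecting a different key corresponds to the trained model selection unit 55 choosing a model trained with a different (λA, λB) pair.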
  • As described above, the image generation device 10 of the present embodiment includes a recording unit 21 that records a plurality of first image data and a plurality of second image data, and a setting unit 43 that sets a first weight for the first domain of the first image data and a second weight for the second domain of the second image data.
  • The image generation device 10 further includes a learning unit 41 that generates, from the plurality of first image data and the plurality of second image data, a trained model corresponding to the first weight and the second weight for each pair of weight values, a trained model storage unit 51 that stores the plurality of trained models, and a trained model selection unit 55 that selects one trained model from the plurality of trained models stored in the trained model storage unit 51.
  • The image generation device 10 further includes a test image data input unit 53 for inputting test image data, and an image data generation unit 47 that generates new image data from the input test image data using the trained model selected by the trained model selection unit 55.
  • With this configuration, the values of the weights λA and λB, which indicate the difficulty of prediction, can be set, a trained model is generated for each pair of weight values, and new image data are generated from the test image data using one of the plurality of trained models.
  • The degree of conversion of the test image data changes according to the trained model, that is, according to the values of the weights λA and λB.
  • The new image data can therefore be image data that are difficult to predict from the first image data and the second image data serving as training image data, and can have new design properties. The image generation device 10 of the present embodiment can thus, after setting the difficulty of prediction, generate new image data with new design properties that are not easy to predict from the plurality of first and second image data.
  • In the present embodiment, the setting unit 43 sets a plurality of weights λA and a plurality of weights λB, and the learning unit 41 generates a plurality of trained models corresponding to them.
  • The image data generation unit 47 generates new image data using one trained model from among the plurality of trained models. By generating a plurality of trained models, more varied new image data can be generated than when only one trained model is generated.
  • The image data generation unit 47 may be the generation unit 45 trained in the learning unit 41; that is, the learning unit 41 may provide the generation unit 45 trained in the learning process to the image data generation unit 47.
  • The learning count set in step S3 may be set separately for each trained model to be constructed. For example, the learning count used to construct the first trained model may be the same as, greater than, or smaller than the counts used to construct the other trained models. The larger the learning count of a trained model, the more easily new image data with a new design that is not easy to predict can be generated from the plurality of first and second image data; the smaller the learning count, the faster the trained model can be generated. Further, although the learning unit 41 generates three trained models in this embodiment, it is not limited to this and may generate at least one trained model.
  • The learning unit 41 generates trained models according to the CycleGAN method, but it is not limited to this.
  • The setting unit 43 sets the specific gravities input from the specific gravity input unit 27 as the specific gravities ⁇A and ⁇B, but it is not limited to this.
  • The setting unit 43 may instead set specific gravities preset in the memory of the control unit 40 as the specific gravities ⁇A and ⁇B.
  • The setting unit 43 sets the value input from the learning number input unit 29 as the number of learning iterations, but it is not limited to this.
  • The setting unit 43 may instead set a value preset in the memory of the control unit 40 as the number of learning iterations.
  • Each piece of first image data is described as image data showing the eyes of a cat, but it is not limited to this; image data showing the eyes of other animals may be used.
  • As described above, an image generation device is provided that can generate image data having a new design that is not easy to predict from a plurality of image data, after setting the difficulty of prediction; this image generation device can be used in the field of image generation and the like.
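The flow described above — a setting unit that registers several specific-gravity pairs, a learning unit that builds one trained model per pair, a storage/selection unit, and an image data generation unit that applies the selected model to test image data — can be sketched as follows. This is a hypothetical illustration, not the patented implementation: all class and function names are assumptions, and the "trained model" is a stand-in blend rather than an actual CycleGAN generator.

```python
# Hypothetical sketch of the units described in the patent text.
# A real learning unit 41 would train CycleGAN generator/discriminator
# pairs; here the "model" is a pixel-wise blend weighted by the
# specific-gravity pair, mimicking how the pair biases the output
# toward either image domain.

class TrainedModelStore:
    """Stores one trained model per specific-gravity pair (storage unit 51)."""
    def __init__(self):
        self._models = {}

    def add(self, gravity_pair, model):
        self._models[gravity_pair] = model

    def select(self, gravity_pair):
        # Trained model selection unit 55: pick one among the plurality.
        return self._models[gravity_pair]


def train_stub_model(first_images, second_images, gravity_pair, n_iterations):
    """Stand-in for the learning unit 41 (n_iterations unused in this stub)."""
    w_a, w_b = gravity_pair
    # Mean image of the second domain, computed pixel-wise.
    mean_b = [sum(px) / len(second_images) for px in zip(*second_images)]

    def generate(test_image):
        # Image data generation unit 47: map test data toward domain B,
        # weighted by the specific-gravity pair.
        total = w_a + w_b
        return [(w_a * p + w_b * m) / total for p, m in zip(test_image, mean_b)]

    return generate


# Usage: two gravity pairs -> two trained models -> select one and generate.
first_domain = [[0.0, 0.0, 0.0]]       # e.g. flattened cat-eye images
second_domain = [[1.0, 1.0, 1.0]]      # e.g. flattened second-domain images

store = TrainedModelStore()
for pair in [(0.9, 0.1), (0.5, 0.5)]:  # setting unit 43
    store.add(pair, train_stub_model(first_domain, second_domain, pair, 100))

model = store.select((0.5, 0.5))
new_image = model([0.0, 0.0, 0.0])
print(new_image)                       # prints [0.5, 0.5, 0.5]
```

Because each gravity pair yields its own model, changing the selected pair changes how strongly the output is pulled toward the second domain, which parallels the patent's point that a plurality of trained models yields more varied new image data.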

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)

Abstract

Image generation device (10) comprising: a storage unit (21); a setting unit (43) that sets a first specific gravity for a first domain of first image data and a second specific gravity for a second domain of second image data; a learning unit (41) that, for each first specific gravity and each second specific gravity, generates, from a plurality of pieces of first image data and a plurality of pieces of second image data, a trained model corresponding to the first specific gravity and the second specific gravity; a trained model storage unit (51); and a trained model selection unit (55). The image generation device (10) further comprises a test image data input unit (53) and an image data generation unit (47) that uses a selected trained model to generate new image data from test image data.
PCT/JP2020/043904 2019-11-27 2020-11-25 Image generation device WO2021106961A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2021561466A JPWO2021106961A1 (fr) 2019-11-27 2020-11-25

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-214201 2019-11-27
JP2019214201 2019-11-27

Publications (1)

Publication Number Publication Date
WO2021106961A1 true WO2021106961A1 (fr) 2021-06-03

Family

ID=76129498

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/043904 WO2021106961A1 (fr) 2019-11-27 2020-11-25 Image generation device

Country Status (2)

Country Link
JP (1) JPWO2021106961A1 (fr)
WO (1) WO2021106961A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019093126A (ja) * 2017-11-24 2019-06-20 Canon Medical Systems Corporation Medical data processing device, magnetic resonance imaging device, and trained model generation method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019093126A (ja) * 2017-11-24 2019-06-20 Canon Medical Systems Corporation Medical data processing device, magnetic resonance imaging device, and trained model generation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SUGIURA, AKIHIKO: "A simple test of pediatric depression using facial expressions cognition", vol. 20, no. 6, June 2009 (2009-06-01), pages 8-31 *

Also Published As

Publication number Publication date
JPWO2021106961A1 (fr) 2021-06-03

Similar Documents

Publication Publication Date Title
KR102115534B1 (ko) Anomaly detection method, apparatus, and system using GAN (Generative Adversarial Networks)
JP7095599B2 (ja) Dictionary learning device, dictionary learning method, data recognition method, and computer program
KR102042168B1 (ko) Text-to-video generation method and apparatus based on time-series adversarial neural networks
US20220215267A1 Processes and methods for enabling artificial general intelligence capable of flexible calculation, prediction, planning and problem solving with arbitrary and unstructured data inputs and outputs
JP7058202B2 (ja) Information processing method and information processing system
JP2007265345A (ja) Information processing device and method, learning device and method, and program
CN116912629B (zh) General image caption generation method based on multi-task learning and related device
JPWO2020165935A1 (ja) Model construction device, model construction method, computer program, and recording medium
WO2021106961A1 (fr) Image generation device
US11568303B2 Electronic apparatus and control method thereof
JP6622369B1 (ja) Method, computer, and program for generating training data
KR20200058297A (ko) Explainable few-shot image classification method and apparatus
WO2020202244A1 (fr) Model generation device, model adjustment device, model generation method, model adjustment method, and recording medium
JPWO2016203757A1 (ja) Control device, information processing device using same, control method, and computer program
JP4773680B2 (ja) Information processing device and method, program recording medium, and program
CN113077383B (zh) Model training method and model training device
US7324980B2 Information processing apparatus and method
JP7438544B2 (ja) Neural network processing device, computer program, neural network manufacturing method, neural network data manufacturing method, neural network utilization device, and neural network size reduction method
WO2021220343A1 (fr) Data generation device, data generation method, learning device, and recording medium
JP7487556B2 (ja) Model generation program, model generation device, and model generation method
JP7148078B2 (ja) Attribute estimation device, attribute estimation method, attribute estimator learning device, and program
JPWO2018066083A1 (ja) Learning program, information processing device, and learning method
JP2022075329A (ja) Recognition system, recognition method, program, learning method, trained model, distilled model, and training dataset generation method
JP2021135683A (ja) Learning device, inference device, learning method, and inference method
JP2009223437A (ja) Novel signal generation device and novel signal generation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20892675

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021561466

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20892675

Country of ref document: EP

Kind code of ref document: A1