WO2020095392A1 - Radio wave image identification device, radio wave image learning device, and radio wave image identification method - Google Patents

Radio wave image identification device, radio wave image learning device, and radio wave image identification method Download PDF

Info

Publication number
WO2020095392A1
Authority
WO
WIPO (PCT)
Prior art keywords
radio wave
wave image
unit
model
area
Prior art date
Application number
PCT/JP2018/041385
Other languages
French (fr)
Japanese (ja)
Inventor
竜馬 谷髙
將 白石
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority to JP2020556413A (JP6833135B2)
Priority to PCT/JP2018/041385 (WO2020095392A1)
Publication of WO2020095392A1

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Definitions

  • the present invention relates to a radio wave image identification device, a radio wave image learning device, and a radio wave image identification method.
  • CNN (convolutional neural network)
  • Non-Patent Document 1 indicates that a CNN called AlexNet can be used to perform highly accurate identification in object identification of optical images.
  • In conventional object identification using a CNN, feature amounts are extracted from the input image, and the subjects captured in the image are classified based on those feature amounts.
  • When the input image is a radio wave image, however, fewer feature amounts can be extracted from the image than from an optical image, so identification accuracy deteriorates in conventional CNN-based object identification. This has been a problem.
  • The present invention is intended to solve the above problem, and its object is to provide a radio wave image identification device capable of highly accurate object identification even when the input image is a radio wave image.
  • A radio wave image identification apparatus according to the present invention includes: a radio wave image acquisition unit that acquires radio wave image information indicating a radio wave image; a model acquisition unit that acquires a model used for identification processing based on the radio wave image information; one or more convolution processing units, each having a separation degree calculation layer that, based on the model, sets in the radio wave image information, or in feature map information generated based on the radio wave image information, an attention area that is an area surrounded by a closed curve and a peripheral area that is the area surrounded by a closed curve containing the attention area, excluding the attention area, and calculates the degree of separation between the attention area and the peripheral area, and an attention area convolutional layer that, based on the model, performs convolution processing on the attention area in the radio wave image information or the feature map information and generates feature map information by weighting the processing result of the convolution processing based on the degree of separation; a total combination unit that performs class classification based on the model and the feature map information; and a result output unit that outputs the classification result of the class classification performed by the total combination unit.
  • Thus, even when the input image is a radio wave image, highly accurate object identification can be performed.
  • FIG. 1 is a block diagram showing an example of the configuration of the radio wave image identifying apparatus according to the first embodiment.
  • 2A and 2B are diagrams showing an example of a hardware configuration of a main part of the radio wave image identifying device according to the first embodiment.
  • FIG. 3 is a diagram illustrating an example of the attention area and the peripheral area at a certain position on the radio wave image or the feature map.
  • FIG. 4A is a diagram showing an example of a method of calculating the radius Ra indicating the size of the attention area corresponding to the position of each pixel on the radio wave image.
  • FIG. 4B is a diagram showing an example of a method of calculating the radius Ra indicating the size of the attention area corresponding to the position of each element on the feature map.
  • FIG. 5 is a diagram showing an example of processing a rectangular filter used in the conventional convolution processing into a circular filter corresponding to the attention area and the peripheral area according to the first embodiment.
  • FIG. 6 is a flowchart illustrating an example of processing of the radio wave image identifying apparatus according to the first embodiment.
  • 7A and 7B are diagrams showing an example of the radio wave image indicated by the radio wave image information acquired by the radio wave image identifying apparatus.
  • FIG. 8 is a block diagram showing an example of a main part of the radio wave image learning apparatus according to the second embodiment.
  • FIG. 1 is a block diagram showing an example of the configuration of the radio wave image identifying apparatus 100 according to the first embodiment.
  • the radio wave image identification device 100 is applied to a radio wave image identification system 10.
  • the radio wave image identification system 10 includes a radio wave image identification device 100, a radio wave image output device 11, and a storage device 12.
  • the radio wave image identification device 100 identifies an object shown in the radio wave image indicated by the acquired radio wave image information, based on the radio wave image information acquired from the radio wave image output device 11 and the model acquired from the storage device 12.
  • That is, the radio wave image identification device 100 executes identification processing. More specifically, the radio wave image identification device 100 executes identification processing that identifies an object included in the radio wave image indicated by the acquired radio wave image information, through processing by a neural network based on the acquired model. Details of the radio wave image identification device 100 will be described later.
  • the radio wave image output device 11 demodulates a radio wave received by an antenna (not shown), and generates radio wave image information indicating a radio wave image based on the demodulated radio wave.
  • the radio wave image output device 11 outputs the generated radio wave image information to the radio wave image identification device 100.
  • The storage device 12 stores, in a storage unit 13, a model, which is information indicating the parameters and programs necessary for the radio wave image identification device 100 to execute identification processing, and ideal class values used by the radio wave image identification device 100 for class classification in the identification processing.
  • The storage device 12 outputs the model or the ideal class values to the radio wave image identification device 100 in response to a request from the radio wave image identification device 100.
  • The storage device 12 may also acquire a model from the radio wave image identification device 100 in response to a request from the radio wave image identification device 100, update the model stored in the storage unit 13 with the acquired model, and store it in the storage unit 13.
  • The ideal class value is quantified as a vector; for example, when the number of classes in the classification is 2, it is represented by a two-dimensional vector in which one element has the value 1 and the other element has the value 0.
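As an illustrative sketch of this quantification (the function name below is hypothetical, not part of the patent), the ideal class value is a one-hot vector:

```python
def ideal_class_vector(correct_class: int, num_classes: int) -> list:
    """Ideal class value: 1 for the correct class, 0 for every other class."""
    return [1 if i == correct_class else 0 for i in range(num_classes)]

# Two-class example from the text: a two-dimensional vector whose elements are 1 and 0.
print(ideal_class_vector(0, 2))  # [1, 0]
```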
  • The radio wave image identification device 100 includes a radio wave image acquisition unit 101, a model acquisition unit 102, a convolution processing unit 103, an activation unit 104, a pooling unit 105, a total combination unit 106, a result output unit 107, an evaluation unit 108, and a model updating unit 109.
  • the activation unit 104, the pooling unit 105, the evaluation unit 108, and the model updating unit 109 are not essential components.
  • the radio wave image acquisition unit 101 acquires radio wave image information. More specifically, for example, the radio wave image acquisition unit 101 acquires radio wave image information from the radio wave image output device 11.
  • the model acquisition unit 102 acquires a model used for identification processing based on radio wave image information. More specifically, for example, the model acquisition unit 102 acquires a model from the storage device 12.
  • Each of the convolution processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, and the model updating unit 109 performs processing by a neural network based on the model acquired by the model acquisition unit 102. That is, each of these units corresponds to a layer in a layered neural network. A plurality of each of these units may exist.
  • For example, a plurality of convolution processing units 103 may exist within a single layer, or may be stacked over a plurality of layers. The same applies to the activation unit 104, the pooling unit 105, the total combination unit 106, and the model updating unit 109.
  • the convolution processing unit 103 includes a separation degree calculation layer 131 and a region-of-interest convolution layer 132.
  • Based on the model, the degree-of-separation calculation layer 131 sets, in the radio wave image information acquired by the radio wave image acquisition unit 101 or in feature map information generated based on that radio wave image information, an attention area that is an area surrounded by a closed curve and a peripheral area that is the area surrounded by a closed curve containing the attention area, excluding the attention area, and calculates the degree of separation between the attention area and the peripheral area.
  • Based on the model, the attention area convolutional layer 132 executes convolution processing on the attention area set by the degree-of-separation calculation layer 131 in the radio wave image information acquired by the radio wave image acquisition unit 101, or in feature map information generated based on that radio wave image information. Further, the attention area convolutional layer 132 generates feature map information by weighting the processing result of the convolution processing based on the degree of separation calculated by the degree-of-separation calculation layer 131.
  • the attention area convolutional layer 132 may perform a plurality of convolution processes on the attention area in the radio wave image information or the feature map information using a plurality of different parameters based on the model.
  • the attention area convolutional layer 132 may generate a plurality of feature map information by weighting each processing result of the plurality of convolution processings based on the degree of separation.
  • Alternatively, the attention area convolutional layer 132 may execute the convolution processing on the attention area in the radio wave image information or the feature map information once, and generate a plurality of pieces of feature map information by applying to the processing result a plurality of weightings that are based on the degree of separation and calculated using a plurality of different parameters in the model. The method of weighting the processing result based on the degree of separation will be described later.
  • the closed curve in the attention area is, for example, a circle Ca having a radius Ra centered on a certain position on the radio wave image or the feature map.
  • the circle Ca referred to here is not limited to a perfect circle, and may be, for example, a partially flattened substantially circle.
  • the shape of the closed curve in the attention area is not limited to the circular shape, and may be an elliptical shape, an oval shape, or a rounded rectangular shape.
  • the elliptical shape, the oval shape, and the rounded rectangle include the substantially elliptical shape, the substantially oval shape, and the substantially rounded rectangle, respectively.
  • a rounded rectangle is a figure having a shape in which some or all of the corners of the rectangle are rounded.
  • the closed curve including the attention area in the peripheral area is, for example, a circle Cb having a radius Rb larger than the radius Ra of the circle Ca centered on the center of the circle Ca that is the closed curve in the attention area.
  • the circle Cb referred to here is not limited to a perfect circle, and may be, for example, a partially flattened substantially circle.
  • the shape of the closed curve including the attention area in the peripheral area is not limited to the circular shape, and may be an elliptical shape, an oval shape, or a rounded rectangular shape.
  • the elliptical shape, the oval shape, and the rounded rectangle include the substantially elliptical shape, the substantially oval shape, and the substantially rounded rectangle, respectively.
  • a method of calculating the size of the peripheral area such as the radius Ra and the radius Rb corresponding to the position of each pixel on the radio wave image or each element on the feature map will be described later.
  • the degree of separation between the region of interest and the peripheral region in the degree-of-separation calculation layer 131 and the method of calculating the degree of separation will be described later.
  • The activation unit 104 performs processing that reduces unnecessary feature amounts through nonlinear conversion of the feature map information output by the convolution processing unit 103.
  • an activation function such as a sigmoid function, a ramp function or a softmax function is used.
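As a minimal sketch of the activation functions named above (NumPy implementations; the function names are illustrative, and "ramp function" is implemented here as ReLU, its common equivalent):

```python
import numpy as np

def sigmoid(x):
    """Sigmoid activation: squashes values into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def ramp(x):
    """Ramp function (ReLU): suppresses negative feature amounts."""
    return np.maximum(0.0, x)

def softmax(x):
    """Softmax: converts scores into a probability distribution."""
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

features = np.array([-2.0, 0.0, 3.0])
print(ramp(features))  # negative feature amounts are reduced to 0: [0. 0. 3.]
```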
  • The pooling unit 105 performs, for example, processing that reduces the size of the feature map information nonlinearly converted by the activation unit 104 while retaining its characteristic feature amounts.
  • the pooling unit 105 gives invariance to a minute position change of the target position on the feature map by the processing.
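A minimal sketch of such pooling, assuming 2 × 2 max pooling (one common choice; the patent does not fix a specific pooling method). Keeping only the strongest response in each window is what gives invariance to minute position changes:

```python
import numpy as np

def max_pool2x2(fmap: np.ndarray) -> np.ndarray:
    """2x2 max pooling with stride 2: keeps the strongest feature amount
    in each window while shrinking the feature map."""
    h, w = fmap.shape
    return fmap[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

fm = np.array([[1, 2, 0, 1],
               [3, 4, 1, 0],
               [0, 1, 5, 6],
               [1, 2, 7, 8]], dtype=float)
print(max_pool2x2(fm))  # [[4. 1.] [2. 8.]]
```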
  • The total combination unit 106 performs class classification based on the model and the feature map information. More specifically, for example, the total combination unit 106 performs class classification based on feature map information that was output from the convolution processing unit 103 and then processed by the activation unit 104 and the pooling unit 105.
  • The total combination unit 106 extracts, for example, feature amounts from the feature map information and converts the extracted feature amounts into a vector. The vector output from the layer of the last total combination unit 106 contains a component for each of the plurality of identifiable classes.
  • The total combination unit 106 may include the activation unit 104, the pooling unit 105, or both. That is, the activation unit 104, the pooling unit 105, and the total combination unit 106 may be collectively defined as the total combination unit 106.
  • The result output unit 107 outputs the classification result of the class classification performed by the total combination unit 106. More specifically, for example, the result output unit 107 outputs the classification result to the evaluation unit 108 described below. Further, the result output unit 107 may output the classification result to a display control device (not shown). The display control device outputs the classification result by, for example, visually displaying it on a display screen (not shown).
  • The evaluation unit 108 evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal class value. More specifically, for example, based on the classification result and the ideal class value, the evaluation unit 108 calculates the error between the probability of each class and the ideal value using an error function such as a cross entropy function, and finds the parameters that minimize the error.
  • the evaluation unit 108 acquires, for example, an ideal value in class classification from the storage device 12.
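As a minimal sketch of the cross entropy error mentioned above (the function name is illustrative): the error is small when the predicted class probabilities are close to the one-hot ideal value, and large otherwise, so minimizing it drives the parameters toward correct classification.

```python
import numpy as np

def cross_entropy(predicted: np.ndarray, ideal: np.ndarray, eps: float = 1e-12) -> float:
    """Cross entropy between predicted class probabilities and the ideal class value.
    eps guards against log(0)."""
    return float(-np.sum(ideal * np.log(predicted + eps)))

ideal = np.array([1, 0])                            # ideal value for a 2-class problem
good = cross_entropy(np.array([0.9, 0.1]), ideal)   # prediction close to the ideal
bad = cross_entropy(np.array([0.2, 0.8]), ideal)    # prediction far from the ideal
```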
  • the model updating unit 109 updates the model by changing the parameters included in the model based on the evaluation result evaluated by the evaluation unit 108.
  • the model updating unit 109 outputs the updated model to the storage device 12, for example.
  • the storage device 12 updates the model stored in the storage unit 13 to the model output by the model updating unit 109 and stores the model in the storage unit 13.
  • the model updating unit 109 may output the updated model to the model acquiring unit 102, for example.
  • FIG. 1 illustrates both the case where the model updating unit 109 outputs the updated model to the storage device 12 and the case where the model updating unit 109 outputs the updated model to the model acquisition unit 102.
  • The evaluation unit 108 and the model updating unit 109 are not essential configurations. However, when the radio wave image identification device 100 includes the evaluation unit 108 and the model updating unit 109, the parameters used for identification processing are updated by learning for each piece of radio wave image information acquired by the radio wave image acquisition unit 101, so the radio wave image identification device 100 can execute object identification processing using a more suitable model.
  • FIG. 2A and FIG. 2B are diagrams showing an example of a hardware configuration of a main part of the radio wave image identifying apparatus 100 according to the first embodiment.
  • the radio wave image identifying apparatus 100 is composed of a computer, and the computer has a processor 201 and a memory 202.
  • The memory 202 stores a program for causing the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109 to function.
  • The processor 201 reads out and executes the program stored in the memory 202, thereby realizing the functions of the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109.
  • the radio wave image identifying device 100 may be configured by the processing circuit 203.
  • That is, the functions of the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109 may be realized by the processing circuit 203.
  • the radio wave image identifying apparatus 100 may be composed of a processor 201, a memory 202 and a processing circuit 203 (not shown).
  • Alternatively, some of the functions of the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109 may be realized by the processor 201 and the memory 202, and the remaining functions may be realized by the processing circuit 203.
  • the processor 201 uses, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a micro controller, or a DSP (Digital Signal Processor).
  • The memory 202 uses, for example, a semiconductor memory or a magnetic disk. More specifically, a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read Only Memory), an SSD (Solid State Drive), or an HDD (Hard Disk Drive) is used as the memory 202.
  • The processing circuit 203 uses, for example, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), or an SoC (System-on-a-Chip).
  • FIG. 3 is a diagram illustrating an example of the attention area and the peripheral area at a certain position on the radio wave image or the feature map.
  • the attention area at a certain position on the radio wave image or the feature map is an area surrounded by a circle Ca having a radius Ra centered on the position.
  • the peripheral area at a certain position on the radio wave image or the feature map is an area excluding the attention area from the area surrounded by the circle Cb having the radius Rb centered on the position.
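The attention and peripheral areas of FIG. 3 can be sketched as boolean masks over a grid (an illustrative sketch; the function and variable names are not from the patent):

```python
import numpy as np

def region_masks(shape, center, ra, rb):
    """Boolean masks for the attention area (inside circle Ca of radius Ra)
    and the peripheral area (inside circle Cb of radius Rb, excluding the
    attention area), both centered on the same position."""
    yy, xx = np.ogrid[:shape[0], :shape[1]]
    d2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2  # squared distance to center
    attention = d2 <= ra ** 2
    peripheral = (d2 <= rb ** 2) & ~attention
    return attention, peripheral

att, per = region_masks((9, 9), (4, 4), ra=1.5, rb=3.5)
```

The two masks are disjoint by construction, matching the definition that the peripheral area excludes the attention area.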
  • The convolution processing unit 103 includes, for example, an area size determination layer 133 (not shown) that executes its processing before the processing in the separation degree calculation layer 131.
  • the area size determination layer 133 executes convolution processing on the radio wave image information or the feature map information input to the convolution processing unit 103, for example, using a predetermined convolution filter (hereinafter referred to as “area determination filter”). By doing so, a value indicating the size of the attention area or the size of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map is calculated.
  • FIG. 4A is a diagram showing an example of a method of calculating the radius Ra indicating the size of the attention area corresponding to the position of each pixel on the radio wave image.
  • Consider radio wave image information in which the number of pixels in the vertical direction is H, the number of pixels in the horizontal direction is W, and the number of channels is C (hereinafter referred to as "H × W × C").
  • By executing convolution processing on this radio wave image information using the area determination filter, the area size determination layer 133 outputs "H × W × 1" attention area size map information in which the value of each element, corresponding to each pixel of the radio wave image information, is the value of the radius Ra at that pixel's position.
  • the channel in the radio wave image information corresponds to each color component such as red, blue and green in the radio wave image. That is, for example, when each pixel in the radio wave image information has three color components of a red component, a blue component, and a green component, the number of channels in the radio wave image is 3.
  • the radio wave image acquisition unit 101 acquires radio wave image information and generates a plurality of color component image information corresponding to each color component from the acquired radio wave image information.
  • In that case, by executing convolution processing on the plurality of pieces of color component image information using the area determination filter, the area size determination layer 133 outputs attention area size map information in which the value of each element, corresponding to each pixel of the radio wave image information, is the value of the radius Ra at that pixel's position.
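The mapping from an H × W × C input to an H × W map of radius values can be sketched as a single same-padded convolution (an illustrative sketch with a hypothetical toy filter; a learned area determination filter would be trained as described later):

```python
import numpy as np

def attention_size_map(image: np.ndarray, region_filter: np.ndarray) -> np.ndarray:
    """Sketch of the area size determination layer: convolving an H x W x C
    input with a (k, k, C) area determination filter yields an H x W map
    whose elements play the role of the radius Ra at each position."""
    h, w, c = image.shape
    k = region_filter.shape[0]
    pad = k // 2
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)))  # same padding
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # sum over the k x k window and all C channels
            out[i, j] = np.sum(padded[i:i + k, j:j + k, :] * region_filter)
    return out

img = np.ones((4, 5, 3))           # H = 4, W = 5, C = 3
filt = np.full((3, 3, 3), 1 / 27)  # toy stand-in for a learned area determination filter
ra_map = attention_size_map(img, filt)
```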
  • FIG. 4B is a diagram showing an example of a method of calculating the radius Ra indicating the size of the attention area corresponding to the position of each element on the feature map.
  • Similarly, consider feature map information in which the number of elements in the vertical direction is H, the number of elements in the horizontal direction is W, and the number of channels is C (hereinafter referred to as "H × W × C").
  • By executing convolution processing on the feature map using the area determination filter, the area size determination layer 133 outputs "H × W × 1" attention area size map information in which the value of each element, corresponding to each element of the feature map, is the value of the radius Ra at that element's position.
  • The number of channels in the feature map information is, for example, the number of pieces of feature map information input to the convolution processing unit 103. That is, when one or more pieces of feature map information are output from a convolution processing unit 103 and a later-stage convolution processing unit 103 performs convolution processing on them using the area determination filter, the number of channels is the number of pieces of feature map information input to that later-stage convolution processing unit 103.
  • The radius Rb indicating the size of the outer circumference of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map is calculated, for example, by the same method as that for the radius Ra described above. That is, by the same method as that used for outputting the attention area size map information, the area size determination layer 133 outputs peripheral area size map information indicating the size of the outer circumference of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map.
  • Alternatively, the size of the outer circumference of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map may be calculated by multiplying the value indicating the size of the attention area at that position by a preset value.
  • When the attention area and the peripheral area are elliptical, for example, the area size determination layer 133 includes an area determination filter for calculating the major axis of the ellipse and an area determination filter for calculating the minor axis, and outputs attention area size map information or peripheral area size map information for the major axis and for the minor axis, each indicating the size of the attention area or the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map. That is, the area size determination layer 133 outputs the necessary attention area size map information and peripheral area size map information by using the plurality of area determination filters needed to define the sizes of attention and peripheral areas of a predetermined shape.
  • the parameters of the area size determination layer 133 are learned and updated for each radio wave image information acquired by the radio wave image acquisition unit 101 by the processing in the evaluation unit 108 and the model update unit 109 described above. Therefore, the radio wave image identifying apparatus 100 can output the attention area size map information and the peripheral area size map information using a more suitable model.
  • the separation degree between the attention area and the peripheral area in the separation degree calculation layer 131 and a method of calculating the separation degree will be described.
  • the separation degree calculation layer 131 generates, for example, a plurality of filters used for calculating the degree of separation between the attention area and the peripheral area.
  • the filters generated by the separability calculation layer 131 are, for example, an inter-region filter, a summation region filter, and a separability calculation filter.
  • the inter-region filter is a convolution filter for calculating the amount of change in the Radar Cross Section value (hereinafter referred to as “RCS value”) between the region of interest and the peripheral region.
  • More specifically, the inter-region filter is a convolution filter for calculating the sum of a value indicating the variance, with respect to the summation region, of the RCS values in the attention region and a value indicating the variance, with respect to the summation region, of the RCS values in the peripheral region (this sum is hereinafter referred to as the "inter-region variance value").
  • the RCS value is a value corresponding to the radio wave reflection intensity at a certain pixel on the radio wave image or a certain element on the feature map.
  • the summation area filter is a convolution filter for calculating the change amount of the RCS value in an area (hereinafter, referred to as “summation area”) in which the attention area and the peripheral area are combined.
  • the summation area filter is a convolution filter for calculating a value indicating the variance of RCS values in the summation area (hereinafter referred to as “summation area dispersion value”).
  • the summation area filter is a convolution filter for calculating the summation area variance value using Expression (2).
  • the summation area filter may be a convolution filter for calculating the summation area variance value using Expression (3) obtained by discretizing Expression (2).
  • the separation degree calculation filter is a filter for calculating a value indicating a ratio of the amount of change in RCS value between the attention area and the peripheral area to the amount of change in RCS value in the summation area.
  • the separation degree calculation filter is a filter for calculating the degree of separation between the attention area and the peripheral area by dividing the inter-area dispersion value by the total area dispersion value.
  • the separation degree calculation filter is a filter for calculating the degree of separation between the region of interest and the peripheral region using Expression (4).
  • the degree-of-separation calculation layer 131 calculates the degree of separation between the attention area and the peripheral area using an inter-area filter, a summation area filter, and a separation degree calculation filter.
  • the degree of separation between the attention area and the peripheral area approaches 0 when the change amount of the RCS value between the attention area and the peripheral area is small, and approaches 1 when the change amount is large.
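Expressions (2) to (4) are not reproduced in this text, but the ratio they describe — the inter-region variance divided by the summation-area variance — can be sketched numerically. The following is a minimal illustration under that assumption; the function name and the discrete form are illustrative, not taken from the patent:

```python
import numpy as np

def separability(attention: np.ndarray, peripheral: np.ndarray) -> float:
    """Degree of separation between an attention region and its peripheral
    region: the inter-region (between-region) variance divided by the
    variance of the summation (combined) region. The value lies in [0, 1]:
    near 0 when the RCS values of the two regions change little between
    them, near 1 when the change is large."""
    combined = np.concatenate([attention.ravel(), peripheral.ravel()])
    n_a, n_p, n = attention.size, peripheral.size, combined.size
    mu_a, mu_p, mu = attention.mean(), peripheral.mean(), combined.mean()
    # Inter-region variance: weighted squared distance of each region's
    # mean from the overall mean of the summation region.
    var_between = (n_a * (mu_a - mu) ** 2 + n_p * (mu_p - mu) ** 2) / n
    var_total = combined.var()  # summation-area variance
    return float(var_between / var_total) if var_total > 0 else 0.0
```

With two constant but different regions the ratio is exactly 1; with identical regions it is 0, matching the behavior described above.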
  • the attention area convolutional layer 132 generates a plurality of filters used for generating the feature map information.
  • the filters generated by the attention area convolutional layer 132 are, for example, an attention area filter and a weighting filter.
  • the attention area filter is a convolution filter for calculating, from the RCS values in the attention area corresponding to the position of a pixel on the radio wave image or the position of an element on the feature map, a characteristic RCS value at that position.
  • the attention area convolutional layer 132 calculates a characteristic RCS value at the position of each pixel on the radio wave image or the position of each element on the feature map using the attention area filter.
  • the weighting filter is a filter for multiplying the characteristic RCS value at the position of a pixel on the radio wave image or the position of an element on the feature map, calculated using the attention area filter, by the degree of separation at that position calculated by the separation degree calculation layer 131.
  • the attention area convolutional layer 132 generates the feature map information by using the weighting filter to multiply the characteristic RCS value at the position of each pixel on the radio wave image or the position of each element on the feature map by the degree of separation at that position. As described above, the degree of separation between the attention area and the peripheral area approaches 1 when the amount of change in the RCS value between the attention area and the peripheral area is large, and approaches 0 when the amount of change is small.
  • accordingly, in the feature map information generated by multiplying the characteristic RCS value at the position of each pixel on the radio wave image or the position of each element on the feature map by the degree of separation at that position, the weight given to the attention area is large where the degree of separation is high, and small where the degree of separation is low.
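The attention-area convolution and the separability weighting described above can be combined in a small per-position sketch. Square windows stand in here for the circular attention and peripheral areas, and all names and the window shapes are illustrative assumptions, not the patent's procedure:

```python
import numpy as np

def attention_feature_map(img: np.ndarray, r_a: int = 1, r_b: int = 2) -> np.ndarray:
    """For each interior position: take the mean RCS value over an
    'attention' window of radius r_a (a stand-in for the attention area
    filter), compute its separability against the surrounding ring out to
    radius r_b, and multiply the two (the 'weighting filter' step)."""
    h, w = img.shape
    out = np.zeros((h, w))
    for i in range(r_b, h - r_b):
        for j in range(r_b, w - r_b):
            win = img[i - r_b:i + r_b + 1, j - r_b:j + r_b + 1]  # summation area
            att = img[i - r_a:i + r_a + 1, j - r_a:j + r_a + 1]  # attention area
            mask = np.ones(win.shape, dtype=bool)
            mask[r_b - r_a:r_b + r_a + 1, r_b - r_a:r_b + r_a + 1] = False
            per = win[mask]                                      # peripheral area
            mu = win.mean()
            var_total = win.var()                                # summation-area variance
            var_between = (att.size * (att.mean() - mu) ** 2
                           + per.size * (per.mean() - mu) ** 2) / win.size
            sep = var_between / var_total if var_total > 0 else 0.0
            out[i, j] = att.mean() * sep
    return out
```

A position whose attention area stands out from its surroundings keeps a large weighted value, while positions in flat or noisy regions are suppressed toward zero.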
  • the inter-region filter, the summation region filter, and the attention region filter for performing the convolution processing on the RCS value in the attention region or the surrounding region correspond to the shape and size of the attention region or the surrounding region.
  • the shape of the attention area and the peripheral area is a shape such as a circle, an ellipse, an oval, or a rounded rectangle.
  • the inter-region filter, the summation region filter, and the attention region filter are generated based on the attention region size map information and the peripheral region size map information output by the region size determination layer 133. More specifically, for example, these filters are generated by applying the values indicating the sizes of the attention region and the peripheral region included in the attention region size map information and the peripheral region size map information to variable-size convolution filters of a predetermined shape, such as circular filters, included in the model acquired by the model acquisition unit 102.
  • alternatively, the inter-region filter, the summation region filter, and the attention region filter may be generated so as to have a predetermined shape such as a circle, by rearranging the positions in a rectangular filter used in conventional convolution processing so that they are distributed substantially evenly over the attention region or the peripheral region.
  • FIG. 5 is a diagram showing an example of processing a rectangular filter used in the conventional convolution processing into a circular filter corresponding to the attention area and the peripheral area according to the first embodiment.
  • the convolution processing unit 103 may generate the inter-region filter, the summation area filter, and the attention area filter by decomposing a rectangular filter and arranging the decomposed pieces so as to correspond to the shapes of the attention region and the peripheral region.
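One simple way to obtain a circular filter support from a conventional square kernel — a hedged simplification of the decomposition-and-rearrangement approach described above, with illustrative names — is to mask off the corners:

```python
import numpy as np

def circular_mask(radius: int) -> np.ndarray:
    """Boolean mask that turns a (2r+1) x (2r+1) rectangular filter support
    into an approximately circular one: positions farther than `radius`
    from the centre are switched off."""
    size = 2 * radius + 1
    yy, xx = np.mgrid[:size, :size] - radius
    return (yy ** 2 + xx ** 2) <= radius ** 2

def as_circular_filter(rect_filter: np.ndarray) -> np.ndarray:
    """Zero out the corners of a square convolution kernel so that its
    support matches a circular attention (or peripheral) area."""
    radius = rect_filter.shape[0] // 2
    return np.where(circular_mask(radius), rect_filter, 0.0)
```

For a 3 x 3 kernel this leaves a 5-pixel cross; for larger radii the support approaches a disc, which loosely corresponds to the circular filters of FIG. 5.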
  • FIG. 6 is a flowchart illustrating an example of processing of the radio wave image identifying apparatus 100 according to the first embodiment.
  • the radio wave image identifying apparatus 100, for example, repeatedly executes the process of the flowchart for each piece of radio wave image information.
  • step ST601 the radio wave image acquisition unit 101 acquires radio wave image information.
  • step ST602 the model acquisition unit 102 acquires a model.
  • step ST603 the convolution processing unit 103 generates the attention area size map information and the peripheral area size map information, which indicate the sizes of the attention area and the peripheral area at the position of each pixel on the radio wave image or the position of each element on the feature map.
  • step ST604 the convolution processing unit 103 calculates the degree of separation between the attention area and the peripheral area.
  • step ST605 the convolution processing unit 103 executes the convolution processing on the region of interest, and weights the processing result of the convolution processing based on the degree of separation to generate the feature map information.
  • step ST606 it is determined whether the convolution processing unit 103 that executed the processing from step ST603 to step ST605 is the convolution processing unit 103 of the last layer.
  • in step ST606, when the convolution processing unit 103 is not the convolution processing unit 103 of the last layer, the radio wave image identifying apparatus 100 returns to the processing of step ST603, and the convolution processing unit 103 of the layer following the layer that has executed the processing up to this point executes the processes from step ST603 to step ST606.
  • step ST607 the activation unit 104 performs processing of reducing unnecessary feature amounts.
  • step ST608 the pooling unit 105 performs processing of reducing the size of the feature map so that the important feature amounts remain.
  • step ST609 the all-combining unit 106 performs class classification based on the model and the feature map information.
  • step ST610 the result output unit 107 outputs the classification result obtained by the total combining unit 106 performing the classification.
  • step ST611 the evaluation unit 108 evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal value in the class classification.
  • step ST612 the model updating unit 109 updates the model by changing the parameters included in the model based on the evaluation result evaluated by the evaluation unit 108.
  • after executing the process of step ST612, the radio wave image identifying apparatus 100 ends the process of the flowchart; it then returns to the process of step ST601 and repeatedly executes the process of the flowchart for each piece of radio wave image information.
  • the processing order of the processing of step ST601 and the processing of step ST602 may be reversed.
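The flow of steps ST601 to ST612 can be sketched as a single loop. The model's method names below are placeholders for illustration only; the patent does not define such an interface:

```python
def identify_and_learn(radio_images, model, n_conv_layers):
    """One pass per radio wave image, mirroring steps ST601-ST612:
    per-layer convolution with separability weighting, activation,
    pooling, fully connected classification, evaluation, model update."""
    for image in radio_images:                           # ST601 (ST602: model given)
        fmap = image
        for layer in range(n_conv_layers):               # ST603-ST606, layer by layer
            sizes = model.region_sizes(layer, fmap)      # ST603: size map info
            sep = model.separability(layer, fmap, sizes) # ST604: degree of separation
            fmap = model.convolve(layer, fmap, sizes) * sep  # ST605: weighted conv
        fmap = model.activate(fmap)                      # ST607: drop unneeded features
        fmap = model.pool(fmap)                          # ST608: pooling
        scores = model.classify(fmap)                    # ST609: class classification
        yield scores                                     # ST610: output result
        loss = model.evaluate(scores)                    # ST611: compare with ideal value
        model.update(loss)                               # ST612: update parameters
```

The order of ST601 and ST602 can be swapped, as noted above, since the model is independent of any single image.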
  • as described above, the radio wave image identifying apparatus 100 includes: a radio wave image acquiring unit 101 that acquires radio wave image information indicating a radio wave image; a model acquiring unit 102 that acquires a model used for identification processing based on the radio wave image information; one or more convolution processing units 103, each having a separation degree calculation layer 131 that, based on the model, sets, in the radio wave image information or in feature map information generated based on the radio wave image information, an attention area that is an area enclosed by a closed curve and a peripheral area that is obtained by excluding the attention area from an area enclosed by a closed curve including the attention area, and calculates a degree of separation between the attention area and the peripheral area, and an attention area convolutional layer 132 that, based on the model, executes convolution processing on the attention area in the radio wave image information or the feature map information and generates the feature map information by weighting the processing result of the convolution processing based on the degree of separation; a total combining unit 106 that performs class classification based on the model and the feature map information; and a result output unit 107 that outputs the classification result obtained by the total combining unit 106.
  • with this configuration, the radio wave image identifying apparatus 100 assigns weights based on the calculated degree of separation between the attention area and the peripheral area, so that highly accurate object identification can be performed even if the input image is a radio wave image.
  • the radio wave image identifying apparatus 100 is configured such that the convolution processing unit 103 increases the weighting given to the attention area when the degree of separation is high, and reduces the weighting given to the attention area when the degree of separation is low. With this configuration, the radio wave image identifying apparatus 100 can enhance the RCS value in the attention area when the RCS value in the attention area is characteristic.
  • the radio wave image identifying apparatus 100 is configured such that the size of the attention area, updated by learning, changes for each piece of radio wave image information acquired by the radio wave image acquisition unit 101. More specifically, the radius Ra of the circle Ca of the attention area and the radius Rb of the circle Cb of the peripheral area are updated by learning so as to change for each piece of radio wave image information acquired by the radio wave image acquisition unit 101. With such a configuration, the radio wave image identifying apparatus 100 can calculate more suitable feature amounts and can improve the identification accuracy.
  • furthermore, the convolution filter used when the convolution processing unit 103 performs the convolution process on the attention area in the radio wave image information or the feature map information is configured by processing a rectangular filter. With this configuration, the radio wave image identifying apparatus 100 does not need to prepare a convolution filter having a predetermined shape in advance, and can use a more general-purpose convolution filter.
  • the radio wave image identifying apparatus 100 further includes an evaluation unit 108 that evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal value in the class classification, and a model updating unit 109 that updates the model by updating the parameters included in the model based on the evaluation result of the evaluation unit 108.
  • the radio wave image identifying apparatus 100 can perform the identification process based on the model that is more suitable for each piece of radio wave image information to be acquired, and can improve the identification accuracy.
  • the radio wave image identifying apparatus 100 is configured such that the shape of the attention area is a circle, an ellipse, an oval, or a rounded rectangle.
  • features of an object appearing in a radio wave image tend to have shapes close to a circle. Therefore, by using a convolution filter having a shape such as a circle, an ellipse, an oval, or a rounded rectangle, the radio wave image identifying apparatus 100 can extract more suitable feature amounts than when using a conventional rectangular filter, so that highly accurate object identification can be performed even if the input image is a radio wave image.
  • conventional convolution processing calculates the inner product of a k x k convolution filter (k is a natural number) and a region of k vertical pixels by k horizontal pixels centered on a certain pixel, and takes the result as the processing result; that is, it calculates a degree of similarity indicating how similar the convolution region and the convolution filter are.
  • by contrast, the attention area filter calculates, in addition to the degree of similarity for an attention area of a shape such as a circle, an ellipse, an oval, or a rounded rectangle centered on the point of interest, the degree of separation between the attention area and the peripheral area outside the attention area, that is, a value indicating how much the attention area differs from the peripheral area, and combines the similarity and the degree of separation.
  • as a result, the radio wave image identifying apparatus 100 can measure not only the similarity of the RCS values in the attention area on the radio wave image, but also the degree of peculiarity of the attention area within the radio wave image information, so that many features can be extracted even from a radio wave image having a smaller feature amount than an optical image.
  • FIGS. 7A and 7B are diagrams showing an example of the radio wave image indicated by the radio wave image information acquired by the radio wave image identifying apparatus 100.
  • the radio wave image information acquired by the radio wave image identifying apparatus 100 is, for example, a range Doppler map represented by a two-dimensional image including a distance direction and a velocity direction.
  • FIG. 7A shows radio wave images corresponding to two different types of objects to be identified.
  • the light and shade in each radio wave image indicates the RCS intensity of the reflected wave, received by the antenna, of the radio wave applied to the object to be identified.
  • the contrast in the radio wave image includes the influence of noise received by the antenna in addition to the reflected wave.
  • the radio wave image information acquired by the radio wave image identifying apparatus 100 is not limited to the range Doppler map. As shown in FIG. 7B, it may be, for example, a micro-Doppler spectrogram represented by a two-dimensional image in the frequency direction and the time direction.
  • the radio wave image on the left side of FIG. 7A shows the RCS intensity of the reflected wave from an object having a conical shape, and the radio wave image on the right side shows the RCS intensity of the reflected wave from an object having a cylindrical shape.
  • the RCS intensity of the identification target in each radio wave image in FIG. 7A is indicated by the RCS intensity in the area inside the closed curve surrounded by the broken line in the center of the image.
  • the RCS intensity shown in the areas other than this area is due to the influence of noise.
  • unlike the luminance values of an optical image, the RCS intensity values in a radio wave image do not fluctuate abruptly, so there is no clear contour in the RCS intensity produced by the reflected wave from the object to be identified. Therefore, conventional convolution processing, which extracts contour information or shape information, cannot extract a sufficient amount of features.
  • by contrast, the convolution processing unit 103 calculates the degree of separation between the RCS values in the attention area and the RCS values in the peripheral area even when there is no clear contour in the RCS intensity of the reflected wave from the object to be identified, so that when the attention area has an important feature amount, its weighting can be increased.
  • FIG. 8 is a block diagram showing an example of a main part of the radio wave image learning device 150 according to the second embodiment. As shown in FIG. 8, the radio wave image learning device 150 is applied to, for example, the radio wave image learning system 15.
  • compared with the radio wave image identifying apparatus 100 according to the first embodiment, the radio wave image learning device 150 is obtained by changing the model updating unit 109 to a model updating unit 109a and deleting the model acquiring unit 102.
  • the same components as those of the radio wave image identifying device 100 according to the first embodiment are denoted by the same reference numerals and duplicate description will be omitted. That is, the description of the configuration in FIG. 8 given the same reference numerals as those in FIG. 1 will be omitted.
  • the radio wave image learning system 15 includes, for example, a radio wave image learning device 150, a radio wave image output device 11, and a storage device 12. Since the radio wave image output device 11 and the storage device 12 are the same as the radio wave image output device 11 and the storage device 12 described in the first embodiment, detailed description thereof will be omitted.
  • the radio wave image learning device 150 includes a radio wave image acquisition unit 101, a convolution processing unit 103, an activation unit 104, a pooling unit 105, a total combination unit 106, a result output unit 107, an evaluation unit 108, and a model update unit 109a.
  • the radio wave image acquisition unit 101, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, the result output unit 107, and the evaluation unit 108 are the radio wave image acquisition unit 101, the convolution unit described in the first embodiment. Since it is equivalent to the processing unit 103, the activation unit 104, the pooling unit 105, the total combination unit 106, the result output unit 107, and the evaluation unit 108, detailed description thereof will be omitted.
  • the evaluation unit 108 holds in advance an ideal value of a class for the radio wave image learning apparatus 150 to perform class classification in the identification processing.
  • the model updating unit 109a holds a pre-learned model or a partially-learned model prepared in advance.
  • alternatively, the radio wave image learning device 150 may include a model acquisition unit 102a (not shown in FIG. 8), and the model update unit 109a may acquire, via the model acquisition unit 102a, a pre-learned or partially-learned model prepared in advance in the storage device 12.
  • the model updating unit 109a updates the model for each radio wave image information by updating the parameter included in the model based on the evaluation result evaluated by the evaluation unit 108, and holds the updated model.
  • the model updating unit 109a outputs the updated model to the storage device 12.
  • the convolution processing unit 103 generates feature map information based on the radio wave image information acquired by the radio wave image acquisition unit 101, based on the model held by the model updating unit 109a.
  • each function of the units of the radio wave image learning device 150, including the evaluation unit 108 and the model updating unit 109a, may be realized by the processor 201 and the memory 202 in the hardware configuration illustrated in FIGS. 2A and 2B in the first embodiment, or by the processing circuit 203.
  • the operation of the main part of the radio wave image learning device 150 is the same as the operation of the radio wave image identification device 100 according to the first embodiment described with reference to FIG.
  • as described above, the radio wave image learning device 150 includes: a radio wave image acquisition unit 101 that acquires radio wave image information indicating a radio wave image; one or more convolution processing units 103, each having a separation degree calculation layer 131 that, based on the model, sets, in the radio wave image information or in feature map information generated based on the radio wave image information, an attention area that is an area enclosed by a closed curve and a peripheral area that is obtained by excluding the attention area from an area enclosed by a closed curve including the attention area, and calculates a degree of separation between the attention area and the peripheral area, and an attention area convolutional layer 132 that executes convolution processing on the attention area in the radio wave image information or the feature map information and generates the feature map information by weighting the processing result of the convolution processing based on the degree of separation; a total combination unit 106 that performs class classification based on the model and the feature map information; a result output unit 107 that outputs the classification result obtained by the total combination unit 106; an evaluation unit 108 that evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal value in the class classification; and a model updating unit 109a that updates the model by updating the parameters included in the model based on the evaluation result of the evaluation unit 108.
  • the radio wave image learning apparatus 150 enables model learning for highly accurate object identification even if the input image is a radio wave image.
  • the present invention allows free combination of the embodiments, modification of any constituent element of each embodiment, or omission of any constituent element in each embodiment.
  • the radio wave image identification device and the radio wave image learning device according to the present invention can be applied to a radio wave image identification system.
  • radio wave image identification system 11 radio wave image output device, 12 storage device, 13 storage unit, 100 radio wave image identification device, 101 radio wave image acquisition unit, 102 model acquisition unit, 103 convolution processing unit, 104 activation unit, 105 pooling unit , 106 total combination unit, 107 result output unit, 108 evaluation unit, 109, 109a model update unit, 131 separation degree calculation layer, 132 attention area convolution layer, 201 processor, 202 memory, 203 processing circuit, 15 radio wave image learning system, 150 radio wave image learning device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

This radio wave image identification device is provided with: a radio wave image acquisition unit (101) that acquires radio wave image information; a model acquisition unit (102) that acquires a model; a convolution processing unit (103) having a degree-of-separation calculation layer (131) that, on the basis of the model, sets, to the radio wave image information or feature map information generated on the basis of the radio wave image information, a region of interest being a region enclosed by a closed curve and a peripheral region being a region obtained by excluding the region of interest from a region including the region of interest and enclosed by a closed curve, and calculates a degree of separation between the region of interest and the peripheral region, and a region-of-interest convolution layer (132) that executes a convolution process on the region of interest in the radio wave image information or the feature map information on the basis of the model, assigns a weight based on the degree of separation to the processing result of the convolution process, and thus generates feature map information; an entire combination unit (106) that performs class sorting on the basis of the model and the feature map information; and a result output unit (107) that outputs the result of the class sorting performed by the entire combination unit (106).

Description

Radio wave image identification device, radio wave image learning device, and radio wave image identification method

The present invention relates to a radio wave image identification device, a radio wave image learning device, and a radio wave image identification method.
In object identification technology for optical images using machine learning, it is common to use convolutional neural networks (hereinafter referred to as "CNN"), one of the deep learning technologies.
For example, Non-Patent Document 1 shows that highly accurate identification in object identification of optical images is possible using a CNN called AlexNet.
In a CNN, feature amounts are extracted from the input image, and the subject captured in the image is classified based on those feature amounts.
However, when the input image is a radio wave image, fewer feature amounts can be extracted from the image than from an optical image, so that identification accuracy deteriorates in conventional object identification using a CNN.
The present invention is intended to solve the above-mentioned problem, and an object thereof is to provide a radio wave image identification device capable of performing highly accurate object identification even if the input image is a radio wave image.
A radio wave image identifying apparatus according to the present invention includes: a radio wave image acquiring unit that acquires radio wave image information indicating a radio wave image; a model acquiring unit that acquires a model used for identification processing based on the radio wave image information; one or more convolution processing units, each having a separation degree calculation layer that, based on the model, sets, in the radio wave image information or in feature map information generated based on the radio wave image information, an attention area that is an area enclosed by a closed curve and a peripheral area that is obtained by excluding the attention area from an area enclosed by a closed curve including the attention area, and calculates a degree of separation between the attention area and the peripheral area, and an attention area convolutional layer that, based on the model, executes convolution processing on the attention area in the radio wave image information or the feature map information and generates feature map information by weighting the processing result of the convolution processing based on the degree of separation; a total combining unit that performs class classification based on the model and the feature map information; and a result output unit that outputs the classification result obtained by the total combining unit.
According to the present invention, highly accurate object identification can be performed even if the input image is a radio wave image.
FIG. 1 is a block diagram showing an example of the configuration of the radio wave image identifying apparatus according to the first embodiment.
FIGS. 2A and 2B are diagrams showing an example of the hardware configuration of the main part of the radio wave image identifying apparatus according to the first embodiment.
FIG. 3 is a diagram showing an example of the attention area and the peripheral area at a certain position on the radio wave image or the feature map.
FIG. 4A is a diagram showing an example of a method of calculating the radius Ra indicating the size of the attention area corresponding to the position of each pixel on the radio wave image. FIG. 4B is a diagram showing an example of a method of calculating the radius Ra indicating the size of the attention area corresponding to the position of each element on the feature map.
FIG. 5 is a diagram showing an example of processing a rectangular filter used in conventional convolution processing into a circular filter corresponding to the attention area and the peripheral area according to the first embodiment.
FIG. 6 is a flowchart illustrating an example of processing of the radio wave image identifying apparatus according to the first embodiment.
FIGS. 7A and 7B are diagrams showing an example of the radio wave image indicated by the radio wave image information acquired by the radio wave image identifying apparatus.
FIG. 8 is a block diagram showing an example of the main part of the radio wave image learning apparatus according to the second embodiment.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
Embodiment 1.
A configuration of the main part of the radio wave image identifying apparatus 100 according to the first embodiment will be described with reference to FIG. 1.
FIG. 1 is a block diagram showing an example of the configuration of the radio wave image identifying apparatus 100 according to the first embodiment.
As shown in FIG. 1, the radio wave image identification device 100 is applied to a radio wave image identification system 10.
The radio wave image identification system 10 includes a radio wave image identification device 100, a radio wave image output device 11, and a storage device 12.
The radio wave image identification device 100 executes identification processing for identifying an object appearing in the radio wave image indicated by the radio wave image information acquired from the radio wave image output device 11, based on that radio wave image information and a model acquired from the storage device 12.
More specifically, the radio wave image identification device 100 executes the identification processing for identifying an object appearing in the radio wave image indicated by the acquired radio wave image information by means of processing with a neural network based on the acquired model.
Details of the radio wave image identification device 100 will be described later.
The radio wave image output device 11 demodulates radio waves received by an antenna (not shown) and generates radio wave image information indicating a radio wave image based on the demodulated radio waves.
The radio wave image output device 11 outputs the generated radio wave image information to the radio wave image identification device 100.
The storage device 12 stores, in a storage unit 13, a model, which is information indicating the parameters, programs, and the like that the radio wave image identification device 100 needs in order to execute the identification processing, and the ideal values of the classes used by the radio wave image identification device 100 for class classification in the identification processing.
The storage device 12 outputs the model, the ideal class values, and the like to the radio wave image identification device 100 in response to a request from the radio wave image identification device 100. In addition, the storage device 12 may, in response to a request from the radio wave image identification device 100, acquire a model from the radio wave image identification device 100, replace the model stored in the storage unit 13 with the acquired model, and store it in the storage unit 13.
As for the ideal class values, when the number of classes in the classification is 2, for example, the ideal value of a class is expressed as a two-dimensional vector in which one element of the vector has the value 1 and the other element has the value 0.
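As a minimal sketch (our own illustration, not code from the patent), the ideal class values described above are one-hot vectors, generalized here to an arbitrary number of classes:

```python
import numpy as np

# Ideal class value as a one-hot vector: for a 2-class problem, one element
# of the two-dimensional vector is 1 and the other is 0, as described above.
def ideal_class_value(class_index: int, num_classes: int = 2) -> np.ndarray:
    """Return the one-hot ideal value vector for the given class."""
    v = np.zeros(num_classes)
    v[class_index] = 1.0
    return v
```

For example, `ideal_class_value(0)` yields the vector (1, 0) and `ideal_class_value(1)` yields (0, 1).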
The main part of the radio wave image identification device 100 will be described.
The radio wave image identification device 100 according to Embodiment 1 includes a radio wave image acquisition unit 101, a model acquisition unit 102, a convolution processing unit 103, an activation unit 104, a pooling unit 105, a fully connected unit 106, a result output unit 107, an evaluation unit 108, and a model updating unit 109.
In the radio wave image identification device 100, the activation unit 104, the pooling unit 105, the evaluation unit 108, and the model updating unit 109 are not essential components.
The radio wave image acquisition unit 101 acquires radio wave image information.
More specifically, for example, the radio wave image acquisition unit 101 acquires the radio wave image information from the radio wave image output device 11.
The model acquisition unit 102 acquires the model used for the identification processing based on the radio wave image information.
More specifically, for example, the model acquisition unit 102 acquires the model from the storage device 12.
The convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, and the model updating unit 109 each execute their processing by means of a neural network based on the model acquired by the model acquisition unit 102. That is, each of these units corresponds to a layer in a layered neural network.
A plurality of each of the convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, and the model updating unit 109 may be provided. For example, when a plurality of convolution processing units 103 are provided, they may exist in a single layer or may be connected in layers. The same applies to the activation unit 104, the pooling unit 105, the fully connected unit 106, and the model updating unit 109.
The convolution processing unit 103 has a separability calculation layer 131 and an attention area convolution layer 132.
Based on the model, the separability calculation layer 131 sets, on the radio wave image information acquired by the radio wave image acquisition unit 101 or on feature map information generated based on that radio wave image information, an attention area, which is an area enclosed by a closed curve, and a peripheral area, which is the area enclosed by a closed curve containing the attention area minus the attention area itself, and calculates the degree of separation between the attention area and the peripheral area.
Based on the model, the attention area convolution layer 132 executes convolution processing on the attention area set by the separability calculation layer 131 in the radio wave image information acquired by the radio wave image acquisition unit 101 or in the feature map information generated based on that radio wave image information. Furthermore, the attention area convolution layer 132 generates feature map information by weighting the result of that convolution processing based on the degree of separation calculated by the separability calculation layer 131.
The attention area convolution layer 132 may execute a plurality of convolution processes on the attention area in the radio wave image information or the feature map information, using a plurality of different parameters based on the model. The attention area convolution layer 132 may then generate a plurality of pieces of feature map information by weighting each of the results of the plurality of convolution processes based on the degree of separation.
Alternatively, the attention area convolution layer 132 may execute convolution processing on the attention area in the radio wave image information or the feature map information and generate a plurality of pieces of feature map information by weighting the result of that convolution processing with a plurality of degrees of separation calculated using a plurality of different parameters based on the model.
The method of weighting the processing result based on the degree of separation will be described later.
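The interplay of the two layers can be sketched conceptually as follows. This is our own minimal illustration under simplifying assumptions (a single image patch, a toy separability score based on mean values), not the patent's exact procedure or its Equation (1):

```python
import numpy as np

# Conceptual sketch: convolve only the attention area of a patch and weight
# the response by a separability score between the attention area and its
# peripheral area. The separability formula here is illustrative only.
def weighted_attention_convolution(patch, kernel, attn_mask, peri_mask):
    """patch, kernel, attn_mask, peri_mask: 2-D arrays of the same shape.
    Returns the masked convolution response weighted by separability."""
    # Convolution response restricted to the attention area.
    response = np.sum(patch * kernel * attn_mask)
    # Toy separability: normalized difference of mean values between the
    # attention area and the peripheral area (0 = indistinguishable).
    mean_a = patch[attn_mask > 0].mean()
    mean_p = patch[peri_mask > 0].mean()
    separability = abs(mean_a - mean_p) / (abs(mean_a) + abs(mean_p) + 1e-9)
    return response * separability
```

The intended effect is that a response is passed through strongly when the attention area stands out from its surroundings and is suppressed when the two areas look alike.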
The closed curve of the attention area is, for example, a circle Ca of radius Ra centered at a certain position on the radio wave image or the feature map. The circle Ca here is not necessarily a perfect circle and may be, for example, a partially flattened, approximately circular shape. Moreover, the shape of the closed curve of the attention area is not limited to a circle and may be an ellipse, an oval, or a rounded rectangle. The ellipse, oval, and rounded rectangle here include approximately elliptical, approximately oval, and approximately rounded-rectangular shapes, respectively. A rounded rectangle is a figure shaped like a rectangle with some or all of its corners rounded.
The method of calculating the size of the attention area, such as the radius Ra corresponding to the position of each pixel on the radio wave image or each element on the feature map, will be described later.
The closed curve containing the attention area, which delimits the peripheral area, is, for example, a circle Cb of radius Rb larger than the radius Ra of the circle Ca, centered at the center of the circle Ca that forms the closed curve of the attention area. The circle Cb here is not necessarily a perfect circle and may be, for example, a partially flattened, approximately circular shape. Moreover, the shape of the closed curve containing the attention area is not limited to a circle and may be an ellipse, an oval, or a rounded rectangle. The ellipse, oval, and rounded rectangle here include approximately elliptical, approximately oval, and approximately rounded-rectangular shapes, respectively.
The method of calculating the size of the peripheral area, such as the radii Ra and Rb corresponding to the position of each pixel on the radio wave image or each element on the feature map, will be described later.
The degree of separation between the attention area and the peripheral area in the separability calculation layer 131, and the method of calculating it, will also be described later.
The activation unit 104, for example, performs processing on the feature map information output by the convolution processing unit 103 that reduces unnecessary feature values by means of a nonlinear transformation. An activation function such as a sigmoid function, a ramp function (ReLU), or a softmax function is used for the nonlinear transformation.
The pooling unit 105, for example, performs processing that shrinks the feature map information nonlinearly transformed by the activation unit 104 while preserving its feature values. By this processing, the pooling unit 105 provides invariance to small positional changes of the target position on the feature map.
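These two steps can be sketched as follows. This is a minimal illustration (our choice of ReLU as the ramp function and 2×2 max pooling; the patent does not fix these specifics):

```python
import numpy as np

# Activation step: the ramp function (ReLU) zeroes out negative responses,
# reducing unnecessary feature values by a nonlinear transformation.
def relu(feature_map):
    return np.maximum(feature_map, 0.0)

# Pooling step: 2x2 max pooling shrinks the map while keeping the strongest
# feature value in each block, giving tolerance to small position shifts.
def max_pool_2x2(feature_map):
    h, w = feature_map.shape
    return feature_map[:h - h % 2, :w - w % 2].reshape(
        h // 2, 2, w // 2, 2).max(axis=(1, 3))
```

Applying `max_pool_2x2(relu(fm))` to a feature map halves each spatial dimension while retaining the dominant local responses.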
The fully connected unit 106 performs class classification based on the model and the feature map information. More specifically, for example, the fully connected unit 106 performs class classification based on the feature map information that the activation unit 104 and the pooling unit 105 have processed after it was output by the convolution processing unit 103.
The fully connected unit 106, for example, extracts feature values from the feature map information and converts the extracted feature values into a vector. When a plurality of fully connected units 106 are provided and form layers, the vector output from the last fully connected layer contains a component for each of the identifiable classes. A fully connected unit 106 that applies a softmax function may be provided as the final layer of the stacked fully connected units 106, converting the output vector into information indicating the probability of each class.
The fully connected unit 106 may include the activation unit 104, the pooling unit 105, or both. That is, the activation unit 104, the pooling unit 105, and the fully connected unit 106 may be collectively defined as the fully connected unit 106.
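The final classification stage can be sketched as follows. This is our own minimal illustration (the weights, bias, and single-layer structure are assumptions for the example; the patent's model parameters are learned):

```python
import numpy as np

# Final stage sketch: flatten the feature map into a vector, apply one
# linear (fully connected) layer, then softmax to obtain one probability
# per identifiable class.
def classify(feature_map, weights, bias):
    x = feature_map.ravel()                  # feature values as a vector
    logits = weights @ x + bias              # one component per class
    exp = np.exp(logits - logits.max())      # numerically stable softmax
    return exp / exp.sum()                   # class probabilities
```

The output vector sums to 1 and can be read directly as the probability of each class.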
The result output unit 107 outputs the classification result produced by the fully connected unit 106.
More specifically, for example, the result output unit 107 outputs the classification result produced by the fully connected unit 106 to the evaluation unit 108 described later.
The result output unit 107 may also output the classification result to a display control device (not shown). The display control device outputs the classification result, for example, by visually displaying it on a display screen (not shown).
The evaluation unit 108 evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal values for the class classification.
More specifically, for example, the evaluation unit 108 calculates the error between the probability of each class and the ideal value using an error function such as a cross-entropy function, based on the classification result and the ideal class values, and obtains the parameters that minimize the error.
The evaluation unit 108 acquires the ideal values for the class classification from, for example, the storage device 12.
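The cross-entropy error mentioned above can be sketched as follows, assuming the predicted probabilities and the one-hot ideal values described earlier (a standard formulation; the patent does not spell out the exact expression):

```python
import numpy as np

# Cross-entropy error between predicted class probabilities and the one-hot
# ideal class values: zero for a perfect prediction, larger otherwise.
def cross_entropy(probabilities, ideal):
    eps = 1e-12  # avoid log(0)
    return -np.sum(ideal * np.log(probabilities + eps))
```

A perfect prediction such as (1, 0) against the ideal (1, 0) gives an error of essentially 0, while the uninformative prediction (0.5, 0.5) gives ln 2.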
The model updating unit 109 updates the model by changing the parameters included in the model based on the evaluation result produced by the evaluation unit 108. The model updating unit 109, for example, outputs the updated model to the storage device 12. The storage device 12 replaces the model stored in the storage unit 13 with the model output by the model updating unit 109 and stores it in the storage unit 13. The model updating unit 109 may also output the updated model to, for example, the model acquisition unit 102.
FIG. 1 illustrates both the case where the model updating unit 109 outputs the updated model to the storage device 12 and the case where it outputs the updated model to the model acquisition unit 102.
In the radio wave image identification device 100, the evaluation unit 108 and the model updating unit 109 are not essential components. By including the evaluation unit 108 and the model updating unit 109, the parameters used for executing the identification processing change, for example, for each piece of radio wave image information acquired by the radio wave image acquisition unit 101.
With this configuration, the model is updated by learning for each piece of radio wave image information acquired by the radio wave image acquisition unit 101, so the radio wave image identification device 100 can execute the object identification processing using a more suitable model.
The hardware configuration of the main part of the radio wave image identification device 100 according to Embodiment 1 will be described with reference to FIGS. 2A and 2B.
FIGS. 2A and 2B are diagrams showing an example of the hardware configuration of the main part of the radio wave image identification device 100 according to Embodiment 1.
As shown in FIG. 2A, the radio wave image identification device 100 is configured as a computer having a processor 201 and a memory 202. The memory 202 stores a program for causing the computer to function as the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109. The processor 201 reads and executes the program stored in the memory 202, thereby realizing the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109.
Alternatively, as shown in FIG. 2B, the radio wave image identification device 100 may be configured with a processing circuit 203. In this case, the functions of the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109 may be realized by the processing circuit 203.
The radio wave image identification device 100 may also be configured with the processor 201, the memory 202, and the processing circuit 203 (not shown). In this case, some of the functions of the radio wave image acquisition unit 101, the model acquisition unit 102, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, the result output unit 107, the evaluation unit 108, and the model updating unit 109 may be realized by the processor 201 and the memory 202, and the remaining functions may be realized by the processing circuit 203.
The processor 201 is, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a microcontroller, or a DSP (Digital Signal Processor).
The memory 202 is, for example, a semiconductor memory or a magnetic disk. More specifically, the memory 202 is a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), an SSD (Solid State Drive), an HDD (Hard Disk Drive), or the like.
The processing circuit 203 is, for example, an ASIC (Application Specific Integrated Circuit), a PLD (Programmable Logic Device), an FPGA (Field-Programmable Gate Array), an SoC (System-on-a-Chip), or a system LSI (Large-Scale Integration).
The attention area and the peripheral area at a certain position on a radio wave image or a feature map will be described with reference to FIG. 3.
FIG. 3 is a diagram showing an example of the attention area and the peripheral area at a certain position on a radio wave image or a feature map.
As shown in FIG. 3, for example, the attention area at a certain position on a radio wave image or a feature map is the area enclosed by a circle Ca of radius Ra centered at that position. The peripheral area at that position is the area enclosed by a circle Cb of radius Rb centered at the same position, minus the attention area.
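On a discrete grid, the two areas of FIG. 3 can be sketched as boolean masks. This is our own minimal illustration (grid discretization and the function name are assumptions, not the patent's implementation):

```python
import numpy as np

# On an H x W grid, the attention area is a disc of radius ra around
# (cy, cx), and the peripheral area is the annulus between ra and rb.
def region_masks(h, w, cy, cx, ra, rb):
    ys, xs = np.ogrid[:h, :w]
    dist2 = (ys - cy) ** 2 + (xs - cx) ** 2
    attention = dist2 <= ra ** 2
    peripheral = (dist2 <= rb ** 2) & ~attention
    return attention, peripheral
```

By construction the two masks are disjoint, matching the definition that the peripheral area excludes the attention area.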
The method of calculating the size of the attention area and the size of the peripheral area corresponding to the position of each pixel on a radio wave image or each element on a feature map will be described with reference to FIG. 4.
The convolution processing unit 103 has, for example, an area size determination layer 133 (not shown) that executes its processing before the processing in the separability calculation layer 131 is executed.
The area size determination layer 133, for example, executes convolution processing on the radio wave image information or feature map information input to the convolution processing unit 103 using a predetermined convolution filter (hereinafter referred to as an "area determination filter"), thereby calculating a value indicating the size of the attention area or the size of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map.
FIG. 4A is a diagram showing an example of a method of calculating the radius Ra, which indicates the size of the attention area corresponding to the position of each pixel on a radio wave image.
Specifically, for example, when radio wave image information with H pixels in the vertical direction, W pixels in the horizontal direction, and C channels (hereinafter written as "H × W × C") is input to the convolution processing unit 103, the area size determination layer 133 executes convolution processing on that radio wave image information using the area determination filter and outputs "H × W × 1" attention area size map information, in which the value of the element corresponding to each pixel of the radio wave image information is the value of the radius Ra at the position of that pixel.
A channel in the radio wave image information corresponds, for example, to a color component of the radio wave image, such as red, blue, or green. That is, for example, when each pixel of the radio wave image information has three color components (a red component, a blue component, and a green component), the number of channels of the radio wave image is 3.
More specifically, for example, the radio wave image acquisition unit 101 acquires the radio wave image information and generates, from the acquired radio wave image information, a plurality of pieces of color component image information, one for each color component. The area size determination layer 133 performs convolution processing on the plurality of pieces of color component image information using the area determination filter, and outputs attention area size map information in which the value of the element corresponding to each pixel of the radio wave image information is the value of the radius Ra at the position of that pixel.
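The H × W × C to H × W × 1 mapping can be sketched as follows. This is a minimal illustration under assumptions of our own (a 1×1 cross-channel filter and a softplus to keep radii positive; the patent's learned area determination filter is not specified at this level of detail):

```python
import numpy as np

# Area size determination sketch: a convolution over an H x W x C input
# that outputs an H x W x 1 map whose value at each position is the
# attention-area radius Ra for that pixel or element.
def radius_map(image_hwc, filter_c, bias=1.0):
    """image_hwc: (H, W, C) array; filter_c: (C,) per-channel weights.
    A 1x1 convolution across channels, kept positive with a softplus."""
    z = image_hwc @ filter_c + bias          # (H, W) linear response
    return np.log1p(np.exp(z))[..., None]    # (H, W, 1), Ra > 0 everywhere
```

The same sketch applies unchanged to feature map input (FIG. 4B), with C equal to the number of input feature maps.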
FIG. 4B is a diagram showing an example of a method of calculating the radius Ra, which indicates the size of the attention area corresponding to the position of each element on a feature map.
Specifically, for example, when feature map information with H elements in the vertical direction, W elements in the horizontal direction, and C channels (hereinafter written as "H × W × C") is input to the convolution processing unit 103, the area size determination layer 133 executes convolution processing on that feature map using the area determination filter and outputs "H × W × 1" attention area size map information, in which the value of the element corresponding to each element of the feature map is the value of the radius Ra at the position of that element.
A channel in the feature map information is, for example, one of the pieces of feature map information input to the convolution processing unit 103. Specifically, for example, when the radio wave image identification device 100 includes a plurality of convolution processing units 103 that form layers, one or more pieces of feature map information are output from the convolution processing unit 103 of the layer processed first, and the convolution processing unit 103 of a layer processed later executes convolution processing on those one or more pieces of feature map information using the area determination filter. The number of channels is the number of pieces of feature map information input to the convolution processing unit 103 of the later layer.
The radius Rb, which indicates the size of the outer circumference of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map, is calculated, for example, by the same method as the method of calculating the radius Ra described above. That is, the area size determination layer 133 outputs peripheral area size map information, which indicates the size of the outer circumference of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map, by the same method used to output the attention area size map information.
Alternatively, for example, the value indicating the size of the outer circumference of the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map may be calculated by multiplying the value indicating the size of the attention area corresponding to that position by a preset value.
When the attention area or the peripheral area has a shape other than a circle, for example an ellipse, the area size determination layer 133 uses an area determination filter for calculating the major axis of the ellipse and an area determination filter for calculating the minor axis, and outputs major-axis attention area size map information or peripheral area size map information, and minor-axis attention area size map information or peripheral area size map information, each indicating the size of the attention area or the peripheral area corresponding to the position of each pixel on the radio wave image or each element on the feature map.
That is, the area size determination layer 133 outputs the necessary attention area size map information and peripheral area size map information by using the plurality of area determination filters needed to define the sizes of the attention area and the peripheral area of the predetermined shape.
The parameters of the area size determination layer 133 are learned and updated for each piece of radio wave image information acquired by the radio wave image acquisition unit 101, through the processing in the evaluation unit 108 and the model update unit 109 described above. The radio wave image identifying apparatus 100 can therefore output the attention area size map information and the peripheral area size map information using a better-suited model.
The degree of separation between the attention area and the peripheral area, and the method by which the separation degree calculation layer 131 calculates it, will now be described.
The separation degree calculation layer 131 generates, for example, a plurality of filters used for calculating the degree of separation between the attention area and the peripheral area. The filters generated by the separation degree calculation layer 131 are, for example, an inter-region filter, a summation area filter, and a separation degree calculation filter.
The inter-region filter is a convolution filter for calculating the amount of change in the Radar Cross Section value (hereinafter “RCS value”) between the attention area and the peripheral area.
Specifically, for example, the inter-region filter is a convolution filter for calculating, over the summation area, the sum of a value indicating the variance of the RCS values in the attention area and a value indicating the variance of the RCS values in the peripheral area (hereinafter “inter-region variance value”).
More specifically, for example, the inter-region filter is a convolution filter for calculating the inter-region variance value using Expression (1).
[Expression (1) (the inter-region variance value) and its symbol definitions appear as images in the original publication.]
The RCS value is a value corresponding to the radio wave reflection intensity at a certain pixel on the radio wave image or a certain element on the feature map.
The summation area filter is a convolution filter for calculating the amount of change in the RCS value over the area obtained by combining the attention area and the peripheral area (hereinafter “summation area”).
Specifically, for example, the summation area filter is a convolution filter for calculating a value indicating the variance of the RCS values in the summation area (hereinafter “summation area variance value”).
More specifically, for example, the summation area filter is a convolution filter for calculating the summation area variance value using Expression (2).
[Expression (2) (the summation area variance value) and its symbol definitions appear as images in the original publication.]
The summation area filter may be a convolution filter for calculating the summation area variance value using Expression (3) obtained by discretizing Expression (2).
[Expression (3) (the discretized form of Expression (2)) and its symbol definitions appear as images in the original publication.]
The separation degree calculation filter is a filter for calculating a value indicating the ratio of the amount of change in the RCS value between the attention area and the peripheral area to the amount of change in the RCS value in the summation area.
Specifically, for example, the separation degree calculation filter calculates the degree of separation between the attention area and the peripheral area by dividing the inter-region variance value by the summation area variance value.
More specifically, for example, the separation degree calculation filter calculates the degree of separation between the attention area and the peripheral area using Expression (4).
[Expression (4) (the degree of separation) and its symbol definitions appear as images in the original publication.]
The separation degree calculation layer 131 calculates the degree of separation between the attention area and the peripheral area using the inter-region filter, the summation area filter, and the separation degree calculation filter.
The degree of separation between the attention area and the peripheral area approaches 0 when the amount of change in the RCS value between the two areas is small, and approaches 1 when that amount of change is large.
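Numerically, the degree of separation defined by Expressions (1), (2), and (4) can be sketched as the ratio of between-region variance to total variance over the summation area. The function below is a simplified stand-in for the filter-based computation, assuming circular regions; all names are illustrative:

```python
import numpy as np

def separability(image, cy, cx, ra, rb):
    """Degree of separation between the attention disk of radius ra and the
    peripheral annulus ra < r <= rb centred on pixel (cy, cx).
    Returns between-region variance / total variance over the summation
    area, which approaches 0 for similar regions and 1 for distinct ones."""
    yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
    r2 = (yy - cy) ** 2 + (xx - cx) ** 2
    inner = image[r2 <= ra ** 2]                      # attention area
    outer = image[(r2 > ra ** 2) & (r2 <= rb ** 2)]   # peripheral area
    both = np.concatenate([inner, outer])             # summation area
    m = both.mean()
    between = (inner.size * (inner.mean() - m) ** 2
               + outer.size * (outer.mean() - m) ** 2) / both.size
    total = both.var()
    return between / total if total > 0 else 0.0
```

A bright disk exactly filling the attention area against a flat background gives a degree of separation of 1, while a uniform image gives 0.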
A method of weighting the processing results based on the degree of separation will now be described.
The attention area convolutional layer 132 generates a plurality of filters used for generating the feature map information, for example an attention area filter and a weighting filter.
The attention area filter is a convolution filter for calculating, from the RCS values in the attention area corresponding to the position of a pixel on the radio wave image or of an element on the feature map, a characteristic RCS value at that position.
The attention area convolutional layer 132 uses the attention area filter to calculate a characteristic RCS value at the position of each pixel on the radio wave image or of each element on the feature map.
The weighting filter is, for example, a filter for multiplying the characteristic RCS value at the position of a pixel on the radio wave image or of an element on the feature map, calculated using the attention area filter, by the degree of separation at that position calculated by the separation degree calculation layer 131.
The attention area convolutional layer 132 generates the feature map information by using the weighting filter to multiply the characteristic RCS value at the position of each pixel on the radio wave image or of each element on the feature map by the degree of separation at that position.
As described above, the degree of separation between the attention area and the peripheral area approaches 1 when the amount of change in the RCS value between the two areas is large, and approaches 0 when it is small. In the feature map information generated by this multiplication, the weight given to an attention area is therefore large where the degree of separation is high, and small where it is low.
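The weighting step itself reduces to an element-wise product. The sketch below, with illustrative names rather than the patented filters, shows how two positions with identical attention-filter responses are ranked differently by their degrees of separation:

```python
import numpy as np

def weight_by_separability(response, sep):
    """Multiply the characteristic RCS value at each position by the degree
    of separation there (sep in [0, 1]): well-separated attention areas keep
    large values, while positions in flat surroundings are attenuated."""
    return np.asarray(response, dtype=float) * np.asarray(sep, dtype=float)

# two positions, equal filter responses, different degrees of separation
weighted = weight_by_separability([5.0, 5.0], [0.9, 0.1])
```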
The inter-region filter, the summation area filter, and the attention area filter, which perform convolution processing on the RCS values in the attention area or the peripheral area, correspond to the shape and size of the attention area or the peripheral area. The attention area and the peripheral area have a shape such as a circle, an ellipse, an oval, or a rounded rectangle.
The inter-region filter, the summation area filter, and the attention area filter are generated based on the attention area size map information and the peripheral area size map information output by the area size determination layer 133.
More specifically, for example, these filters are generated by applying the values indicating the size of the attention area or peripheral area, contained in the attention area size map information and the peripheral area size map information, to a convolution filter of a predetermined shape, such as a variable-size circular filter, included in the model acquired by the model acquisition unit 102.
Alternatively, for example, these filters may be generated in a predetermined shape such as a circle by rearranging the positions of a rectangular filter used in conventional convolution processing so that they are distributed substantially evenly over the attention area or the peripheral area.
FIG. 5 is a diagram showing an example of processing a rectangular filter used in conventional convolution processing into a circular filter corresponding to the attention area and the peripheral area according to the first embodiment.
As shown in FIG. 5, for example, the convolution processing unit 103 may generate the inter-region filter, the summation area filter, and the attention area filter, which are convolution filters, by decomposing a rectangular filter and arranging the decomposed parts so as to match the shapes of the attention area and the peripheral area.
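One simple reading of this adaptation can be sketched as follows, under the assumption that "processing" the rectangular filter means discarding the taps that fall outside the inscribed circle and renormalising the remaining weights; the actual rearrangement shown in Fig. 5 may differ:

```python
import numpy as np

def circularize(rect_filter):
    """Keep only the taps of a square filter whose centres lie inside the
    inscribed circle, then rescale so the total weight is unchanged.
    Illustrative simplification of the Fig. 5 rearrangement."""
    k = rect_filter.shape[0]
    c = (k - 1) / 2.0                     # centre of the k x k window
    yy, xx = np.mgrid[:k, :k]
    mask = (yy - c) ** 2 + (xx - c) ** 2 <= c ** 2 + 1e-9
    out = np.where(mask, rect_filter, 0.0)
    s = out.sum()
    return out * (rect_filter.sum() / s) if s != 0 else out
```

For a 3 × 3 all-ones filter this keeps the centre and the four edge taps (a cross shape) and drops the corners.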
The operation of the radio wave image identifying apparatus 100 according to the first embodiment will be described with reference to FIG.
FIG. 6 is a flowchart illustrating an example of processing of the radio wave image identifying apparatus 100 according to the first embodiment.
The radio wave image identifying apparatus 100, for example, repeatedly executes the processing of this flowchart for each piece of radio wave image information.
First, in step ST601, the radio wave image acquisition unit 101 acquires radio wave image information.
Next, in step ST602, the model acquisition unit 102 acquires a model.
Next, in step ST603, the convolution processing unit 103 generates attention area size map information and peripheral area size map information indicating the sizes of the attention area and the peripheral area at the position of each pixel on the radio wave image or of each element on the feature map.
Next, in step ST604, the convolution processing unit 103 calculates the degree of separation between the attention area and the peripheral area.
Next, in step ST605, the convolution processing unit 103 executes the convolution processing on the attention area, and generates the feature map information by weighting the result of that convolution processing based on the degree of separation.
When there are a plurality of convolution processing units 103 arranged in layers, it is determined in step ST606 whether the convolution processing unit 103 that executed the processing of steps ST603 to ST605 is the convolution processing unit 103 of the last layer.
If, in step ST606, it is not the convolution processing unit 103 of the last layer, the radio wave image identifying apparatus 100 returns to step ST603 and causes the convolution processing unit 103 of the layer following the one that has just executed the processing to execute steps ST603 to ST606.
If, in step ST606, it is the convolution processing unit 103 of the last layer, the activation unit 104 performs processing to reduce unnecessary feature amounts in step ST607.
Next, in step ST608, the pooling unit 105 performs reduction processing so as to retain the feature amounts.
Next, in step ST609, the fully connected unit 106 performs class classification based on the model and the feature map information.
Next, in step ST610, the result output unit 107 outputs the classification result produced by the fully connected unit 106.
Next, in step ST611, the evaluation unit 108 evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal values for the class classification.
Next, in step ST612, the model update unit 109 updates the model by changing the parameters included in the model based on the evaluation result from the evaluation unit 108.
After executing the processing of step ST612, the radio wave image identifying apparatus 100 ends the processing of this flowchart, returns to step ST601, and repeatedly executes the processing of the flowchart for each piece of radio wave image information.
In the processing of this flowchart, the order of step ST601 and step ST602 may be reversed.
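The control flow of the flowchart can be sketched as the loop below; the injected callables are stand-ins for the processing units, and none of these interfaces appear in the specification:

```python
def identify_and_update(radio_images, model, forward, classify, evaluate, update):
    """Per-image loop of FIG. 6: feature extraction (ST603-ST608),
    classification and result output (ST609-ST610), parameter evaluation
    against the ideal class values (ST611), and model update (ST612)."""
    results = []
    for image in radio_images:                # one iteration per image (ST601)
        features = forward(model, image)      # convolution, activation, pooling
        result = classify(model, features)    # class classification
        results.append(result)                # classification result output
        loss = evaluate(result)               # parameter evaluation
        model = update(model, loss)           # model update before next image
    return results, model
```

Because the model is updated inside the loop, each new radio wave image is processed with the parameters learned from the previous ones.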
As described above, the radio wave image identifying apparatus 100 includes: a radio wave image acquisition unit 101 that acquires radio wave image information indicating a radio wave image; a model acquisition unit 102 that acquires a model used for identification processing based on the radio wave image information; one or more convolution processing units 103, each having a separation degree calculation layer 131 that sets, in the radio wave image information or in feature map information generated based on it, an attention area enclosed by a closed curve and a peripheral area obtained by excluding the attention area from an area enclosed by a closed curve containing the attention area, based on the model, and calculates the degree of separation between the attention area and the peripheral area, and an attention area convolutional layer 132 that executes convolution processing on the attention area in the radio wave image information or the feature map information based on the model and generates feature map information by weighting the result of that convolution processing based on the degree of separation; a fully connected unit 106 that performs class classification based on the model and the feature map information; and a result output unit 107 that outputs the classification result produced by the fully connected unit 106.
With this configuration, the radio wave image identifying apparatus 100 applies weighting based on the calculated degree of separation between the attention area and the peripheral area, and can therefore identify objects with high accuracy even when the input image is a radio wave image.
Further, in the radio wave image identifying apparatus 100, the convolution processing unit 103 is configured to increase the weight given to the attention area when the degree of separation is high, and to decrease it when the degree of separation is low.
With this configuration, the radio wave image identifying apparatus 100 can make the RCS values in the attention area stand out when they are characteristic.
The radio wave image identifying apparatus 100 is also configured so that the size of the attention area, being updated through learning, changes for each piece of radio wave image information acquired by the radio wave image acquisition unit 101.
More specifically, the radius Ra of the circle Ca defining the attention area and the radius Rb of the circle Cb defining the peripheral area are updated through learning, and thus change for each piece of radio wave image information acquired by the radio wave image acquisition unit 101.
With this configuration, the radio wave image identifying apparatus 100 can calculate better-suited feature amounts and improve identification accuracy.
Further, in the radio wave image identifying apparatus 100, the convolution filter used when the convolution processing unit 103 performs convolution processing on the attention area in the radio wave image information or the feature map information is configured by processing a rectangular filter.
With this configuration, the radio wave image identifying apparatus 100 does not need to prepare a convolution filter of a predetermined shape in advance, and can use a more general-purpose convolution filter.
The radio wave image identifying apparatus 100 also includes an evaluation unit 108 that evaluates the parameters included in the model based on the classification result output by the result output unit 107 and the ideal values for the class classification, and a model update unit 109 that updates the model by updating the parameters included in the model based on the evaluation result from the evaluation unit 108.
With this configuration, the radio wave image identifying apparatus 100 can perform identification processing based on a model better suited to each piece of radio wave image information it acquires, and can improve identification accuracy.
The radio wave image identifying apparatus 100 is also configured so that the shape of the attention area is a circle, an ellipse, an oval, or a rounded rectangle.
Features of objects appearing in a radio wave image tend to have nearly circular shapes. By using a convolution filter shaped as a circle, an ellipse, an oval, or a rounded rectangle, the radio wave image identifying apparatus 100 can therefore extract better-suited feature amounts than with a conventional rectangular filter, and can identify objects with high accuracy even when the input image is a radio wave image.
Conventional convolution processing calculates, for each pixel of an image, the inner product of a k × k convolution filter and the region of k vertical pixels and k horizontal pixels (k being a natural number) centred on that pixel. This amounts to calculating a similarity indicating how closely the convolved region resembles the convolution filter.
In contrast, the convolution processing in the convolution processing unit 103 calculates, in addition to the similarity between the attention area filter and the attention area, shaped as a circle, an ellipse, an oval, a rounded rectangle, or the like centred on the point of interest, the degree of separation between the attention area and the peripheral area outside it, that is, a value indicating how dissimilar the attention area and the peripheral area are, and combines the similarity with the degree of separation. With this configuration, the radio wave image identifying apparatus 100 does not merely measure the similarity between the RCS values in an attention area on the radio wave image and the RCS values in the radio wave image as a whole, but also measures how distinctive the attention area is within the radio wave image information, so that many features can be extracted even from a radio wave image, which carries fewer features than an optical image.
The radio wave image information acquired by the radio wave image identifying apparatus 100 according to the first embodiment will be described with reference to FIG. 7.
FIGS. 7A and 7B are diagrams showing examples of radio wave images indicated by the radio wave image information acquired by the radio wave image identifying apparatus 100.
As illustrated in FIG. 7A, the radio wave image information acquired by the radio wave image identifying apparatus 100 is, for example, a range-Doppler map represented by a two-dimensional image with a distance axis and a velocity axis.
FIG. 7A shows radio wave images corresponding to two different types of objects to be identified.
In FIG. 7A, the light and shade in each radio wave image indicates the RCS intensity of the reflection, received by the antenna, of the radio wave radiated toward the object to be identified. The light and shade also reflects noise received by the antenna in addition to the reflected wave. Noise is expressed by the signal-to-noise ratio (hereinafter “SN ratio”); the smaller the SN ratio, the greater the influence of noise on the radio wave image.
The radio wave image information acquired by the radio wave image identifying apparatus 100 is not limited to a range-Doppler map.
As shown in FIG. 7B, the radio wave image information acquired by the radio wave image identifying apparatus 100 may be, for example, a micro-Doppler spectrogram represented by a two-dimensional image with a frequency axis and a time axis.
The radio wave image on the left of FIG. 7A shows the RCS intensity of the reflection from a conical object, and the image on the right shows the RCS intensity of the reflection from a cylindrical object. In each radio wave image of FIG. 7A, the RCS intensity of the identification target appears inside the closed curve drawn with a broken line in the centre of the image; the RCS intensity outside that region is due to noise. In a radio wave image it is thus more difficult than in an optical image to distinguish RCS intensity caused by noise from RCS intensity caused by the reflection from the object to be identified.
Further, as shown in FIG. 7A, the RCS intensity values in a radio wave image do not vary as sharply as, for example, the luminance values in an optical image. The RCS intensity produced by the reflection from the object to be identified therefore has no clear contour, and conventional convolution processing, which extracts contour or shape information, cannot extract sufficient feature amounts.
In the radio wave image identifying apparatus 100, however, even when the RCS intensity of the reflection from the object has no clear contour, the convolution processing unit 103 statistically compares the RCS values in the attention area with those in the peripheral area. Even where the RCS values change smoothly, if they differ between the attention area and the peripheral area, the attention area is treated as containing important feature amounts and is given a larger weight.
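This behaviour can be checked numerically: a smooth Gaussian-shaped RCS blob has no sharp contour (adjacent pixels differ only slightly), yet the attention disk still separates statistically from the surrounding annulus. The radii and blob width below are arbitrary illustration values, not taken from the specification:

```python
import numpy as np

# Smooth "RCS blob": values change gradually, so there is no clear contour
# for edge-based feature extraction to find.
yy, xx = np.mgrid[:41, :41]
r2 = (yy - 20) ** 2 + (xx - 20) ** 2
blob = np.exp(-r2 / 60.0)

inner = blob[r2 <= 25]                   # attention area, radius 5
outer = blob[(r2 > 25) & (r2 <= 100)]    # peripheral area, outer radius 10
both = np.concatenate([inner, outer])
m = both.mean()
between = (inner.size * (inner.mean() - m) ** 2
           + outer.size * (outer.mean() - m) ** 2) / both.size
eta = between / both.var()               # degree of separation
```

Despite the absence of any edge, eta is well above zero, so the attention area around the blob centre receives a large weight.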
Embodiment 2.
A radio wave image learning device 150 according to the second embodiment will be described with reference to FIG.
FIG. 8 is a block diagram showing an example of a main part of the radio wave image learning device 150 according to the second embodiment.
As shown in FIG. 8, the radio wave image learning device 150 is applied to, for example, the radio wave image learning system 15.
Compared with the radio wave image identifying apparatus 100 according to the first embodiment, in the radio wave image learning device 150 according to the second embodiment the model update unit 109 is replaced by a model update unit 109a, and the model acquisition unit 102 is removed.
In the configuration of the radio wave image learning device 150 according to the second embodiment, components identical to those of the radio wave image identifying apparatus 100 according to the first embodiment are given the same reference numerals, and duplicate description is omitted. That is, description of the components of FIG. 8 bearing the same reference numerals as those in FIG. 1 is omitted.
The radio wave image learning system 15 includes, for example, a radio wave image learning device 150, a radio wave image output device 11, and a storage device 12.
Since the radio wave image output device 11 and the storage device 12 are the same as those described in the first embodiment, detailed description is omitted.
The radio wave image learning device 150 includes a radio wave image acquisition unit 101, a convolution processing unit 103, an activation unit 104, a pooling unit 105, a fully connected unit 106, a result output unit 107, an evaluation unit 108, and a model update unit 109a.
The radio wave image acquisition unit 101, convolution processing unit 103, activation unit 104, pooling unit 105, fully connected unit 106, result output unit 107, and evaluation unit 108 are equivalent to the corresponding units described in the first embodiment, and detailed description is omitted.
The evaluation unit 108 holds in advance the ideal class values that the radio wave image learning device 150 uses for class classification in the identification processing.
The model update unit 109a holds an untrained or partially trained model prepared in advance.
The radio wave image learning device 150 may also include a model acquisition unit 102a (not shown in FIG. 8), and the model update unit 109a may acquire an untrained or partially trained model prepared in advance in the storage device 12 via the model acquisition unit 102a.
The model update unit 109a updates the model for each piece of radio wave image information by updating the parameters included in the model on the basis of the evaluation result produced by the evaluation unit 108, and holds the updated model.
The model update unit 109a also outputs the updated model to the storage device 12.
The convolution processing unit 103 generates feature map information based on the radio wave image information acquired by the radio wave image acquisition unit 101, using the model held by the model update unit 109a.
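For concreteness, the attention-area convolution performed inside the convolution processing unit can be sketched as follows. This is a minimal single-channel illustration assuming a "valid"-mode convolution and a precomputed separability value per output position; the function and variable names are ours, not the patent's.

```python
import numpy as np

def attention_convolution(image, kernel, sep_map):
    """Valid-mode 2-D convolution whose output at each position is
    weighted by a precomputed separability map (one weight per output
    position). Shapes and names are illustrative assumptions."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # plain convolution (correlation) over the local window
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    # weighting by separability: higher separability -> stronger response
    return out * sep_map
```

A window whose attention area separates cleanly from its surroundings thus contributes more strongly to the generated feature map, which is the emphasis mechanism the embodiment describes.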
In the radio wave image learning device 150 according to the second embodiment, the functions of the radio wave image acquisition unit 101, the model acquisition unit 102a, the convolution processing unit 103, the activation unit 104, the pooling unit 105, the fully connected unit 106, the result output unit 107, the evaluation unit 108, and the model update unit 109a may be realized by the processor 201 and the memory 202 in the hardware configurations illustrated in FIGS. 2A and 2B in the first embodiment, or by the processing circuit 203.
The operation of the main part of the radio wave image learning device 150 is the same as that of the radio wave image identification device 100 according to the first embodiment described with reference to FIG. 6, and description thereof is omitted.
As described above, the radio wave image learning device 150 includes: the radio wave image acquisition unit 101, which acquires radio wave image information indicating a radio wave image; one or more convolution processing units 103, each having a separability calculation layer 131 that, on the basis of a model, sets, in the radio wave image information or in feature map information generated on the basis of the radio wave image information, an attention area, which is an area enclosed by a closed curve, and a peripheral area, which is an area obtained by excluding the attention area from an area enclosed by a closed curve containing the attention area, and calculates a separability between the attention area and the peripheral area, and an attention-area convolution layer 132 that, on the basis of the model, performs convolution processing on the attention area in the radio wave image information or the feature map information and generates the feature map information by weighting the result of that convolution processing on the basis of the separability; the fully connected unit 106, which performs class classification on the basis of the model and the feature map information; the result output unit 107, which outputs the classification result of the class classification performed by the fully connected unit 106; the evaluation unit 108, which evaluates the parameters included in the model on the basis of the classification result output by the result output unit 107 and the ideal values for the class classification; and the model update unit 109a, which updates the model by updating the parameters included in the model on the basis of the evaluation result produced by the evaluation unit 108.
With this configuration, the radio wave image learning device 150 enables model learning for highly accurate object identification even when the input image is a radio wave image.
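The evaluate-and-update cycle carried out by the evaluation unit 108 and the model update unit 109a can be illustrated with a generic gradient-descent step that compares the classification result against the ideal class values. The linear model, the cross-entropy-style update, and the learning rate below are illustrative stand-ins, not the patent's actual update rule.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def update_step(W, features, ideal_onehot, lr=0.1):
    """One evaluation-and-update cycle in the spirit of the evaluation
    unit / model update unit: compare the classification result with
    the ideal class values and adjust the model parameters."""
    probs = softmax(W @ features)      # classification result
    error = probs - ideal_onehot       # evaluation against ideal values
    grad = np.outer(error, features)   # cross-entropy gradient (linear model)
    return W - lr * grad               # updated model parameters
```

Repeating this step for each piece of radio wave image information drives the classification result toward the held ideal values, which is the role the embodiment assigns to the model update unit 109a.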
It should be noted that, within the scope of the invention, the embodiments may be freely combined, any component of each embodiment may be modified, and any component of each embodiment may be omitted.
The radio wave image identification device and the radio wave image learning device according to the present invention can be applied to a radio wave image identification system.
10 radio wave image identification system, 11 radio wave image output device, 12 storage device, 13 storage unit, 100 radio wave image identification device, 101 radio wave image acquisition unit, 102 model acquisition unit, 103 convolution processing unit, 104 activation unit, 105 pooling unit, 106 fully connected unit, 107 result output unit, 108 evaluation unit, 109, 109a model update unit, 131 separability calculation layer, 132 attention-area convolution layer, 201 processor, 202 memory, 203 processing circuit, 15 radio wave image learning system, 150 radio wave image learning device.

Claims (16)

  1.  A radio wave image identification device comprising:
      a radio wave image acquisition unit that acquires radio wave image information indicating a radio wave image;
      a model acquisition unit that acquires a model used for identification processing based on the radio wave image information;
      one or more convolution processing units, each having: a separability calculation layer that, on the basis of the model, sets, in the radio wave image information or in feature map information generated on the basis of the radio wave image information, an attention area, which is an area enclosed by a closed curve, and a peripheral area, which is an area obtained by excluding the attention area from an area enclosed by a closed curve containing the attention area, and that calculates a separability between the attention area and the peripheral area; and an attention-area convolution layer that, on the basis of the model, performs convolution processing on the attention area in the radio wave image information or the feature map information and generates the feature map information by weighting a result of the convolution processing on the basis of the separability;
      a fully connected unit that performs class classification on the basis of the model and the feature map information; and
      a result output unit that outputs a classification result of the class classification performed by the fully connected unit.
  2.  The radio wave image identification device according to claim 1, wherein the separability is the proportion that the amount of change in RCS value between the attention area and the peripheral area accounts for in the amount of change in RCS value over the area combining the attention area and the peripheral area.
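For illustration, the separability defined in claim 2 can be computed as the between-region variation of RCS values divided by the total variation over the combined area, in the style of the separability filter literature cited in this publication. The array-based formulation and names below are assumptions for illustration only.

```python
import numpy as np

def separability(rcs, inner_mask):
    """Claim-2-style separability: between-region variation of RCS
    values as a fraction of the total variation over the combined
    (attention + peripheral) area. Returns a value in [0, 1]."""
    inner = rcs[inner_mask]    # attention area samples
    outer = rcs[~inner_mask]   # peripheral area samples
    mu = rcs.mean()
    between = inner.size * (inner.mean() - mu) ** 2 \
            + outer.size * (outer.mean() - mu) ** 2
    total = ((rcs - mu) ** 2).sum()
    return 0.0 if total == 0.0 else between / total
```

Two regions with constant but different RCS values give a separability of 1, and a uniform patch gives 0, matching the claimed behavior of the ratio.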
  3.  The radio wave image identification device according to claim 1, wherein the convolution processing unit gives a larger weight to the attention area when the separability is high, and gives a smaller weight to the attention area when the separability is low.
  4.  The radio wave image identification device according to claim 1, wherein the size of the attention area is updated by learning and therefore changes for each piece of the radio wave image information acquired by the radio wave image acquisition unit.
  5.  The radio wave image identification device according to claim 1, wherein the shape of the attention area is a circle, and the radius of the circle is updated by learning and therefore changes for each piece of the radio wave image information acquired by the radio wave image acquisition unit.
  6.  The radio wave image identification device according to claim 1, wherein the convolution filter used when the convolution processing unit performs the convolution processing on the attention area in the radio wave image information or the feature map information is configured by processing a rectangular filter.
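A circular filter "configured by processing a rectangular filter", as in claim 6, can be sketched by zeroing the taps of a square kernel that fall outside its inscribed circle. The masking rule is an illustrative assumption, not the patent's specified construction.

```python
import numpy as np

def circular_kernel(rect_kernel):
    """Derive a circular convolution filter from a square (rectangular)
    filter by masking out the corner taps outside the inscribed circle."""
    k = rect_kernel.shape[0]
    c = (k - 1) / 2.0                     # center of the square kernel
    yy, xx = np.mgrid[0:k, 0:k]
    inside = (yy - c) ** 2 + (xx - c) ** 2 <= c ** 2 + 1e-9
    return rect_kernel * inside           # corners outside the circle become 0
```

For a 3x3 kernel this keeps the center and the four edge taps and zeroes the four corners, giving the circular support that claims 5 and 7 describe for the attention area.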
  7.  The radio wave image identification device according to claim 1, wherein the shape of the attention area is a circle, an ellipse, an oval, or a rounded rectangle.
  8.  The radio wave image identification device according to any one of claims 1 to 7, further comprising:
      an evaluation unit that evaluates parameters included in the model on the basis of the classification result output by the result output unit and ideal values for the class classification; and
      a model update unit that updates the model by updating the parameters included in the model on the basis of an evaluation result produced by the evaluation unit.
  9.  A radio wave image learning device comprising:
      a radio wave image acquisition unit that acquires radio wave image information indicating a radio wave image;
      one or more convolution processing units, each having: a separability calculation layer that, on the basis of a model, sets, in the radio wave image information or in feature map information generated on the basis of the radio wave image information, an attention area, which is an area enclosed by a closed curve, and a peripheral area, which is an area obtained by excluding the attention area from an area enclosed by a closed curve containing the attention area, and that calculates a separability between the attention area and the peripheral area; and an attention-area convolution layer that, on the basis of the model, performs convolution processing on the attention area in the radio wave image information or the feature map information and generates the feature map information by weighting a result of the convolution processing on the basis of the separability;
      a fully connected unit that performs class classification on the basis of the model and the feature map information;
      a result output unit that outputs a classification result of the class classification performed by the fully connected unit;
      an evaluation unit that evaluates parameters included in the model on the basis of the classification result output by the result output unit and ideal values for the class classification; and
      a model update unit that updates the model by updating the parameters included in the model on the basis of an evaluation result produced by the evaluation unit.
  10.  The radio wave image learning device according to claim 9, wherein the separability is the proportion that the amount of change in RCS value between the attention area and the peripheral area accounts for in the amount of change in RCS value over the area combining the attention area and the peripheral area.
  11.  The radio wave image learning device according to claim 9, wherein the convolution processing unit gives a larger weight to the attention area when the separability is high, and gives a smaller weight to the attention area when the separability is low.
  12.  The radio wave image learning device according to claim 9, wherein the size of the attention area is updated by learning and therefore changes for each piece of the radio wave image information acquired by the radio wave image acquisition unit.
  13.  The radio wave image learning device according to claim 9, wherein the shape of the attention area is a circle, and the radius of the circle is updated by learning and therefore changes for each piece of the radio wave image information acquired by the radio wave image acquisition unit.
  14.  The radio wave image learning device according to claim 9, wherein the convolution filter used when the convolution processing unit performs the convolution processing on the attention area in the radio wave image information or the feature map information is configured by processing a rectangular filter.
  15.  The radio wave image learning device according to claim 9, wherein the shape of the attention area is a circle, an ellipse, an oval, or a rounded rectangle.
  16.  A radio wave image identification method comprising:
      acquiring, by a radio wave image acquisition unit, radio wave image information indicating a radio wave image;
      acquiring, by a model acquisition unit, a model used for identification processing based on the radio wave image information;
      setting, by a separability calculation layer of a convolution processing unit, on the basis of the model, in the radio wave image information or in feature map information generated on the basis of the radio wave image information, an attention area, which is an area enclosed by a closed curve, and a peripheral area, which is an area obtained by excluding the attention area from an area enclosed by a closed curve containing the attention area, and calculating a separability between the attention area and the peripheral area;
      performing, by an attention-area convolution layer of the convolution processing unit, on the basis of the model, convolution processing on the attention area in the radio wave image information or the feature map information, and generating the feature map information by weighting a result of the convolution processing on the basis of the separability;
      performing, by a fully connected unit, class classification on the basis of the model and the feature map information; and
      outputting, by a result output unit, a classification result of the class classification performed by the fully connected unit.
PCT/JP2018/041385 2018-11-07 2018-11-07 Radio wave image identification device, radio wave image learning device, and radio wave image identification method WO2020095392A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020556413A JP6833135B2 (en) 2018-11-07 2018-11-07 Radio image identification device, radio image learning device, and radio image identification method
PCT/JP2018/041385 WO2020095392A1 (en) 2018-11-07 2018-11-07 Radio wave image identification device, radio wave image learning device, and radio wave image identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/041385 WO2020095392A1 (en) 2018-11-07 2018-11-07 Radio wave image identification device, radio wave image learning device, and radio wave image identification method

Publications (1)

Publication Number Publication Date
WO2020095392A1 true WO2020095392A1 (en) 2020-05-14

Family

ID=70611737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/041385 WO2020095392A1 (en) 2018-11-07 2018-11-07 Radio wave image identification device, radio wave image learning device, and radio wave image identification method

Country Status (2)

Country Link
JP (1) JP6833135B2 (en)
WO (1) WO2020095392A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3710852A1 (en) * 2017-11-13 2020-09-23 Robin Radar Facilities BV Radar based system and method for detection of an object and generation of plots holding radial velocity data, and system for detection and classification of unmanned aerial vehicles, uavs

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018115974A (en) * 2017-01-19 2018-07-26 沖電気工業株式会社 Information processing device, information processing method, and program

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
CHEN, S. Z. ET AL.: "Target classification using the deep convolutional networks for SAR images", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, vol. 54, no. 8, August 2016 (2016-08-01), pages 4806 - 4817, XP011612535 *
DING, J. ET AL.: "Convolution neural network with data augmentation for SAR target recognition", IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, vol. 13, no. 3, March 2016 (2016-03-01), pages 364 - 368, XP011608921 *
FUKUI, KAZUHIRO ET AL.: "Facial feature point extraction method based on combination of shape extraction and pattern matching", IEICE TRANSACTIONS, vol. J80-D-II, August 1997 (1997-08-01), pages 49 - 58, XP000782034, ISSN: 0915-1923 *
FUKUI, KAZUHIRO ET AL.: "Facial feature point extraction method based on combination of shape extraction and pattern matching", SYSTEMS AND COMPUTERS IN JAPAN, vol. 29, no. 6, 1998, pages 49 - 58, XP000782034, DOI: 10.1002/(SICI)1520-684X(19980615)29:6<49::AID-SCJ5>3.0.CO;2-L *
KAWAGUCHI, TSUYOSHI ET AL.: "Detection of eyes from human faces by hough transform and separability filter", PROCEEDINGS 2000 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (CAT. NO. 00 CH 37101, September 2000 (2000-09-01), pages 49 - 52, XP010530547, DOI: 10.1109/ICIP.2000.900889 *

Also Published As

Publication number Publication date
JPWO2020095392A1 (en) 2021-03-11
JP6833135B2 (en) 2021-02-24


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18939247

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020556413

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18939247

Country of ref document: EP

Kind code of ref document: A1