WO2021079785A1 - Meat quality distinction program, and system - Google Patents

Meat quality distinction program, and system Download PDF

Info

Publication number
WO2021079785A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
meat quality
meat
association
degree
Prior art date
Application number
PCT/JP2020/038582
Other languages
French (fr)
Japanese (ja)
Inventor
綾子 澤田
Original Assignee
Assest株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Assest株式会社 filed Critical Assest株式会社
Publication of WO2021079785A1 publication Critical patent/WO2021079785A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/02 Food
    • G01N 33/12 Meat; Fish

Definitions

  • The present invention relates to a meat quality discrimination program and system suitable for discriminating the quality of meat with high accuracy.
  • Conventionally, meat quality is evaluated on items such as "fat marbling", "meat color", "firmness and texture of the meat", and "color and quality of the fat", and the final quality is expressed as a grade derived from the comprehensive judgment of these items.
  • The quality and characteristics of meat (hard or soft, strong or weak flavor, and so on) have also been evaluated by panelists (consumers) as a human sensory evaluation.
  • Evaluation by panelists, however, is subject to variability, and it is often difficult to reach a unified judgment.
  • Meat quality can also be examined by instrumental analysis, but performing instrumental analysis every time meat is shipped increases the labor and cost burden.
  • The present invention has been devised in view of the above problems, and its object is to provide a meat quality discrimination program and system capable of determining the quality of meat automatically and with high accuracy, without relying on human sensory evaluation or instrumental analysis.
  • The meat quality discrimination program according to the invention is a program for discriminating the quality of meat. It causes a computer to execute an information acquisition step of acquiring image information obtained by imaging the meat to be discriminated, and a discrimination step of referring to degrees of association set between reference image information of meat imaged in the past and meat quality, and discriminating the meat quality based on the reference image information corresponding to the image information acquired in the information acquisition step, giving priority to candidates with a higher degree of association.
  • FIG. 1 is a block diagram showing the overall configuration of the system to which the present invention is applied. FIG. 2 shows a specific configuration example of the discrimination device. The remaining figures are diagrams for explaining the operation of the present invention.
  • FIG. 1 is a block diagram showing the overall configuration of a meat quality discrimination system 1 in which a meat quality discrimination program embodying the present invention is implemented.
  • The meat quality discrimination system 1 includes an information acquisition unit 9, a discrimination device 2 connected to the information acquisition unit 9, and a database 3 connected to the discrimination device 2.
  • The information acquisition unit 9 is a device through which a user of the system inputs various commands and information; specifically, it is composed of a keyboard, buttons, a touch panel, a mouse, switches, and the like.
  • The information acquisition unit 9 is not limited to a device for inputting text information; it may be a device such as a microphone that detects voice and converts it into text information, or an imaging device such as a camera.
  • The information acquisition unit 9 may also be a scanner having a function of recognizing character strings in paper documents, and may be integrated with the discrimination device 2 described later. The information acquisition unit 9 outputs the detected information to the discrimination device 2.
  • The information acquisition unit 9 may also be means for specifying position information by scanning map information; a temperature sensor, a humidity sensor, a wind-direction sensor, or an illuminance sensor; or a communication interface for acquiring weather data from the Japan Meteorological Agency or a private weather forecast company. It may further be a body sensor worn on the body to detect body data such as body temperature, heart rate, blood pressure, number of steps, walking speed, and acceleration; such a body sensor may acquire biological data of animals as well as humans. The information acquisition unit 9 may also be a device that acquires information such as drawings by scanning or by reading it from a database, or an odor sensor that detects odors and scents.
  • The database 3 stores the various information necessary for determining meat quality.
  • The information necessary for determining meat quality includes reference image information of meat imaged in the past; reference ultrasonic image information previously captured from the living body of the livestock that provides the meat; reference analysis information obtained by analyzing any one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E of the meat; reference production area information on the production area of the meat imaged in the past; reference biological information obtained in the past from the living body of the livestock that provides the meat; reference breeding environment information on the environment in which that livestock was bred; and reference feed information on the feed given to that livestock. These are stored as data sets together with the meat quality actually determined for them.
  • In other words, the database 3 stores any one or more of the reference ultrasonic image information, reference analysis information, reference production area information, reference biological information, reference breeding environment information, and reference feed information in association with the meat quality.
  • The discrimination device 2 is composed of an electronic device such as a personal computer (PC), but may also be embodied in any other electronic device such as a mobile phone, a smartphone, a tablet terminal, or a wearable terminal. The user obtains the search solution through the discrimination device 2.
  • FIG. 2 shows a specific configuration example of the discrimination device 2.
  • In the discrimination device 2, a control unit 24 for controlling the entire device, an operation unit 25 for inputting various control commands via operation buttons, a keyboard, or the like, a communication unit 26 for performing wired or wireless communication, a discrimination unit 27 for making various determinations, and a storage unit 28, typified by a hard disk, for storing the program to be executed are each connected to an internal bus 21. A display unit 23, a monitor that actually displays information, is also connected to the internal bus 21.
  • The control unit 24 is a so-called central control unit that controls each component mounted in the discrimination device 2 by transmitting control signals via the internal bus 21. The control unit 24 also transmits various control commands via the internal bus 21 in response to operations made through the operation unit 25.
  • The operation unit 25 is embodied by a keyboard or a touch panel, and receives from the user an execution command for executing the program.
  • The operation unit 25 notifies the control unit 24 of the execution command.
  • Upon receiving it, the control unit 24 executes the desired processing operation in cooperation with each component, including the discrimination unit 27.
  • The operation unit 25 may be embodied as the information acquisition unit 9 described above.
  • The discrimination unit 27 determines the search solution.
  • When executing the discrimination operation, the discrimination unit 27 reads out the necessary information from the storage unit 28 and from the database 3.
  • The discrimination unit 27 may be controlled by artificial intelligence, which may be based on any well-known artificial intelligence technique.
  • The display unit 23 is composed of a graphic controller that creates a display image under the control of the control unit 24.
  • The display unit 23 is realized by, for example, a liquid crystal display (LCD).
  • When the storage unit 28 is composed of a hard disk, predetermined information is written to each address under the control of the control unit 24 and read out as needed. The storage unit 28 also stores the program for carrying out the present invention, which is read out and executed by the control unit 24.
  • The reference image information is obtained from image information capturing the appearance of the meat, and can be obtained by analyzing that image information.
  • The image may be a moving image as well as a still image.
  • The reference image information may be used to identify the meat quality by analyzing an image captured of the meat.
  • The reference image information is assumed to be composed of image data of the meat at the time of slaughter, when the carcass is broken down into cuts, but it may instead be composed of image data of the living body of the livestock that provides the meat.
  • The meat quality referred to here may be expressed, for example, by the "fat marbling", that is, the degree of marbling, and may be evaluated against the Beef Marbling Standard (BMS).
  • The meat quality may also be expressed by the "meat color", that is, the color and luster of the meat; like the fat marbling, it may be evaluated against the Beef Color Standard (BCS).
  • The meat quality also includes luster.
  • The meat quality also includes the "firmness and texture of the meat", which may be evaluated visually; it may also be evaluated by the grain of the meat, a fine grain giving a soft texture.
  • The meat quality also includes the "color and quality of the fat"; the color is judged against white or cream, and the evaluation also takes luster and quality into account.
  • The meat quality may be expressed through the grade of the meat, or by a ranking on a 5-point or 10-point scale set by the system side or the user side. Alternatively, it may simply be expressed as, for example, very tasty, tasty, acceptable, or ordinary.
  • The meat quality may be discriminated based on features learned in the past.
  • For example, artificial intelligence may be used to learn image data of meat together with its meat quality, and when reference image information is actually acquired, the meat quality may be discriminated by comparison with the learned image data.
  • Any one or more of the fat marbling, meat color, firmness and texture of the meat, and color and quality of the fat may be output.
  • In that case, the image data of the meat is learned together with any one or more of the fat marbling, meat color, firmness and texture, and fat color and quality, and when image information is actually acquired, the discrimination is made by comparison with the learned image data.
  • The meat quality may also be judged as good or bad based on the past experience of an evaluator, or the taste may be judged by actually tasting the meat. In such a case, multiple inspectors who sample the meat evaluate items such as texture, sourness, aroma, chewiness, and bitterness in multiple stages, and the results are analyzed statistically to yield the quality evaluation value, for example as in the sketch below. The meat quality may also be determined through a taste sensor capable of detecting taste, or through various instrumental analyses.
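  • A minimal sketch of the statistical aggregation mentioned above is given below. The item names, the 5-point scale, and the simple averaging are illustrative assumptions only; the patent does not specify a particular statistical method.

      # Hedged sketch: turning multi-stage taste scores from several inspectors
      # into a single statistical quality value. Item names and the 1-5 scale
      # are assumptions, not values specified by the patent.
      from statistics import mean

      def aggregate_taste_scores(scores_per_inspector):
          """scores_per_inspector: list of dicts, one per inspector, mapping
          taste items to an assumed 1-5 rating."""
          items = scores_per_inspector[0].keys()
          per_item = {item: mean(s[item] for s in scores_per_inspector) for item in items}
          overall = mean(per_item.values())  # overall evaluation value
          return per_item, overall

      per_item, overall = aggregate_taste_scores([
          {"texture": 4, "sourness": 2, "aroma": 5, "chewiness": 4, "bitterness": 1},
          {"texture": 5, "sourness": 3, "aroma": 4, "chewiness": 4, "bitterness": 2},
      ])
      print(per_item, overall)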
  • In FIG. 3, the input data are, for example, the reference image information P01 to P03.
  • The reference image information P01 to P03 as input data is linked to the meat quality as output.
  • The meat quality as the output solution is displayed at the output.
  • The reference image information is related to the meat qualities A to D as output solutions through degrees of association of three or more levels.
  • The reference image information is arranged on the left side of these degrees of association, and each meat quality is arranged on the right side.
  • The degree of association indicates how strongly each piece of reference image information arranged on the left side is related to each meat quality. In other words, the degree of association is an index indicating which meat quality each piece of reference image information is likely to be linked to, and expresses the accuracy with which the most probable meat quality can be selected from the reference image information. In the example of FIG. 3, w13 to w19 are shown as degrees of association.
  • As shown in Table 1 below, w13 to w19 are expressed in 10 stages; the closer to 10 points, the higher the relevance of the corresponding reference image information to the meat quality as output, and conversely, the closer to 1 point, the lower that relevance.
  • The discrimination device 2 acquires the degrees of association w13 to w19 of three or more stages shown in FIG. 3 in advance. That is, the discrimination device 2 accumulates a past data set recording which meat quality was adopted and evaluated for which reference image information when actual search solutions were discriminated, and by analyzing these it creates the degrees of association shown in FIG. 3.
  • For example, if meat quality A has often been evaluated for reference image information captured of meat in the past, the degree of association between that reference image information and meat quality A is strengthened.
  • This analysis may be performed by artificial intelligence.
  • The analysis is performed on the various data resulting from past evaluations of meat quality. If there are many cases of meat quality A, the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B, the degree of association leading to meat quality B is set higher.
  • For example, the reference image information P01 is linked to meat quality A and meat quality C, but based on past cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality C is set to 2 points, as sketched below.
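  • The following is a hedged sketch of deriving such 10-stage degrees of association from accumulated past cases. The counting of cases and the mapping of relative frequency onto a 1-10 scale are assumptions used only to illustrate the idea of the 7-point / 2-point example above.

      # Hedged sketch: derive 10-stage degrees of association from past cases.
      from collections import Counter, defaultdict

      def build_association_degrees(past_cases):
          """past_cases: iterable of (reference_image_id, meat_quality) pairs."""
          counts = defaultdict(Counter)
          for image_id, quality in past_cases:
              counts[image_id][quality] += 1
          degrees = {}
          for image_id, counter in counts.items():
              total = sum(counter.values())
              # map the relative frequency of each meat quality onto a 1..10 scale
              degrees[image_id] = {
                  quality: max(1, round(10 * n / total)) for quality, n in counter.items()
              }
          return degrees

      past = [("P01", "A")] * 7 + [("P01", "C")] * 2 + [("P02", "B")] * 5
      print(build_association_degrees(past))
      # e.g. {'P01': {'A': 8, 'C': 2}, 'P02': {'B': 10}}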
  • The degrees of association shown in FIG. 3 may be constituted by the nodes of a neural network in artificial intelligence; that is, the weighting coefficients on the outputs of the nodes of such a neural network correspond to the degrees of association described above.
  • The network is not limited to a neural network and may be constituted by any decision-making element of artificial intelligence.
  • In that case, the reference image information is input as input data, the meat quality is output as output data, at least one hidden layer is provided between the input nodes and the output nodes, and the machine is made to learn on this structure.
  • The degrees of association described above are set on the input nodes, the hidden-layer nodes, or both; they serve as the weighting of each node, and the output is selected on that basis. An output may be selected when its degree of association exceeds a certain threshold value.
  • Such degrees of association are what is called learned data in artificial intelligence.
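  • A minimal sketch of the structure just described follows: the degrees of association are treated as weights of a small feed-forward network with one hidden layer, and an output is selected only when its score exceeds a threshold. The layer sizes, the random weights (which would in practice be the learned data), and the softmax normalization are illustrative assumptions.

      # Hedged sketch: forward pass of a tiny network with threshold-based output.
      import numpy as np

      rng = np.random.default_rng(0)
      W_hidden = rng.normal(size=(8, 4))   # input features -> hidden layer (assumed shapes)
      W_out = rng.normal(size=(4, 3))      # hidden layer -> meat qualities A, B, C
      QUALITIES = ["A", "B", "C"]
      THRESHOLD = 0.5                      # assumed threshold on the association score

      def discriminate(image_features):
          hidden = np.maximum(0.0, image_features @ W_hidden)   # ReLU hidden layer
          scores = hidden @ W_out
          probs = np.exp(scores) / np.exp(scores).sum()         # normalise to 0..1
          best = int(np.argmax(probs))
          if probs[best] >= THRESHOLD:
              return QUALITIES[best], float(probs[best])
          return None, float(probs[best])  # below threshold: no quality is output

      print(discriminate(rng.normal(size=8)))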
  • After the learned data described above has been created, it is used when the meat quality is actually to be discriminated.
  • The meat quality is then searched for using this learned data.
  • Image information newly obtained by actually imaging the meat to be discriminated is acquired.
  • The newly acquired image information is input through the information acquisition unit 9 described above.
  • The image information is acquired by imaging the meat whose quality is to be discriminated, in the same way as the reference image information described above.
  • The meat quality is then determined based on the newly acquired image information.
  • To do so, the degrees of association shown in FIG. 3 (Table 1) acquired in advance are referred to.
  • For example, the newly acquired image information is linked to meat quality B through the degree of association w15 and to meat quality C through the degree of association w16.
  • In such a case, the meat quality B having the highest degree of association is selected as the optimum solution.
  • However, an output solution to which the arrows are not connected may also be selected, and any other output solution may be selected under any other priority as long as the selection is based on the degrees of association.
  • In this way, the most suitable meat quality can be searched for from the newly acquired image information and displayed to the user, for example as in the sketch below.
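  • The following is a hedged sketch of this discrimination step: a newly acquired image is matched to the closest reference image information, and the meat quality with the highest degree of association for that reference image is returned as the optimum solution. The feature extraction (a plain colour histogram) and the nearest-neighbour matching are assumptions for illustration only, not the patent's prescribed method.

      # Hedged sketch: match a new image to reference image information, then
      # pick the meat quality with the highest degree of association.
      import numpy as np

      def histogram_features(image_rgb, bins=8):
          """image_rgb: HxWx3 uint8 array; returns a normalised colour histogram."""
          hist, _ = np.histogramdd(image_rgb.reshape(-1, 3),
                                   bins=(bins, bins, bins), range=((0, 256),) * 3)
          flat = hist.ravel()
          return flat / flat.sum()

      def discriminate_quality(new_image, reference_features, association_degrees):
          """reference_features: {ref_id: feature vector}
          association_degrees: {ref_id: {meat_quality: degree 1..10}} (learned data)."""
          query = histogram_features(new_image)
          # nearest reference image by distance between histograms
          ref_id = min(reference_features,
                       key=lambda rid: np.linalg.norm(reference_features[rid] - query))
          degrees = association_degrees[ref_id]
          best_quality = max(degrees, key=degrees.get)  # highest degree of association
          return ref_id, best_quality, degrees[best_quality]

      # illustrative usage with synthetic images and assumed degrees
      refs = {rid: histogram_features(np.random.default_rng(i).integers(0, 256, (32, 32, 3), dtype=np.uint8))
              for i, rid in enumerate(["P01", "P02"])}
      degrees = {"P01": {"A": 7, "C": 2}, "P02": {"B": 6, "C": 4}}
      new = np.random.default_rng(9).integers(0, 256, (32, 32, 3), dtype=np.uint8)
      print(discriminate_quality(new, refs, degrees))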
  • The user, that is, the meat producer, distributor, or seller, can select meat based on the searched meat quality, predict the taste of the meat, and further set the price of the meat.
  • As the image information and the reference image information, any one or more of a spectrum image and an ultrasonic image may be acquired.
  • When an ultrasonic image is used, the meat may be imaged in the living body of the livestock that provides it.
  • As described above, the reference image information is image data of meat captured at past slaughter.
  • The reference image information is normally an image captured by an ordinary camera, but it may instead be composed of a spectrum image color-coded for each frequency band.
  • The reference ultrasonic image information is ultrasonic image data of the meat portion imaged in advance in the living body of the livestock that provides the meat.
  • By combining the reference ultrasonic image information with the reference image information, the meat quality can be determined with higher accuracy; therefore, the degrees of association described above are formed from the combination of the reference image information and the reference ultrasonic image information.
  • In FIG. 5, the input data are, for example, the reference image information P01 to P03 and the reference ultrasonic image information P14 to P17.
  • Each intermediate node shown in FIG. 5 is a combination of reference image information and reference ultrasonic image information as input data, and each intermediate node is in turn linked to the output, where the meat quality as the output solution is displayed.
  • Each combination (intermediate node) of reference image information and reference ultrasonic image information is related to the meat quality as the output solution through degrees of association of three or more levels.
  • The reference image information and the reference ultrasonic image information are arranged on the left side of these degrees of association, and the meat quality is arranged on the right side.
  • The degree of association indicates how strongly the reference image information and reference ultrasonic image information arranged on the left side are related to each meat quality.
  • In other words, the degree of association is an index indicating which meat quality each combination of reference image information and reference ultrasonic image information is likely to be linked to, and expresses the accuracy with which the most probable meat quality can be selected from that combination. The optimum meat quality is therefore searched for by combining the reference image information and the reference ultrasonic image information.
  • In FIG. 5, w13 to w22 are shown as degrees of association. As shown in Table 1, w13 to w22 are expressed in 10 stages; the closer to 10 points, the higher the relevance of the corresponding combination (intermediate node) to the output, and conversely, the closer to 1 point, the lower that relevance.
  • The discrimination device 2 acquires the degrees of association w13 to w22 of three or more stages shown in FIG. 5 in advance. That is, the discrimination device 2 accumulates past data recording which combination of reference image information and reference ultrasonic image information was suitable for which meat quality when actual search solutions were discriminated, and by analyzing these it creates the degrees of association shown in FIG. 5.
  • For example, suppose that in a past actual case the reference image information was image data α and the reference ultrasonic image information was image data β; the meat quality actually obtained in that case is learned as a data set and defined in the form of the degrees of association described above.
  • Such reference image information and reference ultrasonic image information may be extracted from a management database managed by a producer, a distributor, a seller, or the like.
  • This analysis may be performed by artificial intelligence.
  • The meat quality is analyzed from the past data. If there are many cases where the meat quality is A (sweetness degree ○, acidity degree ○, bitterness degree ○, chewiness ○, and so on), the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to meat quality B is set high and the degree of association leading to meat quality A is set low.
  • For example, an intermediate node may be linked to outputs of meat quality A and meat quality B; based on past cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B is set to 2 points.
  • The degrees of association shown in FIG. 5 may likewise be constituted by the nodes of a neural network in artificial intelligence; the weighting coefficients on the outputs of those nodes correspond to the degrees of association described above.
  • The network is not limited to a neural network and may be constituted by any decision-making element of artificial intelligence. The configuration relating to artificial intelligence is otherwise the same as described above.
  • For example, the node 61b is the node combining the reference ultrasonic image information P14 with the reference image information P01; its degree of association with meat quality C is w15 and its degree of association with meat quality E is w16.
  • The node 61c is the node combining the reference ultrasonic image information P15 and P17 with the reference image information P02; its degree of association with meat quality B is w17 and its degree of association with meat quality D is w18.
  • Such degrees of association are what is called learned data in artificial intelligence. After this learned data has been created, it is used when the meat quality is actually to be determined; the image information and the ultrasonic image information of the meat whose quality is to be determined are input or selected.
  • At that point, the degrees of association shown in FIG. 5 (Table 1) acquired in advance are referred to.
  • For example, the input image information and ultrasonic image information may correspond to the node 61d via the degrees of association.
  • The node 61d is linked to meat quality C through the degree of association w19 and to meat quality D through the degree of association w20.
  • In such a case, the meat quality C having the highest degree of association is selected as the optimum solution.
  • Table 2 below shows examples of the degrees of association w1 to w12 extending from the input.
  • The intermediate node 61 may be selected based on these degrees of association w1 to w12 extending from the input; that is, the larger the degree of association w1 to w12, the heavier the weighting in the selection of the intermediate node 61, as in the sketch below. However, the degrees of association w1 to w12 may all have the same value, in which case the weights in the selection of the intermediate node 61 are all the same.
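  • The following is a hedged sketch of this two-step structure: input-side degrees of association weight the selection of an intermediate node (a combination of reference image information and reference ultrasonic image information), and the output-side degrees then pick the meat quality. All identifiers and numeric values are illustrative, not taken from the patent's tables.

      # Hedged sketch: input-side degrees select an intermediate node,
      # output-side degrees select the meat quality.

      # input-side degrees: (input id, intermediate node) -> weight
      input_degrees = {
          ("P01", "node_a"): 8, ("P14", "node_a"): 7,
          ("P01", "node_b"): 3, ("P15", "node_b"): 6,
      }
      # output-side degrees: intermediate node -> {meat quality: degree}
      output_degrees = {
          "node_a": {"A": 7, "C": 2},
          "node_b": {"B": 6, "D": 4},
      }

      def discriminate(active_inputs):
          """active_inputs: ids of the image / ultrasonic image information that matched."""
          # weight each intermediate node by the sum of input-side degrees reaching it
          node_scores = {}
          for (input_id, node), w in input_degrees.items():
              if input_id in active_inputs:
                  node_scores[node] = node_scores.get(node, 0) + w
          node = max(node_scores, key=node_scores.get)
          quality = max(output_degrees[node], key=output_degrees[node].get)
          return node, quality

      print(discriminate({"P01", "P14"}))   # -> ('node_a', 'A')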
  • Instead of the reference ultrasonic image information, a combination with reference spectrum information may be formed, and degrees of association of three or more levels may be set between that combination and the meat quality.
  • The ultrasonic image information and reference ultrasonic image information described above are not limited to ultrasonic image data captured at a single point in time for the livestock used as meat; the ultrasonic image may be captured a plurality of times in time series from the living body of the livestock so as to acquire the change tendency over time.
  • In that case, the degrees of association are formed with reference ultrasonic image information that includes the change tendency of such time-series ultrasonic image data; images are likewise taken a plurality of times in time series from the living body of the livestock providing the meat to be discriminated, the time-series change tendency of the ultrasonic images is acquired, and this is input as input data for the discrimination.
  • FIG. 6 shows an example in which a combination with reference analysis information, instead of the reference ultrasonic image information described above, is formed and degrees of association of three or more levels are set between that combination and the meat quality.
  • The reference analysis information, which is added as an explanatory variable in place of the reference ultrasonic image information, is any information on the results of chemical and physical analyses performed on the meat.
  • The reference analysis information may include analysis information obtained by analyzing any one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E.
  • In free amino acid analysis, the proportions of the various amino acids, such as glutamic acid, which is an umami component, are analyzed.
  • The results of quantitative analysis of the various fatty acids contained in the adipose tissue, such as oleic acid, serve as evaluation criteria for texture, such as mellowness and how the fat melts in the mouth.
  • Oleic acid is analyzed because the more unsaturated fatty acid the meat contains, the softer and tastier it is evaluated to be.
  • Inosinic acid is analyzed because inosinic acid, a kind of organic compound, is the umami component of dried bonito and is said to increase through aging after the carcass is broken down.
  • Guanylic acid is analyzed because it produces the umami component of shiitake mushrooms; vitamin E also affects umami and is therefore analyzed as well.
  • Since each index included in such reference analysis information also affects the taste of the meat, the discrimination accuracy can be improved by combining it with the reference image information and discriminating the meat quality through the degrees of association.
  • Both the reference analysis information and the analysis information may be obtained by analysis performed on the meat at the time of slaughter, or by analysis performed on the living body of the livestock that provides the meat.
  • In FIG. 6, the input data are, for example, the reference image information P01 to P03 and the reference analysis information P18 to P21.
  • Each intermediate node shown in FIG. 6 is a combination of reference image information and reference analysis information as input data, and each intermediate node is in turn linked to the output, where the meat quality as the output solution is displayed.
  • Each combination (intermediate node) of reference image information and reference analysis information is related to the meat quality as the output solution through degrees of association of three or more levels.
  • The reference image information and the reference analysis information are arranged on the left side of these degrees of association, and the meat quality is arranged on the right side.
  • The degree of association indicates how strongly the reference image information and reference analysis information arranged on the left side are related to each meat quality.
  • In other words, the degree of association is an index indicating which meat quality each combination of reference image information and reference analysis information is likely to be linked to, and expresses the accuracy with which the most probable meat quality can be selected from the reference image information and the reference analysis information.
  • The discrimination device 2 acquires the degrees of association w13 to w22 of three or more stages shown in FIG. 6 in advance. That is, it accumulates past data recording which combination of reference image information, reference analysis information obtained when the reference image information was acquired, and meat quality was suitable when actual search solutions were discriminated, and by analyzing these it creates the degrees of association shown in FIG. 6.
  • For example, suppose that for a certain piece of reference image information the reference analysis information shows an oleic acid content of ○ and an inosinic acid content of ○. If there are many cases where the meat quality was determined to be A in such circumstances, these are learned as a data set and defined in the form of the degrees of association described above.
  • This analysis may be performed by artificial intelligence.
  • The meat quality is analyzed from the past data. If there are many cases of meat quality A, the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to meat quality B is set high and the degree of association leading to meat quality A is set low.
  • For example, an intermediate node may be linked to outputs of meat quality A and meat quality B; based on past cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B is set to 2 points.
  • The degrees of association shown in FIG. 6 may likewise be constituted by the nodes of a neural network in artificial intelligence; the weighting coefficients on the outputs of those nodes correspond to the degrees of association described above.
  • The network is not limited to a neural network and may be constituted by any decision-making element of artificial intelligence. The configuration relating to artificial intelligence is otherwise the same as described above.
  • For example, the node 61b is the node combining the reference analysis information P18 with the reference image information P01; its degree of association with meat quality C is w15 and its degree of association with meat quality E is w16.
  • The node 61c is the node combining the reference analysis information P19 and P21 with the reference image information P02; its degree of association with meat quality B is w17 and its degree of association with meat quality D is w18.
  • Such degrees of association are what is called learned data in artificial intelligence.
  • After this learned data has been created, it is used when the meat quality is actually to be searched for.
  • At that point, the image information of the meat whose quality is to be discriminated and its analysis information are actually acquired.
  • The analysis information is newly acquired when the meat quality is actually estimated, and the acquisition method is the same as for the reference analysis information described above.
  • The reference analysis information is obtained by analyzing in advance any one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E, in the same manner as this analysis information, and is used to form the degrees of association in combination with the reference image information.
  • Then the degrees of association shown in FIG. 6 (Table 1) acquired in advance are referred to.
  • For example, the input image information and analysis information may correspond to the node 61d via the degrees of association.
  • The node 61d is linked to meat quality C through the degree of association w19 and to meat quality D through the degree of association w20.
  • In such a case, the meat quality C having the highest degree of association is selected as the optimum solution.
  • FIG. 7 shows an example in which a combination with reference production area information, instead of the reference ultrasonic image information described above, is formed and degrees of association of three or more levels are set between that combination and the meat quality.
  • The reference production area information, which is added as an explanatory variable in place of the reference ultrasonic image information, is information on the production area of the meat. It may be expressed at the national level (for example the United States or Japan), the regional level (for example the Tohoku or Kyushu region), the prefectural level (for example Hokkaido or Kagoshima prefecture), the district or town level within a prefecture, or even at the level of an individual ranch. Since the production area included in the reference production area information also affects the taste of the meat, the discrimination accuracy can be improved by combining it with the reference image information and discriminating the meat quality through the degrees of association.
  • In FIG. 7, the input data are, for example, the reference image information P01 to P03 and the reference production area information P18 to P21.
  • Each intermediate node shown in FIG. 7 is a combination of reference image information and reference production area information as input data, and each intermediate node is in turn linked to the output, where the meat quality as the output solution is displayed.
  • Each combination (intermediate node) of reference image information and reference production area information is related to the meat quality as the output solution through degrees of association of three or more levels.
  • The reference image information and the reference production area information are arranged on the left side of these degrees of association, and the meat quality is arranged on the right side.
  • The degree of association indicates how strongly the reference image information and reference production area information arranged on the left side are related to each meat quality.
  • In other words, the degree of association is an index indicating which meat quality each combination of reference image information and reference production area information is likely to be linked to, and expresses the accuracy with which the most probable meat quality can be selected from the reference image information and the reference production area information.
  • The discrimination device 2 acquires the degrees of association w13 to w22 of three or more stages shown in FIG. 7 in advance. That is, it accumulates past data recording which combination of reference image information, reference production area information obtained when the reference image information was acquired, and meat quality was suitable when actual search solutions were discriminated, and by analyzing these it creates the degrees of association shown in FIG. 7.
  • This analysis may be performed by artificial intelligence.
  • The meat quality is analyzed from the past data. If there are many cases of meat quality A, the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to meat quality B is set high and the degree of association leading to meat quality A is set low.
  • For example, an intermediate node may be linked to outputs of meat quality A and meat quality B; based on past cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B is set to 2 points.
  • The degrees of association shown in FIG. 7 may likewise be constituted by the nodes of a neural network in artificial intelligence; the weighting coefficients on the outputs of those nodes correspond to the degrees of association described above.
  • The network is not limited to a neural network and may be constituted by any decision-making element of artificial intelligence. The configuration relating to artificial intelligence is otherwise the same as described above.
  • For example, the node 61b is the node combining the reference image information P01 and the reference production area information P18; its degree of association with meat quality C is w15 and its degree of association with meat quality E is w16.
  • The node 61c is the node combining the reference production area information P19 and P21 with the reference image information P02; its degree of association with meat quality B is w17 and its degree of association with meat quality D is w18.
  • Such degrees of association are what is called learned data in artificial intelligence.
  • After this learned data has been created, it is used when the meat quality is actually to be searched for.
  • At that point, the image information of the meat whose quality is to be discriminated and its production area information are actually acquired.
  • The production area information is newly acquired when the meat quality is actually estimated, and the acquisition method is the same as for the reference production area information described above.
  • The production area information and reference production area information may be acquired by keyboard input into a device such as a PC or smartphone, or by capturing and analyzing the character information or two-dimensional code printed on a label stating the production area of the meat.
  • Then the degrees of association shown in FIG. 7 (Table 1) acquired in advance are referred to.
  • For example, the input image information and production area information may correspond to the node 61d via the degrees of association.
  • The node 61d is linked to meat quality C through the degree of association w19 and to meat quality D through the degree of association w20.
  • In such a case, the meat quality C having the highest degree of association is selected as the optimum solution.
  • FIG. 8 shows an example in which a combination with reference biological information, instead of the reference ultrasonic image information described above, is formed and degrees of association of three or more levels are set between that combination and the meat quality.
  • The reference biological information, which is added as an explanatory variable in place of the reference ultrasonic image information, includes any biological data measured on the living body of the livestock that provides the meat.
  • The biological data of the livestock includes any biological data such as heart rate, body temperature, electrocardiogram, blood pressure, blood test results, and body weight. Since the data on the living body included in the reference biological information also affects the taste of the meat, the discrimination accuracy can be improved by combining it with the reference image information and discriminating the meat quality through the degrees of association.
  • In FIG. 8, the input data are, for example, the reference image information P01 to P03 and the reference biological information P18 to P21.
  • Each intermediate node shown in FIG. 8 is a combination of reference image information and reference biological information as input data, and each intermediate node is in turn linked to the output, where the meat quality as the output solution is displayed.
  • Each combination (intermediate node) of reference image information and reference biological information is related to the meat quality as the output solution through degrees of association of three or more levels.
  • The reference image information and the reference biological information are arranged on the left side of these degrees of association, and the meat quality is arranged on the right side.
  • The degree of association indicates how strongly the reference image information and reference biological information arranged on the left side are related to each meat quality. In other words, the degree of association is an index indicating which meat quality each combination of reference image information and reference biological information is likely to be linked to, and expresses the accuracy with which the most probable meat quality can be selected from the reference image information and the reference biological information.
  • The discrimination device 2 acquires the degrees of association w13 to w22 of three or more stages shown in FIG. 8 in advance. That is, it accumulates past data recording which combination of reference image information, reference biological information, and meat quality was suitable when actual search solutions were discriminated, and by analyzing these it creates the degrees of association shown in FIG. 8.
  • This analysis may be performed by artificial intelligence.
  • The meat quality is analyzed from the past data. If there are many cases of meat quality A, the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to meat quality B is set high and the degree of association leading to meat quality A is set low.
  • For example, an intermediate node may be linked to outputs of meat quality A and meat quality B; based on past cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B is set to 2 points.
  • The degrees of association shown in FIG. 8 may likewise be constituted by the nodes of a neural network in artificial intelligence; the weighting coefficients on the outputs of those nodes correspond to the degrees of association described above.
  • The network is not limited to a neural network and may be constituted by any decision-making element of artificial intelligence. The configuration relating to artificial intelligence is otherwise the same as described above.
  • For example, the node 61b is the node combining the reference image information P01 and the reference biological information P18; its degree of association with meat quality C is w15 and its degree of association with meat quality E is w16.
  • The node 61c is the node combining the reference biological information P19 and P21 with the reference image information P02; its degree of association with meat quality B is w17 and its degree of association with meat quality D is w18.
  • Such degrees of association are what is called learned data in artificial intelligence.
  • After this learned data has been created, it is used when the meat quality is actually to be searched for.
  • At that point, the image information of the meat whose quality is to be discriminated and its biological information are actually acquired.
  • The biological information is newly acquired when the meat quality is actually estimated, and the acquisition method is the same as for the reference biological information described above.
  • Then the degrees of association shown in FIG. 8 (Table 1) acquired in advance are referred to.
  • For example, the input image information and biological information may correspond to the node 61d via the degrees of association.
  • The node 61d is linked to meat quality C through the degree of association w19 and to meat quality D through the degree of association w20.
  • In such a case, the meat quality C having the highest degree of association is selected as the optimum solution.
  • The biological information and reference biological information described above may be obtained by acquiring biological data from the living body a plurality of times in time series at intervals, so as to include the time-series change tendency of the biological data. This makes it possible to judge the meat quality taking the time-series change tendency of the biological data of the livestock into account, for example as computed in the sketch below.
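  • The following is a hedged sketch of one way to turn time-series biological data into a "change tendency" feature, using a least-squares slope over time. The measurement (body temperature) and the slope-based summary are assumptions chosen only to illustrate the idea.

      # Hedged sketch: summarise a time series of biological data by its trend.
      import numpy as np

      def change_tendency(timestamps, values):
          """Returns the least-squares slope of the measurements over time."""
          t = np.asarray(timestamps, dtype=float)
          v = np.asarray(values, dtype=float)
          slope, _intercept = np.polyfit(t, v, 1)
          return slope

      # e.g. body temperature measured on days 0, 7, 14, 21
      print(change_tendency([0, 7, 14, 21], [38.5, 38.6, 38.9, 39.1]))  # positive slope: rising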
  • FIG. 9 shows an example in which a combination with reference breeding environment information, instead of the reference ultrasonic image information described above, is formed and degrees of association of three or more levels are set between that combination and the meat quality.
  • The reference breeding environment information, which is added as an explanatory variable in place of the reference ultrasonic image information, includes any data on the environment in which the livestock providing the meat is bred.
  • The data constituting the reference breeding environment information includes any information on the breeding environment, such as temperature, humidity, wind direction, sunlight intensity, indoor lighting level, audio data, pest extermination status, cleaning status, and manure processing status. Since the data included in the reference breeding environment information also affects the taste of the meat, the discrimination accuracy can be improved by combining it with the reference image information and discriminating the meat quality through the degrees of association.
  • In FIG. 9, the input data are, for example, the reference image information P01 to P03 and the reference breeding environment information P18 to P21.
  • Each intermediate node shown in FIG. 9 is a combination of reference image information and reference breeding environment information as input data, and each intermediate node is in turn linked to the output, where the meat quality as the output solution is displayed.
  • Each combination (intermediate node) of reference image information and reference breeding environment information is related to the meat quality as the output solution through degrees of association of three or more levels.
  • The reference image information and the reference breeding environment information are arranged on the left side of these degrees of association, and the meat quality is arranged on the right side.
  • The degree of association indicates how strongly the reference image information and reference breeding environment information arranged on the left side are related to each meat quality.
  • In other words, the degree of association is an index indicating which meat quality each combination of reference image information and reference breeding environment information is likely to be linked to, and expresses the accuracy with which the most probable meat quality can be selected from the reference image information and the reference breeding environment information.
  • The discrimination device 2 acquires the degrees of association w13 to w22 of three or more stages shown in FIG. 9 in advance. That is, it accumulates past data recording which combination of reference image information, reference breeding environment information, and meat quality was suitable when actual search solutions were discriminated, and by analyzing these it creates the degrees of association shown in FIG. 9.
  • This analysis may be performed by artificial intelligence.
  • The meat quality is analyzed from the past data. If there are many cases of meat quality A, the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to meat quality B is set high and the degree of association leading to meat quality A is set low.
  • For example, an intermediate node may be linked to outputs of meat quality A and meat quality B; based on past cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B is set to 2 points.
  • The degrees of association shown in FIG. 9 may likewise be constituted by the nodes of a neural network in artificial intelligence; the weighting coefficients on the outputs of those nodes correspond to the degrees of association described above.
  • The network is not limited to a neural network and may be constituted by any decision-making element of artificial intelligence. The configuration relating to artificial intelligence is otherwise the same as described above.
  • For example, the node 61b is the node combining the reference image information P01 and the reference breeding environment information P18; its degree of association with meat quality C is w15 and its degree of association with meat quality E is w16.
  • The node 61c is the node combining the reference breeding environment information P19 and P21 with the reference image information P02; its degree of association with meat quality B is w17 and its degree of association with meat quality D is w18.
  • Such degrees of association are what is called learned data in artificial intelligence.
  • After this learned data has been created, it is used when the meat quality is actually to be searched for.
  • At that point, the image information of the meat whose quality is to be discriminated and its breeding environment information are actually acquired.
  • The breeding environment information is newly acquired when the meat quality is actually estimated, and the acquisition method is the same as for the reference breeding environment information described above.
  • Then the degrees of association shown in FIG. 9 (Table 1) acquired in advance are referred to.
  • For example, the input image information and breeding environment information may correspond to the node 61d via the degrees of association.
  • The node 61d is linked to meat quality C through the degree of association w19 and to meat quality D through the degree of association w20.
  • In such a case, the meat quality C having the highest degree of association is selected as the optimum solution.
  • The breeding environment information may also be replaced with feed information regarding the feed given to the livestock.
  • In that case, degrees of association between combinations of previously acquired reference feed information (regarding the feed given to livestock in the past) and reference image information, and the meat quality, are formed in advance; when feed information regarding the feed of the livestock that provides the meat to be discriminated is newly acquired, the meat quality is discriminated based on that feed information.
  • In the examples described above, the degree of association is expressed by a 10-stage evaluation, but it is not limited to this; it suffices that it is expressed in three or more levels, and it may be expressed in far more levels, for example 100 or 1000 stages.
  • The degree of association does not include those expressed in only two stages, that is, as merely related or not related, 1 or 0.
  • The input data and output data described above need not match exactly during learning, and may therefore be classified by type. That is, the information P01, P02, ..., P15, P16, ... constituting the input data is classified according to criteria set in advance on the system side or the user side depending on the content of the information, and a data set may be created between the classified input data and the output data and learned, for example as in the sketch below.
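  • The following is a hedged sketch of such classification: input information is first bucketed by pre-defined criteria, and the data set then pairs each bucket, rather than each raw value, with the output data. The bucket names and BMS-score thresholds are assumptions for illustration only.

      # Hedged sketch: classify input data by type before building the data set.
      def classify_input(bms_score):
          """Assumed buckets over a marbling score; the thresholds are illustrative."""
          if bms_score >= 8:
              return "high_marbling"
          if bms_score >= 4:
              return "medium_marbling"
          return "low_marbling"

      raw_cases = [(9, "A"), (8, "A"), (5, "B"), (2, "C")]   # (input value, meat quality)
      dataset = [(classify_input(x), quality) for x, quality in raw_cases]
      print(dataset)
      # [('high_marbling', 'A'), ('high_marbling', 'A'), ('medium_marbling', 'B'), ('low_marbling', 'C')]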
  • In the examples described above, the degrees of association were explained as being composed of combinations of the reference image information with any one of the reference ultrasonic image information, reference analysis information, reference production area information, reference biological information, reference breeding environment information, and reference feed information, but the invention is not limited to this.
  • The degrees of association may be composed of combinations of the reference image information with any two or more of the reference ultrasonic image information, reference analysis information, reference production area information, reference biological information, reference breeding environment information, and reference feed information.
  • Other factors may also be added to a combination of the reference image information with one or more of the reference ultrasonic image information, reference analysis information, reference production area information, reference biological information, reference breeding environment information, and reference feed information to form the degrees of association.
  • In other words, the present invention determines the meat quality based on degrees of association involving two or more types of information, reference information U and reference information V,
  • where the reference information U is the reference image information
  • and the reference information V is one of the reference ultrasonic image information, reference analysis information, reference production area information, reference biological information, reference breeding environment information, and reference feed information.
  • The reference information U may be used as input data as it is and associated with the output (meat quality) via the intermediate nodes 61 in combination with the reference information V.
  • Alternatively, the reference information U (reference image information) may be used as an input as it is, and the output (meat quality) may be searched for using its degrees of association with the other reference information V.
  • The livestock breeding conditions for improving the meat quality may also be used as the output solution.
  • In that case, livestock breeding conditions are included in place of the meat quality in the data set used to train the degrees of association.
  • The livestock breeding conditions may include the feed to be given to the livestock, temperature, humidity, wind direction, sunlight intensity, indoor lighting level, audio data, pest extermination status, cleaning status, manure processing status, and the like.
  • the optimum solution search is performed through the degree of association set in three or more stages.
  • the degree of association can be described by, for example, a numerical value from 0 to 100% in addition to the above-mentioned 10 steps, but the degree of association is not limited to this, and any step can be described as long as it can be described by a numerical value of 3 or more steps. It may be configured.
  • When there are multiple possible candidates for the search solution, the candidates may be displayed to the user in descending order of the degree of association. If the candidates are displayed in descending order of the degree of association in this way, the more probable search solutions can be presented preferentially (a minimal sketch of this ordering, together with the threshold handling described below, follows this list).
  • The present invention also makes it possible to judge without overlooking discrimination results with an extremely low degree of association, such as 1%. Even a judgment result with an extremely low degree of association can be presented to the user as a slight sign, and may prove to be the useful judgment once every tens or hundreds of times.
  • The search policy can be adjusted by setting a threshold value when the search is performed based on degrees of association with three or more stages. If the threshold value is lowered, even results with a degree of association of about 1% can be picked up without omission, but the most appropriate discrimination result is less likely to stand out and a large amount of noise may be picked up. On the other hand, if the threshold value is raised, the optimum search solution can be detected with high probability, but a suitable solution that usually has a low degree of association and appears only once every tens or hundreds of times may be overlooked. Which to prioritize can be decided according to the thinking of the user side and the system side, and this increases the degree of freedom in choosing what to emphasize.
  • The above-mentioned degree of association may be updated.
  • This update may reflect information provided via a public communication network such as the Internet, and the degree of association is increased or decreased accordingly (a sketch of such an update also follows this list).
  • In terms of artificial intelligence, this update corresponds to learning, since new data is acquired and reflected in the trained data.
  • Apart from updates based on information obtainable from the public communication network, the degree of association may also be updated, artificially or automatically, by the system side or the user side based on the contents of research data, papers, conference presentations, newspaper articles, books, and the like produced by experts. Artificial intelligence may be utilized in these update processes.
  • The process of first creating the trained model, and the update described above, may use not only supervised learning but also unsupervised learning, deep learning, reinforcement learning, and the like.
  • In unsupervised learning, instead of reading and learning a dataset of input data and output data, only information corresponding to the input data may be read in and trained, and the degree of association leading to the output data may be self-formed from it.
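As a concrete illustration of the candidate ordering and threshold handling described in the items above, the following is a minimal Python sketch; the patent itself prescribes no implementation, and the candidate names, percentage values and the 5% threshold are hypothetical.

```python
# Hypothetical association degrees (0-100%) between one input pattern and candidate meat qualities.
candidates = {"meat quality A": 72.0, "meat quality B": 31.0, "meat quality C": 8.0, "meat quality D": 1.0}

def ranked_candidates(assoc, threshold=5.0):
    """Return the candidates at or above the threshold, in descending order of association."""
    kept = [(name, degree) for name, degree in assoc.items() if degree >= threshold]
    return sorted(kept, key=lambda item: item[1], reverse=True)

# A low threshold keeps even 1% "signs" (more noise); a high threshold keeps only the probable solutions.
for name, degree in ranked_candidates(candidates, threshold=5.0):
    print(f"{name}: {degree:.0f}%")
```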
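Likewise, the update of the degree of association could in the simplest case be an incremental adjustment of a stored weight whenever newly obtained information confirms or contradicts a link. The step size and the clamping to the ten-step scale below are assumptions made only for illustration.

```python
ASSOC_MIN, ASSOC_MAX = 1, 10  # the ten-step scale used in the embodiment

def update_association(current, confirmed, step=1):
    """Nudge a stored degree of association toward a newly confirmed outcome.

    current   -- stored degree (1-10) linking an input pattern to a meat quality
    confirmed -- True if the new information confirmed that link, False if it contradicted it
    """
    adjusted = current + step if confirmed else current - step
    return max(ASSOC_MIN, min(ASSOC_MAX, adjusted))

# Example: a link currently rated 7 is confirmed by a new case and becomes 8.
print(update_association(7, confirmed=True))
```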

Landscapes

  • Engineering & Computer Science (AREA)
  • Food Science & Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Medicinal Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Image Analysis (AREA)

Abstract

[Problem] To automatically distinguish meat quality with high accuracy and with less reliance on labor. [Solution] A meat quality distinction program for distinguishing meat quality, wherein the program is characterized by causing a computer to execute an information acquisition step for acquiring image information related to the external appearance of meat quality to be distinguished, and a distinction step for using three-stage or higher relatedness between meat quality and reference image information related to the external appearance of meat quality acquired in the past to distinguish meat quality on the basis of reference image information that corresponds to the image information acquired in the information acquisition step.

Description

肉質判別プログラム及びシステム Meat quality discrimination program and system
 本発明は、肉質を高精度に判別する上で好適な肉質判別プログラム及びシステムに関する。 The present invention relates to a meat quality discrimination program and system suitable for discriminating meat quality with high accuracy.
 食肉の肉質は、「脂肪交雑」、「肉の色沢」、「肉のしまりときめ」、「脂肪の色沢と質」等の項目に基づいて評価が行われる。そして、これらの各項目の総合的な判別結果から最終的に肉質が等級により表される。従来においてこの食肉の肉質は、人による官能評価として、食肉の品質さや特性(硬いか軟らかいか、風味が強いか弱いか等)をパネラー(消費者)により評価させていた。しかし、パネラーによる評価は、ブレが生じる場合もあり、統一的な判断が困難になる場合が多い。また肉質を機器分析を通じて行う場合もあるが、食肉を出荷する都度、機器分析を行うことになれば、労力と費用負担が増大してしまうことにもなる。 The meat quality of meat is evaluated based on items such as "fat crossing", "meat color", "meat tightness", and "fat color and quality". Then, the meat quality is finally represented by the grade from the comprehensive discrimination result of each of these items. Conventionally, the meat quality of this meat has been evaluated by a panelist (consumer) as a sensory evaluation by a person, such as the quality and characteristics of the meat (hard or soft, strong or weak flavor, etc.). However, evaluation by panelists may cause blurring, and it is often difficult to make a unified judgment. In some cases, meat quality is analyzed through instrumental analysis, but if instrumental analysis is performed each time meat is shipped, labor and cost burden will increase.
 このため、肉質評価を、人による官能評価や機器分析に頼ることなく高精度に評価することができるシステムが従来より望まれていた。 For this reason, a system that can evaluate meat quality with high accuracy without relying on human sensory evaluation or instrumental analysis has been conventionally desired.
 そこで本発明は、上述した問題点に鑑みて案出されたものであり、その目的とするところは、食肉の肉質を人による官能評価や機器分析に頼ることなく高精度かつ自動的に判別することが可能な肉質判別プログラム及びシステムを提供することにある。 Therefore, the present invention has been devised in view of the above-mentioned problems, and an object of the present invention is to determine the meat quality of meat with high accuracy and automatically without relying on human sensory evaluation or instrumental analysis. It is to provide a meat quality discrimination program and a system capable of this.
 本発明に係る肉質判別プログラムは、食肉の肉質を判別する肉質判別プログラムにおいて、判別対象の食肉を撮像した画像情報を取得する情報取得ステップと、過去において撮像した食肉の参照用画像情報と、肉質との3段階以上の連関度を利用し、上記情報取得ステップにおいて取得した画像情報に応じた参照用画像情報に基づき、上記連関度のより高いものを優先させて、肉質を判別する判別ステップとをコンピュータに実行させることを特徴とする。 The meat quality discrimination program according to the present invention is a meat quality discrimination program for discriminating the meat quality of meat, which includes an information acquisition step of acquiring image information obtained by imaging the meat to be discriminated, reference image information of the meat imaged in the past, and meat quality. Based on the reference image information corresponding to the image information acquired in the above information acquisition step, the one with the higher degree of association is prioritized and the meat quality is determined. Is characterized by having a computer execute the above.
 特段のスキルや経験が無くても、人による官能評価や機器分析に頼ることなく、誰でも手軽に肉質の判別を高精度に行うことができる。 Even if you do not have any special skills or experience, anyone can easily determine the meat quality with high accuracy without relying on human sensory evaluation or instrumental analysis.
本発明を適用したシステムの全体構成を示すブロック図である。It is a block diagram which shows the whole structure of the system to which this invention is applied. 探索装置の具体的な構成例を示す図である。It is a figure which shows the specific configuration example of a search device. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention. 本発明の動作について説明するための図である。It is a figure for demonstrating the operation of this invention.
 Hereinafter, the meat quality discrimination program to which the present invention is applied will be described in detail with reference to the drawings.
 FIG. 1 is a block diagram showing the overall configuration of a meat quality discrimination system 1 in which the meat quality discrimination program to which the present invention is applied is implemented. The meat quality discrimination system 1 includes an information acquisition unit 9, a discrimination device 2 connected to the information acquisition unit 9, and a database 3 connected to the discrimination device 2.
 The information acquisition unit 9 is a device through which a person using this system inputs various commands and information, and is specifically composed of a keyboard, buttons, a touch panel, a mouse, switches, and the like. The information acquisition unit 9 is not limited to a device for inputting text information, and may be a device, such as a microphone, that detects voice and converts it into text information. The information acquisition unit 9 may also be configured as an image pickup device capable of capturing images, such as a camera, or as a scanner having a function of recognizing character strings in paper documents. The information acquisition unit 9 may further be integrated with the discrimination device 2 described later. The information acquisition unit 9 outputs the detected information to the discrimination device 2. The information acquisition unit 9 may also be configured as means for specifying position information by scanning map information, or may be composed of a temperature sensor, a humidity sensor, a wind direction sensor, or an illuminance sensor for measuring light levels. The information acquisition unit 9 may also be composed of a communication interface for acquiring weather data from the Japan Meteorological Agency or a private weather forecast company. The information acquisition unit 9 may further be composed of a body sensor attached to the body to detect body data; this body sensor may detect, for example, body temperature, heart rate, blood pressure, number of steps, walking speed, and acceleration, and may acquire biological data of animals as well as humans. The information acquisition unit 9 may also be configured as a device that acquires information such as drawings by scanning or by reading it from a database. In addition to these, the information acquisition unit 9 may be configured as an odor sensor that detects odors or scents.
 The database 3 stores the various information necessary for meat quality discrimination. This information includes reference image information of meat captured in the past, reference ultrasonic image information captured in advance from the living body of the livestock that provides the meat, reference analysis information obtained by analyzing one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E in the meat, reference production area information on the production area of the meat imaged in the past, reference biological information acquired in the past from the living body of the livestock that provides the meat, reference breeding environment information on the breeding environment of the livestock that provided the meat imaged in the past, and reference feed information on the feed given to that livestock, stored as data sets together with the meat quality that was actually judged for them.
 That is, in addition to such reference image information, the database 3 stores one or more of the reference ultrasonic image information, the reference analysis information, the reference production area information, the reference biological information, the reference breeding environment information, and the reference feed information, linked to the meat quality.
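Purely for illustration, one record of such a data set might be organised as below; the field names and the use of a Python dataclass are assumptions and are not specified in the embodiment.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ReferenceRecord:
    """One stored data set: reference information plus the meat quality actually judged for it."""
    image_path: str                                  # reference image of the meat
    ultrasound_path: Optional[str] = None            # reference ultrasonic image (from the living animal)
    analysis: dict = field(default_factory=dict)     # e.g. {"oleic_acid": ..., "inosinic_acid": ...}
    production_area: Optional[str] = None
    biological_info: dict = field(default_factory=dict)
    breeding_environment: dict = field(default_factory=dict)
    feed_info: dict = field(default_factory=dict)
    meat_quality: str = ""                           # the label actually assigned, e.g. a grade or rank

record = ReferenceRecord(image_path="img_0001.png", production_area="Kagoshima", meat_quality="A")
print(record.meat_quality)
```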
 The discrimination device 2 is composed of an electronic device such as a personal computer (PC), but may also be embodied in any other electronic device such as a mobile phone, smartphone, tablet terminal, or wearable terminal. The user can obtain a search solution by means of this discrimination device 2.
 FIG. 2 shows a specific configuration example of the discrimination device 2. The discrimination device 2 has a control unit 24 for controlling the entire discrimination device 2, an operation unit 25 for inputting various control commands via operation buttons, a keyboard, or the like, a communication unit 26 for performing wired or wireless communication, a discrimination unit 27 for making various judgments, and a storage unit 28, represented by a hard disk or the like, for storing a program for performing the search to be executed, each connected to an internal bus 21. A display unit 23, a monitor that actually displays information, is also connected to the internal bus 21.
 The control unit 24 is a so-called central control unit that controls each component mounted in the discrimination device 2 by transmitting control signals via the internal bus 21. The control unit 24 also transmits various control commands via the internal bus 21 in response to operations made via the operation unit 25.
 The operation unit 25 is embodied by a keyboard or a touch panel, and an execution command for executing the program is input from the user. When the execution command is input by the user, the operation unit 25 notifies the control unit 24. Upon receiving this notification, the control unit 24 executes the desired processing operation in cooperation with each component, including the discrimination unit 27. The operation unit 25 may also be embodied as the information acquisition unit 9 described above.
 The discrimination unit 27 discriminates the search solution. In executing the discrimination operation, the discrimination unit 27 reads out the various information stored in the storage unit 28 and in the database 3 as necessary. The discrimination unit 27 may be controlled by artificial intelligence, and this artificial intelligence may be based on any well-known artificial intelligence technique.
 The display unit 23 is composed of a graphics controller that creates a display image based on control by the control unit 24. The display unit 23 is realized by, for example, a liquid crystal display (LCD).
 When the storage unit 28 is composed of a hard disk, predetermined information is written to each address based on control by the control unit 24 and read out as needed. The storage unit 28 also stores a program for executing the present invention; this program is read out and executed by the control unit 24.
 The operation of the meat quality discrimination system 1 having the above-described configuration will now be described.
 In the meat quality discrimination system 1, as shown for example in FIG. 3, it is assumed that degrees of association of three or more stages between reference image information and meat quality have been set in advance. The reference image information relates to the appearance of the meat and is obtained from image information captured of the meat; it can be obtained by analyzing that image information. The images may be moving images as well as still images. The reference image information may also be used to identify the meat quality by analyzing the captured meat images. The reference image information is assumed to consist of image data of meat that has been cut for consumption, in other words meat at the time of slaughter, but it may also consist of image data taken from the living body of the livestock that provides the meat.
 The meat quality referred to here may be expressed, for example, by "fat crossing", that is, the degree of marbling, and may be evaluated based on the BMS (Beef Marbling Standard) criteria. The meat quality may also be expressed by the "color and luster of the meat"; this is the color and gloss of the meat, and, as with fat crossing, the meat color may be evaluated based on the BCS (Beef Color Standard) criteria. The meat quality also includes gloss, as well as the "firmness and texture of the meat", which may be evaluated visually. The meat quality may also be evaluated by the grain of the meat; when the grain is fine, a soft texture is obtained. The meat quality further includes the "color, luster and quality of the fat", where the color is judged against a white or cream standard and evaluated in consideration of gloss and quality.
 The meat quality may be expressed through a meat grade, or by a ranking evaluated in five or ten levels set by the system side or the user side. Alternatively, it may simply be expressed as extremely delicious, delicious, fair, or ordinary.
 These meat qualities may be discriminated based on previously learned features. In this case, artificial intelligence may be used to learn meat image data together with meat quality in advance, and when reference image information is actually acquired, the meat quality may be discriminated by comparison with the learned image data.
 Also, instead of meat quality, one or more of fat crossing, color and luster of the meat, firmness and texture of the meat, and color, luster and quality of the fat may be used as the output. In such a case, meat image data and one or more of these items are learned in advance, and when reference image information is actually acquired, discrimination may be performed by comparison with the learned image data.
 The meat quality may be judged good or bad based on an evaluator's previous experience, or by actually tasting the meat. In the latter case, several inspectors who taste the meat may evaluate items such as texture, sourness, aroma, chewiness, and bitterness in multiple stages, and these evaluations may be statistically analyzed into a quality evaluation value. The meat quality may also be determined through a taste sensor capable of detecting taste, or through various instrumental analyses.
 In the example of FIG. 3, it is assumed that the input data are, for example, reference image information P01 to P03. The reference image information P01 to P03 as input data is linked to meat quality as output. In this output, the meat quality is displayed as the output solution.
 The reference image information is associated with the meat qualities A to D as output solutions through degrees of association of three or more stages. The reference image information is arranged on the left side of these degrees of association, and each meat quality is arranged on the right side. The degree of association indicates, for the reference image information arranged on the left, to which meat quality it is most strongly related. In other words, the degree of association is an index showing to which meat quality each piece of reference image information is likely to be linked, and indicates the accuracy of selecting the most probable meat quality from the reference image information. In the example of FIG. 3, w13 to w19 are shown as degrees of association. As shown in Table 1 below, w13 to w19 are expressed in ten steps: the closer to 10 points, the more strongly the corresponding reference image information is related to the meat quality as output, and conversely, the closer to 1 point, the more weakly it is related to the meat quality as output.
[Table 1: the ten-step scale used to express the degrees of association]
 The discrimination device 2 acquires in advance the degrees of association w13 to w19 of three or more stages shown in FIG. 3. That is, in order to discriminate the actual search solution, the discrimination device 2 accumulates past data sets of reference image information and the meat quality that was adopted or evaluated in each case, and builds up the degrees of association shown in FIG. 3 by analyzing them.
 For example, suppose that meat quality A was most often assigned in the past to a certain piece of reference image information captured of meat. By collecting and analyzing such data sets, the degree of association with that reference image information becomes stronger.
 This analysis may be performed by artificial intelligence. In that case, for reference image information P01 for example, the various data resulting from past meat quality evaluations are analyzed. If there are many cases of meat quality A for reference image information P01, the degree of association leading to that meat quality is set higher; if there are many cases of meat quality B, the degree of association leading to that meat quality is set higher instead. For example, reference image information P01 is linked to meat quality A and meat quality C; based on previous cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality C is set to 2 points.
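One conceivable way to derive ten-step degrees of association such as w13 to w19 from accumulated past cases is to count how often each meat quality was assigned to a given reference image pattern and rescale the counts, as in the following sketch. The counting rule is an assumption; the patent does not prescribe a specific formula.

```python
from collections import Counter

def association_degrees(past_cases, scale=10):
    """past_cases: list of (reference_image_id, meat_quality) pairs from past evaluations.
    Returns {reference_image_id: {meat_quality: degree on a 1..scale scale}}."""
    counts = {}
    for image_id, quality in past_cases:
        counts.setdefault(image_id, Counter())[quality] += 1
    degrees = {}
    for image_id, counter in counts.items():
        total = sum(counter.values())
        degrees[image_id] = {q: max(1, round(scale * n / total)) for q, n in counter.items()}
    return degrees

# Example: P01 was judged meat quality A in 7 of 9 past cases and meat quality C in 2 of 9.
history = [("P01", "A")] * 7 + [("P01", "C")] * 2
print(association_degrees(history)["P01"])  # roughly {'A': 8, 'C': 2}
```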
 The degrees of association shown in FIG. 3 may also be constituted by nodes of a neural network in artificial intelligence. In that case, the weighting coefficients of the neural network nodes with respect to the output correspond to the degrees of association described above. The configuration is not limited to a neural network and may be composed of any decision-making factors constituting artificial intelligence.
 In such a case, as shown in FIG. 4, reference image information may be input as input data, meat quality may be output as output data, at least one hidden layer may be provided between the input nodes and the output nodes, and machine learning may be performed. The degrees of association described above are set in the input nodes, the hidden layer nodes, or both; they serve as the weighting of each node, and the output is selected on this basis. The output may be selected when its degree of association exceeds a certain threshold value.
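A minimal sketch of such a network, in which the weights take the role of the degrees of association and an output is selected when its score exceeds a threshold, might look as follows. The layer sizes, activation, random (untrained) weights and the threshold value are all assumptions for illustration; in practice the weights would be obtained by training on the data sets described above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_hidden, n_outputs = 4, 8, 5          # e.g. image features -> hidden layer -> meat qualities A-E

# The weights between the layers play the role of the degrees of association.
W1 = rng.normal(size=(n_inputs, n_hidden))
W2 = rng.normal(size=(n_hidden, n_outputs))

def forward(x):
    hidden = np.tanh(x @ W1)                      # hidden (intermediate) layer
    scores = hidden @ W2
    return np.exp(scores) / np.exp(scores).sum()  # normalised score per meat quality

x = rng.normal(size=n_inputs)                     # feature vector extracted from a newly captured meat image
probs = forward(x)
THRESHOLD = 0.2                                   # an output is selected when its score exceeds the threshold
selected = [i for i, p in enumerate(probs) if p > THRESHOLD]
print(np.round(probs, 2), selected)
```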
 Such degrees of association constitute what is called trained data in artificial intelligence. After such trained data has been created from data sets of previously evaluated meat appearance images and the meat quality actually discriminated and evaluated for them, the trained data is used when a new meat quality discrimination is actually performed. In that case, image information obtained by imaging the meat to be discriminated is newly acquired. The newly acquired image information is input via the information acquisition unit 9 described above. The image information is acquired by capturing an image of the meat to be discriminated, and this may be done in the same manner as for the reference image information described above.
 The meat quality is then discriminated based on the newly acquired image information, by referring to the degrees of association shown in FIG. 3 (Table 1) acquired in advance. For example, if the newly acquired image information is identical or similar to P02, meat quality B is linked through degree of association w15 and meat quality C through degree of association w16. In such a case, meat quality B, which has the highest degree of association, is selected as the optimum solution. However, it is not essential to select the candidate with the highest degree of association as the optimum solution; meat quality C, whose degree of association is lower but for which an association is still recognized, may be selected instead. It is of course also possible to select an output solution to which no arrow is connected, or to select with any other priority as long as the selection is based on the degrees of association.
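The inference step just described can be sketched as: find the stored reference image that the newly acquired image matches or most resembles, then read off its linked meat qualities in descending order of association. The similarity function below is a placeholder assumption, and apart from P01's degrees (7 and 2, which mirror the example in the text) the numbers are invented.

```python
# Degrees of association created in advance (cf. FIG. 3 / Table 1).
learned = {
    "P01": {"A": 7, "C": 2},   # values from the example in the text
    "P02": {"B": 8, "C": 3},   # illustrative values
    "P03": {"D": 6},           # illustrative value
}

def most_similar_reference(new_image_features, references):
    """Placeholder: return the ID of the stored reference image the new image matches or most resembles."""
    # A real system would compare image features or use a trained model; here P02 is assumed to match.
    return "P02"

reference_id = most_similar_reference(new_image_features=None, references=learned)
ranking = sorted(learned[reference_id].items(), key=lambda kv: kv[1], reverse=True)
print(ranking)  # [('B', 8), ('C', 3)] -> meat quality B is presented as the optimum solution
```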
 In this way, the most suitable meat quality can be searched for from the newly acquired image information and displayed to the user. By viewing this search result, the user, that is, the meat producer, seller, or distributor, can sort meat based on the searched meat quality, predict the taste of the meat, and decide the price of the meat.
 The above-mentioned images are not limited to images captured with an ordinary camera; one or more of spectral images and ultrasonic images may also be acquired. In that case, it is necessary to capture, as reference image information, one or more of spectral images, visible images, and ultrasonic images corresponding to the image information to be acquired. In particular, when ultrasonic images are used, biological data of the meat may be imaged in the living body of the livestock that provides the meat.
 The example of FIG. 5 assumes that combinations of reference image information and reference ultrasonic image information are formed. Here, the reference image information is image data of meat captured at the time of past slaughter. The reference image information is an image captured by an ordinary camera, but may also consist of spectral images color-coded by frequency band. The reference ultrasonic image information, on the other hand, is ultrasonic image data of the meat portion captured in advance in the living body of the livestock that provides the meat.
 By judging with reference ultrasonic image information combined with such reference image information, the meat quality can be discriminated with higher accuracy. For this reason, the degrees of association described above are formed from reference image information combined with reference ultrasonic image information.
 In the example of FIG. 5, it is assumed that the input data are, for example, reference image information P01 to P03 and reference ultrasonic image information P14 to P17. The intermediate nodes shown in FIG. 5 are combinations of reference image information and reference ultrasonic image information as input data. Each intermediate node is further linked to the output, in which the meat quality is displayed as the output solution.
 Each combination (intermediate node) of reference image information and reference ultrasonic image information is associated with the meat quality as an output solution through degrees of association of three or more stages. The reference image information and reference ultrasonic image information are arranged on the left side of these degrees of association, and the meat quality on the right side. The degree of association indicates how strongly the reference image information and reference ultrasonic image information arranged on the left are related to each meat quality. In other words, the degree of association is an index showing to which meat quality each combination of reference image information and reference ultrasonic image information is likely to be linked, and indicates the accuracy of selecting the most probable meat quality from them. The optimum meat quality is therefore searched for using these combinations of reference image information and reference ultrasonic image information.
 In the example of FIG. 5, w13 to w22 are shown as degrees of association. As shown in Table 1, w13 to w22 are expressed in ten steps: the closer to 10 points, the more strongly the corresponding combination as an intermediate node is related to the output, and conversely, the closer to 1 point, the more weakly it is related to the output.
 The discrimination device 2 acquires in advance the degrees of association w13 to w22 of three or more stages shown in FIG. 5. That is, in order to discriminate the actual search solution, the discrimination device 2 accumulates past data on reference image information, reference ultrasonic image information, and the meat quality that matched them in each case, and builds up the degrees of association shown in FIG. 5 by analyzing them.
 For example, suppose that in an actual past case the reference image information was image data α and the reference ultrasonic image information was image data β. In such a case, the meat quality that was actually determined is learned as a data set and defined in the form of the degrees of association described above. Such reference image information and reference ultrasonic image information may be extracted from management databases administered by producers, sellers, distributors, and the like.
 This analysis may be performed by artificial intelligence. In that case, for example for reference image information P01 and reference ultrasonic image information P16, the meat quality is analyzed from past data. If there are many cases of meat quality A (with certain degrees of sweetness, sourness, bitterness, chewiness, and so on), the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few of meat quality A, the degree of association leading to meat quality B is set higher and that leading to meat quality A lower. For example, intermediate node 61a is linked to the outputs meat quality A and meat quality B; based on previous cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B to 2 points.
 The degrees of association shown in FIG. 5 may also be constituted by nodes of a neural network in artificial intelligence; the weighting coefficients of those nodes with respect to the output then correspond to the degrees of association described above. The configuration is not limited to a neural network and may be composed of any decision-making factors constituting artificial intelligence. The other aspects of the artificial intelligence configuration are the same as described for FIG. 4.
 In the example of the degrees of association shown in FIG. 5, node 61b is the combination of reference image information P01 and reference ultrasonic image information P14, with a degree of association of w15 for meat quality C and w16 for meat quality E. Node 61c is the combination of reference image information P02 and reference ultrasonic image information P15 and P17, with a degree of association of w17 for meat quality B and w18 for meat quality D.
 Such degrees of association constitute the trained data in artificial intelligence terms. After such trained data has been created, it is used when the meat quality is actually discriminated. In that case, the image information and ultrasonic image information of the meat whose quality is to be discriminated are input or selected.
 The optimum meat quality is searched for based on the image information and ultrasonic image information newly acquired in this way, by referring to the degrees of association shown in FIG. 5 (Table 1) acquired in advance. For example, if the newly acquired image information is identical or similar to P02 and the ultrasonic image information is P17, node 61d is associated with them through the degrees of association, and node 61d links meat quality C with w19 and meat quality D with w20. In such a case, meat quality C, which has the highest degree of association, is selected as the optimum solution. However, it is not essential to select the candidate with the highest degree of association as the optimum solution; meat quality D, whose degree of association is lower but for which an association is still recognized, may be selected instead. It is of course also possible to select an output solution to which no arrow is connected, or to select with any other priority as long as the selection is based on the degrees of association.
 Examples of the degrees of association w1 to w12 extending from the inputs are shown in Table 2 below.
[Table 2: examples of the degrees of association w1 to w12 extending from the inputs]
 The intermediate node 61 may be selected based on these degrees of association w1 to w12 extending from the inputs; that is, the larger the degree of association w1 to w12, the heavier the weighting given in the selection of the intermediate node 61. However, the degrees of association w1 to w12 may all be set to the same value, in which case the weightings in the selection of the intermediate node 61 are all identical.
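A sketch of how the combination nodes of FIG. 5 might be evaluated, including the optional input-side weights of Table 2, is given below. Except for node 61a, which mirrors the 7-point and 2-point example in the text, the degrees and the equal input-side weights are illustrative assumptions.

```python
# Intermediate nodes keyed by (reference image, reference ultrasonic image) combinations.
node_outputs = {
    ("P01", "P16"): {"A": 7, "B": 2},   # node 61a (values from the example in the text)
    ("P01", "P14"): {"C": 5, "E": 4},   # node 61b (illustrative values)
    ("P02", "P17"): {"C": 9, "D": 3},   # node 61d (illustrative values)
}
# Input-side degrees of association (the w1-w12 analogue); here all equal, i.e. no extra weighting.
input_edge_weight = {key: 1.0 for key in node_outputs}

def discriminate(image_id, ultrasound_id):
    node = (image_id, ultrasound_id)
    weight = input_edge_weight.get(node, 1.0)
    scored = {quality: weight * degree for quality, degree in node_outputs.get(node, {}).items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

print(discriminate("P02", "P17"))  # meat quality C (highest degree of association) comes first
```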
 In addition to the reference image information described above, a combination with reference spectral information may be used instead of the reference ultrasonic image information, and degrees of association of three or more stages may be set between that combination and the meat quality.
 Incidentally, the ultrasonic image information described above is not limited to ultrasonic image data captured at a single point in time for the livestock used as meat; ultrasonic images may be captured from the living animal several times in time series and the tendency of change over time may be acquired. Degrees of association are then formed with reference ultrasonic image information that includes such time-series change tendencies, and when the time-series change tendency of ultrasonic images captured several times from the living body of the livestock providing the meat to be discriminated has been acquired, it is input as input data for discrimination.
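The time-series change tendency could, for example, be condensed into a per-feature slope and supplied as an additional input, as in the following sketch; the choice of a least-squares slope and the example feature values are assumptions, not something specified in the embodiment.

```python
import numpy as np

def trend_feature(feature_series):
    """feature_series: array of shape (n_timepoints, n_features), one row per ultrasound session.
    Returns the per-feature slope over time, used as an additional input describing the change tendency."""
    t = np.arange(feature_series.shape[0])
    slopes = np.polyfit(t, feature_series, deg=1)[0]  # least-squares slope of each feature against time
    return slopes

# e.g. two marbling-related features measured at three imaging sessions of the same animal
series = np.array([[0.31, 1.2], [0.35, 1.4], [0.40, 1.7]])
print(trend_feature(series))
```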
 FIG. 6 shows an example in which, in addition to the reference image information described above, a combination with reference analysis information is used instead of the reference ultrasonic image information, and degrees of association of three or more stages are set between that combination and the meat quality.
 This reference analysis information, added as an explanatory variable instead of the reference ultrasonic image information, is any information concerning the results of chemical or physical analyses performed on the meat. The reference analysis information may include analysis information obtained by analyzing one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E. Free amino acid analysis measures the proportions of various amino acids such as glutamic acid, an umami component. Analysis of the fatty acid composition gives quantitative results for various fatty acids, such as the oleic acid contained in the adipose tissue, as criteria for texture attributes such as mellowness and melt-in-the-mouth quality. Oleic acid is analyzed because meat is evaluated as softer and tastier the more monounsaturated fatty acid it contains. Inosinic acid, a type of organic compound and the umami component of dried bonito, is said to increase with aging after slaughter, and is therefore analyzed. Guanylic acid is analyzed because it draws out the umami component of shiitake mushrooms. Vitamin E is also analyzed because it affects umami.
 Since each index included in such reference analysis information also affects the taste of the meat, discrimination accuracy can be improved by combining it with the reference image information and discriminating the meat quality through the degrees of association.
 Both the reference analysis information and the analysis information may be obtained by analysis performed on the meat at the time of slaughter, or by analysis performed on the living body of the livestock that provides the meat.
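For use as an input, such analysis information can simply be arranged into a fixed-length numeric feature vector alongside the image information. The item names, units and ordering below are assumptions for illustration; the patent lists the analysed substances but not a data format.

```python
ANALYSIS_ITEMS = ["free_glutamic_acid", "oleic_acid", "inosinic_acid", "guanylic_acid", "vitamin_e"]

def analysis_vector(measurements):
    """Order the measured values into a fixed-length vector; missing items become 0.0."""
    return [float(measurements.get(item, 0.0)) for item in ANALYSIS_ITEMS]

sample = {"oleic_acid": 48.2, "inosinic_acid": 0.19}  # hypothetical measured values
print(analysis_vector(sample))
```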
 In the example of FIG. 6, it is assumed that the input data are, for example, reference image information P01 to P03 and reference analysis information P18 to P21. The intermediate nodes shown in FIG. 6 are combinations of reference image information and reference analysis information as input data. Each intermediate node is further linked to the output, in which the meat quality is displayed as the output solution.
 Each combination (intermediate node) of reference image information and reference analysis information is associated with the meat quality as an output solution through degrees of association of three or more stages. The reference image information and reference analysis information are arranged on the left side of these degrees of association, and the meat quality on the right side. The degree of association indicates how strongly the reference image information and reference analysis information arranged on the left are related to each meat quality. In other words, the degree of association is an index showing to which meat quality each combination of reference image information and reference analysis information is likely to be linked, and indicates the accuracy of selecting the most probable meat quality from them.
 The discrimination device 2 acquires in advance the degrees of association w13 to w22 of three or more stages shown in FIG. 6. That is, in order to discriminate the actual search solution, the discrimination device 2 accumulates past data on the reference image information, the reference analysis information obtained when the reference image information was acquired, and the meat quality that was suitable in each case, and builds up the degrees of association shown in FIG. 6 by analyzing them.
 For example, suppose that in a past meat quality evaluation, for certain reference image information, the reference analysis information showed a certain oleic acid content and a certain inosinic acid content. If there are many cases in which the meat quality was judged to be A under these conditions, these are learned as data sets and defined in the form of the degrees of association described above.
 This analysis may be performed by artificial intelligence. In that case, for example for reference image information P01 and reference analysis information P20, the meat quality is analyzed from past data. If there are many cases of meat quality A, the degree of association leading to meat quality A is set higher; if there are many cases of meat quality B and few of meat quality A, the degree of association leading to meat quality B is set higher and that leading to meat quality A lower. For example, intermediate node 61a is linked to the outputs meat quality A and meat quality B; based on previous cases, the degree of association w13 leading to meat quality A is set to 7 points and the degree of association w14 leading to meat quality B to 2 points.
 The degrees of association shown in FIG. 6 may also be constituted by nodes of a neural network in artificial intelligence; the weighting coefficients of those nodes with respect to the output then correspond to the degrees of association described above. The configuration is not limited to a neural network and may be composed of any decision-making factors constituting artificial intelligence. The other aspects of the artificial intelligence configuration are the same as described for FIG. 4.
 In the example of the degrees of association shown in FIG. 6, node 61b is the combination of reference image information P01 and reference analysis information P18, with a degree of association of w15 for meat quality C and w16 for meat quality E. Node 61c is the combination of reference image information P02 and reference analysis information P19 and P21, with a degree of association of w17 for meat quality B and w18 for meat quality D.
 Such degrees of association constitute the trained data in artificial intelligence terms. After such trained data has been created, it is used when the meat quality search is actually performed. In that case, the image information of the meat whose quality is to be discriminated and its analysis information are acquired. The analysis information is newly acquired when the meat quality is actually estimated, and the method of acquiring it is the same as for the reference analysis information described above.
 Incidentally, the reference analysis information is obtained by analyzing in advance one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E corresponding to the analysis information, and degrees of association are formed for its combinations with the reference image information.
 The optimum meat quality is searched for based on the image information and analysis information newly acquired in this way, by referring to the degrees of association shown in FIG. 6 (Table 1) acquired in advance. For example, if the newly acquired image information is identical or similar to P02 and the analysis information is identical or similar to P21, node 61d is associated with them through the degrees of association, and node 61d links meat quality C with w19 and meat quality D with w20. In such a case, meat quality C, which has the highest degree of association, is selected as the optimum solution. However, it is not essential to select the candidate with the highest degree of association as the optimum solution; meat quality D, whose degree of association is lower but for which an association is still recognized, may be selected instead. It is of course also possible to select an output solution to which no arrow is connected, or to select with any other priority as long as the selection is based on the degrees of association.
 FIG. 7 shows an example in which, in addition to the reference image information described above, a combination with reference production area information is used instead of the reference ultrasonic image information, and degrees of association of three or more stages are set between that combination and the meat quality.
 This reference production area information, added as an explanatory variable instead of the reference ultrasonic image information, is information on the production area of the meat; it may be expressed, for example, at the country level such as the United States or Japan, at the regional level such as the Tohoku or Kyushu region, at the prefectural level such as Hokkaido or Kagoshima Prefecture, or further at the level of a district or town within Hokkaido, or even at the level of an individual ranch. Since the production area of the meat included in such reference production area information also affects the taste of the meat, discrimination accuracy can be improved by combining it with the reference image information and discriminating the meat quality through the degrees of association.
In the example of FIG. 7, the input data are, for example, reference image information P01 to P03 and reference production area information P18 to P21. The intermediate nodes shown in FIG. 7 are the combinations of the reference image information and the reference production area information serving as this input data. Each intermediate node is further connected to the output, in which the meat quality is displayed as the output solution.
Each combination (intermediate node) of reference image information and reference production area information is linked to the meat quality, as the output solution, through degrees of association of three or more levels. The reference image information and the reference production area information are arranged on the left side of these degrees of association, and the meat quality on the right side. The degree of association indicates how strongly the reference image information and reference production area information arranged on the left are related to each meat quality. In other words, it is an index of which meat quality each piece of reference image information and reference production area information is most likely to be tied to, and it indicates the accuracy with which the most probable meat quality can be selected from the reference image information and the reference production area information.
The discrimination device 2 acquires in advance the degrees of association w13 to w22 of three or more levels shown in FIG. 7. That is, in order to discriminate the actual search solution, the discrimination device 2 accumulates past data on the reference image information, the reference production area information obtained when the reference image information was acquired, and which meat quality was appropriate in each case, and builds the degrees of association shown in FIG. 7 by analyzing these data.
This analysis may be performed by artificial intelligence. In that case, for example, when the reference image information is P01 and the reference production area information is P20, the corresponding meat quality is analyzed from past data. If there are many cases of meat quality A, the degree of association leading to A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to B is set higher and the degree leading to A is set lower. In the example of intermediate node 61a, which is linked to the outputs of meat quality A and meat quality B, the degree of association w13 leading to meat quality A is set to 7 points and the degree w14 leading to meat quality B to 2 points on the basis of previous cases.
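As a rough illustration of this frequency-based setting of the degrees of association, the following Python sketch counts past cases per combination; the record format and the cap at 10 points are assumptions made for the example, not a prescribed method.

    from collections import Counter, defaultdict

    def build_association(past_cases, scale=10):
        """past_cases: iterable of ((image_pattern, other_pattern), meat_quality) records."""
        counts = defaultdict(Counter)
        for key, quality in past_cases:
            counts[key][quality] += 1
        # Here the degree is simply the case count capped at `scale`; any monotone
        # mapping from frequency to a score of three or more levels would serve.
        return {key: {q: min(n, scale) for q, n in c.items()} for key, c in counts.items()}

    cases = [(("P01", "P20"), "A")] * 7 + [(("P01", "P20"), "B")] * 2
    print(build_association(cases))   # {('P01', 'P20'): {'A': 7, 'B': 2}}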
The degrees of association shown in FIG. 7 may also be constituted by the nodes of a neural network in artificial intelligence; that is, the weighting coefficients applied by the nodes of such a neural network to their outputs correspond to the degrees of association described above. The configuration is not limited to a neural network and may consist of any decision-making factors constituting artificial intelligence. The other aspects of the artificial intelligence configuration are the same as described with reference to FIG. 4.
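A miniature neural-network reading of this correspondence might look like the sketch below (NumPy assumed). It is only illustrative: the layer sizes, random weights, and pattern indices are invented for the example, and a real model would be trained so that the output-layer weights behave like the degrees of association.

    import numpy as np

    rng = np.random.default_rng(0)
    n_inputs, n_nodes, n_qualities = 7, 4, 5      # inputs ~ P01-P03 / P18-P21, nodes ~ 61a-61d, qualities A-E
    W1 = rng.normal(size=(n_inputs, n_nodes))     # input -> intermediate (combination) nodes
    W2 = rng.normal(size=(n_nodes, n_qualities))  # output weights playing the role of association degrees

    def forward(x):
        h = np.maximum(0.0, x @ W1)               # activation of the intermediate nodes
        scores = h @ W2                           # weighted by the association-like coefficients
        return int(scores.argmax())               # index of the most strongly linked meat quality

    x = np.zeros(n_inputs)
    x[1] = 1.0                                    # matched image pattern (e.g. P02)
    x[5] = 1.0                                    # matched second pattern (e.g. P21)
    print(forward(x))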
In the example of the degrees of association shown in FIG. 7, node 61b is the combination of reference image information P01 and reference production area information P18, with a degree of association w15 to meat quality C and w16 to meat quality E. Node 61c is the combination of reference image information P02 with reference production area information P19 and P21, with a degree of association w17 to meat quality B and w18 to meat quality D.
Such degrees of association constitute what is called trained data in artificial intelligence. After such trained data has been created, it is used when a meat quality search is actually carried out. In that case, the image information of the meat to be evaluated and its production area information are actually acquired. The production area information is newly acquired when the meat quality is actually estimated, and its acquisition method is the same as for the reference production area information described above. The production area information and the reference production area information may be acquired by keyboard input on a device such as a PC or smartphone, or by imaging and analyzing the character information or two-dimensional code printed on a label on which the production area of the meat is stated.
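One possible acquisition path for the production area information, reading a two-dimensional code on the meat label, could be sketched as below. This assumes the Pillow and pyzbar libraries are available; the file name and the encoded text format are illustrative assumptions only.

    from PIL import Image
    from pyzbar.pyzbar import decode   # assumed to be installed

    def read_origin_from_label(photo_path):
        """Return the text encoded in a 2D code printed on a meat label, if any."""
        results = decode(Image.open(photo_path))
        return results[0].data.decode("utf-8") if results else None

    # e.g. read_origin_from_label("label.jpg") might return "Hokkaido / <ranch name>"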
The optimum meat quality is then searched for on the basis of the newly acquired image information and production area information. In that case, the degrees of association shown in FIG. 7 (Table 1), acquired in advance, are referred to. For example, when the newly acquired image information is identical or similar to P02 and the production area information is identical or similar to P21, node 61d is associated with them via the degree of association, and this node 61d is linked to meat quality C with degree w19 and to meat quality D with degree w20. In that case, meat quality C, which has the highest degree of association, is selected as the optimum solution. However, it is not essential to select the candidate with the highest degree of association; meat quality D, whose degree of association is lower but for which an association is nonetheless recognized, may be selected instead. It is of course also possible to select an output solution to which no arrow is connected, and any other order of priority may be used as long as the selection is based on the degree of association.
FIG. 8 shows an example in which, in addition to the reference image information described above, a combination with reference biometric information is used in place of the reference ultrasonic image information, and degrees of association of three or more levels are set between this combination and the meat quality.
This reference biometric information, added as an explanatory variable in place of the reference ultrasonic image information, includes any biological data measured on the living livestock animal that provides the meat, such as heart rate, body temperature, electrocardiogram, blood pressure, blood test results, and body weight. Since the biological data contained in the reference biometric information also affect the taste of the meat, combining it with the reference image information and discriminating the meat quality through the degree of association improves the discrimination accuracy.
In the example of FIG. 8, the input data are, for example, reference image information P01 to P03 and reference biometric information P18 to P21. The intermediate nodes shown in FIG. 8 are the combinations of the reference image information and the reference biometric information serving as this input data. Each intermediate node is further connected to the output, in which the meat quality is displayed as the output solution.
Each combination (intermediate node) of reference image information and reference biometric information is linked to the meat quality, as the output solution, through degrees of association of three or more levels. The reference image information and the reference biometric information are arranged on the left side of these degrees of association, and the meat quality on the right side. The degree of association indicates how strongly the reference image information and reference biometric information arranged on the left are related to each meat quality. In other words, it is an index of which meat quality each piece of reference image information and reference biometric information is most likely to be tied to, and it indicates the accuracy with which the most probable meat quality can be selected from the reference image information and the reference biometric information.
The discrimination device 2 acquires in advance the degrees of association w13 to w22 of three or more levels shown in FIG. 8. That is, in order to discriminate the actual search solution, the discrimination device 2 accumulates past data on the reference image information, the reference biometric information, and which meat quality was appropriate in each case, and builds the degrees of association shown in FIG. 8 by analyzing these data.
This analysis may be performed by artificial intelligence. In that case, for example, when the reference image information is P01 and the reference biometric information is P20, the corresponding meat quality is analyzed from past data. If there are many cases of meat quality A, the degree of association leading to A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to B is set higher and the degree leading to A is set lower. In the example of intermediate node 61a, which is linked to the outputs of meat quality A and meat quality B, the degree of association w13 leading to meat quality A is set to 7 points and the degree w14 leading to meat quality B to 2 points on the basis of previous cases.
The degrees of association shown in FIG. 8 may also be constituted by the nodes of a neural network in artificial intelligence; that is, the weighting coefficients applied by the nodes of such a neural network to their outputs correspond to the degrees of association described above. The configuration is not limited to a neural network and may consist of any decision-making factors constituting artificial intelligence. The other aspects of the artificial intelligence configuration are the same as described with reference to FIG. 4.
In the example of the degrees of association shown in FIG. 8, node 61b is the combination of reference image information P01 and reference biometric information P18, with a degree of association w15 to meat quality C and w16 to meat quality E. Node 61c is the combination of reference image information P02 with reference biometric information P19 and P21, with a degree of association w17 to meat quality B and w18 to meat quality D.
Such degrees of association constitute what is called trained data in artificial intelligence. After such trained data has been created, it is used when a meat quality search is actually carried out. In that case, the image information of the meat to be evaluated and its biometric information are actually acquired. The biometric information is newly acquired when the meat quality is actually estimated, and its acquisition method is the same as for the reference biometric information described above.
The optimum meat quality is then searched for on the basis of the newly acquired image information and biometric information. In that case, the degrees of association shown in FIG. 8 (Table 1), acquired in advance, are referred to. For example, when the newly acquired image information is identical or similar to P02 and the biometric information is identical or similar to P21, node 61d is associated with them via the degree of association, and this node 61d is linked to meat quality C with degree w19 and to meat quality D with degree w20. In that case, meat quality C, which has the highest degree of association, is selected as the optimum solution. However, it is not essential to select the candidate with the highest degree of association; meat quality D, whose degree of association is lower but for which an association is nonetheless recognized, may be selected instead. It is of course also possible to select an output solution to which no arrow is connected, and any other order of priority may be used as long as the selection is based on the degree of association.
Note that the biometric information and reference biometric information described above may include biological data acquired from the living animal a plurality of times at intervals in a time series, together with the time-series change tendency of those data. This makes it possible to judge the meat quality while also taking into account the time-series change tendency of the livestock's biological data.
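The time-series change tendency mentioned here could, for example, be summarized as a simple slope, as in the sketch below (NumPy assumed; the sampling times and temperature values are made up for the example).

    import numpy as np

    def change_tendency(timestamps_h, values):
        """Summarise a biometric time series by its latest value and overall slope."""
        slope = np.polyfit(timestamps_h, values, 1)[0]   # change per hour
        return {"latest": values[-1], "slope_per_hour": float(slope)}

    # Body temperature of one animal sampled over a day.
    print(change_tendency([0, 6, 12, 18, 24], [38.6, 38.7, 38.9, 39.0, 39.2]))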
FIG. 9 shows an example in which, in addition to the reference image information described above, a combination with reference breeding environment information is used in place of the reference ultrasonic image information, and degrees of association of three or more levels are set between this combination and the meat quality.
This reference breeding environment information, added as an explanatory variable in place of the reference ultrasonic image information, includes any data on the environment in which the livestock providing the meat is raised, such as the temperature, humidity, wind direction, amount of sunlight, and degree of indoor lighting in the barn, as well as audio data, pest control status, cleaning status, and manure treatment status. Since the data contained in the reference breeding environment information also affect the taste of the meat, combining it with the reference image information and discriminating the meat quality through the degree of association improves the discrimination accuracy.
In the example of FIG. 9, the input data are, for example, reference image information P01 to P03 and reference breeding environment information P18 to P21. The intermediate nodes shown in FIG. 9 are the combinations of the reference image information and the reference breeding environment information serving as this input data. Each intermediate node is further connected to the output, in which the meat quality is displayed as the output solution.
Each combination (intermediate node) of reference image information and reference breeding environment information is linked to the meat quality, as the output solution, through degrees of association of three or more levels. The reference image information and the reference breeding environment information are arranged on the left side of these degrees of association, and the meat quality on the right side. The degree of association indicates how strongly the reference image information and reference breeding environment information arranged on the left are related to each meat quality. In other words, it is an index of which meat quality each piece of reference image information and reference breeding environment information is most likely to be tied to, and it indicates the accuracy with which the most probable meat quality can be selected from the reference image information and the reference breeding environment information.
The discrimination device 2 acquires in advance the degrees of association w13 to w22 of three or more levels shown in FIG. 9. That is, in order to discriminate the actual search solution, the discrimination device 2 accumulates past data on the reference image information, the reference breeding environment information, and which meat quality was appropriate in each case, and builds the degrees of association shown in FIG. 9 by analyzing these data.
This analysis may be performed by artificial intelligence. In that case, for example, when the reference image information is P01 and the reference breeding environment information is P20, the corresponding meat quality is analyzed from past data. If there are many cases of meat quality A, the degree of association leading to A is set higher; if there are many cases of meat quality B and few cases of meat quality A, the degree of association leading to B is set higher and the degree leading to A is set lower. In the example of intermediate node 61a, which is linked to the outputs of meat quality A and meat quality B, the degree of association w13 leading to meat quality A is set to 7 points and the degree w14 leading to meat quality B to 2 points on the basis of previous cases.
The degrees of association shown in FIG. 9 may also be constituted by the nodes of a neural network in artificial intelligence; that is, the weighting coefficients applied by the nodes of such a neural network to their outputs correspond to the degrees of association described above. The configuration is not limited to a neural network and may consist of any decision-making factors constituting artificial intelligence. The other aspects of the artificial intelligence configuration are the same as described with reference to FIG. 4.
In the example of the degrees of association shown in FIG. 9, node 61b is the combination of reference image information P01 and reference breeding environment information P18, with a degree of association w15 to meat quality C and w16 to meat quality E. Node 61c is the combination of reference image information P02 with reference breeding environment information P19 and P21, with a degree of association w17 to meat quality B and w18 to meat quality D.
Such degrees of association constitute what is called trained data in artificial intelligence. After such trained data has been created, it is used when a meat quality search is actually carried out. In that case, the image information of the meat to be evaluated and its breeding environment information are actually acquired. The breeding environment information is newly acquired when the meat quality is actually estimated, and its acquisition method is the same as for the reference breeding environment information described above.
The optimum meat quality is then searched for on the basis of the newly acquired image information and breeding environment information. In that case, the degrees of association shown in FIG. 9 (Table 1), acquired in advance, are referred to. For example, when the newly acquired image information is identical or similar to P02 and the breeding environment information is identical or similar to P21, node 61d is associated with them via the degree of association, and this node 61d is linked to meat quality C with degree w19 and to meat quality D with degree w20. In that case, meat quality C, which has the highest degree of association, is selected as the optimum solution. However, it is not essential to select the candidate with the highest degree of association; meat quality D, whose degree of association is lower but for which an association is nonetheless recognized, may be selected instead. It is of course also possible to select an output solution to which no arrow is connected, and any other order of priority may be used as long as the selection is based on the degree of association.
The breeding environment information may also be replaced by feed information on the feed given to the livestock. In that case, instead of the reference breeding environment information, degrees of association between the reference image information and previously acquired reference feed information on the feed given to livestock are formed in advance for each combination. Then, when feed information on the feed of the livestock providing the meat to be discriminated is newly acquired, the meat quality is discriminated on the basis of that feed information.
In the degrees of association described above, the degree is expressed on a ten-level scale, but this is not limiting: any expression with three or more levels may be used, and conversely 100 or 1,000 levels are also acceptable as long as there are three or more. On the other hand, a two-level expression, that is, one that merely indicates whether or not items are associated with each other by either 1 or 0, is not included.
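As a small illustration of what "three or more levels" means in practice, a continuous association value could be quantized as in the sketch below; the choice of ten levels is only an example.

    def quantize(degree, levels=10):
        """Express a continuous association value (0.0-1.0) in three or more discrete levels."""
        if levels < 3:
            raise ValueError("a two-level (associated / not associated) expression is excluded")
        return min(levels - 1, int(degree * levels))

    print([quantize(d) for d in (0.04, 0.51, 0.97)])   # e.g. [0, 5, 9]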
According to the present invention configured as described above, anyone can easily discriminate and search for meat quality without any special skill or experience. The present invention also makes it possible to judge this search solution with higher accuracy than a human can. Furthermore, by constituting the degrees of association described above with artificial intelligence (such as a neural network) and training it, the discrimination accuracy can be improved further.
Since the input data and output data described above often do not have exactly identical counterparts in the training process, they may be information classified by type. That is, the pieces of information P01, P02, ..., P15, P16, ... constituting the input data may be classified according to criteria defined in advance on the system side or the user side depending on their content, and data sets may then be created and trained between the classified input data and the output data.
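The classification by type mentioned here can be pictured as simple bucketing before the data sets are formed, as in the sketch below; the measurement, the boundaries, and the three types are assumptions for illustration.

    def classify(value, boundaries):
        """Map a raw measurement onto a predefined category index (type)."""
        return sum(value >= b for b in boundaries)

    # e.g. oleic acid content (%) bucketed into three types before forming data sets
    boundaries = [45.0, 55.0]
    samples = [42.1, 48.7, 57.3]
    print([classify(v, boundaries) for v in samples])   # [0, 1, 2]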
In the degrees of association described above, the case where the reference image information is combined with one of the reference ultrasonic image information, reference analysis information, reference production area information, reference biometric information, reference breeding environment information, and reference feed information was described as an example, but the present invention is not limited to this. That is, the degree of association may be formed from the reference image information combined with any two or more of the reference ultrasonic image information, reference analysis information, reference production area information, reference biometric information, reference breeding environment information, and reference feed information. In addition, other factors may be added to a combination of the reference image information and one or more of these items to form the degree of association.
In either case, data matching the reference information of the degree of association are input, and the meat quality is obtained using that degree of association.
As shown in FIG. 10, the present invention discriminates the meat quality on the basis of the degree of association of a combination of two or more kinds of information, reference information U and reference information V. Here, the reference information U is the reference image information, and the reference information V is any of the reference ultrasonic image information, reference analysis information, reference production area information, reference biometric information, reference breeding environment information, and reference feed information.
In this case, as shown in FIG. 10, the output obtained for the reference information U may be used as input data as it is and associated with the output (meat quality) via an intermediate node 61 in combination with the reference information V. For example, after an output solution has been obtained for the reference information U (the reference image information) as shown in FIG. 3, it may be used directly as an input, and the output (meat quality) may be searched for using the degree of association with the other reference information V.
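The cascaded use of the two kinds of reference information could be pictured as two chained lookups, as in the sketch below; both tables are hypothetical association tables of the kind discussed earlier, and the function name is invented for the example.

    # stage1_table: {image pattern: {intermediate label: degree}}
    # stage2_table: {(intermediate label, other pattern): {meat quality: degree}}
    def two_stage_search(image_pattern, other_pattern, stage1_table, stage2_table):
        """Derive an intermediate solution from the image information alone (stage 1),
        then combine it with the second kind of information, reference info V (stage 2)."""
        stage1 = max(stage1_table[image_pattern].items(), key=lambda kv: kv[1])[0]
        degrees = stage2_table[(stage1, other_pattern)]
        return max(degrees.items(), key=lambda kv: kv[1])[0]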
According to the present invention, instead of discriminating the meat quality as the output, livestock breeding conditions for improving the meat quality may be used as the output solution. In that case, the data sets used to train the degrees of association include livestock breeding conditions in place of the meat quality. As the breeding conditions, data such as the feed given to the livestock, the temperature, humidity, wind direction, amount of sunlight, and degree of indoor lighting in the barn, audio data, pest control status, cleaning status, and manure treatment status may be used.
The present invention is also characterized in that the optimum solution is searched for through degrees of association set in three or more levels. Besides the ten-level scale described above, the degree of association may be expressed, for example, as a numerical value from 0 to 100%, but it is not limited to this and may be expressed in any number of levels as long as there are three or more.
By discriminating the most probable meat quality on the basis of degrees of association expressed in three or more levels, it is also possible, when several candidates for the search solution are conceivable, to search for and display them in descending order of the degree of association. If candidates can be presented to the user in descending order of the degree of association in this way, the more probable search solutions can be displayed preferentially.
In addition, according to the present invention, even discrimination results with an extremely low degree of association, such as 1%, can be judged without being overlooked. A discrimination result with an extremely low degree of association is still connected as a slight sign, and the user can be alerted that once in dozens or hundreds of times it may nevertheless prove useful as a discrimination result.
Furthermore, according to the present invention, performing the search on the basis of degrees of association of three or more levels has the advantage that the search policy can be determined by how the threshold is set. If the threshold is lowered, even candidates with a degree of association of 1% can be picked up without omission, but the likelihood of favorably detecting a more appropriate discrimination result is lower and a great deal of noise may be picked up. Conversely, if the threshold is raised, the optimum search solution is likely to be detected with high probability, but a suitable solution that would normally be passed over because of its low degree of association, yet appears once in dozens or hundreds of times, may be overlooked. Which aspect to emphasize can be decided according to the approach of the user or the system, and the degree of freedom in choosing this emphasis is increased.
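The effect of the threshold can be sketched as a simple filter over the ranked candidates; the numerical degrees below are invented for illustration.

    def candidates_above(degrees, threshold):
        """Return (quality, degree) pairs at or above the threshold, highest first."""
        ranked = sorted(degrees.items(), key=lambda kv: kv[1], reverse=True)
        return [(q, w) for q, w in ranked if w >= threshold]

    degrees = {"A": 0.62, "C": 0.30, "D": 0.07, "E": 0.01}
    print(candidates_above(degrees, 0.05))   # low threshold: only the 1% sign is dropped
    print(candidates_above(degrees, 0.25))   # high threshold: only the strongest candidates remain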
Furthermore, in the present invention, the degrees of association described above may be updated. This update may reflect, for example, information provided via a public communication network such as the Internet. In addition, when the various pieces of reference information, starting with the reference image information, are acquired together with findings, information, or data on the corresponding meat quality and improvement measures, the degrees of association are raised or lowered accordingly.
In other words, this update corresponds to learning in the sense of artificial intelligence: new data are acquired and reflected in the trained data, so it can be regarded as a learning action.
This update of the degrees of association may also be performed manually or automatically on the system side or the user side, not only on the basis of information obtainable from a public communication network but also on the basis of research data and papers by experts, conference presentations, newspaper articles, books, and the like. Artificial intelligence may be used in these update processes.
The process of first building the trained model and the update described above may use not only supervised learning but also unsupervised learning, deep learning, reinforcement learning, and the like. In the case of unsupervised learning, instead of reading and learning data sets of input and output data, information corresponding to the input data may be read and learned, and the degrees of association related to the output data may be self-formed from it.
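The update of the degrees of association when new findings arrive can be pictured as a small incremental adjustment, as sketched below; the step size, cap, and table layout are assumptions for the example, not the disclosed learning procedure.

    def update_association(table, key, observed_quality, step=1, scale=10):
        """Raise the degree for a newly observed outcome, leaving the others unchanged."""
        degrees = table.setdefault(key, {})
        degrees[observed_quality] = min(scale, degrees.get(observed_quality, 0) + step)
        return table

    table = {("P01", "P20"): {"A": 7, "B": 2}}
    update_association(table, ("P01", "P20"), "B")
    print(table)   # {('P01', 'P20'): {'A': 7, 'B': 3}}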
1 Meat quality discrimination system
2 Discrimination device
21 Internal bus
23 Display unit
24 Control unit
25 Operation unit
26 Communication unit
27 Discrimination unit
28 Storage unit
61 Node

Claims (12)

1. A meat quality discrimination program for discriminating the meat quality of meat, the program causing a computer to execute:
     an information acquisition step of acquiring image information obtained by imaging meat to be discriminated; and
     a discrimination step of discriminating the meat quality by using degrees of association of three or more levels between reference image information of meat imaged in the past and the meat quality, and, on the basis of the reference image information corresponding to the image information acquired in the information acquisition step, giving priority to the meat quality with the higher degree of association.
2. The meat quality discrimination program according to claim 1, wherein, in the information acquisition step, a spectral image is acquired as the image information, and, in the discrimination step, images captured as spectral images are used as the reference image information.
3. The meat quality discrimination program according to claim 1, wherein the information acquisition step acquires the image information obtained by imaging the meat at slaughter and ultrasonic image information captured in advance of the living body of the livestock providing the meat, and wherein, in the discrimination step, the meat quality is discriminated by using degrees of association of three or more levels between the meat quality and combinations of the reference image information of meat imaged at past slaughter and reference ultrasonic image information captured in advance of the living bodies of the livestock providing that meat, further on the basis of the reference ultrasonic image information corresponding to the ultrasonic image information acquired in the information acquisition step.
4. The meat quality discrimination program according to claim 3, wherein, in the information acquisition step, the time-series change tendency of ultrasonic images captured of the living body a plurality of times in a time series is acquired as the ultrasonic image information, and, in the discrimination step, the time-series change tendency of ultrasonic images captured of the living body a plurality of times in a time series is acquired as the reference ultrasonic image information.
5. The meat quality discrimination program according to claim 1 or 2, wherein the information acquisition step acquires analysis information obtained by analyzing one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E from the meat, and wherein, in the discrimination step, the meat quality is discriminated by using degrees of association of three or more levels between the meat quality and combinations of the reference image information and reference analysis information obtained by analyzing in advance one or more of free amino acids, fatty acid composition, oleic acid, inosinic acid, guanylic acid, and vitamin E corresponding to the analysis information, further on the basis of the reference analysis information corresponding to the analysis information acquired in the information acquisition step.
6. The meat quality discrimination program according to claim 1 or 2, wherein, in the information acquisition step, production area information on the production area of the meat is acquired, and wherein, in the discrimination step, the meat quality is discriminated by using degrees of association of three or more levels between the meat quality and combinations of the reference image information and reference production area information on the production areas of meat imaged in the past, further on the basis of the reference production area information corresponding to the production area information acquired in the information acquisition step.
7. The meat quality discrimination program according to claim 1 or 2, wherein, in the information acquisition step, biometric information acquired from the living body of the livestock providing the meat is acquired, and wherein, in the discrimination step, the meat quality is discriminated by using degrees of association of three or more levels between the meat quality and combinations of the reference image information and reference biometric information acquired in the past from the living bodies of livestock providing meat, further on the basis of the reference biometric information corresponding to the biometric information acquired in the information acquisition step.
8. The meat quality discrimination program according to claim 7, wherein, in the information acquisition step, the time-series change tendency of biological data acquired from the living body a plurality of times in a time series is acquired as the biometric information, and, in the discrimination step, the time-series change tendency of biological data acquired from the living body a plurality of times in a time series is acquired as the reference biometric information.
9. The meat quality discrimination program according to claim 1 or 2, wherein, in the information acquisition step, breeding environment information on the breeding environment of the livestock providing the meat is acquired, and wherein, in the discrimination step, the meat quality is discriminated by using degrees of association of three or more levels between the meat quality and combinations of the reference image information and reference breeding environment information on the breeding environments of livestock providing the meat imaged in the past, further on the basis of the reference breeding environment information corresponding to the breeding environment information acquired in the information acquisition step.
10. The meat quality discrimination program according to claim 1 or 2, wherein, in the information acquisition step, feed information on the feed given to the livestock providing the meat is acquired, and wherein, in the discrimination step, the meat quality is discriminated by using degrees of association of three or more levels between the meat quality and combinations of the reference image information and reference feed information on the feed given to livestock providing the meat imaged in the past, further on the basis of the reference feed information corresponding to the feed information acquired in the information acquisition step.
11. The meat quality discrimination program according to any one of claims 1 to 10, wherein, in the discrimination step, the degrees of association corresponding to the weighting coefficients of the outputs of the nodes of a neural network in artificial intelligence are used.
12. A meat quality discrimination system for discriminating the meat quality of meat, comprising:
     information acquisition means for acquiring image information obtained by imaging meat to be discriminated; and
     discrimination means for discriminating the meat quality by using degrees of association of three or more levels between reference image information of meat imaged in the past and the meat quality, and, on the basis of the reference image information corresponding to the image information acquired by the information acquisition means, giving priority to the meat quality with the higher degree of association.
PCT/JP2020/038582 2019-10-26 2020-10-13 Meat quality distinction program, and system WO2021079785A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-194832 2019-10-26
JP2019194832A JP6732271B1 (en) 2019-10-26 2019-10-26 Meat quality discrimination program and system

Publications (1)

Publication Number Publication Date
WO2021079785A1 true WO2021079785A1 (en) 2021-04-29

Family

ID=71738610

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/038582 WO2021079785A1 (en) 2019-10-26 2020-10-13 Meat quality distinction program, and system

Country Status (2)

Country Link
JP (1) JP6732271B1 (en)
WO (1) WO2021079785A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6932364B1 (en) * 2020-08-17 2021-09-08 Assest株式会社 Purchase price estimation program
KR102586231B1 (en) * 2020-12-18 2023-10-10 대한민국 Automatic quality grade determination system and method using image information of cattle carcass, and Computer readable recording medium storing program performing the method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000097929A (en) * 1998-09-25 2000-04-07 Yoshiyuki Sasaki Method for discriminating quality of edible meat
US20110110563A1 (en) * 2009-07-29 2011-05-12 Heon Hwang Method and system for automatically grading beef quality
JP2014002136A (en) * 2012-05-21 2014-01-09 Obihiro Univ Of Agriculture & Veterinary Medicine Method for classifying meat color
JP2014071018A (en) * 2012-09-28 2014-04-21 Obihiro Univ Of Agriculture & Veterinary Medicine Evaluation method for marbling of meat
US20150317803A1 (en) * 2014-05-02 2015-11-05 Empire Technology Development Llc Meat assessment device
JP6587268B1 (en) * 2019-02-07 2019-10-09 Assest株式会社 Platform risk determination program and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A. PRZYBYLAK ET AL.: "Marbling Classification of Lamb Carcasses with the Artificial Neural Image Analysis", PROCEEDINGS OF SPIE, vol. 9631, 1 July 2015 (2015-07-01), pages 963113-1 - 963113-5, XP060055997 *
FUKUDA OSAMU ET AL.: "Estimation of Marbling Score in Live Beef Cattle Using Bayesian Network", SICE JOURNAL OF CONTROL, MEASUREMENT, AND SYSTEM INTEGRATION, vol. 10, no. 4, 1 July 2017 (2017-07-01), pages 297 - 302, XP055819239 *
FUKUDA, OSAMU ET AL: "Estimation of beef marbling standard number using a neural network", TRANSACTIONS OF THE SOCIETY OF INSTRUMENT AND CONTROL ENGINEERS, vol. 46, no. 7, 2010, pages 408 - 414 *
KUROSAWA, MASAAKI ET AL: "A beef grading system by fuzzy inference and neural networks", THE TRANSACTIONS OF THE INSTITUTE OF ELECTRICAL ENGINEERS OF JAPAN. C, vol. 115, no. 12, 20 November 1995 (1995-11-20), pages 1490 - 1498, XP055819244 *

Also Published As

Publication number Publication date
JP2021067618A (en) 2021-04-30
JP6732271B1 (en) 2020-07-29

Similar Documents

Publication Publication Date Title
WO2021079785A1 (en) Meat quality distinction program, and system
JP6830685B1 (en) Apple quality estimation program and system
JP6858377B1 (en) Fish quality determination program and system
JP2021192215A (en) Meat quality discrimination program and system
JP2021192025A (en) Meat portion discrimination program
JP2021192195A (en) Livestock breeding method proposal program and system
JP2022021261A (en) Program and system for fish quality discrimination
WO2022009893A1 (en) Fruit quality estimation program and system
WO2022138839A1 (en) Animal intention determination program
JP2021192192A (en) Meat quality determination program and system
JP2021192194A (en) Meat quality discrimination program and system
JP2021192193A (en) Meat quality discrimination program and system
WO2022009892A1 (en) Program for proposing unit selling price of meat
JP2021114993A (en) Fry feeding amount proposing program
JP6801902B1 (en) Child Abuse Sign Identification Program and System
JP2021192196A (en) Meat quality discrimination program
JP2022101291A (en) Domestic animal estrus determination program
JP2021192197A (en) Meat quality determination program
JP2021192198A (en) Meat specification system
JP2021192019A (en) Meat quality inspection device
JP2022021264A (en) Fish quality inspector
JP2022101296A (en) Animal intention determination program
JP2022021267A (en) Quality determination program of cultured fish
JP2021173647A (en) Fish quality discrimination program and system
JP2021074030A (en) Decayed tooth risk determination program and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20878751

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20878751

Country of ref document: EP

Kind code of ref document: A1