CN116912665A - Image processing apparatus, system and method, data management apparatus and storage medium


Info

Publication number
CN116912665A
Authority
CN
China
Prior art keywords
image
classification
image processing
images
unit
Prior art date
Legal status
Pending
Application number
CN202310366586.2A
Other languages
Chinese (zh)
Inventor
藤本浩司
Current Assignee
Arkray Inc
Original Assignee
Arkray Inc
Priority date
Filing date
Publication date
Priority claimed from JP2023060223A (published as JP2023156996A)
Application filed by Arkray Inc filed Critical Arkray Inc
Publication of CN116912665A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/94 - Hardware or software architectures specially adapted for image or video understanding
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 - Systems specially adapted for particular applications
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 33/00 - Investigating or analysing materials by specific methods not covered by groups G01N 1/00 - G01N 31/00
    • G01N 33/48 - Biological material, e.g. blood, urine; Haemocytometers
    • G01N 33/483 - Physical analysis of biological material
    • G01N 33/487 - Physical analysis of biological material of liquid biological material
    • G01N 33/493 - Physical analysis of biological material of liquid biological material: urine
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/10 - Image acquisition
    • G06V 10/12 - Details of acquisition arrangements; Constructional details thereof
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

Abstract

The invention relates to an image processing apparatus, system and method, a data management apparatus, and a storage medium. The image processing apparatus includes: an acquisition unit that acquires a plurality of formed component images; a first classification unit that classifies the plurality of formed component images into detection components for each predetermined classification and calculates a matching degree for each classification result; and a transmission unit that, based on the calculated matching degree, transmits formed component images specified from among the plurality of formed component images to a data management apparatus via a network line.

Description

Image processing apparatus, system and method, data management apparatus and storage medium
Technical Field
The present disclosure relates to an image processing apparatus, a data management apparatus, an image processing system, an image processing method, and a non-transitory storage medium.
Background
For example, Japanese Patent Application Laid-Open No. 2020-085535 describes an information processing apparatus that suppresses erroneous operations during reclassification work when formed component images, captured by flow-based imaging and classified into predetermined classifications, are reclassified. The information processing apparatus includes: a classification unit that cuts out formed component images identified as formed components from a plurality of images obtained by capturing a sample fluid that contains a plurality of formed components and flows through a flow cell, and classifies the cut-out images into detection components for each predetermined classification; and a control unit that, when formed component images classified as a detection component by the classification unit are to be reclassified into a different classification, accepts selection of a destination classification while a first image list of the formed component images belonging to the detection component is displayed on a display unit, and then, while the first image list and a second image list of the formed component images belonging to the destination classification are displayed simultaneously, controls movement of formed component images selected from the first image list to the second image list.
Furthermore, JP-A-2000-258335 discloses a urine sediment analyzer that shortens the time needed to judge the various sediment components in urine and improves analysis efficiency. The urine sediment analyzer includes: an optical system that captures images of cells in a sample; an image processor that extracts cell features from the captured images; an analysis unit including a recognition determination unit that automatically classifies the cells morphologically based on the extracted features; and data operation units for displaying the automatic classification results and the captured images and for reclassifying them. The analyzer includes a plurality of main data operation units that can read and re-edit the analysis results and hold the authority to finalize reclassification results, and sub data operation units without that authority; the classification result is finalized by a main data operation unit based on the result reclassified by a main or sub data operation unit.
Further, Japanese Patent No. 6277702 describes an image processing method executed by an image processing apparatus capable of communicating with other apparatuses in a clinical examination system. The image processing method includes: a step of capturing two or more images of a specimen; a step of selecting, from the captured images, images to be transmitted to another apparatus; and a step of transmitting the selected images to the other apparatus. The captured images are associated with information specifying formed components linked to a patient, and the selecting step selects, as images to be transmitted, at least images containing formed components other than those already associated with the patient.
An image processing apparatus that classifies formed component images into detection components for each predetermined classification may be equipped with a learned model trained by machine learning in advance. However, since it is not easy to keep the learned model mounted on the image processing apparatus up to date at all times, it is difficult to classify all the various formed component images appropriately using only that model. The techniques described in JP 2020-085535 A, JP 2000-258335 A, and Japanese Patent No. 6277702 do not consider formed component images that are difficult to classify.
Disclosure of Invention
The present disclosure has been made in view of the above points, and an object thereof is to provide an image processing apparatus, a data management apparatus, an image processing system, an image processing method, and a non-transitory storage medium capable of transmitting formed component images that are difficult for the image processing apparatus to classify appropriately into predetermined classifications to a data management apparatus that performs processing related to the classification of formed component images.
An image processing apparatus according to one aspect of the present disclosure includes: an acquisition unit that acquires a plurality of formed component images; a first classification unit that classifies the plurality of formed component images into detection components for each predetermined classification and calculates a matching degree for each classification result; and a transmission unit that transmits formed component images specified from among the plurality of formed component images to a data management apparatus via a network line, based on the matching degree calculated by the first classification unit.
A data management apparatus according to one aspect of the present disclosure is connected to an image processing apparatus via a network line and includes: a second classification unit that classifies the specified formed component images received from the image processing apparatus as detection components; and a return unit that returns the classification result of the second classification unit to the image processing apparatus.
An image processing system according to one aspect of the present disclosure includes an image processing apparatus and a data management apparatus connected to the image processing apparatus via a network line. The image processing apparatus includes: an acquisition unit that acquires a plurality of formed component images; a first classification unit that classifies the plurality of formed component images into detection components for each predetermined classification and calculates a matching degree for each classification result; and a transmission unit that transmits formed component images specified from among the plurality of formed component images to the data management apparatus based on the matching degree calculated by the first classification unit. The data management apparatus includes: a second classification unit that classifies the specified formed component images received from the image processing apparatus as detection components; and a return unit that returns the classification result of the second classification unit to the image processing apparatus.
A non-transitory storage medium according to one aspect of the present disclosure stores an image processing program that causes a computer to execute processing of: acquiring a plurality of formed component images; classifying the plurality of formed component images into detection components for each predetermined classification; calculating a matching degree for each classification result; and transmitting formed component images specified from among the plurality of formed component images to a data management apparatus via a network line, based on the calculated matching degree.
As described above, according to the present disclosure, formed component images that are difficult to classify in the image processing apparatus can be transmitted to a data management apparatus that performs processing related to the classification of formed component images.
Drawings
Fig. 1 is a perspective view showing a part of the structure of a measurement system according to a first embodiment.
Fig. 2 is a schematic diagram showing an example of the structure of the measurement system according to the first embodiment.
Fig. 3 is a block diagram showing one example of an electrical structure of the image processing system according to the first embodiment.
Fig. 4 is a block diagram showing one example of the functional configuration of the image processing apparatus according to the first embodiment.
Fig. 5 is a front view showing an example of a measurement result screen according to the embodiment.
Fig. 6 is a front view showing an example of a formed component image list screen according to the embodiment.
Fig. 7 is a diagram for explaining the process of determining the formed component images transmitted to the data management apparatus according to the embodiment.
Fig. 8 is a block diagram showing an example of a functional configuration of the data management apparatus according to the first embodiment.
Fig. 9 is a diagram for explaining the classification processing of formed component images by the data management apparatus according to the embodiment.
Fig. 10 is a flowchart showing an example of a flow of processing performed by the image processing program according to the first embodiment.
Fig. 11 is a flowchart showing an example of a flow of processing performed by the data management program according to the first embodiment.
Fig. 12 is a block diagram showing one example of a functional structure of an image processing apparatus according to the second embodiment.
Fig. 13 is a flowchart showing an example of a flow of processing performed by the image processing program according to the second embodiment.
An example of a mode for implementing the technique of the present disclosure is described in detail below with reference to the accompanying drawings. Components and processes having the same action or function are given the same reference numerals throughout the drawings, and overlapping descriptions are omitted as appropriate. The drawings are only schematic, drawn to the extent needed to understand the technique of the present disclosure; the technique of the present disclosure is therefore not limited to the illustrated examples. In the present embodiment, descriptions of structures not directly related to the present disclosure and of well-known structures may be omitted.
First embodiment
Fig. 1 is a perspective view showing a part of the structure of a measurement system 70 according to the first embodiment.
As shown in fig. 1, the measurement system 70 according to the present embodiment includes a flow cell 40, a housing 72, a camera 74, and a light source 76. The arrow UP shown in fig. 1 indicates the upper side in the up-down direction of the measurement system 70.
The flow cell 40 according to the present embodiment is used, for example, as follows: a urine sample, as an example of the sample fluid, is introduced together with a sheath fluid; the formed components in the urine sample are captured by the camera 74; and the formed components in the urine are analyzed based on their shapes in the captured images (a urine sediment test). The camera 74 is an example of an imaging unit. A urine sample contains various formed components; examples of their types include erythrocytes, leukocytes, epithelial cells, casts, and bacteria. In the present embodiment, the urine sediment test using a urine sample is described as an example of the sample fluid, but the technique can also be used for examining formed components in other samples such as blood, cells, or other body fluids.
The measurement system 70 includes a housing 72 in which the flow cell 40 is disposed. The housing 72 is formed with a recess 72A into which the flow cell 40 is inserted, and the portion of the housing 72 that includes the recess 72A is formed of a transparent member (for example, glass). A camera 74 is provided in the housing 72 at a position facing the flow cell 40. A light source 76 is provided above the housing 72 at a position facing the camera 74 with the flow cell 40 interposed between them. The camera 74 is disposed at a position where it can capture the sample fluid flowing through the flow cell 40.
The measurement system 70 includes a first supply device 78 for supplying the sample fluid to the sample inlet 42 of a sample flow path (not shown) in the flow cell 40. The first supply device 78 includes: a supply pipe 80 having one end connected to the sample inlet 42; a pump 82 provided partway along the supply pipe 80; and a sample reservoir 84 connected to the other end of the supply pipe 80 and storing the sample fluid.
The measurement system 70 includes a second supply device 86 for supplying the sheath fluid to the sheath inlet 44 of a sheath flow path (not shown) in the flow cell 40. The second supply device 86 includes: a supply pipe 88 having one end connected to the sheath inlet 44; a pump 90 provided partway along the supply pipe 88; and a tank 92 connected to the other end of the supply pipe 88 and storing the sheath fluid.
In the flow cell 40, a discharge port 46 is provided between the sample inlet 42 and the sheath inlet 44. One end of a discharge pipe (not shown) is connected to the discharge port 46, and the other end of the discharge pipe is connected to a waste tank (not shown). The flow cell 40 includes a junction (not shown) where the sample introduced from the sample inlet 42 and the sheath fluid introduced from the sheath inlet 44 merge, and the merged fluid flows through the flow path. The camera 74 captures the formed components in this flow of the fluid to be measured.
Fig. 2 is a schematic diagram showing an example of the structure of the measurement system 70 according to the first embodiment.
As shown in fig. 2, the measurement system 70 according to the present embodiment includes an image processing apparatus 10. As in fig. 1, the arrow UP shown in fig. 2 indicates the upper side in the up-down direction of the measurement system 70.
The image processing apparatus 10 functions as a control device that controls the operations of the camera 74, a light source operation unit 77 electrically connected to the light source 76, the pump 82, and the pump 90. The image processing apparatus 10 supplies a pulse signal to the light source operation unit 77, causing the light source 76 to emit light at predetermined intervals. The image processing apparatus 10 also drives the pump 82 to control the flow rate of the fluid to be measured, and drives the pump 90 to control the flow rate of the sheath fluid. Although not shown, a plurality of cameras 74 and optical systems that guide light to each camera 74 may be provided. The optical systems are adjusted so that each camera 74 is focused at a different position (depth) in the flow cell 40; in other words, the plurality of cameras 74 simultaneously capture, at the same position in the horizontal plane, a plurality of images focused at different depths. The images captured at the same time are stored, in association with one another, in a storage unit 15 shown in fig. 3 and described later. The depth direction here is the direction perpendicular to the flow direction of the fluid to be measured, that is, the up-down direction in fig. 2. The distance from each focal position to the wall surface of the flow cell 40 on the side nearer the camera 74 differs for each camera.
Fig. 3 is a block diagram showing one example of the electrical configuration of the image processing system 100 according to the first embodiment.
As shown in fig. 3, the image processing system 100 according to the present embodiment includes an image processing apparatus 10 and a data management apparatus 20. The image processing apparatus 10 is connected to the data management apparatus 20 via a network line N. The image processing apparatus 10 includes a CPU (Central Processing Unit) 11, a ROM (Read Only Memory) 12, a RAM (Random Access Memory) 13, an input/output interface (I/O) 14, a storage unit 15, a display unit 16, an operation unit 17, a communication unit 18, and a connection unit 19. The CPU11 may instead be another processor such as a GPU (Graphics Processing Unit).
The image processing apparatus 10 according to the present embodiment is, for example, a general-purpose computer such as a personal computer (PC). The image processing apparatus 10 may also be a portable computer such as a smartphone or a tablet terminal. The image processing apparatus 10 may be divided into a plurality of units; for example, it may comprise a unit that controls the measurement system (the camera 74, light source 76, pump 82, pump 90, and so on) and a unit that processes and analyzes the images captured by the camera 74. Further, the image processing apparatus 10 may be connected externally to the measurement system 70.
The control device is constituted by a CPU11, a ROM12, a RAM13, and an I/O14. The control device has, for example, a function of controlling a measurement system such as the camera 74, the light source 76, the pump 82, and the pump 90, and a function of processing and analyzing an image captured by the camera 74. The CPU11, ROM12, RAM13, and I/O14 are connected via buses.
The I/O14 is connected to each functional unit including a storage unit 15, a display unit 16, an operation unit 17, a communication unit 18, and a connection unit 19. These functional units can communicate with the CPU11 via the I/O14.
The control device may be configured as a sub-control unit that controls part of the operation of the image processing apparatus 10, or as part of a main control unit that controls the overall operation of the image processing apparatus 10. Some or all of the blocks of the control device are implemented as integrated circuits such as an LSI (Large Scale Integration) or an IC (Integrated Circuit) chip set. Each block may be a single circuit, or some or all of the blocks may be integrated into one circuit. The blocks may be provided together, or some blocks may be provided separately; a part of any block may also be provided separately. The integration of the control device is not limited to LSI; a dedicated circuit or a general-purpose processor may also be used.
As the storage unit 15, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), a flash memory, or the like is used. The storage unit 15 stores an image processing program 15A for performing the image classification processing according to the present embodiment. The image processing program 15A may instead be stored in the ROM12. The storage unit 15 may be provided externally, or additional memory may be provided.

The image processing program 15A may, for example, be installed in the image processing apparatus 10 in advance. Alternatively, the image processing program 15A may be stored in a nonvolatile non-transitory storage medium or distributed via the network line N, and installed in or updated on the image processing apparatus 10 as appropriate. Examples of the nonvolatile non-transitory storage medium include a CD-ROM (Compact Disc Read Only Memory), an optical disk, an HDD, a DVD-ROM (Digital Versatile Disc Read Only Memory), a flash memory, and a memory card.
The display unit 16 is, for example, a liquid crystal display (LCD) or an organic EL (Electro Luminescence) display, and may integrally include a touch panel. The operation unit 17 includes devices for operation input, such as a keyboard and a mouse. The display unit 16 and the operation unit 17 receive various instructions from the user of the image processing apparatus 10. The display unit 16 displays various information, such as the results of processing executed in response to instructions received from the user and notifications about the processing.

The communication unit 18 is connected to a network line N such as the internet, a LAN (Local Area Network), or a WAN (Wide Area Network), and can communicate with the data management apparatus 20 via the network line N.
The camera 74, the light source 76, the pump 82, the pump 90, and the like are connected to the connection unit 19. These measurement system components are controlled by the control device described above. The connection unit 19 also functions as an input port that receives the images output from the camera 74.
On the other hand, the data management device 20 according to the present embodiment includes a CPU21, a ROM22, a RAM23, an input/output interface (I/O) 24, a storage unit 25, a display unit 26, an operation unit 27, and a communication unit 28. The CPU21 may be a processor such as a GPU, for example.
The data management apparatus 20 according to the present embodiment is a general-purpose computer such as an application server. Since the data management apparatus 20 performs more data processing than the image processing apparatus 10, the memory of the data management apparatus 20 preferably has a faster access speed than that of the image processing apparatus 10, and its CPU preferably has a faster processing speed than that of the image processing apparatus 10.
The CPU21, ROM22, RAM23, and I/O24 constitute a control unit. The CPU21, ROM22, RAM23, and I/O24 are connected via buses.
The I/O24 is connected to each functional unit including a storage unit 25, a display unit 26, an operation unit 27, and a communication unit 28. These functional units can communicate with the CPU21 via the I/O24.
As the storage unit 25, for example, an HDD, an SSD, a flash memory, or the like is used. The storage unit 25 stores a data management program 25A for performing the image classification processing according to the present embodiment. The data management program 25A may instead be stored in the ROM22. The storage unit 25 may be provided externally, or additional memory may be provided.

For example, the data management program 25A may be installed in the data management apparatus 20 in advance. Alternatively, the data management program 25A may be stored in a nonvolatile non-transitory storage medium or distributed via the network line N, and installed in or updated on the data management apparatus 20 as appropriate. Examples of the nonvolatile non-transitory storage medium include a CD-ROM, a magneto-optical disk, an HDD, a DVD-ROM, a flash memory, and a memory card.

The display unit 26 is, for example, a liquid crystal display (LCD) or an organic EL display, and may integrally include a touch panel. The operation unit 27 includes devices for operation input, such as a keyboard and a mouse. The display unit 26 and the operation unit 27 receive various instructions from the user of the data management apparatus 20. The display unit 26 displays various information, such as the results of processing executed in response to instructions received from the user and notifications about the processing.

The communication unit 28 is connected to a network line N such as the internet, a LAN, or a WAN, and can communicate with the image processing apparatus 10 via the network line N.
Next, a functional configuration of the image processing apparatus 10 according to the first embodiment will be specifically described with reference to fig. 4.
The CPU11 of the image processing apparatus 10 according to the present embodiment functions as each unit shown in fig. 4 by loading the image processing program 15A stored in the storage unit 15 into the RAM13 and executing it.
Fig. 4 is a block diagram showing one example of the functional structure of the image processing apparatus 10 according to the first embodiment.
As shown in fig. 4, the CPU11 of the image processing apparatus 10 according to the present embodiment functions as an acquisition unit 11A, a first classification unit 11B, a display control unit 11C, a transmission unit 11D, and a reception unit 11E.
The storage unit 15 stores a first learned model 15C for image classification processing. The first learned model 15C is a model used in the image classification processing of the first classification section 11B.
The acquisition unit 11A cuts out, as formed component images, the formed components of a plurality of types contained in the fluid to be measured from a plurality of images (for example, 300 to 1,000 images) obtained by capturing, with the camera 74, the fluid to be measured flowing through the flow cell 40, and acquires the plurality of cut-out formed component images.
The first classification unit 11B classifies the plurality of formed component images acquired by the acquisition unit 11A into detection components for each predetermined classification (for example, by the type, size, or shape of the formed component, or the presence or absence of a nucleus). The first classification unit 11B temporarily stores the groups of formed component images classified by predetermined classification in the storage unit 15 for each specimen. As the method of classifying formed component images, various known techniques are applied, such as methods using machine learning or pattern matching. The groups of formed component images according to the present embodiment are classified using, for example, the first learned model 15C. The first learned model 15C is a model generated by machine learning on learning data in which formed component images obtained in the past are each labeled with the detection component of a predetermined classification; that is, the learning data is teacher (supervised) data. The first learned model 15C takes a formed component image as input and outputs a detection component for each predetermined classification. As the learning model for machine learning, for example, a CNN (Convolutional Neural Network) is used, and the machine learning method is, for example, deep learning. In the following description, a formed component image group may also be referred to simply as formed component images.
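As a concrete illustration of this step, the following is a minimal Python/PyTorch sketch of classifying cut-out formed component images with a small CNN. The architecture, input size, and class list are illustrative assumptions and are not specified by the present disclosure.

```python
import torch
import torch.nn as nn

# Assumed subset of the detection components described later (fig. 5).
CLASSES = ["RBC", "WBC", "NSE", "SQEC", "NHC", "BACT", "UNCL"]

class FormedComponentCNN(nn.Module):
    """Small stand-in for the first learned model 15C (hypothetical)."""
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: a batch of 64x64 RGB crops of formed components
        return self.classifier(self.features(x).flatten(1))

model = FormedComponentCNN().eval()   # assumed to be trained in advance
crops = torch.rand(8, 3, 64, 64)      # 8 cut-out formed component images
with torch.no_grad():
    logits = model(crops)
predicted = [CLASSES[i] for i in logits.argmax(dim=1).tolist()]
```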
Further, the first classification unit 11B calculates a matching degree according to the image classification method used to classify the formed component images (as one example, machine learning or pattern matching). The matching degree here represents the classification accuracy of an image with respect to its classification result: the higher the degree to which the image matches the correct image or the predetermined feature points of its classification, the higher the value. When the image completely matches the correct image or feature points, the matching degree is 100%. In other words, a formed component image with a relatively low matching degree is highly likely not to have been classified appropriately. The matching degree may also be expressed as a matching rate.
The value of the matching degree varies with how the formed component was imaged. Specifically, for an image in which the formed component is in focus, classification by machine learning or the like is easy, so the matching degree for the correct classification is high and the matching degree for incorrect classifications is low. For an image in which the formed component is out of focus, that is, blurred, the matching degree for the correct classification falls, and the difference between the matching degrees for the correct and incorrect classifications also shrinks. The matching degree may also become low when a plurality of formed components partially overlap in an image. In addition, a rare component in the measured object that the learned model has not learned, and that should properly be judged unclassifiable, is sometimes assigned to a learned item; in such a case, too, the matching degree is low.
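Under the assumption that the classifier is a CNN as sketched above, one plausible realization of the matching degree is the softmax probability of the predicted class; the present disclosure does not fix a particular formula, so this is only an interpretation.

```python
import torch.nn.functional as F

# Continues the sketch above: logits has one row per formed component image.
probs = F.softmax(logits, dim=1)
degrees, indices = probs.max(dim=1)   # matching degree and predicted class
for degree, idx in zip(degrees.tolist(), indices.tolist()):
    print(f"{CLASSES[idx]}: matching degree {degree:.0%}")
```

A sharply focused component tends to concentrate the probability mass on one class (high matching degree), while a blurred or overlapping component spreads it out, matching the behavior described above.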
The display control unit 11C performs control to display the formed component images classified as detection components by the first classification unit 11B on the display unit 16 as individual formed component images. A user such as a laboratory technician can thereby visually check the individual formed component images.
Here, a specific example of a screen in the image classification processing according to the present embodiment will be described with reference to fig. 5 and 6.
Fig. 5 is a front view showing an example of the measurement result screen 50 according to the present embodiment. The measurement result screen 50 is displayed on the display unit 16 by the display control unit 11C.
As shown in fig. 5, the measurement result screen 50 according to the present embodiment displays a list of measurement results for each item. The measurement results display the names of the formed components together with a qualitative indication of their content, their count, or the like as an index of content. The measurement result screen 50 is also provided with formed component names 50A and formed component buttons 50B.
In fig. 5, for the items shown as formed components, RBC denotes red blood cells, WBC white blood cells, NSE non-squamous epithelial cells, SQEC squamous epithelial cells, NHC non-hyaline casts, and BACT bacteria. In addition, CRYS denotes crystals, YST yeasts, HYST hyaline casts, MUCS mucus strands, SPRM sperm, and WBCC white blood cell clumps. UNCL covers formed components other than red blood cells, white blood cells, non-squamous epithelial cells, casts, bacteria, crystals, yeasts, hyaline casts, mucus strands, sperm, and white blood cell clumps, as well as components not originating from the urine sample, such as hair and dust (hereinafter also referred to as "unclassified"). That is, the detection components of each predetermined classification classified by the first classification unit 11B correspond to the items representing these formed components plus the item defined as representing unclassified components.
Here, the display control unit 11C performs control to give a mark indicating that the formed component images of a classification item have been confirmed once the image list of that item (see, for example, fig. 6 described later) has been displayed from the measurement result screen 50. In the measurement result screen 50 shown in fig. 5, a mark 50C indicating that confirmation has been performed is given to those classification items whose image lists have been confirmed. When reclassification (a classification change) has been performed, measurement results reflecting that change are displayed.

In the measurement result screen 50 shown in fig. 5, when a formed component name 50A or a formed component button 50B is selected by a click operation or the like, the individual formed component images of the selected classification item are displayed on the display unit 16 as a formed component image list screen 51, an example of which is shown in fig. 6.
Fig. 6 is a front view showing an example of the formed partial image list screen 51 according to the present embodiment.
As shown in fig. 6, the formed component image list screen 51 according to the present embodiment displays a formed component image list 51A of the formed component selected for image display. The formed component image list screen 51 is displayed on the display unit 16.

The formed component image list screen 51 shown in fig. 6 includes a first item button group 52 for the detection components and a second item button group 53 for the movement destinations. The first item button group 52 has a button for each type of detection component; similarly, the second item button group 53 has a button for each classification item of movement destination. As one example, as described above, "RBC" means red blood cells, "WBC" white blood cells, "SQEC" squamous epithelial cells, and "UNCL" unclassified. In the example shown in fig. 6, "SQEC" is selected in the first item button group 52, so a list of squamous epithelial cell images is displayed on the formed component image list screen 51.

Further, a plurality of operation buttons B1 to B6 are displayed in an operable manner on the formed component image list screen 51 shown in fig. 6. The operation button B1 enlarges the displayed formed component images and shows a scale (length) matching the enlargement; in the example shown in fig. 6, the scale is 10 μm/pixel. The operation button B2 switches the display among formed component images with different focal positions; in the example shown in fig. 6, the focal position in the depth direction of the flow cell 40 can be switched among three layers: upper, middle, and lower. The operation button B3 moves a formed component image of the detection component to the movement destination. The operation button B4 undoes the previous image classification editing operation. The operation button B5 displays the formed component images serving as the "samples" described later. The operation button B6 displays a window for setting the magnification of the formed component images and their brightness, contrast, and the like.

By operating the operation buttons B1 to B6 shown in fig. 6, the display mode of the formed component images can be changed, which makes the formed components easy to distinguish. The operation buttons may be displayed on the display unit 16 as a touch panel, or may be provided as the operation unit 17.

On the formed component image list screen 51 shown in fig. 6, when any item button in the second item button group 53 of movement destinations is selected by a click operation or the like, a reclassification work screen (not shown) is displayed. On the reclassification work screen, a first image list of the detection component and a second image list of the movement destination are displayed on one screen.

In addition, in the formed component image list 51A shown in fig. 6, the matching degree calculated by the first classification unit 11B may be displayed together with each formed component image.
Returning to fig. 4, the transmission unit 11D controls the communication unit 18 to transmit specified formed component images, identified from among the plurality of formed component images based on the matching degree calculated by the first classification unit 11B, to the data management apparatus 20 via the network line N. In other words, the specified formed component images are formed component images determined based on the matching degree calculated by the first classification unit 11B. The specified formed component images include, for example, unclassified formed component images that the first classification unit 11B could not classify as a detection component of any predetermined classification (in other words, images classified by the first classification unit 11B as belonging to none of the predetermined classifications).

Alternatively, a plurality of formed component images whose matching degree falls within a predetermined range may be transmitted to the data management apparatus 20: for example, images whose matching degree is below an upper limit, or below an upper limit and at or above a lower limit. As an example, formed component images with a matching degree of less than 70% may be transmitted, or images with a matching degree of less than 70% and 30% or more. Setting an upper limit on the matching degree restricts transmission to images that the first classification unit 11B could not determine with confidence, while setting a lower limit excludes blurred images. For the "unclassified" item in fig. 5, only formed component images with a high matching degree (for example, 80% or more) may be transmitted to the data management apparatus 20; in this case, rare components not covered by the items shown in fig. 5 can be sent to the data management apparatus 20. Further, an upper or lower limit may be set on the number of images transmitted: for example, of the images with a matching degree of less than 70%, the 100 images with the highest matching degrees may be transmitted. Such limits on the number of transmitted images may also be set for each item shown in fig. 5. Setting an upper limit, in particular, keeps the amount of transmitted data down.
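The selection rules just described can be sketched as follows; the tuple layout, the 70%/30% thresholds, and the 100-image cap follow the examples in the text, while the function itself is hypothetical.

```python
from collections import defaultdict

def select_for_upload(classified, upper=0.70, lower=0.30, per_item_limit=100):
    """classified: iterable of (item, matching_degree, image) tuples."""
    per_item = defaultdict(list)
    for item, degree, image in classified:
        if item == "UNCL":
            # unclassified images: send only confident ones (e.g. 80% or more),
            # which are likely rare components outside the predefined items
            if degree >= 0.80:
                per_item[item].append((degree, image))
        elif lower <= degree < upper:
            # ambiguous classification, but not so low that it is just blur
            per_item[item].append((degree, image))
    selected = []
    for item, entries in per_item.items():
        entries.sort(key=lambda e: e[0], reverse=True)  # highest degree first
        selected += [(item, d, img) for d, img in entries[:per_item_limit]]
    return selected
```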
Fig. 7 is a diagram for explaining the process of determining the formed component images transmitted to the data management apparatus 20 according to the present embodiment.
As shown in fig. 7, the plurality of formed component images are classified into detection components for each predetermined classification using the first learned model 15C, and a matching degree is calculated for each formed component image. The first learned model 15C has machine-learned both the formed component images that can be classified into the detection components of the predetermined classifications and, under the "unclassified" classification, the formed component images that cannot (in other words, images capturing formed components that belong to none of the predetermined classifications). In the example of fig. 5 described above, classifiable formed component images are assigned to items such as "RBC", "WBC", and "SQEC", while unclassifiable ones are assigned to the "UNCL" item. For a formed component image classified as "UNCL", the higher the matching degree, the higher the probability that the imaged formed component belongs to none of the predetermined classifications. The specified formed component images to be transmitted to the data management apparatus 20 may be determined manually by the user or automatically by the image processing apparatus 10.
When the user determines them manually, the user, for example, inspects the matching degree of each formed component image and selects those with relatively low matching degrees. All "unclassified" formed component images may also be selected. The user may also select formed component images whose classification results are of particular interest.

When the image processing apparatus 10 determines them automatically, for example, formed component images whose matching degree is at or below a threshold value are selected. All "unclassified" formed component images may also be selected. The threshold value here may be set to an appropriate value by the user, and a different threshold value may be set for each predetermined classification.
The formed component images determined as described above are collected into one folder and transmitted to the data management apparatus 20 in response to a user operation. The data management apparatus 20 performs the image classification processing described later and returns the classification result to the image processing apparatus 10.
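The transmission step could look like the following sketch, assuming, purely for illustration, that the data management apparatus 20 exposes an HTTP endpoint over the network line N; the URL and response layout are hypothetical.

```python
import requests

def send_to_data_management(image_paths,
                            url="https://dm.example.com/api/classify"):
    """Upload the determined formed component images and return the reply."""
    files = [("images", open(path, "rb")) for path in image_paths]
    try:
        response = requests.post(url, files=files, timeout=30)
        response.raise_for_status()
        return response.json()  # classification results from the return unit
    finally:
        for _, handle in files:
            handle.close()
```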
The reception unit 11E controls the communication unit 18 to receive, from the data management apparatus 20, the classification result obtained by classifying the specified formed component images as detection components.

Formed component images whose matching degree exceeds the threshold value may or may not be transmitted to the data management apparatus 20. By not transmitting them, the amount of data sent to the data management apparatus 20 can be reduced.
Next, the functional configuration of the data management device 20 according to the first embodiment will be specifically described with reference to fig. 8.
The CPU21 of the data management apparatus 20 according to the present embodiment functions as each unit shown in fig. 8 by loading the data management program 25A stored in the storage unit 25 into the RAM23 and executing it.
Fig. 8 is a block diagram showing an example of the functional configuration of the data management apparatus 20 according to the first embodiment.
As shown in fig. 8, the CPU21 of the data management device 20 according to the present embodiment functions as an acquisition unit 21A, a second classification unit 21B, a display control unit 21C, a return unit 21D, and a reception unit 21E.
The storage unit 25 stores a second learned model 25C for the image classification processing. The second learned model 25C is the model used in the image classification processing of the second classification unit 21B.

The reception unit 21E controls the communication unit 28 to receive the specified formed component images from the image processing apparatus 10. The specified formed component images received from the image processing apparatus 10 are temporarily stored in the storage unit 25 as a classification target image group.

The acquisition unit 21A acquires the formed component images to be classified from the classification target image group stored in the storage unit 25.
The second classification unit 21B classifies the formed component images acquired by the acquisition unit 21A into detection components for each predetermined classification (for example, by the type, size, or shape of the formed component, or the presence or absence of a nucleus). The formed component images classified by the second classification unit 21B for each predetermined classification are passed to the return unit 21D. As the method of classifying formed component images, a method using machine learning is applied as one example; in that case, the formed component images are classified using, for example, the second learned model 25C. The second learned model 25C is, for example, a model generated by the same machine learning algorithm as the first learned model 15C, but trained on other learning data covering more detection components than the learning data of the first learned model 15C. The amount of learning data learned by the second learned model 25C is larger than that learned by the first learned model 15C. That is, the second learned model 25C is trained so that its classification performance is higher than that of the first learned model 15C.
The second learned model 25C may instead be a model generated by training on the learning data of the first learned model 15C with another algorithm whose classification performance is higher than that of the machine learning algorithm of the first learned model 15C. Besides the CNN described above, various methods can be used as the machine learning algorithm, such as linear regression, regularization, decision trees, random forests, k-nearest neighbors (k-NN), logistic regression, and support vector machines (SVM). If, for example, an SVM yields higher classification performance than a CNN, the CNN is adopted for the first learned model 15C and the SVM for the second learned model 25C; conversely, if a CNN yields higher classification performance than an SVM, the SVM is adopted for the first learned model 15C and the CNN for the second learned model 25C. When comparing the classification performance of learned models, index values indicating model performance (for example, accuracy or precision) may be calculated using test data prepared in advance and then compared.
The second learned model 25C may also be a model generated by training, with another algorithm whose classification performance is higher than that of the machine learning algorithm of the first learned model 15C, on other learning data covering more detection components than the learning data of the first learned model 15C.
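The performance comparison mentioned above can be done, for example, by evaluating candidate models on held-out test data; the following sketch computes accuracy, with synthetic tensors standing in for the "test data prepared in advance".

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def accuracy(model, loader):
    """Fraction of test images whose predicted class matches the label."""
    correct = total = 0
    model.eval()
    with torch.no_grad():
        for x, y in loader:
            pred = model(x).argmax(dim=1)
            correct += (pred == y).sum().item()
            total += y.numel()
    return correct / total

# Synthetic stand-in for prepared test data: 64 labeled 64x64 crops.
test_loader = DataLoader(
    TensorDataset(torch.rand(64, 3, 64, 64), torch.randint(0, 7, (64,))),
    batch_size=16)
# e.g. compare accuracy(cnn_candidate, test_loader) with
# accuracy(svm_candidate, test_loader) and adopt the better model as 25C.
```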
When versions of the second learned model 25C are managed, the second learned model 25C is preferably kept at the latest version at all times.
Here, the second classification unit 21B may instead classify the formed component images according to a classification operation performed by a user; that is, the second classification unit 21B performs processing related to the classification processing. The user is preferably a laboratory technician experienced in classifying formed component images.

Specifically, the display control unit 21C performs control to display the specified formed component images to be classified on the display unit 26. In this case, the second classification unit 21B classifies the specified formed component images according to the user's classification operations on the specified formed component images displayed on the display unit 26.
Fig. 9 is a diagram for explaining the classification processing of formed component images by the data management apparatus 20 according to the present embodiment.

As shown in fig. 9, the data management apparatus 20 classifies the formed component images to be classified into, for example, classification 3 of detection component P and classification 4 of detection component Q, either using the second learned model 25C, whose classification performance is higher than that of the first learned model 15C, or according to classification operations by a user such as a laboratory technician. The formed component images classified into classification 3 and classification 4 are, for example, images that could not be classified by the image processing apparatus 10, or images highly likely not to have been classified accurately by it. The classification result is then passed to the return unit 21D.
The second learned model 25C, or a user such as a laboratory technician, may also classify images into detection components other than those that the image classification processing based on the first learned model 15C (the first classification unit 11B) can handle, that is, into items not shown in fig. 5 (for example, Mallory bodies or atypical cells). A comment summarizing the classification result may be returned together with, or in place of, the classification result of the detection components.
Returning to fig. 8, the return unit 21D controls the communication unit 28 to return the classification result of the second classification unit 21B to the image processing apparatus 10 via the network line N.
Next, the operation of the image processing apparatus 10 according to the first embodiment will be described with reference to fig. 10.
Fig. 10 is a flowchart showing an example of a flow of processing based on the image processing program 15A according to the first embodiment.
The CPU11 of the image processing apparatus 10 executes the image classification processing based on the image processing program 15A by loading the image processing program 15A stored in the ROM12 or the storage unit 15 into the RAM13.
In step S101 in fig. 10, the CPU11 cuts out, as formed component images, the formed components of a plurality of types contained in the fluid to be measured from a plurality of images (for example, 300 to 1,000 images) obtained by capturing, with the camera 74, the fluid to be measured flowing through the flow cell 40, and acquires the plurality of cut-out formed component images.

In step S102, the CPU11 classifies the plurality of formed component images acquired in step S101 into detection components for each predetermined classification (for example, by the type, size, or shape of the formed component, or the presence or absence of a nucleus), using, for example, the first learned model 15C. The formed component images classified for each predetermined classification are temporarily stored in the storage unit 15 for each specimen. As the method of classifying formed component images, various known techniques are applied, such as methods using machine learning or pattern matching.
In step S103, the CPU11 calculates a matching degree for each of the formed component images classified as detection components in step S102.

In step S104, the CPU11 controls the communication unit 18 to transmit the specified formed component images among the plurality of formed component images to the data management apparatus 20, based on the matching degree calculated in step S103. As described above, the specified formed component images transmitted to the data management apparatus 20 may be determined manually by the user or automatically by the image processing apparatus 10.
In step S105, the CPU11 determines whether a classification result for the specified formed component images transmitted to the data management apparatus 20 in step S104 has been returned from the data management apparatus 20. If it is determined that a classification result has been returned (affirmative determination), the flow proceeds to step S106; if not (negative determination), the flow waits at step S105.

In step S106, the CPU11 stores the classification result returned from the data management apparatus 20 in, for example, the storage unit 15, and ends the series of processing based on the image processing program 15A. When the first classification unit 11B has classified a formed component image as a specific detection component, the classification result of that image stored in the storage unit 15 may be rewritten, that is, changed, based on the returned classification result.
As in steps S105 and S106, it is not only the image processing apparatus 10 that transmitted the specified formed component images that may receive their classification result from the data management apparatus 20; another apparatus provided with a CPU may receive the classification result instead.
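Steps S105 and S106 amount to applying the returned result to the stored classifications; a minimal sketch, assuming the reply is a list of records with hypothetical "image_id" and "classification" keys:

```python
def apply_returned_results(results, stored_classifications):
    """Rewrite stored classifications with those returned by apparatus 20."""
    for record in results:
        # overwrite (i.e. change) the first classification unit's result
        stored_classifications[record["image_id"]] = record["classification"]
    return stored_classifications
```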
Next, the operation of the data management device 20 according to the first embodiment will be described with reference to fig. 11.
Fig. 11 is a flowchart showing an example of a flow of processing based on the data management program 25A according to the first embodiment.
The CPU21 of the data management device 20 writes the data management program 25A stored in the ROM22 or the storage unit 25 into the RAM23, thereby executing the image classification processing based on the data management program 25A.
In step S111 of fig. 11, the CPU21 determines whether the determined formed component image transmitted from the image processing apparatus 10 has been received. If it has been received (affirmative determination), the flow proceeds to step S112; if not (negative determination), the flow waits in step S111.
In step S112, the CPU21 temporarily stores the determined formed component images received in step S111 in the storage section 25 as the classification target image group.
In step S113, the CPU21 classifies the determined formed component images stored as the classification target image group in step S112 into detection components using, for example, the second learned model 25C. As described above, the second learned model 25C is a model having higher classification performance than the first learned model 15C. Alternatively, the determined formed component images may be classified based on a classification operation by a user such as an inspection technician.
In step S114, the CPU21 returns the classification result obtained in step S113 to the image processing apparatus 10, and ends the series of processes based on the present data management program 25A.
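On the data management device side, steps S112 to S114 reduce to keeping the received images, reclassifying them, and returning the results. A hedged sketch, with a dummy stand-in for the second learned model 25C:

```python
def classify_with_second_model(image_record):
    """Stand-in for the second learned model 25C (S113); a model with higher
    classification performance would run its inference here."""
    return {"label": "hyaline cast", "matching_degree": 0.95}  # dummy output

def handle_received_images(received_images):
    # S112: temporarily store the received images as the classification target group.
    target_group = list(received_images)
    # S113: classify each image in the group with the second learned model.
    results = [classify_with_second_model(img) for img in target_group]
    # S114: return the classification results to the image processing apparatus.
    return results
```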
As described above, according to the present embodiment, the matching degree calculated for each formed component image is used to transmit, to the data management apparatus, the formed component images that are difficult for the image processing apparatus to classify, such as images that could not be classified or images whose classification results are doubtful, and the classification result is then obtained from the data management apparatus. Therefore, even when formed component images that are difficult to classify exist in the image processing apparatus, appropriate classification is possible.
Second embodiment
In the first embodiment, the mode in which the determined formed component image is transmitted to the data management apparatus has been described. In the second embodiment, a mode will be described in which, when a plurality of similar formed component images exist among the determined formed component images, a representative image is transmitted to the data management apparatus.
Fig. 12 is a block diagram showing one example of the functional configuration of the image processing apparatus 10A according to the second embodiment.
As shown in fig. 12, the CPU11 of the image processing apparatus 10A according to the present embodiment functions as an acquisition unit 11A, a first classification unit 11B, a display control unit 11C, a transmission unit 11D, a reception unit 11E, and a selection unit 11F. The same reference numerals are given to the same components as those of the image processing apparatus 10 described in the first embodiment, and overlapping description thereof is omitted.
In the present embodiment, a case is assumed in which there are a plurality of determined formed component images. In this case, the selecting unit 11F groups similar formed component images among the plurality of determined formed component images, and selects a representative image from each group of similar formed component images. The transmitting unit 11D controls the communication unit 18 to transmit the representative image selected by the selecting unit 11F to the data management device 20. Only the representative image may be transmitted, or the representative image together with the determined images other than the representative image may be transmitted. The data management device 20 performs the image classification processing described above and returns the classification result to the image processing device 10A.
The receiving unit 11E controls the communication unit 18 to receive, from the data management device 20, the classification result obtained by classifying the representative image into a detection component. In this case, the first classification unit 11B uses the classification result of the representative image obtained from the data management device 20 as the classification result of the other formed component images in the group including the representative image. Specifically, a correspondence table (not shown) associating information that identifies a group of similar formed component images with information that identifies the group's representative image is stored in the storage unit 15, so that the corresponding group can be identified from the representative-image identification information included in the classification result of the representative image.
As a result, the amount of transmitted data is reduced compared with the case where all the determined formed component images are transmitted. In general, formed component image data is large in volume, and uploading takes a particularly long time even over a dedicated line. Therefore, reducing the amount of transmitted data shortens the upload time and reduces the communication load with the data management device 20.
Specifically, the selection unit 11F analyzes the formed component images and determines the similarity between images. For example, the following (a1) to (a3) can be considered as index values for determining the similarity between images.
(a1) At least one value among the numbers of pixels in the vertical and horizontal directions of the formed component image, its frequency (representing complexity as an image), contrast, color (RGB), luminance, and area based on binarization.
(a2) A value obtained by performing frequency analysis, such as a Fourier transform, on the formed component image.
(a3) A value obtained by performing spatial analysis on the formed component image, such as matching analysis processing (analysis of the consistency of the positions, coordinates, and the like of feature points included in an image).
As a method of determining the similarity between images, for example, if the difference in at least one value obtained from (a1) to (a3) above is within a predetermined range between two images, the two images can be determined to be similar, as in the sketch below.
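As one possible reading of (a1) to (a3), the sketch below compares a few such index values between two grayscale images held as NumPy arrays. The chosen features and tolerance values are illustrative assumptions; the patent requires only that the differences fall within a predetermined range.

```python
import numpy as np

def index_values(img: np.ndarray) -> np.ndarray:
    height, width = img.shape                          # (a1) pixel counts
    contrast = float(img.max()) - float(img.min())     # (a1) contrast
    spectrum = float(np.abs(np.fft.fft2(img)).mean())  # (a2) Fourier-based value
    area = float((img > img.mean()).sum())             # (a1) area based on binarization
    return np.array([height, width, contrast, spectrum, area], dtype=float)

def is_similar(img_a: np.ndarray, img_b: np.ndarray,
               tolerances=(2.0, 2.0, 10.0, 50.0, 30.0)) -> bool:
    """Judge two images similar when every index-value difference falls
    within its predetermined range (the assumed tolerances)."""
    diff = np.abs(index_values(img_a) - index_values(img_b))
    return bool(np.all(diff <= np.asarray(tolerances)))
```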
The representative image may be selected, for example, as the first image acquired among the images grouped by the selecting unit 11F, or as any one of the grouped images.
The selection unit 11F may adjust at least one of the number and the data amount of the representative images to be transmitted according to the total data amount of the representative images or the congestion state of the line to be used (which may be judged, for example, from the communication speed). In this case, at least one of the number and the data amount of the transmitted representative images can be adjusted by appropriately changing the index values used to determine the similarity between images, as in the sketch below.
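One way to realize this adjustment, under the assumption that widening the similarity tolerances merges more images into each group and therefore reduces the number of representative images to transmit:

```python
def adjusted_tolerances(base_tolerances, line_speed_mbps,
                        slow_threshold_mbps=10.0, widen_factor=2.0):
    """Assumed policy: widen every similarity tolerance on a congested line,
    so that fewer, larger groups form and fewer representative images are sent."""
    scale = widen_factor if line_speed_mbps < slow_threshold_mbps else 1.0
    return tuple(t * scale for t in base_tolerances)
```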
Next, with reference to fig. 13, an operation of the image processing apparatus 10A according to the second embodiment will be described.
Fig. 13 is a flowchart showing an example of a flow of processing based on the image processing program 15A according to the second embodiment.
The CPU11 of the image processing apparatus 10A writes the image processing program 15A stored in the ROM12 or the storage section 15 into the RAM13, thereby executing the image classification processing based on the image processing program 15A.
In step S121 in fig. 13, the CPU11 cuts out, as formed component images, the plural types of formed components contained in the fluid to be measured from a plurality of images (for example, 300 or 1,000 images) obtained by the camera 74 capturing the fluid to be measured flowing through the flow cell 40, and acquires the plurality of cut-out formed component images.
In step S122, the CPU11 classifies the plurality of formed component images acquired in step S121 into detection components for each predetermined classification (for example, the type, size, and shape of a formed component, and the presence or absence of a nucleus) using, for example, the first learned model 15C. The formed component images classified for each predetermined classification are temporarily stored in the storage unit 15 for each object to be measured. As the method of classifying the formed component images, various known techniques can be applied, such as a method using machine learning or a method using pattern matching.
In step S123, the CPU11 calculates a matching degree for each of the formed component images classified into the detection components for each predetermined classification in step S122.
In step S124, the CPU11 determines, from among the plurality of formed component images, a plurality of formed component images that are candidates for transmission to the data management apparatus 20, based on the matching degree calculated in step S123. The CPU11 then determines the similarity of the plurality of determined formed component images. Specifically, as described above, if the difference in at least one value obtained from (a1) to (a3) above is within a predetermined range, two images can be determined to be similar.
In step S125, the CPU11 groups the formed component images determined to be similar in step S124.
In step S126, the CPU11 selects a representative image from each group of similar formed component images formed in step S125. At this time, as described above, the CPU11 stores in advance in the storage section 15 a correspondence table in which information that identifies a group of similar formed component images is associated with information that identifies its representative image.
In step S127, the CPU11 controls the communication unit 18 to transmit the representative image selected in step S126 to the data management device 20.
In step S128, the CPU11 determines whether a classification result has been returned from the data management apparatus 20 for the representative image transmitted in step S127. If a classification result has been returned (affirmative determination), the flow proceeds to step S129; if not (negative determination), the flow waits in step S128.
In step S129, the CPU11 stores the classification result returned from the data management device 20 in, for example, the storage unit 15, and ends the series of processes based on the present image processing program 15A. At this time, the CPU11 identifies the corresponding group by referring to the correspondence table stored in step S126, using the representative-image identification information included in the classification result from the data management device 20, and uses the classification result of the representative image as the classification result of the other formed component images in the identified group.
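Steps S124 to S129 can be pieced together as in the sketch below, which reuses the is_similar() test shown earlier. The greedy grouping, the dict-based correspondence table, and the choice of the first group member as the representative are all assumptions, not requirements of the patent.

```python
def group_similar_images(images: dict, is_similar) -> list:
    """S124/S125: greedily group images (mapping id -> pixel array) by similarity.
    The first member of each group serves as its representative (S126)."""
    groups = []
    for image_id, image in images.items():
        for group in groups:
            if is_similar(images[group[0]], image):
                group.append(image_id)
                break
        else:
            groups.append([image_id])  # start a new group; this image is its representative
    return groups

def build_correspondence_table(groups: list) -> dict:
    """S126: associate each representative image id with its whole group."""
    return {group[0]: group for group in groups}

def apply_returned_result(table: dict, representative_id, label, results: dict) -> dict:
    """S129: reuse the representative's classification result for every other
    formed component image in the identified group."""
    for member_id in table[representative_id]:
        results[member_id] = label
    return results
```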
As described above, according to the present embodiment, when a plurality of similar formed component images exist among the determined formed component images, only the representative image is transmitted to the data management apparatus. Therefore, the amount of transmitted data is reduced compared with the case where all the determined formed component images are transmitted, and the communication load with the data management apparatus can be reduced.
In the above embodiments, the term "processor" is used in a broad sense and includes general-purpose processors (e.g., a CPU: Central Processing Unit) and special-purpose processors (e.g., a GPU: Graphics Processing Unit, an ASIC: Application Specific Integrated Circuit, an FPGA: Field Programmable Gate Array, etc.).
The operations of the processor in the above embodiments may be performed not only by one processor but also by the cooperation of a plurality of processors located at physically separate locations. The order of the operations of the processor is not limited to the order described in the above embodiments and may be changed as appropriate.
The image processing apparatus according to the embodiments has been described above by way of example. The embodiments may take the form of a program for causing a computer to execute the functions of the units included in the image processing apparatus, or of a computer-readable non-transitory storage medium storing such a program.
The configuration of the image processing apparatus described in the above embodiments is an example and may be changed according to circumstances without departing from the gist of the present invention. The display of the formed component images is not limited to the above embodiments; for example, the images may be displayed side by side in the lateral direction. The display position of each button may also be changed as appropriate.
The flow of the processing of the program described in the above embodiments is also an example; unnecessary steps may be deleted, new steps may be added, or the processing order may be changed without departing from the gist.
In the above embodiments, the processing according to the program has been described as being implemented by a software configuration using a computer, but the present invention is not limited to this. The embodiments may also be implemented by, for example, a hardware configuration or a combination of a hardware configuration and a software configuration.

Claims (15)

1. An image processing device is provided with:
an acquisition unit that acquires a plurality of formed component images;
a first classification unit that classifies the plurality of formed component images into detection components for each predetermined classification, and calculates a matching degree of classification results; and
a transmitting unit that transmits, to a data management device via a network line, the formed component image determined from among the plurality of formed component images, based on the matching degree calculated by the first classification unit.
2. The image processing apparatus according to claim 1, wherein,
the transmitting unit transmits, to the data management device, as the determined formed component image, a formed component image whose matching degree is equal to or less than a threshold value among the plurality of formed component images.
3. The image processing apparatus according to claim 1 or 2, wherein,
the first classification unit classifies the plurality of formed component images using a first learned model that is generated by machine learning of learning data in which formed component images obtained in the past are associated with detection components for each predetermined classification, and that takes a formed component image as an input and outputs a detection component for each predetermined classification.
4. The image processing apparatus according to claim 1 or 2, wherein,
the image processing apparatus further includes a receiving unit that receives, from the data management apparatus, a classification result obtained by classifying the determined formed component image into a detection component.
5. The image processing apparatus according to claim 1 or 2, wherein,
the determined formed component images are plural,
the image processing apparatus further includes a selecting unit that groups similar formed component images among the plurality of determined formed component images and selects a representative image from the grouped similar formed component images, and
the transmitting unit transmits the representative image selected by the selecting unit to the data management apparatus.
6. A data management apparatus to which an image processing apparatus is connected via a network line, the data management apparatus comprising:
a second classification unit that classifies the determined formed component image received from the image processing apparatus into a detection component; and
a return unit that returns the classification result obtained by the second classification unit to the image processing apparatus.
7. An image processing system, comprising:
an image processing device; and
a data management device connected to the image processing device via a network line,
the image processing device is provided with:
an acquisition unit that acquires a plurality of formed component images;
a first classification unit that classifies the plurality of formed component images into detection components for each predetermined classification, and calculates a matching degree of classification results; and
a transmitting unit that transmits, to the data management device, the formed component image determined from among the plurality of formed component images, based on the matching degree calculated by the first classification unit,
the data management device is provided with:
a second classification unit that classifies the determined formed component image received from the image processing apparatus into a detection component; and
a return unit that returns the classification result obtained by the second classification unit to the image processing apparatus.
8. The image processing system of claim 7, wherein,
the first classification unit classifies the formed component images using a first learned model, and
the first learned model is generated by machine learning of learning data in which formed component images obtained in the past are associated with detection components for each predetermined classification, and takes a formed component image as an input and outputs a detection component for each predetermined classification.
9. The image processing system of claim 7, wherein,
the second classification unit classifies the determined formed component image using a second learned model generated by performing machine learning on other learning data corresponding to more detection components than the learning data using the same algorithm as that of the machine learning.
10. The image processing system of claim 7, wherein,
the second classification unit classifies the determined formed component image using a second learned model generated by machine learning the learning data using another algorithm having a higher classification performance than the machine learning algorithm.
11. The image processing system of claim 7, wherein,
the second classification unit classifies the determined formed component image using a second learned model generated by performing machine learning on other learning data corresponding to more detection components than the learning data using other algorithms having higher classification performance than the machine learning algorithm.
12. The image processing system according to claim 7 or 8, wherein,
the image processing system further includes a display section that displays the determined formed component image, and
the second classification unit classifies the determined formed component image in accordance with a classification operation performed by a user on the determined formed component image displayed on the display section.
13. The image processing system according to claim 7 or 8, wherein,
the determined formed component image includes a formed component image that the first classification unit was unable to classify into a detection component for each of the predetermined classifications.
14. An image processing method, wherein,
an image processing apparatus acquires a plurality of formed component images,
classifies the plurality of formed component images into detection components for each predetermined classification and calculates a matching degree of classification results, and
transmits the formed component image determined from among the plurality of formed component images to a data management device via a network line, based on the calculated matching degree.
15. A non-transitory storage medium storing an image processing program for causing a computer to execute:
acquiring a plurality of formed component images,
classifying the plurality of formed component images into detection components for each predetermined classification and calculating a matching degree of classification results, and
transmitting the formed component image determined from among the plurality of formed component images to a data management device via a network line, based on the calculated matching degree.
CN202310366586.2A 2022-04-13 2023-04-07 Image processing apparatus, system and method, data management apparatus and storage medium Pending CN116912665A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2022-066601 2022-04-13
JP2023060223A JP2023156996A (en) 2022-04-13 2023-04-03 Image processing device, data management device, image processing system, and image processing program
JP2023-060223 2023-04-03

Publications (1)

Publication Number Publication Date
CN116912665A true CN116912665A (en) 2023-10-20

Family

ID=88355225

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310366586.2A Pending CN116912665A (en) 2022-04-13 2023-04-07 Image processing apparatus, system and method, data management apparatus and storage medium

Country Status (1)

Country Link
CN (1) CN116912665A (en)


Legal Events

Date Code Title Description
PB01 Publication