WO2021059489A1 - Information processing device, information processing method, information processing program, and information processing system - Google Patents


Info

Publication number
WO2021059489A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
observation
control unit
display image
index
Prior art date
Application number
PCT/JP2019/038165
Other languages
French (fr)
Japanese (ja)
Inventor
桃太郎 石川
俊秀 唯木
雄之 阿部
耕磨 林
恭子 根岸
良子 仙洞田
顕之 利根川
友希 淺野
Original Assignee
株式会社ニコン
Priority date
Filing date
Publication date
Application filed by 株式会社ニコン
Priority to PCT/JP2019/038165
Publication of WO2021059489A1

Classifications

    • C - CHEMISTRY; METALLURGY
    • C12 - BIOCHEMISTRY; BEER; SPIRITS; WINE; VINEGAR; MICROBIOLOGY; ENZYMOLOGY; MUTATION OR GENETIC ENGINEERING
    • C12M - APPARATUS FOR ENZYMOLOGY OR MICROBIOLOGY; APPARATUS FOR CULTURING MICROORGANISMS FOR PRODUCING BIOMASS, FOR GROWING CELLS OR FOR OBTAINING FERMENTATION OR METABOLIC PRODUCTS, i.e. BIOREACTORS OR FERMENTERS
    • C12M1/00Apparatus for enzymology or microbiology
    • C12M1/34Measuring or testing with condition measuring or sensing means, e.g. colony counters

Definitions

  • The present invention relates to an information processing device, an information processing method, an information processing program, and an information processing system.
  • Patent Document 1 discloses a technique for calculating cell density, based on an image of a cell colony, from the area of the colony and the number of cells contained in it. Since daily analysis with such a technique produces an enormous volume of analysis results, a technique that allows the user to easily view those results visually is required.
  • According to a first aspect, the present invention provides an information processing device including: an acquisition unit that acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions; and a control unit that displays, on a display image and based on a classification criterion, a plurality of reference images related to observation images of a plurality of accommodating units in which the plurality of objects are accommodated.
  • According to a second aspect, there is provided an information processing method including: acquiring observation results obtained by imaging a plurality of objects under predetermined observation conditions; and showing, in a display image and based on a classification criterion, a plurality of reference images related to observation images of a plurality of accommodating units in which the plurality of objects are accommodated.
  • According to a third aspect, there is provided an information processing program that causes a computer to acquire observation results obtained by imaging a plurality of objects under predetermined observation conditions, and to display, on a display image and based on a classification criterion, a plurality of reference images related to observation images of a plurality of accommodating units in which the plurality of objects are accommodated.
  • According to a fourth aspect, there is provided an information processing system that outputs a display image to a user terminal by cloud computing, the system including a server that has: an acquisition unit that acquires, via a network, observation results obtained by imaging a plurality of objects under predetermined observation conditions; an image generation unit that generates, based on a classification criterion, a display image showing a plurality of reference images related to observation images of a plurality of accommodating units in which the plurality of objects are accommodated; and an output unit that outputs the display image generated by the image generation unit to the user terminal via the network.
  • Diagrams showing screen examples in group registration according to the first embodiment.
  • FIG. 1 is a diagram showing an overall configuration example of an analysis system including the information processing apparatus according to the first embodiment.
  • The analysis system 1000 includes a culture system BS, an information processing device 100, and a storage device 110.
  • The culture system BS includes a culture device 8 and an observation device 5.
  • The analysis system 1000 is a system for culturing an object (for example, a cell, a sample, or a specimen), observing (imaging) the culturing process, and analyzing the observation results (e.g., captured images).
  • The culture system BS, the information processing device 100, and the storage device 110 are connected via a network such as the Internet, a LAN (Local Area Network), or a WAN (Wide Area Network). They may also be connected via a network combining the Internet, a LAN, a WAN, and the like. Such a network is not limited to wired communication and may include wireless communication. Further, the information processing device 100 may be configured to include the storage device 110, and the culture system BS and the storage device 110 may be connected via a network.
  • FIG. 2 is a diagram showing a configuration example of a culture system connected to the information processing apparatus according to the first embodiment.
  • FIG. 3 is a block diagram showing a configuration example of a culture system connected to the information processing apparatus according to the first embodiment.
  • FIG. 4 is a diagram illustrating an example of the connection relationship around the control unit of the culture system connected to the information processing apparatus according to the first embodiment.
  • The culture system BS is roughly divided into: a culture chamber 2 provided in the upper part of the housing 1; a stocker 3 that accommodates and holds a plurality of culture containers 10; an observation device 5 that observes (images) an object in the culture container 10; and a transport unit (transport device) 4 that transports the culture container 10.
  • The culture system BS further has a control unit (control device) 6 that controls the operation of the system, and an operation panel 7 including a display device.
  • The culture chamber 2, the stocker 3, the transport unit 4, and the like correspond to the culture device 8.
  • The culture chamber 2 is a chamber that forms the culture environment for observation objects such as cells in microscopic observation.
  • The culture chamber 2 is provided with a temperature adjusting device 21, a humidifier 22, a gas supply device 23, a circulation fan 24, and an environment sensor 25.
  • The temperature adjusting device 21 cooperates with the environment sensor 25 to adjust the temperature in the culture chamber 2 to a predetermined set temperature.
  • The humidifier 22 cooperates with the environment sensor 25 to adjust the humidity in the culture chamber 2 to a predetermined set humidity.
  • The gas supply device 23 cooperates with the environment sensor 25 to supply CO2 gas, N2 gas, O2 gas, and the like.
  • The circulation fan 24 circulates the gas (air) in the culture chamber 2 and, in cooperation with the environment sensor 25, adjusts the temperature.
  • The environment sensor 25 detects the temperature, humidity, carbon dioxide concentration, nitrogen concentration, oxygen concentration, and the like of the culture chamber 2.
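The cooperation between the environment sensor 25 and the actuators described above amounts to a setpoint loop. The following Python sketch is purely illustrative; the class, field names, and tolerance are assumptions, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentReading:
    """One sample from a sensor like the environment sensor 25 (assumed fields)."""
    temperature_c: float
    humidity_pct: float
    co2_pct: float

def regulate(reading: EnvironmentReading,
             set_temp: float, set_humidity: float, set_co2: float,
             tolerance: float = 0.5) -> dict:
    """Decide which actuators to drive toward their setpoints."""
    return {
        "heater_on": reading.temperature_c < set_temp - tolerance,
        "cooler_on": reading.temperature_c > set_temp + tolerance,
        "humidifier_on": reading.humidity_pct < set_humidity - tolerance,
        "co2_valve_open": reading.co2_pct < set_co2 - tolerance,
    }
```

For example, a reading of 36.0 °C against a 37.0 °C setpoint would turn the heater on while leaving the cooler off.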
  • The stocker 3 is formed in the shape of shelves divided front to back and top to bottom. For example, a unique address is set for each shelf.
  • The culture container 10 is selected appropriately according to the type and purpose of the object to be cultured.
  • The culture container 10 may be, for example, a well plate, a flask, or a dish-type culture container. In this embodiment, a case where a well plate is used is taken as an example.
  • The object is injected into the culture vessel 10 together with a liquid medium (culture solution) and held there. For example, a code number is assigned to each culture vessel 10.
  • The culture container 10 is stored in association with a designated address of the stocker 3 according to its assigned code number.
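The association between code numbers and stocker shelf addresses described above is, in effect, a small lookup table. A minimal sketch with hypothetical names and a (front/back, row, column) address tuple:

```python
# Hypothetical sketch: each culture vessel's code number is associated
# with a unique shelf address in the stocker (front/back, row, column).
stocker: dict[tuple, str] = {}  # address -> code number

def store_vessel(code_number: str, address: tuple) -> None:
    """Place a vessel at a shelf address; each address holds one vessel."""
    if address in stocker:
        raise ValueError(f"address {address} already occupied")
    stocker[address] = code_number

def find_vessel(code_number: str):
    """Return the shelf address holding the given vessel, or None."""
    for address, code in stocker.items():
        if code == code_number:
            return address
    return None
```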
  • The transport unit 4 has a Z stage 41 that can move up and down, a Y stage 42 that can move back and forth, and an X stage 43 that can move left and right, all provided inside the culture chamber 2.
  • A support arm 45 on the tip end side of the X stage 43 lifts and supports the culture vessel 10.
  • The observation device 5 includes a first illumination unit 51, a second illumination unit 52, a third illumination unit 53, a macro observation system 54, a microscopic observation system 55, and the control unit 6.
  • The first illumination unit 51 illuminates the object from below the sample table 15.
  • The second illumination unit 52 illuminates the object from above the sample table 15 along the optical axis of the microscopic observation system 55.
  • The third illumination unit 53 illuminates the object from below the sample table 15 along the optical axis of the microscopic observation system 55.
  • The macro observation system 54 carries out macro observation of the object.
  • The microscopic observation system 55 carries out microscopic observation of the object.
  • The sample table 15 is provided with a transparent window portion 16, such as glass, in the observation region of the microscopic observation system 55.
  • The macro observation system 54 includes an observation optical system 54a and an imaging device 54c, such as a CCD camera, that captures an image of the object formed by the observation optical system 54a.
  • The macro observation system 54 acquires an overall observation image, from above, of the culture vessel 10 backlit by the first illumination unit 51.
  • The microscopic observation system 55 includes an observation optical system 55a, which includes an objective lens, an intermediate magnification lens, and a fluorescence filter, and an imaging device 55c, such as a cooled CCD camera, that captures an image of the object formed by the observation optical system 55a.
  • A plurality of objective lenses and intermediate magnification lenses may be provided.
  • The objective lens and the intermediate magnification lens can be set to an arbitrary observation magnification by changing the combination of lenses.
  • The microscopic observation system 55 acquires a transmission image of the object illuminated by the second illumination unit 52, a reflection image of the object illuminated by the third illumination unit 53, and a fluorescence image of the object illuminated by the third illumination unit 53. That is, the microscopic observation system 55 acquires a microscopic observation image obtained by microscopically observing the object in the culture vessel 10.
  • The control unit 6 processes the signals input from the imaging device 54c of the macro observation system 54 and the imaging device 55c of the microscopic observation system 55 to generate images such as the overall observation image and the microscopic observation image. Further, the control unit 6 performs image analysis on the overall observation image and the microscopic observation image to generate a time-lapse image. The control unit 6 outputs the generated images to the information processing device 100 and stores them in the storage device 110.
  • The control unit 6 has a CPU (Central Processing Unit) (processor) 61, a ROM (Read Only Memory) 62, and a RAM (Random Access Memory) 63.
  • The CPU 61 controls the control unit 6 and executes various processes in the control unit 6.
  • The ROM 62 stores a control program, control data, and the like related to the culture system BS.
  • The RAM 63 includes an auxiliary storage device such as a hard disk or a DVD (Digital Versatile Disc), and temporarily stores observation conditions, image data, and the like.
  • Each component device, such as the culture chamber 2, the transport unit 4, the observation device 5, and the operation panel 7, is connected to the control unit 6 (see FIG. 3).
  • The RAM 63 stores, for example, the environmental conditions of the culture chamber 2 according to the observation program, the observation schedule, and the observation type, observation position, observation magnification, and the like in the observation device 5. Further, the RAM 63 includes a storage area for storing the image data captured by the observation device 5, and stores the image data in association with index data including the code number of the culture vessel 10, the imaging date and time, and the like.
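The index data described above (code number, imaging date and time, and so on) can be modeled as a lookup key for the stored image data. A minimal sketch; the field names and the "macro"/"micro" labels are assumptions for illustration, not taken from the patent:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class ImageIndex:
    """Assumed index record associating one stored image with its metadata."""
    code_number: str        # code number of the culture vessel 10
    captured_at: datetime   # imaging date and time
    observation_type: str   # e.g. "macro" or "micro"
    magnification: str      # e.g. "8x"

images: dict[ImageIndex, bytes] = {}  # index data -> raw image data

def images_for_vessel(code_number: str) -> list[ImageIndex]:
    """All stored image indices for one vessel, oldest first."""
    keys = [k for k in images if k.code_number == code_number]
    return sorted(keys, key=lambda k: k.captured_at)
```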
  • The operation panel 7 has an operation panel (operation unit, input unit) 71 and a display panel 72.
  • The operation panel 71 includes input/output devices (operation unit, input unit) such as a keyboard, a mouse, and switches. The user operates the operation panel 71 to input observation program settings, condition selections, operation commands, and the like.
  • The communication unit 65 is configured in accordance with a wired or wireless communication standard, and transmits and receives data between the control unit 6 and the observation device 5, the culture system BS, or an external device (e.g., a server or a user's client terminal) connected to the control unit 6.
  • Various types of information stored in the RAM 63 can be stored, as appropriate, in the storage device 110 via the information processing device 100.
  • FIG. 5 is a block diagram showing a functional configuration example of the information processing device according to the first embodiment.
  • The information processing apparatus 100 includes a communication unit 131, an acquisition unit 132, a display control unit 133, a calculation unit 134, and a registration unit 135. Further, the information processing device 100 is connected to an input unit 121 and a display unit 122.
  • The input unit 121 receives various operations by the user of the information processing device 100 and outputs a control signal corresponding to each user operation.
  • The input unit 121 is composed of, for example, a mouse, a keyboard, and the like.
  • The display unit 122 displays and outputs various information (including images) according to user operations on the input unit 121.
  • The display unit 122 is composed of, for example, a display or the like.
  • The input unit 121 and the display unit 122 may be configured integrally. That is, the input unit 121 and the display unit 122 may be configured as a portable terminal (e.g., a tablet terminal) having a touch panel that directly receives input operations on the various information displayed on the display unit 122.
  • The communication unit 131 communicates with the culture system BS and the storage device 110 via the network to transmit and receive various information.
  • The communication unit 131 receives information on observation conditions and observation results from the culture system BS, for example, via the network. Further, the communication unit 131 transmits and receives information on observation conditions and observation results to and from the storage device 110, for example, via the network.
  • The acquisition unit 132 acquires observation results obtained by imaging, under predetermined observation conditions, a plurality of objects housed in a container having a plurality of accommodating units. For example, the acquisition unit 132 appropriately acquires, from the storage device 110 via the network or the communication unit 131, information on the various observation results of the culture system BS stored in the storage device 110. The acquisition unit 132 can also appropriately acquire from the storage device 110 not only information on the observation results but also information on the observation conditions.
  • FIG. 6 is a diagram showing an example of information stored in the storage device according to the first embodiment.
  • The storage device 110 includes, as data items, an experiment number, an experiment name, an experiment manager, a person in charge of the experiment, an observation start date and time, an observation end date and time, a microscope name, a magnification, a container product, a container type, a determination result, a status, an app number, and an app name.
  • The experiment number is information indicating an identification number uniquely assigned to each experiment. Information such as "Exp00001" is stored in the experiment number.
  • The experiment name is information indicating the name of the experiment. Information such as "BS-T00001" is stored in the experiment name.
  • The experiment manager is information indicating the name of the person responsible for the experiment.
  • Information such as "responsible person A" is stored in the experiment manager.
  • The person in charge of the experiment is information indicating the name of the person carrying out the experiment.
  • Information such as "person in charge E" is stored in the person in charge of the experiment.
  • The data items of the experiment number, the experiment name, the experiment manager, and the person in charge of the experiment may simply be data items of a number, a name, a manager, and a person in charge, respectively.
  • When the system is used not only in an experimental step but also in a culture step, these may be data items of a culture number, a culture name, a culture manager, and a person in charge of culture.
  • The observation start date and time is information indicating the date and time when the observation was started. Information such as "2019/08/25 09:00:00" is stored in the observation start date and time.
  • The observation end date and time is information indicating the date and time when the observation was completed. Information such as "2019/08/26 15:15:25" is stored in the observation end date and time.
  • The microscope name is information indicating the name of the microscope used in the observation. Information such as "microscope H" is stored in the microscope name.
  • The magnification is information indicating the magnification of the microscope set at the time of observation. Information such as "8x" is stored in the magnification.
  • The container product is information indicating the manufacturer name of a container (for example, a well plate) having a plurality of accommodating portions (for example, wells or dishes) that accommodate objects. Information such as "product type K" is stored in the container product.
  • The container type is information indicating the type of a container (for example, a well plate) having a plurality of accommodating portions (for example, wells) that accommodate objects.
  • Information such as "6WP (Well Plate)" is stored in the container type.
  • The determination result is information indicating the user's determination for the experiment. Information such as "OK" or "NG" is stored in the determination result.
  • The status is information indicating the progress of the analysis of the observation results. Information such as "completed" or "60%" is stored in the status.
  • The app number is information indicating an identification number uniquely assigned to each application package used in the analysis of the observation results. Information such as "App.00001" is stored in the app number.
  • The app name is information indicating the name of the application package. Information such as "AppX" is stored in the app name.
  • Application packages include, for example, an image analysis application, an application for calculating the area of an object, and an application for counting the number of objects.
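The data items listed above can be gathered into a single record. The following sketch mirrors the listed items using the example values given above; the field names themselves are assumptions, not the patent's schema:

```python
from dataclasses import dataclass

@dataclass
class ExperimentRecord:
    """One row of the experiment metadata described above (assumed field names)."""
    experiment_number: str   # e.g. "Exp00001"
    experiment_name: str     # e.g. "BS-T00001"
    manager: str             # e.g. "responsible person A"
    person_in_charge: str    # e.g. "person in charge E"
    observation_start: str   # e.g. "2019/08/25 09:00:00"
    observation_end: str     # e.g. "2019/08/26 15:15:25"
    microscope_name: str     # e.g. "microscope H"
    magnification: str       # e.g. "8x"
    container_product: str   # e.g. "product type K"
    container_type: str      # e.g. "6WP (Well Plate)"
    determination: str       # "OK" / "NG"
    status: str              # "completed" / "60%"
    app_number: str          # e.g. "App.00001"
    app_name: str            # e.g. "AppX"
```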
  • The storage device 110 stores observation images of the plurality of accommodating units in which the objects are housed, in association with the experiment number, the code number, and the like.
  • The observation image corresponds to, for example, the above-mentioned overall observation image or microscopic observation image. Therefore, the acquisition unit 132 can also appropriately acquire the observation images from the storage device 110.
  • The storage device 110 also stores group information, described later.
  • The display control unit 133 generates various display images to be displayed on the display unit 122, and displays the generated display images on the display unit 122.
  • The display control unit 133 also generates display images related to group registration, which is mainly executed by the registration unit 135 described later. Further, the display control unit 133 acquires, from the calculation unit 134, information that requires calculation when generating a display image. That is, the calculation unit 134 executes calculations based on the observation results.
  • The display control unit 133 acquires raw data such as observation conditions and observation results from the storage device 110, and acquires information on the results of calculation processing based on the observation results from the calculation unit 134. Details of the processing by the display control unit 133 according to this embodiment will be described later.
  • The registration unit 135 registers two or more accommodating units as the same group based on a classification criterion obtained from the observation conditions or the observation results.
  • The classification criterion is, for example, at least one of the observation conditions including the type and amount of the culture solution to be put into the accommodating unit.
  • The classification criterion is, for example, at least one of the observation conditions including the type, concentration, and amount of serum contained in the culture medium to be put into the accommodating unit.
  • The classification criterion is, for example, at least one of the observation conditions including the type, concentration, duration of exposure, and timing of exposure of a drug to be put into the accommodating unit.
  • The classification criterion is, for example, at least one of the observation conditions including the type and number of objects to be put into the accommodating unit.
  • The classification criterion is, for example, at least one of the observation conditions including the microscope name, the magnification, and the temperature setting, humidity setting, atmosphere supply setting, and light output setting of the space in which the container is arranged (for example, the culture chamber 2).
  • The classification criterion is, for example, at least one of the observation results including the number of objects, the time course of that number, the doubling time of that number, the amount of movement, and the morphological change.
  • The classification criterion is, for example, at least one of the observation results including the occupied area of the object and the peripheral length of the object.
  • As the classification criterion, for example, the brightness value after analysis of the observation result (image) may be used.
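Each of the classification criteria above maps an observation record to a comparable value. A minimal sketch, assuming a hypothetical record layout (the dictionary keys are illustrative, not the patent's data model):

```python
# Sketch: each classification criterion maps an observation record to a
# comparable value; record keys here are assumptions for illustration.
CRITERIA = {
    "culture_medium_type": lambda rec: rec["medium"]["type"],
    "serum_concentration": lambda rec: rec["medium"]["serum_pct"],
    "object_count":        lambda rec: rec["results"]["count"],
    "occupied_area":       lambda rec: rec["results"]["area_um2"],
}

def criterion_value(record: dict, criterion: str):
    """Extract the value of one named classification criterion from a record."""
    return CRITERIA[criterion](record)
```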
  • The registration unit 135 registers, as the same group, two or more accommodating units whose observation results are the same (similar or related) with respect to one of the above-mentioned classification criteria or a combination of classification criteria.
  • The registration unit 135 stores the group information in the storage device 110.
  • The classification criteria include the features or indices used for group registration and for the group display described later.
  • The information used in group registration may include the observation image of the accommodating unit, which is one of the observation results. That is, the registration unit 135 registers, as the same group, two or more accommodating units whose observation images are the same (similar or related) in an arbitrary process (period) of the observation. Further, the information used in group registration may include information that visually expresses the observation results. That is, the registration unit 135 registers, as the same group, two or more accommodating units whose graphed information is the same (similar or related), based on graphed information of the observation results.
  • Group registration can also be carried out without using the observation results.
  • In that case, the registration unit 135 executes group registration at an arbitrary timing without using the observation results.
  • An arbitrary timing is, for example, before, during, and/or after the observation.
  • Group registration that does not use the observation results can be performed before the observation (at the timing when the classification criteria for the observation conditions are set). That is, the registration unit 135 may register two or more accommodating units as the same group based on classification criteria relating to the observation conditions.
  • The classification criteria for the observation conditions may be determined in advance by the user.
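Group registration as described above, i.e., putting accommodating units with the same (or similar) criterion values into one group, can be sketched as follows. The tolerance-based matching against the first member of each group is an assumption for illustration, not the patent's algorithm:

```python
from collections import defaultdict

def register_groups(values: dict[str, float], tolerance: float = 0.0) -> dict[int, list[str]]:
    """Group wells whose criterion values agree, within `tolerance`, with
    the first member (group center) encountered; returns group id -> wells."""
    groups: dict[int, list[str]] = defaultdict(list)
    centers: dict[int, float] = {}
    for name, v in sorted(values.items()):
        for gid, center in centers.items():
            if abs(v - center) <= tolerance:
                groups[gid].append(name)
                break
        else:
            gid = len(centers)      # open a new group for this value
            centers[gid] = v
            groups[gid].append(name)
    return dict(groups)
```

With a tolerance of zero this reduces to grouping by exactly equal values, which matches the "same observation result" case in the text.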
  • FIGS. 7 to 17 are diagrams showing screen examples in group registration using the classification criteria according to the first embodiment. Further, in the description of FIGS. 7 to 17, the processes in the display control unit 133, the calculation unit 134, and the registration unit 135 will be described as appropriate.
  • The display control unit 133 displays a display image showing an observation result search screen on the display unit 122.
  • The search screen includes a text input unit KSa for keyword search, condition search units FSa, FSb, FSc, and FSd for searching by selecting from predetermined search conditions, and a search button SB for executing the search.
  • The condition search unit FSa is, for example, a pull-down for selecting the name of the person in charge of the experiment.
  • The condition search unit FSb is, for example, a pull-down for selecting a status.
  • The condition search unit FSc is, for example, a pull-down for selecting an observation start date and an observation end date.
  • The condition search unit FSd is, for example, a pull-down for selecting an application name.
  • The condition search units FSa, FSb, FSc, and FSd may instead be realized by text input. The search conditions are not limited to the above.
  • The user operates the input unit 121 to enter text into the text input unit KSa, make selections using the condition search units FSa, FSb, FSc, and FSd, and press the search button SB.
  • Alternatively, the user operates the input unit 121 and presses the search button SB without entering text or selecting a condition search.
  • The display control unit 133 acquires the information corresponding to the search conditions from the storage device 110, and displays a display image showing the search result SR on the display unit 122. That is, the display control unit 133 displays on the display unit 122 a display image showing a list of observation results based on the search conditions, or a list of all observation results.
  • The search result SR includes, for example, information on the data items of the experiment name, the experiment manager, the person in charge of the experiment, the observation start date and time, the observation end date and time, the microscope name, the magnification, the container product, the container type, the application name, the judgment, and the status.
  • The data items of the search result SR are not limited to the above. Further, the search result SR can be sorted by specifying a data item or the like. The user performs an operation of selecting (designating) an entry from the search result SR in order to confirm the observation result.
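The keyword and condition search described above can be sketched as a simple filter over experiment records. The record keys and parameter names are illustrative assumptions; empty filters return all records, matching the "list all" case in the text:

```python
def search(records, keyword="", person=None, status=None, app_name=None):
    """Filter experiment records by keyword and optional condition filters."""
    out = []
    for rec in records:
        if keyword and keyword.lower() not in rec["experiment_name"].lower():
            continue
        if person is not None and rec["person_in_charge"] != person:
            continue
        if status is not None and rec["status"] != status:
            continue
        if app_name is not None and rec["app_name"] != app_name:
            continue
        out.append(rec)
    return out  # no filters given -> all records
```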
  • the display control unit 133 shows the observation result selected from the search result SR in the display image.
  • the display image of the observation result includes, for example, information ExpI regarding observation, information ScI regarding imaging conditions (observation conditions), plate map PmI including observation images, and information EvI regarding events in observation.
  • the plate map PmI includes an observation image OI, a group name GN, a time-series switching content TS, and a depth switching content DP.
  • the observation image OI for example, an observation image included in the selected observation result is displayed.
  • a registered group name is displayed in the group name GN.
  • time-series switching content TS for example, content for switching observation images in chronological order is displayed.
  • the depth switching content DP for example, content for switching the observation image corresponding to the depth of the accommodating portion is displayed.
  • the time-series switching content TS is represented by, for example, a plurality of rectangles separated at regular intervals.
  • the user can operate the input unit 121 to select the rectangle of the time-series switching content TS.
  • the display control unit 133 switches the display of the plate map PmI using the observation image corresponding to the corresponding period as the display image.
  • the display control unit 133 may display information regarding the observation date and time in the selected rectangle.
  • the switching of the observation image using the time-series switching content TS is not limited to the selection of the rectangle, and may be realized by the time-series switching content TSa that moves the selection position of the rectangle.
  • the depth switching content DP is represented by, for example, a rectangle divided by a certain depth (thickness, Z direction) with respect to the accommodating portion.
  • the user can operate the input unit 121 to select the rectangle of the depth switching content DP.
  • the display control unit 133 switches the display of the plate map PmI using the observation image corresponding to the depth of the corresponding accommodating unit as the display image.
  • the switching of the observation image using the depth switching content DP is not limited to the selection of the rectangle, and may be realized by the depth switching content DPa that moves the selection position of the rectangle.
  • the display control unit 133 shows the observation result of the group selected by the user on the display image relating to the container having the plurality of storage units, and displays the display image on the display unit 122. When the search condition is not used, the display control unit 133 shows the observation result selected by the user from the full list on that display image, and displays the display image on the display unit 122. When the search condition is used, the display control unit 133 shows the observation result selected by the user from the list narrowed down by the search condition, and displays the display image on the display unit 122. Then, the display control unit 133 shows the observation image of each accommodating unit included in the observation result on the display image, and displays the display image on the display unit 122.
  • editing includes editing group information by registering a new group.
  • the display control unit 133 receives a signal indicating that the edit button EB has been pressed.
  • the display control unit 133 displays an observation image OI and a display image including the group addition button AG for adding a new group.
  • the user operates the input unit 121, selects two or more observation images OI corresponding to the accommodating units to be added to the new group, and presses the group addition button AG.
  • the display control unit 133 displays a display image in which the observation image OI is inverted with a predetermined color according to the selection of the observation image OI.
  • the display control unit 133 receives a signal indicating that the group addition button AG is pressed and, as shown in FIG. 12, displays a display image for inputting (or editing) the group name of the new group, including the registration button RB for executing group registration.
  • the display control unit 133 clearly indicates the observation image OI to be registered as a group by highlighting information such as a color, a frame, and a line.
  • a group color content GC that expresses observation results and the like in different colors for each group is arranged.
  • as shown in FIG. 13, the user operates the input unit 121, selects the group color content GC, inputs the group name to the text input unit KSb, and presses the registration button RB.
  • the display control unit 133 receives a signal indicating that the registration button RB is pressed and, as shown in FIG. 14, displays a display image including a confirmation image CI for confirming the execution of the group registration.
  • the user operates the input unit 121 and presses the registration button included in the confirmation image CI.
  • when the group registration is not to be executed, the user operates the input unit 121 and presses the cancel button included in the confirmation image CI.
  • the display control unit 133 displays a display image including a completion image CS indicating that the group registration is completed.
  • the registration unit 135 stores the group information regarding the new group in the storage device 110.
  • the display control unit 133 displays a display image including the newly registered group name in the group name GN of the plate map PmI.
  • the information processing device 100 completes the group registration as described above.
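The group-registration flow above can be sketched as follows; the storage layout, function name, and field names are illustrative assumptions rather than the actual implementation of the registration unit 135:

```python
def register_group(storage, group_name, group_color, selected_wells):
    """Store group information for the accommodating units selected via the
    observation images OI (illustrative stand-in for the registration unit 135)."""
    if not group_name:
        # Mirrors the requirement that a group name be entered in the text input unit.
        raise ValueError("a group name must be entered")
    storage.setdefault("groups", {})[group_name] = {
        "color": group_color,
        "wells": sorted(selected_wells),
    }
    return storage


# Usage: register a new group and confirm it appears in the stored group information.
storage_device = {}
register_group(storage_device, "Group B", "#00a0e9", ["A1", "A2", "B1"])
assert storage_device["groups"]["Group B"]["wells"] == ["A1", "A2", "B1"]
```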
  • the user operates the input unit 121 and selects a registered group from the group name GN in the plate map PmI.
  • the display control unit 133 receives a signal indicating the selection of the group from the group name GN and, as shown in FIG. 17, displays a display image including the analysis result AR indicating the analysis result corresponding to the selected group name and the doubling time DT indicating the doubling time.
  • the display control unit 133 may receive and display the calculation processing result from the calculation unit 134 when displaying the analysis result AR and the doubling time DT, for example.
  • the analysis result AR shown in FIG. 17 is an example in which the average number of objects in the accommodating units belonging to group B is visually expressed together with error bars. That is, the display control unit 133 visually represents the observation result on the display image based on the calculation executed by the calculation unit 134, and displays the display image on the display unit 122.
  • the data on the vertical axis and the horizontal axis of the graph are shown as an example.
  • information on the time in the period from the observation start date and time to the observation end date and time may be displayed on the horizontal axis.
  • information on the item selected from the pull-down menu of the plate map PmI may be displayed on the vertical axis.
  • for example, when the number of cells is selected from the pull-down menu, the vertical axis is indicated by a scale such as 0 to 1.0 (×10⁷).
  • when the cell occupied area ratio is selected from the pull-down menu, the vertical axis is indicated by a ratio such as 0 to 100 (or 0 to 1.0). Therefore, the vertical and horizontal axes of the graph change according to the content to be displayed.
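As a worked example of the doubling time DT mentioned above, the following sketch estimates a doubling time from two cell counts, assuming exponential growth (the formula choice and function name are assumptions, not the specification's method):

```python
import math


def doubling_time(n_start, n_end, elapsed_hours):
    """Estimate doubling time from two cell counts under an assumed
    exponential-growth model: DT = t * ln(2) / ln(n_end / n_start)."""
    if n_end <= n_start:
        raise ValueError("no growth: doubling time is undefined")
    return elapsed_hours * math.log(2) / math.log(n_end / n_start)


# A culture growing from 1.0e6 to 4.0e6 cells in 48 h doubles twice -> 24 h.
assert abs(doubling_time(1.0e6, 4.0e6, 48.0) - 24.0) < 1e-9
```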
  • FIG. 18 is a flowchart showing an example of a processing flow in the information processing apparatus according to the first embodiment.
  • the acquisition unit 132 determines whether or not a search with set search conditions has been executed. For example, the acquisition unit 132 determines whether the search button SB was pressed after text was input into the text input unit KSa or after conditions were selected using the condition search units FSa, FSb, FSc, and FSd, or whether the search button SB was pressed without any search conditions.
  • when a search with search conditions has been executed, the process in step S102 is executed.
  • otherwise, the acquisition unit 132 executes the process in step S103.
  • in step S102, the acquisition unit 132 acquires the observation results based on the search conditions. For example, the acquisition unit 132 acquires the observation results corresponding to the search conditions from the storage device 110 via the communication unit 131.
  • in step S103, the acquisition unit 132 acquires all the observation results. For example, the acquisition unit 132 acquires all the observation results from the storage device 110 via the communication unit 131.
  • in step S104, the display control unit 133 displays a list of observation results. For example, the display control unit 133 shows the list of observation results acquired by the acquisition unit 132 on the display image, and displays the display image on the display unit 122.
  • the display control unit 133 accepts selection for the observation result.
  • the display control unit 133 accepts the user's selection of the observation result as a signal in the display image of the list of observation results.
  • the display control unit 133 displays the plate map.
  • the display control unit 133 shows the observation result selected by the user with respect to the display image of the container having the plurality of storage units, and displays the display image on the display unit 122.
  • the display control unit 133 may show a graph or the like for the observation result on the display image based on the execution result of the calculation by the calculation unit 134, and display the display image on the display unit 122.
  • the display control unit 133 may display the time-series switching content TS, the depth switching content DP, and the like on the display image, and display the display image on the display unit 122.
  • in step S107, the display control unit 133 accepts the selection of observation images and the group registration of the accommodating units corresponding to the selected observation images. For example, the display control unit 133 receives a signal indicating that the edit button EB is pressed. Then, the display control unit 133 displays the observation image OI and the display image including the group addition button AG on the display unit 122. Subsequently, the display control unit 133 receives a signal indicating the selection of the observation image OI and the pressing of the group addition button AG. At this time, the display control unit 133 may display a display image in which the selected observation image OI is inverted with a predetermined color.
  • the display control unit 133 displays a display image including the text input unit KSb for inputting the group name and the registration button RB for executing the group registration on the display unit 122.
  • the observation image OI to be registered as a group may be clearly indicated by a frame or the like.
  • the display control unit 133 receives a signal indicating the selection of the group color content GC, the group name input to the text input unit KSb, and the pressing of the registration button RB.
  • the registration unit 135 executes group registration.
  • the registration unit 135 stores the group information in the storage device 110 based on the group color and the group name received by the display control unit 133 for the observation result to be the target of the new group.
  • Group registration is not limited to new group registration.
  • the registration unit 135 may store the group information to be added to the "group A" in the storage device 110 for the selected accommodating unit.
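The acquisition branch of FIG. 18 (steps S101 to S103) can be sketched as follows; `matches()`, the record fields, and the storage layout are illustrative assumptions:

```python
def matches(record, conditions):
    # A record satisfies the search when every condition field is equal.
    return all(record.get(key) == value for key, value in conditions.items())


def acquire_observation_results(storage, search_conditions=None):
    """Step S102 when search conditions are set; step S103 (all results) otherwise."""
    results = storage["observation_results"]
    if search_conditions:  # step S101: a search with conditions was executed
        return [r for r in results if matches(r, search_conditions)]
    return list(results)


storage = {"observation_results": [
    {"group": "A", "id": 1}, {"group": "B", "id": 2}, {"group": "A", "id": 3}]}
assert [r["id"] for r in acquire_observation_results(storage, {"group": "A"})] == [1, 3]
assert len(acquire_observation_results(storage)) == 3
```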
  • FIG. 19 is a flowchart showing an example of the flow of the display switching process of the observation image according to the first embodiment.
  • the display control unit 133 determines whether the plate map is displayed. At this time, the display control unit 133 executes the process in step S202 when the plate map is displayed (step S201: Yes). On the other hand, when the plate map is not displayed (step S201: No), the display control unit 133 ends the process because it is not necessary to switch the display of the observation image.
  • in step S202, the display control unit 133 determines whether or not a time-series switching operation has been accepted. For example, the display control unit 133 determines whether or not a rectangle of the time-series switching content TS is selected. The display control unit 133 may determine whether or not an operation of moving the selection position of the rectangle has been performed on the time-series switching content TSa. Then, when the display control unit 133 receives a signal indicating the time-series switching operation (step S202: Yes), the display control unit 133 displays the observation image corresponding to the time series in step S203. For example, the display control unit 133 switches the display of the plate map PmI using the observation image corresponding to the period of the selected rectangle of the time-series switching content TS as the display image. On the other hand, when the display control unit 133 does not accept the time-series switching operation (step S202: No), the display control unit 133 executes the process in step S204.
  • in step S204, the display control unit 133 determines whether or not a depth switching operation has been accepted. For example, the display control unit 133 determines whether or not a rectangle of the depth switching content DP is selected. The display control unit 133 may determine whether or not an operation of moving the selection position of the rectangle has been performed on the depth switching content DPa. Then, when the display control unit 133 receives a signal indicating the depth switching operation (step S204: Yes), the display control unit 133 displays the observation image corresponding to the depth in step S205. For example, when a rectangle of the depth switching content DP is selected, the display control unit 133 switches the display of the plate map PmI using the observation image corresponding to the depth of the corresponding accommodating unit as the display image.
  • when the display control unit 133 does not accept the depth switching operation in step S204 (step S204: No), the display control unit 133 executes the process in step S201. That is, when the display control unit 133 receives a signal indicating an operation on the time-series switching content TS or the depth switching content DP while the plate map is being displayed, it switches to and displays the corresponding observation image.
  • FIG. 20 is a block diagram showing a functional configuration example of the information processing apparatus according to the second embodiment.
  • the information processing device 100a includes a communication unit 131, an acquisition unit 132, a display control unit 133a, a calculation unit 134, and a registration unit 135. Further, the input unit 121 and the display unit 122 are connected to the information processing device 100a.
  • the display control unit 133a corresponds to the "control unit”.
  • the acquisition unit 132 acquires the observation results obtained by imaging a plurality of objects under predetermined observation conditions.
  • the process in the acquisition unit 132 is the same as that in the above embodiment. That is, the acquisition unit 132 acquires data from the storage device 110 as appropriate for processing by the display control unit 133, the calculation unit 134, and the like.
  • the display control unit 133a shows, on the display image, a plurality of reference images relating to the observation images of the plurality of storage units in which the plurality of objects are housed, based on the above-mentioned classification standard. For example, the display control unit 133a displays a display image including a plate map having a group name on the display unit 122.
  • the display control unit 133a receives a signal indicating the operation of selecting the group name by the user in the display image. Subsequently, the display control unit 133a clearly indicates the storage unit of the container belonging to the group selected by the user with respect to the display image related to the container, and displays the specified display image on the display unit 122.
  • the display control unit 133a in the present embodiment shows a plurality of reference images relating to each observation image of the container storage unit on the display image, and displays the display image on the display unit 122.
  • the reference image is, for example, a reduced image of the observed image, and is sometimes called a thumbnail (thumbnail image).
  • the display control unit 133a shows, for example, a plurality of reference images adjacent to or close to each other in the display image. Further, the display control unit 133a shows, for example, a plurality of reference images adjacent to or close to each other in the same arrangement as the plurality of accommodating units in the display image.
  • the reference images, which have the same arrangement as the accommodating portions included in the container, may or may not be in contact with each other.
  • that is, a plurality of reference images being adjacent to each other includes both the case where the reference images are in contact with each other and the case where they are not.
  • the display control unit 133a shows, in the display image, a plurality of reference images sorted based on the classification standard selected by the user. Therefore, the information processing apparatus 100a in the present embodiment acquires a plurality of reference images belonging to the group corresponding to the selected classification standard (or index) and a plurality of reference images not belonging to that group.
  • the plurality of reference images belonging to the group corresponding to the classification standard are rearranged and displayed in the display image as the same group, as shown in FIG. 22, for example.
  • when the display control unit 133a receives a signal indicating that a classification standard has been selected and input, the plurality of reference images corresponding to that classification standard (or the index of the classification standard, described later) are newly arranged and displayed on the display image based on the classification standard, instead of in the same arrangement as the storage units included in the container described above.
  • FIG. 21 is a diagram showing an example of a reference image according to the second embodiment.
  • the display control unit 133a shows, on the display image, a reference image (for example, a reduced observation image or a thumbnail image) corresponding to each of the plurality of storage units into which an object is charged (surrounded by the broken line), excluding the storage units in the container into which an anti-drying buffer solution is charged.
  • the reference image is shown in the display image in the same arrangement as each of the plurality of accommodating portions, adjacent or close to each other. Then, the plurality of reference images are rearranged in the display image based on a predetermined classification criterion as described below.
  • the display control unit 133a displays a display image including the classification standard on the display unit 122, and accepts the selection of the classification standard by the user operation.
  • the acquisition unit 132 acquires an index for evaluating the selected classification standard from the storage device 110. That is, the classification standard according to the present embodiment includes an observation condition or an index for evaluating the observation result. For example, when the classification standard is the number of objects (e.g., the number of cells), whether the number of objects is greater than or equal to a predetermined value, or less than that value, may be used as an index for classifying each accommodating portion.
  • the classification is not limited to two, and may be classified into three or more by giving a range to the index.
  • for example, classification into three categories is possible: a category of the first predetermined value or more, a category of the second predetermined value or more and less than the first predetermined value, and a category of less than the second predetermined value (where the first predetermined value > the second predetermined value).
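The three-way classification by two thresholds described above can be sketched as follows (the function name and category labels are illustrative assumptions):

```python
def classify(cell_count, first_value, second_value):
    """Classify one accommodating unit by cell count against two thresholds,
    where first_value > second_value, as in the three-category example above."""
    assert first_value > second_value
    if cell_count >= first_value:
        return "high"       # first predetermined value or more
    if cell_count >= second_value:
        return "middle"     # second predetermined value or more, less than the first
    return "low"            # less than the second predetermined value


# Usage: classify each well of a plate by its cell count.
counts = {"A1": 1200, "A2": 700, "A3": 150}
categories = {well: classify(n, 1000, 500) for well, n in counts.items()}
assert categories == {"A1": "high", "A2": "middle", "A3": "low"}
```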
  • the display control unit 133a shows a plurality of reference images for each category based on the index acquired by the acquisition unit 132 on the display image, and displays the display image on the display unit 122.
  • the display control unit 133a shows the information indicating the evaluation for each of the categories on the display image.
  • the information indicating the evaluation is "OK", "NG”, etc. indicating whether or not the index is satisfied.
  • the display control unit 133a may show information indicating an index for each of the categories on the display image. For example, when classifying into three or more categories, clearly indicate what kind of index each category corresponds to.
  • the information indicating the evaluation includes information (good or bad, etc.) indicating the maturity or differentiation state of the cell as an index.
  • the display control unit 133a shows a plurality of reference images included in the same category adjacent to or close to each other in the display image. Further, the display control unit 133a shows the plurality of reference images on the display image with an interval between the divisions. That is, in the present embodiment, in order to make clear that the set of accommodating units satisfying a certain index and the set of accommodating units not satisfying it are different divisions, the plurality of reference images are displayed rearranged with an interval left between the divisions. In addition, in the present embodiment, by placing the plurality of reference images included in the same category adjacent to or close to each other, images (reference images) of objects considered to have some common feature in each category can easily be confirmed (compared).
  • the display control unit 133a may show in the display image that the classification is different.
  • the display control unit 133a may surround each division with a frame, may add a frame of a different color for each division, or may add some content indicating that the division is different.
  • the display control unit 133a may display the display image by setting the number of arrangements of the plurality of reference images in one direction to the same number for the plurality of divisions. For example, in FIG. 22, the number of arranged reference images in one direction is set to "6". In this way, by making the number of arrangements of the reference images in one direction the same, the user can easily recognize the number of accommodating portions included in each category based on the index. Of course, the number of arrangements of the plurality of reference images in the other direction (a direction different from the above one direction), in which "OK" and "NG" shown in FIG. 22 are arranged, may also be made the same. Further, the display control unit 133a may switch the display for each division according to the user operation and show it in the display image.
  • for example, when the display control unit 133a accepts a user operation for displaying only "OK", only the category corresponding to "OK" among the plurality of reference images is shown in the display image; when a user operation for displaying only "NG" is accepted, only the category corresponding to "NG" among the plurality of reference images may be shown in the display image.
  • the display control unit 133a may show the distribution of the number of the plurality of accommodating units with respect to the observation conditions on the display image. For example, as shown in FIG. 22, the display control unit 133a expresses the distribution of the number of the plurality of accommodating units with respect to the drug concentration (one of the observation conditions) as a graph for each group corresponding to the classification standard, and shows it on the display image. The observation condition expressed as a graph is not limited to the drug concentration. Further, the display control unit 133a may show the distribution of the number of the plurality of accommodating units with respect to the observation conditions on the display image for each division. For example, as shown in FIG. 23, the display control unit 133a expresses the distribution of the number of the plurality of accommodating units with respect to the drug concentration as a graph for each category and for each group included in each category, and shows it on the display image.
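The distribution of the number of accommodating units with respect to drug concentration, tallied per group as described above, can be sketched as follows (the field names are illustrative assumptions):

```python
from collections import Counter


def concentration_distribution(wells):
    """Count accommodating units per drug concentration, grouped by group name,
    as a basis for the distribution graph described above."""
    dist = {}
    for well in wells:
        dist.setdefault(well["group"], Counter())[well["drug_conc"]] += 1
    return dist


wells = [
    {"group": "A", "drug_conc": 0.1}, {"group": "A", "drug_conc": 0.1},
    {"group": "A", "drug_conc": 1.0}, {"group": "B", "drug_conc": 1.0},
]
dist = concentration_distribution(wells)
assert dist["A"][0.1] == 2 and dist["A"][1.0] == 1 and dist["B"][1.0] == 1
```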
  • the display control unit 133a may receive an index setting and display a plurality of reference images on the display image for each category based on the received index setting. That is, the display control unit 133a accepts the selection of the classification standard and the setting of the index, and displays a plurality of reference images on the display image for each category according to the received index of the classification standard. Further, the index may be adjusted on the screen. For example, the display control unit 133a shows content for setting the index on the display image.
  • as shown in the figure, the display control unit 133a accepts index settings (changes, resets) made by the user with a slider or the like, for the index acquired from the storage device 110 or for a display image in which a plurality of reference images are rearranged for each category based on the set index. As a result, the display control unit 133a rearranges the plurality of reference images for each category based on the index set with the slider or the like, shows them on the display image, and displays the display image on the display unit 122.
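The re-sorting triggered by an index change (for example, via the slider) can be sketched as follows; the well names and the OK/NG split by a single threshold are illustrative assumptions:

```python
def split_by_index(cell_counts, threshold):
    """Return the ('OK', 'NG') groups of wells for the current index setting."""
    ok = sorted(well for well, n in cell_counts.items() if n >= threshold)
    ng = sorted(well for well, n in cell_counts.items() if n < threshold)
    return ok, ng


counts = {"A1": 900, "A2": 400, "B1": 700}
assert split_by_index(counts, 500) == (["A1", "B1"], ["A2"])
# Moving the slider re-evaluates the classification, and the display is redrawn.
assert split_by_index(counts, 800) == (["A1"], ["A2", "B1"])
```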
  • FIG. 25 is a flowchart showing an example of a processing flow in the information processing apparatus according to the second embodiment.
  • the display control unit 133a displays the display image including the classification standard on the display unit 122.
  • the acquisition unit 132 acquires an index for evaluating the classification criteria selected by the user operation from the storage device 110.
  • the display control unit 133a displays, on the display unit 122, a display image in which a plurality of reference images are arranged for each division based on the index. For example, the display control unit 133a displays a display image including a reference image for each observation image of the plurality of storage units contained in the container on the display unit 122, and then displays a display image including the classification standard on the display unit 122; the classification standard may be displayed according to the user operation. Then, the display control unit 133a arranges a plurality of reference images on the display image for each category based on the index for evaluating the classification standard acquired by the acquisition unit 132, and displays the display image on the display unit 122. The index may be set by the user along with the selection of the classification standard. Further, the display control unit 133a may include, in the display image on which the reference images are arranged, information indicating the evaluation for each of the categories ("OK", "NG", etc.) and a graph showing the distribution of the number of the plurality of accommodating units with respect to the observation conditions. In addition, the display control unit 133a may include content such as a slider for setting the index in the display image.
  • in step S304, the display control unit 133a determines whether the index setting has been changed. For example, when the index setting is changed by a user operation on the slider or the like (step S304: Yes), the display control unit 133a rearranges the reference images for each category based on the set index in step S305, and displays the display image on the display unit 122. When the reference images are rearranged by changing the setting of the index, the display control unit 133a also changes the graph showing the distribution of the number of the plurality of accommodating units with respect to the observation conditions according to the setting of the index. On the other hand, the display control unit 133a executes the process in step S306 when the index setting has not been changed (step S304: No).
  • in step S306, the display control unit 133a determines whether or not an operation of ending the screen for displaying the reference images has been performed. At this time, the display control unit 133a ends the process related to the index setting when the end operation is performed (step S306: Yes). On the other hand, when the end operation is not performed (step S306: No), the display control unit 133a executes the process in step S304 and continues the process related to the index setting.
  • FIG. 26 is a block diagram showing a functional configuration example of the information processing device according to the third embodiment.
  • the information processing apparatus 100b includes a communication unit 131, an acquisition unit 132, a display control unit 133b, a calculation unit 134, and a registration unit 135. Further, the input unit 121 and the display unit 122 are connected to the information processing device 100b.
  • the display control unit 133b corresponds to the "control unit”.
  • the display control unit 133b uses the index described in the above embodiment as the first index and, with respect to a display image showing a plurality of reference images for each category based on the first index, rearranges and shows the plurality of reference images based on a second index different from the first index. For example, in a state where a display image including a plurality of reference images for each category based on the first index is displayed, the display control unit 133b displays a display image including a classification standard for setting the second index, and accepts the selection of the classification standard.
  • the acquisition unit 132 acquires a second index for evaluating the selected classification criterion from the storage device 110. Then, the display control unit 133b rearranges and shows a plurality of reference images with respect to the display image based on the second index acquired by the acquisition unit 132.
  • the second index may be set by being selected by the user together with the selection of the classification criteria.
  • FIG. 27 is a diagram showing an example of rearranging the reference images according to the third embodiment.
  • the acquisition unit 132 acquires information about each storage unit from the storage device 110 for the first division based on the first index (for example, the division corresponding to "OK"). Further, the acquisition unit 132 acquires information about each storage unit from the storage device 110 for the second division based on the first index (for example, the division corresponding to "NG"). The information about each accommodating unit is used in confirming the setting of the second index. Then, the display control unit 133b shows a display image in which the reference images of each division are rearranged within each division based on the second index of each accommodating unit, and displays the display image on the display unit 122.
  • for example, as shown in FIG. 27, for the plurality of accommodating units included in the first division, the display control unit 133b specifies accommodating units in which the occupied area ratio of the cells (in this case, a cell colony or cell group) as the object is 80% or more, accommodating units having an occupied area ratio of 50% or more and less than 80%, and accommodating units having an occupied area ratio of less than 50%, and rearranges the plurality of reference images accordingly.
  • the occupied area ratio is an example of a second index representing observation conditions or observation results. That is, in the present embodiment, for each of the accommodating units classified by the first index, a process is executed that lets the user confirm how each unit stands with respect to the second index. Further, in the present embodiment, a process for confirming the reference images sorted by the second index is executed according to the user's selection.
  • The display control unit 133b shows, in the display image, information indicating the second index for each of the categories. For example, for the reference images of each category, information indicating the occupied area ratio of the cells in each storage unit is shown.
  • The display control unit 133b may also explicitly mark, in the display image, the plurality of reference images corresponding to each value of the second index. For example, the display control unit 133b may enclose in frames the reference images corresponding to each band of the second index: an occupied area ratio of 80% or more, an occupied area ratio of 50% or more and less than 80%, and an occupied area ratio of less than 50%. The reference images may also be arranged separately for each band of the second index.
  • The second index may be switched as appropriate. For example, the display control unit 133b rearranges the plurality of reference images according to an input signal indicating switching of the second index.
  • FIG. 28 is a diagram showing an example of a graph displayed together with the reference images according to the third embodiment.
  • For each of the categories based on the first index, the display control unit 133b may show, on the display image, a graph expressing the proportion (or number, etc.) of storage units falling in each band of the occupied area ratio described above.
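A minimal sketch of the quantity such a graph would plot, under the same assumed data layout as before: for one category, the fraction of storage units falling in each occupied-area band. The band names, thresholds, and example ratios are illustrative assumptions.

```python
# Compute, for one category, the share of storage units per
# occupied-area-ratio band (the quantity the graph would display).
from collections import Counter

def band_of(ratio):
    if ratio >= 0.8:
        return ">=80%"
    if ratio >= 0.5:
        return "50-80%"
    return "<50%"

def band_fractions(ratios):
    counts = Counter(band_of(r) for r in ratios)
    total = len(ratios)
    return {band: counts.get(band, 0) / total
            for band in (">=80%", "50-80%", "<50%")}

# Invented per-well ratios for one category.
print(band_fractions([0.91, 0.84, 0.55, 0.32]))
```

The resulting fractions could be handed to any plotting layer (bar chart, pie chart, etc.) to render the graph next to the reference images.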
  • FIG. 29 is a flowchart showing an example of the processing flow in the information processing apparatus according to the third embodiment. Here, a process in which a plurality of reference images are shown in the display image for each category based on the first index is taken as an example.
  • In step S401, the display control unit 133b displays a display image including the classification criteria on the display unit 122. In step S402, the acquisition unit 132 acquires, from the storage device 110, a second index for evaluating the selected classification criterion. In step S403, the acquisition unit 132 acquires, from the storage device 110, information about each of the storage units of each category with respect to the first index.
  • More specifically, the display control unit 133b displays a display image including the classification criteria on the display unit 122 so that the second index can be selected; the classification criteria may be displayed in response to a user operation. The acquisition unit 132 then acquires the second index for evaluating the selected classification criterion from the storage device 110, and acquires from the storage device 110 the information about the second index of each storage unit in order to confirm what setting of the second index each storage unit in each category has.
  • In step S404, the display control unit 133b displays on the display unit 122 a display image in which the reference images of each category are rearranged based on the second index. For example, the display control unit 133b displays on the display unit 122 a display image in which the reference images already arranged for each category based on the current first index are further rearranged so that units with the same setting of the second index are grouped together.
  • In step S405, the display control unit 133b determines whether another classification criterion has been selected. When another classification criterion is selected (step S405: Yes), the acquisition unit 132 executes the process in step S402 again. When another classification criterion is not selected (step S405: No), the process proceeds to step S406.
  • In step S406, the display control unit 133b determines whether an operation to close the screen displaying the reference images has been performed. When the end operation has been performed (step S406: Yes), the display control unit 133b ends the process related to the setting of the second index. When the end operation has not been performed (step S406: No), the display control unit 133b executes the process in step S405 again.
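The S401–S406 flow of FIG. 29 can be sketched as a simple event loop. The function and handler names below are illustrative placeholders, not APIs from the patent; the handlers stand in for the display-control and acquisition operations.

```python
# Sketch of the FIG. 29 flow: show criteria (S401), acquire the second
# index and per-unit info (S402/S403), show the sorted image (S404),
# then loop on re-selection (S405) until the end operation (S406).

def run_reference_image_screen(events, handlers):
    handlers["show_classification_criteria"]()   # S401
    handlers["acquire_second_index"]()           # S402
    handlers["acquire_unit_info"]()              # S403
    handlers["show_sorted_display_image"]()      # S404
    for event in events:
        if event == "select_other_criterion":    # S405: Yes -> back to S402
            handlers["acquire_second_index"]()
            handlers["acquire_unit_info"]()
            handlers["show_sorted_display_image"]()
        elif event == "end_screen":              # S406: Yes -> finish
            break
    return "ended"

# Record which handlers fire, using no-op handlers that log their name.
log = []
handlers = {name: (lambda n=name: log.append(n)) for name in (
    "show_classification_criteria", "acquire_second_index",
    "acquire_unit_info", "show_sorted_display_image")}
run_reference_image_screen(["select_other_criterion", "end_screen"], handlers)
print(log)
```

With one criterion re-selection followed by the end operation, the S402–S404 handlers fire twice and the criteria screen is shown once, matching the loop in the flowchart.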
  • FIG. 31 is a diagram showing a configuration example of the information processing system according to the fourth embodiment.
  • the information processing system SYS includes a first terminal 30, a second terminal 40, a terminal device 80, an information processing device 100, and a culture system BS.
  • the first terminal 30, the second terminal 40, the terminal device 80, and the information processing device 100 are connected via the network N so as to be able to communicate with one another.
  • the network N may be any of the Internet, a mobile communication network, and a local network, and may be a network in which a plurality of types of these networks are combined.
  • the terminal device 80 is composed of a plurality of gateway devices 80a.
  • the gateway device 80a is connected to the culture system BS by wire or wirelessly.
  • The information processing system SYS has a configuration in which a plurality of culture system BSs are connected to the information processing device 100 via the terminal device 80; however, the present invention is not limited to this, and a single culture system BS may be connected to the information processing device 100 via the terminal device 80. Further, the information processing system SYS may be provided with a plurality of information processing devices 100 or with a single information processing device 100.
  • When a plurality of information processing devices 100 are provided, each information processing device 100 may include all of the various functions described in the above embodiments, or the functions may be distributed among them. That is, the information processing device 100 according to the present embodiment can be realized by cloud computing.
  • In the information processing system SYS, users can connect to the information processing device 100 from terminals (the first terminal 30, the second terminal 40, etc.) and view and operate observation results and the like using a browser.
  • As an acquisition unit, the information processing device 100 acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions. As an image generation unit, the information processing device 100 generates a display image showing, based on the classification criteria, a plurality of reference images relating to observation images of the plurality of storage units in which the plurality of objects are housed. As an output unit, the information processing device 100 outputs the display image generated by the image generation unit to the user terminal via the network N.
  • The information processing device 100 includes, for example, a computer system. The information processing device 100 reads an information processing program stored in memory and executes various processes according to the read information processing program. Such an information processing program causes a computer to, for example, acquire observation results obtained by imaging a plurality of objects under predetermined observation conditions, and show, on a display image based on the classification criteria, a plurality of reference images relating to observation images of the plurality of storage units in which the plurality of objects are housed. The information processing program may be recorded and provided on a computer-readable storage medium (for example, a non-transitory storage medium, or non-transitory tangible media).
  • The arithmetic unit 134 of the information processing apparatus 100 may display information visually expressing the analysis result as a graph (e.g., next to the reference images). The data plotted on one axis of the graph, etc., may be calculated from a plurality of observation images (e.g., by integration or averaging).
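The per-axis computation mentioned above can be illustrated with a small sketch. The function name, reduction methods, and measurement values are assumptions for illustration; the patent only says that axis data may be calculated from a plurality of observation images, e.g. by integration or averaging.

```python
# Reduce a per-observation-image measurement series to one value to be
# plotted on a graph axis, by averaging or by simple summation
# ("integration" here is taken as summation over the images).

def axis_value(per_image_values, method="average"):
    if method == "average":
        return sum(per_image_values) / len(per_image_values)
    if method == "integration":
        return sum(per_image_values)
    raise ValueError(method)

# Invented measurements from three observation images of one well.
print(axis_value([1.0, 2.0, 3.0]))                  # → 2.0
print(axis_value([1.0, 2.0, 3.0], "integration"))   # → 6.0
```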

Landscapes

  • Chemical & Material Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Zoology (AREA)
  • Wood Science & Technology (AREA)
  • Organic Chemistry (AREA)
  • Biotechnology (AREA)
  • Analytical Chemistry (AREA)
  • Sustainable Development (AREA)
  • Microbiology (AREA)
  • Biomedical Technology (AREA)
  • Biochemistry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Genetics & Genomics (AREA)
  • Medicinal Chemistry (AREA)
  • Apparatus Associated With Microorganisms And Enzymes (AREA)

Abstract

[Problem] To easily visually recognize information about a well. [Solution] This information processing device is provided with: an acquisition unit which acquires an observation result obtained by capturing images of a plurality of objects under a prescribed observation condition; and a control unit which shows, in a display image on the basis of a classification criterion, a plurality of reference images pertaining to observation images of a plurality of reception units in which the plurality of objects are respectively received.

Description

Information processing device, information processing method, information processing program, and information processing system
 The present invention relates to an information processing device, an information processing method, an information processing program, and an information processing system.
 Patent Document 1 discloses a technique for calculating cell density, based on a captured image of a colony of cells, from the area of the colony and the number of cells contained in the colony whose area has been calculated. Because analysis results produced with such a technique grow enormous through daily analysis, a technique that allows the user to easily view those analysis results is required.
International Publication No. 2015/193951
 According to a first aspect, there is provided an information processing device including: an acquisition unit that acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions; and a control unit that shows, on a display image based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects are respectively housed.
 According to a second aspect, there is provided an information processing method including: acquiring observation results obtained by imaging a plurality of objects under predetermined observation conditions; and showing, on a display image based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects are respectively housed.
 According to a third aspect, there is provided an information processing program that causes a computer to execute: acquiring observation results obtained by imaging a plurality of objects under predetermined observation conditions; and showing, on a display image based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects are respectively housed.
 According to a fourth aspect, there is provided an information processing system that outputs a display image to a user terminal by cloud computing, the system including a server, the server including: an acquisition unit that acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions; an image generation unit that generates a display image showing, based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects are respectively housed; and an output unit that outputs, via a network, the display image generated by the image generation unit to the user terminal.
A diagram showing an overall configuration example of an analysis system including the information processing apparatus according to the first embodiment.
A diagram showing a configuration example of the culture system connected to the information processing apparatus according to the first embodiment.
A block diagram showing a configuration example of the culture system connected to the information processing apparatus according to the first embodiment.
A diagram explaining an example of the connection relationship around the control unit of the culture system connected to the information processing apparatus according to the first embodiment.
A block diagram showing a functional configuration example of the information processing apparatus according to the first embodiment.
A diagram showing an example of information stored by the storage device according to the first embodiment.
Eleven diagrams each showing an example of a screen in group registration according to the first embodiment.
A flowchart showing an example of the processing flow in the information processing apparatus according to the first embodiment.
A flowchart showing an example of the flow of display switching processing of observation images according to the first embodiment.
A block diagram showing a functional configuration example of the information processing apparatus according to the second embodiment.
A diagram showing an example of reference images according to the second embodiment.
Three diagrams each showing an example of rearranging the reference images according to the second embodiment.
A flowchart showing an example of the processing flow in the information processing apparatus according to the second embodiment.
A block diagram showing a functional configuration example of the information processing apparatus according to the third embodiment.
A diagram showing an example of rearranging the reference images according to the third embodiment.
A diagram showing an example of a graph displayed together with the reference images according to the third embodiment.
A flowchart showing an example of the processing flow in the information processing apparatus according to the third embodiment.
A diagram showing a configuration example of the information processing system according to the fourth embodiment.
 Hereinafter, embodiments will be described with reference to the drawings. In the drawings, in order to explain the embodiments, the scale is changed as appropriate, for example by enlarging or emphasizing parts, and sizes and shapes may differ from those of the actual product.
[First Embodiment]
 The first embodiment will be described. FIG. 1 is a diagram showing an overall configuration example of an analysis system including the information processing apparatus according to the first embodiment. As shown in FIG. 1, the analysis system 1000 includes a culture system BS, an information processing device 100, and a storage device 110. The culture system BS includes a culture device 8 and an observation device 5. The analysis system 1000 is a system that cultures objects (for example, cells, samples, specimens, etc.), observes (images) the culturing process, and analyzes the observation results (e.g., captured images).
 The culture system BS, the information processing device 100, and the storage device 110 are connected via a network such as the Internet, a LAN (Local Area Network), or a WAN (Wide Area Network). The culture system BS, the information processing device 100, and the storage device 110 may also be connected via a network combining the Internet, a LAN, a WAN, and the like. Such a network is not limited to a wired network and may include a wireless network. Further, the information processing device 100 may be configured to include the storage device 110, and the culture system BS and the storage device 110 may be connected via a network.
 FIG. 2 is a diagram showing a configuration example of the culture system connected to the information processing apparatus according to the first embodiment. FIG. 3 is a block diagram showing a configuration example of the culture system connected to the information processing apparatus according to the first embodiment. FIG. 4 is a diagram illustrating an example of the connection relationship around the control unit of the culture system connected to the information processing apparatus according to the first embodiment.
 The culture system BS is roughly divided into a culture chamber 2 provided in the upper part of the housing 1, a stocker 3 that stores and holds a plurality of culture containers 10, an observation device 5 that observes (images) objects in the culture containers 10, and a transport unit (transport device) 4 that transports the culture containers 10. In addition, the culture system BS has a control unit (control device) 6 that controls the operation of the system, and an operation panel 7 provided with a display device. The culture chamber 2, the stocker 3, the transport unit 4, and the like correspond to the culture device 8.
 The culture chamber 2 is a chamber that forms the culture environment for observation objects such as cells in microscopic observation. The culture chamber 2 is provided with a temperature adjustment device 21, a humidifier 22, a gas supply device 23, a circulation fan 24, and an environment sensor 25. The temperature adjustment device 21 cooperates with the environment sensor 25 to adjust the temperature in the culture chamber 2 to a predetermined set temperature. The humidifier 22 cooperates with the environment sensor 25 to adjust the humidity in the culture chamber 2 to a predetermined set humidity. The gas supply device 23 cooperates with the environment sensor 25 to supply CO2 gas, N2 gas, O2 gas, and the like. The circulation fan 24 is a fan that, in cooperation with the environment sensor 25, circulates the gas (air) in the culture chamber 2 and adjusts its temperature. The environment sensor 25 detects the temperature, humidity, carbon dioxide concentration, nitrogen concentration, oxygen concentration, and the like of the culture chamber 2.
 The stocker 3 is formed as shelves partitioned front to back and top to bottom. A unique address is set for each shelf, for example. The culture container 10 is selected as appropriate according to the type and purpose of the object to be cultured. The culture container 10 may be, for example, a well plate, a flask, or a dish-type culture container. In this embodiment, the case of using a well plate is taken as an example. Objects are injected into the culture container 10 together with a liquid medium (culture solution) and held there. A code number, for example, is assigned to each culture container 10, and each culture container 10 is stored in association with a designated address of the stocker 3 according to its assigned code number. The transport unit 4 has a Z stage 41 movable up and down, a Y stage 42 movable back and forth, and an X stage 43 movable left and right, provided inside the culture chamber 2. A support arm 45 lifts and supports the culture container 10 on the tip side of the X stage 43.
 The observation device 5 has a first illumination unit 51, a second illumination unit 52, a third illumination unit 53, a macro observation system 54, a microscopic observation system 55, and the control unit 6. The first illumination unit 51 illuminates the object from below the sample stage 15. The second illumination unit 52 illuminates the object from above the sample stage 15 along the optical axis of the microscopic observation system 55. The third illumination unit 53 illuminates the object from below the sample stage 15 along the optical axis of the microscopic observation system 55. The macro observation system 54 performs macro observation of the object, and the microscopic observation system 55 performs microscopic observation of the object. A transparent window portion 16 made of glass or the like is provided in the sample stage 15 in the observation region of the microscopic observation system 55.
 The macro observation system 54 has an observation optical system 54a and an imaging device 54c, such as a CCD camera, that captures the image of the object formed by the observation optical system 54a. The macro observation system 54 acquires an overall observation image, from above, of the culture container 10 backlit by the first illumination unit 51. The microscopic observation system 55 has an observation optical system 55a, including an objective lens, an intermediate variable-magnification lens, a fluorescence filter, and the like, and an imaging device 55c, such as a cooled CCD camera, that captures the image of the object formed by the observation optical system 55a. A plurality of objective lenses and a plurality of intermediate variable-magnification lenses may be provided, and an arbitrary observation magnification can be set by changing the combination of lenses. The microscopic observation system 55 acquires a transmission image of the object illuminated by the second illumination unit 52, a reflection image of the object illuminated by the third illumination unit 53, and a fluorescence image of the object illuminated by the third illumination unit 53. In other words, the microscopic observation system 55 acquires microscopic observation images of the objects in the culture container 10 observed under a microscope.
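As a rough illustration of setting the observation magnification by lens combination: in a typical microscope the combined magnification is the product of the objective and the intermediate variable-magnification lens. The lens values below are assumptions, not ones specified in this document.

```python
# Illustrative only: combined observation magnification as the product
# of the objective and intermediate variable-magnification lens.

def observation_magnification(objective, intermediate):
    return objective * intermediate

# e.g. a 10x objective combined with a 2x intermediate lens
print(observation_magnification(10, 2))  # → 20
```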
 The control unit 6 processes the signals input from the imaging device 54c of the macro observation system 54 and the imaging device 55c of the microscopic observation system 55, and generates images such as the overall observation image and the microscopic observation image. The control unit 6 also applies image analysis to the overall observation image and the microscopic observation image to generate time-lapse images. The control unit 6 outputs the generated images to the information processing device 100 and stores them in the storage device 110.
 The control unit 6 has a CPU (Central Processing Unit, i.e., a processor) 61, a ROM (Read Only Memory) 62, and a RAM (Random Access Memory) 63. The CPU 61 supervises the control unit 6 and executes various processes in the control unit 6. The ROM 62 stores a control program for the culture system BS, control data, and the like. The RAM 63 includes an auxiliary storage device such as a hard disk or a DVD (Digital Versatile Disc), and temporarily stores observation conditions, image data, and the like. The component devices such as the culture chamber 2, the transport unit 4, the observation device 5, and the operation panel 7 are connected to the control unit 6 (see FIG. 3).
 The RAM 63 stores, for example, the environmental conditions of the culture chamber 2 according to the observation program, the observation schedule, and the observation type, observation position, observation magnification, and the like in the observation device 5. The RAM 63 also includes a storage area for storing the image data captured by the observation device 5, and stores the image data in association with index data including the code number of the culture container 10, the shooting date and time, and the like. The operation panel 7 has an operation panel (operation unit, input unit) 71 and a display panel 72. The operation panel 71 includes input/output devices (operation unit, input unit) such as a keyboard, a mouse, and switches. The user operates the operation panel 71 to input observation program settings, condition selections, operation commands, and the like. The communication unit 65 is configured in accordance with a wired or wireless communication standard, and transmits and receives data to and from external devices (e.g., a server, a user's client terminal, etc.) connected to the observation device 5, the culture system BS, or the control unit 6. The various kinds of information stored in the RAM 63 can be stored in the storage device 110 as appropriate via the information processing device 100.
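The association described above between image data and index data can be sketched as a simple record; the dict layout, path, and code number are assumptions for illustration, not the patent's actual storage format.

```python
# Sketch of one stored record: image data keyed to index data
# (culture-container code number, shooting date and time).
from datetime import datetime

def make_record(image_path, code_number, shot_at):
    return {
        "image": image_path,
        "index": {"code_number": code_number,
                  "shot_at": shot_at.isoformat()},
    }

# Invented example values.
record = make_record("obs/0001.png", "C-0001", datetime(2019, 9, 27, 9, 30))
print(record["index"])
```

Looking up images by code number or by shooting date then reduces to filtering on the index data of such records.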
 FIG. 5 is a block diagram showing a functional configuration example of the information processing device according to the first embodiment. As shown in FIG. 5, the information processing device 100 has a communication unit 131, an acquisition unit 132, a display control unit 133, an arithmetic unit 134, and a registration unit 135. An input unit 121 and a display unit 122 are connected to the information processing device 100.
 The input unit 121 receives various operations by the user of the information processing device 100 and outputs control signals corresponding to the user operations. The input unit 121 is composed of, for example, a mouse and a keyboard. The display unit 122 displays and outputs various information (including images) according to the user operations on the input unit 121. The display unit 122 is composed of, for example, a display. The input unit 121 and the display unit 122 may be configured integrally; that is, they may be configured as a portable terminal (e.g., a tablet terminal) having a touch panel on which input operations are performed directly on the various information displayed on the display unit 122.
 通信部131は、ネットワークを介して、培養システムBS、及び記憶装置110と通信し、各種情報を送受信する。通信部131は、例えば、ネットワークを介して、観察条件及び観察結果に関する情報を培養システムBSから受信する。また、通信部131は、例えば、ネットワークを介して、観察条件及び観察結果に関する情報を記憶装置110との間で送受信する。 The communication unit 131 communicates with the culture system BS and the storage device 110 via the network to transmit and receive various information. The communication unit 131 receives information on observation conditions and observation results from the culture system BS, for example, via a network. Further, the communication unit 131 transmits / receives information on observation conditions and observation results to / from the storage device 110 via a network, for example.
 取得部132は、複数の収容部を有する容器に収容された複数の対象物を所定の観察条件において撮像して得られた観察結果を取得する。例えば、取得部132は、ネットワーク又は通信部131を介して、記憶装置110が記憶する培養システムBSにおける種々の観察結果に関する情報を、記憶装置110から適宜取得する。また、取得部132は、観察結果に関する情報だけではなく、観察条件に関する情報についても記憶装置110から適宜取得することができる。 The acquisition unit 132 acquires the observation results obtained by imaging a plurality of objects housed in a container having a plurality of storage units under predetermined observation conditions. For example, the acquisition unit 132 appropriately acquires information on various observation results in the culture system BS stored in the storage device 110 from the storage device 110 via the network or the communication unit 131. Further, the acquisition unit 132 can appropriately acquire not only the information regarding the observation result but also the information regarding the observation condition from the storage device 110.
 図6は、第1実施形態に係る記憶装置が記憶する情報例を示す図である。図6に示すように、記憶装置110は、データ項目として、実験番号、実験名、実験責任者、実験担当者、観察開始日時、観察終了日時、顕微鏡名、倍率、容器製品、容器種別、判定結果、ステータス、アプリ番号、及びアプリ名を含む。実験番号は、実験毎に一意に割り当てられる識別番号を示す情報である。実験番号には、例えば、「Exp00001」等の情報が記憶される。実験名は、実験の名称を示す情報である。実験名には、例えば、「BS-T00001」等の情報が記憶される。実験責任者は、実験の責任者の氏名を示す情報である。実験責任者には、例えば、「責任者A」等の情報が記憶される。実験担当者は、実験の担当者の氏名を示す情報である。実験担当者には、例えば、「担当E」等の情報が記憶される。なお、実験番号、実験名、実験責任者、及び実験担当者のデータ項目は、単にそれぞれ、番号、名称、責任者、及び担当者のデータ項目としてもよい。例えば、実験工程に限らず、培養工程に利用される場合は、培養番号、培養名、培養責任者、及び培養担当者のデータ項目であってもよい。 FIG. 6 is a diagram showing an example of information stored in the storage device according to the first embodiment. As shown in FIG. 6, the storage device 110 includes, as data items, an experiment number, an experiment name, an experiment supervisor, an experiment person in charge, an observation start date and time, an observation end date and time, a microscope name, a magnification, a container product, a container type, a determination result, a status, an application number, and an application name. The experiment number is information indicating an identification number uniquely assigned to each experiment. For example, information such as "Exp00001" is stored in the experiment number. The experiment name is information indicating the name of the experiment. For example, information such as "BS-T00001" is stored in the experiment name. The experiment supervisor is information indicating the name of the person responsible for the experiment. For example, information such as "supervisor A" is stored in the experiment supervisor. The experiment person in charge is information indicating the name of the person in charge of the experiment. For example, information such as "person in charge E" is stored in the experiment person in charge. The data items of the experiment number, the experiment name, the experiment supervisor, and the experiment person in charge may simply be data items of a number, a name, a supervisor, and a person in charge, respectively. For example, when used not only in an experiment process but also in a culture process, they may be data items of a culture number, a culture name, a culture supervisor, and a culture person in charge.
 観察開始日時は、観察を開始した日時を示す情報である。観察開始日時には、例えば、「2019/08/25 09:00:00」等の情報が記憶される。観察終了日時は、観察を終了した日時を示す情報である。観察終了日時には、例えば、「2019/08/26 15:15:25」等の情報が記憶される。顕微鏡名は、観察で利用した顕微鏡の名称を示す情報である。顕微鏡名には、例えば、「顕微鏡H」等の情報が記憶される。倍率は、観察のときに設定した顕微鏡の倍率を示す情報である。倍率には、例えば、「8x」等の情報が記憶される。容器製品は、対象物を収容する収容部(例えば、ウェル、ディッシュ等)を複数有する容器(例えば、ウェルプレート等)のメーカ名を示す情報である。容器製品には、例えば、「製品タイプK」等の情報が記憶される。 The observation start date and time is information indicating the date and time when the observation was started. Information such as "2019/08/25 09:00:00" is stored in the observation start date and time. The observation end date and time is information indicating the date and time when the observation was completed. Information such as "2019/08/26 15:15:25" is stored in the observation end date and time. The microscope name is information indicating the name of the microscope used in the observation. Information such as "microscope H" is stored in the microscope name. The magnification is information indicating the magnification of the microscope set at the time of observation. Information such as "8x" is stored in the magnification. The container product is information indicating the manufacturer name of a container (for example, a well plate) having a plurality of accommodating portions (for example, wells, dishes, etc.) for accommodating an object. Information such as "product type K" is stored in the container product.
 容器種別は、対象物を収容する収容部(例えば、ウェル等)を複数有する容器(例えば、ウェルプレート等)の種別を示す情報である。容器種別には、例えば、「6WP(Well Plate)」等の情報が記憶される。判定結果は、実験に対するユーザの判定を示す情報である。判定結果には、例えば、「OK」又は「NG」等の情報が記憶される。ステータスは、観察結果の解析の進捗を示す情報である。ステータスには、例えば、「完了」又は「60%」等の情報が記憶される。アプリ番号は、観察結果の解析で利用されるアプリケーションパッケージ毎に一意に割り当てられる識別番号を示す情報である。アプリ番号には、例えば、「App.00001」等の情報が記憶される。アプリ名は、アプリケーションパッケージの名称を示す情報である。アプリ名には、例えば、「AppX」等の情報が記憶される。なお、アプリケーションパッケージとしては、例えば、画像解析のアプリ、対象物の面積を算出するアプリ、対象物の数を算出するアプリ等が存在する。 The container type is information indicating the type of a container (for example, a well plate) having a plurality of accommodating portions (for example, wells or the like) for accommodating an object. Information such as "6WP (Well Plate)" is stored in the container type. The determination result is information indicating the user's determination for the experiment. Information such as "OK" or "NG" is stored in the determination result. The status is information indicating the progress of analysis of the observation result. Information such as "completed" or "60%" is stored in the status. The application number is information indicating an identification number uniquely assigned to each application package used in the analysis of the observation result. Information such as "App.00001" is stored in the application number. The application name is information indicating the name of the application package. Information such as "AppX" is stored in the application name. As the application package, for example, there are an image analysis application, an application for calculating the area of an object, an application for calculating the number of objects, and the like.
 また、記憶装置110は、上記の情報のほかにも、実験番号及びコード番号等に対応付けて、対象物がそれぞれ収容される複数の収容部の観察画像を記憶する。観察画像は、例えば、上述した全体観察画像や顕微観察画像に相当する。従って、取得部132は、観察画像についても記憶装置110から適宜取得することができる。また、記憶装置110は、後述するグループ情報についても記憶する。 In addition to the above information, the storage device 110 stores observation images of a plurality of storage units in which the objects are housed in association with the experiment number, the code number, and the like. The observation image corresponds to, for example, the above-mentioned general observation image or microscopic observation image. Therefore, the acquisition unit 132 can appropriately acquire the observation image from the storage device 110 as well. The storage device 110 also stores group information described later.
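As a purely illustrative sketch of how the data items of FIG. 6 and the per-well observation images might be modeled, the following Python record mirrors the fields described above; the class name, field names, and sample values are hypothetical stand-ins and not part of the disclosed embodiment.

```python
from dataclasses import dataclass, field

@dataclass
class ObservationRecord:
    """One experiment entry in the storage device 110 (illustrative only)."""
    experiment_number: str   # e.g. "Exp00001", unique per experiment
    experiment_name: str     # e.g. "BS-T00001"
    supervisor: str          # e.g. "Supervisor A"
    person_in_charge: str    # e.g. "Person E"
    observation_start: str   # e.g. "2019/08/25 09:00:00"
    observation_end: str     # e.g. "2019/08/26 15:15:25"
    microscope: str          # e.g. "Microscope H"
    magnification: str       # e.g. "8x"
    container_product: str   # e.g. "Product type K"
    container_type: str      # e.g. "6WP" (6-well plate)
    determination: str       # user's determination: "OK" / "NG"
    status: str              # analysis progress: "completed" or e.g. "60%"
    app_number: str          # e.g. "App.00001"
    app_name: str            # e.g. "AppX"
    # observation images of the accommodating units, keyed by well position
    images: dict = field(default_factory=dict)

record = ObservationRecord(
    "Exp00001", "BS-T00001", "Supervisor A", "Person E",
    "2019/08/25 09:00:00", "2019/08/26 15:15:25",
    "Microscope H", "8x", "Product type K", "6WP",
    "OK", "completed", "App.00001", "AppX",
)
```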
 図5の説明に戻る。表示制御部133は、表示部122に表示する表示画像を生成し、生成した表示画像を表示部122に表示する。表示制御部133は、表示部122に表示する種々の表示画像を生成し、生成した表示画像を表示部122に表示するが、本実施形態では主に、後述する登録部135で実行されるグループの登録に関わる表示画像を生成する。また、表示制御部133は、表示画像の生成に関して、演算を要する情報については演算部134から取得する。すなわち、演算部134は、観察結果に基づく演算を実行する。換言すると、表示制御部133は、観察条件及び観察結果等の生データについては記憶装置110から取得し、観察結果をもとにした演算処理結果の情報については演算部134から取得する。本実施形態に係る表示制御部133による処理の詳細については後述する。 Returning to FIG. 5, the display control unit 133 generates a display image to be displayed on the display unit 122 and displays the generated display image on the display unit 122. The display control unit 133 generates and displays various display images on the display unit 122; in the present embodiment, it mainly generates display images related to the group registration executed by the registration unit 135, which will be described later. Further, regarding the generation of display images, the display control unit 133 acquires information that requires calculation from the calculation unit 134. That is, the calculation unit 134 executes calculations based on the observation results. In other words, the display control unit 133 acquires raw data such as observation conditions and observation results from the storage device 110, and acquires information on the calculation processing results based on the observation results from the calculation unit 134. Details of the processing by the display control unit 133 according to the present embodiment will be described later.
 登録部135は、観察条件又は観察結果から得られる分類基準と、観察結果とに基づいて、2以上の収容部を同一のグループとして登録する。分類基準は、例えば、収容部に投入する培養液の種類及び量を含む観察条件の少なくとも1つである。分類基準は、例えば、収容部に投入する培養液に含まれる血清の種類、濃度及び量を含む観察条件の少なくとも1つである。分類基準は、例えば、収容部に投入する薬剤の種類、濃度、暴露期間、及び暴露タイミングを含む観察条件の少なくとも1つである。分類基準は、例えば、収容部に投入する対象物の種類及び数を含む観察条件の少なくとも1つである。分類基準は、例えば、顕微鏡名、倍率、容器が配置された空間(例えば、培養室2)における温度設定、湿度設定、雰囲気の供給設定、及び光の出力設定を含む観察条件の少なくとも1つである。分類基準は、例えば、対象物の数、数の経時変化、数の倍加時間、移動量、及び形態変化を含む観察結果の少なくとも1つである。分類基準は、例えば、対象物の専有面積、及び対象物の周囲長を含む観察結果の少なくとも1つである。分類基準は、例えば、観察結果(画像)の解析後の輝度値を用いてもよい。例えば、画像の解析後の輝度値を用いる場合は、コロニーの細胞密度(粗密)に対して輝度値の平均値を「5」等とし、平均値を基準値とした場合、基準値以上と基準値未満とで分類することができる。すなわち、登録部135は、上述した分類基準の1つ、又は分類基準の組み合わせと、観察結果とが、同一となった(類似する、関連する)2以上の収容部を同一のグループとして登録する。登録部135は、グループ情報について、記憶装置110に格納する。また、例えば、分類基準は、後述するグループ登録やグループ表示に用いる特徴又は指標を含む。 The registration unit 135 registers two or more accommodating units as the same group based on the observation results and on a classification criterion obtained from the observation conditions or the observation results. The classification criterion is, for example, at least one of the observation conditions including the type and amount of the culture solution to be put into the accommodating unit. The classification criterion is, for example, at least one of the observation conditions including the type, concentration, and amount of serum contained in the culture solution to be put into the accommodating unit. The classification criterion is, for example, at least one of the observation conditions including the type, concentration, exposure period, and exposure timing of the drug to be put into the accommodating unit. The classification criterion is, for example, at least one of the observation conditions including the type and number of objects to be put into the accommodating unit. The classification criterion is, for example, at least one of the observation conditions including the microscope name, the magnification, and the temperature setting, humidity setting, atmosphere supply setting, and light output setting of the space in which the container is arranged (for example, the culture chamber 2). The classification criterion is, for example, at least one of the observation results including the number of objects, the change in the number over time, the doubling time of the number, the amount of movement, and the morphological change. The classification criterion is, for example, at least one of the observation results including the occupied area of the object and the peripheral length of the object. As the classification criterion, for example, a brightness value obtained after analysis of the observation result (image) may be used. For example, when the brightness value after image analysis is used, the average brightness value with respect to the cell density (coarseness/fineness) of a colony may be set to "5" or the like; with this average value as a reference value, the accommodating units can be classified into those at or above the reference value and those below it. That is, the registration unit 135 registers, as the same group, two or more accommodating units whose observation results are the same (similar or related) with respect to one of the above-described classification criteria or a combination of the classification criteria. The registration unit 135 stores the group information in the storage device 110. Further, for example, the classification criteria include features or indices used for group registration and group display, which will be described later.
 グループ登録で利用される情報には、観察結果の1つである収容部を撮像した観察画像が含まれてもよい。すなわち、登録部135は、観察の任意の過程(期間)において、観察画像が同一となった(類似する、関連する)2以上の収容部を同一のグループとして登録する。また、グループ登録で利用される情報には、観察結果を視覚的に表現した情報が含まれてもよい。すなわち、登録部135は、観察結果をグラフ化した情報に基づいて、グラフ化した情報が同一となった(類似する、関連する)2以上の収容部を同一のグループとして登録する。 The information used in the group registration may include an observation image of the accommodating portion, which is one of the observation results. That is, the registration unit 135 registers two or more accommodating units having the same (similar, related) observation images as the same group in an arbitrary process (period) of observation. Further, the information used in the group registration may include information that visually expresses the observation result. That is, the registration unit 135 registers two or more accommodating units having the same (similar, related) graphed information as the same group based on the graphed information of the observation results.
 なお、グループ登録に関しては、観察結果を利用せずに実施することができる。登録部135は、例えば、任意のタイミングで、観察結果を利用せずにグループ登録を実行する。任意のタイミングは、例えば、観察前、観察中、及び/又は観察後である。一例として、観察結果を利用しないグループ登録は、観察前に(観察条件に関する分類基準が設定されたタイミングで)行うことができる。すなわち、登録部135は、観察条件に関する分類基準に基づいて、2以上の収容部を同一のグループとして登録してもよい。観察条件に関する分類基準は、ユーザによって予め決定されればよい。 Note that group registration can be carried out without using the observation results. For example, the registration unit 135 executes group registration at an arbitrary timing without using the observation result. Arbitrary timing is, for example, before, during, and / or after observation. As an example, group registration that does not use the observation results can be performed before observation (at the timing when the classification criteria for observation conditions are set). That is, the registration unit 135 may register two or more accommodating units as the same group based on the classification criteria regarding the observation conditions. The classification criteria for the observation conditions may be determined in advance by the user.
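The grouping performed by the registration unit 135 can be illustrated with a minimal sketch. Assuming a brightness-based classification criterion with a reference value of "5", as in the example above, the function below partitions accommodating units (wells) into an at-or-above-reference group and a below-reference group; the function name, key names, and data layout are hypothetical, not the embodiment's implementation.

```python
def register_groups(wells, criterion, reference):
    """Group accommodating units by whether the value of the chosen
    classification criterion is at or above the reference value.

    wells: mapping of well position -> dict of criterion values
    criterion: key such as "mean_brightness"
    reference: reference value (e.g. the average brightness "5")
    Returns group information that could be stored in the storage device 110.
    """
    groups = {"at_or_above_reference": [], "below_reference": []}
    for position, values in sorted(wells.items()):
        if values[criterion] >= reference:
            groups["at_or_above_reference"].append(position)
        else:
            groups["below_reference"].append(position)
    # keep only groups containing two or more accommodating units
    return {name: members for name, members in groups.items()
            if len(members) >= 2}

wells = {
    "A1": {"mean_brightness": 6.2},
    "A2": {"mean_brightness": 5.0},
    "B1": {"mean_brightness": 3.1},
    "B2": {"mean_brightness": 2.8},
}
groups = register_groups(wells, "mean_brightness", 5)
```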
 図7から図17は、第1実施形態に係る分類基準を用いたグループ登録における画面例を示す図である。また、図7から図17の説明において、適宜、表示制御部133、演算部134、及び登録部135における処理を説明する。図7に示すように、表示制御部133は、観察結果の検索画面を示す表示画像を表示部122に表示する。観察結果の検索画面には、キーワード検索を行うテキスト入力部KSaと、予め決定された検索条件の中から選択することにより検索を行う条件検索部FSa、FSb、FSc、FSdと、検索を実行する検索ボタンSBとが含まれる。条件検索部FSaは、例えば、実験担当者の氏名を選択するプルダウンである。条件検索部FSbは、例えば、ステータスを選択するプルダウンである。条件検索部FScは、例えば、観察開始日と観察終了日とを選択するプルダウンである。条件検索部FSdは、例えば、アプリケーション名を選択するプルダウンである。なお、条件検索部FSa、FSb、FSc、FSdは、テキスト入力により実現してもよい。また、検索条件は、上記に限られない。 FIGS. 7 to 17 are diagrams showing screen examples in group registration using the classification criteria according to the first embodiment. In the description of FIGS. 7 to 17, the processing in the display control unit 133, the calculation unit 134, and the registration unit 135 will be described as appropriate. As shown in FIG. 7, the display control unit 133 displays a display image showing an observation result search screen on the display unit 122. The observation result search screen includes a text input unit KSa for performing a keyword search, condition search units FSa, FSb, FSc, and FSd for performing a search by selecting from predetermined search conditions, and a search button SB for executing the search. The condition search unit FSa is, for example, a pull-down for selecting the name of the experiment person in charge. The condition search unit FSb is, for example, a pull-down for selecting a status. The condition search unit FSc is, for example, a pull-down for selecting an observation start date and an observation end date. The condition search unit FSd is, for example, a pull-down for selecting an application name. The condition search units FSa, FSb, FSc, and FSd may be implemented by text input. The search conditions are not limited to the above.
 ユーザは、入力部121を操作し、テキスト入力部KSaに対するテキスト入力、条件検索部FSa、FSb、FSc、FSdを用いた選択を行い、検索ボタンSBを押下する。又は、ユーザは、入力部121を操作し、テキスト入力や条件検索の選択を行わずに、検索ボタンSBを押下する。検索ボタンSBの押下に応じて、表示制御部133は、検索条件に該当する情報を記憶装置110から取得し、検索結果SRを示す表示画像を表示部122に表示する。つまり、表示制御部133は、検索条件に基づく観察結果の一覧、又は全ての観察結果の一覧を示す表示画像を表示部122に表示する。検索結果SRには、例えば、実験名、実験責任者、実験担当者、観察開始日時、観察終了日時、顕微鏡名、倍率、容器製品、容器種別、アプリ名、判定、及びステータスのデータ項目に対する情報が含まれる。なお、検索結果SRのデータ項目は、上記に限られない。また、検索結果SRは、データ項目の指示等によりソートすることができる。ユーザは、観察結果の確認のために、検索結果SRの中から選択する(指定する)操作を行う。 The user operates the input unit 121, performs text input into the text input unit KSa and selections using the condition search units FSa, FSb, FSc, and FSd, and presses the search button SB. Alternatively, the user operates the input unit 121 and presses the search button SB without performing text input or selecting search conditions. In response to the pressing of the search button SB, the display control unit 133 acquires the information corresponding to the search conditions from the storage device 110 and displays a display image showing the search results SR on the display unit 122. That is, the display control unit 133 displays, on the display unit 122, a display image showing a list of observation results based on the search conditions, or a list of all observation results. The search results SR include, for example, information on the data items of the experiment name, the experiment supervisor, the experiment person in charge, the observation start date and time, the observation end date and time, the microscope name, the magnification, the container product, the container type, the application name, the determination, and the status. The data items of the search results SR are not limited to the above. Further, the search results SR can be sorted by designating a data item or the like. The user performs an operation of selecting (designating) an observation result from the search results SR in order to confirm the observation result.
 次に、図8に示すように、表示制御部133は、検索結果SRの中から選択された観察結果を表示画像に示す。観察結果の表示画像には、例えば、観察に関する情報ExpI、撮影条件(観察条件)に関する情報ScI、観察画像を含むプレートマップPmI、及び観察におけるイベントに関する情報EvIが含まれる。これらのうち、プレートマップPmIには、観察画像OI、グループ名GN、時系列切替コンテンツTS、及び深さ切替コンテンツDPが含まれる。観察画像OIには、例えば、選択された観察結果に含まれる観察画像が表示される。グループ名GNには、例えば、登録済みのグループ名が表示される。時系列切替コンテンツTSには、例えば、観察画像を時系列で切り替えるコンテンツが表示される。深さ切替コンテンツDPには、例えば、収容部の深さに対応する観察画像を切り替えるコンテンツが表示される。 Next, as shown in FIG. 8, the display control unit 133 shows the observation result selected from the search result SR in the display image. The display image of the observation result includes, for example, information ExpI regarding observation, information ScI regarding imaging conditions (observation conditions), plate map PmI including observation images, and information EvI regarding events in observation. Among these, the plate map PmI includes an observation image OI, a group name GN, a time-series switching content TS, and a depth switching content DP. In the observation image OI, for example, an observation image included in the selected observation result is displayed. For example, a registered group name is displayed in the group name GN. In the time-series switching content TS, for example, content for switching observation images in chronological order is displayed. In the depth switching content DP, for example, content for switching the observation image corresponding to the depth of the accommodating portion is displayed.
 時系列切替コンテンツTSは、例えば、一定期間ごとに区切られた複数の矩形で表現される。ユーザは、入力部121を操作し、時系列切替コンテンツTSの矩形を選択することができる。表示制御部133は、時系列切替コンテンツTSの矩形が選択されると、該当する期間に対応する観察画像を表示画像として、プレートマップPmIの表示を切り替える。このとき、表示制御部133は、選択された矩形における観察日時に関する情報を表示してもよい。なお、時系列切替コンテンツTSを用いた観察画像の切り替えは、矩形の選択に限られず、矩形の選択位置を移動させる時系列切替コンテンツTSaにより実現してもよい。 The time-series switching content TS is represented by, for example, a plurality of rectangles separated at regular intervals. The user can operate the input unit 121 to select the rectangle of the time-series switching content TS. When the rectangle of the time-series switching content TS is selected, the display control unit 133 switches the display of the plate map PmI using the observation image corresponding to the corresponding period as the display image. At this time, the display control unit 133 may display information regarding the observation date and time in the selected rectangle. The switching of the observation image using the time-series switching content TS is not limited to the selection of the rectangle, and may be realized by the time-series switching content TSa that moves the selection position of the rectangle.
 深さ切替コンテンツDPは、例えば、収容部に対する一定の深さ(厚さ、Z方向)ごとに区切られた矩形で表現される。ユーザは、入力部121を操作し、深さ切替コンテンツDPの矩形を選択することができる。表示制御部133は、深さ切替コンテンツDPの矩形が選択されると、該当する収容部の深さに対応する観察画像を表示画像として、プレートマップPmIの表示を切り替える。なお、深さ切替コンテンツDPを用いた観察画像の切り替えは、矩形の選択に限られず、矩形の選択位置を移動させる深さ切替コンテンツDPaにより実現してもよい。 The depth switching content DP is represented by, for example, a rectangle divided by a certain depth (thickness, Z direction) with respect to the accommodating portion. The user can operate the input unit 121 to select the rectangle of the depth switching content DP. When the rectangle of the depth switching content DP is selected, the display control unit 133 switches the display of the plate map PmI using the observation image corresponding to the depth of the corresponding accommodating unit as the display image. The switching of the observation image using the depth switching content DP is not limited to the selection of the rectangle, and may be realized by the depth switching content DPa that moves the selection position of the rectangle.
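The time-series switching content TS and the depth switching content DP described above amount to indexing the stored observation images by a (time period, depth) pair. The following is a minimal illustrative sketch of that indexing; the class name and image identifiers are hypothetical and not the embodiment's implementation.

```python
class PlateMapViewer:
    """Illustrative sketch: switch the displayed observation image by the
    selected time-period rectangle (TS) and depth rectangle (DP)."""

    def __init__(self, images):
        # images: dict keyed by (time_index, depth_index) -> image identifier
        self.images = images
        self.time_index = 0
        self.depth_index = 0

    def select_time(self, index):
        """Select a rectangle of the time-series switching content TS."""
        self.time_index = index
        return self.current_image()

    def select_depth(self, index):
        """Select a rectangle of the depth switching content DP."""
        self.depth_index = index
        return self.current_image()

    def current_image(self):
        return self.images[(self.time_index, self.depth_index)]

viewer = PlateMapViewer(
    {(t, z): f"OI_t{t}_z{z}" for t in range(3) for z in range(2)}
)
img = viewer.select_time(2)    # now showing "OI_t2_z0"
img = viewer.select_depth(1)   # now showing "OI_t2_z1"
```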
 つまり、表示制御部133は、ユーザによって選択されたグループの観察結果を、複数の収容部を有する容器に関する表示画像に対して示し、該表示画像を表示部122に表示する。また、表示制御部133は、検索条件が使用されない場合、一覧(全ての一覧)からユーザによって選択された観察結果を、複数の収容部を有する容器に関する表示画像に対して示し、該表示画像を表示部122に表示する。また、表示制御部133は、検索条件が使用される場合、検索条件に基づく一覧からユーザによって選択された観察結果を、複数の収容部を有する容器に関する表示画像に対して示し、該表示画像を表示部122に表示する。そして、表示制御部133は、観察結果に含まれる収容部の観察画像を、表示画像に対して示し、該表示画像を表示部122に表示する。 That is, the display control unit 133 shows the observation result of the group selected by the user with respect to the display image relating to the container having the plurality of storage units, and displays the display image on the display unit 122. Further, when the search condition is not used, the display control unit 133 indicates the observation result selected by the user from the list (all lists) with respect to the display image relating to the container having a plurality of storage units, and displays the display image. It is displayed on the display unit 122. Further, when the search condition is used, the display control unit 133 indicates the observation result selected by the user from the list based on the search condition with respect to the display image relating to the container having a plurality of storage units, and displays the display image. It is displayed on the display unit 122. Then, the display control unit 133 shows the observation image of the accommodating unit included in the observation result with respect to the display image, and displays the display image on the display unit 122.
 次に、ユーザは、入力部121を操作し、グループ登録のために、編集ボタンEB(図9参照)を押下する。ここで、編集とは、新規のグループを登録することにより、グループ情報が編集されることを含む。図9に示すように、表示制御部133は、編集ボタンEBが押下されたことを示す信号を受け付ける。そして、図10に示すように、表示制御部133は、観察画像OI、及び新規のグループを追加するグループ追加ボタンAGを含む表示画像を表示する。続いて、図11に示すように、ユーザは、入力部121を操作し、新規のグループに追加したい収容部に対応する2以上の観察画像OIを選択し、グループ追加ボタンAGを押下する。ここで、表示制御部133は、観察画像OIの選択に応じて、観察画像OIを所定色で反転させた表示画像を表示する。これにより表示制御部133はグループ追加ボタンAGの押下を示す信号を受信し、図12に示すように、表示制御部133は、新規のグループのグループ名を入力(又は編集)するテキスト入力部KSbと、グループ登録を実行する登録ボタンRBとを含む表示画像を表示する。加えて、表示制御部133は、グループ登録の対象となる観察画像OIについて、色、枠や線等の強調情報により明示する。なお、テキスト入力部KSbの近傍には、観察結果等をグループごとに異なる色で表現するグループカラーコンテンツGCが配置される。 Next, the user operates the input unit 121 and presses the edit button EB (see FIG. 9) for group registration. Here, editing includes editing the group information by registering a new group. As shown in FIG. 9, the display control unit 133 receives a signal indicating that the edit button EB has been pressed. Then, as shown in FIG. 10, the display control unit 133 displays a display image including the observation images OI and a group addition button AG for adding a new group. Subsequently, as shown in FIG. 11, the user operates the input unit 121, selects two or more observation images OI corresponding to the accommodating units to be added to the new group, and presses the group addition button AG. Here, in response to the selection of each observation image OI, the display control unit 133 displays a display image in which the selected observation image OI is inverted in a predetermined color. The display control unit 133 thereby receives a signal indicating that the group addition button AG has been pressed, and, as shown in FIG. 12, displays a display image including a text input unit KSb for inputting (or editing) the group name of the new group and a registration button RB for executing the group registration. In addition, the display control unit 133 clearly indicates the observation images OI to be registered in the group by highlighting information such as a color, a frame, or a line. In the vicinity of the text input unit KSb, a group color content GC that expresses observation results and the like in a different color for each group is arranged.
 次に、図13に示すように、ユーザは、入力部121を操作し、グループカラーコンテンツGCの選択と、テキスト入力部KSbへのグループ名の入力とを行い、登録ボタンRBを押下する。これにより表示制御部133は登録ボタンRBの押下を示す信号を受信し、図14に示すように、表示制御部133は、グループ登録の実行を確認する画像を示す確認画像CIを含む表示画像を表示する。ユーザは、分類基準を用いたグループ登録を実行させる場合に、入力部121を操作し、確認画像CIに含まれる登録ボタンを押下する。一方、ユーザは、グループ登録を実行させない場合に、入力部121を操作し、確認画像CIに含まれるキャンセルボタンを押下する。図15に示すように、表示制御部133は、登録ボタンが押下されると、グループ登録が完了したことを示す完了画像CSを含む表示画像を表示する。このとき、登録部135は、新規のグループに関するグループ情報を記憶装置110に格納する。その後、図16に示すように、表示制御部133は、プレートマップPmIのグループ名GNに、新規に登録されたグループ名を含む表示画像を表示する。情報処理装置100は、以上により、グループの登録を完了する。 Next, as shown in FIG. 13, the user operates the input unit 121, selects the group color content GC, inputs the group name into the text input unit KSb, and presses the registration button RB. The display control unit 133 thereby receives a signal indicating that the registration button RB has been pressed, and, as shown in FIG. 14, displays a display image including a confirmation image CI for confirming the execution of the group registration. When executing the group registration using the classification criteria, the user operates the input unit 121 and presses the registration button included in the confirmation image CI. On the other hand, when not executing the group registration, the user operates the input unit 121 and presses the cancel button included in the confirmation image CI. As shown in FIG. 15, when the registration button is pressed, the display control unit 133 displays a display image including a completion image CS indicating that the group registration has been completed. At this time, the registration unit 135 stores the group information on the new group in the storage device 110. Thereafter, as shown in FIG. 16, the display control unit 133 displays a display image in which the newly registered group name is included in the group names GN of the plate map PmI. The information processing device 100 thus completes the group registration.
 ユーザは、入力部121を操作し、プレートマップPmIにおいてグループ名GNから登録済みのグループを選択する。これにより表示制御部133はグループ名GNからのグループの選択を示す信号を受信し、図17に示すように、表示制御部133は、選択されたグループ名に対応する解析結果を示す解析結果ARと、倍加時間を示す倍加時間DTとを含む表示画像を表示する。表示制御部133は、例えば、解析結果AR及び倍加時間DTの表示に際して、演算部134から演算処理結果を受け付けて表示してもよい。図17に示す解析結果ARは、グループBに属する収容部の対象物の数の平均をエラーバーとともに視覚的に表現した例である。つまり、表示制御部133は、演算部134が実行した演算に基づいて、観察結果を視覚的に表現した情報を、表示画像に対して示し、該表示画像を表示部122に表示する。 The user operates the input unit 121 and selects a registered group from the group names GN in the plate map PmI. The display control unit 133 thereby receives a signal indicating the selection of the group from the group names GN, and, as shown in FIG. 17, displays a display image including an analysis result AR indicating the analysis result corresponding to the selected group name and a doubling time DT indicating the doubling time. For example, when displaying the analysis result AR and the doubling time DT, the display control unit 133 may receive the calculation processing results from the calculation unit 134 and display them. The analysis result AR shown in FIG. 17 is an example in which the average number of objects in the accommodating units belonging to group B is visually expressed together with error bars. That is, the display control unit 133 shows, in the display image, information that visually expresses the observation results based on the calculation executed by the calculation unit 134, and displays the display image on the display unit 122.
 なお、本実施形態におけるグラフ(例、解析結果ARを表示する箇所)においては、該グラフの縦軸及び横軸のデータを一例として示している。例えば、横軸には、観察開始日時から観察終了日時までの期間における時間に関する情報が表示されてもよい。例えば、縦軸には、プレートマップPmIのプルダウンで選択された項目に対する情報が表示されてもよい。例えば、縦軸は、プルダウンで細胞の数が選択された場合、0から1.0の(×10⁷)等で示される。例えば、縦軸は、プルダウンで細胞の専有面積率が選択された場合、0から100(又は、0から1.0)等の割合で示される。従って、グラフの縦軸及び横軸は、表示する内容に応じて変化する。 In the graphs of the present embodiment (e.g., where the analysis result AR is displayed), the data on the vertical axis and the horizontal axis are shown as an example. For example, the horizontal axis may display information on time in the period from the observation start date and time to the observation end date and time. For example, the vertical axis may display information on the item selected from the pull-down menu of the plate map PmI. For example, when the number of cells is selected from the pull-down menu, the vertical axis is expressed on a scale of 0 to 1.0 (×10⁷) or the like. For example, when the occupied area ratio of cells is selected from the pull-down menu, the vertical axis is expressed as a ratio such as 0 to 100 (or 0 to 1.0). Therefore, the vertical axis and the horizontal axis of the graph change according to the content to be displayed.
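The doubling time DT shown together with the analysis result AR can be illustrated with the standard exponential-growth relation N(t) = N0 · 2^(t/DT). This is a generic sketch under that assumption, not necessarily the calculation actually performed by the calculation unit 134.

```python
import math

def doubling_time(t_hours, n_start, n_end):
    """Estimate the doubling time (hours) of the object count, assuming
    exponential growth N(t) = N0 * 2**(t / DT)."""
    if n_end <= n_start:
        raise ValueError("count must increase for a doubling time to exist")
    return t_hours * math.log(2) / math.log(n_end / n_start)

# e.g. the count grows from 1.0e6 to 4.0e6 cells over 24 hours,
# i.e. two doublings, so DT is about 12 hours
dt = doubling_time(24.0, 1.0e6, 4.0e6)
```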
 図18は、第1実施形態に係る情報処理装置における処理の流れの例を示すフローチャートである。ステップS101において、取得部132は、検索条件を設定した検索が実行されたかを判定する。例えば、取得部132は、テキスト入力部KSaにテキストを入力して検索ボタンSBが押下される検索、条件検索部FSa、FSb、FSc、FSdを用いた検索条件から選択して検索ボタンSBが押下される検索、及び検索ボタンSBが押下されるのみの検索のうちいずれであるかを判定する。このとき、取得部132は、テキスト入力部KSaにテキストを入力して検索ボタンSBが押下される検索、又は条件検索部FSa、FSb、FSc、FSdを用いた検索条件から選択して検索ボタンSBが押下される検索である場合(ステップS101:Yes)、ステップS102における処理を実行する。一方、取得部132は、検索ボタンSBが押下されるのみの検索である場合(ステップS101:No)、ステップS103における処理を実行する。 FIG. 18 is a flowchart showing an example of a processing flow in the information processing apparatus according to the first embodiment. In step S101, the acquisition unit 132 determines whether or not the search for which the search conditions have been set has been executed. For example, the acquisition unit 132 selects from the search conditions in which the search button SB is pressed by inputting text into the text input unit KSa, and the search conditions using the condition search units FSa, FSb, FSc, and FSd, and the search button SB is pressed. It is determined whether the search is performed or the search is performed only when the search button SB is pressed. At this time, the acquisition unit 132 selects from the search conditions in which the search button SB is pressed by inputting text into the text input unit KSa, or the search conditions using the condition search units FSa, FSb, FSc, and FSd, and the search button SB. When is a search in which is pressed (step S101: Yes), the process in step S102 is executed. On the other hand, when the search is performed only by pressing the search button SB (step S101: No), the acquisition unit 132 executes the process in step S103.
 ステップS102において、取得部132は、検索条件に基づく観察結果を取得する。例えば、取得部132は、通信部131を介して、検索条件に該当する観察結果を記憶装置110から取得する。ステップS103において、取得部132は、全ての観察結果を取得する。例えば、取得部132は、通信部131を介して、全ての観察結果を記憶装置110から取得する。ステップS104において、表示制御部133は、観察結果の一覧を表示する。例えば、表示制御部133は、取得部132が取得した観察結果の一覧を表示画像に示し、該表示画像を表示部122に表示する。 In step S102, the acquisition unit 132 acquires the observation result based on the search conditions. For example, the acquisition unit 132 acquires the observation result corresponding to the search condition from the storage device 110 via the communication unit 131. In step S103, the acquisition unit 132 acquires all the observation results. For example, the acquisition unit 132 acquires all the observation results from the storage device 110 via the communication unit 131. In step S104, the display control unit 133 displays a list of observation results. For example, the display control unit 133 shows a list of observation results acquired by the acquisition unit 132 on the display image, and displays the display image on the display unit 122.
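The branch in steps S101 to S103 (acquiring the observation results matching the search conditions when conditions are set, and all observation results otherwise) can be sketched as follows; the record layout and key names are hypothetical, not the actual interface of the storage device 110.

```python
def search_results(records, conditions=None):
    """Steps S101-S103 as a sketch: if search conditions are set, return
    only the matching observation results; otherwise return all of them.

    records: list of dicts (observation results)
    conditions: dict of data item -> required value, or None/empty
    """
    if not conditions:  # search button pressed with no conditions set
        return list(records)
    return [r for r in records
            if all(r.get(key) == value for key, value in conditions.items())]

records = [
    {"experiment": "Exp00001", "status": "completed", "person": "Person E"},
    {"experiment": "Exp00002", "status": "60%", "person": "Person F"},
]
all_results = search_results(records)                       # step S103
filtered = search_results(records, {"status": "completed"})  # step S102
```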
 In step S105, the display control unit 133 accepts a selection of an observation result. For example, in the display image showing the list of observation results, the display control unit 133 receives the user's selection of an observation result as a signal. In step S106, the display control unit 133 displays the plate map. For example, the display control unit 133 renders the observation result selected by the user in a display image of a container having a plurality of storage units, and displays the display image on the display unit 122. At this time, the display control unit 133 may render a graph or the like of the observation result in the display image based on the execution result of a calculation by the calculation unit 134, and display the display image on the display unit 122. The display control unit 133 may also render the time-series switching content TS, the depth switching content DP, and the like in the display image, and display the display image on the display unit 122.
 In step S107, the display control unit 133 accepts a selection of an observation image and the group registration of the storage unit corresponding to the selected observation image. For example, the display control unit 133 receives a signal indicating that the edit button EB has been pressed. The display control unit 133 then displays on the display unit 122 a display image including the observation images OI and the group addition button AG. Subsequently, the display control unit 133 receives a signal indicating the selection of an observation image OI and the pressing of the group addition button AG. At this time, the display control unit 133 may display a display image in which the selected observation image OI is inverted in a predetermined color. After that, the display control unit 133 displays on the display unit 122 a display image including the text input unit KSb for entering a group name and the registration button RB for executing the group registration. At this time, the observation images OI to be registered in the group may be indicated explicitly by a frame or the like. The display control unit 133 then receives a signal indicating the selection of a group color content GC, the group name entered in the text input unit KSb, and the pressing of the registration button RB.
 In step S108, the registration unit 135 executes the group registration. For example, the registration unit 135 stores, in the storage device 110, group information for the observation results to be included in the new group, based on the group color and group name received by the display control unit 133. Group registration is not limited to the registration of new groups. For example, as shown in FIG. 11, after selecting an observation image OI, the user may select an existing group (here, "Group A") and press the registration button RB. In this case, the registration unit 135 may store, in the storage device 110, group information adding the selected storage unit to "Group A".
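The group registration of steps S107 and S108 can be sketched as follows. The store layout and the function name `register_group` are hypothetical (the patent discloses no code); the sketch only shows that a new group is created on first registration and that later registrations add storage units to an existing group, as in the "Group A" example above.

```python
# Sketch of steps S107-S108: register selected wells (storage units)
# under a group name and group color. All names are illustrative.

def register_group(group_store, wells, group_name, group_color=None):
    """Create the group if it is new, or add wells to an existing group
    (as when the user selects "Group A" and presses the registration
    button RB)."""
    entry = group_store.setdefault(group_name,
                                   {"color": group_color, "wells": set()})
    if group_color is not None:
        entry["color"] = group_color
    entry["wells"].update(wells)
    return group_store

store = {}
register_group(store, {"A1", "A2"}, "Group A", "#ff0000")  # new group
register_group(store, {"B1"}, "Group A")                   # add to existing
```

In an actual apparatus the resulting group information would be persisted to the storage device 110 rather than kept in a local dictionary.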
 FIG. 19 is a flowchart showing an example of the flow of the display switching process for observation images according to the first embodiment. In step S201, the display control unit 133 determines whether the plate map is displayed. If the plate map is displayed (step S201: Yes), the display control unit 133 executes the processing of step S202. On the other hand, if the plate map is not displayed (step S201: No), no display switching of observation images is needed, and the display control unit 133 ends the process.
 In step S202, the display control unit 133 determines whether a time-series switching operation has been accepted. For example, the display control unit 133 determines whether a rectangle of the time-series switching content TS has been selected. The display control unit 133 may also determine whether an operation to move the selected rectangle position has been performed on the time-series switching content TSa. When the display control unit 133 receives a signal indicating a time-series switching operation (step S202: Yes), it displays, in step S203, the observation images corresponding to the selected time point. For example, the display control unit 133 switches the display of the plate map PmI so that the observation images corresponding to the period at the rectangle position of the time-series switching content TS become the display image. On the other hand, if no time-series switching operation has been accepted (step S202: No), the display control unit 133 executes the processing of step S204.
 In step S204, the display control unit 133 determines whether a depth switching operation has been accepted. For example, the display control unit 133 determines whether a rectangle of the depth switching content DP has been selected. The display control unit 133 may also determine whether an operation to move the selected rectangle position has been performed on the depth switching content DPa. When the display control unit 133 receives a signal indicating a depth switching operation (step S204: Yes), it displays, in step S205, the observation images corresponding to the selected depth. For example, when a rectangle of the depth switching content DP is selected, the display control unit 133 switches the display of the plate map PmI so that the observation images corresponding to the selected depth in the storage units become the display image. On the other hand, if no depth switching operation has been accepted (step S204: No), the display control unit 133 executes the processing of step S201. That is, while the plate map is displayed, whenever the display control unit 133 receives a signal indicating an operation on the time-series switching content TS or the depth switching content DP, it executes the process of switching the display to the corresponding observation images.
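The switching loop of FIG. 19 can be sketched as a simple event dispatcher that keys the displayed images by time point and depth. The class and event names (`PlateMapView`, `time_select`, `depth_select`) are hypothetical; the patent discloses no code.

```python
# Illustrative sketch of the FIG. 19 display-switching loop (S201-S205).
# All names here are hypothetical; only the control flow mirrors FIG. 19.

class PlateMapView:
    def __init__(self, images_by_time_and_depth):
        # images_by_time_and_depth[(t, d)] -> observation image(s)
        self.images = images_by_time_and_depth
        self.time_index = 0
        self.depth_index = 0
        self.visible = True

    def current_images(self):
        return self.images[(self.time_index, self.depth_index)]

    def on_event(self, event, value):
        """Handle one user operation; returns the images now displayed."""
        if not self.visible:              # S201: plate map not shown -> end
            return None
        if event == "time_select":        # S202/S203: time-series switch (TS)
            self.time_index = value
        elif event == "depth_select":     # S204/S205: depth switch (DP)
            self.depth_index = value
        return self.current_images()      # redraw plate map PmI

imgs = {(t, d): f"img_t{t}_d{d}" for t in range(3) for d in range(2)}
view = PlateMapView(imgs)
after_time = view.on_event("time_select", 2)
after_depth = view.on_event("depth_select", 1)
```

Either event leaves the other axis unchanged, so the user can combine time-series and depth switching freely while the plate map remains displayed.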
[Second Embodiment]
 Next, the second embodiment will be described. In the present embodiment, components identical to those in the above-described embodiment are given the same reference numerals, and their description may be omitted or simplified.
 FIG. 20 is a block diagram showing a functional configuration example of the information processing apparatus according to the second embodiment. As shown in FIG. 20, the information processing apparatus 100a includes a communication unit 131, an acquisition unit 132, a display control unit 133a, a calculation unit 134, and a registration unit 135. The input unit 121 and the display unit 122 are connected to the information processing apparatus 100a. The display control unit 133a corresponds to the "control unit".
 The acquisition unit 132 acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions. The processing in the acquisition unit 132 is the same as in the above embodiment; that is, the acquisition unit 132 acquires data from the storage device 110 as appropriate for processing by the display control unit 133a, the calculation unit 134, and the like. The display control unit 133a renders, in a display image, a plurality of reference images relating to the observation images of the plurality of storage units that respectively house the plurality of objects, based on the classification criteria described above. For example, the display control unit 133a displays on the display unit 122 a display image including a plate map with group names. The display control unit 133a then receives a signal indicating a user operation selecting a group name in this display image. Subsequently, the display control unit 133a explicitly marks, in the display image of the container, the storage units of the container belonging to the group selected by the user, and displays the marked display image on the display unit 122.
 Here, the display control unit 133a of the present embodiment renders, in a display image, a plurality of reference images relating to the respective observation images of the storage units of the container, and displays the display image on the display unit 122. A reference image is, for example, a reduced version of the observation image, and is sometimes called a thumbnail (thumbnail image). The display control unit 133a renders, for example, the plurality of reference images adjacent to or close to one another in the display image. The display control unit 133a may also render the plurality of reference images adjacent to or close to one another in the same arrangement as the plurality of storage units. That is, the reference images are arranged identically to the storage units included in the container, and the reference images may or may not be in contact with one another. Thus, "a plurality of reference images being adjacent" includes both the case where the reference images are in contact with one another and the case where they are not. After that, the display control unit 133a renders in the display image the plurality of reference images rearranged based on a classification criterion selected by the user. Accordingly, the information processing apparatus 100a of the present embodiment acquires a plurality of reference images belonging to the group corresponding to the selected classification criterion (or index) and a plurality of reference images not belonging to that group, and rearranges at least the reference images belonging to the group corresponding to the selected classification criterion so that they are displayed together as the same group in the display image, for example as in FIG. 22. For example, when the display control unit 133a receives a signal indicating that a classification criterion has been selected and input, it newly arranges and displays in the display image the plurality of reference images corresponding to that classification criterion (or to the index of the classification criterion, described later) based on the classification criterion, instead of in the same arrangement as the storage units included in the container described above.
 FIG. 21 is a diagram showing an example of reference images according to the second embodiment. As shown in FIG. 21, the display control unit 133a displays, in the display image, a reference image (for example, a reduced observation image, i.e., a thumbnail image) for each of the plurality of storage units into which objects have been placed (enclosed by the broken line), excluding the storage units of the container into which an anti-drying buffer solution has been placed. As described above, the reference images are rendered in the display image adjacent to or close to one another in the same arrangement as the plurality of storage units. The plurality of reference images are then rearranged in the display image based on a predetermined classification criterion, as described below.
 FIGS. 22 to 24 are diagrams showing examples of rearranging the reference images according to the second embodiment. The display control unit 133a displays a display image including classification criteria on the display unit 122 and accepts the selection of a classification criterion by a user operation. Here, the acquisition unit 132 acquires, from the storage device 110, an index for evaluating the selected classification criterion. That is, the classification criteria according to the present embodiment include indices for evaluating observation conditions or observation results. For example, when the classification criterion is the number of objects (e.g., the number of cells), whether the number of objects is at least a predetermined value or less than the predetermined value may be used as the index for dividing the storage units into categories (i.e., as the index for classifying the storage units). The division is not limited to two categories; by giving the index a range, the storage units may be divided into three or more categories. For example, there may be three categories: a category of at least a first predetermined value, a category of at least a second predetermined value but less than the first predetermined value, and a category of less than the second predetermined value (where the first predetermined value > the second predetermined value).
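The three-way threshold classification described above can be written as a short sketch. The well identifiers, counts, and thresholds are hypothetical values chosen for illustration; the patent discloses no code.

```python
# Classify storage units (wells) into three categories by cell count,
# using two thresholds with first_threshold > second_threshold.
# Data and names are illustrative only.

def classify_wells(cell_counts, first_threshold, second_threshold):
    """Map each well to one of three categories based on its cell count."""
    categories = {"high": [], "middle": [], "low": []}
    for well, count in cell_counts.items():
        if count >= first_threshold:            # >= first predetermined value
            categories["high"].append(well)
        elif count >= second_threshold:         # in [second, first)
            categories["middle"].append(well)
        else:                                   # < second predetermined value
            categories["low"].append(well)
    return categories

counts = {"A1": 120, "A2": 45, "B1": 80, "B2": 10}
groups = classify_wells(counts, first_threshold=100, second_threshold=40)
```

Each resulting category would then be rendered as one block of thumbnails, with the concrete threshold values shown near the corresponding block.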
 As shown in FIG. 22, the display control unit 133a renders in the display image the plurality of reference images grouped by category based on the index acquired by the acquisition unit 132, and displays the display image on the display unit 122. At this time, the display control unit 133a renders in the display image information indicating an evaluation for each category. For example, the information indicating the evaluation is "OK", "NG", or the like, indicating whether the index is satisfied. The display control unit 133a may also render in the display image information indicating the index for each category. For example, when classifying into three or more categories, the index to which each category corresponds is made explicit. In the example described above, when classifying into three categories by the number of objects, the display image may show "at least the first predetermined value", "at least the second predetermined value but less than the first predetermined value", "less than the second predetermined value", and so on, in the vicinity of the corresponding categories (in practice, each "predetermined value" is shown as a concrete value). Note that the information indicating the evaluation may include, as an index, information representing the maturity or differentiation state of cells (good/bad, etc.).
 The display control unit 133a also renders the plurality of reference images included in the same category adjacent to or close to one another in the display image. Furthermore, the display control unit 133a renders the plurality of reference images in the display image with a gap between categories. That is, in the present embodiment, to make it clear that the set of storage units satisfying a given index and the set of storage units not satisfying it are different categories, the plurality of reference images are rearranged and displayed with spacing between categories. In addition, by placing the reference images of the same category adjacent to or close to one another, the images (reference images) of objects considered to share some feature within a category can easily be checked (compared). The display control unit 133a may also indicate in the display image that the categories differ. For example, the display control unit 133a may enclose each category in a frame, add a frame of a different color to each category, or add some content indicating that the categories are different.
 The display control unit 133a may also render the plurality of reference images in the display image with the same number of reference images arranged in one direction for each of the plurality of categories. For example, in FIG. 22, the number of reference images arranged in one direction is six. By making the number of reference images in one direction the same, the user can easily grasp the number of storage units included in each index-based category. Of course, the number of reference images arranged in the other direction (a direction different from the above one direction), in which "OK" and "NG" in FIG. 22 are arranged, may instead be made the same. The display control unit 133a may also switch the display between categories in response to a user operation. For example, when the display control unit 133a accepts a user operation to display only "OK", it may render in the display image only the reference images of the category corresponding to "OK"; when it accepts a user operation to display only "NG", it may render in the display image only the reference images of the category corresponding to "NG".
 The display control unit 133a may also render in the display image the distribution of the number of storage units with respect to an observation condition. For example, as shown in FIG. 22, the display control unit 133a represents the distribution of the number of storage units with respect to the drug concentration (one of the observation conditions) as a graph for each group corresponding to the classification criterion, and renders it in the display image. The observation condition represented as a graph is not limited to the drug concentration. The display control unit 133a may also render in the display image, for each category, the distribution of the number of storage units with respect to the observation condition. For example, as shown in FIG. 23, the display control unit 133a represents the distribution of the number of storage units with respect to the drug concentration as a graph for each category and for each group included in that category, and renders it in the display image.
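The per-group distribution underlying such a graph can be computed with a short sketch. The group names and concentration values are hypothetical; the patent discloses no code, and only the counting step is shown (the graph rendering itself is omitted).

```python
# Count storage units per drug concentration, per group, as the data
# behind a FIG. 22-style distribution graph. Data is illustrative.
from collections import defaultdict

def concentration_distribution(wells):
    """wells: iterable of (group, concentration) pairs.
    Returns {group: {concentration: number_of_wells}}."""
    dist = defaultdict(lambda: defaultdict(int))
    for group, conc in wells:
        dist[group][conc] += 1
    return {g: dict(c) for g, c in dist.items()}

data = [("Group A", 0.1), ("Group A", 0.1), ("Group A", 1.0),
        ("Group B", 0.1), ("Group B", 10.0)]
dist = concentration_distribution(data)
```

The same counting could be applied per category (as in FIG. 23) by first restricting `wells` to the storage units of one category.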
 In the above, the case where the index is acquired from the storage device 110 according to the selected classification criterion has been described, but this is not limiting. The user can set the index as well as select the classification criterion. For example, the display control unit 133a may receive an index setting and render the plurality of reference images in the display image by category based on the received index setting. That is, the display control unit 133a accepts the selection of a classification criterion and the setting of an index, and renders the plurality of reference images in the display image by category according to the received index of the classification criterion. The index may also be adjusted on screen. For example, the display control unit 133a renders in the display image content for setting the index. As shown in FIG. 24, in a display image in which the plurality of reference images have been rearranged by category based on the index acquired from the storage device 110 or based on a set index, the display control unit 133a accepts the setting (change, resetting) of the index by the user via a slider or the like. The display control unit 133a then rearranges the plurality of reference images by category based on the index set by the slider or the like, renders them in the display image, and displays the display image on the display unit 122.
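The slider-driven re-grouping can be sketched as a pure function of the current threshold, re-run whenever the slider value changes. The well data and the "OK"/"NG" split on a single threshold are illustrative assumptions; the patent discloses no code.

```python
# Re-group reference images when the index (threshold) slider changes,
# keeping wells of the same category together, as in FIG. 24.
# All names and data are illustrative.

def regroup_on_index_change(cell_counts, threshold):
    """Split wells into OK (count >= threshold) and NG (count < threshold)."""
    ok = sorted(w for w, c in cell_counts.items() if c >= threshold)
    ng = sorted(w for w, c in cell_counts.items() if c < threshold)
    return {"OK": ok, "NG": ng}

counts = {"A1": 120, "A2": 45, "B1": 80, "B2": 10}
at_50 = regroup_on_index_change(counts, 50)    # slider at 50
at_100 = regroup_on_index_change(counts, 100)  # slider moved to 100
```

Because the function depends only on the current slider value, any accompanying distribution graph can be recomputed from the same regrouped result, as described for step S305 below.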
 FIG. 25 is a flowchart showing an example of the processing flow in the information processing apparatus according to the second embodiment. In step S301, the display control unit 133a displays a display image including the classification criteria on the display unit 122. In step S302, the acquisition unit 132 acquires, from the storage device 110, the index for evaluating the classification criterion selected by the user operation. In step S303, the display control unit 133a displays on the display unit 122 a display image in which the plurality of reference images are arranged by category based on the index. For example, the display control unit 133a displays on the display unit 122 a display image including the reference images for the observation images of the plurality of storage units of the container, and then displays on the display unit 122 a display image including the classification criteria. The classification criteria may be displayed in response to a user operation. The display control unit 133a then arranges the plurality of reference images in the display image by category, based on the index for evaluating the classification criterion acquired by the acquisition unit 132, and displays the display image on the display unit 122. The index may be set by the user together with the selection of the classification criterion. The display control unit 133a may also include, in the display image in which the reference images are arranged, information indicating the evaluation for each category ("OK", "NG", etc.) and a graph representing the distribution of the number of storage units with respect to the observation condition. In addition, the display control unit 133a may include in the display image content such as a slider for setting the index.
 In step S304, the display control unit 133a determines whether the index setting has been changed. For example, when the index setting is changed by a user operation on the slider or the like (step S304: Yes), the display control unit 133a displays on the display unit 122, in step S305, a display image in which the reference images are rearranged by category based on the newly set index. When the reference images are rearranged due to a change in the index setting, the display control unit 133a also changes the graph representing the distribution of the number of storage units with respect to the observation condition according to the new index setting. On the other hand, if the index setting has not been changed (step S304: No), the display control unit 133a executes the processing of step S306. In step S306, the display control unit 133a determines whether an operation to close the screen displaying the reference images has been performed. If the close operation has been performed (step S306: Yes), the display control unit 133a ends the processing related to the index setting. On the other hand, if the close operation has not been performed (step S306: No), the display control unit 133a executes the processing of step S304 and continues the processing related to the index setting.
[Third Embodiment]
 Next, the third embodiment will be described. In the present embodiment, components identical to those in the above-described embodiments are given the same reference numerals, and their description may be omitted or simplified.
 FIG. 26 is a block diagram showing a functional configuration example of the information processing apparatus according to the third embodiment. As shown in FIG. 26, the information processing apparatus 100b includes a communication unit 131, an acquisition unit 132, a display control unit 133b, a calculation unit 134, and a registration unit 135. The input unit 121 and the display unit 122 are connected to the information processing apparatus 100b. The display control unit 133b corresponds to the "control unit".
 The display control unit 133b takes the index described in the above embodiment as a first index (a first-round index) and, in a display image showing the plurality of reference images by category based on the first index, rearranges the plurality of reference images based on a second index (a second-round index) different from the first index. For example, while a display image including the plurality of reference images arranged by category based on the first index is displayed, the display control unit 133b displays on the display unit 122 a display image including classification criteria for setting the second index, and accepts the selection of a classification criterion. Here, the acquisition unit 132 acquires, from the storage device 110, the second index for evaluating the selected classification criterion. The display control unit 133b then rearranges the plurality of reference images in the display image based on the second index acquired by the acquisition unit 132. The second index may be set by being selected by the user together with the selection of the classification criterion.
 FIG. 27 is a diagram showing an example of rearranging the reference images according to the third embodiment. The acquisition unit 132 acquires, from the storage device 110, information about each storage unit in a first category based on the first index (for example, the category corresponding to "OK") and in a second category based on the first index (for example, the category corresponding to "NG"). The information about each storage unit is used to confirm the settings for the second index. The display control unit 133b then produces a display image in which the reference images of each category are rearranged within that category based on the second-index setting of each storage unit, and displays the display image on the display unit 122. For example, as shown in FIG. 27, for the plurality of storage units included in the first category, the display control unit 133b identifies storage units in which the area ratio occupied by the objects (here, cell colonies or cell groups) is 80 percent or more, storage units in which the occupied area ratio is 50 percent or more and less than 80 percent, and storage units in which the occupied area ratio is less than 50 percent, and rearranges the plurality of reference images accordingly. The occupied area ratio is an example of a second index representing an observation condition or an observation result. That is, in the present embodiment, a process is executed that lets the user confirm, for each storage unit classified by the first index, how that unit stands with respect to the second index. In addition, a process is executed that, in response to the user's selection, presents the reference images sorted by the second index for confirmation.
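The three-way split by occupied area ratio described above (80 percent or more, 50 to under 80 percent, under 50 percent) can be sketched as a simple binning function. The bin edges follow the embodiment; the well identifiers and ratio values are hypothetical:

```python
# Sketch of the occupied-area-ratio binning used in FIG. 27: wells in a
# category are assigned to one of three bins (>= 80 %, 50-80 %, < 50 %).
# Bin edges come from the embodiment; the well data are hypothetical.

def area_bin(ratio_percent):
    """Return the bin label for an occupied area ratio given in percent."""
    if ratio_percent >= 80:
        return ">=80%"
    if ratio_percent >= 50:
        return "50-80%"
    return "<50%"

wells = {"A1": 92.0, "A2": 61.5, "B1": 33.0, "B2": 80.0}

bins = {">=80%": [], "50-80%": [], "<50%": []}
for well_id, ratio in wells.items():
    bins[area_bin(ratio)].append(well_id)

print(bins)  # {'>=80%': ['A1', 'B2'], '50-80%': ['A2'], '<50%': ['B1']}
```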
 The display control unit 133b also shows, in the display image, information indicating the second index for each category. In FIG. 27, information indicating the occupied area ratio of the cells in each storage unit is presented together with the reference images of each category. The display control unit 133b may further highlight, in the display image, the reference images corresponding to each second-index bin: for example, it may enclose in a frame the reference images falling in each of the bins (occupied area ratio of 80 percent or more, 50 percent or more and less than 80 percent, and less than 50 percent), or it may place the reference images for each bin apart from the others. The second index may also be switched as appropriate; the display control unit 133b rearranges the plurality of reference images in response to an input signal indicating a switch of the second index.
 The display control unit 133b may also express the graph differently from the above embodiments. FIG. 28 is a diagram showing an example of a graph displayed together with the reference images according to the third embodiment. For example, as shown in FIG. 28, the display control unit 133b may show in the display image, for each category based on the first index, a graph expressing the proportion (or number, etc.) of storage units falling in each occupied-area-ratio bin.
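The per-category distribution underlying such a graph — the fraction of storage units in each occupied-area-ratio bin — can be sketched as follows. The bin edges follow the embodiment; the ratio values are hypothetical:

```python
# Sketch of the data behind the FIG. 28 graph: for one first-index
# category, the fraction of wells falling in each occupied-area-ratio
# bin. Bin edges follow the embodiment; the data are hypothetical.

from collections import Counter

def bin_fractions(ratios, edges=(80, 50)):
    """Fraction of wells per bin, for ratios given in percent."""
    hi, lo = edges
    labels = [">=80%" if r >= hi else "50-80%" if r >= lo else "<50%" for r in ratios]
    counts = Counter(labels)
    total = len(ratios)
    return {b: counts.get(b, 0) / total for b in (">=80%", "50-80%", "<50%")}

ok_ratios = [92, 85, 61, 40]   # ratios of wells in the "OK" category
print(bin_fractions(ok_ratios))  # {'>=80%': 0.5, '50-80%': 0.25, '<50%': 0.25}
```

Each category's dictionary then maps directly onto one bar group (or one stacked bar) of the graph.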
 FIG. 29 is a flowchart showing an example of the flow of processing in the information processing device according to the third embodiment. FIG. 29 takes as an example the state in which a plurality of reference images is shown in the display image for each category based on the first index. In step S401, the display control unit 133b displays a display image containing the classification criteria on the display unit 122. In step S402, the acquisition unit 132 acquires, from the storage device 110, a second index for evaluating the selected classification criterion. In step S403, the acquisition unit 132 acquires, from the storage device 110, information about each storage unit in each category under the first index. For example, the display control unit 133b displays the display image containing the classification criteria on the display unit 122 so that the user can select the second index; the classification criteria may be displayed in response to a user operation. The acquisition unit 132 then acquires the second index for evaluating the selected classification criterion from the storage device 110 and, so that the user can confirm the second-index setting of each storage unit in each category, acquires information on the second index of each storage unit from the storage device 110.
 In step S404, the display control unit 133b displays on the display unit 122 a display image in which the reference images of each category are rearranged based on the second index. For example, the display control unit 133b displays on the display unit 122 a display image in which the reference images, already arranged by category under the current first index, are further grouped by identical second-index settings. In step S405, the display control unit 133b determines whether another classification criterion has been selected. If another classification criterion has been selected (step S405: Yes), the acquisition unit 132 executes the process of step S402; otherwise (step S405: No), the display control unit 133b executes the process of step S406. In step S406, the display control unit 133b determines whether an operation to close the screen displaying the reference images has been performed. If the closing operation has been performed (step S406: Yes), the display control unit 133b ends the processing related to the setting of the second index; otherwise (step S406: No), it executes the process of step S405.
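The loop of steps S401 through S406 — display the criteria, re-sort on each selection, and repeat until the screen is closed — can be sketched as a simple event loop. The event representation and callback are hypothetical stand-ins for the actual UI machinery:

```python
# Sketch of the FIG. 29 control flow (S401-S406): process classification-
# criterion selections, re-sorting the display each time, until a close
# event arrives. Event tuples and the resort callback are hypothetical.

def run_screen(events, resort):
    """Process selection events until a close event arrives (S405/S406 loop)."""
    shown = []                            # record of re-sorts performed (S404)
    for kind, payload in events:
        if kind == "select":              # S405: another criterion was selected
            shown.append(resort(payload)) # S402-S404: fetch index, re-sort, display
        elif kind == "close":             # S406: end second-index processing
            break
    return shown

events = [("select", "area_ratio"), ("select", "luminance"), ("close", None)]
log = run_screen(events, resort=lambda c: f"re-sorted by {c}")
print(log)  # ['re-sorted by area_ratio', 're-sorted by luminance']
```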
[Fourth Embodiment]
 Next, the fourth embodiment will be described. In the present embodiment, components similar to those of the above-described embodiments are given the same reference numerals, and their description may be omitted or simplified.
 FIG. 31 is a diagram showing a configuration example of the information processing system according to the fourth embodiment. As shown in FIG. 31, the information processing system SYS includes a first terminal 30, a second terminal 40, a terminal device 80, an information processing device 100, and a culture system BS. The first terminal 30, the second terminal 40, the terminal device 80, and the information processing device 100 are connected to one another via a network N so that they can communicate with each other. The network N may be any of the Internet, a mobile communication network, and a local network, or a combination of several of these types of networks.
 The terminal device 80 is composed of a plurality of gateway devices 80a. Each gateway device 80a is connected to a culture system BS by wire or wirelessly. Although the information processing system SYS is configured so that a plurality of culture systems BS is connected to the information processing device 100 via the terminal device 80, the configuration is not limited to this; a single culture system BS may be connected to the information processing device 100 via the terminal device 80. The information processing system SYS may include a plurality of information processing devices 100 or a single one. Each information processing device 100 may contain all of the various functions described in the above embodiments, or the functions may be distributed among the devices. That is, the information processing device 100 according to the present embodiment can be realized by cloud computing.
 In the information processing system SYS, a user-side terminal (the first terminal 30, the second terminal 40, etc.) can connect to the information processing device 100 and view and operate on observation results and the like using a browser. Acting as a server, the information processing device 100 acquires, in its acquisition unit, observation results obtained by imaging a plurality of objects under predetermined observation conditions. Then, as an image generation unit, the information processing device 100 generates a display image showing, based on the classification criteria, a plurality of reference images relating to the observation images of the plurality of storage units in which the plurality of objects is respectively housed. Finally, as an output unit, the information processing device 100 outputs the display image generated by the image generation unit to the user terminal via the network N.
 In the embodiments described above, the information processing device 100 includes, for example, a computer system. The information processing device 100 reads an information processing program stored in memory and executes various processes according to the read program. Such an information processing program causes a computer to, for example, acquire observation results obtained by imaging a plurality of objects under predetermined observation conditions, and to show, in a display image based on classification criteria, a plurality of reference images relating to the observation images of the plurality of storage units in which the plurality of objects is respectively housed. The information processing program may be recorded and provided on a computer-readable storage medium (for example, a non-transitory tangible medium). In each of the embodiments described above, when displaying information that visually represents an analysis result as a graph, the calculation unit 134 of the information processing device 100 may compute a single datum shown on the graph (e.g., a value along one axis of the graph) from a plurality of observation images (e.g., by integration or averaging).
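Deriving one graph datum from several observation images, as described above, reduces to combining per-image measurements by integration or averaging. A minimal sketch, with hypothetical measurement values:

```python
# Sketch of combining per-image measurements into a single graph datum,
# as described above (integration or averaging across observation images).
# The measurement values are hypothetical.

def graph_datum(values, mode="average"):
    """Combine per-image measurements into one value for the graph."""
    if mode == "sum":         # integration across images
        return sum(values)
    if mode == "average":     # mean across images
        return sum(values) / len(values)
    raise ValueError(f"unknown mode: {mode}")

cell_counts = [120, 134, 128]   # e.g., counts from three observation images
print(graph_datum(cell_counts, "sum"))      # 382
print(graph_datum(cell_counts, "average"))
```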
 The technical scope is not limited to the modes described in the above embodiments and the like. One or more of the requirements described in the above embodiments and the like may be omitted, and those requirements may be combined as appropriate. To the extent permitted by law, the disclosures of all documents cited in the above embodiments and the like are incorporated by reference into this description.
100: information processing device; 110: storage device; 121: input unit; 122: display unit; 131: communication unit; 132: acquisition unit; 133: display control unit; 134: calculation unit; 135: registration unit

Claims (29)

  1.  An information processing device comprising:
     an acquisition unit that acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions; and
     a control unit that shows, in a display image based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects is respectively housed.
  2.  The information processing device according to claim 1, wherein the control unit shows, in the display image, the plurality of reference images rearranged based on the classification criterion selected by a user.
  3.  The information processing device according to claim 1 or 2, wherein the classification criterion is at least one of the observation conditions including the type and amount of culture medium to be placed in the storage unit.
  4.  The information processing device according to any one of claims 1 to 3, wherein the classification criterion is at least one of the observation conditions including the type, concentration, and amount of serum contained in the culture medium to be placed in the storage unit.
  5.  The information processing device according to any one of claims 1 to 4, wherein the classification criterion is at least one of the observation conditions including the type, concentration, exposure period, and exposure timing of a drug to be placed in the storage unit.
  6.  The information processing device according to any one of claims 1 to 5, wherein the classification criterion is at least one of the observation conditions including the type and number of the objects to be placed in the storage unit.
  7.  The information processing device according to any one of claims 1 to 6, wherein the classification criterion is at least one of the observation conditions including a temperature setting, a humidity setting, an atmosphere supply setting, and a light output setting in a space in which a container having the plurality of storage units is arranged.
  8.  The information processing device according to any one of claims 1 to 7, wherein the classification criterion is at least one of the observation results including the number of the objects, the change in the number over time, the doubling time of the number, the amount of movement, the morphological change, the occupied area, and the perimeter.
  9.  The information processing device according to any one of claims 1 to 8, wherein the classification criterion is the observation result including a luminance value obtained by analyzing the observation image.
  10.  The information processing device according to any one of claims 1 to 9, wherein the control unit shows the plurality of reference images adjacent to or close to one another in the display image, and shows the plurality of reference images thus shown adjacent to or close to one another in the display image based on the classification criterion.
  11.  The information processing device according to claim 10, wherein the control unit shows the plurality of reference images adjacent to or close to one another in the display image in the same arrangement as the plurality of storage units.
  12.  The information processing device according to any one of claims 1 to 11, wherein each of the plurality of reference images is a reduced image of the corresponding one of the plurality of observation images.
  13.  The information processing device according to any one of claims 1 to 12, wherein the classification criterion includes an index for evaluating the observation conditions or the observation results, and the control unit shows the plurality of reference images in the display image for each category based on the index.
  14.  The information processing device according to claim 13, wherein the control unit shows the plurality of reference images included in the same category adjacent to or close to one another in the display image.
  15.  The information processing device according to claim 13 or 14, wherein the control unit shows the plurality of reference images in the display image with a gap between the categories.
  16.  The information processing device according to any one of claims 13 to 15, wherein the control unit indicates in the display image that the categories are different.
  17.  The information processing device according to any one of claims 13 to 16, wherein the control unit shows, in the display image, information indicating an evaluation for each of the categories.
  18.  The information processing device according to any one of claims 13 to 17, wherein the control unit shows, in the display image, information indicating the index for each of the categories.
  19.  The information processing device according to any one of claims 13 to 18, wherein the control unit shows the plurality of reference images in the display image with the same number of reference images arranged in one direction in each of the plurality of categories.
  20.  The information processing device according to any one of claims 13 to 18, wherein the control unit switches the display image between the categories in response to a user operation.
  21.  The information processing device according to any one of claims 13 to 20, wherein the control unit receives a setting of the index and shows the plurality of reference images in the display image for each category based on the received setting of the index.
  22.  The information processing device according to any one of claims 13 to 21, wherein the control unit shows, in the display image, content for setting the index.
  23.  The information processing device according to any one of claims 13 to 22, wherein, with the index as a first index, the control unit rearranges and shows the plurality of reference images, in the display image showing the plurality of reference images for each category based on the first index, based on a second index different from the first index.
  24.  The information processing device according to any one of claims 13 to 23, wherein, with the index as a first index, the control unit highlights, in the display image showing the plurality of reference images for each category based on the first index, the reference images corresponding to a second index different from the first index.
  25.  The information processing device according to claim 23 or 24, wherein the control unit shows, in the display image, information indicating the second index for each of the categories.
  26.  The information processing device according to any one of claims 1 to 25, wherein the control unit shows, in the display image, a distribution of the number of the plurality of storage units with respect to the observation conditions.
  27.  An information processing method comprising:
     acquiring observation results obtained by imaging a plurality of objects under predetermined observation conditions; and
     showing, in a display image based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects is respectively housed.
  28.  An information processing program that causes a computer to execute:
     acquiring observation results obtained by imaging a plurality of objects under predetermined observation conditions; and
     showing, in a display image based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects is respectively housed.
  29.  An information processing system that outputs a display image to a user terminal by cloud computing, the system comprising a server, wherein the server comprises:
     an acquisition unit that acquires observation results obtained by imaging a plurality of objects under predetermined observation conditions;
     an image generation unit that generates the display image showing, based on a classification criterion, a plurality of reference images relating to observation images of a plurality of storage units in which the plurality of objects is respectively housed; and
     an output unit that outputs, via a network, the display image generated by the image generation unit to the user terminal.
PCT/JP2019/038165 2019-09-27 2019-09-27 Information processing device, information processing method, information processing program, and information processing system WO2021059489A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/038165 WO2021059489A1 (en) 2019-09-27 2019-09-27 Information processing device, information processing method, information processing program, and information processing system


Publications (1)

Publication Number Publication Date
WO2021059489A1 true WO2021059489A1 (en) 2021-04-01

Family

ID=75166028


Country Status (1)

Country Link
WO (1) WO2021059489A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0114310B2 (en) * 1980-03-22 1989-03-10 Toyota Motor Co Ltd
JP2005063043A (en) * 2003-08-08 2005-03-10 Olympus Corp Image display device, method and program
JP2011205939A (en) * 2010-03-29 2011-10-20 Sysmex Corp Cell image display system, cell image display device, cell image display method and cell image display program
JP2011232051A (en) * 2010-04-23 2011-11-17 Nagoya Univ Classification model generation device, cell classification device, incubator, method for cell culture, and program
JP2015520397A (en) * 2012-06-22 2015-07-16 マルバーン インストゥルメンツ リミテッド Characterization of heterogeneous fluid samples



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19947142

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19947142

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP