WO2019229912A1 - Information processing device, information processing method, information processing program, and microscope - Google Patents

Information processing device, information processing method, information processing program, and microscope

Info

Publication number
WO2019229912A1
WO2019229912A1 (PCT/JP2018/020864)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
point cloud
information
input
point
Application number
PCT/JP2018/020864
Other languages
English (en)
Japanese (ja)
Inventor
亘 友杉
定繁 石田
秀太朗 大西
五月 大島
Original Assignee
株式会社ニコン
Application filed by 株式会社ニコン
Priority to PCT/JP2018/020864
Publication of WO2019229912A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 - Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 - Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 - Fluorescence; Phosphorescence
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 - Microscopes
    • G02B21/36 - Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis

Definitions

  • The present invention relates to an information processing apparatus, an information processing method, an information processing program, and a microscope.
  • STORM, PALM, and the like are known as super-resolution microscopes.
  • In STORM, a fluorescent substance is activated, and the activated fluorescent substance is irradiated with excitation light to acquire a fluorescence image (see Patent Document 1 below).
  • According to a first aspect, an information processing apparatus is provided that includes a display control unit that displays a point cloud image on a display unit, an input information acquisition unit that acquires input information input by an input unit, and a processing unit that extracts a part of the point cloud from the point cloud included in the point cloud image based on the input information, wherein the display control unit displays an extracted point cloud image on the display unit based on the part of the point cloud extracted by the processing unit.
  • According to a second aspect, a microscope is provided that includes the information processing apparatus according to the first aspect, an illumination optical system that illuminates a sample with activation light that activates part of a fluorescent substance contained in the sample and with excitation light that excites at least a part of the activated fluorescent substance, an observation optical system that forms an image of light from the sample, an imaging unit that captures the image formed by the observation optical system, and a unit that calculates position information of the fluorescent substance based on the imaging result and generates a point cloud using the calculated position information.
  • According to a third aspect, an information processing method is provided that includes displaying a point cloud image on a display unit, acquiring input information input by an input unit, extracting a part of the point cloud from the point cloud included in the point cloud image based on the input information, and displaying an extracted point cloud image based on the extracted part of the point cloud on the display unit.
  • According to a fourth aspect, an information processing program is provided that causes a computer to execute displaying a point cloud image on a display unit, acquiring input information input by an input unit, extracting a part of the point cloud from the point cloud included in the point cloud image based on the input information, and displaying an extracted point cloud image based on the extracted part of the point cloud on the display unit.
  • FIG. 1 is a diagram illustrating an information processing apparatus according to the first embodiment.
  • The information processing apparatus 1 according to the embodiment generates an image (point cloud image) using the point cloud data DG and displays the image on the display device 2. The information processing apparatus 1 also processes the point cloud data DG (data group).
  • The point cloud data DG is a set of a plurality of N-dimensional data D1.
  • N is an arbitrary integer of 2 or more.
  • The N-dimensional data D1 is data (eg, vector data) in which N values are combined.
  • For example, the point cloud data DG is a set of three-dimensional data in which coordinate values (eg, x1, y1, z1) in a three-dimensional space are combined. In the following description, it is assumed that the above N is 3.
  • N may be 2, or 4 or more.
  • The point cloud data DG consists of m pieces of N-dimensional data, where m is an arbitrary integer of 2 or more.
  • the point cloud image is an image generated using the point cloud data DG.
  • When the point cloud data DG is a set of three-dimensional data in which coordinate values (eg, x1, y1, z1) in a three-dimensional space form one set, the point cloud image is an image that displays a point at each coordinate position.
  • the shape of the displayed point is not limited to a circle, and may be another shape such as an ellipse or a rectangle.
  • Point cloud data is sometimes simply referred to as a point cloud.
  • a plurality of points on the point cloud image are appropriately referred to as a point cloud.
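  • As an illustrative sketch (not part of the patent text), the point cloud data DG can be held as an m x 3 array and the point cloud image can be produced by plotting one marker per row; the variable names below are hypothetical:

        # Minimal sketch: point cloud data DG as an (m, 3) array rendered as a point cloud image.
        # Assumes NumPy and Matplotlib 3.2+ (for the built-in 3D projection) are available.
        import numpy as np
        import matplotlib.pyplot as plt

        m = 1000                               # number of N-dimensional data D1 (N = 3 here)
        point_cloud = np.random.rand(m, 3)     # each row is one data point (x, y, z)

        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")  # 3D axes for the point cloud image
        ax.scatter(point_cloud[:, 0], point_cloud[:, 1], point_cloud[:, 2], s=2)
        ax.set_xlabel("X"); ax.set_ylabel("Y"); ax.set_zlabel("Z")
        plt.show()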
  • the point cloud data DG is supplied to the information processing device 1 from, for example, a device external to the information processing device 1 (hereinafter referred to as an external device).
  • the external device is, for example, a microscope main body 51 shown later in FIG.
  • the external device may not be the microscope main body 51.
  • the external device may be a CT scan that detects a value at each point inside the object, or a measurement device that measures the shape of the object.
  • the information processing apparatus 1 may generate point cloud data DG based on data supplied from an external device, and process the generated point cloud data DG.
  • the information processing apparatus 1 executes processing based on input information that a user inputs using a graphical user interface (referred to as GUI in this specification as appropriate).
  • the information processing device 1 is connected to a display device 2 (display unit).
  • the display device 2 is, for example, a liquid crystal display.
  • the information processing apparatus 1 supplies image data to the display device 2 and causes the display device 2 to display the image.
  • the display device 2 is an external device attached to the information processing device 1, but may be a part of the information processing device 1.
  • the information processing device 1 is connected to an input device 3 (input unit).
  • the input device 3 is an input interface that can be operated by a user.
  • the input device 3 includes, for example, at least one of a mouse, a keyboard, a touch pad, and a trackball.
  • the input device 3 detects an operation by the user and supplies the detection result to the information processing device 1 as input information input by the user.
  • the input device 3 is a mouse.
  • the information processing device 1 causes the display device 2 to display a pointer.
  • the information processing apparatus 1 acquires mouse movement information and click information indicating the presence or absence of a click from the input apparatus 3 as input information detected by the input apparatus 3.
  • the information processing apparatus 1 moves the pointer on the screen of the display device 2 based on the mouse movement information.
  • the information processing apparatus 1 executes processing assigned to the position of the pointer and click information (eg, left click, right click, drag, double click) based on the click information.
  • the input device 3 is, for example, a device externally attached to the information processing device 1, but may be a part of the information processing device 1 (for example, a built-in touch pad). Further, the input device 3 may be a touch panel integrated with the display device 2 or the like.
  • the information processing apparatus 1 includes, for example, a computer.
  • the information processing apparatus 1 includes an operating system unit 5 (hereinafter referred to as an OS unit 5), a GUI unit 6, a processing unit 7, and a storage unit 8.
  • the information processing apparatus 1 executes various processes according to the program stored in the storage unit 8.
  • the OS unit 5 provides an interface to the outside and the inside of the information processing apparatus 1.
  • the OS unit 5 controls the supply of image data to the display device 2.
  • the OS unit 5 acquires input information from the input device 3.
  • the OS unit 5 supplies input information to an application that manages an active GUI screen in the display device 2.
  • the GUI unit 6 includes an input control unit 11 and an output control unit 12.
  • the input control unit 11 is an input information acquisition unit that acquires input information input by the input unit (input device 3).
  • the output control unit 12 is a display control unit that displays a point cloud image on the display unit (display device 2).
  • the output control unit 12 causes the display device 2 to display a GUI screen (GUI screen W shown in FIG. 2 and the like later).
  • the GUI screen is a window provided by an application.
  • Information constituting the GUI screen (hereinafter referred to as GUI information) is stored in the storage unit 8, for example.
  • the output control unit 12 reads the GUI information from the storage unit 8 and supplies the GUI information to the OS unit 5.
  • the OS unit 5 causes the display device 2 to display a GUI screen based on the GUI information supplied from the output control unit 12. In this way, the output control unit 12 supplies the GUI information to the OS unit 5 to display the GUI screen on the display device 2.
  • the input control unit 11 acquires input information input by the user using the GUI screen. For example, the input control unit 11 acquires mouse movement information and click information as input information from the OS unit 5. When the click information indicates that there has been a click operation, the input control unit 11 causes the process assigned to the click information to be executed based on the coordinates of the pointer on the GUI screen obtained from the mouse movement information.
  • For example, when the input information indicates an operation to display a menu, the input control unit 11 causes the output control unit 12 to execute processing for displaying the menu.
  • Information representing the menu is included in the GUI information, and the output control unit 12 causes the display device 2 to display the menu via the OS unit 5 based on the GUI information.
  • When a left click is detected on the GUI screen, the input control unit 11 identifies the position of the pointer on the GUI screen based on the movement information of the mouse and determines whether there is a button at the identified pointer position.
  • If there is a button at the position of the pointer, the input control unit 11 causes the process assigned to this button to be executed.
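  • A minimal sketch of the kind of dispatch the input control unit 11 performs, assuming a simple mapping from on-screen buttons to assigned processes; the button labels, bounding boxes, and handler functions are hypothetical:

        # Illustrative only: map GUI buttons to the processes assigned to them and
        # dispatch on a left click at the current pointer position.
        def analyze_data():
            print("start extraction process")

        def open_file():
            print("open a file")

        # (button label, bounding box in screen coordinates, assigned process)
        buttons = [
            ("Analyze data", (10, 10, 110, 40), analyze_data),
            ("Some operation 1", (10, 50, 110, 80), open_file),
        ]

        def on_left_click(pointer_x, pointer_y):
            """Run the process assigned to the button at the pointer position, if any."""
            for label, (x0, y0, x1, y1), handler in buttons:
                if x0 <= pointer_x <= x1 and y0 <= pointer_y <= y1:
                    handler()
                    return label
            return None

        on_left_click(50, 20)  # lands on "Analyze data" and starts the extraction process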
  • Based on the input information acquired by the input information acquisition unit (input control unit 11), the processing unit 7 extracts a part of the point cloud (hereinafter referred to as a point set) from the point cloud included in the point cloud image.
  • The input information is information related to a point cloud designated in the point cloud image.
  • The processing unit 7 divides the point cloud included in the point cloud image into a plurality of point clouds (a plurality of subsets) and extracts a part of the point clouds based on the feature amount of, or the similarity between, each divided point cloud (subset) and the designated point cloud.
  • the processing unit 7 includes a clustering unit 9 and a classifier 10.
  • the clustering unit 9 divides the point group included in the point group image into a plurality of point groups.
  • a point cloud obtained by dividing a point cloud included in a point cloud image is referred to as a subset.
  • The clustering unit 9 divides (classifies) the point cloud data DG into a plurality of subsets based on the distribution of the plurality of N-dimensional data D1. For example, the clustering unit 9 randomly selects one N-dimensional data D1 from the point cloud data DG and counts the number of other N-dimensional data D1 existing in a predetermined area centered on the selected N-dimensional data D1. When the counted number of N-dimensional data D1 is equal to or greater than a threshold, the clustering unit 9 determines that the selected N-dimensional data D1 and the other N-dimensional data D1 existing in the predetermined area belong to the same subset.
  • The clustering unit 9 classifies the N-dimensional data D1 included in the point cloud data DG into a plurality of non-overlapping subsets or noise. For example, the clustering unit 9 assigns an identification number to each subset and, for each N-dimensional data D1 belonging to a subset, stores in the storage unit 8 the N-dimensional data D1 (or its identification number) together with the identification number of the subset to which it belongs. In addition, the clustering unit 9 adds a flag indicating noise, for example, to the N-dimensional data D1 classified as noise. The clustering unit 9 may delete the N-dimensional data D1 determined to be noise from the point cloud data DG.
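  • The neighbour-counting rule above resembles density-based clustering; as an assumption (the patent does not name an algorithm), here is a sketch using scikit-learn's DBSCAN, where the label -1 corresponds to noise and the other labels correspond to subsets:

        # Illustrative density-based clustering of the point cloud into subsets plus noise.
        import numpy as np
        from sklearn.cluster import DBSCAN

        point_cloud = np.random.rand(500, 3)    # stand-in for the point cloud data DG

        # eps         ~ radius of the "predetermined area" around a selected point
        # min_samples ~ threshold on the number of neighbouring points
        labels = DBSCAN(eps=0.05, min_samples=5).fit_predict(point_cloud)

        subsets = {k: point_cloud[labels == k] for k in set(labels) if k != -1}
        noise = point_cloud[labels == -1]       # points classified as noise
        print(len(subsets), "subsets,", len(noise), "noise points")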
  • the classifier 10 executes an extraction process for extracting a part of the point cloud (point set) from the point cloud data DG.
  • the information processing apparatus 1 acquires, as input information, information that specifies an extraction target by using the GUI as described above.
  • the information defining the extraction target is, for example, N-dimensional data distribution (hereinafter referred to as target distribution) corresponding to the point set to be extracted.
  • the input control unit 11 causes the processing unit 7 to execute the process assigned to the input information. For example, when the input information indicating the target distribution is acquired, the input control unit 11 identifies the distribution specified by the input information. Then, the input control unit 11 causes the processing unit 7 to execute an extraction process for extracting a point set of a distribution similar to the target distribution as a process assigned to the input information.
  • the processing unit 7 extracts a point set from the point cloud data DG including a plurality of N-dimensional data based on the distribution specified by the input control unit 11.
  • the classifier 10 classifies a point set that satisfies a predetermined condition from the point cloud data DG.
  • the classifier 10 executes a process (hereinafter referred to as a classification process) for classifying a point set that satisfies a condition that the similarity to the target distribution is equal to or greater than a predetermined value as the predetermined condition.
  • the processing unit 7 extracts a part of the point group (point set) from the point cloud data DG when the classifier 10 executes the classification process.
  • Processing of the GUI unit 6 and the processing unit 7 in the extraction processing will be described with reference to the following figures.
  • FIG. 2 is a diagram showing a GUI screen according to the first embodiment.
  • the GUI screen W is displayed in the display area 2A of the display device 2 (see FIG. 1).
  • the GUI screen W is displayed in a part of the display area 2A, but may be displayed in full screen in the display area 2A.
  • the GUI screen W in FIG. 2 includes a window W1, a window W2, a window W3, and a window W4.
  • the point cloud image P1 is displayed in the window W1.
  • the point cloud image P1 is an image representing the distribution of the plurality of N-dimensional data D1 shown in FIG.
  • the N-dimensional data D1 is three-dimensional data, and one N-dimensional data D1 is represented by one point.
  • For example, one N-dimensional data D1 shown in FIG. 1 is (x1, y1, z1), and is represented in the point cloud image P1 by a point whose X coordinate is x1, whose Y coordinate is y1, and whose Z coordinate is z1.
  • When the information processing apparatus 1 receives a command for opening the point cloud data DG (a command for displaying the point cloud data DG) from the user based on the input information, the information processing apparatus 1 generates data of the point cloud image P1.
  • the output control unit 12 supplies the data of the generated point cloud image P1 to the OS unit 5, and the OS unit 5 displays the data on the window W1 of the GUI screen W.
  • The information processing apparatus 1 may remove noise from the point cloud data DG. For example, when the point cloud data DG is obtained by detecting an object, the information processing apparatus 1 may treat N-dimensional data D1 that is estimated not to constitute the structure of the object to be detected as noise and exclude it from the processing target. For example, the information processing apparatus 1 may count the number of other N-dimensional data D1 existing in a space of a predetermined radius centered on a first N-dimensional data D1 (data point) and determine the first N-dimensional data to be noise when the counted number is less than a threshold. The information processing apparatus 1 may generate the point cloud image P1 based on the point cloud data DG from which noise has been removed.
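  • A sketch of the neighbour-count noise test described above, assuming a k-d tree is used to count points within a fixed radius (the radius and threshold values are placeholders):

        # Illustrative noise removal: a point is treated as noise when fewer than
        # `min_neighbors` other points lie within `radius` of it.
        import numpy as np
        from scipy.spatial import cKDTree

        def remove_noise(points, radius=0.05, min_neighbors=5):
            tree = cKDTree(points)
            # query_ball_point returns, for each point, the indices of points within `radius`;
            # subtract 1 so the point does not count itself.
            counts = np.array([len(idx) - 1 for idx in tree.query_ball_point(points, radius)])
            return points[counts >= min_neighbors]

        points = np.random.rand(1000, 3)
        denoised = remove_noise(points)
        print(len(points) - len(denoised), "points removed as noise")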
  • The window W2 is output to the GUI screen W by the output control unit 12 when, for example, a right click is detected with the pointer P placed on the GUI screen W.
  • In the window W2, processing options regarding the point cloud data DG are displayed as input information options. [Analyze data], [Some operation 1], [Some operation 2], and [Some operation 3] are displayed in the window W2 of FIG. 2. These options are, for example, buttons to which commands are assigned.
  • Here, [Analyze data] is selected as the processing option.
  • The selected option is displayed with emphasis over the other buttons.
  • For example, [Analyze data] is displayed in a larger font than the other options in the window W2 (eg, [Some operation 1]).
  • The selected option ([Analyze data]) is also displayed with a mark (shown in the figure) indicating that it is being selected.
  • The input control unit 11 acquires information on the option selected using the GUI screen W among the input information options. For example, when it is detected that a left click is performed with the pointer P placed on [Analyze data], the input control unit 11 acquires the content of the process assigned to [Analyze data]. The content of the process assigned to each option is defined in the GUI information, and the input control unit 11 collates the input information with the GUI information and acquires the content of the process corresponding to this input information. Then, the input control unit 11 causes the process assigned to [Analyze data] to be executed. [Analyze data] is assigned a process for starting the extraction process.
  • the window W3 is generated.
  • the input control unit 11 causes the output control unit 12 to output the window W3.
  • the information on the window W3 is included in the GUI information, and the output control unit 12 acquires the information on the window W3 from the GUI information stored in the storage unit 8.
  • the output control unit 12 supplies information on the window W3 to the OS unit 5, and the OS unit 5 causes the display device 2 to display the window W3.
  • [Some operation 1], [Some operation 2], and [Some operation 3] are assigned other processes (eg, opening a file, outputting a result, and ending the application).
  • The GUI unit 6 may omit at least one of the options [Some operation 1], [Some operation 2], and [Some operation 3]. The GUI unit 6 may also provide options other than [Some operation 1], [Some operation 2], and [Some operation 3].
  • In the window W3, [Example] is selected as the option for the distribution designation method. The selected option is displayed with emphasis over the other buttons. For example, [Example] is displayed in a larger font than the other options of the window W3 (eg, [Already prepared]).
  • A mark indicating that the option is being selected is displayed together with the selected option ([Example]).
  • the input control unit 11 acquires information on options selected using the GUI screen W among the options of input information. For example, when it is detected that the left click is performed in a state where the pointer P is placed on [Example], the contents of the process assigned to [Example] are acquired. The content of the process assigned to the option is defined in the GUI information, and the input control unit 11 collates the input information with the GUI information and acquires the content of the process corresponding to this input information. Then, the input control unit 11 causes the process assigned to [Example] to be executed. [Example] is assigned a process of displaying options for selecting a distribution from predetermined candidates.
  • a window W4 is generated.
  • the input control unit 11 causes the output control unit 12 to output the window W4.
  • the information on the window W4 is included in the GUI information, and the output control unit 12 acquires the information on the window W4 from the GUI information stored in the storage unit 8.
  • the output control unit 12 supplies information on the window W4 to the OS unit 5, and the OS unit 5 causes the display device 2 to display the window W4.
  • distribution candidate categories are displayed as input information options.
  • [Geometric shape] and [Biological objects] are displayed as distribution candidate categories. These options are, for example, buttons to which commands are assigned.
  • In this example, the input information is information related to a geometric shape (information specifying [Geometric shape]).
  • [Geometric shape] is selected as the category of distribution candidates.
  • The selected option is displayed with more emphasis than the other buttons. For example, [Geometric shape] is displayed in a larger font than [Biological objects].
  • The selected [Geometric shape] is also displayed with a mark indicating that it is being selected.
  • the input control unit 11 acquires information on options selected using the GUI screen W among the options of input information. For example, when it is detected that the left click is performed in a state where the pointer P is placed on [Geometric shape], the contents of the process assigned to [Geometric shape] are acquired. The content of the process assigned to the option is defined in the GUI information, and the input control unit 11 collates the input information with the GUI information and acquires the content of the process corresponding to this input information. Then, the input control unit 11 causes the process assigned to [Geometric shape] to be executed. [Geometric shape] is assigned a process of displaying a geometric candidate representing a distribution as a predetermined candidate.
  • a window W5 is generated.
  • the input control unit 11 causes the output control unit 12 to output the window W5.
  • the information on the window W5 is included in the GUI information, and the output control unit 12 acquires the information on the window W5 from the GUI information stored in the storage unit 8.
  • the output control unit 12 supplies information on the window W5 to the OS unit 5, and the OS unit 5 causes the display device 2 to display the window W5.
  • geometric shape candidates representing a distribution are displayed as input information options.
  • [Sphere], [Ellipsoid], [Star], [Etc ...] are displayed as geometric shape candidates. These options are, for example, buttons to which commands are assigned.
  • [Ellipsoid] is selected as a geometric candidate.
  • the selected option ([Ellipsoid]) is displayed with emphasis over the other buttons. For example, [Ellipsoid] is displayed in a larger font than [Sphere].
  • The selected [Ellipsoid] is also displayed with a mark indicating that it is being selected.
  • the input control unit 11 acquires information on options selected using the GUI screen W among the options of input information. For example, when it is detected that the left click is performed in a state where the pointer P is placed on [Ellipsoid], the contents of the process assigned to [Ellipsoid] are acquired. The content of the process assigned to the option is defined in the GUI information, and the input control unit 11 collates the input information with the GUI information and acquires the content of the process corresponding to this input information. Then, the input control unit 11 causes the process assigned to [Ellipsoid] to be executed. [Ellipsoid] is assigned a process for designating the distribution of data points that fall within an ellipsoid as the target distribution for extraction.
  • [Sphere] indicates that the target distribution is a spherical distribution.
  • [Star] indicates that the target distribution falls within a star shape.
  • [Etc ...] indicates that another geometric shape is designated as the target distribution. For example, when [Etc ...] is selected, the user can designate the geometric shape as a target distribution by, for example, reading data defining the geometric shape.
  • the input control unit 11 causes the processing unit 7 to perform an extraction process using a distribution that falls within an ellipsoid as a target distribution.
  • the processing unit 7 extracts N-dimensional data belonging to a subset whose outer shape is approximated by an ellipsoid from the point cloud data DG.
  • Information related to the size of the geometric shape may also be settable.
  • FIG. 3 is a diagram illustrating processing by the processing unit according to the first embodiment.
  • In FIG. 3, the symbol Ka denotes the target distribution.
  • The processing unit 7 (clustering unit 9) divides the point cloud included in the point cloud image into a plurality of point clouds (subsets).
  • The symbols Kb1 to Kb6 in FIG. 3 denote distributions corresponding to the point clouds (subsets) divided by the clustering unit 9.
  • The distributions Kb1 to Kb6 are distributions of the N-dimensional data D1 in partial spaces (eg, ROIs) selected (cut out) from the data space in which the point cloud data DG is contained.
  • the processing unit 7 extracts a part of the point group (point set) based on the feature quantity of the divided point group (subset) and the geometric feature quantity.
  • the classifier 10 calculates the feature amount of the subset divided by the clustering unit 9 and compares (matches) with the feature amount (eg, geometric feature amount) specified by the input information.
  • the classifier 10 classifies the subset as a point set when the feature amount of the subset divided by the clustering unit 9 matches the feature amount (for example, the geometric feature amount) specified by the input information.
  • The feature amount may be the size of the structure.
  • The size is an absolute size in real space (absolute value) or a relative size (relative value).
  • The processing unit 7 may divide the point cloud included in the point cloud image into a plurality of point clouds (subsets) and extract a part of the point cloud based on the size of the shape represented by each divided point cloud and the size specified by the input information.
  • The classifier 10 may classify the point set based on the similarity between the point cloud (target distribution) specified by the input information and the distribution of points corresponding to each subset. For example, the classifier 10 calculates the similarity of the distribution Kb1, the distribution Kb2, and so on with the target distribution Ka. For example, the classifier 10 calculates the similarity Q1 between the distribution Kb1 and the target distribution Ka. The similarity Q1 is, for example, the value obtained by subtracting from 1 the square root of the sum of the squared distances (norms) between the N-dimensional data D1 selected from the distribution Kb1 and the corresponding N-dimensional data D1 selected from the target distribution Ka, divided by the number of data.
  • the similarity Q1 may be a correlation coefficient between the distribution Kb1 and the target distribution Ka, for example. The same applies to the similarity Q2, the similarity Q3,.
  • the processing unit 7 may convert the distribution Kb1 and calculate the similarity between the converted distribution Kb1 and the target distribution Ka.
  • the transformation includes, for example, at least one of a translation transformation, a transformation to rotate, a linear transformation, a scale transformation, and a transformation combining two or more of these transformations (eg, affine transformation).
  • the type of conversion may be determined in advance or may be set according to input information from the user.
  • the classifier 10 determines whether or not to extract the distribution Kb1, the distribution Kb2,... By comparing the calculated similarity with a threshold value. For example, the classifier 10 determines that the N-dimensional data D1 belonging to the distribution Kb1 is extracted from the point group data DG when the similarity Q1 between the distribution Kb1 and the target distribution Ka is equal to or greater than a threshold value. Further, when the similarity Q1 between the distribution Kb1 and the target distribution Ka is less than the threshold, the classifier 10 determines that the N-dimensional data D1 belonging to the distribution Kb1 is not extracted from the point cloud data DG. The classifier 10 extracts a set of N-dimensional data D1 determined to be extracted as a partial point group (point set). The classifier 10 causes the storage unit 8 to store the extracted point set information as a processing result.
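  • A sketch of the similarity and threshold test described above, assuming two equal-length sets of corresponding points and a placeholder threshold value:

        # Illustrative similarity between a subset distribution and the target distribution:
        # Q = 1 - sqrt(mean squared distance between corresponding points).
        import numpy as np

        def similarity(subset_points, target_points):
            diffs = subset_points - target_points   # assumes point correspondence
            return 1.0 - np.sqrt(np.mean(np.sum(diffs ** 2, axis=1)))

        rng = np.random.default_rng(0)
        target = rng.random((50, 3))                                   # stand-in for the target distribution Ka
        subset = target + rng.normal(scale=0.01, size=target.shape)    # stand-in for a distribution Kb1

        q = similarity(subset, target)
        threshold = 0.9                                                # placeholder threshold
        print("extract" if q >= threshold else "do not extract", q)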
  • The classifier 10 may classify the point set as follows.
  • As a condition for classifying the point set, a condition that the geometric shape ([Geometric shape]) is an ellipsoid ([Ellipsoid]) is specified.
  • The following processing is performed on each point (each N-dimensional data D1) included in the point cloud data DG: when the number of points existing within an area of a certain radius centered on that point is equal to or greater than a predetermined value, the points in this area are defined as one subset (lump).
  • The processing unit 7 calculates the feature amount of each subset classified by the clustering unit 9. For example, the processing unit 7 calculates the ratio between the major-axis length and the minor-axis length of the outer shape of the structure represented by the subset as the feature amount.
  • The classifier 10 classifies (extracts), as a point set, a subset for which the feature amount calculated by the processing unit 7 satisfies the above classification condition. For example, when the condition that the geometric shape ([Geometric shape]) is an ellipsoid ([Ellipsoid]) is specified as the classification condition, the classifier 10 classifies the subset as a sphere when the ratio of the major-axis length to the minor-axis length calculated as the feature amount by the processing unit 7 falls within a predetermined range, and classifies the subset as an ellipsoid when the ratio is outside the predetermined range (eg, less than 0.9 or greater than 1.1).
  • The classifier 10 may also classify based on parameters (eg, feature quantities) other than similarity.
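  • A sketch of the axis-ratio feature and the sphere/ellipsoid decision described above, assuming the major- and minor-axis lengths are estimated from a principal component analysis of the subset (the 0.9/1.1 bounds follow the example in the text):

        # Illustrative classification of a subset from the ratio of its
        # major-axis length to its minor-axis length.
        import numpy as np

        def axis_ratio(subset_points):
            """Estimate the major/minor axis ratio from the covariance eigenvalues (PCA)."""
            centered = subset_points - subset_points.mean(axis=0)
            eigvals = np.linalg.eigvalsh(np.cov(centered.T))   # ascending order
            return np.sqrt(eigvals[-1] / max(eigvals[0], 1e-12))

        def classify(subset_points, lo=0.9, hi=1.1):
            ratio = axis_ratio(subset_points)
            return "sphere" if lo < ratio < hi else "ellipsoid"

        rng = np.random.default_rng(1)
        blob = rng.normal(size=(200, 3)) * np.array([3.0, 1.0, 1.0])   # elongated cluster
        print(classify(blob))                                          # -> "ellipsoid"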
  • the display control unit displays an extracted point cloud image based on a part of the point cloud (point set) extracted by the processing unit 7.
  • the extracted point cloud image is an image representing a part of a point cloud (point set) extracted from the point cloud image.
  • the output control unit 12 causes the extraction point cloud image based on the extraction result by the processing unit 7 to be output to the GUI screen W.
  • FIG. 4 is a diagram showing an extracted point cloud image according to the first embodiment.
  • the output control unit 12 outputs the distribution of the subset extracted by the processing unit 7 to the GUI screen W as the extracted point cloud image P2.
  • The display control unit (output control unit 12) may display the part of the point cloud (point set) extracted by the processing unit 7 in the extracted point cloud image P2 so that one or both of its color and brightness differ from those in the point cloud image P1.
  • The display control unit (output control unit 12) may display the part of the point cloud (point set) extracted by the processing unit 7 in a color different from that of the other point clouds, or with a brightness different from that of the other point clouds.
  • the display control unit (output control unit 12) may display only a part of the point group (point set) extracted by the processing unit 7 in the extracted point group image.
  • the display control unit (output control unit 12) may display the extracted point cloud image excluding the point cloud other than the point set from the point cloud included in the point cloud image.
  • the input control unit 11 causes the output control unit 12 to execute the process of outputting the extracted point cloud image P2 when the process of outputting the extracted point cloud image P2 is designated by the input information input by the user.
  • the output control unit 12 supplies data of the extracted point cloud image P2 generated using the processing result stored in the storage unit 8 to the OS unit 5.
  • the OS unit 5 outputs the data of the extracted point cloud image P2 to the display device 2 and displays the extracted point cloud image P2 on the GUI screen W.
  • the extracted point cloud image P2 shows the result of the extraction process when an ellipsoid is designated as the geometric shape representing the target distribution Ka, as described with reference to FIGS.
  • the extracted point cloud image P2 includes a distribution Kc of N-dimensional data D1 belonging to a subset determined to have an outer shape similar to an ellipsoid.
  • The extracted point cloud image P2 is an image in which the subsets indicated by a triangle or a rectangle are excluded from the point cloud image P1. Note that which subsets are extracted changes depending on the threshold value used to determine whether or not the outline of a subset is similar to an ellipsoid. This threshold value may be changeable according to input information input by the user.
  • FIG. 5 is a diagram showing an extracted point cloud image according to the first embodiment.
  • The extracted point cloud image P3 in FIG. 5 corresponds to the processing result of the extraction processing when the threshold value for determining whether or not the outer shape of a subset is similar to an ellipsoid is changed.
  • The similarity threshold for extracting the point sets corresponding to the extracted point cloud image P3 in FIG. 5 is set higher than the similarity threshold for extracting the point sets corresponding to the extracted point cloud image P2 in FIG. 4.
  • The extracted point cloud image P3 in FIG. 5 is an image in which shapes lacking part of an ellipsoid, which appear in the extracted point cloud image P2 in FIG. 4, are excluded.
  • The number of point sets (distributions Kc of extracted N-dimensional data D1) included in the extracted point cloud image P3 in FIG. 5 is less than the number of point sets (distributions Kc of extracted N-dimensional data D1) included in the extracted point cloud image P2 in FIG. 4.
  • the information processing apparatus 1 may not remove noise from the point cloud data DG. For example, at least a part of the noise is excluded from the extracted point cloud image P2 by determining that the noise is not similar to the target distribution Ka.
  • FIG. 6 is a flowchart illustrating the information processing method according to the first embodiment.
  • In step S1, the output control unit 12 causes the display unit (display device 2) to output the GUI screen W.
  • In step S2, the information processing apparatus 1 acquires the point cloud data DG.
  • In step S3a, the information processing apparatus 1 removes noise from the point cloud data DG.
  • In step S3b, the clustering unit 9 classifies subsets from the point cloud data DG from which noise has been removed (performs clustering processing on the point cloud data DG).
  • In step S4, the information processing apparatus 1 generates the point cloud image P1 based on the point cloud data DG from which noise has been removed.
  • At least a part of the processing from step S2 to step S4 can be executed at an arbitrary timing before the processing of step S5 described below. For example, at least a part of the process from step S2 to step S4 may be executed before the start of the process of step S1, or may be executed in parallel with the process of step S1, and the end of the process of step S1. It may be performed later.
  • In step S5, the output control unit 12 causes the point cloud image P1 generated in step S4 to be output to the GUI screen W.
  • In step S6, the input control unit 11 acquires input information using the GUI screen W.
  • The input control unit 11 acquires, as the input information, information regarding the point cloud designated in the point cloud image.
  • A feature amount is specified by the input information as an extraction condition.
  • For example, when the input information is a geometric shape (eg, [Geometric shape] and [Ellipsoid] in FIG. 2), the input information specifies the feature quantity of the ellipsoid (eg, the ratio of the major-axis length to the minor-axis length).
  • In step S7, the processing unit 7 extracts a partial set (point set) based on the input information.
  • In step S8, the classifier 10 calculates the feature quantities of the clustered subsets and compares the feature quantity of each subset with the feature quantity based on the input information. For example, when the input information is a geometric shape (eg, [Geometric shape] and [Ellipsoid] in FIG. 2), the processing unit 7 fits the shape represented by the subset to an ellipsoid and calculates the ratio of the major-axis length to the minor-axis length as the feature quantity of the subset.
  • The classifier 10 compares the feature quantity of the geometric shape based on the input information (eg, the ratio of the major axis to the minor axis is 0.9 or less, or 1.1 or more) with the feature quantity of the subset.
  • The classifier 10 classifies the subset as a partial point cloud (point set) when the feature quantity of the subset satisfies a predetermined relationship with the feature quantity based on the input information.
  • For example, when the feature quantity of the subset matches the feature quantity of the ellipsoid based on the input information (eg, the ratio of the major axis to the minor axis is 0.9 or less, or 1.1 or more), the classifier 10 classifies this subset as an ellipsoid.
  • the processing unit 7 stores the extracted point set information in the storage unit 8.
  • In step S10, the output control unit 12 outputs the extraction result.
  • the output control unit 12 causes the extraction point group image P2 representing the extraction result by the processing unit 7 to be output to the GUI screen W.
  • the output control unit 12 may not output the extraction result by the processing unit 7 to the GUI screen W.
  • the output control unit 12 may cause the device (eg, printer) other than the display device 2 to output the extraction result by the processing unit 7.
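  • A compact sketch tying steps S2 to S10 together (noise removal, clustering, feature-based classification, output); the algorithm choices, parameter values, and data below are assumptions for illustration, not the patent's implementation:

        # Illustrative end-to-end flow corresponding to steps S2-S10 of FIG. 6.
        import numpy as np
        from scipy.spatial import cKDTree
        from sklearn.cluster import DBSCAN

        def extraction_pipeline(points, radius=0.05, min_neighbors=5, lo=0.9, hi=1.1):
            # Step S3a: remove noise (points with too few neighbours within `radius`).
            counts = np.array([len(i) - 1 for i in cKDTree(points).query_ball_point(points, radius)])
            points = points[counts >= min_neighbors]
            # Step S3b: cluster the remaining points into subsets.
            labels = DBSCAN(eps=radius, min_samples=min_neighbors).fit_predict(points)
            extracted = []
            for k in set(labels) - {-1}:                    # label -1 marks residual noise
                subset = points[labels == k]
                # Steps S7-S9: keep subsets whose major/minor axis ratio lies outside [lo, hi].
                eig = np.linalg.eigvalsh(np.cov((subset - subset.mean(axis=0)).T))
                ratio = np.sqrt(eig[-1] / max(eig[0], 1e-12))
                if not (lo < ratio < hi):
                    extracted.append(subset)
            return extracted                                # Step S10: the extraction result

        rng = np.random.default_rng(0)
        blobs = np.concatenate([c + rng.normal(scale=0.01, size=(100, 3)) for c in rng.random((10, 3))])
        print(len(extraction_pipeline(blobs)), "point sets extracted")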
  • As described above, the information processing method according to the embodiment includes displaying the point cloud image on the display unit, acquiring the input information input by the input unit, extracting a part of the point cloud from the point cloud included in the point cloud image based on the input information, and displaying an extracted point cloud image based on the extracted part of the point cloud on the display unit.
  • FIG. 7 is a diagram showing a GUI screen according to the first embodiment.
  • In this example, the input information is the type of structure (information specifying [Biological objects]).
  • In FIG. 2, [Geometric shape] is selected as the distribution candidate category, but in the window W4 in FIG. 7, [Biological objects] is selected.
  • [Biological objects] indicates that the type of structure corresponding to the point set extracted by the processing unit 7 is specified as a distribution candidate category.
  • [Biological objects] is assigned a process of displaying candidate types of structures as predetermined candidates.
  • the window W6 is generated.
  • The process for generating the window W6 is the same as the process for generating the window W5 described above.
  • candidates for the type of structure to be extracted are displayed as choices of input information.
  • [Clathrin], [Mitochondria], and [Tubulin] are displayed as candidates for the type of structure to be extracted.
  • [Clathrin] is assigned a process for specifying clathrin as an extraction target.
  • the processing unit 7 (clustering unit 9) divides the point group included in the point group image into a plurality of point groups (subsets).
  • the processing unit 7 (classifier 10) extracts a part of the point group based on the feature amount of the point group (subset) divided by the clustering unit 9 and the feature amount of the structure.
  • the storage unit 8 stores information on the feature amount of the structure.
  • the information regarding the feature amount of the structure is information that defines the shape of the structure (eg, clathrin), for example.
  • the information on the feature amount of the structure is information defining the distribution of the N-dimensional data D1 corresponding to the shape of the structure (eg, clathrin), for example.
  • The input control unit 11 designates a distribution corresponding to the shape of clathrin as the distribution of the N-dimensional data D1 in the point set to be extracted, and causes the processing unit 7 to execute the extraction process.
  • the processing unit 7 reads the distribution information corresponding to the shape of the clathrin from the storage unit 8 and executes the extraction process.
  • a process for designating mitochondria as an extraction target is assigned to [Mitochondria].
  • [Tubulin] is assigned a process for specifying tubulin as an extraction target. The process when [Mitochondria] or [Tubulin] is selected is the same as the process when [Clathrin] is selected.
  • A point set may also be extracted by selecting [Input trained data] or [Targeting] to specify the conditions (eg, the type or shape of the structure).
  • FIG. 8 is a diagram showing a GUI screen according to the first embodiment.
  • In FIG. 2, [Example] is selected as the option for the distribution designation method, but in the window W3 in FIG. 8, [Targeting] is selected.
  • [Targeting] indicates that a method for specifying a distribution by a graphic drawn by the user on the GUI screen W is selected as a distribution specifying method.
  • [Targeting] is assigned a process of displaying candidates for a method of drawing a graphic on the GUI screen W.
  • the window W7 is generated.
  • the process for generating the window W7 is the same as the process for generating the window W4 described with reference to FIG.
  • [Rectangular domain] and [Draw curve] are displayed as candidates for a method of drawing a graphic on the GUI screen W.
  • [Rectangular domain] is selected.
  • [Rectangular domain] indicates that the distribution of the N-dimensional data D1 inside the specified area is specified as the target distribution by specifying the rectangular parallelepiped area.
  • the input control unit 11 displays a rectangular parallelepiped area AR1 at the position of the pointer P when it is detected that the left click is performed with the pointer P placed on the point cloud image P1.
  • When it is detected that the pointer P is placed on a side of the area AR1 and dragged in a direction crossing that side, the input control unit 11 expands or contracts the area AR1 in the movement direction of the pointer P and displays the updated area AR1. In this way, the user can expand and contract the area AR1 in each of the X direction, the Y direction, and the Z direction.
  • the input control unit 11 displays the point cloud image P1 whose viewpoint has been changed.
  • the information processing apparatus 1 executes rendering processing based on the direction and amount of movement of the pointer P by dragging, and generates a point cloud image P1 with a changed viewpoint.
  • the information processing apparatus 1 can also display the point cloud image P1 by zooming (eg, zooming in or zooming out) based on the input information.
  • the input control unit 11 causes the output control unit 12 to execute a process of displaying the point cloud image P1 whose viewpoint has been changed. In this way, the user can appropriately expand and contract the area AR1 while viewing the point cloud image P1 from different directions, and specify the area AR1 so as to surround a desired subset.
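  • A sketch of selecting the target distribution with a rectangular-parallelepiped area AR1, assuming the area is axis-aligned and given by its minimum and maximum corners (placeholder values):

        # Illustrative selection of points inside an axis-aligned cuboid area AR1.
        import numpy as np

        points = np.random.rand(1000, 3)        # stand-in for the point cloud data DG
        ar1_min = np.array([0.2, 0.2, 0.2])     # lower corner of the area AR1
        ar1_max = np.array([0.5, 0.6, 0.4])     # upper corner of the area AR1

        inside = np.all((points >= ar1_min) & (points <= ar1_max), axis=1)
        target_distribution = points[inside]    # distribution designated as the target
        print(len(target_distribution), "points inside AR1")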
  • FIG. 9 is a diagram showing a GUI screen according to the first embodiment.
  • [Targeting] is selected as the distribution designation method.
  • [Draw curve] is selected as the candidate for the method of drawing a graphic on the GUI screen W.
  • [Draw curve] indicates that the user draws a free curve while moving the pointer P, and designates a distribution surrounded by the free curve as a target distribution.
  • When it is detected that the pointer P is dragged while placed on the point cloud image P1, the input control unit 11 displays a curve P4 corresponding to the locus of the pointer P starting from the position of the pointer P at the start of the drag.
  • the input control unit 11 sets the position of the pointer P when the drag is released as the end point of the curve P4.
  • The input control unit 11 determines whether or not the curve P4 drawn by the user includes a closed curve.
  • When the curve P4 does not include a closed curve, the input control unit 11 adjusts the curve P4 so that it includes a closed curve, for example by interpolation processing.
  • a three-dimensional region can be specified by using the point cloud image P1 whose viewpoint has been changed, as in the case where [Rectangular domain] is selected.
  • the input control unit 11 designates the distribution of the N-dimensional data D1 inside the closed curve included in the curve P4 as the target distribution.
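  • A 2D sketch of the [Draw curve] designation, assuming the drawn closed curve is approximated by a polygon of pointer positions and that the test is applied to the projected (X, Y) coordinates of the points; matplotlib's Path is used for the point-in-polygon test:

        # Illustrative: designate the points whose projected (X, Y) coordinates fall
        # inside a closed curve drawn by the user (approximated here as a polygon).
        import numpy as np
        from matplotlib.path import Path

        points = np.random.rand(1000, 3)   # stand-in for the point cloud data DG
        curve_p4 = [(0.2, 0.2), (0.8, 0.25), (0.7, 0.7), (0.3, 0.6)]   # pointer locus (closed polygon)

        closed_curve = Path(curve_p4, closed=True)
        inside = closed_curve.contains_points(points[:, :2])   # test the X-Y projection of each point
        target_distribution = points[inside]
        print(len(target_distribution), "points inside the closed curve")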
  • FIGS. 10 and 11 are diagrams showing a GUI screen according to the first embodiment.
  • [Input trained data] is selected as an option for the distribution designation method.
  • [Input trained data] is an option indicating that information defining a distribution extracted by the processing unit is read as a distribution designation method. For example, a process of reading information obtained by machine learning (described later with reference to FIGS. 16 to 20) is assigned to [Input trained data].
  • The input control unit 11 causes the output control unit 12 to display the window W8 when it is determined from the user's input information that [Input trained data] is selected.
  • [Drop here] in the window W8 indicates that a file including information defining the target distribution is designated by drag and drop.
  • The input control unit 11 displays a window W9 indicating the hierarchy of files managed by the OS unit 5. Alternatively, the window W9 may be displayed by the output control unit 12. In this case, the user can specify a file including information defining the target distribution in the window W9.
  • Instead of reading a learning result file using [Input trained data], the information processing apparatus 1 may read a file that defines extraction conditions and extract a point set based on the definition. For example, the information processing apparatus 1 may extract a point set by reading a file defining a geometric feature quantity as a file defining an extraction condition via [Etc ...] in the window W5.
  • the input information options described with reference to FIGS. 2 and 7 to 11 can be changed as appropriate.
  • the GUI unit 6 may not provide some of the input information options described with reference to FIGS. 2 and 7 to 11. Further, the GUI unit 6 may provide options different from the input information options described with reference to FIGS. 2 and 7 to 11.
  • The processing unit 7 may select a partial region (eg, an ROI) from the space in which the point cloud data DG is defined and calculate the similarity between the distribution of the N-dimensional data D1 in the selected region and the designated distribution.
  • The classifier 10 may classify the selected region as a subset when the calculated similarity is equal to or greater than a threshold value.
  • The processing unit 7 may extract subsets by changing the position of the partial region and repeating the process of determining whether the partial region corresponds to a subset to be extracted.
  • In the second embodiment, the information processing apparatus 1 has the same configuration as that shown in FIG. 1, which is referred to as appropriate for the configuration of the information processing apparatus 1.
  • the output control unit 12 displays the subset classified by the clustering unit 9 as a distribution candidate specified by the user using the input information.
  • FIG. 12 is a diagram illustrating a GUI screen output by the output control unit based on the subset classified by the clustering unit according to the second embodiment.
  • the GUI screen W in FIG. 12 includes a window W10.
  • Each symbol Kd in the window W10 indicates the distribution of the N-dimensional data D1 in a subset classified by the clustering unit 9.
  • The output control unit 12 may display the subset identification number assigned by the clustering unit 9 together with the distribution Kd.
  • As described above, the information processing apparatus 1 can display the distribution of at least a part of the point cloud data DG with the viewpoint changed. Similarly, in FIG. 12, the information processing apparatus 1 can display the distribution of the N-dimensional data D1 for each subset with the viewpoint changed. For example, when the input control unit 11 detects that the pointer P is dragged in a state where the pointer P is placed on a subset distribution Kd, the input control unit 11 displays that distribution Kd with the viewpoint changed.
  • FIG. 13 is a diagram showing a distribution designation method using the GUI screen according to the second embodiment.
  • the user can select each of the plurality of distributions Kd displayed in the window W10 based on the input information.
  • the user can specify whether or not the selected distribution Kd (hereinafter referred to as distribution Kd1) is to be extracted by input information.
  • The input control unit 11 determines which distribution has been selected by the user's input information.
  • the input control unit 11 displays the selected distribution Kd1 so as to be distinguishable from other distributions Kd.
  • the input control unit 11 displays a frame (indicated by a thick line in FIG. 13) surrounding the distribution Kd1 in a color or brightness different from the frame surrounding the other distribution Kd.
  • the user can specify a distribution to be extracted (hereinafter referred to as an extraction target distribution) by input information.
  • the input control unit 11 determines that the distribution Kd1 is designated as the extraction target distribution when it is detected that the left click is performed in a state where the pointer P is arranged on the selected distribution Kd1.
  • the symbols Kd2 and Kd3 represent distributions determined to be designated as the extraction target distribution.
  • the input control unit 11 displays the distribution Kd2 and the distribution Kd3 so as to be distinguishable from other distributions Kd.
  • the input control unit 11 displays a frame (indicated by a two-dot chain line in FIG. 13) surrounding the distribution Kd2 in a color or brightness different from that of the frame surrounding the other distribution Kd.
  • the user can specify a distribution to be excluded from extraction (hereinafter referred to as an extraction exclusion distribution) by input information.
  • the input control unit 11 determines that the distribution Kd1 is designated as the extraction exclusion distribution when it is detected that the right-click is performed in a state where the pointer P is placed on the selected distribution Kd1.
  • The distribution Kd determined to be designated as the extraction exclusion distribution is represented by the symbol Kd4.
  • the input control unit 11 displays the distribution Kd4 so as to be distinguishable from other distributions Kd.
  • the input control unit 11 displays a frame (indicated by a dotted line in FIG. 13) surrounding the distribution Kd4 in a color or brightness different from the frame surrounding the other distribution Kd.
  • the processing unit 7 extracts a point set from the point group based on the similarity between the distribution specified by the input control unit 11 and the distribution of N-dimensional data in the subset classified by the clustering unit 9.
  • FIG. 15 is a diagram illustrating processing by the processing unit according to the second embodiment.
  • the processing unit 7 calculates the similarity between each of the distributions Kd (Kd2, Kd3) determined to be designated as the extraction target distribution and each of the subset distributions Kd classified by the clustering unit 9.
  • The distributions Kd for which the similarity with the distributions Kd (Kd2, Kd3) determined to be designated as the extraction target distributions is calculated are represented by the symbol Kd5.
  • the distribution Kd (Kd2, Kd3) determined to be designated as the extraction target distribution has a maximum similarity with itself, and may be excluded from the partner distribution Kd5 for calculating the similarity.
  • the processing unit 7 extracts a distribution whose similarity is equal to or greater than a threshold for at least one of the distributions Kd (Kd2, Kd3) determined to be designated as the extraction target distribution.
  • one of the distributions Kd5 is represented by a symbol Kd51.
  • the processing unit 7 extracts the distribution Kd51 when one or both of the similarity Q11 between the distribution Kd2 and the distribution Kd5 (Kd51) and the similarity Q21 between the distribution Kd3 and the distribution Kd51 are equal to or greater than a threshold value.
  • The processing unit 7 may determine that the distribution Kd51 is a distribution that is not similar to the target distribution when only one of the similarity Q11 and the similarity Q21 is less than the threshold. Further, the processing unit 7 may determine whether or not the distribution is similar to the target distribution based on a value (for example, an average value) calculated from the similarity Q11 and the similarity Q21. In FIG. 15, two distributions Kd2 and Kd3 are shown as target distributions, but the number of target distributions may be one, or three or more.
  • for the distribution Kd4 (see FIG. 13) to be excluded from the extraction, the processing unit 7 calculates the degree of similarity with the distribution Kd5 in the same manner as for the distributions (Kd2, Kd3) determined to be designated as the extraction target distribution.
  • the processing unit 7 determines that a distribution Kd5 whose similarity to the distribution Kd4 is equal to or greater than the threshold is dissimilar to the target distribution.
  • the processing unit 7 excludes the distribution determined to be dissimilar to the target distribution from the extraction.
  • the processing unit 7 calculates the degree of similarity of each distribution Kd4 with each of the plurality of distributions Kd5.
  • the processing unit 7 may determine whether or not the distribution is dissimilar to the target distribution based on a plurality of values calculated as the degrees of similarity between each distribution Kd4 and the plurality of distributions Kd5. For example, the processing unit 7 may determine that the distribution is not similar to the target distribution when the maximum value of the plurality of values is equal to or greater than a threshold value. Further, the processing unit 7 may determine that the distribution is not similar to the target distribution when the minimum value of the plurality of values is equal to or greater than a threshold value. Further, the processing unit 7 may determine that the distribution is not similar to the target distribution when a value (eg, an average value) calculated from the plurality of values is equal to or greater than a threshold value.
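As an illustration of the similarity-threshold extraction described above, the sketch below summarizes each clustered subset by a simple feature vector and compares it with the designated distributions by cosine similarity. This is not the patented implementation: the descriptor (point count, bounding-box extents, mean nearest-neighbor spacing), the cosine measure, the threshold value, and the function names `describe` and `extract_similar` are all illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def describe(points):
    """Summarize one clustered subset (M x N array) as a feature vector."""
    extent = points.max(axis=0) - points.min(axis=0)      # bounding-box size
    d, _ = cKDTree(points).query(points, k=2)             # nearest-neighbor spacing
    return np.concatenate(([len(points)], extent, [d[:, 1].mean()]))

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def extract_similar(subsets, targets, excluded=(), threshold=0.95):
    """Return the subsets whose distribution is similar to at least one target
    distribution and not similar to any distribution excluded from extraction."""
    t_feats = [describe(t) for t in targets]
    x_feats = [describe(x) for x in excluded]
    picked = []
    for s in subsets:
        f = describe(s)
        if any(cosine(f, t) >= threshold for t in t_feats) and \
           not any(cosine(f, x) >= threshold for x in x_feats):
            picked.append(s)
    return picked
```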
  • FIG. 16 is a flowchart illustrating an information processing method according to the second embodiment.
  • the processing from step S1 to step S3 is the same as the processing described in FIG.
  • the clustering unit 9 classifies the point cloud data DG into a subset.
  • the process of step S3 may be executed as part of the process of step S11.
  • the clustering unit 9 may classify noise when classifying the N-dimensional data D1 included in the point cloud data DG into a subset.
  • the process of step S3 may not be a part of the process of step S11, and does not need to be performed.
  • step S4 and the processing in step S5 are the same as the processing described in FIG.
  • In step S6, the input control unit 11 acquires input information using the GUI screen W.
  • In step S6a, the output control unit 12 displays the distribution of the N-dimensional data D1 in the subsets classified by the clustering unit 9 in step S11 on the GUI screen W (see FIGS. 12 and 13).
  • In step S11, the input control unit 11 specifies the distribution designated by the input information (see FIG. 13).
  • In step S12, the processing unit 7 extracts a subset based on the distribution specified in step S11 (see FIG. 15).
  • In step S13, which is part of step S12, the processing unit 7 calculates the similarity between the distribution of each clustered subset and the specified distribution.
  • the distribution of the clustered subset is a distribution of N-dimensional data included in the subset classified by the clustering unit 9 in step S3b. Further, the specified distribution is the distribution specified by the input control unit 11 as the distribution specified by the input information in step S11.
  • the classifier 10 classifies a subset whose similarity is equal to or higher than the threshold as a part of the point group (point set).
  • that is, the classifier 10 extracts (classifies) the subset as a point set similar to the distribution specified in step S11 when the similarity calculated in step S13 is equal to or greater than the threshold.
  • the processing unit 7 stores the extracted point set information in the storage unit 8.
  • FIG. 16 is a diagram illustrating an information processing apparatus according to the third embodiment.
  • the information processing apparatus 1 includes a machine learning unit 15.
  • the information processing apparatus 1 generates the classifier 10 by the machine learning unit 15.
  • based on the input information acquired by the input control unit 11, the machine learning unit 15 generates, by machine learning, an index (eg, a determination criterion or an evaluation function) used when the processing unit 7 extracts a point set from the point cloud data DG.
  • Examples of machine learning methods include Neural network (eg, Deep learning), support vector machine, regression forest, and the like.
  • the machine learning unit 15 executes machine learning by combining one or two or more of the above-described machine learning methods or other machine learning methods.
  • the input information acquisition unit acquires the teacher data of the machine learning unit 15 as input information. As described with reference to FIG. 13, the input control unit 11 acquires, as input information, information representing the target distributions (eg, distribution Kd2, distribution Kd3) to be extracted by the processing unit 7.
  • the machine learning unit 15 executes machine learning using the target distribution obtained from the input information acquired by the input control unit 11 as teacher data.
  • the processing unit 7 extracts a part of the point group (point set) from the point cloud data DG based on the index generated by the machine learning unit 15.
  • the above teacher data includes information that defines a part of the point group (point set) to be extracted by the processing unit 7 (information indicating the distribution to be extracted, ie, correct-answer teacher data). Further, the teacher data includes information that defines a point group that the processing unit 7 excludes from the extraction (information indicating a distribution that is not extracted, ie, incorrect-answer teacher data).
  • the user can input information representing a distribution to be extracted and information representing a distribution not to be extracted.
  • FIG. 17 is a diagram illustrating processing for designating a distribution. As described with reference to FIG. 8, the user can specify an area using the GUI screen W.
  • In FIG. 17, an area designated as containing distributions to be extracted is represented by the symbol AR3 (two-dot chain line), and an area designated as containing distributions not to be extracted is represented by the symbol AR4 (dotted line).
  • the input control unit 11 identifies the distributions included in the area AR3, and thereby specifies the distributions designated by the user (represented by the symbols Ke1, Ke2, and Ke3) as the distributions to be extracted.
  • the distributions included in the area AR4 in FIG. 17 correspond to information defining a point group to be excluded from extraction by the processing unit 7, and form a distribution group (group G2) identified by the input control unit 11 as distributions that are not extracted.
  • the distribution information included in the group G2 can be used as teacher data representing an incorrect answer. Note that the user may designate one or both of the distribution to be extracted and the distribution not to be extracted by selecting a candidate from the list (see FIG. 13).
  • FIG. 18 is a diagram illustrating processing by the machine learning unit and the processing unit according to the third embodiment.
  • the machine learning unit 15 calculates a feature amount for each of the distributions Ke1 to Ke3 selected as the target distribution extracted by the processing unit 7.
  • the types of feature amounts are, for example, the size of the space occupied by the distribution, the number density of the N-dimensional data D1 in the distribution, the curvature of the space occupied by the distribution, and the like.
  • the machine learning unit 15 calculates a plurality of types of feature amounts, for example.
  • the feature amounts calculated by the machine learning unit 15 are represented by "feature amount 1" and "feature amount 2" in the figure.
  • the machine learning unit 15 derives a relationship that the feature amount 1 and the feature amount 2 satisfy. For example, the machine learning unit 15 derives an area AR2 in which the feature amount 2 plotted against the feature amount 1 lies for each of the plurality of distributions (Ke1 to Ke3). The machine learning unit 15 generates information (for example, a function) representing the area AR2 as an index for extracting a point set from the point cloud data DG, and stores the information representing the area AR2 in the storage unit 8 as the result of machine learning.
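A minimal sketch of this kind of index is given below: the "area AR2" is realized as a rectangular region in feature space spanned by the feature amounts of the designated distributions. The concrete feature choices (occupied volume standing in for feature amount 1, number density for feature amount 2), the margin, and the function names are assumptions, not the patent's method.

```python
import numpy as np

def feature_amounts(points):
    """Feature amount 1: size of the space occupied by the distribution.
       Feature amount 2: number density of the N-dimensional data in it."""
    extent = points.max(axis=0) - points.min(axis=0)
    volume = float(np.prod(extent)) or 1e-12
    return np.array([volume, len(points) / volume])

def learn_area(target_distributions, margin=0.1):
    """Derive a rectangular area AR2 in feature space that contains the
    feature amounts of all distributions designated for extraction."""
    feats = np.array([feature_amounts(p) for p in target_distributions])
    lo, hi = feats.min(axis=0), feats.max(axis=0)
    pad = margin * (hi - lo)
    return lo - pad, hi + pad          # stored as the machine-learning result

def in_area(points, area):
    """True if the feature amounts of a subset lie within the area AR2."""
    lo, hi = area
    f = feature_amounts(points)
    return bool(np.all(f >= lo) and np.all(f <= hi))
```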
  • the processing unit 7 reads out the result of the machine learning by the machine learning unit 15 from the storage unit 8 and executes the extraction process. For example, when the user selects [Input trained data] as [Target] on the GUI screen W shown in FIGS. 10 and 11, the user designates the information representing the area AR2 stored in the storage unit 8. The processing unit 7 reads the information specified by the user as the information representing the area AR2, and executes the extraction process.
  • the processing unit 7 calculates the feature amount 1 and the feature amount 2 for the distribution Kd51 of the N-dimensional data D1 in the subset classified by the clustering unit 9.
  • the classifier 10 determines whether or not the feature amount 2 for the feature amount 1 of the distribution Kd51 exists in the area AR2.
  • the classifier 10 determines that the distribution Kd51 is similar to the target distribution when the feature amount 2 for the feature amount 1 of the distribution Kd51 exists in the area AR2.
  • the classifier 10 extracts (classifies) the distribution Kd51 determined to be similar to the target distribution as a point set.
  • the processing unit 7 calculates the feature amount 1 and the feature amount 2 for the distribution Kd52 of the N-dimensional data D1 in the subset classified by the clustering unit 9.
  • the classifier 10 determines whether or not the feature amount 2 for the feature amount 1 of the distribution Kd52 exists in the area AR2.
  • the classifier 10 determines that the distribution Kd52 is not similar to the target distribution (is dissimilar) when the feature amount 2 for the feature amount 1 of the distribution Kd52 does not exist in the area AR2.
  • the classifier 10 does not extract the distribution Kd52 that is determined not to be similar to the target distribution as a point set.
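Using the hypothetical `learn_area` and `in_area` helpers from the preceding sketch, the decision described above for each clustered subset could look like this; the distribution names and `clustered_subsets` are placeholders, not identifiers from the patent.

```python
# placeholders: Ke1, Ke2, Ke3 are the designated target distributions,
# clustered_subsets is the list of subsets produced by the clustering unit
area_AR2 = learn_area([Ke1, Ke2, Ke3])
extracted = [s for s in clustered_subsets if in_area(s, area_AR2)]
```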
  • FIG. 19 is a diagram illustrating processing by the machine learning unit according to the third embodiment.
  • the machine learning unit 15 performs machine learning based on a distribution (Ke1 to Ke3) selected as a target distribution to be extracted and a distribution (Kf1 to Kf3) selected as a distribution not to be extracted.
  • the machine learning unit 15 derives the area AR2 so that the feature amount 2 for the feature amount 1 of each distribution to be extracted (Ke1 to Ke3) lies within the area AR2, and the feature amount 2 for the feature amount 1 of each distribution not to be extracted (Kf1 to Kf3) does not lie within the area AR2.
  • the machine learning unit 15 may perform machine learning using the distributions (Kf1 to Kf3) selected as distributions not to be extracted, without using the distributions (Ke1 to Ke3) selected as distributions to be extracted.
  • in that case, the machine learning unit 15 derives the area AR2 so that the feature amount 2 for the feature amount 1 of each distribution (Kf1 to Kf3) that is not to be extracted does not lie in the area AR2.
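When both correct-answer teacher data (Ke1 to Ke3) and incorrect-answer teacher data (Kf1 to Kf3) are available, the index could also be learned with an off-the-shelf classifier such as a support vector machine, one of the methods listed above. The sketch below is one possible realization, not the patent's; it reuses the hypothetical `feature_amounts` helper from the earlier sketch.

```python
import numpy as np
from sklearn.svm import SVC

def learn_index(extract_distributions, exclude_distributions):
    """Learn a decision boundary separating distributions to extract (label 1)
    from distributions excluded from extraction (label 0)."""
    X = np.array([feature_amounts(p) for p in extract_distributions] +
                 [feature_amounts(p) for p in exclude_distributions])
    y = np.array([1] * len(extract_distributions) + [0] * len(exclude_distributions))
    return SVC(kernel="linear").fit(X, y)

def classify_subset(model, points):
    """True if the clustered subset is judged similar to the target distribution."""
    return bool(model.predict([feature_amounts(points)])[0] == 1)
```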
  • FIG. 20 is a flowchart illustrating an information processing method according to the third embodiment.
  • In step S21, the machine learning unit 15 performs machine learning based on the distribution specified in step S7 (see FIG. 18A).
  • In step S22, the processing unit 7 extracts a point set based on the result of the machine learning in step S21. At that time, in step S22a, the processing unit 7 calculates the feature amount of the distribution of each subset clustered by the clustering unit 9.
  • In step S22b, the classifier 10 classifies the point set based on the learning result and the feature amount.
  • For example, the classification unit 7A reads the information representing the area AR2 (see FIG. 18A, FIG. 19A, and FIG. 19B) from the storage unit 8 as the learning result, and determines whether or not the subset has the target distribution depending on whether or not the position of the feature amount calculated in step S22a lies within the area AR2.
  • the information processing apparatus 1 may include a surface generation unit that generates a surface representing the shape of the subset.
  • the surface generation unit generates a scalar field based on N-dimensional data included in the point cloud data DG, and generates a contour surface of the scalar field as a surface representing the shape of the subset.
  • the processing unit 7 may extract a part of the point group (point set) from the point group based on the surface generated by the surface generation unit.
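A minimal sketch of such surface generation is shown below: a scalar field is estimated from the points with a Gaussian kernel density estimate, and a contour (iso-) surface of that field is extracted by marching cubes. The grid resolution, the iso-level choice, and the function name are illustrative assumptions, not the patent's specific method.

```python
import numpy as np
from scipy.stats import gaussian_kde
from skimage.measure import marching_cubes

def subset_surface(points, grid=48, iso_quantile=0.5):
    """Build a scalar field from a 3-D point subset and return a mesh
    (vertices, faces) approximating a contour surface of that field."""
    kde = gaussian_kde(points.T)                      # scalar field from the points
    lo, hi = points.min(axis=0), points.max(axis=0)
    axes = [np.linspace(l, h, grid) for l, h in zip(lo, hi)]
    X, Y, Z = np.meshgrid(*axes, indexing="ij")
    field = kde(np.vstack([X.ravel(), Y.ravel(), Z.ravel()])).reshape(X.shape)
    level = np.quantile(field, iso_quantile)          # contour level of the scalar field
    verts, faces, _, _ = marching_cubes(field, level=level)
    scale = (hi - lo) / (grid - 1)                    # voxel indices -> sample coordinates
    return verts * scale + lo, faces
```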
  • FIG. 21 is a diagram illustrating an information processing apparatus according to the fourth embodiment.
  • the information processing apparatus 1 according to the present embodiment includes a calculation unit 17.
  • the computing unit 17 performs computation using a part of the point group (point set) extracted by the processing unit 7.
  • the calculation unit 17 calculates one or both of the surface area and the volume of the shape represented by a part of the point group (point set) extracted by the processing unit 7.
  • the calculation unit 17 applies the distribution of the N-dimensional data in the point set extracted by the processing unit 7 to a function representing an ellipsoid, and calculates the coefficient of this function.
  • the processing unit 7 calculates the major axis and the minor axis of the ellipsoid using the calculated coefficient.
  • the processing unit 7 calculates the surface area by substituting the calculated major axis and minor axis into the ellipsoidal surface area formula.
  • the processing unit 7 calculates the volume by substituting the calculated major axis and minor axis into the ellipsoidal volume formula.
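The ellipsoid-based calculation can be sketched as follows. Instead of the patent's own fitting routine, this sketch derives the semi-axes from the covariance of the point set and uses the exact ellipsoid volume formula together with the Knud Thomsen approximation for the surface area; the scale factor `k` relating standard deviations to semi-axes is a hypothetical choice.

```python
import numpy as np

def ellipsoid_measures(points, k=2.0, p=1.6075):
    """Fit an ellipsoid to a 3-D point set and return (volume, surface_area).
    Semi-axes are taken as k times the standard deviations along the
    principal axes of the point distribution."""
    centered = points - points.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centered.T))   # principal variances
    a, b, c = k * np.sqrt(np.maximum(eigvals, 0.0))    # semi-axes
    volume = 4.0 / 3.0 * np.pi * a * b * c
    # Knud Thomsen approximation of the ellipsoid surface area
    surface = 4.0 * np.pi * (((a * b)**p + (a * c)**p + (b * c)**p) / 3.0) ** (1.0 / p)
    return volume, surface
```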
  • the calculation unit 17 counts (calculates) the number of point sets extracted by the processing unit 7.
  • FIG. 22 is a diagram illustrating a GUI screen output by the output control unit based on the calculation result of the calculation unit according to the fourth embodiment.
  • the GUI screen W in FIG. 22 includes a window W11. In the window W11, a list of point sets is displayed as the calculation result image P5. An identification number ([Target No.] in the figure) is assigned to each point set based on the number of point sets counted by the calculation unit 17. [Volume] in the figure is the value calculated by the calculation unit 17 as the volume of the shape corresponding to a point set.
  • [Surface Area] in the figure is a value calculated by the calculation unit 17 as the surface area of the shape corresponding to the point set.
  • [X], [Y], and [Z] in the figure are coordinates representing a point set (for example, the position of the center of gravity).
  • the calculation result image P5 is displayed on the GUI screen W together with the extracted point cloud image P3, for example.
  • when the input control unit 11 detects that a left click is performed in a state where the pointer P is placed on a point set in the extracted point cloud image P3, the point set Kg on which the pointer P is placed and the calculation result of the calculation unit 17 relating to the point set Kg in the calculation result image P5 are displayed with emphasis.
  • when the input control unit 11 detects that a left click is performed in a state where the pointer P is placed on a row in the calculation result image P5, the calculation result of the calculation unit 17 in the row on which the pointer P is placed and the point set Kg of the extracted point cloud image P3 corresponding to that row may be highlighted.
  • FIG. 23 is a diagram illustrating N-dimensional data according to the embodiment.
  • the point cloud data DG in FIG. 23 is voxel data obtained by CT scan or the like.
  • the voxel data is four-dimensional data in which three-dimensional coordinate values (x, y, z) of each cell Cv and a value (v) given to the cell Cv are combined.
  • the cell Cv1 has a three-dimensional coordinate of (4, 2, 3) and a cell value v of 4.
  • the N-dimensional data D1 corresponding to the cell Cv1 is represented by (4, 2, 3, 4).
  • when the N-dimensional data D1 is processed by the information processing apparatus 1, cells whose value v satisfies a predetermined condition are first extracted (filtered), as shown in FIG. 23B; in FIG. 23B, cells having a value v of 5 or more are extracted. Then, as shown in FIG. 23C, a point is placed at the center position of each remaining cell, so that three-dimensional point cloud data is obtained, and a partial area can be extracted by the information processing apparatus 1 as described in the above embodiments (see the sketch below).
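A minimal sketch of this filtering and conversion, assuming the voxel data is held as an (x, y, z, v) array with integer corner coordinates and unit cells; the function name and threshold parameter are illustrative.

```python
import numpy as np

def voxels_to_point_cloud(voxels, v_min=5):
    """voxels: array of shape (M, 4) holding (x, y, z, v) per cell.
    Keep cells whose value v satisfies the condition (here v >= v_min) and
    place one point at the centre of each remaining cell."""
    kept = voxels[voxels[:, 3] >= v_min]
    return kept[:, :3] + 0.5   # cell centre, assuming unit cells indexed by corner

# example: the cell (4, 2, 3) with value 4 from FIG. 23 is dropped when v_min = 5
cloud = voxels_to_point_cloud(np.array([[4, 2, 3, 4], [1, 1, 1, 7]]))
```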
  • N may be an integer of 4 or more
  • the information processing apparatus 1 may process N-dimensional data without performing the above filtering.
  • the information processing apparatus 1 may represent the value v of the above cell with luminance or color on the GUI screen W.
  • N values included in one N-dimensional data may be expressed by being divided into a plurality of windows. For example, in the case of four-dimensional data, the distribution of two-dimensional data in which two values selected from four values are grouped is displayed in one window, and the distribution of two-dimensional data in which the remaining two values are grouped is represented. It may be displayed in a separate window.
  • the input device 3 is, for example, a mouse.
  • a device other than a mouse (eg, a keyboard) may be used as the input device 3.
  • the user may execute at least a part of the processing of the information processing apparatus 1 by operating a keyboard and inputting a command on the command line.
  • the input information of the user includes information on the key pressed on the keyboard.
  • the information processing apparatus 1 includes, for example, a computer system.
  • the information processing device 1 reads an information processing program stored in the storage unit 8 (storage device) and executes various processes according to the information processing program.
  • This information processing program causes a computer to execute, for example: displaying a point cloud image on a display unit; acquiring input information input by the input unit; extracting a part of the point group from the point group included in the point cloud image based on the input information; and displaying an extracted point cloud image based on the extracted part of the point group on the display unit.
  • the information processing program may be provided by being recorded on a computer-readable storage medium (eg, non-transitory recording medium, non-transitory tangible medium).
  • FIG. 24 is a diagram illustrating a microscope according to the embodiment.
  • the microscope 50 includes a microscope main body 51, the information processing apparatus 1 described in the above embodiment, and a control device 52.
  • the control device 52 includes a control unit 53 that controls each unit of the microscope main body 51 and an image processing unit 54. At least a part of the control device 52 may be provided in the microscope main body 51 (may be incorporated). In addition, the control unit 53 controls the information processing apparatus 1. At least a part of the control device 52 may be provided in the information processing device 1 (may be incorporated).
  • the microscope main body 51 detects a sample.
  • the microscope 50 is, for example, a fluorescence microscope, and the microscope main body 51 detects an image of fluorescence emitted from a sample containing a fluorescent substance.
  • the microscope according to the embodiment is a super-resolution microscope such as STORM or PALM, for example.
  • STORM activates a fluorescent material, and irradiates the activated fluorescent material with excitation light, thereby acquiring a plurality of fluorescent images.
  • Data of a plurality of fluorescent images is input to the image processing unit 54.
  • the image processing unit 54 calculates the position information of the fluorescent substance in each fluorescence image, and generates point cloud data DG using the calculated plurality of position information.
  • the image processing unit 54 generates a point cloud image representing the point cloud data DG.
  • the image processing unit 54 calculates the two-dimensional position information of the fluorescent substance, and generates point group data DG including a plurality of two-dimensional data.
  • the image processing unit 54 calculates three-dimensional position information of the fluorescent material, and generates point group data DG including a plurality of three-dimensional data.
  • FIG. 25 is a diagram showing a microscope main body according to the embodiment.
  • the microscope main body 51 can be used for both fluorescence observation of a sample labeled with one type of fluorescent substance (eg, a reporter dye) and fluorescence observation of a sample labeled with two or more types of fluorescent substances.
  • the microscope main body 51 can generate a three-dimensional super-resolution image.
  • the microscope main body 51 has a mode for generating a two-dimensional super-resolution image and a mode for generating a three-dimensional super-resolution image, and can switch between the two modes.
  • the sample may include living cells (live cells), may include cells fixed using a tissue fixing solution such as a formaldehyde solution, or may be tissue.
  • the fluorescent substance may be a fluorescent dye such as a cyanine dye or a fluorescent protein.
  • the fluorescent dye includes a reporter dye that emits fluorescence when receiving excitation light in an activated state (hereinafter referred to as an activated state).
  • the fluorescent dye may include an activator dye that receives activation light and activates the reporter dye. If the fluorescent dye does not contain an activator dye, the reporter dye receives an activation light and becomes activated.
  • Fluorescent dyes include, for example, dye pairs in which two kinds of cyanine dyes are combined (eg, a Cy3-Cy5 dye pair (Cy3 and Cy5 are registered trademarks), a Cy2-Cy5 dye pair (Cy2 and Cy5 are registered trademarks), a Cy3-Alexa Fluor 647 dye pair (Cy3 and Alexa Fluor are registered trademarks)) and one type of dye (eg, Alexa Fluor 647 (Alexa Fluor is a registered trademark)).
  • the fluorescent protein include PA-GFP and Dronpa.
  • the microscope main body 51 includes a stage 102, a light source device 103, an illumination optical system 104, a first observation optical system 105, an imaging unit 106, an image processing unit 54, and a control device 52.
  • the control device 52 includes a control unit 53 that comprehensively controls each unit of the microscope main body 51.
  • the image processing unit 54 is provided in the control device 52, for example.
  • the stage 102 holds the sample W to be observed.
  • the stage 102 can place the sample W on the upper surface thereof, for example.
  • the stage 102 may have a mechanism for moving the sample W, like an XY stage, or may have no mechanism for moving the sample W, like a desk.
  • the microscope main body 51 may not include the stage 102.
  • the light source device 103 includes an activation light source 110a, an excitation light source 110b, a shutter 111a, and a shutter 111b.
  • the activation light source 110a emits activation light L that activates a part of the fluorescent material contained in the sample W.
  • the fluorescent material contains a reporter dye and does not contain an activator dye.
  • the reporter dye of the fluorescent substance is in an activated state capable of emitting fluorescence when irradiated with the activation light L.
  • the fluorescent substance may include a reporter dye and an activator dye. In this case, the activator dye activates the reporter dye when it receives the activation light L.
  • the fluorescent substance may be a fluorescent protein such as PA-GFP or Dronpa.
  • the excitation light source 110b emits excitation light L1 that excites at least a part of the fluorescent material activated in the sample W.
  • the fluorescent material emits fluorescence or is inactivated when the excitation light L1 is irradiated in the activated state.
  • when the fluorescent material is irradiated with the activation light L in an inactivated state (hereinafter referred to as an inactivated state), the fluorescent material is activated again.
  • the activation light source 110a and the excitation light source 110b include, for example, a solid light source such as a laser light source, and each emits laser light having a wavelength corresponding to the type of fluorescent material.
  • the emission wavelength of the activation light source 110a and the emission wavelength of the excitation light source 110b are selected from, for example, about 405 nm, about 457 nm, about 488 nm, about 532 nm, about 561 nm, about 640 nm, and about 647 nm.
  • the emission wavelength of the activation light source 110a is about 405 nm and the emission wavelength of the excitation light source 110b is a wavelength selected from about 488 nm, about 561 nm, and about 647 nm.
  • the shutter 111a is controlled by the control unit 53, and can switch between a state in which the activation light L from the activation light source 110a passes and a state in which the activation light L is blocked.
  • the shutter 111b is controlled by the control unit 53, and can switch between a state in which the excitation light L1 from the excitation light source 110b passes and a state in which the excitation light L1 is blocked.
  • the light source device 103 includes a mirror 112, a dichroic mirror 113, an acoustooptic device 114, and a lens 115.
  • the mirror 112 is provided on the emission side of the excitation light source 110b, for example.
  • the excitation light L1 from the excitation light source 110b is reflected by the mirror 112 and enters the dichroic mirror 113.
  • the dichroic mirror 113 is provided, for example, on the emission side of the activation light source 110a.
  • the dichroic mirror 113 has a characteristic that the activation light L is transmitted and the excitation light L1 is reflected.
  • the activation light L transmitted through the dichroic mirror 113 and the excitation light L1 reflected by the dichroic mirror 113 enter the acoustooptic device 114 through the same optical path.
  • the acoustooptic element 114 is, for example, an acoustooptic filter.
  • the acoustooptic device 114 is controlled by the control unit 53 and can adjust the light intensity of the activation light L and the light intensity of the excitation light L1.
  • the acoustooptic element 114 is controlled by the control unit 53 and can switch between a state in which the activation light L and the excitation light L1 pass through the acoustooptic element 114 (hereinafter referred to as a light passing state) and a state in which the activation light L and the excitation light L1 are blocked or reduced in intensity by the acoustooptic element 114 (hereinafter referred to as a light shielding state).
  • the control unit 53 controls the acoustooptic device 114 so that the activation light L and the excitation light L1 are irradiated simultaneously. Further, when the fluorescent material includes a reporter dye and an activator dye, the control unit 53 controls the acoustooptic device 114 so as to irradiate the excitation light L1 after the activation light L is irradiated, for example.
  • the lens 115 is, for example, a coupler, and condenses the activation light L and the excitation light L1 from the acoustooptic device 114 on the light guide member 116.
  • the microscope main body 51 may not include at least a part of the light source device 103.
  • the light source device 103 is unitized, and may be provided in the microscope main body 51 so as to be replaceable (attachable or removable).
  • the light source device 103 may be attached to the microscope main body 51 when observation is performed with the microscope 50.
  • the illumination optical system 104 irradiates the sample W with the activation light L, which activates a part of the fluorescent substance contained in the sample W, and with the excitation light L1, which excites at least a part of the activated fluorescent substance.
  • the illumination optical system 104 irradiates the sample W with the activation light L and the excitation light L1 from the light source device 103.
  • the illumination optical system 104 includes a light guide member 116, a lens 117, a lens 118, a filter 119, a dichroic mirror 120, and an objective lens 121.
  • the light guide member 116 is an optical fiber, for example, and guides the activation light L and the excitation light L1 to the lens 117.
  • the lens 117 is a collimator, for example, and converts the activation light L and the excitation light L1 into parallel light.
  • the lens 118 condenses, for example, the activation light L and the excitation light L1 at the position of the pupil plane of the objective lens 121.
  • the filter 119 has a characteristic of transmitting the activation light L and the excitation light L1 and blocking at least a part of light of other wavelengths.
  • the dichroic mirror 120 has a characteristic that the activation light L and the excitation light L1 are reflected, and light (for example, fluorescence) in a predetermined wavelength band out of the light from the sample W is transmitted.
  • the light from the filter 119 is reflected by the dichroic mirror 120 and enters the objective lens 121.
  • the sample W is disposed on the front focal plane of the objective lens 121 during observation.
  • the activation light L and the excitation light L1 are applied to the sample W by the illumination optical system 104 as described above.
  • the illumination optical system 104 described above is an example, and can be changed as appropriate. For example, a part of the illumination optical system 104 described above may be omitted.
  • the illumination optical system 104 may include at least a part of the light source device 103.
  • the illumination optical system 104 may include an aperture stop, an illumination field stop, and the like.
  • the first observation optical system 105 forms an image of light from the sample W.
  • the first observation optical system 105 forms an image of fluorescence from the fluorescent material contained in the sample W.
  • the first observation optical system 105 includes an objective lens 121, a dichroic mirror 120, a filter 124, a lens 125, an optical path switching member 126, a lens 127, and a lens 128.
  • the first observation optical system 105 shares the objective lens 121 and the dichroic mirror 120 with the illumination optical system 104.
  • the optical path between the sample W and the imaging unit 106 is indicated by a solid line.
  • Fluorescence from the sample W enters the filter 124 through the objective lens 121 and the dichroic mirror 120.
  • the filter 124 has a characteristic that light in a predetermined wavelength band out of the light from the sample W selectively passes.
  • the filter 124 blocks, for example, illumination light, external light, stray light, etc. reflected by the sample W.
  • the filter 124 is unitized with, for example, the filter 119 and the dichroic mirror 120, and this filter unit 23 is provided so as to be replaceable.
  • the filter unit 23 is exchanged according to the wavelength of light emitted from the light source device 103 (for example, the wavelength of the activation light L, the wavelength of the excitation light L1), the wavelength of fluorescence emitted from the sample W, and the like.
  • a single filter unit corresponding to a plurality of excitation and fluorescence wavelengths may be used.
  • the light that has passed through the filter 124 enters the optical path switching member 126 through the lens 125.
  • the light emitted from the lens 125 passes through the optical path switching member 126 and then forms an intermediate image on the intermediate image surface 105b.
  • the optical path switching member 126 is a prism, for example, and is provided so as to be able to be inserted into and removed from the optical path of the first observation optical system 105.
  • the optical path switching member 126 is inserted into and removed from the optical path of the first observation optical system 105 by a drive unit (not shown) controlled by the control unit 53, for example.
  • the optical path switching member 126 guides the fluorescence from the sample W to the optical path toward the imaging unit 106 by internal reflection.
  • the lens 127 converts fluorescence emitted from the intermediate image (fluorescence that has passed through the intermediate image surface 105b) into parallel light, and the lens 128 condenses the light that has passed through the lens 127.
  • the first observation optical system 105 includes an astigmatism optical system (for example, a cylindrical lens 129).
  • the cylindrical lens 129 acts on at least part of the fluorescence from the sample W and generates astigmatism with respect to at least part of the fluorescence. That is, an astigmatism optical system such as the cylindrical lens 129 generates astigmatism by generating astigmatism with respect to at least a part of the fluorescence. This astigmatism is used to calculate the position of the fluorescent material in the depth direction of the sample W (the optical axis direction of the objective lens 121).
  • the cylindrical lens 129 is detachably provided in the optical path between the sample W and the imaging unit 106 (for example, the imaging device 140).
  • the cylindrical lens 129 can be inserted into and removed from the optical path between the lens 127 and the lens 128.
  • the cylindrical lens 129 is disposed in this optical path in a mode for generating a three-dimensional super-resolution image, and is retracted from this optical path in a mode for generating a two-dimensional super-resolution image.
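As an illustration of how the astigmatism introduced by the cylindrical lens 129 can be converted into a depth position, the sketch below assumes a linear calibration between the normalized difference of the fitted spot widths along x and y and the axial position. Real systems use a measured calibration curve; the slope value and function name here are hypothetical.

```python
def z_from_astigmatism(width_x, width_y, slope_nm_per_ratio=400.0):
    """Estimate the axial position (nm) of a fluorophore from the widths of its
    astigmatic image along x and y. With the cylindrical lens inserted, the spot
    is elongated along x on one side of focus and along y on the other."""
    ellipticity = (width_x - width_y) / (width_x + width_y)
    return slope_nm_per_ratio * ellipticity
```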
  • the microscope main body 51 includes the second observation optical system 130.
  • the second observation optical system 130 is used for setting an observation range.
  • the second observation optical system 130 includes, in order from the sample W toward the observer's viewpoint Vp, the objective lens 121, the dichroic mirror 120, the filter 124, the lens 125, a mirror 131, a lens 132, a mirror 133, a lens 134, a lens 135, a mirror 136, and a lens 137.
  • the second observation optical system 130 shares the configuration from the objective lens 121 to the lens 125 with the first observation optical system 105.
  • the light from the sample W passes through the lens 125 and then enters the mirror 131 in a state where the optical path switching member 126 is retracted from the optical path of the first observation optical system 105.
  • the light reflected by the mirror 131 is incident on the mirror 133 via the lens 132, is reflected by the mirror 133, and then enters the mirror 136 via the lens 134 and the lens 135.
  • the light reflected by the mirror 136 enters the viewpoint Vp through the lens 137.
  • the second observation optical system 130 forms an intermediate image of the sample W in the optical path between the lens 135 and the lens 137.
  • the lens 137 is an eyepiece, for example, and the observer can set an observation range by observing the intermediate image.
  • the imaging unit 106 captures an image formed by the first observation optical system 105.
  • the imaging unit 106 includes an imaging element 140 and a control unit 141.
  • the image sensor 140 is, for example, a CMOS image sensor, but may be a CCD image sensor or the like.
  • the image sensor 140 has, for example, a structure having a plurality of pixels arranged two-dimensionally and a photoelectric conversion element such as a photodiode disposed in each pixel.
  • the imaging element 140 reads out the electric charge accumulated in the photoelectric conversion element by a reading circuit.
  • the image sensor 140 converts the read electric charges into digital data, and outputs data in a digital format (eg, image data) in which pixel positions and gradation values are associated with each other.
  • the control unit 141 operates the image sensor 140 based on a control signal input from the control unit 53 of the control device 52, and outputs captured image data to the control device 52. Further, the control unit 141 outputs the charge accumulation period and the charge read period to the control device 52.
  • the control device 52 includes a control unit 53 that collectively controls each unit of the microscope main body 51.
  • based on a signal (imaging timing information) indicating the charge accumulation period and the charge read period supplied from the control unit 141, the control unit 53 supplies the acoustooptic device 114 with a control signal for switching between a light transmission state that transmits light from the light source device 103 and a light blocking state that blocks light from the light source device 103.
  • the acoustooptic device 114 switches between a light transmission state and a light shielding state based on this control signal.
  • the control unit 53 controls the acoustooptic device 114 to control a period in which the activation light L is irradiated on the sample W and a period in which the activation light L is not irradiated on the sample W. Further, the control unit 53 controls the acoustooptic device 114 to control a period during which the sample W is irradiated with the excitation light L1 and a period during which the sample W is not irradiated with the excitation light L1. The control unit 53 controls the acoustooptic device 114 to control the light intensity of the activation light L and the light intensity of the excitation light L1 that are irradiated on the sample W.
  • the control unit 141 may supply a signal (information on imaging timing) indicating the charge accumulation period and the charge read period so that the acoustooptic device 114 is controlled to switch between the light shielding state and the light transmission state.
  • the control unit 53 controls the imaging unit 106 to cause the imaging device 140 to perform imaging.
  • the control unit 53 acquires an imaging result (captured image data) from the imaging unit 106.
  • the image processing unit 54 calculates the position information of the fluorescent substance in each fluorescence image by calculating the center of gravity of the fluorescence image shown in the captured image, and uses the calculated plurality of position information to obtain the point cloud data DG. Generate.
  • in the case of a two-dimensional STORM, the image processing unit 54 calculates two-dimensional position information of the fluorescent substance and generates point cloud data DG including a plurality of two-dimensional data.
  • in the case of a three-dimensional STORM, the image processing unit 54 calculates three-dimensional position information of the fluorescent substance and generates point cloud data DG including a plurality of three-dimensional data.
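The centre-of-gravity calculation performed by the image processing unit 54 can be sketched as follows: for each detected fluorescence spot, the intensity-weighted centroid of a small window around the spot gives one position that becomes an entry of the point cloud data DG. This is a simplified illustration; actual STORM/PALM localization commonly fits a point-spread-function model instead, and the window radius and function name are assumptions.

```python
import numpy as np

def centroid_localization(image, spots, radius=3):
    """image: 2-D array of one fluorescence frame.
    spots: list of approximate (row, col) spot positions (eg, local maxima).
    Returns the intensity-weighted centre of gravity of each spot."""
    points = []
    for r0, c0 in spots:
        r_start, c_start = max(r0 - radius, 0), max(c0 - radius, 0)
        win = image[r_start:r0 + radius + 1, c_start:c0 + radius + 1].astype(float)
        rows, cols = np.indices(win.shape)
        total = win.sum()
        if total <= 0:
            continue
        points.append((r_start + (rows * win).sum() / total,
                       c_start + (cols * win).sum() / total))
    return np.array(points)   # each row becomes one entry of the point cloud data DG
```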
  • the image processing unit 54 outputs the point cloud data DG to the information processing apparatus 1 shown in FIG.
  • the information processing apparatus 1 processes point cloud data DG obtained from the detection result of the microscope main body 51.
  • the control device 52 acquires the imaging result (data of the captured image) from the imaging unit 106, outputs the acquired imaging result to the information processing device 1, and the information processing device 1 generates the point cloud data DG.
  • the information processing apparatus 1 calculates the position information of the fluorescent substance in each fluorescence image, and generates the point cloud data DG using the calculated plurality of position information.
  • the information processing apparatus 1 generates a point cloud image representing the point cloud data DG.
  • the information processing apparatus 1 calculates the two-dimensional position information of the fluorescent material, and generates point group data DG including a plurality of two-dimensional data.
  • the information processing apparatus 1 calculates three-dimensional position information of the fluorescent material and generates point group data DG including a plurality of three-dimensional data.
  • the observation method includes detecting a sample, displaying a point cloud image obtained by detecting the sample on the display unit, acquiring input information input by the input unit, and inputting Extracting a part of the point cloud from the point cloud included in the point cloud image based on the information, and causing the display unit to display the extracted point cloud image based on the extracted part of the point cloud.
  • the control device 52 controls the microscope body 51
  • the microscope body 51 detects the sample W by detecting an image of fluorescence emitted from the sample containing the fluorescent material.
  • the control device 52 controls the information processing device 1 and causes the output control unit 12 to output the GUI screen W to the display device 2.
  • control device 52 controls the information processing device 1 and causes the output control unit 12 to output the point cloud image P1 to the GUI screen W.
  • control device 52 controls the information processing device 1 so that the input control unit 11 acquires input information that the user inputs using the GUI screen W.
  • control device 52 controls the information processing device 1 to specify the distribution specified by the input information by the input control unit 11.
  • control device 52 controls the information processing device 1 and extracts a point set from the point cloud data DG including a plurality of N-dimensional data D1 by the processing unit 7 based on the distribution specified by the input control unit 11.
  • control device 52 includes, for example, a computer system.
  • the control device 52 reads the observation program stored in the storage unit (storage device) and executes various processes according to the program.
  • This observation program causes a computer to detect a sample, to display a point cloud image obtained by detecting the sample on a display unit, to acquire input information input by the input unit, and to input information And extracting a part of the point cloud from the point cloud included in the point cloud image and displaying the extracted point cloud image based on the extracted part of the point cloud on the display unit.
  • This observation program may be provided by being recorded on a computer-readable storage medium (eg, non-transitory recording medium, non-transitory tangible medium).
  • control device 52 may be provided in the information processing device 1.
  • the information processing apparatus 1 may be an aspect in which a computer executes various processes according to an information processing program, and at least a part of the control device 52 may be an aspect in which the same computer as the information processing apparatus 1 executes various processes according to an observation program.
  • DESCRIPTION OF SYMBOLS 1 ... Information processing apparatus, 7 ... Processing part, 8 ... Memory

Abstract

The problem addressed by the present invention is to facilitate the handling of point clouds. The solution according to the invention is an information processing device comprising a display control unit for displaying a point cloud image on a display unit, an input information acquisition unit for acquiring input information entered via an input unit, and a processing unit for extracting a part of the point cloud included in the point cloud image on the basis of the input information acquired by the input information acquisition unit. The display control unit displays, on the display unit, an extracted point cloud image based on the part of the point cloud extracted by the processing unit.
PCT/JP2018/020864 2018-05-30 2018-05-30 Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et microscope WO2019229912A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/020864 WO2019229912A1 (fr) 2018-05-30 2018-05-30 Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et microscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/020864 WO2019229912A1 (fr) 2018-05-30 2018-05-30 Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et microscope

Publications (1)

Publication Number Publication Date
WO2019229912A1 true WO2019229912A1 (fr) 2019-12-05

Family

ID=68698049

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/020864 WO2019229912A1 (fr) 2018-05-30 2018-05-30 Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et microscope

Country Status (1)

Country Link
WO (1) WO2019229912A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220284544A1 (en) * 2021-03-02 2022-09-08 Fyusion, Inc. Vehicle undercarriage imaging
US11562474B2 (en) 2020-01-16 2023-01-24 Fyusion, Inc. Mobile multi-camera multi-view capture
WO2023032086A1 (fr) * 2021-09-01 2023-03-09 株式会社Fuji Machine-outil
US11727626B2 (en) 2019-01-22 2023-08-15 Fyusion, Inc. Damage detection from multi-view visual data
US11748907B2 (en) 2019-01-22 2023-09-05 Fyusion, Inc. Object pose estimation in visual data
US11776142B2 (en) 2020-01-16 2023-10-03 Fyusion, Inc. Structuring visual data
US11783443B2 (en) 2019-01-22 2023-10-10 Fyusion, Inc. Extraction of standardized images from a single view or multi-view capture

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014109555A (ja) * 2012-12-04 2014-06-12 Nippon Telegr & Teleph Corp <Ntt> 点群解析処理装置、点群解析処理方法及びプログラム
WO2014155715A1 (fr) * 2013-03-29 2014-10-02 株式会社日立製作所 Dispositif de reconnaissance d'objet, procédé de reconnaissance d'objet et programme
JP2016118502A (ja) * 2014-12-22 2016-06-30 日本電信電話株式会社 点群解析処理装置、方法、及びプログラム
US20170251191A1 (en) * 2016-02-26 2017-08-31 Yale University Systems, methods, and computer-readable media for ultra-high resolution 3d imaging of whole cells

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAWATA, Y. ET AL.: "A GUI visualization system for airborne LiDAR image data to reconstruct 3D city model", PROC. SPIE, vol. 9643, 15 October 2015 (2015-10-15), XP060062245, DOI: 10.1117/12.2193067 *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11727626B2 (en) 2019-01-22 2023-08-15 Fyusion, Inc. Damage detection from multi-view visual data
US11748907B2 (en) 2019-01-22 2023-09-05 Fyusion, Inc. Object pose estimation in visual data
US11783443B2 (en) 2019-01-22 2023-10-10 Fyusion, Inc. Extraction of standardized images from a single view or multi-view capture
US11562474B2 (en) 2020-01-16 2023-01-24 Fyusion, Inc. Mobile multi-camera multi-view capture
US11776142B2 (en) 2020-01-16 2023-10-03 Fyusion, Inc. Structuring visual data
US11972556B2 (en) 2020-01-16 2024-04-30 Fyusion, Inc. Mobile multi-camera multi-view capture
US20220284544A1 (en) * 2021-03-02 2022-09-08 Fyusion, Inc. Vehicle undercarriage imaging
US11605151B2 (en) * 2021-03-02 2023-03-14 Fyusion, Inc. Vehicle undercarriage imaging
US11893707B2 (en) 2021-03-02 2024-02-06 Fyusion, Inc. Vehicle undercarriage imaging
WO2023032086A1 (fr) * 2021-09-01 2023-03-09 株式会社Fuji Machine-outil

Similar Documents

Publication Publication Date Title
WO2019229912A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme de traitement d'informations et microscope
JP6947841B2 (ja) 病理学用の拡張現実顕微鏡
JP6799146B2 (ja) 視覚化されたスライド全域画像分析を提供するためのデジタル病理学システムおよび関連するワークフロー
JP2021515240A (ja) 定量的バイオマーカデータのオーバレイを有する病理学用拡張現実顕微鏡
WO2017150194A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et programme
CN112106107A (zh) 显微镜切片图像的聚焦加权的机器学习分类器误差预测
KR20220012214A (ko) 디지털 병리학을 위한 인공 지능 처리 시스템 및 자동화된 사전-진단 워크플로우
JP7176697B2 (ja) 細胞評価システム及び方法、細胞評価プログラム
US10330912B2 (en) Image processing device and image processing method
JP2009512927A (ja) 画像処理方法
CA3002902C (fr) Systemes et procedes de non-melange d'images presentant des proprietes d'acquisition variables
KR102580984B1 (ko) 화상 처리 방법, 프로그램 및 기록 매체
JP4997255B2 (ja) 細胞画像解析装置
CN108475429A (zh) 三维显微镜图像的分割的系统和方法
US10921252B2 (en) Image processing apparatus and method of operating image processing apparatus
CN108604375B (zh) 用于多维数据的图像分析的系统和方法
Mickler et al. Drop swarm analysis in dispersions with incident-light and transmitted-light illumination
US10184885B2 (en) Information processing device to process spectral information, and information processing method
JP2013109119A (ja) 顕微鏡制御装置およびプログラム
Kromp et al. Deep Learning architectures for generalized immunofluorescence based nuclear image segmentation
JP4271054B2 (ja) 細胞画像解析装置
US7221784B2 (en) Method and arrangement for microscopy
US10690902B2 (en) Image processing device and microscope system
US11428920B2 (en) Information processing device, information processing method, information processing program, and microscope for displaying a plurality of surface images
US20220050996A1 (en) Augmented digital microscopy for lesion analysis

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18920906

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18920906

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP