US20160301879A1 - Information processing apparatus, information processing system, method of processing information, and program - Google Patents

Information processing apparatus, information processing system, method of processing information, and program

Info

Publication number
US20160301879A1
Authority
US
United States
Prior art keywords
information
image
processing apparatus
information processing
annotation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/185,284
Inventor
Hiroshi Kyusojin
Yoichi Mizutani
Yutaka Hasegawa
Masahiro Takahashi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to US15/185,284
Publication of US20160301879A1
Assigned to SONY CORPORATION reassignment SONY CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HASEGAWA, YUTAKA, MIZUTANI, YOICHI, TAKAHASHI, MASAHIRO, KYUSOJIN, HIROSHI

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/40 - ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/222 - Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 - Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 - Mixing
    • G06F19/321
    • G06F19/3425
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 - ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 - Data switching networks
    • H04L12/02 - Details
    • H04L12/16 - Arrangements for providing special services to substations
    • H04L12/18 - Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813 - Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827 - Network arrangements for conference optimisation or adaptation
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M3/00 - Automatic or semi-automatic exchanges
    • H04M3/42 - Systems providing special services or facilities to subscribers
    • H04M3/56 - Arrangements for connecting several subscribers to a common circuit, i.e. affording conference facilities
    • H04M3/567 - Multimedia conference systems
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/14 - Systems for two-way working
    • H04N7/15 - Conference systems

Definitions

  • the present technology relates to an information processing apparatus which may have a pathology image in common with another information processing apparatus to make a user observe the image, an information processing system having the information processing apparatus, a method of processing information in the information processing apparatus, and a program for the information processing apparatus.
  • in Patent Literature (PTL) 1, an information communication service system, in which a conference is expedited while each of the user terminals takes out contents registered in advance if necessary, is disclosed.
  • in this system, a server receives videos and voices transmitted from a plurality of terminals in real time, synthesizes them, and delivers the synthesized data to the terminals.
  • PTL 4 described later discloses that a reproduction apparatus downloads in advance contents, of which the date and time to be provided for viewers is predetermined, from a server in a ciphered state, receives a key from the server at a viewable date and time, and deciphers the contents with the key to reproduce the contents.
  • in the field of pathological diagnosis, terminals of a plurality of users (e.g., doctors) have an image for pathological diagnosis in common with one another and hold a teleconference to make a diagnosis while exchanging opinions among the users' terminals. Therefore, the diagnosis is made efficiently.
  • in a server and client system, a server delivers a plurality of tile images composing a pathology image to clients, and each client synthesizes the pathology image from the tile images and views it.
  • An information processing apparatus includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit the area specifying information and the location information to a first information processing apparatus, the first information processing apparatus being configured to, in response to receiving the area specifying information and the location information, transmit the area specifying information and the location information to a second information processing apparatus, (c) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (d) receive the plurality of partial images from the at least one location, and (e) display the received plurality of partial images.
  • because the information processing apparatus may receive each of the partial images of the pathology image from an arbitrary location, which may be the same as or different from the locations of the other partial images, the time required to prepare a teleconference using the pathology image with another information processing apparatus may be shortened as compared with the case where the locations are concentrated into one point. Further, because the information processing apparatus is not required to determine the display area in the pathology image, the teleconference using the pathology image may be expedited efficiently and smoothly by receiving and displaying the partial images according to the area specifying information.
  • when the partial images are stored in the storage, it is not required that the information processing apparatus access the location. Accordingly, the traffic on the network may be reduced, and the teleconference using the pathology image may be further efficiently expedited.
  • the processor may be configured to be capable of controlling the information processing apparatus to receive, in advance, image specifying information, temporary location information, and diagnosis record information from the other information processing apparatus before receiving the area specifying information and the location information, the image specifying information specifying a pathology image that is capable of being used in a teleconference held with the other information processing apparatus, the temporary location information indicating a location of the specified pathology image at this time, the diagnosis record information indicating a past diagnosis record of the specified pathology image, and to receive a partial image, associated with the received diagnosis record information, among a plurality of partial images composing the pathology image specified by the received image specifying information, from the location indicated by the temporary location information.
  • the processor may be configured to be capable of controlling the storage to store the received partial image.
  • the information processing apparatus may download in advance a specific partial image, which may be used in the teleconference at a high probability, to store the specific partial image in the storage. Accordingly, the traffic during the teleconference may be reduced, and the teleconference using the pathology image may be efficiently expedited.
  • the processor may be configured to be capable of controlling the information processing apparatus to receive display record information, indicating one of a past display area and a display position in the specified pathology image, as the diagnosis record information. Therefore, the information processing apparatus may further efficiently expedite the teleconference using the pathology image by downloading in advance a partial image which may be displayed in the teleconference at a high probability in the same manner as in the past.
  • the processor may be configured to be capable of controlling the information processing apparatus to receive annotation information, affixed to a predetermined position in the specified pathology image, as the diagnosis record information.
  • the information processing apparatus may further efficiently expedite the teleconference by downloading in advance the partial image to which annotation information is affixed and which may attract attention in the teleconference at a high probability in the same manner as in the past.
  • the pathology image may exist for each of a plurality of slices, which are taken out from one biological tissue and are stained with different colors, respectively, to form a plurality of pathology images.
  • the processor may be configured to be capable of controlling the information processing apparatus to receive a partial image existing at a predetermined position, to which the annotation information is affixed, in a first pathology image of a slice stained with a first color, and to receive a partial image existing at the position same as the predetermined position in a second pathology image of a slice stained with a second color.
  • the information processing apparatus may download in advance not only the partial image of the first pathology image, to which the annotation information is affixed, but also the partial image of the second pathology image which is taken from the biological tissue in common with the first pathology image but is of a slice stained in a color differing from that in the first pathology image.
  • Images of a slice taken out from one biological tissue may be taken at a plurality of different resolutions, and the pathology image may exist for each of the plurality of resolutions to form a plurality of pathology images.
  • the processor may be configured to be capable of controlling the information processing apparatus to receive a partial image existing at a predetermined position, to which the annotation information is affixed, in a first pathology image taken at a first resolution, and to receive a partial image existing at the position same as the predetermined position in a second pathology image taken at a second resolution.
  • the information processing apparatus may download in advance not only the partial image of the first pathology image, to which the annotation information is affixed, but also the partial image of the second pathology image which is taken from the biological tissue in common with the first pathology image but of which the resolution differs from the resolution of the first pathology image.
  • the processor may be configured to be capable of controlling the information processing apparatus to receive the area specifying information from the other information processing apparatus via a first server apparatus.
  • the location may indicate a second server apparatus different from the first server apparatus.
  • the pathology image and the area specifying information are managed in different servers, respectively. Accordingly, it may be prevented that the concentration of the load on a specific server disturbs the expedition of the teleconference.
  • An information processing apparatus includes a processor and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images; and (b) transmit, to a second information processing apparatus, the area specifying information and the location information.
  • an information processing apparatus includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive area specifying information and location information from a first information processing apparatus, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location, and (d) display the received plurality of partial images.
  • a method of operating an information processing apparatus includes (a) causing a processor to execute instructions to receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, and (b) causing the processor to execute instructions to transmit, to a second information processing apparatus, the area specifying information and the location information.
  • a system including a first, second, and third information processing apparatus.
  • the first information processing apparatus is configured to, in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images.
  • the second information processing apparatus is configured to receive the area specifying information and the location information from the first information processing apparatus.
  • the third information processing apparatus is configured to: (a) receive the area specifying information and the location information from the second information processing apparatus, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location; and (d) display the received plurality of partial images.
  • the time required to prepare the holding of a teleconference in which data of the pathology image are possessed in common may be shortened, and the teleconference may be expedited efficiently and smoothly.
  • FIG. 1 is a block diagram showing the configuration of a teleconference system according to a first embodiment of the present technology.
  • FIG. 2 is a block diagram showing the configuration of hardware of a PC in the system.
  • FIG. 3 is a diagram explaining a display principle of a pathology image treated in the system.
  • FIG. 4 is a diagram indicating the procedure in the case of producing an image group of the pathology image treated in the system.
  • FIG. 5 is a flowchart showing the processing in a PC being the chairman when a pathology image is displayed in the system.
  • FIG. 6 is a flowchart showing the processing in a PC being an audience when a pathology image is displayed in the system.
  • FIG. 7 is a block diagram showing the configuration of a teleconference system according to a second embodiment of the present technology.
  • FIG. 8 is a diagram showing the specification of tiles, to be downloaded in advance, on the basis of past operation record information in the system according to the second embodiment.
  • FIG. 9 is a diagram showing the specification of tiles, to be downloaded in advance, on the basis of past operation record information in the system according to the second embodiment.
  • FIG. 10 is a diagram showing the specification of tiles, to be downloaded in advance, on the basis of past operation record information in the system according to the second embodiment.
  • FIG. 11 is a diagram showing the specification of a tile, to be downloaded in advance, according to past annotation information in the system according to the second embodiment.
  • FIG. 12 is a diagram showing the production of a pathology slide from a plurality of slices of a biological tissue.
  • FIG. 13 is a flowchart showing one example of the processing in a PC and a server when a pathology image is downloaded in advance in the system according to the second embodiment.
  • FIG. 14 is a flowchart showing another example of the processing in a PC and a server when a pathology image is downloaded in advance in the system according to the second embodiment.
  • FIG. 15 is a flowchart showing the processing in a PC being an audience when a pathology image is displayed in the system.
  • FIG. 16 is a block diagram showing the configuration of a teleconference system according to a third embodiment.
  • FIG. 17 is a block diagram showing the configuration of a teleconference system according to a fourth embodiment.
  • FIG. 18 is a block diagram showing the configuration of a teleconference system according to a fifth embodiment.
  • FIG. 19 is a flowchart showing the processing in a PC according to the fifth embodiment.
  • FIG. 20 is a block diagram showing the configuration of a teleconference system according to a sixth embodiment.
  • FIG. 21 is a block diagram showing the configuration of a teleconference system according to a seventh embodiment.
  • FIG. 1 is a block diagram showing the configuration of a teleconference system using a pathology image according to the first embodiment.
  • in this system, a diagnosis is made efficiently by holding a teleconference in which personal computers (PCs) of a plurality of users (e.g., doctors) have an image for pathological diagnosis (a pathology image) in common with one another and opinions are exchanged among the PCs.
  • this system has a server 200 , a plurality of PCs 100 , and a scanner 300 . These may communicate with one another via the Internet 150 .
  • Data of a pathology image taken by the scanner 300 are uploaded to the server 200 via the Internet 150 and are stored.
  • the pathology image is obtained by taking an image of a slice of a biological tissue or the like held in a glass slide.
  • the pathology image data are not stored only in the server 200 , but may be stored in any of the PCs 100 .
  • one of the plurality of PCs 100 acts as the “chairman” of the teleconference, and the other ones act as “audiences”.
  • a pathology image specified by the PC 100 as the chairman is displayed on the PCs 100 as audiences.
  • the PC 100 acting as the chairman transmits information (area specifying information), specifying a display area in the pathology image for a diagnosis in the teleconference, and information (URL; uniform resource locator), indicating locations of the pathology image data, to the server 200 .
  • the server 200 transmits the area specifying information and the URL to the PCs 100 acting as the audiences. That is, the PC 100 acting as the chairman transmits the area specifying information and the URL to the PCs 100 acting as the audiences via the server 200 .
  • Each PC 100 acting as the audience receives an image (a partial image) of the pathology image existing in the area, specified by the area specifying information, from the URL and displays the partial image. Therefore, the PCs 100 have the pathology image in common with one another, and a pathological diagnosis may be made by the users of the PCs 100 .
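  • As an illustrative sketch only (the patent does not define a message format; the field names, URL layout, and use of HTTP below are assumptions), the area specifying information and location information exchanged above, and the audience-side tile request, could be modeled as follows.

```python
# Hypothetical sketch of the shared-view message and the audience-side
# tile request; names and URL layout are assumptions, not from the patent.
from dataclasses import dataclass
import urllib.request

@dataclass
class SharedView:
    image_id: str       # which pathology image is being discussed
    level: int          # resolution level in the image pyramid
    x: int              # display area origin at that level, in pixels
    y: int
    width: int          # display area size, in pixels
    height: int
    tile_base_url: str  # location (URL) of the tile data

def fetch_tile(view: SharedView, col: int, row: int) -> bytes:
    """Audience side: request one partial image (tile) of the specified
    display area from the location indicated by the received URL."""
    url = f"{view.tile_base_url}/{view.image_id}/{view.level}/{col}_{row}.jpg"
    with urllib.request.urlopen(url) as resp:
        return resp.read()
```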
  • FIG. 2 is a block diagram showing the hardware configuration of one PC 100 .
  • Each PC 100 has a central processing unit (CPU) 11 , a read only memory (ROM) 12 , a random access memory (RAM) 13 , an input/output interface 15 , and a bus 14 connecting these ones with one another.
  • the input/output interface 15 is connected with a display 16 , an input section 17 , storage 18 , a communicating section 19 , a driver 20 , and the like.
  • the display 16 is, for example, a display device using liquid crystal, electro-luminescence (EL) or the like.
  • the input section 17 is, for example, a pointing device, a keyboard, a touch panel, a microphone, or another operating device.
  • when the input section 17 includes a touch panel, the touch panel may be integrally formed with the display 16.
  • the storage 18 is a nonvolatile storage device, for example, being a hard disk drive (HDD), a flash memory, or another solid memory.
  • in the storage 18, an application program to be executed to receive and display the pathology image data in this system is stored.
  • the driver 20 is, for example, a device capable of driving a removable recording medium 21 such as an optical recording medium, a floppy (registered trademark) disk, a magnetic recording tape, or a flash memory.
  • the storage 18 is often used as a device which is mounted in the PC 100 in advance and drives a non-removable recording medium.
  • the communicating section 19 is a modem, a router, or another communicating device which communicates with another device and may be connected to a local area network (LAN), a wide area network (WAN), or the like.
  • the communicating section 19 may use any of wire communication and wireless communication.
  • the communicating section 19 is often used while being formed independent of the PC 100 .
  • the hardware configuration of the server 200 is also the same as the hardware configuration of the PC 100 and has blocks such as a control section, storage, and a communicating section being necessary to act as a computer.
  • FIG. 3 is a diagram showing an image pyramid structure for explaining the display principle.
  • An image pyramid structure 50 is an image group (a total image group) of which images are produced from a single pathology image, obtained from a single observation object 40 (see FIG. 4 ) by an optical microscope, at a plurality of different resolutions respectively.
  • the image having the largest size is located in the lowest layer of the image pyramid structure 50 , while the image having the smallest size is located in the highest layer.
  • the resolution of the largest sized image is, for example, 50 ⁇ 50 kilopixels or 30 ⁇ 40 kilopixels.
  • the resolution of the smallest sized images is, for example, 256 ⁇ 256 pixels or 256 ⁇ 512 pixels.
  • when the same display 16 displays these images, for example, at 100% size (each image is displayed at the number of physical dots equal to the number of pixels in the image), the image having the largest size is displayed in the largest size, and the image having the smallest size is displayed in the smallest size.
  • the display area of the display 16 is indicated by D.
  • FIG. 4 is a diagram for explaining the procedure in the case of producing the image group of the image pyramid structure 50 .
  • a digital image of an original image (a huge image) obtained at a predetermined observation magnification by an optical microscope (not shown) is prepared.
  • This original image is equivalent to the largest sized image being the lowest image of the image pyramid structure 50 shown in FIG. 3 , and is an image having the highest resolution. Therefore, as the lowest image of the image pyramid structure 50 , an image obtained by observing at a comparatively high resolution by the optical microscope is used.
  • a slice thinly cut off from a living internal organ, a biological tissue, a cell, or a part of any one of these is an observation object 40 .
  • the observation object 40 held in a glass slide is read out by the scanner 300 having the function of an optical microscope, and an obtained digital image is stored in the scanner 300 or other storage.
  • this scanner 300 or a generally-used computer produces a plurality of images whose resolutions are lowered step by step from the largest sized image obtained as described above, and stores these images in units of "tiles" (partial images) of a predetermined size.
  • the size of one tile is, for example, 256 ⁇ 256 pixels.
  • to each tile, identification information (an ID or a number identifying the tile) is added.
  • the image group produced as described above forms the image pyramid structure 50 , and this image pyramid structure 50 is stored in the storage 18 of the PC 100 or storage of the server 200 .
  • the PC 100 or the server 200 may store the images having a plurality of different resolutions and pieces of information of the resolutions while the images are associated with the pieces of information respectively.
  • the PC 100 may perform the production and storage of the image pyramid structure 50 .
  • the total image group forming this image pyramid structure 50 may be produced by a known compression method, and may be, for example, produced by a known compression method for producing a thumbnail image.
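  • The production of the image group can be sketched as follows (an outline only, using Pillow and a simple halving chain; the actual scanner software, scaling factors, and file layout are not specified by the patent).

```python
# Sketch: build lower-resolution levels by repeated halving and cut every
# level into 256 x 256 tiles. The halving and the (level, col, row)
# addressing are assumptions for illustration.
from PIL import Image

TILE = 256

def build_pyramid(original: Image.Image, levels: int):
    """Yield (level, col, row, tile); level 0 is the largest (original)
    image, higher levels are progressively smaller."""
    img = original
    for level in range(levels):
        w, h = img.size
        for row in range(0, h, TILE):
            for col in range(0, w, TILE):
                tile = img.crop((col, row, min(col + TILE, w), min(row + TILE, h)))
                yield level, col // TILE, row // TILE, tile
        # Next, a lower-resolution level, as in a thumbnail chain.
        img = img.resize((max(1, w // 2), max(1, h // 2)))
```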
  • the PC 100 extracts a desired image from the image pyramid structure 50 in response to the operation of the user input from the input section 17 and displays this image on the display 16 .
  • the PC 100 displays an image of an arbitrary portion, selected by the user, from among images having an arbitrary resolution selected by the user.
  • the user may obtain a feeling that the user actually observes the observation object 40 . That is, in this case, the PC 100 acts as a virtual microscope, and a virtual observation magnification is practically equivalent to the resolution described above.
  • FIG. 5 is a flowchart showing the processing in the PC 100 being the chairman when a pathology image is displayed according to this embodiment
  • FIG. 6 is a flowchart showing the processing in the PC 100 being one audience in this case.
  • the CPU 11 of the PC 100 being the chairman waits for an operation input of the user which specifies a display area in a specific pathology image (step 51 ) and specifies the display area when receiving this operation input (step 52 ).
  • the CPU 11 specifies tiles of the pathology image used for the display process, that is, tiles (partial images) of which all or a part is included in the display area (step 53 ).
  • the CPU 11 transmits a request for data of the specified tiles to the server 200 (step 54 ).
  • the CPU 11 transmits display area information indicating the display area and a URL of the specified tile data to the server 200 (step 55 ).
  • the CPU 11 receives the tile data transmitted from the server 200 in response to the request (step 56 ) and displays the tile data on the display (step 57 ).
  • the CPU 11 of the PC 100 being each audience receives the display area information and the URL transmitted from the PC 100 being the chairman via the server 200 (step 61 ).
  • the CPU 11 specifies tiles of the pathology image required for the display process of the display area on the basis of the display area information (step 62 ).
  • the CPU 11 transmits the request for the specified tiles to the server 200 on the basis of the received URL (step 63 ).
  • the CPU 11 receives tiles transmitted in response to the request (step 64 ), and displays the tiles on the display 16 (step 65 ).
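  • Step 53 and step 62 both come down to the same computation: finding every tile of which all or a part is included in the display area. A minimal sketch of that computation, assuming 256 x 256 tiles addressed by column and row indices, is shown below.

```python
TILE = 256  # tile size in pixels, as described above

def tiles_in_display_area(x: int, y: int, width: int, height: int):
    """Return (col, row) indices of every tile of which all or a part is
    included in the display area (steps 53 and 62)."""
    first_col, last_col = x // TILE, (x + width - 1) // TILE
    first_row, last_row = y // TILE, (y + height - 1) // TILE
    return [(col, row)
            for row in range(first_row, last_row + 1)
            for col in range(first_col, last_col + 1)]

# Example: a 900 x 600 display area whose top-left corner is at (1000, 2000)
# covers the tiles in columns 3 to 7 and rows 7 to 10.
print(tiles_in_display_area(1000, 2000, 900, 600))
```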
  • FIG. 7 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • the pathology image is stored in the server 200 , and is downloaded to the PC 100 on the day on which the teleconference is held.
  • the PC 100 may also download the pathology image to the storage 18 before the conference is held. This downloading may be performed in units of tiles rather than in units of the whole pathology image.
  • each tile being used in the teleconference at a high probability may be downloaded.
  • the tiles being used at a high probability are, for example, the tiles used for a pathological diagnosis in the past.
  • the tiles used for a past pathological diagnosis are, for example, specified according to the following criteria.
  • the PC 100 specifies tiles used for a past diagnosis on the basis of a display operation record indicating at which magnification (resolution) a doctor making a diagnosis in the past viewed the pathology image and how the doctor moved the coordinates or the display area.
  • FIG. 8 , FIG. 9 , and FIG. 10 are diagrams showing examples of tiles specified on the basis of this operation record information.
  • each tile 70 of which all or a part is included in the recorded rectangle (the past display area) is specified.
  • when an annotation is affixed to the pathology image, the PC 100 specifies the tile to which the annotation is affixed.
  • the annotation denotes information which is, for example, formed in a style of a symbol, a diagram, a text, a voice, an image, a link (e.g., URL), or the like on the basis of user's input on the PC displaying the pathology image.
  • in FIG. 11, among the tiles composing the pathology image I, the tile 70 to which an annotation 73 indicated by a symbol X is affixed is specified.
  • when an annotation indicated by a line, a rectangle, or a circle is affixed, tiles which the line goes across, or tiles included in the rectangle or the circle, are specified in the same manner as in FIG. 8 to FIG. 10.
  • FIG. 12 is a diagram showing the preparation of slides (pathology slides) in which slices are held to be photographed for pathology images.
  • a pathology slide 80 is prepared by cutting off the observation object (a slice) 40 at a thickness of two to three micrometers from a pathology piece (a biological tissue) S taken out by a surgery or the like, mounting the slice 40 on a slide glass 81, and covering the slice 40 with a cover glass 82.
  • slices successively cut off have approximately the same tissue shape and the same medical features of cells and the like. That is, as shown in this figure, diagnosis information affixed to a pathology slide 80 A of a slice 40 A is also useful for an adjacent pathology slide 80 B of a slice 40 B.
  • the different slices 40 A and 40 B are stained with different colors to perform different inspections respectively. Accordingly, when a plurality of pathology images relating to the successive slices stained with different colors exist, the PC 100 specifies a tile of one pathology image to which an annotation is affixed, and specifies a tile of another pathology image existing at the same coordinates as those of the position at which the annotation is affixed.
  • the user may see one pathology image while changing the observation magnification (the resolution). Accordingly, when an annotation is affixed to a certain pathology image, this annotation is also useful information for another pathology image of the same observation object whose magnification (resolution) differs from that of the certain pathology image. In this case, the PC 100 specifies the tile of the certain pathology image to which the annotation is affixed, and also specifies the tile, which includes the same coordinates as those at which the annotation is affixed, in the other pathology image of the same observation object having a different resolution.
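  • A sketch of this pre-download selection is shown below. The coordinate handling is an assumption: an annotation is taken to be recorded at pixel coordinates in one pathology image, and the corresponding tile in another image of the same observation object (a differently stained slice, or a different resolution) is found by scaling those coordinates.

```python
TILE = 256

def annotated_tile(ax: int, ay: int) -> tuple:
    """Tile (col, row) containing an annotation at pixel (ax, ay)."""
    return ax // TILE, ay // TILE

def corresponding_tile(ax: int, ay: int, scale: float) -> tuple:
    """Tile at the same position in another pathology image of the same
    observation object. `scale` is the resolution ratio between the two
    images (1.0 for a differently stained slice at the same resolution,
    e.g. 0.5 for a half-resolution image)."""
    return int(ax * scale) // TILE, int(ay * scale) // TILE

# An annotation at (5120, 3000) lies in tile (20, 11); in a half-resolution
# image of the same tissue, it corresponds to tile (10, 5).
print(annotated_tile(5120, 3000), corresponding_tile(5120, 3000, 0.5))
```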
  • FIG. 13 is a flowchart showing the processing in the PC 100 and the server 200 in the download of a pathology image in advance.
  • pathology image data to be used for a teleconference and a list of participants of the teleconference are transmitted from the PC 100 of the promoter of the teleconference to the server 200 and are registered (step 131 ).
  • This process is, for example, performed by selecting a desired pathology image and users from data of all pathology images and a list of all users existing in the server 200 .
  • the server 200 transmits a notice of a conference holding to the PC 100 of each participant (step 132 ).
  • the PC 100 of each participant determines whether or not the storage 18, such as an HDD, having enough capacity to download a pathology image in advance exists in the PC 100 (step 133).
  • the PC 100 of the participant receives a list of pathology image data to be used in the teleconference from the server 200 (step 134 ). In this case, past diagnosis record information relating to these pathology image data and the URL of the pathology image data (tiles) at this time are also received.
  • the PC 100 specifies tiles according to the criterion described above on the basis of the received list of pathology image data and the received diagnosis record information, and downloads the specified tiles from the server 200 (step 135 ).
  • the PC 100 of the participant may participate in the conference by being connected to the server 200 (step 136 ).
  • the list of data to be used in the conference is determined.
  • the list is updated by deleting unnecessary data by the promoter, or by adding, by another participant, data that this participant wishes to use.
  • at step 141 to step 145, the PC 100 performs the same processes as those at step 131 to step 135 of FIG. 13. The PC 100 then determines whether or not the start time of the teleconference has come (step 146).
  • the PC 100 accesses the server 200 and determines whether or not the list of pathology image data to be used has been updated (step 147 ).
  • when the list has been updated, the PC 100 receives a new list from the server 200 and repeats the processes at step 144 and the following steps.
  • the PC 100 may receive a new list in response to a notice that indicates the update of the list and is sent from the server 200 to the PC 100 of each participant.
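  • The check in FIG. 14 can be sketched as a simple polling loop (the client interface below is hypothetical; as noted above, the PC 100 may instead receive a push notice from the server when the list is updated).

```python
import time
from datetime import datetime

def wait_and_refresh(client, conference_start: datetime, poll_seconds: int = 60):
    """Until the start time of the teleconference comes (step 146), ask the
    server whether the list of pathology image data has been updated
    (step 147); when it has, receive the new list and download the newly
    specified tiles again (steps 144 and 145)."""
    while datetime.now() < conference_start:
        if client.list_updated():                 # hypothetical server query
            new_list = client.fetch_list()
            client.download_specified_tiles(new_list)
        time.sleep(poll_seconds)
```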
  • the destination to which tiles are downloaded in advance is not limited to the storage 18 of the PC 100 .
  • the downloading to a neighboring storage device connected with the PC 100 via a network is possible.
  • the downloading to the server 200 closest to the PC 100 is allowed.
  • FIG. 15 is a flowchart showing the processing in the PC 100 being an audience when a pathology image is displayed in the system according to this embodiment.
  • the CPU 11 of the PC 100 determines whether or not specified tiles are stored in the storage 18 (whether or not specified tiles have been downloaded in advance) (step 153 ).
  • the CPU 11 transmits a request for the tiles to the server 200 (step 154 ), and receives the tiles (step 156 ).
  • the CPU 11 reads out the tiles from the storage 18 (step 155 ).
  • the CPU 11 displays the tiles, received or read out, on the display 16 (step 157 ).
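  • The branch in steps 153 to 157 is a cache-style lookup against the storage 18; a minimal sketch (the local file layout and server URL are assumptions) follows.

```python
import os
import urllib.request

def get_tile(storage_dir: str, server_url: str, tile_name: str) -> bytes:
    """Return the tile from local storage if it was downloaded in advance
    (steps 153 and 155); otherwise request and receive it from the server
    (steps 154 and 156)."""
    local_path = os.path.join(storage_dir, tile_name)
    if os.path.exists(local_path):              # step 153: stored in storage 18?
        with open(local_path, "rb") as f:       # step 155: read out, no request sent
            return f.read()
    url = f"{server_url}/{tile_name}"           # step 154: request to the server 200
    with urllib.request.urlopen(url) as resp:   # step 156: receive the tile
        return resp.read()
```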
  • FIG. 16 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • both the display area information and the tile data of the pathology image are stored in the server 200 .
  • a tile data server 200 A for storing the tile data and a control data server 200 B for processing the display area information are separately located.
  • in the control data server 200 B, the past diagnosis record information described in the second embodiment, the list of tile data, the list of participants, and the like are managed (hereinafter, these data are collectively called control data).
  • therefore, each server may be optimized for the data it handles: the control data, which have a small data capacity but need a rapid response, and the tile data, which have a large data capacity and need a high throughput.
  • control data may be exchanged in P2P communication among a plurality of PCs 100.
  • FIG. 17 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • a teleconference is held among a plurality of hospitals by connecting in-hospital networks 170 with one another.
  • when pathology image data to be used in the conference are gathered to a single point, the cost for uploading and downloading is increased.
  • pathology image data to be used in the teleconference are originally obtained by taking an image by a scanner 300 of each hospital and are retained in the in-hospital servers 200 A. Therefore, as shown in this figure, in this embodiment, the in-hospital server 200 A of each hospital is used in place of the tile data server 200 A in the third embodiment. In this case, the control data server 200 B also exists on the Internet 150 .
  • FIG. 18 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • the in-hospital server delivers the tile data to the participants of a teleconference. Therefore, depending on the network bandwidth and the number of PCs 100, the load on the server may become a problem.
  • tile data to be downloaded are uploaded to a cloud server 200 C in advance. Accordingly, the load on the in-hospital server 200 D is reduced.
  • FIG. 19 is a flowchart showing the display process in the PC 100 according to this embodiment.
  • the CPU 11 of the PC 100 transmits a request for the specified tiles to the cloud server 200 C (step 193 ).
  • the CPU 11 determines whether or not the tiles to be requested are successfully received (step 194 ).
  • when failing in the reception of the tiles (NO), the CPU 11 transmits a request for the tiles to the in-hospital server 200 D (step 196), receives the tiles, and displays the received tiles (step 197).
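  • A sketch of this fallback (the URLs and error handling are assumptions) is shown below.

```python
import urllib.error
import urllib.request

def fetch_tile_with_fallback(cloud_url: str, hospital_url: str, tile_name: str) -> bytes:
    """Request the tile from the cloud server 200C first (step 193); when
    that fails (step 194, NO), request it from the in-hospital server 200D
    instead (step 196)."""
    try:
        with urllib.request.urlopen(f"{cloud_url}/{tile_name}") as resp:
            return resp.read()
    except urllib.error.URLError:
        with urllib.request.urlopen(f"{hospital_url}/{tile_name}") as resp:
            return resp.read()
```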
  • FIG. 20 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • in this embodiment, a plurality of tile servers for delivering the same tile data are located (tile servers 200 A- 1 and 200 A- 2).
  • the tile server 200 A- 1 is set as a primary server
  • the tile server 200 A- 2 is set as a secondary server.
  • the system may cope with the increase in the number of PCs 100 .
  • to determine which server each client uses, any of the following server determining algorithms may be used; a sketch of two of them follows this list.
  • a server is statically allocated to each client in advance.
  • the client (the PC 100 ) selects the server having the smallest RTT (Round Trip Time).
  • the client (the PC 100 ) selects the server having the widest bandwidth.
  • the primary server determines the allocation while considering the total load.
  • round-robin processing is performed.
  • the load information of the server is periodically (e.g., every 10 seconds) obtained.
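  • Two of the listed strategies, selecting the server with the smallest RTT and round-robin, could look as follows (the measurement method and the server list are assumptions).

```python
import itertools
import time
import urllib.request

def smallest_rtt_server(server_urls: list) -> str:
    """Pick the tile server with the smallest measured round trip time."""
    def rtt(url: str) -> float:
        start = time.monotonic()
        try:
            urllib.request.urlopen(url, timeout=2).close()
        except OSError:
            return float("inf")   # an unreachable server is never chosen
        return time.monotonic() - start
    return min(server_urls, key=rtt)

def round_robin(server_urls: list):
    """Cycle through the servers for successive tile requests."""
    return itertools.cycle(server_urls)
```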
  • FIG. 21 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • tile data for which a diagnosis has been finished are moved from the in-hospital network 170 to the cloud server 200 C via the Internet 150.
  • the tile data moved to the cloud server 200 C are returned to the in-hospital server 200 D at the time of the reservation.
  • the configurations of the first to seventh embodiments may be combined with one another in any manner if the combined embodiments are not contradictory to one another.
  • the download process in advance in the second embodiment may be embodied in the configuration that the server is separately divided into the tile data server and the control data server as described in the third embodiment.
  • an information processing apparatus includes a processor and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images; and (b) transmit, to a second information processing apparatus, the area specifying information and the location information.
  • the image includes data representative of a slice of a biological tissue.
  • the information processing apparatus includes a server, the first information processing apparatus includes a first personal computer, and the second information processing apparatus includes a second personal computer.
  • the plurality of partial images includes (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • the area specifying information is determined based on a display record.
  • the area specifying information is determined based on annotation information.
  • an information processing apparatus includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit the area specifying information and the location information to a first information processing apparatus, the first information processing apparatus being configured to, in response to receiving the area specifying information and the location information, transmit the area specifying information and the location information to a second information processing apparatus, (c) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (d) receive the plurality of partial images from the at least one location; and (e) display the received plurality of partial images.
  • the image includes data representative of a slice of a biological tissue.
  • the information processing apparatus includes a first personal computer, (b) the first information processing apparatus includes a server; and (c) the second information processing apparatus includes a second personal computer.
  • the plurality of partial images includes: (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • the instructions, when executed by the processor, cause the processor to: (a) receive the first partial image from the first location, and (b) receive the second partial image from the second location.
  • the instructions, when executed by the processor, cause the processor to: (a) determine whether the first partial image is stored by the memory device, (b) in response to a determination that the first partial image is stored by the memory device, read out the first partial image from the memory device, and (c) for the first partial image, not transmit a request to the at least one location indicated by the location information.
  • the instructions, when executed by the processor, cause the processor to determine the area specifying information based on a display record.
  • the instructions, when executed by the processor, cause the processor to determine the area specifying information based on annotation information.
  • an information processing apparatus includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive area specifying information and location information from a first information processing apparatus, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location, and (d) display the received plurality of partial images.
  • the image includes data representative of a slice of a biological tissue.
  • the information processing apparatus includes a personal computer, and (b) the first information processing apparatus includes a server.
  • the plurality of partial images includes: (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • the instructions, when executed by the processor, cause the processor to: (a) receive the first partial image from the first location, and (b) receive the second partial image from the second location.
  • the instructions, when executed by the processor, cause the processor to: (a) determine whether the first partial image is stored by the memory device, (b) in response to a determination that the first partial image is stored by the memory device, read out the first partial image from the memory device, and (c) for the first partial image, not transmit a request to the at least one location indicated by the location information.
  • the area specifying information is determined based on a display record.
  • the area specifying information is determined based on annotation information.
  • a method of operating an information processing apparatus includes (a) causing a processor to execute instructions to receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, and (b) causing the processor to execute instructions to transmit, to a second information processing apparatus, the area specifying information and the location information.
  • the image includes data representative of a slice of a biological tissue.
  • the information processing apparatus includes a server, the first information processing apparatus includes a first personal computer, and the second information processing apparatus includes a second personal computer.
  • the plurality of partial images includes: (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • the area specifying information is determined based on a display record.
  • the area specifying information is determined based on annotation information.
  • in another embodiment, a system includes first, second, and third information processing apparatuses.
  • the first information processing apparatus is configured to, in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images.
  • the second information processing apparatus is configured to receive the area specifying information and the location information from the first information processing apparatus.
  • the third information processing apparatus is configured to: (a) receive the area specifying information and the location information from the second information processing apparatus, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location, and (d) display the received plurality of partial images.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Epidemiology (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

An information processing apparatus, an information processing system, and a method of processing information are provided. In one embodiment, the information processing apparatus includes a processor, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images. The instructions further cause the processor to transmit, to a second information processing apparatus, the area specifying information and the location information.

Description

    CROSS REFERENCES TO RELATED APPLICATIONS
  • The present application is a continuation of U.S. application Ser. No. 13/880,267 filed Apr. 18, 2013, which is a national stage of International Application No. PCT/JP2012/004448 filed on Oct. 7, 2012 and claims priority to Japanese Patent Application No. 2011-180438 filed on Aug. 22, 2011, the disclosure of which is incorporated herein by reference.
  • BACKGROUND
  • The present technology relates to an information processing apparatus which may have a pathology image in common with another information processing apparatus to make a user observe the image, an information processing system having the information processing apparatus, a method of processing information in the information processing apparatus, and a program for the information processing apparatus.
  • In the past, to support a teleconference among a plurality of users, a system (e.g., a television conference system) in which terminals of the users have a screen in common with one another as a common information resource has been known.
  • For example, in Patent Literature (PTL) 1 described later, an information communication service system, in which a conference is expedited while each of the user terminals takes out contents registered in advance if necessary, is disclosed. In this system, a server receives videos and voices transmitted from a plurality of terminals in real time, synthesizes them, and delivers the synthesized data to the terminals.
  • Further, in PTL 2 and PTL 3 described later, a content distributing system, in which contents are copied and dispersed to a plurality of servers to be stored, is described.
  • Moreover, PTL 4 described later discloses that a reproduction apparatus downloads in advance contents, of which the date and time to be provided for viewers is predetermined, from a server in a ciphered state, receives a key from the server at a viewable date and time, and deciphers the contents with the key to reproduce the contents.
  • Furthermore, in the field of pathological diagnosis, terminals of a plurality of users (e.g., doctors) have an image for pathological diagnosis in common with one another, and hold a teleconference to make a diagnosis while exchanging opinions among the users' terminals. Therefore, the diagnosis is made efficiently.
  • In relation to this, in PTL 5 described later, a server and client system is disclosed. In this system, a server delivers a plurality of tile images composing a pathology image to clients, and each client synthesizes the pathology image from the tile images and views the pathology image.
  • CITATION LIST Patent Literature
  • [PTL 1]
  • Japanese Patent No. 3795772
  • [PTL 2]
  • Japanese Patent Application Laid-open No. 2003-115873
  • [PTL 3]
  • Japanese Patent Application Laid-open No. 2003-085070
  • [PTL 4]
  • Japanese Patent Application Laid-open No. 2010-119142
  • [PTL 5]
  • U.S. Pat. No. 7,542,596
  • SUMMARY
  • However, in the case of constructing a system in which a pathology image is possessed in common, as described in PTL 1, in a tree type system that concentrates delivered data into a single point, data of a pathology image photographed at each of the terminals (hospitals) substantially have to be gathered to a cloud computing server, so it is time-consuming to upload the data. Further, when contents are concentrated into a single point, the contents act as a bottleneck, and scalability in simultaneously holding a plurality of conferences or the like may not be exhibited.
  • In a system in which normal contents are possessed in common, the problem in scalability may be solved by the techniques described in PTL 2 and PTL 3. However, because data of a pathology image having a very high resolution have a large volume, it is time-consuming, in the same manner as the uploading, to copy the data among a plurality of servers.
  • Further, in the technique described in PTL 4, the load on the server is decreased by reducing the data communication performed during the conference. However, when the communication uses pathology image data, it is very time-consuming to download the data in advance. Moreover, because only a small portion of the very large volume of pathology image data is used for the pathological diagnosis, most of the downloaded data are not used. Therefore, the efficiency of downloading in advance is very low in view of effective use of the network band.
  • In view of the circumstances as described above, it is desirable to provide an information processing apparatus, an information processing system, and a method of processing information that are capable of shortening the time, taken to prepare the holding of a teleconference in which data of a pathology image are possessed in common, and expediting the teleconference efficiently and smoothly.
  • An information processing apparatus according to an embodiment of the present technology includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit the area specifying information and the location information to a first information processing apparatus, the first information processing apparatus being configured to, in response to receiving the area specifying information and the location information, transmit the area specifying information and the location information to a second information processing apparatus, (c) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (d) receive the plurality of partial images from the at least one location, and (e) display the received plurality of partial images.
  • Therefore, because the information processing apparatus may receive each of the partial images of the pathology image from an arbitrary location being the same or different from locations of the other partial images, as compared with the case where the locations are concentrated into one point, the time required to prepare a teleconference using the pathology image with another information processing apparatus may be shortened. Further, because the information processing apparatus is not required to determine the display area in the pathology image, the teleconference using the pathology image may be expedited efficiently and smoothly by receiving the partial image according to the area specifying information and by displaying the partial image.
  • The information processing apparatus may further have storage. In this case, the processor may be configured to determine whether or not the partial images included in the display area are stored in the storage and, when the partial images are stored in the storage, to control the display to display the partial images without transmitting the request to the location.
  • Therefore, when the partial images are stored in the storage, it is not required that the information processing apparatus access the location. Accordingly, the traffic on the network may be reduced, and the teleconference using the pathology image may be further efficiently expedited.
  • The processor may be configured to be capable of controlling the information processing apparatus to receive, in advance, image specifying information, temporary location information, and diagnosis record information from the other information processing apparatus before receiving the area specifying information and the location information, the image specifying information specifying a pathology image that is capable of being used in a teleconference held with the other information processing apparatus, the temporary location information indicating a location of the specified pathology image at this time, the diagnosis record information indicating a past diagnosis record of the specified pathology image, and to receive a partial image, associated with the received diagnosis record information, among a plurality of partial images composing the pathology image specified by the received image specifying information, from the location indicated by the temporary location information. The processor may be configured to be capable of controlling the storage to store the received partial image.
  • Therefore, the information processing apparatus may download in advance a specific partial image, which may be used in the teleconference at a high probability, to store the specific partial image in the storage. Accordingly, the traffic during the teleconference may be reduced, and the teleconference using the pathology image may be efficiently expedited.
  • The processor may be configured to be capable of controlling the information processing apparatus to receive display record information, indicating one of a past display area and a display position in the specified pathology image, as the diagnosis record information. Therefore, the information processing apparatus may further efficiently expedite the teleconference using the pathology image by downloading in advance a partial image which may be displayed in the teleconference at a high probability in the same manner as in the past.
  • The processor may be configured to be capable of controlling the information processing apparatus to receive annotation information, affixed to a predetermined position in the specified pathology image, as the diagnosis record information.
  • Therefore, the information processing apparatus may further efficiently expedite the teleconference by downloading in advance the partial image to which annotation information is affixed and which may attract attention in the teleconference at a high probability in the same manner as in the past.
  • The pathology image may exist for each of a plurality of slices, which are taken out from one biological tissue and are stained with different colors, respectively, to form a plurality of pathology images. In this case, the processor may be configured to be capable of controlling the information processing apparatus to receive a partial image existing at a predetermined position, to which the annotation information is affixed, in a first pathology image of a slice stained with a first color, and to receive a partial image existing at the position same as the predetermined position in a second pathology image of a slice stained with a second color.
  • Therefore, the information processing apparatus may download in advance not only the partial image of the first pathology image, to which the annotation information is affixed, but also the partial image of the second pathology image which is taken from the biological tissue in common with the first pathology image but is of a slice stained in a color differing from that in the first pathology image.
  • Images of a slice taken out from one biological tissue may be taken at a plurality of different resolutions, and the pathology image may exist for each of the plurality of resolutions to form a plurality of pathology images. In this case, the processor may be configured to be capable of controlling the information processing apparatus to receive a partial image existing at a predetermined position, to which the annotation information is affixed, in a first pathology image taken at a first resolution, and to receive a partial image existing at the position same as the predetermined position in a second pathology image taken at a second resolution.
  • Therefore, the information processing apparatus may download in advance not only the partial image of the first pathology image, to which the annotation information is affixed, but also the partial image of the second pathology image which is taken from the biological tissue in common with the first pathology image but of which the resolution differs from the resolution of the first pathology image.
  • The processor may be configured to be capable of controlling the information processing apparatus to receive the area specifying information from the other information processing apparatus via a first server apparatus. In this case, the location may indicate a second server apparatus different from the first server apparatus.
  • Therefore, the pathology image and the area specifying information are managed in different servers, respectively. Accordingly, it may be prevented that the concentration of the load on a specific server disturbs the expedition of the teleconference.
  • An information processing apparatus according to another embodiment includes a processor and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images; and (b) transmit, to a second information processing apparatus, the area specifying information and the location information.
  • According to another embodiment of the present technology, there is provided an information processing apparatus comprising: a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive area specifying information and location information from a first information processing apparatus, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location, and (d) display the received plurality of partial images.
  • According to still another embodiment of the present technology, there is provided a method of operating an information processing apparatus. The method includes (a) causing a processor to execute instructions to receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, and (b) causing the processor to execute instructions to transmit, to a second information processing apparatus, the area specifying information and the location information.
  • According to still another embodiment of the present technology, there is provided a system including a first, second, and third information processing apparatus. The first information processing apparatus is configured to, in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images. The second information processing apparatus is configured to receive the area specifying information and the location information from the first information processing apparatus. The third information processing apparatus is configured to: (a) receive the area specifying information and the location information from the second information processing apparatus, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location; and (d) display the received plurality of partial images.
  • As described above, according to the present technology, the time required to prepare the holding of a teleconference in which data of the pathology image are possessed in common may be shortened, and the teleconference may be expedited efficiently and smoothly.
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a block diagram showing the configuration of a teleconference system according to a first embodiment of the present technology.
  • FIG. 2 is a block diagram showing the configuration of hardware of a PC in the system.
  • FIG. 3 is a diagram explaining a display principle of a pathology image treated in the system.
  • FIG. 4 is a diagram indicating the procedure in the case of producing an image group of the pathology image treated in the system.
  • FIG. 5 is a flowchart showing the processing in a PC being the chairman when a pathology image is displayed in the system.
  • FIG. 6 is a flowchart showing the processing in a PC being an audience when a pathology image is displayed in the system.
  • FIG. 7 is a block diagram showing the configuration of a teleconference system according to a second embodiment of the present technology.
  • FIG. 8 is a diagram showing the specification of tiles, to be downloaded in advance, on the basis of past operation record information in the system according to the second embodiment.
  • FIG. 9 is a diagram showing the specification of tiles, to be downloaded in advance, on the basis of past operation record information in the system according to the second embodiment.
  • FIG. 10 is a diagram showing the specification of tiles, to be downloaded in advance, on the basis of past operation record information in the system according to the second embodiment.
  • FIG. 11 is a diagram showing the specification of a tile, to be downloaded in advance, according to past annotation information in the system according to the second embodiment.
  • FIG. 12 is a diagram showing the production of a pathology slide from a plurality of slices of a biological tissue.
  • FIG. 13 is a flowchart showing one example of the processing in a PC and a server when a pathology image is downloaded in advance in the system according to the second embodiment.
  • FIG. 14 is a flowchart showing another example of the processing in a PC and a server when a pathology image is downloaded in advance in the system according to the second embodiment.
  • FIG. 15 is a flowchart showing the processing in a PC being an audience when a pathology image is displayed in the system.
  • FIG. 16 is a block diagram showing the configuration of a teleconference system according to a third embodiment.
  • FIG. 17 is a block diagram showing the configuration of a teleconference system according to a fourth embodiment.
  • FIG. 18 is a block diagram showing the configuration of a teleconference system according to a fifth embodiment.
  • FIG. 19 is a flowchart showing the processing in a PC according to the fifth embodiment.
  • FIG. 20 is a block diagram showing the configuration of a teleconference system according to a sixth embodiment.
  • FIG. 21 is a block diagram showing the configuration of a teleconference system according to a seventh embodiment.
  • DESCRIPTION OF EMBODIMENTS
  • Hereinafter, embodiments according to the present technology will be described with reference to drawings.
  • First Embodiment
  • Initially, the first embodiment according to the present technology will be described.
  • [Outline of System]
  • FIG. 1 is a block diagram showing the configuration of a teleconference system using a pathology image according to the first embodiment. In this system, a teleconference is held while personal computers (PCs) of a plurality of users (e.g., doctors) have an image for pathological diagnosis (a pathology image) in common with one another, and a diagnosis is efficiently made while opinions are exchanged among the PCs.
  • As shown in this figure, this system has a server 200, a plurality of PCs 100, and a scanner 300. These may communicate with one another via the Internet 150.
  • Data of a pathology image taken by the scanner 300 are uploaded to the server 200 via the Internet 150 and are stored. As described later in detail, the pathology image is obtained by taking an image of a slice of a biological tissue or the like held in a glass slide. The pathology image data are not necessarily stored only in the server 200, but may be stored in any of the PCs 100.
  • In this system, one of the plurality of PCs 100 acts as the “chairman” of the teleconference, and the other ones act as “audiences”. A pathology image specified by the PC 100 as the chairman is displayed on the PCs 100 as audiences.
  • As described later in detail, the PC 100 acting as the chairman transmits information (area specifying information), specifying a display area in the pathology image for a diagnosis in the teleconference, and information (URL; uniform resource locator), indicating locations of the pathology image data, to the server 200. The server 200 transmits the area specifying information and the URL to the PCs 100 acting as the audiences. That is, the PC 100 acting as the chairman transmits the area specifying information and the URL to the PCs 100 acting as the audiences via the server 200.
  • Each PC 100 acting as the audience receives an image (a partial image) of the pathology image existing in the area, specified by the area specifying information, from the URL and displays the partial image. Therefore, the PCs 100 have the pathology image in common with one another, and a pathological diagnosis may be made by the users of the PCs 100.
  • [Hardware Configuration of PC]
  • FIG. 2 is a block diagram showing the hardware configuration of one PC 100.
  • Each PC 100 has a central processing unit (CPU) 11, a read only memory (ROM) 12, a random access memory (RAM) 13, an input/output interface 15, and a bus 14 connecting these components with one another.
  • The input/output interface 15 is connected with a display 16, an input section 17, storage 18, a communicating section 19, a driver 20, and the like.
  • The display 16 is, for example, a display device using liquid crystal, electro-luminescence (EL) or the like.
  • The input section 17 is, for example, a pointing device, a keyboard, a touch panel, a microphone, or another operating device. When the input section 17 includes a touch panel, the touch panel may be integrally formed with the display 16.
  • The storage 18 is a nonvolatile storage device, for example, being a hard disk drive (HDD), a flash memory, or another solid memory. In the storage 18, not only the pathology image data are stored, but also an application program, to be executed to receive and display the pathology image data in this system, is stored.
  • The driver 20 is, for example, a device capable of driving a removable recording medium 21 such as an optical recording medium, a floppy (registered trademark) disk, a magnetic recording tape, or a flash memory. In contrast, the storage 18 is often used as a device which is mounted in the PC 100 in advance and drives a non-removable recording medium.
  • The communicating section 19 is a modem, a router, or another communicating device which communicates with another device and may be connected to a local area network (LAN), a wide area network (WAN), or the like. The communicating section 19 may use either wired communication or wireless communication. The communicating section 19 may also be formed separately from the PC 100 and used together with it.
  • The hardware configuration of the server 200 is basically the same as that of the PC 100, and includes blocks necessary to act as a computer, such as a control section, storage, and a communicating section.
  • Next, a pathology image, which may be stored in the server 200 or the storage 18 of the PC 100, and a display principle of the image will be described. FIG. 3 is a diagram showing an image pyramid structure for explaining the display principle.
  • An image pyramid structure 50 according to this embodiment is an image group (a total image group) whose images are produced, at a plurality of different resolutions, from a single pathology image obtained from a single observation object 40 (see FIG. 4) by an optical microscope. The image having the largest size is located in the lowest layer of the image pyramid structure 50, while the image having the smallest size is located in the highest layer. The resolution of the largest sized image is, for example, 50×50 kilopixels or 30×40 kilopixels. The resolution of the smallest sized image is, for example, 256×256 pixels or 256×512 pixels.
  • That is, when the same display 16 displays these images, for example, in 100% size (each of the images is displayed at the number of physical dots which is the same as the number of pixels in the image), the image having the largest size is displayed in the largest size, and the image having the smallest size is displayed in the smallest size. Here, in FIG. 3, the display area of the display 16 is indicated by D.
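  • As a rough illustration of the scale of such a pyramid, the following sketch (a hypothetical calculation using the example sizes above, not part of the embodiment itself) estimates how many halving steps separate the largest and smallest images when every layer halves the resolution of the layer below it.

```python
import math

# Hypothetical figures taken from the example sizes above: the largest
# image is about 50,000 x 50,000 pixels and the smallest image is 256 x 256.
largest_side_px = 50_000
smallest_side_px = 256

# Each pyramid layer halves the resolution of the layer below it, so the
# number of halving steps is log2 of the size ratio, rounded up.
halving_steps = math.ceil(math.log2(largest_side_px / smallest_side_px))
total_layers = halving_steps + 1  # include the original, largest layer

print(f"halving steps: {halving_steps}, total layers: {total_layers}")
# halving steps: 8, total layers: 9
```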
  • FIG. 4 is a diagram for explaining the procedure in the case of producing the image group of the image pyramid structure 50.
  • Initially, a digital image of an original image (a huge image) obtained at a predetermined observation magnification by an optical microscope (not shown) is prepared. This original image is equivalent to the largest sized image being the lowest image of the image pyramid structure 50 shown in FIG. 3, and is an image having the highest resolution. Therefore, as the lowest image of the image pyramid structure 50, an image obtained by observing at a comparatively high resolution by the optical microscope is used.
  • In the field of pathology, generally, a slice thinly cut off from a living internal organ, a biological tissue, a cell, or a part of any one of these is an observation object 40. Then, the observation object 40 held in a glass slide is read out by the scanner 300 having the function of an optical microscope, and an obtained digital image is stored in the scanner 300 or other storage.
  • As shown in FIG. 4, this scanner 300 or a generally-used computer (not shown) produces a plurality of images whose resolutions are lowered step by step from the largest sized image obtained as described above, and stores these images, for example, in units of "tiles" (partial images), each tile being a unit of a predetermined size. The size of one tile is, for example, 256×256 pixels. To each tile, identification information (an ID or a number) identifying the tile is added.
  • The image group produced as described above forms the image pyramid structure 50, and this image pyramid structure 50 is stored in the storage 18 of the PC 100 or the storage of the server 200. In practice, the PC 100 or the server 200 may store the images having the plurality of different resolutions in association with pieces of resolution information, respectively. The PC 100 may perform the production and storage of the image pyramid structure 50.
  • The total image group forming this image pyramid structure 50 may be produced by a known compression method, for example, a known compression method used for producing thumbnail images.
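  • The following is a minimal sketch of the tile-production step described above. It assumes the Pillow imaging library, a simple "layer_column_row" tile naming scheme, and hypothetical file paths; the actual scanner 300 or computer is not limited to this organization.

```python
from pathlib import Path
from PIL import Image  # Pillow; assumed to be available on the producing computer

TILE = 256  # tile size in pixels, as in the embodiment


def build_pyramid(original_path: str, out_dir: str) -> None:
    """Produce halved-resolution layers from the original image and cut every
    layer into 256x256 tiles named <layer>_<column>_<row>.png (layer 0 is the
    largest, original image)."""
    Image.MAX_IMAGE_PIXELS = None            # pathology originals are huge
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    image = Image.open(original_path)
    layer = 0
    while True:
        w, h = image.size
        for row in range(0, h, TILE):
            for col in range(0, w, TILE):
                tile = image.crop((col, row, min(col + TILE, w), min(row + TILE, h)))
                # The tile ID used here is "layer_column_row"; any unique ID works.
                tile.save(Path(out_dir) / f"{layer}_{col // TILE}_{row // TILE}.png")
        if w <= TILE and h <= TILE:
            break                            # the highest (smallest) layer is done
        image = image.resize((max(1, w // 2), max(1, h // 2)))
        layer += 1
```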
  • When the user of each PC 100 views a pathology image stored in the storage 18 of the PC 100 not in a teleconference but in a stand-alone manner, the PC 100 extracts a desired image from the image pyramid structure 50 in response to the user's operation input from the input section 17 and displays this image on the display 16. In this case, the PC 100 displays an image of an arbitrary portion, selected by the user, from among images having an arbitrary resolution selected by the user. By changing the observation magnification, the user may obtain a feeling of actually observing the observation object 40. That is, in this case, the PC 100 acts as a virtual microscope, and the virtual observation magnification is practically equivalent to the resolution described above.
  • [Operation of System]
  • Next, an operation of the system configured as described above will be described. Hereinafter, the operation will be described with the CPU 11 of the PC 100 as the main operating subject; the operation is actually performed in cooperation with the other blocks of each device and with software (an application).
  • FIG. 5 is a flowchart showing the processing in the PC 100 being the chairman when a pathology image is displayed according to this embodiment, while FIG. 6 is a flowchart showing the processing in the PC 100 being one audience in this case.
  • As shown in FIG. 5, the CPU 11 of the PC 100 being the chairman waits for an operation input of the user which specifies a display area in a specific pathology image (step 51) and specifies the display area when receiving this operation input (step 52).
  • Then, on the basis of the specified display area, the CPU 11 specifies tiles of the pathology image used for the display process, that is, tiles (partial images) of which all or a part is included in the display area (step 53).
  • Then, the CPU 11 transmits a request for the data of the specified tiles to the server 200 (step 54). In addition, the CPU 11 transmits display area information indicating the display area and a URL of the specified tile data to the server 200 (step 55).
  • Then, the CPU 11 receives the tile data transmitted from the server 200 in response to the request (step 56) and displays the tile data on the display (step 57).
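  • A minimal sketch of this chairman-side flow of FIG. 5 is shown below. It assumes hypothetical HTTP endpoints on the server 200, the hypothetical tile naming of the earlier sketch, and the Python requests library; the embodiment does not prescribe a particular message format.

```python
import requests  # assumed HTTP client; the endpoint paths below are hypothetical

TILE = 256
SERVER = "http://server.example/"                # hypothetical URL of the server 200


def tiles_in_area(x: int, y: int, width: int, height: int, layer: int):
    """Steps 52-53: list the (layer, column, row) IDs of every tile of which
    all or a part is included in the specified display area."""
    return [(layer, col, row)
            for row in range(y // TILE, (y + height - 1) // TILE + 1)
            for col in range(x // TILE, (x + width - 1) // TILE + 1)]


def chairman_display(x, y, width, height, layer, image_id):
    tile_ids = tiles_in_area(x, y, width, height, layer)
    tile_url = SERVER + f"images/{image_id}/tiles/"   # location (URL) of the tile data
    # Step 55: send the display area information and the URL via the server 200.
    requests.post(SERVER + "conference/display_area",
                  json={"image": image_id, "area": [x, y, width, height],
                        "layer": layer, "url": tile_url})
    # Steps 54, 56, 57: request, receive, and display the tile data.
    tiles = {tid: requests.get(tile_url + "{}_{}_{}.png".format(*tid)).content
             for tid in tile_ids}
    render(tiles)


def render(tiles):
    """Placeholder for drawing the received tiles on the display 16."""
    print(f"displaying {len(tiles)} tiles")
```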
  • On the other hand, as shown in FIG. 6, the CPU 11 of the PC 100 being each audience receives the display area information and the URL transmitted from the PC 100 being the chairman via the server 200 (step 61).
  • Then, the CPU 11 specifies tiles of the pathology image required for the display process of the display area on the basis of the display area information (step 62).
  • Then, the CPU 11 transmits the request for the specified tiles to the server 200 on the basis of the received URL (step 63).
  • Then, the CPU 11 receives tiles transmitted in response to the request (step 64), and displays the tiles on the display 16 (step 65).
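  • Correspondingly, a sketch of the audience-side flow of FIG. 6, under the same assumptions (hypothetical endpoints and message format, Python requests library), might look as follows.

```python
import requests  # assumed HTTP client

TILE = 256


def tiles_in_area(x, y, width, height, layer):
    # Step 62: the same tile-specification rule as on the chairman side.
    return [(layer, col, row)
            for row in range(y // TILE, (y + height - 1) // TILE + 1)
            for col in range(x // TILE, (x + width - 1) // TILE + 1)]


def audience_display(display_area_info: dict) -> None:
    """display_area_info is the message relayed by the server 200 in step 61,
    e.g. {"area": [x, y, w, h], "layer": n, "url": "..."} (a hypothetical format)."""
    x, y, w, h = display_area_info["area"]
    tile_url = display_area_info["url"]               # URL of the tile data
    for tid in tiles_in_area(x, y, w, h, display_area_info["layer"]):
        data = requests.get(tile_url + "{}_{}_{}.png".format(*tid)).content  # steps 63-64
        show_tile(tid, data)                          # step 65: draw on the display 16


def show_tile(tile_id, data):
    """Placeholder for drawing one tile on the display 16."""
    print(tile_id, len(data), "bytes")
```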
  • Second Embodiment
  • Next, the second embodiment according to the present technology will be described.
  • FIG. 7 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • In the first embodiment described above, the pathology image is stored in the server 200 and is downloaded to the PC 100 on the day on which the teleconference is held. However, as shown in this figure, when the day fixed for the teleconference and the pathology image to be used in the teleconference are determined in advance, the PC 100 may also download the pathology image to the storage 18 before the conference is held. This downloading may be performed in tile units rather than in units of the whole pathology image.
  • In the above-described download of tiles in advance, tiles that are likely to be used in the teleconference may be downloaded. The tiles likely to be used are, for example, the tiles used for a pathological diagnosis in the past. The tiles used for a past pathological diagnosis are, for example, specified according to the following criteria.
  • That is, the PC 100 specifies the tiles used for a past diagnosis on the basis of a display operation record indicating at which magnification (resolution) a doctor viewed the pathology image when making a diagnosis in the past and how the doctor moved the coordinates or the display area.
  • FIG. 8, FIG. 9, and FIG. 10 are diagrams showing examples of tiles specified on the basis of this operation record information.
  • As shown in FIG. 8, when the operation record of a pathology image I is indicated as a coordinate moving path 71 being a line, the tiles 70 that the line crosses (the tiles to which the line belongs) are specified. Further, when the operation record is indicated by a point (specific coordinates), the tile including this point is specified.
  • Further, as shown in FIG. 9, in the same case as in FIG. 8, a rectangular group of tiles 70 including the line may instead be specified.
  • Further, as shown in FIG. 10, when the operation record of the pathology image I is indicated as a display area 72 of a rectangle, each tile 70 of which all or a part is included in this rectangle is specified.
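  • A sketch of these selection rules is shown below, assuming 256×256-pixel tiles and representing the operation record as plain pixel-coordinate values; the helper names are illustrative only.

```python
TILE = 256  # tile size in pixels, as in the embodiment


def tiles_on_path(path_points):
    """FIG. 8: the tiles that a coordinate moving path goes across.  This sketch
    assumes the path is sampled densely enough that consecutive points fall in the
    same or adjacent tiles; otherwise each segment would have to be rasterized."""
    return {(x // TILE, y // TILE) for x, y in path_points}


def tiles_in_rectangle(x, y, width, height):
    """FIG. 9 / FIG. 10: every tile of which all or a part lies in a rectangle,
    e.g. the bounding rectangle of the path or a recorded display area."""
    return {(col, row)
            for row in range(y // TILE, (y + height - 1) // TILE + 1)
            for col in range(x // TILE, (x + width - 1) // TILE + 1)}


def tile_of_point(x, y):
    """A record given as a single point (or an annotation affixed at a point)
    specifies the one tile that contains it."""
    return (x // TILE, y // TILE)
```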
  • Furthermore, when a diagnosis annotation was affixed to a pathology image by a doctor in a past diagnosis, the PC 100 specifies the tile to which this annotation is affixed. An annotation denotes information that is formed, for example, as a symbol, a diagram, text, a voice, an image, a link (e.g., a URL), or the like on the basis of user input on the PC displaying the pathology image.
  • For example, as shown in FIG. 11, the tile 70 to which an annotation 73 indicated by a symbol X is affixed, among the tiles composing the pathology image I, is specified. When an annotation indicated by a line, a rectangle, or a circle is affixed, the tiles that the line crosses, or the tiles included in the rectangle or the circle, are specified in the same manner as in FIG. 8 to FIG. 10.
  • FIG. 12 is a diagram showing the preparation of slides (pathology slides) in which slices are held to be photographed for pathology images.
  • As shown in this figure, a pathology slide 80 is prepared by cutting off an observation object (a slice) 40 at a thickness of two to three micrometers from a pathology piece (a biological tissue) S taken out by surgery or the like, mounting the slice 40 on a slide glass 81, and covering the slice 40 with a cover glass 82.
  • Accordingly, successively cut off slices have approximately the same tissue shape and the same medical features, such as those of the cells. That is, as shown in this figure, diagnosis information affixed to a pathology slide 80A of a slice 40A is also useful for an adjacent pathology slide 80B of a slice 40B.
  • In some cases, the different slices 40A and 40B are stained with different colors so that different inspections may be performed on them, respectively. Accordingly, when a plurality of pathology images relating to successive slices stained with different colors exist, the PC 100 specifies a tile of one pathology image to which an annotation is affixed, and also specifies the tile of another pathology image existing at the same coordinates as those of the position at which the annotation is affixed.
  • Further, as described above, the user may see one pathology image while changing the observation magnification (the resolution). Accordingly, when an annotation is affixed to a certain pathology image, this annotation is also useful information for another pathology image of the same observation object of which the magnification (the resolution) differs from that of the certain pathology image. Accordingly, when an annotation is affixed to a certain pathology image, the PC 100 specifies a tile of the certain pathology image to which the annotation is affixed, and also specifies a tile, which includes the same coordinates as coordinates at which the annotation is affixed, in another pathology image of the same observation object having a different resolution.
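  • Mapping an annotated position to the corresponding tile in a differently stained slice or in a differently scaled pathology image reduces to scaling the annotated coordinates, as in the following sketch; the factor-of-two spacing between layers is the assumption used throughout this description.

```python
TILE = 256  # tile size in pixels, as in the embodiment


def corresponding_tile(annot_x: int, annot_y: int,
                       source_layer: int, target_layer: int):
    """Given an annotation at (annot_x, annot_y) on one pyramid layer, return the
    (column, row) of the tile covering the same point of the observation object on
    another layer.  Layer 0 is the largest image and each higher layer is assumed
    to be half the size of the layer below it."""
    scale = 2.0 ** (source_layer - target_layer)
    x, y = int(annot_x * scale), int(annot_y * scale)
    return (x // TILE, y // TILE)


# For a differently stained adjacent slice scanned at the same magnification the
# coordinates are reused as they are (source and target layer are equal), so the
# same function also gives the tile of the other slice's pathology image.
same_stain_tile = corresponding_tile(10_000, 7_500, source_layer=0, target_layer=0)
half_resolution_tile = corresponding_tile(10_000, 7_500, source_layer=0, target_layer=1)
```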
  • FIG. 13 is a flowchart showing the processing in the PC 100 and the server 200 in the download of a pathology image in advance.
  • As shown in this figure, initially, pathology image data to be used for a teleconference and a list of participants of the teleconference are transmitted from the PC 100 of the promoter of the teleconference to the server 200 and are registered (step 131). This process is, for example, performed by selecting a desired pathology image and users from data of all pathology images and a list of all users existing in the server 200.
  • Then, the server 200 transmits a notice of a conference holding to the PC 100 of each participant (step 132).
  • Then, the PC 100 of each participant determines whether or not storage 18, such as an HDD, having a capacity sufficient to download a pathology image in advance exists in the PC 100 (step 133).
  • When an HDD with sufficient capacity exists (YES), the PC 100 of the participant receives a list of the pathology image data to be used in the teleconference from the server 200 (step 134). In this case, past diagnosis record information relating to these pathology image data and the URLs of the pathology image data (tiles) at this time are also received.
  • Then, the PC 100 specifies tiles according to the criterion described above on the basis of the received list of pathology image data and the received diagnosis record information, and downloads the specified tiles from the server 200 (step 135).
  • Then, when the start time of the teleconference comes, the PC 100 of the participant may participate in the conference by being connected to the server 200 (step 136).
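  • A sketch of the participant-side part of FIG. 13 is shown below, with hypothetical server endpoints and response format, the Python requests library, and shutil.disk_usage standing in for the capacity check of step 133.

```python
import shutil

import requests  # assumed HTTP client; the endpoint paths below are hypothetical

SERVER = "http://server.example/"
CACHE_DIR = "/var/cache/pathology_tiles"   # hypothetical path of the storage 18
REQUIRED_FREE_BYTES = 10 * 1024 ** 3       # assumed threshold for "capable of downloading"


def prepare_for_conference(conference_id: str) -> None:
    # Step 133: does storage with enough free capacity exist?
    if shutil.disk_usage(CACHE_DIR).free < REQUIRED_FREE_BYTES:
        return                              # skip the download in advance
    # Step 134: list of pathology images, diagnosis record information, and tile URLs.
    listing = requests.get(SERVER + f"conferences/{conference_id}/images").json()
    # Step 135: download the tiles specified by the diagnosis record information.
    for entry in listing:
        for tile_id in entry["record_tiles"]:          # tiles used in past diagnoses
            data = requests.get(entry["url"] + tile_id).content
            with open(f"{CACHE_DIR}/{tile_id}", "wb") as f:
                f.write(data)
```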
  • In the example shown in FIG. 13, the list of data to be used in the conference is fixed at the time when the promoter registers the pathology image data to the server 200. However, in practice, in the period from the time when a request for holding a conference is transmitted to the time when the conference is held, the list may be updated, for example when the promoter deletes unnecessary data or when another participant adds data that this participant wishes to use.
  • Therefore, as shown in the flowchart of FIG. 14, the list may be kept up to date by polling it.
  • That is, as shown in this figure, at step 141 to step 145, after the PC 100 performs the same processes as those at step 131 to step 135 of FIG. 13, the PC 100 determines whether or not the start time of the teleconference comes (step 146).
  • Then, when the start time of the teleconference has not yet come (NO), the PC 100 accesses the server 200 and determines whether or not the list of pathology image data to be used has been updated (step 147).
  • When the list has been updated (YES), the PC 100 receives the new list from the server 200 and repeats the processes of step 144 and the subsequent steps.
  • Here, in place of the processes at step 146 and step 147, the PC 100 may receive a new list in response to a notice that indicates the update of the list and is sent from the server 200 to the PC 100 of each participant.
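  • The polling variant of FIG. 14 simply wraps the download step in a loop that repeats until the start time and re-fetches the list when the server reports an update; a sketch, again with hypothetical endpoints, follows.

```python
import time

import requests  # assumed HTTP client; the endpoint paths below are hypothetical

SERVER = "http://server.example/"


def poll_until_start(conference_id: str, start_time: float, download_tiles) -> None:
    """download_tiles is a callable performing steps 144-145 (see the earlier sketch)."""
    last_version = None
    while time.time() < start_time:                    # step 146
        # Step 147: has the list of pathology image data to be used been updated?
        version = requests.get(SERVER + f"conferences/{conference_id}/list_version").text
        if version != last_version:
            download_tiles(conference_id)              # repeat steps 144-145
            last_version = version
        time.sleep(60)                                 # polling interval (assumed)
    # At the start time the PC 100 connects to the server 200 and joins the conference.
```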
  • Further, the destination to which tiles are downloaded in advance is not limited to the storage 18 of the PC 100. For example, the downloading to a neighboring storage device connected with the PC 100 via a network is possible. When there are a plurality of servers 200, the downloading to the server 200 closest to the PC 100 is allowed.
  • FIG. 15 is a flowchart showing the processing in the PC 100 being an audience when a pathology image is displayed in the system according to this embodiment.
  • As shown in this figure, after the CPU 11 of the PC 100 performs the same processes as those at step 61 and step 62 shown in FIG. 6 according to the first embodiment (steps 151 and 152), the CPU 11 determines whether or not specified tiles are stored in the storage 18 (whether or not specified tiles have been downloaded in advance) (step 153).
  • When the tiles do not exist in the storage 18 (NO), the CPU 11 transmits a request for the tiles to the server 200 (step 154), and receives the tiles (step 156).
  • When the tiles are stored in the storage 18 (YES), the CPU 11 reads out the tiles from the storage 18 (step 155).
  • Then, the CPU 11 displays the tiles, received or read out, on the display 16 (step 157).
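  • A sketch of the cache check of FIG. 15 is shown below, assuming the tiles downloaded in advance are kept as files under a hypothetical cache directory representing the storage 18.

```python
from pathlib import Path

import requests  # assumed HTTP client

CACHE_DIR = Path("/var/cache/pathology_tiles")   # hypothetical path of the storage 18


def get_tile(tile_id: str, tile_url: str) -> bytes:
    cached = CACHE_DIR / tile_id
    if cached.exists():                  # step 153: was the tile downloaded in advance?
        return cached.read_bytes()       # step 155: read it out from the storage 18
    # Steps 154 and 156: otherwise request and receive the tile from the server 200.
    data = requests.get(tile_url + tile_id).content
    cached.write_bytes(data)             # keep it for later requests (an addition of
                                         # this sketch, not required by FIG. 15)
    return data
```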
  • As described above, with the download in advance, unnecessary communication on the day of the teleconference is reduced. Accordingly, the load on the server 200, which would otherwise grow as the number of PCs 100 acting as audiences increases, may be reduced, and unnecessary packets transmitted through the network may be avoided.
  • Third Embodiment
  • Next, a third embodiment according to the present technology will be described.
  • FIG. 16 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • In each of the embodiments described above, both the display area information and the tile data of the pathology image are stored in the server 200. However, it is not necessary to process both in the same server. Therefore, as shown in this figure, in this embodiment, a tile data server 200A for storing the tile data and a control data server 200B for processing the display area information are located separately. Further, the control data server 200B manages the past diagnosis record information described in the second embodiment, the list of tile data, the list of participants, and the like (hereinafter, these data are collectively called control data).
  • Therefore, each server may be optimized for the data it handles: the control data, which have a small volume but require a rapid response, and the tile data, which have a large volume and require a high throughput.
  • Further, when the server 200B treating the control data does not exist, the control data may be exchanged by P2P communication among the plurality of PCs 100.
  • Fourth Embodiment
  • Next, a fourth embodiment according to the present technology will be described.
  • FIG. 17 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • As shown in this figure, a teleconference is held among a plurality of hospitals by connecting in-hospital networks 170 with one another. When pathology image data to be used in the conference are gathered to a single point, the cost for uploading and downloading is increased.
  • In contrast, pathology image data to be used in the teleconference are originally obtained by taking an image by a scanner 300 of each hospital and are retained in the in-hospital servers 200A. Therefore, as shown in this figure, in this embodiment, the in-hospital server 200A of each hospital is used in place of the tile data server 200A in the third embodiment. In this case, the control data server 200B also exists on the Internet 150.
  • Because of this configuration, the upload of the tile data to the server on the Internet 150 becomes unnecessary.
  • Fifth Embodiment
  • Next, a fifth embodiment according to the present technology will be described.
  • FIG. 18 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • In the fourth embodiment, the in-hospital server substantially has to deliver the tile data to the participants of a teleconference. Therefore, depending on the network bandwidth and the number of PCs 100, the load on the server may become a problem.
  • Therefore, in this embodiment, as shown in this figure, tile data to be downloaded are uploaded to a cloud server 200C in advance. Accordingly, the load on the in-hospital server 200D is reduced.
  • FIG. 19 is a flowchart showing the display process in the PC 100 according to this embodiment.
  • As shown in this figure, the CPU 11 of the PC 100 specifies the tiles to be used in the same manner as in the processing of FIG. 6 and FIG. 15 (steps 191 and 192), and then transmits a request for the specified tiles to the cloud server 200C (step 193).
  • Then, the CPU 11 determines whether or not the tiles to be requested are successfully received (step 194).
  • When the reception of the tiles fails (NO), the CPU 11 transmits a request for the tiles to the in-hospital server 200D (step 196) and receives the tiles (step 197). Thereafter, the received tiles are displayed.
  • Because of this processing, the amount of tile data that has to be uploaded to the server 200C in advance may be kept small, and the load on the in-hospital server 200D is also reduced.
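  • A sketch of the fallback flow of FIG. 19, in which the cloud server 200C is tried first and the in-hospital server 200D is used only on failure, might look as follows (the URLs and timeout are hypothetical).

```python
import requests  # assumed HTTP client

CLOUD_URL = "http://cloud.example/tiles/"         # hypothetical URL of the server 200C
IN_HOSPITAL_URL = "http://hospital.local/tiles/"  # hypothetical URL of the server 200D


def fetch_tile(tile_id: str) -> bytes:
    try:
        response = requests.get(CLOUD_URL + tile_id, timeout=5)   # step 193
        response.raise_for_status()
        return response.content                                   # step 194: success
    except requests.RequestException:
        # Reception from the cloud server failed: fall back to the in-hospital server.
        return requests.get(IN_HOSPITAL_URL + tile_id).content    # steps 196 and 197
```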
  • Sixth Embodiment
  • Next, a sixth embodiment according to the present technology will be described.
  • FIG. 20 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • In each of the embodiments described above, there is only a single tile server. However, as shown in this figure, in this embodiment, a plurality of tile servers delivering the same tile data are located (tile servers 200A-1 and 200A-2). For example, the tile server 200A-1 is set as a primary server, and the tile server 200A-2 is set as a secondary server.
  • Therefore, because a server optimum for each PC 100 acting as a client is determined for that PC 100, the system may cope with an increase in the number of PCs 100. In this case, any of the following server-determining algorithms may be used (a sketch of one such algorithm is given after the list).
  • A server is statically allocated to each client in advance.
  • The client (the PC 100) selects the server having the smallest RTT (Round Trip Time).
  • The client (the PC 100) selects the server having the widest bandwidth.
  • The primary server (the tile server 200A-1) determines while considering the total load.
  • Round-robin processing is performed.
  • The load information of the server is periodically (e.g., every 10 seconds) obtained.
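  • As one example among the algorithms listed above, the client may measure the round trip time to each candidate tile server and select the smallest; a minimal sketch with hypothetical server URLs, using a lightweight HTTP request as the RTT probe, is shown below.

```python
import time

import requests  # assumed HTTP client

# Hypothetical URLs of the tile servers 200A-1 and 200A-2.
TILE_SERVERS = ["http://tiles-1.example/", "http://tiles-2.example/"]


def pick_server_by_rtt(servers=TILE_SERVERS) -> str:
    """Select the tile server with the smallest measured round trip time (RTT)."""
    def rtt(url: str) -> float:
        start = time.monotonic()
        try:
            requests.head(url, timeout=2)
        except requests.RequestException:
            return float("inf")            # an unreachable server is never selected
        return time.monotonic() - start
    return min(servers, key=rtt)
```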
  • Seventh Embodiment
  • Next, a seventh embodiment according to the present technology will be described.
  • FIG. 21 is a block diagram showing the configuration of a teleconference system according to this embodiment.
  • As shown in this figure, in the in-hospital system on the in-hospital network 170, tile data for which a diagnosis has been finished are retreated from the in-hospital network 170 to the cloud server 200C via the Internet 150.
  • Then, when an in-hospital conference using the retreated data is reserved in the in-hospital system, the tile data retreated to the cloud server 200C are returned to the in-hospital server 200D at the time of the reservation.
  • [Modifications]
  • The present technology is not limited to only the embodiments described above, and may be variously modified within the scope not departing from the subject matter of this disclosure.
  • The configurations of the first to seventh embodiments may be combined with one another in any manner as long as the combined embodiments are not contradictory to one another. For example, the download process in advance in the second embodiment may be embodied in the configuration in which the server is divided into the tile data server and the control data server as described in the third embodiment.
  • [Others]
  • In the present technology, the following configurations may be adopted.
  • In an embodiment, an information processing apparatus includes a processor and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images; and (b) transmit, to a second information processing apparatus, the area specifying information and the location information.
  • In the information processing apparatus according to an embodiment, the image includes data representative of a slice of a biological tissue.
  • In the information processing apparatus according to an embodiment, (a) the information processing apparatus includes a server, (b) the first information processing apparatus includes a first personal computer; and (c) the second information processing apparatus includes a second personal computer.
  • In the information processing apparatus according to an embodiment, the plurality of partial images includes (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • In the information processing apparatus according to an embodiment, the area specifying information is determined based on a display record.
  • In the information processing apparatus according to an embodiment, the area specifying information is determined based on annotation information.
  • In another embodiment, an information processing apparatus includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit the area specifying information and the location information to a first information processing apparatus, the first information processing apparatus being configured to, in response to receiving the area specifying information and the location information, transmit the area specifying information and the location information to a second information processing apparatus, (c) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (d) receive the plurality of partial images from the at least one location; and (e) display the received plurality of partial images.
  • In the information processing apparatus according to an embodiment, the image includes data representative of a slice of a biological tissue.
  • In the information processing apparatus according to an embodiment, (a) the information processing apparatus includes a first personal computer, (b) the first information processing apparatus includes a server; and (c) the second information processing apparatus includes a second personal computer.
  • In the information processing apparatus according to an embodiment, the plurality of partial images includes: (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • In the information processing apparatus according to an embodiment, the instructions, when executed by the processor, cause the processor to: (a) receive the first partial image from the first location, and (b) receive the second partial image from the second location.
  • In the information processing apparatus according to an embodiment, the instructions, when executed by the processor, cause the processor to: (a) determine whether the first partial image is stored by the memory device, (b) in response to a determination that the first partial image is stored by the memory device, read out the first partial image from the memory device, and (c) for the first partial image, do not transmit a request to the at least one location indicated by the location information.
  • In the information processing apparatus according to an embodiment, the instructions, when executed by the processor, cause the processor to determine the area specifying information based on a display record.
  • In the information processing apparatus according to an embodiment, the instructions, when executed by the processor, cause the processor to determine the area specifying information based on annotation information.
  • In another embodiment, an information processing apparatus includes a processor, a display device, and a memory device storing instructions. When executed by the processor, the instructions cause the processor to: (a) receive area specifying information and location information from a first information processing apparatus, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location, and (d) display the received plurality of partial images.
  • In the information processing apparatus according to an embodiment, the image includes data representative of a slice of a biological tissue.
  • In the information processing apparatus according to an embodiment, (a) the information processing apparatus includes a personal computer, and (b) the first information processing apparatus includes a server.
  • In the information processing apparatus according to an embodiment, the plurality of partial images includes: (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • In the information processing apparatus according to an embodiment, the instructions, when executed by the processor, cause the processor to: (a) receive the first partial image from the first location, and (b) receive the second partial image from the second location.
  • In the information processing apparatus according to an embodiment, the instructions, when executed by the processor, cause the processor to: (a) determine whether the first partial image is stored by the memory device, (b) in response to a determination that the first partial image is stored by the memory device, read out the first partial image from the memory device, and (c) for the first partial image, do not transmit a request to the at least one location indicated by the location information.
  • In the information processing apparatus according to an embodiment, the area specifying information is determined based on a display record.
  • In the information processing apparatus according to an embodiment, the area specifying information is determined based on annotation information.
  • In another embodiment, a method of operating an information processing apparatus includes (a) causing a processor to execute instructions to receive, from a first information processing apparatus, area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images, and (b) causing the processor to execute instructions to transmit, to a second information processing apparatus, the area specifying information and the location information.
  • In the method of operating an information processing apparatus according to an embodiment, the image includes data representative of a slice of a biological tissue.
  • In the method of operating an information processing apparatus according to an embodiment, (a) the information processing apparatus includes a server, (b) the first information processing apparatus includes a first personal computer, and (c) the second information processing apparatus includes a second personal computer.
  • In the method of operating an information processing apparatus according to an embodiment, the plurality of partial images includes: (a) a first partial image stored at a first location, and (b) a second partial image stored at a second location.
  • In the method of operating an information processing apparatus according to an embodiment, the area specifying information is determined based on a display record.
  • In the method of operating an information processing apparatus according to an embodiment, the area specifying information is determined based on annotation information.
  • In another embodiment, a system includes first, second, and third information processing apparatus. The first information processing apparatus is configured to, in response to an operation input, determine area specifying information and location information, the area specifying information specifying a display area in an image, the display area including a plurality of partial images, the location information indicating at least one location of the plurality of partial images. The second information processing apparatus is configured to receive the area specifying information and the location information from the first information processing apparatus. The third information processing apparatus is configured to: (a) receive the area specifying information and the location information from the second information processing apparatus, (b) transmit a request for the plurality of partial images to the at least one location indicated by the location information, (c) receive the plurality of partial images from the at least one location, and (d) display the received plurality of partial images.
  • The present application contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2011-180438 filed in the Japan Patent Office on Aug. 22, 2011, the entire content of which is hereby incorporated by reference.
  • It should be understood that various changes and modifications to the presently preferred embodiments described herein will be apparent to those skilled in the art. Such changes and modifications can be made without departing from the spirit and scope of the present subject matter and without diminishing its intended advantages. It is therefore intended that such changes and modifications be covered by the appended claims.
  • REFERENCE SIGNS LIST
      • 11 CPU
      • 16 display
      • 18 storage
      • 19 communicating section
      • 40 observation object
      • 70 tile
      • 71 coordinates moving path
      • 72 display area
      • 73 annotation
      • 100 PC
      • 150 Internet
      • 170 in-hospital network
      • 200 server
      • 300 scanner

Claims (21)

1. An information apparatus comprising:
a processor; and
a memory device storing instructions which when executed by the processor, cause the processor to:
receive information including an image;
output the information to a display;
in a case that an arbitrary portion of the image is selected,
obtain annotation information affixed to the arbitrary portion from a first information apparatus; and
obtain other information from a second information apparatus.
2. The information processing apparatus of claim 1, wherein the annotation information is the information that the first information apparatus downloads from the second apparatus in advance.
3. The information processing apparatus of claim 1, wherein the annotation information is selected from the group consisting of an observation magnification, a coordinate moving path, a position in the image to which the annotation is affixed, and combinations thereof.
4. The information processing apparatus of claim 1, wherein the annotation information comprises an operation record of the image.
5. The information processing apparatus of claim 4, wherein the operation record of the image is indicated as a display area.
6. The information processing apparatus of claim 1, wherein the annotation information is identifiable as one or more of a symbol, a diagram, a text, a voice, an image, and a line.
7. The information processing apparatus of claim 1, wherein the image comprises data representative of a slice of a biological tissue.
8. The information processing apparatus of claim 1, wherein the instructions further cause the processor to obtain second information of a second image.
9. The information processing apparatus of claim 8, wherein the second information of the second image comprises a second position.
10. The information processing apparatus of claim 9, wherein the second position has a same coordinate as a first position to which the annotation information is affixed.
11. A method of operating an information processing apparatus, the method comprising causing a processor to execute instructions to receive information of an image;
output the information to a display;
when an arbitrary portion of an image is selected,
obtain annotation information affixed to the arbitrary portion from a first information apparatus; and
obtain other information from a second information apparatus.
12. The method of claim 11, wherein the annotation information is selected from the group consisting of an observation magnification, a coordinate moving path, a position in the image to which the annotation is affixed, and combinations thereof.
13. The method of claim 11, wherein the annotation information comprises an operation record of the image.
14. The method of claim 11, wherein the annotation information is identifiable as one or more of a symbol, a diagram, a text, a voice, an image, and a line.
15. The method of claim 11, wherein the image comprises data representative of a slice of a biological tissue.
16. The method of claim 11, further comprising causing the processor to execute the instructions to obtain second information of a second image.
17. The method of claim 16, wherein the second information of the second image comprises a second position at a same coordinate as of a first position to which the annotation information is affixed.
18. A system comprising:
a first information processing apparatus configured to store annotation information affixed to an arbitrary portion of an image;
a second information processing apparatus configured to store other information of the arbitrary portion of the image; and
a third information processing apparatus configured to:
receive information of the image;
output the information to a display;
when the arbitrary portion of the image is selected,
obtain the annotation information affixed to the arbitrary portion from the first information apparatus in advance; and
obtain other information from a second information apparatus.
19. The system of claim 18, wherein the annotation information is selected from the group consisting of an observation magnification, a coordinate moving path, a position in the image to which the annotation is affixed, and combinations thereof.
20. The system of claim 18, wherein the annotation information comprises an operation record of the image.
21. The system of claim 18, wherein the annotation information is identifiable as one or more of a symbol, a diagram, a text, a voice, an image, and a line.
US15/185,284 2011-08-22 2016-06-17 Information processing apparatus, information processing system, method of processing information, and program Abandoned US20160301879A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/185,284 US20160301879A1 (en) 2011-08-22 2016-06-17 Information processing apparatus, information processing system, method of processing information, and program

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2011-180438 2011-08-22
JP2011180438A JP5859771B2 (en) 2011-08-22 2011-08-22 Information processing apparatus, information processing system, information processing method, and program
PCT/JP2012/004448 WO2013027323A1 (en) 2011-08-22 2012-07-10 Information processing apparatus, information processing system, method of processing information, and program
US201313880267A 2013-04-18 2013-04-18
US15/185,284 US20160301879A1 (en) 2011-08-22 2016-06-17 Information processing apparatus, information processing system, method of processing information, and program

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2012/004448 Continuation WO2013027323A1 (en) 2011-08-22 2012-07-10 Information processing apparatus, information processing system, method of processing information, and program
US13/880,267 Continuation US9398233B2 (en) 2011-08-22 2012-07-10 Processing apparatus, system, method and program for processing information to be shared

Publications (1)

Publication Number Publication Date
US20160301879A1 true US20160301879A1 (en) 2016-10-13

Family

ID=47746098

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/880,267 Expired - Fee Related US9398233B2 (en) 2011-08-22 2012-07-10 Processing apparatus, system, method and program for processing information to be shared
US15/185,284 Abandoned US20160301879A1 (en) 2011-08-22 2016-06-17 Information processing apparatus, information processing system, method of processing information, and program

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/880,267 Expired - Fee Related US9398233B2 (en) 2011-08-22 2012-07-10 Processing apparatus, system, method and program for processing information to be shared

Country Status (7)

Country Link
US (2) US9398233B2 (en)
EP (1) EP2748709B1 (en)
JP (1) JP5859771B2 (en)
CN (1) CN103154915B (en)
BR (1) BR112013008721A2 (en)
CA (1) CA2811889A1 (en)
WO (1) WO2013027323A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5859771B2 (en) * 2011-08-22 2016-02-16 ソニー株式会社 Information processing apparatus, information processing system information processing method, and program
JP6455829B2 (en) * 2013-04-01 2019-01-23 キヤノン株式会社 Image processing apparatus, image processing method, and program
US10593017B2 (en) * 2016-09-29 2020-03-17 Ricoh Company, Ltd. Information processing apparatus, storage medium, and image output system
KR102192164B1 (en) * 2018-08-08 2020-12-16 주식회사 딥바이오 System for biomedical image diagnosis, method for biomedical image diagnosis and terminal performing the same
JP7322409B2 (en) * 2018-08-31 2023-08-08 ソニーグループ株式会社 Medical system, medical device and medical method

Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030118222A1 (en) * 2000-11-30 2003-06-26 Foran David J. Systems for analyzing microtissue arrays
US20060294455A1 (en) * 2005-06-23 2006-12-28 Morris Benjamin R Method and system for efficiently processing comments to records in a database, while avoiding replication/save conflicts
US20070252834A1 (en) * 2006-04-27 2007-11-01 Microsoft Corporation Remotely Viewing Large Tiled Image Datasets
US20090307428A1 (en) * 2008-06-06 2009-12-10 Microsoft Corporation Increasing remote desktop performance with video caching
US20090326810A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation Caching navigation content for intermittently connected devices
US20110129135A1 (en) * 2009-11-27 2011-06-02 Sony Corporation Information processing apparatus, information processing method, and program
US20110131376A1 (en) * 2009-11-30 2011-06-02 Nokia Corporation Method and apparatus for tile mapping techniques
US20120076390A1 (en) * 2010-09-28 2012-03-29 Flagship Bio Methods for feature analysis on consecutive tissue sections
US20120219206A1 (en) * 2009-09-18 2012-08-30 Andrew Janowczyk High-Throughput Biomarker Segmentation Utilizing Hierarchical Normalized Cuts
US8280414B1 (en) * 2011-09-26 2012-10-02 Google Inc. Map tile data pre-fetching based on mobile device generated event analysis
US20120271848A1 (en) * 2011-04-25 2012-10-25 Google Inc. Dynamic Highlighting of Geographic Entities on Electronic Maps
US20120317111A1 (en) * 2011-06-09 2012-12-13 Christopher Desmond Method and application for managing digital files
US20130016886A1 (en) * 2011-07-12 2013-01-17 Definiens Ag Generating Artificial Hyperspectral Images Using Correlated Analysis of Co-Registered Images
US20130147820A1 (en) * 2011-12-12 2013-06-13 Google Inc. Method of pre-fetching map data for rendering and offline routing
US20130147846A1 (en) * 2011-12-12 2013-06-13 Google Inc. Pre-fetching map tile data along a route
US20130198600A1 (en) * 2012-01-30 2013-08-01 Box, Inc. Extended applications of multimedia content previews in the cloud-based content management system
US20130208078A1 (en) * 2011-08-22 2013-08-15 Sony Corporation Information processing apparatus, information processing system, method of processing information, and program
US20140047332A1 (en) * 2012-08-08 2014-02-13 Microsoft Corporation E-reader systems
US20140185891A1 (en) * 2011-07-12 2014-07-03 Definiens Ag Generating Image-Based Diagnostic Tests By Optimizing Image Analysis and Data Mining Of Co-Registered Images
US20150007058A1 (en) * 2013-06-27 2015-01-01 Progressly, Inc. Collaborative network-based graphical progress management tool
US20150082148A1 (en) * 2013-09-13 2015-03-19 Box, Inc. System and method for rendering document in web browser or mobile device regardless of third-party plug-in software
US9036509B1 (en) * 2011-01-14 2015-05-19 Cisco Technology, Inc. System and method for routing, mobility, application services, discovery, and sensing in a vehicular network environment
US20150242443A1 (en) * 2014-02-27 2015-08-27 Dropbox, Inc. Systems and methods for selecting content items to store and present locally on a user device
US20150347457A1 (en) * 2014-05-28 2015-12-03 Oracle International Corporation Automatic update for map cache
US20160019695A1 (en) * 2013-03-14 2016-01-21 Ventana Medical Systems, Inc. Whole slide image registration and cross-image annotation devices, systems and methods
US20160070983A1 (en) * 2014-09-08 2016-03-10 Mapsense Inc. Density sampling map data
US20160321809A1 (en) * 2013-10-01 2016-11-03 Ventana Medical Systems, Inc. Line-based image registration and cross-image annotation devices, systems and methods
US20170052747A1 (en) * 2015-08-17 2017-02-23 Palantir Technologies Inc. Interactive geospatial map
US20170076442A1 (en) * 2015-09-10 2017-03-16 Ralf Schoenmeyer Generating Image-Based Diagnostic Tests By Optimizing Image Analysis and Data Mining Of Co-Registered Images
US20170083545A1 (en) * 2015-09-18 2017-03-23 Fujifilm Corporation Image extraction system, image extraction method, image extraction program, and recording medium storing program
US20170287437A1 (en) * 2016-04-04 2017-10-05 Yandex Europe Ag Method and system of downloading image tiles onto a client device
US20170287436A1 (en) * 2016-04-04 2017-10-05 Yandex Europe Ag Method and system of downloading image tiles onto a client device
US20170328817A1 (en) * 2015-01-31 2017-11-16 Roche Sequencing Solutions, Inc. Systems and methods for meso-dissection

Family Cites Families (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5519436A (en) * 1994-06-21 1996-05-21 Intel Corporation Static image background reference for video teleconferencing applications
US6396941B1 (en) * 1996-08-23 2002-05-28 Bacus Research Laboratories, Inc. Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
JP3629852B2 (en) * 1996-12-06 2005-03-16 株式会社日立製作所 Telemedicine support method and telemedicine support system
US6182127B1 1997-02-12 2001-01-30 Digital Paper, Llc Network image view server using efficient client-server tiling and caching architecture
JPH10304330A (en) * 1997-04-30 1998-11-13 Olympus Optical Co Ltd Picture linkage system
JPH10301830A (en) * 1997-04-30 1998-11-13 Olympus Optical Co Ltd Picture registering system
US5968120A (en) * 1997-05-02 1999-10-19 Olivr Corporation Ltd. Method and system for providing on-line interactivity over a server-client network
US6597392B1 (en) * 1997-10-14 2003-07-22 Healthcare Vision, Inc. Apparatus and method for computerized multi-media data organization and transmission
JP3794194B2 (en) * 1999-03-24 2006-07-05 株式会社日立製作所 Cooperation display control method and telemedicine support system using the same
JP2001331614A (en) 2000-05-19 2001-11-30 Sony Corp Network conference system, conference minutes generating method, conference managing server, and conference minutes generating method
JP2002117415A (en) * 2000-10-06 2002-04-19 Kgt Inc Virtual collaborating work environment generating device
CN100403282C (en) 2000-12-06 2008-07-16 株式会社Ntt都科摩 Apparatus and method for distributing content
JP3795772B2 (en) 2001-06-25 2006-07-12 株式会社ノヴァ Multimedia information communication service system
JP2003016021A (en) * 2001-07-04 2003-01-17 Hitachi Ltd Image linkage system and server apparatus
JP2003085070A (en) 2001-09-11 2003-03-20 Oki Electric Ind Co Ltd Content delivery system, content copying method and multicast method
JP2003115873A (en) 2001-10-02 2003-04-18 Nippon Telegr & Teleph Corp <Ntt> Contents delivery server selection method
JP4263873B2 (en) * 2002-04-18 2009-05-13 ヤマハ株式会社 Server apparatus, client apparatus, distribution system, distribution program, and client program
US6826301B2 (en) * 2002-10-07 2004-11-30 Infocus Corporation Data transmission system and method
JP2004194108A (en) * 2002-12-12 2004-07-08 Sony Corp Information processor and information processing method, recording medium, and program
DE10309165A1 (en) * 2003-02-28 2004-09-16 Siemens Ag Medical system architecture for interactive transmission and progressive display of compressed image data of medical component images, compresses and stores images in packets, and decompresses on request
US8140980B2 (en) * 2003-08-05 2012-03-20 Verizon Business Global Llc Method and system for providing conferencing services
US7593918B2 (en) * 2004-11-24 2009-09-22 General Electric Company Enterprise medical imaging and information management system with enhanced communications capabilities
JP4297073B2 (en) * 2005-04-01 2009-07-15 ソニー株式会社 Image generating apparatus, processing method of these apparatuses, and program causing computer to execute the method
JP4380592B2 (en) * 2005-05-17 2009-12-09 ソニー株式会社 Data sharing system and method
US7804983B2 (en) * 2006-02-24 2010-09-28 Fotonation Vision Limited Digital image acquisition control and correction method and apparatus
US7581186B2 (en) * 2006-09-11 2009-08-25 Apple Inc. Media manager with integrated browsers
HU0700409D0 (en) * 2007-06-11 2007-08-28 3D Histech Kft Method and system for accessing a slide from a remote workstation
US8259157B2 (en) * 2008-01-11 2012-09-04 Sony Corporation Teleconference terminal apparatus and image transmitting method
CN101350923B (en) * 2008-09-03 2010-11-17 中国科学院上海技术物理研究所 Method for communication and indication of interactive iatrology image
JPWO2010119522A1 (en) * 2009-04-15 2012-10-22 パイオニア株式会社 Image sharing system
JP5561027B2 (en) * 2009-11-30 2014-07-30 ソニー株式会社 Information processing apparatus, information processing method, and program thereof
JP5568970B2 (en) * 2009-11-30 2014-08-13 ソニー株式会社 Information processing apparatus, information processing method, and program thereof
JP5617233B2 (en) * 2009-11-30 2014-11-05 ソニー株式会社 Information processing apparatus, information processing method, and program thereof
US8875038B2 (en) * 2010-01-19 2014-10-28 Collarity, Inc. Anchoring for content synchronization
JP5428886B2 (en) * 2010-01-19 2014-02-26 ソニー株式会社 Information processing apparatus, information processing method, and program thereof
JP4678068B2 (en) 2010-02-24 2011-04-27 株式会社日立製作所 Playback device, playback method, transmission / reception method, and transmission method
US20130093781A1 (en) * 2010-03-31 2013-04-18 Hitachi Medical Corporation Examination information display device and method
JP5531750B2 (en) * 2010-04-16 2014-06-25 ソニー株式会社 Information processing apparatus, information processing method, program, and information processing system
US20110282686A1 (en) * 2010-05-12 2011-11-17 General Electric Company Medical conferencing systems and methods
US20120011568A1 (en) * 2010-07-12 2012-01-12 Cme Advantage, Inc. Systems and methods for collaborative, networked, in-context, high resolution image viewing
JP5701685B2 (en) * 2011-05-26 2015-04-15 富士フイルム株式会社 MEDICAL INFORMATION DISPLAY DEVICE, ITS OPERATION METHOD, AND MEDICAL INFORMATION DISPLAY PROGRAM
JP5870840B2 (en) * 2012-05-14 2016-03-01 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
US20130332857A1 (en) * 2012-06-08 2013-12-12 Samsung Electronics Co., Ltd. Photo edit history shared across users in cloud system
JP2015534160A (en) * 2012-09-10 2015-11-26 カルガリー サイエンティフィック インコーポレイテッド Client-side image rendering in client-server image browsing architecture

Patent Citations (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030118222A1 (en) * 2000-11-30 2003-06-26 Foran David J. Systems for analyzing microtissue arrays
US20060294455A1 (en) * 2005-06-23 2006-12-28 Morris Benjamin R Method and system for efficiently processing comments to records in a database, while avoiding replication/save conflicts
US20070252834A1 (en) * 2006-04-27 2007-11-01 Microsoft Corporation Remotely Viewing Large Tiled Image Datasets
US20090307428A1 (en) * 2008-06-06 2009-12-10 Microsoft Corporation Increasing remote desktop performance with video caching
US20090326810A1 (en) * 2008-06-26 2009-12-31 Microsoft Corporation Caching navigation content for intermittently connected devices
US20120219206A1 (en) * 2009-09-18 2012-08-30 Andrew Janowczyk High-Throughput Biomarker Segmentation Utilizing Hierarchical Normalized Cuts
US20110129135A1 (en) * 2009-11-27 2011-06-02 Sony Corporation Information processing apparatus, information processing method, and program
US20110131376A1 (en) * 2009-11-30 2011-06-02 Nokia Corporation Method and apparatus for tile mapping techniques
US20120076390A1 (en) * 2010-09-28 2012-03-29 Flagship Bio Methods for feature analysis on consecutive tissue sections
US9036509B1 (en) * 2011-01-14 2015-05-19 Cisco Technology, Inc. System and method for routing, mobility, application services, discovery, and sensing in a vehicular network environment
US20120271848A1 (en) * 2011-04-25 2012-10-25 Google Inc. Dynamic Highlighting of Geographic Entities on Electronic Maps
US20120317111A1 (en) * 2011-06-09 2012-12-13 Christopher Desmond Method and application for managing digital files
US20130016886A1 (en) * 2011-07-12 2013-01-17 Definiens Ag Generating Artificial Hyperspectral Images Using Correlated Analysis of Co-Registered Images
US20140185891A1 (en) * 2011-07-12 2014-07-03 Definiens Ag Generating Image-Based Diagnostic Tests By Optimizing Image Analysis and Data Mining Of Co-Registered Images
US20130208078A1 (en) * 2011-08-22 2013-08-15 Sony Corporation Information processing apparatus, information processing system, method of processing information, and program
US8280414B1 (en) * 2011-09-26 2012-10-02 Google Inc. Map tile data pre-fetching based on mobile device generated event analysis
US20130147846A1 (en) * 2011-12-12 2013-06-13 Google Inc. Pre-fetching map tile data along a route
US20130147820A1 (en) * 2011-12-12 2013-06-13 Google Inc. Method of pre-fetching map data for rendering and offline routing
US20130198600A1 (en) * 2012-01-30 2013-08-01 Box, Inc. Extended applications of multimedia content previews in the cloud-based content management system
US20140047332A1 (en) * 2012-08-08 2014-02-13 Microsoft Corporation E-reader systems
US20160019695A1 (en) * 2013-03-14 2016-01-21 Ventana Medical Systems, Inc. Whole slide image registration and cross-image annotation devices, systems and methods
US9818190B2 (en) * 2013-03-14 2017-11-14 Ventana Medical Systems, Inc. Whole slide image registration and cross-image annotation devices, systems and methods
US20150007058A1 (en) * 2013-06-27 2015-01-01 Progressly, Inc. Collaborative network-based graphical progress management tool
US20150082148A1 (en) * 2013-09-13 2015-03-19 Box, Inc. System and method for rendering document in web browser or mobile device regardless of third-party plug-in software
US20160321809A1 (en) * 2013-10-01 2016-11-03 Ventana Medical Systems, Inc. Line-based image registration and cross-image annotation devices, systems and methods
US20150242443A1 (en) * 2014-02-27 2015-08-27 Dropbox, Inc. Systems and methods for selecting content items to store and present locally on a user device
US20150347457A1 (en) * 2014-05-28 2015-12-03 Oracle International Corporation Automatic update for map cache
US20160070983A1 (en) * 2014-09-08 2016-03-10 Mapsense Inc. Density sampling map data
US20170328817A1 (en) * 2015-01-31 2017-11-16 Roche Sequencing Solutions, Inc. Systems and methods for meso-dissection
US20170052747A1 (en) * 2015-08-17 2017-02-23 Palantir Technologies Inc. Interactive geospatial map
US20170076442A1 (en) * 2015-09-10 2017-03-16 Ralf Schoenmeyer Generating Image-Based Diagnostic Tests By Optimizing Image Analysis and Data Mining Of Co-Registered Images
US20170083545A1 (en) * 2015-09-18 2017-03-23 Fujifilm Corporation Image extraction system, image extraction method, image extraction program, and recording medium storing program
US20170287437A1 (en) * 2016-04-04 2017-10-05 Yandex Europe Ag Method and system of downloading image tiles onto a client device
US20170287436A1 (en) * 2016-04-04 2017-10-05 Yandex Europe Ag Method and system of downloading image tiles onto a client device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Saalfeld, S., Cardona, A., Hartenstein, V., & Tomančák, P. (2009). CATMAID: collaborative annotation toolkit for massive amounts of image data. Bioinformatics, 25(15), 1984-1986. *

Also Published As

Publication number Publication date
CN103154915A (en) 2013-06-12
EP2748709A4 (en) 2015-07-08
BR112013008721A2 (en) 2020-11-03
US9398233B2 (en) 2016-07-19
WO2013027323A1 (en) 2013-02-28
EP2748709B1 (en) 2018-09-05
CN103154915B (en) 2017-05-03
EP2748709A1 (en) 2014-07-02
CA2811889A1 (en) 2013-02-28
US20130208078A1 (en) 2013-08-15
JP5859771B2 (en) 2016-02-16
JP2013045144A (en) 2013-03-04

Similar Documents

Publication Publication Date Title
US20160301879A1 (en) Information processing apparatus, information processing system, method of processing information, and program
EP3657824A2 (en) System and method for multi-user control and media streaming to a shared display
US8285812B2 (en) Peer-to-peer synchronous content selection
US9516267B2 (en) Remote magnification and optimization of shared content in online meeting
JP7084450B2 (en) Systems and methods for distributed media interaction and real-time visualization
JP2006505862A (en) Method and system for performing image processing from mobile client device
CN102413150A (en) Server and virtual desktop control method and virtual desktop control system
US10044979B2 (en) Acquiring regions of remote shared content with high resolution
JPWO2014091519A1 (en) Medical support system and method
US20160234269A1 (en) Displaying regions of user interest in sharing sessions
US10225292B2 (en) Selectively porting meeting objects
US20150154749A1 (en) Information processing apparatus, information processing system, information processing method, and program
US20130162671A1 (en) Image combining apparatus, terminal device, and image combining system including the image combining apparatus and terminal device
JP6787370B2 (en) Information processing equipment, information processing methods and programs
JP6369581B2 (en) Information processing apparatus, information processing method, and program
JP2016105285A Information processing device, information processing system, information processing method, and program
TW200844849A (en) A collaborative telemedicine system
CN103701686B (en) Real-time remote image sharing method
US20080244007A1 (en) Electronic conference system, information processing apparatus, and program
KR20060110395A (en) Conference system and method using electronic filing document
CN117041468B (en) Network communication method, device, equipment and storage medium
JP7319228B2 (en) Image distribution device, image generation device and program
JP2022189144A (en) Image display control apparatus for online conference, method, and program
CN115455232A (en) Storage method, synchronous triggering method and device for augmented reality event in panoramic video
CN117542486A (en) Real-time collaborative film reading platform based on WebSocket

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KYUSOJIN, HIROSHI;MIZUTANI, YOICHI;HASEGAWA, YUTAKA;AND OTHERS;SIGNING DATES FROM 20130304 TO 20130306;REEL/FRAME:041302/0375

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION