JP2006101329A - Stereoscopic image observation device and its shared server, client terminal and peer to peer terminal, rendering image creation method and stereoscopic image display method and program therefor, and storage medium - Google Patents

Stereoscopic image observation device and its shared server, client terminal and peer to peer terminal, rendering image creation method and stereoscopic image display method and program therefor, and storage medium

Info

Publication number
JP2006101329A
Authority
JP
Japan
Prior art keywords
rendering
image
means
stereoscopic image
peer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2004286621A
Other languages
Japanese (ja)
Inventor
Masayuki Hashimoto
Atsushi Koike
Fumio Okuyama
文雄 奥山
淳 小池
真幸 橋本
Original Assignee
Kddi Corp
Kddi株式会社
R & D Okuyama:Kk
有限会社アールアンドディおくやま
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kddi Corp, Kddi株式会社, R & D Okuyama:Kk, 有限会社アールアンドディおくやま filed Critical Kddi Corp
Priority to JP2004286621A
Publication of JP2006101329A
Application status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide a stereoscopic image observation device suitable for remote observation of three-dimensional medical image data obtained by a medical image diagnostic device such as CT or MRI.
SOLUTION: A shared server 1 that creates a rendering image from volume data and a plurality of client terminals 2 that display a stereoscopic image based on the rendering image are mutually connected through a network 3. The shared server 1 carries out rendering processing on the volume data according to a rendering parameter, creates a rendering image, and transmits it to each client terminal. Further, if a change request for an observation condition is received from a client terminal 2, the shared server 1 changes the rendering parameter in response to the change request and then carries out the rendering processing again. As a result, in response to an observation condition change request input from one client terminal, the observation conditions of the stereoscopic image are changed in all client terminals.
COPYRIGHT: (C)2006,JPO&NCIPI

Description

  The present invention relates to a stereoscopic image observation apparatus and to its shared server, client terminal, and peer-to-peer terminal, and in particular to an apparatus suitable for remote observation of three-dimensional medical image data obtained by medical image diagnostic equipment such as CT and MRI.

  Rendering technology has been established that creates a typical two-dimensional CG image from medical image data having a three-dimensional pixel arrangement (hereinafter referred to as volume data) obtained by medical diagnostic imaging equipment such as CT and MRI, by setting a light transmission coefficient and adding shading. In addition, by using a display device capable of stereoscopic display, two images rendered from viewpoints with different spatial coordinates can be presented separately to the right eye and the left eye, so that the data is perceived not merely as a two-dimensional image but as a three-dimensional space with depth.
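
  To make the rendering step concrete, here is a minimal sketch of volume rendering by front-to-back alpha compositing in Python; the NumPy array layout, the simple intensity-based opacity transfer function, and the axis-aligned rays are illustrative assumptions, not the renderer described in this patent.

```python
# Minimal sketch of volume rendering by front-to-back alpha compositing,
# assuming the volume data is a NumPy array of scalar intensities.
import numpy as np

def render_volume(volume: np.ndarray, opacity_scale: float = 0.05) -> np.ndarray:
    """Composite a (depth, height, width) volume along the depth axis."""
    color = volume / volume.max()                       # grey value per voxel
    alpha = np.clip(color * opacity_scale, 0.0, 1.0)    # light transmission coefficient
    image = np.zeros(volume.shape[1:])                  # accumulated colour
    transmittance = np.ones(volume.shape[1:])           # remaining light per ray
    for z in range(volume.shape[0]):                    # march front to back
        image += transmittance * alpha[z] * color[z]
        transmittance *= (1.0 - alpha[z])
    return image

demo = np.random.rand(64, 64, 64).astype(np.float32)    # synthetic 64^3 volume
print(render_volume(demo).shape)                         # (64, 64)
```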

  A conventional stereoscopic viewing method is binocular stereoscopy using separate images for the left and right eyes. For example, for moving images, the polarizer system is widespread: two projectors project images through mutually orthogonal polarizing filters, and the viewer watches through glasses fitted with matching polarizing filters. There is also a technique in which separate left and right images are displayed on the screen for each frame of a 30 frame-per-second video, and synchronized LCD shutter glasses black out the left eye while the right image is shown, and vice versa. In recent years, so-called multi-view stereoscopy has also appeared, in which a slit provided on the surface of the display causes different images to be seen depending on the position of the viewer.

On the other hand, devices that render and display medical volume data take two forms: one in which the display device itself renders volume data stored locally, and one in which volume data residing on a server device separate from the display device is rendered on the server and the result is transmitted to the display device for display. Such medical volume rendering is disclosed, for example, in Japanese Patent Application Laid-Open No. 2004-228561.
JP 2004-187743 A

  When the same volume data is to be displayed on multiple remotely located display terminals and observed simultaneously, particularly in the medical field, there is a need for a system in which each display terminal is equipped with a communication tool such as a TV conference system so that the viewers at the display terminals can observe while communicating with one another. Furthermore, in such an observation system, when an operation that changes an observation condition, such as a viewpoint change, is performed on any one display terminal, it is desirable that the change be reflected not only on that terminal but also on all other display terminals, so that the same image viewed from the same viewpoint is always displayed on every terminal. However, in conventional observation systems, a change of observation conditions performed on one display terminal cannot be reflected on the other display terminals.

  An object of the present invention is to solve the above-described problems of the prior art and to provide, for the case where the same volume data is observed on a plurality of display terminals, a stereoscopic image observation apparatus in which a change of observation conditions such as the viewpoint made on any display terminal is reflected on all display terminals so that the same image can always be observed on every terminal, together with its shared server, client terminal and peer-to-peer terminal, a rendering image generation method, a stereoscopic image display method, programs therefor, and a storage medium.

  In order to achieve the above object, the present invention is directed to a stereoscopic image observation apparatus that displays stereoscopic images on a plurality of display terminals based on rendering images generated from volume data having a three-dimensional pixel arrangement. When a change of observation conditions is requested at one display terminal and that terminal is a client terminal of a server/client model, it transmits a rendering command to the shared server that renders the volume data and supplies the result to each terminal, thereby requesting a change of the rendering parameters. The shared server performs re-rendering based on the changed rendering parameters and transmits the rendered image to all client terminals again, so the change of observation conditions made at one display terminal is reflected on the other display terminals.

  If the one display terminal is instead a peer-to-peer terminal connected to another terminal on a peer-to-peer basis, then when a change of observation conditions is requested at the terminal itself, it changes its own rendering parameters and also transmits a rendering command to the counterpart terminal to request the same change. As a result, rendering processing with the same rendering parameters is performed on the same volume data at both the terminal itself and the counterpart terminal, so the change of observation conditions made at one terminal is reflected at the other.

According to the present invention, the following effects are achieved.
(1) In a stereoscopic image observation apparatus in which a shared server that generates a stereoscopic rendering image from volume data and at least one client terminal that displays a stereoscopic image based on that rendering image are connected to each other via a network, a change of observation conditions such as the viewpoint made at any client terminal is reflected on all display terminals, so the same image can always be observed at every terminal.
(2) In a stereoscopic image observation apparatus in which a pair of terminals that display a stereoscopic image based on a rendering image generated from volume data are connected peer-to-peer via a network, a change of observation conditions such as the viewpoint made at one terminal is reflected on the other terminal, so the same image can always be observed at both terminals.
(3) Changes in the observation conditions are predicted and rendering prediction images are generated in advance. When a change of observation conditions is requested and a rendering prediction image matching the request has already been registered, that image is output without performing the rendering process, so the processing time can be shortened.
(4) Since the terminals can talk to each other using the TV conference function, viewers can communicate in real time while observing the stereoscopic image.

  Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the drawings. FIG. 1 is a block diagram showing the configuration of the stereoscopic image observation apparatus according to the first embodiment of the present invention. Here, a shared server 1 that generates a stereoscopic rendering image from volume data having a three-dimensional pixel arrangement and a plurality of client terminals 2 that share the volume data on the shared server 1 and display the stereoscopic image from the same viewpoint are connected via a network 3 to form a server/client system.

  FIG. 2 is a block diagram showing the configuration of the main part of the shared server 1. The volume data database (DB) 101 accumulates various kinds of volume data, in which each point in a three-dimensional space has a value. In the volume data DB 101, each volume data set is uniquely managed by identification information. The rendering processing unit 102 performs rendering processing on the volume data according to rendering parameters, including at least viewpoint information, provided from the control unit 100, and generates a plurality of rendering images with different viewpoints.

  The transmission image buffer 103 temporarily holds a plurality of rendering images output from the rendering processing unit 102. The compression / encryption processing unit 104 reads out the rendering image from the transmission image buffer 103, encodes and compresses it, and performs encryption processing as necessary. From the image transmission unit 105, encoded data of the rendering image is transmitted to all the client terminals 2.

  The command receiving unit 106 receives a rendering command transmitted from the client terminal 2 in order to change a rendering parameter, and a TV conference connection request command transmitted from the client terminal 2 that desires to participate in the TV conference. The control unit 100 changes the rendering parameter based on the received rendering command, and instructs the rendering processing unit 102 to perform re-rendering with the changed rendering parameter.

  In the connection terminal list 107, identification information of the client terminals 2 permitted to join the video conference is registered in advance. The authentication unit 108 authenticates a video conference connection request by checking whether the identification information of the client terminal 2 that transmitted the request is already registered in the connection terminal list 107, and permits the request if it comes from an authorized client terminal 2. The TV conference information distribution unit 109 receives the TV conference video and audio transmitted from each client terminal 2, combines them, and distributes the result to all the client terminals 2.
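
  As a rough illustration of that distribution step, the sketch below mixes per-client audio by summation and tiles the video frames side by side before broadcasting them back to every client; the frame shapes, the dictionaries standing in for clients, and the send callbacks are assumptions, since the patent does not specify how the TV conference information is combined.

```python
# Sketch of combining per-client conference audio/video and distributing it to all clients.
import numpy as np

def combine_and_distribute(clients: dict, audio: dict, video: dict) -> None:
    """clients: id -> send(payload) callback; audio/video: id -> NumPy arrays from each client."""
    mixed_audio = np.clip(sum(audio.values()), -1.0, 1.0)          # additive audio mix
    combined_video = np.concatenate(list(video.values()), axis=1)  # tile frames horizontally
    for send in clients.values():
        send({"audio": mixed_audio, "video": combined_video})      # distribute to all clients

# Usage with two dummy clients
received = []
clients = {1: received.append, 2: received.append}
audio = {1: np.zeros(8000), 2: 0.1 * np.ones(8000)}
video = {1: np.zeros((120, 160, 3)), 2: np.ones((120, 160, 3))}
combine_and_distribute(clients, audio, video)
print(received[0]["video"].shape)   # (120, 320, 3)
```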

  FIG. 3 is a block diagram showing the configuration of the rendering processing unit 102. Here, a twin-lens method will be described as an example.

  The rendering processing unit 102 includes a first rendering processing unit 141a that generates a two-dimensional rendering image (right-eye image) of the volume data viewed from an arbitrary viewpoint x in space, taken as the right-eye viewpoint, and a second rendering processing unit 141b that generates a two-dimensional rendering image (left-eye image) of the same volume data viewed from another viewpoint x', slightly separated from x, taken as the left-eye viewpoint. Hereinafter, the pair of first and second rendering images required to express a stereoscopic image may simply be referred to as a rendered image. The coordinate positions of the viewpoints x and x' are provided from the control unit 100 as part of the rendering parameters. The first and second rendering processing units 141a and 141b render with identical rendering parameters except for the viewpoint.
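
  The relation between the viewpoints x and x' can be sketched as follows; the look-at geometry, the eye-separation value, and the helper name are assumptions for illustration only.

```python
# Sketch of deriving the left-eye viewpoint x' from the right-eye viewpoint x.
import numpy as np

def stereo_viewpoints(x, target, up=(0.0, 0.0, 1.0), eye_separation=6.5):
    """Return the pair (x, x') of right-eye and left-eye viewpoint positions."""
    x = np.asarray(x, dtype=float)
    target = np.asarray(target, dtype=float)
    forward = target - x
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, np.asarray(up, dtype=float))
    right /= np.linalg.norm(right)
    x_prime = x - right * eye_separation    # shift sideways, keeping the same target
    return x, x_prime

right_eye, left_eye = stereo_viewpoints([500.0, 0.0, 0.0], [0.0, 0.0, 0.0])
print(left_eye)   # viewpoint x' slightly separated from x
```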

  FIG. 4 is a block diagram showing another configuration of the rendering processing unit 102: a multi-view configuration in which a plurality of pairs of the left and right rendering processing units 141 (141a, 141b) described above are provided and rendering images viewed from different viewpoints are generated. In both the binocular and multi-view configurations, instead of outputting the first and second rendered images as they are, the difference data of the second rendered image with respect to the first rendered image may be computed, and the pair of the first rendered image and the difference data may be output as the stereoscopic rendering image.
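
  A minimal sketch of the difference-data option, assuming 8-bit grayscale images: the first (right-eye) image is transmitted together with the difference of the second (left-eye) image with respect to it, and the receiver reproduces the left-eye image by adding the difference back.

```python
# Sketch of encoding a stereoscopic pair as (first image, difference data) and decoding it.
import numpy as np

def encode_pair(first: np.ndarray, second: np.ndarray):
    """Return (first image, difference data) as the stereoscopic rendering image."""
    diff = second.astype(np.int16) - first.astype(np.int16)
    return first, diff

def decode_pair(first: np.ndarray, diff: np.ndarray) -> np.ndarray:
    """Reproduce the second rendering image from the first image and the difference."""
    return (first.astype(np.int16) + diff).clip(0, 255).astype(np.uint8)

right = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
left = np.roll(right, 4, axis=1)             # stand-in for the left-eye view
first, diff = encode_pair(right, left)
assert np.array_equal(decode_pair(first, diff), left)
```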

  FIG. 5 is a block diagram showing the configuration of the client terminal 2, and all the client terminals have the same or equivalent configuration.

  The rendering communication unit 212 communicates with the image transmission unit 105 and the command reception unit 106 of the shared server 1 and controls reception of a rendering image transmitted from the shared server 1 and transmission of a rendering command to the shared server 1. The image decoder 207 decodes the encoded data of the received rendering image and reproduces a pair of left and right rendering images. If the rendering image is a pair of the first rendering image (right eye image) and difference data, the second rendering image (left eye image) is reproduced based on the first rendering image and the difference data. The stereoscopic image display device 202 displays a stereoscopic image based on the pair of left and right rendering images.

  The input operator 201 is an input device such as a mouse, keyboard, or joystick. By operating the input operator 201, the operator can change the viewpoint of the stereoscopic image displayed on the stereoscopic image display device 202 and can change other observation conditions such as the cutting line, image enlargement or reduction, color tone, brightness, contrast, and, for CT images, the CT value or transmittance. The command generation unit 206 detects an observation condition change request based on an operation of the input operator 201 and generates a rendering command requesting the shared server 1 to change the rendering parameters in accordance with the new observation conditions. This rendering command is transmitted from the rendering communication unit 212 to the shared server 1.
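
  A rendering command of this kind might carry the changed observation conditions roughly as sketched below; the field names and the JSON wire format are assumptions, since the patent only states that the command requests a rendering-parameter change on the server.

```python
# Sketch of a rendering command carrying changed observation conditions.
import json
from dataclasses import dataclass, asdict

@dataclass
class RenderingCommand:
    volume_id: str                  # identifies the volume data on the shared server
    viewpoint: tuple                # requested viewpoint coordinates
    zoom: float = 1.0               # image enlargement / reduction
    brightness: float = 0.0
    contrast: float = 1.0
    transmittance: float = 1.0      # for CT images

def to_wire(cmd: RenderingCommand) -> bytes:
    """Serialize the command for the rendering communication unit."""
    return json.dumps(asdict(cmd)).encode("utf-8")

cmd = RenderingCommand("volume-001", (500.0, 0.0, 0.0), zoom=1.2)
print(to_wire(cmd))
```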

  The TV conference communication unit 213 communicates with the TV conference information distribution unit 109 of the shared server 1 and controls transmission / reception of the TV conference video and the TV conference audio. The audio decoder 209 decodes the video conference audio received via the shared server 1 and causes the speaker 203 to reproduce it. The video conference audio detected by the microphone 204 is encoded by the audio encoder 210 and then transmitted from the video conference communication unit 213 to the shared server 1. The video conference video captured by the camera 205 is encoded by the video encoder 211 and then transmitted from the video conference communication unit 213 to the shared server 1.

  FIGS. 6 and 7 are flowcharts showing the operation of this embodiment: FIG. 6 shows the operation of the shared server 1, and FIG. 7 shows the operation of each client terminal 2. Here, the case where the rendering processing unit 102 is of the twin-lens type is described as an example.

  In the shared server 1, in step S1 of FIG. 6, a rendering start command including a volume data identifier and rendering parameters is output from the control unit 100 to the rendering processing unit 102. In step S2, the rendering processing unit 102 performs rendering processing on the volume data corresponding to the volume data identifier under the conditions given by the rendering parameters, and generates the two rendering images necessary for expressing the stereoscopic image.

  In step S3, the rendered images are output to the compression/encryption processing unit 104 via the transmission image buffer 103. In step S4, they are encoded and compressed, and further encrypted as necessary. In step S5, the rendered images are transmitted from the image transmission unit 105 to all the client terminals 2. In step S6, the server waits for a rendering command transmitted from any of the client terminals 2.

  When the rendering communication unit 212 of a client terminal 2 receives the encoded data of the rendering image transmitted from the shared server 1 in step S51 of FIG. 7, the encoded data is decoded by the image decoder 207 in step S52. In step S53, the decoded rendering images are output to the stereoscopic image display device 202, and the stereoscopic image is displayed visibly on the screen.

  In step S54, the command generation unit 206 determines whether there is a change request regarding the observation conditions. If the operator of the client terminal 2 operates the input operator 201 to observe the displayed stereoscopic image from a different viewpoint, this is determined to be an observation condition change request, and the process proceeds to step S55. In step S55, the command generation unit 206 generates a rendering command requesting generation of rendering images viewed from the requested viewpoint. In step S56, the rendering command is transmitted from the rendering communication unit 212 to the shared server 1.

  Returning to FIG. 6, when the shared server 1 receives the rendering command in step S6, the control unit 100 changes the rendering parameters in step S7 and instructs the rendering processing unit 102 to re-render based on the changed rendering parameters.

  Thereafter, the process returns to step S2, rendering is performed again based on the changed rendering parameters, and the newly generated rendering images are transmitted to each client terminal 2 in step S5. As a result, in this embodiment a viewpoint change at one client terminal 2 is reflected at all the other client terminals 2, so the stereoscopic image can always be observed from the same viewpoint at every client terminal 2.
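
  The server-side flow of steps S1 to S7 can be condensed into a small sketch; the queue and the render() and broadcast() stand-ins are assumptions representing the rendering processing unit, the transmission path, and the command receiving unit.

```python
# Sketch of the shared server's main loop: render, transmit to all clients,
# wait for a rendering command, update the parameters, then re-render.
import queue

def shared_server_loop(volume, params: dict, commands: "queue.Queue", render, broadcast) -> None:
    while True:
        images = render(volume, params)   # S2: rendering processing unit
        broadcast(images)                 # S3-S5: buffer, compress/encrypt, transmit to all clients
        command = commands.get()          # S6: wait for a rendering command from any client
        if command is None:               # sentinel so this sketch can terminate
            break
        params.update(command)            # S7: change the rendering parameters, then re-render

# Usage with trivial stand-ins
cmds = queue.Queue()
cmds.put({"viewpoint": (0.0, 500.0, 0.0)})
cmds.put(None)
sent = []
shared_server_loop(volume=object(), params={"viewpoint": (500.0, 0.0, 0.0)},
                   commands=cmds, render=lambda v, p: dict(p), broadcast=sent.append)
print(len(sent))   # two renderings: the initial viewpoint and the changed one
```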

  FIG. 8 is a block diagram showing the configuration of the main part of the shared server 1 according to the second embodiment of the present invention. The same reference numerals as those described above represent the same or equivalent parts.

  This embodiment is characterized in that the control unit 100 is provided with a change prediction unit 100a that predicts the content of observation condition changes, and that a prediction image buffer 110 stores rendering prediction images generated based on the prediction results. The change prediction unit 100a predicts the rendering parameters that will be set by rendering commands received from the next time onward, based on the current rendering parameters and at least one earlier set of rendering parameters. The rendering processing unit 102 generates rendering images in advance based on the predicted rendering parameters and stores them in the prediction image buffer 110.

  FIG. 9 is a flowchart showing the operation of the shared server 1 in the present embodiment. In the steps denoted by the same reference numerals as those described above, the same or equivalent processes are performed.

  In this embodiment, when transmission of the rendering images is completed in step S5, the process proceeds to step S8, where it is determined whether generation and accumulation of rendering prediction images based on predicted rendering parameters have already been completed. Since this is initially not the case, the process proceeds to the rendering prediction process in step S9.

  FIG. 10 is a flowchart showing the procedure of the rendering prediction process. In step S91, the rendering parameters likely to be set next are predicted based on the history of rendering parameters set by the rendering commands received so far. In this embodiment, if rendering commands changing the viewpoint have been received, the viewpoint rendering parameters for the next and subsequent commands are predicted from the current viewpoint parameter and at least one earlier viewpoint parameter by an appropriate mathematical operation such as extrapolation, interpolation, or quadratic regression. In step S92, rendering processing is performed based on the predicted rendering parameters, and a rendering prediction image corresponding to the prediction is generated. In step S93, the generated rendering prediction image is stored in the prediction image buffer 110.
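
  Step S91 might be realized, for example, by fitting a low-order polynomial to the recent viewpoint history and evaluating it one step ahead, as in the sketch below; the history length, the per-coordinate fit, and the use of NumPy polyfit are illustrative assumptions.

```python
# Sketch of predicting the next viewpoint parameter from its history
# by linear extrapolation or quadratic regression per coordinate.
import numpy as np

def predict_next_viewpoint(history, degree: int = 2) -> np.ndarray:
    """history: chronological viewpoint coordinates taken from past rendering commands."""
    pts = np.asarray(history, dtype=float)
    t = np.arange(len(pts))
    degree = min(degree, len(pts) - 1)   # fall back to linear extrapolation with two samples
    # fit each coordinate against time and evaluate one step into the future
    return np.array([np.polyval(np.polyfit(t, pts[:, k], degree), len(pts))
                     for k in range(pts.shape[1])])

history = [(500.0, 0.0, 0.0), (490.0, 90.0, 0.0), (460.0, 180.0, 0.0)]
print(predict_next_viewpoint(history))   # predicted viewpoint for the next rendering command
```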

  This rendering prediction process is repeatedly executed from step S91 while varying the predicted parameters and the prediction algorithm. When a rendering command requesting a change of the rendering parameters is newly received, or when a predetermined number of rendering prediction images have been accumulated without any rendering command being received, the process ends and returns to step S6 of FIG. 9.

  Thereafter, when a rendering command transmitted from any of the client terminals 2 is received in step S6, it is determined in step S10 whether a prediction image rendered with the rendering parameters specified by the received rendering command is already registered in the prediction image buffer 110. If it is determined not to be registered, the process proceeds to step S7, where, as in the first embodiment, the control unit 100 changes the rendering parameters based on the rendering command and instructs the rendering processing unit 102 to re-render based on the changed parameters.

  On the other hand, if such a rendering prediction image is already registered in the prediction image buffer 110, the process proceeds to step S11, where the rendering prediction image is read out of the prediction image buffer 110 and transmitted to all the client terminals 2 through steps S3, S4, and S5.
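
  The lookup in steps S10 and S11 amounts to a cache keyed by the rendering parameters, roughly as sketched below; the key construction and the class name are assumptions.

```python
# Sketch of the prediction image buffer: predicted images are cached under a key
# derived from the rendering parameters, so a matching request skips re-rendering.
def parameter_key(params: dict) -> tuple:
    """Build a hashable cache key from the rendering parameters."""
    return tuple(sorted(params.items()))

class PredictionImageBuffer:
    def __init__(self):
        self._images = {}

    def register(self, params: dict, image) -> None:   # step S93
        self._images[parameter_key(params)] = image

    def lookup(self, params: dict):                     # step S10
        return self._images.get(parameter_key(params))

buffer = PredictionImageBuffer()
buffer.register({"viewpoint": (460.0, 180.0, 0.0)}, image="predicted-image")
hit = buffer.lookup({"viewpoint": (460.0, 180.0, 0.0)})
print(hit if hit is not None else "re-render required")   # step S11 vs. step S7
```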

  According to this embodiment, the shared server 1 predicts future transitions from the rendering parameter history and generates rendering prediction images in advance during idle time, so the response time until a stereoscopic image is displayed in reply to a rendering command can be shortened.

  FIG. 11 is a block diagram showing the configuration of a stereoscopic image observation apparatus according to the third embodiment of the present invention. Here, a pair of terminals 4 are connected via a network 3 on a peer-to-peer basis.

  FIG. 12 is a block diagram showing a configuration of a main part of the peer-to-peer (P2P) terminal 4, and the same reference numerals as those described above represent the same or equivalent parts.

  In this embodiment, each P2P terminal 4 includes a volume data DB 101 and a rendering processing unit 102 together with the stereoscopic image display device 202, and the control unit 200 exchanges rendering commands and volume data identifiers with the counterpart terminal so that both terminals share the same rendering parameters and the same volume data to be observed. The rendering processing unit 102 reads the volume data corresponding to the volume data identifier specified by the control unit 200 from the volume data DB 101 and performs rendering processing on it according to the rendering parameters. The generated rendering images are displayed as a stereoscopic image on the terminal's own stereoscopic image display device 202.

  FIG. 13 is a flowchart showing the operation of this embodiment. In step S61, the volume data to be observed is accumulated in the volume data DB 101 of each P2P terminal 4. This volume data may be obtained independently by each terminal 4 from a server (not shown) on the network 3, or, if it is already stored in the volume data DB 101 of one terminal 4, it may be read out and provided to the counterpart P2P terminal.

  In step S62, processing for matching rendering parameters between P2P terminals is executed, and the same rendering parameters are set in each P2P terminal 4. In step S63, a rendering process corresponding to the set rendering parameter is performed on the volume data stored in the volume data DB 101, and a rendering image is generated in the same manner as in the first embodiment. In step S64, the rendered image is output to the stereoscopic image display device 202, and the stereoscopic image of the same viewpoint is displayed on the screen of each P2P terminal 4.

  In steps S65 and S69, it is determined whether or not there is a request for changing the observation condition. In step S65, the command generation unit 206 determines whether or not there is a request for changing the observation condition by the operator of the terminal itself. When the operator of the own terminal operates the input operator 201 to observe the displayed stereoscopic image from a different viewpoint, this is determined as a request for changing the observation condition, and the process proceeds to step S66. In step S66, the rendering parameters are changed according to the viewpoint requested by the operator.

  In step S67, the command generation unit 206 generates a rendering command for requesting generation of a rendering image corresponding to the viewpoint requested by the operator. In step S68, the rendering command is transmitted from the rendering communication unit 21 to the partner terminal.

  When the counterpart P2P terminal 4 receives the rendering command in step S69, the control unit 100 changes the rendering parameters in step S70. Thereafter, since the process returns to step S63 in both P2P terminals 4, rendering processing is performed again based on the changed rendering parameters, and the stereoscopic image viewed from the viewpoint requested by the operator is displayed on each stereoscopic image display device 202.

  As described above, in this embodiment a viewpoint change at one P2P terminal 4 is reflected not only at that terminal but also at the other P2P terminal 4, so each P2P terminal can always observe the stereoscopic image from the same viewpoint.
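
  The peer-to-peer synchronization of FIG. 13 can be condensed into the following sketch; the in-process method calls stand in for the rendering communication between the terminals, and the render callbacks are assumptions.

```python
# Sketch of peer-to-peer synchronization: a local observation-condition change updates the
# terminal's own parameters, re-renders locally, and sends the same command to the peer,
# which applies it and re-renders with identical parameters.
class P2PTerminal:
    def __init__(self, name: str, params: dict, render):
        self.name, self.params, self.render = name, params, render
        self.peer = None                                     # set once both terminals exist

    def change_observation_condition(self, change: dict):    # steps S65-S68
        self.params.update(change)
        self.render(self.params)                             # local re-render (S63)
        self.peer.receive_rendering_command(change)           # send rendering command to peer

    def receive_rendering_command(self, change: dict):        # steps S69-S70
        self.params.update(change)
        self.render(self.params)                              # re-render with the same parameters

renders = []
a = P2PTerminal("A", {"viewpoint": (500.0, 0.0, 0.0)}, lambda p: renders.append(("A", dict(p))))
b = P2PTerminal("B", {"viewpoint": (500.0, 0.0, 0.0)}, lambda p: renders.append(("B", dict(p))))
a.peer, b.peer = b, a
a.change_observation_condition({"viewpoint": (0.0, 500.0, 0.0)})
assert a.params == b.params        # both terminals now show the same viewpoint
```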

  FIG. 14 is a block diagram showing the configuration of the main part of the peer-to-peer terminal 4 according to the fourth embodiment of the present invention. The same reference numerals as those described above represent the same or equivalent parts.

  This embodiment is characterized in that the control unit 200 is provided with a change prediction unit 200a for the rendering parameters and with a prediction image buffer 110 that stores rendering prediction images generated based on the predicted rendering parameters. The change prediction unit 200a predicts the next and subsequent rendering parameters based on the current rendering parameters and at least one earlier set of rendering parameters. The rendering processing unit 102 generates rendering images based on the predicted rendering parameters and accumulates them in the prediction image buffer 110.

  FIG. 15 is a flowchart showing the operation of the present embodiment. In the steps denoted by the same reference numerals as those described above, the same or equivalent processes are performed.

  In this embodiment, when the display of the rendering image is completed in step S64, the process proceeds to step S65, and the command generation unit 206 determines whether there is a change request regarding the observation condition. Here, when the operator of the P2P terminal 4 operates the input operator 201 in order to observe the displayed stereoscopic image from a different viewpoint, this is determined to be an observation condition change request, and the process proceeds to step S66. In step S66, the rendering parameters are changed according to the viewpoint requested by the operator. In step S67, the command generation unit 206 generates a rendering command that requests the counterpart terminal 4 to generate a rendering image corresponding to the viewpoint requested by the operator. In step S68, the rendering command is transmitted from the rendering communication unit 21 to the counterpart terminal 4.

  In step S95, it is determined whether or not the predicted image rendered with the changed rendering parameter is already registered in the predicted image buffer 110. If it is determined that it is not registered, the process returns to step S63, and re-rendering based on the changed rendering parameter is instructed from the control unit 100 to the rendering processing unit 102.

  On the other hand, if the rendered image is already registered in the predicted image buffer 110, the process proceeds to step S96, where the rendered predicted image is extracted from the predicted image buffer 110, and returned to step S64 for display.

  On the other hand, in step S65, if a request for changing the observation condition by the operator of the terminal is not detected, the process proceeds to step S69, and it is determined whether or not a rendering command has been received. Here, when the rendering command transmitted from the counterpart terminal is received, the rendering parameter is changed in the control unit 100 in step S70. In step S97, it is determined whether or not the predicted image rendered with the changed rendering parameter is already registered in the predicted image buffer 110. If it is determined that it is not registered, the process returns to step S63, and the re-rendering process based on the changed rendering parameter is instructed from the control unit 100 to the rendering processing unit 102.

  On the other hand, if the rendered image is already registered in the predicted image buffer 110, the process proceeds to step S98, the rendered predicted image is extracted from the predicted image buffer 110, and the process returns to step S64 to be displayed.

  If it is determined in step S69 that no rendering command has been received, the process proceeds to step S8 and, as in the second embodiment, rendering prediction images are generated during the waiting time and registered in the prediction image buffer 110.

  According to this embodiment, each P2P terminal 4 predicts future transitions from the rendering parameter history and generates rendering prediction images in advance during idle time, so the response time until a stereoscopic image is displayed in reply to a rendering command can be shortened.

FIG. 1 is a block diagram of the stereoscopic image observation apparatus according to the first embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of the main part of the shared server.
FIG. 3 is a block diagram showing the configuration (twin-lens type) of the rendering processing unit.
FIG. 4 is a block diagram showing the configuration (multi-view type) of the rendering processing unit.
FIG. 5 is a block diagram showing the configuration of the client terminal.
FIG. 6 is a flowchart showing the operation of the shared server in the first embodiment.
FIG. 7 is a flowchart showing the operation of the client terminal in the first embodiment.
FIG. 8 is a block diagram showing the configuration of the main part of the shared server according to the second embodiment of the invention.
FIG. 9 is a flowchart showing the operation of the shared server in the second embodiment.
FIG. 10 is a flowchart showing the procedure of the rendering prediction process.
FIG. 11 is a block diagram showing the configuration of the stereoscopic image observation apparatus according to the third embodiment.
FIG. 12 is a block diagram showing the configuration of the main part of a peer-to-peer terminal.
FIG. 13 is a flowchart showing the operation of the peer-to-peer terminal in the third embodiment.
FIG. 14 is a block diagram showing the configuration of the peer-to-peer terminal according to the fourth embodiment.
FIG. 15 is a flowchart showing the operation of the peer-to-peer terminal in the fourth embodiment.

Explanation of symbols

  1 ... Shared server, 2 ... Client terminal, 3 ... Network, 4 ... Peer-to-peer (P2P) terminal, 100 ... Control unit, 101 ... Volume data DB, 102 ... Rendering processing unit, 103 ... Transmission image buffer, 104 ... Compression/encryption processing unit, 105 ... Image transmission unit, 106 ... Command receiving unit, 107 ... Connection terminal list, 108 ... Authentication unit, 109 ... TV conference information distribution unit

Claims (23)

  1. In a stereoscopic image observation apparatus in which a shared server that generates a stereoscopic rendering image from volume data having a three-dimensional pixel arrangement and at least one client terminal that displays a stereoscopic image based on the stereoscopic rendering image are connected to each other via a network,
    The shared server comprises:
    Means for storing volume data;
    Means for setting a rendering parameter including at least viewpoint information;
    Means for performing rendering processing according to the rendering parameter on the volume data and generating a rendered image;
    Means for transmitting the rendered image to a client terminal;
    Means for receiving an observation condition change request from the client terminal;
    Means for changing a rendering parameter setting in response to the change request;
    Each of the client terminals comprises:
    Means for receiving a rendered image from a shared server;
    Means for displaying a stereoscopic image based on the rendered image;
    An operator for inputting a request to change the observation condition of the stereoscopic image;
    Means for transmitting the change request to a shared server,
    A stereoscopic image observation apparatus characterized in that a stereoscopic image observation condition is changed in all client terminals including the one client terminal in response to an observation condition change request input from the one client terminal.
  2. The stereoscopic image observation apparatus according to claim 1, wherein the shared server further comprises:
    Means for predicting the content of the observation condition change request;
    Means for performing rendering processing according to the prediction result on the volume data and generating a rendering prediction image;
    Means for storing the rendering prediction image; and
    Means for transmitting the registered rendering prediction image as the rendering image when a rendering prediction image matching the received observation condition change request is already registered in the storage means.
  3. The shared server and each client terminal have a TV conference function,
    The stereoscopic image observation apparatus according to claim 1, wherein the shared server collects video conference information detected by each client terminal, combines them, and distributes the information to all client terminals.
  4. In a stereoscopic image observation apparatus in which a pair of terminals displaying a stereoscopic image based on a rendering image generated from volume data having a three-dimensional pixel arrangement is connected peer-to-peer via a network,
    Each said peer-to-peer terminal comprises:
    Means for storing volume data;
    Means for setting a rendering parameter including at least viewpoint information;
    Means for performing rendering processing according to the rendering parameter on the volume data and generating a rendered image;
    Means for displaying a stereoscopic image based on the rendered image;
    Means for receiving an observation condition change request from the other terminal;
    An operator for inputting a request to change the observation condition of the stereoscopic image;
    Means for changing the setting of the rendering parameter in response to the change request for each observation condition;
    Means for transmitting a change request for the observation condition input from the operation element to the counterpart terminal,
    A stereoscopic image observation apparatus characterized in that, in response to an observation condition change request input from one peer-to-peer terminal, the observation conditions of the stereoscopic image are changed in both the one and the other peer-to-peer terminals.
  5. The stereoscopic image observation apparatus according to claim 4, wherein each peer-to-peer terminal further comprises:
    Means for predicting the content of the observation condition change request;
    Means for performing rendering processing according to the prediction result on the volume data and generating a rendering prediction image; and
    Means for storing the rendering prediction image,
    and the stereoscopic image is displayed based on the registered rendering prediction image when a rendering prediction image matching the received or input observation condition change request is already registered in the storage means.
  6. The stereoscopic image observation apparatus according to claim 4, wherein each of the peer-to-peer terminals has a video conference function, and one peer-to-peer terminal collects and reproduces the video conference information detected by the other peer-to-peer terminal.
  7. The means for generating the rendered image comprises:
    A first rendering processing unit for generating a two-dimensional first rendering image in which the volume data is viewed from a first viewpoint in space;
    A second rendering processing unit that generates a two-dimensional second rendering image in which the volume data is viewed from a second viewpoint different from the first viewpoint,
    The stereoscopic image observation apparatus according to claim 1, wherein the stereoscopic image observation apparatus is a binocular system in which a pair of the first and second rendering images is a rendering image.
  8. The stereoscopic image observation apparatus according to claim 7, wherein the means for generating the rendered image is of a multi-view type that comprises a plurality of sets of the first and second rendering processing units and generates a plurality of pairs of first and second rendering images having different viewpoints.
  9. The stereoscopic image observation apparatus according to claim 7 or 8, further comprising means for obtaining difference data between the second rendering image and the first rendering image, wherein the pair of the first rendering image and the difference data constitutes the rendering image.
  10. In a shared server of a stereoscopic image observation apparatus in which the shared server, which generates a rendering image from volume data having a three-dimensional pixel arrangement, and at least one client terminal, which displays a stereoscopic image based on the rendering image, are connected to each other via a network, the shared server comprising:
    Means for storing volume data;
    Means for setting a rendering parameter including at least viewpoint information;
    Means for performing rendering processing according to the rendering parameter on the volume data and generating a rendered image;
    Means for transmitting the rendered image to a client terminal;
    Means for receiving an observation condition change request from the client terminal;
    And means for changing a setting of a rendering parameter in response to the change request.
  11. The shared server of the stereoscopic image observation apparatus according to claim 10, further comprising:
    Means for predicting the content of the observation condition change request;
    Means for performing rendering processing according to the prediction result on the volume data and generating a rendering prediction image;
    Means for storing the rendering prediction image; and
    Means for transmitting the registered rendering prediction image as the rendering image when a rendering prediction image matching the received observation condition change request is already registered in the storage means.
  12. In a client terminal of a stereoscopic image observation apparatus in which a shared server that generates a rendering image from volume data having a three-dimensional pixel arrangement and at least one client terminal that displays a stereoscopic image based on the rendering image are connected to each other via a network, the client terminal comprising:
    Means for receiving a rendered image;
    Means for displaying a stereoscopic image based on the rendered image;
    An operator for inputting a request to change the observation condition of the stereoscopic image;
    And means for transmitting the observation condition change request to the shared server.
  13. In a peer-to-peer terminal of a stereoscopic image observation apparatus in which a pair of terminals that display a stereoscopic image based on a rendering image generated from volume data having a three-dimensional pixel arrangement are connected peer-to-peer via a network, the peer-to-peer terminal comprising:
    Means for storing volume data;
    Means for setting a rendering parameter including at least viewpoint information;
    Means for performing rendering processing according to the rendering parameter on the volume data and generating a rendered image;
    Means for displaying a stereoscopic image based on the rendered image;
    Means for receiving an observation condition change request from the other terminal;
    An operator for inputting a request to change the observation condition of the stereoscopic image;
    Means for changing the setting of the rendering parameter in response to the change request for each observation condition;
    And means for transmitting an observation condition change request input from the operator to the counterpart terminal.
  14. The peer-to-peer terminal of the stereoscopic image observation apparatus according to claim 13, further comprising:
    Means for predicting the content of the observation condition change request;
    Means for performing rendering processing according to the prediction result on the volume data and generating a rendering prediction image; and
    Means for storing the rendering prediction image,
    wherein the stereoscopic image is displayed based on the registered rendering prediction image when a rendering prediction image matching the received or input observation condition change request is already registered in the storage means.
  15. In a method in which a shared server connected to at least one client terminal via a network generates a rendering image from volume data having a three-dimensional pixel arrangement,
    Procedure for storing volume data,
    A procedure for setting rendering parameters including at least viewpoint information;
    A procedure for performing rendering processing according to the rendering parameter on volume data and generating a rendered image;
    Transmitting the rendered image to the client terminal;
    A procedure for receiving an observation condition change request from the client terminal;
    And a procedure for changing a setting of a rendering parameter in response to the change request.
  16. The rendering image generation method according to claim 15, further comprising:
    A procedure for predicting the content of the observation condition change request;
    A procedure for performing rendering processing according to the prediction result on the volume data and generating a rendering prediction image;
    A procedure for registering the rendering prediction image; and
    A procedure for transmitting the registered rendering prediction image as the rendering image when a rendering prediction image matching the received observation condition change request is already registered.
  17. In a method in which a client terminal receives a rendering image generated from volume data having a three-dimensional pixel arrangement from a shared server and displays a stereoscopic image,
    Receiving the rendered image; and
    Displaying a stereoscopic image based on the rendered image;
    A procedure for inputting a request for changing the viewing condition of the stereoscopic image;
    And a procedure for transmitting the observation condition change request to the shared server.
  18. In a method for displaying a stereoscopic image based on a rendering image generated from volume data having a three-dimensional pixel arrangement at each terminal connected by peer-to-peer,
    Procedure for storing volume data,
    A procedure for setting rendering parameters including at least viewpoint information;
    A procedure for performing rendering processing according to the rendering parameter on volume data and generating a rendered image;
    Displaying a stereoscopic image based on the rendered image;
    A procedure for receiving an observation condition change request from the partner terminal;
    A procedure for inputting a request for changing the viewing condition of the stereoscopic image;
    A procedure for changing the setting of the rendering parameter in response to the change request for each observation condition;
    And a procedure for transmitting the input request for changing the observation condition to the counterpart terminal.
  19. A procedure for predicting the content of a change request relating to the observation condition;
    A procedure for performing rendering processing according to the prediction result on the volume data and generating a rendering predicted image;
    Registering the rendering prediction image,
    A stereoscopic image display method characterized in that the stereoscopic image is displayed based on the registered rendering prediction image when a rendering prediction image matching the received or input observation condition change request is already registered.
  20.   A rendering image generation program for causing a shared server to execute the rendering image generation method according to claim 15 or 16.
  21.   A storage medium for a rendering image generation program in which the rendering image generation program according to claim 20 is stored so as to be readable by a shared server.
  22.   A stereoscopic image display program for causing a computer to execute the stereoscopic image display method according to claim 17.
  23.   A storage medium for a stereoscopic image display program in which the stereoscopic image display program according to claim 22 is stored so as to be readable by a computer.
JP2004286621A 2004-09-30 2004-09-30 Stereoscopic image observation device and its shared server, client terminal and peer to peer terminal, rendering image creation method and stereoscopic image display method and program therefor, and storage medium Pending JP2006101329A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004286621A JP2006101329A (en) 2004-09-30 2004-09-30 Stereoscopic image observation device and its shared server, client terminal and peer to peer terminal, rendering image creation method and stereoscopic image display method and program therefor, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004286621A JP2006101329A (en) 2004-09-30 2004-09-30 Stereoscopic image observation device and its shared server, client terminal and peer to peer terminal, rendering image creation method and stereoscopic image display method and program therefor, and storage medium

Publications (1)

Publication Number Publication Date
JP2006101329A true JP2006101329A (en) 2006-04-13

Family

ID=36240700

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004286621A Pending JP2006101329A (en) 2004-09-30 2004-09-30 Stereoscopic image observation device and its shared server, client terminal and peer to peer terminal, rendering image creation method and stereoscopic image display method and program therefor, and storage medium

Country Status (1)

Country Link
JP (1) JP2006101329A (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2247117A2 (en) 2009-04-27 2010-11-03 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus
JP2011109371A (en) * 2009-11-17 2011-06-02 Kddi Corp Server, terminal, program, and method for superimposing comment text on three-dimensional image for display
KR20120095983A (en) * 2009-11-18 2012-08-29 톰슨 라이센싱 Methods and systems for three dimensional content delivery with flexible disparity selection
WO2012137821A1 (en) * 2011-04-07 2012-10-11 株式会社東芝 Image processing system, apparatus, method and program
JP2012231235A (en) * 2011-04-25 2012-11-22 Toshiba Corp Image processing system, apparatus, method, and program
WO2012161193A1 (en) * 2011-05-24 2012-11-29 株式会社東芝 Medical image diagnostic apparatus, medical image-processing apparatus and method
CN102821694A (en) * 2011-04-08 2012-12-12 株式会社东芝 Medical image processing system, medical image processing apparatus, medical image diagnostic apparatus, medical image processing method and medical image processing program
JP2013005052A (en) * 2011-06-13 2013-01-07 Toshiba Corp Image processing system, apparatus, method and program
JP2013017056A (en) * 2011-07-04 2013-01-24 Toshiba Corp Image processing system, image processing method, and medical image diagnostic device
WO2013012042A1 (en) * 2011-07-19 2013-01-24 株式会社東芝 Image processing system, device and method, and medical image diagnostic device
JP2013021459A (en) * 2011-07-08 2013-01-31 Toshiba Corp Image processor, image processing method, image processing system and medical image diagnostic device
JP2013026792A (en) * 2011-07-20 2013-02-04 Toshiba Corp Image processing device, image processing method, image processing system, and medical image diagnostic device
JP2013026736A (en) * 2011-07-19 2013-02-04 Toshiba Corp Image processing system, device, method, and program
JP2013034197A (en) * 2011-07-04 2013-02-14 Toshiba Corp Image processing apparatus, image processing method, and medical image diagnostic apparatus
JP2013038467A (en) * 2011-08-03 2013-02-21 Toshiba Corp Image processing system, image processor, medical image diagnostic apparatus, image processing method, and image processing program
JP2013039351A (en) * 2011-07-19 2013-02-28 Toshiba Corp Image processing system, image processing device, image processing method, and medical image diagnostic device
JP2013066242A (en) * 2012-12-25 2013-04-11 Toshiba Corp Image display system, device, and method, and medical image diagnostic device
JP2013066241A (en) * 2011-06-09 2013-04-11 Toshiba Corp Image processing system and method
JP2013078141A (en) * 2011-07-19 2013-04-25 Toshiba Corp Image processing system, device, method, and medical image diagnostic device
JP2013122770A (en) * 2012-12-25 2013-06-20 Toshiba Corp Image processing system, device, method, and program
JP2013123227A (en) * 2012-12-25 2013-06-20 Toshiba Corp Image processing system, device, method, and medical image diagnostic device
JP2013214884A (en) * 2012-04-02 2013-10-17 Toshiba Corp Image processing device, method, program, and stereoscopic image display device
JP2013229828A (en) * 2012-04-26 2013-11-07 Toshiba Corp Image processing device, method, program, and stereoscopic image display device
JP2014500999A (en) * 2010-11-05 2014-01-16 コーニンクレッカ フィリップス エヌ ヴェ Image content based prediction and image cache controller
JP2014050487A (en) * 2012-09-05 2014-03-20 Toshiba Corp Medical image processor, medical image processing method and medical image processing program
JP2014135771A (en) * 2014-04-22 2014-07-24 Nintendo Co Ltd Stereoscopic display control program, stereoscopic display control system, stereoscopic display controller, and stereoscopic display control method
WO2015104849A1 (en) * 2014-01-09 2015-07-16 Square Enix Holdings Co., Ltd. Video gaming device with remote rendering capability
JP2015201853A (en) * 2015-05-11 2015-11-12 株式会社東芝 Image processing system, image processing method and medical image diagnostic device
JP2015534160A (en) * 2012-09-10 2015-11-26 カルガリー サイエンティフィック インコーポレイテッド Client-side image rendering in client-server image browsing architecture
JP6441426B1 (en) * 2017-08-28 2018-12-19 株式会社エイビック Character video display system

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04255192A (en) * 1991-02-06 1992-09-10 Nec Corp Picture changeover controller and group common share picture changeover controller
JPH05292498A (en) * 1992-04-16 1993-11-05 Hitachi Ltd Image synthesizing system and intermulti-spot video conference equipment using same
JPH07306955A (en) * 1992-07-24 1995-11-21 Walt Disney Co:The Method for producing three-dimensional illusion and system therefor
JPH1074269A (en) * 1996-06-26 1998-03-17 Matsushita Electric Ind Co Ltd Stereoscopic cg moving image generator
JPH10191393A (en) * 1996-12-24 1998-07-21 Sharp Corp Multi-view-point image coder
JP2001177849A (en) * 1999-12-15 2001-06-29 Mitsubishi Electric Corp Three-dimensional image generating system and three- dimensional image generator
JP2001251596A (en) * 2000-03-03 2001-09-14 Kddi Corp Image transmitter, shadow viewer of received image and image transmitter-shadow viewer having tv conference function
JP2002095018A (en) * 2000-09-12 2002-03-29 Canon Inc Image display controller, image display system and method for displaying image data
JP2003208633A (en) * 2002-01-10 2003-07-25 Mitsubishi Electric Corp Server, client, transfer system and transfer method
JP2004062457A (en) * 2002-07-26 2004-02-26 Matsushita Electric Works Ltd Merchandise information providing system
JP2004188002A (en) * 2002-12-12 2004-07-08 Fuji Photo Film Co Ltd Image display device
JP2004187743A (en) * 2002-12-09 2004-07-08 Hitachi Medical Corp Medical three-dimensional image display device
JP2004264907A (en) * 2003-02-17 2004-09-24 Sony Computer Entertainment Inc Image generation system, image generation device, and image generation method

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2247117A2 (en) 2009-04-27 2010-11-03 Mitsubishi Electric Corporation Stereoscopic video distribution system, stereoscopic video distribution method, stereoscopic video distribution apparatus, stereoscopic video viewing system, stereoscopic video viewing method, and stereoscopic video viewing apparatus
JP2011109371A (en) * 2009-11-17 2011-06-02 Kddi Corp Server, terminal, program, and method for superimposing comment text on three-dimensional image for display
JP2013511889A (en) * 2009-11-18 2013-04-04 Thomson Licensing Method and system for 3D content distribution with flexible parallax selection
KR20120095983A (en) * 2009-11-18 2012-08-29 Thomson Licensing Methods and systems for three dimensional content delivery with flexible disparity selection
US20120229604A1 (en) * 2009-11-18 2012-09-13 Boyce Jill Macdonald Methods And Systems For Three Dimensional Content Delivery With Flexible Disparity Selection
KR101699957B1 (en) * 2009-11-18 2017-01-25 Thomson Licensing Methods and systems for three dimensional content delivery with flexible disparity selection
US10015250B2 (en) 2010-11-05 2018-07-03 Koninklijke Philips N.V. Image content based prediction and image cache controller
JP2014500999A (en) * 2010-11-05 2014-01-16 Koninklijke Philips N.V. Image content based prediction and image cache controller
WO2012137821A1 (en) * 2011-04-07 2012-10-11 株式会社東芝 Image processing system, apparatus, method and program
US9445082B2 (en) 2011-04-07 2016-09-13 Toshiba Medical Systems Corporation System, apparatus, and method for image processing
JP2012217591A (en) * 2011-04-07 2012-11-12 Toshiba Corp Image processing system, device, method and program
CN102821695A (en) * 2011-04-07 2012-12-12 株式会社东芝 Image processing system, apparatus, method and program
CN102821694A (en) * 2011-04-08 2012-12-12 株式会社东芝 Medical image processing system, medical image processing apparatus, medical image diagnostic apparatus, medical image processing method and medical image processing program
US10140752B2 (en) 2011-04-08 2018-11-27 Toshiba Medical Systems Corporation Medical image processing system, medical image processing apparatus, medical image diagnosis apparatus, and medical image processing method, related to a stereoscopic medical image process
JP2012231235A (en) * 2011-04-25 2012-11-22 Toshiba Corp Image processing system, apparatus, method, and program
JP2013006022A (en) * 2011-05-24 2013-01-10 Toshiba Corp Medical image diagnostic apparatus, medical image processing apparatus, and method
WO2012161193A1 (en) * 2011-05-24 2012-11-29 株式会社東芝 Medical image diagnostic apparatus, medical image-processing apparatus and method
US9361726B2 (en) 2011-05-24 2016-06-07 Kabushiki Kaisha Toshiba Medical image diagnostic apparatus, medical image processing apparatus, and methods therefor
CN103561655A (en) * 2011-05-24 2014-02-05 Toshiba Corp Medical image diagnostic apparatus, medical image-processing apparatus and method
JP2013066241A (en) * 2011-06-09 2013-04-11 Toshiba Corp Image processing system and method
US9578303B2 (en) 2011-06-13 2017-02-21 Toshiba Medical Systems Corporation Image processing system, image processing apparatus, and image processing method for displaying a scale on a stereoscopic display device
JP2013005052A (en) * 2011-06-13 2013-01-07 Toshiba Corp Image processing system, apparatus, method and program
JP2013034197A (en) * 2011-07-04 2013-02-14 Toshiba Corp Image processing apparatus, image processing method, and medical image diagnostic apparatus
JP2013017056A (en) * 2011-07-04 2013-01-24 Toshiba Corp Image processing system, image processing method, and medical image diagnostic device
US9628773B2 (en) 2011-07-04 2017-04-18 Toshiba Medical Systems Corporation Image processing apparatus, image processing method, and medical image diagnosis apparatus
JP2013021459A (en) * 2011-07-08 2013-01-31 Toshiba Corp Image processor, image processing method, image processing system and medical image diagnostic device
JP2013039351A (en) * 2011-07-19 2013-02-28 Toshiba Corp Image processing system, image processing device, image processing method, and medical image diagnostic device
JP2013026736A (en) * 2011-07-19 2013-02-04 Toshiba Corp Image processing system, device, method, and program
US9479753B2 (en) 2011-07-19 2016-10-25 Toshiba Medical Systems Corporation Image processing system for multiple viewpoint parallax image group
WO2013012042A1 (en) * 2011-07-19 2013-01-24 株式会社東芝 Image processing system, device and method, and medical image diagnostic device
JP2013078141A (en) * 2011-07-19 2013-04-25 Toshiba Corp Image processing system, device, method, and medical image diagnostic device
JP2013026792A (en) * 2011-07-20 2013-02-04 Toshiba Corp Image processing device, image processing method, image processing system, and medical image diagnostic device
JP2013038467A (en) * 2011-08-03 2013-02-21 Toshiba Corp Image processing system, image processor, medical image diagnostic apparatus, image processing method, and image processing program
JP2013214884A (en) * 2012-04-02 2013-10-17 Toshiba Corp Image processing device, method, program, and stereoscopic image display device
JP2013229828A (en) * 2012-04-26 2013-11-07 Toshiba Corp Image processing device, method, program, and stereoscopic image display device
JP2014050487A (en) * 2012-09-05 2014-03-20 Toshiba Corp Medical image processor, medical image processing method and medical image processing program
JP2015534160A (en) * 2012-09-10 2015-11-26 Calgary Scientific Inc. Client-side image rendering in client-server image browsing architecture
JP2013066242A (en) * 2012-12-25 2013-04-11 Toshiba Corp Image display system, device, and method, and medical image diagnostic device
JP2013123227A (en) * 2012-12-25 2013-06-20 Toshiba Corp Image processing system, device, method, and medical image diagnostic device
JP2013122770A (en) * 2012-12-25 2013-06-20 Toshiba Corp Image processing system, device, method, and program
JP2016509485A (en) * 2014-01-09 2016-03-31 Square Enix Holdings Co., Ltd. Video game device having remote drawing capability
WO2015104849A1 (en) * 2014-01-09 2015-07-16 Square Enix Holdings Co., Ltd. Video gaming device with remote rendering capability
US9901822B2 (en) 2014-01-09 2018-02-27 Square Enix Holdings Co., Ltd. Video gaming device with remote rendering capability
JP2014135771A (en) * 2014-04-22 2014-07-24 Nintendo Co Ltd Stereoscopic display control program, stereoscopic display control system, stereoscopic display controller, and stereoscopic display control method
JP2015201853A (en) * 2015-05-11 2015-11-12 Toshiba Corp Image processing system, image processing method and medical image diagnostic device
JP6441426B1 (en) * 2017-08-28 2018-12-19 株式会社エイビック Character video display system

Similar Documents

Publication Title
IJsselsteijn et al. Subjective evaluation of stereoscopic images: effects of camera parameters and display duration
EP2299726B1 (en) Video communication method, apparatus and system
EP2357838B1 (en) Method and apparatus for processing three-dimensional images
JP5014979B2 (en) 3D information acquisition and display system for personal electronic devices
Jones et al. Controlling perceived depth in stereoscopic images
KR101492876B1 (en) 3D video control system to adjust 3D video rendering based on user preferences
US8823769B2 (en) Three-dimensional video conferencing system with eye contact
EP1328129A1 (en) Apparatus for generating computer generated stereoscopic images
Huynh-Thu et al. Video quality assessment: From 2D to 3D—Challenges and future trends
EP2340534B1 (en) Optimal depth mapping
KR101237945B1 (en) Critical Alignment Of Parallax Images For Autostereoscopic Display
CN101459857B (en) Communication terminal
JP4533895B2 (en) Motion control for image rendering
JP2739820B2 (en) Video conferencing system
KR20100085188A (en) A three dimensional video communication terminal, system and method
US8472702B2 (en) Method and apparatus for processing three-dimensional images
US7734085B2 (en) Image data delivery system, image data transmitting device thereof, and image data receiving device thereof
US8300089B2 (en) Stereoscopic depth mapping
US7532224B2 (en) Information processing method and apparatus
US9774896B2 (en) Network synchronized camera settings
KR101313740B1 (en) OSMU (One Source Multi Use)-type Stereoscopic Camera and Method of Making Stereoscopic Video Content thereof
CN101636747B (en) Two dimensional/three dimensional digital information acquisition and display device
US9251621B2 (en) Point reposition depth mapping
Shao et al. Asymmetric coding of multi-view video plus depth based 3-D video for view rendering
NL1032656C2 (en) 3-d image processing device and method.

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070309

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20091225

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100203

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100331

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100929

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20110209