JP4496823B2 - Image processing apparatus and method, and image display system - Google Patents


Info

Publication number
JP4496823B2
JP4496823B2 (application JP2004106885A)
Authority
JP
Japan
Prior art keywords
image
display unit
user
display
position
Prior art date
Legal status
Active
Application number
JP2004106885A
Other languages
Japanese (ja)
Other versions
JP2005293197A (en)
Inventor
Hiroki Kasai
Kohei Asada
Original Assignee
Sony Corporation
Priority date
Filing date
Publication date
Application filed by Sony Corporation
Priority to JP2004106885A
Publication of JP2005293197A
Application granted
Publication of JP4496823B2
Application status: Active
Anticipated expiration


Description

  The present invention relates to an image processing apparatus and method for enhancing the sense of presence of an image displayed on a display or projected by a projector, and to an image display system including such an image processing apparatus.

  Conventionally, it is known that an image displayed on a virtual screen can be appreciated with a high sense of presence by wearing a head mounted display (HMD).

  In addition, it is known that when an image projected from behind a hemispherical dome-shaped transmission screen is viewed from the center of the screen, a realistic wide-field image close to the range of human vision can be appreciated (see, for example, Patent Reference 1).

Patent Reference 1: JP-A-8-334845

  However, the HMD described above has a small physical screen held very close to both eyes, so the user's eyes tire easily. The hemispherical dome-shaped screen onto which a projector image is projected, on the other hand, offers a large physical screen but is difficult to install, which makes casual viewing impractical.

  The present invention has been proposed in view of these circumstances, and its object is to provide an image processing apparatus and method, and an image display system including such an apparatus, that allow realistic images to be viewed easily even in an ordinary home.

To achieve the above object, an image processing apparatus and method according to the present invention perform predetermined processing on an image to be displayed on a main display unit and a plurality of sub display units provided adjacent to the main display unit. Of a virtual image developed on a virtual plane formed by extending the plane of the main display unit, a partial image cut out by a group of straight lines starting from the viewpoint position of the user viewing the image and passing through the periphery of each display unit is perspective-transformed by image processing that displays each predetermined position on the virtual image at the position on the sub display unit where the straight line connecting the user's viewpoint position and that predetermined position intersects the sub display unit, and the partial image after perspective transformation is displayed on the sub display units.

To achieve the above object, an image display system according to the present invention includes a main display unit, a plurality of sub display units provided adjacent to the main display unit, and an image processing device that performs predetermined processing on the image displayed on the main display unit and the sub display units. The image processing device includes perspective transformation means that, for the partial image of a virtual image developed on a virtual plane formed by extending the plane of the main display unit, cut out by a group of straight lines starting from the viewpoint position of the user viewing the image and passing through the periphery of each display unit, performs image processing that displays each predetermined position on the virtual image at the position on the sub display unit where the straight line connecting the user's viewpoint position and that predetermined position intersects the sub display unit, and displays the partial image after perspective transformation on the sub display units.

  In such an image processing apparatus and method, and such an image display system, the partial image of the virtual image developed on the virtual plane that is cut out by the group of straight lines starting from the viewpoint position of the user viewing the displayed image and passing through the periphery of each image display unit is perspective-transformed, and the partial image after perspective transformation is displayed on the image display units.

To achieve the above object, an image processing apparatus and method according to the present invention display a three-dimensional virtual space, in which each display element has position information, as a two-dimensional image on a main display unit and a plurality of sub display units provided adjacent to the main display unit. Each display element in the subspace of the three-dimensional virtual space surrounded by a group of straight lines starting from the viewpoint position of the user viewing the two-dimensional image and passing through the periphery of each display unit is converted into a coordinate system centered on the viewpoint position, and the display elements after coordinate conversion are rendered on each image display unit. In this coordinate-system conversion, each display element of the subspace is perspective-transformed by converting its coordinates so that a predetermined position on the virtual image developed on the virtual plane formed by extending the plane of the main display unit is displayed at the position on the sub display unit where the straight line connecting the user's viewpoint position and that predetermined position intersects the sub display unit.

To achieve the above object, an image display system according to the present invention displays a three-dimensional virtual space, in which each display element has position information, as a two-dimensional image on a main display unit and a plurality of sub display units provided adjacent to the main display unit. The image processing device includes coordinate conversion means that converts each display element in the subspace surrounded by a group of straight lines starting from the viewpoint position of the user viewing the two-dimensional image and passing through the periphery of each image display unit into a coordinate system centered on the viewpoint position, and rendering means that renders each display element after coordinate conversion on each image display unit. The coordinate conversion means perspective-transforms each display element of the subspace by converting its coordinates so that a predetermined position on the virtual image developed on the virtual plane formed by extending the plane of the main display unit is displayed at the position on the sub display unit where the straight line connecting the user's viewpoint position and that predetermined position intersects the sub display unit.

  In such an image processing apparatus and method, and such an image display system, when a three-dimensional virtual space in which each display element has position information is displayed on the image display units as a two-dimensional image, each display element in the subspace of the three-dimensional virtual space surrounded by a group of straight lines starting from the viewpoint position of the user viewing the two-dimensional image and passing through the periphery of each image display unit is converted into a coordinate system centered on the viewpoint position, and each display element after coordinate conversion is rendered on the image display units.

  In the image processing apparatus and method, and the image display system according to the present invention, the partial image of the virtual image developed on the virtual plane that is cut out by the group of straight lines starting from the viewpoint position of the user viewing the displayed image and passing through the periphery of each image display unit is perspective-transformed, and the partial image after perspective transformation is displayed on the image display units. Even when an image display unit is arranged obliquely with respect to the user's line-of-sight direction, the perspective transformation allows a larger virtual image to be shown on it, so the user can appreciate the image with a strong sense of realism.

  Further, in the image processing apparatus and method and the image display system according to the present invention, when the three-dimensional virtual space in which each display element has position information is displayed on the image display units as a two-dimensional image, each display element in the subspace surrounded by the group of straight lines starting from the viewpoint position of the user viewing the two-dimensional image and passing through the periphery of each image display unit is converted into a coordinate system centered on the viewpoint position, and each display element after coordinate conversion is rendered on the image display units. As a result, not only does the appearance of the three-dimensional virtual space change as the user moves, but even when an image display unit is arranged obliquely with respect to the user's line-of-sight direction a larger two-dimensional image is shown on it, so the user can be immersed in the three-dimensional virtual space with a strong sense of realism.

  Hereinafter, specific embodiments to which the present invention is applied will be described in detail with reference to the drawings.

(First embodiment)
In general, rather than displaying an image on a single display 10 in front of the user as shown in FIG. 1A, connecting sub-displays 21 and 22 to the left and right of the main display 20 in front of the user as shown in FIG. 1B and displaying one image across these three displays 20 to 22 allows the user to appreciate the image with a higher sense of realism. That is, as shown in FIG. 1B, the displayable screen spreads in the horizontal direction by the width of the connected sub-displays 21 and 22, so the horizontal range of the display image I can be expanded from A–A′ in the case of FIG. 1A to B–B′. As a result, the image occupies a viewing angle R2 larger than the viewing angle R1 in the case of FIG. 1A, giving a strong sense of presence.

  In contrast, in the present embodiment, as shown in FIG. 1C, the left and right sub-displays 21 and 22 are connected at an angle toward the user so that the viewing angle R3 becomes larger than the viewing angle R2 in the case of FIG. 1B, and the display image I (virtual image), whose horizontal range on the virtual display VD (virtual plane) extends over A–C and A′–C′, is displayed on the displays 20 to 22. Thereby, as shown in FIG. 2, the horizontal range of the display image I can be expanded by B–C and B′–C′ on the virtual display VD compared with the case of FIG. 1B, and the sense of realism can be increased further.

  Here, in the present embodiment, since the sub-displays 21 and 22 are angled toward the user, displaying the display image I on the virtual display VD on the sub-displays 21 and 22 as it is would result in a distorted image.

  Therefore, in the present embodiment, the display image I on the virtual display VD is perspective-transformed onto the sub-displays 21 and 22 according to the angle formed between the main display 20 in front of the user and the sub-displays 21 and 22 on the left and right.

  Specifically, as shown in FIG. 3, the intermediate position (viewpoint position) O between the user's left and right eyes is connected by straight lines to each image portion (P1, P2, P3, ...) of the virtual display VD to the left of the main display 20, for example, and image processing is performed so that the corresponding image portion is displayed at the position (P1′, P2′, P3′, ...) where each straight line intersects the sub-display 21. FIG. 3 is a view from above the user and illustrates the horizontal direction, but the same image processing is also performed in the vertical direction.
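The mapping described above can be sketched in 2D as a line–line intersection. The sketch below is illustrative only: the coordinate frame, the function name `map_virtual_to_sub`, and the parameters (viewer at the origin, main display of width w at distance d, left sub-display hinged at the main display's left edge and tilted toward the user by theta) are assumptions, not part of the patent.

```python
import numpy as np

def map_virtual_to_sub(px, d=1.0, w=1.0, theta=np.pi / 4):
    """Map a point P = (px, d) on the virtual display (the extended main-display
    plane) to the point P' where the line from the viewpoint O = (0, 0) to P
    crosses the left sub-display.

    The sub-display is hinged at H = (-w/2, d) and tilted toward the user by
    theta (theta = 0 means coplanar with the main display, i.e. no tilt).
    Returns (P', s): the intersection point and the distance from the hinge.
    """
    # Sub-display direction: leftward and toward the user for theta > 0.
    u = np.array([-np.cos(theta), -np.sin(theta)])
    # Solve O + t*(px, d) = H + s*u for (t, s) as a 2x2 linear system.
    A = np.array([[px, -u[0]],
                  [d,  -u[1]]])
    b = np.array([-w / 2.0, d])
    t, s = np.linalg.solve(A, b)
    return t * np.array([px, d]), s
```

With theta = 0 the sub-display lies in the virtual plane and the mapping is the identity; as theta grows, points farther out on the virtual display compress onto the angled sub-display, which is the perspective transformation of FIG. 3.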

  When such image processing is performed, the user can appreciate the image displayed in the hatched area in FIG. 4A, and this hatched area is the effective range of the virtual display VD. Thus, even with the sub-displays 21 and 22 angled toward the user, the above image processing gives the user the illusion of looking at one large display, and a strong sense of immersion can be obtained.

  At this time, when the user turns toward the sub-display 21, for example, an image with the region shape shown by the hatching in FIG. 4B is seen, and the image appears distorted. However, the present embodiment assumes that the user's line of sight is directed at the main display 20 in front, where image distortion is small, and the sub-displays 21 and 22 play an auxiliary role of occupying most of the peripheral field of view, so this is not a problem.

  Note that an image may be displayed not only in the hatched area shown in FIG. 4B but on the entire sub-displays 21 and 22. However, displaying as shown in FIG. 4B lets the user view the image displayed in the hatched area of FIG. 4A and sustains the illusion of watching one large display, which provides a strong sense of immersion, so it is preferable to display an image only in the hatched area.

  In the image processing described above, the perspective transformation connects the intermediate position (viewpoint position) O between the user's left and right eyes to each image portion on the virtual display VD, so when the image is actually viewed with both eyes some distortion occurs due to parallax. However, as described above, the present embodiment assumes that the user's line of sight is directed at the main display 20 in front. Human visual resolution is poor for the sub-displays 21 and 22 located at the edge of the field of view, so a sufficient sense of immersion is obtained even if they are not perfectly in focus, and this is not a problem.

  Incidentally, the image processing for displaying images on the sub-displays 21 and 22 as shown in FIG. 3 is valid only while the user stays at a fixed position facing the front. That is, if either the position of the user relative to the main display 20 and sub-displays 21 and 22 or the angle between the main display 20 and the sub-displays 21 and 22 changes, the user ends up viewing a distorted image.

  Therefore, in the present embodiment, the user position and the angles of the sub-displays 21 and 22 are detected, and the images displayed on the sub-displays 21 and 22 are automatically corrected based on the detection result.

  Here, a CCD (Charge Coupled Device) camera can be used to detect the user position. Further, since the sub-displays 21 and 22 are mechanically connected to the main display 20 by hinges or the like, detecting their angles is easy.

  Of course, when the sub-displays 21 and 22 are fixed to the main display 20, only the user position needs to be detected; and when the user position is largely determined by the size and handling of the display and equipment, only the angles of the sub-displays 21 and 22 need to be detected.

  The case where the display units that display an image are self-luminous displays has been described above as an example, but the same applies when projectors and screens are used. That is, sub-screens are connected to the left and right of a main screen in front of the user, and images processed in the same manner as described above are projected by three projectors onto the main screen and sub-screens, so that the user can view an image projected onto the same area as the hatched area in FIG. 4A. Thereby, the user has the illusion of looking at one large screen, and a strong sense of immersion is obtained.

  FIG. 5 shows a schematic configuration of the image processing apparatus of the present embodiment that performs the above processing. In the image processing apparatus 30 shown in FIG. 5, the image area decomposition unit 31 decomposes the input two-dimensional image data into the area to be displayed on the main display or main screen and the areas to be displayed on the two sub-displays or sub-screens, and supplies the partial image data to the perspective conversion units 33 to 35.

  Meanwhile, the angle-of-view changing unit 32 changes the angle of view of each virtual camera based on the user position detected by the CCD camera or the like and the angles of the sub-displays or sub-screens, and supplies the changed angles of view to the perspective conversion units 33 to 35. Here, a virtual camera is placed virtually at the user's viewpoint position for each of the main display or main screen and the sub-displays or sub-screens. The angle of view of a virtual camera corresponds to the viewing angle at which the corresponding display or screen is seen from the viewpoint position.
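The angle of view of each virtual camera can be recovered from the geometry alone: it is the angle the display subtends at the viewpoint. The helper below is a sketch under assumed names and conventions (top-down 2D view, display given by its two endpoints); nothing here is prescribed by the patent.

```python
import math

def angle_of_view(eye, end_a, end_b):
    """Viewing angle (radians) that a display spanning end_a..end_b
    subtends at the viewpoint `eye`, in a top-down 2D view."""
    ax, ay = end_a[0] - eye[0], end_a[1] - eye[1]
    bx, by = end_b[0] - eye[0], end_b[1] - eye[1]
    # Angle between the two eye-to-endpoint vectors.
    dot = ax * bx + ay * by
    na = math.hypot(ax, ay)
    nb = math.hypot(bx, by)
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))
```

For a centered viewer one unit away from a two-unit-wide main display this gives 2·atan(1) = 90 degrees; as the user moves, an angle-of-view changing unit would recompute such values and feed them to the perspective conversion stage.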

  The perspective conversion units 33 to 35 apply the above-described perspective transformation to the partial image data supplied from the image area decomposition unit 31 based on the changed angles of view supplied from the angle-of-view changing unit 32. The perspective conversion units 33 to 35 then display the partial image data after perspective transformation on the main display and sub-displays, or supply it to three projectors for projection onto the main screen and sub-screens.

  In this example, images are displayed on three displays, or projected onto three screens by three projectors. However, the image may instead be displayed on a single display having the shape shown in FIG. 1C, or projected onto a screen of the same shape. In this case, an image combining unit 36 may be added to the image processing apparatus 30 as shown in FIG. 6. The image combining unit 36 combines the partial image data after perspective transformation supplied from the perspective conversion units 33 to 35 into one image and displays it on a single display having the shape shown in FIG. 1C, or supplies it to a single projector for projection onto a screen of the same shape.
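If the three partial images are raster arrays of the same height, the combining step can be as simple as a horizontal concatenation. This is a minimal sketch; the array layout (left sub, main, right sub, as H x W x C arrays) and the function name are assumptions, not details from the patent.

```python
import numpy as np

def combine_partials(left, main, right):
    """Combine the three perspective-transformed partial images into one
    wide image, in left-to-right order, as an image combining unit might.
    All inputs are H x W x C arrays sharing the same height H."""
    return np.hstack([left, main, right])
```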

Specific application examples of such an image processing apparatus 30 are shown in FIGS. 7 to 10.
First, in FIG. 7A, three screens 40 to 42 are installed on a side wall of a room, and images are projected onto them from the front using three projectors 50 to 52. In this case, the screen in front of the user is the main screen 40, and the left and right screens are the sub-screens 41 and 42. The image processing apparatus 30 described above is implemented as hardware or software in the projectors 51 and 52 that project images onto the sub-screens 41 and 42. Through the image processing performed by the image processing apparatus 30, the user can appreciate content such as movies and television images displayed on a very large virtual screen VS that could not be obtained with the main screen 40 alone.

  Not only reflective screens 40 to 42 as in FIG. 7A but also transmissive screens 60 to 62 as in FIG. 7B may be used.

  Next, FIG. 8 shows an image projected onto the ceiling and a side wall using a single projector 70 in, for example, the user's bedroom. In this case, the image processing apparatus 30, implemented as hardware or software in the projector 70, performs the above-described image processing on the portion of the image projected onto the side wall. With this image processing apparatus 30, even when the ceiling is small, the user can appreciate content such as movies and television images displayed on a very large virtual screen VS.

  Subsequently, FIG. 9 shows an image displayed on three displays 80 to 82 of a personal computer (PC). In this case, the image processing apparatus 30 described above is implemented, for example, as software on the PC main body 83. Through the image processing performed by the image processing apparatus 30, the user can appreciate content such as movies and television images displayed on a very large virtual display VD that could not be obtained with the main display 80 alone. The user position may be detected using a CCD camera (not shown) attached to the PC system 84.

  Finally, FIG. 10 shows an image displayed on the three displays 90 to 92 of a foldable portable terminal 93. The portable terminal 93 carries the image processing apparatus 30 described above, for example, as software. Through the image processing performed by the image processing apparatus 30, the user can appreciate content displayed on a very large virtual display that could not be obtained with the main display 90 alone. The user position may be detected using the CCD camera 94. Since the portable terminal 93 folds and need not be opened flat but is used with its displays angled, it is very space-saving and can be used even in crowded situations such as commuting. In addition, since the displays 90 to 92 are used close to the user, they can fill the horizontal field of view and give a strong sense of realism.

(Second Embodiment)
In the first embodiment described above, the content displayed on the virtual display VD or the virtual screen VS, such as a movie or a television image, was created in advance. In the present embodiment, the displayed content is assumed to change in real time, such as three-dimensional CG (Computer Graphics) content (games, interactive movies, and the like).

  As an example, consider the portable terminal 93 shown in FIG. 10. As shown in FIG. 11, when displaying three-dimensional CG with a sense of depth on the displays 90 to 92 of the portable terminal 93, the angles of view of the virtual cameras VC1 to VC3 are determined based on the angles of the sub-displays 91 and 92 and the user position detected by the CCD camera 94, the virtual three-dimensional space is rendered, and the result is drawn on the three displays 90 to 92, making it possible to display content that gives a strong sense of realism.
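Rendering a shared 3D space onto displays that sit off the user's axis calls for an asymmetric (off-axis) view frustum per virtual camera. The sketch below builds an OpenGL-style frustum matrix from eye-relative frustum bounds; the function name and the glFrustum-style convention are assumptions illustrating one way the per-camera rendering could work, not the patent's prescribed method.

```python
import numpy as np

def frustum_matrix(l, r, b, t, n, f):
    """Perspective projection matrix for a possibly asymmetric frustum,
    following the classic glFrustum layout (right-handed, camera looking
    down -z). An off-center eye yields l != -r, i.e. an off-axis frustum
    such as a sub-display seen obliquely would require."""
    return np.array([
        [2 * n / (r - l), 0.0,             (r + l) / (r - l),  0.0],
        [0.0,             2 * n / (t - b), (t + b) / (t - b),  0.0],
        [0.0,             0.0,            -(f + n) / (f - n), -2 * f * n / (f - n)],
        [0.0,             0.0,            -1.0,                0.0],
    ])
```

With a centered eye (l = -r, b = -t) this reduces to the ordinary symmetric perspective matrix; shifting the eye sideways shifts l and r and skews the frustum toward the angled sub-display, which is how the appearance of the CG changes with the user position.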

  At this time, the CG image is computed and drawn in real time, so when the user shifts his or her head to the right as shown in FIG. 12, for example, the user position is detected by the CCD camera 94, the images from the virtual cameras VC are recomputed, the processed images are drawn on the sub-displays 91 and 92, and images with changed angles of view are displayed. In this case, the appearance of the objects in the CG itself changes.

  Similarly, when the user brings his or her face close to the screen as shown in FIG. 13, for example, the user position is detected by the CCD camera 94 and images with changed angles of view are displayed. At this time, the images drawn on the sub-displays 91 and 92 are stretched in the horizontal direction, but since the user's line of sight is directed at the main display 90 in front and visual resolution is poor at the edge of the field of view, slight distortion of the images on the sub-displays 91 and 92 is not a problem.

  As described above, the effect of enlarging the virtual display VD using the three displays 90 to 92, combined with the change in the appearance of the three-dimensional CG according to the user position, yields content with a strong sense of realism in which the user can feel immersed in the virtual space.

  The case where the display units that display an image are self-luminous displays has been described above as an example, but, as in the first embodiment, the same applies when projectors and screens are used.

  FIG. 14 shows a schematic configuration of the image processing apparatus of the present embodiment that performs the above processing. In the image processing apparatus 100 shown in FIG. 14, the angle-of-view changing unit 101 changes the angle of view of each virtual camera based on the user position detected by the CCD camera or the like and the angles of the sub-displays or sub-screens, and supplies the changed angles of view to the clipping unit 102.

  The clipping unit 102 computes a view volume (view space) for each virtual camera based on the changed angle of view supplied from the angle-of-view changing unit 101, clips away objects outside the view volume, and converts the objects inside the view volume into the camera coordinates of the corresponding virtual camera VC.
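A minimal version of this clipping step is a containment test against the view volume in camera coordinates. The sketch below assumes a symmetric volume given by a horizontal half-angle plus near and far planes in a top-down 2D view; the names are illustrative, not taken from the patent.

```python
import math

def in_view_volume(p, half_angle, near, far):
    """True if point p = (x, z) in camera coordinates (camera at the origin,
    looking along +z) lies inside the 2D view volume: between the near and
    far planes and within +/- half_angle horizontally."""
    x, z = p
    if not (near <= z <= far):
        return False
    return abs(x) <= z * math.tan(half_angle)

def clip(points, half_angle, near, far):
    """Keep only the points inside the view volume (the clipping step)."""
    return [p for p in points if in_view_volume(p, half_angle, near, far)]
```

Objects that survive this test would then be transformed into the virtual camera's coordinates and passed on to the rendering units.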

  The rendering units 103 to 105 render the visible objects inside each view volume and display them on the main display and sub-displays, or supply the image data to three projectors for projection onto the main screen and sub-screens.

  In this example, images are displayed on three displays, or projected onto three screens by three projectors. However, the image may instead be displayed on a single display having the shape shown in FIG. 1C, or projected onto a screen of the same shape. In this case, an image combining unit 106 may be added to the image processing apparatus 100 as shown in FIG. 15. The image combining unit 106 combines the image data supplied from the rendering units 103 to 105 into one image and displays it on a single display having the shape shown in FIG. 1C, or supplies it to a single projector for projection onto a screen of the same shape.

  The best mode for carrying out the present invention has been described above with reference to the drawings, but the present invention is not limited to the above-described embodiments, and it goes without saying that various modifications can be made without departing from the gist of the present invention.

  For example, the above embodiments have mainly been described using a three-sided display in which sub-displays are connected to the left and right of the main display, but the present invention is not limited to this; a five-sided display in which sub-displays are also connected above and below the main display may be used. Furthermore, the present invention can be applied even to a curved display by regarding it as a large number of connected strip-shaped displays.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a conceptual diagram for explaining the first embodiment: (A) shows a single-screen display, (B) shows a three-screen display in which sub-displays are connected to the left and right of the main display, and (C) shows a three-sided display in which the left and right sub-displays are angled toward the user.
FIG. 2 compares the sizes of the display images in FIGS. 1B and 1C.
FIG. 3 explains the perspective transformation for displaying an image on a sub-display.
FIG. 4 explains the area in which the user can actually appreciate the image: (A) shows the case where the user faces the main display, and (B) shows the case where the user faces the left sub-display.
FIG. 5 shows an example of the schematic configuration of the image processing apparatus in the first embodiment.
FIG. 6 shows another example of the schematic configuration of the image processing apparatus.
FIG. 7 shows examples of applying the image processing apparatus to projectors: (A) shows the case of reflective screens, and (B) shows the case of transmissive screens.
FIG. 8 shows an example of applying the image processing apparatus to a projector.
FIG. 9 shows an example of applying the image processing apparatus to a personal computer.
FIG. 10 shows an example of applying the image processing apparatus to a foldable portable terminal.
FIG. 11 is a conceptual diagram for explaining the second embodiment.
FIG. 12 shows the change in the angle of view of the virtual cameras when the user shifts his or her head to the right.
FIG. 13 shows the change in the angle of view of the virtual cameras when the user brings his or her face close to the screen.
FIG. 14 shows an example of the schematic configuration of the image processing apparatus in the second embodiment.
FIG. 15 shows another example of the schematic configuration of the image processing apparatus.

Explanation of symbols

  20: main display; 21, 22: sub-displays; 30: image processing apparatus; 31: image area decomposition unit; 32: angle-of-view changing unit; 33, 34, 35: perspective conversion units; 36: image combining unit; 100: image processing apparatus; 101: angle-of-view changing unit; 102: clipping unit; 103, 104, 105: rendering units; 106: image combining unit

Claims (11)

  1. An image processing apparatus that performs predetermined processing on an image displayed on a main display unit and a plurality of sub display units provided adjacent to the main display unit ,
    Of the virtual image developed on a virtual plane formed by extending the plane of the main display unit, a partial image cut out by a straight line group starting from the viewpoint position of the user viewing the image and passing through the peripheral part of each display unit , Perspective conversion is performed by performing image processing for displaying the predetermined position on the virtual image at a position on the sub-display unit that intersects a straight line connecting the user's viewpoint position and the predetermined position on the virtual image , the partial image after perspective transformation and perspective transformation unit to be displayed on the secondary display
    An image processing apparatus comprising:
  2. The main display unit, an image processing apparatus according to claim 1, wherein substantially Ru perpendicular der respect gaze direction of the user.
  3. An image processing method for performing predetermined processing on an image displayed on a main display unit and a plurality of sub display units provided adjacent to the main display unit ,
    Of the virtual image developed on a virtual plane formed by extending the plane of the main display unit, a partial image cut out by a straight line group starting from the viewpoint position of the user viewing the image and passing through the peripheral part of each display unit , Perspective conversion is performed by performing image processing for displaying the predetermined position on the virtual image at a position on the sub-display unit that intersects a straight line connecting the user's viewpoint position and the predetermined position on the virtual image , an image processing method that have a perspective transformation process for the partial image after perspective transformation is displayed on the secondary display.
  4. The image processing method according to claim 3, wherein the main display unit is substantially perpendicular to the user's line-of-sight direction.
  5. An image display system comprising a main display unit, a plurality of sub-display units provided adjacent to the main display unit, and an image processing apparatus that performs predetermined processing on the images displayed on the main display unit and the sub-display units,
    wherein the image processing apparatus comprises perspective conversion means for cutting out, from a virtual image developed on a virtual plane formed by extending the plane of the main display unit, a partial image bounded by the group of straight lines that start from the viewpoint position of a user viewing the image and pass through the peripheral edge of each display unit, performing perspective conversion on the partial image by image processing that displays each predetermined position on the virtual image at the position on the sub-display unit where the straight line connecting the user's viewpoint position to that predetermined position intersects the sub-display unit, and displaying the perspective-converted partial image on the sub-display unit.
  6. The image display system according to claim 5, wherein the main display unit is substantially perpendicular to the user's line-of-sight direction.
  7. The image display system according to claim 5, further comprising:
    position detecting means for detecting the position of the user; and
    angle detecting means for detecting the angle of each sub-display unit with respect to the main display unit,
    wherein the perspective conversion means performs perspective conversion so as to display an image that matches the user's viewing angle, which changes according to the position of the user and the angle of the sub-display unit.
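Claims 1 through 7 turn on a single geometric operation: a point on the virtual plane (the main display's plane, extended outward) is drawn at the spot where the line from the user's eye to that point pierces the tilted sub-display. A minimal numeric sketch of that ray-plane intersection follows; the layout (main display in the z = 0 plane, eye 2 units in front of it, sub-display hinged along x = 1) and all names are hypothetical illustrations, not taken from the patent.

```python
import numpy as np

def ray_plane_intersection(eye, target, plane_point, plane_normal):
    """Intersect the line from `eye` through `target` with a plane.

    Returns the 3-D intersection point, or None if the line is
    parallel to the plane.
    """
    d = target - eye                         # direction of the sight line
    denom = plane_normal @ d
    if abs(denom) < 1e-9:
        return None
    s = (plane_normal @ (plane_point - eye)) / denom
    return eye + s * d

# Hypothetical geometry: the main display lies in the z = 0 plane,
# the user's eye is 2 units in front of it, and a sub-display is
# hinged along the right edge x = 1, tilted toward the user by theta.
eye = np.array([0.0, 0.0, 2.0])
theta = np.deg2rad(30.0)
hinge = np.array([1.0, 0.0, 0.0])
# The tilted sub-display plane is spanned by the hinge axis (y) and
# the rotated in-plane direction (cos t, 0, sin t), so its normal is
# (-sin t, 0, cos t).
sub_normal = np.array([-np.sin(theta), 0.0, np.cos(theta)])

# A "predetermined position" on the virtual plane, beyond the main
# display's right edge.
p_virtual = np.array([1.5, 0.2, 0.0])

p_sub = ray_plane_intersection(eye, p_virtual, hinge, sub_normal)
print(p_sub)  # where that virtual-image point lands on the sub-display
```

With the tilt set to zero the sub-display plane coincides with the virtual plane, so the intersection degenerates to the virtual-image point itself, which is a convenient sanity check on the construction.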
  8. An image processing apparatus that displays a three-dimensional virtual space, in which each display element has position information, as a two-dimensional image on a main display unit and on a plurality of sub-display units provided adjacent to the main display unit, the apparatus comprising:
    coordinate conversion means for converting each display element in the partial space bounded by the group of straight lines that start from the viewpoint position of a user viewing the two-dimensional image and pass through the peripheral edge of each display unit in the three-dimensional virtual space into a coordinate system centered on the viewpoint position; and
    rendering means for rendering each display element after coordinate conversion on each display unit,
    wherein the coordinate conversion means performs perspective conversion by converting each display element of the partial space into a coordinate system that displays each predetermined position on a virtual image, developed on a virtual plane formed by extending the plane of the main display unit, at the position on the sub-display unit where the straight line connecting the user's viewpoint position to that predetermined position intersects the sub-display unit.
  9. An image processing method for displaying a three-dimensional virtual space, in which each display element has position information, as a two-dimensional image on a main display unit and on a plurality of sub-display units provided adjacent to the main display unit, the method comprising:
    a coordinate conversion step of converting each display element in the partial space bounded by the group of straight lines that start from the viewpoint position of a user viewing the two-dimensional image and pass through the peripheral edge of each display unit in the three-dimensional virtual space into a coordinate system centered on the viewpoint position; and
    a rendering step of rendering each display element after coordinate conversion on each display unit,
    wherein in the coordinate conversion step, perspective conversion is performed by converting each display element of the partial space into a coordinate system that displays each predetermined position on a virtual image, developed on a virtual plane formed by extending the plane of the main display unit, at the position on the sub-display unit where the straight line connecting the user's viewpoint position to that predetermined position intersects the sub-display unit.
  10. An image display system that displays a three-dimensional virtual space, in which each display element has position information, as a two-dimensional image on a main display unit and on a plurality of sub-display units provided adjacent to the main display unit,
    wherein the image processing apparatus comprises:
    coordinate conversion means for converting each display element in the partial space bounded by the group of straight lines that start from the viewpoint position of a user viewing the two-dimensional image and pass through the peripheral edge of each display unit in the three-dimensional virtual space into a coordinate system centered on the viewpoint position; and
    rendering means for rendering each display element after coordinate conversion on each display unit,
    wherein the coordinate conversion means performs perspective conversion by converting each display element of the partial space into a coordinate system that displays each predetermined position on a virtual image, developed on a virtual plane formed by extending the plane of the main display unit, at the position on the sub-display unit where the straight line connecting the user's viewpoint position to that predetermined position intersects the sub-display unit.
  11. The image display system according to claim 10, further comprising:
    position detecting means for detecting the position of the user; and
    angle detecting means for detecting the angle of each sub-display unit with respect to the user's line-of-sight direction,
    wherein the coordinate conversion means performs coordinate conversion so as to display an image that matches the user's viewing angle, which changes according to the position of the user and the angle of the sub-display unit.
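Claims 8 through 11 start instead from display elements carrying full 3-D positions: each element is first expressed in a coordinate system centered on the user's viewpoint, then rendered onto the display plane. The sketch below is a deliberate simplification, assuming the coordinate conversion is a pure translation into eye-centered coordinates and the rendering is a pinhole projection onto the main display plane; the claims' per-sub-display handling is omitted, and all values and names are hypothetical.

```python
import numpy as np

def to_viewpoint_coords(points_world, eye):
    """Translate display elements into a coordinate system centered
    on the user's viewpoint (axes kept parallel to the world axes,
    a simplification of the claimed coordinate conversion)."""
    return points_world - eye

def project_to_display(points_eye, f):
    """Perspective-project eye-centered points onto a display plane
    at distance `f` along the viewing (+z) axis, by similar
    triangles: (x, y, z) -> (f*x/z, f*y/z)."""
    z = points_eye[:, 2]
    return f * points_eye[:, :2] / z[:, None]

# Hypothetical scene: three display elements in the 3-D virtual
# space, with the user's eye 2 units behind the main display plane.
eye = np.array([0.0, 0.0, -2.0])
elements = np.array([
    [0.0, 0.0, 0.0],    # on the main display plane itself
    [0.5, 0.5, 2.0],    # deeper in the virtual space
    [-1.0, 0.2, 4.0],
])

eye_coords = to_viewpoint_coords(elements, eye)
image_2d = project_to_display(eye_coords, f=2.0)
print(image_2d)
```

An element lying on the display plane projects to its own (x, y) coordinates, while deeper elements are drawn closer to the center, which is the usual perspective foreshortening the claims rely on.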
JP2004106885A 2004-03-31 2004-03-31 Image processing apparatus and method, and image display system Active JP4496823B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004106885A JP4496823B2 (en) 2004-03-31 2004-03-31 Image processing apparatus and method, and image display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004106885A JP4496823B2 (en) 2004-03-31 2004-03-31 Image processing apparatus and method, and image display system

Publications (2)

Publication Number Publication Date
JP2005293197A JP2005293197A (en) 2005-10-20
JP4496823B2 true JP4496823B2 (en) 2010-07-07

Family

ID=35326063

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004106885A Active JP4496823B2 (en) 2004-03-31 2004-03-31 Image processing apparatus and method, and image display system

Country Status (1)

Country Link
JP (1) JP4496823B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009198698A (en) * 2008-02-20 2009-09-03 Dainippon Printing Co Ltd Immersive display device
KR101305249B1 (en) 2012-07-12 2013-09-06 씨제이씨지브이 주식회사 Multi-projection system
KR101429812B1 (en) * 2012-09-18 2014-08-12 한국과학기술원 Device and method of display extension for television by utilizing external projection apparatus
TWI532025B * 2012-12-18 2016-05-01 LG Display Co Ltd Display device and driving method thereof
KR101906002B1 (en) * 2016-11-22 2018-10-08 순천향대학교 산학협력단 Multi sides booth system for virtual reality and the thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09311383A (en) * 1996-08-22 1997-12-02 Toppan Printing Co Ltd Video display system
JP2001136466A (en) * 1999-11-08 2001-05-18 Hitachi Ltd Image display device
JP2004078121A (en) * 2002-08-22 2004-03-11 Sharp Corp Device, method, and program for display correction and recording medium with recorded display correcting program

Also Published As

Publication number Publication date
JP2005293197A (en) 2005-10-20

Similar Documents

Publication Publication Date Title
Schmalstieg et al. Augmented reality: principles and practice
Fehn et al. Interactive 3-DTV-concepts and key technologies
US9899005B2 (en) Eye mounted displays and systems, with data transmission
US8786675B2 (en) Systems using eye mounted displays
JP4933164B2 (en) Information processing apparatus, information processing method, program, and storage medium
JP3948489B2 (en) Luminance filter creation method and virtual space generation system
US10449444B2 (en) Spatially-correlated multi-display human-machine interface
JP4823334B2 (en) Image generation system, image generation method, program, and information storage medium
US8896534B2 (en) Spatially-correlated multi-display human-machine interface
US20130057543A1 (en) Systems and methods for generating stereoscopic images
US6411266B1 (en) Apparatus and method for providing images of real and virtual objects in a head mounted display
Cruz-Neira et al. Surround-screen projection-based virtual reality: the design and implementation of the CAVE
US9098112B2 (en) Eye tracking enabling 3D viewing on conventional 2D display
US8928659B2 (en) Telepresence systems with viewer perspective adjustment
KR100520699B1 (en) Autostereoscopic projection system
US20120162384A1 (en) Three-Dimensional Collaboration
DeFanti et al. The future of the CAVE
US6965381B2 (en) Multi-person shared display device
US20110084983A1 (en) Systems and Methods for Interaction With a Virtual Environment
JP3230745B2 Three-dimensional image generating apparatus and method
US9595127B2 (en) Three-dimensional collaboration
US8704882B2 (en) Simulated head mounted display system and method
WO2010062117A2 (en) Immersive display system for interacting with three-dimensional content
EP1033682B1 (en) Image processing apparatus and image processing method
US8976323B2 (en) Switching dual layer display with independent layer content and a dynamic mask

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070313

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20091224

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100105

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100226

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100323

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100405

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130423

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140423

Year of fee payment: 4

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
