JP3403143B2 - Image processing method, apparatus and storage medium - Google Patents

Image processing method, apparatus and storage medium

Info

Publication number
JP3403143B2
JP3403143B2 JP2000082610A
Authority
JP
Japan
Prior art keywords
space
image
image processing
data
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2000082610A
Other languages
Japanese (ja)
Other versions
JP2000348201A (en)
Inventor
Akihiro Katayama (片山 昭宏)
Original Assignee
Canon Inc. (キヤノン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP8466099
Priority to JP11-84660
Application filed by Canon Inc. (キヤノン株式会社)
Priority to JP2000082610A
Publication of JP2000348201A
Application granted
Publication of JP3403143B2
Anticipated expiration
Application status: Expired - Fee Related

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformation in the plane of the image
    • G06T3/40Scaling the whole image or part thereof
    • G06T3/4007Interpolation-based scaling, e.g. bilinear interpolation

Description

BACKGROUND OF THE INVENTION [0001] [Technical Field of the Invention] The present invention relates to the generation of virtual-space images from ray space data in accordance with a user's operation.

A method for recording ray space data is first described. As shown in FIG. 1, a coordinate system O-X-Y-Z is set up in real space. A light ray passing through a reference plane P (Z = z) perpendicular to the Z axis is represented by the position (x, y) at which the ray crosses P and by the variables θ and φ indicating the direction of the ray. Thus one ray is uniquely determined by the five variables (x, y, z, θ, φ). If a function f represents the light intensity of each ray, the ray data in this space can be expressed as f(x, y, z, θ, φ). This five-dimensional space is called the "ray space."

If the reference plane P is set at z = 0 and the vertical parallax of the rays, i.e., the degree of freedom in the φ direction, is omitted, the rays are reduced to the two dimensions (x, θ). This x-θ two-dimensional space is a subspace of the ray space. Setting u = tan θ, a ray passing through a point (X, Z) in real space (FIG. 2) is mapped in the x-u space onto the straight line [Equation 1] X = x + u·Z, as shown in FIG. 3.

Photographing with a camera corresponds to receiving, on the imaging surface, the rays that pass through the entrance pupil of the camera and recording their brightness and color. In other words, the group of rays passing through one point in real space, the entrance pupil position, is acquired as an image with as many samples as there are pixels. Since the degree of freedom in the φ direction is omitted and only the behavior of rays in the X-Z plane is considered, only the pixels on the line segment where the image intersects the plane perpendicular to the Y axis need be considered. Photographing one image thus collects the rays passing through one point, and one photograph acquires the data on one line segment of the x-u space.

When such photographs are taken from many different viewpoint positions, groups of rays passing through many points are obtained. When real space is photographed with N cameras as shown in FIG. 4, data on the straight line [Equation 2] x + u·Zn = Xn, corresponding to the entrance pupil position (Xn, Zn) of the n-th camera Cn (n = 1, 2, ..., N), can be input, as shown in FIG. 5. By photographing from a sufficiently large number of viewpoints, the x-u space can be filled densely with data (FIG. 6). Conversely, an observation image from an arbitrary new viewpoint position E(X, Z) can be generated from the data in the x-u space (FIG. 7) by reading out the data lying on the straight line of Equation 1 for that viewpoint.

One major feature of ray space data is that it is defined per pixel: the data of one scene is expressed by ray space data for the number of pixels in a frame. The amount of ray space data therefore does not depend on the complexity of the scene, and the amount of computation depends only on the size and resolution, i.e., the total number of pixels, of the image to be generated. With ordinary CG data, a complicated scene cannot be expressed without increasing the number of polygons, which increases the amount of computation and lowers drawing performance. With ray space data, if the total number of pixels of the image to be drawn is constant, the drawing performance is constant regardless of the complexity of the scene.

Consider a user walking through a virtual space generated from such ray space data. To give the user the sensation of walking through, several tens of frames per second must be generated and presented. However, if the total number of pixels of the image reconstructed from the ray space data in each frame is large, drawing one frame takes time, and the drawing frame rate may not keep up with the moving speed. Likewise, when the user manipulates an object in the scene at high speed (moving or rotating it), drawing may not keep up if the object reconstructed from the ray space data has many pixels.

The present invention has been proposed to solve these problems of the prior art. Its object is to propose an image processing apparatus and method that, when reconstructing an image from image data containing spatial information such as ray space data, secure a high drawing frame rate by lowering the sampling rate and reading the image information at a resolution below the recorded resolution in situations, such as movement through the space, where the reduced resolution poses no problem.
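The capture-and-reconstruct scheme described above can be sketched in code. The following is a minimal illustration, not the patent's implementation: the quantization steps `DX`/`DU`, the pixel grid, the camera placement, and the toy scene function are all assumptions made for the example.

```python
import math

# A ray with direction u = tan(theta) passing through the point (X, Z)
# crosses the reference plane z = 0 at x = X - u * Z   (Equation 1).

DX, DU = 0.01, 0.01  # quantization steps for x and u (illustrative)

def key(x, u):
    """Quantize (x, u) to a discrete ray-space cell."""
    return (round(x / DX), round(u / DU))

def capture(ray_space, scene, cam_x, cam_z, n_pixels=64, half_fov=math.radians(30)):
    """One photograph: store the rays through the entrance pupil (cam_x, cam_z).
    They all lie on the line x + u * cam_z = cam_x   (Equation 2)."""
    for j in range(n_pixels):
        theta = -half_fov + (2 * half_fov) * j / (n_pixels - 1)
        u = math.tan(theta)
        x = cam_x - u * cam_z          # Equation 1 with (X, Z) = pupil position
        ray_space[key(x, u)] = scene(x, u)

def reconstruct(ray_space, eye_x, eye_z, n_pixels=64, half_fov=math.radians(30)):
    """New view from E(eye_x, eye_z): read the data on its Equation-1 line."""
    img = []
    for j in range(n_pixels):
        theta = -half_fov + (2 * half_fov) * j / (n_pixels - 1)
        u = math.tan(theta)
        img.append(ray_space.get(key(eye_x - u * eye_z, u)))  # None if missing
    return img

# Toy "scene": the colour of a ray depends only on where it crosses z = 0.
scene = lambda x, u: int(x * 100) % 256

rs = {}
for cam_x in [i * 0.05 for i in range(-20, 21)]:  # many viewpoints on z = 1
    capture(rs, scene, cam_x, 1.0)

view = reconstruct(rs, 0.0, 1.0)  # a viewpoint that was densely covered
```

A viewpoint coinciding with one of the capture positions reads back every ray it needs; a viewpoint off the captured lines yields gaps (`None`), which is why dense sampling of viewpoints matters.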
To this end, the image processing apparatus of the present invention has, for example, the following configuration. That is, an image processing apparatus for generating an image of a virtual space according to a user's operation by using ray space data comprises: recording means for recording the ray space data; setting means for setting, in response to the user's operation, a sampling rate indicating the pixel interval at which the ray space data are read from the recording means; reconstruction means for reading the ray space data from the recording means in accordance with the sampling rate set by the setting means and reconstructing an image of the virtual space; and interpolation means for performing pixel interpolation processing so as to enlarge the reconstructed image to a set image size.

When the sampling rate is determined based on, for example, the moving speed in the virtual space, the image information is read out at that rate and an image is reconstructed. Since this rate is below the resolution at the time of recording, the time required for image reconstruction is shortened, and the decrease in resolution is compensated by pixel interpolation (for example, texture mapping).

To achieve the object of the present invention, the image processing method of the invention likewise has, for example, the following configuration. That is, an image processing method performed by an image processing apparatus that generates an image of a virtual space according to a user's operation by using ray space data recorded in a memory comprises: a setting step of setting, in response to the user's operation, a sampling rate indicating the pixel interval at which the ray space data are read from the memory; a reconstruction step of reading the ray space data from the memory in accordance with the sampling rate set in the setting step and reconstructing an image of the virtual space; and an interpolation step of performing pixel interpolation processing so as to enlarge the image reconstructed in the reconstruction step to a set image size. The above object can also be achieved by a storage medium storing a program for realizing this method.

DETAILED DESCRIPTION OF THE INVENTION Hereinafter, referring to the attached drawings,
an image processing apparatus and an image processing method to which the present invention is applied will be described in detail. This image processing apparatus and method set the sampling rate at which ray space data are read when walking through a virtual space represented by ray space data, or when manipulating (moving or rotating) an object in that space; by lowering the rate, and hence the resolution, the drawing frame rate is made to follow the speed of movement through the space, while the loss of resolution is compensated by pixel interpolation using texture mapping. FIG. 8 illustrates the quantization of the ray space data in this embodiment. The ray space data handled by the image processing apparatus of this embodiment are assumed to be image data obtained with a camera having a CCD with N pixels in the main scanning direction, an angle of view w, and an angle α between the optical axis and the Z axis.
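Under this camera model, the angle of each pixel's ray follows directly from Equation 3 below. The following helpers are a sketch of that conversion; the function names are ours, not the patent's.

```python
import math

def pixel_angle(j, n_pixels, fov_w, alpha):
    """Angle theta between the ray through pixel j and the Z axis, from
    Equation 3: (N/2) * tan(theta - alpha) = (N/2 - j) * tan(w/2)."""
    t = (1 - 2 * j / n_pixels) * math.tan(fov_w / 2)
    return alpha + math.atan(t)

def pixel_to_xu(j, n_pixels, fov_w, alpha, cam_x, cam_z):
    """Map pixel j of a camera with entrance pupil (cam_x, cam_z) to the
    ray-space coordinates (x, u): u = tan(theta), and the ray crosses the
    X axis at x = cam_x - u * cam_z (Equation 1)."""
    theta = pixel_angle(j, n_pixels, fov_w, alpha)
    u = math.tan(theta)
    return cam_x - u * cam_z, u
```

As a sanity check, the center pixel j = N/2 gives θ = α (the ray along the optical axis), and j = 0 gives θ = α + w/2 (the edge of the field of view).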
The CCD pixels are numbered 0, ..., N−1. Let θ be the angle that the ray passing through the j-th pixel forms with the Z axis, and let x be the position at which that ray intersects the X axis. Then [Equation 3] (N/2)·tan(θ−α) = (N/2 − j)·tan(w/2) holds. (FIG. 8 omits the Y axis.) Since Equation 3 holds for any j-th pixel on the line Y = m, the angle θ corresponding to pixel j can be obtained from it. That is, when the j-th image datum is represented by I″(j), it is converted into image data I′ in (x, θ) space and further, by u = tan θ, into image data I in (x, u) space: [Equation 4] I″(j) = I′(x, θ) = I(x, u).

u and x are each quantized so as to preserve the resolution of the original image. For the x axis, the minimum interval between two adjacent rays on the x axis is obtained and used as the quantization step. The u axis is quantized by the tangent of the angle formed by two adjacent pixels viewed from the camera viewpoint position. In this way, as shown in FIG. 10, the ray space data become discretely sampled points on a straight line.

The smoothness of movement when the user or an object moves at high speed in the virtual space depends on the drawing frame rate: the higher the frame rate, the smoother the movement appears; the lower the frame rate, the more jerky the movement and the more uncomfortable the user feels. To realize smooth movement through the virtual space even at high moving speeds, the drawing rate must therefore be raised. Reconstructing every pixel from the ray space data, however, may leave drawing unable to keep up. In this embodiment, the sampling rate in FIG. 10 is therefore changed: if sampling every second pixel is sufficient for drawing, every second sample point is used.

In this embodiment the moving speed of the user is detected from the moving speed of the mouse. If this speed is v cm/s, the drawing load at this time is taken to be kv (k a predetermined constant). The sampling rate t (one sample every t pixels) corresponding to the load kv is a function of kv; if t is set lower than this value, drawing cannot finish in time, and a suitable value can be determined in advance from the drawing speed of the system and the value of kv. Once t is determined, the ray space data are sampled at every t-th sample point in FIG. 10. Sampling every t pixels reduces the amount of data to be processed, so that movement or scrolling by the user or of an object can be followed. Even though the resolution is degraded by the reduced amount of data, while the user or object is moving the user's visual ability to discriminate the details of the moving object or scene is reduced, so the lowered resolution is not a problem.

FIG. 11 shows the system configuration of this embodiment, a normal workstation configuration. In particular, a large amount of ray space data is stored on the disk 25 as shown in FIG. 9. The virtual space is presented to the user on the CRT 23. Using the mouse 28, the user can freely walk through the virtual space and manipulate (move, rotate, etc.) objects in it; the speed of the walk-through or of the object manipulation is given by the moving speed of the mouse 28.

FIG. 12 shows the control procedure of the system of FIG. 11. The system is started in step S2. In step S4, the moving direction and moving amount v of the mouse are obtained from the user's operation of the mouse 28. In step S6, the image size is determined from the size of the display area of the CRT 23. In step S8, the sampling rate t (pixels) is determined from the moving amount v, and the ray space data are read from the disk 25 at this rate to reconstruct the image. In step S10, the image is enlarged: because the reconstruction in step S8 was performed at sampling rate t, the reconstructed image is reduced, so it is enlarged to the image size determined in step S6 using the texture mapper 24, a well-known piece of hardware. Pixel interpolation is thereby performed and display at the required resolution is realized.

The present invention can be modified in various ways. Modification 1: The above embodiment deals with reconstructing images from an image database in ray space format at the same site, but the present invention is also applicable when the database is remote, accessed via a network. When the system is used via a network, the transfer rate fluctuates with the network load, causing the same problem as when the drawing rate cannot keep up with the observer's moving speed during a walk-through: even if the image changes or moves little, the frame rate drops because, although the CPU of the receiving image processing apparatus has capacity to spare, no data arrive, and the picture becomes jerky. When the present invention is applied to this problem, setting the sampling rate of the image data between the transmitting and receiving sides according to the data transfer speed of the network makes it possible to provide images with smooth motion, at the cost of some loss of image quality.
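The steps S4-S10 above can be summarized as follows. This is a sketch under stated assumptions: the linear load model kv, the constants `k` and `t_max`, and the scanline representation are illustrative, and the embodiment's hardware texture mapper is stood in for by simple nearest-neighbour interpolation.

```python
def sample_rate(v, k=0.05, t_max=8):
    """Sampling rate t (read every t-th pixel) as a function of the load
    k * v.  The linear dependence and the constants are assumptions."""
    return max(1, min(t_max, 1 + int(k * v)))

def reconstruct_reduced(row, t):
    """Step S8: read the ray space data at every t-th sample point."""
    return row[::t]

def enlarge(row, size):
    """Step S10: enlarge back to the display size by pixel interpolation
    (nearest-neighbour here; the embodiment uses a texture mapper)."""
    return [row[min(len(row) - 1, i * len(row) // size)] for i in range(size)]

full = list(range(100))   # one scanline of fully sampled pixel data
t = sample_rate(60.0)     # fast mouse movement (v = 60 cm/s) -> coarser sampling
small = reconstruct_reduced(full, t)   # reduced-resolution reconstruction
display = enlarge(small, len(full))    # interpolated back to display size
```

At v = 0 the rate stays at 1 (full resolution); as v grows, fewer samples are read per frame, and the enlargement step restores the display size at the cost of detail that a moving observer cannot discriminate anyway.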
FIG. 13 shows the processing of this modification; in this flowchart, the same step numbers as in FIG. 12 denote the same processes. The difference from FIG. 12 is that the network data transfer rate is determined in step S31, and in the image generation processing of step S81 the image sampling rate is set with reference to that transfer rate.

Modification 2: The moving speed can be detected by means other than a mouse; it is only necessary to detect a movement speed such as that of a walk-through or of scrolling. In the above embodiment the load was assumed to increase linearly with the user's moving speed, but the present invention is not limited to this form: for example, the resolution (the total number of pixels in the image) may be set inversely proportional to the square of the speed, or actual measured values may be obtained and used.

Modification 3: In the above embodiment the ray space data are obtained by calculation, but a RAM or ROM holding a precomputed table may be used instead.

Modification 4: The display device is not limited to a CRT; the invention can also be applied to lenticular and HMD type display devices.

As described above, according to the present invention, even when a virtual space based on ray space data imposes a large processing load on the computer, the increase in load can be suppressed by lowering the sampling rate and reducing the resolution. In particular, when the increase in load is caused by movement through the virtual space or by manipulation of an object in it by a user operation, the reduction in resolution caused by the lower sampling rate is not a problem for the user's vision. Moreover, since the decrease in resolution is compensated by pixel interpolation, the deterioration in image quality can be suppressed to some extent. According to the invention described in claim 6, the ray space data can easily be read out at the set sampling rate.
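One way to realize the rate selection of steps S31/S81 in Modification 1 is sketched below. The patent does not specify a formula; the per-frame byte budget, the target frame rate, and the function name are all assumptions made for illustration.

```python
def rate_from_bandwidth(bytes_per_frame, transfer_bps, target_fps=30):
    """Pick the smallest sampling rate t such that a frame sampled at every
    t-th pixel (roughly bytes_per_frame / t bytes) fits the measured network
    transfer rate at the target frame rate.  Illustrative sketch only."""
    budget = transfer_bps / 8 / target_fps  # bytes available per frame
    t = 1
    while bytes_per_frame / t > budget:
        t += 1
    return t
```

When the network is fast enough, t stays at 1 and full-resolution frames are sent; as the measured transfer rate drops, t rises, so frames keep arriving at the target rate and the receiving side's pixel interpolation restores the display size.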

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a diagram explaining the definition of the ray space.
FIG. 2 is a diagram showing a group of rays passing through a point (X, Z) in real space.
FIG. 3 is a diagram showing the group of rays passing through a point (X, Z) in real space mapped into the ray space.
FIG. 4 is a diagram showing the sampling of a group of rays by a plurality of cameras.
FIG. 5 is a diagram showing the group of rays sampled by a plurality of cameras mapped into the ray space.
FIG. 6 is a diagram explaining the principle of reconstructing, from ray space data, the image observable from a point (X, Z) in real space.
FIG. 7 is a diagram explaining the relationship between the plurality of cameras used for sampling rays and the viewpoint position at which an image is to be reconstructed.
FIG. 8 is a diagram explaining quantization in the embodiment.
FIG. 9 is a diagram explaining the storage of ray space data in the embodiment.
FIG. 10 is a diagram explaining the sampled storage of ray space data in the embodiment.
FIG. 11 is a block diagram showing the hardware configuration of the embodiment.
FIG. 12 is a flowchart showing the control procedure of the embodiment.
FIG. 13 is a flowchart of the processing in Modification 1 of the present invention.

Continuation of front page (56) References:
"Interactive Manipulation of 3-D Real Scenes Based on Ray Description," Transactions of the IEICE, The Institute of Electronics, Information and Communication Engineers, Japan, May 25, 1998, Vol. J81-D-2, No. 5, pp. 850-860.
"Realization of Time-Critical Rendering Using Adaptive Texture Mapping," IPSJ SIG Report 96-CG-79, Information Processing Society of Japan, Japan, February 23, 1996, pp. 39-46.
(58) Fields surveyed (Int. Cl.7, DB name): G06T 15/00-17/50; H04N 13/00-13/04

Claims (13)

  (57) [Claims]
  1. An image processing apparatus for generating an image of a virtual space according to a user's operation by using ray space data, comprising: recording means for recording the ray space data; setting means for setting, in response to the user's operation, a sampling rate indicating the pixel interval at which the ray space data are read from the recording means; reconstruction means for reading the ray space data from the recording means in accordance with the sampling rate set by the setting means and reconstructing an image of the virtual space; and interpolation means for performing pixel interpolation processing so as to enlarge the image reconstructed by the reconstruction means to a set image size.
  2. The image processing apparatus according to claim 1, wherein the setting means determines the sampling rate based on a moving speed in the virtual space designated by the user.
  3. The image processing apparatus according to claim 1, wherein the setting means determines the sampling rate based on an operation speed of an object in the virtual space designated by the user.
  4. The image processing apparatus according to claim 1, wherein the ray space data are managed in an (x, u) space indicated by the position x at which a ray intersects the X axis and the tangent u of the angle formed by the ray and the Z axis.
  5. The image processing apparatus according to claim 4, wherein x is quantized with the minimum value of the interval between two adjacent rays on the x axis, and u is quantized with the tangent of the angle formed by two adjacent pixels viewed from the camera viewpoint position.
  6. The image processing apparatus according to claim 4, wherein the ray space data read by the reconstruction means are data sampled discretely on a straight line in the (x, u) space in accordance with the sampling rate set by the setting means.
  7. An image processing method performed by an image processing apparatus that generates an image of a virtual space according to a user's operation by using ray space data recorded in a memory, comprising: a setting step of setting, in response to the user's operation, a sampling rate indicating the pixel interval at which the ray space data are read from the memory; a reconstruction step of reading the ray space data from the memory in accordance with the sampling rate set in the setting step and reconstructing an image of the virtual space; and an interpolation step of performing pixel interpolation processing so as to enlarge the image reconstructed in the reconstruction step to a set image size.
  8. The image processing method according to claim 7, wherein in the setting step the sampling rate is determined based on a moving speed in the virtual space designated by the user.
  9. The image processing method according to claim 7, wherein in the setting step the sampling rate is determined based on an operation speed of an object in the virtual space designated by the user.
  10. The image processing method according to claim 7, wherein the ray space data are managed in an (x, u) space indicated by the position x at which a ray intersects the X axis and the tangent u of the angle formed by the ray and the Z axis.
  11. The image processing method according to claim 10, wherein x is quantized with the minimum value of the interval between two adjacent rays on the x axis, and u is quantized with the tangent of the angle formed by two adjacent pixels viewed from the camera viewpoint position.
  12. The image processing method according to claim 10, wherein the ray space data read in the reconstruction step are data sampled discretely on a straight line in the (x, u) space in accordance with the sampling rate set in the setting step.
  13. A computer-readable storage medium storing a program for causing a computer to function as an image processing apparatus that generates an image of a virtual space according to a user's operation by using ray space data recorded in a memory, the program comprising: a setting step of setting, in response to the user's operation, a sampling rate indicating the pixel interval at which the ray space data are read from the memory; a reconstruction step of reading the ray space data from the memory in accordance with the sampling rate set in the setting step and reconstructing an image of the virtual space; and an interpolation step of performing pixel interpolation processing so as to enlarge the image reconstructed in the reconstruction step to a set image size.
JP2000082610A 1999-03-26 2000-03-23 Image processing method, apparatus and storage medium Expired - Fee Related JP3403143B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP8466099 1999-03-26
JP11-84660 1999-03-26
JP2000082610A JP3403143B2 (en) 1999-03-26 2000-03-23 Image processing method, apparatus and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2000082610A JP3403143B2 (en) 1999-03-26 2000-03-23 Image processing method, apparatus and storage medium

Publications (2)

Publication Number Publication Date
JP2000348201A JP2000348201A (en) 2000-12-15
JP3403143B2 true JP3403143B2 (en) 2003-05-06

Family

ID=26425654

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000082610A Expired - Fee Related JP3403143B2 (en) 1999-03-26 2000-03-23 Image processing method, apparatus and storage medium

Country Status (1)

Country Link
JP (1) JP3403143B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6526051B2 (en) * 2014-12-12 2019-06-05 キヤノン株式会社 Image processing apparatus, image processing method and program

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Interactive Manipulation of 3-D Real Scenes Based on Ray Description" (光線記述に基づく3次元実写空間のインタラクティブ操作), Transactions of the IEICE, Japan, The Institute of Electronics, Information and Communication Engineers, May 25, 1998, Vol. J81-D-2, No. 5, pp. 850-860
"Realization of Time-Critical Rendering Using Adaptive Texture Mapping" (適応的テクスチャマッピングを用いたTime Critical Renderingの実現), IPSJ SIG Report 96-CG-79, Japan, Information Processing Society of Japan, February 23, 1996, pp. 39-46

Also Published As

Publication number Publication date
JP2000348201A (en) 2000-12-15

Similar Documents

Publication Publication Date Title
US9443555B2 (en) Multi-stage production pipeline system
Wu et al. Light field image processing: An overview
US9438878B2 (en) Method of converting 2D video to 3D video using 3D object models
US10540818B2 (en) Stereo image generation and interactive playback
US20160307368A1 (en) Compression and interactive playback of light field pictures
US8953905B2 (en) Rapid workflow system and method for image sequence depth enhancement
US8928730B2 (en) Method and system for correcting a distorted input image
US20140348238A1 (en) Object tracking using graphics engine derived vectors in a motion estimation system
US8284258B1 (en) Unusual event detection in wide-angle video (based on moving object trajectories)
Birklbauer et al. Panorama light‐field imaging
EP2328125B1 (en) Image splicing method and device
Chen Quicktime VR: An image-based approach to virtual environment navigation
Patti et al. Superresolution video reconstruction with arbitrary sampling lattices and nonzero aperture time
Mann et al. Video orbits of the projective group a simple approach to featureless estimation of parameters
Teodosio et al. Salient video stills: Content and context preserved
Gross et al. blue-c: a spatially immersive display and 3D video portal for telepresence
US8078006B1 (en) Minimal artifact image sequence depth enhancement system and method
US6573912B1 (en) Internet system for virtual telepresence
US8396328B2 (en) Minimal artifact image sequence depth enhancement system and method
DE69932619T2 (en) Method and system for recording and representing three-dimensional geometry, color and shadowing of animated objects
US4736436A (en) Information extraction by mapping
AU652051B2 (en) Electronically interpolated integral photography system
Zhang Image processing
TWI423659B (en) Image corretion method and related image corretion system thereof
US5974194A (en) Projection based method for scratch and wire removal from digital images

Legal Events

Date Code Title Description
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20030203

R150 Certificate of patent or registration of utility model

Ref document number: 3403143

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20080229

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090228

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100228

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110228

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120229

Year of fee payment: 9

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130228

Year of fee payment: 10

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140228

Year of fee payment: 11

LAPS Cancellation because of no payment of annual fees