WO2015163317A1 - Video display device, video projection device, dynamic illusion presentation device, video generation device, methods thereof, data structure, and program - Google Patents

Video display device, video projection device, dynamic illusion presentation device, video generation device, methods thereof, data structure, and program

Info

Publication number
WO2015163317A1
WO2015163317A1 (PCT/JP2015/062093)
Authority
WO
WIPO (PCT)
Prior art keywords
image
video
component
luminance
absolute value
Prior art date
Application number
PCT/JP2015/062093
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Takahiro Kawabe (河邉隆寛)
Shin'ya Nishida (西田眞也)
Kazushi Maruya (丸谷和史)
Masataka Sawayama (澤山正貴)
Original Assignee
Nippon Telegraph and Telephone Corporation (日本電信電話株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corporation
Priority to JP2016514945A priority Critical patent/JP6425312B2/ja
Priority to EP19188084.8A priority patent/EP3637410A1/en
Priority to US15/306,011 priority patent/US10571794B2/en
Priority to EP15783584.4A priority patent/EP3136717A4/en
Priority to CN201580021244.0A priority patent/CN106233716B/zh
Publication of WO2015163317A1 publication Critical patent/WO2015163317A1/ja
Priority to US16/739,414 priority patent/US11036123B2/en
Priority to US17/243,654 priority patent/US20210247686A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B25/00Viewers, other than projection viewers, giving motion-picture effects by persistence of vision, e.g. zoetrope
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63JDEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J13/00Panoramas, dioramas, stereoramas, or the like
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63JDEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J21/00Conjuring appliances; Auxiliary apparatus for conjurers
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63JDEVICES FOR THEATRES, CIRCUSES, OR THE LIKE; CONJURING APPLIANCES OR THE LIKE
    • A63J5/00Auxiliaries for producing special effects on stages, or in circuses or arenas
    • A63J5/02Arrangements for making stage effects; Auxiliary stage appliances
    • A63J5/021Mixing live action with images projected on translucent screens
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • G03B21/32Details specially adapted for motion-picture projection
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37Details of the operation on graphic patterns
    • G09G5/377Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • H04N5/145Movement estimation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00Advertising or display means not otherwise provided for
    • G09F19/12Advertising or display means not otherwise provided for using special optical effects
    • G09F19/18Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/10Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels

Definitions

  • the present invention relates to a technique that provides a visual illusion.
  • Non-Patent Document 1 discloses a technique that measures the three-dimensional shape of an object (of a single achromatic color) with a camera and projects an image representing movement, matched to that three-dimensional shape, onto the object, thereby giving a dynamic movement effect.
  • In Non-Patent Document 1, a flat single achromatic plane and a single achromatic automobile model placed on it are used as the objects, and the objects are treated as a canvas: a color pattern simulating the daylight conditions of a road around the car and of the space the car drives through is projected onto them as an image, creating the illusion that the car model is running on a road. A movement effect is thus given to the object.
  • In other words, Non-Patent Document 1 regards the object as a mere canvas and gives it a movement effect without using the object's own pattern.
  • An object of the present invention is to make an observer feel as if motion has been imparted to a target object, by using the target object's own pattern.
  • This video is a video including a luminance motion component corresponding to the motion given to the object.
  • FIG. 15 is a block diagram illustrating a functional configuration of the embodiment.
  • FIG. 16 is a flowchart illustrating the processing of the embodiment.
  • FIG. 17 is a flowchart illustrating the processing of the embodiment.
  • FIGS. 18A and 18B are diagrams illustrating a method of adding a luminance motion component to an image.
  • FIG. 19 is a block diagram illustrating a functional configuration of the embodiment.
  • In the present invention, the apparent pattern of the object is illusorily deformed while the object's own pattern is used.
  • That is, the object is not regarded as a mere canvas; rather, an illusion of movement is generated by actively using the object's pattern. For this reason, it is preferable that the object is not a single achromatic color but is chromatic or grayscale.
  • The three-dimensionality of the object need not be considered. For example, the appearance of an image obtained by photographing a stationary object can be illusorily deformed.
  • FIG. 1 is a block diagram showing the configuration of the video display device 1 of the present embodiment.
  • FIG. 2 is a flowchart showing the operation of the video display device 1 of the present embodiment.
  • the video display device 1 includes a display unit 11.
  • the display unit 11 displays a transparent video superimposed on the object.
  • The "object" may be an object having a three-dimensional shape (for example, a vase, a ball, or a model) or a predetermined plane (for example, paper, a board, a wall, or a screen).
  • When the object is a plane, it is desirable that the plane include a pattern.
  • Examples of the pattern included in the plane are a photograph or picture printed on paper and a photograph or image projected onto a predetermined plane.
  • Other examples of the pattern include an image shown on a screen such as a display.
  • As a method of "superimposing and displaying" a transparent video on an object, there is typically a method of projecting the video onto the object with a projector or the like.
  • A video projected by a projector or the like naturally has transparency.
  • Alternatively, a transparent liquid crystal screen may be installed in front of the object and the transparent video played on it, so that an observer facing the object sees the object through the liquid crystal screen with the video superimposed.
  • the “video” is, for example, a video in which an image of a distortion distribution having a low spatial frequency component is temporally switched.
  • video is prepared in advance and is input from the outside of the video display device 1.
  • the display unit 11 displays the video so that edges included in the video overlap the outline of the object or edges included in the object.
  • For example, when the object is a vase, the display unit 11 displays the video so that edges included in the video overlap edges of the vase, such as its outline or a pattern drawn on it.
  • When the object is a plane, the display unit 11 displays the video so that edges included in the video overlap edges included in the object, such as the pattern of an image projected onto the object.
  • In this way, the display unit 11 superimposes on the object a transparent video in which images of a distortion distribution having low spatial frequency components are switched temporally, so that edges included in the video overlap the outline of the object or edges included in the object (S11).
  • FIG. 3 is a block diagram showing the configuration of the video projection apparatus 1a according to the example of the present embodiment.
  • FIG. 4 is a flowchart showing the operation of the image projection apparatus 1a of the example of the present embodiment.
  • the video projection device 1a includes a projection unit 11a.
  • the projection unit 11a projects a video, in which images of a distortion distribution having low spatial frequency components are switched temporally, superimposed on the stationary object so that edges included in the video overlap the outline of the object or edges included in the object (S11a).
  • the projection unit 11a can be realized by a projector, for example.
  • In this case, the horizontal viewing angle θ and the vertical viewing angle φ of the object 9 viewed from the center of the projector's projection lens must match the horizontal and vertical viewing angles of the projected video.
  • The projection unit 11a is taken as an example of the display unit 11 in the second, third, and fourth embodiments described later, and each block diagram representing each device in those embodiments shows the projection unit 11a included as an example of the display unit 11.
  • With the technique of Non-Patent Document 1, it is not easy to produce the special illusion that the medium around the object (for example, air, water, or water vapor) appears to fluctuate irregularly.
  • Simulating fluctuation of a medium, refraction of light rays, and the like requires enormous computation.
  • In contrast, the present embodiment can give a special illusion to an object with a small amount of computation.
  • Moreover, an illusion can be given to planar objects, three-dimensional objects, chromatic objects, and achromatic, shaded objects.
  • FIG. 6 is a block diagram showing a configuration of the video display device 2 of the present embodiment.
  • FIG. 7 is a flowchart showing the operation of the video display device 2 of the present embodiment.
  • the video display device 2 includes the same display unit 11, imaging unit 21, and video generation unit 22 as those in the first embodiment.
  • the imaging unit 21 captures an object and acquires an original image (S21).
  • the video generation unit 22 generates a plurality of different narrowband images from the original image, and generates a video in which the generated different narrowband images are arranged so as to be smoothly connected in time (S22).
  • the display unit 11 displays the generated video so as to overlap the object (S11).
  • A narrowband image is an image whose overall spatial frequency band is narrower than that of the original image while retaining the edge information included in the original image; it is a transparent image.
  • For example, a video giving an illusory impression of motion may be generated by convolving the original image with orientation filters that are 180 degrees out of phase and connecting the convolved images smoothly in time.
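The orientation-filter construction above can be sketched as follows. This is an illustrative reading of the second embodiment, not code from the patent: it assumes a single fixed orientation, grayscale input in [0, 1], and Gabor kernels whose carriers are 180 degrees out of phase; the kernel parameters (`size`, `wavelength`, `sigma`) and the cosine crossfade are arbitrary illustrative choices.

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(size=31, wavelength=8.0, sigma=4.0, phase=0.0):
    """Gabor kernel at a fixed (vertical) orientation with a given carrier phase."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x / wavelength + phase)
    return envelope * carrier

def narrowband_pair(image):
    """Two narrowband images filtered 180 degrees out of phase."""
    k0 = gabor_kernel(phase=0.0)
    k1 = gabor_kernel(phase=np.pi)  # opposite-phase filter
    a = fftconvolve(image, k0, mode="same")
    b = fftconvolve(image, k1, mode="same")
    return a, b

def illusion_frames(image, n_frames=30):
    """Crossfade the two narrowband images so they connect smoothly in time."""
    a, b = narrowband_pair(image)
    frames = []
    for t in range(n_frames):
        w = 0.5 + 0.5 * np.cos(2.0 * np.pi * t / n_frames)
        frames.append(w * a + (1.0 - w) * b)
    return np.stack(frames)
```

Displaying `illusion_frames(img)` superimposed on the original image at video frame rates is what would produce the motion impression; viewed alone, the filtered frames merely oscillate in contrast.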
  • FIG. 8 is a block diagram showing a configuration of the video display device 3 of the present embodiment.
  • FIG. 9 is a flowchart showing the operation of the video generation unit 32 of the video display device 3 of the present embodiment.
  • the video display device 3 includes the same display unit 11 and photographing unit 21 as in the second embodiment, and a video generation unit 32 different from that in the second embodiment.
  • the video generation unit 32 includes a modified video generation unit 321, a Fourier transform unit 322, and a separation unit 323.
  • the modified video generation unit 321 creates a video in which the static target image is deformed, by dynamically deforming the original image obtained by photographing the object (S321).
  • The deformation is represented by pixel movement.
  • This pixel movement is based on a pre-calculated algorithm.
  • Reference Document 1: Japanese Patent Application No. 2013-132609 (an unpublished patent application).
  • Reference Document 2: Kawabe, T., Maruya, K., & Nishida, S. (2013). Seeing transparent liquids from dynamic image distortion. Journal of Vision, 13(9): 208.
  • the modified video generation unit 321 generates a plurality of modulated images obtained by modulating the original image based on the distortion distribution.
  • It is desirable that the distortion distribution (distortion map) used to modulate the original image have a spatial frequency of 3 cpd (cycles per degree) or less. In other words, when coarse distortion that keeps the difference in displacement between adjacent pixels small is added to the original image, the impression of a transparent liquid becomes stronger.
  • The original image may be modulated using distortion distributions having the same low spatial frequency components (for example, 3 cpd or less), or using distortion distributions having different low spatial frequency components (for example, 2 cpd or less for one and 3 cpd or less for the other).
  • A distortion direction in two or more dimensions is given to the modulated image. Any kind of distortion may be used as long as it is a two-dimensional geometric distortion, such as rotational distortion, translational distortion, or random distortion.
  • the modified video generation unit 321 generates a video based on a plurality of modulated images generated from the original image.
  • the modified video generation unit 321 may generate an ordered modulated image sequence as a video so that a plurality of modulated images generated from the original image are temporally switched and presented.
  • The modulated image sequence is a video composed by arranging the modulated images in order, with the presentation time (frame rate) of each image set within a range in which the sequence is perceived by the viewer as a moving image rather than as a series of still images.
  • the modified video generation unit 321 may perform control to switch and present each of a plurality of modulated images generated from the original image.
  • The presentation interval of each image may be controlled within a range in which the sequence is perceived by the viewer as a moving image (video) rather than as a series of still images.
  • For example, the presentation time of each image may be set not to exceed 0.05 s, that is, a frame rate of 20 Hz or higher.
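A minimal sketch of the distortion-modulation step S321 under stated assumptions: the low-spatial-frequency distortion map is approximated here by Gaussian-smoothed random noise (an explicit 3 cpd cutoff would require the viewing distance and pixel pitch, which are not fixed in this sketch), and the warp amplitude of a few pixels is an arbitrary illustrative choice.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def distortion_map(shape, sigma=20.0, amplitude=4.0, rng=None):
    """Low-spatial-frequency displacement field from smoothed random noise.
    A large sigma keeps the distortion coarse (small displacement differences
    between neighbouring pixels), in the spirit of the <= 3 cpd guideline."""
    rng = np.random.default_rng() if rng is None else rng
    dx = gaussian_filter(rng.standard_normal(shape), sigma)
    dy = gaussian_filter(rng.standard_normal(shape), sigma)
    # normalise each field to the requested peak amplitude (in pixels)
    dx *= amplitude / (np.abs(dx).max() + 1e-12)
    dy *= amplitude / (np.abs(dy).max() + 1e-12)
    return dx, dy

def modulate(image, dx, dy):
    """Warp the original (grayscale) image by the displacement field."""
    ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    coords = np.stack([ys + dy, xs + dx])
    return map_coordinates(image, coords, order=1, mode="reflect")

def modulated_sequence(image, n_frames=20, rng=None):
    """A sequence of modulated images; presented at >= 20 Hz this reads as video."""
    rng = np.random.default_rng(0) if rng is None else rng
    return np.stack([modulate(image, *distortion_map(image.shape, rng=rng))
                     for _ in range(n_frames)])
```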
  • the Fourier transform unit 322 performs a three-dimensional (space-time) Fourier transform on the image created by the modified image generation unit 321 (S322).
  • the separation unit 323 separates the DC component (stationary component) and the motion component by temporal filtering, and outputs only the motion component as an image (S323).
  • the modified video generation unit 321 generates a video in which a plurality of image groups obtained by moving the pixels of the original image based on a predetermined algorithm are arranged so as to be smoothly connected in time (S321). .
  • the Fourier transform unit 322 performs three-dimensional (time-space) Fourier transform on the generated video (S322).
  • the separation unit 323 separates the stationary component and the motion component based on the Fourier transform result, and outputs only the motion component as an image (S323).
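Steps S322-S323 can be sketched as below for a grayscale video stored as a (t, y, x) array: the static (DC) component is the temporal-zero-frequency plane of the 3-D spectrum, and the motion component is everything else. This is a minimal reading of the separation step, not the patent's implementation.

```python
import numpy as np

def split_static_motion(video):
    """Separate a (t, y, x) video into its static component (temporal DC)
    and its motion component using a 3-D (space-time) Fourier transform."""
    spectrum = np.fft.fftn(video)            # FFT over (t, y, x)
    static_spec = np.zeros_like(spectrum)
    static_spec[0] = spectrum[0]             # keep only temporal frequency 0
    motion_spec = spectrum - static_spec     # everything that moves
    static = np.fft.ifftn(static_spec).real  # the same mean image each frame
    motion = np.fft.ifftn(motion_spec).real
    return static, motion
```

By construction, `static + motion` reconstructs the input, and every frame of `static` equals the temporal mean image.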
  • FIG. 10 shows an example of a stationary component (DC component) and a motion component extracted in steps S321 to S323 described above.
  • the motion component is mainly composed of high spatial frequency components.
  • The motion component is displayed superimposed on the object as a video with the same viewing angle as the object, so that the phase relationship between the two matches that of the original moving image.
  • The spatial structure of the object is illusorily captured by the motion component; as a result, the object appears to move.
  • FIG. 11 is a block diagram showing the configuration of the video display device 4 of the present embodiment.
  • FIG. 12 is a flowchart showing the operation of the video display device 4 of the present embodiment.
  • the video display device 4 of the present embodiment includes the same display unit 11, imaging unit 21, and video generation unit 32 as the third embodiment, plus an original image display unit 41 not present in the third embodiment.
  • the imaging unit 21 and the video generation unit 32 execute steps S21 and S32, respectively, as in the third embodiment.
  • the original image display unit 41 displays (projects) the original image acquired by photographing the object in step S21 on another stationary medium (for example, a display, a screen, or the like).
  • the display unit 11 superimposes and displays (projects) the video on the original image displayed on the stationary medium in this way, whereby the same effect as above is obtained.
  • The video may be created by convolving the image with an orientation filter as shown in the second embodiment, or by deforming the image as shown in the third embodiment.
  • the original image can be printed on the object, projected by another projector, or displayed on electronic paper.
  • A projector can be used as the projection unit 11a of the video projection device 4a, which is an example of the present embodiment. This may be a commercially available projector, but use in a bright room requires a projector with high luminance.
  • the ideal viewing distance varies depending on the size of the object. For example, when a motion is added to a 16 cm square image using the method of this embodiment, a viewing distance of about 1.5 m is required. The larger the target, the longer the viewing distance.
  • With this technique, the expression of a still face image can be changed illusorily, or its gaze direction can be changed. Further, by changing the movement pattern, the object can be made to appear to flutter, or to appear as if it lies under flowing liquid.
  • When the object is on a two-dimensional medium such as a printed material, the dynamic range of the object's luminance is generally narrow; in this embodiment, a video with a dynamic range higher than that of the object is projected onto it, so an illusion can be added to the object while improving its appearance.
  • With the video display apparatus of this embodiment, aside from processing time, an illusion of motion can be added to a camera-captured image essentially immediately.
  • the video display device of this embodiment can be applied as an exhibition technique in art museums and museums, and as an element technology for attractions used in entertainment facilities. For example, a child character printed on a stationary medium such as paper can be visually moved.
  • Conventionally, to make an image appear to move at a given temporal frequency, a color difference (chrominance) motion component corresponding to that temporal frequency of the moving image is required.
  • The following describes an embodiment that creates the illusion that an image is moving at or above a predetermined temporal frequency without using a color difference motion component at or above that frequency.
  • The human visual system is insensitive to motion signals defined by color signals but sensitive to motion signals defined by luminance signals (Reference 6: Ramachandran, V. S., & Gregory, R. L., "Does colour provide an input to human motion perception?" Nature (1978), 275, 55-56.).
  • Human spatial resolution for objects moving above a certain speed is lower than that for stationary objects (Reference 7: Kelly, D. H. (1979)).
  • An "image whose absolute value of temporal frequency is equal to or less than a first value" includes a spatial frequency component whose absolute value is greater than zero; the "luminance motion component" is a component, corresponding to the "image", whose absolute value of temporal frequency is equal to or greater than a second value; and the "second value" is larger than the "first value".
  • "Including a spatial frequency component whose absolute value is greater than zero" means including a non-zero spatial frequency component. For example, when the "first value" is F1 and the "second value" is F2, the relationship 0 ≤ F1 < F2 is satisfied.
  • An "image" may have only a single temporal frequency component (for example, 0 Hz) or may have a plurality of temporal frequency components (that is, it may be a composite of images at a plurality of temporal frequencies).
  • A component whose absolute value of temporal frequency is equal to or less than the first value is referred to as a "low temporal frequency component", and a component whose absolute value of temporal frequency is equal to or greater than the second value is referred to as a "high temporal frequency component".
  • the “image” is not uniform and includes a spatial frequency component whose absolute value is greater than zero (a non-zero spatial frequency component).
  • Both the color difference component and the luminance component of the "image" may include spatial frequency components whose absolute values are greater than zero, or only one of the color difference component and the luminance component may include such a component.
  • A human who sees a video in which a "luminance motion component" has been added to such an "image" has the illusion (dynamic illusion) that the "image" is moving at a temporal frequency higher than that of the "first value". For example, when a "luminance motion component" is added to a stationary "image", the human has the illusion that this "image" is moving.
  • Here, "frequency α is higher than frequency β" means that the absolute value of α is larger than the absolute value of β.
  • the “image” may be an achromatic (grayscale) image including only a luminance component or an image including a chromatic color (color).
  • That is, a chromatic image appears to move at a frequency higher than its low temporal frequency component, even though it contains no color difference component at a frequency higher than the low temporal frequency component.
  • the “image” may be an image (image information) that is an object of information processing, or may be an image that appears on the surface of the object.
  • A pattern that appears on the surface of an object may be a picture, photograph, or design "printed", "drawn", "displayed", or "projected" on the surface, based on the color of the material making up the surface, or a pattern based on the shape of the surface (for example, a texture, boundary line, or shading).
  • the “surface of the object” may be a flat surface, a curved surface, or an uneven surface.
  • the “object” may be an object having a three-dimensional shape (for example, a vase, a ball, a model, a building), or an object that can be regarded as a plane for use (for example, paper, board, wall, screen, screen) , A transmissive display).
  • the “image” and the “luminance motion component” correspond to the same “moving image”.
  • The "luminance motion component" corresponds to a luminance component of the frames of the "moving image" whose absolute value of temporal frequency is positive.
  • When the "moving image" includes a "periodic or repetitive motion component", a greater effect can be expected.
  • the “periodic motion component” means not only a component that strictly performs periodic motion but also a component that performs motion with high periodicity.
  • the “repetitive motion component” means not only a component that performs exact repetitive motion but also a component that performs highly repetitive motion.
  • the “image” may correspond to a still image of an arbitrary frame included in the “moving image”.
  • "A corresponds to B" may mean that A is B, that A is derived from (based on) B, or that B is derived from A.
  • "A is derived from B" may mean that A is obtained from B, that A is obtained from a copy of B, or that A is obtained from an approximation of B.
  • The "image" and the "luminance motion component" may be extracted from a "moving image" or a duplicate thereof; alternatively, a "moving image" may be generated from a static "image" captured by a camera or scanner, and the "luminance motion component" extracted from that "moving image".
  • The response of motion vision saturates for luminance contrast above a certain level (Reference 8: Pantle, A., & Sekuler, R. (1969). Contrast response of human visual mechanisms sensitive to orientation and direction of motion. Vision Research, 9, 397-406.).
  • Also, the spatial resolution of motion vision is lower than the spatial resolution of perception for still images (Reference 7). That is, even if the quality of the motion information itself is lowered by manipulating the spatial resolution and contrast of the luminance motion component extracted from a moving image, the perceived quality of the moving image is maintained. Therefore, a sufficient dynamic illusion can be caused even when a version of the luminance motion component of the "moving image" with high spatial frequency components removed or with reduced contrast is used as the "luminance motion component". The amount of information can thereby be reduced without substantially reducing the degree of dynamic illusion.
  • "Adding a luminance motion component to an image" means, for example, combining the luminance motion component with the image, superimposing it on the image, integrating it into the image, reflecting it in the image, incorporating it into the image, or applying an operation including at least addition, multiplication, or exponentiation to the pixel values of the image and the luminance motion component.
  • a specific method for “adding a luminance motion component to an image” will be described later.
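Since the specific method is described later, the following is only a minimal sketch of the simplest reading, pixelwise addition: a (t, y, x) luminance motion component is added equally to the R, G, and B channels of a static (y, x, 3) image, which shifts luminance while leaving the color differences unchanged. The array shapes and the [0, 1] value range are assumptions of this sketch, not the patent's specification.

```python
import numpy as np

def add_luminance_motion(image, luminance_motion):
    """Add a (t, y, x) luminance motion component to a static (y, x, 3)
    RGB image, yielding a (t, y, x, 3) video.  Adding the same value to
    R, G, and B shifts luminance while leaving the colour-difference
    signals (e.g. R - Y, B - Y) unchanged."""
    video = image[None, :, :, :] + luminance_motion[:, :, :, None]
    return np.clip(video, 0.0, 1.0)  # keep pixel values displayable
```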
  • The "low temporal frequency component" and the "high temporal frequency component" are extracted from the moving image M1.
  • The "low temporal frequency component" is the "image", and the luminance component extracted from the "high temporal frequency component" is the "luminance motion component".
  • The color difference component of the "high temporal frequency component" is not used.
  • An operation that integrates (adds) the "luminance motion component" into the "image" is performed, and the resulting moving image M2 is displayed (FIG. 13A). In the visual system of a human who sees the moving image M2, the "image" and the "luminance motion component" are integrated, giving the illusion that the "image" is moving at a temporal frequency higher than the "low temporal frequency".
  • In other words, the quality of the perceived moving image is maintained even if the color difference components of the "high temporal frequency component" are removed from the moving image M1.
  • the moving image component extraction device 51 of this embodiment includes a low temporal frequency component extraction unit 511 (first processing unit), a luminance motion component extraction unit 512 (second processing unit), and an output unit 513.
  • the dynamic illusion presentation device 52 (illusion presentation device) of this embodiment includes an input unit 521, a calculation unit 522, and a display unit 523.
  • The moving image component extraction device 51, including the low temporal frequency component extraction unit 511, is configured, for example, by a computer having a processor (hardware processor) such as a CPU (central processing unit) and memory such as RAM (random access memory) and ROM (read only memory) executing a predetermined program.
  • The computer may include a single processor and memory, or a plurality of processors and memories.
  • The program may be installed in the computer, or may be recorded in the ROM or the like in advance.
  • Some or all of the processing units may be configured using electronic circuitry that realizes the processing functions without a program, instead of circuitry such as a CPU that realizes a functional configuration by reading a program.
  • Electronic circuitry constituting one device may include a plurality of CPUs.
  • Video M 1 of the present embodiment is a color moving image including a chromatic color, and a pixel value that represents the R channel, G channel, the intensity variation of B-channel (formula (1)).
  • ⁇ R (x, y, t) ⁇ , ⁇ G (x, y, t) ⁇ and ⁇ B (x, y, t) ⁇ are pixel values R (x, y, t) and G, respectively. It is a three-dimensional matrix having two-dimensional space and one-dimensional information with (x, y, t) and B (x, y, t) as elements.
  • Pixel values R (x, y, t), G (x, y, t), and B (x, y, t) are respectively the horizontal position x, vertical position y, and R channel, G channel at frame number t, Represents the intensity of the B channel.
  • x, y, and t are integers indicating the horizontal position, the vertical position, and the frame number when a moving image is expressed in a three-dimensional coordinate system.
  • The lower and upper limits of the horizontal position are x min and x max (x min < x max ), those of the vertical position are y min and y max (y min < y max ), and those of the frame number are t min and t max (t min < t max ).
  • the moving image M 1 preferably includes a frame (non-uniform frame) having a spatial frequency component (non-zero spatial frequency component) whose absolute value is greater than zero.
  • The pixel values R (x, y, t), G (x, y, t), and B (x, y, t) constituting the moving image M 1 (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ) are input to the low temporal frequency component extraction unit 511 and the luminance motion component extraction unit 512 of the moving image component extraction device 51 (FIG. 15).
  • The low temporal frequency component extraction unit 511 obtains and outputs the static components R static (x, y), G static (x, y), and B static (x, y) from the pixel values R (x, y, t), G (x, y, t), and B (x, y, t) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ).
  • The time averages of the pixel values R (x, y, t), G (x, y, t), and B (x, y, t) are used as R static (x, y), G static (x, y), and B static (x, y), respectively (formula (2)).
  • Mean [x (t)] a≤t≤b represents the average value of x (a), ..., x (b).
  • The low temporal frequency component extraction unit 511 outputs a static image M static (formula (3)) composed of R static (x, y), G static (x, y), and B static (x, y) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max ).
  • The image M static is composed of components whose temporal frequency over the plurality of frames t (t min ≤ t ≤ t max ) of the moving image M 1 is zero. It is an example of an "image whose absolute value of temporal frequency is equal to or less than a first value".
  • the image M static preferably includes a spatial frequency component whose absolute value is greater than zero, and preferably includes a chromatic color (step S511).
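The static-component extraction of formulas (2)–(3) is, computationally, a per-pixel time average. A minimal sketch in NumPy (the array layout with frames on the first axis is an assumption for illustration, not part of the patent):

```python
import numpy as np

def extract_static_image(video):
    """Formulas (2)-(3): per-pixel temporal average of an RGB video.

    video: float array of shape (T, H, W, 3) holding R, G, B pixel values.
    Returns M_static of shape (H, W, 3), i.e. for each channel
    R_static(x, y) = Mean[R(x, y, t)] over t_min <= t <= t_max.
    """
    return video.mean(axis=0)

# Example: pixel values oscillating symmetrically around 0.5 average to 0.5.
t = np.arange(8)[:, None, None, None]
video = 0.5 + 0.25 * np.sin(2 * np.pi * t / 8) * np.ones((8, 4, 4, 3))
m_static = extract_static_image(video)
```

Because the sine completes a whole period over the 8 frames, the oscillating term averages out and only the static component remains.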
  • The luminance motion component extraction unit 512 sets the pixel values R (x, y, t), G (x, y, t), and B (x, y, t) as R original (x, y, t), G original (x, y, t), and B original (x, y, t) (step S5121), and performs weighted addition of these in accordance with the degree to which each color contributes to luminance, thereby obtaining the luminance component Y original (x, y, t) of the video M 1 (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ) (step S5122) (formula (4)).
  • Y original (x, y, t) = α R ·R original (x, y, t) + α G ·G original (x, y, t) + α B ·B original (x, y, t)   (4)
  • The luminance motion component extraction unit 512 subtracts the luminance static component Y static (x, y) from the luminance component Y original (x, y, t) of each frame t, thereby obtaining and outputting the luminance motion component Y motion (x, y, t) (formula (5)).
  • The luminance static component Y static (x, y) is obtained by time-averaging the luminance component Y original (x, y, t) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ).
  • Y motion (x, y, t) = Y original (x, y, t) − Y static (x, y)   (5)
  • The luminance motion component Y motion (x, y, t) is a luminance component whose absolute value of temporal frequency over the plurality of frames t (t min ≤ t ≤ t max ) of the moving image M 1 is positive. It is an example of a "luminance motion component whose absolute value of temporal frequency is equal to or greater than a second value". This luminance motion component corresponds to the video, and the second value is larger than the first value (step S5123).
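Steps S5121–S5123 (formulas (4)–(5)) amount to a weighted RGB sum followed by subtraction of its temporal mean. A sketch under the assumption of Rec. 601 luma weights for α R , α G , α B (the patent only requires them to be constants):

```python
import numpy as np

# Assumed weighting coefficients alpha_R, alpha_G, alpha_B (Rec. 601 luma).
ALPHA = np.array([0.299, 0.587, 0.114])

def luminance_motion(video):
    """Formulas (4)-(5): luminance motion component of an RGB video.

    video: float array (T, H, W, 3). Returns Y_motion of shape (T, H, W):
    the per-frame luminance minus the luminance static component.
    """
    y_original = video @ ALPHA          # formula (4): weighted addition
    y_static = y_original.mean(axis=0)  # time average -> Y_static(x, y)
    return y_original - y_static        # formula (5)

video = np.random.default_rng(0).random((6, 2, 2, 3))
y_motion = luminance_motion(video)
```

By construction the motion component has zero temporal mean at every pixel, which is what makes it a pure "motion" term.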
  • The image M static and the luminance motion component Y motion (x, y, t) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ) are input to the output unit 513, from which they are sent to the dynamic illusion presentation device 52.
  • the image M static and the luminance motion component Y motion (x, y, t) are input to the input unit 521 of the dynamic illusion presentation device 52 and sent to the calculation unit 522.
  • the calculation unit 522 obtains and outputs a moving image M 2 in which the luminance motion component Y motion (x, y, t) is added to the image M static .
  • The calculation unit 522 adds the luminance motion component Y motion (x, y, t) to each of the static components R static (x, y), G static (x, y), and B static (x, y) of the image M static (where x min ≤ x ≤ x max , y min ≤ y ≤ y max ) to obtain the moving image M 2 (FIG. 18A, formula (6)).
  • This creates a moving image M 2 from which the color motion information of the moving image M 1 has been removed (step S522).
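The addition of formula (6) simply adds the scalar luminance motion value of a pixel to all three of its color channels; a sketch (NumPy layout assumed):

```python
import numpy as np

def add_motion(static_rgb, y_motion):
    """Formula (6): add the luminance motion component to each RGB channel.

    static_rgb: (H, W, 3) static image M_static; y_motion: (T, H, W).
    Returns M2 of shape (T, H, W, 3). The same scalar is added to the R, G
    and B values of a pixel, which is why the chromaticity (RGB ratio) can
    deviate slightly from the original where Y_motion is nonzero.
    """
    return static_rgb[None] + y_motion[..., None]

m2 = add_motion(np.zeros((2, 2, 3)), np.ones((3, 2, 2)))
```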
  • In this case, the chromaticity (RGB ratio) of each pixel of the moving image M 2 deviates slightly from that of the moving image M 1 (the original moving image).
  • To avoid this, each RGB channel may instead be multiplied by a modulation ratio Y ′ motion (x, y, t) derived from the motion of the luminance static component Y static (x, y) (formula (7), FIG. 18B). In this way, luminance motion components that do not involve modulation of the color signal can be synthesized (step S522).
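A sketch of the multiplicative variant of formula (7). The concrete form of the modulation ratio, Y′ motion = (Y static + Y motion) / Y static, is an assumption made for illustration (formula (7) itself is not reproduced here); the essential point is that all three channels are scaled by the same ratio, so chromaticity is preserved:

```python
import numpy as np

ALPHA = np.array([0.299, 0.587, 0.114])  # assumed luma weights

def modulate_static(static_rgb, y_motion, eps=1e-6):
    """Chromaticity-preserving composition in the spirit of formula (7).

    static_rgb: (H, W, 3) image M_static; y_motion: (T, H, W).
    Each RGB channel is multiplied by the same modulation ratio
    Y'_motion = (Y_static + Y_motion) / Y_static (assumed form), so the
    RGB ratio of every pixel is unchanged. eps guards very dark pixels.
    """
    y_static = static_rgb @ ALPHA                              # (H, W)
    ratio = (y_static + y_motion) / np.maximum(y_static, eps)  # (T, H, W)
    return static_rgb[None] * ratio[..., None]                 # (T, H, W, 3)

static = np.full((2, 2, 3), [0.6, 0.3, 0.1])
m2 = modulate_static(static, np.zeros((4, 2, 2)))  # zero motion: unchanged
```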
  • In other words, in order to make the object (M static ) be perceived as if a motion were given to it, a video (Y motion (x, y, t) or Y ′ motion (x, y, t)) is superimposed on the object. This video includes a luminance motion component corresponding to the motion given to the object (for example, a video consisting only of luminance components).
  • The video is superimposed on the object so that the region of the object (M static ) overlaps the region of the video Y motion (x, y, t) or Y ′ motion (x, y, t) corresponding to the motion given to the object.
  • the absolute value of the time frequency of the luminance motion component in the plurality of frames in the video is larger than the absolute value of the time frequency in the plurality of frames in the video corresponding to the object.
  • The moving image M 2 obtained as described above is input to the display unit 523 and displayed (step S523). Although the moving image M 2 contains no motion component of the color signal, it still produces an illusion of motion. In other words, even though the color motion component has been removed from the moving image M 1 , the user can be given a visual experience in no way inferior to that of the original moving image.
  • the moving image component extraction device 51 ′ of this embodiment includes a frequency domain conversion unit 514 ′, a low temporal frequency component extraction unit 511 ′ (first processing unit), a high temporal frequency component extraction unit 515 ′, A luminance motion component extraction unit 512 ′ (second processing unit), time domain conversion units 516 ′ and 517 ′, and an output unit 513 are included.
  • the dynamic illusion presentation device 52 is the same as that of the fifth embodiment.
  • the moving image component extraction device 51 ′ is configured, for example, by reading a predetermined program into a computer as described above.
  • Pixel values R (x, y, t), G (x, y, t), B (x, y, t) constituting the moving picture M 1 are input to the frequency domain transform unit 514 ′.
  • The frequency domain transform unit 514 ′ transforms the pixel values R (x, y, t), G (x, y, t), and B (x, y, t) into the temporal frequency domain to obtain the temporal-frequency-domain values FR (x, y, f), FG (x, y, f), and FB (x, y, f) (formula (8)).
  • FR (x, y, f), FG (x, y, f), and FB (x, y, f) are the Fourier spectra along the t dimension of R (x, y, t), G (x, y, t), and B (x, y, t), respectively (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ).
  • Although the integration range of formula (8) is from −∞ to +∞, in practice it suffices to compute only over the finite interval t min ≤ t ≤ t max . Also, since R (x, y, t), G (x, y, t), and B (x, y, t) are discrete values, the discrete Fourier transform may be used.
  • Here, i is the imaginary unit and π is the circle ratio. FR (x, y, f), FG (x, y, f), and FB (x, y, f) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , 0 ≤ |f|) are input to the low temporal frequency component extraction unit 511 ′ and the high temporal frequency component extraction unit 515 ′.
  • The low temporal frequency component extraction unit 511 ′ multiplies FR (x, y, f), FG (x, y, f), and FB (x, y, f) by a low-pass filter LF (f) to obtain GR static (x, y, f), GG static (x, y, f), and GB static (x, y, f), which are output (formula (9)).
  • k is a constant of 0 or more.
  • When k = 0, the low-pass filter LF (f) extracts only the component whose temporal frequency is 0 Hz. When k > 0, the low-pass filter LF (f) extracts the components whose absolute value of temporal frequency is equal to or less than the temporal frequency corresponding to k (step S511 ′).
  • GR static (x, y, f), GG static (x, y, f), and GB static (x, y, f) are input to the time domain conversion unit 516 ′.
  • The time domain transform unit 516 ′ transforms GR static (x, y, f), GG static (x, y, f), and GB static (x, y, f) into the time domain to obtain the low temporal frequency components R static (x, y, t), G static (x, y, t), and B static (x, y, t).
  • Here, an example using the inverse Fourier transform is shown (formula (11)). Although the integration range of formula (11) is from −∞ to +∞, in practice it suffices to compute only over the finite interval 0 ≤ |f|.
  • the time domain transform unit 516 ′ outputs an image (image) M static having the following low time frequency component.
  • This image M static is an example of an “image whose time frequency absolute value is equal to or less than the first value”.
  • R static (x, y, t), G static (x, y, t), and B static (x, y, t) are the low temporal frequency components of R (x, y, t), G (x, y, t), and B (x, y, t), respectively.
  • When k = 0, M static is a still image; the image M static in this case is composed of components whose temporal frequency over the plurality of frames t (t min ≤ t ≤ t max ) of the moving image M 1 is zero. When k > 0, M static is an image including slow motion components (step S516 ′).
  • The high temporal frequency component extraction unit 515 ′ multiplies FR (x, y, f), FG (x, y, f), and FB (x, y, f) by a high-pass filter HF (f) to obtain GR motion (x, y, f), GG motion (x, y, f), and GB motion (x, y, f) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , 0 ≤ |f|), which are output.
  • h is a constant of 0 or more.
  • When h = 0, the high-pass filter HF (f) extracts the temporal frequency components whose absolute value is positive. When h > 0, the high-pass filter HF (f) extracts the temporal frequency components whose absolute value of temporal frequency is larger than the temporal frequency corresponding to h.
  • h is preferably k or more. It is not always necessary to make k and h equal.
  • Preferably, the low-pass filter LF (f) and the high-pass filter HF (f) are complementary. In that case, the components not removed by the low temporal frequency component extraction unit 511 ′ are extracted as GR motion (x, y, f), GG motion (x, y, f), and GB motion (x, y, f).
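The complementary pair LF(f)/HF(f) can be sketched with a discrete Fourier transform along t, as the text permits. Expressing the cutoff k in integer cycles per sequence is an assumption made for illustration:

```python
import numpy as np

def split_temporal(signal, k=0):
    """Complementary low/high temporal frequency split (units 511'/515').

    signal: (T, H, W) array. Frequencies with |f| <= k (cycles per
    sequence) go to the low part; HF(f) = 1 - LF(f) takes the rest,
    so low + high reconstructs the input exactly.
    """
    spec = np.fft.fft(signal, axis=0)
    f = np.fft.fftfreq(signal.shape[0]) * signal.shape[0]  # integer freqs
    lf = (np.abs(f) <= k)[:, None, None]                   # low-pass LF(f)
    low = np.fft.ifft(spec * lf, axis=0).real
    high = np.fft.ifft(spec * ~lf, axis=0).real            # complement HF(f)
    return low, high

video = np.random.default_rng(1).random((8, 3, 3))
low, high = split_temporal(video, k=0)  # k = 0: low part = temporal mean
```

With k = 0 the low part reduces to the temporal average of the fifth embodiment, showing how that embodiment is the special case of this one.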
  • GR motion (x, y, f), GG motion (x, y, f), and GB motion (x, y, f) are input to the time domain conversion unit 517 ′.
  • The time domain transform unit 517 ′ transforms GR motion (x, y, f), GG motion (x, y, f), and GB motion (x, y, f) into the time domain to obtain the high temporal frequency components R motion (x, y, t), G motion (x, y, t), and B motion (x, y, t). Here, an example using the inverse Fourier transform is shown (formula (16)). Although the integration range of formula (16) is from −∞ to +∞, in practice it suffices to compute only over the finite interval 0 ≤ |f|.
  • The luminance motion component extraction unit 512 ′ weights and adds R motion (x, y, t), G motion (x, y, t), and B motion (x, y, t) to obtain and output the luminance motion component Y motion (x, y, t) of the moving image M 1 (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ) (step S512 ′) (formula (17)).
  • Y motion (x, y, t) = α R ·R motion (x, y, t) + α G ·G motion (x, y, t) + α B ·B motion (x, y, t)   (17)
  • α R , α G , and α B are weighting coefficients (constants).
  • the luminance motion component Y motion (x, y, t) is an example of “a luminance motion component whose absolute value of the time frequency is greater than or equal to the second value”.
  • the subsequent processing is the same as in the fifth embodiment.
  • The moving image M 2 can give the user the illusion that the image M static is moving at a higher temporal frequency than it actually is.
  • In the above configuration, the moving image M 2 is generated from the image M static consisting of the low temporal frequency components of the moving image M 1 , combined with the luminance motion component extracted from the high temporal frequency components of the moving image M 1 (FIG. 13A).
  • Alternatively, the moving image M 2 may be generated from an image of color components ("color difference components") obtained from the low temporal frequency components of the moving image M 1 , combined with the luminance motion component extracted from the high temporal frequency components of the moving image M 1 (FIG. 13B).
  • Alternatively, the moving image M 2 may be generated from an image consisting of the "luminance components" extracted from the low temporal frequency components of the moving image M 1 , combined with the luminance motion component extracted from the high temporal frequency components of the moving image M 1 (FIG. 14A).
  • Even such a moving image M 2 can produce a dynamic illusion.
  • the video M 1 may be a gray scale video.
  • a moving image M 2 in which a luminance motion component is added to a still image of an arbitrary frame extracted from the moving image M 1 may be generated.
  • Even with such a moving image M 2 , the still image can be made to appear, illusorily, to be moving.
  • the moving image component extraction apparatus 61 of this embodiment includes an image extraction unit 611 (first processing unit), a luminance motion component extraction unit 612 (second processing unit), and an output unit 513.
  • the dynamic illusion presentation device 52 is the same as that of the fifth embodiment.
  • the moving image component extraction device 61 is configured, for example, by reading a predetermined program into a computer as described above.
  • step S611 is executed instead of step S511, and step S6123 is executed instead of step S5123.
  • step S611 and step S6123 which are different points will be described.
  • The image extraction unit 611 extracts a still image of an arbitrary frame n from the moving image M 1 by setting R static (x, y) = R (x, y, n), G static (x, y) = G (x, y, n), and B static (x, y) = B (x, y, n) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max ), and outputs an image M static (formula (3)) composed of the two-dimensional matrices {R static (x, y)}, {G static (x, y)}, and {B static (x, y)} having these as elements.
  • the image M static in this embodiment is also an example of “an image having an absolute value of time frequency equal to or less than the first value” when the “first value” is 0.
  • the image M static preferably includes a spatial frequency component whose absolute value is greater than zero, and preferably includes a chromatic color (step S611).
  • In step S6123, the luminance motion component extraction unit 612 sets the luminance component corresponding to R static (x, y), G static (x, y), and B static (x, y) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max ) as the luminance static component Y static (x, y).
  • The luminance static component Y static (x, y) is obtained by weighted addition of R static (x, y), G static (x, y), and B static (x, y), as in step S5122.
  • The luminance motion component extraction unit 612 subtracts the luminance static component Y static (x, y) from the luminance component Y original (x, y, t) of each frame t, thereby obtaining and outputting the luminance motion component Y motion (x, y, t) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ) (step S6123) (formula (5)). That is, the video (Y motion (x, y, t)) of this embodiment includes a luminance motion component obtained by subtracting the luminance static component Y static (x, y), which is obtained from a still image based on the video corresponding to the object, from the luminance component Y original (x, y, t) of each of the plurality of frames in the video corresponding to the object.
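The only difference from the fifth embodiment's subtraction is that Y static here comes from a single chosen frame rather than a temporal average; a sketch (assumed Rec. 601 luma weights, NumPy layout):

```python
import numpy as np

ALPHA = np.array([0.299, 0.587, 0.114])  # assumed luma weights

def luminance_motion_from_frame(video, n):
    """Step S6123: subtract the luminance of still frame n from every frame.

    video: (T, H, W, 3). Y_static is the luminance of the single frame n,
    so the resulting Y_motion is identically zero at t = n.
    """
    y_original = video @ ALPHA    # formula (4) applied per frame
    y_static = video[n] @ ALPHA   # luminance of the chosen still image
    return y_original - y_static  # formula (5)

video = np.random.default_rng(2).random((5, 2, 2, 3))
y_motion = luminance_motion_from_frame(video, n=0)
```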
  • In this way, the luminance motion component creates the illusion that a single still image of an arbitrary frame is moving, and a visual experience comparable to that of the original moving image can be given to the user.
  • In the above configuration, the moving image M 2 is generated from the still image M static of an arbitrary frame of the moving image M 1 , combined with the luminance motion component extracted from the high temporal frequency components of the moving image M 1 (FIG. 13A).
  • Alternatively, the moving image M 2 may be generated from an image of color components ("color difference components") obtained from the still image M static of an arbitrary frame of the moving image M 1 , combined with the luminance motion component extracted from the high temporal frequency components of the moving image M 1 (FIG. 13B).
  • Alternatively, the moving image M 2 may be generated from an image consisting of the "luminance components" extracted from the still image M static of an arbitrary frame of the moving image M 1 , combined with the luminance motion component extracted from the high temporal frequency components of the moving image M 1 (FIG. 14A).
  • Even such a moving image M 2 can produce a dynamic illusion.
  • the video M 1 may be a gray scale video.
  • Owing to the motion vision characteristics of the human visual system, the perceived quality of the moving image is maintained even if the spatial resolution and contrast of the "luminance motion component" combined with the "image" are reduced.
  • In this embodiment, the "low temporal frequency components" of the moving image M 1 are used as the "image", and the luminance motion component image is extracted from the "high temporal frequency components" of the luminance components of the moving image M 1 .
  • Filtering that reduces at least one of the high spatial frequency components and the contrast is applied to this luminance motion component image.
  • An operation that integrates (adds) the "luminance motion component" into the "image" is then performed, and the resulting moving image M 2 is displayed (FIG. 14B).
  • the moving image component extraction device 71 of this embodiment includes a low time frequency component extraction unit 511, a luminance motion component extraction unit 512, a filtering unit 719, and an output unit 513.
  • the dynamic illusion presentation device 52 of this embodiment is the same as that of the fifth embodiment.
  • the moving image component extraction device 71 is configured, for example, by reading a predetermined program into a computer as described above.
  • The luminance motion component Y motion (x, y, t) (where x min ≤ x ≤ x max , y min ≤ y ≤ y max , t min ≤ t ≤ t max ) obtained in step S5123 is input to the filtering unit 719.
  • The filtering unit 719 obtains FY motion (ξ, η, τ) by transforming Y motion (x, y, t) into the spatio-temporal frequency domain (formula (18)).
  • ξ, η, and τ represent the horizontal spatial frequency, the vertical spatial frequency, and the temporal frequency, respectively.
  • The lower and upper limits of the horizontal spatial frequency are ξ min and ξ max (ξ min < ξ max ), those of the vertical spatial frequency are η min and η max (η min < η max ), and those of the temporal frequency are τ min and τ max (τ min < τ max ). ξ, η, and τ satisfy ξ min ≤ ξ ≤ ξ max , η min ≤ η ≤ η max , and τ min ≤ τ ≤ τ max .
  • FY motion (ξ, η, τ) in this case is the Fourier spectrum of Y motion (x, y, t).
  • Although the integration range of formula (18) is from −∞ to +∞, in practice it suffices to compute only over the finite intervals x min ≤ x ≤ x max , y min ≤ y ≤ y max , and t min ≤ t ≤ t max . The discrete Fourier transform may also be used (step S7191).
  • The filtering unit 719 multiplies FY motion (ξ, η, τ) by a filter G (ξ, η, τ) and obtains the luminance motion component gY motion (x, y, t) by the inverse Fourier transform (formula (19)).
  • Although the integration range of formula (19) is from −∞ to +∞, in practice it suffices to compute only over the finite intervals ξ min ≤ ξ ≤ ξ max , η min ≤ η ≤ η max , and τ min ≤ τ ≤ τ max . The inverse discrete Fourier transform may also be used.
  • the luminance motion component gY motion (x, y, t) is an example of “a luminance motion component whose time frequency absolute value is equal to or greater than a second value”.
  • G (ξ, η, τ) is a filter for reducing high spatial frequency components or contrast.
  • An example of a filter for reducing high spatial frequency components is a low-pass filter, and examples of filters for reducing contrast include a function that linearly transforms the gray levels and a function that flattens the histogram.
  • A specific example of the low-pass filter is shown below (formula (20)), where a and b are positive constants.
  • This G (ξ, η, τ) is a function that cuts the high-spatial-frequency Fourier spectrum in a step-like manner, but any function capable of cutting the high-spatial-frequency Fourier spectrum may be used as G (ξ, η, τ) (step S7192).
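A sketch of the step-shaped spatial low-pass filtering of steps S7191–S7192. The specific cutoff parameterization (a fraction a of the spatial Nyquist limit) stands in for the constants a and b of formula (20) and is an illustrative assumption:

```python
import numpy as np

def spatial_lowpass_motion(y_motion, a=0.5):
    """Steps S7191-S7192: cut high spatial frequencies of Y_motion.

    y_motion: (T, H, W) luminance motion component. The filter G zeroes
    every spatial frequency whose horizontal or vertical magnitude
    exceeds a * Nyquist (a step filter), then transforms back.
    """
    spec = np.fft.fft2(y_motion, axes=(1, 2))              # per-frame 2-D FFT
    fy = np.fft.fftfreq(y_motion.shape[1])[:, None]        # vertical freqs
    fx = np.fft.fftfreq(y_motion.shape[2])[None, :]        # horizontal freqs
    g = (np.abs(fy) <= a * 0.5) & (np.abs(fx) <= a * 0.5)  # step filter G
    return np.fft.ifft2(spec * g[None], axes=(1, 2)).real

gy = spatial_lowpass_motion(np.random.default_rng(3).random((4, 8, 8)))
```

A spatially constant frame contains only the zero spatial frequency, which the step filter passes unchanged, so constants survive the filtering intact.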
  • The subsequent processing is the same as in the fifth embodiment, with the luminance motion component Y motion (x, y, t) replaced by the luminance motion component gY motion (x, y, t).
  • the luminance motion component Y motion (x, y, t) of Modification 1 of the fifth embodiment may be replaced with the luminance motion component gY motion (x, y, t).
  • In this case, steps S7191 and S7192 are executed after step S512 ′.
  • the luminance motion component Y motion (x, y, t) of the second modification of the fifth embodiment or the sixth embodiment may be replaced with the luminance motion component gY motion (x, y, t).
  • The moving image component extraction device 81 of the present embodiment is any one of the moving image component extraction devices 51, 51 ′, 61, and 71 of the fifth to seventh embodiments or their modifications, with the output unit 513 replaced by an output unit 813 and a printing unit 814.
  • the dynamic illusion presentation device 82 (illusion presentation device) of this embodiment includes an input unit 821 and a projection unit 823 (projector).
  • the object 83 exemplified in this embodiment is a two-dimensional medium such as paper.
  • the static image M static (Equation (3)) extracted from the moving image M 1 as in any one of the embodiments or modifications described above is input to the printing unit 814 (output unit).
  • the printing unit 814 prints the image M static on the surface of the object 83.
  • The luminance motion component Y motion (x, y, t) or gY motion (x, y, t) extracted from the moving image M 1 as in any of the embodiments or modifications is sent to the dynamic illusion presentation device 82 and input to the input unit 821.
  • The luminance motion component Y motion (x, y, t) or gY motion (x, y, t) is sent to the projection unit 823, and the projection unit 823 uses a known optical projection technique (for example, Reference 9: Takahiro Kawasaki) to project the luminance motion component Y motion (x, y, t) or gY motion (x, y, t) onto the image M static printed on the object 83, thereby displaying the moving image M 2 (formula (21)).
  • In formula (21), the luminance motion component Y motion (x, y, t) or gY motion (x, y, t) is combined with the luminance component of the image M static through a complex mixture of addition and multiplication. In other words, it represents a state in which an operation including at least one of addition and multiplication is applied to the luminance component of the image M static and the luminance motion component Y motion (x, y, t) or gY motion (x, y, t). In general, the reflection pattern varies with the characteristics of the paper and ink, so that the luminance changes multiplicatively in some portions and additively in others; the operation that causes both of these luminance changes is therefore denoted by the operator in formula (21).
  • Formula (6) represents a state in which the luminance motion component Y motion (x, y, t) is added to the luminance component of the image M static , and formula (7) represents a state in which the luminance component of the image M static is multiplied by the modulation ratio Y ′ motion (x, y, t) derived from the luminance motion component Y motion (x, y, t). That is, in this embodiment, in order to make the object (M static ) be perceived as moving, a video (Y motion (x, y, t) or gY motion (x, y, t)) is projected onto the object.
  • This video is a video including a luminance motion component corresponding to the motion given to the object (for example, a video consisting only of luminance components). As exemplified in formulas (3) and (21), the absolute value of the temporal frequency of the luminance motion component in the plurality of frames of the video is larger than the absolute value of the temporal frequency in the plurality of frames of the video corresponding to the object.
  • In this embodiment, the stationary image M static is printed on an "object" such as paper. However, the image M static may instead be projected onto an "object" such as a screen by another projector, or displayed on an "object" such as electronic paper.
  • a “motion component moving image” may be projected on a transmissive display as shown in Reference Document 9.
  • When the image M static is projected or displayed in this way, it need not be a still image and may be an image that moves slowly. In this case, the user can be given a visual experience of motion at a higher temporal frequency than that of the image M static .
  • Alternatively, the moving image M 1 may be created based on a still image obtained by photographing an "object" such as a building or a painting, with the luminance motion component Y motion (x, y, t) or gY motion (x, y, t) then extracted from it; or the luminance motion component Y motion (x, y, t) or gY motion (x, y, t) may be generated from a still image obtained by photographing the "object".
  • this invention is not limited to the above-mentioned embodiment and its modification.
  • the moving image component extraction device and the dynamic illusion presentation device may be the same device.
  • the processing of each unit included in the moving image component extraction device and the dynamic illusion presentation device may be performed by different devices.
  • a computer-readable recording medium is a non-transitory recording medium. Examples of such a recording medium are a magnetic recording device, an optical disk, a magneto-optical recording medium, a semiconductor memory, and the like.
  • This program is distributed, for example, by selling, transferring, or lending a portable recording medium such as a DVD or CD-ROM in which the program is recorded. Furthermore, the program may be distributed by storing the program in a storage device of the server computer and transferring the program from the server computer to another computer via a network.
  • a computer that executes such a program first stores a program recorded on a portable recording medium or a program transferred from a server computer in its own storage device. When executing the process, this computer reads a program stored in its own recording device and executes a process according to the read program. As another execution form of the program, the computer may read the program directly from the portable recording medium and execute processing according to the program, and each time the program is transferred from the server computer to the computer. The processing according to the received program may be executed sequentially.
  • The above-described processing may also be executed by a so-called ASP (Application Service Provider) type service that does not transfer the program from the server computer to the computer but realizes the processing functions only through execution instructions and result acquisition.
  • Distribution of a data structure associating "second image data for recording a second image" may also be performed. This distribution may be carried out by distributing the data structure via the Internet, or by selling, transferring, or lending a portable recording medium such as a DVD or CD-ROM on which the data structure is recorded.
  • Alternatively, second data representing a luminance motion component whose absolute value of temporal frequency is equal to or greater than the second value may be distributed. This distribution may be carried out by distributing the data structure via the Internet, or by selling, transferring, or lending a portable recording medium such as a DVD or CD-ROM on which the data structure is recorded.
  • The device that receives the data structure inputs the "first data" and the "second data" to the calculation unit, and the calculation unit performs an operation that adds the "luminance motion component" to the "image", thereby obtaining a moving image. Alternatively, the first data is input to the output unit, the output unit represents the "image" on the surface of the "object", the second data is input to the projection unit, and the projection unit projects the "luminance motion component" onto the "image" represented on the surface of the "object".
  • In either case, a moving image causing a dynamic illusion can be displayed on the display unit or on the object.
  • the data structure of the video used in the apparatus that superimposes the video on the “object” may be distributed.
  • the absolute value of the time frequency of the luminance motion component in a plurality of frames in the video is larger than the absolute value of the time frequency in the plurality of frames in the video corresponding to the “object”.
  • The video may include a luminance motion component obtained by subtracting the luminance static component, obtained from a still image based on the video corresponding to the "object", from the luminance component of each of the plurality of frames in the video corresponding to the "object".
  • The present invention can be used, for example, in (1) the advertising field, where a motion impression is added to a paper medium by light projection or a motion impression is given to a signboard; (2) the interior field, where interior patterns such as floors and walls are illusorily deformed; and (3) the art, toy, and entertainment fields, for example giving motion to character illustrations or fusing with conventional projection mapping technology.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Image Processing (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)
  • Studio Circuits (AREA)
PCT/JP2015/062093 2014-04-22 2015-04-21 映像表示装置、映像投影装置、動的錯覚呈示装置、映像生成装置、それらの方法、データ構造、プログラム WO2015163317A1 (ja)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2016514945A JP6425312B2 (ja) 2014-04-22 2015-04-21 動的錯覚呈示装置、その方法、プログラム
EP19188084.8A EP3637410A1 (en) 2014-04-22 2015-04-21 Video presentation device, dynamic illusion presentation device, video generation device, method thereof, data structure, and program
US15/306,011 US10571794B2 (en) 2014-04-22 2015-04-21 Video presentation device, dynamic illusion presentation device, video generation device, method thereof, data structure, and program
EP15783584.4A EP3136717A4 (en) 2014-04-22 2015-04-21 Video display device, video projection device, dynamic illusion presentation device, video generation device, method thereof, data construct, and program
CN201580021244.0A CN106233716B (zh) 2014-04-22 2015-04-21 动态错觉呈现装置、动态错觉呈现方法、程序
US16/739,414 US11036123B2 (en) 2014-04-22 2020-01-10 Video presentation device, method thereof, and recording medium
US17/243,654 US20210247686A1 (en) 2014-04-22 2021-04-29 Video generation device, video generation method, and recording medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2014088389 2014-04-22
JP2014-088389 2014-04-22
JP2014230720 2014-11-13
JP2014-230720 2014-11-13

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/306,011 A-371-Of-International US10571794B2 (en) 2014-04-22 2015-04-21 Video presentation device, dynamic illusion presentation device, video generation device, method thereof, data structure, and program
US16/739,414 Division US11036123B2 (en) 2014-04-22 2020-01-10 Video presentation device, method thereof, and recording medium

Publications (1)

Publication Number Publication Date
WO2015163317A1 true WO2015163317A1 (ja) 2015-10-29

Family

ID=54332481

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/062093 WO2015163317A1 (ja) 2014-04-22 2015-04-21 Video display device, video projection device, dynamic illusion presentation device, video generation device, methods thereof, data structure, and program

Country Status (5)

Country Link
US (3) US10571794B2 (zh)
EP (2) EP3637410A1 (zh)
JP (5) JP6425312B2 (zh)
CN (4) CN110989284A (zh)
WO (1) WO2015163317A1 (zh)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017142407A (ja) * 2016-02-12 2017-08-17 Nippon Telegraph and Telephone Corporation Device, video projection device, video projection system, video display device, video generation device, methods thereof, program, and data structure
JP2017143450A (ja) * 2016-02-12 2017-08-17 Nippon Telegraph and Telephone Corporation Device for causing a target to be perceived as undergoing a change in motion, and method thereof
JP2017142408A (ja) * 2016-02-12 2017-08-17 Nippon Telegraph and Telephone Corporation Information presentation system, information presentation method, and data structure
JP2017163373A (ja) * 2016-03-10 2017-09-14 Nippon Telegraph and Telephone Corporation Device, projection device, display device, image generation device, methods thereof, program, and data structure
JP2018050216A (ja) * 2016-09-23 2018-03-29 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program
JP2018078660A (ja) * 2014-04-22 2018-05-17 Nippon Telegraph and Telephone Corporation Video presentation device, method thereof, data structure, and program
JP2018182353A (ja) * 2017-04-03 2018-11-15 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program
JP2019013012A (ja) * 2018-08-17 2019-01-24 Nippon Telegraph and Telephone Corporation Data structure
WO2019198570A1 (ja) * 2018-04-11 2019-10-17 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, program, and data structure
WO2020066676A1 (ja) * 2018-09-27 2020-04-02 Nippon Telegraph and Telephone Corporation Image generation device, image generation method, and program
WO2020066675A1 (ja) * 2018-09-27 2020-04-02 Nippon Telegraph and Telephone Corporation Illusion presentation system and illusion presentation method
WO2020110738A1 (ja) * 2018-11-28 2020-06-04 Nippon Telegraph and Telephone Corporation Motion vector generation device, projection image generation device, motion vector generation method, and program

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10609355B2 (en) * 2017-10-27 2020-03-31 Motorola Mobility Llc Dynamically adjusting sampling of a real-time depth map
JP6845181B2 (ja) * 2018-04-24 2021-03-17 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0793558A (ja) * 1993-09-22 1995-04-07 Toshiba Corp Image monitoring device
JP2004088728A (ja) * 2002-04-22 2004-03-18 Nariyuki Motoi Image providing device, image providing method, and image providing program
JP2005204923A (ja) * 2004-01-22 2005-08-04 Heiwa Corp Game machine
JP2011022762A (ja) * 2009-07-15 2011-02-03 Hitachi Ltd Image signal processing device
JP2011085686A (ja) * 2009-10-14 2011-04-28 Hara Seisakusho:Kk Stereoscopic display method and device
JP2013179542A (ja) * 2012-02-29 2013-09-09 Nikon Corp Image processing device and program

Family Cites Families (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5794784A (en) 1980-11-20 1982-06-12 Kobe Steel Ltd Picture information synthetizing terminal
US5325473A (en) * 1991-10-11 1994-06-28 The Walt Disney Company Apparatus and method for projection upon a three-dimensional object
JPH086158A (ja) * 1994-06-17 1996-01-12 Hitachi Ltd Projection display device
US6567564B1 (en) * 1996-04-17 2003-05-20 Sarnoff Corporation Pipelined pyramid processor for image processing systems
WO1998044479A1 (fr) * 1997-03-31 1998-10-08 Matsushita Electric Industrial Co., Ltd. Method for viewing the foreground of images and related device
US7604348B2 (en) * 2001-01-23 2009-10-20 Kenneth Martin Jacobs Continuous adjustable 3deeps filter spectacles for optimized 3deeps stereoscopic viewing and its control method and means
JP3453657B2 (ja) * 2001-03-13 2003-10-06 Hokkaido-Chizu Co., Ltd. Video display device
US7259747B2 (en) * 2001-06-05 2007-08-21 Reactrix Systems, Inc. Interactive video display system
US7019748B2 (en) 2001-08-15 2006-03-28 Mitsubishi Electric Research Laboratories, Inc. Simulating motion of static objects in scenes
JP3674568B2 (ja) * 2001-10-02 2005-07-20 Sony Corporation Intensity modulation method and system, and light quantity modulation device
JP2004173320A (ja) 2002-04-22 2004-06-17 Nariyuki Motoi Image providing device
JP2004128936A (ja) * 2002-10-03 2004-04-22 Matsushita Electric Ind Co Ltd Video signal processing device
JP4038689B2 (ja) * 2004-01-21 2008-01-30 Sony Corporation Display control device and method, recording medium, and program
JP4522140B2 (ja) * 2004-05-14 2010-08-11 Canon Inc. Index placement information estimation method and information processing device
EP1804112B1 (en) * 2004-09-08 2016-11-30 Nippon Telegraph And Telephone Corporation Three-dimensional display method, apparatus and program
CN1770204A (zh) * 2004-10-29 2006-05-10 Institute of Computing Technology, Chinese Academy of Sciences Method for extracting the center-of-gravity trajectory of a moving object from motion video with a static background
JP4415842B2 (ja) * 2004-12-08 2010-02-17 Pioneer Corporation Information multiplexing device and method, information extraction device and method, and computer program
JP2006325122A (ja) * 2005-05-20 2006-11-30 Otsuka Denshi Co Ltd Moving image display performance determination method, inspection screen, and moving image display performance determination device
US7397933B2 (en) * 2005-05-27 2008-07-08 Microsoft Corporation Collusion resistant desynchronization for digital video fingerprinting
CN100584035C (zh) * 2005-10-10 2010-01-20 Chongqing University Dynamic video display method for multiple displays based on compressed transmission data
US8565525B2 (en) * 2005-12-30 2013-10-22 Telecom Italia S.P.A. Edge comparison in segmentation of video sequences
JP4795091B2 (ja) * 2006-04-21 2011-10-19 Canon Inc. Information processing method and device
JP4697063B2 (ja) * 2006-06-20 2011-06-08 Nissan Motor Co., Ltd. Approaching vehicle detection device
US20080085741A1 (en) * 2006-10-10 2008-04-10 Sony Ericsson Mobile Communications Ab Method for providing an alert signal
JP4885690B2 (ja) * 2006-11-28 2012-02-29 NTT Docomo, Inc. Image adjustment amount determination device, image adjustment amount determination method, image adjustment amount determination program, and image processing device
KR20080101700A (ko) * 2007-05-18 2008-11-21 Sony Corporation Display device, method of driving display device, and computer program
KR101388583B1 (ko) * 2007-06-12 2014-04-24 Samsung Display Co., Ltd. Driving device, display device having the same, and method of driving the display device
US8422803B2 (en) * 2007-06-28 2013-04-16 Mitsubishi Electric Corporation Image encoding device, image decoding device, image encoding method and image decoding method
JP2009033564A (ja) * 2007-07-27 2009-02-12 Sanyo Electric Co Ltd Display device and display program
JP5080899B2 (ja) * 2007-08-08 2012-11-21 Canon Inc. Video processing device and control method thereof
JP4829855B2 (ja) * 2007-09-04 2011-12-07 Canon Inc. Image projection device and control method thereof
JP5464819B2 (ja) * 2008-04-30 2014-04-09 Canon Inc. Moving image processing device and method, and program
JP2010072025A (ja) * 2008-09-16 2010-04-02 Nikon Corp Electronic device with projector
EP2378392B1 (en) * 2008-12-25 2016-04-13 Panasonic Intellectual Property Management Co., Ltd. Information displaying apparatus and information displaying method
JP5402056B2 (ja) * 2009-02-16 2014-01-29 Konica Minolta, Inc. Image processing device, image processing method, and program
JP2010211303A (ja) * 2009-03-06 2010-09-24 Olympus Corp Image generation device, foreign matter inspection system, and image generation method
CN101562755B (zh) * 2009-05-19 2010-09-01 Wuxi Jingxiang Digital Technology Co., Ltd. Method for producing 3D video from planar video
JP5268796B2 (ja) * 2009-06-22 2013-08-21 Japan Broadcasting Corporation (NHK) Moving object region detection device and moving object region detection program
CN101777180B (zh) * 2009-12-23 2012-07-04 Institute of Automation, Chinese Academy of Sciences Real-time complex-background replacement method based on background modeling and energy minimization
JP2011146980A (ja) * 2010-01-15 2011-07-28 Sony Corp Image processing device and method
US20120320986A1 (en) * 2010-02-23 2012-12-20 Nippon Telegraph And Telephone Corporation Motion vector estimation method, multiview video encoding method, multiview video decoding method, motion vector estimation apparatus, multiview video encoding apparatus, multiview video decoding apparatus, motion vector estimation program, multiview video encoding program, and multiview video decoding program
CN101815177B (zh) * 2010-03-11 2011-09-21 Guangdong Vtron Technologies Co., Ltd. Synchronous display device, synchronous display method, and overlay tiled display system
CN101833791B (zh) * 2010-05-11 2012-04-18 Chengdu Sobey Digital Technology Co., Ltd. Scene modeling method and system using a single camera
CN102109543B (zh) * 2010-12-08 2013-04-17 University of Electronic Science and Technology of China Digital three-dimensional oscilloscope with real-time waveform image zoom capability
KR101818024B1 (ko) * 2011-03-29 2018-01-12 Qualcomm Incorporated System for rendering shared digital interfaces relative to each user's point of view
CN103135889B (zh) * 2011-12-05 2017-06-23 LG Electronics Inc. Mobile terminal and 3D image control method thereof
JP5812855B2 (ja) 2011-12-27 2015-11-17 Kubota Seisakusho Co., Ltd. Centrifuge bucket and method for manufacturing a centrifuge bucket
US10426333B2 (en) * 2012-02-23 2019-10-01 American University Directional illusions based on motion pixels and uses thereof
JP2013186691A (ja) * 2012-03-08 2013-09-19 Casio Comput Co Ltd Image processing device, image processing method, and program
JP6040564B2 (ja) * 2012-05-08 2016-12-07 Sony Corporation Image processing device, projection control method, and program
JP6021053B2 (ja) * 2012-05-22 2016-11-02 Tokyo University of Science Moving image visibility quantification device, moving image visibility quantification method, and program
JP2014171097A (ja) * 2013-03-04 2014-09-18 Toshiba Corp Encoding device, encoding method, decoding device, and decoding method
US9478004B2 (en) * 2013-04-11 2016-10-25 John Balestrieri Method and system for analog/digital image simplification and stylization
US9292956B2 (en) * 2013-05-03 2016-03-22 Microsoft Technology Licensing, Llc Automated video looping with progressive dynamism
JP6069115B2 (ja) * 2013-06-25 2017-02-01 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program
CN103310422B (zh) * 2013-06-28 2016-08-31 Xinchen Yijie (Beijing) Technology Co., Ltd. Method and device for acquiring images
EP3092603B1 (en) * 2014-01-07 2022-05-11 ML Netherlands C.V. Dynamic updating of composite images
CN110989284A (zh) 2014-04-22 2020-04-10 Nippon Telegraph and Telephone Corporation Video presentation device, video presentation method, and program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3136717A4 *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10571794B2 (en) 2014-04-22 2020-02-25 Nippon Telegraph And Telephone Corporation Video presentation device, dynamic illusion presentation device, video generation device, method thereof, data structure, and program
US11036123B2 (en) 2014-04-22 2021-06-15 Nippon Telegraph And Telephone Corporation Video presentation device, method thereof, and recording medium
JP2018078660A (ja) * 2014-04-22 2018-05-17 Nippon Telegraph and Telephone Corporation Video presentation device, method thereof, data structure, and program
JP2017142407A (ja) * 2016-02-12 2017-08-17 Nippon Telegraph and Telephone Corporation Device, video projection device, video projection system, video display device, video generation device, methods thereof, program, and data structure
JP2017142408A (ja) * 2016-02-12 2017-08-17 Nippon Telegraph and Telephone Corporation Information presentation system, information presentation method, and data structure
JP2017143450A (ja) * 2016-02-12 2017-08-17 Nippon Telegraph and Telephone Corporation Device for causing a target to be perceived as undergoing a change in motion, and method thereof
JP2017163373A (ja) * 2016-03-10 2017-09-14 Nippon Telegraph and Telephone Corporation Device, projection device, display device, image generation device, methods thereof, program, and data structure
JP2018050216A (ja) * 2016-09-23 2018-03-29 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program
JP2018182353A (ja) * 2017-04-03 2018-11-15 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program
WO2019198570A1 (ja) * 2018-04-11 2019-10-17 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, program, and data structure
JP7010122B2 (ja) 2018-04-11 2022-01-26 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, and program
JP2019186762A (ja) * 2018-04-11 2019-10-24 Nippon Telegraph and Telephone Corporation Video generation device, video generation method, program, and data structure
JP2019013012A (ja) * 2018-08-17 2019-01-24 Nippon Telegraph and Telephone Corporation Data structure
WO2020066675A1 (ja) * 2018-09-27 2020-04-02 Nippon Telegraph and Telephone Corporation Illusion presentation system and illusion presentation method
JP2020052245A (ja) * 2018-09-27 2020-04-02 Nippon Telegraph and Telephone Corporation Illusion presentation system and illusion presentation method
WO2020066676A1 (ja) * 2018-09-27 2020-04-02 Nippon Telegraph and Telephone Corporation Image generation device, image generation method, and program
JP7063216B2 (ja) 2018-09-27 2022-05-09 Nippon Telegraph and Telephone Corporation Illusion presentation system and illusion presentation method
JP2020052741A (ja) * 2018-09-27 2020-04-02 Nippon Telegraph and Telephone Corporation Image generation device, image generation method, and program
JP7035936B2 (ja) 2018-09-27 2022-03-15 Nippon Telegraph and Telephone Corporation Image generation device, image generation method, and program
US11954867B2 (en) 2018-11-28 2024-04-09 Nippon Telegraph And Telephone Corporation Motion vector generation apparatus, projection image generation apparatus, motion vector generation method, and program
JP7040422B2 (ja) 2018-11-28 2022-03-23 Nippon Telegraph and Telephone Corporation Motion vector generation device, projection image generation device, motion vector generation method, and program
JP2020087069A (ja) * 2018-11-28 2020-06-04 Nippon Telegraph and Telephone Corporation Motion vector generation device, projection image generation device, motion vector generation method, and program
WO2020110738A1 (ja) * 2018-11-28 2020-06-04 Nippon Telegraph and Telephone Corporation Motion vector generation device, projection image generation device, motion vector generation method, and program

Also Published As

Publication number Publication date
US11036123B2 (en) 2021-06-15
JP2020030843A (ja) 2020-02-27
CN110996080A (zh) 2020-04-10
EP3136717A4 (en) 2017-12-13
US20170045813A1 (en) 2017-02-16
CN106233716B (zh) 2019-12-24
US20210247686A1 (en) 2021-08-12
JP2018078660A (ja) 2018-05-17
JP2023101517A (ja) 2023-07-21
JP6965914B2 (ja) 2021-11-10
CN110989284A (zh) 2020-04-10
JP7283513B2 (ja) 2023-05-30
EP3136717A1 (en) 2017-03-01
JPWO2015163317A1 (ja) 2017-04-20
US20200150521A1 (en) 2020-05-14
CN110996080B (zh) 2021-10-08
JP2022002141A (ja) 2022-01-06
JP6425312B2 (ja) 2018-11-21
CN106233716A (zh) 2016-12-14
EP3637410A1 (en) 2020-04-15
CN110989285A (zh) 2020-04-10
JP6611837B2 (ja) 2019-11-27
US10571794B2 (en) 2020-02-25

Similar Documents

Publication Publication Date Title
JP6611837B2 (ja) Video presentation device, method thereof, and program
US5438429A (en) Digital filtering for lenticular printing
WO2019198570A1 (ja) Video generation device, video generation method, program, and data structure
EP2567374B1 (en) A method and device for transforming an image
EP3057316B1 (en) Generation of three-dimensional imagery to supplement existing content
JP2018028710A (ja) Video generation device, video generation method, and program
JP6457964B2 (ja) Device, projection device, display device, image generation device, methods thereof, and program
EP3183870A1 (en) Methods and apparatus for mapping input image
JP6430420B2 (ja) Information presentation system and information presentation method
JP6666296B2 (ja) Video generation device, method thereof, and program
JP6615818B2 (ja) Video generation device, video generation method, and program
WO2019208143A1 (ja) Video generation device, video generation method, and program
EP1239416A1 (en) Method and device to calculate light sources in a scene and to generate mutual photometric effects
JP2019013012A (ja) Data structure
Turk A review of color vision and imaging
EP1239417A1 (en) Method and device to calculate light sources in a scene and to generate mutual photometric effects
US20130076733A1 (en) Image processing apparatus, image processing method, and image processing program
Neumann et al. Enhancing Perceived Depth in Images Via Artistic Matting

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15783584

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2016514945

Country of ref document: JP

Kind code of ref document: A

REEP Request for entry into the european phase

Ref document number: 2015783584

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 15306011

Country of ref document: US

Ref document number: 2015783584

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE