WO2015156128A1 - Display control device, display control method, and program - Google Patents
- Publication number
- WO2015156128A1 (PCT/JP2015/059070)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display surface
- image
- user
- display
- projection image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1694—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N13/279—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/373—Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/376—Image reproducers using viewer tracking for tracking left-right translational head movements, i.e. lateral movements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/38—Image reproducers using viewer tracking for tracking vertical translational head movements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/006—Pseudo-stereoscopic systems, i.e. systems wherein a stereoscopic effect is obtained without sending different images to the viewer's eyes
Definitions
- The present technology relates to a display control apparatus, a display control method, and a program, and in particular to a display control device, a display control method, and a program that make it possible to realize, for example, a UI (User Interface) that allows a user to intuitively select a desired area in an image.
- Conventionally, there are image display apparatuses, such as tablets, that have a function of displaying an image.
- For example, when a still image photographed by a camera is displayed on a touch panel, the still image displayed on the display surface of the touch panel can be enlarged or reduced by operating the touch panel on which the still image is displayed.
- In Patent Document 1, for a video communication system in which users A and B communicate, a technique has been proposed in which a two-dimensional image of user B is converted into three-dimensional image information having depth information based on the distance between the display surface and user B, a two-dimensional image of user B viewed from the viewpoint position of user A is generated from the three-dimensional image information of user B, and that image is displayed on the display surface of user A, giving the user a sense of distance to the dialogue partner and a sense of reality.
- In a conventional tablet, however, the position of the user who views the still image (the user holding the tablet) and the position of the display surface are not considered.
- That is, the still image is displayed as if stuck to the display surface. Therefore, even if the user or the display surface moves (translates), the still image displayed on the display surface of the tablet does not change according to that movement of the user or the display surface.
- In other words, the still image displayed on the display surface follows the movement of the display surface, sticking to it as it moves, but the content of the still image (the pixel value of each pixel of the still image displayed on the display surface) does not change.
- As a result, the UI (User Interface) cannot give the user the feeling of looking at an image on the other side of a window, through the window, with the display surface of the tablet as that window, and it is difficult to realize a UI that allows the user to intuitively select the area that the user wants to see in the image (beyond the window).
- The present technology has been made in view of such a situation, and makes it possible to realize a UI with which a user can intuitively select a desired area in an image.
- A display control apparatus or a program according to the present technology includes a detection unit that detects the position of a display surface on which a display device displays an image, and a control unit that controls the display device such that a projection image, obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of a user, is displayed on the display surface.
- A display control method according to the present technology includes the steps of detecting the position of a display surface on which a display device displays an image, and controlling the display device such that a projection image, obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected and the position of the user, is displayed on the display surface.
- In the present technology, the position of the display surface on which the display device displays an image is detected, and a projection image, obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected and the position of the user, is displayed on the display surface.
- Note that the display control device may be an independent device, or may be an internal block constituting a single device.
- The program can be provided by being transmitted via a transmission medium or by being recorded on a recording medium.
FIG. 1 is a perspective view showing a configuration example of an embodiment of an image display device to which the present technology is applied.
FIG. 2 is a block diagram showing a functional configuration example of the image display device.
FIG. 3 is a block diagram showing a configuration example of the control unit 24.
FIG. 4 is a flowchart illustrating an example of processing of the control unit 24.
FIG. 5 is a diagram explaining the principle of generation of a projection image.
Further figures show first, second, and third examples of an image model; examples of generation of a projection image when the display surface 11 moves in the horizontal direction, when the display surface 11 is rotated and tilted in the pitch direction, when the display surface 11 is rotated and tilted in the yaw direction, and when the user moves in the vertical direction; and a block diagram showing a configuration example of an embodiment of a computer to which the present technology is applied.
- FIG. 1 is a perspective view showing a configuration example of an embodiment of an image display device to which the present technology is applied.
- In FIG. 1, the image display device is, for example, a smartphone (or a tablet), and a rectangular display surface 11 for displaying an image and a camera 12 for capturing an image are provided on its front side.
- The display surface 11 is configured as, for example, a touch panel that displays an image and receives input (contact or proximity) from the user.
- FIG. 2 is a block diagram showing a functional configuration example of a smartphone as the image display device of FIG.
- the smartphone includes a data acquisition unit 21, a display surface detection unit 22, a user detection unit 23, a control unit 24, and a display unit 25.
- The data acquisition unit 21 acquires content data such as image data and supplies the data to the control unit 24.
- The data acquisition unit 21 incorporates, for example, a recording (storage) medium, and acquires content by reading content data recorded on the recording medium. For example, computer graphics data, animation data, data captured by a digital (still/video) camera, and the like can be recorded on the recording medium.
- Alternatively, the data acquisition unit 21 is, for example, a network interface, and acquires content data by downloading it from a server on a network such as the Internet.
- The content data acquired by the data acquisition unit 21 may be still image data or moving image data, and may be data of a 2D (two-dimensional) image or data of a 3D image.
- The data acquisition unit 21 can also acquire data of content including an image and sound (audio) attached to the image.
- Further, the data acquisition unit 21 can acquire, as content data to be supplied to the control unit 24, data of an image captured in real time with a camera (not shown) provided on the back side of the smartphone, a camera (not shown) that can communicate with the smartphone, or the like.
- The display surface detection unit 22 detects the position and orientation (tilt) of the display surface 11 of the display unit 25 of the smartphone, and supplies them to the control unit 24 as display surface information.
- As the position and attitude of the display surface 11, for example, the position and attitude of the smartphone itself can be adopted.
- As the display surface detection unit 22, for example, a sensor incorporated in the smartphone that detects motion, such as an acceleration sensor or a gyro, or a magnetic sensor that detects a magnetic field, can be employed. Further, as the display surface detection unit 22, for example, GPS (Global Positioning System) can be adopted.
- The position of the display surface 11 detected by the display surface detection unit 22 may be, for example, an absolute position such as the latitude and longitude obtained from GPS, or it may be a relative position with the position of the display surface 11 at a certain timing as a reference.
- As the posture of the display surface 11 detected by the display surface detection unit 22, for example, one or more of the rotation angles of the display surface 11 in the pitch direction, the yaw direction, and the roll direction can be adopted.
- Note that the display surface detection unit 22 can also detect only the position of the display surface 11.
- That is, the display surface detection unit 22 can detect, as the position of the display surface 11, all of the positions of the display surface 11 in the horizontal, vertical, and depth directions of three-dimensional space, or only one or two of the positions in those directions.
- Likewise, the display surface detection unit 22 can detect, as the attitude of the display surface 11, all of the rotation angles of the display surface 11 in the pitch, yaw, and roll directions, or only one or two of those rotation angles.
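As a numerical illustration (not part of the patent text), an attitude given as pitch, yaw, and roll rotation angles can be combined into a single rotation matrix; the composition order used below is an assumption:

```python
import math

def rotation_matrix(pitch, yaw, roll):
    """Compose a 3x3 rotation matrix from pitch (about x), yaw (about y),
    and roll (about z) angles in radians, applied as R = Rz * Ry * Rx."""
    cx, sx = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    cz, sz = math.cos(roll), math.sin(roll)
    rx = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    rz = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    return matmul(rz, matmul(ry, rx))

# With all angles zero, the attitude is the identity matrix (no tilt).
identity = rotation_matrix(0.0, 0.0, 0.0)
```

Detecting only one or two of the angles corresponds to fixing the remaining arguments at zero (or at default values), as described in the text.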
- The detection accuracy of the position and orientation of the display surface 11 by the display surface detection unit 22 is not particularly limited.
- However, in the smartphone, an image is provided that, based on the position and orientation of the display surface 11, reproduces the view that would be seen if the user viewed the image model generated from the content data through the display surface 11 as a window. Therefore, the choice of detecting both the position and orientation of the display surface 11, or detecting only its position, affects the reproducibility of the image provided by the smartphone (in other words, how window-like the display surface 11 behaves).
- The user detection unit 23 detects the position of the user and supplies the detected position to the control unit 24 as user position information.
- As the user detection unit 23, for example, the camera 12 (FIG. 1) provided on the front of the smartphone can be adopted. In this case, the user detection unit 23 can detect the position of the user relative to the smartphone based on an image of the user captured by the camera 12.
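The excerpt does not specify how the user's position is computed from the camera image. One common, purely illustrative approach is a pinhole-camera estimate from a detected face bounding box; the function name and the constants below are hypothetical:

```python
def estimate_user_position(face_cx, face_cy, face_width_px,
                           image_width_px, image_height_px,
                           focal_length_px=1000.0, real_face_width_cm=15.0):
    """Estimate the user's position relative to the camera (in cm) from a
    detected face bounding box, using a simple pinhole-camera model.
    face_cx, face_cy: face center in pixels; face_width_px: box width."""
    # Depth: apparent size is inversely proportional to distance.
    z = focal_length_px * real_face_width_cm / face_width_px
    # Offset from the optical axis, scaled back to world units.
    x = (face_cx - image_width_px / 2) * z / focal_length_px
    y = (face_cy - image_height_px / 2) * z / focal_length_px
    return (x, y, z)

# A face 250 px wide, centered in a 1280x720 image, comes out ~60 cm away.
pos = estimate_user_position(640, 360, 250, 1280, 720)
# pos == (0.0, 0.0, 60.0)
```

In practice the face detection itself would come from a vision library; the sketch only shows how a bounding box could be turned into a 3D user position.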
- The control unit 24 is a display control device that controls the display of an image, configured by, for example, a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) of the smartphone, and generates an image model from the content data supplied from the data acquisition unit 21.
- For example, the image model is made up of (a set of) voxels as its components.
- A voxel has color and position information; by arranging the color of each voxel at the position of that voxel, the content of the image is constituted.
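A voxel-based image model of this kind can be sketched as a simple data structure; the `Voxel` class and the flat-model helper below are illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class Voxel:
    """One component of the image model: a color placed at a 3D position."""
    x: float
    y: float
    z: float
    color: tuple  # (R, G, B), each 0-255

def model_from_2d_image(pixels, z=100.0):
    """Build a flat image model from a 2D image: every pixel becomes a
    voxel at depth z behind the display surface.
    pixels: 2D list of (R, G, B) tuples; returns a list of Voxel."""
    return [Voxel(float(col), float(row), z, rgb)
            for row, line in enumerate(pixels)
            for col, rgb in enumerate(line)]

# A 1x2 image produces two voxels at depth 100.
model = model_from_2d_image([[(255, 0, 0), (0, 255, 0)]])
```

A 3D image would instead populate voxels at varying depths; the structure of the model is the same either way.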
- The control unit 24 generates an image model, generates the projection image to be displayed on the display surface 11 of the display unit 25 based on the image model, the position and orientation of the display surface 11 represented by the display surface information supplied from the display surface detection unit 22, and the position of the user represented by the user position information supplied from the user detection unit 23, and controls the display unit 25 to display the projection image.
- Specifically, the control unit 24 (virtually) arranges the display surface 11 at (a position corresponding to) the position represented by the display surface information in a predetermined reference coordinate system, and (virtually) arranges the user's (viewpoint) at (a position corresponding to) the position represented by the user position information in that coordinate system.
- Further, the control unit 24 places the image model at a predetermined position in the reference coordinate system (for example, a position on the far side of the display surface 11 as viewed from the user, or a position surrounding the user and the display surface 11).
- The control unit 24 then generates a projection image by projecting the image model onto (the respective pixels of) the display surface 11 along straight lines, each passing through a pixel of the display surface 11 in the reference coordinate system and the position (viewpoint) of the user, and supplies the projection image to the display unit 25 to be displayed on the display surface 11.
- The display unit 25 is a display device that displays an image, such as the touch panel of the smartphone, and displays the projection image under the control of the control unit 24.
- In the display unit 25, the surface on which the projection image is displayed is the display surface 11.
- When the display unit 25 is, for example, a touch panel as described above, the touch panel and the display surface 11 are integrated, so the display surface detection unit 22 can detect the position and orientation of the display surface 11 by detecting the position and attitude of the touch panel as the display unit 25, or of the smartphone integrated with the touch panel.
- Note that the control unit 24 can generate a projection image without using the user position information supplied from the user detection unit 23.
- In this case, the control unit 24 generates the projection image by arranging the user at a predetermined position facing the display surface 11 in the reference coordinate system, for example, at a predetermined distance from the center of the display surface 11 on a straight line passing through the center (of gravity) of the display surface 11 and orthogonal to the display surface 11.
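This default arrangement — a viewpoint placed on the straight line through the center of the display surface and orthogonal to it — can be sketched as follows (the distance value is an arbitrary assumption):

```python
def default_user_position(surface_center, surface_normal, distance=300.0):
    """Place the user's viewpoint at the given distance from the center of
    the display surface, on the straight line through the center and
    orthogonal to the surface. surface_normal must be a unit vector
    pointing toward the viewer."""
    cx, cy, cz = surface_center
    nx, ny, nz = surface_normal
    return (cx + nx * distance, cy + ny * distance, cz + nz * distance)

# Display surface centered at the origin, facing the +z direction:
viewpoint = default_user_position((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
# viewpoint == (0.0, 0.0, 300.0)
```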
- In this case, the smartphone can be configured without providing the user detection unit 23.
- Further, when the display surface detection unit 22 detects only the position of the display surface 11, the control unit 24 can arrange the display surface 11 in the reference coordinate system using, for example, a predetermined default posture as the posture of the display surface 11.
- Likewise, the control unit 24 can arrange the display surface 11 in the reference coordinate system using a predetermined default position as the position of the display surface 11, without using a detected position.
- FIG. 3 is a block diagram showing a configuration example of the control unit 24 of FIG.
- The control unit 24 includes a reference coordinate system generation unit 31, a display surface information acquisition unit 32, an image model generation unit 33, a user position information acquisition unit 34, a display surface arrangement unit 35, an image model arrangement unit 36, a user arrangement unit 37, and an image generation unit 38.
- The reference coordinate system generation unit 31 generates a predetermined three-dimensional coordinate system as a reference coordinate system, and supplies it to the display surface arrangement unit 35, the image model arrangement unit 36, and the user arrangement unit 37.
- For example, when the smartphone is operated so as to display a projection image, the reference coordinate system generation unit 31 generates, at a predetermined timing (hereinafter also referred to as the initial setting timing), a three-dimensional coordinate system whose xy plane is parallel to the display surface 11, or the like, as the reference coordinate system.
- The display surface information acquisition unit 32 acquires display surface information from the display surface detection unit 22 and supplies it to the display surface arrangement unit 35.
- In addition to the position and orientation of the display surface 11, the display surface information can include the shape of the display surface 11 as needed.
- Data of content is supplied to the image model generation unit 33 from the data acquisition unit 21.
- The image model generation unit 33 analyzes the content data from the data acquisition unit 21, identifies, for example, whether the content data represents a 2D image or a 3D image, and then generates an image model corresponding to the content data.
- The image model generation unit 33 supplies the image model to the image model arrangement unit 36.
- The user position information acquisition unit 34 acquires user position information from the user detection unit 23 and supplies it to the user arrangement unit 37.
- The display surface arrangement unit 35 (virtually) arranges the display surface 11 on the reference coordinate system from the reference coordinate system generation unit 31 based on the display surface information from the display surface information acquisition unit 32, and supplies the arrangement result to the image generation unit 38.
- That is, the display surface arrangement unit 35 arranges the display surface 11 at (a position corresponding to) the position represented by the display surface information in the reference coordinate system.
- Note that, at the initial setting timing, the display surface 11 is arranged in the reference coordinate system so as to be parallel to the xy plane of the reference coordinate system.
- The display surface 11 is, for example, a rectangular surface. Assuming that at the initial setting timing the user holds the display surface 11 with one of its long and short sides, for example the long side, directed horizontally, the display surface 11 is arranged in the reference coordinate system such that the long side is parallel to the x-axis and the short side is parallel to the y-axis.
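At the initial setting timing this arrangement amounts to mapping each pixel index of the display surface to a point in the reference coordinate system, with the long side along the x-axis and the short side along the y-axis. A minimal sketch, with assumed physical size and resolution:

```python
def pixel_world_position(px, py, center, width_mm, height_mm, res_x, res_y):
    """Map pixel indices (px, py) on a display surface parallel to the xy
    plane to reference-coordinate-system coordinates. The long side
    (width) lies along the x-axis, the short side along the y-axis, and
    the surface is centered on `center`."""
    cx, cy, cz = center
    # Pixel centers, offset from the surface center, scaled by pixel pitch.
    x = cx + (px + 0.5 - res_x / 2) * (width_mm / res_x)
    y = cy + (py + 0.5 - res_y / 2) * (height_mm / res_y)
    return (x, y, cz)  # the surface lies in the plane z = cz

# Pixel (0, 0) of a 100x50 mm surface at 200x100 px lands near a corner.
p = pixel_world_position(0, 0, (0.0, 0.0, 0.0), 100.0, 50.0, 200, 100)
# p == (-49.75, -24.75, 0.0)
```

Once the display surface moves or tilts, these points would additionally be translated and rotated by the detected position and attitude.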
- At the initial setting timing, the image model arrangement unit 36 (virtually) arranges the image model supplied from the image model generation unit 33 on the reference coordinate system from the reference coordinate system generation unit 31, and supplies the arrangement result to the image generation unit 38.
- That is, the image model arrangement unit 36 arranges the image model at a predetermined position in the reference coordinate system, for example, at a position on the far side of the display surface 11 as viewed from the user, or at a position surrounding the user and the display surface 11.
- The user arrangement unit 37 (virtually) places the user (viewpoint) on the reference coordinate system from the reference coordinate system generation unit 31 based on the user position information from the user position information acquisition unit 34, and supplies the arrangement result to the image generation unit 38.
- That is, the user arrangement unit 37 places the user at (a position corresponding to) the position represented by the user position information in the reference coordinate system.
- Based on the arrangement result of the display surface 11 in the reference coordinate system from the display surface arrangement unit 35, the arrangement result of the image model in the reference coordinate system from the image model arrangement unit 36, and the arrangement result of the user in the reference coordinate system from the user arrangement unit 37, the image generation unit 38 generates a projection image for displaying on the display surface 11 an image that reproduces the view that would be seen if the user viewed the image model through the display surface 11 as a window, and supplies it to the display unit 25.
- That is, the image generation unit 38 generates a projection image by projecting the image model onto (the respective pixels of) the display surface 11 along straight lines, each passing through a pixel of the display surface 11 in the reference coordinate system and the position (viewpoint) of the user.
- Specifically, taking a pixel of the display surface 11 as the pixel of interest, the image generation unit 38 detects, as a cross voxel, the voxel of the image model at the position where the image model intersects the straight line passing through the pixel of interest and the user in the reference coordinate system.
- The image generation unit 38 adopts the color of the cross voxel as the pixel value of the pixel of interest, and performs the above processing with every pixel of the display surface 11 in the reference coordinate system taken in turn as the pixel of interest, thereby generating a projection image in which part or all of the image model in the reference coordinate system is projected onto the display surface 11.
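As an illustration only (not the patent's implementation), the cross-voxel search can be approximated by casting a ray from the user's viewpoint through each pixel and treating each voxel as a small sphere of radius `voxel_size`; the function name, the sphere approximation, and the nearest-hit rule are all assumptions:

```python
import math

def project(voxels, viewpoint, pixel_positions, voxel_size=1.0,
            background=(0, 0, 0)):
    """For each pixel position on the display surface, follow the straight
    line through the user's viewpoint and that pixel, and adopt the color
    of the nearest voxel the line crosses (the 'cross voxel').
    voxels: list of (x, y, z, color) tuples."""
    image = []
    for pix in pixel_positions:
        # Unit direction of the straight line viewpoint -> pixel.
        d = [p - v for p, v in zip(pix, viewpoint)]
        norm = math.sqrt(sum(c * c for c in d))
        d = [c / norm for c in d]
        best_t, color = float("inf"), background
        for vox in voxels:
            center = vox[:3]
            # Parameter t of the ray point closest to the voxel center.
            t = sum((c - v) * dc for c, v, dc in zip(center, viewpoint, d))
            if t <= 0:
                continue  # the voxel is behind the viewpoint
            closest = [v + t * dc for v, dc in zip(viewpoint, d)]
            dist2 = sum((c - q) ** 2 for c, q in zip(center, closest))
            if dist2 <= voxel_size ** 2 and t < best_t:
                best_t, color = t, vox[3]
        image.append(color)
    return image

# A single red voxel straight behind the single pixel: the ray crosses it.
voxels = [(0.0, 0.0, -100.0, (255, 0, 0))]
img = project(voxels, viewpoint=(0.0, 0.0, 300.0),
              pixel_positions=[(0.0, 0.0, 0.0)])
# img == [(255, 0, 0)]
```

A real implementation would traverse the voxel grid rather than test every voxel, but the geometry — one straight line per pixel, passing through the user's position — is the same as in the text.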
- FIG. 4 is a flowchart illustrating an example of processing of the control unit 24 of FIG.
- In step S11, the reference coordinate system generation unit 31 generates a reference coordinate system and supplies it to the display surface arrangement unit 35, the image model arrangement unit 36, and the user arrangement unit 37, and the process proceeds to step S12.
- In step S12, the image model generation unit 33 generates an image model corresponding to the content data supplied from the data acquisition unit 21, supplies the image model to the image model arrangement unit 36, and the process proceeds to step S13.
- In step S13, the display surface information acquisition unit 32 acquires display surface information from the display surface detection unit 22 and supplies it to the display surface arrangement unit 35, the user position information acquisition unit 34 acquires user position information from the user detection unit 23 and supplies it to the user arrangement unit 37, and the process proceeds to step S14.
- In step S14, the display surface arrangement unit 35 arranges the display surface 11 on the reference coordinate system from the reference coordinate system generation unit 31 based on the display surface information from the display surface information acquisition unit 32, and supplies the arrangement result to the image generation unit 38.
- Also in step S14, the user arrangement unit 37 arranges the user on the reference coordinate system from the reference coordinate system generation unit 31 based on the user position information from the user position information acquisition unit 34, and supplies the arrangement result to the image generation unit 38.
- Further, in step S14, the image model arrangement unit 36 arranges the image model at a predetermined position in the reference coordinate system, for example, at a position on the far side of the display surface 11 as viewed from the user, and the process proceeds to step S15.
- In step S15, based on the arrangement result of the display surface 11 in the reference coordinate system from the display surface arrangement unit 35, the arrangement result of the image model in the reference coordinate system from the image model arrangement unit 36, and the arrangement result of the user in the reference coordinate system from the user arrangement unit 37, the image generation unit 38 generates a projection image in which the image model is projected onto each pixel of the display surface 11 along straight lines, each passing through a pixel of the display surface 11 in the reference coordinate system and the position (viewpoint) of the user, and the process proceeds to step S16.
- step S16 the image generation unit 38 supplies the projection image to the display unit 25 to display the projection image on the display surface 11, and the process proceeds to step S17.
- In step S17, as in step S13, the display surface information acquisition unit 32 acquires display surface information from the display surface detection unit 22 and supplies it to the display surface arrangement unit 35, and the user position information acquisition unit 34 acquires user position information from the user detection unit 23 and supplies it to the user placement unit 37, and the process proceeds to step S18.
- In step S18, the display surface arrangement unit 35 arranges the display surface 11 on the reference coordinate system from the reference coordinate system generation unit 31 based on the display surface information from the display surface information acquisition unit 32 and supplies the arrangement result to the image generation unit 38, while the user placement unit 37 arranges the user on the reference coordinate system from the reference coordinate system generation unit 31 based on the user position information from the user position information acquisition unit 34 and supplies the arrangement result to the image generation unit 38.
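The acquisition, arrangement, and generation in steps S13 through S18 repeat each frame, which can be sketched as a simple update loop. The Python sketch below is illustrative only; the callable names (`get_display_pose`, `generate`, and so on) are hypothetical stand-ins for the detection, arrangement, and generation units, not names used in the patent.

```python
def run_display_loop(get_display_pose, get_user_pos, generate, show, frames):
    """Per-frame loop over steps S13-S18: re-read the display surface
    information and the user position, regenerate the projection image
    from the new arrangement, and hand it to the display unit."""
    for _ in range(frames):
        display_pose = get_display_pose()         # display surface information (S13/S17)
        user_pos = get_user_pos()                 # user position information (S13/S17)
        image = generate(display_pose, user_pos)  # arrange and project (S14/S15/S18)
        show(image)                               # display on the display surface (S16)
```

Because the loop re-reads both poses every iteration, the projection image tracks any movement of the display surface or of the user.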
- FIG. 5 is a diagram for explaining the principle of generation of a projection image by the image generation unit 38 of FIG. 3.
- In FIG. 5, the user, the display surface 11, and the image model are arranged on the reference coordinate system based on the display surface information and the user position information.
- The image generation unit 38 detects, as a cross voxel V, the voxel V at the position of the image model that intersects the straight line passing through the target pixel P of the display surface 11 and the user (the dotted-line arrow in FIG. 5) in the reference coordinate system.
- The image generation unit 38 projects the cross voxel V onto the target pixel P by adopting the color of the cross voxel V as the pixel value of the target pixel P.
- The image generation unit 38 generates the projection image displayed on the display surface 11 by performing the same processing with each pixel of the display surface 11 in the reference coordinate system as the target pixel.
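As a concrete illustration of this per-pixel projection, the sketch below casts, for each display pixel, the straight line from the user's viewpoint through that pixel to a planar image model and adopts the color of the voxel nearest the model intersection position. It assumes an opaque 2D-image model lying in a plane of constant z; all function and variable names are illustrative, not from the patent.

```python
def generate_projection(model, model_z, pixel_positions, user_pos):
    """For each display pixel, follow the straight line from the user's
    viewpoint through that pixel to the model plane z = model_z, and
    adopt the colour of the voxel nearest the intersection.

    model: dict mapping integer (x, y) voxel positions to colours;
    pixel_positions: (x, y, z) points on the display surface;
    user_pos: (x, y, z) position of the user's viewpoint.
    """
    ux, uy, uz = user_pos
    projection = []
    for px, py, pz in pixel_positions:
        t = (model_z - uz) / (pz - uz)     # line parameter at the model plane
        hx = ux + t * (px - ux)            # model intersection position, x
        hy = uy + t * (py - uy)            # model intersection position, y
        key = (round(hx), round(hy))       # cross voxel nearest the hit point
        projection.append(model.get(key))  # its colour becomes the pixel value
    return projection
```

For a user at (0, 0, 10), a display pixel at (1, 0, 0), and a model plane at z = -10, the line reaches the model at x = 2, so the color of the voxel at (2, 0) is adopted for that pixel.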
- Even when there is a plurality of cross voxels, if the color of the cross voxel closest to the near side has transparency, the color obtained by superimposing, according to that transparency, the color of the nearest cross voxel on the color of the second cross voxel from the near side (the inner side) is adopted as the pixel value of the target pixel P.
- The same applies when the colors of the second and subsequent cross voxels from the near side have transparency.
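This superimposition according to transparency can be sketched as front-to-back compositing over the cross voxels on one ray. The scalar grey-level colors and the exact weighting below are illustrative assumptions, since the patent does not fix a formula.

```python
def composite_cross_voxels(cross_voxels):
    """Superimpose the colours of the cross voxels on one ray from the
    near side inward, weighting each by its transparency.

    cross_voxels: (color, alpha) pairs ordered from the near side;
    color is a grey level in [0, 1], alpha = 1.0 means fully opaque.
    """
    result = 0.0
    visibility = 1.0                 # how much of the deeper voxels shows through
    for color, alpha in cross_voxels:
        result += visibility * alpha * color
        visibility *= 1.0 - alpha
        if visibility == 0.0:        # an opaque cross voxel hides everything behind
            break
    return result
```

With a half-transparent white voxel in front of an opaque black one, the target pixel receives the 50/50 mixture, matching the superimposition described above.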
- Here, the image generation unit 38 can detect, for example, the voxel closest to the model intersection position (the position at which the straight line intersects the image model) as the cross voxel, and adopt the color of that cross voxel as the pixel value of the target pixel P.
- Alternatively, the image generation unit 38 can detect a plurality of voxels closest or adjacent to the model intersection position as candidate voxels, which become candidates for the cross voxel, and generate an interpolated voxel at the model intersection position by interpolation (or extrapolation) using the plurality of candidate voxels.
- In this case, a composite color is generated as the color of the interpolated voxel by combining the colors of the plurality of candidate voxels according to the distance between each candidate voxel and the model intersection position, and this composite color is adopted as the pixel value of the target pixel P.
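One way to combine the candidate voxels' colors according to distance is inverse-distance weighting, sketched below. The patent does not specify the exact weighting, so this scheme and the scalar grey-level colors are assumptions for illustration.

```python
import math

def interpolated_color(candidates, hit):
    """Combine the candidate voxels' colours into a composite colour,
    weighting each by the inverse of its distance to the model
    intersection position.

    candidates: ((x, y, z), color) pairs with scalar grey-level colors;
    hit: the model intersection position (x, y, z).
    """
    weighted = []
    for pos, color in candidates:
        d = math.dist(pos, hit)
        if d == 0.0:
            return color             # the hit lies exactly on a candidate voxel
        weighted.append((1.0 / d, color))
    total = sum(w for w, _ in weighted)
    return sum(w * c for w, c in weighted) / total
```

A hit point midway between two candidate voxels thus receives the average of their colors, and a hit point coinciding with a voxel receives that voxel's color exactly.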
- As described above, the user, the display surface 11, and the image model are arranged on the reference coordinate system based on the display surface information and the user position information, and a projection image is generated by projecting the image model onto the pixels P of the display surface 11 along the straight lines passing through those pixels and the user, so that an image reproducing the view seen when the user looks at the image model with the display surface 11 as a window can be obtained as the projection image.
- That is, when the projection image is displayed on the display surface 11, the image projected onto the user's retina is the same as (identical or similar to) the image that would be projected onto the user's retina if the user viewed the image model through a window formed by the display surface 11, and the user can enjoy the sensation of the image model being localized in front of the user's eyes and of looking at the image model through the window formed by the display surface 11.
- Since the projection image displayed on the display surface 11 is an image that reproduces the view seen when the user looks at the image model with the display surface 11 as a window, a projection image can be generated using, for example, an image model (hereinafter also referred to as the image model of place A) obtained from (the image content of) an image of the scenery of place A taken by a photographing device.
- The projection image can also be generated based on the position (absolute position) and posture of the photographing device at the time the photographing device photographed place A.
- For example, a projection image using the image model of place A can be generated when a user carrying a smartphone is actually at place A.
- In this case, in generating the projection image using the image model of place A, the image model can be placed in the reference coordinate system, based on the position and posture of the photographing device at the time of photographing place A, so that the scenery at the time of photographing is reproduced.
- As a result, the user can see the past scenery of place A at place A through the display surface 11 as a window, and can experience, so to speak, a window in time, or a time machine.
- That is, suppose the user at place A can see the current scenery of place A. When the image model of place A is placed in the reference coordinate system so that generation of the projection image using the image model of place A reproduces the past scenery at the time place A was photographed, the display surface 11 of the smartphone displays, as the projection image, the past scenery in a direction B at the time place A was photographed by the photographing device.
- The user can thus view, as the projection image displayed on the display surface 11, the past scenery in direction B that could have been seen had the user been at place A at the time the photographing device photographed place A, with the display surface 11 as a window, and can enjoy the sensation of seeing the current scenery in direction B of place A as the actual scenery while seeing the past scenery in direction B of place A through the window formed by the display surface 11.
- FIG. 6 is a diagram showing a first example of an image model.
- the image model is composed of voxels having position and color information, and can represent an image of a structure of any shape.
- FIG. 6 shows an example of an image model of a 2D image.
- the image model of the 2D image has, for example, a rectangular (planar) structure.
- the voxels of the image model of the 2D image have horizontal and vertical position information as position information.
- FIG. 7 is a diagram showing a second example of the image model.
- FIG. 7 shows an example of an image model of a 3D image.
- Since the image model of a 3D image has a complex structure extending in the horizontal, vertical, and depth directions, the voxels of the image model of a 3D image have, as position information, information on their horizontal, vertical, and depth positions.
- FIG. 8 is a diagram showing a third example of the image model.
- The image model of FIG. 8 has a curved-surface structure in which a rectangle is curved.
- In addition, any structure, such as a sphere, can be adopted for the image model.
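The voxel representation described above (a position plus a color, with a 2D image becoming a rectangular planar sheet of voxels) can be sketched as follows; the class and function names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Voxel:
    """One element of an image model: a position plus a colour.
    A 2D-image model keeps z fixed; a 3D-image model also varies z."""
    x: float
    y: float
    z: float
    color: tuple  # e.g. (r, g, b), optionally with a transparency value

def model_from_2d_image(pixels):
    """Lay a 2D image out as a rectangular (planar) sheet of voxels."""
    return [Voxel(float(x), float(y), 0.0, c)
            for y, row in enumerate(pixels)
            for x, c in enumerate(row)]
```

Curved-surface or spherical image models would differ only in how the z coordinate (and possibly x and y) is assigned to each voxel.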
- Next, specific examples of the projection image generated in each of the cases where the display surface 11 moves (translates) in the horizontal, vertical, and depth directions, where the display surface 11 rotates in the pitch, yaw, and roll directions, and where the user moves in the horizontal, vertical, and depth directions will be described.
- Although these specific examples of projection-image generation are described separately for each of the cases where the display surface 11 moves in the horizontal, vertical, and depth directions, where the display surface 11 rotates in the pitch, yaw, and roll directions, and where the user moves in the horizontal, vertical, and depth directions, a projection image can be generated for any combination of these movements and rotations.
- FIG. 9 is a diagram for explaining an example of generation of a projection image when the display surface 11 moves in the horizontal direction.
- the display surface 11 and the user move from the state of the initial setting timing.
- It is assumed that the reference coordinate system is, for example, a three-dimensional coordinate system in which the left-to-right direction along the long side of the display surface 11 at the initial setting timing is the x axis, the bottom-to-top direction along the short side of the display surface 11 is the y axis, and the direction orthogonal to the display surface 11 and pointing from the display surface 11 toward the user is the z axis.
- It is also assumed that the user is located on the side facing the display surface 11, on a straight line passing through the center of the display surface 11, and that the horizontal, vertical, and depth directions of the user and the display surface 11 are parallel to the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- Further, an image model of a still 2D image is adopted as the image model.
- a of FIG. 9 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- In the horizontal movement of the display surface 11, the user moves the display surface 11, without changing its posture, within the plane containing the display surface 11 at the initial setting timing (a plane parallel to the xy plane), leftward as shown by the solid thick arrow in A of FIG. 9, or rightward as shown by the dotted thick arrow in A of FIG. 9.
- FIG. 9 shows a projection image generated before movement (initial setting timing) and a projection image generated after movement when the display surface 11 is moved in the left direction.
- FIG. 9 shows the projection image generated before the movement and the projection image generated after the movement when the display surface 11 is moved in the right direction.
- That is, the image model of the still image present on the other side of the display surface 11 as a window remains in place, and the movement of the window generates a projection image in which the visible range of the image model has changed to a range to the right of the range before the movement.
- FIG. 10 is a diagram for further explaining an example of generation of a projection image when the display surface 11 moves in the horizontal direction.
- FIG. 10 is a view of the reference coordinate system as viewed from the positive direction of the y axis; the left-to-right direction, the direction perpendicular to the drawing, and the top-to-bottom direction are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- In FIG. 10, the display surface 11 and the image model (of a still 2D image) are parallel to the xy plane both before and after the movement of the display surface 11.
- The voxel V#1 of the image model is projected onto the pixel P#1 on the left side of the display surface 11 at time T before the movement of the display surface 11, but at time T+1 after the display surface 11 is moved leftward, it is projected onto the pixel P#2 on the right side of the display surface 11.
- FIG. 11 is a diagram showing a display example of the projection image at time T, when the voxel V#1 in FIG. 10 is projected onto the pixel P#1, and a display example of the projection image at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- The voxel V#1 of the image model appears as the pixel P#1 on the left of the display surface 11 as a window at time T before the movement of the display surface 11, and at time T+1 after the leftward movement of the display surface 11, it appears as the pixel P#2 on the right of the display surface 11 as a window.
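The pixel shift in FIGS. 10 and 11 can be reproduced numerically by intersecting the straight line from the user to the voxel with the display plane and expressing the hit point relative to the display center. The function name and the numbers below are illustrative, not from the patent.

```python
def pixel_on_display(voxel, user, display_center, display_z):
    """Intersect the straight line user -> voxel with the display plane
    z = display_z and return the hit point in coordinates relative to
    the display centre (positive x = right side of the window)."""
    t = (display_z - user[2]) / (voxel[2] - user[2])
    hit_x = user[0] + t * (voxel[0] - user[0])
    hit_y = user[1] + t * (voxel[1] - user[1])
    return (hit_x - display_center[0], hit_y - display_center[1])
```

With the user at (0, 0, 10) and the voxel at (-4, 0, -10), the hit point lies at x = -2 in the reference coordinate system; relative to a display centered at the origin it is on the left of the window (x = -2), but after the display moves leftward to center (-3, 0, 0), the same hit point lies on the right of the window (x = +1), matching the shift from P#1 to P#2.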
- FIG. 12 is a diagram for explaining an example of generation of a projection image when the display surface 11 moves in the vertical direction.
- a of FIG. 12 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- In the vertical movement of the display surface 11, the user moves the display surface 11, without changing its posture, within the plane containing the display surface 11 at the initial setting timing (a plane parallel to the xy plane), downward as shown by the solid thick arrow in A of FIG. 12, or upward as shown by the dotted thick arrow in A of FIG. 12.
- FIG. 12 shows a projection image generated before the movement (initial setting timing) and a projection image generated after the movement when the display surface 11 is moved downward.
- When the display surface 11 is moved downward, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model through the display surface 11 as a window located on the lower side in front of the user.
- FIG. 12 shows the projection image generated before the movement and the projection image generated after the movement when the display surface 11 is moved upward.
- That is, the image model of the still image present on the other side of the display surface 11 as a window remains in place, and the movement of the window generates a projection image in which the visible range of the image model has changed to a range above the range before the movement.
- FIG. 13 is a diagram further illustrating an example of generation of a projection image when the display surface 11 moves in the vertical direction.
- FIG. 13 is a view of the reference coordinate system as viewed from the positive direction of the x axis; the direction perpendicular to the drawing, the right-to-left direction, and the top-to-bottom direction are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- In FIG. 13, the display surface 11 and the image model (of a still 2D image) are parallel to the xy plane both before and after the movement of the display surface 11.
- The voxel V#1 of the image model is projected onto the pixel P#1 near the vertical center of the display surface 11 at time T before the movement of the display surface 11, but at time T+1 after the downward movement of the display surface 11, it is projected onto the pixel P#2 on the upper side of the display surface 11.
- FIG. 14 is a diagram showing a display example of the projection image at time T, when the voxel V#1 in FIG. 13 is projected onto the pixel P#1, and a display example of the projection image at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- The voxel V#1 of the image model appears as the pixel P#1 near the vertical center of the display surface 11 as a window at time T before the movement of the display surface 11, and at time T+1 after the downward movement of the display surface 11, it appears as the pixel P#2 on the upper side of the display surface 11 as a window.
- FIG. 15 is a diagram for explaining an example of generation of a projection image when the display surface 11 moves in the depth direction.
- a of FIG. 15 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- In the movement of the display surface 11 in the depth direction, the user moves the display surface 11, without changing its posture, in the direction orthogonal to the display surface 11 at the initial setting timing (the direction of the z axis), toward the near side (the front side as viewed from the user) as shown by the solid thick arrow in A of FIG. 15, or toward the far side (the back side as viewed from the user) as shown by the dotted thick arrow in A of FIG. 15.
- FIG. 15 shows a projection image generated before movement (initial setting timing) and a projection image generated after movement when the display surface 11 is moved in the near direction.
- When the display surface 11 is moved toward the near side, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model through the display surface 11 as a window located on the near side in front of the user.
- FIG. 15 shows the projection image generated before the movement and the projection image generated after the movement when the display surface 11 is moved in the back direction.
- FIG. 16 is a diagram further illustrating an example of generation of a projection image when the display surface 11 moves in the depth direction.
- FIG. 16 is a view of the reference coordinate system as viewed from the positive direction of the y axis; the left-to-right direction, the direction perpendicular to the drawing, and the top-to-bottom direction are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- In FIG. 16, the display surface 11 and the image model (of a still 2D image) are parallel to the xy plane both before and after the movement of the display surface 11.
- The voxel V#1 of the image model is projected onto the pixel P#1 on the left side, far from the center of the display surface 11, at time T before the movement of the display surface 11, but at time T+1 after the movement of the display surface 11 toward the near side, it is projected onto the pixel P#2 on the left side, closer to the center of the display surface 11.
- FIG. 17 is a diagram showing a display example of the projection image at time T, when the voxel V#1 in FIG. 16 is projected onto the pixel P#1, and a display example of the projection image at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- The voxel V#1 of the image model appears as the pixel P#1 on the left side, far from the center of the display surface 11 as a window, at time T before the movement of the display surface 11, and at time T+1 after the movement of the display surface 11 toward the near side, it appears as the pixel P#2 on the left side, closer to the center of the display surface 11 as a window.
- FIG. 18 is a view for explaining an example of generation of a projection image when the display surface 11 is rotated and tilted in the pitch direction.
- a of FIG. 18 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- In the rotation of the display surface 11 in the pitch direction (rotation about the x axis), the user rotates and tilts the display surface 11, without changing its position, about a straight line parallel to the x axis passing through the center of the display surface 11 at the initial setting timing as the rotation axis, in the direction shown by the solid thick arrow in A of FIG. 18, or in the direction opposite to that arrow.
- As shown in B of FIG. 18, before the display surface 11 is tilted, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model of the still image through the display surface 11 as a window facing the user directly in front of the user.
- When the display surface 11 is rotated in the pitch direction and tilted, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model of the still image through the display surface 11 as a window tilted in the pitch direction directly in front of the user.
- That is, the image model of the still image present on the other side (back side) of the display surface 11 as the tilted window remains in place, and tilting the window in the pitch direction generates a projection image in which the visible range of the image model has changed to a narrower range than before the window was tilted.
- FIG. 19 is a diagram further illustrating an example of generation of a projection image when the display surface 11 is tilted in the pitch direction.
- FIG. 19 is a view of the reference coordinate system as viewed from the negative direction of the x axis; the direction perpendicular to the drawing, the bottom-to-top direction, and the left-to-right direction are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- In FIG. 19, the image model (of a still 2D image) is parallel to the xy plane both before and after the display surface 11 is tilted.
- The display surface 11 is parallel to the xy plane at time T before being tilted in the pitch direction, but is not parallel to the xy plane at time T+1 after being tilted in the pitch direction.
- At time T, the voxel V#1 of the image model is projected onto the pixel P#1 on the upper side, closer to the center of the display surface 11, but at time T+1 after the display surface 11 is tilted, it is projected onto the pixel P#2 on the upper side, far from the center of the display surface 11.
- FIG. 20 is a diagram showing a display example of the projection image at time T, when the voxel V#1 in FIG. 19 is projected onto the pixel P#1, and a display example of the projection image at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- The voxel V#1 of the image model appears as the pixel P#1 on the upper side, near the center of the display surface 11 as a window, at time T before the display surface 11 is tilted, and at time T+1 after the display surface 11 is tilted, it appears as the pixel P#2 on the upper side, far from the center of the display surface 11 as a window (the upper side in the window frame of the window simulated by the display surface 11).
- The user can thus enjoy the sensation of the image model being localized in front of the user's eyes and of the range of the image model seen through the window changing as the window formed by the display surface 11 is tilted in the pitch direction.
- FIG. 21 is a view showing an example of the projection image displayed on the display surface 11 when the display surface 11 tilted in the pitch direction as in FIG. 20 is viewed from the direction orthogonal to the display surface 11.
- As described above, the projection image displayed on the display surface 11 in the state of being tilted in the pitch direction is a vertically elongated image, so that, with the tilted display surface 11 as a window, it becomes an image of the view seen when the image model is viewed through that window.
- FIG. 22 is a diagram for explaining an example of generation of a projection image when the display surface 11 is rotated and tilted in the yaw direction.
- a of FIG. 22 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- In the rotation of the display surface 11 in the yaw direction (rotation about the y axis), the user rotates and tilts the display surface 11, without changing its position, about a straight line parallel to the y axis passing through the center of the display surface 11 at the initial setting timing as the rotation axis, in the direction shown by the solid thick arrow in A of FIG. 22, or in the direction opposite to that arrow.
- Before the display surface 11 is tilted, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model of the still image through the display surface 11 as a window facing the user directly in front of the user.
- When the display surface 11 is rotated in the yaw direction and tilted, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model of the still image through the display surface 11 as a window tilted in the yaw direction directly in front of the user.
- That is, the image model of the still image present on the other side (back side) of the display surface 11 as the tilted window remains in place, and tilting the window in the yaw direction generates a projection image in which the visible range of the image model has changed to a narrower range than before the window was tilted.
- FIG. 23 is a diagram further illustrating an example of generation of a projection image when the display surface 11 is tilted in the yaw direction.
- FIG. 23 is a view of the reference coordinate system as viewed from the positive direction of the y axis; the left-to-right direction, the direction perpendicular to the drawing, and the top-to-bottom direction are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- In FIG. 23, the image model (of a still 2D image) is parallel to the xy plane both before and after the display surface 11 is tilted.
- The display surface 11 is parallel to the xy plane at time T before being tilted in the yaw direction, but is not parallel to the xy plane at time T+1 after being tilted in the yaw direction.
- At time T, the voxel V#1 of the image model is projected onto the pixel P#1 on the left side, closer to the center of the display surface 11, but at time T+1 after the display surface 11 is tilted, it is projected onto the pixel P#2 on the left side, far from the center of the display surface 11.
- FIG. 24 is a diagram showing a display example of the projection image at time T, when the voxel V#1 in FIG. 23 is projected onto the pixel P#1, and a display example of the projection image at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- The voxel V#1 of the image model appears as the pixel P#1 on the left side, near the center of the display surface 11 as a window, at time T before the display surface 11 is tilted, and at time T+1 after the display surface 11 is tilted, it appears as the pixel P#2 on the left side, far from the center of the display surface 11 as a window (the left side in the window frame of the window simulated by the display surface 11).
- The user can thus enjoy the sensation of the image model being localized in front of the user's eyes and of the range of the image model seen through the window changing as the window formed by the display surface 11 is tilted in the yaw direction.
- FIG. 25 is a view showing an example of a projected image displayed on the display surface 11 when the display surface 11 tilted in the yaw direction of FIG. 24 is viewed from the direction orthogonal to the display surface 11.
- As described above, the projection image displayed on the display surface 11 in the state of being tilted in the yaw direction is a horizontally elongated image, so that, with the tilted display surface 11 as a window, it becomes an image of the view seen when the image model is viewed through that window.
- FIG. 26 is a diagram for explaining an example of generation of a projection image in the case where the display surface 11 is rotated and tilted in the roll direction.
- a of FIG. 26 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- In the rotation of the display surface 11 in the roll direction (rotation about the z axis), the user rotates and tilts the display surface 11, without changing its position, about a straight line parallel to the z axis passing through the center of the display surface 11 at the initial setting timing as the rotation axis, in the direction shown by the solid thick arrow in A of FIG. 26, or in the direction opposite to that arrow.
- B of FIG. 26 shows the projection image generated before the display surface 11 is tilted (initial setting timing) and the projection image generated after the display surface 11 is tilted, when the display surface 11 is rotated in the roll direction and tilted as indicated by the solid thick arrow in A of FIG. 26.
- Before the display surface 11 is tilted, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model through the display surface 11 as a window directly in front of the user, with its long side and short side directed in the user's horizontal and vertical directions, respectively.
- When the display surface 11 is rotated in the roll direction and tilted, according to the principle described with reference to FIG. 5, a projection image is generated as if the user were looking at the image model of the still image through the display surface 11 as a window tilted in the roll direction directly in front of the user.
- That is, the image model of the still image present on the other side (back side) of the display surface 11 as the tilted window remains in place, and tilting the window in the roll direction generates a projection image in which the visible range of the image model has changed to a range different from that before the window was tilted.
- FIGS. 27 and 28 are diagrams further illustrating an example of generation of a projection image when the display surface 11 is tilted in the roll direction.
- FIG. 27 is a view of the reference coordinate system as viewed from the positive direction of the y axis; the left-to-right direction, the direction perpendicular to the drawing, and the top-to-bottom direction are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- FIG. 28 is a view of the reference coordinate system as viewed from the positive direction of the z axis; the left-to-right direction, the bottom-to-top direction, and the direction perpendicular to the drawing are the x axis, the y axis, and the z axis of the reference coordinate system, respectively.
- the display surface 11 and the image model are parallel to the xy plane both before and after the display surface 11 is tilted.
- At time T before the display surface 11 is tilted in the roll direction, the voxel V#1 of the image model is projected onto the pixel P#1 on a straight line parallel to the x axis passing through the center of the display surface 11, but at time T+1 after the display surface 11 is tilted, it is projected onto the pixel P#2 on the lower left side of the center of the display surface 11 (as seen in the state not tilted in the roll direction).
- FIG. 29 is a diagram showing a display example of the projection image at time T, when the voxel V#1 in FIGS. 27 and 28 is projected onto the pixel P#1, and a display example of the projection image at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- The voxel V#1 of the image model appears as the pixel P#1 on a straight line parallel to the x axis passing through the center of the display surface 11 as a window at time T before the display surface 11 is tilted, and at time T+1 after the display surface 11 is tilted, it appears as the pixel P#2 on the lower left side of the center of the display surface 11 as a window (the lower left side in the window frame simulated by the display surface 11).
- The user can thus enjoy the sensation of the image model being localized in front of the user's eyes and of the range of the image model seen through the window changing as the window formed by the display surface 11 is tilted in the roll direction.
- FIG. 30 is a view showing an example of the projection image displayed on the display surface 11 when the display surface 11 tilted in the roll direction as in FIG. 29 is viewed while similarly tilted in the roll direction.
- the projection image displayed on the display surface 11 in the state of being inclined in the roll direction when the display surface 11 in the state of being inclined in the roll direction is viewed as such, uses the inclined display surface 11 as a window,
- the image is inclined in the roll direction opposite to the roll direction in which the display surface 11 is inclined so as to be an image of a view seen when the image model is viewed from the window.
- FIG. 31 is a diagram for explaining an example of generation of a projection image when the user moves in the horizontal direction.
- A of FIG. 31 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves leftward in a plane parallel to the display surface 11 at the initial setting timing (a plane parallel to the xy plane), as shown by the thick solid arrow in A of FIG. 31, or moves rightward, as shown by the thick dotted arrow in A of FIG. 31.
- FIG. 31 also illustrates the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the user moves leftward.
- in that case, a projection image is generated in which the visible range of the image model has changed to a range on the right side of the range before the movement.
- FIG. 31 also shows the projection image generated before the movement and the projection image generated after the movement when the user moves rightward.
- in that case, a projection image is generated in which the image model of the still image, present on the other side of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range on the left side of the range before the movement.
- FIG. 32 is a diagram further illustrating an example of generation of a projection image when the user moves in the horizontal direction.
- FIG. 32 is a view of the reference coordinate system as viewed from the positive direction of the y-axis; the directions from left to right, perpendicular to the drawing, and from top to bottom correspond to the x-axis, y-axis, and z-axis of the reference coordinate system, respectively.
- the display surface 11 and the image model (of a still image, which is a 2D image) are parallel to the xy plane both before and after the movement of the user.
- the voxel V#1 of the image model is projected onto the pixel P#1 at the center of the display surface 11 at time T, before the user moves, but is projected onto the pixel P#2 on the left side of the display surface 11 at time T+1, after the user moves leftward.
- FIG. 33 is a diagram showing display examples of the projection image at time T, when the voxel V#1 in FIG. 32 is projected onto the pixel P#1, and at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- the voxel V#1 of the image model appears as the pixel P#1 at the center of the display surface 11, used as a window, at time T before the user moves, and appears as the pixel P#2 on the left side of the display surface 11 at time T+1, after the user moves leftward.
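The pixel shift caused by the user's horizontal movement follows from similar triangles along the line from the eye through the voxel. A minimal sketch, assuming the display surface lies in the plane z = 0 of the reference coordinate system (the coordinates below are illustrative, not from the patent):

```python
def pixel_x(eye_x, eye_z, voxel_x, voxel_z, display_z=0.0):
    """x coordinate where the line eye -> voxel crosses the display plane."""
    t = (display_z - eye_z) / (voxel_z - eye_z)
    return eye_x + t * (voxel_x - eye_x)

# Voxel V#1 straight ahead; user 1 unit in front of the display,
# image model 2 units behind it.
p1 = pixel_x(eye_x=0.0,  eye_z=-1.0, voxel_x=0.0, voxel_z=2.0)  # time T
p2 = pixel_x(eye_x=-0.3, eye_z=-1.0, voxel_x=0.0, voxel_z=2.0)  # after moving left
# p1 is the center pixel P#1; p2 lies to its left, as pixel P#2 in FIG. 33.
```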
- FIG. 34 is a diagram for explaining an example of generation of a projection image when the user moves in the vertical direction.
- A of FIG. 34 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves upward in a plane parallel to the display surface 11 at the initial setting timing (a plane parallel to the xy plane), as shown by the thick solid arrow in A of FIG. 34, or moves downward, as shown by the thick dotted arrow in A of FIG. 34.
- FIG. 34 also illustrates the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the user moves upward.
- in that case, a projection image is generated in which the visible range of the image model has changed to a range lower than the range before the movement.
- FIG. 34 also shows the projection image generated before the movement and the projection image generated after the movement when the user moves downward.
- in that case, a projection image is generated in which the image model of the still image, present on the other side of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range higher than the range before the movement.
- FIG. 35 is a diagram further illustrating an example of generation of a projection image when the user moves in the vertical direction.
- FIG. 35 is a view of the reference coordinate system as viewed from the positive direction of the x-axis; the directions perpendicular to the drawing, from right to left, and from top to bottom correspond to the x-axis, y-axis, and z-axis of the reference coordinate system, respectively.
- the display surface 11 and the image model (of a still image, which is a 2D image) are parallel to the xy plane both before and after the movement of the user.
- the voxel V#1 of the image model is projected onto the pixel P#1 at the center of the display surface 11 at time T, before the user moves, but is projected onto the pixel P#2 on the upper side of the display surface 11 at time T+1, after the user moves upward.
- FIG. 36 is a diagram showing display examples of the projection image at time T, when the voxel V#1 in FIG. 35 is projected onto the pixel P#1, and at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- the voxel V#1 of the image model appears as the pixel P#1 at the center of the display surface 11, used as a window, at time T before the user moves, and appears as the pixel P#2 on the upper side of the display surface 11 at time T+1, after the user moves upward.
- FIG. 37 is a diagram for explaining an example of generation of a projection image when the user moves in the depth direction.
- A of FIG. 37 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves in a direction orthogonal to the display surface 11 at the initial setting timing; that is, the user moves in the back direction (the direction from the user toward the display surface 11), as shown by the thick solid arrow in A of FIG. 37, or in the near direction (the direction from the display surface 11 toward the user), as shown by the thick dotted arrow in A of FIG. 37.
- FIG. 37 also illustrates the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the user moves in the depth direction.
- FIG. 37 also shows the projection image generated before the movement and the projection image generated after the movement when the user moves in the near direction.
- in that case, a projection image is generated in which the image model of the still image, present on the other side of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range narrower than the range before the movement.
- FIG. 38 is a diagram further illustrating an example of generation of a projection image when the user moves in the depth direction.
- FIG. 38 is a view of the reference coordinate system as viewed from the positive direction of the y-axis; the directions from left to right, perpendicular to the drawing, and from top to bottom correspond to the x-axis, y-axis, and z-axis of the reference coordinate system, respectively.
- the display surface 11 and the image model (of a still image, which is a 2D image) are parallel to the xy plane both before and after the movement of the user.
- the voxel V#1 of the image model is projected onto the pixel P#1, to the left of and away from the center of the display surface 11, at time T, before the user moves, but is projected onto the pixel P#2, close to the center of the display surface 11, at time T+1, after the user moves in the back direction.
- the case where the user moves in the back direction and the case where the display surface 11 is moved in the near direction are common in that the viewing angle of the view seen through the display surface 11 as a window becomes larger.
- however, when the user moves in the back direction, the user and the image model become closer, so the objects constituting the view seen through the display surface 11 as a window appear larger than before the movement of the user.
- FIG. 39 is a diagram showing display examples of the projection image at time T, when the voxel V#1 in FIG. 38 is projected onto the pixel P#1, and at time T+1, when the voxel V#1 is projected onto the pixel P#2.
- the voxel V#1 of the image model appears as the pixel P#1, to the left of and away from the center of the display surface 11 as a window, at time T before the user moves, and appears as the pixel P#2, at a position close to the center of the display surface 11 as a window, at time T+1, after the user moves in the back direction.
- FIG. 40 is a diagram for explaining generation of a projection image in the case of using an image model of a 3D image.
- FIG. 40 shows an example of the reference coordinate system in which the user, the display surface 11, and the image model of a 3D image are arranged.
- the image model of the 3D image is composed of four objects obj#1, obj#2, obj#3, and obj#4 that differ in position (depth) in the depth direction.
- voxels constituting an image model of a 2D image have, as position information, information on positions in the horizontal and vertical directions; they either have no information on the position in the depth direction or, if they do, that information is the same for all voxels.
- in contrast, voxels constituting an image model of a 3D image have, as position information, information on positions in the horizontal, vertical, and depth directions, and the depth-direction position information is not necessarily the same for all voxels and may differ from voxel to voxel.
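The difference between 2D and 3D image models in terms of voxel position information can be illustrated with a small data-structure sketch; the `Voxel` class and the sample depths are assumptions for illustration, not the patent's data format:

```python
from dataclasses import dataclass

@dataclass
class Voxel:
    x: float          # horizontal position
    y: float          # vertical position
    z: float          # depth position
    color: tuple      # (r, g, b)

# 2D image model: every voxel shares the same depth (a single plane).
plane_depth = 2.0
model_2d = [Voxel(x, y, plane_depth, (255, 255, 255))
            for x in range(4) for y in range(3)]

# 3D image model: depth may differ per voxel; here one voxel stands in
# for each of the objects obj#1 .. obj#4 at different depths.
object_depths = {"obj#1": 1.0, "obj#2": 2.0, "obj#3": 3.0, "obj#4": 4.0}
model_3d = [Voxel(0.5 * i, 0.0, d, (255, 0, 0))
            for i, d in enumerate(object_depths.values())]
```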
- FIG. 41 is a diagram for explaining an example of generation of a projection image using a 3D image model when the user moves in the horizontal direction.
- A of FIG. 41 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves leftward in a plane parallel to the display surface 11 at the initial setting timing (a plane parallel to the xy plane), as shown by the thick solid arrow in A of FIG. 41, or moves rightward, as shown by the thick dotted arrow in A of FIG. 41.
- FIG. 41 also illustrates the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the user moves in the horizontal direction.
- before the movement, a projection image is generated in which almost the entire object obj#2, located almost directly behind the object obj#1, is hidden behind the object obj#1.
- when the user moves leftward, according to the principle described with reference to FIG. 5, a projection image is generated as if the display surface 11 as a window were to the right of the user's front and the 3D image model were viewed through it.
- that is, a projection image is generated in which the 3D image model, present on the other side (back side) of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range on the right side of the range before the movement.
- as a result, a projection image is generated with motion parallax similar to that occurring when the user moves leftward while looking at an object having depth in the real world.
- when the user moves rightward, a projection image is generated in which the 3D image model, present on the other side of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range on the left side of the range before the movement.
- FIG. 42 is a diagram for explaining an example of generation of a projection image using a 3D image model when the user moves in the vertical direction.
- A of FIG. 42 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves upward in a plane parallel to the display surface 11 at the initial setting timing (a plane parallel to the xy plane), as shown by the thick solid arrow in A of FIG. 42, or moves downward, as shown by the thick dotted arrow in A of FIG. 42.
- FIG. 42 also shows the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the user moves in the vertical direction.
- before the movement, a projection image is generated in which almost the entire object obj#2, located almost directly behind the object obj#1, is hidden behind the object obj#1.
- when the user moves upward, a projection image is generated in which the 3D image model, present on the other side (back side) of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range lower than the range before the movement.
- when the user moves downward, a projection image is generated in which the 3D image model, present on the other side of the display surface 11 as a window, stays in place while the user moves, so that the visible range of the image model has changed to a range higher than the range before the movement.
- FIG. 43 is a diagram for explaining an example of generation of a projection image using a 3D image model when the user moves in the depth direction.
- A of FIG. 43 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves in a direction orthogonal to the display surface 11 at the initial setting timing; that is, the user moves in the back direction (the direction from the user toward the display surface 11), as shown by the thick solid arrow in A of FIG. 43, or in the near direction (the direction from the display surface 11 toward the user), as shown by the thick dotted arrow in A of FIG. 43.
- FIG. 43 also illustrates the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the user moves in the depth direction.
- before the movement, a projection image is generated in which almost the entire object obj#2, located almost directly behind the object obj#1, is hidden behind the object obj#1.
- when the user moves in the back direction, a projection image is generated in which the 3D image model, present on the other side (back side) of the display surface 11 as a window, stays in place while the user moves, so that the image model seen through the window appears larger than before the movement.
- as a result, a projection image is generated with motion parallax similar to that occurring when the user moves in the back direction while looking at an object having depth in the real world.
- when the user moves in the near direction, a projection image is generated in which the 3D image model, present on the other side of the display surface 11 as a window, stays in place while the user moves, so that the image model seen through the window appears smaller than before the movement.
- in this way, a projection image with motion parallax caused by the movement of the user is generated.
- the motion parallax differs depending on the position (depth) in the depth direction of each object constituting the 3D image model, and such motion parallax can be given not only when the user moves but also when the display surface 11 moves.
- for example, when the user holds a smartphone in hand, the display surface 11 moves (oscillates) due to camera shake, and motion parallax based on such movement of the display surface 11 can be given to the projection image.
- FIG. 44 is a diagram for explaining the motion parallax given to the projection image based on the movement of the display surface 11 when the display surface 11 moves due to camera shake.
- A of FIG. 44 shows an example of the projection image when, for example, the smartphone is placed on a table or the like and the display surface 11 is not moving.
- FIG. 44 also shows an example of the projection image when, for example, the user holds the smartphone in hand and the display surface 11 moves due to camera shake.
- in this way, when the display surface 11 moves due to camera shake or the like, the stereoscopic effect of the projection image displayed on the display surface 11 can be emphasized by generating a projection image with motion parallax based on the movement of the display surface 11.
- FIG. 45 is a diagram for explaining an example of a method of generating a projection image with motion parallax based on the movement of the display surface 11.
- FIG. 45 shows the reference coordinate system in which the user, the display surface 11, and the 3D image model are arranged.
- when the display surface 11 moves due to camera shake, a projection image can be generated on the assumption that the user makes a slight movement, vibrating horizontally in front of the display surface 11 (for example, to the left, as shown by the thick arrow in FIG. 45) with an amplitude corresponding to the movement of the display surface 11.
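One way to realize this is to convert the detected shake of the display surface 11 into a slight virtual movement of the user's eye and re-project. The sketch below assumes that only the relative horizontal displacement matters; the gain and sign convention are illustrative assumptions, not the patent's specification:

```python
import numpy as np

def virtual_eye_for_shake(eye, display_disp, gain=1.0):
    """Treat a small horizontal shake of the display surface as an
    equivalent slight horizontal movement of the user's eye; the
    projection image is then regenerated from this virtual position."""
    shake = np.array([display_disp[0], display_disp[1], 0.0])  # ignore depth
    return eye + gain * shake

def pixel_shift(eye_dx, eye_z, voxel_z, display_z=0.0):
    """On-screen horizontal shift of a voxel's pixel caused by a
    horizontal eye shift eye_dx (similar triangles, display at z=0)."""
    t = (display_z - eye_z) / (voxel_z - eye_z)
    return (1.0 - t) * eye_dx

eye = np.array([0.0, 0.0, -1.0])
# Display surface 11 shakes 5 mm to the left due to hand shake.
v_eye = virtual_eye_for_shake(eye, np.array([-0.005, 0.0, 0.0]))

# Objects at different depths shift by different amounts on the display,
# which is exactly the motion parallax between front and back objects.
shift_near = pixel_shift(v_eye[0] - eye[0], -1.0, 1.0)
shift_far  = pixel_shift(v_eye[0] - eye[0], -1.0, 4.0)
```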
- FIG. 46 is a diagram for explaining the enlargement of the difference in motion parallax.
- as described above, the motion parallax differs depending on the position in the depth direction of each object constituting the 3D image model, and the motion parallax is larger for objects closer to the front.
- in the image display device, the motion parallax can be adjusted for each of the objects constituting the 3D image model.
- FIG. 46 is a diagram for explaining an example of adjustment of motion parallax of a projected image using a 3D image model when the user moves in the depth direction.
- A of FIG. 46 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves in a direction orthogonal to the display surface 11 at the initial setting timing; that is, the user moves in the back direction (the direction from the user toward the display surface 11), as shown by the thick solid arrow in A of FIG. 46, or in the near direction (the direction from the display surface 11 toward the user), as shown by the thick dotted arrow in A of FIG. 46.
- FIG. 46 also shows the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement, without adjustment of motion parallax, when the user moves in the depth direction.
- before the user's movement, a projection image is generated in which almost the entire object obj#2, located almost directly behind the object obj#1, is hidden behind the object obj#1.
- when the user moves in the back direction, a projection image is generated in which the 3D image model, present on the other side (back side) of the display surface 11 as a window, stays in place while the user moves, so that the image model seen through the window appears larger than before the movement.
- as a result, a projection image is generated with motion parallax similar to that occurring when the user moves in the back direction while looking at an object having depth in the real world.
- FIG. 46 further shows the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement, with adjustment of motion parallax, when the user moves in the depth direction.
- in this case as well, a projection image is generated in which the 3D image model, present on the other side (back side) of the display surface 11 as a window, stays in place while the user moves, so that the image model seen through the window appears larger than before the movement.
- with adjustment of motion parallax, however, a projection image is generated in which objects closer to the front among the objects constituting the 3D image model appear larger than in the case without adjustment of motion parallax; therefore, in the projection image, the difference between the motion parallaxes of objects in front and objects in the back is enlarged.
- since the user who views such a projection image perceives the depth-direction positions of objects in front and objects in the back as differing greatly, the stereoscopic effect of the projection image can be emphasized.
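One plausible way to enlarge the difference in motion parallax between front and back objects, assuming per-object depth values, is to remap each object's depth about a pivot before projection so that front objects come even closer and back objects recede. The patent does not specify this formula; it is only a sketch:

```python
def exaggerate_depth(depth, pivot, gain):
    """Remap an object's depth about a pivot depth; gain > 1 pushes
    front objects closer and back objects farther, enlarging the
    difference in their motion parallax (and apparent size change)."""
    return pivot + gain * (depth - pivot)

# Hypothetical per-object depths for the 3D image model.
depths = {"obj#1": 1.0, "obj#2": 2.0, "obj#3": 3.0}
adjusted = {name: exaggerate_depth(d, pivot=2.0, gain=1.5)
            for name, d in depths.items()}
# obj#1 moves closer (1.0 -> 0.5), obj#3 moves farther (3.0 -> 3.5),
# so their on-screen parallax difference grows when re-projected.
```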
- FIG. 47 is a view for explaining another configuration example of the display surface 11.
- although the display surface 11 is a rectangular surface in the cases described above, a surface of a predetermined shape other than a rectangle can also be adopted as the display surface 11.
- in FIG. 47, a surface having a shape obtained by curving a rectangle (hereinafter also referred to as a curved surface) is adopted as the display surface 11 and is arranged in the reference coordinate system together with the user and the image model; that is, the display surface 11 is a curved surface as shown in FIG. 47.
- FIG. 48 is a diagram for explaining an example of generation of a projection image when the display surface 11 is a curved surface.
- FIG. 48 is a view of the reference coordinate system as viewed from the positive direction of the y-axis; the directions from left to right, perpendicular to the drawing, and from top to bottom correspond to the x-axis, y-axis, and z-axis of the reference coordinate system, respectively.
- the projection image displayed on the display surface 11, which is a curved surface, is generated by projecting the voxel V#1 onto the pixel P#1, that is, by taking the color of the voxel V#1, located where the image model intersects the straight line passing through the user and the pixel P#1 of the curved display surface 11, as the pixel value of the pixel P#1.
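Because the projection is defined pixel by pixel along the straight line through the user and that pixel, a curved display surface only changes where the pixels sit in three-dimensional space. The sketch below assumes a horizontally cylindrical display surface and a planar 2D image model behind it; all shapes and values are illustrative:

```python
import numpy as np

def curved_pixel_positions(radius, half_angle, n):
    """3D positions of n pixels along a horizontal cylindrical arc
    (the curved display surface), bulging toward the user."""
    angles = np.linspace(-half_angle, half_angle, n)
    return [np.array([radius * np.sin(a), 0.0, radius * (1 - np.cos(a))])
            for a in angles]

def sample_color(eye, pixel, model_z, model_colors, model_x0, step):
    """Color of the voxel where the line eye -> pixel meets the
    planar image model at z = model_z."""
    d = pixel - eye
    t = (model_z - eye[2]) / d[2]
    x = eye[0] + t * d[0]                      # intersection on the model
    i = int(round((x - model_x0) / step))
    return model_colors[max(0, min(len(model_colors) - 1, i))]

eye = np.array([0.0, 0.0, -1.0])
colors = ["red", "green", "blue", "white", "black"]   # toy 1-row image model
image = [sample_color(eye, p, 2.0, colors, -1.0, 0.5)
         for p in curved_pixel_positions(radius=2.0, half_angle=0.4, n=5)]
```

The same per-pixel rule handles any display shape, which is why a variable-shape surface only requires knowing the detected pixel positions.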
- a surface of a variable shape can be adopted as the display surface 11.
- as the display unit 25 (FIG. 3) having a display surface 11 of variable shape, there is, for example, a thin-film organic EL (Electro Luminescence) display that can be distorted to some extent.
- in the case where the display surface 11 has a variable shape, the shape of the display surface 11 is required to generate the projection image according to the principle described with reference to FIG. 5; the shape of the display surface 11 can be detected by, for example, the display surface detection unit 22 (FIG. 2).
- FIG. 49 is a diagram for explaining yet another configuration example of the display surface 11.
- although one surface is adopted as the display surface 11 in the cases described above, a plurality of surfaces can also be adopted as the display surface 11.
- when a plurality of surfaces are adopted as the display surface 11, as many projection images as there are surfaces are generated.
- in FIG. 49, two display surfaces 11L and 11R arranged in the horizontal direction are adopted as the display surface 11.
- FIG. 49 shows the reference coordinate system in which the user, the two display surfaces 11L and 11R, and the image model are arranged.
- in this case, a projection image for the left eye, to be observed by the user's left eye, and a projection image for the right eye, to be observed by the user's right eye, are generated; the projection image for the left eye is displayed on the display surface 11L on the left of the two display surfaces 11L and 11R, and the projection image for the right eye is displayed on the display surface 11R on the right.
- the user observes the projection image for the left eye displayed on the display surface 11L with the left eye, and observes the projection image for the right eye displayed on the display surface 11R with the right eye.
- the projection image for the left eye is generated according to the principle described with reference to FIG. 5 so that the image projected onto the retina of the user's left eye when the user looks at the projection image for the left eye displayed on the display surface 11L, with the display surface 11L as a window, becomes similar to the image projected onto the retina of the user's left eye when looking at the image model through that window.
- similarly, the projection image for the right eye is generated according to the principle described with reference to FIG. 5 so that the image projected onto the retina of the user's right eye when the user looks at the projection image for the right eye displayed on the display surface 11R, with the display surface 11R as a window, becomes similar to the image projected onto the retina of the user's right eye when looking at the image model through that window.
- FIG. 50 is a diagram for explaining an example of generation of a projection image for the left eye and a projection image for the right eye.
- FIG. 50 is a view of the reference coordinate system as viewed from the positive direction of the y-axis; the directions from left to right, perpendicular to the drawing, and from top to bottom correspond to the x-axis, y-axis, and z-axis of the reference coordinate system, respectively.
- the projection image for the left eye is generated by projecting the voxel V#1 onto the pixel PL#1 of the display surface 11L that intersects the straight line passing through the user's left eye and the voxel V#1, and the projection image for the right eye is generated by projecting the voxel V#1 onto the pixel PR#1 of the display surface 11R that intersects the straight line passing through the user's right eye and the voxel V#1.
- the projection image for the left eye and the projection image for the right eye can also be displayed on one display surface 11, instead of on the two display surfaces 11L and 11R as described above.
- that is, the projection image for the left eye and the projection image for the right eye can be displayed, for example, in the left area and the right area of one display surface 11, respectively.
- the projection image for the left eye and the projection image for the right eye can be displayed on one display surface 11 according to the same principle as that of a 3D display, for example.
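The left-eye and right-eye projection images differ only in which eye position the projection lines pass through. A minimal sketch, with the eye separation and all positions as illustrative assumptions:

```python
import numpy as np

def eye_to_pixel(eye, voxel, display_z=0.0):
    """(x, y) where the line eye -> voxel crosses the display plane
    z = display_z."""
    d = voxel - eye
    t = (display_z - eye[2]) / d[2]
    return eye[:2] + t * d[:2]

voxel     = np.array([0.0, 0.0, 2.0])      # voxel V#1
left_eye  = np.array([-0.03, 0.0, -1.0])   # ~6 cm interocular distance
right_eye = np.array([ 0.03, 0.0, -1.0])

pl = eye_to_pixel(left_eye, voxel)    # pixel PL#1 for the left-eye image
pr = eye_to_pixel(right_eye, voxel)   # pixel PR#1 for the right-eye image
# pl and pr differ horizontally: this binocular disparity is what the two
# projection images (on 11L/11R, or in the left/right areas) present.
```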
- FIG. 51 is a perspective view showing a configuration example of a second embodiment of the image display device to which the present technology is applied.
- in FIG. 51, the image display device is configured as binoculars, and displays in the binoculars the same projection image as in the first embodiment described above.
- according to the image display device configured as binoculars, it is possible to make the user looking through the binoculars feel as if actually observing the image model with binoculars.
- FIG. 52 is a perspective view showing a configuration example of a third embodiment of the image display device to which the present technology is applied.
- in FIG. 52, the image display apparatus is configured as a projector system having a projector and a screen, and displays an image on the screen according to the light of the projection image emitted by the projector.
- in this case, the display surface 11 is the screen (including a wall or the like functioning as a screen), and the image displayed on the screen, which is the display surface 11, according to the light emitted by the projector varies depending on the positional relationship between the projector and the screen.
- the positional relationship between the projector and the screen changes depending on how the projector and the screen are arranged.
- when the positional relationship changes, the image displayed on the screen according to the light of the projection image emitted by the projector also changes.
- for example, the size of the image displayed on the screen increases as the distance between the projector and the screen increases.
- therefore, in the projector system, in order to allow the user to use the screen as a window and feel as if viewing the image model through the window, it is necessary to generate the projection image in consideration of the positional relationship between the projector and the screen.
- thus, in the projector system, the position of the projector, which is the display device that displays the image (on the screen), is detected, and the projection image is generated on the basis of the position of the projector, in addition to the display surface information and the user position information, thereby taking the positional relationship between the projector and the screen into consideration.
- the position of the projector can be detected by the display surface detection unit 22 (FIG. 2).
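The distance dependence noted above can be compensated, for example, by pre-scaling the rendered projection image before it is sent to the projector. The linear throw model below is an assumption for illustration; a real projector system would use its measured throw ratio and detected positions:

```python
def prescale_factor(ref_distance, distance):
    """With a fixed-lens projector, on-screen image width grows roughly
    linearly with projector-screen distance; shrink the rendered
    projection image by the inverse ratio so the image on the screen
    (the display surface 11) keeps its intended size."""
    return ref_distance / distance

# Projector moved from 2 m to 4 m away from the screen:
s = prescale_factor(ref_distance=2.0, distance=4.0)
# s == 0.5: at twice the distance, render the projection image at half
# scale so the on-screen window view stays geometrically correct.
```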
- FIG. 53 is a diagram for explaining the magnifying glass mode.
- in the image display device, a window mode and a magnifying glass mode can be implemented as operation modes.
- in the window mode, as described above, an image reproducing the view seen when the user views the image model through the display surface 11 as a window is generated as the projection image and displayed on the display surface 11.
- in the magnifying glass mode, for example, with the display surface 11 as (the lens of) a magnifying glass, an image reproducing the virtual image seen when the image model is viewed through the magnifying glass is generated as the projection image and displayed on the display surface 11.
- FIG. 53 shows an example of generation of a projection image when the operation mode is a magnifying glass mode.
- A of FIG. 53 shows an example of the arrangement of the user and the display surface 11 on the reference coordinate system at the initial setting timing.
- the user moves the display surface 11 (the smartphone having it) in the direction orthogonal to the display surface 11 at the initial setting timing (in the direction of the z-axis) without changing its posture; that is, the user moves it in the near direction (toward the front side as viewed from the user), as shown by the thick solid arrow in A of FIG. 53, or in the back direction (away from the user), as shown by the thick dotted arrow in A of FIG. 53.
- FIG. 53 also shows the projection image generated before the movement (at the initial setting timing) and the projection image generated after the movement when the display surface 11 is moved in the near direction.
- before the movement, the display surface 11 as a magnifying glass is directly in front of the user, and a projection image is generated as if the image model of the object to be observed were being viewed through that magnifying glass.
- when the display surface 11 is moved in the near direction, the display surface 11 as a magnifying glass comes to the near side directly in front of the user, and a projection image is generated as if the user were looking at the image model through that magnifying glass.
- that is, a projection image is generated in which the image model of the still image, present on the other side (back side) of the display surface 11 as a magnifying glass, remains in place while the magnifying glass moves, so that the visible range of the image model, that is, the viewing angle, changes to a range narrower than before the movement; as a result, a projection image in which the image model is more enlarged than before the movement is generated.
- FIG. 53 further shows the projection image generated before the movement and the projection image generated after the movement when the display surface 11 is moved in the back direction.
- in this case, a projection image is generated in which the image model, present on the other side of the display surface 11 as a magnifying glass, remains in place while the magnifying glass moves, so that the visible range of the image model, that is, the viewing angle, changes to a range wider than before the movement; as a result, a projection image in which the image model is more reduced than before the movement is generated.
- as described above, in the magnifying glass mode, the change in the projection image displayed on the display surface 11 when the display surface 11 is moved in the depth direction is opposite to that in the window mode: moving the display surface 11 in the near direction widens the range of the image model seen through the display surface 11 as a window in the window mode, but narrows the range seen through the display surface 11 as a magnifying glass in the magnifying glass mode.
- therefore, in the magnifying glass mode, when the user moves the display surface 11 in the near direction, the display surface 11 in the reference coordinate system is moved in the back direction, and when the user moves the display surface 11 in the back direction, the display surface 11 in the reference coordinate system is moved in the near direction; by generating the projection image according to the principle described with reference to FIG. 5 with the display surface 11 moved in this manner, it is possible to generate a projection image in which the range of the image model seen through the display surface 11 as a magnifying glass narrows when the display surface 11 is moved in the near direction and widens when it is moved in the back direction.
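This mirroring of the depth movement can be sketched as follows, assuming the z-axis points from the user toward the image model so that the near direction is negative z (a convention chosen here for illustration, not fixed by the patent):

```python
def virtual_display_z(initial_z, physical_z, magnifier_mode):
    """z position of the display surface 11 used for generating the
    projection image. In window mode the physical position is used
    as-is; in magnifying glass mode the depth movement is mirrored
    about the initial position, so moving the display nearer narrows
    (enlarges) the view and moving it back widens (reduces) it."""
    if not magnifier_mode:
        return physical_z
    return initial_z - (physical_z - initial_z)

z0 = 0.0  # display surface position at the initial setting timing
near_move = virtual_display_z(z0, -0.2, magnifier_mode=True)   # near -> back
back_move = virtual_display_z(z0,  0.3, magnifier_mode=True)   # back -> near
window    = virtual_display_z(z0,  0.3, magnifier_mode=False)  # unchanged
```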
- in the image display device to which the present technology is applied, at least the position of the display surface 11 is detected, and a projection image obtained by projecting an image model of a predetermined image onto the display surface 11 along straight lines passing through the pixels of the display surface 11 whose position has been detected and the position of the user is generated and displayed on the display surface 11. Therefore, simply by moving the display surface 11, the user can intuitively view the area of the predetermined image that he or she wants to see.
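The generation principle above — following the straight line from the user's position through each pixel of the display surface into the image model — can be sketched as a simple ray march over a voxel model. This is an illustrative sketch only: the function name, the dict-of-voxels model representation, and the stepping parameters are assumptions, not the patent's implementation.

```python
import numpy as np

def project_image_model(voxels, user_pos, pixel_positions, step=0.05, max_dist=10.0):
    """For each display-surface pixel, march along the straight line from the
    user's position through that pixel and project the color of the first
    voxel the line intersects onto the pixel.

    voxels: dict mapping integer (x, y, z) cells to RGB tuples (assumed model).
    user_pos: user position in the reference coordinate system, shape (3,).
    pixel_positions: (H, W, 3) array of pixel positions on the display surface.
    """
    h, w, _ = pixel_positions.shape
    image = np.zeros((h, w, 3), dtype=np.uint8)  # background stays black
    for i in range(h):
        for j in range(w):
            direction = pixel_positions[i, j] - user_pos
            direction = direction / np.linalg.norm(direction)
            t = 0.0
            while t < max_dist:
                point = user_pos + t * direction
                cell = tuple(np.floor(point).astype(int))
                if cell in voxels:
                    # project the voxel color as the color of this pixel
                    image[i, j] = voxels[cell]
                    break
                t += step
    return image
```

Because the rays are defined by the user position and the pixel positions, moving the display surface (i.e., changing `pixel_positions`) changes which part of the model the rays hit, which is what produces the window and magnifying-glass effects described here.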
- for example, by moving the display surface 11 in the back direction, the user can easily select and view a portion of the image model of a still image or moving image that appears to be localized on the far side of the display surface 11 as a window. Conversely, by moving the display surface 11 in the near direction, the user can easily view the entire image model of the still image or moving image.
- through the display surface 11 as a window, the user can enjoy the sensation that the image model (the structure it represents) remains on the far side of the window and is being looked at through it; as a result, the user can feel a sense of realism as if actually viewing the image model through a window.
- when the image model is a 3D image model, a projection image with motion parallax is generated based on the movement of the display surface 11, and the difference in motion parallax between an object on the near side and an object on the far side of the 3D image model can be enlarged, so the stereoscopic effect of the projection image displayed on the display surface 11 can be emphasized.
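The parallax-emphasis idea can be illustrated numerically: under a simple pinhole-style approximation, an object's apparent on-screen shift when the display surface moves laterally is inversely proportional to its depth, and the near/far difference can then be enlarged by a gain factor. The model and the `gain` parameter are assumptions for illustration, not the patent's method.

```python
def parallax_shift(depth, surface_shift):
    # Pinhole-style approximation: apparent shift is inversely
    # proportional to the object's depth from the viewer.
    return surface_shift / depth

def exaggerate_parallax(shift_near, shift_far, gain=2.0):
    """Enlarge the motion-parallax difference between a near and a far
    object by `gain`, keeping their mean shift unchanged."""
    mean = 0.5 * (shift_near + shift_far)
    return (mean + gain * (shift_near - mean),
            mean + gain * (shift_far - mean))
```

For a lateral surface shift of 1.0, an object at depth 1 shifts by 1.0 and an object at depth 4 by 0.25; with `gain=2.0` the difference between the two shifts doubles while their mean is preserved, which is one way to read the "enlarged difference in motion parallax" described above.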
- in the image display device to which the present technology is applied, for example, when a user holding a smartphone serving as the image display device is actually at place A, a projection image using an image model of place A can be generated, taking the position and posture of the smartphone into account, and displayed on the display surface 11. The user can thus enjoy the sensation of looking at the past scenery of place A through the display surface 11 as a window.
- furthermore, while displaying on the display surface 11 a projection image that uses an image model obtained from images of the scenery of place A, the image display device can output, from a speaker (not shown), the sound recorded when place A was photographed. In this way, the situation at the time of shooting at place A can be reproduced by both the image (projection image) and the sound.
- the series of processes of the control unit 24 described above can be performed by hardware or by software. When the series of processes is performed by software, a program constituting the software is installed in a general-purpose computer or the like.
- FIG. 54 illustrates an example configuration of an embodiment of a computer in which a program that executes the series of processes described above is installed.
- the program can be recorded in advance in a hard disk 105 or a ROM 103 as a recording medium built in the computer.
- the program can be stored (recorded) in the removable recording medium 111.
- such a removable recording medium 111 can be provided as so-called packaged software.
- examples of the removable recording medium 111 include a flexible disc, a compact disc read only memory (CD-ROM), a magneto optical disc (MO), a digital versatile disc (DVD), a magnetic disc, a semiconductor memory, and the like.
- the program may be installed on the computer from the removable recording medium 111 as described above, or may be downloaded to the computer via a communication network or a broadcast network and installed on the built-in hard disk 105. That is, the program can be transferred to the computer wirelessly from a download site via an artificial satellite for digital satellite broadcasting, or by wire via a network such as a LAN (Local Area Network) or the Internet.
- the computer incorporates a CPU (Central Processing Unit) 102, and an input/output interface 110 is connected to the CPU 102 via a bus 101.
- when a command is input via the input/output interface 110, the CPU 102 executes a program stored in a ROM (Read Only Memory) 103 accordingly.
- alternatively, the CPU 102 loads a program stored in the hard disk 105 into a RAM (Random Access Memory) 104 and executes it.
- the CPU 102 thereby performs the processing according to the above-described flowcharts or the processing performed by the configurations of the above-described block diagrams. Then, as necessary, the CPU 102 outputs the processing result from the output unit 106, transmits it from the communication unit 108, or records it on the hard disk 105, for example, via the input/output interface 110.
- the input unit 107 includes a keyboard, a mouse, a microphone, and the like, and the output unit 106 includes an LCD (Liquid Crystal Display), a speaker, and the like.
- the processing performed by the computer according to the program does not necessarily have to be performed chronologically in the order described in the flowcharts; it also includes processing executed in parallel or individually (for example, parallel processing or object-based processing).
- the program may be processed by one computer (processor) or may be distributed and processed by a plurality of computers. Furthermore, the program may be transferred to a remote computer for execution.
- a system means a set of a plurality of components (apparatuses, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device housing a plurality of modules in one housing, are both systems.
- the present technology can have a cloud computing configuration in which one function is shared and processed by a plurality of devices via a network.
- each step described in the above-described flowchart can be executed by one device or in a shared manner by a plurality of devices.
- the plurality of processes included in one step can be executed by being shared by a plurality of devices in addition to being executed by one device.
- the present technology can be configured as follows.
- <1> A display control device including: a detection unit that detects the position of a display surface on which a display device displays an image; and a control unit that controls the display device to display, on the display surface, a projection image obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of a user.
- <2> The display control device according to <1>, wherein the detection unit detects one or more of the horizontal, vertical, and depth positions of the display surface.
- <3> The display control device according to <2>, further including another detection unit that detects the position of the user, wherein the control unit controls the display device to display, on the display surface, a projection image obtained by projecting the image model onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of the user detected by the other detection unit.
- <4> The display control device according to <3>, wherein the other detection unit detects one or more of the horizontal, vertical, and depth positions of the user.
- <5> The display control device according to <3> or <4>, wherein the detection unit detects the position and posture of the display surface, and the control unit controls the display device to display, on the display surface, a projection image obtained by projecting the image model onto the display surface along straight lines passing through the pixels of the display surface whose position and posture are detected by the detection unit and the position of the user detected by the other detection unit.
- <6> The display control device according to <5>, wherein the detection unit detects one or more rotation angles in the pitch, yaw, and roll directions of the display surface as the posture of the display surface.
- <7> The display control device according to <5> or <6>, wherein the control unit generates the projection image using the position and posture of the display surface and the position of the user.
- <8> The display control device according to <7>, wherein the control unit uses the position and posture of the display surface and the position of the user to generate, as the projection image, an image that reproduces the view seen when the user looks at the image model through the display surface as a window.
- <9> The display control device according to any one of <1> to <8>, wherein the image model is a model of a 2D (dimensional) image or a 3D image.
- <10> The display control device according to any one of <1> to <9>, wherein the image model is composed of voxels having color and position information as constituent elements.
- <11> The display control device according to <10>, wherein the control unit generates the projection image in which the color of a voxel intersecting a straight line passing through a pixel of the display surface and the position of the user is projected as the color of that pixel of the display surface.
- <12> The display control device according to <10> or <11>, wherein each voxel has horizontal, vertical, and depth positions.
- <13> The display control device according to <8>, wherein the control unit generates the projection image in which the difference in motion parallax between objects at different depth positions among the objects in the projection image is enlarged.
- <14> The display control device according to <8>, wherein the control unit generates the projection image given motion parallax based on the movement of the display surface.
- <15> The display control device according to any one of <1> to <14>, wherein the display surface is a surface having a predetermined shape.
- <16> The display control device according to any one of <1> to <15>, wherein the display surface is a surface of fixed shape or a surface of variable shape.
- <17> The display control device according to <8>, wherein the detection unit further detects the position of the display device, and when the positional relationship between the display device and the display surface changes, the control unit generates the projection image using the position and posture of the display surface, the position of the user, and the position of the display device.
- <18> The display control device according to <8>, wherein the control unit arranges the image model based on the position and posture of an imaging device at the time when the content serving as the image model was captured by the imaging device, and generates the projection image.
- <19> The display control device according to <8>, wherein the control unit generates a plurality of the projection images.
- <20> The display control device according to <19>, wherein the control unit generates a projection image for the left eye and a projection image for the right eye.
- <21> The display control device according to <20>, wherein the projection image for the left eye and the projection image for the right eye are displayed on one display surface.
- <22> The display control device according to any one of <1> to <8>, configured as binoculars.
- <23> A display control method including the steps of: detecting the position of a display surface on which a display device displays an image; and controlling the display device to display, on the display surface, a projection image obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position has been detected and the position of a user.
- <24> A program for causing a computer to function as: a detection unit that detects the position of a display surface on which a display device displays an image; and a control unit that controls the display device to display, on the display surface, a projection image obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of a user.
- 11 display surface, 12 camera, 21 data acquisition unit, 22 display surface detection unit, 23 user detection unit, 24 control unit, 25 display unit, 31 reference coordinate system generation unit, 32 display surface information acquisition unit, 33 image model generation unit, 34 user position information acquisition unit, 35 display surface arrangement unit, 36 image model arrangement unit, 37 user arrangement unit, 38 image generation unit, 101 bus, 102 CPU, 103 ROM, 104 RAM, 105 hard disk, 106 output unit, 107 input unit, 108 communication unit, 109 drive, 110 input/output interface, 111 removable recording medium
Abstract
Description
Claims (24)
- A display control device comprising: a detection unit that detects the position of a display surface on which a display device displays an image; and a control unit that controls the display device to display, on the display surface, a projection image obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of a user.
- The display control device according to claim 1, wherein the detection unit detects one or more of the horizontal, vertical, and depth positions of the display surface.
- The display control device according to claim 2, further comprising another detection unit that detects the position of the user, wherein the control unit controls the display device to display, on the display surface, a projection image obtained by projecting the image model onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of the user detected by the other detection unit.
- The display control device according to claim 3, wherein the other detection unit detects one or more of the horizontal, vertical, and depth positions of the user.
- The display control device according to claim 4, wherein the detection unit detects the position and posture of the display surface, and the control unit controls the display device to display, on the display surface, a projection image obtained by projecting the image model onto the display surface along straight lines passing through the pixels of the display surface whose position and posture are detected by the detection unit and the position of the user detected by the other detection unit.
- The display control device according to claim 5, wherein the detection unit detects one or more rotation angles in the pitch, yaw, and roll directions of the display surface as the posture of the display surface.
- The display control device according to claim 6, wherein the control unit generates the projection image using the position and posture of the display surface and the position of the user.
- The display control device according to claim 7, wherein the control unit uses the position and posture of the display surface and the position of the user to generate, as the projection image, an image that reproduces the view seen when the user looks at the image model through the display surface as a window.
- The display control device according to claim 8, wherein the image model is a model of a 2D (dimensional) image or a 3D image.
- The display control device according to claim 8, wherein the image model is composed of voxels having color and position information as constituent elements.
- The display control device according to claim 10, wherein the control unit generates the projection image in which the color of a voxel intersecting a straight line passing through a pixel of the display surface and the position of the user is projected as the color of that pixel of the display surface.
- The display control device according to claim 11, wherein each voxel has horizontal, vertical, and depth positions.
- The display control device according to claim 8, wherein the control unit generates the projection image in which the difference in motion parallax between objects at different depth positions among the objects in the projection image is enlarged.
- The display control device according to claim 8, wherein the control unit generates the projection image given motion parallax based on the movement of the display surface.
- The display control device according to claim 8, wherein the display surface is a surface having a predetermined shape.
- The display control device according to claim 8, wherein the display surface is a surface of fixed shape or a surface of variable shape.
- The display control device according to claim 8, wherein the detection unit further detects the position of the display device, and when the positional relationship between the display device and the display surface changes, the control unit generates the projection image using the position and posture of the display surface, the position of the user, and the position of the display device.
- The display control device according to claim 8, wherein the control unit arranges the image model based on the position and posture of an imaging device at the time when the content serving as the image model was captured by the imaging device, and generates the projection image.
- The display control device according to claim 8, wherein the control unit generates a plurality of the projection images.
- The display control device according to claim 19, wherein the control unit generates a projection image for the left eye and a projection image for the right eye.
- The display control device according to claim 20, wherein the projection image for the left eye and the projection image for the right eye are displayed on one display surface.
- The display control device according to claim 8, configured as binoculars.
- A display control method comprising the steps of: detecting the position of a display surface on which a display device displays an image; and controlling the display device to display, on the display surface, a projection image obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position has been detected and the position of a user.
- A program for causing a computer to function as: a detection unit that detects the position of a display surface on which a display device displays an image; and a control unit that controls the display device to display, on the display surface, a projection image obtained by projecting an image model of a predetermined image onto the display surface along straight lines passing through the pixels of the display surface whose position is detected by the detection unit and the position of a user.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15776617.1A EP3130994A4 (en) | 2014-04-07 | 2015-03-25 | Display control device, display control method, and program |
US15/119,921 US20170052684A1 (en) | 2014-04-07 | 2015-03-25 | Display control apparatus, display control method, and program |
JP2016512656A JP6601392B2 (ja) | 2014-04-07 | 2015-03-25 | 表示制御装置、表示制御方法、及び、プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-078590 | 2014-04-07 | ||
JP2014078590 | 2014-04-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015156128A1 true WO2015156128A1 (ja) | 2015-10-15 |
Family
ID=54287705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/059070 WO2015156128A1 (ja) | 2014-04-07 | 2015-03-25 | 表示制御装置、表示制御方法、及び、プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170052684A1 (ja) |
EP (1) | EP3130994A4 (ja) |
JP (1) | JP6601392B2 (ja) |
WO (1) | WO2015156128A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021246134A1 (ja) * | 2020-06-05 | 2021-12-09 | ソニーグループ株式会社 | デバイス、制御方法及びプログラム |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1118025A (ja) * | 1997-06-24 | 1999-01-22 | Fujitsu Ltd | 画像呈示装置 |
JP2004309947A (ja) * | 2003-04-10 | 2004-11-04 | Sharp Corp | データ表示装置、データ表示プログラム、および、プログラム記録媒体 |
JP2006184573A (ja) * | 2004-12-27 | 2006-07-13 | Toppan Printing Co Ltd | 画像表示装置及び画像表示システム |
JP2013121453A (ja) * | 2011-12-12 | 2013-06-20 | Toshiba Corp | 超音波診断装置及び画像処理装置 |
WO2013108285A1 (ja) * | 2012-01-16 | 2013-07-25 | パナソニック株式会社 | 画像記録装置、立体画像再生装置、画像記録方法、及び立体画像再生方法 |
JP2013222447A (ja) * | 2012-04-19 | 2013-10-28 | Kddi Corp | 情報提示システム |
JP2014029690A (ja) * | 2012-07-30 | 2014-02-13 | Samsung Electronics Co Ltd | ベンディングインタラクションガイドを提供するフレキシブル装置及びその制御方法 |
JP2014049866A (ja) * | 2012-08-30 | 2014-03-17 | Sharp Corp | 画像処理装置及び画像表示装置 |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7519918B2 (en) * | 2002-05-30 | 2009-04-14 | Intel Corporation | Mobile virtual desktop |
US8842070B2 (en) * | 2004-03-17 | 2014-09-23 | Intel Corporation | Integrated tracking for on screen navigation with small hand held devices |
US7526378B2 (en) * | 2004-11-22 | 2009-04-28 | Genz Ryan T | Mobile information system and device |
WO2009144306A1 (en) * | 2008-05-30 | 2009-12-03 | 3Dvisionlab Aps | A system for and a method of providing image information to a user |
US9092053B2 (en) * | 2008-06-17 | 2015-07-28 | Apple Inc. | Systems and methods for adjusting a display based on the user's position |
US20100045666A1 (en) * | 2008-08-22 | 2010-02-25 | Google Inc. | Anchored Navigation In A Three Dimensional Environment On A Mobile Device |
US8788977B2 (en) * | 2008-11-20 | 2014-07-22 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
JP5237234B2 (ja) * | 2009-09-29 | 2013-07-17 | Nippon Telegraph and Telephone Corporation | Video communication system and video communication method |
US8339364B2 (en) * | 2010-02-03 | 2012-12-25 | Nintendo Co., Ltd. | Spatially-correlated multi-display human-machine interface |
US8581905B2 (en) * | 2010-04-08 | 2013-11-12 | Disney Enterprises, Inc. | Interactive three dimensional displays on handheld devices |
US8913056B2 (en) * | 2010-08-04 | 2014-12-16 | Apple Inc. | Three dimensional user interface effects on a display by using properties of motion |
JP6243586B2 (ja) * | 2010-08-06 | 2017-12-06 | Nintendo Co., Ltd. | Game system, game device, game program, and game processing method |
US20120212405A1 (en) * | 2010-10-07 | 2012-08-23 | Benjamin Zeis Newhouse | System and method for presenting virtual and augmented reality scenes to a user |
GB2487039A (en) * | 2010-10-11 | 2012-07-11 | Michele Sciolette | Visualizing Illustrated Books And Comics On Digital Devices |
KR101915615B1 (ko) * | 2010-10-14 | 2019-01-07 | 삼성전자주식회사 | 모션 기반 사용자 인터페이스 제어 장치 및 방법 |
US8872854B1 (en) * | 2011-03-24 | 2014-10-28 | David A. Levitt | Methods for real-time navigation and display of virtual worlds |
JP2013009196A (ja) * | 2011-06-24 | 2013-01-10 | Toshiba Corp | Image display device |
JP5167439B1 (ja) * | 2012-02-15 | 2013-03-21 | Panasonic Corporation | Stereoscopic image display device and stereoscopic image display method |
US8643951B1 (en) * | 2012-03-15 | 2014-02-04 | Google Inc. | Graphical menu and interaction therewith through a viewing window |
US9791897B2 (en) * | 2012-06-29 | 2017-10-17 | Monkeymedia, Inc. | Handheld display device for navigating a virtual environment |
SE536759C2 (sv) * | 2012-10-18 | 2014-07-15 | Ortoma Ab | Metod och system för planering av position för implantatkomponent |
US9279983B1 (en) * | 2012-10-30 | 2016-03-08 | Google Inc. | Image cropping |
GB201303707D0 (en) * | 2013-03-01 | 2013-04-17 | Tosas Bautista Martin | System and method of interaction for mobile devices |
US9213403B1 (en) * | 2013-03-27 | 2015-12-15 | Google Inc. | Methods to pan, zoom, crop, and proportionally move on a head mountable display |
KR102012254B1 (ko) * | 2013-04-23 | 2019-08-21 | 한국전자통신연구원 | 이동 단말기를 이용한 사용자 응시점 추적 방법 및 그 장치 |
US20150123966A1 (en) * | 2013-10-03 | 2015-05-07 | Compedia - Software And Hardware Development Limited | Interactive augmented virtual reality and perceptual computing platform |
-
2015
- 2015-03-25 EP EP15776617.1A patent/EP3130994A4/en not_active Ceased
- 2015-03-25 JP JP2016512656A patent/JP6601392B2/ja not_active Expired - Fee Related
- 2015-03-25 US US15/119,921 patent/US20170052684A1/en not_active Abandoned
- 2015-03-25 WO PCT/JP2015/059070 patent/WO2015156128A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP3130994A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20170052684A1 (en) | 2017-02-23 |
JPWO2015156128A1 (ja) | 2017-04-13 |
EP3130994A1 (en) | 2017-02-15 |
JP6601392B2 (ja) | 2019-11-06 |
EP3130994A4 (en) | 2018-01-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15776617; Country of ref document: EP; Kind code of ref document: A1 |
 | REEP | Request for entry into the european phase | Ref document number: 2015776617; Country of ref document: EP |
 | WWE | Wipo information: entry into national phase | Ref document number: 2015776617; Country of ref document: EP |
 | WWE | Wipo information: entry into national phase | Ref document number: 15119921; Country of ref document: US |
 | ENP | Entry into the national phase | Ref document number: 2016512656; Country of ref document: JP; Kind code of ref document: A |
 | NENP | Non-entry into the national phase | Ref country code: DE |