JP2011248655A - User viewpoint spatial image provision device, user viewpoint spatial image provision method, and program - Google Patents

User viewpoint spatial image provision device, user viewpoint spatial image provision method, and program

Info

Publication number
JP2011248655A
Authority
JP
Japan
Prior art keywords
subject
user
eyeball position
display range
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2010121568A
Other languages
Japanese (ja)
Other versions
JP5439281B2 (en)
Inventor
Makoto Nakamura
Shiro Ozawa
Kazuhiko Tanaka
誠 中村
史朗 小澤
和彦 田中
Original Assignee
Ntt Comware Corp
エヌ・ティ・ティ・コムウェア株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ntt Comware Corp (エヌ・ティ・ティ・コムウェア株式会社)
Priority to JP2010121568A
Publication of JP2011248655A
Application granted
Publication of JP5439281B2
Status: Active
Anticipated expiration


Abstract

PROBLEM TO BE SOLVED: To reduce the discomfort or confusion of a user by automatically displaying an image that matches what the user sees with the naked eye.
SOLUTION: A user viewpoint spatial image provision device comprises: an eyeball position measuring section for measuring the user's eyeball position; a subject photographing section for photographing a subject; a subject distance measuring section for measuring the distance to the subject; a display range calculating section for calculating a display range based on the eyeball position measured by the eyeball position measuring section and the distance to the subject measured by the subject distance measuring section; and a display section for cutting the display range calculated by the display range calculating section out of the image photographed by the subject photographing section and displaying it.

Description

  The present invention relates to a user viewpoint space video presentation device, a user viewpoint space video presentation method, and a program.

  In recent years, virtual reality (VR) systems that present a space rendered from the user's viewpoint are increasingly implemented not only on conventional head-mounted displays but also on handheld devices such as tablet PCs (Personal Computers). In addition, services that use augmented reality (AR) technology to present additional information in a space displayed from the user's viewpoint on a small device such as a smartphone have attracted attention.

  FIG. 7 is an image diagram illustrating a display example of an image captured by a handheld device or a small device (hereinafter collectively referred to as a device). As shown in FIG. 7, on an ordinary device, unless the user adjusts the display magnification or the like, the subject 500 shown on the device is displayed at a scale different from that of the subject 400 that the user sees with the naked eye.

  The technique described in Patent Document 1 verifies the dimensional identity between a specific object in a photographed image and an image displayed superimposed on that object.

JP 2001-142604 A

  However, in the related art, as shown in FIG. 7, the subject 500 displayed on the device appears at a different scale from the subject 400 that the user sees with the naked eye. For this reason, a sense of incongruity or confusion may arise when additional information or the like is displayed using augmented reality technology. The technique of Patent Document 1 verifies the dimensional identity between a specific object and an image displayed superimposed on it, but it does not consider linking a subject seen with the naked eye, such as a landscape, with the video displayed on the device. Consequently, to display the part of a subject that is hidden behind the device so that it appears seamless with the part the user sees with the naked eye, the user must manually adjust the display magnification to bring the scale of the subject on the device close to the scale of the subject seen with the naked eye, which is troublesome.

  The present invention has been made in view of the above points, and an object of the present invention is to provide a user viewpoint space video presentation device, a user viewpoint space video presentation method, and a program that reduce the discomfort and confusion felt by the user by automatically displaying an image corresponding to what the user sees with the naked eye.

  The present invention has been made to solve the above problems. One aspect of the present invention is a user viewpoint space video presentation device comprising: an eyeball position measurement unit that measures the user's eyeball position; a subject imaging unit that captures a subject; a subject distance measurement unit that measures the distance to the subject; a display range calculation unit that calculates a display range based on the eyeball position measured by the eyeball position measurement unit and the distance to the subject measured by the subject distance measurement unit; and a display unit that cuts the display range calculated by the display range calculation unit out of the image captured by the subject imaging unit and displays it.

  According to the present invention, the display range is determined based on the user's eyeball position and the distance to the subject. The device can therefore display a range close to the one corresponding to the subject the user sees with the naked eye, without any particular operation by the user.

  In addition, according to one aspect of the present invention, in the above user viewpoint space video presentation device, the display range calculation unit calculates the display range using the distance from the eyeball position to the subject relative to the distance from the eyeball position to the user viewpoint space video presentation device.

  According to this aspect, the display range is calculated using the distance from the eyeball position to the subject relative to the distance from the eyeball position to the device. As a result, the subject the user sees with the naked eye and the displayed subject can be shown as one continuous subject, which reduces the discomfort or confusion the user feels when viewing the displayed subject.

One aspect of the present invention is the above user viewpoint space video presentation device further comprising a user photographing unit that photographs the user, wherein the eyeball position measurement unit measures the user's eyeball position from an image photographed by the user photographing unit.

  According to this aspect, since the position of the user's eyeball is measured from a captured image, a distance measuring sensor using infrared rays or the like is not required, and the configuration of the user viewpoint space video presentation device can be simplified.

  Further, according to one aspect of the present invention, in the above user viewpoint space video presentation device, the eyeball position measurement unit measures the user's eyeball position using the distance between the eyeballs in the image captured by the user photographing unit.

  According to this aspect, since the user's eyeball position is measured from the interval between the eyeballs, the eyeball position can be measured more simply.

  Another aspect of the present invention is a user viewpoint space video presentation method comprising: a step in which an eyeball position measurement unit measures the user's eyeball position; a step in which a subject photographing unit photographs a subject; a step in which a subject distance measurement unit measures the distance to the subject; a step in which a display range calculation unit calculates a display range based on the eyeball position measured by the eyeball position measurement unit and the distance to the subject measured by the subject distance measurement unit; and a step in which a display unit cuts the display range calculated by the display range calculation unit out of the image captured by the subject photographing unit and displays it.

  One aspect of the present invention is a program for causing a computer to execute: a step of measuring the user's eyeball position; a step of photographing a subject; a step of measuring the distance to the subject; a step of calculating a display range based on the measured eyeball position and the measured distance to the subject; and a step of cutting the calculated display range out of the captured image and displaying it.

  According to the present invention, the display range is determined based on the user's eyeball position and the distance to the subject. The device can therefore display a range close to the one corresponding to the subject the user sees with the naked eye, without the user manually adjusting the display range or the display magnification. This reduces the sense of incongruity or confusion the user feels when viewing the displayed subject.

FIG. 1 is a schematic diagram showing the external configuration and the like of a user viewpoint space video presentation device according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the functional configuration of the user viewpoint space video presentation device according to the embodiment.
FIG. 3 is a flowchart showing the procedure of the video display processing according to the embodiment.
FIG. 4 is a schematic diagram for explaining the eyeball position measurement method according to the embodiment.
FIG. 5 is a schematic diagram for explaining the display range calculation method according to the embodiment.
FIG. 6 is an image diagram showing a display example of the user viewpoint space video presentation device according to the embodiment.
FIG. 7 is an image diagram showing a display example of an image captured by a device according to the prior art.

Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a schematic diagram showing an external configuration and the like of a user viewpoint space video presentation device 10 according to an embodiment of the present invention.
The user viewpoint space video presentation device 10 is a small device such as a smartphone or a mobile phone, or a handheld device such as a tablet PC, and includes an in-camera 11, a display 12, and an out-camera 13. The in-camera (user photographing unit) 11 photographs the user A who operates the user viewpoint space video presentation device 10. The out-camera (subject photographing unit) 13 photographs a subject such as an object or a landscape. The display 12 is, for example, a liquid crystal display that shows the subject photographed by the out-camera 13. In FIG. 1, the shooting range S1 is the shooting range of the out-camera 13, and the display range S2 is the part of the shooting range S1 that is shown on the display 12. As shown in FIG. 1, the user viewpoint space video presentation device 10 uses the positional relationship between the user A and the device to display video so that the subject the user A sees with the naked eye and the subject shown on the display 12 appear seamless. This makes it easy for the user A to intuitively grasp the space of the video shown on the display 12, and reduces discomfort and confusion.

FIG. 2 is a block diagram illustrating a functional configuration of the user viewpoint space video presentation device 10 according to the present embodiment.
The user viewpoint space video presentation device 10 includes an in-camera 11, a display 12, an out-camera 13, an eyeball position measurement unit 14, a display range calculation unit 15, a real space imaging unit 16, a subject distance measurement unit 17, a video cutout unit 18, and a video display unit 19.
The eyeball position measurement unit 14 measures the eyeball position of the user A from the video (image) captured by the in-camera 11 and outputs the measurement result to the display range calculation unit 15; the measurement method is described in detail later. The subject distance measurement unit 17 measures the distance to the subject using a distance measuring sensor provided in the out-camera 13 and outputs the measurement result to the display range calculation unit 15. Specifically, the subject distance measurement unit 17 irradiates the subject with infrared rays from the distance measuring sensor and computes the distance to the subject from the time until the reflected wave returns and from the irradiation angle. The display range calculation unit 15 calculates the display range of the video (image) to be shown on the display 12 based on the eyeball position and the distance to the subject, and outputs the calculated display range to the video cutout unit 18; the calculation method is described in detail later.

  The real space imaging unit 16 captures a video (image) of the subject using the out-camera 13 and outputs it to the video cutout unit 18. The video cutout unit 18 cuts the display range calculated by the display range calculation unit 15 out of the video (image) captured by the real space imaging unit 16 and outputs the cut-out video (image) to the video display unit 19. The video display unit 19 shows the video (image) cut out by the video cutout unit 18 on the display 12. The video cutout unit 18, the video display unit 19, and the display 12 constitute the display unit of the present application.

Next, with reference to FIG. 3, the video display processing by the user viewpoint space video presentation device 10 will be described. FIG. 3 is a flowchart showing a procedure of video display processing according to the present embodiment.
[Step S101: Eyeball Position Measurement]
First, in step S101, the eyeball position measurement unit 14 measures the eyeball position of the user A. The eyeball position measurement method is described in detail here with reference to FIG. 4, a schematic view of the method according to the present embodiment. In FIG. 4, an xyz coordinate system is defined for the in-camera 11, with the in-camera 11 as the origin (0, 0, 0), the horizontal direction relative to the in-camera 11 as the x-axis, the vertical direction as the y-axis, and the depth direction as the z-axis. That is, the x-axis represents the horizontal relative distance between the in-camera 11 and the eyeball position of the user A, the y-axis represents the vertical relative distance between them, and the z-axis represents the distance from the in-camera 11 to the eyeball position of the user A.

First, the eyeball position measurement unit 14 detects the eyeballs of the user A in the captured video I, the video captured by the in-camera 11. Specifically, the eyeball position measurement unit 14 detects the face of the user A from its constituent parts such as the eyes, nose, and mouth, using an image processing technique such as face recognition, and calculates the position (i_L, j_L) of the left eyeball and the position (i_R, j_R) of the right eyeball in the captured image I.

Next, the eyeball position measurement unit 14 calculates the eyeball position (x, y, z) of the user A in real space. The eyeball position (x, y, z) is the midpoint of the position (x_L, y_L, z) of the left eyeball and the position (x_R, y_R, z) of the right eyeball of the user A. First, the eyeball position measurement unit 14 calculates the distance l between the left and right eyeballs detected in the captured image I by the following equation (1).

l = √((i_L − i_R)² + (j_L − j_R)²) ... (1)
Then, the eyeball position measurement unit 14 converts the distance l between the eyeballs as imaged on the CCD (charge-coupled device) of the in-camera 11 into the distance l_e in real space by the following equation (2). Here, α is a coefficient representing the real-space size (for example, in meters) of one pixel in the captured image I, that is, a coefficient for converting pixels into meters.

l_e = α × l ... (2)

Next, the eyeball position measurement unit 14 calculates the distance z from the in-camera 11 to the eyeballs by the following equation (3). Here, f1 is the focal length of the in-camera 11, and ΔE is the interocular width, a constant (for example, 6.5 cm, a typical average interocular width). The interocular width ΔE may also be set for each user. As equation (3) shows, the eyeball position measurement unit 14 measures the eyeball position of the user A using the distance l between the eyeballs in the image captured by the in-camera 11. In other words, the eyeball position measurement unit 14 measures the distance to the eyeballs of the user A from the ratio of the interocular width ΔE to the real-space distance l_e.

z = (f1 × ΔE) / l_e ... (3)

Next, the eyeball position measurement unit 14 calculates the eyeball position x in the x-axis direction by the following equation (4). Here, i is the midpoint of i_L and i_R, that is, i = (i_L + i_R) / 2.

x = (z × α × i) / f1 ... (4)

Next, the eyeball position measurement unit 14 calculates the eyeball position y in the y-axis direction by the following equation (5). Here, j is the midpoint of j_L and j_R, that is, j = (j_L + j_R) / 2.

y = (z × α × j) / f1 ... (5)

  In this embodiment, the eyeball position is measured using the captured image I. However, the eyeball position of the user A may instead be measured with, for example, a distance measuring sensor that irradiates the target with infrared rays, ultrasonic waves, or the like.
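  As an illustration of equations (1) through (5), the following Python sketch computes the eyeball position from the detected pixel coordinates. It is a minimal sketch, not part of the claimed embodiment; the function and argument names are hypothetical, and it assumes pixel coordinates measured from the image center.

import math

def eyeball_position(i_l, j_l, i_r, j_r, alpha, f1, delta_e=0.065):
    """Eyeball position (x, y, z) of the user in meters, per equations
    (1)-(5). (i_l, j_l) and (i_r, j_r) are the detected left and right
    eyeball positions in the captured image I, in pixels from the image
    center (an assumed convention); alpha is the real-space size of one
    pixel in meters; f1 is the focal length of the in-camera 11; delta_e
    is the interocular width (6.5 cm by default)."""
    l = math.hypot(i_l - i_r, j_l - j_r)  # (1) pixel distance between the eyeballs
    l_e = alpha * l                       # (2) the same distance on the sensor, in meters
    z = f1 * delta_e / l_e                # (3) distance from the in-camera 11 to the eyeballs
    i = (i_l + i_r) / 2.0                 # midpoint of the two eyeballs, in pixels
    j = (j_l + j_r) / 2.0
    x = z * alpha * i / f1                # (4) horizontal offset in real space
    y = z * alpha * j / f1                # (5) vertical offset in real space
    return x, y, z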

[Step S102: Subject Distance Measurement]
Next, in step S102, the subject distance measurement unit 17 measures the distance Z_obj from the out-camera 13 to the subject.
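As a minimal sketch of the time-of-flight principle described above for the distance measuring sensor, the distance is half of what the infrared pulse travels on its round trip. The helper below is hypothetical and, for simplicity, ignores the irradiation angle mentioned earlier.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def subject_distance(round_trip_time_s: float) -> float:
    """Distance Z_obj to the subject: the infrared pulse travels to the
    subject and back, so the one-way distance is half the round trip."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0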

[Step S103: Display Range Calculation]
Next, in step S103, the display range calculation unit 15 calculates the display range to be shown on the display 12. The display range calculation method is described in detail here with reference to FIG. 5, a schematic diagram of the method according to the present embodiment. In FIG. 5, an xyz coordinate system is defined for the out-camera 13, with the out-camera 13 as the origin (0, 0, 0), the horizontal direction relative to the out-camera 13 as the x-axis, the vertical direction as the y-axis, and the depth direction as the z-axis. Here, the upper end position of the display 12 in the y-axis direction is y_upper and the lower end position is y_bottom; the left end position of the display 12 in the x-axis direction is x_left and the right end position is x_right. The distance from the out-camera 13 to the display 12 is ΔZ.

First, the display range calculation unit 15 converts the eyeball position (x, y, z) calculated in step S101 into the coordinate system of the out-camera 13, obtaining (x_pos, y_pos, z_pos). Given the position (x_i, y_i, z_i) of the in-camera 11 in that coordinate system, x_pos = x_i + x, y_pos = y_i + y, and z_pos = z_i + z.

Next, the display range calculation unit 15 calculates the upper end y_obj_upper of the display range S2 in the y-axis direction by the following equation (6). The display range S2 is the range of the subject to be shown on the display 12.

y_obj_upper = y_pos − (y_pos − y_upper) × (z_pos + Z_obj) / (z_pos − ΔZ) ... (6)

In addition, the display range calculation unit 15 calculates the lower end y_obj_bottom of the display range S2 in the y-axis direction by the following equation (7).

y_obj_bottom = y_pos − (y_pos − y_bottom) × (z_pos + Z_obj) / (z_pos − ΔZ) ... (7)

Next, the display range calculation unit 15 calculates the left end x_obj_left of the display range S2 in the x-axis direction by the following equation (8).

x_obj_left = x_pos − (x_pos − x_left) × (z_pos + Z_obj) / (z_pos − ΔZ) ... (8)

In addition, the display range calculation unit 15 calculates the right end x_obj_right of the display range S2 in the x-axis direction by the following equation (9).

x_obj_right = x_pos − (x_pos − x_right) × (z_pos + Z_obj) / (z_pos − ΔZ) ... (9)

  As equations (6) through (9) show, the display range calculation unit 15 calculates the display range using the distance from the eyeball position to the subject relative to the distance from the eyeball position to the display 12 (the user viewpoint space video presentation device 10).
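  The following sketch restates equations (6) through (9): each edge of the display 12 is projected from the eyeball position onto the subject plane. It is an illustration under the coordinate conventions of FIG. 5, not a definitive implementation; all names are hypothetical.

def display_range_real(x_pos, y_pos, z_pos, y_upper, y_bottom,
                       x_left, x_right, delta_z, z_obj):
    """Display range S2 on the subject plane, per equations (6)-(9).
    (x_pos, y_pos, z_pos) is the eyeball position in the out-camera
    coordinate system; y_upper, y_bottom, x_left, x_right are the edges
    of the display 12; delta_z is the out-camera-to-display distance;
    z_obj is the out-camera-to-subject distance. All values in meters."""
    # Ratio of the eye-to-subject distance to the eye-to-display distance.
    scale = (z_pos + z_obj) / (z_pos - delta_z)
    y_obj_upper = y_pos - (y_pos - y_upper) * scale    # (6)
    y_obj_bottom = y_pos - (y_pos - y_bottom) * scale  # (7)
    x_obj_left = x_pos - (x_pos - x_left) * scale      # (8)
    x_obj_right = x_pos - (x_pos - x_right) * scale    # (9)
    return y_obj_upper, y_obj_bottom, x_obj_left, x_obj_right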

Next, the display range calculation unit 15 converts the display range S2 in real space into the display range a in the captured video O, the video captured by the out-camera 13. First, the display range calculation unit 15 calculates the upper end j_1 of the display range a in the y-axis direction by the following equation (10). Here, β is a coefficient representing the real-space size (for example, in meters) of one pixel in the captured video O, that is, a coefficient for converting pixels into meters, and f2 is the focal length of the out-camera 13.

j_1 = (f2 × y_obj_upper) / (β × Z_obj) ... (10)

Subsequently, the display range calculation unit 15 calculates the lower end j_2 of the display range a in the captured video O in the y-axis direction by the following equation (11).

j_2 = (f2 × y_obj_bottom) / (β × Z_obj) ... (11)

Next, the display range calculation unit 15 calculates the left end i_1 of the display range a in the captured video O in the x-axis direction by the following equation (12).

i_1 = (f2 × x_obj_left) / (β × Z_obj) ... (12)

Subsequently, the display range calculation unit 15 calculates the right end i_2 of the display range a in the captured video O in the x-axis direction by the following equation (13).

i_2 = (f2 × x_obj_right) / (β × Z_obj) ... (13)

The display range calculation unit 15 thereby obtains the vertices (i_1, j_1), (i_1, j_2), (i_2, j_1), and (i_2, j_2) of the rectangular display range a.
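A companion sketch for equations (10) through (13) converts the real-space range S2 into the pixel rectangle a; again, the names are hypothetical and the conventions follow the description above.

def display_range_pixels(y_obj_upper, y_obj_bottom, x_obj_left,
                         x_obj_right, f2, beta, z_obj):
    """Vertices of the rectangular display range a in the captured
    video O, per equations (10)-(13). f2 is the focal length of the
    out-camera 13; beta is the real-space size in meters of one pixel
    of the captured video O."""
    j_1 = f2 * y_obj_upper / (beta * z_obj)   # (10) upper end
    j_2 = f2 * y_obj_bottom / (beta * z_obj)  # (11) lower end
    i_1 = f2 * x_obj_left / (beta * z_obj)    # (12) left end
    i_2 = f2 * x_obj_right / (beta * z_obj)   # (13) right end
    return (i_1, j_1), (i_1, j_2), (i_2, j_1), (i_2, j_2)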

[Step S104: Image Cutout]
Next, in step S104, the video cutout unit 18 cuts the display range a out of the captured video O.

[Step S105: Video Display]
Finally, in step S105, the video display unit 19 shows the video cut out by the video cutout unit 18 on the display 12, fitted to the size of the display 12. That is, the video display unit 19 adjusts the display magnification of the cut-out video (enlarging or reducing it) so that it fills the entire screen of the display 12, and shows it on the display 12.
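Steps S104 and S105 amount to a crop followed by a resize. A minimal sketch, assuming NumPy and OpenCV (neither of which the embodiment names) and integer pixel indices already clamped to the frame:

import numpy as np
import cv2

def cut_out_and_display(captured_o: np.ndarray, i_1: int, j_1: int,
                        i_2: int, j_2: int,
                        display_w: int, display_h: int) -> np.ndarray:
    """S104: cut the display range a out of the captured video O.
    S105: enlarge or reduce the cut-out video so that it fills the
    entire screen of the display 12."""
    cropped = captured_o[j_1:j_2, i_1:i_2]  # rows correspond to the j (vertical) axis
    return cv2.resize(cropped, (display_w, display_h))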

  FIG. 6 is an image diagram illustrating a display example of the user viewpoint space video presentation device 10 according to the present embodiment. FIG. 6A is a display example in which the captured video O of the out-camera 13 is shown on the display 12 as-is; FIG. 6B is a display example in which the video obtained by cutting the display range a out of the captured video O is shown on the display 12. As shown in FIG. 6A, the scale of the subject 200 in the captured video O differs from the scale of the subject 100 that the user sees with the naked eye. The user viewpoint space video presentation device 10 cuts the display range a out of the captured video O and displays it so that the subject 100 the user sees with the naked eye and the video shown on the display 12 are seamless. Therefore, as shown in FIG. 6B, with the user viewpoint space video presentation device 10 according to the present embodiment, the bridge 101 that the user A sees with the naked eye and the bridge 301 shown on the display 12 appear connected, without the user A manually adjusting the display magnification or the display range. That is, the scale of the subject 100 viewed with the naked eye is substantially the same as the scale of the subject 300 shown on the display 12.

Thus, according to the present embodiment, the user viewpoint space video presentation device 10 determines the display range of the subject to be shown on the display 12 based on the user's eyeball position and the distance to the subject. The subject the user sees with the naked eye and the displayed subject can thereby be shown as one continuous subject, even though the user does not manually adjust the display range or the display magnification. The user can feel as if the display 12 were a glass window and the video shown on it the actual scene seen through that window, which reduces the sense of discomfort and confusion the user feels when viewing the displayed subject.
Further, when CG (Computer Graphics) is added on the display 12, the CG is seen at the same magnification as reality, so an augmented reality is obtained in which the CG appears to actually exist.

In addition, the video display processing may be performed by recording a program for realizing each step shown in FIG. 3 on a computer-readable recording medium, and reading and executing the program recorded on the recording medium on a computer system. Here, the “computer system” may include an OS and hardware such as peripheral devices.
Further, the “computer system” includes a homepage providing environment (or display environment) if a WWW system is used.
The “computer-readable recording medium” means a portable medium such as a flexible disk, a magneto-optical disk, a ROM, a writable nonvolatile memory such as a flash memory, or a CD-ROM, or a storage device such as a hard disk built into a computer system.

Further, the “computer-readable recording medium” includes a medium that holds the program for a certain period of time, such as a volatile memory (for example, a DRAM (Dynamic Random Access Memory)) inside a computer system serving as a server or a client when the program is transmitted over a network such as the Internet or a communication line such as a telephone line.
The program may be transmitted from a computer system that stores it in a storage device or the like to another computer system via a transmission medium, or by transmission waves in a transmission medium. Here, the “transmission medium” that transmits the program means a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication line like a telephone line.
The program may realize only a part of the functions described above. Furthermore, it may be a so-called difference file (difference program) that realizes the functions described above in combination with a program already recorded in the computer system.

  The embodiment of the present invention has been described above in detail with reference to the drawings. However, the specific configuration is not limited to the above, and various design changes and the like can be made without departing from the scope of the present invention.

  DESCRIPTION OF SYMBOLS
10 ... User viewpoint space video presentation device
11 ... In-camera
12 ... Display
13 ... Out-camera
14 ... Eyeball position measurement unit
15 ... Display range calculation unit
16 ... Real space imaging unit
17 ... Subject distance measurement unit
18 ... Video cutout unit
19 ... Video display unit

Claims (6)

  1. An eyeball position measurement unit that measures the user's eyeball position;
    A subject photographing unit for photographing the subject;
    A subject distance measuring unit for measuring the distance to the subject;
    A display range calculation unit that calculates a display range based on the eyeball position measured by the eyeball position measurement unit and the distance to the subject measured by the subject distance measurement unit;
    A display unit that cuts out and displays the display range calculated by the display range calculation unit from the image captured by the subject photographing unit;
    A user viewpoint space video presentation device comprising:
  2.   The user viewpoint space video presentation device according to claim 1, wherein the display range calculation unit calculates the display range using the distance from the eyeball position to the subject relative to the distance from the eyeball position to the user viewpoint space video presentation device.
  3. A user photographing unit for photographing the user;
    3. The user viewpoint space video presentation device according to claim 1, wherein the eyeball position measurement unit measures the eyeball position of the user from an image captured by the user photographing unit.
  4.   4. The user viewpoint space video presentation device according to claim 3, wherein the eyeball position measurement unit measures an eyeball position of the user using an eyeball interval in an image captured by the user photographing unit.
  5. An eyeball position measuring unit measuring the user's eyeball position;
    A subject photographing unit photographing a subject;
    A subject distance measuring unit measuring a distance to the subject;
    A display range calculation unit calculating a display range based on the eyeball position measured by the eyeball position measurement unit and the distance to the subject measured by the subject distance measurement unit;
    A step in which the display unit cuts out and displays the display range calculated by the display range calculation unit from the image captured by the subject photographing unit;
    A user viewpoint space video presentation method characterized by comprising:
  6. A program for causing a computer to execute:
    a step of measuring the user's eyeball position;
    a step of photographing a subject;
    a step of measuring the distance to the subject;
    a step of calculating a display range based on the measured eyeball position and the measured distance to the subject; and
    a step of cutting out and displaying the calculated display range from the captured image.
JP2010121568A 2010-05-27 2010-05-27 User viewpoint space video presentation device, user viewpoint space video presentation method and program Active JP5439281B2 (en)

Priority Applications (1)

Application Number: JP2010121568A (granted as JP5439281B2)
Priority Date: 2010-05-27
Filing Date: 2010-05-27
Title: User viewpoint space video presentation device, user viewpoint space video presentation method and program

Publications (2)

Publication Number Publication Date
JP2011248655A (en) 2011-12-08
JP5439281B2 JP5439281B2 (en) 2014-03-12

Family

ID=45413834

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010121568A Active JP5439281B2 (en) 2010-05-27 2010-05-27 User viewpoint space video presentation device, user viewpoint space video presentation method and program

Country Status (1)

Country Link
JP (1) JP5439281B2 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08190640A (en) * 1995-01-12 1996-07-23 Hitachi Ltd Information display method and information provision system
JP3926837B2 (en) * 2004-06-04 2007-06-06 松下電器産業株式会社 Display control method and apparatus, program, and portable device
JP2006208451A (en) * 2005-01-25 2006-08-10 Konica Minolta Photo Imaging Inc Video display device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013254338A (en) * 2012-06-06 2013-12-19 Nippon Telegr & Teleph Corp <Ntt> Video generation system, video generation device, video generation method, and computer program
JP2015156131A (en) * 2014-02-20 2015-08-27 株式会社ソニー・コンピュータエンタテインメント Information processing apparatus and information processing method
WO2015125709A1 (en) * 2014-02-20 2015-08-27 株式会社ソニー・コンピュータエンタテインメント Information processing device and information processing method
US10192360B2 (en) 2014-02-20 2019-01-29 Sony Interactive Entertainment Inc. Information processing apparatus and information processing method
JP2019008473A (en) * 2017-06-22 2019-01-17 ファナック株式会社 Composite reality simulation device and composite reality simulation program

Also Published As

Publication number Publication date
JP5439281B2 (en) 2014-03-12


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20130220

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20131107

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20131119

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20131216

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
