CN101843104A - Acquiring images within a 3-dimensional room - Google Patents

Acquiring images within a 3-dimensional room

Info

Publication number
CN101843104A
CN101843104A (application CN200880113766A)
Authority
CN
China
Prior art keywords
image
overlapping frame
dimensional
indoor
information
Prior art date
Legal status
Pending
Application number
CN200880113766A
Other languages
Chinese (zh)
Inventor
约瑞·戈伊肯斯
理查德·P·克莱赫斯特
皮姆·科尔温
Current Assignee
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV
Publication of CN101843104A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Abstract

The application relates to acquiring images within a 3-dimensional room 4. The image-acquisition areas 6 of the at least two imaging units 2 overlap within the room 4 in at least one 3-dimensional overlap box 8. In order to reduce occlusion, at least one image processing unit 10 is provided, arranged for obtaining the acquired images from the at least two imaging units 2 and for determining information about the at least one 3-dimensional overlap box 8, wherein said image processing unit 10 is further arranged for outputting information about the 3-dimensional overlap box 8 for output by an information output unit 12.

Description

Acquiring images within a 3-dimensional room
Technical field
The present patent application relates to a system arranged for acquiring images within a 3-dimensional room. The application also relates to a method for acquiring images within a 3-dimensional room, to a computer program and a computer program product for acquiring images within a 3-dimensional room, and to a game console capable of acquiring images within a 3-dimensional room.
Background technology
In current game-console applications, for example video games, a camera can be used to observe one player (user) or several players (users) of the video game. The players can operate the video game by means of their motions and postures, which are acquired from the camera. Likewise, in personal-computer applications, the posture of an entity (for example a person) can be acquired in order to operate a program. For example, from US 6,901,561 B1 a method and apparatus are known for recognizing a user's actions and associating those actions with particular computer functions. According to this prior art, an image of the user can be displayed in a window on a screen. The window can comprise a target area. The method can further comprise associating a first computer event with a first user action displayed in the target area, and storing information in a memory device so that the first user action is associated with the first computer event. The system can recognize specific user actions and associate them with particular computer interpretations.
However, multi-player video games and more advanced camera applications are evolving towards the use of several cameras. When several cameras are used, they can be coarsely aligned towards the same point of the 3-dimensional room from different perspectives. The viewing angles of the cameras then overlap in the middle of the room in a 3-dimensional region that all cameras observe (the overlap region; the overlap box). This region is the ideal position for the user to carry out any actions.
Using several cameras can prevent possible occlusion of the user. However, the user may find it difficult to estimate the volume and the exact position of the overlap region; in particular, the user cannot perceive how wide the viewing angle of each camera is.
Summary of the invention
It is therefore an object of the present application to provide a method, a system, a computer program and a game console that make it easy for the user to handle a plurality of image-acquisition units (for example cameras). Another object of the application is to minimize occlusion when operating a computer program with more than one camera. A further object of the present patent application is to increase the usability of multi-camera systems.
These and other objects of the application are solved by a system comprising at least two imaging units arranged for acquiring images within a 3-dimensional room. Within the 3-dimensional room, the image-acquisition areas of the at least two imaging units can overlap in at least one 3-dimensional overlap box. At least one image processing unit can be provided, arranged for obtaining the acquired images from the at least two imaging units. The image processing unit can determine information about the at least one 3-dimensional overlap box. The image processing unit can further be arranged to output information about the 3-dimensional overlap box for output by an information output unit (12).
Obtaining information about the overlap box allows the image processing unit to output this information. Outputting this information can give the user the knowledge of how to position himself within the overlap box so as to prevent occlusion. Furthermore, the user can be instructed to move into the overlap box, so that events for operating a computer program, a video game or a video console can be provided by, for example, 3-dimensional postures within the 3-dimensional room.
The information about the at least one 3-dimensional overlap box can be determined by computing the positions and viewing angles of the image-acquisition units. An image-acquisition unit can, for example, be a camera. The cameras can mutually detect each other's positions. Camera positions can also be predefined and stored in the image processing unit. The cameras can further mutually detect each other's viewing angles and provide this information to the image processing unit. The image processing unit can thus learn the camera viewing angles and compute the at least one 3-dimensional overlap box from the known positions.
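The patent does not prescribe an algorithm for deriving the overlap box from the known camera positions and viewing angles. As a rough illustration only, the following top-view (2-D) sketch models each camera as a viewing wedge and approximates the overlap box by grid sampling; the function names, the wedge model and the sampling approach are assumptions of this sketch, not part of the patent:

```python
import math

def in_view(cam_pos, axis_deg, fov_deg, point):
    """True if `point` lies inside the camera's 2-D viewing wedge,
    i.e. within half the field of view of the camera's optical axis."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    if dx == 0 and dy == 0:
        return True
    angle = math.degrees(math.atan2(dy, dx))
    diff = (angle - axis_deg + 180) % 360 - 180  # signed offset in -180..180
    return abs(diff) <= fov_deg / 2

def overlap_box(cams, room=(4.0, 4.0), step=0.1):
    """Approximate the overlap region of all cameras by sampling a grid
    over the room and returning the axis-aligned bounding box
    (x0, y0, x1, y1) of the points seen by every camera."""
    xs, ys = [], []
    nx, ny = int(round(room[0] / step)), int(round(room[1] / step))
    for i in range(nx + 1):
        for j in range(ny + 1):
            p = (i * step, j * step)
            if all(in_view(pos, axis, fov, p) for pos, axis, fov in cams):
                xs.append(p[0])
                ys.append(p[1])
    if not xs:
        return None  # the viewing wedges do not intersect inside the room
    return (min(xs), min(ys), max(xs), max(ys))
```

For two cameras in opposite corners of a 4 m room, angled towards the centre, the returned box covers the central region both can see; a finer `step` trades speed for accuracy.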
Providing the information about the 3-dimensional overlap box at an information output unit allows the user to estimate the exact position of the overlap region. In particular, the user does not need to be concerned with the viewing angles of the cameras. Visualizing the space where the camera beams intersect each other allows the user to move about within the 3-dimensional room in a way that keeps him properly visible to all cameras (as long as he moves within the overlap box).
According to an embodiment, the image processing unit is arranged to output information about the regions of the room in which the image-acquisition areas of the at least two imaging units do not overlap. By providing information both about the overlap box and about the regions where no camera beams overlap, the user can be instructed to move precisely into the overlap box. In particular, the information about the regions where the image-acquisition areas of the at least two imaging units do not overlap can be used to indicate to the user that he is outside the overlap box.
According to another embodiment, at least three imaging units can be provided. Within the room, the image-acquisition areas of the at least three imaging units can overlap in at least one first 3-dimensional overlap box. The first 3-dimensional overlap box can be the box in which the camera beams of all cameras overlap each other. For example, with three cameras in the room there is a box in which the viewing angles of the cameras make all image-acquisition areas (that is, camera beams) overlap. This first 3-dimensional overlap box provides the best visibility of the user and the best occlusion prevention. Furthermore, in embodiments with at least three imaging units, the image-acquisition areas of two imaging units can overlap within the room in at least one second 3-dimensional overlap box. In the second 3-dimensional overlap box, only two camera beams overlap. This region can be understood as a medium-quality region, in which the user's posture can still be acquired with good accuracy, although with less accuracy than in the first 3-dimensional overlap box.
In addition to the first and second overlap boxes, there can also be regions in the 3-dimensional room in which the acquisition areas of the imaging units do not overlap at all.
The image processing unit can be arranged to determine information about the first 3-dimensional overlap box, the second 3-dimensional overlap box and the non-overlapping regions. The image processing unit can further be arranged to output this information via the information output unit. By outputting this information, the user can know where he is seen by three cameras, where he is seen by two cameras, and where he is seen by only one camera. This allows the user to move precisely to the position where occlusion is prevented best, which is the first 3-dimensional overlap box.
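The three-region distinction above (seen by all cameras, by exactly two, or by fewer) can be sketched as a simple per-point coverage count. As before, this is a 2-D top-view simplification with invented names, not the patent's method:

```python
import math

def coverage(point, cams):
    """Count how many cameras see `point` (top view; each camera is a
    wedge given as ((x, y), axis_deg, fov_deg))."""
    n = 0
    for (cx, cy), axis_deg, fov_deg in cams:
        ang = math.degrees(math.atan2(point[1] - cy, point[0] - cx))
        diff = (ang - axis_deg + 180) % 360 - 180
        if abs(diff) <= fov_deg / 2:
            n += 1
    return n

def region_label(point, cams):
    """Map the coverage count onto the regions described in the text:
    the first overlap box (all cameras), a second overlap box (at least
    two cameras), or a region without useful overlap."""
    n = coverage(point, cams)
    if n == len(cams):
        return "first overlap box"   # best occlusion prevention
    if n >= 2:
        return "second overlap box"  # medium-quality region
    return "no overlap"
```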
According to an embodiment, the information output unit comprises a display unit arranged to project the 3-dimensional room onto a display screen and to show the at least one 3-dimensional overlap box within the projected room. Showing the overlap box on the screen allows users to move themselves into this region. For example, the position of the user and the position of the overlap box can be displayed on the screen, and the user can move himself to the position of the overlap box. The view of the room and of the projected overlap box need not be limited to the actual angles of the cameras. On the contrary, the view of the room and of the overlap box shown on the screen can be rotated in up to 6 degrees of freedom, so that the user can align the on-screen view of the room with his own view of the room.
In order to display the first and second overlap boxes and the non-overlapping regions, according to an embodiment the display unit can be arranged to distinguish at least the first 3-dimensional overlap box and the non-overlapping regions by providing different optical information on the screen.
For example, the different regions and the overlap boxes can be visualized by different colours, shadings, textures, contrasts, brightnesses and the like.
According to an embodiment, the imaging units can be arranged to acquire information about an entity within the room and/or about the posture of the entity. An image of the entity can be acquired. It is also possible to acquire only the contour of the entity and to derive the posture and position of the entity from its contour.
According to an embodiment, the imaging units are arranged to acquire the spatial position of the entity. By acquiring the spatial position of the entity, this position can be related to the overlap box. This makes it possible to indicate to the user whether he is within the overlap box, and to instruct the user to move in a particular direction in order to enter the overlap box.
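Relating the entity's acquired position to the overlap box and deriving a movement instruction can be sketched as follows, assuming the box has been reduced to an axis-aligned rectangle in the floor plane; the hint wording is illustrative, not from the patent:

```python
def guidance(point, box):
    """Return movement hints that would bring an entity at `point` into
    the axis-aligned overlap box (x0, y0, x1, y1); an empty list means
    the entity is already inside the box."""
    x, y = point
    x0, y0, x1, y1 = box
    hints = []
    if x < x0:
        hints.append("move right")
    elif x > x1:
        hints.append("move left")
    if y < y0:
        hints.append("move forward")
    elif y > y1:
        hints.append("move back")
    return hints
```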
According to an embodiment, the imaging units are arranged to determine each other's positions within the room. The imaging units can also be arranged to mutually detect each other's viewing angles. For example, each imaging unit can be provided with a lighting unit (for example an LED) that allows the other imaging units to find its position. The imaging units can also communicate with each other via wired or wireless communication and exchange their viewing angles and/or positions. Alternatively, the image processing unit can determine the positions and viewing angles of the imaging units.
An embodiment provides that the at least one 3-dimensional overlap box is computed within the image processing unit at least from the position information of the imaging units.
According to an embodiment, the image processing unit can be arranged to compute the at least one 3-dimensional overlap box at least from information about the viewing angles of the imaging units. According to an embodiment, the display unit can be arranged to manipulate the display of the at least one 3-dimensional overlap box within the projected 3-dimensional room.
In order to provide users with information about their position relative to the overlap box, acoustic information can also be produced. For example, the information output unit can be arranged to provide acoustic information depending on the relative spatial position of the at least one overlap box and at least one entity. According to an embodiment, the information output unit can form part of a game console.
Another aspect of the application is a method comprising: acquiring at least two images within a 3-dimensional room, wherein, within the room, the image-acquisition areas of at least two imaging units overlap in at least one 3-dimensional overlap box; obtaining the acquired images; determining information about the at least one 3-dimensional overlap box; and outputting information about the 3-dimensional overlap box.
Another aspect of the application is a computer program comprising instructions operating a processor to acquire at least two images within a 3-dimensional room. Within the room, the image-acquisition areas of at least two imaging units can overlap in at least one 3-dimensional overlap box. The acquired images can be obtained. Information about the at least one 3-dimensional overlap box can be determined. Information about the 3-dimensional overlap box can be output.
Another aspect of the application is a computer program product comprising instructions operating a processor to acquire at least two images within a 3-dimensional room. Within the room, the image-acquisition areas of at least two imaging units can overlap in at least one 3-dimensional overlap box. The acquired images can be obtained. Information about the at least one 3-dimensional overlap box can be determined. Information about the 3-dimensional overlap box can be output.
Another aspect of the application is a game console. The game console can comprise at least one image processing unit arranged for obtaining the acquired images from at least two imaging units. The image processing unit can further be arranged to determine information about at least one 3-dimensional overlap box in which, within the room, the image-acquisition areas of the at least two imaging units overlap. The image processing unit can further be arranged to output information about the 3-dimensional overlap box for output by an information output unit. A processor can be provided for processing information about an entity relative to the overlap box.
These and other aspects of the application will become apparent from and be elucidated with reference to the following drawings.
Description of drawings
Fig. 1 shows a room with two cameras;
Fig. 2 shows a top view of a room with three cameras;
Fig. 3 shows a system for acquiring images;
Fig. 4 shows a camera for acquiring images.
Embodiment
Fig. 1 schematically shows a 3-dimensional room 4. Within the 3-dimensional room 4, two imaging units 2a, 2b are arranged, which can be cameras such as CCD cameras. Image-acquisition areas 6a, 6b are shown schematically. Image-acquisition area 6a is the area from which imaging unit 2a can acquire images. The size of image-acquisition area 6a is limited by the viewing angle of imaging unit 2a. Imaging area 6b is limited by the viewing angle of imaging unit 2b. Imaging areas 6a, 6b intersect in a region illustrated as overlap box 8. The overlap of image-acquisition areas 6a, 6b is the overlap box 8, within which both imaging units 2a, 2b acquire images of the room 4.
Also shown are regions 14, in which the image-acquisition areas 6a, 6b do not overlap and in which only one of the imaging units 2a, 2b acquires images.
For example, when the view of the overlap box 8 from imaging unit 2b is blocked, for example because an object is positioned in front of the overlap box 8, imaging unit 2a can still acquire images of entities within the overlap box 8. Occlusion within the overlap box 8 can thus be minimized or prevented.
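Why the overlap box minimizes occlusion can be illustrated with a line-of-sight test: an obstacle blocks at most some cameras' views of an entity in the box, and the system can fall back on an unblocked camera. The 2-D segment-intersection model below is illustrative only, not part of the patent:

```python
def _orient(a, b, c):
    """Orientation of the point triple (a, b, c): 1 counter-clockwise,
    -1 clockwise, 0 collinear."""
    v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (v > 0) - (v < 0)

def segments_intersect(p1, p2, q1, q2):
    """True if the segments p1-p2 and q1-q2 properly cross each other."""
    return (_orient(p1, p2, q1) != _orient(p1, p2, q2) and
            _orient(q1, q2, p1) != _orient(q1, q2, p2))

def visible_cameras(entity, cams, obstacle):
    """Indices of cameras whose line of sight to the entity is not cut
    by the obstacle segment; inside the overlap box, at least one
    camera normally remains in this list, which is the point of the box."""
    return [i for i, cam in enumerate(cams)
            if not segments_intersect(cam, entity, obstacle[0], obstacle[1])]
```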
Fig. 2 shows a top view of the room 4, in which, in addition to imaging units 2a, 2b, an imaging unit 2c is provided. As can be seen, the image-acquisition areas 6a, 6b and 6c (image-acquisition area 6c belonging to imaging unit 2c) overlap in a first overlap box 8a in the middle of the room 4. Within this first overlap box 8a, all three imaging units 2a-c can acquire images. Besides overlap box 8a, there are also overlap boxes 8b, indicated with straight lines, in which images are acquired by only two of the imaging units 2a, 2b, 2c. These can be understood as second overlap boxes 8b. Besides the overlap boxes 8, there are also regions 14 in which only one imaging unit 2 can acquire images.
Fig. 3 shows a system with a room 4 having imaging units 2 as shown in Figs. 1 and 2. Also shown are an image processing unit 10, a processor 24, a game console 20, output units 12a, 12b and a computer program 22. Output unit 12a comprises a screen 18, and output unit 12b comprises a loudspeaker.
The image processing unit 10 can obtain the images of the room 4 acquired by the imaging units 2. Furthermore, the imaging units 2 can communicate with each other and exchange their viewing angles and their positions within the room 4. The information about the positions and viewing angles of the imaging units 2 can also be communicated to the image processing unit 10.
From the information about the viewing angles and positions of the imaging units 2, the image processing unit 10 can compute the information about the first overlap box 8a, the second overlap boxes 8b and the regions 14 shown in Fig. 2.
After computing this information, the image processing unit 10 can provide it to the processor 24. Within the processor 24, an interface 24a can receive the information about the images acquired by the imaging units 2 as well as the information about the overlap boxes 8 and the regions 14. A controller 24b processes the information about the overlap boxes 8 and the regions 14 together with the image information from the imaging units 2.
The image information is provided to an interface 24c.
In addition to the image information from the imaging units 2, the information about the overlap boxes 8 and the regions 14 is provided to interface 24c. Interface 24c provides the information to output unit 12a. Based on the information provided, a region of the screen 18 shows a projection of the room 4, the projected room 16. Within the projected room 16, the room 4 is illustrated graphically. Besides the room 4, at least one overlap box 8 within the room 4 is illustrated in the projected room 16. Furthermore, the positions of the cameras, the viewing angles and image-acquisition areas 6 of the imaging units 2, and the regions 14 can be indicated in the projected room 16. The projected room 16 can be moved freely with respect to the screen 18 in up to 6 degrees of freedom, such as tilt, roll and pan, so that the view in the projected room 16 can be aligned with the position of the entity within the room 4. The entity within the room 4 can also be displayed in the projected room 16. By displaying the entity and the overlap box 8 in the projected room 16, the entity within the room 4 can position itself within the overlap box 8. The entity can thereby make sure that it is seen by at least two or even more imaging units 2.
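Drawing the projected room 16 on the screen 18 requires projecting the 3-D room and the corners of the overlap box onto the display plane. A minimal pinhole-projection sketch (a fixed +z view axis is assumed here, although the text allows the view to be rotated in up to 6 degrees of freedom; names are illustrative):

```python
def project(point3d, viewpoint, f=1.0):
    """Pinhole projection of a 3-D room point onto the display plane.
    The virtual view axis is +z from `viewpoint`."""
    x = point3d[0] - viewpoint[0]
    y = point3d[1] - viewpoint[1]
    z = point3d[2] - viewpoint[2]
    if z <= 0:
        return None  # behind the virtual viewpoint, not drawable
    return (f * x / z, f * y / z)

def box_outline(box_min, box_max, viewpoint):
    """Project the 8 corners of the 3-D overlap box for drawing its
    outline within the projected room on the screen."""
    corners = [(x, y, z)
               for x in (box_min[0], box_max[0])
               for y in (box_min[1], box_max[1])
               for z in (box_min[2], box_max[2])]
    return [project(c, viewpoint) for c in corners]
```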
Besides the graphical output of the overlap boxes 8 and the regions 14 on the screen 18, an interface 24d can output information about the overlap boxes 8 and the regions 14 to the loudspeaker 12b. Information about the entity can also be processed for the loudspeaker 12b. Interface 24d can compute the relative position between the entity within the room 4 and the overlap box 8. If the entity is outside the overlap box 8, interface 24d can instruct the loudspeaker 12b to emit information telling the entity to move into the box. This can be left/right information conveyed with different sounds, or other instructions indicating to the entity how to move into the overlap box 8.
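One conceivable way the loudspeaker cue could encode left/right information is by stereo-panning the sound towards the overlap box; the mapping below is purely an illustrative assumption, since the patent only calls for different sounds or instructions:

```python
def pan_toward_box(x, box, room_width):
    """Map the horizontal offset between the entity (at coordinate x)
    and the centre of the overlap box (x0, y0, x1, y1) to a stereo pan
    in [-1.0, 1.0]: negative plays the cue on the left speaker,
    steering the listener towards the box."""
    cx = (box[0] + box[2]) / 2.0
    pan = (cx - x) / (room_width / 2.0)
    return max(-1.0, min(1.0, pan))
```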
Fig. 4 shows an imaging unit 2. The imaging unit 2 can comprise an objective lens 2a and an image sensor 2b for acquiring the images obtained through the objective lens 2a. In addition, the imaging unit 2 can comprise an LED 2c and an optical sensor 2d. A processor 2f can instruct the LED 2c to blink. A processor 2e can obtain from the sensor 2d information about the relative positions of the blinking LEDs of the other imaging units 2, thus allowing information about the positions of the other cameras to be obtained. By instructing the LED 2c to blink, the position of the illustrated imaging unit 2 can be obtained by the other cameras. Besides the image information from the sensor 2b, a further processor 2g can process the position information obtained via processor 2e. The image information, position information and viewing-angle information can be output from the imaging unit 2 by the processor 2g.
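The mutual position detection via blinking LEDs is not specified in detail. One conceivable sketch is frame differencing between an LED-on and an LED-off capture to find the brightest change, which would locate the other camera's LED in the sensor image. Everything below (plain nested-list frames, a single LED, no noise handling) is an assumption of this sketch:

```python
def detect_blinking_led(frame_on, frame_off):
    """Locate another camera's blinking LED by differencing a frame
    captured with the LED on against one captured with it off,
    returning the pixel position of the largest increase in intensity
    together with that increase."""
    best, best_pos = None, None
    for r, (row_on, row_off) in enumerate(zip(frame_on, frame_off)):
        for c, (a, b) in enumerate(zip(row_on, row_off)):
            d = a - b
            if best is None or d > best:
                best, best_pos = d, (r, c)
    return best_pos, best
```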
By providing the information about the overlap box, the application allows occlusion to be reduced and the entity to be guided easily into the overlap box. This can improve the usability of cameras controlled by a video console.

Claims (18)

1. A system, comprising:
-at least two imaging units (2) for acquiring images within a 3-dimensional room (4),
-wherein, within the room (4), the image-acquisition areas (6) of the at least two imaging units (2) overlap in at least one 3-dimensional overlap box (8),
-at least one image processing unit (10) for obtaining the acquired images from the at least two imaging units (2) and for determining information about the at least one 3-dimensional overlap box (8),
-wherein said image processing unit (10) is further arranged to output information about the 3-dimensional overlap box (8) for output by an information output unit (12).
2. The system according to claim 1, wherein said image processing unit (10) is further arranged to output information about regions (14) of the room (4) in which the image-acquisition areas (6) of the at least two imaging units (2) do not overlap.
3. The system according to claim 1, wherein at least three imaging units (2) are provided,
-wherein, within the room (4), the image-acquisition areas (6) of the at least three imaging units (2) overlap in at least one first 3-dimensional overlap box (8a),
-wherein, within the room (4), the image-acquisition areas (6) of two imaging units (2) overlap in at least one second 3-dimensional overlap box (8b),
-wherein there are regions (14) in which the acquisition areas of the imaging units (2) do not overlap,
-wherein the image processing unit (10) determines information about the first 3-dimensional overlap box (8a), the second 3-dimensional overlap box (8b) and the non-overlapping regions (14), and
-wherein the image processing unit (10) is further arranged to output the information about the first 3-dimensional overlap box (8a), the second 3-dimensional overlap box (8b) and the non-overlapping regions (14) for output by the information output unit (12).
4. The system according to claim 1, wherein the information output unit (12) comprises a display unit (12a) arranged to project the room (4) onto a 2-dimensional display screen (18) as a projected room (16) and to show the at least one 3-dimensional overlap box within the projected room (16).
5. The system according to claim 4, wherein the display unit (12a) distinguishes at least the first 3-dimensional overlap box (8a) and the non-overlapping regions (14) by providing different optical information on the screen (18).
6. The system according to claim 1, wherein the imaging units (2) acquire information about an entity within the room (4) and/or about the posture of the entity.
7. The system according to claim 6, wherein the imaging units (2) acquire the spatial position of the entity.
8. The system according to claim 6, wherein the information output unit (12) outputs information about the entity relative to the at least one overlap box (8).
9. The system according to claim 1, wherein the imaging units (2) acquire each other's positions within the room (4).
10. The system according to claim 9, wherein the image processing unit (10) computes the at least one 3-dimensional overlap box (8) at least from the position information of the imaging units (2).
11. The system according to claim 1, wherein the image processing unit (10) computes the at least one 3-dimensional overlap box (8) at least from information about the viewing angles of the imaging units (2).
12. The system according to claim 4, wherein the display unit (12a) manipulates the display of the at least one 3-dimensional overlap box (8) within the projected 3-dimensional room (16).
13. The system according to claim 6, wherein the information output unit (12) provides acoustic information depending on the relative spatial position of the at least one overlap box (8) and at least one entity.
14. The system according to claim 1, wherein the image processing unit (10) and the information output unit (12) form part of a game console (20).
15. A method, comprising:
-acquiring at least two images within a 3-dimensional room (4),
-wherein, within the room (4), the image-acquisition areas (6) of at least two imaging units (2) overlap in at least one 3-dimensional overlap box (8),
-obtaining the acquired images,
-determining information about the at least one 3-dimensional overlap box (8), and
-outputting information about the 3-dimensional overlap box (8).
16. A computer program comprising instructions which operate a processor (24) to carry out the following:
-acquiring at least two images within a 3-dimensional room (4),
-wherein, within the room (4), the image-acquisition areas (6) of at least two imaging units (2) overlap in at least one 3-dimensional overlap box (8),
-obtaining the acquired images,
-determining information about the at least one 3-dimensional overlap box (8),
-outputting information about the 3-dimensional overlap box (8).
17. A computer program product comprising instructions which operate a processor (24) to carry out the following:
-acquiring at least two images within a 3-dimensional room (4),
-wherein, within the room (4), the image-acquisition areas (6) of at least two imaging units (2) overlap in at least one 3-dimensional overlap box (8),
-obtaining the acquired images,
-determining information about the at least one 3-dimensional overlap box (8),
-outputting information about the 3-dimensional overlap box (8).
18. A game console, comprising:
-at least one image processing unit (10) for obtaining the acquired images from at least two imaging units (2) and for determining information about at least one 3-dimensional overlap box (8) in which, within the room (4), the image-acquisition areas of the at least two imaging units overlap,
-wherein said image processing unit (10) further outputs information about the 3-dimensional overlap box (8) for output by an information output unit (12), and
-a processor (24) for processing information about an entity relative to the overlap box (8).
CN200880113766A 2007-11-02 2008-10-28 Acquiring images within a 3-dimensional room Pending CN101843104A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP07119889 2007-11-02
EP07119889.9 2007-11-02
PCT/IB2008/054443 WO2009057042A2 (en) 2007-11-02 2008-10-28 Acquiring images within a 3-dimensional room

Publications (1)

Publication Number Publication Date
CN101843104A 2010-09-22

Family

ID=40468825

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200880113766A Pending CN101843104A (en) 2007-11-02 2008-10-28 Acquiring images within a 3-dimensional room

Country Status (4)

Country Link
US (1) US20100248831A1 (en)
EP (1) EP2215849A2 (en)
CN (1) CN101843104A (en)
WO (1) WO2009057042A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102638657A (en) * 2011-02-14 2012-08-15 索尼公司 Information processing apparatus and imaging region sharing determination method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6842175B1 (en) * 1999-04-22 2005-01-11 Fraunhofer Usa, Inc. Tools for interacting with virtual environments
US6901561B1 (en) * 1999-10-19 2005-05-31 International Business Machines Corporation Apparatus and method for using a target based computer vision system for user interaction
US20030052971A1 (en) * 2001-09-17 2003-03-20 Philips Electronics North America Corp. Intelligent quad display through cooperative distributed vision
US20040196282A1 (en) * 2003-02-14 2004-10-07 Oh Byong Mok Modeling and editing image panoramas
US7292257B2 (en) * 2004-06-28 2007-11-06 Microsoft Corporation Interactive viewpoint video system and process
US7142209B2 (en) * 2004-08-03 2006-11-28 Microsoft Corporation Real-time rendering system and process for interactive viewpoint video that was generated using overlapping images of a scene captured from viewpoints forming a grid
US7697750B2 (en) * 2004-12-06 2010-04-13 John Castle Simmons Specially coherent optics
US7978928B2 (en) * 2007-09-18 2011-07-12 Seiko Epson Corporation View projection for dynamic configurations


Also Published As

Publication number Publication date
WO2009057042A3 (en) 2009-06-25
WO2009057042A2 (en) 2009-05-07
EP2215849A2 (en) 2010-08-11
US20100248831A1 (en) 2010-09-30

Similar Documents

Publication Publication Date Title
CN106062862B (en) System and method for immersive and interactive multimedia generation
JP5966510B2 (en) Information processing system
KR102105189B1 (en) Apparatus and Method for Selecting Multi-Camera Dynamically to Track Interested Object
US20120176303A1 (en) Gesture recognition apparatus and method of gesture recognition
JP2017174125A (en) Information processing apparatus, information processing system, and information processing method
JP2010086336A (en) Image control apparatus, image control program, and image control method
US10386633B2 (en) Virtual object display system, and display control method and display control program for the same
JP2015084002A (en) Mirror display system and image display method thereof
JP2016151798A (en) Information processing device, method, and program
JP2006202181A (en) Image output method and device
JP2024050696A (en) Information processing device, user guide presentation method, and head-mounted display
JP2016213674A (en) Display control system, display control unit, display control method, and program
JP6649010B2 (en) Information processing device
WO2009119288A1 (en) Communication system and communication program
KR20120064831A (en) Three dimensional camera device and control method thereof
US11589001B2 (en) Information processing apparatus, information processing method, and program
JP4960270B2 (en) Intercom device
US10634891B2 (en) Medical observation device, lens driving control device, lens driving control method, and video microscope device
JP6690637B2 (en) Medical observation device, information processing method, program, and video microscope device
JP6164780B2 (en) A moving image processing apparatus, a moving image processing method, a moving image processing program, and a moving image processing display system.
CN101843104A (en) Acquiring images within a 3-dimensional room
JP2020088840A (en) Monitoring device, monitoring system, monitoring method, and monitoring program
JP6467039B2 (en) Information processing device
JP3875199B2 (en) Imaging device
JP7395296B2 (en) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20100922