WO2015122108A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program
- Publication number: WO2015122108A1 (PCT/JP2014/084350)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
- G06T15/205—Image-based rendering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/12—Panospheric to cylindrical image transformations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/60—Rotation of whole images or parts thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
Definitions
- The present disclosure relates to an information processing apparatus, an information processing method, and a program.
- In recent years, first-person viewpoint images from wearable devices such as head-mounted cameras have been used to create various types of content that conveys human experiences directly to others.
- Interfaces have also been proposed in which such first-person viewpoint images are transmitted to another person, so that experiences can be shared and the other person's knowledge and instructions can be sought.
- For example, Patent Document 1 discloses a technique for transmitting an image captured by a head-mounted imaging device to another device so that the image can be viewed on that device.
- In view of this, the present disclosure provides an information processing apparatus, an information processing method, and a program that enable a space to be shared while maintaining freedom of line of sight.
- According to the present disclosure, there is provided an information processing apparatus including a control unit that performs control to display, in a display region visually recognized by a user, a display image generated based on: image information generated by imaging by an imaging device mounted on a moving body that moves in a space; imaging device posture information, which is information related to the posture of the imaging device; and user viewing information, which is obtained from a user operation device operated by the user and identifies a region that the user wants to view.
- According to the present disclosure, there is also provided an information processing method including performing control to display, in a display region visually recognized by a user, a display image generated based on: image information generated by imaging by an imaging device mounted on a moving body that moves in a space; imaging device posture information, which is information related to the posture of the imaging device; and user viewing information obtained from a user operation device operated by the user that identifies a region that the user wants to view.
- According to the present disclosure, there is further provided a program that causes a computer to realize a function of performing control to display, in a display region visually recognized by a user, a display image generated based on: image information generated by imaging by an imaging device mounted on a moving body that moves in a space; imaging device posture information, which is information related to the posture of the imaging device; and user viewing information obtained from a user operation device operated by the user that identifies a region that the user wants to view.
- FIG. 1 is an explanatory diagram illustrating a schematic configuration of a system according to a first embodiment of the present disclosure.
- FIG. 2 is an explanatory diagram illustrating a schematic configuration of an apparatus according to the same embodiment. FIG. 3 is an explanatory diagram schematically illustrating an example of a wearable device according to the embodiment.
- FIGS. 4A and 4B are block diagrams each illustrating an example of the configuration of an information processing apparatus according to the embodiment.
- FIG. 10 is an explanatory diagram illustrating a schematic configuration of a system according to a third embodiment of the present disclosure, and the subsequent figures are explanatory diagrams illustrating schematic configurations of imaging devices according to that embodiment.
- FIG. 1 is a diagram illustrating a schematic configuration of a system according to the first embodiment of the present disclosure.
- the system 10 in this embodiment includes a server 100 and clients 200 to 700.
- The server 100 is a collection of functions realized by a single server device, or by a plurality of server devices connected via a wired or wireless network that cooperate with each other, and provides various services to the client devices 200 to 700.
- the client devices 200 to 700 are terminal devices connected to the server 100 through various wired or wireless networks.
- the server 100 and the client devices 200 to 700 realize at least one of the following functions (1) to (7) in the system 10 by themselves or by cooperating with each other.
- (1) An apparatus that has an imaging mechanism such as a camera and provides captured real-space images to the server 100 or the other client devices 200 to 700.
- (2) An apparatus that has an imaging mechanism such as a camera, performs various image processing on the captured real-space image, and provides various images related to the real space obtained by the image processing to the server 100 or the other client devices 200 to 700.
- (3) An apparatus that has an imaging mechanism such as a camera, performs various image processing on the captured real-space image, generates images desired by the user according to the user's various operations on those images, and provides the generated images to the server 100 or the other client devices 200 to 700.
- (4) An apparatus that has at least a display mechanism such as a display, preferably further has an operation mechanism such as a touch panel, acquires an image provided by the apparatus of (1), and accepts the user's various operations on the image.
- (5) An apparatus that has at least a display mechanism such as a display, preferably further has an operation mechanism such as a touch panel, acquires an image provided by the apparatus of (3), uses it for viewing by the user, and accepts the user's various operations on the image.
- the client device 200 is a wearable terminal (hereinafter also simply referred to as a wearable terminal 200).
- the wearable terminal 200 has, for example, at least one of an imaging mechanism and a display mechanism, and functions as at least one of the above (1) to (7).
- The wearable terminal 200 is a glasses type, but it is not limited to this example as long as it has a shape that can be worn on the user's body.
- the wearable terminal 200 has a camera installed in, for example, a frame portion of glasses as an imaging mechanism. With this camera, wearable terminal 200 can acquire an image in real space from a position close to the user's viewpoint. The acquired image is transmitted to the server 100 or other client devices 300 to 700.
- the wearable terminal 200 has, for example, a display installed on a part or all of the lens portion of the glasses as a display mechanism. Wearable terminal 200 displays an image captured by the camera on the display.
- the client device 300 is a tablet terminal (hereinafter also simply referred to as a tablet terminal 300).
- the tablet terminal 300 includes at least a display mechanism, preferably further includes an operation mechanism, and can function as, for example, the devices (4) to (7) described above.
- the tablet terminal 300 may further include an imaging mechanism in addition to the display mechanism and the operation mechanism, and may function as at least one of the above-described (1) to (3) devices. That is, the tablet terminal 300 can function as any device among the devices (1) to (7).
- the client device 400 is a mobile phone (smart phone) (hereinafter also simply referred to as the mobile phone 400).
- Since the function of the mobile phone 400 in the system 10 is the same as that of the tablet terminal 300, a detailed description is omitted.
- Note that a device such as a portable game machine, a portable music player, or a digital camera, if it has a communication mechanism together with a display mechanism, an operation mechanism, or an imaging mechanism, can function in the same manner as the tablet terminal 300 and the mobile phone 400.
- the client device 500 is a laptop PC (hereinafter, also simply referred to as a laptop PC 500).
- the laptop PC 500 has a display mechanism and an operation mechanism, and functions as the devices (4) to (7).
- the laptop PC 500 is treated as an example of a device that does not function as the devices (1) to (3) because it is basically fixed and used.
- a desktop PC or a television can function in the same manner as the laptop PC 500.
- The laptop PC 500 has a display as a display mechanism and a mouse and a keyboard as operation mechanisms, displays images provided from the devices (1) to (3) directly or via various other devices, and accepts the user's various operations on those images.
- the laptop PC 500 further includes an imaging mechanism such as a camera, the laptop PC 500 can also function as the devices (1) to (3).
- the client device 600 is a fixed camera (hereinafter also simply referred to as a fixed camera 600).
- the fixed camera 600 has an imaging mechanism and functions as the devices (1) to (3).
- the fixed camera 600 is treated as an example of a device that is used in a fixed manner and does not function as the devices (4) to (7) because it does not have a display mechanism.
- When a camera that captures the area in front of the screen is provided on a desktop PC or television, or when a movable device such as a digital camera is temporarily fixed on a tripod, these devices can function in the same manner as the fixed camera 600.
- The fixed camera 600 has a camera as an imaging mechanism and can acquire images of the real space from a fixed viewpoint (including cases where the camera swings automatically or in response to an operation by the user viewing the captured image).
- the client device 700 is a projector (hereinafter also simply referred to as the projector 700).
- the projector 700 has a projection device as a display mechanism, and functions as the device (7).
- Since the projector 700 has neither an imaging mechanism nor an operation mechanism for receiving input to the displayed (projected) image, it is treated as an example of a device that does not function as the devices (1) to (6) described above.
- the projector 700 displays various images in real space by projecting images onto the surface of a screen or object using a projection device.
- the projector 700 is shown as a fixed type, but may be a handheld type.
- The server 100 functions as at least one of the devices (1) to (7) described above, by itself or in cooperation with the client devices 200 to 700. That is, the server 100 has functions of acquiring a real-space image, performing various image processing on the obtained image, and displaying at least one of the acquired real-space image and an image obtained by the image processing.
- By means of the server 100 and the client devices 200 to 700 described above, a user can browse images of a real space in which a moving body exists, such as various creatures including human beings, self-propelled bodies that travel on the ground surface, underground, or underwater, or flying bodies that fly in the air, and the space can be shared between such moving bodies and the user.
- Furthermore, by performing the processing described in detail below, the user can also view images of the real space in which the moving body exists freely and independently of the moving body.
- As described above, the system 10 according to the present embodiment includes a device that can acquire an image of a real space, a device that can present the real-space image for viewing by the user and accept the user's various operations, and a device that displays an image generated by those operations.
- FIG. 2 is a diagram showing a schematic configuration of the apparatus according to the present embodiment.
- the apparatus 900 includes a processor 910 and a memory 920.
- the apparatus 900 may further include at least one of a display unit 930, an operation unit 940, a communication unit 950, an imaging unit 960, or a sensor 970. These components are connected to each other by a bus 980.
- the device 900 can realize, for example, the server device configuring the server 100 and the client devices 200 to 700.
- The processor 910 is any of various processors such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor), and realizes various functions such as computation and control by operating according to a program stored in, for example, the memory 920.
- the processor 910 realizes a control function of the entire apparatus of the server 100 and the client devices 200 to 700, for example.
- the processor 910 executes various image processing as described later and display control for displaying an image on a display screen.
- the memory 920 is configured by a storage medium such as a semiconductor memory or a hard disk, and stores a program and data for processing by the apparatus 900.
- the memory 920 may store captured image data acquired by the imaging unit 960 and sensor data acquired by the sensor 970.
- These programs and data may also be acquired from an external data source (for example, a data server, network storage, or external memory).
- the display unit 930 is provided in a client having the above-described display mechanism, for example.
- the display unit 930 can be a display corresponding to the shape of the device 900, for example.
- the wearable terminal 200 may have a display having a shape corresponding to, for example, the lens portion of the glasses or a shape corresponding to the display area of the head mounted display.
- the tablet terminal 300, the mobile phone 400, and the laptop PC 500 may have a flat panel display provided in each case.
- the display unit 930 may be a projection device that projects an image on an object.
- the projector 700 can have a projection device as a display unit.
- the operation unit 940 is provided in a client having the above-described operation mechanism, for example.
- The operation unit 940 is configured by, for example, a pointing device such as a touch sensor provided on the display (which, together with the display, forms a touch panel), a touch pad, or a mouse, combined as necessary with a keyboard, buttons, switches, and the like.
- the operation unit 940 identifies a position in an image displayed on the display unit 930 by using, for example, a pointing device, and accepts a user operation for inputting some information to the position using a keyboard, a button, a switch, or the like.
- the operation unit 940 may specify a position in an image displayed on the display unit 930 by a pointing device, and further accept a user operation to input some information with respect to the position using the pointing device.
- the communication unit 950 is a communication interface that mediates communication between the device 900 and other devices.
- the communication unit 950 supports an arbitrary wireless communication protocol or wired communication protocol, and establishes a communication connection with another device.
- the imaging unit 960 is a camera module that captures an image.
- the imaging unit 960 images a real space using an imaging element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) to generate a captured image.
- a series of captured images generated by the imaging unit 960 constitutes a video.
- the imaging unit 960 is not necessarily a part of the apparatus 900.
- an imaging device connected to the device 900 by wire or wireless may be handled as the imaging unit 960.
- the imaging unit 960 may include a depth sensor that measures the distance between the imaging unit 960 and the subject for each pixel. The depth data output from the depth sensor can be used for environment recognition in an image obtained by capturing a real space as will be described later.
- the sensor 970 may include various sensors such as a positioning sensor, an acceleration sensor, and a gyro sensor.
- The measurement results obtained by the sensor 970 may be used for various purposes, such as supporting environment recognition in images of the captured real space, acquiring data specific to a geographical position, or detecting user input.
- the sensor 970 can be provided in a device having the imaging unit 960 (in the above example, the wearable terminal 200, the tablet terminal 300, the mobile phone 400, or the fixed camera 600).
- The type of captured image handled by the information processing apparatus 1000 according to the present embodiment is not particularly limited, and may be a still image or a moving image.
- The captured image handled by the information processing apparatus 1000 according to the present embodiment is preferably an image that captures as wide a range of the real space as possible. Therefore, the imaging device used for capturing the real space is preferably a camera with as wide-angle a lens as possible, and more preferably an omnidirectional camera, for example as schematically shown in FIG. 3.
- FIG. 3 schematically illustrates a configuration in a case where an omnidirectional camera for imaging a real space is realized as the wearable terminal 200.
- In FIG. 3, cameras with as wide-angle lenses as possible are provided in an annular arrangement covering the periphery of a human head, which is an example of a moving body. Further, since it is difficult to obtain an image in the zenith direction simply by installing cameras around the head, in FIG. 3 a camera is also provided at the top of the head.
- The wearable terminal 200 is provided with various sensors such as a positioning sensor, an acceleration sensor, and a gyro sensor. Information on the line of sight of the imaging device (in other words, the posture of the imaging device) output from these sensors is output to the information processing apparatus described later, where it is used as imaging device posture information, that is, information related to the posture of the imaging device.
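As a rough illustration of how such sensor output can become posture information, the sketch below integrates gyroscope angular-velocity samples into yaw, pitch, and roll angles. This is a hypothetical, simplified example and not part of the disclosure: the function name, the fixed sample rate, and the independent-axis integration are all assumptions, and real systems typically fuse gyro, accelerometer, and magnetometer data.

```python
import math

def integrate_gyro(samples, dt):
    """Integrate angular-velocity samples into cumulative orientation angles.

    samples: sequence of (yaw_rate, pitch_rate, roll_rate) tuples in rad/s.
    dt: time between samples in seconds.

    A deliberately simplified sketch: the three axes are treated as
    independent, which only holds for small inter-sample rotations.
    """
    yaw = pitch = roll = 0.0
    for wy, wp, wr in samples:
        yaw += wy * dt
        pitch += wp * dt
        roll += wr * dt
    # Wrap yaw into [-pi, pi) so it behaves like a compass heading.
    yaw = (yaw + math.pi) % (2.0 * math.pi) - math.pi
    return yaw, pitch, roll
```

In a fuller system the resulting angles (or a quaternion built from them) would be attached to each captured frame as its posture information.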
- In FIG. 3, a case where the cameras are arranged in a ring shape to obtain an omnidirectional image is shown.
- the camera does not have to be arranged in an annular shape, and it is sufficient that the camera is provided on at least a part of the human head.
- the number of cameras used for realizing the wearable terminal 200 as shown in FIG. 3 is not limited, and the number of cameras may be set as appropriate depending on how much images are desired to be acquired.
- In FIG. 3, the moving body is a human being, but the moving body is not limited to a human being and may be an animal other than a human wearing the wearable terminal 200, or a self-propelled or flying body such as a robot equipped with a camera.
- The information processing apparatus 1000, which performs various types of information processing on the captured images captured by the imaging device illustrated in FIG. 3, is a device that performs control to display, in a display region visually recognized by the user, a display image generated based on image information generated by imaging by the imaging device mounted on a moving body that moves in a space, imaging device posture information that is information related to the posture of the imaging device, and user viewing information that is obtained from a user operation device operated by the user and specifies a region that the user wants to visually recognize.
- The imaging device posture information is, for example, information related to the rotation of the imaging device.
- The user viewing information may be, for example, information that specifies the display angle of view that the user wants to visually recognize within the omnidirectional image captured by the imaging device.
- the information processing apparatus 1000 includes at least a display control unit 1050 that is an example of a control unit.
- As illustrated in FIG. 4B, the information processing apparatus 1000 may further include at least one of an image generation unit 1010, an image selection unit 1020, an image correction unit 1030, a moving-body line-of-sight information generation unit 1040, a data acquisition unit 1060, a data provision unit 1070, and a storage unit.
- Each processing unit illustrated in FIGS. 4A and 4B may be realized in any one of the server 100 or the client devices 200 to 700, or may be realized in a distributed manner across a plurality of devices.
- the information processing apparatus 1000 performs display control of a display image generated based on the captured image captured by the imaging apparatus, the imaging apparatus attitude information, and the user viewing information.
- The information processing apparatus 1000 may also perform display control in the same manner as described below based on an image generated, by the imaging device or by another device different from both the imaging device and the information processing apparatus, from the captured image and the imaging device posture information (for example, a corrected image in which changes in the orientation of the imaging device have been corrected in advance), together with the user viewing information.
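One way such an orientation-corrected image can be produced is sketched below under simplifying assumptions: in an equirectangular panorama, rotation of the camera about the vertical axis corresponds to a horizontal, wrap-around shift of the image columns, so counter-shifting the columns by the sensed yaw yields an image whose orientation is independent of that rotation. The function name and sign convention are hypothetical, not taken from the disclosure.

```python
def correct_yaw(equirect_rows, camera_yaw_deg):
    """Counter-rotate an equirectangular panorama about the vertical axis.

    equirect_rows: list of pixel rows; each row spans 360 degrees of yaw.
    camera_yaw_deg: the camera's current yaw from the posture sensors.

    Rotating the camera by +yaw shifts the panorama columns, so shifting
    them back by the same amount produces the 'corrected image' that no
    longer changes as the camera wearer turns their head.
    """
    if not equirect_rows:
        return []
    width = len(equirect_rows[0])
    shift = int(round(camera_yaw_deg / 360.0 * width)) % width
    return [row[shift:] + row[:shift] for row in equirect_rows]
```

Pitch and roll would require a full spherical rotation (e.g. via a rotation matrix on each pixel's direction vector) rather than a column shift, which is why only the yaw case is shown.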
- the image generation unit 1010 generates a surrounding captured image in which the periphery of the position where the moving body exists is captured using the captured image captured by the imaging device attached to the moving body moving in the space. For example, when a captured image is output from an imaging device as illustrated in FIG. 3, the surrounding captured image generation processing by the image generation unit 1010 is performed as needed in real time.
- When a plurality of captured images are output from the imaging device as illustrated in FIG. 3, the surrounding captured image is generated by the image generation unit 1010 integrating those captured images.
- a method for generating a surrounding captured image from a plurality of captured images captured by a plurality of cameras is not particularly limited, and a known method may be applied.
- The image generation unit 1010 may generate, as the surrounding captured image, a rectangular image equivalent to the spherical (omnidirectional) image as shown in FIG. 6, instead of the spherical image as shown in FIG. 5.
- a rectangular image equivalent to the global image can be generated by converting the global image by a known method such as equirectangular projection.
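The equirectangular projection mentioned above maps longitude and latitude on the sphere linearly to horizontal and vertical pixel coordinates, which is what makes the spherical and rectangular images equivalent. A minimal sketch follows; the function name and axis conventions are assumptions for illustration only.

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a unit-sphere viewing direction (x, y, z) to (u, v) pixel
    coordinates in an equirectangular image.

    Convention assumed here: +z is forward, +y is up; longitude maps to
    the horizontal axis and latitude to the vertical axis.
    """
    lon = math.atan2(x, z)                    # [-pi, pi], 0 = forward
    lat = math.asin(max(-1.0, min(1.0, y)))   # [-pi/2, pi/2], + = up
    u = (lon + math.pi) / (2.0 * math.pi) * (width - 1)
    v = (math.pi / 2.0 - lat) / math.pi * (height - 1)
    return u, v
```

Sampling this mapping for every output pixel (and interpolating between source pixels) converts the spherical image into the rectangular image of FIG. 6.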
- The image selection unit 1020 selects, as the user viewing image, the portion of the surrounding captured image generated by the image generation unit 1010 that corresponds to the user viewing information, which indicates the space that the user wants to view and is obtained from the user operation device operated by the user.
- the user viewing image selected by the image selection unit 1020 is provided to the user operation device operated by the user (for example, in the example shown in FIGS. 5 and 6, the wearable terminal 200 such as a head mounted display worn by a user different from the moving body) and used for browsing by the user.
- the user operating the user operation device can thereby share the space with the moving body that moves in a certain space, and can select the position that he or she desires to view in that space independently of the moving body. As a result, the user can freely select an image at a position different from the position visually recognized by the moving body.
- such generation processing of surrounding captured images and image selection processing from surrounding captured images can be executed at a lower computational cost than space reconstruction techniques, which frequently use expensive processing such as matching feature points between images. Therefore, the information processing apparatus 1000 capable of performing such processing can be reduced in size and weight.
- the user viewing information set by the user operation device is generated when the user operates various input mechanisms provided in the user operation device, such as a touch pad, a keyboard, and a mouse, and is transmitted to the image selection unit 1020.
- when the user operation device is the wearable terminal 200 as shown in FIGS. 5 and 6, the user viewing information may be generated by automatically detecting the user's behavior (for example, the user's viewing direction) with various sensors provided in the wearable terminal 200, such as a positioning sensor, an acceleration sensor, and a gyro sensor.
- such user viewing information may also be generated by voice input or gesture input from the user to the user operation device.
- since the information processing apparatus 1000 includes the image generation unit 1010 and the image selection unit 1020, an image of the space that the moving body (more specifically, the imaging apparatus) is viewing (a so-called first-person viewpoint image) is provided to the user in real time.
- intense screen shaking may occur in such a first-person viewpoint image when the moving body (more specifically, the imaging device) looks around the position where it exists. If the user visually recognizes such strong shaking of the screen, the user may feel "sickness" (motion sickness) from seeing an image with strong shaking. Therefore, it is preferable that the information processing apparatus 1000 according to the present embodiment further has a correction function for correcting such rotational movement of the imaging apparatus.
- the image correction unit 1030 is a processing unit that corrects the change in the image accompanying the rotational movement of the imaging device as described above based on the imaging device attitude information.
- the image correction unit 1030 performs correction to suppress changes in the surrounding captured image that accompany changes in the line-of-sight direction of the imaging device when the line-of-sight direction changes without a change in the position of the imaging device.
- more specifically, the image correction unit 1030 uses the moving body line-of-sight information to perform a correction that reversely rotates the surrounding captured image after the change in the line-of-sight direction of the imaging apparatus, in accordance with the magnitude of the rotation angle accompanying that change.
- this correction process will be described with reference to FIG.
- a global image A is generated at a certain point in time from imaging data from a moving body (human) wearing a wearable terminal 200 having a sensor and an all-around camera.
- a rotational movement occurs in the imaging apparatus and a change in the line-of-sight direction occurs, and accordingly, a global image B is generated.
- the image correction unit 1030 extracts the rotation component with reference to the sensor information output from the wearable terminal 200, and specifies the magnitude of the rotation angle associated with the change in the line-of-sight direction of the imaging device.
- the image correction unit 1030 performs a correction that reversely rotates the global image B according to the magnitude of the obtained rotation angle, thereby generating a global image C in which the rotation component has been canceled from the global image B.
- the global image C becomes an image viewed in substantially the same direction as the global image A as a result of canceling the rotation component.
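For intuition, in the special case of a pure yaw rotation this reverse-rotation correction reduces to a horizontal shift of the equirectangular (rectangular) surrounding captured image. A minimal Python sketch, assuming a yaw angle already extracted from the sensor information (the sign convention is an assumption, not part of the embodiment):

```python
import numpy as np

def cancel_yaw(equirect_img, yaw_rad):
    """Apply the inverse of a pure yaw rotation to an
    equirectangular image.

    On an equirectangular projection, a yaw (rotation about the
    vertical axis) is just a horizontal pixel shift, so the
    inverse rotation reduces to rolling the columns back.
    Only the yaw component is handled in this sketch; pitch and
    roll would need a full per-pixel remapping.
    """
    height, width = equirect_img.shape[:2]
    shift = int(round(yaw_rad / (2.0 * np.pi) * width))
    return np.roll(equirect_img, -shift, axis=1)
```

Applying `cancel_yaw` with the yaw angle measured between global images A and B would produce the yaw-canceled counterpart of global image C.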
- the image correction unit 1030 may perform the rotational motion correction process so that the local feature amounts coincide before and after the change in the line-of-sight direction accompanying the rotational motion of the imaging device.
- FIG. 7 illustrates the case where rotation correction is performed using the output from the sensor provided on the moving body, but the correction process for rotational motion may also be implemented by paying attention to the local feature amount in the global image A and the local feature amount in the global image B.
- that is, the image correction unit 1030 extracts a local feature amount (for example, the positions of feature points) in the global image A and a local feature amount in the global image B, and performs matching processing of the local feature amounts.
- the image correction unit 1030 may perform correction so that the local feature amounts match before and after the change in the line-of-sight direction of the imaging apparatus.
- in this case, the image correction unit 1030 extracts the rotation component that should be applied to make the two local feature amounts match, and performs a correction that reversely rotates the global image B according to the obtained rotation angle.
- the local feature amount focused on by the image correction unit 1030 is not particularly limited, and a known local feature amount can be used.
- as such a local feature amount, for example, the Scale-Invariant Feature Transform (SIFT) can be used.
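The rotation component that best aligns two sets of matched local features can be recovered, for example, with the Kabsch (SVD-based) method once the matched keypoints have been back-projected to unit view directions on the sphere. The sketch below assumes the SIFT extraction and matching steps have already been done and only illustrates the rotation-estimation step; it is one possible technique, not the embodiment's prescribed one:

```python
import numpy as np

def estimate_rotation(dirs_a, dirs_b):
    """Estimate the rotation R with dirs_b ~= dirs_a @ R.T from
    matched view directions of shape (N, 3), via the Kabsch / SVD
    method.

    dirs_a / dirs_b are unit vectors on the viewing sphere
    obtained from matched local features (e.g. SIFT keypoints
    back-projected through the equirectangular mapping); feature
    extraction and matching are not shown here.
    """
    h = dirs_a.T @ dirs_b
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r                                 # applying r to dirs_a aligns it with dirs_b
```

Reversely applying the estimated rotation to global image B then corresponds to the correction described above.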
- the image correction unit 1030 may use both the image correction process based on the output from the sensor mounted on the moving body and the image correction process based on the local feature amount. As a result, the image correction unit 1030 can cancel the rotation component more precisely.
- the image correction unit 1030 may control the degree of correction according to correction application information indicating the degree of application of the correction, obtained from the user operation device. As a result, the image correction unit 1030 can perform the correction of the rotation component described above to the extent that the rotation component is completely canceled, perform no correction of the rotation component at all, or perform correction to an extent that does not completely cancel the rotation component.
- the image correction unit 1030 can also perform image control that gradually follows the rotational movement of the imaging apparatus by performing correction that does not completely cancel the rotation component.
- the rotational motion that can occur in the imaging apparatus can be expressed using rotational coordinate axes that are defined independently of each other, such as a yaw axis, a pitch axis, and a roll axis. Therefore, for example, as illustrated in FIG. 9, the image correction unit 1030 may control the degree of execution of the rotation correction described above for each of these rotation coordinate axes independently.
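Per-axis control of the correction can be sketched by decomposing the device rotation into yaw, pitch, and roll angles and applying an independent gain to each axis before building the corrective rotation. The axis conventions and the gain parameterization below are illustrative assumptions:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def partial_correction(yaw, pitch, roll, gains=(1.0, 1.0, 1.0)):
    """Build the corrective rotation that cancels each axis of the
    imaging device's rotation by an independent fraction.

    gains = (g_yaw, g_pitch, g_roll); 1.0 fully cancels that
    axis, 0.0 leaves it uncorrected, and intermediate values let
    the displayed image follow the device's rotation gradually.
    Axis conventions (yaw about z, pitch about y, roll about x)
    are an assumption of this sketch.
    """
    g_yaw, g_pitch, g_roll = gains
    return rot_x(-g_roll * roll) @ rot_y(-g_pitch * pitch) @ rot_z(-g_yaw * yaw)
```

With all gains at 1.0 the device rotation rot_z(yaw) @ rot_y(pitch) @ rot_x(roll) is exactly inverted; intermediate gains give the gradual-follow behavior described above.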
- the moving body line-of-sight information generation unit 1040 generates line-of-sight information indicating the direction (position) or field of view of the imaging apparatus based on the imaging apparatus attitude information.
- this line-of-sight information can be generated by a known method using, for example, output information from various sensors attached to the moving body (that is, the imaging device attitude information).
- by providing such line-of-sight information to the user operation device together with the user viewing image, an object indicating the line-of-sight direction (position) or field of view of the imaging device can be displayed in the user viewing image provided to the user operation device.
- as a result, the user can grasp the line-of-sight direction of the imaging device at any time while visually recognizing the surrounding captured image in an arbitrary direction different from the line-of-sight direction (position) or the visual field of the imaging device.
- the display control unit 1050 controls the display contents of the information processing apparatus 1000 and a display device such as a display provided outside the information processing apparatus 1000.
- more specifically, the display control unit 1050 performs control to display, in the display region viewed by the user, a display image generated based on image information generated by imaging of the imaging device mounted on the moving body that moves in the space, imaging device posture information that is information regarding the posture of the imaging device, and user viewing information obtained from the user operation device operated by the user.
- as display control of the display screen of the user operation device, the display control unit 1050 can, for example, display an object that represents the line-of-sight direction and the field of view of the imaging device in the user viewing image, as illustrated in FIG. 10. Thereby, the user can grasp the line-of-sight direction of the moving body at any time while selecting a line-of-sight direction independently of the moving body.
- the data acquisition unit 1060 acquires line-of-sight related data, including captured image data output from the imaging device attached to the moving body and sensor output related to the line-of-sight direction of the imaging apparatus (that is, imaging apparatus attitude information), as well as data related to user operations output from the user operation device. The various data acquired from various devices by the data acquisition unit 1060 can be used as appropriate by each processing unit of the information processing apparatus 1000.
- the data providing unit 1070 provides various data generated by the information processing apparatus 1000 (for example, captured image data such as the surrounding captured image and the user viewing image, and line-of-sight related data such as the line-of-sight direction of the imaging apparatus) to devices provided outside the information processing apparatus 1000. Accordingly, the various information generated by the information processing apparatus 1000 can be used even by an apparatus provided outside the information processing apparatus 1000.
- the storage unit 1080 appropriately records various data used for the processing of the image generation unit 1010, the image selection unit 1020, the image correction unit 1030, the moving body line-of-sight information generation unit 1040, the display control unit 1050, the data acquisition unit 1060, and the data providing unit 1070, various programs including applications used for the various arithmetic processing executed by these processing units, various parameters that need to be saved while some processing is performed, the progress of processing, and the like.
- each of the processing units, such as the image generation unit 1010, the image selection unit 1020, the image correction unit 1030, the moving body line-of-sight information generation unit 1040, the display control unit 1050, the data acquisition unit 1060, and the data providing unit 1070, can freely access the storage unit 1080 to write and read data.
- each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
- all the functions of each component may be performed by a CPU or the like. Therefore, the configuration to be used can be changed as appropriate according to the technical level at the time of carrying out the present embodiment.
- a computer program for realizing each function of the information processing apparatus according to the present embodiment as described above can be produced and installed in a personal computer or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- captured image data is acquired from a camera attached to a moving body (step S101).
- the image generation unit 1010 of the information processing apparatus 1000 generates a surrounding captured image such as a global image or a rectangular image obtained by converting the global image into a rectangle based on the acquired captured image data (step S103).
- the image correction unit 1030 of the information processing apparatus 1000 performs the above correction processing on the generated surrounding captured image as necessary (step S105).
- the image selection unit 1020 of the information processing apparatus 1000 selects, from the surrounding captured image, the image corresponding to the user viewing information (that is, the user viewing image) in accordance with the user viewing information acquired from the user operation device (step S107).
- the display control unit 1050 of the information processing apparatus 1000 controls the display of the selected image on the display screen of the user operation device (step S109). Thereby, the user using the user operation device can share the image in the space where the moving body exists with the moving body.
- the processing can be performed in the same manner as described above even when a generated image produced based on the captured image and the imaging device attitude information is used instead of the captured image acquired from the imaging device.
- FIGS. 12 to 16 are explanatory diagrams for explaining the display control process according to the present embodiment, and FIG. 17 is a flowchart illustrating an example of the flow of the display control process according to the present embodiment.
- a rotation component can be extracted by paying attention to a change between frames of a rectangular image (that is, a surrounding captured image) based on a cylindrical projection or the like.
- for example, the rotational motion Q1,2 that occurred between the surrounding captured image F1 corresponding to frame 1 and the surrounding captured image F2 corresponding to frame 2 can be identified by a known estimation process, such as paying attention to the positions in the surrounding captured image F2 at which the feature points in the surrounding captured image F1 appear.
- by repeating such identification of the rotational motion up to the rotational motion QN-1,N that occurred between the surrounding captured image FN-1 corresponding to frame (N-1) and the surrounding captured image FN corresponding to frame N, and taking the product of the obtained rotational motions, the rotational motion Q1,N between frame 1 and frame N can be specified as shown in the following equation 101.
- Q1,N obtained as described above can be said to be information relating to the rotation (that is, rotation information) associated with the change in the line-of-sight direction of the imaging apparatus.
- this rotation information can be handled as information representing the locus of the rotational motion generated between frame 1 and frame N, and by visualizing Q1,N, the display control unit 1050 can provide the user with the locus of the rotational motion generated between frame 1 and frame N (in other words, posture information visualizing the change in the posture of the imaging device).
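The frame-to-frame composition described above can be sketched by representing each Q as a 3x3 rotation matrix and taking the product. Whether quaternions or matrices are used, and the multiplication order, are implementation choices not fixed by the text:

```python
import numpy as np

def accumulate_rotation(frame_rotations):
    """Compose the per-frame rotations Q_{n-1,n} (3x3 rotation
    matrices, ordered from frame 1 onward) into the overall
    rotation Q_{1,N} between frame 1 and frame N by taking their
    product, as in equation 101.
    """
    q_total = np.eye(3)
    for q_step in frame_rotations:
        q_total = q_step @ q_total   # later rotations applied after earlier ones
    return q_total
```

For example, two successive rotations of 0.1 rad about the same axis compose into a single 0.2 rad rotation, which is exactly the trajectory information the display control unit 1050 visualizes.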
- in the visualization of this rotation information, two coordinate systems are used: an absolute coordinate system (hereinafter also referred to as coordinate system A) and a relative coordinate system (hereinafter also referred to as coordinate system B).
- the display control unit 1050 uses coordinate system A of the two coordinate systems to appropriately arrange the surrounding captured image in the space, and superimposes images and various objects corresponding to the posture information on the surrounding captured image.
- the display control unit 1050 may also superimpose images or various objects corresponding to the posture information on a surrounding captured image that has been subjected to the correction for suppressing changes in the surrounding captured image due to changes in the line-of-sight direction of the imaging device.
- the user viewing image selected from the surrounding captured image by the image selection unit 1020 is a part of the surrounding captured image pasted on the surface of the whole sphere in the coordinate system A as schematically illustrated in FIG.
- the coordinate system B is defined separately from the coordinate system A at an arbitrary point located inside the sphere.
- the coordinate system that can be used in the present embodiment as shown in FIG. 12 is a rotating coordinate system that expresses an arbitrary position on the surface of the sphere using two rotation angles.
- when the user operation device requests addition of various annotations, such as text data, image data, and audio data, to a specific position of the surrounding captured image, the display control unit 1050 preferably associates the various annotations with the position in coordinate system A corresponding to the position designated by the user operation device. In addition, the display control unit 1050 can also display various objects, such as an image or an icon representing an annotation, at the portion of the surrounding captured image corresponding to the position designated from the user operation device (that is, the corresponding portion in coordinate system A).
- the display control unit 1050 sets or changes the visualization method to be adopted based on a user operation of at least one of the operator of the imaging apparatus and the operator of the user operation device.
- (A) The movement of the coordinate system A is fixed, and the movement of the coordinate system B is changed according to the posture information.
- the surrounding captured image displayed in the display area of the display device such as the user operation device is displayed so as to change as the posture of the imaging device changes.
- images and various objects corresponding to the posture information are displayed as if they were fixed in the display area even if the posture of the imaging device has changed.
- (B) The movement of the coordinate system A is changed according to the posture information, and the movement of the coordinate system B is fixed.
- in this case, the surrounding captured image displayed in the display area of the display device, such as the user operation device, does not change, or is displayed with less change accompanying the change in the posture of the imaging device.
- images and various objects corresponding to the posture information are displayed in the display area so as to change (virtually rotate, etc.) as the posture of the imaging apparatus changes.
- the image correction processing by the image correction unit 1030 described above corresponds to the processing for fixing the movement of the coordinate system B in the visualization method (B).
- as an object representing the posture information, the display control unit 1050 may, for example, superimpose an object representing the coordinate axes of coordinate system B (for example, coordinate axes defined by angles in the latitude and longitude directions) on the surrounding captured image as shown in FIG. 12, and rotate the coordinate axes according to the posture information.
- the display control unit 1050 may also superimpose, on the surrounding captured image, a locus of movement corresponding to the change in the posture information, for example, as illustrated in FIG.
- in this case, while the direction of coordinate system B is fixed, the surrounding captured image fixed to coordinate system A changes (that is, the surrounding captured image itself rotates) according to the posture information, so the user of the user operation device can easily grasp the change in the posture of the imaging device.
- in this case, the display control unit 1050 may superimpose an object representing the coordinate axes of coordinate system A (for example, coordinate axes defined by angles in the latitude and longitude directions), as illustrated in FIG. 12, on the surrounding captured image, and rotate the coordinate axes according to the rotation of the surrounding captured image.
- the display control unit 1050 may likewise superimpose, on the surrounding captured image, a locus of movement corresponding to the change in the posture information, for example, as illustrated in FIG.
- when visualizing the rotation of the two types of coordinate systems as described above, the display control unit 1050 can superimpose, on the generated display image (that is, the user viewing image), at least one of an object that rotates with the rotational motion accompanying the change in the line-of-sight direction of the imaging apparatus and an object that does not rotate. In other words, while the display control unit 1050 rotates the object representing the coordinate axes of the coordinate system in accordance with the rotational motion, it need not rotate an object that would be difficult to grasp if rotated along with the motion, such as the numerical values or characters given to the coordinate axes shown in FIG.
- in this case, the position of an object that would be difficult to grasp if rotated moves along with the rotational movement, but the posture of the object can be kept constant with respect to the viewpoint. As a result, the object can be made easier for the user to grasp.
- the display control unit 1050 preferably determines or changes the setting of which of the coordinate axes of coordinate system A and the coordinate axes of coordinate system B are displayed, and which coordinate axes are rotated, based on a user operation of at least one of the operator of the imaging apparatus and the operator of the user operation device.
- when visualizing a change in the posture of the imaging device using the rotation information, the display control unit 1050 preferably generates the display image as the space viewed virtually from a position different from the center of the coordinate system fixed in space (coordinate system A) (for example, the position O in FIG. 12, translated backward from the center C of coordinate system A along the line-of-sight direction of the imaging device).
- thereby, the user of the user operation device can visually recognize the image as if the display image were generated from a fixed camera virtually installed at a position different from the position of the imaging device. As a result, the user of the user operation device can more easily grasp the change in the posture of the imaging device.
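The reference position O described above can be sketched as a simple translation backward from the center C along the imaging device's line-of-sight direction; the offset magnitude is a tunable assumption of this sketch:

```python
import numpy as np

def visualization_origin(center_c, gaze_dir, offset):
    """Compute the reference position O for visualization: the
    point translated backward from the center C of coordinate
    system A along the imaging device's line-of-sight direction.

    With the viewpoint pulled back by `offset`, the display image
    looks as if it were generated by a fixed camera placed behind
    the imaging device, which makes its posture changes easier to
    grasp. The offset magnitude is an assumed, tunable parameter.
    """
    gaze = np.asarray(gaze_dir, dtype=float)
    gaze = gaze / np.linalg.norm(gaze)
    return np.asarray(center_c, dtype=float) - offset * gaze
```

Rendering the surrounding captured image from O instead of C then produces the virtual fixed-camera view described above.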
- the display control unit 1050 can also change the reference position for visualization (the position O in FIG. 12) based on a user operation of at least one of the operator of the imaging apparatus and the operator of the user operation device.
- in order to convey the change in the posture information to the user of the user operation device more effectively, the display control unit 1050 may control, in accordance with the posture information, at least one of the playback speed and the display angle of view when displaying the display image in the display region viewed by the user.
- for example, the display control unit 1050 can convey the change in posture information more effectively to the user of the user operation device by performing display control such as slowing down the playback speed when the amount of rotation based on the posture information is large.
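Such rotation-dependent playback control might look like the following sketch, where the linear slowdown and the clamping values are illustrative assumptions rather than values given by the embodiment:

```python
def playback_speed(rotation_amount, base_speed=1.0, slowdown=0.5, min_speed=0.25):
    """Map the rotation amount derived from the posture
    information (e.g. radians per frame) to a playback speed
    factor.

    The larger the rotation, the slower the playback, so sudden
    posture changes are shown to the user more gently. The linear
    slowdown coefficient and the lower clamp are illustrative
    assumptions.
    """
    speed = base_speed - slowdown * abs(rotation_amount)
    return max(min_speed, speed)
```

A smooth (e.g. exponential) mapping or hysteresis could be substituted without changing the idea: playback slows while the posture information indicates rapid rotation.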
- the display control unit 1050 may also generate a display image of the space as viewed virtually from an arbitrary position designated from the user operation device, with the designated position as the center, and provide the generated image to the user operation device.
- FIG. 14 is an example of a display image when the visualization method (A) is adopted.
- in this example, it can be seen that the surrounding captured image changes from moment to moment as the posture of the imaging device changes, and that the orientation of the object representing the coordinate axes of coordinate system A changes accordingly.
- in addition, position information of the wearable terminal 200 is superimposed on the user viewing image.
- FIG. 15 is an example of a display image when the visualization method (B) is adopted.
- in this example, it can be seen that the surrounding captured image does not change even when the posture of the imaging device changes, while the superimposed coordinate axes of coordinate system B change from moment to moment.
- the display control processing according to the present embodiment can be applied not only to the global image as illustrated in FIGS. 14 and 15, but also to a rectangular image by the equirectangular projection equivalent to the global image, as illustrated in FIG. 16.
- captured image data is acquired from a camera attached to a moving body (step S151).
- the image generation unit 1010 of the information processing apparatus 1000 generates a surrounding captured image such as a global image or a rectangular image obtained by converting the global image into a rectangle based on the acquired captured image data (step S153).
- the image correction unit 1030 of the information processing apparatus 1000 performs the rotation analysis process as described above using each generated surrounding captured image (step S155).
- the display control unit 1050 of the information processing apparatus 1000 arranges the generated surrounding captured image and various graphic objects on the celestial sphere based on the coordinate system A (step S157).
- the image selection unit 1020 of the information processing apparatus 1000 then selects, from the surrounding captured image on which the various graphic objects are superimposed, the image corresponding to the user viewing information in accordance with the user viewing information acquired from the user operation device (that is, generates the user viewing image) (step S159).
- thereafter, the display control unit 1050 of the information processing apparatus 1000 controls the display of the image selected by the image selection unit 1020 on the display screen of the user operation device, whereby the user using the user operation device can share the image in the space where the moving body exists with the moving body.
- with such processing, the video surrounding the moving body can be observed in real time as the surrounding captured image, and the user can obtain a sense of presence as if he or she were in the place where the moving body exists. Further, by performing the correction process described above, image shake due to the rotational movement of the moving body is suppressed, so the user can avoid motion sickness (video sickness) caused by sudden image changes.
- such a configuration includes a unidirectional case (information flows only from the moving body to the user) and a bidirectional case (information is also transmitted from the user to the moving body by voice or other means).
- in the former case, for example, when a sports player is the moving body and a person watching the play is the user, sports broadcasting with a sense of presence can be realized.
- in this case, the number of users is not limited to one; tens of thousands of users, similar to broadcast audiences, are also conceivable.
- the latter assumes an application in which the user gives some guidance or instruction to the moving body while sharing the field of view of the moving body, for example, an application in which the user gives instructions while the moving body (human) is cooking.
- in this case as well, the number of users is not limited to one, but a relatively small number is realistic.
- the information processing apparatus 1100 performs various types of information processing on the captured image captured by the imaging apparatus attached to the moving body in motion, and is a device that enables the user to view the video of the entire surroundings of the moving body more naturally.
- more specifically, the information processing apparatus 1100 provides the user, in real time, with an image of the space viewed by the moving body (more specifically, the imaging apparatus), that is, a so-called first-person viewpoint image.
- here, the first-person viewpoint image includes shaking caused by the movement of the moving body performing the imaging. Therefore, when a user visually recognizes such a first-person viewpoint image, the user may feel "sickness" (motion sickness) due to inconsistency between the movement included in the first-person viewpoint image and the movement of his or her own body.
- in order to reduce the user's motion sickness, the shaking of the first-person viewpoint image caused by the movement of the moving body is extracted as the rotation component of the image, and a process of correcting the image by reversely rotating that rotation component is performed.
- however, when such correction is performed, the rotation of the imaging device when the moving body changes its moving direction is also corrected. For this reason, when a user views the first-person viewpoint image of a moving body that moves while changing its moving direction, the first-person viewpoint image always faces a fixed direction, which may give the user an unnatural impression.
- therefore, the information processing apparatus 1100 provides a more natural first-person viewpoint image to the user by controlling the display of the first-person viewpoint image based on the moving direction of the moving body.
- the captured image handled by the information processing apparatus 1100 according to the present embodiment is preferably an image of the entire surroundings of the moving body (for example, an all-around captured image) captured by, for example, an omnidirectional camera mounted on the wearable terminal 200 as illustrated in FIG.
- each processing unit shown in FIG. 18 may be realized by any one of the server 100 and the client devices 200 to 700 in FIG. 1, or may be realized by being distributed among a plurality of devices.
- the information processing apparatus 1100 includes an image generation unit 1110, an image selection unit 1120, an image correction unit 1130, a display control unit 1150, a data acquisition unit 1160, a data providing unit 1170, a storage unit 1180, and a direction control unit 1190.
- the image generation unit 1110 is substantially the same as the image generation unit 1010, the image selection unit 1120 is substantially the same as the image selection unit 1020, the display control unit 1150 is substantially the same as the display control unit 1050, the data acquisition unit 1160 is substantially the same as the data acquisition unit 1060, the data providing unit 1170 is substantially the same as the data providing unit 1070, and the storage unit 1180 is substantially the same as the storage unit 1080, so detailed description thereof is omitted here.
- Hereinafter, the image correction unit 1130 and the direction control unit 1190, which are characteristic of the present embodiment, will be described.
- The image correction unit 1130 is a processing unit that corrects a change in the image caused by the rotation of the imaging device by applying the reverse of the rotation of the moving body (more specifically, the imaging device) to the surrounding captured image.
- the image correction unit 1130 may detect the rotation of the imaging apparatus using various sensors such as an acceleration sensor and an inclination sensor, and correct the change in the image using the detected rotation.
- the image correction unit 1130 may estimate the rotation of the imaging device from the surrounding captured image captured by the imaging device, and correct the change in the image using the estimated rotation of the imaging device.
- It is more preferable that the image correction unit 1130 estimates the rotation of the imaging device from the surrounding captured image captured by the imaging device and corrects the change in the image due to the estimated rotation, because in that case it is easy to synchronize the rotation of the imaging device with the rotation correction of the surrounding captured image, and higher-speed rotation can be handled than when various sensors are used.
- Hereinafter, the case where the image correction unit 1130 estimates the rotation of the imaging device from the surrounding captured image and corrects the change in the image based on the estimated rotation will be described.
- Assume that a global image A is generated at time t from imaging data from a moving body (a human) wearing a wearable terminal 200 having an all-around camera.
- At time t+1, it is assumed that a rotational motion has occurred in the moving body (more specifically, the imaging device) and that a global image B has been generated.
- The plurality of image feature points are extracted as uniformly as possible over the entire global image. Note that image feature points are preferably not extracted from the high-latitude portions, because image distortion tends to increase at high latitudes of the global image.
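- The uniform extraction that avoids high-latitude regions can be sketched as follows for an equirectangular (global) image; the grid step and the 60° latitude limit are illustrative values not specified in the text:

```python
import numpy as np

def sample_feature_grid(height, width, lat_limit_deg=60.0, step=40):
    """Return (row, col) candidate feature locations spread uniformly over an
    equirectangular (global) image, skipping the high-latitude bands where
    distortion is large.  `lat_limit_deg` and `step` are illustrative values."""
    points = []
    for row in range(0, height, step):
        # Row 0 is +90 deg latitude, the bottom row is -90 deg.
        lat = 90.0 - 180.0 * row / (height - 1)
        if abs(lat) > lat_limit_deg:
            continue  # skip heavily distorted polar regions
        for col in range(0, width, step):
            points.append((row, col))
    return points
```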
- The local feature amounts U extracted from the global image A and the global image B are compared, and the change amount F of the local feature amounts when the global image A changes into the global image B is calculated.
- Among the local feature amount changes F, those equal to or greater than a predetermined threshold may be excluded as outliers.
- The local feature amount U(t+1) in the global image B at time t+1 can be expressed by the following Expression 201, using the local feature amount U(t) in the global image A at the preceding time t and the local feature amount change F(t+1) between the global image A and the global image B:
- U(t+1) = U(t) + F(t+1)  (Expression 201)
- The image correction unit 1130 pastes the global image A and the global image B, which are in equirectangular projection, onto the celestial sphere, and converts the calculated local feature amounts (U(t), U(t+1)) into three-dimensional feature amounts P(t) and P(t+1).
- The image correction unit 1130 then estimates a transformation Mat(t+1) from the three-dimensional feature amount P(t) to P(t+1) by three-dimensional affine estimation (Expression 202): P(t+1) = Mat(t+1) · P(t).
- In this way, the image correction unit 1130 can estimate the transformation Mat(t+1) from P(t) to P(t+1). Furthermore, the image correction unit 1130 can calculate the rotation Q(t+1) from the global image A to the global image B based on the estimated transformation Mat(t+1). Note that the image correction unit 1130 may determine whether or not the rotation estimation has succeeded by calculating an estimation error for the calculated rotation Q(t+1), and may perform the rotation estimation again when it determines that the estimation has failed.
- Thereafter, the image correction unit 1130 can generate a global image C with the rotation corrected by applying the inverse of the estimated rotation Q(t+1) to the global image B. Further, the image correction unit 1130 may integrate the rotation from the global image at a predetermined time and correct the surrounding captured image based on the integrated rotation.
- The local feature amount used by the image correction unit 1130 is not particularly limited, and a known local feature amount such as SIFT (Scale-Invariant Feature Transform) can be used.
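- As a hedged illustration, the rotation estimation between the three-dimensional feature amounts P(t) and P(t+1) and the subsequent reverse-rotation correction can be sketched with the Kabsch algorithm, a standard least-squares rotation fit; the function names, and the restriction to a pure rotation rather than the full affine transformation Mat(t+1), are simplifications of this sketch:

```python
import numpy as np

def estimate_rotation(P_t, P_t1):
    """Estimate the rotation Q(t+1) mapping the 3-D feature points P(t)
    onto P(t+1) (Kabsch algorithm).  Both inputs are (N, 3) arrays of
    corresponding unit vectors on the celestial sphere; per point (column
    convention) p' = Q @ p."""
    H = P_t.T @ P_t1                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])            # guard against reflections
    return Vt.T @ D @ U.T                 # rotation matrix Q(t+1)

def correct_rotation(points, Q):
    """Apply the inverse rotation to undo the estimated camera rotation,
    recovering the stabilised point directions (rows rotate as p @ Q)."""
    return points @ Q
```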
- The direction control unit 1190 is a processing unit that controls the display of the surrounding captured image based on the moving direction of the moving body. Specifically, the direction control unit 1190 controls the display image shown in the display area viewed by the user so that the reference direction of that display image matches the moving direction of the moving body. More specifically, when the angle difference between the moving direction and the reference direction of the moving body is within a threshold (for example, 15° on each side, 30° in total), the direction control unit 1190 controls the display angle of view of the image shown in the display area so that the reference direction of the display image matches the reference direction or moving direction of the moving body.
- the reference direction of the moving object is, for example, the front direction of the moving object (human, self-propelled object, flying object, etc.).
- The reference direction of the display image viewed by the user is, for example, the angle-of-view direction of the display image displayed in the display area in front of the user when the user faces the front direction.
- That is, the direction control unit 1190 can control the display angle of view so that the front direction of the moving body matches the angle of view of the display image displayed in the display area in front of the user when the user faces the front direction. The direction control unit 1190 can therefore eliminate the unnaturalness of the display image that occurs when the moving body changes its moving direction.
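- The threshold test on the angle difference (for example, 15° on each side, 30° in total) might be sketched as follows; the wrap-around handling at ±180° is an illustrative detail:

```python
def within_threshold(moving_dir_deg, reference_dir_deg, threshold_deg=15.0):
    """Return True when the angle difference between the moving direction of
    the moving body and its reference (front) direction is within the
    threshold (for example, 15 deg on each side, 30 deg in total)."""
    # Normalise the difference into (-180, 180] so wrap-around is handled.
    diff = (moving_dir_deg - reference_dir_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= threshold_deg
```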
- the direction control unit 1190 may calculate the moving direction of the moving object from the position information of the moving object acquired by position measurement using, for example, GPS (Global Positioning System) or Wi-Fi (registered trademark). In addition, the direction control unit 1190 may calculate the moving direction of the moving body from information detected by various sensors such as a geomagnetic sensor and an acceleration sensor, for example. Furthermore, the direction control unit 1190 may calculate the moving direction of the moving body from the surrounding captured image captured by the imaging device.
- In the present embodiment, the direction control unit 1190 calculates the moving direction of the moving body from the captured surrounding captured image. With this configuration, the direction control unit 1190 can calculate the moving direction more efficiently when the rotation is calculated by image processing in the image correction unit 1130.
- Specifically, the direction control unit 1190 calculates the difference between the three-dimensional feature amount P(t+1) of the global image B and the three-dimensional feature amount P(t) of the global image A rotated by the estimated rotation Q(t+1), that is, P(t)·Q(t+1).
- The translation component T(t+1) between the global image B and the global image A can be estimated as the difference between P(t+1) and P(t)·Q(t+1) (Expression 203): T(t+1) = P(t+1) - P(t)·Q(t+1).
- Since the moving direction of the moving body can be estimated as the direction opposite to the parallel-translation direction of the surrounding captured image, the direction control unit 1190 can estimate that the direction opposite to the estimated translation component T(t+1) is the moving direction of the moving body.
- Note that the direction control unit 1190 preferably performs time-averaging on the calculated T(t+1) and estimates the moving direction of the moving body from the time-averaged translation component T. This is because, when the moving body changes its moving direction frequently, frequently rotating the display angle of view of the display image viewed by the user could cause the user motion sickness. Needless to say, even when the moving direction of the moving body is calculated by another method, the direction control unit 1190 preferably uses a moving direction averaged over a predetermined time.
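- Expression 203 and the time-averaging of the translation component can be sketched as follows; representing the feature sets as (N, 3) arrays and averaging the per-point residuals are assumptions of this sketch:

```python
import numpy as np

def translation_component(P_t, P_t1, Q_t1):
    """Expression 203: T(t+1) = P(t+1) - P(t)*Q(t+1).  Q_t1 is the 3x3
    rotation estimated between frames (column convention: p' = Q @ p), so
    row-stacked points rotate forward as P_t @ Q_t1.T."""
    residual = P_t1 - P_t @ Q_t1.T        # per-point translation residuals
    return residual.mean(axis=0)          # average over all feature points

def moving_direction(T_history):
    """Time-average the translation components T and take the opposite
    direction as the moving direction of the moving body."""
    T_mean = np.mean(T_history, axis=0)
    return -T_mean / np.linalg.norm(T_mean)
```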
- The direction control unit 1190, having calculated the moving direction of the moving body by the above method, changes the display angle of view of the display image shown in the display area viewed by the user so that the moving direction of the moving body matches the reference direction of that display image.
- More specifically, when the angle difference between the moving direction of the moving body and the front direction of the moving body is within a threshold (for example, 15° on each side, 30° in total), the direction control unit 1190 matches the front direction of the moving body with the angle of view of the display image displayed in front of the user when the user faces the front direction. This makes it possible to eliminate the unnaturalness of the display image that occurs when the moving body changes its moving direction.
- A method in which the direction control unit 1190 rotates the angle of view of the image shown in the display area viewed by the user, in order to match the moving direction of the moving body with the reference direction of the display image, will now be described more specifically.
- Here, the moving direction 211 of the moving body 210 is indicated by an arrow, and the angle-of-view direction of the display image 231 viewed by the user is indicated by an angle display. "0°" in the display image 231 corresponds to the angle of view of the display image displayed in front of the user when the user faces the front direction (that is, the reference direction of the display image).
- The direction control unit 1190 rotates the display angle of view of the surrounding captured image so that the moving direction 211 of the moving body matches "0°", and thereby changes the display image 231 that the user views in front.
- Thereby, the direction control unit 1190 can suppress the generation of an unnatural image in which the first-person viewpoint image viewed by the user always faces one direction even though the moving body is changing its moving direction.
- Here, the direction control unit 1190 rotates the display angle of view of the surrounding captured image at a predetermined speed in order to make the reference direction of the display image viewed by the user coincide with the moving direction of the moving body.
- the speed at which the direction control unit 1190 rotates the display angle of view of the surrounding captured image will be described with reference to FIG.
- FIG. 21 is a graph showing the time change of the moving direction of the moving body 210 and of the reference direction of the display image 231 viewed by the user; the change in both directions is shown as an angle change.
- As shown in FIG. 21, the direction control unit 1190 rotates the angle of view of the display image 231 at a predetermined speed so that the reference direction of the display image 231 viewed by the user coincides with the moving direction of the moving body 210. Rather than rotating it abruptly, the direction control unit 1190 preferably rotates the angle of view of the display image 231 gently at the predetermined speed.
- The predetermined speed at which the angle of view of the display image 231 is rotated may be controlled by the user's selection, or may be controlled according to the moving speed or the rotational speed of the moving body 210.
- For example, as the moving speed or the rotational speed of the moving body 210 increases, the direction control unit 1190 may rotate the angle of view of the display image 231 faster.
- In such a case, the correspondence relationship between the moving speed or rotational speed of the moving body 210 and the rotational speed of the angle of view of the display image 231 is preferably stored in advance in the storage unit 1180 in the form of a correspondence table or a function.
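- A gentle, speed-dependent rotation of the display angle of view might look like the following sketch; the speed table stands in for the correspondence table held in the storage unit 1180, and all names and values are illustrative:

```python
def step_display_angle(current_deg, target_deg, base_speed_dps, body_speed,
                       dt, speed_table=((0.0, 1.0), (1.0, 2.0), (5.0, 4.0))):
    """Rotate the display angle of view toward the target (moving direction)
    at a bounded rate.  `speed_table` maps the moving body's speed to a rate
    multiplier, standing in for the stored correspondence table."""
    # Pick the largest multiplier whose speed threshold the body exceeds.
    factor = 1.0
    for threshold, multiplier in speed_table:
        if body_speed >= threshold:
            factor = multiplier
    max_step = base_speed_dps * factor * dt
    diff = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    step = max(-max_step, min(max_step, diff))  # clamp to the allowed step
    return (current_deg + step) % 360.0
```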
- the function of the direction control unit 1190 may be controlled not to be executed by an input from the user.
- Examples of such input from the user include an input operation in which the user holds the user operation device, which is a wearable terminal, with both hands.
- The function of the direction control unit 1190 may also be controlled not to be executed in response to the user's movement; specifically, when the detected amount of change in the user's line-of-sight direction exceeds a threshold, the function of the direction control unit 1190 may be controlled not to be executed.
- In the present embodiment, the line-of-sight direction of the moving body (for example, a human) may be displayed on the display image provided to the user (for example, the image viewed by the user), and the locus of the line-of-sight direction of the moving body may also be displayed.
- each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
- Further, a CPU or the like may perform all the functions of each component.
- the hardware configuration of the present embodiment can be changed as appropriate according to the technical level at the time of implementing the present embodiment.
- a computer program for realizing each function of the information processing apparatus according to the present embodiment as described above can be produced and installed in a personal computer or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- The image generation unit 1110, the image selection unit 1120, the image correction unit 1130, the direction control unit 1190, the data acquisition unit 1160, the data providing unit 1170, and the storage unit 1180 illustrated in FIG. 18 may be mounted on another device, such as a computer capable of communicating with the information processing apparatus 1100, and the functions described above may be realized by the cooperation of the information processing apparatus 1100 and that other device.
- captured image data is acquired from an imaging device (camera) mounted on a moving body (step S171).
- the image generation unit 1110 of the information processing apparatus 1100 generates a surrounding captured image such as a global image or a rectangular image obtained by converting the global image into a rectangle based on the acquired captured image data (step S173).
- Next, the image correction unit 1130 of the information processing apparatus 1100 extracts a rotation component from the generated surrounding captured image, and performs image processing for correcting the rotation of the moving body (more specifically, the imaging device) on the surrounding captured image (step S175).
- the direction control unit 1190 of the information processing apparatus 1100 acquires the moving direction data of the moving body (step S177).
- Next, the direction control unit 1190 of the information processing apparatus 1100 determines whether or not the angle difference between the moving direction of the moving body and the reference direction of the moving body (for example, the front direction of the moving body) is within a threshold (step S179). The threshold may be, for example, 30° (that is, 15° on each side).
- When the angle difference exceeds the threshold (step S179/No), the image selection unit 1120 of the information processing apparatus 1100 selects the image to be viewed by the user from the surrounding captured image, and the display control unit 1150 of the information processing apparatus 1100 controls the display of the selected image on the display screen of the user operation device (step S181).
- On the other hand, when the angle difference between the moving direction of the moving body and the reference direction of the moving body is within the threshold (step S179/Yes), the image selection unit 1120 and the display control unit 1150 of the information processing apparatus 1100 control the display on the display screen in the same manner as in step S181 (step S183).
- Furthermore, the direction control unit 1190 of the information processing apparatus 1100 performs display control to rotate the surrounding captured image displayed on the display screen at a predetermined speed so that the moving direction or the reference direction of the moving body matches the reference direction of the user (step S185).
- Through the processing described above, the information processing apparatus 1100 can correct the shaking of the surrounding captured image due to the rotation of the imaging apparatus, and can provide the user with a surrounding captured image that does not appear unnatural even when the moving body changes its moving direction.
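- The flow of steps S171 to S185 can be summarized in the following sketch; the stage callables bundled in `ops` are hypothetical stand-ins for the corresponding processing units of the information processing apparatus 1100:

```python
def display_pipeline(captured, ops, threshold_deg=15.0):
    """Sketch of the flow S171-S185.  `ops` bundles the stage functions
    (generate, correct, moving_direction, reference_direction, select,
    rotate) as callables; all names are illustrative."""
    surround = ops["generate"](captured)             # S173: surrounding image
    surround = ops["correct"](surround)              # S175: undo camera rotation
    move_dir = ops["moving_direction"](surround)     # S177: moving-direction data
    ref_dir = ops["reference_direction"]()           # e.g. front of moving body
    diff = (move_dir - ref_dir + 180.0) % 360.0 - 180.0
    view = ops["select"](surround)                   # S181/S183: user's view
    if abs(diff) <= threshold_deg:                   # S179: within threshold?
        view = ops["rotate"](view, move_dir)         # S185: gentle rotation
    return view
```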
- In a modification of the present embodiment, the image correction unit 1130 determines whether or not the movement of the moving body and the movement of the user are synchronized. When it is determined that the movement of the moving body and the movement of the user are synchronized, the functions of the image correction unit 1130 and the direction control unit 1190 are controlled not to be executed.
- A movement in which the movement of such a moving body (for example, a human) and the movement of the user are synchronized includes, for example, the movement of both chasing a tennis ball.
- FIG. 23 is a graph showing the change in the rotation angle of the head of the moving body and the change in the rotation angle of the user's head over time.
- Here, that the movement of the moving body and the movement of the user are "synchronized" indicates, for example, that the rotation directions of the heads of the moving body and the user are the same.
- The waveform of the head rotation of the moving body 210 and the waveform of the head rotation of the user 230 differ in amplitude and period, but the timings of the upward-convex and downward-convex portions are substantially the same. In such a case, it can be said that the movement of the moving body and the movement of the user are "synchronized".
- Further, the movement of the moving body and the movement of the user may be regarded as "synchronized" when the rotation directions of the heads of the moving body and the user are the same and the rotation amount of each head is equal to or greater than a predetermined magnitude (for example, 40°, that is, 20° on each side). This is because the rotation directions of the heads of the moving body and the user may coincide merely by chance due to involuntary movement or the like.
- Further, when the tilt direction of the body of the moving body (for example, a human) and the tilt direction of the user's body match, the movement of the moving body and the movement of the user may be regarded as "synchronized".
- The detection of the position of the center of gravity or of the inclination of the body of the moving body or the user can be realized by using a known sensor such as an acceleration sensor or a motion sensor. Likewise, when the positions of the centers of gravity of the moving body and the user match, the movement of the moving body and the movement of the user may be regarded as "synchronized".
- Further, when the gaze point of the moving body and the gaze point of the user coincide, the movement of the moving body and the movement of the user may be regarded as "synchronized". This is because the consciousness or recognition of the moving body (for example, a human) and the user can be considered to be substantially the same (synchronized).
- The detection of these gaze directions or gaze points can be executed by, for example, the line-of-sight detection function of wearable terminals worn by the moving body (for example, a human) and the user.
- the line-of-sight detection function can be realized by the method described in the first embodiment or a known method.
- As described above, when the movement of the moving body (for example, a human) and the movement of the user are synchronized, the image correction unit 1130 does not perform rotation correction on the surrounding captured image. When a movement that is not synchronized between the moving body and the user is detected, the image correction unit 1130 executes rotation correction on the surrounding captured image.
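- The synchronization test on head rotations (same direction, rotation amount of at least the predetermined magnitude of 40° in total, i.e. 20° on each side) can be sketched as follows; representing rotations as signed angles is an assumption of this sketch:

```python
def is_synchronized(body_rotation_deg, user_rotation_deg, min_amount_deg=20.0):
    """Regard the moving body's and the user's head movements as synchronized
    when both rotate in the same direction and each rotation amount is at
    least the predetermined magnitude (for example, 20 deg on each side)."""
    same_direction = body_rotation_deg * user_rotation_deg > 0
    large_enough = (abs(body_rotation_deg) >= min_amount_deg
                    and abs(user_rotation_deg) >= min_amount_deg)
    return same_direction and large_enough

def should_correct_rotation(body_rotation_deg, user_rotation_deg):
    """The image correction unit skips rotation correction while the
    movements are synchronized, and applies it otherwise."""
    return not is_synchronized(body_rotation_deg, user_rotation_deg)
```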
- the flow of the information processing method executed by such a modification of the present embodiment will be described with reference to FIG.
- Note that, in the following description, it is assumed that the movement experienced by the moving body and the user is a movement in which the movement or consciousness of the moving body and the user can be synchronized.
- captured image data is acquired from an imaging device (camera) attached to a moving body (step S191).
- the image generation unit 1110 of the information processing apparatus 1100 generates a surrounding captured image such as a global image or a rectangular image obtained by converting the global image into a rectangle based on the acquired captured image data (step S193).
- Next, the image correction unit 1130 of the information processing apparatus 1100 determines whether or not the rotation of the moving body (more specifically, the imaging device) and the rotation of the terminal worn by the user are synchronized; more specifically, it determines whether or not the rotation direction of the imaging device is the same as the rotation direction of the terminal worn by the user (step S195).
- When the two rotations are not synchronized (step S195/No), the image correction unit 1130 of the information processing apparatus 1100 extracts the rotation component from the surrounding captured image and performs image processing for correcting the rotation of the imaging device on the surrounding captured image (step S197).
- the direction control unit 1190 may further execute control of the display image based on the moving direction of the moving body.
- On the other hand, when the rotation of the moving body and the rotation of the terminal worn by the user are synchronized (step S195/Yes), the image correction unit 1130 of the information processing apparatus 1100 does not perform the process of correcting the rotation of the imaging device on the surrounding captured image.
- Thereafter, the image selection unit 1120 of the information processing apparatus 1100 selects the image to be viewed by the user from the surrounding captured image, and the display control unit 1150 of the information processing apparatus 1100 controls the display of the selected image on the display screen of the user operation device (step S199).
- Through the processing described above, the information processing apparatus 1100 can provide the user with a surrounding captured image having a higher sense of realism when the user observes the surrounding captured image while experiencing movement synchronized with the moving body.
- As described above, in the information processing apparatus and the information processing method according to the present embodiment, the video surrounding the moving body can be observed in real time as the surrounding captured image, and the user can obtain a sense of presence as if he or she were in the place where the moving body exists. Further, by performing the correction process described above, it is possible to suppress the shaking of the image due to the rotational movement of the imaging apparatus while reflecting the change in the moving direction of the moving body in the display image. Therefore, the occurrence of motion sickness can be suppressed, and the user can view a more natural first-person captured image.
- Further, according to the information processing apparatus and the information processing method according to the modification of the present embodiment, when the user observes the surrounding captured image while experiencing movement synchronized with the moving body, a surrounding captured image with a higher sense of presence can be provided to the user.
- In a known technique, the viewpoint position can be moved by imaging a space with cameras arranged in a lattice shape.
- In this ray space method, after a space is imaged with a camera array arranged on a lattice, each pixel constituting the captured images is projected onto a space having two projection planes as shown in FIG. 25.
- Light passing through the rectangular space formed by the projection plane A and the projection plane B (for example, the light ray 1 in FIG. 25) can be expressed as L(u, v, s, t), using the coordinates (u, v) indicating the position on the projection plane A and the coordinates (s, t) indicating the position on the projection plane B.
- A point a in FIG. 25 represents the viewpoint, and a point b represents the observation point viewed from the viewpoint a.
- The light ray traveling from the viewpoint a to the observation point b (light ray 2) can be expressed by L as described above. Therefore, the field of view from the viewpoint a can be reproduced by repeating the same processing for each of the points constituting an arbitrary projection plane located between the projection plane A and the projection plane B.
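- The parameterization of a ray by its intersections with the two projection planes can be sketched as follows, assuming plane A at z = 0 and plane B at z = 1 (the plane placement is an illustrative choice):

```python
def ray_coordinates(a, b, z_a=0.0, z_b=1.0):
    """Express the ray through viewpoint `a` and observation point `b` as
    L(u, v, s, t): (u, v) is its intersection with projection plane A
    (z = z_a) and (s, t) its intersection with projection plane B (z = z_b).
    Returns None when the ray is parallel to the planes and cannot be
    represented, which is exactly the limitation noted for viewpoint c."""
    (ax, ay, az), (bx, by, bz) = a, b
    dz = bz - az
    if abs(dz) < 1e-12:
        return None  # ray never crosses the planes (light ray 3 case)
    ta = (z_a - az) / dz
    tb = (z_b - az) / dz
    u, v = ax + ta * (bx - ax), ay + ta * (by - ay)
    s, t = ax + tb * (bx - ax), ay + tb * (by - ay)
    return (u, v, s, t)
```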
- However, in this method, the degree of freedom of the viewpoint position is still limited. For example, the line of sight from a viewpoint c (light ray 3) does not pass through the projection plane A and the projection plane B and therefore cannot be expressed as a point in the ray space L. Accordingly, view reconstruction based on the ray space as shown in FIG. 25 is subject to position restrictions.
- FIG. 26 is a diagram illustrating a schematic configuration of a system according to the third embodiment of the present disclosure.
- the system 20 in the present embodiment includes an imaging device 800 and an information processing apparatus 2000.
- the restriction in the ray space method as shown in FIG. 25 is due to the fact that the array camera, which is an imaging device, is arranged on a two-dimensional lattice. Therefore, the present inventor considered that it is possible to eliminate the restriction of the viewpoint by arranging the cameras in a three-dimensional lattice pattern. However, in this case, there arise problems that (1) one camera hides the field of view of another camera, and (2) the number of required cameras becomes enormous.
- Therefore, the present inventor has conceived that, by using a self-propelled imaging device 800 that can freely move in the space, the above-described problems can be avoided and the space can be imaged in a three-dimensional lattice pattern.
- The imaging device 800 will be described in detail later.
- A plurality of pieces of imaging data captured by the self-propelled imaging device 800 are provided to the information processing apparatus 2000, and the information processing apparatus 2000 performs image reconstruction processing based on the plurality of pieces of imaging data.
- the information processing apparatus 2000 generates a three-dimensional imaging space using a plurality of imaging data captured by the imaging device 800 when performing image reconstruction processing.
- the information processing apparatus 2000 performs image reconstruction processing by using the generated imaging space or the light space further generated based on the generated imaging space.
- The information processing apparatus 2000 may be configured by, for example, at least one of the server 100 and the client devices 200 to 700 included in the system 10 according to the first embodiment described with reference to FIG. 1. In this case, the information processing apparatus 2000 described in detail below is realized as the entire system 20 by the server 100 and the client devices 200 to 700, operating independently or in cooperation with each other.
- the information processing apparatus 2000 may be configured by a single various computer. Since the hardware configuration of the information processing apparatus 2000 is the same as that of FIG. 2, a detailed description thereof will be omitted below. The configuration of the information processing apparatus 2000 will be described in detail later.
- The imaging device 800 may be configured as a single imaging device, such as a robot that can freely move in the space, but it is preferable to use a self-propelled imaging device 800 having an array camera as shown in FIG. 27A or FIG. 27B.
- In the imaging device 800 illustrated in FIG. 27A, cameras 801 are arranged in a one-dimensional direction.
- at least one position recognition marker 803 for specifying the imaging position is provided on the upper portion of the camera 801.
- the camera 801 and the position recognition marker 803 are supported by a movable base 805 having wheels.
- Control of the camera 801 and the mount 805 is performed by a control computer 807 including a battery provided on the mount 805.
- The imaging device 800 illustrated in FIG. 27B is obtained by arranging the cameras 801 illustrated in FIG. 27A in an array in a two-dimensional direction.
- the imaging device 800 as illustrated in FIG. 27A or 27B repeats the imaging process while moving in space.
- The camera 801 provided in the imaging device 800 illustrated in FIGS. 27A and 27B may be a wide-angle camera having an imaging field of view only in a predetermined direction as illustrated in FIG. 28A, or may be an omnidirectional camera having an imaging field of view around the entire periphery as illustrated in FIG. 28B. However, since it is preferable to capture as many images as possible in the space as densely as possible, as described below, the camera 801 provided in the imaging device 800 is preferably an all-around camera as illustrated in FIG. 28B.
- the imaging device 800 repeats imaging while moving in the space at a predetermined interval (preferably at regular intervals), as schematically shown in FIG. At this time, the imaging position in the space is recorded simultaneously with the imaging by the position recognition marker 803 provided in the imaging device 800.
- a plurality of pieces of imaging data imaged by such an imaging device 800 are output to the information processing apparatus 2000.
- The information processing apparatus 2000 includes an imaging space generation unit 2010, a light space generation unit 2020, and an image reconstruction unit 2030.
- the information processing apparatus 2000 may further include at least one of a display control unit 2040, a data acquisition unit 2050, a data providing unit 2060, and a storage unit 2070.
- each processing unit shown in FIG. 30 may be realized in any one of the server 100 and the client devices 200 to 700, or may be realized by being distributed among a plurality of devices.
- The imaging space generation unit 2010 uses a plurality of captured images, captured by the imaging device 800 moving in the space or by an imaging device attached to a moving body moving in the space, to generate an imaging space in which information representing the position of each captured image in the space is associated with the corresponding captured image.
- Since the position recognition marker 803 is provided together with the camera 801, it is possible to easily associate each captured image with its imaging position.
- The generated imaging space is a three-dimensional space in which each grid point corresponds to an imaging position of the imaging device 800 or the like, and a captured image is associated with each grid point.
- A captured image associated with the imaging position (x′, y′, z′) can be provided.
- When the captured image associated with each grid point is an omnidirectional image, an image in the received line-of-sight direction can be cut out from the corresponding omnidirectional image and used for user browsing.
- It is preferable that the number of grid points in the imaging space as shown in FIG. 31 be as large as possible, and that the interval between adjacent grid points be as short as possible.
- A free-viewpoint image can thus be provided based on the imaging space as shown in FIG. 31.
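The imaging space described above functions, in effect, as a lookup table from grid-point coordinates to captured images. The following is a minimal sketch of such a structure; the grid spacing, coordinate values, and image identifiers are assumptions made for illustration and are not part of this disclosure:

```python
# Sketch of an "imaging space": a map from grid-point coordinates to the
# captured image recorded at that imaging position.
GRID_STEP = 0.5  # assumed spacing between adjacent imaging positions

def make_imaging_space(records):
    """records: iterable of ((x, y, z), image) pairs from the imaging device."""
    return {pos: img for pos, img in records}

def image_at(space, query, step=GRID_STEP):
    """Return the captured image at the grid point nearest to `query`,
    by snapping each coordinate to the grid."""
    snapped = tuple(round(c / step) * step for c in query)
    return space.get(snapped)

# Two imaging positions recorded while the device moved along the x axis.
space = make_imaging_space([((0.0, 0.0, 0.0), "img_000"),
                            ((0.5, 0.0, 0.0), "img_100")])
```

Querying a position between grid points returns the image of the nearest grid point; the denser the grid, the closer this approximation comes to the free-viewpoint image described above.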
- the light space is generated by the light space generation unit 2020 based on the imaging space generated by the imaging space generation unit 2010.
- the light space generation unit 2020 generates a light space circumscribing the imaging space based on the imaging space generated by the imaging space generation unit 2010.
- The light space may be, for example, a rectangular parallelepiped space circumscribing the imaging space as shown in FIG. 32A, or may be a spherical space circumscribing the imaging space as shown in FIG. 32B.
- In the present embodiment, a light space circumscribing the imaging space is generated, so that any light ray passing through the light space penetrates the boundary of the light space at two points.
- Accordingly, the light ray L can be defined using the coordinates of the two points at which it penetrates the rectangular parallelepiped.
- the light ray L defined in this way corresponds to one point in the four-dimensional space (u, v, s, t).
- Although the coordinate system differs for each surface of the rectangular parallelepiped, points can be expressed in the same coordinate system regardless of the penetrated surface by considering the development view of the light space, as shown in the lower part of FIG. 32A.
- When the light space is a spherical space as shown in FIG. 32B, one point on the sphere surface can be converted into one point on a rectangular plane by a known method such as equirectangular projection. Therefore, even when the light space is a spherical space as shown in FIG. 32B, the light ray L can be associated with one point in the four-dimensional space, as in the case of FIG. 32A.
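As a concrete illustration of the equirectangular projection mentioned above, the sketch below maps a unit direction vector on the sphere to a point on a rectangular (longitude/latitude) plane; the rectangle width and height are assumed parameters for the example:

```python
import math

def direction_to_equirect(d, width, height):
    """Map a unit 3-D direction to coordinates on an equirectangular plane:
    longitude along the horizontal axis, latitude along the vertical axis."""
    x, y, z = d
    lon = math.atan2(y, x)                    # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, z)))   # latitude in [-pi/2, pi/2]
    u = (lon + math.pi) / (2.0 * math.pi) * width
    v = (math.pi / 2.0 - lat) / math.pi * height
    return u, v
```

For example, the direction (1, 0, 0) on the sphere maps to the center of a 360 x 180 rectangle, and the pole direction (0, 0, 1) maps to its top edge.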
- the image reconstruction unit 2030 reconstructs an image corresponding to the reconstruction information based on the light space generated by the light space generation unit 2020 and the reconstruction information regarding the viewpoint and the line-of-sight direction specified by the user.
- an image reconstruction method will be described with reference to FIG.
- Each point of the image viewed by the user can be represented by a vector defined by a viewpoint position and a line-of-sight direction from that viewpoint position.
- When the image reconstruction unit 2030 specifies such a vector as shown in FIG. 33 based on the reconstruction information, it extends this vector and calculates the position of its intersection with the light space.
- If the coordinates of the intersections are (u, v) and (s, t) as shown in FIG. 33, the light ray L corresponding to the pixel specified by the reconstruction information is L(u, v, s, t).
- The image reconstruction unit 2030 can reproduce an image at an arbitrary viewpoint designated by the user by repeating the above processing for all the pixels included in the reconstructed image.
- Such a reconstruction method can be realized with higher accuracy by densely capturing multi-viewpoint images with the imaging device 800 to generate the light space. Since the relationship between the generated light space and the captured images is simple, and the method for reproducing the space described above is also simple, the above processing can be performed at a low calculation cost. In addition, since the reconstruction method according to the present embodiment does not need to recognize corresponding points between two images as in conventional multi-view stereo imaging, it can be said to be a general-purpose method that can reconstruct an image of an arbitrary viewpoint and an arbitrary direction extremely simply.
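The per-pixel lookup described above can be sketched with a two-plane parameterisation of the light space: each viewing ray is intersected with two parallel planes to obtain (u, v, s, t), and the ray function L is sampled once per pixel. The plane positions z = 0 and z = 1 and the ray function passed in are assumptions for illustration:

```python
def ray_to_uvst(origin, direction, z_uv=0.0, z_st=1.0):
    """Intersect a viewing ray with the planes z = z_uv and z = z_st and
    return the four-dimensional light-space coordinate (u, v, s, t)."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        raise ValueError("ray parallel to the parameterisation planes")
    k_uv = (z_uv - oz) / dz  # ray parameter at the (u, v) plane
    k_st = (z_st - oz) / dz  # ray parameter at the (s, t) plane
    return (ox + k_uv * dx, oy + k_uv * dy,
            ox + k_st * dx, oy + k_st * dy)

def reconstruct(view_origin, pixel_dirs, L):
    """Reconstruct an image by sampling the ray function L(u, v, s, t)
    once for each pixel's viewing direction, as in the per-pixel loop
    described in the text."""
    return [L(*ray_to_uvst(view_origin, d)) for d in pixel_dirs]
```

Repeating this lookup for every pixel of the output image yields the reconstructed view at the designated viewpoint, without any correspondence search between images.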
- The display control unit 2040 controls the display contents of a display device, such as a display, provided in or outside the information processing apparatus 2000. Specifically, the display control unit 2040 can cause the user to view the imaging space generated by the imaging space generation unit 2010, the light space generated by the light space generation unit 2020, the image reconstructed by the image reconstruction unit 2030, and the like. As a result, the user can view images of arbitrary viewpoints and arbitrary directions at any time.
- the data acquisition unit 2050 acquires captured image data output from the imaging device 800 and data related to the imaging position, and acquires data related to user operations output from user operation devices, various input mechanisms, and the like. Various types of data acquired by the data acquisition unit 2050 can be appropriately used by each processing unit of the information processing apparatus 2000.
- the data providing unit 2060 provides various types of data (for example, reconstructed image data) generated by the information processing apparatus 2000 to a device provided outside the information processing apparatus 2000. Thereby, even in an apparatus provided outside the information processing apparatus 2000, various types of information generated by the information processing apparatus 2000 can be used.
- In the storage unit 2070, various databases used for the processing in the imaging space generation unit 2010, the light space generation unit 2020, the image reconstruction unit 2030, the display control unit 2040, the data acquisition unit 2050, and the data provision unit 2060, various programs including applications used for the various arithmetic processes executed by these processing units, various parameters that need to be saved when performing some processing, the progress of processing, and the like may be recorded as appropriate.
- Each processing unit, such as the imaging space generation unit 2010, the light space generation unit 2020, the image reconstruction unit 2030, the display control unit 2040, the data acquisition unit 2050, and the data provision unit 2060, can freely access the storage unit 2070 to write and read data.
- each component described above may be configured using a general-purpose member or circuit, or may be configured by hardware specialized for the function of each component.
- Alternatively, a CPU or the like may perform all the functions of each component. Accordingly, the configuration to be used can be changed as appropriate according to the technical level at the time of carrying out the present embodiment.
- a computer program for realizing each function of the information processing apparatus according to the present embodiment as described above can be produced and installed in a personal computer or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- The imaging space generation unit 2010 of the information processing apparatus 2000 acquires the imaging data captured by the imaging device 800, and generates an imaging space as shown in FIG. 33 based on the imaging data and the information on the imaging positions associated with the imaging data (step S201).
- Next, the light space generation unit 2020 of the information processing apparatus 2000 performs the definition loop of the light ray L expressed in step S203 and subsequent steps, using the imaging space generated by the imaging space generation unit 2010.
- the definition loop of the light ray L is performed until the processing is completed for all the pixels (Px, Py) at all the imaging positions (Cx, Cy, Cz).
- Thereafter, the light space generation unit 2020 performs the interpolation loop of the light ray L expressed in step S211 and subsequent steps. This interpolation loop of the light ray L is performed until the processing is completed for all the light space coordinates (u, v, s, t).
- In the interpolation loop, the light space generation unit 2020 first determines whether L(u, v, s, t) is defined for the light space coordinate (u, v, s, t) of interest (step S213). If L(u, v, s, t) has already been defined, the light space generation unit 2020 continues the process of step S213 for another light space coordinate (u, v, s, t). On the other hand, when L(u, v, s, t) is not defined, the light space generation unit 2020 interpolates it by a known method from the values of L already defined in the vicinity of (u, v, s, t).
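The interpolation step can be sketched as follows: every queried light-space coordinate with no defined radiance receives the value of the nearest already-defined sample. Nearest-neighbour copying is an assumption made here for brevity; the text only requires interpolation from neighbouring defined values by a known method:

```python
def complete_ray_space(L_samples, coords):
    """Fill undefined entries of the ray function: for each coordinate in
    `coords` that has no radiance yet, copy the value of the nearest
    already-defined sample (squared Euclidean distance in (u, v, s, t))."""
    defined = dict(L_samples)  # snapshot of the samples defined so far
    for c in coords:
        if c in L_samples:
            continue  # already defined; nothing to interpolate
        nearest = min(defined,
                      key=lambda k: sum((a - b) ** 2 for a, b in zip(k, c)))
        L_samples[c] = defined[nearest]
    return L_samples
```

A denser set of defined samples makes this completion step both cheaper and more accurate, which is why dense capture is preferred above.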
- The image reconstruction unit 2030 of the information processing apparatus 2000 first refers to information on user operations and the like to specify the viewpoint position (Cx, Cy, Cz) and the position (Px, Py) of each pixel included in the image to be reconstructed (step S301).
- the image reconstruction unit 2030 performs a luminance calculation loop expressed in step S303 and subsequent steps. This luminance calculation loop is performed until the processing is completed for all the pixels (Px, Py) included in the image to be reconstructed.
- the image reconstruction unit 2030 outputs the image data reconstructed as described above via the display control unit 2040 and the data providing unit 2060 (step S309). As a result, the reconstructed image is visually recognized by the user.
- As described above, in the information processing apparatus and the information processing method according to the present embodiment, a predetermined light space is generated based on a plurality of captured image data captured by an imaging device moving in the space, and an image at an arbitrary viewpoint and in an arbitrary direction designated by the user is reconstructed based on the generated light space.
- Since the light space generated by the information processing apparatus and the information processing method according to the present embodiment can describe all the light rays observed from inside the light space without restrictions on observation position or observation direction, the field of view can be re-synthesized without constraint.
- (1) An information processing apparatus comprising: a control unit that performs control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
- (2) The information processing apparatus according to (1), further comprising: an image generation unit that generates, using captured images included in the image information, a surrounding captured image in which the surroundings of the position where the moving body exists are captured; and an image selection unit that selects, based on the surrounding captured image generated by the image generation unit and the user viewing information, a captured image corresponding to the user viewing information from the surrounding captured image as a user viewing image.
- (3) The information processing apparatus according to (2), further comprising an image correction unit that, when the line-of-sight direction of the imaging device changes, performs correction on the surrounding captured image, based on the imaging device attitude information, to suppress the change in the surrounding captured image accompanying the change in the line-of-sight direction of the imaging device.
- (7) The information processing apparatus according to any one of (3) to (6), wherein the image correction unit performs the correction so that local feature amounts match before and after the change in the line-of-sight direction of the imaging device.
- (8) The information processing apparatus according to any one of (2) to (7), wherein the image generation unit generates, as the surrounding captured image, an all-around image at the position where the moving body exists, or a converted image obtained by converting the all-around image into a rectangular image.
- (9) The information processing apparatus according to any one of (1) to (8), wherein the imaging device attitude information is information on the rotation of the imaging device.
- (10) The information processing apparatus according to any one of (2) to (9), wherein the user viewing information is information specifying a display angle of view that the user wants to view in the surrounding captured image.
- (11) The information processing apparatus according to (2), further comprising a line-of-sight information generation unit that generates, based on the imaging device attitude information, line-of-sight information indicating the line-of-sight direction of the imaging device, wherein the control unit uses the line-of-sight information generated by the line-of-sight information generation unit to display, together with the user viewing image, an object indicating the line-of-sight direction of the imaging device represented by the line-of-sight information.
- (12) The information processing apparatus according to (10) or (11), wherein the control unit generates attitude information that visualizes changes in the attitude of the imaging device, using rotation information on the rotation accompanying the change in the line-of-sight direction of the imaging device calculated based on the imaging device attitude information, and performs control to superimpose the generated attitude information on the display image and display the display image in the display area viewed by the user.
- (15) The information processing apparatus according to any one of (12) to (14), wherein the control unit visualizes the change in the attitude of the imaging device by using the rotation information to fix the movement of a coordinate system fixed to the space in which the imaging device exists while changing a coordinate system fixed to the imaging device, or to change the movement of the coordinate system fixed to the space in which the imaging device exists while fixing the movement of the coordinate system fixed to the imaging device.
- (16) The information processing apparatus according to (15), wherein, when visualizing the change in the attitude of the imaging device using the rotation information, the control unit generates the display image corresponding to a case where the space is virtually viewed from a position different from the center of the coordinate system fixed to the space.
- (17) The information processing apparatus according to any one of (12) to (16), wherein, when the user operation device requests that an annotation be added to a specific position in the display image, the control unit associates the annotation with the location corresponding to the specific position in the coordinate system fixed to the space in which the imaging device exists.
- (18) The information processing apparatus according to any one of (12) to (17), wherein the control unit controls, in accordance with the rotation information, at least one of the reproduction speed and the display angle of view when the display image is displayed in the display area viewed by the user.
- (19) The information processing apparatus according to any one of (12) to (18), wherein the control unit generates a display image corresponding to a case where the space is virtually viewed from an arbitrary position specified from the user operation device, with the specified arbitrary position as the center.
- (20) The information processing apparatus according to any one of (13) to (19), wherein the control unit changes settings related to the correction processing in the image correction unit and settings related to the superimposition processing of the attitude information in the control unit, based on an operation performed on at least one of the imaging device and the user operation device.
- (21) The information processing apparatus according to (2), further comprising an image correction unit that performs correction on the surrounding captured image to suppress changes in the surrounding captured image accompanying rotation of the imaging device.
- (22) The information processing apparatus according to (21), wherein the control unit controls display of the display image based on the moving direction of the moving body on which the imaging device is mounted.
- (23) The information processing apparatus according to (22), wherein, when the angle difference between the moving direction of the moving body and a reference direction of the moving body is within a threshold, the control unit rotates, at a predetermined speed, the display angle of view of the display image displayed in the display area viewed by the user so that the reference direction in the display image coincides with the reference direction or the moving direction of the moving body.
- (24) The information processing apparatus according to (23), wherein the predetermined speed is controlled based on at least one of the moving speed and the rotational speed of the moving body.
- (25) The information processing apparatus according to any one of (21) to (24), wherein the user operation device is a wearable device that is worn by the user and on which the display image is displayed, and the image correction unit does not perform the correction for suppressing changes in the surrounding captured image when the rotation direction of the moving body coincides with the rotation direction of the user operation device.
- (27) The information processing apparatus according to any one of (1) to (26), wherein the user operation device is a wearable device worn by the user, and the user viewing information is generated in accordance with the user's line-of-sight direction detected by the wearable device.
- (28) The information processing apparatus according to any one of (1) to (27), wherein the moving body is any one of a human being different from the user operating the user operation device, a self-propelled body that travels in the space, or a flying body that flies in the space.
- (30) An information processing method comprising: performing control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computing Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Studio Devices (AREA)
- Position Input By Displaying (AREA)
- Telephone Function (AREA)
Abstract
Description
1. First embodiment
1.1. Example system configuration
1.2. Configuration of the information processing apparatus
1.3. Flow of the information processing method
1.4. Examples of display control processing
1.5. Summary
2. Second embodiment
2.1. Configuration of the information processing apparatus
2.2. Flow of the information processing method
2.3. Modifications of the information processing method
2.4. Summary
3. Third embodiment
3.1. Example system configuration
3.2. Configuration of the imaging device
3.3. Configuration of the information processing apparatus
3.4. Flow of the information processing method
3.5. Summary
<Example system configuration>
FIG. 1 is a diagram illustrating a schematic configuration of a system according to the first embodiment of the present disclosure. As illustrated in FIG. 1, the system 10 according to the present embodiment includes a server 100 and clients 200 to 700.
(2) A device that has an imaging mechanism such as a camera, performs various kinds of image processing on captured images of a real space, and provides the various images of the real space obtained by the image processing to the server 100 or the other client devices 200 to 700.
(3) A device that has an imaging mechanism such as a camera, performs various kinds of image processing on captured images of a real space, generates images desired by the user in accordance with various operations performed by the user on the various images, and provides the generated images to the server 100 or the other client devices 200 to 700.
(4) A device that has at least a display mechanism such as a display, and preferably further has an operation mechanism such as a touch panel, that acquires images provided by the device of (1), generates images desired by the user in accordance with various operations performed by the user on the images, and presents the generated images for viewing by the user.
(5) A device that has at least a display mechanism such as a display, and preferably further has an operation mechanism such as a touch panel, that acquires images provided by the device of (2), generates images desired by the user in accordance with various operations performed by the user on the images, and presents the generated images for viewing by the user.
(6) A device that has at least a display mechanism such as a display, and preferably further has an operation mechanism such as a touch panel, that acquires images provided by the device of (3), presents them for viewing by the user, and accepts various operations by the user on the images.
(7) A device that has a display mechanism such as a display and displays various images generated based on the various user operations accepted by the devices of (4) to (6).
FIG. 2 is a diagram illustrating a schematic configuration of a device according to the present embodiment. As illustrated in FIG. 2, the device 900 includes a processor 910 and a memory 920. The device 900 may further include at least one of a display unit 930, an operation unit 940, a communication unit 950, an imaging unit 960, and a sensor 970. These components are interconnected by a bus 980. The device 900 can implement, for example, the server device constituting the server 100 described above and the client devices 200 to 700.
Next, the configuration of the information processing apparatus according to the present embodiment, which is realized as the system 10 as a whole by the server 100 and the client devices 200 to 700 described above operating alone or in cooperation with one another, will be described in detail, mainly in terms of its functions, with reference to FIGS. 3 to 10.
Next, the flow of the information processing method performed by the information processing apparatus 1000 according to the present embodiment will be briefly described with reference to FIG. 11.
Subsequently, the display control processing performed by the display control unit 1050 will be described concretely with reference to FIGS. 12 to 17, giving examples of display images generated by the display control unit 1050 of the information processing apparatus 1000 according to the present embodiment.
FIGS. 12 to 16 are explanatory diagrams for describing the display control processing according to the present embodiment, and FIG. 17 is a flowchart illustrating an example of the flow of the display control processing according to the present embodiment.
(a) A coordinate system fixed to the space in which the imaging device exists (an absolute coordinate system; hereinafter also referred to as coordinate system A).
(b) A coordinate system fixed to the imaging device (a relative coordinate system; hereinafter also referred to as coordinate system B).
In this case, the surrounding captured image displayed in the display area of a display device such as the user operation device is displayed so as to change as the attitude of the imaging device changes. Meanwhile, the image corresponding to the attitude information and various objects are displayed as if fixed to the display area, even when the attitude of the imaging device changes.
(B) The movement of coordinate system A is changed in accordance with the attitude information, and the movement of coordinate system B is fixed.
In this case, even when the attitude of the imaging device changes, the surrounding captured image displayed in the display area of a display device such as the user operation device is displayed so that its appearance does not change, or so that the change in appearance accompanying the change in the attitude of the imaging device is reduced. Meanwhile, the image corresponding to the attitude information and various objects are displayed in the display area so as to change (for example, virtually rotate) as the attitude of the imaging device changes.
In the following, an example of a display image transmitted to the user operation device by the display control processing performed by the display control unit 1050 according to the present embodiment will be briefly described with reference to FIGS. 14 to 16.
Next, an example of the flow of the display control processing according to the present embodiment will be briefly described with reference to FIG. 17.
As described above, with the information processing apparatus and the information processing method according to the present embodiment, the video surrounding the moving body can be observed in real time as a surrounding captured image, and the user can obtain a sense of presence as if he or she were actually at the place where the moving body exists. In addition, by performing the correction processing described above, image shake caused by the rotational movement of the moving body is suppressed, so that the user can avoid motion sickness caused by abrupt image changes.
<Configuration of the information processing apparatus>
Next, the second embodiment of the present disclosure will be described. The information processing apparatus 1100 according to the present embodiment performs various kinds of information processing on captured images captured by an imaging device mounted on a moving body, and enables the user to view the all-around captured image of the surroundings of the moving body more naturally.
Next, the flow of the information processing method performed by the information processing apparatus 1100 according to the present embodiment will be briefly described with reference to FIG. 22.
Subsequently, a modification of the present embodiment will be described with reference to FIG. 23. The modification of the present embodiment is an information processing apparatus and an information processing method that control the image correction unit so that rotation correction is not applied to the surrounding captured image when the movement (motion) of the moving body and the movement (motion) of the user are synchronized.
As described above, with the information processing apparatus and the information processing method according to the present embodiment, the video surrounding the moving body can be observed in real time as a surrounding captured image, and the user can obtain a sense of presence as if he or she were actually at the place where the moving body exists. In addition, by performing the correction processing described above, image shake caused by the rotational movement of the imaging device is suppressed, while changes in the moving direction of the moving body can be reflected in the display image. As a result, the occurrence of motion sickness is suppressed, and the user can view a more natural first-person captured image.
Conventionally, many techniques have been developed that capture a real landscape with a plurality of cameras, re-synthesize the landscape in a virtual space, and allow users to browse the space. Examples of such techniques include a technique that associates images captured by a camera mounted on a traveling automobile with position information to enable movement within the images, and a technique that mounts a camera capable of capturing all-around images to enable the line-of-sight direction to be changed freely from a given point. These techniques make it possible to browse surrounding images while moving through the virtual space. On the other hand, because the imaging locations are discrete in these techniques, it is possible to move to a nearby observation point and observe the all-around image at the destination, but the viewpoint cannot be moved smoothly. Accordingly, the images realized by these techniques differ from a reproduction in which the user feels as if he or she were actually at the location.
FIG. 26 is a diagram illustrating a schematic configuration of a system according to the third embodiment of the present disclosure. As illustrated in FIG. 26, the system 20 according to the present embodiment includes an imaging device 800 and an information processing apparatus 2000.
The imaging device 800 according to the present embodiment may be configured as a single imaging device, such as a robot that can move freely in the space, but it is preferable to use a self-propelled imaging device 800 having an array camera as illustrated in FIG. 27A or FIG. 27B.
As illustrated in FIG. 30, the information processing apparatus 2000 according to the present embodiment includes an imaging space generation unit 2010, a light space generation unit 2020, and an image reconstruction unit 2030. The information processing apparatus 2000 may further include at least one of a display control unit 2040, a data acquisition unit 2050, a data provision unit 2060, and a storage unit 2070. Here, each processing unit illustrated in FIG. 30 may be implemented in any one of the server 100 and the client devices 200 to 700, or may be implemented in a distributed manner across a plurality of devices.
Next, the flow of the information processing method performed by the information processing apparatus 2000 according to the present embodiment will be briefly described with reference to FIGS. 34 and 35.
First, the flow up to the light space generation processing will be described with reference to FIG. 34.
The imaging space generation unit 2010 of the information processing apparatus 2000 according to the present embodiment acquires the imaging data captured by the imaging device 800, and generates an imaging space as illustrated in FIG. 33 based on the imaging data and the information on the imaging positions associated with the imaging data (step S201).
Next, the flow of the image reconstruction processing will be described with reference to FIG. 35.
The image reconstruction unit 2030 of the information processing apparatus 2000 according to the present embodiment first refers to information on user operations and the like to specify the viewpoint position (Cx, Cy, Cz) and the position (Px, Py) of each pixel included in the image to be reconstructed (step S301).
As described above, in the information processing apparatus and the information processing method according to the present embodiment, a predetermined light space is generated based on a plurality of captured image data captured by an imaging device moving in the space, and an image at an arbitrary viewpoint and in an arbitrary direction designated by the user is reconstructed based on the generated light space. Since the light space generated by the information processing apparatus and the information processing method according to the present embodiment can describe all the light rays observed from inside the light space without restrictions on observation position or observation direction, the field of view can be re-synthesized without constraint.
(1)
An information processing apparatus comprising: a control unit that performs control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
(2)
The information processing apparatus according to (1), further comprising:
an image generation unit that generates, using captured images included in the image information, a surrounding captured image in which the surroundings of the position where the moving body exists are captured; and
an image selection unit that selects, based on the surrounding captured image generated by the image generation unit and the user viewing information, a captured image corresponding to the user viewing information from the surrounding captured image as a user viewing image.
(3)
The information processing apparatus according to (2), further comprising an image correction unit that, when the line-of-sight direction of the imaging device changes, performs correction on the surrounding captured image, based on the imaging device attitude information, to suppress the change in the surrounding captured image accompanying the change in the line-of-sight direction of the imaging device.
(4)
The information processing apparatus according to (3), wherein the image correction unit controls the degree to which the correction is applied, in accordance with correction application information, obtained from the user operation device, indicating the degree of application of the correction.
(5)
The information processing apparatus according to (4), wherein the image correction unit controls the degree to which the correction is applied for each of rotational coordinate axes defined independently of one another with respect to the imaging device, in accordance with the correction application information.
(6)
The information processing apparatus according to any one of (3) to (5), wherein the image correction unit performs correction that reversely rotates the surrounding captured image after the change in the line-of-sight direction of the moving body, in accordance with the magnitude of the rotation angle accompanying the change in the line-of-sight direction of the imaging device.
(7)
The information processing apparatus according to any one of (3) to (6), wherein the image correction unit performs the correction so that local feature amounts match before and after the change in the line-of-sight direction of the imaging device.
(8)
The information processing apparatus according to any one of (2) to (7), wherein the image generation unit generates, as the surrounding captured image, an all-around image at the position where the moving body exists, or a converted image obtained by converting the all-around image into a rectangular image.
(9)
The information processing apparatus according to any one of (1) to (8), wherein the imaging device attitude information is information on the rotation of the imaging device.
(10)
The information processing apparatus according to any one of (2) to (9), wherein the user viewing information is information specifying a display angle of view that the user wants to view in the surrounding captured image.
(11)
The information processing apparatus according to (2), further comprising a line-of-sight information generation unit that generates, based on the imaging device attitude information, line-of-sight information indicating the line-of-sight direction of the imaging device,
wherein the control unit uses the line-of-sight information generated by the line-of-sight information generation unit to display, together with the user viewing image, an object indicating the line-of-sight direction of the imaging device represented by the line-of-sight information.
(12)
The information processing apparatus according to (10) or (11), wherein the control unit generates attitude information that visualizes changes in the attitude of the imaging device, using rotation information on the rotation accompanying the change in the line-of-sight direction of the imaging device calculated based on the imaging device attitude information, and performs control to superimpose the generated attitude information on the display image and display the display image in the display area viewed by the user.
(13)
The information processing apparatus according to (12), further comprising an image correction unit that, when the line-of-sight direction of the imaging device changes without a change in the position of the imaging device, performs correction on the surrounding captured image to suppress the change in the surrounding captured image accompanying the change in the line-of-sight direction of the imaging device,
wherein the control unit superimposes an object representing the attitude information on the surrounding captured image corrected by the image correction unit.
(14)
The information processing apparatus according to (13), wherein the control unit superimposes, on the display image, at least one of an object that rotates with the rotational movement accompanying the change in the line-of-sight direction of the imaging device and an object that does not rotate.
(15)
The information processing apparatus according to any one of (12) to (14), wherein the control unit visualizes the change in the attitude of the imaging device by using the rotation information to fix the movement of a coordinate system fixed to the space in which the imaging device exists while changing a coordinate system fixed to the imaging device, or to change the movement of the coordinate system fixed to the space in which the imaging device exists while fixing the movement of the coordinate system fixed to the imaging device.
(16)
The information processing apparatus according to (15), wherein, when visualizing the change in the attitude of the imaging device using the rotation information, the control unit generates the display image corresponding to a case where the space is virtually viewed from a position different from the center of the coordinate system fixed to the space.
(17)
The information processing apparatus according to any one of (12) to (16), wherein, when the user operation device requests that an annotation be added to a specific position in the display image, the control unit associates the annotation with the location corresponding to the specific position in the coordinate system fixed to the space in which the imaging device exists.
(18)
The information processing apparatus according to any one of (12) to (17), wherein the control unit controls, in accordance with the rotation information, at least one of the reproduction speed and the display angle of view when the display image is displayed in the display area viewed by the user.
(19)
The information processing apparatus according to any one of (12) to (18), wherein the control unit generates a display image corresponding to a case where the space is virtually viewed from an arbitrary position specified from the user operation device, with the specified arbitrary position as the center.
(20)
The information processing apparatus according to any one of (13) to (19), wherein the control unit changes settings related to the correction processing in the image correction unit and settings related to the superimposition processing of the attitude information in the control unit, based on an operation performed on at least one of the imaging device and the user operation device.
(21)
The information processing apparatus according to (2), further comprising an image correction unit that performs correction on the surrounding captured image to suppress changes in the surrounding captured image accompanying rotation of the imaging device.
(22)
The information processing apparatus according to (21), wherein the control unit controls display of the display image based on the moving direction of the moving body on which the imaging device is mounted.
(23)
The information processing apparatus according to (22), wherein, when the angle difference between the moving direction of the moving body and a reference direction of the moving body is within a threshold, the control unit rotates, at a predetermined speed, the display angle of view of the display image displayed in the display area viewed by the user so that the reference direction in the display image coincides with the reference direction or the moving direction of the moving body.
(24)
The information processing apparatus according to (23), wherein the predetermined speed is controlled based on at least one of the moving speed and the rotational speed of the moving body.
(25)
The information processing apparatus according to any one of (21) to (24), wherein the user operation device is a wearable device that is worn by the user and on which the display image is displayed, and the image correction unit does not perform the correction for suppressing changes in the surrounding captured image when the rotation direction of the moving body coincides with the rotation direction of the user operation device.
(26)
The information processing apparatus according to (25), wherein the image correction unit performs the correction for suppressing changes in the surrounding captured image when the rotation amounts of the imaging device and the user operation device are equal to or less than a threshold.
(27)
The information processing apparatus according to any one of (1) to (26), wherein the user operation device is a wearable device worn by the user, and the user viewing information is generated in accordance with the user's line-of-sight direction detected by the wearable device.
(28)
The information processing apparatus according to any one of (1) to (27), wherein the moving body is any one of a human being different from the user operating the user operation device, a self-propelled body that travels in the space, or a flying body that flies in the space.
(29)
The information processing apparatus according to (1), further comprising an acquisition unit that acquires intermediate image information generated based on the image information and the imaging device attitude information,
wherein the control unit performs control to display, in the display area viewed by the user, a display image generated based on the intermediate image information and the user viewing information.
(30)
An information processing method comprising performing control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
(31)
A program for causing a computer to realize a control function of performing control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
1010, 1110 Image generation unit
1020, 1120 Image selection unit
1030, 1130 Image correction unit
1040 Moving-body line-of-sight information generation unit
1050, 1150 Display control unit
1060, 1160, 2050 Data acquisition unit
1070, 1170, 2060 Data provision unit
1080, 1180, 2070 Storage unit
1190 Direction control unit
2010 Imaging space generation unit
2020 Light space generation unit
2030 Image reconstruction unit
Claims (22)
- An information processing apparatus comprising a control unit that performs control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
- The information processing apparatus according to claim 1, further comprising: an image generation unit that generates, using captured images included in the image information, a surrounding captured image in which the surroundings of the position where the moving body exists are captured; and an image selection unit that selects, based on the surrounding captured image generated by the image generation unit and the user viewing information, a captured image corresponding to the user viewing information from the surrounding captured image as a user viewing image.
- The information processing apparatus according to claim 2, further comprising an image correction unit that, when the line-of-sight direction of the imaging device changes, performs correction on the surrounding captured image, based on the imaging device attitude information, to suppress the change in the surrounding captured image accompanying the change in the line-of-sight direction of the imaging device.
- The information processing apparatus according to claim 3, wherein the image correction unit controls the degree to which the correction is applied, in accordance with correction application information, obtained from the user operation device, indicating the degree of application of the correction.
- The information processing apparatus according to claim 4, wherein the image correction unit controls the degree to which the correction is applied for each of rotational coordinate axes defined independently of one another with respect to the imaging device, in accordance with the correction application information.
- The information processing apparatus according to claim 3, wherein the image correction unit performs correction that reversely rotates the surrounding captured image after the change in the line-of-sight direction of the moving body, in accordance with the magnitude of the rotation angle accompanying the change in the line-of-sight direction of the imaging device.
- The information processing apparatus according to claim 3, wherein the image correction unit performs the correction so that local feature amounts match before and after the change in the line-of-sight direction of the imaging device.
- The information processing apparatus according to claim 2, wherein the image generation unit generates, as the surrounding captured image, an all-around image at the position where the moving body exists, or a converted image obtained by converting the all-around image into a rectangular image.
- The information processing apparatus according to claim 2, further comprising a line-of-sight information generation unit that generates, based on the imaging device attitude information, line-of-sight information indicating the line-of-sight direction of the imaging device, wherein the control unit uses the line-of-sight information generated by the line-of-sight information generation unit to display, together with the user viewing image, an object indicating the line-of-sight direction of the imaging device represented by the line-of-sight information.
- The information processing apparatus according to claim 8, wherein the control unit generates attitude information that visualizes changes in the attitude of the imaging device, using rotation information on the rotation accompanying the change in the line-of-sight direction of the imaging device calculated based on the imaging device attitude information, and performs control to superimpose the generated attitude information on the display image and display the display image in the display area viewed by the user.
- The information processing apparatus according to claim 10, further comprising an image correction unit that, when the line-of-sight direction of the imaging device changes without a change in the position of the imaging device, performs correction on the surrounding captured image to suppress the change in the surrounding captured image accompanying the change in the line-of-sight direction of the imaging device, wherein the control unit superimposes an object representing the attitude information on the surrounding captured image corrected by the image correction unit.
- The information processing apparatus according to claim 11, wherein the control unit superimposes, on the display image, at least one of an object that rotates with the rotational movement accompanying the change in the line-of-sight direction of the imaging device and an object that does not rotate.
- The information processing apparatus according to claim 10, wherein the control unit visualizes the change in the attitude of the imaging device by using the rotation information to fix the movement of a coordinate system fixed to the space in which the imaging device exists while changing a coordinate system fixed to the imaging device, or to change the movement of the coordinate system fixed to the space in which the imaging device exists while fixing the movement of the coordinate system fixed to the imaging device.
- The information processing apparatus according to claim 13, wherein, when visualizing the change in the attitude of the imaging device using the rotation information, the control unit generates the display image corresponding to a case where the space is virtually viewed from a position different from the center of the coordinate system fixed to the space.
- The information processing apparatus according to claim 10, wherein, when the user operation device requests that an annotation be added to a specific position in the display image, the control unit associates the annotation with the location corresponding to the specific position in the coordinate system fixed to the space in which the imaging device exists.
- The information processing apparatus according to claim 10, wherein the control unit controls, in accordance with the rotation information, at least one of the reproduction speed and the display angle of view when the display image is displayed in the display area viewed by the user.
- The information processing apparatus according to claim 10, wherein the control unit generates a display image corresponding to a case where the space is virtually viewed from an arbitrary position specified from the user operation device, with the specified arbitrary position as the center.
- The information processing apparatus according to claim 11, wherein the control unit changes settings related to the correction processing in the image correction unit and settings related to the superimposition processing of the attitude information in the control unit, based on an operation performed on at least one of the imaging device and the user operation device.
- The information processing apparatus according to claim 1, wherein the user operation device is a wearable device worn by the user, and the user viewing information is generated in accordance with the user's line-of-sight direction detected by the wearable device.
- The information processing apparatus according to claim 1, wherein the moving body is any one of a human being different from the user operating the user operation device, a self-propelled body that travels in the space, or a flying body that flies in the space.
- An information processing method comprising performing control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
- A program for causing a computer to realize a control function of performing control to display, in a display area viewed by a user, a display image generated based on image information generated by imaging of an imaging device mounted on a moving body moving in a space, imaging device attitude information that is information on the attitude of the imaging device, and user viewing information, obtained from a user operation device operated by the user, that specifies an area the user wants to view.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14882561.5A EP3109744B1 (en) | 2014-02-17 | 2014-12-25 | Information processing device, information processing method and program |
US14/906,967 US9787895B2 (en) | 2014-02-17 | 2014-12-25 | Information processing device, information processing method, and program for generating circumferential captured images |
JP2015562709A JP6515813B2 (ja) | 2014-02-17 | 2014-12-25 | Information processing device, information processing method and program |
KR1020167018597A KR20160122702A (ko) | 2014-02-17 | 2014-12-25 | Information processing device, information processing method and program |
CN201480075275.XA CN105980962B (zh) | 2014-02-17 | 2014-12-25 | Information processing device and information processing method |
RU2016133123A RU2683262C2 (ru) | 2014-02-17 | 2014-12-25 | Information processing device, information processing method and program |
US15/176,577 US10389937B2 (en) | 2014-02-17 | 2016-06-08 | Information processing device, information processing method, and program |
US16/519,883 US10574889B2 (en) | 2014-02-17 | 2019-07-23 | Information processing device, information processing method, and program |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014028015 | 2014-02-17 | ||
JP2014-028015 | 2014-02-17 | ||
JP2014125799 | 2014-06-18 | ||
JP2014-125799 | 2014-06-18 | ||
JP2014-191990 | 2014-09-19 | ||
JP2014191990 | 2014-09-19 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/906,967 A-371-Of-International US9787895B2 (en) | 2014-02-17 | 2014-12-25 | Information processing device, information processing method, and program for generating circumferential captured images |
US15/176,577 Continuation US10389937B2 (en) | 2014-02-17 | 2016-06-08 | Information processing device, information processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015122108A1 true WO2015122108A1 (ja) | 2015-08-20 |
Family
ID=53799862
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/084350 WO2015122108A1 (ja) | 2014-12-25 | Information processing device, information processing method and program |
Country Status (7)
Country | Link |
---|---|
US (3) | US9787895B2 (ja) |
EP (1) | EP3109744B1 (ja) |
JP (1) | JP6515813B2 (ja) |
KR (1) | KR20160122702A (ja) |
CN (1) | CN105980962B (ja) |
RU (1) | RU2683262C2 (ja) |
WO (1) | WO2015122108A1 (ja) |
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106488181A (zh) * | 2015-08-31 | 2017-03-08 | Casio Computer Co., Ltd. | Display control device and display control method |
WO2017068928A1 (ja) * | 2015-10-21 | 2017-04-27 | Sony Corporation | Information processing device, control method therefor, and computer program |
WO2017068926A1 (ja) * | 2015-10-21 | 2017-04-27 | Sony Corporation | Information processing device, control method therefor, and computer program |
EP3280130A1 (en) * | 2016-08-04 | 2018-02-07 | LG Electronics Inc. | Display apparatus |
WO2018030795A1 (ko) * | 2016-08-10 | 2018-02-15 | Samsung Electronics Co., Ltd. | Camera device, display device, and method for correcting motion in the device |
KR20180059765A (ko) | 2015-09-25 | 2018-06-05 | Sony Corporation | Information processing device, information processing method and program |
WO2018150659A1 (ja) | 2017-02-14 | 2018-08-23 | Sony Corporation | Information processing device, information processing method, and program |
EP3383036A2 (en) | 2017-01-12 | 2018-10-03 | Sony Corporation | Information processing device, information processing method, and program |
JP2019046248A (ja) * | 2017-09-04 | 2019-03-22 | Colopl, Inc. | Method for providing a virtual space, program, and information processing device for executing the program |
CN109691084A (zh) * | 2016-09-15 | 2019-04-26 | Sony Corporation | Information processing device and method, and program |
JP2019519125A (ja) * | 2017-03-07 | 2019-07-04 | Linkflow Co., Ltd. | Omnidirectional image capturing method and device for performing the method |
JP2019118090A (ja) * | 2017-01-31 | 2019-07-18 | Ricoh Co., Ltd. | Imaging device and imaging device control method |
WO2019188229A1 (ja) | 2018-03-26 | 2019-10-03 | Sony Corporation | Information processing device, information processing method, and program |
JP2020523957A (ja) * | 2017-06-12 | 2020-08-06 | InterDigital CE Patent Holdings | Method and apparatus for presenting information to a user observing multi-view content |
JP2020526096A (ja) * | 2017-06-29 | 2020-08-27 | Linkflow Co., Ltd. | Imaging method with optimal situation determination and device for performing such a method |
JP2020171031A (ja) | 2016-05-02 | 2020-10-15 | Huawei Technologies Co., Ltd. | Capturing and sharing head-mounted display content |
JP2020532219A (ja) * | 2017-08-23 | 2020-11-05 | Linkflow Co., Ltd. | Street view service method and device for performing such a method |
WO2021014673A1 (ja) * | 2019-07-25 | 2021-01-28 | NTT Communications Corporation | Video display control device, method and program |
US11360545B2 (en) | 2016-03-28 | 2022-06-14 | Sony Corporation | Information processing device, information processing method, and program |
WO2023100220A1 (ja) * | 2021-11-30 | 2023-06-08 | Nippon Telegraph And Telephone Corporation | Video processing device, method and program |
JP7485527B2 (ja) | 2020-03-19 | 2024-05-16 | LINE Yahoo Corporation | Providing device, providing method and providing program |
WO2024201814A1 (ja) * | 2023-03-29 | 2024-10-03 | Maxell, Ltd. | Video sharing system and video sharing method |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2524249B (en) * | 2014-03-17 | 2021-01-20 | Sony Interactive Entertainment Inc | Image Processing |
US10788948B2 (en) * | 2018-03-07 | 2020-09-29 | Quantum Interface, Llc | Systems, apparatuses, interfaces and implementing methods for displaying and manipulating temporal or sequential objects |
JP2016191845A (ja) * | 2015-03-31 | 2016-11-10 | Sony Corporation | Information processing device, information processing method and program |
US10002406B2 (en) * | 2016-10-03 | 2018-06-19 | Samsung Electronics Co., Ltd. | Consistent spherical photo and video orientation correction |
WO2018101227A1 (ja) * | 2016-11-29 | 2018-06-07 | Sharp Corporation | Display control device, head-mounted display, display control device control method, and control program |
JP7013808B2 (ja) * | 2017-11-15 | 2022-02-01 | FUJIFILM Business Innovation Corp. | Information processing device and program |
KR102213308B1 (ko) * | 2017-12-01 | 2021-02-08 | Linkflow Co., Ltd. | Omnidirectional image capturing method and device for performing the method |
EP3731066A4 (en) | 2017-12-20 | 2021-01-27 | Sony Corporation | DATA PROCESSING DEVICE AND PROCESS, AND ASSOCIATED PROGRAM |
US10863154B2 (en) * | 2018-02-20 | 2020-12-08 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and storage medium |
JP2021105749A (ja) * | 2018-03-20 | 2021-07-26 | Sony Group Corporation | Information processing device, information processing method and program |
WO2020032371A1 (ko) * | 2018-08-09 | 2020-02-13 | Linkflow Co., Ltd. | Video sharing method and device |
JP2020043387A (ja) * | 2018-09-06 | 2020-03-19 | Canon Inc. | Image processing apparatus, image processing method, program, and storage medium |
US11749141B2 (en) * | 2018-10-04 | 2023-09-05 | Sony Corporation | Information processing apparatus, information processing method, and recording medium |
CN110413108B (zh) * | 2019-06-28 | 2023-09-01 | Guangdong Virtual Reality Technology Co., Ltd. | Virtual image processing method, apparatus, system, electronic device and storage medium |
DE112021002093T5 (de) * | 2020-03-30 | 2023-02-09 | Sony Group Corporation | Method for changing the viewpoint in virtual space |
KR102398839B1 (ko) * | 2020-12-01 | 2022-05-17 | Seoul National University of Science and Technology Industry-Academic Cooperation Foundation | Device for projecting an image onto the surface of an object |
US11622100B2 (en) * | 2021-02-17 | 2023-04-04 | flexxCOACH VR | 360-degree virtual-reality system for dynamic events |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002135641A (ja) * | 2000-10-27 | 2002-05-10 | Nippon Telegr & Teleph Corp <Ntt> | Camera system with freely movable viewpoint and line of sight |
JP2003284058A (ja) * | 2002-03-26 | 2003-10-03 | Sony Corp | Image processing device and method, imaging device and method, and program |
JP2006047748A (ja) * | 2004-08-05 | 2006-02-16 | Canon Inc | Optical apparatus having camera-shake correction function |
JP2007043225A (ja) * | 2005-07-29 | 2007-02-15 | Univ Of Electro-Communications | Captured image processing device and captured image processing method |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6690338B1 (en) * | 1993-08-23 | 2004-02-10 | Francis J. Maguire, Jr. | Apparatus and method for providing images of real and virtual objects in a head mounted display |
JPH11153987A (ja) * | 1997-11-19 | 1999-06-08 | Olympus Optical Co Ltd | Graphic display device |
CN1093711C (zh) * | 1998-02-06 | 2002-10-30 | Industrial Technology Research Institute | Panoramic-image virtual reality playback system and method |
US7023913B1 (en) * | 2000-06-14 | 2006-04-04 | Monroe David A | Digital security multimedia sensor |
KR20020025301A (ko) * | 2000-09-28 | 2002-04-04 | Oh Gil-Rok | Apparatus and method for providing augmented-reality images using panoramic images supporting multiple users |
EP1599036A1 (en) * | 2003-02-25 | 2005-11-23 | Matsushita Electric Industrial Co., Ltd. | Image pickup processing method and image pickup apparatus |
US8001623B2 (en) * | 2005-05-26 | 2011-08-23 | Gertsch Jeffrey H | Electronic helmet |
US9270976B2 (en) * | 2005-11-02 | 2016-02-23 | Exelis Inc. | Multi-user stereoscopic 3-D panoramic vision system and method |
US9891435B2 (en) * | 2006-11-02 | 2018-02-13 | Sensics, Inc. | Apparatus, systems and methods for providing motion tracking using a personal viewing device |
EP1926345A3 (en) * | 2006-11-22 | 2011-09-14 | Panasonic Corporation | Stereophonic sound control apparatus and stereophonic sound control method |
CN101578571A (zh) * | 2007-07-09 | 2009-11-11 | Sony Corporation | Electronic device and control method thereof |
US8436788B2 (en) * | 2008-01-08 | 2013-05-07 | Lockheed Martin Corporation | Method and apparatus for displaying |
US20100259619A1 (en) * | 2009-04-10 | 2010-10-14 | Nicholson Timothy J | Hmd with elevated camera |
US9144714B2 (en) * | 2009-05-02 | 2015-09-29 | Steven J. Hollinger | Ball with camera for reconnaissance or recreation and network for operating the same |
WO2010130084A1 (zh) * | 2009-05-12 | 2010-11-18 | Huawei Device Co., Ltd. | Telepresence system, method and video capture device |
JP2011205358A (ja) * | 2010-03-25 | 2011-10-13 | Fujifilm Corp | Head-mounted display device |
CN101916452B (zh) * | 2010-07-26 | 2012-04-25 | Institute of Remote Sensing Applications, Chinese Academy of Sciences | Method for automatically stitching unmanned aerial vehicle remote-sensing images based on flight control information |
JP5406813B2 (ja) * | 2010-10-05 | 2014-02-05 | Sony Computer Entertainment Inc. | Panoramic image display device and panoramic image display method |
EP2485119A3 (en) * | 2011-02-02 | 2012-12-12 | Nintendo Co., Ltd. | Spatially-correlated multi-display human-machine interface |
US20130050069A1 (en) * | 2011-08-23 | 2013-02-28 | Sony Corporation, A Japanese Corporation | Method and system for use in providing three dimensional user interface |
WO2013086246A1 (en) * | 2011-12-06 | 2013-06-13 | Equisight Inc. | Virtual presence model |
US20130322683A1 (en) * | 2012-05-30 | 2013-12-05 | Joel Jacobs | Customized head-mounted display device |
JP6124517B2 (ja) * | 2012-06-01 | 2017-05-10 | Nintendo Co., Ltd. | Information processing program, information processing device, information processing system, and panoramic video display method |
JP5664677B2 (ja) | 2013-02-19 | 2015-02-04 | Sony Corporation | Imaging display device and imaging display method |
-
2014
- 2014-12-25 CN CN201480075275.XA patent/CN105980962B/zh active Active
- 2014-12-25 KR KR1020167018597A patent/KR20160122702A/ko not_active Application Discontinuation
- 2014-12-25 JP JP2015562709A patent/JP6515813B2/ja active Active
- 2014-12-25 US US14/906,967 patent/US9787895B2/en active Active
- 2014-12-25 WO PCT/JP2014/084350 patent/WO2015122108A1/ja active Application Filing
- 2014-12-25 RU RU2016133123A patent/RU2683262C2/ru not_active IP Right Cessation
- 2014-12-25 EP EP14882561.5A patent/EP3109744B1/en active Active
-
2016
- 2016-06-08 US US15/176,577 patent/US10389937B2/en active Active
-
2019
- 2019-07-23 US US16/519,883 patent/US10574889B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002135641A (ja) * | 2000-10-27 | 2002-05-10 | Nippon Telegr & Teleph Corp <Ntt> | Camera system with freely movable viewpoint and line of sight |
JP2003284058A (ja) * | 2002-03-26 | 2003-10-03 | Sony Corp | Image processing device and method, imaging device and method, and program |
JP2006047748A (ja) * | 2004-08-05 | 2006-02-16 | Canon Inc | Optical apparatus having camera-shake correction function |
JP2007043225A (ja) * | 2005-07-29 | 2007-02-15 | Univ Of Electro-Communications | Captured image processing device and captured image processing method |
Non-Patent Citations (1)
Title |
---|
See also references of EP3109744A4 * |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017049670A (ja) * | 2015-08-31 | 2017-03-09 | Casio Computer Co., Ltd. | Display control device, display control method and program |
US10630892B2 (en) | 2015-08-31 | 2020-04-21 | Casio Computer Co., Ltd. | Display control apparatus to perform predetermined process on captured image |
CN106488181B (zh) | 2015-08-31 | 2020-03-10 | Casio Computer Co., Ltd. | Display control device, display control method and recording medium |
CN106488181A (zh) * | 2015-08-31 | 2017-03-08 | Casio Computer Co., Ltd. | Display control device and display control method |
KR20180059765A (ko) | 2015-09-25 | 2018-06-05 | Sony Corporation | Information processing device, information processing method and program |
WO2017068926A1 (ja) * | 2015-10-21 | 2017-04-27 | Sony Corporation | Information processing device, control method therefor, and computer program |
US10986206B2 (en) | 2015-10-21 | 2021-04-20 | Sony Corporation | Information processing apparatus, control method thereof, and computer readable medium for visual information sharing |
WO2017068928A1 (ja) * | 2015-10-21 | 2017-04-27 | Sony Corporation | Information processing device, control method therefor, and computer program |
US11360545B2 (en) | 2016-03-28 | 2022-06-14 | Sony Corporation | Information processing device, information processing method, and program |
JP2022009355A (ja) | 2016-05-02 | 2022-01-14 | Huawei Technologies Co., Ltd. | Capturing and sharing head-mounted display content |
JP2020171031A (ja) | 2016-05-02 | 2020-10-15 | Huawei Technologies Co., Ltd. | Capturing and sharing head-mounted display content |
US10171641B2 (en) | 2016-08-04 | 2019-01-01 | Lg Electronics Inc. | Display apparatus |
EP3280130A1 (en) * | 2016-08-04 | 2018-02-07 | LG Electronics Inc. | Display apparatus |
WO2018030795A1 (ko) * | 2016-08-10 | 2018-02-15 | Samsung Electronics Co., Ltd. | Camera device, display device, and method for correcting motion in the device |
US10802286B2 (en) | 2016-08-10 | 2020-10-13 | Samsung Electronics Co., Ltd. | Camera device, display device, and method for correcting motion in device |
US11189055B2 (en) | 2016-09-15 | 2021-11-30 | Sony Corporation | Information processing apparatus and method and program |
US20190172227A1 (en) * | 2016-09-15 | 2019-06-06 | Sony Corporation | Information processing apparatus and method and program |
CN109691084A (zh) * | 2016-09-15 | 2019-04-26 | Sony Corporation | Information processing device and method, and program |
EP3383036A2 (en) | 2017-01-12 | 2018-10-03 | Sony Corporation | Information processing device, information processing method, and program |
JP7180074B2 (ja) | 2017-01-31 | 2022-11-30 | Ricoh Co., Ltd. | Imaging device |
JP2019118090A (ja) * | 2017-01-31 | 2019-07-18 | Ricoh Co., Ltd. | Imaging device and imaging device control method |
WO2018150659A1 (ja) | 2017-02-14 | 2018-08-23 | Sony Corporation | Information processing device, information processing method, and program |
JP2019519125A (ja) * | 2017-03-07 | 2019-07-04 | Linkflow Co., Ltd. | Omnidirectional image capturing method and device for performing the method |
US11589034B2 (en) | 2017-06-12 | 2023-02-21 | Interdigital Madison Patent Holdings, Sas | Method and apparatus for providing information to a user observing a multi view content |
JP2020523957A (ja) * | 2017-06-12 | 2020-08-06 | InterDigital CE Patent Holdings | Method and apparatus for presenting information to a user observing multi-view content |
JP7293208B2 (ja) | 2017-06-12 | 2023-06-19 | InterDigital Madison Patent Holdings, SAS | Method and apparatus for presenting information to a user observing multi-view content |
JP2020526096A (ja) * | 2017-06-29 | 2020-08-27 | Linkflow Co., Ltd. | Imaging method with optimal situation determination and device for performing such a method |
US11076097B2 (en) | 2017-08-23 | 2021-07-27 | Linkflow Co., Ltd. | Method for street view service and apparatus for performing same method |
JP2020532219A (ja) * | 2017-08-23 | 2020-11-05 | Linkflow Co., Ltd. | Street view service method and device for performing such a method |
JP2019046248A (ja) * | 2017-09-04 | 2019-03-22 | Colopl, Inc. | Method for providing a virtual space, program, and information processing device for executing the program |
US11480787B2 (en) | 2018-03-26 | 2022-10-25 | Sony Corporation | Information processing apparatus and information processing method |
WO2019188229A1 (ja) | 2018-03-26 | 2019-10-03 | Sony Corporation | Information processing device, information processing method, and program |
KR20200133338A (ko) | 2018-03-26 | 2020-11-27 | Sony Corporation | Information processing device, information processing method, and program |
WO2021014673A1 (ja) * | 2019-07-25 | 2021-01-28 | NTT Communications Corporation | Video display control device, method and program |
US11610343B2 (en) | 2019-07-25 | 2023-03-21 | Ntt Communications Corporation | Video display control apparatus, method, and non-transitory computer readable medium |
JP7485527B2 (ja) | 2020-03-19 | 2024-05-16 | LINE Yahoo Corporation | Providing device, providing method and providing program |
WO2023100220A1 (ja) * | 2021-11-30 | 2023-06-08 | Nippon Telegraph And Telephone Corporation | Video processing device, method and program |
WO2024201814A1 (ja) * | 2023-03-29 | 2024-10-03 | Maxell, Ltd. | Video sharing system and video sharing method |
Also Published As
Publication number | Publication date |
---|---|
EP3109744A1 (en) | 2016-12-28 |
CN105980962B (zh) | 2019-07-23 |
US20160284048A1 (en) | 2016-09-29 |
RU2016133123A (ru) | 2018-02-16 |
KR20160122702A (ko) | 2016-10-24 |
JP6515813B2 (ja) | 2019-05-22 |
US10389937B2 (en) | 2019-08-20 |
US20160301865A1 (en) | 2016-10-13 |
RU2683262C2 (ru) | 2019-03-27 |
US9787895B2 (en) | 2017-10-10 |
EP3109744B1 (en) | 2020-09-09 |
US10574889B2 (en) | 2020-02-25 |
US20190349525A1 (en) | 2019-11-14 |
CN105980962A (zh) | 2016-09-28 |
RU2016133123A3 (ja) | 2018-09-04 |
EP3109744A4 (en) | 2017-08-02 |
JPWO2015122108A1 (ja) | 2017-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10574889B2 (en) | Information processing device, information processing method, and program | |
US10460512B2 (en) | 3D skeletonization using truncated epipolar lines | |
US11189055B2 (en) | Information processing apparatus and method and program | |
US11170580B2 (en) | Information processing device, information processing method, and recording medium | |
CN107636534A (zh) | General spherical capture method | |
US10681276B2 (en) | Virtual reality video processing to compensate for movement of a camera during capture | |
JP7249755B2 (ja) | Image processing system, control method therefor, and program | |
CN108629830A (zh) | Three-dimensional environment information display method and device | |
JP2020008972A (ja) | Information processing device, information processing method and program | |
JP7378243B2 (ja) | Image generation device, image display device and image processing method | |
JP7353782B2 (ja) | Information processing device, information processing method, and program | |
CN115191006B (zh) | 3D models for displayed 2D elements | |
CN117319790A (zh) | Shooting method, apparatus, device and medium based on virtual reality space | |
JP2023550773A (ja) | Image-based finger tracking and controller tracking | |
JP6518645B2 (ja) | Information processing device and image generation method | |
JP2018033107A (ja) | Video distribution device and distribution method | |
WO2014008438A1 (en) | Systems and methods for tracking user postures and motions to control display of and navigate panoramas | |
WO2023248832A1 (ja) | Remote viewing system and on-site imaging system | |
JP2022186326A (ja) | Information processing device and image generation method | |
JP2021125225A (ja) | Image display device, image display method and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14882561 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14906967 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2015562709 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20167018597 Country of ref document: KR Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2014882561 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014882561 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016133123 Country of ref document: RU Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112016018265 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112016018265 Country of ref document: BR Kind code of ref document: A2 Effective date: 20160808 |