WO2017199352A1 - Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program - Google Patents
- Publication number
- WO2017199352A1 (PCT/JP2016/064661)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- omnidirectional
- subject
- omnidirectional camera
- distance
- captured image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
- H04N5/2226—Determination of depth image, e.g. for foreground/background separation
Definitions
- the present invention relates to an omnidirectional camera captured image display system, an omnidirectional camera captured image display method, and a program for displaying captured images captured by a plurality of omnidirectional cameras.
- As background art, there is an omnidirectional camera that can capture a 360-degree panoramic image in all directions.
- A 360-degree panoramic image is obtained by capturing an omnidirectional image with an imaging device that combines a plurality of cameras into one device, or with an imaging device equipped with a plurality of special lenses, and the result can be displayed as a display image.
- In Patent Document 1, however, since the depth of the subject displayed in the omnidirectional image is unknown, it was difficult to know the distance from the omnidirectional camera to the subject and to create a 3D model corresponding to the subject displayed in the omnidirectional image.
- An object of the present invention is to provide an omnidirectional camera captured image display system, an omnidirectional camera captured image display method, and a program that display the distance from the position of the omnidirectional camera to the subject, so that the distance from the omnidirectional camera to the subject can be known and a 3D model corresponding to the subject displayed in the omnidirectional image can be created.
- the present invention provides the following solutions.
- The invention according to the first feature provides an omnidirectional camera captured image display system that displays captured images obtained by imaging a subject with a plurality of omnidirectional cameras, the system comprising distance display means for displaying, on the subject displayed in the captured image, the distance from the omnidirectional camera to that subject.
- According to the invention of the first feature, the omnidirectional camera captured image display system that displays a captured image obtained by imaging a subject with a plurality of omnidirectional cameras displays, on the subject displayed in the captured image, the distance from the omnidirectional camera to that subject.
- Although the invention according to the first feature belongs to the category of an omnidirectional camera captured image display system, it exhibits the same actions and effects in other categories such as a method or a program.
- The invention according to the second feature provides an omnidirectional camera captured image display system that displays a 3D model of a subject created from captured images obtained by imaging the subject with a plurality of omnidirectional cameras, the system comprising distance display means for displaying, on the subject displayed as the 3D model, the distance from the omnidirectional camera to that subject.
- According to the invention of the second feature, the omnidirectional camera captured image display system that displays a 3D model of a subject created from captured images obtained by imaging the subject with a plurality of omnidirectional cameras displays, on the subject displayed as the 3D model, the distance from the omnidirectional camera to that subject.
- Although the invention according to the second feature belongs to the category of an omnidirectional camera captured image display system, it exhibits the same actions and effects in other categories such as a method or a program.
- The invention according to the third feature provides the omnidirectional camera captured image display system according to either the first or second feature, further comprising receiving means for receiving a user operation and switching means for switching the display of the distance ON or OFF in response to the received user operation.
- According to the invention of the third feature, the omnidirectional camera captured image display system according to either the first or second feature receives a user operation and, in response to it, switches the display of the distance ON or OFF.
- The invention according to the fourth feature provides the omnidirectional camera captured image display system according to either the first or second feature, further comprising a distance correction unit that corrects the distance based on distortion of the captured image.
- According to the invention of the fourth feature, the omnidirectional camera captured image display system according to either the first or second feature corrects the distance based on distortion of the captured image.
- The invention according to the fifth feature provides the omnidirectional camera captured image display system according to either the first or second feature, further comprising direction correcting means for making the directions of the plurality of omnidirectional cameras parallel.
- According to the invention of the fifth feature, the omnidirectional camera captured image display system makes the directions of the plurality of omnidirectional cameras parallel.
- The invention according to the sixth feature provides the omnidirectional camera captured image display system according to either the first or second feature, further comprising measuring means for measuring the distance between the plurality of omnidirectional cameras.
- According to the invention of the sixth feature, the omnidirectional camera captured image display system measures the distance between the plurality of omnidirectional cameras.
- The invention according to the seventh feature provides an omnidirectional camera captured image display method for displaying captured images obtained by imaging a subject with a plurality of omnidirectional cameras, the method comprising the step of displaying, on the subject displayed in the captured image, the distance from the omnidirectional camera to that subject.
- The invention according to the eighth feature provides an omnidirectional camera captured image display method for displaying a 3D model of a subject created from captured images obtained by imaging the subject with a plurality of omnidirectional cameras, the method comprising the step of displaying, on the subject displayed as the 3D model, the distance from the omnidirectional camera to that subject.
- The invention according to the ninth feature provides a program that causes an omnidirectional camera captured image display system for displaying captured images obtained by imaging a subject with a plurality of omnidirectional cameras to execute the step of displaying, on the subject displayed in the captured image, the distance from the omnidirectional camera to that subject.
- The invention according to the tenth feature provides a program that causes an omnidirectional camera captured image display system that displays a 3D model of a subject created from captured images obtained by imaging the subject with a plurality of omnidirectional cameras to execute the step of displaying, on the subject displayed as the 3D model, the distance from the omnidirectional camera to that subject.
- According to the present invention, the distance from the position of the omnidirectional camera to the subject is known, making it possible to provide an omnidirectional camera captured image display system, an omnidirectional camera captured image display method, and a program that facilitate the creation of a 3D model corresponding to the subject displayed in the omnidirectional image.
- FIG. 1 is a diagram showing an overview of an omnidirectional camera captured image display system 1.
- FIG. 2 is an overall configuration diagram of the omnidirectional camera captured image display system 1.
- FIG. 3 is a functional block diagram of the omnidirectional camera 100 and the information terminal 200.
- FIG. 4 is a flowchart showing captured image display processing executed by the omnidirectional camera 100 and the information terminal 200.
- FIG. 5 is a flowchart showing 3D model display processing executed by the omnidirectional camera 100 and the information terminal 200.
- FIG. 6 is a diagram illustrating an example of a distance measurement method executed by the information terminal 200.
- FIG. 7 is a diagram illustrating an example of the distance displayed by the information terminal 200.
- FIG. 1 is a diagram for explaining an overview of an omnidirectional camera captured image display system 1 which is a preferred embodiment of the present invention.
- the omnidirectional camera captured image display system 1 includes omnidirectional cameras 100a and 100b (hereinafter simply referred to as the omnidirectional camera 100 unless otherwise specified) and an information terminal 200.
- The omnidirectional camera 100a and the omnidirectional camera 100b are preferably arranged in parallel; if they are not, the omnidirectional camera captured image display system 1 can execute a correction to arrange them in parallel. Further, the omnidirectional camera 100a or the omnidirectional camera 100b and the information terminal 200 need not be separate devices and may be an integrated terminal device.
- The number of omnidirectional cameras 100 is not limited to one or two and may be greater.
- the information terminal 200 is not limited to one and may be plural. Further, the information terminal 200 may be realized by either or both of a real device and a virtual device. In addition, each process described below may be realized by either or both of the omnidirectional camera 100 and the information terminal 200.
- the omnidirectional camera 100 is an imaging device that is capable of data communication with the information terminal 200 and has a configuration in which a plurality of cameras are combined into a single camera, or a configuration including a plurality of special lenses.
- The omnidirectional camera 100 is an imaging device that can capture an omnidirectional image, that is, a 360-degree panoramic image, by capturing images in all directions.
- the omnidirectional camera 100 may be an imaging device that can capture a 360-degree panoramic image by capturing images of each orientation from a certain point and combining the captured images of each orientation.
- the omnidirectional camera 100 may be an imaging device capable of capturing a 360-degree panoramic image with other configurations.
- the information terminal 200 is a terminal device capable of data communication with the omnidirectional camera 100 and capable of displaying a 360-degree panoramic image captured by the omnidirectional camera 100.
- The information terminal 200 is, for example, a mobile phone, a portable information terminal, a tablet terminal, a personal computer, an electronic product such as a netbook terminal, a slate terminal, an electronic book terminal, or a portable music player, a wearable terminal such as smart glasses or a head-mounted display worn by the operator, or another such article.
- the omnidirectional camera 100 receives an input from the operator and captures an omnidirectional image (step S01). There are a plurality of subjects in the omnidirectional image.
- the subject is, for example, a tree, a building, a person, or a landscape.
- the omnidirectional camera 100 transmits omnidirectional image data, which is data of the captured omnidirectional image, to the information terminal 200 (step S02).
- The information terminal 200 measures the distance from the omnidirectional camera 100 to the subject based on the omnidirectional image data received from the omnidirectional camera 100a, the omnidirectional image data received from the omnidirectional camera 100b, and the distance between the omnidirectional camera 100a and the omnidirectional camera 100b.
- The information terminal 200 displays the omnidirectional image based on the omnidirectional image data, and displays the measured distance on the subject displayed in the omnidirectional image (step S03).
- the information terminal 200 may be configured to create and display a 3D model of a subject included in the omnidirectional image data based on the omnidirectional image data. In this case, the information terminal 200 displays the 3D model of each subject based on the omnidirectional image data, and displays the measured distance on the 3D model.
- FIG. 2 is a diagram showing a system configuration of the omnidirectional camera captured image display system 1 which is a preferred embodiment of the present invention.
- The omnidirectional camera captured image display system 1 includes a plurality of omnidirectional cameras 100a and 100b (hereinafter referred to as the omnidirectional camera 100 unless otherwise specified), an information terminal 200, and a public line network 5 (an Internet network, a third- or fourth-generation communication network, or the like).
- the number of omnidirectional cameras 100 is not limited to two, and may be one or three or more. Further, the number of information terminals 200 is not limited to one and may be plural.
- the information terminal 200 may be realized by either or both of an actual device and a virtual device. In addition, each process described below may be realized by either or both of the omnidirectional camera 100 and the information terminal 200.
- the omnidirectional camera captured image display system 1 may have a configuration in which a server or the like exists in addition to the configuration described above. In this case, for example, each process described below may be executed by any one or a combination of the omnidirectional camera 100, the information terminal 200, and the server.
- the omnidirectional camera 100 has the functions described later and is the above-described imaging device.
- the information terminal 200 is the above-described terminal device having functions described later.
- FIG. 3 is a functional block diagram of the omnidirectional camera 100 and the information terminal 200.
- The omnidirectional camera 100 includes, as the control unit 110, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and, as the communication unit 120, a device capable of communicating with other devices.
- The omnidirectional camera 100 includes, as the input/output unit 130, a display unit that outputs and displays data and images controlled by the control unit 110, an input unit such as a touch panel, a keyboard, or a mouse that receives input from the user, and an imaging device for imaging the subject.
- the omnidirectional camera 100 implements the data transmission module 150 and the correction instruction reception module 151 in cooperation with the communication unit 120 by the control unit 110 reading a predetermined program.
- the omnidirectional camera 100 implements the imaging module 160 and the orientation adjustment module 161 in cooperation with the input / output unit 130 when the control unit 110 reads a predetermined program.
- the information terminal 200 includes a CPU, RAM, ROM, etc. as the control unit 210, a wireless compatible device, etc. as the communication unit 220, and a display unit, An input unit and the like are provided.
- the information terminal 200 implements the data reception module 250 and the correction instruction transmission module 251 in cooperation with the communication unit 220 when the control unit 210 reads a predetermined program.
- The information terminal 200 realizes the parallel determination module 260, the distance measurement module 261, the distortion determination module 262, the correction module 263, the display module 264, the input reception module 265, and the 3D model creation module 266 in cooperation with the input/output unit 230 when the control unit 210 reads a predetermined program.
- FIG. 4 is a diagram illustrating a flowchart of captured image display processing executed by the omnidirectional camera 100 and the information terminal 200. The processing executed by the modules of each device described above will be described together with this processing.
- the imaging module 160 accepts an input from the operator, images the subject, and captures an omnidirectional image (step S10).
- the subject is, for example, a tree, a building, a person, or a landscape.
- the omnidirectional image is a 360-degree panoramic image.
- the omnidirectional camera 100a and the omnidirectional camera 100b each capture an omnidirectional image.
- The omnidirectional camera 100 may capture the omnidirectional image upon receiving an imaging instruction that the operator inputs to a terminal device such as a controller, upon receiving an imaging instruction from the information terminal 200, or by another configuration.
- the data transmission module 150 transmits the captured omnidirectional image as omnidirectional image data to the information terminal 200 (step S11).
- each of the omnidirectional camera 100a and the omnidirectional camera 100b transmits omnidirectional image data.
- the data receiving module 250 receives a plurality of omnidirectional image data.
- the parallel determination module 260 determines whether or not the directions of the omnidirectional cameras 100a and 100b are parallel to each other based on the received plurality of omnidirectional image data (step S12).
- the parallel determination module 260 analyzes the received plurality of omnidirectional image data and extracts image data of one subject.
- the parallel determination module 260 determines whether or not the omnidirectional cameras 100a and 100b are parallel based on the extracted image data of one subject. That is, the parallel determination module 260 determines whether or not they are parallel based on whether or not the feature amounts of the image data of one subject match.
- the parallel determination module 260 may determine whether or not the omnidirectional cameras 100a and 100b are parallel by a configuration other than the determination method described above. Further, even when there are three or more omnidirectional cameras 100, it is possible to determine which omnidirectional cameras 100 are not parallel by executing the same processing.
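The patent leaves the feature comparison itself open. As a minimal, purely illustrative sketch (the function name, vector representation, and tolerance below are assumptions, not the patented method), parallelism could be judged by whether feature vectors extracted from the same subject in both omnidirectional images agree within a tolerance:

```python
import math

def features_match(feat_a, feat_b, tol=0.05):
    """Return True when two feature vectors of the same subject agree
    within a relative tolerance, suggesting the cameras are parallel."""
    if len(feat_a) != len(feat_b):
        return False
    # Normalised Euclidean distance between the two feature vectors.
    diff = math.sqrt(sum((a - b) ** 2 for a, b in zip(feat_a, feat_b)))
    scale = math.sqrt(sum(a * a for a in feat_a)) or 1.0
    return diff / scale <= tol

# The same subject seen identically by both cameras -> judged parallel.
print(features_match([0.2, 0.5, 0.3], [0.2, 0.5, 0.3]))  # True
# The same subject skewed in one image -> judged not parallel.
print(features_match([0.2, 0.5, 0.3], [0.5, 0.2, 0.3]))  # False
```

In practice the feature amounts would come from a feature detector applied to the extracted subject image data; this sketch only shows the comparison step.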
- When the parallel determination module 260 determines in step S12 that they are not parallel (NO in step S12), the correction instruction transmission module 251 transmits to the omnidirectional camera 100 a correction instruction for correcting the orientation of either or both of the omnidirectional camera 100a and the omnidirectional camera 100b (step S13).
- In step S13, the correction instruction transmission module 251 transmits, for example, an instruction to change the imaging direction of the imaging module 160, to change the position of a lens constituting the imaging module 160, or to change the position of the omnidirectional camera 100.
- the correction instruction receiving module 151 receives the correction instruction. Based on the received correction instruction, the orientation adjustment module 161 corrects the orientations of the omnidirectional cameras 100a and 100b so that the omnidirectional camera 100a and the omnidirectional camera 100b are parallel to each other (step S14).
- the imaging module 160 captures an omnidirectional image in the corrected orientation (step S15).
- In step S15, not only the corrected omnidirectional camera 100 but also the uncorrected omnidirectional camera 100 may capture an omnidirectional image.
- the data transmission module 150 transmits the omnidirectional image data of the captured omnidirectional image to the information terminal 200 (step S16).
- The data receiving module 250 receives the omnidirectional image data, and the information terminal 200 executes step S17 described later.
- When the parallel determination module 260 determines in step S12 that they are parallel (YES in step S12), the distance measurement module 261 measures the distance from the omnidirectional camera 100 to the subject (step S17).
- the distance measurement method is not limited to the method of the present embodiment, and may be executed by other methods.
- the information terminal 200 measures the distance X between the omnidirectional camera 100a and the omnidirectional camera 100b. For example, the information terminal 200 acquires position information of each of the omnidirectional camera 100a and the omnidirectional camera 100b, and measures the distance X based on the acquired position information.
- the information terminal 200 may be configured such that the distance X is set in advance and the set distance X is acquired.
- the information terminal 200 may be configured to acquire the distance X from an external device such as a server. Further, the information terminal 200 may be configured to measure the distance X by other configurations.
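The patent does not fix how the distance X is computed from the acquired position information. One hedged sketch (the equirectangular approximation and all names below are assumptions) converts two nearby GPS fixes into a baseline in metres:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius in metres

def baseline_from_gps(lat_a, lon_a, lat_b, lon_b):
    """Approximate distance X (metres) between two nearby cameras from
    their latitude/longitude, using an equirectangular projection that
    is accurate for the short baselines expected here."""
    mean_lat = math.radians((lat_a + lat_b) / 2.0)
    dx = math.radians(lon_b - lon_a) * math.cos(mean_lat) * EARTH_RADIUS_M
    dy = math.radians(lat_b - lat_a) * EARTH_RADIUS_M
    return math.hypot(dx, dy)

# Two cameras offset by 1e-5 degrees of latitude, roughly 1.11 m apart.
x = baseline_from_gps(35.0, 139.0, 35.00001, 139.0)
print(round(x, 2))  # ~1.11
```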
- the information terminal 200 extracts the partial image 400 having the angle of view Z including the subject 300 for measuring the distance at one end from the omnidirectional image data acquired from the omnidirectional camera 100a. In addition, the information terminal 200 extracts a partial image 410 having an angle of view Z including the subject 300 whose distance is to be measured at one end from the omnidirectional image data acquired from the omnidirectional camera 100b.
- the information terminal 200 creates a superimposed image 420 in which the subject 300 included in the extracted partial images 400 and 410 is superimposed.
- the information terminal 200 measures the distance Y from the omnidirectional camera 100 to the subject based on the distance X and the angle of view Z of the omnidirectional camera 100.
- the information terminal 200 measures the distance Y for all subjects existing in the omnidirectional image data.
- the information terminal 200 may be configured to measure the distance Y based on the distance X. Further, the information terminal 200 may be configured to measure the distance Y by another configuration.
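The patent does not state the triangulation formula, so the following is only a standard stereo-parallax sketch under stated assumptions: the two cameras lie on a common baseline of length X, and the subject's bearing relative to the baseline normal (derivable from its position within the angle of view Z of each partial image) is known at each camera:

```python
import math

def distance_to_subject(baseline_x, bearing_a_deg, bearing_b_deg):
    """Triangulate the perpendicular distance Y to a subject seen by two
    cameras separated by baseline_x. Bearings are measured from each
    camera's baseline normal, signed inward toward the other camera."""
    ta = math.tan(math.radians(bearing_a_deg))
    tb = math.tan(math.radians(bearing_b_deg))
    denom = ta + tb
    if denom <= 0:
        raise ValueError("rays do not converge in front of the cameras")
    return baseline_x / denom

# Cameras 2 m apart, subject seen 45 degrees inward from each normal:
# the two rays meet 1 m in front of the baseline.
print(round(distance_to_subject(2.0, 45.0, 45.0), 6))  # 1.0
```

Repeating this for every subject found in both omnidirectional images corresponds to measuring the distance Y for all subjects.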
- the distortion determination module 262 determines whether or not there is distortion in the subject included in the omnidirectional image data (step S18).
- the distortion is, for example, barrel aberration, pincushion aberration, vignetting, chromatic aberration, or the like.
- When there is distortion, the correction module 263 corrects the distortion of the subject and corrects the distance Y measured in step S17 based on the distortion (step S19), and the process proceeds to step S20 described later.
- the correction module 263 corrects the distance Y for all subjects with distortion.
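The correction itself is not specified in the text. A common model for the barrel distortion named above is a radial division model, where the observed image radius is rescaled before use; the sketch below (coefficient k and the rescaling of Y are illustrative assumptions, not the patented procedure) applies the same factor to the measured distance:

```python
def undistort_radius(r_distorted, k=-0.18):
    """Division-model correction for barrel distortion:
    r_undistorted = r_distorted / (1 + k * r_distorted**2),
    with r in normalised image coordinates and k < 0 for barrel."""
    return r_distorted / (1.0 + k * r_distorted ** 2)

def corrected_distance(distance_y, r_distorted, k=-0.18):
    """Rescale a distance measured from a distorted image position by
    the ratio between the distorted and undistorted radii."""
    r_u = undistort_radius(r_distorted, k)
    return distance_y * (r_distorted / r_u)

# Near the image centre the correction is negligible ...
print(round(corrected_distance(10.0, 0.05), 3))
# ... but grows toward the edge of the fisheye frame.
print(round(corrected_distance(10.0, 0.8), 3))
```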
- The display module 264 displays the omnidirectional image based on the omnidirectional image data, and displays the distance from the omnidirectional camera 100 to the subject measured in step S17 (step S20).
- the display module 264 displays an omnidirectional image captured by either the omnidirectional camera 100a or the omnidirectional camera 100b.
- the display module 264 may be configured to combine the omnidirectional images captured by the omnidirectional camera 100a and the omnidirectional camera 100b and display the combined omnidirectional image.
- FIG. 7 is a diagram illustrating a state in which the display module 264 displays the subject 300 and the distance display area 500 from the omnidirectional camera 100 to the subject 300.
- Although only the subject 300 is shown displayed in the omnidirectional image, other subjects may also be displayed; in the following description, the same configuration applies to the other subjects.
- the display module 264 displays the distance display area 500 so as to overlap with the vicinity of the subject 300 or a part of the subject 300.
- the vicinity is, for example, a periphery that does not overlap the subject 300.
- the distance display area 500 is an area for displaying the distance from the omnidirectional camera 100 to the subject 300.
- The distance display area 500 may be displayed in an area different from the display area of the omnidirectional image. In this case, any configuration may be used as long as the distance display area 500 indicates the distance of the subject with an arrow, a leader line, a symbol, or the like.
- the display position and shape of the distance display area 500 can be changed as appropriate. Further, the distance display area 500 may be configured to display only on a predetermined subject, or may be configured to display on all subjects.
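The placement rule for the distance display area 500 is left open; one illustrative sketch (all geometry, names, and the fallback rule are assumptions) places the label just above the subject's bounding box, overlapping the subject only when there is no room:

```python
def place_distance_label(subject_box, label_size, image_size):
    """Return (x, y) for the top-left of a distance label placed just
    above the subject's bounding box, clamped inside the image frame.
    When there is no room above, the label overlaps the subject's top
    edge instead. Boxes are (x, y, width, height); sizes (width, height)."""
    sx, sy, sw, sh = subject_box
    lw, lh = label_size
    iw, ih = image_size
    x = min(max(sx, 0), iw - lw)  # keep the label inside the frame
    y = sy - lh                   # preferred spot: just above the subject
    if y < 0:                     # no room above -> overlap the subject
        y = sy
    return x, y

# Subject comfortably inside a 1920x960 equirectangular frame.
print(place_distance_label((800, 400, 200, 300), (120, 40), (1920, 960)))  # (800, 360)
# Subject at the top edge: label overlaps the subject instead.
print(place_distance_label((800, 10, 200, 300), (120, 40), (1920, 960)))   # (800, 10)
```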
- The input reception module 265 determines whether or not an input from the operator for switching the display of the distance ON or OFF has been received (step S21).
- When the input reception module 265 determines in step S21 that the input has been received (step S21: YES), it switches the display of the distance based on the input content (step S22).
- step S22 if the received input is display ON, the distance is displayed in the vicinity of the subject. If the received input is display OFF, the distance displayed in the vicinity of the subject is hidden. After switching the display, the input receiving module 265 executes the process of step S21 again.
- the operator may designate one subject or a plurality of subjects, and the distance of the designated subject may be switched ON / OFF.
- When the input reception module 265 determines in step S21 that the input has not been received (step S21: NO), it determines whether an input for ending the display of the omnidirectional image has been received (step S23).
- step S23 when the input reception module 265 determines that the input is not received (step S23: NO), the process of step S21 described above is executed again.
- When the input reception module 265 determines in step S23 that the input has been received (step S23: YES), the display processing ends.
- the above is the captured image display processing.
- FIG. 5 is a diagram illustrating a flowchart of 3D model display processing executed by the omnidirectional camera 100 and the information terminal 200. The processing executed by the modules of each device described above will be described together with this processing. Note that detailed description of processing similar to the captured image display processing described above is omitted.
- the omnidirectional camera 100 and the information terminal 200 execute the processing from step S10 to step S19 described above (step S30 to step S39). Since the processes in steps S30 to S39 are the same as the processes in steps S10 to S19 described above, detailed description thereof is omitted.
- the 3D model creation module 266 creates a 3D model of each subject based on the omnidirectional image data (step S40).
- the 3D model creation module 266 creates a 3D model using, for example, a solid, a surface, a wire frame, and a polygon.
- the 3D model creation module 266 creates a 3D model of each subject based on the omnidirectional image data captured by either the omnidirectional camera 100a or the omnidirectional camera 100b.
- The 3D model creation module 266 may be configured to combine the omnidirectional image data captured by the omnidirectional camera 100a and the omnidirectional camera 100b and to create the 3D model based on the combined omnidirectional image data.
- the display module 264 displays the created 3D model of the subject in place of the subject in the omnidirectional image (step S41). That is, in step S41, the display module 264 displays the 3D model of each subject as an omnidirectional image.
- the display module 264 displays the distance from the omnidirectional camera 100 measured in step S37 to the subject on the 3D model (step S42).
- the processing in step S42 has the same configuration as that in step S20 described above except that the image of the subject to be displayed is changed to the 3D model, and thus detailed description thereof is omitted.
- The input reception module 265 determines whether or not an input from the operator for switching the display of the distance ON or OFF has been received (step S43). When the input reception module 265 determines in step S43 that the input has been received (step S43: YES), it switches the display of the distance based on the input content (step S44). Since the processes of steps S43 and S44 are the same as those of steps S21 and S22 described above except that the displayed subject image is replaced by the 3D model, detailed description thereof is omitted.
- when the input reception module 265 determines in step S43 that no switching input has been received (step S43: NO), it determines whether an input for ending the display of the 3D model has been received (step S45).
- when the input reception module 265 determines in step S45 that such an input has not been received (step S45: NO), the processing returns to step S42 described above. Since the processing in step S45 is the same as the processing in step S23 described above, detailed description thereof is omitted.
- when the input reception module 265 determines in step S45 that such an input has been received (step S45: YES), the 3D model display processing ends.
- the means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
- the program is provided in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.).
- the computer reads the program from the recording medium, transfers and stores it in an internal or external storage device, and executes it.
- the program may be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to a computer via a communication line.
Abstract
Description
The present invention provides an omnidirectional camera captured image display system comprising:
distance display means for displaying, on a subject displayed as the captured image, a distance from the omnidirectional camera to the subject.
The present invention also provides an omnidirectional camera captured image display system comprising:
distance display means for displaying, on a subject displayed as the 3D model, a distance from the omnidirectional camera to the subject.
switching means for switching display of the distance ON/OFF upon receiving the user operation,
whereby there is provided an omnidirectional camera captured image display system according to the invention of either the first or second feature, characterized by comprising the above.
There is provided an omnidirectional camera captured image display system according to the invention of either the first or second feature, characterized by comprising the above.
There is provided an omnidirectional camera captured image display system according to the invention of either the first or second feature, characterized by comprising the above.
There is provided an omnidirectional camera captured image display system according to the invention of either the first or second feature, characterized by comprising the above.
The present invention further provides an omnidirectional camera captured image display method comprising:
a step of displaying, on a subject displayed as the captured image, a distance from the omnidirectional camera to the subject.
The present invention further provides an omnidirectional camera captured image display method comprising:
a step of displaying, on a subject displayed as the 3D model, a distance from the omnidirectional camera to the subject.
The present invention further provides a program characterized by causing execution of:
a step of displaying, on a subject displayed as the captured image, a distance from the omnidirectional camera to the subject.
The present invention further provides a program characterized by causing execution of:
a step of displaying, on a subject displayed as the 3D model, a distance from the omnidirectional camera to the subject.
An overview of the omnidirectional camera captured image display system 1 according to a preferred embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 is a diagram for explaining an overview of the omnidirectional camera captured image display system 1 according to a preferred embodiment of the present invention. The omnidirectional camera captured image display system 1 includes omnidirectional cameras 100a and 100b (hereinafter simply referred to as the omnidirectional camera 100 unless otherwise noted) and an information terminal 200.
The system configuration of the omnidirectional camera captured image display system 1 will be described with reference to FIG. 2. FIG. 2 is a diagram showing the system configuration of the omnidirectional camera captured image display system 1 according to a preferred embodiment of the present invention. The omnidirectional camera captured image display system 1 includes a plurality of omnidirectional cameras 100a and 100b (hereinafter referred to as the omnidirectional camera 100 unless otherwise noted), an information terminal 200, and a public network 5 (such as the Internet or third- and fourth-generation communication networks).
The functions of the omnidirectional camera captured image display system 1 according to a preferred embodiment of the present invention will be described with reference to FIG. 3. FIG. 3 is a functional block diagram of the omnidirectional camera 100 and the information terminal 200.
The captured image display processing executed by the omnidirectional camera 100 and the information terminal 200 will be described with reference to FIG. 4. FIG. 4 is a flowchart of the captured image display processing executed by the omnidirectional camera 100 and the information terminal 200. The processing executed by the modules of each device described above will be described together with this processing.
Next, the 3D model display processing executed by the omnidirectional camera captured image display system 1 described above will be described with reference to FIG. 5. FIG. 5 is a flowchart of the 3D model display processing executed by the omnidirectional camera 100 and the information terminal 200. The processing executed by the modules of each device described above will be described together with this processing. Detailed description of processing similar to the captured image display processing described above is omitted.
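Measuring the distance to a subject from two omnidirectional captures requires mapping each panorama pixel back to a viewing direction in space. A minimal sketch, assuming equirectangular omnidirectional images (the patent does not specify a projection; the function name and coordinate convention are illustrative assumptions):

```python
import math

def equirect_to_direction(u, v, width, height):
    """Map an equirectangular pixel (u, v) to a unit viewing direction.

    u in [0, width): longitude axis; v in [0, height): latitude axis.
    Returns an (x, y, z) unit vector in the camera frame.
    """
    lon = (u / width) * 2.0 * math.pi - math.pi      # -pi .. +pi
    lat = math.pi / 2.0 - (v / height) * math.pi     # +pi/2 .. -pi/2
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))
```

Given such direction vectors for the same subject in both cameras' panoramas, plus the baseline between the cameras, the distance to the subject follows from triangulating the two rays.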
Claims (10)
- An omnidirectional camera captured image display system that displays a captured image of a subject captured by a plurality of omnidirectional cameras, the system comprising:
distance display means for displaying, on the subject displayed as the captured image, a distance from the omnidirectional camera to the subject.
- An omnidirectional camera captured image display system that displays a 3D model of a subject created from captured images of the subject captured by a plurality of omnidirectional cameras, the system comprising:
distance display means for displaying, on the subject displayed as the 3D model, a distance from the omnidirectional camera to the subject.
- The omnidirectional camera captured image display system according to claim 1 or 2, further comprising:
reception means for receiving a user operation; and
switching means for switching display of the distance ON/OFF upon receiving the user operation.
- The omnidirectional camera captured image display system according to claim 1 or 2, further comprising:
distance correction means for correcting the distance based on distortion of the captured image.
- The omnidirectional camera captured image display system according to claim 1 or 2, further comprising:
orientation correction means for making the orientations of the plurality of omnidirectional cameras parallel.
- The omnidirectional camera captured image display system according to claim 1 or 2, further comprising:
measurement means for measuring a distance between the plurality of omnidirectional cameras.
- An omnidirectional camera captured image display method for displaying a captured image of a subject captured by a plurality of omnidirectional cameras, the method comprising:
a step of displaying, on the subject displayed as the captured image, a distance from the omnidirectional camera to the subject.
- An omnidirectional camera captured image display method for displaying a 3D model of a subject created from captured images of the subject captured by a plurality of omnidirectional cameras, the method comprising:
a step of displaying, on the subject displayed as the 3D model, a distance from the omnidirectional camera to the subject.
- A program for causing an omnidirectional camera captured image display system that displays a captured image of a subject captured by a plurality of omnidirectional cameras to execute:
a step of displaying, on the subject displayed as the captured image, a distance from the omnidirectional camera to the subject.
- A program for causing an omnidirectional camera captured image display system that displays a 3D model of a subject created from captured images of the subject captured by a plurality of omnidirectional cameras to execute:
a step of displaying, on the subject displayed as the 3D model, a distance from the omnidirectional camera to the subject.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/064661 WO2017199352A1 (ja) | 2016-05-17 | 2016-05-17 | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program |
JP2018517983A JP6404525B2 (ja) | 2016-05-17 | 2016-05-17 | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program |
US16/057,981 US20180352158A1 (en) | 2016-05-17 | 2018-08-08 | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/064661 WO2017199352A1 (ja) | 2016-05-17 | 2016-05-17 | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/057,981 Continuation-In-Part US20180352158A1 (en) | 2016-05-17 | 2018-08-08 | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017199352A1 true WO2017199352A1 (ja) | 2017-11-23 |
Family
ID=60325114
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/064661 WO2017199352A1 (ja) | 2016-05-17 | 2016-05-17 | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program |
Country Status (3)
Country | Link |
---|---|
US (1) | US20180352158A1 (ja) |
JP (1) | JP6404525B2 (ja) |
WO (1) | WO2017199352A1 (ja) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11455074B2 (en) * | 2020-04-17 | 2022-09-27 | Occipital, Inc. | System and user interface for viewing and interacting with three-dimensional scenes |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007024647A (ja) * | 2005-07-14 | 2007-02-01 | Iwate Univ | Distance calculation device, distance calculation method, structure analysis device, and structure analysis method |
JP2007263669A (ja) * | 2006-03-28 | 2007-10-11 | Denso It Laboratory Inc | Three-dimensional coordinate acquisition device |
JP2008304248A (ja) * | 2007-06-06 | 2008-12-18 | Konica Minolta Holdings Inc | Calibration method for in-vehicle stereo camera, in-vehicle distance image generation device, and program |
JP2011192228A (ja) * | 2010-03-17 | 2011-09-29 | Casio Computer Co Ltd | Three-dimensional modeling device, three-dimensional modeling method, and program |
WO2014171052A1 (ja) * | 2013-04-16 | 2014-10-23 | Konica Minolta, Inc. | Image processing method, image processing device, imaging device, and image processing program |
JP2014224410A (ja) * | 2013-05-16 | 2014-12-04 | Sumitomo Construction Machinery Co., Ltd. | Perimeter monitoring device for work machine |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10115506A (ja) * | 1996-10-11 | 1998-05-06 | Fuji Heavy Ind Ltd | Stereo camera adjustment device |
EP2391119B1 (en) * | 2010-03-31 | 2015-06-03 | FUJIFILM Corporation | 3d-image capturing device |
2016
- 2016-05-17 JP JP2018517983A patent/JP6404525B2/ja active Active
- 2016-05-17 WO PCT/JP2016/064661 patent/WO2017199352A1/ja active Application Filing
2018
- 2018-08-08 US US16/057,981 patent/US20180352158A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US20180352158A1 (en) | 2018-12-06 |
JPWO2017199352A1 (ja) | 2018-10-18 |
JP6404525B2 (ja) | 2018-10-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3054414B1 (en) | Image processing system, image generation apparatus, and image generation method | |
US8953036B2 (en) | Information processing apparatus, information processing method, program, and information processing system | |
JP6627352B2 (ja) | Image display device, image display method, and program | |
CN108600576A (zh) | Image processing apparatus, method and system, and computer-readable recording medium | |
CN104160693A (zh) | Image capturing device, image capturing system, image processing method, information processing device, and computer-readable storage medium | |
CN108513072A (zh) | Image processor, image processing method, and imaging system | |
JP2012174116A (ja) | Object display device, object display method, and object display program | |
CN107197137A (zh) | Image processing device, image processing method, and recording medium | |
JP7205386B2 (ja) | Imaging device, image processing method, and program | |
CN107560637B (zh) | Method for verifying the calibration result of a head-mounted display device, and head-mounted display device | |
CN103198286B (zh) | Information processing terminal, information processing method, and program | |
JP6374849B2 (ja) | User terminal, color correction system, and color correction method | |
KR20190014959A (ko) | Device for playing back video based on motion information, and operating method thereof | |
JP6283329B2 (ja) | Augmented reality object recognition device | |
JP6267809B1 (ja) | Panoramic image synthesis analysis system, panoramic image synthesis analysis method, and program | |
JP5448739B2 (ja) | Image reproduction device, imaging device, and image reproduction method | |
JP6404525B2 (ja) | Omnidirectional camera captured image display system, omnidirectional camera captured image display method, and program | |
JP6246441B1 (ja) | Image analysis system, image analysis method, and program | |
WO2020255766A1 (ja) | Information processing device, information processing method, program, projection device, and information processing system | |
JP6200604B1 (ja) | Omnidirectional camera robot altitude adjustment system, omnidirectional camera robot altitude adjustment method, and program | |
EP3665656A1 (en) | Three-dimensional video processing | |
WO2018189880A1 (ja) | Information processing device, information processing system, and image processing method | |
JP2000222116A (ja) | Position recognition method for a display image, position recognition device therefor, and virtual image stereoscopic synthesis device | |
JP2020167657A (ja) | Image processing device, head-mounted display, and image display method | |
US11928775B2 (en) | Apparatus, system, method, and non-transitory medium which map two images onto a three-dimensional object to generate a virtual image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2018517983 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16902371 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 21/02/2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16902371 Country of ref document: EP Kind code of ref document: A1 |