WO2008010345A1 - Panoramic image capturing system and panoramic image capturing method - Google Patents

Panoramic image capturing system and panoramic image capturing method

Info

Publication number
WO2008010345A1
WO2008010345A1 PCT/JP2007/060193 JP2007060193W WO2008010345A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
circle boundary
panoramic
outer circle
degree
Prior art date
Application number
PCT/JP2007/060193
Other languages
English (en)
Japanese (ja)
Inventor
Masayuki Ito
Original Assignee
Opt Corporation
Shinya Makoto
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Opt Corporation, Shinya Makoto
Publication of WO2008010345A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/698Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/147Communication arrangements, e.g. identifying the communication as a video-communication, intermediate storage of the signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/15Conference systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a panorama developed image imaging apparatus and a panorama developed image imaging system.
  • Patent Document 1 discloses an omnidirectional photographing apparatus.
  • the omnidirectional imaging device develops and displays or records the video captured by one camera module that has a lens that can directly capture a 360-degree annular image.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2006-148787 (abstract, detailed description of the invention, drawings, particularly, paragraph 0024, paragraph 0049, FIG. 8, FIG. 9, etc.)
  • the lens that directly captures a 360-degree annular image is a special convex glass lens, specifically one having a zenith reflecting surface and a surface surrounding the zenith reflecting surface.
  • Patent Document 1 suggests that a fish-eye lens can be used in place of the special convex glass lens.
  • because the omnidirectional imaging device of Patent Document 1 uses such a special convex glass lens, the image height that can be captured is limited. This restriction on the image height in practice restricts the distance between the glass lens and the subject. For example, if the subject gets too close to the glass lens, the subject's head is cut off in the 360-degree annular image. To prevent the subject's head from being cut off, a certain distance between the glass lens and the subject must be secured.
  • an object of the present invention is to provide a panorama developed image capturing apparatus and a panorama developed image capturing system capable of capturing a 360-degree panorama developed image without the subject being cut off.
  • the panorama developed image capturing apparatus generates a 360-degree panorama developed image by imaging.
  • the panoramic developed image imaging device includes a fish-eye lens, an imaging unit that forms a circular image by the fish-eye lens on a light-receiving surface on which a plurality of light-receiving elements are arranged, and generates captured image data having the circular image;
  • a memory that stores the position or size of the inner circle boundary and the outer circle boundary defining the ring-shaped range used to generate the 360-degree panorama developed image in the captured image; user update means that updates, based on the user's operation, the position or size of at least one of the inner circle boundary and the outer circle boundary stored in the memory; and panorama developed image generating means for generating the 360-degree panorama developed image based on the image within the ring-shaped range sandwiched between the inner circle boundary and the outer circle boundary stored in the memory, in the captured image generated by the imaging unit.
  • a circular image formed by a fisheye lens is formed on the imaging means.
  • the imaging means generates a captured image having a circular image.
  • the subject is included in the circular image without the subject's head being cut off.
  • the panorama developed image generating means generates a 360-degree panorama developed image based on the image within the ring-shaped range sandwiched between the inner circle boundary and the outer circle boundary in the captured image.
  • the positions and sizes of the inner circle boundary and the outer circle boundary can be updated by the user update means based on the user's operation. Therefore, even when the distance between the fisheye lens and the subject cannot be physically secured, the positions of the inner circle boundary and the outer circle boundary can be adjusted by the user's operation so that the subject falls within the ring-shaped range. As a result, a 360-degree panorama developed image can be generated without the subject's head being cut off.
  • a panorama developed image capturing apparatus according to the present invention has the following characteristics. That is, the fisheye lens is of the stereographic projection type.
  • the memory stores an image height correction table for correcting the balance of the imaging size of the subject, obtained based on the relationship between the incident angle of the stereographic projection fisheye lens and the image height.
  • the panorama developed image generating means refers to the image height correction table and generates a 360-degree panorama developed image in which the balance of the imaging size of the subject is adjusted.
  • compared with a general equidistant projection fisheye lens, the peripheral portion of the image formed by the stereographic projection fisheye lens is less distorted, and the amount of information in the peripheral portion increases. Therefore, when the panorama developed image capturing apparatus is placed on a table with the fisheye lens facing upward, an image of a person or a blackboard in the surroundings can be obtained with high resolution, and the characters on the blackboard can be reproduced in the 360-degree panorama developed image. In addition, a well-balanced image of subjects such as a person or a blackboard can be obtained with the fisheye lens.
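The patent gives no projection formulas, but the standard textbook equations illustrate why the peripheral information increases: for focal length f and incident angle θ, an equidistant fisheye maps θ to r = f·θ, while a stereographic fisheye maps it to r = 2f·tan(θ/2). A minimal Python sketch (the focal length and angles are arbitrary illustration values, not from the patent):

```python
import math

def r_equidistant(f, theta):
    # Equidistant projection: r = f * theta (theta in radians).
    return f * theta

def r_stereographic(f, theta):
    # Stereographic projection: r = 2 * f * tan(theta / 2).
    return 2.0 * f * math.tan(theta / 2.0)

# Radial image-height increase over the last degree before 90 degrees off-axis:
f = 1.0
edge, step = math.radians(90.0), math.radians(1.0)
d_equi = r_equidistant(f, edge) - r_equidistant(f, edge - step)
d_ster = r_stereographic(f, edge) - r_stereographic(f, edge - step)
```

Near 90 degrees off-axis the stereographic mapping spends roughly twice the radial pixels per degree of field angle, which is the peripheral resolution advantage described above.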
  • a panorama developed image capturing apparatus has the following features in addition to the above-described components of the present invention. That is, after updating the position or size of at least one of the inner circle boundary and the outer circle boundary stored in the memory, the user update means updates the image height correction table stored in the memory based on the updated inner circle boundary and outer circle boundary and on the relationship between the incident angle and the image height of the stereographic projection fisheye lens.
  • the panoramic developed image capturing apparatus has the following features in addition to the above-described components of the present invention.
  • the user update means updates the inner circle boundary or the outer circle boundary stored in the memory only when the changed size of the inner circle boundary or the outer circle boundary has a radius greater than or equal to a predetermined number of pixels.
  • if, for example, the radius of the inner circle boundary were one pixel, the same pixel in the captured image would be used many times to obtain the pixel values of the many display pixels at the top of the 360-degree panorama developed image. The load of generating the 360-degree panorama developed image would then increase without any improvement in its image quality.
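A minimal sketch of such a guarded update, assuming a hypothetical threshold value (the patent only says "a predetermined number of pixels", so MIN_RADIUS_PIXELS and the function name are illustrative):

```python
MIN_RADIUS_PIXELS = 8  # hypothetical "predetermined number of pixels"

def try_update_boundary(stored_radius, requested_radius, min_radius=MIN_RADIUS_PIXELS):
    # Accept the new inner/outer circle radius only when it is at least the
    # predetermined minimum; otherwise keep the stored value, so a tiny inner
    # circle cannot force one captured pixel to serve many display pixels.
    if requested_radius >= min_radius:
        return requested_radius
    return stored_radius
```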
  • the panorama developed image capturing apparatus has the following features in addition to the above-described components of the present invention. That is, the panorama developed image generating means calculates, for each display pixel of the 360-degree panorama developed image, the distance and direction from the center of the circular image, calculates coordinates in the captured image using the calculated distance and direction of each display pixel, and acquires the pixel value of the pixel of the captured image located at each calculated coordinate as the pixel value of each display pixel.
  • a panorama developed image capturing apparatus has the following features in addition to the above-described components of the present invention.
  • the panorama developed image generating means divides the range between the inner circle boundary and the outer circle boundary in the captured image into two ranges of 180 degrees each, and generates two horizontally long panorama divided developed images, one for each divided range.
  • the display means can thereby display the 360-degree panorama developed image at a large size.
  • a panorama developed image imaging system includes a panorama developed image capturing apparatus that generates a 360-degree panorama developed image by imaging, and a computer apparatus that displays or stores the panorama developed image generated by the panorama developed image capturing apparatus.
  • the panoramic developed image imaging device includes a fish-eye lens, an imaging unit that forms a circular image by the fish-eye lens on a light-receiving surface on which a plurality of light-receiving elements are arranged, and generates data of a captured image having a circular image;
  • a memory that stores the position or size of the inner circle boundary and the outer circle boundary defining the ring-shaped range used to generate the 360-degree panorama developed image in the captured image; user update means that updates, based on the user's operation, the position or size of at least one of the inner circle boundary and the outer circle boundary stored in the memory; and panorama developed image generating means for generating the 360-degree panorama developed image based on the image within the ring-shaped range sandwiched between the inner circle boundary and the outer circle boundary stored in the memory, in the captured image generated by the imaging means.
  • a 360-degree panoramic developed image can be taken without the subject being cut off.
  • FIG. 1 is a perspective view showing a panoramic developed image imaging system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an electric circuit of the 360-degree video camera device in FIG. 1.
  • FIG. 3 is an explanatory view showing the optical arrangement of the fisheye lens and the image sensor in FIG. 1.
  • FIG. 4 is a diagram illustrating an example of a captured image generated by the color conversion processing unit in FIG. 1 based on the luminance distribution data of the image sensor.
  • FIG. 5 is an explanatory diagram showing an image in which a circular mark indicating the position of the inner circle boundary and a circular mark indicating the position of the outer circle boundary are superimposed on the captured image in the 360-degree video camera device of FIG. 1.
  • FIG. 6 is an image height (field angle difference) characteristic diagram of the stereographic projection fisheye lens in FIG. 1.
  • FIG. 7 is a flowchart showing the flow of the divided developed image generation process performed by the developed still image generation unit for one divided image range in the 360-degree video camera device of FIG. 1.
  • FIG. 8 is a diagram showing an example of an array of a plurality of display pixels in a divided developed image generated in the 360-degree video camera device of FIG. 1.
  • FIG. 9 is an explanatory diagram illustrating the direction in the ring-shaped image (the direction in the divided image range) of each developed pixel column in FIG. 8.
  • FIG. 10 is an explanatory diagram of the coordinate position in the original image (circular image) of a specific display pixel in the 360-degree video camera device of FIG. 1.
  • FIG. 11 is a display example of a 360-degree panoramic developed image displayed on the liquid crystal display device of the PC in FIG. 1.
  • FIG. 12 is a flowchart showing the operation of the process management unit in the change mode of the 360-degree video camera device of FIG. 1.
  • the panorama development image capturing apparatus will be described by taking a conference 360-degree video camera apparatus as an example.
  • the panorama developed image imaging system will be described using an example in which a 360-degree video camera device is connected to a personal computer.
  • FIG. 1 is a perspective view showing a panoramic developed image imaging system according to an embodiment of the present invention.
  • the panorama developed image imaging system includes a 360-degree video camera device 1 as the panorama developed image capturing apparatus and a PC (personal computer) 3 as the computer apparatus.
  • the 360-degree video camera device 1 is connected to the PC 3 by a USB (Universal Serial Bus) cable 2.
  • the PC 3 includes a USB connector 11, a liquid crystal display device 12, a speaker 13, an input device 14, and the like.
  • in the PC 3, a CPU (Central Processing Unit), not shown, executes a program in a storage device (not shown), so that a communication processing unit 15, a reproduction processing unit 16, and a command generation unit 17 are realized in the PC 3.
  • the communication processing unit 15 controls data communication using the USB connector 11.
  • the reproduction processing unit 16 controls the content displayed on the liquid crystal display device 12 and causes the speaker 13 to output sound.
  • the command generation unit 17 generates a command based on input data entered from the input device 14. Examples of the input device 14 include a keyboard and a pointing device.
  • the 360-degree video camera device 1 has a cubic-shaped housing 21.
  • the housing 21 is provided with a fisheye lens 22, a USB connector 23, a video output connector 24, an audio output connector 25, and the like.
  • the fisheye lens 22 is disposed on the upper surface of the housing 21.
  • a ventilation hole 20 for the microphone 26 is formed in the upper surface of the housing 21.
  • the USB connector 23, the video output connector 24 and the audio output connector 25 are arranged on the side surface of the housing 21.
  • FIG. 2 is a block diagram showing an electric circuit of 360-degree video camera device 1 in FIG.
  • in the 360-degree video camera device 1, an electric circuit for generating a 360-degree panorama developed image by imaging is incorporated.
  • the 360-degree video camera device 1 includes an image sensor 27 as a part of the imaging means, an FPGA (Field Programmable Gate Array) 28, a DSP (Digital Signal Processor) 29, an external memory 30, an audio IC (Integrated Circuit) 31, a video encoder 32, and the like.
  • FIG. 3 is an explanatory diagram showing an optical arrangement of the fisheye lens 22 and the image sensor 27.
  • the image sensor 27 is, for example, a CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • the image sensor 27 has a light receiving surface 33.
  • on the light receiving surface 33, a plurality of light receiving elements (not shown) are arranged in a matrix with an aspect ratio of, for example, 3:4.
  • Each light receiving element outputs a value corresponding to the amount of received light.
  • the image sensor 27 generates luminance distribution data composed of the received light amount values output from the plurality of light receiving elements.
  • the image sensor 27 generates luminance distribution data at a predetermined cycle.
  • the fisheye lens 22 has a wide viewing angle of, for example, 180 degrees or more, and is of the stereographic projection type. Compared with a general equidistant projection fisheye lens, the stereographic projection fisheye lens 22 causes less distortion in the peripheral portion of the image it forms and increases the amount of information in the peripheral portion.
  • as the projection method of the fisheye lens 22, an equisolid angle projection method, an orthographic projection method, and the like can also be adopted in addition to the stereographic projection method and the equidistant projection method.
  • since the 360-degree video camera device 1 is placed on a table in the conference room with the fisheye lens 22 facing upward, the amount of information on subjects such as people or a blackboard appearing in the peripheral portion increases, so the stereographic projection method is the most suitable.
  • the fisheye lens 22 is disposed above the light receiving surface 33 of the image sensor 27. Thus, an image formed by the fisheye lens 22 (hereinafter referred to as a circular image) is formed on the light receiving surface 33 of the image sensor 27.
  • the image sensor 27 periodically generates luminance distribution data and outputs it to the FPGA 28.
  • in the FPGA 28, a color conversion processing unit 36 is realized as a part of the imaging means.
  • the color conversion processing unit 36 replaces the data of each pixel of the luminance distribution data using a color conversion table (not shown).
  • the color conversion processing unit 36 replaces, for example, pixel data in a circular image with predetermined color data using the pixel value and the peripheral pixel values in the luminance distribution data. Thereby, captured image data having appropriate color data is generated.
  • FIG. 4 is a diagram illustrating an example of a captured image that the color conversion processing unit 36 generates based on the luminance distribution data of the image sensor 27. As shown in FIG. 4, the captured image has a circular image at the center. An image of the subject is formed inside the circular image. The color conversion processing unit 36 outputs the generated captured image data to the DSP 29.
  • the microphone 26 generates a waveform signal corresponding to the sound.
  • the waveform signal is converted into an audio signal by the audio IC 31 and supplied to the audio output connector 25.
  • a speaker unit or headphones can be connected to the audio output connector 25. Audio can be heard through the speaker unit or headphones connected to the audio output connector 25.
  • a voice storage processing unit 37 is realized in the voice IC 31. The sound storage processing unit 37 samples the waveform signal supplied from the microphone 26 and stores the sound data 38 generated by the sampling in the external memory 30.
  • the DSP 29 has an EEPROM (Electrically Erasable Programmable Read-Only Memory) 41 as a memory.
  • the EEPROM 41 stores inner circle boundary data 42, outer circle boundary data 43, an incident angle image height table 44, an image height correction table 45, a direction table 46, and the like.
  • FIG. 5 is an explanatory diagram showing an image in which a circular mark 47 indicating the position of the inner circle boundary and a circular mark 48 indicating the position of the outer circle boundary are superimposed on the captured image.
  • the inner circle boundary and the outer circle boundary are boundaries that specify an image range to be expanded as a panoramic expansion image.
  • the image portion between the inner circle boundary and the outer circle boundary is developed as a panoramic image.
  • the inner circle boundary data 42 is data indicating the position and size of the inner circle boundary in the captured image.
  • the outer circle boundary data 43 is data indicating the position and size of the outer circle boundary in the captured image.
  • FIG. 6 is an image height (field angle difference) characteristic diagram of the stereographic projection fisheye lens 22.
  • the horizontal axis is the relative field angle when the optical axis direction of the fish-eye lens 22 is 0 degrees
  • the vertical axis is the image height (field angle difference).
  • the image height (field angle difference) of the subject existing in the direction of the angle of view of the fisheye lens 22 is about 0.1.
  • the image height (angle difference) of the subject that exists in the direction of 90 degrees is about 0.2.
  • the incident angle image height table 44 is obtained by converting the image height (view angle difference) characteristics with respect to the angle of view of the fisheye lens 22 into data.
  • the stereographic projection method gives a larger amount of peripheral information than a conventional equidistant projection fisheye lens, so blind spots are reduced and the image is less distorted. The distortion is reduced, however, not eliminated.
  • the image height correction table 45 stores correction data for obtaining a developed image in which the subject appears with a natural balance from the image formed on the image sensor 27 by the stereographic projection fisheye lens 22.
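The patent does not disclose the table's actual contents; as an illustrative sketch under the stereographic model r = 2f·tan(θ/2), such a correction table could precompute, for developed-image rows sampled uniformly in incident angle between the inner and outer circle boundaries, the corresponding radial image height (all names and values here are hypothetical):

```python
import math

def build_image_height_table(f, theta_inner_deg, theta_outer_deg, n_rows):
    # For each developed-image row, sampled uniformly in incident angle
    # between the inner and outer circle boundaries, precompute the radial
    # image height under the stereographic model r = 2 * f * tan(theta / 2).
    table = []
    for i in range(n_rows):
        t = theta_inner_deg + (theta_outer_deg - theta_inner_deg) * i / (n_rows - 1)
        table.append(2.0 * f * math.tan(math.radians(t) / 2.0))
    return table
```

Sampling uniformly in angle rather than in radius is what keeps the vertical scale of the developed image balanced despite the non-linear projection.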
  • the DSP 29 has a CPU (not shown). When this CPU executes a program, a still image storage processing unit 51, a developed still image generation unit 52 as the panorama developed image generation means, a compressed still image generation unit 53, a streaming generation unit 54 as display data generation means, a process management unit 55 as the user update means, a communication processing unit 56, and the like are realized in the DSP 29.
  • External memory 30 is connected to the DSP 29.
  • the external memory 30 consists of storage devices such as SRAM (Static RAM) and DDR-SDRAM (Double Data Rate SDRAM), and is accessible from the CPU of the DSP 29.
  • the external memory 30 stores still image data 39, audio data 38, two compressed and expanded still image data 61 and 62, and the like.
  • the external memory 30 has a first VRAM (VideoRAM) area 63 and a second VRAM area 64.
  • One of the two compressed decompressed still image data 61 and 62 is stored in the first VRAM area 63, and the other is stored in the second VRAM area 64.
  • for example, the first VRAM area 63 stores the n-th compressed expanded still image data 61, and the second VRAM area 64 stores the (n+1)-th compressed expanded still image data 62.
  • the still image storage processing unit 51 implemented in the DSP 29 stores the captured image data in the external memory 30 as still image data 39.
  • the developed still image generation unit 52 generates expanded still image data from the still image data 39 stored in the external memory 30. Expanded still image data is image data obtained by expanding the circular image in a captured image into a 360-degree panorama image. The developed still image generation unit 52 generates and stores expanded still image data alternately in the first VRAM area 63 and the second VRAM area 64.
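The alternating use of the two VRAM areas is a classic double-buffering (ping-pong) scheme; a small illustrative sketch (the class and its names are not from the patent):

```python
class PingPongBuffer:
    # Alternate writes between two areas (like the first and second VRAM
    # areas) so one frame can be read while the next is being generated.
    def __init__(self):
        self.areas = [None, None]
        self.write_index = 0

    def store(self, frame):
        self.areas[self.write_index] = frame
        self.write_index = 1 - self.write_index  # switch to the other area

    def latest(self):
        # The most recently completed frame sits in the area written last.
        return self.areas[1 - self.write_index]
```

One area can be read for compression or streaming while the other is being overwritten with the next frame.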
  • the compressed still image generation unit 53 compresses the expanded still image data stored in the first VRAM area 63 or the second VRAM area 64.
  • an image compression method such as the JPEG (Joint Photographic Experts Group) compression method or the MPEG (Moving Picture Experts Group) compression method may be used.
  • the compressed expanded still image data 61 is stored in the first VRAM area 63
  • the compressed expanded still image data 62 is stored in the second VRAM area 64.
  • the streaming generation unit 54 reads the compressed and expanded still image data 61 and 62 and the audio data 38 from the external memory 30, and generates streaming data having these content data.
  • a streaming format such as MPEG or DivX may be adopted.
  • the communication processing unit 56 controls data communication using the USB connector 23.
  • the process management unit 55 manages the execution of the still image storage processing unit 51, the developed still image generation unit 52, the compressed still image generation unit 53, the streaming generation unit 54, the communication processing unit 56, and the like.
  • the process management unit 55 instructs the still image storage processing unit 51, the developed still image generation unit 52, the compressed still image generation unit 53, the streaming generation unit 54, the communication processing unit 56, and the like to start or stop them.
  • the video encoder 32 reads the compressed expanded still image data 61 and 62 from the external memory 30 and generates a video signal. Examples of video signal formats include NTSC (National Television System Committee) and PAL (Phase Alternating Line). The video encoder 32 outputs the generated video signal to the video output connector 24. A television receiver or the like can be connected to the video output connector 24. With a television receiver connected to the video output connector 24, the video signal can be reproduced and the video can be viewed.
  • when power is supplied to the 360-degree video camera device 1 through, for example, the USB cable 2, the DSP 29 and the like operate, and the color conversion processing unit 36, the audio storage processing unit 37, the still image storage processing unit 51, the developed still image generation unit 52, the compressed still image generation unit 53, the streaming generation unit 54, the process management unit 55, and the like are realized in the 360-degree video camera device 1.
  • the process management unit 55 instructs the still image storage processing unit 51, the developed still image generation unit 52, the compressed still image generation unit 53, and the streaming generation unit 54 to start them.
  • on the image sensor 27 of the 360-degree video camera device 1, an image is formed by the light collected by the fisheye lens 22.
  • the image sensor 27 generates luminance distribution data including the luminance distribution of the circular image.
  • the color conversion processing unit 36 generates captured image data having a circular image as illustrated in FIG. 4 from the luminance distribution data using a color conversion table (not shown).
  • the still image storage processing unit 51 saves the captured image data in the external memory 30 as still image data 39.
  • the image sensor 27 periodically generates luminance distribution data. Therefore, the still image data 39 in the external memory 30 is updated to new captured image data every predetermined period.
  • the developed still image generating unit 52 When the still image data 39 in the external memory 30 is updated, the developed still image generating unit 52 generates expanded still image data from the updated still image data 39.
  • the expanded still image generation unit 52 first reads the inner circle boundary data 42 and the outer circle boundary data 43 from the EEPROM 41, and determines the ring-shaped range used to generate the panoramic expanded image in the captured image based on the updated still image data 39.
  • the developed still image generation unit 52 determines two division ranges obtained by equally dividing the ring-shaped image range into two at 180 degrees. In FIG. 5, dotted lines 71 and 72 for horizontally dividing the ring-shaped image range into two are shown.
  • the developed still image generation unit 52 divides the ring-shaped image between the outer circle boundary and the inner circle boundary into two parts of 180 degrees each with these two dotted lines 71 and 72, and executes the divided developed image generation process for each of the upper and lower divided image ranges 73 and 74.
  • FIG. 7 is a flowchart showing the flow of a divided expanded image generation process by the expanded still image generation unit 52 for one divided image range 73 (74).
  • the developed still image generation unit 52 first confirms whether or not the inner circle boundary data 42 and the outer circle boundary data 43 stored in the EEPROM 41 have been updated since the last developed image was generated (step ST1).
  • if they have been updated, the developed still image generation unit 52 calculates the direction in the divided image range 73 (74) of each developed pixel column of the divided developed image, and generates the direction table 46 (step ST2).
  • FIG. 8 is a diagram illustrating an example of an array of a plurality of display pixels of the divided development image.
  • a plurality of display pixels are arranged in 3 rows x 4 columns in this divided expanded image.
  • the developed still image generation unit 52 calculates the direction in the ring-shaped image for each of the four developed pixel columns (1, Z) 81, (2, Z) 82, (3, Z) 83, and (4, Z) 84.
  • FIG. 9 is an explanatory diagram illustrating the directions of the developed pixel columns 81, 82, 83, and 84 in FIG. 8, that is, the directions in the divided image range 73 (74).
  • the developed still image generation unit 52 calculates the angle of the developed pixel column (2, Z) 82 as “θ1 + equal division angle × 2”, and calculates the angle of the fourth developed pixel column (4, Z) 84 as “θ1 + equal division angle × 3”.
  • the developed still image generating unit 52 calculates the direction of each of the developed pixel columns 81, 82, 83, and 84 as described above.
  • the developed still image generating unit 52 generates a direction table 46 in which each direction is associated with a plurality of developed pixel columns 81, 82, 83, and 84.
  • the developed still image generation unit 52 stores the generated direction table 46 in the EEPROM 41.
  • the direction table 46 may be stored in the external memory 30 or the like.
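As a sketch of the direction table construction, each developed pixel column can be assigned a direction by stepping an equal division angle across the 180-degree divided range from a start angle θ1 (the function and its defaults are illustrative, not from the patent):

```python
def build_direction_table(theta1_deg, n_columns, span_deg=180.0):
    # Space the developed pixel columns at equal angular increments across
    # the divided range, starting from the angle theta1.
    step = span_deg / n_columns  # the "equal division angle"
    return [theta1_deg + step * k for k in range(n_columns)]

directions = build_direction_table(0.0, 4)  # e.g. the four columns of FIG. 8
```

Because the table depends only on the boundaries and the image geometry, it is computed once and reused until the inner or outer circle boundary changes.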
  • the expanded still image generating unit 52 specifies the first display pixel of the divided expanded image (step ST3).
  • the expanded still image generation unit 52 specifies, for example, the upper left display pixel ((1, 1) in FIG. 8) of the divided expanded image.
  • the developed still image generation unit 52 specifies the pixel in the captured image corresponding to the specified display pixel, acquires the pixel value of that pixel, and stores the acquired pixel value as the pixel value of the specified display pixel.
  • the developed still image generation unit 52 first refers to the image height correction table 45 and calculates, for the specified display pixel, the distance R from the center of the original image (circular image) (step ST4). For example, referring to the image height correction table 45, the developed still image generation unit 52 associates one developed pixel column of the divided developed image with a pixel array in the radial direction within the ring-shaped range of the captured image, and calculates the distance R from the center of the donut to the position in the original image (circular image) corresponding to the specific display pixel. Next, the developed still image generation unit 52 calculates the abscissa of the specific display pixel in the original image (circular image) from the direction of the specific display pixel stored in the direction table 46 and the calculated distance R (step ST5). The developed still image generation unit 52 likewise calculates the ordinate of the specific display pixel in the original image (circular image) from the direction stored in the direction table 46 and the calculated distance R (step ST6).
  • FIG. 10 is an explanatory diagram of the coordinate position in the original image (circular image) of the specific display pixel.
  • the center (IOX, IOY) of the original image (circular image) is the common center of the inner circle boundary and the outer circle boundary.
  • that is, (IOX, IOY) is the center of the donut bounded by the inner and outer circle boundaries.
  • the coordinate position (IX, IY) of the specific display pixel in the original image (circular image) is calculated by the following equations. The coordinate position (IX, IY) is referenced to the upper left of the original image (circular image).
  • IX = IOX + R × cos θ ··· Equation 1
  • IY = IOY + R × sin θ ··· Equation 2
  • when the coordinate position (IX, IY) of the specific display pixel in the original image (circular image) has been calculated, the developed still image generating unit 52 acquires the pixel value of the pixel 91 of the original image (circular image) at that coordinate and stores it as the pixel value of the specified display pixel. The developed still image generation unit 52 stores the pixel value of the specified display pixel in the first VRAM area 63 or the second VRAM area 64 of the external memory 30 (step ST7).
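Equation 1 and its sine counterpart for the ordinate amount to a standard polar-to-Cartesian conversion from the donut center. A minimal sketch, with hypothetical names, follows; rounding to the nearest pixel is our assumption (the patent does not specify the sampling):

```python
import math

def source_coordinate(center, distance_r, direction_deg):
    """Map a display pixel of the developed image back to the captured
    circular image: from the donut center (IOX, IOY), move distance R
    in the pixel's direction (Equations 1 and 2 in the text)."""
    iox, ioy = center
    theta = math.radians(direction_deg)
    ix = iox + distance_r * math.cos(theta)
    iy = ioy + distance_r * math.sin(theta)
    return round(ix), round(iy)   # nearest pixel of the captured image

# Donut center at (320, 240); a display pixel 100 pixels out at 90 degrees:
ix, iy = source_coordinate((320, 240), 100, 90.0)
# (ix, iy) == (320, 340)
```

The pixel value at the returned coordinate would then be stored as the display pixel's value, as in step ST7.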
  • the developed still image generation unit 52 thus determines and stores the pixel value of the one display pixel specified in step ST3. Thereafter, the developed still image generating unit 52 checks whether any display pixels whose pixel values have not yet been determined remain among the plurality of display pixels of the divided expanded image, and repeats the pixel-value determination process above until no such display pixels remain.
  • the developed still image generating unit 52 first determines whether or not the pixel values for one column of display pixels of the divided expanded image have all been determined (step ST8). If the column has not been completed (No in step ST8), the developed still image generation unit 52 identifies the display pixel of the next row in the same column (step ST9) and determines its pixel value (steps ST4 to ST7).
  • when one column has been completed (Yes in step ST8), the expanded still image generation unit 52 further determines whether or not the pixel values have been determined for all columns of the divided expanded image (step ST10). If the processing has not been completed for all columns (No in step ST10), the developed still image generation unit 52 identifies the display pixel at the top row of the next column (step ST11) and determines the pixel value of that specific display pixel (steps ST4 to ST7).
  • when the pixel values have been determined for all columns (Yes in step ST10), the expanded still image generation unit 52 finishes the divided expanded image generation processing of FIG. 7.
  • the divided expanded image generation process for one divided image range 73 (74) is completed.
  • the developed still image generation unit 52 executes a divided expanded image generation process for the remaining one divided image range 74 (73).
  • two divided expanded images based on the still image data 39 are stored in the first VRAM area 63 or the second VRAM area 64 of the external memory 30.
  • the compressed still image generating unit 53 compresses the two divided expanded images.
  • the compressed developed still image data 61 and 62 are stored in the first VRAM area 63 or the second VRAM area 64.
  • the streaming generation unit 54 reads the compressed developed still image data 61 and 62 in order from the first VRAM area 63 and the second VRAM area 64, also reads the audio data 38 from the external memory 30, and generates streaming data from these content data. In the streaming data, the two divided expanded images are arranged one above the other.
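Arranging the two divided developed images one above the other, as the streaming data does, can be sketched with plain row-major pixel lists; the helper name and the toy pixel values are illustrative assumptions:

```python
def stack_split_images(upper, lower):
    """Stack the two 180-degree divided developed images vertically,
    as in the streaming data described in the text; both halves must
    share the same width (number of pixels per row)."""
    if len(upper[0]) != len(lower[0]):
        raise ValueError("split images must have the same width")
    return upper + lower

upper = [[1, 2, 3], [4, 5, 6]]   # toy 2x3 "0-180 degree" half
lower = [[7, 8, 9], [0, 1, 2]]   # toy 2x3 "180-360 degree" half
frame = stack_split_images(upper, lower)
# frame has 4 rows: the upper half followed by the lower half
```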
  • the streaming generation unit 54 supplies the generated streaming data to the communication processing unit 56.
  • the communication processing unit 56 of the 360-degree video camera apparatus 1 transmits streaming data to the communication processing unit 15 of the PC 3 via the USB connector 23, the USB cable 2, and the USB connector 11 of the PC 3.
  • the communication processing unit 15 of the PC 3 supplies the received streaming data to the reproduction processing unit 16.
  • the playback processing unit 16 of the PC 3 extracts the compressed developed still image data from the streaming data, decodes it, and supplies the data of the two divided expanded images to the liquid crystal display device 12 as display data.
  • the liquid crystal display device 12 displays two divided expanded images.
  • the playback processing unit 16 of the PC 3 extracts the audio data from the streaming data and supplies it to the speaker 13.
  • the speaker 13 outputs a sound based on the extracted audio data.
  • FIG. 11 shows a display example of the 360-degree panorama developed image displayed on the liquid crystal display device 12 of the PC 3 during imaging.
  • the captured and developed 360-degree image is displayed as two divided development screens 101 and 102 arranged one above the other. Since the two divided development screens 101 and 102 are arranged one above the other, the display screen of the liquid crystal display device 12 can be used more effectively than when, for example, they form a single developed image.
  • a 360-degree panoramic image can be displayed large on the liquid crystal display device 12.
  • the two split development screens 101 and 102 in FIG. 11 display the two 180-degree split developed images obtained by dividing the ring-shaped image range of FIG. 5 in two along the dotted lines 71 and 72. The two split developed images are continuous with each other at both ends, so that a 360-degree panoramic image is displayed seamlessly as a whole. With this display, the user of the PC 3 can view the 360-degree moving image developed by the 360-degree video camera device 1. Further, the user of the PC 3 can hear the sound acquired by the 360-degree video camera device 1 through the sound emitted from the speaker 13.
  • the image sensor 27 of the 360-degree video camera device 1 generates luminance distribution data every period.
  • the color conversion processing unit 36 generates captured image data from the luminance distribution data.
  • the still image storage processing unit 51 updates the still image data 39 in the external memory 30 with the captured image data generated every period.
  • the developed still image generation unit 52 reads still image data 39 from the external memory 30, and alternately generates two divided expanded image data based on the read still image data 39 in the first VRAM area 63 and the second VRAM area 64.
  • the streaming generation unit 54 reads the compressed data of the two divided expanded images written in the first VRAM area 63 or the second VRAM area 64, and generates streaming data.
  • while the streaming generation unit 54 reads the data of the two compressed divided expanded images from one of the first VRAM area 63 and the second VRAM area 64, the developed still image generating unit 52 can generate two divided expanded images in the other VRAM area, and the compressed still image generating unit 53 can compress those two divided expanded images in the other VRAM area.
  • the streaming generation unit 54 can therefore assign the two divided developed images based on the luminance distribution data generated by the image sensor 27 every period to the streaming data without image loss (dropped frames). Accordingly, the moving image based on the luminance distribution data periodically generated by the image sensor 27 is displayed on the liquid crystal display device 12 of the PC 3.
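The alternating use of the first and second VRAM areas, where one area is written while the other is read, is a classic double-buffering (ping-pong) scheme. A minimal sketch under that interpretation, with hypothetical names:

```python
class PingPongBuffers:
    """Minimal ping-pong (double) buffering: a producer fills one area
    while a consumer reads the other, mirroring the alternating use of
    the first and second VRAM areas described in the text."""
    def __init__(self):
        self.areas = [None, None]   # stands in for the two VRAM areas
        self.write_index = 0        # area the producer fills next

    def write_frame(self, frame):
        self.areas[self.write_index] = frame
        self.write_index ^= 1       # next write goes to the other area

    def read_frame(self):
        # The consumer always reads the most recently completed area,
        # i.e. the one the producer is NOT about to overwrite.
        return self.areas[self.write_index ^ 1]

buf = PingPongBuffers()
buf.write_frame("frame-1")   # goes into the first area
buf.write_frame("frame-2")   # goes into the second area
# read_frame() now returns "frame-2", while the other area remains
# free for the producer, so neither side blocks within a frame period
```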
  • the command generation unit 17 of the PC 3 generates an auto pan instruction command based on the input data input from the input device 14.
  • the communication processing unit 15 of the PC 3 transmits the auto pan instruction command to the communication processing unit 56 of the 360-degree video camera device 1 via the USB cable 2.
  • the communication processing unit 56 of the 360-degree video camera device 1 supplies the received autopan command to the processing management unit 55.
  • when the auto pan instruction command is supplied, the process management unit 55 periodically updates the direction table 46 stored in the EEPROM 41. Specifically, the process management unit 55 periodically repeats the process of adding a predetermined constant angle to the direction data of each developed pixel column in the direction table 46.
  • as a result, the direction of the specific display pixel used by the developed still image generation unit 52 in the flowchart of FIG. 7 periodically increases by a predetermined constant angle.
  • the angle θ1 of the developed pixel column (1, Z) 81 in the first row in FIG. 9 periodically increases by a predetermined constant angle.
  • the direction of the specific display pixel used in steps ST5 and ST6 in FIG. 7 also periodically increases by the predetermined constant angle.
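The periodic addition of a constant angle to every entry of the direction table, which produces the auto-pan rotation, can be sketched as follows; the function name is an assumption, and the wrap-around at 360 degrees is implied by a full circular sweep rather than stated in the text:

```python
def autopan_step(direction_table_deg, pan_angle_deg):
    """Add a predetermined constant angle to the direction of every
    developed pixel column, wrapping at 360 degrees, so the developed
    view rotates around the ring between frames."""
    return [(d + pan_angle_deg) % 360.0 for d in direction_table_deg]

table = [0.0, 90.0, 180.0, 270.0]
table = autopan_step(table, 15.0)
# table == [15.0, 105.0, 195.0, 285.0]
```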
  • the command generation unit 17 of the PC 3 generates a change mode start instruction command based on the input data input from the input device 14.
  • the communication processing unit 15 of the PC 3 transmits a change mode start instruction command to the communication processing unit 56 of the 360-degree video camera device 1 via the USB cable 2.
  • the communication processing unit 56 of the 360-degree video camera device 1 supplies the received change mode start instruction command to the processing management unit 55.
  • FIG. 12 is a flowchart showing the operation of the process management unit 55 in the change mode.
  • when the change mode start instruction command is supplied (step ST21), the process management unit 55 supplies a boundary display instruction to the streaming generation unit 54 (step ST22). After that, the process management unit 55 waits to receive a boundary change instruction command or a change mode end instruction command (steps ST23 and ST24).
  • the streaming generation unit 54 to which the boundary display instruction is supplied reads the still image data 39 from the external memory 30. Each time the still image data 39 is read, the streaming generation unit 54 also reads the inner circle boundary data 42 and the outer circle boundary data 43 from the EEPROM 41. As shown in FIG. 5, the streaming generation unit 54 generates a display image in which the inner circle boundary mark 47 and the outer circle boundary mark 48 are superimposed on the captured image of the still image data 39. Note that the streaming generation unit 54 may instead read the inner circle boundary data 42 and the outer circle boundary data 43 only once every several readings of the still image data 39.
  • the streaming generation unit 54 generates streaming data including the generated display image data, and supplies the streaming data to the communication processing unit 56.
  • the communication processing unit 56 of the 360-degree video camera device 1 transmits the streaming data to the communication processing unit 15 of the PC 3 via the USB cable 2.
  • the communication processing unit 15 of the PC 3 supplies the received streaming data to the reproduction processing unit 16.
  • the reproduction processing unit 16 of the PC 3 extracts display image data from the streaming data and supplies it to the liquid crystal display device 12 as display data.
  • the liquid crystal display device 12 displays the captured image on which the inner circle boundary mark 47 and the outer circle boundary mark 48 are superimposed.
  • the command generation unit 17 of the PC 3 generates, based on a user operation on the input device 14, a boundary change instruction command for moving the positions of the inner circle boundary and the outer circle boundary and for enlarging or reducing their radii.
  • the communication processing unit 15 of the PC 3 transmits a boundary change instruction command to the communication processing unit 56 of the 360-degree video camera device 1 via the USB cable 2.
  • the communication processing unit 56 of the 360 degree video camera apparatus 1 supplies the received boundary change instruction command to the processing management unit 55.
  • the process management unit 55 first determines whether the instruction is appropriate (step ST25). Specifically, the process management unit 55 determines, for example, whether the radius of the inner circle boundary after the change and the radius of the outer circle boundary after the change are each equal to or larger than a predetermined radius (for example, 144 pixels).
  • if it is determined that the instruction is appropriate (Yes in step ST25), the process management unit 55 updates the inner circle boundary data 42 and the outer circle boundary data 43 stored in the EEPROM 41 based on the change instruction (step ST26). Further, the process management unit 55 updates the image height correction table 45 stored in the EEPROM 41 based on the new combination of the inner circle boundary and the outer circle boundary (step ST27).
  • the subject forms an image with a different size depending on the angle of view. Therefore, as described with reference to FIG. 6, even if the update leaves the distance between the inner circle boundary and the outer circle boundary unchanged and merely moves their positions, the division ratio of the plurality of pixels in the image space between them changes.
  • the processing management unit 55 updates the image height correction table 45 in accordance with the change in the division ratio of the plurality of pixels that accompanies the changes to the inner circle boundary and the outer circle boundary. Thereafter, the process management unit 55 returns to the waiting state for receiving a boundary change instruction command or a change mode end instruction command (steps ST23 and ST24).
  • if the instruction is not appropriate (No in step ST25), for example if the radius of the inner circle boundary after the change is smaller than the predetermined radius, the process management unit 55 ignores the change instruction and returns to the waiting state for receiving a boundary change instruction command or a change mode end instruction command (steps ST23 and ST24).
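The appropriateness check of step ST25, with the 144-pixel lower bound given in the text, can be sketched as follows; the function and constant names are assumptions:

```python
MIN_RADIUS_PX = 144  # the predetermined radius given in the text

def boundary_change_allowed(inner_radius_px, outer_radius_px):
    """Accept a boundary change only if both circle radii stay at or
    above the minimum; a tiny inner circle would reuse the same captured
    pixels for many display pixels without improving image quality."""
    return inner_radius_px >= MIN_RADIUS_PX and outer_radius_px >= MIN_RADIUS_PX

# A change that shrinks the inner circle below 144 pixels is ignored:
# boundary_change_allowed(100, 300) -> False
# boundary_change_allowed(150, 300) -> True
```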
  • the streaming generation unit 54 to which the boundary display instruction is supplied reads the inner circle boundary data 42 and the outer circle boundary data 43 from the EEPROM 41 each time the still image data 39 is read from the external memory 30. Therefore, when the process management unit 55 updates the inner circle boundary data 42 and the outer circle boundary data 43, the streaming generation unit 54 also updates the position and size of the overlap of the circular marks 47 and 48 with respect to the captured image. The positions and sizes of the circular marks 47 and 48 displayed on the liquid crystal display device 12 so as to overlap the captured image are changed according to the boundary change instruction command. The user can know that the inner circle boundary data 42 and the outer circle boundary data 43 have been updated by the display on the liquid crystal display device 12.
  • the command generation unit 17 of the PC 3 generates a change mode end instruction command based on a user operation on the input device 14.
  • the communication processing unit 15 of the PC 3 transmits a change mode end instruction command to the communication processing unit 56 of the 360-degree video camera device 1 via the USB cable 2.
  • the communication processing unit 56 of the 360-degree video camera apparatus 1 supplies the received change mode end instruction command to the process management unit 55.
  • the process management unit 55 ends the process of the change mode in FIG.
  • the inner circle boundary data 42, the outer circle boundary data 43, and the image height correction table 45 are thus changed to new ones, and the developed still image generation unit 52 of the 360-degree video camera device 1 generates the 360-degree panorama developed image using this changed data.
  • a circular image by the fisheye lens 22 is formed on the image sensor 27 of the 360-degree video camera device 1.
  • the still image storage processing unit 51 generates captured image data having a circular image based on the luminance distribution data of the image sensor 27. This circular image includes the subjects' heads without cutting them off.
  • the developed still image generating unit 52 generates a 360-degree panorama developed image as two divided developed images based on the image within the ring-shaped range sandwiched between the inner circle boundary and the outer circle boundary in the captured image.
  • the position and size of the inner circle boundary and the outer circle boundary can be freely set by the user through the processing management unit 55.
  • by having the process management unit 55 update the position or size of the inner circle boundary or the outer circle boundary, the subject in the circular image can be adjusted to fall within the ring-shaped range.
  • by adjusting the position of the inner circle boundary and the like, a 360-degree panorama developed image can be generated in which no head is cut off.
  • the display screen of the liquid crystal display device 12 can be used more effectively than if, for example, the 360-degree panorama developed image were a single developed image.
  • the liquid crystal display device 12 can display a large 360-degree panoramic image.
  • the fisheye lens 22 is of a stereoscopic projection type. Compared to the case of using an ordinary equidistant projection fisheye lens, the stereoscopic projection fisheye lens 22 produces less distortion at the periphery of the formed image, so the amount of information at the periphery increases.
  • the EEPROM 41 also stores an image height correction table 45 that corrects the balance of the imaged size of the subject, which is obtained based on the relationship between the incident angle and the image height of this stereoscopic projection fisheye lens 22.
  • the developed still image generation unit 52 refers to the image height correction table 45 and generates a 360-degree panoramic developed image in which the balance of the imaging size of the subject is adjusted.
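The difference between the stereoscopic (stereographic) and equidistant projections mentioned above can be illustrated with their standard image-height formulas, r = 2f·tan(θ/2) and r = f·θ respectively. These are general optics formulas, not values from the patent, and the contents of the correction table 45 itself are not specified here:

```python
import math

def image_height_stereographic(f, theta_rad):
    """Stereographic ("stereoscopic projection") fisheye: r = 2f*tan(theta/2)."""
    return 2.0 * f * math.tan(theta_rad / 2.0)

def image_height_equidistant(f, theta_rad):
    """Equidistant fisheye: r = f*theta."""
    return f * theta_rad

# At 90 degrees off-axis the stereographic image height is larger, so
# more sensor pixels cover the periphery of the circular image:
theta = math.radians(90)
stereo = image_height_stereographic(1.0, theta)   # ≈ 2.0
equi = image_height_equidistant(1.0, theta)       # ≈ 1.571
```

The growing gap between the two curves toward the periphery is why a table that maps incident angle to image height is needed to keep subject sizes balanced in the developed image.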
  • the image height correction table 45 stored in the EEPROM 41 is updated each time the inner circle boundary data 42 or the outer circle boundary data 43 is updated.
  • even if the positions of the inner circle boundary and the outer circle boundary only move, the division ratio of the plurality of pixels in the image space between them changes.
  • the image height correction table 45 is maintained so as to remain consistent with these changes.
  • the inner circle boundary data 42 and the outer circle boundary data 43 stored in the EEPROM 41 are updated only when the changed size has a radius of 144 pixels or more. For example, if the radius of the inner circle boundary became 1 pixel, the same pixel in the captured image would be used many times to obtain the pixel values of the top-row display pixels of the 360-degree panorama developed image. In that case, the load of generating the 360-degree panorama developed image would increase without improving its image quality. By preventing the radius of the inner circle boundary or the outer circle boundary from becoming smaller than 144 pixels, as in this embodiment, such disadvantages can be avoided.
  • the developed still image generation unit 52 calculates, for each display pixel of the 360-degree panorama developed image, the distance R and the direction from the center of the circular image, calculates the coordinates (IX, IY) of each display pixel in the captured image using the calculated distance R and direction, and obtains the pixel value of the pixel 91 in the captured image located at each calculated coordinate (IX, IY) as the pixel value of that display pixel.
  • this embodiment is an example of a preferred embodiment of the present invention.
  • the fish-eye lens 22 uses a stereoscopic projection type.
  • a general equidistant projection type, an equisolid angle projection type, an orthographic projection type, or the like may also be used as the fisheye lens 22.
  • the developed still image generating unit 52 generates two divided expanded images every 180 degrees as a 360-degree panoramic expanded image.
  • the developed still image generating unit 52 may instead generate one single 360-degree panorama developed image, or may divide it into three or four divided developed images.
  • the process management unit 55 updates both the inner circle boundary data 42 and the outer circle boundary data 43 stored in the EEPROM 41 based on a user operation.
  • the process management unit 55 may update only one of the inner circle boundary data 42 and the outer circle boundary data 43 stored in the EEPROM 41 based on a user operation.
  • the PC 3 is connected to the 360 degree video camera apparatus 1 by the USB cable 2.
  • the 360 degree video camera apparatus 1 may be connected to an input board for outputting a command, for example, via a USB cable 2 or the like.
  • the PC 3 connected to the 360-degree video camera device 1 displays a moving image based on a 360-degree panoramic image on the liquid crystal display device 12.
  • the PC 3 may store the 360-degree panorama moving image in a storage device (not shown).
  • the present invention can be widely used as a panorama developed image capturing apparatus, a panorama expanded image capturing system, and the like for capturing an image of a meeting.

Abstract

This invention relates to photographing a 360-degree panoramic image of a subject without discontinuity. A panoramic image photographing device (1) generates a photographed panoramic image spanning 360 degrees. The panoramic image photographing device (1) comprises a memory (41) for storing the positions and sizes of an inner circle boundary (42) and an outer circle boundary (43) that define a ring-shaped region used to generate the 360-degree panoramic image within a photographed image generated by photographing means (27, 36) based on an image formed by a fisheye lens (22); user updating means for updating the position or size of at least one of the inner circle boundary (42) and the outer circle boundary (43) stored in the memory (41) in accordance with user operations; and panoramic image generating means (52) for generating the 360-degree panoramic image based on the image in the ring-shaped region sandwiched between the inner circle boundary (42) and the outer circle boundary (43) stored in the memory (41).
PCT/JP2007/060193 2006-07-20 2007-05-18 Système de photographie d'image panoramique et procédé de photographie d'image panoramique WO2008010345A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006197848A JP2008028606A (ja) 2006-07-20 2006-07-20 パノラマ展開画像撮像装置およびパノラマ展開画像撮像システム
JP2006-197848 2006-07-20

Publications (1)

Publication Number Publication Date
WO2008010345A1 true WO2008010345A1 (fr) 2008-01-24

Family

ID=38956681

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/060193 WO2008010345A1 (fr) 2006-07-20 2007-05-18 Système de photographie d'image panoramique et procédé de photographie d'image panoramique

Country Status (2)

Country Link
JP (1) JP2008028606A (fr)
WO (1) WO2008010345A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016076853A (ja) * 2014-10-07 2016-05-12 キヤノン株式会社 画像処理装置、画像処理方法、撮像装置
US9349195B2 (en) 2012-03-19 2016-05-24 Google Inc. Apparatus and method for spatially referencing images
US9811946B1 (en) 2016-05-30 2017-11-07 Hong Kong Applied Science and Technology Research Institute Company, Limited High resolution (HR) panorama generation without ghosting artifacts using multiple HR images mapped to a low resolution 360-degree image
CN113612976A (zh) * 2021-10-11 2021-11-05 成都派沃特科技股份有限公司 安保预案优化方法和安保预案优化装置

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4787292B2 (ja) * 2008-06-16 2011-10-05 富士フイルム株式会社 全方位撮像装置
JP2010068071A (ja) * 2008-09-09 2010-03-25 Fujifilm Corp 撮影装置及び方法、画像処理装置及び方法、並びに、画像処理プログラム
JP5573349B2 (ja) 2010-05-17 2014-08-20 パナソニック株式会社 パノラマ展開画像撮影システムおよび方法
WO2014091736A1 (fr) * 2012-12-10 2014-06-19 パナソニック株式会社 Dispositif d'affichage pour une image étendue de manière panoramique
JP6271917B2 (ja) * 2013-09-06 2018-01-31 キヤノン株式会社 画像記録装置及び撮像装置
US10587799B2 (en) 2015-11-23 2020-03-10 Samsung Electronics Co., Ltd. Electronic apparatus and method for controlling electronic apparatus thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000242773A (ja) * 1999-02-19 2000-09-08 Fitto:Kk 画像データ変換装置
JP2002325246A (ja) * 2001-04-25 2002-11-08 Univ Waseda 競技場用中継システム
JP2002374519A (ja) * 2001-06-12 2002-12-26 Sharp Corp 画像監視装置及び画像監視方法及び画像監視処理プログラム
JP2006098942A (ja) * 2004-09-30 2006-04-13 Elmo Co Ltd 魚眼レンズおよびこれを備える撮影装置


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9349195B2 (en) 2012-03-19 2016-05-24 Google Inc. Apparatus and method for spatially referencing images
US9740962B2 (en) 2012-03-19 2017-08-22 Google Inc. Apparatus and method for spatially referencing images
US10262231B2 (en) 2012-03-19 2019-04-16 Google Llc Apparatus and method for spatially referencing images
US10891512B2 (en) 2012-03-19 2021-01-12 Google Inc. Apparatus and method for spatially referencing images
JP2016076853A (ja) * 2014-10-07 2016-05-12 キヤノン株式会社 画像処理装置、画像処理方法、撮像装置
US9811946B1 (en) 2016-05-30 2017-11-07 Hong Kong Applied Science and Technology Research Institute Company, Limited High resolution (HR) panorama generation without ghosting artifacts using multiple HR images mapped to a low resolution 360-degree image
CN113612976A (zh) * 2021-10-11 2021-11-05 成都派沃特科技股份有限公司 安保预案优化方法和安保预案优化装置

Also Published As

Publication number Publication date
JP2008028606A (ja) 2008-02-07

Similar Documents

Publication Publication Date Title
US20200296282A1 (en) Imaging system, imaging apparatus, and system
WO2008010345A1 (fr) Système de photographie d'image panoramique et procédé de photographie d'image panoramique
US20210084221A1 (en) Image processing system and image processing method
JP6927382B2 (ja) 撮像システム、方法、プログラム、動画表示装置および画像処理装置。
US8570334B2 (en) Image processing device capable of efficiently correcting image data and imaging apparatus capable of performing the same
JP3992045B2 (ja) 映像信号処理装置及び該方法並びに仮想現実感生成装置
JP2007531333A (ja) ひずみのない画像をリアルタイムに表示するパノラマ・ビデオ・システム
JP2008129903A (ja) 画像処理装置及び画像処理方法
US20100020202A1 (en) Camera apparatus, and image processing apparatus and image processing method
JP7424076B2 (ja) 画像処理装置、画像処理システム、撮像装置、画像処理方法およびプログラム
JP7306089B2 (ja) 画像処理システム、撮像システム、画像処理装置、撮像装置およびプログラム
JPH11308608A (ja) 動画像生成方法,動画像生成装置及び動画像呈示装置
JP2016171577A (ja) 撮像装置および撮像システム
WO2008012983A1 (fr) Dispositif de prise d'image et système de prise d'image
KR20000058739A (ko) 파노라마 영상 감시 시스템 및 그 제어방법
JP2023139983A (ja) 撮影装置、撮影方法、及びプログラム
JP3934345B2 (ja) 撮像装置
JPH10285474A (ja) 画像撮り込み装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07743628

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 07743628

Country of ref document: EP

Kind code of ref document: A1