WO2007055335A1 - Image processing device, image processing method, program thereof, and recording medium recording the program - Google Patents
Image processing device, image processing method, program thereof, and recording medium recording the program
- Publication number
- WO2007055335A1 (PCT/JP2006/322498)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- area
- region
- selection
- unit
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G06T5/80—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
Definitions
- Image processing apparatus, image processing method, program thereof, and recording medium recording the program
- The present invention relates to an image processing apparatus, an image processing method, a program thereof, and a recording medium recording the program, which process a captured wide-field image.
- The captured image includes not only a still image but also a moving image (see, for example, Japanese Patent Laid-Open No. 2000-324386, paragraph [0009], FIG. 1).
- An image processing apparatus according to the present invention processes image data, including distortion caused by the imaging optical unit, obtained by imaging an optical image from a subject via an imaging optical unit that introduces distortion.
- The apparatus includes an area selection mode setting section that selectively sets a first region selection mode, in which a selected region indicating a partial region of the field of view represented by the image data is selected using an orthogonal coordinate system, and a second region selection mode, in which a selected region indicating a partial region of the field of view is selected using a polar coordinate system.
- The apparatus also includes a data output unit that outputs distortion correction data obtained by correcting the distortion of the image data corresponding to the selected region selected in the first or second region selection mode.
- A first region selection mode and a second region selection mode are provided as region selection modes, and the selected region is set or switched in one of them.
- The area selection mode may be set to, for example, the user's preferred area selection mode, or may be set according to the place where the image is captured, the environment, the purpose, or the like.
- The subject image mainly means a moving image, but of course also includes still images.
- When the first region selection mode is selected, the selected region is switched using the orthogonal coordinate system. This is particularly effective when, for example, an image with a field of view in the horizontal direction is captured.
- That is, by selecting the first region selection mode in such a case, an operation that is highly intuitive for humans becomes possible.
- When the second region selection mode is selected, the selected region is switched using the polar coordinate system, which is effective when, for example, an image with a field of view facing upward or downward from the horizontal plane is captured. In other words, by selecting the second region selection mode in such a case, an operation that is highly intuitive for humans becomes possible.
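To make the two modes concrete, here is a minimal sketch of how a selected region might be moved in each mode. The patent gives no formulas, so the region representation and the step semantics are assumptions for illustration only.

```python
# Hypothetical sketch of the two region selection modes. The region
# representation and the step semantics are assumptions, not the patent's.

def move_cartesian(region, dx, dy):
    """First mode (MS1): shift the selected region in an orthogonal
    coordinate system, e.g. for roughly horizontal fields of view."""
    x, y = region["center"]
    return {**region, "center": (x + dx, y + dy)}

def move_polar(region, d_azimuth_deg, d_elevation_deg):
    """Second mode (MS2): move the selected region in a polar coordinate
    system (azimuth/elevation), e.g. for views facing straight up or down."""
    az, el = region["direction"]
    az = (az + d_azimuth_deg) % 360.0                 # azimuth wraps around
    el = max(-90.0, min(90.0, el + d_elevation_deg))  # elevation is clamped
    return {**region, "direction": (az, el)}

r1 = move_cartesian({"center": (0.0, 0.0)}, 10.0, -5.0)
r2 = move_polar({"direction": (350.0, 80.0)}, 20.0, 5.0)
```

The intuitiveness argument above corresponds to which coordinate wraps: near the zenith, a fixed azimuth step sweeps naturally around the view, whereas orthogonal steps would behave unevenly.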
- The image data used in the image processing apparatus may be output from an imaging unit having an imaging optical unit and an imaging element that generates image data of a subject image incident via the imaging optical unit, or may be read from a storage device that stores image data of a subject image incident through the imaging optical unit.
- the imaging unit and the storage device may be provided integrally with the image processing device or may be provided separately. Furthermore, the configuration may be such that only one of the imaging unit and the storage device is used.
- If an imaging unit is used, for example, an image of a selected area indicating a partial region of the field of view can be displayed without distortion from a wide-field subject image obtained in real time.
- If a storage device is used, for example, an image of a selected area from a wide-field subject image that has already been captured can be displayed without distortion.
- If both an imaging unit and a storage device are used, a real-time image and a past image can be viewed at the same time, which is convenient.
- The image processing apparatus may include a direction detection sensor that detects the incident direction of the optical image incident on the imaging optical unit, and the first or second region selection mode may be set based on the sensor signal from the direction detection sensor.
- For example, the region selection mode may be set according to the angle with respect to the vertical direction determined based on the sensor signal from the direction detection sensor.
- In this way, the area selection mode can be switched according to the imaging direction of the imaging optical unit, so that an appropriate area selection mode is selected depending on whether the subject is imaged from above, from below, or from the horizontal direction, improving user convenience.
- For example, the second region selection mode is used when the detected angle is less than 45 degrees or greater than 135 degrees, that is, when the imaging direction is close to the vertical.
- When the angle is not less than 45 degrees and not more than 135 degrees, the mode is switched to the first area selection mode.
- Alternatively, switching from the first region selection mode to the second region selection mode may be performed according to whether the angle exceeds a first threshold value, and switching from the second region selection mode back to the first according to whether the angle exceeds a second threshold value different from the first.
- That is, the threshold for switching from the first area selection mode to the second differs from the threshold for switching back, so the mode does not oscillate when the angle hovers near a single threshold.
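The two-threshold scheme above is a standard hysteresis pattern. A minimal sketch, assuming the angle is measured from the vertical and using illustrative threshold values around the lower (45°) boundary; the patent only requires that the two thresholds differ:

```python
# Hysteresis sketch: the thresholds are illustrative assumptions.
T_TO_POLAR = 40.0      # switch MS1 -> MS2 when the angle drops below this
T_TO_CARTESIAN = 50.0  # switch MS2 -> MS1 when the angle rises above this

def next_mode(current_mode, angle_deg):
    """angle_deg: angle of the imaging direction measured from the vertical."""
    if current_mode == "MS1" and angle_deg < T_TO_POLAR:
        return "MS2"          # near-vertical view: polar coordinate mode
    if current_mode == "MS2" and angle_deg > T_TO_CARTESIAN:
        return "MS1"          # near-horizontal view: orthogonal mode
    return current_mode       # inside the hysteresis band: keep the mode

# Near a single 45-degree boundary the mode would chatter; with two
# thresholds it stays put inside the band:
assert next_mode("MS1", 45.0) == "MS1"
assert next_mode("MS2", 45.0) == "MS2"
```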
- The image processing apparatus may include a detection sensor that detects contact with an object, and may switch between the first region selection mode and the second region selection mode according to whether contact is detected.
- For example, when imaging while moving the imaging optical unit, the area selection mode can be switched according to the distance between the imaging optical unit and the subject, so that mode switching appropriate to the user's application can be realized.
- Alternatively, the first or second area selection mode may be set according to the setting state of the selection area, making it possible to switch the area selection mode automatically depending on how the selection area is set.
- GUI (Graphical User Interface)
- An image processing method according to the present invention processes image data, including distortion of the imaging optical unit, obtained by imaging an optical image from a subject through an imaging optical unit that introduces distortion.
- The method includes a region selection mode setting step for selectively setting a first region selection mode, in which a selected region indicating a partial region of the field of view represented by the image data is selected using an orthogonal coordinate system, and a second region selection mode, in which the selected region is selected using a polar coordinate system.
- the present invention can also be applied to the invention of a program and the invention of a recording medium on which the program is recorded.
- According to the present invention, image data including the distortion of the imaging optical unit, obtained by imaging an optical image from a subject through an imaging optical unit that introduces distortion, is used.
- FIG. 1 is a block diagram showing a configuration of an image processing system according to an embodiment of the present invention.
- FIG. 2 is a diagram for explaining the relationship between a subject image formed on an image sensor and a selected area.
- FIG. 3 is a block diagram showing a functional configuration of an image processing unit.
- FIG. 4 is a diagram for explaining a display mode and a region selection mode.
- FIG. 5 is a block diagram illustrating a specific configuration of an image processing unit.
- FIG. 6 is a diagram showing an entire image.
- FIG. 7 is a diagram showing an example of an image displayed on the display unit.
- FIG. 8 is a flowchart showing the operation of a processing control unit.
- FIG. 10 is a diagram showing the field of view (imaging range) imaged by the imaging unit.
- FIG. 11 is a diagram for explaining the image height characteristics of the lens.
- FIG. 12 is a diagram for explaining the principle of distortion correction processing.
- FIG. 14 is a diagram for explaining an operation in the polar coordinate mode among the region selection modes.
- FIG. 15 is a diagram showing an example of the GUI displayed when the user issues an instruction to change the selected area using the input unit.
- FIG. 16 is a diagram for explaining a case where a selection area switching instruction is issued when the orthogonal coordinate mode is selected.
- FIG. 17 is a diagram showing the display images on the display unit 14 when the display mode is switched in order while the orthogonal coordinate mode is selected.
- FIG. 18 is a diagram showing a display image on the display unit when the polar coordinate mode is selected.
- FIG. 19 is a diagram showing a display image after a switching instruction when the polar coordinate mode is selected.
- FIG. 20 is a diagram showing a display image on the display unit when the polar coordinate mode is selected, and is a diagram for explaining a case where the display in the split display mode is turned upside down.
- FIG. 22 is a diagram for explaining enlargement/reduction processing.
- FIG. 23 is a diagram showing the state before a rotation operation instruction is given.
- FIG. 24 is a diagram for explaining rotation processing of the image in the selected area.
- FIG. 25 is another diagram for explaining rotation processing of the image in the selected area.
- FIG. 26 is a block diagram showing a configuration of an image processing system according to another embodiment of the present invention.
- FIG. 27 is a flowchart showing an example of processing performed by the image processing system shown in FIG. 26.
- FIG. 28 is a flowchart showing another example of processing performed by the image processing system shown in FIG. 26.
- FIG. 29 is a flowchart showing a storage process of position information or trajectory information.
- FIG. 30 is a diagram for explaining a locus of a predetermined range on the display image in the flow shown in FIG. 29.
- FIG. 31 is a block diagram showing a configuration of an image processing system according to still another embodiment of the present invention.
- FIG. 32 is a block diagram showing a configuration of an image processing system according to still another embodiment of the present invention.
- FIG. 33 is a diagram conceptually showing how the area selection mode MS is switched according to the installation direction of the imaging unit 11 in the image processing system of FIG. 32.
- FIG. 34 is a diagram for explaining the method of setting the threshold values for switching among the states S1, S2, and S3 in FIG. 33.
- FIG. 35 is a diagram for explaining the operation when the image processing system 40 in FIG. 32 switches among the states S1, S2, and S3.
- FIG. 36 is a diagram illustrating a coordinate calculation method when the image processing system in FIG. 32 sets the orthogonal coordinate mode and the polar coordinate mode, respectively.
- FIG. 37 is a diagram illustrating a coordinate calculation method when the image processing system in FIG. 32 sets the orthogonal coordinate mode and the polar coordinate mode, respectively.
- FIG. 38 is a diagram illustrating a coordinate calculation method when the image processing system in FIG. 32 sets the orthogonal coordinate mode and the polar coordinate mode, respectively.
- FIG. 39 is a diagram illustrating a coordinate calculation method when the image processing system in FIG. 32 sets the orthogonal coordinate mode and the polar coordinate mode, respectively.
- FIG. 40 is a diagram conceptually showing a method of switching display modes according to contact.
- FIG. 42 is a diagram showing a state where the imaging direction is 45 degrees upward with respect to the horizontal direction.
- FIG. 43 is a diagram illustrating an installation example of an imaging unit.
- FIG. 44 is a diagram for explaining automatic switching of the area selection mode.
- FIG. 45 is a diagram showing a GUI display and a moving direction of the image area when the area selection mode is automatically switched.
- FIG. 46 is a diagram for explaining a region selection mode switching operation including a mixed mode.
- FIG. 47 is a flowchart showing a region selection mode switching operation including a mixed mode.
- FIG. 48 is a flowchart showing an operation when a direction button is operated.
- FIG. 49 is a diagram showing a state in which the display mode MH is changed according to switching of the selection area in still another embodiment of the present invention.
- FIG. 50 is a diagram showing a state in which the display mode MH is changed in accordance with switching of the region selection mode MS in still another embodiment of the present invention.
- FIG. 51 is a flowchart showing an operation flow of the image processing system when display mode switching processing is performed.
- FIG. 52 is a flowchart showing another flow of operation of the image processing system when display mode switching processing is performed.
- FIG. 53 is a diagram showing another form of display mode switching processing.
- FIG. 1 is a block diagram showing a configuration of an image processing system according to an embodiment of the present invention.
- the image processing system 10 includes an imaging unit 11, an input unit 12, an image processing unit 13, and a display unit 14.
- the imaging unit 11 is configured using an imaging optical unit 111 and an imaging element 112.
- the imaging optical unit 111 is for forming a subject image on the imaging surface of the imaging element 112.
- the imaging optical unit 111 uses, for example, a wide-angle lens in order to form a wide-field subject image on the imaging surface of the imaging element 112.
- As the wide-angle lens, one having an angle of view of, for example, about 45 degrees or more is used, but the lens is not limited to this.
- The imaging optical unit 111 may be configured using a fisheye lens, a PAL (Panoramic Annular Lens), or the like.
- Alternatively, a cylindrical, bowl-shaped, or pyramidal mirror may be used so that a wide-field subject image is formed on the imaging surface of the image sensor 112 by reflection from the mirror.
- The field of view may be further expanded by combining a plurality of lenses and mirrors. For example, by using two fisheye lenses each with an angle of view of about 180°, it is possible to obtain a subject image with a field of view in all directions (a spherical space of 360°).
- the image sensor 112 uses, for example, a CCD (Charge Coupled Device) or a CMOS (Complementary Metal-Oxide Semiconductor) sensor that converts light into an electrical signal.
- the image pickup device 112 generates image data DVa based on the subject image and supplies it to the image processing unit 13.
- FIG. 2 shows the relationship between the object image formed on the image sensor 112 and the selected area when a fisheye lens is used for the imaging optical unit 111.
- Since the angle of view of the imaging optical unit 111 is, for example, about 180° and the field of view can be represented as the hemispherical surface 51 in FIG. 2, the subject image formed on the image sensor 112 (hereinafter referred to as the "wide-field image" Gc) is an image distorted by the imaging optical unit 111, for example a circular image.
- If the wide-field image is displayed as it is, the displayed image contains the distortion caused by the imaging optical unit 111.
- this selection area corresponds to, for example, the selection area 71 within the angle of view of the imaging optical unit 111.
- the image processing unit 13 can display the image of the selected region without distortion by performing distortion correction processing for correcting the distortion generated by the imaging optical unit 111 on the image of the image region ARs.
- By setting the selection area so that the desired subject is included within the angle of view of the imaging optical unit 111, it is possible to display an image of the desired subject without distortion.
- If the position of the selected area is switched to a new position, or if the size or shape of the selected area is changed, the position, size, or shape of the image area ARs corresponding to the selected area also changes. Therefore, an image at an arbitrary position or region within the angle of view of the imaging optical unit 111 can be displayed with the distortion of the imaging optical unit 111 corrected.
- The selection area can be set by designating the position of the selection area 71 within the angle of view of the imaging optical unit 111, the angle range indicating the extent of the selection area 71, and the like. Further, since the image area ARs set on the wide-field image Gc corresponds to the selection area as described above, the selection area can also be set by specifying the position, range, etc. of the image area ARs.
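The correspondence between a direction within the angle of view and a point in the wide-field image Gc depends on the image height characteristics of the lens (see FIG. 11). As an illustration only, assuming the common equidistant fisheye model r = f·θ (which the patent does not specify), a viewing direction maps to image coordinates as follows:

```python
import math

def direction_to_image_point(azimuth_deg, zenith_deg, f, cx, cy):
    """Map a viewing direction to pixel coordinates in the circular
    wide-field image, assuming an equidistant projection r = f * theta.
    azimuth_deg: angle around the optical axis; zenith_deg: angle from
    the optical axis; (cx, cy): centre of the image circle."""
    theta = math.radians(zenith_deg)
    phi = math.radians(azimuth_deg)
    r = f * theta                        # image height (equidistant model)
    return (cx + r * math.cos(phi), cy + r * math.sin(phi))

# A point on the optical axis maps to the image centre:
assert direction_to_image_point(0.0, 0.0, 100.0, 320.0, 240.0) == (320.0, 240.0)
```

Applying this mapping to the corners or boundary of a selected region gives the corresponding image area ARs; distortion correction is then the inverse resampling from ARs back to an undistorted view.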
- According to user operation, the input unit 12 sets the operation mode used when switching the position of the selected region, changing the region size and shape of the selected region, and so on, as well as the image display mode.
- the input unit 12 may be any device that can be operated by the user. For example, a mouse, a keyboard, a switch device, a touch sensor, a controller for a game device, a stick-like operation device that can be held by a user, and the like can be given.
- the input unit 12 generates input information PS corresponding to the user operation and supplies it to the image processing unit 13.
- The image processing unit 13 performs distortion correction processing using the image data DVa supplied from the imaging unit 11 to generate an image of the selected region in which the distortion caused by the imaging optical unit 111 is corrected. Further, the image processing unit 13 sets the display mode of the image to be displayed on the display unit 14, generates display image data DVd corresponding to the set display mode, and supplies it to the display unit 14. The image processing unit 13 uses the wide-field image, the distortion-corrected image of the selected region, or the like as the display image.
- The image processing unit 13 sets an area selection mode, which is the operation mode for switching the position of the selection area, and performs selection-area switching processing in the set area selection mode based on the input information PS from the input unit 12.
- At the first start of operation, the image processing unit 13 sets a display mode and an area selection mode specified in advance; thereafter, it may start operation with the display mode and area selection mode that were set at the end of the previous operation.
- the display unit 14 includes a liquid crystal display element, an organic EL element, and the like, and performs image display based on the image data DVd supplied from the image processing unit 13.
- The imaging unit 11, the input unit 12, the image processing unit 13, and the display unit 14 may be provided as a single integrated unit or separately, and only some of them may be integrated. For example, if the input unit 12 and the display unit 14 are provided integrally, the input unit 12 can easily be operated while checking the display on the display unit 14. Likewise, in the imaging unit 11, the imaging optical unit 111 and the imaging element 112 may be integrated or provided separately.
- FIG. 3 is a block diagram showing a functional configuration of the image processing unit 13.
- the image processing unit 13 includes a distortion correction unit 13a, a region selection mode setting unit 13b, a display mode setting unit 13c, a data output unit 13d, and a control unit 13e.
- the distortion correction unit 13a performs distortion correction for correcting distortion of the imaging optical unit 111 using image data corresponding to the selected region in the image data DVa, and generates distortion correction data.
- the area selection mode setting unit 13b sets an area selection mode that is an operation mode when setting or switching the selection area.
- The region selection mode setting unit 13b provides, as the region selection modes MS, an orthogonal coordinate mode MS1, which is the first region selection mode, and a polar coordinate mode MS2, which is the second region selection mode, and sets one of these modes. Each region selection mode will be described later.
- the display mode setting unit 13c sets a display mode for displaying an image or the like that has been subjected to distortion correction on the display unit 14.
- As shown in FIG. 4, for example, the display mode setting unit 13c provides, as display modes MH, a whole-image display mode MH1, a selected-image display mode MH2, a both-display mode MH3, and a split display mode MH4, and sets one of them as the display mode. Each display mode will be described later.
- The data output unit 13d outputs the image data of the display image corresponding to the set display mode.
- When only the distortion-corrected image of the selected area is displayed, the distortion correction data is output.
- When only the wide-field image is displayed, the image data supplied from the imaging unit 11 is output.
- When both the image of the selected area in which the distortion of the imaging optical unit 111 is corrected and the wide-field image are displayed, new image data is generated and output using the distortion correction data and the image data supplied from the imaging unit 11.
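The output selection just described can be sketched as follows. The mode names follow FIG. 4; the `compose` helper is an assumed stand-in for the actual compositing, whose layout the patent leaves to the display mode:

```python
def make_display_image(mode, whole_image, corrected_image, compose):
    """Sketch of the data output logic: which data the output unit emits
    for each display mode MH1..MH4 (names per FIG. 4)."""
    if mode == "MH1":                      # whole-image display mode
        return whole_image
    if mode == "MH2":                      # selected-image display mode
        return corrected_image
    if mode in ("MH3", "MH4"):             # both / split display modes
        return compose(whole_image, corrected_image)
    raise ValueError(f"unknown display mode: {mode}")

# Toy compositor: real code would lay the two images out on one frame.
side_by_side = lambda a, b: a + "|" + b
assert make_display_image("MH3", "Gcp", "DVsc", side_by_side) == "Gcp|DVsc"
```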
- The control unit 13e performs setting and switching of the selection region in accordance with the region selection mode MS set by the region selection mode setting unit 13b.
- FIG. 5 is a block diagram illustrating a specific configuration of the image processing unit 13.
- the image data DVa is supplied to the image extraction processing unit 131 and the distortion correction processing unit 133.
- the image extraction processing unit 131 extracts the image data DVc of the wide-field image (subject image) Gc from the image data DVa, and supplies the image data DVc to the selection region emphasis display processing unit 132.
- The wide-field image Gc occupies a partial region on the sensor surface of the image sensor 112, determined by the imaging optical unit 111. For this reason, when the region of the wide-field image Gc on the sensor surface is fixed, the pixel data of that region is extracted from the image data DVa at predetermined pixel positions.
- When the imaging optical unit 111 is replaceable and the region of the wide-field image Gc on the sensor surface therefore changes, or when the optical characteristics of the imaging optical unit 111 can be changed so that the region of the wide-field image Gc changes, the region of the wide-field image Gc on the sensor surface is determined in advance, and the image data of the determined region is extracted.
- For example, imaging is performed with the entire field of view of the imaging optical unit 111 covered by a white subject, and the pixel positions at which the image data DVa is at white level are detected. In this way, the region of the wide-field image Gc can easily be determined.
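The white-subject calibration described above can be sketched as follows, assuming a single 2-D luminance frame and an illustrative white-level threshold:

```python
# Sketch of the calibration described above: image an all-white subject and
# treat pixels at (near) white level as the region of the wide-field image Gc.
# The threshold value is an assumption for illustration.
def find_wide_field_region(frame, white_threshold=240):
    """frame: 2-D list of luminance values obtained by imaging a white
    subject over the whole field of view. Returns the bounding box
    (top, bottom, left, right) of the white (image circle) region."""
    ys = [y for y, row in enumerate(frame)
          if any(v >= white_threshold for v in row)]
    xs = [x for x in range(len(frame[0]))
          if any(row[x] >= white_threshold for row in frame)]
    return ys[0], ys[-1], xs[0], xs[-1]

# Synthetic check: a 100x100 frame that is white except for a dark border.
frame = [[255 if 20 <= y <= 79 and 20 <= x <= 79 else 0
          for x in range(100)] for y in range(100)]
assert find_wide_field_region(frame) == (20, 79, 20, 79)
```

A production version would also fit a circle to the mask, since the image circle of a fisheye lens is round rather than rectangular.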
- Based on the selection area setting information JA supplied from the processing control unit 135 described later, the selection area highlighting processing unit 132 processes the image area ARs in the wide-field image Gc corresponding to the selected area indicated by the information JA so that the user can easily identify it.
- For example, the selection area highlighting processing unit 132 provides a boundary display indicating the boundary between the image area ARs and the rest of the image, or changes the luminance or color of the area outside the image area ARs, performing display control so that the image area ARs can be identified.
- The image in the image area ARs, emphasized in this way so as to be identifiable, is called the enhanced image Gs.
- By performing such image processing, image data DVcp of an image in which the image area ARs is identified as the enhanced image Gs within the wide-field image Gc (hereinafter referred to as the "whole image Gcp") is generated and supplied to the image output processing unit 134.
- The distortion correction processing unit 133 corresponds to the distortion correction unit 13a shown in FIG. 3. It performs the distortion correction processing of the imaging optical unit 111 using the image data DVa, generates corrected image data DVsc in which the distortion of the selected region indicated by the selection area setting information JA supplied from the processing control unit 135 is corrected, and supplies it to the image output processing unit 134.
- The image output processing unit 134 corresponds to the data output unit 13d shown in FIG. 3, and generates the image data DVd of the display image using the image data DVcp and/or the corrected image data DVsc, based on the display control information JH from the processing control unit 135.
- the process control unit 135 corresponds to the region selection mode setting unit 13b, the display mode setting unit 13c, and the control unit 13e shown in FIG.
- the process control unit 135 sets the area selection mode, and performs selection area setting and switching based on the input information PS from the input unit 12 according to the set area selection mode.
- The process control unit 135 generates selection area setting information JA indicating the newly set selection area and supplies it to the selection area highlighting processing unit 132 and the distortion correction processing unit 133.
- The process control unit 135 also sets the display mode, generates display control information JH according to the set display mode, and supplies it to the image output processing unit 134.
- When a menu is to be displayed, the processing control unit 135 uses the display control information JH to have a menu display included in the display image.
- The image processing unit 13 is configured using, for example, a CPU (Central Processing Unit), a RAM (Random Access Memory), and the like.
- Alternatively, the image processing unit 13 may be composed of an FPGA (Field Programmable Gate Array) or a DSP (Digital Signal Processor), and may further include a video encoder, a sound encoder, an interface for acquiring the input information PS, and an interface for outputting the image data DVd to the display unit 14. An FPGA and a DSP may also be used together, sharing the processing between them.
- the input information PS supplied from the input unit 12 to the image processing unit 13 is information indicating the setting of the display mode MH and the region selection mode MS, the selection region switching instruction, and the like as described above.
- the information indicating the selection area switching instruction may include information for moving the position of the selected area by a predetermined amount in a predetermined direction, information indicating the position of the selected area after the change, and the like.
- the input information PS may also include information for changing the area size of the selected area, information for rotating the selected area, information for setting the area shape of the selected area, and the like.
- the selection area is not limited to a mode in which switching or the like is performed according to a user operation.
- a mode in which the selection area is set at a position designated in advance as described above is also conceivable.
- the information related to the selected area may be stored in advance in a ROM (not shown) or an external storage device.
- a form is also conceivable in which a specific area in the wide-field image Gc is automatically recognized and the recognized area is processed as the image area ARs corresponding to the selected area.
- a mode in which an area including an image of the moving subject is automatically processed as an image area ARs is also conceivable.
- an image region in the wide-field image Gc detected using various sensors (not shown) may also be processed as the image area ARs corresponding to the selected region.
- examples of such sensors include a temperature sensor, a sound sensor, a pressure sensor, an optical sensor, a humidity sensor, a vibration sensor, a gas sensor, and various other sensors.
- the sensor signals generated by these sensors are supplied to the processing control unit 135, and the processing control unit 135 uses the supplied sensor signals to switch the selected region and control each unit. For example, when an abnormal temperature or abnormal sound is detected, the selected area can be switched automatically according to the direction in which the abnormality was detected, thereby changing the screen of the display unit 14.
- the captured image in the direction in which the abnormality is detected can be displayed in a corrected state.
- if the display mode, the region size, or the region shape is switched in accordance with the detection of an abnormality, a display that allows the user to easily confirm the abnormality becomes possible.
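The sensor-driven switching described above can be sketched as follows. This is a hypothetical illustration: the function name, the direction-keyed map of sensor readings, and the callback standing in for the processing control unit 135's region-switching operation are all assumptions for illustration, not part of the source.

```python
def on_sensor_event(sensors, threshold, set_selection_direction):
    """When any sensor reading exceeds its threshold, automatically switch
    the selection area toward the direction of the detected abnormality.

    `sensors` maps a viewing direction (degrees) to a sensor reading;
    `set_selection_direction` stands in for the processing control unit's
    selection-region switching operation."""
    for direction, reading in sensors.items():
        if reading > threshold:
            set_selection_direction(direction)  # point the selected area there
            return direction
    return None  # no abnormality detected; selected area unchanged
```

Any sensor type named in the text (temperature, sound, pressure, and so on) fits this shape, since only the threshold comparison and the detection direction matter to the switching logic.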
- the configuration shown in FIG. 5 is exemplary; the image processing unit 13 is not limited to it as long as it has the functional configuration shown in FIG. 3.
- the above assumes the case where the image data DVa supplied from the imaging unit 11 indicates only the wide-field image Gc.
- the selection area highlighting processing unit 132 may be provided on the output side instead of the input side of the image output processing unit 134. In this case, when a wide-field image is included in the image based on the image data DVa, processing is performed so that the user can easily identify the image region corresponding to the selected region in the wide-field image.
- FIG. 6 shows the entire image.
- the wide-field image Gc is an image having distortion caused by the imaging optical unit 111.
- the selection region is set by the processing control unit 135, the selection region emphasis display processing unit 132 performs image processing so that the selection region can be easily identified as described above.
- for example, a boundary display (such as a frame display) is provided for the image area ARs, or the brightness and color of the area excluding ARs are changed, so that the emphasized image Gs is obtained by emphasizing the image area ARs.
- the image area ARs corresponding to the selected area is shown within the entire wide-field image Gc.
- when the entire image display mode MH1 is set, the processing control unit 135 controls the operation of the image output processing unit 134 with the display control information JH so that, as shown in FIG., image data DVd having only the entire image Gcp as the display image is generated, with the image area ARs in Gc identifiable as the emphasized image Gs.
- when the selected image display mode MH2 is set, the processing control unit 135 generates image data DVd having as the display image only the image of the image area ARs with the distortion caused by the imaging optical unit 111 corrected (hereinafter referred to as the "selected area display image" Gsc).
- the processing control unit 135 when both display modes MH3 are set, the processing control unit 135 generates image data DVd of a display image in which the entire image Gcp and the selected area display image Gsc are displayed simultaneously as shown in FIG.
- in this case, image data DVd of a display image is generated in which the distortion-corrected image of the area corresponding to each emphasized image is displayed together with the entire image Gcp as the selection area display image Gsc.
- the image output processing unit 134 may use a technique such as OSD (On Screen Display).
- the process control unit 135 performs initial setting of the display mode and the area selection mode and sets the selection area (ST1001). For example, at the start of the first operation, the preset display mode and area selection mode are selected, and the selection area is set to a predetermined size in the preset viewing direction. Alternatively, information indicating the display mode, the area selection mode setting status, and the selection area setting status may be stored at the end of operation, and this information may be used at the start of the next operation so that operation resumes in the state at the end of the previous operation.
- the process control unit 135 determines whether or not the input information PS supplied from the input unit 12 is information that causes a change in setting or the like (ST1002).
- the processing control unit 135 sets the display mode and the area selection mode according to the acquired input information PS.
- when the selection area is changed, the operation of the distortion correction processing unit 133 is controlled so that the distortion correction processing is performed on the image of the selected area after the change.
- the selected area highlighting processing unit 132 is controlled so that the image of the selected area after the change can be identified.
- the operation of the image output processing unit 134 is controlled so that the image data DVd corresponding to the changed mode is generated (ST1003).
- when the acquired input information PS is for changing the size or shape of the selected area, the distortion correction processing unit 133 and the selected area highlighting processing unit 132 are controlled so that distortion correction and highlight processing are performed corresponding to the selected area after the change. For example, even when the shape of the selected area is set to a circle, an ellipse, a triangle, a pentagon or higher polygon, a shape composed of straight lines and curves, or another complex shape, etc.,
- the distortion correction processing unit 133 is controlled so that distortion correction is performed on the image included in the selection region of that shape.
- the operation of the selection area emphasis display processing unit 132 is controlled so that an emphasized image corresponding to the selection area of these shapes is obtained.
- the process then returns to ST1002, and it is determined whether or not the input information PS newly supplied from the input unit 12 is information that causes a change in setting or selection.
- in this way, the process of acquiring the input information PS and obtaining the image of the selected area with its distortion corrected in accordance with the input information PS is referred to as development processing.
- as shown in FIG. 9, when the field of view is expressed in a three-dimensional space, the selected area can be expressed on the spherical surface 52.
- the angle ⁇ indicates the incident angle when the arrow OA is the optical axis.
- the imaging optical unit 111 is configured using a fisheye lens having an angle of view of, for example, about 180 °, the field of view corresponds to the hemispherical portion of the spherical surface 52. For this reason, when the imaging optical unit 111 is installed directly upward, the upper half of the spherical surface becomes the field of view of the imaging optical unit 111 as shown in FIG.
- the field of view of the upper half of the sphere is called the field of upper hemisphere.
- the lower half of the spherical surface becomes the field of view of the imaging optical unit 111 as shown in FIG.
- the field of view of the lower half of the sphere is called the field of lower hemisphere.
- the front half of the spherical surface is the field of view of the imaging optical unit 111, as shown in FIG.
- the field of view of the front half of the spherical surface is called the field of front hemisphere.
- the field of view in the case of imaging the left direction or the right direction instead of the front is a left hemisphere field or a right hemisphere field.
- as a case where the imaging optical unit 111 is installed facing directly upward, that is, where the optical axis of the imaging optical unit 111 substantially coincides with a vertical line and the imaging direction is upward, it is assumed, for example, that the view is upward from the ground, a floor, or a desktop. As a case where the imaging optical unit 111 is installed facing directly downward, that is, where the optical axis of the imaging optical unit 111 substantially coincides with the vertical line and the imaging direction is downward, it is assumed, for example, that the view is downward from a ceiling or the sky. As a case where the imaging optical unit 111 is installed in the horizontal direction, it is assumed, for example, that the horizontal direction is viewed from a wall perpendicular to the ground.
- a field of view intermediate between the upper half and the lower half is also conceivable.
- in this way, a hemispherical field of view corresponding to the installation direction of the imaging optical unit 111 (or, when the imaging optical unit 111 and the imaging device 112 are configured as one body, the installation direction of the imaging unit 11) is obtained.
- the direction of the field of view changes depending on the installation direction not only when a fisheye lens is used but also when a wide-angle lens, a mirror, or the like is used. Further, when the visual field is wide, a part of the visual field range may be selected and the selected visual field range may be used.
- the distortion correction processing of the distortion correction processing unit 133 will be described.
- geometric correction is used.
- a general algorithm for converting into a two-dimensional orthogonal coordinate system without distortion may be used.
- the conversion formula and conversion table may be stored in a ROM or other memory (not shown).
- the present invention is not limited to such distortion correction, and other known distortion correction may be used.
- FIG. 11 is a diagram for explaining the image height characteristics of the lens.
- the upper hemisphere field of view centered at point O is displayed in two dimensions when viewed from the y-axis direction.
- an arrow OPk indicates, for example, the direction of the subject.
- the distance from the point O to the imaging point Q is the image height Lh.
- FIG. 11B is a graph illustrating the image height characteristics.
- the horizontal axis shows the angle (incident angle) ⁇
- the vertical axis shows the image height Lh.
- the data may be stored in the memory in advance as a conversion table.
- FIG. 12 is a diagram for explaining the principle of the distortion correction processing.
- (A) of FIG. 12 shows the display plane 81 indicating the range of the image displayed on the display unit 14.
- FIG. 12B shows a state in which the display plane 81 is set with respect to the spherical surface 51 of the upper hemisphere field of view.
- the display plane 81 indicating the range of the display image corresponds to the selection area.
- (C) of FIG. 12 shows a state in which the spherical surface 51 shown in FIG. 12(B) is projected onto the xy plane; the area onto which the spherical surface 51 is projected corresponds to the area of the entire image Gcp.
- the angle φ between the x axis and the direction of the point P' on the xy plane corresponding to the point P can be obtained, and the position of the imaging point Q can be obtained from the angle φ and the image height Lh.
- as shown in FIG., the drawing of the point P on the display plane 81 is performed based on the pixel data at this imaging point Q.
- an image in which distortion generated by the imaging optical unit 111 is corrected can be displayed on the display plane 81.
- when the imaging point Q does not coincide with a pixel position, a pixel signal corresponding to the imaging point Q can be generated by performing interpolation or the like using the pixel signals of pixels located around the imaging point Q.
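The mapping described above, from a point P on the display plane to an imaging point Q via the angles θ and φ and the image height Lh, can be sketched as follows. This is a minimal illustration, assuming an equidistant fisheye model (Lh = focal · θ) in place of the actual image height characteristic of the imaging optical unit 111 (which would come from a conversion table), with bilinear interpolation for a non-integer Q; all function names are illustrative.

```python
import numpy as np

def remap_point(px, py, pz, focal, cx, cy):
    """Map a point P on the display plane (camera coordinates, optical
    axis = +z) to source-image coordinates (u, v) of the imaging point Q.
    Assumes an equidistant fisheye model: image height Lh = focal * theta."""
    r = np.sqrt(px * px + py * py)
    theta = np.arctan2(r, pz)        # incident angle against the optical axis
    phi = np.arctan2(py, px)         # azimuth angle on the xy plane
    lh = focal * theta               # image height characteristic Lh(theta)
    return cx + lh * np.cos(phi), cy + lh * np.sin(phi)

def bilinear(img, u, v):
    """Interpolate a pixel value at the non-integer imaging point Q = (u, v)
    from the four surrounding pixels, as described in the text."""
    u0, v0 = int(np.floor(u)), int(np.floor(v))
    du, dv = u - u0, v - v0
    p = img[v0:v0 + 2, u0:u0 + 2].astype(float)
    return ((1 - du) * (1 - dv) * p[0, 0] + du * (1 - dv) * p[0, 1]
            + (1 - du) * dv * p[1, 0] + du * dv * p[1, 1])
```

Drawing every point P of the display plane this way yields the selected area display image Gsc with the fisheye distortion corrected; replacing the equidistant formula with a table lookup of Lh(θ) gives the table-driven variant mentioned in the text.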
- the processing control unit 135 can use, as the selection area setting information JA, information indicating the selection area by the angles θ and φ shown in FIG. 12(B). In this case, since the image area ARs corresponding to the selected area can be determined from the angle φ and the image height Lh corresponding to the angle θ, the selected area emphasis display processing unit 132 can generate the image data DVcp of the entire image Gcp with the image area ARs emphasized as the emphasized image Gs.
- the distortion correction processing unit 133 can generate the corrected image data DVsc of the selected area display image Gsc with the distortion corrected, by obtaining pixel data corresponding to each pixel based on the angles θ and φ indicating each pixel on the display plane corresponding to the selection region.
- the image data DVcp of the entire image Gcp and the corrected image data DVsc of the selected area display image Gsc can likewise be obtained by performing the above-described arithmetic processing when the selection area is indicated by coordinate values. If coordinate values are used, the selected area can easily be indicated even when its shape becomes complicated.
- the area selection mode MS includes the Cartesian coordinate mode MS1 as the first area selection mode and the polar coordinate mode MS2 as the second area selection mode.
- the Cartesian coordinate mode MS1 is a mode suited to the case where, as shown in FIG. 10(C), the horizontal direction is viewed from a wall or the like perpendicular to the ground as the front hemisphere field of view; in this mode, a distortion-corrected image of a desired subject can easily be obtained.
- in the Cartesian coordinate mode MS1, the processing control unit 135 performs arithmetic processing for moving the selection region 71 in the axial direction of the orthogonal coordinate system based on the switching instruction, and generates selection area setting information JA indicating the newly set selection area.
- FIG. 13 is a diagram for explaining the operation in the orthogonal coordinate mode MS1 in the region selection mode.
- the selection area 71 is switched according to the switching instruction using the Cartesian coordinate system.
- the switching instruction indicates, for example, the x and y coordinate values of the selected area after switching, or the amount of movement of the selected area in the x direction and the y direction, and the selected area is switched on the orthogonal coordinates accordingly.
- if only the x coordinate or only the y coordinate is changed, the selection area 71 moves to a new position in the axial direction of the orthogonal coordinate system; if both the x coordinate and the y coordinate are changed, the selection area 71 moves to a new position oblique to the axial directions of the orthogonal coordinate system.
- when the selection area 71 is sequentially moved in the x direction based on the switching instruction, the locus of an arbitrary point (for example, the center PO) on the selection area 71 is the line 51x; when it is sequentially moved in the y direction, the locus of the center PO is the line 51y. Note that the image area ARs also moves when the selection area 71 is moved.
- in the orthogonal coordinate mode MS1, the selection area is switched by changing the coordinate values of the orthogonal coordinate system. For this reason, when the field of view is the front hemisphere, selecting the Cartesian coordinate mode MS1 makes it easy to set the selection area 71 to a position moved in the horizontal or vertical direction according to the selection area switching instruction, so the image displayed on the display unit 14 can easily be switched to an image in a desired direction. For example, a desired subject can easily be selected and displayed from subjects arranged in the horizontal direction.
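The switching in the orthogonal coordinate mode MS1 can be sketched as follows. This is a hypothetical illustration: the `SelectionArea` type and field names are assumptions, and the two calling forms mirror the two kinds of switching instruction named in the text (absolute coordinates of the area after switching, or movement amounts along each axis).

```python
from dataclasses import dataclass

@dataclass
class SelectionArea:
    x: float        # center position on the orthogonal coordinate system
    y: float
    width: float
    height: float

def switch_cartesian(area, dx=0.0, dy=0.0, abs_x=None, abs_y=None):
    """Switch the selection area in orthogonal coordinate mode MS1.
    Either absolute coordinates (abs_x / abs_y) or movement amounts
    (dx / dy) may be given; the size of the area is unchanged."""
    new_x = abs_x if abs_x is not None else area.x + dx
    new_y = abs_y if abs_y is not None else area.y + dy
    return SelectionArea(new_x, new_y, area.width, area.height)
```

Changing only `dx` moves the area along one axis (the line 51x in FIG. 13); changing both `dx` and `dy` moves it obliquely, as the text describes.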
- the polar coordinate mode MS2 is a mode in which a distortion-corrected image of a desired subject can easily be obtained when, for example, viewing upward from the ground, a floor, or a desk as the upper hemisphere field of view as shown in FIG. 10(A), or viewing downward from a ceiling or the sky as the lower hemisphere field of view as shown in FIG. 10(B).
- in the polar coordinate mode MS2, the processing control unit 135 performs arithmetic processing that moves the selection area 71 in the direction in which a declination of the polar coordinate system changes, based on the switching instruction, and generates selection area setting information JA indicating the newly set selection area.
- FIG. 14 is a diagram for explaining the operation in the polar coordinate mode MS2 in the region selection mode.
- the selection area 71 is switched according to the switching instruction using the polar coordinate system.
- the switching instruction indicates, for example, the declination θag and the declination φag of the selected area after switching, or the amount of change of the declination θag and the amount of change of the declination φag, and the selected area 71 is switched on the polar coordinates accordingly.
- if only the declination θag or only the declination φag is changed, the selected area 71 moves to a new position in the direction in which the declination θag changes (hereinafter "θag change direction") or in the direction in which the declination φag changes (hereinafter "φag change direction"). If both the declination θag and the declination φag are changed, the selected area 71 moves to a new position oblique to the θag change direction and the φag change direction of the polar coordinate system.
- when the selected area 71 is sequentially moved in one declination change direction based on the switching instruction, the locus of an arbitrary point (for example, the center PO) on the selection area 71 is the line 51r; when it is moved in the other declination change direction, the locus of the center PO is the line 51s. Note that the image area ARs also moves when the selection area 71 is moved.
- the switching instruction may also indicate, for example, the declination θag and the radius of the selected area after switching, or the amount of change of the declination θag and the amount of change of the radius; in this way, the selection area 71 can be switched on the polar coordinates even when the field of view is expressed in a two-dimensional space.
- in the polar coordinate mode MS2, the selection area is switched by changing the declination and/or radius of the polar coordinate system. For this reason, if the polar coordinate mode MS2 is selected when the field of view is the upper hemisphere field or the lower hemisphere field, the selected area 71 can easily be set to a position moved in the declination change direction according to the selection area switching instruction, so the image displayed on the display unit 14 can easily be switched to an image in a desired direction. For example, a desired subject can easily be selected and displayed from the subjects positioned around the imaging optical unit 111.
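The switching in the polar coordinate mode MS2 can likewise be sketched. This is a hypothetical illustration, assuming degrees, that the azimuthal declination θag wraps around the full circle while the declination φag from the optical axis is limited to the hemisphere field of view; the function name and these conventions are assumptions for illustration.

```python
def switch_polar(theta_ag, phi_ag, d_theta=0.0, d_phi=0.0):
    """Switch the selection area in polar coordinate mode MS2 by changing
    the declination angles. theta_ag (azimuth) wraps to [0, 360) degrees;
    phi_ag (angle from the optical axis) is clamped to the hemisphere."""
    new_theta = (theta_ag + d_theta) % 360.0          # theta-ag change direction
    new_phi = min(max(phi_ag + d_phi, 0.0), 90.0)     # phi-ag change direction
    return new_theta, new_phi
```

Changing `d_theta` alone moves the image area ARs around the circumference of the entire image Gcp (the counterclockwise/clockwise motion described for FIG. 18); changing `d_phi` alone moves it radially.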
- FIG. 15 shows an example of a GUI (Graphical User Interface) displayed when the user operates the selection area using the input unit 12.
- the operation input screen Gu shown in (A) of FIG. 15 and (B) of FIG. 15 may be displayed on the display unit 14 together with the entire image Gcp and the selection area display image Gsc shown in FIG.
- the operation input screen Gu and the whole image Gcp may be displayed on separate display units.
- a display unit is provided separately from the display unit 14 that displays the entire image Gcp and the selection area display image Gsc, for example, a display unit is provided in the input unit 12 and GUI display is performed on this display unit. Also good.
- the operation input screen Gu is provided with a direction button group Gua or a direction button group Gub, an “enlarge” button Gucl, and a “reduction” button Guc2.
- in the direction button group Gua, for example, there are direction buttons Gua2 such as "Up", "Down", "Right", and "Left" buttons around the central "select" button Gual.
- in the direction button group Gub, there are direction buttons Gub2 such as a "North" button and an "SE (South East)" button around the central "select" button Gubl.
- FIG. 16 is a diagram for explaining a case where a selection area switching instruction is issued when the orthogonal coordinate mode MS1 is selected.
- here, the case where the both display mode MH3 is set as the display mode MH is described.
- a description will be given of an example in which the user performs an operation while looking at the operation input screen Gu shown in FIG.
- Continuous press refers to a state in which the state of being pressed once is not released, and is a so-called “long press”.
- the processing control unit 135 of the image processing unit 13 performs selection region switching processing according to the orthogonal coordinate mode MS1 in response to the input information PS from the input unit 12, and generates selection area setting information JA indicating the newly set selection area. The process control unit 135 supplies the generated selection area setting information JA to the selection area emphasis display processing unit 132 and changes the highlighted area so that the emphasized image Gs indicates the image area ARs corresponding to the newly set selection area. Further, the processing control unit 135 supplies the generated selection region setting information JA to the distortion correction processing unit 133, which generates the selected area display image Gsc in which the distortion caused by the imaging optical unit 111 in the image of the image area ARs corresponding to the newly set selection region is corrected.
- the image output processing unit 134 generates image data DVd of a display image including the selection area display image Gsc according to the display mode and supplies the generated image data DVd to the display unit 14.
- the display unit 14 displays an image when the selected area is moved to the right as shown in FIG. Also, the position of the emphasized image Gs is updated in the entire image Gcp.
- the processing control unit 135 sets the movement amount of the selected area according to the number of times the "Right" button is operated or the period it is held down, and the selected area display image Gsc is updated accordingly. In this way, the processing control unit 135 performs the development processing using the orthogonal coordinate system shown in FIG. 13 and causes the display unit 14 to display the selected area display image.
- when the user presses an "Up" button, for example, the processing control unit 135 similarly performs selection area switching processing according to the input information PS; along with the switching of the selection area, the emphasized image Gs of the moving image area ARs is displayed, and the selected area display image Gsc in which the distortion caused by the imaging optical unit 111 is corrected is displayed on the display unit 14.
- the buttons Gual and Gubl in (A) and (B) of FIG. 15 can be used in various ways. For example, they can be used as a recording start button for recording the selected area display image Gsc of the current image area ARs. When the image area ARs is not indicated as the emphasized image Gs, they can be used as an area selection operation start button that displays the image area ARs as the emphasized image Gs and allows the user to perform the area selection operation. Alternatively, they can be used as a switching button for the display mode MH or as other various buttons.
- when the processing control unit 135 determines that the "East" button has been operated, the selection area display image Gsc is updated by newly setting the selection area in the "East" direction. Further, when the processing control unit 135 determines that the "West" button has been operated, the selection area display image Gsc is updated by newly setting the selection area in the "West" direction. In this way, by operating the button indicating a desired direction, an image in which the selection area is set at a desired new position can be displayed on the display unit 14 without distortion.
- the GUI display shown in FIGS. 15(A) and 15(B) is merely an example, and needless to say, the present invention is not limited to this.
- FIG. 17 shows a display image displayed on the display unit 14 when the display mode is sequentially switched when, for example, the orthogonal coordinate mode is selected.
- A in FIG. 17 is a display image when the whole image display mode MH1 is set, and only the whole image Gcp is displayed.
- B of FIG. 17 is a display image when the selected image display mode MH2 is set, and only the selected region display image Gsc is displayed.
- C in FIG. 17 is a display image when both display modes MH3 are set, and both the entire image Gcp and the selection area display image Gsc are displayed.
- when the display modes are sequentially switched, the entire image display mode MH1, the selected image display mode MH2, and the both display mode MH3 are cyclically switched.
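The cyclic switching just described can be sketched as a minimal state transition. The mode identifiers come from the text; the function name and list representation are illustrative assumptions.

```python
# Whole image (MH1) -> selected image (MH2) -> both (MH3) -> back to MH1.
DISPLAY_MODES = ["MH1", "MH2", "MH3"]

def next_display_mode(current):
    """Return the next display mode in the cyclic order MH1 -> MH2 -> MH3."""
    i = DISPLAY_MODES.index(current)
    return DISPLAY_MODES[(i + 1) % len(DISPLAY_MODES)]
```

Adding the divided display mode MH4 to the cycle when a plurality of selection areas exist (as in FIG. 21) would only require appending it to the list.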
- FIG. 18 shows a display image displayed on the display unit 14 when the polar coordinate mode MS2 is selected.
- FIG. 18 shows a case where two selection areas are provided. Note that the number of selection areas is not limited to one or two, and the user may be allowed to set them freely. For example, the number may be increased each time the "Menu" button Guc3 shown in FIGS. 15(A) and 15(B) is operated, or another GUI display (not shown) may be used.
- the process control unit 135 performs processing to set a plurality of selection areas according to the input information PS from the input unit 12. Alternatively, a plurality of selection areas may be provided in advance, regardless of the input information PS from the input unit 12. Further, when the divided display mode MH4 is set according to the input information PS from the input unit 12, the processing control unit 135 may individually generate a selection region display image for each selection region and display them simultaneously on the display unit 14.
- the selection area display image Gscl is an image of the first selection area, that is, an image in which the enhanced image Gsl of the image area ARsl corresponding to the first selection area has been subjected to distortion correction processing. Similarly, the selection area display image Gsc2 is an image in which the enhanced image Gs2 of the image area ARs2 corresponding to the second selection area has been subjected to distortion correction processing.
- the process control unit 135 performs processing to switch the selected areas in the θag change direction in response to the switching instruction.
- selection area setting information JA indicating the selected areas after processing is generated and supplied to the selection area emphasis display processing unit 132 and the distortion correction processing unit 133.
- the selection area emphasis display processing unit 132 displays the emphasized images Gsl and Gs2 corresponding to the selected area after processing.
- the distortion correction processing unit 133 performs correction processing so that the images of the image areas ARsl and ARs2 corresponding to the selected areas after processing become the selection area display images Gscl and Gsc2 with the distortion caused by the imaging optical unit 111 corrected. For this reason, the display image after the switching instruction is displayed as the selection area display images Gscl and Gsc2, as shown in FIG. 18(B), and the emphasized images Gsl and Gs2 correctly indicate the regions of the selection area display images Gscl and Gsc2. In this case, the image areas ARsl and ARs2 move counterclockwise on the entire image Gcp; when the direction of the switching instruction is reversed, they move clockwise.
- the input information PS supplied from the input unit 12 may also be, for example, a switching instruction to switch the selection region in the φag change direction of the polar coordinate system shown in FIG.
- in this case, the process control unit 135 performs processing to switch the selected areas in the φag change direction, and generates selection area setting information JA indicating the selected areas after processing.
- the selection area emphasis display processing unit 132 displays the emphasized images Gsl and Gs2 corresponding to the selected area after processing.
- based on the selection area setting information JA, the distortion correction processing unit 133 performs correction processing so that the images of the image areas ARsl and ARs2 corresponding to the selected areas after processing become the selection area display images Gscl and Gsc2 with the distortion caused by the imaging optical unit 111 corrected. For this reason, the display image after the switching instruction is displayed as the selection area display images Gscl and Gsc2, as shown in FIG., and the emphasized images Gsl and Gs2 correctly indicate the regions of the selection area display images Gscl and Gsc2.
- in this case, the image areas ARsl and ARs2 move toward each other in the radial direction on the entire image Gcp; when the direction of the switching instruction is reversed, they move away from each other in the radial direction.
- switching of the selection area can be realized by a mouse, a keyboard, a touch sensor, or the like, and the GUI at that time may take any form.
- in the polar coordinate mode MS2, for example, from the state of the display image shown in FIG. 18(A), the selection area display image Gscl can be rotated 180 degrees and displayed in the lower part of the display unit 14, and the selection area display image Gsc2 can be rotated 180 degrees and displayed in the upper part, according to the input information PS acquired from the input unit 12, as shown in FIG. 20. This makes it convenient for the user to view the image from an angle.
- even when a plurality of selection regions are provided, the image processing unit 13 may generate a display image showing the selection region display image of only one selection region, rather than a single-screen composite distortion-corrected image, and output it to the display unit 14.
- whether to output image data that displays one selection area display image per screen for each selection area, or to generate selection area display images for a plurality of selection areas and output image data in which they are combined on one screen, is controlled by the processing control unit 135 of the image processing unit 13 according to the input information PS from the input unit 12 or according to preset setting information.
- the processing control unit 135 switches the display mode MH in the same manner as in the orthogonal coordinate mode MS1. In addition, when a plurality of selection areas are provided, the split display mode MH4 can be selected.
- FIG. 21 is a diagram showing transition of display mode switching when the split display mode MH4 can be selected. It is assumed that four selection areas are provided.
- the both display mode MH3 includes a mode that displays one selection area corrected image (FIG. 21(A)) and a mode that displays one selection area display image upside down (FIG. 21(B)). The split display mode MH4 includes modes that display two selection area corrected images (FIG. 21(C) and (D)) and a mode that displays four selection area corrected images (FIG. 21(E)). The display mode can be switched among these as shown in FIG. 21.
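The split display mode MH4 combines two or four selection area corrected images into one screen. A minimal sketch of such grid composition, with images held as 2-D lists of pixel values (the function and data layout are illustrative, not from the patent):

```python
def split_display(images, grid_rows, grid_cols):
    """Compose equally sized images (2-D lists of pixel values) into one
    grid_rows x grid_cols screen, row-major, as in split display mode MH4."""
    h = len(images[0])          # height of each tile
    screen = []
    for gr in range(grid_rows):
        for y in range(h):
            row = []
            for gc in range(grid_cols):
                row.extend(images[gr * grid_cols + gc][y])
            screen.append(row)
    return screen

# Four 2x2 "images", each filled with a constant pixel value.
tiles = [[[k] * 2 for _ in range(2)] for k in range(4)]
screen = split_display(tiles, 2, 2)   # 2x2 grid: four-image display
```

With a 2x2 grid this corresponds to the four-image display; a 1x2 grid would correspond to the two-image displays.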
- since the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 are provided as region selection modes, switching the region selection mode according to the viewing direction, for example, makes intuitive operation possible, so that an image processing apparatus that is convenient and easy to use can be realized.
- the image processing unit 13 can perform processing for enlarging, reducing, or rotating the selected region display image.
- the processing control unit 135 performs processing for reducing the range of the selection area, and supplies the selection area setting information JA indicating the selection area after processing to the selection area emphasis display processing unit 132 and the distortion correction processing unit 133. Based on the selection area setting information JA, the selection area emphasis display processing unit 132 displays the emphasized images Gs1 and Gs2 corresponding to the selection area after processing.
- the distortion correction processing unit 133 performs correction processing on the images of the image areas ARs1 and ARs2 corresponding to the selection area after processing, producing the selection area display images Gsc1 and Gsc2 in a state in which the distortion caused by the imaging optical unit 111 is corrected.
- the display image after the reduction operation is the selection area display image Gsc corresponding to the selection area after reduction, displayed on the entire screen except for the display area of the entire image Gcp. Therefore, the person image GM included in the image area ARs is displayed enlarged compared with FIG. 22(A). Since the selection area is reduced, the image area ARs in the entire image Gcp becomes narrower.
- when an enlargement operation is performed, the processing control unit 135 performs the process of enlarging the range of the selection area according to the input information PS, and supplies the selection area setting information JA indicating the selection area after processing to the selection area emphasis display processing unit 132 and the distortion correction processing unit 133.
- the emphasized image Gs of the image area ARs corresponding to the enlarged selection area is subjected to distortion correction processing and displayed as the selection area display image Gsc on the entire screen excluding the display area of the entire image Gcp. The person image GM included in the image area ARs is displayed reduced compared with FIG. 22(A). In this way, enlargement and reduction processing can be performed.
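The enlargement/reduction behaviour can be sketched as scaling the selection area's range about its centre: narrowing the range makes the subject appear enlarged in the selection area display image Gsc, and widening it makes the subject appear reduced (the centre/half-range representation is an illustrative assumption, not the patent's internal format):

```python
def zoom_selection(center, half_width, half_height, factor):
    """Scale the range of a selection area about its centre.
    factor < 1 narrows the selection area (the subject appears enlarged
    in the selection area display image); factor > 1 widens it (the
    subject appears reduced)."""
    return {"center": center,
            "half_width": half_width * factor,
            "half_height": half_height * factor}

# "Reduce the selection area" operation: the subject would be displayed
# enlarged, while the image area ARs in the entire image becomes narrower.
sel = zoom_selection(center=(10.0, 20.0), half_width=30.0,
                     half_height=20.0, factor=0.5)
```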
- FIG. 23 shows a case where the image processing unit 13 rotates and displays the image of the selected region.
- (A) and (B) in FIG. 23 show the state before the "rotation" operation instruction.
- (A) in FIG. 23 shows a case where the selection area is set so that, for example, the person image GM in the entire image Gcp is included in the image area ARs.
- FIG. 23 (B) shows a selected area display image Gsc that is an image in which the enhanced image Gs in the image area ARs has been subjected to distortion correction processing.
- when a rotation operation is instructed, the processing control unit 135 rotates the selection area about the center point of the image area ARs according to the input information PS, thereby changing the selection area. In this case, the person image GM is rotated in the reverse direction in the image area ARs corresponding to the selection area after the change. Therefore, if the selection area display image Gsc corresponding to the emphasized image Gs of the image area ARs for the changed selection area is generated, an image in which the person image GM is rotated in the direction opposite to the rotation direction of the selection area is obtained, as shown in FIG. 23. This rotation processing is convenient because the user can view the observation object at an easy-to-see angle. [0097] Instead of rotating the selection area, the entire image Gcp may be rotated.
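The compensation can be sketched numerically: rotating the selection area by an angle +φ makes the subject appear rotated by −φ in the corrected image, so rendering from the rotated area effectively applies the inverse rotation. A small point-based sketch (purely illustrative):

```python
import math

def rotate_point(x, y, cx, cy, deg):
    """Rotate the point (x, y) about (cx, cy) by deg degrees."""
    r = math.radians(deg)
    dx, dy = x - cx, y - cy
    return (cx + dx * math.cos(r) - dy * math.sin(r),
            cy + dx * math.sin(r) + dy * math.cos(r))

# Rotating a sample point by +90 degrees and then by -90 degrees returns
# it to its original position: this inverse rotation is why the subject
# appears counter-rotated in the corrected image.
p = rotate_point(1.0, 0.0, 0.0, 0.0, 90.0)
q = rotate_point(*p, 0.0, 0.0, -90.0)
```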
- FIG. 26 is a block diagram showing a configuration of an image processing system according to another embodiment of the present invention.
- the description of the same devices and functions provided in the image processing system 10 shown in FIG. 1 will be simplified or omitted, and different points will be mainly described.
- the image processing system 20 has a configuration in which a storage device 21 is further added to the image processing system 10 of FIG.
- the storage device 21 is a device that stores, for example, the image data D Va generated by the imaging unit 11 and various image data generated by the image processing unit 13.
- the storage medium used for the storage device 21 may be any medium that can store image data, such as an optical disk, a magnetic disk, a semiconductor memory, a dielectric memory, and a tape-shaped storage medium.
- the image processing unit 13 reads the image data DVa desired by the user from the storage device 21 in accordance with the input information PS from the input unit 12, and displays it on the display unit 14. Specifically, in accordance with the input information PS based on a user operation, the image processing unit 13 reads the image data DVa of a past wide-field image Gc stored in the storage device 21, sets a selection area for the field of view represented by the read image data DVa, and displays on the display unit 14 a selection area display image, which is an image in which the distortion of the selection area is corrected.
- a form is also conceivable in which the image processing unit 13 performs distortion correction processing on an image of a preset selection area from the past wide-field images stored in the storage device 21 and displays it on the display unit 14.
- the following form may also be considered. For example, a user selects a selection area from the wide-field image obtained in real time from the imaging unit 11, views the entire image Gcp and the selection area display image Gsc in real time, and stores them in the storage device 21. The user can later view a selection area display image Gsc by selecting an area different from the area selected in real time while viewing the stored entire image Gcp.
- the image processing unit 13 may store only the image data of the selected region display image Gsc without storing the image data of the entire image Gcp in the storage device 21. In this case, the user can view the selected area display image Gsc later.
- the image data indicating both the entire image Gcp and the selected area display image Gsc and the image data of each image may be stored.
- the image processing unit 13 can also perform the processing of the flowchart shown in FIG.
- the image processing unit 13 acquires a real-time wide-field image from the imaging unit 11 (ST2401), and acquires a past wide-field image or a past selection area display image stored in the storage device 21 (ST2402).
- the image processing unit 13 performs a process of combining the acquired wide-field image and the past wide-field image or the past selected area display image as image data of one screen (ST2403), and the combined image Data can be output to the display unit 14 (ST2404).
- the image processing unit 13 may display the acquired wide-field image and the past wide-field image or the past selected area display image on separate display units 14.
- a form in which ST2401 and 2402 are in the reverse order is also conceivable.
- the image processing unit 13 may output a selection area display image Gsc obtained by performing distortion correction processing in real time (this selection area display image Gsc may be an image generated in real time from a real-time wide-field image, or an image generated in real time from a past wide-field image stored in the storage device 21) together with a past selection area display image.
- the image processing unit 13 acquires a wide-field image from the imaging unit 11 in real time, or acquires a past wide-field image from the storage device 21 (ST2501). Furthermore, the image processing unit 13 acquires a past selection area display image stored in the storage device 21 (ST2502).
- the image processing unit 13 performs distortion correction processing on the image of the selection area in the wide-field image acquired in ST2501 (ST2503).
- the image processing unit 13 combines the selection area display image generated by the distortion correction processing and the selection area display image acquired in ST2502 as image data of one screen (ST2504), and outputs it to the display unit 14 as the selection area display image Gsc (ST2505). Forms in which ST2501 and ST2502 are performed in the reverse order, or in which ST2502 and ST2503 are performed in the reverse order, are also conceivable.
- the image processing unit 13 may further output the selection area display image obtained by the distortion correction processing in ST2503 (hereinafter referred to as the real-time selection area display image) and the past selection area display image in such a manner that a viewer can distinguish them on the display unit. Specifically, the image processing unit 13 may generate an image in which an identifier is added to at least one of the real-time selection area display image and the past selection area display image; for example, frames surrounding the respective images may be generated with different colors.
- when the image data stored in the storage device 21 is a moving image, a certain amount of moving image data may be stored according to the storage capacity of the storage device 21, with the oldest image frames automatically deleted in sequence.
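The automatic deletion of the oldest frames can be sketched with a bounded buffer; the class name and capacity are illustrative:

```python
from collections import deque

class FrameStore:
    """Bounded frame store: once capacity is reached, the oldest frame
    is deleted automatically, as described for the storage device."""
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)

    def append(self, frame):
        self.frames.append(frame)   # deque drops the oldest when full

    def oldest(self):
        return self.frames[0]

store = FrameStore(capacity=3)
for i in range(5):
    store.append(f"frame-{i}")      # frame-0 and frame-1 are dropped
```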
- FIG. 29 is a flowchart showing a storage process of position information or track information.
- the image processing unit 13 receives input information for setting a selection region in a wide-field image acquired in real time from the imaging unit 11 or in a wide-field image based on image data read from the storage device 21.
- when such input information is received (ST2601), the processing control unit 135 of the image processing unit 13 generates position information of the selection area according to the input information, or, when a selection area switching operation is performed, generates trajectory information with which the switching of the selection area can be reproduced (ST2602).
- the processing control unit 135 stores the generated position information or trajectory information in the storage device 21 (ST2603).
- Such a form is effective when, for example, an image in a predetermined range or an image with a trajectory in a certain range is required in a certain place.
- for example, when the imaging unit 11 is mounted as a fixed-point security monitoring camera, an image of a certain predetermined range or an image along a certain trajectory may be required from the wide-field image. In such a case, by using the stored position information or trajectory information, the user can always monitor the selection area display image of that range or trajectory as a display image on the display unit 14.
- the image in the predetermined range may be a still image or a moving image. An image along a trajectory in a certain range may be stored as a moving image from the start point to the end point of the trajectory, or as still images at positions on the trajectory; in the latter case, the image processing unit 13 performs processing such as outputting the image after distortion correction processing as a still image.
- FIG. 30 is a diagram for explaining a method for setting a trajectory of a predetermined range as described above as an embodiment using the storage device 21.
- the user sets the selection area for the wide-field image and switches the position of the selection area.
- when the user operates the "Select" buttons Gua1 and Gub1, the image processing unit 13 stores the position information of the selection area at that time in the storage device 21 according to the input information. For example, if the "Select" buttons Gua1 and Gub1 are operated when the image area ARs corresponding to the selection area is set at position a on the wide-field image Gc, the position information of the selection area at this time is stored.
- the trajectory may be set in advance by a program, or may be generated by automatic recognition using the various sensors described above.
- the trajectory may be one in which the selection area is set so that the image area ARs is provided at discontinuous points such as positions a, b, c, and d shown in FIG. 30, or one in which the selection area is set continuously from position a to position d. The image processing unit 13 may also include a program that sets the selection area so that the image area ARs is provided at positions interpolating the points a, b, c, and d.
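The interpolation of the image area ARs between the key positions can be sketched as simple linear interpolation; the coordinates and step count are illustrative:

```python
def interpolate_trajectory(points, steps):
    """Linearly interpolate selection-area positions between successive
    key points (e.g. positions a, b, c, d) so that the image area ARs
    moves continuously along the trajectory."""
    path = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for s in range(steps):
            t = s / steps
            path.append((x0 + (x1 - x0) * t, y0 + (y1 - y0) * t))
    path.append(points[-1])   # include the final key point
    return path

# Three key positions, five interpolation steps per segment.
path = interpolate_trajectory([(0, 0), (10, 0), (10, 10)], steps=5)
```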
- FIG. 31 is a block diagram showing a configuration of an image processing system according to still another embodiment of the present invention.
- in the image processing system 30, the imaging unit 11 described above includes the storage device 21.
- the storage device 21 stores a wide-field image in advance as described above, for example.
- the image processing unit 13 can read the image data DVm of the wide-field image and obtain the selection area display image from the wide-field image by development processing.
- the image processing unit 13 may obtain a selection area display image by performing development processing on a wide-field image based on the image data stored in advance in the storage device 21, and store the wide-field image and the selection area display image in the storage device 21 in association with each other. Alternatively, the image processing unit 13 may associate a wide-field image based on pre-stored image data with information indicating the selection area, which is the area to be developed from the wide-field image, and store them in the storage device 21.
- the entire image Gcp and the selection area display image Gsc may be displayed alternately on the display unit 14 at predetermined time intervals. In that case, in response to some operation input from the user, both the entire image Gcp and the selection area display image Gsc may be displayed.
- the imaging device 112 of the imaging unit 11, the input unit 12, the image processing unit 13, the display unit 14, the storage device 21, and the like may be connected via a network such as the Internet, a LAN (Local Area Network), or a dedicated line.
- the image processing system can be applied to, for example, a security system, a teleconference system, a system for inspection, management, and testing of machines and facilities, a road traffic system, a system using a mobile camera (for example, a camera mounted on a moving vehicle, a flying object, or another moving body) for imaging, a nursing care system, or a medical system.
- the process control unit 135 of the image processing unit 13 can switch the region selection mode MS according to the installation direction (installation angle) of the imaging unit 11.
- the configuration and operation of the image processing system 40 in this case will be described.
- FIG. 32 is a block diagram showing a configuration of an image processing system according to still another embodiment of the present invention.
- in addition to the components of the image processing system 10 shown in FIG. 1, the image processing system 40 includes a direction detection sensor such as a gyro sensor 41 that detects the incident direction of the optical image incident on the imaging optical unit 111.
- the gyro sensor 41 is fixed to the imaging optical unit 111, detects the incident direction of the optical image incident on the imaging optical unit 111, and supplies a sensor signal ES indicating the direction detection result to the processing control unit 135 of the image processing unit 13.
- the imaging unit 11 is configured by integrating the imaging optical unit 111, the imaging device 112, and the gyro sensor 41.
- FIG. 33 is a diagram conceptually showing how the area selection mode MS is switched in accordance with the installation direction of the imaging unit 11 in the present embodiment.
- the installation modes of the imaging unit 11 are, as shown in FIG. 10 described above: installation on the ground, a floor, a desk, or the like to capture a desired subject HM above (upper hemisphere field of view, FIG. 33(A)); installation to capture the subject in front (front hemisphere field of view, FIG. 33(B)); and installation on a ceiling or in the air to capture the subject below (lower hemisphere field of view, FIG. 33(C)).
- the processing control unit 135 of the image processing unit 13 automatically sets and switches the region selection mode MS according to the vertical angle of the imaging unit 11 determined based on the sensor signal from the direction detection sensor.
- specifically, the area selection mode MS is switched to the polar coordinate mode MS2 when the imaging unit 11 has the upper or lower hemisphere field of view, and to the orthogonal coordinate mode MS1 when it has the front hemisphere field of view.
- when the imaging unit 11 has the upper or lower hemisphere field of view, the polar coordinate mode MS2 makes it easier to observe the surroundings uniformly rather than the subject at the center of the wide-field image Gc. When the imaging unit 11 has the front hemisphere field of view, setting the orthogonal coordinate mode MS1 makes it easy to view the subject at the center of the wide-field image Gc in detail and to observe the subject while moving the selection area vertically and horizontally.
- the state in which the imaging unit 11 is installed so as to have the upper hemisphere field of view and the area selection mode MS is the polar coordinate mode MS2 is called S1 (FIG. 33(A)); the state in which the imaging unit 11 has the front hemisphere field of view and the area selection mode MS is the orthogonal coordinate mode MS1 is called S2 (FIG. 33(B)); and the state in which the imaging unit 11 has the lower hemisphere field of view and the area selection mode MS is the polar coordinate mode MS2 is called S3 (FIG. 33(C)).
- FIG. 34 illustrates a method for setting the thresholds used to switch between the states S1, S2, and S3. Based on the detected angle, it is determined which of the states S1, S2, and S3 applies, and the region selection mode is set based on the determination result.
- the threshold θ1 is the threshold for switching from the state S1 to the state S2, and the threshold θ2 is the threshold for switching from the state S2 to the state S1. The threshold θ3 is the threshold for switching from the state S2 to the state S3, and the threshold θ4 is the threshold for switching from the state S3 to the state S2. For example, the thresholds θ1 and θ2 are set in the vicinity of 45 degrees and the thresholds θ3 and θ4 in the vicinity of 135 degrees, offset by about ±10 degrees to provide hysteresis. The threshold values θ1, θ2, θ3, and θ4 are not limited to the above values, and the absolute values of the offsets may be set differently.
- FIG. 35 shows the operation when the image processing system 40 switches between the states S1, S2, and S3.
- when the imaging unit 11 is installed at a desired position, the processing control unit 135 of the image processing unit 13 acquires the measurement result of the gyro sensor 41 at that time (ST3201). The processing control unit 135 then acquires the measurement result of the gyro sensor 41 again and calculates the current angle θ from this measurement result and the measurement result acquired in ST3201 (ST3202).
- the processing control unit 135 determines whether or not the current angle θ is equal to or smaller than the threshold θ1 (ST3203). If θ is equal to or smaller than θ1, the processing control unit 135 determines that the imaging unit 11 is in the state S1 described above and maintains the polar coordinate mode MS2 as the region selection mode MS (ST3204). If θ is equal to or larger than the threshold θ3, the processing control unit 135 determines that the imaging unit 11 is in the state S3 with the lower hemisphere field of view and sets the region selection mode MS to the polar coordinate mode MS2 (ST3212). Otherwise, the processing control unit 135 determines that the imaging unit 11 has changed to the state S2 and sets the area selection mode MS to the orthogonal coordinate mode MS1 (ST3208).
- after setting the region selection mode MS to the orthogonal coordinate mode MS1 in ST3208, the processing control unit 135 reads the current angle θ from the gyro sensor 41 again (ST3209). If θ is equal to or smaller than the threshold θ2, the processing control unit 135 determines that the imaging unit 11 has changed to the state S1 described above and sets the area selection mode MS to the polar coordinate mode MS2 (ST3204). If θ is equal to or larger than the threshold θ3, the processing control unit 135 determines that the imaging unit 11 has changed to the state S3 and sets the area selection mode MS to the polar coordinate mode MS2 (ST3212). If θ is between θ2 and θ3, the processing control unit 135 maintains the orthogonal coordinate mode MS1 (ST3208).
- after setting the region selection mode MS to the polar coordinate mode MS2 in ST3212, the processing control unit 135 reads the current angle θ from the gyro sensor 41 again (ST3213). If θ is equal to or smaller than the threshold θ4, the processing control unit 135 determines that the imaging unit 11 has changed to the state S2 and sets the area selection mode MS to the orthogonal coordinate mode MS1 (ST3208); otherwise, the processing control unit 135 continues to operate in the polar coordinate mode MS2 (ST3212). By repeating the above operation, the processing control unit 135 automatically switches the region selection mode MS according to the installation angle of the imaging unit 11.
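This angle-based switching can be sketched as a small state machine with hysteresis; the pairing of thresholds to transitions and the example values (45/135 degrees with a 10-degree offset) are assumptions for illustration, not values fixed by the patent:

```python
# Hysteresis thresholds (degrees from the vertical axis), illustrative.
TH1, TH2 = 55.0, 35.0     # S1 -> S2 and S2 -> S1
TH3, TH4 = 145.0, 125.0   # S2 -> S3 and S3 -> S2

def next_state(state, angle):
    """One step of the switching loop: return the new state for the
    current tilt angle, keeping the state inside the hysteresis band."""
    if state == "S1" and angle > TH1:
        return "S2"               # upper hemisphere -> front hemisphere
    if state == "S2":
        if angle <= TH2:
            return "S1"
        if angle >= TH3:
            return "S3"
    if state == "S3" and angle <= TH4:
        return "S2"
    return state                  # otherwise keep the current state

MODE = {"S1": "MS2", "S2": "MS1", "S3": "MS2"}  # polar / orthogonal

state = "S1"
history = []
for angle in [40, 50, 60, 130, 150, 140, 120]:
    state = next_state(state, angle)
    history.append(state)
```

Note how the angles 50 and 140 do not cause switching: the hysteresis band prevents the mode from responding to small fluctuations around a threshold.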
- the Pan angle (rotation about the z-axis) of the display plane 81 is H (0 degrees in the x-axis direction), and the Tilt angle (rotation about the x-axis or y-axis) is V (0 degrees in the x-axis direction). The coordinate axes in FIG. 36(A) are the x-axis, y-axis, and z-axis. The angles by which the Pan angle (H) and the Tilt angle (V) in FIG. 36(A) are rotated are respectively the Pan angle (h) and the Tilt angle (v).
- the direction vector [D] representing the set direction of the selection area corresponding to the image displayed on the display unit 14 is calculated as shown in FIG. 37 by rotating the unit vector in the x-axis direction in each coordinate system. Accordingly, the sin and cos values of the rotated Pan angle (h) and Tilt angle (v) in the orthogonal coordinate mode MS1 are obtained as follows.
- the display unit 14 is a fixed-resolution unit such as a VGA (Video Graphics Array) display.
- the display plane [A] is perpendicular to the x-axis, and each point sequence is parallel to the y-axis and the z-axis. The point sequence [(x0, y0), (x1, y0), ... (xN, yM)] on the plane can be expressed in 3D as [A] = [(x0, y0, 1), (x1, y0, 1), ... (xN, yM, 1)].
- the plane [P0], determined by the parameters that can be set on the operation input screen Gu of FIG. 15(A) and FIG. 15(B), is enlarged and moved on a spherical surface as shown in (B).
- the moved plane is the plane [P2], and the point on the plane [P2] corresponding to the point Pj on the plane [P0] is the point Pj2 (x2, y2, z2). Each point on the plane [P2] is calculated from the matrices shown in FIG. 39(C) according to FIG. 39(D). Using the coordinate values (x2, y2, z2) of the point Pj2 on the plane [P2] calculated in this way, and processing the point Pj2 (x2, y2, z2) in the same manner as in the principle of the distortion correction processing described above, the imaging point Qj2 corresponding to Pj2 (x2, y2, z2) can be found.
- in this way, for the selection area display image Gsc, which is the image after distortion correction processing in the orthogonal coordinate mode MS1 or the polar coordinate mode MS2, the corresponding pixel position on the image sensor 112 is obtained for each pixel, and by using the pixel data at the obtained pixel positions, the selection area display image Gsc without distortion can be displayed.
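The per-pixel mapping from a point on the display plane to a pixel position on the image sensor depends on the projection of the imaging optical unit 111, which is determined by the optical design. A sketch assuming an equidistant fisheye projection (r = f·θ); the function name and the projection model itself are illustrative assumptions:

```python
import math

def plane_point_to_sensor(px, py, pz, focal=1.0):
    """Map a 3-D point on the display plane (e.g. Pj2 on plane [P2]) to
    the corresponding imaging point (e.g. Qj2) on the sensor, assuming
    an equidistant fisheye projection r = f * theta. The optical axis
    is taken as the z-axis."""
    theta = math.atan2(math.hypot(px, py), pz)  # angle from optical axis
    phi = math.atan2(py, px)                    # azimuth on the sensor
    r = focal * theta                           # equidistant projection
    return (r * math.cos(phi), r * math.sin(phi))

# A point 45 degrees off-axis lands at radius f * pi/4 on the sensor.
qx, qy = plane_point_to_sensor(1.0, 0.0, 1.0)
```

Sampling the sensor at the computed positions for every display-plane pixel yields the distortion-corrected selection area display image.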
- as described above, the installation angle of the imaging unit 11 is detected by the gyro sensor 41, and the region selection mode MS can be switched appropriately according to the installation angle, which improves user convenience.
- in addition, since the region selection mode MS is switched with hysteresis, the mode does not respond to every fluctuation when the installation angle fluctuates in the vicinity of the thresholds.
- the installation angle of the imaging unit 11 may be detected by using another sensor such as a gravity sensor instead of the gyro sensor 41 as the direction detection sensor.
- in the above description, the region selection mode MS is switched according to the installation angle of the imaging unit 11; however, the region selection mode MS may instead be switched, for example, according to whether or not the imaging unit 11 has touched an object.
- FIG. 40 is a diagram conceptually showing a method of switching the display mode in accordance with contact.
- for example, the processing control unit 135 sets the polar coordinate mode MS2 when the imaging unit 11 is inside the piping 95 in order to image the surroundings of the piping 95, and may switch to the orthogonal coordinate mode MS1 when the imaging unit 11 touches a wall surface in order to image the wall surface.
- the image processing system may be provided with a detection sensor for detecting contact, and the detection result may be input to the processing control unit 135 of the image processing unit 13.
- the detection sensor may be a mechanical sensor or an optical sensor.
- the automatic switching between the orthogonal coordinate mode MS1 and the polar coordinate mode MS2 can also be performed without using a direction detection sensor or a contact detection sensor. Next, this sensorless automatic switching will be described.
- FIG. 41 shows a case where a field of view of 270 degrees is obtained by using an ultra-wide angle lens as the imaging optical unit 111.
- the light incident on the imaging optical unit 111 forms a wide-field image Gc having a field of view of 270 degrees on the sensor surface of the image sensor 112.
- FIG. 43 shows an installation example of the imaging unit 11 having a field of view of 270 degrees.
- the imaging unit 11 is provided at the tip of the ship so that the center direction of the visual field is 45 degrees upward with respect to the horizontal direction.
- a seat FS is provided behind the imaging unit 11.
- the image processing unit 13 automatically switches the region selection mode MS according to the direction in which the selection area is set, and can display on the display unit 14 an image in which the distortion of the imaging optical unit 111 has been corrected.
- since the selection area is set by specifying the direction of the selection area or the angle range indicating the selection area through the input information PS or the like, the setting direction of the selection area can be determined from the input information used to set it. Further, as described above, since the selection area and the image area ARs correspond to each other, the setting direction of the selection area can also be determined from the image position of the image area ARs on which the distortion correction processing is performed.
- FIG. 44 shows a case where the area selection mode MS is automatically switched according to the setting direction of the selection area.
- a wide-field image Gc having a field of view of 270 degrees is as shown in FIG. 44. The center of the wide-field image Gc corresponds to the optical axis direction of the imaging optical unit 111; when the imaging unit 11 is installed so that the center direction of the visual field is 45 degrees upward with respect to the horizontal direction, the front position in the horizontal direction corresponds to the point Pf on the image at 45 degrees from the optical axis.
- an area AS1 is provided so as to include the front image; when the direction of the selection area is set so that, for example, the center position of the image area ARs is included in the area AS1, the orthogonal coordinate mode MS1 is set.
- an area AS2 is provided so as to include the rear image; when the direction of the selection area is set so that, for example, the center position of the image area ARs is included in the area AS2, the polar coordinate mode MS2 is set. Note that when the center position of the image area ARs is included in neither the area AS1 nor the area AS2, the currently set area selection mode MS is retained.
- alternatively, the area of the wide-field image Gc may be divided in advance in a matrix, with an area selection mode MS set for each divided area, and the area selection mode MS may be set according to which divided area includes the center position of the image area ARs.
- for example, the orthogonal coordinate mode MS1 is assigned to each area ASm1 including the front image, and the polar coordinate mode MS2 is assigned to each area ASm2 including the rear image. When the center position of the image area ARs is included in an area ASm1, the orthogonal coordinate mode MS1 is set; when it is included in an area ASm2, the polar coordinate mode MS2 is set.
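The matrix-division lookup can be sketched as follows; the grid contents, cell sizes, and coordinates are illustrative assumptions:

```python
def mode_for_position(x, y, grid, cell_w, cell_h):
    """Look up the region selection mode assigned to the grid cell that
    contains the centre position (x, y) of image area ARs. `grid` is a
    2-D table of "MS1"/"MS2" labels prepared in advance."""
    row = min(int(y // cell_h), len(grid) - 1)
    col = min(int(x // cell_w), len(grid[0]) - 1)
    return grid[row][col]

# Front areas (ASm1) assigned MS1, rear areas (ASm2) assigned MS2.
grid = [["MS1", "MS1", "MS2"],
        ["MS1", "MS2", "MS2"]]
mode = mode_for_position(x=50, y=10, grid=grid, cell_w=40, cell_h=40)
```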
- this sensorless switching can be applied to any of the image processing systems 10, 20, and 30, and may also be applied to the image processing system 40, which uses a sensor.
- for example, the area selection mode MS can be switched with the same characteristics as when the imaging unit 11 is directed 45 degrees upward with respect to the horizontal direction. In this way, the switching characteristics of the area selection mode MS can be set freely.
- when the area selection mode MS is switched automatically, changing the GUI display according to each area selection mode makes it easy to discriminate which area selection mode is currently set.
- Fig. 45 shows the GUI display and the moving direction of the image area ARs when the area selection mode MS is automatically switched.
- in the orthogonal coordinate mode MS1, for example, "Up", "Down", "Right", and "Left" buttons are provided as the direction buttons Gua2, as shown in FIG. 45(A).
- FIG. 45 (B) illustrates the moving direction of the image area ARs in the entire image Gcp when the direction button Gua2 is operated.
- when the polar coordinate mode MS2 is set, as shown in (C) of Fig. 45, for example, “Center”, “Outer”, “Right rotation”, and “Left rotation” buttons are provided as the direction buttons Gud2.
- when the GUI display corresponding to each area selection mode is used, the user can easily determine which area selection mode has been set. Further, when displaying an image of a subject located in a desired direction on the display unit 14, the user can easily select the appropriate direction button.
- alternatively, the automatic switching operation according to the tilt angle of the imaging unit 11 with respect to the vertical direction (or horizontal direction) may be stopped, and either the orthogonal coordinate mode MS1 or the polar coordinate mode MS2 may be set.
- when the angle θ satisfies 337.5 degrees ≤ θ < 22.5 degrees (wrapping through 0 degrees) or 157.5 degrees ≤ θ < 202.5 degrees, the polar coordinate mode MS2 is set regardless of the position of the image area ARs.
- when the angle θ satisfies 67.5 degrees ≤ θ < 112.5 degrees or 247.5 degrees ≤ θ < 292.5 degrees, the orthogonal coordinate mode MS1 is set regardless of the position of the image area ARs.
- when the angle θ satisfies 22.5 degrees ≤ θ < 67.5 degrees, 112.5 degrees ≤ θ < 157.5 degrees, 202.5 degrees ≤ θ < 247.5 degrees, or 292.5 degrees ≤ θ < 337.5 degrees, the mixed mode is set, in which the orthogonal coordinate mode MS1 or the polar coordinate mode MS2 is automatically selected according to the position of the image area ARs, as described above.
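The tilt-angle classification above could be sketched like this (a hypothetical Python sketch; the boundary handling of each range and the wrap-around through 0 degrees are assumptions):

```python
def mode_for_tilt(theta):
    """Classify the tilt angle theta (degrees) of the imaging unit into the
    polar coordinate mode MS2, the orthogonal coordinate mode MS1, or the
    mixed mode, using the 22.5-degree bands given in the description.
    Whether boundaries are inclusive or exclusive is an assumption."""
    t = theta % 360.0
    if t >= 337.5 or t < 22.5 or 157.5 <= t < 202.5:
        return "MS2"    # polar mode, regardless of the image area ARs
    if 67.5 <= t < 112.5 or 247.5 <= t < 292.5:
        return "MS1"    # orthogonal mode, regardless of the image area ARs
    return "mixed"      # mode depends on the position of the image area ARs
```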
- FIG. 47 is a flowchart showing the region selection mode switching operation including the mixed mode.
- the processing control unit 135 performs angle detection to obtain the tilt angle θ of the imaging unit 11 (ST3301). Next, the processing control unit 135 determines whether or not to set the mixed mode based on the detected tilt angle θ (ST3302). When the mixed mode is not set (NO), the processing control unit 135 determines, based on the detection result of the angle θ, whether the region selection mode should be the orthogonal coordinate mode MS1 or the polar coordinate mode MS2 (ST3303). When the mixed mode is set (YES in ST3302), the processing control unit 135 determines whether the area selection mode should be the orthogonal coordinate mode MS1 or the polar coordinate mode MS2 based on the position of the selected area (ST3304).
- the process control unit 135 then sets the area selection mode to the coordinate mode determined in ST3303 or ST3304 (ST3305).
- processing control section 135 performs GUI display according to the area selection mode set in ST3305, and returns to ST3301 (ST3306).
- FIG. 48 is a flowchart showing the operation when the direction button is operated.
- Processing control unit 135 determines whether or not region selection mode MS is set to orthogonal coordinate mode MS 1 (ST3401).
- the process control unit 135 determines whether or not the direction button operation is performed based on the input information PS (ST3402).
- when it is determined that no direction button operation has been performed (NO), the process returns to ST3401; when it is determined that a direction button operation has been performed (YES), it is determined whether or not the “Right” button has been operated (ST3403).
- the process control unit 135 When determining that the operation is the operation of the "Right” button (YES), the process control unit 135 performs a process of switching the selected area to the position in the right direction, and returns to ST3401 (ST3404). When it is determined that the “Right” button is not operated (NO in ST3403), the process control unit 135 determines whether or not the “Left” button is operated (ST3405).
- when determining that the operation is an operation of the “Left” button (YES), the processing control section 135 performs a process of switching the selected area to the left position, and returns to ST3401 (ST3406). When it is determined that the “Left” button has not been operated (NO in ST3405), the process control unit 135 determines whether or not the “Up” button has been operated (ST3407). When it is determined that the operation is an operation of the “Up” button (YES), the processing control section 135 performs a process of switching the selected area to the upward position, and returns to ST3401 (ST3408).
- when the process control unit 135 determines that the operation is not an operation of the “Up” button (NO in ST3407), it determines that the operation is an operation of the “Down” button, performs a process of switching the selected area to the downward position, and returns to ST3401 (ST3409).
- when it is determined in ST3401 that the orthogonal coordinate mode MS1 is not set (NO), that is, when the polar coordinate mode MS2 is set, the processing control unit 135 determines whether or not a direction button operation has been performed based on the input information PS (ST3410).
- when it is determined that no direction button operation has been performed (NO), the process returns to ST3401; when it is determined that a direction button operation has been performed (YES), it is determined whether or not the operation is an operation of the “Right rotation” button (ST3411).
- when determining that the operation is an operation of the “Right rotation” button (YES), the process controller 135 performs a process of rotating the selected area in the right direction and returns to ST3401 (ST3412). When it is determined that the operation is not an operation of the “Right rotation” button (NO in ST3411), it is determined whether or not the operation is an operation of the “Left rotation” button (ST3413).
- when determining that the operation is an operation of the “Left rotation” button (YES), the process control unit 135 performs a process of rotating the selected region in the left direction and returns to ST3401 (ST3414). When it is determined that the operation is not an operation of the “Left rotation” button (NO in ST3413), it is determined whether or not the operation is an operation of the “Center” button (ST3415).
- the process control unit 135 When determining that the operation is the operation of the "Center” button (YES), the process control unit 135 performs a process of switching the selected area to the position in the center direction, and returns to ST3401 (ST3416). When the process control unit 135 determines that the operation is not an operation of the “Center” button (NO in ST3415), it is determined that the operation is an operation of the “Outer” button, and the selected region is positioned at an outward position opposite to the center direction. The switching process is performed, and the process returns to ST3401 (ST3417).
- the selection area can be easily switched in a desired direction. That is, an image of a subject located in a desired direction can be displayed on the screen of the display unit 14 in a state where the distortion of the imaging optical unit 111 is corrected.
- the image processing system of the present embodiment has the above-described plurality of modes as the display mode MH.
- when the selection area is changed or the area selection mode MS is changed, the display mode is changed so that the entire image Gcp is displayed for at least a predetermined time. This makes it possible for the user to easily confirm the selected area without performing a display mode change operation.
- the configuration of the image processing system in the present embodiment is the same as that of the image processing system 40 shown in FIG.
- FIGS. 49 and 50 show the display operation of the entire image Gcp.
- FIG. 49 shows how the display mode MH is changed in accordance with the selection area switching.
- FIG. 50 is a diagram showing a state in which the display mode MH is changed according to switching of the area selection mode MS.
- when the area selection mode MS or the selection area is switched while the display mode is set to a display mode in which the entire image Gcp is not displayed, the image processing unit 13 of the image processing system automatically changes to a display mode in which the entire image Gcp is displayed for a predetermined time, regardless of the display mode that has been set.
- FIG. 49A shows a case where the display is performed in the selected image display mode MH2.
- the entire image Gcp is combined with the display image corresponding to the set display mode, and the combined image is displayed.
- a new display image is displayed.
- the display mode MH is changed to the both display mode MH3, and the entire image Gcp is also displayed as shown in FIG.
- the composition processing of the entire image Gcp is terminated and the entire image Gcp is erased from the display screen.
- the display mode MH is returned to the selected image display mode MH2, and the entire image Gcp is erased as shown in FIG.
- thereafter, each time the selection area is switched, the display mode MH is changed to the both display mode MH3 for the predetermined time, and display proceeds in the order of (D) and (A) of FIG. 49.
- in (A) of Fig. 50, display is performed in the selected image display mode MH2 with the polar coordinate mode MS2 set. The area selection mode MS is then switched to the orthogonal coordinate mode MS1.
- the entire image Gcp is combined with the display image corresponding to the set display mode.
- the display mode MH is changed to the both display mode MH3, and the entire image Gcp is also displayed as shown in FIG.
- the composition processing of the entire image Gcp is terminated and the entire image Gcp is erased from the display screen.
- the display mode MH is returned to the selected image display mode MH2, and the entire image Gcp is erased as shown in FIG.
- thereafter, display is performed in the order of (D) of Fig. 50 and (A) of Fig. 50.
- the predetermined time may be several seconds, for example 3 seconds or 5 seconds, but is not limited to such values.
- the image processing unit 13 may display the entire image Gcp not only when the area selection mode MS is switched but also when a selection area switching instruction is issued, and may change the display to only the selection area display image Gsc after the predetermined time has elapsed.
- the image processing unit 13 need not change the display mode for a predetermined period only when the selection region is switched by a selection region switching instruction or when the region selection mode MS is switched; the display mode may also be changed for a predetermined period when the image of the image area ARs corresponding to the selected area changes due to a change in the imaging direction. For example, since the wide-field image Gc changes when the imaging direction of the imaging unit 11 changes, the selected area display image Gsc changes even if the position of the image area ARs on the sensor surface of the image sensor 112 does not change. That is, the selected area is effectively changed.
- the display mode is changed for a predetermined period and the entire image Gcp is displayed. In this way, the user can easily determine how to instruct selection area switching in order to display the image of the subject located in the desired direction on the display unit 14.
- FIG. 51 is a flowchart showing the flow of operation of the image processing system when the display mode is changed as shown in FIG. 49 or FIG. 50.
- the processing control unit 135 of the image processing unit 13 determines whether or not the selected region has been switched (moved) (ST3601). That is, it determines whether the selected area has been switched based on whether input information PS indicating a user operation of the direction buttons Gua2, Gud2 or the direction button Gub2 for instructing selection area switching has been supplied from the input unit 12, or based on whether the selected area has been automatically switched according to the trajectory information described above.
- when the selected area has been switched, the display control information JH is supplied to the image output processing unit 134, the entire image Gcp and the selection region display image Gsc are combined by the image output processing unit 134, and the resulting composite image is displayed on the display unit 14 (ST3606).
- thereafter, the process control unit 135 determines whether or not a predetermined time has elapsed since the output of the composite image (ST3607). If the predetermined time has elapsed (YES), the display control information JH is supplied to the image output processing unit 134, and the display is changed to a single-screen display of the selection area display image Gsc, with the entire image Gcp removed from the composite image (ST3608).
- in this way, the image output processing unit 134 is controlled by the processing control unit 135 so that, as shown in FIGS. 49 and 50, the display mode MH is changed from the both display mode MH3 to the selected image display mode MH2 after the predetermined time has elapsed.
- when the processing control unit 135 of the image processing unit 13 determines that the selection region has not been switched (NO in ST3601), it determines whether or not there is a switch of the display mode MH or the area selection mode MS, that is, whether or not input information PS indicating a user operation of the above-mentioned “Selection” buttons Gua1, Gub1 or the “Menu” button Guc3 for switching the mode has been supplied from the input unit 12 (ST3602).
- when there is a mode switch (YES in ST3602), the processing after ST3606 is performed.
- when there is no mode switch, the image processing unit 13 determines whether or not there is a change in the posture (installation angle) of the imaging unit 11 (ST3603). That is, the process control unit 135 determines the change, for example, from the difference between the detection result and a predetermined initial value, based on the sensor signal ES indicating the tilt angle detected by the gyro sensor 41. When there is a change in the installation angle of the imaging unit 11 (YES), the image processing unit 13 performs the processing after ST3606.
- when the processing control unit 135 determines in ST3603 that the installation angle of the imaging unit 11 has not changed (NO), it determines whether or not there is any user operation on the input unit 12; if there is a user operation (YES), the processing after ST3606 is performed, and if there is no user operation (NO), the processing from ST3601 is repeated. Note that switching of the display mode MH, the area selection mode MS, or the selection area is handled by the processing of ST3601 or ST3602, so user operations other than these are judged here.
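The timed reversion of ST3601–ST3608 could be sketched as a small controller (a hypothetical Python sketch; the class, the event/tick structure, and the 3-second default are illustrative assumptions, not from the patent):

```python
import time

class DisplayModeController:
    """Sketch of the timed reversion described above: any selection-area
    switch, mode switch, posture change, or user operation shows the whole
    image Gcp alongside Gsc (both display mode MH3), then reverts to the
    selected image display mode MH2 after hold_s seconds."""

    def __init__(self, hold_s=3.0, clock=time.monotonic):
        self.hold_s = hold_s      # the "predetermined time"
        self.clock = clock        # injectable clock for testing
        self.mode = "MH2"         # start in the selected image display mode
        self._since = None

    def on_event(self):
        """Call on selection switch, mode switch, posture change, or any
        other user operation (ST3601-ST3604 paths)."""
        self.mode = "MH3"
        self._since = self.clock()

    def tick(self):
        """Call periodically; reverts MH3 -> MH2 once hold_s has elapsed
        (ST3607-ST3608)."""
        if self.mode == "MH3" and self.clock() - self._since >= self.hold_s:
            self.mode = "MH2"
        return self.mode
```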
- in FIG. 51, when it is determined that there is no selection area switching, it is then determined in sequence whether there is a mode switch, a posture change, and an operation input. Alternatively, as in ST3605 shown in Fig. 52, it may be determined in a single step whether there is any of selection area switching, mode switching, posture change, or operation input; even when the processing is performed in this way, the entire image Gcp can be displayed for the predetermined time.
- in this way, display is performed so that the entire image Gcp remains displayed for at least the predetermined time. Therefore, when the user switches the selected area or changes the mode, how the selected area has been set can be easily confirmed from the emphasized image Gs of the image area ARs in the whole image Gcp, and after the confirmation, the selected area display image Gsc can be observed without being disturbed by the whole image Gcp.
- in the above description, when the selection area is switched or the display mode MH, the area selection mode MS, or the like is changed, the image processing unit 13 first displays the entire image Gcp in the both display mode MH3 and then switches to the selected image display mode MH2 without the entire image Gcp after the predetermined time has elapsed. Conversely, the image data DVd may be generated so that only the selected area display image Gsc is first output in the selected image display mode MH2, and after the predetermined time has elapsed, the composite image combined with the entire image Gcp is displayed in the both display mode MH3.
- in this case, the selection area display image Gsc displayed on the display unit 14 can first be confirmed without blind spots, and then the whole image Gcp displayed after the predetermined time has elapsed can be observed to check the current selection area.
- FIG. 53 is a diagram showing another form of the display mode switching process.
- as shown in FIG. 53, the image processing unit 13 may reduce the selection region display image Gsc and output it side by side with the entire image Gcp in the composite image so that the two do not overlap.
- the entire image Gcp may be synthesized and output in a semi-transparent state on the selection area display image Gsc.
- the image processing unit 13 does not have to display the entire image Gcp, and may perform processing so that the user can observe the selected region display image Gsc without blind spots.
- the image processing unit 13 may also change the predetermined time dynamically instead of fixing it. For example, when the area selection mode MS is switched, the position of the selection area may change significantly; in this case, the predetermined time may be made longer than when the selection area is switched within the same area selection mode MS or when the posture of the imaging unit 11 is changed, so that the user can confirm the selected area with certainty. Also, for example, when the number of divided areas increases in the divided display mode MH4, it may take time to confirm the plurality of selected areas; therefore, when the user performs an operation that increases the number of divided areas (the number of emphasized images Gs in the entire image Gcp), the predetermined time may be made longer than when switching modes or changing postures, so that the user can confirm the selected areas.
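The dynamic adjustment of the predetermined time could be sketched as follows (hypothetical; the cause names, the multiplier for mode switches, and the per-division increment are illustrative assumptions, not from the patent):

```python
def hold_time(cause, n_divisions=1, base_s=3.0):
    """Return the whole-image display time: longer when the selection area
    may have moved far (an area-selection-mode switch) or when there are
    many divided areas (emphasized images Gs) to confirm."""
    t = base_s
    if cause == "mode_switch":
        t *= 2.0                      # position may change significantly
    if n_divisions > 1:
        t += 0.5 * (n_divisions - 1)  # more divided areas to confirm
    return t
```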
- as described above, a first area selection mode for selecting a selection area indicating a partial area of the visual field using an orthogonal coordinate system with respect to the visual field represented by the image data, and a second area selection mode for selecting a selection area indicating a partial area of the visual field using a polar coordinate system, are provided, and distortion correction is performed on the image data corresponding to the selection area selected in the set first or second area selection mode. For this reason, the invention is suitable when a desired area of a captured wide-angle image is set as a selection area and the image in the selection area is to be confirmed.
Abstract
Description
Claims
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06823320A EP1954029B1 (en) | 2005-11-11 | 2006-11-10 | Image processing device, image processing method, program thereof, and recording medium containing the program |
CN2006800421038A CN101305595B (zh) | 2005-11-11 | 2006-11-10 | 图像处理设备以及图像处理方法 |
US12/084,349 US8254713B2 (en) | 2005-11-11 | 2006-11-10 | Image processing apparatus, image processing method, program therefor, and recording medium in which the program is recorded |
DE602006021225T DE602006021225D1 (de) | 2005-11-11 | 2006-11-10 | Bildverarbeitungseinrichtung, bildverarbeitungsverfahren, programm dafür und das programm enthaltendes aufzeichnungsmedium |
KR1020087011221A KR101329470B1 (ko) | 2005-11-11 | 2006-11-10 | 화상 처리 장치, 화상 처리 방법 및 그 프로그램을 기록한 기록 매체 |
JP2007544204A JP5136059B2 (ja) | 2005-11-11 | 2006-11-10 | 画像処理装置、画像処理方法、そのプログラム及びそのプログラムを記録した記録媒体 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005327749 | 2005-11-11 | ||
JP2005-327749 | 2005-11-11 | ||
JP2006176915 | 2006-06-27 | ||
JP2006-176915 | 2006-06-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007055335A1 true WO2007055335A1 (ja) | 2007-05-18 |
Family
ID=38023329
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/322498 WO2007055335A1 (ja) | 2005-11-11 | 2006-11-10 | 画像処理装置、画像処理方法、そのプログラム及びそのプログラムを記録した記録媒体 |
PCT/JP2006/322499 WO2007055336A1 (ja) | 2005-11-11 | 2006-11-10 | 画像処理装置、画像処理方法、そのプログラム及びそのプログラムを記録した記録媒体と撮像装置 |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/322499 WO2007055336A1 (ja) | 2005-11-11 | 2006-11-10 | 画像処理装置、画像処理方法、そのプログラム及びそのプログラムを記録した記録媒体と撮像装置 |
Country Status (6)
Country | Link |
---|---|
US (2) | US8254713B2 (ja) |
EP (2) | EP1950954B1 (ja) |
JP (2) | JP5136060B2 (ja) |
KR (2) | KR101329470B1 (ja) |
DE (2) | DE602006021219D1 (ja) |
WO (2) | WO2007055335A1 (ja) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012085267A (ja) * | 2010-10-14 | 2012-04-26 | Sony Corp | 撮像装置、撮像システム及び撮像方法 |
JP2012244480A (ja) * | 2011-05-20 | 2012-12-10 | Toshiba Teli Corp | 全方位監視画像表示処理システム |
JP2012253620A (ja) * | 2011-06-03 | 2012-12-20 | Sony Computer Entertainment Inc | 画像処理装置 |
JP2014106888A (ja) * | 2012-11-29 | 2014-06-09 | Brother Ind Ltd | 作業補助システムおよびプログラム |
JP2015018013A (ja) * | 2013-07-08 | 2015-01-29 | 株式会社リコー | 表示制御装置、プログラム及び記録媒体 |
JP2016025516A (ja) * | 2014-07-22 | 2016-02-08 | キヤノン株式会社 | 情報処理装置、情報処理方法およびプログラム |
JP2016039539A (ja) * | 2014-08-08 | 2016-03-22 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP2016123127A (ja) * | 2016-03-03 | 2016-07-07 | パナソニックIpマネジメント株式会社 | 表示装置及びコンピュータプログラム |
JP2017058812A (ja) * | 2015-09-15 | 2017-03-23 | カシオ計算機株式会社 | 画像表示装置、画像表示方法及びプログラム |
JP2017208658A (ja) * | 2016-05-17 | 2017-11-24 | キヤノン株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP2018107583A (ja) * | 2016-12-26 | 2018-07-05 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP2018139096A (ja) * | 2016-11-30 | 2018-09-06 | 株式会社リコー | 情報処理装置およびプログラム |
JP2018174468A (ja) * | 2017-03-31 | 2018-11-08 | キヤノン株式会社 | 映像表示装置、映像表示装置の制御方法及びプログラム |
JP2018201065A (ja) * | 2017-05-25 | 2018-12-20 | キヤノン株式会社 | 表示制御装置、表示制御方法及びプログラム |
JP2018206205A (ja) * | 2017-06-07 | 2018-12-27 | 村田機械株式会社 | 魚眼画像補正方法、魚眼画像補正プログラム及び魚眼画像補正装置。 |
JP2019009507A (ja) * | 2017-06-20 | 2019-01-17 | キヤノン株式会社 | 画像処理装置およびその制御方法、撮像装置、監視システム |
JPWO2018043135A1 (ja) * | 2016-08-31 | 2019-06-24 | ソニー株式会社 | 情報処理装置、情報処理方法、並びにプログラム |
JP2020006177A (ja) * | 2018-07-06 | 2020-01-16 | メドス・インターナショナル・エスエイアールエルMedos International SARL | カメラスコープ電子可変プリズム |
JP2021051593A (ja) * | 2019-09-25 | 2021-04-01 | 株式会社リコー | 画像処理システム、画像処理装置および方法 |
US11651471B2 (en) | 2011-02-10 | 2023-05-16 | Panasonic Intellectual Property Management Co., Ltd. | Display device, computer program, and computer-implemented method |
Families Citing this family (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007288354A (ja) | 2006-04-13 | 2007-11-01 | Opt Kk | カメラ装置、画像処理装置および画像処理方法 |
JP2008225522A (ja) * | 2007-03-08 | 2008-09-25 | Sony Corp | 画像処理装置、カメラ装置、画像処理方法、およびプログラム |
JP5109803B2 (ja) | 2007-06-06 | 2012-12-26 | ソニー株式会社 | 画像処理装置、画像処理方法及び画像処理プログラム |
JP5349839B2 (ja) * | 2007-06-22 | 2013-11-20 | キヤノン株式会社 | 生体情報イメージング装置 |
JP4940050B2 (ja) * | 2007-08-09 | 2012-05-30 | キヤノン株式会社 | 画像データに歪曲収差補正を施す画像処理方法、プログラム、および、記録媒体 |
JP5067336B2 (ja) * | 2007-12-26 | 2012-11-07 | 大日本印刷株式会社 | 画像変換装置および画像変換方法 |
JP5124835B2 (ja) * | 2008-02-05 | 2013-01-23 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、およびプログラム |
US8018353B2 (en) * | 2008-05-09 | 2011-09-13 | Omron Scientic Technologies, Inc. | Method and apparatus for zone selection in area monitoring devices |
US8264598B2 (en) | 2008-09-22 | 2012-09-11 | Freedom Scientific, Inc. | Multiposition handheld electronic magnifier |
US8115831B2 (en) * | 2008-08-04 | 2012-02-14 | Freedom Scientific, Inc. | Portable multi position magnifier camera |
JP5169787B2 (ja) * | 2008-12-12 | 2013-03-27 | 大日本印刷株式会社 | 画像変換装置および画像変換方法 |
EP2207342B1 (en) * | 2009-01-07 | 2017-12-06 | LG Electronics Inc. | Mobile terminal and camera image control method thereof |
JP5049300B2 (ja) * | 2009-01-20 | 2012-10-17 | クラリオン株式会社 | 障害物検出表示装置 |
CN102395994B (zh) * | 2010-03-18 | 2015-06-10 | 松下电器产业株式会社 | 全景图像处理装置及全景图像处理方法 |
JP2011203446A (ja) * | 2010-03-25 | 2011-10-13 | Fujifilm Corp | ヘッドマウントディスプレイ装置 |
JP5720134B2 (ja) * | 2010-04-20 | 2015-05-20 | 株式会社リコー | 画像検査装置及び画像形成装置 |
JP5672862B2 (ja) | 2010-08-27 | 2015-02-18 | ソニー株式会社 | 撮像装置、撮像システム及び撮像方法 |
US9055205B2 (en) * | 2010-09-03 | 2015-06-09 | Canon Kabushiki Kaisha | Imaging control system, control apparatus, control method, and storage medium |
JPWO2012081400A1 (ja) * | 2010-12-14 | 2014-05-22 | コニカミノルタ株式会社 | 画像処理方法、画像処理装置及び撮像装置 |
TWI516119B (zh) * | 2011-01-25 | 2016-01-01 | 華晶科技股份有限公司 | 電子裝置、影像擷取裝置及其方法 |
WO2013114848A1 (ja) | 2012-01-31 | 2013-08-08 | パナソニック株式会社 | 画像処理装置及び画像処理方法 |
JP5924020B2 (ja) * | 2012-02-16 | 2016-05-25 | セイコーエプソン株式会社 | プロジェクター、及び、プロジェクターの制御方法 |
JP5925579B2 (ja) * | 2012-04-25 | 2016-05-25 | ルネサスエレクトロニクス株式会社 | 半導体装置、電子装置、及び画像処理方法 |
JP6303270B2 (ja) * | 2012-05-18 | 2018-04-04 | 株式会社リコー | ビデオ会議端末装置、ビデオ会議システム、映像の歪み補正方法および映像の歪み補正プログラム |
JP6186775B2 (ja) * | 2012-05-31 | 2017-08-30 | 株式会社リコー | 通信端末、表示方法、及びプログラム |
JP6071364B2 (ja) * | 2012-09-19 | 2017-02-01 | キヤノン株式会社 | 画像処理装置、その制御方法、および制御プログラム |
JP2014143678A (ja) * | 2012-12-27 | 2014-08-07 | Panasonic Corp | 音声処理システム及び音声処理方法 |
CN106027910B (zh) * | 2013-01-22 | 2019-08-16 | 华为终端有限公司 | 预览画面呈现方法、装置及终端 |
JP6104010B2 (ja) * | 2013-03-26 | 2017-03-29 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法、画像処理プログラム、および、記憶媒体 |
JP6229283B2 (ja) * | 2013-03-26 | 2017-11-15 | 株式会社リコー | 画像処理装置、表示端末及び画像表示システム、並びに、画像処理方法、表示端末の制御方法、画像表示システムの制御方法及びそれらの方法のプログラム |
USD768644S1 (en) * | 2013-11-21 | 2016-10-11 | Nikon Corporation | Display screen with transitional graphical user interface |
USD760785S1 (en) * | 2014-01-13 | 2016-07-05 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
JP5835384B2 (ja) | 2014-03-18 | 2015-12-24 | 株式会社リコー | 情報処理方法、情報処理装置、およびプログラム |
JP5835383B2 (ja) * | 2014-03-18 | 2015-12-24 | 株式会社リコー | 情報処理方法、情報処理装置、およびプログラム |
US9883101B1 (en) * | 2014-07-23 | 2018-01-30 | Hoyos Integrity Corporation | Providing a real-time via a wireless communication channel associated with a panoramic video capture device |
JP2016025639A (ja) * | 2014-07-24 | 2016-02-08 | エイオーエフ イメージング テクノロジー リミテッド | 撮像装置、画像信号転送制御方法およびプログラム |
CN106664369B (zh) * | 2014-09-05 | 2020-05-19 | 富士胶片株式会社 | 云台操作装置、相机系统及云台操作方法 |
KR20160045441A (ko) * | 2014-10-17 | 2016-04-27 | 삼성전자주식회사 | 동영상 재생 방법 및 장치 |
CN105141827B (zh) | 2015-06-30 | 2017-04-26 | 广东欧珀移动通信有限公司 | 一种畸变校正方法及终端 |
CN104994288B (zh) * | 2015-06-30 | 2018-03-27 | 广东欧珀移动通信有限公司 | 一种拍照方法及用户终端 |
US10909384B2 (en) | 2015-07-14 | 2021-02-02 | Panasonic Intellectual Property Management Co., Ltd. | Monitoring system and monitoring method |
CN108702441A (zh) * | 2016-02-24 | 2018-10-23 | 株式会社理光 | 图像处理设备、图像处理系统及程序 |
US10824320B2 (en) * | 2016-03-07 | 2020-11-03 | Facebook, Inc. | Systems and methods for presenting content |
JP6942940B2 (ja) * | 2016-03-14 | 2021-09-29 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
JP6309176B2 (ja) * | 2016-03-15 | 2018-04-11 | 三菱電機株式会社 | 遠隔作業支援装置、指示用端末及び現場用端末 |
KR20170124811A (ko) * | 2016-05-03 | 2017-11-13 | 삼성전자주식회사 | 영상 표시 장치 및 그 동작방법 |
JPWO2018030319A1 (ja) * | 2016-08-12 | 2018-08-09 | パナソニックIpマネジメント株式会社 | 測距システム、および、移動体システム |
KR102536945B1 (ko) * | 2016-08-30 | 2023-05-25 | 삼성전자주식회사 | 영상 표시 장치 및 그 동작방법 |
JP6996511B2 (ja) * | 2016-10-06 | 2022-01-17 | ソニーグループ株式会社 | 再生装置および再生方法、並びにプログラム |
WO2018102990A1 (en) * | 2016-12-06 | 2018-06-14 | SZ DJI Technology Co., Ltd. | System and method for rectifying a wide-angle image |
KR20180073327A (ko) * | 2016-12-22 | 2018-07-02 | 삼성전자주식회사 | 영상 표시 방법, 저장 매체 및 전자 장치 |
US10521468B2 (en) * | 2017-06-13 | 2019-12-31 | Adobe Inc. | Animated seek preview for panoramic videos |
WO2019049331A1 (ja) * | 2017-09-08 | 2019-03-14 | 株式会社ソニー・インタラクティブエンタテインメント | キャリブレーション装置、キャリブレーションシステム、およびキャリブレーション方法 |
JP2019054322A (ja) * | 2017-09-12 | 2019-04-04 | 株式会社リコー | 通信端末、画像通信システム、通信方法、及びプログラム |
JP6933566B2 (ja) * | 2017-11-24 | 2021-09-08 | キヤノンメディカルシステムズ株式会社 | 医用画像表示装置 |
JP6688277B2 (ja) * | 2017-12-27 | 2020-04-28 | 本田技研工業株式会社 | プログラム、学習処理方法、学習モデル、データ構造、学習装置、および物体認識装置 |
US10582181B2 (en) | 2018-03-27 | 2020-03-03 | Honeywell International Inc. | Panoramic vision system with parallax mitigation |
US11295541B2 (en) * | 2019-02-13 | 2022-04-05 | Tencent America LLC | Method and apparatus of 360 degree camera video processing with targeted view |
US11153481B2 (en) * | 2019-03-15 | 2021-10-19 | STX Financing, LLC | Capturing and transforming wide-angle video information |
KR20210066366A (ko) * | 2019-11-28 | 2021-06-07 | 삼성전자주식회사 | 영상 복원 방법 및 장치 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000132673A (ja) * | 1998-10-28 | 2000-05-12 | Sharp Corp | 画像システム |
JP2005063141A (ja) * | 2003-08-12 | 2005-03-10 | D Link Corp | 画像変換システム及び画像変換方法 |
EP1515548A2 (en) | 2003-09-12 | 2005-03-16 | Sensormatic Electronics Corporation | Imaging system and method for displaying and /or recording undistorted wide-angle image data |
JP2005086279A (ja) * | 2003-09-04 | 2005-03-31 | Equos Research Co Ltd | 撮像装置、及び撮像装置を備えた車両 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63205233A (ja) * | 1987-02-21 | 1988-08-24 | Photo Composing Mach Mfg Co Ltd | 写真植字機 |
CA2635901C (en) | 1995-07-27 | 2009-05-19 | Sensormatic Electronics Corporation | Image splitting, forming and processing device and method for use with no moving parts camera |
US6304284B1 (en) * | 1998-03-31 | 2001-10-16 | Intel Corporation | Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera |
US7046275B1 (en) * | 1998-10-15 | 2006-05-16 | Ricoh Company, Ltd. | Digital camera and imaging method |
JP2000324386A (ja) | 1999-05-07 | 2000-11-24 | Sony Corp | 魚眼レンズを用いた録画再生装置 |
JP4340358B2 (ja) * | 1999-08-02 | 2009-10-07 | 富士フイルム株式会社 | 画像撮影装置 |
US6833843B2 (en) * | 2001-12-03 | 2004-12-21 | Tempest Microsystems | Panoramic imaging and display system with canonical magnifier |
US7224382B2 (en) * | 2002-04-12 | 2007-05-29 | Image Masters, Inc. | Immersive imaging system |
US7194699B2 (en) * | 2003-01-14 | 2007-03-20 | Microsoft Corporation | Animating images to reflect user selection |
US7450165B2 (en) * | 2003-05-02 | 2008-11-11 | Grandeye, Ltd. | Multiple-view processing in wide-angle video camera |
JP2005110202A (ja) * | 2003-09-08 | 2005-04-21 | Auto Network Gijutsu Kenkyusho:Kk | カメラ装置及び車両周辺監視装置 |
US7839446B2 (en) * | 2005-08-30 | 2010-11-23 | Olympus Corporation | Image capturing apparatus and image display apparatus including imparting distortion to a captured image |
-
2006
- 2006-11-10 WO PCT/JP2006/322498 patent/WO2007055335A1/ja active Application Filing
- 2006-11-10 US US12/084,349 patent/US8254713B2/en active Active
- 2006-11-10 WO PCT/JP2006/322499 patent/WO2007055336A1/ja active Application Filing
- 2006-11-10 JP JP2007544205A patent/JP5136060B2/ja not_active Expired - Fee Related
- 2006-11-10 DE DE602006021219T patent/DE602006021219D1/de active Active
- 2006-11-10 DE DE602006021225T patent/DE602006021225D1/de active Active
- 2006-11-10 KR KR1020087011221A patent/KR101329470B1/ko active IP Right Grant
- 2006-11-10 EP EP06823321A patent/EP1950954B1/en not_active Expired - Fee Related
- 2006-11-10 JP JP2007544204A patent/JP5136059B2/ja not_active Expired - Fee Related
- 2006-11-10 US US12/084,345 patent/US8169527B2/en active Active
- 2006-11-10 EP EP06823320A patent/EP1954029B1/en not_active Expired - Fee Related
-
2008
- 2008-05-09 KR KR1020087011223A patent/KR101270893B1/ko active IP Right Grant
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000132673A (ja) * | 1998-10-28 | 2000-05-12 | Sharp Corp | 画像システム |
JP2005063141A (ja) * | 2003-08-12 | 2005-03-10 | D Link Corp | 画像変換システム及び画像変換方法 |
JP2005086279A (ja) * | 2003-09-04 | 2005-03-31 | Equos Research Co Ltd | 撮像装置、及び撮像装置を備えた車両 |
EP1515548A2 (en) | 2003-09-12 | 2005-03-16 | Sensormatic Electronics Corporation | Imaging system and method for displaying and /or recording undistorted wide-angle image data |
Non-Patent Citations (1)
Title |
---|
See also references of EP1954029A4 |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016171599A (ja) * | 2010-10-14 | 2016-09-23 | Sony Corporation | Imaging device, imaging system, and imaging method |
US11418752B2 (en) | 2010-10-14 | 2022-08-16 | Sony Group Corporation | Vehicle camera system |
US10178339B2 (en) | 2010-10-14 | 2019-01-08 | Sony Corporation | Capturing device, capturing system and capturing method |
US11082657B2 (en) | 2010-10-14 | 2021-08-03 | Sony Group Corporation | Camera system for use in a vehicle with settable image enlargement values |
US9485429B2 (en) | 2010-10-14 | 2016-11-01 | Sony Corporation | Capturing device, capturing system and capturing method |
US9215376B2 (en) | 2010-10-14 | 2015-12-15 | Sony Corporation | Capturing device, capturing system and capturing method |
US10142580B2 (en) | 2010-10-14 | 2018-11-27 | Sony Corporation | Capturing device, capturing system and capturing method |
US9643539B2 (en) | 2010-10-14 | 2017-05-09 | Sony Corporation | Capturing device, capturing system and capturing method |
JP2012085267A (ja) * | 2010-10-14 | 2012-04-26 | Sony Corp | Imaging device, imaging system, and imaging method |
US11651471B2 (en) | 2011-02-10 | 2023-05-16 | Panasonic Intellectual Property Management Co., Ltd. | Display device, computer program, and computer-implemented method |
JP2012244480A (ja) * | 2011-05-20 | 2012-12-10 | Toshiba Teli Corp | Omnidirectional surveillance image display processing system |
US9363466B2 (en) | 2011-06-03 | 2016-06-07 | Sony Corporation | Image processing device for determining an orientation and a direction of the image processing device |
JP2012253620A (ja) * | 2011-06-03 | 2012-12-20 | Sony Computer Entertainment Inc | Image processing device |
JP2014106888A (ja) * | 2012-11-29 | 2014-06-09 | Brother Ind Ltd | Work assistance system and program |
JP2015018013A (ja) * | 2013-07-08 | 2015-01-29 | Ricoh Co., Ltd. | Display control device, program, and recording medium |
JP2016025516A (ja) * | 2014-07-22 | 2016-02-08 | Canon Inc. | Information processing device, information processing method, and program |
JP2016039539A (ja) * | 2014-08-08 | 2016-03-22 | Canon Inc. | Image processing device, image processing method, and program |
JP2017058812A (ja) * | 2015-09-15 | 2017-03-23 | Casio Computer Co., Ltd. | Image display device, image display method, and program |
JP2016123127A (ja) * | 2016-03-03 | 2016-07-07 | Panasonic IP Management Co., Ltd. | Display device and computer program |
JP2017208658A (ja) * | 2016-05-17 | 2017-11-24 | Canon Inc. | Image processing device, image processing method, and program |
JPWO2018043135A1 (ja) * | 2016-08-31 | 2019-06-24 | Sony Corporation | Information processing device, information processing method, and program |
JP2018139096A (ja) * | 2016-11-30 | 2018-09-06 | Ricoh Co., Ltd. | Information processing device and program |
JP7020024B2 (ja) | 2016-11-30 | 2022-02-16 | Ricoh Co., Ltd. | Information processing device and program |
JP2018107583A (ja) * | 2016-12-26 | 2018-07-05 | Canon Inc. | Information processing device, information processing method, and program |
JP2018174468A (ja) * | 2017-03-31 | 2018-11-08 | Canon Inc. | Video display device, video display device control method, and program |
JP2018201065A (ja) * | 2017-05-25 | 2018-12-20 | Canon Inc. | Display control device, display control method, and program |
US11190747B2 (en) | 2017-05-25 | 2021-11-30 | Canon Kabushiki Kaisha | Display control apparatus, display control method, and storage medium |
JP2018206205A (ja) * | 2017-06-07 | 2018-12-27 | Murata Machinery, Ltd. | Fisheye image correction method, fisheye image correction program, and fisheye image correction device |
JP2019009507A (ja) * | 2017-06-20 | 2019-01-17 | Canon Inc. | Image processing device and control method thereof, imaging device, and surveillance system |
JP2020006177A (ja) * | 2018-07-06 | 2020-01-16 | Medos International SARL | Camera scope electronic variable prism |
JP7460336B2 (ja) | 2018-07-06 | 2024-04-02 | Medos International SARL | Camera scope electronic variable prism |
JP2021051593A (ja) * | 2019-09-25 | 2021-04-01 | Ricoh Co., Ltd. | Image processing system, image processing device, and method |
JP7419723B2 (ja) | 2019-09-25 | 2024-01-23 | Ricoh Co., Ltd. | Image processing system, image processing device, and method |
Also Published As
Publication number | Publication date |
---|---|
US20090160996A1 (en) | 2009-06-25 |
JPWO2007055335A1 (ja) | 2009-04-30 |
JP5136060B2 (ja) | 2013-02-06 |
EP1954029B1 (en) | 2011-04-06 |
DE602006021219D1 (de) | 2011-05-19 |
US8254713B2 (en) | 2012-08-28 |
JPWO2007055336A1 (ja) | 2009-04-30 |
EP1950954B1 (en) | 2011-04-06 |
US20090041378A1 (en) | 2009-02-12 |
EP1954029A4 (en) | 2010-02-24 |
US8169527B2 (en) | 2012-05-01 |
JP5136059B2 (ja) | 2013-02-06 |
DE602006021225D1 (de) | 2011-05-19 |
KR101329470B1 (ko) | 2013-11-13 |
KR101270893B1 (ko) | 2013-06-05 |
EP1950954A4 (en) | 2010-03-03 |
KR20080068697A (ko) | 2008-07-23 |
EP1950954A1 (en) | 2008-07-30 |
KR20080068698A (ko) | 2008-07-23 |
WO2007055336A1 (ja) | 2007-05-18 |
EP1954029A1 (en) | 2008-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5136059B2 (ja) | Image processing device, image processing method, program thereof, and recording medium recording the program | |
JP6587113B2 (ja) | Image processing device and image processing method | |
JP5569329B2 (ja) | Conference system, surveillance system, image processing device, image processing method, image processing program, and the like | |
KR101662074B1 (ko) | Control device, camera system, and recording medium | |
CN101305595B (zh) | Image processing device and image processing method | |
US20150109452A1 (en) | Display image formation device and display image formation method | |
KR20180129667A (ko) | Display control device, display control method, and storage medium | |
KR101718081B1 (ko) | Ultra-wide-angle camera system for hand gesture recognition and TVI (Transport Video Interface) device using the same | |
JP2007288354A (ja) | Camera device, image processing device, and image processing method | |
KR102009988B1 (ko) | Image correction method of a camera system that corrects lens-distorted images using an ultra-wide-angle camera, and TVI device using the same | |
JP2008028778A (ja) | Image processing device, image processing method, and program thereof | |
JP2019032448A (ja) | Control device, control method, and program | |
US11928775B2 (en) | Apparatus, system, method, and non-transitory medium which map two images onto a three-dimensional object to generate a virtual image | |
KR102518863B1 (ko) | Monitoring system that recommends view modes | |
US11516390B2 (en) | Imaging apparatus and non-transitory storage medium | |
JP2019008494A (ja) | Image processing device | |
JP2021164100A (ja) | Video adjustment system and video adjustment device | |
JP2013223012A (ja) | Image processing device, image processing method, and image processing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680042103.8 Country of ref document: CN |
|
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 12084349 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007544204 Country of ref document: JP Ref document number: 1020087011221 Country of ref document: KR Ref document number: 2006823320 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |