WO2015033810A1 - Imaging device and method, and program - Google Patents
Imaging device and method, and program
- Publication number
- WO2015033810A1 (PCT/JP2014/072105)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- zoom
- image
- operation state
- control unit
- angular velocity
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
- G02B7/08—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B5/00—Adjustment of optical system relative to image or object surface other than for focusing
- G03B5/06—Swinging lens about normal to the optical axis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2205/00—Adjustment of optical system relative to image or object surface other than for focusing
- G03B2205/0046—Movement of one or more optical elements for zooming
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2217/00—Details of cameras or camera bodies; Accessories therefor
- G03B2217/005—Blur detection
Definitions
- the present technology relates to an imaging apparatus, method, and program, and more particularly, to an imaging apparatus, method, and program that enable more appropriate zoom control.
- a panning operation is detected based on the output of the angular velocity sensor.
- the camera shake correction process is stopped and the panning velocity is obtained from the amount of displacement of the angular velocity.
- zoom control in the zoom-out direction is performed based on the obtained panning speed and zoom position information.
- the target subject can easily fit within the screen by zooming out, improving the usability of the imaging apparatus.
- in that technique, however, the zoom magnification is simply changed to a magnification determined from the obtained panning speed itself, that is, from the angular velocity itself, so appropriate zoom control could not be performed.
- the present technology has been made in view of such a situation, and makes it possible to perform more appropriate zoom control.
- An imaging apparatus includes an angular velocity detection unit that detects an angular velocity of its own movement, an operation determination unit that determines a user's operation state based on the angular velocities at a plurality of different times, and a zoom control unit that performs zoom control based on the operation state.
- the zoom control unit can perform zoom control so that the zoom magnification is reduced when the angular velocity increases with time.
- the zoom control unit can perform zoom control so that the zoom magnification is increased in the operation state in which the angular velocity decreases with time.
- the zoom control unit can perform zoom control so that the zoom magnification is kept constant when the operation state in which the angular velocity decreases with time changes to the operation state in which the angular velocity increases with time.
- while specific zoom control is being performed, the zoom control unit can make it more difficult for the operation determination unit to determine that a predetermined operation state holds.
- the operation determination unit can obtain a relational expression between the time and the angular velocity based on the angular velocities at the plurality of different times, and determine the operation state by comparing the obtained coefficients of the relational expression with the range of values that those coefficients can take, determined in advance for each of a plurality of the operation states.
- the operation determination unit may determine the operation state by obtaining the distances between the angular velocities at the plurality of different times and the angular velocities at a plurality of times determined in advance for each of the plurality of operation states.
- the operation determination unit can estimate the angular velocity at a future time based on the angular velocities at the plurality of different times, and determine the operation state based on the angular velocity obtained by the estimation and the angular velocities at the plurality of different times.
- the imaging apparatus may further include a zoom processing unit that generates a zoom image by cutting out a partial area of the captured image in accordance with the control of the zoom control unit.
- the imaging apparatus may further include a display control unit that displays the captured image and the zoom image side by side.
- the display control unit can display a frame around the region of the captured image from which the zoom image is cut out.
- the imaging apparatus may further include a display control unit that displays an image obtained by combining the images taken at a plurality of different times and the zoom image side by side.
- the display control unit can display a frame around the region of the composite image from which the zoom image is cut out.
- the imaging apparatus may further include an object detection unit that detects an object from the captured image, and when the object is detected near the center of the image, the zoom control unit can perform zoom control so that the object remains displayed in an enlarged state.
- the imaging apparatus may further include an object detection unit that detects an object from the captured image, and the zoom control unit can determine the position from which the zoom image is cut out based on the detection result of the object.
- the imaging apparatus may further include an object detection unit that detects an object from the captured image, and a correction unit that tilts a lens that collects light incident from the subject according to the detection result of the object.
- An imaging method or program includes the steps of detecting an angular velocity of movement of an imaging device, determining an operation state of a user operating the imaging device based on the angular velocities at a plurality of different times, and performing zoom control based on the operation state.
- In one aspect of the present technology, an angular velocity of movement of the imaging device is detected, an operation state of a user operating the imaging device is determined based on the angular velocities at a plurality of different times, and zoom control is performed based on the operation state.
- more appropriate zoom control can be performed.
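The control flow described by this aspect can be sketched as a loop: sample the angular velocity, classify a window of recent samples into an operation state, then adjust the zoom. The Python sketch below only illustrates that flow; the window contents, thresholds, and state labels are assumptions for illustration, not values from the disclosure.

```python
def determine_operation_state(samples):
    """Classify a window of recent angular-velocity samples into one of
    four operation states: 'a' stationary, 'b' pan starting (angular
    velocity rising), 'c' constant-speed pan, 'd' pan stopping.
    All thresholds are illustrative assumptions."""
    mean = sum(abs(w) for w in samples) / len(samples)
    trend = samples[-1] - samples[0]  # net change over the window
    if mean < 0.1:
        return "a"   # device essentially at rest
    if trend > 0.05:
        return "b"   # angular velocity increasing
    if abs(trend) <= 0.05:
        return "c"   # steady pan
    return "d"       # angular velocity decreasing
```

A downstream zoom controller would then map each state to a zoom action (hold, zoom out, or zoom in), as the description details later.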
- FIG. 1 is a diagram illustrating a configuration example of an imaging apparatus to which the present technology is applied.
- the imaging device 11 is, for example, a digital video camera, a digital still camera, a multi-function mobile phone, or the like.
- the imaging device 11 includes an imaging lens 21, an imaging element 22, an analog signal processing unit 23, an A/D (Analog/Digital) conversion unit 24, a control unit 25, a shutter control unit 26, a focus unit 27, an iris unit 28, a zoom unit 29, a display unit 30, a recording unit 31, an operation unit 32, a memory 33, and an angular velocity sensor 34.
- the imaging lens 21 includes a plurality of lenses, a mechanical shutter, and the like, and collects light incident from the subject and forms an image on the light receiving surface of the imaging element 22.
- the imaging element 22 is configured by a CCD (Charge Coupled Device) or the like, receives light incident from the imaging lens 21, performs photoelectric conversion, and supplies an electric signal obtained as a result to the analog signal processing unit 23.
- the analog signal processing unit 23 performs various processes such as noise removal on the electrical signal supplied from the image sensor 22 and supplies the processed signal to the A / D conversion unit 24.
- the A / D conversion unit 24 performs A / D conversion on the electrical signal supplied from the analog signal processing unit 23 to convert the electrical signal from an analog signal to a digital signal, and supplies the digital signal to the control unit 25.
- the electric signal output from the A / D converter 24 is used as image data. More specifically, the image data output from the A / D conversion unit 24 is subjected to gamma correction and white balance correction by a digital signal processing unit (not shown) and supplied to the control unit 25.
- the control unit 25 is a control circuit that comprehensively controls the operation of the entire imaging apparatus 11. For example, the control unit 25 supplies the image data supplied from the A / D conversion unit 24 to the display unit 30 to display an image, or supplies the image data to the recording unit 31 to record it.
- the control unit 25 includes an operation determination unit 41, a zoom control unit 42, and a display control unit 43.
- the operation determination unit 41 to the display control unit 43 are realized, for example, when the control unit 25 executes a program.
- the operation determination unit 41 determines the operation of the user with respect to the imaging apparatus 11 such as a panning operation at a constant speed based on the angular velocities at a plurality of different times supplied from the angular velocity sensor 34.
- the zoom control unit 42 determines the zoom magnification of the image at the time of shooting according to the determination result by the operation determination unit 41.
- the zoom control unit 42 controls the zoom unit 29 so that the zoom magnification at the imaging lens 21 becomes the determined zoom magnification or the zoom magnification determined by the user's operation on the operation unit 32, thereby capturing an image. Zoom control of the lens 21 is performed.
- the display control unit 43 controls image display on the display unit 30.
- the shutter control unit 26 performs shutter control of the imaging lens 21 according to the control of the control unit 25.
- the focus unit 27 drives the lens of the imaging lens 21 according to the control of the control unit 25 and performs focus control.
- focus control of the imaging lens 21 is performed based on the output of a focus sensor provided on the imaging element 22 or a focus sensor (not shown) provided in the imaging device 11. That is, the control unit 25 determines the focal position based on the signal supplied from the focus sensor and controls the focus unit 27.
- the iris unit 28 drives the iris (aperture) of the imaging lens 21 under the control of the control unit 25.
- the zoom unit 29 moves the lenses constituting the imaging lens 21 in the optical axis direction according to the control by the zoom control unit 42 so that an image is taken at the designated zoom magnification.
- the display unit 30 is composed of, for example, a liquid crystal display element, and displays an image based on the image data supplied from the display control unit 43.
- the recording unit 31 is a data recording medium such as an IC memory card or a built-in memory, and records various data such as still image and moving image data supplied from the control unit 25 and image metadata.
- the operation unit 32 includes, for example, a touch panel, buttons, switches, dials, zoom buttons, and the like, and supplies a signal corresponding to a user operation to the control unit 25.
- the memory 33 is a memory composed of, for example, RAM (Random Access Memory) or the like, and temporarily records data supplied from the control unit 25. That is, the memory 33 is used as a temporary storage area for data processed by the control unit 25.
- the angular velocity sensor 34 detects the angular velocity of the imaging device 11 and supplies it to the control unit 25.
- the angular velocity detected by the angular velocity sensor 34 is used for camera shake correction and zoom control of the imaging lens 21. Note that the angular velocity sensor 34 detects not only the angular velocity indicating the movement of the imaging device 11 in the horizontal direction but also the angular velocity indicating the movement of the imaging device 11 in the vertical direction.
- when the user moves the imaging device 11 in the horizontal direction, that is, when the user performs a panning operation on the imaging device 11, the imaging device 11 changes the zoom magnification in accordance with changes in the panning speed.
- a curve L11 indicates the horizontal angular velocity ω at each time t detected by the angular velocity sensor 34, and a broken line L12 indicates the zoom magnification Z of the imaging lens 21 at each time t.
- the vertical axis indicates angular velocity or zoom magnification
- the horizontal axis indicates time.
- the imaging device 11 decreases the zoom magnification Z with time in order to make it easy to grasp the entire scene. That is, the imaging device 11 changes the zoom magnification to the wide side so that the subject is zoomed out.
- the imaging device 11 keeps the zoom magnification Z at a constant value.
- the imaging apparatus 11 increases the zoom magnification Z with time so that a new subject to be observed can be easily seen. That is, the imaging device 11 changes the zoom magnification to the tele side so that the subject is zoomed in.
- by zooming out to the wide side while the user is panning quickly, the imaging device 11 can make the video appear to move more slowly and make the subject easier to see.
- when the panning slows, the zoom magnification is increased, so that the subject of interest can be enlarged and viewed easily without the user having to perform a zoom operation.
- the imaging device 11 slowly changes the zoom magnification according to the change in the panning speed.
- a curve L21 indicates the angular velocity ω at each time t detected by the angular velocity sensor 34, and a broken line L22 indicates the zoom magnification Z of the imaging lens 21 at each time t.
- the vertical axis indicates angular velocity or zoom magnification, and the horizontal axis indicates time.
- the imaging apparatus 11 increases the zoom magnification Z with time in accordance with the speed at which the user stops the panning operation, as indicated by the broken line L22.
- the imaging apparatus 11 can make it easier for the user to find an object of interest by increasing the zoom magnification in accordance with the speed.
- when the user has moved the imaging apparatus 11 too far and slightly moves it back in the direction opposite to the direction of movement, the imaging apparatus 11 does not change the zoom magnification, so that the subject can be found easily.
- a curve L31 indicates the angular velocity ω at each time t detected by the angular velocity sensor 34, and a broken line L32 indicates the zoom magnification Z of the imaging lens 21 at each time t.
- the vertical axis indicates angular velocity or zoom magnification, and the horizontal axis indicates time.
- the imaging device 11 increases the zoom magnification Z with time in order to make it easy to see a new subject to be observed in the section D11 as indicated by the broken line L32.
- in the subsequent section, where the angular velocity ω is a negative value, the imaging device 11 fixes the zoom magnification Z at a constant value.
- that is, when the moving direction of the imaging apparatus 11 is reversed during the user's panning operation, in other words, when the sign of the angular velocity ω changes from positive to negative and the absolute value of the angular velocity ω then starts to decrease, zooming in on the subject is stopped.
- in this case, it is assumed that the user tried to stop the panning operation to find a new subject, moved the imaging apparatus 11 too far, moved it back in the opposite direction, and then stopped. Therefore, the imaging apparatus 11 stops the zoom-in operation that had been performed to enlarge the new subject, making it easier to find the subject that has gone past the center of the screen.
- more appropriate zoom control can be performed by determining the zoom magnification Z based on the change in the angular velocity ω.
- in the following, the case where a panning operation moves the imaging device 11 in the horizontal direction and the zoom magnification is changed according to the panning operation will be described, but the zoom magnification can of course also be changed according to movement of the imaging device 11 in the vertical direction or in other directions.
- a curve L41 indicates the angular velocity ω at each time t detected by the angular velocity sensor 34.
- the vertical axis indicates angular velocity
- the horizontal axis indicates time.
- the user starts the panning operation from a state where the imaging device 11 is stopped without being moved, and then stops the panning operation.
- the imaging device 11 classifies the current state of the user's panning operation, based on the change in the angular velocity ω, into one of the operation states a to d specified by the time change of the angular velocity ω.
- while the imaging device 11 is stationary, the panning operation state is the operation state a.
- while the user starts the panning operation and the angular velocity ω increases, the panning operation state is the operation state b.
- while the panning operation is performed at a constant speed, the panning operation state is the operation state c.
- while the user stops the panning operation and the angular velocity ω decreases, the panning operation state is the operation state d.
- the imaging device 11 determines the operation state at the current time based on the angular velocities ω at a plurality of consecutive times detected by the angular velocity sensor 34, and selects one zoom state from among several predetermined zoom states based on the determined operation state. Then, the imaging device 11 changes the zoom magnification of the imaging lens 21 according to the selected zoom state.
- for example, zoom states Z(1) to Z(4) are predetermined as the zoom states, and the zoom magnification is controlled in each zoom state as shown in FIG. 6.
- a polygonal line L51 indicates the zoom magnification Z at each time
- the vertical axis and the horizontal axis indicate the zoom magnification Z and time t, respectively.
- as zoom control in the zoom state Z(1), the zoom magnification Z is set to Zmin.
- the zoom magnification Zmin is the minimum value of the zoom magnification Z that can be taken when the imaging apparatus 11 performs automatic zoom control; it is not necessarily the minimum value of the zoom magnification Z that the imaging lens 21 can take, that is, the zoom magnification at the wide end.
- while the zoom state is the zoom state Z(1), that is, for example, while the panning operation is performed at a constant speed and the operation state is c, the zoom magnification Z is kept at the constant zoom magnification Zmin.
- as zoom control in the zoom state Z(2), the zoom magnification Z is increased from Zmin to Zmax at a constant speed with time.
- the zoom magnification Zmax is the maximum value of the zoom magnification Z that can be taken when the imaging apparatus 11 performs automatic zoom control; it is not necessarily the maximum value of the zoom magnification Z that the imaging lens 21 can take, that is, the zoom magnification at the tele end.
- while the zoom state is the zoom state Z(2), that is, for example, while the user is about to stop the panning operation and the operation state is d, zoom control is performed so that the zoom magnification Z increases with time.
- as zoom control in the zoom state Z(3), the zoom magnification Z is set to Zmax.
- while the zoom state is the zoom state Z(3), that is, for example, while the imaging device 11 is stationary and the operation state is a, the zoom magnification Z is kept at the constant zoom magnification Zmax.
- as zoom control in the zoom state Z(4), the zoom magnification Z is decreased from Zmax to Zmin at a constant speed with time.
- while the zoom state is the zoom state Z(4), that is, for example, while the user starts the panning operation and the operation state is b, zoom control is performed so that the zoom magnification Z decreases with time.
- When the user turns on the power switch serving as the operation unit 32 of the imaging apparatus 11, the imaging apparatus 11 starts the shooting process, captures images such as moving images, and displays or records the obtained images.
- Next, the shooting process performed by the imaging device 11 will be described. Note that this shooting process is repeated until, for example, the user turns off the power of the imaging device 11 or instructs the end of shooting.
- in step S11, the control unit 25 turns on the power of the imaging device 11 so that power is supplied from a power source (not shown) to each unit of the imaging device 11.
- in step S12, the imaging device 11 starts capturing an image.
- that is, the imaging lens 21 forms an image of the light incident from the subject on the light receiving surface of the imaging element 22, and the imaging element 22 receives the light incident from the imaging lens 21, performs photoelectric conversion, and supplies the resulting electric signal to the analog signal processing unit 23.
- the analog signal processing unit 23 performs processing such as noise removal on the electrical signal supplied from the image sensor 22 and supplies the processed signal to the A / D conversion unit 24.
- the A/D conversion unit 24 performs A/D conversion on the electric signal supplied from the analog signal processing unit 23 and supplies it to the control unit 25.
- the electrical signal output from the A / D conversion unit 24 is used as image data of a captured image.
- the imaging lens 21 to the A / D conversion unit 24 sequentially perform a process of capturing a subject, generating image data, and supplying the image data to the control unit 25.
- the angular velocity sensor 34 starts detecting the angular velocity ω of the horizontal movement of the imaging device 11 and sequentially supplies the detected angular velocity ω at each time to the control unit 25.
- in step S13, the control unit 25 acquires the angular velocity ω from the angular velocity sensor 34 and supplies it to the memory 33.
- in step S14, the memory 33 stores the angular velocity ω supplied from the control unit 25.
- in step S15, the control unit 25 controls the memory 33 to delete the oldest stored angular velocity ω, that is, the angular velocity from N samples back. As a result, the memory 33 always holds the angular velocities ω at the latest N times.
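Steps S13 to S15 simply maintain a sliding window of the latest N angular velocities. In Python this is what a bounded deque does, with the deletion of the N-th previous sample happening automatically; the window size N = 5 follows the example used later in the description.

```python
from collections import deque

N = 5                       # number of retained samples (as in the example)
window = deque(maxlen=N)    # plays the role of the memory 33

def on_angular_velocity(w):
    """Steps S13-S15: store the new sample; the deque silently drops the
    N-th previous angular velocity once the window is full."""
    window.append(w)
    return list(window)     # the latest N angular velocities
```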
- in step S16, the operation determination unit 41 refers to the latest N angular velocities ω stored in the memory 33 and determines the user's operation by comparing these N angular velocities ω with a database recorded in advance in the recording unit 31.
- for example, suppose that N = 5, and that the angular velocities indicated by points P11 to P15 in the part of FIG. 8 indicated by arrow Q11 are obtained as the angular velocities ω at each time.
- the vertical axis and the horizontal axis indicate angular velocity and time.
- the operation determination unit 41 obtains a linear relational expression between the angular velocity ω and the time t, shown as the following expression (1), from the values of the angular velocity ω at the times indicated by the five points P11 to P15: ω = x + yt … (1).
- the straight line L61 represents the estimation formula for the angular velocity ω at each time t obtained from the points P11 to P15, that is, the linear expression shown in expression (1).
- the recording unit 31 records, as a database, the ideal values of the coefficient x and the coefficient y for each of the operation states a to d, more specifically, the range of values that the coefficient x and the coefficient y can take in each operation state.
- the operation determination unit 41 determines which of the operation states a to d the user's operation at the current time corresponds to by comparing the coefficient x and the coefficient y obtained from the N angular velocities ω with the coefficient x and the coefficient y recorded as a database for each operation state in the recording unit 31.
- for example, suppose that the regions determined by the ranges of values that the coefficient x and the coefficient y can take, recorded as the database, are those shown in the part indicated by arrow Q12.
- the vertical axis and the horizontal axis indicate the coefficient x and the coefficient y.
- the regions R11 to R14 on the xy space indicate regions determined from the database of the operation state a to the operation state d.
- the region R11 is a region indicating the range of possible values of the coefficient x and the coefficient y for the operation state a. Since the imaging apparatus 11 is stationary in the operation state a, the angular velocity ω should be small and the variation of the angular velocity ω in the time direction should also be small. Therefore, the region R11 corresponding to the operation state a is a region where the coefficient x and the coefficient y, that is, the intercept and slope of the estimation formula, are small values.
- the operation determination unit 41 determines the user's operation state according to which of the regions R11 to R14 contains the coordinates (position) in the xy space determined by the coefficient x and the coefficient y obtained from the N angular velocities ω. For example, when the position determined by the obtained coefficient x and coefficient y lies within the region R11, the user's operation at the current time is determined to be the operation state a.
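A minimal version of this classification can be written as an ordinary least-squares line fit followed by a rectangle lookup in (x, y) space. The region boundaries below are invented for illustration; real ranges would come from the database in the recording unit 31.

```python
def fit_line(times, omegas):
    """Least-squares fit of expression (1), omega = x + y*t
    (x: intercept, y: slope)."""
    n = len(times)
    mt = sum(times) / n
    mw = sum(omegas) / n
    y = (sum((t - mt) * (w - mw) for t, w in zip(times, omegas))
         / sum((t - mt) ** 2 for t in times))
    x = mw - y * mt
    return x, y

# Hypothetical (x, y) rectangles standing in for the regions R11-R14.
REGIONS = {
    "a": ((-0.1, 0.1), (-0.05, 0.05)),  # small intercept, small slope
    "b": ((-0.5, 0.5), (0.05, 5.0)),    # angular velocity rising
    "c": ((0.5, 5.0), (-0.05, 0.05)),   # large, steady angular velocity
    "d": ((-0.5, 5.0), (-5.0, -0.05)),  # angular velocity falling
}

def classify(x, y):
    """Return the operation state whose region contains (x, y), if any."""
    for state, ((x0, x1), (y0, y1)) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return state
    return None
```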
- by adjusting these regions as necessary, each operation state can be weighted so that a specific operation state becomes easier or harder to select.
- FIG. 8 illustrates an example in which the user's operation state is obtained using a linear expression, but the user's operation state may also be obtained using a polynomial, as shown in FIG. 9.
- the vertical axis and the horizontal axis indicate angular velocity and time.
- in FIG. 9, N = 5, and the angular velocities indicated by points P21 to P25 are obtained as the angular velocities ω at each time.
- in this case, the operation determination unit 41 obtains a relational expression between the angular velocity ω and the time t shown as the following expression (2): ω = x + yt + zt² … (2).
- the curve L71 represents the estimation formula for the angular velocity ω at each time t obtained from the points P21 to P25, that is, the polynomial shown in expression (2).
- the recording unit 31 records, as a database, the ideal values of the coefficients x, y, z, and so on for each of the operation states a to d, more specifically, the range of values that each coefficient of the polynomial can take.
- the operation determination unit 41 determines the user's operation state according to which of the regions, determined by the ranges of values that each coefficient can take in the database for each operation state, contains the coordinates (position) in the space determined by the values of the polynomial coefficients x, y, z, and so on obtained from the N angular velocities ω.
- that is, the operation state of the user is obtained by the same processing as in the case of FIG. 8.
- for example, when the polynomial shown in expression (2) is a quadratic, it is determined which of the regions, determined by the ranges of values that the coefficients x to z can take in the database, contains the coordinates in the xyz space determined by the coefficients x to z obtained from the actually detected angular velocities ω.
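The quadratic variant only changes the fitting step: expression (2) is fit with three coefficients, and the region lookup happens in (x, y, z) space instead of the plane. A sketch using NumPy's polynomial fit (NumPy is an implementation convenience here, not something the disclosure specifies):

```python
import numpy as np

def fit_quadratic(times, omegas):
    """Least-squares fit of expression (2), omega = x + y*t + z*t**2.
    numpy.polyfit returns coefficients highest power first."""
    z, y, x = np.polyfit(times, omegas, 2)
    return x, y, z
```

The returned triple (x, y, z) would then be tested against per-state coefficient ranges exactly as in the linear case.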
- alternatively, N angular velocities ω may be recorded in the recording unit 31 as a database for each operation state, and the operation state may be determined based on the distances between the N angular velocities ω of each operation state and the N actually detected angular velocities ω.
- the angular velocities ω at N times are recorded as a database for each of the operation states a to d.
- the vertical axis and the horizontal axis indicate the angular velocity and time.
- in FIG. 10, N = 5, and the five angular velocities ω indicated by the points P31 to P35 are recorded as the database of one operation state; the broken line L81 connects the angular velocities at each time obtained from the points P31 to P35.
- the angular velocities indicated by the points P41 to P45 are obtained as the actually detected angular velocities ω at each time.
- the motion determination unit 41 calculates the following equation (3) to obtain, for each operation state, the distance d_x between that operation state's database and the actually detected angular velocities ω.
- in Equation (3), the sum of squares of the differences between the actually detected angular velocity ω at each corresponding time and the angular velocity recorded in the database is taken as the distance d_x between the detected angular velocities and the database.
- the operation determination unit 41 determines whether or not each obtained distance d_x is equal to or less than a predetermined threshold th. The operation determination unit 41 then takes, as the user's operation state at the current time, the operation state of the database having the smallest distance d_x among those distances d_x that are equal to or less than the threshold th. On the other hand, if no distance d_x is equal to or less than the threshold th, the operation state at the current time is regarded as unknown.
- a threshold th may be set for each database, and each threshold th may be adjusted as necessary to weight the operation states, so that a specific operation state becomes easier or harder to select.
- alternatively, the operation state of the database having the smallest distance d_x among the obtained distances d_x may simply be determined as the user's operation state at the current time.
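The distance-based matching of equation (3) can be sketched as follows; the databases and per-state thresholds th shown in the test are hypothetical values, not taken from the text.

```python
import math

def match_operation_state(detected, databases, thresholds):
    """Equation (3): for each operation state, the distance d_x is the sum
    of squared differences between the detected angular velocities and the
    database angular velocities at corresponding times. Return the state
    with the smallest d_x that does not exceed its threshold th, or None
    when every distance exceeds its threshold (state unknown)."""
    best_state, best_dist = None, math.inf
    for state, reference in databases.items():
        d_x = sum((w - r) ** 2 for w, r in zip(detected, reference))
        if d_x <= thresholds[state] and d_x < best_dist:
            best_state, best_dist = state, d_x
    return best_state
```

Setting a larger threshold for one state makes that state easier to select, which corresponds to the per-database weighting mentioned above.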
- when the user's operation state at the current time is determined in step S16, the process proceeds to step S17.
- in step S17, based on the signal from the operation unit 32, the control unit 25 determines whether or not a manual zoom operation on the imaging lens 21, that is, a manual operation by the user, has been performed.
- step S17 If it is determined in step S17 that no manual operation has been performed, the process proceeds to step S18.
- step S18 the zoom control unit 42 performs zoom control according to the operation state of the user at the current time.
- the zoom control unit 42 determines an appropriate zoom state based on the operation state determined in step S16, and determines an appropriate zoom magnification based on that zoom state and the zoom magnification at the current time. Further, the zoom control unit 42 controls the zoom unit 29 based on the determined zoom magnification to change the zoom magnification Z of the imaging lens 21. That is, the zoom control unit 42 moves the lenses constituting the imaging lens 21 in the optical axis direction so that the zoom magnification of the imaging lens 21 becomes the determined zoom magnification.
- step S18 If zoom control is performed in step S18, then the process proceeds to step S20.
- step S19 the zoom control unit 42 controls the zoom according to the user's manual operation. That is, the zoom control unit 42 controls the zoom unit 29 according to the signal supplied from the operation unit 32 to change the zoom magnification of the imaging lens 21.
- step S19 when zoom control is performed according to the user's operation, the process proceeds to step S20.
- step S20 the control unit 25 displays and records an image based on the image data supplied from the A / D conversion unit 24.
- the display control unit 43 of the control unit 25 supplies the image data to the display unit 30 to display the image, and supplies the image data to the recording unit 31 for recording. Note that only one of image display and recording may be performed.
- step S20 When the image is displayed and recorded in step S20, the process thereafter returns to step S13, and the above-described process is repeated until an instruction to end the photographing process is given.
- the imaging device 11 determines the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the operation state.
- a zoom state determination process for determining an appropriate zoom state based on the user's operation state is also performed.
- the zoom state determination process is called and executed when the process of step S18 in FIG. 7 is performed at each time.
- the zoom state determination process, which determines the zoom state at each time in step S18 of FIG. 7, will be described with reference to the flowchart of FIG. 11.
- This zoom state determination process is performed by the imaging device 11, and is completed at the same time as the photographing process is completed.
- step S51 the zoom control unit 42 sets the user operation state and zoom state to an initial state.
- for example, the initial operation state of the user is the operation state a.
- step S52 the zoom control unit 42 determines whether or not the user's operation state obtained by the determination in step S16 of FIG. 7 is the operation state b.
- step S52 If it is determined in step S52 that the operation state is not b, that is, if the panning operation is not started, the process returns to step S52, and the determination process is repeated until it is determined that the operation state is b.
- in step S53, a zoom magnification smaller than the current zoom magnification Z by a predetermined amount is set as the new zoom magnification, and zoom control is performed.
- step S54 the zoom control unit 42 determines whether or not the user's operation state obtained by the determination in step S16 of FIG. 7 is the operation state d.
- step S54 If it is determined in step S54 that the operating state is d, the process proceeds to step S58.
- if it is determined in step S54 that the operation state is not the operation state d, the zoom control unit 42 applies a weight to the operation state d in order to avoid a transition from the zoom state Z(4) to the zoom state Z(2), and the process proceeds to step S55.
- specifically, for example, the zoom control unit 42 adjusts the position and size of the region corresponding to the operation state d, such as the region R14, thereby making the operation state d less likely to be selected.
- alternatively, for example, the zoom control unit 42 reduces the threshold value th of the operation state d, making it harder to enter the operation state d.
- in step S55, the zoom control unit 42 determines whether or not the zoom magnification Z is the zoom magnification Z_min.
- if it is determined in step S55 that the zoom magnification is not Z_min, the process returns to step S54 and the above-described processing is repeated.
- when it is determined in step S55 that the zoom magnification is Z_min, in step S56 the zoom control unit 42 sets the zoom state to the zoom state Z(1).
- thereby, the zoom magnification of the imaging lens 21 is continuously maintained at the zoom magnification Z_min, that is, at the wide-angle end.
- step S57 the zoom control unit 42 determines whether or not the user's operation state obtained by the determination in step S16 of FIG. 7 is the operation state d.
- if it is determined in step S57 that the operation state is not the operation state d, that is, if the constant-speed panning operation continues, the process returns to step S57, and the determination process is repeated until it is determined that the operation state is the operation state d.
- step S57 if it is determined in step S57 that the operating state is d, or if it is determined in step S54 that the operating state is d, the process of step S58 is performed.
- in step S58, the zoom control unit 42 sets the zoom state to the zoom state Z(2). Thereby, for example, zoom control is performed so that the subject is zoomed in on as the panning speed decreases.
- step S59 the zoom control unit 42 determines whether or not the user's operation state obtained by the determination in step S16 of FIG. 7 is the operation state b.
- step S59 If it is determined in step S59 that the operating state is b, the process returns to step S53, and the above-described process is repeated.
- alternatively, the zoom control unit 42 may determine the zoom magnification so that the current zoom magnification is maintained, that is, kept constant.
- if it is determined in step S59 that the operation state is not the operation state b, the zoom control unit 42 applies a weight to the operation state b in order to avoid a transition from the zoom state Z(2) to the zoom state Z(4), and the process proceeds to step S60.
- specifically, for example, the zoom control unit 42 adjusts the position and size of the region corresponding to the operation state b, such as the region R12, making the operation state b less likely to be selected. Further, for example, when the operation state is determined by the method described with reference to FIG. 10, the zoom control unit 42 reduces the threshold value th of the operation state b to make the operation state b less likely.
- in step S60, the zoom control unit 42 determines whether or not the zoom magnification Z is the zoom magnification Z_max.
- when it is determined in step S60 that the zoom magnification is Z_max, in step S61 the zoom control unit 42 sets the zoom state to the zoom state Z(3). Thereby, for example, while the user stops the panning operation, the zoom magnification of the imaging lens 21 is continuously maintained at the zoom magnification Z_max, that is, at the telephoto end.
- step S61 When the process of step S61 is performed, the process thereafter returns to step S52, and the above-described process is repeated until the zoom state determination process ends.
- the imaging device 11 changes the zoom state according to the operation state of the user.
- more appropriate zoom control can be performed.
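The zoom state transitions of the determination process above can be sketched as a small state machine. This is an illustrative sketch only: the magnification limits Z_MIN and Z_MAX, the per-iteration step size, and the state names are assumptions, and the weighting that discourages transitions (steps S54 and S59) is omitted for brevity.

```python
# Hypothetical magnification limits and per-iteration step size.
Z_MIN, Z_MAX, STEP = 1.0, 4.0, 0.5

def next_zoom(zoom_state, zoom, op_state):
    """One iteration of the zoom state determination: operation state b
    (pan started) drives zoom-out toward Z_MIN via state Z(4), operation
    state d (pan decelerating) drives zoom-in toward Z_MAX via state Z(2);
    Z(1) and Z(3) hold the wide and telephoto ends."""
    if op_state == "b":
        zoom_state = "Z4"          # start zooming out
    elif op_state == "d":
        zoom_state = "Z2"          # start zooming in
    if zoom_state == "Z4":
        zoom = max(Z_MIN, zoom - STEP)
        if zoom == Z_MIN:
            zoom_state = "Z1"      # hold at the wide end
    elif zoom_state == "Z2":
        zoom = min(Z_MAX, zoom + STEP)
        if zoom == Z_MAX:
            zoom_state = "Z3"      # hold at the telephoto end
    return zoom_state, zoom
```

Calling this once per determination cycle walks the magnification toward the wide end during a pan and back toward the telephoto end as the pan decelerates.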
- since the processing from step S91 to step S94 is the same as the processing from step S11 to step S14 in FIG. 7, description thereof is omitted.
- in step S95, the control unit 25 controls the memory 33 to delete, from the angular velocities ω at the respective times stored in the memory 33, those preceding the two most recent.
- as a result, the memory 33 stores the angular velocities ω at the two most recent times.
- in step S96, the motion determination unit 41 calculates the angular acceleration based on the two most recent angular velocities ω stored in the memory 33, and estimates from the obtained angular acceleration the angular velocity at the next time, that is, at a future time.
- the angular acceleration is obtained by taking the difference between the two angular velocities. Since the angular acceleration is the temporal change in the angular velocity, once the angular acceleration is known, the angular velocity at a future time can be predicted.
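The prediction step can be written in a few lines; this sketch assumes a fixed sampling interval dt between the two stored angular velocities.

```python
def predict_next_omega(omega_prev, omega_curr, dt=1.0):
    """Estimate the angular velocity one step ahead: the angular
    acceleration is the difference of the two most recent samples per
    dt, and linear extrapolation gives the future angular velocity."""
    alpha = (omega_curr - omega_prev) / dt  # angular acceleration
    return omega_curr + alpha * dt          # predicted next sample
```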
- in step S97, the motion determination unit 41 compares the two most recent angular velocities ω stored in the memory 33 and the angular velocity ω at the next time estimated in step S96 with the database recorded in advance in the recording unit 31, thereby determining the user's operation state.
- step S97 differs from step S16 in FIG. 7 in that the operation state is determined using the angular velocity obtained by estimation in addition to the actually detected angular velocities; otherwise, the user's operation state is determined in step S97 by similar processing.
- thereafter, the processing from step S98 to step S101 is performed and the photographing process ends. Since these processes are the same as the processing from step S17 to step S20 in FIG. 7, their description is omitted.
- as described above, the imaging device 11 estimates the angular velocity at the next time using the two detected angular velocities, determines the user's operation state using both the estimated and the actually detected angular velocities, and performs zoom control according to the operation state.
- thereby, the start or stop of a panning operation can be detected more quickly, and the time lag between the user's operation and the zoom control appropriate for that operation can be further reduced.
- the recording capacity of the memory 33 can be reduced by reducing the number of angular velocities stored in the memory 33.
- although the imaging device 11 described above is monocular, the imaging device may be binocular.
- the imaging device is configured as shown in FIG. 13, for example.
- portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted as appropriate.
- the imaging device 71 includes the imaging lens 21 through the angular velocity sensor 34, and further includes an imaging lens 81, an imaging element 82, an analog signal processing unit 83, an A/D conversion unit 84, a shutter control unit 85, a focus unit 86, an iris unit 87, and a zoom unit 88.
- the imaging device 71 has a configuration in which an imaging lens 81 to a zoom unit 88 are further provided with respect to the configuration of the imaging device 11.
- the imaging lens 81 through the A/D conversion unit 84 correspond to the imaging lens 21 through the A/D conversion unit 24, and capture an image having parallax with respect to the image obtained by the imaging lens 21 through the A/D conversion unit 24.
- the obtained image data is supplied to the control unit 25.
- the shutter control unit 85 to zoom unit 88 correspond to the shutter control unit 26 to zoom unit 29, and perform shutter control, focus control, iris drive, and zoom of the imaging lens 81.
- the display unit 30 has, for example, a display area that presents an image to the user's right eye and a display area that presents an image to the user's left eye.
- since the process of step S131 is the same as the process of step S11 in FIG. 7, its description is omitted.
- step S132 the imaging device 71 starts capturing an image.
- the imaging lens 21 through the A/D conversion unit 24 sequentially photograph the subject to generate image data and continuously supply it to the control unit 25, and the imaging lens 81 through the A/D conversion unit 84 likewise continuously photograph the subject, generate image data, and supply it to the control unit 25. As a result, image data of two images having parallax is obtained.
- when image capturing is started, the processing from step S133 to step S140 is then performed and the photographing process ends. Since these processes are the same as the processing from step S13 to step S20 in FIG. 7, their description is omitted. In step S140, however, two parallax images are displayed on the display unit 30 and recorded in the recording unit 31.
- the imaging device 71 determines the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the operation state. Thereby, a more detailed operation state can be specified, and more appropriate zoom control according to the user's operation state can be performed.
- the imaging device 71 may also determine the user's operation state by estimating the angular velocity at the next time. In that case, the imaging device 71 performs the photographing process illustrated in FIG. 15. Hereinafter, the photographing process by the imaging device 71 will be described with reference to the flowchart of FIG. 15.
- since the process of step S171 is the same as the process of step S91 in FIG. 12, its description is omitted.
- step S172 the imaging device 71 starts capturing an image.
- the imaging lens 21 through the A/D conversion unit 24 sequentially photograph the subject to generate image data and continuously supply it to the control unit 25, and the imaging lens 81 through the A/D conversion unit 84 likewise continuously photograph the subject, generate image data, and supply it to the control unit 25. As a result, image data of two images having parallax is obtained.
- when image capturing is started, the processing from step S173 to step S181 is then performed and the photographing process ends. Since these processes are the same as the processing from step S93 to step S101 in FIG. 12, their description is omitted. In step S181, however, two parallax images are displayed on the display unit 30 and recorded in the recording unit 31.
- as described above, the imaging device 71 estimates the angular velocity at the next time using the two detected angular velocities, determines the user's operation state using both the estimated and the actually detected angular velocities, and performs zoom control according to the operation state. Thereby, the start or stop of a panning operation can be detected more quickly, and the time lag between the user's operation and the appropriate zoom control can be reduced.
- the optical zoom for changing the angle of view of the image by changing the zoom magnification of the imaging lens 21 has been described as an example.
- the angle of view of the image may be changed by digital zoom.
- the imaging device 11 is configured as shown in FIG. 16, for example.
- portions corresponding to those in FIG. 1 are denoted by the same reference numerals, and description thereof is omitted as appropriate.
- the imaging device 11 in FIG. 16 includes an imaging lens 21 through an iris unit 28, and a display unit 30 through an angular velocity sensor 34. That is, the imaging device 11 in FIG. 16 is different from the imaging device 11 in FIG. 1 in that the zoom unit 29 is not provided, and the other configurations are the same. In addition, in the imaging device 11 of FIG. 16, a zoom processing unit 111 is further provided in the control unit 25.
- the zoom processing unit 111 performs zooming of the image by cutting out a partial region of the image based on the image data supplied from the A / D conversion unit 24 under the control of the zoom control unit 42.
- since the processing from step S211 to step S217 is the same as the processing from step S11 to step S17 in FIG. 7, its description is omitted.
- step S217 If it is determined in step S217 that no manual operation has been performed, in step S218, the zoom control unit 42 performs zoom control according to the user's operation state at the current time.
- the zoom control unit 42 determines an appropriate zoom state by the zoom state determination process of FIG. 11 based on the operation state determined in the process of step S216, and determines a new zoom magnification for the next time based on that zoom state and the current zoom magnification.
- the zoom processing unit 111 cuts out, from the image based on the image data supplied from the A/D conversion unit 24, the region determined by the new zoom magnification, and takes it as the final captured image, that is, the image subjected to the zoom processing. For example, a region whose size is determined by the new zoom magnification, centered on the center position of the captured image, is cut out to obtain the zoomed image, that is, an image as if captured at the designated zoom magnification.
- hereinafter, an image generated by the clipping performed by the zoom processing unit 111, that is, an image subjected to the zoom processing, is also referred to as a zoom image.
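The cut-out that realizes the digital zoom can be sketched as a centered crop. In this illustrative version the image is a nested list of pixel values, and the upscaling of the crop back to the display resolution is omitted.

```python
def digital_zoom(image, magnification):
    """Cut out a centered region whose size is the full frame divided by
    the magnification; `image` is a nested list of pixel values, and the
    rescaling of the crop back to full resolution is omitted."""
    h, w = len(image), len(image[0])
    ch, cw = int(h / magnification), int(w / magnification)
    top, left = (h - ch) // 2, (w - cw) // 2
    return [row[left:left + cw] for row in image[top:top + ch]]
```

At magnification 1 the crop is the full frame; at magnification 2 the central quarter of the frame becomes the zoom image.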
- in this way, zoom control can be performed by determining the zoom state according to the user's operation state. When zoom control is performed in step S218, the process thereafter proceeds to step S220.
- step S219 the zoom control unit 42 controls the zoom according to the user's manual operation, that is, according to the signal supplied from the operation unit 32.
- the zoom processing unit 111 cuts out, under the control of the zoom control unit 42, the region determined by the zoom magnification designated by the user from the image based on the image data supplied from the A/D conversion unit 24, and takes it as a zoom image captured at the designated zoom magnification.
- step S219 If zoom control is performed according to the user's operation in step S219, then the process proceeds to step S220.
- step S218 When zoom control is performed in step S218 or step S219, the display control unit 43 generates a display image in step S220.
- specifically, the display control unit 43 generates, as the display image DP11, a single image composed of the zoom image EP11 obtained by the zoom processing in step S218 or step S219 and the original image LP11 from which the zoom image EP11 was obtained.
- that is, the display image DP11 is an image of a screen on which the two images, the zoom image EP11 and the image LP11, are arranged and displayed simultaneously.
- the display image DP11 is an image in which the zoom image EP11 is arranged on the upper side in the figure and the image LP11 is arranged on the lower side in the figure.
- the image LP11 is the image before the zoom processing output from the A/D conversion unit 24, that is, the original image from which the zoom image EP11 was cut out.
- a frame W11 is displayed on the region of the image LP11 that corresponds to the zoom image EP11, so that it can be seen which region of the wider-angle image LP11 has been cut out to form the zoom image EP11. This allows the user to easily grasp which part of the entire shooting target is being viewed.
- step S220 when a display image is generated in step S220, the process proceeds to step S221.
- step S221 the control unit 25 displays and records an image based on the image data of the display image and the image data of the zoom image.
- the display control unit 43 of the control unit 25 supplies the image data of the display image to the display unit 30 to display the image, and the control unit 25 supplies the image data of the zoom image to the recording unit 31 for recording.
- step S221 When the image is displayed and recorded in step S221, the process returns to step S213, and the above-described process is repeated until the end of the shooting process is instructed.
- the imaging device 11 determines the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the operation state. Thereby, a more detailed operation state can be specified and more appropriate zoom control can be performed.
- the imaging device 11 displays the image before the zoom process and the image after the zoom process side by side. Thereby, an image can be displayed more effectively and easily.
- the imaging device 11 shown in FIG. 16 may also perform the photographing process described below. This photographing process, performed by the imaging device 11 having the configuration shown in FIG. 16, is described with reference to the corresponding flowchart.
- since the processing from step S251 to step S259 is the same as the processing from step S211 to step S219 described above, its description is omitted.
- step S260 the display control unit 43 generates a display image.
- the display control unit 43 generates, as the display image DP21, a single image composed of the zoom image EP21 obtained by the zoom processing in step S258 or step S259 and an image WP11 obtained by combining a plurality of pre-zoom images obtained by past shooting.
- the display image DP21 is an image in which the zoom image EP21 is arranged on the upper side in the figure and the image WP11 is arranged on the lower side in the figure.
- the image WP11 is an image obtained by arranging and combining a plurality of pre-zoom images captured in the past, including the original image from which the zoom image EP21 was obtained (that is, a plurality of images output from the A/D conversion unit 24), so that the same subject overlaps.
- dotted line frames LP21-1 to LP21-4 on the image WP11 represent images used to generate the image WP11. Note that these frames LP21-1 to LP21-4 are not actually displayed on the image WP11.
- a frame W21 is displayed on the region of the image WP11 that corresponds to the zoom image EP21, so that it can be seen which region of the wider shooting range is the shooting range of the zoom image EP21. This allows the user to easily grasp which part of the entire shooting target is being viewed.
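The construction of an image like WP11 can be sketched as pasting frames into one wide canvas. This sketch assumes the horizontal offset of each frame is already known (for example, derived from the integrated pan angle); a real implementation would align the frames so that the same subject overlaps.

```python
def composite_panorama(frames, offsets):
    """Paste each frame (a nested list of rows, all of equal height) into
    one wide canvas at its known horizontal offset, later frames
    overwriting earlier ones where they overlap."""
    height = len(frames[0])
    width = max(off + len(f[0]) for f, off in zip(frames, offsets))
    canvas = [[0] * width for _ in range(height)]
    for frame, off in zip(frames, offsets):
        for r in range(height):
            for c, px in enumerate(frame[r]):
                canvas[r][off + c] = px
    return canvas
```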
- step S261 when a display image is generated in step S260, the process of step S261 is performed and the photographing process is terminated.
- since the process of step S261 is the same as the process of step S221 described above, its description is omitted.
- step S261 an image obtained by combining a plurality of images before zoom processing may be recorded together with the zoom image.
- the imaging device 11 determines the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the operation state. Thereby, a more detailed operation state can be specified and more appropriate zoom control can be performed.
- the imaging device 11 displays an image having a wider shooting range than the image before zoom processing and the zoom image side by side. Thereby, an image can be displayed more effectively and easily.
- the imaging device 11 when processing that combines imaging processing and object recognition processing is performed, for example, the imaging device 11 is configured as shown in FIG. In FIG. 21, parts corresponding to those in FIG. 16 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
- the imaging device 11 in FIG. 21 is configured such that an object detection unit 141 is further provided in the control unit 25 of the imaging device 11 shown in FIG. 16.
- the object detection unit 141 performs object recognition processing on the image supplied from the A / D conversion unit 24 and detects an object from the image.
- since the processing from step S291 to step S297 is the same as the processing from step S211 to step S217 described above, its description is omitted.
- step S298 the zoom control unit 42 performs zoom control in accordance with the operation state of the user at the current time.
- the zoom control unit 42 determines an appropriate zoom state by the zoom state determination process of FIG. 11 based on the operation state determined in step S296, and determines a new zoom magnification based on that zoom state and the current zoom magnification.
- in the shooting process of FIG. 22, in addition to the zoom state determination process described with reference to FIG. 11, object recognition is performed, and when an object is recognized, processing is performed so that the zoom state is maintained as much as possible.
- the zoom processing unit 111 cuts out the region determined by the new zoom magnification from the image based on the image data supplied from the A/D conversion unit 24, and takes the resulting image as the zoom image captured at the new zoom magnification.
- step S298 If zoom control is performed in step S298, then the process proceeds to step S300.
- step S299 the zoom control unit 42 controls the zoom according to the user's manual operation, that is, according to the signal supplied from the operation unit 32.
- the zoom processing unit 111 cuts out an area determined by the zoom magnification specified by the user from the image based on the image data supplied from the A / D conversion unit 24 under the control of the zoom control unit 42, and is specified by the user. A zoom image taken at a zoom magnification is used.
- step S299 when zoom control is performed according to the user's operation, the process proceeds to step S300.
- step S300 the control unit 25 displays and records an image based on the image data of the zoom image zoomed by the zoom processing unit 111.
- the display control unit 43 of the control unit 25 supplies the image data to the display unit 30 to display the zoom image, and the control unit 25 supplies the image data to the recording unit 31 for recording.
- step S300 When the image is displayed and recorded in step S300, the process returns to step S293, and the above-described process is repeated until the end of the shooting process is instructed.
- the imaging device 11 determines the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the operation state. Thereby, a more detailed operation state can be specified and more appropriate zoom control can be performed.
- <Description of adjustment process> The imaging device 11 of FIG. 21 performs object recognition together with the zoom state determination process described with reference to FIG. 11, and also performs an adjustment process in which, when an object is detected near the center of the image, the zoom state is maintained as much as possible.
- the adjustment process performed by the imaging device 11 will be described with reference to the flowchart of FIG.
- step S331 the object detection unit 141 performs object recognition processing on the image supplied from the A / D conversion unit 24, and detects an object from the image.
- the object detection unit 141 detects a predetermined object from the image by comparing a feature quantity of an arbitrary object prepared in advance with a feature quantity extracted from the image. Further, for example, when the object detection unit 141 performs face recognition, a face of a person registered in advance as an object to be detected or a face of an arbitrary person may be detected from the image.
- step S332 the zoom control unit 42 determines whether or not an object is detected from the vicinity of the center of the image captured by the image sensor 22 in the process of step S331. For example, when an object is detected from an area of a predetermined size including the center of the image, it is determined that the object is detected in the vicinity of the image center.
- step S332 If it is determined in step S332 that no object has been detected, the process returns to step S331, and the above-described process is repeated.
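The check of step S332 can be sketched as a rectangle-overlap test; the size of the central region (30% of each dimension here) is an assumed placeholder for the "predetermined size" in the text.

```python
def object_near_center(box, image_w, image_h, frac=0.3):
    """Return True when the object's bounding box (x, y, w, h) overlaps
    a central region covering `frac` of each image dimension; the 30%
    figure is an assumed stand-in for the 'predetermined size'."""
    cx0 = image_w * (1 - frac) / 2
    cy0 = image_h * (1 - frac) / 2
    cx1, cy1 = image_w - cx0, image_h - cy0
    x, y, w, h = box
    return x < cx1 and x + w > cx0 and y < cy1 and y + h > cy0
```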
- in step S333, the zoom control unit 42 makes the operation state b less likely to be selected as the user's operation state in the process of step S296 in FIG. 22; in other words, it becomes harder to determine in step S52 and step S59 of FIG. 11 that the operation state is the operation state b. The zoom control unit 42 also makes the operation state d more likely to be selected as the user's operation state in the process of step S296 in FIG. 22; in other words, it becomes easier to determine in step S54 and step S57 of FIG. 11 that the operation state is the operation state d.
- specifically, for example, when the operation state is determined by the method described with reference to FIGS. 8 and 9, the zoom control unit 42 adjusts the position and size of the region corresponding to the operation state b, such as the region R12, so that the operation state b is less likely to be selected. Further, for example, when the operation state is determined by the method described with reference to FIG. 10, the zoom control unit 42 reduces the threshold value th of the operation state b to make the operation state b less likely. Similarly, when the operation state is determined by the method described with reference to FIGS. 8 and 9, the zoom control unit 42 adjusts the position and size of the region corresponding to the operation state d, such as the region R14, so that the operation state d is more likely to be selected. For example, when the operation state is determined by the method described with reference to FIG. 10, the zoom control unit 42 increases the threshold value th of the operation state d so that the operation state d is more easily selected.
- the zoom control unit 42 performs zoom control so that the state in which the object is zoomed up (enlarged display) is maintained as much as possible.
- since the transition to the operation state d is facilitated, the subject is more easily zoomed in on. For example, when an object is detected near the center of the image and the panning speed decreases, the object is zoomed in on.
- in step S334, the zoom control unit 42 determines the zoom magnification Z_max according to the size of the object detected from the image.
- that is, the zoom control unit 42 sets the value of the zoom magnification Z_max to a value determined by the size of the object detected from the image.
- specifically, the value of the zoom magnification Z_max is determined so that the detected object does not protrude from the image itself or from a frame displayed on the image.
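The determination of Z_max from the object size can be sketched as follows; the 10% margin is an assumption, standing in for the requirement that the object not protrude from the image or the displayed frame.

```python
def zoom_max_for_object(obj_w, obj_h, frame_w, frame_h, margin=0.9):
    """Choose Z_max so that the detected object, enlarged by the zoom,
    still fits inside the frame; the 10% margin is an assumption."""
    return margin * min(frame_w / obj_w, frame_h / obj_h)
```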
- as described above, the imaging device 11 detects an object from the image and adjusts, according to the detection result, how easily each operation state is selected and the zoom magnification Z_max. Thereby, when the user finds a subject of interest, that is, when there is an object near the center of the image, the zoomed-up state of the object can be maintained as much as possible.
- The processing up to step S367 is the same as the processing from step S291 in FIG. 22, so a description thereof is omitted.
- In step S368, the zoom control unit 42 performs zoom control according to the operation state of the user at the current time.
- That is, the zoom control unit 42 determines an appropriate zoom state by the zoom state determination process of FIG. 11 based on the operation state determined in step S366, and determines a new zoom magnification based on that zoom state and the current zoom magnification.
- When the new zoom magnification is determined, the zoom processing unit 111 cuts out, from the image based on the image data supplied from the A/D conversion unit 24, a region whose size is determined by the new zoom magnification and whose position is determined according to the object detection (recognition) result. The zoom processing unit 111 then treats the image of the cut-out region as a zoom image captured at the new zoom magnification.
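As a sketch of that cutting-out step (pure digital zoom on a pixel array; the clamping policy at the frame edges is an assumption, since the patent does not specify it):

```python
import numpy as np

def crop_zoom(image, center_xy, zoom):
    """Cut out a region whose size is the full frame divided by the zoom
    magnification, centred on the detected object, clamped to the frame."""
    h, w = image.shape[:2]
    cw, ch = int(round(w / zoom)), int(round(h / zoom))
    # Clamp the centre so the crop window stays inside the frame.
    cx = min(max(center_xy[0], cw // 2), w - cw // 2)
    cy = min(max(center_xy[1], ch // 2), h - ch // 2)
    x0, y0 = cx - cw // 2, cy - ch // 2
    return image[y0:y0 + ch, x0:x0 + cw]
```

Upscaling the cropped region back to the display resolution would complete the digital zoom; that resampling step is omitted here.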
- If zoom control is performed in step S368, the process then proceeds to step S370.
- In step S369, the zoom control unit 42 controls the zoom according to the user's manual operation, that is, according to the signal supplied from the operation unit 32.
- When zoom control is performed in step S369 according to the user's operation, the process proceeds to step S370.
- When zoom control is performed in step S368 or step S369, the process of step S370 is performed thereafter, and the photographing process ends. The process of step S370 is the same as the process of step S300 in FIG. 22, so a description thereof is omitted.
- the imaging device 11 determines the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the operation state. Thereby, a more detailed operation state can be specified and more appropriate zoom control can be performed. In addition, by cutting out a region corresponding to the result of object recognition to obtain a zoomed image, it is possible to obtain a better-looking image including a subject that the user will be paying attention to.
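The zoom control driven by the determined operation state can be sketched roughly as follows. The state names, step factor, and magnification limits are illustrative assumptions; the document only fixes the qualitative behaviour (zoom out while the angular velocity is increasing, zoom in while it is decreasing, hold the magnification otherwise):

```python
def next_zoom(state, zoom, z_min=1.0, z_max=8.0, step=1.1):
    """One zoom-control update per determined operation state."""
    if state == "accelerating":      # angular velocity increasing: starting a pan
        return max(z_min, zoom / step)
    if state == "decelerating":      # angular velocity decreasing: settling on a subject
        return min(z_max, zoom * step)
    return zoom                      # transition / steady: keep the magnification
```

Calling this once per sampling interval yields the gradual zoom-out during a fast pan and gradual zoom-in as the camera settles that the embodiments describe.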
- In step S401, the object detection unit 141 performs object recognition processing on the image supplied from the A/D conversion unit 24 and detects an object from the image.
- That is, in step S401, processing similar to that in step S331 in FIG. 23 is performed.
- In step S402, the zoom control unit 42 determines whether an object has been detected from the image.
- If an object has been detected, in step S403 the zoom control unit 42 determines the position of the zoom center based on the detection result. For example, the zoom control unit 42 sets the position of the detected object on the image as the center position of the region cut out by the zoom process. As a result, a region centered on the detected object is cut out as the zoom image. The cutout center position determined in this way is used in the process of step S368 in FIG. 24.
- When the process of step S403 is performed, the process returns to step S401, and the above-described processing is repeated.
- On the other hand, if no object has been detected, in step S404 the zoom control unit 42 sets a predetermined position as the zoom center. For example, the zoom control unit 42 sets the center position of the image as the center position of the region cut out by the zoom process. As a result, a region near the center of the captured image is cut out as the zoom image.
- When the process of step S404 is performed, the process returns to step S401, and the above-described processing is repeated.
- When the zoom state determination process in FIG. 11 ends, the zoom position determination process also ends.
- the imaging device 11 detects an object from an image, and when an object is detected, adjusts the position of an area on the image cut out by zoom processing according to the position of the object. Thereby, it is possible to obtain a better-looking image with the subject positioned at the center.
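The zoom-centre choice made in steps S403 and S404 amounts to a simple fallback, sketched below (function and argument names are illustrative):

```python
def zoom_center(detection, frame_w, frame_h):
    """Centre of the region cut out by the zoom process: the detected
    object's position when one was found (step S403), otherwise the
    centre of the frame (step S404)."""
    if detection is not None:
        return detection          # (x, y) of the detected object
    return (frame_w // 2, frame_h // 2)
```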
- <Ninth embodiment> <Configuration example of imaging device>
- In the eighth embodiment, the example in which the zoom process is performed so that the subject is positioned at the center of the image when digital zoom processing is performed has been described. However, the subject may also be positioned at the center of the image when optical zoom processing is performed.
- the imaging device 11 is configured as shown in FIG. 26, for example.
- the same reference numerals are given to the portions corresponding to those in FIG. 1 or FIG. 21, and description thereof will be omitted as appropriate.
- A correction unit 171 is further provided in the imaging device 11 illustrated in FIG. 1, and an object detection unit 141 and a correction control unit 172 are further provided in the control unit 25.
- The correction unit 171 adjusts the shooting direction of the imaging device 11 by tilting some or all of the lenses constituting the imaging lens 21 under the control of the correction control unit 172.
- the lens tilted by the correction unit 171 is a camera shake correction lens or the like.
- the description will be continued assuming that the imaging lens 21 itself is tilted with respect to the imaging element 22.
- the object detection unit 141 performs object recognition processing on the image supplied from the A / D conversion unit 24 and detects an object from the image.
- the correction control unit 172 obtains a tilt angle at which the imaging lens 21 should be tilted according to the object recognition result by the object detection unit 141, and controls the correction unit 171 based on the tilt angle.
- The processing up to step S438 is the same as the processing from step S11 in FIG. 7, so a description thereof is omitted.
- However, in the photographing process, object recognition is performed together with the zoom state determination process described with reference to FIG. 11, and processing for obtaining the tilt angle of the imaging lens 21 so that the object fits within the image according to the recognition result is also performed.
- In step S439, the correction unit 171 tilts the imaging lens 21 under the control of the correction control unit 172.
- That is, the correction control unit 172 obtains the tilt angle of the imaging lens 21 so that the detected object is positioned at the center of the image, as described later. The correction control unit 172 then controls the driving of the correction unit 171 so that the angle formed by the optical axis of the imaging lens 21 and the normal to the light-receiving surface of the imaging element 22 becomes the obtained tilt angle, and the correction unit 171 tilts the imaging lens 21 accordingly.
- As a result, the object detected from the image is positioned at the center of the image, and a better-looking image can be obtained.
- When no object has been detected, the process of tilting the imaging lens 21 is not performed.
- If the process of step S439 is performed, the process then proceeds to step S441.
- In step S440, the zoom control unit 42 controls the zoom unit 29 according to the user's manual operation to change the zoom magnification of the imaging lens 21.
- When zoom control is performed in step S440 according to the user's operation, the process proceeds to step S441.
- When the process of step S439 or step S440 is performed, the process of step S441 is performed thereafter, and the photographing process ends. The process of step S441 is the same as the process of step S20 in FIG. 7, so a description thereof is omitted.
- the imaging device 11 specifies the user's operation state based on the angular velocities at a plurality of times, and performs zoom control according to the specified operation state. Thereby, a more detailed operation state can be specified and more appropriate zoom control can be performed. In addition, by tilting the imaging lens 21 in accordance with the result of object recognition, it is possible to obtain a better-looking image including the subject that the user will be paying attention to.
- <Description of tilt angle determination process> The imaging device 11 shown in FIG. 26 performs object recognition together with the zoom state determination process described with reference to FIG. 11, and also performs a tilt angle determination process, which is a process for obtaining the tilt angle of the imaging lens 21 so that the object fits within the image according to the recognition result.
- Hereinafter, the tilt angle determination process performed by the imaging device 11 will be described with reference to the flowchart of FIG. 28.
- In step S471, the object detection unit 141 performs object recognition processing on the image supplied from the A/D conversion unit 24 and detects an object from the image.
- That is, in step S471, processing similar to that in step S331 in FIG. 23 is performed.
- In step S472, the correction control unit 172 determines whether an object has been detected from the image.
- If it is determined in step S472 that no object has been detected, the process returns to step S471, and the above-described processing is repeated. That is, when no object is detected, the tilt angle of the imaging lens 21 is set to 0, and the imaging lens 21 is not tilted.
- On the other hand, if an object has been detected, in step S473 the correction control unit 172 obtains the tilt angle of the imaging lens 21 based on the position of the object on the image so that the detected object is positioned at the center of the image.
- The correction control unit 172 then controls the driving of the correction unit 171 based on the obtained tilt angle.
- When the process of step S473 is performed, the process returns to step S471, and the above-described processing is repeated. When the zoom state determination process in FIG. 11 ends, the tilt angle determination process also ends.
- the imaging device 11 detects an object from an image, and when an object is detected, obtains an inclination angle of the imaging lens 21 according to the position of the object. This makes it possible to obtain a better-looking image with the subject located at the center.
- In the above, the example in which the position of the object on the image is adjusted by tilting the imaging lens 21 has been described. However, the position of the object on the image may also be adjusted by tilting the imaging lens 21 and, in addition, adjusting the cutout position of the region including the object on the image.
- The series of processes described above can be executed by hardware or can be executed by software.
- When the series of processes is executed by software, a program constituting the software is installed in a computer.
- Here, the computer includes, for example, a computer incorporated in dedicated hardware, and a general-purpose computer capable of executing various functions by installing various programs.
- FIG. 29 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processing by a program.
- the CPU 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504.
- An input / output interface 505 is further connected to the bus 504.
- An input unit 506, an output unit 507, a recording unit 508, a communication unit 509, and a drive 510 are connected to the input / output interface 505.
- the input unit 506 includes a keyboard, a mouse, a microphone, an image sensor, and the like.
- the output unit 507 includes a display, a speaker, and the like.
- the recording unit 508 includes a hard disk, a nonvolatile memory, and the like.
- the communication unit 509 includes a network interface or the like.
- the drive 510 drives a removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
- In the computer configured as described above, the CPU 501 loads the program recorded in the recording unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes it, whereby the above-described series of processes is performed.
- the program executed by the computer (CPU 501) can be provided by being recorded in, for example, a removable medium 511 as a package medium or the like.
- the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
- the program can be installed in the recording unit 508 via the input / output interface 505 by attaching the removable medium 511 to the drive 510. Further, the program can be received by the communication unit 509 via a wired or wireless transmission medium and installed in the recording unit 508. In addition, the program can be installed in advance in the ROM 502 or the recording unit 508.
- The program executed by the computer may be a program in which processing is performed in time series in the order described in this specification, or a program in which processing is performed in parallel or at necessary timing, such as when a call is made.
- the present technology can take a cloud computing configuration in which one function is shared by a plurality of devices via a network and is jointly processed.
- each step described in the above flowchart can be executed by one device or can be shared by a plurality of devices.
- the plurality of processes included in the one step can be executed by being shared by a plurality of apparatuses in addition to being executed by one apparatus.
- the present technology can be configured as follows.
- (1) An imaging device comprising: an angular velocity detection unit that detects an angular velocity of its own movement; an operation determination unit that determines an operation state of a user based on the angular velocities at a plurality of different times; and a zoom control unit that performs zoom control based on the operation state.
- (2) The imaging device according to (1), wherein the zoom control unit performs zoom control so that the zoom magnification decreases in the operation state in which the angular velocity increases with time.
- (3) The imaging device according to (1) or (2), wherein the zoom control unit performs zoom control so that the zoom magnification increases in the operation state in which the angular velocity decreases with time.
- (4) The imaging device according to any one of (1) to (3), wherein the zoom control unit performs zoom control so that the zoom magnification remains constant when the operation state in which the angular velocity decreases with time changes to the operation state in which the angular velocity increases with time.
- (5) The imaging device according to any one of (1) to (4), wherein, when specific zoom control is being performed, the zoom control unit makes it difficult for the operation determination unit to determine that the operation state is the predetermined operation state.
- (6) The imaging device according to any one of (1) to (5), wherein the operation determination unit obtains a relational expression between time and the angular velocity based on the angular velocities at the plurality of different times, and determines the operation state by comparing the coefficients of the obtained relational expression with ranges of values that the coefficients of the relational expression can take, predetermined for each of a plurality of operation states.
- (7) The imaging device according to any one of (1) to (5), wherein the operation determination unit determines the operation state by obtaining distances between the angular velocities at the plurality of different times and angular velocities at a plurality of times predetermined for each of a plurality of operation states.
- (8) The imaging device according to any one of (1) to (5), wherein the operation determination unit estimates the angular velocity at a future time based on the angular velocities at the plurality of different times, and determines the operation state based on the angular velocity obtained by the estimation and the angular velocities at the plurality of different times.
- (9) The imaging device according to any one of (1) to (8), further including a zoom processing unit that generates a zoom image by cutting out a partial region of a captured image in accordance with control of the zoom control unit.
- (10) The imaging device according to (9), further including a display control unit that displays the captured image and the zoom image side by side.
- (11) The imaging device according to (10), wherein the display control unit displays a frame in the region of the image from which the zoom image was cut out.
- (12) The imaging device according to (9), further including a display control unit that displays, side by side, the zoom image and an image obtained by combining the images captured at a plurality of different times.
- (13) The imaging device according to (12), wherein the display control unit displays a frame in the region of the combined image from which the zoom image was cut out.
- (14) The imaging device according to any one of (1) to (9), further including an object detection unit that detects an object from a captured image, wherein the zoom control unit performs zoom control so that the state in which the object is displayed in an enlarged manner is maintained when the object is detected near the center of the image.
- (15) The imaging device according to (9), further including an object detection unit that detects an object from the captured image, wherein the zoom control unit determines, based on the detection result of the object, the position at which the zoom image is cut out from the image.
- (16) The imaging device according to any one of (1) to (8), further including: an object detection unit that detects an object from a captured image; and a correction unit that tilts a lens that collects light incident from a subject according to the detection result of the object.
- (17) An imaging method including the steps of: detecting an angular velocity of movement of an imaging device; determining an operation state of a user operating the imaging device based on the angular velocities at a plurality of different times; and performing zoom control based on the operation state.
- (18) A program for causing a computer to execute processing including the steps of: detecting an angular velocity of movement of an imaging device; determining an operation state of a user operating the imaging device based on the angular velocities at a plurality of different times; and performing zoom control based on the operation state.
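Items (6) to (8) describe alternative ways to classify the angular velocity samples. A minimal sketch of the approach in (6), fitting omega = a*t + b to the latest samples and comparing the fitted slope against per-state coefficient ranges, is shown below; the state names and the ranges themselves are illustrative assumptions:

```python
import numpy as np

STATE_SLOPE_RANGES = {            # illustrative coefficient ranges per state
    "accelerating": (0.5, float("inf")),
    "steady":       (-0.5, 0.5),
    "decelerating": (float("-inf"), -0.5),
}

def classify(times, angular_velocities):
    """Fit omega = a*t + b to the samples and pick the state whose
    predetermined slope range contains the fitted coefficient a."""
    a, _b = np.polyfit(times, angular_velocities, 1)
    for state, (lo, hi) in STATE_SLOPE_RANGES.items():
        if lo <= a < hi:
            return state
    return "steady"
```

A real implementation would tune the ranges per device and may use more than one coefficient, but the comparison against predetermined coefficient ranges is the essence of item (6).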
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Adjustment Of Camera Lenses (AREA)
Abstract
Description
〈Configuration example of imaging device〉
FIG. 1 is a diagram showing a configuration example of an imaging device to which the present technology is applied.
Next, zoom control of the imaging device 11 will be described.
Next, the operation of the imaging device 11 will be described.
While the photographing process is being executed, a zoom state determination process for determining an appropriate zoom state based on the user's operation state is also performed. For example, the zoom state determination process is invoked and executed as a program while the process of step S18 in FIG. 7 is being performed at each time.
〈Description of photographing process〉
In the above, an example was described in which the user's operation state is determined using the latest N angular velocities actually detected (measured); however, the angular velocity at the next time may be estimated using several of the latest angular velocities, and the operation state may be determined also using the angular velocity obtained by the estimation. In such a case, the imaging device 11 performs, for example, the photographing process shown in FIG. 12.
〈Configuration example of imaging device〉
Furthermore, in the above, the case where the imaging device 11 is monocular has been described, but the imaging device may be binocular. In such a case, the imaging device is configured, for example, as shown in FIG. 13. In FIG. 13, parts corresponding to those in FIG. 1 are denoted by the same reference numerals, and their description is omitted as appropriate.
Next, the operation of the imaging device 71 will be described. Hereinafter, the photographing process performed by the imaging device 71 will be described with reference to the flowchart of FIG. 14.
〈Description of photographing process〉
The imaging device 71 may also estimate the angular velocity at the next time to determine the user's operation. In such a case, the imaging device 71 performs the photographing process shown in FIG. 15. Hereinafter, the photographing process by the imaging device 71 will be described with reference to the flowchart of FIG. 15.
〈Configuration example of imaging device〉
In the above, optical zoom, in which the angle of view of the image is changed by changing the zoom magnification of the imaging lens 21, was described as an example; however, the angle of view of the image may instead be changed by digital zoom. In such a case, the imaging device 11 is configured, for example, as shown in FIG. 16. In FIG. 16, parts corresponding to those in FIG. 1 are denoted by the same reference numerals, and their description is omitted as appropriate.
Next, the operation of the imaging device 11 in FIG. 16 will be described. Hereinafter, the photographing process performed by the imaging device 11 configured as shown in FIG. 16 will be described with reference to the flowchart of FIG. 17.
〈Description of photographing process〉
In the fifth embodiment, an example was described in which the original image before zoom processing is displayed together with the zoom image; however, an image with a wider shooting range than that of the original image before zoom processing may be displayed instead. By doing so, particularly when the user is performing a panning operation while holding the imaging device 11, it becomes easy to grasp which part of the wider range of the entire subject is being photographed.
〈Configuration example of imaging device〉
When digital zoom processing is performed in the imaging device 11, the photographing process and the object recognition process may be combined so that, for example, when an object is detected near the center of the image, the object is zoomed in on.
Next, the operation of the imaging device 11 in FIG. 21 will be described. Hereinafter, the photographing process performed by the imaging device 11 configured as shown in FIG. 21 will be described with reference to the flowchart of FIG. 22.
The imaging device 11 shown in FIG. 21 also performs object recognition together with the zoom state determination process described with reference to FIG. 11, and performs an adjustment process, which is a process for maintaining the zoom state as much as possible when an object is detected near the center of the image. Hereinafter, the adjustment process performed by the imaging device 11 will be described with reference to the flowchart of FIG. 23.
〈Description of photographing process〉
In the imaging device 11 shown in FIG. 21, when an object is detected from an image, a region including the object may be cut out as the zoom-processed image. In such a case, the imaging device 11 performs, for example, the photographing process shown in FIG. 24. Hereinafter, the photographing process performed by the imaging device 11 configured as shown in FIG. 21 will be described with reference to the flowchart of FIG. 24.
The imaging device 11 shown in FIG. 21 also performs, together with the zoom state determination process described with reference to FIG. 11, object recognition and a zoom position determination process, which is a process for determining the position of the region cut out by the zoom process according to the recognition result. Hereinafter, the zoom position determination process performed by the imaging device 11 will be described with reference to the flowchart of FIG. 25.
〈Configuration example of imaging device〉
In the eighth embodiment, an example was described in which, when digital zoom processing is performed, the zoom process is performed so that the subject is positioned at the center of the image; however, the subject may also be positioned at the center of the image when optical zoom processing is performed. In such a case, the imaging device 11 is configured, for example, as shown in FIG. 26. In FIG. 26, parts corresponding to those in FIG. 1 or FIG. 21 are denoted by the same reference numerals, and their description is omitted as appropriate.
Next, the operation of the imaging device 11 shown in FIG. 26 will be described.
The imaging device 11 shown in FIG. 26 also performs, together with the zoom state determination process described with reference to FIG. 11, object recognition and a tilt angle determination process, which is a process for obtaining the tilt angle of the imaging lens 21 so that the object fits within the image according to the recognition result. Hereinafter, the tilt angle determination process performed by the imaging device 11 will be described with reference to the flowchart of FIG. 28.
Claims (18)
- 1. An imaging device comprising: an angular velocity detection unit that detects an angular velocity of its own movement; an operation determination unit that determines an operation state of a user based on the angular velocities at a plurality of different times; and a zoom control unit that performs zoom control based on the operation state.
- 2. The imaging device according to claim 1, wherein the zoom control unit performs zoom control so that the zoom magnification decreases in the operation state in which the angular velocity increases with time.
- 3. The imaging device according to claim 1, wherein the zoom control unit performs zoom control so that the zoom magnification increases in the operation state in which the angular velocity decreases with time.
- 4. The imaging device according to claim 1, wherein the zoom control unit performs zoom control so that the zoom magnification remains constant when the operation state in which the angular velocity decreases with time changes to the operation state in which the angular velocity increases with time.
- 5. The imaging device according to claim 1, wherein, when specific zoom control is being performed, the zoom control unit makes it difficult for the operation determination unit to determine that the operation state is the predetermined operation state.
- 6. The imaging device according to claim 1, wherein the operation determination unit obtains a relational expression between time and the angular velocity based on the angular velocities at the plurality of different times, and determines the operation state by comparing the coefficients of the obtained relational expression with ranges of values that the coefficients of the relational expression can take, predetermined for each of a plurality of operation states.
- 7. The imaging device according to claim 1, wherein the operation determination unit determines the operation state by obtaining distances between the angular velocities at the plurality of different times and angular velocities at a plurality of times predetermined for each of a plurality of operation states.
- 8. The imaging device according to claim 1, wherein the operation determination unit estimates the angular velocity at a future time based on the angular velocities at the plurality of different times, and determines the operation state based on the angular velocity obtained by the estimation and the angular velocities at the plurality of different times.
- 9. The imaging device according to claim 1, further comprising a zoom processing unit that generates a zoom image by cutting out a partial region of a captured image in accordance with control of the zoom control unit.
- 10. The imaging device according to claim 9, further comprising a display control unit that displays the captured image and the zoom image side by side.
- 11. The imaging device according to claim 10, wherein the display control unit displays a frame in the region of the image from which the zoom image was cut out.
- 12. The imaging device according to claim 9, further comprising a display control unit that displays, side by side, the zoom image and an image obtained by combining the images captured at a plurality of different times.
- 13. The imaging device according to claim 12, wherein the display control unit displays a frame in the region of the combined image from which the zoom image was cut out.
- 14. The imaging device according to claim 1, further comprising an object detection unit that detects an object from a captured image, wherein the zoom control unit performs zoom control so that the state in which the object is displayed in an enlarged manner is maintained when the object is detected near the center of the image.
- 15. The imaging device according to claim 9, further comprising an object detection unit that detects an object from the captured image, wherein the zoom control unit determines, based on the detection result of the object, the position at which the zoom image is cut out from the image.
- 16. The imaging device according to claim 1, further comprising: an object detection unit that detects an object from a captured image; and a correction unit that tilts a lens that collects light incident from a subject according to the detection result of the object.
- 17. An imaging method comprising the steps of: detecting an angular velocity of movement of an imaging device; determining an operation state of a user operating the imaging device based on the angular velocities at a plurality of different times; and performing zoom control based on the operation state.
- 18. A program for causing a computer to execute processing comprising the steps of: detecting an angular velocity of movement of an imaging device; determining an operation state of a user operating the imaging device based on the angular velocities at a plurality of different times; and performing zoom control based on the operation state.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP14842216.5A EP3041217B1 (en) | 2013-09-06 | 2014-08-25 | Imaging device, method and program |
JP2015535429A JP6589635B2 (ja) | 2013-09-06 | 2014-08-25 | 撮像装置および方法、並びにプログラム |
US14/906,023 US10171742B2 (en) | 2013-09-06 | 2014-08-25 | Image capturing apparatus, method, and program with operation state determination based upon angular velocity detection |
CN201480046802.4A CN105745915B (zh) | 2013-09-06 | 2014-08-25 | 图像拍摄装置、方法和程序 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-185179 | 2013-09-06 | ||
JP2013185179 | 2013-09-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015033810A1 true WO2015033810A1 (ja) | 2015-03-12 |
Family
ID=52628288
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/072105 WO2015033810A1 (ja) | 2013-09-06 | 2014-08-25 | 撮像装置および方法、並びにプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US10171742B2 (ja) |
EP (1) | EP3041217B1 (ja) |
JP (1) | JP6589635B2 (ja) |
CN (1) | CN105745915B (ja) |
WO (1) | WO2015033810A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017116924A (ja) * | 2015-12-15 | 2017-06-29 | キヤノン株式会社 | ズーム制御装置およびズーム制御方法、撮像装置 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11202011B2 (en) * | 2016-07-25 | 2021-12-14 | Canon Kabushiki Kaisha | Control apparatus, image capturing apparatus, lens apparatus, image capturing system, control method, and storage medium |
US10869026B2 (en) * | 2016-11-18 | 2020-12-15 | Amitabha Gupta | Apparatus for augmenting vision |
US10498964B2 (en) * | 2017-12-28 | 2019-12-03 | Gopro, Inc. | Adaptive modes of operation based on user intention or activity |
CN115278079B (zh) * | 2022-07-27 | 2024-07-02 | 维沃移动通信有限公司 | 拍摄方法及其装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002099014A (ja) * | 2000-09-22 | 2002-04-05 | Nikon Corp | 信号予測装置及びこれを具えたカメラ |
JP2005202018A (ja) * | 2004-01-14 | 2005-07-28 | Fujinon Corp | 像振れ補正装置 |
JP2006033513A (ja) | 2004-07-16 | 2006-02-02 | Canon Inc | 撮像装置 |
JP2006091106A (ja) * | 2004-09-21 | 2006-04-06 | Canon Inc | 撮像装置及びその制御方法 |
JP2009206920A (ja) * | 2008-02-28 | 2009-09-10 | Nikon Corp | カメラ |
JP2013046149A (ja) * | 2011-08-23 | 2013-03-04 | Canon Inc | 撮像装置およびその制御方法 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7110024B1 (en) * | 1997-07-15 | 2006-09-19 | Silverbrook Research Pty Ltd | Digital camera system having motion deblurring means |
US6630950B1 (en) * | 1998-03-19 | 2003-10-07 | Canon Kabushiki Kaisha | Apparatus for improving image vibration suppression |
JP4273184B2 (ja) | 2003-09-03 | 2009-06-03 | 独立行政法人産業技術総合研究所 | リン酸測定方法とその装置 |
JP2005252732A (ja) * | 2004-03-04 | 2005-09-15 | Olympus Corp | 撮像装置 |
CN101616262B (zh) * | 2005-02-07 | 2012-07-25 | 松下电器产业株式会社 | 成像装置 |
US20070291104A1 (en) * | 2006-06-07 | 2007-12-20 | Wavetronex, Inc. | Systems and methods of capturing high-resolution images of objects |
JP4959535B2 (ja) * | 2007-12-13 | 2012-06-27 | 株式会社日立製作所 | 撮像装置 |
JP5268433B2 (ja) * | 2008-06-02 | 2013-08-21 | キヤノン株式会社 | 撮像装置、及び撮像装置の制御方法 |
JP5366454B2 (ja) * | 2008-06-30 | 2013-12-11 | キヤノン株式会社 | 光学機器 |
JP4715913B2 (ja) * | 2008-12-17 | 2011-07-06 | ソニー株式会社 | 撮像装置、画像処理装置、ズーム制御方法およびズーム制御プログラム |
JP2011193329A (ja) * | 2010-03-16 | 2011-09-29 | Panasonic Corp | 撮像装置 |
JP5612916B2 (ja) * | 2010-06-18 | 2014-10-22 | キヤノン株式会社 | 位置姿勢計測装置、その処理方法、プログラム、ロボットシステム |
CN102346346A (zh) * | 2010-07-21 | 2012-02-08 | 佳能株式会社 | 模糊校正设备和摄像设备 |
EP2442549B1 (en) * | 2010-10-14 | 2015-10-14 | Sony Corporation | Image capturing device, system and method |
JP2012151796A (ja) * | 2011-01-21 | 2012-08-09 | Sony Corp | 画像処理装置と画像処理方法およびプログラム |
CA2866131A1 (en) * | 2012-03-02 | 2013-06-09 | H4 Engineering, Inc. | Multifunction automatic video recording device |
JP2013225104A (ja) * | 2012-03-23 | 2013-10-31 | Panasonic Corp | 撮像装置 |
GB201208088D0 (en) * | 2012-05-09 | 2012-06-20 | Ncam Sollutions Ltd | Ncam |
JP2014053794A (ja) * | 2012-09-07 | 2014-03-20 | Nintendo Co Ltd | 情報処理プログラム、情報処理装置、情報処理システム及び情報処理方法 |
US9064385B2 (en) * | 2013-03-15 | 2015-06-23 | Immersion Corporation | Method and apparatus to generate haptic feedback from video content analysis |
-
2014
- 2014-08-25 EP EP14842216.5A patent/EP3041217B1/en active Active
- 2014-08-25 JP JP2015535429A patent/JP6589635B2/ja active Active
- 2014-08-25 US US14/906,023 patent/US10171742B2/en active Active
- 2014-08-25 CN CN201480046802.4A patent/CN105745915B/zh active Active
- 2014-08-25 WO PCT/JP2014/072105 patent/WO2015033810A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002099014A (ja) * | 2000-09-22 | 2002-04-05 | Nikon Corp | 信号予測装置及びこれを具えたカメラ |
JP2005202018A (ja) * | 2004-01-14 | 2005-07-28 | Fujinon Corp | 像振れ補正装置 |
JP2006033513A (ja) | 2004-07-16 | 2006-02-02 | Canon Inc | 撮像装置 |
JP2006091106A (ja) * | 2004-09-21 | 2006-04-06 | Canon Inc | 撮像装置及びその制御方法 |
JP2009206920A (ja) * | 2008-02-28 | 2009-09-10 | Nikon Corp | カメラ |
JP2013046149A (ja) * | 2011-08-23 | 2013-03-04 | Canon Inc | 撮像装置およびその制御方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3041217A4 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017116924A (ja) * | 2015-12-15 | 2017-06-29 | キヤノン株式会社 | ズーム制御装置およびズーム制御方法、撮像装置 |
Also Published As
Publication number | Publication date |
---|---|
US10171742B2 (en) | 2019-01-01 |
JP6589635B2 (ja) | 2019-10-16 |
CN105745915B (zh) | 2019-02-15 |
JPWO2015033810A1 (ja) | 2017-03-02 |
US20160156849A1 (en) | 2016-06-02 |
EP3041217A1 (en) | 2016-07-06 |
EP3041217A4 (en) | 2017-04-12 |
EP3041217B1 (en) | 2021-03-31 |
CN105745915A (zh) | 2016-07-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9344634B2 (en) | Imaging apparatus having subject detection function, method for controlling the imaging apparatus, and storage medium | |
JP4873762B2 (ja) | 撮像装置及び撮像装置の制御方法 | |
JP6184189B2 (ja) | 被写体検出装置およびその制御方法、撮像装置、被写体検出装置の制御プログラムおよび記憶媒体 | |
JP6589635B2 (ja) | 撮像装置および方法、並びにプログラム | |
JP5623256B2 (ja) | 撮像装置、その制御方法及びプログラム | |
KR101593995B1 (ko) | 자동 초점 조절 방법, 상기 방법을 기록한 기록 매체, 및 상기 방법을 실행하는 자동 초점 조절 장치 | |
CN107850753B (zh) | 检测设备、检测方法、检测程序和成像设备 | |
JPWO2006082967A1 (ja) | 撮像装置 | |
JP2012199675A (ja) | 画像処理装置、画像処理方法及びプログラム | |
US8648960B2 (en) | Digital photographing apparatus and control method thereof | |
KR20080101778A (ko) | 화상흔들림의 검출기능을 구비한 촬상장치, 촬상장치의제어방법 및 촬상장치의 제어프로그램을 기록한 기록매체 | |
JP2010008936A (ja) | 撮像装置及びその制御方法 | |
EP3567844B1 (en) | Control apparatus and control method | |
TW201351980A (zh) | 影像處理裝置、影像處理方法、程式 | |
JP2015103852A (ja) | 画像処理装置、撮像装置、画像処理装置の制御方法、画像処理装置の制御プログラム及び記憶媒体 | |
JP5783696B2 (ja) | 撮像装置、オートズーム方法、及び、プログラム | |
JP6057608B2 (ja) | 焦点調節装置及びその制御方法 | |
JP4807582B2 (ja) | 画像処理装置、撮像装置及びそのプログラム | |
JP2013017218A (ja) | 画像処理装置、画像処理方法及びプログラム | |
WO2019146164A1 (ja) | 撮像装置、撮像方法、及びプログラム | |
JP6018466B2 (ja) | 撮像装置及びその制御方法 | |
JP4981758B2 (ja) | 焦点調節装置及び方法 | |
JP5832618B2 (ja) | 撮像装置、その制御方法及びプログラム | |
JP6164978B2 (ja) | 焦点調整装置、その制御方法、および制御プログラム、並びに撮像装置 | |
JP2017038243A (ja) | 撮像装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14842216 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015535429 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14906023 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2014842216 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014842216 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |