US20160205308A1 - Display apparatus, image capturing apparatus, image capturing system, control method for display apparatus, control method for image capturing apparatus, and storage medium - Google Patents
- Publication number
- US20160205308A1 (application US14/990,176)
- Authority
- US
- United States
- Prior art keywords
- captured image
- display
- image capturing
- display apparatus
- attitude
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N5/23203
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00129—Connection or combination of a still picture apparatus with a display device, e.g. CRT or LCD monitor
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
- H04N23/6842—Vibration or motion blur correction performed by controlling the image sensor readout, e.g. by controlling the scanning position (windowing)
- H04N23/80—Camera processing pipelines; Components thereof
- H04N5/23229
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
- H04N23/667—Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
Definitions
- The present invention relates to a display apparatus, an image capturing apparatus, an image capturing system, a control method for the display apparatus, a control method for the image capturing apparatus, and a storage medium.
- When remotely operating an image capturing apparatus using an external terminal, a user may notice an inconsistency between the horizontal direction in an image captured by the image capturing apparatus and the horizontal direction of a subject while confirming the captured image.
- If there is a distance between the photographer operating the external terminal and the image capturing apparatus, it may be difficult for the user to correct the position and attitude of the image capturing apparatus.
- In view of this, a method has been suggested in which the angle of view of a captured image is electronically corrected by a user issuing a rotation angle instruction to an image capturing apparatus through a remote operation (see Japanese Patent Laid-Open No. 2007-228097).
- Another method has been suggested in which the horizontal direction of an image capturing apparatus is electronically and automatically corrected using tilt information of the image capturing apparatus (see Japanese Patent Laid-Open No. 2012-147071).
- In the method of Japanese Patent Laid-Open No. 2007-228097, a tilt of a captured image is electronically corrected by a client terminal transmitting angle information for correcting the tilt of the captured image to an image capturing apparatus connected to a network.
- The user determines the angle used in correcting the tilt of the captured image by operating a slide bar displayed on the client terminal.
- The image capturing apparatus at a remote location receives the angle information and electronically corrects the captured image. In this way, the tilt of the captured image can be corrected through a remote operation.
- In the method of Japanese Patent Laid-Open No. 2012-147071, the image capturing apparatus detects its own tilt angle and corrects the captured image using the detected angle information.
- This method detects a tilt of the image capturing apparatus itself using, for example, an angle sensor within the image capturing apparatus.
- The image capturing apparatus then electronically corrects the captured image using the detected angle information. In this way, a tilt of the captured image can be corrected automatically.
- However, with the method of Japanese Patent Laid-Open No. 2007-228097, it is difficult for the user to intuitively grasp the relationship between the operation amount of the slide bar and the rotation angle, and hence to correct the tilt of the captured image quickly and accurately.
- The method of Japanese Patent Laid-Open No. 2012-147071 does not support a case in which the image capturing apparatus is level whereas the subject is tilted.
- The present invention has been made in view of the above situations, and provides a technique that enables intuitive correction of a captured image by a user of a display apparatus that receives the captured image from an image capturing apparatus and displays the received captured image.
- According to an aspect of the present invention, there is provided a display apparatus comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; and a transmission unit configured to, while the display unit is displaying the captured image, transmit attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit.
- According to another aspect of the present invention, there is provided an image capturing apparatus comprising: an image capturing unit configured to generate a captured image; a transmission unit configured to transmit the captured image to a display apparatus; a reception unit configured to receive attitude information indicating an attitude of the display apparatus from the display apparatus; and a processing unit configured to apply specific processing to the captured image based on the attitude information received by the reception unit, and record the captured image.
- According to another aspect of the present invention, there is provided an image capturing system including a display apparatus and an image capturing apparatus, the display apparatus comprising: a reception unit configured to receive a captured image from the image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; and a transmission unit configured to, while the display unit is displaying the captured image, transmit attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit, the image capturing apparatus comprising: an image capturing unit configured to generate the captured image; a transmission unit configured to transmit the captured image to the display apparatus; a reception unit configured to receive the attitude information indicating the attitude of the display apparatus from the display apparatus; and a processing unit configured to apply specific processing to the captured image based on the attitude information received by the reception unit of the image capturing apparatus, and record the captured image.
- According to another aspect of the present invention, there is provided a control method for a display apparatus including a display unit and an attitude detection unit, the control method comprising: receiving a captured image from an image capturing apparatus; displaying the captured image on the display unit; detecting an attitude of the display apparatus with the attitude detection unit; and while displaying the captured image on the display unit, transmitting attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detecting.
- According to another aspect of the present invention, there is provided a control method for an image capturing apparatus comprising: generating a captured image; transmitting the captured image to a display apparatus; receiving attitude information indicating an attitude of the display apparatus from the display apparatus; and applying specific processing to the captured image based on the attitude information received by the receiving, and recording the captured image.
- According to another aspect of the present invention, there is provided a display apparatus comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; a storage unit configured to, while the display unit is displaying the captured image, store attitude information in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit; and a processing unit configured to apply specific processing to the captured image received from the image capturing apparatus based on the attitude information stored by the storage unit, and record the captured image.
- According to another aspect of the present invention, there is provided a control method for a display apparatus comprising: receiving a captured image from an image capturing apparatus; displaying the captured image on a display unit of the display apparatus; detecting an attitude of the display apparatus; while displaying the captured image on the display unit, storing attitude information in response to an instruction from a user, the attitude information indicating the attitude detected by the detecting; and applying specific processing to the captured image received from the image capturing apparatus based on the stored attitude information, and recording the captured image.
- According to another aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to function as a display apparatus comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; and a transmission unit configured to, while the display unit is displaying the captured image, transmit attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit.
- According to another aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to function as an image capturing apparatus comprising: an image capturing unit configured to generate a captured image; a transmission unit configured to transmit the captured image to a display apparatus; a reception unit configured to receive attitude information indicating an attitude of the display apparatus from the display apparatus; and a processing unit configured to apply specific processing to the captured image based on the attitude information received by the reception unit, and record the captured image.
- According to another aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to function as a display apparatus comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; a storage unit configured to, while the display unit is displaying the captured image, store attitude information in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit; and a processing unit configured to apply specific processing to the captured image received from the image capturing apparatus based on the attitude information stored by the storage unit, and record the captured image.
- FIG. 1 is a block diagram showing schematic configurations of a display apparatus 100 and an image capturing apparatus 200 .
- FIGS. 2A to 2C show an overview of a first embodiment.
- FIGS. 3A and 3B are flowcharts showing operations of the display apparatus 100 and the image capturing apparatus 200 according to the first embodiment.
- FIGS. 4A and 4B are flowcharts showing a modification example of FIGS. 3A and 3B .
- FIGS. 5A to 5C show an overview of a second embodiment.
- FIGS. 6A and 6B are flowcharts showing operations of the display apparatus 100 and the image capturing apparatus 200 according to the second embodiment.
- FIGS. 7A and 7B show an overview of a third embodiment.
- FIG. 8 is a flowchart showing operations of the image capturing apparatus 200 according to the third embodiment.
- FIG. 9A shows a rotation axis 901 of the display apparatus 100 according to the first embodiment.
- FIG. 9B shows a rotation axis 902 of the display apparatus 100 according to the second embodiment.
- FIG. 1 is a block diagram showing schematic configurations of a display apparatus 100 and an image capturing apparatus 200 included in each of image capturing systems according to the embodiments.
- the display apparatus 100 is, for example, a mobile terminal (e.g., a smartphone) that can be held in a hand of a user, and is used by the user to remotely operate the image capturing apparatus 200 .
- the image capturing apparatus 200 is, for example, a digital camera, and has a function of correcting a tilt and a keystone effect of a captured image through electronic image processing.
- the display apparatus 100 communicates with the image capturing apparatus 200 connected to a network.
- the display apparatus 100 receives an image captured by the image capturing apparatus 200 via the network.
- the display apparatus 100 also transmits attitude information indicating an attitude of the display apparatus 100 to the image capturing apparatus 200 .
- attitude information is also referred to as angle information.
- the display apparatus 100 receives an image captured by the image capturing apparatus 200 with the aid of the communication processing unit 1 , and displays the received captured image on a display unit 2 .
- a display processing unit 3 converts the captured image into information that can be displayed on the display unit 2 of the display apparatus 100 .
- the display processing unit 3 also generates display data for displaying the angle information of the display apparatus 100 on the display unit 2 .
- An angle detection unit 4 (attitude detection unit) detects a positional attitude of the display apparatus 100 , and converts the result of detection into the angle information.
- An acceleration sensor, a gyroscope, or the like can be used as the angle detection unit 4 .
- a control unit 5 includes a nonvolatile ROM and a volatile RAM, and controls the display apparatus 100 by executing a control program stored in the ROM.
- the RAM in the control unit 5 is used as a working memory for execution of the control program by the control unit 5 .
- Although FIG. 1 shows the display processing unit 3 as an independent block, the control unit 5 may execute the processing of the display processing unit 3.
- the image capturing apparatus 200 obtains captured image data by capturing an image of a subject with the aid of an image sensor included in an image capture processing unit 6 .
- An image processing unit 7 generates a captured image based on the captured image data.
- the image processing unit 7 also electronically corrects the captured image based on the angle information received from the display apparatus 100 via a communication processing unit 9 .
- the captured image is recorded to a recording unit 10 .
- the image capturing apparatus 200 also includes an operation unit 11 for performing a shooting operation, configuring menu settings, and the like on the image capturing apparatus 200 , as well as a display unit 12 for confirming the captured image, shooting information, and the like.
- the image capturing apparatus 200 further includes the communication processing unit 9 for transmitting the captured image to the display apparatus 100 , or for receiving the angle information of the display apparatus 100 .
- a control unit 8 includes a nonvolatile ROM and a volatile RAM, and controls the image capturing apparatus 200 by executing a control program stored in the ROM.
- the RAM in the control unit 8 is used as a working memory for execution of the control program by the control unit 8 .
- the image capture processing unit 6 is composed of an optical unit including a plurality of lenses, a diaphragm, and the like, the image sensor, a driver for driving the image sensor, a timing generation circuit, a CDS/AGC circuit, and the like.
- the optical unit includes the diaphragm for adjusting an amount of incident light from outside, and a neutral density (ND) filter.
- the image capture processing unit 6 drives a lens assembly with respect to an optical axis so as to, for example, focus on the subject and reduce blur in the captured image caused by a hand shake and the like.
- The image sensor captures an image of the subject through photoelectric conversion and, with the use of the CDS/AGC circuit (CDS: correlated double sampling), samples and amplifies image information based on electric charges (image signals) accumulated in pixels of the image sensor.
- An A/D converter converts the image information (analog signal) output from the CDS/AGC circuit into a digital signal.
- the image processing unit 7 applies various types of signal processing, including auto white balance (AWB) and gamma control etc., to the image information (digital signal) output from the A/D converter, thereby generating the captured image.
- the driver for driving the image sensor and the timing generation circuit feed, for example, driving pulses for driving the image sensor to the image sensor, and adjust readout of the image captured by the image sensor and an exposure time period.
- The image processing unit 7 electronically corrects the captured image based on the angle information of the display apparatus 100 received via the communication processing unit 9. It will be assumed that correction in the direction of rotation and in the direction of a keystone effect is performed through this image processing.
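- For illustration only, the Python sketch below shows one way such an electronic rotation correction could be performed; the patent does not disclose an implementation, and the function name, the sign convention, and the use of OpenCV are assumptions.

```python
import cv2
import numpy as np

def correct_tilt(image: np.ndarray, roll_deg: float) -> np.ndarray:
    """Rotate the captured image about its center by the angle received from
    the display apparatus (illustrative sketch; sign convention assumed)."""
    h, w = image.shape[:2]
    center = (w / 2.0, h / 2.0)
    # Counter-rotate the image so that a subject tilted by roll_deg appears level.
    matrix = cv2.getRotationMatrix2D(center, -roll_deg, 1.0)
    return cv2.warpAffine(image, matrix, (w, h))
```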
- the recording unit 10 stores the captured image generated by the image processing unit 7 as an image file to an internal memory or an external memory (recording medium), such as a memory card. At this time, the recording unit 10 can write the angle information received via the communication processing unit 9 to the image file.
- Using the operation unit 11, the user performs key operations and configures setting menus and the like when shooting with the image capturing apparatus 200.
- the display unit 12 displays a display image generated by the image processing unit 7 .
- a horizontal detection unit 13 detects a tilt of the image capturing apparatus 200 and outputs angle information.
- An acceleration sensor, a gyroscope, or the like can be used as the horizontal detection unit 13 .
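- As a rough sketch of how an acceleration sensor could provide the angle information used by the angle detection unit 4 and the horizontal detection unit 13, the gravity vector can be converted into a roll angle (rotation about the screen normal, used in the first embodiment) and a pitch angle (forward/backward tilt, used in the second embodiment). The axis convention below (x to the right of the screen, y toward the top of the screen, z out of the screen) is an assumption; the patent only states that an acceleration sensor or a gyroscope may be used.

```python
import math

def screen_roll_deg(ax: float, ay: float) -> float:
    """Rotation of the display about the axis orthogonal to the screen
    (cf. rotation axis 901); 0 degrees when the screen's transverse line is level."""
    return math.degrees(math.atan2(ax, ay))

def screen_pitch_deg(ax: float, ay: float, az: float) -> float:
    """Forward/backward tilt of the screen (cf. rotation axis 902);
    0 degrees when the screen's up-down line is perpendicular to the ground."""
    return math.degrees(math.atan2(az, math.hypot(ax, ay)))
```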
- In FIG. 1, each block is illustrated as an independent circuit unit; alternatively, all or some of the blocks may be included in the control unit 5 or the control unit 8.
- the image capturing apparatus 200 is placed so as to capture an image of a subject.
- the image processing unit 7 generates a captured image from captured image data obtained by the image capture processing unit 6 .
- the image capturing apparatus 200 is placed to be level with respect to a ground surface.
- a direction that is “level with respect to the ground surface” means a direction that is perpendicular to a gravitational direction detected by the horizontal detection unit 13 of the image capturing apparatus 200 ; hereinafter, it may be simply referred to as a “horizontal direction detected by the image capturing apparatus 200 ”.
- the subject is placed on a slanted surface in a tilted manner.
- the image capturing apparatus 200 displays the captured image on the display unit 12 .
- the image capturing apparatus 200 also displays, on the display unit 12 , a horizontal line (a horizontal line of the image capturing apparatus 200 ) based on angle information output from the horizontal detection unit 13 .
- the horizontal line is parallel to a transverse direction of the captured image.
- the subject included in the captured image is also tilted in FIG. 2B .
- Although the image capturing apparatus 200 is level with respect to the ground surface in the description of the present embodiment, the image capturing apparatus 200 is not limited to having such an attitude.
- the image capturing apparatus 200 may be placed in such a manner that it is tilted with respect to the ground surface.
- the horizontal line is displayed in a tilted manner, with its tilt corresponding to a tilt detected by the horizontal detection unit 13 of the image capturing apparatus 200 (that is to say, the displayed horizontal line is at an angle to the transverse direction of the captured image).
- The image capturing apparatus 200 transmits the captured image to the communication processing unit 1 of the display apparatus 100 connected to the network via the communication processing unit 9. At this time, the image capturing apparatus 200 transmits information indicating the attitude of the image capturing apparatus 200 together with the captured image. It will be assumed that information indicating the horizontal direction of the image capturing apparatus 200 (horizontal information, image capturing apparatus attitude information), which is obtained based on the angle information output from the horizontal detection unit 13, is transmitted as the information indicating the attitude of the image capturing apparatus 200.
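- The patent does not define a wire format for sending the captured image together with the horizontal information; a minimal sketch of one possible payload (the field names and the use of JSON are illustrative assumptions) might look like this:

```python
import base64
import json

def build_live_view_packet(jpeg_bytes: bytes, camera_roll_deg: float) -> bytes:
    """Bundle one live-view frame with the camera's horizontal information
    so the display apparatus can draw the camera's horizontal line."""
    payload = {
        "image_jpeg": base64.b64encode(jpeg_bytes).decode("ascii"),
        "camera_attitude": {"roll_deg": camera_roll_deg},
    }
    return json.dumps(payload).encode("utf-8")
```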
- As shown in FIG. 2B, the display apparatus 100 converts the captured image received from the image capturing apparatus 200 via the communication processing unit 1 into a format that can be displayed on the display unit 2 with the aid of the display processing unit 3, and displays the result of the conversion on the display unit 2.
- a photographer determines angle information for correcting the tilt of the captured image by rotating the display apparatus 100 while confirming the captured image being displayed on the display unit 2 .
- This rotation is performed around a rotation axis orthogonal to a display screen of the display unit 2 (see a rotation axis 901 in FIG. 9A ).
- The rotation angle in the case where a transverse line (a line extending in a left-right direction) of the display screen is level with respect to the ground surface, as shown on the left side of FIG. 9A, is defined as 0 degrees.
- a direction that is “level with respect to the ground surface” means a direction perpendicular to a gravitational direction detected by the angle detection unit 4 of the display apparatus 100 ; hereinafter, it may also be simply referred to as a “horizontal direction detected by the display apparatus 100 ”.
- horizontal auxiliary lines are parallel to the transverse line of the display screen (as will be elaborated later, the horizontal auxiliary lines are lines extending in a transverse direction among auxiliary lines shown in FIG. 2B ).
- the “tilt of the captured image” to be corrected denotes “deviation” from the horizontal auxiliary lines of the display apparatus 100 caused by rotation of the captured image around the rotation axis.
- the display unit 2 displays the horizontal line based on the horizontal information received from the image capturing apparatus 200 and auxiliary information indicating the rotation angle (attitude) of the display apparatus 100 , together with the captured image.
- the auxiliary information denotes grid-like auxiliary lines including horizontal auxiliary lines and vertical auxiliary lines.
- An angle formed by the horizontal auxiliary lines and the transverse line of the display screen corresponds to the rotation angle of the display apparatus 100 .
- the horizontal line and the auxiliary lines are generated by the display processing unit 3 .
- the display processing unit 3 updates display so that the horizontal auxiliary lines keep showing the horizontal direction detected by the display apparatus 100 even when the display apparatus 100 has been rotated. In this way, the user can confirm the horizontal direction detected by the display apparatus 100 by viewing the horizontal auxiliary lines.
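- A minimal sketch of how the display processing unit 3 could keep a horizontal auxiliary line level while the display apparatus rotates is given below; it assumes the roll angle from the earlier sketch, a pixel coordinate system with the origin at the top-left of the screen, and a sign convention that may need flipping for a given sensor.

```python
import math

def level_line_endpoints(width: int, height: int, device_roll_deg: float):
    """Endpoints (in screen pixels) of a horizontal auxiliary line through the
    screen centre that stays level with the ground while the display rotates."""
    cx, cy = width / 2.0, height / 2.0
    half = math.hypot(width, height) / 2.0   # long enough to reach the screen edges
    # Counter-rotate by the detected roll so the drawn line tracks the
    # horizontal direction detected by the display apparatus.
    a = math.radians(-device_roll_deg)
    dx, dy = half * math.cos(a), half * math.sin(a)
    return (cx - dx, cy - dy), (cx + dx, cy + dy)
```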
- the user determines a line that belongs to the subject included in the captured image and that should match the horizontal direction detected by the display apparatus 100 .
- a line on which a bottom surface of a flower pot is in contact with a board on which the flower pot is placed (a subject's horizontal line H) is the line that should match the horizontal direction detected by the display apparatus 100 .
- the user rotates the display apparatus 100 so that the subject's horizontal line H matches the horizontal direction detected by the display apparatus 100 .
- the user can easily correct the tilt (level the subject) of the captured image by rotating the display apparatus 100 so that the subject's horizontal line H is parallel to the horizontal auxiliary lines.
- the horizontal auxiliary lines are not essential in the present embodiment. Even if the horizontal auxiliary lines are not displayed, the user can level the subject by rotating the display apparatus 100 while viewing the captured image (especially, the subject's horizontal line H) displayed on the display unit 2 .
- the present embodiment is described under the assumption that, while the image capturing apparatus 200 is placed to be level with respect to the ground surface, the subject is tilted, thus causing the tilt of the captured image.
- the image capturing apparatus 200 and the subject are not limited to having such attitudes.
- For example, in a case where the image capturing apparatus 200 is tilted with respect to the ground surface, the horizontal line of the image capturing apparatus 200 is displayed in a tilted manner in FIG. 2B, with its tilt corresponding to the tilt detected by the horizontal detection unit 13 of the image capturing apparatus 200 (that is to say, the displayed horizontal line is at an angle to the transverse direction of the captured image).
- the user can correct the tilt of the captured image attributed to the tilt of the image capturing apparatus 200 by rotating the display apparatus 100 so that the horizontal line of the image capturing apparatus 200 is parallel to the horizontal auxiliary lines.
- Being able to make such correction is a significant advantage in the case of, for example, remote shooting using the display apparatus 100 while the image capturing apparatus 200 is fixed in place by a tripod.
- the user need not use the horizontal line of the image capturing apparatus 200 .
- the user can correct the captured image by rotating the display apparatus 100 so that a desired portion of the subject has a desired attitude.
- the display apparatus 100 transmits a rotation instruction command to the image capturing apparatus 200 via the communication processing unit 1 in response to a user instruction.
- the rotation instruction command includes information indicating the rotation angle of the display apparatus 100 (angle information).
- the image capturing apparatus 200 receives the rotation instruction command including the angle information via the communication processing unit 9 .
- the image capturing apparatus 200 applies angle correction (tilt correction) to the captured image by executing electronic image processing with the aid of the image processing unit 7 .
- In making this correction, the image capturing apparatus 200 cuts out a partial region of the captured image; the larger the angle of correction, the smaller the region that can be cut out from the captured image.
- the display apparatus 100 analyzes an image size of the captured image received from the image capturing apparatus 200 . With the aid of the control unit 5 , the display apparatus 100 also calculates a cutout image size within the captured image, which is necessary for correction, based on the angle information detected by the angle detection unit 4 , thereby identifying the partial region to be cut out. The display apparatus 100 notifies the user of information indicating the identified partial region (cutout frame) by superimposing the information on the display unit 2 via the display processing unit 3 .
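- The cutout size that remains usable after a rotation can be computed with the well-known largest-inscribed-rectangle construction; the sketch below is a common textbook formula, not taken from the patent, and returns the width and height of the region that contains no blank corners after rotating a w × h image by the given angle.

```python
import math

def max_cutout_after_rotation(w: float, h: float, angle_deg: float) -> tuple[float, float]:
    """Width and height of the largest axis-aligned rectangle that fits inside a
    w x h image rotated by angle_deg (i.e. the cutout frame shown to the user)."""
    if w <= 0 or h <= 0:
        return 0.0, 0.0
    a = math.radians(angle_deg)
    sin_a, cos_a = abs(math.sin(a)), abs(math.cos(a))
    width_is_longer = w >= h
    side_long, side_short = (w, h) if width_is_longer else (h, w)
    if side_short <= 2.0 * sin_a * cos_a * side_long or abs(sin_a - cos_a) < 1e-10:
        # Half-constrained case: two opposite corners of the cutout touch the longer sides.
        x = 0.5 * side_short
        wr, hr = (x / sin_a, x / cos_a) if width_is_longer else (x / cos_a, x / sin_a)
    else:
        # Fully constrained case: the cutout touches all four rotated sides.
        cos_2a = cos_a * cos_a - sin_a * sin_a
        wr = (w * cos_a - h * sin_a) / cos_2a
        hr = (h * cos_a - w * sin_a) / cos_2a
    return wr, hr
```

- For example, under this formula a 1920 × 1080 frame rotated by 5 degrees leaves roughly a 1847 × 923 cutout, which is the kind of shrinkage the cutout frame makes visible to the user.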
- the user can correct the tilt of the image captured by the image capturing apparatus 200 through an intuitive operation by rotating the display apparatus 100 .
- FIGS. 3A and 3B pertain to the display apparatus 100 and the image capturing apparatus 200 , respectively.
- Processes of steps in a flowchart of FIG. 3A are realized as a result of the control unit 5 controlling blocks of the display apparatus 100 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise.
- processes of steps in a flowchart of FIG. 3B are realized as a result of the control unit 8 controlling blocks of the image capturing apparatus 200 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise.
- The same goes for the later-described FIGS. 4A and 4B.
- In step S301, the control unit 5 of the display apparatus 100 determines whether the user has issued an instruction for switching to an angle correction mode. If the instruction for switching to the angle correction mode has not been issued, the processing of the present flowchart is ended. If the instruction for switching to the angle correction mode has been issued, the processing proceeds to step S302.
- In step S302, the control unit 5 receives, via the communication processing unit 1, an image captured by the image capturing apparatus 200 and the horizontal information of the image capturing apparatus 200 (see FIG. 2B).
- In step S303, the display processing unit 3 displays, on the display unit 2, the captured image that was received from the image capturing apparatus 200 in step S302.
- the display processing unit 3 also generates an image of a horizontal line based on the horizontal information, and displays the horizontal line by superimposing the horizontal line over the captured image displayed on the display unit 2 (see FIG. 2B ).
- In step S304, the control unit 5 detects the angle information output from the angle detection unit 4.
- This angle information indicates a rotation angle around the rotation axis orthogonal to the display screen of the display unit 2 (see the rotation axis 901 in FIG. 9A ).
- In step S305, based on the angle information detected in step S304, the control unit 5 generates grid-like auxiliary lines that are used to confirm the horizontal direction detected by the display apparatus 100, and hence serve as a guide for the user when correcting the captured image. Then, the display processing unit 3 displays the auxiliary lines by superimposing them over the captured image displayed on the display unit 2.
- As shown in FIG. 2B, the transverse lines among these grid-like auxiliary lines are displayed in such a manner that they are always maintained level (parallel to the horizontal direction detected by the display apparatus 100), even when the rotation angle of the display apparatus 100 has been changed.
- the user adjusts the rotation angle of the display apparatus 100 so that the horizontal auxiliary lines match a horizontal portion of a subject being shot (e.g., the subject's horizontal line H shown in FIGS. 2B and 2C ). In this way, the user can easily generate correction information for shooting the subject as if the subject is level.
- In step S306, the display processing unit 3 displays, on the display unit 2, information (a cutout frame) indicating the cutout size within the captured image for making the correction in accordance with the rotation angle of the display apparatus 100 (see FIG. 2C).
- the cutout size is calculated by the control unit 5 based on the angle information detected by the angle detection unit 4 .
- In step S307, the control unit 5 determines whether the user has issued an instruction for ending the angle correction mode. If the instruction for ending the angle correction mode has not been issued, the processing returns to step S302, and similar processes are repeated. If the instruction for ending the angle correction mode has been issued, the processing proceeds to step S308.
- In step S308, the control unit 5 generates a rotation instruction command to be issued to the image capturing apparatus 200.
- the rotation instruction command includes angle information for electronic image correction by the image capturing apparatus 200 . It will be assumed that the rotation instruction command includes, for example, information indicating the gravitational direction detected by the angle detection unit 4 , or information related to an angle (orientation) with respect to the gravitational direction around the rotation axis orthogonal to a display surface of the display unit 2 .
- In step S309, the control unit 5 transmits the rotation instruction command to the image capturing apparatus 200 via the communication processing unit 1, and then the processing of the present flowchart is ended.
- the captured image that has been corrected in response to the rotation instruction command can be received from the image capturing apparatus 200 and displayed on the display unit 2 .
- the user of the display apparatus 100 can confirm whether an intended orientation has been achieved through correction, and perform a correction operation again by switching back to the angle correction mode if necessary.
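- Purely as an illustration of steps S301 to S309, the display-apparatus side could be organized along the following lines; the four collaborator objects (camera_link, angle_sensor, screen, user_input) and their methods are hypothetical and not part of the patent.

```python
import time

def run_angle_correction_mode(camera_link, angle_sensor, screen, user_input) -> None:
    """Sketch of the display-apparatus loop of FIG. 3A (first embodiment)."""
    while not user_input.end_requested():                     # S307
        frame, camera_roll = camera_link.receive_frame()      # S302: image + horizontal info
        screen.show(frame)                                    # S303: display captured image
        screen.draw_camera_horizon(camera_roll)               # S303: camera's horizontal line
        device_roll = angle_sensor.screen_roll_deg()          # S304: detect display attitude
        screen.draw_grid(level_offset_deg=-device_roll)       # S305: grid kept level
        screen.draw_cutout_frame(device_roll)                 # S306: cutout frame
        time.sleep(1.0 / 30.0)                                # live view at roughly 30 fps
    # S308-S309: send the rotation instruction command with the final angle.
    camera_link.send_rotation_command(angle_sensor.screen_roll_deg())
```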
- In step S310, the control unit 8 of the image capturing apparatus 200 determines whether the user has issued the instruction for switching to the angle correction mode.
- This switching instruction is, for example, remotely issued via the display apparatus 100 . If the instruction for switching to the angle correction mode has not been issued, processing of the present flowchart is ended. If the instruction for switching to the angle correction mode has been issued, the processing proceeds to step S 311 .
- In step S311, the control unit 8 transmits the captured image to the display apparatus 100 via the communication processing unit 9. It also transmits, to the display apparatus 100, the above-described horizontal information detected by the horizontal detection unit 13, which indicates the attitude of the image capturing apparatus 200.
- In step S312, the control unit 8 determines whether the rotation instruction command has been received from the display apparatus 100. If the rotation instruction command has not been received, the processing returns to step S311, and similar processes are repeated. These processes are repeated in a cycle of a predetermined frame rate (e.g., 30 fps) so that captured images are visible in the form of a live view on the display apparatus 100. If the rotation instruction command has been received, the processing proceeds to step S313.
- In step S313, the image processing unit 7 corrects the tilt of the captured image by executing electronic image processing based on the angle information included in the rotation instruction command.
- The captured image whose tilt has been corrected is transmitted to the display apparatus 100 to be reviewed on the display apparatus 100.
- Furthermore, information for correction, including the angle information in the rotation instruction command, is stored (recorded) to the ROM or RAM in the image capturing apparatus 200, and then the processing of the present flowchart is ended. Thereafter, when the image capturing apparatus 200 captures still images or moving images and records or transmits the captured images, the image processing unit 7 makes the correction based on the stored information (the information for correction, including the angle information in the rotation instruction command). Then, the corrected captured images are recorded to the recording unit 10, or transmitted via the communication processing unit 9, as still images or moving images. That is to say, a plurality of images that are captured after receiving the rotation instruction command are corrected based on the stored information.
- As a result, the tilted captured image shown in FIG. 2B is corrected in accordance with the rotation angle of the display apparatus 100, as shown in FIG. 2C.
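- On the image-capturing-apparatus side, steps S312 and S313 amount to storing the received angle and applying it to every subsequent capture before recording or transmission. The class below is a minimal sketch under that reading; the class, field, and method names are illustrative, and the corrector argument stands in for something like the correct_tilt() sketch shown earlier.

```python
class AngleCorrectionState:
    """Holds the last rotation instruction received from the display apparatus
    (steps S312-S313); names are illustrative, not from the patent."""

    def __init__(self, corrector) -> None:
        self.corrector = corrector   # e.g. the correct_tilt() sketch shown earlier
        self.roll_deg = 0.0
        self.active = False

    def on_rotation_command(self, roll_deg: float) -> None:
        # Store the correction information carried by the rotation instruction command.
        self.roll_deg = roll_deg
        self.active = True

    def apply(self, image):
        # Correct every subsequently captured still/moving image before it is
        # recorded to the recording unit 10 or transmitted.
        return self.corrector(image, self.roll_deg) if self.active else image
```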
- In the above-described flow, the display apparatus 100 transmits the rotation instruction command to the image capturing apparatus 200 after the issuance of the instruction for ending the angle correction mode in step S307.
- Alternatively, the display apparatus 100 may generate the rotation instruction command and transmit it to the image capturing apparatus 200 in real time, without waiting for the instruction for ending the angle correction mode. In this way, the image capturing apparatus 200 can correct captured images immediately following rotation of the display apparatus 100.
- This modification is described with reference to FIGS. 4A and 4B, which respectively correspond to FIGS. 3A and 3B.
- FIG. 4A is the same as FIG. 3A , except that steps S 308 and S 309 precede step S 307 .
- FIG. 4B is the same as FIG. 3B , except that step S 401 is added after step S 313 .
- In step S401, the control unit 8 determines whether the user has issued the instruction for ending the angle correction mode. This ending instruction is, for example, remotely issued via the display apparatus 100. If the instruction for ending the angle correction mode has not been issued, the processing returns to step S311, and similar processes are repeated.
- If the instruction for ending the angle correction mode has been issued, information for correction, including the angle information in the rotation instruction command, is stored (recorded) to the ROM or RAM in the image capturing apparatus 200, and then the processing of the present flowchart is ended.
- Thereafter, when the image capturing apparatus 200 captures still images or moving images, the image processing unit 7 makes the correction based on the stored information (the information for correction, including the angle information in the rotation instruction command).
- Then, the corrected captured images are recorded to the recording unit 10, or transmitted via the communication processing unit 9, as still images or moving images.
- As described above, according to the first embodiment, the display apparatus 100 displays a captured image received from the image capturing apparatus 200 on the display unit 2. The display apparatus 100 then transmits angle information indicating a rotation angle of the display apparatus 100 (a rotation angle around the rotation axis orthogonal to the display screen of the display unit 2) to the image capturing apparatus 200, and the image capturing apparatus 200 corrects a tilt of the captured image in accordance with the angle information.
- In this way, the user of the display apparatus, which receives the captured image from the image capturing apparatus and displays the received captured image, can correct the captured image intuitively.
- the first embodiment has described a configuration in which a tilt of an image captured by the image capturing apparatus 200 is electronically corrected using angle information indicating a rotation angle of the display apparatus 100 .
- the second embodiment describes a configuration in which a keystone effect of a captured image is corrected by the user tilting the display apparatus 100 forward and backward.
- Basic configurations of the display apparatus 100 and the image capturing apparatus 200 according to the second embodiment are similar to the configurations shown in FIG. 1 according to the first embodiment, and thus a detailed description of such configurations will be omitted. The following description focuses mainly on portions that differ from the first embodiment.
- the second embodiment is similar to the first embodiment in that a captured image is corrected in accordance with an angle of the display apparatus 100 .
- an “angle of the display apparatus 100 ” corresponds to a forward/backward tilt angle of the display apparatus 100 (strictly speaking, a forward/backward tilt angle of the display screen of the display unit 2 ).
- the “forward/backward tilt” mentioned here corresponds to rotation of the display screen around a rotation axis that is in plane with the display screen of the display unit 2 and parallel to a transverse line of the display screen (see a rotation axis 902 in FIG. 9B ).
- a tilt angle of a case in which an up-down line (a line extending in an up-down direction) of the display screen is perpendicular to the ground surface is defined as 0 degrees.
- a direction that is “perpendicular to the ground surface” means the gravitational direction detected by the angle detection unit 4 of the display apparatus 100 .
- the display apparatus 100 transmits a keystone effect instruction command including the angle information to the image capturing apparatus 200 .
- the image capturing apparatus 200 corrects a keystone effect by correcting a captured image in accordance with the angle information.
- FIGS. 6A and 6B pertain to the display apparatus 100 and the image capturing apparatus 200 , respectively.
- steps of performing processes that are identical or similar to processes of FIGS. 3A and 3B are given the same reference numerals thereas.
- Processes of steps in a flowchart of FIG. 6A are realized as a result of the control unit 5 controlling blocks of the display apparatus 100 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise.
- processes of steps in a flowchart of FIG. 6B are realized as a result of the control unit 8 controlling blocks of the image capturing apparatus 200 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise.
- In step S604, the control unit 5 detects the angle information output from the angle detection unit 4.
- This angle information indicates a forward/backward tilt angle of the display screen of the display unit 2 (see the rotation axis 902 in FIG. 9B ).
- In step S605, based on the angle information detected in step S604, the control unit 5 calculates auxiliary information that is used by the user as a guide for correction of the captured image.
- the auxiliary information denotes grid-like auxiliary lines including two vertical auxiliary lines in particular.
- the display processing unit 3 displays the auxiliary lines by superimposing the auxiliary lines over the captured image displayed on the display unit 2 .
- tilts (angles) of the two vertical auxiliary lines change in accordance with the tilt angle of the display apparatus 100 , and an interval between upper portions of the two vertical auxiliary lines differs from an interval between lower portions of the two vertical auxiliary lines.
- When the tilt angle of the display apparatus 100 is 0 degrees, the two vertical auxiliary lines are displayed in such a manner that they are parallel to the up-down direction of the display screen; that is to say, the two displayed vertical auxiliary lines are parallel to each other.
- When the top portion of the display apparatus 100 is swung backward, the interval between the upper portions of the two vertical auxiliary lines is small, whereas the interval between the lower portions of the two vertical auxiliary lines is large.
- When the top portion of the display apparatus 100 is swung forward, the interval between the upper portions of the two vertical auxiliary lines is large, whereas the interval between the lower portions of the two vertical auxiliary lines is small.
- the user tilts the display apparatus 100 forward/backward until the two vertical auxiliary lines reach an angle with which the user wants to vertically correct a subject.
- the user tilts the display apparatus 100 forward/backward so that a subject's vertical line V is parallel to the vertical auxiliary lines.
- the user can correct the captured image by adjusting an angle formed by a desired portion of the subject and the vertical auxiliary lines through rotation of the display apparatus 100 .
- The processes of steps S608 and S609 are similar to the processes of steps S308 and S309 in FIG. 3A, except that the command is a “keystone effect instruction command”.
- angle information included in the keystone effect instruction command includes, for example, information indicating the gravitational direction detected by the angle detection unit 4 , or information related to an angle (orientation) of the display surface of the display unit 2 with respect to the gravitational direction.
- the captured image that has been corrected in accordance with the keystone effect instruction command can be received from the image capturing apparatus 200 and displayed on the display unit 2 . In this way, the user of the display apparatus 100 can confirm whether the keystone effect has been corrected as intended, and perform a correction operation again by switching back to the angle correction mode if necessary.
- The process of step S612 is similar to the process of step S312 in FIG. 3B, except that the command is the “keystone effect instruction command”.
- In step S613, the image processing unit 7 corrects the keystone effect of the captured image by executing electronic image processing based on the angle information in the keystone effect instruction command.
- The captured image whose keystone effect has been corrected is transmitted to the display apparatus 100 to be reviewed on the display apparatus 100.
- Furthermore, information for correction, including the angle information in the keystone effect instruction command, is stored (recorded) to the ROM or the RAM in the image capturing apparatus 200, and then the processing of the present flowchart is ended.
- Thereafter, when the image capturing apparatus 200 captures still images or moving images and records or transmits them, the image processing unit 7 makes the correction based on the stored information (the information for correction, including the angle information in the keystone effect instruction command). Then, the corrected captured images are recorded to the recording unit 10, or transmitted via the communication processing unit 9, as still images or moving images. That is to say, a plurality of images that are captured after receiving the keystone effect instruction command are corrected based on the stored information.
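- The patent leaves the keystone-effect correction itself to electronic image processing; as one hedged illustration, a forward/backward tilt angle could be turned into a perspective (trapezoidal) warp as follows. The mapping from angle to edge stretch, the sign convention, and the use of OpenCV are all assumptions.

```python
import cv2
import numpy as np

def correct_keystone(image: np.ndarray, pitch_deg: float) -> np.ndarray:
    """Apply a simple trapezoidal warp driven by the received tilt angle."""
    h, w = image.shape[:2]
    # Crude model: widen the edge nearer to the viewer in proportion to the tilt.
    shift = w * 0.5 * np.tan(np.radians(abs(pitch_deg)))
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    if pitch_deg >= 0:   # top of the display swung backward: stretch the top edge
        dst = np.float32([[-shift, 0], [w + shift, 0], [w, h], [0, h]])
    else:                # top of the display swung forward: stretch the bottom edge
        dst = np.float32([[0, 0], [w, 0], [w + shift, h], [-shift, h]])
    matrix = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, matrix, (w, h))
```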
- As described above, according to the second embodiment, the display apparatus 100 displays a captured image received from the image capturing apparatus 200 on the display unit 2, and transmits angle information indicating a tilt angle of the display apparatus 100 (a forward/backward tilt angle of the display screen of the display unit 2) to the image capturing apparatus 200.
- The image capturing apparatus 200 corrects a keystone effect of the captured image in accordance with the angle information. In this way, the user can correct the keystone effect of the captured image intuitively.
- Note that the first embodiment and the second embodiment can be combined. In that case, the angle information transmitted from the display apparatus 100 to the image capturing apparatus 200 includes the information that is necessary for both rotational correction for a captured image and keystone effect correction for a captured image.
- the image capturing apparatus 200 applies both rotational correction and keystone effect correction based on angle information included in a rotation instruction command and a keystone effect instruction command received from the display apparatus 100 .
- the first and second embodiments have described a configuration in which a tilt or a keystone effect of an image captured by the image capturing apparatus 200 is corrected during shooting.
- the third embodiment describes a configuration for writing angle information to an image file, thereby enabling correction during reproduction or during image post-processing.
- Basic configurations of the display apparatus 100 and the image capturing apparatus 200 according to the third embodiment are similar to the configurations shown in FIG. 1 according to the first embodiment, and thus a detailed description of such configurations will be omitted.
- the following description focuses mainly on portions that differ from the first embodiment.
- Although the present embodiment is described in the context of correcting a tilt of a captured image, it is also similarly applicable to the case of correcting a keystone effect of a captured image (i.e., the context of the second embodiment).
- the third embodiment is similar to the first embodiment in that the display apparatus 100 transmits a rotation instruction command including angle information to the image capturing apparatus 200 (see FIG. 7A ).
- the image capturing apparatus 200 writes the angle information to an image file of a captured image, instead of correcting the captured image in response to the rotation instruction command.
- the image capturing apparatus 200 corrects the image in accordance with the angle information during reproduction of the image file. That is to say, the image capturing apparatus 200 records the captured image in association with the angle information.
- the user can instruct the image capturing apparatus 200 whether to correct the image in accordance with the angle information via the operation unit 11 .
- A description is now given of the operations of the image capturing apparatus 200 according to the third embodiment with reference to FIG. 8.
- Operations of the display apparatus 100 are similar to those according to the first embodiment (see FIG. 3A ).
- steps of performing processes that are identical or similar to processes of FIG. 3B are given the same reference numerals thereas.
- Processes of steps in a flowchart shown in FIG. 8 are realized as a result of the control unit 8 controlling blocks of the image capturing apparatus 200 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise.
- In step S810, the control unit 8 of the image capturing apparatus 200 determines whether the current operation mode of the image capturing apparatus 200 is a shooting mode. If the current operation mode is the shooting mode, the processing proceeds to step S310.
- In step S811, which follows steps S310 to S312, the control unit 8 records the captured image as an image file and writes the angle information to the image file via the recording unit 10.
- If it is determined in step S810 that the operation mode of the image capturing apparatus 200 is not the shooting mode (i.e., is a reproduction mode), the processing proceeds to step S812.
- In step S812, the control unit 8 determines whether the image file to be reproduced includes angle information. The processing proceeds to step S813 if the image file includes the angle information, and to step S815 if the image file does not include the angle information.
- In step S813, the control unit 8 determines whether correction processing based on the angle information is in an ON state.
- the user can switch between ON and OFF of the correction processing via the operation unit 11 .
- the processing proceeds to step S 814 if the correction processing is in the ON state, and to step S 815 if the correction processing is not in the ON state.
- In step S814, the image processing unit 7 corrects the image in the image file in accordance with the angle information, and displays the corrected image on the display unit 12.
- the image is, for example, a moving image.
- the image processing unit 7 performs normal reproduction without making correction in step S 815 .
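- As an illustration of this third-embodiment flow (S811 and S812 to S815), the angle information could be kept alongside the recorded file and applied only at reproduction time. The patent writes the angle into the image file itself; the JSON sidecar, the file naming, and the corrector argument (e.g. the correct_tilt() sketch shown earlier) used below are assumptions made for brevity.

```python
import json
import os

import cv2

def record_angle_info(image_path: str, angle_deg: float) -> None:
    # S811: the image file is assumed to be written elsewhere; here only the
    # received angle information is stored next to it.
    with open(image_path + ".angle.json", "w", encoding="utf-8") as f:
        json.dump({"rotation_deg": angle_deg}, f)

def reproduce(image_path: str, correction_on: bool, corrector):
    """S812-S815: correct during reproduction only if angle information exists
    and the correction setting is ON; otherwise reproduce the image as-is."""
    image = cv2.imread(image_path)
    sidecar = image_path + ".angle.json"
    if correction_on and os.path.exists(sidecar):            # S812, S813
        with open(sidecar, encoding="utf-8") as f:
            angle = json.load(f)["rotation_deg"]
        return corrector(image, angle)                       # S814
    return image                                             # S815: normal reproduction
```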
- As described above, according to the third embodiment, a tilt of an image captured by the image capturing apparatus 200 can be corrected during reproduction, instead of during shooting.
- the tilt can be corrected not only during reproduction, but also in post-processing with the use of an image correction tool.
- In this way, the user can not only decide after shooting whether to make the correction, but can also make that decision in consideration of the tradeoff between the correction and the correction-caused reduction in the cutout size within the captured image.
- In the above-described embodiments, the display apparatus 100 transmits, to the image capturing apparatus 200, attitude information indicating a tilt of the display apparatus 100 as information used for the rotation and cutout performed by the image capturing apparatus 200; however, no limitation is intended in this regard.
- the display apparatus 100 is configured to receive an image captured by the image capturing apparatus 200 (an image to be recorded as an image file)
- rotation and cutout may be performed by the display apparatus 100 , instead of by the image capturing apparatus 200 .
- attitude information (the information described in the above embodiments—angle information, rotation instruction command) indicating a tilt of the display apparatus 100 , which is generated in response to an instruction from the user, is stored to the RAM of the control unit 5 without being transmitted to the image capturing apparatus 200 .
- steps S 313 , S 613 , and S 811 which have been described above as processes executed by the image capturing apparatus 200 , are applied to the image received from the image capturing apparatus 200 based on the information stored in the RAM, the resultant image can be recorded as the image file.
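- A sketch of this display-apparatus-side variant, with a plain Python object standing in for the RAM of the control unit 5 and OpenCV standing in for the rotation-and-cutout processing; all names are illustrative stand-ins rather than the patent's implementation:

```python
import math
import cv2
import numpy as np

class DisplaySideCorrector:
    """Keep the angle information locally and apply the rotation and cutout in
    the display apparatus instead of sending a command to the camera."""

    def __init__(self) -> None:
        self.stored_angle_deg = None       # stands in for the control unit 5 RAM

    def store(self, angle_deg: float) -> None:
        # Called when the user confirms the attitude of the display apparatus.
        self.stored_angle_deg = angle_deg

    def apply_and_record(self, received: np.ndarray, path: str) -> None:
        if self.stored_angle_deg is None:
            cv2.imwrite(path, received)    # nothing stored: record as received
            return
        h, w = received.shape[:2]
        a = self.stored_angle_deg
        m = cv2.getRotationMatrix2D((w / 2, h / 2), a, 1.0)
        rotated = cv2.warpAffine(received, m, (w, h))
        # Cut out the largest same-aspect region that contains no blank corners.
        c, s = abs(math.cos(math.radians(a))), abs(math.sin(math.radians(a)))
        k = min(w / (w * c + h * s), h / (w * s + h * c))
        cw, ch = int(w * k), int(h * k)
        x0, y0 = (w - cw) // 2, (h - ch) // 2
        cv2.imwrite(path, rotated[y0:y0 + ch, x0:x0 + cw])
```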
- Moreover, the display apparatus 100 and the image capturing apparatus 200 may be controlled by a single item of hardware, or the apparatuses as a whole may be controlled by a plurality of items of hardware that share the processing.
- Although the above embodiments have described an example in which the present invention is applied to a display apparatus such as a smartphone and an image capturing apparatus such as a digital camera, the present invention is not limited to such an example. The present invention is applicable to any type of apparatus that receives and displays a captured image, and to any type of apparatus that captures and transmits an image. For example, it is applicable to a personal computer, a PDA, a mobile telephone terminal, a mobile image viewer, a display-equipped printer apparatus, a digital photo frame, a music player, a game console, and an electronic book reader.
- Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
- the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
Description
- 1. Field of the Invention
- The present invention relates to a display apparatus, an image capturing apparatus, an image capturing system, a control method for the display apparatus, a control method for the image capturing apparatus, and a storage medium.
- 2. Description of the Related Art
- Recent years have witnessed the development and widespread use of image capturing apparatuses, such as digital still cameras and digital camcorders, which can be remotely operated via external terminals, such as smartphones, using wireless communication functions such as Wi-Fi (Wireless Fidelity). Remote operation functions of such external terminals enable users to operate the image capturing apparatuses from a location far from the image capturing apparatuses. With such functions, the users can not only confirm images captured by the image capturing apparatuses, but also perform shooting start instructions and setting of shooting parameters.
- For example, when remotely operating an image capturing apparatus using an external terminal, a user may notice inconsistency between a horizontal direction in an image captured by the image capturing apparatus and a horizontal direction of a subject while confirming the captured image. However, if there is a large distance between a photographer (user) operating the external terminal and the image capturing apparatus, it may be difficult for the user to correct the position and attitude of the image capturing apparatus.
- In view of this, a method is suggested in which the angle of view of a captured image is electronically corrected by a user issuing a rotation angle instruction to an image capturing apparatus through a remote operation (see Japanese Patent Laid-Open No. 2007-228097). Another method is also suggested in which a horizontal direction of an image capturing apparatus is electronically and automatically corrected using tilt information of the image capturing apparatus (see Japanese Patent Laid-Open No. 2012-147071).
- According to Japanese Patent Laid-Open No. 2007-228097, a tilt of a captured image is electronically corrected by a client terminal transmitting angle information for correcting the tilt of the captured image to an image capturing apparatus connected to a network. With this method, a user determines an angle used in correcting the tilt of the captured image by operating a slide bar displayed on the client terminal. The image capturing apparatus at a remote location receives the angle information, and electronically corrects the captured image. In this way, the tilt of the captured image can be corrected through a remote operation.
- According to Japanese Patent Laid-Open No. 2012-147071, an image capturing apparatus detects a tilt angle of itself, and corrects a captured image using detected angle information. This method detects a tilt of the image capturing apparatus itself using, for example, an angle sensor within the image capturing apparatus. The image capturing apparatus electronically corrects the captured image using the detected angle information. In this way, a tilt of the captured image can be automatically corrected.
- However, with Japanese Patent Laid-Open No. 2007-228097, it is difficult for the user to intuitively grasp a relationship between an operation amount of the slide bar and a rotation angle, and hence to correct the tilt of the captured image quickly and accurately. On the other hand, Japanese Patent Laid-Open No. 2012-147071 does not support a case in which the image capturing apparatus is level whereas a subject is tilted.
- The present invention has been made in view of the above situations, and provides a technique to enable intuitive correction of a captured image by a user of a display apparatus that receives the captured image from an image capturing apparatus and displays the received captured image.
- According to a first aspect of the present invention, there is provided a display apparatus, comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; and a transmission unit configured to, while the display unit is displaying the captured image, transmit attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit.
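- For illustration only, one common way such a detection unit could derive an attitude from a 3-axis accelerometer is sketched below; the axis convention (x to the right of the screen, y up, z out of the screen) is an assumption and is not taken from the patent:

```python
import math
from typing import Tuple

def roll_pitch_from_accel(ax: float, ay: float, az: float) -> Tuple[float, float]:
    """Estimate the display apparatus's attitude from an accelerometer reading.
    Only the direction of the measured gravity reaction matters, not its units."""
    roll = math.degrees(math.atan2(ax, ay))                    # about the screen normal
    pitch = math.degrees(math.atan2(az, math.hypot(ax, ay)))   # forward/backward tilt
    return roll, pitch

# A device held upright and level reports the gravity reaction along +y only:
print(roll_pitch_from_accel(0.0, 9.8, 0.0))   # -> (0.0, 0.0)
```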
- According to a second aspect of the present invention, there is provided an image capturing apparatus, comprising: an image capturing unit configured to generate a captured image; a transmission unit configured to transmit the captured image to a display apparatus; a reception unit configured to receive attitude information indicating an attitude of the display apparatus from the display apparatus; and a processing unit configured to apply specific processing to the captured image based on the attitude information received by the reception unit, and record the captured image.
- According to a third aspect of the present invention, there is provided an image capturing system including a display apparatus and an image capturing apparatus, the display apparatus comprising: a reception unit configured to receive a captured image from the image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; and a transmission unit configured to, while the display unit is displaying the captured image, transmit attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit, the image capturing apparatus comprising: an image capturing unit configured to generate the captured image; a transmission unit configured to transmit the captured image to the display apparatus; a reception unit configured to receive the attitude information indicating the attitude of the display apparatus from the display apparatus; and a processing unit configured to apply specific processing to the captured image based on the attitude information received by the reception unit of the image capturing apparatus, and record the captured image.
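- The units recited above can be pictured as the following minimal Python interfaces. The class and method names are illustrative stand-ins that merely mirror the claimed units; they are not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class AttitudeInfo:
    """Attitude of the display apparatus at the moment the user confirms it."""
    roll_deg: float    # rotation about the axis orthogonal to the display screen
    pitch_deg: float   # forward/backward tilt of the display screen

class DisplayApparatus:
    def receive_captured_image(self, data: bytes) -> bytes: ...   # reception unit
    def display(self, image: bytes) -> None: ...                  # display unit
    def detect_attitude(self) -> AttitudeInfo: ...                # detection unit
    def transmit_attitude(self, info: AttitudeInfo) -> None: ...  # transmission unit

class ImageCapturingApparatus:
    def capture(self) -> bytes: ...                               # image capturing unit
    def transmit_captured_image(self, image: bytes) -> None: ...  # transmission unit
    def receive_attitude(self) -> AttitudeInfo: ...               # reception unit
    def process_and_record(self, image: bytes,
                           info: AttitudeInfo) -> None: ...       # processing unit
```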
- According to a fourth aspect of the present invention, there is provided a control method for a display apparatus including a display unit and an attitude detection unit, the control method comprising: receiving a captured image from an image capturing apparatus; displaying the captured image on the display unit; detecting an attitude of the display apparatus with the attitude detection unit; and while displaying the captured image on the display unit, transmitting attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detecting.
- According to a fifth aspect of the present invention, there is provided a control method for an image capturing apparatus, the control method comprising: generating a captured image; transmitting the captured image to a display apparatus; receiving attitude information indicating an attitude of the display apparatus from the display apparatus; and applying specific processing to the captured image based on the attitude information received by the receiving, and recording the captured image.
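- A hedged sketch of this control method as a camera-side loop: captured images are transmitted continuously, and once attitude information arrives it is applied to subsequent images before recording. The transport and correction helpers are placeholders introduced only for this sketch:

```python
import queue
import numpy as np

def transmit_to_display(frame: np.ndarray) -> None:
    pass    # placeholder for the transmission over the network

def apply_attitude_correction(frame: np.ndarray, attitude) -> np.ndarray:
    return frame    # placeholder for the specific processing based on the attitude

def camera_control_loop(frames, command_queue: "queue.Queue", recorder) -> None:
    """Transmit each captured image; once attitude information is received,
    apply it to that image and to the images captured afterwards, then record."""
    stored_attitude = None
    for frame in frames:                                   # generating captured images
        transmit_to_display(frame)
        try:
            stored_attitude = command_queue.get_nowait()   # attitude info received
        except queue.Empty:
            pass
        if stored_attitude is not None:
            frame = apply_attitude_correction(frame, stored_attitude)
        recorder(frame)                                    # record the image
```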
- According to a sixth aspect of the present invention, there is provided a display apparatus, comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; a storage unit configured to, while the display unit is displaying the captured image, store attitude information in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit; and a processing unit configured to apply specific processing to the captured image received from the image capturing apparatus based on the attitude information stored by the storage unit, and record the captured image.
- According to a seventh aspect of the present invention, there is provided a control method for a display apparatus, the control method comprising: receiving a captured image from an image capturing apparatus; displaying the captured image on a display unit of the display apparatus; detecting an attitude of the display apparatus; while displaying the captured image on the display unit, storing attitude information in response to an instruction from a user, the attitude information indicating the attitude detected by the detecting; and applying specific processing to the captured image received from the image capturing apparatus based on the stored attitude information, and recording the captured image.
- According to an eighth aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to function as a display apparatus comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; and a transmission unit configured to, while the display unit is displaying the captured image, transmit attitude information to the image capturing apparatus in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit.
- According to a ninth aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to function as an image capturing apparatus, comprising: an image capturing unit configured to generate a captured image; a transmission unit configured to transmit the captured image to a display apparatus; a reception unit configured to receive attitude information indicating an attitude of the display apparatus from the display apparatus; and a processing unit configured to apply specific processing to the captured image based on the attitude information received by the reception unit, and record the captured image.
- According to a tenth aspect of the present invention, there is provided a non-transitory computer-readable storage medium which stores a program for causing a computer to function as a display apparatus comprising: a reception unit configured to receive a captured image from an image capturing apparatus; a display unit configured to display the captured image; a detection unit configured to detect an attitude of the display apparatus; a storage unit configured to, while the display unit is displaying the captured image, store attitude information in response to an instruction from a user, the attitude information indicating the attitude detected by the detection unit; and a processing unit configured to apply specific processing to the captured image received from the image capturing apparatus based on the attitude information stored by the storage unit, and record the captured image.
- Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
- FIG. 1 is a block diagram showing schematic configurations of a display apparatus 100 and an image capturing apparatus 200.
- FIGS. 2A to 2C show an overview of a first embodiment.
- FIGS. 3A and 3B are flowcharts showing operations of the display apparatus 100 and the image capturing apparatus 200 according to the first embodiment.
- FIGS. 4A and 4B are flowcharts showing a modification example of FIGS. 3A and 3B.
- FIGS. 5A to 5C show an overview of a second embodiment.
- FIGS. 6A and 6B are flowcharts showing operations of the display apparatus 100 and the image capturing apparatus 200 according to the second embodiment.
- FIGS. 7A and 7B show an overview of a third embodiment.
- FIG. 8 is a flowchart showing operations of the image capturing apparatus 200 according to the third embodiment.
- FIG. 9A shows a rotation axis 901 of the display apparatus 100 according to the first embodiment.
- FIG. 9B shows a rotation axis 902 of the display apparatus 100 according to the second embodiment.
- Embodiments of the present invention will now be described with reference to the attached drawings. It should be noted that the technical scope of the present invention is defined by the claims, and is not limited by any of the embodiments described below. In addition, not all combinations of the features described in the embodiments are necessarily required for realizing the present invention.
-
FIG. 1 is a block diagram showing schematic configurations of a display apparatus 100 and an image capturing apparatus 200 included in each of the image capturing systems according to the embodiments. The display apparatus 100 is, for example, a mobile terminal (e.g., a smartphone) that can be held in a hand of a user, and is used by the user to remotely operate the image capturing apparatus 200. The image capturing apparatus 200 is, for example, a digital camera, and has a function of correcting a tilt and a keystone effect of a captured image through electronic image processing.
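- As a rough illustration of the electronic tilt and keystone correction mentioned above, the following Python sketch (assuming OpenCV is available) rotates an image about its centre and applies a simple perspective remap; the mapping from the tilt angle to the keystone warp is an assumption made for this sketch, not the patent's own formula:

```python
import cv2
import numpy as np

def correct_tilt(image: np.ndarray, roll_deg: float) -> np.ndarray:
    """Rotate the image content about its centre to undo a tilt of roll_deg."""
    h, w = image.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), roll_deg, 1.0)
    return cv2.warpAffine(image, m, (w, h))

def correct_keystone(image: np.ndarray, pitch_deg: float,
                     strength: float = 0.5) -> np.ndarray:
    """Very rough keystone correction: shift the top corners inward or outward
    in proportion to the forward/backward tilt angle."""
    h, w = image.shape[:2]
    shift = strength * w * np.tan(np.radians(pitch_deg)) * 0.5
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = np.float32([[shift, 0], [w - shift, 0], [w, h], [0, h]])
    m = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, m, (w, h))
```
- With the aid of a communication processing unit 1, the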
display apparatus 100 communicates with theimage capturing apparatus 200 connected to a network. Thedisplay apparatus 100 receives an image captured by theimage capturing apparatus 200 via the network. Thedisplay apparatus 100 also transmits attitude information indicating an attitude of thedisplay apparatus 100 to theimage capturing apparatus 200. In the embodiments described below, it will be assumed that an angle of thedisplay apparatus 100 is used as an attitude of thedisplay apparatus 100, and attitude information is also referred to as angle information. Thedisplay apparatus 100 receives an image captured by theimage capturing apparatus 200 with the aid of the communication processing unit 1, and displays the received captured image on adisplay unit 2. At this time, adisplay processing unit 3 converts the captured image into information that can be displayed on thedisplay unit 2 of thedisplay apparatus 100. Thedisplay processing unit 3 also generates display data for displaying the angle information of thedisplay apparatus 100 on thedisplay unit 2. An angle detection unit 4 (attitude detection unit) detects a positional attitude of thedisplay apparatus 100, and converts the result of detection into the angle information. An acceleration sensor, a gyroscope, or the like can be used as theangle detection unit 4. Acontrol unit 5 includes a nonvolatile ROM and a volatile RAM, and controls thedisplay apparatus 100 by executing a control program stored in the ROM. The RAM in thecontrol unit 5 is used as a working memory for execution of the control program by thecontrol unit 5. AlthoughFIG. 1 shows thedisplay processing unit 3 as an independent block, thecontrol unit 5 may execute processing of thedisplay processing unit 3. - The
image capturing apparatus 200 obtains captured image data by capturing an image of a subject with the aid of an image sensor included in an image capture processing unit 6. Animage processing unit 7 generates a captured image based on the captured image data. Theimage processing unit 7 also electronically corrects the captured image based on the angle information received from thedisplay apparatus 100 via a communication processing unit 9. The captured image is recorded to arecording unit 10. Theimage capturing apparatus 200 also includes anoperation unit 11 for performing a shooting operation, configuring menu settings, and the like on theimage capturing apparatus 200, as well as adisplay unit 12 for confirming the captured image, shooting information, and the like. Theimage capturing apparatus 200 further includes the communication processing unit 9 for transmitting the captured image to thedisplay apparatus 100, or for receiving the angle information of thedisplay apparatus 100. Acontrol unit 8 includes a nonvolatile ROM and a volatile RAM, and controls theimage capturing apparatus 200 by executing a control program stored in the ROM. The RAM in thecontrol unit 8 is used as a working memory for execution of the control program by thecontrol unit 8. - Constituent elements of the
image capturing apparatus 200 will now be described in more detail. The image capture processing unit 6 is composed of an optical unit including a plurality of lenses, a diaphragm, and the like, the image sensor, a driver for driving the image sensor, a timing generation circuit, a CDS/AGC circuit, and the like. The optical unit includes the diaphragm for adjusting an amount of incident light from outside, and a neutral density (ND) filter. The image capture processing unit 6 drives a lens assembly with respect to an optical axis so as to, for example, focus on the subject and reduce blur in the captured image caused by a hand shake and the like. The image sensor captures an image of the subject through photoelectric conversion, and with the use of the CDS/AGC circuit, samples and amplifies image information based on electric charges (image signals) accumulated in pixels of the image sensor. Note that correlated double sampling (CDS) and auto gain control (AGC) are performed in sampling and amplification, respectively. An A/D converter converts the image information (analog signal) output from the CDS/AGC circuit into a digital signal. Theimage processing unit 7 applies various types of signal processing, including auto white balance (AWB) and gamma control etc., to the image information (digital signal) output from the A/D converter, thereby generating the captured image. The driver for driving the image sensor and the timing generation circuit feed, for example, driving pulses for driving the image sensor to the image sensor, and adjust readout of the image captured by the image sensor and an exposure time period. - As stated earlier, the
image processing unit 7 electronically corrects the captured image based on the angle information of thedisplay apparatus 100 received via the communication processing unit 9. It will be assumed that a correction in a direction of rotation and a direction of a keystone effect is performed through image processing. - The
recording unit 10 stores the captured image generated by theimage processing unit 7 as an image file to an internal memory or an external memory (recording medium), such as a memory card. At this time, therecording unit 10 can write the angle information received via the communication processing unit 9 to the image file. - With the use of the
operation unit 11, the user performs a key operation and configures setting menus and the like when shooting with theimage capturing apparatus 200. Thedisplay unit 12 displays a display image generated by theimage processing unit 7. Ahorizontal detection unit 13 detects a tilt of theimage capturing apparatus 200 and outputs angle information. An acceleration sensor, a gyroscope, or the like can be used as thehorizontal detection unit 13. - Note that in
FIG. 1 , each block is illustrated as an independent circuit unit. Alternatively, all or a part of the blocks may be included in thecontrol unit 5 or thecontrol unit 8. - First, an overview of a first embodiment will be described with reference to
FIGS. 2A to 2C . As shown inFIG. 2A , theimage capturing apparatus 200 is placed so as to capture an image of a subject. Theimage processing unit 7 generates a captured image from captured image data obtained by the image capture processing unit 6. In an example ofFIG. 2A , theimage capturing apparatus 200 is placed to be level with respect to a ground surface. From a viewpoint of theimage capturing apparatus 200, a direction that is “level with respect to the ground surface” means a direction that is perpendicular to a gravitational direction detected by thehorizontal detection unit 13 of theimage capturing apparatus 200; hereinafter, it may be simply referred to as a “horizontal direction detected by theimage capturing apparatus 200”. On the other hand, the subject is placed on a slanted surface in a tilted manner. - As shown in
FIG. 2B , theimage capturing apparatus 200 displays the captured image on thedisplay unit 12. Theimage capturing apparatus 200 also displays, on thedisplay unit 12, a horizontal line (a horizontal line of the image capturing apparatus 200) based on angle information output from thehorizontal detection unit 13. As theimage capturing apparatus 200 is placed to be level with respect to the ground surface as stated earlier, the horizontal line is parallel to a transverse direction of the captured image. As the subject is placed on the slanted surface in a tilted manner as stated earlier, the subject included in the captured image is also tilted inFIG. 2B . - Although the
image capturing apparatus 200 is level with respect to the ground surface in the description of the present embodiment, theimage capturing apparatus 200 is not limited to having such an attitude. Theimage capturing apparatus 200 may be placed in such a manner that it is tilted with respect to the ground surface. In this case, the horizontal line is displayed in a tilted manner, with its tilt corresponding to a tilt detected by thehorizontal detection unit 13 of the image capturing apparatus 200 (that is to say, the displayed horizontal line is at an angle to the transverse direction of the captured image). - The
image capturing apparatus 200 transmits the captured image to the communication processing unit 1 of thedisplay apparatus 100 connected to the network via the communication processing unit 9. At this time, theimage capturing apparatus 200 transmits information indicating an attitude of theimage capturing apparatus 200 together with the captured image. It will be assumed that information indicating the horizontal direction of the image capturing apparatus 200 (horizontal information, image capturing apparatus attitude information), which is obtained based on the angle information output from thehorizontal detection unit 13, is transmitted as the information indicating the attitude of theimage capturing apparatus 200. As shown inFIG. 2B , thedisplay apparatus 100 converts the captured image received from theimage capturing apparatus 200 via the communication processing unit 1 into a format that can be displayed on thedisplay unit 2 with the aid of thedisplay processing unit 3, and displays the result of conversion on thedisplay unit 2. - A photographer (user) determines angle information for correcting the tilt of the captured image by rotating the
display apparatus 100 while confirming the captured image being displayed on thedisplay unit 2. This rotation is performed around a rotation axis orthogonal to a display screen of the display unit 2 (see arotation axis 901 inFIG. 9A ). For example, a rotation angle of a case in which a transverse line (a line extending in a left-right direction) of the display screen is level with respect to the ground surface, as shown on the left side ofFIG. 9A , is defined as 0 degrees. From a viewpoint of the display screen (or display apparatus 100), a direction that is “level with respect to the ground surface” means a direction perpendicular to a gravitational direction detected by theangle detection unit 4 of thedisplay apparatus 100; hereinafter, it may also be simply referred to as a “horizontal direction detected by thedisplay apparatus 100”. Thus, when the rotation angle is 0 degrees, horizontal auxiliary lines are parallel to the transverse line of the display screen (as will be elaborated later, the horizontal auxiliary lines are lines extending in a transverse direction among auxiliary lines shown inFIG. 2B ). The “tilt of the captured image” to be corrected denotes “deviation” from the horizontal auxiliary lines of thedisplay apparatus 100 caused by rotation of the captured image around the rotation axis. - When the user rotates the
display apparatus 100, thedisplay unit 2 displays the horizontal line based on the horizontal information received from theimage capturing apparatus 200 and auxiliary information indicating the rotation angle (attitude) of thedisplay apparatus 100, together with the captured image. In the present embodiment, it will be assumed that the auxiliary information denotes grid-like auxiliary lines including horizontal auxiliary lines and vertical auxiliary lines. An angle formed by the horizontal auxiliary lines and the transverse line of the display screen corresponds to the rotation angle of thedisplay apparatus 100. The horizontal line and the auxiliary lines are generated by thedisplay processing unit 3. Based on angle information detected by theangle detection unit 4, thedisplay processing unit 3 updates display so that the horizontal auxiliary lines keep showing the horizontal direction detected by thedisplay apparatus 100 even when thedisplay apparatus 100 has been rotated. In this way, the user can confirm the horizontal direction detected by thedisplay apparatus 100 by viewing the horizontal auxiliary lines. - To correct the tilt of the captured image, the user determines a line that belongs to the subject included in the captured image and that should match the horizontal direction detected by the
display apparatus 100. In an example ofFIG. 2B , a line on which a bottom surface of a flower pot is in contact with a board on which the flower pot is placed (a subject's horizontal line H) is the line that should match the horizontal direction detected by thedisplay apparatus 100. Then, as shown inFIG. 2C , the user rotates thedisplay apparatus 100 so that the subject's horizontal line H matches the horizontal direction detected by thedisplay apparatus 100. At this time, the user can easily correct the tilt (level the subject) of the captured image by rotating thedisplay apparatus 100 so that the subject's horizontal line H is parallel to the horizontal auxiliary lines. Note that the horizontal auxiliary lines are not essential in the present embodiment. Even if the horizontal auxiliary lines are not displayed, the user can level the subject by rotating thedisplay apparatus 100 while viewing the captured image (especially, the subject's horizontal line H) displayed on thedisplay unit 2. - Note that the present embodiment is described under the assumption that, while the
image capturing apparatus 200 is placed to be level with respect to the ground surface, the subject is tilted, thus causing the tilt of the captured image. However, theimage capturing apparatus 200 and the subject are not limited to having such attitudes. For example, as stated earlier, in a case where theimage capturing apparatus 200 is tilted, the horizontal line of theimage capturing apparatus 200 is displayed in a tilted manner inFIG. 2B , with its tilt corresponding to a tilt detected by thehorizontal detection unit 13 of the image capturing apparatus 200 (that is to say, the displayed horizontal line is at an angle to the transverse direction of the captured image). In this case, the user can correct the tilt of the captured image attributed to the tilt of theimage capturing apparatus 200 by rotating thedisplay apparatus 100 so that the horizontal line of theimage capturing apparatus 200 is parallel to the horizontal auxiliary lines. Being able to make such correction is a significant advantage in the case of, for example, remote shooting using thedisplay apparatus 100 while theimage capturing apparatus 200 is fixed in place by a tripod. - On the other hand, in a case where the user wishes to correct the attitude of the subject to a desired attitude irrespective of the attitude of the image capturing apparatus 200 (the tilt thereof with respect to the ground surface), the user need not use the horizontal line of the
image capturing apparatus 200. In this case, it is sufficient for the user to rotate thedisplay apparatus 100 based on the subject's horizontal line H as stated earlier. Depending on the subject types, there may be no element that is equivalent to the subject's horizontal line H; in this case also, the user can correct the captured image by rotating thedisplay apparatus 100 so that a desired portion of the subject has a desired attitude. - After the user has rotated the
display apparatus 100, thedisplay apparatus 100 transmits a rotation instruction command to theimage capturing apparatus 200 via the communication processing unit 1 in response to a user instruction. The rotation instruction command includes information indicating the rotation angle of the display apparatus 100 (angle information). Theimage capturing apparatus 200 receives the rotation instruction command including the angle information via the communication processing unit 9. Then, theimage capturing apparatus 200 applies angle correction (tilt correction) to the captured image by executing electronic image processing with the aid of theimage processing unit 7. In a case where correction is made through image processing, theimage capturing apparatus 200 cuts out a partial region of the captured image. However, the larger the angle of correction, the smaller the region that needs to be cut out from the captured image. In view of this, with the aid of thecontrol unit 5, thedisplay apparatus 100 analyzes an image size of the captured image received from theimage capturing apparatus 200. With the aid of thecontrol unit 5, thedisplay apparatus 100 also calculates a cutout image size within the captured image, which is necessary for correction, based on the angle information detected by theangle detection unit 4, thereby identifying the partial region to be cut out. Thedisplay apparatus 100 notifies the user of information indicating the identified partial region (cutout frame) by superimposing the information on thedisplay unit 2 via thedisplay processing unit 3. - With the above control, the user can correct the tilt of the image captured by the
image capturing apparatus 200 through an intuitive operation by rotating thedisplay apparatus 100. - A description is now given of operations of the
display apparatus 100 and theimage capturing apparatus 200 according to the first embodiment with reference toFIGS. 3A and 3B .FIGS. 3A and 3B pertain to thedisplay apparatus 100 and theimage capturing apparatus 200, respectively. Processes of steps in a flowchart ofFIG. 3A are realized as a result of thecontrol unit 5 controlling blocks of thedisplay apparatus 100 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise. Similarly, processes of steps in a flowchart ofFIG. 3B are realized as a result of thecontrol unit 8 controlling blocks of theimage capturing apparatus 200 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise. The same goes for later-describedFIGS. 4A and 4B . - First, the operations of the
display apparatus 100 shown inFIG. 3A will be described. In step S301, thecontrol unit 5 of thedisplay apparatus 100 determines whether the user has issued an instruction for switching to an angle correction mode. If the instruction for switching to the angle correction mode has not been issued, processing of the present flowchart is ended. If the instruction for switching to the angle correction mode has been issued, the processing proceeds to step S302. - In step S302, the
control unit 5 receives, via the communication processing unit 1, an image captured by theimage capturing apparatus 200 and horizontal information of the image capturing apparatus 200 (seeFIG. 2B ). In step S303, thedisplay processing unit 3 displays the captured image that was received from theimage capturing apparatus 200 in step S302 on thedisplay unit 2. Thedisplay processing unit 3 also generates an image of a horizontal line based on the horizontal information, and displays the horizontal line by superimposing the horizontal line over the captured image displayed on the display unit 2 (seeFIG. 2B ). - In step S304, the
control unit 5 detects angle information output from theangle detection unit 4. This angle information indicates a rotation angle around the rotation axis orthogonal to the display screen of the display unit 2 (see therotation axis 901 inFIG. 9A ). In step S305, based on the angle information detected in step S304, thecontrol unit 5 generates grid-like auxiliary lines that are used to confirm a horizontal direction detected by thedisplay apparatus 100, and hence are used by the user as a guide for correction of the captured image. Then, thedisplay processing unit 3 displays the auxiliary lines by superimposing the auxiliary lines over the captured image displayed on thedisplay unit 2. As shown inFIG. 2C , transverse lines among these grid-like auxiliary lines (horizontal auxiliary lines) are displayed in such a manner that they are always maintained level (parallel to the horizontal direction detected by the display apparatus 100), even when the rotation angle of thedisplay apparatus 100 has been changed. The user adjusts the rotation angle of thedisplay apparatus 100 so that the horizontal auxiliary lines match a horizontal portion of a subject being shot (e.g., the subject's horizontal line H shown inFIGS. 2B and 2C ). In this way, the user can easily generate correction information for shooting the subject as if the subject is level. - In step S306, the
display processing unit 3 displays, on thedisplay unit 2, information (cutout frame) indicating a cutout size within the captured image for making correction in accordance with the rotation angle of the display apparatus 100 (seeFIG. 2C ). As stated earlier with reference toFIG. 2C , the cutout size is calculated by thecontrol unit 5 based on the angle information detected by theangle detection unit 4. The larger the rotation angle, i.e., a correction amount, the smaller the cutout size; by viewing the cutout frame, the user can rotate thedisplay apparatus 100 while confirming how small the captured image will be. - In step S307, the
control unit 5 determines whether the user has issued an instruction for ending the angle correction mode. If the instruction for ending the angle correction mode has not been issued, the processing returns to step S302, and similar processes are repeated. If the instruction for ending the angle correction mode has been issued, the processing proceeds to step S308. - In step S308, the
control unit 5 generates a rotation instruction command to be issued to theimage capturing apparatus 200. The rotation instruction command includes angle information for electronic image correction by theimage capturing apparatus 200. It will be assumed that the rotation instruction command includes, for example, information indicating the gravitational direction detected by theangle detection unit 4, or information related to an angle (orientation) with respect to the gravitational direction around the rotation axis orthogonal to a display surface of thedisplay unit 2. In step S309, thecontrol unit 5 transmits the rotation instruction command to theimage capturing apparatus 200 via the communication processing unit 1, and then the processing of the present flowchart is ended. Thereafter, the captured image that has been corrected in response to the rotation instruction command can be received from theimage capturing apparatus 200 and displayed on thedisplay unit 2. In this way, the user of thedisplay apparatus 100 can confirm whether an intended orientation has been achieved through correction, and perform a correction operation again by switching back to the angle correction mode if necessary. - Next, the operations of the
image capturing apparatus 200 shown inFIG. 3B will be described. In step S310, thecontrol unit 8 of theimage capturing apparatus 200 determines whether the user has issued the instruction for switching to the angle correction mode. This switching instruction is, for example, remotely issued via thedisplay apparatus 100. If the instruction for switching to the angle correction mode has not been issued, processing of the present flowchart is ended. If the instruction for switching to the angle correction mode has been issued, the processing proceeds to step S311. - In step S311, the
control unit 8 transmits the captured image to thedisplay apparatus 100 via the communication processing unit 9. It also transmits, to thedisplay apparatus 100, the above-described horizontal information detected by thehorizontal detection unit 13, which indicates the attitude of theimage capturing apparatus 200. In step S312, thecontrol unit 8 determines whether the rotation instruction command has been received from thedisplay apparatus 100. If the rotation instruction command has not been received, the processing returns to step S311, and similar processes are repeated. These processes are repeated in a cycle of a predetermined frame rate (e.g., 30 fps) so that captured images are visible in the form of live view on thedisplay apparatus 100. If the rotation instruction command has been received, the processing proceeds to step S313. - In step S313, the
image processing unit 7 corrects a tilt of the captured image by executing electronic image processing based on the angle information included in the rotation instruction command. The captured image, whose tilt has been corrected, is transmitted to thedisplay apparatus 100 to be reviewed on thedisplay apparatus 100. Furthermore, information for correction, including the angle information in the rotation instruction command, is stored (recorded) to the ROM or RAM in theimage capturing apparatus 200, and then the processing of the present flowchart is ended. Thereafter, when theimage capturing apparatus 200 captures still images or moving images and records or transmits the captured images, theimage processing unit 7 makes correction based on the stored information (the information for correction, including the angle information in the rotation instruction command). Then, the corrected captured images are recorded to therecording unit 10, or transmitted via the communication processing unit 9, as still images or moving images. That is to say, a plurality of images that are captured after receiving the rotation instruction command are corrected based on the stored information. - Through the above processing, the tilted captured image shown in
FIG. 2B is corrected in accordance with the rotation angle of thedisplay apparatus 100 as shown inFIG. 2C . - Incidentally, in the flowcharts of
FIGS. 3A and 3B , thedisplay apparatus 100 transmits the rotation instruction command to theimage capturing apparatus 200 after the issuance of the instruction for ending the angle correction mode in step S307. In this way, once the angle correction mode has been ended following the adjustment of the angle of correction, the same correction can be repeatedly applied to images shot thereafter. On the other hand, as shown inFIGS. 4A and 4B , thedisplay apparatus 100 may generate the rotation instruction command and transmit the rotation instruction command to theimage capturing apparatus 200 in real time without waiting for the instruction for ending the angle correction mode. In this way, theimage capturing apparatus 200 can correct captured images immediately following rotation of thedisplay apparatus 100. InFIGS. 4A and 4B , which respectively correspond toFIGS. 3A and 3B , steps of performing processes that are identical or similar to processes ofFIGS. 3A and 3B are given the same reference numerals thereas.FIG. 4A is the same asFIG. 3A , except that steps S308 and S309 precede step S307.FIG. 4B is the same asFIG. 3B , except that step S401 is added after step S313. In step S401, thecontrol unit 8 determines whether the user has issued the instruction for ending the angle correction mode. This ending instruction is, for example, remotely issued via thedisplay apparatus 100. If the instruction for ending the angle correction mode has not been issued, processing returns to step S311, and similar processes are repeated. If the instruction for ending the angle correction mode has been issued, information for correction, including the angle information in the rotation instruction command, is stored (recorded) to the ROM or RAM in theimage capturing apparatus 200, and then the processing of the present flowchart is ended. Thereafter, when theimage capturing apparatus 200 captures still images or moving images and records or transmits the captured images, theimage processing unit 7 makes correction based on the stored information (the information for correction, including the angle information in the rotation instruction command). Then, the corrected captured images are recorded to therecording unit 10, or transmitted via the communication processing unit 9, as still images or moving images. - As described above, in the first embodiment, the
display apparatus 100 displays a captured image received from theimage capturing apparatus 200 on thedisplay unit 2. Then, thedisplay apparatus 100 transmits angle information indicating a rotation angle of the display apparatus 100 (a rotation angle around the rotation axis orthogonal to the display screen of the display unit 2) to theimage capturing apparatus 200. Theimage capturing apparatus 200 corrects a tilt of the captured image in accordance with the angle information. - In this way, the user of the display apparatus, which receives the captured image from the image capturing apparatus and displays the received captured image, can correct the captured image intuitively.
- The first embodiment has described a configuration in which a tilt of an image captured by the
image capturing apparatus 200 is electronically corrected using angle information indicating a rotation angle of thedisplay apparatus 100. The second embodiment describes a configuration in which a keystone effect of a captured image is corrected by the user tilting thedisplay apparatus 100 forward and backward. Basic configurations of thedisplay apparatus 100 and theimage capturing apparatus 200 according to the second embodiment are similar to the configurations shown inFIG. 1 according to the first embodiment, and thus a detailed description of such configurations will be omitted. The following description focuses mainly on portions that differ from the first embodiment. - First, an overview of the second embodiment will be described with reference to
FIGS. 5A to 5C . The second embodiment is similar to the first embodiment in that a captured image is corrected in accordance with an angle of thedisplay apparatus 100. However, in the second embodiment, an “angle of thedisplay apparatus 100” corresponds to a forward/backward tilt angle of the display apparatus 100 (strictly speaking, a forward/backward tilt angle of the display screen of the display unit 2). The “forward/backward tilt” mentioned here corresponds to rotation of the display screen around a rotation axis that is in plane with the display screen of thedisplay unit 2 and parallel to a transverse line of the display screen (see arotation axis 902 inFIG. 9B ). For example, a tilt angle of a case in which an up-down line (a line extending in an up-down direction) of the display screen is perpendicular to the ground surface, as shown on the right side ofFIG. 9B , is defined as 0 degrees. From a viewpoint of the display screen (or display apparatus 100), a direction that is “perpendicular to the ground surface” means the gravitational direction detected by theangle detection unit 4 of thedisplay apparatus 100. As shown inFIG. 5C , thedisplay apparatus 100 transmits a keystone effect instruction command including the angle information to theimage capturing apparatus 200. Theimage capturing apparatus 200 corrects a keystone effect by correcting a captured image in accordance with the angle information. - A description is now given of operations of the
display apparatus 100 and theimage capturing apparatus 200 according to the second embodiment with reference toFIGS. 6A and 6B .FIGS. 6A and 6B pertain to thedisplay apparatus 100 and theimage capturing apparatus 200, respectively. InFIGS. 6A and 6B , steps of performing processes that are identical or similar to processes ofFIGS. 3A and 3B are given the same reference numerals thereas. Processes of steps in a flowchart ofFIG. 6A are realized as a result of thecontrol unit 5 controlling blocks of thedisplay apparatus 100 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise. Similarly, processes of steps in a flowchart ofFIG. 6B are realized as a result of thecontrol unit 8 controlling blocks of theimage capturing apparatus 200 by deploying the control program stored in the ROM to the RAM and executing the deployed control program, unless specifically stated otherwise. - First, the operations of the
display apparatus 100 shown inFIG. 6A will be described. In step S604, thecontrol unit 5 detects angle information output from theangle detection unit 4. This angle information indicates a forward/backward tilt angle of the display screen of the display unit 2 (see therotation axis 902 inFIG. 9B ). - In step S605, based on the angle information detected in step S604, the
control unit 5 calculates auxiliary information that is used by the user as a guide for correction of the captured image. In the present embodiment, it will be assumed that the auxiliary information denotes grid-like auxiliary lines including two vertical auxiliary lines in particular. Then, thedisplay processing unit 3 displays the auxiliary lines by superimposing the auxiliary lines over the captured image displayed on thedisplay unit 2. As shown inFIG. 5C , tilts (angles) of the two vertical auxiliary lines change in accordance with the tilt angle of thedisplay apparatus 100, and an interval between upper portions of the two vertical auxiliary lines differs from an interval between lower portions of the two vertical auxiliary lines. When the tilt angle of thedisplay apparatus 100 has a reference value (e.g., an angle shown inFIG. 5B ), the two vertical auxiliary lines are displayed in such a manner that they are parallel to the up-down direction of the display screen. That is to say, the displayed two vertical auxiliary lines are parallel to each other. When a top portion of thedisplay apparatus 100 is swung backward, the interval between the upper portions of the two vertical auxiliary lines is small, whereas the interval between the lower portions of the two vertical auxiliary lines is large. When the top portion of thedisplay apparatus 100 is swung forward, the interval between the upper portions of the two vertical auxiliary lines is large, whereas the interval between the lower portions of the two vertical auxiliary lines is small. The user tilts thedisplay apparatus 100 forward/backward until the two vertical auxiliary lines reach an angle with which the user wants to vertically correct a subject. In examples ofFIGS. 5B and 5C , the user tilts thedisplay apparatus 100 forward/backward so that a subject's vertical line V is parallel to the vertical auxiliary lines. Note that depending on the subject types, there may be no element that is equivalent to the subject's vertical line V; in this case also, the user can correct the captured image by adjusting an angle formed by a desired portion of the subject and the vertical auxiliary lines through rotation of thedisplay apparatus 100. - Processes of steps S608 and S609 are similar to the processes of steps S308 and S309 in
FIG. 3A , except that a command is a “keystone effect instruction command”. It will be assumed that angle information included in the keystone effect instruction command includes, for example, information indicating the gravitational direction detected by theangle detection unit 4, or information related to an angle (orientation) of the display surface of thedisplay unit 2 with respect to the gravitational direction. Thereafter, the captured image that has been corrected in accordance with the keystone effect instruction command can be received from theimage capturing apparatus 200 and displayed on thedisplay unit 2. In this way, the user of thedisplay apparatus 100 can confirm whether the keystone effect has been corrected as intended, and perform a correction operation again by switching back to the angle correction mode if necessary. - Next, the operations of the
image capturing apparatus 200 shown inFIG. 6B will be described. A process of step S612 is similar to the process of step S312 inFIG. 3B , except that a command is the “keystone effect instruction command”. In step S613, theimage processing unit 7 corrects the keystone effect of the captured image by executing electronic image processing based on the angle information in the keystone effect instruction command. The captured image, whose keystone effect has been corrected, is transmitted to thedisplay apparatus 100 to be reviewed on thedisplay apparatus 100. Furthermore, information for correction, including the angle information in the keystone effect instruction command, is stored (recorded) to the ROM or the RAM in theimage capturing apparatus 200, and then processing of the present flowchart is ended. Thereafter, when theimage capturing apparatus 200 captures still images or moving images and records or transmits the captured images, theimage processing unit 7 makes correction based on the stored information (the information for correction, including the angle information in the keystone effect instruction command). Then, the corrected captured images are recorded to therecording unit 10, or transmitted via the communication processing unit 9, as still images or moving images. That is to say, a plurality of images that are captured after receiving the keystone effect instruction command are corrected based on the stored information. - As described above, in the second embodiment, the
display apparatus 100 displays a captured image received from theimage capturing apparatus 200 on thedisplay unit 2, and transmits angle information indicating a tilt angle of the display apparatus 100 (a forward/backward tilt angle of the display screen of the display unit 2) to theimage capturing apparatus 200. Theimage capturing apparatus 200 corrects a keystone effect of the captured image in accordance with the angle information. In this way, the user can correct the keystone effect of the captured image intuitively. - Note that the first embodiment related to rotational correction for a captured image, and the second embodiment related to keystone effect correction for a captured image, can be implemented simultaneously. In this case, among auxiliary lines displayed on the
display unit 2, horizontal auxiliary lines are similar to those described in the first embodiment, i.e., auxiliary lines indicating the horizontal direction detected by thedisplay apparatus 100, whereas vertical auxiliary lines are similar to those described in the second embodiment, i.e., auxiliary lines indicating a forward/backward tilt of the display apparatus 100 (an angle of the display surface with respect to the gravitational direction). Also, angle information transmitted from thedisplay apparatus 100 to theimage capturing apparatus 200 includes information that is necessary for both rotational correction for a captured image and keystone effect correction for a captured image. Furthermore, theimage capturing apparatus 200 applies both rotational correction and keystone effect correction based on angle information included in a rotation instruction command and a keystone effect instruction command received from thedisplay apparatus 100. - The first and second embodiments have described a configuration in which a tilt or a keystone effect of an image captured by the
image capturing apparatus 200 is corrected during shooting. The third embodiment describes a configuration for writing angle information to an image file, thereby enabling correction during reproduction or during image post-processing. Basic configurations of thedisplay apparatus 100 and theimage capturing apparatus 200 according to the third embodiment are similar to the configurations shown inFIG. 1 according to the first embodiment, and thus a detailed description of such configurations will be omitted. The following description focuses mainly on portions that differ from the first embodiment. Although the present invention is described in the context of correcting a tilt of a captured image, the present embodiment is also applicable similarly to the case of correcting a keystone effect of a captured image (i.e., the context of the second embodiment). - First, an overview of the third embodiment will be described with reference to
- First, an overview of the third embodiment will be described with reference to FIGS. 7A and 7B. The third embodiment is similar to the first embodiment in that the display apparatus 100 transmits a rotation instruction command including angle information to the image capturing apparatus 200 (see FIG. 7A). However, as shown in FIG. 7B, the image capturing apparatus 200 writes the angle information to an image file of a captured image, instead of correcting the captured image in response to the rotation instruction command. The image capturing apparatus 200 corrects the image in accordance with the angle information during reproduction of the image file. That is to say, the image capturing apparatus 200 records the captured image in association with the angle information. Furthermore, via the operation unit 11, the user can instruct the image capturing apparatus 200 whether to correct the image in accordance with the angle information.
- A description is now given of operations of the image capturing apparatus 200 according to the third embodiment with reference to FIG. 8. Operations of the display apparatus 100 are similar to those according to the first embodiment (see FIG. 3A). In FIG. 8, steps that perform processes identical or similar to those of FIG. 3B are given the same reference numerals as in FIG. 3B. Unless specifically stated otherwise, the processes of the steps in the flowchart shown in FIG. 8 are realized as a result of the control unit 8 controlling the blocks of the image capturing apparatus 200 by deploying the control program stored in the ROM to the RAM and executing the deployed control program.
- In step S810, the control unit 8 of the image capturing apparatus 200 determines whether the current operation mode of the image capturing apparatus 200 is a shooting mode. If the current operation mode is the shooting mode, processing proceeds to step S310. In step S811, which follows steps S310 to S312, the control unit 8 records the captured image as an image file and writes the angle information to the image file via the recording unit 10.
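- As a minimal sketch of step S811, assuming a JSON sidecar file as an illustrative stand-in for whatever in-file metadata field the recording unit 10 actually uses, the recording could look as follows.

```python
import json
from pathlib import Path

import cv2
import numpy as np


def record_with_angle_info(image: np.ndarray, angle_deg: float, path: str) -> None:
    """Record the captured image as an image file together with its angle
    information (roughly corresponding to step S811)."""
    cv2.imwrite(path, image)  # the image file itself
    # Illustrative stand-in for an in-file metadata field: a JSON sidecar.
    Path(path).with_suffix(".json").write_text(json.dumps({"angle_deg": angle_deg}))
```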
- On the other hand, if it is determined in step S810 that the operation mode of the image capturing apparatus 200 is not the shooting mode (is a reproduction mode), the processing proceeds to step S812. In step S812, the control unit 8 determines whether an image file to be reproduced includes angle information. The processing proceeds to step S813 if the image file includes the angle information, and to step S815 if the image file does not include the angle information.
- In step S813, the control unit 8 determines whether correction processing based on the angle information is in an ON state. The user can switch the correction processing between ON and OFF via the operation unit 11. The processing proceeds to step S814 if the correction processing is in the ON state, and to step S815 if it is not.
- In step S814, the image processing unit 7 corrects the image in the image file in accordance with the angle information, and displays the corrected image on the display unit 12. The image is, for example, a moving image. On the other hand, if the image file does not include the angle information, or if the correction processing is in an OFF state, the image processing unit 7 performs normal reproduction without making the correction in step S815.
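- A minimal sketch of this reproduction-side branch (steps S812 to S815) follows, reusing the sidecar-file assumption from the sketch above; the rotation-based correction stands in for whatever correction the image processing unit 7 applies.

```python
import json
from pathlib import Path

import cv2
import numpy as np


def reproduce(path: str, correction_enabled: bool) -> np.ndarray:
    """Reproduce an image file, correcting it only when angle information exists
    and correction processing is switched ON (steps S812 to S815)."""
    image = cv2.imread(path)
    sidecar = Path(path).with_suffix(".json")  # same sidecar assumption as above

    if sidecar.exists():                               # S812: angle information present?
        angle = json.loads(sidecar.read_text())["angle_deg"]
        if correction_enabled:                         # S813: correction processing ON?
            h, w = image.shape[:2]
            rotation = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle, 1.0)
            return cv2.warpAffine(image, rotation, (w, h))  # S814: corrected reproduction
    return image                                       # S815: normal reproduction
```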
- Through the above processing, a tilt of an image captured by the image capturing apparatus 200 can be corrected during reproduction, instead of during shooting. By thus recording the captured image and the correction information (angle information) simultaneously, the tilt can be corrected not only during reproduction, but also in post-processing with the use of an image correction tool. In this way, the user can decide after shooting whether to make the correction, and can do so in consideration of the tradeoff between the correction and the correction-caused reduction in the cutout size within the captured image.
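- The tradeoff mentioned above arises because rotating the frame and then cutting out an axis-aligned rectangle necessarily discards part of the image. The following sketch computes the largest such cutout for a given rotation angle; it is a standard geometric result included here only for illustration, not something taken from the embodiments.

```python
import math


def max_cutout_after_rotation(w: float, h: float, angle_deg: float):
    """Largest axis-aligned rectangle that fits inside a w x h frame after it is
    rotated by angle_deg; illustrates how rotational correction shrinks the cutout."""
    if w <= 0 or h <= 0:
        return 0.0, 0.0
    angle = math.radians(angle_deg)
    sin_a, cos_a = abs(math.sin(angle)), abs(math.cos(angle))
    side_long, side_short = max(w, h), min(w, h)

    if side_short <= 2.0 * sin_a * cos_a * side_long or abs(sin_a - cos_a) < 1e-10:
        # Half-constrained case: two corners of the cutout touch the longer side.
        x = 0.5 * side_short
        wr, hr = (x / sin_a, x / cos_a) if w >= h else (x / cos_a, x / sin_a)
    else:
        # Fully constrained case: the cutout touches all four rotated sides.
        cos_2a = cos_a * cos_a - sin_a * sin_a
        wr = (w * cos_a - h * sin_a) / cos_2a
        hr = (h * cos_a - w * sin_a) / cos_2a
    return wr, hr


# Example: a 1920 x 1080 frame rotated by 3 degrees keeps a cutout of roughly
# 1871 x 983 pixels, i.e. about 89% of the original area.
print(max_cutout_after_rotation(1920, 1080, 3.0))
```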
- The above embodiments have described an example in which the display apparatus 100 transmits, to the image capturing apparatus 200, attitude information indicating a tilt of the display apparatus 100 as the information used for the rotation and cutout performed by the image capturing apparatus 200; however, no limitation is intended in this regard. In a case where the display apparatus 100 is configured to receive an image captured by the image capturing apparatus 200 (an image to be recorded as an image file), the rotation and cutout may be performed by the display apparatus 100, instead of by the image capturing apparatus 200. That is to say, the attitude information indicating a tilt of the display apparatus 100 (the angle information and rotation instruction command described in the above embodiments), which is generated in response to an instruction from the user, is stored to the RAM of the control unit 5 without being transmitted to the image capturing apparatus 200. After processes similar to those of steps S313, S613, and S811, which have been described above as processes executed by the image capturing apparatus 200, are applied to the image received from the image capturing apparatus 200 based on the information stored in the RAM, the resultant image can be recorded as the image file.
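- A minimal sketch of this display-side variation is given below, assuming a simple rotation as the correction and a class whose attribute stands in for the RAM of the control unit 5; the class and method names are hypothetical.

```python
from typing import Optional

import cv2
import numpy as np


class DisplaySideCorrector:
    """Sketch of the variation in which the display apparatus keeps the attitude
    information itself and applies the rotation/cutout to the received image."""

    def __init__(self) -> None:
        self.stored_angle_deg: Optional[float] = None  # stands in for the control unit's RAM

    def on_user_instruction(self, angle_deg: float) -> None:
        # Instead of transmitting a rotation instruction command, remember the angle.
        self.stored_angle_deg = angle_deg

    def on_image_received(self, image: np.ndarray, path: str) -> None:
        # Correct the received image locally, then record it as the image file.
        if self.stored_angle_deg is not None:
            h, w = image.shape[:2]
            rotation = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), self.stored_angle_deg, 1.0)
            image = cv2.warpAffine(image, rotation, (w, h))
        cv2.imwrite(path, image)
```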
- Note that the display apparatus 100 and the image capturing apparatus 200 may be controlled by a single item of hardware, or the apparatuses as a whole may be controlled by a plurality of items of hardware that share the processing.
- Although the present invention has been elaborated based on various embodiments thereof, the present invention is not limited to these specific embodiments, and various embodiments are embraced within the present invention as long as they do not depart from the spirit of the present invention. Furthermore, the above embodiments merely represent some of the embodiments of the present invention, and can also be combined as appropriate.
- Although the above embodiments have described an example in which the present invention is applied to a display apparatus such as a smartphone and an image capturing apparatus such as a digital camera, the present invention is not limited to such an example. The present invention is applicable to any type of apparatus that receives and displays a captured image, and to any type of apparatus that captures and transmits an image. In other words, the present invention is applicable to, for example, a personal computer, a PDA, a mobile telephone terminal, a mobile image viewer, a display-equipped printer apparatus, a digital photo frame, a music player, a game console, and an electronic book reader.
- Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
- This application claims the benefit of Japanese Patent Application No. 2015-003604, filed Jan. 9, 2015, which is hereby incorporated by reference herein in its entirety.
Claims (26)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-003604 | 2015-01-09 | ||
JP2015003604A JP6518069B2 (en) | 2015-01-09 | 2015-01-09 | Display device, imaging system, display device control method, program, and recording medium |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160205308A1 (en) | 2016-07-14 |
US9924086B2 (en) | 2018-03-20 |
Family
ID=56368421
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/990,176 Active 2036-03-26 US9924086B2 (en) | 2015-01-09 | 2016-01-07 | Display apparatus, image capturing system, control method for display apparatus, and storage medium for displaying information based on attitude |
Country Status (2)
Country | Link |
---|---|
US (1) | US9924086B2 (en) |
JP (1) | JP6518069B2 (en) |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160373612A1 (en) * | 2015-06-16 | 2016-12-22 | Chengdu Ck Technology Co., Ltd. | Systems and methods for generating images with specific orientations |
US10007476B1 (en) * | 2014-03-23 | 2018-06-26 | Kevin Glikmann | Sharing a host mobile camera with a remote mobile device |
US20180338088A1 (en) * | 2017-05-22 | 2018-11-22 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and non-transitory storage medium |
US10158804B2 (en) * | 2016-05-31 | 2018-12-18 | Olympus Corporation | Imaging device, control method and recording medium having stored program |
US20190281209A1 (en) * | 2016-12-02 | 2019-09-12 | SZ DJI Technology Co., Ltd. | Photographing control method, apparatus, and control device |
CN110621543A (en) * | 2017-06-08 | 2019-12-27 | 金泰克斯公司 | Display device with horizontal correction |
EP3644600A1 (en) * | 2018-10-22 | 2020-04-29 | Ricoh Company, Ltd. | Imaging device, information processing method, system, and carrier means |
US20200162671A1 (en) * | 2018-11-21 | 2020-05-21 | Ricoh Company, Ltd. | Image capturing system, terminal and computer readable medium which correct images |
CN113039210A (en) * | 2018-09-19 | 2021-06-25 | 科纳根公司 | Controlled protein degradation by engineering degradation tag variants in corynebacterium host cells |
CN113841386A (en) * | 2020-08-26 | 2021-12-24 | 深圳市大疆创新科技有限公司 | Image correction method and apparatus, image pickup device, and storage medium |
US20220408010A1 (en) * | 2021-06-21 | 2022-12-22 | Canon Kabushiki Kaisha | Image pickup apparatus and information processing apparatus that are capable of automatically adding appropriate rotation matrix during photographing, control method for image pickup apparatus, and storage medium |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108353126B (en) | 2015-04-23 | 2019-08-23 | 苹果公司 | Handle method, electronic equipment and the computer readable storage medium of the content of camera |
US9912860B2 (en) | 2016-06-12 | 2018-03-06 | Apple Inc. | User interface for camera effects |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US20190297265A1 (en) * | 2018-03-21 | 2019-09-26 | Sawah Innovations Inc. | User-feedback video stabilization device and method |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
US10674072B1 (en) | 2019-05-06 | 2020-06-02 | Apple Inc. | User interfaces for capturing and managing visual media |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11039074B1 (en) | 2020-06-01 | 2021-06-15 | Apple Inc. | User interfaces for managing media |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US12112024B2 (en) | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4967278A (en) * | 1988-08-08 | 1990-10-30 | Steve Greenbaum | Video camera with a transverse tilt detector and indicator comprising an ordered array of light-emitting diodes |
US5790085A (en) * | 1994-10-19 | 1998-08-04 | Raytheon Company | Portable interactive heads-up weapons terminal |
US20020118292A1 (en) * | 2001-02-28 | 2002-08-29 | Baron John M. | System and method for removal of digital image vertical distortion |
US20050117024A1 (en) * | 2003-11-29 | 2005-06-02 | Lg Electronics Inc. | Gradient displaying method of mobile terminal |
US6917370B2 (en) * | 2002-05-13 | 2005-07-12 | Charles Benton | Interacting augmented reality and virtual reality |
US6968094B1 (en) * | 2000-03-27 | 2005-11-22 | Eastman Kodak Company | Method of estimating and correcting camera rotation with vanishing point location |
US20080204566A1 (en) * | 2005-09-09 | 2008-08-28 | Canon Kabushiki Kaisha | Image pickup apparatus |
US7495198B2 (en) * | 2004-12-01 | 2009-02-24 | Rafael Advanced Defense Systems Ltd. | System and method for improving nighttime visual awareness of a pilot flying an aircraft carrying at least one air-to-air missile |
US20090278975A1 (en) * | 2006-09-26 | 2009-11-12 | Detlef Grosspietsch | Method of correcting perspective deformation of a lens system |
US7735230B2 (en) * | 2006-03-29 | 2010-06-15 | Novatac, Inc. | Head-mounted navigation system |
US20110205377A1 (en) * | 2000-07-11 | 2011-08-25 | Phase One A/S | Digital camera with integrated accelerometers |
US20130321568A1 (en) * | 2012-06-01 | 2013-12-05 | Hal Laboratory, Inc. | Storage medium storing information processing program, information processing device, information processing system, and information processing method |
US20130322845A1 (en) * | 2012-06-01 | 2013-12-05 | Hal Laboratory, Inc. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US20130322843A1 (en) * | 2012-06-01 | 2013-12-05 | Hal Laboratory, Inc. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US20140072274A1 (en) * | 2012-09-07 | 2014-03-13 | Nintendo Co., Ltd. | Computer-readable storage medium having information processing program stored therein, information processing apparatus, information processing system, and information processing method |
US20140132705A1 (en) * | 2012-11-09 | 2014-05-15 | Nintendo Co., Ltd. | Image generation method, image display method, storage medium storing image generation program, image generation system, and image display device |
US20140140677A1 (en) * | 2012-11-19 | 2014-05-22 | Lg Electronics Inc. | Video display device and method of displaying video |
US20140270692A1 (en) * | 2013-03-18 | 2014-09-18 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, panoramic video display method, and storage medium storing control data |
US20160048942A1 (en) * | 2014-03-18 | 2016-02-18 | Ricoh Company, Ltd. | Information processing method, information processing device, and program |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3896505B2 (en) * | 2001-03-12 | 2007-03-22 | 富士フイルム株式会社 | Electronic camera |
JP4013138B2 (en) * | 2002-12-13 | 2007-11-28 | 富士フイルム株式会社 | Trimming processing apparatus and trimming processing program |
JP2005175813A (en) * | 2003-12-10 | 2005-06-30 | Sony Corp | Electronic equipment, image information transmission system and method |
JP2005348212A (en) * | 2004-06-04 | 2005-12-15 | Casio Comput Co Ltd | Imaging apparatus |
JP2007228097A (en) | 2006-02-21 | 2007-09-06 | Canon Inc | Camera server, network camera system, control method, and program |
JP2012147071A (en) | 2011-01-07 | 2012-08-02 | Seiko Epson Corp | Imaging apparatus and imaging method |
JP5820181B2 (en) * | 2011-07-29 | 2015-11-24 | キヤノン株式会社 | Imaging system and control method thereof, display control apparatus and control method thereof, program, and storage medium |
JP2013162277A (en) * | 2012-02-03 | 2013-08-19 | Nikon Corp | Digital camera |
- 2015
  - 2015-01-09 JP JP2015003604A patent/JP6518069B2/en active Active
- 2016
  - 2016-01-07 US US14/990,176 patent/US9924086B2/en active Active
Patent Citations (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4967278A (en) * | 1988-08-08 | 1990-10-30 | Steve Greenbaum | Video camera with a transverse tilt detector and indicator comprising an ordered array of light-emitting diodes |
US5790085A (en) * | 1994-10-19 | 1998-08-04 | Raytheon Company | Portable interactive heads-up weapons terminal |
US6968094B1 (en) * | 2000-03-27 | 2005-11-22 | Eastman Kodak Company | Method of estimating and correcting camera rotation with vanishing point location |
US20110205377A1 (en) * | 2000-07-11 | 2011-08-25 | Phase One A/S | Digital camera with integrated accelerometers |
US8854482B2 (en) * | 2000-07-11 | 2014-10-07 | Phase One A/S | Digital camera with integrated accelerometers |
US20020118292A1 (en) * | 2001-02-28 | 2002-08-29 | Baron John M. | System and method for removal of digital image vertical distortion |
US6963365B2 (en) * | 2001-02-28 | 2005-11-08 | Hewlett-Packard Development Company, L.P. | System and method for removal of digital image vertical distortion |
US6917370B2 (en) * | 2002-05-13 | 2005-07-12 | Charles Benton | Interacting augmented reality and virtual reality |
US20050117024A1 (en) * | 2003-11-29 | 2005-06-02 | Lg Electronics Inc. | Gradient displaying method of mobile terminal |
US7495198B2 (en) * | 2004-12-01 | 2009-02-24 | Rafael Advanced Defense Systems Ltd. | System and method for improving nighttime visual awareness of a pilot flying an aircraft carrying at least one air-to-air missile |
US20080204566A1 (en) * | 2005-09-09 | 2008-08-28 | Canon Kabushiki Kaisha | Image pickup apparatus |
US7735230B2 (en) * | 2006-03-29 | 2010-06-15 | Novatac, Inc. | Head-mounted navigation system |
US20090278975A1 (en) * | 2006-09-26 | 2009-11-12 | Detlef Grosspietsch | Method of correcting perspective deformation of a lens system |
US20130321568A1 (en) * | 2012-06-01 | 2013-12-05 | Hal Laboratory, Inc. | Storage medium storing information processing program, information processing device, information processing system, and information processing method |
US20130322843A1 (en) * | 2012-06-01 | 2013-12-05 | Hal Laboratory, Inc. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US20160366460A1 (en) * | 2012-06-01 | 2016-12-15 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US20130322845A1 (en) * | 2012-06-01 | 2013-12-05 | Hal Laboratory, Inc. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US9485484B2 (en) * | 2012-06-01 | 2016-11-01 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US9473699B2 (en) * | 2012-06-01 | 2016-10-18 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, and information processing method |
US9270966B2 (en) * | 2012-06-01 | 2016-02-23 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, and panoramic video display method |
US20140072274A1 (en) * | 2012-09-07 | 2014-03-13 | Nintendo Co., Ltd. | Computer-readable storage medium having information processing program stored therein, information processing apparatus, information processing system, and information processing method |
US9294673B2 (en) * | 2012-11-09 | 2016-03-22 | Nintendo Co., Ltd. | Image generation method, image display method, storage medium storing image generation program, image generation system, and image display device |
US20140132705A1 (en) * | 2012-11-09 | 2014-05-15 | Nintendo Co., Ltd. | Image generation method, image display method, storage medium storing image generation program, image generation system, and image display device |
US20140140677A1 (en) * | 2012-11-19 | 2014-05-22 | Lg Electronics Inc. | Video display device and method of displaying video |
US9094655B2 (en) * | 2013-03-18 | 2015-07-28 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, panoramic video display method, and storage medium storing control data |
US20140270692A1 (en) * | 2013-03-18 | 2014-09-18 | Nintendo Co., Ltd. | Storage medium storing information processing program, information processing device, information processing system, panoramic video display method, and storage medium storing control data |
US20160048942A1 (en) * | 2014-03-18 | 2016-02-18 | Ricoh Company, Ltd. | Information processing method, information processing device, and program |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10007476B1 (en) * | 2014-03-23 | 2018-06-26 | Kevin Glikmann | Sharing a host mobile camera with a remote mobile device |
US9848103B2 (en) * | 2015-06-16 | 2017-12-19 | Chengdu Sioeye Technology Co., Ltd. | Systems and methods for generating images with specific orientations |
US20160373612A1 (en) * | 2015-06-16 | 2016-12-22 | Chengdu Ck Technology Co., Ltd. | Systems and methods for generating images with specific orientations |
US10158804B2 (en) * | 2016-05-31 | 2018-12-18 | Olympus Corporation | Imaging device, control method and recording medium having stored program |
US10897569B2 (en) * | 2016-12-02 | 2021-01-19 | SZ DJI Technology Co., Ltd. | Photographing control method, apparatus, and control device |
US11863857B2 (en) | 2016-12-02 | 2024-01-02 | SZ DJI Technology Co., Ltd. | Photographing control method, apparatus, and control device |
US11575824B2 (en) | 2016-12-02 | 2023-02-07 | SZ DJI Technology Co., Ltd. | Photographing control method, apparatus, and control device |
US20190281209A1 (en) * | 2016-12-02 | 2019-09-12 | SZ DJI Technology Co., Ltd. | Photographing control method, apparatus, and control device |
US10951826B2 (en) * | 2017-05-22 | 2021-03-16 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and non-transitory storage medium |
US20180338088A1 (en) * | 2017-05-22 | 2018-11-22 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and non-transitory storage medium |
CN110621543A (en) * | 2017-06-08 | 2019-12-27 | 金泰克斯公司 | Display device with horizontal correction |
CN113039210A (en) * | 2018-09-19 | 2021-06-25 | 科纳根公司 | Controlled protein degradation by engineering degradation tag variants in corynebacterium host cells |
US11102403B2 (en) | 2018-10-22 | 2021-08-24 | Ricoh Company, Ltd. | Image device, information processing apparatus, information processing method, system, and storage medium |
CN111093006A (en) * | 2018-10-22 | 2020-05-01 | 株式会社理光 | Image pickup apparatus, information processing apparatus, compensation amount setting method, and computer program |
EP3644600A1 (en) * | 2018-10-22 | 2020-04-29 | Ricoh Company, Ltd. | Imaging device, information processing method, system, and carrier means |
US10897573B2 (en) * | 2018-11-21 | 2021-01-19 | Ricoh Company, Ltd. | Image capturing system, terminal and computer readable medium which correct images |
US20200162671A1 (en) * | 2018-11-21 | 2020-05-21 | Ricoh Company, Ltd. | Image capturing system, terminal and computer readable medium which correct images |
CN113841386A (en) * | 2020-08-26 | 2021-12-24 | 深圳市大疆创新科技有限公司 | Image correction method and apparatus, image pickup device, and storage medium |
US20220408010A1 (en) * | 2021-06-21 | 2022-12-22 | Canon Kabushiki Kaisha | Image pickup apparatus and information processing apparatus that are capable of automatically adding appropriate rotation matrix during photographing, control method for image pickup apparatus, and storage medium |
US11711609B2 (en) * | 2021-06-21 | 2023-07-25 | Canon Kabushiki Kaisha | Image pickup apparatus and information processing apparatus that are capable of automatically adding appropriate rotation matrix during photographing, control method for image pickup apparatus, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
US9924086B2 (en) | 2018-03-20 |
JP6518069B2 (en) | 2019-05-22 |
JP2016129315A (en) | 2016-07-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9924086B2 (en) | Display apparatus, image capturing system, control method for display apparatus, and storage medium for displaying information based on attitude | |
JP5267451B2 (en) | Direction calculation apparatus, direction calculation method, and program | |
US9172878B2 (en) | Image capturing apparatus, image capturing control method and storage medium for capturing a subject to be recorded with intended timing | |
US8750674B2 (en) | Remotely controllable digital video camera system | |
US8823814B2 (en) | Imaging apparatus | |
WO2017054677A1 (en) | Mobile terminal photographing system and mobile terminal photographing method | |
US9124805B2 (en) | Adapting an optical image stabilizer on a camera | |
US9420188B2 (en) | Lens control apparatus, lens control method, image capturing apparatus, information processing apparatus, information processing method, image capturing system, and computer readable storage medium | |
US20190098250A1 (en) | Information processing apparatus, imaging apparatus, information processing method, and recording medium | |
US20190260933A1 (en) | Image capturing apparatus performing image stabilization, control method thereof, and storage medium | |
US20130077932A1 (en) | Digital video camera system having two microphones | |
US9369623B2 (en) | Remote-control apparatus and control method thereof, image capturing apparatus and control method thereof, and system | |
JP4748442B2 (en) | Imaging apparatus and program thereof | |
WO2014034023A1 (en) | Image processing apparatus, image processing method, and computer program | |
US9621799B2 (en) | Imaging apparatus | |
JP2016103666A (en) | Electronic apparatus and imaging device | |
US10868962B2 (en) | Image capturing apparatus performing image stabilization, control method thereof, and storage medium | |
JP2013012978A (en) | Digital camera | |
JP2013243552A (en) | Imaging apparatus, control method of the same, program, and storage medium | |
JP2017046160A (en) | Image processing system, control method of the same, control program, and storage medium | |
US11245830B2 (en) | Image capture apparatus and control method for same, and storage medium | |
US9648220B2 (en) | Imaging apparatus, imaging apparatus body and image sound output method | |
JP2014225836A (en) | Imaging device and control method of imaging device | |
US9467549B2 (en) | External apparatus, communication apparatus, and control method therefor | |
JP2011114769A (en) | Imaging device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CANON KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MAEDA, MUNEYOSHI;REEL/FRAME:038299/0207 Effective date: 20151204 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: VITESCO TECHNOLOGIES USA, LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CONTINENTAL AUTOMOTIVE SYSTEMS, INC.;REEL/FRAME:057426/0356 Effective date: 20210810 |