US9270901B2 - Display control device, display control method, program, and recording medium - Google Patents
Display control device, display control method, program, and recording medium
- Publication number
- US9270901B2 (application US13/857,267, US201313857267A)
- Authority
- US
- United States
- Prior art keywords
- image
- auxiliary
- display control
- display
- control device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 76
- 238000003384 imaging method Methods 0.000 claims abstract description 107
- 239000000203 mixture Substances 0.000 claims abstract description 72
- 238000012545 processing Methods 0.000 claims description 50
- 238000011156 evaluation Methods 0.000 claims description 12
- 230000008859 change Effects 0.000 claims description 2
- 238000004590 computer program Methods 0.000 claims 1
- 230000008569 process Effects 0.000 description 52
- 230000000875 corresponding effect Effects 0.000 description 20
- 238000010586 diagram Methods 0.000 description 17
- 238000004891 communication Methods 0.000 description 11
- 238000012986 modification Methods 0.000 description 10
- 230000004048 modification Effects 0.000 description 10
- 230000006870 function Effects 0.000 description 9
- 230000007246 mechanism Effects 0.000 description 6
- 238000003708 edge detection Methods 0.000 description 5
- 238000003825 pressing Methods 0.000 description 5
- 238000006243 chemical reaction Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 3
- 230000006835 compression Effects 0.000 description 2
- 238000007906 compression Methods 0.000 description 2
- 230000006837 decompression Effects 0.000 description 2
- 238000013461 design Methods 0.000 description 2
- 238000005401 electroluminescence Methods 0.000 description 2
- 239000004065 semiconductor Substances 0.000 description 2
- 230000004075 alteration Effects 0.000 description 1
- 230000004397 blinking Effects 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000001514 detection method Methods 0.000 description 1
- 230000001815 facial effect Effects 0.000 description 1
- 238000009432 framing Methods 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000000284 resting effect Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 230000002087 whitening effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H04N5/23293—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/743—Bracketing, i.e. taking a series of images with varying exposure conditions
-
- H04N5/23222—
-
- H04N5/2356—
Definitions
- the present disclosure relates to a display control device, a display control method, a program, and a recording medium.
- Japanese Unexamined Patent Application Publication No. 2009-231992 discloses a technology for automatically determining a composition determined to be the best composition in an imaging device and presenting an image based on the composition.
- the composition is automatically determined in the imaging device. Therefore, since it is not necessary for a user to determine the composition, convenience is improved.
- a composition automatically determined in an imaging device may not necessarily be a composition in which an inclination or preference of the user is reflected.
- a display control device including a display control unit that displays a plurality of auxiliary images with different compositions together with a predetermined image.
- a display control method in a display control device including displaying a plurality of auxiliary images with different compositions together with a predetermined image.
- a program causing a computer to perform a display control method in a display control device, the method including displaying a plurality of auxiliary images with different compositions together with a predetermined image, or a recording medium having a program recorded thereon.
- a plurality of auxiliary images with different compositions can be displayed.
- a user can refer to the plurality of compositions by viewing the plurality of displayed auxiliary images.
- FIG. 1 is a diagram illustrating an example of the outer appearance of an imaging device according to an embodiment
- FIG. 2 is a diagram illustrating an example of the configuration of the imaging device according to the embodiment
- FIG. 3 is a diagram illustrating an example of a process of generating auxiliary images
- FIG. 4 is a diagram illustrating an example of a through image displayed on a display unit
- FIG. 5 is a diagram illustrating examples of the through image and auxiliary images displayed on the display unit
- FIG. 6 is a diagram illustrating an example of a form in which the edge of a selected auxiliary image is displayed so as to overlap the through image;
- FIG. 7 is a diagram illustrating an example of a form in which an image for which transparency of the selected auxiliary image is changed is displayed so as to overlap the through image;
- FIG. 8 is a flowchart illustrating an example of the flow of a process
- FIG. 9 is a diagram illustrating display of auxiliary images and the like according to a modification example.
- FIG. 10 is a diagram illustrating an example of the configuration of an imaging device according to a modification example
- FIG. 11 is a diagram illustrating display of auxiliary images and the like according to a modification example
- FIG. 12 is a diagram illustrating the display of the auxiliary images and the like according to the modification example.
- FIG. 13 is a diagram illustrating another example of the configuration of an imaging device according to a modification example.
- FIG. 1 is a diagram illustrating an example of the outer appearance of an imaging device 100 according to the embodiment.
- the imaging device 100 includes a body (casing) 10 .
- a release button (also referred to as a shutter button or the like) 11 is formed on the body 10 .
- a two-stage pressing operation with a half-pressing stage and a full-pressing stage can be performed on the release button 11 .
- a display unit 12 is installed on one side surface of the body 10 .
- a through image with a predetermined composition, an image reproduced by a recording device, or the like is displayed on the display unit 12 .
- the display unit 12 includes a touch panel, and thus an input operation on the display unit 12 can be performed.
- a menu screen or an operation screen used to perform various settings is displayed on the display unit 12 in addition to the above-mentioned images.
- a display region of the display unit 12 is divided into, for example, display regions 12 a and 12 b .
- the display region 12 a is considered to be larger than the display region 12 b .
- a through image is displayed in the display region 12 a .
- numbers or icons in addition to a through image are displayed in the display region 12 a .
- a number S 1 indicating a frame rate of the imaging device 100 and an icon S 2 indicating a remaining amount of battery mounted on the imaging device 100 are displayed in the display region 12 a.
- Icons, characters, and the like are displayed in the display region 12 b .
- characters S 3 of “MENU” and characters S 4 of “KOZU (composition)” are displayed.
- when the characters S 3 of MENU (appropriately referred to as a MENU button S 3 ) are touched by a finger of the user or the like, a menu screen is displayed on the display unit 12 .
- when the characters S 4 of KOZU (appropriately referred to as a KOZU button S 4 ) are touched, a plurality of auxiliary images are displayed.
- the plurality of auxiliary images are auxiliary images for determination of compositions.
- the compositions of the plurality of auxiliary images are different from each other.
- the user refers to the plurality of auxiliary images to determine a preferred composition.
- the composition is also referred to as framing and refers to a disposition state of a subject within an image frame.
- the display or the like of the auxiliary images will be described in detail below.
- An icon S 5 indicating a face detection function, an icon S 6 indicating a function of automatically detecting a smiley face and imaging the smiley face, and an icon S 7 indicating a beautiful skin correction function of detecting the region of facial skin and whitening the detected region so that specks or rough skin are unnoticeable are displayed in the display region 12 b .
- when an icon is touched, ON and OFF of the function corresponding to the touched icon can be switched.
- the kinds of icons and the displayed positions of the icons can be appropriately changed.
- Physical operation units may be provided near the position at which the MENU button S 3 and the KOZU button S 4 are displayed.
- a button 13 is provided at a position near the MENU button S 3 on the body 10 .
- the menu screen is displayed on the display unit 12 according to a press of the button 13 .
- a button 14 is provided at a position near the KOZU button S 4 on the body 10 .
- the plurality of auxiliary images are displayed on the display unit 12 according to a press of the button 14 .
- a substantially circular dial button 15 is also provided on the body 10 .
- the circumferential portion of the dial button 15 can be rotated, and the central portion of the dial button 15 can be pressed down.
- when the circumferential portion of the dial button 15 is rotated, items displayed on the display unit 12 can be changed.
- when the central portion of the dial button 15 is pressed down, the selection of the selected item is confirmed and a function assigned to the item is performed. Further, the dial button 15 may be used when one auxiliary image is selected from the plurality of auxiliary images to be described below.
- the above-described outer appearance of the imaging device 100 is merely an example and the embodiment of the present disclosure is not limited thereto.
- a REC button used to capture and record a moving image may be provided on the body 10 .
- a play button used to play back a still image or a moving image obtained after the imaging may be provided on the body 10 .
- FIG. 2 is a diagram illustrating an example of the main configuration of the imaging device 100 .
- the imaging device 100 includes not only the display unit 12 but also, for example, a control unit 20 , an imaging unit 21 , an image processing unit 22 , an input operation unit 23 , a record reproduction unit 24 , a recording device 25 , an auxiliary image generation unit 26 , and an auxiliary image processing unit 27 .
- a display control unit includes the auxiliary image generation unit 26 and the auxiliary image processing unit 27 .
- each unit will be described.
- control unit 20 includes a central processing unit (CPU) and is electrically connected to each unit of the imaging device 100 .
- the control unit 20 includes a read-only memory (ROM) and a random access memory (RAM).
- the ROM stores a program executed by the control unit 20 .
- the RAM is used as a memory that temporarily stores data or a work memory when the control unit 20 executes a program.
- in FIG. 2 , the connection between the control unit 20 and each unit of the imaging device 100 , the ROM, and the RAM is not illustrated.
- a control signal CS transmitted from the control unit 20 is supplied to each unit of the imaging device 100 , and thus each unit of the imaging device 100 is controlled.
- the imaging unit 21 includes a lens that images a subject, an imaging element such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS), a mechanism that drives the imaging element to a predetermined position or a mechanism that adjusts a stop, a mechanism that adjusts focus, a mechanism that adjusts zoom, and a mechanism that corrects camera-shake.
- the lens, the imaging element, and each mechanism are controlled by, for example, the control unit 20 .
- the frame rate of the imaging device 100 is, for example, 60 fps (frames per second).
- the image processing unit 22 includes an analog signal processing unit, an analog-to-digital (A/D) conversion unit, and a digital signal processing unit.
- the analog signal processing unit performs a correlated double sampling (CDS) process on analog image data obtained by a photoelectric conversion function of the imaging element to improve a signal-to-noise ratio (S/N ratio) and performs an automatic gain control (AGC) process to control a gain.
- the analog image data subjected to the analog signal processing is converted into digital image data by the A/D conversion unit.
- the digital image data is supplied to the digital signal processing unit.
- the digital signal processing unit performs camera signal processing such as a de-mosaic process, an auto focus (AF) process, an auto exposure (AE) process, and an auto white balance (AWB) process on the digital image data.
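As a rough illustration of part of this signal chain, the following Python/NumPy sketch applies a crude automatic gain control and a gray-world white balance to an image stored as an array in [0, 1]. It is a simplified, hypothetical stand-in; the actual AGC, AWB, and de-mosaic algorithms of the image processing unit 22 are not specified here.

```python
import numpy as np

def apply_agc(raw: np.ndarray, target_mean: float = 0.5) -> np.ndarray:
    """Crude automatic gain control: scale the image so that its mean
    brightness approaches a target level (illustrative only)."""
    gain = target_mean / max(raw.mean(), 1e-6)
    return np.clip(raw * gain, 0.0, 1.0)

def gray_world_awb(rgb: np.ndarray) -> np.ndarray:
    """Gray-world auto white balance: scale each channel so that the
    channel means become equal (a stand-in for the AWB step)."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / np.maximum(means, 1e-6)
    return np.clip(rgb * gains, 0.0, 1.0)

# Example: a synthetic "sensor" frame normalized to [0, 1].
frame = np.random.rand(480, 640, 3) * 0.3
frame = gray_world_awb(apply_agc(frame))
```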
- the image processing unit 22 stores the image data subjected to the above-described processes in a frame memory (not shown).
- the image processing unit 22 appropriately converts the size of the image data stored in the frame memory according to the display region of the display unit 12 .
- the image data with the converted size is displayed as a through image on the display unit 12 .
- Image data is supplied to the frame memory according to the frame rate of the imaging device 100 and is sequentially overwritten.
- the image data processed by the image processing unit 22 is converted and compressed in correspondence with a predetermined format.
- the image data subjected to the compression and the like is supplied to the record reproduction unit 24 .
- Examples of the predetermined format include a design rule for camera file system (DCF) and an exchangeable image file format for digital still camera (Exif).
- Joint Photographic Experts Group (JPEG) is exemplified as a compression type.
- the image processing unit 22 performs a decompression process on the image data supplied from the record reproduction unit 24 .
- the image data subjected to the decompression process is supplied to the display unit 12 , and then an image based on the image data is reproduced.
- the input operation unit 23 is a generic name for the release button 11 , the button 13 , and the like described above.
- An operation signal OS is generated according to an operation on the input operation unit 23 .
- the operation signal OS is supplied to the control unit 20 .
- the control unit 20 generates the control signal CS according to the contents of the operation signal OS.
- the control signal CS is supplied to a predetermined processing block. When the predetermined processing block operates according to the control signal CS, a process corresponding to the operation on the input operation unit 23 is performed.
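The routing from an operation on the input operation unit 23 to a process in a predetermined processing block (operation signal OS, control unit 20 , control signal CS) can be illustrated with a small dispatch sketch. All class, signal, and handler names below are hypothetical and only show the routing idea.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class OperationSignal:   # OS: which input element was operated
    source: str          # e.g. "release_button", "kozu_button"

@dataclass
class ControlSignal:     # CS: which processing block should act, and how
    target: str
    command: str

class ControlUnit:
    """Hypothetical control unit that maps operation signals to control
    signals and hands them to the registered processing blocks."""
    def __init__(self) -> None:
        self.blocks: Dict[str, Callable[[ControlSignal], None]] = {}

    def register(self, name: str, handler: Callable[[ControlSignal], None]) -> None:
        self.blocks[name] = handler

    def on_operation(self, op: OperationSignal) -> None:
        routing = {
            "release_button": ControlSignal("imaging_unit", "capture"),
            "kozu_button": ControlSignal("auxiliary_image_generator", "generate"),
        }
        cs = routing.get(op.source)
        if cs and cs.target in self.blocks:
            self.blocks[cs.target](cs)

unit = ControlUnit()
unit.register("auxiliary_image_generator", lambda cs: print("generate auxiliary images"))
unit.on_operation(OperationSignal("kozu_button"))
```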
- the record reproduction unit 24 is a driver that performs recording and reproduction on the recording device 25 .
- the record reproduction unit 24 records the image data supplied from the image processing unit 22 on the recording device 25 .
- the record reproduction unit 24 reads the image data corresponding to the predetermined image from the recording device 25 and supplies the read image data to the image processing unit 22 .
- some of the processes, such as a process of compressing the image data and a process of decompressing the image data, performed by the image processing unit 22 may be performed by the record reproduction unit 24 .
- the recording device 25 is, for example, a hard disk that is included in the imaging device 100 .
- the recording device 25 may be a semiconductor memory or the like detachably mounted on the imaging device 100 .
- the recording device 25 records, for example, the image data and audio data such as background music (BGM) reproducible together with an image.
- the display unit 12 includes a monitor, such as a liquid crystal display (LCD) or an organic electro-luminescence (EL) display, and a driver that drives the monitor.
- when image data of the auxiliary image (appropriately referred to as auxiliary image data) is supplied from the auxiliary image processing unit 27 to the display unit 12 , the driver operates to display the auxiliary image based on the auxiliary image data, and thus the auxiliary image is displayed on the monitor.
- when data indicating information corresponding to a selected auxiliary image is supplied from the auxiliary image processing unit 27 to the display unit 12 , the driver operates so that display based on this data is performed so as to overlap a predetermined image.
- the information corresponding to the selected auxiliary image is, for example, information indicating the contour (edge) of the selected auxiliary image or information on an image in which transparency of the selected auxiliary image is changed.
- the display unit 12 includes a touch panel of an electrostatic capacitance type and functions as the input operation unit 23 .
- the display unit 12 may include a touch panel of another type such as a resistive film type or an optical type.
- the operation signal OS is generated according to an operation of touching a predetermined position on the display unit 12 and the operation signal OS is supplied to the control unit 20 .
- the control unit 20 generates the control signal CS according to the operation signal OS.
- the control signal CS is supplied to a predetermined processing block and a process is performed according to an operation.
- the auxiliary image generation unit 26 generates the auxiliary images with a plurality of different compositions based on a predetermined image. For example, when the KOZU button S 4 is pressed down, the control signal CS generated according to the pressing operation is supplied to the image processing unit 22 and the auxiliary image generation unit 26 .
- the image processing unit 22 supplies the image data stored in the frame memory to the auxiliary image generation unit 26 according to the control signal CS.
- the auxiliary image generation unit 26 generates the plurality of auxiliary image data based on the supplied image data.
- the generated plurality of auxiliary image data are supplied to the auxiliary image processing unit 27 .
- the image data from which the plurality of auxiliary image data are generated is sometimes referred to as original image data.
- the auxiliary image processing unit 27 temporarily retains the plurality of auxiliary image data supplied from the auxiliary image generation unit 26 in a memory (not shown).
- the auxiliary image processing unit 27 supplies the plurality of auxiliary image data to the display unit 12 .
- the plurality of auxiliary images based on the auxiliary image data are displayed on the display unit 12 .
- One auxiliary image is selected from the plurality of auxiliary images using the input operation unit 23 .
- the operation signal OS indicating the selection is supplied to the control unit 20 .
- the control unit 20 generates the control signal CS corresponding to the operation signal OS indicating the selection and supplies the generated control signal CS to the auxiliary image processing unit 27 .
- the auxiliary image processing unit 27 reads predetermined auxiliary image data instructed by the control signal CS from the memory.
- the auxiliary image processing unit 27 performs, for example, an edge detection process on the auxiliary image data read from the memory.
- Image data (appropriately referred to as edge image data) indicating the edge is supplied to the display unit 12 .
- an edge image based on the edge image data is displayed to overlap the through image.
- as the edge detection process, for example, a known process such as a process of applying a differential filter to the image data or a process of extracting an edge through template matching can be applied.
- the auxiliary image processing unit 27 performs, for example, a transparency changing process on the auxiliary image data read from the memory. An image based on the image data with the changed transparency is displayed to overlap the through image. Such a process is referred to as alpha blend or the like.
- the transparency may be constant or may be set by the user. Further, the transparency may be changed in real time according to a predetermined operation.
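The two kinds of processing performed by the auxiliary image processing unit 27, edge detection and transparency change (alpha blend), can be sketched briefly. The fragment below uses a simple differential filter for the edge image and a fixed-alpha blend for the transparency mode; it is an illustrative approximation assuming grayscale images held as NumPy arrays in [0, 1], not the patent's specific implementation.

```python
import numpy as np

def edge_image(gray: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Detect edges with horizontal/vertical differential filters
    (one of the known processes mentioned above)."""
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]   # horizontal gradient
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]   # vertical gradient
    magnitude = np.hypot(gx, gy)
    return (magnitude > threshold).astype(float)

def alpha_blend(through: np.ndarray, auxiliary: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Overlay the auxiliary image on the through image with the given
    transparency; alpha could be adjusted in real time by a user operation."""
    return (1.0 - alpha) * through + alpha * auxiliary

# Example with synthetic data.
through = np.random.rand(120, 160)
aux = np.roll(through, 20, axis=1)            # stand-in for a selected auxiliary image
edges = edge_image(aux)
overlay = np.where(edges > 0, 1.0, through)   # draw edges over the through image
blended = alpha_blend(through, aux, alpha=0.3)
```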
- the imaging device 100 performs the same processes as an imaging device of the related art. Such processes are not described here. An example of a process relevant to the embodiment of the present disclosure will be described below.
- the imaging device 100 is oriented toward a subject.
- the imaging device 100 held with the hand of the user may be oriented toward the subject or the imaging device 100 fixed by a tripod stand or the like may be oriented toward the subject.
- a through image is displayed on the display region 12 a of the imaging device 100 .
- the KOZU button S 4 and the like are displayed on the display region 12 b .
- the user determines a composition, while confirming the through image. When it is not necessary to confirm the other compositions, the user presses down the release button 11 to perform normal imaging.
- the user can perform an operation to touch the KOZU button S 4 .
- the image data stored in the frame memory is supplied as the original image data from the image processing unit 22 to the auxiliary image generation unit 26 according to the touch of the KOZU button S 4 .
- the auxiliary image generation unit 26 generates the plurality of auxiliary image data based on the original image data.
- the generated plurality of auxiliary image data are supplied to the auxiliary image processing unit 27 .
- the plurality of auxiliary image data are supplied from the auxiliary image processing unit 27 to the display unit 12 .
- the plurality of auxiliary images based on the plurality of auxiliary image data are displayed on the display unit 12 .
- the user can confirm various compositions.
- An operation of selecting a predetermined auxiliary image from the plurality of auxiliary images displayed on the display unit 12 is performed.
- the operation signal OS corresponding to the selection operation is supplied to the control unit 20 .
- the control unit 20 generates the control signal CS corresponding to the operation signal OS and supplies the control signal CS to the auxiliary image processing unit 27 .
- the auxiliary image processing unit 27 reads the auxiliary image data indicated by the control signal CS from the memory.
- the auxiliary image processing unit 27 performs, for example, the edge detection process on the auxiliary image data read from the memory to generate the edge image data.
- the edge image data is supplied to the display unit 12 , and thus an edge image is displayed on the display unit 12 . For example, the edge image is displayed to overlap the through image.
- the edge image displayed on the display unit 12 is presented as an imaging guide.
- the user moves the imaging device 100 so that the subject in the through image substantially matches the edge indicated by the edge image.
- the user presses down the release button 11 to perform the imaging.
- the user can take a photograph with the composition substantially identical to the composition of the selected auxiliary image.
- direction information indicating a movement direction of the imaging device 100 is displayed for each of the plurality of auxiliary images.
- the image data acquired by the imaging unit 21 and subjected to the signal processing by the image processing unit 22 is stored in the frame memory.
- the size of the image data stored in the frame memory is appropriately converted.
- An image based on the converted image data is displayed as a through image.
- the image data stored in the frame memory is appropriately updated according to a frame rate of the imaging device 100 .
- when the KOZU button S 4 is pressed down, the image data stored in the frame memory is supplied as the original image data BID to the auxiliary image generation unit 26 .
- the auxiliary image generation unit 26 divides the original image data BID into 16 regions of 4×4 (a region A 1 , a region A 2 , a region A 3 , a region A 4 , a region A 5 , a region A 6 , . . . , a region A 15 , and a region A 16 ). For example, the auxiliary image generation unit 26 cuts out regions of 3×3 from the original image data BID and generates 4 pieces of auxiliary image data (auxiliary image data SID 1 , auxiliary image data SID 2 , auxiliary image data SID 3 , and auxiliary image data SID 4 ).
- the auxiliary image data SID 1 is formed of 9 regions (the region A 1 , the region A 2 , the region A 3 , the region A 5 , the region A 6 , the region A 7 , the region A 9 , the region A 10 , and the region A 11 ) on the upper left side of the drawing.
- the auxiliary image data SID 2 is formed of 9 regions (the region A 5 , the region A 6 , the region A 7 , the region A 9 , the region A 10 , the region A 11 , the region A 13 , the region A 14 , and the region A 15 ) on the upper right side of the drawing.
- the auxiliary image data SID 3 is formed of 9 regions (the region A 2 , the region A 3 , the region A 4 , the region A 6 , the region A 7 , the region A 8 , the region A 10 , the region A 11 , and the region A 12 ) on the lower left side of the drawing.
- the auxiliary image data SID 4 is formed of 9 regions (the region A 6 , the region A 7 , the region A 8 , the region A 10 , the region A 11 , the region A 12 , the region A 14 , the region A 15 , and the region A 16 ) on the lower right side of the drawing.
- When it is not necessary to individually distinguish the auxiliary image data from each other, the auxiliary image data are referred to as the auxiliary image data SID.
- direction guide data is generated according to the cutout position.
- a direction guide to be described below is displayed based on the direction guide data.
- the direction guide data indicating the upper left side is generated and the generated direction guide data can correspond to the auxiliary image data SID 1 .
- the direction guide data indicating the upper right side is generated and the generated direction guide data can correspond to the auxiliary image data SID 2 .
- the direction guide data indicating the lower left side is generated and the generated direction guide data can correspond to the auxiliary image data SID 3 .
- the direction guide data indicating the lower right side is generated and the generated direction guide data can correspond to the auxiliary image data SID 4 .
- the size of the auxiliary image data SID is appropriately converted such that the size of the auxiliary image data SID is suitable for the display unit 12 .
- the auxiliary image data SID is supplied to the auxiliary image processing unit 27 .
- the 4 pieces of auxiliary image data SID are each stored temporarily in the memory so that the auxiliary image data SID can be processed by the auxiliary image processing unit 27 .
- the 4 pieces of auxiliary image data SID are supplied to the display unit 12 .
- An auxiliary image SI based on each auxiliary image data SID is displayed on the display unit 12 .
- the direction guide which is based on the direction guide data is displayed in correspondence with each auxiliary image SI.
- the number of pieces of auxiliary image data SID is not limited to 4.
- a range cut out from the original image data BID is also not limited to 9 regions, but may be appropriately changed. Even when the button 14 is pressed down rather than the KOZU button S 4 , the auxiliary image data SID are likewise generated and the auxiliary images SI are displayed on the display unit 12 .
- the auxiliary images are generated such that a subject near the center of the original image is disposed near an intersection point of the division lines of the regions of the auxiliary images SI. Therefore, it is possible to prevent generation of an auxiliary image with an inappropriate composition such as a composition in which the subject near the center of the original image is out of the image frame.
- the appropriate auxiliary images can be generated by a relatively simple algorithm.
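The cut-out scheme of FIG. 3 (a 4×4 grid over the original image, from which four overlapping 3×3-block crops are taken, each paired with a direction guide) can be written compactly. The sketch below, in Python/NumPy, is a minimal illustration under that assumption; the function and variable names are invented and the resizing for display is omitted.

```python
import numpy as np

def generate_auxiliary_images(original: np.ndarray):
    """Cut four 3x3-block crops out of a 4x4 grid over the original image,
    as in FIG. 3, and pair each crop with its direction guide (the direction
    in which the camera would be moved to reproduce that composition)."""
    h, w = original.shape[:2]
    bh, bw = h // 4, w // 4                       # size of one grid block
    crops = {
        "upper_left":  original[0:3 * bh, 0:3 * bw],
        "upper_right": original[0:3 * bh, bw:4 * bw],
        "lower_left":  original[bh:4 * bh, 0:3 * bw],
        "lower_right": original[bh:4 * bh, bw:4 * bw],
    }
    return list(crops.items())

original = np.random.rand(480, 640, 3)            # stand-in for the original image data BID
for direction, aux in generate_auxiliary_images(original):
    print(direction, aux.shape)
```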
- FIG. 4 is a diagram illustrating an example of the through image and the like displayed on the display unit 12 .
- the through image is displayed on the display region 12 a .
- the plurality of icons and the like are displayed together with the through image on the display unit 12 .
- the subject includes two flowers and a butterfly. Accordingly, as the through image, a flower image FL 1 , a flower image FL 2 , and an image B of the butterfly resting on the flower image FL 2 are displayed on the display region 12 a .
- the user presses down the KOZU button S 4 .
- the image stored in the frame memory is supplied as the original image data BID to the auxiliary image generation unit 26 . Then, for example, four pieces of auxiliary image data SID are generated according to the above-described method and the auxiliary images SI are displayed based on the auxiliary image data SID.
- As shown in FIG. 5 , for example, four auxiliary images (an auxiliary image SI 10 , an auxiliary image SI 20 , an auxiliary image SI 30 , and an auxiliary image SI 40 ) are displayed in the display region 12 b .
- the user can confirm the other compositions simultaneously.
- the four auxiliary images may be switched and displayed.
- a method of displaying the plurality of auxiliary images together with the through image includes a method of switching and displaying the plurality of auxiliary images. Further, when it is not necessary to individually distinguish the auxiliary images, the auxiliary images are referred to as the auxiliary images SI.
- Each auxiliary image SI is displayed in correspondence with the direction guide.
- the direction guide is information guiding a direction in which the user moves the imaging device 100 (the imaging unit 21 ) when the user performs imaging according to the composition corresponding to the auxiliary image SI.
- the auxiliary image SI 10 is displayed in correspondence with the direction guide S 10 indicating the upper left direction.
- the auxiliary image SI 20 is displayed in correspondence with the direction guide S 20 indicating the upper right direction.
- the auxiliary image SI 30 is displayed in correspondence with the direction guide S 30 indicating the lower left direction.
- the auxiliary image SI 40 is displayed in correspondence with the direction guide S 40 indicating the lower right direction.
- An erasing button S 8 and a return button S 9 are displayed in the display region 12 b .
- when the erasing button S 8 is touched, for example, the auxiliary images SI are erased.
- when the return button S 9 is touched, for example, the screen transitions to the immediately previous screen.
- the user selects an auxiliary image with a preferred composition by referring to the four auxiliary images. For example, as shown in FIG. 5 , the user performs an operation (appropriately referred to as a tap operation) of touching the auxiliary image SI 30 once to select the auxiliary image SI 30 . A cursor CU is displayed in the circumference of the selected auxiliary image SI 30 so that the selected auxiliary image SI 30 can be distinguished from the other auxiliary images SI. The user performs an operation (appropriately referred to as a double tap operation) of touching the auxiliary image SI 30 twice successively to confirm the selection of the auxiliary image SI 30 .
- the tap operation may not necessarily be performed.
- the user can select the auxiliary image SI 10 by performing a double tap operation on the auxiliary image SI (for example, the auxiliary image SI 10 ) in which the cursor CU is not displayed and confirm the selection of the auxiliary image SI 10 .
- a tap operation on the auxiliary image SI is sometimes referred to as an auxiliary image selection operation and a double tap operation on the auxiliary image SI is sometimes referred to as an auxiliary image decision operation.
- information corresponding to the selected auxiliary image SI 30 is displayed so as to overlap the through image.
- the information corresponding to the auxiliary image SI 30 includes information indicating the edge of the auxiliary image SI 30 and information on an image for which the transparency of the auxiliary image SI 30 is changed. These two pieces of information can be switched and displayed.
- the auxiliary image processing unit 27 reads the auxiliary image data SID 30 corresponding to the auxiliary image SI 30 from the memory according to the selection of the auxiliary image SI 30 .
- the auxiliary image processing unit 27 performs an edge detection process on the auxiliary image data SID 30 .
- Edge image data is generated through the edge detection process.
- the size of the edge image data is appropriately converted.
- the edge image data is supplied to the display unit 12 .
- An edge image based on the edge image data is displayed so as to overlap the through image.
- an edge E 10 indicating the edge of the flower image FL 1 , an edge E 20 indicating the edge of the flower image FL 2 , and an edge E 30 indicating the edge of the image B of the butterfly are displayed at predetermined positions in the display region 12 a .
- the edges are indicated by dotted lines in FIG. 6 , but the edges may be displayed by solid lines or the like colored with red or the like.
- the user moves the imaging device 100 so that the subject in the through image matches the edges. It is not necessary for the subject in the through image to completely match the edges.
- a photo with a composition substantially identical to the composition of the auxiliary image SI 30 can be obtained.
- the user moves the imaging device 100 so that the flower image FL 1 substantially matches the edge E 10 .
- the user may move the imaging device 100 so that the flower image FL 2 substantially matches the edge E 20 or the image B of the butterfly substantially matches the edge E 30 .
- the auxiliary image SI 30 is displayed in correspondence with the direction guide S 30 .
- the user may move the imaging device 100 in the direction indicated by the direction guide S 30 . Because the direction guide S 30 is displayed, for example, it is possible to prevent the user from erroneously moving the imaging device 100 in the upper right direction in which the edge E 20 or the like is displayed. After the user moves the imaging device 100 so that the subject in the through image substantially matches the edges, the user presses down the release button 11 to perform the imaging.
- a sign S 50 of characters “Mode 1 ” and a sign S 51 of characters “Mode 2 ” are displayed in the display region 12 a .
- the sign S 50 is displayed in the middle portion of the left side of the display region 12 a .
- the sign S 51 is displayed in the middle portion of the right side of the display region 12 a .
- Mode 1 (mode 1 ) indicated by the sign S 50 is a button for displaying the edge of the selected auxiliary image.
- Mode 2 (mode 2 ) indicated by the sign S 51 is a button for displaying an image for which the transparency of the selected auxiliary image is changed.
- when Mode 2 is selected, an image for which the transparency of the auxiliary image SI 30 is changed is displayed so as to overlap the through image instead of the edge E 10 and the like.
- when Mode 1 is selected, the edge image is displayed so as to overlap the through image instead of the image for which the transparency of the auxiliary image SI 30 is changed.
- FIG. 7 is a diagram illustrating a form in which an image for which the transparency of the auxiliary image SI 30 is changed is displayed so as to overlap the through image.
- the image for which the transparency of the auxiliary image SI 30 is changed includes a flower image C 10 , a flower image C 20 , and an image C 30 of the butterfly.
- the user moves the imaging device 100 to perform the imaging so that the flower image C 10 substantially matches the flower image FL 1 .
- a sign S 52 of characters “Darker” and a sign S 53 of characters “Lighter” are displayed in the display region 12 a .
- shading of the display of the edge E 10 and the like can be changed through an operation on the signs S 52 and S 53 .
- when the user performs an operation (appropriately referred to as a holding operation) of continuously touching the sign S 52 in the state in which the edge E 10 and the like are displayed, the display of the edge such as the edge E 10 is darkened.
- when the user performs a holding operation on the sign S 53 in the state in which the edge E 10 and the like are displayed, the display of the edge such as the edge E 10 is lightened.
- the shading is smoothly changed through the hold operation on the sign S 52 or the sign S 53 .
- when the holding operation is performed on the sign S 52 , the transparency of the flower image C 10 and the like is decreased and display is performed based on the changed transparency.
- when the holding operation is performed on the sign S 53 , the transparency of the flower image C 10 and the like is increased and display is performed based on the changed transparency.
- the transparency can be changed in real time through the operation on the signs S 52 and S 53 .
- the processes corresponding to the operations on the signs S 51 , S 52 , S 53 , and S 54 are performed by, for example, the auxiliary image processing unit 27 .
- FIG. 8 is a flowchart illustrating an example of the flow of the process of the imaging device 100 .
- the imaging device 100 is oriented toward a subject and the subject with a predetermined composition is displayed on the display unit 12 .
- in step ST 102 , the plurality of auxiliary image data SID are generated.
- the auxiliary images SI corresponding to the plurality of auxiliary image data SID are displayed on the display unit 12 .
- the user can confirm the other compositions by referring to the plurality of auxiliary images SI.
- the user half presses the release button 11 .
- the process proceeds to step ST 105 .
- in step ST 105 , a focusing process is performed.
- the process proceeds to step ST 106 to perform the imaging.
- the captured image data is recorded in the recording device 25 .
- when the auxiliary image selection operation is performed in step ST 102 , the process proceeds to step ST 103 .
- the cursor CU is displayed in the circumference of the operated auxiliary image, and thus the composition of the auxiliary image is selected.
- when the auxiliary image decision operation is performed, the process proceeds to step ST 104 .
- in step ST 104 , the selection of the auxiliary image is confirmed and the information corresponding to the selected auxiliary image is displayed in an overlapping manner.
- the edge image based on the selected auxiliary image is displayed so as to overlap the through image.
- the image for which the transparency of the selected auxiliary image is changed may be displayed so as to overlap the through image.
- when the auxiliary image decision operation is performed in step ST 102 , the process proceeds to step ST 104 and the edge image based on the auxiliary image subjected to the auxiliary image decision operation is displayed so as to overlap the through image.
- the imaging device 100 is moved so that the subject in the through image substantially matches the edges.
- the user can easily recognize the movement direction of the imaging device 100 by referring to the direction guide.
- the user half presses the release button 11 . Then, the process proceeds to step ST 105 .
- in step ST 105 , a focusing process is performed.
- the process proceeds to step ST 106 to perform the imaging.
- the captured image data is recorded in the recording device 25 .
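The flow of FIG. 8 (steps ST 102 to ST 106 ) can be traced with a small sketch. The operation names and printed messages below are hypothetical placeholders that only mirror the branching described above; they are not the patent's implementation.

```python
from enum import Enum, auto

class Op(Enum):
    HALF_PRESS = auto()   # half-press of the release button 11
    SELECT_AUX = auto()   # auxiliary image selection (tap) operation
    DECIDE_AUX = auto()   # auxiliary image decision (double-tap) operation

def run_flow(operations):
    """Rough trace of steps ST102-ST106: generate auxiliary images,
    optionally select/decide one, then focus, capture, and record."""
    print("ST102: generate and display auxiliary images")
    for op in operations:
        if op is Op.SELECT_AUX:
            print("ST103: show cursor CU around the selected auxiliary image")
        elif op is Op.DECIDE_AUX:
            print("ST104: overlay edge image (or transparency-changed image)")
        elif op is Op.HALF_PRESS:
            print("ST105: focusing process")
            print("ST106: capture and record image data")
            break

run_flow([Op.SELECT_AUX, Op.DECIDE_AUX, Op.HALF_PRESS])
```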
- the user can refer to the plurality of compositions.
- the edge or a guide such as the direction guide is displayed so that the user can capture a photo with the selected composition. Accordingly, the user can easily capture a photo with the desired composition.
- in a modification example, a plurality of auxiliary images may be confirmed at hand while the user holds the imaging device 100 .
- the imaging device 100 is oriented toward a predetermined subject, and then the KOZU button S 4 is pressed down when the predetermined subject is displayed as a through image on the display unit 12 .
- image data stored in the frame memory is supplied as original image data to the auxiliary image generation unit 26 .
- the plurality of auxiliary image data based on the original image data are generated by the auxiliary image generation unit 26 .
- the plurality of auxiliary images based on the plurality of auxiliary image data are displayed on the display unit 12 .
- FIG. 9 is a diagram illustrating a display example of the auxiliary images and the like according to a modification example.
- An icon CI resembling the imaging device 100 is displayed near the center of the display region 12 a .
- Four auxiliary images (an auxiliary image SI 10 , an auxiliary image SI 20 , an auxiliary image SI 30 , and an auxiliary image SI 40 ) are displayed in the display region 12 a .
- a direction guide S 10 is displayed between the auxiliary image SI 10 and the icon CI.
- a direction guide S 20 is displayed between the auxiliary image SI 20 and the icon CI.
- a direction guide S 30 is displayed between the auxiliary image SI 30 and the icon CI.
- a direction guide S 40 is displayed between the auxiliary image SI 40 and the icon CI.
- the icon CI, the plurality of auxiliary images SI, and the plurality of direction guides are displayed together with the image based on the original image data.
- the image based on the original image data includes a flower image FL 1 , a flower image FL 2 , and an image B of a butterfly.
- the icon CI, the plurality of auxiliary images SI, and the plurality of direction guides are erased.
- when the return button S 9 is touched, the icon CI, the plurality of auxiliary images SI, and the plurality of direction guides are displayed again.
- the user can confirm the other compositions. Further, the user can refer to the composition at the time when the KOZU button S 4 was pressed down, that is, the composition of the image based on the original image data. It is not necessary to continuously orient the imaging device 100 toward the subject, and the user can refer to the plurality of compositions while holding the imaging device 100 .
- the imaging device 100 is prepared in the same direction as when the KOZU button S 4 is pressed down.
- the edge image is displayed so as to overlap the through image. The user performs the imaging with reference to the edge image.
- the image data obtained through the imaging unit 21 has been described as an example of the original image data, but other image data may be set as the original image data.
- image data recorded on the recording device 25 may be set as the original image data.
- the imaging device 100 includes a GPS sensor 30 and a communication unit 31 as an example of a position acquisition unit.
- the GPS sensor 30 acquires position information on the current position of the imaging device 100 .
- the communication unit 31 communicates with an image server via the Internet.
- the position information acquired by the GPS sensor 30 is transmitted to the image server according to a predetermined operation of the user.
- the image server transmits a plurality of image data according to the position information to the imaging device 100 .
- the plurality of image data transmitted from the image server are received by the communication unit 31 .
- the image data are supplied to the auxiliary image generation unit 26 .
- the auxiliary image generation unit 26 performs, for example, a process of converting the size of each of the plurality of image data. Images based on the processed image data are displayed as the auxiliary images. Image data downloaded from the image server may be configured to be selected by the user.
- a predetermined landscape is displayed in the display region 12 a .
- Images based on the image data transmitted from the image server are displayed as the auxiliary images in the display region 12 b .
- an auxiliary image SI 60 with a composition centering on a building, an auxiliary image SI 70 with a composition centering on a distant landscape such as a mountain or a hill, and an auxiliary image SI 80 with a composition centering on trees or a building are displayed.
- the user determines a composition by referring to such auxiliary images.
- the edge of the auxiliary image SI is displayed in an overlapping manner.
- the user can take a photograph with a composition substantially identical to the composition of the selected auxiliary image SI by performing the imaging with reference to the edge.
- the imaging can be performed emulating the composition of an existing image.
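As a minimal sketch of this modification, the fragment below assumes a hypothetical HTTP image server that accepts latitude/longitude query parameters and returns a JSON list of image records; the endpoint URL, parameters, and response shape are invented for illustration, since the protocol is not specified here.

```python
import json
import urllib.parse
import urllib.request

def fetch_reference_images(lat: float, lon: float,
                           server: str = "https://example.com/images"):
    """Send the current position to a (hypothetical) image server and
    return the list of image records it suggests for that location."""
    query = urllib.parse.urlencode({"lat": lat, "lon": lon, "limit": 4})
    with urllib.request.urlopen(f"{server}?{query}") as response:
        payload = json.load(response)
    # Assumed response shape: {"images": [{"url": ..., "downloads": ...}, ...]}
    return payload.get("images", [])

# The returned images would then be resized by the auxiliary image
# generation unit 26 and displayed as auxiliary images (SI60, SI70, SI80, ...).
```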
- the communication unit 31 may perform short-range wireless communication.
- as a short-range wireless scheme, for example, communication by infrared light, communication by the "Zigbee (registered trademark)" standard, communication by "Bluetooth (registered trademark)," or communication by "WiFi (registered trademark)" that easily forms a network can be used, but embodiments of the present disclosure are not limited thereto.
- image data may be acquired from another device through the short-range wireless communication. Images based on the image data acquired from the other device may be displayed as auxiliary images. Auxiliary images based on other original image data may be displayed together.
- the plurality of auxiliary images may be displayed in a display form according to the evaluation value.
- the evaluation value is defined by the number of downloads of the image data, the number of submissions of a high evaluation for the image data, or the like.
- a predetermined mark may be given to the auxiliary image based on the image data of which the number of downloads is large and may be displayed.
- a crown mark S 15 may be given to the auxiliary image SI 70 based on the image data of which the number of downloads is large and may be displayed.
- the evaluation value may be determined according to the position of a predetermined subject.
- the predetermined subject is a main subject among the plurality of subjects and is, for example, a subject with the largest size.
- the flower image FL 2 is set as the predetermined subject.
- the evaluation value may increase the closer the central position of a region including the flower image FL 2 is to the center of the auxiliary image SI.
- the auxiliary images SI may be arranged in decreasing order of evaluation values.
- a crown mark or the like may be given to the auxiliary image SI with a large evaluation value and may be displayed. Of course, such display is presented merely as a reference to the user, and thus does not limit the selection of the auxiliary image SI by the user.
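One possible realization of such an evaluation value is sketched below: it combines a download count with how close the center of the main subject is to the center of the auxiliary image, and then orders the auxiliary images by decreasing score. The weighting and data fields are illustrative assumptions, not definitions from the patent.

```python
import math
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class AuxiliaryImage:
    name: str
    size: Tuple[int, int]                 # (width, height) in pixels
    subject_center: Tuple[float, float]   # center of the main subject in pixels
    downloads: int = 0                    # e.g. download count reported by the image server

def evaluation_value(img: AuxiliaryImage, weight_center: float = 100.0) -> float:
    """Higher when the main subject sits near the image center and when the
    image has been downloaded often (illustrative weighting only)."""
    w, h = img.size
    dx = img.subject_center[0] - w / 2
    dy = img.subject_center[1] - h / 2
    dist = math.hypot(dx, dy) / math.hypot(w / 2, h / 2)   # normalized to [0, 1]
    return img.downloads + weight_center * (1.0 - dist)

def rank_auxiliary_images(images: List[AuxiliaryImage]) -> List[AuxiliaryImage]:
    return sorted(images, key=evaluation_value, reverse=True)

images = [
    AuxiliaryImage("SI60", (640, 480), (320, 240), downloads=12),
    AuxiliaryImage("SI70", (640, 480), (500, 100), downloads=57),
    AuxiliaryImage("SI80", (640, 480), (300, 260), downloads=3),
]
for img in rank_auxiliary_images(images):
    print(img.name, round(evaluation_value(img), 1))
```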
- the predetermined operation of generating the auxiliary images has been described as the operation of touching the KOZU button S 4 or the operation of pressing down the button 14 , but may be an operation performed by audio.
- the imaging device 100 may include a microphone 40 which is an example of an audio reception unit and an audio recognition unit 41 that recognizes audio received by the audio reception unit.
- the microphone 40 may be a microphone that receives a sound when a moving image is captured. For example, the user says, for example, “display compositions” toward the microphone 40 while holding the imaging device 100 to display a through image on the display unit 12 .
- a recognition signal RS indicating an audio recognition result of the audio recognition unit 41 is generated.
- the recognition signal RS is supplied to the control unit 20 .
- the control unit 20 generates a control signal CS to generate auxiliary images according to the recognition signal RS.
- the control signal CS is supplied to the auxiliary image generation unit 26 .
- the auxiliary image generation unit 26 generates auxiliary image data according to the control signal CS.
- the auxiliary images based on the auxiliary image data are displayed. Further, identification information such as a number may be displayed in correspondence with each of the plurality of auxiliary images.
- an auxiliary image may be configured to be selected by an audio command such as "the second auxiliary image."
- the imaging device 100 may be configured to be operated by audio.
- when the user performs the imaging, the user prepares the imaging device 100 to face a subject in many cases, holding the imaging device 100 with both hands. Even in such cases, the auxiliary images and the edges can be displayed on the display unit 12 without changing the position of the imaging device 100 .
- the direction guide may be displayed at a timing at which the user moves the imaging device 100 .
- a predetermined auxiliary image may be selected and the direction guide may be displayed at the selection timing.
- the direction guide may be displayed in the display region 12 a .
- the direction guide may be displayed in a blinking manner.
- the direction guide may be configured to guide movement by audio.
- the display control device is not limited to the imaging device 100 , but may be realized by a personal computer, a tablet-type computer, a smart phone, or the like.
- the embodiment of the present disclosure is not limited to a device, but may be realized as a method, a program, or a recording medium.
- the embodiment of the present disclosure can be applied to a so-called cloud system in which the exemplified processes are distributed and processed by a plurality of devices.
- the embodiment of the present disclosure can be realized as a system that performs the exemplified processes and a device that performs at least some of the processes.
- present technology may also be configured as below.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012105047A JP5880263B2 (ja) | 2012-05-02 | 2012-05-02 | Display control device, display control method, program, and recording medium |
JP2012-105047 | 2012-05-02 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130293746A1 (en) | 2013-11-07 |
US9270901B2 (en) | 2016-02-23 |
Family
ID=49491995
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/857,267 Active US9270901B2 (en) | 2012-05-02 | 2013-04-05 | Display control device, display control method, program, and recording medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US9270901B2
JP (1) | JP5880263B2
CN (1) | CN103384304B
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150146042A1 (en) * | 2013-11-26 | 2015-05-28 | Kathleen Panek-Rickerson | Template Photography and Methods of Using the Same |
US20160054903A1 (en) * | 2014-08-25 | 2016-02-25 | Samsung Electronics Co., Ltd. | Method and electronic device for image processing |
US20160227108A1 (en) * | 2015-02-02 | 2016-08-04 | Olympus Corporation | Imaging apparatus |
US20170078565A1 (en) * | 2015-09-14 | 2017-03-16 | Olympus Corporation | Imaging operation guidance device and imaging operation guidance method |
US10091414B2 (en) * | 2016-06-24 | 2018-10-02 | International Business Machines Corporation | Methods and systems to obtain desired self-pictures with an image capture device |
US10282952B2 (en) * | 2007-06-04 | 2019-05-07 | Trover Group Inc. | Method and apparatus for segmented video compression |
US10911682B2 (en) * | 2017-02-23 | 2021-02-02 | Huawei Technologies Co., Ltd. | Preview-image display method and terminal device |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9894262B2 (en) * | 2013-01-08 | 2018-02-13 | Sony Corporation | Display control apparatus to enable a user to check a captured image after image processing |
BR112016006091B1 (pt) * | 2013-11-21 | 2022-09-06 | Huawei Device (Shenzhen) Co., Ltd | Método de exibição de imagem, meio de armazenamento de computador, aparelho de exibição de imagem e dispositivo de terminal |
FR3022388B1 (fr) * | 2014-06-16 | 2019-03-29 | Antoine HUET | Personalized film and video mock-up |
CN104967790B (zh) * | 2014-08-06 | 2018-09-11 | 腾讯科技(北京)有限公司 | Photo shooting method and apparatus, and mobile terminal |
JP2016046676A (ja) * | 2014-08-22 | 2016-04-04 | 株式会社リコー | Imaging device and imaging method |
US10104284B2 (en) * | 2014-09-19 | 2018-10-16 | Huawei Technologies Co., Ltd. | Method and apparatus for determining photographing delay time, and photographing device |
EP3496386A4 (en) * | 2016-11-24 | 2019-07-03 | Huawei Technologies Co., Ltd. | GUIDANCE METHOD AND DEVICE FOR PHOTOGRAPHIC COMPOSITION |
CN109479087B (zh) * | 2017-01-19 | 2020-11-17 | 华为技术有限公司 | Image processing method and apparatus |
JP6875196B2 (ja) * | 2017-05-26 | 2021-05-19 | SZ DJI Technology Co., Ltd. | Mobile platform, flying object, support device, portable terminal, imaging assistance method, program, and recording medium |
CN108093174A (zh) * | 2017-12-15 | 2018-05-29 | 北京臻迪科技股份有限公司 | Composition method and apparatus for a photographing device, and photographing device |
KR102159803B1 (ko) * | 2018-10-11 | 2020-09-24 | 강산 | Apparatus and program for providing a shooting guide |
CN111856751B (zh) * | 2019-04-26 | 2022-12-09 | 苹果公司 | Head-mounted display with low-light operation |
CN111182207B (zh) * | 2019-12-31 | 2021-08-24 | Oppo广东移动通信有限公司 | Image capturing method and apparatus, storage medium, and electronic device |
KR102216145B1 (ko) * | 2020-02-10 | 2021-02-16 | 중앙대학교 산학협력단 | Apparatus and method for assisting photo shooting using OpenCV |
WO2021185296A1 (zh) * | 2020-03-20 | 2021-09-23 | 华为技术有限公司 | Shooting method and device |
CN115150543B (zh) * | 2021-03-31 | 2024-04-16 | 华为技术有限公司 | Shooting method and apparatus, electronic device, and readable storage medium |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030169350A1 (en) * | 2002-03-07 | 2003-09-11 | Avi Wiezel | Camera assisted method and apparatus for improving composition of photography |
US20050007468A1 (en) * | 2003-07-10 | 2005-01-13 | Stavely Donald J. | Templates for guiding user in use of digital camera |
US7088865B2 (en) * | 1998-11-20 | 2006-08-08 | Nikon Corporation | Image processing apparatus having image selection function, and recording medium having image selection function program |
US20060221223A1 (en) * | 2005-04-05 | 2006-10-05 | Hiroshi Terada | Digital camera capable of continuous shooting and control method for the digital camera |
US20070146528A1 (en) * | 2005-12-27 | 2007-06-28 | Casio Computer Co., Ltd | Image capturing apparatus with through image display function |
US20070291154A1 (en) * | 2006-06-20 | 2007-12-20 | Samsung Techwin Co., Ltd. | Method of controlling digital photographing apparatus, and digital photographing apparatus using the method |
US7349020B2 (en) * | 2003-10-27 | 2008-03-25 | Hewlett-Packard Development Company, L.P. | System and method for displaying an image composition template |
US20090015702A1 (en) * | 2007-07-11 | 2009-01-15 | Sony Ericsson Communications AB | Enhanced image capturing functionality |
JP2009231992A (ja) | 2008-03-20 | 2009-10-08 | Brother Ind Ltd | Print data creation device, printing device, print data creation program, and computer-readable recording medium |
US20100110266A1 (en) * | 2008-10-31 | 2010-05-06 | Samsung Electronics Co., Ltd. | Image photography apparatus and method for proposing composition based person |
US20100194963A1 (en) * | 2007-09-18 | 2010-08-05 | Sony Corporation | Display control apparatus, image capturing apparatus, display control method, and program |
US7973848B2 (en) * | 2007-04-02 | 2011-07-05 | Samsung Electronics Co., Ltd. | Method and apparatus for providing composition information in digital image processing device |
US8045007B2 (en) * | 2004-12-24 | 2011-10-25 | Fujifilm Corporation | Image capturing system and image capturing method |
US8063972B2 (en) * | 2009-04-29 | 2011-11-22 | Hon Hai Precision Industry Co., Ltd. | Image capture device and control method thereof |
US8125557B2 (en) * | 2009-02-08 | 2012-02-28 | Mediatek Inc. | Image evaluation method, image capturing method and digital camera thereof for evaluating and capturing images according to composition of the images |
US8154646B2 (en) * | 2005-12-19 | 2012-04-10 | Casio Computer Co., Ltd. | Image capturing apparatus with zoom function |
US8289433B2 (en) * | 2005-09-14 | 2012-10-16 | Sony Corporation | Image processing apparatus and method, and program therefor |
US20120268641A1 (en) * | 2011-04-21 | 2012-10-25 | Yasuhiro Kazama | Image apparatus |
US20130314580A1 (en) * | 2012-05-24 | 2013-11-28 | Mediatek Inc. | Preview system for concurrently displaying multiple preview images generated based on input image generated by image capture apparatus and related preview method thereof |
US8654238B2 (en) * | 2004-09-03 | 2014-02-18 | Nikon Corporation | Digital still camera having a monitor device at which an image can be displayed |
US20140247325A1 (en) * | 2011-12-07 | 2014-09-04 | Yi Wu | Guided image capture |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08294025A (ja) * | 1995-04-24 | 1996-11-05 | Olympus Optical Co Ltd | カメラ |
JP3833486B2 (ja) * | 2000-04-19 | 2006-10-11 | 富士写真フイルム株式会社 | 撮像装置 |
JP4499271B2 (ja) * | 2000-11-13 | 2010-07-07 | オリンパス株式会社 | カメラ |
JP2007158868A (ja) * | 2005-12-07 | 2007-06-21 | Sony Corp | 画像処理装置および方法 |
JP4935559B2 (ja) * | 2007-07-25 | 2012-05-23 | 株式会社ニコン | 撮像装置 |
US7805066B2 (en) * | 2007-12-24 | 2010-09-28 | Microsoft Corporation | System for guided photography based on image capturing device rendered user recommendations according to embodiments |
JP4869270B2 (ja) * | 2008-03-10 | 2012-02-08 | 三洋電機株式会社 | 撮像装置及び画像再生装置 |
JP2010130540A (ja) * | 2008-11-28 | 2010-06-10 | Canon Inc | 映像表示装置 |
JP5287465B2 (ja) * | 2009-04-21 | 2013-09-11 | ソニー株式会社 | 撮像装置、撮影設定方法及びそのプログラム |
JP4844657B2 (ja) * | 2009-07-31 | 2011-12-28 | カシオ計算機株式会社 | 画像処理装置及び方法 |
JP5359762B2 (ja) * | 2009-10-15 | 2013-12-04 | ソニー株式会社 | 情報処理装置、表示制御方法及び表示制御プログラム |
JP5561019B2 (ja) * | 2010-08-23 | 2014-07-30 | ソニー株式会社 | 撮像装置、プログラムおよび撮像方法 |
- 2012
  - 2012-05-02 JP JP2012105047A patent/JP5880263B2/ja not_active Expired - Fee Related
- 2013
  - 2013-04-05 US US13/857,267 patent/US9270901B2/en active Active
  - 2013-04-25 CN CN201310146644.7A patent/CN103384304B/zh not_active Expired - Fee Related
Patent Citations (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7088865B2 (en) * | 1998-11-20 | 2006-08-08 | Nikon Corporation | Image processing apparatus having image selection function, and recording medium having image selection function program |
US20030169350A1 (en) * | 2002-03-07 | 2003-09-11 | Avi Wiezel | Camera assisted method and apparatus for improving composition of photography |
US20050007468A1 (en) * | 2003-07-10 | 2005-01-13 | Stavely Donald J. | Templates for guiding user in use of digital camera |
US7349020B2 (en) * | 2003-10-27 | 2008-03-25 | Hewlett-Packard Development Company, L.P. | System and method for displaying an image composition template |
US8654238B2 (en) * | 2004-09-03 | 2014-02-18 | Nikon Corporation | Digital still camera having a monitor device at which an image can be displayed |
US8045007B2 (en) * | 2004-12-24 | 2011-10-25 | Fujifilm Corporation | Image capturing system and image capturing method |
US20060221223A1 (en) * | 2005-04-05 | 2006-10-05 | Hiroshi Terada | Digital camera capable of continuous shooting and control method for the digital camera |
US8289433B2 (en) * | 2005-09-14 | 2012-10-16 | Sony Corporation | Image processing apparatus and method, and program therefor |
US8154646B2 (en) * | 2005-12-19 | 2012-04-10 | Casio Computer Co., Ltd. | Image capturing apparatus with zoom function |
US20070146528A1 (en) * | 2005-12-27 | 2007-06-28 | Casio Computer Co., Ltd | Image capturing apparatus with through image display function |
US20070291154A1 (en) * | 2006-06-20 | 2007-12-20 | Samsung Techwin Co., Ltd. | Method of controlling digital photographing apparatus, and digital photographing apparatus using the method |
US7973848B2 (en) * | 2007-04-02 | 2011-07-05 | Samsung Electronics Co., Ltd. | Method and apparatus for providing composition information in digital image processing device |
US20090015702A1 (en) * | 2007-07-11 | 2009-01-15 | Sony Ericsson Communications AB | Enhanced image capturing functionality |
US20100194963A1 (en) * | 2007-09-18 | 2010-08-05 | Sony Corporation | Display control apparatus, image capturing apparatus, display control method, and program |
US20130308032A1 (en) * | 2007-09-18 | 2013-11-21 | Sony Corporation | Display control apparatus, image capturing apparatus, display control method, and program |
JP2009231992A (ja) | 2008-03-20 | 2009-10-08 | Brother Ind Ltd | Print data creation device, printing device, print data creation program, and computer-readable recording medium |
US20100110266A1 (en) * | 2008-10-31 | 2010-05-06 | Samsung Electronics Co., Ltd. | Image photography apparatus and method for proposing composition based person |
US8125557B2 (en) * | 2009-02-08 | 2012-02-28 | Mediatek Inc. | Image evaluation method, image capturing method and digital camera thereof for evaluating and capturing images according to composition of the images |
US8063972B2 (en) * | 2009-04-29 | 2011-11-22 | Hon Hai Precision Industry Co., Ltd. | Image capture device and control method thereof |
US20120268641A1 (en) * | 2011-04-21 | 2012-10-25 | Yasuhiro Kazama | Image apparatus |
US20140247325A1 (en) * | 2011-12-07 | 2014-09-04 | Yi Wu | Guided image capture |
US20130314580A1 (en) * | 2012-05-24 | 2013-11-28 | Mediatek Inc. | Preview system for concurrently displaying multiple preview images generated based on input image generated by image capture apparatus and related preview method thereof |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10282952B2 (en) * | 2007-06-04 | 2019-05-07 | Trover Group Inc. | Method and apparatus for segmented video compression |
US10847003B1 (en) | 2007-06-04 | 2020-11-24 | Trover Group Inc. | Method and apparatus for segmented video compression |
US9497384B2 (en) * | 2013-11-26 | 2016-11-15 | Kathleen Panek-Rickerson | Template photography and methods of using the same |
US20150146042A1 (en) * | 2013-11-26 | 2015-05-28 | Kathleen Panek-Rickerson | Template Photography and Methods of Using the Same |
US20160054903A1 (en) * | 2014-08-25 | 2016-02-25 | Samsung Electronics Co., Ltd. | Method and electronic device for image processing |
US10075653B2 (en) * | 2014-08-25 | 2018-09-11 | Samsung Electronics Co., Ltd | Method and electronic device for image processing |
US20160227108A1 (en) * | 2015-02-02 | 2016-08-04 | Olympus Corporation | Imaging apparatus |
US9843721B2 (en) * | 2015-02-02 | 2017-12-12 | Olympus Corporation | Imaging apparatus |
US10075632B2 (en) | 2015-02-02 | 2018-09-11 | Olympus Corporation | Imaging apparatus |
US10375298B2 (en) | 2015-02-02 | 2019-08-06 | Olympus Corporation | Imaging apparatus |
US20170078565A1 (en) * | 2015-09-14 | 2017-03-16 | Olympus Corporation | Imaging operation guidance device and imaging operation guidance method |
US10116860B2 (en) * | 2015-09-14 | 2018-10-30 | Olympus Corporation | Imaging operation guidance device and imaging operation guidance method |
US10264177B2 (en) * | 2016-06-24 | 2019-04-16 | International Business Machines Corporation | Methods and systems to obtain desired self-pictures with an image capture device |
US10091414B2 (en) * | 2016-06-24 | 2018-10-02 | International Business Machines Corporation | Methods and systems to obtain desired self-pictures with an image capture device |
US10911682B2 (en) * | 2017-02-23 | 2021-02-02 | Huawei Technologies Co., Ltd. | Preview-image display method and terminal device |
US11196931B2 (en) | 2017-02-23 | 2021-12-07 | Huawei Technologies Co., Ltd. | Preview-image display method and terminal device |
US11539891B2 (en) | 2017-02-23 | 2022-12-27 | Huawei Technologies Co., Ltd. | Preview-image display method and terminal device |
US12212840B2 (en) | 2017-02-23 | 2025-01-28 | Huawei Technologies Co., Ltd. | Preview-image display method and terminal device |
Also Published As
Publication number | Publication date |
---|---|
CN103384304A (zh) | 2013-11-06 |
CN103384304B (zh) | 2017-06-27 |
JP5880263B2 (ja) | 2016-03-08 |
US20130293746A1 (en) | 2013-11-07 |
JP2013232861A (ja) | 2013-11-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9270901B2 (en) | Display control device, display control method, program, and recording medium | |
CN113556461B (zh) | Image processing method, electronic device, and computer-readable storage medium | |
US11949978B2 (en) | Image content removal method and related apparatus | |
JP6834056B2 (ja) | Photographing mobile terminal | |
US12219243B2 (en) | Shooting preview interface for electronic device with multiple camera | |
EP3076659B1 (en) | Photographing apparatus, control method thereof, and non-transitory computer-readable recording medium | |
US8564682B2 (en) | Method for creating content using a camera of a portable terminal and a portable terminal adapted therefor | |
US9536479B2 (en) | Image display device and method | |
CN106688227B (zh) | Multi-imaging device and multi-imaging method | |
JP6302564B2 (ja) | Moving image editing device, moving image editing method, and moving image editing program | |
US20130258122A1 (en) | Method and device for motion enhanced image capture | |
JP2014183425A (ja) | Image processing method, image processing device, and image processing program | |
JP2019121857A (ja) | Electronic device and control method thereof | |
US11996123B2 (en) | Method for synthesizing videos and electronic device therefor | |
US20110249139A1 (en) | Imaging control device and imaging control method | |
EP3259658B1 (en) | Method and photographing apparatus for controlling function based on gesture of user | |
CA2810548A1 (en) | Method and device for motion enhanced image capture | |
US9582179B2 (en) | Apparatus and method for editing image in portable terminal | |
US12401890B2 (en) | Photographing method and electronic device | |
JP6010303B2 (ja) | Image reproduction device | |
CN105376479A (zh) | Image generation device and image generation method | |
US20240007599A1 (en) | Information processing system, information processing device, information processing method, information processing program, imaging device, control method of imaging device, and control program | |
US20230103051A1 (en) | Image processing apparatus, image processing method, and program | |
WO2017187573A1 (ja) | Imaging device | |
JP2013135398A (ja) | Image composition device | |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: SONY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: IKI, MASARU; REEL/FRAME: 030158/0807. Effective date: 20130402
| STCF | Information on status: patent grant | Free format text: PATENTED CASE
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4
| MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 8