WO2014109129A1 - Display control device, program, and display control method - Google Patents
Display control device, program, and display control method
- Publication number
- WO2014109129A1 (PCT/JP2013/080914)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display control
- processed images
- processed
- display
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/632—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
Definitions
- the present disclosure relates to a display control device, a program, and a display control method.
- a digital camera may be manufactured as a dedicated device or built into another device (for example, a smartphone).
- an image generated through an image sensor, an image processing circuit, or the like of a digital camera is stored as digital data and displayed on a monitor.
- Digital cameras are usually provided with functions for automatically performing exposure adjustment during imaging, color balance adjustment according to the type of light source, and the like.
- many digital cameras are provided with a plurality of imaging modes according to the type of subject such as a person, a flower, a landscape, a night view, or the purpose of imaging.
- imaging conditions such as a shutter speed, an aperture value, and a light source type are automatically set to be suitable for the type of subject or the purpose of imaging.
- a function for changing the image tone by an effect according to the preference of the user of the digital camera is also provided.
- Patent Document 1 discloses a technique for displaying on the monitor, as thumbnails, sample images obtained by changing the tone of a captured image at any time while a live view image (or a through image) is displayed on the monitor. According to this technique, a user of a digital camera causes the digital camera to set imaging conditions by selecting a thumbnail with a desired image tone while viewing the live view image.
- the frame rate of thumbnails can be lower than the frame rate of live view images.
- the thumbnails may not be displayed smoothly but may appear as if frames have been dropped. Therefore, there is a concern that a user who sees the thumbnails feels considerable stress.
- According to the present disclosure, there is provided a display control device including: a display control unit that causes a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, under respective image processing conditions, based on the image data at any point in time among the image data; and a determination unit that determines whether a predetermined user operation has been recognized.
- The display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
- According to the present disclosure, there is also provided a program for causing a computer to function as: a display control unit that causes a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, under respective image processing conditions, based on the image data at any point in time among the image data; and a determination unit that determines whether a predetermined user operation has been recognized.
- The display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
- According to the present disclosure, there is also provided a display control method including: displaying, on a display device, a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, under respective image processing conditions, based on the image data at any point in time among the image data; and determining whether a predetermined user operation has been recognized. The displayed one or more processed images are updated when it is determined that the predetermined user operation has been recognized.
- a user can easily check a captured image after image processing at the time of imaging without stress.
- FIG. 4 is an explanatory diagram for describing an example of an appearance of a display control device according to an embodiment of the present disclosure.
- FIG. 2 is a block diagram showing an example of the hardware configuration of the display control apparatus according to an embodiment. FIG. 3 is a block diagram showing an example of the configuration of the display control apparatus according to the first embodiment.
- FIG. 1 is an explanatory diagram for describing an example of an appearance of a display control apparatus 100 according to an embodiment of the present disclosure.
- a display control device 100 is shown.
- the display control apparatus 100 is a smartphone.
- the display control device 100 includes a display device 101.
- the display control apparatus 100 displays, on the display device 101, a screen to be shown to the user of the display control apparatus 100.
- the display control device 100 includes a touch panel.
- the display device 101 is a display surface of a touch panel. Then, the display control apparatus 100 detects a user's touch on the touch panel and recognizes the user's touch operation from the detection result.
- the display control apparatus 100 includes a camera 103.
- An image generated through the camera 103 is stored in the display control device 100 and displayed on the display device 101.
- FIG. 2 is a block diagram illustrating an example of a hardware configuration of the display control apparatus 100 according to an embodiment.
- the display control apparatus 100 includes, for example, a CPU (Central Processing Unit) 71, a lens block 73, a motor 75, a motor driver 77, an image sensor 79, an image sensor driver 81, a timing generator 83, a signal processing circuit 85, an image processing circuit 87, an SDRAM (Synchronous Dynamic Random Access Memory) 89, a monitor 91, an internal memory 93, an external memory 95, a key input device 97, and a touch detection surface 99.
- CPU 71 controls the entire display control apparatus 100.
- the lens block 73 is an optical system and is driven by a motor 75.
- the motor 75 is controlled by a motor driver 77.
- the motor driver 77 controls the motor 75 in accordance with a control signal from the CPU 71 to perform lens extension and retraction when the power is turned on and off, change of the optical system zoom magnification, focus adjustment, aperture adjustment, and the like.
- the image sensor 79 is an arbitrary image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- the image sensor driver 81 controls the image sensor 79 based on the timing signal generated by the timing generator 83 according to the control signal of the CPU 71.
- the image sensor 79 photoelectrically converts the optical image of the subject formed by the lens block 73 into an image signal under the control of the image sensor driver 81, and outputs the image signal to the signal processing circuit 85.
- the signal processing circuit 85 includes a circuit that removes noise included in the imaging signal, an analog / digital converter that converts the imaging signal after noise removal into a digital signal, and the like.
- the signal processing circuit 85 outputs the digitized imaging signal to the image processing circuit 87.
- the imaging signal is RAW format image data (hereinafter referred to as “RAW data”).
- the image processing circuit 87 converts the imaging signal into YUV format image data (hereinafter referred to as “YUV data”) including a luminance (Y) signal and a color difference (UV) signal. Further, the image processing circuit 87 performs pixel complementation as necessary. In addition, the image processing circuit 87 performs image processing such as white balance adjustment, contour enhancement, and tone change according to control by the CPU 71.
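The conversion into a luminance (Y) signal and color-difference (UV) signals can be illustrated per pixel. The sketch below uses the common BT.601 full-range coefficients; the disclosure does not specify which conversion matrix the image processing circuit 87 uses, so the coefficients are an assumption:

```python
def rgb_to_yuv(r, g, b):
    """Convert one 8-bit RGB pixel into a luminance (Y) signal and two
    color-difference (U, V) signals, using BT.601 full-range
    coefficients; U and V are centered on 128."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(y), clamp(u), clamp(v)
```

For a neutral pixel the color-difference signals sit at the 128 midpoint, which is why white balance adjustment can be expressed as shifting U and V before this conversion.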
- the YUV data converted by the image processing circuit 87 is sequentially stored in the SDRAM 89. In the imaging standby state in the recording mode, every time one frame of YUV data is accumulated in the SDRAM 89, a live view image (or a through image) corresponding to the YUV data is displayed on the monitor 91.
- the monitor 91 can be realized using a liquid crystal, an organic EL (Organic Light-Emitting Diode: OLED), a CRT (Cathode Ray Tube), or the like.
- the monitor 91 is a display surface of a touch panel, for example.
- the YUV data is compressed into image data in any format such as the JPEG (Joint Photographic Experts Group) format or the JPEG2000 format.
- the compressed image data is stored in the internal memory 93 built into the display control apparatus 100 or in a removable external memory 95 (for example, a memory card) connected to the display control apparatus 100 via an interface (not shown).
- the image data stored in the internal memory 93 or the external memory 95 is appropriately read out by the CPU 71 according to a user operation, converted into YUV data, and read into the SDRAM 89. Then, the YUV data is displayed on the monitor 91.
- a microcomputer (not shown) is connected to the CPU 71.
- the microcomputer is connected to a key input device 97 (such as a tact switch), a touch detection surface 99, and a power supply control circuit (not shown).
- the touch detection surface 99 is, for example, a touch detection surface of a touch panel.
- the microcomputer regularly scans the key input operation on the key input device 97 and the touch operation on the touch detection surface 99.
- the microcomputer detects the user's key input operation on the key input device 97 and provides the detection result to the CPU 71.
- the microcomputer detects a user touch on the touch detection surface 99 and provides the detection result to the CPU 71.
- the microcomputer controls the power supply from the power control circuit.
- the display control apparatus 100 may include other components such as a communication circuit in addition to the components described above, or may not necessarily include the components described above.
- First Embodiment In the first embodiment, a live view image and one or more processed images after image processing are displayed, and the one or more processed images are updated according to a predetermined user operation.
- FIG. 3 is a block diagram showing an example of the configuration of the display control apparatus 100-1 according to the first embodiment.
- the display control device 100-1 includes an imaging unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 150.
- the imaging unit 110 obtains image data through the imaging element.
- the image data is, for example, RAW data.
- the imaging unit 110 outputs an imaging signal from the imaging element, removes noise included in the imaging signal, performs analog/digital conversion, and the like. As a result, the image data is obtained.
- the imaging unit 110 provides the image data to the control unit 150 (image processing unit 153).
- the imaging unit 110 can be implemented by the lens block 73, the motor 75, the motor driver 77, the imaging element 79, the imaging element driver 81, the timing generator 83, the signal processing circuit 85, and the like.
- the input unit 120 detects a user operation. Then, the input unit 120 provides the detection result to the control unit 150 (operation recognition unit 155).
- the input unit 120 detects a user touch on the touch panel. Then, the input unit 120 provides the detection result (for example, touch position information) to the control unit 150 (operation recognition unit 155).
- the input unit 120 detects a user key input operation in the key input device. Then, the input unit 120 provides the detection result to the control unit 150 (operation recognition unit 155).
- the input unit 120 can be implemented by the key input device 97, the touch detection surface 99, the microcomputer (not shown), and the like.
- the display unit 130 displays a screen to be shown to the user of the display control apparatus 100-1. That is, the display unit 130 corresponds to the display device 101. For example, the display unit 130 displays the screen according to control by the control unit 150 (display control unit 157).
- the display unit 130 can be implemented by the monitor 91 or the like. That is, the display unit 130 is, for example, a display surface of a touch panel.
- the storage unit 140 stores a program and data for the operation of the display control apparatus 100-1.
- the storage unit 140 stores image data obtained through the imaging unit 110 and / or the control unit 150 (image processing unit 153).
- the storage unit 140 can be implemented by the SDRAM 89, the internal memory 93, the external memory 95, and the like.
- Control unit 150 The control unit 150 provides various functions of the display control apparatus 100-1. For example, the control unit 150 provides the various functions described above by executing a program stored in the storage unit 140 or another storage medium.
- the control unit 150 includes an imaging control unit 151, an image processing unit 153, an operation recognition unit 155, an operation determination unit 156, a display control unit 157, and a parameter change unit 159.
- the image processing unit 153 can be implemented by the CPU 71, the image processing circuit 87, and the like.
- Imaging control unit 151 The imaging control unit 151 performs control related to imaging by the imaging unit 110.
- the imaging control unit 151 controls the timing of imaging. More specifically, for example, the imaging control unit 151 controls the imaging timing by providing an imaging timing control signal to the imaging unit 110.
- the imaging control unit 151 controls driving of the imaging unit 110. More specifically, for example, the imaging control unit 151 transmits a drive control signal to the imaging unit 110, thereby performing lens extension and retraction when the power is turned on and off, changing the zoom magnification of the optical system, adjusting the focus, adjusting the aperture, and the like.
- the image processing unit 153 generates an image by performing image processing. For example, the image processing unit 153 generates an image based on image data obtained through the image sensor. That is, an image is generated based on the image data provided by the imaging unit 110.
- the image data is, for example, RAW data.
- the image processing unit 153 generates a live view image (or a through image) based on image data obtained through the image sensor. For example, in the standby state in the imaging mode, the image processing unit 153 generates a live view image based on image data sequentially obtained through the imaging element.
- the image processing unit 153 generates one or more processed images, under respective image processing conditions, based on the image data at any point in time among the image data.
- the image data is RAW data
- the one or more processed images are arbitrary images that can be generated by image processing based on the RAW data.
- the image processing condition is an arbitrary condition related to image processing.
- the image processing conditions include conditions for developing RAW data (hereinafter referred to as “developing conditions”).
- the development conditions include conditions such as white balance, contrast, and brightness.
- for example, the first processed image may be an image generated by a developing process under a first developing condition, and the second processed image may be an image generated by a developing process under a second developing condition.
- the first processed image may be an image having a higher contrast than the normal image
- the second processed image may be an image brighter than the normal image.
- the image processing conditions include processing (and processing conditions for the processing) further performed on the image after the development processing.
- the processing includes, for example, extraction of a specific color (part color), conversion to a negative image, conversion to an illustration-like image, and the like.
- the third processed image may be an image that has been subjected to the first processing
- the fourth processed image may be an image that has been subjected to the second processing.
- the third processed image may be an image obtained by extracting a specific color
- the fourth processed image may be an image converted into an illustration-like image.
- the image processing unit 153 can generate a processed image under an arbitrary condition related to image processing based on the image data.
- the image processing unit 153 generates one or more processed images based on the image data at any point in time among the image data.
- Any point in time includes, for example, any time point after the display control unit 157 instructs generation of a new processed image (for example, the time point immediately after the instruction). That is, the image processing unit 153 generates a new processed image after receiving an instruction from the display control unit 157.
- any one of the above points may include the start point of the imaging mode. That is, the image processing unit 153 may generate the first processed image when the imaging mode is started.
- the one or more processed images are images in which a subject appearing partially or entirely in the live view image appears further reduced in size. That is, the image processing unit 153 generates one or more processed images in which a subject appearing partially or entirely in the live view image appears further reduced in size. As an example, the one or more processed images are images in which the entire subject appearing in the live view image appears further reduced in size.
- the one or more processed images are a plurality of processed images. That is, the image processing unit 153 generates a plurality of processed images under each image processing condition. For example, a subject that appears in each of the plurality of processed images overlaps at least partially with a subject that appears in one or more other processed images of the plurality of processed images. As an example, the subject that appears in each of the plurality of processed images overlaps with the subject that appears in any other processed image of the plurality of processed images. That is, the subjects shown in the plurality of processed images are common subjects.
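Generating a plurality of processed images from a single frame, one per image processing condition, can be sketched as follows. The condition names and the simple per-pixel operations are illustrative assumptions, not taken from the disclosure:

```python
def adjust_brightness(pixels, offset):
    # Shift each 8-bit value, clamping to [0, 255].
    return [max(0, min(255, p + offset)) for p in pixels]

def adjust_contrast(pixels, gain):
    # Scale each value around the mid-gray point 128.
    return [max(0, min(255, round(128 + gain * (p - 128)))) for p in pixels]

def generate_processed_images(frame):
    """Generate one processed image per image processing condition
    from the same captured frame, as in the thumbnail row of Fig. 4."""
    conditions = {
        "brighter": lambda px: adjust_brightness(px, 40),
        "high_contrast": lambda px: adjust_contrast(px, 1.5),
        "low_contrast": lambda px: adjust_contrast(px, 0.6),
    }
    return {name: f(frame) for name, f in conditions.items()}
```

Because every condition is applied to the same frame, the subject shown in all of the resulting processed images is a common subject, matching the description above.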
- the operation recognition unit 155 recognizes a user operation. For example, when detecting the user operation by the user of the display control apparatus 100-1, the input unit 120 provides the detection result of the user operation to the operation recognition unit 155, and the operation recognition unit 155 acquires the detection result. Then, the operation recognition unit 155 recognizes the user operation based on the detection result.
- the user operation includes a touch operation on the touch panel. That is, the operation recognition unit 155 recognizes a touch operation on the touch panel. More specifically, for example, the operation recognition unit 155 recognizes a touch event as a touch operation on the touch panel.
- the touch event includes, for example, a tap, a double tap, a drag, a flick, and a pinch.
- the operation recognizing unit 155 sequentially acquires touch position information as a detection result of the user operation, and recognizes a touch event from the sequentially acquired touch position information.
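Recognizing a touch event from sequentially acquired touch positions can be sketched roughly as below; the distance and speed thresholds are illustrative assumptions, and a real recognizer would be tuned to the device:

```python
import math

TAP_MAX_DISTANCE = 10   # pixels; movement below this counts as a tap
FLICK_MIN_SPEED = 500   # pixels/second; faster movement counts as a flick

def recognize_touch_event(samples):
    """Classify a single-finger gesture from (time, x, y) samples
    collected between touch-down and touch-up."""
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    duration = max(t1 - t0, 1e-6)
    if distance < TAP_MAX_DISTANCE:
        return "tap"
    if distance / duration >= FLICK_MIN_SPEED:
        return "flick"
    return "drag"
```

Double taps and pinches would additionally require tracking the time between touches and the number of simultaneous touch points, which this sketch omits.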
- the user operation includes a user key input operation in the key input device.
- the operation recognition unit 155 recognizes a key input operation in the key input device. More specifically, for example, the operation recognition unit 155 recognizes the type of key input operation as a key input operation in the key input device.
- the operation determination unit 156 determines whether a predetermined user operation (hereinafter referred to as an "update operation") has been recognized.
- the update operation is a predetermined touch operation on the touch panel. More specifically, for example, the update operation is a touch operation at a predetermined position on the touch panel. As an example, the update operation is a touch operation at a position on the touch panel where the one or more processed images are displayed. That is, the operation determination unit 156 determines whether a touch operation (that is, an update operation) at a position on the touch panel where the one or more processed images are displayed is recognized.
- the operation determination unit 156 monitors the user operation recognized by the operation recognition unit 155, and determines whether the update operation is recognized by determining whether the user operation matches the update operation. Then, when the update operation is recognized, the operation determination unit 156 notifies the display control unit 157 that the update operation has been recognized.
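Determining whether the update operation has been recognized amounts to a hit test of the touch position against the region where the one or more processed images are displayed. A minimal sketch, in which the screen coordinates of the thumbnail strip are made up for illustration:

```python
def in_region(x, y, region):
    left, top, width, height = region
    return left <= x < left + width and top <= y < top + height

# Hypothetical strip along the bottom of the screen where the
# processed images 20 are laid out side by side (cf. Fig. 4).
THUMBNAIL_STRIP = (0, 400, 320, 80)  # left, top, width, height

def is_update_operation(event, x, y):
    """True when a tap lands on the displayed processed images."""
    return event == "tap" and in_region(x, y, THUMBNAIL_STRIP)
```

A parameter change operation would be determined the same way, only against the display region of the parameter change object instead of the thumbnail strip.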
- the operation determination unit 156 determines whether a predetermined other user operation (hereinafter referred to as “parameter change operation”) has been recognized.
- a parameter changing unit 159 changes a parameter for generating at least one of the one or more processed images.
- the parameter changing operation is a touch operation at a display position of an object for changing the parameter (hereinafter referred to as a parameter changing object) on the touch panel.
- the operation determination unit 156 determines whether a touch operation (that is, a parameter change operation) at a position on the touch panel where the parameter change object is displayed is recognized.
- the operation determination unit 156 determines whether the parameter change operation has been recognized by monitoring the user operation recognized by the operation recognition unit 155 and determining whether the user operation matches the parameter change operation. Then, when the parameter change operation is recognized, the operation determination unit 156 notifies the parameter changing unit 159 that the parameter change operation has been recognized.
- the display control unit 157 causes the display device 101 to display a screen to be shown to the user. As described above, since the display device 101 corresponds to the display unit 130, the display control unit 157 causes the display unit 130 to display the screen.
- the display control unit 157 causes the display unit 130 to display the live view image generated based on the image data obtained through the image sensor, and one or more processed images generated, under respective image processing conditions, based on the image data at any point in time.
- the display control unit 157 causes the display unit 130 to display a screen including a live view image generated by the image processing unit 153 and one or more processed images.
- the one or more processed images are images in which a subject appearing partially or entirely in the live view image appears further reduced in size. That is, the display control unit 157 causes the display unit 130 to display one or more processed images in which a subject appearing partially or entirely in the live view image appears further reduced in size. As an example, the one or more processed images are images in which the entire subject appearing in the live view image appears further reduced in size.
- the one or more processed images are a plurality of processed images. That is, the display control unit 157 causes the display unit 130 to display the live view image and the plurality of processed images.
- the plurality of processed images are displayed side by side. That is, the display control unit 157 causes the display unit 130 to display a plurality of processed images side by side.
- the plurality of processed images are displayed side by side, so that the user can more easily compare the plurality of processed images. That is, the user can more easily determine which image processing conditions are more desirable.
- a subject that appears in each of the plurality of processed images overlaps at least partially with a subject that appears in one or more other processed images of the plurality of processed images.
- the subject that appears in each of the plurality of processed images overlaps with the subject that appears in any other processed image of the plurality of processed images. That is, the subjects shown in the plurality of processed images are common subjects.
- the subject appearing in the processed image overlaps between the plurality of processed images, so that the user can more easily compare the plurality of processed images. That is, the user can more easily determine which image processing conditions are more desirable.
- FIG. 4 is an explanatory diagram for explaining an example of a screen including a live view image and a processed image.
- the display control device 100-1 is shown.
- the live view image 10 and the plurality of processed images 20 are shown on the display device 101 of the display control device 100-1.
- the plurality of processed images 20 are displayed side by side.
- each of the plurality of processed images 20 is an image generated under each image processing condition.
- each of the plurality of processed images 20 is an image in which the entire subject appearing in the live view image 10 appears further reduced in size. Therefore, the subject appearing in the plurality of processed images 20 is a common subject.
- the live view image and the processed image are displayed.
- the display control unit 157 updates the displayed one or more processed images when it is determined that the update operation has been recognized.
- when an update operation is recognized, the display control unit 157 is notified by the operation determination unit 156 that the update operation has been recognized. Then, the display control unit 157 instructs the image processing unit 153 to generate a new processed image. Then, the display control unit 157 causes the display unit 130 to display the live view image and one or more processed images generated based on the image data at the new time point. In this way, the one or more processed images are updated.
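This update flow, in which the live view always tracks the latest sensor data while the processed images are regenerated only when the update operation is recognized, can be sketched as a small controller. The class and method names are illustrative, not taken from the disclosure:

```python
class DisplayController:
    """Keeps the live view current on every frame, but refreshes the
    processed images only when an update operation is recognized."""

    def __init__(self, process):
        self.process = process          # maps a frame to processed images
        self.live_view = None
        self.processed_images = None

    def on_new_frame(self, frame):
        # The live view always tracks the latest sensor data.
        self.live_view = frame
        if self.processed_images is None:
            # e.g. the first frame after the imaging mode starts
            self.processed_images = self.process(frame)

    def on_update_operation(self):
        # Regenerate the processed images from the current frame.
        self.processed_images = self.process(self.live_view)
```

Because `on_new_frame` never touches existing processed images, the thumbnails cannot exhibit the dropped-frame appearance that continuous low-frame-rate updating would produce.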
- a specific example of the update of the processed image will be described with reference to FIGS. 5A and 5B.
- FIG. 5A and FIG. 5B are explanatory diagrams for explaining an example of the update of the processed image.
- a display control apparatus 100-1 is shown.
- the live view image 10 and the plurality of processed images 20 are shown on the display device 101 of the display control device 100-1.
- FIG. 5A shows an example when the display control device 100-1 shown in the example of FIG. 4 is moved.
- the plurality of processed images 20 shown in FIG. 5A are the same as the plurality of processed images 20 shown in FIG. 4, but the live view image 10 shown in FIG. 5A is different from the live view image 10 shown in FIG. 4.
- That is, in the example shown in FIG. 5A, the subject that appears in the live view image 10 and the subject that appears in the plurality of processed images 20 are different. For this reason, it is difficult for the user of the display control apparatus 100-1 to imagine what the captured image after image processing will look like when the subject shown in the live view image 10 is captured.
- the user performs a touch operation (for example, a tap) with the finger 30 at the position where the plurality of processed images 20 are displayed. Then, the plurality of processed images 20 are updated. That is, the subject that appears in the live view image 10 becomes the same as the subject that appears in the plurality of processed images 20.
- the user can view a processed image in which the same subject as the live view image is captured. Therefore, the user can easily confirm the captured image after image processing.
- since the processed images are updated only when an update operation is performed, the processed images are never displayed as if frames have been dropped, unlike the case where the processed images are continuously updated at a frame rate lower than that of the live view image. Therefore, the user can check the captured image after image processing without stress. In other words, the user can easily check the captured image after image processing at the time of imaging without stress.
- since the update operation is a touch operation, the user can easily perform the operation while looking at the screen, so that the processed images can be updated more easily.
- since the update operation is a touch operation at the display position of the processed images, the user can update the processed images with a more intuitive operation.
- When it is determined that the update operation has been recognized, the display control unit 157 causes the display unit 130 to display a screen that includes the live view image and the one or more processed images and that notifies the update of the one or more processed images.
- the screen includes an animation of the one or more processed images (hereinafter referred to as “notification animation”) that notifies the update of the one or more processed images. That is, when it is determined that the update operation is recognized, the display control unit 157 causes the display unit 130 to display a screen including the notification animation.
- the notification animation is, for example, a fade-in for the one or more processed images.
- The notification animation may be another animation, such as blinking of the one or more processed images or zooming in on the one or more processed images.
- a specific example of the notification animation will be described with reference to FIG.
- FIG. 6 is an explanatory diagram for explaining an example of a notification animation for notifying update of a processed image.
- The display state of the processed images 20 from time T1 to time T3 is shown. When it is determined that the update operation has been recognized, a new processed image 20 is, for example, displayed with a fade-in from time T1 to time T3 as described above.
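A fade-in of this kind can be sketched as a simple opacity ramp between the start and end times of the animation. The linear ramp and the concrete times below are illustrative assumptions, not taken from the patent.

```python
# Sketch (illustrative): a linear fade-in for a newly generated
# processed image between time T1 (fade start) and T3 (fade end).

def fade_in_alpha(t: float, t_start: float, t_end: float) -> float:
    """Opacity of the new processed image at time t: 0.0 before the
    fade begins, 1.0 once it completes, linear in between."""
    if t <= t_start:
        return 0.0
    if t >= t_end:
        return 1.0
    return (t - t_start) / (t_end - t_start)

# Fade from T1 = 0.0 s to T3 = 0.5 s.
for t in (0.0, 0.25, 0.5):
    print(round(fade_in_alpha(t, 0.0, 0.5), 2))   # 0.0, then 0.5, then 1.0
```

The old processed image can be drawn underneath at opacity `1 - alpha` to cross-fade instead of fading in from the background.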
- the screen may include an object (hereinafter referred to as “notification object”) that notifies the update of the one or more processed images. That is, the display control unit 157 may cause the display unit 130 to display a screen including the notification object when it is determined that the update operation has been recognized.
- The notification object is, for example, a graphic object surrounding the one or more processed images.
- the notification object may be another object such as an arrow object pointing to the one or more processed images or an object including character information such as “Updated”.
- a specific example of the notification object will be described with reference to FIG.
- FIG. 7 is an explanatory diagram for explaining an example of a notification object for notifying update of a processed image.
- the display control device 100-1 is shown as in FIG. 5B.
- the display device 101 displays a live view image 10 and a plurality of processed images 20.
- a rectangular notification object 40 surrounding the plurality of processed images 20 is further displayed on the display device 101. If it is determined that the update operation has been recognized, for example, the notification object 40 is displayed as described above.
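One way to position a surrounding notification object such as the rectangle 40 is to compute the bounding box of the processed-image tiles plus a small margin. This is an illustrative sketch with invented coordinates, not the patent's implementation.

```python
# Sketch (illustrative): computing the rectangle of a notification
# object as the bounding box of the processed-image tiles, expanded
# by a small margin.

def bounding_box(rects, margin=4):
    """rects: list of (x, y, w, h). Returns the enclosing (x, y, w, h)."""
    x0 = min(x for x, y, w, h in rects) - margin
    y0 = min(y for x, y, w, h in rects) - margin
    x1 = max(x + w for x, y, w, h in rects) + margin
    y1 = max(y + h for x, y, w, h in rects) + margin
    return (x0, y0, x1 - x0, y1 - y0)

# Three 100x100 tiles laid out side by side.
tiles = [(20, 500, 100, 100), (130, 500, 100, 100), (240, 500, 100, 100)]
print(bounding_box(tiles))   # -> (16, 496, 328, 108)
```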
- the notification of the update of the processed image ensures that the user notices that the processed image has been updated. Further, the user can understand the relationship between the update operation and the update of the processed image.
- the notification on the screen allows the user who is viewing the screen to more reliably notice that the processed image has been updated.
- The notification animation as described above is very conspicuous on the screen. Therefore, with the notification animation, the user looking at the screen notices more reliably that the processed image has been updated.
- The notification object as described above requires only a small amount of processing for display. Therefore, with the notification object, it is possible to suppress the processing amount in the display processing while still making the user watching the screen aware of the update.
- a parameter changing unit 159 changes a parameter for generating at least one processed image among the one or more processed images. Then, the display control unit 157 causes the display unit 130 to display an object for changing the parameter (that is, a parameter changing object).
- the parameter changing object is a scroll bar.
- FIG. 8 is an explanatory diagram for explaining an example of the parameter changing object.
- the display control device 100-1 is shown as in FIG. 5B.
- the display device 101 displays a live view image 10 and a plurality of processed images 20.
- a parameter changing object 50 that is a scroll bar is further displayed on the display device 101.
- the scroll bar 50 includes a knob portion 51.
- the parameter changing object is displayed in this way.
- the parameter changing unit 159 changes a parameter for generating a processed image.
- the parameter is the degree of contrast of the processed image.
- The parameter changing unit 159 changes a parameter for generating at least one of the one or more processed images while the live view image and the one or more processed images are displayed on the display unit 130.
- the parameter changing unit 159 changes the parameter when it is determined that the parameter changing operation has been recognized.
- When the parameter changing operation is recognized, the operation determination unit 156 notifies the parameter changing unit 159 that the parameter changing operation has been recognized. The parameter changing unit 159 then changes a parameter for generating at least one processed image in accordance with the parameter changing operation.
- the knob 51 of the parameter changing object 50 which is a scroll bar, is moved by the parameter changing operation.
- The parameter changing unit 159 changes the parameter for generating the selected processed image 20 (for example, the processed image 20B) among the plurality of processed images 20 by an amount corresponding to the movement of the knob.
- In this way, the parameters for generating the processed image are changed.
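Mapping the knob movement to the contrast parameter described earlier can be sketched as a linear mapping from knob position to a contrast factor, then applying that factor to pixel values. The value range and the mid-gray formula below are assumptions for illustration, not taken from the patent.

```python
# Sketch (illustrative): mapping the scroll-bar knob position to the
# contrast parameter of the selected processed image, as in FIG. 8.

def knob_to_contrast(knob_pos: int, track_len: int,
                     min_contrast: float = 0.5, max_contrast: float = 2.0) -> float:
    """Linearly map a knob position in [0, track_len] to a contrast factor."""
    frac = max(0.0, min(1.0, knob_pos / track_len))
    return min_contrast + frac * (max_contrast - min_contrast)

def apply_contrast(pixel: float, contrast: float) -> float:
    """Scale a normalized pixel value around mid-gray (0.5), clamped to [0, 1]."""
    return min(1.0, max(0.0, 0.5 + (pixel - 0.5) * contrast))

contrast = knob_to_contrast(knob_pos=150, track_len=200)   # knob at 75% of the track
print(contrast)                                            # -> 1.625
print(round(apply_contrast(0.7, contrast), 3))
```

On a real device, the processed image would be regenerated (or re-rendered) with the new contrast factor each time the knob moves, so the user sees the effect of the parameter immediately.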
- the user can adjust the parameters while actually viewing the processed image. Therefore, the user can easily confirm what the captured image after the image processing will be when the parameter is adjusted. That is, convenience for the user is enhanced.
- FIG. 9 is a flowchart illustrating an example of a schematic flow of the display control process according to the first embodiment.
- the display control process is started by starting an application for imaging.
- In step S301, the imaging mode of the application for imaging starts.
- In step S303, the image processing unit 153 generates one or more new processed images.
- In step S305, the image processing unit 153 generates a live view image.
- In step S307, the display control unit 157 updates the screen including the live view image and the one or more processed images.
- In step S309, the operation determination unit 156 determines whether an operation for ending the application or the imaging mode has been performed. If there is an end operation, the process ends; otherwise, the process proceeds to step S311.
- In step S311, the operation determination unit 156 determines whether an update operation has been performed. If there is an update operation, the process proceeds to step S313; otherwise, the process returns to step S305.
- In step S313, the display control unit 157 instructs the image processing unit 153 to generate a new processed image, and the process returns to step S303.
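The flow of steps S301 to S313 can be sketched as a simple loop: the live view is regenerated every iteration, while the processed images are regenerated only when an update operation occurred. The following is an illustrative Python sketch with the camera, rendering, and input all stubbed out; every name is invented.

```python
# Sketch (illustrative) of the control flow in FIG. 9 (steps S301-S313).

def run_imaging_mode(get_frame, get_events, render, make_processed):
    processed = make_processed(get_frame())          # S303: initial processed images
    while True:
        live_view = get_frame()                      # S305: live view image
        render(live_view, processed)                 # S307: update the screen
        events = get_events()
        if "end" in events:                          # S309: end operation?
            break
        if "update" in events:                       # S311: update operation?
            processed = make_processed(get_frame())  # S313 -> S303: regenerate

# Minimal stand-ins to exercise the loop for three iterations.
frames = iter(range(10))
event_seq = iter([set(), {"update"}, {"end"}])
made = []
run_imaging_mode(
    get_frame=lambda: next(frames),
    get_events=lambda: next(event_seq),
    render=lambda lv, p: None,
    make_processed=lambda f: made.append(f) or [f],
)
print(made)   # prints [0, 3]: processed images were generated only twice
```

The key property of the loop, matching the flowchart, is that `make_processed` (the expensive step) runs only on entry and after an update operation, never once per frame.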
- Japanese Patent Application Laid-Open No. 2007-94819 discloses a technique for displaying, as thumbnails on a monitor, sample images obtained by changing the tone of a captured image at some point in time while a live view image (or through image) is displayed on the monitor.
- According to this technique, a user of a digital camera causes the digital camera to set imaging conditions by selecting a thumbnail with a desired image tone while viewing the live view image.
- However, since the sample image is displayed as a thumbnail, it is reduced even further than the live view image. Therefore, even when the user looks at the sample image displayed as a thumbnail, it may be difficult to confirm in detail what captured image would be obtained.
- a live view image and a processed image in which a subject is captured in a more enlarged state than the live view image are displayed.
- FIG. 10 is a block diagram showing an example of the configuration of the display control apparatus 100-2 according to the second embodiment.
- the display control device 100-2 includes an imaging unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 160.
- The imaging unit 110, the input unit 120, the display unit 130, and the storage unit 140 are the same as in the first embodiment.
- The imaging control unit 151, the image processing unit 153, the operation recognition unit 155, the operation determination unit 156, and the parameter changing unit 159 included in the control unit 160 are also unchanged between the first and second embodiments. Therefore, only the image processing unit 163 and the display control unit 167 included in the control unit 160 will be described here.
- the image processing unit 163 generates an image by performing image processing. For example, the image processing unit 163 generates an image based on image data obtained through the image sensor. That is, an image is generated based on the image data provided by the imaging unit 110.
- the image data is, for example, RAW data.
- the image processing unit 163 generates a live view image (or a through image) based on image data obtained through the image sensor. This point is as described for the image processing unit 153 according to the first embodiment.
- The image processing unit 163 generates one or more processed images under respective image processing conditions based on the image data at some point in time among the image data.
- For example, the image data is RAW data, and the one or more processed images are arbitrary images that can be generated by image processing based on the RAW data.
- the image processing condition is an arbitrary condition related to image processing.
- the image processing conditions include conditions for developing RAW data (hereinafter referred to as “developing conditions”). Further, for example, the image processing conditions include processing (and processing conditions for the processing) further performed on the image after the development processing. These points are as described for the image processing unit 153 according to the first embodiment.
- the image processing unit 163 can generate a processed image under an arbitrary condition related to image processing based on the image data.
- The above point in time is, for example, any point in time (e.g., immediately) after the display control unit instructs generation of a new processed image. This point is also as described for the image processing unit 153 according to the first embodiment.
- the one or more processed images are images in which a subject is captured in a more enlarged state than the live view image.
- the image processing unit 163 generates one or more processed images in which the subject is captured in a more enlarged state than the live view image.
- the one or more processed images are a plurality of processed images. That is, the image processing unit 163 generates a plurality of processed images under each image processing condition. For example, a subject that appears in each of the plurality of processed images overlaps at least partially with a subject that appears in one or more other processed images of the plurality of processed images.
- The display control unit 167 causes the display device 101 to display a screen to be shown to the user. As described above, since the display device 101 corresponds to the display unit 130, the display control unit 167 causes the display unit 130 to display the screen.
- In particular, in this embodiment, the display control unit 167 causes the display unit 130 to display the live view image generated based on image data obtained through the image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data.
- the display control unit 167 causes the display unit 130 to display a screen including a live view image generated by the image processing unit 163 and one or more processed images.
- the one or more processed images are images in which the subject is captured in a more enlarged state than the live view image.
- That is, the display control unit 167 causes the display unit 130 to display one or more processed images in which the subject is captured in a more enlarged state than in the live view image.
- the one or more processed images are a plurality of processed images.
- the plurality of processed images are displayed side by side.
- a subject that appears in each of the plurality of processed images overlaps at least partially with a subject that appears in one or more other processed images of the plurality of processed images.
- FIG. 11 is an explanatory diagram for explaining a first example of a screen including a live view image and a processed image according to the second embodiment.
- the display control device 100-2 is shown.
- the live view image 10 and the plurality of processed images 20 are shown on the display device 101 of the display control device 100-2.
- Each of the plurality of processed images 20 is an image generated under each image processing condition.
- Each of the plurality of processed images 20 is an image in which the subject is captured in an enlarged state as compared with the live view image 10.
- More specifically, each of the plurality of processed images 20 is an image in which the subject appearing in a part 11 of the live view image 10 is captured in a further enlarged state.
- the subject that appears in the plurality of processed images 20 is a common subject.
- the plurality of processed images 20 are displayed side by side.
- A part 11 of the live view image 10 may be indicated on the display device 101, for example by a broken-line rectangular object.
- FIG. 12 is an explanatory diagram for explaining a second example of a screen including a live view image and a processed image according to the second embodiment.
- a display control device 100-2 is shown.
- the live view image 10 and the plurality of processed images 20 are shown on the display device 101 of the display control device 100-2.
- Each of the plurality of processed images 20 is an image generated under each image processing condition.
- Each of the plurality of processed images 20 is an image in which the subject is captured in an enlarged state as compared with the live view image 10.
- the subject that appears in each processed image 20 partially overlaps the subject that appears in another adjacent processed image 20.
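The layout in which each processed image is an enlarged crop of the region 11 and adjacent crops partially overlap, as in FIG. 12, can be sketched by splitting the region into overlapping windows. The sizes and the overlap fraction below are illustrative assumptions, not values from the patent.

```python
# Sketch (illustrative): computing side-by-side crop windows of the
# region 11 of the live view image so that each processed image shows
# the subject enlarged, with adjacent crops partially overlapping.

def crop_windows(region_x: int, region_w: int, n: int, overlap: float):
    """Split a horizontal region of width region_w into n crop windows,
    each overlapping its neighbour by the given fraction of its width.
    Returns a list of (x, width) pairs covering the region exactly."""
    step = region_w / (n - (n - 1) * overlap)   # width of one window
    stride = step * (1 - overlap)               # horizontal distance between windows
    return [(round(region_x + i * stride), round(step)) for i in range(n)]

# Region 11 starts at x=100 and is 300 px wide; three crops, 25% overlap.
for x, w in crop_windows(100, 300, n=3, overlap=0.25):
    print(x, w)   # prints 100 120, 190 120, 280 120
```

With `overlap=0.0` the same function produces the non-overlapping, side-by-side crops of the FIG. 11 example.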
- FIG. 13 is an explanatory diagram for explaining a third example of a screen including a live view image and a processed image according to the second embodiment.
- a display control device 100-2 is shown.
- the live view image 10 and the plurality of processed images 20 are shown on the display device 101 of the display control device 100-2.
- Each of the plurality of processed images 20 is an image generated under each image processing condition.
- Each of the plurality of processed images 20 is an image in which the subject is captured in an enlarged state as compared with the live view image 10.
- the subject that appears in each processed image 20 partially overlaps the subject that appears in another adjacent processed image 20.
- the live view image and the processed image are displayed.
- the display control unit 167 updates the displayed one or more processed images when it is determined that the update operation has been recognized. This point is as described for the display control unit 157 according to the first embodiment.
- The display control unit 167 causes the display unit 130 to display an object for changing the parameter (that is, a parameter changing object).
- the display control process according to the second embodiment is the same as the display control process according to the first embodiment described with reference to FIG.
- The difference between the first and second embodiments is the processed images generated in step S303 shown in FIG. 9.
- As described above, a live view image generated based on image data obtained through an image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data, are displayed on a display device. It is also determined whether an update operation has been recognized, and when it is determined that the update operation has been recognized, the displayed one or more processed images are updated.
- the user can view a processed image in which the same subject as the live view image is captured. Therefore, the user can easily confirm the captured image after image processing.
- Since the processed images are updated only when an update operation is performed, they need not be regenerated at the frame rate of the live view image, so the display never appears to drop frames. Therefore, the user can easily check the captured image after image processing at the time of imaging without stress.
- For example, the display device is a display surface of a touch panel, and the update operation is a predetermined touch operation on the touch panel.
- the update operation is a touch operation at a position on the touch panel where the one or more processed images are displayed.
- the user who performed the update operation is notified of the update of the one or more processed images.
- For example, when it is determined that the update operation has been recognized, a screen that includes the live view image and the one or more processed images and that notifies the update of the one or more processed images is displayed on the display device.
- the screen includes an animation (that is, a notification animation) for the one or more processed images that notifies the update of the one or more processed images.
- the notification animation is very conspicuous on the screen. According to the notification animation, the user watching the screen can more surely notice that the processed image has been updated.
- the screen includes an object (that is, a notification object) that notifies the update of the one or more processed images.
- Since the notification object requires only a small amount of processing for display, it is possible to suppress the processing amount in the display process while notifying the user viewing the screen of the update.
- the one or more processed images are images in which the subject is captured in a more enlarged state than the live view image.
- Alternatively, the one or more processed images may be images in which a subject appearing in a part or the whole of the live view image is captured in a further reduced state.
- the one or more processed images are a plurality of processed images.
- the plurality of processed images are displayed side by side.
- a subject that appears in each of the plurality of processed images overlaps at least partially with a subject that appears in one or more other processed images of the plurality of processed images.
- Further, a parameter for generating at least one of the one or more processed images is changed. It is also determined whether a parameter changing operation has been recognized, and the parameter is changed when it is determined that the parameter changing operation has been recognized.
- the display device is a display surface of a touch panel.
- an object for changing the parameter is further displayed on the display device.
- the parameter changing operation is a touch operation at the display position of the object in the touch panel.
- In the above embodiments, the update operation is a touch operation at the position where the one or more processed images are displayed, but the update operation may instead be a touch operation indicating another position (for example, a position near the position where the one or more processed images are displayed, or a position where a button object is displayed).
- the update operation may be a touch operation at an arbitrary position (for example, a double tap at an arbitrary position, a pinch at an arbitrary position, etc.) instead of a touch operation at a predetermined position.
- the update operation may be a different operation (for example, a key input operation, a voice input operation, an input operation using a sensor, etc.) instead of a touch operation.
- the operation recognition unit may notify the operation determination unit that the update operation or the parameter change operation has been recognized. Then, the operation determination unit may determine that the update operation or the parameter change operation has been recognized based on the notification.
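The notification path described here, in which the recognition unit reports a recognized operation and the determination unit later answers whether it was recognized, can be sketched as a small callback interface. This is an illustrative sketch of one possible design; the class and method names are invented.

```python
# Sketch (illustrative): the operation recognition unit notifying the
# operation determination unit, which records recognized operations
# and answers queries such as "was an update operation recognized?".

class OperationDetermination:
    def __init__(self):
        self._recognized: set[str] = set()

    def on_recognized(self, operation: str) -> None:
        """Called by the recognition unit when an operation is recognized."""
        self._recognized.add(operation)

    def was_recognized(self, operation: str) -> bool:
        """Consume and report whether the operation was recognized
        since the last query."""
        if operation in self._recognized:
            self._recognized.discard(operation)
            return True
        return False

det = OperationDetermination()
det.on_recognized("update")            # recognition unit reports a tap
print(det.was_recognized("update"))    # True
print(det.was_recognized("update"))    # False (already consumed)
```

Consuming the flag on query keeps one recognized tap from triggering more than one regeneration of the processed images.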
- In the above embodiments, the update of the processed image is notified by the screen displayed on the display device, but the present disclosure is not limited to this example.
- the update of the processed image may be notified by another means (for example, sound, vibration, etc.) instead of the notification by the screen or in combination with the notification by the screen.
- the display control device may be another device (for example, a digital camera, a tablet terminal, or the like) having a camera and a display device other than a smartphone.
- the display control device may be a device that does not have a camera or a display device.
- the display control device may be a device (for example, a server) that communicates with a camera or another device having a display device.
- processing steps in the display control processing of this specification do not necessarily have to be executed in time series in the order described in the flowchart.
- the processing steps in the display control process may be executed in an order different from the order described in the flowchart, or may be executed in parallel.
- the following configurations also belong to the technical scope of the present disclosure.
- (1) A display control device including: a display control unit that causes a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data; and a determination unit that determines whether a predetermined user operation has been recognized, wherein the display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
- (2) The display control device according to (1), wherein the display device is a display surface of a touch panel, and the predetermined user operation is a predetermined touch operation on the touch panel.
- (3) The display control device according to (2), wherein the predetermined user operation is a touch operation at a position on the touch panel where the one or more processed images are displayed.
- (4) The display control device according to any one of (1) to (3), wherein, when the one or more processed images are updated, the user who performed the predetermined user operation is notified of the update of the one or more processed images.
- (5) The display control device according to any one of (1) to (4), wherein, when it is determined that the predetermined user operation has been recognized, the display control unit causes the display device to display a screen that includes the live view image and the one or more processed images and that notifies the update of the one or more processed images.
- (6) The display control device according to (5), wherein the screen includes an animation of the one or more processed images that notifies the update of the one or more processed images.
- (7) The display control device according to (5), wherein the screen includes an object that notifies the update of the one or more processed images.
- (8) The display control device according to any one of (1) to (7), wherein the one or more processed images are images in which a subject is captured in a more enlarged state than in the live view image.
- (9) The display control device according to any one of (1) to (7), wherein the one or more processed images are images in which a subject appearing in a part or the whole of the live view image is captured in a further reduced state.
- (10) The display control device according to any one of (1) to (9), wherein the one or more processed images are a plurality of processed images.
- (11) The display control device according to (10), wherein the plurality of processed images are displayed side by side.
- (12) The display control device according to (10) or (11), wherein a subject appearing in each of the plurality of processed images at least partially overlaps a subject appearing in one or more other processed images of the plurality of processed images.
- (13) The display control device according to any one of (1) to (12), further including a changing unit that changes a parameter for generating at least one of the one or more processed images while the live view image and the one or more processed images are displayed on the display device, wherein the determination unit determines whether another predetermined user operation has been recognized, and the changing unit changes the parameter when it is determined that the other predetermined user operation has been recognized.
- (14) The display control device according to (13), wherein the display device is a display surface of a touch panel, the display control unit further causes the display device to display an object for changing the parameter, and the other predetermined user operation is a touch operation at a display position of the object on the touch panel.
- (15) A program for causing a computer to function as: a display control unit that causes a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data; and a determination unit that determines whether a predetermined user operation has been recognized, wherein the display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
- (16) A display control method including: causing a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data; and determining whether a predetermined user operation has been recognized, wherein the displayed one or more processed images are updated when it is determined that the predetermined user operation has been recognized.
Description
1. Appearance of the display control device
2. First embodiment
2.1. Configuration of the display control device
2.2. Flow of processing
3. Second embodiment
3.1. Configuration of the display control device
3.2. Flow of processing
4. Hardware configuration
5. Summary
First, the appearance of a display control device according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is an explanatory diagram for explaining an example of the appearance of the display control device 100 according to an embodiment of the present disclosure. Referring to FIG. 1, the display control device 100 is shown. In this example, the display control device 100 is a smartphone.
Next, an example of the hardware configuration of the display control device 100 according to an embodiment of the present disclosure will be described with reference to FIG. 2. FIG. 2 is a block diagram showing an example of the hardware configuration of the display control device 100 according to an embodiment. Referring to FIG. 2, the display control device 100 includes, for example, a CPU (Central Processing Unit) 71, a lens block 73, a motor 75, a motor driver 77, an image sensor 79, an image sensor driver 81, a timing generator 83, a signal processing circuit 85, an image processing circuit 87, an SDRAM (Synchronous Dynamic Random Access Memory) 89, a monitor 91, an internal memory 93, an external memory 95, a key input device 97, and a touch detection surface 99.
First, the first embodiment of the present disclosure will be described. According to the first embodiment, a live view image and processed images after image processing are displayed, and the one or more processed images are updated in response to a predetermined user operation.
An example of the configuration of the display control device 100-1 according to the first embodiment will be described with reference to FIGS. 3 to 7. FIG. 3 is a block diagram showing an example of the configuration of the display control device 100-1 according to the first embodiment. Referring to FIG. 3, the display control device 100-1 includes an imaging unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 150.
The imaging unit 110 obtains image data through an image sensor. The image data is, for example, RAW data.
The input unit 120 detects user operations and provides the detection results to the control unit 150 (operation recognition unit 155).
The display unit 130 displays a screen to be shown to the user of the display control device 100-1; that is, the display unit 130 corresponds to the display device 101. For example, the display unit 130 displays the screen under the control of the control unit 150 (display control unit 157).
The storage unit 140 stores programs and data for the operation of the display control device 100-1. For example, the storage unit 140 also stores image data obtained through the imaging unit 110 and/or the control unit 150 (image processing unit 153).
The control unit 150 provides various functions of the display control device 100-1, for example by executing a program stored in the storage unit 140 or another storage medium. The control unit 150 includes an imaging control unit 151, an image processing unit 153, an operation recognition unit 155, an operation determination unit 156, a display control unit 157, and a parameter changing unit 159.
The imaging control unit 151 controls imaging by the imaging unit 110.
The image processing unit 153 generates an image by performing image processing. For example, the image processing unit 153 generates an image based on image data obtained through the image sensor, that is, based on the image data provided by the imaging unit 110. The image data is, for example, RAW data.
In particular, in this embodiment, the image processing unit 153 generates a live view image (or through image) based on image data obtained through the image sensor. For example, in the standby state of the imaging mode, the image processing unit 153 generates live view images based on image data sequentially obtained through the image sensor.
Also, in particular in this embodiment, the image processing unit 153 generates one or more processed images under respective image processing conditions based on the image data at some point in time among the image data. For example, the image data is RAW data, and the one or more processed images are arbitrary images that can be generated by image processing based on the RAW data. The image processing conditions are arbitrary conditions related to image processing.
The operation recognition unit 155 recognizes user operations. For example, when the input unit 120 detects a user operation by the user of the display control device 100-1, it provides the detection result to the operation recognition unit 155, which acquires the detection result and recognizes the user operation based on it.
- Determination of the update operation
The operation determination unit 156 determines whether a predetermined user operation (hereinafter referred to as an "update operation") has been recognized.
The operation determination unit 156 also determines whether another predetermined user operation (hereinafter referred to as a "parameter changing operation") has been recognized.
The display control unit 157 causes the display device 101 to display a screen to be shown to the user. As described above, since the display device 101 corresponds to the display unit 130, the display control unit 157 causes the display unit 130 to display the screen.
In particular, in this embodiment, the display control unit 157 causes the display unit 130 to display a live view image generated based on image data obtained through the image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data. For example, the display control unit 157 causes the display unit 130 to display a screen including the live view image and one or more processed images generated by the image processing unit 153.
Also, in particular in this embodiment, the display control unit 157 updates the displayed one or more processed images when it is determined that the update operation has been recognized.
Also, for example, when the one or more processed images are updated, the user who performed the update operation is notified of the update. For example, when it is determined that the update operation has been recognized, the display control unit 157 causes the display unit 130 to display a screen that includes the live view image and the one or more processed images and that notifies the update of the one or more processed images.
Also, for example, as described later, a parameter for generating at least one of the one or more processed images is changed by the parameter changing unit 159. The display control unit 157 then causes the display unit 130 to display an object for changing the parameter (that is, a parameter changing object). As an example, the parameter changing object is a scroll bar. A specific example of this point will be described with reference to FIG. 8.
The parameter changing unit 159 changes a parameter for generating a processed image. As an example, when the processed image is an image with higher contrast than a normal image, the parameter is the degree of contrast of the processed image.
Next, an example of the display control process according to the first embodiment will be described with reference to FIG. 9. FIG. 9 is a flowchart showing an example of a schematic flow of the display control process according to the first embodiment. The display control process is started by starting an application for imaging.
Next, the second embodiment of the present disclosure will be described.
An example of the configuration of the display control device 100-2 according to the second embodiment will be described with reference to FIGS. 10 to 13. FIG. 10 is a block diagram showing an example of the configuration of the display control device 100-2 according to the second embodiment. Referring to FIG. 10, the display control device 100-2 includes an imaging unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 160.
The image processing unit 163 generates an image by performing image processing. For example, the image processing unit 163 generates an image based on image data obtained through the image sensor, that is, based on the image data provided by the imaging unit 110. The image data is, for example, RAW data.
The image processing unit 163 generates a live view image (or through image) based on image data obtained through the image sensor. This point is as described for the image processing unit 153 according to the first embodiment.
The image processing unit 163 also generates one or more processed images under respective image processing conditions based on the image data at some point in time among the image data. For example, the image data is RAW data, and the one or more processed images are arbitrary images that can be generated by image processing based on the RAW data. The image processing conditions are arbitrary conditions related to image processing.
The display control unit 167 causes the display device 101 to display a screen to be shown to the user. As described above, since the display device 101 corresponds to the display unit 130, the display control unit 167 causes the display unit 130 to display the screen.
In this embodiment, the display control unit 167 causes the display unit 130 to display a live view image generated based on image data obtained through the image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data. For example, the display control unit 167 causes the display unit 130 to display a screen including the live view image and one or more processed images generated by the image processing unit 163.
Also, in the second embodiment as well, the display control unit 167 updates the displayed one or more processed images when it is determined that the update operation has been recognized. This point is as described for the display control unit 157 according to the first embodiment.
Also, in the second embodiment as well, when the one or more processed images are updated, the user who performed the update operation is notified of the update. This point is as described for the display control unit 157 according to the first embodiment.
Also, in the second embodiment as well, the display control unit 167 causes the display unit 130 to display an object for changing the parameter (that is, a parameter changing object). This point is as described for the display control unit 157 according to the first embodiment.
The display control process according to the second embodiment is the same as the display control process according to the first embodiment described with reference to FIG. 9. The difference between the first and second embodiments is the processed images generated in step S303 shown in FIG. 9.
The display control device 100 and the display control process according to the embodiments of the present disclosure have been described above with reference to FIGS. 1 to 13. According to these embodiments, a live view image generated based on image data obtained through an image sensor, and one or more processed images generated under respective image processing conditions based on the image data at some point in time among the image data, are displayed on a display device. It is also determined whether an update operation has been recognized, and when it is determined that the update operation has been recognized, the displayed one or more processed images are updated.
(1)
A display control apparatus including:
a display control unit configured to cause a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, each under its own image processing condition, based on the image data at some point in time among the image data; and
a determination unit configured to determine whether a predetermined user operation has been recognized,
wherein the display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
(2)
The display control apparatus according to (1), wherein
the display device is a display surface of a touch panel, and
the predetermined user operation is a predetermined touch operation on the touch panel.
(3)
The display control apparatus according to (2), wherein the predetermined user operation is a touch operation at a position on the touch panel where the one or more processed images are displayed.
(4)
The display control apparatus according to any one of (1) to (3), wherein, when the one or more processed images are updated, the user who performed the predetermined user operation is notified of the update of the one or more processed images.
(5)
The display control apparatus according to any one of (1) to (4), wherein, when it is determined that the predetermined user operation has been recognized, the display control unit causes the display device to display a screen that includes the live view image and the one or more processed images and that notifies the user of the update of the one or more processed images.
(6)
The display control apparatus according to (5), wherein the screen includes an animation of the one or more processed images that notifies the user of the update of the one or more processed images.
(7)
The display control apparatus according to (5), wherein the screen includes an object that notifies the user of the update of the one or more processed images.
(8)
The display control apparatus according to any one of (1) to (7), wherein the one or more processed images are images in which a subject appears more enlarged than in the live view image.
(9)
The display control apparatus according to any one of (1) to (7), wherein the one or more processed images are images in which a subject appearing in part or all of the live view image appears more reduced.
(10)
The display control apparatus according to any one of (1) to (9), wherein the one or more processed images are a plurality of processed images.
(11)
The display control apparatus according to (10), wherein the plurality of processed images are displayed side by side.
(12)
The display control apparatus according to (10) or (11), wherein a subject appearing in each of the plurality of processed images at least partially overlaps a subject appearing in one or more other processed images among the plurality of processed images.
(13)
The display control apparatus according to any one of (1) to (12), further including
a changing unit configured to change, while the live view image and the one or more processed images are displayed on the display device, a parameter for generating at least one of the one or more processed images,
wherein the determination unit determines whether a predetermined other user operation has been recognized, and
the changing unit changes the parameter when it is determined that the predetermined other user operation has been recognized.
(14)
The display control apparatus according to (13), wherein
the display device is a display surface of a touch panel,
the display control unit further causes the display device to display an object for changing the parameter, and
the predetermined other user operation is a touch operation at a display position of the object on the touch panel.
(15)
A program for causing a computer to function as:
a display control unit configured to cause a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, each under its own image processing condition, based on the image data at some point in time among the image data; and
a determination unit configured to determine whether a predetermined user operation has been recognized,
wherein the display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
(16)
A display control method including:
causing a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, each under its own image processing condition, based on the image data at some point in time among the image data; and
determining whether a predetermined user operation has been recognized,
wherein the displayed one or more processed images are updated when it is determined that the predetermined user operation has been recognized.
20 Processed image
30 Finger
40 Notification object
100 Display control apparatus
110 Imaging unit
120 Input unit
130 Display unit
140 Storage unit
150, 160 Control unit
151 Imaging control unit
153 Image processing unit
155 Operation recognition unit
156 Operation determination unit
157, 167 Display control unit
159 Parameter changing unit
Claims (16)
- A display control apparatus including:
a display control unit configured to cause a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, each under its own image processing condition, based on the image data at some point in time among the image data; and
a determination unit configured to determine whether a predetermined user operation has been recognized,
wherein the display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
- The display control apparatus according to claim 1, wherein
the display device is a display surface of a touch panel, and
the predetermined user operation is a predetermined touch operation on the touch panel.
- The display control apparatus according to claim 2, wherein the predetermined user operation is a touch operation at a position on the touch panel where the one or more processed images are displayed.
- The display control apparatus according to claim 1, wherein, when the one or more processed images are updated, the user who performed the predetermined user operation is notified of the update of the one or more processed images.
- The display control apparatus according to claim 1, wherein, when it is determined that the predetermined user operation has been recognized, the display control unit causes the display device to display a screen that includes the live view image and the one or more processed images and that notifies the user of the update of the one or more processed images.
- The display control apparatus according to claim 5, wherein the screen includes an animation of the one or more processed images that notifies the user of the update of the one or more processed images.
- The display control apparatus according to claim 5, wherein the screen includes an object that notifies the user of the update of the one or more processed images.
- The display control apparatus according to claim 1, wherein the one or more processed images are images in which a subject appears more enlarged than in the live view image.
- The display control apparatus according to claim 1, wherein the one or more processed images are images in which a subject appearing in part or all of the live view image appears more reduced.
- The display control apparatus according to claim 1, wherein the one or more processed images are a plurality of processed images.
- The display control apparatus according to claim 10, wherein the plurality of processed images are displayed side by side.
- The display control apparatus according to claim 10, wherein a subject appearing in each of the plurality of processed images at least partially overlaps a subject appearing in one or more other processed images among the plurality of processed images.
- The display control apparatus according to claim 1, further including
a changing unit configured to change, while the live view image and the one or more processed images are displayed on the display device, a parameter for generating at least one of the one or more processed images,
wherein the determination unit determines whether a predetermined other user operation has been recognized, and
the changing unit changes the parameter when it is determined that the predetermined other user operation has been recognized.
- The display control apparatus according to claim 13, wherein
the display device is a display surface of a touch panel,
the display control unit further causes the display device to display an object for changing the parameter, and
the predetermined other user operation is a touch operation at a display position of the object on the touch panel.
- A program for causing a computer to function as:
a display control unit configured to cause a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, each under its own image processing condition, based on the image data at some point in time among the image data; and
a determination unit configured to determine whether a predetermined user operation has been recognized,
wherein the display control unit updates the displayed one or more processed images when it is determined that the predetermined user operation has been recognized.
- A display control method including:
causing a display device to display a live view image generated based on image data obtained through an image sensor, and one or more processed images generated, each under its own image processing condition, based on the image data at some point in time among the image data; and
determining whether a predetermined user operation has been recognized,
wherein the displayed one or more processed images are updated when it is determined that the predetermined user operation has been recognized.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13870589.2A EP2945364B1 (en) | 2013-01-08 | 2013-11-15 | Display control device, program, and display control method |
JP2014556335A JP6245182B2 (ja) | 2013-01-08 | 2013-11-15 | Display control device, program and display control method |
US14/758,715 US9894262B2 (en) | 2013-01-08 | 2013-11-15 | Display control apparatus to enable a user to check a captured image after image processing |
US15/861,766 US10440260B2 (en) | 2013-01-08 | 2018-01-04 | Display control apparatus to enable a user to check a captured image after image processing |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013001016 | 2013-01-08 | ||
JP2013-001016 | 2013-01-08 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/758,715 A-371-Of-International US9894262B2 (en) | 2013-01-08 | 2013-11-15 | Display control apparatus to enable a user to check a captured image after image processing |
US15/861,766 Continuation US10440260B2 (en) | 2013-01-08 | 2018-01-04 | Display control apparatus to enable a user to check a captured image after image processing |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014109129A1 (ja) | 2014-07-17 |
Family
ID=51166794
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/080914 WO2014109129A1 (ja) | 2013-01-08 | 2013-11-15 | 表示制御装置、プログラム及び表示制御方法 |
Country Status (4)
Country | Link |
---|---|
US (2) | US9894262B2 (ja) |
EP (1) | EP2945364B1 (ja) |
JP (1) | JP6245182B2 (ja) |
WO (1) | WO2014109129A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6617428B2 (ja) * | 2015-03-30 | 2019-12-11 | 株式会社ニコン | 電子機器 |
JP7050067B2 (ja) * | 2016-12-14 | 2022-04-07 | サムスン エレクトロニクス カンパニー リミテッド | ディスプレイ装置及びその制御方法 |
CN109348137B (zh) * | 2018-11-30 | 2021-11-16 | 努比亚技术有限公司 | 移动终端拍照控制方法、装置、移动终端以及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11284884A (ja) * | 1998-03-30 | 1999-10-15 | Sanyo Electric Co Ltd | デジタルカメラ |
JP2005078343A (ja) * | 2003-08-29 | 2005-03-24 | Olympus Corp | データ書換装置 |
JP2005110097A (ja) * | 2003-10-01 | 2005-04-21 | Sony Corp | 撮像装置および撮像方法 |
JP2005229326A (ja) * | 2004-02-13 | 2005-08-25 | Casio Comput Co Ltd | カメラ装置及びスルー画像表示方法 |
JP2007194819A (ja) | 2006-01-18 | 2007-08-02 | Casio Comput Co Ltd | カメラ装置、及び撮影条件設定方法 |
JP2012015827A (ja) * | 2010-07-01 | 2012-01-19 | Nec Casio Mobile Communications Ltd | 端末装置及びプログラム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030169350A1 (en) * | 2002-03-07 | 2003-09-11 | Avi Wiezel | Camera assisted method and apparatus for improving composition of photography |
US7349020B2 (en) * | 2003-10-27 | 2008-03-25 | Hewlett-Packard Development Company, L.P. | System and method for displaying an image composition template |
KR100724426B1 (ko) * | 2005-10-11 | 2007-06-04 | 엘지전자 주식회사 | 영상촬영 장치 및 방법 |
US9148618B2 (en) * | 2009-05-29 | 2015-09-29 | Apple Inc. | Systems and methods for previewing newly captured image content and reviewing previously stored image content |
JP5457217B2 (ja) * | 2010-02-02 | 2014-04-02 | オリンパスイメージング株式会社 | カメラ |
US8332429B2 (en) * | 2010-06-22 | 2012-12-11 | Xerox Corporation | Photography assistant and method for assisting a user in photographing landmarks and scenes |
US20120198386A1 (en) * | 2011-01-31 | 2012-08-02 | Nokia Corporation | Causing display of thumbnail images |
JP5806512B2 (ja) * | 2011-05-31 | 2015-11-10 | オリンパス株式会社 | 撮像装置、撮像方法および撮像プログラム |
JP5880263B2 (ja) * | 2012-05-02 | 2016-03-08 | ソニー株式会社 | 表示制御装置、表示制御方法、プログラムおよび記録媒体 |
KR101969424B1 (ko) * | 2012-11-26 | 2019-08-13 | 삼성전자주식회사 | 촬영된 이미지를 표시하는 촬영 장치 및 그 촬영 방법 |
US9497384B2 (en) * | 2013-11-26 | 2016-11-15 | Kathleen Panek-Rickerson | Template photography and methods of using the same |
2013
- 2013-11-15 WO PCT/JP2013/080914 patent/WO2014109129A1/ja active Application Filing
- 2013-11-15 US US14/758,715 patent/US9894262B2/en active Active
- 2013-11-15 EP EP13870589.2A patent/EP2945364B1/en active Active
- 2013-11-15 JP JP2014556335A patent/JP6245182B2/ja not_active Expired - Fee Related

2018
- 2018-01-04 US US15/861,766 patent/US10440260B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11284884A (ja) * | 1998-03-30 | 1999-10-15 | Sanyo Electric Co Ltd | デジタルカメラ |
JP2005078343A (ja) * | 2003-08-29 | 2005-03-24 | Olympus Corp | データ書換装置 |
JP2005110097A (ja) * | 2003-10-01 | 2005-04-21 | Sony Corp | 撮像装置および撮像方法 |
JP2005229326A (ja) * | 2004-02-13 | 2005-08-25 | Casio Comput Co Ltd | カメラ装置及びスルー画像表示方法 |
JP2007194819A (ja) | 2006-01-18 | 2007-08-02 | Casio Comput Co Ltd | カメラ装置、及び撮影条件設定方法 |
JP2012015827A (ja) * | 2010-07-01 | 2012-01-19 | Nec Casio Mobile Communications Ltd | 端末装置及びプログラム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2945364A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014109129A1 (ja) | 2017-01-19 |
US20150358532A1 (en) | 2015-12-10 |
EP2945364B1 (en) | 2021-12-29 |
US20180131867A1 (en) | 2018-05-10 |
US9894262B2 (en) | 2018-02-13 |
JP6245182B2 (ja) | 2017-12-13 |
EP2945364A1 (en) | 2015-11-18 |
EP2945364A4 (en) | 2016-07-06 |
US10440260B2 (en) | 2019-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6445219B2 (ja) | Digital camera and display method of digital camera | |
US10805522B2 (en) | Method of controlling camera of device and device thereof | |
JP4655135B2 (ja) | Imaging apparatus, imaging area display method, and imaging area display program | |
US8582891B2 (en) | Method and apparatus for guiding user with suitable composition, and digital photographing apparatus | |
US20140176776A1 (en) | Display control apparatus and display control method | |
US20080136942A1 (en) | Image sensor equipped photographing apparatus and picture photographing method | |
JP2006303651A (ja) | Electronic device | |
US10440260B2 (en) | Display control apparatus to enable a user to check a captured image after image processing | |
JP2007199311A (ja) | Image display device and imaging device | |
JP2006261912A (ja) | Photographing device | |
JP2006344168A (ja) | Image display device and photographing device | |
US9160926B2 (en) | Image processing apparatus having display device, control method therefor, and storage medium | |
JP6742789B2 (ja) | Display control device, control method therefor, program, and storage medium | |
JP2007178735A (ja) | Imaging device, method for displaying angle of view during zooming, and program | |
KR101812656B1 (ko) | Digital photographing apparatus and control method thereof | |
JP5479190B2 (ja) | Imaging apparatus and control method of imaging apparatus | |
JP2009282436A (ja) | Liquid crystal display device and liquid crystal display method | |
JP5003803B2 (ja) | Image output device and program | |
JP2007123961A (ja) | Image display device and video signal generation method | |
JP2011035918A (ja) | Camera with digital zoom function | |
JP2011210204A (ja) | Difference image generation device, program, and difference image generation method | |
JP2019009527A (ja) | Image processing device, control method therefor, and program | |
KR20080017977A (ko) | Image capturing apparatus and method for changing image capturing conditions | |
JPH10224677A (ja) | Information processing device and recording medium | |
JP2006211434A (ja) | Camera with digital zoom function | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13870589 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014556335 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2013870589 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 14758715 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |