CN107295247B - Image recording apparatus and control method thereof - Google Patents


Info

Publication number
CN107295247B
CN107295247B (Application CN201710223065.6A)
Authority
CN
China
Prior art keywords
image
display
recording
control unit
recording apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710223065.6A
Other languages
Chinese (zh)
Other versions
CN107295247A (en)
Inventor
太田知宏
中岛道纪
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Priority to JP2016-079862
Priority to JP2016-191330 (granted as JP6808424B2)
Application filed by Canon Inc filed Critical Canon Inc
Publication of CN107295247A
Application granted
Publication of CN107295247B


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225: Television cameras; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/232: Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
    • H04N 5/23229: Devices for controlling television cameras comprising further processing of the captured image without influencing the image pickup process
    • H04N 5/23235: Devices for controlling television cameras comprising further processing of the captured image without influencing the image pickup process by using a single image in order to influence the resolution
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225: Television cameras; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/232: Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
    • H04N 5/23218: Control of camera operation based on recognized objects
    • H04N 5/23219: Control of camera operation based on recognized objects where the recognized objects include parts of the human body, e.g. human faces, facial parts or facial expressions

Abstract

The invention provides an image recording apparatus and a control method thereof. The image recording apparatus: generates a reduced image by reducing an image; generates a first cut-out image by cutting out a part of the image before the reduction; performs processing for image recording, involving writing into a memory, on the reduced image, and performs recording processing for recording the processed image; presents a first display by outputting an image based on the reduced image to a display unit, and presents an enlarged display larger than the first display by outputting an image based on the first cut-out image to the display unit during the recording processing; and does not perform a specific process involving reading data from or writing data into the memory at least while the enlarged display is in progress during the recording processing.

Description

Image recording apparatus and control method thereof
Technical Field
The present invention relates to an image recording apparatus equipped with a function of recording an image and also displaying a part of the image in an enlarged manner, and a control method of the image recording apparatus.
Background
In particular, when a video image is captured at 4K or 8K, it is difficult to perform manual focusing correctly while visually confirming the relatively small viewfinder or monitor provided on the image capturing apparatus, so a function for assisting this confirmation is required. Here, 4K refers to 2160p (4096 × 2160) or a number of pixels close thereto, and 8K refers to 4320p (8192 × 4320) or a number of pixels close thereto.
Japanese Patent Laid-Open No. 11-341331 discusses a technique that facilitates confirming the focus state of details of an object by displaying a moving image of the object enlarged at a predetermined ratio at the time of manual focusing.
The growing pixel counts of image sensors and video formats have led to increasing demands on processing performance that an image recording apparatus such as an image pickup apparatus must meet, particularly the read and write performance of the memory (random access memory (RAM)) used for data processing in the apparatus.
For example, in order to present a high-quality enlarged display like that of Japanese Patent Laid-Open No. 11-341331, a partial area of the RAW signal acquired from the image sensor should be cut out and developed, and the data should be transmitted to a display device. Here, the memory is used to temporarily store the RAW image and to read and write video RAM (VRAM) data. Further, in order to record moving image data conforming to a predetermined video format into a recording medium, separate reads from and writes into the memory occur for the development processing and the encoding processing. In addition, additional functions such as face detection and transmission of images to an external device also use the memory for their respective control processes.
When all these processes are made to operate simultaneously, the read and write performance of the memory may be insufficient depending on the system. In this case, the development processing and the encoding processing cannot be completed within a predetermined period of time, which results in frame loss and broken recording data.
An image recording apparatus that handles video data with a large number of pixels (e.g., 4K and 8K) is highly likely to face this problem. Further, for image pickup at the high frame rates (slow motion) widely used in recent years, the signal processing for one frame must be completed in a shorter unit time. Even if image capturing comes to combine a high frame rate with a large number of pixels, it cannot be said that this problem will be easily solved by improvements in system performance in the near future.
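The scaling pressure described here can be made concrete with simple pixel-rate arithmetic. The following sketch is a rough illustration only; actual memory traffic also depends on bit depth, the number of memory passes per frame, and compression, none of which are specified above.

```python
def pixel_rate(width, height, fps):
    """Pixels per second one stream demands of the processing pipeline."""
    return width * height * fps

# Resolutions taken from the text; 60 fps stands in for 59.94p.
rate_2k_60 = pixel_rate(2048, 1080, 60)   # ~132.7 Mpixels/s
rate_4k_60 = pixel_rate(4096, 2160, 60)   # ~530.8 Mpixels/s
rate_8k_60 = pixel_rate(8192, 4320, 60)   # ~2123.4 Mpixels/s

# 4K at the same frame rate needs exactly 4x the throughput of 2K,
# and doubling the frame rate doubles it again.
print(rate_4k_60 // rate_2k_60)  # 4
```

The same multiplication applies to every process that touches the memory, which is why several simultaneous 4K/8K data paths can exceed a fixed read/write budget.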
Disclosure of Invention
The present invention is directed to a technique capable of presenting a high-definition enlarged display by using limited read and write performance of a memory.
According to an aspect of the present invention, an image recording apparatus includes: an acquisition unit configured to acquire an image; a reduced image generating unit configured to generate a reduced image by reducing the image acquired by the acquisition unit; a first cut-out image generating unit configured to generate a first cut-out image by cutting out a part of the image not yet reduced by the reduced image generating unit, and to store the generated first cut-out image in a memory; a processing unit configured to perform a specific process involving writing data into or reading data from the memory; a recording processing unit configured to perform processing for image recording, involving writing into the memory, on the reduced image stored in the memory, and to perform recording processing for recording the processed image into a storage unit; a display control unit configured to control to present a first display by outputting an image based on the reduced image to a display unit, and to control to present an enlarged display larger than the first display by outputting an image based on the first cut-out image to the display unit during the recording processing; and a control unit configured to control not to execute the specific process at least while the enlarged display is in progress during the recording processing.
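The arbitration rule at the heart of this aspect can be sketched in a few lines. This is a hypothetical model for illustration, not the apparatus's actual control code; the class and method names are inventions.

```python
class RecordingController:
    """Sketch of the claimed control rule: the memory-hungry "specific
    process" (e.g. face detection or live-view transmission) is withheld
    only while an enlarged display runs during recording, so the
    recording path keeps its share of memory bandwidth."""

    def __init__(self):
        self.recording = False
        self.enlarged_display = False

    def may_run_specific_process(self):
        # Blocked only in the combined state; either condition alone
        # leaves enough bandwidth for the specific process.
        return not (self.recording and self.enlarged_display)

c = RecordingController()
c.recording = True
print(c.may_run_specific_process())  # True: recording alone does not block it
c.enlarged_display = True
print(c.may_run_specific_process())  # False: enlarged display during recording
```

The point of the rule is that neither recording nor enlargement alone exhausts the memory; only their combination does, so only that combination suppresses auxiliary work.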
Further features of the invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Drawings
Fig. 1A is a hardware block diagram of the image recording apparatus, and fig. 1B is a functional block diagram of the image recording apparatus.
Fig. 2 is a flowchart illustrating the overall operation of the image recording apparatus.
Fig. 3 is a flowchart illustrating a setting change process.
Fig. 4 is a flowchart illustrating a face detection and tracking process.
Fig. 5 is a flowchart illustrating a browser communication process.
Fig. 6 is a flowchart illustrating a process for changing a recording state and changing a magnification state.
Figs. 7A to 7D illustrate examples of video images output from the HDMI output unit 116 and displays presented on the panel 118.
Figs. 8A to 8C illustrate examples of video images output from the HDMI output unit 116 and displays presented on the panel 118 when the verify key is operated.
Fig. 9A to 9C illustrate examples of display on a web browser.
Fig. 10 illustrates an example of an internal configuration of the outward output unit 115 or the panel output unit 117 according to an exemplary modification.
Detailed Description
Exemplary embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
It should be noted that the following exemplary embodiment is only one example for implementing the present invention, and can be modified or changed as appropriate depending on individual structures and various conditions of an apparatus to which the present invention is applied. Therefore, the present invention is by no means limited to the following exemplary embodiments.
In the following description, exemplary embodiments of the present invention will be described with reference to the accompanying drawings. In the present exemplary embodiment, an image pickup apparatus capable of shooting and recording a moving image is described as an example of the image recording apparatus 100 as one exemplary embodiment to which the present invention is applied.
Fig. 1A is a hardware block diagram illustrating an example of the internal configuration of the image recording apparatus 100 according to the present exemplary embodiment. Fig. 1B is a functional block diagram illustrating the data paths in the image processing performed by the image recording apparatus 100. For the blocks 150, 151, and 173 shown in fig. 1B, the number of pixels at each processing stage when an image captured at 4K is recorded at 2K is additionally shown by way of example. Needless to say, these numbers differ when at least one of the number of pixels in the captured image and the number of pixels to be recorded is set differently.
In fig. 1A, a lens unit 101 includes a fixed lens group for converging light, a variable power lens group, a diaphragm, and a correction lens group having both a function of correcting an image forming position shifted due to movement of the variable power lens group and a function of performing focus adjustment. With these components, an object image is formed on an imaging surface of an image sensor 102, which will be described later, through the lens unit 101.
The image sensor 102 converts light into electric charges to generate an image pickup signal. The generated image pickup signal is output to the preprocessing unit 103. The image sensor 102 is, for example, a Charge Coupled Device (CCD) image sensor or a Complementary Metal Oxide Semiconductor (CMOS) image sensor.
The preprocessing unit 103 converts the image pickup signal input from the image sensor 102 into RAW data (a RAW image) that can be received by the RAW reduction unit 104 and the RAW cut-out unit 105, and outputs the converted RAW data to the RAW reduction unit 104 and the RAW cut-out unit 105 (block 150).
The RAW reduction unit 104 generates reduced RAW data (a reduced RAW image) by reducing the RAW data generated by the preprocessing unit 103 at a predetermined ratio, and outputs the generated reduced RAW data to the image generation unit 106 (reduced-image generation in block 151). For example, the RAW reduction unit 104 generates reduced RAW data of 2048 × 1080 pixels by halving, horizontally and vertically, the RAW data of 4096 × 2160 pixels input from the preprocessing unit 103, and outputs the generated reduced RAW data to the image generation unit 106.
The RAW cut-out unit 105 generates cut-out RAW data by cutting out a predetermined range of the RAW data generated by the preprocessing unit 103 at a predetermined size (block 173), and stores the generated cut-out RAW data in the Dynamic Random Access Memory (DRAM) 113 (block 174). For example, the RAW cut-out unit 105 generates cut-out RAW data by cutting out 2048 × 1080 pixels from a predetermined position of the RAW data of 4096 × 2160 pixels input from the preprocessing unit 103.
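The two geometry operations above can be sketched on dimensions alone. The sizes come from the text; the clamping of the cut-out window to the frame boundary is a common safeguard added here for illustration, since the text only says the cut-out comes from a predetermined position.

```python
def reduced_size(width, height, factor=2):
    """Frame size after reducing by `factor` horizontally and vertically,
    as the RAW reduction unit does (4096 x 2160 -> 2048 x 1080)."""
    return width // factor, height // factor

def clamp_cutout(x, y, src_w, src_h, win_w=2048, win_h=1080):
    """Top-left corner of a win_w x win_h cut-out window, clamped so the
    window stays entirely inside the source frame (hypothetical safeguard)."""
    x = max(0, min(x, src_w - win_w))
    y = max(0, min(y, src_h - win_h))
    return x, y

print(reduced_size(4096, 2160))              # (2048, 1080)
print(clamp_cutout(3000, 2000, 4096, 2160))  # (2048, 1080): pulled back to fit
```

Note that both outputs happen to be 2048 × 1080 pixels: the reduced image covers the whole field of view at half scale, while the cut-out covers a quarter of the field at full scale, which is what makes it suitable for the enlarged focus-assist display.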
The image generation unit 106 generates full-view image data in YUV format by performing RAW development processing such as interpolation processing and image quality adjustment processing on the reduced RAW data input from the RAW reduction unit 104 (block 152), and stores the generated full-view image data in the DRAM113 (block 153). The full-view image data is developed data including the entire range of the photographed image that is not cut out.
Further, regarding the cut-out RAW data input from the RAW cut-out unit 105, the image generation unit 106 generates cut-out image data by reading the cut-out RAW data out of the DRAM113 and performing RAW development processing on it (block 175), and transfers the generated cut-out image data to the resolution conversion unit 2(114). The image generation unit 106 is limited in the image size it can process per unit time, and therefore cannot process the reduced RAW data and the cut-out RAW data at the same time. Therefore, the cut-out RAW data is first stored in the DRAM113 before being input to the image generation unit 106 (block 174), so that the timings for processing the reduced RAW data and the cut-out RAW data are shifted.
The resolution conversion unit 1(123) converts the resolution of the YUV data. The resolution converting unit 1(123) generates reduced image data for face detection by performing reduction resizing processing on the full-view image data stored in the DRAM113 (block 154), and stores the generated reduced image data for face detection into the DRAM113 (block 155). Further, the resolution converting unit 1(123) generates reduced data for Joint Photographic Experts Group (JPEG) compression by performing reduction resizing processing on the full-view image data stored in the DRAM113 (block 154), and stores the generated reduced data for JPEG compression in the DRAM113 (block 157).
The still image compression unit 107 generates JPEG data by encoding the reduced data for JPEG compression stored in the DRAM113 according to the JPEG format (block 158), and stores the generated JPEG data into the DRAM113 (block 159).
The network unit 108 implements a function of transmitting a web application to the web browser 126 as a communication destination by using the Hypertext Transfer Protocol (HTTP) via wireless communication such as Wireless Fidelity (Wi-Fi) or Bluetooth. The web application includes data segments of Hypertext Markup Language (HTML) and Cascading Style Sheets (CSS) stored in a Read Only Memory (ROM) 125 described later. Further, the network unit 108 also implements a function of receiving a processing request from the web browser 126 and transmitting the received processing request to the control unit 111. Through the web application, the web browser 126 can function as a remote controller of the image recording apparatus 100. Further, the network unit 108 can display a live view image on the web browser 126 in real time by successively transmitting the JPEG data set in the DRAM113 in accordance with requests from the web browser 126 (block 160). This series of communication functions will be collectively referred to as the browser remote function.
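The browser remote function amounts to serving two kinds of content: the web application's static assets, delivered once, and the most recent JPEG frame, delivered repeatedly. The dispatch sketch below is hypothetical; the actual paths, payloads, and protocol details are not specified in the text.

```python
# Hypothetical request dispatch for the browser remote function.
APP_ASSETS = {
    "/": ("text/html", b"<html><!-- remote-control UI --></html>"),
    "/style.css": ("text/css", b"body { margin: 0 }"),
}

def handle_request(path, latest_jpeg):
    """Return (content_type, body) for one browser-remote HTTP request.

    latest_jpeg stands in for the JPEG data the still image compression
    unit last stored in the DRAM (block 159)."""
    if path in APP_ASSETS:            # the web application itself
        return APP_ASSETS[path]
    if path == "/liveview.jpg":       # the browser polls this repeatedly
        return ("image/jpeg", latest_jpeg)
    return ("text/plain", b"not found")

ctype, _ = handle_request("/liveview.jpg", b"\xff\xd8...")
print(ctype)  # image/jpeg
```

Because the browser re-requests the live-view path continuously, each request costs a DRAM read of the JPEG buffer, which is why this function counts against the memory budget discussed above.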
The face detection unit 109 detects the position of the face from the reduced image data for face detection set in the DRAM113 (block 156), and stores the face position information into the DRAM113 (subject detection processing). Further, the face detection unit 109 also has a function of tracking a specific face among the detected faces. This face position information is used as position information when OSD data for a face frame generated by an On Screen Display (OSD) unit 124 described later is synthesized with an image for outward output and an image for panel output described later.
The focus/exposure control unit 127 performs autofocus and exposure control by driving the lens group and the diaphragm in the lens unit 101 based on the face position information detected by the face detection unit 109. Further, the focus/exposure control unit 127 instructs the image generation unit 106 to control the luminance gain.
The operation unit 110 is used for user input operations, and includes, for example, a touch panel and/or operation keys (e.g., buttons, dials, levers). When the user operates the operation unit 110, operation information is notified to the control unit 111.
The control unit 111 is a processing unit including a Central Processing Unit (CPU) and the like, and controls each block included in the image recording apparatus 100.
A Random Access Memory (RAM)112 is a volatile memory used as a work area by the control unit 111.
The DRAM113 is a volatile memory used by each block of the image recording apparatus 100 as a work area. The DRAM113 has a limit on the amount of data that can be read and written per predetermined period of time (i.e., on its read speed and write speed), and data cannot be read out of or written into it beyond this upper limit. The DRAM113 may also be replaced with a high-speed volatile memory or a nonvolatile memory based on a mechanism different from DRAM. Further, the RAM112 and the DRAM113 may be constructed in the same storage device.
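The per-period read/write limit can be pictured as a per-frame byte budget shared by all active processes. The cap and the per-process byte counts below are invented for illustration; the text states only that such an upper limit exists.

```python
# Hypothetical DRAM budget check. Figures are made up, not from the patent.
DRAM_BYTES_PER_FRAME = 400 * 1024 * 1024 // 60  # invented cap per frame at 60 fps

def within_budget(active_loads):
    """active_loads maps each active process to the bytes of DRAM
    read + write traffic it needs per frame. True if the total fits."""
    return sum(active_loads.values()) <= DRAM_BYTES_PER_FRAME

loads = {"recording_path": 4_000_000, "verify_path": 2_000_000,
         "face_detection": 1_500_000}
print(within_budget(loads))   # False: all three together overrun the cap
del loads["face_detection"]   # suppress the "specific process"
print(within_budget(loads))   # True: recording + verify now fit
```

This is exactly the situation the control unit resolves: rather than letting the total overrun the limit and drop frames, it drops the least essential consumer.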
The ROM 125 is a nonvolatile recording medium storing, for example, a program to be executed by the control unit 111, and is realized by using, for example, a flash ROM. Alternatively, program data stored in a recording medium 121 described later may be loaded into the RAM112, and the RAM112 may be used as a ROM.
The OSD unit 124 generates OSD data (display items) such as various setting menus and face boxes, and sets the generated OSD data into the DRAM 113.
The resolution conversion unit 2(114) generates, from the full-view image data stored in the DRAM113 or the cut-out image data input from the image generation unit 106, an image for outward output resized at the resolution to be output to the high-definition multimedia interface (HDMI) output unit 116 (block 161). Both the full-view image data and the cut-out image data are developed YUV data. The generated image for outward output is stored in the DRAM113 (block 162). Similarly, the resolution conversion unit 2(114) generates an image for panel output resized at the resolution to be output to the panel 118 (block 161), and stores the generated image for panel output in the DRAM113 (block 165). Similarly, the resolution conversion unit 2(114) generates an image for recording a moving image, resized at the resolution of the moving image to be compressed by the moving image compression unit 119 (block 161), and stores the generated image for recording a moving image in the DRAM113 (block 168).
The outward output unit 115 synthesizes the image for outward output stored in the DRAM113 with the OSD data, and outputs the resultant image as a signal for outward output to the HDMI output unit 116 (block 163). The outward output unit 115 has a cut-out and enlargement function for outward output (i.e., a function of performing processing according to an exemplary modification described later) that cuts out and enlarges a part of the image for outward output and outputs the resulting image as a signal for outward output to the HDMI output unit 116.
The HDMI output unit 116 converts the signal for outward output input from the outward output unit 115 into the HDMI format, and outputs the resulting signal to the outside (block 164).
The panel output unit 117 synthesizes the image for panel output stored in the DRAM113 and the OSD data, and outputs the resultant image to the panel 118 as a signal for panel output (block 166). This panel output unit 117 has a cut-out and enlargement function (a function of performing processing according to an exemplary modification described later) for panel output, which cuts out and enlarges a part of an image for panel output, and outputs the resultant image to the panel 118 as a signal for panel output.
The panel 118 is a display panel such as a liquid crystal panel or an organic electroluminescence (EL) panel, and displays the signal for panel output input from the panel output unit 117 (block 167).
The moving image compression unit 119 compresses the image for recording a moving image stored in the DRAM113 according to the Moving Picture Experts Group (MPEG) format, and stores the result as moving image data in the DRAM113 (compression processing in block 169, and block 170).
The media control unit 120 records moving image data generated by the moving image compression unit 119 and stored in the DRAM113 into the recording medium 121 according to a format compatible with a computer, such as a File Allocation Table (FAT) File system (blocks 171 and 172).
Examples of the recording medium 121 include a memory card. This recording medium 121 is a recording medium that is attachable to and detachable from the image recording apparatus 100, and can be mounted on, for example, a Personal Computer (PC) in addition to the image recording apparatus 100.
The bus 122 is a data bus for the respective blocks of the image recording apparatus 100 to exchange data, and the respective blocks of the image recording apparatus 100 exchange data via this bus 122.
The web browser 126 is a web browser included in an external device different from the image recording apparatus 100, and can issue a processing request to the image recording apparatus 100 based on a received input by executing the web application received from the network unit 108. Further, the web browser 126 can display a live view image in real time by successively receiving the JPEG data. Examples of usable external devices include a smartphone capable of connecting to the image recording apparatus 100 wirelessly or via a wired connection, and a PC such as a desktop PC, a notebook PC, or a tablet PC. The external device is not limited to these as long as it includes a web browser function.

Subsequently, the operation according to the present exemplary embodiment performed by the image recording apparatus 100 will be described with reference to the flowcharts shown in figs. 2 to 6. The control unit 111 controls the respective units of the image recording apparatus 100 based on the program stored in the ROM 125, thereby realizing these flows. Figs. 7A to 7D illustrate examples of video images output to the HDMI output unit 116 and displays presented on the panel 118 while the flowcharts shown in figs. 2 to 4 are in operation. Figs. 8A to 8C illustrate examples of video images output to the HDMI output unit 116 and displays presented on the panel 118 while the flowchart shown in fig. 6 is in operation. Figs. 9A to 9C illustrate examples of displays presented on the web browser 126 while the flowchart shown in fig. 5 is in operation. Each of them will be described together with the present flow.
Fig. 2 is a flowchart illustrating an overall operation according to the present exemplary embodiment.
In step S201, the control unit 111 performs processing for changing various settings regarding the image recording apparatus 100 based on the user' S operation. Details of this setting change process will be described below with reference to fig. 3.
In step S202, the control unit 111 determines whether the operation of the face detection and tracking function is in the permitted state and the face detection and tracking function is also in the ON state. If the control unit 111 determines that the operation of the face detection and tracking function is in the permitted state and the face detection and tracking function is also in the ON state (yes in step S202), the processing proceeds to step S203. If the control unit 111 determines that the operation of the face detection and tracking function is not in the permitted state or the face detection and tracking function is not in the ON state (no in step S202), the processing proceeds to step S204. Whether the operation of the face detection and tracking function is permitted or prohibited is determined based on whether the image recording apparatus 100 is set to prioritize the verify operation during recording (to be described below with reference to fig. 3). Whether the face detection and tracking function is set to ON or OFF is determined based on the state of the setting on the selection screen displayed when the user selects the menu item related to the face detection and tracking function on the menu screen (details will be described below in the description of step S313 shown in fig. 3).
In step S203, the control unit 111 performs face detection and tracking processing. Details of the face detection and tracking process will be described below with reference to fig. 4.
In step S204, the control unit 111 determines whether the browser remote setting is in the ON state. If the control unit 111 determines that the browser remote setting is in the ON state (yes in step S204), the processing proceeds to step S205. If the control unit 111 determines that the browser remote setting is not in the ON state (no in step S204), the processing proceeds to step S206. Whether the browser remote setting is set to ON or OFF is determined based on the state of the setting on the selection screen displayed when the user selects the menu item related to the browser remote setting on the menu screen.
In step S205, the control unit 111 performs browser communication processing. Details of the browser communication process will be described below with reference to fig. 5.
In step S206, the control unit 111 performs processing for changing the recording state and changing the state of verify (a function of presenting an enlarged display for manual focus adjustment). Details of this processing will be described below with reference to fig. 6.
In step S207, the control unit 111 determines whether control (recording processing) of recording the moving image into the recording medium 121 is in operation (i.e., recording processing is underway). If the control unit 111 determines in step S207 that it is in operation (yes in step S207), the processing proceeds to step S208. If the control unit 111 determines in step S207 that it is not in operation (no in step S207), the processing proceeds to step S209.
Fig. 7A illustrates examples of the video image output to the HDMI output unit 116 and the captured image displayed on the panel 118 when it is determined in step S207 that the control of recording the moving image into the recording medium 121 (the recording process) is in operation (i.e., recording control is underway). Fig. 7B illustrates examples of the video image output to the HDMI output unit 116 and the captured image displayed on the panel 118 when it is determined in step S207 that this control is not in operation (i.e., recording is on standby).
A display item 701a (record (rec) display) shown in fig. 7A is a display item indicating that the image recording apparatus 100 is in a state in which a recording operation is in progress.
A display item 701B (STAND-BY display) shown in fig. 7B is a state display indicating that the image recording apparatus 100 is in a state where recording is on standby (stopped).
The display item 702 shown in figs. 7A and 7B is a display item indicating the currently set imaging frame rate. As shown in the drawings, imaging can be performed at a frame rate of 59.94p (so-called 60p) or higher. The recording data path and the verify data path described later operate periodically in synchronization with the frame rate displayed here. Further, in figs. 7A and 7B, a whole live view (LV) image 700 is displayed in the background of the display items. The whole LV image 700 is a live view image in a non-enlarged state. The whole LV image 700 output to the HDMI output unit 116 is the image for outward output stored in the DRAM113 in block 162, and the whole LV image 700 displayed on the panel 118 is the image for panel output stored in block 165.
In step S208, the control unit 111 causes the recording data path to operate, thereby causing the following series of processing procedures to be performed.
1. Processing for storing the image for recording the moving image resized by the resolution converting unit 2(114) into the DRAM113 (processing of block 161 and block 168)
2. A process of reading out an image for recording a moving image from DRAM113, compressing the image by moving image compression unit 119, and storing the compressed image again as moving image data in DRAM113 (a process of blocks 168 to 170)
3. Processing of reading out moving image data from DRAM113 and recording the moving image data in recording medium 121 by medium control unit 120 (processing of blocks 170 to 172)
Due to this series of processes, reading from DRAM113 and writing into DRAM113 occur.
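The three steps of the recording data path can be sketched as a staged pipeline. This is a toy model for illustration: the function names and the dict standing in for the DRAM are inventions, and each dict write or pop marks one of the DRAM round trips listed above.

```python
# Sketch of the recording data path in step S208; `dram` stands in for
# the DRAM113 work area and `medium` for the recording medium 121.
dram = {}

def resize_for_recording(frame):
    """Step 1 (blocks 161 -> 168): store the resized recording image."""
    dram["image_for_recording"] = ("resized", frame)

def compress_moving_image():
    """Step 2 (blocks 168 -> 170): read it back, compress, store again."""
    dram["moving_image_data"] = ("mpeg", dram.pop("image_for_recording"))

def write_to_medium(medium):
    """Step 3 (blocks 170 -> 172): read moving image data, record it."""
    medium.append(dram.pop("moving_image_data"))

medium = []
resize_for_recording("frame_0")
compress_moving_image()
write_to_medium(medium)
print(medium)  # [('mpeg', ('resized', 'frame_0'))]
print(dram)    # {}: every DRAM buffer was written once and read once
```

Counting the dict operations shows why this path alone costs several DRAM reads and writes per frame, every frame, for as long as recording runs.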
In step S209, the control unit 111 determines whether the image recording apparatus 100 is performing a verify operation (the verify operation will be described below with reference to fig. 6). If the control unit 111 determines that the image recording apparatus 100 is performing the verify operation (yes in step S209), the processing proceeds to step S210. If the control unit 111 determines that the image recording apparatus 100 is not performing the verify operation (no in step S209), the processing returns to step S201. Then, the entire steps of the flow are repeated.
In step S210, the control unit 111 causes the verify data path by cutting out to operate. The verify data path by cutting out refers to the following series of processing procedures.
1. Processing for generating cut-out RAW data by RAW cut-out unit 105 and storing the generated cut-out RAW data in DRAM113 (processing of block 150, block 173, and block 174)
2. The process of reading out cut-out RAW data from DRAM113 and converting (developing) the cut-out RAW data into cut-out image data by image generating unit 106 (the process of blocks 174 and 175)
3. Processing of generating an image for outward output and an image for panel output from the cut-out image data by the resolution conversion unit 2 (114) (processing of blocks 161 to 162 and blocks 161 to 165)
Due to this series of processes, reading from DRAM113 and writing into DRAM113 occur.
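Running the recording path and the verify path in the same frame period doubles up the DRAM traffic, which is why the exclusivity rules below exist. A back-of-the-envelope budget can be sketched as follows; only the 59.94 fps rate comes from the text above, while the per-frame transfer sizes (in MB) are invented placeholders that depend in reality on resolution, bit depth, and codec.

```python
FRAME_RATE = 59.94  # from display item 702

def dram_mb_per_second(transfers_mb_per_frame):
    """Total DRAM traffic if every listed transfer happens once per frame."""
    return sum(transfers_mb_per_frame) * FRAME_RATE

# Recording path: write/read the resized frame, write/read the compressed data.
recording_path = [24, 24, 6, 6]
# Verify path: write/read the cut-out RAW, write/read the developed image.
verify_path = [12, 12, 4, 4]

print(f"recording only: {dram_mb_per_second(recording_path):.0f} MB/s")
print(f"recording + verify: {dram_mb_per_second(recording_path + verify_path):.0f} MB/s")
```

Whatever the real sizes, the sum must fit within the DRAM's sustained bandwidth every frame period, so each optional per-frame function competes for the remainder.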
Fig. 3 is a flowchart illustrating the details of the setting change processing in step S201.
In step S301, the control unit 111 determines whether an operation requesting display of a menu screen is performed on the operation unit 110. If the control unit 111 determines that the operation is performed (yes in step S301), the processing proceeds to step S302. If the control unit 111 determines that the operation is not performed (no in step S301), the present flow ends.
In step S302, the control unit 111 displays a menu screen on the panel 118, and outputs the menu screen through the OSD unit 124 to 116.
In step S303, while the menu screen is displayed, the control unit 111 determines whether a menu item for setting whether to prioritize the verify operation during recording is selected by a user input via the operation unit 110. If the control unit 111 determines that the menu item is selected (yes in step S303), the processing proceeds to step S304. If the control unit 111 determines that the menu item is not selected (no in step S303), the processing proceeds to step S310.
In step S304, the control unit 111 outputs a selection screen for setting whether to prioritize the verify operation during recording to 116, and displays the screen on the panel 118. Fig. 7C illustrates an example of the display of this selection screen.
The screen title 703 indicates that the currently displayed screen is the selection screen for setting whether to prioritize the verify operation during recording.
The message 704 indicates that the face detection and tracking function and the browser remote live view function become unavailable when this setting is enabled. This display lets the user know in advance what type of functionality will be restricted in exchange for prioritizing the verify operation.
The user can confirm the setting by selecting either the button 705 or the button 706 using the operation unit 110, and the setting state determined at this time is stored in the RAM 112. The whole LV image 700 is displayed in the background; alternatively, the image recording apparatus 100 may be configured not to display the whole LV image 700 on the selection screen for setting whether to prioritize the verify operation during recording.
In step S305, the control unit 111 determines whether the setting of the priority verify operation during recording is validated by a selection operation input from the user to the operation unit 110. If the control unit 111 determines to validate the setting (yes in step S305), the processing proceeds to step S306. If the control unit 111 determines that the setting is not validated (no in step S305), the processing proceeds to step S308. If the button 706 is selected and an operation to determine the selection is performed, the control unit 111 determines to validate the setting (yes in step S305), whereas if the button 705 is selected and an operation to determine the selection is performed, the control unit 111 determines not to validate the setting (no in step S305).
In step S306, the control unit 111 prohibits the use of the browser remote live view function.
In step S307, the control unit 111 prohibits the use of the face detection and tracking function. When the use of the face detection and tracking function is prohibited by this step, the menu item for setting the face detection and tracking function either becomes unselectable or is removed from the menu screen, so that the function cannot be set.
In step S308, the control unit 111 permits use of the live view function remotely from the browser (sets the function to an executable state).
In step S309, the control unit 111 permits use of the face detection and tracking function (sets the function to an executable state).
The prohibited or permitted state of each function determined in steps S306 to S309 is stored in the RAM 112.
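The exclusivity rule of steps S305 to S309 amounts to flipping two permission flags together with the priority flag. A minimal sketch, with assumed key names standing in for the states stored in the RAM 112:

```python
def apply_priority_setting(settings, prioritize_verify_during_recording):
    """Sketch of steps S305-S309: enabling verify priority prohibits the two
    competing DRAM-heavy functions; disabling it permits them again."""
    settings["prioritize_verify"] = prioritize_verify_during_recording
    allowed = not prioritize_verify_during_recording
    settings["browser_remote_live_view_permitted"] = allowed  # S306 / S308
    settings["face_detection_permitted"] = allowed            # S307 / S309
    return settings

ram = apply_priority_setting({}, True)
print(ram["face_detection_permitted"])  # False
```

Keeping the three flags updated in one place mirrors the flowchart: there is no state in which verify priority and either restricted function are enabled at the same time.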
In step S310, the control unit 111 reads out the state stored in the RAM112, and determines whether the face detection and tracking function is in a permitted state. If the control unit 111 determines that the face detection and tracking function is in the permission state (yes in step S310), the processing proceeds to step S311. If the control unit 111 determines that the face detection and tracking function is not in the permission state (no in step S310), the processing proceeds to step S316.
In step S311, the control unit 111 determines whether a menu item for setting the face detection and tracking function is selected by an input to the menu screen via the operation unit 110. If the control unit 111 determines that the menu is selected (yes in step S311), the processing proceeds to step S312. If the control unit 111 determines that the menu is not selected (no in step S311), the processing proceeds to step S316.
In step S312, the control unit 111 displays a selection screen of the face detection and tracking function on the panel 118, and outputs the screen to 116. On this screen, options for enabling and disabling the face detection and tracking function are displayed, and the user can select either of them by using the operation unit 110.
In step S313, the control unit 111 determines whether the face detection and tracking function is set to ON by the operation ON the operation unit 110. If the control unit 111 determines that the face detection and tracking function is set to ON (yes in step S313), the processing proceeds to step S314. If the control unit 111 determines that the face detection and tracking function is not set to ON (no in step S313), the processing proceeds to step S315.
In step S314, the control unit 111 sets the face detection and tracking function to ON.
In step S315, the control unit 111 sets the face detection and tracking function to OFF.
The operation state (ON or OFF) of the function determined in step S314 or step S315 is stored in the RAM 112.
In step S316, the control unit 111 determines whether a menu item for setting the browser remote function is selected by an input on the menu screen via the operation unit 110. If the control unit 111 determines that the menu item is selected (yes in step S316), the processing proceeds to step S317. If the control unit 111 determines that the menu item is not selected (no in step S316), the processing proceeds to step S321.
In step S317, the control unit 111 displays a selection screen of the browser remote function. On this screen, options for enabling and disabling the browser remote function are displayed, and the user can select either of them by using the operation unit 110.
In step S318, the control unit 111 determines whether the browser remote function is set to ON by the operation to the operation unit 110. If the control unit 111 determines that the browser remote function is set ON (yes in step S318), the processing proceeds to step S319. If the control unit 111 determines that the browser remote function is not set to ON (no in step S318), the processing proceeds to step S320.
In step S319, the control unit 111 sets the browser remote function to ON.
In step S320, the control unit 111 sets the browser remote function to OFF.
The operation state (ON or OFF) of the function determined in step S319 or step S320 is stored in the RAM 112.
In step S321, the control unit 111 determines whether a request to end the menu screen is issued by an operation to the operation unit 110. If the control unit 111 determines that the request to end the menu screen has been issued (yes in step S321), the processing proceeds to step S322. If the control unit 111 determines that the request to end the menu screen has not been issued (no in step S321), the processing returns to step S303.
In step S322, the control unit 111 ends the menu screen, and shifts the screen to the image capturing screen.
Fig. 4 is a flowchart illustrating the details of the face detection and tracking process in step S203.
In step S401, the control unit 111 generates reduced image data for face detection by the resolution conversion unit 1(123), and stores the generated reduced image data for face detection into the DRAM 113. This step corresponds to the processing of blocks 154 and 155 described above.
In step S402, the control unit 111 performs processing for detecting the position of a face from the reduced image data for face detection stored in the DRAM113, and processing for detecting a destination to which the face as a target of tracking moves. This process corresponds to the process of block 156 described above.
Due to the processing of step S401 and step S402, reading from DRAM113 and writing into DRAM113 occur.
In step S403, the control unit 111 performs focus control by the focus/exposure control unit 127 based on the position of the face detected in step S402.
In step S404, the control unit 111 performs exposure control by the focus/exposure control unit 127 based on the position of the face detected in step S402.
In step S405, based on the position of the face detected in step S402, the control unit 111 updates the display position of the face frame output through the OSD unit 124 to 116 and displayed on the panel 118. Fig. 7D illustrates an example of the display of a face frame.
A face frame 707 and a face frame 708 are each displayed over the corresponding detected face, superimposed on the whole LV image 700. The face frame 707 represents the main face, and the face frame 708 represents a face other than the main face.
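Steps S402 to S405 can be illustrated with a toy overlay builder. Choosing the largest face as the main face is an assumed heuristic for this sketch; the text does not specify how the main face is selected, and the face tuples and style names are likewise illustrative.

```python
def face_detection_and_tracking(faces):
    """Build face-frame overlays from detected faces given as (x, y, w, h).
    The 'main' face here is simply the largest one (assumed heuristic)."""
    if not faces:
        return []
    main = max(faces, key=lambda f: f[2] * f[3])
    overlays = []
    for face in faces:
        # Frame 707 marks the main face; frame 708 marks the others.
        style = "main_frame_707" if face == main else "sub_frame_708"
        overlays.append({"rect": face, "style": style})
    # Steps S403/S404 would then drive focus and exposure from `main`.
    return overlays

overlays = face_detection_and_tracking([(10, 10, 40, 40), (100, 20, 80, 80)])
print(overlays[1]["style"])  # main_frame_707
```

Because this runs once per frame against the reduced detection image in the DRAM 113, it is one of the per-frame DRAM consumers the verify-priority setting trades away.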
Fig. 5 is a flowchart illustrating details of the browser communication processing performed in step S205.
In step S501, the control unit 111 determines whether the connection between the network unit 108 and the web browser 126 is in an established state. If the control unit 111 determines that the connection between the network unit 108 and the web browser 126 is in the established state (yes in step S501), the processing proceeds to step S504. If the control unit 111 determines that the connection between the network unit 108 and the web browser 126 is not in the established state (no in step S501), the processing proceeds to step S502.
In step S502, the control unit 111 determines whether a connection request is received by the network unit 108 from the web browser 126. If the control unit 111 determines that a connection request is received (yes in step S502), the processing proceeds to step S503. If the control unit 111 determines that the connection request is not received (no in step S502), the present flow ends.
In step S503, the control unit 111 performs connection processing between the network unit 108 and the web browser 126, thereby establishing a connection between them. The connection processing further includes a step of transmitting the HTML and CSS with which the web browser 126 executes the browser remote web application.
In step S504, the control unit 111 determines whether a request to acquire the state of any unit of the image recording apparatus 100 is received from the web browser 126. Examples of such states include the following operation states: whether a recording operation is in progress, various parameters regarding focus and exposure control, and the remaining available capacity of the recording medium 121. Further examples include the permission state of the browser remote live view function set in step S306 or step S308. These states are stored in the RAM 112. If the control unit 111 determines that the request is received (yes in step S504), the processing proceeds to step S505. If the control unit 111 determines that the request is not received (no in step S504), the processing proceeds to step S506.
In step S505, the control unit 111 reads out the status information requested in step S504 from the RAM112, and transmits the status information via the network unit 108.
In step S506, the control unit 111 determines whether a request to update the live view image remote from the browser is received from the web browser 126. A request to update the live view image is periodically transmitted from the Web browser 126 as long as the display of the live view image continues on the Web browser 126. If the control unit 111 determines that a request to update the live view image of the browser remote is received (yes in step S506), the processing proceeds to step S507. If the control unit 111 determines that a request to update the live view image remote from the browser has not been received (no in step S506), the processing proceeds to step S510.
In step S507, the control unit 111 determines whether the live view function remote from the browser is in a permitted state based on the state set in step S306 or step S308 and stored in the RAM 112. If the control unit 111 determines that the live view function remote from the browser is in the permitted state (yes in step S507), the processing proceeds to step S508. If the control unit 111 determines that the live view function remote from the browser is not in the permitted state (no in step S507), the processing proceeds to step S510.
In step S508, the control unit 111 generates reduced data for JPEG compression by the resolution conversion unit 1 (123) and stores it in the DRAM 113; the still image compression unit 107 then compresses the data in the JPEG format and stores the compressed data in the DRAM 113. This process corresponds to the processing of blocks 154, 157, 158, and 159 described above.
In step S509, the control unit 111 reads out the JPEG data generated in step S508 from the DRAM113, and transmits the JPEG data to the web browser 126. This process corresponds to the process of blocks 159 and 160 described above.
Due to these processes of step S508 and step S509, reading from DRAM113 and writing to DRAM113 occur.
In step S510, the control unit 111 determines whether a processing request is received from the web browser 126. Examples of processing requests include: a request to start or stop recording, a request to change various parameters regarding focus and exposure control, and a request to specify a main face and a tracking target for a face detection and tracking function. If the control unit 111 determines that a processing request is received (yes in step S510), the processing proceeds to step S511. If the control unit 111 determines that the processing request has not been received (no in step S510), the present flow ends.
In step S511, the control unit 111 performs processing by controlling the corresponding unit of the image recording apparatus 100 according to the processing request received in step S510.
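The request handling of steps S504 to S511 is essentially a dispatch on request type with a permission gate on the live view branch. A minimal sketch, with an assumed request/response shape rather than the actual browser remote protocol:

```python
def handle_browser_request(request, ram):
    """Dispatch one request from the web browser 126 (sketch of S504-S511)."""
    kind = request["type"]
    if kind == "get_status":                      # S504-S505
        return {key: ram.get(key) for key in request["keys"]}
    if kind == "update_live_view":                # S506-S509
        if not ram.get("browser_remote_live_view_permitted", False):  # S507
            return {"error": "live view prohibited"}
        # S508-S509: the JPEG generation/compression path is elided here.
        return {"jpeg": "<compressed frame>"}
    if kind == "process":                         # S510-S511
        return {"ack": request["command"]}
    return {"error": "unknown request"}

ram = {"recording": True, "browser_remote_live_view_permitted": False}
print(handle_browser_request({"type": "update_live_view"}, ram))
# {'error': 'live view prohibited'}
```

Note that only the live view branch is gated: status queries and processing requests are served even while verify priority prohibits live view, matching the behavior described for fig. 9C.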
Fig. 9A to 9C illustrate examples of display of screens displayed on the web browser 126 while maintaining establishment of communication with the web browser 126.
Fig. 9A illustrates an example of display when the image recording apparatus 100 is in a state where live view is permitted and the web browser 126 side is in a state where live view display is presented.
The live view image obtained by decompressing the JPEG data transmitted from the image recording apparatus 100 in step S509 is displayed in the area 901.
The button 902 is a button for switching whether or not to present a live view display. When this button 902 is pressed (touched or clicked), the screen transitions to the state shown in fig. 9B.
Fig. 9B illustrates an example of display when the image recording apparatus 100 is in a state in which live view is permitted and the web browser 126 side is in a state in which live view display is not presented.
The region 901 is a region in which a live view image should be displayed, but in which the live view image is not displayed in this state.
The button 902 is a button for switching whether or not to present a live view display. When this button 902 is pressed, the screen transitions to the state shown in fig. 9A.
Fig. 9C illustrates an example of display when the image recording apparatus 100 is in a state in which live view is prohibited.
The region 901 is a region in which a live view image should be displayed, but since JPEG data is not transmitted from the image recording apparatus 100 in this state, the live view image is not displayed therein.
The button 902 is a button for switching whether or not to present the live view display, but is in an inoperable state (grayed-out display) because a notification indicating that the image recording apparatus 100 is in a state in which live view is prohibited is received from the image recording apparatus 100.
The face selection and tracking function operated remotely via the browser issues a processing request to the image recording apparatus 100 based on a touch operation on the live view image, and is thus operable only while the live view image is displayed. When the live view image is displayed on the web browser 126 side, the user can select a face as a tracking target by touching or clicking the face of the subject on the displayed live view image. Information indicating the coordinates of the touch or click position is transmitted to the image recording apparatus 100, and the image recording apparatus 100 sets the face as the tracking target based on the received coordinate information. Processing requests for the other functions can be issued regardless of the display state of the live view image, so operations on the web browser 126 are not limited for them.
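Translating the browser-side touch or click into coordinates the apparatus can match against detected faces might look like the following proportional mapping. The view and sensor dimensions are illustrative assumptions; since the live view in area 901 shows the full angle of view, a simple scaling suffices for this sketch.

```python
def browser_click_to_sensor(click_x, click_y, view_w, view_h, sensor_w, sensor_h):
    """Map a click in the browser's live view area to full-image coordinates."""
    return (round(click_x * sensor_w / view_w),
            round(click_y * sensor_h / view_h))

# Hypothetical 640x360 browser view over a 4096x2160 full-angle-of-view image.
print(browser_click_to_sensor(320, 180, 640, 360, 4096, 2160))  # (2048, 1080)
```

The apparatus would then pick, among the faces detected in step S402, the one whose frame contains the mapped point, and set it as the tracking target.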
Fig. 6 is a flowchart illustrating the details of the processing for changing the recording status and changing the verify status in step S206.
In step S601, the control unit 111 determines whether the REC key included in the operation unit 110 is pressed. Alternatively, if the browser remote function is in operation, the control unit 111 determines whether a processing request equivalent to pressing the REC key (an instruction to start or stop recording issued by an operation on the web browser 126 side) is received from the web browser 126. If the control unit 111 determines that the REC key is pressed (yes in step S601), the processing proceeds to step S602. If the control unit 111 determines that the REC key is not pressed (no in step S601), the processing proceeds to step S608.
In step S602, the control unit 111 determines whether the image recording apparatus 100 is performing an operation of recording a moving image. If the control unit 111 determines that the image recording apparatus 100 is performing an operation of recording a moving image (yes in step S602), the processing proceeds to step S603. If the control unit 111 determines that the image recording apparatus 100 is not performing an operation of recording a moving image (no in step S602), the processing proceeds to step S604.
In step S603, the control unit 111 stops the operation of recording the moving image on the image recording apparatus 100.
In step S604, the control unit 111 determines whether the verify function is in progress. If the control unit 111 determines that the verify function is in progress (yes in step S604), the processing proceeds to step S605. If the control unit 111 determines that the verify function is not in progress (no in step S604), the processing proceeds to step S607.
In step S605, the control unit 111 determines whether the image recording apparatus 100 is set to prioritize the verify operation during recording, based on the information (the state information stored in the RAM 112) selected and set in step S304 shown in fig. 3. If the control unit 111 determines that the image recording apparatus 100 is set to prioritize the verify operation (yes in step S605), the processing proceeds to step S607. If the control unit 111 determines that the image recording apparatus 100 is not set to the priority verify operation (no in step S605), the processing proceeds to step S606.
In step S606, the control unit 111 ends the verify operation. When the verify operation ends, the whole LV image 700 is displayed, and the screen returns to the non-enlarged display (a display capable of containing the whole image). Further, the display item 801a (to be described below with reference to fig. 8A) or 801c indicating that verify is in progress, which is displayed on the screen of the panel 118, is deleted.
Even if the control unit 111 determines in step S605 that the image recording apparatus 100 is set to prioritize the verify operation (yes in step S605), the present flow may be configured so that the processing proceeds to step S606. More specifically, even while Magnify (enlarged display) is in progress, the control unit 111 temporarily stops the enlarged display in response to the start of recording of the moving image. Controlling the display in this manner enables the user to recognize that the moving image being recorded covers not only the portion displayed in the enlarged manner by Magnify but the whole range of the full-angle-of-view image. After the start of recording, if an operation instructing the image recording apparatus 100 to execute Magnify is input during recording, the control unit 111 executes Magnify again.
Alternatively, if the control unit 111 determines in step S605 that the image recording apparatus 100 is set to prioritize the verify operation (yes in step S605), the processing may proceed to step S607, and the control unit 111 may start recording while maintaining the enlarged display, while also displaying a warning indicating that verify is in progress or, for example, blinking the display item 801a or 801c described below for a predetermined period of time. In other words, the control unit 111 may present a display notifying the user in an emphasized manner that the enlarged display is in progress. Presenting the display in this manner can prevent the user from mistakenly thinking that only the angle of view displayed in an enlarged manner (not the whole full-angle-of-view image but a partial area) is being recorded.
In step S607, the control unit 111 starts a recording operation (recording control) of the image recording apparatus 100. With this start, control similar to step S208 described above is started.
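The REC-key decision of steps S601 to S607 can be condensed into a small state update; the dictionary keys are assumed names for the states held in the RAM 112.

```python
def on_rec_key(state):
    """Sketch of the REC-key branch S601-S607."""
    if state["recording"]:                    # S602
        state["recording"] = False            # S603: stop recording
        return "stopped"
    # S604-S605: if Magnify is running and is NOT prioritized, end it first.
    if state["verify_in_progress"] and not state["prioritize_verify"]:
        state["verify_in_progress"] = False   # S606
    state["recording"] = True                 # S607: start recording
    return "started"

state = {"recording": False, "verify_in_progress": True, "prioritize_verify": False}
print(on_rec_key(state), state["verify_in_progress"])  # started False
```

With `prioritize_verify` set, the enlarged display survives the start of recording, which is exactly the situation the warning display described above is meant to flag.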
In step S608, the control unit 111 determines whether the verify key included in the operation unit 110 is pressed. If the control unit 111 determines that the verify key is pressed (yes in step S608), the processing proceeds to step S609. If the control unit 111 determines that the verify key is not pressed (no in step S608), the present flow ends.
In step S609, the control unit 111 determines whether the image recording apparatus 100 is executing the verify function. If the control unit 111 determines that the image recording apparatus 100 is executing the verify function (yes in step S609), the processing proceeds to step S610. If the control unit 111 determines that the image recording apparatus 100 is not executing the verify function (no in step S609), the processing proceeds to step S612.
In step S610, the control unit 111 ends the verify operation similarly to step S606.
In step S611, the control unit 111 deletes a display item 801a (to be described below with reference to fig. 8A) or 801c indicating that verification is in progress, which is displayed on the screen of the panel 118.
In step S612, similar to step S602, the control unit 111 determines whether the image recording apparatus 100 is performing a recording operation. If the control unit 111 determines that the image recording apparatus 100 is performing a recording operation (yes in step S612), the processing proceeds to step S613. If the control unit 111 determines that the image recording apparatus 100 is not performing a recording operation (no in step S612), the processing proceeds to step S614.
In step S613, similarly to step S605, the control unit 111 determines whether the image recording apparatus 100 is set to prioritize the verify operation during recording. If the control unit 111 determines that the image recording apparatus 100 is set to prioritize the verify operation (yes in step S613), the processing proceeds to step S614. If the control unit 111 determines that the image recording apparatus 100 is not set to the priority verify operation (no in step S613), the processing proceeds to step S616.
In step S614, the control unit 111 starts the verify operation, and controls the display to present an enlarged display like the example shown in fig. 8A described later.
In step S615, the control unit 111 outputs the display item 801a (to be described below with reference to fig. 8A) indicating that verify is in progress to 116, and displays the display item 801a on the screen of the panel 118.
Fig. 8A illustrates an example of the display while Magnify is in progress. The enlarged LV image 804 is a live view image that does not display the entire range but displays, in an enlarged manner, a partial area of the entire range being photographed (the range of the full-angle-of-view image data). As described above, this image is generated by cutting out the enlargement range from the RAW data and reducing the cut-out image according to the number of pixels of the panel 118 or 116. Therefore, as long as the number of pixels in the cut-out RAW data is sufficient for the number of pixels of the panel 118, the image is not enlarged during the processing (processing for stretching the image by increasing the number of pixels from the original number of pixels). Accordingly, the image is displayed with little deterioration of image quality and remains in high definition, so that the user can easily confirm how well the image recording apparatus 100 is focused, for example, at the time of manual focusing. The display item 801a is an icon representing that Magnify by cutting out is being performed, and is displayed superimposed on the enlarged LV image 804 by the processing of step S615.
Rectangles 802 and 803 form a radar display indicating the cut-out range with respect to the entire video image (i.e., the position of the currently displayed video region). The rectangle 802 indicates the entire video image (the range of the whole LV image 700), and the rectangle 803 indicates the cut-out range in the entire video image (the range of the enlarged LV image 804).
The position and magnification of the enlargement range in Magnify can be changed by a user operation performed on the operation unit 110. If an operation to change the position of the enlargement range or an operation to change the magnification is input while Magnify is in progress, the display of the rectangle 803 is updated to represent the changed position or magnification of the enlargement range, and the RAW cut-out unit 105 updates the enlarged LV image 804 by cutting out a range corresponding to the changed enlargement range. Furthermore, the image recording apparatus 100 can receive a manual focus operation (MF operation) input from the user to the operation unit 110 while executing the Magnify function (presenting the enlarged display). The focus/exposure control unit 127 drives the lens unit 101 based on the received MF operation. In some cases, the focus lens of the lens unit 101 may be configured not to be driven by the focus/exposure control unit 127; in that case, the focus lens can be driven by directly operating the lens unit 101 (e.g., a focus ring on the lens unit 101).
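The radar display geometry (keeping rectangle 803 positioned inside rectangle 802) is a simple projection of the enlargement range into a small fixed rectangle. A sketch under assumed radar placement, sizes, and integer pixel coordinates:

```python
def radar_rect(crop, full_w, full_h, radar=(8, 8, 160, 90)):
    """Project the enlargement range `crop` = (x, y, w, h), given in
    full-image pixels, into the radar rectangle 802 to obtain rectangle 803.
    The radar position/size (8, 8, 160, 90) is an illustrative assumption."""
    rx, ry, rw, rh = radar   # rectangle 802
    cx, cy, cw, ch = crop
    return (rx + rw * cx // full_w, ry + rh * cy // full_h,
            rw * cw // full_w, rh * ch // full_h)  # rectangle 803

# 2x enlargement centered on a hypothetical 3840x2160 full image.
print(radar_rect((960, 540, 1920, 1080), 3840, 2160))  # (48, 30, 80, 45)
```

Recomputing this rectangle whenever the position or magnification changes is all the rectangle 803 update described above requires.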
On the other hand, in step S616, the control unit 111 outputs the display item 801b indicating that verify is in an unexecutable state to 116, and displays the display item 801b on the screen of the panel 118. Fig. 8B illustrates an example of the display in this case. The display item 801b is an icon representing that Magnify cannot be executed, and is displayed by being added to the whole LV image 700 that was displayed before the Magnify key was pressed in step S608. The icon is displayed in red to emphasize, when the key operation is performed in step S608, that the function cannot be executed. The image recording apparatus 100 may be configured to automatically remove the display item 801b from the display upon the elapse of a predetermined period of time from the start of the display. Alternatively, the image recording apparatus 100 may be configured to continuously display the display item 801b whenever Magnify is in the non-executable state, regardless of whether the Magnify key is pressed.
According to the present exemplary embodiment, if it is set to prioritize the verify operation during recording, the image recording apparatus 100 can perform control so that the face detection and tracking function and the live view function remote from the browser are not executable. As a result, the image recording apparatus 100 can avoid (reduce) the reading and writing processes to the DRAM113 that would otherwise be consumed by the operations of the face detection and tracking function and the browser remote live view function. Therefore, even when the verify function and the process for recording a moving image are operated simultaneously, the image recording apparatus 100 can prevent the reading and writing performance of the DRAM113 from being degraded. As a result, the image recording apparatus 100 enables the user to record a video image in a predetermined format including a high load condition such as a large number of pixels and a high frame rate while correctly performing manual focusing by using the verify function.
Whether or not to prioritize the verify operation during recording can be selected via a menu, so that when the user wants to use the face detection and tracking function and the live view function remote from the browser, the user can change the prioritized function by using the menu setting.
Further, according to the present exemplary embodiment, if it is set to prioritize the verify operation during recording, the image recording apparatus 100 restricts the use of the face detection and tracking function and the browser remote live view function regardless of whether the verify function is in progress and/or whether the process for recording a moving image is in progress. By applying the restriction in this way, the image recording apparatus 100 can clearly present the contents of the function restriction to the user as shown in fig. 7C, thereby preventing the user from being confused by a complicated restriction condition. On the other hand, the image recording apparatus 100 may be configured to restrict the use of these functions only while the verify function and/or the process for recording a moving image is being performed. In other words, when the enlarged display is not being presented by the verify function and the process for recording the moving image is not being performed, the image recording apparatus 100 can permit the use of the face detection and tracking function and the browser remote live view function without limitation. In this case, the image recording apparatus 100 can keep the period during which the use of these functions is restricted to a minimum.
In the present exemplary embodiment, the functions exclusive to the verify operation during recording (the limited functions) have been described as the face detection and tracking function and the browser remote live view function, but the exclusive functions are not limited thereto. If there is another function that involves read and write accesses to the DRAM 113, that function can also be treated as an exclusive function. Conversely, depending on the performance of the system, it may become possible to use even the face detection and tracking function and the browser remote live view function together with the verify operation during recording.
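One way to read this paragraph is as a bandwidth budget: a function is marked exclusive only if its DRAM traffic would push the total past what the system can sustain, so a faster system restricts fewer functions. A hypothetical sketch; all costs, the budget, and the function names are invented numbers for illustration, not values from the patent:

```python
# Hypothetical sketch: decide which auxiliary functions must be treated as
# exclusive to the verify-during-recording operation, based on a DRAM
# read/write bandwidth budget. Costs and the budget are invented numbers.

RECORDING_COST = 60   # bandwidth units consumed by recording itself
VERIFY_COST = 25      # cut-out RAW path for the magnified display
DRAM_BUDGET = 100     # sustainable total on this hypothetical system

AUX_FUNCTIONS = {
    "face_detection_tracking": 10,
    "browser_remote_live_view": 15,
}

def exclusive_functions(budget=DRAM_BUDGET):
    """Return the auxiliary functions that cannot run alongside
    recording + verify without exceeding the bandwidth budget."""
    headroom = budget - RECORDING_COST - VERIFY_COST
    excluded = []
    for name, cost in AUX_FUNCTIONS.items():
        if cost > headroom:
            excluded.append(name)
        else:
            headroom -= cost  # fits; admit it and shrink the headroom
    return excluded
```

On the default budget only the browser remote live view is excluded; on a system with more headroom, nothing is, matching the remark that a capable system can run all functions together.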
<Modification>
As a modification, if the image recording apparatus 100 is not set to prioritize the verify operation during recording (NO in step S613), the image recording apparatus 100 may present a simplified enlarged display without using the cut-out image data, instead of displaying the display item 801b in step S616. Fig. 8C illustrates an example of the display in this case. Fig. 8C shows a simplified enlarged LV image 805, which is a live view image that displays not the entire range but a partial range of the entire range being photographed (the range of the full-view image data) in an enlarged manner. The simplified enlarged LV image 805 is generated in the following manner.
Image to be output to the output destination 116: the magnification of an image cut out, according to the enlargement range, from the image for outward output stored in the DRAM 113 (the data stored in block 162) is changed according to the number of pixels of the output destination 116.
Image to be displayed on the panel 118: the magnification of an image cut out, according to the enlargement range, from the image for panel output stored in the DRAM 113 (the data stored in block 165) is changed according to the number of pixels of the panel 118.
The image for outward output in block 162 and the image for panel output in block 165 are images that have been reduced in block 151 and further reduced in block 161 according to the setting of the number of pixels for recording. The number of pixels in a video image cut out from such a reduced image according to the enlargement range is smaller than the number of pixels used for recording. The image magnification depends on the number of pixels of the panel 118 or the output destination 116; therefore, if the number of pixels in the video image cut out according to the enlargement range is smaller than the number of pixels of the panel 118 or the output destination 116, the magnification is increased, so that the image is displayed with deteriorated image quality, i.e., as a video image coarser than the above-described enlarged LV image 804. The enlarged LV image 804 is generated by producing the image for outward output and the image for panel output from the RAW data before the reduction processing (the processing of block 151) by the RAW reduction unit 104.
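The magnification change described here reduces to comparing pixel counts: if the cut-out region has fewer pixels than the display, the image must be upscaled by interpolation, which is what makes the simplified display coarser than the enlarged LV image 804. A rough sketch; the pixel counts below are illustrative, not from the patent:

```python
# Sketch of the variable magnification for the simplified enlarged display:
# a region is cut out of the already-reduced output image and rescaled to
# the destination's pixel count. Numbers below are illustrative only.

def display_scale(cutout_width: int, dest_width: int) -> float:
    """Horizontal scale factor applied to the cut-out image."""
    return dest_width / cutout_width

# Example: a 1920-pixel-wide image for panel output, with a quarter-width
# enlargement range (480 px) shown on a 1920-pixel-wide panel -> 4x
# upscaling, hence the coarser look compared with cutting the region out
# of the full-resolution RAW data before reduction.
scale = display_scale(cutout_width=480, dest_width=1920)
upscaled = scale > 1.0  # True means interpolation (quality loss) is needed
```

When the cut-out region matches the display width, the scale factor is 1.0 and no interpolation is needed; any enlargement range narrower than the display forces upscaling.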
On the other hand, in the verify operation by the simplified enlargement, the image for outward output and the image for panel output are generated from the reduced RAW data after the reduction processing by the RAW reduction unit 104. Thus, assuming that the same video range is output to the output destination 116 and displayed on the panel 118 in an enlarged manner, the enlarged LV image 804 has higher definition than the simplified enlarged LV image 805.
However, the display of the simplified enlarged LV image 805 can be realized without performing at least the processing for storing the cut-out RAW data, which contains a relatively large amount of data, into the DRAM 113 (the processing of block 174). The amount of data read from and written into the DRAM 113 can be reduced by a corresponding amount. Accordingly, the simplified enlarged LV image 805 has the advantage of being displayable without limiting the processing (block 155 and/or block 157) for reading and writing, from and to the DRAM 113, the data for face detection and/or the JPEG-compressed data of the LV image for the browser remote. The display item 801c is an icon representing that the Magnify function is operating by the simplified enlargement. The display item 801c is prepared as an icon different from the display item 801a to indicate that this is the simplified enlargement.
In this way, according to the modification, when the face detection and tracking function and other functions such as the browser remote live view function are prioritized, the image recording apparatus 100 can prevent the Magnify function from becoming unavailable. Further, by using the simplified enlarged LV image 805, the image recording apparatus 100 can keep the Magnify function available while recording is in progress and while other functions such as the browser remote live view function are in use, and can also present a high-definition enlarged display by displaying the enlarged LV image 804 while recording is in standby.
The simplified enlarged display is a display intended to reduce the amount of data read from and written to the DRAM 113 during recording; therefore, performing an operation to stop recording enables the image recording apparatus 100 to present the enlarged display using the cut-out image data. Accordingly, when an operation to stop recording is performed in step S603 while the simplified enlarged display is presented, the image recording apparatus 100 should end the simplified enlarged display. After ending the simplified enlarged display in response to the operation to stop recording, the image recording apparatus 100 may end the enlargement and display the LV image 700 as a whole, or may maintain the Magnify operation itself and switch the display directly from the simplified enlarged display to the enlarged LV image 804.
Further, when the REC key is operated with the enlarged LV image 804 displayed, if the image recording apparatus 100 is not set to prioritize the verify operation during recording, the image recording apparatus 100 can switch the display to the simplified enlarged LV image 805 without performing the process of step S606 (i.e., without ending the Magnify function).
In addition, in the modification, the image recording apparatus 100 can also be configured without the setting itself of whether to prioritize the Magnify operation during recording. In other words, the image recording apparatus 100 can make the face detection and tracking function and other functions such as the browser remote live view function executable regardless of the Magnify operation. In the case where the image recording apparatus 100 is configured without this setting, if it is determined in the above-described step S612 that recording is in progress (YES in step S612), the image recording apparatus 100 displays the simplified enlarged LV image 805 without performing the processing of step S613. Further, if it is determined in step S604 that the Magnify function is in progress (YES in step S604), the image recording apparatus 100 switches the display to the simplified enlarged LV image 805 without performing the processing of step S605. In this way, the image recording apparatus 100 can continue the Magnify operation without interruption when recording is started or stopped, thereby preventing the user from being confused by the enlarged display suddenly becoming unavailable during recording.
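The display-mode switching described in this modification can be sketched as a small state machine: the Magnify operation persists across recording start and stop, and only the kind of enlarged display changes. The state names and function names below are invented for illustration:

```python
# Sketch of the display-mode transitions in the modification: the Magnify
# operation stays active across recording start/stop, and only the kind of
# enlarged display changes. State names are illustrative.

FULL_LV = "LV_700"                 # whole-range live view
ENLARGED = "enlarged_LV_804"       # high-definition, cut-out RAW path
SIMPLIFIED = "simplified_LV_805"   # cut out from the reduced image

def on_rec_start(state: str) -> str:
    # Starting recording while the high-definition enlarged display is up
    # switches directly to the simplified enlarged display.
    return SIMPLIFIED if state == ENLARGED else state

def on_rec_stop(state: str, keep_magnify: bool = True) -> str:
    # Stopping recording ends the simplified display; the apparatus may
    # either keep the Magnify operation (-> enlarged LV 804) or end the
    # enlargement entirely (-> full live view 700).
    if state == SIMPLIFIED:
        return ENLARGED if keep_magnify else FULL_LV
    return state
```

Either way the user never sees the magnified view vanish mid-adjustment: recording start degrades it to the simplified form, and recording stop restores (or cleanly ends) it.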
Super-resolution processing may be performed to improve the image quality of the simplified enlarged display described above. The variable magnification processing for the simplified enlarged display in this case will be described. Fig. 10 illustrates an example of the internal configuration of the outward output unit 115 or the panel output unit 117. A DRAM interface (I/F) 1001 reads image data from the DRAM 113. The resolution conversion unit 1002 performs variable magnification processing on the image data read by the DRAM I/F 1001, increasing the number of pixels by referring to peripheral pixels to interpolate pixels. The super-resolution processing unit 1003 performs variable magnification processing that increases the number of pixels by referring to a plurality of frames (the previous frame and the subsequent frame) of the image data read by the DRAM I/F 1001 to interpolate pixels. The selector 1004 selects whether to output the data read by the DRAM I/F 1001 directly to the subsequent stage (the processing by the peaking processing unit 1005 and thereafter), to output the data processed by the resolution conversion unit 1002, or to output the data processed by the super-resolution processing unit 1003. The peaking processing unit 1005 performs peaking processing on the image acquired via the selector 1004. Peaking processing refers to processing that colors the contour portion of a subject in an image with a colored line (for example, red, blue, or yellow). The contour portion in the video image is highlighted in different colors according to the in-focus state. The colors for highlighting are added only for display and do not affect the recorded video image. The selector 1006 selects whether to output the data subjected to peaking processing by the peaking processing unit 1005 to the signal generation unit 1007, or to output the data received from the selector 1004 without peaking processing.
The output signal generation unit 1007 converts an image received via the selector 1006 into a data format that the output destination 116 and the panel 118 are capable of receiving. In this way, the variable magnification processing (enlargement processing) for the simplified enlarged display is performed as either variable magnification processing using resolution conversion or variable magnification processing using super-resolution processing. If the super-resolution processing interpolates pixels by referring to the previous frame and the next frame, the image recording apparatus 100 can display an image of higher definition than with normal electronic zoom magnification. Therefore, the user can more easily visually confirm whether the object is in the focused state, and thus can perform manual focusing more accurately. Further, performing peaking processing while presenting the simplified enlarged display enables the user to focus the image recording apparatus 100 on the subject even more easily.
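The path through fig. 10 can be modeled as two selector stages: selector 1004 chooses among pass-through, resolution conversion, and super-resolution, and selector 1006 chooses whether peaking is applied before signal generation. A minimal routing sketch, with the actual pixel processing stubbed out as tags; the function name and tag strings are invented:

```python
# Minimal model of the fig. 10 output path: selector 1004 picks the
# scaling method, selector 1006 decides whether peaking is applied before
# signal generation. The "processing" here is just tagging, to show the
# routing; the real units would transform pixel data.

def output_pipeline(frame: str, scaler: str = "none", peaking: bool = False) -> str:
    # selector 1004: pass-through / resolution conversion / super-resolution
    if scaler == "resolution":
        frame = f"resconv({frame})"      # unit 1002: single-frame interpolation
    elif scaler == "super":
        frame = f"superres({frame})"     # unit 1003: uses previous/next frames
    # selector 1006: optionally overlay peaking (colored focus outlines,
    # display-only; never written to the recorded stream)
    if peaking:
        frame = f"peaking({frame})"      # unit 1005
    # unit 1007: convert to a format the display device accepts
    return f"signal({frame})"
```

The routing makes the text's point concrete: peaking always sits after whichever scaler (if any) was selected, and signal generation is always the final stage.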
The various controls described above as performed by the control unit 111 may be performed by a single hardware device, or a plurality of hardware devices may control the entire apparatus by dividing the processing among them.
Further, although the present invention has been described in detail based on representative exemplary embodiments thereof, the present invention is not limited to these specific exemplary embodiments, and various embodiments within a scope not departing from the spirit of the invention are also encompassed. Furthermore, each of the above-described exemplary embodiments is merely one exemplary embodiment of the present invention, and the respective exemplary embodiments can also be combined arbitrarily.
Further, the above-described exemplary embodiments have been described based on an example in which the present invention is applied to the image recording apparatus 100 including the image sensor 102, but the application of the present invention is not limited to this example; the present invention can be applied to any electronic apparatus that performs control to display an input image in an enlarged manner. For example, the present invention can be applied to a moving image recorder that outputs a moving image input from an external input terminal to a display device in an enlarged manner and also records it to an external or built-in recording medium. Similarly, the present invention can be applied to a personal computer, a personal digital assistant (PDA), a mobile phone terminal, a mobile image viewer, a printer device including a display, a digital photo frame, and the like. Further, the present invention can be applied to a music player, a game machine, an electronic book reader, a tablet terminal, a smartphone, a projector, a home appliance, an in-vehicle device including a display, and the like.
According to the present invention, it is possible to present a high-definition enlarged display even with the limited read and write performance of a memory.
Embodiments of the present invention may also be implemented by a computer of a system or apparatus that reads out and executes computer-executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a "non-transitory computer-readable storage medium") to perform the functions of one or more of the above-described embodiments, and/or that includes one or more circuits (e.g., an application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiments, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer-executable instructions from the storage medium to perform the functions of one or more of the above-described embodiments and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiments. The computer may comprise one or more processors (e.g., a central processing unit (CPU), a micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer-executable instructions. The computer-executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), a digital versatile disc (DVD), or a Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
The embodiments of the present invention can also be realized by a method in which software (programs) performing the functions of the above-described embodiments is supplied to a system or an apparatus through a network or various storage media, and a computer (or a central processing unit (CPU) or micro processing unit (MPU)) of the system or apparatus reads out and executes the programs.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims (21)

1. An image recording apparatus, comprising:
an acquisition unit configured to acquire an image;
a reduced image generating unit configured to generate a reduced image by reducing the image acquired by the acquiring unit;
a memory;
a first cutout image generation unit configured to generate a first cutout image by cutting out a part of the image that is not reduced by the reduced image generation unit;
a recording processing unit configured to perform, on the reduced image stored in the memory, recording processing including image processing for recording involving writing into the memory and processing for recording the processed image into the storage unit;
a display control unit configured to control to present a first display by outputting an image based on the reduced image to a display unit, or to control to present an enlarged display larger than the first display by outputting an image based on the first cut-out image to a display unit during a recording process;
a processing unit configured to perform specific processing including face detection or processing for output to an external network, the specific processing involving writing data into or reading data from a memory, different from the recording processing and display processing for presenting a first display or an enlarged display; and
a control unit configured to control not to perform the specific process when the enlarged display is in progress during the recording process.
2. The image recording apparatus according to claim 1, further comprising:
a setting unit configured to set whether to make the enlarged display effective during the recording process based on a selection operation by a user,
wherein the control unit controls, in a case where the enlarged display during the recording process is set to be valid by the setting unit, so that the specific process is not executed during the recording process, and controls, in a case where the enlarged display during the recording process is not set to be valid by the setting unit, so that the specific process is executable during the recording process.
3. The image recording apparatus according to claim 2, wherein in a case where the enlarged display during the recording process is set to be effective by the setting unit, the control unit controls not to perform the specific process regardless of whether the recording process is in progress.
4. The image recording apparatus according to claim 2, wherein in a case where the enlarged display during the recording process is set to be effective by the setting unit, the control unit controls not to execute the specific process when the recording process is in progress, and controls to make the specific process executable when the recording process is in standby in which the recording process is not in progress.
5. The image recording apparatus according to claim 2, wherein in a case where the enlarged display during the recording process is set to be valid by the setting unit, the control unit controls not to perform the specific process regardless of whether the enlarged display is in progress.
6. The image recording apparatus according to claim 2, wherein in a case where the enlarged display during the recording process is set to be valid by the setting unit, the control unit controls not to execute the specific process when the enlarged display is in progress, and controls to make the specific process executable when the enlarged display is not in progress.
7. The image recording apparatus according to claim 2, wherein when the enlarged display during the recording process is not set to be valid by the setting unit, the display control unit controls to present a display indicating that the enlarged display is invalidated without presenting the enlarged display in response to an operation instructing the image recording apparatus to present the enlarged display.
8. The image recording apparatus according to claim 2, further comprising:
a second cutout image generation unit configured to generate a second cutout image by cutting out a part of the reduced image,
wherein the display control unit controls, in a case where the enlarged display is not set to be effective by the setting unit, to present a simplified enlarged display by outputting an image based on the second cut-out image to a display unit during the recording process, instead of presenting an enlarged display by outputting an image based on the first cut-out image to a display unit during the recording process.
9. The image recording apparatus according to claim 8, wherein when the recording process is ended during the simplified enlarged display, the display control unit controls to end the simplified enlarged display.
10. The image recording apparatus according to claim 8, wherein the display control unit controls to display, in the simplified enlarged display, an image generated by performing super-resolution processing for increasing the number of pixels on the second cutout image using interpolation processing that refers to a previous frame image and a subsequent frame image.
11. The image recording apparatus according to claim 8, wherein the display control unit controls to display, in the simplified enlarged display, an image generated by subjecting the second cut-out image to peaking processing for coloring a contour portion in an image.
12. The image recording apparatus according to claim 8, wherein at the time of the simplified enlarged display, the display control unit controls to present a display indicating that the simplified enlarged display is an enlarged display different from an enlarged display displaying an image based on the first cut-out image.
13. The image recording apparatus according to claim 1, wherein the specific process is at least any one of a detection process for detecting a subject from an image and a transmission process for transmitting the image to an external apparatus.
14. The image recording device according to claim 1, further comprising an image pickup unit,
wherein the acquisition unit acquires a live image captured by the imaging unit.
15. The image recording apparatus according to claim 1, wherein the image is a moving image.
16. The image recording apparatus according to claim 1, wherein the process for image recording includes a compression process.
17. The image recording apparatus according to claim 1, wherein the reduced image is a reduced RAW image generated by reducing a RAW image acquired by the acquisition unit.
18. The image recording apparatus according to claim 17, wherein the first cutout image is a cutout RAW image generated by cutting out a part of the RAW image acquired by the acquisition unit.
19. The image recording device according to claim 1,
wherein the first cutout image generation unit generates a first cutout image by cutting out a part of an image larger than the reduced image.
20. The image recording apparatus according to claim 1, wherein the reduced image generating unit generates the 2K image by reducing the 4K image acquired by the acquiring unit;
wherein the first cutout image generation unit generates the first cutout image by cutting out a part of the 4K image acquired by the acquisition unit.
21. The image recording apparatus according to claim 20, wherein the first cutout image is a 2K image.
CN201710223065.6A 2016-04-12 2017-04-07 Image recording apparatus and control method thereof Active CN107295247B (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2016079862 2016-04-12
JP2016-079862 2016-04-12
JP2016191330A JP6808424B2 (en) 2016-04-12 2016-09-29 Image recording device and its control method
JP2016-191330 2016-09-29

Publications (2)

Publication Number Publication Date
CN107295247A CN107295247A (en) 2017-10-24
CN107295247B true CN107295247B (en) 2020-07-14

Family

ID=60084904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710223065.6A Active CN107295247B (en) 2016-04-12 2017-04-07 Image recording apparatus and control method thereof

Country Status (2)

Country Link
JP (1) JP6808424B2 (en)
CN (1) CN107295247B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022028978A (en) * 2018-10-03 2022-02-17 シャープ株式会社 Picture processor, display device, and method for processing picture

Citations (2)

Publication number Priority date Publication date Assignee Title
CN104243951A (en) * 2013-06-07 2014-12-24 索尼电脑娱乐公司 Image processing device, image processing system and image processing method
CN104702838A (en) * 2013-12-05 2015-06-10 佳能株式会社 Image capturing apparatus and control method thereof

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
DE60329244D1 (en) * 2002-01-22 2009-10-29 Canon Kk Apparatus for image processing and control method thereto
JP5300756B2 (en) * 2010-02-05 2013-09-25 キヤノン株式会社 Imaging apparatus and image processing method
JP6148431B2 (en) * 2010-12-28 2017-06-14 キヤノン株式会社 Imaging apparatus and control method thereof
EP2762939B1 (en) * 2011-09-29 2017-06-14 FUJIFILM Corporation Lens system and camera system
JP5873378B2 (en) * 2012-04-10 2016-03-01 キヤノン株式会社 Imaging apparatus and control method thereof
WO2014069228A1 (en) * 2012-11-05 2014-05-08 富士フイルム株式会社 Image processing device, imaging device, image processing method, and program
US9894270B2 (en) * 2013-03-15 2018-02-13 Canon Kabushiki Kaisha Image processing apparatus and image processing method for handling a raw image, of a moving image or a still image
JP2015076782A (en) * 2013-10-10 2015-04-20 キヤノン株式会社 Image processing device, control method therefor, and control program


Also Published As

Publication number Publication date
JP2017192123A (en) 2017-10-19
JP6808424B2 (en) 2021-01-06
CN107295247A (en) 2017-10-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant