JP2017163586A - Imaging apparatus, method of controlling the same, and program



Publication number
JP2017163586A
Authority
JP
Japan
Prior art keywords: unit, display unit, imaging, display, direction
Prior art date
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
JP2017090969A
Other languages
Japanese (ja)
Inventor
Hiroaki Yamaguchi
Yoshiaki Furukawa
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority to JP2017090969A
Publication of JP2017163586A
Status: Pending


Abstract

PROBLEM TO BE SOLVED: To facilitate shooting that includes, as a subject, the user operating the imaging apparatus.
SOLUTION: The imaging apparatus includes a detection unit that detects whether or not a display unit and an imaging unit are in a predetermined relationship, and one or more buttons. The detection unit detects that, by rotating the display unit, the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject, and the function assigned to the one or more buttons is changed in accordance with the detection result of the detection unit.
SELECTED DRAWING: Figure 1

Description

  The present disclosure relates to an imaging apparatus, an imaging apparatus control method, and a program.

  Imaging devices such as digital cameras have become remarkably more functional and more compact.

  In recent years, compact interchangeable-lens cameras called “digital single-lens cameras” have appeared, and so-called compact digital cameras are also popular. Digital single-lens cameras and compact digital cameras (hereinafter simply referred to as “digital cameras” as appropriate) are small and lightweight, so the user of the imaging apparatus (hereinafter simply referred to as the “user” as appropriate) can easily perform shooting with them.

  In a digital camera, imaging is performed by the photoelectric conversion action of an image sensor. Therefore, a digital camera generally includes a display unit that displays the subject to be photographed by the user.

  In order to enable shooting from various angles, imaging apparatuses in which the display unit arranged on the back surface of the main body is movable are also known. Note that Patent Document 1 below proposes switching the arrangement of additional information images displayed together with a captured image in accordance with the movable state of the display unit.

JP 2005-123908 A

  Incidentally, there is a demand for shooting in which the user operating the imaging apparatus is included as a subject. For example, the user may shoot with the lens of the imaging apparatus facing himself or herself while supporting the imaging apparatus with one hand.

  For example, when the display unit of the imaging apparatus is movable and the display surface of the display unit can be directed toward the user, the user can shoot while checking the image displayed on the display unit (hereinafter appropriately referred to as the “subject image”). However, with the display unit directed toward the user, it becomes difficult to operate the buttons and keys arranged on the back surface of the imaging apparatus, and the operability of the imaging apparatus may be impaired.

  Therefore, it is desirable to facilitate shooting including the user who operates the imaging apparatus as a subject.

The first technique is an imaging apparatus that includes a detection unit that detects whether or not a display unit and an imaging unit are in a predetermined relationship, and one or more buttons; the detection unit detects that, by rotating the display unit, the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject, and the function assigned to the one or more buttons is changed in accordance with the detection result of the detection unit.
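As an illustration only (the patent describes apparatus behavior, not source code), the function-reassignment mechanism of the first technique might be sketched as follows; the class, the button names, and both button-to-function maps are hypothetical:

```python
# Hypothetical sketch of the first technique: when the detection unit
# reports that display direction and imaging direction both face the
# subject, the functions assigned to the buttons are changed.

NORMAL_MAP = {"button_1": "menu", "button_2": "playback"}
SELFIE_MAP = {"button_1": "self_timer", "button_2": "smile_shutter"}

class ImagingApparatus:
    def __init__(self):
        # Default assignment used for ordinary (non-self-portrait) shooting.
        self.button_map = dict(NORMAL_MAP)

    def on_detection_result(self, facing_subject: bool) -> None:
        """Reassign button functions according to the detection result."""
        self.button_map = dict(SELFIE_MAP if facing_subject else NORMAL_MAP)

    def press(self, button: str) -> str:
        """Return the function currently assigned to the pressed button."""
        return self.button_map[button]

camera = ImagingApparatus()
camera.on_detection_result(facing_subject=True)  # display rotated toward subject
print(camera.press("button_1"))  # -> self_timer
```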

  The second technique is an imaging apparatus that, when the display unit is rotated so that the display direction of the display unit and the imaging direction of the imaging unit are in the same direction, displays on the display unit an icon related to the automatic recording setting by the self-timer.

  Furthermore, the third technique is an imaging apparatus that includes a detection unit that detects whether or not the display unit and the imaging unit are in a predetermined relationship; the detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject, an icon relating to self-portrait shooting is displayed on the display unit in accordance with the detection result of the detection unit, and the icon is located on the right side when the display unit is viewed from the subject.

  The fourth technique is a method for controlling an imaging apparatus, including detecting whether or not the display unit and the imaging unit are in a predetermined relationship, and changing the function assigned to one or more buttons when it is detected that, by rotating the display unit, the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject.

  Further, the fifth technique is a program that causes a computer to execute a control method of an imaging apparatus, the method including detecting whether or not the display unit and the imaging unit are in a predetermined relationship, and changing the function assigned to one or more buttons when it is detected that, by rotating the display unit, the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject.

  The sixth technique is an imaging apparatus that includes a detection unit that detects whether or not the display unit and the imaging unit are in a predetermined relationship; the detection unit detects that, by rotating the display unit, the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject, and the display position of an icon related to an assigned function is changed in accordance with the detection result of the detection unit.

  Furthermore, the seventh technique is an imaging apparatus that includes a detection unit that detects whether or not the display unit and the imaging unit are in a predetermined relationship, and one or more buttons, and that changes the function assigned to the one or more buttons when, by rotating the display unit, the display direction of the display unit and the imaging direction of the imaging unit come to be in the same direction.

  According to at least one embodiment, it is possible to facilitate shooting including the user who operates the imaging apparatus as a subject.

FIG. 1A is a front view illustrating an example of an imaging apparatus to which a display control apparatus according to an embodiment of the present disclosure is applied.
FIG. 1B is a rear view illustrating an example of an imaging apparatus to which the display control apparatus according to the embodiment of the present disclosure is applied.
FIG. 1C is a schematic diagram illustrating a state in which the display surface of the display unit of the imaging device illustrated in FIGS. 1A and 1B faces the subject.
FIG. 2A is a block diagram illustrating an outline of a configuration of a display control device according to an embodiment of the present disclosure.
FIG. 2B is a block diagram illustrating an example of a configuration of an imaging device to which the display control device according to the embodiment of the present disclosure is applied.
FIGS. 3A and 3B are schematic diagrams illustrating an example of a configuration of a detection unit in an imaging device to which the display control device according to the embodiment of the present disclosure is applied.
FIG. 3C is a left side view illustrating a state in the middle of rotation of the display unit with respect to the housing of the main body unit.
FIG. 4A is a front view showing a state in which both the display surface of the display unit and the imaging surface of the imaging element are directed toward the subject.
FIG. 4B is a top view illustrating a state in which both the display surface of the display unit and the imaging surface of the imaging element are directed toward the subject.
FIG. 4C is a left side view illustrating a state in which both the display surface of the display unit and the imaging surface of the imaging element are directed toward the subject.
FIG. 5A is an image diagram illustrating an example of an image displayed on the display unit in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially antiparallel.
FIG. 5B is an image diagram illustrating another example of an image displayed on the display unit in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially antiparallel.
FIG. 6A is an image diagram illustrating an example of an image displayed on the display unit in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially parallel.
FIG. 6B is a flowchart illustrating an example of processing in the display control apparatus according to the embodiment of the present disclosure.
FIG. 7A is a rear view illustrating an example of an imaging apparatus to which the display control apparatus according to the second embodiment is applied.
FIG. 7B is a schematic diagram illustrating a state in which the display surface of the display unit of the imaging apparatus illustrated in FIG. 7A faces the subject.
FIG. 8A is an image diagram illustrating an example of a setting screen displayed when the user touches an icon displayed on the display unit.
FIG. 8B is a flowchart illustrating an example of processing executed by touching an icon displayed on the display unit.
FIG. 9A is a diagram used for explaining a preferable arrangement of one or more icons arranged on the display unit.
FIG. 9B is a schematic diagram illustrating the relative positional relationship between the user's hand and the imaging device during self-portrait shooting.
FIG. 10A is a rear view illustrating an example of an imaging apparatus to which the display control apparatus according to the third embodiment is applied.
FIG. 10B is a schematic diagram illustrating a state in which the display surface of the display unit of the imaging apparatus illustrated in FIG. 10A faces the subject.
FIG. 10C is an image diagram illustrating an example of a setting screen displayed when a function button arranged on the display unit is pressed.
FIG. 11A is a flowchart illustrating an example of a process of assigning functions to function buttons arranged on the display unit.
FIG. 11B is a flowchart illustrating an example of a process for an operation on a function button arranged on the display unit.
FIGS. 12A to 12C are schematic diagrams illustrating other modes of connection of the display unit to the housing of the main body unit.
FIGS. 13A to 13C are schematic diagrams illustrating other modes of connection of the display unit to the housing of the main body unit.

Hereinafter, embodiments of the display control device and the display control method will be described. The description will be made in the following order.
<1. First Embodiment>
[1-1. Schematic configuration of imaging apparatus]
(1-1-1. Configuration Example of Detection Unit)
(1-1-2. Example of image displayed on display unit)
[1-2. Example of processing in display control apparatus]
<2. Second Embodiment>
[2-1. Schematic configuration of imaging apparatus]
[2-2. Outline of operation of imaging apparatus]
[2-3. Example of processing in display control apparatus]
[2-4. Icon arrangement]
<3. Third Embodiment>
[3-1. Schematic configuration of imaging apparatus]
[3-2. Outline of operation of imaging apparatus]
[3-3. Example of processing in display control apparatus]
<4. Modification>

  The embodiments described below are preferred specific examples of the display control device and the display control method. Although various technically preferable limitations are given in the following description, examples of the display control device and the display control method are not limited to the embodiments below unless a statement limiting the present disclosure is specifically made.

<1. First Embodiment>
Hereinafter, a specific example of the display control device and the display control method according to the present disclosure will be described using a digital camera as an example. As will be apparent from the following description, application examples of the display control device and the display control method of the present disclosure are not limited to digital cameras.

[1-1. Schematic configuration of imaging apparatus]
FIG. 1A is a front view illustrating an example of an imaging apparatus to which a display control apparatus according to an embodiment of the present disclosure is applied.

  As illustrated in FIG. 1A, the imaging device 1 includes, for example, a main body 1b and a lens unit 1r. An imaging unit 15 including an imaging device for converting light from a subject into an electrical signal is disposed inside the housing 10 of the main body 1b. The lens unit 1r includes a lens group for forming an image related to the subject on the imaging surface of the imaging device.

  For example, the lens unit 1r can be freely attached to and detached from the main body 1b. When the lens unit 1r is detachable from the main body 1b, the user can select an optimum lens unit from a plurality of types of lens units according to the shooting scene or the like. Of course, the main body 1b and the lens unit 1r may be integrally formed.

  FIG. 1B is a rear view illustrating an example of an imaging apparatus to which the display control apparatus according to the embodiment of the present disclosure is applied.

  As shown in FIG. 1B, for example, a group of function buttons 61 and a display unit 13 are arranged on the back surface of the main body 1b.

  The function button group 61 includes, for example, a so-called cross key 61a and buttons 61b to 61d. The function button group 61 is used for, for example, menu operation, selection of a shooting mode according to a scene, selection of image data displayed as a thumbnail, and the like.

  Note that the release button R is generally disposed, for example, on the upper surface of the main body 1b, on the right side when the imaging device 1 is viewed from the back. This is because, assuming a right-handed user, the release button R can be easily operated while the imaging device 1 is held in the dominant hand.

  The display unit 13 is, for example, a display such as a liquid crystal display (LCD) or an organic EL (Electroluminescence) display. The display unit 13 displays a subject image obtained by the photoelectric conversion effect of the image sensor. The display unit 13 displays various parameter setting values used for shooting, one or more icons for menu operations, and the like as necessary. In FIG. 1B, the subject image acquired by the image sensor is not shown in order to avoid the figure becoming complicated. The same applies to the following description.

  FIG. 1C is a schematic diagram illustrating a state in which the display surface of the display unit of the imaging device illustrated in FIGS. 1A and 1B faces the subject.

  The display unit 13 is connected to the housing 10 of the main body 1b by, for example, a hinge h1, and, as shown in FIG. 1C, is freely rotatable with respect to the housing 10 of the main body 1b. By rotating the display surface of the display unit 13 toward the subject as shown in FIG. 1C, the user can check the subject image and the like while the imaging unit 15 including the imaging element faces the user.

  FIG. 2A is a block diagram illustrating an outline of a configuration of a display control device according to an embodiment of the present disclosure.

  As illustrated in FIG. 2A, the display control device 11 according to the embodiment of the present disclosure includes a detection unit 17. The detection unit 17 detects whether or not the display unit 13 and the imaging unit 15 have a predetermined relationship. Details of the detection unit 17 will be described later. In the present disclosure, at least one of information related to automatic recording of a subject image obtained by the imaging unit and information related to correction of the subject image obtained by the imaging unit is displayed on the display unit according to the detection result of the detection unit.

  FIG. 2B is a block diagram illustrating an example of a configuration of an imaging device to which the display control device according to the embodiment of the present disclosure is applied.

  The imaging unit 15 includes an imaging element such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS), and acquires an image signal related to the subject by photoelectric conversion. The image signal related to the subject acquired by the imaging unit 15 is output to the control unit 65 described later.

  The operation unit 63 includes various buttons such as the function button group 61 and the release button R described above. The operation unit 63 functions as a user interface for operating the imaging device 1. The operation unit 63 may include an external control device such as a remote controller. An operation signal received by the operation unit 63 and corresponding to a user input operation is output to the control unit 65 described later.

  The control unit 65 is a processing device including a processor, and is configured as, for example, a digital signal processor (DSP) or a CPU (central processing unit). The control unit 65 controls each unit of the imaging device 1 and outputs processing results corresponding to inputs from, for example, the operation unit 63.

  The control unit 65 includes, for example, a display control unit 19, an image processing unit 67, an image recording unit 69, a shooting mode control unit 73, a face detection unit 75, a smile shooting control unit 77, and a skin color adjustment unit 79. Hereinafter, the display control unit 19, the image processing unit 67, the image recording unit 69, the shooting mode control unit 73, the face detection unit 75, the smile shooting control unit 77, and the skin color adjustment unit 79 will be described in order.

  The display control unit 19 performs display control for displaying various data on the display unit 13. As an example of data displayed on the display unit 13, image data related to a subject obtained by the imaging unit 15 can be given. By sequentially displaying the image data related to the subject on the display unit 13, the user can check the current state of the subject with reference to the display unit 13.

  Other examples of data displayed on the display unit 13 include icons indicating the remaining battery level, setting values of parameters used for photographing, and the like. Examples of parameters used for shooting include the presence / absence of a strobe, shutter speed, aperture opening, ISO sensitivity, and the like. The parameters used for shooting include parameters used for shooting including the user who operates the imaging apparatus 1 as a subject (hereinafter referred to as self-portrait shooting as appropriate). As will be described later, examples of parameters used for self-portrait photographing include self-timer and face detection, so-called “smile shutter”, so-called “skin color correction”, and the like.

  The image processing unit 67 performs predetermined signal processing on the image signal related to the subject output from the imaging unit 15 and outputs the image signal after the signal processing. Examples of signal processing for an image signal related to a subject include digital gain adjustment, gamma correction, color correction, and contrast correction.

  The image recording unit 69 compresses the image signal after the signal processing output from the image processing unit 67 by a compression encoding method such as JPEG (Joint Photographic Experts Group), and outputs the compressed data. The image data output from the image recording unit 69 is stored in the storage device 71, for example.

  The storage device 71 includes, for example, an external storage device that can be freely attached to and detached from the imaging device 1 and an internal storage device that is fixed inside the main body. Image data obtained by shooting is stored in the storage device 71. Whether the image data is stored in the external storage device or the internal storage device can be arbitrarily set by the user, for example.

  Note that programs for performing various arithmetic processes and controlling each unit of the imaging apparatus 1 are stored in, for example, a RAM (Random Access Memory) and a ROM (Read-Only Memory) disposed in the control unit 65, or in the storage device 71 connected to the control unit 65. Examples of the storage device 71 include a hard disk, a flash memory, an optical disk, a magneto-optical disk, and an MRAM (Magnetoresistive Random Access Memory).

  The shooting mode control unit 73 performs control for recording image data according to the shooting mode selected by the user. Examples of the shooting mode setting include single shooting (recording of image data for each frame), continuous shooting, and shooting using a self-timer.

  The face detection unit 75 detects a specific target from the image data related to the subject acquired by the imaging unit 15 by, for example, pattern matching. Here, the specific target is, for example, the face of a person or animal included in the subject. Therefore, the face detection unit 75 detects one or more faces included in the image data regarding the subject.

  The user can set face detection by an input operation on the operation unit 63, and the user can select ON or OFF as the face detection setting value.
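The text says only that the face detection unit 75 detects a specific target by pattern matching. The following toy sketch, which is not Sony's algorithm, illustrates the bare idea of template matching on a tiny binary image; real face detectors use far more robust, tolerant matching:

```python
def match_template(image, template):
    """Toy pattern matching: return (row, col) positions where `template`
    occurs exactly inside `image` (both given as lists of lists of ints).
    Exact equality is used only to keep the illustration short."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    hits = []
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            if all(image[r + dr][c + dc] == template[dr][dc]
                   for dr in range(th) for dc in range(tw)):
                hits.append((r, c))
    return hits

# A 2x2 "face" template found inside a 3x4 "frame".
face = [[1, 0],
        [1, 1]]
frame = [[0, 1, 0, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
print(match_template(frame, face))  # -> [(0, 1)]
```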

  The smile photographing control unit 77 determines whether or not the face detected by the face detection unit 75 is a smile. The smile photographing control unit 77 performs control for automatically executing recording of image data when the face detected by the face detection unit 75 is a smile. The function of automatically recording image data in accordance with the determination result of whether or not the subject is smiling is called “smile shutter”.

  The user can set “smile shutter” by an input operation on the operation unit 63, and the user can select on or off as the setting value of “smile shutter”. When on is selected as the setting value of “smile shutter”, the user can further set how much the subject must be smiling for the image data to be recorded. Specifically, for example, the user can further set “Laughter”, “Normal laughter”, “Smile”, or the like as the setting value of “smile shutter”.
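A minimal sketch of the “smile shutter” decision described above, assuming hypothetical numeric smile-score thresholds for the “Laughter” / “Normal laughter” / “Smile” settings (the patent gives no numbers):

```python
# Illustrative thresholds only; the patent does not specify values.
SMILE_THRESHOLDS = {"Laughter": 0.8, "Normal laughter": 0.5, "Smile": 0.3}

def smile_shutter(smile_scores, setting="Normal laughter", enabled=True):
    """Return True if image data should be recorded automatically,
    i.e. if any detected face's smile score reaches the chosen threshold."""
    if not enabled:  # "smile shutter" set to off
        return False
    threshold = SMILE_THRESHOLDS[setting]
    return any(score >= threshold for score in smile_scores)

print(smile_shutter([0.2, 0.6]))                      # -> True
print(smile_shutter([0.2, 0.6], setting="Laughter"))  # -> False
```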

  The skin color adjustment unit 79 corrects recorded image data so that a person's skin looks smooth. Correction of image data relating to a person's face, in particular a person's skin, is called “skin color correction” or the like. That is, the skin color adjustment unit 79 performs image processing on the data relating to the human face detected by the face detection unit 75, and corrects the recorded image data so that the skin of the photographed person looks smooth. Specifically, the skin color adjustment unit 79 performs noise removal on the data relating to the person's face.

  The user can set “skin color correction” by an input operation on the operation unit 63, and the user can select ON or OFF as the setting value of “skin color correction”. When ON is selected as the setting value of “skin color correction”, the user can further set the degree of “skin color correction”. Specifically, for example, the user can further set “strong”, “medium”, or “weak” as “skin color correction”.
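As a hedged illustration of “skin color correction” as described (noise removal on a detected face region, with a strength setting), the sketch below applies repeated 3x3 box-blur passes whose count maps from the “strong” / “medium” / “weak” setting; the mapping and the filter choice are assumptions, not the patent's method:

```python
# Assumed mapping from the user setting to the number of blur passes.
STRENGTH = {"strong": 3, "medium": 2, "weak": 1}

def smooth_region(image, top, left, h, w, setting="medium"):
    """Apply STRENGTH[setting] passes of a 3x3 box blur to the h-by-w
    region of `image` (a list of lists of numbers) starting at (top, left).
    The original image is left untouched; a smoothed copy is returned."""
    img = [row[:] for row in image]
    for _ in range(STRENGTH[setting]):
        out = [row[:] for row in img]
        for r in range(top, top + h):
            for c in range(left, left + w):
                neigh = [img[rr][cc]
                         for rr in range(max(r - 1, 0), min(r + 2, len(img)))
                         for cc in range(max(c - 1, 0), min(c + 2, len(img[0])))]
                out[r][c] = sum(neigh) / len(neigh)
        img = out
    return img

# A single bright "noise" pixel is spread out by one weak pass.
face_patch = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
print(smooth_region(face_patch, 0, 0, 3, 3, "weak")[1][1])  # -> 1.0
```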

(1-1-1. Configuration Example of Detection Unit)
Next, a configuration example of the detection unit 17 and an example of the operation of the detection unit 17 will be described. As described above, the detection unit 17 detects whether or not the display unit 13 and the imaging unit 15 are in a predetermined relationship.

  Here, the predetermined relationship indicates, for example, the relative positional relationship between the display unit 13 and the imaging unit 15. Specifically, the detection unit 17 detects, for example, whether or not both the display surface of the display unit 13 and the imaging surface of the imaging element are directed toward the subject; that is, the predetermined relationship is a relationship in which the subject can see the display content of the display unit 13. Therefore, when the display unit 13 is rotatable with respect to the housing 10 of the main body 1b, for example, the detection unit 17 detects the rotation angle of the display unit 13 with respect to the housing 10 and thereby detects the display direction of the display unit 13.

  When the display unit 13 has been rotated or moved with respect to the housing 10 of the main body 1b so that the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are both directed toward the subject, it can be inferred that the photographer intends to perform shooting that includes himself or herself as a subject. Therefore, in the present disclosure, whether or not the user is about to perform shooting that includes himself or herself as a subject is estimated based on whether or not the display unit and the imaging unit are in a predetermined arrangement.

  3A and 3B are schematic diagrams illustrating an example of a configuration of a detection unit in an imaging device to which the display control device according to the embodiment of the present disclosure is applied. FIG. 3A is a diagram illustrating a back surface of the imaging device 1 to which the display control device 11 according to the embodiment of the present disclosure is applied. FIG. 3B is a diagram illustrating a left side view of the imaging device 1 to which the display control device 11 according to the embodiment of the present disclosure is applied.

  As shown in FIGS. 3A and 3B, the detection unit 17 is configured by a set of a magnetic field sensor 17a and a magnet 17b, for example. The magnetic field sensor 17a is disposed, for example, inside the housing 10 of the main body 1b, and the magnet 17b is disposed, for example, inside the display unit 13. Of course, the magnetic field sensor 17a may be disposed inside the display unit 13, and the magnet 17b may be disposed inside the housing 10 of the main body unit 1b.

  The magnetic field sensor 17a is a sensor including, for example, a Hall element. Magnetic field sensors include digital-output sensors, which output a logical value corresponding to the presence or absence of a magnetic field (in this specification, “magnetic flux density” and “magnetic field” are not distinguished), and analog-output sensors, which output a signal proportional to the magnitude of the magnetic field. As long as it can be detected whether or not the display unit 13 and the imaging unit 15 are in the predetermined arrangement, either a digital-output type or an analog-output type may be used as the magnetic field sensor 17a.
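The distinction between the two sensor output types can be illustrated as follows; the threshold and gain values are assumptions for illustration, not values from the patent:

```python
def digital_output(flux_density, threshold=1.0):
    """Digital-output type: a logical value corresponding to the
    presence or absence of a magnetic field (assumed threshold)."""
    return abs(flux_density) >= threshold

def analog_output(flux_density, gain=0.5):
    """Analog-output type: a signal proportional to the magnitude
    of the magnetic field (assumed gain)."""
    return gain * flux_density

print(digital_output(0.2))  # -> False (field too weak to register)
print(digital_output(2.5))  # -> True  (field present)
print(analog_output(2.0))   # -> 1.0   (proportional signal)
```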

  Now, assume that the user is shooting a subject that does not include himself or herself. In this case, the imaging surface of the image sensor is directed toward the subject, and the user's face faces the display surface of the display unit 13, which is on the side opposite to the imaging surface. In other words, the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially antiparallel. In FIG. 3B, the display direction of the display unit 13 is schematically indicated by an arrow D3, and the imaging direction of the imaging unit 15 is schematically indicated by an arrow D5.

  FIG. 3C is a left side view illustrating a state in the middle of rotation of the display unit with respect to the housing of the main body unit.

  When the display unit 13 is rotated with respect to the housing 10 of the main body 1b so that the display surface of the display unit 13 faces the subject, as indicated by arrow E in FIG. 3C, the magnet 17b moves together with the display unit 13. With the movement of the magnet 17b, the magnetic field in the vicinity of the magnetic field sensor 17a also changes as the display unit 13 rotates.

  FIG. 4A is a front view showing a state in which both the display surface of the display unit and the imaging surface of the imaging device are directed toward the subject. FIG. 4B is a top view illustrating a state in which both the display surface of the display unit and the imaging surface of the imaging element are directed toward the subject. FIG. 4C is a left side view illustrating a state in which both the display surface of the display unit and the imaging surface of the imaging element are directed toward the subject.

  In FIG. 4C, the display direction of the display unit 13 is schematically indicated by an arrow D3, and the imaging direction of the imaging unit 15 is schematically indicated by an arrow D5. As shown in FIG. 4C, in a state where both the display surface of the display unit 13 and the imaging surface of the image sensor are directed toward the subject, the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially parallel.

  Here, if, for example, the magnetic moment of the magnet 17b is parallel to the display direction of the display unit 13 (the direction indicated by the arrow D3), the direction of the magnetic field in the vicinity of the magnetic field sensor 17a in the state shown in FIG. 4C is reversed compared with the state shown in FIG. 3B. The polarity of the output of the magnetic field sensor 17a is therefore reversed, and from this polarity the control unit 65 can determine whether or not both the display surface of the display unit 13 and the imaging surface of the image sensor are directed toward the subject. In other words, the control unit 65 can determine, based on the detection result of the detection unit 17, whether or not shooting that includes the user operating the imaging device 1 as a subject is being performed.
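A minimal sketch of the determination described above, assuming (arbitrarily, for illustration) that the Hall sensor output polarity is negative after the display unit has been rotated toward the subject; the function name and sign convention are not from the patent:

```python
def facing_subject(sensor_output: float) -> bool:
    """Infer, from the polarity of the magnetic field sensor output,
    whether the display surface and the imaging surface both face the
    subject. Assumed convention: rotating the display unit toward the
    subject reverses the field, making the output negative."""
    return sensor_output < 0

print(facing_subject(0.8))   # -> False (normal shooting posture)
print(facing_subject(-0.8))  # -> True  (self-portrait posture)
```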

  In the example described above, the rotation of the display unit 13 with respect to the housing 10 is detected from the change of the magnetic field. However, the method of detecting the rotation or movement of the display unit 13 with respect to the housing 10 is not limited to this example. For example, when the display unit 13 is rotatably connected to the housing 10 by a rotating arm or the like, how far the display unit 13 has been rotated with respect to the housing 10 may be determined by detecting the rotation angle of the rotating arm. Alternatively, for example, electrical contacts may be provided on the housing 10 and the display unit 13, or a notch may be provided in one of the housing 10 and the display unit 13, so that contact of the display unit 13 with the housing 10 is detected.

(1-1-2. Example of image displayed on display unit)
FIG. 5A is an image diagram illustrating an example of an image displayed on the display unit in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially antiparallel.

  In a state where the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially antiparallel, the subject image acquired by the image sensor and, for example, one or more icons indicating the setting values of various parameters used for imaging are displayed on the display unit 13. FIG. 5A shows an example in which an icon C1 indicating the remaining battery level, an icon C2 indicating that the imaging apparatus 1 is in the shooting mode, and the like are displayed on the display unit 13.

  FIG. 5B is an image diagram illustrating another example of an image displayed on the display unit in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially antiparallel.

  FIG. 5B shows an example in which an icon C0 indicating the setting value of the strobe, an icon S0 indicating the setting value of the self-timer, an icon S1 indicating the setting value of face detection, an icon S2 indicating the setting value of "smile shutter", and an icon S3 indicating the setting value of "skin color correction" are further displayed on the display unit 13. In the example shown in FIG. 5B, strobe light emission, the self-timer, and "smile shutter" are turned off, face detection is turned on, and the degree of "skin color correction" is set to "medium". In this way, the setting values of parameters used for self-portrait photography may be further displayed on the display unit.

  As shown in FIGS. 5A and 5B, the display unit 13 displays some or all of the icons indicating the setting values of various parameters. In general, the display format of the imaging apparatus 1 can be switched so that the user can adjust the amount of information displayed regarding the various parameters. That is, the imaging apparatus 1 can switch between the display format shown in FIG. 5A and the display format shown in FIG. 5B by, for example, an operation on the function button group 61 or the like.

  Here, suppose that, in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially antiparallel (the state shown in FIGS. 1A and 1B), the user performs shooting that includes himself/herself as a subject and directs the imaging surface of the image sensor toward himself/herself.

  In that case, since the display surface of the display unit faces the back side of the imaging apparatus as viewed from the user, the user must turn the imaging apparatus over each time he or she wants to check the setting values of various parameters, so that the display surface faces the user. Further, the group of function buttons used for changing the setting values of various parameters is generally arranged on the back surface of the imaging apparatus. Therefore, a conventional imaging apparatus is difficult to handle when the setting values of various parameters need to be changed. As a result, when shooting with the user himself/herself included in the subject, the user needs to set the values of the various parameters in advance before directing the imaging surface of the image sensor toward himself/herself.

  Therefore, in the present disclosure, whether or not the user is about to shoot so as to include himself/herself in the subject is estimated based on whether or not the display unit and the imaging unit are in a predetermined arrangement, and the setting values of various parameters are displayed on the display unit accordingly. At this time, the display control apparatus 11 according to the present disclosure causes the display unit 13 to display, among the one or more icons indicating the setting values of the various parameters, for example, the icons indicating the setting values of parameters used for self-portrait photographing.

  FIG. 6A is an image diagram illustrating an example of an image displayed on the display unit in a state where the display direction of the display unit and the imaging direction of the imaging unit are substantially parallel to each other.

  In the present disclosure, when the user directs the display surface of the display unit toward himself/herself, the display format of the image displayed on the display unit is automatically changed. That is, for example, when the user turns the display surface of the display unit toward himself/herself while the imaging surface of the image sensor is also facing himself/herself, the image displayed on the display unit is changed to, for example, the image illustrated in FIG. 6A.

  In a state where the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially parallel, for example, information related to processing of the image obtained by the imaging unit 15 is displayed on the display unit 13. The processing of the image obtained by the imaging unit 15 includes, for example, processing relating to automatic recording of the subject image, processing relating to correction of the subject image, and processing for detecting a specific target from the image. Examples of the information related to automatic recording of the subject image include the setting value of automatic recording by a self-timer and the setting value of "smile shutter", that is, of automatic recording executed according to a determination of whether or not the subject is smiling. An example of the information related to correction of the subject image is the setting value of "skin color correction". An example of detection of a specific target from the subject image is face detection.

  More specifically, when the user directs the display surface of the display unit 13 toward himself/herself, as shown in FIG. 6A, for example, an icon S1 indicating the setting value of face detection, an icon S2 indicating the setting value of "smile shutter", an icon S3 indicating the setting value of "skin color correction", and the like are displayed on the display unit 13. FIG. 6A shows an example in which the icons S1, S2, and S3 are displayed on the display unit 13, but the image displayed on the display unit 13 is not limited to this example. For example, the number, type, arrangement, and the like of the icons displayed on the display unit 13 may be set arbitrarily by the user.

  In particular, since icons indicating the setting values of parameters used for photographing a person are displayed on the display unit 13, the user can confirm the setting values of the parameters used for self-portrait photographing without turning the imaging device 1 over. Thus, in a state where the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially parallel, icons indicating the setting values of parameters used for photographing a person are preferably displayed on the display unit 13.

[1-2. Example of processing in display control apparatus]
FIG. 6B is a flowchart illustrating an example of processing in the display control apparatus according to the embodiment of the present disclosure. A series of processing described below with reference to FIG. 6B is executed by, for example, the control unit.

  First, in step St1, it is determined whether or not the display unit 13 and the imaging unit 15 are in a predetermined relationship. That is, based on the detection result of the detection unit 17, it is determined, for example, whether or not the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially parallel. If it is not detected that they are substantially parallel, the process ends.

  On the other hand, when the detection unit 17 detects that the display direction of the display unit 13 and the imaging direction of the imaging unit 15 are substantially parallel, the process proceeds to step St2. In step St2, for example, the setting values of the parameters used for self-portrait photographing are read. The setting values of the various parameters are stored, for example, in an internal storage device fixed inside the main body 1b.

  When the reading of the parameter setting values is completed, the process proceeds to step St3. In step St3, the display format of the image displayed on the display unit 13 is changed. That is, for example, the image displayed on the display unit 13 is changed from the image shown in FIG. 5A to the image shown in FIG. 6A.

  At this time, as shown in FIG. 6A, an icon corresponding to each setting value is displayed on the display unit 13. Therefore, by referring to the icons displayed on the display unit 13, the user can easily confirm, for example, that face detection is turned on, that "smile shutter" is turned off, and that the degree of "skin color correction" is set to "medium".
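Steps St1 to St3 above can be sketched as follows. This is an illustrative outline only; the parameter names and the dictionary standing in for the internal storage device are assumptions made for the example.

```python
def on_display_rotation(directions_parallel: bool, storage: dict):
    """Sketch of steps St1-St3: check the detection result, read the
    stored self-portrait parameter values, and return the icons to show.
    `storage` stands in for the internal storage device of the body 1b."""
    # St1: end the process if the directions are not substantially parallel.
    if not directions_parallel:
        return None
    # St2: read the setting values of parameters used for self-portrait shooting.
    face = storage.get("face_detection", "on")
    smile = storage.get("smile_shutter", "off")
    skin = storage.get("skin_color_correction", "medium")
    # St3: change the display format -- here, the list of icons to display.
    return [f"S1:face={face}", f"S2:smile={smile}", f"S3:skin={skin}"]
```

With the defaults above, rotating the display while no values are stored would yield the FIG. 6A-style display of three icons.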

  As described above, according to the first embodiment of the present disclosure, the display format of the image displayed on the display unit 13 is automatically changed when the user directs the display surface of the display unit 13 to himself / herself.

  For example, when the user performs self-portrait shooting, the display surface of the display unit is directed toward the user. However, with the display surface directed toward the user, the imaging device may shake and the user may not be captured well. Therefore, for example, the user performs shooting after making settings for self-portrait shooting, such as enabling the self-timer, on the imaging apparatus in advance. In this case, the user needs to make the various settings before changing the display direction of the display unit.

  Even if the user performs the various settings before changing the display direction of the display unit, there are situations in which the user wants to check the setting values of various parameters, for example whether or not shooting by the self-timer is enabled, while checking his/her own image displayed on the display unit. At such a time, the user cannot confirm the setting values of the various parameters simply by directing the display surface of the display unit toward himself/herself. This is because, in order to check the setting values, the user must call up a display screen on which they appear, for example by performing a menu operation or a display format switching operation.

  For this reason, with a conventional imaging apparatus, in order to switch the display format, the user has to operate the buttons arranged on the back surface of the imaging apparatus while holding a pose. Alternatively, the user has to break the pose, return the display direction of the display unit to the original direction, and operate these buttons while looking at them on the back of the imaging apparatus.

  In the present disclosure, the detection unit detects that the display direction of the display unit has been changed by the user and, according to the detection result, the setting values of parameters used for shooting that includes, for example, the user operating the imaging apparatus as a subject are displayed on the display unit. Therefore, with the display surface of the display unit facing himself/herself, the user can easily confirm whether or not the setting values of the parameters used for shooting are suitable for shooting that includes himself/herself as a subject. Accordingly, the user can easily perform self-portrait shooting without needing to operate the buttons on the back of the imaging apparatus or to break the pose.

  FIG. 6A shows an example in which, when the user directs the display surface of the display unit 13 toward himself/herself, among the one or more icons indicating the setting values of the various parameters used for shooting, the icons indicating the setting values of the parameters used for self-portrait shooting are displayed on the display unit 13. In this way, among the one or more icons indicating the setting values of the various parameters used for shooting, the icons indicating the setting values of the parameters used for self-portrait shooting may be displayed with priority.

  The number of icons displayed on the display unit 13 is preferably five or less, and more preferably about three. If a large number of icons are displayed, it becomes easy for the user to check the setting values of various parameters, but the screen becomes cluttered and it is difficult for the user to check the subject image. When about three icons are displayed, the image displayed on the display unit 13 is not cluttered and the amount of information is well balanced.

  Of course, when the user directs the display surface of the display unit 13 toward himself/herself, the image displayed on the display unit 13 may instead be changed, for example, from the image shown in FIG. 5A to the image shown in FIG. 5B. In this way, the amount of information displayed on the display unit 13 may be adjusted according to the user's needs.

<2. Second Embodiment>
[2-1. Schematic configuration of imaging apparatus]
FIG. 7A is a rear view illustrating an example of an imaging apparatus to which the display control apparatus according to the second embodiment is applied. FIG. 7B is a schematic diagram illustrating a state in which the display surface of the display unit of the imaging apparatus illustrated in FIG. 7A faces the subject.

  As illustrated in FIGS. 7A and 7B, the imaging device 2 to which the display control device 21 according to the second embodiment is applied includes, for example, a main body 2b and a lens unit 2r. An imaging unit 25 including an image sensor is arranged inside the housing 20 of the main body 2b, and, for example, a group of function buttons and a display unit 23 are arranged on the back of the main body 2b. In these respects, the imaging device 2 according to the second embodiment is the same as the imaging device 1 according to the first embodiment. The second embodiment is also the same as the first embodiment in that, according to the detection result of the detection unit 27, at least one of information related to automatic recording of the subject image obtained by the imaging unit 25 and information related to correction of that subject image is displayed on the display unit 23.

  The second embodiment differs from the first embodiment in that the display unit 23 has the function of an input device that receives instructions from the user. That is, the display unit 23 in the second embodiment is configured as a touch panel, and therefore also functions as the operation unit 63 described above.

  In the second embodiment, according to the detection result of the detection unit 27, for example, one or more icons indicating the setting values of the parameters used for self-portrait shooting are displayed on the display unit 23. The second embodiment further differs from the first embodiment in that the user can touch these icons displayed on the display unit 23 to execute the functions corresponding to the contents indicated by the individual icons.

[2-2. Outline of operation of imaging apparatus]
As shown in FIG. 7A, in a state where the display direction of the display unit 23 and the imaging direction of the imaging unit 25 are substantially antiparallel, for example, the subject image acquired by the image sensor and one or more icons indicating parameter setting values are displayed on the display unit 23. Alternatively, for example, the setting values of parameters used for shooting that includes the user operating the imaging device 2 as a subject are further displayed on the display unit 23.

  Here, suppose that when the display surface of the display unit 23 is rotated by the user with respect to the housing 20, the detection unit 27, configured for example by the set of the magnetic field sensor 27a and the magnet 27b, detects that the display unit 23 and the imaging unit 25 are in the predetermined relationship. Then, as in the first embodiment, the display format of the display unit 23 is changed, and information regarding automatic recording of the subject image obtained by the imaging unit 25 or information regarding correction of the subject image is displayed on the display unit 23. Specifically, as shown in FIG. 7B, for example, an icon S21 indicating the setting value of face detection, an icon S22 indicating the setting value of "smile shutter", an icon S23 indicating the setting value of "skin color correction", and the like are displayed on the display unit 23.

  As described above, in the second embodiment, the display unit 23 is configured as a touch panel. When the user's touch on one of the icons S21 to S23 displayed on the display unit 23 is detected, the display control device 21 according to the second embodiment causes, for example, the screen displayed on the display unit 23 to transition to a screen for changing the setting value of the parameter indicated by the touched icon. Therefore, during self-portrait shooting, the user can change the setting corresponding to the process indicated by each of the icons S21 to S23 by touching that icon on the display unit 23.

  Note that, by a so-called "flick" operation or a tracing operation by the user, icons different from the one or more icons initially displayed on the display unit 23 when its display surface is directed toward the user may appear on the display unit 23. For example, when the user traces downward near the icons S21 to S23 displayed on the display unit 23, the icons S21 to S23 may move as if flowing downward. In this case, for example, the icon S23 indicating the setting value of "skin color correction" is hidden at the bottom of the screen by the tracing operation, and, for example, an icon indicating the setting value of the self-timer appears from the top of the screen. In this way, the user can easily change the setting values of the various parameters used for shooting.

  FIG. 8A is an image diagram illustrating an example of a setting screen displayed when the user touches an icon displayed on the display unit.

  FIG. 8A shows the state after the user touches the icon S22 displayed on the display unit 23. When the user touches the icon S22, as shown in FIG. 8A, the screen displayed on the display unit 23 transitions, for example, to a screen for changing the setting corresponding to the process indicated by the icon S22.

  In FIG. 8A, an icon Sa22 for setting the "smile shutter" setting value to "big laughter", an icon Sb22 for setting it to "normal laughter", an icon Sc22 for setting it to "smile", an icon Sd22 for turning it off, and an icon Se22 for closing the setting screen are displayed. The user can switch the setting value of the process indicated by the icon S22, that is, the setting value of "smile shutter", by touching any of the icons Sa22, Sb22, Sc22, and Sd22.

  FIG. 8A shows a state in which off is selected as the setting value of "smile shutter". For example, after off is selected, when the user touches the icon Se22 for closing the setting screen, the "smile shutter" setting is confirmed as off and the setting screen is closed.

  Note that when the user's touch on one of the icons S21 to S23 is detected, instead of transitioning to a setting screen, the setting corresponding to the process indicated by that icon may be changed directly. That is, among the icons S21 to S23 displayed on the display unit 23, for example, each time the icon S23 is touched by the user, the setting value of "skin color correction" may be switched in the order off → "weak" → "medium" → "strong" → off → .... In this case, the user can switch the setting value of the process indicated by the icon S23, that is, the setting value of "skin color correction", by repeatedly touching the icon S23.
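The per-touch cyclic switching described above amounts to stepping through a fixed sequence of values with wraparound. The following sketch assumes the four values named in the text; the function name is illustrative.

```python
# The four "skin color correction" values, in the cycling order from the text.
SKIN_LEVELS = ["off", "weak", "medium", "strong"]

def next_skin_level(current: str) -> str:
    """Advance the "skin color correction" value one step per touch,
    wrapping around: off -> weak -> medium -> strong -> off -> ..."""
    i = SKIN_LEVELS.index(current)
    return SKIN_LEVELS[(i + 1) % len(SKIN_LEVELS)]
```

Each touch on the icon S23 would call this once, so four consecutive touches return the setting to its starting value.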

[2-3. Example of processing in display control apparatus]
FIG. 8B is a flowchart illustrating an example of processing executed by touching an icon displayed on the display unit. In FIG. 8B, it is assumed that the display unit 23 and the imaging unit 25 have a predetermined relationship. A series of processing described below with reference to FIG. 8B is executed by, for example, the control unit.

  First, in step St21, it is determined whether or not the user has touched the display unit 23 configured as a touch panel. If it is not detected that the user has touched the display unit 23, the process ends.

  On the other hand, when the detection unit 27 detects that the user has touched the display unit 23, the process proceeds to step St22. In step St22, it is determined in which display area of the display unit 23 the touch was detected, that is, which of the icons S21 to S23 displayed on the display unit 23 the user touched. In the following, U denotes a variable designating the area (which may be expressed as coordinates) in which each of the icons S21 to S23 is displayed, and the values "a", "b", and "c" are assigned to the areas in which the icons S21, S22, and S23 are displayed, respectively.

  For example, when the icon S21 displayed on the display unit 23 is touched by the user, the screen displayed on the display unit 23 transitions, in step St23, to a screen for changing the face detection setting. When the icon S22 is touched, the screen transitions, in step St24, to a screen for changing the "smile shutter" setting. Further, when the icon S23 is touched, the screen transitions, in step St25, to a screen for changing the "skin color correction" setting. When the user touches an area away from the areas in which the icons S21 to S23 are displayed (when U == NULL in FIG. 8B), the process ends.
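The branching of steps St22 to St25 on the variable U can be sketched as a simple lookup. The screen identifiers below are invented names for illustration; only the mapping of "a"/"b"/"c" to the icons S21 to S23 follows the text.

```python
def dispatch_touch(u):
    """Sketch of steps St22-St25: map the touched-area variable U to the
    setting screen to display. "a", "b", "c" correspond to the areas of
    icons S21, S22, S23; None (U == NULL) means the touch fell outside
    every icon and the process ends."""
    screens = {
        "a": "face_detection_settings",      # St23
        "b": "smile_shutter_settings",       # St24
        "c": "skin_color_correction_settings",  # St25
    }
    return screens.get(u)  # None -> no transition, process ends
```

For example, a touch in area "b" (icon S22) selects the "smile shutter" setting screen, and a touch outside all three icons selects nothing.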

[2-4. Icon arrangement]
The one or more icons displayed on the display unit 23 when its display surface is directed toward the user are preferably arranged, for example, at positions away from the release button R as viewed from the user. This allows the user, for example, to operate the release button R of the imaging device 2 with the left hand and the icons displayed on the display unit 23 with the right hand. At this time, it is preferable that the finger or hand of the user reaching to touch the display unit 23 not block the light incident on the imaging unit 25.

  FIG. 9A is a diagram used for explaining a preferable arrangement of one or more icons arranged on the display unit. FIG. 9B is a schematic diagram illustrating a relative positional relationship between the user's hand and the imaging device during self-portrait shooting.

  As shown in FIG. 9A, assume a straight line m passing through the center of the screen of the display unit 23 and the center of the imaging unit 25. In FIG. 9A, the straight line m is indicated by a one-dot chain line. Further, as shown in FIG. 9A, assume a region including the imaging device 2, indicated in FIG. 9A by a two-dot chain line.

  Here, assume that the region including the imaging device 2 is divided by the straight line m into two regions V1 and V2. At this time, the one or more icons displayed when the display surface of the display unit 23 is directed toward the user are preferably arranged in the portion of the display surface that lies in the region not including the release button R. Specifically, for example, the icons S21 to S23 are preferably displayed in the portion of the display surface of the display unit 23 that lies in the region V2, on the side not including the release button R.

  This is because, when the one or more icons are displayed at positions relatively distant from the release button R, as shown in FIG. 9B, the hand of the user reaching to touch the display unit 23 does not block the light incident on the imaging unit 25.

  The one or more icons need only be arranged at positions relatively distant from the release button R, and not all of them necessarily have to be displayed in the region V2 that does not include the release button R.

  For example, all or some of the one or more icons may be displayed at positions farther from the release button R than the center of the screen of the display unit 23. Alternatively, for example, when the screen of the display unit 23 is rectangular, all or some of the one or more icons may be displayed in whichever of two halves of the screen is relatively distant from the release button R. Possible ways of dividing the screen include, for example, dividing it in two vertically, dividing it in two horizontally, and dividing it in two along a diagonal. For example, when the release button R is positioned on the left side as viewed from the user during self-portrait shooting, displaying the one or more icons on the right side of the screen makes it easy for the user to operate them with the right hand.
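The left/right two-division placement described above can be sketched as follows, assuming a horizontal division of a rectangular screen into two halves; the function name and pixel-range representation are illustrative.

```python
def icon_region(release_button_on_left: bool, screen_width: int) -> range:
    """Return the horizontal pixel range in which to lay out the icons,
    choosing the half of the screen away from the release button R
    (left/right two-division of a rectangular screen)."""
    half = screen_width // 2
    # Icons go on the side opposite the release button, so the user's
    # touching hand stays clear of the imaging unit.
    return range(half, screen_width) if release_button_on_left else range(0, half)
```

For example, with a 640-pixel-wide screen and the release button on the user's left, the icons would be laid out in the right half, columns 320 to 639.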

  As described above, the arrangement of the one or more icons displayed on the display unit 23 may be adjusted as appropriate according to the arrangement of the display unit 23 and the imaging unit 25 with respect to the housing 20 of the main body 2b, or according to the direction in which the user holds the imaging device 2 during self-portrait shooting.

  According to the second embodiment of the present disclosure, the display format of the image displayed on the display unit 23 is automatically changed when the user directs the display surface of the display unit 23 toward himself/herself, so the user can easily confirm the setting values of the parameters used for self-portrait shooting. Furthermore, in the second embodiment, by touching the one or more icons displayed on the display unit 23, the user can change the setting values of the parameters indicated by those icons. Therefore, the user can easily change the setting values of the parameters used for self-portrait shooting while confirming the subject image displayed on the display unit 23, without having to call up a menu screen.

<3. Third Embodiment>
[3-1. Schematic configuration of imaging apparatus]
FIG. 10A is a rear view illustrating an example of an imaging apparatus to which the display control apparatus according to the third embodiment is applied. FIG. 10B is a schematic diagram illustrating a state in which the display surface of the display unit of the imaging apparatus illustrated in FIG. 10A faces the subject.

  As shown in FIGS. 10A and 10B, the imaging device 3 to which the display control device 31 according to the third embodiment is applied includes, for example, a main body 3b and a lens unit 3r. An imaging unit 35 including an image sensor is arranged inside the housing 30 of the main body 3b, and, for example, a group of function buttons and a display unit 33 are arranged on the back of the main body 3b. In these respects, the imaging device 3 according to the third embodiment is the same as the imaging device 1 according to the first embodiment. The third embodiment is also the same as the first embodiment in that, according to the detection result of the detection unit 37, information about automatic recording of the subject image obtained by the imaging unit 35 or information about correction of the subject image is displayed on the display unit 33.

  The third embodiment differs from the first embodiment in that the display unit 33 includes one or more function buttons 34, and in that, according to the detection result of the detection unit 37, each of the one or more function buttons 34 arranged on the display unit 33 is assigned a function corresponding to information regarding processing of the image obtained by the imaging unit 35.

[3-2. Outline of operation of imaging apparatus]
In the example illustrated in FIG. 10A, four function buttons 34a to 34d are arranged on the display unit 33. In a state where the display direction of the display unit 33 and the imaging direction of the imaging unit 35 are substantially antiparallel, the function buttons 34a to 34d are assigned, like the group of function buttons on the back of the housing 30, functions such as menu operation and selection of a shooting mode according to the scene.

  Here, suppose that when the user rotates the display surface of the display unit 33 with respect to the housing 30, the detection unit 37, configured for example by the set of the magnetic field sensor 37a and the magnet 37b, detects that the display unit 33 and the imaging unit 35 are in the predetermined relationship. Then, as in the first embodiment, the display format of the display unit 33 is changed, and information related to processing of the image obtained by the imaging unit 35 is displayed on the display unit 33. Specifically, as shown in FIG. 10B, for example, an icon S31 indicating the setting value of face detection, an icon S32 indicating the setting value of "smile shutter", an icon S33 indicating the setting value of "skin color correction", and the like are displayed on the display unit 33. At this time, each of the icons S31 to S33 is displayed in the vicinity of the function buttons 34a to 34c.

  In the third embodiment, in addition to changing the display format of the display unit 33, the functions realized by operating the function buttons 34a to 34c arranged on the display unit 33 are changed to functions related to the settings of parameters used for self-portrait shooting. In other words, in the third embodiment, according to the detection result of the detection unit 37, each of the function buttons 34a to 34c is assigned a function for changing the setting corresponding to the processing of the image obtained by the imaging unit 35. Therefore, during self-portrait shooting, the user can change the setting corresponding to the process indicated by each of the icons S31 to S33 by operating the corresponding one of the function buttons 34a to 34c.
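The reassignment of the function buttons according to the detection result can be sketched as follows. The function names in the dictionaries are illustrative placeholders; only the pairing of buttons 34a to 34c with the processes of icons S31 to S33 in the self-portrait state follows the text.

```python
def assign_button_functions(self_portrait: bool) -> dict:
    """Sketch of the third embodiment's button reassignment: the
    mapping of function buttons 34a-34c depends on whether the display
    and imaging directions are substantially parallel."""
    if self_portrait:
        # Display direction and imaging direction substantially parallel:
        # buttons change the self-portrait parameters shown by S31-S33.
        return {"34a": "toggle_face_detection",
                "34b": "set_smile_shutter",
                "34c": "set_skin_color_correction"}
    # Substantially antiparallel (normal rear-screen use):
    # conventional functions, like the rear function button group.
    return {"34a": "open_menu",
            "34b": "select_scene_shooting_mode",
            "34c": "other_function"}
```

Calling this whenever the detection result of the detection unit 37 changes would keep the button functions consistent with the icons displayed at that moment.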

  In addition, the function buttons 34a to 34d are preferably arranged at positions relatively distant from the release button R. For example, when the release button R is positioned on the left side as viewed from the user during self-portrait shooting, arranging the function buttons 34a to 34d on the right side of the screen makes them easy for the user to operate with the right hand.

  FIG. 10C is an image diagram illustrating an example of a setting screen displayed when a function button arranged on the display unit is pressed.

  FIG. 10C illustrates a state in which the user has pressed the function button 34a. When the user presses the function button 34a, as shown in FIG. 10C, the screen displayed on the display unit 33 is changed to a screen for changing the setting corresponding to the process indicated by the icon S31. The user can then switch the setting value of the process indicated by the icon S31, that is, turn face detection on or off, by repeatedly pressing the function button 34a.

  Note that FIG. 10C shows a state in which OFF is selected as the face detection setting value. For example, when the user presses the function button 34d after selecting OFF as the face detection setting value, the face detection setting is determined to be OFF and the setting screen is closed. Alternatively, for example, when a certain time has elapsed after the user selects OFF as the face detection setting value, the face detection setting is determined to be OFF and the setting screen is closed.
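The press-to-toggle behavior with confirm-or-timeout closing can be sketched as a small state object. This is a sketch under stated assumptions: the 3-second timeout, the injectable clock, and all names are illustrative and do not appear in the patent.

```python
import time

class FaceDetectionSettingScreen:
    """Sketch of the setting screen above: repeated presses of button 34a
    toggle face detection ON/OFF; pressing button 34d, or a period of
    inactivity, commits the value and closes the screen.
    The timeout value is an assumption."""

    def __init__(self, timeout_s: float = 3.0, clock=time.monotonic):
        self.face_detection_on = True
        self.timeout_s = timeout_s
        self.clock = clock
        self.last_press = clock()
        self.is_open = True

    def press_34a(self) -> None:
        # Toggle the displayed setting value (ON <-> OFF).
        self.face_detection_on = not self.face_detection_on
        self.last_press = self.clock()

    def press_34d(self) -> None:
        # Commit the current value and close the setting screen.
        self.is_open = False

    def tick(self) -> None:
        # Close automatically after a fixed time with no presses.
        if self.clock() - self.last_press > self.timeout_s:
            self.is_open = False
```

Injecting the clock keeps the timeout logic testable without real waiting.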

[3-3. Example of processing in display control apparatus]
FIG. 11A is a flowchart illustrating an example of a process of assigning functions to function buttons arranged on the display unit. A series of processes described below with reference to FIG. 11A is executed by, for example, the control unit.

  First, in step St31, it is determined whether or not the display unit 33 and the imaging unit 35 have a predetermined relationship. That is, based on the detection result of the detection unit 37, for example, it is determined whether or not the display direction of the display unit 33 and the imaging direction of the imaging unit 35 are substantially parallel. If it is not detected that the display direction of the display unit 33 and the imaging direction of the imaging unit 35 are substantially parallel, the process ends.

  On the other hand, when the detection unit 37 detects that the display direction of the display unit 33 and the imaging direction of the imaging unit 35 are substantially parallel, the process proceeds to step St32. In step St32, the display format of the display unit 33 is changed. For example, icons S31 to S33 indicating parameter setting values are displayed on the display unit 33.

  When the change of the display format of the display unit 33 is completed, the process proceeds to step St33. In step St33, functions for changing the settings respectively indicated by the icons S31 to S33 displayed on the display unit 33 are assigned to operations on the function buttons 34a to 34c.

  Specifically, for example, a function for switching on and off face detection is assigned to the function button 34a located near the icon S31 indicating the face detection setting value. For example, a function for switching the setting value of “smile shutter” is assigned to the function button 34b located near the icon S32 indicating the setting value of “smile shutter”. Further, for example, a function for switching the setting value of the degree of “skin color correction” is assigned to the function button 34c located near the icon S33 indicating the setting value of “skin color correction”. Each of the icons S31 to S33 is displayed in the vicinity of the function buttons 34a to 34c, so that the user can easily know what function is assigned to which button.

  FIG. 11B is a flowchart illustrating an example of a process for an operation on a function button arranged on the display unit. A series of processing described below with reference to FIG. 11B is executed by, for example, the control unit.

  First, in step St34, it is determined whether or not any of the function buttons 34a to 34c located near the icons S31 to S33 displayed on the display unit 33 has been pressed. If none of the function buttons 34a to 34c arranged on the display unit 33 has been pressed, the process ends.

  Next, in step St35, it is determined which of the function buttons 34a to 34c arranged on the display unit 33 has been pressed. In the following, N is a variable designating one of the function buttons 34a to 34c, taking the value "a", "b", or "c" for the function buttons 34a, 34b, and 34c, respectively.

  For example, when the function button 34a is pressed by the user, the screen displayed on the display unit 33 is changed to a screen for changing the face detection setting in step St36. For example, when the function button 34b is pressed by the user, the screen displayed on the display unit 33 is changed to a screen for changing the setting of “smile shutter” in step St37. For example, when the function button 34c is pressed by the user, the screen displayed on the display unit 33 is changed to a screen for changing the setting of “skin color correction” in step St38.
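Steps St34 to St38 amount to a branch on the variable N, which a dispatch table captures directly. A minimal sketch; the screen identifiers are assumptions for illustration only.

```python
from typing import Optional

# Illustrative dispatch for steps St34-St38: branch on the variable N
# ("a", "b", or "c") and open the corresponding setting screen.
SETTING_SCREENS = {
    "a": "face_detection_settings",         # St36
    "b": "smile_shutter_settings",          # St37
    "c": "skin_color_correction_settings",  # St38
}

def handle_function_button(n: Optional[str]) -> Optional[str]:
    """St34: if no button was pressed, end; St35: branch on N."""
    if n is None:
        return None  # no button pressed; the process ends
    return SETTING_SCREENS.get(n)
```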

  According to the third embodiment of the present disclosure, the display format of the image displayed on the display unit 33 is automatically changed when the user points the display surface of the display unit 33 toward himself/herself. Therefore, the user can easily confirm the setting values of the parameters used for self-portrait shooting. Further, in the third embodiment, in self-portrait shooting, functions for changing the settings indicated by one or more icons displayed on the display unit 33 are assigned to one or more function buttons arranged on the display unit 33, whose display surface faces the user. Therefore, the user can easily change the setting values of the various parameters used for shooting while confirming the subject image displayed on the display unit 33, without turning the imaging device 3 around in order to operate function buttons arranged on the back of the imaging device 3.

<4. Modification>
The preferred embodiments have been described above, but the present disclosure is not limited to the above-described examples, and various modifications can be made.

  In the embodiment described above, an example in which the display unit is rotatable with respect to the housing of the main body unit has been described, but the manner of connection of the display unit with respect to the housing of the main body unit is not limited to this example.

  FIGS. 12A to 12C and FIGS. 13A to 13C are schematic views showing other modes of connection of the display unit to the housing of the main body unit. FIGS. 12A and 12B are diagrams illustrating the back surface of the imaging device 4 to which the display control device according to the embodiment of the present disclosure is applied. FIG. 12C is a diagram illustrating the front of the imaging device 4 illustrated in FIGS. 12A and 12B.

  As shown in FIG. 12A, in the state where the display direction of the display unit 43 and the imaging direction of the imaging unit 45 are substantially antiparallel, the subject image and, for example, one or more icons indicating the setting values of various parameters used for shooting are displayed on the display unit 43.

  The display unit 43 of the imaging device 4 is a slide-type panel. As indicated by the arrow F1 in FIG. 12B, the display unit 43 is supported by, for example, rails g1 and g2 arranged on the housing 40 of the main body 4b, and is slidable downward with respect to the housing 40 of the main body 4b.

  FIG. 12C shows a state in which the display surface of the display unit 43 is directed toward the user by sliding the display unit 43 downward with respect to the housing 40 of the main body unit 4b. As illustrated in FIG. 12C, the display unit 43 includes, for example, a sub screen 43d on the front side of the imaging device 4 in addition to the main screen on the back side of the imaging device 4.

  For example, it is assumed that the detection unit 47, configured by a set of the magnetic field sensor 47a and the magnet 47b, detects that the display unit 43 and the imaging unit 45 are in the predetermined relationship. Then, on the sub screen 43d, for example, one or more icons indicating the setting values of parameters used for self-portrait shooting are displayed. That is, the detection unit 47 detects that the display unit 43 has been moved with respect to the housing 40 of the main body unit 4b so that the sub screen 43d faces the user, placing the display unit 43 and the imaging unit 45 in the predetermined relationship.
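The magnet-and-sensor arrangement can be sketched as a threshold test: sliding the display brings the magnet near the sensor, raising the measured field. A minimal sketch, assuming a scalar field reading and an arbitrary threshold; neither value comes from the patent.

```python
def in_predetermined_relationship(field_strength: float,
                                  threshold: float = 0.5) -> bool:
    """Sketch of the detection unit 47: report the predetermined
    relationship when the magnet 47b, moved by sliding the display unit,
    brings the field measured at the sensor 47a above a threshold
    (the threshold value and units are assumptions)."""
    return field_strength >= threshold
```

A real implementation would likely debounce the reading or use a Hall-effect sensor with built-in hysteresis, so that small vibrations near the threshold do not toggle the display format.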

  Further, for example, the display unit may be slidable and rotatable with respect to the housing of the main body unit.

  FIGS. 13A and 13B are diagrams illustrating the back surface of the imaging device 5 to which the display control device according to the embodiment of the present disclosure is applied. FIG. 13C is a diagram illustrating the front of the imaging device 5 illustrated in FIGS. 13A and 13B.

  FIG. 13A shows an example in which the hinge part h2 connecting the housing 50 of the main body part 5b and the display part 53 is slidably supported by, for example, rails g3 and g4 arranged on the housing 50 of the main body part 5b, and the display part 53 is rotatable with respect to the hinge part h2. That is, as indicated by the arrow F2 in FIG. 13A, the user can slide the display unit 53 and the hinge part h2 downward with respect to the housing 50 of the main body unit 5b. Further, as indicated by the arrow F3 in FIG. 13B, the user can rotate the display unit 53 with respect to the hinge part h2 about the axis L indicated by the broken line in FIG. 13B.

  FIG. 13C shows a state in which the display surface of the display unit 53 is directed toward the user by sliding the display unit 53 downward with respect to the housing 50 of the main body unit 5b and rotating it with respect to the hinge part h2. For example, it is assumed that the detection unit 57, configured by a set of the magnetic field sensor 57a and the magnet 57b, detects that the display unit 53 and the imaging unit 55 are in the predetermined relationship. Then, also in this case, one or more icons indicating the setting values of parameters used for self-portrait shooting are displayed on the display unit 53, for example.

  In the above-described embodiments, examples have been shown in which the display unit appears on the upper side or the lower side of the main body unit in a state where the display surface of the display unit is directed toward the subject; however, the display unit may instead appear on a side surface of the main body unit.

  In the present disclosure, the connection mode between the imaging unit and the display unit is not particularly limited, and the detection unit only needs to be able to detect the final state of the display unit and the imaging unit relative to each other. By detecting whether or not the display unit and the imaging unit are in the predetermined relationship in this way, the user can perform automatic imaging and the like without troublesome operations.

  In the above-described embodiments, examples have been shown in which one or more icons indicating the setting values of parameters used for shooting are displayed when the display unit and the imaging unit are in the predetermined relationship. It may be further determined whether or not a person is included in the subject; for example, one or more icons indicating the setting values of parameters used for self-portrait shooting may be displayed on the display unit only when the display unit and the imaging unit are in the predetermined relationship and the subject includes a person.
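The variant described above gates the icon display on two independent conditions. A minimal sketch, assuming boolean results already supplied by the detection unit and a face detector:

```python
def show_self_portrait_icons(predetermined_relationship: bool,
                             person_in_subject: bool) -> bool:
    """Display the self-portrait parameter icons only when the display
    and imaging units are in the predetermined relationship AND the
    subject includes a person. Both inputs are assumed to come from
    other modules (the detection unit and a face detector)."""
    return predetermined_relationship and person_in_subject
```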

  By changing the type of parameter whose setting value is displayed on the display unit depending on whether the user is shooting a landscape or a person, the operability of an imaging apparatus to which the display control device of the present disclosure is applied can be further improved.

  The present disclosure can be applied not only to still image shooting but also to moving image shooting.

  In the above-described embodiments, examples of imaging devices including the display control device have been described, but the application of the display control device is not limited to these examples. The display control device of the present disclosure can be applied to any electronic device that includes a display unit movable with respect to a housing provided with an imaging unit. For example, the present disclosure can also be applied to mobile phones, smartphones, electronic book readers, audio players, personal computers (tablet, laptop, and desktop), personal digital assistants (PDAs), electronic notebooks, web cameras, video game machines, and the like.

  Note that the configurations, methods, shapes, numerical values, and the like given in the above-described embodiments are merely examples, and different configurations, methods, shapes, numerical values, and the like may be used as necessary. The configurations, methods, shapes, numerical values, and the like of the above-described embodiments can be combined with each other without departing from the gist of the present disclosure.

For example, this indication can also take the following composition.
(1)
A detection unit that detects whether or not the display unit and the imaging unit have a predetermined relationship;
With one or more buttons,
The detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit;
An imaging apparatus that changes a function assigned to the one or more buttons according to a detection result of the detection unit.
(2)
The imaging apparatus according to (1), wherein the function assigned to the one or more buttons is a function related to imaging.
(3)
The image pickup apparatus according to (1) or (2), wherein the one or more buttons include a release button.
(4)
The imaging device according to any one of (1) to (3), wherein the display unit is rotatably connected to a housing provided with the imaging unit.
(5)
One or more buttons,
The imaging device according to any one of (1) to (4), wherein a function for changing a setting corresponding to the automatic recording is assigned to each of the one or more buttons according to a detection result of the detection unit.
(6)
The imaging device according to any one of (1) to (5), wherein the display unit is configured as a touch panel.
(7)
The imaging device according to (6), wherein, when the user performs an operation on the display unit in a state where an icon regarding the setting of automatic recording by the self-timer is displayed on the display unit, an icon different from the icon regarding the setting of automatic recording by the self-timer is displayed.
(8)
The imaging device according to any one of (1) to (7), wherein the imaging unit includes a lens unit.
(9)
An imaging apparatus that displays an icon relating to setting of automatic recording by a self-timer on the display unit when the display direction of the display unit and the imaging direction of the imaging unit are oriented in the same direction by rotating the display unit.
(10)
A detection unit that detects whether the display unit and the imaging unit have a predetermined relationship;
The detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject;
In accordance with the detection result of the detection unit, an icon related to self-portrait shooting is displayed on the display unit, and the icon is located on the right side when the display unit is viewed from the subject.
(11)
Detecting whether the display unit and the imaging unit have a predetermined relationship,
Imaging that changes the function assigned to one or more buttons when it is detected that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit Control method of the device.
(12)
Detecting whether the display unit and the imaging unit have a predetermined relationship,
Imaging that changes the function assigned to one or more buttons when it is detected that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit A program for causing a computer to execute a device control method.
(13)
A detection unit that detects whether the display unit and the imaging unit have a predetermined relationship;
The detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit;
An imaging device that changes a display position of an icon related to an assigned function according to a detection result of the detection unit.
(14)
A detection unit that detects whether or not the display unit and the imaging unit have a predetermined relationship;
With one or more buttons,
An imaging apparatus that changes a function assigned to the one or more buttons when the display direction of the display unit and the imaging direction of the imaging unit are oriented in the same direction by rotating the display unit.
(15)
The imaging device according to (14), wherein the display unit is configured to be rotatable around a rotation axis.

1, 2, 3, 4, 5 ... imaging device; 10, 20, 30, 40, 50 ... housing; 11, 21, 31 ... display control device; 13, 33, 43, 53 ... display unit; 23 ... display unit (touch panel); 15, 25, 35, 45, 55 ... imaging unit; 17, 27, 37, 47, 57 ... detection unit; 19 ... display control unit; 34a, 34b, 34c, 34d ... function button; 71 ... storage device; R ... release button

In order to solve the above-described problem, the first technique is an imaging apparatus including a display unit, a detection unit that detects rotation of the display unit, and one or more buttons, wherein, when the detection unit detects that the rotation of the display unit satisfies a predetermined state, the function assigned to the one or more buttons is changed and the notification in the vicinity of the button whose function has been changed is changed.

Further, the second technique is a method of controlling an imaging apparatus that includes detecting rotation of the display unit, changing the function assigned to one or more buttons when it is detected that the rotation of the display unit satisfies a predetermined state, and changing the notification in the vicinity of the button whose function has been changed.

Further, the third technique is a program that causes a computer to execute a method of controlling an imaging apparatus, the method including detecting rotation of the display unit, changing the function assigned to one or more buttons when it is detected that the rotation of the display unit satisfies a predetermined state, and changing the notification in the vicinity of the button whose function has been changed.

Claims (15)

  1. A detection unit that detects whether or not the display unit and the imaging unit have a predetermined relationship;
    With one or more buttons,
    The detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit;
    An imaging apparatus that changes a function assigned to the one or more buttons according to a detection result of the detection unit.
  2. The imaging apparatus according to claim 1, wherein the function assigned to the one or more buttons is a function related to imaging.
  3. The imaging apparatus according to claim 1, wherein the one or more buttons include a release button.
  4. The imaging device according to claim 1, wherein the display unit is rotatably connected to a housing provided with the imaging unit.
  5. One or more buttons,
    The imaging apparatus according to claim 1, wherein a function for changing a setting corresponding to the automatic recording is assigned to each of the one or more buttons according to a detection result of the detection unit.
  6. The imaging device according to claim 1, wherein the display unit is configured as a touch panel.
  7. The imaging device according to claim 6, wherein, when the user performs an operation on the display unit in a state where an icon regarding the setting of automatic recording by the self-timer is displayed on the display unit, an icon different from the icon regarding the setting of automatic recording by the self-timer is displayed.
  8. The imaging device according to claim 1, wherein the imaging unit includes a lens unit.
  9. An imaging apparatus that displays an icon relating to setting of automatic recording by a self-timer on the display unit when the display direction of the display unit and the imaging direction of the imaging unit are oriented in the same direction by rotating the display unit.
  10. A detection unit that detects whether the display unit and the imaging unit have a predetermined relationship;
    The detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject;
    In accordance with the detection result of the detection unit, an icon related to self-portrait shooting is displayed on the display unit, and the icon is located on the right side when the display unit is viewed from the subject.
  11. Detecting whether the display unit and the imaging unit have a predetermined relationship,
    Imaging that changes the function assigned to one or more buttons when it is detected that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit Control method of the device.
  12. Detecting whether the display unit and the imaging unit have a predetermined relationship,
    Imaging that changes the function assigned to one or more buttons when it is detected that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit A program for causing a computer to execute a device control method.
  13. A detection unit that detects whether the display unit and the imaging unit have a predetermined relationship;
    The detection unit detects that the display direction of the display unit and the imaging direction of the imaging unit are both directed toward the subject by rotating the display unit;
    An imaging device that changes a display position of an icon related to an assigned function according to a detection result of the detection unit.
  14. A detection unit that detects whether or not the display unit and the imaging unit have a predetermined relationship;
    With one or more buttons,
    An imaging apparatus that changes a function assigned to the one or more buttons when the display direction of the display unit and the imaging direction of the imaging unit are oriented in the same direction by rotating the display unit.
  15. The imaging device according to claim 14, wherein the display unit is configured to be rotatable around a rotation axis.
JP2017090969A 2017-05-01 2017-05-01 Imaging apparatus, method of controlling the same, and program Pending JP2017163586A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017090969A JP2017163586A (en) 2017-05-01 2017-05-01 Imaging apparatus, method of controlling the same, and program


Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2016138914 Division 2016-07-13

Publications (1)

Publication Number Publication Date
JP2017163586A true JP2017163586A (en) 2017-09-14

Family

ID=59858108

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2017090969A Pending JP2017163586A (en) 2017-05-01 2017-05-01 Imaging apparatus, method of controlling the same, and program

Country Status (1)

Country Link
JP (1) JP2017163586A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001249379A (en) * 2000-03-06 2001-09-14 Canon Inc Camera
JP2007036590A (en) * 2005-07-26 2007-02-08 Canon Inc Imaging apparatus
JP2009147461A (en) * 2007-12-11 2009-07-02 Canon Inc Image pickup device, its control method, and program
JP2009164756A (en) * 2007-12-28 2009-07-23 Victor Co Of Japan Ltd Video camera
JP2009175748A (en) * 1999-05-28 2009-08-06 Sony Corp Imaging apparatus and imaging method for the same



Legal Events

Date Code Title Description
20170613 A871 Explanation of circumstances concerning accelerated examination (JAPANESE INTERMEDIATE CODE: A871)
20170712 A975 Report on accelerated examination (JAPANESE INTERMEDIATE CODE: A971005)
20170725 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20170922 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20171114 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20180115 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
20180220 A02 Decision of refusal (JAPANESE INTERMEDIATE CODE: A02)