JP2013172282A - Portable terminal, imaging key control program, and imaging key control method - Google Patents


Info

Publication number
JP2013172282A
Authority
JP
Japan
Prior art keywords
key
shooting
parameter value
touch operation
touch
Prior art date
Legal status
Granted
Application number
JP2012034703A
Other languages
Japanese (ja)
Other versions
JP5816571B2 (en)
Inventor
Takashi Maeda
貴司 前田
Original Assignee
Kyocera Corp
京セラ株式会社
Priority date
Filing date
Publication date
Application filed by Kyocera Corp (京セラ株式会社)
Priority to JP2012034703A
Publication of JP2013172282A
Application granted
Publication of JP5816571B2
Application status: Active
Anticipated expiration

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, video cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225Television cameras ; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232Devices for controlling television cameras, e.g. remote control ; Control of cameras comprising an electronic image sensor
    • H04N5/23216Control of parameters, e.g. field or angle of view of camera via graphical user interface, e.g. touchscreen
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S715/00Data processing: presentation processing of document, operator interface processing, and screen saver display processing
    • Y10S715/974Slider control as on-screen object in operator interface

Abstract

PROBLEM TO BE SOLVED: To improve operability of an imaging apparatus which changes imaging parameters using a touch panel, by speedily changing plural parameters.

SOLUTION: When a camera function is executed by a mobile phone 10, a through image based on imaging parameter values, a shutter key, etc. are displayed on a display 14. After the expiration of a predetermined time interval after a touch operation on the shutter key, an auxiliary key corresponding to a coordinate range is displayed. When the user moves his or her finger into the coordinate range in accordance with the display of the auxiliary key, the present imaging parameter values are changed to the imaging parameter values associated with the coordinate range registered in RAM. When the finger is released in this state, an image captured in accordance with the changed imaging parameter values is stored.

Description

  The present invention relates to a portable terminal, a shooting key control program, and a shooting key control method, and more particularly to a portable terminal, a shooting key control program, and a shooting key control method capable of shooting an image.

An example of a portable terminal capable of taking an image is disclosed in Patent Document 1. On the display unit of the digital camera disclosed in Patent Document 1, a first icon group indicating the types (parameters) of operating conditions is displayed. When a drag operation is performed on an arbitrary first icon and the drag operation is determined to be a second operation, a plurality of second icons corresponding to the options for that parameter are newly displayed, together with a cursor that selects one of those second icons. When the drag operation determined to be the second operation is continued, the second icon selected by the cursor changes. The user can then change the parameter to the option corresponding to a second icon by ending the second operation while the desired second icon is selected.
JP 2010-204844 [G06F 3/048, H04N 5/225, G06F 3/041, H04M 1/247]

However, in the digital camera of Patent Document 1, only one parameter can be changed by one touch operation, so changing a plurality of parameters for each shooting scene takes time. For this reason, the user may miss a photo opportunity while performing the operations for changing a plurality of parameters.

  Therefore, a main object of the present invention is to provide a novel portable terminal, shooting key control program, and shooting key control method.

  Another object of the present invention is to provide a portable terminal, a shooting key control program, and a shooting key control method that can improve operability when shooting.

  The present invention employs the following configuration in order to solve the above problems. The reference numerals in parentheses, supplementary explanations, and the like indicate the corresponding relationship with the embodiments described in order to help understanding of the present invention, and do not limit the present invention.

A first invention is a portable terminal that has a camera module and in which a first shooting parameter value is set before shooting, comprising: a display unit; a touch panel provided on the display unit; a detection unit that detects a touch operation on the touch panel; a storage unit that stores at least one pre-registered second shooting parameter value in association with a condition of a touch operation; a display processing unit that displays a shooting key on the display unit; a determination unit that determines, when a touch operation is performed on the shooting key, whether the touch operation satisfies the condition; a changing unit that changes the first shooting parameter value to the second shooting parameter value when it is determined that the touch operation on the shooting key satisfies the condition; and a storage unit that, when the touch operation on the shooting key is finished, stores an image shot by the camera module based on the changed second shooting parameter value.

  In the first invention, the mobile terminal (10: reference numeral exemplifying a corresponding part in the embodiment, hereinafter the same) has a camera module (50) and a display unit (14) such as a display. For example, the image output from the camera module is displayed on the display unit based on the first shooting parameter value. The display unit is provided with a touch panel (16) that functions as a pointing device, and a touch operation on the touch panel is detected by a detection unit (48) including a control circuit and the like. The storage unit (46) including a memory or the like stores the second shooting parameter value registered in advance by the user or the like in association with the touch operation condition. For example, when the camera function is executed, the display processing unit (30, S3) displays the photographing key (66) on the display unit. When a touch operation is performed on the shooting key, the determination unit (30, S133) determines whether the touch operation satisfies the above-described conditions. The changing unit (30, S135) changes the first shooting parameter value to the second shooting parameter value when the determination unit determines that the touch operation satisfies the above-described condition. For example, when the user's finger is released from the shooting key and the touch operation is ended, the storage unit (30, S139) stores an image shot based on the changed second shooting parameter value.
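
As a rough illustration of how these units could cooperate, the following Kotlin sketch shows a determination unit and a changing unit that react to touch movement, and a capture step that runs when the touch ends. All names, the representation of the condition, and the capture callback are illustrative assumptions, not elements taken from the embodiment.

    // Illustrative sketch only: condition -> second parameter set, applied while the
    // touch moves, with the image captured using the current values on release.
    data class Point(val x: Float, val y: Float)
    typealias Condition = (Point) -> Boolean
    data class Params(val focus: String, val flash: String, val colorEffect: String)

    class ShutterKeyController(
        private var current: Params,                            // first shooting parameter value
        private val registered: List<Pair<Condition, Params>>,  // pre-registered second values
        private val capture: (Params) -> Unit                   // camera module shoot + store
    ) {
        // Determination unit + changing unit: switch to the registered values
        // whose condition the current touch position satisfies.
        fun onTouchMove(p: Point) {
            registered.firstOrNull { (cond, _) -> cond(p) }?.let { current = it.second }
        }

        // Storage step: when the touch operation ends, shoot with whatever values are current.
        fun onTouchEnd() = capture(current)
    }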

  According to the first invention, the user can easily change the shooting parameter value and perform shooting only by performing a predetermined touch operation when shooting. Therefore, the operability when shooting is improved.

A second invention is dependent on the first invention, wherein the condition includes a coordinate range, and the determination unit determines whether the touch position of the subsequent touch operation is included in the coordinate range once the shooting key has been touched.

In the second invention, the coordinate range (A) is associated with the second shooting parameter value. When the shooting key is touched, the determination unit determines whether the touch position during the touch operation is included in the coordinate range.

  According to the second aspect of the invention, the user can easily select the shooting parameter value simply by moving the touch position to the coordinate range.

  A third invention is dependent on the first invention and the second invention, and further includes an auxiliary key display unit for displaying an auxiliary key corresponding to the condition.

  In the third invention, the auxiliary key display unit (30, S127) displays the auxiliary key in order to guide the touch operation that satisfies the condition.

  According to the third aspect, the user can easily change the photographing parameter value by referring to the display of the auxiliary key.

  A fourth invention is dependent on the third invention, and the auxiliary key display unit displays an auxiliary key corresponding to the coordinate range when a predetermined time has elapsed after the photographing key is touched.

  In the fourth invention, the auxiliary key is displayed when a predetermined time (for example, 3 seconds) elapses after the photographing key is touched.

  According to the fourth aspect, the coordinate range is easily grasped by displaying the auxiliary keys.

  A fifth invention is dependent on any one of the first to fourth inventions, and further includes a registration unit that registers at least one second imaging parameter value in the storage unit.

  In the fifth invention, the registration unit (30, S109) registers at least one second imaging parameter value in the storage unit, for example, in response to a user operation.

  According to the fifth aspect, the user can arbitrarily register the shooting parameters.

A sixth invention is dependent on the fifth invention, and further includes a reading unit (30, S103) that reads the current first shooting parameter value, and the registration unit registers the current first shooting parameter value read by the reading unit in the storage unit as the second shooting parameter value.

  In the sixth invention, the reading unit (30, S103) reads the current first imaging parameter value set by the user, for example. The registration unit registers the first shooting parameter read by the reading unit as a second shooting parameter.

  According to the sixth aspect of the invention, the user can easily register the shooting parameters used until immediately before.

A seventh invention is a shooting key control program that causes a processor (30) of a portable terminal (10), which includes a camera module (50), a display unit (14), a touch panel (16) provided on the display unit, a detection unit (48) that detects a touch operation on the touch panel, and a storage unit (46) that stores at least one pre-registered second shooting parameter value in association with a condition of a touch operation, and in which a first shooting parameter value is set before shooting, to function as: a display processing unit (S3) that displays a shooting key on the display unit; a determination unit (S133) that determines, when a touch operation is performed on the shooting key, whether the touch operation satisfies the condition; a changing unit (S135) that changes the first shooting parameter value to the second shooting parameter value when it is determined that the touch operation on the shooting key satisfies the condition; and a storage unit (S139) that, when the touch operation on the shooting key is finished, stores an image shot by the camera module based on the changed second shooting parameter value.

  In the seventh invention as well, similar to the first invention, the user can easily change the shooting parameter value and perform shooting only by performing a predetermined touch operation when shooting. Therefore, the operability when shooting is improved.

An eighth invention is a shooting key control method for a portable terminal (10) that includes a camera module (50), a display unit (14), a touch panel (16) provided on the display unit, a detection unit (48) that detects a touch operation on the touch panel, and a storage unit (46) that stores at least one pre-registered second shooting parameter value in association with a condition of a touch operation, and in which a first shooting parameter value is set before shooting, the method comprising: displaying a shooting key on the display unit (S3); determining, when a touch operation is performed on the shooting key, whether the touch operation satisfies the condition (S133); changing the first shooting parameter value to the second shooting parameter value when it is determined that the touch operation on the shooting key satisfies the condition (S135); and, when the touch operation on the shooting key is finished, storing an image shot by the camera module based on the changed second shooting parameter value (S139).

  In the eighth invention, similarly to the first invention, the user can easily change the shooting parameter value and perform shooting only by performing a predetermined touch operation when shooting. Therefore, the operability when shooting is improved.

  According to the present invention, operability when shooting can be improved.

  The above object, other objects, features, and advantages of the present invention will become more apparent from the following detailed description of embodiments with reference to the drawings.

FIG. 1 is an external view showing a mobile phone according to an embodiment of the present invention.
FIG. 2 is an illustrative view showing an electrical configuration of the mobile phone shown in FIG. 1.
FIG. 3 is an illustrative view showing one example of a through image displayed on the display shown in FIG. 1.
FIG. 4 is an illustrative view showing an example of a touch operation performed on the touch panel shown in FIG. 1; FIG. 4(A) shows an example of a state in which the shutter key is touched, and FIG. 4(B) shows an example of a state in which auxiliary keys are displayed as a result of touching the shutter key.
FIG. 5 is an illustrative view showing one example of coordinate ranges set in the touch range of the touch panel shown in FIG. 1.
FIG. 6 is an illustrative view showing other examples of the through image displayed on the display shown in FIG. 1; FIG. 6(A) shows an example of a through image whose colors are inverted, and FIG. 6(B) shows an example of a through image in a state where the focus position is set to macro.
FIG. 7 is an illustrative view showing one example of a pop-up displayed on the display shown in FIG. 1.
FIG. 8 is an illustrative view showing one example of a menu screen displayed on the display shown in FIG. 1.
FIG. 9 is an illustrative view showing examples of a setting screen displayed on the display shown in FIG. 1; FIG. 9(A) is an example of the setting screen when the shutter slide function is disabled, FIG. 9(B) is an example of the setting screen when the shutter slide function is enabled, and FIG. 9(C) is another example of the setting screen when the shutter slide function is enabled.
FIG. 10 is an illustrative view showing examples of a change screen displayed on the display shown in FIG. 1; FIG. 10(A) shows an example of shooting parameter values corresponding to a coordinate range shown in FIG. 5, FIG. 10(B) shows a list of shooting parameter values for the focus shown in FIG. 10(A), and FIG. 10(C) shows another example of shooting parameter values corresponding to the coordinate range shown in FIG. 5.
FIG. 11 is an illustrative view showing one example of a configuration of a shooting parameter value table stored in the RAM shown in FIG. 2.
FIG. 12 is an illustrative view showing one example of a memory map of the RAM shown in FIG. 2.
FIG. 13 is a flowchart showing an example of the shooting control process of the processor shown in FIG. 2.
FIG. 14 is a flowchart showing an example of the menu setting process of the processor shown in FIG. 2.
FIG. 15 is a flowchart showing an example of the shutter slide setting process of the processor shown in FIG. 2.
FIG. 16 is a flowchart showing an example of the setting change process of the processor shown in FIG. 2.
FIG. 17 is a flowchart showing a part of the shutter key control process of the processor shown in FIG. 2.
FIG. 18 is a flowchart showing another part of the shutter key control process of the processor shown in FIG. 2, following FIG. 17.
FIG. 19 is an illustrative view showing other examples of the through image displayed on the display shown in FIG. 1; FIG. 19(A) shows an example of a through image in a state where no touch operation is performed, and FIG. 19(B) shows another example of a state in which the shutter key is touched.
FIG. 20 is an illustrative view showing still another example of a through image displayed on the display shown in FIG. 1.

  Referring to FIG. 1, a mobile phone 10 according to one embodiment of the present invention is a smartphone as an example, and includes a vertically long flat rectangular housing 12. However, it should be pointed out in advance that the present invention can be applied to any portable terminal such as a tablet terminal and a PDA.

  On one main surface (front surface) of the housing 12, for example, a display 14 such as a liquid crystal or an organic EL that functions as a display unit is provided. A touch panel 16 is provided on the display 14. Therefore, in the cellular phone 10 of this embodiment, most of the input operations except those by hard key operations described later are performed through the touch panel 16.

  A speaker 18 is built in the surface of one end of the housing 12 in the vertical direction, and a microphone 20 is built in the surface of the other end in the vertical direction.

  In this embodiment, a call key 22a, an end key 22b, and a menu key 22c are provided as hard keys that constitute input operation means together with the touch panel 16.

For example, the user can input a telephone number by performing a touch operation on the touch panel 16 with respect to a dial key (not shown) displayed on the display 14, and can start a voice call by operating the call key 22a. If the end call key 22b is operated, the voice call can be ended. The user can also turn the power of the mobile phone 10 on and off by pressing and holding the end call key 22b.

Further, when the menu key 22c is operated, a menu screen is displayed on the display 14, and the user can select and confirm a menu item by performing a touch operation on the touch panel 16 with respect to a soft key or a menu icon (neither shown) displayed on the display 14 in that state.

  A lens opening 24 and an LED 26 for the camera module 50 (FIG. 2) are provided on the back side of one end in the longitudinal direction of the housing 12. Therefore, the camera module 50 can photograph the subject irradiated by the light emission (flash) of the LED 26 through the lens opening 24. However, the LED 26 may not emit light during shooting.

Referring to FIG. 2, the mobile phone 10 of the embodiment shown in FIG. 1 includes a processor 30, also called a computer or a CPU. A wireless communication circuit 32, an A/D converter 36, a D/A converter 38, an input device 40, a display driver 42, a flash memory 44, a RAM 46, a touch panel control circuit 48, a camera module 50, an LED control circuit 52, and the like are connected to the processor 30.

  The processor 30 controls the entire mobile phone 10. In the RAM 46 functioning as a storage unit, all or a part of a program preset in the flash memory 44 is expanded when used, and the processor 30 operates according to the program on the RAM 46. The RAM 46 is further used as a working area or a buffer area for the processor 30.

  The input device 40 includes the touch panel 16 and the hard keys 22 shown in FIG. 1, and constitutes an operation unit or an input unit. Information on the hard key operated by the user (key data) is input to the processor 30.

The wireless communication circuit 32 is a circuit for transmitting and receiving radio waves for voice calls, mail, and the like through the antenna 34. In the embodiment, the wireless communication circuit 32 is a circuit for performing wireless communication by the CDMA method. For example, when the user operates the input device 40 to instruct a telephone call (calling), the wireless communication circuit 32 executes a telephone call process under the instruction of the processor 30 and outputs a telephone call signal via the antenna 34. The telephone call signal is transmitted to the other party's telephone through the base station and the communication network. When an incoming call process is performed at the other party's telephone, a communicable state is established, and the processor 30 executes a call process.

The normal call processing will be described in detail. A modulated audio signal transmitted from the other party's telephone is received by the antenna 34. The received modulated audio signal is demodulated and decoded by the wireless communication circuit 32. The received voice data obtained by these processes is converted into an audio signal by the D/A converter 38 and then output from the speaker 18. On the other hand, the transmission voice signal captured through the microphone 20 is converted into voice data by the A/D converter 36 and then given to the processor 30. The voice data is subjected to encoding processing and modulation processing by the wireless communication circuit 32 under the instruction of the processor 30, and is output via the antenna 34. Therefore, the modulated voice signal is transmitted to the other party's telephone through the base station and the communication network.

When a telephone call signal from the other party's telephone is received by the antenna 34, the wireless communication circuit 32 notifies the processor 30 of the incoming call. In response, the processor 30 controls the display driver 42 to display caller information (such as the telephone number) described in the incoming call notification on the display 14. In addition, the processor 30 causes the speaker 18 to output a ring tone (sometimes called a ringing melody or a ringing voice).

  When the user performs a response operation using the call key 22 a (FIG. 1) included in the input device 40, the wireless communication circuit 32 executes a call incoming process under the instruction of the processor 30. Further, a communicable state is established, and the processor 30 executes the above-described call processing.

Further, when a call termination operation is performed with the end call key 22b (FIG. 1) included in the input device 40 after shifting to the communicable state, the processor 30 controls the wireless communication circuit 32 to transmit a call end signal to the other party. After transmitting the call end signal, the processor 30 ends the call process. The processor 30 also ends the call process when a call end signal is received from the other party first. Furthermore, the processor 30 also ends the call process when a call end signal is received from the mobile communication network regardless of the call partner.

The microphone 20 shown in FIG. 1 is connected to the A/D converter 36, and as described above, the audio signal from the microphone 20 is converted into digital audio data by the A/D converter 36 and input to the processor 30. The speaker 18 is connected to the D/A converter 38. The D/A converter 38 converts digital audio data into an audio signal and supplies it to the speaker 18 through an amplifier. Thus, sound based on the audio data is output from the speaker 18.

The processor 30 can adjust the volume of the sound output from the speaker 18 by controlling the amplification factor of the amplifier connected to the D/A converter 38, for example in response to a volume operation by the user.

  The display driver 42 is connected to the display 14 and the processor 30 and stores image data output from the processor 30 in the VRAM. Then, the display driver 42 displays an image corresponding to the VRAM data on the display 14. That is, the display driver 42 controls display on the display 14 connected to the display driver 42 under the instruction of the processor 30. The display 14 is provided with a backlight using, for example, an LED as a light source, and the display driver 42 controls the brightness of the backlight and lighting / extinguishing according to an instruction from the processor 30.

The touch panel 16 shown in FIG. 1 is connected to the touch panel control circuit 48. The touch panel control circuit 48 applies a necessary voltage to the touch panel 16 and inputs, to the processor 30, a start signal indicating the start of a touch on the touch panel 16 by the user, an end signal indicating the end of the touch by the user, and coordinate data indicating the touch position touched by the user. Therefore, the processor 30 can determine, based on the coordinate data, which icon or key the user touched at that time.

In the embodiment, the touch panel 16 is of a capacitance type that detects a change in capacitance between electrodes caused by an object such as a finger approaching the surface, and it detects, for example, that one or more fingers have touched the touch panel 16. The touch panel 16 is provided on the display 14 and serves as a pointing device for indicating an arbitrary position within the screen. The touch panel control circuit 48 functions as a detection unit, detects a touch operation within the touch effective range of the touch panel 16, and outputs coordinate data indicating the position of the touch operation to the processor 30. That is, the user inputs an operation position, an operation direction, and the like to the mobile phone 10 by performing operations such as touch, slide, and release on the surface of the touch panel 16.
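
In the flowcharts described later, the processor repeatedly decides which displayed key a touch operation started on by comparing the reported coordinate data against the display coordinate range of each GUI element (steps S5, S9, S33, and so on). A minimal hit test in Kotlin might look like the following; the key names and bounds are hypothetical and only illustrate the idea.

    // Illustrative hit test: which displayed key, if any, contains the touch coordinates?
    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
        fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    }

    fun hitTest(x: Float, y: Float, keyBounds: Map<String, Rect>): String? =
        keyBounds.entries.firstOrNull { it.value.contains(x, y) }?.key

    fun main() {
        val bounds = mapOf(                      // hypothetical display coordinate ranges
            "menuKey64" to Rect(0f, 1200f, 200f, 1280f),
            "shutterKey66" to Rect(440f, 1200f, 720f, 1280f)
        )
        println(hitTest(500f, 1240f, bounds))    // -> shutterKey66
    }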

  Note that the detection method of the touch panel 16 may employ a surface-type capacitance method, or may employ a resistance film method, an ultrasonic method, an infrared method, an electromagnetic induction method, or the like. The touch operation is not limited to the user's finger, and may be performed with a stylus pen or the like.

  The camera module 50 is also called a main camera, and includes a control circuit, a lens, an image sensor, and the like. When an operation for executing the camera function is performed, the processor 30 activates the camera module 50 and displays a through image (preview image) corresponding to the background and the subject on the display 14. Then, when the user performs a shooting operation, the camera module 50 executes a process of shooting an image.

The LED control circuit 52 controls light emission of the LED 26 connected to it under the instruction of the processor 30. When the camera function is being executed, the LED 26 may emit light as a flash. The LED control circuit 52 and the LED 26 may collectively be referred to as a light emitting unit.

FIG. 3 is an illustrative view showing one example of a through image displayed on the display 14. The display range of the display 14 includes a status display area 60 and a function display area 62. In the status display area 60, an icon (pict) indicating the radio wave reception status, an icon indicating the remaining capacity of the secondary battery, and the date and time are displayed. In the function display area 62, a menu key 64 for displaying a menu screen and a shutter key (also referred to as a shooting key) 66 for performing a shooting operation are displayed. When the camera function is executed, a through image is displayed based on the shooting parameter values. For example, when the user performs a touch operation on the menu key 64, a menu screen (see FIG. 8) for changing shooting parameter values such as focus, flash, and color effect is displayed. The user can change these shooting parameter values by operating the GUI on the menu screen.

The “shooting parameter value” is a parameter value indicating a shooting condition of an image. The shooting parameter values in this embodiment include focus, flash, and color effect.

For example, the focus shooting parameter values include “none”, in which no autofocus processing is performed, “auto”, in which autofocus processing is performed, “macro”, in which the focus position of the focus lens is fixed to the foreground (macro), and “distant view”, in which the focus position of the focus lens is fixed to a distant view. The flash shooting parameter values include “yes”, in which the LED 26 always emits light as a flash, “no”, in which the LED 26 does not emit light as a flash, “auto”, in which light emission of the LED 26 is determined based on the amount of ambient light, and “red eye”, in which the LED 26 emits light so as to suppress red-eye of the subject. The color effect shooting parameter values include “standard”, in which the image to be shot has a normal color tone, “sepia”, in which the image to be shot is rendered in sepia, “monochrome”, in which the image to be shot is rendered in monochrome, and “negative”, in which the colors of the image to be shot are inverted.
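
A compact data model for these option sets might look like the following Kotlin sketch; the enum entries simply mirror the translated option labels above and are not definitions from the patent.

    // Illustrative data model for the shooting parameter values described above.
    enum class Focus { NONE, AUTO, MACRO, DISTANT_VIEW }
    enum class Flash { YES, NO, AUTO, RED_EYE }
    enum class ColorEffect { STANDARD, SEPIA, MONOCHROME, NEGATIVE }

    data class ShootingParams(
        val focus: Focus = Focus.AUTO,
        val flash: Flash = Flash.AUTO,
        val colorEffect: ColorEffect = ColorEffect.STANDARD
    )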

  Then, the user can take an image reflecting the change of the photographing parameter value by setting these photographing parameter values before photographing and operating the shutter key 66.

Here, the mobile phone 10 of this embodiment has a shutter slide function. The shutter slide function is a function of changing the current shooting parameter value (first shooting parameter value) to a shooting parameter value registered in advance (second shooting parameter value) when a touch operation satisfying a predetermined condition is performed on the shutter key 66. When the user finishes the touch operation satisfying the predetermined condition, an image is shot based on the changed shooting parameter value. Hereinafter, the shutter slide function will be described in detail.

Referring to FIGS. 4(A) and 4(B), when a predetermined time (for example, 3 seconds) elapses after the shutter key 66 is touched, auxiliary keys 68 are displayed around the shutter key 66 and the menu key 64 disappears. The three auxiliary keys 68 indicate to the user the predetermined conditions for changing the shooting parameter values. That is, the user can easily change the shooting parameter value by referring to the display of the auxiliary keys 68. Further, the display of the auxiliary keys 68 makes it easy to grasp the coordinate ranges A described later. In FIG. 4(B), three auxiliary keys 68 are displayed, but in other embodiments, two or fewer, or four or more, auxiliary keys 68 may be displayed.
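
The delayed display of the auxiliary keys can be sketched as a small timer that fires once the shutter key has been held for the predetermined time. The Kotlin below is only an assumption about how such a timer might be structured; the callback names and the polling style are illustrative.

    // Illustrative sketch: show the auxiliary keys (and hide the menu key) once the
    // shutter key has been held for delayMillis without the touch ending.
    class AuxiliaryKeyTimer(
        private val showAuxiliaryKeys: () -> Unit,
        private val hideMenuKey: () -> Unit,
        private val delayMillis: Long = 3_000      // "predetermined time", e.g. 3 seconds
    ) {
        private var touchStart = -1L
        private var shown = false

        fun onShutterKeyTouched(nowMillis: Long) { touchStart = nowMillis; shown = false }

        // Polled while the touch continues, e.g. from the UI loop.
        fun onTick(nowMillis: Long) {
            if (!shown && touchStart >= 0 && nowMillis - touchStart >= delayMillis) {
                showAuxiliaryKeys(); hideMenuKey(); shown = true
            }
        }

        fun onTouchEnded() { touchStart = -1L; shown = false }
    }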

Referring to FIG. 5, the predetermined condition described above includes a coordinate range, and a coordinate range A is associated with each auxiliary key 68. For example, the coordinate range A1 corresponds to the first auxiliary key 68a, the coordinate range A2 corresponds to the second auxiliary key 68b, and the coordinate range A3 corresponds to the third auxiliary key 68c. Further, in the coordinate range A1, shooting parameter values of “none” for the focus, “present” for the flash, and “negative” for the color effect are registered in association with each other. In the coordinate range A2, shooting parameter values of “macro” for the focus, “no” for the flash, and “monochrome” for the color effect are registered in association with each other. In the coordinate range A3, shooting parameter values of “distant view” for the focus, “no” for the flash, and “standard” for the color effect are registered in association with each other. In this embodiment, after the shutter key 66 is touched, pre-registered shooting parameter values are read based on the coordinate range A that includes the touch position.
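
A table that associates each coordinate range A with a registered set of second shooting parameter values, together with a lookup by touch position, could be sketched as follows. The region predicates and the values are placeholders; they do not reproduce the exact ranges or settings of FIG. 5.

    // Illustrative registration table and lookup by touch position.
    data class CoordinateRange(val name: String, val contains: (Float, Float) -> Boolean)

    val registrationTable: List<Pair<CoordinateRange, Map<String, String>>> = listOf(
        CoordinateRange("A1", { x, _ -> x < 240f }) to
            mapOf("focus" to "none", "flash" to "present", "colorEffect" to "negative"),
        CoordinateRange("A2", { x, _ -> x >= 240f && x < 480f }) to
            mapOf("focus" to "macro", "flash" to "no", "colorEffect" to "monochrome"),
        CoordinateRange("A3", { x, _ -> x >= 480f }) to
            mapOf("focus" to "distant view", "flash" to "no", "colorEffect" to "standard")
    )

    // Returns the registered second parameter values for the range containing the touch, if any.
    fun paramsForTouch(x: Float, y: Float): Map<String, String>? =
        registrationTable.firstOrNull { it.first.contains(x, y) }?.second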

Referring to FIG. 6(A), when the user moves (slides) the touch position from the shutter key 66 toward the first auxiliary key 68a, it is determined that the touch position has entered the coordinate range A1 corresponding to the first auxiliary key 68a. In this case, the shooting parameter values corresponding to the coordinate range A1 are read, and the current shooting parameter values are changed to the shooting parameter values corresponding to the coordinate range A1. That is, the shooting parameter values are changed to “none” for the focus, “present” for the flash, and “negative” for the color effect. As a result of the change of the shooting parameter values, the through image displayed on the display 14 is shown with inverted colors.

Referring to FIG. 6(B), when the user moves (slides) the touch position to the second auxiliary key 68b, it is determined that the touch position has entered the coordinate range A2 corresponding to the second auxiliary key 68b. Therefore, the shooting parameter values corresponding to the coordinate range A2 are read, and the current shooting parameter values are changed to the shooting parameter values corresponding to the coordinate range A2. In other words, the focus is changed to “macro”, the flash to “no”, and the color effect to “monochrome”, and a monochrome through image focused on the subject (flower) in the foreground is displayed on the display 14.

In another embodiment, when the user moves the touch position to an arbitrary one of the three auxiliary keys 68, that auxiliary key 68 may be highlighted. For example, in another embodiment, when the user slides the touch position to the first auxiliary key 68a, the first auxiliary key 68a is displayed larger than the other auxiliary keys 68b and 68c, or the character “A” is displayed in bold or in a different color.

  As described above, in this embodiment, the user can easily select the shooting parameter value simply by moving the touch position to the arbitrary coordinate range A.

  When the user releases the touch panel 16 with the shooting parameter value changed, an image is shot based on the changed shooting parameter value. For example, when the finger is released from the touch panel 16 in a state where the touch position is changed to the second auxiliary key 68b, a shooting process is executed, and an image corresponding to FIG. 6B is shot.

  Referring to FIG. 7, when the photographing process is executed, menu key 64 is displayed again and auxiliary key 68 is erased. Then, a pop-up 80 for confirming whether to save the photographed image is displayed. The pop-up 80 includes a save key 82 and a cancel key 84. At this time, the user can save the image data corresponding to the displayed image in the flash memory 44 by performing a touch operation on the save key 82. Further, if the user performs a touch operation on the cancel key 84, the user can return to the state in which the through image is displayed without saving the captured image.
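
The save-confirmation step can be sketched as holding the captured image in memory until the save key 82 or the cancel key 84 is operated. The Kotlin below is an illustrative assumption; the storage callback and the image representation are not specified by the embodiment.

    // Illustrative sketch of pop-up 80: keep the captured image pending until save/cancel.
    class CaptureConfirmation(
        private val writeToFlash: (ByteArray) -> Unit,   // e.g. store in flash memory 44
        private val resumeThroughImage: () -> Unit       // return to the through-image display
    ) {
        private var pendingImage: ByteArray? = null

        fun onImageCaptured(image: ByteArray) { pendingImage = image }  // pop-up 80 is shown here

        fun onSaveKey() {                                               // save key 82
            pendingImage?.let(writeToFlash)
            pendingImage = null
            resumeThroughImage()
        }

        fun onCancelKey() {                                             // cancel key 84
            pendingImage = null
            resumeThroughImage()
        }
    }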

In this way, the user can easily change the shooting parameter value and perform shooting only by performing a predetermined touch operation when shooting. Therefore, the operability when shooting is improved.

Note that even if the touch position moves into one of the coordinate ranges A before the auxiliary keys 68 are displayed, the shooting parameter value is changed. Therefore, by performing a flick operation on the surface of the touch panel 16 starting from the shutter key 66 before the auxiliary keys 68 are displayed, the user can change the shooting parameter value and perform shooting at practically the same time. When operated in this way, the time from the change of the shooting parameter value to the shooting is further shortened.

  In another embodiment, the display position of the shutter key 66 on the display 14 may be arbitrarily set by the user. In this case, the user can display the shutter key 66 at a position on the display 14 where it is easy to touch.

  Next, a procedure for registering the shooting parameter value for change in advance will be described. Referring to FIG. 8, when menu key 64 is operated, a menu screen is displayed. On this menu screen, a return key 90, a focus key 92, a flash key 94, a color effect key 96, and a shutter slide key 98 are displayed as a GUI. The return key 90 is a key for returning to a state in which a through image as shown in FIG. 3 is displayed. Each of the focus key 92, the flash key 94, and the color effect key 96 is a key for changing a corresponding shooting parameter value. The shutter slide key 98 is a key for changing the setting of the shutter slide function.

Referring to FIGS. 9(A) and 9(B), when the shutter slide key 98 is operated, a setting screen is displayed. If the shutter slide function is disabled, a return key 90 for returning to the menu screen of FIG. 8 and an invalid key 100a indicating the disabled state are displayed on the setting screen. On the other hand, if the shutter slide function is enabled, a return key 90 and a valid key 100b indicating the enabled state are displayed on the setting screen, together with character strings 102 indicating the respective coordinate ranges A and change keys 104 for changing the shooting parameter values associated with the respective coordinate ranges A. The first character string 102a and the first change key 104a correspond to the coordinate range A1, the second character string 102b and the second change key 104b correspond to the coordinate range A2, and the third character string 102c and the third change key 104c correspond to the coordinate range A3. The invalid key 100a and the valid key 100b may be collectively referred to as a switching key 100.

For example, when the invalid key 100a is touched in the disabled state, the shutter slide function is enabled and the setting screen shown in FIG. 9(B) is displayed. Conversely, when the valid key 100b is touched in the enabled state, the shutter slide function is disabled and the setting screen shown in FIG. 9(A) is displayed.

Also, referring to FIG. 9(C), when a touch operation is performed on an arbitrary character string 102, the current settings of the corresponding shooting parameter values are displayed. For example, when a touch operation is performed on the first character string 102a, the focus, flash, and color effect shooting parameter values are displayed between the row of the first character string 102a and first change key 104a and the row of the second character string 102b and second change key 104b.

Referring to FIG. 10(A), when a touch operation is performed on, for example, the first change key 104a in FIG. 9(B) or FIG. 9(C), a change screen for the shooting parameter values related to the first coordinate range A1 is displayed on the display 14. The change screen includes a return key 90, a registration key 110, a focus change key 122, a flash change key 124, a color effect change key 126, and a read key 128.

  The return key 90 is a key for returning to the setting screen. The registration key 110 is a key for registering the shooting parameter value changed on the change screen in the shooting parameter value table (see FIG. 11). A focus change key 122, a flash change key 124, and a color effect change key 126 are keys for changing the registered content of each shooting parameter value. In addition, a character string indicating the current shooting parameter is written in these keys. The read key 128 is a key for reading the current shooting parameter value set by an operation of the user or the like and reflecting it in each shooting parameter value. By using this key, the user can easily register the shooting parameters used until immediately before.

Referring to FIG. 10(B), when a touch operation is performed on the focus change key 122, a “none” key 130, an “auto” key 132, a “macro” key 134, and a “distant view” key 136 are further displayed. In this state, the display of the return key 90 and the registration key 110 is erased. In addition, the focus change key 122 is displayed with its colors inverted to indicate that it is selected. Note that the “none” key 130 is also displayed with its colors inverted to indicate that it is the shooting parameter value set at present.

Referring to FIG. 10(C), for example, when the “auto” key 132 is touched, the focus shooting parameter value is changed to “auto”. In addition, the character string written on the focus change key 122 is changed to “auto”. When the registration key 110 is touched in this state, the focus shooting parameter value is registered as “auto” in the shooting parameter value table. That is, the user can arbitrarily register the shooting parameters. However, if the return key 90 is operated without operating the registration key 110, the change result is discarded without being registered.

Note that when the flash change key 124 or the color effect change key 126 is operated, keys indicating the corresponding shooting parameter values are likewise displayed. The processing when a key corresponding to another shooting parameter value is operated is substantially the same as when the “auto” key 132 is operated, and thus detailed description thereof is omitted.

  FIG. 11 is an illustrative view showing details of the imaging parameter value table. The shooting parameter value table includes a column of coordinate ranges, a column of shooting parameter values, and a column of auxiliary keys. The shooting parameter value column includes a focus column, a flash column, a color effect column, and the like. The shooting parameter value and the auxiliary key are associated and registered corresponding to the coordinate range of each row.

For example, the first coordinate range A1, defined by the two linear inequalities Y > a1X + b1 and Y ≧ a2X + b2, is associated with shooting parameter values of “automatic” for the focus, “no” for the flash, and “negative” for the color effect, and with the first auxiliary key 68a. Similarly, the second coordinate range A2, defined by Y < a2X + b2 and Y ≧ a3X + b3, is associated with “macro” for the focus, “automatic” for the flash, and “normal” for the color effect, and with the second auxiliary key 68b. The third coordinate range A3, defined by Y < a3X + b3 and Y ≧ a4X + b4, is associated with “far” for the focus, “on” for the flash, and “sepia” for the color effect, and with the third auxiliary key 68c.
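
Checking whether a touch position (X, Y) falls inside one of these ranges is then a direct evaluation of the two linear inequalities. In the sketch below the coefficients a1..a4 and b1..b4 are placeholders, not values from the patent.

    // Illustrative membership tests for the coordinate ranges of the table.
    data class Line(val a: Float, val b: Float)              // boundary Y = a*X + b

    fun inRangeA1(x: Float, y: Float, l1: Line, l2: Line) =
        y > l1.a * x + l1.b && y >= l2.a * x + l2.b          // Y > a1X + b1 and Y >= a2X + b2

    fun inRangeA2(x: Float, y: Float, l2: Line, l3: Line) =
        y < l2.a * x + l2.b && y >= l3.a * x + l3.b          // Y < a2X + b2 and Y >= a3X + b3

    fun inRangeA3(x: Float, y: Float, l3: Line, l4: Line) =
        y < l3.a * x + l3.b && y >= l4.a * x + l4.b          // Y < a3X + b3 and Y >= a4X + b4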

The features of the embodiment have been outlined above; the embodiment will now be described in detail with reference to the memory map of the RAM 46 of the mobile phone 10 shown in FIG. 12 and the flowcharts of the processor 30 of the mobile phone 10 shown in FIGS. 13 to 18.

  Referring to FIG. 12, program storage area 302 and data storage area 304 are formed in RAM 46 shown in FIG. 2. As described above, the program storage area 302 is an area for reading and storing (developing) part or all of the program data set in advance in the flash memory 44 (FIG. 2).

The program storage area 302 includes a shooting control program 310, a menu setting program 312, a shutter slide setting program 314, a setting change program 316, a shutter key control program 318, and the like. The shooting control program 310 is a program for controlling operations on the menu key 64 and the shutter key 66 when the camera function is being executed. The menu setting program 312 is a program that is executed when the menu screen is displayed. The shutter slide setting program 314 is a program that is executed when the setting screen is displayed. The setting change program 316 is a program that is executed when the change screen is displayed. The shutter key control program 318 is a program that is executed when the shutter key 66 is operated.

  The program storage area 302 includes a program for executing a telephone function and the like.

Subsequently, a touch buffer 330, a shooting parameter value buffer 332, an initial shooting parameter value buffer 334, a display image buffer 336, and a change buffer 338 are provided in the data storage area 304 of the RAM 46. The data storage area 304 also stores touch coordinate map data 340, GUI data 342, GUI coordinate data 344, and shooting parameter value table data 346, and is provided with a touch flag 348, a shutter flag 350, and a touch counter 352.

  The touch buffer 330 stores touch coordinate data output from the touch panel control circuit 48. The shooting parameter value buffer 332 temporarily stores each shooting parameter value changed by a user operation when the camera function is being executed. The initial shooting parameter value buffer 334 temporarily stores the current shooting parameter value when the shooting parameter value is changed by the shutter slide function. The display image buffer 336 temporarily stores image data to be displayed as a through image. The change buffer 338 temporarily stores the change contents when changing the pre-registered shooting parameter value.

  The touch coordinate map data 340 is data for associating touch coordinates in a touch operation with display coordinates on the display 14. That is, based on the touch coordinate map data 340, the result of the touch operation performed on the touch panel 16 is reflected on the display 14.

  The GUI data 342 includes image data and character string data for displaying keys and the like displayed on the display 14. The GUI coordinate data 344 includes display coordinate data of the displayed GUI. The shooting parameter value table data 346 is table data configured as shown in FIG.

  The touch flag 348 is a flag for determining whether or not the touch panel 16 is touched. For example, the touch flag 348 is composed of a 1-bit register. When the touch flag 348 is turned on (established), a data value “1” is set in the register. On the other hand, when the touch flag 348 is turned off (not established), a data value “0” is set in the register. Touch flag 348 is switched on / off based on a signal output from touch panel control circuit 48.

  The shutter flag 350 is a flag indicating whether the shutter slide function is enabled. Therefore, the shutter flag 350 is turned on when the shutter slide function is enabled, and turned off when the shutter slide function is disabled. Further, since the configuration of the shutter flag 350 is substantially the same as that of the touch flag 348, detailed description of the configuration is omitted.

  The touch counter 352 is a counter for measuring the time after the shutter key 66 is touched, and starts counting (measurement) when reset. The touch counter 352 stops (expires) when a predetermined time (for example, 3 seconds) elapses. Therefore, the touch counter 352 may be called a touch timer.

  The data storage area 304 stores image data displayed in a standby state, character string data, and the like, and is provided with counters and flags necessary for the operation of the mobile phone 10.

Under the control of a Linux (registered trademark)-based OS such as Android (registered trademark) or REX, or another OS, the processor 30 processes a plurality of tasks in parallel, including the shooting control process shown in FIG. 13, the menu setting process shown in FIG. 14, the shutter slide setting process shown in FIG. 15, the setting change process shown in FIG. 16, and the shutter key control process shown in FIGS. 17 and 18.

  Referring to FIG. 13, the imaging control process is started when an operation for executing a camera function is performed, for example. In step S1, the processor 30 activates the camera module 50. For example, the processor 30 issues a start command to the control circuit of the camera module 50. Subsequently, in step S3, the processor 30 displays a GUI. That is, the processor 30 reads the GUI data 342 and causes the display 14 to display images corresponding to the menu key 64 and the shutter key 66. Note that a GUI is displayed, and a through image is also displayed on the display 14 based on the image data stored in the display image buffer 336. Note that the processor 30 that executes the process of step S3 functions as a display processing unit.

Subsequently, in step S5, the processor 30 determines whether or not the menu key 64 has been operated. That is, it is determined whether the start coordinates of the touch operation recorded in the touch buffer 330 are included in the display coordinate range of the menu key 64 stored in the GUI coordinate data 344. If “YES” in the step S5, that is, if the menu key 64 is operated, the processor 30 executes the menu setting process in a step S7, and returns to the step S3. Since the menu setting process will be described later with reference to the flowchart of FIG. 14, detailed description thereof is omitted here.

  If “NO” in the step S5, that is, if the menu key 64 is not operated, the processor 30 determines whether or not the shutter key 66 is operated in a step S9. That is, it is determined whether the start coordinates of the touch operation recorded in the touch buffer 330 are included in the display coordinate range of the shutter key 66 stored in the GUI coordinate data 344. If “YES” in the step S9, that is, if the shutter key 66 is operated, the processor 30 executes a shutter key control process in a step S11, and returns to the step S3. Since the shutter key control process will be described later with reference to the flowchart shown in FIG. 17, a detailed description thereof is omitted here.

  If “NO” in the step S9, that is, if the shutter key 66 is not operated, the processor 30 determines whether or not the ending operation is performed in a step S13. For example, it is determined whether the end call key 22b has been operated in order to end the camera function. If “NO” in the step S13, that is, if the camera function end operation is not performed, the processor 30 returns to the step S3. On the other hand, if “YES” in the step S13, that is, if an operation for ending the camera function is performed, the processor 30 stops the camera module 50 in a step S15 and ends the photographing control process.
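
The overall flow of FIG. 13 can be summarized as a loop that dispatches on where each touch operation starts. The following Kotlin sketch is an illustration only; the step functions merely stand in for the processing described above.

    // Illustrative outline of the shooting control process (FIG. 13).
    fun shootingControl(
        startCamera: () -> Unit, stopCamera: () -> Unit,
        drawGui: () -> Unit, nextTouchTarget: () -> String,
        menuSetting: () -> Unit, shutterKeyControl: () -> Unit
    ) {
        startCamera()                                   // S1: activate the camera module 50
        loop@ while (true) {
            drawGui()                                   // S3: menu key 64, shutter key 66, through image
            when (nextTouchTarget()) {                  // where did the touch operation start?
                "menuKey64" -> menuSetting()            // S5 -> S7
                "shutterKey66" -> shutterKeyControl()   // S9 -> S11
                "endCall22b" -> break@loop              // S13: end operation
            }
        }
        stopCamera()                                    // S15: stop the camera module and finish
    }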

  FIG. 14 is a flowchart of the menu setting process. The menu setting process is started when step S7 is executed in the shooting control process (FIG. 13). In step S31, the processor 30 displays a menu screen. For example, the processor 30 reads the GUI data 342 and displays a menu screen including keys for changing each shooting parameter value on the display 14.

  Subsequently, in step S33, the processor 30 determines whether or not the shutter slide key 98 has been operated. That is, it is determined whether the start coordinates stored in the touch buffer 330 are included in the display coordinate range of the shutter slide key 98 included in the GUI coordinate data 344. If “YES” in the step S33, that is, if the shutter slide key 98 is operated, the processor 30 executes a shutter slide setting process in a step S35, and returns to the step S33. Since the shutter slide process will be described later with reference to the flowchart shown in FIG. 15, detailed description thereof is omitted here.

  On the other hand, if “NO” in the step S33, that is, if the shutter slide key 98 is not operated, the processor 30 determines whether or not other keys are operated in a step S37. That is, it is determined whether the start coordinates of the touch operation are included in a display coordinate range such as another focus key 92.

  If “NO” in the step S37, for example, if the displayed key is not operated, the processor 30 returns to the step S33. If “YES” in the step S37, that is, if another key is operated, the processor 30 determines whether or not the return key 90 is operated in a step S39. That is, it is determined whether the start coordinates of the touch operation recorded in the touch buffer 330 are included in the display coordinate range of the return key 90 stored in the GUI coordinate data 344. If “NO” in the step S39, for example, when the focus key 92 is operated, the processor 30 executes a process corresponding to the key operated in the step S41. That is, if the focus key 92 is operated, a screen for changing the focus shooting parameter value is displayed on the display 14.

If “YES” in the step S39, that is, if the return key 90 is operated, the processor 30 ends the menu setting process and returns to the shooting control process. As a result, the display 14 returns from the menu screen shown in FIG. 8 to the display state shown in FIG. 3.

FIG. 15 is a flowchart of the shutter slide setting process. The shutter slide setting process is started when step S35 is executed in the menu setting process. In step S61, the processor 30 displays the switching key 100. For example, if the shutter flag 350 is on, the valid key 100b is displayed; on the other hand, if the shutter flag 350 is off, the invalid key 100a is displayed.

  Subsequently, in step S63, the processor 30 determines whether or not the shutter flag 350 is on. For example, in step S63, it is determined whether a touch operation has been performed on the switching key 100 and whether the shutter flag 350 is switched on / off. If “NO” in the step S63, that is, if the shutter flag 350 is turned off, the processor 30 proceeds to a step S75. On the other hand, if “YES” in the step S63, that is, if the shutter flag 350 is turned on, the character string 102 and the change key 104 are displayed in a step S65. For example, the processor 30 reads the GUI data 342 and displays the character string 102 and the change key 104 corresponding to each coordinate range A.

  Subsequently, in step S67, the processor 30 determines whether or not a touch operation has been performed on the character string 102. That is, it is determined whether the start coordinates of the touch operation recorded in the touch buffer 330 are included in the display coordinate range of the character string 102 stored in the GUI coordinate data 344. If “NO” in the step S67, that is, if the touch operation is not performed on the character string 102, the processor 30 proceeds to step S71. On the other hand, if “YES” in the step S67, for example, when a touch operation is performed on the character string 102a, the processor 30 displays the photographing parameter value in a step S69. For example, the processor 30 reads the shooting parameter value corresponding to the character string 102a (first coordinate range A1) from the shooting parameter value table data 346. Then, as shown in FIG. 9C, the read shooting parameter value is displayed on the display 14.

Subsequently, in step S71, the processor 30 determines whether or not the change key 104 has been operated. For example, it is determined whether a touch operation has been performed on any of the change keys 104a to 104c shown in FIG. 9(B). If “NO” in the step S71, that is, if the change key 104 is not operated, the processor 30 proceeds to a step S75. On the other hand, if “YES” in the step S71, that is, if any of the change keys 104a to 104c is operated, the processor 30 executes the setting change process in a step S73. Since the setting change process will be described later with reference to the flowchart of FIG. 16, detailed description thereof is omitted here.

  Subsequently, in step S75, the processor 30 determines whether the return key 90 has been operated. If “NO” in the step S75, that is, if the return key 90 is not operated, the processor 30 returns to the step S63. On the other hand, if “YES” in the step S75, for example, when the return key 90 is operated, the processor 30 ends the shutter slide setting process and returns to the menu setting process.

  FIG. 16 is a flowchart of the setting change process. The setting change process is started when step S73 is executed in the shutter slide setting process (FIG. 15). In step S91, the processor 30 records the shooting parameter value corresponding to the change key 104 in the change buffer 338. For example, when the change key 104a is operated, the shooting parameter value corresponding to the first coordinate range A1 is read from the shooting parameter value table data 346, and that shooting parameter value is recorded in the change buffer 338.

  Subsequently, in step S93, the processor 30 displays a change screen. For example, the processor 30 reads the GUI data 342 and displays a change screen for the first coordinate range on the display 14. Subsequently, in step S95, the processor 30 determines whether a change operation has been performed. For example, it determines whether any of the focus change key 122, the flash change key 124, and the color effect change key 126 has been operated. If “YES” in step S95, for example, when the focus change key 122 has been operated, the processor 30 executes a changing process in step S97. For example, as shown in FIG. 10B, keys corresponding to the focus shooting parameter values are displayed on the display 14, and a change operation on the shooting parameter value is accepted. Subsequently, in step S99, the change result is recorded in the change buffer 338, and the process proceeds to step S107. For example, when a touch operation is performed on the “automatic” key 132, “automatic” is recorded in the change buffer 338 as the focus shooting parameter value.

  On the other hand, if “NO” in step S95, that is, if no change operation has been performed, the processor 30 determines in step S101 whether an operation for reading the current shooting parameter value has been performed. That is, the processor 30 determines whether the read key 128 has been operated. Specifically, it determines whether the start coordinates of the touch operation recorded in the touch buffer 330 are included in the display coordinate range of the read key 128 stored in the GUI coordinate data 344. If “NO” in step S101, that is, if the read key 128 has not been operated, the processor 30 proceeds to step S107.

  If “YES” in step S101, that is, if the read key 128 has been operated, the processor 30 reads the current shooting parameter value in step S103 and records the current shooting parameter value in the change buffer 338 in step S105. That is, the shooting parameter value is read from the shooting parameter value buffer 332, and the read shooting parameter value is recorded in the change buffer 338. The processor 30 that executes the process of step S103 functions as a reading unit.

  In step S107, the processor 30 determines whether the registration key 110 has been operated. That is, it determines whether the start coordinates of the touch operation recorded in the touch buffer 330 are included in the display coordinate range of the registration key 110 stored in the GUI coordinate data 344. If “YES” in step S107, that is, if a touch operation has been performed on the registration key 110, the processor 30 registers the changed contents in the shooting parameter value table in step S109. That is, the shooting parameter value recorded in the change buffer 338 is recorded in the shooting parameter value table data 346. When the process of step S109 ends, the processor 30 ends the setting change process and returns to the shutter slide setting process. The processor 30 that executes the process of step S109 functions as a registration unit.

  If “NO” in step S107, that is, if the registration key 110 has not been operated, the processor 30 determines in step S111 whether the return key 90 has been operated. If “NO” in step S111, that is, if the return key 90 has not been operated, the process returns to step S93. On the other hand, if “YES” in step S111, that is, if the return key 90 has been operated, the processor 30 ends the setting change process and returns to the shutter slide setting process.
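  The data flow of the setting change process can thus be summarized as follows: the parameter values for the selected coordinate range are copied into the change buffer 338 (S91), individual entries are overwritten through change operations (S97-S99) or from the current values via the read key (S101-S105), and the buffer is written back to the shooting parameter value table data 346 when the registration key is operated (S107-S109). The following Java sketch illustrates only this buffer-and-register flow; the parameter names and example values are hypothetical.

    import java.util.EnumMap;
    import java.util.HashMap;
    import java.util.Map;

    public class SettingChangeSketch {
        enum Param { FOCUS, FLASH, COLOR_EFFECT }

        // Stand-ins for the shooting parameter value table data (per coordinate range)
        // and for the current shooting parameter value buffer.
        static final Map<String, Map<Param, String>> parameterTable = new HashMap<>();
        static final Map<Param, String> currentValues = new EnumMap<>(Param.class);

        public static void main(String[] args) {
            parameterTable.put("A1", new EnumMap<>(Map.of(
                    Param.FOCUS, "macro", Param.FLASH, "off", Param.COLOR_EFFECT, "none")));
            currentValues.putAll(Map.of(
                    Param.FOCUS, "automatic", Param.FLASH, "auto", Param.COLOR_EFFECT, "sepia"));

            // S91: copy the values registered for the chosen coordinate range into the change buffer.
            Map<Param, String> changeBuffer = new EnumMap<>(parameterTable.get("A1"));

            // S97-S99: a change operation, e.g. touching the "automatic" key 132 for the focus parameter.
            changeBuffer.put(Param.FOCUS, "automatic");

            // S101-S105 (alternative): the read key 128 copies the current values into the buffer instead:
            // changeBuffer = new EnumMap<>(currentValues);

            // S107-S109: the registration key 110 writes the buffer back to the table for that range.
            parameterTable.put("A1", new EnumMap<>(changeBuffer));
            System.out.println(parameterTable.get("A1"));
        }
    }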

  FIG. 17 is a flowchart of shutter key control processing. The shutter key control process is started when step S11 is executed in the shooting control process (FIG. 13). In step S121, the processor 30 temporarily stores the current shooting parameter value. That is, the shooting parameter value stored in the shooting parameter value buffer 332 is recorded in the initial shooting parameter value buffer 334. In step S123, the touch timer is initialized. That is, the processor 30 resets the touch counter 352 and starts measuring the time during which the shutter key 66 is touched.

  Subsequently, in step S125, the processor 30 determines whether the touch timer has expired. That is, it determines whether a predetermined time has elapsed since the shutter key 66 was touched. If “YES” in step S125, that is, if the predetermined time has elapsed after the shutter key 66 was touched, the processor 30 displays the auxiliary key 68 in step S127 and proceeds to step S129. The processor 30 that executes the process of step S127 functions as an auxiliary key display unit.

  On the other hand, if “NO” in step S125, that is, if the predetermined time has not elapsed after the shutter key 66 was touched, the processor 30 determines in step S129 whether the touch position has changed. That is, it determines whether the current touch coordinates stored in the touch buffer 330 differ from the previous touch coordinates. If “NO” in step S129, that is, if the touch position has not changed, the processor 30 proceeds to step S137. If “YES” in step S129, for example, if the user's finger slides and the touch position changes, the processor 30 reads the touch coordinates in step S131. That is, the touch coordinates indicating the current touch position are read from the touch buffer 330. Subsequently, in step S133, the processor 30 determines whether the touch position is included in a coordinate range. For example, the processor 30 reads each coordinate range A from the shooting parameter value table data 346 and determines whether the touch coordinates indicating the current touch position are included in any of the read coordinate ranges A. The processor 30 that executes the process of step S133 functions as a determination unit.

  If “NO” in step S133, that is, if the touch coordinates are not included in any coordinate range A, the processor 30 proceeds to step S145. On the other hand, if “YES” in step S133, for example, if the touch coordinates are included in the first coordinate range A1, the processor 30 changes the shooting parameter value in step S135. For example, the shooting parameter value corresponding to the first coordinate range A1 is read from the shooting parameter value table data 346, and that shooting parameter value is recorded in the shooting parameter value buffer 332. The processor 30 that executes the process of step S135 functions as a changing unit.

  Subsequently, in step S137, the processor 30 determines whether the touch has been released. For example, it determines whether the user's finger touching the shutter key 66 has been released from the touch panel 16. Specifically, the processor 30 determines whether the touch flag 348 has been turned off. If “NO” in step S137, that is, if the user's finger has not been released, the processor 30 returns to step S125.

  On the other hand, if “YES” in step S137, that is, when the user's finger is released, the processor 30 executes a photographing process in step S139. For example, when the photographing process is executed, the image stored in the display image buffer 336 at the time of release is moved to another buffer. At the same time, a pop-up 80 (FIG. 7) for confirming whether to save the image is displayed on the display 14. When the user performs an operation for saving the image, the captured image is saved in the flash memory 44. The processor 30 that executes the process of step S139 functions as a storage unit.

  Subsequently, in step S141, the processor 30 determines whether the shooting parameter value has been changed. That is, it determines whether the end position is included in any coordinate range A. If “YES” in step S141, that is, if the shooting parameter value has been changed, the processor 30 restores the shooting parameter value in step S143, ends the shutter key control process, and returns to the shooting control process. That is, the shooting parameter value stored in the initial shooting parameter value buffer 334 is recorded in the shooting parameter value buffer 332. On the other hand, if “NO” in step S141, that is, if the finger is released from the shutter key 66 without changing the shooting parameter value, the processor 30 ends the shutter key control process and returns to the shooting control process.

  If the touch position is not included in any coordinate range A, that is, if “NO” in step S133, the processor 30 determines in step S145 whether the touch has been released. That is, the processor 30 determines whether the finger has been released outside the coordinate ranges A. If “NO” in step S145, that is, if the touch has not been released, the processor 30 returns to step S125.

  If “YES” in step S145, that is, if the finger has been released outside the coordinate ranges A, the processor 30 determines in step S147 whether the release occurred on the shutter key 66. That is, the processor 30 determines whether the end position of the touch operation is included in the display range of the shutter key 66. If “YES” in step S147, that is, if the finger has been released within the display range of the shutter key 66, the processor 30 executes a photographing process in step S149, similarly to step S139. Then, when the shooting process ends in step S149, the processor 30 ends the shutter key control process. Note that the processor 30 that executes the processes of steps S139 and S149 functions as a storage unit.

  If “NO” in step S147, that is, if the release position is neither in the display range of the shutter key 66 nor in any coordinate range A, the photographing process is not executed, and the processor 30 ends the shutter key control process and returns to the shooting control process.
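  Taken together, steps S121 to S149 behave as follows: the initial shooting parameter value is saved when the shutter key 66 is touched, the auxiliary key 68 appears after the predetermined time, sliding the finger into a coordinate range A swaps in the shooting parameter value registered for that range, and the release position decides whether an image is captured with the changed value (release inside a range, followed by restoring the initial value), with the current value (release on the shutter key 66), or not at all (release elsewhere). The Java sketch below walks a sequence of touch samples through that decision flow; the layout coordinates, the 500 ms delay, and the parameter strings are assumptions made only for this illustration.

    import java.util.List;
    import java.util.Map;

    public class ShutterKeyControlSketch {
        record Rect(int l, int t, int r, int b) {
            boolean contains(int x, int y) { return x >= l && x < r && y >= t && y < b; }
        }
        // One touch sample: position, whether the finger is still down, and elapsed time in milliseconds.
        record Touch(int x, int y, boolean down, long timeMs) {}

        static final Rect SHUTTER_KEY = new Rect(400, 500, 480, 580);   // hypothetical layout
        static final Map<String, Rect> COORDINATE_RANGES = Map.of(      // hypothetical ranges A1 to A3
                "A1", new Rect(400, 200, 480, 280),
                "A2", new Rect(400, 300, 480, 380),
                "A3", new Rect(400, 400, 480, 480));
        static final long AUX_KEY_DELAY_MS = 500;                       // the "predetermined time" (assumed)

        static String run(List<Touch> samples, String initialParams, Map<String, String> rangeParams) {
            String current = initialParams;            // S121: remember the initial shooting parameter value
            long touchStart = samples.get(0).timeMs(); // S123: the touch timer starts with the first sample
            boolean auxShown = false;
            for (Touch s : samples) {
                if (!auxShown && s.timeMs() - touchStart >= AUX_KEY_DELAY_MS) {
                    auxShown = true;                   // S125-S127: display the auxiliary key 68
                }
                String range = COORDINATE_RANGES.entrySet().stream()
                        .filter(e -> e.getValue().contains(s.x(), s.y()))
                        .map(Map.Entry::getKey).findFirst().orElse(null);
                if (range != null) {
                    current = rangeParams.get(range);  // S133-S135: change the shooting parameter value
                }
                if (!s.down()) {                       // S137/S145: the finger was released
                    if (range != null) {
                        return "shoot with " + current + ", then restore " + initialParams; // S139-S143
                    }
                    if (SHUTTER_KEY.contains(s.x(), s.y())) {
                        return "shoot with " + current;                                     // S147-S149
                    }
                    return "no shot";                                                       // S147 "NO"
                }
            }
            return "still touching";
        }

        public static void main(String[] args) {
            Map<String, String> rangeParams = Map.of("A1", "macro", "A2", "flash on", "A3", "sepia");
            List<Touch> slideToA1 = List.of(
                    new Touch(440, 540, true, 0),      // touch the shutter key 66
                    new Touch(440, 240, true, 700),    // slide into the first coordinate range A1
                    new Touch(440, 240, false, 900));  // release inside A1
            System.out.println(run(slideToA1, "initial values", rangeParams));
        }
    }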

  Although not shown in the shutter key control process, when a shooting parameter value is set in step S135, a through image corresponding to the set shooting parameter value is displayed on the display 14. If the touch position changes without the user's finger being released and moves from one coordinate range A to another coordinate range A, the through image displayed on the display 14 changes accordingly. That is, every time the shooting parameter value corresponding to the coordinate range A changes, the through image displayed on the display 14 also changes.

  Further, when the touch position is included in an arbitrary coordinate range A, the touch position of the finger may waver. Therefore, in another embodiment, a boundary line may be provided outside the coordinate range A, and the touch position may be judged to have left the coordinate range A only when it crosses that boundary line. As a result, even if the touch position moves slightly out of the coordinate range A, it is not determined to have gone outside the coordinate range A unless it crosses the boundary line provided outside the coordinate range A. However, even in this case, it may be determined that the touch position has moved outside the coordinate range A once a certain time has elapsed.

  In another embodiment, when the touch position is judged to have left a certain coordinate range A, a confirmation process may be executed to deal with the above-described finger shake. For example, when the touch position leaves a certain coordinate range A, a timer for a fixed time (for example, 0.5 seconds) is started, and if the touch position is still outside the coordinate range A after the fixed time, it is determined that the touch position has left the coordinate range A. On the other hand, if the touch position is inside the coordinate range A after the fixed time, it is determined that the touch position has not left the coordinate range A.
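  Both countermeasures can be expressed as a single rule: a touch position that leaves a coordinate range A is treated as outside only once it crosses a margin around the range or stays outside for the fixed time. The following Java sketch illustrates that rule; the 20-pixel margin, the 0.5-second value, and the coordinates are example assumptions.

    public class ExitConfirmationSketch {
        record Rect(int l, int t, int r, int b) {
            boolean contains(int x, int y) { return x >= l && x < r && y >= t && y < b; }
            // The same range grown by a margin, used as the outer boundary line.
            Rect grow(int m) { return new Rect(l - m, t - m, r + m, b + m); }
        }

        static final int BOUNDARY_MARGIN = 20;   // width of the band outside the range (assumed value)
        static final long CONFIRM_MS = 500;      // the "fixed time" of the confirmation timer (example value)

        private long leftAtMs = -1;              // when the touch first left the strict range; -1 while inside

        // Returns true once the touch position is finally judged to have left the coordinate range.
        boolean hasLeft(Rect range, int x, int y, long nowMs) {
            if (range.contains(x, y)) {
                leftAtMs = -1;                   // back inside: cancel the confirmation timer
                return false;
            }
            if (!range.grow(BOUNDARY_MARGIN).contains(x, y)) {
                return true;                     // beyond the boundary line: judged outside immediately
            }
            if (leftAtMs < 0) leftAtMs = nowMs;  // start the confirmation timer
            return nowMs - leftAtMs >= CONFIRM_MS;
        }

        public static void main(String[] args) {
            ExitConfirmationSketch c = new ExitConfirmationSketch();
            Rect a1 = new Rect(100, 100, 200, 200);
            System.out.println(c.hasLeft(a1, 205, 150, 0));    // inside the margin -> false (finger shake)
            System.out.println(c.hasLeft(a1, 205, 150, 600));  // still outside after 0.5 s -> true
            System.out.println(c.hasLeft(a1, 300, 150, 700));  // beyond the boundary line -> true
        }
    }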

  Here, the auxiliary key 68 may be arranged at another position. For example, referring to FIGS. 19A and 19B, when a touch operation is performed on the shutter key 66, the first auxiliary key 66a to the third auxiliary key 66c may be arranged on the left side of the display 14. In this case, the user can touch the shutter key 66 on the right and then perform a touch operation on an auxiliary key 68 on the left to change the shooting parameter value and execute the shooting process.

  In addition, referring to FIG. 20, the shutter key 66 and the first auxiliary key 66a to the third auxiliary key 66c may be displayed side by side in the initial state. In this case, when a touch operation is performed on any auxiliary key 68, the photographing parameter value is changed and the photographing process is executed.

  In another embodiment, the predetermined condition for changing the shooting parameter may be a slide direction instead of a coordinate range. For example, when the predetermined condition is a slide direction, an angle range with respect to the horizontal direction and a shooting parameter value are registered in association with each other in the shooting parameter value table. When a flick operation is performed, the angle with respect to the horizontal direction is calculated from the start coordinates and the end coordinates of the flick operation. When the calculated angle falls within a registered angle range, the shooting parameter value is changed and the shooting process is executed.

  In addition, when the predetermined condition is a slide direction, the shooting parameter may be changed by combining a plurality of slide operations. For example, the first slide operation selects the type of shooting parameter to be changed, and the second slide operation determines the content of the change.
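  When the condition is a slide direction, the angle of the flick with respect to the horizontal direction can be computed from its start and end coordinates and matched against the registered angle ranges. The following Java sketch illustrates that computation; the angle ranges, parameter values, and coordinates are hypothetical.

    import java.util.Map;

    public class FlickDirectionSketch {
        // An angle range in degrees, measured from the horizontal direction of the screen.
        record AngleRange(double fromDeg, double toDeg) {
            boolean contains(double deg) { return deg >= fromDeg && deg < toDeg; }
        }

        // Angle of the flick from its start to its end coordinates, normalized to [0, 360).
        static double flickAngle(int startX, int startY, int endX, int endY) {
            // Screen y grows downward, so the y difference is negated to obtain the usual orientation.
            double deg = Math.toDegrees(Math.atan2(-(endY - startY), endX - startX));
            return (deg + 360) % 360;
        }

        public static void main(String[] args) {
            // Hypothetical registration: angle range -> second shooting parameter value.
            Map<AngleRange, String> table = Map.of(
                    new AngleRange(45, 135), "flash on",       // roughly upward flick
                    new AngleRange(225, 315), "macro focus");  // roughly downward flick

            double angle = flickAngle(240, 400, 250, 200);     // a mostly upward flick
            String value = table.entrySet().stream()
                    .filter(e -> e.getKey().contains(angle))
                    .map(Map.Entry::getValue).findFirst().orElse(null);
            System.out.println(angle + " -> " + value);        // about 87 degrees -> "flash on"
        }
    }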

  A plurality of shooting parameter values may also be handled collectively as a “shooting mode,” and the user can change the shooting mode in accordance with the shooting scene. In this case, the user registers a desired shooting mode and a predetermined condition in advance and can change to that shooting mode by performing a touch operation that satisfies the predetermined condition.
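  As one concrete way of organizing this, a shooting mode can be represented as a named bundle of shooting parameter values, and the registered condition then selects the bundle as a whole. The short Java sketch below illustrates that idea; the mode names, parameter values, and condition labels are hypothetical.

    import java.util.Map;

    public class ShootingModes {
        // A "shooting mode" bundles several shooting parameter values under one name (illustrative values).
        static final Map<String, Map<String, String>> MODES = Map.of(
                "night",    Map.of("focus", "infinity", "flash", "off",  "colorEffect", "none"),
                "portrait", Map.of("focus", "macro",    "flash", "auto", "colorEffect", "soft"));

        // A registered condition (here a coordinate range label) selects the whole mode at once.
        static final Map<String, String> CONDITION_TO_MODE = Map.of("A1", "night", "A2", "portrait");

        public static void main(String[] args) {
            String satisfied = "A2";    // the touch operation satisfied the condition registered as A2
            System.out.println(MODES.get(CONDITION_TO_MODE.get(satisfied)));
        }
    }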

  The program used in this embodiment may be stored in the HDD of a data distribution server and distributed to the mobile phone 10 via a network. The programs described above may also be sold or distributed in a state of being stored on a storage medium such as an optical disk (a CD, a DVD, or a BD (Blu-ray Disc)), a USB memory, or a memory card. When a program downloaded through such a server or storage medium is installed in a portable terminal having the same configuration as this embodiment, the same effects as in this embodiment can be obtained.

  The specific numerical values given in this specification are merely examples, and can be appropriately changed according to a change in product specifications.

DESCRIPTION OF SYMBOLS 10 ... Mobile phone 14 ... Display 16 ... Touch panel 30 ... Processor 40 ... Input device 44 ... Flash memory 46 ... RAM
48 ... Touch panel control circuit 50 ... Camera module

Claims (8)

  1. A mobile terminal that has a camera module and in which a first shooting parameter value is set before shooting, comprising:
    A display unit;
    A touch panel provided on the display unit;
    A detection unit that detects a touch operation on the touch panel;
    A storage unit that stores a condition of a touch operation in association with at least one second shooting parameter value registered in advance;
    A display processing unit that displays a shooting key on the display unit;
    A determination unit that determines whether the touch operation satisfies the condition when a touch operation is performed on the shooting key;
    A change unit that changes the first shooting parameter value to a second shooting parameter value when it is determined that the touch operation on the shooting key satisfies the condition; and
    A storage unit that, when the touch operation on the shooting key ends, stores an image captured by the camera module based on the changed second shooting parameter value.
  2. The mobile terminal according to claim 1, wherein the condition includes a coordinate range, and
    the determination unit determines, when the shooting key is touched, whether a touch position of the subsequent touch operation is included in the coordinate range.
  3.   The portable terminal according to claim 2, further comprising an auxiliary key display unit that displays an auxiliary key corresponding to the condition.
  4.   The portable terminal according to claim 3, wherein the auxiliary key display unit displays an auxiliary key corresponding to the coordinate range when a predetermined time has elapsed since the photographing key was touched.
  5.   The mobile terminal according to claim 1, further comprising a registration unit that registers at least one second imaging parameter value in the storage unit.
  6. The portable terminal according to claim 5, further comprising a reading unit that reads the current first shooting parameter value,
    wherein the registration unit registers the current first shooting parameter value read by the reading unit in the storage unit as a second shooting parameter value.
  7. A shooting key control program for a portable terminal that has a camera module, a display unit, a touch panel provided on the display unit, a detection unit that detects a touch operation on the touch panel, and a storage unit that stores a condition of a touch operation in association with at least one second shooting parameter value registered in advance, a first shooting parameter value being set before shooting, the program causing a processor of the portable terminal to function as:
    A display processing unit that displays a shooting key on the display unit;
    A determination unit that determines whether the touch operation satisfies the condition when a touch operation is performed on the shooting key;
    A change unit that changes the first shooting parameter value to a second shooting parameter value when it is determined that the touch operation on the shooting key satisfies the condition; and
    A storage unit that, when the touch operation on the shooting key ends, stores an image captured by the camera module based on the changed second shooting parameter value.
  8. A shooting key control method for a portable terminal that has a camera module, a display unit, a touch panel provided on the display unit, a detection unit that detects a touch operation on the touch panel, and a storage unit that stores a condition of a touch operation in association with at least one second shooting parameter value registered in advance, a first shooting parameter value being set before shooting, the method comprising:
    displaying a shooting key on the display unit;
    determining, when a touch operation is performed on the shooting key, whether the touch operation satisfies the condition;
    changing the first shooting parameter value to a second shooting parameter value when it is determined that the touch operation on the shooting key satisfies the condition; and
    storing, when the touch operation on the shooting key ends, an image captured by the camera module based on the changed second shooting parameter value.
JP2012034703A 2012-02-21 2012-02-21 Mobile terminal, shooting key control program, and shooting key control method Active JP5816571B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012034703A JP5816571B2 (en) 2012-02-21 2012-02-21 Mobile terminal, shooting key control program, and shooting key control method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012034703A JP5816571B2 (en) 2012-02-21 2012-02-21 Mobile terminal, shooting key control program, and shooting key control method
US13/772,064 US9001253B2 (en) 2012-02-21 2013-02-20 Mobile terminal and imaging key control method for selecting an imaging parameter value

Publications (2)

Publication Number Publication Date
JP2013172282A true JP2013172282A (en) 2013-09-02
JP5816571B2 JP5816571B2 (en) 2015-11-18

Family

ID=48982006

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012034703A Active JP5816571B2 (en) 2012-02-21 2012-02-21 Mobile terminal, shooting key control program, and shooting key control method

Country Status (2)

Country Link
US (1) US9001253B2 (en)
JP (1) JP5816571B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015040809A1 (en) 2013-09-20 2015-03-26 Necソリューションイノベータ株式会社 Electronic device, method for controlling electronic device, and storage medium
WO2016006149A1 (en) * 2014-07-08 2016-01-14 ソニー株式会社 Image pickup control device, image pickup control method and program

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6039328B2 (en) * 2012-09-14 2016-12-07 キヤノン株式会社 Imaging control apparatus and imaging apparatus control method
KR20150133056A (en) * 2014-05-19 2015-11-27 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
US20190174069A1 (en) * 2016-03-18 2019-06-06 Kenneth L. Poindexter, JR. System and Method for Autonomously Recording a Visual Media
US10225471B2 (en) * 2016-03-18 2019-03-05 Kenneth L. Poindexter, JR. System and method for autonomously recording a visual media

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006067439A (en) * 2004-08-30 2006-03-09 Olympus Corp Reproducing apparatus, camera and method for selecting and reproducing image data
GB2431804B (en) * 2005-10-31 2011-04-13 Hewlett Packard Development Co Image capture device and method of capturing an image
JP2008268726A (en) * 2007-04-24 2008-11-06 Canon Inc Photographing device
US8237807B2 (en) * 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
JP5129181B2 (en) 2009-03-02 2013-01-23 オリンパスイメージング株式会社 Operation control device, camera, operation control method, and operation control program
JP5306266B2 (en) * 2010-03-15 2013-10-02 キヤノン株式会社 Imaging apparatus and control method thereof
CN102215339A (en) * 2010-04-01 2011-10-12 三洋电机株式会社 Electronic device and image sensing device
US20110292268A1 (en) * 2010-05-26 2011-12-01 T-Mobile Usa, Inc. Multi-region touchpad device
JP5552947B2 (en) * 2010-07-30 2014-07-16 ソニー株式会社 Information processing apparatus, display control method, and display control program
JP5494337B2 (en) * 2010-07-30 2014-05-14 ソニー株式会社 Information processing apparatus, information processing method, and information processing program
US8599106B2 (en) * 2010-10-01 2013-12-03 Z124 Dual screen application behaviour
US9001255B2 (en) * 2011-09-30 2015-04-07 Olympus Imaging Corp. Imaging apparatus, imaging method, and computer-readable storage medium for trimming and enlarging a portion of a subject image based on touch panel inputs
JP5936183B2 (en) * 2012-02-07 2016-06-15 オリンパス株式会社 Photography equipment
JP6112819B2 (en) * 2012-10-10 2017-04-12 オリンパス株式会社 Electronic device, driving method and program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015040809A1 (en) 2013-09-20 2015-03-26 Necソリューションイノベータ株式会社 Electronic device, method for controlling electronic device, and storage medium
WO2016006149A1 (en) * 2014-07-08 2016-01-14 ソニー株式会社 Image pickup control device, image pickup control method and program
JPWO2016006149A1 (en) * 2014-07-08 2017-06-08 ソニー株式会社 Imaging control apparatus, imaging control method, and program
US10270960B2 (en) 2014-07-08 2019-04-23 Sony Corporation Image pickup control apparatus by which a user can select instant-shutter function or a self-timer function when taking a selfie

Also Published As

Publication number Publication date
US9001253B2 (en) 2015-04-07
JP5816571B2 (en) 2015-11-18
US20130215313A1 (en) 2013-08-22


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20140808

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20150427

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20150507

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20150706

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20150901

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20150928

R150 Certificate of patent or registration of utility model

Ref document number: 5816571

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150