WO2013161134A1 - Electronic device, control method for same, and non-temporary computer-readable medium storing control program - Google Patents


Info

Publication number
WO2013161134A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
unit
display
imaging area
captured image
Prior art date
Application number
PCT/JP2013/000083
Other languages
French (fr)
Japanese (ja)
Inventor
隆誠 香田
Original Assignee
NEC CASIO Mobile Communications, Ltd. (Necカシオモバイルコミュニケーションズ株式会社)
Priority date
Filing date
Publication date
Application filed by NEC CASIO Mobile Communications, Ltd. (Necカシオモバイルコミュニケーションズ株式会社)
Publication of WO2013161134A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/02 Constructional features of telephone sets
    • H04M 1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/026 Details of the structure or mounting of specific components
    • H04M 1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/51 Housings
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B 17/02 Bodies
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B 17/18 Signals indicating condition of a camera member or suitability of light

Definitions

  • The present invention relates to an electronic device, a control method thereof, and a non-transitory computer-readable medium storing a control program.
  • In a camera-equipped mobile terminal device, an in-camera intended mainly for imaging the user is disposed on the front of the casing, on the same surface as the display unit, while an out-camera used for normal imaging is disposed on the rear of the casing; the cameras are used for imaging various subjects.
  • As a related camera-equipped mobile terminal device, a device described in Patent Document 1 is known.
  • An object of the present invention is to provide an electronic device that solves such a problem, a control method thereof, and a non-transitory computer-readable medium storing a control program.
  • An electronic apparatus according to the present invention includes: an imaging unit arranged with its optical axis direction oblique to a main surface; a display unit arranged on the main surface that displays a captured image obtained from the imaging unit; input detection means for detecting an input operation on the surface of the display unit; imaging area setting means for setting an imaging area on the displayed captured image; and imaging processing means for generating, in accordance with the detected input operation, imaging data captured based on the captured image of the set imaging area.
  • A control method for an electronic device according to the present invention is a control method for an electronic device including an imaging unit arranged with its optical axis direction oblique to a main surface. A captured image obtained from the imaging unit is displayed on a display unit on the main surface, an input operation on the surface of the display unit is detected, an imaging region is set on the displayed captured image, and imaging data captured based on the captured image of the set imaging region is generated in accordance with the detected input operation.
  • A non-transitory computer-readable medium according to the present invention stores a control program for causing a computer to execute a control process of an electronic device including an imaging unit arranged with its optical axis direction oblique to a main surface. The control process displays a captured image obtained from the imaging unit on a display unit on the main surface, detects an input operation on the surface of the display unit, sets an imaging area on the displayed captured image, and generates imaging data captured based on the captured image of the set imaging area in accordance with the detected input operation.
  • According to the present invention, it is possible to provide an electronic device capable of improving the user's visibility and operability during imaging, a control method thereof, and a non-transitory computer-readable medium storing a control program.
  • FIG. 1 is an external view of a mobile terminal device according to Embodiment 1.
  • FIG. 1 is a perspective view of a mobile terminal device according to Embodiment 1.
  • FIG. 6 is an explanatory diagram for explaining a usage state of the mobile terminal device according to Embodiment 1.
  • FIG. 1 is an external view of a mobile terminal device according to Embodiment 1.
  • FIG. 6 is an explanatory diagram for explaining a usage state of the mobile terminal device according to Embodiment 1.
  • FIG. 5 is a configuration diagram showing a configuration of a mobile terminal device according to Embodiment 2.
  • FIG. 6 is a block diagram showing functional blocks of a mobile terminal device according to Embodiment 2.
  • FIG. 6 is an image diagram showing a display image of a mobile terminal device according to Embodiment 2.
  • FIGS. 8, 9, and 10 are flowcharts illustrating operations of the mobile terminal device according to the second embodiment.
  • FIG. 6 is a diagram showing a display configuration of a display unit of a mobile terminal device according to Embodiment 3.
  • FIG. 10 is a flowchart showing the operation of the mobile terminal device according to the third embodiment.
  • 10 is an image diagram showing a display image of a mobile terminal device according to Embodiment 3.
  • FIG. 6 is a diagram showing a display configuration of a display unit of a mobile terminal device according to Embodiment 4.
  • FIG. 10 is a flowchart showing the operation of the mobile terminal device according to the fourth embodiment.
  • FIG. 9 is an image diagram showing a display image of a mobile terminal device according to a fifth embodiment.
  • FIG. 1A is a front external view of a portable terminal device 100 of a premise example, FIG. 1B is a side external view thereof, and FIG. 1C is a back external view thereof.
  • the mobile terminal device 100 is a terminal device (electronic device) that can be carried by a user, such as a smartphone, a tablet-type mobile terminal device, a mobile phone, a game machine, and an electronic book terminal.
  • the portable terminal device 100 of the premise example includes a display module 110 disposed on the main surface and a casing 120 disposed on the back surface.
  • The main surface is the surface that faces the user while the device is held, and serves as the main input/output surface for the user: it is both a display surface that displays images and the like to the user and an operation surface on which the user performs input operations.
  • the display module 110 is provided with a display unit (display screen) 111 for displaying images and the like, a touch panel 112 on which a user performs an input operation, and a camera (in-camera) 113 for imaging the user himself.
  • a microphone for inputting sound and the like and a speaker for outputting sound and the like are also provided.
  • A GUI (Graphical User Interface) such as icons is displayed on the display unit, and when the user operates the GUI via the touch panel, a function corresponding to the operation is executed.
  • the housing 120 has a concave cross section, and is fixedly arranged so as to cover the entire back surface of the display module 110.
  • a camera (out camera) 114 for performing normal imaging is provided on the back surface (bottom surface) of the housing 120.
  • the focal direction (optical axis direction) of the camera 114 is perpendicular to the extending direction of the main surface.
  • In the premise example, the out-camera 114 is arranged on the back of the casing. For this reason, as described in the above problem, visibility and operability may be reduced during imaging. That is, when an image is captured using the out-camera 114 of the portable terminal device 100 of the premise example, the device must be held upright, perpendicular to the user's line-of-sight direction, that is, the direction of the subject, as shown in FIG. 4A. Unlike the normal, stable posture in which the device is usually operated, the user must check the image and perform operations such as imaging while bearing the burden of holding the mobile terminal device 100 vertically, so visibility and operability deteriorate.
  • In the present embodiment, therefore, the out-camera is arranged obliquely with respect to the main surface, thereby improving visibility and operability.
  • FIG. 2A is a front external view of mobile terminal device 1 according to the present embodiment, FIG. 2B is a side external view thereof, and FIG. 2C is a back external view thereof.
  • FIG. 3 is a perspective view of the mobile terminal device 1 according to the present embodiment.
  • The mobile terminal device 1 is a terminal device (electronic device) that can be carried by a user, such as a smartphone, a tablet-type mobile terminal device, a mobile phone, a game machine, or an electronic book terminal.
  • The mobile terminal device 1 includes a display module 10 disposed on the main surface and a housing 20 disposed on the back surface.
  • the display module 10 is provided with a display unit (display screen) 11 for displaying images and the like, a touch panel 12 on which the user performs an input operation, and a camera (in-camera) 13 for imaging the user himself.
  • a microphone for inputting sound and the like and a speaker for outputting sound and the like are also provided.
  • the housing 20 has a concave cross section, and is fixedly arranged so as to cover the entire back surface of the display module 10.
  • An upper bottom surface (upper side surface) 21 is formed at the upper end of the casing, the end located upward while the user holds the mobile terminal device 1.
  • a camera (out camera) 14 for performing normal imaging is provided in the approximate center of the upper bottom surface 21.
  • the upper bottom surface 21 is formed at an acute angle ⁇ with respect to the main surface, that is, the camera 14 is inclined at an angle ⁇ with respect to the main surface. Therefore, the focal direction (optical axis direction) of the camera 14 is also inclined at an angle ⁇ with respect to the extending direction of the main surface.
  • The camera mounting position is set above the surface displayed by the display unit, and the camera is arranged there so that the user can easily perform operations while checking the preview-mode display through the touch-panel display unit.
  • the predetermined angle ⁇ is an angle at which the main surface and the upper bottom surface form an acute angle.
  • The user operates the touch panel arranged on the main surface of the mobile terminal device, and the camera on the upper bottom surface can capture an image of the area in front of the position where the user is located.
  • The imaging operation can thus be performed with the mobile terminal device held obliquely, that is, in the normal posture in which the device is usually operated, so visibility and operability improve.
  • Embodiment 2. The second embodiment of the present invention will be described below with reference to the drawings.
  • the imaging operation in the mobile terminal device 1 shown in the first embodiment will be mainly described.
  • Here, the external configuration of the mobile terminal device is described as that of FIGS. 2A to 2C of Embodiment 1, but it may instead be the configuration of FIGS. 1A to 1C of the premise example.
  • FIG. 5 shows a hardware configuration of the mobile terminal device 1 according to the present embodiment.
  • the mobile terminal device 1 includes a control unit 31, a wireless communication unit 32, a display unit 33, a touch panel 34, a microphone 35, a speaker 36, a storage unit 37, and a camera 38.
  • the control unit 31 is a control unit that controls the operation of each unit, such as a CPU.
  • the control unit 31 executes a control program and an application program stored in the storage unit 37, and controls the operation of each unit according to each program.
  • the wireless communication unit 32 is a communication unit for performing wireless communication with a base station and other communication devices.
  • The wireless communication unit 32 transmits and receives audio signals for calls and the like, and transmits and receives data signals such as captured images. Communication control is not limited to wireless communication; wired and other communication may also be performed.
  • The display unit 33 corresponds to the display unit 11 of FIGS. 2A to 2C, and is a display device such as a liquid crystal display or an organic EL display.
  • the display unit 33 displays a GUI such as an icon, an image, a video, or the like in accordance with an instruction from the control unit 31.
  • the touch panel 34 corresponds to the touch panel 12 of FIGS. 2A to 2C, detects the position on the display unit 33 touched by the user's finger using capacitance or the like, and sends a signal indicating the detected position to the control unit 31. Output.
  • the microphone 35 inputs voice from the user and the speaker 36 outputs voice and the like to the user. For example, a sound signal corresponding to the sound input from the user to the microphone 35 is transmitted via the wireless communication unit 32, and a sound corresponding to the sound signal received by the wireless communication unit 32 is output from the speaker to the user.
  • the storage unit 37 is a storage unit such as a flash memory, and stores information necessary for the operation of the control unit 31 and each unit.
  • the storage unit 37 stores a control program or application program executed by the control unit 31, imaging data captured by the camera 38, and the like.
  • The camera 38 mainly corresponds to the camera 14 in FIGS. 2A to 2C (it may instead be the camera 13), and may be a normal camera, but here it is a wide-angle camera having a wide-angle lens. By mounting a wide-angle camera, a wider range of subjects can be imaged.
  • FIG. 6 shows a functional block configuration realized by the control unit 31 of the mobile terminal device 1 according to the present embodiment.
  • the control unit 31 includes a GUI unit 41, a preview display unit 42, an imaging region setting unit 43, an imaging processing unit 44, an image storage unit 45, and an image processing unit 46.
  • the functional blocks of the control unit 31 in FIG. 6 are examples, and other block configurations may be used as long as the imaging operation in the present embodiment can be realized.
  • each block (each function and each processing unit) in FIG. 6 may be configured by hardware, software, or both.
  • each block of the control unit 31 is realized by a CPU (computer) executing a control program for performing an imaging operation according to the present embodiment.
  • the GUI unit 41 is an interface that accepts user input operations via the display unit 33 and the touch panel 34.
  • the GUI unit 41 receives a position signal corresponding to the icon from the touch panel 34, and detects that the icon has been operated.
  • The GUI unit 41 detects a flick operation, drag operation, tap operation, double-tap operation, pinch-in operation, pinch-out operation, and the like according to the user's operation on the touch panel 34, and notifies each unit that an input operation has been detected.
  • the preview display unit 42 displays a preview on the display unit 33 when the camera 38 captures an image.
  • Here, display is performed in two preview modes corresponding to the wide-angle camera. That is, the first preview mode displays, as a preview on the display unit 33, the entire captured image captured by the camera 38, or a substantially entire image obtained by excluding a region of a predetermined width from the edge of each side of the entire image; the second preview mode previews the image itself that would be captured by an imaging instruction.
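The two preview modes can be modeled as crops of the wide-angle frame. The following is an illustrative sketch, not part of the patent disclosure; all names (`Region`, `first_preview`, `second_preview`, `margin`) are hypothetical, and a frame is represented as a plain list of pixel rows.

```python
# Sketch of the two preview modes: the first shows (almost) the whole
# wide-angle frame, the second shows only the designated imaging region.
from dataclasses import dataclass


@dataclass
class Region:
    x: int  # left edge of the imaging region, in pixels
    y: int  # top edge
    w: int  # width
    h: int  # height


def first_preview(frame, margin=0):
    """First preview mode: the whole captured frame, optionally
    excluding a band of `margin` pixels from each edge."""
    h, w = len(frame), len(frame[0])
    return [row[margin:w - margin] for row in frame[margin:h - margin]]


def second_preview(frame, region: Region):
    """Second preview mode: only the designated imaging region,
    i.e. the image an imaging instruction would actually capture."""
    return [row[region.x:region.x + region.w]
            for row in frame[region.y:region.y + region.h]]
```

A 8x6 frame with `margin=1` previews as 6x4 in the first mode, while the second mode returns exactly the designated region.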
  • Reference numeral 701 in FIG. 7 denotes the entire imaging target.
  • In the first preview mode, the entire captured image captured by the camera 38 is displayed, as indicated by reference numeral 702 in FIG. 7.
  • In the second preview mode, as indicated by reference numeral 703 in FIG. 7, only an imaging region including a specific subject is displayed out of the captured image captured by the camera 38.
  • the imaging area setting unit 43 sets the imaging area in the captured image and displays it in a simplified diagram in the first preview mode or the second preview mode.
  • The imaging area setting unit 43 sets the imaging area in accordance with the user's imaging-area designation operation, and displays a schematic diagram for imaging-area designation (imaging area 11a), as indicated by reference numeral 703 in FIG. 7.
  • The schematic diagram for imaging-area designation is, for example, a double frame displayed as an imaging-area frame indicating the position of the designated imaging area within the frame indicating the captured image; the double frame may also be displayed in a corner of the display.
  • the imaging processing unit 44 captures an image of the designated imaging area and generates imaging data according to the user input.
  • the image storage unit 45 stores imaging data generated by imaging in the storage unit 37 or the like.
  • the image processing unit 46 performs various image processes on the captured image data.
  • FIG. 8 shows an imaging operation (imaging process) of the mobile terminal device 1 according to the present embodiment. For example, when a user operates a camera icon, a camera application is activated, and the following processing is executed by each unit in FIG.
  • First, the preview display unit 42 displays the entire captured image of the camera 38 in the first preview mode (S101). Subsequently, the GUI unit 41 determines whether or not an imaging-region designation operation has been performed (S102). In the first preview mode, the user performs, as the imaging-region designation operation, an operation instruction including a contact operation or a proximity operation on the touch-panel display unit 33.
  • the preview display unit 42 switches to the second preview mode and displays it (S103).
  • the preview display unit 42 enlarges and displays the designated imaging area in the second preview mode.
  • The imaging area setting unit 43 displays the designated imaging area as a schematic diagram in the state displayed in the second preview mode (S104).
  • the imaging area setting unit 43 displays the imaging area 11a with a double frame or the like as described above.
  • the user can check which area is captured as the imaging area in the preview image visually recognized through the display unit while holding the mobile terminal device by hand.
  • This designation / change operation can be performed by the user as an imaging region designation operation.
  • The imaging area setting unit 43 performs follow-up display that tracks the subject included in the designated imaging area (S105). For example, when a subject such as a person or an animal included in the designated imaging region is recognized and movement of the subject is detected, the imaging region is moved and displayed following the movement of the subject. In the present embodiment, designation is possible not only by the user but also by the control unit of the terminal. Once a specific display object within the area designated by the user is recognized as an imaging-target display object by image recognition processing, the display object can be followed as the imaging region even when the object moves or the user's hand moves.
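The follow-up display of S105 can be sketched as re-centering the imaging region on the subject's detected position each frame, clamped to the bounds of the captured image. This is a hypothetical simplification: the subject recognizer itself is out of scope, and `follow_subject`, `subject_center`, and the tuple layout are illustrative names only.

```python
# Sketch of subject-tracking region movement (S105): the region follows
# the subject's detected center while staying inside the captured frame.
def follow_subject(region, subject_center, frame_w, frame_h):
    x, y, w, h = region            # current imaging region (left, top, size)
    cx, cy = subject_center        # subject position from image recognition
    # Re-center the imaging region on the subject...
    nx, ny = cx - w // 2, cy - h // 2
    # ...but clamp it so it never leaves the captured image.
    nx = max(0, min(nx, frame_w - w))
    ny = max(0, min(ny, frame_h - h))
    return (nx, ny, w, h)
```

Calling this once per frame with fresh detection results produces the moving double-frame display described above.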
  • the imaging processing unit 44 performs imaging and storage of imaging data by the imaging execution process (S106), and the process ends.
  • FIG. 9 shows another example of the imaging operation of the mobile terminal device 1 according to the present embodiment shown in FIG.
  • the display is performed in the second preview mode (S103).
  • the mode may be switched from the second preview mode to the first preview mode by a mode switching operation.
  • the mode can be switched by operating a mode switching icon.
  • the imaging region designation operation (S102) is performed in the first or second preview mode
  • The imaging region is displayed in the state of the first or second preview mode (S104), and the subject tracking display is performed (S105). Subsequently, imaging and storage of imaging data are performed by the imaging execution process (S106).
  • FIG. 10 shows the operation of the imaging execution process (S106) in FIG. 8 or FIG.
  • the imaging processing unit 44 determines whether or not an imaging operation has been performed in a state where the imaging region 11a designated by the operations of FIGS. 8 and 9 is displayed (S110). An imaging operation is performed by an operation instruction including a contact operation or a proximity operation on the displayed imaging area, imaging icon, or the like.
  • the imaging processing unit 44 executes imaging (S111).
  • the imaging processing unit 44 generates imaging data from the image signal in the designated imaging area.
  • Next, the GUI unit 41 determines whether or not an imaging-data storage operation has been performed (S112).
  • a storage operation is performed on the displayed image storage icon or the like by an operation instruction including a contact operation or a proximity operation.
  • When the storage operation is detected, the image storage unit 45 stores the captured imaging data (S113).
  • the image storage unit 45 stores the generated imaging data in the storage unit 37.
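The imaging-execution flow of S110 to S113 can be sketched as a small dispatch over callbacks. The callbacks stand in for the GUI unit, imaging processing unit, and image storage unit; their names are illustrative, not from the patent.

```python
# Minimal sketch of the imaging-execution flow (S110-S113).
def imaging_execution(imaging_op_done, capture, save_op_done, store):
    if not imaging_op_done():      # S110: was an imaging operation performed?
        return None
    data = capture()               # S111: generate imaging data from the region
    if save_op_done():             # S112: was a storage operation performed?
        store(data)                # S113: store in the storage unit
    return data
```

For example, wiring `store` to a list's `append` lets the sketch model the image storage unit saving to the storage unit 37.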
  • Since the arrangement angle between the camera and the display unit is a predetermined acute angle, an operation of designating the imaging region while checking the preview display on the display unit, or an operation of designating a display object to be imaged, becomes easy.
  • Further, since the control unit of the terminal follows the subject, in the first preview display mode the followed display object can be visually recognized on the preview as the frame information shown as the imaging region. In the second preview display mode as well, it can be easily recognized by the above-described double-frame display or the like.
  • FIG. 11 shows a display example of the mobile terminal device 1 according to the present embodiment.
  • A first index 201, a second index 202, and a third index 203 are arranged in the lower index arrangement region 200 of the display unit 11.
  • the user designates the imaging area 11a by performing an imaging area designation operation or the like as shown in FIGS.
  • the user can issue an imaging instruction by performing a flick operation on the display unit 11 in a specific direction.
  • The display unit 11 displays indexes for specifying the capture destination of imaging data. When the control unit detects a flick operation in the preview display state, it detects whether the flick direction points toward one of these indexes.
  • three icons (indexes) located below the display unit 11 are displayed.
  • This index is associated with information indicating the capture destination.
  • The first index 201 indicates the in-terminal memory, the second index 202 indicates a server provided by an image sharing service on the network, and the third index 203 indicates a specific user's memory.
  • The first index 201 includes an instruction to save to the in-terminal memory, the second index 202 includes an instruction to access a server such as an image sharing service, and the third index 203 includes an instruction to transfer to a specific user by e-mail attachment.
  • FIG. 12 shows the operation of the mobile terminal device 1 according to the present embodiment, and shows the operation of the imaging execution process (S106) in FIG. 8 and FIG.
  • First, in a state where the imaging region 11a designated by the operations of FIGS. 8 and 9 is displayed, the GUI unit 41 receives a flick operation from the user (S201).
  • the operation is not limited to the flick operation, and may be an operation that associates the imaging region with one of the indexes, such as a drag operation.
  • the GUI unit 41 determines whether or not the first index 201, the second index 202, and the third index 203 exist in the operation direction of the flick operation (S202). If no index exists in the operation direction, the imaging area is changed (S203). If no index exists in the direction indicated by the flick operation, it may be detected as a process other than the storage process. Here, the imaging region is moved in the operation direction and displayed.
  • A flick directed at the first index 201 may simply be processed as an imaging instruction to store the data in the in-terminal memory as it is.
  • When the flick is directed at the first index 201, the imaging processing unit 44 performs imaging to generate imaging data (S204), and the image storage unit 45 stores the imaging data in the in-terminal storage unit 37 (S205).
  • When the flick is directed at the second index 202, the imaging processing unit 44 performs imaging to generate imaging data (S206), and the image storage unit 45 accesses the server and transfers the imaging data to it (S207). The imaging data is thereby stored on the server.
  • When the flick is directed at the third index 203, the imaging processing unit 44 performs imaging to generate imaging data (S208), and the image storage unit 45 attaches the imaging data to an e-mail and transfers it (S209). The imaging data is thereby stored in the mail destination device.
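The branching of FIG. 12 (S201 to S209) amounts to a dispatch from flick direction to destination action. The sketch below is hypothetical: it assumes the flick angle has already been resolved to a direction label, and the direction labels, destination names, and `dispatch_flick` are illustrative only.

```python
# Sketch of the FIG. 12 flick dispatch: a flick toward an index captures
# and routes the imaging data; a flick elsewhere moves the region (S203).
def dispatch_flick(direction, indexes):
    """direction: a label resolved from the flick angle.
    indexes: mapping from direction label to a destination action."""
    action = indexes.get(direction)
    if action is None:
        return ("move_region", direction)   # S203: no index in that direction
    data = "imaging_data"                   # S204/S206/S208: perform imaging
    return (action, data)                   # S205/S207/S209: save/upload/mail


indexes = {"down_left": "save_local",       # first index 201
           "down": "upload_server",         # second index 202
           "down_right": "mail_user"}       # third index 203
```

A flick straight down thus captures and uploads, while a flick upward, where no index sits, only moves the imaging region.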
  • FIG. 13 shows a display image during the imaging operation of the mobile terminal device 1 according to the present embodiment.
  • the imaging area 11a is enlarged and displayed in the second preview mode as indicated by reference numeral 1302 in FIG.
  • When the display unit 11 is touched in this state and flicked in a direction where no icon is present, the live image can be moved like a gallery and set to the best angle of view. The camera's live video can thus be manipulated as desired by touch.
  • In this case, the operation is not detected as an imaging instruction; an operation is detected as an imaging instruction only when the flick is directed toward an index. Otherwise, in the second preview mode, the imaging area can be changed as indicated by reference numeral 1303 in FIG. 13.
  • FIG. 13 also shows a state in which the imaging region is touched from the state of reference numeral 1303 in FIG. 13 and flicked toward the SNS icon 11b.
  • As shown in FIG. 13D, when the screen set to the best angle of view is touched, the live video of the camera is paused and can be dragged directly to the SNS icon 11b at the bottom of the screen. The imaging data captured in the imaging area is thereby transmitted to the SNS.
  • The video can thus be cut and shared with an ordinary touch operation, and the user can share it immediately, the moment the impulse strikes.
  • As described above, the user can perform the imaging process and designate the storage destination of the captured image in one operation. This makes it possible to capture an image and specify its destination simultaneously, improving operability, and therefore diversifies the opportunities to use imaging information.
  • Embodiment 4 of the present invention will be described below with reference to the drawings.
  • Here too, the external configuration of the mobile terminal device is described as that of FIGS. 2A to 2C of Embodiment 1, but it may instead be the configuration of FIGS. 1A to 1C of the premise example.
  • Other configurations are the same as those in FIGS. 5 and 6 of the second embodiment.
  • FIG. 14 shows a display example of the mobile terminal device 1 according to the present embodiment.
  • In the mobile terminal device 1, a first index 201, a second index 202, and a third index 203 are arranged in the lower index arrangement region 200 of the display unit 11.
  • a fourth index 204, a fifth index 205, and a sixth index 206 are displayed in the index arrangement area 210.
  • When the preview display and the indexes indicating the capture destination of the imaging instruction are displayed on the display unit, the capture destination can be specified simultaneously with the imaging instruction from the preview display state.
  • image correction can be performed in the same manner.
  • an image correction index is displayed, and a captured image capture instruction and image correction are performed from the preview display state by flicking or dragging in the direction in which the index is located. It is possible to do it at the same time.
  • a plurality of image correction indices may be displayed according to the type of image correction.
  • For example, the fourth index 204 specifies monochrome processing, which reduces the saturation to a predetermined value;
  • the fifth index 205 specifies mosaic processing applied to a specific display object; and
  • the sixth index 206 specifies sepia processing, which converts the color tone to sepia.
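The three corrections above can be sketched as simple per-pixel operations. The sketch below is illustrative only, not code from the patent: the function names, the Rec. 601 luma weights used for desaturation, the commonly used sepia matrix, and the block-averaging mosaic are all assumptions about how such processing could be implemented.

```python
# Illustrative filters for indices 204-206; pixels are (R, G, B) tuples, 0-255.
# All names and coefficient choices are assumptions, not from the patent.

def monochrome(pixel, saturation=0.0):
    """Reduce saturation toward a predetermined value (0.0 = fully gray)."""
    r, g, b = pixel
    gray = int(0.299 * r + 0.587 * g + 0.114 * b)  # Rec. 601 luma weights
    return tuple(int(gray + (c - gray) * saturation) for c in (r, g, b))

def sepia(pixel):
    """Convert the color tone to sepia with a commonly used transform."""
    r, g, b = pixel
    return (min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
            min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
            min(255, int(0.272 * r + 0.534 * g + 0.131 * b)))

def mosaic(image, block=2):
    """Pixelate an image (a list of rows of pixels) by averaging block x block
    cells; in the patent's example this would be applied only to a detected
    face region."""
    h, w = len(image), len(image[0])
    out = [list(row) for row in image]
    for y0 in range(0, h, block):
        for x0 in range(0, w, block):
            cell = [image[y][x]
                    for y in range(y0, min(y0 + block, h))
                    for x in range(x0, min(x0 + block, w))]
            avg = tuple(sum(p[i] for p in cell) // len(cell) for i in range(3))
            for y in range(y0, min(y0 + block, h)):
                for x in range(x0, min(x0 + block, w)):
                    out[y][x] = avg
    return out
```

Each filter acts independently, so whichever index lies in the flick direction selects exactly one of these operations.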
  • Operations in the case where the index arrangement area 200 (displaying the index group designating the capture destination) and the index arrangement area 210 (displaying the index group designating the image correction) are shown as in FIG. 14 will now be exemplified.
  • On receiving an imaging instruction, the control unit performs the image correction corresponding to the index specified in the second area and then captures the image to the destination corresponding to the index specified in the first area.
  • For example, when the control unit receives the imaging instruction, it applies mosaic processing to the face of the user's friend within the imaging area frame and then performs the upload process corresponding to the second index in the first area.
  • FIG. 15 shows the operation of the mobile terminal device 1 according to the present embodiment, namely the detail of the imaging execution process (S106).
  • First, the imaging processing unit 44 receives a flick operation from the user while the imaging region designated by the operations of FIGS. 8 and 9 is displayed (S301).
  • The operation is not limited to a flick; any operation that associates the imaging region with one of the indices, such as a drag operation, may be used.
  • Next, the GUI unit 41 determines whether the fourth index 204, the fifth index 205, or the sixth index 206 exists in the direction of the flick operation (S302). If no index exists in that direction, the imaging area is changed (S303); that is, an operation toward a direction containing no index may be treated as a process other than storage, and here the imaging region is moved in the operation direction and redisplayed.
  • When the fourth index 204 exists in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S304), and the image processing unit 46 applies monochrome processing to the imaging data (S305).
  • When the fifth index 205 exists in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S306), and the image processing unit 46 applies mosaic processing to the imaging data (S307).
  • When the sixth index 206 exists in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S308), and the image processing unit 46 applies sepia processing to the imaging data (S309).
  • After the image correction, the GUI unit 41 determines whether the first index 201, the second index 202, or the third index 203 exists in the direction of the flick operation (S310).
  • When the first index 201 exists in the operation direction, the image storage unit 45 stores the imaging data in the storage unit 37 within the terminal (S311).
  • When the second index 202 exists in the operation direction, the image storage unit 45 accesses the server and transmits the imaging data to it (S312), whereby the imaging data is stored on the server.
  • When the third index 203 exists in the operation direction, the image storage unit 45 attaches the imaging data to a mail and transfers it (S313), whereby the imaging data is stored in the mail destination apparatus.
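The S301-S313 flow above can be summarized as a table-driven dispatch on the flick direction. The sketch below is a hypothetical outline, not the patent's implementation; the index names, the `capture` callable, and the returned labels are all invented for illustration.

```python
# Hypothetical dispatch mirroring S301-S313; all names are illustrative.

CORRECTIONS = {                 # correction index -> processing after imaging
    "index_204": "monochrome",  # S304/S305
    "index_205": "mosaic",      # S306/S307
    "index_206": "sepia",       # S308/S309
}
DESTINATIONS = {                       # destination index -> storage process
    "index_201": "terminal_storage",   # S311: storage unit 37 in the terminal
    "index_202": "server",             # S312: transmit to the server
    "index_203": "mail",               # S313: attach to mail and transfer
}

def handle_flick(index_in_direction, capture):
    """Handle one flick on the imaging region (S301).

    `index_in_direction` is the index found along the flick direction (or
    None); `capture` produces raw imaging data. Returns a tuple of
    (imaging data, correction, storage process)."""
    if index_in_direction in CORRECTIONS:
        # Imaging first (S304/S306/S308), then the matching correction;
        # the storage destination is decided afterwards in S310.
        return capture(), CORRECTIONS[index_in_direction], None
    if index_in_direction in DESTINATIONS:
        return capture(), None, DESTINATIONS[index_in_direction]
    # No index in the flick direction: move the imaging area instead (S303).
    return None, None, "move_imaging_area"
```

Keeping the index-to-action mapping in tables keeps the per-flick logic to a single lookup, which matches the flowchart's branch-per-index structure.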
  • As described above, in this embodiment not only can the imaging process and the capture destination of the captured image be specified easily, but image correction can also be performed at the same time, so correction suited to the capture destination is easy. Operability is therefore further improved. For example, even when a captured image is imported in a way that blurs the boundary between the inside and outside of the terminal, image correction can be appropriately applied to an image that may be exposed to the outside.
  • The image data may be processed in various ways by flick operations over any number of indices, not limited to two.
  • In Embodiment 5, a fixedly displayed horizontal line and an inclined line displaced according to the attitude of the terminal may simply be displayed. The tilt state of the terminal can thereby be confirmed by viewing the display unit.
  • The horizontal line here indicates a position parallel to the focal direction of the camera when the camera is arranged at an acute angle (the first angle) with respect to the main surface of the housing.
  • To display the inclined line, the mobile terminal device 1 incorporates a tilt sensor (horizontal position detecting means).
  • FIGS. 16A to 16C show display examples of the mobile terminal device 1 according to the present embodiment.
  • FIG. 16A is a display example when the focal direction (optical axis direction) of the camera is horizontal.
  • a horizontal line 301 and an inclined line 302 are displayed on the display unit 11.
  • In this case, the horizontal line 301 and the inclined line 302 are displayed overlapping each other. For example, by calibrating the tilt sensor in advance so that this state is treated as horizontal, the inclination angle relative to the horizontal can be detected.
  • FIG. 16B is a display example when the focal direction (optical axis direction) of the camera is directed downward from the horizontal direction.
  • a horizontal line 301 and an inclined line 302 are displayed on the display unit 11.
  • In this case, the inclined line 302 is displayed above the horizontal line 301.
  • FIG. 16C is a display example in the case where the focal direction (optical axis direction) of the camera is directed upward from the horizontal direction.
  • a horizontal line 301 and an inclined line 302 are displayed on the display unit 11.
  • In this case, the focal direction of the camera is directed above the horizontal, and the tilt sensor detects the upward tilt, so the inclined line 302 is displayed below the horizontal line 301.
  • As described above, the tilt state of the terminal can be confirmed by viewing the display unit. Because the user can easily adjust the inclination of the apparatus, operability is improved.
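The placement rule illustrated by FIGS. 16A to 16C can be expressed as a single signed offset. The sketch below assumes a calibrated tilt sensor reporting the optical-axis pitch in degrees and a hypothetical pixels-per-degree scale; neither value comes from the patent.

```python
# Hypothetical mapping from sensor pitch to the on-screen position of the
# inclined line 302 relative to the fixed horizontal line 301.
# Screen y grows downward, so a positive offset draws line 302 below 301.

def inclined_line_offset_px(pitch_deg, pixels_per_degree=4):
    """pitch_deg > 0: optical axis above horizontal -> line 302 below 301
    (FIG. 16C); pitch_deg < 0: axis below horizontal -> line 302 above 301
    (FIG. 16B); pitch_deg == 0: the two lines overlap (FIG. 16A)."""
    return int(round(pitch_deg * pixels_per_degree))
```

Calibrating the sensor so that a horizontal optical axis reports zero pitch, as described for FIG. 16A, makes the offset vanish exactly when the two lines should overlap.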
  • In the premise example of FIGS. 1A to 1C, a camera is disposed on the back surface of the housing, and in Embodiment 1 of FIGS. 2A to 2C, a camera is disposed on the inclined upper bottom surface; both cameras may also be mounted together. This makes it possible to capture images with the mobile terminal device standing vertically as well as with it tilted obliquely. Further, instead of inclining the upper bottom surface, the upper bottom surface may be made substantially perpendicular to the main surface and only the optical axis of the camera mounted on it may be inclined. Even in such a case, the operations and processes described in the above embodiments can of course be performed.
  • Further, the above-described tilt angle of the upper bottom surface and the optical axis angle of the camera may be made adjustable, in which case fixing them in advance becomes unnecessary.
  • The image captured by the camera is an area centered on a direction above and parallel to the main surface, but the control unit may treat the imaging area as being centered on the front direction perpendicular to the main surface.
  • Non-transitory computer readable media include various types of tangible storage media.
  • Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memories (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)).
  • The program may also be supplied to a computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
  • A transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or optical fiber, or via a wireless communication path.
  • SYMBOLS: 1 mobile terminal device; 10 display module; 11 display unit; 11a imaging area; 12 touch panel; 13 camera (in-camera); 14 camera (out-camera); 20 housing; 21 upper bottom surface; 31 control unit; 32 wireless communication unit; 33 display unit; 34 touch panel; 35 microphone; 36 speaker; 37 storage unit; 38 camera; 41 GUI unit; 42 preview display unit; 43 imaging region setting unit; 44 imaging processing unit; 45 image storage unit; 46 image processing unit; 100 mobile terminal device; 110 display module; 111 display unit; 112 touch panel; 113 camera (in-camera); 114 camera (out-camera); 120 housing; 200 index arrangement area; 201 to 206 indices; 210 index arrangement area; 301 horizontal line; 302 inclined line


Abstract

A mobile terminal device (1) is provided with: a camera (14) placed with its optical axis oblique to a main surface; a display section (11) that is placed on the main surface and displays a captured image taken in from the camera (14); a touch panel (34) that detects an input operation on a surface of the display section (11); an imaging area setting unit (43) that sets an imaging area (11a) for the displayed captured image; and an imaging processing unit (44) that, in response to the detected input operation, generates imaging data captured on the basis of the captured image in the set imaging area (11a). This structure improves visibility and operability for the user during imaging.

Description

Electronic device, control method therefor, and non-transitory computer-readable medium storing control program
 The present invention relates to an electronic device, a control method therefor, and a non-transitory computer-readable medium storing a control program, and in particular to an electronic device including an imaging unit and a display unit, a control method therefor, and a non-transitory computer-readable medium storing a control program.
 In recent years, camera-equipped mobile terminal devices such as mobile phones, smartphones, and tablet terminals have become widespread. Typically, a camera-equipped mobile terminal device has an in-camera, intended mainly for imaging the user, on the front of the casing (the same surface as the display unit), and an out-camera, used for ordinary imaging, on the back of the casing, and is used for imaging a variety of subjects.
 For example, the device described in Patent Document 1 is known as a related camera-equipped mobile terminal device.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2004-205885
 Normally, when using an out-camera disposed on the back surface of the casing of a mobile terminal device, the user often holds (grips) the device perpendicular to the line-of-sight direction in order to image an object located in that direction.
 For example, with a touch-panel mobile terminal device, various processes such as correction and setting of an imaging frame are performed via the touch panel when an imaging instruction is given. This may still be manageable with a folding-type device, but with a touch-panel device, whose operation surface tends to be wide, the shape makes it difficult to achieve both good visibility and good operability when the device is held perpendicular to the user's line of sight.
 Related electronic devices such as mobile terminal devices therefore have the problem that the user's visibility and operability are reduced during imaging.
 An object of the present invention is to provide an electronic device that solves this problem, a control method therefor, and a non-transitory computer-readable medium storing a control program.
 An electronic device according to the present invention includes: imaging means whose optical axis direction is oblique to a main surface; display means disposed on the main surface for displaying a captured image taken in from the imaging means; input detection means for detecting an input operation on the surface of the display means; imaging area setting means for setting an imaging area for the displayed captured image; and imaging processing means for generating, in response to the detected input operation, imaging data captured on the basis of the captured image in the set imaging area.
 A control method for an electronic device according to the present invention is a control method for an electronic device including imaging means whose optical axis direction is oblique to a main surface, the method comprising: displaying, on display means on the main surface, a captured image taken in from the imaging means; detecting an input operation on the surface of the display means; setting an imaging area for the displayed captured image; and generating, in response to the detected input operation, imaging data captured on the basis of the captured image in the set imaging area.
 A non-transitory computer-readable medium according to the present invention stores a control program for causing a computer to execute control processing of an electronic device including imaging means whose optical axis direction is oblique to a main surface, the processing comprising: displaying, on display means on the main surface, a captured image taken in from the imaging means; detecting an input operation on the surface of the display means; setting an imaging area for the displayed captured image; and generating, in response to the detected input operation, imaging data captured on the basis of the captured image in the set imaging area.
 According to the present invention, it is possible to provide an electronic device capable of improving the user's visibility and operability during imaging, a control method therefor, and a non-transitory computer-readable medium storing a control program.
FIGS. 1A to 1C are external views of a mobile terminal device according to the premise example of the embodiments.
FIGS. 2A to 2C are external views of a mobile terminal device according to Embodiment 1.
FIG. 3 is a perspective view of the mobile terminal device according to Embodiment 1.
FIGS. 4A and 4B are explanatory diagrams for explaining usage states of the mobile terminal device according to Embodiment 1.
FIG. 5 is a configuration diagram showing the configuration of a mobile terminal device according to Embodiment 2.
FIG. 6 is a block diagram showing functional blocks of the mobile terminal device according to Embodiment 2.
FIG. 7 is an image diagram showing a display image of the mobile terminal device according to Embodiment 2.
FIGS. 8 to 10 are flowcharts showing operations of the mobile terminal device according to Embodiment 2.
FIG. 11 is a diagram showing a display configuration of the display unit of a mobile terminal device according to Embodiment 3.
FIG. 12 is a flowchart showing the operation of the mobile terminal device according to Embodiment 3.
FIG. 13 is an image diagram showing a display image of the mobile terminal device according to Embodiment 3.
FIG. 14 is a diagram showing a display configuration of the display unit of a mobile terminal device according to Embodiment 4.
FIG. 15 is a flowchart showing the operation of the mobile terminal device according to Embodiment 4.
FIGS. 16A to 16C are image diagrams showing display images of a mobile terminal device according to Embodiment 5.
(Premise Example of the Embodiments)
 Before describing the embodiments, a mobile terminal device serving as a premise example, before the embodiments are applied, will be described with reference to FIGS. 1A to 1C. Note that the configuration of this premise example may be combined with one or more of the embodiments described later to obtain the effects of each embodiment.
 FIG. 1A is a front external view of the mobile terminal device 100 of the premise example, FIG. 1B is a side external view thereof, and FIG. 1C is a back external view thereof. The mobile terminal device 100 is a terminal device (electronic device) that a user can carry, such as a smartphone, tablet-type mobile terminal, mobile phone, game machine, or electronic book terminal.
 As shown in FIGS. 1A to 1C, the mobile terminal device 100 of the premise example includes a display module 110 disposed on its main surface and a housing 120 disposed on its back surface. The main surface is the surface facing the user when the device is held: it is the main input/output surface, i.e., both the display surface that presents images and the like to the user and the operation surface on which the user performs input operations.
 The display module 110 is provided with a display unit (display screen) 111 that displays images and the like, a touch panel 112 on which the user performs input operations, and a camera (in-camera) 113 for imaging the user. It also has a microphone for inputting sound, a speaker for outputting sound, and so on. A GUI (Graphical User Interface) such as icons is displayed on the display unit 111, and when the user touches the touch panel 112 with a finger or the like to operate the GUI, the function corresponding to the operation is executed.
 The housing 120 has a concave cross section and is fixed so as to cover the entire back surface of the display module 110. A camera (out-camera) 114 for ordinary imaging is provided on the back (bottom) surface of the housing 120. The focal direction (optical axis direction) of the camera 114 is perpendicular to the extending direction of the main surface.
 In the premise example, then, the out-camera 114 is arranged on the back surface of the housing. As described in the problem statement above, this can reduce visibility and operability during imaging. That is, to capture an image with the out-camera 114 of the mobile terminal device 100, the user must, as shown in FIG. 4A, hold the device perpendicular to the line-of-sight direction, i.e., the direction of the subject. Unlike the normal, stable operating posture, the user then has to check the image and perform operations such as imaging while holding the device vertically, which is burdensome, so visibility and operability suffer.
 To solve this problem, in one embodiment of the present invention the out-camera is arranged obliquely with respect to the main surface, as described below, thereby improving visibility and operability.
(Embodiment 1)
 Embodiment 1 of the present invention will be described below with reference to the drawings. FIG. 2A is a front external view of the mobile terminal device 1 according to this embodiment, FIG. 2B is a side external view thereof, and FIG. 2C is a back external view thereof. FIG. 3 is a perspective view of the mobile terminal device 1. Like the mobile terminal device 100 of FIGS. 1A to 1C, the mobile terminal device 1 is a terminal device (electronic device) that a user can carry, such as a smartphone, tablet-type mobile terminal, mobile phone, game machine, or electronic book terminal.
 As shown in FIGS. 2A to 2C and FIG. 3, the mobile terminal device 1 according to this embodiment includes a display module 10 disposed on its main surface and a housing 20 disposed on its back surface.
 The display module 10 is provided with a display unit (display screen) 11 that displays images and the like, a touch panel 12 on which the user performs input operations, and a camera (in-camera) 13 for imaging the user. It also has a microphone for inputting sound, a speaker for outputting sound, and so on. A GUI such as icons is displayed on the display unit 11, and when the user touches the touch panel 12 with a finger or the like to operate the GUI, the function corresponding to the operation is executed.
 The housing 20 has a concave cross section and is fixed so as to cover the entire back surface of the display module 10. An upper bottom surface (upper side surface) 21 is formed at the upper end of the housing, which faces upward when the user holds the mobile terminal device 1. In this embodiment, a camera (out-camera) 14 for ordinary imaging is provided at approximately the center of this upper bottom surface 21. The upper bottom surface 21 is inclined at an acute angle α with respect to the main surface; that is, the camera 14 is tilted at the angle α with respect to the main surface. The focal direction (optical axis direction) of the camera 14 is therefore also inclined at the angle α with respect to the extending direction of the main surface.
 Thus, in this embodiment, so that the user can easily operate the device while checking the preview-mode display through the touch-panel display unit, the camera is mounted at the predetermined angle α on the plane above the surface on which the display unit displays, i.e., on the upper bottom surface. The predetermined angle α is an angle at which the main surface and the upper bottom surface form an acute angle.
 In particular, by arranging the camera at this predetermined angle α, the user can, as shown in FIG. 4B, capture an image of the area in front of the user's position with the camera on the upper bottom surface while operating the touch panel arranged on the main surface of the mobile terminal device.
 The user can thus operate the device while simultaneously checking the camera's preview-mode display on the display unit. With the mobile terminal device 1 according to this embodiment, the imaging operation can be performed while the device is held obliquely, i.e., in the normal posture in which it is operated, so visibility and operability improve.
(Embodiment 2)
 Embodiment 2 of the present invention will be described below with reference to the drawings. This embodiment mainly describes the imaging operation of the mobile terminal device 1 shown in Embodiment 1. Here, the external configuration of the mobile terminal device is described as that of FIGS. 2A to 2C of Embodiment 1; the external configuration of FIGS. 1A to 1C of the premise example may be used instead.
 図5は、本実施の形態に係る携帯端末装置1のハードウェア構成を示している。図5に示すように、携帯端末装置1は、制御部31、無線通信部32、表示部33、タッチパネル34、マイク35、スピーカ36、記憶部37、カメラ38を有している。 FIG. 5 shows a hardware configuration of the mobile terminal device 1 according to the present embodiment. As illustrated in FIG. 5, the mobile terminal device 1 includes a control unit 31, a wireless communication unit 32, a display unit 33, a touch panel 34, a microphone 35, a speaker 36, a storage unit 37, and a camera 38.
 制御部31は、各部の動作を制御する制御部であり、例えばCPU等である。制御部31は、記憶部37に記憶されている制御プログラムやアプリケーションプログラムを実行し、各プログラムにしたがって各部の動作を制御する。 The control unit 31 is a control unit that controls the operation of each unit, such as a CPU. The control unit 31 executes a control program and an application program stored in the storage unit 37, and controls the operation of each unit according to each program.
The wireless communication unit 32 performs wireless communication with base stations, other communication devices, and the like. It transmits and receives audio signals for calls and the like, as well as data signals such as captured images. Communication is not limited to wireless; wired communication or other communication control may also be performed.

The display unit 33 corresponds to the display unit 11 in FIGS. 2A to 2C and is a display such as a liquid crystal display or an organic EL display. The display unit 33 displays GUI elements such as icons, images, and video in accordance with instructions from the control unit 31.

The touch panel 34 corresponds to the touch panel 12 in FIGS. 2A to 2C. It detects, using capacitance or the like, the position on the display unit 33 touched by the user's finger, and outputs a signal indicating the detected position to the control unit 31.

The microphone 35 receives voice and other sound from the user, and the speaker 36 outputs sound to the user. For example, an audio signal corresponding to the voice input from the user to the microphone 35 is transmitted via the wireless communication unit 32, and sound corresponding to an audio signal received by the wireless communication unit 32 is output from the speaker 36 to the user.

The storage unit 37 is a storage device such as a flash memory, and stores information necessary for the operation of the control unit 31 and the other units. For example, the storage unit 37 stores the control program and application programs executed by the control unit 31, imaging data captured by the camera 38, and the like.

The camera 38 mainly corresponds to the camera 14 in FIGS. 2A to 2C (it may instead be the camera 13 in FIGS. 2A to 2C). It may be an ordinary camera, but here it is a wide-angle camera having a wide-angle lens. Mounting a wide-angle camera makes it possible to image a wider range of subjects.
FIG. 6 shows the configuration of the functional blocks realized by the control unit 31 of the mobile terminal device 1 according to the present embodiment. As illustrated in FIG. 6, the control unit 31 includes a GUI unit 41, a preview display unit 42, an imaging area setting unit 43, an imaging processing unit 44, an image storage unit 45, and an image processing unit 46. The functional blocks of the control unit 31 in FIG. 6 are an example; other block configurations may be used as long as the imaging operation of the present embodiment can be realized.

Each block (each function and processing unit) in FIG. 6 may be implemented in hardware, software, or both. For example, each block of the control unit 31 is realized by a CPU (computer) executing a control program for performing the imaging operation according to the present embodiment.

The GUI unit 41 is an interface that accepts user input operations via the display unit 33 and the touch panel 34. When the user operates an icon (indicator) displayed on the display unit 33, the GUI unit 41 receives a position signal corresponding to the icon from the touch panel 34 and detects that the icon has been operated. The GUI unit 41 detects flick, drag, tap, double-tap, pinch-in, pinch-out, and similar operations in response to the user's actions on the touch panel 34, and notifies the respective units that an input operation has been detected.
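The distinction the GUI unit 41 draws between a tap, a drag, and a flick can be sketched from the touch trace's travel distance and speed. This is a minimal illustration only; the thresholds and the function name are assumptions, and a real touch-panel driver would use platform-specific values.

```python
import math

# Assumed thresholds (not specified in the source document)
TAP_MAX_DIST = 10      # px: shorter traces count as taps
FLICK_MIN_SPEED = 0.5  # px/ms: faster traces count as flicks

def classify_gesture(x0, y0, x1, y1, duration_ms):
    """Classify a single-finger touch trace into tap / drag / flick."""
    dist = math.hypot(x1 - x0, y1 - y0)
    if dist < TAP_MAX_DIST:
        return "tap"
    if duration_ms > 0 and dist / duration_ms >= FLICK_MIN_SPEED:
        return "flick"
    return "drag"
```

A trace that barely moves is a tap, a fast long trace is a flick, and a slow long trace is a drag; pinch detection would additionally require a second touch point.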
The preview display unit 42 displays a preview on the display unit 33 when imaging with the camera 38. In the present embodiment, because a wide-angle camera is used as the camera 38, the display is performed in two preview modes corresponding to the wide-angle camera. That is, the preview shown on the display unit 33 has two modes: a first preview mode, which displays the entire image captured by the camera 38 (or a substantially entire image obtained by excluding a partial region extending a predetermined width from each edge), and a second preview mode, which previews the captured image itself that will be taken in response to an imaging instruction.

For example, taking reference numeral 701 in FIG. 7 as the entire imaging target, the first preview mode displays the entire image captured by the camera 38, as indicated by reference numeral 702 in FIG. 7. The second preview mode displays, of the image captured by the camera 38, only the imaging area containing a specific subject, as indicated by reference numeral 703 in FIG. 7.
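The relationship between the two preview modes is a crop: the second mode shows only the designated rectangle of the full wide-angle frame. A minimal sketch, assuming the frame is a row-major 2D array and the area is given as (left, top, width, height); neither representation is specified in the source.

```python
def crop_imaging_area(frame, area):
    """Extract the designated imaging area from the full wide-angle frame.

    `frame` is a list of pixel rows; `area` is (left, top, width, height)
    in frame coordinates. The first preview mode would show `frame`
    itself; the second shows only the returned sub-image.
    """
    left, top, width, height = area
    return [row[left:left + width] for row in frame[top:top + height]]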
The imaging area setting unit 43 sets the imaging area within the captured image in the first or second preview mode, and indicates it with a simplified diagram. The imaging area setting unit 43 sets the imaging area in accordance with the user's imaging-area designation operation, and displays a schematic diagram for imaging-area designation (imaging area 11a), as indicated by reference numeral 703 in FIG. 7. The schematic diagram is, for example, a double frame in which the position of the designated imaging area is shown as an inner frame within an outer frame representing the captured image; this double frame may be displayed in a corner of the display unit.

The imaging processing unit 44 captures an image of the designated imaging area and generates imaging data in accordance with the user's input. The image storage unit 45 stores the imaging data generated by imaging in the storage unit 37 or the like. The image processing unit 46 performs various kinds of image processing on the captured imaging data.

FIG. 8 shows the imaging operation (imaging process) of the mobile terminal device 1 according to the present embodiment. For example, when the user operates the camera icon, the camera application is started, and the following processing is executed by the units shown in FIG. 6.

As shown in FIG. 8, the preview display unit 42 first displays the entire image captured by the camera 38 in the first preview mode (S101). Next, the GUI unit 41 determines whether an imaging-area designation operation has been performed (S102). In the first preview mode, the user designates the area to be imaged by an operation instruction, including a contact or proximity operation, on the touch-panel display unit 33.
When the GUI unit 41 detects the user's imaging-area designation operation, the preview display unit 42 switches to the second preview mode (S103). In the second preview mode, the preview display unit 42 displays the designated imaging area enlarged.

Next, with the second preview mode displayed, the imaging area setting unit 43 displays the designated imaging area as a simplified diagram (S104). The imaging area setting unit 43 displays the imaging area 11a using a double frame or the like, as described above. This allows the user, while holding the mobile terminal device by hand, to confirm which area of the preview image viewed on the display unit will be captured as the imaging area.

Furthermore, because the camera is a wide-angle camera, the imaging area can be designated and changed while the holding posture is maintained. The user can perform this designation and change as an imaging-area designation operation.

Next, the imaging area setting unit 43 performs follow-up display that tracks a subject contained in the designated imaging area (S105). For example, when a subject such as a person or animal contained in the designated imaging area is recognized by image recognition and movement of the subject is detected, the imaging area is moved and displayed so as to follow the subject. In the present embodiment, designation is possible not only by the user but also by designation processing in the terminal's control unit. Once the control unit in the terminal recognizes, by image recognition processing, a specific display object within the area designated by the user as the object to be imaged, it can keep tracking that object as the imaging area even if the object moves or the user's hand moves.
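The follow-up display of step S105 amounts to re-centering the crop rectangle on the recognized subject while keeping it inside the wide-angle frame. A sketch under the assumption that the recognition step yields a subject centroid; the actual tracking algorithm is not specified in the source.

```python
def follow_subject(area, subject_center, frame_w, frame_h):
    """Re-center the imaging area on the tracked subject (S105 sketch).

    `area` is (left, top, width, height); `subject_center` is the
    (x, y) centroid reported by an image-recognition step. The area is
    clamped so it never leaves the captured frame.
    """
    left, top, width, height = area
    cx, cy = subject_center
    new_left = min(max(cx - width // 2, 0), frame_w - width)
    new_top = min(max(cy - height // 2, 0), frame_h - height)
    return (new_left, new_top, width, height)
```

Because the frame is wide-angle, a moving subject (or a moving hand) usually stays inside the full capture, so the crop can follow it without the user re-aiming the terminal.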
Next, the imaging processing unit 44 performs imaging and saves the imaging data through the imaging execution process (S106), and the process ends.

FIG. 9 shows another example of the imaging operation of the mobile terminal device 1 shown in FIG. 8. In FIG. 9, after display in the first preview mode (S101), when a mode switching operation is input (S107), display in the second preview mode is performed (S103). A mode switching operation may also switch from the second preview mode back to the first. For example, the mode can be switched by operating a mode-switching icon.

When an imaging-area designation operation (S102) is performed in the first or second preview mode, the imaging area is displayed in that preview mode (S104), and follow-up display of the subject is performed (S105). Next, imaging and saving of the imaging data are performed in the imaging execution process (S106).

FIG. 10 shows the imaging execution process (S106) of FIG. 8 or FIG. 9. With the imaging area 11a designated by the operations of FIG. 8 or FIG. 9 displayed, the GUI unit 41 determines whether an imaging operation has been performed (S110). The imaging operation is performed by an operation instruction, including a contact or proximity operation, on the displayed imaging area, an imaging icon, or the like. When the GUI unit 41 detects an imaging operation, the imaging processing unit 44 executes imaging (S111), generating imaging data from the image signal of the designated imaging area.

Next, the GUI unit 41 determines whether a save operation for the imaging data has been performed (S112). The save operation is performed by an operation instruction, including a contact or proximity operation, on a displayed image-save icon or the like. When the GUI unit 41 detects the save operation, the image storage unit 45 saves the imaging data (S113), storing the generated imaging data in the storage unit 37.

As described above, in the present embodiment, because the camera and the display unit are arranged at a predetermined acute angle, it is easy to designate the imaging area, or to designate a display object to be imaged, on the display unit while checking the preview display.

Also, once the imaging area or the display object to be imaged has been designated by this operation, the terminal's control unit tracks the subject. In the first preview mode in particular, the tracked display object can be seen on the preview display as the frame information indicating the imaging area; in the second preview mode as well, it can be easily seen by means of the double-frame display described above.
(Embodiment 3)
The third embodiment of the present invention will be described below with reference to the drawings. The present embodiment mainly describes another example of the imaging operation in the mobile terminal device 1 shown in the second embodiment. Here, the external configuration of the mobile terminal device is described as the configuration of FIGS. 2A to 2C of the first embodiment, but the external configuration of FIGS. 1A to 1C of the premise example may also be used. The other configurations are the same as those of FIGS. 5 and 6 of the second embodiment.
FIG. 11 shows a display example of the mobile terminal device 1 according to the present embodiment. As shown in FIG. 11, in the mobile terminal device 1, a first indicator 201, a second indicator 202, and a third indicator 203 are arranged in the indicator arrangement area 200 at the lower part of the display unit 11.

As shown in FIGS. 8 and 9, the user performs an imaging-area designation operation or the like to designate the imaging area 11a. In this state, the user can issue an imaging instruction by performing a flick operation on the display unit 11 toward a specific direction.

Specifically, the display unit 11 shows indicators that specify where the imaging data is to be stored after imaging. When the control unit detects a flick operation while the preview is displayed, it determines whether the flick is directed toward one of these indicators. Here, as an example, three icons (indicators) are displayed at the lower part of the display unit 11.

Each indicator is associated with information indicating a storage destination: for example, the first indicator 201 indicates the in-terminal memory, the second indicator 202 indicates a server provided by an image sharing service on the network, and the third indicator 203 indicates the memory of a specific user.

Thus the first indicator 201 includes an instruction to save to the in-terminal memory, the second indicator 202 an instruction to access a server provided by an image sharing service or the like, and the third indicator 203 an instruction to transfer the data to a specific user, for example as an e-mail attachment.
FIG. 12 shows the operation of the mobile terminal device 1 according to the present embodiment, namely the imaging execution process (S106) of FIG. 8 or FIG. 9.

With the imaging area 11a designated by the operations of FIG. 8 or FIG. 9 displayed, the GUI unit 41 accepts a flick operation from the user (S201). The operation is not limited to a flick; any operation that associates the imaging area with one of the indicators, such as a drag operation, may be used.

The GUI unit 41 determines whether the first indicator 201, the second indicator 202, or the third indicator 203 lies in the direction of the flick operation (S202). If no indicator lies in the operation direction, the imaging area is changed (S203). When no indicator lies in the direction indicated by the flick, the operation may be treated as something other than a save process; here, the imaging area is moved in the operation direction and displayed.
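The test of step S202, whether an indicator lies in the flick direction, can be sketched as a ray march from the flick's start point along its direction vector, checking each indicator's bounding box. The indicator names, box geometry, and step counts below are illustrative assumptions.

```python
def indicator_in_flick_direction(flick_start, flick_vec, indicators):
    """Return the name of the indicator the flick direction points at.

    `indicators` maps a name to the icon's bounding box
    (left, top, right, bottom). The ray from `flick_start` along
    `flick_vec` is sampled in small steps; the first box it enters wins.
    """
    x, y = flick_start
    dx, dy = flick_vec
    for _ in range(200):              # march the ray in small steps
        x += dx / 20.0
        y += dy / 20.0
        for name, (l, t, r, b) in indicators.items():
            if l <= x <= r and t <= y <= b:
                return name
    return None                       # no indicator in this direction
```

A `None` result corresponds to the S203 branch: the flick is treated not as a save but as an imaging-area change.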
Also, for example, if the first indicator is associated not with saving to the in-terminal memory but with saving to an external memory provided for the user on the network, then a flick operation that designates none of the first to third indicators may instead be processed as an imaging instruction that saves directly to the in-terminal memory.

If the first indicator 201 lies in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S204), and the image storage unit 45 then saves the imaging data to the storage unit 37 in the terminal (S205).

If the second indicator 202 lies in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S206), and the image storage unit 45 then accesses the server and transmits the imaging data to it (S207). The imaging data is thereby stored on the server.

If the third indicator 203 lies in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S208), and the image storage unit 45 then attaches the imaging data to an e-mail and transfers it (S209). The imaging data is thereby stored on the device of the mail recipient.
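Each branch of S204 to S209 has the same shape, capture first, then store to the destination bound to the indicator, so the flow can be sketched as a dispatch table. The indicator names and the storage callbacks here are illustrative placeholders for indicators 201 to 203.

```python
def execute_imaging(indicator, capture, destinations):
    """Dispatch the capture-and-store flow of FIG. 12 (S204-S209 sketch).

    `capture` produces the imaging data; `destinations` maps an
    indicator name to the function that stores the data (in-terminal
    memory, image-sharing server, or mail transfer).
    """
    if indicator not in destinations:
        return None                    # S203 branch: treated as an area change
    data = capture()                   # imaging happens first in every branch
    destinations[indicator](data)      # then the chosen destination stores it
    return data
```

For example, `destinations` might be `{"memory": save_local, "server": upload, "mail": send_mail}`, with each value a function the terminal already provides.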
FIG. 13 shows display images during the imaging operation of the mobile terminal device 1 according to the present embodiment. For example, taking reference numeral 1301 in FIG. 13 as the entire imaging target, when the user designates the imaging area 11a, the imaging area is displayed enlarged in the second preview mode, as indicated by reference numeral 1302 in FIG. 13. If, in this state, the user touches the display unit 11 and flicks in a direction with no icon, the live video can be moved as if browsing a gallery and set to the best angle of view. The camera's live video can thus be manipulated freely by touch.

If the user's flick or drag operation is not in a predetermined direction, the operation is not detected as an imaging instruction. That is, the operation is detected as an imaging instruction only when the predetermined direction points toward an indicator. When it does not, in the second preview mode the imaging area can instead be changed, as indicated by reference numeral 1303 in FIG. 13.

Reference numeral 1304 in FIG. 13 shows the state in which the imaging area has been touched from the state of reference numeral 1303 and flicked to the SNS icon 11b. As shown in FIG. 13(d), when the user touches the screen set to the best angle of view, the camera's live video pauses, and the user drags it directly onto the SNS icon 11b at the bottom of the screen. The imaging data captured in the imaging area is thereby transmitted to the SNS.

The image is then taken and shared on the network as it is. The video can be cut out and shared with the familiar touch operation, and users can immediately share what they see and feel at that moment.

As described above, in the present embodiment, the user can perform the imaging process and specify the storage destination of the captured image in a single action. Capturing the image and designating its destination can thus be done simultaneously, improving operability and diversifying the opportunities to make use of imaging information.
(Embodiment 4)
The fourth embodiment of the present invention will be described below with reference to the drawings. The present embodiment mainly describes another example of the imaging operation in the mobile terminal device 1 shown in the third embodiment. Here, the external configuration of the mobile terminal device is described as the configuration of FIGS. 2A to 2C of the first embodiment, but the external configuration of FIGS. 1A to 1C of the premise example may also be used. The other configurations are the same as those of FIGS. 5 and 6 of the second embodiment.
FIG. 14 shows a display example of the mobile terminal device 1 according to the present embodiment. In FIG. 14, as in FIG. 11, the first indicator 201, the second indicator 202, and the third indicator 203 are arranged in the indicator arrangement area 200 at the lower part of the display unit 11. In addition, in the present embodiment, a fourth indicator 204, a fifth indicator 205, and a sixth indicator 206 are displayed in an indicator arrangement area 210.

In the third embodiment, when the preview display and the indicators showing the capture destination are displayed on the display unit in the preview display state, the destination can be specified simultaneously with the imaging instruction. In the present embodiment, image correction can additionally be specified in the same way.

In addition to the indicators designating the capture destination, indicators for image correction are displayed; by a flick or drag operation toward the direction where such an indicator is located, the captured image can be taken from the preview display state and image correction can be applied at the same time. A plurality of image-correction indicators may be displayed according to the types of image correction.

That is, suppose the fourth indicator 204 specifies monochrome processing that reduces saturation to a predetermined value, the fifth indicator 205 specifies mosaic processing applied to a specific display object, and the sixth indicator 206 specifies sepia processing that converts the color tone to sepia. Then, for example, when a flick or drag operation from within the imaging area toward the fourth indicator 204, or a drag operation whose trajectory passes over the area displaying the fourth indicator 204, is detected in the preview display state, the control unit converts the captured image of the imaging area to monochrome and then stores it in the in-terminal memory. Hereinafter this is referred to simply as a flick operation, but the term includes the operation methods described above.
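The monochrome and sepia corrections of the fourth and sixth indicators can be sketched as per-pixel transforms. The coefficients below are the widely used Rec. 601 luma weights and a common sepia matrix, assumptions on our part, since the patent does not specify the conversion formulas.

```python
def to_monochrome(pixels):
    """Fully reduce saturation: replace each RGB pixel with its luma."""
    out = []
    for r, g, b in pixels:
        y = int(0.299 * r + 0.587 * g + 0.114 * b)  # Rec. 601 luma
        out.append((y, y, y))
    return out

def to_sepia(pixels):
    """Shift the color tone toward sepia, clamping each channel to 255."""
    out = []
    for r, g, b in pixels:
        nr = min(int(0.393 * r + 0.769 * g + 0.189 * b), 255)
        ng = min(int(0.349 * r + 0.686 * g + 0.168 * b), 255)
        nb = min(int(0.272 * r + 0.534 * g + 0.131 * b), 255)
        out.append((nr, ng, nb))
    return out
```

Mosaic processing (the fifth indicator) would operate differently, averaging pixel blocks over the recognized display object rather than transforming every pixel independently.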
Consider the operation when the indicator arrangement area 200, containing the indicators designating the capture destination, and the indicator arrangement area 210, containing the indicators designating image correction, are both displayed as in FIG. 14. For example, when the control unit detects an operation instruction in which the user passes over a specific indicator in the second indicator arrangement area 210 and then over an indicator in the first area, or a flick operation in that direction, the control unit performs the image correction corresponding to the specific indicator in the second area and then applies the capture destination corresponding to the specific indicator designated in the first area.

That is, suppose the face of the user's friend has been designated as a specific display object within the imaging area, and the second indicator is designated in the first area while the second indicator is designated in the second area. When the control unit receives the imaging instruction, it applies mosaic processing to the friend's face within the imaging-area frame and then performs the upload to the shared server corresponding to the second indicator in the first area.
FIG. 15 shows the operation of the mobile terminal device 1 according to the present embodiment, namely the imaging execution process (S106) of FIG. 8 or FIG. 9.

With the imaging area designated by the operations of FIG. 8 or FIG. 9 displayed, the GUI unit 41 accepts a flick operation from the user (S301). The operation is not limited to a flick; any operation that associates the imaging area with one of the indicators, such as a drag operation, may be used.

The GUI unit 41 determines whether the fourth indicator 204, the fifth indicator 205, or the sixth indicator 206 lies in the direction of the flick operation (S302). If no indicator lies in the operation direction, the imaging area is changed (S303). When no indicator lies in the direction indicated by the flick, the operation may be treated as something other than a save process; here, the imaging area is moved in the operation direction and displayed.
If the fourth indicator 204 lies in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S304), and the image processing unit 46 applies monochrome processing to the imaging data (S305).

If the fifth indicator 205 lies in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S306), and the image processing unit 46 applies mosaic processing to the imaging data (S307).

If the sixth indicator 206 lies in the direction of the flick operation, the imaging processing unit 44 performs imaging to generate imaging data (S308), and the image processing unit 46 applies sepia processing to the imaging data (S309).
The GUI unit 41 further determines whether the first indicator 201, the second indicator 202, or the third indicator 203 lies in the direction of the flick operation (S310).

If the first indicator 201 lies in the direction of the flick operation, the image storage unit 45 saves the imaging data to the storage unit 37 in the terminal (S311). If the second indicator 202 lies in the direction of the flick operation, the image storage unit 45 accesses the server and transmits the imaging data to it (S312); the imaging data is thereby stored on the server. If the third indicator 203 lies in the direction of the flick operation, the image storage unit 45 attaches the imaging data to an e-mail and transfers it (S313); the imaging data is thereby stored on the device of the mail recipient.
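The flow of FIG. 15, one flick selecting a correction indicator and then a destination indicator, can be sketched as a small pipeline: corrections transform the captured data, and the destination then stores the result. The names and callbacks are illustrative stand-ins, not taken from the patent.

```python
def execute_corrected_imaging(indicators_on_path, capture,
                              corrections, destinations):
    """Chain the correction and storage steps of FIG. 15 (S301-S313 sketch).

    `indicators_on_path` lists the indicators the flick passes over, in
    order. Correction indicators (area 210) transform the data before a
    destination indicator (area 200) stores it.
    """
    data = capture()
    for name in indicators_on_path:
        if name in corrections:
            data = corrections[name](data)   # e.g. monochrome, mosaic, sepia
        elif name in destinations:
            destinations[name](data)         # e.g. memory, server, mail
    return data
```

A flick whose trajectory crosses, say, a "sepia" indicator and then a "server" indicator captures once, applies sepia, and uploads the corrected data in a single gesture.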
As described above, in the present embodiment, not only is it easy to specify the imaging process and the capture destination of the captured image, but image correction can also be performed at the same time, so image correction appropriate to the destination can be applied easily. Operability is therefore further improved. For example, even when performing capture processing in which the boundary between the inside and outside of the terminal becomes blurred, image correction can be appropriately applied to images that may be exposed externally.

The operation is not limited to two indicators; various kinds of processing may be applied to the imaging data by flick operations directed at a larger number of indicators.
(Embodiment 5)
The fifth embodiment of the present invention will be described below with reference to the drawings. The present embodiment mainly describes an example in which the tilt state is displayed during the imaging operation of the mobile terminal device 1 shown in the first embodiment.
 When an image captured by the wide-angle camera is displayed on the display unit in the second preview display state, it becomes difficult for the user to confirm, by looking at the preview on the display unit, the posture of the terminal held at a predetermined inclination.
 Therefore, to indicate how far the terminal is inclined with respect to the horizontal, a fixed horizontal line and a tilt line that is displaced according to the posture of the terminal may be displayed in simplified form. This makes it possible to confirm the tilt state of the terminal by looking at the display unit. The horizontal line here denotes the position that is parallel to the focal direction of the camera when the camera arrangement angle forms an acute angle (the first angle) with the main surface of the housing.
 Note that the mobile terminal device 1 according to this embodiment incorporates a tilt sensor (horizontal position detecting means) in order to display the tilt line.
 FIGS. 16A to 16C show display examples of the mobile terminal device 1 according to this embodiment. FIG. 16A is a display example in which the focal direction (optical axis direction) of the camera is horizontal. A horizontal line 301 and a tilt line 302 are displayed on the display unit 11. In FIG. 16A, the focal direction of the camera is horizontal and the tilt sensor detects the horizontal state, so the horizontal line 301 and the tilt line 302 are displayed overlapping each other. By registering this state with the tilt sensor in advance as the horizontal state, for example, the tilt angle with respect to the horizontal can be detected.
 FIG. 16B is a display example in which the focal direction (optical axis direction) of the camera points below the horizontal. A horizontal line 301 and a tilt line 302 are displayed on the display unit 11. In FIG. 16B, the focal direction of the camera points below the horizontal and the tilt sensor detects the downward tilt, so the tilt line 302 is displayed above the horizontal line 301.
 FIG. 16C is a display example in which the focal direction (optical axis direction) of the camera points above the horizontal. A horizontal line 301 and a tilt line 302 are displayed on the display unit 11. In FIG. 16C, the focal direction of the camera points above the horizontal and the tilt sensor detects the upward tilt, so the tilt line 302 is displayed below the horizontal line 301.
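The relationship between the sensed pitch and the drawn position of the tilt line in FIGS. 16A to 16C can be sketched as a simple mapping. This is a hypothetical illustration; the pixel scale, the sign convention, and the function name are assumptions and are not taken from the disclosure.

```python
# Hypothetical sketch: map the tilt sensor's pitch angle to the vertical
# screen offset of the tilt line 302 relative to the fixed horizontal
# line 301. The pixels-per-degree scale is an illustrative assumption.

def tilt_line_offset(pitch_deg, pixels_per_degree=4.0):
    """Return the vertical offset of the tilt line, with screen y growing
    downward.

    pitch_deg < 0 (camera points below horizontal): negative offset,
    i.e. the tilt line is drawn above the horizontal line (Fig. 16B).
    pitch_deg > 0 (camera points above horizontal): positive offset,
    i.e. the tilt line is drawn below the horizontal line (Fig. 16C).
    pitch_deg == 0: the two lines overlap (Fig. 16A).
    """
    return pitch_deg * pixels_per_degree
```

When the user levels the camera's focal direction, the offset returns to zero and the two lines coincide, which is the visual cue the embodiment relies on.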
 As described above, displaying the horizontal line and the tilt line on the display unit makes it possible to confirm the tilt state of the terminal by looking at the display unit. This makes it easier for the user to adjust the inclination of the device, improving operability.
 Note that the present invention is not limited to the above embodiments and may be modified as appropriate without departing from its spirit.
 For example, in the premise example of FIGS. 1A to 1C the camera is arranged on the back surface of the housing, whereas in Embodiment 1 of FIGS. 2A to 2C it is arranged on the upper bottom surface; both the camera of FIGS. 1A to 1C and the camera of FIGS. 2A to 2C may be mounted. This allows imaging with the mobile terminal device held vertically as well as with it tilted obliquely.
 Instead of inclining the upper bottom surface, a structure may be adopted in which the upper bottom surface is kept substantially perpendicular to the main surface and only the optical axis of the camera mounted on it is inclined. Even in that case, the operations and processes described in the above embodiments are of course still possible.
 Furthermore, if the camera can capture an imaging area wide enough to image an object located in the direction perpendicular to the main surface, the above-described adjustment of the inclination of the upper bottom surface or of the camera's optical axis angle becomes unnecessary.
 With such a camera the captured image is an area centered on the direction parallel to the main surface and pointing upward, but by having the control unit cut out, as the imaging area, a region perpendicular to the main surface and centered on the forward direction, the operation modes described in the above embodiments can still be provided.
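The forward-centered cut-out described in this modification can be sketched in terms of simple angular geometry: the control unit selects the band of rows of the wide capture whose viewing direction is centered on the horizontal (forward) direction. This is a hypothetical illustration; the linear angle-to-row mapping, the field-of-view figures, and the choice of crop height are assumptions for the sketch.

```python
# Hypothetical sketch of cutting a forward-centered imaging area out of a
# wide-angle capture whose optical axis is tilted upward by axis_tilt_deg.
# Assumes a linear mapping of vertical view angle to image rows.

def forward_crop_rows(image_height, fov_deg, axis_tilt_deg):
    """Return (top, bottom) row indices of a crop centered on the forward
    (horizontal) direction, which sits axis_tilt_deg below the optical
    axis at the center of the captured frame (rows grow downward)."""
    deg_per_row = fov_deg / image_height
    # Row whose viewing direction is horizontal:
    center_row = image_height / 2 + axis_tilt_deg / deg_per_row
    half = image_height / 4  # keep half the frame height around the center
    top = max(0, int(center_row - half))
    bottom = min(image_height, int(center_row + half))
    return top, bottom
```

For a 1000-row frame with a 100-degree vertical field of view and a 25-degree upward axis tilt, the horizontal direction falls 250 rows below the frame center, so the crop is taken from the lower part of the capture.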
 The control program according to the above embodiments can be stored using any of various types of non-transitory computer readable media and supplied to a computer. Non-transitory computer readable media include various types of tangible storage media. Examples include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memories (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, RAM (Random Access Memory)). The program may also be supplied to a computer by various types of transitory computer readable media. Examples of transitory computer readable media include electrical signals, optical signals, and electromagnetic waves. A transitory computer readable medium can supply the program to a computer via a wired communication path such as an electric wire or an optical fiber, or via a wireless communication path.
 Although the present invention has been described above with reference to the embodiments, it is not limited by them. Various changes that those skilled in the art can understand may be made to the configuration and details of the present invention within its scope.
 This application claims priority based on Japanese Patent Application No. 2012-099365, filed on April 25, 2012, the entire disclosure of which is incorporated herein.
DESCRIPTION OF SYMBOLS
1   Mobile terminal device
10  Display module
11  Display unit
11a Imaging area
12  Touch panel
13  Camera (in-camera)
14  Camera (out-camera)
20  Housing
21  Upper bottom surface
31  Control unit
32  Wireless communication unit
33  Display unit
34  Touch panel
35  Microphone
36  Speaker
37  Storage unit
38  Camera
41  GUI unit
42  Preview display unit
43  Imaging area setting unit
44  Imaging processing unit
45  Image storage unit
46  Image processing unit
100 Mobile terminal device
110 Display module
111 Display unit
112 Touch panel
113 Camera (in-camera)
114 Camera (out-camera)
120 Housing
200 Index arrangement area
201-206 Indices
210 Index arrangement area
301 Horizontal line
302 Tilt line

Claims (10)

  1.  An electronic device comprising:
     imaging means whose optical axis direction is arranged obliquely with respect to a main surface;
     display means arranged on the main surface, for displaying a captured image taken in from the imaging means;
     input detection means for detecting an input operation on the surface of the display means;
     imaging area setting means for setting an imaging area on the displayed captured image; and
     imaging processing means for generating, in response to the detected input operation, imaging data captured on the basis of the captured image of the set imaging area.
  2.  The electronic device according to claim 1, wherein the imaging means is arranged on an inclined surface extending obliquely with respect to the main surface.
  3.  The electronic device according to claim 1 or 2, wherein the imaging area setting means displays the set imaging area on the display means while following the movement of a subject within the imaging area.
  4.  The electronic device according to any one of claims 1 to 3, wherein
     the imaging area setting means sets the imaging area in accordance with the detected input operation, and
     the display means switches between a first display mode that displays the entire captured image taken in from the imaging means and a second display mode that displays, enlarged, the set imaging area within the captured image.
  5.  The electronic device according to any one of claims 1 to 4, wherein
     the display means displays a first processing index with which a first process for the imaging data is associated, and
     when the input operation is input in a direction from the imaging area toward the first processing index, the imaging processing means generates the imaging data and executes, on the imaging data, the first process associated with the first processing index.
  6.  The electronic device according to claim 5, wherein the first process associated with the first processing index is a process of taking the imaging data into a preset capture destination.
  7.  The electronic device according to claim 5 or 6, wherein
     the display means displays a second processing index with which a second process for the imaging data is associated, and
     when the input operation is input in a direction from the imaging area toward the first and second processing indices, the imaging processing means generates the imaging data and executes, on the imaging data, the first and second processes associated with the first and second processing indices.
  8.  The electronic device according to claim 7, wherein the second process associated with the second processing index is a process of executing preset image processing on the imaging data.
  9.  A control method for an electronic device including imaging means whose optical axis direction is arranged obliquely with respect to a main surface, the method comprising:
     displaying, on display means on the main surface, a captured image taken in from the imaging means;
     detecting an input operation on the surface of the display means;
     setting an imaging area on the displayed captured image; and
     generating, in response to the detected input operation, imaging data captured on the basis of the captured image of the set imaging area.
  10.  A non-transitory computer readable medium storing a control program for causing a computer to execute a control process of an electronic device including imaging means whose optical axis direction is arranged obliquely with respect to a main surface, the control process comprising:
     displaying, on display means on the main surface, a captured image taken in from the imaging means;
     detecting an input operation on the surface of the display means;
     setting an imaging area on the displayed captured image; and
     generating, in response to the detected input operation, imaging data captured on the basis of the captured image of the set imaging area.
PCT/JP2013/000083 2012-04-25 2013-01-11 Electronic device, control method for same, and non-temporary computer-readable medium storing control program WO2013161134A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-099365 2012-04-25
JP2012099365 2012-04-25

Publications (1)

Publication Number Publication Date
WO2013161134A1 true WO2013161134A1 (en) 2013-10-31

Family

ID=49482492

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/000083 WO2013161134A1 (en) 2012-04-25 2013-01-11 Electronic device, control method for same, and non-temporary computer-readable medium storing control program

Country Status (1)

Country Link
WO (1) WO2013161134A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5919410B1 (en) * 2015-03-03 2016-05-18 ヤフー株式会社 Imaging apparatus, imaging method, and imaging program

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11355706A (en) * 1998-03-24 1999-12-24 Canon Inc Management system of digital camera picture
JP2004312165A (en) * 2003-04-03 2004-11-04 Sharp Corp Mobile communication apparatus with imaging section
JP2007006146A (en) * 2005-06-23 2007-01-11 Fujifilm Holdings Corp Imaging device and mobile phone
JP2007251522A (en) * 2006-03-15 2007-09-27 Brother Ind Ltd Image processing program
JP2011193496A (en) * 2011-04-20 2011-09-29 Casio Computer Co Ltd Imaging apparatus, imaging method, and imaging program
JP2011205345A (en) * 2010-03-25 2011-10-13 Nec Casio Mobile Communications Ltd Image pickup device and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11355706A (en) * 1998-03-24 1999-12-24 Canon Inc Management system of digital camera picture
JP2004312165A (en) * 2003-04-03 2004-11-04 Sharp Corp Mobile communication apparatus with imaging section
JP2007006146A (en) * 2005-06-23 2007-01-11 Fujifilm Holdings Corp Imaging device and mobile phone
JP2007251522A (en) * 2006-03-15 2007-09-27 Brother Ind Ltd Image processing program
JP2011205345A (en) * 2010-03-25 2011-10-13 Nec Casio Mobile Communications Ltd Image pickup device and program
JP2011193496A (en) * 2011-04-20 2011-09-29 Casio Computer Co Ltd Imaging apparatus, imaging method, and imaging program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5919410B1 (en) * 2015-03-03 2016-05-18 ヤフー株式会社 Imaging apparatus, imaging method, and imaging program
JP2016163236A (en) * 2015-03-03 2016-09-05 ヤフー株式会社 Apparatus, method and program for imaging

Similar Documents

Publication Publication Date Title
KR102076771B1 (en) Image capturing using multiple screen sections
EP3116215B1 (en) Mobile terminal and method for controlling the same
JP6205067B2 (en) Pan / tilt operating device, camera system, pan / tilt operating program, and pan / tilt operating method
US10055081B2 (en) Enabling visual recognition of an enlarged image
JP6757268B2 (en) Imaging device and its control method
KR20160131720A (en) Mobile terminal and method for controlling the same
KR101969424B1 (en) Photographing device for displaying image and methods thereof
US9742995B2 (en) Receiver-controlled panoramic view video share
JP2020514813A (en) Shooting method and terminal
KR20170112491A (en) Mobile terminal and method for controlling the same
KR20190014638A (en) Electronic device and method for controlling of the same
JP6484129B2 (en) Electronic device, image display method, and image display program
KR20190008610A (en) Mobile terminal and Control Method for the Same
KR20170055869A (en) Mobile terminal and method for controlling the same
US9509733B2 (en) Program, communication apparatus and control method
US20190387176A1 (en) Display control apparatus, display control method, and computer program
WO2021238564A1 (en) Display device and distortion parameter determination method, apparatus and system thereof, and storage medium
JPWO2016038977A1 (en) Live view control device, live view control method, live view system, and program
WO2017126216A1 (en) Imaging control device, imaging control method, and computer program
US20130235233A1 (en) Methods and devices for capturing images
WO2013161134A1 (en) Electronic device, control method for same, and non-temporary computer-readable medium storing control program
JP5229928B1 (en) Gaze position specifying device and gaze position specifying program
JPWO2016035422A1 (en) Operation device, operation method, and program for imaging apparatus
JPWO2019058641A1 (en) Electronic device, program, control device, and control method
JP2012243266A (en) Electronic apparatus and display method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13782442

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13782442

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP