WO2019003571A1 - Electronic device, method for controlling electronic device, and program - Google Patents

Electronic device, method for controlling electronic device, and program Download PDF

Info

Publication number
WO2019003571A1
WO2019003571A1 (PCT/JP2018/015161)
Authority
WO
WIPO (PCT)
Prior art keywords
function execution
contact
camera
function
unit
Prior art date
Application number
PCT/JP2018/015161
Other languages
French (fr)
Japanese (ja)
Inventor
前川 哲也 (Tetsuya Maekawa)
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Priority application: JP2019526168A (granted as JP6928652B2)
Published as WO2019003571A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present invention relates to an electronic device provided with a device that executes a function of the electronic device itself.
  • Patent Document 1 discloses a portable terminal that, when contact of a finger is detected around the camera, displays a warning indicating that the finger appears in the captured image.
  • In the configuration of Patent Document 1, however, finger contact is detected even when the user merely touches the periphery of the camera lens.
  • Consequently, a problem arises: if the user accidentally touches the periphery of the device, an operation not intended by the user occurs on the device.
  • An electronic device according to one aspect of the present invention includes: a function execution unit that executes a function of the device itself; a sensor that is disposed within a predetermined range from the function execution unit and detects a user's contact; a contact determination unit that determines, from the detection value of the sensor, contact with a plurality of regions set within the predetermined range; and a function execution control unit that controls function execution of the function execution unit according to the contact with the plurality of regions.
  • A control method of an electronic device according to one aspect of the present invention includes: a contact determination step of determining, from the detection value of a sensor that is disposed within a predetermined range from a function execution unit executing a function of the device itself and detects contact, contact with a plurality of regions set within the predetermined range; and a function execution control step of controlling execution of the function of the function execution unit according to the contact with the plurality of regions.
  • This achieves the effect of reducing erroneous operations when the user touches the device unintentionally.
  • (a) and (b) of FIG. 2 are diagrams showing an outline of the information terminal according to Embodiment 1.
  • FIG. 3 is a flowchart showing an example of the flow of processing performed by the information terminal according to Embodiment 1.
  • FIG. 5 is a flowchart showing an example of the flow of processing performed by the information terminal according to Embodiment 2.
  • FIG. 7 is a flowchart showing an example of the flow of processing performed by the information terminal according to Embodiment 3.
  • Embodiment 1: Hereinafter, an embodiment of the present invention will be described in detail with reference to FIG. 1 to FIG. 3 and FIG. 8.
  • FIG. 2 is a view showing an outline of the information terminal 1.
  • FIG. 8 is a diagram showing an outline of the information terminal 100.
  • the information terminal 100 includes a touch panel 102, a sensor 111, and a camera 112.
  • the shape of the touch panel 102 is a shape including the notch F100.
  • the sensor 111 and the camera 112 are disposed in the notch F100.
  • the sensor 111 is a sensor that detects a touch of a user's finger, and is disposed so as to surround the camera 112.
  • The crosses shown in FIG. 8 indicate the user's contact points in the peripheral area of the camera 112.
  • When such contact is detected, the information terminal 100 ends imaging by the camera 112.
  • That is, the information terminal 100 ends imaging by the camera 112 whenever contact is detected, regardless of the number of contact points, the contact region, or the like in the peripheral area of the camera 112.
  • (a) and (b) of FIG. 2 are diagrams showing an outline of the information terminal 1.
  • the information terminal 1 includes a touch panel 2, a sensor 11 for detecting a touch, and a camera (function execution unit) 12.
  • the information terminal 1 may be, for example, a tablet type terminal, a smartphone or the like.
  • the shape of the touch panel 2 is a shape including the notch F1.
  • the camera 12 is disposed in the notch F1.
  • a partial area of the touch panel 2 functions as the sensor 11 (the sensor 11 is configured by a partial area of the touch panel 2).
  • The region of the touch panel 2 within a predetermined range from the edge of the touch panel 2 forming the notch F1 functions as the sensor 11.
  • the sensor 11 is disposed within a predetermined range from the camera 12.
  • an area R1, an area R2, and an area R3 are set in the area where the sensor 11 is disposed.
  • The notch F1 of the touch panel 2 has a U-shape.
  • an area R1, an area R2, and an area R3 are set in an area within a predetermined range from an edge forming each side of the U-shape.
  • In the information terminal 1, for example, when the user's finger U contacts only the region R2, imaging by the camera 12 is not ended.
  • On the other hand, when a plurality of the regions are touched, the information terminal 1 ends imaging by the camera 12.
  • For example, when the user covers the camera 12 with a hand so that a plurality of the regions are touched, the information terminal 1 may end imaging by the camera 12.
  • the information terminal 1 determines a touch on a plurality of areas set within a predetermined range from the detection value of the sensor 11 and controls the function execution of the camera 12 according to the touch on the plurality of areas.
  • Although the regions R1, R2, and R3 are set for each side of the touch panel 2 forming the notch F1, the regions can be set arbitrarily.
  • the information terminal 1 may operate to control the function execution of the camera 12 when the touch panel 2 is active (normal mode: when a display screen is displayed). Further, the information terminal 1 may operate to control the function execution of the camera 12 when the touch panel 2 is in the save mode (when the display screen is not displayed).
  • In the information terminal 1, function execution of the camera 12 is controlled in accordance with contact with the regions R1, R2, and R3, which are a plurality of regions set within a predetermined range from the camera 12.
  • the information terminal 1 does not control the function execution of the camera 12 in the case of contact with only one area set within the predetermined range. Therefore, even when the user does not intend to control the function execution of the camera 12 and contacts the predetermined range in which the sensor 11 is disposed, a plurality of regions among the region R1, the region R2 and the region R3 are used. If there is no contact, control of the function execution of the camera 12 is not performed. Therefore, it is possible to reduce an erroneous operation caused by the user touching the area where the sensor 11 is disposed without intending to control the function execution of the camera 12.
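The multi-region check described above can be sketched in code. This is a minimal illustration, not taken from the patent: the region names R1 to R3 follow the description, but the coordinate grid, the cell assignments, and the function names are all assumptions.

```python
# Hypothetical sketch of the multi-region contact determination around the
# notch. Regions R1-R3 (from the description) are modeled as sets of sensor
# cell coordinates; all other names and values are illustrative assumptions.
REGIONS = {
    "R1": {(0, 0), (0, 1)},  # one side of the U-shaped notch
    "R2": {(1, 0), (1, 1)},  # bottom of the notch
    "R3": {(2, 0), (2, 1)},  # other side of the notch
}

def touched_regions(contact_points):
    """Return the names of the regions containing at least one contact point."""
    return {name for name, cells in REGIONS.items()
            if cells & set(contact_points)}

def should_control_camera(contact_points):
    """Control the camera only when two or more regions are touched,
    so that an accidental touch of a single region is ignored."""
    return len(touched_regions(contact_points)) >= 2
```

With this sketch, a touch landing only in R2 leaves the camera running, while a hand covering the notch touches several regions at once and triggers control.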
  • FIG. 1 is a block diagram showing the main configuration of the information terminal 1.
  • the information terminal 1 includes a touch panel 2, a sensor 11, a camera 12, a control unit 13, and a storage unit 14.
  • the touch panel 2 receives the touch operation of the user on the display screen.
  • the touch panel 2 is a free form display whose shape can be freely designed.
  • The touch panel 2 is, for example, an integrated display device and touch panel device. By touching the display unit 20 with a finger, the user can perform operations such as selecting icons displayed on the display unit 20 and scrolling the screen.
  • the shape of the touch panel 2 is a shape including the notch F1, as shown in (a) and (b) of FIG.
  • The notch F1 is formed at the upper center of the touch panel 2 and is formed so as to cut out a part of the display surface of the touch panel 2.
  • In the present embodiment, the notch F1 is formed at the upper center of the touch panel 2; however, as long as the notch F1 is formed on at least one side of the outer edge of the touch panel 2, its location is not particularly limited.
  • For example, the notch F1 may be formed at the upper left end or the upper right end of the touch panel 2, or may be formed at the lower, left, or right edge of the touch panel 2.
  • the notch F1 may be formed to be shifted to the left or right from the upper center of the touch panel 2.
  • the number of the notches F1 formed in the touch panel 2 is not particularly limited.
  • two or more notches F1 may be formed on the top of the touch panel 2.
  • Although the shape of the notch F1 is a U-shape (a square notch) in (a) and (b) of FIG. 2, the shape is not particularly limited.
  • the shape of the notch F1 may be semicircular or rectangular.
  • the sensor 11 detects a touch of the user.
  • a part of the touch panel 2 is a sensor 11.
  • the sensor 11 may be a sensor independent of the touch panel 2.
  • the sensor 11 is not particularly limited as long as it can detect the touch of the user, but may be, for example, a sensor using a capacitance, an infrared sensor, or the like.
  • The sensor 11 is disposed within a predetermined range from the installation position of the camera 12.
  • the predetermined range in which the sensor 11 is disposed may be, for example, a range of 2 to 3 mm from the edge of the camera 12.
  • the predetermined range in which the sensor 11 is disposed may be appropriately changed according to the convenience of design.
  • the sensor 11 transmits the detected value to the input operation determination unit (contact determination unit) 131.
  • the camera 12 executes a function of imaging an object.
  • the camera 12 receives an instruction of the camera control unit (function execution control unit) 132, and executes the function according to the instruction.
  • the camera 12 is installed in the notch F1.
  • The control unit 13 integrally controls each unit of the information terminal 1.
  • the control unit 13 includes an input operation determination unit 131 and a camera control unit 132.
  • the input operation determination unit 131 determines whether or not simultaneous contact has been made to two or more of the plurality of areas. When simultaneous contact is performed on two or more areas, the input operation determination unit 131 transmits a message to that effect to the camera control unit 132.
  • the camera control unit 132 controls function execution of the camera 12.
  • the camera control unit 132 controls the function execution of the camera 12 in accordance with the touch on a plurality of areas. Specifically, when simultaneous contact is performed on two or more of the plurality of areas, the camera control unit 132 ends the camera 12 (imaging by the camera 12). When the camera control unit 132 ends the imaging by the camera 12, the camera control unit 132 updates the camera state information 141 stored in the storage unit 14.
  • FIG. 3 is a flowchart showing an example of the flow of processing executed by the information terminal 1.
  • the information terminal 1 may start the process.
  • First, the input operation determination unit 131 determines whether the camera 12 is activated (S1).
  • When the camera 12 is activated (YES in S1), the input operation determination unit 131 determines whether or not simultaneous contact has been made with two or more of the plurality of regions (S2: contact determination step).
  • When simultaneous contact is made with two or more of the plurality of regions (YES in S2), the camera control unit 132 ends the camera 12 (imaging by the camera 12) (S3: function execution control step), and the process ends.
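The S1–S3 flow can be summarized in a short sketch. The class below is an assumption for illustration only; the patent specifies the flowchart, not an implementation, and the attribute name merely stands in for the camera state information 141.

```python
# Illustrative sketch of the Embodiment 1 flow (S1-S3). The class and
# attribute names are assumptions; only the control logic follows the text.
class CameraController:
    def __init__(self):
        self.camera_active = True  # stands in for camera state information 141

    def handle_contact(self, regions_touched):
        # S1: act only while the camera is activated
        if not self.camera_active:
            return
        # S2: simultaneous contact with two or more regions?
        if len(regions_touched) >= 2:
            # S3: end imaging and record the new camera state
            self.camera_active = False

ctrl = CameraController()
ctrl.handle_contact({"R2"})        # single region: imaging continues
ctrl.handle_contact({"R1", "R3"})  # two regions: imaging ends
```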
  • In the present embodiment, an example of control of the camera 12 has been shown.
  • However, the configuration shown in the present embodiment may be applied to control of devices other than cameras.
  • That is, the present invention can be applied to control of any device to which a touch operation, which is an intuitive operation, can be applied.
  • The information terminal 1b determines a long press operation and a double tap operation, and controls function execution of the camera 12.
  • FIG. 4 is a block diagram showing the main configuration of the information terminal 1b.
  • The information terminal 1b includes a touch panel 2, a sensor 11, a camera 12, a control unit 13b, a storage unit 14, and a timer 15b.
  • the touch panel 2, the sensor 11, the camera 12, and the storage unit 14 have been described in detail in the first embodiment, and thus the description thereof is omitted here.
  • The control unit 13b includes an input operation determination unit 131b and a camera control unit 132b.
  • The input operation determination unit 131b performs the following processing in addition to the processing of the input operation determination unit 131 described above.
  • the input operation determination unit 131b determines whether contact to two or more regions among the plurality of regions has continued for a first predetermined time. That is, the input operation determination unit 131b determines whether the long press operation has been performed.
  • the input operation determination unit 131b simultaneously detects two or more of the plurality of regions within a second predetermined time from the time when two or more of the plurality of regions are simultaneously touched. It is determined whether the contact has been made again. That is, the input operation determination unit 131b determines whether a double tap operation has been performed.
  • The input operation determination unit 131b refers to the timer 15b indicating the elapse of time, and determines the elapse of the first predetermined time and the second predetermined time described above. More specifically, the input operation determination unit 131b refers to the camera state information 141, stored in the storage unit 14, indicating the activation state of the camera 12, and determines whether the camera 12 is activated. When the camera 12 is activated, the input operation determination unit 131b determines whether a long press operation has been performed. When the camera 12 is not activated, the input operation determination unit 131b determines whether a double tap operation has been performed. When the long press operation or the double tap operation is performed, the input operation determination unit 131b notifies the camera control unit 132b that the operation has been performed.
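The long-press determination using the timer can be sketched as follows. The detector class, the injectable clock, and the 1.0-second threshold are all assumptions for illustration; the patent only names a "first predetermined time" and the timer 15b abstractly.

```python
import time

# Illustrative long-press detector (Embodiment 2). The threshold value and
# all names are assumptions; the clock parameter stands in for the timer 15b.
FIRST_PREDETERMINED_TIME = 1.0  # seconds (illustrative value)

class LongPressDetector:
    def __init__(self, now=time.monotonic):
        self._now = now
        self._press_start = None

    def update(self, regions_touched):
        """Feed the currently touched regions; return True once contact with
        two or more regions has persisted for the predetermined time."""
        if len(regions_touched) >= 2:
            if self._press_start is None:
                self._press_start = self._now()
            return self._now() - self._press_start >= FIRST_PREDETERMINED_TIME
        self._press_start = None  # contact broken: reset
        return False
```

Injecting the clock keeps the sketch testable; a real device would poll this from its touch-event loop.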
  • When contact with two or more of the plurality of areas continues for a first predetermined time (that is, when a long press operation is performed), the camera control unit 132b controls function execution of the camera 12. In the present embodiment, in particular, when the long press operation is performed, the camera control unit 132b ends the camera 12 (imaging by the camera 12).
  • When the double tap operation is performed, the camera control unit 132b also controls function execution of the camera 12.
  • In this case, the camera control unit 132b activates the camera 12 (starts imaging by the camera 12). That is, the camera control unit 132b controls execution of the function of the camera 12 corresponding to the long press operation or the double tap operation.
  • FIG. 5 is a flowchart showing an example of the flow of processing performed by the information terminal 1 b.
  • the processes of S1 and S2 are the same as the processes described in the first embodiment, and thus detailed description thereof will be omitted.
  • When the camera 12 is activated (YES in S1) and two or more of the plurality of areas are simultaneously touched (YES in S2), the process proceeds to S11.
  • the input operation determination unit 131b determines whether contact to two or more areas has continued for a first predetermined time (S11: contact determination step). That is, in S2 to S11, the input operation determination unit 131b determines whether the long press operation has been performed.
  • When contact with the two or more regions has continued for the first predetermined time (YES in S11), the camera control unit 132b ends imaging by the camera 12 (S3), and the process ends. If simultaneous contact is not made with two or more of the plurality of regions (NO in S2), or if contact with the two or more regions does not continue for the first predetermined time (NO in S11), the process returns to S2.
  • When the camera 12 is not activated (NO in S1), the input operation determination unit 131b determines whether or not simultaneous contact has been made with two or more of the plurality of regions (S12: contact determination step).
  • When simultaneous contact is made (YES in S12), the input operation determination unit 131b determines whether simultaneous contact with two or more of the plurality of regions has been made again within a second predetermined time (S13: contact determination step). That is, in S12 to S13, the input operation determination unit 131b determines whether a double tap operation has been performed.
  • In other words, the input operation determination unit 131b determines, from the detection values of the sensor 11, whether a series of contact, non-contact, and contact operations has been performed on two or more regions within the predetermined time. If simultaneous contact with two or more of the plurality of regions is made again within the second predetermined time (YES in S13), the camera control unit 132b activates the camera 12 (starts imaging by the camera 12) (S14: function execution control step), and the process ends. If simultaneous contact is not made with two or more regions (NO in S12), or if simultaneous contact with two or more of the plurality of regions is not made again within the second predetermined time (NO in S13), the process returns to S12. This process may also end when no touch operation on the peripheral area of the camera 12 is received.
  • For example, this process may end when the application using the camera 12 ends. When the process of S3 or S14 ends, the process may return to S1. Furthermore, the processing of S2 to S11 may be exchanged with the processing of S12 to S13; that is, the camera 12 may be ended when the double tap operation is performed, and activated when the long press operation is performed.
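The double-tap determination of S12–S13 can be sketched similarly. Everything here besides the "second simultaneous contact within a window" rule is an assumption: the class name, the injectable clock, and the 0.5-second value for the "second predetermined time".

```python
import time

# Illustrative double-tap detector (S12-S13). Names and the threshold value
# are assumptions; only the "second simultaneous contact within a second
# predetermined time" rule follows the text.
SECOND_PREDETERMINED_TIME = 0.5  # seconds (illustrative value)

class DoubleTapDetector:
    def __init__(self, now=time.monotonic):
        self._now = now
        self._last_tap = None

    def on_simultaneous_contact(self):
        """Call when two or more regions are touched at once; return True
        when this contact completes a double tap."""
        t = self._now()
        if self._last_tap is not None and t - self._last_tap <= SECOND_PREDETERMINED_TIME:
            self._last_tap = None  # double tap recognized: reset
            return True
        self._last_tap = t         # remember the first tap (S12)
        return False
```

A contact arriving after the window has passed simply becomes the first tap of a new candidate double tap, matching the "return to S12" branch.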
  • In the present embodiment, an example of control of activation and termination of the camera 12 has been shown.
  • However, the configuration may also be applied, for example, to turning the mobile light on and off.
  • That is, the configuration of the present embodiment can be applied to control of any device to which a touch operation, which is an intuitive operation, can be applied.
  • the information terminal 1c controls the execution of the function of the microphone (function execution unit) 16c.
  • FIG. 6 is a block diagram showing the main configuration of the information terminal 1c.
  • The information terminal 1c includes a touch panel 2, a sensor 11c, a microphone 16c, a control unit 13c, a storage unit 14c, and a timer 15b.
  • The touch panel 2 and the timer 15b have been described in detail in Embodiments 1 and 2, and their description is therefore omitted here.
  • While the sensor 11 of Embodiments 1 and 2 is disposed within a predetermined range from the installation position of the camera 12, the sensor 11c is disposed within a predetermined range from the installation position of the microphone 16c.
  • The other aspects of the configuration of the sensor 11c are similar to those of the sensor 11.
  • the microphone 16c picks up sound.
  • The microphone 16c receives an instruction from the microphone control unit (function execution control unit) 133c and is put into a mute state according to the instruction. Like the camera 12 described in Embodiment 1, the microphone 16c may be installed in the notch F1 shown in (a) of FIG. 2.
  • the control unit 13c includes an input operation determination unit 131c and a microphone control unit 133c.
  • The input operation determination unit 131c refers to the microphone state information 142c, stored in the storage unit 14c, indicating whether the function of the microphone 16c is enabled (whether or not it is muted), and determines whether the microphone 16c is enabled. When the function of the microphone 16c is enabled (sound is being collected), the input operation determination unit 131c determines, as does the input operation determination unit 131b of Embodiment 2, whether a long press operation by a touch operation has been performed. When the long press operation is performed, the input operation determination unit 131c notifies the microphone control unit 133c that the long press operation has been performed.
  • the microphone control unit 133c controls the function execution of the microphone 16c when the long press operation is performed. In detail, when the long press operation is performed, the microphone control unit 133c mutes the microphone 16c. Further, when the microphone control unit 133c mutes the microphone 16c, the microphone control unit 133c updates the microphone state information 142c stored in the storage unit 14c.
  • FIG. 7 is a flowchart showing an example of the flow of processing performed by the information terminal 1c.
  • the information terminal 1c may start the process.
  • the input operation determination unit 131c determines whether the function of the microphone 16c is valid (S21).
  • When the function of the microphone 16c is enabled (YES in S21), the input operation determination unit 131c determines whether or not simultaneous contact has been made with two or more of the plurality of regions (S22: contact determination step). When simultaneous contact is made with two or more of the plurality of regions (YES in S22), the input operation determination unit 131c determines whether the contact with the two or more regions has continued for the first predetermined time (S23: contact determination step). That is, in S22 to S23, the input operation determination unit 131c determines whether the long press operation has been performed.
  • When the contact has continued for the first predetermined time (YES in S23), the microphone control unit 133c disables (mutes) the function of the microphone 16c (S24: function execution control step), and the process ends. This process may also end when no touch operation on the peripheral area of the microphone 16c is received; for example, it may end when the application using the microphone 16c ends. When the processes of S21 to S24 end, the process may return to S21.
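The S21–S24 microphone flow reduces to a small sketch. The class is illustrative only: the long-press decision is passed in as a boolean (in the flow it would come from the determination in S22–S23), and the attribute stands in for the microphone state information 142c.

```python
# Illustrative sketch of the Embodiment 3 flow (S21-S24). Names are
# assumptions; the long-press decision (S22-S23) is supplied as a boolean.
class MicrophoneController:
    def __init__(self):
        self.mic_enabled = True  # stands in for microphone state information 142c

    def handle_long_press(self, long_press_detected):
        # S21: act only while the microphone function is enabled
        if not self.mic_enabled:
            return
        # S22-S24: a completed long press disables (mutes) the microphone
        if long_press_detected:
            self.mic_enabled = False

mic = MicrophoneController()
mic.handle_long_press(False)  # no long press: microphone stays enabled
mic.handle_long_press(True)   # long press: microphone muted
```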
  • The present invention may also be applied, for example, to control that turns off the volume of a speaker.
  • That is, the configuration of the present embodiment can be applied to control of any device to which a touch operation, which is an intuitive operation, can be applied.
  • The control blocks of the information terminals 1, 1b, and 1c (in particular, the input operation determination units 131, 131b, and 131c, the camera control units 132 and 132b, and the microphone control unit 133c) may be realized by a logic circuit (hardware) formed in an IC chip or the like, or may be realized by software.
  • the information terminals 1, 1b, and 1c include a computer that executes instructions of a program that is software that implements each function.
  • the computer includes, for example, one or more processors, and a computer readable recording medium storing the program.
  • the processor reads the program from the recording medium and executes the program to achieve the object of the present invention.
  • As the processor, for example, a CPU (Central Processing Unit) can be used.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used, for example, a ROM (Read Only Memory).
  • The computer may further include a RAM (Random Access Memory) into which the program is loaded.
  • the program may be supplied to the computer via any transmission medium (communication network, broadcast wave, etc.) capable of transmitting the program.
  • one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
  • The electronic device (information terminal 1, 1b, 1c) according to aspect 1 of the present invention includes a function execution unit (camera 12, microphone 16c) that executes a function of the device itself, a sensor that is disposed within a predetermined range from the function execution unit and detects a user's contact, a contact determination unit that determines, from the detection value of the sensor, contact with a plurality of regions set within the predetermined range, and a function execution control unit that controls function execution of the function execution unit.
  • According to the above configuration, function execution is controlled according to contact with the plurality of regions set within the predetermined range from the function execution unit.
  • the function execution control unit does not control the function execution in the case of contact with only one area set within the predetermined range. Therefore, even when the user touches the predetermined range without intention of control of the function execution, the function execution is not controlled unless the plurality of areas in the predetermined range are touched. Therefore, it is possible to reduce an erroneous operation caused by the user touching the predetermined range without intending to control the function execution.
  • In the electronic device according to one aspect of the present invention, the electronic device may include a touch panel (2) having a shape including a notch (F1) in a part thereof, the function execution unit may be installed in the notch, the sensor may be configured by a partial area of the touch panel, and the regions may be areas within a predetermined range from an edge of the touch panel forming the notch.
  • the touch panel and the sensor can be made common. Further, in the configuration in which the function execution unit is provided in the notch of the touch panel, it is considered that the user has a lot of chances of coming into contact with the predetermined range by mistake from the function execution unit. According to the above configuration, the function execution control unit does not control the function execution in the case of contact with only one area set within the predetermined range. Therefore, even in a configuration in which the function execution unit is provided in the notch where the user may accidentally contact the predetermined range from the function execution unit, the user does not intend to control the function execution. Erroneous operation when touching within the range of can be reduced.
  • In the electronic device according to one aspect of the present invention, the function execution control unit may control function execution of the function execution unit when two or more of the plurality of areas are simultaneously touched.
  • According to the above configuration, when two or more of the plurality of areas are simultaneously touched, the function execution control unit controls execution of the function. In other words, when contact is made with only one of the plurality of areas, function execution by the function execution unit is not controlled. In addition, even if contact is made with two or more regions, function execution by the function execution unit is not controlled unless the contact is simultaneous. Therefore, it is possible to reduce erroneous operations when the user touches the predetermined range without intending to control function execution.
  • In the electronic device according to one aspect of the present invention, the function execution control unit (camera control unit 132b, microphone control unit 133c) may control function execution of the function execution unit when the contact continues for a first predetermined time.
  • According to the above configuration, when the contact continues for the first predetermined time, the function execution control unit controls function execution. In other words, if the contact does not continue for the predetermined time, function execution by the function execution unit is not controlled. Therefore, it is possible to reduce erroneous operations when the user touches the predetermined range without intending to control function execution.
  • In the electronic device (information terminal 1b) according to one aspect of the present invention, when simultaneous contact with two or more of the plurality of regions is made again within a second predetermined time from the time of simultaneous contact with two or more of the plurality of regions, the function execution control unit may control function execution of the function execution unit.
  • simultaneous contact to two or more of the plurality of regions is performed within a second predetermined time from the time of simultaneous contact to two or more of the plurality of regions. If it does, the function execution control unit controls the function execution of the function execution unit. In other words, if there is no simultaneous contact with two or more regions among the plurality of regions within the second predetermined time from simultaneous contact with two or more regions among the plurality of regions, the function execution unit executes the function Is not controlled. Therefore, it is possible to reduce an erroneous operation when the user touches the predetermined range without intending to control the function execution.
  • A control method of an electronic device according to one aspect of the present invention includes a contact determination step (S2, S11, S12, S13, S22, S23) of determining, from a detection value of a sensor that is disposed within a predetermined range from a function execution unit executing a function of the device itself and detects contact, contact with a plurality of regions set within the predetermined range, and a function execution control step (S3, S14, S24) of controlling function execution of the function execution unit according to the contact with the plurality of regions.
  • the electronic device may be realized by a computer, and in this case, the computer is realized by the computer as a component (software element) included in the electronic device.
  • a control program of an electronic device and a computer readable recording medium recording the same also fall within the scope of the present invention.

Abstract

The present invention reduces erroneous operations caused by detection of a user's erroneous contact. This electronic device is provided with: a camera (12) that executes a function; a sensor (11) that is disposed within a prescribed range from the camera and detects contact; and a camera control unit (132) that controls execution of the function of the camera in accordance with contact with a plurality of regions set within the prescribed range.

Description

Electronic device, control method of electronic device, and program
 The present invention relates to an electronic device provided with a device that executes a function of the electronic device.
 Technologies have been developed in which a sensor is provided around a device included in an information terminal and the detection value of the sensor is used when executing the function of the device. For example, Patent Document 1 discloses a portable terminal that, when contact of a finger is detected around the camera, displays a warning indicating that the finger appears in the captured image.
Japanese Unexamined Patent Publication No. 2011-217101 (published October 27, 2011)
 However, in the configuration for detecting finger contact disclosed in Patent Document 1, finger contact is detected even when the user merely touches the periphery of the camera lens.
 Therefore, if the contact detection configuration disclosed in Patent Document 1 is applied to a configuration in which a user operates a device by touching the periphery of the device provided in an information terminal or the like, the following problem arises: when the user accidentally touches the periphery of the device, an operation not intended by the user is performed on the device.
 The present invention has been made in view of the above problems, and an object of one aspect of the present invention is to realize an information terminal capable of reducing erroneous operations caused by detecting a user's erroneous contact.
 In order to solve the above problems, an electronic device according to one aspect of the present invention includes: a function execution unit that executes a function provided to the device itself; a sensor, arranged within a predetermined range from the function execution unit, that detects a user's contact; a contact determination unit that determines, from the detection value of the sensor, the contact with a plurality of regions set within the predetermined range; and a function execution control unit that controls the function execution of the function execution unit according to the contact with the plurality of regions.
 In order to solve the above problems, a method for controlling an electronic device according to one aspect of the present invention includes: a contact determination step of determining, from the detection value of a sensor that detects contact and is arranged within a predetermined range from a function execution unit that executes a function provided to the device itself, the contact with a plurality of regions set within the predetermined range; and a function execution control step of controlling the function execution of the function execution unit according to the contact with the plurality of regions.
 According to one aspect of the present invention, it is possible to reduce erroneous operations caused when the user unintentionally makes contact during a contact operation.
FIG. 1 is a block diagram showing the main configuration of an information terminal according to Embodiment 1. FIGS. 2(a) and 2(b) are diagrams showing an overview of the information terminal according to Embodiment 1. FIG. 3 is a flowchart showing an example of the flow of processing executed by the information terminal according to Embodiment 1. FIG. 4 is a block diagram showing the main configuration of an information terminal according to Embodiment 2. FIG. 5 is a flowchart showing an example of the flow of processing executed by the information terminal according to Embodiment 2. FIG. 6 is a block diagram showing the main configuration of an information terminal according to Embodiment 3. FIG. 7 is a flowchart showing an example of the flow of processing executed by the information terminal according to Embodiment 3. FIG. 8 is a diagram showing an overview of an information terminal according to a reference embodiment.
Embodiment 1
 Hereinafter, an embodiment of the present invention will be described in detail with reference to FIGS. 1 to 3 and FIG. 8.
(Overview of information terminal 1)
 First, an overview of the information terminal (electronic device) 1 according to the present embodiment will be described with reference to FIGS. 2 and 8. FIG. 2 is a view showing an overview of the information terminal 1, and FIG. 8 is a view showing an overview of an information terminal 100 according to a reference embodiment. First, the reference embodiment will be described with reference to FIG. 8. As shown in FIG. 8, the information terminal 100 includes a touch panel 102, a sensor 111, and a camera 112. The shape of the touch panel 102 includes a notch F100, and the sensor 111 and the camera 112 are disposed in the notch F100. The sensor 111 detects contact of a user's finger and is disposed so as to surround the camera 112. The crosses in FIG. 8 indicate the user's contact points in the peripheral area of the camera 112. When the sensor 111 detects the user's contact in the peripheral area of the camera 112, the information terminal 100 ends imaging by the camera 112. The information terminal 100 ends imaging by the camera 112 whenever contact is detected, regardless of the number of contact points or the touched areas in the peripheral area of the camera 112.
 Next, an overview of the information terminal 1 of the present embodiment will be described with reference to FIG. 2. FIGS. 2(a) and 2(b) are views showing an overview of the information terminal 1.
 As shown in FIG. 2, the information terminal 1 includes a touch panel 2, a sensor 11 that detects contact, and a camera (function execution unit) 12. The information terminal 1 may be, for example, a tablet terminal, a smartphone, or the like. The shape of the touch panel 2 includes a notch F1, and the camera 12 is disposed in the notch F1. In the example shown in FIG. 2, a partial area of the touch panel 2 functions as the sensor 11 (the sensor 11 is constituted by a partial area of the touch panel 2). Specifically, the area of the touch panel 2 within a predetermined range from the edge of the touch panel 2 forming the notch F1 functions as the sensor 11; in other words, the sensor 11 is disposed within a predetermined range from the camera 12. As shown in FIGS. 2(a) and 2(b), an area R1, an area R2, and an area R3 are set in the area where the sensor 11 is disposed. In the example shown in FIG. 2, the notch F1 of the touch panel 2 is U-shaped, and the areas R1, R2, and R3 are set in areas within a predetermined range from the edges forming the respective sides of the U-shape.
 As shown in FIG. 2(a), in the information terminal 1, for example, when the user's finger U touches only the area R2, imaging by the camera 12 is not ended. On the other hand, as shown in FIG. 2(b), when the user's finger U touches the areas R1, R2, and R3, the information terminal 1 ends imaging by the camera 12; for example, when the user covers the camera 12 with a hand, imaging by the camera 12 ends. The information terminal 1 may also be configured to end imaging by the camera 12 when the user's finger U touches two or more of the areas R1, R2, and R3. That is, the information terminal 1 determines, from the detection value of the sensor 11, contact with a plurality of areas set within the predetermined range, and controls the function execution of the camera 12 according to the contact with the plurality of areas. In the above example, the areas R1, R2, and R3 are set for each side of the touch panel 2 forming the notch F1, but these areas can be set arbitrarily. The information terminal 1 may operate so as to control the function execution of the camera 12 when the touch panel 2 is active (normal mode: a display screen is displayed), or when the touch panel 2 is in the save mode (no display screen is displayed).
 According to the above configuration, the function execution of the camera 12 is controlled according to contact with the areas R1, R2, and R3, which are a plurality of areas set within a predetermined range from the camera 12. In other words, the information terminal 1 does not control the function execution of the camera 12 upon contact with only one of the areas set within the predetermined range. Therefore, even when the user touches the predetermined range in which the sensor 11 is disposed without intending to control the function execution of the camera 12, the function execution of the camera 12 is not controlled unless a plurality of the areas R1, R2, and R3 are touched. Accordingly, it is possible to reduce erroneous operations caused by the user unintentionally touching the range in which the sensor 11 is disposed.
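The multi-region decision described above can be sketched in a few lines. The following Python fragment is an illustration only, not the disclosed implementation; the function name and the fixed set of region names are assumptions for the example.

```python
# Minimal sketch of the multi-region contact check described above.
# Region names R1-R3 follow FIG. 2; the function name is hypothetical.

def should_stop_imaging(touched_regions, min_regions=2):
    """Return True only when at least `min_regions` of the sensor
    areas around the camera are touched simultaneously."""
    regions = {"R1", "R2", "R3"}
    touched = regions & set(touched_regions)
    return len(touched) >= min_regions

# A touch on a single area is ignored, reducing erroneous operations:
assert should_stop_imaging({"R2"}) is False
# Covering the camera touches several areas at once:
assert should_stop_imaging({"R1", "R2", "R3"}) is True
```

Requiring two or more areas is what distinguishes an intentional "cover the camera" gesture from an accidental brush against a single area.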
(Configuration of information terminal 1)
 Next, the configuration of the information terminal 1 will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the main configuration of the information terminal 1. As shown in FIG. 1, the information terminal 1 includes a touch panel 2, a sensor 11, a camera 12, a control unit 13, and a storage unit 14.
(Touch panel 2)
 The touch panel 2 receives the user's contact operations on the display screen. In particular, in the present embodiment, the touch panel 2 is a free-form display whose shape can be designed freely.
 The touch panel 2 is, for example, a display device and a touch panel device integrated together. By touching the display unit 20 with a finger, the user can operate icons and the like displayed on the display unit 20, scroll the screen, and so on. As shown in FIGS. 2(a) and 2(b), the shape of the touch panel 2 includes the notch F1.
 The notch F1 is formed at the upper center of the touch panel 2 so as to cut out a part of the display surface of the touch panel 2. Although the notch F1 is formed at the upper center of the touch panel 2 in FIGS. 2(a) and 2(b), the position where the notch F1 is formed is not particularly limited as long as it is formed on at least one side of the outer edge of the touch panel 2.
 For example, the notch F1 may be formed at the upper left end or upper right end of the touch panel 2, or at the lower edge, left edge, or right edge of the touch panel 2. The notch F1 may also be formed shifted to the left or right from the upper center of the touch panel 2. Furthermore, the number of notches F1 formed in the touch panel 2 is not particularly limited; for example, two or more notches F1 may be formed in the upper part of the touch panel 2. Although the notch F1 is U-shaped (a square cutout) in FIGS. 2(a) and 2(b), its shape is not particularly limited and may be, for example, semicircular or rectangular.
(Sensor 11)
 The sensor 11 detects the user's contact. In the present embodiment, as shown in FIGS. 2(a) and 2(b), a part of the touch panel 2 serves as the sensor 11. The sensor 11 may instead be a sensor independent of the touch panel 2. The sensor 11 is not particularly limited as long as it can detect the user's contact; for example, it may be a capacitive sensor, an infrared sensor, or the like.
 The sensor 11 is disposed within a predetermined range from the installation position of the camera 12. Here, the predetermined range in which the sensor 11 is disposed may be, for example, a range of 2 to 3 mm from the edge of the camera 12, and may be changed as appropriate according to design requirements. The sensor 11 transmits its detection value to an input operation determination unit (contact determination unit) 131.
(Camera 12)
 The camera 12 executes the function of imaging a subject. The camera 12 receives instructions from a camera control unit (function execution control unit) 132 and executes its function according to the instructions. As shown in FIG. 2(a), the camera 12 is installed in the notch F1.
(Control unit 13)
 The control unit 13 integrally controls each part of the information terminal 1. The control unit 13 includes the input operation determination unit 131 and the camera control unit 132.
(Input operation determination unit 131)
 The input operation determination unit 131 determines, from the detection value of the sensor 11, the user's contact with a plurality of areas set within the predetermined range in which the sensor 11 is disposed. The plurality of areas are, for example, the areas R1, R2, and R3 shown in FIG. 2. Specifically, the input operation determination unit 131 determines whether two or more of the plurality of areas have been touched simultaneously. In more detail, the input operation determination unit 131 refers to camera state information 141, which is stored in the storage unit 14 and indicates the activation state of the camera 12, and determines whether the camera 12 is activated (whether the camera 12 is performing imaging processing). When the camera 12 is activated, the input operation determination unit 131 determines whether two or more of the plurality of areas have been touched simultaneously; when they have, it notifies the camera control unit 132 to that effect.
(Camera control unit 132)
 The camera control unit 132 controls the function execution of the camera 12 according to contact with the plurality of areas. Specifically, when two or more of the plurality of areas are touched simultaneously, the camera control unit 132 terminates the camera 12 (imaging by the camera 12). When the camera control unit 132 terminates imaging by the camera 12, it updates the camera state information 141 stored in the storage unit 14.
(Flow of processing of information terminal 1)
 Next, the processing of the information terminal 1 will be described with reference to FIG. 3. FIG. 3 is a flowchart showing an example of the flow of processing executed by the information terminal 1. The information terminal 1 may start this processing when contact operations on the peripheral area of the camera 12 become acceptable, for example, when an application using the camera 12 is activated. As shown in FIG. 3, the input operation determination unit 131 determines whether the camera 12 is activated (S1). When the camera 12 is activated (YES in S1), the input operation determination unit 131 determines whether two or more of the plurality of areas have been touched simultaneously (S2: contact determination step). When two or more of the plurality of areas have been touched simultaneously (YES in S2), the camera control unit 132 terminates the camera 12 (imaging by the camera 12) (S3: function execution control step), and the processing ends. When the camera 12 is not activated (NO in S1), the processing ends. When two or more of the plurality of areas have not been touched simultaneously (NO in S2), the process of S2 is repeated.
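The steps S1 to S3 above can be expressed as a small state check. The following Python sketch is illustrative only; the `CameraState` class and the returned action strings are assumptions standing in for the camera state information 141 and the flowchart outcomes.

```python
# Illustrative sketch of the S1-S3 flow in FIG. 3 (not the actual firmware).
# `CameraState` stands in for the camera state information 141.

class CameraState:
    def __init__(self, active):
        self.active = active  # True while the camera is imaging

def process_contact(state, simultaneous_regions):
    """S1: check activation; S2: check simultaneous contact with two
    or more areas; S3: end imaging. Returns the action taken."""
    if not state.active:             # S1: NO -> processing ends
        return "no-op"
    if simultaneous_regions >= 2:    # S2: YES
        state.active = False         # S3: end imaging, update state 141
        return "imaging-ended"
    return "waiting"                 # S2: NO -> S2 is repeated

state = CameraState(active=True)
assert process_contact(state, simultaneous_regions=1) == "waiting"
assert process_contact(state, simultaneous_regions=2) == "imaging-ended"
assert state.active is False
```

Updating the state inside the same step mirrors the flowchart: once S3 runs, a repeated call falls through S1 and does nothing.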
 In the present embodiment, control of the camera 12 has been described as an example. However, the configuration described in the present embodiment may be applied to control of devices other than a camera, and in particular to control of any device to which contact operation, an intuitive form of operation, can be applied.
Embodiment 2
 Another embodiment of the present invention will be described below with reference to FIGS. 4 and 5. For convenience of description, members having the same functions as those described in the above embodiment are denoted by the same reference signs, and their description is omitted. An information terminal 1b according to the present embodiment determines long-press operations and double-tap operations, and controls the function execution of the camera 12 accordingly.
(Configuration of information terminal 1b)
 The configuration of the information terminal 1b will be described with reference to FIG. 4. FIG. 4 is a block diagram showing the main configuration of the information terminal 1b. As shown in FIG. 4, the information terminal 1b includes the touch panel 2, the sensor 11, the camera 12, a control unit 13b, the storage unit 14, and a timer 15b. The touch panel 2, the sensor 11, the camera 12, and the storage unit 14 have been described in detail in Embodiment 1, and their description is omitted here.
(Control unit 13b)
 The control unit 13b includes an input operation determination unit 131b and a camera control unit 132b.
(Input operation determination unit 131b)
 The input operation determination unit 131b performs the following processing in addition to the processing of the input operation determination unit 131 described above. The input operation determination unit 131b determines whether contact with two or more of the plurality of areas has continued for a first predetermined time, that is, whether a long-press operation has been performed. The input operation determination unit 131b also determines whether two or more of the plurality of areas have been touched simultaneously again within a second predetermined time from the time when two or more of the plurality of areas were first touched simultaneously, that is, whether a double-tap operation has been performed. The input operation determination unit 131b refers to the timer 15b, which indicates the passage of time, to determine whether the first predetermined time and the second predetermined time have elapsed. More specifically, the input operation determination unit 131b refers to the camera state information 141, which is stored in the storage unit 14 and indicates the activation state of the camera 12, and determines whether the camera 12 is activated. When the camera 12 is activated, the input operation determination unit 131b determines whether a long-press operation has been performed; when the camera 12 is not activated, it determines whether a double-tap operation has been performed. When a long-press operation or a double-tap operation has been performed, the input operation determination unit 131b notifies the camera control unit 132b to that effect.
(Camera control unit 132b)
 When contact with two or more of the plurality of areas has continued for the first predetermined time (a long-press operation has been performed), the camera control unit 132b controls the function execution of the camera 12; in the present embodiment, it terminates the camera 12 (imaging by the camera 12).
 When two or more of the plurality of areas have been touched simultaneously again within the second predetermined time from the first simultaneous contact (a double-tap operation has been performed), the camera control unit 132b likewise controls the function execution of the camera 12; in the present embodiment, it activates the camera 12 (starts imaging by the camera 12). That is, the camera control unit 132b controls the execution of the function of the camera 12 corresponding to the long-press operation or the double-tap operation.
(Flow of processing of information terminal 1b)
 Next, the processing of the information terminal 1b will be described with reference to FIG. 5. FIG. 5 is a flowchart showing an example of the flow of processing executed by the information terminal 1b. The processes of S1 and S2 are the same as those described in Embodiment 1, and their detailed description is omitted here. As shown in FIG. 5, when the camera 12 is activated (YES in S1) and two or more of the plurality of areas have been touched simultaneously (YES in S2), the processing continues to S11. The input operation determination unit 131b determines whether contact with the two or more areas has continued for the first predetermined time (S11: contact determination step). That is, in S2 to S11, the input operation determination unit 131b determines whether a long-press operation has been performed. When contact with the two or more areas has continued for the first predetermined time (YES in S11), the camera control unit 132b terminates imaging by the camera 12 (S3), and the processing ends. When two or more of the plurality of areas have not been touched simultaneously (NO in S2), or when contact with the two or more areas has not continued for the first predetermined time (NO in S11), the processing returns to S2.
When the camera 12 is not active (NO in S1), the input operation determination unit 131b determines whether simultaneous contact has been made with two or more of the plurality of areas (S12: contact determination step). If simultaneous contact has been made with two or more areas (YES in S12), the input operation determination unit 131b determines whether simultaneous contact with two or more of the plurality of areas has been made again within a second predetermined time (S13: contact determination step). That is, in S12 to S13, the input operation determination unit 131b determines whether a double-tap operation has been performed. In other words, in S12 to S13, the input operation determination unit 131b determines from the detection value of the sensor 11 whether a series of contact, non-contact, and contact operations has been performed on two or more areas within a fixed time.

If simultaneous contact with two or more of the plurality of areas is made again within the second predetermined time (YES in S13), the camera control unit 132b activates the camera 12 (starts imaging by the camera 12) (S14: function execution control step), and the process ends. If simultaneous contact is not made with two or more areas (NO in S12), or if simultaneous contact with two or more of the plurality of areas is not made again within the second predetermined time (NO in S13), the process returns to S12. This process may also end when touch operations on the area around the camera 12 are no longer accepted; for example, it may end when the application using the camera 12 ends. Alternatively, the process may return to S1 after S3 or S14 ends. Further, the processing of S2 to S11 and the processing of S12 to S13 may be interchanged; that is, the camera 12 may be stopped when a double-tap operation is performed and activated when a long-press operation is performed.
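The double-tap determination of S12 to S13 — contact, release, and renewed contact on two or more areas within the second predetermined time — can be sketched in the same style. The function name and the 0.4 s window are assumptions for illustration.

```python
SECOND_PREDETERMINED_TIME = 0.4  # assumed value for the "second predetermined time", in seconds

def is_double_tap(samples, window=SECOND_PREDETERMINED_TIME):
    """samples: list of (timestamp, number_of_areas_in_simultaneous_contact).

    Returns True when a contact / release / contact sequence on two or more
    areas completes within window seconds (S12 YES then S13 YES).
    """
    first_tap = None
    released = False
    for t, n in samples:
        if n >= 2:
            if first_tap is None:
                first_tap = t              # S12: first simultaneous contact
            elif released:
                if t - first_tap <= window:
                    return True            # S13: second contact arrived in time
                first_tap, released = t, False  # too late; treat as a new first tap
        elif first_tap is not None:
            released = True                # fingers lifted between the taps
    return False

# Two taps 0.2 s apart qualify; a 0.8 s gap exceeds the assumed window.
print(is_double_tap([(0.0, 2), (0.1, 0), (0.2, 2)]))   # True
print(is_double_tap([(0.0, 2), (0.1, 0), (0.8, 2)]))   # False
```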
In the present embodiment, an example of controlling the activation and termination of the camera 12 has been described. As another example, the same configuration may be applied to turning a mobile light on and off. In particular, the configuration of the present embodiment can be applied to the control of any device to which a touch operation, an intuitive operation, is applicable.
Third Embodiment

Another embodiment of the present invention is described below with reference to FIGS. 6 and 7. For convenience of explanation, members having the same functions as those described in the above embodiments are given the same reference signs, and their description is omitted. The information terminal 1c according to the present embodiment controls the execution of the function of a microphone (function execution unit) 16c.
(Configuration of information terminal 1c)

The configuration of the information terminal 1c will be described with reference to FIG. 6. FIG. 6 is a block diagram showing the main configuration of the information terminal 1c. As shown in FIG. 6, the information terminal 1c includes a touch panel 2, a sensor 11c, a microphone 16c, a control unit 13c, a storage unit 14c, and a timer 15b. The touch panel 2 and the timer 15b were described in detail in the first and second embodiments, so their description is omitted here. Whereas the sensor 11 of the first and second embodiments is disposed within a predetermined range from the installation position of the camera 12, the sensor 11c is disposed within a predetermined range from the installation position of the microphone 16c. The other configuration of the sensor 11c is the same as that of the sensor 11.
(Microphone 16c)

The microphone 16c collects sound. The microphone 16c receives an instruction from the microphone control unit (function execution control unit) 133c and enters a mute state in accordance with that instruction. As with the camera 12 described in the first embodiment, the microphone 16c may be installed in the notch F1 shown in (a) of FIG. 2.
(Control unit 13c)

The control unit 13c includes an input operation determination unit 131c and a microphone control unit 133c.
(Input operation determination unit 131c)

The input operation determination unit 131c refers to microphone state information 142c stored in the storage unit 14c, which indicates whether the function of the microphone 16c is enabled (that is, whether it is muted), and determines whether the microphone 16c is enabled. When the function of the microphone 16c is enabled (sound is being collected), the input operation determination unit 131c determines whether a long-press operation by touch has been performed, in the same manner as the input operation determination unit 131b of the second embodiment. When a long-press operation has been performed, the input operation determination unit 131c notifies the microphone control unit 133c that the long-press operation has been performed.
(Microphone control unit 133c)

The microphone control unit 133c controls the function execution of the microphone 16c when a long-press operation has been performed. Specifically, when a long-press operation has been performed, the microphone control unit 133c mutes the microphone 16c. When the microphone control unit 133c mutes the microphone 16c, it updates the microphone state information 142c stored in the storage unit 14c.
(Flow of processing of information terminal 1c)

Next, the processing of the information terminal 1c will be described with reference to FIG. 7. FIG. 7 is a flowchart showing an example of the flow of processing performed by the information terminal 1c. For example, the information terminal 1c may start this process when touch operations on the area around the microphone 16c become acceptable, such as when an application using the microphone 16c is activated. As shown in FIG. 7, the input operation determination unit 131c determines whether the function of the microphone 16c is enabled (S21). If the function of the microphone 16c is enabled (YES in S21), the input operation determination unit 131c determines whether simultaneous contact has been made with two or more of the plurality of areas (S22: contact determination step). If simultaneous contact has been made with two or more of the plurality of areas (YES in S22), the input operation determination unit 131c determines whether the contact with the two or more areas has continued for the first predetermined time (S23: contact determination step). That is, in S22 to S23, the input operation determination unit 131c determines whether a long-press operation has been performed. If the contact with the two or more areas has continued for the first predetermined time (YES in S23), the microphone control unit 133c disables (mutes) the function of the microphone 16c (S24: function execution control step), and the process ends. This process may also end when touch operations on the area around the microphone 16c are no longer accepted; for example, it may end when the application using the microphone 16c ends. Alternatively, the process may return to S21 after the processing of S21 to S24 ends.
If simultaneous contact is not made with two or more of the plurality of areas (NO in S22), or if the contact with the two or more areas does not continue for the first predetermined time (NO in S23), the process returns to S22.
In the present embodiment, an example of controlling the muting of the microphone 16c has been described. As another example, the same configuration may be applied to control that turns off the volume of a speaker. In particular, the configuration of the present embodiment can be applied to the control of any device to which a touch operation, an intuitive operation, is applicable.
[Example of software implementation]

The control blocks of the information terminals 1, 1b, and 1c (in particular, the input operation determination units 131, 131b, and 131c, the camera control units 132 and 132b, and the microphone control unit 133c) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software.
In the latter case, the information terminals 1, 1b, and 1c each include a computer that executes the instructions of a program, which is software implementing each function. The computer includes, for example, one or more processors and a computer-readable recording medium storing the program. The object of the present invention is achieved by the processor reading the program from the recording medium and executing it. As the processor, for example, a CPU (Central Processing Unit) can be used. As the recording medium, a "non-transitory tangible medium" can be used, for example, a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, a programmable logic circuit, or the like. A RAM (Random Access Memory) or the like for loading the program may be further provided. The program may be supplied to the computer via any transmission medium (a communication network, a broadcast wave, or the like) capable of transmitting the program. Note that one aspect of the present invention can also be realized in the form of a data signal, embedded in a carrier wave, in which the program is embodied by electronic transmission.
[Summary]

An electronic device (information terminal 1, 1b, 1c) according to aspect 1 of the present invention includes: a function execution unit (camera 12, microphone 16c) that executes a function provided in the device itself; a sensor (11) that detects a user's contact and is disposed within a predetermined range from the function execution unit; a contact determination unit (input operation determination unit 131, 131b, 131c) that determines, from the detection value of the sensor, the contact with a plurality of areas set within the predetermined range; and a function execution control unit (camera control unit 132, camera control unit 132b, microphone control unit 133c) that controls the function execution of the function execution unit according to the contact with the plurality of areas.
According to the above configuration, the function execution of the function execution unit is controlled according to contact with a plurality of areas set within a predetermined range from the function execution unit. In other words, the function execution control unit does not control function execution on contact with only one area set within the predetermined range. Therefore, even when the user touches the predetermined range without intending to control function execution, function execution is not controlled unless a plurality of areas within the predetermined range are touched. Accordingly, erroneous operations caused by the user touching the predetermined range without intending to control function execution can be reduced.
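The contact determination described here — triggering control only when two or more of the areas set within the predetermined range are touched — can be sketched as follows. The region layout, coordinates, and names are assumptions for illustration, not values from the patent.

```python
# Hypothetical areas set within the predetermined range around the
# function execution unit (e.g. left and right of a camera in a notch).
REGIONS = {
    "left_of_camera":  (0, 0, 40, 80),    # (x, y, width, height) in pixels
    "right_of_camera": (80, 0, 40, 80),
}

def touched_regions(touch_points):
    """Map raw (x, y) touch points to the set of areas they fall in."""
    hit = set()
    for x, y in touch_points:
        for name, (rx, ry, rw, rh) in REGIONS.items():
            if rx <= x < rx + rw and ry <= y < ry + rh:
                hit.add(name)
    return hit

def should_control_function(touch_points):
    # A touch in only one area (e.g. an accidental brush of a finger)
    # does not trigger the function execution control.
    return len(touched_regions(touch_points)) >= 2

print(should_control_function([(10, 10), (90, 10)]))  # True: two areas touched
print(should_control_function([(10, 10)]))            # False: only one area touched
```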
In the electronic device according to aspect 2 of the present invention, in aspect 1, the electronic device may include a touch panel (2) shaped to include a notch in part of it, the function execution unit may be installed in the notch (F1), and the sensor may be constituted by a partial area of the touch panel, the area being within a predetermined range from the edge of the touch panel forming the notch.

According to the above configuration, the touch panel and the sensor can be shared. Moreover, in a configuration in which the function execution unit is provided in a notch of the touch panel, the user is likely to have more opportunities to accidentally touch the predetermined range from the function execution unit. According to the above configuration, the function execution control unit does not control function execution on contact with only one area set within the predetermined range. Therefore, even in a configuration in which the function execution unit is provided in a notch, where accidental touches within the predetermined range are likely, erroneous operations caused by the user touching the predetermined range without intending to control function execution can be reduced.
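One way to realize the sensor as the touch-panel area within a predetermined range of the notch edge is a simple distance test against the notch geometry. The rectangular notch model and the 15-pixel threshold below are assumptions for illustration, not values from the patent.

```python
import math

NOTCH = (400, 0, 200, 60)   # (x, y, width, height) of the assumed cut-out F1
SENSOR_RANGE = 15           # assumed "predetermined range" from the notch edge, in pixels

def _dist_to_notch(x, y):
    """Distance from a point on the panel to the notch rectangle (0 if on/inside it)."""
    nx, ny, nw, nh = NOTCH
    dx = max(nx - x, 0, x - (nx + nw))
    dy = max(ny - y, 0, y - (ny + nh))
    return math.hypot(dx, dy)

def in_sensor_region(x, y):
    """True if a touch at (x, y) lands on the notch-edge sensor area."""
    return _dist_to_notch(x, y) <= SENSOR_RANGE

print(in_sensor_region(390, 30))  # True: 10 px from the notch edge
print(in_sensor_region(300, 30))  # False: well outside the predetermined range
```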
In the electronic device according to aspect 3 of the present invention, in aspect 1 or 2, the function execution control unit may control the function execution of the function execution unit when simultaneous contact is made with two or more of the plurality of areas.

According to the above configuration, the function execution of the function execution unit is controlled when simultaneous contact is made with two or more of the plurality of areas. In other words, when contact is made with only one of the plurality of areas, the function execution of the function execution unit is not controlled. Further, even when contact is made with two or more areas, if the contacts are not simultaneous, the function execution of the function execution unit is not controlled. Therefore, erroneous operations caused by the user touching the predetermined range without intending to control function execution can be reduced.
In the electronic device (information terminal 1b, information terminal 1c) according to aspect 4 of the present invention, in aspect 3, the function execution control unit (camera control unit 132b, microphone control unit 133c) may control the function execution of the function execution unit when the contact continues for a first predetermined time. According to the above configuration, the function execution of the function execution unit is controlled when the contact continues for the first predetermined time. In other words, when the contact does not continue for the predetermined time, the function execution of the function execution unit is not controlled. Therefore, erroneous operations caused by the user touching the predetermined range without intending to control function execution can be reduced.
In the electronic device (information terminal 1b) according to aspect 5 of the present invention, in aspect 3, the function execution control unit (camera control unit 132b) may control the function execution of the function execution unit when simultaneous contact with two or more of the plurality of areas is made again within a second predetermined time from the time when simultaneous contact with two or more of the plurality of areas occurred.

According to the above configuration, the function execution control unit controls the function execution of the function execution unit when simultaneous contact with two or more of the plurality of areas is made again within the second predetermined time from the time of the first simultaneous contact with two or more of the plurality of areas. In other words, when there is no further simultaneous contact with two or more of the plurality of areas within the second predetermined time from the first simultaneous contact, the function execution of the function execution unit is not controlled. Therefore, erroneous operations caused by the user touching the predetermined range without intending to control function execution can be reduced.
A method for controlling an electronic device according to aspect 6 of the present invention includes: a contact determination step (S2, S11, S12, S13, S22, S23) of determining, from the detection value of a sensor that detects contact and is disposed within a predetermined range from a function execution unit that executes a function provided in the device itself, the contact with a plurality of areas set within the predetermined range; and a function execution control step (S3, S14, S24) of controlling the function execution of the function execution unit according to the contact with the plurality of areas. According to the above configuration, the same effects as those of aspect 1 can be obtained.
The electronic device according to each aspect of the present invention may be realized by a computer. In this case, a control program of the electronic device that causes the computer to realize the electronic device by operating the computer as each unit (software element) provided in the electronic device, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
The present invention is not limited to the above-described embodiments, and various modifications are possible within the scope of the claims. Embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention. Furthermore, new technical features can be formed by combining the technical means disclosed in each embodiment.
1, 1b, 1c  Information terminal (electronic device)
2  Touch panel
11  Sensor
12  Camera (function execution unit)
16c  Microphone (function execution unit)
131, 131b, 131c  Input operation determination unit (contact determination unit)
132, 132b  Camera control unit (function execution control unit)
133c  Microphone control unit (function execution control unit)
F1  Notch
S2, S11, S12, S13, S22, S23  Contact determination step
S3, S14, S24  Function execution control step

Claims (7)

  1.  An electronic device comprising:
     a function execution unit that executes a function provided in the device itself;
     a sensor that detects a user's contact and is disposed within a predetermined range from the function execution unit;
     a contact determination unit that determines, from a detection value of the sensor, the contact with a plurality of areas set within the predetermined range; and
     a function execution control unit that controls function execution of the function execution unit according to the contact with the plurality of areas.
  2.  The electronic device according to claim 1, wherein
     the electronic device includes a touch panel shaped to include a notch in part of it,
     the function execution unit is installed in the notch, and
     the sensor is constituted by a partial area of the touch panel, the area being within a predetermined range from an edge of the touch panel forming the notch.
  3.  The electronic device according to claim 1 or 2, wherein the function execution control unit controls the function execution of the function execution unit when simultaneous contact is made with two or more of the plurality of areas.
  4.  The electronic device according to claim 3, wherein the function execution control unit controls the function execution of the function execution unit when the contact continues for a first predetermined time.
  5.  The electronic device according to claim 3, wherein the function execution control unit controls the function execution of the function execution unit when simultaneous contact with two or more of the plurality of areas is made again within a second predetermined time from a time when simultaneous contact with two or more of the plurality of areas occurred.
  6.  A method for controlling an electronic device, comprising:
     a contact determination step of determining, from a detection value of a sensor that detects contact and is disposed within a predetermined range from a function execution unit that executes a function provided in the device itself, the contact with a plurality of areas set within the predetermined range; and
     a function execution control step of controlling function execution of the function execution unit according to the contact with the plurality of areas.
  7.  A program for causing a computer to function as the electronic device according to claim 1, the program causing the computer to function as the contact determination unit and the function execution control unit.
PCT/JP2018/015161 2017-06-28 2018-04-11 Electronic device, method for controlling electronic device, and program WO2019003571A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019526168A JP6928652B2 (en) 2017-06-28 2018-04-11 Electronic devices, control methods and programs for electronic devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-126669 2017-06-28
JP2017126669 2017-06-28

Publications (1)

Publication Number Publication Date
WO2019003571A1 true WO2019003571A1 (en) 2019-01-03

Family

ID=64742195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/015161 WO2019003571A1 (en) 2017-06-28 2018-04-11 Electronic device, method for controlling electronic device, and program

Country Status (2)

Country Link
JP (1) JP6928652B2 (en)
WO (1) WO2019003571A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008225648A (en) * 2007-03-09 2008-09-25 Alps Electric Co Ltd Power source controller and electronic equipment equipped with the same, and method for starting electronic equipment
WO2014129286A1 (en) * 2013-02-19 2014-08-28 Necカシオモバイルコミュニケーションズ株式会社 Information processing terminal, screen control method, and screen control program
JP2015022489A (en) * 2013-07-18 2015-02-02 富士ゼロックス株式会社 Information processor and program
JP2016539359A (en) * 2013-12-25 2016-12-15 ホアウェイ・デバイス・カンパニー・リミテッド Mobile terminal and method for starting shooting on a mobile terminal
JP2016224834A (en) * 2015-06-03 2016-12-28 キヤノン株式会社 Electronic device and control method thereof
EP3109727A2 (en) * 2015-06-25 2016-12-28 Xiaomi Inc. A method and apparatus for controlling a display, and a mobile terminal
JP2017058840A (en) * 2015-09-15 2017-03-23 セイコーエプソン株式会社 Display system, display device control method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006092321A (en) * 2004-09-24 2006-04-06 Toshiba Corp Electronic equipment and touchpad device
JP4818457B2 (en) * 2010-09-29 2011-11-16 株式会社東芝 Electronic equipment, input control method
JP6370118B2 (en) * 2014-06-06 2018-08-08 キヤノン株式会社 Information processing apparatus, information processing method, and computer program

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110740265A (en) * 2019-10-31 2020-01-31 维沃移动通信有限公司 Image processing method and terminal equipment
CN110740265B (en) * 2019-10-31 2021-03-12 维沃移动通信有限公司 Image processing method and terminal equipment

Also Published As

Publication number Publication date
JP6928652B2 (en) 2021-09-01
JPWO2019003571A1 (en) 2020-04-02

Similar Documents

Publication Publication Date Title
US10754539B2 (en) Touch Operation Processing Method and Terminal Device
JP5407731B2 (en) Electronic device and program
JP5105127B2 (en) Portable terminal, its key operation control method and program
EP3035175B1 (en) Screen edge touch control optimization method, device and terminal
CN107357458B (en) Touch key response method and device, storage medium and mobile terminal
JP6229069B2 (en) Mobile terminal, how to handle virtual buttons
JP6096854B1 (en) Electronic device and method of operating electronic device
JP6381989B2 (en) Portable electronic device, control method and program for portable electronic device
WO2019003571A1 (en) Electronic device, method for controlling electronic device, and program
WO2015199175A1 (en) Portable electronic apparatus, portable electronic apparatus control method, and recording medium
WO2011105061A1 (en) Portable terminal, input control program, and input control method
US20120225698A1 (en) Mobile communication terminal, input control program and input control method
JP2018014111A (en) Electronic apparatus
US10708405B2 (en) Electronic apparatus, control device, and recording medium
JP2010258647A (en) Communication terminal and incoming call control method
US11126294B2 (en) Input apparatus that receives, after fixed period, position on screen of display device specified by touch operation
CN107407982B (en) Input method of touch screen and terminal
JP6248200B2 (en) Information processing apparatus, control method therefor, control program, and recording medium
JP2017142782A (en) Electronic device, calibration method, and program
JP5944974B2 (en) Portable terminal and input control program
JP2015142353A (en) Portable terminal and program
JP2017069990A (en) Electronic apparatus
CN113485632A (en) Touch control method for folding screen, terminal device and computer readable storage medium
WO2015178093A1 (en) Terminal device, control program, and computer-readable recording medium on which control program is recorded
JP2019020854A (en) Electronic apparatus, control apparatus, control program, and recording media

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18824091

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019526168

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18824091

Country of ref document: EP

Kind code of ref document: A1