WO2019003571A1 - Electronic device, electronic device control method, and program - Google Patents

Electronic device, electronic device control method, and program

Info

Publication number
WO2019003571A1
Authority
WO
WIPO (PCT)
Prior art keywords
function execution
contact
camera
function
unit
Prior art date
Application number
PCT/JP2018/015161
Other languages
English (en)
Japanese (ja)
Inventor
前川 哲也
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 filed Critical シャープ株式会社
Priority to JP2019526168A priority Critical patent/JP6928652B2/ja
Publication of WO2019003571A1 publication Critical patent/WO2019003571A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • The present invention relates to an electronic device provided with a unit that executes a function of the device itself.
  • Patent Document 1 discloses a portable terminal that, when contact of a finger is detected around the camera, displays a reminder indicating that the finger is included in the captured image.
  • In the configuration of Patent Document 1, finger contact is detected when the user merely touches the periphery of the camera lens, and using such contact detection gives rise to a problem: if the user accidentally touches the periphery of the device, an operation not intended by the user occurs on the device.
  • In order to solve the above problem, an electronic device according to one aspect of the present invention includes: a function execution unit that executes a function of the device itself; a sensor, disposed within a predetermined range from the function execution unit, that detects contact by a user; a contact determination unit that determines, from the detection value of the sensor, contact with a plurality of regions set within the predetermined range; and a function execution control unit that controls function execution of the function execution unit according to the contact with the plurality of regions.
  • In order to solve the above problem, a control method of an electronic device according to one aspect of the present invention includes: a contact determination step of determining, from the detection value of a sensor that is disposed within a predetermined range from a function execution unit executing a function of the device itself and that detects contact, contact with a plurality of regions set within the predetermined range; and a function execution control step of controlling function execution of the function execution unit according to the contact with the plurality of regions.
  • According to one aspect of the present invention, an effect of reducing erroneous operation when the user touches the device unintentionally is achieved.
  • FIG. 2: (a) and (b) are diagrams showing an outline of the information terminal according to Embodiment 1.
  • FIG. 3 is a flowchart showing an example of the flow of processing performed by the information terminal according to Embodiment 1.
  • FIG. 5 is a flowchart showing an example of the flow of processing performed by the information terminal according to Embodiment 2.
  • FIG. 7 is a flowchart showing an example of the flow of processing performed by the information terminal according to Embodiment 3.
  • Embodiment 1: Hereinafter, embodiments of the present invention will be described in detail with reference to FIG. 1 to FIG. 3 and FIG. 8.
  • FIG. 2 is a view showing an outline of the information terminal 1.
  • FIG. 8 is a diagram showing an outline of an information terminal 100.
  • the information terminal 100 includes a touch panel 102, a sensor 111, and a camera 112.
  • the shape of the touch panel 102 is a shape including the notch F100.
  • the sensor 111 and the camera 112 are disposed in the notch F100.
  • the sensor 111 is a sensor that detects a touch of a user's finger, and is disposed so as to surround the camera 112.
  • the crosses shown in FIG. 8 indicate the contact points of the user in the peripheral area of the camera 112.
  • When contact is detected in the peripheral region of the camera 112, the information terminal 100 ends imaging by the camera 112.
  • That is, the information terminal 100 ends imaging by the camera 112 whenever contact is detected, regardless of the number of contact points in the peripheral region of the camera 112, the contact area, and the like.
  • FIG. 2(a) and (b) are diagrams showing an outline of the information terminal 1.
  • the information terminal 1 includes a touch panel 2, a sensor 11 for detecting a touch, and a camera (function execution unit) 12.
  • the information terminal 1 may be, for example, a tablet type terminal, a smartphone or the like.
  • the shape of the touch panel 2 is a shape including the notch F1.
  • the camera 12 is disposed in the notch F1.
  • a partial area of the touch panel 2 functions as the sensor 11 (the sensor 11 is configured by a partial area of the touch panel 2).
  • The region of the touch panel 2 within a predetermined range from the edge of the touch panel 2 forming the notch F1 functions as the sensor 11.
  • the sensor 11 is disposed within a predetermined range from the camera 12.
  • an area R1, an area R2, and an area R3 are set in the area where the sensor 11 is disposed.
  • The notch F1 of the touch panel 2 has a U-shape.
  • an area R1, an area R2, and an area R3 are set in an area within a predetermined range from an edge forming each side of the U-shape.
  • In the information terminal 1, for example, when the user's finger U contacts only the region R2, imaging by the camera 12 is not ended.
  • When two or more of the regions R1, R2, and R3 are touched simultaneously, the information terminal 1 ends imaging by the camera 12.
  • For example, when the user covers the camera 12 with a hand, imaging by the camera 12 ends.
  • In such cases, the information terminal 1 may end imaging by the camera 12.
  • the information terminal 1 determines a touch on a plurality of areas set within a predetermined range from the detection value of the sensor 11 and controls the function execution of the camera 12 according to the touch on the plurality of areas.
  • Although the regions R1, R2, and R3 are set for each side of the touch panel 2 forming the notch F1, the setting of these regions can be performed arbitrarily.
  • the information terminal 1 may operate to control the function execution of the camera 12 when the touch panel 2 is active (normal mode: when a display screen is displayed). Further, the information terminal 1 may operate to control the function execution of the camera 12 when the touch panel 2 is in the save mode (when the display screen is not displayed).
  • In the information terminal 1, the function execution of the camera 12 is controlled in accordance with contact with the regions R1, R2, and R3, which are a plurality of regions set within a predetermined range from the camera 12.
  • In other words, the information terminal 1 does not control the function execution of the camera 12 in the case of contact with only one region set within the predetermined range. Therefore, even when the user contacts the predetermined range in which the sensor 11 is disposed without intending to control the function execution of the camera 12, the function execution of the camera 12 is not controlled unless a plurality of the regions R1, R2, and R3 are touched. It is thus possible to reduce erroneous operation caused by the user touching the area where the sensor 11 is disposed without intending to control the function execution of the camera 12.
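The multi-region gating described above can be sketched as follows. This is an illustrative sketch, not the publication's implementation: the region names and the two-region threshold follow the text, while the function and its input format are assumptions.

```python
# Sketch of the multi-region contact gate of information terminal 1 (Embodiment 1).
# The actual terminal derives touched regions from the detection values of
# sensor 11; here they are passed in directly for illustration.

REGIONS = ("R1", "R2", "R3")  # regions set along the sides of notch F1

def should_end_imaging(touched_regions):
    """Return True only when two or more of the set regions are touched
    simultaneously. A touch on a single region (e.g. only R2) is ignored,
    which reduces erroneous operation from accidental contact."""
    touched = set(touched_regions) & set(REGIONS)
    return len(touched) >= 2
```

A touch on R2 alone does not end imaging; covering the camera with a hand contacts several regions at once and does.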
  • FIG. 1 is a block diagram showing the main configuration of the information terminal 1.
  • the information terminal 1 includes a touch panel 2, a sensor 11, a camera 12, a control unit 13, and a storage unit 14.
  • the touch panel 2 receives the touch operation of the user on the display screen.
  • the touch panel 2 is a free form display whose shape can be freely designed.
  • The touch panel 2 is, for example, a display device integrated with a touch panel device. By touching the display unit 20 with a finger, the user can operate icons displayed on the display unit 20, scroll the screen, and so on.
  • the shape of the touch panel 2 is a shape including the notch F1, as shown in (a) and (b) of FIG.
  • The notch F1 is formed at the upper center of the touch panel 2, so as to cut out a part of the display surface of the touch panel 2.
  • In the present embodiment, the notch F1 is formed at the upper center of the touch panel 2; however, as long as the notch F1 is formed on at least one side of the outer edge of the touch panel 2, the place where the notch F1 is formed is not particularly limited.
  • For example, the notch F1 may be formed at the upper left end or upper right end of the touch panel 2, or at the lower, left, or right edge of the touch panel 2.
  • The notch F1 may also be formed shifted to the left or right from the upper center of the touch panel 2.
  • the number of the notches F1 formed in the touch panel 2 is not particularly limited.
  • two or more notches F1 may be formed on the top of the touch panel 2.
  • Although the shape of the notch F1 is a U-shape (a square cutout) in FIG. 2(a) and (b), it is not particularly limited.
  • the shape of the notch F1 may be semicircular or rectangular.
  • the sensor 11 detects a touch of the user.
  • a part of the touch panel 2 is a sensor 11.
  • the sensor 11 may be a sensor independent of the touch panel 2.
  • the sensor 11 is not particularly limited as long as it can detect the touch of the user, but may be, for example, a sensor using a capacitance, an infrared sensor, or the like.
  • The sensor 11 is disposed within a predetermined range from the installation position of the camera 12.
  • the predetermined range in which the sensor 11 is disposed may be, for example, a range of 2 to 3 mm from the edge of the camera 12.
  • the predetermined range in which the sensor 11 is disposed may be appropriately changed according to the convenience of design.
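The "predetermined range" can be pictured as a narrow band around the camera. The following hit test is a hypothetical sketch: the 2 to 3 mm band follows the text, but the circular camera outline, its radius, and the millimetre panel coordinates are invented for illustration.

```python
import math

# Assumed geometry for illustration only: a circular camera at the origin
# of a millimetre coordinate system on the touch panel.
CAMERA_CENTER = (0.0, 0.0)
CAMERA_RADIUS_MM = 4.0   # assumed lens radius
BAND_MM = 3.0            # predetermined range from the camera edge (2-3 mm per the text)

def in_sensor_band(x, y):
    """True if a contact point lies within the predetermined range,
    i.e. within BAND_MM of the camera edge."""
    d = math.dist((x, y), CAMERA_CENTER)
    return CAMERA_RADIUS_MM <= d <= CAMERA_RADIUS_MM + BAND_MM
```

Contacts inside the lens itself or far from it are outside the band and would not be attributed to the sensor 11.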
  • the sensor 11 transmits the detected value to the input operation determination unit (contact determination unit) 131.
  • the camera 12 executes a function of imaging an object.
  • the camera 12 receives an instruction of the camera control unit (function execution control unit) 132, and executes the function according to the instruction.
  • the camera 12 is installed in the notch F1.
  • Control unit 13: The control unit 13 integrally controls each part of the information terminal 1.
  • the control unit 13 includes an input operation determination unit 131 and a camera control unit 132.
  • the input operation determination unit 131 determines whether or not simultaneous contact has been made to two or more of the plurality of areas. When simultaneous contact is performed on two or more areas, the input operation determination unit 131 transmits a message to that effect to the camera control unit 132.
  • the camera control unit 132 controls function execution of the camera 12.
  • The camera control unit 132 controls the function execution of the camera 12 in accordance with contact with the plurality of areas. Specifically, when simultaneous contact is made with two or more of the plurality of areas, the camera control unit 132 terminates the camera 12 (imaging by the camera 12). When it ends imaging by the camera 12, the camera control unit 132 updates the camera state information 141 stored in the storage unit 14.
  • FIG. 3 is a flowchart showing an example of the flow of processing executed by the information terminal 1.
  • the information terminal 1 may start the process.
  • the input operation determination unit 131 determines whether the camera 12 is activated (S1).
  • When the camera 12 is activated (YES in S1), the input operation determination unit 131 determines whether or not simultaneous contact has been made with two or more of the plurality of regions (S2: contact determination step).
  • When simultaneous contact is made with two or more of the plurality of regions (YES in S2), the camera control unit 132 terminates the camera 12 (imaging by the camera 12) (S3: function execution control step), and the process ends.
  • In the present embodiment, an example of control of the camera 12 has been shown.
  • the configuration shown in the present embodiment may be applied to control of devices other than cameras.
  • That is, the present invention can be applied to control of a device to which touch operation, which is an intuitive operation, can be applied.
  • The information terminal 1b determines a long press operation and a double tap operation, and controls the function execution of the camera 12 accordingly.
  • FIG. 4 is a block diagram showing the main configuration of the information terminal 1b.
  • the information terminal 1 b includes a touch panel 2, a sensor 11, a camera 12, a control unit 13 b, a storage unit 14, and a timer 15 b.
  • the touch panel 2, the sensor 11, the camera 12, and the storage unit 14 have been described in detail in the first embodiment, and thus the description thereof is omitted here.
  • the control unit 13 b includes an input operation determination unit 131 b and a camera control unit 132 b.
  • the input operation determination unit 131 b performs the following process in addition to the process of the input operation determination unit 131 described above.
  • the input operation determination unit 131b determines whether contact to two or more regions among the plurality of regions has continued for a first predetermined time. That is, the input operation determination unit 131b determines whether the long press operation has been performed.
  • In addition, the input operation determination unit 131b determines whether simultaneous contact with two or more of the plurality of regions has been made again within a second predetermined time from the time when two or more of the plurality of regions were simultaneously touched. That is, the input operation determination unit 131b determines whether a double tap operation has been performed.
  • The input operation determination unit 131b refers to the timer 15b indicating the elapse of time, and determines the elapse of the first predetermined time and the second predetermined time described above. More specifically, the input operation determination unit 131b refers to the camera state information 141 indicating the activation state of the camera 12 stored in the storage unit 14, and determines whether the camera 12 is activated. When the camera 12 is activated, the input operation determination unit 131b determines whether a long press operation has been performed; when the camera 12 is not activated, it determines whether a double tap operation has been performed. When the long press operation or the double tap operation is performed, the input operation determination unit 131b notifies the camera control unit 132b accordingly.
  • Camera control unit 132b: When contact with two or more of the plurality of areas continues for a first predetermined time (a long press operation is performed), the camera control unit 132b controls the function execution of the camera 12. In the present embodiment, in particular, when the long press operation is performed, the camera control unit 132b terminates the camera 12 (imaging by the camera 12).
  • When simultaneous contact with two or more of the plurality of areas is performed again within the second predetermined time (a double tap operation is performed), the camera control unit 132b also controls the function execution of the camera 12.
  • Specifically, when the double tap operation is performed, the camera control unit 132b activates the camera 12 (starts imaging by the camera 12). That is, the camera control unit 132b controls the execution of the function of the camera 12 corresponding to the long press operation or the double tap operation.
  • FIG. 5 is a flowchart showing an example of the flow of processing performed by the information terminal 1 b.
  • the processes of S1 and S2 are the same as the processes described in the first embodiment, and thus detailed description thereof will be omitted.
  • When the camera 12 is activated (YES in S1) and two or more of the plurality of areas are touched simultaneously (YES in S2), the process proceeds to S11.
  • In S11, the input operation determination unit 131b determines whether contact with the two or more areas has continued for the first predetermined time (S11: contact determination step). That is, in S2 to S11, the input operation determination unit 131b determines whether a long press operation has been performed.
  • When the contact has continued for the first predetermined time (YES in S11), the camera control unit 132b ends imaging by the camera 12 (S3), and the process ends. If simultaneous contact is not made with two or more of the plurality of regions (NO in S2), or if contact with the two or more regions does not continue for the first predetermined time (NO in S11), the process returns to S2.
  • When the camera 12 is not activated (NO in S1), the input operation determination unit 131b determines whether or not simultaneous contact has been made with two or more of the plurality of regions (S12: contact determination step).
  • When simultaneous contact is made with two or more of the plurality of regions (YES in S12), the input operation determination unit 131b determines whether simultaneous contact with two or more of the plurality of regions has been performed again within the second predetermined time (S13: contact determination step). That is, in S12 to S13, the input operation determination unit 131b determines whether a double tap operation has been performed.
  • That is, the input operation determination unit 131b determines, from the detection values of the sensor 11, whether a series of contact, non-contact, and contact operations has been performed on two or more regions within the predetermined time. If simultaneous contact with two or more of the plurality of regions is performed again within the second predetermined time (YES in S13), the camera control unit 132b activates the camera 12 (starts imaging by the camera 12) (S14: function execution control step), and the process ends. If simultaneous contact is not made with two or more regions (NO in S12), or if simultaneous contact with two or more of the plurality of regions is not performed again within the second predetermined time (NO in S13), the process returns to S12. This process may also end when no touch operation on the peripheral area of the camera 12 is received.
  • this process may end when the application using the camera 12 ends. Further, when the processes of S3 and S14 end, the process may return to the process of S1. Further, the processing of S2 to S11 in the present processing may be replaced with the processing of S12 to S13. That is, the camera 12 may be ended when the double tap operation is performed, and the camera 12 may be activated when the long press operation is performed.
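The long-press and double-tap determination of S2 to S14 can be sketched as a small state machine. This is a sketch under stated assumptions: the concrete time values, the class name, and the injected clock are illustrative inventions; only the "first predetermined time" (long press) and "second predetermined time" (double tap window) concepts come from the publication.

```python
import time

FIRST_PREDETERMINED = 1.0   # long-press duration in seconds (assumed value)
SECOND_PREDETERMINED = 0.5  # double-tap window in seconds (assumed value)

class InputOperationDeterminer:
    """Sketch of input operation determination unit 131b: detects a long
    press (multi-region contact held for the first predetermined time) and
    a double tap (multi-region contact repeated within the second
    predetermined time). The clock is injected so the logic can be
    exercised without real hardware or a real timer 15b."""

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.press_started = None  # time multi-region contact began
        self.last_release = None   # time of last multi-region tap release

    def update(self, regions_touched):
        """Feed the currently touched regions; returns 'long_press',
        'double_tap', or None."""
        now = self.clock()
        multi = len(regions_touched) >= 2
        if multi:
            if self.press_started is None:
                self.press_started = now
                # a second multi-region touch soon after a release => double tap
                if (self.last_release is not None
                        and now - self.last_release <= SECOND_PREDETERMINED):
                    self.last_release = None
                    return "double_tap"
            elif now - self.press_started >= FIRST_PREDETERMINED:
                self.press_started = None
                return "long_press"
        else:
            if self.press_started is not None:
                self.last_release = now
            self.press_started = None
        return None
```

An application loop would call `update` with the regions currently reported by the sensor and, per Embodiment 2, end imaging on `"long_press"` while the camera is active and start it on `"double_tap"` while it is not.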
  • In the present embodiment, an example has been shown for control of activation and termination of the camera 12.
  • However, the present invention is not limited to this; for example, it may be applied to turning the mobile light on and off.
  • That is, the configuration of the present embodiment can be applied to control of a device to which touch operation, which is an intuitive operation, can be applied.
  • the information terminal 1c controls the execution of the function of the microphone (function execution unit) 16c.
  • FIG. 6 is a block diagram showing the main configuration of the information terminal 1c.
  • The information terminal 1c includes a touch panel 2, a sensor 11c, a microphone 16c, a control unit 13c, a storage unit 14c, and a timer 15b.
  • the touch panel 2 and the timer 15 b have been described in detail in the first and second embodiments, and thus the description thereof is omitted here.
  • While the sensor 11 of Embodiments 1 and 2 is disposed within a predetermined range from the installation position of the camera 12, the sensor 11c is disposed within a predetermined range from the installation position of the microphone 16c.
  • the other configuration of the sensor 11 c is similar to that of the sensor 11.
  • the microphone 16c picks up sound.
  • the microphone 16c receives an instruction of the microphone control unit (function execution control unit) 133c, and is put into a mute state according to the instruction. Further, as in the camera 12 described in the first embodiment, the microphone 16c may be installed in the notch F1 shown in (a) of FIG.
  • the control unit 13c includes an input operation determination unit 131c and a microphone control unit 133c.
  • The input operation determination unit 131c refers to the microphone state information 142c, stored in the storage unit 14c, indicating whether the function of the microphone 16c is valid (whether or not it is in the mute state), and determines whether the microphone 16c is valid. When the function of the microphone 16c is valid (sound collection is performed), the input operation determination unit 131c determines, like the input operation determination unit 131b of Embodiment 2, whether a long press operation by a touch operation has been performed. When the long press operation is performed, the input operation determination unit 131c notifies the microphone control unit 133c accordingly.
  • the microphone control unit 133c controls the function execution of the microphone 16c when the long press operation is performed. In detail, when the long press operation is performed, the microphone control unit 133c mutes the microphone 16c. Further, when the microphone control unit 133c mutes the microphone 16c, the microphone control unit 133c updates the microphone state information 142c stored in the storage unit 14c.
  • FIG. 7 is a flowchart showing an example of the flow of processing performed by the information terminal 1c.
  • the information terminal 1c may start the process.
  • the input operation determination unit 131c determines whether the function of the microphone 16c is valid (S21).
  • When the function of the microphone 16c is valid (YES in S21), the input operation determination unit 131c determines whether or not simultaneous contact has been made with two or more of the plurality of regions (S22: contact determination step). When simultaneous contact is made with two or more of the plurality of regions (YES in S22), the input operation determination unit 131c determines whether the contact with the two or more regions has continued for the first predetermined time (S23: contact determination step). That is, in S22 to S23, the input operation determination unit 131c determines whether a long press operation has been performed.
  • When the contact has continued for the first predetermined time (YES in S23), the microphone control unit 133c disables (mutes) the function of the microphone 16c (S24: function execution control step), and the process ends. This process may also end when no touch operation on the peripheral area of the microphone 16c is received, for example when the application using the microphone 16c ends. When the processes of S21 to S24 end, the process may return to S21.
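The S21 to S24 mute flow can be sketched as follows. The class and method names and the dict-based state store are assumptions standing in for the microphone control unit 133c, the microphone state information 142c, and the storage unit 14c.

```python
class MicrophoneController:
    """Sketch of microphone control unit 133c: when a long press on the
    regions around the microphone is reported while the microphone
    function is valid, mute the microphone and persist the new state.
    A plain dict stands in for microphone state information 142c held
    in storage unit 14c."""

    def __init__(self, storage):
        self.storage = storage  # e.g. {"mic_enabled": True}

    def on_long_press(self):
        if self.storage.get("mic_enabled"):      # S21: is the function valid?
            self.storage["mic_enabled"] = False  # S24: disable (mute) and persist
            return "muted"
        return "ignored"                         # NO in S21: nothing to do
```

A long press while already muted is ignored, matching the flow in which S22 to S24 are reached only when the microphone function is valid.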
  • the present invention may be applied to control to turn off the volume of the speaker.
  • That is, the configuration of the present embodiment can be applied to control of a device to which touch operation, which is an intuitive operation, can be applied.
  • The control blocks of the information terminals 1, 1b, and 1c (in particular, the input operation determination units 131, 131b, and 131c, the camera control units 132 and 132b, and the microphone control unit 133c) may be realized by a logic circuit (hardware) formed in an IC chip or the like, or may be realized by software.
  • the information terminals 1, 1b, and 1c include a computer that executes instructions of a program that is software that implements each function.
  • the computer includes, for example, one or more processors, and a computer readable recording medium storing the program.
  • the processor reads the program from the recording medium and executes the program to achieve the object of the present invention.
  • The processor may be, for example, a CPU (Central Processing Unit).
  • As the above-mentioned recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used, for example a ROM (Read Only Memory).
  • A RAM (Random Access Memory) into which the program is loaded may be further provided.
  • the program may be supplied to the computer via any transmission medium (communication network, broadcast wave, etc.) capable of transmitting the program.
  • That is, one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • The electronic device (information terminal 1, 1b, 1c) according to aspect 1 of the present invention includes: a function execution unit (camera 12, microphone 16c) that executes a function of the device itself; a sensor that is disposed within a predetermined range from the function execution unit and detects contact by a user; a contact determination unit that determines, from the detection value of the sensor, contact with a plurality of regions set within the predetermined range; and a function execution control unit that controls function execution of the function execution unit according to the contact with the plurality of regions.
  • According to the above configuration, the function execution control unit does not control the function execution in the case of contact with only one region set within the predetermined range. Therefore, even when the user touches the predetermined range without intending to control the function execution, the function execution is not controlled unless a plurality of the regions in the predetermined range are touched. It is thus possible to reduce erroneous operation caused by the user touching the predetermined range without intending to control the function execution.
  • In aspect 2 of the present invention, the electronic device may include a touch panel (2) having a shape including a notch (F1) in a part thereof; the function execution unit may be installed in the notch; the sensor may be configured by a partial area of the touch panel; and the regions may be areas within a predetermined range from the edge of the touch panel forming the notch.
  • According to the above configuration, the touch panel and the sensor can be made common. Further, in a configuration in which the function execution unit is provided in the notch of the touch panel, the user is likely to accidentally contact the predetermined range around the function execution unit. According to the above configuration, the function execution control unit does not control the function execution in the case of contact with only one region set within the predetermined range. Therefore, even in a configuration in which the function execution unit is provided in a notch where the user may accidentally contact the predetermined range, erroneous operation caused by the user touching the predetermined range without intending to control the function execution can be reduced.
  • In aspect 3 of the present invention, when two or more of the plurality of regions are touched simultaneously, the function execution control unit may control the function execution of the function execution unit.
  • According to the above configuration, when two or more of the plurality of regions are touched simultaneously, the function execution control unit controls the function execution. In other words, when contact is made with only one of the plurality of regions, the function execution is not controlled; and even if contact is made with two or more regions, the function execution is not controlled unless the contact is simultaneous. It is therefore possible to reduce erroneous operation when the user touches the predetermined range without intending to control the function execution.
  • In aspect 4 of the present invention, when the contact continues for a first predetermined time, the function execution control unit (camera control unit 132b, microphone control unit 133c) may control the function execution of the function execution unit.
  • According to the above configuration, when the contact continues for the first predetermined time, the function execution control unit controls the function execution. In other words, if the contact does not continue for the predetermined time, the function execution is not controlled. It is therefore possible to reduce erroneous operation when the user touches the predetermined range without intending to control the function execution.
  • In aspect 5 of the present invention, in the electronic device (information terminal 1b), when simultaneous contact with two or more of the plurality of regions is performed again within a second predetermined time from the time of simultaneous contact with two or more of the plurality of regions, the function execution control unit may control the function execution of the function execution unit.
  • According to the above configuration, if simultaneous contact with two or more of the plurality of regions is performed again within the second predetermined time, the function execution control unit controls the function execution. In other words, if there is no second simultaneous contact with two or more of the plurality of regions within the second predetermined time, the function execution is not controlled. It is therefore possible to reduce erroneous operation when the user touches the predetermined range without intending to control the function execution.
  • The control method of an electronic device according to aspect 6 of the present invention includes: a contact determination step (S2, S11, S12, S13, S22, S23) of determining, from the detection value of a sensor that is disposed within a predetermined range from a function execution unit executing a function of the device itself and detects contact, contact with a plurality of regions set within the predetermined range; and a function execution control step (S3, S14, S24) of controlling function execution of the function execution unit according to the contact with the plurality of regions.
  • the electronic device may be realized by a computer. In this case, a control program of the electronic device that realizes the electronic device by the computer by causing the computer to operate as each component (software element) included in the electronic device, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
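The timing conditions described above can be pictured with a short sketch. All names, data structures, and numeric values below are illustrative assumptions and are not taken from the publication; the sketch only mirrors our reading of the two conditions (contact lasting the predetermined time, and a repeated simultaneous multi-region contact within the second predetermined time):

```python
from dataclasses import dataclass
from typing import List

HOLD_TIME = 0.5      # "predetermined time" (value assumed for illustration)
REPEAT_WINDOW = 1.0  # "second predetermined time" (value assumed)

@dataclass
class Contact:
    region: int    # index of the touched region within the predetermined range
    start: float   # time the contact began (seconds)
    end: float     # time the contact ended (seconds)

def held_long_enough(c: Contact, hold: float = HOLD_TIME) -> bool:
    """The contact must continue for the predetermined time to count at all."""
    return (c.end - c.start) >= hold

def simultaneous_multi_region(contacts: List[Contact], t: float) -> bool:
    """True if two or more distinct regions are being touched at instant t."""
    touched = {c.region for c in contacts if c.start <= t <= c.end}
    return len(touched) >= 2

def should_trigger(first: float, second: float,
                   window: float = REPEAT_WINDOW) -> bool:
    """True if a second simultaneous contact follows the first within the
    second predetermined time, which gates the function execution."""
    return 0.0 <= (second - first) <= window
```

A function execution control unit could, for example, start the camera only when both predicates hold; this combination is our interpretation of the claim language, not code from the patent.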

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)

Abstract

The present invention reduces erroneous operations caused by detection of unintended contact by the user. This electronic device is provided with: a camera (12) that executes a function; a sensor (11) that is arranged within a prescribed range from the camera and detects contact; and a camera control unit (132) that controls the function execution of the camera in accordance with contact with a plurality of regions set within the prescribed range.
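The arrangement in the abstract can be sketched as follows. The coordinates, units, radius, and region layout are illustrative assumptions, not values from the publication; the sketch only shows the idea that touches outside the prescribed range are ignored, while touches inside it are classified into one of a plurality of regions:

```python
import math

# Illustrative values only; the publication does not specify any of these.
CAMERA_POS = (50.0, 20.0)   # camera position on the device surface
PRESCRIBED_RANGE = 10.0     # radius of the sensor area around the camera
NUM_REGIONS = 4             # number of angular regions within that radius

def classify_region(x: float, y: float):
    """Return the index of the touched region, or None if the touch
    falls outside the prescribed range (and is therefore ignored)."""
    dx, dy = x - CAMERA_POS[0], y - CAMERA_POS[1]
    if math.hypot(dx, dy) > PRESCRIBED_RANGE:
        return None
    angle = math.atan2(dy, dx) % (2 * math.pi)
    return int(angle // (2 * math.pi / NUM_REGIONS))
```

The camera control unit would then act on the sequence of region indices rather than raw coordinates; the angular partition here is one possible layout, chosen purely for illustration.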
PCT/JP2018/015161 2017-06-28 2018-04-11 Electronic device, electronic device control method, and program WO2019003571A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019526168A JP6928652B2 (ja) 2017-06-28 2018-04-11 Electronic device, electronic device control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017126669 2017-06-28
JP2017-126669 2017-06-28

Publications (1)

Publication Number Publication Date
WO2019003571A1 true WO2019003571A1 (fr) 2019-01-03

Family

ID=64742195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/015161 WO2019003571A1 (fr) 2017-06-28 2018-04-11 Electronic device, electronic device control method, and program

Country Status (2)

Country Link
JP (1) JP6928652B2 (fr)
WO (1) WO2019003571A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008225648A (ja) * 2007-03-09 2008-09-25 Alps Electric Co Ltd Power supply control device, electronic apparatus provided with the same, and method for starting the electronic apparatus
WO2014129286A1 (fr) * 2013-02-19 2014-08-28 Necカシオモバイルコミュニケーションズ株式会社 Information processing terminal, screen control method, and screen control program
JP2015022489A (ja) * 2013-07-18 2015-02-02 富士ゼロックス株式会社 Information processing apparatus and program
JP2016539359A (ja) * 2013-12-25 2016-12-15 ホアウェイ・デバイス・カンパニー・リミテッド Mobile terminal and method for starting shooting on a mobile terminal
JP2016224834A (ja) * 2015-06-03 2016-12-28 キヤノン株式会社 Electronic device and control method thereof
EP3109727A2 (fr) * 2015-06-25 2016-12-28 Xiaomi Inc. Method and apparatus for controlling a display device, and mobile terminal
JP2017058840A (ja) * 2015-09-15 2017-03-23 セイコーエプソン株式会社 Display system, display device control method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006092321A (ja) * 2004-09-24 2006-04-06 Toshiba Corp Electronic apparatus and touch pad device
JP4818457B2 (ja) * 2010-09-29 2011-11-16 株式会社東芝 Electronic apparatus and input control method
JP6370118B2 (ja) * 2014-06-06 2018-08-08 キヤノン株式会社 Information processing apparatus, information processing method, and computer program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110740265A (zh) * 2019-10-31 2020-01-31 维沃移动通信有限公司 Image processing method and terminal device
CN110740265B (zh) * 2019-10-31 2021-03-12 维沃移动通信有限公司 Image processing method and terminal device
JP2023506289A (ja) * 2019-12-20 2023-02-15 維沃移動通信有限公司 Camera activation method and electronic device

Also Published As

Publication number Publication date
JPWO2019003571A1 (ja) 2020-04-02
JP6928652B2 (ja) 2021-09-01

Similar Documents

Publication Publication Date Title
US10754539B2 (en) Touch Operation Processing Method and Terminal Device
JP5407731B2 (ja) Electronic device and program
JP5105127B2 (ja) Portable terminal, key operation control method therefor, and program
EP3035175B1 (fr) Method, device, and terminal for optimizing screen-edge touch control
CN107357458B (zh) Touch key response method and device, storage medium, and mobile terminal
CN106873834B (zh) Method and device for recognizing that a key has been triggered, and mobile terminal
JP6229069B2 (ja) Mobile terminal and method for processing virtual buttons
JP6096854B1 (ja) Electronic device and method of operating an electronic device
WO2019003571A1 (fr) Electronic device, electronic device control method, and program
WO2015199173A1 (fr) Portable electronic apparatus, method for controlling a portable electronic apparatus, and recording medium
WO2011105061A1 (fr) Portable terminal, program, and input control method
WO2011058733A1 (fr) Mobile communication terminal, input control program, and input control method
US10708405B2 (en) Electronic apparatus, control device, and recording medium
JP2010258647A (ja) Communication terminal and incoming call control method
US11126294B2 (en) Input apparatus that receives, after fixed period, position on screen of display device specified by touch operation
JP2018014111A (ja) Electronic device
CN107407982B (zh) Touch screen input method and terminal
JP6248200B2 (ja) Information processing apparatus, control method therefor, control program, and recording medium
JP2017142782A (ja) Electronic device, calibration method, and program
JP5944974B2 (ja) Portable terminal and input control program
JP2015142353A (ja) Portable terminal and program
JP2017069990A (ja) Electronic device
WO2015178093A1 (fr) Terminal device, control program, and computer-readable recording medium on which the control program is recorded
JP2019020854A (ja) Electronic device, control device, control program, and recording medium
JP2017143442A (ja) Electronic device, calibration method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18824091

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019526168

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18824091

Country of ref document: EP

Kind code of ref document: A1