CN114442830B - Electronic device and control method thereof - Google Patents
Electronic device and control method thereof
- Publication number
- CN114442830B (application number CN202011204761.0A)
- Authority
- CN
- China
- Prior art keywords
- motion data
- electronic device
- motion
- change amount
- touch
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The present disclosure provides an electronic device. The electronic device comprises a touch module, a motion sensor, a memory and a control unit. The touch module is used for generating a touch signal. The motion sensor is used for detecting the motion of the electronic device to generate motion data. The memory stores a preset action condition. The control unit is electrically connected with the touch module, the motion sensor and the memory, and is used for receiving the motion data, determining whether the motion data meets the preset action condition and, if so, generating a virtual touch signal. The present disclosure also provides a control method for the electronic device.
Description
Technical Field
The present disclosure relates to an electronic device and a control method thereof.
Background
To remain portable, a mobile device cannot be made too large, which makes touch operation difficult for the user. In particular, when playing a game, it is hard for the user to perform continuous touch input with the fingers within the limited touch area of the mobile device, and control efficiency suffers.
Disclosure of Invention
The present disclosure provides an electronic device. The electronic device comprises a touch module, a motion sensor, a memory and a control unit. The touch module is used for generating a touch signal. The motion sensor is used for detecting the motion of the electronic device to generate motion data. The memory stores a preset action condition. The control unit is electrically connected with the touch module, the motion sensor and the memory, and is used for receiving the motion data, determining whether the motion data meets the preset action condition and, if so, generating a virtual touch signal.
The present disclosure also provides a control method suitable for an electronic device. The control method comprises the following steps: setting a preset action condition; detecting the motion of the electronic device to generate motion data; and determining whether the motion data meets the preset action condition and, if so, generating a virtual touch signal.
With the electronic device and the control method above, while a user provides input through the touch module, the motion sensor can additionally be used to generate a virtual touch signal as input in place of an actual touch, thereby improving control efficiency.
Drawings
FIG. 1 is a block schematic diagram of an embodiment of an electronic device of the present disclosure;
FIG. 2 is a flow chart of an embodiment of a control method of the present disclosure;
FIG. 3 is a flowchart of an embodiment of the process for determining whether the motion data meets the preset action condition according to the present disclosure;
FIG. 4 is a waveform diagram illustrating the process of determining whether the motion data meets the preset action condition according to the present disclosure;
FIG. 5 is a flowchart of another embodiment of the process for determining whether the motion data meets the preset action condition according to the present disclosure; and
FIG. 6 is a flowchart of another embodiment of the control method of the present disclosure.
Detailed Description
Specific embodiments of the present disclosure are described in more detail below with reference to the drawings. The advantages and features of the present disclosure will become more apparent from the following description and claims. Note that the drawings are in a highly simplified form and are not drawn to precise scale; they are provided merely for convenience and clarity in describing the embodiments of the disclosure.
Fig. 1 is a block schematic diagram of an embodiment of an electronic device of the present disclosure. As shown in the figure, the electronic device 100 provided in the present embodiment includes a touch module 120, a motion sensor 140, a memory 160 and a control unit 180. In one embodiment, the electronic device 100 may be a handheld device, such as a mobile phone, a tablet computer, a handheld game console, etc.
The touch module 120 is used for generating a touch signal S1. In one embodiment, the touch module 120 may include a touch panel, but is not limited thereto; in another embodiment, the touch module 120 may include a touch pad.
The motion sensor 140 is used for detecting the motion of the electronic device 100 to generate motion data D1. In one embodiment, the motion sensor 140 is an accelerometer, but is not limited thereto; in another embodiment, the motion sensor 140 may be a gyroscope.
The memory 160 stores a preset action condition A1. The preset action condition A1 corresponds to a preset action mode and is used for determining whether the motion made by the user matches the preset action mode. In one embodiment, the memory 160 may be a random access memory, a solid-state drive, or another storage device commonly used in handheld devices.
The control unit 180 is electrically connected to the touch module 120, the motion sensor 140 and the memory 160. The control unit 180 receives the motion data D1 from the motion sensor 140 and obtains the preset action condition A1 from the memory 160 to determine whether the motion data D1 meets the preset action condition A1, that is, whether the motion made by the user matches the preset action mode. In one embodiment, the control unit 180 may be a processor. When the motion data D1 meets the preset action condition A1, the control unit 180 generates the virtual touch signal S2 to simulate the signal that the touch module 120 would generate for a given touch mode at a given touch position. The preset action mode, the preset action condition A1 and the virtual touch signal S2 are described in more detail in the following paragraphs on the control method.
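As a concrete illustration of this wiring, the following is a minimal sketch of how the components of FIG. 1 could be connected in software under a simple polling design; the class and method names (ControlUnit, read, load_condition, inject) are assumptions for illustration only and are not defined by the patent.

```python
# A minimal sketch, assuming a polling design; the method names are
# illustrative, not APIs defined by the patent.
class ControlUnit:
    def __init__(self, touch_module, motion_sensor, memory):
        self.touch_module = touch_module    # generates touch signal S1; accepts injected touches
        self.motion_sensor = motion_sensor  # produces motion data D1
        self.memory = memory                # stores the preset action condition A1

    def poll_once(self):
        d1 = self.motion_sensor.read()        # receive motion data D1
        a1 = self.memory.load_condition()     # obtain preset action condition A1
        if a1.matches(d1):                    # does D1 meet A1?
            s2 = a1.virtual_touch_signal()    # build virtual touch signal S2
            self.touch_module.inject(s2)      # feed S2 into the touch pipeline
```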
FIG. 2 is a flowchart of an embodiment of a control method of the present disclosure. The control method is suitable for the electronic device 100 shown in FIG. 1 and comprises the following steps.
First, in step S120, a preset action condition A1 is set. The preset action condition A1 corresponds to a preset action mode and is used for determining whether the motion made by the user matches the preset action mode.
In one embodiment, the preset action mode corresponding to the preset action condition A1 is a single-axis action mode, for example a Z-axis action mode; in one embodiment, it is a Z-axis shaking mode. Because of the inertia of the motion, the Z-axis shaking mode produces a waveform that swings away from and returns to the rest point, which forms the output signal. Taking the Z-axis shaking mode as an example, in one embodiment the preset action condition A1 may include an acceleration change amount Δy and an action time t1 for determining whether the motion made by the user matches the preset action mode.
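For illustration, the preset action condition A1 can be pictured as a small record holding the axis, the acceleration change amount Δy and the action time t1. The sketch below uses Python dataclasses and made-up threshold values; the field names and the numbers (8.0 m/s², 0.3 s) are assumptions, since the patent does not specify concrete values.

```python
from dataclasses import dataclass

# Hypothetical container for the preset action condition A1; the field names
# and the example thresholds are assumptions.
@dataclass
class PresetActionCondition:
    axis: str = "z"               # single-axis (Z-axis) action mode
    accel_change_dy: float = 8.0  # acceleration change amount Δy, in m/s^2
    action_time_t1: float = 0.3   # action time t1, in seconds

a1 = PresetActionCondition()      # condition for a Z-axis shaking action
```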
Next, in step S140, the motion of the electronic device 100 is detected to generate motion data D1. In one embodiment, step S140 may be performed by the motion sensor 140.
Then, in step S160, it is determined whether the motion data D1 meets the preset action condition A1. If so, the process proceeds to step S170 and a virtual touch signal S2 is generated. If not, the process returns to step S140 to generate the motion data D1 again. In one embodiment, steps S160 and S170 may be performed by the control unit 180 and the touch module 120.
The virtual touch signal S2 may include touch position data. The touch position data may be a preset touch coordinate or a preset touch range on the touch panel or the touch pad. In one embodiment, the preset touch range may be a square or circular region. The touch mode represented by the virtual touch signal S2 may be a single tap, a continuous tap in one area, a trajectory-type touch (simulating a slide or drag), a pressure-type touch (a hard press), and so on. In one embodiment, the virtual touch signal S2 can be configured by the user and stored in the memory 160; the user may perform this configuration directly through the touch module 120.
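The virtual touch signal S2 can likewise be pictured as a record combining touch position data with a touch mode. The sketch below is illustrative only; the field names, the mode strings and the example coordinate are invented for the example and are not prescribed by the patent.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical representation of the virtual touch signal S2: a preset touch
# coordinate plus a touch mode. The mode strings are invented labels for the
# modes listed above (tap, repeated tap, trajectory, pressure).
@dataclass
class VirtualTouchSignal:
    position: Tuple[int, int]   # preset touch coordinate (x, y) on the panel
    mode: str = "tap"           # "tap", "repeat_tap", "trajectory", "pressure"
    pressure: float = 1.0       # only meaningful for the pressure mode

# Example: a user-configured virtual tap near the centre of the screen.
s2 = VirtualTouchSignal(position=(540, 1170), mode="tap")
```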
Referring to FIG. 3 and FIG. 4 together, FIG. 3 is a flowchart of an embodiment of the process for determining whether the motion data D1 meets the preset action condition A1, and FIG. 4 is the corresponding waveform diagram. In this embodiment, the motion data D1 is Z-axis acceleration data, and the vertical axis of the waveform diagram is the Z-axis acceleration value.
The flow continues from step S140 of FIG. 2. First, in step S150, it is determined whether the motion data D1 meets a start condition. If not, the process returns to step S140 to detect the motion data D1 again. If the motion data D1 meets the start condition, the process proceeds to step S162 to determine whether the motion data D1 meets the preset action condition A1. In one embodiment, the start condition is whether the change amount of the motion data D1 exceeds a start change amount Δx, so as to distinguish the stationary state of the electronic device 100 from its moving state.
In step S162, the process first determines whether the change amount of the motion data D1 reaches an acceleration change amount Δy, where the acceleration change amount Δy is larger than the start change amount Δx. If the change amount of the motion data D1 is smaller than the acceleration change amount Δy, the process returns to step S140 to detect the motion data D1 again. If the change amount of the motion data D1 reaches the acceleration change amount Δy, the flow proceeds to step S164.
In step S164, the process determines whether the elapsed time Δt for the motion data D1 to reach the acceleration change amount Δy is less than or equal to an action time t1. If the elapsed time Δt is greater than the action time t1, the process returns to step S140 to detect the motion data D1 again. If the elapsed time Δt is less than or equal to the action time t1, the motion data D1 is determined to meet the preset action condition A1, the process proceeds to step S170, and the virtual touch signal S2 is generated.
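The determination flow of FIG. 3 (steps S150, S162, S164 and S170) can be summarized as a sampling loop. The following is a minimal sketch that restructures the flowchart into code; the threshold values, the sampling period and the read_z_accel callable are assumptions added for illustration, not values taken from the patent.

```python
import time

def z_shake_detected(read_z_accel, baseline, dx=1.5, dy=8.0, t1=0.3,
                     sample_period=0.005):
    """Sketch of steps S150/S162/S164: returns True when the Z-axis change
    amount first exceeds the start change amount dx (Δx) and then reaches the
    acceleration change amount dy (Δy) within the action time t1.

    read_z_accel is a caller-supplied function returning the current Z-axis
    acceleration; baseline is the initial (resting) reading. All numeric
    defaults are illustrative assumptions.
    """
    while True:
        change = abs(read_z_accel() - baseline)
        if change < dx:                 # S150: start condition not met yet
            time.sleep(sample_period)   # keep sampling (back to S140)
            continue
        t_start = time.monotonic()      # start condition met
        while time.monotonic() - t_start <= t1:       # S164: within action time t1
            if abs(read_z_accel() - baseline) >= dy:  # S162: Δy reached in time
                return True                           # proceed to S170: generate S2
            time.sleep(sample_period)
        # Δy was not reached within t1: fall back to sampling (S140)
```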
In one embodiment, to improve the accuracy of the determination, before the motion data D1 is generated in step S140, the method further includes setting the current reading of the motion sensor 140 as an initial value, so as to account for different usage postures of the user, such as lying down or sitting.
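A simple way to picture this calibration step is to average a short burst of readings while the device is at rest and use the result as the initial value; the result could then serve as the baseline parameter of the sketch above. The averaging over a fixed number of samples is an added assumption, not a step prescribed by the patent.

```python
import time

def calibrate_baseline(read_z_accel, samples=50, sample_period=0.005):
    # Take the current resting reading of the motion sensor as the initial
    # value, so that later change amounts are measured relative to the user's
    # posture (sitting, lying down, ...). Sample count and period are assumed.
    total = 0.0
    for _ in range(samples):
        total += read_z_accel()
        time.sleep(sample_period)
    return total / samples
```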
In one embodiment, the preset action condition A1 can be set by the user and stored in the memory 160; the user can perform this setting directly through the touch module 120.
FIG. 5 is a flowchart of another embodiment of the process for determining whether the motion data D1 meets the preset action condition A1 according to the present disclosure. Steps that are the same as those in FIG. 3 are denoted by the same reference numerals. In contrast to the embodiment of FIG. 3, in this embodiment, once the motion data D1 meets the start condition and the determination of whether the motion data D1 meets the preset action condition A1 begins, a time interval t2 starts to be counted; the waveform with the largest amplitude (i.e., the largest acceleration change) within the time interval t2 is extracted as the input signal for the determination, and the remaining smaller waveforms are ignored.
Subsequently, in step S262, which is similar to step S162 of FIG. 3, the flow first determines whether the change amount corresponding to this waveform reaches the acceleration change amount Δy. If the change amount corresponding to the waveform is smaller than the acceleration change amount Δy, the process returns to step S140 to detect the motion data D1 again. If the change amount corresponding to the waveform reaches the acceleration change amount Δy, the flow proceeds to step S264.
In step S264, the process determines whether the elapsed time for the waveform to reach the acceleration change amount Δy is less than or equal to the action time t1. If the elapsed time is greater than the action time t1, the process returns to step S140 to detect the motion data D1 again. If the elapsed time is less than or equal to the action time t1, the motion data D1 is determined to meet the preset action condition A1, the flow proceeds to step S170, and the virtual touch signal S2 is generated.
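The FIG. 5 variant can be sketched in the same style: after the start condition is met, samples are collected for the time interval t2, only the peak change amount (the dominant waveform) is kept, and the Δy / t1 test is applied to that peak. The thresholds, the sampling period and the function name below are assumptions for illustration.

```python
import time

def z_shake_detected_windowed(read_z_accel, baseline, dx=1.5, dy=8.0,
                              t1=0.3, t2=0.5, sample_period=0.005):
    """Sketch of the FIG. 5 variant (steps S262/S264): keep only the largest
    change amount seen within the interval t2 and require that it reaches Δy
    within the action time t1. Numeric defaults are illustrative assumptions."""
    # Wait for the start condition (change amount exceeds Δx).
    while abs(read_z_accel() - baseline) < dx:
        time.sleep(sample_period)
    t_start = time.monotonic()
    peak_change, peak_at = 0.0, None
    while time.monotonic() - t_start < t2:        # collect within interval t2
        change = abs(read_z_accel() - baseline)
        if change > peak_change:
            peak_change, peak_at = change, time.monotonic() - t_start
        time.sleep(sample_period)
    # S262: the peak must reach Δy; S264: and it must have done so within t1.
    return peak_change >= dy and peak_at is not None and peak_at <= t1
```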
FIG. 6 is a flowchart of another embodiment of the control method of the present disclosure. Steps that are the same as those in FIG. 2 are denoted by the same reference numerals. To prevent motion data D1 picked up by the motion sensor 140 during normal operation of the electronic device 100 from causing the touch module 120 to malfunction, this embodiment, compared with the embodiment of FIG. 2, further includes a determining step S350 after the motion data D1 is generated in step S140, which determines whether the electronic device 100 is in a specific operation mode, such as being held horizontally. If the electronic device 100 is in the specific operation mode, the process proceeds to step S360 to determine whether the motion data D1 meets the preset action condition A1. If so, a virtual touch signal S2 is generated in step S370; if not, the process returns to step S140 to generate the motion data D1 again.
The embodiment of FIG. 6 determines whether to generate the virtual touch signal S2 by determining whether the electronic device 100 is in a specific operation mode, but the present disclosure is not limited thereto. In another embodiment, to prevent the motion data D1 received by the motion sensor 140 from causing the touch module 120 to malfunction, the electronic device 100 may be configured to generate the virtual touch signal S2 only after receiving a start signal. The start signal may be a gesture signal received by the touch module 120 or a specific motion pattern detected by the motion sensor 140.
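Both safeguards, the specific-operation-mode check of FIG. 6 and the start-signal alternative just described, amount to gating the generation of the virtual touch signal S2. The sketch below illustrates that gating; the class name, the horizontal-grip flag and the arming mechanism are invented for the example and are not part of the patent.

```python
class VirtualTouchGate:
    """Sketch of the two safeguards: either require a specific operation mode
    (here assumed to be a horizontal grip, as in FIG. 6) or require a prior
    start signal that arms the feature. Names and structure are assumptions."""

    def __init__(self, require_operation_mode=True):
        self.require_operation_mode = require_operation_mode
        self.armed = False                  # set by the optional start signal

    def on_start_signal(self):
        # e.g. a gesture received by the touch module or a specific motion
        # pattern detected by the motion sensor
        self.armed = True

    def allows(self, in_operation_mode):
        if self.require_operation_mode:
            return in_operation_mode        # FIG. 6: step S350 gate
        return self.armed                   # alternative: start-signal gate
```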
Through the electronic device and the control method of the present disclosure, while a user provides input through the touch module 120, the motion sensor 140 can also be used to generate the virtual touch signal S2 as input in place of an actual touch, thereby improving control efficiency.
Although the present invention has been described with reference to the above embodiments, it should be understood that the invention is not limited thereto; those skilled in the art may make various modifications and variations without departing from the spirit and scope of the present invention as defined by the appended claims.
Claims (7)
1. An electronic device, comprising:
a touch module, used for generating a touch signal;
a motion sensor, used for detecting the motion of the electronic device to generate motion data;
a memory, used for storing a preset action condition; and
a control unit, electrically connected with the touch module, the motion sensor and the memory, and used for:
receiving the motion data;
judging whether a change amount of the motion data exceeds a start change amount; and
when the change amount of the motion data reaches an acceleration change amount that is larger than the start change amount, and an elapsed time for the change amount of the motion data to reach the acceleration change amount is less than an action time, judging that the motion data meets the preset action condition, and generating a virtual touch signal.
2. The electronic device of claim 1, wherein the motion sensor is an accelerometer or a gyroscope.
3. The electronic device of claim 1, wherein the preset action condition corresponds to a single-axis action mode.
4. The electronic device of claim 3, wherein the single-axis action mode is a shaking mode.
5. The electronic device of claim 1, wherein the motion data is single-axis motion data.
6. The electronic device of claim 1, wherein the virtual touch signal comprises touch position data and touch mode data.
7. A control method suitable for an electronic device, the control method comprising:
setting a preset action condition;
detecting the motion of the electronic device to generate motion data;
judging whether a change amount of the motion data exceeds a start change amount; and
when the change amount of the motion data reaches an acceleration change amount that is larger than the start change amount, and an elapsed time for the change amount of the motion data to reach the acceleration change amount is less than an action time, judging that the motion data meets the preset action condition, and generating a virtual touch signal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011204761.0A (granted as CN114442830B) | 2020-11-02 | 2020-11-02 | Electronic device and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114442830A CN114442830A (en) | 2022-05-06 |
CN114442830B (en) | 2024-06-18 |
Family
ID=81358250
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011204761.0A (CN114442830B, Active) | Electronic device and control method thereof | 2020-11-02 | 2020-11-02 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114442830B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108920228A (en) * | 2018-05-28 | 2018-11-30 | 云谷(固安)科技有限公司 | A kind of control instruction input method and input unit |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102096490A (en) * | 2009-12-09 | 2011-06-15 | 华硕电脑股份有限公司 | Method for controlling touch module and electronic device |
JP2016038905A (en) * | 2014-08-08 | 2016-03-22 | パナソニックIpマネジメント株式会社 | Input device and control method of apparatus |
TWI606366B (en) * | 2015-01-05 | 2017-11-21 | 金寶電子工業股份有限公司 | Wearable apparatus, display method thereof, and control method thereof |
CN105094281A (en) * | 2015-07-20 | 2015-11-25 | 京东方科技集团股份有限公司 | Control method and control module used for controlling display device and display device |
US10705731B2 (en) * | 2017-08-17 | 2020-07-07 | The Boeing Company | Device operational control systems and methods |
EP3521983A1 (en) * | 2018-02-01 | 2019-08-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
WO2020080733A1 (en) * | 2018-10-16 | 2020-04-23 | 주식회사 콕스스페이스 | Interface device for controlling virtual content |
CN111225105B (en) * | 2018-11-26 | 2021-08-17 | 奇酷互联网络科技(深圳)有限公司 | Method for controlling screen work, mobile terminal and storage medium |
- 2020-11-02: Application CN202011204761.0A filed in China (granted as CN114442830B, status Active)
Also Published As
Publication number | Publication date |
---|---|
CN114442830A (en) | 2022-05-06 |
Legal Events
Code | Title
---|---
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant