WO2018053936A1 - Interaction method and electronic device - Google Patents


Info

Publication number
WO2018053936A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
user
touch screen
acceleration sensor
gesture
Prior art date
Application number
PCT/CN2016/108110
Other languages
English (en)
Chinese (zh)
Inventor
余尚春
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN201680089221.8A (granted as CN109690446B)
Publication of WO2018053936A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present application relates to the field of communications, and in particular, to an interaction method and an electronic device.
  • smart wearable devices such as wristbands and watches can support a swim tracking function and remain watertight for long periods at depths of 50 or even 100 meters.
  • however, the capacitive touch screens used in electronic devices do not work properly once wetted: they fail, become insensitive, or register false touches, and return to normal only after the water is wiped dry. The touch screen of an electronic device may also malfunction in other scenarios.
  • to address this, industry products usually support human-computer interaction by adding at least one physical button.
  • the embodiments of the present application provide an interaction method and an electronic device that achieve accurate human-computer interaction when the touch screen fails, without increasing the cost of the electronic device.
  • an interaction method is provided for use in an electronic device that includes an acceleration sensor.
  • the method specifically includes: determining whether the touch screen of the electronic device is in a failure state; and, if it is, collecting through the acceleration sensor an operation gesture of the user using the electronic device, and performing the operation indicated by that gesture.
  • the user can issue an interactive command to the electronic device through different operation gestures, and the electronic device collects the user's operation gesture through the acceleration sensor to respond to the interactive command, and completes the human-computer interaction.
  • because the acceleration sensor already exists in the electronic device, the scheme adds no hardware or waterproofing cost; and because gesture acquisition does not go through the touch screen, it is not made inaccurate by a screen that has become insensitive or erroneous in the failure state. The human-computer interaction scheme provided by the present application therefore realizes human-computer interaction when the touch screen fails, without increasing the cost of the electronic device.
  • the failure state refers to a state in which the touch screen of the electronic device cannot perform human-computer interaction normally, resulting in misidentification.
  • the failure state may include, but is not limited to, a water-wetted state, a touch screen fault state, and the like.
  • the electronic device further includes a touch sensing sensor.
  • the method may further include: if the touch screen of the electronic device is in a failure state, turning off the touch sensing sensor.
  • the touch sensing sensor cannot accurately collect the touch operation of the user.
  • the touch sensing sensor is turned off to disable the touch interaction function of the electronic device, thereby preventing false detections while the touch screen of the electronic device is in a failure state.
  • the electronic device only relies on the acceleration sensor for human-computer interaction, which improves the accuracy of the human-computer interaction when the electronic device is in a malfunction state.
  • the method includes: receiving an underwater activity monitoring start instruction input by a user.
  • when the user inputs the underwater activity monitoring activation instruction, it indicates that the user is about to use the electronic device for underwater activity.
  • underwater activities may include, but are not limited to, swimming, diving and the like.
  • the underwater activity monitoring may include collecting and recording the user's underwater activity distance, underwater activity speed and other parameters through the acceleration sensor when the user performs underwater activities.
  • the user can input the underwater activity monitoring activation indication in touch mode, for example by touching a swimming-icon start button or an underwater activity monitoring activation area on the electronic device's touch screen.
  • the user can also input the underwater activity monitoring start instruction through a preset physical button, or by other means; this is not specifically limited in this application.
  • the acceleration sensor is then used to monitor the user's underwater use of the electronic device. Because the acceleration signals collected by the acceleration sensor differ between motion states, acceleration signals for the different motion states are collected in advance and analyzed by training algorithms such as neural networks to extract motion-state recognition features, which are preset in the electronic device.
  • in practical applications, the electronic device collects the acceleration signal of the user's motion through its acceleration sensor and extracts feature values. When the feature values match the underwater-motion features, the device determines that the user is using it underwater. In this way, when the touch screen is wetted by underwater use, human-computer interaction is still achieved without increasing the cost of the electronic device.
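The feature-matching step described above can be sketched as follows. This is only an illustrative sketch, not code from the patent: the two features (mean and standard deviation of acceleration magnitude) and the numeric ranges in `UNDERWATER_PROFILE` are invented placeholders for whatever features a real device would learn offline, e.g. with a neural network.

```python
import math

def extract_features(samples):
    # Reduce a window of (x, y, z) acceleration samples to two simple
    # features: mean magnitude and standard deviation of magnitude.
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return {"mean": mean, "std": math.sqrt(var)}

# Hypothetical preset ranges for the "underwater motion" state, standing in
# for the trained recognition features stored on a real device.
UNDERWATER_PROFILE = {"mean": (9.0, 14.0), "std": (1.5, 6.0)}

def matches_underwater(samples, profile=UNDERWATER_PROFILE):
    # The device decides "underwater use" when every extracted feature
    # value falls inside the stored range for that feature.
    feats = extract_features(samples)
    return all(lo <= feats[k] <= hi for k, (lo, hi) in profile.items())
```

A vigorous, stroke-like signal matches the profile while a device lying still does not, which is the distinction the monitoring step relies on.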
  • when the failure state is a water-wetted state, a specific implementation is provided for determining whether the touch screen of the electronic device is in a failure state: the electronic device further includes a touch sensing sensor, and the touch screen is determined to be in a failure state when the touch parameters of the user acquired by the touch sensing sensor satisfy a failure-state touch condition.
  • when the touch screen is wet, touch operation parameters such as the number of touch points and the touch-point area differ from those in the non-failure state; therefore, when the touch parameters acquired by the touch sensing sensor satisfy the failure-state touch condition, it can be determined that the touch screen is in a failure state.
  • in this way, the electronic device automatically detects that the touch screen is wet, and human-computer interaction is achieved without increasing the cost of the electronic device.
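A minimal sketch of such a failure-state touch condition, under assumed thresholds: both constants below are invented for illustration, since the patent leaves the actual condition to per-device calibration. A wet capacitive panel typically reports many simultaneous contacts and abnormally large contact areas.

```python
# Hypothetical thresholds; a real device would calibrate these per panel.
MAX_NORMAL_TOUCH_POINTS = 5
MAX_NORMAL_AREA_MM2 = 120.0

def failure_state_touch_condition(touch_points):
    """touch_points: one dict per detected contact, with an 'area_mm2' key."""
    if len(touch_points) > MAX_NORMAL_TOUCH_POINTS:
        return True  # more simultaneous contacts than a hand plausibly produces
    # A single unusually large contact blob also suggests water on the panel.
    return any(p["area_mm2"] > MAX_NORMAL_AREA_MM2 for p in touch_points)
```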
  • collecting the user's operation gesture through the acceleration sensor and performing the operation it indicates may include: first collecting, through the acceleration sensor, a start-gesture-interaction indication input by the user; thereafter collecting the user's operation gestures through the acceleration sensor and performing the operations they indicate. Collecting operation gestures only after the start indication avoids misreading the user's ordinary, non-interaction movements as interaction commands, improving the accuracy of human-computer interaction when the touch screen fails due to water.
  • the method may further include: collecting, through the acceleration sensor, an exit-gesture-interaction indication input by the user; after the exit indication is collected, stopping collecting the user's operation gestures through the acceleration sensor.
  • after the exit indication, gesture collection through the acceleration sensor stops, again avoiding misreading the user's ordinary movements as interaction commands and improving the accuracy of human-computer interaction when the touch screen fails due to water.
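The start/exit flow above amounts to a small state machine, sketched below. The gesture names ("start", "exit", and so on) are placeholders for whatever acceleration patterns a real device recognizes; they are not defined by the patent.

```python
class GestureInteraction:
    """Minimal state machine for the start/exit gesture interaction flow."""

    def __init__(self):
        self.active = False   # gesture interaction not yet started
        self.executed = []    # operations performed so far

    def on_gesture(self, gesture):
        if not self.active:
            # Ignore everything until the start-interaction gesture, so
            # ordinary arm movement is not misread as a command.
            if gesture == "start":
                self.active = True
            return
        if gesture == "exit":
            # Stop treating further movement as interaction commands.
            self.active = False
            return
        self.executed.append(gesture)
```

Feeding a stream of gestures through `on_gesture` executes only those between a start and an exit indication.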
  • the method may further include: if the touch screen of the electronic device is in a failure state, displaying prompt information to the user of the electronic device, the prompt information being used to indicate that the electronic device has entered the gesture interaction mode, so that the user interacts with the electronic device through operation gestures.
  • the definition of the operation gesture and the operation corresponding to each operation gesture may also be displayed to the user of the electronic device.
  • the debugging interface of the operation gesture may also be displayed to the user of the electronic device for debugging each operation gesture.
  • when an operation gesture is being debugged, the gesture may be dynamically demonstrated to the user.
  • when debugging an operation gesture, the user may be prompted to input the gesture; the debugging input is collected through the acceleration sensor, and the device determines whether it is recognizable. If the input gesture is recognizable, a match indication is output to the user; if it is unrecognizable, a prompt to re-enter the gesture for debugging is output.
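The debugging loop just described can be sketched as below. The `collect` and `recognize` callables are injected so the sketch stays hardware-free; on a real device `collect` would read the acceleration sensor, and `max_attempts` is an assumed limit not specified by the patent.

```python
def debug_gesture(collect, recognize, max_attempts=3):
    """Prompt-collect-check loop for debugging one operation gesture.

    collect()    -> raw gesture sample (here, any value; None means no input)
    recognize(s) -> gesture label if s is recognizable, else None
    """
    for _ in range(max_attempts):
        sample = collect()
        label = recognize(sample)
        if label is not None:
            return ("matched", label)   # output a match indication
        # otherwise: prompt the user to re-enter the gesture and retry
    return ("unrecognized", None)
```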
  • the method may further include: receiving an indication, input by the user of the electronic device, to preset an operation gesture, and then collecting through the acceleration sensor the custom operation gesture input by the user together with the operation it is to indicate.
  • the method further includes: continuing to determine whether the touch screen of the electronic device is in a failure state; if it is still in a failure state, continuing to collect through the acceleration sensor the operation gestures of the user using the electronic device and performing the operations they indicate; if it is no longer in a failure state, stopping collecting the user's operation gestures through the acceleration sensor and performing human-computer interaction through the touch sensing sensor.
  • before human-computer interaction is performed through the touch sensing sensor, if the touch sensing sensor has been turned off, it is first turned on, and human-computer interaction then proceeds through it.
  • the electronic device may be a wearable device.
  • the wearable device can be a smart watch or a sports bracelet.
  • the present application provides an electronic device that can implement the functions in the foregoing method examples, and the functions can be implemented by using hardware or by executing corresponding software by hardware.
  • the hardware or software includes one or more modules corresponding to the above functions.
  • the structure of the electronic device includes a processor and a transceiver, the processor being configured to support the electronic device in performing the corresponding functions of the above method.
  • the transceiver is used to support communication between the electronic device and other devices.
  • the electronic device can also include a memory for coupling with the processor that retains the program instructions and data necessary for the electronic device.
  • the present application provides a computer storage medium for storing computer software instructions for use in the above electronic device, comprising a program designed to perform the above aspects.
  • FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 3 is a schematic flowchart diagram of an interaction method according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of a display interface of an electronic device according to an embodiment of the present application.
  • FIG. 5 is a schematic flowchart of monitoring a user using an electronic device under water according to an embodiment of the present invention
  • FIG. 6 is a schematic flowchart diagram of another interaction method according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic diagram of another electronic device display interface according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of another electronic device according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of still another electronic device according to an embodiment of the present application.
  • the basic principle of the present application is: when the touch screen of the electronic device fails, the operation gesture of the user is collected by the acceleration sensor existing in the electronic device, and the operation of the gesture indication is performed to realize human-computer interaction. Therefore, when the touch screen fails, the user can issue an interactive command to the electronic device through different operation gestures, and the electronic device collects the user's operation gesture through the acceleration sensor to respond to the interactive command, thereby completing the human-computer interaction.
  • because the acceleration sensor already exists in the electronic device, the solution adds no hardware or waterproofing cost; and because gesture acquisition does not go through the touch screen, it is not made inaccurate by a screen that has become insensitive or erroneous in the failure state. The solution provided by the present application therefore realizes human-computer interaction when the touch screen fails due to water, without increasing the cost of the electronic device.
  • the interaction method provided by the embodiment of the present application can be applied to an electronic device.
  • the electronic device can include, but is not limited to, a wearable device, a user terminal, and the like.
  • the wearable device can include, but is not limited to, a smart watch, a sports bracelet, and the like.
  • the user terminal may include, but is not limited to, a mobile phone, a tablet, and the like.
  • FIG. 1 illustrates an electronic device 10 including an acceleration sensor 101 (located inside the electronic device 10 and not shown); the user wears the electronic device 10, and the user's movement is recorded by the acceleration sensor 101 in the electronic device 10.
  • the electronic device 10 can also communicate with the terminal through Bluetooth technology or infrared technology or other wireless technologies to implement functions such as data synchronization and parameter configuration.
  • the electronic device 10 can also implement functions such as data synchronization and parameter configuration with a computer through Bluetooth technology or infrared technology or other wireless technologies or wired technologies.
  • the specific content of the electronic device 10 is not limited in the embodiment of the present application.
  • FIG. 1 is only an example to describe an application scenario of the solution of the present application.
  • the interaction method provided by the embodiment of the present application is performed by the electronic device 20 provided by the embodiment of the present application.
  • the electronic device 20 may be the electronic device 10 shown in FIG. 1 or other electronic devices, which is not specifically limited in this application.
  • FIG. 2 is a block diagram showing the structure of an electronic device 20 related to various embodiments of the present application.
  • the electronic device 20 can include: a processor 201, a memory 202, a touch sensing sensor 203, an acceleration sensor 204, and a touch screen 205.
  • the memory 202 can be volatile memory, such as random-access memory (RAM); or non-volatile memory, such as read-only memory (ROM), flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or a combination of the above types of memory, and is used for storing the related applications and configuration files that implement the methods of the present application.
  • the processor 201 is the control center of the electronic device 20, and may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application, such as one or more digital signal processors (DSPs) or field-programmable gate arrays (FPGAs).
  • the processor 201 can perform various functions of the electronic device 20 by running or executing software programs and/or modules stored in the memory 202, as well as invoking data stored in the memory 202.
  • the touch sensing sensor 203 is configured to collect, through the touch screen 205, the touch operations of a user using the electronic device 20; after a touch operation is collected, the touch sensing sensor 203 feeds it back to the processor 201 so that the processor 201 performs the operation the touch indicates.
  • Touch screen 205 is also used to present information to a user.
  • the acceleration sensor 204 is used to collect motion-form operations of the user using the electronic device 20; after such an operation is acquired, the acceleration sensor 204 feeds it back to the processor 201, so that the processor 201 performs the operation the motion indicates.
  • specifically, the processor 201 performs the following functions by running or executing the software programs and/or modules stored in the memory 202 and calling the data stored in the memory 202: determining whether the touch screen of the electronic device 20 is in a failure state; and, if it is, collecting through the acceleration sensor 204 the operation gestures of the user using the electronic device and performing the operations they indicate.
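One pass of that control flow can be sketched as follows. The hardware accesses are injected as callables because nothing here corresponds to a real driver API; all names are illustrative.

```python
def interaction_step(touch_screen_failed, read_gesture, read_touch,
                     perform, set_touch_sensor):
    """One pass of the control flow run by the processor (names hypothetical).

    touch_screen_failed() -> bool; read_gesture()/read_touch() -> command or
    None; perform(cmd) executes it; set_touch_sensor(on) enables/disables
    the touch sensing sensor.
    """
    if touch_screen_failed():
        set_touch_sensor(False)      # suppress false touch input while wet
        gesture = read_gesture()     # collected via the acceleration sensor
        if gesture is not None:
            perform(gesture)
    else:
        set_touch_sensor(True)       # normal touch interaction resumes
        touch = read_touch()
        if touch is not None:
            perform(touch)
```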
  • an embodiment of the present application provides an interaction method applied to an electronic device, where the electronic device includes an acceleration sensor. As shown in FIG. 3, the method may include:
  • the failure state refers to a state in which the touch screen of the electronic device cannot accurately recognize the touch operation of the user due to water, stains, malfunctions, or other causes.
  • the failure state may include, but is not limited to, a water wet state, a fault state, and the like.
  • the specific type of the failure state may be set according to actual requirements, which is not specifically limited in this embodiment of the present application.
  • the solution of the present application can be used for accurate human-computer interaction.
  • determining whether the touch screen of the electronic device is in a failure state in S301 may include, but is not limited to, the following three implementation schemes:
  • receiving an underwater activity monitoring activation indication input by the user of the electronic device.
  • Underwater activities may include, but are not limited to, swimming, diving, and the like.
  • the underwater activity monitoring may include collecting and recording the user's underwater activity distance, underwater activity speed and other parameters through the acceleration sensor when the user performs underwater activities.
  • the user can input the underwater activity monitoring activation indication in touch mode, for example by touching a swimming-icon start button or an underwater activity monitoring activation area on the electronic device's touch screen.
  • the user can also input the underwater activity monitoring start instruction by using a preset physical button, or the user can input the underwater activity monitoring start instruction by other means, which is not specifically limited in this application.
  • for example, the user can touch the "exercise" mode in the interface shown in (a) of FIG. 4 to enter the next-level menu of the "exercise" mode. Because the touch screen of a sports bracelet is small, the user must slide the screen to display the exercise modes in this menu, as shown in (b) to (d) of FIG. 4. When the user touches and selects swimming in the interface shown in (d) of FIG. 4, the underwater activity monitoring start instruction is input.
  • the acceleration sensor is used to monitor whether the user is using the electronic device underwater.
  • the acceleration signals collected by the acceleration sensors in the electronic device are different according to different motion states.
  • the process of monitoring the user's underwater use of the electronic device by using the acceleration sensor may specifically include:
  • different motion states include at least all motions that the electronic device can monitor. For example, walking, running, cycling, swimming, diving, etc.
  • the methods of training analysis include algorithms such as neural networks.
  • the method of the training analysis is not specifically limited in the embodiment of the present application.
  • S54: if the acceleration signal collected while the user moves in S53 matches the underwater-motion-state recognition features stored in the electronic device in S52, S55 is performed; otherwise, S53 is re-executed.
  • in another implementation scheme, the electronic device further includes a touch sensing sensor, and the touch screen is determined to be in a failure state when the touch parameters of the user acquired by the touch sensing sensor satisfy a failure-state touch condition.
  • when the touch screen is wet, the touch operation parameters (such as the number of touch points, the touch-point area, and the like) differ from those in the non-failure state; therefore, when the touch parameters of the user acquired by the touch sensing sensor satisfy the failure-state touch condition, it can be determined that the touch screen is in a failure state.
  • the detection in S301 is similar to the foregoing implementation scheme 3, and details are not described herein.
  • other detection methods for determining whether the touch screen is faulty may also be configured in the electronic device, and details are not described herein; any implementation that can determine whether the touch screen is faulty can be used to implement the function of S301.
  • the operation gesture is an operation behavior performed by the user on the electronic device, and the operation gesture can be quantized into a parameter set recognizable by the electronic device.
  • the parameter set may include, but is not limited to, a motion trajectory, a starting point, a rotation, a rotation direction, a rotation angle, a displacement distance, and the like.
  • the embodiment of the present application does not specifically limit the quantized content of the operation gesture.
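One way to picture the quantization just described is a small record type holding the listed parameters. This is purely illustrative; the field names and units are assumptions, not taken from the patent's claims.

```python
from dataclasses import dataclass, field

@dataclass
class OperationGesture:
    """A possible parameter set quantizing one operation gesture."""
    name: str
    trajectory: list = field(default_factory=list)   # sampled (x, y, z) points
    start_point: tuple = (0.0, 0.0, 0.0)
    rotation_direction: str = "none"                 # e.g. "inward", "outward"
    rotation_angle_deg: float = 0.0
    displacement_mm: float = 0.0

# Example: a wrist-inversion gesture quantized as an inward 90-degree rotation.
wrist_inversion = OperationGesture(
    name="wrist_inversion",
    rotation_direction="inward",
    rotation_angle_deg=90.0,
)
```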
  • the operation gesture can be a wrist inversion, a wrist eversion, a single tap, and the like.
  • the above operation gestures are only examples and do not limit the content of the operation gestures.
  • the content of the operation gesture can be defined according to the actual needs in the actual application, which is not specifically limited in this application.
  • the operation indicated by the operation gesture refers to a response operation of the interactive instruction sent to the electronic device when the user performs the operation gesture.
  • for example, the operation indicated by the wrist-inversion gesture may be turning to the next page or next option of the currently displayed content; the operation indicated by the wrist-eversion gesture may be turning to the previous page or previous option of the currently displayed content; and the operation indicated by a single tap may be confirming or selecting the currently displayed content.
  • these mappings are merely illustrative examples and do not limit the operations that operation gestures may indicate.
  • the content of the operation indicated by the operation gesture may be defined according to actual needs in an actual application, which is not specifically limited in this application.
  • the operation gestures and the operations they indicate may be stored in the electronic device in advance, or stored in the electronic device according to the input of the user; this is not specifically limited in the embodiments of the present application.
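The stored gesture-to-operation correspondence can be pictured as a simple lookup table mirroring the examples above. The gesture and operation names are illustrative placeholders, and a real device would populate this table at the factory or from user input.

```python
# Hypothetical gesture-to-operation table mirroring the examples above.
GESTURE_OPERATIONS = {
    "wrist_inversion": "next_page_or_option",
    "wrist_eversion": "previous_page_or_option",
    "single_tap": "confirm_or_select",
}

def operation_for(gesture):
    """Return the operation a recognized gesture indicates, or None."""
    return GESTURE_OPERATIONS.get(gesture)
```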
  • collecting the user's operation gesture through the acceleration sensor and performing the indicated operation may be implemented in the following two ways:
  • the first implementation: after the touch screen of the electronic device is determined to be in a failure state, the operation gestures of the user using the electronic device are collected through the acceleration sensor and the indicated operations are performed directly; that is, every collected gesture is treated as a human-computer interaction gesture.
  • the second implementation: after the touch screen of the electronic device is determined to be in a failure state, a start-gesture-interaction indication input by the user is first collected through the acceleration sensor; only after this indication is collected are the user's operation gestures collected through the acceleration sensor and the indicated operations performed.
  • further, while the user's operation gestures are being collected through the acceleration sensor and the indicated operations performed, an exit-gesture-interaction indication input by the user may also be collected through the acceleration sensor; after the exit indication is collected, the collection of operation gestures and execution of the indicated operations stops.
  • in this way, when the touch screen of the electronic device fails, the user can send interactive instructions to the electronic device through different operation gestures, and the electronic device collects the user's operation gestures through the acceleration sensor, responds to the instructions, and completes the human-computer interaction.
  • because the scheme uses the acceleration sensor already present in the electronic device, it adds no hardware or waterproofing cost; and because gesture acquisition does not go through the touch screen, it is not made inaccurate by a screen that has become insensitive or erroneous in the failure state. The human-computer interaction scheme provided by the present application therefore realizes human-computer interaction when the touch screen fails, without increasing the cost of the electronic device.
• optionally, if it is determined that the touch screen of the electronic device is not in a failure state, S301 is re-executed, and S302 is performed only once it is determined that the touch screen of the electronic device is in a failure state.
  • the electronic device with a touch screen further includes a touch sensing sensor. As shown in FIG. 6, after determining whether the touch screen of the electronic device is in a failure state, the method may further include:
• S303 and S302 may be executed at the same time or in succession, which is not specifically limited in this application.
  • the sequence of the steps may be determined according to actual requirements.
  • the embodiment of the present application does not specifically limit this.
  • the description in FIG. 1 is merely an example, and the order of execution between the steps is not limited.
  • the method may further include:
• the prompt information is used to prompt the user that the electronic device has entered the gesture interaction mode.
  • the prompt information may be a light prompt, including a flashing frequency of the light or a color of the light.
• the specific meaning of the light prompt used as the prompt information can be described in the user manual of the electronic device.
  • the prompt information may be a display content prompt, and is presented to the user through a display screen of the electronic device.
• the display content can include text, graphics, or other forms; the embodiment of this application is not specifically limited thereto.
• the above description of the prompt information is merely exemplary and does not specifically limit the content and form of the prompt information.
  • the definition of the operation gesture may also be displayed to the user.
  • the specific implementation process of displaying the definition of the operation gesture to the user is not limited in the embodiment of the present application.
  • the method further includes S305.
  • the electronic device re-determines whether the touch screen of the electronic device is in a failure state.
• if it is determined in S305 that the touch screen of the electronic device is still in a failure state, S302 (or S302 and S303) continues to be performed. If it is determined in S305 that the touch screen of the electronic device is not in a failure state, the method further includes S306, and after S306, S301 is re-executed.
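The S301 to S306 flow described above amounts to a polling loop. The sketch below is a hypothetical abstraction: the `device` interface and its method names are assumptions for illustration, not part of the application.

```python
import time

def interaction_loop(device, poll_interval_s: float = 1.0) -> None:
    """Sketch of the S301..S306 flow. `device` is assumed to expose:
      running                      -> bool loop flag
      touch_screen_failed()       -> bool  (S301 / S305)
      collect_and_run_gesture()            (S302, via the acceleration sensor)
      collect_touch_events()               (S303, may also run concurrently with S302)
      restore_touch_interaction()          (S306)
    """
    while device.running:
        if device.touch_screen_failed():                # S301
            while device.running:
                device.collect_and_run_gesture()        # S302
                device.collect_touch_events()           # S303
                if not device.touch_screen_failed():    # S305
                    device.restore_touch_interaction()  # S306
                    break                               # back to S301
        time.sleep(poll_interval_s)
```

A real device would more likely drive this from sensor interrupts than from a sleep-based poll; the loop form is only meant to mirror the numbered steps.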
• the following uses a smart watch as an example of the electronic device, and describes the display content of the smart watch when it executes the solution of the embodiment of the present application.
  • the user touches and selects a swimming option in the smart watch display interface to initiate swimming motion monitoring.
• the smart watch determines that its touch screen is in a failure state, and outputs a prompt message to the user indicating that only gesture operations are currently supported, as shown in FIG. 7(b).
• the smart watch monitors the user's swimming process and presents relevant data to the user, as shown in FIG. 7(c).
• the smart watch displays to the user that the gesture interaction mode has been entered, as shown in FIG. 7(d).
  • the smart watch collects the operation gesture input by the user and performs corresponding operations.
• the smart watch displays to the user that the gesture interaction mode has been exited, as shown in FIG. 7(e). After that, the user exits the swimming motion monitoring through a double-tap operation input, and the smart watch re-enters the touch-screen human-computer interaction mode.
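A double-tap input of the kind used in this example could, for instance, be recognized from the acceleration sensor alone by looking for two tap-like spikes in the acceleration magnitude. The sketch below is illustrative only; the threshold and timing windows are assumed values, not parameters from the application.

```python
from typing import Sequence

def detect_double_tap(magnitudes: Sequence[float],
                      threshold: float = 2.5,
                      min_gap: int = 5,
                      max_gap: int = 40) -> bool:
    """Returns True if two tap-like spikes occur between `min_gap` and
    `max_gap` samples apart. A tap is a rising edge where the
    acceleration magnitude (in g) exceeds `threshold`."""
    last_tap = None
    prev_above = False
    for i, a in enumerate(magnitudes):
        above = a > threshold
        if above and not prev_above:  # rising edge = candidate tap
            if last_tap is not None and min_gap <= i - last_tap <= max_gap:
                return True
            last_tap = i
        prev_above = above
    return False
```

The lower bound on the gap keeps one long spike from counting as two taps; the upper bound keeps two unrelated jolts from counting as a double tap.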
  • the electronic device includes corresponding hardware structures and/or software modules for performing the respective functions in order to implement the above functions.
• the present application can be implemented by hardware, or by a combination of hardware and computer software, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the present application.
  • the embodiment of the present application may divide the functional modules of the electronic device according to the foregoing method example.
  • each functional module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present application is schematic, and is only a logical function division, and the actual implementation may have another division manner.
  • FIG. 8 shows a possible structural diagram of the electronic device 80 involved in the above embodiment in the case where the respective functional modules are divided by corresponding functions.
  • the electronic device 80 includes a determination unit 801, an acquisition unit 802, and an execution unit 803.
• the determining unit 801 is configured to support the electronic device 80 in performing the process S301 in FIG. 3 or FIG. 6; the collecting unit 802 and the executing unit 803 are configured to support the electronic device 80 in performing the process S302 in FIG. 3 or FIG. 6. For all relevant content of the steps involved in the foregoing method embodiments, reference may be made to the function descriptions of the corresponding functional modules; details are not described herein again.
  • FIG. 9 shows a possible structural diagram of the electronic device 90 involved in the above embodiment.
  • the electronic device 90 can include a processing module 901 and an acquisition module 902.
  • the processing module 901 is configured to control and manage the actions of the electronic device 90.
• the processing module 901 is configured to support the electronic device 90 in performing, through the acquisition module 902, the processes S301 and S302 in FIG. 3 or FIG. 6.
  • the electronic device 90 may also include a storage module 903 for storing program codes and data of the electronic device 90.
• the processing module 901 can be the processor 201 in the physical structure of the electronic device 20 shown in FIG. 2. The processor 201 may be a processor or a controller.
• it can be a CPU, a general-purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and it may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
• the processor 201 can also be a combination implementing computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
• the acquisition module 902 can be the touch sensing sensor 203 and the acceleration sensor 204 in the physical structure of the electronic device 20 shown in FIG. 2.
  • the storage module 903 can be the memory 202 in the physical structure of the electronic device 20 shown in FIG. 2.
• when the processing module 901 is a processor, the acquisition module 902 is a touch sensing sensor and an acceleration sensor, and the storage module 903 is a memory, the electronic device 90 in the embodiment of the present application may be the electronic device 20 shown in FIG. 2.
  • the electronic device provided by the embodiment of the present application can be used to implement the method implemented in the foregoing embodiments of the present application.
• only the part related to the embodiment of the present application is shown; for specific technical details that are not disclosed, refer to the method embodiments of the present application.
• the steps of a method or algorithm described in connection with the present disclosure may be implemented directly in hardware, or may be implemented by a processor executing software instructions.
• the software instructions may be composed of corresponding software modules, which may be stored in a random access memory (RAM), a flash memory, a read-only memory (ROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable hard disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor to enable the processor to read information from, and write information to, the storage medium.
  • the storage medium can also be an integral part of the processor.
  • the processor and the storage medium can be located in an ASIC. Additionally, the ASIC can be located in a core network interface device.
  • the processor and the storage medium may also exist as discrete components in the core network interface device.
  • the functions described herein can be implemented in hardware, software, firmware, or any combination thereof.
  • the functions may be stored in a computer readable medium or transmitted as one or more instructions or code on a computer readable medium.
  • Computer readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one location to another.
  • a storage medium may be any available media that can be accessed by a general purpose or special purpose computer.
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
• the division of the units is only a logical function division; in actual implementation, there may be another division manner. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be electrical or otherwise.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may be physically included separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
• the above integrated unit implemented in the form of a software functional unit can be stored in a computer-readable storage medium.
  • the software functional unit described above is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, server, or network device, etc.) to perform portions of the steps of the methods described in various embodiments of the present application.
• the foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to the field of communications. An embodiment of the invention provides an interactive method and an electronic device capable of achieving accurate human-computer interaction when a malfunction of a touch screen occurs, without increasing the cost of the electronic device. The solution provided in the embodiment of the invention consists in: determining whether a touch screen of an electronic device is in a failure state; and if so, collecting, by means of an acceleration sensor, an operation gesture of a user using the electronic device, and performing an operation indicated by the operation gesture. The present invention applies to human-computer interaction.
PCT/CN2016/108110 2016-09-26 2016-11-30 Procédé et dispositif électronique interactif WO2018053936A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680089221.8A CN109690446B (zh) 2016-09-26 2016-11-30 一种交互方法及电子设备

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610852609 2016-09-26
CN201610852609.0 2016-09-26

Publications (1)

Publication Number Publication Date
WO2018053936A1 true WO2018053936A1 (fr) 2018-03-29

Family

ID=61690694

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/108110 WO2018053936A1 (fr) 2016-09-26 2016-11-30 Procédé et dispositif électronique interactif

Country Status (2)

Country Link
CN (1) CN109690446B (fr)
WO (1) WO2018053936A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111752380A (zh) * 2019-04-08 2020-10-09 广东小天才科技有限公司 一种基于腕式穿戴设备的交互方法及腕式穿戴设备
WO2022199624A1 (fr) * 2021-03-25 2022-09-29 华为技术有限公司 Procédé de commande d'un dispositif vestimentaire, et dispositif vestimentaire

Citations (4)

Publication number Priority date Publication date Assignee Title
CN104850332A (zh) * 2015-03-19 2015-08-19 惠州Tcl移动通信有限公司 智能终端控制方法以及智能终端
WO2016047153A1 (fr) * 2014-09-26 2016-03-31 Rakuten, Inc. Procédé et système pour détecter de l'eau, des débris ou d'autres objets étrangers sur un écran d'affichage
CN105760005A (zh) * 2014-12-19 2016-07-13 宏达国际电子股份有限公司 触控显示装置及其控制方法
CN105824401A (zh) * 2015-06-24 2016-08-03 维沃移动通信有限公司 一种移动终端的控制方法及其移动终端

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP2005063092A (ja) * 2003-08-11 2005-03-10 Keio Gijuku ハンドパターンスイッチ装置
CN103309834A (zh) * 2012-03-15 2013-09-18 中兴通讯股份有限公司 一种控制方法、控制装置和电子设备
CN103279189B (zh) * 2013-06-05 2017-02-08 合肥华恒电子科技有限责任公司 一种便携式电子设备的交互装置及其交互方法
US9684405B2 (en) * 2014-11-12 2017-06-20 Rakuten Kobo, Inc. System and method for cyclic motion gesture
US20160162146A1 (en) * 2014-12-04 2016-06-09 Kobo Incorporated Method and system for mobile device airspace alternate gesture interface and invocation thereof

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
WO2016047153A1 (fr) * 2014-09-26 2016-03-31 Rakuten, Inc. Procédé et système pour détecter de l'eau, des débris ou d'autres objets étrangers sur un écran d'affichage
CN105760005A (zh) * 2014-12-19 2016-07-13 宏达国际电子股份有限公司 触控显示装置及其控制方法
CN104850332A (zh) * 2015-03-19 2015-08-19 惠州Tcl移动通信有限公司 智能终端控制方法以及智能终端
CN105824401A (zh) * 2015-06-24 2016-08-03 维沃移动通信有限公司 一种移动终端的控制方法及其移动终端

Cited By (3)

Publication number Priority date Publication date Assignee Title
CN111752380A (zh) * 2019-04-08 2020-10-09 广东小天才科技有限公司 一种基于腕式穿戴设备的交互方法及腕式穿戴设备
CN111752380B (zh) * 2019-04-08 2024-03-19 广东小天才科技有限公司 一种基于腕式穿戴设备的交互方法及腕式穿戴设备
WO2022199624A1 (fr) * 2021-03-25 2022-09-29 华为技术有限公司 Procédé de commande d'un dispositif vestimentaire, et dispositif vestimentaire

Also Published As

Publication number Publication date
CN109690446B (zh) 2021-06-01
CN109690446A (zh) 2019-04-26

Similar Documents

Publication Publication Date Title
US10628105B2 (en) Devices, methods, and graphical user interfaces for wireless pairing with peripheral devices and displaying status information concerning the peripheral devices
JP6058814B2 (ja) 電子デバイスのためのジェスチャー検出管理
US10282090B2 (en) Systems and methods for disambiguating intended user input at an onscreen keyboard using dual strike zones
EP2940555B1 (fr) Étalonnage automatique du regard
WO2017020660A1 (fr) Procédé et appareil permettant de démarrer une fonction préétablie dans un terminal électronique pouvant être porté
US9746929B2 (en) Gesture recognition using gesture elements
US8878787B2 (en) Multi-touch user input based on multiple quick-point controllers
WO2015110063A1 (fr) Procédé, appareil et dispositif de traitement d'informations
CN105573538B (zh) 滑动断线补偿方法及电子设备
TW201237735A (en) Event recognition
EP3336679A1 (fr) Procédé et terminal pour empêcher le déclenchement accidentel d'une touche tactile et support d'informations
KR20150002786A (ko) 제스처들을 이용한 디바이스와의 상호작용
CN102262504A (zh) 带虚拟键盘的用户交互手势
US20140313363A1 (en) Systems and methods for implementing and using gesture based user interface widgets with camera input
US10228794B2 (en) Gesture recognition and control based on finger differentiation
CN105867822B (zh) 一种信息处理方法及电子设备
WO2015131590A1 (fr) Procédé pour commander un traitement de geste d'écran vide et terminal
WO2017096622A1 (fr) Procédé et appareil de rejet de fausses opérations tactiles, et dispositif électronique
WO2018053936A1 (fr) Procédé et dispositif électronique interactif
US20160188024A1 (en) Information Processing Method And Electronic Device
TWI709876B (zh) 可切換輸入法的電子裝置及其輸入法切換方法、系統
US10962593B2 (en) System on chip and operating method thereof
US11429249B2 (en) Application program data processing method and device
CN110456978B (zh) 一种用于触摸终端的触摸控制方法、系统、终端及介质
US20110205157A1 (en) System and Method for Information Handling System Touchpad Enablement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16916675

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16916675

Country of ref document: EP

Kind code of ref document: A1