WO2023231884A1 - Human-computer interaction method and electronic device - Google Patents

Human-computer interaction method and electronic device

Info

Publication number
WO2023231884A1
Authority
WO
WIPO (PCT)
Prior art keywords
program
human
data
computer interaction
electronic device
Application number
PCT/CN2023/096230
Other languages
English (en)
French (fr)
Inventor
邰彦坤
李凌飞
田龙
Original Assignee
华为技术有限公司
Application filed by 华为技术有限公司
Publication of WO2023231884A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/0486 - Drag-and-drop
    • G06F 3/04883 - Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gestures or text

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to a human-computer interaction method and an electronic device.
  • An application program can be installed in the terminal device.
  • an interactive object corresponding to the application program can be provided to the user on the human-computer interaction interface, thereby interacting with the user through the interactive object.
  • the developer of the application program can generally provide relevant operation instructions to the user in advance, and then the user can interact with the interactive object corresponding to the application program based on the method indicated in the operation instructions.
  • this application provides a human-computer interaction method and electronic device, which can reduce the difficulty and learning cost of human-computer interaction and also reduce the security risk that a first program steals the user's privacy without the user's knowledge.
  • embodiments of the present application provide a human-computer interaction method, including: displaying a first human-computer interaction interface, where the first human-computer interaction interface includes a first interactive object, and the first interactive object corresponds to a first program; and, when a first trigger operation is received, displaying a second human-computer interaction interface, where the second human-computer interaction interface includes first prompt information corresponding to the first program, the first prompt information is used to prompt a second program related to the first program, the second program is used to provide data of a first data type to the first program, and the first data type is a data type on which the first program depends for data processing.
  • in this way, a first human-computer interaction interface may be displayed, where the first human-computer interaction interface includes a first interactive object corresponding to the first program, and when the first trigger operation is received based on the first human-computer interaction interface, a second human-computer interaction interface is displayed, where the second human-computer interaction interface includes first prompt information corresponding to the first interactive object. The first prompt information thus intuitively prompts the user that there is a program related to the first program in the electronic device, namely the second program, which can provide the first program with data of the first data type on which the first program depends for data processing, so that the user can simply and clearly determine at least part of the functions and effects of the first program. This reduces the difficulty and learning cost of human-computer interaction and also reduces the security risk that the first program steals user privacy without the user's knowledge.
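The data-type matching at the heart of this method can be made concrete with a minimal sketch. The Program dataclass and its field names below are illustrative assumptions, not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class Program:
    name: str
    input_types: set = field(default_factory=set)   # "first data types" the program depends on
    output_types: set = field(default_factory=set)  # data types the program can provide

def find_second_programs(first: Program, candidates: list) -> list:
    """Return the programs that can provide at least one data type the
    first program depends on for data processing."""
    return [p for p in candidates
            if p is not first and p.output_types & first.input_types]

# Usage: a search program that consumes text and images.
search = Program("search", input_types={"text", "image"})
notes = Program("notes", output_types={"text"})
album = Program("album", output_types={"image"})
print([p.name for p in find_second_programs(search, [notes, album])])
# -> ['notes', 'album']
```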
  • the first program may be any application program in the electronic device, or may be any subprogram of any application program.
  • the second program may be a program different from the first program.
  • the first program and the second program may be different application programs.
  • the first program and the second program may be subprograms corresponding to different application programs.
  • the first program and the second program may be different subroutines corresponding to the same application program.
  • the first triggering operation may be used to trigger the electronic device to display the first prompt information.
  • the first trigger operation may include operations such as voice, key operation, touch operation, or gesture operation.
  • the first trigger operation may also be other operation types, and the operation type of the first trigger operation may be determined in advance.
  • the electronic device may receive an operation set by the user or relevant technical personnel as the first trigger operation.
  • the embodiment of the present application does not limit the operation type of the first trigger operation and the method of determining the first trigger operation.
  • the electronic device may receive the first triggering operation based on the first human-computer interaction interface.
  • the first trigger operation may be an operation received by the electronic device while displaying the first human-computer interaction interface; for example, when the electronic device displays the first human-computer interaction interface, it detects that the user presses the "Windows key + Q key".
  • the first triggering operation may be an operation detected by the electronic device from the first human-computer interaction interface, or in other words, an operation acting on the first human-computer interaction interface. For example, the electronic device detects that the mouse trajectory or touch trajectory forms a "?" shape on the displayed first human-computer interaction interface, that is, the user draws a "?" gesture on the first human-computer interaction interface with the mouse or by touch.
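As a rough illustration of recognizing the first trigger operation, the sketch below assumes a hypothetical event model; the "?"-shape classifier is left as a stub, since the patent does not specify a recognition algorithm:

```python
def is_first_trigger(event) -> bool:
    # Variant 1: a hotkey received while the first interface is displayed,
    # e.g. the "Windows key + Q key" example above.
    if event.kind == "hotkey":
        return event.keys == {"win", "q"}
    # Variant 2: a mouse or touch trajectory drawn on the interface that
    # forms a "?" shape.
    if event.kind == "trajectory":
        return classify_shape(event.points) == "?"
    return False

def classify_shape(points) -> str:
    # Placeholder: a real system would run a stroke/gesture classifier
    # (template matching, a unistroke recognizer, etc.) over the points.
    raise NotImplementedError
```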
  • the first prompt information is specifically used to separately prompt the second programs corresponding to each of the first data types.
  • for example, the first data types corresponding to the first program include text and images, and the second programs include program A and program B, which can provide text, and program C and program D, which can provide images; the first prompt information can arrange the icons of program A and program B in one row, and the icons of program C and program D in another row, thereby further intuitively and clearly prompting the user about the related type between each second program and the first program.
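One possible way to build such row-per-data-type prompt information, reusing the illustrative Program dataclass from the earlier sketch:

```python
from collections import defaultdict

def group_by_provided_type(first, second_programs):
    """Group second programs by the first data type they can provide,
    so each group can be rendered as one row of the first prompt
    information."""
    rows = defaultdict(list)
    for prog in second_programs:
        for dtype in prog.output_types & first.input_types:
            rows[dtype].append(prog.name)
    return dict(rows)

# With the example above (A/B provide text, C/D provide images):
# group_by_provided_type(first, [a, b, c, d])
# -> {"text": ["A", "B"], "image": ["C", "D"]}
```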
  • the second human-computer interaction interface further includes second prompt information, and the second prompt information is used to prompt at least one of the data processing capability of the first program and the first data type.
  • the second prompt information prompts the user about the data processing capabilities of the first program, which can intuitively remind the user what data the first program needs to obtain, and what services or experiences can be provided based on these data, further improving the efficiency of human-computer interaction and reducing the user's learning cost. At the same time, it also further avoids the hidden danger of the first program stealing user privacy without the user's knowledge.
  • the combination of the two dimensions of information, the first prompt information and the second prompt information, also enables the user to more directly and clearly determine the data processing capabilities of the first program and the second programs that may be associated with the first program based on those capabilities, thereby displaying the functions and effects of the first program to the user more completely and objectively.
  • the second prompt information may include at least one of graphics, images, and text.
  • displaying the second human-computer interaction interface includes: highlighting the first prompt information and the second prompt information, thereby displaying the first prompt information and the second prompt information more intuitively and improving the prompt effect.
  • the highlighting method may include displaying the first prompt information as the foreground and displaying other content other than the first prompt information as the background.
  • the foreground display and the background display can have different visual characteristics.
  • the foreground is displayed in color and the background is displayed in gray or black and white.
  • the resolution of the foreground display can be greater than the resolution of the background display, that is, the foreground display is clearer than the background display.
  • the highlighting method may include highlighting the first prompt information.
  • the highlighting method may include adding a border or marquee around the first prompt information.
  • the electronic device can also highlight the first prompt information in other ways, and is not limited to the above-mentioned display methods.
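The foreground/background highlighting could be approximated as in the sketch below; Pillow is used purely for illustration, and the rectangular region model is an assumption:

```python
from PIL import Image, ImageOps

def highlight(screenshot: Image.Image, prompt_box: tuple) -> Image.Image:
    """Keep the prompt region in full color and dim everything else to
    grayscale, mirroring the foreground/background display above."""
    # prompt_box = (left, top, right, bottom) of the first prompt information.
    dimmed = ImageOps.grayscale(screenshot).convert("RGB")
    dimmed.paste(screenshot.crop(prompt_box), prompt_box[:2])
    return dimmed
```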
  • the first interactive object includes at least one of a first window and a first control.
  • the first prompt information is specifically used to indicate a second interactive object corresponding to the second program.
  • the second interactive object includes at least one of a second window and a second control.
  • the first prompt information may include at least one of graphics, images, and text.
  • the first prompt information may include the program identification (name and/or icon) of the second program, a thumbnail of the second window, an arrow pointing from the second interactive object to the first interactive object, a connection line between the second interactive object and the first interactive object, and so on.
  • the first prompt information is not limited to the several types of information mentioned above.
  • the method further includes: when receiving a second trigger operation based on the second human-computer interaction interface, displaying a third human-computer interaction interface, the third human-computer interaction interface including a first processing result of the first program performing data processing based on first target data, wherein the first target data is data of the first data type belonging to the second program.
  • in this way, the first target data of the first data type belonging to the second program can be provided to the first program, so that the first program can conveniently and quickly perform data processing on the first target data belonging to the second program and display the first processing result to the user through the third human-computer interaction interface, further reducing the difficulty of human-computer interaction and the user's learning cost.
  • moreover, data is provided to the first program through the second trigger operation, so the user can proactively provide data to the first program after determining at least part of the functions and effects of the first program; that is, there is no need to set the first program's data-acquisition permissions in advance, and the way of providing data is more flexible, which further mitigates the risk that the first program steals the user's privacy without the user's knowledge.
  • the second triggering operation may be used to trigger the electronic device to provide the first target data of the first data type belonging to the second program to the first program.
  • the second trigger operation may include operations such as voice, key operation, touch operation, or gesture operation.
  • the second trigger operation may also be of other operation types, and the operation type of the second trigger operation may be determined in advance; for example, the electronic device may receive an operation set by the user or relevant technical personnel as the second trigger operation.
  • the embodiment of the present application does not limit the operation type of the second trigger operation and the way in which the electronic device determines the second trigger operation.
  • the second triggering operation includes a click operation on the second interactive object; or, the second triggering operation includes a dragging operation of dragging the second interactive object toward the first interactive object; or, the second triggering operation includes a drag operation of dragging the second interactive object to the area where the first interactive object is located; wherein the first interactive object corresponds to the first program, and the second interactive object corresponds to the second program.
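For illustration, the three forms of the second trigger operation could be distinguished with simple hit-testing; the Rect type and the event fields below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom

def is_second_trigger(event, first_rect: Rect, second_rect: Rect) -> bool:
    # Form 1: a click on the second interactive object.
    if event.kind == "click":
        return second_rect.contains(*event.pos)
    if event.kind == "drag" and second_rect.contains(*event.start):
        # Form 2: the drag ends inside the first interactive object's area.
        if first_rect.contains(*event.end):
            return True
        # Form 3: the drag moves toward the first interactive object,
        # i.e. the end point is closer to its center than the start point.
        cx = (first_rect.left + first_rect.right) / 2
        cy = (first_rect.top + first_rect.bottom) / 2
        d_start = (event.start[0] - cx) ** 2 + (event.start[1] - cy) ** 2
        d_end = (event.end[0] - cx) ** 2 + (event.end[1] - cy) ** 2
        return d_end < d_start
    return False
```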
  • the electronic device may determine the at least part of the data selected by the user as the first target data.
  • for example, if the second interactive object includes a second window, and the user has already selected at least part of the data in the second window before the second triggering operation, the at least part of the data can be used as the first target data.
  • if the electronic device needs to interact with the user to determine the first target data, and before the second triggering operation the electronic device has not determined the first target data based on the second program and the second program is not running in the foreground (that is, the second program is not running, or is running in the background), then after receiving the second trigger operation, the electronic device can run the second program in the foreground, display the second window, receive the first determination operation submitted by the user based on the second window, and determine the first target data in the second window based on the first determination operation. It should be noted that the embodiment of the present application does not limit the operation type of the first determination operation.
  • if the electronic device can determine the first target data without interacting with the user, and before the second triggering operation the electronic device has not determined the first target data based on the second program and the second program is not running, then the electronic device can run the second program (in the background or foreground) after receiving the second trigger operation and determine the first target data from the second program. In some embodiments, the electronic device may determine, based on the first data type, whether the first target data needs to be determined through interaction with the user: if the first data type is a preset third data type, the electronic device can determine that interaction with the user is not required to determine the first target data; if the first data type is not the third data type, the electronic device can determine that interaction with the user is required.
  • the third data type may include data types of information such as positioning information, time information, weather information, etc. that can be obtained by the program in the background.
  • the third data type may include a data type of information that is updated less frequently, such as user attribute information.
  • the third data type may also include more or fewer data types, and is not limited to the above-mentioned data types.
  • before the electronic device provides the first target data to the first program, if the first program is not running in the foreground, the electronic device can first run the first program in the foreground and then provide the first target data to the first program. In other embodiments, before the electronic device provides the first target data to the first program, if the first program is not running, the electronic device can first run the first program in the background and then provide the first target data to it.
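The branching just described, a prior selection first, then silent background fetching for third data types, then foreground interaction as a fallback, could be sketched as follows; every helper (run, fetch, ask_user_to_select) is hypothetical and stands in for platform-specific behavior:

```python
# Preset "third data types" that a program can obtain in the background
# without user interaction (per the description above).
BACKGROUND_TYPES = {"location", "time", "weather"}

def get_first_target_data(second_program, first_data_type, selection=None):
    # Case 1: the user already selected data in the second window.
    if selection is not None:
        return selection
    # Case 2: a third data type can be fetched without user interaction,
    # with the second program running in the background.
    if first_data_type in BACKGROUND_TYPES:
        second_program.run(foreground=False)
        return second_program.fetch(first_data_type)
    # Case 3: otherwise run the second program in the foreground, display
    # the second window, and let the user pick the first target data.
    second_program.run(foreground=True)
    return second_program.ask_user_to_select(first_data_type)
```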
  • the method further includes: obtaining the first data type requested by the first program; obtaining second target data matching the first data type; providing the second target data to the first program; obtaining a second processing result fed back by the first program based on the second target data; and determining the data processing capability of the first program based on the second target data and the second processing result.
  • the electronic device may obtain the third target data corresponding to the first data type, and perform desensitization processing on the third target data, thereby obtaining the second target data.
  • desensitization processing may refer to performing data deformation processing on the third target data through preset desensitization rules, so that the desensitized second target data does not carry user characteristics.
  • desensitization processing may include data encryption, data replacement, and the like.
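As one hedged example of desensitization by data replacement, the regular-expression rules below are invented placeholders, not the patent's preset rule set:

```python
import re

# Illustrative desensitization rules: replace values that could carry
# user characteristics with neutral placeholders.
DESENSITIZE_RULES = [
    (re.compile(r"\b\d{11}\b"), "<phone>"),      # 11-digit phone numbers
    (re.compile(r"[\w.]+@[\w.]+"), "<email>"),   # email addresses
]

def desensitize(text: str) -> str:
    for pattern, placeholder in DESENSITIZE_RULES:
        text = pattern.sub(placeholder, text)
    return text

print(desensitize("contact: 13800000000, mail: user@example.com"))
# -> contact: <phone>, mail: <email>
```

The desensitized second target data can then be fed to the first program, and the second processing result it returns can be used to infer the program's data processing capability.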
  • both the first program and the second program are programs running in the foreground; that is, the electronic device may prompt only those second programs, among the programs associated with the first program, of which the user can see at least part of the running process.
  • at least one of the first program and the second program may not be a program running in the foreground.
  • embodiments of the present application provide a human-computer interaction device, which has the function of implementing the method of any one of the above first aspects.
  • these functions can be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the above functions, for example, a transceiver module or unit, a processing module or unit, an obtaining module or unit, etc.
  • embodiments of the present application provide an electronic device, including: a memory and a processor.
  • the memory is used to store a computer program; the processor is used to, when calling the computer program, cause the electronic device to execute the method of any one of the above first aspects.
  • embodiments of the present application provide a chip system.
  • the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in any one of the above first aspects.
  • the chip system may be a single chip or a chip module composed of multiple chips.
  • embodiments of the present application provide a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, the method described in any one of the above first aspects is implemented.
  • embodiments of the present application provide a computer program product, which when the computer program product is run on an electronic device, causes the electronic device to execute any of the methods described in the first aspect.
  • Figure 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of a human-computer interaction interface provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of another human-computer interaction interface provided by an embodiment of the present application.
  • Figure 4 is a schematic flowchart of a human-computer interaction method provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of a fourth human-computer interaction interface provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a first human-computer interaction interface provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of a second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 8 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 11 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 12 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 14 is a schematic diagram of a third human-computer interaction interface provided by an embodiment of the present application.
  • Figure 15 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of another third human-computer interaction interface provided by an embodiment of the present application.
  • Figure 17 is a schematic diagram of another first human-computer interaction interface provided by an embodiment of the present application.
  • Figure 18 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 19 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 20 is a schematic diagram of another third human-computer interaction interface provided by an embodiment of the present application.
  • Figure 21 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 22 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 23 is a schematic diagram of another third human-computer interaction interface provided by an embodiment of the present application.
  • Figure 24 is a schematic diagram of another second human-computer interaction interface provided by an embodiment of the present application.
  • Figure 25 is a schematic diagram of another third human-computer interaction interface provided by an embodiment of the present application.
  • Figure 26 is a schematic flowchart of a method for obtaining data processing capabilities provided by an embodiment of the present application.
  • Figure 27 is a schematic flowchart of another human-computer interaction method provided by an embodiment of the present application.
  • the human-computer interaction method provided by the embodiments of this application can be applied to electronic devices such as mobile phones, tablet computers, wearable devices, vehicle-mounted devices, augmented reality (AR)/virtual reality (VR) devices, notebook computers, ultra-mobile personal computers (UMPCs), netbooks, and personal digital assistants (PDAs); the embodiments of this application do not place any restrictions on the specific types of electronic devices.
  • FIG. 1 is a schematic structural diagram of an electronic device 100 provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110, a memory 120, a communication module 130, a display screen 140, and the like.
  • the processor 110 may include one or more processing units, and the memory 120 is used to store program codes and data.
  • the processor 110 can execute computer execution instructions stored in the memory 120 to control and manage the actions of the electronic device 100 .
  • the communication module 130 may be used for communication between various internal modules of the electronic device 100, or communication between the electronic device 100 and other external electronic devices, etc.
  • the communication module 130 may include an interface, etc., such as a USB interface.
  • the USB interface may be an interface that complies with the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc.
  • the USB interface can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the communication module 130 may include an audio device, a radio frequency circuit, a Bluetooth chip, a wireless fidelity (Wi-Fi) chip, a near-field communication (NFC) module, etc., and may realize interaction between the electronic device 100 and other electronic devices in a variety of different manners.
  • the display screen 140 can display images or videos in the human-computer interaction interface.
  • the electronic device 100 may also include peripheral devices 150, such as a mouse, a keyboard, a speaker, a microphone, and the like.
  • the embodiment of the present application does not specifically limit the structure of the electronic device 100 .
  • the electronic device 100 may also include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the electronic device may include one or more application programs.
  • An application is a collection of executable files and data that have specific functions and can perform operations.
  • An application can include several executable programs, dynamic link libraries, data files and other data.
  • the carrier of an application is an executable file.
  • the executable file can include the necessary information to run the application, such as data and configuration information.
  • executable files may include files with extensions such as exe, app, apk, bat, sys, dll, etc. Of course, in actual applications, the extensions of executable files are not limited to the above.
  • a process can be an instance of a running application.
  • a process can include an address space and a collection of codes, data, objects, and other environments and resources required for program running.
  • one process can correspond to one application, and one application can correspond to multiple processes.
  • one process may correspond to at least one thread, and one thread may correspond to one process.
  • the processes and threads of the application can be understood as subroutines of the application. And it can be understood that in actual applications, applications may include subroutines at a higher level or at a lower level than processes and threads.
  • the electronic device may display a human-computer interaction interface on the display screen, and the human-computer interaction interface may include one or more interactive objects.
  • the interactive object can be an object that can respond to user operations, which can include various operations such as clicking and dragging with the mouse or finger.
  • one interactive object may correspond to one application program or subprogram, and one application program or subprogram may correspond to more than one interactive object.
  • An application or subprogram can obtain data through its corresponding interactive object and perform specific processing operations based on the data. It can also display the processing results to the user through the interactive object.
  • the human-computer interaction interface may be 2D or 3D.
  • the human-computer interaction interface may be 2D; when the electronic device is an AR device or a VR device, the human-computer interaction interface may be 3D.
  • the interactive object may be 2D or 3D.
  • for example, when the human-computer interaction interface is a 2D interface, the interactive objects in the human-computer interaction interface may be 2D; similarly, when the human-computer interaction interface is a 3D interface, the interactive objects in the human-computer interaction interface may be 3D.
  • the interactive object may include at least one of a window and a control.
  • a window can be created when an application is started.
  • the window can include various visual elements such as title bars, menus, and borders.
  • the window can also be called the main window.
  • the application program can also create more windows, such as dialog boxes, which in turn can include one or more controls.
  • controls may include, but are not limited to, icons, input boxes, and the like.
  • a human-computer interaction interface can be shown in Figure 2.
  • the human-computer interaction interface is a 2D human-computer interaction interface in a computer.
  • the middle part of the human-computer interaction interface includes a browser window 201 corresponding to the browser program, and the lower part includes a shortcut launch bar 202.
  • the shortcut launch bar includes icons for five application programs; from left to right, they are the browser program, the photo album program, the sports and health program, the communication program, and the document editing program.
  • the browser window 201 also includes a plurality of controls, such as a URL input box included in the upper part of the browser window 201, a search input box included in the lower part of the browser window 201, and six buttons including Web Page 1 to Web Page 6.
  • a human-computer interaction interface can be as shown in Figure 3.
  • the human-computer interaction interface can be a 3D human-computer interaction interface in a VR device.
  • the human-computer interaction interface includes a virtual shopping scene, including a virtual TV 301, a virtual doll 302, and a virtual projector 303; the virtual TV 301, the virtual doll 302, and the virtual projector 303 are all controls, corresponding to products such as televisions, doll toys, and projectors, respectively.
  • in some embodiments, the human-computer interaction interface can also be a human-computer interaction interface in an AR device, in which the virtual TV 301, virtual doll 302, and virtual projector 303 can likewise be controls; or, in other embodiments, the virtual TV 301, virtual doll 302, and virtual projector 303 can be real objects captured by the camera of the AR device, and each real object can be associated with a preset control (for example, the AR device can display a transparent control at the same position as the real object), such that the user's interaction with the real object is equivalent to interaction with the control associated with that real object.
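A minimal sketch of associating a transparent control with each detected real object, assuming a hypothetical AR scene API:

```python
def attach_controls(detected_objects, scene):
    """Overlay an invisible control on each real object captured by the
    AR camera, so that interacting with the object is equivalent to
    interacting with its associated control."""
    for obj in detected_objects:               # e.g. {"label": "tv", "bbox": (x0, y0, x1, y1)}
        control = scene.add_control(bounds=obj["bbox"], opacity=0.0)
        control.product_id = obj["label"]      # links the control to a product
```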
  • Figure 4 is a flow chart of a human-computer interaction method provided by an embodiment of the present application. It should be noted that this method is not limited to the specific sequence shown in Figure 4 and described below. It should be understood that in other embodiments, the order of some steps of the method can be exchanged according to actual needs, or some of the steps can be omitted or deleted.
  • the method includes the following steps:
  • the electronic device displays a first human-computer interaction interface.
  • the first human-computer interaction interface includes a first interactive object, and the first interactive object corresponds to the first program.
  • when the electronic device is powered on, it can run the first program and display the first human-computer interaction interface including the first interactive object, so as to facilitate human-computer interaction with the user through the first interactive object.
  • the first program may be any application program in the electronic device, or may be any subprogram of any application program.
  • the first interactive object may include at least one of a first window and a first control.
  • the first control may include an icon for the first program.
  • the first program may be a program running in the foreground.
  • the first program running in the foreground may mean that the user can see at least part of the running process of the first program in the first human-computer interaction interface, for example, the user can see the first window corresponding to the first program; conversely, the first program running in the background may mean that the user cannot see any running process of the first program in the first human-computer interaction interface. That is to say, the electronic device can provide interactive prompts only for programs of which the user can see at least part of the running process.
  • the first program may not be a program running in the foreground.
  • the first interaction object may be an interaction object determined by the electronic device in the first human-computer interaction interface based on the user's second determination operation.
  • the second determination operation is used to determine at least one first interactive object in the first human-computer interaction interface. It should be noted that the embodiment of the present application does not limit the operation type of the second determination operation.
  • when receiving the first trigger operation, the electronic device displays a second human-computer interaction interface.
  • the second human-computer interaction interface includes first prompt information corresponding to the first interactive object.
  • the first prompt information is used to prompt the second program related to the first program.
  • after the electronic device displays the first human-computer interaction interface including the first interactive object, the user may not understand how to specifically interact with the first interactive object. Even if the developer of the first program provides corresponding operation instructions, the user can master the method of interacting with the first interactive object only by spending a huge learning cost and gradually exploring and practicing. Therefore, when receiving the first trigger operation, the electronic device can display the second human-computer interaction interface including the first prompt information corresponding to the first interactive object, intuitively prompting the user through the first prompt information that there is a second program related to the first program in the electronic device, so that the user can simply and clearly determine at least part of the functions and effects of the first program, which reduces the difficulty and learning cost of human-computer interaction and also reduces the security risk that the first program steals user privacy without the user's knowledge.
  • the second program may be a different program from the first program.
  • the first program and the second program may be different application programs.
  • the first program and the second program may be subprograms corresponding to different application programs.
  • the first program and the second program may be different subroutines corresponding to the same application program.
  • the second program is related to the first program, which can mean that the data processing capabilities of the second program and the data processing capabilities of the first program are related to each other.
  • the second program is related to the first program, which may include that the second program can provide data of a first data type to the first program, and the first data type is a data type that the first program relies on for data processing.
  • the first triggering operation may be used to trigger the electronic device to display the first prompt information.
  • the first trigger operation may include operations such as voice, key operation, touch operation, or gesture operation.
  • the first trigger operation may also be of other operation types, and the operation type of the first trigger operation may be determined in advance; for example, the electronic device may receive an operation set by the user or relevant technical personnel as the first trigger operation.
  • the embodiment of the present application does not limit the operation type of the first trigger operation and the method of determining the first trigger operation.
  • the electronic device may receive the first triggering operation based on the first human-computer interaction interface.
  • the first trigger operation may be an operation received by the electronic device while displaying the first human-computer interaction interface; for example, when the electronic device displays the first human-computer interaction interface, it detects that the user presses the "Windows key + Q key".
  • the first triggering operation may be an operation detected by the electronic device from the first human-computer interaction interface, or in other words, an operation acting on the first human-computer interaction interface. For example, the electronic device detects that the mouse trajectory or touch trajectory forms a "?" shape on the displayed first human-computer interaction interface, that is, the user draws a "?" gesture on the first human-computer interaction interface with the mouse or by touch.
  • the electronic device may display a fourth human-computer interaction interface and receive the first trigger operation set by the user through the fourth human-computer interaction interface, where the fourth human-computer interaction interface may be used to receive configuration information submitted by the user that is related to the human-computer interaction method provided by the embodiments of this application.
  • the electronic device can display a fourth human-computer interaction interface as shown in Figure 5.
  • the first configuration item 501 in the fourth human-computer interaction interface is the configuration item corresponding to the first trigger operation.
  • the first configuration item 501 includes a sliding switch.
  • the switching state of the sliding switch can be switched between "on” and “off”.
  • the current state of the sliding switch is "on", which means that the electronic device can receive the first trigger operation and assist the user in human-computer interaction according to the human-computer interaction method provided in the embodiments of this application.
  • the first configuration item 501 also includes the text prompt information "first trigger operation" to prompt the user to configure the first trigger operation in the first configuration item 501.
  • the first configuration item 501 also includes a current operation mode display box and a custom button. Among them, the current operation mode display box displays "windows key + Q key", which means that the current first triggering operation is to press the Windows key and the Q key at the same time.
  • the operation mode displayed in the current operation mode display box may be configured in advance by relevant technical personnel or configured previously by the user.
  • when the electronic device receives a new operation mode submitted by the user based on the custom button, the operation mode in the current operation mode display box can be updated to the newly submitted operation mode. It should be noted that this embodiment of the present application only uses Figure 5 to illustrate the fourth human-computer interaction interface, but does not limit the fourth human-computer interaction interface.
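The settings collected through the fourth interface could be modeled as a small configuration object; the defaults mirror Figure 5 and the field names are illustrative only:

```python
from dataclasses import dataclass, field

@dataclass
class InteractionConfig:
    enabled: bool = True                      # the sliding switch in item 501
    first_trigger: str = "win+q"              # first trigger operation (item 501)
    prompt_content: set = field(              # prompt content (item 502)
        default_factory=lambda: {"first_prompt_info", "second_prompt_info"})
    second_trigger: str = "drag_second_window_data_to_first_window"  # item 503
```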
  • the second program may be a program running in the foreground; that is, the electronic device may prompt only those second programs, among the programs associated with the first program, of which the user can see at least part of the running process.
  • the second program may not be a program running in the foreground.
  • the first prompt information may be used to prompt the second interactive object corresponding to the second program.
  • the second interactive object may include at least one of a second window and a second control corresponding to the second program.
  • the first prompt information may include at least one of graphics, images, and text.
  • the first prompt information may include the program identification (name and/or icon) of the second program, a thumbnail of the second window, an arrow pointing from the second interactive object to the first interactive object, a connection line between the second interactive object and the first interactive object, and so on.
  • the first prompt information is not limited to the several types of information mentioned above.
  • the electronic device can classify the multiple second programs according to a preset classification method, and the first prompt information can respectively indicate each type of second program based on the classification results.
  • different second programs may have different related types to the first program, and the first prompt information may respectively indicate various related types of second programs.
  • the first program corresponds to multiple first data types, and the first prompt information may respectively indicate the second program that can provide data of each first data type.
  • for example, the first data types corresponding to the first program include text and images, and the second programs include program A and program B, which can provide text, and program C and program D, which can provide images; the first prompt information can arrange the icons of program A and program B in one row, and the icons of program C and program D in another row, thereby further intuitively and clearly prompting the user about the related type between each second program and the first program.
  • the electronic device may determine the second program related to the first program from the program set included in the electronic device based on the first program at any time before displaying the second human-computer interaction interface. In some embodiments, the electronic device can determine the first data type that the first program relies on for data processing and the data types that can be provided by each third program included in the program set; if the data type that any third program can provide is the same as the first data type, the electronic device can determine that this third program is a second program related to the first program.
  • the program set may include one or more programs installed on the electronic device, or may include one or more programs installed and running on the electronic device, or may include one or more programs installed on the electronic device and running in the foreground. Of course, in actual applications, the programs included in the program set are not limited to the above-mentioned programs.
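The three candidate scopes for the program set could be expressed as below; the system-query helpers are hypothetical:

```python
def candidate_programs(scope: str, system) -> list:
    """Assemble the program set from which second programs are selected."""
    if scope == "installed":
        return system.installed_programs()
    if scope == "running":
        return system.running_programs()
    if scope == "foreground":
        return [p for p in system.running_programs() if p.is_foreground]
    raise ValueError(f"unknown scope: {scope}")
```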
  • the electronic device may also determine the second program related to the first program through other methods.
  • the electronic device can generate the first prompt information and determine the second human-computer interaction interface based on the first prompt information, for example, by adding the first prompt information to the first human-computer interaction interface, thereby obtaining the second human-computer interaction interface.
  • the electronic device can add the first prompt information to the first human-computer interaction interface based on a preset first template, where the first template can be used to determine the position of the first prompt information in the first human-computer interaction interface, the display style of the first prompt information, or the display style of the second human-computer interaction interface. Of course, the electronic device can also determine the second human-computer interaction interface through other methods; the embodiments of the present application do not limit the manner in which the electronic device determines the second human-computer interaction interface.
  • the electronic device can highlight the first prompt information, thereby displaying the first prompt information more intuitively and improving the prompt effect.
  • the highlighting method may include displaying the first prompt information as the foreground and displaying other content other than the first prompt information as the background.
  • the foreground display and the background display can have different visual characteristics. For example, the foreground is displayed in color and the background is displayed in gray or black and white.
  • the resolution of the foreground display can be greater than the resolution of the background display, that is, the foreground display is clearer than the background display.
  • the highlighting method may include highlighting the first prompt information.
  • the highlighting method may include adding a border or marquee around the first prompt information.
  • the electronic device can also highlight the first prompt information in other ways, and is not limited to the above-mentioned display modes.
  • the second human-computer interaction interface may also include second prompt information, and the second prompt information is used to prompt the data processing capability of the first program.
  • the second prompt information may be used to prompt the first data type on which the data processing capability of the first program depends, that is, the input of the first program.
  • the second prompt information may be used to indicate a second data type of data that the first program can provide.
  • the second prompt information prompts the user about the data processing capabilities of the first program, which can intuitively remind the user what data the first program needs to obtain, and what services or experiences can be provided based on these data, further improving the efficiency of human-computer interaction and reducing the user's learning cost. At the same time, it also further avoids the hidden danger of the first program stealing user privacy without the user's knowledge.
  • the combination of the two dimensions of information, the first prompt information and the second prompt information, also enables the user to more directly and clearly determine the data processing capabilities of the first program and the second programs that may be associated with the first program based on those capabilities, thereby displaying the functions and effects of the first program to the user more completely and objectively.
  • the electronic device can determine the data processing capability of the first program in advance, so that it can promptly respond to the first trigger operation and display the second prompt information on the second human-computer interaction interface.
  • the operating system of the electronic device may send a second acquisition request to the first program, and the first program responds to the second acquisition request by feeding back the data processing capability of the first program to the operating system.
  • the electronic device may also refer to the method shown in Figure 26 below to determine the data processing capability of the first program.
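The "second acquisition request" exchange might look like the sketch below; the message shapes and the channel API are assumptions for illustration:

```python
def query_capabilities(os_channel, first_program_id: str) -> dict:
    """The operating system asks the first program to declare its data
    processing capabilities, and the program feeds them back."""
    os_channel.send(first_program_id, {"type": "capability_request"})
    reply = os_channel.receive(first_program_id)
    # e.g. {"inputs": ["image", "text"],
    #       "capabilities": ["search for similar images based on image recognition",
    #                        "conduct extended search based on semantic analysis"]}
    return reply
```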
  • the second prompt information may include at least one of graphics, images, and text.
  • for example, the first prompt information and the second prompt information are added to the first human-computer interaction interface, thereby obtaining the second human-computer interaction interface.
  • the electronic device can add the second prompt information to the first human-computer interaction interface based on a preset second template, where the second template can be used to determine the position of the second prompt information in the first human-computer interaction interface or the display style of the second prompt information.
  • the first template and the second template may be the same template.
  • the first program may have multiple data processing capabilities, and the second prompt information may respectively indicate each of the data processing capabilities of the first program.
  • the data processing capabilities of the first program include "search for similar images based on image recognition” and “conduct extended search based on semantic analysis”, and the second prompt information can indicate these two data processing capabilities through text respectively.
  • the electronic device can highlight the second prompt information, thereby displaying the second prompt information more intuitively and improving the prompt effect. It should be noted that the way in which the electronic device highlights the second prompt information may be the same as or similar to the way in which the first prompt information is highlighted.
  • the electronic device can display a fourth human-computer interaction interface, and receive prompt content set by the user through the fourth human-computer interaction interface, that is, whether to display the first prompt information and the second prompt information.
  • the fourth human-computer interaction interface shown in Figure 5 also includes a second configuration item 502, which includes the text prompt information "prompt content", used to prompt the user to configure, in the second configuration item 502, the prompt content displayed in response to the first trigger operation.
  • the second configuration item 502 also includes a prompt content display box and a custom button.
  • the prompt content display box displays "first prompt information and second prompt information", which means that the current prompt content includes the first prompt information and the second prompt information.
  • the prompt content displayed in the prompt content display box may be configured in advance by relevant technical personnel or configured previously by the user.
  • when the electronic device receives new prompt content submitted by the user based on the custom button, the prompt content in the prompt content display box can be updated to the new prompt content.
  • the third human-computer interaction interface includes a first processing result of the first program performing data processing based on the first target data.
  • the first target data is data of the first data type belonging to the second program.
  • in this way, the first target data of the first data type belonging to the second program can be provided to the first program, so that the first program can conveniently and quickly perform data processing on the first target data belonging to the second program and display the first processing result to the user through the third human-computer interaction interface, further reducing the difficulty of human-computer interaction and the user's learning cost.
  • moreover, data is provided to the first program through the second trigger operation, so the user can proactively provide data to the first program after determining at least part of the functions and effects of the first program; that is, there is no need to set the first program's data-acquisition permissions in advance, and the way of providing data is more flexible, which further mitigates the risk that the first program steals the user's privacy without the user's knowledge.
  • the second triggering operation may be used to trigger the electronic device to provide the first target data of the first data type belonging to the second program to the first program.
  • The second trigger operation may include operations such as voice, key operation, touch operation or gesture operation. In actual applications, the second trigger operation may also be of other operation types, and the operation type of the second trigger operation may be determined in advance by the electronic device; for example, the electronic device may receive an operation set by the user or by relevant technical personnel as the second trigger operation. The embodiment of the present application does not limit the operation type of the second trigger operation or the way in which the electronic device determines it.
  • the electronic device can display a fourth human-computer interaction interface, and receive the second trigger operation set by the user through the fourth human-computer interaction interface.
  • the fourth human-computer interaction interface shown in FIG. 5 also includes a third configuration item 503 for configuring the second trigger operation.
  • the third configuration item 503 includes the text information "second trigger operation" to prompt the user to configure the second trigger operation in the third configuration item 503.
  • Below that text, the third configuration item 503 also includes a current operation mode display box and a custom button. The current operation mode display box shows "Drag the first target data in the second window to the first window", which means that when the electronic device detects that the user drags the first target data in the second window to the first window, the first target data is provided to the first program.
  • The operation mode displayed in the current operation mode display box may be configured in advance by relevant technical personnel or previously configured by the user. When the electronic device receives a new operation mode submitted by the user through the custom button, the operation mode in the current operation mode display box can be updated to the newly submitted one.
  • In some embodiments, the second trigger operation may include a click operation on the second interactive object; in some embodiments, a drag operation of dragging the second interactive object toward the first interactive object; and in some embodiments, a drag operation of dragging the second interactive object from the area where it is located to the area where the first interactive object is located.
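As an illustration of the operation variants just listed, here is a minimal, hypothetical sketch of how an input event might be classified as the second trigger operation; the event fields and names are assumptions for illustration, not part of the patent:

```python
# Hypothetical event classifier for the second trigger operation.
def is_second_trigger(event: dict, first_object: str, second_object: str) -> bool:
    if event.get("type") == "click" and event.get("target") == second_object:
        return True  # click on the second interactive object
    if event.get("type") == "drag" and event.get("source") == second_object:
        # Dragging toward the first interactive object, or dropping inside
        # the area where the first interactive object is located.
        return event.get("toward") == first_object or event.get("drop_area") == first_object
    return False

# Example: dragging the second window onto the first window counts.
print(is_second_trigger({"type": "drag", "source": "second_window",
                         "drop_area": "first_window"},
                        "first_window", "second_window"))  # True
```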
  • In some embodiments, the second interactive object includes a second window. The second triggering operation may include a drag operation of dragging at least part of the data in the second window toward the first interactive object, or a drag operation of dragging at least part of the data in the second window to the area where the first interactive object is located; the data dragged by the second trigger operation is the first target data provided to the first program.
  • In some embodiments, if the user has already selected at least part of the data before the second trigger operation, the electronic device may determine that selected data as the first target data. For example, if the second interactive object includes a second window and, before the second triggering operation, the user has selected at least part of the data in the second window, then that data can be used as the first target data.
  • In some embodiments, if the electronic device needs to interact with the user to determine the first target data, and before the second triggering operation the electronic device has not determined the first target data based on the second program and the second program is not running in the foreground (that is, the second program is not running or is running in the background), then after receiving the second trigger operation the electronic device can run the second program in the foreground, display the second window, receive a first determination operation submitted by the user based on the second window, and determine the first target data in the second window based on that operation, where the first determination operation is used to determine the first target data in the second window. It should be noted that the embodiment of the present application does not limit the operation type of the first determination operation.
  • In some embodiments, if the electronic device can determine the first target data without interacting with the user, and before the second triggering operation the electronic device has not determined the first target data based on the second program and the second program is not running, then after receiving the second trigger operation the electronic device can run the second program (in the background or foreground) and determine the first target data from the second program. In some embodiments, the electronic device may judge, based on the first data type, whether interaction with the user is needed to determine the first target data: if the first data type is a preset third data type, the electronic device can determine the first target data without interacting with the user; if it is not the third data type, interaction with the user is required (see the sketch after the data type examples below).
  • the third data type may include data types of information such as positioning information, time information, weather information, etc. that can be obtained by the program in the background.
  • the third data type may include a data type of information that is updated less frequently, such as user attribute information.
  • the third data type may also include more or fewer data types, and is not limited to the above-mentioned data types.
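The decision flow described in the bullets above (use a prior selection if one exists, fetch third-data-type information in the background, otherwise interact with the user via the second window) can be summarized in a short sketch. All helper names below are hypothetical:

```python
# Data types assumed to be obtainable in the background (the "third data type").
THIRD_DATA_TYPES = {"positioning", "time", "weather", "user_attributes"}

def determine_first_target_data(first_data_type, second_program, prior_selection=None):
    # 1) Data already selected in the second window before the second trigger operation.
    if prior_selection is not None:
        return prior_selection
    # 2) Third data type: no user interaction is needed.
    if first_data_type in THIRD_DATA_TYPES:
        second_program.run(foreground=False)          # background is sufficient
        return second_program.latest(first_data_type)
    # 3) Otherwise interact with the user via the second window.
    second_program.run(foreground=True)               # display the second window
    return second_program.await_first_determination() # first determination operation
```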
  • In some embodiments, before the electronic device provides the first target data to the first program, if the first program is not running in the foreground, the electronic device can first run the first program in the foreground and then provide the first target data to it. In other embodiments, if the first program is not running, the electronic device can first run the first program in the background and then provide the first target data to it.
  • the electronic device may provide the first target data to the first program through the first interface of the first program. In some embodiments, the electronic device can obtain the first processing result of the first program through the second interface of the first program. It should be noted that the first interface and the second interface may be interfaces set in advance by the technical developer of the first program.
  • the first processing result may include at least one of an image and a data stream.
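Since the first and second interfaces are only characterized as entry points pre-set by the first program's developer for supplying data and collecting results, a plausible shape for them, under assumed names, is:

```python
from typing import Protocol

class FirstProgramInterfaces(Protocol):
    # Hypothetical shape of the developer-provided entry points.
    def first_interface(self, first_target_data: bytes) -> None: ...  # OS supplies data
    def second_interface(self) -> bytes: ...                          # OS collects the result

def run_data_processing(program: FirstProgramInterfaces, first_target_data: bytes) -> bytes:
    program.first_interface(first_target_data)  # provide the first target data
    return program.second_interface()           # first processing result (image/stream bytes)
```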
  • In some embodiments, the electronic device can display at least part of the functions and effects of the first program to the user through the first prompt information and the second prompt information without providing the first target data to the first program. In that case the first processing result will not be obtained, or even if the electronic device obtains the first processing result, it may not be displayed to the user; therefore, S403 may be omitted.
  • In summary, the electronic device can display a first human-computer interaction interface that includes a first interactive object corresponding to the first program, and, when a first trigger operation is received based on the first human-computer interaction interface, display a second human-computer interaction interface that includes first prompt information corresponding to the first interaction object. The first prompt information intuitively prompts the user that a second program related to the first program exists in the electronic device, allowing the user to determine at least part of the functions and effects of the first program simply and clearly, reducing the difficulty of human-computer interaction and the learning cost, and also reducing the hidden danger of the first program stealing the user's privacy without the user's knowledge.
  • the human-computer interaction method provided by the embodiment of the present application will be described below with reference to Figures 6-14.
  • the first program and the second program both include programs running in the foreground.
  • Please refer in sequence to Figure 6, any one of Figures 7-9, any one of Figures 10-13, and Figure 14, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of the present application.
  • the electronic device can display a first human-computer interaction interface as shown in Figure 6.
  • the bottom of the first human-computer interaction interface includes a shortcut launch bar 202.
  • The shortcut launch bar 202 includes, from left to right, icons for applications such as a browser program, a photo album program, a sports and health program, a communication program, and a document editing program. Among them, the sports and health program is not currently running in the foreground, so the first human-computer interaction interface does not display any running process of the sports and health program; the other applications are currently running in the foreground.
  • Accordingly, the first human-computer interaction interface also includes the browser window 201, the photo album window 203, the document editing window 204 and the communication window 205, which display at least part of the running process of the corresponding application programs to the user.
  • the first window may include one or more of a browser window 201, a photo album window 203, a document editing window 204, and a communication window 205.
  • The second human-computer interaction interface may be as shown in Figure 7, in which the first prompt information includes icons 700 of the second programs, displayed on the right edge of each first window for the second programs related to that window's program.
  • For example, the right edge of the browser window 201 displays the icon of the photo album program and the icon of the document editing program, and these two icons are displayed in two lines, thereby prompting the user that the type of correlation between the photo album program and the browser program is not the same as that between the document editing program and the browser program.
  • The right edge of the photo album window 203 displays the icon of the browser program, the icon of the document editing program and the icon of the communication program, and these three icons are displayed on the same line, thereby prompting the user that the browser program, the document editing program and the communication program are related to the photo album program in the same way.
  • the right edge of the document editing window 204 displays the icons of the browser program and the photo album program
  • the right edge of the communication window 205 displays the icons of the browser program, the photo album program, and the document editing program.
  • The second human-computer interaction interface may be as shown in Figure 8, in which, on the basis of Figure 7, the right side of each window also includes second prompt information 800 in the form of text. The second prompt information 800 is in the format "first data type - data processing capability", thereby prompting the user about the data processing that the first program can perform based on data of the first data type.
  • For example, in the browser window 201, "Image - search for similar images based on image recognition" is displayed above the icon of the photo album program, and "Text - perform an expanded search based on semantic analysis" is displayed above the icon of the document editing program; that is, the browser program can perform image recognition on acquired images to search for similar images, and can also perform semantic analysis and expanded search based on acquired text.
  • "Image - Get window screenshot to save” is displayed above the icons of the browser program, communication program and document editing program, that is, the photo album program can obtain the image and save it.
  • "Data - Copy data to document” is also displayed above the icons of the browser program and the photo album program, that is, the photo document editing program can obtain the data and copy and paste it into the current document file.
  • the communication window 205 "Data - Forward Data to Contacts” is also displayed above the icons of the browser program, photo album program and document editing program, that is, the communication program can obtain and forward data to the contact.
  • the second human-computer interaction interface may be as shown in Figure 9.
  • In Figure 9, the first prompt information (the icons 700 of the second programs) and the second prompt information 800 are displayed as the foreground and can be seen clearly, while other content is displayed as the background and is relatively blurry; the first prompt information and the second prompt information 800 are thereby highlighted and displayed more prominently.
  • When the electronic device receives the second trigger operation based on the second human-computer interaction interface shown in any of Figures 7-9, it can provide the first target data of the first data type belonging to the second program to the first program, and display a third human-computer interaction interface including the first processing result.
  • Taking Figure 10 as an example, the second triggering operation may be a first drag operation 1000 of dragging the image 1010 in the photo album window 203 (i.e., the second window) toward, or to the area where, the document editing window 204 (i.e., the first window) is located; the image 1010 dragged by the second trigger operation is the first target data.
  • Accordingly, the electronic device can insert the image 1010 into the document file being edited in the document editing window 204 through the document editing program, and display the third human-computer interaction interface shown in Figure 14, in which the dragged image 1010 has been inserted into the document file (i.e., the first processing result).
  • As shown in Figure 11, the second trigger operation may be a second drag operation 1100 of dragging the photo album window 203 toward, or to the area where, the document editing window 204 is located.
  • As shown in Figure 12, the second triggering operation may be a third drag operation 1200 of dragging the icon of the photo album program on the right edge of the document editing window 204 (i.e., the second control) toward, or to the area where, the document editing window 204 is located; or, as shown in Figure 13, a fourth drag operation 1300 of dragging the icon of the photo album program in the quick launch bar 202 (i.e., the second control) toward, or to the area where, the document editing window 204 is located.
  • If the user has selected the image 1010 in the photo album window 203 before the second trigger operation, the electronic device can directly insert the image 1010 into the document being edited in the document editing window 204 through the document editing program and display the third human-computer interaction interface shown in Figure 14. If the user has not selected any image in the photo album window 203 before the second trigger operation, the electronic device can receive the image selected by the user in the photo album window 203 after the second trigger operation, and then insert that image into the document being edited in the document editing window 204 through the document editing program.
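A hedged sketch of the drag-and-drop handling described in Figures 10-14 follows; the objects and method names are assumptions for illustration only:

```python
# Hypothetical drop handler for the flows of Figures 10-14.
def on_drop(drop_event, document_editing_program):
    image = drop_event.payload  # dragged image 1010 = first target data
    if image is None:           # nothing was selected before the second trigger operation
        image = drop_event.source_window.await_user_selection()
    document_editing_program.insert_into_open_document(image)  # first processing result
```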
  • In the example below, the first program includes a program running in the foreground, and the second program includes a program that is not running in the foreground; in other embodiments, the second program may include only programs that are not running in the foreground.
  • Please refer in sequence to FIG. 6, FIG. 15 and FIG. 16, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of the present application.
  • the electronic device may display the first human-computer interaction interface as shown in FIG. 6 .
  • The electronic device displays the second human-computer interaction interface shown in Figure 15.
  • In Figure 15, the icons 700 of the second programs displayed on the right side of the communication window 205 also include the icon 710 of the sports and health program; since the sports and health program is not currently running in the foreground, the first human-computer interaction interface and the second human-computer interaction interface do not include a window corresponding to it.
  • When the electronic device detects that the user drags the icon 710 of the sports and health program on the right edge of the communication window 205 (i.e., the second control) toward or to the communication window 205 (i.e., the second trigger operation), the electronic device can run the sports and health program, obtain the user's health data (i.e., the first target data) from it, and provide the health data to the communication program.
  • The communication program sends the health data to friend A, with whom the user is currently communicating, and the third human-computer interaction interface shown in Figure 16 is displayed, in which a new message record 1600 about the health data (i.e., the first processing result) has been added to the communication window 205.
  • In some embodiments, the electronic device may determine the first target data based on the sports and health program without user interaction; for example, the electronic device can use the latest health data generated by the sports and health program as the first target data, in which case it does not need to run the sports and health program in the foreground. In other embodiments, the electronic device can interact with the user based on the sports and health program to determine the first target data; in that case the electronic device can run the sports and health program in the foreground, determine the health data selected by the user in the sports and health program as the first target data, and then provide the first target data to the communication program.
  • The human-computer interaction method provided by the embodiment of the present application is described below with reference to Figures 17-20, taking the case where the first program includes a program that is not running in the foreground and the second program includes a program that is running in the foreground as an example.
  • Please refer in sequence to FIG. 17, FIG. 18, FIG. 19 and FIG. 20, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of the present application.
  • the electronic device displays the first human-computer interaction interface as shown in Figure 17.
  • the bottom of the first human-computer interaction interface includes a shortcut launch bar 202.
  • The shortcut launch bar 202 includes, from left to right, icons for the browser program, the photo album program, the sports and health program, the communication program, and the document editing program; each of these icons can serve as a first control. Among them, the photo album program is currently running in the foreground and the other applications are not, so the first human-computer interaction interface only includes the photo album window 203.
  • the electronic device displays the second human-computer interaction interface as shown in Figure 18.
  • The first prompt information may include a first dotted line 1800. Since only the photo album program is currently running in the foreground, the first dotted line 1800 points from the photo album window 203 to the icon 210 of the browser program, the icon 220 of the communication program and the icon 230 of the document editing program in the shortcut launch bar 202, thereby indicating that the programs related to the browser program, the communication program and the document editing program include the photo album program.
  • As shown in Figure 19, the electronic device detects that the user drags the photo album window 203 (i.e., the second window) to the area where the icon 230 of the document editing program (i.e., the first control) is located in the quick launch bar 202, i.e., a fifth drag operation 1900 (i.e., the second trigger operation).
  • If the user has selected the image 1010 in the upper left corner of the photo album window 203 before the second trigger operation, that is, the electronic device has determined the first target data based on the photo album program before the second trigger operation, then the electronic device can run the document editing program in the foreground, insert the image 1010 into the document being edited, and display the third human-computer interaction interface shown in Figure 20.
  • Otherwise, the electronic device can display the photo album window 203 in the foreground, receive the user's selection of an image in the photo album window 203, then run the document editing program in the foreground, insert the image into the document being edited, and display the third human-computer interaction interface.
  • the human-computer interaction method provided by the embodiment of the present application will be described below with reference to FIG. 17 and FIG. 21 to FIG. 23 .
  • In this example, the first program includes a program that is not running in the foreground, and the second program also includes a program that is not running in the foreground; in other embodiments, the second program may include only programs that are not running in the foreground.
  • Please refer in sequence to FIG. 17, FIG. 21, FIG. 22 and FIG. 23, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of the present application.
  • the electronic device displays the first human-computer interaction interface as shown in Figure 17.
  • As shown in Figure 21, the first prompt information includes a second dotted line 2100, icons 700 of the second programs, and a first thumbnail 2110 of the photo album window 203.
  • The other end of the second dotted line 2100 connected to the icon 210 of the browser program includes the first thumbnail 2110 of the photo album window 203 and the icon 230 of the document editing program displayed in two lines, which means that the programs related to the browser program include the photo album program running in the foreground and the document editing program not running in the foreground, and that the type of correlation between the photo album program and the browser program is different from the type of correlation between the document editing program and the browser program.
  • The other end of the second dotted line 2100 connected to the icon 230 of the document editing program includes the icon 710 of the sports and health program and the first thumbnail 2110 of the photo album window 203 displayed on the same line, which means that the programs related to the document editing program include the sports and health program and the photo album program; although the sports and health program is not currently running in the foreground and the photo album program is, the sports and health program and the photo album program are related to the document editing program in the same way.
  • As shown in Figure 22, the electronic device detects that the user drags the icon 710 of the sports and health program (i.e., the second control) toward or to the area where the icon 230 of the document editing program (i.e., the first control) is located in the quick launch bar 202, i.e., a sixth drag operation 2200 (i.e., the second trigger operation).
  • Accordingly, the electronic device can run the sports and health program in the background to obtain the latest health data 2300 (i.e., the first target data), including "Today's steps: 6667 steps; exercise duration: 30 minutes; distance: 4.8 kilometers; calories: 238 kcal", then run the document editing program in the foreground, insert the health data into a newly created blank document (i.e., the first processing result), and display the third human-computer interaction interface shown in Figure 23.
  • In the following example, the first program includes a program running in the foreground, and the second program includes a program not running in the foreground; in other embodiments, the second program may include only programs that are not running in the foreground, or include both programs that are running in the foreground and programs that are not.
  • Please refer in sequence to FIG. 3, FIG. 24 and FIG. 25, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of the present application.
  • In this example, the electronic device is a VR device, and the first human-computer interaction interface is a product display interface provided by a shopping program, as shown in Figure 3 above.
  • the first human-computer interaction interface includes three controls (i.e., first controls) including a virtual TV 301, a virtual doll 302, and a virtual projector 303, which respectively represent three products: a TV, a doll toy, and a projector.
  • When the VR device detects that the user's line of sight moves to any first control (i.e., the first trigger operation), the second human-computer interaction interface is displayed. As shown in Figure 24, the upper right corner of the second human-computer interaction interface displays second prompt information 800 such as "Positioning information - display product discount information", prompting the user that the shopping program can obtain the user's positioning information and display the discounted price of each product.
  • When the VR device detects the user's nodding action (i.e., the second trigger operation), it can obtain the user's positioning information (i.e., the first target data) from the positioning program and provide it to the shopping program.
  • The shopping program then obtains the discount information of each product from the server based on the positioning information and displays the third human-computer interaction interface shown in Figure 25, in which the discount for TVs is 50%, the discount for dolls is 90%, and the discount for projectors is 70%.
  • FIG. 26 is a flow chart of a method for obtaining the data processing capability of the first program provided by an embodiment of the present application.
  • It should be noted that the electronic device can obtain the data processing capability of the first program at any time before the second human-computer interaction interface is displayed; for example, when receiving the first trigger operation, the electronic device may execute the method of obtaining the data processing capability of the first program shown in FIG. 26. It should also be noted that, in order to reduce interference to the user, the electronic device can shield the process of acquiring the data processing capability of the first program from the user. For example, the electronic device can execute the method shown in FIG. 26 in the background, or can create a virtual running environment such as a virtual machine and acquire the data processing capability of the first program in that environment.
  • This method is not limited to the specific order described in FIG. 26 and below; it should be understood that in other embodiments the order of some steps can be exchanged according to actual needs, and some steps can also be omitted or deleted.
  • the method includes the following steps:
  • S2601, the electronic device obtains the first data type requested by the first program.
  • In some embodiments, the operating system of the electronic device may send a first acquisition request to the first program, and the first program feeds back the first data type to the operating system in response to the first acquisition request.
  • In some embodiments, the first acquisition request may carry a data type set that includes the first data type, and the first program may determine the first data type from the data type set and return it to the operating system.
  • In some embodiments, the operating system of the electronic device can send the first acquisition request in the form of a broadcast, so that the data types requested by multiple programs can be acquired quickly.
  • the first program may also proactively notify the operating system of the electronic device of the first data type.
  • the electronic device can also obtain the first data type through other methods.
  • the embodiments of this application do not limit the specific method by which the electronic device obtains the first data type.
  • the electronic device can obtain the first data type requested by the first program from a preset third interface of the first program.
  • the third interface may be set in advance by the developer of the first program.
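For illustration, S2601 could be realized roughly as follows; the request format, the candidate data type set, and all names are assumptions, not part of the patent:

```python
# Hypothetical realization of S2601.
DATA_TYPE_SET = ["image", "text", "positioning", "time", "weather"]

def request_first_data_type(first_program) -> str:
    response = first_program.handle_request(  # could equally be delivered as a broadcast
        {"request": "first_acquisition", "data_type_set": DATA_TYPE_SET})
    return response["first_data_type"]        # e.g. "text" for a document editing program
```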
  • S2602, the electronic device obtains second target data corresponding to the first data type.
  • the electronic device may obtain the second target data from the network, or may obtain the second target data from a database stored locally on the electronic device, or may generate the second target data instantly.
  • the data type of the second target data is the first data type.
  • the electronic device can also obtain the second target data through other methods. The embodiments of this application do not limit the specific method by which the electronic device obtains the second target data.
  • the second target data may have nothing to do with the user's real demands, and the second target data may be data that does not carry user characteristics.
  • the electronic device can obtain the third target data corresponding to the first data type, and perform desensitization processing on the third target data, thereby obtaining the second target data.
  • desensitization processing may refer to performing data deformation processing on the third target data through preset desensitization rules, so that the desensitized second target data does not carry user characteristics.
  • The desensitization processing may include data encryption, data replacement, etc. Of course, in actual applications, desensitization may also include more or fewer processing methods; the embodiments of this application do not limit the specific manner of desensitization processing.
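As one possible reading of the desensitization step, the sketch below deforms a record by replacement and coarsening so that the resulting second target data carries no user characteristics; the specific rules and field names are assumptions, since the embodiment explicitly leaves them open:

```python
import hashlib

def desensitize(third_target_data: dict) -> dict:
    # Deform the data with assumed rules so no user characteristics remain.
    second_target_data = dict(third_target_data)
    if "name" in second_target_data:      # data replacement
        digest = hashlib.sha256(second_target_data["name"].encode()).hexdigest()
        second_target_data["name"] = "user_" + digest[:8]
    if "location" in second_target_data:  # coarsen precise coordinates
        lat, lon = second_target_data["location"]
        second_target_data["location"] = (round(lat, 1), round(lon, 1))
    return second_target_data

print(desensitize({"name": "Alice", "location": (39.9087, 116.3975)}))
```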
  • S2603, the electronic device obtains the second processing result fed back by the first program based on the second target data.
  • In some embodiments, the operating system of the electronic device can provide the second target data to the first program through the first interface, and the first program can process the second target data and feed back the second processing result to the operating system through the second interface.
  • the second processing result may include at least one of an image and a data stream.
  • In some embodiments, the electronic device may not display the second processing result to the user; for example, the electronic device may not display the second processing result on the display screen. In some embodiments, the electronic device may prohibit the first program from sending the second target data to other devices. In some embodiments, after providing the second target data to the first program, the electronic device may reject the first network request when it first detects a network request from the first program to use network resources, and allocate network resources to the first program only when a second network request is received after a first preset silent period. That is, after providing the second target data to the first program, the electronic device prohibits the first program from using network resources, and allows it to use network resources only after the first preset silent period, thereby preventing the first program from sending the second processing result to other devices.
  • In some embodiments, when receiving the second network request of the first program, the electronic device can also determine whether the data packets carried by the first network request and the second network request are the same: if they are different, network resources are allocated to the first program; if they are the same, the second network request is rejected again.
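The silent-period policy described above might look roughly like this in code; the duration and the packet-comparison rule are assumptions consistent with the text:

```python
import time

class NetworkGate:
    # Assumed silent-period policy applied to the probed first program.
    def __init__(self, silent_seconds: float = 60.0):
        self.silent_seconds = silent_seconds
        self.first_rejected_at = None
        self.first_payload = None

    def allow(self, payload: bytes) -> bool:
        now = time.monotonic()
        if self.first_rejected_at is None:
            # Reject the first network request and remember its data packet.
            self.first_rejected_at, self.first_payload = now, payload
            return False
        if now - self.first_rejected_at < self.silent_seconds:
            return False  # still inside the first preset silent period
        # After the silent period: allow only if the data packet differs
        # from the one carried by the rejected first request.
        return payload != self.first_payload
```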
  • S2604, the electronic device determines the data processing capability of the first program based on the second processing result.
  • the electronic device can compare the second processing result with the second target data, analyze the processing behavior of the first program based on the second target data, and thereby determine the data processing capability of the first program.
  • the electronic device can input the second target data and the second processing result to the machine learning model to obtain the data processing capability of the first program output by the machine learning model.
  • The machine learning model can be obtained in advance through training on multiple samples; each sample can include fourth target data and the third processing result obtained by the first program processing that fourth target data, and each sample carries a label of the real data processing capability. It should be noted that the machine learning model can be trained by the electronic device or by a device other than the electronic device. In some embodiments, the electronic device may obtain, from historical data corresponding to the first program, the fourth target data submitted by the user to the first program and the third processing result obtained by the first program processing the fourth target data.
  • The electronic device can also use other analysis methods to determine the data processing capability of the first program based on the second target data and the second processing result; the embodiment of the present application does not limit the manner in which the electronic device determines the data processing capability of the first program based on the second target data and the second processing result.
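A toy illustration of S2604 under assumed heuristics: compare the probe input with the program's output and map the observed behavior to one of the capability labels used in this document. A real implementation could instead use the machine learning model described above:

```python
# Toy heuristics mapping probe input/output to a capability label (assumed).
def classify_capability(second_target_data, second_processing_result) -> str:
    if isinstance(second_target_data, bytes) and isinstance(second_processing_result, list):
        return "image - search for similar images based on image recognition"
    if isinstance(second_target_data, str) and isinstance(second_processing_result, list):
        return "text - extended search based on semantic analysis"
    if second_target_data == second_processing_result:
        return "data - copy data to document"
    return "unknown"

print(classify_capability("holiday plans", ["article 1", "article 2"]))
```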
  • For example, if the first program is a browser program and the second target data it obtains is an image, the first program can search based on the image, and the second processing result obtained is other images similar to that image; therefore, the data processing capabilities of the browser program can include "image - search for similar images based on image recognition".
  • For example, if the first program is a browser program and the second target data it obtains is text, the first program can search based on the text, and the second processing result obtained is articles related to the text; therefore, the data processing capabilities of the browser program can include "text - extended search based on semantic analysis".
  • For example, if the first program is a communication program and the second processing result is sending the second target data to a designated contact, the processing capabilities of the communication program may include "data - forward data to contacts".
  • For example, if the first program is a photo album program, the second target data it obtains is an image, and the second processing result is saving the image to the photo album program, the processing capabilities of the photo album program can include "image - get image to save".
  • For example, if the first program is a document editing program and the second processing result is copying the second target data into the document, the processing capabilities of the document editing program may include "data - copy data to document".
  • The above only takes the first program being a browser program, a communication program, a photo album program or a document editing program as examples to explain the data processing capabilities the first program may have, but this does not limit the first program or its data processing capabilities. It can be understood that in actual applications, the first program is not limited to browser programs, communication programs, photo album programs and document editing programs, nor are its data processing capabilities limited to the above "image - search for similar images based on image recognition", "text - extended search based on semantic analysis", "data - forward data to contacts", "image - get image to save" and "data - copy data to document".
  • In some embodiments, the electronic device may determine the data processing capability of the first program based on a first image, a second image, and the second target data, where the first image is the image of the first window before the second target data is provided to the first program, and the second image is the image of the first window after the second processing result is obtained.
  • the electronic device can determine the image difference between the first image and the second image through image recognition, and then combine the image difference with the second target data to determine the data processing capability.
  • In some embodiments, the electronic device can obtain an image set of the first window during the running of the first program and identify the image features of each image in that set, thereby determining the image feature set corresponding to the first window (i.e., establishing a high-confidence understanding of the program content of the first program). Accordingly, the electronic device can identify the first image and the second image based on the image feature set, improving the accuracy of determining the difference between the first image and the second image, and thus further improving the accuracy of determining the data processing capability of the first program.
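A minimal sketch of the image-difference idea, assuming the first and second images are H x W x 3 arrays captured from the first window; the threshold is an assumption:

```python
import numpy as np

def window_changed_region(first_image: np.ndarray, second_image: np.ndarray):
    # first_image/second_image: H x W x 3 captures of the first window,
    # taken before the probe data is provided and after the result arrives.
    diff = np.abs(second_image.astype(int) - first_image.astype(int))
    changed = diff.sum(axis=-1) > 30  # per-pixel change threshold (assumed)
    ys, xs = np.nonzero(changed)
    if xs.size == 0:
        return None                   # no visible change in the first window
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```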
  • In summary, the electronic device can obtain the first data type requested by the first program, provide second target data of the first data type to the first program, obtain the second processing result fed back by the first program based on the second target data, and determine the data processing capability of the first program based on the second target data and the second processing result, thereby improving the reliability of determining the data processing capability of the first program.
  • FIG. 27 is a flow chart of another human-computer interaction method provided by an embodiment of the present application. It should be noted that this method is not limited to the specific order described in Figure 27 and below; it should be understood that in other embodiments the order of some steps can be exchanged according to actual needs, and some steps can also be omitted or deleted.
  • the method includes the following steps:
  • S2701, the electronic device displays a first human-computer interaction interface.
  • the first human-computer interaction interface includes a first interactive object, and the first interactive object corresponds to the first program.
  • S2702, the electronic device determines the data processing capability of the first program. It should be noted that in some embodiments S2702 may be omitted.
  • S2703, the electronic device displays a second human-computer interaction interface.
  • the second human-computer interaction interface includes first prompt information corresponding to the first program.
  • The first prompt information is used to prompt a second program related to the first program, where the second program is used to provide data of a first data type to the first program, and the first data type is a data type that the first program relies on for data processing.
  • In some embodiments, the second human-computer interaction interface also includes second prompt information, and the second prompt information is used to indicate at least one of the data processing capability of the first program and the first data type.
  • S2704, the electronic device displays a third human-computer interaction interface, where the third human-computer interaction interface includes the first processing result of the first program performing data processing based on first target data, and the first target data is data of the first data type belonging to the second program. It should be noted that in some embodiments S2704 may be omitted.
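Putting the four steps together, a hedged end-to-end sketch of the Figure 27 flow follows, with a hypothetical device object wrapping the display and input primitives described above; none of these names come from the patent:

```python
# Hypothetical orchestration of S2701-S2704.
def human_computer_interaction(device, first_program, second_program):
    device.show_first_interface(first_program)                # S2701
    capability = device.determine_capability(first_program)   # S2702, may be omitted
    if device.receive_operation() == "first_trigger":
        device.show_second_interface(first_program, second_program, capability)  # S2703
    if device.receive_operation() == "second_trigger":
        first_target_data = device.fetch_first_target_data(second_program)
        result = first_program.process(first_target_data)
        device.show_third_interface(result)                   # S2704, may be omitted
```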
  • embodiments of the present application also provide an electronic device, including: a memory and a processor.
  • the memory is used to store a computer program; the processor is used to execute the method described in the above method embodiment when the computer program is called.
  • the electronic device provided in this embodiment can execute the above method embodiments, and its implementation principles and technical effects are similar, and will not be described again here.
  • embodiments of the present application also provide a chip system.
  • the chip system includes a processor, the processor is coupled to a memory, and the processor executes a computer program stored in the memory to implement the method described in the above method embodiment.
  • the chip system may be a single chip or a chip module composed of multiple chips.
  • Embodiments of the present application also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, the method described in the above method embodiment is implemented.
  • An embodiment of the present application also provides a computer program product which, when run on an electronic device, causes the electronic device to implement the method described in the above method embodiments.
  • If the above-mentioned integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • this application can implement all or part of the processes in the methods of the above embodiments by instructing relevant hardware through a computer program.
  • the computer program can be stored in a computer-readable storage medium.
  • the computer program When executed by a processor, the steps of each of the above method embodiments may be implemented.
  • the computer program includes computer program code, which may be in the form of source code, object code, executable file or some intermediate form.
  • The computer-readable storage medium may at least include: any entity or device capable of carrying the computer program code to the camera device/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk or an optical disc.
  • It should be understood that the disclosed apparatus/device and methods may be implemented in other ways.
  • The apparatus/device embodiments described above are only illustrative; for example, the division into modules or units is only a logical function division, and in actual implementation there may be other division methods: multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • The mutual coupling, direct coupling or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • Depending on the context, the term "if" may be interpreted as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, as "once it is determined", "in response to determining", "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".


Abstract

本申请提供一种人机交互的方法及电子设备,涉及终端技术领域,其中,该方法包括显示第一人机交互界面,所述第一人机交互界面包括第一交互对象,所述第一交互对象与第一程序对应;当接收到第一触发操作时,显示第二人机交互界面,所述第二人机交互界面包括与所述第一程序对应的第一提示信息,所述第一提示信息用于提示与所述第一程序相关的第二程序,所述第二程序用于向所述第一程序提供第一数据类型的数据,所述第一数据类型为所述第一程序进行数据处理所依赖的数据类型。本申请提供的技术方案能够降低人机交互难度和学习成本,也减少了第一程序在用户不知情的情况下窃取用户隐私的安全隐患。

Description

人机交互的方法及电子设备
本申请要求于2022年5月31日提交国家知识产权局、申请号为202210610171.0、申请名称为“人机交互的方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种人机交互的方法及电子设备。
背景技术
随着科学技术的不断进步,终端设备的数量、类型和功能都得到了长足的发展。终端设备中可以安装应用程序,当运行该应用程序时可以在人机交互界面向用户提供与该应用程序对应的交互对象,从而通过该交互对象与用户进行交互。
现有技术中,一般可以由应用程序的开发人员,事先向用户提供相关的操作说明,然后用户可以基于该操作说明中所指示的方式与该应用程序对应的交互对象进行交互。
但应用程序的开发人员所提供的操作说明往往晦涩难懂,且该应用程序的开发人员也可能不提供相应的操作说明,因此一方面,通常用户需要仔细摸索练习,才能够成功地与交互对象进行人机交互,人机交互的效率低下,用户需要付出昂贵的学习成本;另一方面,应用程序可能在用户不知情的情况下窃取用户隐私,存在很大的安全隐患。
发明内容
有鉴于此,本申请提供一种人机交互的方法及电子设备,能够降低人机交互难度和学习成本,也减少了第一程序在用户不知情的情况下窃取用户隐私的安全隐患。
为了实现上述目的,第一方面,本申请实施例提供人机交互的方法,包括:显示第一人机交互界面,所述第一人机交互界面包括第一交互对象,所述第一交互对象与第一程序对应;当接收到第一触发操作时,显示第二人机交互界面,所述第二人机交互界面包括与所述第一程序对应的第一提示信息,所述第一提示信息用于提示与所述第一程序相关的第二程序,所述第二程序用于向所述第一程序提供第一数据类型的数据,所述第一数据类型为所述第一程序进行数据处理所依赖的数据类型。
在本申请实施例中,可以显示第一人机交互界面,第一人机交互界面包括与第一程序对应的第一交互对象,并在基于第一人机交互界面接收到第一触发操作时,显示第二人机交互界面,其中,第二人机交互界面包括与第一交互对象对应的第一提示信息,从而通过第一提示信息直观地提示用户电子设备中存在与第一程序相关的第二程序,其中,第二程序可以向第一程序提供第一程序进行数据处理所以依赖的第一数据类型的数据,使得用户可以简单明了地确定第一程序的至少部分功能和作用,降低人机交互难度和学习成本,也减少了第一程序在用户不知情的情况下窃取用户隐私的安全隐患。
需要说明的是,第一程序可以为电子设备中的任一应用程序,或者可以为任一应用程序的任一子程序,第二程序可以为与第一程序不同的程序。在一些实施例中,第一程序和第二程序可以为不同的应用程序。在一些实施例中,第一程序和第二程序可以为对应不同应用程序的子程序。在一些实施例中,第一程序和第二程序可以为对应同一应用程序的不同子程序。
其中,第一触发操作可以用于触发电子设备显示第一提示信息。第一触发操作可以包括语音、按键操作、触摸操作或手势操作等操作,当然,在实际应用中,第一触发操作也可以为其他操作类型的操作,且第一触发操作的操作类型可以事先确定,比如可以由电子设备接收用户或者相关技术人员设置的操作作为第一触发操作,本申请实施例不对第一触发操作的操作类型以及确定第一触发操作的方式进行限定。
在一些实施例中,电子设备可以基于第一人机交互界面接收第一触发操作。在一些实施例中,第一触发操作可以为电子设备在显示第一人机交互界面的情况下接收到操作,例如,电子设备在显示第一人机交互界面时,检测到用户按下“windows键+Q键”。在一些实施例中,第一触发操作可以为电子设备从第一人机交互界面检测到的操作,或者说,第一触发操作可以为作用在第一人机交互界面中的操作,例如,电子设备检测到鼠标轨迹或者触摸轨迹在显示的第一人机交互界面上,形成一个“?”的形状,即用户通过鼠标或触摸在第一人机交互界面上绘制出“?”的手势。
在一些实施例中,若包括多个所述第一数据类型,则所述第一提示信息具体用于分别基于各所述第一数据类型,提示对应的所述第二程序。
例如,第一程序的对应的第一数据类型包括文字和图像,第二程序包括能够提供文字的程序A和程序B,以及能够提供图像的程序C和程序D,第一提示信息可以将程序A的图标和程序B的图标排列成一行,将程序C的图像和程序D的图像排列成另一行,从而进一步直观明了地提示用户各第二程序与第一程序的相关类型。
在一些实施例中,所述第二人机交互界面还包括第二提示信息,所述第二提示信息用于提示所述第一程序的数据处理能力和所述第一数据类型中的至少一个。
通过第二提示信息提示第一程序的数据处理能力可以直观地提示用户第一程序需要获取哪些数据,基于这些数据又能提供哪些服务或体验,进一步提高人机交互效率,降低用户的学习成本,同时也进一步避免了第一程序在用户不知情的情况下窃取用户隐私的隐患。另外,第一提示信息和第二提示信息这两个维度的信息进行结合,也使得用户能够更加直接明了地确定第一程序所具有的数据处理能力以及第一程序基于该数据能力而可能关联的第二程序,从而更加完整客观地向用户展示第一程序的功能和作用。
在一些实施例中,第二提示信息可以包括图形、图像和文字中的至少一种信息。
在一些实施例中,所述显示第二人机交互界面,包括:突出显示所述第一提示信息和所述第二提示信息,从而更加直观地显示第一提示信息和第二提示信息,提高提示效果。
以第一提示信息为例。在一些实施例中,突出显示的方式可以包括将第一提示信息作为前景显示,将第一提示信息之外的其他内容作为背景显示。其中,前景显示和背景显示可以具有不同的视觉特征。比如前景显示为彩色,背景显示为灰色或黑白色。 又比如,前景显示的分辨率可以大于背景显示的分辨率,即前景显示比背景显示更加清晰。在一些实施例中,突出显示的方式可以包括将第一提示信息进行高亮显示。在一些实施例中,突出显示的方式可以包括在第一提示信息的周围增加边框或跑马灯等。当然,在实际应用中,电子设备也可以通过其他方式来突出显示第一提示信息,而不限于上述提到的几种显示方式。
在一些实施例中,所述第一交互对象包括第一窗口和第一控件中至少一个。
在一些实施例中,所述第一提示信息具体用于指示与所述第二程序对应的第二交互对象。
在一些实施例中,所述第二交互对象包括第二窗口和第二控件中的至少一个。
在一些实施例中,第一提示信息可以包括图形、图像和文字中的至少一种信息。在一些实施例中,第一提示信息可以包括第二程序的程序标识(名称和/或图标)、第二窗口的缩略图、由第二交互对象指向第一交互对象的箭头、第二交互对象和第一交互对象之间的连接线等等。当然,在实际应用中,第一提示信息并不受限于上述提到的有几种信息。
在一些实施例中,所述方法还包括:当基于所述第二人机交互界面接收到第二触发操作时,显示第三人机交互界面,所述第三人机交互界面包括所述第一程序基于第一目标数据进行数据处理的第一处理结果,其中,所述第一目标数据为归属于所述第二程序的所述第一数据类型的数据。
当电子设备检测到第二触发操作时,可以将归属于第二程序的第一数据类型的第一目标数据提供给第一程序,以使得第一程序可以方便快捷地获取到归属于第二程序的第一目标数据进行数据处理,通过显示第三人机交互界面向用户展示第一处理结果。进一步降低了人机交互的难度和用户的学习成本,同时通过第二触发操作向第一程序提供数据,也可以使得用户在确定第一程序的至少部分功能和作用的情况下,主动向第一程序提供数据,即不需要提前针对第一程序获取数据的权限进行设置,提供数据的方式更加灵活,进一步改善了第一窗口可能在用户不知情的情况下窃取用户隐私的问题。
其中,第二触发操作可以用于触发电子设备将归属于第二程序的第一数据类型的第一目标数据提供给第一程序。第二触发操作可以包括语音、按键操作、触摸操作或手势操作等操作,当然在实际应用中,第二触发操作也可以为其他操作类型的操作,且第二触发操作的操作类型可以由电子设备事先确定,比如可以由电子设备接收用户或者相关技术人员设置的操作作为第二触发操作,本申请实施例不对第二触发操作的操作类型以及电子设备确定第二触发操作的方式进行限定。
在一些实施例中,所述第二触发操作包括对第二交互对象的点击操作;或,所述第二触发操作包括将所述第二交互对象拖向所述第一交互对象的拖动操作;或,所述第二触发操作包括将所述第二交互对象拖动至所述第一交互对象所在区域的拖动操作;其中,所述第一交互对象与所述第一程序对应,第二交互对象与所述第二程序对应。
在一些实施例中,若在第二触发操作之前,电子设备已经确定用户在第二程序中选择了至少部分数据,则电子设备可以将用户选择的该至少部分数据确定为第一目标数据。例如,第二交互对象包括第二窗口,在第二触发操作之前,用户已经在第二窗 口中选择了至少部分数据,那么该至少部分数据即可作为第一目标数据。
在一些实施例中,若电子设备需要与用户交互来确定第一目标数据,那么在第二触发操作之前,电子设备并未基于第二程序确定第一目标数据,且第二程序未在前台运行(即第二程序未运行或者在后台运行),那么电子设备可以在接收到第二触发操作之后,前台运行第二程序,并显示第二窗口,基于第二窗口接收用户提交的第一确定操作,基于第一确定操作在第二窗口中确定第一目标数据。需要说明的是,本申请实施例不对第一确定操作的操作类型进行限定。在一些实施例中,若电子设备可以在不需要与用户交互的情况下确定第一目标数据,那么在第二触发操作之前,电子设备并未基于第二程序确定第一目标数据,且第二程序未运行,那么电子设备可以在接收到第二触发操作之后,(在后台或前台)运行第二程序,从第二程序中确定第一目标数据。在一些实施例中,电子设备可以基于第一数据类型来判断是否需要基于与用户的交互来确定第一目标数据。若第一数据类型为预设的第三数据类型,那么电子设备可以确定不需要与用户交互就可以确定第一目标数据;若第一数据类型不为第三数据类型,则确定需要与用户交互来获取第一目标数据。在一些实施例中,第三数据类型可以包括定位信息、时间信息和天气信息等可以由程序在后台获取的信息的数据类型。在一些实施例中,第三数据类型可以包括用户的属性信息等更新频率比较低的信息的数据类型。当然,在实际应用中,第三数据类型也可以包括更多或更少的数据类型,而不限于上述几种数据类型。
在一些实施例中,在电子设备向第一程序提供第一目标数据之前,若第一程序未在前台运行,则电子设备可以先在前台运行第一程序,再将第一目标数据提供给第一程序。在另一些实施例中,在电子设备向第一程序提供第一目标数据之前,若第一程序未运行,则电子设备可以先在后台运行第一程序,再将第一目标数据提供给第一程序。
在一些实施例中,所述方法还包括:获取所述第一程序请求的第一数据类型;获取所述第一数据类型匹配的第二目标数据;向所述第一程序提供所述第二目标数据;获取所述第一程序基于所述第二目标数据反馈的第二处理结果;基于所述第二目标数据和所述第二处理结果,确定所述第一程序的数据处理能力。
在一些实施例中,在一些实施例中,电子设备可以获取与第一数据类型对应的第三目标数据,并对第三目标数据进行脱敏处理,从而得到第二目标数据。其中,脱敏处理可以指对第三目标数据通过预设的脱敏规则进行数据变形处理,使得脱敏后的第二目标数据中不携带用户特征。在一些实施例中,脱敏处理可以包括数据加密和数据替换等等。
在一些实施例中,所述第一程序和所述第二程序均为在前台运行的程序。也即是,电子设备可以仅在用户能看见至少部分运行过程的程序的范围中,提示其中包括的与第一程序关联的第二程序。当然,在实际应用中,第一程序和第二程序中的至少一个也可以不是在前台运行的程序。
第二方面,本申请实施例提供了一种人机交互的装置,该装置具有实现上述第一方面中任一项的功能。功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块或单元。例如,收发模块或单元、 处理模块或单元、获取模块或单元等。
第三方面,本申请实施例提供一种电子设备,包括:存储器和处理器,存储器用于存储计算机程序;所述处理器用于在调用所述计算机程序时使得所述电子设备执行上述第一方面中任一项所述的方法。
第四方面,本申请实施例提供一种芯片系统,所述芯片系统包括处理器,所述处理器与存储器耦合,所述处理器执行存储器中存储的计算机程序,以实现上述第一方面中任一项所述的方法。
其中,所述芯片系统可以为单个芯片,或者多个芯片组成的芯片模组。
第五方面,本申请实施例提供一种计算机可读存储介质,其上存储有计算机程序,计算机程序被处理器执行时实现上述第一方面中任一项所述的方法。
第六方面,本申请实施例提供一种计算机程序产品,当计算机程序产品在电子设备上运行时,使得电子设备执行上述第一方面中任一项所述的方法。
可以理解的是,上述第二方面至第六方面的有益效果可以参见上述第一方面中的相关描述,在此不再赘述。
附图说明
图1为本申请实施例所提供的一种电子设备的结构示意图;
图2为本申请实施例所提供的一种人机交互界面示意图;
图3为本申请实施例所提供的另一种人机交互界面示意图;
图4为本申请实施例提供的一种人机交互的方法的流程示意图;
图5为本申请实施例所提供的一种第四人机交互界面示意图;
图6为本申请实施例所提供的一种第一人机交互界面示意图;
图7为本申请实施例所提供的一种第二人机交互界面示意图;
图8为本申请实施例所提供的另一种第二人机交互界面示意图;
图9为本申请实施例所提供的另一种第二人机交互界面示意图;
图10为本申请实施例所提供的另一第二人机交互界面示意图;
图11为本申请实施例所提供的另一种第二人机交互界面示意图;
图12为本申请实施例所提供的另一种第二人机交互界面示意图;
图13为本申请实施例所提供的另一种第二人机交互界面示意图;
图14为本申请实施例提供的一种第三人机交互界面示意图;
图15为本申请实施例所提供的另一种第二人机交互界面示意图;
图16为本申请实施例提供的另一种第三人机交互界面示意图;
图17为本申请实施例所提供的另一种第一人机交互界面示意图;
图18为本申请实施例所提供的另一种第二人机交互界面示意图;
图19为本申请实施例所提供的另一种第二人机交互界面示意图;
图20为本申请实施例提供的另一种第三人机交互界面示意图;
图21为本申请实施例所提供的另一种第二人机交互界面示意图;
图22为本申请实施例所提供的另一种第二人机交互界面示意图;
图23为本申请实施例提供的另一种第三人机交互界面示意图;
图24为本申请实施例所提供的另一种第二人机交互界面示意图;
图25为本申请实施例提供的另一种第三人机交互界面示意图;
图26为本申请实施例提供的一种获取数据处理能力的方法的流程示意图;
图27为本申请实施例提供的另一种人机交互的方法的流程示意图。
具体实施方式
本申请实施例提供的人机交互的方法可以应用于手机、平板电脑、可穿戴设备、车载设备、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等电子设备上,本申请实施例对电子设备的具体类型不作任何限制。
图1是本申请实施例提供的一例电子设备100的结构示意图。电子设备100可以包括处理器110、存储器120、通信模块130和显示屏140等。
其中,处理器110可以包括一个或多个处理单元,存储器120用于存储程序代码和数据。在本申请实施例中,处理器110可执行存储器120存储的计算机执行指令,用于对电子设备100的动作进行控制管理。
通信模块130可以用于电子设备100的各个内部模块之间的通信、或者电子设备100和其他外部电子设备之间的通信等。示例性的,如果电子设备100通过有线连接的方式和其他电子设备通信,通信模块130可以包括接口等,例如USB接口,USB接口可以是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
或者,通信模块130可以包括音频器件、射频电路、蓝牙芯片、无线保真(wireless fidelity,Wi-Fi)芯片、近距离无线通讯技术(near-field communication,NFC)模块等,可以通过多种不同的方式实现电子设备100与其他电子设备之间的交互。
显示屏140可以显示人机交互界面中的图像或视频等。
可选地,电子设备100还可以包括外设设备150,例如鼠标、键盘、扬声器、麦克风等。
应理解,除了图1中列举的各种部件或者模块之外,本申请实施例对电子设备100的结构不做具体限定。在本申请另一些实施例中,电子设备100还可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
为了便于理解本申请施例中的技术方案,下面首先对本申请实施例的应用场景予以介绍。
电子设备中可以包括一个或多个的应用程序。应用程序是由一系列具有特定功能、能完成执行操作的可执行文件、数据的集合,一个应用程序可以包括若干个可执行程序、动态链接库和数据文件等数据。应用程序的载体是可执行文件,可执行文件可以包括运行该应用程序的必要信息,比如数据和配置信息等。在一些实施例中,可执行文件可以包括扩展名为exe、app、apk、bat、sys和dll等的文件,当然,在实际应用中,可执行文件的扩展名并不限于上述几种。
进程可以为运行中的应用程序的实例。进程可以包括地址空间及代码、数据、对象等程序运行所需环境和资源的集合。在一些实施例中,一个进程可以对应一个应用程序,一个应用程序可以对应多个进程。在一些实施例中,一个进程可以对应至少一个线程,一个线程可以对应一个进程。
其中,应用程序的进程和线程可以理解为该应用程序的子程序。且可以理解的是,在实际应用中,应用程序可以包括比进程和线程更高层级或更低层级的子程序。
当电子设备运行这些应用程序或其子程序时,该电子设备可以在显示屏中显示人机交互界面,该人机交互界面中可以包括一个或多个的交互对象。交互对象可以为可以响应用户操作的对象,该用户操作可以包括通过鼠标或手指的点击、拖拽等各种操作。在一些实施例中,一个交互对象可以对应一个应用程序或子程序,一个应用程序或子程序可以对应一个以上的交互对象。应用程序或子程序可以通过其对应的交互对象,获取数据并基于该数据进行特定的处理操作,还可以通过该交互对象向用户展示处理结果。
在一些实施例中,人机交互界面可以是2D的,也可以是3D的。例如,当电子设备为手机和电脑等设备时,该人机交互界面可以是2D的;当电子设备为AR设备或VR设备时,该人机交互界面可以是3D的。在一些实施例中,交互对象可以为2D的,也可以是3D的。在一些实施例中,当该人机交互界面为2D的界面时,该人机交互界面中的交互对象可以是2D的;相似的,当该人机交互界面为3D的界面时,该人机交互界面中的交互对象可以是3D的。
在一些实施例中,交互对象可以包括窗口和控件中的至少一种。在一些实施例中,当应用程序启动时可以创建窗口,该窗口可以包括标题栏、菜单和边框等多种视觉元素,该窗口也可以被称为主窗口,为了进一步与用户进行交互,该应用程序还可以创建更多窗口,比如对话框,而对话框又可以包括一个或多个的控件。在一些实施例中,控件可以包括但不限于图标和输入框等等。
例如,一种人机交互界面可以如图2所示,该人机交互界面为电脑中的2D人机交互界面。该人机交互界面的中部包括浏览器程序对应的浏览器窗口201,下部包括快捷启动栏202,该快捷启动栏中包括五个应用程序的图标,从左到右分别是浏览器程序、相册程序、运动健康程序、通讯程序和文档编辑程序。其中,浏览器窗口201中还包括多个控件,比如浏览器窗口201上部包括的网址输入框,浏览器窗口201下部包括的搜索输入框以及网页1-网页6等6个按钮。
又例如,一种人机交互界面可以如图3所示。该人机交互界面可以为VR设备中的3D人机交互界面,该人机交互界面中包括虚拟的购物场景,其中包括虚拟电视301、虚拟人偶302和虚拟投影仪303,且虚拟电视301、虚拟人偶302和虚拟投影仪303均为控件,分别对应电视、人偶玩具和投影仪等商品。相似的,该人机交互界面也可以为AR设备中的人机交互界面,其中包括的虚拟电视301、虚拟人偶302和虚拟投影仪303也可以为控件,或者,在另一些实施例中,虚拟电视301、虚拟人偶302和虚拟投影仪303可以为由AR设备的摄像头拍摄的真实物体,且每个真实物体可以与预设的控件相关联(比如AR设备可以在该真实物体的同一位置显示一个透明的控件),使得用户与该真实物体的交互等于与该真实物体关联的控件的交互。
下面以具体地实施例对本申请的技术方案进行详细说明。下面这几个具体的实施例可以相互结合,对于相同或相似的概念或过程可能在某些实施例不再赘述。
请参照图4,为本申请实施例所提供的一种人机交互的方法的流程图。需要说明的是,该方法并不以图4以及以下所述的具体顺序为限制,应当理解,在其它实施例中,该方法其中部分步骤的顺序可以根据实际需要相互交换,或者其中的部分步骤也可以省略或删除。该方法包括如下步骤:
S401,电子设备显示第一人机交互界面,第一人机交互界面包括第一交互对象,第一交互对象与第一程序对应。
电子设备在开机启动的情况下,可以运行第一程序并显示包括第一交互对象的第一人机交互界面,以便于通过第一交互对象与用户进行人机交互。其中,第一程序可以为电子设备中的任一应用程序,或者可以为任一应用程序的任一子程序。
其中,第一交互对象可以包括第一窗口和第一控件中的至少一个。在一些实施例中,第一控件可以包括第一程序的图标。
在一些实施例中,第一程序可以为在前台运行的程序。其中,第一程序在前台运行可以指用户能够在第一人机交互界面中看到第一程序的至少部分运行过程,比如用户可以看到第一程序对应的第一窗口;相反地,若第一程序在后台运行,可以指用户不能再第一人机交互界面中看到第一程序的任何运行过程。也即是,电子设备可以仅针用户能看到至少部分运行过程的程序,进行交互方式的提示。当然,在实际应用中,第一程序也可以不是在前台运行的程序。
在一些实施例中,第一交互对象可以为电子设备基于用户的第二确定操作在第一人机交互界面中确定的交互对象。其中,第二确定操作用于在第一人机交互界面中确定至少一个第一交互对象。且需要说明的是,本申请实施例不对第二确定操作的操作类型进行限定。
S402,当接收到第一触发操作时,电子设备显示第二人机交互界面,第二人机交互界面包括与第一交互对象对应的第一提示信息,第一提示信息用于提示与第一程序相关的第二程序。
电子设备在显示包括第一交互对象的第一人机交互界面之后,用户可能并不了解具体如何与第一交互对象进行交互,即便第一程序的开发人员提供了相应的操作说明,用户也需要付出巨大的学习成本逐渐摸索练习,才能掌握与第一交互对象进行交互的方法,因此,电子设备可以在接收到第一触发操作时,显示包括与第一交互对象对应的第一提示信息的第二人机交互界面,从而通过第一提示信息直观地提示用户电子设备中存在与第一程序相关的第二程序,使得用户可以简单明了地确定第一程序的至少部分功能和作用,降低人机交互难度和学习成本,也减少了第一程序在用户不知情的情况下窃取用户隐私的安全隐患。
其中,第二程序可以为与第一程序不同的程序。在一些实施例中,第一程序和第二程序可以为不同的应用程序。在一些实施例中,第一程序和第二程序可以为对应不同应用程序的子程序。在一些实施例中,第一程序和第二程序可以为对应同一应用程序的不同子程序。
第二程序与第一程序相关,可以指第二程序的数据处理能力与第一程序的数据处 理能力之间可以互相关联。在一些实施例中,第二程序与第一程序相关,可以包括第二程序可以向第一程序提供第一数据类型的数据,第一数据类型为第一程序进行数据处理所依赖的数据类型。
第一触发操作可以用于触发电子设备显示第一提示信息。其中,第一触发操作可以包括语音、按键操作、触摸操作或手势操作等操作,当然,在实际应用中,第一触发操作也可以为其他操作类型的操作,且第一触发操作的操作类型可以事先确定,比如可以由电子设备接收用户或者相关技术人员设置的操作作为第一触发操作,本申请实施例不对第一触发操作的操作类型以及确定第一触发操作的方式进行限定。
在一些实施例中,电子设备可以基于第一人机交互界面接收第一触发操作。在一些实施例中,第一触发操作可以为电子设备在显示第一人机交互界面的情况下接收到操作,例如,电子设备在显示第一人机交互界面时,检测到用户按下“windows键+Q键”。在一些实施例中,第一触发操作可以为电子设备从第一人机交互界面检测到的操作,或者说,第一触发操作可以为作用在第一人机交互界面中的操作,例如,电子设备检测到鼠标轨迹或者触摸轨迹在显示的第一人机交互界面上,形成一个“?”的形状,即用户通过鼠标或触摸在第一人机交互界面上绘制出“?”的手势。
In some embodiments, the electronic device may display a fourth human-computer interaction interface and receive, through it, the first trigger operation set by the user; the fourth interface may be used to receive configuration information, submitted by the user, relating to the human-computer interaction method provided by the embodiments of this application.
For example, the electronic device may display the fourth interface as shown in FIG. 5. A first configuration item 501 in the fourth interface is the configuration item corresponding to the first trigger operation. The first configuration item 501 includes a slide switch; when the electronic device receives a tap on this switch, it may toggle the switch state between "on" and "off". The switch is currently "on", meaning the electronic device accepts the first trigger operation and assists the user according to the human-computer interaction method provided by the embodiments of this application; conversely, if the switch is "off", the electronic device will neither respond to the first trigger operation nor assist the user according to that method. The first configuration item 501 further includes the text prompt "first trigger operation", prompting the user to configure the first trigger operation in the first configuration item 501; below this text, the first configuration item 501 further includes a current-operation display box and a customize button, where the display box shows "Windows key + Q key", meaning the current first trigger operation is pressing the Windows key and the Q key simultaneously. In some embodiments, the operation shown in the display box may have been preconfigured by relevant technicians or configured previously by the user. When the electronic device receives a newly submitted operation via the customize button, it may update the display box to that new operation. It should be noted that FIG. 5 merely illustrates the fourth interface and does not limit it.
In some embodiments, the second program may be a program running in the foreground; that is, the electronic device may indicate the related second programs only from among the programs whose running process the user can at least partly see. Of course, in practical applications, the second program may also be a program not running in the foreground.
In some embodiments, the first prompt information may be used to indicate a second interactive object corresponding to the second program. In some embodiments, the second interactive object may include at least one of a second window and a second control corresponding to the second program.
The first prompt information may include at least one of graphics, images and text. In some embodiments, the first prompt information may include the second program's program identifier (name and/or icon), a thumbnail of the second window, an arrow pointing from the second interactive object to the first interactive object, a connecting line between the second interactive object and the first interactive object, and so on. Of course, in practical applications, the first prompt information is not limited to the above kinds of information.
In some embodiments, if multiple second programs are included, the electronic device may classify them according to a preset classification manner, and the first prompt information may indicate each class of second program separately based on the classification result. In some embodiments, different second programs may be related to the first program in different ways, and the first prompt information may indicate the second programs of each relation type separately. In some embodiments, the first program corresponds to multiple first data types, and the first prompt information may separately indicate the second programs able to provide data of each first data type. For example, if the first data types of the first program include text and images, and the second programs include programs A and B, which can provide text, and programs C and D, which can provide images, the first prompt information may arrange the icons of program A and program B in one row and the icons of program C and program D in another row, thereby further indicating intuitively each second program's relation type to the first program.
In some embodiments, at any time before displaying the second interface, the electronic device may determine, based on the first program, the second programs related to the first program from a program set included in the electronic device. In some embodiments, the device may determine the first data type on which the first program's data processing depends and the data types that each third program in the program set can provide; if the data type of the data any third program can provide matches the first data type, the device may determine that third program to be a second program related to the first program. The program set may include one or more programs installed on the device, or one or more programs installed and running, or one or more programs installed and running in the foreground. Of course, in practical applications, the program set is not limited to the above kinds of programs, and the electronic device may also determine the second programs related to the first program in other ways.
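A short sketch of the data-type matching just described follows; it is an illustration under assumed data structures, not the patent's implementation. The Program record, the program names and the type strings are all invented for the example; the logic groups candidate third programs by which required first data type they can provide, matching the grouped prompt display described above.

```python
# Hypothetical sketch: a third program whose provided data types intersect
# the first program's required first data types is treated as a related
# second program, grouped by the data type it can supply.
from dataclasses import dataclass, field

@dataclass
class Program:
    name: str
    provides: set[str] = field(default_factory=set)  # data types it can supply
    requires: set[str] = field(default_factory=set)  # first data types it consumes

def related_second_programs(first: Program, candidates: list[Program]) -> dict[str, list[str]]:
    """Group candidate programs by which required first data type they provide."""
    groups: dict[str, list[str]] = {t: [] for t in first.requires}
    for third in candidates:
        if third.name == first.name:
            continue
        for data_type in first.requires & third.provides:
            groups[data_type].append(third.name)
    return groups

browser = Program("browser", provides={"text"}, requires={"image", "text"})
gallery = Program("gallery", provides={"image"})
editor = Program("editor", provides={"text"})
print(related_second_programs(browser, [gallery, editor]))
# e.g. {'image': ['gallery'], 'text': ['editor']} (key order may vary)
```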
After determining the second program, the electronic device may generate the first prompt information and determine the second interface based on the first prompt information.
In some embodiments, the second interface is obtained by adding the first prompt information on the basis of the first interface. In some embodiments, the electronic device may add the first prompt information to the first interface based on a preset first template, where the first template may be used to determine the position of the first prompt information in the first interface, the display style of the first prompt information, or the display style of the second interface. Of course, in practical applications, the electronic device may also determine the second interface in other ways; the embodiments of this application do not limit the way the electronic device determines the second interface.
In some embodiments, the electronic device may highlight the first prompt information, thereby displaying it more intuitively and improving the prompt effect. In some embodiments, the highlighting may include displaying the first prompt information as the foreground and the other content as the background, where foreground and background display may have different visual characteristics: for example, the foreground in color and the background in gray or black-and-white; or, for example, the foreground displayed at a higher resolution than the background, i.e., the foreground sharper than the background. In some embodiments, the highlighting may include rendering the first prompt information with a highlight. In some embodiments, the highlighting may include adding a border, a marquee effect or the like around the first prompt information. Of course, in practical applications, the electronic device may also highlight the first prompt information in other ways, not limited to the display manners above.
In some embodiments, the second interface may further include second prompt information, the second prompt information being used to indicate the data processing capability of the first program. In some embodiments, the second prompt information may be used to indicate the first data type on which the first program's data processing capability depends, i.e., the first program's input. In some embodiments, the second prompt information may be used to indicate the second data type of the data the first program can provide.
Indicating the first program's data processing capability through the second prompt information can intuitively tell the user which data the first program needs to obtain and which services or experiences it can provide based on that data, further improving interaction efficiency and lowering the user's learning cost, while also further preventing the first program from stealing the user's private data without the user's knowledge. Moreover, combining the two dimensions of information, the first prompt information and the second prompt information, also allows the user to determine more directly and clearly both the data processing capability the first program possesses and the second programs that may be associated with the first program through that capability, thereby presenting the first program's functions to the user more completely and objectively.
It should be noted that the electronic device may determine the first program's data processing capability in advance, so that it can respond to the first trigger operation promptly and display the second prompt information in the second interface. In some embodiments, the operating system of the electronic device may send a second acquisition request to the first program, and the first program, in response to the second acquisition request, reports its data processing capability to the operating system. In other embodiments, the electronic device may also determine the first program's data processing capability by the method shown in FIG. 26 below.
The second prompt information may include at least one of graphics, images and text.
In some embodiments, the second interface is obtained by adding the first prompt information and the second prompt information on the basis of the first interface. In some embodiments, the electronic device may add the second prompt information to the first interface based on a preset second template, where the second template may be used to determine the position of the second prompt information in the first interface, the display style of the second prompt information, or the display style of the second interface. In some embodiments, the first template and the second template may be the same template.
In some embodiments, the first program may have multiple data processing capabilities, and the second prompt information may indicate each of them separately. For example, if the first program's data processing capabilities include "search for similar images by image recognition" and "perform extended search by semantic analysis", the second prompt information may indicate both capabilities in text.
In some embodiments, the electronic device may highlight the second prompt information, thereby displaying it more intuitively and improving the prompt effect. It should be noted that the manner of highlighting the second prompt information may be the same as or similar to the manner of highlighting the first prompt information.
In some embodiments, the electronic device may display the fourth interface and receive, through it, the prompt content set by the user, i.e., whether to display the first prompt information and the second prompt information.
For example, the fourth interface shown in FIG. 5 further includes a second configuration item 502, with the text prompt "prompt content" prompting the user to configure, in the second configuration item 502, the prompt content shown in response to the first trigger operation. Below this text, the second configuration item 502 further includes a prompt-content display box and a customize button, where the display box shows "first prompt information and second prompt information", meaning the current prompt content includes both. In some embodiments, the prompt content shown in the display box may have been preconfigured by relevant technicians or configured previously by the user. When the electronic device receives newly submitted prompt content via the customize button, it may update the prompt content in the display box to the new prompt content.
S403: When the electronic device receives a second trigger operation via the second interface, it displays a third human-computer interaction interface, the third human-computer interaction interface including a first processing result of data processing performed by the first program based on first target data, where the first target data is data of the first data type belonging to the second program.
Upon detecting the second trigger operation, the electronic device may provide the first target data, of the first data type and belonging to the second program, to the first program, so that the first program can conveniently and quickly obtain the first target data belonging to the second program for data processing, and the device presents the first processing result to the user by displaying the third interface. This further lowers the difficulty and learning cost of human-computer interaction. At the same time, providing data to the first program via the second trigger operation also allows the user, having determined at least part of the first program's functions, to supply data to the first program actively, i.e., without configuring the first program's data-access permissions in advance; the data supply is thus more flexible, which further mitigates the problem of the first program potentially stealing the user's private data without the user's knowledge.
The second trigger operation may be used to trigger the electronic device to provide the first target data, of the first data type and belonging to the second program, to the first program. The second trigger operation may include voice, key presses, touch operations, gesture operations and the like; of course, in practical applications it may also be an operation of another type, and its type may be determined in advance by the electronic device, for example by accepting an operation set by the user or by relevant technicians as the second trigger operation. The embodiments of this application limit neither the operation type of the second trigger operation nor the way the electronic device determines it.
In some embodiments, the electronic device may display the fourth interface and receive, through it, the second trigger operation set by the user.
For example, the fourth interface shown in FIG. 5 further includes a third configuration item 503 for configuring the second trigger operation. The third configuration item 503 includes the text "second trigger operation", prompting the user to configure the second trigger operation in the third configuration item 503; below this text are a current-operation display box and a customize button, where the display box shows "drag the first target data from the second window to the first window", meaning that when the electronic device currently detects the user dragging the first target data from the second window to the first window, it provides the first target data to the first program. In some embodiments, the operation shown in the display box may have been preconfigured by relevant technicians or configured previously by the user. When the electronic device receives a newly submitted operation via the customize button, it may update the display box to that new operation.
In some embodiments, the second trigger operation may include a tap on the second interactive object. In some embodiments, the second trigger operation may include a drag operation that drags the second interactive object toward the first interactive object. In some embodiments, the second trigger operation may include a drag operation that drags the second interactive object from the region where the second interactive object is located to the region where the first interactive object is located. In some embodiments, the second interactive object includes the second window, and the second trigger operation may include a drag operation that drags at least part of the data in the second window toward the first interactive object, or a drag operation that drags at least part of the data in the second window to the region where the first interactive object is located; the at least part of the data dragged by the second trigger operation is the first target data provided to the first program.
In some embodiments, if before the second trigger operation the electronic device has determined that the user selected at least part of the data in the second program, the device may determine that selected data to be the first target data. For example, the second interactive object includes the second window; if, before the second trigger operation, the user has already selected at least part of the data in the second window, that data can serve as the first target data.
In some embodiments, if the electronic device must interact with the user to determine the first target data, and before the second trigger operation the device has not determined the first target data based on the second program and the second program is not running in the foreground (i.e., not running, or running in the background), then after receiving the second trigger operation the device may run the second program in the foreground, display the second window, receive via the second window a first determination operation submitted by the user, and determine the first target data in the second window based on the first determination operation, where the first determination operation is used to determine the first target data in the second window. It should be noted that the embodiments of this application do not limit the operation type of the first determination operation. In some embodiments, if the device can determine the first target data without interacting with the user, and before the second trigger operation the device has not determined the first target data based on the second program and the second program is not running, then after receiving the second trigger operation the device may run the second program (in the background or foreground) and determine the first target data from the second program. In some embodiments, the device may judge, based on the first data type, whether interaction with the user is needed to determine the first target data: if the first data type is a preset third data type, the device may determine that the first target data can be determined without user interaction; if the first data type is not the third data type, it determines that user interaction is needed to obtain the first target data. In some embodiments, the third data type may include data types of information a program can obtain in the background, such as location information, time information and weather information. In some embodiments, the third data type may include data types of information updated relatively infrequently, such as the user's attribute information. Of course, in practical applications, the third data type may include more or fewer data types and is not limited to those above.
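The following toy sketch illustrates the decision just described; the concrete type names in the preset set are assumptions taken from the examples above, not an exhaustive list from the patent.

```python
# Hypothetical sketch: data of a preset "third data type" (location, time,
# weather, user attributes, ...) can be fetched silently in the background;
# any other first data type requires the user to pick the first target data
# in the second window first.
THIRD_DATA_TYPES = {"location", "time", "weather", "user_profile"}

def needs_user_selection(first_data_type: str) -> bool:
    return first_data_type not in THIRD_DATA_TYPES

print(needs_user_selection("weather"))  # False: fetch in the background
print(needs_user_selection("image"))    # True: show the second window first
```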
In some embodiments, before the electronic device provides the first target data to the first program, if the first program is not running in the foreground, the device may first run the first program in the foreground and then provide the first target data to it. In other embodiments, before providing the first target data, if the first program is not running, the device may first run the first program in the background and then provide the first target data to it.
In some embodiments, the electronic device may provide the first target data to the first program through the first program's first interface. In some embodiments, the electronic device may obtain the first program's first processing result through the first program's second interface. It should be noted that the first interface and the second interface may be interfaces set in advance by the first program's technical developers.
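The patent does not specify the signatures of these developer-provided interfaces, so the following is only a minimal sketch under assumed method names: one entry point accepts the first target data, and another exposes the first processing result for the system to read back.

```python
# Hypothetical sketch of the two interfaces: provide_data stands in for the
# "first interface" (data in), fetch_result for the "second interface"
# (first processing result out). The processing itself is a placeholder.
class FirstProgram:
    def __init__(self) -> None:
        self._result: bytes | None = None

    def provide_data(self, data: bytes) -> None:
        """'First interface': the system hands the first target data over."""
        self._result = b"processed:" + data  # stand-in for real processing

    def fetch_result(self) -> bytes | None:
        """'Second interface': the system reads the first processing result."""
        return self._result

prog = FirstProgram()
prog.provide_data(b"image-bytes")
print(prog.fetch_result())  # b'processed:image-bytes'
```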
In some embodiments, the first processing result may include at least one of an image and a data stream.
In some embodiments, it suffices for the electronic device to present at least part of the first program's functions to the user through the first prompt information and the second prompt information, without providing the first target data to the first program; the device then obtains no first processing result, or, even if it does obtain one, it may refrain from presenting it to the user. Therefore, S403 may be omitted.
In the embodiments of this application, the electronic device may display the first interface including the first interactive object corresponding to the first program, and, upon receiving the first trigger operation via the first interface, display the second interface, where the second interface includes the first prompt information corresponding to the first interactive object, thereby intuitively indicating to the user that the electronic device contains a second program related to the first program. The user can thus determine at least part of the first program's functions simply and clearly, which lowers the interaction difficulty and learning cost and also reduces the security risk of the first program stealing the user's private data without the user's knowledge.
The human-computer interaction method provided by the embodiments of this application is described below with reference to FIGS. 6-14, where the first program and the second program both include programs running in the foreground. Refer in turn to FIG. 6, any of FIGS. 7-9, any of FIGS. 10-13, and FIG. 14, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of this application.
The electronic device may display the first interface as shown in FIG. 6. The bottom of the first interface includes the quick launch bar 202 containing, from left to right, the icons of application programs such as the browser program, photo album program, fitness program, communication program and document editing program. The fitness program is currently not running in the foreground, so the first interface shows none of its running process; the other application programs are all currently running in the foreground, so the first interface further includes the browser window 201, photo album window 203, document editing window 204 and communication window 205, presenting to the user at least part of each corresponding application program's running process through these windows. The first window may include one or more of the browser window 201, photo album window 203, document editing window 204 and communication window 205.
The user presses "Windows key + Q key" (i.e., the first trigger operation), and the electronic device displays the second interface as shown in any of FIGS. 7-9 below.
In some embodiments, the second interface may be as shown in FIG. 7, where the first prompt information includes icons 700 of the second programs: at the right edge of each first window are displayed the icons 700 of the second programs that are related to that first program and running in the foreground, thereby indicating possible associations among the programs currently running in the foreground. The right edge of the browser window 201 shows the icons of the photo album program and the document editing program, displayed in two separate rows, indicating to the user that the photo album program's relation type to the browser program differs from the document editing program's relation type to the browser program. The right edge of the photo album window 203 shows the icons of the browser program, document editing program and communication program, displayed in the same row, indicating that the browser, document editing and communication programs all have the same relation type to the photo album program. In addition, the right edge of the document editing window 204 shows the icons of the browser program and the photo album program, and the right edge of the communication window 205 shows the icons of the browser program, photo album program and document editing program.
In some embodiments, the second interface may be as shown in FIG. 8: building on FIG. 7, the right of each window further includes textual second prompt information 800 in the format "first data type - data processing capability", thereby indicating how the first program can process data of the first data type. At the right edge of the browser window 201, "image - search for similar images by image recognition" is shown above the photo album program's icon, and "text - perform extended search by semantic analysis" above the document editing program's icon, i.e., the browser program can perform image recognition on an obtained image to search for similar images, and can perform semantic analysis and extended search on obtained text. At the right edge of the photo album window 203, "image - capture window screenshot to save" is shown above the icons of the browser, communication and document editing programs, i.e., the photo album program can obtain images for saving. At the right edge of the document editing window 204, "data - copy data into document" is further shown above the icons of the browser and photo album programs, i.e., the document editing program can obtain data and copy-paste it into the current document file. At the right edge of the communication window 205, "data - forward data to a contact" is further shown above the icons of the browser, photo album and document editing programs, i.e., the communication program can obtain data and forward it to a contact.
In some embodiments, the second interface may be as shown in FIG. 9, where the first prompt information (i.e., the icons 700 of the second programs) and the second prompt information 800 are displayed as the foreground and can be seen clearly, while the other content is displayed, relatively blurred, as the background, thereby highlighting the first prompt information and the second prompt information 800 and presenting them more conspicuously.
When the electronic device receives the second trigger operation via the second interface shown in any of FIGS. 7-9 above, it may provide the first target data, of the first data type and belonging to the second program, to the first program, and present the third interface including the first processing result.
Take the first program being the document editing program and the second program being the photo album program as an example.
In some embodiments, as shown in FIG. 10, the second trigger operation may be a first drag operation 1000 dragging an image 1010 in the photo album window 203 (i.e., the second window) toward, or to the region of, the document editing window 204 (i.e., the first window); the image 1010 dragged by the second trigger operation is the first target data. The electronic device may, through the document editing program, insert the image 1010 into the document file being edited in the document editing window 204 and display the third interface as shown in FIG. 14, in which the image 1010 dragged by the user has been inserted into the document file (i.e., the first processing result).
In some embodiments, as shown in FIG. 11, the second trigger operation may be a second drag operation 1100 dragging the photo album window 203 toward, or to the region of, the document editing window 204. In some embodiments, as shown in FIG. 12, the second trigger operation may be a third drag operation 1200 dragging the photo album program's icon (i.e., the second control) at the right edge of the document editing window 204 toward, or to the region of, the document editing window 204. In some embodiments, as shown in FIG. 13, the second trigger operation may be a drag operation 1300 dragging the photo album program's icon (i.e., the second control) in the quick launch bar 202 toward, or to the region of, the document editing window 204.
If, before the second trigger operation, the user has already selected the upper-left image (i.e., the first target data) 1010 in the photo album window 203, the electronic device may directly insert the image 1010, through the document editing program, into the document being edited in the document editing window 204 and display the third interface as shown in FIG. 14. If, before the second trigger operation, the user has not selected any image in the photo album window 203, the electronic device may, after the second trigger operation, receive the image the user selects in the photo album window 203 and then, through the document editing program, insert that image into the document being edited in the document editing window 204.
The human-computer interaction method provided by the embodiments of this application is described below further with reference to FIGS. 6 and 15-16, where the first program includes a program running in the foreground, and the second program may further include a program not running in the foreground. It should be noted that, in some embodiments, the second program may also include only programs not running in the foreground. Refer in turn to FIGS. 6, 15 and 16, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of this application.
The electronic device displays the first interface as shown in FIG. 6 above.
The user presses "Windows key + Q key" (i.e., the first trigger operation), and the electronic device displays the second interface shown in FIG. 15 below. Compared with the second interface shown in FIG. 7, in FIG. 15 the second-program icons 700 displayed at the right of the communication window 205 further include the fitness program's icon 710; the fitness program is currently not running in the foreground, so neither the first interface nor the second interface includes a window corresponding to the fitness program.
Take the first program being the communication program and the second program being the fitness program as an example. The electronic device detects a fourth drag operation (i.e., the second trigger operation) dragging the fitness program's icon 710 (i.e., the second control) at the right edge of the communication window 205 toward, or to the region of, the communication window 205. The electronic device may run the fitness program, obtain the user's health data (i.e., the first target data) from the fitness program, and provide the health data to the communication program, which sends it to friend A, with whom the user is currently communicating. The third interface is displayed as shown in FIG. 16: as can be seen from FIG. 16, the communication window 205 now includes a new message record 1600 about the health data (i.e., the first processing result).
In some embodiments, the electronic device can determine the first target data without interacting with the user through the fitness program; for example, it may take the health data most recently generated by the fitness program as the first target data, in which case it need not run the fitness program in the foreground. Alternatively, in other embodiments, the electronic device may interact with the user through the fitness program to determine the first target data: it may run the fitness program in the foreground, determine the health data the user selects in the fitness program as the first target data, and then provide the first target data to the communication program.
The human-computer interaction method provided by the embodiments of this application is described below with reference to FIGS. 17-20, where the first program includes a program not running in the foreground, and the second program includes a program running in the foreground. Refer in turn to FIGS. 17, 18, 19 and 20, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of this application.
The electronic device may display the first interface as shown in FIG. 17. The bottom of the first interface includes the quick launch bar 202 containing, from left to right, the icons of application programs such as the browser program, photo album program, fitness program, communication program and document editing program; each of these icons can serve as a first control. The photo album program is currently running in the foreground and the other application programs are not, so the first interface includes only the photo album window 203.
The user presses "Windows key + Q key", and the electronic device may display the second interface as shown in FIG. 18. In the second interface shown in FIG. 18, the first prompt information may include first dashed lines 1800. Since only the photo album program is currently running in the foreground, the first dashed lines 1800 point from the photo album window 203 to, respectively, the browser program's icon 210, the communication program's icon 220 and the document editing program's icon 230 in the quick launch bar 202, thereby indicating that the programs related to the browser, communication and document editing programs include the photo album program.
Take the first program being the document editing program and the second program being the photo album program as an example. As shown in FIG. 19, the electronic device detects a fifth drag operation 1900 (i.e., the second trigger operation) dragging the photo album window 203 (i.e., the second window) toward, or to the region of, the document editing program's icon 230 (i.e., the first control) in the quick launch bar 202. If, before the second trigger operation, the user has already selected the upper-left image 1010 in the photo album window 203, i.e., the electronic device has already determined the first target data based on the photo album program before the second trigger operation, then the device may run the document editing program in the foreground, insert the image 1010 into the document being edited, and display the third interface as shown in FIG. 20. If, before the second trigger operation, the user has not selected an image in the photo album window 203, i.e., the device has not determined the first target data before the second trigger operation, then the device may display the photo album window 203 in the foreground, receive the image the user selects in the photo album window 203, run the document editing program in the foreground, insert that image into the document being edited, and display the third interface.
The human-computer interaction method provided by the embodiments of this application is described below further with reference to FIGS. 17 and 21-23, where the first program includes a program not running in the foreground, and the second program also includes a program not running in the foreground. It should be noted that, in some embodiments, the second program may also include only programs not running in the foreground. Refer in turn to FIGS. 17, 21, 22 and 23, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of this application.
The electronic device may display the first interface as shown in FIG. 17.
The user presses "Windows key + Q key", and the electronic device may display the second interface as shown in FIG. 21, where the first prompt information includes second dashed lines 2100, the second programs' icons 700 and a first thumbnail 2110 of the photo album window 203. Taking the browser program as an example, the other end of the second dashed line 2100 connected to the browser program's icon 210 includes, displayed in two rows, the first thumbnail 2110 of the photo album window 203 and the document editing program's icon 230, meaning the programs related to the browser program include the photo album program, running in the foreground, and the document editing program, not running in the foreground, and that the photo album program's relation type to the browser program differs from the document editing program's. Taking the document editing program as another example, the other end of the second dashed line 2100 connected to the document editing program's icon 230 includes, displayed in the same row, the fitness program's icon 710 and the first thumbnail 2110 of the photo album window 203, meaning the programs related to the document editing program include the fitness program, currently not running in the foreground, and the photo album program, currently running in the foreground, and that the fitness program's relation type to the document editing program is the same as the photo album program's.
Take the first program being the document editing program and the second program being the fitness program as an example. As shown in FIG. 22, the electronic device detects a sixth drag operation 2200 (i.e., the second trigger operation) dragging the fitness program's icon 710 (i.e., the second control) toward, or to the region of, the document editing program's icon (i.e., the first control) in the quick launch bar 202. Since neither the document editing program nor the fitness program is currently running in the foreground, the electronic device may run the fitness program in the background to obtain the latest health data 2300 (i.e., the first target data), including "Steps today: 6667; exercise duration: 30 minutes; distance: 4.8 km; calories: 238 kcal", run the document editing program in the foreground, insert the health data into a newly created blank document (i.e., the first processing result), and display the third interface as shown in FIG. 23.
The human-computer interaction method provided by the embodiments of this application is described below further with reference to FIGS. 3, 24 and 25, where the first program includes a program running in the foreground, and the second program includes a program not running in the foreground. It should be noted that, in some embodiments, the second program may also include only programs not running in the foreground, or include both programs running in the foreground and programs not running in the foreground. Refer in turn to FIGS. 3, 24 and 25, which are schematic diagrams of human-computer interaction interfaces related to a human-computer interaction method provided by an embodiment of this application.
The electronic device is a VR device, and the first interface is a merchandise display interface provided by a shopping program, as shown in FIG. 3 above. The first interface includes three controls (i.e., first controls): the virtual television 301, virtual figurine 302 and virtual projector 303, representing three items of merchandise, a television, a figurine toy and a projector, respectively.
When the VR device detects the user's gaze moving onto any first control (i.e., the first trigger operation), it displays the second interface as shown in FIG. 24, the upper right corner of which shows second prompt information 800 such as "location information - show merchandise discount information", thereby indicating to the user that the shopping program can obtain the user's location information and display each item's discounted price.
When the VR device detects the user's nodding motion (i.e., the second trigger operation), it may obtain the user's location information (i.e., the first target data) from a positioning program and provide it to the shopping program, which obtains from a server the discount information for the three items of merchandise, the television, figurine toy and projector, based on that location information, and displays the discount information 2500 (i.e., the first processing result) via the third interface shown in FIG. 25, where the television's discount is 50%, the figurine's discount is 90% and the projector's discount is 70%.
Referring to FIG. 26, which is a flowchart of a method for obtaining the first program's data processing capability, provided by an embodiment of this application. It should be noted that the electronic device may obtain the first program's data processing capability at any time before displaying the second interface described above; for example, it may execute the method shown in FIG. 26 upon receiving the first trigger operation. It should also be noted that, to reduce disturbance to the user, the electronic device shields the user from perceiving the process of obtaining the first program's data processing capability; in some embodiments, the device may execute the method shown in FIG. 26 in the background, or the device may create a virtual running environment such as a virtual machine and obtain the first program's data processing capability within that environment. It should further be noted that the method is not limited to the specific order shown in FIG. 26 and described below; it should be understood that, in other embodiments, the order of some steps may be interchanged as actually required, and some steps may be omitted or deleted. The method includes the following steps:
S2601: The electronic device obtains the first data type requested by the first program.
In some embodiments, the operating system of the electronic device may send a first acquisition request to the first program, and the first program, in response, reports the first data type to the operating system. In some embodiments, the first acquisition request may carry a data type set including the first data type; accordingly, the first program may determine the first data type from that set and report it to the operating system. In some embodiments, the operating system may send the first acquisition request by broadcast, thereby quickly obtaining the data types requested by multiple programs.
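The following sketch, with invented message shapes, shows the request/response exchange just described: the system broadcasts a candidate data-type set, and each program answers with the subset it actually requests.

```python
# Hypothetical sketch: the broadcast first acquisition request carries a
# candidate data-type set; a program replies with the intersection of that
# set and the data types it wants.
CANDIDATE_TYPES = {"image", "text", "audio", "location"}

def handle_first_acquisition_request(program_requires: set[str]) -> set[str]:
    """What a program reports back upon receiving the broadcast request."""
    return program_requires & CANDIDATE_TYPES

print(handle_first_acquisition_request({"image", "text"}))  # {'image', 'text'}
```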
In some embodiments, the first program may also proactively notify the operating system of the electronic device of the first data type.
Of course, in practical applications, the electronic device may also obtain the first data type in other ways; the embodiments of this application do not limit the specific way the electronic device obtains the first data type.
In some embodiments, the electronic device may obtain the first data type requested by the first program from the first program's preset third interface, where the third interface may be set in advance by the first program's developers.
S2602: The electronic device obtains second target data corresponding to the first data type.
The electronic device may obtain the second target data from the network, or from a database already stored locally on the device, or generate the second target data on the fly; the data type of the second target data is the first data type. Of course, in practical applications, the electronic device may also obtain the second target data in other ways; the embodiments of this application do not limit the specific way.
In some embodiments, to reduce disturbance to the user and mitigate the problem of user privacy leakage, the second target data may be unrelated to the user's real needs, and the second target data may be data carrying no user characteristics. In some embodiments, the electronic device may obtain third target data corresponding to the first data type and desensitize it, thereby obtaining the second target data, where desensitization may mean deforming the third target data according to preset desensitization rules so that the desensitized second target data carries no user characteristics. In some embodiments, desensitization may include data encryption, data replacement and the like; of course, in practical applications, desensitization may include more or fewer processing methods, and the embodiments of this application do not limit the specific desensitization method.
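A toy sketch of a replacement-based desensitization rule follows; the field names and placeholder values are assumptions for illustration, and a real implementation could equally use encryption or other deformation rules as noted above.

```python
# Hypothetical sketch: user-identifying fields in the third target data are
# swapped for neutral placeholders, yielding second target data that carries
# no user characteristics.
import copy

SENSITIVE_FIELDS = {"name": "user", "phone": "0000000", "address": "n/a"}

def desensitize(third_target_data: dict) -> dict:
    """Return second target data with sensitive fields replaced."""
    data = copy.deepcopy(third_target_data)
    for field_name, placeholder in SENSITIVE_FIELDS.items():
        if field_name in data:
            data[field_name] = placeholder
    return data

print(desensitize({"name": "Alice", "phone": "13800000000", "steps": 6667}))
# {'name': 'user', 'phone': '0000000', 'steps': 6667}
```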
S2603: The electronic device obtains a second processing result fed back by the first program based on the second target data.
The operating system of the electronic device may provide the second target data to the first program through the first program's second interface; the first program may process the second target data and feed back the second processing result to the operating system through the third interface. The second processing result may include at least one of an image and a data stream.
To keep the process of obtaining the first program's data processing capability from affecting the user's real experience of using the first program, in some embodiments the electronic device may not present the second processing result to the user; for example, it may not display the second processing result on the display screen. In some embodiments, the device may prohibit the first program from sending the second target data to other devices. In some embodiments, after providing the second target data to the first program, the device may, upon first detecting a first network request in which the first program asks to use network resources, reject that first network request, and only after a first preset duration following the rejection during which no network request from the first program has been received, accept a second network request from the first program and allocate network resources to it. That is, after providing the second target data to the first program, the device bars the first program from using network resources, and allows it to use them again only after the first preset duration of silence, thereby preventing the first program from sending the second processing result to other devices. In some embodiments, upon receiving the first program's second network request, the device may further judge whether the data packets carried by the first network request and the second network request are the same: if they are different, it allocates network resources to the first program; if they are the same, it rejects the second network request again.
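The following is a minimal sketch of the network-gating rule just described, under the assumption that request arrival times are the only signal tracked; a real system would of course also inspect the packets, as the last embodiment above notes.

```python
# Hypothetical sketch: the first network request after probe data is supplied
# is rejected outright, and later requests are allowed only once the program
# has stayed silent for the first preset duration.
import time

class NetworkGate:
    def __init__(self, silence_seconds: float) -> None:
        self.silence_seconds = silence_seconds
        self.rejected_once = False
        self.last_request_at: float | None = None

    def allow(self, now: float | None = None) -> bool:
        now = time.monotonic() if now is None else now
        if not self.rejected_once:
            self.rejected_once = True       # first request: always rejected
            self.last_request_at = now
            return False
        if now - self.last_request_at < self.silence_seconds:
            self.last_request_at = now      # not silent long enough: reject
            return False
        return True                         # silent long enough: allow

gate = NetworkGate(silence_seconds=60.0)
print(gate.allow(now=0.0))    # False: first request rejected
print(gate.allow(now=30.0))   # False: only 30 s of silence
print(gate.allow(now=120.0))  # True: 90 s of silence since the last attempt
```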
S2604: The electronic device determines the first program's data processing capability based on the second processing result.
In some embodiments, the electronic device may compare the second processing result with the second target data and analyze the processing behavior the first program performed on the second target data, thereby determining the first program's data processing capability.
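As a deliberately simplified illustration of such a comparison, and not the patent's analyzer, one could classify the behavior by how the result relates to the probe input; the rules below are invented examples only.

```python
# Toy sketch: infer a coarse capability label by comparing the probe data
# with the result the program returned. Real analysis would be far richer.
def infer_capability(second_target_data, second_result) -> str:
    if isinstance(second_result, list) and second_target_data not in second_result:
        return "search: returns items related to the input"
    if second_result == second_target_data:
        return "store/forward: returns the input unchanged"
    return "transform: returns a modified version of the input"

print(infer_capability("cat.png", ["cat2.png", "cat3.png"]))
# search: returns items related to the input
```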
In some embodiments, the electronic device may input the second target data and the second processing result into a machine learning model and obtain the first program's data processing capability output by the model.
The machine learning model may be trained in advance on multiple samples, each sample including fourth target data and a third processing result obtained by the first program processing that fourth target data, and each sample carrying a ground-truth data processing capability label. It should be noted that the machine learning model may be trained by the electronic device, or by a device other than the electronic device. In some embodiments, the electronic device may obtain, from history data corresponding to the first program, fourth target data submitted by the user to the first program and the third processing result obtained by the first program processing the fourth target data.
Of course, in practical applications, the electronic device may also use other analysis methods to determine the first program's data processing capability based on the second target data and the second processing result; the embodiments of this application do not limit the way the electronic device determines the first program's data processing capability based on the second target data and the second processing result.
For example, the first program is a browser program: when the second target data obtained by the first program is an image, the first program may search based on that image, and the resulting second processing result is other images similar to it; therefore, the browser program's data processing capability may include "image - search for similar images by image recognition".
As another example, the first program is a browser program: when the second target data obtained by the first program is text, the first program may search based on that text, and the resulting second processing result is articles related to it; therefore, the browser program's data processing capability may include "text - perform extended search by semantic analysis".
As another example, the first program is a communication program: when the first program obtains the second target data, the second processing result is sending the second target data to a specified contact; therefore, the communication program's processing capability may include "data - forward data to a contact".
As another example, the first program is a photo album program: when the second target data obtained by the first program is an image, the second processing result is saving that image into the photo album program; therefore, the photo album program's processing capability may include "image - obtain image to save".
As another example, the first program is a document editing program: when the first program obtains the second target data, the second processing result is copying the second target data into a document; therefore, the document editing program's processing capability may include "data - copy data into document".
It should be noted that the above merely takes the browser program, communication program, photo album program and document editing program as examples of the first program to illustrate the data processing capabilities the first program may have, without limiting the first program or its data processing capabilities. It can be understood that, in practical applications, the first program is not limited to the browser, communication, photo album and document editing programs, and the first program's data processing capabilities are not limited to the above "image - search for similar images by image recognition", "text - perform extended search by semantic analysis", "data - forward data to a contact", "image - obtain image to save" and "data - copy data into document".
In some embodiments, the electronic device may determine the first program's data processing capability based on a first image, a second image and the second target data, where the first image is an image of the first window before the second target data is provided to the first program, and the second image is an image of the first window after the second processing result is obtained. The device may determine the image difference between the first image and the second image through image recognition, and then combine that difference with the second target data to determine the data processing capability. In some embodiments, the device may obtain a set of images of the first window during the first program's running and identify the image features of each image in the set, thereby determining the image feature set corresponding to the first window (i.e., building a high-confidence understanding of the first program's content); accordingly, the device may recognize the first image and the second image based on that feature set, improving the accuracy of determining the difference between them and, in turn, the accuracy of determining the first program's data processing capability.
In the embodiments of this application, the electronic device may obtain the first data type requested by the first program, provide second target data of the first data type to the first program, obtain the second processing result fed back by the first program based on the second target data, and then determine, from the second target data and the second processing result, the data processing capability the first program possesses, improving the reliability of determining the first program's data processing capability.
Referring to FIG. 27, which is a flowchart of a method for obtaining the first program's data processing capability, provided by an embodiment of this application. It should be noted that the method is not limited to the specific order shown in FIG. 27 and described below; it should be understood that, in other embodiments, the order of some steps may be interchanged as actually required, and some steps may be omitted or deleted. The method includes the following steps:
S2701: The electronic device displays a first human-computer interaction interface, the first human-computer interaction interface including a first interactive object, the first interactive object corresponding to a first program.
S2702: When the electronic device receives a first trigger operation, the electronic device determines the first program's data processing capability.
In some embodiments, S2702 may be omitted.
S2703: The electronic device displays a second human-computer interaction interface, the second human-computer interaction interface including first prompt information corresponding to the first program, the first prompt information being used to indicate a second program related to the first program, the second program being used to provide the first program with data of a first data type, the first data type being the data type on which the first program's data processing depends.
In some embodiments, the second human-computer interaction interface further includes second prompt information, the second prompt information being used to indicate at least one of the first program's data processing capability and the first data type.
S2704: When the electronic device receives a second trigger operation via the second human-computer interaction interface, it displays a third human-computer interaction interface, the third human-computer interaction interface including a first processing result of data processing performed by the first program based on first target data, where the first target data is data of the first data type belonging to the second program.
For the manner in which the electronic device performs S2701 and S2703-S2704, refer to the relevant descriptions in S401-S403; for the manner in which it performs S2702, refer to the relevant descriptions in S2601-S2604; details are not repeated here.
In some embodiments, S2704 may be omitted.
Based on the same inventive concept, an embodiment of this application further provides an electronic device, including: a memory and a processor, the memory being configured to store a computer program, and the processor being configured to execute, when invoking the computer program, the method described in the above method embodiments.
The electronic device provided by this embodiment can execute the above method embodiments; its implementation principle and technical effect are similar and are not repeated here.
Based on the same inventive concept, an embodiment of this application further provides a chip system. The chip system includes a processor coupled to a memory; the processor executes the computer program stored in the memory to implement the method described in the above method embodiments.
The chip system may be a single chip, or a chip module composed of multiple chips.
An embodiment of this application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the method described in the above method embodiments is implemented.
An embodiment of this application further provides a computer program product; when the computer program product runs on an electronic device, it causes the electronic device to implement the method described in the above method embodiments.
If the above integrated units are implemented as software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, this application may implement all or part of the flows of the above method embodiments by instructing relevant hardware through a computer program, which may be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable storage medium may include at least: any entity or apparatus capable of carrying the computer program code to the photographing apparatus/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk or an optical disc.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not detailed or recorded in one embodiment, refer to the relevant descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of this application.
In the embodiments provided in this application, it should be understood that the disclosed apparatus/device and method may be implemented in other ways. For example, the apparatus/device embodiments described above are merely illustrative; for example, the division into modules or units is merely a logical functional division, and in actual implementation there may be other ways of division; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.
It should be understood that, when used in the specification and appended claims of this application, the term "include" indicates the presence of the described features, wholes, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components and/or collections thereof.
It should also be understood that the term "and/or" used in the specification and appended claims of this application refers to any combination and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in the specification and appended claims of this application, the term "if" may be interpreted, depending on context, as "when" or "once" or "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on context, to mean "once it is determined" or "in response to determining" or "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In addition, in the description of the specification and appended claims of this application, the terms "first", "second", "third", etc. are used only to distinguish descriptions and cannot be understood as indicating or implying relative importance.
Reference in this specification to "one embodiment" or "some embodiments" and the like means that a particular feature, structure or characteristic described in connection with that embodiment is included in one or more embodiments of this application. Thus, the statements "in one embodiment", "in some embodiments", "in some other embodiments", "in still other embodiments", etc. appearing in different places in this specification do not necessarily all refer to the same embodiment, but mean "one or more but not all embodiments", unless otherwise specifically emphasized. The terms "include", "comprise", "have" and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions recorded in the foregoing embodiments, or equivalently replace some or all of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of this application.

Claims (14)

  1. A human-computer interaction method, characterized by comprising:
    displaying a first human-computer interaction interface, the first human-computer interaction interface comprising a first interactive object, the first interactive object corresponding to a first program;
    when a first trigger operation is received, displaying a second human-computer interaction interface, the second human-computer interaction interface comprising first prompt information corresponding to the first program, the first prompt information being used to indicate a second program related to the first program, the second program being used to provide the first program with data of a first data type, the first data type being the data type on which the first program's data processing depends.
  2. The method according to claim 1, characterized in that, if multiple first data types are comprised, the first prompt information is specifically used to indicate the corresponding second program separately based on each first data type.
  3. The method according to claim 1 or 2, characterized in that the second human-computer interaction interface further comprises second prompt information, the second prompt information being used to indicate at least one of the data processing capability of the first program and the first data type.
  4. The method according to claim 3, characterized in that displaying the second human-computer interaction interface comprises:
    highlighting the first prompt information and the second prompt information.
  5. The method according to any one of claims 1-4, characterized in that the first interactive object comprises at least one of a first window and a first control.
  6. The method according to any one of claims 1-5, characterized in that the first prompt information is specifically used to indicate a second interactive object corresponding to the second program.
  7. The method according to claim 6, characterized in that the second interactive object comprises at least one of a second window and a second control.
  8. The method according to any one of claims 1-7, characterized in that the method further comprises:
    when a second trigger operation is received via the second human-computer interaction interface, displaying a third human-computer interaction interface, the third human-computer interaction interface comprising a first processing result of data processing performed by the first program based on first target data, wherein the first target data is data of the first data type belonging to the second program.
  9. The method according to claim 8, characterized in that the second trigger operation comprises a click operation on a second interactive object; or,
    the second trigger operation comprises a drag operation that drags the second interactive object toward the first interactive object; or,
    the second trigger operation comprises a drag operation that drags the second interactive object to the region where the first interactive object is located;
    wherein the first interactive object corresponds to the first program, and the second interactive object corresponds to the second program.
  10. The method according to any one of claims 1-9, characterized in that the method further comprises:
    obtaining the first data type requested by the first program;
    obtaining second target data matching the first data type;
    providing the second target data to the first program;
    obtaining a second processing result fed back by the first program based on the second target data;
    determining the data processing capability of the first program based on the second target data and the second processing result.
  11. The method according to any one of claims 1-10, characterized in that the first program and the second program are both programs running in the foreground.
  12. An electronic device, characterized by comprising: a memory and a processor, the memory being configured to store a computer program, and the processor being configured, when invoking the computer program, to cause the electronic device to execute the method according to any one of claims 1-11.
  13. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the method according to any one of claims 1-11.
  14. A computer program product, characterized in that, when the computer program product runs on an electronic device, it causes the electronic device to execute the method according to any one of claims 1 to 11.
PCT/CN2023/096230 2022-05-31 2023-05-25 人机交互的方法及电子设备 WO2023231884A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210610171.0A CN117193514A (zh) 2022-05-31 2022-05-31 人机交互的方法及电子设备
CN202210610171.0 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023231884A1 true WO2023231884A1 (zh) 2023-12-07

Family

ID=89002187

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/096230 WO2023231884A1 (zh) 2022-05-31 2023-05-25 人机交互的方法及电子设备

Country Status (2)

Country Link
CN (1) CN117193514A (zh)
WO (1) WO2023231884A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150083743A (ko) * 2014-01-10 2015-07-20 엘지전자 주식회사 전자 기기 및 그 제어 방법
US20180196584A1 (en) * 2017-01-09 2018-07-12 Alibaba Group Holding Limited Execution of multiple applications on a device
US20180217864A1 (en) * 2017-02-02 2018-08-02 Samsung Electronics Co., Ltd Method and apparatus for managing content across applications
CN109246464A (zh) * 2018-08-22 2019-01-18 Oppo广东移动通信有限公司 用户界面显示方法、装置、终端及存储介质
CN109933446A (zh) * 2019-03-18 2019-06-25 Oppo广东移动通信有限公司 电子设备中跨应用程序的数据传输控制方法和装置
CN113297406A (zh) * 2021-04-30 2021-08-24 阿里巴巴新加坡控股有限公司 图片搜索方法、系统及电子设备
CN113885746A (zh) * 2021-09-16 2022-01-04 维沃移动通信有限公司 消息发送方法、装置及电子设备

Also Published As

Publication number Publication date
CN117193514A (zh) 2023-12-08

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23815067

Country of ref document: EP

Kind code of ref document: A1