CN115543167A - Interface interaction method and device - Google Patents
- Publication number: CN115543167A
- Application number: CN202110721566.3A
- Authority: CN (China)
- Prior art keywords: information, application, application interface, interface information, user
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application aims to provide an interface interaction method and device. The method specifically comprises the following steps: presenting, through a display device of a head-mounted device, current application interface information of the application currently being used by a user; and acquiring an interface interaction operation of the user on the current application interface information and, based on that operation, presenting a plurality of pieces of application interface information on the screen while the current application interface information is presented, where the plurality of pieces include application interface information on the upper and lower sides and on the left and right sides of the current application interface information. In this way the user can switch applications directly within the screen, without first returning from the current application to the main interface and then locating and clicking the target application, which greatly simplifies the interaction and improves the user's operating experience.
Description
Technical Field
The present application relates to the field of communications, and in particular to interface interaction technology.
Background
With the rapid development of computer and image processing technologies, various head-mounted display devices have been developed and applied in many fields such as the military, navigation, electronic games, entertainment, and media. Augmented Reality (AR) glasses and Virtual Reality (VR) glasses are typical head-mounted display devices that can "seamlessly" integrate real-world and virtual-world information and display them to a user, thereby achieving a sensory experience beyond reality. However, the interaction interface of existing head-mounted devices (augmented reality or virtual reality devices) is not friendly: the interaction steps are too cumbersome and operation is inconvenient for the user, which degrades the use experience.
Disclosure of Invention
An object of the present application is to provide an interface interaction method and device.
According to an aspect of the present application, there is provided an interface interaction method applied to a head-mounted device, the method including:
presenting, by a display device of a head-mounted device, current application interface information of a current application being used by a user;
and acquiring an interface interaction operation of the user on the current application interface information and, based on the interface interaction operation, presenting a plurality of pieces of application interface information on the screen through the display device while the current application interface information is presented, wherein the plurality of pieces of application interface information include application interface information on the upper and lower sides and on the left and right sides of the current application interface information.
According to another aspect of the present application, there is provided a head-mounted device for interface interaction, wherein the device comprises:
a first module for presenting, via a display device of the head-mounted device, current application interface information of the application currently being used by a user;
and a second module for acquiring an interface interaction operation of the user on the current application interface information and, based on the interface interaction operation, presenting a plurality of pieces of application interface information on the screen through the display device while the current application interface information is presented, wherein the plurality of pieces of application interface information include application interface information on the upper and lower sides and on the left and right sides of the current application interface information.
According to an aspect of the present application, there is provided an interface interaction apparatus, wherein the apparatus includes:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the operations of any of the methods described above.
According to one aspect of the application, there is provided a computer-readable medium storing instructions that, when executed, cause a system to perform the operations of any of the methods described above.
Compared with the prior art, the present application acquires the user's interface interaction operation on the current application interface information and, based on that operation, presents a plurality of pieces of application interface information on the screen through the display device while the current application interface information is still presented. This allows the user to switch applications directly within the screen, without first returning from the current application to the main interface and then locating and clicking the target application, so the interaction is greatly simplified and the user's operating experience is improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
FIG. 1 illustrates a method flow diagram of a method of interface interaction according to one embodiment of the present application;
FIG. 2 illustrates an application distribution presentation diagram of an interface interaction according to another embodiment of the present application;
FIG. 3 illustrates an interaction example diagram of a functionality application according to one embodiment of the present application;
FIG. 4 illustrates an application distribution presentation diagram of an interface interaction according to one embodiment of the present application;
FIG. 5 illustrates an application distribution presentation diagram of an interface interaction according to another embodiment of the present application;
FIG. 6 illustrates functional modules of an interface interaction headset 100 according to one embodiment of the present application;
FIG. 7 illustrates an exemplary system that can be used to implement the various embodiments described in this application.
The same or similar reference numbers in the drawings identify the same or similar elements.
Detailed Description
The present application is described in further detail below with reference to the attached figures.
In a typical configuration of the present application, the terminal, the device serving the network, and the trusted party each include one or more processors (e.g., Central Processing Units (CPUs)), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory. Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, Phase-Change Memory (PCM), Programmable Random Access Memory (PRAM), Static Random-Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The device referred to in this application includes, but is not limited to, a user device, a network device, or a device formed by integrating a user device and a network device through a network. The user device includes, but is not limited to, any mobile electronic product capable of human-computer interaction with a user, such as a smartphone, a tablet computer, or a head-mounted device, and the mobile electronic product may run any operating system, such as the Android or iOS operating system. The network device includes electronic devices capable of automatically performing numerical calculation and information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Programmable Logic Device (PLD), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like. The network device includes, but is not limited to, a computer, a network host, a single network server, a set of network servers, or a cloud of servers; here, the cloud is composed of a large number of computers or network servers based on cloud computing, a kind of distributed computing in which a collection of loosely coupled computers forms one virtual supercomputer. The network includes, but is not limited to, the Internet, a wide area network, a metropolitan area network, a local area network, a VPN, a wireless ad hoc network, etc. Preferably, the device may also be a program running on the user device, the network device, or a device formed by integrating the user device with the network device, the touch terminal, or the network device with the touch terminal through a network.
Of course, those skilled in the art will appreciate that the foregoing is by way of example only, and that other existing or future devices, which may be suitable for use in the present application, are also encompassed within the scope of the present application and are hereby incorporated by reference.
In the description of the present application, "a plurality" means two or more unless specifically defined otherwise.
Fig. 1 illustrates an interface interaction method according to an aspect of the present application, wherein the method is applied to a head-mounted device and includes step S101 and step S102. In step S101, current application interface information of the application currently being used by a user is presented through a display device of the head-mounted device; in step S102, an interface interaction operation of the user on the current application interface information is obtained, and, based on the interface interaction operation, a plurality of pieces of application interface information are presented on the screen through the display device while the current application interface information is presented, where the plurality of pieces of application interface information include application interface information on the upper and lower sides and on the left and right sides of the current application interface information. Here, the head-mounted device includes, but is not limited to, augmented reality glasses, virtual reality glasses, mixed reality glasses, augmented reality helmets, virtual reality helmets, mixed reality helmets, and the like. The head-mounted device comprises an acquisition device for acquiring head motion information of the user; the acquisition device includes, but is not limited to, a three-axis sensor, an inertial measurement unit, a gyroscope, a 6DoF tracking unit, and the like. The head-mounted device further comprises a data processing device for processing, storing, transmitting, and retrieving data, and a display device for presenting the application interface information of applications and the like, such as a liquid crystal display or an optical display.
Specifically, in step S101, current application interface information of the application currently being used by the user is presented through the display device of the head-mounted device. For example, the user wears the head-mounted device, the device is in use, and a plurality of applications are installed on it, such as built-in functional applications or third-party applications. The head-mounted device starts the corresponding application based on an application starting operation of the user (such as a direct selection operation, or a selection-and-confirmation operation), and treats the application currently used by the user as the current application. The head-mounted device presents the application interface information of the current application through the display device, where the application interface information comprises the application's interactive interface after it is started. Here, when the head-mounted device is started, the presentation position of its interface is determined to be in front of the user; for example, the front of the user's field of view is taken as the screen center, and each interface, such as the main interface of the head-mounted device or the application interface information after starting, is presented in the screen. The corresponding application interface information and the main interface both take the screen center as a reference to adapt to the current screen; for example, the application interface information and the main interface are both presented in front of the user at the same aspect ratio (e.g., 16:9). The presentation scale, spatial position area, and the like of each interface are not limited here.
In some embodiments, a gaze point is set at a fixed location in the screen, the location of the gaze point in the screen being unchanged, such as a gaze point set at the center of the screen. And the control in the interface acquires the focus through the fixation point, for example, the position of the interface is dynamically moved through head movement, and when the control in the interface moves to the fixation point, the control acquires the focus.
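The fixed-gaze-point focus mechanism above can be sketched as follows. This is a minimal illustration, not the patented implementation: the `Control` type, the normalised screen coordinates, and the centre gaze point are all assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class Control:
    """A rectangular interface control in normalised screen coordinates."""
    name: str
    x: float  # left edge
    y: float  # top edge
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

def focused_control(controls, gaze=(0.5, 0.5)):
    """Return the control currently under the fixed gaze point, if any.

    The gaze point stays at a fixed screen position (here the centre);
    head movement moves the interface, not the gaze point, so a control
    acquires focus when its rectangle is moved under the gaze point.
    """
    for control in controls:
        if control.contains(*gaze):
            return control
    return None
```

After a head movement shifts the interface so that a control's rectangle covers the screen centre, `focused_control` reports that control as having acquired focus.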
In step S102, an interface interaction operation of the user on the current application interface information is obtained, and, based on the interface interaction operation, a plurality of pieces of application interface information are presented on the screen through the display device while the current application interface information is presented, where the plurality of pieces of application interface information include application interface information on the upper and lower sides and on the left and right sides of the current application interface information.
For example, the head-mounted device collects the user's interface interaction operation, such as head motion information, voice information, key information, touch information, or gesture information, where the head motion information includes any of a head motion angle, distance, speed, and orientation. If the collected head motion, voice, key, touch, or gesture information is similar to a preset interface interaction operation (e.g., the similarity exceeds a certain threshold), the user operation is determined to be the interface interaction operation, and the head-mounted device starts the corresponding interface interaction mode. In this mode, the head-mounted device displays not only the current application interface information of the current application but also a plurality of pieces of application interface information on the screen simultaneously, distributed on the upper, lower, left, and right sides of the current application's interface information. In some embodiments, the application attributes of the applications on the upper and lower sides are the same, the application attributes of the applications on the left and right sides are the same, and the attributes of the upper/lower applications differ from those of the left/right applications; as shown in Fig. 2, the applications on the upper and lower sides are functional applications (a control application and a shortcut command application, respectively), and the applications on the left and right sides are other installed applications, such as third-party applications (an auxiliary application and another application, respectively).
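The similarity test for recognising a preset interaction can be sketched for the voice case with a simple string-similarity measure. This is only an illustration under stated assumptions: the trigger phrases and the 0.8 threshold are invented for the example, and a real system would compare gesture or motion features, not just text.

```python
from difflib import SequenceMatcher

# Hypothetical preset trigger phrases; the patent does not fix concrete values.
PRESET_TRIGGERS = ["open side applications", "show all apps"]
SIMILARITY_THRESHOLD = 0.8  # assumed value, not specified in the text

def is_interface_interaction(recognised_text: str) -> bool:
    """Treat the user's input as the interface interaction operation when
    it is sufficiently similar to any preset trigger phrase."""
    return any(
        SequenceMatcher(None, recognised_text.lower(), t).ratio() >= SIMILARITY_THRESHOLD
        for t in PRESET_TRIGGERS
    )
```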
In other embodiments, the application attributes of the upper/lower and left/right sides are not restricted. The upper- and lower-side applications and the left- and right-side applications are displayed beside the application interface information of the current application, and the specific display size is not limited; they may adjoin the interface boundary of the current application's interface, partially overlap the current application's interface, or be separated from its boundary by a preset distance.
In some cases, the upper- and lower-side applications and the left- and right-side applications may be presented directly and simultaneously around the application interface information of the current application. In other cases, they may instead be hidden around the current application's interface information; for example, the side applications are presented with different transparencies (e.g., 0-100%) according to different requirements, and when the user's gaze point moves outside the interface range of the current application's interface information, or into an interface overlapping area, the side application closest to the gaze point's drop point or drop-point range is revealed, so that the corresponding side application appears on the current screen.
In some cases, the upper- and lower-side applications and the left- and right-side applications may be directly and simultaneously presented around the current application's interface information, with all side applications and the current application distributed at different spatial positions of the current screen. In other cases, the side applications may be only partially presented around the current application's interface information, such as presenting a certain proportion of a side application's interface near its border to the current screen according to the screen size, or presenting only the part of a side application that overlaps the current application. In still other cases, the side applications may be presented in the screen sequentially according to user operations; for example, the current application interface is presented in the screen, the user rotates the head, and the application interfaces of the side applications are presented on the screen in sequence.
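The four-sided layout described above can be sketched as a small geometry helper. The panel sizes, the (x, y, w, h) rectangle convention with y increasing downward, and the `gap` parameter (positive for separation, zero for adjoining edges, negative for partial overlap, matching the three layouts discussed) are illustrative assumptions, not the patented layout.

```python
def side_application_rects(cur, gap=0.0):
    """Place four side-application panels around the current interface.

    `cur` is the (x, y, w, h) rectangle of the current application
    interface. `gap` > 0 separates the panels from it, 0 makes them
    adjoin its boundary, and a negative value makes them partially
    overlap it. Each panel here is the same size as the current
    interface purely for simplicity.
    """
    x, y, w, h = cur
    return {
        "top":    (x, y - h - gap, w, h),
        "bottom": (x, y + h + gap, w, h),
        "left":   (x - w - gap, y, w, h),
        "right":  (x + w + gap, y, w, h),
    }
```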
In some embodiments, the side application interface information corresponding to the upper and lower sides or the left and right sides includes application interface information corresponding to a functional application. For example, the functional applications include function setting applications for the head-mounted device's own parameters, such as a power setting application, a network connection application, a screen brightness adjustment application, a flashlight application, a sound adjustment application, a head-control mode switching application, or an external device connection application. The application interface information of the head-mounted device in the transverse or longitudinal dimension includes application interface information corresponding to functional applications; in other words, the application interface information on one pair of sides (upper/lower or left/right) is that of functional applications, while the other pair of sides can display application interface information of other applications, of applications related to the current application, and the like. As shown in Fig. 2, application interface information of functional applications is displayed on the upper and lower sides of the current application's interface, and application interface information of an auxiliary application or another application is displayed on the left and right sides.
Of course, the application interface information of the auxiliary application or other applications may be displayed on the upper and lower sides of the application interface of the current application, and the application interface information of the functional application may be displayed on the left and right sides of the application interface of the current application.
In some embodiments, the functional applications include a parameter setting application for accessing a parameter setting function of the head-mounted device and a shortcut instruction application for triggering and generating shortcut instruction information. For example, the parameter setting application sets device-related parameters of the head-mounted device itself, and the shortcut instruction application triggers and generates shortcut instruction information, including but not limited to operation instruction information determined from a head action, voice, key press, touch, or gesture. The shortcut instruction information includes device-related shortcut instructions, such as operation instructions frequently used by the user, e.g., taking a photo, recording, scanning, calling xx, checking call records, or opening an album. It may also include application instructions related to the current application, such as default operation instructions frequently used by the user or provided by the current application, e.g., logging out, switching accounts, going to the next step, returning to the previous step, clicking, saving, or exiting the interface interaction mode. These shortcut instructions are merely examples and are not limiting.
The application interface information of the functional applications is presented on the upper and lower sides of the current application's interface information; for example, parameter setting applications for the wireless network, flashlight, sound, screen brightness, head-control mode, and other device functions are presented on the upper side. The parameter setting applications include functional applications that support user interaction, for example setting the wireless network or adjusting the sound, brightness, head-control mode, or flashlight. A parameter setting application may also be a functional application that only displays parameter-related information, such as a display application showing the current battery level or the current time. In some cases such display applications have no interactive function at all, while in others they carry some interaction; for example, a time display application that mainly displays the time may also include a regional time-setting function.
In some embodiments, the application interface information of the parameter setting application includes parameter identification information; the method further comprises a step S103 (not shown). In step S103, head movement information of the user is obtained, and gaze position information of the user's gaze point in the interface is determined according to the head movement information; if the gaze position information falls within the identification range of certain parameter identification information, a parameter setting function corresponding to that parameter identification information is presented. For example, the parameter setting application of the head-mounted device includes at least one piece of parameter identification information, where part of the parameter identification information is used to display the related functional application, and part is used to identify a parameter setting entry for interacting with the user. For parameter identification information corresponding to a user-interactive parameter setting entry, the user can align the gaze point with the corresponding parameter identification information by moving the head, so as to access the parameter setting page of that parameter identification information. Specifically, Fig. 3 shows a parameter setting page corresponding to each piece of parameter identification information. The head-mounted device obtains the user's head action information and dynamically moves the user interface presented in the screen according to that information. The gaze point is set at a fixed position in the screen, e.g., at the center, and its position in the screen does not change; when the user interface presented in the screen is dynamically moved, the gaze position information of the gaze point in the interface changes. If the gaze position information falls within the identification range corresponding to certain parameter identification information, that parameter identification information is determined to be selected by the user, and a setting page or further functions of the parameter setting application can be displayed, such as a setting page with increase, decrease, or mute functions, or more detailed parameter-related information. Specifically, in some embodiments, a rotation vector sensor is implemented via the inertial measurement unit of the head-mounted device to collect the user's head motion information; the Euler angles are calculated from the output of the rotation vector sensor, and the position of the user interface is then dynamically moved according to the change of the X- and Y-direction angles.
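The angle-to-offset step can be sketched as follows. Rotation vector sensors typically report a unit quaternion; the ZYX (yaw-pitch-roll) convention below and the `px_per_rad` sensitivity constant are assumptions for illustration, since axis conventions vary between devices.

```python
import math

def quaternion_to_yaw_pitch(w, x, y, z):
    """Yaw and pitch (radians) from a unit rotation-vector quaternion,
    using a common aerospace ZYX convention (an assumed convention)."""
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    # Clamp to avoid domain errors from floating-point noise.
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    return yaw, pitch

def interface_offset(yaw, pitch, ref_yaw, ref_pitch, px_per_rad=800.0):
    """Translate the change in X- and Y-direction angles since a
    reference pose into an on-screen interface offset in pixels.
    `px_per_rad` is a hypothetical sensitivity constant."""
    return ((yaw - ref_yaw) * px_per_rad, (pitch - ref_pitch) * px_per_rad)
```

Each new sensor sample would update `(yaw, pitch)` and the interface would be redrawn shifted by `interface_offset` relative to the pose captured when the interaction mode was entered.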
In some embodiments, the shortcut instruction application comprises shortcut instruction information to be triggered; the method further includes a step S104 (not shown). In step S104, head motion information of the user is acquired, and gaze position information of the user's gaze point in the interface is determined according to the head motion information; if the gaze position information falls within the trigger range of certain to-be-triggered shortcut instruction information, that to-be-triggered shortcut instruction information is determined to be the corresponding shortcut instruction information and is executed. For example, the shortcut instruction application contains to-be-triggered shortcut instruction information that is triggered based on a selection operation of the user. Specifically, at least one piece of to-be-triggered shortcut instruction information is presented in the application interface information to guide the user to move the gaze point to the corresponding position and trigger the corresponding shortcut instruction. The head-mounted device obtains the user's head action information and determines the corresponding gaze position information from it; if the gaze position information falls within the trigger range of certain to-be-triggered shortcut instruction information, it is determined that the user has selected that information, and the shortcut instruction corresponding to the selected to-be-triggered shortcut instruction information is started.
In some embodiments, determining and executing the corresponding shortcut instruction information if the gaze position information is within the trigger range of certain to-be-triggered shortcut instruction information includes: if the gaze position information is within the trigger range of certain to-be-triggered shortcut instruction information and a trigger confirmation operation of the user is obtained, determining that information as the corresponding shortcut instruction information and executing it. For example, determining and executing shortcut instruction information by gaze position information alone may produce large errors, such as false triggers during head movement; the head-mounted device may therefore determine the corresponding shortcut instruction information based on both the selection and a confirmation operation of the user. For example, the head-mounted device determines the to-be-triggered shortcut instruction information selected by the user based on the user's gaze position information, and if a selection confirmation operation of the user with respect to the selected information (such as a head nod, a voice input confirmation, or a touch pad click confirmation) is obtained, the head-mounted device executes the corresponding shortcut instruction information.
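The two-stage gating just described (gaze selection plus explicit confirmation) can be sketched as below. The function name, the rectangle encoding, and the `confirm` callback are illustrative assumptions; in practice the confirmation would come from a nod detector, speech recognizer, or touch pad event.

```python
def execute_shortcut(gaze, trigger_ranges, confirm):
    """Return the shortcut to execute only when the gaze is inside its
    trigger range AND a confirmation has been received - guarding against
    false triggers while the head is still moving."""
    gx, gy = gaze
    for shortcut, (x0, y0, x1, y1) in trigger_ranges.items():
        if x0 <= gx <= x1 and y0 <= gy <= y1:
            if confirm():          # confirmation gate (nod / voice / tap)
                return shortcut    # dispatch the selected shortcut
            return None            # selected but not yet confirmed
    return None                    # gaze is outside every trigger range
```

Without the `confirm` gate, merely sweeping the gaze across a trigger range would fire the shortcut, which is exactly the error mode the paragraph above warns about.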
In some embodiments, the shortcut instruction application includes voice text prompt information corresponding to voice instruction information; the method further includes step S105 (not shown). In step S105, the voice text prompt information is presented in the application interface information corresponding to the shortcut instruction application through the display device; if voice information of the user is acquired, corresponding user voice instruction information is determined and executed according to the voice information. For example, the voice text prompt information prompts the user to speak the corresponding text in order to trigger the corresponding voice instruction information. The head-mounted device presents the voice text prompt information corresponding to the voice instruction information in the shortcut instruction application; if the head-mounted device acquires voice input from the user through an acquisition device (such as a microphone), it performs voice recognition on that input, determines the corresponding voice text information, and determines the corresponding voice instruction information based on that text. Alternatively, the voice text information is matched against the voice text prompt information, and if identical voice text prompt information is matched, the head-mounted device executes the voice instruction information corresponding to that prompt.
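The prompt-matching branch can be sketched as a simple normalized lookup. The mapping keys and command names here are hypothetical; a real system would receive `recognized_text` from a speech recognizer.

```python
def match_voice_command(recognized_text, prompt_to_command):
    """Map recognized speech to a voice instruction by matching it
    (ignoring case and surrounding whitespace) against the displayed
    voice text prompts; return None if no prompt matches."""
    norm = recognized_text.strip().lower()
    for prompt, command in prompt_to_command.items():
        if prompt.strip().lower() == norm:
            return command
    return None
```

Matching against the displayed prompts rather than free-form parsing keeps the vocabulary closed, which makes recognition on-device both simpler and more reliable.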
In some embodiments, whichever of the upper/lower side application interface information and the left/right side application interface information does not correspond to the functional application includes application interface information corresponding to other applications. For example, if the application interface information of the functional application is presented on the upper and lower sides of the current application's interface, the application interface information corresponding to other applications is presented on its left and right sides; if the functional application's interface is presented on the left and right sides, the other applications' interfaces are presented on the upper and lower sides. In other embodiments, the left/right sides or the upper/lower sides of the current application's interface may each present both the functional application's interface and other applications' interfaces, which is not limited herein.
In some embodiments, the other applications include, but are not limited to: an associated application of the current application; a successor application of the current application; a background application currently running in the background of the head-mounted device; an application installed in the head-mounted device. For example, an associated application may be an application developed by the same service provider, an application from a service provider with a cooperation relationship, or an application with a business association with the current application. A successor application is an application that is subsequently run based on the current application, for example based on the current application's running data. A background application currently running in the background can be presented simultaneously with the current application, enabling convenient application switching. An installed application, whether bundled with the device or installed as a third-party application, allows the user to start the corresponding application.
In some embodiments, the method further includes step S106 (not shown). In step S106, head motion information of the user is acquired, and gaze position information of the user's gaze point in an interface is determined according to the head motion information; if the gaze position information is within the display range of the application interface information of another application currently selected from among the other applications, the application interface information of the applications other than the selected one is closed. For example, the application interface information corresponding to another application may present the application identification information of that application (such as an application icon) or its running interface, and the head-mounted device may switch applications based on a selection operation of the user. If the head-mounted device acquires head motion information of the user, it determines the user's gaze position information based on it; if the gaze position information is within the interface range corresponding to another application's interface information, the head-mounted device determines that the user has selected that application, presents its application interface information, and closes the application interface information of the other applications.
Here, the other applications described above are used for application switching, whereas the functional application is used only for setting functions or inputting instructions and does not perform application switching. In other words, even if the gaze position information falls within the application interface of the functional application, the head-mounted device does not close the application interface information of applications other than the functional application based on the movement of the gaze position.
In some embodiments, closing the application interface information of the applications other than the selected other application includes: if a confirmation operation of the user with respect to the selected other application is obtained, closing the application interface information of the applications other than the selected one. For example, performing application switching by gaze position information alone may produce large errors, such as false triggers during head movement; the head-mounted device may therefore switch to another application's interface based on both the selection and a confirmation operation of the user. For example, the head-mounted device determines the other application selected by the user based on the user's gaze position information, and if a selection confirmation operation of the user with respect to that application (such as a head nod, a voice input confirmation, or a touch pad click confirmation) is obtained, the head-mounted device closes the application interface information of the applications other than that one and presents its application interface information. In some embodiments, if the head-mounted device determines the other application selected by the user based on the gaze position information but does not acquire a selection confirmation operation, the application interface information of the other applications is not closed; the user may then move the gaze position again via head motion to select a different application's interface.
Further, in some embodiments, if the head-mounted device determines the other application selected by the user based on the gaze position information but does not acquire a selection confirmation operation, the functional application's interface does not move with the user's selection: for example, when the other application on the left side is selected, the presentation positions of the functional applications do not change, and they are still presented on the upper and lower sides of the current application.
In some embodiments, closing the application interface information of the applications other than the selected other application if the user's confirmation operation is obtained includes: closing the application interface information of the applications other than the selected one, and presenting the selected other application at the display position of the current application. For example, the presentation position of the current application is directly in front of the user. If the head-mounted device obtains the user's confirmation of the selected other application, it closes the application interface information of the other applications and presents the selected application at the display position corresponding to the current application, for example directly in front of the user, so that the user can conveniently operate it. This provides a user-friendly interactive scene and enables quick switching between different applications.
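The select-confirm-switch flow above can be sketched with a small state holder. The class and position names are invented for illustration; the point is that confirmation, not selection alone, closes the other panels and moves the chosen application to the front position.

```python
class InterfaceManager:
    """Sketch of gaze-driven app switching: confirming a gazed-at side app
    closes every other panel and moves the chosen app to the front."""

    def __init__(self, current, side_apps):
        # app name -> position ("front", "left", "right", ...)
        self.panels = {current: "front", **side_apps}
        self.selected = None

    def select(self, app):
        """Gaze entered this app's display range (no panels closed yet)."""
        self.selected = app if app in self.panels else None

    def confirm(self):
        """User confirmation: close all other panels, bring app to front."""
        if self.selected is None:
            return None
        chosen = self.selected
        self.panels = {chosen: "front"}
        self.selected = None
        return chosen
```

Note that `select` alone changes nothing visible, matching the behavior where an unconfirmed gaze selection leaves all interfaces open.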
In some embodiments, the method further includes step S107 (not shown). In step S107, an interface interaction operation of the user with respect to the application interface information of the other application is acquired, and a plurality of pieces of other application interface information are presented in the screen through the display device while the application interface information of the other application is presented based on the interface interaction operation; wherein the plurality of other application interface information includes any one of: upper side and/or lower side application interface information of the application interface information of the other application; and left side and/or right side application interface information of the application interface information of the other application. For example, the application interface information of the functional application does not move with the user's selection confirmation operation with respect to the other application, but is closed with the application switching operation. If the head-mounted device acquires the user's selection confirmation operation with respect to another application, the application interface information of the applications other than the selected one is closed.
In some cases, the user may again wake up the interface interaction mode corresponding to the selected other application. In that mode, in some embodiments, a plurality of pieces of application interface information may be displayed on the upper/lower sides and left/right sides of the selected application: for example, functional applications on the upper and lower sides with applications to be switched on the left and right sides, or functional applications on the left and right sides with applications to be switched on the upper and lower sides. The applications to be switched may be applications associated with the selected other application, applications installed on the headset, or the current application together with the other applications that were not selected. In other embodiments, the application interface information may be displayed on the upper/lower sides plus only the left or right side (for example, functional applications on the upper and lower sides and applications to be switched on the left or right side), or on the left/right sides plus only the upper or lower side (for example, functional applications on the left and right sides and applications to be switched on the upper or lower side). Further, the applications to be switched are displayed on the left/right or upper/lower side according to the presentation sequence of the interface interaction mode of the current application: the application interface information of the current application is displayed toward the side from which the other application was selected, and the application interface information of the remaining associated or other applications is displayed on the far side of the current application.
In some embodiments, the plurality of other application interface information includes the current application interface information of the current application, and further includes application interface information corresponding to the functional application. For example, as shown in fig. 2, the application interface information of the functional application is displayed on the upper and lower sides of the current application's interface, and the application interface information of other applications is displayed on its left and right sides. After the user selects and confirms the other application on the left, the head-mounted device closes the application interface information of the applications other than that one and presents its application interface information. After the user again wakes up the interface interaction mode corresponding to the selected application on the left, referring to fig. 4, while the application interface of that application is presented, the functional applications are presented on its upper and lower sides, and the current application and the other application formerly on the right are presented in sequence on its right side. Similarly, based on fig. 2, when the user performs the selection confirmation operation on the other application on the right, the head-mounted device closes the application interface information of the applications other than that one and presents its application interface information. After the user again wakes up the interface interaction mode corresponding to the selected application on the right, as shown in fig. 5, while the application interface of that application is presented, the functional applications are presented on its upper and lower sides, and the current application and the other application formerly on the left are presented in sequence on its left side. In some cases, the functional application moves along with the application operation interface corresponding to the interface interaction operation: when the interaction mode is awakened, the functional applications are presented on the upper, lower, left, and right sides corresponding to the application operation interface, and the current application and the other applications not selected are presented in sequence on one side according to their arrangement order in the current application's interface interaction mode.
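The layout rebuild described in the figs. 2-5 example can be sketched as a pure function: the newly fronted application keeps the functional applications on the top/bottom sides, while the remaining switchable applications (including the previous current application) line up on one side in their original order. All names here are placeholders.

```python
def interaction_layout(front_app, all_apps, functional=("settings", "shortcuts")):
    """Rebuild the interaction-mode layout around the app now in front.

    all_apps is the original presentation sequence of switchable apps;
    everything except the fronted app is queued on one side, preserving
    that sequence, while functional apps stay on the top/bottom sides."""
    side_apps = [a for a in all_apps if a != front_app]
    return {
        "front": front_app,
        "top": functional[0],
        "bottom": functional[1],
        "side": side_apps,  # presented in sequence on the left or right
    }
```

For instance, after switching from a current app to the app that was on its left, waking the mode again shows the old current app and the unselected right-hand app queued together on one side, exactly as in the figs. 4 and 5 description.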
The foregoing mainly describes various embodiments of an interface interaction method, and in addition, the present application also provides an apparatus capable of implementing the above embodiments, which is described with reference to fig. 6.
Fig. 6 illustrates an interface interaction head-mounted device 100 according to an aspect of the present application, the device including a first module 101 and a second module 102. The first module 101 is configured to present, through a display device of the head-mounted device, current application interface information of a current application being used by a user; the second module 102 is configured to obtain an interface interaction operation of the user with respect to the current application interface information and present, through the display device, a plurality of pieces of application interface information in the screen while presenting the current application interface information based on the interface interaction operation, where the plurality of pieces of application interface information include upper/lower side application interface information and left/right side application interface information of the current application interface information. In some embodiments, either the upper/lower side application interface information or the left/right side application interface information includes application interface information corresponding to a functional application. In some embodiments, the functional applications include a parameter setting application for accessing a parameter setting function of the head-mounted device and a shortcut instruction application for triggering and generating shortcut instruction information.
Here, the specific implementations corresponding to the first module 101 and the second module 102 shown in fig. 6 are the same as or similar to the embodiments of step S101 and step S102 shown in fig. 1, and thus are not repeated here but are incorporated herein by reference.
In some embodiments, the application interface information of the parameter setting application includes parameter identification information; the device further comprises a third module (not shown) for acquiring head motion information of the user and determining gaze position information of the user's gaze point in an interface according to the head motion information; and, if the gaze position information is within the identification range of certain parameter identification information, presenting the parameter setting function corresponding to that parameter identification information.
In some embodiments, the shortcut instruction application includes shortcut instruction information to be triggered; the device further comprises a fourth module (not shown) for acquiring head motion information of the user, and determining gaze position information of a gaze point of the user in an interface according to the head motion information; and if the gaze position information is in the trigger range of a certain shortcut instruction information to be triggered in the shortcut instruction information to be triggered, determining the shortcut instruction information to be triggered as corresponding shortcut instruction information, and executing the shortcut instruction information.
In some embodiments, if the gaze location information is in a trigger range of a shortcut instruction information to be triggered in the shortcut instruction information to be triggered, determining the shortcut instruction information to be triggered as a corresponding shortcut instruction information, and executing the shortcut instruction information includes: and if the gaze position information is in the trigger range of a certain shortcut instruction information to be triggered in the shortcut instruction information to be triggered and the trigger confirmation operation of the user is obtained, determining the shortcut instruction information to be triggered as the corresponding shortcut instruction information and executing the shortcut instruction information.
In some embodiments, the shortcut instruction application includes voice text prompt information corresponding to the voice instruction information; the device further comprises a fifth module (not shown) for presenting the voice text prompt information in the application interface information corresponding to the shortcut instruction application through the display device; and if the voice information of the user is acquired, determining and executing corresponding user voice instruction information according to the voice information.
In some embodiments, whichever of the upper/lower side application interface information and the left/right side application interface information does not correspond to the functional application includes application interface information corresponding to other applications. In some embodiments, the other applications include, but are not limited to: an associated application of the current application; a successor application of the current application; a background application currently running in the background of the head-mounted device; an application installed in the head-mounted device.
In some embodiments, the apparatus further includes a sixth module (not shown) configured to obtain head motion information of the user, and determine gaze position information of a gaze point of the user in an interface according to the head motion information; and if the gaze position information is in the display range of the application interface information of the selected other application currently selected from the other applications, closing the application interface information of the applications except the selected other application.
In some embodiments, the closing application interface information of the application other than the selected other application includes: and if the confirmation operation of the user on the selected other applications is obtained, closing the application interface information of the applications except the selected other applications.
In some embodiments, the closing the application interface information of the application other than the selected other application if the confirmation operation of the user on the selected other application is obtained includes: and if the confirmation operation of the user on the selected other applications is obtained, closing the application interface information of the applications except the selected other applications, and presenting the selected other applications at the display position of the current application.
In some embodiments, the apparatus further includes a seventh module (not shown) configured to obtain an interface interaction operation of the user with respect to the application interface information of the other application, and to present, through the display device, a plurality of pieces of other application interface information in the screen while presenting the application interface information of the other application based on the interface interaction operation; wherein the plurality of other application interface information includes any one of: upper side and/or lower side application interface information of the application interface information of the other application; and left side and/or right side application interface information of the application interface information of the other application. For example, the application interface information of a functional application does not move with the user's selection operation with respect to other applications, and is closed with an interface switching operation. If the head-mounted device acquires the user's selection confirmation operation with respect to another application, the application interface information of the applications other than the selected one is closed.
In some embodiments, the current application interface information of the current application is included in the plurality of other application interface information, and the plurality of other application interface information further includes application interface information corresponding to the functional application.
Here, the specific implementations corresponding to the third through seventh modules are the same as or similar to the embodiments of steps S103 to S107 above, and thus are not repeated here but are incorporated by reference.
In addition to the methods and apparatus described in the embodiments above, the present application also provides a computer readable storage medium storing computer code that, when executed, performs the method as described in any of the preceding claims.
The present application also provides a computer program product, which when executed by a computer device performs the method of any of the preceding claims.
The present application further provides a computer device, comprising:
one or more processors;
a memory for storing one or more computer programs;
the one or more computer programs, when executed by the one or more processors, cause the one or more processors to implement the method as recited in any preceding claim.
FIG. 7 illustrates an exemplary system that can be used to implement the various embodiments described herein.
in some embodiments, as shown in FIG. 7, the system 300 can be implemented as any of the above-described devices in the various embodiments. In some embodiments, system 300 may include one or more computer-readable media (e.g., system memory or NVM/storage 320) having instructions and one or more processors (e.g., processor(s) 305) coupled with the one or more computer-readable media and configured to execute the instructions to implement modules to perform the actions described herein.
For one embodiment, system control module 310 may include any suitable interface controllers to provide any suitable interface to at least one of processor(s) 305 and/or any suitable device or component in communication with system control module 310.
The system control module 310 may include a memory controller module 330 to provide an interface to the system memory 315. Memory controller module 330 may be a hardware module, a software module, and/or a firmware module.
System memory 315 may be used, for example, to load and store data and/or instructions for system 300. For one embodiment, system memory 315 may include any suitable volatile memory, such as suitable DRAM. In some embodiments, the system memory 315 may include a double data rate type four synchronous dynamic random access memory (DDR 4 SDRAM).
For one embodiment, system control module 310 may include one or more input/output (I/O) controllers to provide an interface to NVM/storage 320 and communication interface(s) 325.
For example, NVM/storage 320 may be used to store data and/or instructions. NVM/storage 320 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more Hard Disk Drives (HDDs), one or more Compact Disc (CD) drives, and/or one or more Digital Versatile Disc (DVD) drives).
NVM/storage 320 may include storage resources that are physically part of the device on which system 300 is installed or may be accessed by the device and not necessarily part of the device. For example, NVM/storage 320 may be accessible over a network via communication interface(s) 325.
Communication interface(s) 325 may provide an interface for system 300 to communicate over one or more networks and/or with any other suitable device. System 300 may wirelessly communicate with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols.
For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) (e.g., memory controller module 330) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be packaged together with logic for one or more controller(s) of the system control module 310 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310. For one embodiment, at least one of the processor(s) 305 may be integrated on the same die with logic for one or more controller(s) of the system control module 310 to form a system on chip (SoC).
In various embodiments, system 300 may be, but is not limited to being: a server, a workstation, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, system 300 may have more or fewer components and/or different architectures. For example, in some embodiments, system 300 includes one or more cameras, a keyboard, a Liquid Crystal Display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an Application Specific Integrated Circuit (ASIC), and speakers.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware, for example, as an Application Specific Integrated Circuit (ASIC), a general purpose computer or any other similar hardware device. In one embodiment, the software programs of the present application may be executed by a processor to implement the steps or functions described above. Likewise, the software programs (including associated data structures) of the present application may be stored in a computer readable recording medium, such as RAM memory, magnetic or optical drive or diskette and the like. Additionally, some of the steps or functions of the present application may be implemented in hardware, for example, as circuitry that cooperates with the processor to perform various steps or functions.
In addition, some of the present application may be implemented as a computer program product, such as computer program instructions, which, when executed by a computer, may invoke or provide methods and/or techniques in accordance with the present application through the operation of the computer. Those skilled in the art will appreciate that the forms of computer program instructions that reside on a computer-readable medium include, but are not limited to, source files, executable files, installation package files, and the like, and that the manner in which a computer executes the instructions includes, but is not limited to: the computer executes the instructions directly; the computer compiles the instructions and then executes the resulting compiled program; the computer reads and executes the instructions; or the computer reads and installs the instructions and then executes the resulting installed program. A computer-readable medium herein can be any available computer-readable storage medium or communication medium that can be accessed by a computer.
Communication media include media by which communication signals containing, for example, computer-readable instructions, data structures, program modules, or other data are transmitted from one system to another. Communication media may include conductive transmission media such as cables and wires (e.g., fiber optic, coaxial) and wireless (non-conductive) media capable of propagating energy waves, such as acoustic, electromagnetic, RF, microwave, and infrared. Computer-readable instructions, data structures, program modules, or other data may be embodied in a modulated data signal, such as a carrier wave or a similar mechanism embodied in a wireless medium, for example as part of spread-spectrum techniques. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. The modulation may be analog, digital, or a hybrid of the two.
By way of example, and not limitation, computer-readable storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. For example, computer-readable storage media include, but are not limited to, volatile memory such as random access memory (RAM, DRAM, SRAM); non-volatile memory such as flash memory, various read-only memories (ROM, PROM, EPROM, EEPROM), and magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM); magnetic and optical storage devices (hard disk, tape, CD, DVD); or other media now known or later developed that can store computer-readable information/data for use by a computer system.
An embodiment according to the present application herein comprises an apparatus comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the apparatus to perform a method and/or solution according to embodiments of the present application as described above.
It will be evident to those skilled in the art that the present application is not limited to the details of the foregoing illustrative embodiments, and that the present application may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the application being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the apparatus claims may also be implemented by one unit or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Claims (17)
1. An interface interaction method applied to a head-mounted device, wherein the method comprises the following steps:
presenting, by a display device of a head-mounted device, current application interface information of a current application being used by a user;
and acquiring interface interaction operation of the user about the current application interface information, and presenting a plurality of pieces of application interface information in a screen through the display device while presenting the current application interface information based on the interface interaction operation, wherein the plurality of pieces of application interface information comprise upper and lower side application interface information and left and right side application interface information of the current application interface information.
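As a purely illustrative sketch (not part of the claimed subject matter), the arrangement in claim 1 can be pictured as computing display rectangles for the upper, lower, left, and right side application interfaces around the current interface. The panel sizes, the gap value, and the screen-space coordinate system are assumptions introduced here for illustration only.

```python
# Hypothetical sketch: rectangles for side application interface panels
# placed around a current interface centered at (cx, cy) with size w x h.
# The coordinate system (x right, y down), sizes, and gap are assumptions.

def side_panel_rects(cx, cy, w, h, gap=20):
    """Return (x, y, w, h) rectangles for panels above, below,
    left of, and right of the current application interface."""
    return {
        "upper": (cx - w / 2, cy - h / 2 - gap - h, w, h),
        "lower": (cx - w / 2, cy + h / 2 + gap, w, h),
        "left":  (cx - w / 2 - gap - w, cy - h / 2, w, h),
        "right": (cx + w / 2 + gap, cy - h / 2, w, h),
    }
```

A renderer could draw each returned rectangle while keeping the current interface at the center, matching the simultaneous presentation described in the claim.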
2. The method according to claim 1, wherein one side application interface information, among the upper and lower side application interface information or the left and right side application interface information, includes application interface information corresponding to a functional application.
3. The method of claim 2, wherein the functional applications include a parameter setting application for accessing parameter setting functions of the head-mounted device and a shortcut instruction application for triggering and generating shortcut instruction information.
4. The method according to claim 3, wherein the application interface information of the parameter setting application comprises parameter identification information; wherein the method further comprises:
acquiring head movement information of the user, and determining gaze position information of a gaze point of the user in an interface according to the head movement information;
and if the gaze position information is within the identification range of one piece of parameter identification information among the parameter identification information, presenting the parameter setting function corresponding to that parameter identification information.
5. The method of claim 3, wherein the shortcut instruction application comprises shortcut instruction information to be triggered; wherein the method further comprises:
acquiring head movement information of the user, and determining gaze position information of a gaze point of the user in an interface according to the head movement information;
and if the gaze position information is within the trigger range of one piece of the shortcut instruction information to be triggered, determining that piece of shortcut instruction information to be triggered as the corresponding shortcut instruction information, and executing the shortcut instruction information.
6. The method according to claim 5, wherein, if the gaze position information is within the trigger range of one piece of the shortcut instruction information to be triggered, the determining of that piece as the corresponding shortcut instruction information and the executing of the shortcut instruction information comprise:
and if the gaze position information is within the trigger range of one piece of the shortcut instruction information to be triggered and a trigger confirmation operation of the user is obtained, determining that piece of shortcut instruction information to be triggered as the corresponding shortcut instruction information and executing the shortcut instruction information.
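The gaze-based triggering in claims 4 to 6 can be sketched, under stated assumptions, as mapping head movement to a screen-space gaze point and hit-testing that point against the ranges of identifiers or shortcut instructions. The yaw/pitch-to-screen mapping, the field-of-view values, and the rectangular range representation are illustrative assumptions, not the patent's implementation.

```python
# Illustrative sketch only: gaze-point computation and range hit-testing.
# Field of view, coordinate conventions, and target shapes are assumptions.

def gaze_point(yaw, pitch, screen_w, screen_h, fov_x=40.0, fov_y=22.5):
    """Map head yaw/pitch (degrees) to a screen-space gaze point,
    assuming the display spans fov_x by fov_y degrees."""
    x = (yaw / fov_x + 0.5) * screen_w
    y = (0.5 - pitch / fov_y) * screen_h
    return x, y

def hit_target(gaze, targets):
    """Return the first target whose rectangle contains the gaze point,
    or None. Each target is (name, (x, y, w, h))."""
    gx, gy = gaze
    for name, (x, y, w, h) in targets:
        if x <= gx <= x + w and y <= gy <= y + h:
            return name
    return None
```

Claim 6's additional confirmation operation would simply gate the execution of the hit target on a separate user input (for example, a button press or dwell timeout).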
7. The method of claim 3, wherein the shortcut instruction application comprises voice text prompt information corresponding to voice instruction information; wherein the method comprises the following steps:
presenting the voice text prompt information in application interface information corresponding to the shortcut instruction application through the display device;
and if the voice information of the user is acquired, determining and executing corresponding user voice instruction information according to the voice information.
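The voice-driven path of claim 7 can be sketched as matching recognized speech text against the prompted voice instructions of the shortcut application. The matching strategy (case-insensitive substring), the prompt texts, and the command identifiers below are hypothetical, introduced only to illustrate the step.

```python
# Hypothetical sketch: matching recognized speech against the voice text
# prompts presented in the shortcut instruction application (claim 7).
# Prompt texts, command ids, and the matching rule are assumptions.

def match_voice_command(recognized_text, commands):
    """Return the command id whose prompt text appears in the recognized
    speech, or None. `commands` maps prompt text -> command id."""
    text = recognized_text.strip().lower()
    for prompt, command_id in commands.items():
        if prompt.lower() in text:
            return command_id
    return None
```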
8. The method according to any one of claims 2 to 7, wherein the other side application interface information, among the upper and lower side application interface information or the left and right side application interface information, includes application interface information corresponding to other applications.
9. The method of claim 8, wherein the other applications comprise at least any one of:
an associated application of the current application;
a successor application to the current application;
a background application currently running in the background in the head-mounted device;
an installed application in the head-mounted device.
10. The method of claim 8, wherein the method further comprises:
acquiring head movement information of the user, and determining gaze position information of a gaze point of the user in an interface according to the head movement information;
and if the gaze location information is in the display range of the application interface information of the selected other application currently selected from the other applications, closing the application interface information of the applications other than the selected other application.
11. The method of claim 10, wherein the closing application interface information for applications other than the selected other application comprises:
and if the confirmation operation of the user about the selected other applications is obtained, closing the application interface information of the applications except the selected other applications.
12. The method of claim 11, wherein the closing application interface information of the applications other than the selected other application if the confirmation operation of the user on the selected other application is obtained comprises:
and if the confirmation operation of the user on the selected other applications is obtained, closing the application interface information of the applications except the selected other applications, and presenting the selected other applications at the display position of the current application.
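As a minimal sketch of the behavior in claims 10 to 12 (again, an illustration rather than the patented implementation): once the user confirms a selected application, the other application panels are closed and the selected one is presented at the current application's display position. Panel names and the rectangle representation are assumptions.

```python
# Illustrative sketch: after confirmation, keep only the selected
# application's panel and move it to the current application's slot.
# `panels` maps application name -> display rectangle (assumption).

def confirm_selection(panels, selected, current_rect):
    """Return the panel set after confirmation: only `selected` remains,
    placed at `current_rect`; an unknown selection leaves panels unchanged."""
    if selected not in panels:
        return panels
    return {selected: current_rect}
```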
13. The method of claim 11, wherein the method further comprises:
acquiring interface interaction operation of the user about the application interface information of the other applications, and presenting a plurality of pieces of other application interface information in a screen through the display device while presenting the application interface information of the other applications based on the interface interaction operation;
wherein the plurality of other application interface information includes any one of:
other upper side application interface information and/or lower side application interface information of the other applications;
and other left side application interface information and/or right side application interface information of the application interface information of other applications.
14. The method of claim 13, wherein the current application interface information of the current application is included in the plurality of other application interface information, and the plurality of other application interface information further includes application interface information corresponding to the functional application.
15. An interface interaction device applied to a head-mounted device, wherein the device comprises:
a first module for presenting, via a display device of the head-mounted device, current application interface information of a current application being used by a user;
and a second module for acquiring an interface interaction operation of the user with respect to the current application interface information, and presenting, based on the interface interaction operation, a plurality of pieces of application interface information in a screen through the display device while presenting the current application interface information, wherein the plurality of pieces of application interface information comprise upper and lower side application interface information and left and right side application interface information of the current application interface information.
16. An interface interaction device, wherein the device comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the operations of the method of any one of claims 1 to 14.
17. A computer-readable medium storing instructions that, when executed, cause a system to perform the operations of the method of any one of claims 1 to 14.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110721566.3A CN115543167A (en) | 2021-06-28 | 2021-06-28 | Interface interaction method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115543167A true CN115543167A (en) | 2022-12-30 |
Family
ID=84717429
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110721566.3A Pending CN115543167A (en) | 2021-06-28 | 2021-06-28 | Interface interaction method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115543167A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: 201210 7th Floor, No. 1, Lane 5005, Shenjiang Road, China (Shanghai) Pilot Free Trade Zone, Pudong New Area, Shanghai; Applicant after: HISCENE INFORMATION TECHNOLOGY Co.,Ltd. Address before: Room 501 / 503-505, 570 shengxia Road, China (Shanghai) pilot Free Trade Zone, Pudong New Area, Shanghai, 201203; Applicant before: HISCENE INFORMATION TECHNOLOGY Co.,Ltd. |