CN112181219A - Icon display method and device - Google Patents

Icon display method and device

Info

Publication number
CN112181219A
CN112181219A
Authority
CN
China
Prior art keywords
application icon
application
gesture
icon
navigation interface
Prior art date
Legal status
Granted
Application number
CN202010899281.4A
Other languages
Chinese (zh)
Other versions
CN112181219B (en)
Inventor
蔡文琪
曹新
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010899281.4A (CN112181219B)
Priority to CN202211085735.XA (CN116301485A)
Publication of CN112181219A
Application granted
Publication of CN112181219B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality


Abstract

The present application provides an icon display method applicable to an augmented reality (AR) device, a virtual reality (VR) device, or a mixed reality (MR) device, including the following steps: displaying an application navigation interface, where the application navigation interface includes a first application icon and a second application icon; detecting a slide gesture of a user directed at the application navigation interface; and, in response to the slide gesture, hiding the first application icon, maintaining display of the second application icon on the application navigation interface, and displaying a third application icon on the application navigation interface. With this method, even when the display area of the AR, VR, or MR device is limited, the user can switch the application icons displayed on the application navigation interface through a slide gesture, making operation more convenient.

Description

Icon display method and device
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an icon display method and an icon display apparatus.
Background
Virtual reality (VR), augmented reality (AR), and mixed reality (MR) technologies are computer simulation systems that create and let users experience a virtual world. Such a system uses a computer to generate a simulated environment: an interactive, three-dimensional dynamic view with simulated physical behaviors that fuses multi-source information and immerses the user in the environment.
Like a mobile phone, an AR/VR/MR device needs a launcher for its own system, that is, a desktop homepage (also called an application navigation interface) that appears after the device is turned on and serves as a collection of application entry points.
In existing implementations, the visual area of an AR/VR/MR device is limited: the application navigation interface can hold only a few application icons, and more application icons can be reached only by entering a second-level or even higher-level display interface, which increases the user's interaction cost.
Disclosure of Invention
In a first aspect, the present application provides an icon display method that may be applied to AR/VR/MR devices, or to similar devices that emerge as the technology evolves. For brevity, one or more of these devices may be used as an example when the scheme is described below. The method includes the following steps: displaying an application navigation interface, where the application navigation interface includes a first application icon and a second application icon; detecting a slide gesture of a user directed at the application navigation interface; and, in response to the slide gesture, hiding the first application icon, maintaining display of the second application icon on the application navigation interface, and displaying a third application icon on the application navigation interface. It should be understood that in some scenarios a VR device is paired with a handheld controller, and the user operates the controller to act on the application navigation interface seen visually (the slide gesture); in other scenarios the VR device has no controller, and the user operates the interface seen visually with a finger (the slide gesture); and in still other scenarios an AR device has no controller, and the user likewise operates the interface with a finger (the slide gesture).
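The steps above can be sketched in code. This is a minimal illustration rather than the patent's implementation; the class name `NavigationInterface` and its three lists are assumptions introduced only for the example:

```python
class NavigationInterface:
    """Sketch of the first-aspect method: a slide gesture hides the
    first application icon, keeps the second displayed, and reveals a
    third application icon."""

    def __init__(self, visible, pending):
        self.visible = list(visible)    # icons currently displayed, in order
        self.pending = list(pending)    # icons not yet displayed
        self.off_screen = []            # icons hidden by slide gestures

    def on_swipe(self):
        """Handle one slide gesture directed at the interface."""
        if self.visible:
            # Hide the first application icon.
            self.off_screen.append(self.visible.pop(0))
        if self.pending:
            # Display a third application icon; the icons that remain
            # (e.g. the second application icon) keep displaying.
            self.visible.append(self.pending.pop(0))
```

For example, starting from `["first", "second"]` visible and `["third"]` pending, one swipe leaves `["second", "third"]` visible and `["first"]` hidden, matching the claimed behaviour.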
Compared with the existing implementation, in which a user must enter a multi-level menu to find more application entry points, this embodiment lets the user switch the displayed application icons on the application navigation interface through a slide gesture even when the display area of the AR, VR, or MR device is limited, making operation more convenient. It should be understood that this scheme may also be applied to devices other than AR/VR/MR devices.
Meanwhile, in this embodiment, in response to the user's slide gesture, the application icons on the application navigation interface are not all hidden; rather, the display of some application icons (the second application icon) is maintained. Even if the user slides by mistake, this part of the icons remains displayed, so the user may not need to slide back to the previous application navigation interface to select the application to open, saving operation cost.
In one possible implementation, the maintaining of the display of the second application icon includes: moving the second application icon along the operation direction of the slide operation.
In one possible implementation, the displaying of a third application icon on the application navigation interface includes: displaying the third application icon at the position where the second application icon was located before moving, or gradually revealing the third application icon from a hidden state and moving it to that position. The third application icon may appear on the side opposite to the operation direction of the slide operation and move along the operation direction by a certain displacement until it reaches the position where the second application icon was located before moving. As it moves, the third application icon may change from a lower-definition state to a higher-definition state, or from a partial display to a full display.
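The gradual reveal described above can be approximated with a linear interpolation over an animation progress value `t` in [0, 1]. The linear easing and the function name `reveal_state` are illustrative assumptions; the patent does not specify an easing curve:

```python
def reveal_state(start, end, t):
    """Position and display definition of the incoming third icon at
    animation progress t: it moves from its hidden start position to
    the slot the second icon occupied, while its definition rises
    from a lower-definition state to full display."""
    t = max(0.0, min(1.0, t))          # clamp progress to [0, 1]
    x = start[0] + (end[0] - start[0]) * t
    y = start[1] + (end[1] - start[1]) * t
    definition = t                      # 0 = hidden, 1 = full display
    return (x, y), definition
```

At `t = 0` the icon sits at its hidden start position with zero definition; at `t = 1` it occupies the second icon's former slot at full definition.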
In one possible implementation, the first application icon is located in a target direction of the second application icon, and the hiding of the first application icon in response to the slide gesture includes: in response to the slide gesture, hiding the first application icon when the included angle between the operation direction of the slide gesture and the target direction is within a preset range.
In this embodiment of the application, the first application icon may be located in a target direction of the second application icon. When the VR or AR device detects the slide gesture, it needs to judge whether the gesture satisfies a certain condition. In one implementation, the device judges whether the included angle between the operation direction of the slide gesture and the target direction is within a preset range, where the preset range may be below 90 degrees. In this embodiment, the first application icon is hidden only if that included angle is within the preset range.
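The angle check in this implementation can be sketched as follows. The function name `should_hide` and the default 90-degree bound are assumptions for illustration; the patent text only suggests the preset range may be below 90 degrees:

```python
import math

def should_hide(swipe_vec, target_vec, max_angle_deg=90.0):
    """Hide the first icon only when the included angle between the
    swipe direction and the target direction is within the preset
    range (assumed here to be angles below max_angle_deg)."""
    dot = swipe_vec[0] * target_vec[0] + swipe_vec[1] * target_vec[1]
    norm = math.hypot(*swipe_vec) * math.hypot(*target_vec)
    if norm == 0.0:
        return False  # degenerate gesture or direction: do not hide
    cos_angle = max(-1.0, min(1.0, dot / norm))  # guard acos domain
    angle = math.degrees(math.acos(cos_angle))
    return angle < max_angle_deg
```

A swipe aligned with the target direction (angle 0) hides the icon; a swipe in the opposite direction (angle 180) does not.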
In one possible implementation, the hiding of the first application icon in response to the swipe gesture includes: in response to the swipe gesture, moving the first application icon along the operation direction of the swipe gesture; and hiding the first application icon when it moves to a target position of the navigation interface, the target position being close to or located at the edge of the navigation interface.
In one possible implementation, the application navigation interface includes a first area and a second area, where the first application icon and the second application icon are located in the first area, and the hiding of the first application icon in response to the slide gesture includes: in response to the swipe gesture, moving the first application icon along the operation direction of the swipe gesture; and hiding the first application icon when it moves to a target position of the navigation interface, where the target position is located between the first area and the second area, and the application icons contained in the second area remain unchanged during the movement and hiding of the first application icon.
In one possible implementation, the target position displays a target identifier, and the target identifier is used to indicate that the target position is a position where icons are hidden. The target identifier may be implemented as an icon, a control, an image, or the like; other embodiments that mention a target identifier may refer to this implementation.
In one possible implementation, the hiding of the first application icon includes: hiding the area of the first application icon that overlaps the target identifier, until the first application icon is completely hidden.
In one possible implementation, the application navigation interface further includes a bearer identifier, where the first application icon and the second application icon are located on the bearer identifier and the target position is located at an edge of the bearer identifier, and the hiding of the first application icon includes: dropping the first application icon from the edge of the bearer identifier to below that edge, and hiding the first application icon when its movement displacement exceeds a preset value. Optionally, the bearer identifier may be implemented as an elongated identifier resembling a base plate or a desktop. The "movement displacement" used in this implementation may be the vertical drop distance measured from the start of the fall, or a displacement measured from the start of the movement.
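The drop-and-hide behaviour can be sketched as a per-frame simulation. The step size and preset threshold below are purely illustrative numbers, not values from the patent:

```python
def frames_until_hidden(step=5.0, preset_value=40.0):
    """Advance the icon downward from the bearer-identifier edge one
    step per frame; the icon counts as hidden once its drop
    displacement exceeds the preset value.
    Returns (frames elapsed, final displacement)."""
    displacement = 0.0
    frames = 0
    while displacement <= preset_value:  # not yet past the threshold
        displacement += step
        frames += 1
    return frames, displacement
```

With a 5-unit step and a 40-unit threshold, the icon is hidden on the ninth frame, when its displacement first exceeds the preset value.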
In one possible implementation, the application navigation interface includes a first area and a second area, wherein the first application icon and the second application icon are located in the first area, and application icons within the second area are not hidden in response to the swipe gesture by the user, the method further comprising:
moving the second application icon to the second region in response to a first target gesture, the first target gesture including a gesture to pinch the second application icon, a gesture to maintain the pinch gesture and drag the second application icon to the second region, and a gesture to drop the second application icon in the second region.
In one possible implementation, the application navigation interface includes a first area and a second area, the first area includes the first application icon and the second application icon, the second area includes a fourth application icon, and the method further includes: maintaining display of the fourth application icon on the second region after detecting a swipe gesture by a user.
In this embodiment, the application navigation interface includes two regions: one region (the first region) may display application icons that move, hide, or appear in response to the user's slide operation, while the other region (the second region) displays fixed icons, where "fixed" may be understood to mean that the number and types of the application icons do not change, and even their positions do not change.
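The two-region behaviour, including moving an icon into the fixed region with the first target gesture, can be sketched as follows. The class name `TwoRegionInterface` and its method names are assumptions introduced for the example:

```python
class TwoRegionInterface:
    """Sketch of the two-region application navigation interface:
    icons in the first region scroll and hide with slide gestures,
    while icons in the second region stay fixed."""

    def __init__(self, first_region, second_region):
        self.first_region = list(first_region)    # scrolling icons
        self.second_region = list(second_region)  # fixed (docked) icons
        self.hidden = []                          # icons hidden by swipes

    def on_swipe(self, incoming=None):
        """Hide the leading first-region icon and optionally reveal a
        new one; the second region is deliberately left untouched."""
        if self.first_region:
            self.hidden.append(self.first_region.pop(0))
        if incoming is not None:
            self.first_region.append(incoming)

    def pin(self, icon):
        """First target gesture: pinch, drag, and drop an icon from
        the scrolling region into the fixed region."""
        self.first_region.remove(icon)
        self.second_region.append(icon)
```

After pinning an icon, subsequent swipes rotate only the first region; the pinned icon keeps displaying in the second region.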
In one possible implementation, the method further comprises: detecting a first target gesture of a user for the second application icon, and moving the second application icon to the second area; detecting a sliding gesture of a user aiming at the application navigation interface; maintaining display of the second application icon on the second region in response to the swipe gesture.
In one possible implementation, the first target gesture includes a gesture of long-pressing the second application icon, a gesture of maintaining the long press and dragging the second application icon to the second area, and a gesture of raising the hand in the second area; or, alternatively,
the first target gesture includes a gesture to pinch the second application icon, a gesture to maintain the pinch gesture to the second application icon and drag the second application icon to the second region, and a gesture to drop the second application icon in the second region.
In one possible implementation, the method further comprises: detecting a second target gesture of the user for the fourth application icon, and moving the fourth application icon to the first area; detecting a sliding gesture of a user aiming at the application navigation interface; maintaining display of the fourth application icon on the first region in response to the swipe gesture; or hiding the fourth application icon.
In one possible implementation, the second target gesture includes a gesture of long-pressing the fourth application icon, a gesture of maintaining the long press and dragging the fourth application icon to the first area, and a gesture of raising the hand in the first area; or, alternatively, the second target gesture includes a gesture to pinch the fourth application icon, a gesture to maintain the pinch and drag the fourth application icon to the first region, and a gesture to drop the fourth application icon in the first region.
In one possible implementation, the hiding of the first application icon includes: moving the first application icon along the operation direction of the slide operation, and reducing the display definition of the first application icon as it moves, until the first application icon is hidden.
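A linear falloff of display definition with movement distance would look like the sketch below. The linear profile and the function name `icon_definition` are assumptions, as the patent does not specify how definition decreases:

```python
def icon_definition(moved, hide_distance):
    """Display definition of the first icon after it has travelled
    `moved` units along the swipe direction; it reaches 0 (hidden)
    once the icon has travelled hide_distance.
    Linear falloff is an assumption for illustration."""
    if hide_distance <= 0.0:
        return 0.0  # degenerate threshold: treat as already hidden
    return max(0.0, 1.0 - moved / hide_distance)
```

Halfway along the hiding distance the icon is at half definition; past the full distance it stays hidden.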
In one possible implementation, the application navigation interface further includes: a target identifier, the target identifier being adjacent to the first application icon and located in an operation direction of the sliding operation relative to the first application icon, the hiding the first application icon comprising: and moving the first application icon along the operation direction of the sliding operation, and hiding an area, which is overlapped with the target identifier, in the first application icon until the first application icon is hidden.
In this embodiment of the application, a target identifier shaped like a baffle or another shape may also be displayed on the application navigation interface, located on either side of the application icons in the interface. The AR/VR/MR device may move the first application icon along the operation direction of the slide operation and hide the area of the first application icon that overlaps the target identifier, until the first application icon is hidden.
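The baffle-style clipping can be sketched in one dimension: as the icon slides toward the target identifier, only the portion not yet covered remains visible. The function name and simplified geometry are illustrative assumptions:

```python
def visible_width(icon_left, icon_width, baffle_left):
    """Width of the icon not yet covered by the target identifier
    (baffle), whose left edge is at baffle_left, as the icon slides
    rightward toward it; 0 means the icon is fully hidden."""
    overlap = max(0.0, icon_left + icon_width - baffle_left)
    return max(0.0, icon_width - overlap)
```

An icon well short of the baffle is fully visible; one that has slid entirely behind it has zero visible width and is hidden.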
The method provided by the present application can also be applied to other interfaces that do not serve an application navigation function.
In a second aspect, the present application provides an icon display apparatus applied to an AR/VR/MR device, the apparatus comprising:
the display module is used for displaying an application navigation interface, and the application navigation interface comprises a first application icon and a second application icon;
the detection module is used for detecting a sliding gesture of a user aiming at the application navigation interface;
the display module is configured to hide the first application icon in response to the swipe gesture, maintain display of the second application icon on the application navigation interface, and display a third application icon on the application navigation interface.
In a possible implementation, the display module is configured to move the second application icon in an operation direction of the sliding operation.
In a possible implementation, the display module is configured to display the third application icon at a position where the second application icon is located before moving, or gradually display and move the third application icon from a hidden state to a position where the second application icon is located before moving.
In a possible implementation, the first application icon is located in a target direction of the second application icon, and the display module is configured to hide the first application icon in response to the sliding gesture based on that an included angle between an operation direction of the sliding gesture and the target direction is within a preset range.
In one possible implementation, the display module is configured to move the first application icon along the operation direction of the swipe gesture in response to the swipe gesture;
when the first application icon moves to a target position of the navigation interface, the first application icon is hidden, and the target position is close to or located at the edge of the navigation interface.
In one possible implementation, the target position displays a target identifier, and the target identifier is used for indicating that the target position is a position of a hidden icon.
In a possible implementation, the display module is configured to hide an area of the first application icon that coincides with the target identifier until the first application icon is hidden.
In a possible implementation, the application navigation interface further includes a bearer identifier, the first application icon and the second application icon are located on the bearer identifier, the target location is located at an edge of the bearer identifier, and the display module is configured to drop the first application icon from the edge of the bearer identifier to a position below the edge of the bearer identifier, and hide the first application icon when a movement displacement of the first application icon exceeds a preset value.
In one possible implementation, the display module is configured to decrease the display definition of the first application icon with the movement of the first application icon until the first application icon is hidden.
In one possible implementation, the application navigation interface includes a first region and a second region, where the first application icon and the second application icon are located in the first region, and application icons within the second region are not hidden in response to the user's swipe gesture. The display module is further configured to move the second application icon to the second region in response to a first target gesture, where the first target gesture includes a gesture to pinch the second application icon, a gesture to maintain the pinch and drag the second application icon to the second region, and a gesture to drop the second application icon in the second region.
In one possible implementation, the application navigation interface includes a first area and a second area, the first area includes the first application icon and the second application icon, the second area includes a fourth application icon, and the display module is configured to maintain display of the fourth application icon on the second area after detecting a sliding gesture of a user.
In one possible implementation, the detection module is configured to detect a first target gesture of a user for the second application icon, and move the second application icon to the second area; detecting a sliding gesture of a user aiming at the application navigation interface;
the display module is configured to maintain display of the second application icon on the second region in response to the swipe gesture.
In one possible implementation, the first target gesture includes a gesture of long-pressing the second application icon, a gesture of maintaining the long press and dragging the second application icon to the second area, and a gesture of raising the hand in the second area; or, alternatively,
the first target gesture includes a gesture to pinch the second application icon, a gesture to maintain the pinch gesture to the second application icon and drag the second application icon to the second region, and a gesture to drop the second application icon in the second region.
In one possible implementation, the detection module is configured to detect a second target gesture of the user for the fourth application icon, and move the fourth application icon to the first area; detecting a sliding gesture of a user aiming at the application navigation interface;
the display module to maintain display of the fourth application icon on the first region in response to the swipe gesture; or hiding the fourth application icon.
In one possible implementation, the second target gesture includes a gesture of long-pressing the fourth application icon, a gesture of maintaining the long press and dragging the fourth application icon to the first area, and a gesture of raising the hand in the first area; or, alternatively,
the second target gesture includes a gesture to pinch the fourth application icon, a gesture to maintain the gesture to pinch the fourth application icon and drag the fourth application icon to the first region, and a gesture to drop the fourth application icon in the first region.
In a possible implementation, the display module is configured to move the first application icon in an operation direction of the sliding operation, and decrease the display definition of the first application icon as the first application icon moves until the first application icon is hidden.
In one possible implementation, the application navigation interface further includes a target identifier, the target identifier being adjacent to the first application icon and located, relative to the first application icon, in the operation direction of the slide operation; the display module is configured to move the first application icon along the operation direction of the slide operation and hide the area of the first application icon that overlaps the target identifier, until the first application icon is hidden.
In a third aspect, the present application provides an icon display apparatus including a processor and a memory coupled to the processor, the memory storing program instructions that, when executed by the processor, implement the method of the first aspect. For the steps the processor performs in each possible implementation of the first aspect, reference may be made to the first aspect; details are not repeated here.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method of the first aspect described above.
In a fifth aspect, the present application provides circuitry comprising processing circuitry configured to perform the method of the first aspect.
In a sixth aspect, the present application provides a computer program which, when run on a computer, causes the computer to perform the method of the first aspect described above.
In a seventh aspect, the present application provides a chip system, where the chip system includes a processor, configured to support a server or a threshold value obtaining apparatus to implement the functions recited in the foregoing aspects, for example, to send or process data and/or information recited in the foregoing methods. In one possible design, the system-on-chip further includes a memory for storing program instructions and data necessary for the server or the communication device. The chip system may be formed by a chip, or may include a chip and other discrete devices.
In an eighth aspect, the present application provides a user interface (UI), which may also be referred to as an interactive interface or a user interaction interface and, according to its function, may also be called an application navigation interface. Specifically, the user interface includes:
a first application icon and a second application icon, where, when a user performs a slide gesture on the user interface, the first application icon is hidden, the second application icon maintains display on the user interface, and a third application icon is newly displayed.
In one possible implementation, in the case that the user performs an operation of a slide gesture on the user interface, the second application icon moves along an operation direction of the slide gesture.
In a possible implementation, in the case of a user performing a sliding gesture on the user interface, the third application icon is displayed at a position where the second application icon is located before moving, or the third application icon is gradually displayed from a hidden state and moves to a position where the second application icon is located before moving.
In one possible implementation, in the case of an operation of a slide gesture performed on the user interface by a user, the first application icon is moved along an operation direction of the slide gesture; when the first application icon moves to a target position of the navigation interface, the first application icon is hidden, and the target position is close to or located at the edge of the navigation interface.
In one possible implementation, the target position displays a target identifier, and the target identifier is used for indicating that the target position is a position of a hidden icon.
In one possible implementation, in the case that the user performs a sliding gesture on the user interface, hiding an area of the first application icon that coincides with the target identifier until the first application icon is hidden.
In one possible implementation, the display interface further includes a bearer identifier, the first application icon and the second application icon being located on the bearer identifier and the target position being located at an edge of the bearer identifier; when the user performs a slide gesture on the user interface, the first application icon drops from the edge of the bearer identifier to below that edge, and the first application icon is hidden when its movement displacement exceeds a preset value.
In one possible implementation, in the case of a user operation of a slide gesture on the user interface, the display definition of the first application icon is reduced as the first application icon moves until the first application icon is hidden.
In one possible implementation, the display interface includes a first area and a second area, wherein the first application icon and the second application icon are located in the first area, and application icons within the second area are not hidden in response to the swipe gesture by the user.
The embodiment of the application provides an icon display method, applied to an AR device, a VR device, or an MR device, including: displaying an application navigation interface, where the application navigation interface includes a first application icon and a second application icon; detecting a slide gesture of a user on the application navigation interface; and in response to the slide gesture, hiding the first application icon, maintaining display of the second application icon on the application navigation interface, and displaying a third application icon on the application navigation interface. In this way, unlike existing implementations in which a user must enter a multilevel menu to find more application entries, the user can switch the displayed application icons on the application navigation interface with a slide gesture even when the display area of the AR, VR, or MR device is limited, making the operation more convenient.
Drawings
FIG. 1 is a schematic diagram of an application scenario;
FIG. 2 is a schematic diagram of a VR system;
FIG. 3 is a schematic diagram of a hardware structure of a communication device;
FIG. 4 is a schematic diagram of an icon display method provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 6 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 7 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 8 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 9 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 10a is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 10b is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 11 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 12 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 13 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 14 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 15 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 16 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 17 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 18 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 19 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 20 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 21 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 22 is a schematic diagram of an application navigation interface provided by the present embodiment;
FIG. 23 is a schematic diagram of an icon display method provided in an embodiment of the present application;
FIG. 24 is a schematic diagram of an icon display method provided in an embodiment of the present application.
Detailed Description
Embodiments of the present application are described below with reference to the accompanying drawings. Those skilled in the art will appreciate that, with the development of technology and the emergence of new scenarios, the technical solutions provided in the embodiments of the present application are likewise applicable to similar technical problems.
The terms "first," "second," and the like in the description, claims, and drawings of the present application are used to distinguish between similar elements and are not necessarily intended to describe a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and are merely a way of distinguishing objects of the same nature in describing the various embodiments of the application. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such a process, method, article, or apparatus.
Virtual Reality (VR) is an emerging technology: a computer simulation system capable of creating a virtual world and letting users experience it. It uses computer simulation to generate a virtual three-dimensional (3D) environment and is a system simulation of multi-source information fusion, interactive three-dimensional dynamic views, and entity behaviors, so that the user can be immersed in the VR scene. In brief, VR renders a visual environment so that the user blends into the VR scene to the maximum extent and enjoys an immersive experience.
The application can be applied to a Virtual Reality (VR) device, an Augmented Reality (AR) device, or a Mixed Reality (MR) device. The VR device can be a device having processing functions (e.g., decoding, rendering, view-angle capturing, re-projection, and eyepiece correction) as well as the function of playing VR video; for example, it may be an all-in-one VR machine. Of course, the VR device may also be a device that does not itself have processing functions but supports playing VR video.
In this embodiment of the present application, a VR device may acquire a media data stream (for example, a VR video stream), perform processing such as decoding, rendering, view-angle capturing, re-projection, and eyepiece correction on it, and then play pictures based on the processed media data stream, so that the user can watch VR video. The user can experience VR application services such as VR videos and VR games through the VR device and obtain an immersive scene experience.
For example, the VR device may be a head-mounted VR display device, a VR helmet (e.g., an all-in-one virtual reality helmet, a virtual reality helmet connected to a mobile phone end, a virtual reality helmet connected to a desktop computer, a light guide mixed reality helmet, an augmented reality helmet, etc.), VR glasses (VR Glass) with a processing function, a VR Box (VR Box) with a processing function, a device with a function of playing a panoramic video, such as a computer or a television, etc. Of course, the VR device can also be any other device that can be used to process and play VR video.
Taking an all-in-one VR machine as an example, as shown in fig. 1, the user can wear the VR device on the head when watching the media content it plays.
As shown in fig. 2, an embodiment of the present application provides a Virtual Reality (VR) system, including: a VR device 70 and one or more terminals 60 in communication with the VR device 70.
The VR device 70 and one or more terminals 60 may be connected wirelessly or in a wired manner. The embodiments of the present application do not limit this.
For example, the VR device 70 and the one or more terminals 60 may be connected by a USB data line, a High Definition Multimedia Interface (HDMI) transmission line, or a Type-C connection, which is not limited in this embodiment.
The terminal 60 may be a device having decoding and rendering capabilities. The terminal 60 may be a media terminal without a display device. Such as a Set-Top Box (STB). Of course the terminal 60 may also be a terminal with a display device. For example, the terminal 60 may be a mobile phone, a Computer, a tablet Computer, a notebook Computer, an Ultra-mobile Personal Computer (UMPC), a netbook, or a Personal Digital Assistant (PDA). In the following embodiments, the specific form of the terminal 60 is not limited in any way.
In this embodiment of the application, the VR device 70 may obtain a processed media data stream (e.g., a VR video stream) from the terminal 60, and play a picture based on the processed media data stream, so as to provide a function of watching a VR video to a user, and the user may experience VR application services such as a VR video and a VR game through the VR device, so as to obtain an immersive scene experience.
The terminal 60 may be a device having decoding and rendering capabilities. Specifically, the terminal 60 may acquire the media data stream from the media server 80, perform decoding, rendering, view angle capturing, re-projecting, and eyepiece correcting on the media data stream, and then transmit the processed media data stream to the VR device 70.
The terminal 60 provided by the embodiment of the present application may be a device having decoding and rendering capabilities. For example, the terminal 60 may be a mobile phone, a set-top box, a Computer, a tablet Computer, a notebook Computer, an Ultra-mobile Personal Computer (UMPC), a netbook, a Personal Digital Assistant (PDA). Of course, in the following embodiments, the specific form of the terminal 60 is not limited at all.
In addition, in this embodiment of the application, the VR device 70 may also acquire a media data stream (for example, a VR video stream) in other ways, perform processing such as decoding, rendering, view-angle capturing, re-projection, and eyepiece correction on it, and play pictures based on the processed media data stream so that the user can watch VR video.
Optionally, the VR system as shown in fig. 2 further includes: a media server 80 connected to the VR device 70 and the terminal 60. The media server 80 has VR video resources stored therein for providing media data streams to the VR device 70. The media server 80 may be a server, a server cluster composed of a plurality of servers, or a cloud computing service center.
Optionally, the media server 80 is a background server for providing the network VR service, for example, a background server of a website or an application for providing the network VR service.
There is a communication connection between the VR device 70 and the media server 80. The communication connection may be a wireless network connection or a wired network connection. Illustratively, the media server 80 includes: a content operation platform, a content composition management platform and a content distribution network.
For example, the content composition management platform is responsible for managing the content in the service system: editing, adding, deleting, and querying media content and producing the EPG according to a certain policy, and generating related statistics and reports on media content or value-added service content. The content distribution network obtains content injected from the content composition management platform and distributes video content or service data from the central storage server to edge servers. For example, the content distribution network is used to transmit media data streams to the VR device 70.
As shown in fig. 3, fig. 3 is a schematic diagram illustrating a hardware structure of a communication device according to an embodiment of the present application, and the structure of the VR device 70 in the embodiment of the present application may refer to the structure illustrated in fig. 3. The communication device includes: one or more display devices 71, sensors 72, a power bus 73, and a communication interface 74.
One or more display devices 71, sensors 72, power bus 73, and communication interface 74 are connected by bus 75.
Optionally, the communication device in the embodiment of the present application may further include an auxiliary device. For example, the auxiliary device may include a remote control handle for human interaction.
The display device 71 is configured to display a simulated video projected onto the retina of the user. An ultra-high-resolution Liquid Crystal Display (LCD) with a high refresh rate and fast response time may be used in cooperation with the optical imaging system to deliver high-definition image quality and a smoother display effect.
The sensor 72 is configured to collect data on the observer's motion and environmental state and send the collected data to the terminal 60. It is specifically configured to collect the motion state and position of the observer's head, for example rotation data of the head; head position information such as the displacement and speed produced by moving forward and backward; motions such as swinging, shaking, raising, and lowering the head; or operations input by the observer such as clicking and pressing, which is not limited herein. Specifically, the sensor may be an Inertial Measurement Unit (IMU) provided in the VR device, a device that measures three-axis attitude angles (or angular velocities) and acceleration.
The communication interface 74 may support the VR device in communicating with other devices through a wireless network, a wired network, Bluetooth, or other communication methods, which is not limited herein. For example, the communication interface 74 is used to access different wireless network systems (e.g., an LTE network) to transmit and receive wireless signals, and may be a baseband chip integrated with a radio-frequency chip.
The power bus 73 is used to provide power to the VR device.
The VR device may further include: a memory 75. The memory 75 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that may store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, Blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory is used to store information such as the media data streams transmitted by the terminal 60.
Referring to fig. 4, fig. 4 is an icon display method provided in an embodiment of the present application, where the method is applied to an AR device or a VR device, and the method includes:
401. displaying an application navigation interface, wherein the application navigation interface comprises a first application icon and a second application icon.
In this embodiment, the AR device or VR device may display an application navigation interface. Both AR and VR devices need a launcher provided in the system, that is, a desktop homepage (also referred to as an application navigation interface in this embodiment) that appears after the device is turned on. The application navigation interface may include a plurality of application icons as a set of application entries, and the user may open a corresponding application by performing a gesture operation on its application icon on the application navigation interface.
In an embodiment of the present application, the application navigation interface may include a first application icon and a second application icon, where the first application icon corresponds to application A, the second application icon corresponds to application B, and application A and application B may each be one of a word processing application, a telephone application, an email application, an instant messaging application, a photo management application, a web browsing application, a digital music player application, and a digital video player application.
In this embodiment of the application, the application navigation interface may display a plurality of application icons, and the first application icon and the second application icon are two of the application icons.
Referring to fig. 5, fig. 5 shows an illustration of an application navigation interface provided by the present embodiment, as shown in fig. 5, an application navigation interface 500 includes a plurality of application icons, where the plurality of application icons includes a first application icon 501 and a second application icon 502.
It should be understood that the number of application icons included in the application navigation interface can be determined according to the length of the visible area of the VR/AR device. Referring to fig. 6, which shows an application navigation interface provided by the present embodiment, the visible area of the VR device in fig. 6 is longer than that of the VR device shown in fig. 5, so more application icons can be placed than in fig. 5. The present embodiment thus allows flexible selection of the number of displayed application icons for AR/VR devices with different viewing areas.
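The relationship between the visible-area length and the number of displayed icons described above can be sketched as follows (a minimal illustration with hypothetical function and parameter names; the patent does not specify a concrete formula):

```python
def max_visible_icons(visible_width: float, icon_width: float, spacing: float) -> int:
    """Return how many icons fit in one row of the given visible width.

    n icons need n * icon_width + (n - 1) * spacing <= visible_width,
    so n <= (visible_width + spacing) / (icon_width + spacing).
    """
    if visible_width < icon_width:
        return 0
    return int((visible_width + spacing) // (icon_width + spacing))
```

A device with a wider visible area (as in fig. 6 versus fig. 5) simply yields a larger count from the same rule.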
402. Detecting a slide gesture of the user on the application navigation interface.
In this embodiment of the application, the user may switch the displayed application icons in the application navigation interface through a slide gesture on the application navigation interface. It should be understood that in some scenarios the VR device is paired with a handle, and the user operates the visually perceived application navigation interface by operating the handle (slide gesture); in other scenarios the VR device has no handle, and the user operates the visually perceived interface directly with a finger (slide gesture); likewise, an AR device without a handle can be operated with a finger (slide gesture).
403. In response to the swipe gesture, hiding the first application icon, maintaining display of the second application icon on the application navigation interface, and displaying a third application icon on the application navigation interface.
Specifically, the user may perform an operation (a swipe gesture) on the visually-viewed application navigation interface, and the VR device/AR device may hide the first application icon, maintain the display of the second application icon on the application navigation interface, and display a third application icon on the application navigation interface in response to the swipe gesture.
In an embodiment, the first application icon is located in a target direction of the second application icon, and the user may perform a slide gesture along the target direction, or along a direction within a preset included angle (less than 90 degrees) of the target direction, so that the VR/AR device, in response to the user's slide gesture, hides the first application icon, maintains the display of the second application icon on the application navigation interface, and displays a third application icon on the application navigation interface.
In one embodiment, the second application icon may move in the operation direction of the slide gesture, and the third application icon is displayed at the position where the second application icon was located before moving. The first application icon also moves in the operation direction and is hidden after a certain displacement. The third application icon can appear from the side opposite to the operation direction, move along the operation direction, and after a certain displacement reach the position where the second application icon was located before moving; alternatively, the third application icon gradually appears from a hidden state and moves to that position, changing from lower to higher display definition, or from partial to complete display, as it moves.
Referring to fig. 7, which shows an application navigation interface provided in this embodiment, the first application icon 501 is located on the left side of the second application icon 502, and the user may perform a leftward slide on the application navigation interface. As shown in fig. 8, in response to this operation, the VR/AR device may hide the display of the first application icon, move the second application icon 502 along the operation direction of the slide, and display a third application icon 503 at the position where the second application icon was located before moving. In this way, although only a limited number of application icons can be displayed on the application navigation interface, some application icons can be hidden and new ones displayed through the user's slide gesture, so that the user can control which application icons the application navigation interface displays. It should be understood that if the user slides rightward on the application navigation interface, the VR/AR device performs a response similar to the leftward slide described above. Specifically, referring to fig. 9, the first application icon 501 is located on the left side of the second application icon 502, and the user may slide rightward on the application navigation interface; as shown in fig. 10a, in response, the VR/AR device may move the first application icon 501 and the second application icon 502 along the operation direction of the slide, hide the rightmost application icon in fig. 9, and display a new application icon at the position where the first application icon was located before moving.
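The hide/reveal behavior in figs. 7 to 10a amounts to sliding a fixed-size window over the full ordered list of application entries: a leftward swipe hides the leftmost visible icon and reveals one further to the right, and a rightward swipe does the reverse. A minimal sketch, with illustrative names (the patent does not prescribe this data model):

```python
def shift_window(icons, start, count, direction):
    """Return the new window start index after a swipe.

    icons: full ordered list of app entries; icons[start:start+count] is visible.
    direction: -1 for a leftward swipe (reveal icons further right),
               +1 for a rightward swipe (reveal icons further left).
    """
    new_start = start - direction          # leftward swipe advances the window
    max_start = max(0, len(icons) - count) # clamp so the window stays in range
    return max(0, min(new_start, max_start))
```

At either end of the list the window simply stops, matching the behavior where no further icons exist to reveal.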
In this embodiment of the application, the first application icon may be located in a target direction of the second application icon. When the VR/AR device detects the slide gesture, it needs to determine whether the slide gesture satisfies a certain condition. In one implementation, the VR/AR device may determine whether the included angle between the operation direction of the slide gesture and the target direction is within a preset range, where the preset range may be less than 90 degrees. In this embodiment, the first application icon is hidden only if the included angle between the operation direction of the slide gesture and the target direction is within the preset range. Specifically, referring to fig. 10b, the first application icon is located in a target direction of the second application icon, the included angle between the operation direction of the user's slide gesture and the target direction is 1001, and the value of the included angle 1001 is smaller than the preset range (for example, 90 degrees).
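The included-angle check can be sketched with a standard dot-product computation (an illustrative implementation; the function and parameter names are assumptions, and vectors are given in 2D screen coordinates):

```python
import math

def gesture_matches_direction(swipe_vec, target_vec, max_angle_deg=90.0):
    """Return True if the angle between the swipe direction and the target
    direction is within max_angle_deg, i.e. the gesture should hide the icon."""
    sx, sy = swipe_vec
    tx, ty = target_vec
    dot = sx * tx + sy * ty
    norm = math.hypot(sx, sy) * math.hypot(tx, ty)
    if norm == 0:
        return False  # degenerate gesture or direction: no movement
    # clamp to guard against floating-point error before acos
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle < max_angle_deg
```

A swipe roughly aligned with the target direction passes the check; a swipe in the opposite direction fails it.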
Figs. 5 to 10a above only show examples in which the application icons are displayed in a single horizontal row in the application navigation interface. In some other implementations, the arrangement of the application icons in the application navigation interface is not limited to a horizontal row; they may, for example, be displayed along an arc, or in multiple columns (as shown in fig. 11).
In this embodiment of the application, the first application icon can be moved along the operation direction of the slide gesture in response to the user's slide gesture, and the display definition of the first application icon decreases as the icon moves, until the first application icon is hidden.
In this embodiment of the application, when the first application icon moves to a target position of the navigation interface, the first application icon is hidden; the target position is near or at the edge of the navigation interface, a target identifier can be displayed at the target position, and the target identifier indicates that the target position is the position where icons are hidden. Specifically, referring to fig. 12, which shows an application navigation interface provided in this embodiment, the first application icon may be moved along the operation direction of the slide gesture, and its display definition decreases as it moves until it is hidden. Similarly, a third application icon may appear, with low display definition, at a position on the side opposite to the operation direction of the slide gesture; the third application icon then moves in the operation direction, and the display definition of the third application icon increases as it moves.
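The gradual change in display definition as an icon moves can be modeled as opacity varying linearly with displacement (a hypothetical sketch; the patent does not fix the exact curve, and all names are illustrative):

```python
def icon_alpha(displacement: float, hide_distance: float) -> float:
    """Opacity of the disappearing first icon: fully opaque at rest,
    fully transparent once it has moved hide_distance toward the target."""
    if hide_distance <= 0:
        return 0.0
    return max(0.0, 1.0 - displacement / hide_distance)

def reveal_alpha(displacement: float, reveal_distance: float) -> float:
    """Mirror-image opacity for the newly revealed third icon."""
    if reveal_distance <= 0:
        return 1.0
    return min(1.0, displacement / reveal_distance)
```

A renderer would evaluate these per frame while animating both icons along the swipe direction.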
In this embodiment of the application, the application navigation interface may further include a bearing identifier; the first application icon and the second application icon are located on the bearing identifier, and the target position is located at the edge of the bearing identifier. A bar-shaped identifier resembling a base board (this bar-shaped identifier is one implementation of the above bearing identifier) may be displayed on the application navigation interface, with the application icons located on the base board. The AR/VR device may then respond to the user's slide gesture by moving the first application icon in the operation direction of the slide, and when the first application icon reaches the edge of the base board, display a dropping effect of the first application icon until it is hidden. Taking the first application icon located on the left side of the second application icon and the user sliding leftward as an example, referring to fig. 13, which shows an application navigation interface provided in this embodiment, a board-like identifier 504 may be displayed on the application navigation interface, with the application icons located on the board 504. The AR/VR device may move the first application icon 501 along the operation direction of the slide in response to the user's slide gesture, and when the first application icon reaches the edge of the board 504, display a dropping effect of the first application icon 501 (as shown in fig. 14) until the first application icon 501 is hidden.
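The drop effect at the edge of the bearing identifier can be sketched as a small state classification (illustrative names; `hide_threshold` plays the role of the preset displacement value from the claims):

```python
def drop_phase(distance_past_edge: float, hide_threshold: float) -> str:
    """State of the first icon during the drop effect.

    distance_past_edge: how far the icon has moved beyond the edge of the
    bearing identifier (<= 0 means it is still on the board).
    """
    if distance_past_edge <= 0:
        return "on_board"   # still resting on the bearing identifier
    if distance_past_edge <= hide_threshold:
        return "falling"    # drop animation below the board edge
    return "hidden"         # movement displacement exceeded the preset value
```

An animation loop would feed the icon's current displacement into this classifier each frame to decide what to render.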
The method provided by this embodiment differs from the page-based application navigation interface of existing mobile phones: the icons are not divided into separate pages; instead, all icons are arranged horizontally and each icon is managed independently.
In this embodiment, the application navigation interface may further include a target identifier, where the target identifier is adjacent to the first application icon, and the target identifier is located in the operation direction of the sliding operation relative to the first application icon.
In this embodiment of the application, a target identifier shaped like a baffle (or another shape) may also be displayed on the application navigation interface, located on both sides of the application icons in the application navigation interface. The AR/VR device may move the first application icon along the operation direction of the slide gesture and hide the area of the first application icon that coincides with the target identifier, until the first application icon is hidden. Taking the first application icon located on the left side of the second application icon and the user sliding leftward as an example, referring to fig. 15, which shows an application navigation interface provided in this embodiment, a baffle-like target identifier 505 may be displayed on the application navigation interface, located on the left side of the first application icon 501 in the application navigation interface. The AR/VR device may move the first application icon 501 along the operation direction of the slide and hide the area of the first application icon 501 that coincides with the target identifier 505 until the first application icon 501 is hidden (as shown in fig. 16). Similarly, a baffle-like target identifier 506 may be displayed on the application navigation interface, located on the right side of the first application icon 501 in the application navigation interface; the AR/VR device may display the third application icon 503 from the side opposite to the operation direction of the slide and hide the area of the third application icon 503 that coincides with the target identifier 506 until the third application icon 503 is completely displayed.
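The baffle-style hiding amounts to clipping an icon against the target identifier's rectangle: the overlapping region is hidden and only the remainder is drawn. A one-dimensional sketch along the horizontal axis (names are assumptions):

```python
def visible_width(icon_left: float, icon_width: float,
                  baffle_left: float, baffle_right: float) -> float:
    """Horizontal extent of the icon NOT covered by the baffle-like
    target identifier; the overlapping portion is hidden (clipped)."""
    icon_right = icon_left + icon_width
    overlap = max(0.0, min(icon_right, baffle_right) - max(icon_left, baffle_left))
    return icon_width - overlap
```

As the icon slides toward and under the baffle, the returned value shrinks from the full icon width to zero, at which point the icon is fully hidden.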
In one embodiment, the application navigation interface may include a first area and a second area, the first area including the first application icon and the second application icon, the second area including a fourth application icon, and the AR/VR device may maintain display of the fourth application icon on the second area after detecting a swipe gesture by the user.
In an embodiment of the application, the AR/VR device may detect a sliding gesture of a user with respect to the application navigation interface; in response to the swipe gesture, hiding the first application icon located in the first area, maintaining display of the second application icon located in the first area, and displaying the third application icon in the first area, and maintaining display of the fourth application icon on the second area, that is, application icons within the second area are not hidden in response to the swipe gesture by the user.
In this embodiment, the application navigation interface includes two regions, one of the regions (the first region) may display application icons that move, hide, or display along with the sliding operation of the user, and the other region (the second region) may display icons that are fixed, where "fixed" may be understood as that the number and types of the application icons are not changed, and even the positions are not changed.
Taking the first application icon located on the left side of the second application icon and the user sliding leftward as an example, referring to fig. 17, which shows an application navigation interface provided in this embodiment, the application navigation interface may include a first area 601 and a second area 602; specifically, the first area 601 and the second area 602 may be separated by a preset identifier serving as an isolation indication. The AR/VR device may detect the user's slide gesture on the application navigation interface and, in response to the slide gesture, hide the first application icon located in the first area 601, maintain the display of the second application icon 502 located in the first area 601, display the third application icon 503 in the first area 601, and maintain the display of the fourth application icon 603 in the second area 602.
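The two-area behavior can be sketched as a small model in which only the first area reacts to swipes while the docked second area stays fixed (illustrative class and method names; the patent does not prescribe this structure):

```python
class NavigationInterface:
    """Two-area navigation model: the first area scrolls with slide
    gestures; the second (docked) area never changes."""

    def __init__(self, scrollable, docked, visible_count):
        self.scrollable = list(scrollable)   # first-area icons, full list
        self.docked = list(docked)           # second-area icons, fixed
        self.visible_count = visible_count
        self.start = 0                       # window start into scrollable

    def on_swipe_left(self):
        max_start = max(0, len(self.scrollable) - self.visible_count)
        self.start = min(self.start + 1, max_start)

    def visible_icons(self):
        first = self.scrollable[self.start:self.start + self.visible_count]
        return first, list(self.docked)      # docked icons are unaffected
```

Only the first element of the returned pair ever changes in response to a swipe, mirroring how the fourth application icon in the second area keeps its display.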
In this embodiment, the AR/VR device may support the user in moving an application icon from the first area to the second area, and an application icon moved to the second area is not hidden, moved, or re-displayed along with the user's sliding gesture (e.g., as shown in fig. 18). In one embodiment, the AR/VR device may detect a first target gesture of the user with respect to the second application icon and move the second application icon to the second area; in this case, if the AR/VR device detects a sliding gesture of the user with respect to the application navigation interface, display of the second application icon in the second area is maintained in response to the sliding gesture. It should be understood that the second application icon is only an illustration; in this embodiment, a first target gesture of the user with respect to any application icon on the application navigation interface may be detected, and that application icon moved to the second area.
In one implementation, the first target gesture includes a gesture of long-pressing the second application icon, a gesture of maintaining the long press while dragging the second application icon to the second region, and a gesture of raising the hand in the second region. Specifically, referring to fig. 19, which shows an illustration of the application navigation interface provided by this embodiment, the user may long-press the second application icon 502, maintain the long press while dragging the icon to the second area, and raise the hand in the second area; the AR/VR device detects this gesture and moves the second application icon 502 to the second area 602 (as shown in fig. 20).
In one implementation, the first target gesture includes a gesture of pinching the second application icon, a gesture of maintaining the pinch while dragging the second application icon to the second region, and a gesture of releasing the pinch in the second region. It should be understood that the pinch gesture may be, but is not limited to, a pinch of the index finger and thumb.
Specifically, referring to fig. 21, which shows an illustration of the application navigation interface provided in this embodiment, the user may pinch the second application icon 502, maintain the pinch while dragging the icon to the second area, and release the pinch in the second area; the AR/VR device detects this gesture and moves the second application icon 502 to the second area 602.
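The press-drag-release sequence described in the two implementations above can be sketched as a small state machine (a hypothetical illustration; the class, method, and area names are assumptions, and the same flow covers both the long-press and the pinch variant):

```python
class MoveIconGesture:
    """Toy state machine: hold an icon, drag it into the second area, release."""

    def __init__(self, nav):
        self.nav = nav            # dict with "first_area" / "second_area" lists
        self.held_icon = None
        self.hand_over = "first"  # area the user's hand is currently over

    def press(self, icon):
        # Long-press or pinch an icon in the first area.
        if icon in self.nav["first_area"]:
            self.held_icon = icon

    def drag_to(self, area):
        # Maintain the press/pinch while moving the hand.
        if self.held_icon is not None:
            self.hand_over = area

    def release(self):
        # Raise the hand / release the pinch; commit the move if over area two.
        if self.held_icon is not None and self.hand_over == "second":
            self.nav["first_area"].remove(self.held_icon)
            self.nav["second_area"].append(self.held_icon)
        self.held_icon = None

nav = {"first_area": ["icon_1", "icon_2"], "second_area": ["icon_4"]}
gesture = MoveIconGesture(nav)
gesture.press("icon_2")
gesture.drag_to("second")
gesture.release()
```

Releasing anywhere other than the second area simply drops the held icon without moving it, which matches the commit-on-release character of the gesture.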
In one implementation, the user may long-press (or perform another gesture operation on) an application icon on the application navigation interface; after a certain length of time elapses, the VR/AR device may display a shortcut tool in an area near the application icon. For example, referring to fig. 22, taking the icon of a music application as an example, the shortcut tool may appear above the application icon and may include shortcut operation controls for the music application (e.g., the play control and the audio-switching control shown in fig. 22). In this embodiment, a trigger mechanism for the shortcut tool is provided on the application navigation interface, giving the user a convenient and fast way to operate.
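A minimal sketch of the shortcut-tool trigger just described, assuming a press-duration threshold (the 0.8 s value and the control names are illustrative assumptions; the patent does not specify them):

```python
LONG_PRESS_SECONDS = 0.8  # assumed threshold; the patent leaves the duration open

def shortcut_tool(app_name, press_duration):
    """Return the shortcut controls to show near the icon, or None."""
    if press_duration < LONG_PRESS_SECONDS:
        return None  # press too short: no shortcut tool appears
    # A music application might expose playback shortcuts, for example.
    controls = {"music": ["play", "switch_audio"]}
    return controls.get(app_name, ["open"])
```

A press shorter than the threshold yields no tool; a sufficiently long press on the music icon yields its playback controls.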
It should be understood that the methods provided herein may also be applied to interfaces other than the application navigation interface.
The embodiment of the application provides an icon display method, applied to an AR device or a VR device, including: displaying an application navigation interface, where the application navigation interface includes a first application icon and a second application icon; detecting a sliding gesture of a user with respect to the application navigation interface; and, in response to the sliding gesture, hiding the first application icon, maintaining display of the second application icon on the application navigation interface, and displaying a third application icon on the application navigation interface. In existing implementations, a user must enter a multilevel menu to find more application entries; given the limited display area of an AR or VR device, this approach instead lets the user switch the displayed application icons on the application navigation interface with a single sliding gesture, which is more convenient. Meanwhile, in this embodiment, in response to the user's sliding gesture, the application icons on the application navigation interface are not all hidden; the display of some application icons (the second application icon) is maintained, so that even if the user slides by mistake, those icons remain displayed, and the user may not need to slide back to the previous application navigation interface to select the application to be opened, saving operation cost.
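The hide-one / keep-some / reveal-one behavior summarized above amounts to shifting a visible window over the icon list; a minimal sketch (hypothetical function and icon names, not from the patent):

```python
def swipe(icons, offset, visible_count):
    """Shift the visible window by one icon, if more icons remain."""
    if offset + visible_count < len(icons):
        offset += 1
    return offset, icons[offset:offset + visible_count]

# Three icons, two visible: a swipe hides "first", keeps "second",
# and newly displays "third".
offset, shown = swipe(["first", "second", "third"], 0, 2)
```

When the window already reaches the end of the list, the swipe is a no-op, so a mis-swipe can never empty the interface.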
Referring to fig. 23, an embodiment of the present application further provides an icon display apparatus, which is applied to an AR/VR/MR device, and includes:
a display module 2301, configured to display an application navigation interface, where the application navigation interface includes a first application icon and a second application icon;
a detection module 2302 for detecting a sliding gesture of a user for the application navigation interface;
the display module 2301 is configured to hide the first application icon in response to the swipe gesture, maintain the display of the second application icon on the application navigation interface, and display a third application icon on the application navigation interface.
In one possible implementation, the display module 2301 is configured to move the second application icon in an operation direction of the sliding operation.
In a possible implementation, the display module 2301 is configured to display the third application icon at a position where the second application icon is located before moving, or gradually display and move the third application icon from a hidden state to a position where the second application icon is located before moving.
In a possible implementation, the first application icon is located in a target direction of the second application icon, and the display module 2301 is configured to hide the first application icon in response to the sliding gesture based on an included angle between an operation direction of the sliding gesture and the target direction being within a preset range.
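A sketch of the angle test just described (the 30° preset range is an assumption; the patent leaves the range unspecified, and all names are illustrative):

```python
import math

ANGLE_RANGE_DEG = 30.0  # assumed preset range

def angle_between(v1, v2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(v1[0], v1[1]) * math.hypot(v2[0], v2[1])
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def should_hide(swipe_dir, target_dir):
    """Hide only when the swipe roughly follows the target direction."""
    return angle_between(swipe_dir, target_dir) <= ANGLE_RANGE_DEG
```

With the first icon to the left of the second, the target direction is (-1, 0): a leftward swipe falls within the range and hides the icon, while a vertical swipe does not.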
In one possible implementation, the display module 2301 is configured to move the first application icon in the operation direction of the sliding gesture in response to the sliding gesture;
when the first application icon moves to a target position of the navigation interface, the first application icon is hidden, and the target position is close to or located at the edge of the navigation interface.
In one possible implementation, a target identifier is displayed at the target position, the target identifier being used to indicate that the target position is a position where icons are hidden.
In a possible implementation, the display module 2301 is configured to hide an area of the first application icon that coincides with the target identifier until the first application icon is hidden.
In a possible implementation, the application navigation interface further includes a strip-shaped identifier, the first application icon and the second application icon are located on the strip-shaped identifier, and the target position is located at the edge of the strip-shaped identifier; the display module 2301 is configured to drop the first application icon from the edge of the strip-shaped identifier to a position below the edge, and to hide the first application icon when the movement displacement of the first application icon exceeds a preset value.
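The drop-below-the-strip behavior can be sketched as a threshold on the falling displacement (the threshold value and the state names are illustrative assumptions):

```python
DROP_THRESHOLD = 40.0  # assumed preset displacement value

def icon_state(drop_displacement):
    """State of an icon falling from the edge of the strip-shaped identifier."""
    if drop_displacement <= 0:
        return "on_strip"
    if drop_displacement <= DROP_THRESHOLD:
        return "dropping"   # below the edge but not yet hidden
    return "hidden"         # displacement exceeded the preset value
```

The icon remains visible while it falls, and is removed from display only once its displacement passes the preset value.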
In one possible implementation, the display module 2301 is configured to decrease the display definition of the first application icon as the first application icon moves until the first application icon is hidden.
In one possible implementation, the application navigation interface includes a first region and a second region, where the first application icon and the second application icon are located in the first region, and application icons within the second region are not hidden in response to the user's swipe gesture; the display module 2301 is further configured to move the second application icon to the second region in response to a first target gesture, the first target gesture including a gesture of pinching the second application icon, a gesture of maintaining the pinch while dragging the second application icon to the second region, and a gesture of releasing the pinch in the second region.
In one possible implementation, the application navigation interface includes a first area and a second area, the first area includes the first application icon and the second application icon, the second area includes a fourth application icon, and the display module is configured to maintain display of the fourth application icon on the second area after detecting a sliding gesture of a user.
In a possible implementation, the detecting module 2302 is configured to detect a first target gesture of a user on the second application icon to move the second application icon to the second area; detecting a sliding gesture of a user aiming at the application navigation interface;
the display module is configured to maintain display of the second application icon on the second region in response to the swipe gesture.
In one possible implementation, the first target gesture includes a gesture of long-pressing the second application icon, a gesture of maintaining the long press while dragging the second application icon to the second area, and a gesture of raising the hand in the second area; or,
the first target gesture includes a gesture of pinching the second application icon, a gesture of maintaining the pinch while dragging the second application icon to the second region, and a gesture of releasing the pinch in the second region.
In a possible implementation, the detecting module 2302 is configured to detect a second target gesture of the user for the fourth application icon to move the fourth application icon to the first area; detecting a sliding gesture of a user aiming at the application navigation interface;
the display module is configured to maintain display of the fourth application icon in the first region in response to the swipe gesture, or to hide the fourth application icon.
In one possible implementation, the second target gesture includes a gesture of long-pressing the fourth application icon, a gesture of maintaining the long press while dragging the fourth application icon to the first area, and a gesture of raising the hand in the first area; or,
the second target gesture includes a gesture of pinching the fourth application icon, a gesture of maintaining the pinch while dragging the fourth application icon to the first region, and a gesture of releasing the pinch in the first region.
In a possible implementation, the display module 2301 is configured to move the first application icon in the operation direction of the sliding operation, and decrease the display definition of the first application icon along with the movement of the first application icon until the first application icon is hidden.
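The fade-while-moving behavior can be sketched as opacity decreasing linearly with the distance moved (the linear ramp and the hide distance are assumptions for illustration):

```python
def icon_opacity(moved, hide_distance=100.0):
    """Opacity in [0, 1]: full when the icon starts moving, zero when hidden."""
    remaining = max(0.0, hide_distance - moved)
    return remaining / hide_distance
```

The icon starts fully opaque, fades as it travels toward the hide position, and any movement beyond the hide distance clamps the opacity at zero.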
In one possible implementation, the application navigation interface further includes a target identifier, and the display module is configured to move the first application icon along the operation direction of the sliding gesture and to hide the area of the first application icon that coincides with the target identifier, until the first application icon is hidden.
In a simple embodiment, those skilled in the art will appreciate that the terminal device may take the form shown in fig. 24. The terminal device shown in fig. 24 may be the AR/VR device in the above-described embodiments.
The apparatus 2400 shown in fig. 24 includes at least one processor 2401, a transceiver 2402, and optionally a memory 2403.
In one possible implementation, the apparatus 2400 may further include a display 2406; the apparatus may also include a sensor 2405 for capturing the pose and position of the terminal device.
The memory 2403 may be a volatile memory, such as a random access memory (RAM); a non-volatile memory, such as, but not limited to, a read-only memory (ROM), a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory 2403 may also be a combination of the above.
The specific connection medium between the processor 2401 and the memory 2403 is not limited in the embodiment of the present application. In this embodiment, the memory 2403 and the processor 2401 are connected through a bus, represented by a thick line in the figure; the connection manner between other components is merely illustrative and not limiting. The bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 24, but this does not mean there is only one bus or only one type of bus.
When the terminal device takes the form shown in fig. 24, the processor 2401 in fig. 24 may cause the terminal device to execute the method executed by the terminal device in any of the above method embodiments by calling the computer-executable instructions stored in the memory 2403.
Specifically, the memory 2403 stores computer-executable instructions for implementing the detection module and the display module in fig. 23 and executing the corresponding steps; the functions/implementation processes of the detection module and the display module in fig. 23 can both be implemented by the processor 2401 in fig. 24 calling the computer-executable instructions stored in the memory 2403.
When the processor 2401 executes a function of the display module, such as an operation related to displaying an image, the processor 2401 may display the image through the display 2406 in the apparatus 2400.
Optionally, when executing a function of the display module, the processor 2401 may also display an image through a display in another device, for example, by sending a display instruction to the other device to instruct it to display the image.
The embodiment of the present application further provides a user interface (UI), which may also be referred to as an interactive interface or a user interactive interface, and which, according to its function, may also be referred to as an application navigation interface. Specifically, the user interface includes:
a first application icon and a second application icon, where, when the user performs a swipe gesture operation on the user interface, the first application icon is hidden, the second application icon maintains display on the user interface, and a third application icon is newly displayed.
In one possible implementation, in the case that the user performs an operation of a slide gesture on the user interface, the second application icon moves along an operation direction of the slide gesture.
In a possible implementation, in the case of a user performing a sliding gesture on the user interface, the third application icon is displayed at a position where the second application icon is located before moving, or the third application icon is gradually displayed from a hidden state and moves to a position where the second application icon is located before moving.
In one possible implementation, in the case of an operation of a slide gesture performed on the user interface by a user, the first application icon is moved along an operation direction of the slide gesture; when the first application icon moves to a target position of the navigation interface, the first application icon is hidden, and the target position is close to or located at the edge of the navigation interface.
In one possible implementation, a target identifier is displayed at the target position, the target identifier being used to indicate that the target position is a position where icons are hidden.
In one possible implementation, in the case that the user performs a sliding gesture on the user interface, hiding an area of the first application icon that coincides with the target identifier until the first application icon is hidden.
In a possible implementation, the display interface further includes a strip-shaped identifier, the first application icon and the second application icon are located on the strip-shaped identifier, and the target position is located at the edge of the strip-shaped identifier; when the user performs a sliding gesture on the user interface, the first application icon drops from the edge of the strip-shaped identifier to a position below the edge, and when the movement displacement of the first application icon exceeds a preset value, the first application icon is hidden.
In one possible implementation, in the case of a user operation of a slide gesture on the user interface, the display definition of the first application icon is reduced as the first application icon moves until the first application icon is hidden.
In one possible implementation, the display interface includes a first area and a second area, wherein the first application icon and the second application icon are located in the first area, and application icons within the second area are not hidden in response to the swipe gesture by the user.
For how the display elements included in the user interface respond to user operations, reference may be made to the descriptions in the above embodiments; similar parts are not described again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application essentially, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or another network device) to execute all or part of the steps of the method described in the embodiments of fig. 2 to 16 of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. An icon display method, applied to an Augmented Reality (AR) device, a Virtual Reality (VR) device or a Mixed Reality (MR) device, comprising:
displaying an application navigation interface, wherein the application navigation interface comprises a first application icon and a second application icon;
detecting a sliding gesture of a user with respect to the application navigation interface;
in response to the swipe gesture, hiding the first application icon, maintaining display of the second application icon on the application navigation interface, and displaying a third application icon on the application navigation interface.
2. The method of claim 1, wherein said maintaining the display of the second application icon comprises:
moving the second application icon along the operation direction of the sliding gesture.
3. The method of claim 2, wherein displaying a third application icon on the application navigation interface comprises:
and displaying the third application icon at the position where the second application icon is located before moving, or gradually displaying and moving the third application icon from the hidden state to the position where the second application icon is located before moving.
4. The method of any of claims 1 to 3, wherein the first application icon is located in a target direction of the second application icon, and wherein hiding the first application icon in response to the swipe gesture comprises:
and responding to the sliding gesture, and hiding the first application icon based on the fact that the included angle between the operation direction of the sliding gesture and the target direction is within a preset range.
5. The method of any of claims 1-3, wherein said hiding the first application icon in response to the swipe gesture comprises:
in response to the swipe gesture, moving the first application icon in an operational direction of the swipe gesture;
when the first application icon moves to a target position of the navigation interface, the first application icon is hidden, and the target position is close to or located at the edge of the navigation interface.
6. The method of any of claims 1-3, wherein the application navigation interface includes a first area and a second area, wherein the first application icon and the second application icon are located in the first area, and wherein hiding the first application icon in response to the swipe gesture includes:
in response to the swipe gesture, moving the first application icon in an operational direction of the swipe gesture;
when the first application icon moves to a target position of the navigation interface, the first application icon is hidden, wherein the target position is located between the first area and the second area, and the application icon contained in the second area is kept unchanged in the moving process or the hiding process of the first application icon.
7. The method according to claim 5 or 6, wherein a target identifier is displayed at the target position, the target identifier being used to indicate that the target position is a position where icons are hidden.
8. The method of claim 7, wherein hiding the first application icon comprises:
hiding an area of the first application icon, which is overlapped with the target identifier, until the first application icon is hidden.
9. The method of claim 5, wherein the application navigation interface further comprises a strip-shaped identifier, wherein the first application icon and the second application icon are located on the strip-shaped identifier, wherein the target position is located at an edge of the strip-shaped identifier, and wherein hiding the first application icon comprises:
dropping the first application icon from the edge of the strip-shaped identifier to a position below the edge, and hiding the first application icon when the drop distance of the first application icon exceeds a preset value.
10. The method according to any of claims 1 to 9, wherein said hiding the first application icon comprises:
reducing the display clarity of the first application icon as the first application icon moves until the first application icon is hidden.
11. The method of any of claims 1-10, wherein the application navigation interface includes a first region and a second region, wherein the first application icon and the second application icon are located in the first region, and wherein application icons in the second region are not hidden in response to the swipe gesture by the user, the method further comprising:
moving the second application icon to the second region in response to a first target gesture, the first target gesture including a gesture of pinching the second application icon, a gesture of maintaining the pinch while dragging the second application icon to the second region, and a gesture of releasing the pinch in the second region.
12. An electronic device comprising a processor, and a memory, wherein the memory is configured to store a computer program comprising program instructions that, when executed by the processor, cause the electronic device to perform the method of any of claims 1 to 11.
13. A computer-readable storage medium, characterized in that it stores a computer program comprising program instructions which, when executed by a computer, cause the computer to carry out the method according to any one of claims 1 to 11.
CN202010899281.4A 2020-08-31 2020-08-31 Icon display method and device Active CN112181219B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010899281.4A CN112181219B (en) 2020-08-31 2020-08-31 Icon display method and device
CN202211085735.XA CN116301485A (en) 2020-08-31 2020-08-31 Icon display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010899281.4A CN112181219B (en) 2020-08-31 2020-08-31 Icon display method and device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202211085735.XA Division CN116301485A (en) 2020-08-31 2020-08-31 Icon display method and device

Publications (2)

Publication Number Publication Date
CN112181219A true CN112181219A (en) 2021-01-05
CN112181219B CN112181219B (en) 2022-09-23

Family

ID=73925615

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202211085735.XA Pending CN116301485A (en) 2020-08-31 2020-08-31 Icon display method and device
CN202010899281.4A Active CN112181219B (en) 2020-08-31 2020-08-31 Icon display method and device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202211085735.XA Pending CN116301485A (en) 2020-08-31 2020-08-31 Icon display method and device

Country Status (1)

Country Link
CN (2) CN116301485A (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109976626A (en) * 2019-02-19 2019-07-05 华为技术有限公司 A kind of switching method and electronic equipment of application icon
CN110602321A (en) * 2019-09-11 2019-12-20 腾讯科技(深圳)有限公司 Application program switching method and device, electronic device and storage medium
CN111078091A (en) * 2019-11-29 2020-04-28 华为技术有限公司 Split screen display processing method and device and electronic equipment
CN111163205A (en) * 2019-12-25 2020-05-15 上海传英信息技术有限公司 Display method, display device and computer storage medium
US20200219322A1 (en) * 2019-01-09 2020-07-09 Vmware, Inc. Snapping, virtual inking, and accessibility in augmented reality
CN111596757A (en) * 2020-04-02 2020-08-28 林宗宇 Gesture control method and device based on fingertip interaction


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113342433A (en) * 2021-05-08 2021-09-03 Hangzhou Lingban Technology Co Ltd Application page display method, head-mounted display device and computer readable medium
CN115396558A (en) * 2021-05-24 2022-11-25 Seiko Epson Corporation Multifunction device, display control method for multifunction device, and recording medium
US11722612B2 (en) 2021-05-24 2023-08-08 Seiko Epson Corporation Multifunction device, display control method of multifunction device, and non-transitory computer-readable storage medium storing display control program
CN115396558B (en) * 2021-05-24 2023-10-03 Seiko Epson Corporation Multifunction peripheral, display control method for multifunction peripheral, and recording medium
CN113835585A (en) * 2021-09-28 2021-12-24 Shenzhen Jizhi Digital Technology Co Ltd Application interface switching method, device and equipment based on navigation and storage medium
WO2024066754A1 (en) * 2022-09-29 2024-04-04 Goertek Inc Interaction control method and apparatus, and electronic device

Also Published As

Publication number Publication date
CN116301485A (en) 2023-06-23
CN112181219B (en) 2022-09-23

Similar Documents

Publication Publication Date Title
CN112181219B (en) Icon display method and device
CN107341018B (en) Method and device for continuously displaying view after page switching
CN108604175B (en) Apparatus and associated methods
CN106716302B (en) Method, apparatus, and computer-readable medium for displaying image
EP3926441B1 (en) Output of virtual content
EP2359915B1 (en) Media viewing
US20150121225A1 (en) Method and System for Navigating Video to an Instant Time
US11825177B2 (en) Methods, systems, and media for presenting interactive elements within video content
US20140368495A1 (en) Method and system for displaying multi-viewpoint images and non-transitory computer readable storage medium thereof
CN107765986B (en) Information processing method and device of game system
US20130155108A1 (en) Augmented Reality User Interaction Methods, Computing Devices, And Articles Of Manufacture
WO2014125403A2 (en) Method of video interaction using poster view
EP3236336B1 (en) Virtual reality causal summary content
US20160103574A1 (en) Selecting frame from video on user interface
EP3860109A1 (en) Method for processing vr video, and related apparatus
CN106970734B (en) Task starting method and device for display device
EP2341412A1 (en) Portable electronic device and method of controlling a portable electronic device
CN115396741A (en) Panoramic video playing method and device, electronic equipment and readable storage medium
CN112256167A (en) Multimedia resource switching method and device, electronic equipment and storage medium
CN117687508A (en) Interactive control method, device, electronic equipment and computer readable storage medium
CN115617163A (en) Display control method, display control device, head-mounted display equipment and medium
WO2015081528A1 (en) Causing the display of a time domain video image
CN118260005A (en) Information display method and device based on AR technology and electronic equipment
CN118142164A (en) Game information display method, game information display device, electronic device, and medium
CN117369677A (en) Cursor position determining method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant