CN116880726A - Icon interaction method and device for 3D space, electronic equipment and medium


Info

Publication number
CN116880726A
CN116880726A
Authority
CN
China
Prior art keywords
information
position information
icon
initial
screen position
Prior art date
Legal status
Granted
Application number
CN202311140121.1A
Other languages
Chinese (zh)
Other versions
CN116880726B (en)
Inventor
蒋斌
刘进
张洲
徐晨晨
何会亮
Current Assignee
Shenzhen Kilakila Technology Co., Ltd.
Original Assignee
Shenzhen Kilakila Technology Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shenzhen Kilakila Technology Co., Ltd.
Priority to CN202311140121.1A
Publication of CN116880726A
Application granted
Publication of CN116880726B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the invention disclose an icon interaction method and apparatus for 3D space, an electronic device, and a medium. One embodiment of the method comprises the following steps: in response to determining that screen position information satisfies a preset position condition, determining the screen position information as initial screen position information and acquiring initial icon parameter information for each icon displayed in the 3D space; in response to determining that an interactive operation satisfies a preset operation condition and that the screen position information corresponding to the interactive operation has been updated, performing the following operation steps: generating display parameter information for each icon from the updated screen position information, the initial screen position information, and each item of initial icon parameter information, and updating each displayed icon; and determining whether the current interactive operation satisfies the preset operation condition and whether the screen position information corresponding to the current interactive operation has been updated. The method and apparatus reduce waste of screen display resources and improve the user experience.

Description

Icon interaction method and device for 3D space, electronic equipment and medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to an icon interaction method and apparatus for 3D space, an electronic device, and a medium.
Background
In the Internet era, the way online information is displayed and interacted with strongly affects both the efficiency of information delivery and the user experience. A typical icon interaction displays a corresponding icon view in response to detecting a user action such as a click or a swipe.
However, the inventors have found that icon interaction performed in the above manner often suffers from the following technical problems:
First, when icons are displayed on a two-dimensional plane within the screen, the number of icons that can be shown at once is limited, which wastes screen display resources. Moreover, common interaction designs force the user to perform frequent operations to reach the information they need, degrading the user experience.
Second, when the icon view is displayed according to the user's swipe gesture, a fast swipe causes a long delay before the updated view is shown, so the display stutters and the user experience suffers.
Third, when matching article icons against a selected icon in response to detecting the user's selection of that icon, matching by image alone, without considering the label information of the selected icon and of the article icons, yields low accuracy for the matched articles and therefore a poor user experience.
The information disclosed in this Background section is provided only to enhance understanding of the background of the inventive concept, and may therefore contain information that does not constitute prior art already known to a person of ordinary skill in the art in this country.
Disclosure of Invention
This Summary is provided to introduce a selection of concepts in simplified form that are further described below in the Detailed Description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose an icon interaction method, apparatus, electronic device, and computer-readable medium for 3D space to solve one or more of the technical problems mentioned in the Background section above.
In a first aspect, some embodiments of the present disclosure provide an icon interaction method for a 3D space, the method including: in response to detecting an interactive operation acting on the screen, determining whether screen position information of the interactive operation meets a preset position condition; in response to determining that the screen position information meets the preset position condition, determining the screen position information as initial screen position information, and acquiring initial icon parameter information of each icon displayed in a 3D space; in response to determining that the interactive operation meets the preset operation condition and the screen position information corresponding to the interactive operation is updated, executing the following operation steps based on the initial screen position information and each initial icon parameter information: determining updated screen position information corresponding to the interactive operation; generating display parameter information of each icon according to the updated screen position information, the initial screen position information and the initial icon parameter information of each icon; updating each displayed icon according to each display parameter information; and determining whether the current interaction operation meets the preset operation condition or not and whether the screen position information corresponding to the current interaction operation is updated or not by taking the updated screen position information as initial screen position information and taking the display parameter information as initial icon parameter information so as to execute the operation steps again.
In a second aspect, some embodiments of the present disclosure provide an icon interaction apparatus for a 3D space, including: a determining unit configured to determine whether screen position information of an interaction satisfies a preset position condition in response to detection of the interaction on the screen; an acquisition unit configured to determine the screen position information as initial screen position information and acquire respective initial icon parameter information of respective icons displayed in a 3D space in response to determining that the screen position information satisfies the preset position condition; an execution unit configured to, in response to determining that the interactive operation satisfies a preset operation condition, and that screen position information corresponding to the interactive operation is updated, execute the following operation steps based on the initial screen position information and each of the initial icon parameter information: determining updated screen position information corresponding to the interactive operation; generating display parameter information of each icon according to the updated screen position information, the initial screen position information and the initial icon parameter information of each icon; updating each displayed icon according to each display parameter information; and determining whether the current interaction operation meets the preset operation condition or not and whether the screen position information corresponding to the current interaction operation is updated or not by taking the updated screen position information as initial screen position information and taking the display parameter information as initial icon parameter information so as to execute the operation steps again.
In a third aspect, some embodiments of the present disclosure provide an electronic device comprising: one or more processors; a storage means for storing one or more programs; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method described in any of the implementations of the first aspect above.
The above embodiments of the present disclosure have the following beneficial effects: the icon interaction method for 3D space reduces waste of screen display resources and improves the user experience. Specifically, screen display resources are wasted and the user experience suffers for the following reasons: when icons are displayed on a two-dimensional plane within the screen, the number of icons that can be shown at once is limited, and common interaction designs force the user to perform frequent operations to reach the information they need. On this basis, the icon interaction method of some embodiments of the present disclosure first determines, in response to detecting an interactive operation acting on the screen, whether the screen position information of the interactive operation satisfies a preset position condition; this establishes whether the position of the user's operation on the screen satisfies the condition. Then, in response to determining that it does, the screen position information is taken as initial screen position information, and the initial icon parameter information of each icon displayed in the 3D space is acquired, yielding the starting position of the user's operation and the initial parameters of each icon. Next, in response to determining that the interactive operation satisfies a preset operation condition and that its screen position information has been updated, the following operation steps are executed based on the initial screen position information and each item of initial icon parameter information. First, the updated screen position information corresponding to the interactive operation is determined, giving the current position of the interaction on the screen. Second, display parameter information for each icon is generated from the updated screen position information, the initial screen position information, and each icon's initial icon parameter information, giving the display parameters that correspond to the updated screen position. Third, each displayed icon is updated according to its display parameter information. Finally, with the updated screen position information taken as the initial screen position information and the display parameter information taken as the initial icon parameter information, it is determined whether the current interactive operation satisfies the preset operation condition and whether its screen position information has been updated, so that the operation steps can be executed again. Thus, whenever the current interactive operation satisfies the preset operation condition and its screen position differs from that of the previous interactive operation, the steps repeat. And because the icons are displayed in a 3D virtual space rather than on a two-dimensional plane, the information a page can present is enriched.
Moreover, since the display parameter information of every icon can be adjusted in response to each detected interactive operation, the user can browse rich icon information with fewer operations, which reduces the waste of screen display resources and improves the user experience.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a flow chart of some embodiments of an icon interaction method for a 3D space according to the present disclosure;
FIG. 2 is a schematic structural diagram of some embodiments of an icon interaction apparatus for a 3D space according to the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings. Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that "one" should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates a flow 100 of some embodiments of an icon interaction method for a 3D space according to the present disclosure. The icon interaction method of the 3D space comprises the following steps:
In step 101, in response to detecting an interactive operation acting on a screen, it is determined whether screen position information of the interactive operation meets a preset position condition.
In some embodiments, in response to detecting an interactive operation acting on a screen, the execution body of the icon interaction method for 3D space (e.g., a computing device) may determine whether the screen position information of the interactive operation satisfies a preset position condition. The interactive operation may be an operation performed by a user on the display screen, for example a drag, a flick, or a swipe. The screen position information may be the position coordinates of the current interactive operation on the screen, expressed as two-dimensional coordinates in the plane of the screen. The preset position condition may be that the screen position information falls within a circular area centered on the screen's center point with a preset distance as its radius; the choice of the preset distance is not limited here. In practice, in response to detecting an interactive operation acting on the screen, the execution body may first acquire the screen position information of the interactive operation through the MotionEvent API, and then, in response to determining that the screen position information lies within that circular area, determine that the preset position condition is satisfied.
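As an illustrative, non-limiting sketch (not part of the original disclosure), the position condition above can be expressed as a simple hit test; all identifiers are the author's assumptions, and on Android the touch coordinates would come from MotionEvent.getX()/getY():

```kotlin
import kotlin.math.hypot

// Sketch of the preset position condition: the touch point must fall inside a
// circle of radius presetDistance (an assumed, application-chosen value)
// centered on the screen.
fun satisfiesPositionCondition(
    touchX: Float, touchY: Float,
    screenWidth: Float, screenHeight: Float,
    presetDistance: Float
): Boolean {
    val dx = touchX - screenWidth / 2f   // offset from the screen center, x
    val dy = touchY - screenHeight / 2f  // offset from the screen center, y
    return hypot(dx, dy) <= presetDistance
}
```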
Step 102, in response to determining that the screen position information meets the preset position condition, determining the screen position information as initial screen position information, and acquiring respective initial icon parameter information of each icon displayed in the 3D space.
In some embodiments, in response to determining that the screen position information satisfies the preset position condition, the execution body may determine the screen position information as initial screen position information and acquire the initial icon parameter information of each icon displayed in the 3D space. The initial screen position information may be the coordinates of the position on the screen where the user's interactive operation first acted; while an interaction is ongoing, it may be the coordinates where the previous interactive operation acted. The 3D space may be a virtual space in which the icons are displayed, with the center point of each icon lying on a virtual sphere whose diameter is adapted to the width of the screen. The initial icon parameter information may be each icon's display parameter information at the start of the interactive operation; display parameter information may include a display size, a display transparency value, and a display darkness value. In practice, the execution body may take each icon's position coordinates, display transparency value, and display darkness value at the start of the interactive operation as its initial icon parameter information.
Step 103, in response to determining that the interactive operation meets the preset operation condition, and updating screen position information corresponding to the interactive operation, based on the initial screen position information and each initial icon parameter information, executing the following operation steps:
step 1031, updated screen location information corresponding to the interactive operation is determined.
In some embodiments, the execution body may determine the updated screen position information corresponding to the interactive operation, namely the position coordinates of the current interactive operation on the screen. The preset operation condition may be that the interactive operation indicates that the user is sliding on the screen. In practice, the execution body may determine the screen position information corresponding to the interactive operation through the MotionEvent API.
Step 1032, generating each display parameter information of each icon according to the updated screen position information, the initial screen position information and each initial icon parameter information of each icon.
In some embodiments, the execution body may generate the respective display parameter information of the respective icons according to the updated screen position information, the initial screen position information, and the respective initial icon parameter information of the respective icons. The display parameter information may be display parameters of icons corresponding to the current interactive operation. In practice, the execution subject may generate the respective display parameter information of the respective icons according to the updated screen position information, the initial screen position information, and the respective initial icon parameter information of the respective icons in various manners.
In some optional implementations of some embodiments, the executing entity may generate the respective display parameter information of the respective icons according to updated screen position information, initial screen position information, and respective initial icon parameter information of the respective icons by:
first, according to the updated screen position information and the initial screen position information, determining the rotation information corresponding to each icon. The rotation information may represent a rotation matrix of the displacement of the current interaction operation and the adjacent previous interaction operation on the screen relative to the rotation of the virtual sphere. In practice, the execution body may determine the rotation information corresponding to the respective icons according to the updated screen position information and the initial screen position information in various manners.
The second step is to generate the display position information of each icon according to the rotation information. In practice, the execution body may multiply each icon's forward display position information by the rotation information to obtain its display position information, as sketched after this list of steps. The forward display position information may be the display position information each icon had for the forward interactive operation, i.e., the previous interactive operation adjacent to the current one.
And thirdly, determining the display attribute information of each icon according to the display position information. In practice, the execution body may determine the respective display attribute information of the respective icons according to the respective display position information in various manners.
And fourth, determining the display position information and the display attribute information of each icon as display parameter information to obtain the display parameter information.
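A hedged sketch of the second step above (multiplying each icon's forward display position by the rotation information), assuming a row-major 4x4 matrix acting on homogeneous coordinates; the layout and names are the author's assumptions:

```kotlin
// Multiply a 4x4 row-major rotation matrix m (FloatArray(16)) by a 3D position
// p (FloatArray(3)) in homogeneous coordinates to obtain the new display position.
fun rotatePosition(m: FloatArray, p: FloatArray): FloatArray {
    require(m.size == 16 && p.size == 3)
    return floatArrayOf(
        m[0] * p[0] + m[1] * p[1] + m[2] * p[2] + m[3],   // x'
        m[4] * p[0] + m[5] * p[1] + m[6] * p[2] + m[7],   // y'
        m[8] * p[0] + m[9] * p[1] + m[10] * p[2] + m[11]  // z'
    )
}
```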
In some optional implementations of some embodiments, the executing body may determine the rotation information corresponding to the respective icons according to the updated screen position information and the initial screen position information by:
The first step is to perform three-dimensional space conversion on the initial screen position information to obtain first spatial position information. The three-dimensional space conversion may be a process of converting two-dimensional coordinates on the screen into three-dimensional coordinates on the virtual sphere in the 3D space. The first spatial position information may represent the three-dimensional coordinates on the virtual sphere corresponding to the forward interactive operation. In practice, the execution body may perform the three-dimensional space conversion on the initial screen position information using an arcball algorithm to obtain the first spatial position information.
And a second step of performing three-dimensional space conversion processing on the updated screen position information to obtain second space position information. The second spatial position information may represent three-dimensional coordinates on a virtual sphere in the 3D space corresponding to the current interaction operation.
And thirdly, generating rotation information corresponding to each icon according to the first spatial position information and the second spatial position information. In practice, the execution body may generate the rotation information corresponding to each icon according to the first spatial position information and the second spatial position information in various manners.
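The three-dimensional space conversion in the first and second steps can be sketched with a classic arcball mapping; the exact normalization is an assumption, since the disclosure does not fix one:

```kotlin
import kotlin.math.sqrt

// Map a 2D screen point onto a unit sphere: rescale to [-1, 1], lift points
// inside the sphere onto its surface, and clamp points outside to the equator.
fun screenToSphere(x: Float, y: Float, width: Float, height: Float): FloatArray {
    val nx = (2f * x - width) / width    // normalize x to [-1, 1]
    val ny = (height - 2f * y) / height  // normalize y, flipped so screen-up is +y
    val d2 = nx * nx + ny * ny
    return if (d2 <= 1f) {
        floatArrayOf(nx, ny, sqrt(1f - d2))  // inside: lift onto the sphere
    } else {
        val len = sqrt(d2)
        floatArrayOf(nx / len, ny / len, 0f) // outside: project to the silhouette
    }
}
```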
In some optional implementations of some embodiments, for each of the respective display position information, the executing body may determine respective display attribute information of the respective icons according to the respective display position information by:
first, determining distance information between the display position information and a plane of a screen in the 3D space. In practice, the execution subject may determine a component of the display position information in the Z-axis direction in the 3D space as distance information of the display position information from a plane in which the screen is located.
The second step is to determine the distance interval corresponding to the distance information. The distance interval may be the interval into which the distance between the display position information and the plane of the screen falls. In practice, the execution body may look up, in a preset set of distance intervals, the interval containing the distance information. The preset set of distance intervals may be a set of such intervals; as an example, it may include [0, 2), [2, 4), [4, 6), [6, 8), and [8, 10]. When the distance information is 5, the corresponding distance interval is [4, 6).
And thirdly, determining display attribute information corresponding to the distance interval according to the distance interval. In practice, the executing body may query the display attribute information corresponding to the distance interval in the preset relationship table between the distance interval and the display attribute information, to obtain the display attribute information corresponding to the distance interval. The relationship table between the preset distance interval and the display attribute information may represent a relationship between the distance interval and the display attribute information.
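A minimal sketch of the interval lookup in the second and third steps, using the example intervals above; the attribute values are illustrative assumptions, since the relationship table is application-defined:

```kotlin
// Bucket an icon's Z distance from the screen plane into a preset interval and
// return the display attributes mapped to that interval.
data class DisplayAttrs(val transparency: Float, val darkness: Float)

fun attrsForDistance(z: Float): DisplayAttrs = when {
    z < 2f -> DisplayAttrs(transparency = 0.0f, darkness = 0.0f) // nearest: fully visible
    z < 4f -> DisplayAttrs(transparency = 0.2f, darkness = 0.2f)
    z < 6f -> DisplayAttrs(transparency = 0.4f, darkness = 0.4f)
    z < 8f -> DisplayAttrs(transparency = 0.6f, darkness = 0.6f)
    else   -> DisplayAttrs(transparency = 0.8f, darkness = 0.8f) // farthest: most faded
}
```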
In some optional implementations of some embodiments, the executing body may generate rotation information corresponding to the respective icons according to the first spatial position information and the second spatial position information by:
And a first step of generating rotation axis information and rotation angle information based on the first spatial position information and the second spatial position information. The rotation axis information may represent a rotation axis corresponding to each interactive operation. The rotation angle information may represent a rotation angle corresponding to each interactive operation. In practice, the execution body may generate the rotation axis information and the rotation angle information from the first spatial position information and the second spatial position information by:
The first sub-step is to take the center of the virtual sphere as the origin of the coordinate system and determine the vector from the sphere center to the point given by the first spatial position information as a first vector.
The second sub-step is to determine the vector from the sphere center to the point given by the second spatial position information as a second vector.
And a third sub-step of determining a result of the cross multiplication of the first vector and the second vector as rotation axis information.
And a fourth sub-step of subtracting the first vector from the second vector to obtain a difference vector.
And a fifth sub-step of determining a modulus of the difference vector as rotation angle information.
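The five sub-steps above can be sketched as follows. For unit vectors, |v2 - v1| equals 2*sin(θ/2), which approximates the true angle θ for small per-frame rotations without an inverse trigonometric call, matching the approximation discussed below:

```kotlin
import kotlin.math.sqrt

// Rotation axis: the cross product v1 x v2 of the two unit vectors from the
// sphere center (each a FloatArray(3)).
fun rotationAxis(v1: FloatArray, v2: FloatArray): FloatArray = floatArrayOf(
    v1[1] * v2[2] - v1[2] * v2[1],
    v1[2] * v2[0] - v1[0] * v2[2],
    v1[0] * v2[1] - v1[1] * v2[0]
)

// Rotation angle: the modulus of the difference vector v2 - v1.
fun rotationAngle(v1: FloatArray, v2: FloatArray): Float {
    val dx = v2[0] - v1[0]
    val dy = v2[1] - v1[1]
    val dz = v2[2] - v1[2]
    return sqrt(dx * dx + dy * dy + dz * dz)
}
```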
The second step is to generate rotation parameter information based on the rotation axis information and the rotation angle information. The rotation parameter information may characterize a rotation parameter vector. In practice, the execution body may determine (θ, u) as the rotation parameter information, where θ characterizes the rotation angle information and u characterizes the rotation axis information.
The third step is to normalize the rotation parameter information to obtain normalized parameter information. The normalization may be a process of converting the rotation parameter vector represented by the rotation parameter information into a unit vector; the normalized parameter information is the rotation parameter information after this processing. In practice, the execution body may determine the unit vector corresponding to the rotation parameter information as the normalized parameter information.
The fourth step is to generate accumulated parameter information from the normalized parameter information and the forward accumulated parameter information. The forward accumulated parameter information may be the accumulated parameter information corresponding to the forward interactive operation. The accumulated parameter information may characterize the cumulative rotation produced by successive interactive operations, i.e., their combined rotational displacement and rotation angle. In practice, the execution body may determine the product of the forward accumulated parameter information and the normalized parameter information as the accumulated parameter information.
The fifth step is to matrixize the accumulated parameter information to obtain the rotation information corresponding to each icon. The matrixing process may be a process of converting the accumulated parameter information into a matrix. In practice, the execution body may convert the accumulated parameter information into a 4×4 rotation matrix. As an example, when the accumulated parameter information is the unit quaternion (w, x, y, z), the corresponding rotation matrix is (1-2(y²+z²), 2(xy-wz), 2(xz+wy), 0; 2(xy+wz), 1-2(x²+z²), 2(yz-wx), 0; 2(xz-wy), 2(yz+wx), 1-2(x²+y²), 0; 0, 0, 0, 1).
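Assuming the normalized rotation parameter behaves as a unit quaternion q = (w, x, y, z) (an interpretation consistent with, but not stated verbatim in, the disclosure), the accumulation and matrixing steps can be sketched as:

```kotlin
// Unit quaternion with a Hamilton product for accumulating rotations, plus a
// conversion to the standard 4x4 homogeneous rotation matrix (row-major here).
data class Quat(val w: Float, val x: Float, val y: Float, val z: Float) {
    // Hamilton product: composes this rotation with o.
    operator fun times(o: Quat) = Quat(
        w * o.w - x * o.x - y * o.y - z * o.z,
        w * o.x + x * o.w + y * o.z - z * o.y,
        w * o.y - x * o.z + y * o.w + z * o.x,
        w * o.z + x * o.y - y * o.x + z * o.w
    )

    fun toMatrix(): FloatArray = floatArrayOf(
        1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y),     0f,
        2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x),     0f,
        2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y), 0f,
        0f,                      0f,                      0f,                      1f
    )
}
```

Here the accumulated parameter information would be `forward * current`, i.e., the product of the forward accumulated quaternion and the newly normalized one.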
The content above is an inventive point of the embodiments of the present disclosure and solves the second technical problem mentioned in the Background: when the icon view is displayed according to the user's swipe gesture, a fast swipe leads to a long delay before the updated view is shown, so the display stutters and the user experience suffers. To address this, the present disclosure uses an approximate calculation to obtain the rotation angle when determining the rotation information of the virtual sphere in the 3D space, which removes the time needed to compute the rotation angle with an inverse trigonometric function. This speeds up the icon view update, shortens the delay before the updated view reaches the user, reduces stutter, and improves the user experience.
And 1033, updating each displayed icon according to each display parameter information.
In some embodiments, the executing body may update each displayed icon according to each display parameter information. In practice, the executing body may redisplay the icons according to the display parameter information, so as to update the icons.
Step 1034, using the updated screen position information as initial screen position information and each display parameter information as each initial icon parameter information, determining whether the current interactive operation satisfies the preset operation condition and whether the screen position information corresponding to the current interactive operation is updated, so as to execute the operation step again.
In some embodiments, the executing body may use the updated screen position information as initial screen position information and use the respective display parameter information as respective initial icon parameter information, so as to determine whether the current interaction operation satisfies the preset operation condition and whether the screen position information corresponding to the current interaction operation is updated, so as to execute the operation step again. In practice, first, in response to detecting that the user slides on the screen, the execution subject may determine that the current interactive operation satisfies the preset operation condition. Then, in response to determining that the screen position information corresponding to the current interactive operation is different from the initial screen position information, the execution subject may determine that the screen position information corresponding to the current interactive operation is updated. After that, the execution body may execute the operation steps again.
Alternatively, in response to determining that the current interactive operation does not satisfy the preset operation condition, the execution body may first determine the initial speed information corresponding to the interactive operation. The initial speed information may be the inertial speed at the moment the user's finger leaves the screen. In practice, the execution body may determine the initial speed information using the VelocityTracker API.
Then, the displayed icons are updated and displayed according to the initial speed information. In practice, first, the executing body may input the initial speed information to a preset icon display interface to obtain each display position information and each display parameter information corresponding to each icon. And then, displaying the icons according to the display position information and the display parameter information. The preset icon display interface may be an interface for obtaining icon display position information and display parameter information according to the initial speed information.
Optionally, before updating and displaying each displayed icon according to the initial speed information, in response to determining that the initial speed information meets a preset speed condition, the executing body may further perform attenuation processing on the initial speed information to obtain speed information after the attenuation processing as the initial speed information. The preset speed condition may be that a speed corresponding to the initial speed is greater than a preset speed. Here, the setting of the above-described preset speed is not limited. In practice, in response to determining that the initial speed information satisfies the preset speed condition, the executing body may perform attenuation processing on the initial speed information in various manners, so as to obtain the speed information after the attenuation processing as the initial speed information.
In some optional implementations of some embodiments, in response to determining that the initial speed information satisfies the preset speed condition, the execution body may perform the attenuation processing on the initial speed information, obtaining the attenuated speed information as the initial speed information, by the following steps:
First, determine, from the initial speed information and preset attenuation parameter information, the attenuation speed information corresponding to the initial speed information. The preset attenuation parameter information may be the ratio by which the initial speed is attenuated; this ratio may be, for example, 1/2000. In practice, the execution body may determine the product of the initial speed information and the attenuation parameter information as the attenuation speed information.
Second, determine the attenuated speed information from the initial speed information and the attenuation speed information. In practice, the execution body may subtract the attenuation speed information from the initial speed information to obtain the attenuated speed information.
Third, determine the attenuated speed information as the initial speed information.
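A minimal sketch of the attenuation with the 1/2000 ratio named above; on Android the starting speed would come from VelocityTracker, and the per-frame loop below is an illustrative assumption about how the decay is repeated:

```kotlin
// One attenuation pass: subtract speed * ratio from the speed.
fun attenuate(speed: Float, ratio: Float = 1f / 2000f): Float = speed - speed * ratio

// Illustrative usage: decay until the speed no longer exceeds the preset speed
// (presetSpeed must be positive for the loop to terminate).
fun decayedSpeed(initialSpeed: Float, presetSpeed: Float): Float {
    var v = initialSpeed
    while (v > presetSpeed) v = attenuate(v)
    return v
}
```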
Alternatively, in response to detecting a selection operation of any one of the icons in the 3D space, first, the execution body may determine an icon on which the selection operation is performed as a target icon. Wherein, the selecting operation may include, but is not limited to, one of the following: clicking and hovering. The target icon may be an icon selected by the selection operation.
And then updating the target icon according to the preset size information, the preset display attribute information and the preset rotation information. The preset size information may be adapted to a display screen. The preset display attribute information may represent display transparency and display darkness corresponding to the target icon. The preset rotation information may represent a rotation matrix updated for the rotation of the target icon.
And secondly, extracting features of the image of the target icon to obtain an image feature vector of the target icon. The image feature vector may be a feature vector obtained by extracting features of the target icon. In practice, the execution subject may use an IDCNN neural network model to perform feature extraction on the image of the target icon, so as to obtain an image feature vector of the target icon.
And then, extracting the characteristics of the label information of the target icon to obtain the text characteristic vector of the target icon. The text feature vector may be a feature vector obtained by extracting text features from tag information of the target icon. In practice, the execution subject may perform feature extraction on the tag information of the target icon by using an XLNet deep learning model, so as to obtain a text feature vector of the target icon.
And then, generating feature information of the target icon according to the image feature vector and the text feature vector of the target icon. The feature information may be a feature vector obtained by fusing the image feature vector and the text feature vector. In practice, the execution subject may determine, as the feature information of the target icon, a result of adding the product of the image feature vector and the image feature coefficient to the product of the text feature vector and the text feature coefficient. The image feature coefficients may be weights of the image feature vectors. The text feature coefficients may be weights of text feature vectors.
And then, according to the characteristic information, selecting each item identifier corresponding to each piece of preset item characteristic information meeting preset conditions from the preset item characteristic information set as an item identifier set. The preset condition may be that a similarity value between the characteristic information and the characteristic information of the preset article is greater than or equal to a preset similarity value. The preset similarity value may be 80%. The article identifier set may be a set formed by respective article identifiers corresponding to respective preset article feature information having a similarity value equal to or greater than the preset similarity value. In practice, the executing body may determine, as the article identifier set, each article identifier corresponding to each preset article feature information having a similarity value greater than or equal to a preset similarity value in the preset article feature information set.
Finally, the icons and labels corresponding to the article identifiers in the article identifier set are displayed in sorted order. In practice, the execution body may use a heap-sort algorithm to sort the article identifiers in the set by article creation time, and display the icons and labels corresponding to the first preset number of identifiers. The choice of this preset number is not limited here.
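A hedged sketch of the fusion, matching, and ranking steps above. The weighted fusion follows the description; cosine similarity as the similarity value, the 0.8 threshold, and descending creation-time order are the author's assumptions (Kotlin's sortedByDescending stands in for the heap sort the disclosure names):

```kotlin
import kotlin.math.sqrt

data class Article(val id: String, val feature: FloatArray, val createdAtMillis: Long)

// Weighted fusion of the image and text feature vectors.
fun fuse(imageVec: FloatArray, textVec: FloatArray, wImg: Float, wTxt: Float): FloatArray =
    FloatArray(imageVec.size) { i -> wImg * imageVec[i] + wTxt * textVec[i] }

// Cosine similarity between two feature vectors of equal length.
fun cosine(a: FloatArray, b: FloatArray): Float {
    var dot = 0f; var na = 0f; var nb = 0f
    for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
    return dot / (sqrt(na) * sqrt(nb))
}

// Keep articles whose similarity meets the threshold, sort by creation time,
// and return the identifiers of the first topN.
fun matchAndRank(feature: FloatArray, articles: List<Article>, topN: Int): List<String> =
    articles.filter { cosine(feature, it.feature) >= 0.8f }
        .sortedByDescending { it.createdAtMillis }
        .take(topN)
        .map { it.id }
```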
The content above is an inventive point of the embodiments of the present disclosure and solves the third technical problem mentioned in the Background: when matching article icons against a selected icon in response to detecting the user's selection of that icon, matching by image alone, ignoring the label information of the selected icon and of the article icons, yields low matching accuracy and therefore a poor user experience. To address this, the present disclosure generates feature information for the selected icon from both the icon itself and its label information, performs similarity matching against a preset set of article feature information, and displays the identifiers of the articles with the highest similarity. This improves the accuracy of the matched articles and thereby the user experience.
The above embodiments of the present disclosure have the following beneficial effects: the icon interaction method for 3D space reduces waste of screen display resources and improves the user experience. Specifically, screen display resources are wasted and the user experience suffers for the following reasons: when icons are displayed on a two-dimensional plane within the screen, the number of icons that can be shown at once is limited, and common interaction designs force the user to perform frequent operations to reach the information they need. On this basis, the icon interaction method of some embodiments of the present disclosure first determines, in response to detecting an interactive operation acting on the screen, whether the screen position information of the interactive operation satisfies a preset position condition; this establishes whether the position of the user's operation on the screen satisfies the condition. Then, in response to determining that it does, the screen position information is taken as initial screen position information, and the initial icon parameter information of each icon displayed in the 3D space is acquired, yielding the starting position of the user's operation and the initial parameters of each icon. Next, in response to determining that the interactive operation satisfies the preset operation condition and that its screen position information has been updated, the following operation steps are executed based on the initial screen position information and each item of initial icon parameter information: the updated screen position information corresponding to the interactive operation is determined, giving the current position of the interaction on the screen; display parameter information for each icon is generated from the updated screen position information, the initial screen position information, and each icon's initial icon parameter information; each displayed icon is updated according to its display parameter information; and finally, with the updated screen position information taken as the initial screen position information and the display parameter information taken as the initial icon parameter information, it is determined whether the current interactive operation satisfies the preset operation condition and whether its screen position information has been updated, so that the operation steps can be executed again whenever the current screen position differs from that of the previous interaction.
Because the icons are displayed in a 3D virtual space rather than on a two-dimensional plane, the information a page can present is enriched. And since the display parameter information of every icon can be adjusted in response to each detected interactive operation, the user can obtain rich icon information with fewer operations, which reduces the waste of screen display resources and improves the user experience.
With further reference to fig. 2, as an implementation of the method shown in the foregoing figures, the present disclosure provides some embodiments of an icon interaction apparatus for 3D space, where the apparatus embodiments correspond to those method embodiments shown in fig. 1, and the apparatus is particularly applicable to various electronic devices.
As shown in fig. 2, the icon interaction device 200 of the 3D space of some embodiments includes: a determination unit 201, an acquisition unit 202, and an execution unit 203. Wherein the determining unit 201 is configured to determine whether screen position information of interaction satisfies a preset position condition in response to detection of interaction on the screen; the acquiring unit 202 is configured to determine the screen position information as initial screen position information and acquire respective initial icon parameter information of respective icons displayed in the 3D space in response to determining that the screen position information satisfies the preset position condition; the execution unit 203 is configured to, in response to determining that the interactive operation satisfies the preset operation condition, and that the screen position information corresponding to the interactive operation is updated, execute the following operation steps based on the initial screen position information and the respective initial icon parameter information: determining updated screen position information corresponding to the interactive operation; generating display parameter information of each icon according to the updated screen position information, the initial screen position information and the initial icon parameter information of each icon; updating each displayed icon according to each display parameter information; and determining whether the current interaction operation meets the preset operation condition or not and whether the screen position information corresponding to the current interaction operation is updated or not by taking the updated screen position information as initial screen position information and taking the display parameter information as initial icon parameter information so as to execute the operation steps again.
It will be appreciated that the elements described in the apparatus 200 correspond to the various steps in the method described with reference to fig. 1. Thus, the operations, features and resulting benefits described above for the method are equally applicable to the apparatus 200 and the units contained therein, and are not described in detail herein.
Referring now to FIG. 3, a schematic diagram of an electronic device (e.g., computing device) 300 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 3 is merely an example and should not impose any limitations on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 3, the electronic device 300 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 301 that may perform various suitable actions and processes in accordance with a program stored in a Read Only Memory (ROM) 302 or a program loaded from a storage means 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data required for the operation of the electronic apparatus 300 are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
In general, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 308 including, for example, magnetic tape, hard disk, etc.; and communication means 309. The communication means 309 may allow the electronic device 300 to communicate with other devices wirelessly or by wire to exchange data. While fig. 3 shows an electronic device 300 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 3 may represent one device or a plurality of devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via communications device 309, or from storage device 308, or from ROM 302. The above-described functions defined in the methods of some embodiments of the present disclosure are performed when the computer program is executed by the processing means 301.
It should be noted that, the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the client and the server may communicate using any currently known or future-developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with digital data communication (e.g., a communication network) in any form or medium. Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future-developed network.
The computer-readable medium may be contained in the electronic device, or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: in response to detecting an interactive operation acting on a screen, determine whether screen position information of the interactive operation meets a preset position condition; in response to determining that the screen position information meets the preset position condition, determine the screen position information as initial screen position information, and acquire initial icon parameter information of each icon displayed in a 3D space; and in response to determining that the interactive operation meets a preset operation condition and that the screen position information corresponding to the interactive operation is updated, execute the following operation steps based on the initial screen position information and each piece of initial icon parameter information: determining updated screen position information corresponding to the interactive operation; generating display parameter information of each icon according to the updated screen position information, the initial screen position information, and the initial icon parameter information of each icon; updating each displayed icon according to each piece of display parameter information; and, taking the updated screen position information as the initial screen position information and the display parameter information as the initial icon parameter information, determining whether the current interactive operation meets the preset operation condition and whether the screen position information corresponding to the current interactive operation is updated, so as to execute the operation steps again.
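For concreteness, the operation-step loop above can be pictured as the minimal sketch below. The `tracker` object, the icon methods, and the caller-supplied `compute_display_params` function are hypothetical names introduced for illustration only; none of them come from the disclosure.

```python
def interaction_loop(tracker, icons, hit_region, compute_display_params):
    """Drive icon updates while a drag-type interaction is in progress."""
    pos = tracker.current_position()
    if not hit_region.contains(pos):   # preset position condition not met
        return

    initial_pos = pos                  # initial screen position information
    initial_params = [icon.parameters() for icon in icons]

    # Repeat while the operation is still active and the position changes
    # (the preset operation condition described above).
    while tracker.is_pressed() and tracker.position_changed():
        updated_pos = tracker.current_position()
        display_params = compute_display_params(
            updated_pos, initial_pos, initial_params)
        for icon, params in zip(icons, display_params):
            icon.apply(params)         # refresh the displayed icons

        # Re-seed the loop: the updated state becomes the new initial state.
        initial_pos = updated_pos
        initial_params = display_params
```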
Computer program code for carrying out operations of some embodiments of the present disclosure may be written in one or more programming languages, or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by means of software, or may be implemented by means of hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor comprising a determining unit, an acquiring unit, and an executing unit. In some cases, the names of these units do not limit the units themselves; for example, the determining unit may also be described as "a unit that, in response to detecting an interactive operation acting on a screen, determines whether screen position information of the interactive operation meets a preset position condition".
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to the specific combinations of the above technical features, but also encompasses other technical solutions formed by any combination of the above technical features or their equivalents without departing from the spirit of the invention, for example, technical solutions in which the above features are replaced with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. An icon interaction method for a 3D space, comprising:
in response to detecting an interactive operation acting on a screen, determining whether screen position information of the interactive operation meets a preset position condition;
in response to determining that the screen position information meets the preset position condition, determining the screen position information as initial screen position information, and acquiring initial icon parameter information of each icon displayed in a 3D space;
in response to determining that the interactive operation meets a preset operation condition and that the screen position information corresponding to the interactive operation is updated, executing the following operation steps based on the initial screen position information and each piece of initial icon parameter information:
determining updated screen position information corresponding to the interactive operation;
generating display parameter information of each icon according to the updated screen position information, the initial screen position information, and the initial icon parameter information of each icon;
updating each displayed icon according to each piece of display parameter information; and
taking the updated screen position information as the initial screen position information and the display parameter information as the initial icon parameter information, determining whether the current interactive operation meets the preset operation condition and whether the screen position information corresponding to the current interactive operation is updated, so as to execute the operation steps again.
2. The method of claim 1, wherein the method further comprises:
in response to determining that the current interactive operation does not meet the preset operation condition, determining initial speed information corresponding to the current interactive operation; and
updating the display of each displayed icon according to the initial speed information.
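An illustrative sketch of the inertial update in claim 2 follows: once the operation no longer meets the preset operation condition (e.g., the finger is lifted), the last measured speed keeps driving the icons. The per-frame structure, the `rotate_y` icon method, and the treatment of the speed as an angular rate are assumptions, not part of the claim.

```python
def inertial_update(icons, initial_speed, dt=1.0 / 60.0):
    """Advance each displayed icon by the release speed for one frame."""
    for icon in icons:
        # Assumption: the initial speed information is an angular rate
        # around the vertical axis of the 3D icon arrangement.
        icon.rotate_y(initial_speed * dt)
```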
3. The method of claim 1, wherein the generating display parameter information of each icon according to the updated screen position information, the initial screen position information, and the initial icon parameter information of each icon comprises:
determining rotation information corresponding to each icon according to the updated screen position information and the initial screen position information;
generating display position information of each icon according to the rotation information;
determining display attribute information of each icon according to each piece of display position information; and
determining the display position information and the display attribute information of each icon as the display parameter information of that icon, so as to obtain each piece of display parameter information.
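A hedged sketch of the claim 3 pipeline follows: a rotation derived from the drag (see the claim 4 sketch below for one way to obtain it) is applied to each icon's initial 3D position to produce its display position. Rodrigues' rotation formula is used here as one standard way of applying an axis-angle rotation; the disclosure does not specify the formula used.

```python
import math

def rotate_point(point, axis, angle):
    """Rotate a 3D point around an axis by angle (radians), using
    Rodrigues' rotation formula. The axis is assumed unit-length
    (normalize it first if it comes from a raw cross product)."""
    px, py, pz = point
    ax, ay, az = axis
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    dot = ax * px + ay * py + az * pz                      # k . v
    cross = (ay * pz - az * py,                            # k x v
             az * px - ax * pz,
             ax * py - ay * px)
    return tuple(
        v * cos_a + c * sin_a + k * dot * (1.0 - cos_a)
        for v, c, k in zip((px, py, pz), cross, (ax, ay, az))
    )
```

The display attribute lookup that follows this step is sketched after claim 5.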
4. The method according to claim 3, wherein the determining rotation information corresponding to each icon according to the updated screen position information and the initial screen position information comprises:
performing three-dimensional space conversion processing on the initial screen position information to obtain first spatial position information;
performing three-dimensional space conversion processing on the updated screen position information to obtain second spatial position information; and
generating rotation information corresponding to each icon according to the first spatial position information and the second spatial position information.
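One common way to realize the two conversion steps of claim 4 is an arcball-style mapping, sketched below under that assumption: each 2D screen position is projected onto a virtual unit sphere, and the rotation between the two projected points yields the rotation information. The arcball mapping is an illustrative choice; the disclosure does not specify the conversion used.

```python
import math

def to_sphere(x, y, width, height):
    """Project a screen point onto a unit sphere centered on the viewport."""
    nx = (2.0 * x - width) / width        # normalize x to [-1, 1]
    ny = (height - 2.0 * y) / height      # normalize y, flipping the axis
    nz = math.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))  # lift onto the sphere
    norm = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
    return (nx / norm, ny / norm, nz / norm)

def rotation_between(p, q):
    """Axis-angle rotation taking unit sphere point p to unit sphere point q."""
    axis = (p[1] * q[2] - p[2] * q[1],    # p x q (normalize before reuse)
            p[2] * q[0] - p[0] * q[2],
            p[0] * q[1] - p[1] * q[0])
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q))))
    return axis, math.acos(dot)

# Example: a drag from (100, 200) to (160, 180) on an 800 x 600 screen.
first = to_sphere(100, 200, 800, 600)     # first spatial position information
second = to_sphere(160, 180, 800, 600)    # second spatial position information
axis, angle = rotation_between(first, second)
```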
5. The method according to claim 3, wherein the determining display attribute information of each icon according to each piece of display position information comprises:
for each piece of display position information, performing the following steps:
determining distance information between the display position indicated by the display position information and the plane of the screen in the 3D space;
determining a distance interval corresponding to the distance information; and
determining display attribute information corresponding to the distance interval.
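The interval lookup in claim 5 can be pictured as below: icons farther from the screen plane are rendered smaller and more transparent. The interval boundaries and the attribute values are illustrative assumptions only, not values from the disclosure.

```python
def attributes_for_distance(distance):
    """Map a distance from the screen plane to (scale, opacity)."""
    d = abs(distance)  # treat the distance as a magnitude
    # Each entry: (lower bound, upper bound, (scale, opacity)) -- assumed values.
    intervals = [
        (0.0, 1.0, (1.00, 1.0)),           # near: full size, fully opaque
        (1.0, 2.5, (0.75, 0.6)),           # middle: shrink and fade
        (2.5, float("inf"), (0.50, 0.3)),  # far: smallest and faintest
    ]
    for lower, upper, attrs in intervals:
        if lower <= d < upper:
            return attrs
    return intervals[-1][2]
```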
6. The method of claim 2, wherein, prior to the updating the display of each displayed icon according to the initial speed information, the method further comprises:
in response to determining that the initial speed information meets a preset speed condition, performing attenuation processing on the initial speed information to obtain attenuated speed information as the initial speed information.
7. The method according to claim 6, wherein the performing attenuation processing on the initial speed information, in response to determining that the initial speed information meets the preset speed condition, to obtain the attenuated speed information as the initial speed information comprises:
determining attenuation speed information corresponding to the initial speed information according to the initial speed information and preset attenuation parameter information;
determining the attenuated speed information according to the initial speed information and the attenuation speed information; and
determining the attenuated speed information as the initial speed information.
8. An icon interaction device for a 3D space, comprising:
a determining unit configured to, in response to detecting an interactive operation acting on a screen, determine whether screen position information of the interactive operation meets a preset position condition;
an acquiring unit configured to, in response to determining that the screen position information meets the preset position condition, determine the screen position information as initial screen position information and acquire initial icon parameter information of each icon displayed in a 3D space; and
an executing unit configured to, in response to determining that the interactive operation meets a preset operation condition and that the screen position information corresponding to the interactive operation is updated, execute the following operation steps based on the initial screen position information and each piece of initial icon parameter information: determining updated screen position information corresponding to the interactive operation; generating display parameter information of each icon according to the updated screen position information, the initial screen position information, and the initial icon parameter information of each icon; updating each displayed icon according to each piece of display parameter information; and, taking the updated screen position information as the initial screen position information and the display parameter information as the initial icon parameter information, determining whether the current interactive operation meets the preset operation condition and whether the screen position information corresponding to the current interactive operation is updated, so as to execute the operation steps again.
9. An electronic device, comprising:
one or more processors;
a storage device storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method of any one of claims 1-7.
CN202311140121.1A — filed 2023-09-06 (priority date 2023-09-06) — Icon interaction method and device for 3D space, electronic equipment and medium — Active — granted as CN116880726B (en)

Priority Applications (1)

CN202311140121.1A — priority date 2023-09-06, filing date 2023-09-06 — Icon interaction method and device for 3D space, electronic equipment and medium

Publications (2)

CN116880726A — application publication — 2023-10-13
CN116880726B — granted publication — 2023-12-19

Family

ID=88271871

Family Applications (1)

CN202311140121.1A (Active) — filed 2023-09-06 — Icon interaction method and device for 3D space, electronic equipment and medium

Country Status (1)

CN — CN116880726B (en)

Citations (18)

* Cited by examiner, † Cited by third party
Format: Publication number — priority date / publication date — assignee — title

US6295062B1 * — 1997-11-14 / 2001-09-25 — Matsushita Electric Industrial Co., Ltd. — Icon display apparatus and method used therein
US20090293014A1 * — 2008-05-23 / 2009-11-26 — AT&T Intellectual Property, LP — Multimedia Content Information Display Methods and Device
KR20110059009A * — 2009-11-27 / 2011-06-02 — Samsung Electronics Co., Ltd. — Apparatus and method for user interface configuration in portable terminal
JP2012252384A * — 2011-05-31 / 2012-12-20 — Camelot Co., Ltd. — Screen control system, screen control method, and screen control program
US20130311946A1 * — 2012-05-17 / 2013-11-21 — O-Hyeong KWON — Apparatus and method for user-centered icon layout on main screen
CN103853452A * — 2014-03-04 / 2014-06-11 — 广州市久邦数码科技有限公司 — Method and system for realizing multi-screen switchover of stereo desktop
CN104504761A * — 2014-12-15 / 2015-04-08 — 天脉聚源(北京)科技有限公司 — Method and device for controlling rotation of 3D (three-dimensional) model
CN105278819A * — 2014-06-30 / 2016-01-27 — 西安TCL软件开发有限公司 — Application navigation method of user interaction interfaces and intelligent equipment
CN105302407A * — 2014-06-23 / 2016-02-03 — 中兴通讯股份有限公司 — Application icon display method and apparatus
CN106020655A * — 2016-05-18 / 2016-10-12 — 北京金山安全软件有限公司 — Method and device for switching interface screen and electronic equipment
US20170131785A1 * — 2014-07-31 / 2017-05-11 — Starship Vending-Machine Corp. — Method and apparatus for providing interface interacting with user by means of NUI device
CN107728886A * — 2017-10-25 / 2018-02-23 — 维沃移动通信有限公司 — One-handed operation method and apparatus
CN108983954A * — 2017-05-31 / 2018-12-11 — 腾讯科技(深圳)有限公司 — Data processing method, device and system based on virtual reality
WO2019051785A1 * — 2017-09-15 / 2019-03-21 — 深圳传音通讯有限公司 — Icon display method and device for intelligent terminal
CN109992345A * — 2019-04-01 / 2019-07-09 — 珠海格力电器股份有限公司 — Screen icon control method, device, storage medium and terminal
CN113064529A * — 2021-03-05 / 2021-07-02 — 青岛海尔科技有限公司 — Application icon display method and device, storage medium and electronic device
CN113420193A * — 2021-07-20 / 2021-09-21 — 北京字节跳动网络技术有限公司 — Display method and device
WO2022242395A1 * — 2021-05-20 / 2022-11-24 — 北京城市网邻信息技术有限公司 — Image processing method and apparatus, electronic device and computer-readable storage medium

Also Published As

CN116880726B (en) — published 2023-12-19

Similar Documents

CN112015314B (en) Information display method and device, electronic equipment and medium
CN113934958B (en) Page loading method and device, electronic equipment and computer readable medium
CN112051961A (en) Virtual interaction method and device, electronic equipment and computer readable storage medium
CN111597466A (en) Display method and device and electronic equipment
CN111459364A (en) Icon updating method and device and electronic equipment
CN111461967B (en) Picture processing method, device, equipment and computer readable medium
CN111652675A (en) Display method and device and electronic equipment
CN110806834A (en) Information processing method and device based on input method, electronic equipment and medium
CN111461968A (en) Picture processing method and device, electronic equipment and computer readable medium
CN111273830A (en) Data display method and device, electronic equipment and computer readable medium
CN111292406B (en) Model rendering method, device, electronic equipment and medium
CN116880726B (en) Icon interaction method and device for 3D space, electronic equipment and medium
CN113741750B (en) Cursor position updating method and device and electronic equipment
CN110399059A (en) Method and apparatus for showing information
CN111680754B (en) Image classification method, device, electronic equipment and computer readable storage medium
CN113253874A (en) Display device control method, device, terminal and storage medium
CN112711457A (en) Method and device for field drawing and electronic equipment
CN111460334A (en) Information display method and device and electronic equipment
CN111461969A (en) Method, device, electronic equipment and computer readable medium for processing picture
CN112925593A (en) Method and device for scaling and rotating target layer
CN112395826B (en) Text special effect processing method and device
CN112015421B (en) Article information display method and device, electronic equipment and computer readable medium
WO2023025181A1 (en) Image recognition method and apparatus, and electronic device
CN111797932B (en) Image classification method, apparatus, device and computer readable medium
CN111489286B (en) Picture processing method, device, equipment and medium

Legal Events

Code — Description

PB01 — Publication
SE01 — Entry into force of request for substantive examination
GR01 — Patent grant